Getting Testy with Product Development
Here's how systems integrators can play a role and help manufacturers drive innovation.
While technology professionals may get excited about bleeding-edge innovations, their clients may not. Often, clients are more concerned that the systems being deployed will function consistently. To avoid glitches and failures, systems integrators must test the products they're planning to install well before bringing them to the job site.
Integrators who take product testing seriously are also well-positioned to drive innovation and product development. Because they work on the front lines, they can provide valuable insight to manufacturers on how their products perform in the real world. After all, it's this real-world performance that guides integrators through product selection.
“The integrator plays a really critical role [in product development],” explained Tyler Troutman, manager of global strategic market development at Shure. “Integrators can let manufacturers know things like how easy or difficult their systems are to install, and how complex software configuration may or may not be. Their feedback early on in product development is critical to make sure that we’re not missing anything.”
Feedback: Formal vs. Informal
At Shure, the combination of initial feedback solicitation and alpha and beta testing is a formalized process. “Typically, it will be a script people follow because the product isn’t fully developed,” Troutman said. This helps the manufacturer perform trials in a controlled fashion during the early stages of development.
However, from there, the process is less structured. “We get to a point in product development as we get closer to launching where, pretty much, I’m calling somebody saying: 'Here’s a product. You take it and you do what you want with it.'"
Within reason, of course. What Troutman is seeking at this stage is input on a product’s limitations, what the integrator loves about it, what they love less, and suggestions for improvement.
Mesa, AZ-based Level 3 Audiovisual has an in-house Engineering Lab dedicated to product testing. “We’re very careful about what we select and who we partner with, and we want to make sure that everything we use is going to suit our needs and deliver the best experience for our clients,” explained Ira Beyer, engineering manager at the firm.
According to Beyer, products may disappoint for a couple of reasons: Either they have legitimate limitations, or the team testing them hasn’t fully grasped what they’re all about. “When we get to that impasse," he said, "we work closely with the manufacturer to make sure that we’re understanding the product correctly [and] that we’re using it correctly." If this is the case, the conversation shifts to focus on client requirements and what solutions are the best fit for a specific application.
In the Lab
About six years ago, TRITECH Communications in Garden City, NY, launched the Dev Lab, its in-house product testing department. Afiya Cupid, Dev Lab engineer, explained that she fields inquiries from internal engineers seeking confirmation that the systems they’re planning to deploy work properly, as well as manufacturers requesting feedback on products in development. In each case, she works with designers and engineers to establish what she should test for. She also assesses systems on some common criteria, such as the quality of manufacturer tech support, reliability, and scalability.
AV integrators can be of particular value to manufacturers when they uncover unforeseen issues. Stefanos Stefanidis, director of engineering at TRITECH, said this often happens when end clients want to use systems slightly differently than the manufacturer originally intended.
For example, a conference room system designed to allow meeting participants to share content from their devices to a display may perform well with one laptop, but may start glitching when two or three are connected. In this scenario, the manufacturer didn't understand that for most meetings, end users want more than one person to be able to use a laptop for collaboration purposes.
"This is the feedback they need," Stefanidis said. "It's in the Dev Lab where we find things that maybe a manufacturer did not think of."
At Verrex in Mountainside, NJ, Ben Dandola-Grubb, VP of technical integration, explained that his team tests out all products, even renowned technology, for each project as part of the company's quality assurance process. "Everything is getting put together and we try to build it as closely as possible to how it will be deployed at the client site," Dandola-Grubb said. "All the microphones are hooked up, the cameras are passing video, [and] every touchpanel has the code loaded. We have those checks and balances [built into] our QA process."
At Level 3, Beyer's team conducts a thorough analysis before deciding whether it’s worth the time and effort to test out a new product. This includes determining where the product is made, its availability on a global scale, and the manufacturer’s reputation.
During testing, Level 3 assesses products for how well they integrate with other systems, how secure they are, and how easy or complex the programming process may be. “We have skilled people in our organization, but is it worth learning a whole new language or a whole new configuration solution?” Beyer offered.
Level 3 has cut ties with manufacturers for a variety of reasons, bad support being one of them. Beyer added that a lack of reliability and poor interoperability are also dealbreakers. "We may see that their support is available and we can get the help we need, and everything looks good," he said. "But then we use [their product] in a project and we have problems, [and] we can't get people to help us on a timely basis."
Product Tests, Not Reviews
Cupid has a reputation among her colleagues for taking copious notes during the testing process, which she transforms into detailed reports. This, she said, is intentional.
“Our reports have a lot of detail to ensure that anybody who picks up that report can go through it, step by step, and see how to set up [that system], what pitfalls to expect, and any troubleshooting issues I came across,” she explained. “You want to think about the person who has never dealt with the system before.”
While one of the main goals behind product testing is to identify flaws, Rob Pickering, TRITECH’s director of technology, reminds us that in this context, it’s not the AV integrator’s job to declare whether a system in question is good or bad. “[The goal] is to highlight what worked well and what didn’t to allow those teams, whether they’re internal engineers or manufacturers, to make a decision on how they want to move forward,” he said.
In Beyer’s view, any thoughtful input integrators can offer their manufacturer partners—positive or negative—can potentially result in better systems. “One of our ways of saying thank you to the manufacturers is we share the test results with them, whether good or bad,” he said. “And sometimes that feedback, if it’s less than optimal, will result in follow-up conversations that help drive improvements.”
Carolyn Heinze has covered everything from AV/IT and business to cowboys and cowgirls ... and the horses they love. She was the Paris contributing editor for the pan-European site Running in Heels, providing news and views on fashion, culture, and the arts for her column, “France in Your Pants.” She has also contributed critiques of foreign cinema and French politics for the politico-literary site, The New Vulgate.
