The publications and newsletters we read are filled with product news that outline the latest AV innovations coming to market. These stories help system integrators and end users understand the features and benefits of each new product, and whether it’s right for the reader’s next project.
We hear far less about how these products come to fruition. The journey a product takes before reaching market is long and often arduous: it must prove itself reliable, predictable, and free of bugs before it ships.
Research and development teams are instrumental in bringing the vision of a product into focus. However, the last mile of the product’s journey to market, quality assurance, is what determines how well the product will perform once released into the wild.
QA for professional AV poses a unique set of challenges. The seemingly endless variety of professional and consumer devices that must be integrated together in a typical installation makes testing every possible scenario a daunting task. Endpoint and source device manufacturers aren’t as concerned with third-party device integration; the onus for that interoperability falls on the distribution and control vendors. Companies like Atlona are held accountable for making sure we can work with every display, projector, laptop, streaming stick, camera, and more. That’s why we have started to leverage best-practice QA methods from our parent company, Panduit.
Quality assurance is an often-undervalued piece of the research and development puzzle. The primary focus of a quality assurance program is to ensure that every product delivered excels in performance for each customer, no matter how unique the use case.
We have invested a great deal in our quality assurance initiatives at Atlona over the past 12 months to make that seemingly simple goal a reality. This includes establishing a larger QA team with deep knowledge of the AV business, its technologies, and the software QA methods that help us evaluate readiness for production release.
Paramount to this strategy is a rigorous testing process that helps us make better decisions about product designs and readiness. This includes the QA team’s specific focus on software and firmware testing, and system-level validation. It also means testing returned products to identify defects and, when no defects exist, to learn how we can improve ease of use and the customer experience.
My professional career to date has largely focused on implementing quality processes and tools that ensure structured testing strategies are in place. This means writing plans that determine how each product will be tested based on a defined set of user requirements. Today, we leverage industry-leading test management tools like Atlassian Jira and SmartBear Zephyr to reduce costs and increase test efficiency. With these tools, we can document all test cases, defects, and performance issues within the same toolset, using a very structured approach.
So, what does that structure look like? Our initial focus is to identify what types of test cases we need to develop to comprehensively test each product, as well as each feature within that product. It also means validating both the functional and non-functional aspects of the product’s performance. From the functional perspective, it can be as simple as testing customer access. Have we properly encrypted the user’s login and password from client to server, ensuring the customer’s security? Can the user then navigate the system and understand its features and functions?
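Two of those credential checks can be expressed directly as code. This is a simplified sketch, not Atlona's actual test suite: it rejects any login attempt that would travel over plain HTTP or leak credentials into the URL (where they would end up in server logs and browser history). The function name and return shape are hypothetical.

```python
from urllib.parse import urlparse

def validate_login_transport(url: str, username: str, password: str) -> dict:
    """Refuse any login attempt that would expose credentials.

    Two functional checks from a typical test plan: credentials must
    travel over TLS, and must never appear in the URL itself.
    """
    parsed = urlparse(url)
    if parsed.scheme != "https":
        raise ValueError(f"refusing to send credentials over {parsed.scheme!r}")
    if username in url or password in url:
        raise ValueError("credentials belong in the request body, not the URL")
    return {"url": url, "body": {"user": username, "pass": password}}

# A test case asserts both the happy path and the rejection:
ok = validate_login_transport("https://av.example.com/login", "admin", "s3cret")
assert ok["body"]["pass"] == "s3cret"

try:
    validate_login_transport("http://av.example.com/login", "admin", "s3cret")
except ValueError as err:
    print("rejected:", err)
```

Note that both the positive and the negative paths are asserted; a test plan that only exercises the happy path tells you very little about security.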
Deeper into the process comes error handling. How many firmware and software errors are occurring, and how is the system handling those errors? This extends from simple login issues to compatibility across browsers and operating systems, and how the product functions from platform to platform.
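Cross-browser and cross-OS checks like these are usually run as a matrix: the same check executed against every combination, with failures tallied per platform. Here is a minimal, self-contained sketch of that pattern; the browser/OS lists and the failing combination are purely illustrative.

```python
import itertools

BROWSERS = ["Chrome", "Firefox", "Safari", "Edge"]
PLATFORMS = ["Windows", "macOS", "Linux"]

def run_matrix(check, browsers=BROWSERS, platforms=PLATFORMS):
    """Run one check across every browser/OS combination, tallying failures."""
    failures = []
    for browser, platform in itertools.product(browsers, platforms):
        try:
            check(browser, platform)
        except Exception as err:
            failures.append((browser, platform, str(err)))
    return failures

# Hypothetical check standing in for a real UI test: suppose the login
# error banner fails to render on one platform combination.
def login_error_handling(browser, platform):
    if (browser, platform) == ("Firefox", "Linux"):
        raise AssertionError("error banner never rendered")

fails = run_matrix(login_error_handling)
print(fails)  # [('Firefox', 'Linux', 'error banner never rendered')]
```

In practice the inner check would drive a real browser via an automation framework; the value of the harness is that one defect report immediately tells you *which* platform combinations are affected.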
That goes even deeper when extended to mobile apps, which open an entirely new area of compatibility issues to test. Beyond operating systems, it means understanding performance at varying signal strengths of the mobile network (i.e., how many bars the user has), as well as how performance changes as users shift between 4G and 5G networks, for example.
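A test plan can turn those conditions into concrete pass/fail criteria. The sketch below uses a deliberately toy throughput model, with assumed (not measured) per-generation ceilings and a hypothetical 25% headroom rule; a real rig would throttle a network emulator to these targets rather than compute them.

```python
def effective_mbps(generation: str, bars: int) -> float:
    """Toy model: throughput as a function of network generation and bars.

    The ceilings are illustrative assumptions, not measured figures.
    """
    ceilings = {"4G": 50.0, "5G": 400.0}
    if generation not in ceilings:
        raise ValueError(f"unknown generation: {generation}")
    bars = max(0, min(bars, 5))          # clamp to 0..5 signal bars
    return ceilings[generation] * (bars / 5)

def stream_is_viable(generation: str, bars: int, required_mbps: float) -> bool:
    """Does the link leave 25% headroom above the stream's bitrate?"""
    return effective_mbps(generation, bars) >= required_mbps * 1.25

# A 12 Mbps preview stream survives weak 5G but not one-bar 4G:
print(stream_is_viable("5G", 2, 12.0))  # True  (160 Mbps available)
print(stream_is_viable("4G", 1, 12.0))  # False (10 Mbps < 15 Mbps needed)
```

Encoding the thresholds this way means the test plan states exactly *when* degraded performance counts as a defect, instead of leaving it to the tester's judgment.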
At the non-functional stage, we apply a variety of testing techniques and tools that go deep under the hood. Those processes are the subject of another blog post. But what matters to our customers is how far Atlona has come in this area, and the dividends it pays in helping us drive continuous improvements in product testing and readiness.
One of my first initiatives as Director of Quality and Support was to push for International Software Testing Qualifications Board (ISTQB) certification. Atlona’s testing team is now ISTQB-certified, demonstrating command of the testing techniques, best practices, and industry standards that ensure product quality and performance. This ensures that the deepest non-functional aspects of a product’s performance meet or exceed expectations for security, integration, compatibility, usability, error handling, and reliability.
Our quality assurance initiatives are more important than ever as AV increasingly becomes the responsibility of the IT manager at the end-user business. Panduit’s IT industry experience has only strengthened our value proposition for these customers, and we are now hard at work bringing a new generation of products to market that IT managers can quickly learn and easily understand. Our Quality Assurance team is more than prepared for AV’s IT future.
About the Author
Iftekhar Hossain is the Director of Quality and Support at Atlona and has more than 20 years of experience in Quality discipline in various industries. Iftekhar has a master’s degree in Computer Science from Colorado State University and a bachelor’s degree in Computer Engineering from Assumption University in Thailand. Iftekhar has numerous certifications from the American Society for Quality and the International Software Testing Qualifications Board. Iftekhar enjoys travelling, listening to music, and has recently gotten into vinyl collecting.