Discussion on The Vision-First Test Automation Keynote Session with Jason Huggins at #TestMuConf2024! 🚀

Which is the best tool for vision-first test automation?

What challenges and limitations have you encountered in mobile test automation?

What are the best tools for vision-first test automation?

Which tool is most suitable for test automation?

Why is adopting a Vision-first approach beneficial for test automation projects? What specific problems does it address?

Are there similar advances in other kinds of testing tools, like performance and accessibility testing, as there are in automation tools?

Should we learn computer vision in order to incorporate these approaches ahead of the industry?

Can a top-down approach using cameras and sensors be applied to, or is it suggested for, 3D device/game testing?

Can automated testing be used to test computer vision (associated with AI and cameras), which is used for interpreting and processing visual information in some apps?

What are the use cases for vision-first test automation?

What is your primary goal when addressing complexity in your CI pipeline?

How can I apply these OpenCV-based test steps to my Java-based Selenium tests for websites?

Which programming languages are in demand for automation testing?

Can the automated testing process work as well for laptops and other screens (besides tablets and smartphones), or does the form factor need to be scaled and enlarged to handle it?

Could FPGA boards work as well as Raspberry Pis as a single-board computer for the heart of the testing framework?

Hi,

Yes, The New York Times used visual testing tools to ensure their website’s design remained consistent across different devices and screen sizes.

By applying vision-first test automation, they were able to quickly detect and fix visual issues, ensuring that their readers had a seamless and engaging experience regardless of the device used to access their content.
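At their core, the visual testing tools described here compare a fresh screenshot against an approved baseline and flag pixels that changed. Here is a minimal, purely illustrative sketch of that idea in plain Python, using nested lists of grayscale values as stand-in "screenshots" (the images, tolerance, and function name are invented for the example, not taken from any specific tool):

```python
def visual_diff(baseline, current, tolerance=0):
    """Return the fraction of pixels that differ between two
    equally sized grayscale images (2D lists of 0-255 values)."""
    total = 0
    changed = 0
    for row_a, row_b in zip(baseline, current):
        for a, b in zip(row_a, row_b):
            total += 1
            if abs(a - b) > tolerance:
                changed += 1
    return changed / total

# Two tiny 3x3 "screenshots": the middle row has shifted,
# as a layout regression might cause.
baseline = [[0, 0, 255], [0, 255, 0], [255, 0, 0]]
current  = [[0, 0, 255], [255, 0, 0], [255, 0, 0]]

diff = visual_diff(baseline, current)
print(f"{diff:.0%} of pixels changed")  # 2 of 9 pixels differ -> "22% of pixels changed"
```

Real tools add smarter comparisons (anti-aliasing tolerance, ignore regions, perceptual diffs), but the workflow is the same: capture, compare against the baseline, and fail the build when the diff exceeds a threshold.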

Choosing the right test automation tool really comes down to what you need and how you work. Here’s a quick rundown to help you decide:

  • Selenium: It’s a veteran in the field, great for web apps, and works with several programming languages. If you’re comfortable coding and need something flexible, Selenium is a solid choice. Just be prepared for a bit of setup and possible extra tools for visual testing.

  • Appium: If you’re testing mobile apps, both Android and iOS, Appium is your go-to. It’s versatile and works with different programming languages, so it’s pretty handy for mobile testing across platforms.

  • Cypress: Known for its speed and ease of use, Cypress is fantastic for modern web applications. It’s got a great interface and integrates well with CI/CD. If you’re focused on web apps and want something straightforward, Cypress is worth a look.

  • Playwright: This is a newer tool that’s making waves for its speed and reliability. It supports multiple browsers and is great for testing modern web apps, especially if you’re dealing with complex scenarios.

  • TestCafe: Easy to set up and doesn’t need WebDriver. It’s great for web app testing and supports a wide range of browsers. If you want something simple and effective, TestCafe might be the way to go.

So, it really depends on what you’re testing, your team’s skills, and how much setup you’re up for. What kind of projects are you working on? That might help narrow it down!

The vision-first approach to test automation focuses on validating the visual aspects of an application, aiming to catch issues that traditional methods might miss.

By using image recognition and visual AI, it can detect layout inconsistencies, design anomalies, and visual bugs that impact user experience. This method often results in higher accuracy for visual validation compared to traditional methods, which rely on backend elements or code-based interactions.
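To make the image-recognition part concrete, here is a toy version of template matching, the technique libraries such as OpenCV (`cv2.matchTemplate`) use to locate a UI element, say a button icon, inside a full screenshot. This is a naive pure-Python sketch over 2D grayscale lists, written for illustration only; production code would use OpenCV on real image data:

```python
def match_template(screen, template):
    """Slide `template` over `screen` (both 2D grayscale lists) and
    return the (row, col) offset with the smallest sum of absolute
    pixel differences - a naive analogue of cv2.matchTemplate."""
    th, tw = len(template), len(template[0])
    sh, sw = len(screen), len(screen[0])
    best_score, best_pos = None, None
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            score = sum(
                abs(screen[r + i][c + j] - template[i][j])
                for i in range(th) for j in range(tw)
            )
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A 4x5 "screenshot" containing a bright 2x2 "button" at row 1, col 2.
screen = [
    [10, 10, 10,  10,  10],
    [10, 10, 200, 200, 10],
    [10, 10, 200, 200, 10],
    [10, 10, 10,  10,  10],
]
button = [[200, 200], [200, 200]]
print(match_template(screen, button))  # (1, 2)
```

Once the element's on-screen position is found this way, a vision-first framework can click or assert on it without ever touching the DOM, which is exactly why it also works on devices where no DOM is available.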

In terms of efficiency, vision-first testing can reduce the need for extensive manual checks and rework caused by missed visual defects.

However, it can be more resource-intensive, as it involves processing visual data and may require additional setup. Despite this, the increased reliability and the ability to catch subtle visual issues often make it a valuable complement to traditional testing, especially for applications where visual fidelity is crucial.

I attended this session, and being a test automator, I feel testing smartphone cameras for AR/VR applications is quite challenging. In my opinion, it’s not just about testing image quality but also validating real-time tracking, depth sensing, and how virtual objects interact with the real world.

From my experience, the hardest part is ensuring these things work across different devices and lighting conditions. Automation doesn’t cover everything here, so manual testing plays a big role. I’ve found using simulators helpful for things like motion tracking, but overall, it’s a tricky balance between automation and manual validation.

Hope this helped! 🙂

As a test automator, I believe the knowledge of automating robots can definitely be applied to real-world scenarios, even to start a company in our home countries. In my opinion, using automation to solve local problems in areas like manufacturing or agriculture could be a good starting point.

From my experience, the automation frameworks we use for testing can be adapted to robotics, making it easier to scale and deliver reliable solutions. It’s a great way to tap into local industry needs while leveraging automation skills.