How Visual Testing and Driverless Cars Go Hand-In-Hand

Advanced Topics — Published December 20, 2016

Everywhere we look, we see more and more car commercials touting an “auto-brake” feature. Studies show that many car manufacturers are already including this feature in all of their new cars, and a recent survey shows that by the year 2023, over 20 car manufacturers will offer it as standard equipment.

If you’re not familiar with “auto-braking” and have missed the humorous and insightful commercials showing how the feature works, here is a quick explanation. While elementary in design, this feature enables the vehicle to identify a potential risk of collision in front of the car. If the vehicle’s speed is greater than expected as it approaches an obstacle, the system will automatically brake and slow the car – even if the driver is not paying attention.

Vehicle manufacturers predict that this feature will prevent millions of future accidents and collisions – and ultimately save lives. The predictions also show a reduction in insurance costs due to better driving records. And all of this occurs because the car “makes a decision” for the driver when it appears the driver is not paying attention.

As a testing professional, it’s difficult for me to watch these commercials on auto-braking, or the news articles on “self-driving cars,” without asking “what if.” Questions immediately come to mind, like:

  • “What if an animal zips by inches from the front of your car – does the car slam to a stop and risk injuring the people inside?” or
  • “What if the driver has their foot on the gas pedal when the car decides to auto-brake, the surprise jolt makes the driver press the gas pedal, and, instead of stopping, they speed up and cause an even greater collision?”

There are many other questions I have as I learn more about this technology. If you’re a tester as well, I’m sure you do this for many of the everyday “new features” you hear companies release. We always wonder whether they “thought of every situation.”

Regardless of the “new risks” that come with “new technology,” I think everyone would agree that the auto-braking feature will do far more good than harm. We also know that having the car act as a backup when we are not paying attention is an important factor in reducing the risk of a potential collision.

As I learned more about auto-braking technology, I began to relate it to software testing. I’m sure there are many drivers who believe they do not need this feature in their car. They do not need the “double vision” of a car that helps them see potential collisions. Many believe they can be looking around, doing something on their mobile phone, or talking to passengers, and not be distracted. But based on historical records, we know that many collisions, accidents – and even deaths – occur because the driver was simply not paying attention and drove their vehicle into something.

This same theory could apply to testers.
We sometimes believe that we can find everything and that there is no way we can miss an obvious problem that is right in front of our eyes.

I am a strong advocate for the thinking tester. I value the ability for a human mind to make decisions as a tester and to identify the potential issues with the product. Our experiences and our ability to take a different route, or to change a plan for testing, make the thinking tester’s role critical in our world today.

However, we have to face the reality of “inattentional blindness.” It’s a well-known and documented dilemma, and I have, in fact, conducted many workshops and speaking sessions on this very topic. People think they see things exactly as they are when, in fact, they miss things – even when they believe they are paying the closest attention.

Let me give you an example. I gave an exercise to the audience in one of my testing conference workshops, based on an idea I got from a television show called “Brain Games”. I asked the audience to look at the following characters:

[Image: a row of characters with their bottom halves covered]

Almost everyone in the room says they see “Mike Lyles Is Jumping to Conclusions”. Our brains fill in the blanks and make assumptions as needed, especially when it looks OBVIOUS that this is what we are seeing.

However, when I remove the cover from the bottom half of the letters, everyone is surprised to see the following screen:

[Image: the same characters with the covers removed, revealing different text]

What the audience learns from this, and many other workshop exercises, is that we are not always as good as we think. I called my workshop “Visual Testing: It’s Not What You Look At, It’s What You See” because so many times we can look at something and miss the obvious. This occurs even when we have seen the application or screen dozens or hundreds of times. It becomes very apparent that even when a person is TOLD that something is missing or changed, they sometimes do not see it.

This highlights the need for visual testing tools to fill the gaps. The thinking tester will use logic to determine where to go, what steps to take, when to try a new route, or how best to surface a defect in the application. Adding tools that look for things that may not be easily “seen” by the human eye is the “double vision” that enables us to deliver a higher-quality product.
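To make the “double vision” idea concrete, here is a minimal sketch of what a visual comparison boils down to: checking a new screenshot against an approved baseline and flagging every region that differs. This is a deliberately naive pixel diff in plain Python, with screenshots modeled as 2D lists of grayscale values; real tools such as Applitools Eyes use far more sophisticated perceptual matching to avoid false positives from anti-aliasing and dynamic content.

```python
def diff_screenshots(baseline, current, tolerance=0):
    """Compare two equally-sized grayscale 'screenshots' (2D lists of
    0-255 pixel values) and return the (row, col) coordinates that
    differ by more than `tolerance`."""
    if len(baseline) != len(current) or any(
        len(r1) != len(r2) for r1, r2 in zip(baseline, current)
    ):
        raise ValueError("screenshot dimensions must match")
    return [
        (y, x)
        for y, row in enumerate(baseline)
        for x, pixel in enumerate(row)
        if abs(pixel - current[y][x]) > tolerance
    ]

# The top row looks identical; one pixel in the bottom row has changed --
# trivial for a tool to catch, easy for a human eye to miss.
baseline = [
    [0, 0, 0, 0],
    [0, 255, 255, 0],
]
current = [
    [0, 0, 0, 0],
    [0, 255, 0, 0],
]
print(diff_screenshots(baseline, current))  # -> [(1, 2)]
```

The `tolerance` parameter hints at why real tools go beyond exact matching: a strict pixel-by-pixel diff would flag every rendering or anti-aliasing variation as a failure.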

Those who have taken my workshops and training, and those who have participated in visual testing exercises, will admit that it’s easy to miss critical deficiencies. A strategy that includes a visual testing tool enables testing teams to reduce the risk of “collisions” and gives your application that “auto-braking” system. Once you have experienced the benefit of a second set of “eyes” watching over your application, you will truly understand the value of having these tools in place.

Many years ago, U.S. President Ronald Reagan was quoted as saying “Trust, but verify.” With visual testing tools, testers can complement their work with a precision that the naked eye sometimes lacks. Having this visual tool in place allows testers to focus on other areas of the application and improve the quality of the product delivered.


To read more about Applitools’ visual UI testing and Application Visual Management (AVM) solutions, check out the resources section on the Applitools website. To get started with Applitools, request a demo or sign up for a free Applitools account.

About Mike Lyles:

Mike Lyles is a QA Director with over 20 years of IT experience in multiple organizations, including Fortune 50 companies. He has held various IT leadership roles across software development, program management office, and software testing. He has led various teams within testing organizations: functional testing, test environments, software configuration management, test data management, performance testing, test automation, and service virtualization. Mike has been successful in career development, team building, coaching, and mentoring of IT & QA professionals.

Mike has been an international keynote speaker at multiple conferences and events, and is regularly published in testing publications and magazines. His key motivation is helping others improve and grow in the fields of testing, leadership, and management. His first book, on leadership, will be released in early 2017.
