What UI testing will look like in 20 years

It’s the year 2040, self-driving cars are on the roads, and we pay for our coffee in Bitcoin – or do we? Such forecasts are difficult to make, but one thing is certain: digitization will continue to shape our lives. And as our devices become ever more interconnected, new demands will be placed on software.

To meet these demands, the testing of that software must adapt to the new conditions as well. Current approaches will become obsolete, and new tools will emerge. Of course, no one can say exactly how, or to what extent, the world will change. Still, in this article I dare to take a small look into the future and share my thoughts.

Voice-user interfaces are everywhere

User interfaces (UIs) have changed dramatically over time and will continue to change. In the early days of the digital world, the Command Line Interface (CLI) was the most common interface between humans and computers. Although the CLI is still highly valued by tech pioneers and developers today, it never caught on with the general public.

It wasn’t until the introduction of the Graphical User Interface (GUI) that computers became suitable for the masses. Today, the GUI is the most widely used form of UI: it appears on pretty much all laptops, tablets and even smartwatches, and by now smart home devices and automobiles are equipped with GUIs too. This is also reflected in the development of testing tools: visual testing tools, which search for display errors in precisely such GUIs, are becoming widespread and well known.

But what comes after the GUI? We can already glimpse what will probably be the next big UI – think of Alexa, Siri and co. So-called Voice User Interfaces (VUIs) go one step further than the GUI: instead of relying on sight and touch, input is given through human speech.

But how do we test such VUIs?

Visual approaches are difficult to apply when the visual element is missing. New tools will emerge that increasingly deal with the simulation and validation of speech input, and Natural Language Processing (NLP) will probably play a major role in this. Perhaps we will even experience talking robots that think up crazy test sentences – who knows.
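To make the idea concrete, here is a minimal sketch of what a VUI test could look like, assuming spoken input has already been transcribed to text. The `recognize_intent` function is a hypothetical, keyword-based stand-in for a real NLP pipeline – everything here is invented for illustration:

```python
def recognize_intent(utterance: str) -> str:
    """Naive keyword-based intent recognition (illustration only)."""
    text = utterance.lower()
    if "weather" in text:
        return "get_weather"
    if "timer" in text:
        return "set_timer"
    return "unknown"

def test_vui_intents():
    # Each case pairs a simulated spoken utterance with the intent
    # we expect the voice assistant to resolve it to.
    cases = [
        ("What's the weather like today?", "get_weather"),
        ("Set a timer for ten minutes", "set_timer"),
        ("Tell me a joke", "unknown"),
    ]
    for utterance, expected in cases:
        assert recognize_intent(utterance) == expected

test_vui_intents()
```

Instead of asserting on pixels, the test asserts on meaning: did the assistant understand the request? That shift – from visual comparison to semantic validation – is what makes VUI testing so different.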

Human behavior is simulated

Artificial Intelligence has evolved greatly as a field of research over the past decade. Although we are still a long way from Artificial General Intelligence (AGI), it is clear that artificial intelligence (AI) will change all our lives. Its use in software testing tools is increasing too: already today it helps prioritize tests and repair broken test cases through self-healing, making software tests more efficient and the work of developers and testers easier.
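As a toy illustration of test prioritization, here is a sketch that orders tests by their historical failure rate so the most fragile ones run first. Real tools use far richer signals (code churn, coverage, ML models); the data and names below are made up:

```python
def prioritize(history: dict) -> list:
    """history maps test name -> list of past results (True = passed).
    Returns test names sorted by failure rate, highest first."""
    def failure_rate(results):
        return sum(not passed for passed in results) / len(results)
    return sorted(history, key=lambda name: failure_rate(history[name]),
                  reverse=True)

runs = {
    "test_login":    [True, True, False, True],   # fails 25% of the time
    "test_checkout": [False, False, True, True],  # fails 50% of the time
    "test_search":   [True, True, True, True],    # never failed
}
print(prioritize(runs))  # test_checkout first: highest failure rate
```

Even this crude heuristic shortens feedback loops, because the tests most likely to fail report back first.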

As AI develops further, new application areas in software testing will emerge. I even go so far as to predict that we will simulate human behaviour for UI testing. Enough data about how humans interact with their devices is already available on the Internet; it just needs to be processed and put to use. A future tool could thus simulate distinct test personas and automate software tests far more efficiently.
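A hedged sketch of what such a test persona might look like: a user profile whose UI actions are drawn from weighted probabilities, standing in for behaviour models learned from real interaction data. All persona names, actions and weights here are invented:

```python
import random

# Hypothetical personas: each maps UI actions to how likely that
# kind of user is to perform them.
PERSONAS = {
    "power_user": {"search": 0.5, "settings": 0.3, "help": 0.2},
    "newcomer":   {"help": 0.6, "search": 0.3, "settings": 0.1},
}

def simulate_session(persona: str, steps: int, seed: int = 0) -> list:
    """Generate a reproducible sequence of UI actions for a persona."""
    rng = random.Random(seed)  # fixed seed keeps test runs repeatable
    weights = PERSONAS[persona]
    actions = list(weights)
    return rng.choices(actions, weights=[weights[a] for a in actions],
                       k=steps)

print(simulate_session("newcomer", 5))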

Internet of Things, Smart Home and Automotive

UI testing today is very much focused on UIs on the World Wide Web: tools such as Selenium, Cypress and Taiko concentrate exclusively on automation in the browser. In the future, digitization and the interconnection of devices will continue to advance. Much of this will remain browser-based, but other technologies will arise as well. New solutions will be developed particularly for smart home devices and automotive applications, where offline operation is sometimes necessary.

Testing strategies will have to adapt accordingly. Testing tools must become more independent of operating systems and browsers – platform-independent solutions will shape our future. Many tool vendors are already picking up on this trend: testing with the help of image recognition, for example, looks like a promising way to address platform independence. We will see a lot more of this in the future.
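The core idea behind image-recognition-based testing can be shown with a deliberately tiny example: locating a small "button" template inside a larger screenshot by exact pixel matching. Real tools use fuzzy matching and computer vision libraries; the pure-Python, exact-match version below only illustrates the principle, with images modelled as 2D lists of pixel values:

```python
def find_template(screen, template):
    """Return (row, col) of the template's top-left corner in the
    screen, or None if it does not appear. Exact match only."""
    sh, sw = len(screen), len(screen[0])
    th, tw = len(template), len(template[0])
    # Slide the template over every position where it fully fits.
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            if all(screen[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None

screenshot = [
    [0, 0, 0, 0],
    [0, 1, 2, 0],
    [0, 3, 4, 0],
]
button = [[1, 2], [3, 4]]
print(find_template(screenshot, button))  # (1, 1)
```

Because the tool looks at rendered pixels rather than a DOM or an OS widget tree, the same test works on a browser, a car dashboard or a smart fridge – which is exactly why this approach is a candidate for platform independence.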

Key Takeaways

Our world is changing: AI, autonomous driving and blockchain are the buzzwords of the last decade. This also has an impact on the software testing industry, and tool vendors and testers alike must adapt to the new circumstances.

I venture three predictions in this article:

  • Testing VUIs will challenge us in the future.
  • Artificial Intelligence will create new ways to automate testing with simulated human behaviour.
  • Platform-independent solutions will become more and more important.

I’m curious: how do you think the world will change for us software testers in the future?


3 thoughts on “What UI testing will look like in 20 years”

  1. I was interested in your view of VUIs.

    At the moment, VUIs are all very app-dependent – Alexa, Siri, Cortana and so on. Now, I’m no tech expert, but it seems to me that VUIs for general applications will only take off once the VUI exists within some sort of front-end API that allows “smart microphones” to take speech and make the output intelligible to any app. Once devs can write applications where their first job isn’t to understand the user’s instructions, but instead to take a standard set of voice inputs from a generic front-end smart microphone and then act on them, then VUIs will take off and become widespread very quickly.

    1. I definitely agree with you on this. We are only at the beginning of VUIs, and I also think we need more open-source projects in this area to get more developers involved.
