Voice-enabled technology is conquering the homes of millions of people. In the US, one in six adults already owns a voice-activated smart speaker, and adoption rates are even outpacing those of smartphones and tablets. This growing familiarity with virtual assistants strongly influences the automotive sector, too: “Today, people are anticipating voice control to become more mainstream and more accurate, and expect it to be a fundamental part of their in-vehicle user experience,” says Adam Emfield, Head of the Nuance DRIVE Lab.
Smart automotive assistants are becoming game changers
Looking at the voice-control systems in series production today, it is clear that we are in a transition phase: human-machine interfaces and infotainment systems with a menu-based approach and a static set of commands are increasingly being replaced by personalized automotive assistants that can be operated more naturally, on the basis of user intent. These automotive assistants are becoming central elements of the user experience and strengthen the vehicle manufacturers' brands. With the introduction of MBUX (Mercedes-Benz User Experience), Daimler not only emphasizes natural language interaction but has even created a unique, telling name for its automotive assistant, underscoring the system's strong focus and importance.
Technical feasibility drives a shift in mindset
“Five years ago, I heard users express that they weren’t that interested in voice input because it wasn’t accurate enough yet,” states Adam Emfield. “But since then, we traced a shift in user acceptance as the systems became more powerful and convenient. Today, people really feel that voice control is an improved mechanism to accomplish many tasks in the vehicle.” In addition to the technical capabilities that have emerged over recent years, the increasing connectivity of our lives and 24/7 access to information from a range of sources are driving development. These developments also affect vehicle design, and will continue to do so as vehicles become automated step by step.
“Our research shows that already today, users are watching the capabilities of their technology become increasingly complex, but altering combinations of buttons, switches, touch elements, etcetera, are far away from offering a standardized interaction,” explains Adam Emfield. How quickly users can learn a system, and transfer what they have learned to a new one, strongly depends on the quality of its design. Advanced voice systems based on natural language understanding (NLU) contribute significantly to simplifying and standardizing the interaction process. Because language is our most natural form of expression, advanced automotive assistants let the user complete even complex tasks in a single, one-shot voice interaction: for example, a query for parking that meets several criteria at once, rather than filtering by one criterion at a time.
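The one-shot interaction described above can be pictured as turning a single utterance into a full set of filter criteria. The following minimal sketch does this with keyword matching; all names (`ParkingQuery`, `parse_one_shot`) and the criteria themselves are hypothetical, and a production NLU stack would use trained models rather than string checks.

```python
# Illustrative sketch: one utterance fills several search criteria at once.
# Names and criteria are invented for this example, not a real API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ParkingQuery:
    covered: Optional[bool] = None      # roofed garage requested?
    ev_charging: Optional[bool] = None  # charging point requested?
    max_price: Optional[float] = None   # price cap, if any
    near: Optional[str] = None          # free-text location hint

def parse_one_shot(utterance: str) -> ParkingQuery:
    """Toy slot extraction: every criterion is filled from one utterance."""
    text = utterance.lower()
    q = ParkingQuery()
    if "covered" in text or "garage" in text:
        q.covered = True
    if "charging" in text:
        q.ev_charging = True
    for tok in text.split():
        if tok.isdigit():               # crude price slot ("under 3 euros")
            q.max_price = float(tok)
            break
    if "near" in text:
        q.near = text.split("near", 1)[1].strip()
    return q

query = parse_one_shot("Find covered parking with EV charging under 3 euros near the opera")
print(query)  # all four criteria filled from a single utterance
```

In a menu-based interface, each of these four criteria would be a separate filtering step; the one-shot interaction collapses them into one sentence.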
Car manufacturers are adapting to these changes and are using the opportunities to make their vehicles not only personalized, convenient means of transportation, but also central hubs of the user’s connected ecosystem.
Importance of advanced voice control recognized
As the most recent example, the German automotive and tech magazines AUTO BILD and COMPUTER BILD gave the new Audi A8 the 2017 Connected Car Award, highlighting its natural language control. The fully customized, conversational system makes full use of the advantages offered by our hybrid Dragon Drive platform, which combines embedded and cloud services. The deeply integrated solution offers voice-controlled access to embedded services such as music and navigation, and to in-car functions such as air conditioning. At the same time, the cloud provides connected services, including weather, points of interest and addresses, parking, and gas stations, as well as calendars and notes. Intelligently distributing data streams and functions between the cloud and the vehicle ensures very low latency and keeps the system productive even when no internet connection is available.
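The hybrid split between embedded and cloud services can be sketched as a small routing decision. The domain lists and function below are assumptions made for illustration only; they do not reflect the actual Dragon Drive API or its real domain partitioning.

```python
# Hypothetical hybrid routing: embedded domains run on-device (low latency,
# offline-capable); connected domains use the cloud when a network is
# available. Domain lists are illustrative only.
EMBEDDED_DOMAINS = {"navigation", "music", "climate"}
CLOUD_DOMAINS = {"weather", "poi", "parking", "fuel", "calendar"}

def route(domain: str, online: bool) -> str:
    """Decide where a recognized request should be executed."""
    if domain in EMBEDDED_DOMAINS:
        return "embedded"                      # always available on-device
    if domain in CLOUD_DOMAINS:
        return "cloud" if online else "unavailable"
    return "unknown"

print(route("climate", online=False))  # embedded
print(route("weather", online=True))   # cloud
print(route("weather", online=False))  # unavailable
```

The design point is graceful degradation: core in-car functions stay responsive and offline-capable on the embedded side, while connectivity-dependent services are added on top when a network is present.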
The NLU capabilities of the system enable the user to interact with it very naturally, even with implicit commands. For example, the command “It’s too cold in here” is sufficient to change the air-conditioning settings in the user’s seating zone. Thanks to text-to-speech technology, interaction with the automotive assistant is not a mere exchange of commands and prompts, but a very human-like conversation. The system can read weather forecasts as well as other types of content aloud, providing the driver with the requested information in the least distracting way possible.
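As a rough illustration of how such an implicit utterance might be mapped to an explicit action, here is a toy intent mapper; the function, action names, and zones are invented for this sketch and do not describe the production system.

```python
# Toy mapping from an implicit utterance ("It's too cold in here") to an
# explicit climate action. Action names and zones are illustrative only.
def interpret_climate(utterance: str, zone: str) -> dict:
    text = utterance.lower()
    if "too cold" in text:
        return {"action": "raise_temperature", "zone": zone}
    if "too hot" in text or "too warm" in text:
        return {"action": "lower_temperature", "zone": zone}
    return {"action": "none", "zone": zone}

print(interpret_climate("It's too cold in here", zone="driver"))
```

The user never names the air conditioning at all; the system infers both the intended function and the affected seating zone from context.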
Next evolution steps will go beyond voice control
This Audi example shows that impressive automotive assistant applications can already be seen on the road today. However, there is still vast potential for further improvement, as new mobility concepts such as vehicle automation change what users demand. Nuance offers technologies that improve in-vehicle speech quality, such as Speech Signal Enhancement, helping turn the car into a third living and working space.
In addition, Nuance stands ready to combine our artificial intelligence with additional sensor technologies, high-quality maps, and vehicle positioning technologies as they are introduced by OEMs and third-party vendors along the path to vehicle automation. These anticipated new technologies are deeply integrated into our holistic automotive platform development approach. As a result, these solutions will foster advanced interaction methods, such as the gaze interaction Nuance introduced at the Consumer Electronics Show (CES) in Las Vegas in early 2018.