The new Audi A8 with Dragon Drive

At IAA, Audi showcased the Aicon concept car, which can communicate with pedestrians as well as passengers. Apart from this futuristic vision, Audi is also showing how it feels to communicate with a car today – with Nuance Dragon Drive in the new Audi A8.
By Johannes Knapp

How will we communicate with our automotive assistants in the future? Well, from my perspective, there really is only one way: through natural language. Complicated, convoluted and frustrating voice commands to navigate through menus are a thing of the past – now, with Dragon Drive, you can simply tell the system what you want and it will react accordingly.

At IAA 2017, Audi presented the Aicon, a concept car that offers a vision of tomorrow with numerous intelligent functions. The Aicon can communicate not only with passengers but also with pedestrians, for example by projecting signs onto the street.

It also has a built-in intelligent assistant that we can communicate with directly through natural speech. However, smart automotive assistants that understand natural language aren’t exclusively a thing of the future. Indeed, they already feature in the new Audi A8, available today. Audi and Nuance are taking the voice assistant to the next level by turning it into a true dialog partner. At the Audi booth in Hall 3.0, you can experience the intelligence of the voice assistant in the brand-new Audi A8 and discover Dragon Drive, the connected car platform by Nuance.

We’re all familiar with the time-consuming and potentially frustrating nature of basic speech recognition in cars, but Nuance and Audi’s dialog-based automotive assistants truly represent the future. Dragon Drive provides a system that listens, understands and responds to the driver. This includes cloud-based speech recognition, natural language understanding and text-to-speech conversion. The driver can access all connected services and information through natural speech.
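To make that listen–understand–respond flow a little more concrete, here is a minimal Python sketch of such a pipeline. It is purely illustrative: the function names, the stubbed stages and the data types are my own assumptions, not Nuance’s actual Dragon Drive API.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    name: str    # e.g. "set_temperature"
    slots: dict  # e.g. {"value": 72}

def recognize_speech(audio: bytes) -> str:
    """Cloud-based speech recognition stage (stubbed): audio in, transcript out."""
    return "set the temperature to 72 degrees"  # placeholder transcript

def understand(transcript: str) -> Intent:
    """Natural language understanding stage (stubbed): free-form text to intent."""
    return Intent(name="set_temperature", slots={"value": 72})

def respond(intent: Intent) -> str:
    """Dialog and text-to-speech stage (stubbed): spoken confirmation back to the driver."""
    return f"Setting the temperature to {intent.slots['value']} degrees."

def handle_turn(audio: bytes) -> str:
    """One listen-understand-respond turn, as described in the text."""
    return respond(understand(recognize_speech(audio)))

print(handle_turn(b""))  # -> "Setting the temperature to 72 degrees."
```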

Drivers can access weather information, navigation and points of interest, as well as parking and petrol station information. Calendar and memo functions are also quick and easy to set up, and the infotainment and air conditioning can be controlled by voice. Check out my video from the Audi booth to see the technology in action.

With Nuance’s advanced technology, speech recognition doesn’t just mean responding to fixed commands. For example, when a driver says “I’m cold”, the system will ask for the desired temperature and adjust the air conditioning accordingly. Commands such as “Please decrease the temperature by two degrees”, “Set the temperature to 72 degrees” or “72 degrees please” are all understood and executed as well. Of course, we don’t always speak in direct and explicit commands, but with Audi and Nuance’s new natural talking feature, all communication with the car feels much more personal, which matters for drivers who spend hours in their car each day.
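As a rough illustration of what the natural language understanding has to handle for utterances like these, here is a toy parser in Python. The regular expressions, intent names and follow-up prompt are all hypothetical; the real system is of course far more sophisticated.

```python
import re

def parse_climate_command(text: str) -> dict:
    """Toy parser for the climate utterances mentioned above (illustrative only)."""
    text = text.lower()

    # Relative change: "please decrease the temperature by two degrees"
    words_to_numbers = {"one": 1, "two": 2, "three": 3}
    m = re.search(r"(increase|decrease).*by (\w+) degrees", text)
    if m:
        word = m.group(2)
        delta = int(word) if word.isdigit() else words_to_numbers.get(word, 0)
        sign = 1 if m.group(1) == "increase" else -1
        return {"action": "adjust", "delta": sign * delta}

    # Explicit target: "set the temperature to 72 degrees" / "72 degrees please"
    m = re.search(r"(\d+)\s*degrees", text)
    if m:
        return {"action": "set", "value": int(m.group(1))}

    # Implicit request: "I'm cold" -> the assistant asks a follow-up question
    if "cold" in text or "warm" in text:
        return {"action": "clarify", "prompt": "What temperature would you like?"}

    return {"action": "unknown"}

print(parse_climate_command("Set the temperature to 72 degrees"))
print(parse_climate_command("Please decrease the temperature by two degrees"))
print(parse_climate_command("I'm cold"))
```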

Thanks to the new dialog manager, the system is always ready to take commands, no matter which menu or system you’re using at the time. For example, you can set the climate control from the navigation menu or enter your destination from the radio menu. In older models, you often had to switch to the respective menu to activate certain functions.
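Conceptually, a dialog manager like this routes commands by intent rather than by whichever menu happens to be on screen. Here is a tiny hypothetical sketch of that idea; the handler names and slots are made up for illustration.

```python
# Hypothetical dispatcher: intents are routed by name, never by the active menu.
HANDLERS = {
    "set_temperature": lambda slots: f"Climate set to {slots['value']} degrees",
    "navigate_to":     lambda slots: f"Routing to {slots['destination']}",
    "tune_radio":      lambda slots: f"Playing {slots['station']}",
}

def dispatch(intent_name: str, slots: dict, active_menu: str) -> str:
    # active_menu is ignored on purpose: setting the climate from the navigation
    # menu or a destination from the radio menu both work the same way.
    handler = HANDLERS.get(intent_name)
    return handler(slots) if handler else "Sorry, I didn't get that."

print(dispatch("set_temperature", {"value": 21}, active_menu="navigation"))
print(dispatch("navigate_to", {"destination": "Frankfurt"}, active_menu="radio"))
```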

Natural interaction with your car will be an important part of tomorrow’s autonomous vehicles, and it already helps drivers navigate their journeys safely today. Check out our video from IAA for an insight into how we as passengers – and pedestrians – will communicate with cars in the future.


About Johannes Knapp

Johannes Knapp is the founder of NewGadgets.de and has run the blog since 2006, focusing mostly on topics around mobile computing and mobility. Johannes spends much of his time on airplanes and visits the most important events and trade shows worldwide. Taipei, one of the world’s technology capitals, particularly appeals to him, and he spends time there every two months to visit the most important companies and stay ahead of the latest technological trends.