Calm Down! How emotion recognition interactions can improve driving experiences

Imagine a future where your car’s automotive assistant can recognize your emotion. A future where the system learns about you and your environment, and knows how to respond, or if it’s best to leave you alone. We are collaborating with Affectiva to create such a solution, and the Nuance DRIVE Lab has been investigating how drivers feel about these concepts.
Nuance DRIVE Lab study explores acceptance of automotive assistants detecting emotions

For generations, automakers have taken pride in the emotional connections drivers build with their cars. Each automaker has its own feeling – its own brand – carefully crafted for drivers down to the smallest detail, creating loyalty that spans generations. The measure of success for an automaker is a driver who comes back to buy or lease their next car from the same brand, whether in a year or a decade. Traditionally, though, automakers focus on emotions over the long term. Nothing within the car itself monitors the day-to-day emotional state of the driver.


Let’s zoom in and consider what emotions the driver experiences every day. Imagine a car that can interpret your emotions – whether you are frustrated with something, angry at being stuck in traffic, or happy while taking a pleasant drive. Better yet, imagine a car that can tell when you are drowsy or zoned out and therefore at greater risk of an accident. These are some of the ideas we have been collaborating with Affectiva to make a reality. If you are taking a step back and considering the implications, though, do not worry: Nuance has been conducting user studies with real drivers to ensure emotion-recognition solutions make drivers feel safer.


Our Study

Think about today. Imagine you are feeling happy while taking a drive. What triggered this? Maybe you were listening to your jam while on a scenic drive. Maybe you received a phone call telling you that your daughter made the team. Now imagine a different scenario, where you are angry because you are stuck in traffic – again – or someone cuts you off and it startles you. In each of these circumstances, envision your car’s automotive assistant recognizing these emotions. What can the assistant do to help you out? When should the assistant leave you alone?


The DRIVE Lab has been studying these questions, and we spoke about the results at an event in Boston in early September. We ran 18 drivers through a user study in the Detroit area, exposing them to videos aimed at triggering different emotions: anger with traffic images, surprise with near-accident images, and joy with scenic images. We then had the system respond with songs, phrases, and navigation rerouting. For example, we played Katy Perry’s “Teenage Dream” to help drivers enjoy a scenic route more, and in another scenario, to calm a surprised driver, the system would say, “Close call, but you’re alright!” This was our initial test to discover how the system should interact with drivers when it can detect emotion. What did drivers think of the system?
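Before getting to their reactions, here is a minimal, purely illustrative sketch (in Python) of the kind of emotion-to-response mapping we tested. The emotion labels, responses, and names such as respond_to_emotion are hypothetical stand-ins for this post, not Nuance’s or Affectiva’s actual implementation.

    # Illustrative sketch only: a toy mapping from a detected emotional state
    # to a simple in-car response, mirroring the study scenarios above.
    # Labels, phrases, and names here are hypothetical, not a real product API.

    from dataclasses import dataclass
    from typing import Optional


    @dataclass
    class Intervention:
        speak: Optional[str] = None      # a phrase the assistant says aloud
        play_song: Optional[str] = None  # a song the assistant queues up
        reroute: bool = False            # whether the assistant proposes a new route


    def respond_to_emotion(emotion: str) -> Intervention:
        """Return a simple intervention for a detected emotional state."""
        if emotion == "anger":      # e.g., stuck in traffic
            return Intervention(speak="Heavy traffic ahead. Want a faster route?",
                                reroute=True)
        if emotion == "surprise":   # e.g., a near accident
            return Intervention(speak="Close call, but you're alright!")
        if emotion == "joy":        # e.g., a pleasant, scenic drive
            return Intervention(play_song="Teenage Dream")
        return Intervention()       # default: leave the driver alone


    if __name__ == "__main__":
        for mood in ("anger", "surprise", "joy", "neutral"):
            print(mood, "->", respond_to_emotion(mood))

In a real vehicle, the detected state would of course come from an emotion-recognition engine such as Affectiva’s rather than a hard-coded label, and the responses would be tuned to the individual driver and the driving context.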


What did we learn?


Most people (72%) rated themselves as content with our system detecting their emotional state, whereas only 6% found it frustrating. Drivers were initially surprised when the system responded to their emotions, but in many cases they reacted positively once they understood what was going on.

Most people (72%) rated themselves as content with our system detecting their emotional state

In this study, we found there were two types of drivers: those who valued productivity and those who valued emotional stability from an emotion-recognition system. These types were split perfectly along gender lines. Women preferred for the system to recognize their stress or mood and wanted the system to help improve their mood directly (e.g., with a connected smart home preparing calming music), whereas men preferred a car that would aid in productivity (e.g., maximizing efficiency on a route). Men rated our interventions as positive only in the traffic condition and were generally neutral in the others. Women were even more positive about the traffic interventions, and were also more positive when the system intervened in the near-accident scenarios.

Women preferred for the system to recognize their stress or mood and wanted the system to help improve their mood directly

After drivers experienced the various scenarios and the automotive assistant’s responses to their emotional states, we interviewed them. We found that while drivers remained positive, they had some preferences for when a system should – and should not – interfere. For example, when they were happy while cruising along a scenic route, they did not think the system should get involved unless the driver had pre-approved songs from a specific playlist. When they were in a near accident, drivers would listen to the system when instructed to stop and take a deep breath, but they were a bit more lukewarm on the interventions we tested. In traffic, drivers were pleased when the system responded by rerouting or calming them down, as long as the system could take the context into account. Notably, privacy was not a major concern here, even though it has come up in many of our recent studies. Above all, users wanted a system that would be customizable, that would learn from and about them over time, and that would help increase safety and productivity.


Drivers look forward to more human-like automotive assistants


These results about customizability and learning are consistent with patterns we have seen over years of research at Nuance. Drivers look forward to automotive assistants that are more human-like, more intelligent, and able to increase safety and productivity in the car. They believe that these technologies will be imperfect, but that if these customized systems can learn over time, they will offer even greater value. They expect features that are cautious about being proactive but always there to help passively. Drivers see emotion recognition as being like every other new feature coming to cars: systems should be smart enough to take context into account and should be designed holistically so each element works in harmony. Finally, drivers expect that until we have autonomous cars, new features like these should make them safer and help them attend to the road.

drivers expect that until we have autonomous cars, new features like emotion detection should make them safer and help them attend to the road.

More answers to come

As the DRIVE Lab works on advancing voice technology in collaboration with Affectiva, we will research many more ideas to understand what drivers want when it comes to emotional states while driving. We will continue to develop and test new ideas for how these day-to-day interactions can create better in-vehicle experiences, building on the emotional connections drivers already have with their cars and leading to longer-term satisfaction with the cars of the future.


Adam Emfield

About Adam Emfield

Adam Emfield is the principal user experience manager at Nuance Automotive. He leads the Design & Research, Innovation, and In-Vehicle Experience (DRIVE) Lab, and is responsible for the usability program for Nuance’s Automotive division. Coming from a background in both cognitive psychology and industrial engineering, he and his cross-functional team work across the division to develop new ideas for in-vehicle experiences, as well as to validate existing concepts.