Not a trust fall, but a trust rise to the occasion

Is a transparent system enough in an autonomous vehicle? Or do accuracy and reliability trump transparency? In this article, user experience researcher Dr. Carie Cunningham shares her personal thoughts as well as insights into ongoing research work at the Nuance DRIVE Lab.
When it comes to autonomous vehicles, it is not enough to simply strive for innovation — we must achieve ideal interactions

Who do you trust? Why do you trust them? For me, trust is the measure of a true friend. If I don’t trust someone, then that person and I aren’t close. There is something instinctual about feeling the security and safety of trust. In fact, safety needs are primal — just after our physiological needs of food, water, and warmth, according to Maslow’s hierarchy of needs. This basic need for trust gives us the confidence that someone has our back.

We all have that friend we trust day-to-day, but get in a car with them and you wince. Your sense of safety and confidence in them drops with every impulsive stomp on the floorboard. Those are the trips where you are reaching for the car door before the car even comes to a complete stop.

Then there are those relationships where you know your place. Like assigned seats in grade school, you take your positions in the car — ready to conquer the road trip. These roles are more habitual where trust was established long ago and endures with every mile. Trust and driving are inseparable.


“I’ve got your back”

Okay, now think back to those first questions: who do you trust? Why do you trust them? Now name the pilot of your last flight. Don’t worry — I don’t know either. So why does trust come so easily with airline pilots, while the thought of having “airplane mode” in our cars gives us pause?

I think back to the implementation of anti-lock braking systems (ABS). I think back to the implementation of airbags. I think back to the implementation of lane assist. These “new” technologies greatly enhance our safety, yet they only work when we trust them. As we move into the future of autonomous vehicles, our trust needs to evolve as well. Just as a calm voice said “take it slow” when you took the wheel for the first time in driver’s education, now we need a voice in the car telling us, “I’ve got your back.”


Transparency meets reliability

Recently, the DRIVE — Design, Research, Innovation and In-Vehicle Experience — Lab looked into trust and a car’s voice system. The DRIVE Lab focuses on the driver, the passengers, the pedestrians, and the future. In our study, we compared a more transparent voice system with a less transparent one. Here, transparency took the form of system status updates; for example, the system might say, “My apologies. Completing request now.” In our driving simulator, we found that drivers may think it is nice for a system to be transparent, but overall they want the system to simply work.


Looking towards the future while also watching your back

These kinds of studies help us think into the future. Is a transparent system enough in an autonomous vehicle? Or do accuracy and reliability trump transparency? I think back to who I trust to drive my car today. Are they transparent, or are they reliable? When it comes to autonomous vehicles, it is not enough to simply strive for innovation — we must achieve ideal interactions. We are doing the research of the future today to make sure your voice-enabled technology interaction of the future is one that allows you to give up control and give trust.

So, consider the ease of a future where you hand over the burden of driving and give your trust to your car. Nuance is looking towards the future, but also watching your back.


About Dr. Carie Cunningham

Dr. Carie Cunningham is a User Experience Researcher at Nuance Communications. In this role she is responsible for qualitative and quantitative testing, research, and analysis through the use of focus groups, in-depth interviews, eye tracking, surveys, and experiments with the DRIVE Lab. Carie has researched users’ preferences among multiple virtual assistants and the personification of those assistants. She has also tested users’ driving performance while engaged with their infotainment system and voice-enabled technology. Most recently, Carie has conducted several exploratory studies on trust and learning around AI and VR. She is interested in attention and cognitive processing in media communication and interactions. Carie is a former television news producer and assistant professor.