Interview with Dr. Thomas Vöhringer-Kuhnt of Continental


In your round table at our Intuitive Vehicles event you will discuss User Experience in Autonomous Driving with our attendees. Without revealing too much, where do you see the main challenges with regard to UX in autonomous driving?

First of all, I prefer to talk about automated rather than autonomous driving. Unless the vehicle is completely driverless, it is not fully autonomous. Still, for a lot of people, highly automated driving in particular is a very emotional subject. On the one hand there is a lot of skepticism about giving up control; on the other hand we have seen how overtrust in system capabilities can lead to fatal accidents. Ironically, complacency has more severe consequences when automation failures are less frequent: the more reliably the automation works, the more it is trusted, and the more complacent drivers become. As technology companies we are very successful at improving technical capabilities and eliminating failures, but we need to focus much more on addressing the desires and needs of the end users. Let me cite Don Norman from a book chapter in which he writes about the human side of automation: "The paradox of automated driving is that it will only work if the human user is placed at the center."
There has been a lot of discussion about HMI for levels 3 and 4 and the challenges of take-over situations. What are your thoughts on this?

In order to cope with take-over scenarios, which are certainly challenging, vehicle and driver need to know something about each other. The vehicle has to be aware of the driver's current capability to perform the driving task again, and the driver needs to know the current automation state, the system's confidence, and the actions it will take next. But above all, we should not design the HMI for a particular level of automation, but for collaborative interaction along an automation continuum. Frankly, those distinct levels are more relevant to lawyers who need to decide whom to blame for driving errors.
When talking about Autonomous Driving, often trust is mentioned as key for automation to become a reality. What needs to be done in this direction? How can we build up trust in the system?

Building trust requires communication, interaction, and time. Translated to human-machine collaboration, this means continuously providing multimodal information, e.g. about the active driving mode, system state, system capabilities, and current and future actions. Or, to put it with Don Norman again: "The information given by the vehicle is central to making that a reassuring experience." After a number of successfully mastered interactions the amount of information can be reduced, and trust will grow with each positive experience with the system.

Can designs inspire trust in your opinion?

Deriving requirements for automated systems from user-centered design methods should be our priority. If we provide answers to end users' desires and needs, our holistic HMI design will not only increase trust but also lead to a more exciting driving experience.
Can you elaborate more on the concept of cooperative driving?

Let me refer to Don Norman again, who postulates that the user experience of automated driving has to evolve from I DRIVE to WE DRIVE. Our goal should not be to perfect and fully automate each and every task that can be automated. Instead, we must develop cooperative systems that keep the human in the control loop. Both actors (human and machine) can then execute the tasks they perform best: humans setting high-level goals, machines accurately controlling vehicle stabilization and maneuvering. An example of cooperative automation would be the driver giving an overtaking command, which is then executed automatically by the vehicle. In such situations it is entirely clear who gives the instructions and who does the driving, but both vehicle and driver remain involved in the overall task of getting from A to B.