Adam Emfield, senior manager of user experience at Nuance Automotive, explains why the future of in-car interactions will be multimodal
When we bring up innovation in the automotive industry, most conversations instinctively swing toward connected, autonomous, shared mobility and electrification (CASE) technologies. An important aspect of these developments is the part they’ll play in changing the way people interact with their vehicle infotainment and control systems. In the future, interactions will be initiated by more than simply touch and voice: it will be almost a full-body experience, leveraging many of your senses and your non-verbal communication skills. We’re not suggesting you’ll be licking your steering wheel anytime soon, but the subtleties of how we communicate with other people will make their way into the car through mobility assistants. These assistants will be contextually driven, providing users with tailored information to suit the time and place. They will no longer be user-centric, but me-centric, catering to each individual user.
Body language
Talking to a car today feels a lot like speaking to a person on the phone, with verbal exchanges passed back and forth. In the near future, cars will determine where you’re looking and what you’re doing with your hands. Imagine being able to point or look at something in your environment and ask questions about it, or to talk to the car without pushing a button or using an activation word. Today, it is difficult to display information where you need it most. There’s a good chance your car is designed so that you look at a display in the center console when interacting with it, or that you have to pick up your phone while driving. We believe that future vehicles will put the information exactly where you need it. For example, information will be projected onto your windshield, providing an augmented view of the world outside [see page 40 for holographic HUDs]. We may even be able to direct the audio to a single person (say, the driver) when others don’t need to hear it. The system will also be able to generate natural phrases and adjust its tone according to the urgency of the situation, rather than repeating monotone instructions.
Concerned about privacy or distraction?
Don’t worry, so are we. Nuance is researching and designing balanced systems that provide an appropriate level of customization. These personal assistants will learn about you and your habits, but will only be there when you need them.
At Nuance, we understand that not all driving situations are equal, so we’re designing multimodal systems that complete every interaction as seamlessly as possible, allowing the driver to keep their eyes firmly on the road.