Chinese EV manufacturer Nio offers all its cars with an optional in-car personality called Nomi Mate, an AI-driven human-machine interface (HMI) for Nio’s operating system. The interface is presented to the driver via a dashtop device that gives the vehicle’s personality physical form as a head and face. The system is a dynamic one, with Nio seeking to imitate the subtleties of human interaction. Automotive Interiors World spoke to Ted Li, associate vice president of product management at Nio, to understand more about the development of this characterful in-car assistant.
What was the main reason behind the development of Nomi Mate?
People always talk about products, sales, marketing and technology. But very few people talk about the relationship between machines and people, and especially, ultimately, the relationship between a brand and the user.
Nio started by thinking about this question: what should the relationship be between future users and a Nio car, as a friend? We believe we are about more than just selling vehicles or providing automotive services. It’s about building lifetime relationships with the user, building an emotional connection and emotional attachment throughout the vehicle’s whole life – this drives a lot of the fundamental thinking behind our product. That’s why we came up with Nomi Mate, and we call it that because it defines the companionship we want users to have via the interface.
For example, when we discussed technology like voice commands and wanted to add that capability to our cars, we immediately saw a tension between that functionality and building a relationship. Our company founder, William Li, spotted the problem: if the car is truly a companion to the user, why does the user have to shout into thin air when communicating with the vehicle? Normally, when a person is talking to another person, they look at each other – they’re not talking to the air.
Why the decision to make the system move?
This relates back to the companionship. The car not only has to talk to you; you have to feel it. Life is not static. There are different ways to make the system seem alive. You could have a graphic that just pops up when you talk and then goes away, but we felt that didn’t build that companionship. We didn’t just want a digital figure to represent the relationship; we wanted the system to feel alive, and to feel part of the car, so we came up with this head and face for the vehicle.
Initially, back in 2015 when we started developing the system, there was a feeling that having Nomi move and talk could actually alarm people, so the idea was that it would just move, gesture in a way, and make sounds to acknowledge that it was listening. But eventually we figured out that talking was necessary for it to be genuinely functional; otherwise it would just be a cute device that didn’t add much value.
How important was it to impart natural movement to the system?
When we were defining it, we wanted the movement to be as close to perfect as possible, resembling human head movements. In the latest Mate there are two motors: one controls the horizontal rotation, the other controls the vertical rotation, and combined they create this 3D motion. To achieve the natural feel, we really looked to the film and animation industry to see how we could design the whole character so that the movements are not simply robotic and mechanical, but driven by emotions. We had to design the right hardware to allow the tiny, almost imperceptible movements – the motors, the belts, all of that – with the movement timed to millisecond accuracy.
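What Li describes amounts to a classic pan-tilt mechanism driven by expressive keyframes rather than point-to-point robotic moves. As a rough illustration only – the gesture data, easing curve and function names below are hypothetical, not Nio’s implementation – an emotion-driven move for two such motors, sampled at millisecond resolution, might look like this in Python:

    import math

    # Hypothetical sketch of a gesture for a two-motor (pan/tilt) head.
    # Angles in degrees, time in milliseconds. None of these names come
    # from Nio's actual software.

    def ease_in_out(t):
        # S-curve easing: the head accelerates and settles like a living
        # thing rather than snapping between positions.
        return 0.5 - 0.5 * math.cos(math.pi * t)

    # A "nod" gesture as (time_ms, pan_deg, tilt_deg) keyframes.
    NOD = [(0, 0.0, 0.0), (250, 0.0, -12.0), (500, 0.0, 0.0)]

    def pose_at(gesture, t_ms):
        # Interpolate between keyframes at millisecond resolution.
        for (t0, p0, k0), (t1, p1, k1) in zip(gesture, gesture[1:]):
            if t0 <= t_ms <= t1:
                u = ease_in_out((t_ms - t0) / (t1 - t0))
                return p0 + (p1 - p0) * u, k0 + (k1 - k0) * u
        return gesture[-1][1], gesture[-1][2]  # hold the final pose

    # Sample once per millisecond and hand the targets to the motors.
    for t in range(0, 501):
        pan, tilt = pose_at(NOD, t)
        # send_to_motors(pan, tilt)  # hardware interface omitted

The easing function is the part that does the emotional work here: the same two motors read as mechanical or alive depending almost entirely on how the motion is shaped in time.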
How important is the software element of the system?
The fundamental software that enables the system is called our emotion engine. We developed our own software stack, with automatic speech recognition, which is the front end; then in the middle we have natural language understanding. Some of that is processed on the car, but most is done in the cloud. That part of the system is updated and improved almost daily. The final part is how the system speaks to you: the natural language generation. That part is a very standard engine, but we have to customize how it sounds.
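Read as an architecture, that is a three-stage pipeline: speech recognition at the front, language understanding split between car and cloud, and language generation at the back. The sketch below shows the shape of such a pipeline in Python; every function is a hypothetical stub standing in for Nio’s actual components:

    def recognize_speech(audio):
        # Stub ASR front end: a real system would decode the audio here.
        return "play some music"

    def on_car_nlu(text):
        # Small on-device model: a handful of high-frequency intents only.
        if "music" in text:
            return {"intent": "play_music"}
        return None  # not understood locally

    def cloud_nlu(text):
        # Cloud-side model: broader coverage, improved almost daily.
        return {"intent": "chitchat", "text": text}

    def plan_reply(intent):
        # NLG stub: a real system would also synthesize a customized voice.
        if intent["intent"] == "play_music":
            return "Okay, playing some music."
        return "Sorry, could you say that again?"

    def handle_utterance(audio):
        text = recognize_speech(audio)                # 1. ASR front end
        intent = on_car_nlu(text) or cloud_nlu(text)  # 2. NLU: car first, cloud fallback
        return plan_reply(intent)                     # 3. NLG reply

    print(handle_utterance(b"..."))  # -> "Okay, playing some music."

The car-first, cloud-fallback split is one common way to keep high-frequency commands working offline while the broader cloud model keeps improving day by day, matching the update cadence Li describes.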
The other part [of the software] relates to the connectivity and the content access services, which are endless. For example, Nomi is linked into many of the capabilities, like navigation and music, that are integrated into Nio OS.
What are the boundaries of Nomi’s involvement in the driving experience?
Initially we set a clear boundary: Nomi couldn’t execute driving-related, safety-critical tasks. For example, that boundary prevents the system from changing the drive mode. You don’t want a child, or your wife, or your other friends in the car to say, “Change drive mode”, and have it change. The drive mode alters the pedal force and the steering feel, so it might interfere with or impact your driving behavior.
There are also some other, subtler things. We didn’t allow Nomi to operate the tailgate, because someone outside the car might say, “Open tailgate”, and it would just open. However, this year we have figured out that many of those constraints are not necessary and we can work out ways of adding functionality. For example, hopefully in the future we can even talk to Nomi and say, “Drive for me”, which would activate the automated driving functions. Of course, we will need to do a lot of testing and validation, as well as look at the impact on the user experience, before we implement things like that.
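In software terms, what Li describes is a policy gate in front of intent execution: anything tagged safety-critical is refused unless it has passed validation and, where relevant, comes from a verified driver. The sketch below is purely illustrative – the intent names, the validated set and the speaker check are assumptions, not Nio’s design:

    # Intents tagged as safety-critical are refused unless explicitly
    # validated and requested by a verified driver. All names hypothetical.
    SAFETY_CRITICAL = {"set_drive_mode", "open_tailgate", "activate_auto_drive"}
    VALIDATED = set()  # grows as features pass testing and validation

    def execute_voice_intent(intent, from_verified_driver):
        if intent in SAFETY_CRITICAL:
            if intent not in VALIDATED or not from_verified_driver:
                return "refused: " + intent + " is safety-critical"
        return "executing: " + intent

    print(execute_voice_intent("play_music", from_verified_driver=False))
    print(execute_voice_intent("set_drive_mode", from_verified_driver=True))

Relaxing a constraint then becomes a matter of moving an intent into the validated set after testing, rather than rearchitecting the assistant, which fits the gradual unlocking of features Li outlines.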