Voice assistants in cars are becoming more like co-drivers, said Christophe Couvreur, vice president of product at Cerence. They are not only responding to commands but also understanding everyday language and offering solutions.
Couvreur has been in the midst of these changes for years; Cerence’s core business is speech and other artificial intelligence (AI) technologies for the automotive industry, which it supplies to original equipment manufacturers (OEMs) and the Tier-1 suppliers whose equipment ends up in the car.
“Wider data connectivity will allow the car to behave in a much smarter way, assisting the driver in a much smarter way,” Couvreur told PYMNTS.
In recent weeks, Cerence has expanded its work with additional manufacturers, including Audi, where it will power the in-car assistant platform for the new Audi e-tron GT and Audi Q4 e-tron.
Cerence Pay, the company’s voice-powered in-car payment product, is available and has been supplied to OEMs and Tier-1s, but the cars that integrate it are not yet on the road; the development cycle for a new car can last 18 months to three years.
“People don’t want to talk to their cars for the sake of talking — they want to tell their cars to do something, and we keep looking for new use cases, new applications or new technologies that do something for the driver,” Couvreur said.
Paying Without a Smartphone
In payments, Cerence saw an opportunity to make drivers’ lives easier by letting them use the in-car voice assistant to pay for gas, make restaurant reservations and handle similar tasks.
The company works with classic payment processors, so if a retailer can accept credit cards, it can accept Cerence Pay. The retailer does need something for Cerence to interface with, such as an API. For restaurant reservations, Cerence Pay connects with OpenTable. It also handles tolls and parking, letting drivers prepay for parking without pulling out their smartphones; the company has forged partnerships with a range of parking vendors.
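Cerence has not published the internals of these integrations, but the flow Couvreur describes, a voice intent handed off to whatever API the merchant or vendor exposes, can be sketched roughly as below. The intent names, the `MerchantGateway` adapter and the processor client are hypothetical, purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch only: none of these names come from Cerence. It just
# illustrates that the assistant needs "something to interface with, such as
# an API" and that payments ride the same rails as an ordinary card charge.

@dataclass
class VoiceIntent:
    action: str                 # e.g. "pay_fuel", "reserve_table", "prepay_parking"
    merchant_id: str            # which retailer or vendor the driver is at
    amount_cents: int = 0
    details: dict | None = None

class MerchantGateway:
    """Thin adapter over whatever API a given retailer or vendor exposes."""

    def __init__(self, processor_client):
        self.processor = processor_client  # classic payment processor client

    def handle(self, intent: VoiceIntent) -> str:
        if intent.action == "pay_fuel":
            # If the retailer takes cards, the processor can settle this too.
            return self.processor.charge(intent.merchant_id, intent.amount_cents)
        if intent.action == "reserve_table":
            # A reservation service (e.g. OpenTable) would be called through
            # its own API here; shown as a placeholder.
            return f"reservation requested at {intent.merchant_id}"
        if intent.action == "prepay_parking":
            return f"parking prepaid at {intent.merchant_id}"
        raise ValueError(f"unsupported action: {intent.action}")
```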
“As a driver, I don’t want to get my phone out of my pocket,” Couvreur said. “I may be on my way and I spotted parking.”
Cerence Pay is sold separately from the company’s voice assistant and can be used with or without it.
A Safer Way of Operating
The first application of Cerence voice technology was the ability to operate a phone with voice. Since then, the company has been increasing its capabilities. Cerence collaborated with Ford on SYNC, Ford’s voice-powered, in-car infotainment system that launched in 2007.
“The reason speech was relevant for automotive was initially safety,” Couvreur explained. “Voice gives you a very safe way of operating things in the car, hands-free and eyes-free.”
In the years that followed, the technology got better and the computers in cars got more powerful. Voice control of navigation systems was added. “Entering an address, even with a touchscreen, is not that easy while driving,” Couvreur said. “Even if you forget about safety concerns, it’s much easier to say the address.”
Similarly, navigating a city, searching for information and adjusting a seat when driving are all done more easily and safely with voice. “As the technology was getting better at understanding people, we saw that as a very attractive market with a lot of headroom for growth, and that’s why we got involved,” Couvreur noted.
Voice Is Growing More Common, More Capable
In 2010, Cerence introduced natural language understanding technology in the car, which allowed for a more natural and flexible way of speaking rather than requiring a specific command. As cars became connected in 2012, BMW introduced a system that allowed drivers to dictate text messages by voice.
Over the years, voice usage in cars has become more common and much more capable. Couvreur cited third-party data that showed 60% to 70% of cars have voice assistants, and of the people who have them, half are using them on a regular basis.
“In the medium and high range, it has become almost the de facto standard,” Couvreur said. “It’s only on the very low end that you don’t see it as a standard feature.”
Cars Are Becoming Aware
The past and present development of voice assistance has been driven by computers becoming more powerful and less expensive, data connectivity being introduced and improved, and the acceleration of AI and machine learning.
“You are going to see cars become much more context-aware of what’s going on,” Couvreur said.
For example, today, if you ask a car to find a charging station, it can only look at the map, find the nearest charging station and tell you about it. But with connectivity, it can look at which stations are in use and recommend one with a free slot. Even better, it could see if other cars are heading to that charging station and if a car that is already there is nearly done charging.
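As a rough illustration of the kind of context-aware ranking Couvreur describes, a connected assistant might weigh travel time against live occupancy rather than simply picking the nearest station. The data fields and scoring rule below are assumptions for the sake of the example, not Cerence’s implementation.

```python
from dataclasses import dataclass

# Illustrative only: ranks charging stations using live occupancy and an
# estimate of when in-use chargers free up. Field names and the scoring
# rule are assumptions, not Cerence's design.

@dataclass
class Station:
    name: str
    distance_km: float        # driving distance from the car
    free_slots: int           # live count from the connected service
    min_to_next_free: float   # minutes until an occupied charger finishes
    inbound_cars: int         # other cars already routing here

def recommend(stations: list[Station]) -> Station:
    def expected_wait(s: Station) -> float:
        if s.free_slots > s.inbound_cars:  # a slot should still be open on arrival
            return 0.0
        return s.min_to_next_free          # otherwise wait for a charger to finish

    # Prefer the lowest combined travel + waiting time
    # (assuming ~1 minute per km of driving, purely for illustration).
    return min(stations, key=lambda s: s.distance_km * 1.0 + expected_wait(s))

# Example: a farther station with a free slot beats the nearest, fully occupied one.
best = recommend([
    Station("Nearest", 2.0, free_slots=0, min_to_next_free=25.0, inbound_cars=1),
    Station("Farther", 5.0, free_slots=2, min_to_next_free=0.0, inbound_cars=1),
])
print(best.name)  # -> "Farther"
```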
Less an Assistant, More a Co-Driver
Also becoming more common are built-in cameras that detect if a driver is not paying attention to the road, is drowsy or is using a phone. Some cars have this today, but the European Union has mandated that all cars must have driver monitoring systems by 2025.
“Getting those extra sensors in the car to be combined with the voice assistant, or the co-driver in general, is one of the things you’re going to see,” Couvreur predicted. “The fact that the system will get smarter also means it can become less of an assistant that you ask to do something and reacts to it, and more of a co-driver that takes all of that information — visual and contextual — and tries to assist you proactively.”
Overall, he said, voice technologies are becoming less about command and control and more about AI and making the system “disappear.”
“We are moving in the direction where you will be able to interact with your car as you normally would with any other human being,” Couvreur concluded. “The car will be smart enough to be like a great assistant to you, like a co-driver that can talk to you, advise you and help you with things.”