Mapping/navigation company TomTom says it has developed an artificial intelligence (AI)-powered, in-car conversational assistant in collaboration with Microsoft.
The tool offers enhanced voice interaction with infotainment, location search, and vehicle command systems, and arrives at the end of a year that saw voice AI integrated into a growing range of applications.
“Drivers can converse naturally with their vehicle and ask the AI-powered assistant to navigate to a certain location, find specific stops along their route, and vocally control onboard systems to, for instance, turn up the temperature, open windows, or change radio stations,” TomTom said in its Tuesday (Dec. 19) announcement. “All with a single interaction.”
According to the announcement, the solution integrates Microsoft Azure OpenAI Service to take advantage of large language models (LLMs) and other Azure applications. TomTom says the assistant is built into Digital Cockpit, the company’s in-vehicle infotainment platform, and can also be integrated into other automotive infotainment systems.
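TomTom has not published implementation details, but the general pattern of pairing a hosted LLM with onboard controls can be sketched with Azure OpenAI’s function-calling interface. The example below is purely illustrative and not TomTom’s code: the endpoint, deployment name, and the set_cabin_temperature tool are all hypothetical assumptions.

```python
# Illustrative sketch only: routing a driver's spoken request to a vehicle
# command via Azure OpenAI function calling. Endpoint, key, deployment name,
# and the set_cabin_temperature tool are hypothetical, not TomTom's.
import json
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # hypothetical
    api_key="YOUR_KEY",
    api_version="2024-02-01",
)

# Describe one onboard control as a tool the model may choose to call.
tools = [{
    "type": "function",
    "function": {
        "name": "set_cabin_temperature",
        "description": "Set the cabin temperature in degrees Celsius.",
        "parameters": {
            "type": "object",
            "properties": {"celsius": {"type": "number"}},
            "required": ["celsius"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical Azure deployment name
    messages=[{"role": "user", "content": "I'm cold, warm it up to 23 degrees."}],
    tools=tools,
)

# If the model decided to invoke the tool, the parsed arguments would then be
# handed to the vehicle's climate-control interface.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
    # e.g. set_cabin_temperature {'celsius': 23}
```

In practice, a production assistant would layer speech-to-text in front of this step and expose navigation, media, and climate functions as additional tools, but the request-to-function routing shown here is the core idea behind LLM-driven voice control.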
The launch of the new assistant comes at a time when nearly 40% of U.S. consumers are using voice tech to get directions on the road, according to “Preparing for a Voice Commerce Future,” a PYMNTS Intelligence report.
In addition, that study found that 15% of consumers had used their cars’ voice capabilities in the 12 months prior to being surveyed, a share that “is expected to grow as businesses provide solutions to meet the increasing demand for easy-to-use voice interactions in vehicles,” PYMNTS wrote last month.
Beyond cars, this year has also seen voice AI become the main user interface of the world’s first purpose-built AI device, make its way into ordering systems at quick-service and fast-casual restaurants, and undergo continual refinement by tech giants like Google, Meta, OpenAI, and Anthropic.
“But despite its vast promise as a generalist solution, voice AI, to date, is still a nut waiting to be cracked,” PYMNTS wrote this week.
That’s because, the report noted, even the most advanced AI does not operate the way the human mind does, and with artificial general intelligence (AGI) still not a reality, modern AI systems perform best “when they are trained on and built atop domain-specific, localized data sets for purpose-designed tasks.”
Those tasks include restaurant ordering from a preset menu and ambient notetaking in a clinical healthcare setting.