The proliferation of voice-activated technology, from virtual shopping assistants to speech recognition in smartphones and smart homes, is reshaping how consumers interact with devices.
Advances in the field continue to expand what voice technology can achieve.
For instance, assistive voice technology such as that developed by Whispp is empowering individuals with voice disabilities to communicate more fluently and confidently. Additionally, applications like self-service and in-car voice technologies are not only enhancing convenience and enriching experiences but also boosting sales for brands.
However, with the growing interest in and adoption of voice tech come concerns about its potential misuse for fraud.
As Karen Postma, managing vice president of risk analytics and fraud services at PSCU, told PYMNTS last October, fraudsters utilizing generative artificial intelligence (AI) “can effectively mimic a voice within three seconds of having recorded data,” and are “utilizing AI to not just commit attacks, but to become very good at committing these attacks.”
AI tools such as voice cloning have proved a potent boost for cybercriminals, minimizing the effort needed to manipulate targets, PYMNTS also wrote in January.
In all, more than 28% of the 46.75 billion unknown calls analyzed by voice security firm Hiya in 2023 (roughly 13 billion calls) were determined to be spam or fraud, up from 24% in 2022. Moreover, 16% of consumers fell prey to phone scams in 2023, a trend compounded by the low adoption of fraud prevention apps designed to bolster call protection.
These concerns have spurred government agencies into action.
The Federal Trade Commission (FTC) finalized a rule earlier this month banning the impersonation of government agencies and businesses, schemes that AI tools have made easier to carry out. The move follows the agency’s Voice Cloning Challenge last year, which solicited ideas for preventing the misuse of this technology.
The new rule equips the FTC with stronger tools to combat impersonation scams, including the ability to file federal court cases directly to recover funds obtained through impersonation. Additionally, the FTC has proposed an extension to make using AI to impersonate individuals a civil offense, addressing feedback received during the public comment period.
FTC Chair Lina Khan emphasized the urgency of protecting consumers from AI-driven impersonation scams, stating that “with voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever.”
Meanwhile, the Federal Communications Commission (FCC) has intensified its measures against voice-related fraud, recently ruling that AI-generated voices, frequently employed in robocall scams, are illegal.
These calls have proliferated in recent years, the agency said in a Jan. 31 press release, with the technology capable of misleading consumers by mimicking the voices of celebrities, politicians and family members.
To underscore its commitment, the FCC’s Enforcement Bureau took action earlier this week (March 27) to remove voice service provider BPO Innovate from the Robocall Mitigation Database.
This action follows similar recent orders against other carriers that failed to comply with the FCC’s regulatory requirements, giving the agency leverage to shut down illegal robocall campaigns run through providers that disregard consumer protection measures.
These actions aren’t merely about shutting down operations; they also include imposing penalties. Notably, the FCC fined Sumco Panama nearly $300 million for orchestrating auto warranty robocalls in 2022, the largest penalty in the agency’s history.
Commenting on the BPO Innovate news, FCC Chair Jessica Rosenworcel said: “Scammers are still finding ways to defraud consumers and degrade public trust in our communications networks. So, we won’t let up either. We have tools available to go after illegal robocallers and the phone companies that help them, and we will continue to use them. If you break the law, we’ll keep coming after you.”