
As States Lead Efforts to Address Deepfakes in Political Ads, Federal Lawmakers Seek Nationwide Policies

April 30, 2024

By: (Inside Global Tech)

A New Orleans magician recently made headlines by using artificial intelligence (AI) to replicate President Biden’s voice, without his consent, in a deceptive robocall to New Hampshire voters. The incident underscored the real risks that AI-generated “deepfakes” pose to the integrity of elections. As rapidly advancing AI capabilities collide with the 2024 elections, federal and state policymakers alike are moving to protect the public from deceptive AI-generated political content.

AI-generated media that mimics a person’s voice or appearance presents formidable challenges for regulators. As deepfakes become increasingly difficult to distinguish from genuine content, members of Congress, federal regulatory bodies, and outside stakeholders are all urging action to mitigate the electoral threats deepfakes pose.

Several federal regulators have taken preliminary steps to address AI-generated content within their existing jurisdictions. On February 8, 2024, the Federal Communications Commission (FCC) issued a declaratory ruling affirming that the Telephone Consumer Protection Act’s restrictions on artificial or prerecorded voices extend to “current AI technologies that generate human voices,” a position supported by 26 state attorneys general.

Last year, the Federal Election Commission (FEC) began examining whether AI-generated deepfakes could violate the Federal Election Campaign Act’s prohibition on deceptive campaign practices. After initially deadlocking on a petition from Public Citizen to open a rulemaking on the issue, the FEC unanimously voted in August 2023 to seek public comment on whether to begin rulemaking proceedings. The agency has not yet moved beyond that initial step.
