In response to escalating national concern over technology's role in moderating illicit content, Democratic Assemblymember Marc Berman of Silicon Valley is spearheading a legislative effort to crack down on AI-generated depictions of child sexual abuse. Berman's proposed bill, first reported in California Playbook, would update the state's penal code to criminalize the production, distribution, or possession of such material, even if it is entirely fictitious.
Backing the initiative is Common Sense Media, the nonprofit founded by Jim Steyer that has long advocated for children's online safety and privacy. If enacted, the legislation could open a new avenue of complaints against social media companies, which are already under fire for perceived failures to moderate harmful material on their platforms.
Berman's bill is part of a broader legislative landscape: at least a dozen proposals aimed at setting limits on artificial intelligence are expected to come before California lawmakers this year. The push responds to growing concern over the unchecked proliferation of AI-generated content, particularly material depicting child sexual abuse.
This legislative push builds on a bipartisan law signed by Governor Gavin Newsom last year, which required social media platforms to step up their efforts to combat child sexual abuse material and granted victims the ability to sue companies whose features lead to commercial sexual exploitation. That bill became law despite opposition from influential groups including the California Chamber of Commerce and tech associations such as TechNet and NetChoice.
Berman's new bill takes a different approach, targeting the creators and distributors of AI-generated images rather than the platforms that host such content. Even so, the shift in focus raises concerns for a tech industry already grappling with the broader implications of deploying AI.
Tech industry groups representing companies such as Google, Pinterest, TikTok, and Meta (the parent company of Instagram and Facebook) have expressed reservations about the legislation, arguing that such stringent regulations could inadvertently chill expression in online spaces.
Berman, however, emphasizes the need to address a troubling trend in the use of AI, particularly for creating child sexual abuse material. In a single quarter last year, Meta reported a staggering 7.6 million instances of child sexual abuse material to the National Center for Missing and Exploited Children.
At the heart of the matter is the fact that AI-generated content depicting minors often relies on information and images scraped from real sexual abuse material, potentially fueling real-life abuse of children. Berman points out that even when law enforcement agencies in California encounter such material, they have been hindered from prosecuting individuals because the content is digitally manufactured.
Source: Politico