
By: Kate White & Ioana Gorecki (Kelley Drye)
The National Eating Disorder Association (NEDA) recently encountered an unexpected problem with its mental-health chatbot. Instead of offering support to individuals seeking help for eating disorders, the chatbot began giving diet advice. Originally built as a closed, rule-based system, the chatbot was enhanced with an AI component in 2022 without NEDA's consultation or authorization. NEDA promptly deactivated the chatbot in response.
This incident highlights the risks companies face when deploying AI chatbots for customer service and consumer support. Recognizing the significance of the issue, regulators and enforcement agencies such as the CFPB and FTC have issued blog posts and reports cautioning companies against over-reliance on chatbots and generative AI for customer service and resolving consumer complaints…