OpenAI’s ChatGPT has begun to lose users, though it’s not quite clear why.
A Friday (July 7) report by the Washington Post, citing numbers from data firm Similarweb, says worldwide mobile and desktop traffic for the generative artificial intelligence (AI) tool fell by nearly 10% from May to June.
And data firm Sensor Tower says downloads of ChatGPT’s iPhone app have declined steadily since peaking in early June.
The Post article offered some possible reasons for the drop-off: quality may have declined as soaring popularity drove up the cost of keeping ChatGPT running, prompting OpenAI to make cost-cutting tweaks. It may also be that fewer students are using the tool to write papers now that school isn’t in session.
A separate report by Ars Technica on Friday notes that ChatGPT is facing a number of external factors that could impact its user numbers, such as companies urging their workers not to use generative AI tools due to privacy concerns.
In addition, the report said, OpenAI has begun responding to user backlash and pressure from regulators by censoring harmful ChatGPT responses, which may have driven some users away, “possibly viewing it as less useful, less trustworthy, or simply less fun.”
The company is also facing a federal lawsuit from a California firm that accuses OpenAI of engaging in a campaign to “secretly harvest massive amounts of personal data from the internet.” The suit claims this data, collected without owners’ knowledge or permission, included private information and conversations, medical data and information about children.
“Without this unprecedented theft of private and copyrighted information belonging to real people,” the lawsuit by the Clarkson Firm says, OpenAI and ChatGPT “would not be the multibillion-dollar business they are today.”
Meanwhile, PYMNTS recently examined the cost of adding AI to businesses, noting that the rapid growth of GPT products “could easily become unsustainable.” Even the White House has pointed to the potential environmental impact of the increased energy consumption and data center space that extended generative AI applications would require.
“Before dealing with the cost of running large language models (LLMs), most companies interested in developing their own generative AI solutions will come up against the cost of training them,” PYMNTS wrote.
Training generative AI requires owning or renting time on specialized hardware, substantial data storage and intensive energy consumption. The cost just to train OpenAI’s GPT-3 — the version before the one employed in ChatGPT — exceeded $5 million.