UK Wants More Transparency Into AI Models

The U.K. is reportedly working toward greater transparency into how tech firms train artificial intelligence (AI) models.

These efforts, the Financial Times (FT) reported Sunday (May 19), come as creators are voicing concerns that their work is being copied and used without their consent by AI projects.

British culture secretary Lucy Frazer told the FT that the government is working on rules governing AI companies’ use of books, music and TV shows.

She said government ministers would focus initially on greater transparency over what content was being used by AI companies to train their models, allowing creative industries to determine whether the work they produce is being stolen.

Frazer acknowledged that AI represented a “massive problem not just for journalism, but for the creative industries.” 

“The first step is just to be transparent about what they [AI developers] are using. [Then] there are other issues people are very concerned about,” she added. “There’s questions about opt in and opt out [for content to be used], remuneration. I’m working with industry on all those things.”

According to the report, Frazer declined to discuss what mechanisms would be needed to deliver greater transparency so that rights holders could determine whether their content was being used to train AI models.

The efforts are happening as content creators are expressing concerns about AI’s encroachment into another area: Google search, which now offers AI-generated summaries in response to search queries.

“Our initial analysis suggests that SGE [Search Generative Experience] could significantly reduce the traffic directed to content creators’ websites, directly impacting their ad revenue and, by extension, their livelihoods,” Marc McCollum, the chief innovation officer of Raptive, told PYMNTS last week, estimating that “the total revenue impact to creators would be $2 billion within one year.”

He also questioned the use of content creators’ intellectual property.

“The current model does not adequately compensate creators for the use of their work, nor does it align with the principles of fair use,” McCollum said, stressing that this is a matter of survival for many independent creators.

“Content creators are the backbone of a diverse and vibrant digital ecosystem, and their work deserves recognition and remuneration,” McCollum said.

However, not everyone is pessimistic about AI search’s prospects. Michael Hasse, a cybersecurity and technology consultant, told PYMNTS that AI-based search can both help and hinder consumers looking for specific products like jackets.

“With traditional search functionality, the first few pages of results will be dominated by companies that have perfected their SEO or paid for preferential placement,” Hasse said. This often leads to consumers settling for products that are merely “good enough.”
