Tech Beetle briefing GB

AI-generated news should carry ‘nutrition’ labels, thinktank says

Essential brief


Key facts

AI-generated news should include standardized ‘nutrition’ labels detailing source information to improve transparency and trust.
Tech companies must fairly compensate publishers for the use of their content in AI news through licensing agreements.
Current AI tools show uneven use of news sources, often favoring licensed outlets and potentially sidelining smaller publishers.
AI news summaries reduce traffic and revenue for publishers, highlighting the need for new, sustainable business models that do not depend on the tech sector.
Government intervention is crucial to protect media plurality, support local and investigative journalism, and encourage innovation with AI.

The rapid rise of AI-generated news content has prompted calls for greater transparency and fair compensation for publishers. The Institute for Public Policy Research (IPPR), a left-of-centre thinktank, argues that AI-produced news should include standardized “nutrition” labels detailing the sources used to generate the content. These labels would reveal whether information was drawn from peer-reviewed studies, professional news organizations, or other credible sources, helping users assess the reliability of AI news outputs. This recommendation comes amid concerns that AI companies are becoming the new gatekeepers of online information, shaping public discourse without sufficient accountability.

In addition to transparency, the IPPR emphasizes the need for a licensing regime in the UK that enables publishers to negotiate with tech firms over the use of their content in AI-generated news. The thinktank highlights the importance of fair payment to journalism providers, noting that if AI companies profit from journalism, they must compensate publishers accordingly. This approach aims to protect media plurality, maintain public trust, and safeguard the future of independent journalism. The IPPR suggests that the UK’s Competition and Markets Authority (CMA) could leverage its new enforcement powers over companies like Google to initiate licensing frameworks, potentially curbing unauthorized content scraping.

The report also sheds light on the current landscape of AI news sourcing. IPPR tested four AI tools—ChatGPT, Google AI Overviews, Google Gemini, and Perplexity—using 100 news-related queries and analyzing over 2,500 links generated by these platforms. Findings revealed disparities in source usage: ChatGPT and Gemini did not cite the BBC, which restricts bot access, while Google’s AI Overviews and Perplexity included BBC content despite its objections. The Guardian, which holds a licensing deal with OpenAI, was heavily cited by ChatGPT and Gemini, whereas other outlets such as The Telegraph, GB News, The Sun, and the Daily Mail appeared in fewer than 4% of ChatGPT’s responses. This uneven representation raises questions about the influence of licensing agreements on AI content and the potential marginalization of smaller or local news providers.

The proliferation of AI news summaries, especially those featured at the top of Google search results, has cut into publishers’ web traffic and revenue. Many users consume AI-generated overviews without clicking through to original articles, reducing advertising income for news organizations. While licensing deals may offset some financial losses, the IPPR warns that overreliance on tech giants for revenue could jeopardize the independence and sustainability of news media. The thinktank advocates for government intervention to foster new business models for journalism that do not depend solely on the tech sector, including public funding for investigative and local news as well as support for the BBC to innovate with AI technologies.

Overall, the IPPR’s report calls for a balanced regulatory approach that enhances transparency, ensures fair compensation, and promotes a diverse and trustworthy news ecosystem in the AI era. By implementing “nutrition” labels and licensing frameworks, policymakers can help users better understand AI-generated content and protect the viability of independent journalism amid technological disruption.