Who Should Control the News? Stuff and Other Publishers Take a Stand Against AI-Generated Content
In a brave move, Stuff, New Zealand’s largest news publisher, has decided to block OpenAI from using its content to power its AI tool ChatGPT. Stuff is concerned that AI-generated news articles are already producing low-quality results and wants to protect its valuable journalism from being harvested without permission. The decision follows similar moves by prominent publishers such as CNN, The New York Times, and The Guardian, which have also blocked OpenAI from accessing their content.
The debate surrounding generative AI tools like ChatGPT and Google’s Bard is heating up, with critics raising concerns about the impact on journalism and news consumption. Some argue that AI tools can be beneficial for gathering and publishing news digitally, improving efficiency, and adapting content to user preferences. However, others worry about the potential dangers of a “walled garden” scenario, where AI tools decide which news users can access and restrict the sources they can consult.
Stuff CEO Laura Maxwell argues that generative AI tools need high-quality journalism to provide accurate and valuable information to users. Without permission to access news content, these AI models would have to rely on unverified information and misinformation found on the internet, leading to a less reliable news experience. Maxwell warns of the risk of the AI industry “eating itself” if it continues to generate content based on content already generated by AI.
While publishers like Stuff are taking a stand against AI-generated content, some believe that finding a middle ground is essential. The BBC, for example, uses AI to adapt content quickly based on user locations, and local subscriber service BusinessDesk has seen significant time savings by using ChatGPT. However, these AI products have sometimes breached media copyright, highlighting the need for arrangements between news publishers and AI makers to protect original content.
The question remains: who should control the news? Should AI companies have free rein to scrape and reproduce news content without permission, or should publishers retain control over their valuable journalism? The issue of licensing and compensation for media creators is central to this debate. As the news industry grapples with the rise of AI and the technology giants behind it, it needs to ensure that the mistakes of the past, when platforms captured most of the value from news content, are not repeated.
We want to know what you think! Should AI companies be allowed to freely use news content, or should publishers have the ultimate control? Leave a comment below and join the discussion on who should control the news in the age of AI.