Google is testing a product that uses artificial intelligence technology to produce news stories, and is pitching it to news organizations including The New York Times, The Washington Post and Wall Street Journal owner News Corp, according to three people familiar with the matter.
The tool, known internally by the working name Genesis, can take in information — details of current events, for example — and generate news copy, said the people, who spoke on condition of anonymity to discuss the product.
One of the three people familiar with the product said Google believes it can serve as a kind of personal assistant for journalists, automating some tasks to free up time for others, and that the company viewed it as responsible technology that could help steer the publishing industry away from the pitfalls of generative AI.
Some executives who saw Google's pitch described it as unsettling, and asked not to be identified as they discussed a confidential matter. Two people said it seemed to take for granted the effort that goes into producing accurate news stories.
A Google spokeswoman did not immediately respond to a request for comment. The Times and Post declined to comment.
“We have an excellent relationship with Google and appreciate Sundar Pichai’s long-standing commitment to journalism,” a News Corp. spokesman said in a statement, referring to Google’s chief executive.
Google’s new tool, as described, has both potential upsides and downsides, said Jeff Jarvis, a professor of journalism and media commentator.
“If this technology can reliably deliver factual information, then journalists should use the tool,” said Mr. Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at the City University of New York’s Craig Newmark Graduate School of Journalism.
He continued, “On the other hand, if it is misused by journalists and news organizations on topics that require nuance and cultural understanding, it can damage the credibility not only of the tool but also of the news organizations that use it.”
News organizations around the world are grappling with whether to use AI tools in their newsrooms. Several, including The Times, NPR, and Insider, have notified staff that they intend to explore potential uses for AI to see how it can be applied responsibly to the high-stakes world of news, where speed and accuracy are critical.
But the new Google tool is sure to also cause anxiety among journalists who have been writing their own articles for decades. Some news organizations, including The Associated Press, have long used AI to generate stories on matters including corporate earnings reports, but these remain a small fraction of the news agency’s articles compared with those produced by journalists.
Artificial intelligence can change that, enabling users to create articles on a larger scale that, if not carefully edited and vetted, can spread misinformation and affect how traditionally written stories are perceived.
While Google has moved at a rapid pace to develop and deploy generative AI, the technology has also presented challenges to its advertising-driven business. Google has traditionally played a role in curating information and sending users to publishers’ sites to read more, but tools like its Bard chatbot make factual assertions that are sometimes incorrect and do not send traffic to trusted sources, such as news publishers.
The tool arrives as governments around the world have called on Google to give news outlets a larger slice of its advertising revenue. After the Australian government tried to force Google to negotiate payments with publishers in 2021, the company built more partnerships with news organizations in different countries under its News Showcase program.
Publishers and other content creators have criticized Google and other major AI companies for using decades of their articles and publications to help train these AI systems without compensating the publishers. News organizations including NBC News and The Times have taken a stand against AI companies scraping their data without permission.