ONA 23, the Online News Association’s annual conference held last week in Philadelphia, demonstrated just how quickly artificial intelligence and machine learning have become a dominant focus in the industry, even as news organizations largely remain cautious about integrating large language models like ChatGPT into their workflows.
At last year’s conference, two months before the momentous launch of ChatGPT, only two sessions focused specifically on AI. Fast forward to 2023, and it would be easier to count the sessions that didn’t involve AI. Microsoft and Google, each vying for a dominant share of the AI-driven search engine space — a space that increasingly determines which news stories people see and when they see them — were both represented in discussions that emphasized an optimistic outlook on these technologies as time-saving tools with remarkably low barriers to entry.
One conviction was ubiquitous: journalism cannot afford to miss the boat on another technological leap forward.
“We like to say that we tend to embrace the fear and manage the opportunity,” said Jim Brady, the vice president of journalism at the Knight Foundation, noting the industry is still paying the price for failing to fully embrace and monetize the internet at its outset. “We have to do the opposite now. AI can do amazing things with collecting public information, suggesting stories to write based on the stories you’ve done…we have to get out of this cycle of fearing the new thing. Other industries don’t fear the new thing and make much better use of it.”
Embracing the new thing
In that spirit, ONA panelists like Aimee Rinehart, program manager for the Associated Press Local News AI initiative, downplayed concerns about generative AI potentially replacing journalists and focused on the technology’s capabilities as a workhorse for mundane, time-intensive tasks.
“We’re not talking about creating an article,” Rinehart said. “Right now, we’re going to solve some workflow problems. …You want to identify annoyances and free up reporters to focus more on the reporting.”
That is the aim of YESEO, a Slack bot created by 2022 RJI Fellow Ryan Restivo to generate suggestions for headlines, subheads and story descriptions geared toward search engine optimization. Like many in the industry, Restivo pivoted the app’s development to integrate generative AI, but avoided unleashing a more flexible — but less curated — version of the technology in the app.
“YESEO offers five headline suggestions right now,” Restivo said. “I had thoughts about including the crazier suggestions, but at the end of the day I wanted it to be useful and not off-putting to people using the app for the first time.”
“You have to look at what AI can do better and what it can’t,” echoed Ernest Kung, AP’s AI product manager. “What AI is good at is finding patterns, but it’s not necessarily good at reaching a point where it can write a very clean article.”
Kung pointed to several practical AI applications AP has already built for local newsrooms around the country, such as a system for the Brainerd Dispatch in Minnesota that automatically collects information from police blotters and a tool for Michigan Radio that pulls key information from records of city council meetings (much like Agenda Watch, an RJI collaboration with Stanford University).
An expanding frontier
This view of AI as sidekick rather than replacement-in-training aligns with what some are calling an “ethical assistive” approach to using AI tools in the newsroom. Start-up platforms like Legitimate seek to help journalists lean on AI for just about everything in the reporting process except the bulk of the writing, providing suggestions for articles to write based on the content of previous stories, a light outline for the story’s structure, background information and statistics, Grammarly-like editing suggestions and auto-generated social media posts. A tool called Runway can even create videos based on text prompts.
“The process of creating content needs to live with the journalist,” said Gerard Donnelly, CEO of Legitimate.
Donnelly believes widespread adoption of these assistive tools will happen once one or two major news organizations model a path forward for others to follow.
Others, such as Jonathan Soma, the Knight Chair in Data Journalism at Columbia University, argue that it would be a mistake to build on popular AI tools, which Soma sees as following an inevitable trajectory: free or cheap today, expensive tomorrow, or even iterated out of existence entirely.
“It’s important to have independence in your organization,” Soma said. “A lot of this stuff can run on your own computer — you don’t need the cloud. And it’s all additive: these tools can work together to make a much bigger, fancier product. The most powerful thing you can do is create an internal playground of accessible, connected components.”
But regardless of how newsrooms choose to employ a staggering and ever-expanding range of options, one thing is clear: fear of the unknown is unlikely to stop the industry from embracing an era of experimentation.
“We’ve discovered fire,” said Mahesh Ramachandran, head of news technology at Reuters, “and we can either figure out how to use it properly, or it can burn us.”