A portrait of the author generated by Lensa

2023 is going to be a hot year for AI — but what about for journalism?

Nia Springer-Norris is an RJI columnist whose column explores the role of technology in small newsrooms — mainly the use of artificial intelligence in community journalism.

The ethical implications of AI in news

AI has become a hot topic everywhere. Our social media feeds have been filled with AI-generated avatars, AI art, and heated debates around both. People are worried (for good reason) about the ethical implications of handing your picture and likeness over to a corporation. Many of the artists I know have been vocal about how they feel the machine steals the style of human artists when generating AI art; the flip side is that artists are always influenced by one another and that AI makes art more accessible. And some academics are sharing that they've had students submit papers written by AI.

In an article for Wired, sex work, technology, and policy scholar Olivia Snow wrote about putting her childhood photos into the Lensa app and having the app return sexualized images. It's true: The photos the app generated for me were definitely on the sexier side, with certain features (my breasts, my waistline) exaggerated to match cultural beauty ideals. Since machine learning applications learn their patterns from large datasets, results like these suggest the app is reproducing sexualized imagery it learned somewhere along the way, despite its anti-nudity policy. That raises the question of whether someone could use the app to generate sexualized images of a person without their consent.

But what does this mean for journalists? It’s complicated. There are plenty of ethical questions that we will have to ask ourselves about how we source stories, especially if we are using social media or other platforms. 

Since local journalism is a research interest of mine, but I haven’t worked for a local newsroom, I wanted to get a perspective from someone who does. Katie Hyson from WUFT News in Gainesville, Florida, had a lot to say. 

“My most optimistic vision is that AI writers could free local reporters to do the work they often don’t have time for — original investigative reporting, immersive narratives, community engagement events — by taking some of the more cut and dry reporting off their plates,” Hyson says. “The industry is so short-staffed, there are so many news deserts, I’d welcome any innovation that could free up local journalists for work that can’t be automated.” 

It’s nice to know that local reporters share a vision similar to the one I have as an outside researcher. But Hyson also has concerns about bias.

“I’d caution newsrooms to be very mindful of the racial, language and gender biases ingrained in artificial intelligence,” Hyson says. “These biases need to be critically examined and addressed to ensure AI-powered journalism serves all our community members equally.”

AI likely cannot replace human creativity, and scholars agree. In an article for Journalism and Mass Communication Quarterly, NYU data journalism professor and former RJI fellow Meredith Broussard (whose book, Artificial Unintelligence, is worth the read) wrote, “We can benefit from using technological tools to commit acts of journalism, but at its heart, journalism is about telling stories about the human condition.”

Another study, which examined researchers’ perceptions of AI in newsrooms, found three camps of belief: one group holding that humans are irreplaceable, one holding that “the public’s need is more important than the journalist job,” and a final group seeing human-machine collaboration as an integrative relationship in which the machines do the work we don’t want to do.

When I first began this research, I was team AI and human collaboration. Now I am not so sure: news corporations may not place humans in positions to think more creatively when they can simply replace them with machines and offer a less human service, since machines could easily round up all of the “need to know” information without any real storytelling.

The best future I can imagine for news is one that is nonprofit, which comes with its own obstacles and barriers. Then, we have to consider funding gaps and reader interest — alongside reader willingness to pay and other institutional and governmental support. But public radio has been doing this for decades.
