Journalists discuss social media’s struggle to navigate the line between free speech and hate speech
Tessa Weinberg and Yanqi Xu
WASHINGTON — Social media sees it all.
From the photos we share to keep friends and family in the loop, to violent content in the wake of terror attacks, the ubiquitous platforms that increasingly define our lives are struggling to spur connections rather than deepen divisions.
It’s an issue journalists wrestled with at the 2019 Hurley-Sloan Symposium at the National Press Club on April 25. The symposium, titled “Free Speech, Free Press or Free for All? Social Media and the First Amendment,” sponsored by the University of Missouri School of Journalism and School of Law, featured journalists from The New York Times, Politico, PolitiFact, USA Today, The Dallas Morning News and NewsGuard.
In a panel moderated by Barbara Cochran, Curtis B. Hurley Chair in Public Affairs Journalism at the Missouri School of Journalism, reporters who cover tech giants, dispel misinformation and use social media to engage with their readers discussed journalism’s changing role in a digital age.
“When you post something on social media, you want it to highlight your subject matter expertise, how you’re advancing the conversation, how you’re creating positive dialogue,” said Manuel Garcia, ethics and standards editor for the USA Today Network. “Because this is what we’re trying to do: enhance, educate.”
Clamping down on hate
Content shared on social media platforms can be far from positive and is sometimes even violent.
In just the past few months, a white supremacist gunman live-streamed his attack on two mosques in Christchurch, New Zealand. The Sri Lankan government shut down multiple social media platforms, including Facebook and Instagram, following terror attacks to stem the spread of misinformation. And Special Counsel Robert S. Mueller III found that the Russian government “interfered in the 2016 presidential election in sweeping and systematic fashion,” in part, through a social media campaign.
“Whack-a-mole” might not even be a sufficient analogy for the sheer scale of content platforms are trying to moderate, said Cecilia Kang, a tech reporter for The New York Times.
“They’re really struggling,” Kang said of the tech companies. “I think they’re doing it individually within the companies. There’s not a lot of sharing of information at this point.”
With a combination of artificial intelligence, human content moderators and even partnerships with fact-checking organizations, social media platforms are tasked with navigating the grey area between freedom of expression and hate speech.
“They’re having to make calls about that in real time, day in and day out, in cultures around the world,” said Nancy Scola, a senior tech reporter for Politico. “And those are difficult decisions to make.”
Facebook has teamed up with the International Fact-Checking Network to verify content shared on the platform. Facebook does not remove content debunked by fact-checkers. Instead, it demotes the content so it appears less prominently in the news feed and warns users who try to share the post that fact-checkers have marked it as inaccurate. It then leaves them with a choice, asking whether they want to continue sharing the content.
While it’s difficult to measure the impact of PolitiFact’s efforts on the flood of misinformation as a whole on Facebook, Angie Holan, editor of the fact-checking website PolitiFact, said she’s seen a decline in the total volume of misinformation. It takes multi-pronged efforts to improve incrementally, Holan said.
“Two years later, it’s not such a tidal wave, it’s more like a river,” Holan said.
Amid the scrutiny of social media’s use as a springboard for violence, misinformation and hate speech, Facebook expects to be fined $5 billion by the Federal Trade Commission — a record amount — for violating user privacy, Kang reported.
While the amount may not be a large dent in Facebook’s overall revenue, it was a significant move for the FTC, an agency that isn’t always the most aggressive, Scola said.
“The thinking in the last couple months was even a number with a billion attached to it was something farther than the FTC was likely to go,” Scola said.
“This is all going to set a blueprint for, in many ways, how social media companies are viewed and potentially regulated going down the road,” Kang said. “It’s not just about Facebook. It’s about the U.S. government drawing really some lines and starting to regulate, if you will, social media companies.”
Adapting to social media
USA Today’s Garcia talked about misinformation in breaking news situations, when there is also fierce competition to get stories out first.
“Admit the errors. Correct the errors,” Garcia said, is the top-down mantra at USA Today to flag mistakes and continue to monitor what’s being spread on social media.
Holan said some sites claim their content is purely satire but don’t make it clear to the audience that the content is satirical. Fact-checkers normally don’t bother checking genuine satire, but they still see it as their responsibility to set the record straight when sites obscure the nature of their content.
James Warren, the executive editor of NewsGuard, applauded fact-checkers for their endeavors in individual checks and said that NewsGuard focuses on using a rubric to evaluate credibility and transparency of news sites to inform people about the quality of their news sources.
NewsGuard also keeps an eye on foreign news sites by collaborating with freelancers overseas who translate the original content into English.
New platforms like TikTok — a mobile app first released in 2016 that lets users share short videos — are emerging with drastically different audience demographics, said Hannah Wise, audience development editor at The Dallas Morning News.
Wise said having a personality on social media can help journalists reach new audiences, and it’s okay for news organizations to have fun with what they post.
“I think that reporters who are only tweeting out their headlines are leaving a reporting tool and an engagement tool on the table,” Scola said.
Ultimately, such efforts should still fit within the framework of journalistic integrity. The Dallas Morning News staff leverages social media as a tool for community outreach. But by embracing civil participation and engaging actively in discussions, journalists sometimes expose themselves to vitriol and criticism. Wise said The Dallas Morning News removes remarks that turn into personal attacks in its Facebook group for readers.
“Our journalists are expected to also be participants in this community and explain their work,” Wise said. “We are trying to form deep relationships with these people in our local community.”
And at the end of the day, it’s important to remember that social media doesn’t replace reality, Kang said.
“I think we also need to realize that Twitter is actually not a real town,” Kang said. “It’s also good to just check yourself and realize the conversation that you see here is not the conversation of all 50 states. It’s not the conversation of the whole world.”
Tessa Weinberg is a recent graduate of the Missouri School of Journalism. She was a David Kaplan Memorial Fellow at ABC News in Washington, D.C. Yanqi Xu is a graduate student focusing on computer-assisted reporting at the Missouri School of Journalism. They were enrolled in the school’s Washington program for the Spring 2019 semester.