Over the course of this RJI fellowship, I’m working with the nonprofit environmental magazine Grist to think through what it’d mean to build out a national environmental data-journalism unit—and, in turn, to translate what we learn at Grist into guides for other small or nonprofit newsrooms interested in similar work.
In behavioral economics and environmental law, climate change is known as a super-wicked problem. It’s slow-moving, dispersed across borders and generations, often subtle in its effects, and requires action by the very people and industries responsible for it. There’s also the fact that time is running out, and humans are notoriously bad at valuing the future to the same extent they value the present.
Datasets related to climate change are similarly difficult to wrap one’s head around. Windows of uncertainty reign supreme in modeling efforts, databases are often enormous (and stuffed to the gills with subtleties), and the sectors tangential to climate science—energy economics, global supply chains, and human behavior, for example—are no less complicated. For these reasons and others, environmental media organizations have largely failed to take advantage of the huge swaths of available climate data to bolster their reporting and communicate the subtleties of this global challenge to a wide audience.
This isn’t to suggest that hard-hitting data reporting in climate journalism doesn’t exist. Environmental publications like Carbon Brief and the Pulitzer-winning InsideClimate News have produced dozens of serious, data-driven investigations. The New York Times should be lauded for its approach to interactive climate-data features that turn grossly complex problems into digestible, digital-first experiences. But there is no visual data-journalism unit wholly dedicated to the environment. There is no Pudding of climate change.
It’s critical that media institutions fill this gap. The monolithic, overwhelming span of climate change prevents its internalization: People don’t get it until it affects them. But interactive, data-driven features harbor precisely the grounding potential needed to pull climate change into the present. Climate-data journalism can make the abstractions relevant to readers. Whether via interactive graphics, filterable features and news apps, or experimental technologies like augmented reality, the contemporary tools available to data journalists offer a means for helping readers understand climate change as the reality it is—and what we can do about it.
The availability of the relevant data further highlights the opportunity here. Unlike in the case of, say, homelessness policy—wherein government datasets are kept under lock and key to protect vulnerable populations—climate data and datasets relevant to environmental justice are often freely available online or via public-record requests.
When combined with the recent proliferation of high-level, open-source tools for data analysis and visualization, this data has no reason to remain stuck in a lab or gathering dust on a virtual shelf. Climate-data journalism offers a vital communications engine (and investigative opportunity) for institutions seeking to move the climate conversation forward and hold power to account. Data investigations offer a means of detecting subtle effects; charts are eminently shareable; modeling efforts offer tools that can be leveraged for future projects. (Once you build a statistical model of some phenomenon, you can use it for forecasting!)
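To make the parenthetical concrete: a minimal sketch, in Python with NumPy, of the fit-then-forecast idea. The numbers here are entirely synthetic—an invented warming trend, not real observations—but the pattern (fit a simple model to a historical window, then reuse it to project forward) is the one a small data desk might start from before reaching for anything fancier.

```python
import numpy as np

# Synthetic, illustrative data only: a made-up annual temperature-anomaly
# series with a modest upward trend plus noise. Not real measurements.
rng = np.random.default_rng(0)
years = np.arange(2000, 2020)
anomalies = 0.02 * (years - 2000) + 0.4 + rng.normal(0, 0.05, years.size)

# Fit a linear trend (degree-1 polynomial) to the observed window...
slope, intercept = np.polyfit(years, anomalies, 1)

# ...then reuse the fitted model to forecast a future year.
forecast_2030 = slope * 2030 + intercept
print(f"estimated trend: {slope:.3f} per year; 2030 forecast: {forecast_2030:.2f}")
```

A real newsroom model would of course need uncertainty intervals and domain review before publication; the point is only that the fitted object outlives the story that prompted it.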
Data-journalism pipelines remain inaccessible for many small and nonprofit newsrooms, with respect to both funding and technical capacity. Proprietary data-analysis and visualization software is often expensive and time-consuming to learn. Free, open-source analysis and visualization packages, too, require staff knowledge—and dedicated data scientists often command high salaries. Accordingly, this project will seek to develop journalistic pieces (for free syndication) and code (for free distribution) in order to build data-journalism capacity across environmental and general-interest newsrooms.
One of the central questions posed by this work will be the extent to which one can replicate the practices of a legacy newsroom’s data team with a staff of, say, 1.5 FTEs. The Wall Street Journal and the Los Angeles Times and The Economist have fully staffed data desks with visualization teams and data scientists and custom news-app templates and Django engines and Yeoman generators and the like. How can we leverage existing open-source tools to bring a world-class visual data-journalism experience to, say, a WordPress-powered CMS? How much of The Upshot can you approximate with one or two people? We’ll let you know as we find out more in the months ahead.
Have you built a data-journalism unit from scratch? Reach out with what you’ve learned! Are you a data editor at a legacy publication who imagines some of your pipelines might be readily adaptable to a small- to mid-sized nonprofit organization? I’d love to chat. Want to lend some pro-bono backend support? When can you start?