Innovation in Focus is a series exploring emerging technology and methods of storytelling for newsrooms worldwide. We interview experts, test tools and provide our findings on a different topic each month.
For Innovation in Focus this month I spoke with Tamar Charney, managing director for personalization and curation at NPR, about the ethics of redubbing: using technology to change the audio of what someone says in works of journalism. Redubbing, or Overdub, is one of the features of Descript Pro, which I tested this month.
Dingmann: What edits to a source’s quotes for an audio story would you deem as being ethical?
Charney: I mean, ideally, the minimum possible. Obviously, things need to be shortened for length. But we really try very hard not to do any edits that in any way, shape, or form alter what is being said. We shy away from internal edits within a quote, and never do that with someone like the President of the United States. There are times when we do it; various entities I’ve worked for have had different approaches. But really the goal is to do as little as possible because you don’t want to alter the meaning of what somebody’s saying. Overall, I think we try to do the minimum needed for clarity and efficiency.
Dingmann: Is that for transparency reasons and responsibility to listeners? How is editing in audio different from in print?
Charney: I’ve honestly not really worked in print. With audio you have an additional responsibility because you’re trying to accurately reflect how that person talks, and how they communicate, and it’s in their own voice.
Dingmann: Would there ever be a situation where it would be OK to use a function like redubbing on Descript to correct an error a source made in an interview? For example, I interviewed someone about making metal flowers, and at one point they referred to the petals as leaves while pointing to and describing the process of forming the petals. Would it be OK to use this software to correct that?
Charney: My first gut reaction on this is, no. When talking about journalism, that gets into some really dicey territory, deep fakes and undermining trust. I think if we realized someone said something wrong, my inclination is that you need to go back and redo the interview, because maybe there was something that person was trying to communicate by saying it was a petal. If you’re hearing somebody’s voice, it should be what they said, not what the journalist decided we wanted them to say to make our story better. So, I personally as a journalist, again, not an ethics specialist, I have some qualms about that.
Dingmann: What about using the technology to edit your own voice? Say you recorded a reporter track and decided last minute it would sound better phrased another way.
Charney: I might be old school here, but you know it’s fine if we’re talking about advertising or fiction. Our job is to reflect reality, and I think we need to honor as much of that contract with our audience as we can. That we are who we say we are, that we are presenting what was told to us, we are taking pictures that accurately reflect what really was there and aren’t something that is digitally altered. I think that is part of our contract with our audience, to faithfully reflect what is there. I think there are other places where we can take more creative license, but that’s fiction. I think there’s a much more documentarian approach to journalism in really reflecting reality. Again, not an ethicist, but I think we end up quickly down this road of undermining what people trust about our work if we’re taking liberties because it makes us sound better or it makes our source look better. I think that’s how we lose trust, and I think the more we shoot for what’s really there – not altered, unfiltered – the more we’re going to be able to maintain trust.
There’s the AI that can alter things, but then there’s the AI that can figure out whether something was altered. So, even if right now people can’t notice that we made digital alterations to something, the time will come when there may be reports out saying, oh, you know, this media outlet did all of these digital alterations to that story. And once that gets out there, that really starts to undermine the trust and the work that was done using that technology.
Dingmann: Could redubbing software be helpful in a journalistic capacity?
Charney: The place where I could see it perhaps being useful, and this is with us being very transparent that we’re using it in this way, is when we have to protect the identity of a source. A lot of the places where I’ve worked, if we are protecting the identity of a source, we say: this voice was digitally altered, the name has been withheld, or whatever. Whatever we are not providing to the audience, because of concerns about this person’s safety. If that technology enables us to create some bot version of what they said voicing their words, then yeah, I think that may be a use for that kind of technology, again, provided we say that’s what we’re doing and we’re transparent about that.
Sara Dingmann is a student at the University of Missouri School of Journalism and a research assistant at the Reynolds Journalism Institute.