News organizations should be investing in chatbot experimentation
A conversation with Maria Crosas Batista
Maria Crosas Batista has been studying, experimenting with and writing about new technologies in journalism like virtual reality and chatbots. She even published a book last year about designing, implementing and training chatbots. Now, her expertise at the intersection of technology and communication, with a focus on AI, has led her to work with global teams at Nestlé on technological innovation.
She spoke with our Innovation in Focus team about her advice for other journalists looking to experiment with practical use cases for chatbots.
Lytle: What kinds of experiments have you done with chatbots in journalism?
Crosas Batista: My experiments first started as kind of a news feed — so trying to gather information or things that were interesting to me and delivering that through a chatbot. For instance, instead of doing a tweet, I could just have a chatbot recap the main points and deliver that through a channel. I started with Telegram because it was the easiest way to do that. Then, when I was at Birmingham [City University], I was part of the student newspaper. We were using chatbots to add all this information that didn’t fit in an article. Usually, you have a limited number of, let’s say, characters that you can put in the article, but we believed there was a lot of information that could be useful for the user. So we were putting that in a bot so that if you’re a reader, and you like this article and you want to know more, you can chat with the bot.
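For readers who want to try the kind of Telegram news recap Crosas Batista describes, a minimal sketch in Python might look like the following. The bot token, channel ID and headlines are placeholders; a real bot would pull its headlines from a CMS or feed and run on a schedule.

```python
import requests

# Placeholders: get a token from Telegram's @BotFather and use your own channel or chat ID.
BOT_TOKEN = "123456:ABC-YOUR-TOKEN"
CHAT_ID = "@your_news_channel"

# Illustrative headlines; in practice these would come from your CMS or feed.
headlines = [
    "Council approves new downtown transit plan",
    "Local schools report record enrollment",
]

# Build a short recap message, one bullet per headline.
recap = "Today's recap:\n" + "\n".join(f"• {h}" for h in headlines)

# Post the recap to the channel via the Telegram Bot API's sendMessage method.
response = requests.post(
    f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
    json={"chat_id": CHAT_ID, "text": recap},
    timeout=10,
)
response.raise_for_status()
```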
Lytle: When you’re setting up that chatbot, how do you decide the topic or the scope of the bot? What do you think about in those planning stages?
Crosas Batista: I think the first question is, “what do we want to solve?” Why are we building a chatbot? And why not build an application or any other interface? Why is a chatbot going to solve a specific pain point? Usually, it’s our readers or consumers who tell us that. When they call, when they send you an email or when they tweet you, you can see that there is a huge amount of information that you are constantly repeating, or publishing, that could be automated. Usually we decide what the bot will talk about when we identify the main topics that our consumers want to know about.
If we want to test something new, then my recommendation is to do some [user] testing before building it, before committing to a specific technology. Grab a select group of people who might be your future audience, then do some type of testing and identify whether they would use it. [In one example when planning a recipe chatbot], we saw that when people are cooking, they don’t use chatbots with text — they use them via voice. Why? Because they don’t have their hands free.
Lytle: What kinds of metrics or areas might you look at to measure the success of a chatbot for news organizations?
Crosas Batista: You need to have at least a good number of people using it, or [look at] the number of conversations. But what is most important is actually the duration of that conversation. Did the user just talk during the first minute, or are they actually talking or interacting for more than five minutes?
Another metric is how many users are coming back. So not just once, but the recurring users. That tells you your chatbot is actually good, because they found it useful. Then there are the topics that they actually talked about or interacted with in the chatbot. For instance, if you planned for the chatbot to be mainly focused on providing information about local news, and most of the people are asking about international news, then you might have an issue there: either your goal is wrong and you need to change or enhance the scope of the chatbot, or maybe they want to know about something else.
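Many chatbot platforms surface these numbers in a dashboard, but a rough sketch of how a newsroom might compute them from its own conversation logs could look like the following. The log format and values here are hypothetical; the point is simply how conversation count, average duration, returning users and topic mix fall out of the same data.

```python
from collections import Counter
from datetime import datetime

# Hypothetical conversation log; real data would come from your chatbot platform's export.
conversations = [
    {"user": "u1", "start": "2023-05-01 09:00", "end": "2023-05-01 09:07", "topic": "local news"},
    {"user": "u2", "start": "2023-05-01 10:00", "end": "2023-05-01 10:01", "topic": "international news"},
    {"user": "u1", "start": "2023-05-02 09:30", "end": "2023-05-02 09:36", "topic": "local news"},
]

def minutes(convo):
    """Length of one conversation in minutes."""
    fmt = "%Y-%m-%d %H:%M"
    start = datetime.strptime(convo["start"], fmt)
    end = datetime.strptime(convo["end"], fmt)
    return (end - start).total_seconds() / 60

total = len(conversations)
avg_duration = sum(minutes(c) for c in conversations) / total

# Returning users: anyone who shows up in more than one conversation.
per_user = Counter(c["user"] for c in conversations)
returning = sum(1 for n in per_user.values() if n > 1)

# Topic mix: what people actually asked about versus what you planned for.
topics = Counter(c["topic"] for c in conversations)

print(f"Conversations: {total}")
print(f"Average duration: {avg_duration:.1f} minutes")
print(f"Returning users: {returning} of {len(per_user)}")
print(f"Topics: {topics.most_common()}")
```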
Lytle: I imagine these metrics would be helpful for smaller news organizations looking to fund a chatbot, too?
Crosas Batista: Correct. There’s also the feedback metric: after the conversation, you ask the user, “did that help you? Or was there something missing? Or did I match your expectations?” Because if you’re receiving “no,” then what might have happened is that either you didn’t understand what they wanted, or you did understand but you weren’t serving the right content.
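Tracking that feedback question can be as simple as tallying the answers over time. A tiny, hypothetical example:

```python
from collections import Counter

# Hypothetical end-of-conversation responses to "did that help you?"
feedback = ["yes", "no", "yes", "yes", "no"]

counts = Counter(feedback)
helpful_rate = counts["yes"] / len(feedback)
print(f"Helpful rate: {helpful_rate:.0%} ({counts['yes']} yes, {counts['no']} no)")
```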
Lytle: Once the chatbot is live or published, what goes into maintaining that bot for your readers?
Crosas Batista: That’s actually the biggest challenge because I think people believe that once it has been built and launched, then that’s the end of the chatbot process, but it’s actually the beginning.
You need one person to constantly look at the conversations that the chatbot is having with the audience, making sure that the chatbot is properly identifying what we’re talking about. The other thing that I think is important is to review those conversations and identify if we’re missing any content. If our initial scope was “A,” but I also see that I’m having more and more questions about “B” and “C,” should we increase our scope? For a news organization, I think it’s an interesting one for identifying new stories, because maybe there’s something happening in a region that we’re not aware of, but there’s someone telling us, and reviewing the conversation is actually useful for finding the lead.
Lytle: Are there any other tools or practical tips for maintaining or measuring the success of a chatbot?
Crosas Batista: There are many tools out there that can help you with the metrics. There’s a tool called QBox that can be used to measure the metrics of your chatbot in an easy way, and Chatbase as well.
Lytle: What advice do you have for those who want to experiment with chatbots but don’t have technical expertise in this area?
Crosas Batista: Even if you don’t have a technical background, you can always learn the basics. I did some coding, I did some basics that I never thought I would do, especially coming from journalism. It can be useful to know a little bit about Python, HTML, the basic programming languages. Knowing the basics helps you understand how it’s being built when you go to the technical team and they tell you, “well, this is doable because … or it’s not doable because …”
That being said, there are many tools out there that you can use. Test as many tools as you can. Test, test and test. For the use case you’re looking for, one tool might be better than another one. Then, from those learnings, just choose one and go with it, because you cannot keep signing up for several tools. Or customize one on your own, if you have the budget, the skills and the team for that.
Lytle: Is there anything else that we haven’t asked that you want to share in regards to chatbots, or AI in general, in journalism?
Crosas Batista: I think we need to bear in mind that at the end of the day, humans are training these models and there are limitations. And with generative AI, there are many more. For instance, “who has the rights to the content that AI created?” So if I go to DALL·E and ask for a new image, who has the rights? DALL·E? OpenAI? Or whoever put the information there before that?
In news organizations and the media, we need to be very careful about how we use these tools, or the outputs that are coming from these tools. We can use them to automate tasks, but I would never use them to put information out there without a human actually reviewing the quality of the content, because it can be a big mess.
Lytle: If someone works for a small organization, do they have the ability to test chatbots without a significant budget to put toward it?
Crosas Batista: Even with a small budget, there are tools out there which you don’t need to invest a lot in. I would first invest in the team, either internal or outsourced. But look for people who have different technical skills; not only a developer: you can look for a data scientist, or you can look for an engineer, it doesn’t matter. You can use other models that are out there; you can use big ones that are free. For now, ChatGPT is free, so feel free to use it. There are many tools that can help you build chatbots; obviously, there’s going to be a constraint, maybe you cannot personalize it, but there’s no excuse not to test it. Then, if it brings value and you can convince your stakeholders, you can put more money on top of that. But I wouldn’t agree with the statement that there’s no budget for experimenting or for experimentation purposes. Scaling is different, because each organization has its own budget. But I think we should be investing in that. Because otherwise, you’re out of the race.
Editor’s Note: This interview has been edited for clarity and brevity. Laine Cibulskis contributed to this article.