“Hello, I’m Eva. It’ll be ten months since I was first imprisoned here. They accuse me of international trafficking and organized crime, although I feel more like a victim than anything else.”
This is how the conversation begins with Eva, a chatbot powered by artificial intelligence (AI) developed by the team at Paraguayan media outlet El Surti.
Although it uses a fictitious name, there is a real person behind every word of the chatbot. She is not a virtual assistant. Eva is actually a 28-year-old woman held in the Buen Pastor prison in Asunción, Paraguay. She was accused of serving as a mule for a drug trafficking network and the chatbot is a tool to maintain her anonymity.
“My lawyer says the sentence will be 10 years. And I told him that I cannot accept something that does not correspond to me,” Eva continues. “What I had was an attempt, but there was no completion, that work was not completed. So, why are they going to judge me by that measure?”
As its creators explained to LatAm Journalism Review (LJR), Eva the chatbot can give more than 118 responses via text. The user can type free-form questions but is also offered suggested options that facilitate the conversation.
Eva's story represents the more than 400 women deprived of their freedom in Paraguay for illicit drug trafficking: women at the last link of the drug trafficking chain who end up with years-long sentences, separated from their families and, as in Eva's case, suffering from depression.
“This was an opportunity for Eva to narrate her own story and the audience to approach the story from another angle and through more direct communication,” Juliana Quintana, a reporter for the project, told LJR.
It is not common to see the use of chatbots to tell journalistic stories in Latin America. Most chatbots made in the region are virtual assistants helping to combat misinformation. For example, Fátima from Aos Fatos, La tía de WhatsApp from Efecto Cocuyo, and Chequeabot from Chequeado.
“The audience has a lot of training in the use of chats,” Sebastián Hacher, conversational designer of the El Surti chatbot, told LJR. “However, journalism has not finished taking advantage of it. At least not in this format of telling a nonfiction story.”
Profitability of chatbots in the media
To build Eva, the El Surti team processed hours of interviews with the source and divided them into small fragments to simulate a fluid dialogue.
The design was enhanced with the use of AI, specifically OpenAI's GPT-3.5 Turbo, the model behind ChatGPT.
“We use AI to train the chatbot so it can understand what is being said to it and can direct a response,” Hacher said.
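The pattern the team describes — splitting interviews into small fragments and using the model to match each incoming question to the right fragment — can be sketched in a few lines. This is an illustrative sketch only, not El Surti's actual code: keyword overlap stands in for the GPT-3.5 Turbo call, and all fragment texts and keywords below are invented examples.

```python
import re

# A chatbot that maps a user's free-text question to one of the pre-written
# response fragments cut from the interviews. In Eva, an AI model reportedly
# handles this intent-matching step; here simple keyword overlap stands in.

FRAGMENTS = {
    "sentence": "My lawyer says the sentence will be 10 years.",
    "arrest": "It'll be ten months since I was first imprisoned here.",
}

KEYWORDS = {
    "sentence": {"sentence", "years", "judge", "trial"},
    "arrest": {"imprisoned", "prison", "arrested", "when"},
}

def pick_fragment(message: str) -> str:
    """Return the fragment whose keywords best overlap the user's message."""
    words = set(re.findall(r"\w+", message.lower()))
    best = max(KEYWORDS, key=lambda topic: len(KEYWORDS[topic] & words))
    if not KEYWORDS[best] & words:  # nothing matched: ask the user to rephrase
        return "I'm not sure I understand. Could you ask another way?"
    return FRAGMENTS[best]

print(pick_fragment("How many years is your sentence?"))
```

In the real chatbot, the matching is done by the language model rather than keyword sets, which is what lets it "understand what is being said to it and direct a response" even when the user's wording varies.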
Unlike other chatbots created by Latin American media, Eva does not work through the messaging platform WhatsApp.
The reason comes down to money. Meta, the company that owns the application, charges per conversation. Therefore, having a standard chatbot plan on WhatsApp can cost US $500 per month – “a cost that can be difficult for an independent media outlet,” Hacher said.
To build the chatbot, El Surti had the support of the Gabo Foundation as it was one of 14 proposals selected for the 5th edition of the Fund for Investigations and New Narratives on Drugs (FINND, for its acronym in Spanish). The scholarship came with US $5,000 to carry out the project.
In the first month after launch, the chatbot had 10,000 interactions and a high retention rate. “It's a lot, if you compare it to the time spent on a regular article,” Hacher said.
Media chatbots, in general, are not created for economic performance but to improve the content and the way in which information is transmitted.
Laura Zommer, co-founder and CEO of Factchequeado, an initiative to combat misinformation in Spanish in the United States, said chatbots allow us to understand the information needs of users.
“We are not going to monetize our chatbot, rather we invest money to create it,” Zommer said about the Factchequeado chatbot that will be launched in October. “For our journalism model it can be profitable, but not necessarily because of money, but because if we can know what the information gaps are, we can prevent the ball of misinformation from continuing to roll.”
Protecting the source
The project with Eva was completed in four months. While part of the team was dedicated to developing the tool, another was in charge of interviewing the source.
Quintana said that they paid close attention to every word that appeared in the conversations with Eva and remained faithful to her statements. They only left out some details to protect the source’s identity.
“She is still in the middle of a judicial process,” Quintana said. “We don't want her to be exposed or identified in prison. There are many conflicts that are important to take into account when you do work with vulnerable populations.”
Chatbots not only keep the identity of sources safe, they can also make the audience feel more comfortable asking questions.
“People are encouraged to ask things that they might be embarrassed to ask in person,” Zommer said.
So comfortable that, as Eva's developers explained, they have encountered provocative, offensive or out-of-place messages that seek to mess with the chatbot. What the team has done is to train the tool so that it provides more assertive responses and can better connect with the audience.
“The chatbot is not just a way to sneak artificial intelligence into our work dynamics,” Quintana said. "The chatbot serves the story, it serves to bring the life of the protagonist closer and creates a bridge with audiences that usually do not empathize much with themes like these."
This article was originally published on LatAm Journalism Review and republished on IJNet with permission.
Yutong Liu & Kingston School of Art / Better Images of AI / Talking to AI / Licensed by CC-BY 4.0