Data storytellers share narrative tips at Data on Purpose conference

Jenny Manrique | February 23, 2016

Stories are more convincing when they include analytics and data. At the same time, an effective way to engage audiences when presenting complex digital data is to include real-life anecdotes and personal stories. A lineup of expert data storytellers discussed these two ideas during the Data on Purpose conference, organized by the Stanford Social Innovation Review at Stanford University.

The speakers highlighted the fact that there is always a meaningful story hidden within the data, and that powerful storytelling goes far beyond infographics and visualizations.

“We have more information than ever before, and the abundance of data that can be mined, understood and harnessed is enormous,” said Jake Porway, founder and executive director of DataKind, a nonprofit that connects data scientists with social change organizations.

“Sometimes your (visualization) tool won’t tell the whole story,” he said. “It’s important to practice safe stats, which means to know the context in which data was collected and how to analyze it.”

Here are IJNet’s main takeaways from the event:

Making stories out of numbers

With the increasing access reporters have to documents and databases, the challenge of making stories out of numbers also relies on strong narrative principles: a plot, twists and an ending. According to Cole Nussbaumer, founder of Storytelling With Data, the stories that resonate with us “tell anecdotes that a simple spreadsheet doesn’t.”

“The principles go back to identifying who is your audience, what you need them to know or to do, and how can data help make your point,” she said. Two things Nussbaumer recommends as good practices in data viz are to choose an appropriate visual, “even if that is simple text,” and eliminate the clutter, “getting rid of ineffective graphs that aren’t adding anything to the story.”

Scientists have found that, just as primitive, fast thinking is driven by emotion, individuals interact very differently with a table than with a graph: they prioritize information according to color tone, contrast and proximity, and they look for visuals they can scan quickly.

For data journalists and investigative reporters, being aware of those choices helps make narratives more cohesive and readable.

Verifying documents and data

“We have to interview the data the same way we interview a person,” said Cheryl Phillips, an investigative data journalist and current Hearst Professional in Residence at the Stanford Department of Communication.

“One of the fundamental things to know is that every dataset has its problems, whether it’s a government or nonprofit source,” she said. “So one of the first tasks to do is dealing with (these problems) and cleaning that data, before you really start to use it.”
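Phillips' point about cleaning data before using it can be sketched in a few lines of Python. The dataset and field names below are invented for illustration, but the steps shown here, trimming stray whitespace, normalizing missing-value markers and dropping duplicate rows, are the kind of first-pass cleanup most government or nonprofit CSV exports need.

```python
import csv
import io

# Hypothetical raw export: note the stray spaces, the duplicate
# row, and the "N/A" missing-value marker.
RAW = """city,spills
 Houston ,12
Houston,12
Denver,N/A
"""

MISSING = {"", "n/a", "na", "null"}

def clean_rows(text):
    reader = csv.DictReader(io.StringIO(text))
    seen, rows = set(), []
    for row in reader:
        # Trim whitespace from every field.
        row = {k: v.strip() for k, v in row.items()}
        # Normalize common missing-value markers to None.
        row = {k: (None if v.lower() in MISSING else v)
               for k, v in row.items()}
        # Drop exact duplicate rows.
        key = tuple(row.items())
        if key not in seen:
            seen.add(key)
            rows.append(row)
    return rows

print(clean_rows(RAW))
```

Only after a pass like this does it make sense to start counting, sorting or charting: otherwise the duplicates and "N/A" strings quietly skew the numbers the story rests on.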

According to Phillips, reporters need to “trust the source, understand how they collect the data and figure out if it helps to reflect the accuracy of the story we are telling.”

Tools for gathering data

Because search engines miss database content, pages behind firewalls and registration screens, and dynamically generated pages, experts recommend searching for datasets directly on government websites or on sites built by data experts and other journalists. Here are some resources useful for data mining:

Data Miner: Data Miner is a Google Chrome extension that extracts data from web pages directly into spreadsheets. Features include scraping data from paginated results and downloading an entire website with all the images to your desktop.
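What a point-and-click scraper like Data Miner automates can be illustrated with Python's standard library: pulling an HTML table into rows ready for a spreadsheet. The page markup below is invented for the example; scraping a real site requires per-page selectors and respect for its terms of service.

```python
from html.parser import HTMLParser

# Invented sample page containing one simple table.
PAGE = """<table>
<tr><th>Facility</th><th>Violations</th></tr>
<tr><td>Plant A</td><td>3</td></tr>
<tr><td>Plant B</td><td>1</td></tr>
</table>"""

class TableScraper(HTMLParser):
    """Collects each <tr> of an HTML table as a list of cell strings."""

    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr":
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        # Only keep text that appears inside a table cell.
        if self._in_cell:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(PAGE)
print(scraper.rows)  # header row followed by data rows
```

Each row can then be written out with the `csv` module, which is essentially the spreadsheet export a browser extension performs behind the scenes.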

The Right to Know Network: A repository of environmental data. Its databases include information from states and cities on toxic chemical spills, waste handling, mining, and industrial facilities’ disposal of extremely hazardous substances.

Digital Impact: A novel resource launched by the Digital Society Lab at Stanford, Digital Impact aims to put together all the data that civil society organizations collect based on three principles: consent, privacy and openness.

“Our goal is to use the public data for public benefit,” said Lucy Bernholz, senior research scholar at the Lab. “Following the principles of good governance in collecting data, we are telling our partner organizations, don’t collect what you can’t protect.” Although the site is still in development, it already offers useful resources on database licenses and ethical practices in using data.

U.S. resources for data mining include GAO reports, federal audits and GuideStar.

Main image CC-licensed by Flickr via World Bank Photo Collection.

Secondary image taken by Jenny Manrique.
