The narrative about big data often focuses on their power and potential. The possibilities big data hold for civic good are, in fact, exciting. Big data are used for everything from improving crisis response during natural disasters like Hurricane Sandy to providing mobile banking to Africa’s poorest citizens.
But the excitement about the potential of big data to improve our lives must not keep us from noticing that this new era raises ethical questions in urgent need of answers. The question many are already asking is, "To what extent have our personal data been misused by governments and companies?"
Millions of people’s digital footprints have been appropriated for economic gain and surveilled in the name of security, often without our consent. The ongoing case of U.S. National Security Agency espionage, revealed by leaker Edward Snowden, is perhaps the most palpable example. But even something as seemingly benign as unwanted, targeted advertisements in our email inboxes reveals the “not so rigid rules” around our private data.
Invited by the Rockefeller Foundation and Pop Tech to participate in their Inaugural Bellagio/Pop-Tech fellowship, I joined a multidisciplinary group of scholars and professionals to discuss the implications of big data on community resilience. We asked ourselves, "When a city or neighborhood is faced with disaster or stress, such as economic shock, climate change or a natural disaster, what should or could be big data’s role in boosting recovery, or even in helping prevent the disaster in the first place?"
Our conclusion is that when data are misused, networks are eroded, undermining citizens’ trust in the institutions that exist to serve them. For example, during a natural disaster, geolocation is incredibly useful to recovery efforts. Yet very few people agree to share their location on social media platforms, wary that their data will be captured by companies eager to exploit those data for inappropriate ends.
Our group then asked, “How can we encourage people to share their data for social good? How can we enable people to share their information only when they feel it is necessary?”
In response, we have a proposal. Let's adopt the hashtag “#noshare,” or “#ns” as a statement on data ownership and our right to opt out of the current digital drainage of personal information.
We believe this initiative, with enough support, can establish a social norm about what information should or should not be transformed into searchable data. It's a line in the sand for those who want to tap into your data without your knowledge. Will it prevent them from doing so? No. But it will make it abundantly clear if they use your data in a way that violates your wishes. There will be no way for someone who uses your data to deny knowing that you didn’t want them used.
We can use the #noshare tag on social media, but also on buttons pinned on clothes, on signs posted in buildings, or in any other situation when you decide you don't want your actions to be sensed or surveilled. As Patrick Meier, another Pop Tech fellow in our group, wrote on his blog, iRevolution, this is “a humble attempt at regaining some agency over the machines in the interest of privacy.”
Why this matters for journalism
In the era of big data, a discussion of ethics will be key for journalism organizations. The enormous amount of data currently generated is reshaping journalism as we know it. Data journalism, for example, is a whole new branch of our field. It relies on our growing technological capacity to access and analyze large data sets so we can reveal and tell the stories behind them.
Media groups are already creating algorithms to capture and read data. One of the most impressive examples I have seen is the OpenPaths project of the New York Times, a “secure data locker for personal location information” that allows people to share their geodata with specific arts and education projects.
There are also academic efforts underway that seek to understand how the traditional practices of sourcing, producing and distributing information are transforming. Scholar Emily Bell, director of the Tow Center for Digital Journalism at Columbia University Journalism School, and some of her colleagues argue that we’re experiencing the emergence of post-industrial journalism.
No doubt, there is a thrill in using data to innovate journalism. This is especially important in a field often considered moribund, given the many layoffs and newsroom closures around the world. By working with large amounts of data, media companies are creating interactive products and investing in crowdsourcing information from their audiences.
Nevertheless, the public's trust in the media, and the perceived importance of journalism, will grow only if we adopt policies that respect privacy and are honest about how data are used. Only if journalism can earn the trust of the public can it effectively fulfill its role of holding powerful people and institutions to account.
In my opinion, one of the most interesting concepts describing the relationship between culture and technology comes from German sociologist Ulrich Beck, who writes that we live in a “risk society,” in which the introduction of modern technology brings new hazards and threats alongside its benefits. He frequently uses the example of nuclear energy: it is one of the most advanced ways of extracting energy from matter, but potentially the most dangerous as well.
It seems that if we do not set ethical boundaries for the use of big data, we will keep facing the same contradiction of risks and opportunities.
Gustavo Faleiros is an environmental journalist and media trainer who specializes in data journalism. He is an ICFJ Knight International Journalism Fellow based in Brazil. He thanks Pop Tech Fellows Kate Crawford, Amy Luers, Patrick Meier, Claudia Perlich and Jer Thorp for their feedback, input and inspiration for this post.
Global media innovation content related to the projects and partners of the Knight Fellowships on IJNet is supported by the John S. and James L. Knight Foundation and edited by Jennifer Dorroh.
Image CC-licensed on Flickr via CyberHades.