We are living through a privacy tipping point. Technology is dramatically changing what is possible in terms of surveillance, monitoring, data persistence and analysis. We are cracking open the lid of Pandora’s box, and we still don’t really know what’s inside. Maybe the benefits of the disappearance of privacy will outweigh the negative outcomes. Is it possible that the quantified self will be worth trading for the all-seeing eye of the corporation or state? We don’t understand the full story yet.
One thing that is clear is that this is an important time to stop and think. Before we give away privacy in ways that may (already) be very difficult to undo, we ought to slow down and consider the implications. That’s what brings me to write about this. I am not an expert on privacy but I believe it to be an important enough issue that it will require all of us to come up with an approach that realises the benefits of technology without undermining our rights and our autonomy.
We know that both corporations and governments are actively collecting data about us. We are not happy about the covert collection of data about us by governments, yet ironically we queue up to give away our privacy to corporations in exchange for services. We know that the proliferation of technology is making it harder to be anonymous. In particular, the smarter our mobile phones get, the more data they leak to governments and corporations alike.
We’ve known for some time that it doesn’t take many pieces of data to uniquely identify someone. More than 20 years ago, researcher Latanya Sweeney showed that with just three pieces of data (date of birth, gender and postal code) she could identify 87% of the US population. In 2006, researchers Arvind Narayanan and Vitaly Shmatikov shocked Netflix by de-anonymising a massive dataset of recommendations that Netflix had released after stripping it of what they thought was all personally identifiable information.
That is disturbing enough on its own, but it has actually got worse since then. Last year, researchers from MIT were able to uniquely identify individuals from cellphone records using just four data points that indicated location and time. In fact, location data turns out to be incredibly informative. In a competition called the Nokia Mobile Challenge, researchers were able to estimate a user’s gender, marital status, occupation and age based on location information alone. Researchers on location-tracking point out that the accumulation of data is significant. What is anonymous in small amounts becomes personally identifiable information in large amounts.
Some massively open online courses (MOOCs) are now collecting keystroke information on students, which they use to uniquely identify them. The goal of eliminating fraudulent behaviour in MOOCs is laudable but the collection of this data raises privacy issues. How would you know for certain when this data is or is not being collected from you? What if this data found its way into the hands of other, less scrupulous organisations that might conceivably use it to find you anywhere on the Internet?
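To make concrete why keystroke data is identifying, here is a hypothetical sketch of the basic idea behind keystroke dynamics (not any specific MOOC’s system): reduce a typing session to the average delay between successive key pairs, then compare sessions by the distance between those timing profiles. The typists, key sequences and timings below are all invented for illustration.

```python
# Hypothetical sketch of keystroke-dynamics fingerprinting.
# A session is a list of (key, timestamp_ms) events; its "profile" is
# the mean delay between each observed pair of successive keys.
from collections import defaultdict

def typing_profile(events):
    """Map each (key_a, key_b) pair to its mean inter-key delay in ms."""
    delays = defaultdict(list)
    for (k1, t1), (k2, t2) in zip(events, events[1:]):
        delays[(k1, k2)].append(t2 - t1)
    return {pair: sum(d) / len(d) for pair, d in delays.items()}

def profile_distance(p1, p2):
    """Mean absolute timing difference over key pairs both profiles share."""
    shared = p1.keys() & p2.keys()
    if not shared:
        return float("inf")
    return sum(abs(p1[k] - p2[k]) for k in shared) / len(shared)

# Two sessions by the same (simulated) typist, one by a slower typist.
alice_1 = [("t", 0), ("h", 95), ("e", 180), ("t", 400), ("h", 498)]
alice_2 = [("t", 0), ("h", 102), ("e", 188), ("t", 410), ("h", 505)]
bob     = [("t", 0), ("h", 240), ("e", 460), ("t", 800), ("h", 1050)]

pa1, pa2, pb = map(typing_profile, (alice_1, alice_2, bob))
print(profile_distance(pa1, pa2) < profile_distance(pa1, pb))  # True
```

The same typist’s rhythm stays stable across sessions while differing from everyone else’s, which is exactly what makes it usable both for fraud detection and for tracking you anywhere you type.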
So, we live in an era where it is becoming increasingly challenging to protect one’s privacy. In fact, I am told that de-anonymisation researchers have recently reached the point where some are choosing not to publish some of their research results because they might be used to further undermine privacy.
My data
One popular reaction to the problem of the erosion of personal privacy is to attempt to reclaim privacy through personal data control, where we are able to establish and exert our own individual preferences in order to set our own boundaries for privacy. The notion of privacy as an individual transaction, where we are each allowed to choose whether or not to share personally identifiable information, sounds like a great improvement on what we have now, where we have very little individual control. Laura James of the Open Knowledge Foundation makes the case in a blog post that the right to choose should be an essential element of “my data”. She says: “If it’s my data, just about me, I should be able to choose to access it, reuse it, share it and open it if I wish.”
Until recently, I would have said that this was hard to argue against. But what I have learned recently has made me realise that privacy cannot be so easily reduced to individual transactions. In his excellent lecture series, Snowden and The Future, Eben Moglen makes the case — in part 3 — against privacy as a transactional issue. He points out that “if your family contains somebody who receives mail at Gmail, then Google gets a copy of all correspondence in your family”. Your personal decision has privacy implications for everyone you know.
Perhaps even more worryingly, researcher Scott Peppet argues that decisions to reveal personal information publicly have implications for those who choose not to. He suggests that people with “valuable credentials, clean medical records and impressive credit scores will want to disclose those traits to receive preferential economic treatment”. Pressure is then put on those with only marginally less valuable credentials to share in order to benefit. Peppet argues that others could find they also need to disclose personally identifiable information in order to avoid negative inferences that may be drawn from staying silent.
New metaphors
So, apparently we need a new way of looking at privacy issues. Researchers Paola Tubaro and Antonio A Casilli have explored a multi-dimensional agency-based model. In their research, they found that a tendency to share more online was accompanied by a counter-tendency among people to protect themselves online. This plays out in complex ways in which we all influence each other through our privacy (or lack of privacy) practices.
Moglen has suggested that, from a legal perspective, privacy is much more like an environmental issue than a transactional issue. He points out that “environmental law is not law about consent. It’s law about the adoption of rules of liability reflecting socially determined outcomes: levels of safety, security and welfare.” Perhaps this is a better way of looking at privacy. I wonder what the privacy equivalent of a fine for littering is?
I wonder if we might look at privacy from a health perspective and consider certain privacy practices as “vaccines” against the more egregious invasions of personal privacy. The notion that privacy is a social thing seems almost oxymoronic at first glance but the closer you look, the more evident it is that privacy is something that we collectively engage in but benefit from individually.
I am still digesting these ideas and reading more. I hope to see something from you, too. Privacy is something too important to be left up to technological determinism or to 20-something billionaires. We all need to read, think and engage.
- Thanks to @barefoot_techie for links to many thought-provoking articles and for the opportunity recently to listen to privacy researcher Kate Crawford
- Steve Song is founder of Village Telco
- This piece was originally published on Song’s blog, Many Possibilities