Who controls what in the digital world? Apple is currently locked in a court face-off with the FBI, having refused to produce software that would help investigators unlock the phone of San Bernardino gunman Syed Rizwan Farook. The clash is just the latest illustration of how important access to personal data has become as more and more of our lives are reduced to streams of data.
Helping the FBI weaken the iPhone’s access security is technically feasible for Apple. But rather than focus on the technical challenge, we should ask why the FBI has asked Apple to undermine these security mechanisms in the first place. The implication of a court judgment in the FBI’s favour, and the precedent it would create, is that law enforcement and the state would gain the right to undermine the security we apply to our devices, and so gain access to the enormous amount of personal information we store on them.
Smartphones have become powerful pocket computers, stuffed with data that reveals our thoughts and behaviour through records of our online browsing, our social activities, our connections to friends and groups, our interests and so on. A smartphone is more than a contact list: it provides an intimate portrait of its user. Just as the law affords an individual a degree of privacy and protection in their home, so it should treat our phones as digital “homes”.
Following this line of reasoning, think of a smartphone as a serviced apartment. While it is where we “live” digitally, it is also maintained and supported by a third party whose services include protection from unauthorised entry.
In the physical world, law enforcement can lawfully gain the right to enter our homes by obtaining a warrant. The physical act of gaining entry is rather trivial; it is the legal process that must be worked through first that is sometimes trickier to negotiate.
In the case of our digital homes, gaining access can be much more complicated and nuanced. Should the landlord of our digital homes — in this case Apple — be required to help law enforcement force open the door?
If the Internet has shown us anything, it’s that technologies, once invented, can be quickly copied, altered and distributed. Look, for example, at the issues surrounding digital content and copyright infringement. It’s no different in the cybersecurity business: tools to break into software or digital devices are quickly replicated, modified and distributed once discovered.
The Stuxnet worm — weapons-grade software that attacked industrial control systems in an Iranian nuclear processing plant — turned up online six months later, with significant parts of the code freely available for anyone to use. The fear is that the same could happen with the tool Apple would be required to create. Worse still, the tool could be used covertly to break into people’s phones, leaving no trace.
It comes down to a question of who we should trust to protect us and our digital lives. Traditionally, the role of government was to protect its civil population, and yet in this case it would seem that a global tech corporation is acting as the defender of our civil liberties.
Does society believe there are appropriate legal protections in place to balance civil liberties against the demands of police and investigators? In the wake of the Snowden revelations, it’s fair to say that Western societies are questioning whether the current protections are strong enough, and whether the legal framework has fallen far behind the pace of technological change.
- Daniel Prince is associate director, Security Lancaster, Lancaster University
- This article was originally published on The Conversation