With recent advances in technology, the power to obfuscate information, which only a short time ago was available just to nation states, is now found in the palm of any person’s hand. This power protects us and our information from both the criminals wishing to get at that information for their own gains and the unnecessary prying eyes of employers, merchants, carriers, and law enforcement agencies.
Everything works out until something bad happens. Then everyone asks: where were the police, why didn’t the FBI know, how come the CIA did not stop this? This is followed by law enforcement’s quest for information in trying to make sure the bad thing doesn’t happen again.
But reality is not so simple. With the new technology, getting to that information is not just a matter of investigation—the traditional gumshoe efforts—but now computer specialists, technology analysts, and a cadre of experts sleuth the electronic realms where people, both good and bad, have their information. Here they now run into encryption and access controls that stymie the best investigative efforts. So where do they turn? To those companies that design and maintain that infrastructure, with hopes that the same technology that locks law enforcement out can be used to give law enforcement what they seek. But, as just noted, this isn’t so simple.
It is not the technology part that is complex, but rather the question of whether those companies should aid law enforcement and national security. Arguments for providing aid stress that without it, investigative efforts will be for naught and more bad things will happen. Arguments against stress that without the encryption and access controls, we will live in a surveillance state and more bad things will happen.
Where is the truth?
Following the San Bernardino terrorist attack in December, law enforcement sought to make sure that the next bad thing did not happen, while trying to understand how this bad thing did happen. Through investigation, they uncovered the iPhone 5c of Syed Rizwan Farook, one of the terrorists in the attack. This is the device they want to search, where the information is kept, the unknown. Unfortunately, the device is locked, and as a default protection for the phone’s user, it will wipe the information it contains after a number of failed log-in attempts.
Because they believe the information on the device may be critical in furthering the investigation, and recognizing the risks in running up against the failed log-in attempt limit, the FBI reached out to Apple, the manufacturer of the iPhone, to see what might be done to disable the “self-destruct” mechanism. Apple felt it should not help, and the FBI and US Department of Justice sought the federal courts’ aid to procure Apple’s assistance.
The public is divided on this issue. Some feel that Apple should stand up for privacy rights and just say no. Others feel that law enforcement is in the dark with the new technology, and that without the ability to have manufacturers provide access, crucial information won’t be available.
On one hand, Apple has already given assistance in the case, working with law enforcement to provide access to the iCloud backup of the device, and, after it was discovered that the iPhone had last been backed up two months before the attack, developing procedures to try to get the iPhone to automatically back up its current contents. Unfortunately, these attempts have been fruitless, partly because of mishandling of the device, and the information remains locked.
On the other hand, the fear with going further is that the current request to disable the automatic wiping, permitting law enforcement unlimited tries to break the passcode, goes beyond what normal law enforcement assistance would entail. It would open the door to permit any iPhone to be opened by law enforcement or governments. The government argues that its request is for assistance on this particular iPhone and is not designed to create unencumbered access to any device.
I think in this particular instance, law enforcement eventually will get the assistance that it requests.
Notwithstanding some privacy advocates’ fears, the case is setting a useful and strong precedent against sliding down the slippery slope.
First, the request is narrowly tailored to a particularly identified space to be searched, with a reasonable level of specificity as to what is being sought. While the Fourth Amendment prohibits unreasonable searches and seizures, it does not abolish them. Rather, when law enforcement wishes to conduct a search, it must demonstrate probable cause for why the search should be permitted, define where the search is to be conducted, and identify what is to be found. An impartial magistrate then decides whether or not law enforcement has met its burden. Here, law enforcement wants only access to a specific device and is seeking information relevant to terrorist activities, meeting all of these elements, and a court has agreed and ordered the assistance.
Second, the request is not asking Apple to disable the encryption or provide a back door to the algorithm. Rather, the request looks only to permit law enforcement to keep guessing at the passcode until it finds the correct one. Other commentators have offered partial analogies, such as law enforcement breaking into a safe without needing the safe manufacturer’s help, but this scenario is different: not only is the device itself important, but so is the infrastructure that supports it, which is under Apple’s control. That infrastructure is a crucial part of the overall effort and cannot be separated from it, so Apple’s help is instrumental. To that end, one could even argue that having Apple control the means of permitting the unlimited tries would be better for privacy than leaving law enforcement to commandeer the infrastructure and put its own code in place.
These details mean that in this single case, the assistance should be given to law enforcement. Of course, in our civil society, law enforcement should keep the focus on this need for this investigation and not attempt to broaden this to any other opportunity without similar due process.
There is no doubt that what can be done for this case can be done in any other situation. It is crucial that law enforcement continue to adhere to its current position that this is a single request based upon real and suitable facts. It should unequivocally state that this is not a wedge in the door for full, unfettered access to any device, Apple or otherwise. It should stay within the defined realm of the Fourth Amendment, not try to demand the code or control over the infrastructure, and not ask for back doors. Otherwise, how can we build trust and not have Apple, Google, or anyone else question the sincerity of its intentions in the future?
Kenneth P. Mortensen, a School of Law lecturer in law, teaches Privacy Law. He was the Justice Department’s associate deputy attorney general for privacy and civil liberties under President George W. Bush. He can be reached at email@example.com.
“POV” is an opinion page that provides timely commentaries from students, faculty, and staff on a variety of issues: on-campus, local, state, national, or international. Anyone interested in submitting a piece, which should be about 700 words long, should contact Rich Barlow at firstname.lastname@example.org. BU Today reserves the right to reject or edit submissions. The views expressed are solely those of the author and are not intended to represent the views of Boston University.