By Daniel Ho | Photo Editor
Would you be okay with the government accessing your phone’s camera, microphone, or data files any time it wants, under the pretense of national safety? The FBI’s court order forcing Apple to weaken the security of its own software could be the start of just that: a society in which the information we store digitally, and our digital devices themselves, inevitably become vulnerable to the prying eyes of hackers and the government.
In December 2015, the FBI recovered the work phone of the San Bernardino shooter responsible for killing 14 people. In their haste to access the data on the phone, investigators reset the password to its iCloud account, which prevented the device from automatically backing up any newer data. The iCloud backups they were able to download contained only outdated information from six weeks before the shooting, so the FBI turned to Apple for help.
Though FBI director James Comey claimed that the FBI simply wanted “the chance, with a search warrant, to try to guess the terrorist’s passcode… We don’t want to break anyone’s encryption or set a master key loose on the land,” giving the FBI the ability to electronically brute-force the passcode would be anything but simple. To fulfill the court order, Apple stated that it would have to “create a unique version of iOS that would bypass security protections on the iPhone Lock screen. It would also add a completely new capability so that passcode tries could be entered electronically.”
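To see why electronic passcode entry matters so much, consider a rough sketch in Python (purely illustrative; the try_passcode function below is a hypothetical stand-in, not any real Apple or FBI interface). A four-digit passcode has only 10,000 possibilities, so once guesses can be fed in by a program rather than typed by hand, a simple loop exhausts them almost immediately:

    # Illustrative sketch only: why electronic passcode entry is so powerful.
    # try_passcode is a hypothetical stand-in for whatever unlock check the
    # modified iOS would expose; it is not a real Apple or FBI API.
    from itertools import product

    def brute_force_four_digit(try_passcode):
        # A 4-digit passcode has only 10,000 possibilities (0000-9999).
        for digits in product("0123456789", repeat=4):
            guess = "".join(digits)
            if try_passcode(guess):  # returns True if the phone unlocks
                return guess
        return None

    # Normally iOS defeats this kind of loop by requiring manual entry,
    # adding escalating delays after failed attempts, and optionally erasing
    # the device after ten misses. The court order asks Apple to strip out
    # exactly those protections.

Those protections are the only thing standing between a locked phone and a few minutes of automated guessing, which is precisely why the FBI needs Apple to remove them.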
In the Feb. 16 order the FBI obtained from a federal district court, the government argues that the All Writs Act of 1789, a law that lets courts issue whatever orders are necessary or appropriate in aid of their jurisdiction, compels Apple to unlock the phone. Information pertaining to the San Bernardino shooting and terrorist attacks may seem important enough to national security to justify such an order. However, the FBI’s success in this case would undoubtedly set a precedent that the All Writs Act can force Apple, or any other company, to open up encrypted data in countless other cases, such as prosecuting drug dealers and burglars or searching for potentially incriminating information.
The court order is dangerous not only because of what it would imply in cases unrelated to terrorism, but also because it is a step in the wrong direction for the public’s cybersecurity and privacy. It would mean that engineers can legally be forced, against their will, to write code that creates ways for malicious users to bypass encryption. FBI director Comey says he only wants to use the weakened operating system for this specific case, but software is not nearly as easy to erase as physical documents. Nothing stops future invocations of the All Writs Act from demanding backdoors in the software of TVs, phones, or computers in the name of whatever measures are deemed “necessary” to maintain safety. In the future, the FBI might find it useful to tap into phones to monitor conversations or hijack smart TVs to make sure what you are doing at home is legal.
Another danger lies in the potential for the backdoor Apple provides the FBI to be stolen from FBI or Apple servers by hackers and criminals and used maliciously on other phones. This danger would only multiply as further court orders force Apple and other companies (like Google) to release more backdoors and workarounds. Once the code is written and the method the FBI wants exists, it cannot be erased. It would only be a matter of time before copies of the backdoor spread around and end up in the wrong hands. The growing risk of data being compromised by programs that software companies write themselves means that people could start losing trust in cybersecurity.
Although the FBI wants this software for only a single phone in this case, the implications of complying could compromise trust in cybersecurity for everyone. People may argue that national security comes before privacy, but a world where software companies release hacks to their own code spells disaster for our digital world.