Many readers have asked for a primer summarizing the privacy and security issues at stake in the dispute between Apple and the U.S. Justice Department, which last week convinced a judge in California to order Apple to unlock an iPhone used by one of the assailants in the recent San Bernardino massacre. I don’t have much original reporting to contribute on this important debate, but I’m visiting it here because it’s a complex topic that deserves the broadest possible public scrutiny.
A federal magistrate in California approved an order (PDF) granting the FBI permission to access the data on the iPhone 5c belonging to the late terror suspect Syed Rizwan Farook, one of two individuals responsible for a mass shooting in San Bernardino on Dec. 2, 2015 in which 14 people were killed and 22 others were injured.
Apple CEO Tim Cook released a letter to customers last week saying the company will appeal the order, citing customer privacy and security concerns.
Most experts seem to agree that Apple is technically capable of complying with the court order. Indeed, as National Public Radio notes in a segment this morning, Apple has agreed to unlock phones in approximately 70 other cases involving requests from the government. However, something unexpected emerged in one of those cases: an iPhone tied to a Brooklyn, NY drug dealer who pleaded guilty to selling methamphetamine last year.
NPR notes that Apple might have complied with that request as well, had something unusual not happened: Federal Magistrate Judge James Orenstein did not sign the order the government wanted, but instead went public and asked Apple if the company had any objections.
“The judge seemed particularly skeptical that the government relied in part on an 18th-century law called the All Writs Act,” reports NPR’s Joel Rose. “Prosecutors say it gives them authority to compel private companies to help carry out search warrants.”
Nevertheless, Apple is resisting this latest order, with Cook citing the precedent that complying might set.
“We have great respect for the professionals at the FBI, and we believe their intentions are good,” Cook wrote. “Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
Cook continued: “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
In a letter posted to Lawfare.com and the FBI’s home page, FBI Director James Comey acknowledged that new technology creates serious tensions between privacy and safety, but said this tension should be resolved by the U.S. courts — not by the FBI or by Apple.
“We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly,” Comey said. “That’s it. We don’t want to break anyone’s encryption or set a master key loose on the land. I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”
According to the government, Apple has the capability to bypass the passcode on some of its devices, and can even disable an iPhone’s optional auto-erase function, which deletes all data on the phone after a set number of failed passcode attempts (the default is 10).
The iPhone at issue was an iPhone 5C, but it was running Apple’s latest operating system, iOS 9 (PDF), which prompts users to create a six-digit passcode for security. Since iOS 9 allows users to set a 4-digit, 6-digit or alphanumeric PIN, cracking the passcode on the assailant’s iPhone could take anywhere from a few hours to 5.5 years if the FBI used tools to “brute-force” the code and wasn’t hampered by the operating system’s auto-erase feature. That’s because the operating system builds in a tiny time delay between each guess, rendering large-scale brute-force attacks rather time-consuming and potentially costly ventures.
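The arithmetic behind those estimates is straightforward to reproduce. A minimal sketch, assuming the roughly 80-millisecond per-guess key-derivation delay described in Apple’s iOS security documentation (the delay figure and the character sets below are assumptions for illustration, not specifics from the court filings):

```python
# Rough worst-case estimate of brute-forcing an iOS passcode,
# assuming an enforced ~80 ms key-derivation delay per guess
# (figure taken from Apple's iOS security documentation).

GUESS_DELAY_S = 0.08                    # assumed delay per attempt
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def worst_case_seconds(keyspace: int) -> float:
    """Seconds to try every possible passcode, one guess per delay."""
    return keyspace * GUESS_DELAY_S

for label, keyspace in [
    ("4-digit PIN", 10 ** 4),
    ("6-digit PIN", 10 ** 6),
    ("6-char lowercase letters + digits", 36 ** 6),
]:
    secs = worst_case_seconds(keyspace)
    if secs < 3600:
        print(f"{label}: ~{secs / 60:.0f} minutes")
    elif secs < SECONDS_PER_YEAR:
        print(f"{label}: ~{secs / 3600:.0f} hours")
    else:
        print(f"{label}: ~{secs / SECONDS_PER_YEAR:.1f} years")
```

A 4-digit PIN falls in minutes and a 6-digit PIN in about a day, while a six-character alphanumeric passcode lands near the 5.5-year figure cited above, which is why the auto-erase feature and the per-guess delay, not the math itself, are the real obstacles for investigators.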
In an op-ed that ran in The Washington Post today, noted security expert and cryptographer Bruce Schneier argues that the capability the U.S. government seeks is probably within the FBI’s reach if the agency wants to spring for the funding to develop it itself. He adds that the FBI frames this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.
“There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues,” Schneier wrote. “There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world.”
Schneier said what the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm.
“The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized,” Schneier wrote. “The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.”
Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), said the same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn’t yet in law enforcement’s hands.
“The request to Apple is accurately paraphrased as ‘Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism’,” Weaver wrote.
Apple appears ready to fight this all the way to the Supreme Court. If the courts decide in the government’s favor, the FBI won’t soon be alone in requesting this authority, Weaver warns.
“Almost immediately, the National Security Agency is going to secretly request the same authority through the Foreign Intelligence Surveillance Court (FISC),” Weaver wrote. “How many honestly believe the FISC wouldn’t rule in the NSA’s favor after the FBI succeeds in getting the authority?”
This debate will almost certainly be decided in the courts, perhaps even by the U.S. Supreme Court. In the meantime, lawmakers in Washington, D.C. are already positioning themselves to…well, study the issue more.
In letters sent last week to Apple and the Justice Department, the House Energy & Commerce Committee invited leaders of both organizations to come testify on the issue in an upcoming hearing. In addition, Sen. Mark Warner (D-Va.) and Rep. Michael McCaul (R-Texas) say they plan to unveil legislation later this week to create a “Digital Security Commission” to investigate whether Congress has a bigger role to play here.
Twitter addicts can follow this lively debate at the hashtag #FBIvsApple, although to be fair the pro-Apple viewpoints appear to be far more represented so far. Where do you come down on this debate? Sound off in the comments below.
Tags: apple, Bruce Schneier, encryption, fbi, FISC, Foreign Intelligence Surveillance Court, House Energy & Commerce Committee, ICSI, International Computer Science Institute, James Comey, Joel Rose, Nicholas Weaver, NPR, nsa, Rep. Michael McCaul, Sen. Mark Warner, Syed Rizwan Farook, Tim Cook