The FBI, Apple, and access to your iPhone

The Observer editorial board

A U.S. magistrate has ordered Apple to help the FBI break into a work-issued iPhone.

Long before Syed Farook and Tashfeen Malik killed 14 people in San Bernardino last December, the federal government and technology companies had been fighting over America’s mobile phones.

The government says information on those devices might help it solve crimes and prevent future terrorist attacks. Technology companies say the encryption software on those phones keeps their customers safe from hackers and snoops.

Now, the San Bernardino case has crystallized and heightened the debate. The FBI has possession of Farook’s iPhone 5C, but investigators don’t have the passcode that would unlock the device and perhaps provide important clues in the case. Apple doesn’t have that passcode, either, and CEO Tim Cook says the encryption software on new iPhones is so sophisticated that Apple can’t access it.

On Tuesday, a federal magistrate in California ordered the company to try. Apple, the ruling said, should essentially invent new code so that the government can try passcodes on Farook’s phone without triggering the feature that erases its data after too many incorrect attempts.

As with most fights about law enforcement access to evidence, both sides make a compelling case. The FBI says it needs guaranteed access not only to Farook’s phone but to others, so that criminals cannot go “dark” by covering their tracks digitally. In Farook’s case, that access is clearly a matter of national security.

Technology companies say – and experts agree – that once you give the government a “back door” into devices, there’s a strong risk you’re also giving it to hackers and foreign governments worldwide.

For tech companies, such vulnerability might also threaten the bottom line. Customers want to know that their data are safe from prying fingers, a message some companies heard loud and clear when it was revealed that they willingly gave up customer data to federal investigators. Apple, too, reportedly helped the FBI on dozens of occasions before 2015 by unlocking phones at the government’s request.

Now, Apple has created encryption so complex that it can’t crack Farook’s passcode, and Cook has vowed to appeal the court order. We share his worry that the code his company invents might eventually get into the hands of others, and we agree that the court order opens the door to the government demanding other kinds of technological assistance. What’s to stop the FBI from compelling tech companies to write programs that use a phone’s camera or microphone to snoop on someone suspicious?

What we don’t share with Cook is a depth of technological knowledge. So it’s unclear if a compromise exists that allows Apple to keep a phone-cracking code from the FBI while helping it gain access to Farook’s iPhone.

Still, even that would set a troubling precedent. Do we want the government to be able to require that companies invent or redirect technology to serve the government’s purposes? In Syed Farook’s case, it’s clear why the answer is yes. But that’s a back door people should be uncomfortable opening.