If Apple gives in to the FBI and creates backdoor access to a device’s contents, the backdoor is likely to get into the hands of the people committing crimes and spreading terror.


Practical Mac

Apple is dominating the news, without a new product in sight.

In this case, as you have probably read or seen, Apple is tussling with the FBI over an iPhone used by one of the murderers in last year’s San Bernardino shootings.

Briefly, the situation is this: The FBI wants to access the information on an iPhone 5c used by Syed Rizwan Farook, a phone supplied by his employer, the San Bernardino County Department of Public Health.

The phone is protected by a pass code, as well as the setting that erases the device’s contents after 10 attempts. To get in, the FBI wants Apple to create a modified version of iOS that will update the existing operating system and remove the automatic erasure. Then, investigators could “brute-force” the device by trying millions of number combinations.
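To see why that auto-erase setting is the real obstacle, it helps to run the numbers. Here's a back-of-the-envelope sketch in Python; it assumes Apple's published figure of roughly 80 milliseconds per on-device passcode attempt, so treat the results as ballpark estimates rather than exact timings:

```python
# Rough brute-force timing once the 10-attempt erase limit is removed.
# ASSUMPTION: each on-device guess takes ~80 ms (Apple's iOS Security
# guide calibrates passcode key derivation to roughly that figure).

SECONDS_PER_GUESS = 0.08

def worst_case_hours(keyspace: int) -> float:
    """Hours needed to try every possible code at ~80 ms per attempt."""
    return keyspace * SECONDS_PER_GUESS / 3600

print(f"4-digit code: {worst_case_hours(10**4):.2f} hours")  # ~0.22 hours (13 min)
print(f"6-digit code: {worst_case_hours(10**6):.1f} hours")  # ~22 hours
```

At that rate, every possible four-digit code can be tried in under 15 minutes, which is exactly why the erase-after-10-attempts safeguard exists.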


Apple is refusing, for good reason. The FBI claims that this is a one-time-only request, but the agency is either unbelievably naive (doubtful) or being disingenuous about its motives. Here’s why: Creating that compromised version of iOS can’t possibly be limited to this one instance, and the consequences of making it are dire.

I’m not being hyperbolic. If such a version were to exist, it would mean that the iPhone’s built-in encryption can be defeated. It would also set a precedent that would open the door to courts demanding Apple do the same in other cases. (The FBI has since acknowledged that’s a possibility.)

But wait, that sounds good, right? We want to defeat terrorists and catch criminals and keep people safe. Unfortunately, this solution makes people less safe, because once this type of backdoor exists, it’s likely to get into the hands of the people committing crimes and spreading terror.

According to forensic scientist Jonathan Zdziarski, for any evidence gleaned from this type of search to stand up in court, multiple copies of the exploit would have to be shared: for validation, for review by prosecutors and defendants, and more. Now imagine the value of being able to break into any iOS device worldwide. Criminals (and governments) would certainly tempt someone in that chain of dissemination to make a copy.

And once it's in the wild, no iOS device would be safe: not yours, not lawmakers' (the President gets intelligence briefings on an iPad), not CEOs' (exposing all sorts of potential industrial secrets), and not those of anyone else with private information.

And it’s not just iOS. Setting this precedent would mean the FBI or other agencies could compel Google and other manufacturers to expose hundreds of millions of devices. And if the United States can do it, you had better believe China, Russia and other nations will, too.

My colleague Rich Mogull, in one of the most cogent takes on this situation, wrote, “(A)sk yourself, do we have a right to security? To secure devices, communications and services? Devices secure from criminals, foreign governments and yes, even our own? And by extension, do we have a right to privacy? Because privacy without security is impossible.”

Right now, Apple and the FBI are moving into the legal phase of this argument. It's nearly guaranteed, though, that an upcoming version of iOS will close off even the workaround the government is asking for in this case.

So, from a practical standpoint, what can you do to maintain the security of your own personal information?

First, if you’re not using a pass code on your iOS device, set one up now. The pass code activates encryption of data on the phone. If your pass code is four numbers (the previous default), change it to a six-digit or alphanumeric pass code: Go to Settings > Touch ID & Passcode (or just Passcode on devices without Touch ID) > Change Passcode. After you enter your existing code, tap the Passcode Options link and change the type.
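The payoff from a longer code is straightforward keyspace math. As an illustration, here's a comparison using the same hedged 80-milliseconds-per-guess assumption as the earlier sketch, plus a hypothetical eight-character alphanumeric code drawn from letters and digits:

```python
# Keyspace comparison for common passcode types. The alphanumeric row
# assumes a 62-character set (a-z, A-Z, 0-9) and 8 characters, chosen
# for illustration; iOS allows longer codes and more symbols.

SECONDS_PER_GUESS = 0.08  # assumed on-device guess rate, as above

def humanize(seconds: float) -> str:
    """Express a duration in the largest sensible unit."""
    for unit, size in [("years", 31_536_000), ("days", 86_400),
                       ("hours", 3_600), ("minutes", 60)]:
        if seconds >= size:
            return f"~{seconds / size:,.1f} {unit}"
    return f"~{seconds:.0f} seconds"

for label, keyspace in [("4-digit", 10**4),
                        ("6-digit", 10**6),
                        ("8-char alphanumeric", 62**8)]:
    worst_case = keyspace * SECONDS_PER_GUESS
    print(f"{label}: {keyspace:,} codes, {humanize(worst_case)} to exhaust")
```

Even a six-digit code is a hundredfold improvement over four digits; a mixed alphanumeric code pushes a worst-case brute force from hours into geological time.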

Next, save sensitive information in a secure app such as 1Password. The days of keeping passwords in Notes or your Address Book are over. (That said, the next version of iOS will allow individual notes in the Notes app to be secured by pass code.)

If you're especially concerned, turn off iCloud backups and back up your device's data to a computer using iTunes; there you'll have an option to encrypt local backups. Currently, Apple holds the keys to iCloud backups, so it can read their contents and will share that information when subpoenaed to do so.

None of these actions makes you a terrorist sympathizer, because, honestly, terrorism is low on the list of threats to everyday people. Someone stealing your iPhone, breaking into it, and accessing your bank accounts and online accounts is more likely, and even that is a remote possibility compared with having your bank card numbers stolen from a big retailer that doesn’t properly secure its own data.

I’m hopeful that a sensible resolution comes out of this, and it’s good that people who don’t follow technology are being exposed to this issue.

Whether on phones, wearables or “smart home” devices, encryption and security are absolutely vital, and those technologies are coming fast. Undermining security in one instance undermines it for everyone.
