Under increasing political pressure because of its controversial policy of automatically encrypting new iPhones and other devices, Apple has gone on the offensive. But it may already be too late.
Law enforcement has been complaining about Apple’s encryption policies since they were first announced in mid-2014. But in the wake of the recent terrorist attacks in Paris and San Bernardino, the issue has become politicized, with many accusing Apple of shipping technology that protects terrorists and other villains.
The issue with encryption, of course, is that when properly implemented it is effectively unbreakable. Even Apple has no way to bypass its own security controls and gain access to the contents of an encrypted smartphone. And Apple understandably refuses to build in a so-called backdoor for law enforcement, because such a mechanism would quickly be exploited by hackers and evil-doers.
This is the crux of the issue. Law enforcement, and now politicians and other policy makers, argues that always-on encryption helps terrorists and others hide their communications and other information, enabling them to harm the innocent. By this reasoning, Apple is in effect undermining the security of the United States and other countries.
Apple disagrees, and always has. But in recent days, the company has forcefully gone public with its views.
“I don’t believe that the trade-off here is privacy versus national security,” Apple CEO Tim Cook told Charlie Rose on “60 Minutes” on Sunday night. “I think that’s an overly simplistic view. We’re America. We should have both.”
“Here’s the situation,” he explained. “On your iPhone, there’s likely health information, there’s financial information. There are intimate conversations with your family, your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it.”
Apple’s public stance is notable, and even laudable if you’re a privacy adherent. But building a device that purposefully locks out law enforcement is a legal gray area. And some are hoping to change that.
“As a society, we don’t allow phone companies to design their systems to avoid lawful, court-ordered searches,” Senator Tom Cotton said in response to Mr. Cook’s assertions. “If we apply a different legal standard to companies like Apple, Google and Facebook, we can expect them to become the preferred messaging services of child pornographers, drug traffickers and terrorists alike—which neither these companies nor law enforcement want.”
On that note, various U.S. senators are now working on legislation that would prevent Apple and others from locking law enforcement out of smartphones and other devices.
And they’re not alone: A new “Investigatory Powers” bill in the U.K. proposes to require these companies to provide “permanent interception capabilities” for any information on their devices, including “the ability to remove any encryption.”
There’s no simple answer here. But Senator Cotton is right about one thing: Apple lacks access to the encrypted data on its own devices only because it designed them that way. Were it to offer a backdoor to law enforcement, it could update that backdoor should it ever be compromised by villains. And that could in turn lead to an ideal situation for Apple, in which customers would over time have to buy new devices to get the newest security and other features.
A compromise? Sure. But that’s what politics is all about. And now that device encryption has become politicized, you can expect big changes. Instead of fighting this, maybe it’s time for Apple to work with politicians and policy makers to find a solution that works for everyone.