Customer Letter - Apple

Encryption. That’s a word that gets thrown around a lot in today’s world of smart, connected, “internet of things” devices. Tonight, Apple CEO Tim Cook published “A Message to Our Customers”, which now appears on the company’s homepage. In his letter, Tim arguably hits the nail on the head, openly reiterating Apple’s pro-encryption approach to software and hardware development: “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Here’s the thing: the bad guys will always find a way to get what they want, and we shouldn’t be making it easy for them. Some argue that encrypted services such as iMessage shouldn’t exist, because criminals can use them to communicate without being tracked. But if iMessage (or iOS in general) loses its encryption, a criminal can simply pay a small monthly fee for a VPN service and stay anonymous online anyway. Taking away this encryption would expose literally billions of devices full of personal information to hackers, all in an attempt to catch a few people who are already capable of maintaining an almost-zero digital footprint. The only way to open this data up to the government involves making it easily accessible to hackers, a tradeoff which shouldn’t even be considered.

[…] it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

This raises the question: is it worth giving up the data of millions of innocent individuals for almost no gain? The answer is unequivocally no. You know what would also discourage terrorism? Attaching a microphone to every citizen, having it record 24/7, and analysing the audio for information on potential terrorist plots. Is this going to happen in the near future? No. Is it extreme? Yes, and that’s why it won’t happen. Giving up the privacy of millions in the hope of catching a few isn’t a good tradeoff, particularly when there will always be other ways for these people to hide. There is no guarantee that, even with access to this data, the government would be any closer to uncovering and preventing acts of terrorism.

And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim’s closing line is powerful and rings true. There needs to be a line, and a backdoor in iOS certainly crosses it. You don’t promote freedom and safety by exposing personal data; it doesn’t, and can’t, work both ways. Apple have established themselves as market leaders in privacy and encryption, and that competitive advantage sells iPhones and matters greatly to the privacy conscious. It’s great for customers to have this choice, and should Apple be forced to decrypt iOS, it would begin a slippery slope towards a world where all our data is insecurely stored. We need to ensure Apple’s long-standing commitment to privacy and security continues and isn’t taken away going forward.

I’ll end this post with what, for me, was the Tweet of the night related to Tim’s letter:

To be totally secure, it need to be physically impossible. As long as there’s a human deciding what’s possible & not, there’ll be conflict

— Steve T-S (@stroughtonsmith)

February 17, 2016

This matter can’t be left to human judgement. It needs to be impossible for Apple, its engineers, and the government to access this data, or else the keys will undoubtedly be abused.