Apple is locked in a public battle with the prosecutors of the San Bernardino terrorist case over helping the FBI. Tim Cook has been in full public-relations mode, asserting that this "unprecedented" request amounts to distributing a "master key" that will make everything on iPhones public.
The government's request (as opposed to how it is described in the media) is reasonable; it is a simple extension to iOS 8 of part of a service that Apple already provides to government agencies for tens of thousands of Apple devices. By refusing to continue providing that service, Apple prevents local police from returning stolen iPhones to their rightful owners. Apple prevents law enforcement from solving murders, the sexual abuse of children, sex trafficking, robberies and other crimes. And Apple prevents the FBI from keeping us safe from terrorists.
The awful things Cook claims will happen if he complies are already enabled by horribly buggy and security-hole-ridden Apple software. Nothing the government has requested will make things worse.
Apple’s official privacy policy
What was Apple’s privacy policy before the recent war of words on the subject? The policy is clearly stated on the Apple website. There are lots of words about how Apple loves and respects its customers and how wonderful Apple is. The words lead to this conclusion:
That sounds pretty stark! No back door and no server access. Ever! That sure sounds like my information is secure, no matter what!
Apple’s actions on privacy
As it turns out, those are weasel words, as a little digging reveals. All you have to do is go to their “government information requests” page. There they admit that they respond to subpoenas and search warrants. But they “limit our response to only the data law enforcement is legally entitled to for the specific investigation.” Well, maybe it’s not so bad…
Scanning down the page, you find, in HUGE type, this assurance that practically no one is affected by all this:
An amazingly tiny fraction of “customers” have been affected by this grudging acceptance of government coercion.
How much does that tiny, tiny fraction amount to? Being super-conservative about the calculation, I took the quarterly sales of iPhones alone for the last three years (2013 to 2015), as reported publicly by Apple. Truncating each reported result to the lower million, the total is 546 million iPhones. The real number, including iPads and going back further in time, is probably more than twice that. But even for the conservative number, the arithmetic is interesting. Using Apple’s own 0.00673% figure, the total comes to 36,745 customers.
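For anyone who wants to check that arithmetic, here it is spelled out as a minimal Python sketch. The 546 million input is my own conservative tally described above, and the 0.00673% comes from Apple’s page, so treat the inputs as approximate:

```python
# Back-of-the-envelope check of the affected-customer estimate above.
# 0.00673% = 673 per 10,000,000; 546 million is the conservative
# 2013-2015 iPhone tally, truncated the same way as the totals above.
iphones_sold = 546_000_000
affected = iphones_sold * 673 // 10_000_000   # truncate, don't round
print(affected)  # -> 36745 customers
```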
That number does not include “national security” requests, which, according to the same page, numbered more than 750 in the first half of 2015 alone:
To summarize rhetoric and reality about Apple and privacy:
Rhetoric: We don’t create backdoors and “have never allowed any government to access our servers. And we never will.”
Reality: We dish out customer data as required, and do so by the tens of thousands. But we pout while we’re doing it.
What Apple really, really does
Dig a bit further and you can download the details of what customer information Apple holds and how it is handled, in this document:
Here’s a bit of the table of contents:
You can see that the range and scope of the information available go way beyond anything you might imagine from scanning Apple's website pages.
The document also declares that Apple can provide an incredible amount of information from any iOS device running a version prior to 8.0, but “will not” perform data extractions from devices running 8.0 or later. The extraction “…can only be performed at Apple’s Cupertino, California headquarters…”
What the government wants
The government’s request is short and to the point.
They want help defeating iOS 8’s PIN brute-force avoidance mechanisms:
Here’s what they suggest an acceptable means of providing the help would be, a piece of loadable software:
They specifically request software that works for only that phone:
They don’t demand possession of the software; it’s OK if Apple physically has the device and keeps the developed software on site, without even requiring that government agents be present:
And if Apple can think of a different way to accomplish the same results, it’s OK with the court:
In summary, the court will provide Apple with the terrorist’s government-issued iPhone, and wants Apple to create software that will enable the government to do the hard work of figuring out the iPhone’s PIN code so that the government can access the data on the phone. The government is willing to let Apple do this work with the phone at Apple’s offices, with no government agents present, wants the software to work only for the iPhone in question, and does not request a copy of the software.
Tim Cook’s response
Apple hacks into devices and gives the government the private data of tens of thousands of customers, probably a thousand times a year for national security issues alone. It does this in its own facilities, using software it developed for the purpose.
The feds are investigating a terrorist attack on US soil in which 14 innocent people were murdered. The phone in question wasn’t personally owned by Syed Farook; it was owned by the government agency for which he worked, and whose employees he murdered. Breaking years of Apple practice, Tim Cook refuses to help. He explains himself on the Apple website:
He declares the request “unprecedented.” Sure, if you ignore the tens of thousands of other requests Apple had no trouble satisfying.
He says the order “threatens the security of our customers.” And the possibility of future terrorist attacks doesn’t?
He says the order “has implications far beyond the legal case at hand.” Yes it does. But not the way he means it.
A little further down, he gets to the crux of the matter:
He claims he doesn’t have what the government wants. Everyone knows that, and it’s implied in the court order. But he had the equivalent for earlier versions of iOS.
He claims it’s “too dangerous to create.” While he blathers about encryption and about how Apple can’t get at your data, he makes no claim here that the software is impossible to write, because it isn’t. He’s just saying he won’t create it, because he’s too moral or something, and the software would be too "dangerous." Yet Apple built more powerful versions of the requested software for prior versions of iOS, and those somehow weren’t dangerous.
He claims the request is for a “backdoor to the iPhone.” Wow. You can review the actual request above. It’s no such thing. It’s a piece of software that circumvents the iOS 8 defense against brute-force PIN-breaking. Apple gets to create the software and use it at their offices on the provided phone.
Cook goes on:
“The FBI wants us to make a new version of the iPhone operating system.” Maybe that sounds technical and accurate to someone who didn’t read the documents, but it simply isn’t true.
“In the wrong hands, this software…” How exactly is it going to get in the wrong hands, Mr. Cook? Apple employees have full and unfettered access to the source code of Apple software, including iOS. Any time one of them felt like it, they could make an unauthorized version and spirit it to some off-site server, and do all sorts of evil with it. That was true yesterday, is true today, and will remain true regardless of what happens here. The current situation doesn’t change the chances of malicious software being used for bad purposes one iota.
“…would have the potential to unlock any iPhone in someone’s physical possession.” BZZZTTT! What this software would do would be exactly and only what the government is asking for: make it possible to brute-force hack the PIN code, which has one million possible combinations for the default 6-digit PIN. For normal humans, this means you would have to:
- Acquire someone’s iPhone
- Get and load the hacking software onto it, assuming it has somehow wafted out of Apple
- Then, by hand, try 6-digit PIN codes until you hit the one that works
- On average, success comes after entering half the possible codes: 500,000 codes, or a total of 3 million digits. At one digit per second, nonstop, that is more than 34 days of typing (the arithmetic is sketched after this list).
- Or, if you really are a super-hacker, you could automate the process, which I won’t go into here.
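Here is the arithmetic behind that 34-day figure, as a minimal Python sketch. It assumes exactly one keypress per second and that the anti-brute-force protections are out of the way, which is the whole point of the government's request:

```python
# Time to brute-force a 6-digit PIN by hand at one keypress per second,
# assuming the auto-erase and escalating-delay defenses are disabled.
pin_length = 6
possible_codes = 10 ** pin_length            # 1,000,000 combinations
expected_tries = possible_codes // 2         # on average, half the keyspace
digits_typed = expected_tries * pin_length   # 3,000,000 keypresses
days = digits_typed / (24 * 60 * 60)         # seconds -> days at 1 keypress/s
print(f"{days:.1f} days of continuous typing")  # -> 34.7 days
```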
Cook then gets wilder:
Yes, the software, once created, could, would and should be used on "any number of devices": devices provided to Apple at its offices with proper documentation and court orders. Most of these devices, as today, would have been lost by their owners, with Apple helping to identify them so they can be recovered. Many of these devices, as today, would be evidence in criminal proceedings. And hundreds of these devices per year would be related to national security issues, as they are today.
I am very concerned about the FBI being blocked from tracking and stopping terrorists before they kill. But I'm equally concerned about the "merely" criminal aspects of this. For example:
Cook has more:
Because Apple built software, used by Apple itself, on specific phones delivered to Apple facilities with court orders, the government will now be able to listen through your microphone or watch through your camera. How exactly does this leap happen?
The fact is, Apple software was, is and will be chock full of security holes and other problems. Here is Apple's own list of the dozens of security problems that were fixed in iOS 7. After fixing all those problems, iOS should have been secure, right? Apple then found more bugs, refused to fix them on users' devices, and instead released iOS 8 with no fewer than 53 additional fixes for security flaws. So how did iOS 8 go, with all those fixes? Not so well, according to Wired:
Finally, Tim Cook once more:
Apple products have been buggy and filled with security holes in every release. They are riddled with back doors, side doors and bottom doors, all because of Apple's ineptness. It's not getting better. Mr. Cook wants us to fear that the mean government will force us to walk around without privacy. Well, we already do! And it's Apple software that's responsible! Extending Apple's existing practice to iOS 8 will not create a new situation; it will maintain Apple's historic cooperation with the legitimate law enforcement operations of government, protecting us from terrorists and criminals.
What is this really about?
I wish I knew. But it's hard not to think of money and market positioning. A large portion of the public believes that Wall Street and Big Corporations are evil. Meanwhile, Apple makes products used by millions of people who think this way, and it wants to market itself as being on the side of the 99%.
But it has a problem. It's one of the richest, most valuable corporations in the world. It charges top dollar for its products, which are entirely made in cheap-labor countries. It plays games to avoid paying taxes. It's bigger and richer than Wall Street! It's even richer than the US Treasury:
It's quite reasonable to imagine that Tim Cook is following the Steve Jobs tradition of marketing magic to divert customers from looking at the numbers: numbers that show that Apple is a corporate behemoth whose sales are slowing, whose new product initiatives have failed, and which is desperate to bolster its brand and hold onto customer trust (and revenue) it does not deserve.