Reprinted from TidBITS by permission; reuse governed by Creative Commons license BY-NC-ND 3.0. TidBITS has offered years of thoughtful commentary on Apple and Internet topics. For free email subscriptions and access to the entire TidBITS archive, visit http://www.tidbits.com/

Apple and Google Spark Civil Rights Debate
by Rich Mogull

One small feature in iOS 8, barely mentioned during the keynote at Apple's Worldwide Developer Conference, has engendered a major civil rights controversy. Apple said that, starting with iOS 8, all first-party applications would be encrypted with Data Protection, which entangles your passcode with a unique ID embedded in your device that not even Apple can recover. The implications had been clear for months, but they only sprang into the public consciousness when Tim Cook released an [1]open letter on Apple's privacy stance, highlighting that the technology would nearly eliminate the possibility of Apple accessing data on your device, even if compelled to try by law enforcement or the government (see '[2]Apple Goes Public on Privacy,' 24 September 2014).

This was quickly followed by [3]FBI Director James Comey stating, 'What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law.' Other law enforcement officials followed with their own condemnations. [4]The chief of detectives for the city of Chicago even said, 'Apple will become the phone of choice for the pedophile.' Ouch.

Before I go on, there are three things you need to know about me:

* My day job is advising companies and governments, big and small, on how to improve their security. One of my specialties is mobile devices; another is cloud computing.

* As an emergency responder (formerly a full-time paramedic, with firefighting and mountain rescue experience as well), I have had to tell parents that their child is dead, more than once, and often for easily avoidable reasons.

* I have three small children of my own.

Perspective is important. As writers, we are always biased by our experiences, especially on emotional issues that pit our civil rights against fundamental fears for ourselves and others. We shouldn't vilify law enforcement officials who see access to our phones as important to their mission to protect us, yet we also can't allow low-frequency statistical events, however great their emotional impact, to create even greater, more generalized risks.

iPhones Have Long Frustrated Law Enforcement -- When the iPhone first came out, I was one of a group of analysts who advised enterprises to avoid it, largely out of security concerns. The original iPhone didn't really have any security, but it also lacked apps. Since then, Apple has dramatically improved the platform's resilience against attacks, even when someone has physical possession of the device.

This is due to how Apple implemented Data Protection in iOS. Your iPhone is always encrypted, but in a way that, on its own, is easy to circumvent. Data Protection enhances that basic encryption by entangling your passcode (if you set one) with a unique identification number burned into the hardware of the device. There is no way to pull that code off the device, which dramatically increases the length of the 'total' passcode that protects your encryption key. One of the more effective techniques for cracking phone encryption is to pull the data off the source device and then brute-force it with larger, more powerful computers.
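To make the entangling idea concrete, here is a minimal sketch in Python. It assumes a generic PBKDF2-style key derivation rather than Apple's actual algorithm, parameters, or key hierarchy; the derive_key function and the timing figures in the comments are illustrative assumptions, not Apple's implementation.

    # Illustrative sketch only -- not Apple's actual Data Protection design;
    # the real scheme uses dedicated AES hardware, different KDF parameters,
    # and a key hierarchy. The point is the "entangling": without the device's
    # fused UID, copied ciphertext cannot be brute-forced off the device.
    import hashlib
    import os
    import secrets

    # Stand-in for the unique ID fused into the device at manufacture.
    # In real hardware it never leaves the crypto engine; software can only
    # ask the engine to perform operations that use it.
    DEVICE_UID = secrets.token_bytes(32)

    def derive_key(passcode: str, salt: bytes) -> bytes:
        """Derive a file-encryption key from the passcode mixed with the UID."""
        material = passcode.encode() + DEVICE_UID
        return hashlib.pbkdf2_hmac("sha256", material, salt, iterations=100_000)

    salt = os.urandom(16)
    key = derive_key("h4x0r!", salt)
    print(key.hex())

    # Back-of-the-envelope math for on-device guessing (assumed figures):
    # if hardware rate-limiting allows roughly one guess every 80 ms, a
    # six-character lowercase-plus-digit passcode gives 36**6, about 2.2
    # billion combinations, or roughly 5.5 years to try them all; a larger
    # character set pushes that into centuries.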
But by adding that device code into the mix, Apple makes it effectively impossible to crack iOS-encrypted data even on the biggest computers, even if you can get the image off the device. Thus most forensics tools used by law enforcement (and criminals) have to crack the encryption on the iPhone (or iPad) itself, using the embedded processor, which mixes the device code back in. However, this hardware is rate-limited, so once you hit about six characters in your passcode, it can take many years (even centuries) to break the code via brute force.

Data Protection has been around for at least three revisions of iOS devices, but for most of that time it protected very little: email (which the police can get from servers) and any applications that turned on the Data Protection API. Apple has never been able to access this data… for anyone. However, plenty of other data was exposed, including text messages, photos, app data that didn't turn on Data Protection, contacts, and much more. [5]I detailed all of these limits in my iOS data security papers, because they were major security headaches for my enterprise clients. If you really wanted to secure data on an iOS device from an attacker, you needed to be extremely careful and use all sorts of additional security controls. That's why I traveled to China using only 'clean' devices that didn't have access to my normal accounts or data.

iOS 7 expanded Data Protection to all third-party apps (by default, without their needing to use the API), and iOS 8 merely expanded it to all Apple-provided apps, including Messages, Camera, and everything else. Apple didn't deliberately lock out law enforcement to protect pedophiles; it added a much-needed security control to protect all of us from common criminals.

The Bane of Backdoors -- It shouldn't be a surprise that some law enforcement officials consider it their right to access our phones. Since the dawn of civilian law enforcement, we have granted police investigative powers: the ability to access anything outside our heads, with a proper court order. With probable cause, the police can read our mail, enter our homes, listen to our phone calls, and more. That's how many of them view the world: not with totalitarian malice, but as a legal privilege entrusted to them to protect society. Get the warrant, get the order, get the information. It's all part of the legal process.

But technological advances have created a modern society that is no longer so clean. There is a term in the defense world, 'dual use,' that describes technologies that can be used for good or evil. As a ski patroller, I used explosives to trigger avalanches; the same explosives are used in construction and in some military munitions.

Law enforcement, especially federal law enforcement, has a history of desiring and imposing backdoors into technology. The [6]Communications Assistance for Law Enforcement Act (CALEA) of 1994 requires all telecommunications equipment manufacturers to build remote wiretapping capability for law enforcement into their hardware. But CALEA backdoors have also been [7]abused by criminals and intelligence agencies. Last year, [8]the New York Times revealed that the Obama administration was on the verge of backing CALEA-II, which would require any Internet communications service to include a backdoor for direct law enforcement wiretapping. All chat programs, social networks, and even webcasting tools would have had to provide a secret entrance for law enforcement.
Even dismissing concerns over NSA monitoring, it is technologically impossible to build such backdoors without the possibility of abuse by attackers. All wiretapping interfaces are dual use and subject to abuse; no one has ever created a monitoring technology immune from attack or misuse. In a short-sighted attempt to maintain existing investigative capabilities, the FBI is pushing to reduce communications security for the entire Internet, which would also destroy the global competitiveness of these companies. Then again, [9]Microsoft is currently being held in contempt of court for not providing U.S. law enforcement with a customer's private email messages stored in a data center in Ireland, even though doing so would violate Irish law. It's a court case that could decimate the global market for all U.S.-based cloud computing providers.

A Rusty Key -- On 3 October 2014, the Washington Post Editorial Board [10]proposed that Apple and Google build in a 'Golden Key' to provide access to phones for law enforcement, and only for law enforcement with a court order. That isn't something I'm opposed to in concept, except that it is technologically impossible. I don't know a single security expert or cryptographer who believes a secure golden key is possible, especially not one that would be accessible to a range of law enforcement officials, over time, for various cases. It simply isn't possible. The closest equivalent is the system of [11]fourteen holders (and seven backups) of the Internet DNS signing keys, which works only because of their geographic distribution and the infrequency of their meetings. Such a system could never handle the logistics of ongoing law enforcement requests. And that ignores the international implications of creating a backdoor to all phones that could be abused by any government.

Apple clearly sees its security as a competitive advantage, [12]as I wrote months before Tim Cook's message, especially since Google, by the nature of its business model, can never maintain privacy as well as Apple (despite recent failures), no matter how strong its security capabilities. But all that marketing doesn't change the fact that Apple closed a long-known security flaw first and leveraged the marketing later.

Attorney General Eric Holder claimed [13]Apple and Google are 'thwarting' law enforcement's ability to stop child abuse. I have witnessed such abuse. Perhaps not on the scale of an FBI agent, but I know what horrors lurk in the world, and I have directly seen the consequences. They are things I do not talk about, not outside the circle of former coworkers who have been there themselves.

No matter how statistically rare such crimes are (and most abuse is by a family member), like all parents I fear for my own children (essentially being on 24/7 suicide watch for the first years of their lives, fretting about SIDS, the consumption of inedible objects, power outlets, and other infant dangers, is hard to forget). If something happened to them, I wouldn't care what laws or social norms impeded my ability to protect them, and I know the feeling of helplessness when you lack the resources or ability to save someone else's child. These are powerful emotional forces.

I fully understand the drive and motivations the law enforcement community has to maintain access to our devices. That access speeds up or breaks open cases. It enables police, at times, to better protect us from the worst the world has to offer.
I understand how they can perceive Apple and Google as interfering with their ability to collect data they are legally entitled to access in the course of their duties, data that could, at times, save lives. But law enforcement needs to understand that technology companies aren't trying to protect the bad guys, but to stop them. That until iOS 8, I had to walk my clients through the iOS security loopholes that made it difficult to protect corporate and personal data. That such backdoors are already used to suppress free speech throughout the world, sometimes fatally. That without this encryption, we are all less secure.

Society is in the midst of a major upheaval powered by technology, one in which the lines of privacy, civil rights, and the role of government are shifting as we fundamentally alter our social and communications structures. We now need to decide whether, as we make this transition, we give our governments pervasive access to all of our information, which also reduces our collective ability to defend ourselves from criminals, or whether we err on the side of stronger inherent security, knowing that some criminals, even the worst of them, will occasionally get away.

References

1. http://www.apple.com/privacy/
2. http://tidbits.com/article/15096
3. http://blogs.wsj.com/law/2014/09/25/fbi-director-concerned-about-new-smartphone-encryption/
4. http://www.washingtonpost.com/business/technology/fbi-blasts-apple-google-for-locking-police-out-of-phones/2014/09/25/68c4e08e-4344-11e4-9a15-137aa0153527_story.html
5. https://securosis.com/assets/library/reports/Defending-Data-on-iOS-v.2.pdf
6. https://en.wikipedia.org/wiki/Communications_Assistance_for_Law_Enforcement_Act
7. http://www.internet-security.ca/internet-security-news-archives-039/calea-mandated-systems-are-abused-probably-will-continue-to-be.html
8. http://www.nytimes.com/2013/05/08/us/politics/obama-may-back-fbi-plan-to-wiretap-web-users.html
9. http://www.theguardian.com/technology/2014/sep/03/microsoft-contempty-court-judge-data-dispute
10. http://www.washingtonpost.com/opinions/compromise-needed-on-smartphone-encryption/2014/10/03/96680bf8-4a77-11e4-891d-713f052086a0_story.html
11. http://www.theguardian.com/technology/2014/feb/28/seven-people-keys-worldwide-internet-security-web
12. http://www.macworld.com/article/2366921/why-apple-really-cares-about-your-privacy.html
13. http://www.theguardian.com/technology/2014/feb/28/seven-people-keys-worldwide-internet-security-web