Apple Fighting Order To Assist In Decrypting Phone Of San Bernardino Terrorist

Apple is resisting a Federal Court order that it assist the F.B.I. in decrypting the iPhone used by one of the San Bernardino terrorists.


Apple is fighting a Federal Judge’s Order that it help law enforcement unlock the iPhone used by one of the terrorists who carried out the San Bernardino attacks in early December:

WASHINGTON — A judge in California on Tuesday ordered Apple to help the F.B.I. unlock an iPhone used by one of the attackers in the assault in San Bernardino that killed 14 people in December.

The ruling handed the F.B.I. a potentially important victory in its long-running battle with Apple and other Silicon Valley companies over the government’s ability to get access to encrypted data in investigations. Apple has maintained that requiring it to provide the “keys” to its technology would compromise the security of the information of hundreds of millions of users.

The F.B.I. says that its experts have been unable to get into the iPhone 5c used by Syed Rizwan Farook, who was killed by the police along with his wife, Tashfeen Malik, after they attacked Mr. Farook’s co-workers at a holiday gathering.

Prosecutors said in a court filing that Apple had the “exclusive” means to bypass the security features on the phone, but that the company had “declined to provide that assistance voluntarily.” F.B.I. experts say that because of the phone’s security features, they risk losing the data permanently after 10 failed attempts to enter the password.

The Justice Department had secured a search warrant for the phone, which is owned by Mr. Farook’s former employer, the San Bernardino County Department of Public Health. But prosecutors said they saw little choice but to seek the additional order compelling Apple’s assistance.

In an unusually detailed directive, Magistrate Judge Sheri Pym of the Federal District Court for the District of Central California ordered Apple to provide “reasonable technical assistance” to the F.B.I. in unlocking the phone. That assistance should allow investigators to “bypass or erase the auto-erase function” on the phone, among other steps, she wrote.

In a statement, Timothy D. Cook, Apple’s chief executive, said the company would oppose the order and resist efforts to provide a “back door” to the iPhone, and he called the implications of the government’s demands “chilling.”

“For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe,” the company said in the statement. “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

Prosecutors said the contents of the phone could provide crucial evidence about the attackers’ communications and contacts before the shooting.

(…)

Eileen M. Decker, the United States attorney in Los Angeles, where the investigation is being handled, said the effort to compel Apple’s technical cooperation marked “another step — a potentially important step — in the process of learning everything we possibly can about the attack in San Bernardino.”

Prosecutors “have made a solemn commitment to the victims and their families that we will leave no stone unturned as we gather as much information and evidence as possible,” Ms. Decker said.

James B. Comey, the F.B.I. director, has been at odds with Apple and other technology companies for months over whether they should provide de-encryption technology for their products. Without it, he has argued, the bureau is at risk of “going dark” in its investigations. The Democratic presidential candidate Hillary Clinton and most of the Republican hopefuls support Mr. Comey’s stance.

Apple and other technology companies say that creating an opening in their products for government investigators would also create a vulnerability that Chinese, Iranian, Russian or North Korean hackers could exploit.

More from The Washington Post:

The order, signed Tuesday by a magistrate judge in Riverside, Calif., does not ask Apple to break the phone’s encryption but rather to disable the feature that wipes the data on the phone after 10 incorrect tries at entering a password. That way, the government can try to crack the password using “brute force” — attempting tens of millions of combinations without risking the deletion of the data.

The order comes a week after FBI Director James B. Comey told Congress that the bureau has not been able to open the phone belonging to one of the killers. “It has been two months now, and we are still working on it,” he said.

The Silicon Valley giant has steadfastly maintained that it is unable to unlock its newer iPhones for law enforcement, even when officers obtain a warrant, because they are engineered in such a way that Apple does not hold the decryption key. Only the phone’s user — or someone who knew the password — would be able to unlock the phone.

The FBI’s efforts may show how impervious the new technology is to efforts to circumvent it. According to industry officials, Apple cannot unilaterally dismantle or override the 10-tries-and-wipe feature. Only the user or person who controls the phone’s settings can do so.

However, U.S. Magistrate Judge Sheri Pym said in her order, Apple can write software that can bypass the feature. Federal prosecutors stated in a memo accompanying the order that the software would affect only the seized phone.

In the statement, Cook said such a step would dangerously weaken iPhone security.

“Once created,” he wrote, “the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

The Apple CEO said that “opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.”

 

(…)

The phone ran on Apple’s iOS 9 operating system, which was built with default device encryption. When a user creates a password, that phrase generates a key that is used in combination with a hardware key on a chip inside the phone. Together, the keys encrypt the device’s data.
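The key-entangling scheme described above can be sketched in a few lines. This is an illustrative simplification, not Apple's actual implementation: the function name, the use of PBKDF2, and the iteration count are all assumptions for the sketch. What it shows is the principle the article describes: because the derived key depends on both the passcode and a secret fused into the phone's hardware, guesses must be tried on the device itself.

```python
import hashlib

def derive_encryption_key(passcode: str, hardware_uid: bytes) -> bytes:
    """Entangle the user's passcode with a device-unique hardware key.

    Hypothetical sketch -- Apple's real key derivation differs, but the
    principle is the same: the result depends on BOTH secrets, so a
    guess is useless without the device-bound UID.
    """
    # Stretch the passcode with the hardware UID as salt; the high
    # iteration count makes each individual guess deliberately slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_uid, 100_000)

uid = b"\x13" * 32          # stand-in for the secret fused into the chip
k1 = derive_encryption_key("123456", uid)
k2 = derive_encryption_key("123457", uid)           # wrong passcode
k3 = derive_encryption_key("123456", b"\x14" * 32)  # different device
assert k1 != k2 and k1 != k3  # changing either secret changes the key
```

Because the UID never leaves the chip, even Apple cannot derive the key off-device, which is why the government's request centers on the phone itself rather than on Apple's servers.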

If the autowipe function is suspended, the FBI can run a massive number of combinations of letters, symbols and numbers until the right combination is found.

But there’s a complication.

If the combinations are run on the phone itself, the process can be painfully slow, taking, according to Apple, 5½ years for a six-digit lower-case password mixing numbers and letters.

If run on a supercomputer, it can be done many thousands of times faster. But to do it that way, the FBI would need the hardware key, which is built into the phone. Apple says it does not keep a copy of that key. To get that key, one could use a number of techniques, including melting the plastic off the chip and hitting it with bursts of lasers or radio frequencies to recover bits of the key.

Matthew D. Green, a cryptography expert at Johns Hopkins University, said the FBI could crack a six-digit numeric code in about 22 hours.

“But once there’s numbers and letters, that’s when things get interesting,” he said. “It might take 10 years to crack a strong password on the phone, which means they might be stuck till 2026.”
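Both of the estimates quoted above are consistent with one guess roughly every 80 milliseconds, a per-attempt delay that (per Apple's published security documentation of the era; the exact figure is an assumption here, not stated in this article) iOS enforces by tying each guess to on-device hardware. The arithmetic is simple to check:

```python
SECONDS_PER_GUESS = 0.08  # assumed ~80 ms per on-device attempt

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Worst-case time to exhaust every passcode of the given shape."""
    return alphabet_size ** length * SECONDS_PER_GUESS

# Six-digit numeric PIN: 10^6 combinations.
hours = worst_case_seconds(10, 6) / 3600
print(f"{hours:.0f} hours")  # roughly 22 hours, matching Green's estimate

# Six characters from lowercase letters plus digits: 36^6 combinations.
years = worst_case_seconds(36, 6) / (3600 * 24 * 365)
print(f"{years:.1f} years")  # roughly 5.5 years, matching Apple's figure
```

The gap between 22 hours and 5.5 years is purely the size of the search space (10^6 versus 36^6 possibilities), which is why Green's "numbers and letters" caveat matters so much.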

The government requested the order under the All Writs Act, a law dating to the colonial era that has been used as a source of authority to issue orders that are not otherwise covered by a statute. Though Apple has previously complied with court orders under that statute to retrieve data from iPhones running earlier versions of its operating system, it is now resisting such an order in a separate iPhone case in Brooklyn. That case, unlike the one in California, involves a phone with software that allows the firm to extract data.

The government contends that courts over the years have issued orders based on that law for the unencrypted contents of computers, for credit card records and for security camera videotapes. It noted that the Supreme Court in 1977 held that the law gave courts the authority to direct a phone company to execute a search warrant for numbers dialed by a particular customer.

Some legal scholars, however, said the use of the All Writs Act in the California Apple case presents a slippery slope issue. “If the writ can compel Apple to write customized software to unlock a phone, where does it end?” said Ahmed Ghappour, a professor at the University of California’s Hastings College of the Law. “Can the government use it to compel Facebook to customize an algorithm that predicts crime? It’s not clear where the line will be drawn, if at all.”

Not surprisingly, this story raises many of the familiar issues in the conflict between privacy and individual liberty on the one hand and the government's need for proper tools to investigate criminal and terrorist activity on the other. From the government's point of view, obviously, this is an easy issue. The government says it needs access to the data on the phone to determine whether Farook and his wife had contact with domestic or foreign terror suspects or potential co-conspirators in advance of the attacks, and if that means Apple must sacrifice the commitments it has made to its customers regarding the security of their data and their phones, then so be it. From Apple's point of view, there seem to be a myriad of issues motivating what has the potential to be an unpopular decision given the circumstances of this case. First, ever since the company decided to strengthen security on its phones in a manner that allows customers to encrypt data so that it is nearly impossible to access without the appropriate passcode, concerns about data security have only become more prominent, and creating a backdoor that does not currently exist would only make the data itself less secure overall. Second, as the Post article notes, the use of the All Writs Act in this manner appears to be unprecedented and, if upheld, would essentially allow the government to do almost anything in the name of law enforcement and intelligence gathering. Finally, and perhaps most importantly, law enforcement isn't asking Apple to provide information it already has, which is what an ordinary search warrant does. It is asking a Federal Court to compel Apple to do something: create a backdoor that does not exist.
This arguably falls well outside the scope of the Fourth Amendment and, if upheld, would give law enforcement the authority to compel technology companies to do almost anything conceivable in the name of a purported investigation or surveillance of a target. That seems to go well beyond what the Constitution and existing law permit.

This debate has been ongoing ever since Apple announced changes to the operating system for its mobile devices that would increase security, changes that Google soon adopted for its own Android operating system. Almost immediately after the announcement, various law enforcement officials claimed the changes would make it harder for them to do their jobs, and the Director of the Federal Bureau of Investigation hinted that the Bureau might seek a legislative fix compelling Apple and Google to provide the FBI and other agencies with a backdoor into encrypted phones. To date, no real action appears to have been taken in Congress, though that may change now that the new technology has become an issue in the San Bernardino investigation. It is unclear whether legislation on the issue is even pending in either the House or the Senate, and not at all clear whether any such legislation would survive scrutiny in court.

In any case, as things stand it strikes me that Apple has the better argument in this matter. Notwithstanding the intelligence value that might be contained in the phone, the liberty and security interests that would be adversely affected if the Court's order stands are too important for Apple to simply give in to the government's demands without having the matter tested in the relevant courts. Additionally, the fact that the government's request involves something far beyond asking Apple to pass along information already in its possession seems to place it outside the investigatory methods permitted by the Fourth and Fifth Amendments, and amounts to a precedent that the government is free to demand that any company make it easier to access data over which that company has neither custody nor control. Because of that, Apple is correct to fight the good fight here and should take the matter as far through the appellate system as it can, because the implications of this order standing are far too far-reaching to be left to a Magistrate Judge in California.

FILED UNDER: Intelligence, Law and the Courts, National Security, Policing, Science & Technology, Terrorism, US Politics
Doug Mataconis
About Doug Mataconis
Doug Mataconis held a B.A. in Political Science from Rutgers University and J.D. from George Mason University School of Law. He joined the staff of OTB in May 2010 and contributed a staggering 16,483 posts before his retirement in January 2020. He passed far too young in July 2021.

Comments

  1. al-Ameda says:

    In any case, as things stand it strikes me that Apple has the better argument in this matter. Notwithstanding the intelligence value that might be contained in the phone, the overall issue of liberty and security that would be adversely impacted if the Court’s order stands in this case is too important for Apple to simply give in to the government’s demands without having the matter tested in the relevant courts.

    I tend to agree.

    What is interesting to me is that you would think the government, with all of the financial resources at its disposal, could hire the best tech experts to hack this without losing the source data they need.

    On the other hand, it’s good to know that Apple has top-of-the-line, state-of-the-art encryption code.

  2. C. Clavin says:

    OT…oppressed white guy Cliven Bundy is being held without bail.
    No word if he has paid the $1M he owes us.

  3. One question I would like to see addressed in this: is there a limit on the amount of effort law enforcement can demand in assisting an investigation before it becomes a fifth amendment taking? This isn’t a case where Apple can just push a few buttons and give them access to the phone. They’d have to undertake a significant development effort, at their own expense, that may require paying a number of people for several months to develop custom software for the FBI. Can the government actually force someone to do that? Suppose it would take years? At some point it seems it’s not so much the FBI issuing a subpoena, so much as forcing someone to become a federal contractor against their will and for no compensation.

  4. James Joyner says:

    Honestly, I’m not sure I see Apple’s side here. The government has the right to subpoena my telephone records, access the privacy of my home with a warrant, and otherwise violate my privacy so long as it secures authorization. There’s no equivalent of privileged communication (as with a spouse, attorney, clergyman, or medical practitioner) here. Nor is this even a blanket snooping expedition as in the NSA cases. This is a specific request regarding a legitimate subject of the most serious of criminal investigations. Of course Apple should comply.

  5. Mu says:

    1 Jan 2017 After its success in the infamous terrorist iPhone case the government has issued a demand under the All Writs Act to Boeing to install self-destruct mechanisms on all their airplanes to avoid a repeat of 9/11. The DoJ also pointed out that the new tracking chip implants are only used on people arrested, not the general population, and that the rumors about an ability to kill the wearers were pure science fiction – yet.

  6. @James Joyner:

    Because it’s not a subpoena in the traditional sense. Apple doesn’t actually know the pin. The FBI already has the phone and all the data on it. They’re not even asking for documentation. They’re asking Apple to develop a capability they don’t currently possess, which seems beyond the bounds of what a subpoena is supposed to do. Suppose the FBI found a safe they didn’t know the combination for. Can they order you to quit your current job and get a degree in locksmithing so that you can help them open it?

  7. James Joyner says:

    @Stormy Dragon:

    They’re asking Apple to develop a capability they don’t currently possess, which seems beyond the bounds of what a subpoena is supposed to do.

    That’s a plausible argument if true. That is, I’m dubious that Apple can’t bypass the lock code. In fact, when I shattered the screen on my previous iPhone, we were unable to enter my code. They were able to reset it and bypass. And that’s not Apple HQ but just some random chick at the local Apple store.

  8. Ben says:

    @James Joyner:

    James, this isn’t a request for information in Apple’s possession. This is an order essentially deputizing Apple to create custom software for the government against their will.

    As for your old phone, do you remember what version of iOS you were running? The default disk encryption began with iOS 8, I believe.

  9. Pch101 says:

    @James Joyner:

    The data doesn’t belong to Apple. It’s akin to expecting the contractor who installed the door to your house to be responsible for getting the cops inside, or the maker of a safe located in the house to provide a combination (that it doesn’t have) for a safe it sold to a third party. It’s not as if Apple is suspected of being a party to the crime.

    Then again, I can appreciate that our laws don’t tend to keep pace with technology, so I am sympathetic to both sides of this argument.

  10. James Joyner says:

    @Ben: Not sure. It was early last year and I generally install new OS nearly immediately.

    @Ben: @Pch101: I think that’s fair. But Apple has built a system that allows criminals to evade legitimate warrants. I don’t have the technical savvy to know how hard what they’re being asked to do is, but I don’t think it’s unreasonable that they provide a backdoor so long as they don’t have to hand that capability over to the government directly.

  11. Ben says:

    @James Joyner:

    Are all manufacturers of consumer products required to design them with law enforcement assistance in mind? If I were to design a new product with an unbreachable lock, would I have to design a backdoor to it just in case law enforcement needs to get in? Could they deputize me to break into it to serve a warrant?

    And here’s the thing about building a backdoor. Once it exists, it leaks or it gets stolen. It WILL be used by bad actors soon enough. They are ordering Apple to sabotage the security of their own product.

  12. @James Joyner:

    There’s no such thing as a backdoor for encryption. There is either encryption that works or encryption that doesn’t work. The government is using “backdoor” as a euphemism to talk around the fact that what it really wants is for companies to develop intentionally faulty encryption software.

    And once faults are added to encryption, they’re there for everyone, including foreign governments and criminals.

  13. James Pearce says:

    Dump your Apple stock. Dump it now.

  14. Scott says:

    There is a significant financial aspect to this, not just for Apple but for our economy. Apple has strengthened encryption because the market is demanding it. If the Government can compel Apple to weaken their technology, then some other company will offer similar technology, and it may not be a US company. Do we really want to damage the most valuable company in the world? It is a trade-off, and a serious one. Are the needs of law enforcement and intelligence greater than a strong economy? I would argue that we are stronger with the strong economy.

  15. David in KC says:

    @James Joyner: James, was all your data on it, or did you back up from the cloud?

    I like that my iPhone is virtually impossible to break in to. I also like that I can brick it if it gets stolen. I don’t like the fact that the Feds are asking Apple to build in a vulnerability that you know will be exploited for more nefarious purposes.

    This is also a takings issue, the Feds want Apple to expend resources to develop this ability. This does two things, one, it costs money to use engineers to do this, as well as the fact that while they are working on this, they are not working on what Apple needs them to work on for the next software update or improvement. Depending on the man hours needed, this could be a hefty sum, just for the time, let alone the cost of pushing back development time on something else.

  16. DrDaveT says:

    From the government’s point of view, obviously, this is an easy issue.

    Not only isn’t it obvious, it isn’t even true.

    First, law enforcement organizations are not all of “the government”. Their interests are not necessarily the nation’s interests. One of the jobs of government is to decide in which cases we’re better off as a nation, in the long run, with less (or less effective) enforcement. That’s why we have a Bill of Rights, to pick a flagrant example.

    Second, as others have noted, this is not a question of Apple turning over information in their possession. In the immediate case, they are being asked to provide (uncompensated?) professional engineering services that damage their own flagship product. In the longer run, they are being asked to redesign their products to be less capable. Digital security is not selective; you cannot make a product that is secure against everyone except legitimate law enforcement organizations.

    Finally, as yet others have pointed out, the capability they are being asked to provide to the government is indiscriminate. Once it leaked — and there is zero reason to believe it would not — it would be quickly disseminated to the world. At which point, Apple would have no product line at all worth selling, while their foreign competition would face no such obstacles.

    That’s the government’s dilemma. Still think the government’s preferred outcome is obvious?

  17. Davebo says:

    Apple should just stick to its original claim and say “It can’t be done”.

    If the government doesn’t believe them and insists it is possible then tell them to do it themselves.

    There’s a significant cost involved here not in Apple trying to develop a software work around but in damage to Apple’s brand which is an order of magnitude greater.

    And what’s the potential upside to this fishing expedition? Doesn’t seem that great to me. We aren’t dealing with the idiotic ticking bomb scenario here.

  18. Mu says:

    It’s always good to see that James Joyner’s only standard is a legal warrant. Like those used to sterilize imbeciles and promiscuous people. Or arrest people sitting on the wrong bus bench. Or SWAT people for drinking loose-leaf tea.

  19. Jenos Idanian says:

    I see a parallel with attempts to force gun makers to develop “smart guns.”

    But this is also in the spirit of past government interventions, where they have banned technologies that work in favor of those that work far less effectively — if at all. The banning of incandescent light bulbs, the mandating of “low-flow” toilets, the more “environmentally friendly” gas cans — all led to far less useful products being mandated in place of products that worked just fine before, because someone in the government thought their priorities were more important than the people who made the products and the people who bought and used them.

    This one is just hitting a bit closer to home for a lot of people around here…

  20. MarkedMan says:

    FWIW, the way this is being explained at the Mac sites I follow: since iOS 8, Apple has implemented hard encryption on their devices, most likely to maintain their security advantage over other systems. This also keeps them off the hacked wall-o-shame alongside Target, the US government, etc.

    One side effect of the strong encryption they used is that Apple themselves cannot get into the phone. They can wipe it, but not open it. Various government agencies are going nuts, as they have been used to trivially getting permission to take total control over suspects’ information.

    They have ordered Apple to build a special version of iOS9 that can be installed on the phone without overwriting the existing data. This special version has one crucial difference: unlimited password attempts. Normally after a certain number of attempts the phone erases all the user data. This would allow the FBI to plug a virtual keyboard in and try millions of passwords until they get in.

    Apple’s concern is that once that backdoor exists, it will eventually get into the wild. And that is certainly likely to happen.

    So, bottom line, this is not as trivial as the FBI forcing them to do a few million dollars of development work. This is the FBI forcing them to devalue a major selling point of the phone – the fact that if it falls into a bad guy’s hands, the thief gets only a limited number of attempts to guess your password and break in.

  21. James Pearce says:

    @MarkedMan:

    This is the FBI forcing them to devalue a major selling point of the phone

    Like I said, dump your Apple stock….

  22. Mikey says:

    @MarkedMan:

    They have ordered Apple to build a special version of iOS9 that can be installed on the phone without overwriting the existing data. This special version has one crucial difference: unlimited password attempts. Normally after a certain number of attempts the phone erases all the user data. This would allow the FBI to plug a virtual keyboard in and try millions of passwords until they get in.

    Apple’s concern is that once that backdoor exists it will eventually get into the wild. And that is certainly likely to happen.

    I’d bet a bottle of my favorite bourbon this already exists, that there’s a development load of iOS on Apple’s servers that doesn’t have the password retry limit. Apple just doesn’t want to give it up, for obvious reasons.

  23. grumpy realist says:

    @James Joyner: There’s no such thing as “encryption that only allows the good guys a backdoor to see what is going on.”

    It’s either “encryption that works”, or “encryption that doesn’t work.”

    And given the salaries that Google offers compared to the salaries the US offers, are you really surprised that Google came up with something that the US Government can’t crack into?

    (Pay sh*t salaries, you get sh*t engineers. Duh.)

  24. grumpy realist says:

    @Jenos Idanian: What’s so effective about incandescent light bulbs? There’s a heckova lot of waste heat, which is fine in winter but a damned nuisance in summer.

    Both incandescent and fluorescent are going to be taken over by LEDs at some point, which are even better.

  25. Blue Galangal says:

    @James Joyner: For the iPhone 5 and above (and even the 4s), you can shatter screens until the cows come home if your phone is backed up (ordinarily to the cloud, or to your hard drive). Did they give you a replacement phone? Or did they replace your broken screen with a new one? (If the former, they probably replaced the new phone’s system image with the cloud backup’s; if the latter, they probably did a system wipe & restore, ditto.) Once you log into your account, a cloud backup will restore to the old phone with a new touchscreen, or to a new phone entirely, and it will be as if the shattered screen never happened; the restoration is seamless and fast. But no one in the Apple store can bypass the disk encryption for iOS 8 or above if you don’t know your passcode. If it were that easy, your iPhone couldn’t be bricked if stolen, for example, because the bricking could just be bypassed. (Family member is an Apple Genius.)

  26. Blue Galangal says:

    @Mikey: There is. It’s called iOS7.

  27. Neil Hudelson says:

    @grumpy realist:

    GE is dumping CFL manufacturing, and other companies will follow suit soon. A low-end LED bulb now costs $3, lasts 25 years, and uses 2% of the energy of an incandescent.

    But are you really surprised Jenos would spout off about something he doesn’t understand? See: tragedy of the commons.

  28. EddieInCA says:

    @James Joyner:

    1. You haven’t thought it through, or…

    2. You don’t understand it.

    You’re asking a private company to devote (probably) hundreds of work hours to create a program that overwrites their best encryption protocols. Why should they have to do that? Once that protocol gets into the hands of law enforcement, it will be on the web within 24 hours and everyone’s iPhones will be at risk.

    No. You’re not only wrong. But very, seriously, wrong.

  29. James Pearce says:

    @EddieInCA:

    Why should they have to do that?

    To comply with a federal judge’s court order?

    If we were talking about a rancher from Nevada with cattle grazing on BLM land rather than a beloved tech company with encrypted phones, this conversation would be much different.

    And yes, this is where I’ll be told, “But those are two different things.” Yes, yes they are. But they are also soooo similar. (Someone thinks their vision of freedom trumps the rule of law.)

    Don’t get me wrong. When Tim Cook says, “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business,” he’s not wrong in the abstract.

    But when it’s extended to “we believe the contents of a terrorist’s iPhone are none of our business,” well…..it’s a bit harder to defend. Don’t you think?

  30. Just 'nutha ig'rant cracker says:

    @Mikey: Ding! Ding! Ding! Tim Cook is being disingenuous.

  31. EddieInCA says:

    @James Pearce:

    No. It’s one thing to ask a company to turn over information they have, or to which they have access.

    It’s something else altogether to have a company have to create software – at a cost to them – in order to access SOMEONE ELSE’S INFORMATION.

    Apple doesn’t have the information. They don’t have the code. They don’t have the software to crack the code. They’d have to completely create – from scratch – a new program to access SOMEONE ELSE’S INFORMATION.

    No. I hope Apple stands strong on this. I’m 100% behind them.

  32. Hal_10000 says:

    What the Feds are doing is trying to bypass the debate over encryption backdoors with a judicial order forcing Apple to create those backdoors without legislation or debate. They have never demonstrated that they are worthy of such trust. Apple should stand their ground.

  33. James Pearce says:

    @EddieInCA:

    It’s something else altogether to have a company have to create software – at a cost to them – in order to access SOMEONE ELSE’S INFORMATION.

    This SOMEONE ELSE is a dead terrorist and the phone is evidence in a criminal investigation.

    The privacy concerns that we would have over SOMEONE ELSE’s information in normal circumstances do not apply.

    Apple doesn’t have the information. They don’t have the code. They don’t have the software to crack the code.

    The government believes they do. Maybe the feds are wrong.

    But we should take Apple’s word for it?

  34. Hal_10000 says:

    Also … doesn’t the government already have the meta-data? Surely, that’s enough to give them something to work with.

  35. David in KC says:

    @James Pearce: No, this someone else is everyone with an iPhone. Once the encryption is busted, it impacts everyone with the OS. The order requires Apple to develop software to break the encryption. It’s like Pandora’s box: once it’s open, you can’t put the lid back on.

    If Apple is forced to do this, Google is next, as well as every encryption company out there. This is a case of bad facts making bad law, except it would be horrendous for everyone in the digital age. Hackers steal personal info all the time; why do we want to make that easier?

  36. David in KC says:

    @Hal_10000: Ding ding ding. The Feds can still get the numbers dialed out from and into the phones, as well as numbers texted, etc. They can get that from the carriers. My guess is that there is insufficient information to establish anything at this point, and they are hoping there is something on the phone to establish probable cause to seize the phones of other potential conspirators, which will then require use of the security flaw that was created, unless it’s an Android phone, and then Google will be required to do the same thing.

  37. EddieInCA says:

    @James Pearce:

    No. You’re wrong. Spectacularly wrong.

    Here’s one scenario: Let’s say I sold a bike on Craigslist to the dead terrorist. I didn’t know him as a terrorist, just some dude in San Bernardino who responded to an ad on CL for a bike I was selling. My phone number, email, and possibly texts are on that phone. The Government already has the meta-data, meaning they know I called into that phone. But they don’t know about what. They get Apple to crack the code for the phone, and the Feds find that the dead terrorist made a $1000 payment to me.

    Once the Government gets into that phone, how long before the Feds are knocking on my door wanting to know why I got $1000 from the dead terrorist, whom I only knew as Bill, the bike dude? Who would believe me if I said I knew nothing other than that he bought a bike from me? I did nothing wrong, yet I’d be hounded out of a livelihood due to the current social media landscape. You’d be one of those saying “he was openly consorting with a known terrorist.”

    No effing way do I want the government opening up my phone, or anyone else’s by this method. Encryption exists for a reason. This request by the government seeks to end encryption. And if you don’t think so, you’re being naive.

  38. Jenos Idanian says:

    @grumpy realist: I could make an argument about their merits, but that would imply that there was a matter of choice in the matter. Would you care to make the argument that the bulbs were so horrible, so dangerous, so irredeemably bad that they needed to be banned?

    Because that’s how it happened.

  39. Jenos Idanian says:

    @EddieInCA: I’m with you in spirit, but in your particular case, giving the government access to your phone would be to your advantage, as it would have the same records — but with the exculpating details.

  40. Tyrell says:

    @Neil Hudelson: LED bulbs: I have already replaced some of these after four or so years. Buying bulbs used to be a simple choice: wattage.
    Now there are more choices than you can shake a stick at. The LEDs also have to warm up before they get bright.

  41. EddieInCA says:

    @Jenos Idanian:

    Right. And John Crawford was carrying a toy rifle in an “open carry” state and was still gunned down.

  42. James Pearce says:

    @David in KC:

    Once the encryption is busted, it impacts everyone with the OS.

    Yeah, so?

    Does every smart phone need to have unbreakable encryption? Most of us lock our homes with locks that can be easily jimmied with a few minutes of effort. And yet our phones must not be cracked? Even by law enforcement in the course of a murder investigation? (And, yes, I know a lot of people think LEOs are always up to no good, but maybe those people watch too much TV?)

    I think there is a reasonable level of security we should expect in our smartphones. And also a reasonable level of vulnerability that would allow us to get information from them if necessary.

    A dead terrorist? That’s a pretty good case of “if necessary.” (Although I suppose you could also use it to get to a dead relative’s photos or into your kid’s phone or something else totally non-nefarious.)

  43. James Pearce says:

    @Jenos Idanian:

    Would you care to make the argument that the bulbs were so horrible, so dangerous, so irredeemably bad that they needed to be banned?

    Go make an incandescent light bulb in your garage. Make a whole box for every lamp in the house. Don’t worry. No one’s going to stop you.

    Because they weren’t “banned.” They were phased out.

  44. James Pearce says:

    @EddieInCA:

    Who would believe me if I said I knew nothing other than he bought a bike from me?

    Well, I presume the FBI would believe you, since they would have your phone and would have already seen that your only interaction with the dead terrorist/bike guy was negotiating a Craigslist transaction.

    I’m not for police sniffing through phones illegally, or for giving law enforcement carte blanche to do whatever they want. But if they get a search warrant… what’s the problem?

  45. anjin-san says:

    @James Joyner:

    But Apple has built a system that allows criminals to evade legitimate warrants.

    Ford and Chevy have built gadgets that allow criminals to flee crime scenes and police that might be pursuing them. Should Ford and Chevy, at their own expense, be forced to devise a tracking system and kill engine switch that any police department could use to turn off any car or truck that might be used by any criminal?

    At some point, we need to push back on “your rights, privacy and freedom are being rolled back because criminals and terrorists” – this seems like as good a place as any to push, and Apple has cash on hand to hire a lot of legal talent.

  46. Argon says:

    This particular case with the San Bernardino twit is strictly for show. They have more than sufficient call and text data from the phone company.

    Instead, this is a ploy to take the most egregious suspect and use his case as a lever to get carte blanche access to all personal computer (that is, iPhone) data. They’re trying to do what legislatures balk at. Once the OS is hacked and given to law enforcement, you can bet they’ll keep copies for future cases.

    This stinks and most people who keep track of this stuff know it.

  47. Jenos Idanian says:

    @James Pearce: So, just what is the difference between the government “banning” something and “phasing out” something? It doesn’t count if it’s slow motion?

  48. Jim says:

    Well, once a few suicide bombers hit targets in the USA, this whole data privacy and individual liberties conversation
    will be moot. Basically, right now all they have to do is use Apple iPhones and they can act with impunity.

  49. anjin-san says:

    @Jim:

    Well, perhaps people who feel that way should volunteer to bend over and have the government shove cameras and microphones up their asses in case there are terrorists lurking in there.

    All of us are in much, much, much greater danger driving to the store to get milk than we are from terrorists. And I think it is indisputable that our gross over-response to 9/11 did far more harm than the actual attacks did.

    Sometimes at 3:00 in the morning, I think that bin Laden destroyed America on 9/11, and it simply has not become completely apparent yet.

  50. Mikey says:

    @David in KC:

    Once the encryption is busted, it impacts everyone with the OS. The order requires Apple to develop software to break the encryption.

    No, it doesn’t. It requires a means to disable the password retry limit. All this talk about an “encryption backdoor” is a gargantuan red herring.

    My experiences during 15 years in the employ of a telecom equipment manufacturer lead me to the strong suspicion that Apple actually possesses a build of iOS 9 without the password retry limit, but that’s just my suspicion. Even if they don’t, it would be a relatively trivial matter to build one, since the code is modular and that piece is most likely not necessary to the operation of the phone and its file system.
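    The mechanics behind Mikey’s point can be sketched in a toy model. Assuming, purely for illustration, that the passcode is a 4-digit PIN stretched into a key with PBKDF2 (the salt, iteration count, and PIN below are made up, not Apple’s actual parameters), removing the retry limit reduces unlocking to a short brute-force loop:

```python
import hashlib

# Toy model of passcode-based key derivation. The salt and
# iteration count are illustrative, not Apple's real parameters.
SALT = b"device-unique-salt"
ITERATIONS = 10_000

def derive_key(pin):
    """Stretch a PIN into a key with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

def brute_force(target_key):
    """With no retry limit, a 4-digit PIN falls to exhaustive search."""
    for n in range(10_000):
        pin = f"{n:04d}"
        if derive_key(pin) == target_key:
            return pin
    return None

# The 10-try auto-erase makes this loop suicidal on a real phone;
# disabling the limit (and any inter-attempt delay) makes it routine.
print(brute_force(derive_key("7412")))  # → 7412
```

    The point of the sketch is the one Mikey is making: for short numeric passcodes, the retry limit and escalating delays, not the cipher itself, are what carry the security load.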

  51. James Pearce says:

    @Jenos Idanian:

    So, just what is the difference between the government “banning” something and “phasing out” something?

    Since you asked…

    Sometimes in an industry, such as say light bulb manufacturing, issues come up. Since all these companies are competing against each other, they look to the government to sort it out. It’s not all save the planet bullshit. The light bulb companies wanted to move the industry forward, start competing with new technologies, making some MONEY. And the government helped them.

    Seriously, dude, get over it.

  52. MarkedMan says:

    I think there are at least two misunderstandings running through the replies above:

    1) The individual-rights issue is interesting and important, but the most important one is whether or not this eventually puts your bank account and credit card information into the hands of hackers.

    2) If Apple is forced to make software that can be loaded onto that phone and that allows unlimited password retries, the encryption is essentially broken for that phone and all others like it, for example the one I am typing this on. If it makes it out into the wild, which it will, it can be downloaded in a few moments by the same Russian hackers who used a different type of backdoor and are currently holding an entire hospital’s patient data hostage for $3.4M. Or who broke into Target and stole tens of millions of credit card numbers and passwords. And I could go on. Admittedly, simply removing the password retry limit is not as easy to exploit as a Windows buffer overrun, but it is not as hard as people here think. And history has shown us that hackers will find even easier ways to exploit such vulnerabilities.
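    To put rough numbers on that second point, suppose each passcode attempt costs about 80 ms of on-device key derivation (an assumed ballpark for illustration, not a measured Apple figure). With the retry limit and delays gone, the worst-case time to exhaust a numeric passcode space is easy to estimate:

```python
# Back-of-envelope brute-force times, assuming ~80 ms of key
# derivation per attempt. The 80 ms figure is an assumption.
ATTEMPT_SECONDS = 0.08

def worst_case_hours(digits):
    """Hours to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

for d in (4, 6, 8):
    print(f"{d}-digit PIN: {worst_case_hours(d):,.2f} hours worst case")
```

    Under that assumption a 4-digit PIN falls in minutes and a 6-digit one in under a day, which is why removing the retry limit is, for typical passcodes, tantamount to breaking the protection.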

    One of the reasons I have an Apple phone is Apple Pay, which I hope becomes ubiquitous. Quite frankly, I trust Apple with my credit card information much more than I trust the hundreds of online vendors who ask for it on a daily basis. The government is demanding that Apple give up that competitive advantage.

  53. Jenos Idanian says:

    @James Pearce: It’s cute how you think that explanation makes things better.

    The light bulb industry (“Big Light”) wants to make a new, more expensive, bulb. But they can’t, because people will keep buying the old, cheap bulbs. So they go to the government for help. The government says if the new bulb’s better, people will buy it. Big Light says no, they won’t, they’ll keep buying the cheap bulbs. So the government says just stop making the old bulbs, and they’ll have to buy the new ones. Big Light says nope, won’t work, someone will still keep making the old bulbs, and they’ll make all the moneys, and we’ll be stuck. So Big Light needs the government to make everyone stop making the old bulbs, so people will have no choice but to buy the new ones.

    Here, I’m going to hit you with some pretty heavy concepts here: choice is good, mkay, and coercion is bad, mkay? And government coercion is really bad, mkay?

    The threshold for government coercion should be pretty damned high. The default should be that people have choices. Big Light arguing that they want to make new, more expensive bulbs but the stupid people won’t buy them as long as they can choose the old ones is an incredibly shitty reason.

    Back to the topic at hand… Apple set up an OS that has a very potent selling point — protecting your privacy. In the age of The Fappening and endless news stories of hacking of private information, that’s a hell of a selling point. And the government wants to strip them of that advantage. Further, if you believe Apple, it’s not a matter of simply using an existing tool — they are demanding that Apple invent a tool to undo their advantage.

    This brings up an interesting theory. If corporations are to be treated, in many ways, the same as people, wouldn’t this sort of compelled service be a violation of the 13th Amendment? “Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.”

  54. Mikey says:

    Here’s a run-down of what the FBI wants Apple to do and why it’s possible for them to do it.

    http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/

  55. MarkedMan says:

    @Mikey: I don’t think that Apple is arguing that this is not possible. But, for example, creating a bypass that allows unlimited password attempts via WiFi means that a rogue access point someone fires up in a hotel room could try millions of passwords in a night for anything within range. Or even a ‘legitimate’ one that was designed and/or built in China with special requests from the Chinese intelligence service.

    It remains to be seen how such a backdoor could be enabled without physical access. But… Ever wonder about all those apps you download for free and give permission to install things on your phone? What is their business model?

  56. Mikey says:

    @MarkedMan: I think from a technical point of view it would be possible to do what the FBI wants and limit it to direct physical access only, to that specific telephone only, and that Apple could do it on their premises without handing the FBI any code. And I think the risk of this approach is minimal enough that the potential value to this investigation of the information contained on the phone justifies the trade-off.

    But stepping outside the realm of the purely technical, there’s a bigger issue–I’m a lot more bothered by the government’s expanded assertion of the All Writs Act and what that could mean going forward, if Apple is forced to comply. That’s the precedent we need to be wary of, that’s where the greater risk is, I think. Apple’s response is entirely understandable and justifiable in that context.

  57. grumpy realist says:

    @Tyrell: Uh, no. You’re confusing CFLs with LEDs. LEDs pop on immediately, as you would know if you took a look at how they work. It doesn’t take THAT long for an electron to get from one side of a semiconductor chip to the other.

  58. grumpy realist says:

    @Jenos Idanian: It’s the difference between saying “we won’t allow you to sell any new cars that don’t have seatbelts” and saying “all cars that don’t have seatbelts must be off the road.”

    If you can’t see the difference, you have no right to be running around loose.

  59. grumpy realist says:

    @Jenos Idanian: Funny, Japan regularly manages to get old technology off the road and upgraded by this exact sort of regulatory coaxing. It’s a great way to raise the bar and push the technology further.

    Then they can turn around and sell it to other countries whose governments don’t use this sort of artificial urgency, and they usually already have a built-in competitive advantage.

    If you don’t believe me, look at how Japan went into pharmaceuticals and high-end biochemical production.

  60. Jenos Idanian says:

    @grumpy realist: If you can’t see the difference between “you can’t have a car on the road without seat belts” and “you can’t have a car without seat belts, whether or not it ever goes on public roads,” then you have no business doing much of anything.

    Because we’re talking about light bulbs used in the privacy of one’s own home, not in any public area. Which makes your analogy even more stupid. And that’s just one way that it’s stupid. For example, if I were to build a car intended for racing, and not ever to use on the streets, I would probably not include seat belts (or air bags or anti-lock brakes or a bunch of other legally-mandated safety features), leaving it up to the buyer to include far more comprehensive safety equipment. And that would be perfectly legal.

    And yay, Japan. Isn’t it wonderful what you can do when you don’t give two craps about people’s right to make choices — even those choices of which you disapprove?

    But once again back to Apple… I’m no Applehead, but it sounds like it can boil down to one of two cases.

    1) Apple has an existing tool to achieve what the FBI wants, and is resisting using it.

    2) Apple does not have an existing tool to do what the FBI wants, and is being ordered to create and use it.

    Apple sounds like they’re saying it’s #2, and that should be intolerable. Because it sounds like Apple — as a selling point to its customers — deliberately refused to create the tool in question, and does not want to create it. I really don’t like Apple, but in this case (if their statement is accurate) they are within their rights.

  61. grumpy realist says:

    @Jenos Idanian: It’s not the public/private thing, if you thought about it for several seconds. That’s irrelevant. If you don’t like my example of seatbelts what about lead paint? That’s another thing we’ve “phased out” as opposed to “go in and insist that you strip everything off the walls.” Ditto for lead pipes.

    If you want to keep an incandescent light bulb burning in your barn, go ahead. The government won’t care. You just won’t be able to buy a replacement many years down the line.

  62. Jenos Idanian says:

    @grumpy realist: In what universe are freaking light bulbs as much of a health menace as lead paint?

    You’re a delightful little totalitarian, aren’t you? You don’t see a reason why people might want something, so you decide it’s better for everyone if you just make the decision for them.

    If you want to keep an incandescent light bulb burning in your barn, go ahead. The government won’t care. You just won’t be able to buy a replacement many years down the line.

    You really get off on the control part, don’t you? You are positively gleeful about how people don’t get to do what they want, because you don’t think they need what they want.

    Bet you agree with Thomas Friedman, who was talking about China, but it applies to you:

    One-party autocracy certainly has its drawbacks. But when it is led by a reasonably enlightened group of people, as China is today, it can also have great advantages. That one party can just impose the politically difficult but critically important policies needed to move a society forward in the 21st century.

  63. James Joyner says:

    @Mu:

    It’s always good to see that James Joyner’s only standard is a legal warrant. Like those used to sterilize imbeciles and promiscuous people. Or arrest people sitting on the wrong bus bench. Or SWAT people for drinking loose leaf tea.

    That’s just a bizarre argument. Getting a legal warrant is enough for the government to enter your home and ransack your belongings. It’s enough for them to wiretap your phone or surveil you for weeks on end. It’s enough for them to seize your computer and look at all its contents. I don’t see why it shouldn’t be enough for them to see what’s on your iPhone.

    Can all of this be abused? Of course. It is in fact abused far too often. But the check on that isn’t supposed to be our tech companies but rather the judiciary.

    @EddieInCA: I’m not sure why a judge would issue a warrant to search your phone under the circumstances described. But, again, I’m not sure why an iPhone is any different than, say, your email. Or your home.

  64. SKI says:

    @James Joyner:

    That’s just a bizarre argument. Getting a legal warrant is enough for the government to enter your home and ransack your belongings. It’s enough for them to wiretap your phone or surveil you for weeks on end. It’s enough for them to seize your computer and look at all its contents. I don’t see why it shouldn’t be enough for them to see what’s on your iPhone.

    And it is Apple’s right to appeal and try to have the warrant quashed. That some Magistrate Judge issued a warrant under a scenario that can only be described as a massive deviation from standard practice (the All Writs Act, really?!?) is not, and should not, be the end of the story.

  65. SKI says:

    @James Joyner:

    But, again, I’m not sure why an iPhone is any different than, say, your email. Or your home.

    You don’t understand the difference between a phone company, or Apple, turning over records that are in its possession and Apple being told it has to write new software and install it on someone else’s phone?

    There is a reason they are using the All Writs Act. As a Magistrate Judge said in 2005 in the SDNY:

    Thus, as far as I can tell, the government proposes that I use the All Writs Act in an entirely unprecedented way. To appreciate just how unprecedented the argument is, it is necessary to recognize that the government need only run this Hail Mary play if its arguments under the electronic surveillance and disclosure statutes fail.

    The government thus asks me to read into the All Writs Act an empowerment of the judiciary to grant the executive branch authority to use investigative techniques either explicitly denied it by the legislative branch, or at a minimum omitted from a far-reaching and detailed statutory scheme that has received the legislature’s intensive and repeated consideration. Such a broad reading of the statute invites an exercise of judicial activism that is breathtaking in its scope and fundamentally inconsistent with my understanding of the extent of my authority.

    And in 2014, another Magistrate Judge, in considering the use of the AWA, required Apple to bypass the lock screen (something they could still do at the time) but explicitly said they couldn’t be required to decrypt the data.

    That is, even in using the AWA to compel cooperation, the Government could compel Apple to provide something they already have, but could not require them to work for the Government in creating something new.

  66. MarkedMan says:

    I’m sure not many are following this thread any more, but for those who are, here’s another way of looking at it. Imagine that we were talking about FedEx. The FBI tells them they must create a system that lets the FBI look into the express envelopes that route through their buildings and send that information along. FedEx objects and says they don’t read their clients’ mail, and if they installed such a system they would lose the trust of customers. Well, the FBI says, we are only asking you to use that system on one customer. You can promise your customers you won’t use it on anyone else. Of course, that almost certainly isn’t true.

    And then imagine that that system, once created, could be copied to a thumb drive in a few seconds and then posted on the internet where it could be downloaded and recreated by millions of script-kiddies (and worse) all over the world.

  67. Mikey says:

    @MarkedMan: Well, the government already has the Communications Assistance for Law Enforcement Act (CALEA).

    CALEA’s purpose is to enhance the ability of law enforcement agencies to conduct electronic surveillance by requiring that telecommunications carriers and manufacturers of telecommunications equipment modify and design their equipment, facilities, and services to ensure that they have built-in surveillance capabilities, allowing federal agencies to wiretap any telephone traffic; it has since been extended to cover broadband Internet and VoIP traffic.

  68. Flat Earth Luddite says:

    Once again, the gummint is busy overreaching. And yes, I’m lumping in Federal law enforcement with the Federal government, as one is the tail wagging the dog, and the other is the dog being wagged (you decide which, I don’t care). There’s no reason why they need Apple to get the information off this cell phone, or why they can’t just give it to No Such Agency to crack. Nope: we want it, we need it, and you’ve gotta give it to us.

    All I can say is that I was wrong when I gave up on the revolution in 1969. And that the road is getting slippery and steep.