Police Now Have Ability to Unlock Phones

A decryption device called GrayKey is being used by all manner of government agencies.

A new report says law enforcement agencies around the country have purchased a device that can access even the latest iPhones.

Vice’s Motherboard (“Cops Around the Country Can Now Unlock iPhones, Records Show”):

FBI Director Christopher Wray recently said that law enforcement agencies are “increasingly unable to access” evidence stored on encrypted devices.

Wray is not telling the whole truth.

Police forces and federal agencies around the country have bought relatively cheap tools to unlock up-to-date iPhones and bypass their encryption, according to a Motherboard investigation based on several caches of internal agency documents, online records, and conversations with law enforcement officials. Many of the documents were obtained by Motherboard using public records requests.

The news highlights the going dark debate, in which law enforcement officials say they cannot access evidence against criminals. But easy access to iPhone hacking tools also hamstrings the FBI’s argument for introducing backdoors into consumer devices so authorities can more readily access their contents.

“It demonstrates that even state and local police do have access to this data in many situations,” Matthew Green, an assistant professor and cryptographer at the Johns Hopkins Information Security Institute, told Motherboard in a Twitter message. “This seems to contradict what the FBI is saying about their inability to access these phones.”

As part of the investigation, Motherboard found:

  • Regional police forces, such as the Maryland State Police and Indiana State Police, are procuring a technology called ‘GrayKey’ which can break into iPhones, including the iPhone X running the latest operating system iOS 11.
  • Local police forces, including Miami-Dade County Police, have also indicated that they may have bought the equipment.
  • Other forces, including the Indianapolis Metropolitan Police Department, have seemingly not bought GrayKey, but have received quotations from the company selling the technology, called Grayshift.
  • Emails show the Secret Service is planning to buy at least half a dozen GrayKey boxes to unlock iPhones.
  • The State Department has already bought the technology, and the Drug Enforcement Administration is interested in doing so.

[…]

The GrayKey itself is a small, 4×4-inch box with two Lightning cables for connecting iPhones, according to photographs published by cybersecurity firm Malwarebytes. The device comes in two versions: a $15,000 one which requires online connectivity and allows 300 unlocks (or $50 per phone), and an offline, $30,000 version which can crack as many iPhones as the customer wants. Marketing material seen by Forbes says GrayKey can unlock devices running iterations of Apple’s latest mobile operating system iOS 11, including on the iPhone X, Apple’s most recent phone.

The issue GrayKey overcomes is that iPhones encrypt user data by default. Those in physical possession normally cannot access the phone’s data, such as contact list, saved messages, or photos, without first unlocking the phone with a passcode or fingerprint. Malwarebytes’ post says GrayKey can unlock an iPhone in around two hours, or three days or longer for six-digit passcodes.
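Those unlock times are worth making concrete. A minimal sketch of the arithmetic, assuming a fixed average guess rate (the rate below is inferred from the two-hour figure, not a published specification):

```python
# Back-of-the-envelope model of a passcode-guessing attack.
# Assumption: a constant average guess rate; ~1.4 guesses/second is
# what the "around two hours" figure implies for a 4-digit passcode.

def keyspace(digits: int) -> int:
    """Total number of numeric passcodes of a given length."""
    return 10 ** digits

def worst_case_hours(digits: int, guesses_per_second: float) -> float:
    """Hours to try every possible passcode at the given rate."""
    return keyspace(digits) / guesses_per_second / 3600

rate = 10_000 / (2 * 3600)  # ~1.39 guesses/second
print(f"4 digits: {worst_case_hours(4, rate):.1f} hours")
print(f"6 digits: {worst_case_hours(6, rate) / 24:.1f} days")
```

At that rate a six-digit space takes over a week to exhaust, which is consistent with the “three days or longer” figure, since on average the right passcode is found partway through the search.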

And police forces are ready to use GrayKey. David R. Bursten, chief public information officer from the Indiana State Police, wrote in an email to Motherboard that the force had only recently obtained the GrayKey device, but that “this investigative tool will be used, when legally authorized to do so, in any investigation where it may help advance an investigation to identify criminal actors with the goal of making arrests and presenting prosecutable cases to the proper prosecuting authority.”

[…]

To be clear, the FBI already makes heavy use of technology similar to GrayKey, and spends millions of dollars on equipment that cracks phones without using mandated backdoors. Motherboard previously found that the FBI bought over $2 million worth of forensics tools from established vendor Cellebrite. Back in 2016, the Bureau’s General Counsel said the FBI could unlock most phones it seized.

In March, the New York Times reported that FBI and Justice Department officials have reignited the hunt for backdoors, and have been quietly meeting with security researchers. And earlier this month, Cyberscoop reported that staffers of the Senate Judiciary Committee have been contacting US tech companies regarding potential future legislation around encryption.

Adding an iPhone backdoor, by its nature, adds new vulnerabilities into an otherwise fairly secure phone that provides robust encryption by default. GrayKey’s existence and widespread availability “means that adding backdoors isn’t so much a question of adding a secure door to the walls of a stone castle. It’s like adding extra holes in the walls of a sandcastle,” Green, the Johns Hopkins cryptographer, said. “It seems totally reckless to add additional mandatory vulnerabilities.”
Instead of backdoors, some technologists say the current system of hacking is the best we can hope for: a phone is released; companies such as Grayshift look for ways to access the device; for a time their tools work; then the phone manufacturer issues a fix or a new operating system version, and the cycle repeats.

“The success of companies like Grayshift in finding and exploiting ways to gain access to even the latest, most secure smartphone models demonstrates that flaws will always exist despite manufacturers’ best efforts,” said Riana Pfefferkorn, a cryptography fellow at the Stanford Center for Internet and Society.

The theme of this report, and of others on tech-oriented sites, is that the existence of devices like GrayKey obviates the need for phone manufacturers to build in a “back door” for law enforcement. And I suppose it does.

My reaction, though, is that it’s rather shocking that Apple and others haven’t figured out how to defeat third-party decryption tools. After all, even if GrayKey and their competitors have business models that will have them selling only to law enforcement agencies—and I’m skeptical that they won’t expand their market—the technology is bound to get into the hands of criminals and others with nefarious aims.

Beyond that, while there’s no obvious reason police shouldn’t be able to access information on one’s phone in the same way they do other private information pursuant to a warrant, I’m highly skeptical that, having their hands on this technology, they won’t conduct illegal searches of phones in cases where they lack probable cause. Such evidence would presumably be inadmissible in court (although those protections have eroded considerably in recent decades), but they wouldn’t have to admit that they’d hacked the phone. Having access to its contents would certainly give them all the clues they needed to find evidence—or even just probable cause to get a warrant to search the phone—through legal means.

FILED UNDER: Law and the Courts, Science & Technology
James Joyner
About James Joyner
James Joyner is a Security Studies professor at Marine Corps University's Command and Staff College and a nonresident senior fellow at the Scowcroft Center for Strategy and Security at the Atlantic Council. He's a former Army officer and Desert Storm vet. Views expressed here are his own. Follow James on Twitter @DrJJoyner.

Comments

  1. HarvardLaw92 says:

    I only carry my firm provided iPhone. My allowed phone call would be to my assistants, telling them to remotely wipe it. Given the nature of what we do and the sensitive data which may be on our phones, those procedures were distributed across the firm as soon as we became aware that the ability to do so was available.

    It was intended more for instances where the devices might be stolen or lost, but I’d certainly initiate them in this scenario as well.

  2. MarkedMan says:

    I’ve been involved with software for a long time and am not at all shocked that vulnerabilities keep on appearing. The iPhone hacks boil down to finding a way to fool the counter that keeps track of password attempts and then repeatedly trying every combination. There are many ways this can be done, ranging from the rumored CIA methods of opening the iPhone and connecting probes to the actual integrated circuits, to the more mundane. Some software vulnerabilities can be fairly global and difficult to patch, and others are incredibly specific to one version of the operating system. But they all have one thing in common: they use a legitimate system call to gain privileges they shouldn’t get.

    I’ve debugged problems that boggle my mind at how arcane and unexpected they were. Now, I never used these problems to hack into anyone’s system (and they almost certainly couldn’t have been used in that manner) but it leads me to believe that every system of sufficient complexity can be broken.

  3. Doug Mataconis says:

    Given the fact that this system no doubt involves some use of iOS code to develop the tools to hack into the phones, I’m kind of surprised that Apple hasn’t proceeded against the manufacturer for copyright infringement and other legal claims. Perhaps that action is yet to come.

    I also wonder what this means for Android phones, which have similar encryption features.

    In the meantime, I would suspect that Apple will likely find a workaround for this.

  4. Argon says:

    “Wray is not telling the whole truth.”

    That’s been generally understood for some time. Understand that his organization keeps pushing for mandating OS changes that provide backdoors for decryption. Security experts uniformly agree that such backdoors will be readily exploited by ‘bad actors’, leaving everyone at risk.

  5. OzarkHillbilly says:

    Once again my Luddite tendencies prove prescient.

  6. HarvardLaw92 says:

    @MarkedMan:

    Question, since this is much more your wheelhouse than mine:

    All of our firm owned phones are set with the “erase all data on this phone after 10 failed passcode attempts” option activated. We disable the “show when locked” options as well, essentially killing control center when the phone is locked (to prevent the thief from activating airplane mode to negate the remote wipe).

    Based on what you are saying about the device repeatedly guessing passcodes, would this option be triggered by the cracking device’s guessing attempts? I.e., would the cracking device’s 10th/11th attempt kick off the option and wipe the phone?

  7. HarvardLaw92 says:

    @Doug Mataconis:

    In reading through the articles, I noted that one of their employees is a former Apple security engineer. The likelihood that he’s exploiting Apple’s proprietary & protected information in his current employment is enormous. That ground is ripe for a lawsuit from Apple. It’ll be interesting to see how the company reacts to this one.

  8. Kit says:

    @MarkedMan:

    Based on what little information I could find, this device breaks weak passcodes. Trying to fill in the blanks, the iPhone somehow gives up a hash of the passcode within two minutes of handshaking sorcery, after which the device can be disconnected until it can work out the actual passcode, perhaps using rainbow tables.

    For those lucky enough to find the above mumbo jumbo, the solution is to use a long and complicated password: upper- and lowercase letters, digits, and symbols. Current wisdom involves mixing the above into a gibberish passphrase which only makes sense to you. For example: hardly-5urpri5ing-OF-course!

    Long story short: for people who take security seriously, an iPhone is still the way to go.

  9. HarvardLaw92 says:

    @Kit:

    The iPhone uses numeric passcodes. How would you use a complicated pass phrase in that scenario?

  10. Erik says:

    @HarvardLaw92: Disclaimer: this is not my field and my knowledge level is unlikely to be better than “well informed layman,” but this type of problem is one that I have been interested in for a while so I’ve engaged with it as best I can without the right math background.

    As @kit says, it seems to be about length and variety (as long as the passphrase itself isn’t common enough to be in a database). Using the tool at https://www.grc.com/haystack.htm (that site has great info in general BTW) and assuming worst case it seems one would want at least a 12 character mixed capital/lower/number/symbol password, or, if you wanted to use only numbers so you didn’t need to mess with the full keyboard, at least a 24 character numeric password. I would probably add at least 1 or even 2 characters to those lengths for insurance.

    To use a stronger passphrase on an iPhone, go to Settings, then Touch ID & Passcode, and choose “Change Passcode.” Then “Passcode Options” will let you select an alphanumeric code, or a custom numeric code of longer length, as well as the standard 6-digit and obsolete 4-digit numeric codes.

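
Erik’s equivalence above between a 12-character mixed password and a roughly 24-digit numeric one checks out with a quick calculation. A sketch, assuming a 95-character alphabet (all printable ASCII; the exact symbol count is an assumption):

```python
import math

# Search-space sizes for the two password styles discussed above.
mixed_alphabet = 95                   # upper, lower, digits, symbols (assumed)
mixed_space = mixed_alphabet ** 12    # 12-char mixed password
numeric_space = 10 ** 24              # 24-digit numeric passcode

print(f"12-char mixed:    ~10^{math.log10(mixed_space):.1f} combinations")
print(f"24-digit numeric: ~10^{math.log10(numeric_space):.1f} combinations")
```

The two come out within a fraction of an order of magnitude of each other (about 10^23.7 versus 10^24), which is why either style works if you can tolerate the typing.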
  11. Kathy says:

    I’d be astonished if there isn’t an app that can wipe the phone clean in one tap.

    For Android, at least. I’m not sure whether you can install non-store apps on Apple’s overpriced slabs.

  12. Kit says:

    Settings / Passcode / Change Passcode

    That second step might include a reference to Touch ID or Face ID. You will then see Passcode Options where you can choose alphanumeric.

    Cyber security is like physical security in the sense that you should soberly assess your needs, and then be able to foot the bill in terms of cost and inconvenience. Most of us just need to protect against lost phones and weak passwords. I find an app like 1Password sufficient for my needs. But if you have more of a tinfoil-hat survivalist mentality, you might be surprised at the rabbit hole that awaits.

  13. HarvardLaw92 says:

    @Erik:

    I wasn’t even aware that these options existed. You really do learn something new every day.

    I’ll be passing the info along to our IT department and discussing mandatory complex passphrases with the rest of the management committee. I foresee us adopting them very shortly.

    Many thanks to you and Kit for the information. Much appreciated 🙂

  14. HarvardLaw92 says:

    @Kit:

    It’s more of a liability issue with us (large multinational law firm). Spillage of client data can potentially be devastating for us, so we have to go a tad overboard with respect to preventing it whenever possible.

  15. Kit says:

    @HarvardLaw92:

    Your phones are probably not even your biggest security hole. You might consider investing in a professional security audit. There’s email (probably not encrypted), laptops (drives probably not encrypted), thumb drives, backups (God only knows how that’s handled), collaborating in the cloud, PCs not set to lock automatically, documents not disposed of correctly… Give me your credit-card number and I’ll have a look for you 😉

  16. R. Dave says:

    @HarvardLaw92: I only carry my firm provided iPhone. My allowed phone call would be to my assistants, telling them to remotely wipe it.

    Funnily enough, that remote wipe practice at firms is why I stopped carrying only a firm-provided phone and reverted to maintaining both a work phone and a personal phone. When I was leaving my old firm, I was allowed to keep the device, subject to a wipe, of course. However, IT didn’t have the courtesy to wait until the end of my last day to send the wipe command, so I lost all my personal contacts, text history, etc. in the process. I decided to never again mix personal and work devices, so now I’m stuck carrying two devices everywhere I go. #FirstWorldProblems, for sure, but ugh.

  17. HarvardLaw92 says:

    @Kit:

    Not sure, to be honest. We use smart cards with PKI (?) for login to all firm devices (even the copiers). Maintaining firm / client data on any personal devices is a “get escorted out of the building” level violation which is incorporated into our employment agreements. It has happened. We hired a new IT director, formerly with DOD, several years ago and all of this came about through his direction.

    My understanding was that one of the purposes for the smart cards was encryption across the enterprise, but I’ll readily admit that IT is not my wheelhouse beyond a rudimentary understanding of the principles. Your suggestion has merit. I’ll bring it up as well, thanks.

  18. Kit says:

    @HarvardLaw92:

    Let me wrap up by saying just how fantastic you’ve been with the Mueller investigation. I regularly find myself scrolling to your comments first before going back for the others. You’d make my life easier if you added an avatar…

  19. MarkedMan says:

    @HarvardLaw92: It looks like others have already answered your questions, but just in case:
    The “erase all data” option is activated when the attempt counter reaches the designated number of tries. If this is working like I think it is, the device is able to essentially reset that counter before it reaches the magic number. It can then try to brute force the passcode. The article stated that it takes 20 minutes for a 4 digit code (10,000 possible combinations) and 3 hours for a 6 digit code (1,000,000 possible combinations), so it’s doing something a bit trickier than pure brute force (otherwise those numbers would be 2 minutes/3 hours or 20 minutes/30 hours). As Kit says, using an alphanumeric or simply a longer numeric code increases the amount of time required. A long enough code may make it practically impossible to brute force without some method of eliminating possibilities.

    What Kit said below about getting a hash is interesting, and would make this more complex (and potentially easier for Apple to fix). It could significantly reduce the number of attempts needed but it doesn’t change the fact that they must be resetting the attempt counter.

    You shouldn’t allow fingerprint unlocking or Face ID. Aside from the current haziness about whether they can compel you to unlock with your finger or face, there is also the very real possibility that those can be spoofed by someone with enough resources.

    Finally, if security is a really serious concern, don’t use an Android phone. There may be specific models with specific versions of the operating system that are as secure as the corresponding iPhone (although I doubt it) but as a consumer you really have no insight on what those are. And since people rarely update their Android phones, whatever protection they had could disappear. My general rule of thumb for iPhone updates is: On a dot update, ex. 11.2 to 11.3, wait 1 or 2 days and then update if it seems solid. For an integer update, ex. iOS 10 to iOS 11, wait for the x.1 version before updating.

    And if the information you have on that phone is truly critical, you should use that phone only for that stuff. You shouldn’t install anything else on it, and instead get a second phone to play Candy Crush or get updates pushed for sports scores. This is a general and long-term piece of advice and not based on any specific exploits. Carrying two phones may sound like a pain but it’s not as bad as it seems. I have done it myself in the past and know a number of people who do it simply because an employer is free to snoop on your phone at any time for any reason, if they gave it to you.

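
MarkedMan’s scaling argument above is easy to verify: at a constant guess rate, a hundred-fold larger keyspace must take a hundred times longer. A quick check using the figures from his comment:

```python
# If the 4-digit time really were pure brute force at a constant rate,
# the 6-digit time would scale by 100x, not by the ~9x reported.
four_digit_seconds = 20 * 60                 # 20 minutes for 10,000 codes
rate = 10_000 / four_digit_seconds           # ~8.3 guesses/second
six_digit_hours = 1_000_000 / rate / 3600

print(f"{six_digit_hours:.0f} hours")
```

That comes out to about 33 hours rather than the 3 hours reported, which is exactly why he concludes the tool must be doing something trickier than naive brute force.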
  20. HarvardLaw92 says:

    @MarkedMan:

    Noted. I use an iPad for personal things, and just followed the advice above to change it to a 26-digit passcode. 🙂

    The firm owned phones are locked down from a security and installing apps standpoint somehow with something called Mobileiron (I have no idea …). They’re essentially work only and secured in ways that are beyond my understanding. Way out of my wheelhouse.

  21. HarvardLaw92 says:

    @Kit:

    Thank you 🙂

  22. HarvardLaw92 says:

    @MarkedMan:

    Just curious – the Malwarebytes article mentioned something like up to three days to crack a phone with a six digit passcode. I wonder how long it would take to crack a device with a 26 digit passcode like the one I just set.

    I presume that setting an actual complex passcode, like Pa55w0rdz$&@19573 for example, would dramatically increase that time even further beyond what’s necessary for one comprised just of numbers?

  23. Erik says:

    @HarvardLaw92: FWIW the website I linked above says that a “massive cracking array” making one hundred trillion guesses a second would take 3.5 centuries to run all the possibilities for a length-24 numeral-only password, and a hundred times longer for a length-26 numeral-only password. Note however that that time is for an exhaustive (meaning every possible combination) search. Unless it starts at 0 and your password is all 9’s, your password will not be the last one tried, so the actual time will be shorter than that. Anyone could get lucky and guess the passcode on the first try just by random chance, for example. So we are talking probabilities here, not definite lengths of time.
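
That exhaustive-search figure is simple to sanity-check. A sketch using the cited rate and length (the exact model behind the website’s number may differ slightly):

```python
# Exhaustive search of a 24-digit numeric code at 1e14 guesses/second.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 seconds

years = 10 ** 24 / 1e14 / SECONDS_PER_YEAR
print(f"~{years:.0f} years")
```

This gives roughly 317 years, i.e. on the order of the centuries figure quoted, and each extra digit multiplies it by ten.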

    This type of security is all about balancing the probability of failure against the usability of the solution. You only need to guard against the threats you deem reasonably likely. I.e., if the NSA isn’t coming for you, your attacker might not have a massive cracking array to deploy against you. Or they might decide the effort wasn’t worth it after trying for a month because they have more important phones to crack. Or they would just use incidental CCTV footage to capture you inputting your password. Or torture you until you gave up the passcode. So your caution depends on the value of the data, the likelihood of being targeted, the likely resources of the attacker, and how thick your personal tinfoil is. I really would encourage you to look at and play with the website I linked. It explains a lot of this in an easy-to-understand, real-world way.

    My iPhone has a less than 24 numeral passcode. I don’t want to make it easy, but my most likely attacker is someone that snatched my phone. So I’m more secure than baseline mostly because I’m a geek and have a thing about data privacy.

    Also, I second the comment about your legal insights in the comments. I also look for your comments first. Thank you.

  24. MarkedMan says:

    @HarvardLaw92: You are a better man than I if you can stomach a 26 digit passcode every time you want to log in 😉

    Erik is right – 26 digits is way past the point at which brute forcing it becomes problematic. Am I right in suspecting that those are not 26 random digits? If so, that would be where a truly dedicated cracker would start. What’s your birthday, wife’s birthday, etc. But even then, I wouldn’t worry too much.

    FWIW, Microsoft had… (something that I can’t remember)… that relied on an autogenerated random key to provide security. It was a long sequence and they bragged that it couldn’t be cracked for 78 jillion years. Then someone guessed they were probably using the internal Microsoft random number generator in their IDE and they happened to know that it used the internal computer clock as a seed. So bottom line, given how long it had been out, there was a very finite number of seeds it could have been. If I remember correctly cracking time went from 78 jillion years to something like a couple of hours.

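
The failure mode in that Microsoft anecdote generalizes: a key derived from a clock-seeded PRNG is only as strong as the set of plausible seeds. A minimal illustration (the key format, seed value, and one-day window here are invented for the example, not details from the actual story):

```python
import random

def key_from_seed(seed: int) -> str:
    """A 20-digit 'random' key, fully determined by its PRNG seed."""
    rng = random.Random(seed)
    return "".join(str(rng.randrange(10)) for _ in range(20))

# A 20-digit key has 10^20 possibilities — hopeless to brute force.
# But if the attacker knows it came from a seconds-resolution clock
# within a known day, only 86,400 seeds are actually possible.
target = key_from_seed(54_321)          # the supposedly uncrackable key
recovered_seed = next(s for s in range(86_400)
                      if key_from_seed(s) == target)
print(recovered_seed)
```

The loop recovers the seed (and hence the key) after at most 86,400 tries: the effective keyspace collapsed from 10^20 keys to one day’s worth of clock values.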
  25. HarvardLaw92 says:

    @MarkedMan:

    Am I right in suspecting that those are not 26 random digits?

    They’re not random. They’re the date my grandparents arrived in the US, the date my other grandparents got married, the date that Auschwitz was liberated by the allies, the street number of my first childhood crush and a randomly selected five digit seed. Those all spring instantly to mind for me. I’m a freak, I know 🙁

    (If anybody is clever enough to figure all of that out, lol, they’re welcome to see the photos of my dog.)

  26. Hal_10000 says:

    Well, I’m sure the police won’t abuse this technology to say, look for nudes on someone’s phone or snoop around for any salacious but legal behavior. Not at all.

  27. MarkedMan says:

    This thread is probably dead but just for the record, another reason why people who are serious about security shouldn’t use android phones:

    Several top-tier Android phone vendors—Samsung, HTC, and the like—have misled users into believing security patches have been installed when in fact they have not. That’s according to Security Research Labs (SRL), which announced its findings at the Hack in the Box security conference. SRL’s research involved tests of more than 1,200 phones from more than a dozen manufacturers, according to Wired, which first reported the findings on Thursday.

  28. grumpy realist says:

    @HarvardLaw92: From the stories my computer geek friends tell me, most security breaches aren’t through password cracking, but through getting access to the system via humans (phishing attempts allow someone from the outside to place key-capture or similar software into the system) or through very low-level OS stuff where the developers put in idiot passwords and your IT group never bothered to change said passwords when installing.
