
Apple Is Testing a Feature That Could Kill Police iPhone Unlockers (vice.com) 187

Lorenzo Franceschi-Bicchierai, reporting for Motherboard: On Monday, at its Worldwide Developers Conference, Apple teased the upcoming release of the iPhone's operating system, iOS 12. Among its most anticipated features are group FaceTime, Animoji, and a ruler app. But iOS 12's killer feature might be something that's been rumored for a while and wasn't discussed at Apple's event. It's called USB Restricted Mode, and Apple has been including it in some of the iOS beta releases since iOS 11.3.

The feature essentially forces users to unlock the iPhone with the passcode when connecting it to a USB accessory every time the phone has not been unlocked for one hour. That includes the iPhone unlocking devices that companies such as Cellebrite or GrayShift make, which police departments all over the world use to hack into seized iPhones. "That pretty much kills [GrayShift's product] GrayKey and Cellebrite," Ryan Duff, a security researcher who has studied iPhones and is Director of Cyber Solutions at Point3 Security, told Motherboard in an online chat. "If it actually does what it says and doesn't let ANY type of data connection happen until it's unlocked, then yes. You can't exploit the device if you can't communicate with it."
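A minimal sketch of the gating logic as described above; this is an illustration only, not Apple's implementation, and the type and method names are invented for the example:

```swift
import Foundation

// Illustrative only -- not Apple's code. USB data connections are refused
// unless the phone was unlocked within the last hour; otherwise the
// accessory gets charge-only behavior until the passcode is entered.
struct USBRestrictedMode {
    var lastUnlock: Date? = nil
    let window: TimeInterval = 60 * 60   // one hour

    mutating func didUnlock(at time: Date = Date()) {
        lastUnlock = time
    }

    // Called when an accessory asks for a USB data connection.
    func allowsUSBData(at time: Date = Date()) -> Bool {
        guard let last = lastUnlock else { return false }   // locked since boot
        return time.timeIntervalSince(last) < window
    }
}

var policy = USBRestrictedMode()
print(policy.allowsUSBData())   // false: no recent unlock, data refused
policy.didUnlock()
print(policy.allowsUSBData())   // true: unlocked within the last hour
```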

  • Cludge fix? (Score:4, Interesting)

    by sinij ( 911942 ) on Tuesday June 05, 2018 @09:05AM (#56730518)
    I admit, I don't know exactly how GrayKey and Cellebrite work. However, viewed from a proper access-control and privileges point of view, it shouldn't be possible to siphon the kinds of data (e.g. contacts, calls) that they are reportedly capable of extracting.

    So, could someone explain to me why they went with a solution that still leaves a one-hour window of opportunity to compromise a phone instead of fixing what I guess are overly permissive privileges within the file system?
    • Re:Cludge fix? (Score:5, Informative)

      by Anonymous Coward on Tuesday June 05, 2018 @09:10AM (#56730554)

      I admit, I don't know exactly how GrayKey and Cellebrite work. However, viewed from a proper access-control and privileges point of view, it shouldn't be possible to siphon the kinds of data (e.g. contacts, calls) that they are reportedly capable of extracting.

      So, could someone explain to me why they went with a solution that still leaves a one-hour window of opportunity to compromise a phone instead of fixing what I guess are overly permissive privileges within the file system?

      The file system isn’t left open; there are kernel exploits in iOS. Apple’s developers aren’t perfect and don’t know where they left things like buffer overflows that can be exploited. They fix them as they find them, but of course GrayKey won’t share its trade secrets. Rather than assuming every possible exploit can be patched, they restrict access to the device so that, even though exploits will probably always exist, someone without the passcode can’t interact with the phone at all. Problem is though, when you forget your password with this feature, there is no restore. Cool brick!

      • by bondsbw ( 888959 ) on Tuesday June 05, 2018 @09:19AM (#56730638)

        Apple’s developers aren’t perfect

        No no no... that's not how it works. Apple developers definitely are perfect, and everything they "fix" is really just better perfection.

        • by tsa ( 15680 )

          Indeed. They never make mistakes. Steve smites them with fire and fury if they do.

      • Apple’s developers aren’t perfect and don’t know where they left things like buffer overflows that can be exploited.

        Having looked at the kernel code, I would suggest they aren't trying very hard.

      • Re:Cludge fix? (Score:5, Interesting)

        by NFN_NLN ( 633283 ) on Tuesday June 05, 2018 @09:40AM (#56730790)

        The file system isn’t left open; there are kernel exploits in iOS. Apple’s developers aren’t perfect and don’t know where they left things like buffer overflows that can be exploited.

        I remember back in the satellite smart card hacking days when we had to "glitch" cards. We would put them in a special card reader and run commands through a loop over and over. As the commands were running through, you could adjust the VCC voltage supplied to the card. If you hit the right timing/voltage, the card would "glitch" and you could write to protected memory and gain access. You could buy unhacked cards by the hundreds, and with enough skill 90% of the cards were glitchable. There isn't any amount of coding skill that can defend against a glitch like that.

        • >There isn't any amount of coding skill that can defend against a glitch like that.

          Actually there is a fairly simple solution, though it is not about coding skill; it is about understanding the problem.

          If your data can in some cases be modified, you need to sign it using digital signature methods, and if the signature is not correct you refuse to use the data.

          Of course the "smart" cards of the era were nothing of the sort, and wider understanding of digital signing came later, so not really a realist
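          As a rough illustration of that suggestion (and only that -- real pay-TV cards of the era worked nothing like this), a minimal sign-and-verify sketch using Swift's CryptoKit; the key names and data are made up:

          ```swift
          import CryptoKit
          import Foundation

          // Illustrative sketch of the idea above, not how real cards worked:
          // the reader only trusts card data whose signature verifies against
          // the issuer's public key, so data rewritten via a glitch is rejected.
          let issuerKey = Curve25519.Signing.PrivateKey()   // held only by the card issuer
          let readerTrustedKey = issuerKey.publicKey        // baked into every reader

          let cardData = Data("subscriber-entitlements".utf8)

          do {
              // Written to the card when it is personalized.
              let signature = try issuerKey.signature(for: cardData)

              // Reader side: even if protected memory was glitched and rewritten,
              // forged contents will not carry a valid signature.
              if readerTrustedKey.isValidSignature(signature, for: cardData) {
                  print("card data accepted")
              } else {
                  print("card data rejected")
              }
          } catch {
              print("signing failed: \(error)")
          }
          ```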

          • How would that protect against a hardware compromise that allows the attacker to write to memory that *should* be protected? What would prevent the attacker from then just changing the required signature to their own?

            • NFN_NLN was referring to fucking with the cards.

              luvirini pointed out that doesn't fucking matter if the device reading the card checks that it's signed by a trusted key.

              You're referring to fucking with the device reading the cards.

          • You can duplicate digital signatures though. This can be solved by other means, but it's primarily why a lot of systems try to just hide the data instead.

        • Re:Cludge fix? (Score:4, Interesting)

          by AmiMoJo ( 196126 ) on Tuesday June 05, 2018 @11:14AM (#56731438) Homepage Journal

          Reminds me of the attack that finally recovered the hidden Gameboy boot ROM. Up until that point it had to be replaced by an open source one in emulators. The ROM was inside the CPU, and the final instruction in it disabled the ability to read said ROM until the next reset.

          Someone realized they could simply count the number of clock cycles needed to exit the ROM after reset, then wait that many cycles minus one and glitch the clock line. The glitch caused the ROM-read-disable instruction to be skipped, and the ROM could be dumped with a custom cart.

      • by mysidia ( 191772 )

        Problem is though, when you forget your password with this feature, there is no restore. Cool brick!

        What if they changed the logic... (1) Enter USB Restricted Mode as soon as the phone is locked, but only if the iPhone has been unlocked at least once after booting up.

        (2) Turn off the phone, and turn it back on while holding the Home button (or something), or with no USB device connected --- the device will either boot without entering restricted mode, or detect that no USB device is connected and stay out of it (roughly as sketched below).
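        A rough sketch of that proposed logic, purely hypothetical -- nothing like this has been announced, and the names are invented:

        ```swift
        // Hypothetical sketch of the proposal above; not an Apple feature.
        struct ProposedUSBPolicy {
            var unlockedOnceSinceBoot = false
            var restricted = false

            mutating func didUnlock() {
                unlockedOnceSinceBoot = true
                restricted = false
            }

            // (1) Restrict immediately on lock, but only after the first unlock since boot.
            mutating func didLock() {
                if unlockedOnceSinceBoot { restricted = true }
            }

            // (2) Booting while holding Home, or with no USB device attached,
            //     starts the phone outside restricted mode.
            mutating func didBoot(homeButtonHeld: Bool, usbConnected: Bool) {
                unlockedOnceSinceBoot = false
                restricted = usbConnected && !homeButtonHeld
            }
        }

        var phone = ProposedUSBPolicy()
        phone.didBoot(homeButtonHeld: true, usbConnected: true)
        print(phone.restricted)   // false: a recovery-style boot stays out of restricted mode
        ```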

      • ... when you forget your password ...

        You've also forgotten how to use the goddam phone [apple.com].

        If you forgot your passcode, or if a message says that your device is disabled, follow these steps to remove your passcode.

      • I believe you can still do a full reset of the device by connecting it to iTunes in recovery mode. This doesn't allow you to access anything stored on the device, but you can erase everything on the phone. Of course, activation lock still prevents it from being activated again unless you have access to the Apple ID previously used on the phone.
    • Re:Cludge fix? (Score:5, Insightful)

      by bensafrickingenius ( 828123 ) on Tuesday June 05, 2018 @09:10AM (#56730564)
      I too was thrown by the one-hour window. How often, outside of sleepy time, does one's phone go an entire hour without being unlocked?
      • by Anonymous Coward on Tuesday June 05, 2018 @09:13AM (#56730592)

        I too was thrown by the one-hour window. How often, outside of sleepy time, does one's phone go an entire hour without being unlocked?

        When the police seize it.

      • It would be smarter if that one-hour window applied only to unlocks that grant USB access, not all unlocks. Much like an unlocked phone still requires confirmation for an App Store purchase.

      • How often, outside of sleepy time, does one's phone go an entire hour without being unlocked?

        Every evening, when I leave it in the bedroom and I'm watching something in the movie room. I don't let my phone be a cybershackle outside of business hours.

        Want me? Call me! Otherwise I'll get back to you whenever... if ever.

        Weekends? Many hours pass without me looking at it or unlocking it. I just don't caaaaaaaaare about constant connectivity; in fact, the older I get, the more I loathe it.

      • The time between when a cop takes it from you and when they get a judge to sign a search warrant allowing them to look at it.

        • You might be living under a rock.
          For the last five or more years, /. has been full of news that cops don't need a search warrant to look at the data on your phone.

      • by Megol ( 3135005 )

        I think you have an addiction.

      • I would assume the time allowance is for syncing and backups. Depending on the phone and the computer, that could take a long time if the phone has a lot of files and the computer is older and using USB 2.
      • Apple should just make the USB lock come on one hour after the last unlock-via-passcode event.

        The vast majority of my phone unlocks are via fingerprint/TouchID, and these should not count.

        I enter the passcode on my iPhone:

        * After a reboot
        * When my thumb is damp and won't read
        * When installing an update

        If it works this way, my phone will require a passcode for USB access... essentially all the time.

    • by AmiMoJo ( 196126 )

      I'm not sure this change will affect GrayKey and Cellebrite anyway. My understanding is that they attack the phone's bootloader. It's a special bit of firmware that loads at boot time and is designed to make recovery from a broken OS image possible. It seems that they found some vulnerability in it that they can exploit to disable the passcode attempt limit and then automatically try passcodes until they find the right one.

      Also, this fix doesn't seem to be enough... On my Pixel you always have to unlock to

      • by sinij ( 911942 )
        Why doesn't modifying the bootloader require root access on iOS?
        • by AmiMoJo ( 196126 )

          The bootloader loads before the OS does. It doesn't have any concept of users. All it can do is ask for the passcode to decrypt flash memory or secure erase and overwrite the flash with a new image (for disaster recovery).

          The idea is that the secure element rate limits the number of password attempts. However, it appears that they have found some way to circumvent the limit, which involves exploiting the bootloader. It might be a case of loading their own code, or causing the secure element to crash and res

        • by Nkwe ( 604125 )

          Why doesn't modifying the bootloader require root access on iOS?

          The boot loader is what *starts* iOS. iOS isn't actually running yet when the boot loader loads it, so iOS can't protect itself at this point. Pretty much all computers work this way - they have a lightweight piece of code (the boot loader) in the firmware of the device; this code's sole job is to read the operating system from storage and start it. The hardware of the device loads and runs the firmware boot loader, which in turn loads and runs the software operating system.

      • Re:Cludge fix? (Score:5, Informative)

        by UnknowingFool ( 672806 ) on Tuesday June 05, 2018 @10:12AM (#56730984)

        I'm not sure this change will affect GrayKey and Cellebrite anyway. My understanding is that they attack the phone's bootloader.

        How do GrayKey and Cellebrite get access to the boot loader? Cellebrite [cnn.com] currently sells a small device that plugs into the phone.

        Eventually, law enforcement came to rely on Cellebrite's Universal Forensics Extraction Device, the UFED. It's a small, hand-held device that's easy to use. Police can simply plug in a phone and download the device's memory to a flash drive in a matter of seconds. That's how police can find your deleted text messages.

        GrayKey [scmagazine.com] is a box that plugs into the Lightning port.

        The product itself is a gray box four inches deep by two inches tall, with two lightning cables sticking out of the front. Up to two phones can be plugged into the device at a time and are connected for about two minutes.

        If the iPhone refuses to communicate via cable, then neither device is likely to work unless the companies find a flaw they can exploit.

        • by AmiMoJo ( 196126 )

          The bootloader can be accessed via the lightning port. That's how iTunes can recover an unbootable phone by doing a "factory reset". In that case iTunes instructs the bootloader to secure erase the flash memory and writes a new OS image to it.

          • A DFU restore? I wonder how this USB "locking" mechanism will deal with that. Maybe iBoot or the firmware will allow a firmware overwrite and erase, but not any ability to read.

            • by AmiMoJo ( 196126 )

              I don't think locking will affect DFU.

              Even if you read the flash (an optional part of the DFU spec) it's encrypted. The only realistic attack is on the passcode.

          • The bootloader can be accessed via the lightning port. That's how iTunes can recover an unbootable phone by doing a "factory reset". In that case iTunes instructs the bootloader to secure erase the flash memory and writes a new OS image to it.

            That would probably destroy any ability to recover the data on the phone, as the per-file encryption keys would be lost forever. This feature isn't to make a phone immune to theft; it's to make the data on the phone more secure from hacking.

            • by AmiMoJo ( 196126 )

              Yes, that's the point. You can only erase the flash, you can't recover data from the phone via the bootloader.

      • Re:Cludge fix? (Score:4, Interesting)

        by msauve ( 701917 ) on Tuesday June 05, 2018 @11:18AM (#56731466)
        "I'm not sure this change will affect GrayKey and Cellebrite anyway."

        I'd assume that Apple has gotten their hands on one, knows how it works, and has used it to develop and test their new feature.
    • They work by cracking the passcode, basically. Supposedly, they found a way to repeatedly test the passcode without triggering the cooldowns, or something similar. Once the phone is unlocked, obviously, all the data is available to whoever wants it.

    • The American Government probably requires Apple to have backdoor access to phone data via USB. If it wasn't deliberate they would have blocked the access by fixing the USB bug. They should also block software updates without unlocking the phone, to prevent the FBI getting a court warrant to force Apple to make "unlock assistance" software.
      • > If it wasn't deliberate they would have blocked the access by fixing the USB bug.
        Is that not exactly what they're doing? They don't know exactly what the bug is, so they're making the USB port useless unless the phone is already unlocked.

        Of course that's sort of the nuclear option, as it removes the easiest routes to repairing a phone with "broken" software, as well as probably interfering with the functionality of a number of clock-radios and other often-idle accessories. But it does the job.

      • "They should also block software updates without unlocking the phone, to prevent the FBI getting a court warrant to force Apple to make "unlock assistance" software."

        You mean like how I have to enter my passcode to update either the iPhone or a Paired Apple Watch?

    • I admit, I don't know exactly how GrayKey and Cellebrite work. However, viewed from a proper access-control and privileges point of view, it shouldn't be possible to siphon the kinds of data (e.g. contacts, calls) that they are reportedly capable of extracting.

      I would assume that both require plugging in a cable instead of using a wifi or cellular connection. The problem isn't "siphoning" data. The problem is taking advantage of some flaw in the iPhone. Apple can fix each and every flaw as it is found, but this feature also helps mitigate many attacks at once.

      So, could someone explain to me why they went with a solution that still leaves a one-hour window of opportunity to compromise a phone instead of fixing what I guess are overly permissive privileges within the file system?

      I would say all security involves a balance of convenience vs. effectiveness. If they didn't leave the one-hour window, their customers would have to use a passcode every single time, which would be inconvenient. The fingerpri

    • by AHuxley ( 892839 )
      Re "1 hour window of opportunity to compromise a phone"
      Police move in. The well educated computer aware protester shuts their trendy new big brand phone off.
      Police make arrests. Time taken to fill the van, bus back to the police station due to more arrests. Questions about name, ability to call to lawyer. Identity and citizenship questions. More time passes given the numbers arrested.
      Property gets sorted. An advanced new phone is discovered beyond the existing guides police have on most new big bra
    • I admit, I don't know exactly how GrayKey and Cellebrite work. However, viewed from a proper access-control and privileges point of view, it shouldn't be possible to siphon the kinds of data (e.g. contacts, calls) that they are reportedly capable of extracting.

      So, could someone explain to me why they went with a solution that still leaves a one-hour window of opportunity to compromise a phone instead of fixing what I guess are overly permissive privileges within the file system?

      It's not that the privileges are "overly permissive", it's just that, once you're in, it trusts you to be you.

      Anything else would be like the first version of UAC in Visturd. Annoying as fuck, with very little additional benefit.

      I agree that one hour is a bit long, but it probably drops to zero if you have time to lock the phone with the "panic gesture" (press the sleep button 5 times).

  • Hyperbole much? (Score:5, Informative)

    by Jason1729 ( 561790 ) on Tuesday June 05, 2018 @09:07AM (#56730540)
    "Apple Is Testing a Feature That Could Kill Police iPhone Unlockers. " Um, the feature you describe will prevent current unlockers from working on an iPhone with the feature enabled. But it's not going to kill the unlocker. That conjures up imagery of something that will detect the unlocker and fire high voltage into it or some such.

    I guess my 4-digit pin kills anyone who tries to casually snoop at my phone.
    • by arth1 ( 260657 )

      Yeah, this is false advertising. Although it might be possible to cause the battery to explode, and at least get a decent chance of maiming them.
      Just reducing the number of police trigger fingers might make this part of the world a safer place.

    • by teslar ( 706653 )

      If a product no longer works, it is likely to get discontinued ("killed off"). It's not that much of a stretch to say that making software useless will kill it.

      Captcha: epitaphs.

  • by idji ( 984038 ) on Tuesday June 05, 2018 @09:15AM (#56730606)
    What if your left thumb unlocked your phone and your right thumb wiped the device invisibly? The criminal could never know, you'd have deniability, and the police would be too scared to tap your dead finger to the phone.
    Or what if left-right-left unlocked and left-right-right wiped?
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Fingerprints have a non-zero chance of being misidentified, and the user has a huge chance of accidentally doing the wrong swipe command because they forgot or recently switched gestures.

      Bad idea, imho

      • by Anonymous Coward
        It's actually a good idea, just poorly implemented. Instead of wiping after a single swipe of the panic finger, it should require 3-4 swipes.

        Of course, to really work, you would need to allow the user to decide if the fingerprint sensor should only function as an unlock, or only function as a panic-wipe. Otherwise, you as a user would want to know if the "error reading fingerprint" message is the real deal or the phony "swipe X more times to initiate factory reset" message. But if you use an alternate m
        • But there'd still be the possibility of the bad guys choosing the correct finger before enough successful panic-swipes.

          Plot twist: all 10 fingers are invalid; the body part that actually unlocks it is left up to the reader's imagination.

          • by arth1 ( 260657 )

            Plot twist: all 10 fingers are invalid; the body part that actually unlocks it is left up to the reader's imagination.

            I think it's fairly clear that this should be the brain.

            The problem is in assuming that the finger (or in your case other body part) always will be representing the brain. That's a bad assumption.
            Authentication and authorization (by the user, not the device) need to be decoupled - the former does not imply the latter.
            "I am arth1" does not automatically validate "and arth1 wants to unlock".

    • by OzPeter ( 195038 )

      What if your left thumb unlocked your phone and your right thumb wiped the device invisibly? The criminal could never know, you'd have deniability, and the police would be too scared to tap your dead finger to the phone.

      Or what if left-right-left unlocked and left-right-right wiped?

      Given that Apple is moving to Face ID for phone unlocking, I don't see any changes based on fingerprints happening. Plus, the possibility of accidentally wiping a phone would have Apple really nervous about lawsuits.

    • What if your left thumb unlocked your phone and your right thumb wiped the device invisibly? The criminal could never know, you'd have deniability, and the police would be too scared to tap your dead finger to the phone. Or what if left-right-left unlocked and left-right-right wiped?

      I'm hoping this is tongue in cheek ... humans are far too unreliable to make it this easy to accidentally wipe your phone.

    • by wbr1 ( 2538558 )
      What you are promoting is a dead-man switch. Technically easy to implement, but not done by any device manufacturer currently, probably because they do not want the piles of support calls for accidental phone wipes.
      • by wbr1 ( 2538558 )
        Thinking about it some more, having it take multiple steps would help. Perhaps the dead-man trigger would not wipe the device but instead put it into an 'alert' state, such that any attempted data connection through USB or any failed 'real' unlock would wipe the device (something like the sketch below).
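        Something like this, perhaps -- a hypothetical sketch only; no shipping device implements it, and the state names are invented:

        ```swift
        // Hypothetical two-step dead-man switch, as described above.
        enum DeviceState { case normal, armed, wiped }

        struct PanicPolicy {
            private(set) var state: DeviceState = .normal

            mutating func panicTriggered()   { if state == .normal { state = .armed } }   // e.g. the "wipe" finger
            mutating func usbDataAttempted() { if state == .armed  { state = .wiped } }
            mutating func unlockFailed()     { if state == .armed  { state = .wiped } }
            mutating func realUnlock()       { if state == .armed  { state = .normal } }  // assumption: the owner can disarm
        }

        var phone = PanicPolicy()
        phone.panicTriggered()     // armed, but nothing is lost yet
        phone.usbDataAttempted()   // now the wipe fires
        print(phone.state)         // wiped
        ```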
    • by PPH ( 736903 )

      Because my wife pulls her phone out of her purse upside down or face down just as many times as she does right side up. One swipe the wrong way and everything is gone.

    • Good idea, until you fumble in your pocket, or in the dark, or try to catch the phone as it slips off the table and... wipe it.

    • by crow ( 16139 ) on Tuesday June 05, 2018 @10:26AM (#56731104) Homepage Journal

      What I want is to have encrypted VMs on my phone, with different fingerprints unlocking different VMs. Or perhaps different levels of unlocking. Unlocking the phone doesn't have to be a binary operation.

      Something like this would also be great for handing my phone to my son so that he can play games, while locking him out of my email and such.

      • by dargaud ( 518470 )
        I don't know about iPhones, but on Android you can have different users with different unlocking methods (one can be a password, another a fingerprint, another a drawing, etc.), each with its own account. I'm not sure how it merges with an encrypted phone, but, yes, you can basically do that... if you don't have an iPhone (as usual).
    • What if your left thumb unlocked your phone and your right thumb wiped the device invisibly? The criminal could never know, you'd have deniability, and the police would be too scared to tap your dead finger to the phone. Or what if left-right-left unlocked and left-right-right wiped?

      Uh, do you really think it's going to be "so much easier" to explain to law enforcement why you erased your smartphone, without making it look like you were destroying evidence?

      Try and remember the "criminals" Apple is trying to defeat here. I can assure you the larger battle will be more legal than technical when it comes to end-users wiping their own devices.

    • What if your left thumb unlocked your phone and your right thumb wiped the device invisibly? The criminal could never know, you'd have deniability, and the police would be too scared to tap your dead finger to the phone. Or what if left-right-left unlocked and left-right-right wiped?

      Sounds like a good solution for iUsers who don't drink.

    • Tangentially related, but I believe you can tap the power button 5 times to bring up the emergency prompt. Doing that will lock the phone out of biometric logins, adding another layer of security.
  • Image the underlying flash, wire to wire. Boot the image on a new phone, cache writes to a delta, attempt unlocks till the limit. Reset the state, clear the delta, attempt the next set of codes, get the combo. 6-digit passcodes are the norm and useless against this attack. USB access be damned.
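    Back-of-the-envelope numbers for that attack, with the caveat that the per-cycle figures are pure assumptions (attempt limits and restore times vary); the point is that the cost is dominated by how fast each image-restore cycle runs:

    ```swift
    // Illustrative arithmetic only; the rates are assumed, not measured.
    let codes = 1_000_000.0          // 6-digit passcode space
    let attemptsPerCycle = 10.0      // guesses before the limit forces a restore (assumed)
    let minutesPerCycle = 2.0        // time to re-flash the imaged state and reboot (assumed)

    let worstCaseMinutes = codes / attemptsPerCycle * minutesPerCycle
    print(worstCaseMinutes / 60 / 24)        // ~139 days worst case
    print(worstCaseMinutes / 60 / 24 / 2)    // ~69 days on average
    ```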
    • by tsa ( 15680 )

      Every criminal knows this, so they use longer passwords.

    • That is a much less trivial attack though, and not 100% reliable -- the secure enclave should be able to limit its effectiveness.

    • Doing all that will require a lot more time and expertise than an officer simply plugging in a USB cable. By raising the amount of effort required to break the security, the authorities are forced to prioritize which phones they can crack. Overall this should result in fewer people having their phones compromised.

  • Does anybody know? What was the holdup? Certainly it couldn't have been difficult to implement, could it?
    • My guess is they break their MFi program parameters with it.

    • It most likely has to be implemented at a very low level in the hardware or iOS, or it might be circumvented somehow via software.
    • by AHuxley ( 892839 )
      Users observed during testing would press the dongle in wrong and damage the delicate notch.
      Better cartoons got tested by artists so users will now know how to hold the dongle.
      The better cartoons and artwork is now ready so the product is now ready for average users.
  • by Zorpheus ( 857617 ) on Tuesday June 05, 2018 @09:32AM (#56730722)
    Sounds pretty much like how it works on Android.
  • And in China they will have an unlock code for the government.

  • by in10se ( 472253 ) on Tuesday June 05, 2018 @09:38AM (#56730762) Homepage

    It seems like killing police for unlocking an iPhone would get Apple in trouble.

    • This prevents unauthorized access. There's no guarantee that it's the police or some other lawful agency that's attempting to unlock your phone without your consent. If the police want access, they can get a warrant. Failure to comply at that point puts you in prison in most jurisdictions so from the perspective of the police, they don't really need to care if they can't actually get into the device.
    • It seems like killing police for unlocking an iPhone would get Apple in trouble.

      The headline made me envision an exploding battery.

  • I take it that the USB device the phone is connected to cannot be just any USB device, but one that the phone knows?

    • by AHuxley ( 892839 )
      A factory crafted idongle that only works with the iproduct it got made with. Together at a factory in a distant nation with laws about working with the police...
  • by kiviQr ( 3443687 ) on Tuesday June 05, 2018 @09:40AM (#56730784)
    If they really wanted to kill unlockers they should have included a capacitor-based USB Killer.
  • While they're at it, why not also fix the vulnerability that the unlockers exploit?

    • They can fix the exploit, but that means they can only fix exploits one at a time as each is found. With this feature they can mitigate a whole class of exploits.
