Apple Tells US Judge It's 'Impossible' To Break Through Locks On New iPhones (reuters.com) 225

An anonymous reader writes: Apple told a U.S. judge that accessing data stored on a locked iPhone would be "impossible" with devices using its latest operating system, but the company has the "technical ability" to help law enforcement unlock older phones. Apple's position was laid out in a brief filed late Monday, after a federal magistrate judge in Brooklyn, New York, sought its input as he weighed a U.S. Justice Department request to force the company to help authorities access a seized iPhone during an investigation. In court papers, Apple said that for the 90 percent of its devices running iOS 8 or higher, granting the Justice Department's request "would be impossible to perform" after it strengthened encryption methods.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Sounds like (Score:5, Insightful)

    by Chrisq ( 894406 ) on Wednesday October 21, 2015 @07:30AM (#50772197)
    Sounds like a challenge!
    • Sounds like a challenge!

      Not really. It's not that the protection can't be broken; it's that breaking it isn't something Apple wants to do, or can do on demand. Since there's no easy way to do it, it's "impossible"...

    • And this, my dear fellow slashdotters, is why we need more platforms. We NEED people to run Windows phones, BlackBerries with QNX Neutrino, Android with enforcing SELinux, Ubuntu phones with tight AppArmor, iOS with integrated lawyers, Tizen with something else, etc. We all know about eggs and baskets. Apps should be written in some stupid interpreted JavaScript crap that works on all platforms, preferably run in containers/jails/zones/whatever.

    • Simple.

      1. Remove the flash.
      2. Mount it with a non Apple device.
      3. Run a dictionary attack on the password.

      With the right equipment, it would only take a few hours depending on the complexity of the user's password.

      Am I missing something?

      • Re: Sounds like (Score:3, Informative)

        by Anonymous Coward

        Yes. The security processor handles the passwords, and the flash is encrypted with a sufficiently long symmetric key, so brute-forcing the key itself would theoretically take longer than the heat death of the universe (though every few years that estimate seems to halve). The better attack is against the keychain in the active device: if the user stuck with a short PIN, only a few days; but if they enabled a passphrase, then no, you're back to timescales far beyond any usefulness to LEOs, assuming they didn't choose correct-horse-battery-staple.
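Back-of-the-envelope, the timescales being argued over in this thread look roughly like this. The 80 ms per guess is an assumption (a commonly cited per-attempt key-derivation cost for on-device guessing, not Apple's published figure), and the function name is made up:

```python
# Rough worst-case brute-force times for device passcodes, assuming
# each guess costs ~80 ms of on-device key derivation (an assumed
# figure, not Apple's specification).
SECONDS_PER_GUESS = 0.08

def worst_case_seconds(keyspace: int) -> float:
    """Time to exhaust a keyspace at one guess per 80 ms."""
    return keyspace * SECONDS_PER_GUESS

print(f"4-digit PIN : {worst_case_seconds(10 ** 4) / 60:.0f} minutes")
print(f"6-digit PIN : {worst_case_seconds(10 ** 6) / 3600:.1f} hours")
# A 10-character lowercase+digit passphrase (36^10 combinations):
years = worst_case_seconds(36 ** 10) / (3600 * 24 * 365)
print(f"passphrase  : {years:.2e} years")
```

Which is the distinction the comment is drawing: a short PIN falls in hours to days, while a real passphrase is effectively out of reach, provided every guess is forced through the device.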

      • Re:Sounds like (Score:5, Informative)

        by tlhIngan ( 30335 ) <slashdot@worf.ERDOSnet minus math_god> on Wednesday October 21, 2015 @03:17PM (#50776439)

        Simple.

        1. Remove the flash.
        2. Mount it with a non Apple device.
        3. Run a dictionary attack on the password.

        With the right equipment, it would only take a few hours depending on the complexity of the user's password.

        Am I missing something?

        Yep. Starting with the iPhone 4, the flash media is encrypted with a key held in the device memory. That key is encrypted with the device UID key, the user's PIN (if enabled), and an instance key. The encryption key is changed when you select "Erase All Content and Settings": the device throws away the old key, generates a new one, and re-encrypts.

        Moving the flash chip to a new device means you lack the per-device key which makes the flash inaccessible.

        It's a fairly sophisticated system and short of implementation flaws, it's unbreakable.
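A toy model of the key hierarchy described above, with made-up names throughout. The real design does this with AES in dedicated hardware and never exposes the UID key; this sketch only shows the structure of the argument:

```python
import hashlib, hmac, os

# Toy model of the described key hierarchy (illustrative only; the real
# system uses AES in hardware, and the UID key never leaves the chip).
UID_KEY = os.urandom(32)   # stand-in for the fused per-device key

def derive_kek(pin: bytes) -> bytes:
    # Entangle the user's PIN with the device UID key via a slow KDF,
    # so every guess is expensive and tied to this physical device.
    return hashlib.pbkdf2_hmac("sha256", pin, UID_KEY, 100_000)

def wrap(data: bytes, kek: bytes) -> bytes:
    # Toy XOR "wrap" with an HMAC-derived keystream; real designs use
    # proper AES key wrapping. XOR makes wrap its own inverse here.
    stream = hmac.new(kek, b"wrap", hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

dek = os.urandom(32)                     # key that encrypts the flash
stored = wrap(dek, derive_kek(b"1234"))  # all the device persists

# The right PIN on the right device recovers the DEK:
assert wrap(stored, derive_kek(b"1234")) == dek
# Erasing the device just discards the DEK and generates a new one;
# moving the flash to another device (different UID_KEY) makes
# `stored` useless even with the correct PIN.
```

That per-device entanglement is why step 2 of the grandparent's plan (mount the flash in a non-Apple device) fails before the dictionary attack even starts.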

      • Simple.

        1. Remove the flash.
        2. Mount it with a non Apple device.
        3. Run a dictionary attack on the password.

        With the right equipment, it would only take a few hours depending on the complexity of the user's password.

        Am I missing something?

        Yes you are missing a lot.
        https://www.apple.com/business... [apple.com]
        https://developer.apple.com/li... [apple.com]

        Apple has done a lot of work to improve their systems.
        So has Microsoft, FWIW.

        It was public knowledge even before the breach at Sony that system failures and
        the naive use of systems by customers would prove to be trouble. Those without
        their head up their exit port could read the writing on the wall.

        Another less-discussed topic is IPv6 and the Internet of Things. Some minimum safety existed behind home NAT, but with IPv6 every device can end up directly reachable from the internet.

  • by rmdingler ( 1955220 ) on Wednesday October 21, 2015 @07:39AM (#50772235) Journal
    Impossible or not, is it a private company's (or individual's) duty to engage in the evidence-gathering duties of law enforcement?

    I'm not sure the judicial conviction of this one suspect is worth granting law enforcement the unfettered ability to deputize anyone, any time it's convenient.

    • Re: (Score:2, Informative)

      by Anonymous Coward
      It is their duty when the court orders it so as part of evidence gathering. Law 101, dude.
      • [Engaging in evidence-gathering duties] is their duty when the court orders it so as part of evidence gathering. Law 101, dude.

        No, it is not. There's a big difference between providing information they have, which is their legal duty, and gathering information that they wouldn't otherwise have, which is not their legal duty. That's why plea bargains that provide immunity are a thing: they can't order you to do their job for them, but they can provide strong incentives for you to do so.

    • by gstoddart ( 321705 ) on Wednesday October 21, 2015 @08:09AM (#50772397) Homepage

      Because, apparently, it is now "un-American", or straight up illegal, for private companies to NOT be part of the spy apparatus.

      So, either you accept the provisions of stuff like the PATRIOT Act which says every company is required to participate and keep it secret ... or you have to somehow get a court to overturn that (or have the lawmakers repeal it).

      But, make no mistake about it, in the present situation, spying is a given, the requirement for corporations to help is real, and the expectation that making something you can't help them break into is just helping terrorists.

      So, yes, this may not be the right question. The problem is: to whom are you supposed to ask the right question?

      Because apparently most Americans now accept this crap as perfectly normal, and have fully embraced that if you have nothing to hide you have nothing to fear.

      The scope creep from national security and terrorism down to common day-to-day crimes was inevitable. And now law enforcement expects to bypass any legal controls and get whatever they wish, simply because they want it.

      Papers please, comrade. That particular cat has been out of the bag for a while.

      • This is the exact reason why Apple made changes to their encryption and is actively fighting it.

        Are any other phone companies doing the same?

      • So, either you accept the provisions of stuff like the PATRIOT Act which says every company is required to participate and keep it secret ... or you have to somehow get a court to overturn that (or have the lawmakers repeal it).

        But, make no mistake about it, in the present situation, spying is a given, the requirement for corporations to help is real, and the expectation that making something you can't help them break into is just helping terrorists.

        I remember Mr. Comey on TV saying as much. He certainly has made it clear that he does not think a lock the FBI can't open should be permissible.

        We also know patriot act requires production of "any tangible thing" as if the "third party doctrine" did not already.

        Yet there is a difference between being compelled to assist with opening a lock, or providing information to advance a specific "investigation", and being ordered by the government not to produce a lock that can't be opened in the first place.

      • Because apparently most Americans now accept this crap as perfectly normal, and have fully embraced that if you have nothing to hide you have nothing to fear.

        That's because they either don't care enough or they don't understand the issue. Most tech-savvy people understand what power comes with access to information. I'm willing to trust the authorities within reason, but I will protect my data just in case.

        I've said this before and I still stand by the belief that the user should have the right to protect his data. Should this person be in a position where access to the data can prove him innocent OR guilty, he should have to provide access to the data, with the risk that entails.

    • by Matheus ( 586080 )

      I typically am opposed to Apple's way of doing business, but this action I applaud. As many commenters have already stated, "impossible" doesn't really exist when it comes to hacking, BUT for Apple to rebuke the US government by saying they can't is a great example to set.

      That being said: it's also entirely likely that they are making a big show of saying no while quietly working with the man behind the scenes. "It's way better for the public to think we can't do this, and it helps us sell iDevices; but here's the magic tool you asked for."

      • I like to believe the tech companies are made up of people like the ones from Slashdot who are tired of the clammy-handed government overreach.

        Of course, I recognize the convenience of my belief system; as, if there are no corporations on the side of privacy in a Corporatocracy, we are truly SOL.

    • Impossible or not, is it a private company's (or individual's) duty to engage in the evidence-gathering duties of law enforcement?

      It was in the past. For example, the phone companies (well, phone company initially) set up their networks to make it easier for law enforcement to wiretap if they showed up with a warrant.

      But given recent publicity about NSA data collection, all of that public trust and goodwill is probably gone now. I don't think playing the "Apple is being a bad corporate citizen!" card will work anymore.

      • Very interesting... I had never considered the NSA had once been thought of in good terms by the security community.

        Do you suppose anyone's left employed in public relations at the NSA?

  • Bad guys (Score:4, Insightful)

    by Anonymous Coward on Wednesday October 21, 2015 @07:41AM (#50772251)

    This is what encryption is for. Keeping data from the bad guys.

    • This is what encryption is for. Keeping data from the bad guys.

      So, it has come to this. Law Enforcement are now the bad guys. I'm not saying I disagree (at least not in all contexts), but it is a sad state of affairs in a once promising nation.

  • Introducing the "Mom, Freedom, and Apple Pie Anti-Terrorist Act of 2015," that requires that all phone manufacturers build in government approved backdoors into every phone. And after a few Democrats and Rand Paul pretend to object to it, and briefly pretend to stand up against it, it will be approved by Congress with a unanimous vote and signed by the President (who will also pretend to give a flying fuck about privacy concerns by pinkie-swearing that it won't be abused).

    • by schwit1 ( 797399 )
      I have phone service from country_A with a cell phone I bought in country_B and while I'm traveling in country_C I'm talking to a person from country_D who is in country_E and his phone was bought in country_F.

      So what government gets to control the backdoor on my phone?

      • All of them? At the very least, with data sharing agreements they'll all get access to it.

        Or are you still laboring under the illusion that pretty much all the governments aren't colluding to fuck over their citizens' rights?

  • Comment removed based on user account deletion
    • by jaseuk ( 217780 )

      Yes sure, you can enroll an iOS device in MDM and then send it an unlock command. The end user has to agree and approve this first, of course.

      Apple has built the system so that it is immune to a direct unlock. Apple and Microsoft have been giving clear signals that they no longer want to be stuck in the middle of international legal disputes requiring them to unlock under court order, so they've re-engineered their encryption and unlock protocols so that they no longer hold any master keys.

    • That would really depend on whether it was running or not. If it was fully powered down, it needs credentials to decrypt the storage and finish booting (I am assuming they are similar to Android devices in this respect). If it's powered up and connected to even Wi-Fi, what's stopping it from getting a remote-wipe command?

      RF shielding is fairly easy; they make evidence-collection bags for just this purpose, and the bags even keep the phone charged. But both major OSes have built-in remote-wipe capabilities, so the phone has to be isolated from any network from the moment it's seized.

      • Then I see the need for an app that will automatically wipe the device if it has not been unlocked by passcode within some user-configurable period of time. The user sets it for three days or seven days, and if the phone is kept powered on, or is booted after that time, poof, no more data. A delay of 30 seconds or a minute can be user-enabled so the owner could still get in if he hadn't used the phone; law enforcement wouldn't know the app was there, so they would power it up to see if they could access it.
        • These sorts of countdown clocks exist for other things, but it would be extremely hard to fully implement as an app; apps simply don't have the access. It might be able to erase an SD card, but not the rest.
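The countdown-wipe idea can be sketched like this. The state-file name and function names are hypothetical, and, as the reply notes, an app-level version could only destroy data it controls (its own key file, maybe an SD card), never the whole device:

```python
import json

# Sketch of a "wipe if not unlocked within N seconds" dead-man switch.
# Names and the state-file path are hypothetical; an app can only act
# on data it has access to, which is the limitation discussed above.
LIMIT_SECONDS = 3 * 24 * 3600     # user-configurable: three days

def record_unlock(now: float, path: str = "last_unlock.json") -> None:
    # Called on every successful passcode unlock.
    with open(path, "w") as f:
        json.dump({"last_unlock": now}, f)

def should_wipe(now: float, path: str = "last_unlock.json") -> bool:
    # Checked at boot and periodically: trigger the wipe if the owner
    # hasn't unlocked within the window.
    try:
        with open(path) as f:
            last = json.load(f)["last_unlock"]
    except FileNotFoundError:
        return False              # never armed yet
    return now - last > LIMIT_SECONDS
```

The grace-delay variant from the comment would just check a second, shorter window before actually destroying anything.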

    • There are two flaws here. 1: When your device is encrypted on KitKat and below, you must enter the decryption password to boot, so there's no remote access unless the device is already running (which it probably is, but still). I don't know if Lollipop and above are different, since I keep encryption off in favor of speed. 2: You can install all the apps you want remotely, but they must be launched by the user at least once before they can run any background processes.

      • There are two flaws here. 1: When your device is encrypted on KitKat and below, you must enter the decryption password to boot, so there's no remote access unless the device is already running (which it probably is, but still). I don't know if Lollipop and above are different, since I keep encryption off in favor of speed.

        The same is true on Lollipop and Marshmallow. Note that on KitKat and below, breaking the device decryption is not terribly hard, since most user passwords are weak, for convenience. What you do is:

        1. Access the flash directly. The easiest way is probably to desolder it from the device and pop it into another device.

        2. Read the crypto footer on the data partition. This contains the disk encryption key (DEK), encrypted with a key encryption key (KEK) derived from your password with scrypt.

        3. Brute-force the password offline: run the scrypt derivation over a candidate list until the resulting KEK decrypts the DEK correctly.
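Step 3 can be sketched like this. The scrypt parameters are deliberately tiny for illustration (real crypto footers use far costlier settings), and the idea of comparing against a known target KEK stands in for the real key-check logic, which is assumed here:

```python
import hashlib, os

# Sketch of an offline brute force against a crypto-footer-style KDF.
# Parameters are deliberately small; real footers use much costlier
# scrypt settings, and the key-check details are simplified.
def derive_kek(pin: str, salt: bytes) -> bytes:
    return hashlib.scrypt(pin.encode(), salt=salt,
                          n=2 ** 10, r=8, p=1, dklen=32)

def crack(salt: bytes, target_kek: bytes, candidates):
    # Try each candidate PIN until the derived KEK matches.
    for pin in candidates:
        if derive_kek(pin, salt) == target_kek:
            return pin
    return None

salt = os.urandom(16)
target = derive_kek("1234", salt)  # stands in for data read off the footer
found = crack(salt, target, (f"{i:04d}" for i in range(10_000)))
print(found)  # 1234
```

Because everything needed for the derivation sits on the flash, nothing forces the attacker through the device, which is exactly the weakness the per-device UID-key entanglement in the iPhone design closes off.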

    • On Android you can browse the Play Market on a desktop-browser and remotely install applications on your phone, with no confirmation or anything needed on the phone.

      That only helps if apps can unlock the device. They can't on Android, and I see no reason why they'd be able to on iOS, either.

    • by flink ( 18449 )

      On iOS you have to unlock your phone before you sync with iTunes, so I don't think you can push an app over WIFI without knowing the passcode.

      • On iOS you have to unlock your phone before you sync with iTunes, so I don't think you can push an app over WIFI without knowing the passcode.

        Unless the computer it is syncing with has previously synced with that iPhone. During the first access of the phone by a computer, the phone pops up a box asking if this computer should be trusted; if the person selects yes, a cookie is exchanged. Later, if the phone is hooked to the same computer, that cookie automatically allows access to the phone's contents. This is one of the ways law enforcement accesses seized phones: by also seizing the computer it syncs with.

  • by c ( 8461 ) <beauregardcp@gmail.com> on Wednesday October 21, 2015 @08:16AM (#50772449)

    It's a straight up application of Schneier's Law:

    Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break.

    -- Bruce Schneier [schneier.com]

    Someone [nsa.gov] might be able to break it, but if they can I doubt they'd talk about it.

    • by Dr. Evil ( 3501 )

      I'm not sure they're denying that:

      "In court papers, Apple said that for the 90 percent of its devices running iOS 8 or higher, granting the Justice Department's request "would be impossible to perform" after it strengthened encryption methods."

      When the courts ask "can you provide us with the key for this device?", the answer isn't "yes, theoretically, we could, if we invested millions of dollars and years of effort, there's a possibility to crack it", the answer is "no, we are not able to."

  • by TFlan91 ( 2615727 ) on Wednesday October 21, 2015 @08:43AM (#50772645)

    This sounds like a marketing scheme to get people to think:

    "Oh nos! DOJ can break into my 'older phones' running 'iOS [7 or lower]'! Better buy the newest one!"

    • iOS updates are free. iOS 8 is supported on devices going back to the iPhone 4S (which is now 4 generations old).
  • How does an Apple customer verify that the claim is true?

  • In other news, the Department Of Homeland Security declares that Apple is now an "Enemy of the State", and will be moving to seize all of their assets.

    • by halivar ( 535827 )

      That will be a day of great internal struggle for most /.'ers.

      • That will be a day of great internal struggle for most /.'ers.

        Yes, but that will be offset by the news that DHS will also declare Microsoft to be an "Enemy of the State".

  • It is just that Apple doesn't have the tools in place to do it, and in fact may not know how to do it, and Apple is likely not pursuing the capability to do it. The court cannot compel Apple to do something that they do not know how to do.
    • Yes, my understanding of the situation is the same. Maybe the NSA has a way or has created a backdoor, but Apple may not. It might be possible but Apple doesn't have the mathematicians, resources, or desire that NSA has to defeat the security.
