
Apple Deluged By Police Demands To Decrypt iPhones 239

Posted by Soulskill
from the atf-struggles-with-slide-to-unlock dept.
New submitter ukemike points out an article at CNET reporting on how there's a "waiting list" for Apple to decrypt iPhones seized by various law enforcement agencies. This suggests two important issues: first, that Apple is apparently both capable of and willing to help with these requests, and second, that there are too many of them for the company to process as they come in. From the article: "Court documents show that federal agents were so stymied by the encrypted iPhone 4S of a Kentucky man accused of distributing crack cocaine that they turned to Apple for decryption help last year. An agent at the ATF, the federal Bureau of Alcohol, Tobacco, Firearms and Explosives, 'contacted Apple to obtain assistance in unlocking the device,' U.S. District Judge Karen Caldwell wrote in a recent opinion. But, she wrote, the ATF was 'placed on a waiting list by the company.' A search warrant affidavit prepared by ATF agent Rob Maynard says that, for nearly three months last summer, he 'attempted to locate a local, state, or federal law enforcement agency with the forensic capabilities to unlock' an iPhone 4S. But after each police agency responded by saying they 'did not have the forensic capability,' Maynard resorted to asking Cupertino. Because the waiting list had grown so long, there would be at least a 7-week delay, Maynard says he was told by Joann Chang, a legal specialist in Apple's litigation group. It's unclear how long the process took, but it appears to have been at least four months."
  • by APE992 (676540) on Saturday May 11, 2013 @10:15PM (#43699651) Journal
    If they're going to expect Apple to spend time doing their work for them, are they at least compensating them for the time and energy necessary for this?
    • by noh8rz10 (2716597) on Saturday May 11, 2013 @10:24PM (#43699719)
      i see this story as being a GOOD thing, generally speaking. the feds are stumped by my iphone. now the only people we need to cockblock are in cupertino...
      • How ? (Score:4, Interesting)

        by Taco Cowboy (5327) on Saturday May 11, 2013 @11:20PM (#43699977) Journal

        i see this story as being a GOOD thing, generally speaking. the feds are stumped by my iphone. now the only people we need to cockblock are in cupertino

        The question is, how ?

        The Apple platform is a closed platform, and they closely guard against any attempt to change their products (even after we have purchased them with our own money).

        Until now, there has been no way to safeguard our secrets stored on an i-Device from the prying eyes of Apple Inc.

        • Re:How ? (Score:5, Insightful)

          by BrokenHalo (565198) on Sunday May 12, 2013 @04:18AM (#43700933)

          Until now, there has been no way to safeguard our secrets stored on an i-Device from the prying eyes of Apple Inc.

          If you want something kept secret, you're a fool if you put it on your phone.

          • Re:How ? (Score:4, Insightful)

            by kthreadd (1558445) on Sunday May 12, 2013 @10:21AM (#43702109)

            Not at all if the computer (I don't know why so many call modern hand-held computers phones since they are not very phone-like) is using strong and trustworthy encryption which you control. I don't know the details in this case (Slashdot is seldom trustworthy), but if anyone except you can decrypt it using something other than brute force then the encryption is certainly not trustworthy. If that's the case then putting secrets on this computer that you call phone is absolutely a terrible idea, but I see very little problem with it if it's actually good encryption.

        • Re:How ? (Score:4, Interesting)

          by erroneus (253617) on Sunday May 12, 2013 @07:34AM (#43701435) Homepage

          Jailbreak, inject a new encryption key?

      • by Shavano (2541114) on Saturday May 11, 2013 @11:34PM (#43700025)
        You understand that in this case the police HAD a warrant. What's your complaint?
        • Re: (Score:2, Insightful)

          by Anonymous Coward

          My complaint is that the police can fuck right off if they want to decrypt anything on mine.

        • by Charliemopps (1157495) on Sunday May 12, 2013 @06:05AM (#43701221)

          You understand that in this case the police HAD a warrant. What's your complaint?

          That encryption is not encryption if Apple can "undo" it.

          • by Impy the Impiuos Imp (442658) on Sunday May 12, 2013 @07:39AM (#43701449) Journal

            Is it a user's password or is it Apple's? Is there a back door in the algorithm? Is it an inherently weak algorithm, but the police don't know what it is so they can't launch an attack?

            Inquiring minds want to know!

            • by Savage-Rabbit (308260) on Sunday May 12, 2013 @12:00PM (#43702645)

              Is it a user's password or is it Apple's? Is there a back door in the algorithm? Is it an inherently weak algorithm, but the police don't know what it is so they can't launch an attack?

              Inquiring minds want to know!

              Apparently you encrypt an iOS device when you enable the passcode option. The default passcode is numerical and only 4 digits, which is very weak. You can activate a 'passphrase' option that gives more security, but the passphrase should be at least 12 characters long. An 8-character password, for example, can apparently be cracked (brute-forced, presumably) in under 2 hours. Since the iPhone defaults to a 4-digit numerical code, I don't suppose cracking 98% of these devices will be terribly hard. However, as always, it appeals far more to the Apple haters here to jump to the conclusion that iOS devices phone home to Apple and send them your encryption keys and passphrases in clear text. I am not so sure about that myself; I know of a criminal case where a FileVault image was sent to Apple for decryption, but they returned it after a while saying that their people had failed to crack it.
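As a rough back-of-the-envelope check on the claims above, the search spaces compare like this (a sketch only: the 10,000 guesses-per-second rate is an assumed illustrative figure, not a measured one, and real rates depend heavily on the per-guess key-derivation cost):

```python
def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible codes of the given length over the given alphabet."""
    return alphabet_size ** length

# Assumed guess rate for illustration; real rates depend on hardware
# and how costly the key-derivation step is per guess.
GUESSES_PER_SEC = 10_000

for label, alpha, length in [
    ("4-digit PIN", 10, 4),                   # the iPhone default passcode
    ("8-char alphanumeric password", 62, 8),
    ("12-char alphanumeric passphrase", 62, 12),
]:
    n = keyspace(alpha, length)
    hours = n / GUESSES_PER_SEC / 3600
    print(f"{label}: {n:.3g} codes, worst case {hours:.3g} hours")
```

Even at a fixed optimistic guess rate, the jump from a 4-digit PIN to a 12-character passphrase moves the worst case from about a second to geological timescales.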

            • by Joce640k (829181)

              Is it a user's password or is it Apple's? Is there a back door in the algorithm? Is it an inherently weak algorithm, but the police don't know what it is so they can't launch an attack?

              Inquiring minds want to know!

              If there's a "seven week delay" they're probably brute-forcing something.

          • by BasilBrush (643681) on Sunday May 12, 2013 @10:36AM (#43702169)

            Apple can't "undo" encryption. But a lockscreen PIN code is 4 digits long. Guess how many tries they need, on average and at maximum, to brute-force it?

            Reduce that average time, because some passcodes are used more often than others. (0000,9999,1234, numbers that spell out various 4 letter words)

            After 6 attempts, you have to wait a minute before trying again. At some point there will be a complete lockout, but even that can be reset via iTunes.

            So brute-forcing is by no means impossible. But it will take time and, realistically, automation. Hence why law enforcement have to wait once they've issued Apple with a warrant.

            Those who are Android fans should bear in mind that Google will also retrieve data from Android devices if the Police issue them with a warrant.

            The smartphone of choice for those people who need to protect their phone data from the Police is still the Blackberry.
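The brute-force-with-lockout arithmetic above can be sketched with a toy model (all parameters here are assumptions for illustration: 0.1 s per automated attempt, a flat one-minute lockout after every 6 failures, and a uniform guessing order rather than trying common PINs first):

```python
def brute_force_time(pin_position: int, attempt_sec: float = 0.1,
                     batch: int = 6, delay_sec: float = 60.0) -> float:
    """Seconds to reach the correct PIN at 1-based position `pin_position`
    in the guessing order, with a `delay_sec` lockout imposed after every
    `batch` consecutive failed attempts."""
    failures_before_success = pin_position - 1
    lockouts = failures_before_success // batch
    return pin_position * attempt_sec + lockouts * delay_sec

# Average over all 10,000 equally likely 4-digit PINs.
avg = sum(brute_force_time(i) for i in range(1, 10_001)) / 10_000
print(f"average: {avg / 3600:.1f} hours, "
      f"worst case: {brute_force_time(10_000) / 3600:.1f} hours")
```

Under these assumptions the lockout delays, not the guesses themselves, dominate the total time, which is why automation (or stripping the rate limit) matters so much.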

            • by mjwx (966435)

              Those who are Android fans should bear in mind that Google will also retrieve data from Android devices if the Police issue them with a warrant.

              The beauty of Android is that it is very, very easy to make this very, very hard for Google (or anyone trying really).

              But the best defence against the Police is a Nokia 6110. As long as you don't use SMS they store practically nothing.

              The only real security for mobile devices is to store nothing sensitive (or incriminating) on them.

        • by TheCarp (96830)

          My complaint is that Apple is even capable of complying. If I buy a device, it's mine; if I encrypt that device, I, and whoever I give the key to, should be the only people able to decrypt it (key weakness and cryptanalysis notwithstanding, obviously).

          If this is not the case, then it should be made explicitly obvious up front, not just buried in the fine print, because this, in reality, is a HUGE difference between expectation and reality.

          But.... I have already exercised my right as a consumer in thi

      • by FuzzNugget (2840687) on Saturday May 11, 2013 @11:45PM (#43700047)

        You're deluding yourself if you think a backdoor is a good thing.

        No, this is overall a bad thing: Apple is able and willing to break the encryption on an iPhone, presumably through a backdoor or brute force.

        Then again, we could all be mistakenly conflating "encryption" with "lock screen", which really speaks to the level of (in)competence on the part of law enforcement.

        Hmmm, maybe this is a good thing (just not quite in the way you were thinking)

        • by bytesex (112972) on Saturday May 11, 2013 @11:59PM (#43700103) Homepage

          Maybe the backdoor isn't so much the crypto format itself - it's in the password to decrypt. After all - these companies have a thing for you sharing information 'in the cloud', right? What's to stop them from simply posting your password somewhere central - for recovery purposes on your (and apparently, other people's) behalf? I reckon 90% of users would find it super-convenient!

        • by Mojo66 (1131579)

          presumably through a backdoor or brute force.

          I doubt there is a backdoor because if there was then it wouldn't take them so long. Probably brute-force.

        • by sribe (304414) on Sunday May 12, 2013 @07:55AM (#43701531)

          No, this is overall a bad thing: Apple is able and willing to break the encryption on an iPhone, presumably through a backdoor or brute force.

          Brute force. 10 failed attempts at the lock screen results in the phone being wiped. But Apple can copy out the encrypted contents, and then keep guessing until they find the code, no matter how many tries.

          Then again, we could all be mistakenly conflating "encryption" with "lock screen", which really speaks to the level of (in)competence on the part of law enforcement.

          On the iPhone, same thing--when you set up the lock screen, it sets up a random key which is used to encrypt/decrypt data in-flight to the flash, so that nothing is stored decrypted. The passcode is used to de-scramble the key, which is stored in a special location...
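The de-scrambling step described above (a random data key that the passcode unlocks) can be sketched with stdlib primitives. This is a toy illustration only: PBKDF2 plus XOR stands in for the device's real hardware-entangled key derivation and AES key wrap, and none of the parameter choices are Apple's.

```python
import hashlib
import os

def _kdf(passcode: str, salt: bytes) -> bytes:
    # Stand-in for the passcode-derived key. On a real device this
    # derivation is entangled with a hardware UID, so it cannot be
    # computed off-device at all.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt,
                               100_000, dklen=16)

def wrap_key(data_key: bytes, passcode: str):
    """'Scramble' the random data key under the passcode-derived key."""
    salt = os.urandom(16)
    kek = _kdf(passcode, salt)
    wrapped = bytes(a ^ b for a, b in zip(data_key, kek))  # toy XOR wrap
    return salt, wrapped

def unwrap_key(salt: bytes, wrapped: bytes, passcode: str) -> bytes:
    kek = _kdf(passcode, salt)
    return bytes(a ^ b for a, b in zip(wrapped, kek))

data_key = os.urandom(16)  # random per-device key, set up with the lock screen
salt, wrapped = wrap_key(data_key, "1234")
assert unwrap_key(salt, wrapped, "1234") == data_key
assert unwrap_key(salt, wrapped, "0000") != data_key
```

Note the design consequence: the flash contents never see the passcode directly, so changing the passcode only re-wraps the small key, not the whole disk.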

      • by SeaFox (739806) on Sunday May 12, 2013 @01:50AM (#43700493)

        i see this story as being a GOOD thing, generally speaking. the feds are stumped by my iphone. now the only people we need to cockblock are in cupertino...

        No, I'd say this is a bad thing. A backlog of these requests will only be used as justification for a regular law-enforcement back door built into a later version of iOS. "This process is taking too long and Apple is being burdened with fulfilling these requests; if only we had a way of accessing an iPhone ourselves without needing their assistance, it would make things easier for all parties when investigating terrorism and child pornography..."

      • Encryption that the manufacturer can break after a mere seven-week wait is not, in any sense, useful encryption.

    • by Anonymous Coward on Saturday May 11, 2013 @10:26PM (#43699731)

      You're kidding, right? The real issue is that Apple has a backdoor to decrypt its customers' private information. That is outrageous.

      It is irrelevant how much Apple spends to operate that backdoor.

      • by Anonymous Coward on Saturday May 11, 2013 @10:33PM (#43699785)

        Now you know and knowing is half the battle. Don't buy iPhone.

        • by fustakrakich (1673220) on Saturday May 11, 2013 @11:10PM (#43699941) Journal

          That's right. Steal somebody else's

        • by deains (1726012)

          Or better yet, don't store sensitive data on your smartphone. Android and Windows Phone are likely to have their own backdoors as well, so simply avoiding Apple doesn't necessarily solve the problem.

        • by sribe (304414) on Sunday May 12, 2013 @08:01AM (#43701563)

          Now you know and knowing is half the battle. Don't buy iPhone.

          Right, because, as the article points out:

          Google takes a more privacy-protective approach: it "resets the password and further provides the reset password to law enforcement," the materials say, which has the side effect of notifying the user that his or her cell phone has been compromised.

          Oh, good for Google! Wait, why doesn't Apple just reset the password and provide the new password to law enforcement? Oh, yeah, right, better security: they can't just reset the password. And boy, how much better it is for the suspect's privacy that Google notifies him. Let's see: he's been arrested, his phone seized, a warrant obtained to examine its contents. I'm sure he'd be so much more relieved if he were to get an email when his passcode is cracked, because by god that is so important to his privacy!

      • by Shavano (2541114)
        Did you receive documentation that said otherwise?
      • by node 3 (115640) on Sunday May 12, 2013 @12:38AM (#43700251)

        You're kidding, right? The real issue is that Apple has a backdoor to decrypt its customers' private information. That is outrageous.

        It would be, were that the case. But it's all but certainly not. There's no way Apple would put an actual back door into their products.

        If you had read the article, you'd notice that the process takes four months. If they had a back door, it would take a few minutes. Also, had you read the article, you'd notice that Google will reset the password and send that to law enforcement.

        But I'm sure that's not outrageous. Lol!

        It is irrelevant how much Apple spends to operate that backdoor.

        That's true, but only if there was an actual back door.

        However, in all fairness, if you have proper evidence that Apple has a back door, I'll be right there with you. That would be wholly unacceptable.

        • by gd2shoe (747932) on Sunday May 12, 2013 @03:22AM (#43700753) Journal
          The summary implies that it did only take a couple of minutes... after months of sitting on a shelf while Apple dealt with the backlog of other phones needing to be unlocked by law enforcement.
          • by node 3 (115640)

            Actually, even in the summary, the relevant part is here:

            "Because the waiting list had grown so long, there would be at least a 7-week delay, Maynard says he was told by Joann Chang, a legal specialist in Apple's litigation group. It's unclear how long the process took, but it appears to have been at least four months."

            It says that the waiting list is 7 weeks, and the process takes four months. However, even so, the entire article is quite vague. The only thing that's not is that there's no way there's as b

        • by AmiMoJo (196126) * <{ten.3dlrow} {ta} {ojom}> on Sunday May 12, 2013 @03:40AM (#43700819) Homepage

          No, the backlog is 4 months. Nobody knows how long actual decryption takes, but the nature of these things is that it will either be minutes or thousands of years with a supercomputer dedicated to the task. Apple claims [apple.com] that it uses AES with a 128 bit key, so if they can unlock it that quickly they MUST have a backdoor to the encryption key.

          This is absolute proof that they have your encryption key on file somewhere. Others have already verified that they do indeed use AES 128.

          To cover themselves legally Apple will have to evaluate every request that comes in, handle the evidence securely (maintaining the chain of custody) and then handle the potentially sensitive and illegal decrypted data in a way that doesn't expose its staff. It's no wonder there is a backlog.

          • by Cyberax (705495)
            Dudes, Apple holds your encryption key in escrow to allow device restores. That's even disclosed in their freaking policy.
          • by kasperd (592156) on Sunday May 12, 2013 @04:58AM (#43701055) Homepage Journal

            Apple claims that it uses AES with a 128 bit key, so if they can unlock it that quickly they MUST have a backdoor to the encryption key.

            The input provided by the legitimate user for decrypting the content has way less than 128 bits of entropy, so they just need to brute-force that input. What Apple can do, which the forensics people might not know how to do, is extract the encrypted data and put it on a computer, where brute-forcing can happen without each input having to be entered through a touch screen. Any security one might think this adds is nothing but security-through-obscurity. Real security of the encryption could only be achieved by the user entering some sort of password with sufficient entropy. A 39-digit PIN code would be sufficient to make AES the weakest point. But would anybody use a 39-digit PIN on their phone? Anything less would make the PIN easier to brute-force than AES.

            You can shift the balance a bit by iterating the calculation which produces a key from the PIN code. A million iterations would probably be acceptable from a user-experience perspective, but that would only reduce the required number of digits from 39 to 33. A billion iterations would not be good for the user experience, since users would then have to wait quite some time after entering a PIN. And with the PIN still needing to be 30 digits long, they'll often need to re-enter it multiple times before they get it right.
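The arithmetic in this comment checks out and can be reproduced directly: each digit of a random PIN is worth about 3.32 bits, and every factor-of-ten increase in KDF iterations shaves roughly one digit off the requirement.

```python
import math

AES_BITS = 128
BITS_PER_DIGIT = math.log2(10)  # ~3.32 bits of entropy per random digit

def digits_needed(kdf_iterations: int = 1) -> int:
    """Digits of random PIN needed so that brute-forcing the PIN costs as
    much as brute-forcing AES-128, when the KDF multiplies the cost of
    each guess by `kdf_iterations`."""
    effective_bits = AES_BITS - math.log2(kdf_iterations)
    return math.ceil(effective_bits / BITS_PER_DIGIT)

print(digits_needed())        # 39 digits with a trivial KDF
print(digits_needed(10**6))   # 33 digits with a million iterations
print(digits_needed(10**9))   # 30 digits with a billion iterations
```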

            • by AmiMoJo (196126) *

              I don't know about the iPhone but Android lets you enter a password for encryption, not just a PIN. You enter it once when the phone is turned on, so it isn't a big inconvenience to pick a secure one.

              It isn't a question of if Apple can unlock the phone due to the user choosing a poor password. They can always unlock it. Someone else can confirm if they were just stupid and only allowed you to enter a PIN number instead of a real password, or if they have a copy of the key.

          • by sribe (304414)

            No, the backlog is 4 months. Nobody knows how long actual decryption takes, but the nature of these things is that it will either be minutes or thousands of years with a supercomputer dedicated to the task. Apple claims [apple.com] that it uses AES with a 128 bit key, so if they can unlock it that quickly they MUST have a backdoor to the encryption key.

            It would be proof only if the user had to enter the 128-bit key to access the phone, but that of course is not the case. The user only enters a short passcode, so the key is stored somewhere in the device, protected only by whatever encryption/scrambling they can do to it with a relatively short passcode.

            This is absolute proof that they have your encryption key on file somewhere. Others have already verified that they do indeed use AES 128.

            It is proof of no such thing; your statement is absolutely wrong.

            • by AmiMoJo (196126) *

              The user only enters a short passcode

              Can you absolutely confirm that you must enter a short passcode, rather than an arbitrary length password? Android allows the latter. If iOS only allows short numerical codes then... well, it's shit.

              • by sribe (304414)

                Can you absolutely confirm that you must enter a short passcode, rather than an arbitrary length password? Android allows the latter. If iOS only allows short numerical codes then... well, it's shit.

                By "short", I meant significantly shorter than the hex (or base-64) version of a 128-bit key--not 4 or 6 digits. Default is 4 digits, but simply clicking the "simple passcode" option to off gets you a full keyboard for entry.

      • by blaster (24183) on Sunday May 12, 2013 @12:50AM (#43700305)

        Apple does not have a backdoor per se. But Apple does have the device signing key and can thus completely compromise the chain of trust. The only things stopping you from compromising a phone with a 4-digit passcode in seconds by brute-forcing it are the fact that software rate-limits attempts, and the option to have it delete its intermediary keys after 10 bad attempts. If you have the ability to load an arbitrary kernel it is trivial to bypass both of these, but only Apple has that capability, at least on devices without jailbreaks that can be executed while locked.

        If you want to make sure your data is secure then use a full password and not a PIN, which will make Apple's ability to run code moot since brute forcing it will not be practical any more. You can look at https://acg6415.wikispaces.com/file/view/iOS_Security_May12.pdf/343490814/iOS_Security_May12.pdf [wikispaces.com] for more info on the actual architecture.

        • Wish I had my mod points today...
        • The claim that "Apple does not have a backdoor per se" basically cannot be proven unless you have the full source code. Moreover, nothing will stop real hackers from desoldering the flash and attaching it to a reader. Also, I've never seen a modern device that does not have some JTAG or similar debug port that can be used to reprogram the very bootloader that verifies the digital signatures of bootable code. The times when the BIOS was pluggable are gone.

          • by tlhIngan (30335)

            The claim that "Apple does not have a backdoor per se" basically cannot be proven unless you have the full source code. Moreover, nothing will stop real hackers from desoldering the flash and attaching it to a reader. Also, I've never seen a modern device that does not have some JTAG or similar debug port that can be used to reprogram the very bootloader that verifies the digital signatures of bootable code. The times when the BIOS was pluggable are gone.

            Except around the 3Gs era, Apple started hard-encrypting the

          • by blaster (24183) on Sunday May 12, 2013 @03:00AM (#43700677)

            Would you have preferred it if I had written "Apple does not actually need a backdoor per se in order to perform the actions mentioned in the article"? My point was that what law enforcement is asking for does not require a backdoor, since a lot of posters seem to think it implies there must be one. Furthermore, security researchers can and do examine how all the signing keys etc. are structured on running systems, even without source code access. Is there a chance there is still something hidden? Sure, but there is also a chance someone snuck a root exploit into an innocuous-looking commit in an important open source project. Source code access generally does lead to more trustworthy code, but it isn't as black and white as you claim. In the end we depend on people to validate what we use, and just having the source available is not in and of itself validation.

            As for the rest of your comments, you simply don't know what you are talking about, but you would if you had actually read the PDF I linked. First off, rewriting the bootloader via JTAG is not an option on a lot of SoCs and embedded devices once they have had some of their internal fuses blown. From the PDF:

            "When an iOS device is turned on, its application processor immediately executes code from read-only memory known as the Boot ROM. This immutable code is laid down during chip fabrication, and is implicitly trusted. The Boot ROM code contains the Apple Root CA public key, which is used to verify that the Low-Level Bootloader (LLB) is signed by Apple before allowing it to load."

            So the stuff in flash might be rewritable, but it won't be executed unless it is signed. Reading the raw flash is also completely useless, because all data written to it is AES encrypted via a DMA engine in the SoC that uses various different keys, all of them tied to or derived from values fused into the processor and not readable via software or JTAG (they are routed directly to the DMA block and never exposed). That means the brute force needs to be attempted on the SoC in that particular iPhone, or you need to drastically increase the search space. A suitably advanced attacker could probably also obtain the SoC keys by decapping the chip, dyeing it, and looking at the fuses with a scanning electron microscope, but I generally don't worry about an attacker with those sorts of resources; they would probably just beat my PIN out of me...

          • Except - the PDF linked to specifically states that the encryption is dependent on the silicon within the device. The chip identifiers on the device are part of the encryption. Storage removed from the device is unreadable until the storage media is returned to the device.

            "The content of a file is encrypted with a per-file key, which is wrapped with a class key
            and stored in a file’s metadata, which is in turn encrypted with the file system key. The
            class key is protected with the hardware UID and,
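The quoted key hierarchy (per-file key wrapped by a class key, file metadata encrypted with the file-system key, class key protected by the hardware UID) can be sketched as nested wraps. XOR is a toy stand-in for the real AES key wrap here, and the key names simply mirror the quote; nothing about the real key sizes or wrap algorithm is implied.

```python
import os

def xor_wrap(key: bytes, kek: bytes) -> bytes:
    # Toy stand-in for AES key wrap; XOR is its own inverse, which keeps
    # the wrap/unwrap symmetry of the sketch obvious.
    return bytes(a ^ b for a, b in zip(key, kek))

hardware_uid_key = os.urandom(16)  # fused into the SoC; never leaves it
class_key        = os.urandom(16)
file_system_key  = os.urandom(16)
per_file_key     = os.urandom(16)

wrapped_class_key = xor_wrap(class_key, hardware_uid_key)  # UID-protected
wrapped_file_key  = xor_wrap(per_file_key, class_key)      # stored in metadata
wrapped_metadata  = xor_wrap(wrapped_file_key, file_system_key)

# Unwrapping requires walking the whole chain back down through the
# UID-protected class key, which is why flash removed from the device
# (and from its fused UID key) is unreadable.
recovered_class = xor_wrap(wrapped_class_key, hardware_uid_key)
recovered = xor_wrap(xor_wrap(wrapped_metadata, file_system_key),
                     recovered_class)
assert recovered == per_file_key
```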

      • by pitchpipe (708843)

        The real issue is that Apple has access to its customers' private backdoor, and that they don't like lube.

        You had some words switched around there.

        It is irrelevant how much Apple spends to operate that backdoor.

        Agreed.

      • by Cyberax (705495)
        Are you stupid? Apple holds your encryption keys in escrow so you can restore them if you accidentally forget them. Everybody with a couple of functioning brain cells should know that if a company can restore password for you then they can do this for law enforcement as well.
      • by sribe (304414)

        You're kidding, right? The real issue is that Apple has a backdoor to decrypt its customers' private information. That is outrageous.

        They don't have a backdoor. They just have the skills to get a copy of the encrypted data so they can bypass the 10-failure limit at the lock screen and brute-force the pass code.

    • by PopeRatzo (965947)

      If they're going to expect Apple to spend time doing their work for them, are they at least compensating them for the time and energy necessary for this?

      They are being compensated by not being prosecuted for tax evasion. I seriously doubt that Apple's claim that 2/3 of its profits come from outside the U.S. would stand up to any serious scrutiny.

      Even putting aside the issue of Apple keeping all its patents in offshore shell corporations that are nothing but mail drops.

  • by Aryden (1872756)
    I wonder if they just overwrite the password hash....
  • by jtownatpunk.net (245670) on Saturday May 11, 2013 @10:25PM (#43699723)

    The summary talks about decrypting the data on the phones. The articles talk about getting past the lock screen on the phones. Those are two entirely different things. On my phone, I have to first enter the decryption code before I'm presented with the lock screen.

    • by Sycraft-fu (314770) on Saturday May 11, 2013 @10:30PM (#43699769)

      Most phones aren't encrypted and usually the company can bypass it. For example with Android phones tied to a Gmail account, Google can bypass the lock screen. So if you forget your password, that is a recovery mechanism. Also, data can be accessed if you physically remove the flash chip from the phone and put it in another reader. Lock screens are protection against most kinds of attacks, not high-level security. Most people don't need high-level security though, so it works well.

      You can also encrypt your phone. Well I presume you can encrypt iPhones, having not owned one I don't know. You can encrypt Blackberries and Androids. There you set a key and it does basically a full-disk encryption type of thing. You have to enter the key to access the device at all (whereas lock screen lockouts will allow some stuff to happen) and there is no recovery. If you forget the password, you're boned, flash the device and start over. Few people do that because it is not pushed and is inconvenient.

      It is also more security than is generally useful. Most people are worried about someone running up a phone bill, or getting at their account information, if their phone is stolen. A lock screen stops that. Device encryption is needed only against more serious threats, hence most don't use it.

      • Most phones aren't encrypted and usually the company can bypass it. For example with Android phones tied to a Gmail account, Google can bypass the lock screen. So if you forget your password, that is a recovery mechanism.

        The person you replied to is correct; the article is about the passcode.
        FTA: "the Apple legal specialist, told him that 'once the Apple analyst bypasses the passcode, the data will be downloaded onto a USB external drive'"

        I have a Google tablet (Motorola XOOM MZ604); the only way to bypass the password is to reset the unit. One could do this and then run forensics on the SSD, but that too is a lot of work (money).

        The Google tablet is the only device whose password I've bypassed (by resetting), for a friend.
        I would hope the res

    • Pretty convinced you've hit the nail on the head. This isn't an issue of cracking encryption but simply of gaining initial access to the phone via PIN.
  • by pitchpipe (708843) on Saturday May 11, 2013 @10:58PM (#43699895)

    Court documents show that federal agents were so stymied by the encrypted iPhone 4S of a Kentucky man accused of distributing crack cocaine that they turned to Apple for decryption help last year... Because the waiting list had grown so long, there would be at least a 7-week delay...

    As soon as they are able to get these phones decrypted, this war on drugs will be won!

  • Maybe I should buy a copy of PhoneView (http://www.ecamm.com/mac/phoneview/) and setup my own computer forensics firm.
  • by Frankie70 (803801) on Saturday May 11, 2013 @11:01PM (#43699911)

    Unless the iPhone has a backdoor - the effort required for either Apple or others should be the same. Does this mean that the iPhone has a backdoor?

    • by csumpi (2258986)
      It does. And I just used it.
    • by steelfood (895457)

      Even if they had one, it seems it's not one that is so simple as to make unauthorized decryption effortless. I would rather think that they purposely included some design flaws into their scheme, and are using those known flaws as an exploit to (much) more easily get to the key.

      • by mlw4428 (1029576)
        So a company purposely makes shitty security so that they can break their own security whenever they want and people are OK with this? Sounds like an even better reason to stick/switch to Android. At least an open-source product has a better chance at security over some proprietary bullshit.
    • by nospam007 (722110) *

      "Unless the iPhone has a backdoor - the effort required for either Apple or others should be the same. Does this mean that the iPhone has a backdoor?"

      No, Apple removes the maximum number of tries for the password with an 'update' and runs a brute force from 0000 to 9999.

      If you use a real long password, they're fucked.

  • by Verteiron (224042) on Sunday May 12, 2013 @12:21AM (#43700175) Homepage

    Brute-forcing an iPhone's lock code is relatively trivial with freely available tools [google.com]. This puts the device in DFU mode, so "Erase device on X unlock attempts" doesn't take effect. That version of the tools only bruteforces lockcodes, but there's no theoretical reason you couldn't try at least a dictionary attack on a password, too. Since it's also possible to dump the hardware key and a complete (encrypted) image, I imagine an offline attack on the image is possible, too. You wouldn't have to rely on the relatively slow hardware in the iPhone.

    Using those tools I have successfully brute-forced the 4-digit lock code of an iDevice running 6.0.2, and that's with no prior experience with or knowledge of iOS. I even used an emulated Mac to compile the necessary firmware patch. And that's just what I was able to do with a few hours of fiddling. There are people who do this for a living, and tools dedicated specifically to extracting data from mobile devices. Are these PDs really saying they can't get into devices with simple lock codes?

    • by node 3 (115640)

      You mean to say you were able to run through the ten thousand numbers between 0000 and 9999? You must be a super-hacker!

  • by RenHoek (101570)

    This is good, right? I mean, with the DMCA even trivial protections are illegal to circumvent, so you remove from the market the people who would be capable of and interested in reverse engineering. Don't be surprised, then, when nobody can decrypt smartphones.
