Apple Is Said To Be Working On an iPhone Even It Can't Hack (nytimes.com) 405

An anonymous reader writes with this story at the New York Times: Apple engineers have already begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts. If Apple succeeds in upgrading its security — and experts say it almost surely will — the company would create a significant technical challenge for law enforcement agencies, even if the Obama administration wins its fight over access to data stored on an iPhone used by one of the killers in last year's San Bernardino, Calif., rampage. The F.B.I. would then have to find another way to defeat Apple security, setting up a new cycle of court fights and, yet again, more technical fixes by Apple.
  • Precedent (Score:5, Interesting)

    by Dorianny ( 1847922 ) on Wednesday February 24, 2016 @09:03PM (#51579393) Journal
    It would be trivial for Apple to disable all IPSW image installations without an unlock code, making what the FBI requested technically impossible. However, if the FBI were to prevail in court, the judiciary is likely to take a dim view of Apple's actions.
  • you never know who gets hacked.
  • by Xylantiel ( 177496 ) on Wednesday February 24, 2016 @09:19PM (#51579491)
    Why does Apple get headlines for doing what they should have done in the first place? Anything else is a broken, insecure device. If the vendor has a backdoor, it's not secure, whether they allow the government to access it or not.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Because other phone companies don't?

      • by AmiMoJo ( 196126 ) on Thursday February 25, 2016 @04:49AM (#51581063) Homepage Journal

        Google's Nexus devices are secure and don't have the same firmware update flaw that iPhones do. In fact, all Snapdragon 810-based phones are immune because the 810 does not allow firmware updates to the secure memory; it's a ROM burned into the silicon.

        Android has in fact offered full device encryption with the key held in secure storage for years now. Since the Nexus 6 it has been enabled by default, and Google has been pushing other vendors to enable it by default too.

        Samsung has been offering its "Knox" security for phones for many years now too. No idea if it is hackable, but it's not true to say that no one else has offered full device encryption that was claimed to be unbreakable.

        • by c ( 8461 )

          Google's Nexus devices are secure and don't have the same firmware update flaw that iPhones do.

          No. But I suspect Google could push a Google services update targeted at a specific phone, and those can do darn near anything. I don't believe Apple is quite as prolific about OTA updates to very powerful core services; unlike Google, they can bundle that stuff into the core O/S without being worried that it won't make it to end users.

          On the other hand, the option is there to lock down an Android phone pretty solidly...

        • by Shawn Willden ( 2914343 ) on Thursday February 25, 2016 @10:45AM (#51582851)

          Google's Nexus devices are secure and don't have the same firmware update flaw that iPhones do. In fact, all Snapdragon 810-based phones are immune because the 810 does not allow firmware updates to the secure memory; it's a ROM burned into the silicon.

          As an Android security engineer I appreciate you standing up for Google, but this isn't true.

          The relevant software for device encryption includes:

          1. The system image. This contains the vold daemon which mounts the encrypted disk and configures the kernel with the key.
          2. The boot image. This contains the Linux kernel, which includes dm-crypt, the code that does device encryption.
          3. The trusted OS image (TOS). This contains the code that knows how to use device-specific hardware-bound secrets. Vold calls into it when decrypting the disk encryption key to pass to the kernel.
          4. The bootloader image. This is used to load all of the above. The details vary, but generally the TOS is verified and loaded first, then the bootloader switches out of secure mode (I'm describing the process for ARM-based devices; it's a bit different for others), then verifies and loads the boot image and boots the kernel. The kernel mounts the system image and configures dm-verity which does run-time verification of system image blocks.

          All of the above are flashable images, and replacing them would enable bypassing the security controls they implement. The bootloader image is the most critical one, since it verifies and loads both the TOS and the boot image. If you can change the keys it uses to verify those, you can change everything else. The bootloader (including the keys it contains) is signed by a key whose public part is burned into ROM. That key can't be changed, and the private key is held by the device OEM. I believe the keys used to sign the system and boot images for Nexus devices are held by Google (not sure), and the key used to sign the TOS is held by the TOS maker (Qualcomm, on the recent Nexus devices).
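
          To make that chain of trust concrete: a minimal sketch of the verify-then-load order described above, with hypothetical names and HMAC standing in for the RSA/ECDSA signature checks real devices use.

          ```python
          import hashlib
          import hmac

          # Stand-in verifier: real bootloaders check an RSA/ECDSA signature against
          # a public key; HMAC with a shared key keeps this sketch short and runnable.
          def verify(key: bytes, image: bytes, signature: bytes) -> bool:
              expected = hmac.new(key, image, hashlib.sha256).digest()
              return hmac.compare_digest(expected, signature)

          ROM_KEY = b"public-key-burned-into-rom"  # unchangeable; private half held by the OEM

          def boot(bootloader: dict, tos: dict, boot_image: dict) -> None:
              # 1. ROM verifies the bootloader, which carries the keys for everything else.
              if not verify(ROM_KEY, bootloader["image"], bootloader["sig"]):
                  raise RuntimeError("bootloader rejected")
              # 2. Still in secure mode, the bootloader verifies and loads the trusted OS.
              if not verify(bootloader["tos_key"], tos["image"], tos["sig"]):
                  raise RuntimeError("TOS rejected")
              # 3. After leaving secure mode, it verifies the boot image and boots the
              #    kernel, which mounts /system under dm-verity's block-level checks.
              if not verify(bootloader["boot_key"], boot_image["image"], boot_image["sig"]):
                  raise RuntimeError("boot image rejected")
          ```

          Compromise the key at any step and every step below it falls with it, which is why the bootloader is the most critical piece.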

          You could compromise Android device encryption with the assistance of any of these parties. Getting the OEM to sign a new bootloader allows you to provide your own versions of any of the higher-level pieces, though these things are pretty intricate and writing replacements from scratch that would work is a big, big job. If I were working for the FBI, I probably wouldn't take that approach. Getting Google to sign a modified system image would, from a technical perspective, be much better. You'd still have to brute force the password, and you'd still have to have the TOS perform a 50ms operation for each password you try, but that would be no problem for a four-digit PIN. If the user used, say, an eight-character password, though, it wouldn't be enough. Also, Google's response to a request for a modified system image would probably be about the same as Apple's.
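
          Back-of-the-envelope numbers for that (my arithmetic; the eight-character alphabet is an assumption):

          ```python
          # Worst-case brute-force time at one 50 ms TOS operation per guess.
          MS_PER_GUESS = 0.050

          pin_space = 10 ** 4  # four-digit PIN
          pw_space = 62 ** 8   # eight chars drawn from [a-zA-Z0-9] (assumption)

          print(pin_space * MS_PER_GUESS)                     # 500 seconds: minutes, not days
          print(pw_space * MS_PER_GUESS / (3600 * 24 * 365))  # ~346,000 years
          ```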

          The best point of attack would be Qualcomm (for recent Nexus devices; other platforms and older Nexus devices use different TOSes). Get them to sign a TOS image that takes the device secrets and simply exports them in response to some request. With those secrets in hand, and a copy of the device flash, you can then brute force the device encryption key off-device, on big hardware. No realistic user password would stand up to that. The process is complicated so I won't bother explaining it here, but it would be very doable.
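
          A rough sketch of that off-device attack, assuming the hardware-bound secret has already been exported and that the key derivation is something scrypt-like; both are assumptions, since the real derivation is device- and TOS-specific:

          ```python
          import hashlib

          def derive_key(password: str, hw_secret: bytes) -> bytes:
              # Illustrative KDF: the real TOS mixes the user's password with a
              # device-unique secret it never normally reveals.
              return hashlib.scrypt(password.encode(), salt=hw_secret,
                                    n=2 ** 14, r=8, p=1, dklen=32)

          def crack(hw_secret: bytes, key_is_correct, candidates):
              # With the secret in hand, every guess runs on big hardware:
              # no 50 ms round-trip, no retry counter, no wipe.
              for pw in candidates:
                  if key_is_correct(derive_key(pw, hw_secret)):
                      return pw
              return None

          # e.g. all four-digit PINs:
          # crack(secret, check, (f"{i:04d}" for i in range(10_000)))
          ```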

          To be clear, the Android security team considers these multiple points of entry a bug, not a feature. I, personally, want to get to a state where if you don't have the user's password, you aren't getting in, barring direct attacks that involve peeling apart chips to extract secrets. Doing that requires a separate secure processor (something most Android devices don't have) running non-updateable software. Working to make this possible is one of my current projects.

          It's a much tougher problem in the Android world than for Apple, though, because of all of the players in the ecosystem. Not because they're unwilling...

    • by timholman ( 71886 ) on Wednesday February 24, 2016 @10:28PM (#51579829)

      Why does Apple get headlines for doing what they should have done in the first place? Anything else is a broken, insecure device. If the vendor has a backdoor, it's not secure, whether they allow the government to access it or not.

      Apple's encryption is still very secure. It hasn't been broken, and even Apple won't be able to break it for the FBI. What the FBI wants Apple to do is hack the unlock code for them.

      The only "vulnerability" is this case is that Apple potentially has the ability to push new firmware onto this model of iPhone (the 5c) using its own signed certificate, even if the phone is locked. The FBI wants this new firmware to do two things: (1) bypass the "10 wrong tries on the unlock code and the iPhone erases itself" routine and (2) reduce the time interval between unlock code entries. Once this is done, the FBI will brute force input combinations until the iPhone unlocks.

      The only problem is that Apple hasn't written this firmware. Even if the firmware existed, you'd need Apple's own certificate to push it onto the iPhone. So the iPhone is still quite secure, relatively speaking, provided the courts don't compel Apple to develop a forensics tool for the FBI at Apple's expense.

      Of course, Apple doesn't want this situation to ever, ever happen again. You can bet the iPhone 7 will plug this potential vulnerability by making it impossible for anyone to push firmware onto a locked iPhone, even with Apple's own certificate. At that point, the FBI will no doubt petition Congress to legislate that Apple (and Google, Samsung, LG, etc.) provide a means for altering the firmware of any smartphone sold in the U.S., on court order. And that's when this fight will really get interesting.

    • Why does Apple get headlines for doing what they should have done in the first place?

      Why do you think Apple should have "in the first place" required a PIN code to install an OS update? As a technologist, do you not find it reasonable that you should be able to put the phone into a recovery mode and then install the OS again in case something was messed up?

      Indeed if it's what they "should have done" then you must be apoplectic that no other company has taken this "obvious" step to date.

      Should you be required to...

      • The best way to handle it is: if the unlock code is provided, you can update the OS software and device firmware without wiping the encryption keys; if the unlock code is not provided, the update is still allowed, but the encryption keys are wiped first. Since the encryption is all done in a hardware chip with its own separate OS and update process, it would not be difficult to accomplish.
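
        A sketch of that policy (hypothetical names, standing in for what the secure chip would enforce):

        ```python
        def erase_key_material():
            # Hypothetical secure-element operation: without these keys, the
            # ciphertext on flash is permanently useless.
            print("keys wiped; existing data unrecoverable")

        def apply_firmware_update(unlocked: bool, install_image):
            # Updates always proceed, but on a locked device they cost the
            # encryption keys -- and therefore the data they protect.
            if not unlocked:
                erase_key_material()
            install_image()
        ```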

    • by wvmarle ( 1070040 ) on Wednesday February 24, 2016 @10:58PM (#51579941)

      What is more: the current line of products, with their "secure enclave" chip and so on, is already supposedly unbreakable by Apple itself. So is this an admission that Apple can actually break into the current iPhone 6 line? Or am I missing something here?

      • What is more: the current line of products, with their "secure enclave" chip and so on, is already supposedly unbreakable by Apple itself. So is this an admission that Apple can actually break into the current iPhone 6 line? Or am I missing something here?

        More secure in the sense of resisting attacks on the encryption itself, since part of the key is embedded in silicon and "unreadable"? That is something quite different from your passcode, which is normally all that prevents your data from being decrypted by all this fancy hardware. Unless the passcode retry delay is burned into silicon as part of a processor, it would seem to be software that is patchable. If so, the only thing the FBI needs is for Apple to digitally sign a tampered iOS or firmware.

        On a positive note, if App...

    • Presumably you've made a perfectly secure smartphone yourself--that would certainly justify your 'holier than thou' attitude. Can you point me to where I can buy it?

      Failing that, just point me to any perfectly secure consumer computing device. Go ahead, I'll wait.

  • by swell ( 195815 ) <jabberwock@poetic.com> on Wednesday February 24, 2016 @09:23PM (#51579517)

    Can God make a chili pepper so HOT that even He can't eat it?
    Yeah, makes you think, doesn't it?

  • Android? (Score:5, Interesting)

    by irrational_design ( 1895848 ) on Wednesday February 24, 2016 @09:27PM (#51579533)

    What I haven't heard yet is where Android lands on the security spectrum. Are they already as or more secure than what the rumors are now saying Apple is trying to achieve? Are they as or more secure than where Apple is right now? Are they as or more secure than where Windows is right now?

    • by armanox ( 826486 )

      I think that falls on the individual implementation of the phone. If my understanding is correct, the operating system does support being at least that secure, but that doesn't mean that the version of Android that actually ships is, or that the phone's hardware supports it either. That's the downside of the fragmented Android community - there are few baselines.

    • Re:Android? (Score:5, Informative)

      by VValdo ( 10446 ) on Wednesday February 24, 2016 @09:52PM (#51579669)

      I think it depends on the OEM. There are factors such as whether the device storage is encrypted by default, whether the bootloader is locked by default, what kind of security hardware is available on the SoC and whether it is used, whether exploits are patched, whether there is a continuing roll out for discovered exploits, whether updates are automatically installed w/o authentication, whether the baseband contains known exploits and attack vectors (cough), etc.

      So there's no one answer, because there's no one Android device, and many phone OEMs (and the manufacturers of the underlying hardware platform) may be implementing security to different degrees. Though many of these considerations do have Google guidelines and policies in place, some of which may be enforceable via Google compatibility tests, there is a wide spectrum of what you can expect from Android, generally speaking.

      You might look to Google's policies and recommendations, and more importantly their Nexus devices themselves as models for what they consider best practices to be. Then there is blackphone [silentcircle.com] and other distros that have security as their primary focus, so they may be good to consider as well.

    • Re:Android? (Score:5, Informative)

      by Shawn Willden ( 2914343 ) on Thursday February 25, 2016 @10:55AM (#51582985)

      What I haven't heard yet is where Android lands on the security spectrum. Are they already as or more secure than what the rumors are now saying Apple is trying to achieve? Are they as or more secure than where Apple is right now? Are they as or more secure than where Windows is right now?

      Android devices with L or M are roughly as secure as the pre-Secure Enclave Apple devices (like the 5C). That is, the security software is all in flashable components which are signed, and if the holder of the signing keys can be coerced into signing a custom image, it's possible to bypass all of the anti brute-force protections. Brute force is still necessary, then, but it's trivial for four-digit PINs and may be feasible even for better passwords (or patterns).

      That's in general. Some OEMs have gone a bit further, such as Samsung's KNOX. I don't know the details and can't comment on whether or not they actually improved the security above the baseline required/defined by Google.

      I'm the Google Android engineer responsible for lots of these bits.

  • by Anonymous Coward

    Than some stupid phone.

  • by the_Bionic_lemming ( 446569 ) on Wednesday February 24, 2016 @09:51PM (#51579663)

    The U.S. Government can conceivably ban the sale or possession of that type of phone.

    They do it all the time with other products, or require licensing, training, and oversight after purchase.

  • I have to wonder (Score:5, Insightful)

    by Krishnoid ( 984597 ) on Wednesday February 24, 2016 @09:54PM (#51579683) Journal

    I suspect that Tim Cook, as an LGBT individual, has an intimate, proximate, and/or cultivated personal interest, with historical and current backing, in personal privacy. In these particular circumstances, it would express itself as the importance of data privacy on a personal device.

    If I had to guess, it could come down through the ranks indirectly as unstated support from the top.

    • by swb ( 14022 )

      I've always thought that since he came out. It seems like concern for privacy would be a fairly strong value for a man who lived in fear of being exposed.

  • by hsmith ( 818216 ) on Wednesday February 24, 2016 @09:57PM (#51579695)
    to the data on the phone (disabling wipe after 10 attempts) - is the phone really all that secure?
  • Whats going on (Score:5, Insightful)

    by Smiddi ( 1241326 ) on Wednesday February 24, 2016 @10:10PM (#51579741)
    The security "war" is not longer about country versus country, but about "the people" versus the government.
  • Missing the point (Score:5, Insightful)

    by argumentsockpuppet ( 4374943 ) on Wednesday February 24, 2016 @11:12PM (#51580015)

    I RTFA this time. It, like so many other articles, missed the actual legitimate issues of the case. Every opinion that says Apple should "unlock the phone" or "decrypt the phone" misses the point that Apple must create software which doesn't exist. Whether Apple should do that or not is itself an interesting discussion, but the real issue here is whether government agencies should be able to force software companies to create hacking software, especially when the software company isn't accused of breaking any law in the case.

    I don't have any issue with the idea that a government agency should be allowed to create hacking software. I wouldn't object if the NSA had required Apple to sign a software update created by the NSA for the purpose of hacking into the phone. In fact, I think that's what the government should do. However, I'm very troubled by the fact that most people are in favor of Apple being forced to unlock a phone when that's not what is really going on.

    Compulsion of speech is an issue that has been supported in food labeling laws and denied in other cases. Creating software is fundamentally different from providing existing information. I believe creation of software is a form of speech, and I think the courts have upheld that viewpoint, so this case really hinges on whether a judge, under the All Writs Act, has the authority to force someone - not even someone accused of a crime - to create something new.

    I think it is important in this discussion to understand how the software the government wants Apple to create would work. Apple updates happen automatically for phones which automatically connect to a known wifi access point. Those updates don't just get pulled from Apple, though: the phone creates a code which is encrypted with Apple's public key, so that only Apple, with its private key, can decrypt it. The update is then provided to the phone, with the code provided by the phone re-encrypted so that only the phone can decrypt it, and only then is the update, signed with Apple's key, loaded onto the phone.
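
    That flow, as described above, is a challenge-response. A minimal sketch of its shape, with HMAC standing in for Apple's actual signing and encryption, and every name hypothetical:

    ```python
    import hashlib
    import hmac
    import os

    APPLE_KEY = b"held-only-by-apple"  # stand-in for Apple's private signing key

    def sign(blob: bytes) -> bytes:
        # HMAC as a stand-in; a real update carries a signature the phone
        # checks against a public key baked into the device.
        return hmac.new(APPLE_KEY, blob, hashlib.sha256).digest()

    def phone_requests_update() -> bytes:
        return os.urandom(16)  # fresh per-request code ("nonce")

    def apple_personalizes(update_image: bytes, nonce: bytes):
        blob = update_image + nonce
        return blob, sign(blob)

    def phone_accepts(blob: bytes, sig: bytes, nonce: bytes) -> bool:
        # The signature must verify AND embed this phone's own nonce, so a
        # response produced for one request can't be replayed onto another.
        return hmac.compare_digest(sign(blob), sig) and blob.endswith(nonce)

    nonce = phone_requests_update()
    blob, sig = apple_personalizes(b"ios-image-bytes", nonce)
    assert phone_accepts(blob, sig, nonce)
    ```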

    If the government wanted to, they could require Apple to provide source code to their existing software and the government could modify it and either ask Apple to sign it or require Apple to provide its private key. However, by requiring Apple to create the hacking software, they're introducing an idea that software companies cannot refuse to create software when required by the government. Once someone does something for a government official, often that's taken as a reason that the government can require them to do it again. (See In re Boucher - case citation: No. 2:06-mj-91, 2009 WL 424718)

    Apple had asked that the request be sealed, thus kept secret and not able to be used as precedent, but the Department of Justice refused and thus made their request both public and able to be used as precedent. If they succeed in forcing Apple to create hacking software, they get access to the information on this phone, but more importantly, the hundreds or thousands of phones they'd like to access are much more likely to be accessed by forcing Apple to repeat the process over and over. Apple doesn't want to be in the business of creating hacking software for the government.

    Much of law enforcement would consider this a victory, but I think the FBI is hoping to lose this case, as a general might be willing to lose a battle in order to win the bigger war. By losing the case, the FBI gains public support that they can use to pressure Congress to create laws forcing software companies to build in backdoors. Such a thing could be done securely, so that it wouldn't open the software to hackers. I have zero faith that Congress or software companies actually would do it in a secure way, but that's not the reason I am against the backdoor. Encryption is math, and the math is known and freely available to anyone who searches for it. The ability to create securely encrypted software is something that can't be made to disappear, but it can be made illegal to do in the US. By d...

    • Re: (Score:2, Insightful)

      I'm very sorry to tell you so, but Apple needn't create software that doesn't exist. It needs to modify an existing piece of software, called firmware, that sets a limit on the number of attempts with a wrong password before deleting data on the phone, and it needs to remove the delay they introduced between attempts to keep an automated system from trying passwords at a rate no human can. So the piece of software exists, and the modification is about two lines of code and maybe something like less than 10 characters to change in the code.
      • by kybred ( 795293 )

        Most of the rhetoric from Tim Cook is pure bullshit in this case. He tries to expand the request to all iPhones in order to create a wave of sympathy and pose as a champion of privacy while in reality he doesn't give a shit, unless this can be a sales point. Pure marketing here.

        Perhaps you missed this story [macrumors.com]

        The twelve cases are similar to the San Bernardino case in that prosecutors have sought to use the 18th-century All Writs Act to force Apple to comply, but none are related to terrorism charges and most involve older versions of iOS software.

      • by shess ( 31691 ) on Thursday February 25, 2016 @01:35AM (#51580555) Homepage

        I'm very sorry to tell you so, but Apple needn't create software that doesn't exist. It needs to modify an existing piece of software, called firmware, that sets a limit on the number of attempts with a wrong password before deleting data on the phone, and it needs to remove the delay they introduced between attempts to keep an automated system from trying passwords at a rate no human can. So the piece of software exists, and the modification is about two lines of code and maybe something like less than 10 characters to change in the code.

        So if the government handed you a piece of paper and said "Read this into the microphone", you'd consider that not to be restricting your freedom of speech because you didn't have to actually create the message yourself?

        This Apple software is written a certain way for reasons specific to the desired functionality. Just like you might choose specific words to get across your specific point, and might not agree to choose alternate words which make an entirely different point.

  • The government opposing currently-undefeatable encryption is incongruous with the supposed constitutional right to privacy (which, by the way, isn't there, but the Supreme Court said it is). Consider the following excerpt from the majority opinion in Roe v. Wade:

    The principal thrust of appellant's attack on the Texas statutes is that they improperly invade a right, said to be possessed by the pregnant woman, to choose to terminate her pregnancy. Appellant would discover this right in the concept of personal "liberty" embodied in the Fourteenth Amendment's Due Process Clause; or in personal, marital, familial, and sexual privacy said to be protected by the Bill of Rights or its penumbras.

    The Constitution does not explicitly mention any right of privacy. [T]he Court has recognized that a right of personal privacy, or a guarantee of certain areas or zones of privacy, does exist under the Constitution. This right of privacy, whether it be founded in the Fourteenth Amendment's concept of personal liberty and restrictions upon state action, as we feel it is, or, as the District Court determined, in the Ninth Amendment's reservation of rights to the people, is broad enough to encompass a woman's decision whether or not to terminate her pregnancy. The detriment that the State would impose upon the pregnant woman by denying this choice altogether is apparent. Specific and direct harm medically diagnosable even in early pregnancy may be involved. Maternity, or additional offspring, may force upon the woman a distressful life and future. Psychological harm may be imminent. Mental and physical health may be taxed by child care. There is also the distress, for all concerned, associated with the unwanted child, and there is the problem of bringing a child into a family already unable, psychologically and otherwise, to care for it. In other cases, as in this one, the additional difficulties and continuing stigma of unwed motherhood may be involved. All these are factors the woman and her responsible physician necessarily will consider in consultation.

    Apply the same reasoning, and you'd have:

    The principal thrust of appellant's attack on the application of the All Writs Act is that it improperly invades a right, said to be possessed by the owner of the smartphone, to choose to erase his or her data. Appellant would discover this right in the concept of personal "liberty" embodied in the Fourteenth Amendment's Due Process Clause; or in personal, marital, familial, and sexual privacy said to be protected by the Bill of Rights or its penumbras.

    The Constitution does not explicitly mention any right of privacy. [T]he Court has recognized that a right of personal privacy, or a guarantee of certain areas or zones of privacy, does exist under the Constitution. This right of privacy, whether it be founded in the Fourteenth Amendment's concept of personal liberty and restrictions upon state action, as we feel it is, or, as the District Court determined, in the Ninth Amendment's reservation of rights to the people, is broad enough to encompass a person's decision whether or not to erase data stored on his or her computing devices. The detriment that the State would impose upon the device owner by denying this choice altogether is apparent. Specific and direct harm may be involved. Data, or even the disclosure of personal contact information, may force upon the owner a distressful life and future. Psychological harm may be imminent. Mental and physical health may be taxed by the damage done to interpersonal relationships. There is also the distress, for all concerned, associated with the data, and there is the problem of removing the data, once disclosed by a third party, from a world of interconnected computing devices designed for data retention. In other cases, as in online dating service users, the additional difficulties and continuing stigma of adultery may be involved. All these are factors the device owner should consider when configuring his device.

    The court has already established a precedent here that saving a life is subordinate to the right to privacy.
