Slashdot Asks: Should FBI Reveal to Apple How to Unlock Terrorist's iPhone? (latimes.com) 286

After reports that the FBI managed to unlock an iPhone 5c belonging to one of the San Bernardino shooters without Apple's help, the roles have reversed: Apple is now the one that needs the FBI's assistance. "The responsible thing for the government to do is privately disclose the vulnerability to Apple so they can continue hardening security on their devices," said Justin Olsson, product counsel at security software maker AVG Technologies. However, many experts in the field believe that the government isn't legally obligated to provide the information to Apple. As the Los Angeles Times notes, this creates a new ethical dilemma: Should tech companies be made aware of flaws in their products, or should law enforcement be able to deploy those bugs as crime-fighting tools?
This discussion has been archived. No new comments can be posted.

  • Didn't (Score:3, Insightful)

    by Anonymous Coward on Wednesday March 30, 2016 @05:51PM (#51811177)

    They didn't hack the phone - they're just trying to save face by saying they don't need Apple's help anymore.

    • Re:Didn't (Score:4, Insightful)

      by taustin ( 171655 ) on Wednesday March 30, 2016 @06:09PM (#51811313) Homepage Journal

      And convince terrorists worldwide to use other - less secure - phones. It's not the best outcome for them, but it's better than getting handed their ass in the PR battle, like they were.

      • Re: (Score:2, Flamebait)

        by GrandCow ( 229565 )

        Apple has said from the start that the security on the phone in question was hackable, and that later generations include features like the Secure Enclave that make the only known means of hacking this particular phone obsolete.

        That's why the case was bullshit from the beginning. The FBI couldn't give a fuck about this particular phone; they wanted a precedent on record that Apple had to write custom big brother software (and digitally sign it so it could be installed without wiping the phone, potentially even

    • Re:Didn't (Score:5, Insightful)

      by VernonNemitz ( 581327 ) on Wednesday March 30, 2016 @06:30PM (#51811427) Journal
      Yeah, Apple is approaching the wrong party. That company in Israel found the flaw, and the FBI paid them to use it. Apple has so far been unwilling to encourage folks to expose bugs by paying them, so....
      Logically, especially since it is well known that Apple has plenty of cash on hand to buy things, Apple should buy the vulnerability, instead of expecting to get it for free from the Feds. How greedy do you think ordinary folks are willing to let Apple be, in such circumstances?
      • by tlhIngan ( 30335 )

        Logically, especially since it is well known that Apple has plenty of cash on hand to buy things, Apple should buy the vulnerability, instead of expecting to get it for free from the Feds. How greedy do you think ordinary folks are willing to let Apple be, in such circumstances?

        Well, you know how much iOS vulnerabilities go for? Bug bounties that are offered by Google, Microsoft and everyone else pale. $10K? peanuts. An iOS vulnerability sells for $1M. Yes, a million dollars. Hell, Android vulnerabilities

      • Overheard at the synagogue: "... and I said why sell it once when you can sell it twice? Do these goyim take me for a schlemiel already?"

    • is it really that far-fetched for the israeli company to have a bootloader hack or a code-injection-after-boot-but-before-unlock hack?

      because that's all that was needed for hacking the PIN protection system on the iphone 5C. if you have that, then you can prevent the system from wiping the encryption key after 10 attempts and can keep guessing PIN codes indefinitely.

      and apple 99.99999% probably already knows how they did it, so what's there to tell.

      and has usa gov been telling such things? no.

      fbi is just pissed t
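
      To put the parent's point in concrete terms: once the 10-attempt wipe is out of the way, exhausting a 4-digit PIN space takes minutes, not months. A rough sketch, using an assumed ~80 ms per on-device key-derivation attempt (illustrative only; the real per-attempt cost depends on the hardware):

```python
# Illustrative estimate of PIN-bruteforce time once the wipe counter
# is bypassed. The 80 ms per-attempt figure is an assumption.
ATTEMPT_TIME_S = 0.08  # assumed time per PIN attempt

def worst_case_hours(pin_digits: int) -> float:
    """Hours to exhaust the entire PIN keyspace."""
    keyspace = 10 ** pin_digits
    return keyspace * ATTEMPT_TIME_S / 3600

print(f"4-digit PIN: {worst_case_hours(4):.2f} h")  # about 13 minutes
print(f"6-digit PIN: {worst_case_hours(6):.1f} h")  # still under a day
```

      Either way, once attempts are unlimited, the PIN itself is no real barrier.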

    • Re:Didn't (Score:5, Interesting)

      by marcansoft ( 727665 ) <hector@marcansoft.cDEBIANom minus distro> on Thursday March 31, 2016 @01:59AM (#51812769) Homepage

      Of course they hacked the phone.

      There is a very easy, very reasonable trick that is guaranteed to work to get the data out of that phone with minimal risk (assuming it has a 4-digit PIN). It's not a mistake, it's not a bug, it's not something anyone has to "discover". It's simply an attack outside the threat model that Apple used when designing that particular iPhone (and, with minor differences, all currently released iPhones). I have no doubt Apple knows full well it will work and knew it would work when they designed the phone (it's blatantly obvious, and Apple's security engineers aren't idiots) - protecting against it is just not trivial (it cannot be solved by software, it requires support hardware) so, to this date, they've chosen not to. In fact, they added a minor roadblock against it on newer phones (but only a minor one that can also be bypassed - because doing better is Hard(TM) and costs money), which demonstrates they are fully aware of it. I explained how it works here [marcan.st] (search for "replay attack"). I'm not the first one to mention this approach.

      Making iPhone secure against all physical attacks is impossible. If your PIN is bruteforceable (as is the case here), then security relies on the PIN attempt counter. An attacker with physical possession of the phone can always find a way in. Apple just has to decide how much effort (and money) they want to put into making that harder. The current bar is at approximately the "a couple experienced hardware/software hackers and a couple thousand dollars in R&D costs" level. With some more money/effort they could raise it to the "a crazy dude like Chris Tarnovsky and a medium-budget silicon hacking lab" level. It's not going to get to the "no one will practically be able to do it" level without making the iPhone into a tamper-resistant hardware security module with physical defenses (i.e. not something likely to fit in your pocket).

      It still baffles me why everyone is so concerned about how the FBI got in, when we know an easy way in already.
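
      The replay attack described above can be sketched as a toy simulation: because the attempt counter lives in storage the attacker can image and restore, the 10-attempt limit never actually bites. Everything below is illustrative, not Apple's design; names and structure are made up:

```python
# Toy model of the "replay" idea: the attempt counter sits in
# externally imageable flash, so snapshot it, burn some guesses,
# restore the snapshot, and repeat until the PIN falls out.
import copy

class ToyPhone:
    MAX_ATTEMPTS = 10

    def __init__(self, pin: str):
        self._pin = pin               # secret (fused into the SoC in reality)
        self.flash = {"attempts": 0}  # externally readable/writable storage

    def try_pin(self, guess: str) -> bool:
        if self.flash["attempts"] >= self.MAX_ATTEMPTS:
            raise RuntimeError("device wiped")
        self.flash["attempts"] += 1
        return guess == self._pin

def replay_bruteforce(phone: ToyPhone) -> str:
    snapshot = copy.deepcopy(phone.flash)  # image the flash once
    for pin in (f"{n:04d}" for n in range(10000)):
        if phone.flash["attempts"] >= ToyPhone.MAX_ATTEMPTS - 1:
            phone.flash = copy.deepcopy(snapshot)  # roll the counter back
        if phone.try_pin(pin):
            return pin
    raise RuntimeError("PIN not found")

phone = ToyPhone("7294")
print(replay_bruteforce(phone))  # recovers the PIN without ever triggering the wipe
```

      The real attack is harder mechanically (desoldering and mirroring NAND), but the logic is exactly this: the wipe counter is only as trustworthy as the storage it lives in.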

      • Quick question...if the San Bernardino shooter had locked his own phone (by intentionally failing the password 10 times) before he left, could anyone get into it?

    • I've got my tinfoil hat on tight, so it's baseless speculation time: How do we know Apple didn't help them? They could have just done the court dance to keep up appearances, and help the Feds out on the sly. Win-win: Apple keeps their users happy and even gains extra points for standing up to the government, and they keep up good relations with the Feds.
  • Obviously the FBI should keep quiet.

    That way they can hack the phones of government officials with impunity.

    • by LWATCDR ( 28044 )

      Actually, I believe that they had a court order, so this did follow all legal requirements for a search.
      Yeah, the FBI will not say a word.

      • Not true. The FBI never had a warrant for the data.

        The FBI had permission.

        Example, a police officer knocks on your door. You invite him inside. The officer sees your heroin needle. The officer can arrest you, because you gave him permission to search your home.

        Or

        A police officer knocks on your door. You keep him outside and tell the officer to come back with a warrant. The officer suspects from the conversation that you have drugs, so he gets a court order to search your home.

        I really wish everyone u

        • by mysidia ( 191772 )

          Example, a police officer knocks on your door. You invite him inside. The officer sees your heroin needle. The officer can arrest you, because you gave him permission to search your home.

          OK, so is it possible now that Apple will file a lawsuit against San Bernardino County for soliciting and giving the FBI permission to conduct activities such as reverse-engineering or disassembly which are prohibited by the software EULA?

          What happens if you're at a neighbor's house, and you let the officer in (with

        • by LWATCDR ( 28044 )

          I was not aware that permission was given.
          Seems even worse for Apple, then. The owner of the device gave permission and Apple still refused to help. There was zero privacy issue in that case.

    • Sure, if said government officials will hand over the phone to be disassembled. Recall that this particular hack is likely NAND mirroring. That requires removing the CPU. Not something you would tend to do in bulk.

    • by AHuxley ( 892839 )
      Re "should keep quiet."
      The US gov had that hidden win with PRISM and ICREACH https://en.wikipedia.org/wiki/... [wikipedia.org]
      The cost of parallel construction was not great, but there was always the risk of a court or expert team finally asking questions about the origins of a case.
      Hidden cell phone tracking, voice prints and decryption get decades of easy access to start to build a public case.
      The press, lawyers, tech experts in the US could slowly see that not all cases got built on informants, ex convicts,
    • I don't think it matters. Apple must know that the phone can be broken into - and now have a large hint it is possible.

      But I don't believe it is the gov't who needs to tell Apple this - Apple could hire the same company and ask them how they did it.

      From an ethical hacking point of view - maybe the gov't does have a responsibility to report a vulnerability to the vendor if the attack is "simple" and poses a clear danger to the security of Americans. I believe it is a balancing act with two possibilities.

      I

  • the FBI says to Apple: "we paid XYZ to do it". FBI off the hook, and XYZ company charges Apple $2B for the answer. profit!
    • Add a term to the contract: if you become aware of a means of breaching the security of this device that you own, you are required to reveal it to Apple. Get all other mobile firms to add the term too. Then they either have to stop operating mobiles, or hand it over.
  • DMCA? (Score:5, Insightful)

    by BuckaBooBob ( 635108 ) on Wednesday March 30, 2016 @05:56PM (#51811215)

    Shouldn't Apple be chasing after them for circumventing the encryption and digital rights management system on the phone? It's what they do to people coming up with jailbreaks... why would this be different?

    • Re:DMCA? (Score:5, Funny)

      by zlives ( 2009072 ) on Wednesday March 30, 2016 @06:00PM (#51811245)

      because it's not illegal when the president does it.

    • Re:DMCA? (Score:5, Informative)

      by Duhfus ( 960817 ) on Wednesday March 30, 2016 @06:02PM (#51811259)
      No, DMCA has exceptions for law enforcement.
    • Shouldn't Apple be chasing after them for circumventing the encryption and digital rights management system on the phone? It's what they do to people coming up with jailbreaks... why would this be different?

      I was thinking about that federal law about "Unauthorized Access to a computer" and/or the "circumventing security measures" law. Both the FBI and/or the supposed "hackers" are guilty of these felonies, period.

      And before you say "Court Order", I believe it was just a PROPOSED Order; I don't think it ever became a real Order. And besides, even a Court can't enter an Order to Break the Law...

      • I was thinking about that federal law about "Unauthorized Access to a computer" and/or the "circumventing security measures" law. Both the FBI and/or the supposed "hackers" are guilty of these felonies, period.

        Be specific.

        Laws often have exceptions for law enforcement, and even when they don't, prosecutors have a massive amount of discretion in who they prosecute.

        It turns out the FBI is allowed to do a lot of things we would not want private citizens to do. Like running their own heavily armed hostage rescue team.

        Realistically, this is a balancing question--needs of the state vs. privacy, for a relatively old phone that will be out of circulation in a few years anyway. So it's not terribly important whether the

        • by swb ( 14022 )

          It turns out the FBI is allowed to do a lot of things we would not want private citizens to do. Like running their own heavily armed hostage rescue team.

          I think you could make a case for a private armed hostage rescue team, and I would guess that such an entity has existed for a long time, whether it was the Pinkertons or something like Blackwater.

          Arguably it would be preferable to have the police handle a kidnapping rescue, but you can probably invent circumstances where involving the police didn't work somehow -- expediency, corruption of local law enforcement, some kind of overseas situation.

          There's obviously a huge legal minefield here when you get into

    • by mark-t ( 151149 )
      Because the DMCA explicitly "does not prohibit any lawfully authorized investigative, protective, information security, or intelligence activity of an officer, agent, or employee of the United States, a State, or a political subdivision of a State, or a person acting pursuant to a contract with the United States, a State, or a political subdivision of a State."
  • by sgrover ( 1167171 ) on Wednesday March 30, 2016 @06:01PM (#51811251) Homepage

    If the FBI does not reveal the hack, so that they can go on hacking other phones, that means the bad guys can also continue using that hack. After all, we know that there are now at least 3 organizations who can access a locked iPhone 5c without the owner's password.

    • They're probably living in a fantasy world where the Good Guys(tm) have secure encryptions, but anyone else can be cracked.

      Quite how that's supposed to work, I cannot guess.

    • We know nothing of the sort. We can assume Apple can do so, but with no further evidence besides an official statement from the FBI, there is no reason to believe that any other organization has such capability.
  • Nope, Due Process. (Score:4, Informative)

    by MobileTatsu-NJG ( 946591 ) on Wednesday March 30, 2016 @06:02PM (#51811255)

    ...or should law enforcement be able to deploy those bugs as crime-fighting tools?

    Um, no, law enforcement doesn't get to skirt around due-process just because it's inconvenient.

  • Apple probably already knows, or could know in a day or less, and in either case the next version of the iPhone will probably be made immune to it.
  • by zenlessyank ( 748553 ) on Wednesday March 30, 2016 @06:07PM (#51811287)
    O wait....we have already bent over. It is too late folks. No one cares what you think anymore. The system is established. Only blood will wash it away. Enjoy.
    • Yesterday's flavor was indignance, so today's is defeatism? Does switching it up make your boss happy?
      • HeHe. What the fuck is indgnance? At least learn how to troll proper. Since I am the boss, yes, it does make me happy. I like to keep on my toes, flow like the wind. I already know what is going on and have done something about it. You will notice that was put in a 3rd person perspective. Maybe if you go back to school, you too will understand sentence structure and overall meaning. Sometimes humor is woven in also to weed out the trolls like your self. You validated my point perfectly. Thank you.
  • this is not unknown (Score:5, Informative)

    by supernova87a ( 532540 ) <kepler1.hotmail@com> on Wednesday March 30, 2016 @06:09PM (#51811315)
    Well, actually, we don't need to leave it to a bunch of internet commenters to decide this issue -- there is an actual process described as "equities review" which the Executive Branch is responsible for, when a cyber vulnerability is known, but not yet disclosed to the public:

    https://www.whitehouse.gov/blog/2014/04/28/heartbleed-understanding-when-we-disclose-cyber-vulnerabilities [whitehouse.gov]

    The considerations described here (in whether to reveal or keep secret a vulnerability) cover:

    -- How much is the vulnerable system used in the core internet infrastructure, in other critical infrastructure systems, in the U.S. economy, and/or in national security systems?
    -- Does the vulnerability, if left unpatched, impose significant risk?
    -- How much harm could an adversary nation or criminal group do with knowledge of this vulnerability?
    -- How likely is it that we would know if someone else was exploiting it?
    -- How badly do we need the intelligence we think we can get from exploiting the vulnerability?
    -- Are there other ways we can get it?
    -- Could we utilize the vulnerability for a short period of time before we disclose it?
    -- How likely is it that someone else will discover the vulnerability?
    -- Can the vulnerability be patched or otherwise mitigated?

    In this case, I might argue that this is becoming so well known (though the technical specifics have not been revealed), that the FBI/US had better tell Apple to make sure that other users of the affected phones can be secured -- while the intelligence value of the exploit is rapidly decreasing due to its publicity.
    • there is an actual process described as "equities review" which the Executive Branch is responsible for

      Since the FBI is part of the Executive Branch, that is pretty much a textbook conflict of interest in this instance. The FBI obviously prefers to keep the ability to circumvent encryption, regardless of whether that is a good idea.

    • Good intelligence officers have never revealed sources or methods, and never will.

      What would be new is if this principle weren't applied to the method used to crack the iPhone that San Bernardino County issued to the terrorist.

    • by bigpat ( 158134 )

      Well, actually, we don't need to leave it to a bunch of internet commenters to decide this issue -- there is an actual process described as "equities review" which the Executive Branch is responsible for, when a cyber vulnerability is known, but not yet disclosed to the public:

      https://www.whitehouse.gov/blog/2014/04/28/heartbleed-understanding-when-we-disclose-cyber-vulnerabilities [whitehouse.gov]

      The considerations described here (in whether to reveal or keep secret a vulnerability) cover:

      -- How much is the vulnerable system used in the core internet infrastructure, in other critical infrastructure systems, in the U.S. economy, and/or in national security systems?
      -- Does the vulnerability, if left unpatched, impose significant risk?
      -- How much harm could an adversary nation or criminal group do with knowledge of this vulnerability?
      -- How likely is it that we would know if someone else was exploiting it?
      -- How badly do we need the intelligence we think we can get from exploiting the vulnerability?
      -- Are there other ways we can get it?
      -- Could we utilize the vulnerability for a short period of time before we disclose it?
      -- How likely is it that someone else will discover the vulnerability?
      -- Can the vulnerability be patched or otherwise mitigated?

      In this case, I might argue that this is becoming so well known (though the technical specifics have not been revealed), that the FBI/US had better tell Apple to make sure that other users of the affected phones can be secured -- while the intelligence value of the exploit is rapidly decreasing due to its publicity.

      In bureaucratic speak all that means that as long as you can write a well worded memo of justification then you can do whatever you want.

  • It's a 5C (Score:5, Informative)

    by bill_mcgonigle ( 4333 ) * on Wednesday March 30, 2016 @06:23PM (#51811383) Homepage Journal

    Apple already knows it's hackable, that's why the 5S and newer have Secure Enclave.

    Still, they should make the FBI rue the day they tried to destroy Apple's market, however they can. Revealing that the San Bernardino phone was a ploy is the minimum they should pursue.

    Yet, ultimately I hope Apple loses an inquiry about this break because it's better for all of us if they see the unconstitutional law enforcement agencies as adversaries.

    There, now I've disagreed with both camps.

    • by ras ( 84108 )

      that's why the 5S and newer have Secure Enclave.

      And Apple also knows the Secure Enclave can be bypassed too, by anybody who has the firmware signing key. If you have it, you just upload new firmware bypassing the checks. Currently only Apple has it, of course. But that is where this all started.

      Still, they should make the FBI rue the day they tried to destroy Apple's market,

      Which is real simple to do. Put the Secure Enclave firmware in ROM, so it can't be upgraded. Then it becomes truly uncrackable from software, so the LEAs would be reduced to attacking the silicon. It's their worst nightmare.

      This is possible because the Secu

      • And Apple also knows the Secure Enclave can be by-passed too, by anybody who has the firmware signing key.

        It is also vulnerable to exactly the same external memory replay attack that non-Secure-Enclave-equipped phones are vulnerable to (i.e. the Secure Enclave is completely irrelevant to what is currently the easiest, most likely way the FBI got into the phone). I explained how all the pieces fit together in this [marcan.st] blog post.

        Which is real simple to do. Put the Secure Enclave firmware in ROM, so it can't be u

        • by ras ( 84108 )

          That's not the solution - Apple needs to be able to update the Secure Enclave firmware too, it's too complex to be reasonable to bake into a ROM forever.

          TPMs are more complex, simply because they solve a more general version of the same problem. Billions have been sold, and most of them have got along just fine without a firmware upgrade. We do know how to get bugs below 1 per 100k LOC, and I have no doubt Apple is capable of it. It's not cheap, but I doubt the expense concerns them overly.

  • The ethical choice (Score:4, Insightful)

    by Macdude ( 23507 ) on Wednesday March 30, 2016 @06:23PM (#51811385)

    The choice is between helping Apple secure the phones of millions of Americans against phone thieves, identity thieves, and virus, malware and ransomware writers, or continuing to leave those citizens vulnerable so that the government can spy on its own people.

    I know what choice I think they should make.

  • Wasn't this a 3rd party hack? Who says the FBI knows how they did it in the first place?
  • by TsuruchiBrian ( 2731979 ) on Wednesday March 30, 2016 @06:41PM (#51811503)

    Does the FBI care more about fighting crime or reducing crime? There is a common tendency for people and organizations to try to increase their own importance. So maybe the FBI could help to prevent X amount of crime (in the form of hacking, fraud, etc.) from ever happening by helping Apple fix some security flaws. But maybe they will get more credit for allowing this vulnerability to remain and exploiting it to catch a few more criminals. It's harder to appreciate crime prevention than punishment of criminals after the fact.

    If someone invented a magic security system for houses that eliminated home invasions, this might actually be bad for the prestige of law enforcement. While it will probably reduce crime (one of the purposes of law enforcement), it reduces the reliance of the population on law enforcement and therefore decreases their importance. A flaw in the security system would create the opportunity for more people to be criminals and more opportunity for law enforcement to come to the rescue. If law enforcement can in addition actually exploit this weakness to catch a few more criminals then even better.

    If the damage done by leaving the hole open exceeds the damage prevented by leaving the hole open, then it is better for society to have the hole closed, but it is not necessarily better for the FBI to have the hole closed. They won't get the blame for damage caused by a security hole unknown to the public, and they won't get any credit for the damage prevented by closing it.

    It would be nice if everyone (especially public officials) did what was best for society rather than what was best for themselves, but this is a rather hard standard to hold human beings to.

    I suspect it would be better for society to have the hole closed, but I wouldn't expect the FBI to have the kind of deep dedication to the improvement of society necessary to see that. Maybe it will be easier for them to see if they somehow become the victim (e.g. a scandal resulting from the FBI director's iphone getting hacked, etc).

    Take, for example, Nancy Pelosi. She was all for government surveillance. It was only when she became one of the targets of government surveillance that she managed to be outraged.

  • by PopeRatzo ( 965947 ) on Wednesday March 30, 2016 @06:41PM (#51811509) Journal

    Stop pretending the FBI didn't already have the crack before they brought Apple to court. They were just looking for a legal precedent.

    Second, stop pretending that Apple doesn't know how to crack your phone. This entire story was nothing but theater.

  • Now, could someone like the FBI use a fake cell tower and Emergency Call mode to bypass some security? Use it to reset the timeout on password guesses?

  • If you keep data on a phone that can be unlocked with a key that the phone is able to check, then that data is not secure, it is just very hard to get at. Why? Because the laws of physics do not allow the integrated circuits to be magical black boxes that cannot be monitored, copied and emulated. It is that simple. If you need a 100% secure phone it has to keep all of its data in the cloud, and even then only certain uncommon types of encryption are guaranteed to never be circumvented. This is important as t
    • You got the "magical black box" part right, but you got the rest wrong.

      All you have to do is use a passphrase (not a PIN) long enough to not be bruteforceable. Building a 100% secure device that limits the number of attempts at guessing an insecure PIN is impossible. Building a 100% secure device that protects your data using a secure passphrase is trivial: just use good encryption at rest.

      Putting data in the cloud, at best, does nothing for you security-wise, and at worst, makes it that much easier to get
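
      The parent's point about passphrases versus PINs is easy to quantify: keyspace grows exponentially with length, so even a modest passphrase dwarfs any PIN. A quick back-of-the-envelope (95 is the printable-ASCII alphabet size; figures are ideal-case, assuming randomly chosen characters):

```python
# Naive keyspace entropy: length * log2(alphabet size).
# Assumes every character is chosen uniformly at random.
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    return length * math.log2(alphabet_size)

print(f"4-digit PIN:        {entropy_bits(10, 4):.1f} bits")   # ~13.3
print(f"6-digit PIN:        {entropy_bits(10, 6):.1f} bits")   # ~19.9
print(f"12-char passphrase: {entropy_bits(95, 12):.1f} bits")  # ~78.8
```

      A 13-bit secret survives only as long as the attempt counter does; a ~79-bit secret survives even offline bruteforce.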

  • I'm sure all they're doing is taking the plastic off of the NV memory part, attaching a probe, and reading out what's there. Those dies are tested that way at the factory: there will be lands on there for a probe. The government can buy a few phones of the same model for experimentation to get it right, then read out the contents of the NV memory of the phone they care about.

    Once they have those contents, it's just a matter of brute-force decrypting whatever is in the personal/confidential files. Remember i

    • The NV memory part is also encrypted with a key derived from a unique key fused into the CPU SoC (that is too long to be bruteforceable). To do the attack as you describe, they'd have to take the plastic off of the SoC (not the NV part, you can just pull that off the board and read it), and then use a FIB workstation to modify the metal routing and read off the fused UID key to be able to decrypt the external memory and attempt a PIN bruteforce. I explained this and other attacks here [marcan.st]. That attack is techni
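
      The role of the fused UID can be sketched abstractly: the passcode key is derived by mixing the PIN with a per-device secret that never leaves the silicon, so the bruteforce must run on (or painstakingly emulate) that exact chip. The sketch below uses PBKDF2 purely for illustration; Apple's actual derivation runs through the hardware AES engine, and all names and constants here are made up:

```python
# Hypothetical illustration of PIN "tangling" with a per-device UID.
# NOT Apple's construction - just the shape of the idea: the same PIN
# yields different keys on different devices, so stolen NAND contents
# alone can't be attacked offline.
import hashlib

def derive_key(pin: str, device_uid: bytes, iterations: int = 10_000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_uid, iterations)

uid_a = b"\x01" * 32  # fused at manufacture, unique per device
uid_b = b"\x02" * 32

# Same PIN, different devices, different keys:
assert derive_key("1234", uid_a) != derive_key("1234", uid_b)
```

      This is why extracting the UID (via a FIB workstation, as described above) is the step that would unlock offline bruteforcing.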

      • Interesting...

        Those unique keys are probably recorded at the time of manufacture and saved to a DB (against the serial number of the phone or board). Apple complained about modifying their firmware to put in a backdoor bypassing the PIN entry procedure. I don't think they complained about handing over that CPU key when subpoenaed, or perhaps merely upon a request by the FBI. If the attacker knows the encryption function used by the NV memory controller, then they should be able to emulate that too.

        For an at

        • Those unique keys are probably recorded at the time of manufacture and saved to a DB (against the serial number of the phone or board).

          According to Apple, the UID key is generated during manufacturing and not recorded anywhere except on the device itself.

          I'd expect the software would filter out touches less than 10ms or so.

          Chinese PIN cracking devices for older versions of iOS (exploiting pin attempt counter flaws no longer available) did it via USB. I think it accepts USB HID input or something dumb like

          • Your article is well-thought out. I would wonder, though, if the UID could be read with a simple optical microscope. Presumably the UID is written to a memory cell on the SoC using links that open (like a fuse) when a high current is passed through (like the old PROM memories used to). Those links wouldn't be embedded in layers of silicon: the opening of the link would heat up and perhaps emit material that would need to be dissipated. (The link would look like this ===-=== or this === === if open.) If such

            • Presumably the UID is written to a memory cell on the SoC using links that open (like a fuse) when a high current is passed through (like the old PROM memories used to).

              Ah, this is where it gets fun. There are actually quite a few OTP storage technologies. Fuses, like what you mention, are one. They're not necessarily on top (indeed, they'd usually be on lower, finer pitch layers, since the whole point of a fuse is that it has to be thin), though, so to read them you'd still need to strip off metallization

  • Apple spits in the eye of the FBI and then people expect them to disclose the vulnerability (if that is what it was) to Apple?

    Yeah... right.

    I think it would be better if Apple spent some of its money on finding the vulnerability themselves.
