Encryption Government Security United States Apple

Barr Asks Apple To Unlock iPhones of Pensacola Gunman (nytimes.com) 195

Attorney General William P. Barr declared on Monday that a deadly shooting last month at a naval air station in Pensacola, Fla., was an act of terrorism, and he asked Apple in an unusually high-profile request to provide access to two phones used by the gunman. From a report: Mr. Barr's appeal was an escalation of an ongoing fight between the Justice Department and Apple pitting personal privacy against public safety. "This situation perfectly illustrates why it is critical that the public be able to get access to digital evidence," Mr. Barr said, calling on Apple and other technology companies to find a solution and complaining that Apple has provided no "substantive assistance."

Apple has given investigators materials from the iCloud account of the gunman, Second Lt. Mohammed Saeed Alshamrani, a member of the Saudi air force training with the American military, who killed three sailors and wounded eight others on Dec. 6. But the company has refused to help the F.B.I. open the phones themselves, which would undermine its claims that its phones are secure.

  • If the phone is joined to iCloud, can they force a backup / send a reset code?

    • by saloomy ( 2817221 ) on Monday January 13, 2020 @04:10PM (#59616894)
      The backups are secured by an encryption key that can only be unlocked with the user's password. The user's password is run through a key derivation that Apple doesn't store; that derived key encrypts the backup on the device before Apple receives it, so Apple only ever receives encrypted data. Unless they had the user's actual password (which could be used to open the device anyway), they can't access the data on the device. Everything is encrypted with keys derived from the password and/or PIN code. That's how it should be.
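
      To make that concrete, here is a minimal sketch of that kind of password-derived backup encryption in Python. It is an illustration only, not Apple's actual scheme: the function names, the iteration count, and the use of the third-party "cryptography" package are all assumptions.

        import os
        import hashlib
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def encrypt_backup(passcode: str, backup_blob: bytes) -> dict:
            # Derive a 256-bit key from the passcode; neither ever leaves the device.
            salt = os.urandom(16)
            key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 600_000, dklen=32)
            # Encrypt on the device; only salt, nonce and ciphertext get uploaded.
            nonce = os.urandom(12)
            return {"salt": salt, "nonce": nonce, "ct": AESGCM(key).encrypt(nonce, backup_blob, None)}

        def decrypt_backup(passcode: str, blob: dict) -> bytes:
            # Without the passcode there is no practical way to reproduce the key.
            key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), blob["salt"], 600_000, dklen=32)
            return AESGCM(key).decrypt(blob["nonce"], blob["ct"], None)

      Whoever stores the result holds only ciphertext plus a salt and nonce, none of which helps without the passcode.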
      • by JaredOfEuropa ( 526365 ) on Monday January 13, 2020 @04:22PM (#59616972) Journal

        That's how it should be.

        Just so. The correct answer from a tech company in response to such a request is not: "We won't" but "We can't".

        • by guruevi ( 827432 )

          We won't is also legitimate given no subpoena or other legitimate judicial order is in place.

          • by iamgnat ( 1015755 ) on Monday January 13, 2020 @04:45PM (#59617050)

            We won't is also legitimate given no subpoena or other legitimate judicial order is in place.

            Which is what makes "we can't" the correct answer.

            If they make their system so that they have no access to the data, then there is no moral dilemma when presented with a legal but unjust subpoena. I also expect that being unable to decrypt phones/backups provides them some legal protection as well.

            • They are too ignorant to understand that. Barr's crew tried to crack the iPhone but couldn't, so they think Apple has a back door. They think Apple is holding out and that iOS is just like Android, which is hackable. Barr, go back to kissing Trump's butt - at least it's the one thing you know how to do. PS - don't even try to lean on Apple. They have way more money and much better lawyers than you think.
        • Re: (Score:3, Insightful)

          by gmack ( 197796 )

          That's how it should be.

          Just so. The correct answer from a tech company in response to such a request is not: "We won't" but "We can't".

          The trouble is that now Barr will accuse Apple of not caring about the safety of the USA and will use it as an excuse for laws that mandate backdoors.

          • That's how it should be.

            Just so. The correct answer from a tech company in response to such a request is not: "We won't" but "We can't".

            The trouble is that now Barr will accuse Apple of not caring about the safety of the USA and will use it as an excuse for laws that mandate backdoors.

            Except that Apple already gave them the same answer a few years ago regarding the San Bernardino shooting. There was bipartisan condemnation of Apple at the time, but no new laws have come to pass since then. It even goes back further than that (Clinton pushed the Clipper chip, which was funded and developed under Bush Sr), yet in all the time they have been trying to legislate back doors they have had little success.

            I agree that the anti-Tech crowd will try to use this as a stick, but even the average non-tec

            • At that time, there was an Israeli security firm that was able to unlock most iPhones (if not all, at the time). Haven't heard anything recently, however; not that they would broadcast this.
        • by AK Marc ( 707885 ) on Monday January 13, 2020 @05:31PM (#59617298)
          No. The answer should be "here it is" and hand them the encrypted backup. Done. Let the FBI worry about unlocking it. Apple should be on the hook for the information requested. It's in there. We think. And they should provide it when subpoenaed.
        • by rtb61 ( 674572 )

          Technically that would be a lie of expertise, because yes, they could, given who they are: a major tech enterprise. How much effort it would take is the question. And once they figure out a way to do it, i.e. back-door the security of their own device by running an extended research program to hack the encryption, they would immediately have to close the back door they discovered, or else they could no longer claim to offer privacy and security.

          All Apple has to do is provide the complete technical specifications of the device and th

          • by ceoyoyo ( 59147 )

            Breaking encryption isn't like on TV where you let the computer work at it for a day or so (maybe a week for the really hard ones) and the plain text pops out.

            "Major tech enterprise" or not, if Apple has done a competent job of security, and they seem to have, then they *cannot* provide access.

        • by ceoyoyo ( 59147 )

          That's what Apple has insisted all along. US law enforcement seems to be engaging in magical thinking. *Obviously* if you have the data you can provide it to us....

      • by gtall ( 79522 ) on Monday January 13, 2020 @05:16PM (#59617234)

        And you expect that mouthpiece Barr to understand such an explanation? He doesn't have the mental capacity.

        • by balbeir ( 557475 )
          Could he not just ask his good friend MBS for the passcode?
      • The backups are secured by an encryption key that can only be unlocked with the user's password. The user's password is run through a key derivation that Apple doesn't store; that derived key encrypts the backup on the device before Apple receives it, so Apple only ever receives encrypted data. Unless they had the user's actual password (which could be used to open the device anyway), they can't access the data on the device. Everything is encrypted with keys derived from the password and/or PIN code. That's how it should be.

        It has to be better than that, because what you describe would be trivial for Apple to crack... or for the feds to crack if Apple handed over the encrypted data.

        I'm sure it's done in some way that's more like what we do on Android, which is to use the password and some data that exists only on the device to derive the encryption key. Or, more precisely, to use the password and some data that exists only in secure hardware on the device to derive a key that is used to encrypt the randomly-generated encrypt
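
        A rough sketch of what that wrapping looks like, purely as illustration; the names and the third-party "cryptography" package are assumptions, and a real secure element does this internally rather than in application code:

          import os
          import hashlib
          from cryptography.hazmat.primitives.ciphers.aead import AESGCM

          DEVICE_SECRET = os.urandom(32)  # stand-in for a secret fused into secure hardware

          def wrap_data_key(password: str, data_key: bytes) -> dict:
              # The key-encryption key mixes the password with the device-bound secret,
              # so the wrapped blob is useless off-device even if the password leaks.
              salt = os.urandom(16)
              kek = hashlib.pbkdf2_hmac("sha256", password.encode() + DEVICE_SECRET, salt, 600_000, dklen=32)
              nonce = os.urandom(12)
              return {"salt": salt, "nonce": nonce, "wrapped": AESGCM(kek).encrypt(nonce, data_key, None)}

          # The files themselves are encrypted with the random data_key; handing over
          # the encrypted files plus the wrapped key still reveals nothing without the device.
          data_key = AESGCM.generate_key(bit_length=256)
          blob = wrap_data_key("hunter2", data_key)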

        • by edwdig ( 47888 )

          You can back up an iPhone to iCloud, then use that backup to initialize another device. The key can't be tied to the original device.

      • What if the user forgets their password and wants to reset it? If Apple provides a way for that to happen, then they can do the reset and the phone will be unlocked and decrypted. Needing to reset your password is a very common thing.
  • Reviewing the answers to the previous requests for a backdoor doesn't look good.

    Maybe Tim had a change of heart today?

    Anyhow, it isn't as if they didn't pull his complete internet and search history as well as bring up all his previous telephone conversations.

    • by mark-t ( 151149 ) <markt.nerdflat@com> on Monday January 13, 2020 @04:12PM (#59616918) Journal
      Barr knows full well that when Apple doesn't comply, it will give him ammunition to use against tech companies: by not being able to comply with law enforcement upon request, they are effectively aiding continued disregard for the law by criminals who employ their technology. We can probably expect a renewed effort by him and similarly minded people in the government to create laws that will *force* tech companies to put back doors in their products, or else be barred from selling or legally importing those products in the USA.
      • If Barr had any credibility at all after his shenanigans elsewhere, I'd tell him to eat a bag of dicks.

        But given his already established lack of credibility, I'd just tell him it's under executive privilege and then ignore him vigorously. Sure, sure, he'll eventually get it to court, assuming he doesn't get fired by then for some other rando reason.

        • by rahvin112 ( 446269 ) on Monday January 13, 2020 @05:10PM (#59617208)

          He doesn't want to go to court; he might lose. What he wants is for Congress to write a law requiring backdoors; whether it's legal or not, he doesn't care. They almost got Congress to act with the California Islamic shooter. The FBI knows that to get this law they need an Islamic terrorist with an encrypted phone to get the public to mistakenly back it.

          I'm sympathetic to law enforcement's desire to go back and look at this data. But law enforcement is getting lazy; they didn't have this kind of access 20 years ago, and its availability was due to nothing more than lax security in the sector, until hacking forced the companies to take security seriously. Barr and the FBI want to roll things back to that lax security because it gave them complete access. But the simple fact here is that manufacturers and OEMs need tight security, or there won't be any security and they'll be destroyed by lawsuits.

          • by shilly ( 142940 )

            It never fails to amaze me that they don't appear to have heard of the phrase "be careful what you wish for". Can they really not have contemplated the almighty shit storm that will descend on them once a bad actor uses a backdoor they insisted on, to do something terrible to lots of people?

        • by gtall ( 79522 )

          He won't get fired; he's too much of a Trump man... well, a Trump weasel, to get fired. Trump loves anyone telling him he's the greatest.

      • He's already started. See the "comply first, complain later" speech he made in late 2019.
      • Congress, unlike Barr, has much more to lose if they start forcing Big Tech to do their bidding.

        It would be a shame to see all those campaign donations dry up now, wouldn't it, Senator?
        Especially those big juicy ones from companies that have more wealth than some entire countries do . . . .

        Apple and the rest know that if they put in a backdoor and it becomes public knowledge (and it eventually will), the stock price dive alone would cost them dearly. The reputation hit they would suffer would all but destroy th

    • Comment removed based on user account deletion
  • by nysus ( 162232 ) on Monday January 13, 2020 @04:00PM (#59616846)

    Shit on Apple all you want, but they are the only major consumer company going out of their way to protect user data.

  • Barr, the guy you're investigating is despicable at best. It probably even actually falls under terrorism. That being said, I hope you never get access to this phone or any phones like it. 1. Any backdoor becomes exploitable. Even when back doors are not intended, they have drastic consequences. Look at the impact on compute, financial operations, and even global warming from Meltdown, Spectre, and ZombieLoad. 2. Neither Apple nor the FBI has a perfect record with security. (For that matter, no one else does either.) There are a
  • What about adding a TAP mode to live phones, with a court order?

    • How is that different from a backdoor? How would you protect the TAP from bad guys abusing it?
      • I don't support adding a backdoor to encryption; I think the government isn't necessarily entitled to know everything a person wants to keep secret.

        But when we get these discussions, one of the common topics is "how do you stop the bad guys from abusing it" - which is a fair question. However, there's a corresponding question which remains unanswered right now and is equally relevant - how does Apple stop the bad guys from abusing its software update system?

        The two questions are essentially analogo

        • The two questions are essentially analogous, especially as so many Apple products are designed to update automatically - Apple already have a back door to most devices, so how are they preventing the bad guys access to that?

          Okay, so let's talk about that then. So the idea is that Apple can securely update an iPhone automatically. Thus, they could slip in an update that allows unlimited guesses on the PIN with no delay. This is the idea that the FBI has tried to float. So why doesn't Apple just do that?! Because it would open a door that can never be closed again.

          The FBI has the power to investigate and with that, has only a few limitations outlined in law as to how far they can go with those investigations. You needn't loo

          • Okay so let's talk about that then. So the idea is that Apple can securely update an iPhone automatically. Thus, they could slip in an update that allows unlimited guesses on the PIN with no delay.

            I have never seen or heard of an iPhone or any other Apple device updating itself while the user was logged off - which is what would be required in this case. Usually it won't even update itself "automatically" even if you are logged on; you have to request the update or give the OS permission to update in the background. Some provision for this would need to have been made in advance, and we have no reason to believe that Apple has done so.

            • by ceoyoyo ( 59147 )

              What do you mean "usually?"

              You *always* have to give the device permission to update.

                • Not always; Apple has pushed updates out without permission before (see the silent update done for Zoom's vulnerability on OS X - can you guarantee Apple doesn't have an equivalent for iOS?), and on several occasions I have woken up to my iPhone or iPad having updated (always minor versions of iOS) without me giving permission beforehand. This may have changed recently.

                • by ceoyoyo ( 59147 )

                  That's pretty disingenuous. We're talking about iOS devices, you claim Apple has pushed out forced updates before, and give an example for OS X.

          • Yeah, you miss my point entirely - it has nothing at all to do with whether Apple should or should not do as requested; it's about why "a backdoor could be discovered, hacked and abused by a malicious third party" differs from "we have a channel where we can push any update we want out right now, but it's secure".

            we have a channel where we can push any update we want out right now, but it's secure

              No, you've missed the point entirely. If the FBI has a backdoor in their sole control, it's no longer secure. Every single "super secret" tool that the FBI has ever laid hands on has slipped to our enemies. The FBI couldn't keep a fucking burrito secure in a locked fridge. So if the FBI ever lays hands on the backdoor and rips it from Apple's hands, then the devices are pwned at that point. As sure as the sun rises in the east, the FBI will fumble the fuck out of keeping something under wraps.

    • What about adding a TAP mode to live phones, with a court order?

      They can already get a court order to "wiretap" your conversations as they move through the phone company....conversations are not encrypted between phone units unless you are using some special software not provided by Apple.

  • by dimmthewitted ( 4023151 ) on Monday January 13, 2020 @04:15PM (#59616938)

    Heeeeello,

    They didn't refuse; they simply stated that they are unable to comply.
    That is the beauty of end-to-end encryption: no secret backdoors for bad guys to take advantage of.

    Apple couldn't provide access even if they wanted to.

    The sign of all good encryption.

    Undermining this would undermine privacy for the American public.

  • Apple: Try passcode 000000
    Feds: Didn't work. ...
    Apple: Try passcode 000008
    Feds: OK but this sure is taking a long time. Didn't work.
    Apple: Try passcode 000010
    Feds: Didn't work and it says it's erased itself.
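
    Rough numbers behind the gag, and behind the "remove the guess delay" update idea floated upthread (the guess rates are assumptions, for illustration only):

      pin_space = 10 ** 6            # 6-digit passcode
      no_delay_rate = 100            # guesses per second if throttling were removed (assumed)
      throttled_rate = 1 / 3600.0    # roughly one guess per hour once back-off kicks in (assumed)

      print(f"No delay:  ~{pin_space / no_delay_rate / 3600:.1f} hours")               # ~2.8 hours
      print(f"Throttled: ~{pin_space / throttled_rate / 3600 / 24 / 365:.0f} years")   # ~114 years

    And with the erase-after-10-failed-attempts option enabled, the expected number of tries you get is ten, full stop.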

  • by Tom ( 822 )

    But the company has refused to help the F.B.I. open the phones themselves

    Have they refused as in "we could, but we won't" or have they stated that even if they wanted to, they couldn't open the phones?

    Because if their security is worth anything, that's the case.

    • by guruevi ( 827432 )

      Refused as in - if you give us a signed judicial order, then we'll tell some people in our company to get back to you on the things you can do; until then, the government can't force us to do anything.

    • Have they refused as in "we could, but we won't" or have they stated that even if they wanted to, they couldn't open the phones?

      Because if their security is worth anything, that's the case.

      The last time the FBI requested help from Apple, they eventually gave up because another company did it for them.

      Now, it's not because Apple security sucks, it's because the weak point in most such systems is the password.

      People keep framing this as the FBI asking for a backdoor, and Barr has indeed said he wants companies to avoid strong encryption or include backdoors. We should absolutely, 100% fight that. However, asking for help unlocking a phone that has a search warrant issued to it is perfectly reas

      • I have a couple of questions about that approach.

        First, is that bit of security able to be updated? If it's handled through the secure enclave, you might need to unlock the phone in order to upload a new signed firmware to the enclave, for example.

        Second, once Apple makes that iOS build, who else has access to it?

        • If it's handled through the secure enclave, you might need to unlock the phone in order to upload a new signed firmware to the enclave, for example.

          You can perform a firmware update from the phone's recovery mode [apple.com]. From the last step on that page, "When you see the option to Restore or Update, choose Update. Your computer will try to reinstall the software without erasing your data." It makes sense from a device design perspective to build it so that a botched upgrade won't erase your data, and to give the user a chance to use recovery mode to fix it even if they can't log back in.

          Second, once Apple makes that iOS build, who else has access to it?

          That is a very good point, I agree with you. It's wh

  • Next they'll use the pretext of an abducted child to tug at the heartstrings. They're gradually building a case for mandatory backdoors.
  • In most states, the prosecution can argue that fleeing the jurisdiction is evidence of guilt.

    Similarly, they will start arguing that not providing passwords to social media accounts, not unlocking phones, etc. is evidence of guilt. They might even pass laws saying passwords are subject to subpoena provisions.

    • Prosecutors and investigators have been making this argument for years. Last time I looked, the law was not super-well settled, but the 5th Amendment protection against self-incrimination seems to be winning.

      Courts have kind of generally defaulted to saying you must surrender WHAT YOU HAVE (like a physical key) or WHAT YOU ARE (like biometrics, fingerprints, etc.) when presented with a valid court order. Property and identification seem to be readily susceptible to exposure.

      They have generally stopped sho

  • by Narcocide ( 102829 ) on Monday January 13, 2020 @04:51PM (#59617084) Homepage

    There's gotta be legions of skeletons in this guy's closet. I wonder if he's really even the actual William Barr anymore.

  • This is such an incredible dilemma - that law enforcement is crippled by civilians having such easy access to strong encryption, but providing a backdoor seems to inexorably lead to political persecution.

    However, using a blockchain to mediate the requests for data might provide a middle path. I believe it could provide three dynamics that would facilitate legitimate investigations while also preventing abuse.

    1 - Each request for data would be paid for individually. This would prevent a PRISM-like ap
    • by ceoyoyo ( 59147 )

      However, using a blockchain to mediate the requests for data might provide a middle path.

      Well done, I almost thought you were serious.

  • Apple already has plaintext access to photos and more by default for all their products.

    This special request is getting media attention. But the default-on, plaintext iCloud products, which already scan photos to cooperate with law enforcement (search "child exploitation" news articles), are a much bigger security issue.

  • by couchslug ( 175151 ) on Monday January 13, 2020 @05:46PM (#59617360)

    Security through obscurity doesn't work and government tools inevitably leak because they exist. When they do, they can be weaponized against their creator society.

  • who killed three sailors and wounded eight others on Dec. 6.

    If I were related (married, kid, parent) to one of those 3 sailors, I'd damn well want "those awful authorities" to be able to access ABSOLUTELY EVERYTHING the shooter had.

    Maybe it's relevant, most likely it's not.
    (Todo Tue: Dog to vet for checkup.
    Todo Wed: Pick up new underwear
    Todo Thu: Pick up more ammunition.
    Todo Fri: Pick up pizza.)

    Your parent was just doing their job / walking by / defending themselves / whatever and here comes a guy that kills them. I'd want ANSWERS. Why did this guy do th

    • by ceoyoyo ( 59147 )

      If I was related (married, kid, parent) of one of those 3 sailors I'd damn well want "those awful authorities" to be able to access ABSOLUTELY EVERYTHING the shooter had.

      That's the triumph of modern justice systems. We recognize that victims and their families are unlikely to take rational positions and instead rely on the rule of law, i.e. prewritten procedures.

  • Mr. Barr's appeal was an escalation of an ongoing fight between the Justice Department and Apple pitting personal privacy against public safety. "This situation perfectly illustrates why it is critical that the public be able to get access to digital evidence," Mr. Barr said, calling on Apple and other technology companies to find a solution and complaining that Apple has provided no "substantive assistance."

    Fine. Then OUTLAW ALL ENCRYPTION IN THE UNITED STATES. You can't have it both ways.

  • They are just 2 phones the FBI has: "2 phones USED by the gunman," not necessarily phones in his possession or otherwise his. Don't believe anything this guy says.
  • Even if Apple could, without a court order the answer is no. There can't be any exceptions to this, not even because Barr thinks "well, in this case he was so evil...". Court order or GTFO. As soon as one exception is made, the flood gates will be open.

  • I'm thinking each phone could have a unique master key embedded in the hardware and only exposed through destructive disassembly of the phone, e.g. embedded in some part that needs to be milled to expose it.

    You'd need a backup copy of the flash or some way to preserve the flash chips before extracting the physical key.

    This would make the phone less secure, but with the key as only a physical entity and only retrievable through pretty extreme measures that otherwise destroy the phone it seems lik

  • With all the pronouncements of AI capabilities, not one overt claim of such.
  • by CheckeredFlag ( 950001 ) on Monday January 13, 2020 @10:07PM (#59618094)
    Just curious - would it be possible for Apple to install a special version of iOS that changed the facial recognition code to simply return "true" and unlock the phone? If so, does this then make it less secure than a passcode alone? It's hard to see how facial or thumbprint recognition isn't a point of vulnerability for a hacked OS. Would love to hear an explanation from someone who understands this.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...