Apple's iPhone Already Has a Backdoor 401
Nicola Hahn writes: As the Department of Justice exerts legal pressure on Apple in an effort to recover data from the iPhone used by Syed Rizwan Farook, Apple's CEO has publicly stated that "the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone." But, as one Windows rootkit developer has observed, the existing functionality that the FBI seeks to leverage is itself a backdoor. Specifically, the ability to remotely update code on a device automatically, without user intervention, represents a fairly serious threat vector. Update features marketed as a safety mechanism can just as easily be wielded to subvert technology if the update source isn't trustworthy. Something to consider in light of the government's ability to steal digital certificates and manipulate network traffic, not to mention the private sector's lengthy history of secret cooperation.
Related: wiredmikey writes: Apple said Monday it would accept having a panel of experts consider access to encrypted devices if US authorities drop efforts to force it to help break into the iPhone of a California attacker. Apple reaffirmed its opposition to the US government's effort to compel it to provide technical assistance to the FBI investigation of the San Bernardino attacks, but also suggested a compromise in the highly charged legal battle.
In his first public remarks since Apple CEO Tim Cook said he would fight the federal magistrate's order, FBI Director James Comey claimed the Justice Department's request is about "the victims and justice."
Tim Cook's letter (Score:5, Informative)
In the context of this article it is worth pointing out the letter that Tim Cook sent out to Apple employees:
http://arstechnica.com/tech-po... [arstechnica.com]
I believe he makes good points, and wherever we end up, it should be the result of proper discussion and an understanding of the implications, rather than of an 'Apple is evil' mantra that will end up burning everyone.
Re:Tim Cook's letter (Score:5, Interesting)
From the arstechnica article:
The document closed with a call for Congress to "form a commission or other panel of experts on intelligence, technology, and civil liberties to discuss the implications for law enforcement, national security, privacy, and personal freedoms. Apple would gladly participate in such an effort."
From the leaked White House memo linked in the Counterpunch article:
Proposed Policy Principles
Deputies agreed that attempts to build cooperation with industry, with advice proposing specific technical solutions, will offer the most successful option for making progress on this issue. In particular, given industry and civil society's combative reaction to government statements to date, any proposed solution almost certainly would quickly become a focal point for attacks and the basis of further entrenchment by opposed parties. Rather than sparking more discussion, government-proposed technical approaches would almost certainly be perceived as proposals to introduce “backdoors” or vulnerabilities in technology products and services and increase tensions rather than build cooperation.
However, if the United States Government were to provide a set of principles it intends to adhere to in developing its encryption policy, such a document could spark public debate. Proposing such principles would not be without risk, as some constituencies may not distinguish between principles and specific technical approaches. As a result, these principles could come under attack, but could also serve to focus public or private conversation on practicalities and policy trade-offs rather than whether the government is seeking to weaken encryption or introduce vulnerabilities into technology products and services.
It seems like the plan is proceeding nicely. We're getting into the "public debate" phase. Soon it will move on to the trade-off phase, decided on by a panel of private and governmental experts.
Re:Tim Cook's letter (Score:5, Insightful)
It seems like the plan is proceeding nicely. We're getting into the "public debate" phase. Soon it will move on to the trade-off phase, decided on by a panel of private and governmental experts.
Yeah, but part of the challenge is that not everything in the world can be "compromised" or "traded off".
Encryption either works or it doesn't. Your info is either secure or it isn't. If the government can access it, then it isn't secure.
There just isn't any give-and-take here, either you can make your info private, or you cannot.
Re: (Score:3)
You could make that argument, but I would disagree with it.
The flaw in it is that if the government CAN access it, then so can FOREIGN governments, and likely bad actors as well, so the country STILL isn't secure.
My personal privacy and liberty are more important than the government keeping the borders secure in any case.
Re:Tim Cook's letter (Score:4, Insightful)
Re:Tim Cook's letter (Score:5, Insightful)
I especially like this quote:
"...we strongly believe the only way to guarantee that such a powerful tool isn't abused and doesn't fall into the wrong hands is to never create it."
So the vulnerability is the updating mechanism? (Score:3, Insightful)
Re:So the vulnerability is the updating mechanism? (Score:4, Informative)
Not every OS has that problem. I'm not even sure that iOS does. It's possible Apple has a way to forcibly push an over-the-air OS update to your phone, but I don't recall ever hearing any confirmation of that. As far as non-mobile OSes go, the only one I've ever heard of forcing updates on you is Windows 10.
Re: (Score:2)
Re: (Score:2)
I didn't really read much of the rant past the first paragraph. Microsoft is on record updating some copies of Windows 7 to 10 without giving the owner an opportunity to "click no." It did not happen to my copy of Windows 7, possibly because it's a corporate site license through the university.
Re: (Score:2)
I have had a Windows 10 update sitting, waiting for install and have had it there for a couple of months now. I always shut down my computer when I am not using it and the update has never attempted to install (I don't use the "Update and shut down" option).
As a matter of fact, I have never, ever, ever had Windows update forcibly install without my permission.
Microsoft is certainly obfuscating the delay/decline options, but I have a feeling that nothing has really changed. You may not have a straight-up "do
Re: (Score:2)
If any other software behaved this way, it would be called malware.
It's like ransomware, except without any ransom. You're just fucked no matter what you do.
Re:So the vulnerability is the updating mechanism? (Score:5, Informative)
>> Literally EVERY OS has this concern
I'm not sure you understand the concern then. The feature in question is, "ability to remotely update code on a device automatically, without user intervention"
Windows allows you to disable automatic updates (even on Windows 10). Linux famously allows you to only put the specific code you want into your OS. (Google "compile kernel", etc.) If iPhones require automated updates or they will stop functioning, I'd say that concern is still fairly unique to the iPhone platform.
Re: (Score:3)
You can't disable updates on Windows 10, only "defer" them, at least on non-enterprise versions.
This screenshot is from my Windows 10 Pro machine at work. There is only "aplazar" (defer) available.
Re: (Score:2)
Forgot the link: http://imgur.com/EPpxm3n/ [imgur.com]
Re: (Score:2)
Right, you can only defer them.
Except that you never really had a "don't install this update" option on Windows 7 either. Sure, you can just choose not to install the update on 7... and it won't ever try to install it. Except you will also never get rid of the update either unless you "hide" it.
Windows 10 is no different except that there is apparently no way to "hide" the update.
But, just like with 7, you can endlessly defer the installation of an update simply by ignoring it.
I have had an update on Wi
Re: (Score:2)
Or... navigate to [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU] and set "AUOptions"=dword:00000002.
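For anyone who'd rather script it than click through regedit, here's a minimal Python sketch (my own illustration, assuming an elevated prompt on an edition that honors the WindowsUpdate policy keys) that sets the same AUOptions=2 "notify before download" value quoted above:
```python
import winreg

AU_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

# Create (or open) the policy key and set AUOptions = 2,
# i.e. "notify before download" -- the same value quoted above.
# Writing under HKLM requires Administrator rights.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, AU_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "AUOptions", 0, winreg.REG_DWORD, 2)

print("AUOptions set to 2 (notify before download)")
```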
Re:So the vulnerability is the updating mechanism? (Score:4, Interesting)
I think the article is not correct. iOS doesn't let you run an update that reboots the phone unless you input the password first (ostensibly to prevent you from being locked out on reboot).
I think Apple can force load a new OS without this limitation, but it needs physical access to do so.
Re:So the vulnerability is the updating mechanism? (Score:4, Interesting)
I think the article is not correct. iOS doesn't let you run an update that reboots the phone unless you input the password first (ostensibly to prevent you from being locked out on reboot).
I think Apple can force load a new OS without this limitation, but it needs physical access to do so.
Exactly correct; the article is wrong on the fundamental premise that Apple can force an over-the-air update. They, or anyone, can force a firmware update when connected by a wire. The government wants Apple to create firmware that would turn off the security option in iOS that wipes the phone after 10 failed passcode attempts.
Re: (Score:2)
I hate Apple as much as the next anti-Apple fanboy, but come on. Literally EVERY OS has this concern. I wouldn't call it a backdoor any more than I would suggest that having a window not made out of bulletproof glass is an open invitation for robbers into your house. In other words, this is sort of like "duhhhhhhh" material and hardly newsworthy. Now, an open and honest discussion about the security of OS update services and the security methodologies employed would be a fantastic article.
Yeah sure, no problem. Then, having confirmed that they can do this, they get an endless stream of secret 'national security letters' and iPhones for them to break into.
Re: (Score:2)
Secure credential storage doesn't have this concern because its firmware can't be updated (at least not without first successfully authenticating). iPhones have secure credential storage, both inside their cryptographic processor and inside their SIM cards. So it is hard to understand why iPhones have this vulnerability at all. It's either a big screw-up or deliberate.
Even without secure credential storage hardware, you can still make PIN numbers reasonably secure aga
Re: (Score:3)
Ehh, who needs mod points.
Take a look at this link: https://www.techdirt.com/artic... [techdirt.com]
The gist is that the iPhone's "secure credential storage" firmware is part of the regular firmware, and can be updated without authentication. It just has to be signed by Apple. I will agree that a much better model would be a fully separate chip that requires authentication, or a wipe, to update the firmware. Unfortunately, it looks like Apple didn't want to do things properly.
I'm not sure what you're talking about for the
Re: (Score:2, Informative)
There is no vulnerability here. There is no such thing as "automatic updates" of iOS. There are "auto-downloaded" updates... but you ALWAYS have to install them manually... and to do so you need to unlock the device AND put in your iCloud username and password.
There is NO backdoor here.
Re: So the vulnerability is the updating mechanism (Score:2)
Re: (Score:2)
No: What's been stated is that if Apple is in physical possession of the phone they can put the phone in a special mode and forcibly update portions of the operating system.
This is not an issue with the normal system that's built in that people use to update their operating system.
However, I do expect Apple to close even this final loophole in the next version of iOS. Instead of encrypting just the user's data on the phone... EVERYTHING will be encrypted... including the OS.
Re: (Score:2)
Citation?
Re: (Score:2)
My OS only updates when I want it to. Cyanogenmod comes built that way. Some danger from Google Play or Amazon App Store, which can install whatever they want.
Security is hard. I can still install a bad application, or have Google Play update itself with nastiness; I can also remove those things and not install updates. It's a similar problem when the phone's whole OS has a built-in auto-update, although you can't just rip that out; then again, modified Android OSes *are* just ripping the OS out.
And soon it won't be (Score:5, Interesting)
If I were Apple, I'd make sure a future release gave the user the option of only allowing firmware updates after the user logged in. This doesn't have to be required for every iPhone (corporations might want this disabled on iPhones they purchase for their employees), but it should at least be an option.
Re:And soon it won't be (Score:5, Informative)
Re: (Score:2)
A normal update does require you to unlock the phone to accept the update. They're talking about leveraging recovery mode which can be used to force load an image onto a phone that might be otherwise unusable. See here - https://support.apple.com/en-u... [apple.com]
Yes. That's the exact Apple support page that worries me. It says "iTunes will try to reinstall iOS without erasing your data." Updating iOS in this way needs to either require my passcode or erase my data. I expect that it will in a future version of the hardware (because only doing it in software isn't enough).
Re:And soon it won't be (Score:4, Interesting)
A normal update does require you to unlock the phone to accept the update. They're talking about leveraging recovery mode which can be used to force load an image onto a phone that might be otherwise unusable. See here - https://support.apple.com/en-u... [apple.com]
Yes. That's the exact Apple support page that worries me. It says "iTunes will try to reinstall iOS without erasing your data." Updating iOS in this way needs to either require my passcode or erase my data. I expect that it will in a future version of the hardware (because only doing it in software isn't enough).
I have gone through this process, so can speak from experience. My wife changed her passcode, then promptly forgot the new one. The only option according to Apple is to reinstall. But if the phone is previously synced to a computer, it has exchanged cookies that allow the computer to still access the phone's contents (this is one of the reasons why the FBI wanted to find that hard disk). When I did the reinstall, it first read the contents out like a normal backup, then installed a fresh OS, then restored the data from the backup. I think this is what they mean by "try to reinstall iOS without erasing your data." It does get erased, but is restored, so effectively not erased.
About six months later she did the same thing, except this time, she tried rebooting the phone. When I hooked it to the computer, the system was unable to access the phone, so the restore could only put back the data saved during the latest backup (about a month before). She was bummed since she lives off her phone's calendar and doesn't trust it backing up to iCloud.
Re: (Score:2)
Also that any update of the secure enclave firmware erases the current security key. Better to make the enclave firmware flash once and not updatable.
Signed updates are fine... (Score:3)
Signed updates are fine, as long as you can't update the firmware in your secure memory to alter the maximum number of wrong guesses before erasing or reduce the minimum time between guesses. That way even if the OS image is compromised you still need to enter the correct code within n attempts to unlock the device.
It seems incredible that Apple thought it would be a good idea to build that functionality. I don't know of any other ARM CPU design that allows it, for this exact reason.
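To make the concern concrete, here's a rough sketch of the kind of guess-throttling the parent describes, as if it ran inside non-updatable secure firmware. The constants and the plain Python are illustrative only; Apple's actual Secure Enclave policy and delay schedule are not modeled here:
```python
import time

MAX_ATTEMPTS = 10                                       # wipe after 10 wrong guesses
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]   # forced wait (seconds) per attempt

class SecureElement:
    """Toy model of guess-throttling enforced in non-updatable firmware."""

    def __init__(self, correct_pin: str):
        self._pin = correct_pin
        self._failed = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("key material erased")
        # The escalating delay is enforced here, inside the secure element,
        # so host software cannot skip it.
        time.sleep(DELAYS[min(self._failed, len(DELAYS) - 1)])
        if guess == self._pin:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self._wiped = True   # erase the encryption key; old data is unrecoverable
        return False
```
The whole point of the parent's comment is that this counter and delay table must not be rewritable by a firmware update pushed from outside.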
Re: (Score:3)
Apple already had to update the fw once http://9to5mac.com/2015/03/18/... [9to5mac.com] because it wasn't incrementing properly when the power was cut. You would prefer to wipe the phone to apply the update?
Personally I would like the ability to set the key myself.
Re:Signed updates are fine... (Score:4, Insightful)
You can fix that super easily:
The secure enclave will accept software updates in two cases: 1) the unlock code is provided, and the encryption key is kept intact; 2) the unlock code is not provided, and the encryption key is wiped.
This is a secure method of doing it. You can either provide the unlock code and update the firmware of the secure enclave without wiping the device, or you can wipe your device and update the firmware of the secure enclave without the unlock code.
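A toy model of that two-case rule, just to show the logic (illustrative Python, not how Apple's Secure Enclave actually gates its updates):
```python
import secrets
from typing import Optional

class SecureEnclaveSim:
    """Toy model of the two-case update rule described above."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._key = secrets.token_bytes(32)      # device encryption key

    def apply_firmware_update(self, image: bytes, passcode: Optional[str]) -> None:
        if passcode == self._passcode:
            # Case 1: correct unlock code supplied -> key survives the update.
            self._flash(image)
        else:
            # Case 2: no (or wrong) unlock code -> wipe the key first, so the
            # updated firmware can never decrypt the old data.
            self._key = secrets.token_bytes(32)
            self._flash(image)

    def _flash(self, image: bytes) -> None:
        # Placeholder for writing the image into enclave storage.
        pass
```
Either branch lets the firmware be updated; the only thing the no-passcode branch costs you is the data, which is exactly the trade-off being argued for.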
Re: (Score:3)
Well, it's an academic discussion because the phone in question is an iPhone 5C, which doesn't have the Secure Enclave.
If it did, then the FBI would be fucked. But, because it's the last model without it, this type of brute forcing of the PIN is still possible if the OS doesn't prevent it, which is exactly what they are asking for.
Word on 'net (Score:2)
Re: (Score:2)
>> this will come up under free speech violations
You must be new here. (The nod to 'net makes me think you woke up from a nap started in 1995.)
>> Code is speech and the government is requiring Apple to create the code and the means to do this.
Remember that thousands of US-based governments (fed, state, county, city...) already require thousands of companies to develop code (or "speech" if you want) and the means to do X, Y and Z (e.g., "calculate tax withholding on..." or "use GPS fencing to
Re: (Score:3)
I hope you're right, but SCOTUS says money is speech and people are still compelled to pay money.
The issue of compelled speech is not completely settled either. The courts have ruled both that it can be and that it can't be depending on circumstances.
http://www.firstamendmentcente... [firstamendmentcenter.org]
https://www.washingtonpost.com... [washingtonpost.com]
https://www.researchgate.net/p... [researchgate.net]
There's a lesson here (Score:5, Insightful)
Listen up, law enforcement, DoJ, et al. I am more afraid of your incompetence than I am of any dark "world domination" motive on your part, but I am nowhere near as afraid of "teh terrorists" as I am of you, regardless of your motive. So hands off my crypto. M'kay?
Re:There's a lesson here (Score:5, Insightful)
And that's all fine. Remind me again why Apple has to provide said help?
A Judge can order a safe broken into, the FBI can hire a safecracker to break into it. If that safecracker doesn't want to do the job, they'll get someone else.
What DOESN'T happen is the Judge directly ordering a SPECIFIC safecracker to do the job against their will, and in the process, damage their reputation for ALL safes.
No one is disputing the FBI's right to inspect this phone. More power to them, crack away... Why exactly does Apple have to help again? Have we become slaves?
The title of this article is wrong! (Score:5, Insightful)
You need physical access to put it in DFU mode (Score:5, Informative)
What they're talking about is putting the phone into Device Firmware Update mode, like this [imore.com]. Only then will they be able to update it remotely and on the newest iPhones that'd also wipe the encryption keys. But not on the model in question here.
Dumb Pre-Paid Phones? (Score:2, Insightful)
Is this why drug dealers buy lots of pre-paid phones?
Re: (Score:2)
>> why drug dealers buy lots of pre-paid phones
It's more that pre-paid phones can be obtained with cash or pre-loaded cash cards. Regular phone plans are typically tied to a bank account (often a credit card account), which ties a specific phone to a person (that can be ID'ed through a bank), so drug dealers would prefer the "burner" route.
In other words, arrested drug dealers don't care as much about a "ha ha you can't decrypt my data" defense as they do about a "hey - that's not my phone!" defense.
Re: (Score:3)
While some of this is true, I think the real answer is even simpler: they're disposable.
There's a reason that the phones are called burner phones; if it gets trashed or destroyed for whatever reason, you're not out anything except an easily replicated list of phone numbers.
Likewise, a lot of burner phones just don't have many of the tattle-tale features that smartphones do; older models lack GPS, have very little on-board memory for logging, and so on.
While law enforcement certainly does have the means to spy o
Android (Score:5, Interesting)
Lots of good discussion about iOS and Apple.
I would like to see the same analysis for the state of Android. Can it be made secure against such backdoors? Do third-party flavors and rooting have a role? Is it possible to have a device where all software and firmware code can be examined?
Re: (Score:2)
Android software provides APIs for storing encryption keys in secure hardware. However, whether the secure hardware storage your phone uses is actually secure depends on the manufacturer, how they implement the hardware and what kinds of modifications they have made to the software.
Android also provides hooks for external security devices. And you can use the SIM card for storing encryption ke
Re: (Score:2)
Based on the press releases surrounding the San Bernadino iPhone, the same does not appear to be the case with the iPhone backups Apple "scoops up".
iPhone 7 will use SE to authorize any OS updates (Score:5, Interesting)
Apple has updated the Secure Enclave with an iOS update in the past and added additional protection, so it presumably can do an update that would REMOVE protections on the SE. So the same scenario as this phone can theoretically be applied to any existing iPhone, not just a 5c.
So right now, Apple is making the iPhone 7 immune to this attack vector. With the iPhone 7, even Apple will not be able to do a firmware modification to the SE in DFU mode. The correct user password will *have* to be entered on the iPhone 7, and it will be enforced solely in the SE hardware. There will be nothing that can get around that. You can't solder on a different SE chip, you can't swap components, change the IMEI, or anything else.
That will be the selling point of the iPhone 7. iOS 9 was software-based protection, since a software update could (apparently) change the SE. Apple will claim they never expected their own government to try to force them to create a hacker version of iOS, so security of the iPhone has to be hardware-based. The iPhone 7 will have hardware-based protection that is truly bulletproof. And that is what they will sell.
Then, unfortunately, the FBI will simply demand iOS source code and signing keys.
Re: (Score:2)
Re: (Score:3)
If the SE is designed correctly then even publishing the source code and signing keys will not allow recovering the encryption key.
That's what the S stands for!
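The usual way to get that property is to entangle the passcode with a per-device secret that never leaves the silicon, so any brute force has to run on the device itself at whatever rate the hardware allows. A rough sketch under that assumption (the KDF and iteration count here are placeholders, not Apple's actual parameters):
```python
import hashlib, os

DEVICE_UID = os.urandom(32)   # stand-in for a per-device key fused into the silicon

def derive_unlock_key(passcode: str) -> bytes:
    # The derived key depends on BOTH the passcode and the device UID.
    # Without the UID (which never leaves the hardware), captured ciphertext
    # can't be brute-forced off-device -- no matter what source code or
    # signing keys an attacker has.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)
```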
Someone educate me, please (Score:2)
I don't understand what the FBI is asking for. I understand they'd like Apple to install a backdoor key for use in the future, but Apple can't add a backdoor to an existing phone which would defeat existing encryption, could it? How could they do that?
If the FBI has the phone, then the FBI has the encrypted data, and they can brute force attack it. But if the data wasn't encrypted using a scheme with a backdoor key, and you don't have the frontdoor key, then what is Apple supposed to do exactly?
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
This is covered in the numerous articles on the topic.
The FBI wants to brute force the PIN, not the encryption key. The phone is set to wipe if the PIN is incorrectly entered too many times. They want a custom firmware that will let them guess until they get the right PIN, at which point they will simply have an unlocked phone with no need to even try to brute force the encryption.
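And the reason that matters: a 4-digit PIN space is only 10,000 values, so once nothing is wiping the phone or slowing down the guesses, it falls in seconds. A toy illustration (the key derivation here is a stand-in, not iOS's real passcode check):
```python
import hashlib

def derive_key(pin: str, salt: bytes) -> bytes:
    # Toy key derivation standing in for the phone's passcode check.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

salt = b"example-salt"
target = derive_key("4821", salt)            # pretend this is what's on the phone

# With no wipe limit and no escalating delay, all 10,000 four-digit PINs
# can simply be tried in order.
recovered = next(
    f"{i:04d}" for i in range(10_000)
    if derive_key(f"{i:04d}", salt) == target
)
print("PIN recovered:", recovered)           # -> 4821
```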
IP v Security (Score:2)
What more? (Score:5, Insightful)
Re:What more? (Score:4, Interesting)
Easy. The FBI has two reasons for compelling Apple to do this.
1) The phone itself. Think of all the credentials stored on the device that you can now access: saved messages in WhatsApp and other IM-style apps, live access to various services (perhaps they used Gmail? The Gmail app or web page will show you the account and its data as well), etc. etc. etc.
Effectively, they get access to all sorts of data without requiring a warrant - perhaps they know he had a Gmail account, and then they'd need to get a warrant to get information from that account from Google. But if they can access the Gmail app from the iPhone, warrant avoided!
2) The second part is to get Apple to develop this software, because once it exists, it can be used over and over again.
The case cited for the All Writs Act involves the use of pen registers. The telephone company lost purely because they were already using pen registers in their day to day operations to verify billing and check for fraud. So they can be compelled to connect a pen register up to a desired phone line because they were doing it already.
Apple doesn't have the software, but once they do, it can be compelled into action. That's the result the FBI really wants.
Re:What more? (Score:4, Informative)
You have a few factual errors. The passcode wasn't changed. The iCloud account password was. The distinction matters quite a bit, since one is used to unlock the phone, while the other is used by the phone to access external Apple services, including iCloud Backup. The hope here was that they could initiate an automatic iCloud backup by charging the iPhone while it was in range of a recognized WiFi network. Apple has the ability to access data that's backed up to iCloud, so they'd be able to provide the FBI with the lawfully-requested contents of the iPhone if a fresh backup were initiated, and they could do so without needing to build malicious tools.
Unfortunately, the iPhone belonged to the county (since the shooter was a government employee). For reasons that are unknown but very suspicious, since the iCloud backup technique is known to the FBI and has proven useful in the past, in the day immediately after the attack the FBI ordered the county to reset the user's iCloud password, which the county was able to do by logging into his work e-mail that was tied to his iCloud account and initiating the password reset from there. As a result, the iPhone now lacks the correct credentials to create an iCloud backup. The FBI then tried to downplay the matter in a footnote of some court documents by implying offhandedly that it was local yokels who made a mistake, until the "local yokels" spoke up in their own defense by pointing out that they were acting on FBI orders.
So, going back to your original question, the FBI wants one thing: a change in precedent that allows them to put a stop to strong encryption. Demanding access to the current contents of the phone (despite already having a recent backup) while sabotaging the best known way to get at it is just a means to that end.
For Sufficiently Worthless Definitions of Backdoor (Score:2)
Specifically, the ability to remotely update code on a device automatically, without user intervention, represents a fairly serious threat vector.
This is a core feature of most modern operating systems. It is easily disabled in both iOS and OS X.
Your argument is only slightly less inane than suggesting that allowing a computer to access the Internet counts as a backdoor.
iPhone has a backdoor for Apple's own use. (Score:4, Insightful)
The FBI wants to use this very backdoor, too. For a lot of people, this is already NOT OK. The government is quite different from a company you do business with.
And it is not about the ability to crack it. The NSA probably has the resources to do that. The FBI wants it done "by the law".
Re: (Score:2)
The FBI's argument. (Score:4, Insightful)
A response (Score:5, Informative)
The best response to the FBI's request I've read thus far comes from the noted iOS forensics security guru Jonathan Zdziarski [zdziarski.com], where he wrote the following:
An instrument is the term used in the courts to describe anything from a breathalyzer device to a forensics tool, and in order to get judicial notice of a new instrument, it must be established that it is validated, peer reviewed, and accepted in the scientific community. It is also held to strict requirements of reproducibility and predictability, requiring third parties (such as defense experts) to have access to it. I've often heard Cellebrite referred to, for example, as the Cellebrite instrument in courts. Instruments are treated very differently from a simple lab service, like dumping a phone. I've done both of these for law enforcement in the past: provided services, and developed a forensics tool. Providing a simple dump of a disk image only involves my giving testimony of my technique. My forensics tools, however, required a much more thorough process that took significant resources, and they would for Apple too.
The tool must be designed and developed under much more stringent practices that involve reproducible, predictable results, extensive error checking, documentation, adequate logging of errors, and so on. The tool must be forensically sound and not change anything on the target, or document every change that it makes / is made in the process. Full documentation must be written that explains the methods and techniques used to disable Apple's own security features. The tool cannot simply be some throw-together to break a PIN; it must be designed in a manner in which its function can be explained, and its methodology could be reproduced by independent third parties. Since FBI is supposedly the ones to provide the PIN codes to try, Apple must also design and develop an interface / harness to communicate PINs into the tool, which means added engineering for input validation, protocol design, more logging, error handling, and so on. FBI has asked to do this wirelessly (possibly remotely), which also means transit encryption, validation, certificate revocation, and so on.
Once the tool itself is designed, it must be tested internally on a number of devices with exactly matching versions of hardware and operating system, and peer reviewed internally to establish a pool of peer-review experts that can vouch for the technology. In my case, it was a bunch of scientists from various government agencies doing the peer-review for me. The test devices will be imaged before and after, and their disk images compared to ensure that no bits were changed; changes that do occur from the operating system unlocking, logging, etc., will need to be documented so they can be explained to the courts. Bugs must be addressed. The user interface must be simplified and robust in its error handling so that it can be used by third parties.
Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of their test devices) for NIST to verify. NIST checks to ensure that all of the data on the test devices is recovered. Any time the software is updated, it should go back through the validation process. Once NIST tests and validates the device, it would be clear for the FBI to use on the device. Here is an example of what my tools validation from NIJ looks like: https://www.ncjrs.gov/pdffiles... [ncjrs.gov]
During trial, the court will want to see what kind of scientific peer review the tool has had; if it is not validated by NIST or some other third party, or has no acceptance in the scientific community,
Re: (Score:3)
Re: (Score:2)
Well, I don't think they are going to prosecute the phone's owners, as they are rather dead.
Otherwise, they are dealing with terrorists, and terrorists don't have rights; the govt just sends them to Gitmo and keeps them there indefinitely without charges.
Funny laws that we have nowadays.
Re: (Score:3)
If you want to maintain the constitution (I know, it's far-fetched), all evidence must be processed as described above. If the FBI gets a contact list from the phone and decides to prosecute an individual, all the defense has to do is ask "well, how did you get that phone number?" and if the evidence isn't good/correct, or the FBI says that it just magically knew who to talk to, it's highly likely that the case gets thrown out right then and there.
It's not all Apple's fault (Score:5, Informative)
My answer came over the weekend when I read this article [cbsnews.com] which stated the county paid for but never installed such software.
Having been responsible for setting up iPhones for a state agency, one of the steps was installing AirWatch which we did have to use on a few occasions when people locked themselves out.
Not installing such software is either incompetence or laziness on the part of the IT folks who handed out these phones.
Smoke and Mirrors? (Score:3, Insightful)
I'm seriously wondering if this whole thing could really just be a giant PR/marketing exercise by Apple, when in fact they are already complying with the NSA?
http://www.theguardian.com/wor... [theguardian.com]
Apple is wrong, but so is the FBI (Score:2)
While I support Apple's stance on this issue, it really doesn't apply in the California case. Authorities already had access to the phone from the start. Local authorities inadvertently reset the password and do not know what it is. The FBI is requesting help to reset the password that the authorities put on the phone, not the shooter's. As such, why would Apple not help?
All of that said, the FBI is also wrong. While it is one thing to request help with this particular phone. Trying to force Apple to
Re: (Score:3)
The biggest reason why Apple would not help, other than the possibility that there is no help they are capable of offering (which is conceivable), is that by doing so they would be confirming beyond any shadow of a doubt that it is actually possible.
The realization that something is physically possible is a *HUGE* incentive for some people to try and figure out how it is done, and if Apple can do it, then so can other people... people with much more nefarious intentions than even an untrustworthy gov
It's not a backdoor (Score:2)
It's a *way to install* a backdoor.
In meatspace, Apple does not have the keys to the building, but they have a key to the tool shed where you can build a new handle and lockset that has a master key, and a screwdriver which would allow you to replace the current door handle with the compromised one. Apple will not let the FBI into the toolshed, nor help them create the faulty (master-keyed) lockset.
Cluster Fuck (Score:4, Insightful)
This is all a giant Cluster Fuck.
It's still unclear; does the FBI want to give the phone to Apple so they can break in, or do they want apple to give them the tools to do it themselves?
If it's the former, then Apple should get it done, then destroy the tools and call it a day. If it's the latter, then Apple should make it clear and call them out on it.
What is clear is that getting the data from the phone is not secondary to the Us vs Them bullshit going on now.
Re:Cluster Fuck (Score:5, Insightful)
From what I've read, the FBI prefers the latter but would accept the former. However, Cook has said that law enforcement around the country has already said they have hundreds of iPhones they want Apple to unlock if the FBI wins; if that's so, I don't think destroying the tool is going to be a viable option.
Re:Cluster Fuck (Score:5, Interesting)
This. If it's done once, the demands will never stop. At least not until the NSA steals a copy of the hacked firmware and distributes it to LEOs everywhere under an NDA.
Re:Cluster Fuck (Score:5, Interesting)
If Apple is as serious as they say they are about security and privacy, they need to change the OS/firmware/hardware to make updating a phone impossible without either unlocking the phone or wiping it clean. This way, when this happens again, and it almost certainly will, they can honestly say "we can't" rather than "we would rather not."
Re: (Score:2)
Exactly right. If not for everyone else, then for their own freedom they must put this sort of thing beyond their own capability.
Re: (Score:2)
Re:Cluster Fuck (Score:5, Insightful)
Re: (Score:3)
My guess is that it's probably not possible without doing some serious work, such as imaging the phone as a "backup", wiping it, updating the OS, and then restoring the "backup" over the top, which would then restore the encrypted data. Because this phone doesn't have the "Secure Enclave" the encryption key is stored somewhere in flash, and likely would be backed up with the rest of the data.
I know that every time there was an iOS update delivered over the air since they added that capability, it makes you
Re:Cluster Fuck (Score:5, Insightful)
The demands would never stop from US law enforcement agencies. And then they would roll in from governments around the world. And then some hacker group would get their hands on the "unlock" tool and repurpose it to break into any iPhone at any time.
If Apple breaks the encryption, there is no way that it will be just for this one phone and that's it.
Re:Cluster Fuck (Score:4, Insightful)
It's also not quite as simple as "Apple does it, destroys the tool, calls it a day." It's like any weapon: once developed, it's hard to put the genie back in the bottle. We can't go back from missiles, guns, bombs, etc... The technology is there, and it can't be undone. Similarly, if Apple were to develop the tool and use it in-house, then there are brains in Cupertino that know how to defeat the protection. Think of insider threats, extortion, the increased attempts to break into Apple's network, etc... Not to mention the requests from law enforcement to break into other phones.
I've never been a fan of Apple's walled garden and prefer to have control over my devices... though with their standing firm on consumer privacy that iPhone is starting to look pretty good.
Re: (Score:3)
Can Apple develop software to *upgrade* a phone without user interaction?
The fact that they are raising such a stink about this hints at yes. Though since they hold the source code, there is some security through obscurity at play here. We can only speculate as to how they would implement this tool, or what protections Apple puts on access to those bits of code (if any). This i
Re: (Score:2)
If it's the former, then Apple should get it done, then destroy the tools and call it a day.
How can you fully ensure that software tools are destroyed & never copied?
if it's the latter, then Apple should make it clear and call them out on it.
Uh...first link in TFS
Re: (Score:2)
Even worse, if Apple does this and people find out that it is actually physically possible, how can you fully ensure that nobody else ever eventually figures out how Apple did it and replicates it in the wild?
Re: (Score:3)
Because firmware updates have to be cryptographically signed with Apple's signing key.
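For what it's worth, the signing check itself is the easy part; the point is that only the holder of the private key can produce an acceptable image. A rough sketch of that gate, using Ed25519 from the Python 'cryptography' package purely for illustration (Apple's real image-signing format is different and more involved):
```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: sign the firmware image with the private key (kept offline).
vendor_key = Ed25519PrivateKey.generate()
firmware_image = b"\x7fFIRMWARE...example payload..."
signature = vendor_key.sign(firmware_image)

# Device side: only the public key is baked into the boot ROM; any image that
# fails verification is refused.
baked_in_pubkey = vendor_key.public_key()

def apply_update(image: bytes, sig: bytes) -> bool:
    try:
        baked_in_pubkey.verify(sig, image)
    except InvalidSignature:
        return False              # unsigned or tampered image -> reject
    # flash(image) would happen here
    return True

assert apply_update(firmware_image, signature)
assert not apply_update(firmware_image + b"tamper", signature)
```
Which is also why the whole dispute centers on compelling Apple, specifically: nobody else can sign an image the phone will accept.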
Re: (Score:3)
Re:Cluster Fuck (Score:5, Informative)
Wrong.
It's still unclear; does the FBI want to give the phone to Apple so they can break in, or do they want apple to give them the tools to do it themselves?
The order clearly states that Apple is not required to provide the software created. Many people, including myself, believe that there is an unspoken motivation in this case to have a precedent which allows law enforcement to force software companies to produce software to enable access to encrypted systems, but it is a supposition not substantiated by the court documents.
The court documents compel Apple to create software which will make it easy for the DOJ to break in, but not that Apple do the final step of actually breaking in.
If it's the former, then Apple should get it done, then destroy the tools and cal it a day.
Which Apple probably would have done if the DOJ had made the request under seal to keep it secret, as Apple requested. However, the government made it a public request, which supports the idea that the government wants either a legal precedent or an excuse to ask Congress to change the laws so they can force software companies to create hacking software.
What is clear is that getting the data from the phone is not secondary to the Us vs Them bullshit going on now.
I think that must be a typo. It is clear that this debate is not about this case, but rather what the DOJ can successfully force software companies to do, or an excuse to get new legislation so they can force hacking by software companies.
Re: (Score:2)
Re:Cluster Fuck (Score:5, Insightful)
Re: (Score:3)
"If it's the former, then Apple should get it done, then destroy the tools and cal it a day. "
Exactly. And additionally, make sure that after the next iOS update, that method will never, ever work in the future.
Re: (Score:3)
TFA contains more info links, but by itself the content looks more like assumption/implication. I can't find anything in TFA showing evidence that there is a backdoor; rather, it says (see below)...
Tim Cook protests that Apple is being asked to create “a new version of the iPhone operating system.” This glib talking point distracts attention from the reality that there’s essentially a backdoor on every new iPhone that ships around the world: the ability to load and execute modified firmware without user intervention.
Ostensibly software patches were intended to fix bugs. But they can just as easily install code that compromises sensitive data. I repeat: without user intervention. Apple isn’t alone in this regard. Has anyone noticed that the auto-update feature deployed with certain versions of Windows 10 is impossible to turn off using existing user controls?
Now, to answer your question about the FBI, you'll find the answer at http://www.nytimes.com/2016/02... [nytimes.com] by following a link on the TFA page.
After December’s San Bernardino attack, Apple worked with the F.B.I. to gather data that had been backed up to the cloud from a work iPhone issued to one of the assailants, according to court filings. When investigators also wanted unspecified information on the phone that had not been backed up, the judge this week granted the order requiring Apple to create a special tool to help investigators more easily crack the phone’s passcode and get into the device.
Apple had asked the F.B.I. to issue its application for the tool under seal. But the government made it public, prompting Mr. Cook to go into bunker mode to draft a response, according to people privy to the discussions, who spoke on condition of anonymity. The result was the letter that Mr. Cook signed on Tuesday, where he argued that it set a “dangerous precedent” for a company to be forced to build tools for the government that weaken security.
Anyway, this does not mean I trust Apple's claim that they don't have a backdoor on their device, but I would rather see evidence or some research results that point out exactly what i
Re: (Score:2)
That is until someone besides Apple or the government figures out how to get into that backdoor.
How about a compromise. If an unauthorized third party gains access to your data via this sanctioned back door, you automatically get five hundred billion dollars tax free.
Re: (Score:2)
I'd really say, any ability to update any operating system is a place where a back door could be inserted.
Re: (Score:3)
To me that is the very definition of a back door: Apple can install arbitrary software on your phone without your consent. That is, they can make your phone do whatever Apple wants without consent.
Re: (Score:3)
To me that is the very definition of a back door: Apple can install arbitrary software on your phone without your consent.
Um, what hardware do you have on which it is impossible for someone with physical control of the hardware to install software? And if your answer is, "but at least I can encrypt my data" -- you do know that the proposed software the FBI demands Apple write doesn't actually get them into the phone; it just gives them the opportunity to brute-force the password.
Re:Puh-leeze. It's an iPhone. (Score:5, Funny)
It has genuine woodgrain vinyl overlay.
running vi, naturally
Re: (Score:2)
Atari Phone sucks, the IntelliHearing is much better.
Re: (Score:2)
Good point. Google has become more and more abusive.
Microsoft looked at that and said, "Evil is OUR business. How can we compete?" That's the reason that Windows 10 tracks everything with spyware, excuse me, "telemetry". Microsoft is hoping to sell the information and make easy money.
I have/had a Microsoft Insider's account and downloaded Win10 some 8+ months before its release.
I couldn't agree to the TOS, so I never installed Win10 and bowed out of the Insider's account. Now I have Win10, which came pre-installed, along with the same TOS.