Apple Lawyer Ted Olson: Creating Unlock Tool Would Lead To 'Orwellian' Society (9to5mac.com) 183
Apple's lawyer, Ted Olson, explained in an interview with CNN that what the government is asking Apple to do is "limitless." Olson explained that if the tool that the government wants is created, any judge anywhere could essentially order Apple to listen to any customer's conversations, track their location, and much more. The lawyer likened it to an Orwellian "big brother" type society. When pressed about how Apple could potentially help fight terrorism by creating a tool to access locked devices, Olson explained that while Apple will help the government defeat terrorism in every way that it can, it can't be done by breaking the Constitution.
pretending that back doors don't exist (Score:4, Insightful)
The back door is already there. That's the problem. The problem isn't that the government wants Apple to use it, and certainly not that the government wants Apple to create one (remember the original narrative?)
Re:pretending that back doors don't exist (Score:4, Interesting)
You miss the point: Apple - and Google, and Microsoft - would much rather do the big-brothering themselves for their own profit, and don't want to give that power to the government.
1984 is already happening, but Orwell got one thing wrong: the tyranny is coming from the private sector, not the government.
Re: (Score:2, Insightful)
Just where in the Constitution is this guarantee of privacy?
Re: (Score:2)
Re: pretending that back doors dont exist (Score:4, Informative)
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
There is a whole bunch of privacy law that hangs on this.
Re: (Score:2)
Re: (Score:2)
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
The whole case has nothing to do with privacy. There's a legal search warrant to search the phone, and that search warrant wouldn't even be needed, because the owner of the phone agreed to it.
What the whole case is about is that the technology needed to crack this phone can be used by criminals or terrorists to crack any other phone as well.
Re: (Score:2)
That has precisely nothing to do with the action in question. Nobody is saying there's any legal problem with breaking into the phone, particularly since it's fine with the owner of the phone. It's a question of the FBI trying to make Apple do something they really, really don't want to do.
Re: (Score:3)
And the Tenth.
The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.
Which translates to, "If we didn't say the Feds can do it, they can't."
Re: (Score:2)
It does, sophistry notwithstanding. It simply isn't being respected.
Re: (Score:2)
Correct, the Constitution doesn't grant the right to privacy. In fact, it doesn't grant any rights. Govt doesn't grant rights. The Constitution defines the limits of govt, and enumerates its duties.
Where in the Constitution is the enumerated duty to force Apple to fabricate software?
Re: (Score:2)
Where in the Constitution is the enumerated duty to force Apple to fabricate software?
The commerce clause, silly.
Re:pretending that back doors don't exist (Score:4, Insightful)
Spot on!
Apple is perfectly capable of cracking your device. They are not fighting for privacy. They are fighting for the appearance of privacy because it is good for business.
It seems to me that either they follow the legal requests (which they are) or they get their shit together and create a phone that is actually secure.
Re:pretending that back doors don't exist (Score:5, Insightful)
Re:pretending that back doors don't exist (Score:5, Insightful)
I believe they already have made one that's more secure. Apparently this particular attack vector only works on older iPhones, which the shooter had in this case. I wouldn't be surprised if the next phone is completely impossible (so much as anything can be, at least) for even Apple itself to hack. Apple makes all of its money from selling expensive hardware, not customer data, so they don't have much financial motive for needing access to that data, and their inability to do so only makes the hardware more attractive.
What if the government orders Apple to create iPhones that are breakable? Thought about that?
People are fucking stupid and don't understand that technology is never the answer to a societal problem.
Politics is. Apple is doing the right thing. If the government wants to break the iPhone, they have at their disposal billions of dollars, talent, and infrastructure beyond even what is available to Apple. So why don't they do it? Because once the precedent of making a company do your bidding is set, private companies are fucked for life. The government is playing the big game here. If they break Apple, we'll never again have a computer industry that protects the consumer. We don't live (at least for the time being) in a dictatorship, and the Constitution is still valid. You can't just throw it away because of crime or "insert any other bogeyman of the week".
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
If unlocking your phone is done by the OS, and Apple can update your OS without your consent, then they can always unlock your phone.
Unlocking the phone requires the passcode. There is absolutely 100% no way to unlock the phone without the passcode. What the FBI wants Apple to do is remove a security feature that erases the phone after ten incorrect attempts. That would be good enough because the phone uses a 4-digit passcode. With an eight-digit passcode, there's nothing Apple or anyone else could do.
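For a back-of-the-envelope sense of why the digit count matters, here's a rough Python sketch. The ~80 ms per guess is an assumed key-derivation delay, not an official Apple number, and this ignores any inter-attempt lockouts:

SECONDS_PER_ATTEMPT = 0.08  # assumed per-guess key-derivation delay (not an official figure)

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    keyspace = 10 ** digits
    return keyspace * SECONDS_PER_ATTEMPT / 3600

for digits in (4, 6, 8):
    print(f"{digits}-digit passcode: ~{worst_case_hours(digits):,.1f} hours worst case")

# 4 digits: under 15 minutes.  6 digits: about a day.  8 digits: roughly three
# months of continuous guessing, and an alphanumeric passphrase is out of reach.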
Re: (Score:2)
Pretending that back doors don't exist is what will create an Orwellian society. The back door is already there. That's the problem. The problem isn't that the government wants Apple to use it, and certainly not that the government wants Apple to create one (remember the original narrative?)
With enough C4 you can get into anything physical I own; that's not a backdoor, it's just the degree of physical protection it has. Apple has taken the user's very weak lock (the PIN) and tried to put it in a much more secure box, but without dedicated hardware they had to do it in firmware. It's not a perfect solution, but not many have the power to compel Apple to produce a signed firmware disabling it. Don't let perfect be the enemy of good.
At worst, you're back to the user's shitty lock. At no point does Appl
Re: (Score:2)
The back door is already there.
If it was we wouldn't be having this debate.
By your definition, every device has a backdoor. (Score:2)
with enough time and effort you can crack any device. Security has never been about 'perfection'...at any point in the history of mankind. Ever.
The whole point of security is to raise the cost (time, money, political capital in this case) that must be spent to break in. Seeing that the government is basically having to go 15 rounds with Apple to break into the iPhone of a deceased terrorist--that seems like pretty good security to me.
But I understand your confusion--which isn't to say that I'm excusing it,
Re: (Score:3)
The back door is already there.
Prove it or STFU
Apple has claimed [cnn.com] it will take "two to four weeks [...] for six to ten Apple engineers and employees dedicating a very substantial portion of their time" to comply with the government's request.
If a company the size of Apple can spend about $100,000 of developer time to get to your data, I think it is only semantics to say the back door doesn't already exist. It would be the equivalent of me saying your house is secure from me breaking in because it would cost me 25 cents to create a master key to your home.
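For what it's worth, here is roughly how you get to a number like that from Apple's own "six to ten engineers for two to four weeks" estimate. The per-engineer cost is my assumption, not anything Apple has stated:

WEEKS_PER_YEAR = 48                                  # working weeks, roughly
COST_PER_ENGINEER_WEEK = 250_000 / WEEKS_PER_YEAR    # assumed fully loaded ~$250k/year

low  = 6 * 2 * COST_PER_ENGINEER_WEEK    # six engineers for two weeks
high = 10 * 4 * COST_PER_ENGINEER_WEEK   # ten engineers for four weeks

print(f"Estimated cost range: ${low:,.0f} to ${high:,.0f}")
# Prints roughly $62,500 to $208,333, so "about $100,000" is the right ballpark.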
Re: (Score:2)
Re: (Score:2)
The back door is already there.
Prove it or STFU
Apple has claimed [cnn.com] it will take "two to four weeks [...] for six to ten Apple engineers and employees dedicating a very substantial portion of their time" to comply with the government's request.
If a company the size of Apple can spend about $100,000 of developer time to get to your data, I think it is only semantics to say the back door doesn't already exist. It would be the equivalent of me saying your house is secure from me breaking in because it would cost me 25 cents to create a master key to your home.
You very obviously don't understand - if Apple did what they are told to, after a couple of weeks the iPhone still wouldn't be "cracked", it just wouldn't be unreasonably hard to crack it anymore. Well, at least if the guy actually only used a 4-digit key to lock the phone,
Re: (Score:2)
If a person as rich as Bill Gates can spend about $11,000 of contractor time to finish his house, I think it's only semantics to say the house doesn't already exist
There, I fixed your analogy so it lines up with the scale of money we are talking about and so it fits the scenario better.
Re: (Score:2)
Actually Gates is worth about 1/3 what Apple is ($70 billion vs. $200 billion), so it should be $66,000.
It doesn't change the point that regardless of cost, the house doesn't already exist just because it's cheap (to you) to build.
To fit your analogy better, Apple has spent billions on R&D for the iPhone over the past decade. The end result is a phone that only requires $100k of additional development to break its security. If it is that easy to break into their own phones, I do contend they already have a backdoor; they just didn't finish putting the door knob on yet.
And lastly, from what I can find Apple has almost 8x the net worth of Bill Gates, although I wrote my original post after reading an article from last year when App
Re: (Score:2)
Specifically it would take Apple that long to do it. For someone else to try and do it would be much harder as they would have to figure out a way to sign the code without having access to Apple's distribution certs or steal them somehow.
Knowing how to build a sledgehammer to bash in a door to gain access isn't the same as a backdoor existing.
But in this case, creating a wall weak enough that a sledgehammer can break into it is no different than building a backdoor yourself. We are talking about a phone with billions of dollars of R&D spending behind it that can be compromised by $100k of development. As I said in another post, they may not have completed a backdoor but they basically just need to finish installing the door knob.
Re: (Score:3)
can be compromised by $100k of development.
Not correct. It can be compromised by $100k of development only by an organisation in possession of Apple's private signing keys; the result will only work on older phones, and can be defeated even on those simply by using a longer passphrase.
Re: (Score:2)
Re: (Score:2)
Specifically it would take Apple that long to do it. For someone else to try and do it would be much harder as they would have to figure out a way to sign the code without having access to Apple's distribution certs or steal them somehow.
*In best Wonka voice*
Tell me again how Homebrewed PS3 CFW isn't available
Goodbye, Thirteenth Amendment? (Score:3)
We of the dark side are often accused of invoking the slippery slope argument too soon. But in this instance, if the FBI is able to convince courts that forced labor is a valid tactic to use in a terror investigation, it already has nine new cases (more according to some sources) in which it wants Apple to be forced to write custom crack code. And every single one of these new cases involves the drug war, not terror.
Re: (Score:2)
Of course they involve terror.
The FBI is terrified that they'll stop getting their customary kickbacks from the drug cartels.
Re: (Score:2, Interesting)
It's not just a demand for forced labor, they're trying to compel legally-protected speech, and to set a legal precedent in the process.
Re:Goodbye, Thirteenth Amendment? (Score:4, Insightful)
Re: (Score:2)
This is no different from anybody else having to comply with a court order: it can be costly and the penalties for non-compliance can be harsh. This is nothing new.
As long as they can break into the phones they produce, courts can
Re: (Score:2)
"If Apple wants to avoid such cases in the future, they should design phones that they themselves cannot break into. That is entirely legal."
Apple has already designed iOS to be not decryptable. The FBI, like any other possessor of an encrypted device, is welcome to try writing the facilitating software it would take to allow a brute-force attack on the iPhone. Instead, it is trying to compel Apple to write the software for it, knowing that this would make it easier to break into other such devices in the f
Re: (Score:2)
That's simply false. Apple has code signing for OS updates plus insecure cryptographic hardware.
Again, that is also false. There is no reason to believe that such software would work for other devices.
Re: (Score:2)
Actually I wish you were right about the ease of decrypting an iPhone, because that would make the FBI's case even weaker. You're saying it could easily break into the iPhone by hiring its own developers, but instead chooses to go mano a mano against a company with far more money than it does, and be willing to shred the US Constitution - in an election year - to support its crappy case?
Re: (Score:2)
No, it's not easy, it's a lot of work: without Apple's help, they need to get a hardware debugger, a couple of people able to use it, a couple of old iPhones with the same software version, and then try to identify the location where the unlock count is kept/updated. Then they need to take the target 5c and change the unlock count in hardware.
I think the NSA could do it easily: they probably have both Apple's source code and
Re: (Score:2)
Courts do not have unlimited authority. Apple is claiming that this oversteps the bounds of legal court orders.
Re: (Score:2)
Yes, but that is an "undue burden" argument, not a "forced labor" argument; Applehu Akbar's argument was bullshit. Apple's argument may or may not be valid.
Re: (Score:2)
Yes, the FBI wants to establish "writing special break-in software that we could write ourselves if we weren't terminally clueless" as a compulsory labor exception like prison time, the draft or jury duty. Welcome, oh bootlickers, to the wonderful world this would open up to you as developers.
Let's just be honest (Score:1)
While I happen to agree with Apple's position in this case, I think it's important to be intellectually honest here.
When pressed about how Apple could potentially help fight terrorism by creating a tool to access locked devices, Olson explained that while Apple will help the government defeat terrorism in every way that it can, it can't be done by breaking the Constitution.
Let's not pretend that Tim Cook, or virtually any executive at Apple, gives a shit about the US Constitution.
It's just really annoying when I see these shysters get indignant and hide behind the same Constitution that they continuously mock and use as a punchline.
Re: (Score:1)
Let's not pretend that the FBI, or virtually any politician, gives a shit about the US Constitution.
There - fixed it to more closely match the realities, because the US is already an Orwellian society.
Re: (Score:2)
Maybe I'm missing something here?
The 4th Amendment 'guarantees' the people privacy against government intrusion until a judge decides it's OK for the government to search. Search. Not find, search.
The government can already get the data off this 5C if they want. Heck it doesn't even have the Secure Enclave. They can scrape the epoxy off the memory chips and read the data out in their lab. Did you see Snowden's recent video where he shows an example of this? They have a very precise robot-guided router
Re: (Score:2)
Even if you "scrape the epoxy off the memory chips and read the data out" all you're going to be reading out is encrypted bits. Yes, by reading out those encrypted bits it could be possible to then try to brute-force attack the encryption... but you make it sound like they could just read the data directly... which is not true.
Re: (Score:2)
Re: (Score:2)
the sun explodes.
Careful, making threats like that on a public forum could get you on a no fly list.
Re: (Score:3)
Let's not pretend that Tim Cook, or virtually any executive at Apple, gives a shit about the US Constitution.
Whether he does or doesn't isn't terribly relevant. Tim "cares" about the privacy of his users. The 4th Amendment "cares" about the privacy of the people. They're aligned.
The Constitution that authorizes the government restricts the powers of said government. The government specifically is not authorized to obtain General Warrants; what they're asking for is the digital equivalent of King George'
Apple speaking out 2 sides of their mouth (Score:2)
While I'm on Apple's side in this one, the argument that this is against the Constitution is, well... arguable.
The constitution says:
"The right of the people to be secure in their persons, houses, papers, and effects,[a] against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized"
So we need a sworn court order which, to
Re: (Score:2)
Please don't post in stories on which you are completely uninformed. This particular case has no 4th amendment issues. The phone belongs to the San Bernardino health department, so there never was an expectation of privacy.
The issue here is whether Apple can be forced to do something that they don't want to do. Something that is against Apple's business interests, something that may have 1st amendment issues, something that may have negative consequences for the security of Am
Re:Apple speaking out 2 sides of their mouth (Score:5, Insightful)
The "unreasonable" part. It's "reasonable" for Apple, on receipt of a court order, to turn over to the FBI all data in its possession concerning the terrorists, which Apple has done.
Demanding that Apple force its programmers to write custom software THAT DOES NOT NOW EXIST to allow the FBI to break into one particular iPhone is "unreasonable", and I think Cook, and Apple, are correct here.
Further, concerning the 1789 "All Writs Act", signed by George Washington back before there was much Federal law at all; if the All Writs Act can be perverted so far as to demand that Apple write software that does not exist, then what government demand does it NOT permit? Because if there aren't any limits to THIS PARTICULAR LAW, then the Constitution died in 1789, barely two years after its ratification.
Re: (Score:2)
Wouldn't any law written in 1789 be limited by both the US constitution as written and any subsequent amendments? I mean the law would have been passed 2 years before the bill of rights (10 of the first 12 amendments) was ratified in 1791-2. Furthermore the 13th amendment bars involuntary servitude except as punishment for a crime. Certainly any law would have to take that into consideration.
Perhaps if Apple was a cake maker or photographer this would be a lot simpler to sort out.
Re: (Score:2)
What does that have to do with anything?
Re: (Score:3)
Exactly. The FBI should have asked. And Apple would have and should have refused, as they have.
Instead, the FBI screwed up and then are trying to strong-arm Apple into repairing the mess the FBI made. Typical overreach and use of force instead of brains.
Re: (Score:2)
Perhaps Apple should agree to write that software for the FBI, but the Professional Services fee should be TEN BILLION DOLLARS, paid in advance.
And since there are, reportedly, 8 or 9 other Federal prosecutors in the possession of locked iPhones who plan to use the FBI's precedent to make their OWN case (these cases are all drug related, not terrorism), then Apple will have set the price for this service. No discounts!
Re: (Score:2)
Re: (Score:2)
Demanding that Apple force its programmers to write custom software THAT DOES NOT NOW EXIST to allow the FBI to break into one particular iPhone is "unreasonable"
But what if the software does exist? Asking "Hey, can you change a couple of variables, and recompile" is not very burdensome. As for the driver to allow for PIN guesses via some other method than the touch screen, that sounds more likely to be unreasonable. But what if the code already exists? It's not too hard to imagine that Apple already has some test assembly that they use in test labs for testing PIN entries on physical devices.
Re: (Score:2)
That doesn't comply with the court order, which requires that the software shall run ONLY on the target device and NO OTHERS. If it were a matter of changing some variables and recompiling, then it would violate the court order. Since Apple would violate the court order in ANY event, then they're correct to choose THIS course of action - to do nothing.
Re: (Score:2)
Re: (Score:2)
Demanding that Apple force its programmers to write custom software THAT DOES NOT NOW EXIST to allow the FBI to break into one particular iPhone is "unreasonable"
But what if the software does exist? Asking "Hey, can you change a couple of variables, and recompile" is not very burdensome.
Yes. And that is what they will claim in the next cases already piling up, because then, not now, the software will exist.
Re: (Score:2)
Further, concerning the 1789 "All Writs Act", signed by George Washington back before there was much Federal law at all.
Actually, I'm wandering a bit off-topic, but this "All Writs Act" could be awesome.
Recently, Northrop-Grumman announced their new B-21 strategic bomber. They're going to build them at a cost to the taxpayer of $800 million per.
Nope! "All Writs Act!"
That's right, with the "All Writs Act," all the government needs is a court order saying that this is important for "National Security" and, bingo, Northrop-Grumman has to figure out how to build it for free!
(Yes, I'm being facetious.)
Re: (Score:3)
Don't SAY stuff like that, not even in jest! Some congresscritter (or staffer) is likely to be lurking here, and get the idea that this might actually WORK!
But there is case law concerning the All Writs Act; demands made regarding it are required to be "reasonable", which the FBI's demand in this case is not.
But now Apple will spend a billion dollars litigating this all the way to the Supreme Court, and Apple is pretty sure that's how far it'll go - because you don't hire the former Solicitor General of
Re: (Score:2)
Because if there aren't any limits to THIS PARTICULAR LAW, then the Constitution died in 1789, barely two years after its ratification.
There are limits. The word 'reasonable' and that it takes a request from one branch of government, and another independent branch to sign off on it. If either one believes that what is being asked is not reasonable, then it doesn't apply.
Re: (Score:2)
All Writs Act essentially just says that the courts can do stuff without needing micromanaging from congress.
Re: (Score:2)
All Writs Act essentially just says that the courts can do stuff without needing micromanaging from congress.
Unless congress has already done that micromanagement, then they can't. In this case it's called "Communications Assistance for Law Enforcement Act".
In the section of CALEA entitled “Design of features and systems configurations,” 47 U.S.C. 1002(b)(1), the statute says that it “does not authorize any law enforcement agency or officer —
(1) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire o
Re: (Score:1)
Maybe I'm missing something here?
Yes, the popcorn. Here's a bowl. Enjoy the show.
FUD and Confusion (Score:2)
You are absolutely right. There is a court order, and a public one at that, so the 4th amendment is not at issue. That's what distinguishes this from the whole Snowden thing, where government intelligence-gathering entities either act without a court order, or else on a secret court order by a secret court (which is really the same thing, 'cause who knows what happened, 'cause it's a secret).
No, the thing going on here is that Apple is being asked, or even forced, to compromise their own product using mean
Re: (Score:2)
If they actually valued privacy, they wouldn't create a phone that they themselves can break into, because the obvious and natural consequence of that is that courts and spy agencies will order them to break into it.
No, what i
Re: (Score:2)
Okay... how does Apple build a phone they can't break into, but is capable of updates and bug fixes?
Remember, the means by which the FBI proposes Apple "break into" the phone is to push an update that just happens to omit some security features, like the self-imposed delay for processing subsequent PIN attempts. So, no updates? Bugs remain unfixed? No recourse if you accidentally brick your phone if you, say, forget your PIN?
Re: (Score:2)
Easy: the contents of the phone are encrypted with a long random key that is stored securely in a crypto processor. The crypto processor also checks your pin for unlocking. If you make too many pin entry attempts, the crypto processor erases the key internally, rendering the data on the phone irretrievable. There is no way to reset the pin entry count that's kept inside the crypto processor without actually
Re: (Score:2)
That's a crucial fact which keeps being omitted from this debate. Apple's argument holds no water if this software can only be used at the device owner's request. It's not the government coercing them to hack an iPhone that they fear. It's Johnny's mom and dad coercing them to hack Johnny's iPhone they fear (ti
Re: (Score:3)
Either the FBI is very incompetent or that was a deliberate act by the FBI to create the situation that now exists. A situation that is the best possible scenario for the FBI to force Apple to unlock a phone.
The FBI has lied about this case time and time again. They even had the gall to blame the San Bernardino health department for resetting the password.
I am not convinced tha
Uh, no, it wouldn't... (Score:2)
Sure, I agree with Apple on this one (though I'd argue they're less concerned about freedom and more so about having to pay to write backdoors and clean up the brand damage from said backdoors). But can we stop
Re: (Score:2)
Re: (Score:2)
But can we stop trotting out Orwell as our anti-gov't poster boy please? The man was a socialist for Pete's sake...
Any full-blown 'ist is indistinguishable from another.
We're already there (Score:1)
Kudos to Apple for trying to limit the scope of the problem, but they can't prevent something that already exists.
Adjust accordingly (Score:2)
So basically, your phone is still your enemy, and anyone with physical access to the device will eventually be able to defeat all of your safeguards. This same situation exists with laptops and other computers. Even if your entire system is encrypted, at some point you must enter a key. It would seem that anyone with physical access to the hardware can intercept that key by some means. The only "hard" problem is this ex-post facto style access where the keyholder is dead. I guess the cops will have to stop
facecrime (Score:4, Informative)
"It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offense. There was even a word for it in Newspeak: facecrime, it was called."
"The telescreen received and transmitted simultaneously. Any sound Winston made, above the level of a very low whisper, would be picked up by it; moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live - did live, from habit that became instinct - in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized."
-Some quotes from 1984
Mmm Irony (Score:2)
Re: (Score:2)
100% Agree (Score:2)
We are there now. (Score:2)
Good move Apple (Score:3)
Apple: stop the posturing and fix your phones (Score:2)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
National security letters can make Apple turn over information they already have, but can't compel them to write new software to break security. Since Apple is objecting on the grounds that they don't want to write the software, it's a reasonable presumption that they haven't written it already. If the Feds knew Apple had the software, they wouldn't be demanding that Apple write it, they'd just have the court tell Apple to use the software. It's theoretically possible that Apple has written the crack, b
Re: (Score:2)
Sorry, that's just silly. Figuring out where the unlock count is kept in RAM is a moderate amount of work, but it isn't rocket science. Furthermore, an attacker with access to the hardware doesn't need code signing, they can simply change the lo
Modern codebreaking is HARD (Score:2)
If Apple is doing their job right, that's not possible - a flawless implementation of a decent modern encryption algorithm would take the combined computational resources of the entire planet many thousands of years to break a single code. Even most accidental flaws don't reduce that time to something terribly useful.
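To put numbers on that claim, here is an illustrative calculation; both the device count and the per-device guess rate are deliberately generous assumptions:

KEYSPACE = 2 ** 256           # e.g. a 256-bit AES key
DEVICES = 10 ** 10            # assume ~10 billion cooperating machines
KEYS_PER_SECOND = 10 ** 12    # assume each tests a trillion keys per second
SECONDS_PER_YEAR = 3.15e7

years = KEYSPACE / (DEVICES * KEYS_PER_SECOND) / SECONDS_PER_YEAR
print(f"Worst-case exhaustive search: about {years:.2e} years")
# ~3.7e47 years, which is why real attacks target the passcode and key handling,
# not the cipher itself.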
Unless of course someone has a functional quantum codebreaking computer, then it should only take a few hours, depending on speed. Has anyone heard if there are any cryptographic schemes in us
Re: (Score:2)
Certainly that's what they'd like to do, but if it were as simple as that they would simply copy the data off the device and brute force it on any PC within minutes since there are so few PIN options. Sadly (for them), as I understand it the PIN only grants access to the hardware-embedded encryption key, not directly to the data itself. So they need Apple's cooperation to bypass the lockout safeguard.
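Conceptually it works something like the sketch below. This is not Apple's actual key hierarchy, just an illustration of why entangling the PIN with a device-unique secret blocks offline brute force:

import hashlib, os

DEVICE_UID = os.urandom(32)   # stand-in for a per-device secret fused into the silicon

def derive_data_key(pin: str, iterations: int = 200_000) -> bytes:
    """Slow, device-entangled key derivation (illustrative parameters only)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, iterations)

key = derive_data_key("1234")
print(key.hex())

# An attacker who copies the flash but not DEVICE_UID cannot run this derivation
# offline; every guess has to go through the phone, which is exactly what the
# retry limit (and wipe) is there to throttle.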
And that strategy has nothing to do with the NSA building a codebreaking supercomputer, which is the ide
Re: (Score:2)
Nah, call it the FBiPhone!
Re: (Score:2)
100% preventable,, mostly kids, our spiritual & physical allies all over the wwworld... calling it/us society must be another madison ave, tackdick? like cold or civil war? all part of our wmd on credit greed fear ego based never ending holycost.. talk about a fairytail... truth+mercy=justice !in the moms we trust!.
Holy crap, see what happens when you hide a stoner's Fritos?
Re: (Score:2)
You have to trust someone.
Unless you want to write your own OS and create your own hardware you have no choice. Yes, you can use an Open Source OS so at least in theory it's possible to verify the code on your own... but there is so much firmware embedded in each and every chip you get there is no way you're going to be able to verify the phone from top to bottom.
In the end, you have to go by reputation and track records and make an informed decision. In my mind Apple has always been firmly on the side of
Re: oh boy (Score:2, Informative)
It is simply not possible to build the required tool in a way that:
- it will only run on this iPhone
AND
- it can not be trivially adapted to run on every other iPhone
The first part is completely possible, but the second part is impossible - by building the tool, you have done 99.999% of the effort required to do it for another phone. Maybe not quite that for secure enclave devices, but certainly for everything pre-A7.
This isn't a 4th amendment issue - it's a government owned phone. The same government that:
-
Re: (Score:2)
One correction to that; the hack would only work on an iPhone 5C, and not on the 5S or any newer model.
Re: (Score:2)
Apple makes quite a bit of its profits not in the US...
Re: (Score:2)
It's a bit more subtle than this: can the US Government order a programmer to write a program they don't want to write?
Re: (Score:2)
The search warrant is irrelevant. It was a work phone, and the actual owner has given permission. The Fourth Amendment has nothing to do with this.
Apple is legally obligated to hand over what information it has, and Apple has done so. Apple doesn't want to be forced to write special software to break the iPhone's security, and does not believe they can be legally required to do so.
Re: (Score:2)
Is it really an issue of constitutional law? I now doubt it, because if Olson and Apple were so confident in their interpretation of the law, i.e. they were sure they had a case, why would they put so much effort into creating this media sideshow? Why are they trying to fight this in the court of public opinion?
Excuse me, but that was started by the FBI. If a British newspaper writes "Apple refuses to unlock TERROR PHONE", then surely you should admit that Apple has the right to do a bit of positive PR of its own.