Charlie Miller Circumvents Code Signing For iOS Apps
Sparrowvsrevolution writes "At the SyScan conference in Taiwan next week, Charlie Miller plans to present a method that exploits a flaw in Apple's restrictions on code signing on iOS devices, the security measure that allows only Apple-approved commands to run in an iPhone's or iPad's memory. Using his method, an app can phone home to a remote computer that downloads new unapproved commands onto the device and executes them at will, including stealing the user's photos, reading contacts, making the phone vibrate or play sounds, or otherwise using iOS app functions for malicious ends. Miller created a proof-of-concept app called Instastock that appears to show stock tickers but actually runs commands from his server, and even got it approved by Apple's App Store."
Update: 11/08 02:54 GMT by U L : Not unexpectedly, Apple revoked Miller's developer license.
App redacted... (Score:2)
Re: (Score:3)
App redacted in 3...2...1.
But one has to think, if this application was approved, how many other approved applications in the App Store have some form of malicious code or other surreptitious data collection?
It seems the only reason Apple noticed this is because Charlie Miller published it.
This is why Apple's security model is fundamentally flawed. It provides a single point of failure for security. Those of us who work with networks understand that gateway-only security doesn't work, so trusting the gateway to get everything
Re: (Score:2)
Careful...I got flamed into oblivion last time I tried to bring up the "single point of failure" of Apple's security model. Dissent is not welcome in that house.
Re: (Score:2)
Wait, you're trying to say that slashdot is a *pro* Apple site?
Goodness!
This does sound like quite a serious security hole, so I expect it to be patched. Of course, slashdot will report the patching of this hole as "Apple patches iOS to prevent jailbreaking", just like the last time they closed the security vulnerability that was also used to provide jailbreaking ability.
If they don't close the hole, slashdot will crow about how "insecure" iOS is.
Y'know, classic "damned if you do, damned if you don't". Just
Re: (Score:2)
I'm waiting for a vendor to come up with a firewall program for your phone - think ZoneAlarm, where you are prompted to allow or block when apps request 'outside access'.
And if you make a version for the iPhone, it won't be approved. ;)
(Of course, it is possible for iPhone users to install "disapproved" apps from other sources. But only a few knowledgeable people will do that, so you certainly won't make much money from your app that way.)
Re: (Score:3)
Somebody hacked one of Michael Kristopeit's accounts and used it to post a useful comment! The world is at an end.
The external sources are probably web pages. The web page can be JavaScript-free right up until the app is approved, so I don't see how the review process can prevent this.
Re: (Score:2)
You did not use
in your post about MichaelKristopeit.
Re: (Score:2)
Well - if he failed to keep up with his standard crap, then it would be MichaelKristopeit = stagnated.
Re: (Score:2)
So, what if your application doesn't do anything with the content unless some magical condition is met? It would just be reading an HTTP GET response. Hardly something for Apple to look closely at.
So, the application checks stocks, including the author's own custom stock thingy on his website. Looks pretty innocent.
But if, for example, the page carries an HTML comment with a CRC of 42, the app looks for a second comment, which contains the new code, and acts on it.
This initial comment is only there about an hour, abo
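To make that concrete, here is a minimal sketch in Swift of such a trigger check. The checksum, the comment format, and the threshold value are all made up for illustration, and it assumes the page body has already been fetched:

    import Foundation

    // Scan the fetched page for HTML comments. Until the author plants
    // the trigger comment, this is just ordinary, innocent parsing.
    func payloadIfTriggered(html: String) -> String? {
        let regex = try! NSRegularExpression(
            pattern: "<!--(.*?)-->",
            options: [.dotMatchesLineSeparators])
        let comments = regex
            .matches(in: html, range: NSRange(html.startIndex..., in: html))
            .compactMap { Range($0.range(at: 1), in: html) }
            .map { String(html[$0]) }

        // Hypothetical trigger: the first comment checksums to 42,
        // so treat the second comment as the "new code" to act on.
        guard let first = comments.first,
              first.utf8.reduce(0 as UInt8, { ($0 &+ $1) % 251 }) == 42,
              comments.count > 1
        else { return nil }   // no trigger planted; behave innocently

        return comments[1]
    }

A reviewer exercising the app before the trigger comment exists sees nothing but a stock ticker parsing a web page.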
Boo apple. (Score:1)
Yay jailbreak.
Admiral! (Score:1)
Re: (Score:2)
IT'S A TR.... oh forget it.
Re: (Score:2)
*whew* I thought that was going to be a 4chan reference.
Not so walled garden... (Score:2)
Joy...oh joy...oh rapture.
Re: (Score:2)
...they say "only install apps from trusted publishers"...yeah...ok...so, no one? If I did that, I'd have only the pre-loaded apps.
And I'd have zero.
Re:Not so walled garden... (Score:5, Informative)
> This isn't really news
Actually it is. The way these things get fixed is by making people aware of the problem. No software is absolutely bug-free. As much as some people would like to stick their fingers in their ears and say "la-la-la, not a problem...", there are just as many of us who would like to fix the issue. So, yes, this is news.
Re: (Score:2)
Actually it was a flaw introduced last year when Apple relaxed restrictions, apparently to increase browser speed:
From TFA:
Re: (Score:2)
Re: (Score:2)
Not much like that at all. Those were kernel-level drivers that executed with system-level authority. This is something that would execute in user space, and it only does so because they relaxed some restrictions, allowing it to execute in a lower memory space; it certainly doesn't have root authority.
Heaven forbid! (Score:5, Funny)
It could also lead to people developing unapproved apps and selling them to people on the black market - and thus, with the wall breached, the Apple hegemony fell and there was much rejoicing!
"Yea!"
Treacherous computing (Score:3)
That's the definition of trusted computing - it trusts someone else, and not the owner. So that someone else, or anyone who compromises them, gets to control your device before you do.
Re: (Score:2)
True. But the alternative to that is untrusted computing - i.e., any app you install gets more control over the device than you.
The vast majority of users are not even remotely capable of providing a higher level of trust than a competent third party. This is akin to representing yourself in court instead of hiring a lawyer who is an expert in the laws and defence techniques that apply to your case. Step and repeat for each app you install.
Re: (Score:2)
Not a flaw (Score:2)
It's not a flaw, it's a feature!
Translation (Score:4, Informative)
Most of the article was quite puzzling, as this is nothing new or remarkable. It's really quite simple to have your application execute stuff it downloads.
If I can reverse-engineer the uninformative article a little, I would hazard a guess to say that he's found a way of bypassing the NX bit protection using Safari as an attack vector. This means that he would be able to inject arbitrary ARM code that wasn't present on the device at review time, meaning that he could execute code against APIs that the application wasn't originally using (but which are available for applications to use legitimately).
As an attack, it sounds real enough; however, in real-world terms, Apple's review process is leaky enough that you could avoid getting caught anyway. Their review consists of some trivial automated checks, and everything else is handled by a human reviewer who just looks at the application from an end-user's point of view. During the submission process you have to include instructions on how to trigger any Easter eggs in your application, because they wouldn't otherwise find them.
Re: (Score:2)
So then why has no one just built an app that is friendly until 10k downloads, at which point it does some evil?
To me it seems like something spammers/malware folks would have thought of by now.
Re: (Score:2)
How do you know there aren't several of those already on the Store that simply haven't been detected?
Re: (Score:2)
Because it isn't possible. This app demonstrates a bug in the way the NX bit is applied that makes it accidentally possible.
You see, one thing iOS 4.3 did was use a new JavaScript engine in Safari. You may remember that web clips ran JavaScript much slower than if they ran inside Safari.
One trick Safari did was
Re: (Score:1)
reading comprehension (Score:2)
Next time, RTFA. It's not at all like what you said.
Re: (Score:2)
He doesn't have to do that. Just build in an interpreter whose C functions are rather full-featured but mostly unused.
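For instance, a toy version of that idea in Swift (command names and the dispatch scheme are hypothetical; the point is that every function is reviewed, signed code, while the server decides after approval which ones actually run):

    import Foundation

    // "Rather full featured but mostly not used": each built-in is
    // legitimate, signed code that sails through review.
    let builtins: [String: () -> Void] = [
        "vibrate":   { print("vibrate the phone") },   // placeholder actions
        "playSound": { print("play a sound") },
        "upload":    { print("upload the contacts") },
    ]

    // The "interpreter": a script is just newline-separated command
    // names, fetched from the author's server after approval.
    func run(_ script: String) {
        for line in script.split(separator: "\n") {
            builtins[String(line)]?()
        }
    }

    run("vibrate\nupload")   // stand-in for a server-supplied script

No unsigned code ever executes; the downloaded "script" is just data steering which signed code paths get exercised.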
Re: (Score:2)
So, you are saying Apple hasn't solved the halting problem yet? Those ignorant bastards...
Re: (Score:2)
If I can reverse-engineer the uninformative article a little, I would hazard a guess to say that he's found a way of bypassing the NX bit protection using Safari as an attack vector. This means that he would be able to inject arbitrary ARM code that wasn't present on the device at review time, meaning that he could execute code against APIs that the application wasn't originally using (but which are available for applications to use legitimately).
Nope, he wrote a sleeper app (basically malware with trojan functionality) and put it up on the App Store. Using the "backdoor" in the app, he could download, install and run unsigned code. Apps in the App Store run binary code. You don't need to inject code anywhere into a browser.
Also, what he did was EXPLICITLY AGAINST the developer agreement he made when he became an Apple Developer. He basically proved that you could write code with trojan functionality that violated developer agreements, lie about the functionality to Apple, and get it published on the App Store. Apple found out and took his App down and then took away his developer license.
Re: (Score:2)
He basically proved that you could write code with trojan functionality that violated developer agreements, lie about the functionality to Apple, and get it published on the App Store. Apple found out and took his App down and then took away his developer license.
So iOS is secure against developers that tell Apple about the malware in their apps. That gives me a really warm, fuzzy feeling ...
Re: (Score:2)
So iOS is secure against developers that tell Apple about the malware in their apps. That gives me a really warm, fuzzy feeling ...
Yes... however, if Apple finds malware in an App, it is pulled from the App Store and the developer is banned. But anything you install could potentially be malware. Then again, I'd venture to say malicious developers can take advantage of *ANY* current software platform once you've installed their software.
Re: (Score:2)
Yes, when a "white hat hacker" like this Miller guy shows up and demos a security hole to Apple, Apple's response is to pull his app and ban him.
This is supposed to reassure us of iOS's security exactly how?
The intended effect seems to be to "send a message" to others who may be playing with such things. And that message is "Don't tell us about security problems you find; we don't want to hear about them. Go sell the info to interested buyers, like any self-respecting businessman would do."
Re: (Score:2)
Re: (Score:3)
All memory allocated by user apps is NX. Your code is not going to execute no matter how many buffers you stuff it in.
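That's easy to check for yourself. A minimal sketch in Swift against the BSD layer (the expectation stated here follows the parent's claim; on a stock iOS device the writable-and-executable mapping should be refused):

    import Darwin

    // Ask for a page that is simultaneously writable and executable.
    // On stock iOS, user processes should be denied W+X pages, which is
    // why shellcode stuffed into an ordinary buffer never executes.
    let page = mmap(nil, 4096,
                    PROT_READ | PROT_WRITE | PROT_EXEC,
                    MAP_ANON | MAP_PRIVATE, -1, 0)
    if page == MAP_FAILED {
        print("W+X mapping refused: \(String(cString: strerror(errno)))")
    } else {
        // On platforms that allow it (e.g. macOS), the mapping succeeds;
        // the iOS 4.3 change Miller abused opened a comparable gap for
        // one region.
        print("W+X mapping allowed")
    }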
Native code (Score:5, Interesting)
So long as iOS apps are developed using a language that allows pointer access, including function pointers, people are going to find and exploit bugs like this. It's actually a really interesting parallel to homebrew development on Windows Phone (yes, I have one, in addition to a few Linux devices - no iOS ones though): you can do native code on WP7, but you have to use COM to access it. Microsoft prohibits ISVs from using the COM import API from C#/VB in marketplace apps, so they can very easily block this kind of thing by just checking for references to a few specific APIs (they also block the use of C# "unsafe" pointers).
Now, I'm not exactly advocating that Apple needs to re-design their entire application model. However, the fact remains that the way they do it, it's almost impossible to really verify that any given app isn't doing something like this, short of code-reviewing the source of every submission and rejecting any that are too hard to understand (completely impractical). It means they *are* vulnerable to malware, though - even from the "trustworthy" marketplace.
Re: (Score:3)
It does not matter what they do; without code reading, apps can always do evil.
I could submit a time zone calculator app that waits until 06/06/2012 and, instead of opening properly, shows goatse. With the limited testing Apple does, how would they ever know?
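The date gate in that example is only a few lines; a sketch in Swift (the function names are placeholders, not from any real app):

    import Foundation

    func showTimeZoneCalculator() { print("normal UI") }     // placeholder
    func showGoatse()            { print("hidden payload") } // placeholder

    // Behave normally until the hard-coded date; black-box review done
    // before 06/06/2012 only ever sees the time zone calculator.
    let trigger = DateComponents(calendar: .current,
                                 year: 2012, month: 6, day: 6).date!
    if Date() < trigger {
        showTimeZoneCalculator()
    } else {
        showGoatse()
    }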
Re: (Score:2)
I meant code reviews. They would also have to reject any app more complicated than the most basic of software.
Re: (Score:2)
"Almost impossible"?
It's a more complicated problem than determining whether the program will halt.
Re: (Score:2)
"Almost impossible"?
It's a more complicated problem than determining whether the program will halt.
And we all know that's easy because we can see the screen freeze and we have to reboot. Problem solved! Right? Right! :)
For the people not getting this: http://en.wikipedia.org/wiki/Halting_problem [wikipedia.org]
Re: (Score:2)
Well he got through one wall with that method. There are still more walls.
Re: (Score:2)
You know, your post would have a lot more credibility if you could spell "virtualization" correctly.
I was making a point about the validity, or lack thereof, of API-based trust boundaries (you know, what the whole article was about). It's entirely possible to make an API-based trust boundary in a language that doesn't support pointers. It's not possible in a language that does. You need something else to enforce your trust boundaries, or you need to accept that they will be vulnerable. Apple is taking the l
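A concrete toy example of that point, in Swift (the type and its "check" are invented for illustration): once a language hands out raw pointer access, an API-enforced check is only advisory.

    // The sanctioned way in: an accessor that performs a check.
    struct Token {
        private let secret: UInt64 = 0xDEADBEEF
        func value(ifAuthorized ok: Bool) -> UInt64? { ok ? secret : nil }
    }

    var token = Token()
    // The pointer way in: read the "private" field straight out of
    // memory, bypassing the accessor and its check entirely.
    let leaked = withUnsafeBytes(of: &token) { $0.load(as: UInt64.self) }
    print(String(leaked, radix: 16))   // prints "deadbeef"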
Ok, fanbois, tell me all about the walled garden (Score:2)
It was only a matter of time. Since they only do black-box testing, it should not have taken this long for an app that waits to do evil until after it is in the wild to get approved.
Re: (Score:2)
Charlie Miller? (Score:2)
I bet he's recording some sick jams [archive.org] with his unsigned iOS apps.
So (Score:1)
Re: (Score:1)
And when it happens to OSS it's an example of how it's inherently more secure.
last line is a gem (Score:1)
FTA:
"Android has been like the Wild West," says Miller. "And this bug basically reduces the security of iOS to that of Android."
Lolz.
Re: (Score:2)
Did you or did you not notice that the whole point of what Charlie Miller did was that the sandbox was breached, despite ASLR, and he was able to do it from an app allowed into the walled "solution"?
Please explain how an app store that is unable to detect malware but *claims* to be inherently secure is actually more secure? If anything, I see it as the opposite - it will delude people (like yourself) into thinking it's safe, when it's actually not. Android, by comparison, is acknowledged to have malware - meaning people need to be more cautious about the apps they install.
Re:last line is a gem (Score:4, Insightful)
Did you or did you not notice that the whole point of what Charlie Miller did was that the sandbox was breached, despite ASLR, and he was able to do it from an app allowed into the walled "solution"?
Please explain how an app store that is unable to detect malware but *claims* to be inherently secure is actually more secure? If anything, I see it as the opposite - it will delude people (like yourself) into thinking it's safe, when it's actually not. Android, by comparison, is acknowledged to have malware - meaning people need to be more cautious about the apps they install.
I think the numbers of actual malware on the two platforms speak for themselves. And in iOS' case, Apple-haters certainly can't claim "security through obscurity" or "lack-of-marketshare" excuses.
And I, for one, would rather have a guard who repels 99.99999999999999% of enemies, than me having to stay up every night with a shotgun in my hand, protecting my home and my loved ones.
Window screens don't stop all insects; but take them away, and pretty soon, all you'll have time to do all day, every day (and every night) is swat flies. Which would you prefer: The occasional gnat in your beer, or having flies crawling all over your dinner, every single day?
Re: (Score:2)
Yep, people need to look at the apps they install and ask them whether they harbor malware. There, the security problem is now deemed solved; we can rest easy from here on out.
Already removed (Score:2)
The app in question has already been pulled from the App Store. And I'm quite sure the flaw that allows executing code via some hole in Safari will be fixed very soon. iOS 5 supports delta updates now, so Apple can (and will) ship small updates much more often than in the past.
I'm still torn about security in such appliances. Ideally the user should fully own the device as well as all code running on it, but in practice, users being what they are, having a central control instance may very well be the lesser evil.
Re: (Score:3)
The app in question has already been pulled from the App Store. And I'm quite sure the flaw that allows executing code via some hole in Safari will be fixed very soon. iOS 5 supports delta updates now, so Apple can (and will) ship small updates much more often than in the past.
Unless he's figured out how to sign apps such that the OS thinks they are from Apple when they aren't. Then Apple would have to revamp their code signing system.
Re: (Score:2)
He hasn't.
Re: (Score:2)
Well that's breaking encryption in general. That takes down much more than just the app store.
Re: (Score:2)
Well that's breaking encryption in general. That takes down much more than just the app store.
Assuming Apple's algorithms are implemented properly, which is never a guarantee. Look at Sony.
Re: (Score:2)
The provisioning profile stuff is an open source part of Core Data. I haven't personally checked it, but given that the encryption has been in the open for 5 years....
Re: (Score:2)
The app in question has already been pulled from the App Store. And I'm quite sure the flaw that allows executing code via some hole in Safari will be fixed very soon. iOS 5 supports delta updates now, so Apple can (and will) ship small updates much more often than in the past.
Unless he's figured out how to sign apps such that the OS thinks they are from Apple when they aren't. Then Apple would have to revamp their code signing system.
He clearly stated that he went AROUND the code signing requirement; NOT that he "broke" the signing process itself.
Re: (Score:2)
I'm still torn about security in such appliances. Ideally the user should fully own the device as well as all code running on it, but in practice, users being what they are, having a central control instance may very well be the lesser evil.
Let the end user decide whether they want the central control or not. Just make sure that status can't be altered by anyone other than the actual user.
With digital devices filling every part of my life now, the very thought of being personally responsible for every bit of code running on every one of them makes me shudder. Life is just too short for that.
Do I trust Apple? Not very much. Do I trust Apple more than myself when I haven't got the time to spend more than a few minutes a day to care for each device (and its software) that I own and use? Probably, yes. Sad but true.
Is there someone else you trust? How about having a trust choice? And for those of us that do trust ourselves, "self" should be one of the choices.
I do trust Apple with one thing ... that they will make business decisions that they believe will boost their bottom line. That is the only thing I trust them to do.
Misunderstanding as to approval? (Score:2)
The summary says "Apple-approved commands to run in an iPhone's or iPad's memory."
I'm not sure if that's the normal slashdot misunderstanding/hyperbole, if it's another reporter ignorance/flamebait thing, or if that's actually in what cmiller posted.
Apps are Apple-approved. Apps can't use Apple's non-public frameworks. Saying you can't run non-Apple-approved commands is completely inaccurate.
Re: (Score:2)
In theory, that's how it should work, by design. But when there's a bug in the code somewhere, that can provide a means to go around the checks. Too bad Apple inherited Steve Jobs' arrogance and refuses to work with security researchers.
Still safer than completely unvetted apps (Score:2)
It's not more secure (Charlie Miller keeps demonstrating that), but for the typical user (who doesn't know enough about security to judge an app), having a vetting/approval process such as Apple's still offers a safer environment than running completely unvetted apps (such as on the Android stores).
Re:Still safer than completely unvetted apps (Score:4, Insightful)
Except, it gives a false sense of security. With Android (or PC) apps, I know that there's a risk of malware, so I'm cautious. With iOS - well, I don't have one, but I imagine there are a lot of people who think "it *can't* have malware, Apple checks everything!" and therefore completely trust anything in the app store.
The purpose of work like this is to demonstrate that Apple has misled those people; you can't simply trust everything. The only thing worse than an obviously untrustworthy app source is an untrustworthy app source that *appears* to be trustworthy.
Re: (Score:2)
Which makes absolutely no difference to the 95+% of users who don't know enough about security to make such an evaluation. No matter how many times users get burned, if they don't understand security, most of them will make the same mistake next time simply because they don't know how to evaluate an app for security. And for those who do know about security, it doesn't stop them from exercising caution. Therefore, the "false sense of security" actually makes no difference.
Re: (Score:2)
So someone needs to watch out for that 95+%. Apple and Miller are both trying to do that. One of those two is even willing to cooperate with the other to that end goal. The other appears to be on the track to dishonesty over the matter.
Re: (Score:2)
Agreed. I'm a big fan of CM, and the rest of the ethical security researchers.
Apple's reaction to security vulnerabilities is pretty poor. I have personal experience with that since I reported a vulnerability in QT for Windows (CVE-2010-0530 [apple.com]) that they took over a year to fix, and didn't fix it properly when they did.
Apple isn't the only vendor to have such poor policies, just one of the most visible.
Re: (Score:2)
I think by coming here you ensured that you are talking to the 5% that do care....
Which has absolutely nothing to do with my statement. My statement is about all users. That's the problem with most of the users on here: they can't see that most of the users aren't interested in the same things they are.
Re: (Score:2)
Except, it gives a false sense of security. With Android (or PC) apps, I know that there's a risk of malware, so I'm cautious.
And why do you imagine your caution is better than someone whose job is vetting apps? For example, what automated tools do you have for looking for suspicious API calls? Do you, like the app store reviewers, have test devices that don't contain your actual live data? Do you, like the app store, find out that the developer of the app is real enough to have a tax code?
Or is the reality of your "caution" that you're just going to guess?
Re: (Score:2)
It mostly comes down to using either apps from big names that are well-known and have a reputation to uphold, or using open-source apps. If I need an app that is neither, I can run it through a proxy and monitor what it connects to via my PC. Granted, the first approach isn't guaranteed, the second isn't guaranteed unless I both check the source and compile it myself for checking against the version in the app package, and the third is a hassle. It's possible, though - and I guarantee that the folks at Apple don't have the time or people to properly verify the apps either, nor do they seem to have the personal incentive to do it right.
Re: (Score:2)
I guarantee that the folks at Apple don't have the time or people to properly verify the apps either, nor do they seem to have the personal incentive to do it right.
I know better than your "guarantee". The app store review process found a crashing bug in one of my apps that neither I nor my partner had ever come across in our testing. It took me two days to reproduce it myself. I know from what had to happen to trigger that bug that either the reviewer gave it a very thorough evaluation, or they have a fuzzer that randomly operates the UI of the app for an extended period.
Also interesting is that you earlier pointed out the hazards of trusting software, and here you're willing to
Re: (Score:2)
With iOS - well, I don't have one, but I imagine there are a lot of people who think "it *can't* have malware, Apple checks everything!" and therefore completely trust anything in the app store.
Any iOS user who thinks this after a tethering app was slipped through as a flashlight app deserves whatever they get.
Re: (Score:2)
Did you also explain to them that Apple iOS is already NOT even making an attempt to protect their privacy by blocking apps from getting personal information? At least Android tries.
Yes, we should make these things as secure as users think they are. Too bad Apple has changed course on this, having inherited Steve's arrogance without inheriting his wisdom.
Re: (Score:2)
Well, that depends.
Take the TSA as an analogy. One of their many jobs is to detect things like knives, guns, explosives and other nasty things being brought aboard airplanes. And they are pretty successful when people have forgotten that they have one of the forbidden items in their luggage. But if you make a bit of an effort to hide these things, they seem to have a poor success rate for detecting them.
Generally, most people have a pretty low opinion of the TSA's "Security Theater." It doesn't really
Re: (Score:2)
Flawed analogy. Forget that the TSA is searching for weapons when they need to be watching for suspicious behavior. Forget that they're irradiating passengers and groping others for their illusion of security.
The fundamental problem with the analogy is that air passengers know to watch for weapons, suspicious behavior, etc. In fact, passengers are the only ones who have actually caught any attempts at terrorism in the last 10 years, not the TSA. Passengers can still do something to detect and stop an attack
Re: (Score:2)
I'm not sure I see the flaw.
TSA's job is to prevent passengers from bringing weapons onto the airplane. They have some successes [nypost.com] and notable failures [judicialwatch.org] in doing this. Apple's job is to prevent malicious code from running on our iPhones and iPads and I'm sure they have some successes and failures.
What you're saying is that it's okay that the TSA might fail every now and again because the passengers will spot the malicious person and prevent him from performing his dastardly task. Of course, passengers [cnn.com] tend [huffingtonpost.com]
Re: (Score:2)
I'm not sure I see the flaw.
TSA's job is to prevent passengers from bringing weapons onto the airplane. They have some successes and notable failures in doing this.
No, the TSA's job is to stop terrorists from hijacking planes, not to keep guns off planes. If half the passengers had guns, the terrorists wouldn't try hijacking a plane. And that's the fundamental problem with the TSA, their focus is on passengers as threats, rather than on the threat to the passengers. That's like saying locks are to keep you from opening a door. No, the lock is to protect what's behind the door, the door and lock are just one mechanism of providing protection.
What you're saying is that it's okay that the TSA might fail every now and again because the passengers will spot the malicious person and prevent him from performing his dastardly task.
No, I'm saying it's impossi
Actually less safe than completely unvetted apps (Score:2)
It's not more secure (Charlie Miller keeps demonstrating that), but for the typical user (who doesn't know enough about security to judge an app), having a vetting/approval process such as Apple's still offers a safer environment than running completely unvetted apps (such as on the Android stores).
Actually it's less safe.
Users in the "walled garden" have a false sense of security; the security is breached, and the users still unquestioningly trust everything from a now-untrustworthy source.
Apple has a vetting process that doesn't work. How is that different to an unvetted source?
So essentially, with Android you have unvetted applications; with Apple you have unvetted applications and a user base which is actively ignorant of security issues. Despite the rumours to the contrary, there has been no great Android outbreak precisely because Android users are aware of their own security.
Re: (Score:2)
So essentially, with Android you have unvetted applications, with Apple you have unvetted applications
Except that Apple do do vetting, and thus do have vetted apps.
You claim it doesn't work. The lesson of four years of the Apple App Store is that it does work.
Despite the rumours to the contrary, there has been no great Android outbreak precisely because Android users are aware of their own security.
The average Android user is not like you. The average Android user is the average phone user. They're not geeks. They don't understand security. They are exactly the same people that load animated cursors, smiley packages and screensavers on their Windows PCs.
There has been lots more malware on Android than iOS.
Re: (Score:2)
First, clearly you didn't read my reply [slashdot.org] to the previous commenter who used the "false sense of security" fallacy. Actually, the "false sense of security" argument can be many fallacies, linked below:
Appeal to belief [nizkor.org]. e.g. Many people claim it gives a false sense of security, therefore, it must. Show that it actually has that effect before you use it as your premise. A hypothetical premise only gives a hypothetical result.
Begging the question [nizkor.org]. e.g. Giving people "false sense of security" makes them less safe
Re: (Score:2)
Android does not lead people to a false sense of security.
Re: (Score:2)
Which means nothing. If you had bothered to think about it or read any of my replies to the other people making that same claim, you wouldn't have bothered to repeat their mistake.
The article title is a bit misleading (Score:2)
What has been broken here is not the code-signing apparatus per se but another part of the Apple security regimen; it appears this doesn't affect the need to have a valid initial certification to begin with. If the signing mechanism were defeated, that would conceivably allow anyone and his dog to upload and sell apps on the store without registering as a developer. But it isn't. So, in fact, the only people who could leverage this issue for nefarious purposes are people who are already working in the marke
Re: (Score:2)
I agree that the article was not entirely clear on how code signing is broken. This approach seems to be the ability to sideload new code. That's code that hasn't been signed and that code hasn't got
Well-researched article, not! (Score:2)
Apple's iPhones and iPads have remained malware-free thanks mostly to the company's puritanical attitude toward its App Store: Nothing even vaguely sinful gets in, and nothing from outside the App Store gets downloaded to an iOS gadget.
WTF? Are you serious? Games and apps download data external to the App Store all the time - e.g., the myFish3D app downloads new 3D models for fish and ornaments from its home site, uselessiphonestuff.com.
Re: (Score:2)
s/nothing/no executable code/
It's not terribly well-written, but the gist of it is fairly accurate.
Apple runs scared (Score:2)
The Doctor pwns OS X, he keeps his license. The Doctor pwns iOS via Safari, he keeps his license. The Doctor pwns Apple's walled garden, and they take his license.
Re: (Score:2)
The Doctor pwns OS X, he keeps his license. The Doctor pwns iOS via Safari, he keeps his license. The Doctor pwns Apple's walled garden, and they take his license.
He was grandstanding. He could have EASILY contacted Apple on the downlow; but Noooooo! He had to grandstand, thus alerting the rest of the planet to the exploit BEFORE Apple had a chance to close the vulnerability.
He got exactly what he deserved (except that Apple should sue him into oblivion, and have him prosecuted for unauthorized access to a computer system, too).
In other words, Miller should thank his lucky stars that a company with a bigger legal department than most U.S. States have, and a nearl
Re: (Score:2)
Sure, he may have notified them. But did he also tell them that he seeded the App Store with a trojan, which gives him remote access to exploit the flaw, and which is also available to all iOS users for download?
If Apple ignored him, he could have very simply exposed the flaw publicly to shame them. The moment that he decided to violate policies, subvert the vetting process and inject into the App Store an app exploiting the flaw--at that moment, he made his bed and now he must sleep in it.
Re: (Score:2)
He was probably booted for subverting the vetting process by submitting an app exploiting the flaw publicly, where it could be downloaded by millions of people. The fact t
Re: (Score:2)
2. VOLUNTARILY pulled his App from the Store INSTANTLY, once it was Approved.
THEN he might be considered a "white hat" who was just trying to make Apple aware of an unknown vulnerability.
But we all know he didn't do that, did he?
Your turn...
Re: (Score:2)
He told Apple about the flaw on the 14th of October; please disengage the reality distortion field.
No RDF here, buddy!
So, he told Apple about it, and in FAR less time than they could research, code, and TEST a fix, he decided to tell the rest of the planet. What's so "noble" about that?
To prove his point, he wrote & submitted an application to the App Store that was approved.
And then LEFT it on the App Store until APPLE pulled it. Again, not "noble".
Why should he tell Apple his app is abusing this flaw?
Depends on what his TRUE motivations are, now doesn't it?
Shouldn't Apple be creating a tool/procedure to block the flaw or detect it during the vetting process (to which all apps will have to retroactively be submitted)?
Assuming they are prescient, and KNOW about the (extremely narrow, according to cmiller) vulnerability before cmiller told them about it, yes. But obviously, they didn't, nor can anyone who
That's so rude! (Score:2)
Miller announced the news on Twitter this afternoon, saying "OMG, Apple just kicked me out of the iOS Developer program. That's so rude!"
- cnet.com [cnet.com]
Really? You've been around Apple and seen how they react for how many years and you were surprised by this?
Re: (Score:3)
Yes it is, actually. How do you implement an API that guarantees that you go through that API to get access to something? It doesn't matter if you build your lovely "you don't get permission to anything unless the gatekeeper agrees" system if you can simply go "well, I'm ignoring the gatekeeper and jumping through this hole in the wall". That's what a security flaw actually is ;)
Re: (Score:2)
The whole point is that there is a security hole in Apple's security model. What you're saying is that if there is a bug, it implies the model is inherently broken?
Wow, lots of things are broken down here, trust me on that one.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
I have always thought that executing code on an iOS device in this way was possible; I just never thought Apple would actually miss the fact that the app was downloading external code.
Charlie himself stated that he worked for months finding ONE corner-case that Apple didn't catch. It wasn't like this was right there in the open for all to see.