Charlie Miller Circumvents Code Signing For iOS Apps

Sparrowvsrevolution writes "At the SyScan conference in Taiwan next week, Charlie Miller plans to present a method that exploits a flaw in Apple's restrictions on code signing on iOS devices, the security measure that allows only Apple-approved commands to run in an iPhone's or iPad's memory. Using his method, an app can phone home to a remote computer that downloads new unapproved commands onto the device and executes them at will, including stealing the user's photos, reading contacts, making the phone vibrate or play sounds, or otherwise using iOS app functions for malicious ends. Miller created a proof-of-concept app called Instastock that appears to show stock tickers but actually runs commands from his server, and even got it approved by Apple's App Store." Update: 11/08 02:54 GMT by U L : Not unexpectedly, Apple revoked Miller's developer license.
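
For readers who want the mechanics: code signing is enforced at the point where a process asks the kernel for executable memory. Below is a minimal C sketch of the request a stock iOS app is normally denied. It is illustrative only -- Miller has not published his exploit code -- and the comments note where iOS 4.3's JIT exception reportedly comes in.

    /* Illustrative sketch only, not Miller's unpublished exploit.
     * Code signing forbids mapping unsigned pages writable AND executable.
     * iOS 4.3 added a narrow exception so Safari's Nitro JIT could do
     * exactly this; Miller's flaw reportedly made a similar region
     * reachable from an ordinary App Store app. */
    #include <stdio.h>
    #include <sys/mman.h>

    int main(void) {
        /* Request an anonymous page that is writable and executable. */
        void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                          MAP_PRIVATE | MAP_ANON, -1, 0);
        if (page == MAP_FAILED) {
            perror("mmap"); /* the expected outcome under correct enforcement */
            return 1;
        }
        /* If this succeeds, bytes copied here -- say, ARM code fetched from
         * a remote server -- can later be run as native code. */
        printf("writable+executable page at %p\n", page);
        return 0;
    }
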
  • App redacted in 3...2...1.
    • by mjwx ( 966435 )

      App redacted in 3...2...1.

      But one has to think, if this application was approved, how many other approved applications in the App Store have some form of malicious code or other surreptitious data collection?

      It seems the only reason Apple noticed this is because Charlie Miller published it.

      This is why Apple's security model is fundamentally flawed. It provides a single point of failure for security. Those of us who work with networks understand that gateway-only security doesn't work, so trusting the gateway to get everything

      • Careful...I got flamed into oblivion last time I tried to bring up the "single point of failure" in Apple's security model. Dissent is not welcome in that house.

        • by jo_ham ( 604554 )

          Wait, you're trying to say that slashdot is a *pro* Apple site?

          Goodness!

          This does sound like quite a serious security hole, so I expect it to be patched. Of course, slashdot will report the patching of this hole as "Apple patches iOS to prevent jailbreaking", just like the last time they closed the security vulnerability that was also used to provide jailbreaking ability.

          If they don't close the hole, slashdot will crow about how "insecure" iOS is.

          Y'know, classic "damned if you do, damned if you don't". Just

  • by Anonymous Coward

    Yay jailbreak.

  • Enemy lawsuits detected in Sector 3-7!
  • This isn't really news...I imagine this 'flaw' will be found in every version of iOS until it dies. Not only that, but we should be suspicious of app producers...they say "only install apps from trusted publishers"...yeah...ok...so, no one? If I did that, I'd have only the pre-loaded apps.

    Joy...oh joy...oh rapture.
    • ...they say "only install apps from trusted publishers"...yeah...ok...so, no one? If I did that, I'd have only the pre-loaded apps.

      And I'd have zero.

    • by sl4shd0rk ( 755837 ) on Monday November 07, 2011 @04:45PM (#37978318)

      > This isn't really news

      Actually it is. The way these things get fixed is by making people aware of the problem. No software is absolutely bug free. As much as some people would like to stick their fingers in their ears and say "la-la-la not a problem...", there are just as many of us who would like to fix the issue. So, yes, this is news.

    • Actually it was a flaw introduced last year when Apple relaxed restrictions, apparently to increase browser speed:

      From TFA:

      Miller became suspicious of a possible flaw in the code signing of Apple’s mobile devices with the release of iOS 4.3 early last year. To increase the speed of the phone’s browser, Miller noticed, Apple allowed javascript code from the Web to run on a much deeper level in the device’s memory than it had in previous versions of the operating system. In fact, he realized

      • Not unlike how Windows moved GUI drivers into the kernel and ran them at elevated privileges, back in NT 4. And we know how well that worked out...
        • Not much like that at all. Those were kernel-level drivers that executed with system-level authority. This is something that would execute in user space, and it only does so because Apple relaxed some restrictions, allowing it to execute in a lower memory space; it certainly doesn't have root authority.

  • by ackthpt ( 218170 ) on Monday November 07, 2011 @04:36PM (#37978224) Homepage Journal

    It could also lead to people developing unapproved apps and selling them to people on the black market - and thus, with the wall breached, the Apple hegemony fell and there was much rejoicing!
      "Yea!"

    • That's the definition of trusted computing - it trusts someone else, and not the owner. So that someone else, or anyone who compromises them, gets to control your device before you do.

      • True. But the alternative to that is untrusted computing - i.e., any app you install gets more control over the device than you do.

        The vast majority of users are not even remotely capable of providing a higher level of trust than a competent third party. This is akin to representing yourself in court instead of hiring a lawyer who is an expert in the laws and defence techniques that apply to your case. Step and repeat for each app you install.

      • by zoloto ( 586738 )
        with nearly all of the iOS users out there, this is a far better alternative than trusting them.
  • It's not a flaw, it's a feature!

  • Translation (Score:4, Informative)

    by Bogtha ( 906264 ) on Monday November 07, 2011 @04:40PM (#37978274)

    Most of the article was quite puzzling, as this is nothing new or remarkable. It's really quite simple to have your application execute stuff it downloads.

    If I can reverse-engineer the uninformative article a little, I would hazard a guess to say that he's found a way of bypassing the NX bit protection using Safari as an attack vector. This means that he would be able to inject arbitrary ARM code that wasn't present on the device at review time, meaning that he could execute code against APIs that the application wasn't originally using (but which are available for applications to use legitimately).

    As an attack it sounds real enough; however, in real-world terms, Apple's review process is leaky enough to avoid getting caught anyway. Their review consists of some trivial automated checks, and everything else is handled by a human reviewer who just looks at the application from an end-user's point of view. During the submission process you have to include instructions on how to trigger any Easter eggs in your application because they wouldn't otherwise find them.

    • by h4rr4r ( 612664 )

      So then why has no one just built an app that is friendly until 10k downloads at which point it does some evil?

      To me it seems like something spammers/malware folks would have thought of by now

      • How do you know there aren't already several of those on the Store that simply haven't been detected?

      • by tlhIngan ( 30335 )

        So then why has no one just built an app that is friendly until 10k downloads at which point it does some evil?

        To me it seems like something spammers/malware folks would have thought of by now

        Because it isn't possible. This app demonstrates a bug in the way the NX bit works that accidentally makes it possible.

        You see, one thing iOS 4.3 did was use a new JavaScript engine in Safari. You may remember that web clippings ran JavaScript much slower than if they ran inside Safari.

        One trick Safari did was

    • by Elbart ( 1233584 )
      At least the app had no bare tits. The security flaws? Meh.
    • Next time, RTFA. It's not at all like what you said.

    • by jbolden ( 176878 )

      He doesn't have to do that. Just have a built-in interpreter whose C functions are rather full-featured but mostly unused (a minimal sketch follows below).
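
      A minimal sketch of the pattern jbolden describes, with hypothetical names throughout (nothing here is from Miller's actual app): the reviewed binary ships only an innocuous dispatch table, and the "program" arrives later as plain text, so no code-signing check is ever tripped.

        #include <stdio.h>
        #include <string.h>

        /* Full-featured native functions shipped in the reviewed binary
         * but mostly unused at review time. Names are hypothetical. */
        static void cmd_vibrate(void)     { puts("vibrate handset"); }
        static void cmd_grab_photos(void) { puts("upload photo library"); }

        /* Dispatch a command string received over the network. To a
         * reviewer this is a harmless table lookup; the interesting
         * behavior is whatever text the server sends after approval. */
        static void dispatch(const char *cmd) {
            static const struct { const char *name; void (*fn)(void); } table[] = {
                { "vibrate", cmd_vibrate },
                { "photos",  cmd_grab_photos },
            };
            for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
                if (strcmp(cmd, table[i].name) == 0) { table[i].fn(); return; }
        }

        int main(void) {
            dispatch("photos"); /* in a sleeper app this comes from the server */
            return 0;
        }

      Note that this runs no unsigned native code at all, which is the point: it sails under code signing rather than through it.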

    • by gtall ( 79522 )

      So, you are saying Apple hasn't solved the halting problem yet? Those ignorant bastards...

    • by adisakp ( 705706 )

      If I can reverse-engineer the uninformative article a little, I would hazard a guess to say that he's found a way of bypassing the NX bit protection using Safari as an attack vector. This means that he would be able to inject arbitrary ARM code that wasn't present on the device at review time, meaning that he could execute code against APIs that the application wasn't originally using (but which are available for applications to use legitimately).

      Nope, he wrote a Sleeper App (basically malware with trojan functionality) and put it up on the App Store. Using the "backdoor" in the App, he could download, install and run unsigned code. Apps in the App Store run binary code. You don't need to inject code anywhere into a browser.

      Also, what he did was EXPLICITLY AGAINST the developer agreement he made when he became an Apple Developer. He basically proved that you could write code with trojan functionality that violated developer agreements, lie about the functionality to Apple, and get it published on the App Store. Apple found out and took his App down and then took away his developer license.

      • by jc42 ( 318812 )

        He basically proved that you could write code with trojan functionality that violated developer agreements, lie about the functionality to Apple, and get it published on the App Store. Apple found out and took his App down and then took away his developer license.

        So iOS is secure against developers that tell Apple about the malware in their apps. That gives me a really warm, fuzzy feeling ...

        • by adisakp ( 705706 )

          So iOS is secure against developers that tell Apple about the malware in their apps. That gives me a really warm, fuzzy feeling ...

          Yes... however, if Apple finds malware in an App, it is pulled from the App Store and the developer is banned. But anything you install could potentially be malware. Then again, I'd venture to say malicious developers can take advantage of *ANY* current software platform once you've installed their software.

          • by jc42 ( 318812 )

            Yes, when a "white hat hacker" like this Miller guy shows up and demos a security hole to Apple, Apple's response is to pull his app and ban him.

            This is supposed to reassure us of iOS's security exactly how?

            The intended effect seems to be to "send a message" to others who may be playing with such things. And that message is "Don't tell us about security problems you find; we don't want to hear about them. Go sell the info to interested buyers, like any self-respecting businessman would do."

  • Native code (Score:5, Interesting)

    by cbhacking ( 979169 ) <been_out_cruisin ... oo.com minus bsd> on Monday November 07, 2011 @04:44PM (#37978312) Homepage Journal

    So long as iOS apps are developed using a language that allows pointer access, including function pointers, people are going to find and exploit bugs like this. It's actually a really interesting parallel to homebrew development on Windows Phone (yes, I have one, in addition to a few Linux devices - no iOS ones though): you can do native code on WP7, but you have to use COM to access it. Microsoft prohibits ISVs from using the COM import API from C#/VB in marketplace apps, so they can very easily block this kind of thing by just checking for references to a few specific APIs (they also block the use of C# "unsafe" pointers).

    Now, I'm not exactly advocating that Apple needs to re-design their entire application model. However, the fact remains that the way they do it, it's almost impossible to really verify that any given app isn't doing something like this, short of code-reviewing the source of every submission and rejecting any that are too hard to understand (completely impractical). It means they *are* vulnerable to malware, though - even from the "trustworthy" marketplace. (A concrete sketch of why follows below.)
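
    To make the parent's point concrete, here is a hedged sketch of why raw function pointers defeat static review: the bytes that eventually execute were never in the submitted binary. fetch_from_server is a hypothetical stand-in, not a real API.

      #include <stddef.h>

      typedef void (*entry_fn)(void);

      /* Hypothetical stand-in for a network download that fills buf with
       * machine code at runtime -- not a real API. */
      static size_t fetch_from_server(unsigned char *buf, size_t cap) {
          (void)buf; (void)cap;
          return 0; /* a real sleeper app would return downloaded ARM code */
      }

      /* exec_page must be executable memory, e.g. a page like the one in
       * the mmap sketch near the top of the story. */
      void run_downloaded(unsigned char *exec_page, size_t cap) {
          size_t n = fetch_from_server(exec_page, cap);
          if (n == 0)
              return;
          /* This cast is the whole trick: the reviewer never saw these
           * bytes at submission time. Code signing exists to make the
           * call below fault. */
          entry_fn entry = (entry_fn)(void *)exec_page;
          entry();
      }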

    • by h4rr4r ( 612664 )

      It does not matter what they do, without code reading apps can always do evil.

      I could submit a time zone calculator app that waits until 06/06/2012 and instead of opening properly shows goatse. With the limited testing Apple does, how would they ever know?

      • by h4rr4r ( 612664 )

        I meant code reviews. They would also have to reject any app more complicated than the most basic of software.

    • "Almost impossible"?

      It's a more complicated problem than determining whether the program will halt.

    • by jbolden ( 176878 )

      Well he got through one wall with that method. There are still more walls.

  • It was only a matter of time. Since they only do black-box testing, it should not have taken this long for an app that waits to do evil until it's in the wild to get approved.

  • I bet he's recording some sick jams [archive.org] with his unsigned iOS apps.

  • by Dunbal ( 464142 ) *
    When it happens to a Windows device it's called a "security vulnerability" and when it happens to an iOS device it's a "feature"?
  • FTA:

      “Android has been like the Wild West,” says Miller. “And this bug basically reduces the security of iOS to that of Android.”

    Lolz.

  • The app in question has already been pulled from the App Store. And I'm quite sure the flaw that allows executing code via some hole in Safari will be fixed very soon. iOS 5 supports delta updates now, so Apple can (and will) come out with small updates much more often than in the past.

    I'm still torn about security in such appliances. Ideally the user should fully own the device as well as all code running on it, but in practice, users being what they are, having a central control instance may very well be the lesser evil.

    • by nwf ( 25607 )

      The app in question has already been pulled from the App Store. And I'm quite sure the flaw that allows executing code via some hole in Safari will be fixed very soon. iOS 5 supports delta updates now, so Apple can (and will) come out with small updates much more often than in the past.

      Unless he's figured out how to sign apps such that the OS thinks they're from Apple when they aren't. Then Apple would have to revamp their code signing system.

      • by Goaway ( 82658 )

        He hasn't.

      • by jbolden ( 176878 )

        Well that's breaking encryption in general. That takes down much more than just the app store.

        • by nwf ( 25607 )

          Well that's breaking encryption in general. That takes down much more than just the app store.

          Assuming Apple's algorithms are implemented properly, which is never a guarantee. Look at Sony.

          • by jbolden ( 176878 )

            The provisioning profile stuff is an open source part of Core Data. I haven't personally checked it, but given that the encryption has been in the open for 5 years....

      • The app in question has already been pulled from the App Store. And I'm quite sure the flaw that allows executing code via some hole in Safari will be fixed very soon. iOS 5 supports delta updates now, so Apple can (and will) come out with small updates much more often than in the past.

        Unless he's figured out how to sign apps such that the OS thinks they're from Apple when they aren't. Then Apple would have to revamp their code signing system.

        He clearly stated that he went AROUND the code signing requirement; NOT that he "broke" the signing process itself.
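
        A conceptual sketch of the distinction (the stub functions are illustrative, not Apple's real interfaces): breaking signing would mean defeating the cryptographic check itself, while the reported bypass runs bytes that never reach that check at all.

          #include <stdbool.h>
          #include <stddef.h>
          #include <stdio.h>
          #include <string.h>

          /* Stubs so the sketch compiles; real code would hash the binary
           * and verify an RSA signature with Apple's public key. */
          static void sha1(const unsigned char *data, size_t len,
                           unsigned char out[20]) {
              (void)data; (void)len;
              memset(out, 0, 20);
          }
          static bool rsa_verify(const unsigned char digest[20],
                                 const unsigned char *sig,
                                 const unsigned char *trusted_pubkey) {
              (void)digest; (void)sig; (void)trusted_pubkey;
              return false;
          }

          /* What the loader conceptually does before letting code run. */
          static bool code_may_execute(const unsigned char *code, size_t len,
                                       const unsigned char *sig,
                                       const unsigned char *pubkey) {
              unsigned char digest[20];
              sha1(code, len, digest);
              return rsa_verify(digest, sig, pubkey);
          }

          int main(void) {
              unsigned char code[16] = {0}, sig[4] = {0}, key[4] = {0};
              /* Forging `sig` so this returns true for arbitrary code would
               * be "breaking" signing. The bypass instead executes bytes the
               * kernel never submits to this check. */
              printf("allowed: %d\n",
                     code_may_execute(code, sizeof code, sig, key));
              return 0;
          }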

    • by Skapare ( 16644 )

      I'm still torn about security in such appliances. Ideally the user should fully own the device as well as all code running on it, but in practice, users being what they are, having a central control instance may very well be the lesser evil.

      Let the end user decide whether they want the central control, or not. Just make sure that status can't be altered by other than the actual user.

      With digital devices filling every part of my life now the very thought of being personally responsible for every bit of code running on every one of them makes me shudder. Life is just too short for that.

      Do I trust Apple? Not very much. Do I trust Apple more than myself when I haven't got the time to spend more than a few minutes a day to care for each device (and its software) that I own and use? Probably, yes. Sad but true.

      Is there someone else you trust? How about having a trust choice? And for those of us that do trust ourselves, "self" should be one of the choices.

      I do trust Apple with one thing ... that they will make business decisions that they believe will boost their bottom line. That is the only thing I trust them to do.

  • The summary says "Apple-approved commands to run in an iPhone's or iPad's memory."

    I'm not sure if that's the normal slashdot misunderstanding/hyperbole, if it's another reporter ignorance/flamebait thing, or if that's actually in what cmiller posted.

    Apps are Apple-approved. Apps can't use Apple's non-public frameworks. Saying you can't run non-Apple-approved commands is completely inaccurate.

    • by Skapare ( 16644 )

      In theory, that's how it should work, by design. But when there's a bug in the code somewhere, that can provide a means to go around the checks. Too bad Apple inherited Steve Jobs' arrogance and refuses to work with security researchers.

  • It's not more secure (Charlie Miller keeps demonstrating that), but for the typical user (who doesn't know enough about security to judge an app), having a vetting/approval process such as Apple's still offers a safer environment than running completely unvetted apps (such as on the Android stores).

    • Except, it gives a false sense of security. With Android (or PC) apps, I know that there's a risk of malware, so I'm cautious. With iOS - well, I don't have one, but I imagine there are a lot of people who think "it *can't* have malware, Apple checks everything!" and therefore completely trust anything in the app store.

      The purpose of work like this is to demonstrate that Apple has misled those people; you can't simply trust everything. The only thing worse than an obviously untrustworthy app source is an untrustworthy app source that *appears* to be trustworthy.

      • Which makes absolutely no difference to the 95+% of users who don't know enough about security to make such an evaluation. No matter how many times users get burned, if they don't understand security, most of them will make the same mistake next time simply because they don't know how to evaluate an app for security. And for those who do know about security, it doesn't stop them from exercising caution. Therefore, the "false sense of security" actually makes no difference.

        • by Skapare ( 16644 )

          So someone needs to watch out for that 95+%. Apple and Miller are both trying to do that. One of those two is even willing to cooperate with the other to that end goal. The other appears to be on the track to dishonesty over the matter.

          • Agreed. I'm a big fan of CM, and the rest of the ethical security researchers.

            Apple's reaction to security vulnerabilities is pretty poor. I have personal experience with that since I reported a vulnerability in QT for Windows (CVE-2010-0530 [apple.com]) that they took over a year to fix, and didn't fix it properly when they did.

            Apple isn't the only vendor to have such poor policies, just one of the most visible.

      • Except, it gives a false sense of security. With Android (or PC) apps, I know that there's a risk of malware, so I'm cautious.

        And why do you imagine your caution is better than someone whose job is vetting apps? For example, what automated tools do you have for looking for suspicious API calls? Do you, like the app store reviewers, have test devices that don't contain your actual live data? Do you, like the app store, find out that the developer of the app is real enough to have a tax code?

        Or is the reality of your "caution" that you're just going to guess?

        • It mostly comes down to using either apps from big names that are well-known and have a reputation to uphold, or using open-source apps. If I need an app that does neither, I can run it through a proxy and monitor what it connects to via my PC. Granted that the first approach isn't guaranteed, the second isn't guaranteed unless I both check the source and compile it myself for checking against the version in the app package, and the third is a hassle. It's possible, though - and I guarantee that the folks at Apple don't have the time or people to properly verify the apps either, nor do they seem to have the personal incentive to do it right.

          • I guarantee that the folks at Apple don't have the time or people to properly verify the apps either, nor do they seem to have the personal incentive to do it right.

            I know better than your "guarantee". The app store review process found a crashing bug in one of my apps that neither I nor my partner had ever come across in our testing. It took me two days to reproduce it myself. I know from what had to happen to trigger that bug that either he gave it a very thorough evaluation, or they have a fuzzer that randomly operates the UI of the app for an extended period.

            Also interesting is that you earlier pointed out the hazards of trusting software, and here you're willing to

      • With iOS - well, I don't have one, but I imagine there are a lot of people who think "it *can't* have malware, Apple checks everything!" and therefore completely trust anything in the app store.

        Any iOS user who thinks this after a tethering app was slipped through as a flashlight app deserves whatever they get.

    • Well, that depends.

      Take the TSA as an analogy. One of their many jobs is to detect things like knives, guns, explosives and other nasty things being brought aboard airplanes. And they are pretty successful when people have forgotten that they have one of the forbidden items in their luggage. But if you make a bit of an effort to hide these things, they seem to have a poor success rate for detecting them.

      Generally, most people have a pretty low opinion of the TSA's "Security Theater." It doesn't really

      • Flawed analogy. Forget that the TSA is searching for weapons when they need to be watching for suspicious behavior. Forget that they're irradiating passengers and groping others for their illusion of security.

        The fundamental problem with the analogy is that air passengers know to watch for weapons, suspicious behavior, etc. In fact, passengers are the only ones who have actually caught any attempts at terrorism in the last 10 years, not the TSA. Passengers can still do something to detect and stop an attack

        • I'm not sure I see the flaw.

          TSA's job is to prevent passengers from bringing weapons onto the airplane. They have some successes [nypost.com] and notable failures [judicialwatch.org] in doing this. Apple's job is to prevent malicious code from running on our iPhones and iPads and I'm sure they have some successes and failures.

          What you're saying is that it's okay that the TSA might fail every now and again because the passengers will spot the malicious person and prevent him from performing his dastardly task. Of course, passengers [cnn.com] tend [huffingtonpost.com]

          • I'm not sure I see the flaw.

            TSA's job is to prevent passengers from bringing weapons onto the airplane. They have some successes and notable failures in doing this.

            No, the TSA's job is to stop terrorists from hijacking planes, not to keep guns off planes. If half the passengers had guns, the terrorists wouldn't try hijacking a plane. And that's the fundamental problem with the TSA, their focus is on passengers as threats, rather than on the threat to the passengers. That's like saying locks are to keep you from opening a door. No, the lock is to protect what's behind the door, the door and lock are just one mechanism of providing protection.

            What you're saying is that it's okay that the TSA might fail every now and again because the passengers will spot the malicious person and prevent him from performing his dastardly task.

            No, I'm saying it's impossible

    • It's not more secure (Charlie Miller keeps demonstrating that), but for the typical user (who doesn't know enough about security to judge an app), having a vetting/approval process such as Apple's still offers a safer environment than running completely unvetted apps (such as on the Android stores).

      Actually it's less safe.

      Users in the "walled garden" have a false sense of security, the security is breached and the users still unquestioningly trust everything from a now untrustworthy source.

      Apple has a vetting process that doesn't work. How is that different to an unvetted source?

      So essentially, with Android you have unvetted applications, with Apple you have unvetted applications and a user base which is actively ignorant of security issues. Despite the rumours to the contrary, there has been no great Android outbreak precisely because Android users are aware of their own security.

      • So essentially, with Android you have unvetted applications, with Apple you have unvetted applications

        Except that Apple do do vetting, and thus do have vetted apps.

        You claim it doesn't work. The lesson of 4 years of the Apple App Store is that it does work.

        Despite the rumours to the contrary, there has been no great Android outbreak precisely because Android users are aware of their own security.

        The average Android user is not like you. The average Android user is the average phone user. They're not geeks. They don't understand security. They are exactly the same people that load animated cursors, smily packages and screensavers on their Windows PCs.

        There has been lots more malware on Android than iOS.

      • First, clearly you didn't read my reply [slashdot.org] to the previous commenter who used the "false sense of security" fallacy. Actually, the "false sense of security" argument can be many fallacies, linked below:

        Appeal to belief [nizkor.org]. e.g. Many people claim it gives a false sense of security, therefore, it must. Show that it actually has that effect before you use it as your premise. A hypothetical premise only gives a hypothetical result.

        Begging the question [nizkor.org]. e.g. Giving people "false sense of security" makes them less safe

    • by Skapare ( 16644 )

      Android does not lead people to a false sense of security.

      • Which means nothing. If you had bothered to think about it or read any of my replies to the other people making that same claim, you wouldn't have bothered to repeat their mistake.

  • What has been broken here is not the code-signing apparatus per se but another part of the Apple security regimen; it appears this doesn't affect the need to have a valid initial certification to begin with. If the signing mechanism were defeated, that would conceivably allow anyone and his dog to upload and sell apps on the store without registering as a developer. But it isn't. So, in fact, the only people who could leverage this issue for nefarious purposes are people who are already working in the marketplace

    • I think your faith in iOS developers is a little misplaced. I'd just like to provide an app of value to my customers, but Apple has no process in place to vet who gets to submit an app. They just let any entity that pays the $100 submit an app. That's hardly a barrier to the evil miscreants of the world.

      I agree that the article was not entirely clear on how code signing is broken. This approach seems to be the ability to sideload new code. That's the code that hasn't been signed and that code hasn't got
  • The opening words of TFA:

    Apple's iPhones and iPads have remained malware-free thanks mostly to the company's puritanical attitude toward its App Store: Nothing even vaguely sinful gets in, and nothing from outside the App Store gets downloaded to an iOS gadget.

    WTF? Are you serious? Games and apps download data external to the App Store all the time. e.g.: The myFish3D app downloads new 3D models for fish and ornaments from its home site, uselessiphonestuff.com.

    • s/nothing/no executable code/

      It's not terribly well-written, but the gist of it is fairly accurate.

  • The Doctor pwns OS X, he keeps his license. The Doctor pwns iOS via Safari, he keeps his license. The Doctor pwns Apple's walled garden, and they take his license.

    • The Doctor pwns OS X, he keeps his license. The Doctor pwns iOS via Safari, he keeps his license. The Doctor pwns Apple's walled garden, and they take his license.

      He was grandstanding. He could have EASILY contacted Apple on the downlow; but Noooooo! He had to grandstand, thus alerting the rest of the planet to the exploit BEFORE Apple had a chance to close the vulnerability.

      He got exactly what he deserved (except that Apple should sue him into oblivion, and have him prosecuted for unauthorized access to a computer system, too).

      In other words, Miller should thank his lucky stars that a company with a bigger legal department than most U.S. States have, and a nearl

  • Miller announced the news on Twitter this afternoon, saying "OMG, Apple just kicked me out of the iOS Developer program. That's so rude!"
    - cnet.com [cnet.com]

    Really? You've been around Apple and seen how they react for how many years and you were surprised by this?
