Apple Cleaning Up App Store After Its First Major Attack

Reuters reports that Apple is cleaning up hundreds of malicious iOS apps after what is described as the first major attack on its App Store. Hundreds of the store's apps were infected with malware called XcodeGhost, which spread through a counterfeit version of Xcode, Apple's IDE for iOS development. Things could be a lot worse, though: Palo Alto Networks Director of Threat Intelligence Ryan Olson said the malware had limited functionality and his firm had uncovered no examples of data theft or other harm as a result of the attack. Still, he said it was "a pretty big deal" because it showed that the App Store could be compromised if hackers infected the machines of software developers writing legitimate apps. Other attackers may copy that approach, which is hard to defend against, he said.
  • Trusting Trust (Score:5, Insightful)

    by jeffb (2.718) ( 1189693 ) on Sunday September 20, 2015 @08:19PM (#50563831)

    Thirty-one years later, it's still worth reflecting on it [cmu.edu].

    • That definitely was worth reflecting on. Thanks.
      • Ken Thompson concluded:

        Acknowledgment. I first read of the possibility of such a Trojan horse in an Air Force critique [4] of the security of an early implementation of Multics. I cannot find a more specific reference to this document. I would appreciate it if anyone who can supply this reference would let me know.

        Did anyone ever find the original document? Maybe Snowden did?

    • by gweihir ( 88907 )

      Incidentally, that problem has been solved: http://www.dwheeler.com/trusti... [dwheeler.com]

      It takes some effort though.
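
      Roughly, Wheeler's "diverse double-compiling" works like the sketch below (compiler names and paths are placeholders, not a working recipe): rebuild the compiler's source once with the suspect compiler and once with a second, trusted compiler, let each result rebuild the same source again, and compare the two outputs bit-for-bit.

        import Foundation
        import CryptoKit

        // Run an external tool and wait for it to finish.
        func run(_ tool: String, _ args: [String]) throws {
            let p = Process()
            p.executableURL = URL(fileURLWithPath: tool)
            p.arguments = args
            try p.run()
            p.waitUntilExit()
        }

        // SHA-256 of a file, for a bit-for-bit comparison of the two stage-2 binaries.
        func sha256(of path: String) throws -> SHA256.Digest {
            try SHA256.hash(data: Data(contentsOf: URL(fileURLWithPath: path)))
        }

        // Stage 1: build the compiler's own source with the suspect compiler and with a trusted one.
        try run("/usr/local/bin/suspect-cc", ["compiler-source.c", "-o", "stage1-suspect"])
        try run("/usr/local/bin/trusted-cc", ["compiler-source.c", "-o", "stage1-trusted"])

        // Stage 2: let each stage-1 binary rebuild the same source.
        try run("./stage1-suspect", ["compiler-source.c", "-o", "stage2-a"])
        try run("./stage1-trusted", ["compiler-source.c", "-o", "stage2-b"])

        // If the suspect compiler carries no self-propagating back door (and builds are
        // deterministic), the two stage-2 binaries should be identical.
        print(try sha256(of: "stage2-a") == sha256(of: "stage2-b") ? "clean" : "MISMATCH: investigate")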

  • by Anonymous Coward

    Then what, pray tell, is the point of Apple's byzantine approvals process?

    • by phayes ( 202222 ) on Monday September 21, 2015 @03:35AM (#50565117) Homepage

      That's easy enough for everyone to figure out: It gives iOS users a more secure environment than the farce that is Android today, without imposing more than a tiny hardship on the vast majority of its users.

      I don't see this as being a major problem for iOS after this incident. Other than laziness, there is no good reason for people to get their Xcode anywhere other than Apple (as Xcode is a free download). AppDevs have now been warned that Xcode must be inviolate if they want to avoid their apps getting banned.

      Now, what exactly was it that stopped you from making this simple deduction? Zealotry in favor of a rival platform perhaps?

      • by AmiMoJo ( 196126 ) on Monday September 21, 2015 @05:21AM (#50565329) Homepage Journal

        The usual method of getting developers to install a backdoored version of an IDE is to make them think they are downloading the legit one. Infect their computers, MITM them. The NSA/GCHQ have many ways to do that, and few developers bother to check file signatures (do Apple even offer them?)

        So far there is no evidence that the Apple way works any better than the Google way. Google scans all apps for malicious code, the same way that Apple does. You don't think that Apple employs people to decompile and check apps manually, do you? If a human is involved at all, they are just there to make sure that the UI and content meet Apple's standards. Most apps don't appear to be human-reviewed at all, or if they are, the humans pay little attention and allow apps with zero functionality, or which clearly contravene the rules (e.g. there is a Playboy app, despite the prohibition on porn).

        The idea that Android is somehow riddled with malware is nonsense. Where are the vast botnets that would exist if it were? The Play store seems to be just as safe as the Apple app store, from a user's perspective.

        • by Wrath0fb0b ( 302444 ) on Monday September 21, 2015 @09:45AM (#50566213)

          The usual method of getting developers to install a backdoored version of an IDE is to make them think they are downloading the legit one. Infect their computers, MITM them. The NSA/GCHQ have many ways to do that, and few developers bother to check file signatures (do Apple even offer them?)

          Not only do they offer signatures [apple.com], but the infected version of Xcode will be refused by default unless you modify the default Gatekeeper setting. This is all the more ridiculous because you don't even need to register to download the legit Xcode directly from Apple. And of course it's protected in transit by SSL.

          Not sure what your FUD is.

          [ Yeah, maybe GCHQ is clever enough to infect Xcode and still pass Gatekeeper. But this case shows you don't really have to be that smart -- just tell users "you must click here to run this software" and they'll do it, even if that means disabling security checks. ]
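
          If anyone wants to check their own install, something along these lines should do it: a minimal sketch using the Security framework to ask roughly the same question Gatekeeper does, namely whether the bundle is validly signed with a chain anchored at Apple (the install path is an assumption).

            import Foundation
            import Security

            // Does the bundle at `path` carry a valid signature whose certificate chain
            // terminates at an Apple anchor? Similar in spirit to `spctl --assess`.
            func looksLikeGenuineAppleCode(at path: String = "/Applications/Xcode.app") -> Bool {
                var codeRef: SecStaticCode?
                guard SecStaticCodeCreateWithPath(URL(fileURLWithPath: path) as CFURL, [], &codeRef) == errSecSuccess,
                      let staticCode = codeRef else { return false }

                var requirement: SecRequirement?
                guard SecRequirementCreateWithString("anchor apple" as CFString, [], &requirement) == errSecSuccess else {
                    return false
                }
                return SecStaticCodeCheckValidity(staticCode, [], requirement) == errSecSuccess
            }

            print(looksLikeGenuineAppleCode() ? "signature checks out" : "signature NOT valid, do not run")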

          • Re: (Score:2, Informative)

            by AmiMoJo ( 196126 )

            It seems that one of the affected parties was Tencent, hardly a small developer and unlikely to be using "dodgy" versions of XCode. It very much appears that they have been the victims of the NSA/GCHQ, targeting applications that are popular with Chinese users.

            We know that the NSA has the ability to bypass Apple's OS security checks, because they bragged about it in their catalogue of spy tools that was leaked. So it very much appears that they have either found a way around Gatekeeper or managed to steal o

        • by nuonguy ( 264254 ) <nuonguy@yahoo3.14159.com minus pi> on Monday September 21, 2015 @09:46AM (#50566223)

          No Evidence? [pcworld.com]

          Really? [techcrunch.com]

          No evidence at all? [yahoo.com]

          What would you consider evidence? [scmagazineuk.com]

          That’s why the news from Bitdefender researchers is so alarming. They discovered sophisticated CAPTCHA-bypassing Android malware in Google Play apps.

          from http://www.itbusinessedge.com/... [itbusinessedge.com]

          • by AmiMoJo ( 196126 ) on Monday September 21, 2015 @11:55AM (#50567005) Homepage Journal

            Sorry, you fell for the media hype. From your very first link:

            Both Wallpaper Dragon Ball and Finger Hockey, RiskIQ said, have malware that steals confidential information such as device IDs from infected devices.

            So an anti-virus company is spreading alarm that apps can access the device's unique ID and the internet, both things the user has to give it permission for. It's bullshit, they are just making out that you need anti-virus software in order to sell their shitty snake-oil product.

            By this standard there are thousands of bits of malware on the Apple app store too, because any app that has permission to read the device's ID and internet access is classed as malicious.

            The last link you posted is as close as it comes, but requires the user to download an "innocent" looking game that needs permission to send SMS messages (with a big warning that it may COST YOU MONEY $$$). They found one example, and Google removed it quickly. That's a pandemic all right.

        • by phayes ( 202222 )

          The usual method of getting developers to install a backdoored version of an IDE is to make them think they are downloading the legit one.

          Certainly, as long as you are referring to the usual methods of installing backdoored versions of IDEs for Android. As has been repeatedly pointed out, this is NOT how XCode is normally distributed.

          Your suppositions of automated and largely useless validation reek of "this is how Android does it & though I'm ignorant of how Apple does it, I'll still offer baseless conjecture that they use the same methods as Google when authorizing apps". None but the true zealots can doubt that Apple's walled garden h

        • few developers bother to check file signatures (do Apple even offer them?)

          Apple DOES offer hashes/signatures on their regular Downloads; but not for stuff that is distributed through the App Store (which XCode now is).

          I ASSUME the rationale is that it is a "closed" file repository/download system; so file signatures were not needed.

          I would imagine that may change, or that some other post-download verification method will be implemented.

          I guarantee that there have already been a few meetings about this. Apple knows how important it is to avoid "poisoned Apples" in the App Store.
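
          Even a published checksum would go a long way. Assuming a known-good SHA-256 for the Xcode archive existed somewhere (that part is the assumption, as is the file name below), verifying a multi-gigabyte download only takes a few lines:

            import Foundation
            import CryptoKit

            // Stream the file through SHA-256 so a multi-gigabyte archive never has to fit in memory.
            func sha256Hex(of url: URL) throws -> String {
                let handle = try FileHandle(forReadingFrom: url)
                defer { try? handle.close() }
                var hasher = SHA256()
                while let chunk = try handle.read(upToCount: 1_048_576), !chunk.isEmpty {
                    hasher.update(data: chunk)
                }
                return hasher.finalize().map { String(format: "%02x", $0) }.joined()
            }

            // Hypothetical published value; Apple does not currently publish one for App Store downloads.
            let expected = "0000000000000000000000000000000000000000000000000000000000000000"
            let digest = try sha256Hex(of: URL(fileURLWithPath: "Xcode_7.0.dmg"))
            print(digest == expected ? "checksum matches" : "checksum MISMATCH, discard the download")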

    • Then what, pray tell, is the point of Apple's byzantine approvals process?

      Money.

      • Re: (Score:3, Insightful)

        by macs4all ( 973270 )

        Then what, pray tell, is the point of Apple's byzantine approvals process?

        Money.

        ORLY?

        Apple could make even MORE money by letting ANY software in, and saving the Resources it takes to Approve it.

        Therefore, there MUST be another reason. Let's see; what could it be?

        Could it POSSIBLY be that they really ARE trying (pretty damned successfully so far!) to keep this kind of shit OUT of the App Store(s)?

        Nah. That can't be it. Must be GREED, right?

        Haters gotta hate; even when it makes NO sense.

    • It doesn't mean that there's no value in imperfect security. Apple's walled garden failed in this attack, but it has succeeded in thousands of other cases. The infected apps will be removed from devices and the App Store, and the hole will be closed.

  • Vetting of apps? (Score:5, Insightful)

    by Rainbow Nerds ( 4224689 ) on Sunday September 20, 2015 @08:24PM (#50563863)

    I'm wondering how these apps made it through in the first place. Apple is known for being strict about vetting apps and what's allowed to enter the walled garden. If so many apps were able to make it past the vetting, it ought to raise concerns about what other malicious apps might be in the app store on a smaller scale. The vetting process probably lulls many users into a false sense of security that any app downloaded is going to be safe because Apple wouldn't let unsafe apps through. Obviously that's not the case, and it's not possible to know before downloading an app whether it's safe or not. Even reputable publishers could be compromised in this way. Although I think the walled garden is actually a good idea, it's obviously not sufficient, and there needs to be other layers of security. As much as I despise most antivirus software, it might be another good line of defense. I'd like to see more about app permissions like the old Android Market listing, and perhaps firewalling and only whitelisting certain sites for apps to connect to. It's reasonable that the browser you download would be able to connect to any site; that game, not so much. What's there now isn't enough and there really is no way for a user to know that an application is safe prior to installing it.

    • by tepples ( 727027 )

      I'd like to see more about app permissions like the old Android Market listing

      The permissions are still listed. Take Crossy Road [google.com], the endless Frogger clone that's become popular on Google Play: scroll down to "Permissions" and click "View details". Or are you asking for some sort of rich privacy policy where each permission is justified with an immediately adjacent rationale, such as "Uses camera to scan barcodes" or "Uses phone state to pause gracefully when a phone call is received"?

      and only whitelisting certain sites for apps to connect to

      I don't see how this can be effective, as the app may use one of those whitelisted "certain sites" as

    • by brantondaveperson ( 1023687 ) on Sunday September 20, 2015 @09:26PM (#50564137) Homepage

      When presented with a request for access to a local or remote resource generated by a running application, almost everyone clicks "Yes".

      They normally click "Yes" without even reading the prompt, and certainly without conducting a thorough review of what the application is attempting to access, and why. This is because people are not on the whole security professionals, and just want to get shit done on their phones (or tablets, or PCs, or whatever).

      Permissions are not a solution to this problem.

      • by sims 2 ( 994794 )

        Java trains people to click yes.

        • What does Java (or any programming language) have to do with that?

          When I use Java there is no random dialog popping up asking me to click yes for anything ... why would there be?

          • by sims 2 ( 994794 )

            Java autoupdate aka jucheck.exe.

            http://forums.whirlpool.net.au... [whirlpool.net.au]
            Here is one example; for more, just google "java uac".

            Without disabling UAC or uninstalling Java, the only way I am aware of to fix it is to disable Java's automatic updates, which sounds like bad security practice, but AFAIK no other fixes are available.

      • Generally I'd agree with you.

        But the prompting on iOS is clear enough that many people actually do click no - especially for things like location, which people know uses battery. Or contacts, where it's very easy to say "no, application, you do not need to see my contacts".

        And again, all this prompting happens at the time the resource is requested. So if permission is asked for later, out of context, it seems especially odd.

      • The user should be able to lie and say yes, but actually not grant access.

        Want access to my contacts? Sure, here are my fake contacts.
        Want access to my phone calls? Sure, but it will look like I make none.
        Want microphone access? Sure, here is some random noise. ...

        The problem with forcing a yes/no answer is that if you answer no you can't run the app, which means people will generally just say yes.

        • by Lennie ( 16154 )

          No, fake contacts don't solve the problem. The problem is that you need a better model. Funny fact: Android already has one:

          http://developer.android.com/r... [android.com]

          A similar model was adopted by FirefoxOS from the start:
          "Web Activities are a way to extend the functionality of HTML5 apps without having to access the hardware on behalf of the user. In other words, you don’t need to ask the user to access the camera or the phone, but instead your app asks for an image or initiate a call and the user then picks t


        • The problem with forcing a yes/no answer is that if you answer no you can't run the app, which means people will generally just say yes.

          That is complete nonsense!
          The app does not really know whether you have clicked yes or no; the operating system is asking you, not the app. And an app like "Viber" or "WhatsApp" that wants your location works just fine when iOS asks "may this app access your location?" and you answer "no".
          Why the funk should the app stop working?

          • Re:Vetting of apps? (Score:4, Interesting)

            by MachineShedFred ( 621896 ) on Monday September 21, 2015 @11:23AM (#50566787) Journal

            More than that, it's spelled out explicitly in Apple's app developer guidelines that the app will be rejected if it doesn't gracefully handle a permission denial. And, that would be incredibly easy to test in an automated fashion.

            Now if the developer is a dick and just disables all the app's functionality because you don't give them permission to your contacts, then shame on them and they deserve a nice dose of herpes. But again, it's up to the user to have some responsibility in protecting their information, and they shouldn't just blindly allow permission to anything that asks.
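
            For what it's worth, the graceful path isn't hard. A minimal sketch (the class and feature are made up; only the Contacts framework calls are real): ask when the feature is first used, and degrade instead of refusing to run when the answer is no.

              import Foundation
              import Contacts

              final class FriendInviter {
                  private let store = CNContactStore()

                  func inviteFriends() {
                      switch CNContactStore.authorizationStatus(for: .contacts) {
                      case .authorized:
                          showContactPicker()
                      case .notDetermined:
                          store.requestAccess(for: .contacts) { granted, _ in
                              DispatchQueue.main.async {
                                  if granted { self.showContactPicker() } else { self.showManualEntry() }
                              }
                          }
                      default:
                          // Denied or restricted: let the user type an address by hand.
                          showManualEntry()
                      }
                  }

                  private func showContactPicker() { /* present a contact picker */ }
                  private func showManualEntry() { /* plain text field; the rest of the app keeps working */ }
              }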

      • People automatically click yes when they perceive there is no alternative. If you get a dialog that says "Yes"/"Cancel", then they'll click yes, because they do actually want the action that they asked for performed.

        Likewise with classic Android permissions, refusing permission meant you couldn't install the app. So people were trained to accept them regardless.

        With iOS requests for permission at the time of first use of a resource, the question is a significant one. Both Yes and No still allow the app to c

    • I'm wondering how these apps made it through in the first place.

      From what I've read - it was a trojaned version of Xcode that some developers used, and this inserted malware into their otherwise legitimate apps.
      Apple's scanning has now discovered it, although I don't know why it has taken them so long to pick it up.

    • Re: (Score:3, Insightful)

      by drinkypoo ( 153816 )

      I'm wondering how these apps made it through in the first place. Apple is known for being strict about vetting apps and what's allowed to enter the walled garden.

      Apple is known for mysteriously and capriciously denying apps which are similar to other apps which they have accepted. Nobody knows on what basis they justify their decisions, because they don't have to justify their decisions. How that's even legal when they have a monopoly over software distribution to untampered devices... well, money. That's how.

      Although I think the walled garden is actually a good idea

      It isn't.

      • Re:Vetting of apps? (Score:4, Informative)

        by jo_ham ( 604554 ) <joham999 AT gmail DOT com> on Monday September 21, 2015 @10:07AM (#50566367)

        Of course Apple have a monopoly on their own products... I'm not sure how you can't see that this is obviously legal.

        There's no legal problem with being the only store on a product that you sell, *especially* when Android makes up the bulk of the smartphone market.

        So, "how that can even be legal" is that Apple are not a monopoly as far as smartphones are concerned, nor are they leveraging their non-monopoly position in one area to promote their business in another.

      • How that's even legal when they have a monopoly over software distribution to untampered devices... well, money.

        For the umpteenth time, a company's own platform is not a market for the purposes of competition laws.

        Although I think the walled garden is actually a good idea
        It isn't.

        You don't even use the platform. The walled garden is an extremely attractive security and ease of use feature of iOS. Regardless of what Android fans say.

    • by gweihir ( 88907 )

      People vastly overestimate what Apple can do. Basically, reviewing an app for backdoors competently takes several times as much effort as writing it, and the people doing the review need to be significantly better than the original coder. It is a lot cheaper in practice to just re-implement with trusted people.

    • Re:Vetting of apps? (Score:4, Interesting)

      by jittles ( 1613415 ) on Monday September 21, 2015 @08:39AM (#50565871)

      I'm wondering how these apps made it through in the first place. Apple is known for being strict about vetting apps and what's allowed to enter the walled garden. If so many apps were able to make it past the vetting, it ought to raise concerns about what other malicious apps might be in the app store on a smaller scale. The vetting process probably lulls many users into a false sense of security that any app downloaded is going to be safe because Apple wouldn't let unsafe apps through. Obviously that's not the case, and it's not possible to know before downloading an app whether it's safe or not. Even reputable publishers could be compromised in this way. Although I think the walled garden is actually a good idea, it's obviously not sufficient, and there needs to be other layers of security. As much as I despise most antivirus software, it might be another good line of defense. I'd like to see more about app permissions like the old Android Market listing, and perhaps firewalling and only whitelisting certain sites for apps to connect to. It's reasonable that the browser you download would be able to connect to any site; that game, not so much. What's there now isn't enough and there really is no way for a user to know that an application is safe prior to installing it.

      They run a static analyzer on app submissions that checks for private API calls. It doesn't catch everything. I've worked on a white-label app that had 280 successful reviews in the App Store and was randomly rejected on the 281st submission because I forgot to enable a new permission for the app prior to submission. My permission files were all generated from a template, so all the apps were missing that permission. The users were still prompted to grant permissions. Apple generally doesn't let you enable permissions for functionality that your app doesn't actually need to function. If you used some Objective-C trickery to hide private API calls, it is quite possible that Apple would not even detect it unless that call happened to be triggered during the app review process.
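
      For context, the kind of trickery being hinted at is not exotic. A toy sketch (the selector name is a harmless stand-in, not a real private API): assemble the selector string at runtime so it never appears in the binary as a literal the scanner can match.

        import Foundation

        // The name is built at runtime, so static analysis of string literals and
        // symbol references never sees "brightnessLevel" in one piece.
        let pieces = ["bright", "ness", "Level"]
        let selector = NSSelectorFromString(pieces.joined())

        let target = NSObject()              // placeholder object for illustration
        if target.responds(to: selector) {
            _ = target.perform(selector)     // only fires if the selector actually exists
        }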

    • The answer is NOT to present customers with fourteen more layers of pop-ups and train users to just hit 'accept' on everything. The answer is NOT to load down our mobile devices with anti-virus software, most of which is worse than most viruses. The answer is NOT to expect users to become experts on technology.

      Those are the failed ideas and policies of the Windows world. Android is trying hard to make most of the same mistakes. They are horrible, horrible, ideas and it's scary that there are some in the te

  • by Anonymous Coward

    Some Chinese developers downloaded this tainted XCode because of slow download times of XCode from the Mac App Store.

    Downloading XCode from the Mac App Store takes nearly a full day!
     
    I think this delivery mechanism of XCode to developers is very crummy and quite a nuisance.

    • Re: (Score:2, Insightful)

      by jpellino ( 202698 )
      "Downloading XCode from the Mac App Store takes nearly a full day!" I get it and the accessory files in about an hour. YMMV but a day? Where?
      • by Malc ( 1751 )

        I think it was 10-15 minutes for me. But I digress...

        If these people were able to download the infected alternative faster than from the App Store, then the real question is why? Is this a consequence of the Chinese government's internet interference?

        • If these people were able to download the infected alternative faster than from the App Store, then the real question is why? Is this a consequence of the Chinese government's internet interference?

          I was just discussing this on G+, and that's the claim, all right. Which makes you wonder, was this hack by the chinese government? Or just someone taking advantage of the situation they've created by leaning on their people so hard and denying them any and all freedoms which might be the slightest bit inconvenient for the power elite?

    • Some Chinese developers downloaded this tainted XCode because of slow download times of XCode from the Mac App Store.

      Downloading XCode from the Mac App Store takes nearly a full day! I think this delivery mechanism of XCode to developers is very crummy and quite a nuisance.

      Maybe it's an effect of the Great Firewall? My understanding is that Internet throughput in China (especially for inbound traffic) is very unpredictable with speed varying not only across time but also on physical location [furbo.org].

    • by gweihir ( 88907 )

      If Apple had PGP signatures on it, and the developers verified them, it would not matter where they got the XCode package. But yes, the slow download is a risk in itself, namely incompetent people taking shortcuts, as happened here.

      • by jeremyp ( 130771 )

        Xcode is signed. However, you can turn off Gatekeeper or temporarily override it while you run Xcode for the first time.

        • I would hope that a developer would know better than to allow an allegedly Apple-published app to continue to run when Apple's own security measures are warning you about it.

          But then I remember that most software developers are complete knobs.

      • by Rosyna ( 80334 )

        Xcode is signed and Gatekeeper warns about a corrupted binary [twimg.com]. The issue is that these developers that were infected intentionally disabled Gatekeeper checks so they could run the infected Xcode.

  • This kind of possible attack is mitigated because after you download an app, it still has no permissions to do anything interesting - access to background location, contacts, camera, audio, etc. all require permissions that prompt the user for access.

    So even if someone uses an Xcode that is compromised, there's not very much gain you are going to get by having malicious code in the app except for what that app is working with.

    Happily Android has also recently moved to this same "permission on demand" model
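
    The flow looks roughly like this (the feature and class names are invented; the CoreLocation calls are the real API). The prompt only appears when the feature is first used, and the user's answer merely changes what the feature does:

      import Foundation
      import CoreLocation

      // Hypothetical "find stores near me" feature. Nothing is granted at install time;
      // Info.plist must carry NSLocationWhenInUseUsageDescription explaining the request.
      final class NearbyStores: NSObject, CLLocationManagerDelegate {
          private let manager = CLLocationManager()

          func userTappedFindNearby() {
              manager.delegate = self
              manager.requestWhenInUseAuthorization()   // triggers the system prompt, first time only
          }

          func locationManager(_ manager: CLLocationManager,
                               didChangeAuthorization status: CLAuthorizationStatus) {
              switch status {
              case .authorizedWhenInUse, .authorizedAlways:
                  manager.requestLocation()             // a one-shot fix is enough here
              default:
                  break                                 // declined: fall back to a city picker
              }
          }

          func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
              // use locations.last to look up nearby stores
          }

          func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) { }
      }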

    • So even if someone uses an Xcode that is compromised, there's not very much gain you are going to get by having malicious code in the app except for what that app is working with.

      how about adding an extra hidden recipient to all your emails? there's no way any security system is going to stop that.

      how about a bank app that transfers money to the malware author instead of the intended recipient? again how do you stop that with security?

      • how about adding an extra hidden recipient to all your emails? there's no way any security system is going to stop that.

        How easy is that to do for someone other than the developer of the mail app? My understanding is that the apps are sandboxed in a way that wouldn't allow an easy route to alter how other apps worked.

        • How easy is that to do for someone other than the developer of the mail app?

          what if the developer IS developing a mail app?

      • how about adding an extra hidden recipient to all your emails?

        How would you do that?

        The MFMailComposeViewController window you open tokenizes email recipients for the user; I can't see any way of composing an email where the user could not see that it was going to more than one person, or where the "to", "cc" or "bcc" values had been pre-populated with an address they did not know about.

        You have no control or visibility as to email addresses the user populates in this composer window. The content is totally separated from the oth
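
        For reference, a bare-bones sketch of presenting that composer (the class is MFMailComposeViewController; the recipient address is just an example). The app can pre-fill fields, but the user sees every recipient as a token and nothing is sent until they tap Send themselves:

          import UIKit
          import MessageUI

          // Present the system mail sheet; pre-filled fields stay visible and editable.
          func presentFeedbackMail(from host: UIViewController & MFMailComposeViewControllerDelegate) {
              guard MFMailComposeViewController.canSendMail() else { return }
              let composer = MFMailComposeViewController()
              composer.mailComposeDelegate = host
              composer.setToRecipients(["feedback@example.com"])   // shown as a removable token
              composer.setSubject("App feedback")
              host.present(composer, animated: true)
          }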

        • by AmiMoJo ( 196126 )

          Thanks to Snowden we know that the NSA and GCHQ like to trick developers into installing hacked versions of XCode, which inject NSA/GCHQ malware into compiled applications. Their goal appears to be to get some zero-day malware into popular applications, allowing them to remotely 0wn huge numbers of iOS devices.

          So while iOS security may protect users to some degree, I wouldn't bank on it where the NSA and GCHQ are concerned. They likely have zero-day exploits that can subvert iOS security, otherwise why both

        • There are a lot of layers any such attack would have to go through, in the end scrubbing out anything much useful (which is what we see with the results). I'm not saying there's no risk, I'm saying that the system as a whole does a good job of having enough layers of security that it's very hard to get something really malicious in place.

          Witness the fact that XCode has been offered for free since 1999 and this is the first time it has been compromised.

          • by narcc ( 412956 )

            As far as you know...

            Apple has a sketchy security track record. Like Linux, it benefited from being an unattractive target as it had such a tiny user-base. OSX still does. As for iOS, for a while there, you could root the damn thing by visiting a webpage.

            That is, their products are not an attractive target for malware. When someone bothers, they're usually successful. See: pwn2own for countless recent examples.

            Aside from the microscopic market share, Apple is just like everyone else.

      • One thing I meant to add: because all the interesting attacks happen around what the application actually does, you have to ask whether the attack is easier to perform through Xcode or by attacking the server the app communicates with. Just as in a bear attack you only need to run faster than the person you are hiking with, to avoid a security breach you just have to be more secure than the server you work with.

        For any given attack (like trying to attack a bank app) why would it not be lots simpler to hack the w

    • because as a developer builds an app, they are often monitoring network traffic or otherwise examining app activity

      if (strcmp(username, "suckerDeveloper") != 0) {
          /* not the developer's own machine, so the payload runs */
          do_nasty_stuff();
      }

      the app won't do anything differently when it's running on the developer's computer

      • And the person developing the cracked version of Xcode knows my development username how again? Or any of the accounts I use for testing?

    • This kind of possible attack is mitigated because after you download an app, it still has no permissions to do anything interesting - access to background location, contacts, camera, audio, etc. all require permissions that prompt the user for access.

      So even if someone uses an Xcode that is compromised, there's not very much gain you are going to get by having malicious code in the app except for what that app is working with.

      Unless they can trick the user into giving up their iTunes account details by showing a system-prompt-lookalike. The system already prompts for passwords at some pretty random times, so that might not be hard.

      Or they could customise the exploit behaviour depending on the host application. Wait until some relevant app has been successfully exploited and is reporting in, then tailor an approach to steal whatever app-specific data is relevant (login details, etc) or even override the app's networking classes a

      • I never said it was not a problem; just that it's very difficult to get an exploit through all of the layers it needs to go through to get to the app store...

        And as I said in a different response - it seems like all this is a lot more work than simply attacking the API server you are communicating with.

        • I would have said that getting Xcode exploited in the wild was the tricky bit. Most of the rest of it seems pretty trivial.

          Attacking the API server is certainly an option, but you'd need a separate exploit for that, plus you're working in an area where an exploit attempt is expected and hopefully being monitored for. So far, this kind of attack has been pretty much under the radar, at least on iOS. Maybe that will change now.

          Apple can obviously check for the signature of this specific exploit easily enough,

          • I would have said that getting Xcode exploited in the wild was the tricky bit. Most of the rest of it seems pretty trivial.

            None of it is trivial given the moving target that is Xcode, and all the possible ways an app might be developed and Xcode project settings changed, not to mention the mixture of Swift and ObjC...

            Remember that in the course of an application development it's likely there will be at least one Xcode update, which you also have to infect in the same way before they download it...

            Attacking

            • None of it is trivial given the moving target that is Xcode, and all the possible ways an app might be developed and Xcode project settings changed, not to mention the mixture of Swift and ObjC...

              Remember that in the course of an application development it's likely there will be at least one Xcode update, which you also have to infect in the same way before they download it...

              Yeah, that's exactly what I mean. That's somewhat hard, and yet they've still succeeded. I guess there are enough people using this approach that some were stung, even though others were not for the very reasons you state.

              Do they really need one beyond "download Xcode updates from Apple"?

              Apparently so? :)

              Will be interesting to see if this problem recurs, either directly or in some secondary manner (eg. an exploit for the dev machine which modifies Xcode and disables gatekeeper, or whatever.)

    • by 0123456 ( 636235 )

      Happily Android has also recently moved to this same "permission on demand" model which makes way more sense than "agree to laundry list of demands to run" ever did.

      If, by 'recently', you mean 'in a still unreleased version of the OS that most current Android users will never get.'

      It will be years before the majority of Android users have that capability, which should have been in the OS from the start.

  • From the article:

    The tainted version of Xcode was downloaded from a server in China that developers may have used because it allowed for faster downloads than using Apple's U.S. servers, Olson said.

    Really? Really?!?

    From the context in the article, it obviously sounds like these were Chinese developers. You click on the Apple app store, and Xcode downloads for free. I'm not sure how it could be easier - if speed was the issue, just update overnight. I can't figure out what the exact angle is, but it just seems too strange for legitimate developers to "innocently" make such a boneheaded mistake.

    Or, maybe Chinese developers are so used to just downloading everything illegally that th

    • A little bit of the former, a little bit of the latter I think.

      XCode takes forever to download in China and people are used to downloading black market software.

      • by lucm ( 889690 ) on Sunday September 20, 2015 @09:06PM (#50564057)

        XCode takes forever to download in China

        XCode, and everything Apple, takes forever to download everywhere. It's faster to download the CentOS "Everything ISO" (7GB) from a shitty ftp mirror in Egypt than to get XCode (3GB) from the global network of the wealthiest company in the world.

        Wtf Apple.

        • I live in Germany ... when I download XCode it rarely takes longer than 20 - 30 minutes.

          Your problem is most likely your shitty Internet in your country and not Apple. Ask your ISP what is going wrong.

    • I can't figure out what the exact angle is, but it just seems too strange for legitimate developers to "innocently" make such a boneheaded mistake.

      I'm just throwing it out there, but could it be something like: The developer thought he'd be clever by downloading a pirated/hacked version of OS X that runs on non-Apple hardware. The hacked version either then downloads a hacked version of Xcode, or won't allow a legit installation of Xcode so that the developer is forced to pirate that, too.

      I don't know, I'm just hypothesizing. If it's not something like that, then I have a hard time figuring out how an iOS developer could unintentionally install a f

  • I wonder if it would be possible for XCode to compute a hash of system libraries / executables that is then embedded into the App binary. Apple could then check this hash against what it should be and reject any app that was compiled with a bogus version of XCode or system libraries.

    Might not stop everything... but it could be a start.

    • Yeah, and I would simply include the correct hashes, from the "original (second)" XCode Installation.

      What you would do in the Java world is sign all classes; however, I guess that won't help much, as I assume the "hacked XCode" simply added an additional lib.

      That could be compiled freshly each time and signed with the developer's key; then the Trojan/virus looks like the developer had written it.
