IBM Releases XL compilers for Mac OS X

Visigothe writes "IBM released their XL Fortran Compiler and XL C/C++ Compiler for OS X. The compilers are binary compatible with GCC 3.3 and have multiple levels of optimization, creating binaries that are much faster than their GCC-compiled counterparts." No prices are noted, and the planned availability date is January 16.
  • Benchmarks (Score:5, Interesting)

    by melquiades ( 314628 ) on Wednesday January 14, 2004 @03:08PM (#7977049) Homepage
    Has anybody seen any useful benchmarks of compiler output comparing XL and GCC on PowerPC?

    That would be interesting to see.
    • Re:Benchmarks (Score:4, Informative)

      by integral-fellow ( 576851 ) on Wednesday January 14, 2004 @04:56PM (#7978552)
      for performance comparisons, see this page:
      http://www.spscicomp.org/ScicomP7/Presentations/ Blainey-SciComp7_compiler_update.pdf
      • http://www.spscicomp.org/ScicomP7/Presentations/Blainey-SciComp7_compiler_update.pdf Here's the correct address. The space in there screwed it up.
        • Re:Benchmarks (Score:5, Informative)

          by klui ( 457783 ) on Wednesday January 14, 2004 @07:55PM (#7980480)
          Slashdot mucks long lines. You need to use a link like this [spscicomp.org]. Basically, on a POWER4 system (unknown re: G4/G5), specint2000 is around 30% improvement, specfp2000 is around 50% improvement. (Just eyeing the results.)
            • Thanks for that! I was trying to find the tags to insert the link, but I couldn't find them. How does one embed a link?
            In fact, I could not find any directions for using HTML other than the spartan 'Allowed HTML:' list just below the Comment window. Where are the instructions?
            Cheers!
            fellow
            • It's standard HTML tags; find any decent HTML reference, and it should explain how to work the available tags.

              To make a link, enter it as something like:
              <a href="http://slashdot.org/">Slashdot</a>
              which appears as:
              Slashdot [slashdot.org]
              Note that in the href, the URL in quotes needs the http:// bit.

  • by jazuki ( 70860 ) on Wednesday January 14, 2004 @03:16PM (#7977139) Homepage
    Very cool! Looks like the C/C++ compiler also has support for Objective-C now. Even if it's in the form of a "technology preview" and probably preliminary.

    This means that this could well be usable as a replacement for GCC in developing Cocoa-based apps. It's good to finally have some options. Can't wait to see how well it works!
  • by Laplace ( 143876 ) on Wednesday January 14, 2004 @03:25PM (#7977246)
    It's great that IBM's compiler produces faster code that is compatible with gcc; however, it appears that it won't generate code that runs on G3 machines. This means that if you want to build apps with it, you either need to write code that builds with two compilers or drop support for G3 machines entirely.

    As a very happy G3 user I will be sad when I'm forced to upgrade.
    • Isn't that what fat binaries are for?
      • If I remember right (I'm not a Mac programmer), fat binaries are for mixing 68K and PPC code, not different PPC code.
        • The original "fat binaries" on the Mac were for 68k/PPC versions; however, the idea of a fat binary is not limited to any particular CPU architecture.

          You could easily produce a "Fat Binary" that runs on G3/G4-G5/Alpha/68K/SPARC. That said, it would be one big binary before you stripped it =)

          • by Anonymous Coward

            NeXT did "Fat Binaries" before Apple did, and they are still possible in OS X's .app bundles. NeXT added fat binary support on Black Tuesday [simson.net] in 1993. Apple's fat binaries were introduced with the Power Macintosh line in 1994. NeXT's fat binaries could be built to run on 68K, x86, SPARC, and PA-RISC.

            Of course, right now the search algorithm isn't designed for a fallback mechanism. The system can consider itself either a "MacOS" or a "MacOSClassic". Both are assumed to be generic PPC code, and one doesn't fall back to the other.

            • > The system can consider itself either a "MacOS" or a "MacOSClassic".

              That's not the "fat" part about the fat binaries. Fat binaries are for several architectures in one file.
              You build and link your binary like normal, then use lipo to merge them.
              Darwin 7 is built completely fat, with even the kernel being a fat binary.
              Here's the output of lipo -detailed_info on "yes":

              Fat header in: yes
              fat_magic 0xcafebabe
              nfat_arch 2
              architecture ppc
              cputype CPU_TYPE_POWERPC
              cpusubtype CPU_SUBTYPE_POWERPC_ALL
              off
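              To make the merge step concrete, here's a minimal sketch of building a fat binary by hand. The file name and the exact gcc/lipo invocations are my own assumptions about the usual Darwin workflow, not copied from Apple's docs:

              /* hello.c -- throwaway program, just so there's something to make fat */
              #include <stdio.h>

              int main(void)
              {
                  printf("hello from whichever slice the kernel picked\n");
                  return 0;
              }

              /* Hypothetical build steps, kept as comments so this stays one file:
               *   gcc -arch ppc   -o hello.ppc   hello.c
               *   gcc -arch ppc64 -o hello.ppc64 hello.c    (assumes a G5-capable toolchain)
               *   lipo -create hello.ppc hello.ppc64 -output hello
               *   lipo -detailed_info hello    (prints a fat header like the one above)
               */

              And lipo hello -thin ppc -output hello.ppc-only would pull a single slice back out, which is what "stripping" a fat binary amounts to.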

          • Don't forget x86! (Score:3, Interesting)

            by MarcQuadra ( 129430 ) *
            don't forget x86, there's nothing STOPPING the *.app files from holding code for any architecture. If Apple ever does have to jump ship to x86 I'm sure there'll be a lot of apps that are distributed with PPC and x86 (and probably x86-64) executables inside them.

            One Application Icon to Rule Them All.
            • Thinking of that, is there a way to have a fat binary that spans platforms, like Windows and Mac OS X?
              • Not really; the magic is that the application icon in OS X is really just a folder with folders inside it, including what would normally be installed as separate files in Windows. See, in Windows you have a binary .exe file and a whole slew of other files installed all over the system (dlls, jpgs, gifs, inis, etc.). In OS X you have one icon that's a folder with the 'tidbits' in it and separate folders for binaries for different machines. To get this working under Windows you would need a different binary loader.
    • I'm fairly sure that it will produce code that will run fine on a G3, especially because the G4 is basically a G3 with Altivec.

      These compilers will produce better code than GCC for the PPC chips in Macs, so maybe next year we will have an OS compiled with XLC instead of GCC (another speed-up). But I don't know how reliant Apple is on GCC for compiling (extensions and whatnot).

    • Comment removed based on user account deletion
    • The system requirements say it needs a G4 or G5, but the code it generates will work on a G3.
  • Pricing (Score:5, Informative)

    by Erect Horsecock ( 655858 ) on Wednesday January 14, 2004 @03:32PM (#7977356) Homepage Journal
    According to an Ars thread, the XLC compiler will be $499 for a single-seat license. WAY below the cost of the AIX versions.

    Linkage [arstechnica.com]
    • $500 is a good deal. Particularly since you should be able to take a project developed in gcc and re-compile it with XLC if the need arises. I'm considering starting work on a particular project; if it works out and I take it far enough, it could be in my financial interest to spend the $500. On the other hand, it could end up like other projects of mine, destined to be filed away and long forgotten... so I'll keep the $500 for now and spend it later if it's warranted.
  • Current compiler? (Score:2, Insightful)

    by Robowally ( 649265 )
    What is MacOS X currently compiled with? Is it GCC? If so, the new IBM compiler would presumably speed up the entire OS somewhat if it were recompiled via IBM's compiler?
    • Re:Current compiler? (Score:5, Informative)

      by Erect Horsecock ( 655858 ) on Wednesday January 14, 2004 @03:46PM (#7977605) Homepage Journal
      What is MacOS X currently compiled with?


      I'm not 100% sure, but I seem to remember Jobs saying in the WWDC keynote that it was built with gcc 3.3, the version that ships with Xcode.

      If so, the new IBM compiler would presumably speed up the entire OS somewhat if it were recompiled via IBM's compiler?


      Ya, probably. I was surprised that they implemented vector support so quickly. XLC really shines on floating-point code, but I'm really curious to see how well it handles vector code. Even if the whole operating system isn't compiled with xlc, as long as the core libs and things like the codecs for QT and other multimedia apps are, the speedup would be impressive.
      • Re:Current compiler? (Score:4, Informative)

        by Selecter ( 677480 ) on Wednesday January 14, 2004 @05:44PM (#7979143)
        Yes, Panther was built with GCC 3.3 "+", as some people are calling it. The OS would speed up quite dramatically *if* Apple were to release G3/G4/G5-specific versions, since the optimizations for the G5, if taken to the highest level, would actually slow down or break things on a G4 or, more likely, a G3.

        More likely they will break it down into chip-specific libs and kibbles and bits and have the installer detect and choose. If they code for the middle ground, I fear they will give up the best chance of having huge speed gains. They need those gains on the G5 to keep their momentum going.

        • They can't really do that, because one of Apple's trademarks is system independence. You can remove your Mac's hard drive and place it in a completely different machine, any machine capable of booting its OS, and have it work perfectly. I'd say that the boot loader, etc., would choose, not the installer.
          • But there is precedent for the other view. I don't know if you were around in the Quadra/Performa days, but you could install system software for your particular machine, or for all machines.
            If you chose "your machine", you could not move the system folder to another machine and have it boot if the other machine were substantially different.
            Granted, it was a bit of a mess, but it did provide a very lean system folder and increased performance.
      • Re:Current compiler? (Score:4, Informative)

        by TALlama ( 462873 ) on Wednesday January 14, 2004 @07:11PM (#7979979) Homepage
        The only problem is that XLC doesn't support G3 machines, so it would be impossible to create a binary that runs on any Mac, as GCC currently allows. In theory they could make separate binaries and install the 'correct' one, but that poses problems for systems booting off of external hard drives (which binary do you put on the drive?) and booting over the network. The ability to carry around a copy of OS X on your iPod is a powerful thing, and not one most people would give up lightly.

        Of course, this could be gotten around by using bundles: a bundle is a folder that acts like a double-clickable application. The structure is:
        SomeApplication.app/ <-- The application
        Contents/MacOS/SomeApplication <-- The OSX binary
        Contents/MacOSClassic/SomeApplication <-- The OS9 binary
        Contents/Resources/Blah.jpg
        Contents/Resources/Foo.tiff

        It could be:
        SomeApplication.app/
        Contents/MacOS/SomeApplication <-- The generic binary
        Contents/MacOS-G3/SomeApplication <-- The G3-optimized binary
        Contents/MacOS-G4/SomeApplication <-- The G4-optimized binary
        Contents/MacOS-G5/SomeApplication <-- The G5-optimized binary
        Contents/Resources/Blah.jpg
        Contents/Resources/Foo.tiff

        When you double-click, it uses whatever binary is appropriate for the system. Unfortunately, this doesn't work for Frameworks, which lack the notion of platform.
        A rant to beat the lameness filter: this bundle format should be adopted everywhere. It allows a folder to be used as an application and to contain all the resources it needs to run. Moving the folder moves the application, and the folder doesn't use any voodoo to keep the data together, as pretty much any HD format understands folders and files.
        In addition, the multiple-binaries trick (as shown above working with OS 9/X and proposed for processors) would allow the same bundle to work on multiple platforms, so I could email you a zipped version of Office from my Mac that could work on your Wintel, no Java required.
        The support is in the Finder/Explorer/Browser, which needs to understand that 'double click on bundle' == 'find correct binary and launch it'.
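        To make the 'find correct binary and launch it' step concrete, here's a rough sketch of how a launcher could pick one of the processor-specific sub-binaries proposed above. Reading hw.cpusubtype through sysctlbyname() is just my assumption of one reasonable way to detect the chip, not how Finder actually resolves bundles:

        /* pick_binary.c -- toy launcher: choose a G3/G4/G5 directory inside a bundle */
        #include <stdio.h>
        #include <sys/types.h>
        #include <sys/sysctl.h>
        #include <mach/machine.h>

        int main(void)
        {
            int subtype = 0;
            size_t len = sizeof(subtype);
            const char *dir = "Contents/MacOS";               /* generic fallback */

            if (sysctlbyname("hw.cpusubtype", &subtype, &len, NULL, 0) == 0) {
                if (subtype == CPU_SUBTYPE_POWERPC_970)
                    dir = "Contents/MacOS-G5";
                else if (subtype == CPU_SUBTYPE_POWERPC_7400 ||
                         subtype == CPU_SUBTYPE_POWERPC_7450)
                    dir = "Contents/MacOS-G4";
                else if (subtype == CPU_SUBTYPE_POWERPC_750)
                    dir = "Contents/MacOS-G3";
            }

            printf("would exec %s/SomeApplication\n", dir);   /* a real launcher would exec() it */
            return 0;
        }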
        • Re:Current compiler? (Score:3, Informative)

          by Graff ( 532189 )

          Of course, this could have been gotten around by using Bundles, which is a folder that acts like a double-clickable application...When you double-click, it uses whatever binary is appropriate for the system. Unfortunately, this doesn't work for Frameworks, which lack the notion of platform.

          Ahh, but that is what fat binaries [apple.com] are for. A fat binary allows you to package up several different versions of a program optimized for different runtime architectures.

          Macintoshes have been able to use fat binaries for years now.

          • Here is more information on optimizing for the G5; if you look towards the bottom, there are notes on packaging an app to run on different processors.

            Whups, somehow my link was lost. Here is the link [apple.com] to optimizing the G5. And yes, I know it should be PowerPC not PowerPc! :-)
          • Sorry, this is not what fat binaries are for.
            The fat binary was an ugly quick-and-dirty hack that ruined the beautiful, clean design of resource forks. A better way would have been to use, for example, "PPCD" resources for PPC machine code and "CODE" for the existing 68k binary code.
            The NeXT approach used by Mac OS X is, however, much more scalable. It works really well in OpenStep bundles.
  • by Troy ( 3118 )
    Pardon my ignorance, I was 31337 for only 37 seconds in 1997.

    What do they mean when they say that two compilers are "binary compatible"? Does it mean that XL produces identical machine code? Does it take identical switches, so makefiles don't have to be rewritten? Does it simply mean that XL has the same foibles as gcc, so code written to gcc's foibles doesn't need tinkering? Use of the term doesn't quite fit with my current understanding of compilers.

    -Troy

    • by Anonymous Coward
      I would imagine that it means a library compiled by one compiler could be linked by an app compiled with the other.
    • by dr2chase ( 653338 ) on Wednesday January 14, 2004 @03:48PM (#7977638) Homepage
      Binary compatible means same data layouts, same parameter-passing conventions, same conventions for shared libraries and position-independent code. However, between those interfaces, the generated code is probably different.

      Think of it like nuts and bolts -- a nut and bolt are compatible if they have the same diameter and threads per inch, but they may be made of carbon steel, steel, bronze, nylon, titanium, whatever.
    • Usually it would mean that if you had a (shared) library that was compiled with XLC, you could use it in a program compiled with gcc, or vice versa. Perhaps there's more to it than that, but that would be the minimum requirement, I think.

      The extreme case would be that Apple could compile OS X with XLC ($) but still allow people to run applications written with gcc (free).
    • Looking at IBM's announcement [ibm.com], it looks like they use the same headers and run-time libraries as gcc 3.3. They say you can combine xlc- and gcc-compiled files, i.e., you can link the object files together.
    • by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Wednesday January 14, 2004 @03:57PM (#7977760) Homepage
      I believe that it refers to the interfaces between code. What I mean is that, for example, it lists the functions in object files the same way as GCC, and they are called with the same machine-code sequences as GCC (the way arguments are put on the stack, etc.). This is good for a few reasons. For one thing, it means that code built with this compiler can link to code built with GCC, or vice versa. Ordinarily you can't take an object file from VC++, one from OpenWatcom, one from GCC, and one from ICC and link them together. But if all the compilers were binary compatible, you would be able to. It has nothing to do with the internal code generated; if both compilers generated identical sequences of machine code, one couldn't be faster than the other. I think the main benefit is that, for example, you could use a static library that was compiled with this compiler with your code that uses GCC without a problem.

      As for commandline switches and such, I would assume that they would be the same (or that there would be a simple option like --gcc that would turn on "gcc mode" so that it took the same command line stuff).

      PS: If I'm wrong would someone please reply and correct me, and not just mod me wrong?

      • by g_lightyear ( 695241 ) on Wednesday January 14, 2004 @06:53PM (#7979827) Homepage
        "binary' refers, indeed, to the binary compatibility of object files; in GCC terms, when there's an "ABI" change, you have to re-compile all applications, as new stuff compiled in the new Application Binary Interface can't access stuff compiled in the old ABI.

        What it REALLY means:

        1) You can compile the majority of your application in GCC, and selectively compile in IBM's XLC.

        2) You can compile one library in XLC, and link it in to your GCC application.

        3) You can compile a library in GCC, and link it in to your XLC application.

        Etc. You get the point. Essentially, while the code they generate is very, very different in terms of optimization and performance, they are, in fact, completely interchangeable in terms of the things they produce as output.

        XLC is, in fact, a very different beast than GCC. The number of optimizations it provides goes well beyond what GCC currently provides, and does include auto-vectorization and support for OpenMP - things which don't suck on parallel systems.

        So XLC is a good thing for commercial software developers, and at minimum the compatibility means that we as developers have no excuse not to be compiling, at the bare minimum, the most *important* functions (and if we're doing it this way, it might as well be specific functions) with XLC and linking the resulting parallelized and optimized object files into our existing projects.

        As for commandline switches... nope. Almost never compatible. No hope. Basic stuff is mildly similar, but the guts you'd use once optimizing are very different.

        But at a high level, yes, you just say xlc -O3 instead of gcc -O3, only you might say xlc -O5.
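        A minimal sketch of the mix-and-match described above, just to make it concrete. The file names and the exact invocations are mine, assumed from how the two drivers normally behave, not lifted from IBM's documentation:

        /* dot.c -- imagine compiling this one file with IBM's xlc for the optimizer */
        double dot(const double *a, const double *b, int n)
        {
            double sum = 0.0;
            int i;
            for (i = 0; i < n; i++)
                sum += a[i] * b[i];
            return sum;
        }

        /* main.c -- and this one with gcc, then link the two objects together */
        #include <stdio.h>

        double dot(const double *a, const double *b, int n);

        int main(void)
        {
            double a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1};
            printf("dot = %f\n", dot(a, b, 4));
            return 0;
        }

        /* Hypothetical mixed build:
         *   xlc -O3 -c dot.c
         *   gcc -O2 -c main.c
         *   gcc -o demo main.o dot.o    (works only because the two compilers share an ABI)
         */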
        • And just to underline the point: names are mangled the same way between GCC and XLC, so the linker knows what the string of trash characters means (the mangled names of C++ functions, which account for overloading).
  • FASTER OS X? (Score:5, Interesting)

    by zulux ( 112259 ) on Wednesday January 14, 2004 @03:42PM (#7977538) Homepage Journal
    What does Apple use to compile OS X - and if IBM gets the Objective-C support working properly, could Apple use the IBM compiler to get OS X to run faster?

    • I would assume that they already use IBM's compilers, since IBM designed the chips and would have the fastest/most mature compilers. What other option is there? GCC? I thought that GCC didn't produce very well-optimized code for the G3/4/5 (optimized to use AltiVec, etc.).

      Just like I'd assume that Microsoft would use Intel's compilers if a) Microsoft didn't make compilers and b) AMD (and other clones) weren't around.

      So I'm just speculating. Anyone know the real answer?

      • Re:FASTER OS X? (Score:4, Interesting)

        by addaon ( 41825 ) <addaon+slashdot@nOsPAM.gmail.com> on Wednesday January 14, 2004 @04:22PM (#7978079)
        OS X and all Apple software for OS X that I know of (maybe not including some high-end stuff, I just don't know for sure) is indeed built with GCC.

        Porting an operating system to a different compiler is a pain in the neck, and most OSes use compiler-specific tricks to deal with low-level details. Also, most of Apple's higher-level software is in Objective-C, and as of now only GCC really supports Obj-C well on the Mac.
      • I've always heard that Apple uses GCC to compile OS X. Apple has always touted the GCC name on its website, and I remember hearing that a lot of the speed increases in recent releases are due to new optimizations in GCC. Also, the much maligned/celebrated SPEC benchmarks Apple touted for the G5 were obtained with GCC builds. Seeing that Objective-C support is only a "technology preview" in the IBM XL compilers, you can be certain it wasn't used for any current version of OS X.
      • by DAldredge ( 2353 ) <SlashdotEmail@GMail.Com> on Wednesday January 14, 2004 @04:57PM (#7978559) Journal
        Have you thought about applying for a /. editors job?
    • Re:FASTER OS X? (Score:4, Interesting)

      by Visigothe ( 3176 ) on Wednesday January 14, 2004 @04:39PM (#7978352) Homepage
      Apple currently uses GCC to build Panther. As XL is much faster in 95% of the situations, I would imagine that Apple would transition to XL once the Obj-C portions of the compiler were a bit more mature. [The public beta of XLC didn't have any Obj-C support]
    • Re:FASTER OS X? (Score:5, Interesting)

      by rmlane ( 589573 ) on Wednesday January 14, 2004 @07:30PM (#7980180)
      As mentioned by others, the majority of OS X is compiled by GCC.

      The exception is QuickTime, which uses (and has used since well before OS X) an older, custom version of the IBM compilers. I believe, but am not 100% sure, that QuickTime has always used the IBM compilers on PowerPC CPUs.

      This is very good news for Apple's science users; one of the real problems in pushing Mac boxes into some markets has been the lack of a really good Fortran compiler. The performance boost for C/C++ code will also be appreciated.

      As for a wholesale transition of OS X to the IBM compilers: next to no chance. QA of the transition would take far too long and absorb resources that could be better used on other improvements. It would also cause problems with the Open Source versions of Darwin, so expect the vast majority of OS X to remain GCC compiled.

      That being said, I would expect that certain chunks will be transitioned where it makes sense. The output of the IBM compilers is binary compatible with GCC, so you can recompile (and re-QA) the chunks of the OS where you'll get a major improvement.

      Quartz Extreme, CoreFoundation and AppKit spring to mind, but don't expect this to happen in 10.3 or 10.4, more like 10.5 or 11.0.

      • Re:FASTER OS X? (Score:3, Informative)

        by g_lightyear ( 695241 )
        As there's work going into XCode to ensure that any project can specify which compiler it uses on a target-by-target basis, I fully expect we'll see several core projects in the Darwin codebase switch over to using both compilers (where XLC will be used to compile specific branches) in 10.4.

        OpenMP, at app-level, is pretty much guaranteed to get some use, and Apple will very likely spend some time in the vec/math libs fully OpenMPing that code to get parallel use of both CPUs.

        CoreGraphics would probably ge
      • Re:FASTER OS X? (Score:3, Interesting)

        by gerardrj ( 207690 )
        You mean it would consume more resources than a complete rewrite of the OS from Pascal to C, as happened with the System 6 to System 7 transition?

        I expect that once IBM completes the ObjC compiler, most of the OS will be migrated to that compiler, as will much of the high-performance commercial software out there. The developer tools will still come with the free GCC compiler, and Apple will still maintain it, but without changes to the core of GCC (which are being resisted by the maintainers), it will ne
        • What is the status of GCC optimizations for the Macintosh platform? Why is the GCC team resisting? And can't Apple just fork GCC if they need to make heavy modifications for the G5?
          • Re:FASTER OS X? (Score:3, Informative)

            by gerardrj ( 207690 )
            Apple has contributed some very major improvements to the gcc code base. I don't recall exactly, but I think Apple managed to improve compile times by something like 25% and code performance by something like 15%. I don't remember where I read those numbers, or how close I am to what I read.

            The reason (again, from what I've read) that the gcc maintainers are resisting the changes that Apple would really like to make for the G5 is that the changes would fundamentally break many if not most of the other platforms com
    • I would guess that only a relatively tiny portion of OS X would benefit from things like autovectorization--probably most of the code is passing messages around, blocking on semaphores, writing out display lists to the video card, managing queues, etc., etc. The code that tends to benefit from vector optimizations is the really tight inner loops of 2D graphics routines and audio/video codecs; these are small enough that it may be easier to just vectorize them by hand or with GCC's intrinsics and probably b
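      For anyone curious what "vectorize them by hand or with GCC's intrinsics" looks like in practice, here's a small sketch using the AltiVec intrinsics from <altivec.h> (with FSF gcc that's -maltivec, with Apple's gcc -faltivec). The function itself and the aligned, multiple-of-4 input assumptions are mine, purely for illustration:

      /* madd4.c -- hand-vectorized multiply-add over float arrays */
      #include <altivec.h>

      /* dst[i] = a[i] * b[i] + c[i]; assumes 16-byte-aligned arrays and n % 4 == 0 */
      void madd4(float *dst, const float *a, const float *b, const float *c, int n)
      {
          int i;
          for (i = 0; i < n; i += 4) {
              vector float va = vec_ld(0, a + i);
              vector float vb = vec_ld(0, b + i);
              vector float vc = vec_ld(0, c + i);
              vec_st(vec_madd(va, vb, vc), 0, dst + i);
          }
      }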
  • Autovectorisation ? (Score:5, Interesting)

    by Jesrad ( 716567 ) on Wednesday January 14, 2004 @04:27PM (#7978174) Journal
    Weren't these compilers supposed to bring automatic conversion of multiple 32-bit arithmetic operations into AltiVec-accelerated code?
    • Actually, reading around, it seems like a limited version of this will indeed be available. Now, will it be as good as we'd like?

      Maybe, maybe not. :) I'd like to see a really aggressive auto-vectorization scheme that goes for every possible chance to parallelize code. Since the PowerPC spec calls for so dang many registers, it seems like it'd be much easier (and provide more benefits) to store up several ops in registers and then chain them.

      Integer ops are also subject to this. :)
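      For reference, this is the sort of loop an auto-vectorizer is supposed to handle on its own; the function is just an illustrative example, and I'm not claiming any particular xlc option switches it on:

      /* saxpy.c -- the classic y = a*x + y loop, an easy auto-vectorization target */
      void saxpy(float *y, const float *x, float a, int n)
      {
          int i;
          for (i = 0; i < n; i++)
              y[i] = a * x[i] + y[i];
      }

      Compiled scalar, each iteration does one multiply-add; a vectorizing compiler can rewrite it to process four floats per AltiVec operation, which is where the "multiple 32-bit operations at once" idea comes from.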
  • Fortran Motives (Score:4, Interesting)

    by fm6 ( 162816 ) on Wednesday January 14, 2004 @05:58PM (#7979282) Homepage Journal
    From what I've heard, software companies hate selling Fortran compilers. You'll notice that Microsoft no longer has one. Not enough people use the language to make it worth the development and support costs.

    So why are you still able to buy Fortran compilers? Because the people who use the language tend to be engineers (the physical kind) and scientists, and thus spend a lot of money on high-end computers. No Fortran compiler, no fat contracts for your Starfire [sun.com] and Origin [sgi.com] boxes. Which is why Sun and SGI both sell Fortran. And who's the leading vendor of Fortran for the Itanium? Good guess [google.com].

    So is IBM trying to help Apple sell more Macs? Probably not. They'd make a little money from the extra CPU sales, but not enough to justify something like this. More likely they have this compiler to help them sell more high-performance PPC systems [ibm.com]. As long as they have it, it's not that much extra effort to port it to the Mac.

    • Re:Fortran Motives (Score:3, Interesting)

      by jabberjaw ( 683624 )
      Yes indeed, Fortran is still alive and kicking [rice.edu], although I have heard that some of the physics libraries are being converted to C/C++.
      As an aside, has anyone else noticed the lack of Fortran texts in brick & mortar bookstores? I know Numerical Recipes in Fortran [nr.com] is online; anyone care to mention a good intro text for a "n00b" like me?
      • Re:Fortran Motives (Score:3, Interesting)

        by fm6 ( 162816 )
        I used to work with a guy who had just finished a Physics PhD. He hated Fortran, and had insisted on using C++ all through school. But he admitted to being the only person in his program who did so. Proof that programming languages are as much about the community they serve as the technology they encompass.
      • Try "Classical Fortran, Programming for Engineering and Scientific Applications" by Michael Kupferschmid, ISBN=0824708024, around $70

        Its a well written book that assumes the reader knows nothing of fortran. He teaches Fortran at RPI [rpi.edu].

      • No lack of texts in Portland [powells.com]. These are all on the shelf at Powell's Technical Books at like NW 8th (maybe 9th) and Burnside. Or you can buy them online too if you don't live in Stumptown....
    • Re:Fortran Motives (Score:1, Interesting)

      by Anonymous Coward
      Fortran is part of the SPEC benchmark, so every vendor will continue to produce Fortran compilers and write them off as a marketing cost.
  • So has anyone got better pointers towards the state of their Objective-C support? I know they say it is there as a technology preview with no guarantees until they finish, but does it basically work, just slowly, or is it unable to compile even modestly complex stuff?
  • Big Blue might find it hard to compete with the free ADC tools, no matter the quality of their XLC.
  • Does this symlink itself over GCC, or does it add itself to gcc_select so you could do sudo gcc_select xl and get the new compiler? I know that by setting variables you can get Project Builder/Xcode to use a different compiler or pass different flags. Now that there's a bit of competition in PPC compilers, maybe GCC will get faster.
  • by Anonymous Coward
    Just to note (this is Slashdot, after all) - these compilers have also been released for Linux on PowerPC! And there, they support both 32-bit and 64-bit ABIs. On OS X, you're limited to the 32-bit ABI.

"Oh what wouldn't I give to be spat at in the face..." -- a prisoner in "Life of Brian"

Working...