Graphics OS X Operating Systems Ubuntu Apple Linux

OS X 10.8 vs. Ubuntu On Apple Hardware, Benchmarked 130

Posted by timothy
from the apt-get-me-a-new-version dept.
An anonymous reader writes "OS X 10.8 has been benchmarked against Ubuntu Linux with some interesting results. From the tests on an Apple Mac Mini and Apple MacBook Pro, OS X Mountain Lion was clearly superior when it came to graphics performance, but the rest of the time the operating systems performed quite closely with no clear winner. OS X also seems to have greater performance issues with solid-state drives than Linux."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • surprise surprise (Score:5, Informative)

    by shentino (1139071) on Thursday August 23, 2012 @02:03PM (#41098477)

    Apple hardware performs better when run by Apple device drivers.

    News at 11.

    • Stole my line....

    • by haystor (102186)

      No kidding.

      What are the benchmarks on all the non Apple hardware platforms?

      • by Gr8Apes (679165)
        What would you like run? Pretty sure you can go do a little research at www.insanelymac.com or www.tonymacx86.com for example and see some benchmarks on hackintoshes.
    • Re:surprise surprise (Score:5, Informative)

      by maccodemonkey (1438585) on Thursday August 23, 2012 @02:43PM (#41099211)

      For graphics, what Apple device drivers?

      The graphics drivers are written in-house at NVidia and AMD. Apple doesn't actually write their own drivers. And the GPUs are just bog standard AMD, NVidia, and Intel GPUs (except for some of the graphics switching). There is no reason Linux should be at a disadvantage.

      And if they did, I'd expect worse performance. Back when Apple used to write their own drivers, they were totally awful. Apple has less experience writing graphics drivers; I'm not actually sure why you'd expect Apple-written drivers to perform better.

      • The hardware tested had Intel HD 3000 integrated graphics. The Linux Intel drivers are known for being slow, but the Nvidia drivers are just as fast as (or in some cases slightly faster than) the Windows equivalent, and presumably the OS X one as well. ATI I'm not sure about, but they're at least closer to Windows performance than Intel is.

      • Intel graphics on Linux uses the open source Mesa/Gallium stack, which still has significantly lower performance than the proprietary drivers. Frankly, I'm wondering if the GPU is being used at all. I have a Radeon 6870, and with the open source radeon driver I don't see any acceleration. For example, a full-screen xterm with Midnight Commander takes a full half-second to draw the frame with only 160x50 char cells. With fglrx 12.8, the drawing time is not noticeable at all. The Mesa radeon feature matrix says R

        • Are you actually using xterm, or is it one of the libvte based terminals (gnome-terminal, xfce4-terminal, etc.)? I recently encountered some pretty serious performance issues with libvte. Try Konsole, or urxvt if you want something less heavy.
          • by Chemisor (97276)

            Are you actually using xterm, or is it one of the libvte based terminals (gnome-terminal, xfce4-terminal, etc.)?

            I'm using xterm. gnome-terminal is actually faster because it uses Xrender to draw the text, while xterm relies on Xft. Under fglrx gnome-terminal is awesomely fast, with no flicker at all. Unfortunately, vte-based terminals flash the gray background when you switch to them. They also have that ugly resize handle in the lower right corner that nothing can remove. Oh, and gnome-terminal captures F1

      • by CAIMLAS (41445) on Thursday August 23, 2012 @08:57PM (#41104227) Homepage

        The graphics drivers are written in-house at NVidia and AMD. Apple doesn't actually write their own drivers. And the GPUs are just bog standard AMD, NVidia, and Intel GPUs (except for some of the graphics switching). There is no reason Linux should be at a disadvantage.

        What, you mean aside from the fact that Linux drivers for all those respective device manufacturers don't really get a whole lot of attention from the developers compared to Windows and OS X?

        But really, that's not important. What people need to pay attention to is that this is something done by Phoronix. This means you need to consider a couple things:

        * This benchmark is almost meaningless. Time and time again, I have seen them (falsely) correlate data with an assumption.
        * The review was done by someone who doesn't really know what they're talking about.
        * These are synthetics. Without context or understanding of what the benchmarks are doing (there is no explanation) or what may have led to the
        * The discrepancies are, in most cases, severe enough that you have to assume (at least) one of the following: their benchmark suite was not properly/identically configured for all architectures, or there are drastic implementation discrepancies within the benchmark tool they're using (e.g. it was designed with only a specific use case in mind).

        The reason there is "no clear winner" is because it's all rubbish. They're throwing 100 things at two different targets and comparing what misses and saying "no conclusion". Really? You'd have better consistency with an ink blot test of random participants, with ink blots generated by a true random number generator.

        Some of the graphics benchmarks don't stand out; the ones that stand out the most are the computational ones involving (very) standard libraries or frameworks which then contradict later results.

        For instance, CompileBench and Threaded I/O Tester: OS X falls flat on its face. The threaded I/O tester result I believe, because I've seen the same with db and server performance. But earlier, they've got pgbench giving OS X four times the performance for postgresql as Linux. Is that even rational, given that even FileMaker has shied away from OS X as a preferred platform due to threading and filesystem performance?

        Then, they go on to fail to explain these things and why they're fundamentally inconsistent. Not just "this doesn't quite line up, we can write it off due to different library version overhead" but in line with "this car goes faster because its engine is smaller". What?

        On a more personal level, I have used their suite of benchmark tools and come away fairly underwhelmed by the results. They're inconsistent and inexplicable, such as those seen in this review.

        Here's a hint, benchmarkers:

        * when you benchmark something, you must compare things and try to figure out why they are performing as they do.
        * If there are gross discrepancies which belie a reasonable expectation or contradict other information, investigate them, because they're probably important.
        * Be sure of what you're comparing. If you've got (more or less) identical binaries on different platforms and the same hardware, you're just comparing the kernel. Is that what's happening here? Are their tools linked against native libraries (which would, you know, be an honest benchmark of said platform) or do they use their own stack?

        Anyway, I could go on, but you get the idea. This benchmark is stupid on its face. The only benchmarks I'd trust from this roundup here are those that are straight up "measure something real" (frames per second in x, time to complete concrete task y). They paint a very different picture than when the synthetics are thrown into the whole: overall OS X performance is pretty abysmal, but it is marginally better at graphical things than Linux. This fits pretty closely with my (personal) observation that OS X is about 10-15% slower than Linux on general things, markedly slower on threaded things, and a dog at file manipulations while having a firm grasp on display management/graphical stuff - so it might just be my "uneducated Apple-hating bias" speaking. :)

    • Re: (Score:2, Funny)

      by burne (686114)

      Intel HD 3000 or NVIDIA GeForce GT 330M graphics is Apple hardware? That would be real news...

    • Re: (Score:2, Insightful)

      by hairyfeet (841228)

      I just wish people would be happy with what they like instead of the constant "X is better than Y" flamewars. If OSX is to your liking and you think Apple hardware is worth the extra scratch? Then please enjoy, wish you nothing but luck. If you think Linux is worth the hassle, are a programmer and need the ability to script, or one of the lucky few that manages to find hardware that is never broken by an update? Great, I wish you nothing but luck and happiness. If you are one of those people for whom the la

      • Re:surprise surprise (Score:5, Interesting)

        by shentino (1139071) on Thursday August 23, 2012 @06:03PM (#41102313)

        I love how biased you are, calling Linux a hassle and saying that one must know how to script to use it properly.

        There are actually, contrary to rumor, a few user friendly distros out there that don't require a PhD in computer science to make use of.

        And Microsoft at least HAS been caught hiding APIs that give its own programs a performance advantage.

        My comment isn't about which is better, anyway. It's about which ones cheat on their benchmarks by giving themselves a proprietary boost not available to the competition.

        See also the scandal of either nvidia or ati making its own hardware's performance deliberately go down the crapper when it detected the competition's chips.

        • by hairyfeet (841228)

          Sure they don't...until you update the thing and Linus takes a steaming dump all over your drivers! But don't take MY word for it, how about one of the Red Hat Developers [google.com] who says the desktop is "suckage" and the entire system is broken? Are you claiming HE has a "bias" too? How about a list of things horribly broken in Linux [narod.ru] and please note the date of the list is 2012 AND it has links to every. single. example. so you can check them for yourself.

          I repeat if you are willing to put up with a broken driver

      • by jbolden (176878)

        This isn't about running Ubuntu, it's a way to test out areas where the Linux kernel needs improvement. For example, 9 watts OS X vs. 21 watts Ubuntu shows an area where Linux could use some help. Or the problems Linux has with dual video subsystems, an area for improvement. On the other hand, the fact that Ubuntu was able to outperform OS X on SSD performance is likely an area in which XNU could use some help.

    • by ardor (673957)

      This isn't just because of the drivers. It's because Quartz is clearly superior to X11. Built from the ground up to make use of HW acceleration, most likely with a scenegraph-like approach, it easily outclasses X11, which has an architecture that is an anachronism these days.

      An interesting comparison would be to run an OpenGL 2/3 benchmark tool on Linux, then on OSX, with nvidia hardware, then with ati/amd hardware.

  • by Anonymous Coward

    How is any of this surprising and/or news? Mac OS X has been designed with the graphics card of MacBooks in mind. Other parts of the hardware don't require as much magic, so there's less difference...

  • by MrEricSir (398214) on Thursday August 23, 2012 @02:06PM (#41098545) Homepage

    I made the mistake of "upgrading" two Ubuntu 12.04 desktops to solid state drives, only to find the performance increase was trivial.

    What gives? The difference between magnetic drives and SSDs on OS X is incredible. Is this a driver issue, or what?

    • by Sparticus789 (2625955) on Thursday August 23, 2012 @02:13PM (#41098645) Journal

      PEBKAC. When I upgraded my Ubuntu laptop to SSD, boot time was under 10 seconds and my battery life while surfing the internet went from 3 hours to almost 5 hours. Not all SSDs are made the same; you have to research the performance of each, power draw, etc.

      That being said, I bought the SSD with the second-lowest power usage and middle-of-the-road performance.

    • by Hatta (162192) on Thursday August 23, 2012 @02:15PM (#41098681) Journal

      Linux caches disk reads pretty aggressively. If you have plenty of RAM, you might only notice a difference the first time you start an app.
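The caching effect described above is easy to probe with a short Python sketch (the scratch path and size are made up for illustration). One caveat: writing the file also populates the page cache, so to see a genuinely cold first read on Linux you would first drop caches as root (`sync; echo 3 > /proc/sys/vm/drop_caches`); as written, both reads are warm:

```python
import os
import time

# Sketch of the parent's point: Linux serves repeated reads from the page
# cache, so the second read never touches the disk at all.
PATH = "/tmp/cache_demo.bin"  # hypothetical scratch path
SIZE = 16 * 1024 * 1024       # 16 MiB

with open(PATH, "wb") as f:
    f.write(os.urandom(SIZE))

def timed_read(path):
    """Read the whole file, returning (bytes_read, seconds)."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        n = len(f.read())
    return n, time.perf_counter() - start

n1, t1 = timed_read(PATH)  # cold only after a cache drop
n2, t2 = timed_read(PATH)  # warm: served from RAM, drive speed irrelevant
print(f"first read: {t1:.4f}s  second read: {t2:.4f}s")
os.remove(PATH)
```

With enough RAM, every launch of an app after the first is effectively a warm read, which is why an SSD upgrade can look underwhelming on Linux.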

    • by cpu6502 (1960974) on Thursday August 23, 2012 @02:17PM (#41098731)

      That's not what the article said. It said OS X had performance issues with solid state drives.

      Also I'm kinda curious: why would anyone spend twice as much to buy an Intel Mac PC if they're just running Linux? I'd buy a regular PC for 1/2 to 2/3rd the cost.

      • by sl956 (200477) * on Thursday August 23, 2012 @03:00PM (#41099461)

        Also I'm kinda curious: why would anyone spend twice as much to buy an Intel Mac PC if they're just running Linux? I'd buy a regular PC for 1/2 to 2/3rd the cost.

        I looked for a silent small footprint linux pc. I was unable to find one. That's why I bought a Mac Mini. It runs Linux flawlessly... and silently thanks to the fanless design and SSD.

        People wanting an HD screen on a laptop might also have to buy Apple hardware even though they plan to use only Linux.

        • by dfghjk (711126)

          Mac Minis are not fanless designs nor are they silent.

          • by CharlyFoxtrot (1607527) on Thursday August 23, 2012 @04:41PM (#41101085)

            I have the older style Mini and when the HDD goes to sleep and it runs on SSD-only it's damn near completely silent. The fan will only come on when really stressing the cpu.

            • I have the older style Mini and when the HDD goes to sleep and it runs on SSD-only it's damn near completely silent. The fan will only come on when really stressing the cpu.

              Psh, that's not fixing the real issue. You should solve the problem at its source. My heatsink fans could be a pair of turboprops, and I still wouldn't hear them.

              Now, where was I? Look, I don't know or care if this is my lawn or not, but you better get to stepping!

            • by gerardrj (207690)

              The fan is ALWAYS on. I think idle for the older Mini's fan is 1,200 RPM, with a max of about 5,500 RPM.

      • by willy_me (212994) on Thursday August 23, 2012 @03:06PM (#41099551)

        I would guess it is because OS X defragments the drive as it is being written. The overhead largely goes unnoticed when writing to a traditional hard drive, but given SSDs' greater speed, it makes OS X appear to have performance issues. The thing about performing inline defragmentation is that it improves speed as the computer ages and as the HD begins to fill. Because all of the benchmarks were performed on fresh systems, the benefits of a defragmented drive would not be noticed.

        The question I have is: with the low seek times of SSDs, is there still a need to defragment drives? Probably, but to what degree? It surely is not as important as when one is using a traditional hard drive.

      • I sure wouldn't buy a new MacBook to run Linux. But, I might switch this one to Linux if/when it can't handle a new OS X release.

      • Isn't it obvious? Because you like the design of the hardware, and feel that the price difference is worth it. It's precisely the same reason someone would go buy a ThinkPad or similar upper-tier laptop and run Linux on it.

      • by CharlyFoxtrot (1607527) on Thursday August 23, 2012 @04:35PM (#41100983)

        Footprint, noise (or lack thereof), the ability to run all major OS (OSX, Windows, Linux) on the same machine, low power usage and nice looking sturdy construction. If you're going to be putting it on a desk the Mini is a nice little package.

      • Seven hours of battery life, and very high resale values compared to Windows laptops. From my experience, 4 year old Windows laptops go for less than $100, while a similarly aged Macbook usually goes for 40%-50% original purchase price. Seriously, most cheap Windows laptops feel like toys. Macs don't.
    • by 0123456 (636235) on Thursday August 23, 2012 @02:35PM (#41099049)

      I made the mistake of "upgrading" two Ubuntu 12.04 desktops to solid state drives, only to find the performance increase was trivial.

      If a process isn't disk-intensive, an SSD will make no difference. If it's not seek intensive, a cheap SSD may actually be worse; if I remember correctly, sustained reads from my 'Green' hard drive are 80-100MB per second, whereas one of my SSDs only gets about 40MB per second.

      The big benefit is reduced seek time, and a lesser benefit from faster sustained reads on the more modern and/or expensive SSDs. It won't make games run faster unless they're streaming from disk, or improve CPU-intensive 3D rendering, or anything much else that doesn't require a lot of disk seeks.
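The seek-time argument can be sketched in Python: issue the same 4 KiB reads sequentially and in shuffled order. Against an uncached file on a spinning disk the random pattern is dramatically slower; on an SSD (or a freshly written, hence cached, file as here) the gap largely disappears. Path and sizes are invented for illustration:

```python
import os
import random
import time

# Compare sequential vs. random 4 KiB reads of the same file.
PATH = "/tmp/seek_demo.bin"  # hypothetical scratch path
BLOCK = 4096
BLOCKS = 2048  # 8 MiB total

with open(PATH, "wb") as f:
    f.write(os.urandom(BLOCK * BLOCKS))

def read_pattern(offsets):
    """Read BLOCK bytes at each offset, returning (bytes_read, seconds)."""
    total = 0
    start = time.perf_counter()
    with open(PATH, "rb") as f:
        for off in offsets:
            f.seek(off)
            total += len(f.read(BLOCK))
    return total, time.perf_counter() - start

sequential = [i * BLOCK for i in range(BLOCKS)]
shuffled = sequential[:]
random.shuffle(shuffled)

seq_bytes, seq_t = read_pattern(sequential)
rnd_bytes, rnd_t = read_pattern(shuffled)
print(f"sequential: {seq_t:.4f}s  random: {rnd_t:.4f}s")
os.remove(PATH)
```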

      • by 0123456 (636235)

        Oh, and that cheap and crappy SSD cut my Ubuntu netbook's boot time from about 45 seconds to about 15 seconds.

      • If a process isn't disk-intensive, an SSD will make no difference

        and almost all apps are relatively disk intensive (vs. CPU intensive) and hence IO bound, unless you are running something like BOINC that just crunches numbers.

        on my mac laptop (without SSD), the only time I see the CPU go above ~20% is when I start Eclipse, and that's with 16GB of RAM.

    • by idsfa (58684)

      Linux's default config is optimized for spinning platters. You have to tweak a few things [howtogeek.com] to get the best performance.
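The linked guide's exact tweaks aren't quoted in the thread, but the usual suspects in that era were mount options. A hypothetical /etc/fstab line (the UUID is invented) might look like:

```
# /etc/fstab -- illustrative SSD-friendly mount options; the UUID is made up
UUID=0123-4567  /  ext4  defaults,noatime,discard  0  1
```

Here noatime cuts metadata writes on every read, and discard enables online TRIM (some admins preferred a periodic fstrim instead). Switching the I/O scheduler away from CFQ (e.g. to deadline or noop) was another commonly suggested tweak.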

    • by Gothmolly (148874)

      noatime

      Mounting filesystems with atime imposes a 15-20% penalty on IO due to increased writes. Troll the linux kernel lists for details.

      • Some applications might break (such as mutt), though it's generally safe.
        It should be noted that Linux now uses relatime by default, which seems to be more efficient than atime but, unlike noatime, doesn't break anything.
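Whether a plain read updates atime on a given box can be checked directly; this small Python probe makes no claim about the outcome, since it depends entirely on the mount options (atime / relatime / noatime) of whatever filesystem backs the temp directory:

```python
import os
import tempfile

# Create a small file, stat it, read it, stat it again, and report
# whether the read bumped the access time on this system.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)

before = os.stat(path).st_atime_ns
with open(path, "rb") as f:
    f.read()
after = os.stat(path).st_atime_ns

print("read updated atime:", after > before)
os.remove(path)
```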

    • Just double-check your BIOS: are you using the SSDs in the SATA native (AHCI) mode, or the old-fashioned IDE mode? Also, are you using a SATA II (or III) rather than SATA I port? Remember that SSDs shine on random reads and writes: if you benchmark with something like hdparm -tT, you won't see much improvement.

    • I own two OCZ 30GB drives, one 60GB, and one 120GB. In every computer I placed them in, the difference was HUGE. Debian Squeeze and formerly Debian Lenny.
  • Summary of tests? (Score:5, Interesting)

    by GoNINzo (32266) <GoNINzo@y a h oo.com> on Thursday August 23, 2012 @02:10PM (#41098605) Homepage Journal
    15 pages of a review with a poor summary of the results generates the greatest number of page views. It would have been nice if they had some sort of summary or benchmark to compare the two against, rather than individual tests spread across the whole thing. Perhaps a summary chart?

    Also, comparing a well tuned video device driver versus the (usually) hastily written Linux one is a poor comparison.

    I really doubt people choose a Mac over Linux because of this kind of test. There are more solid reasons to choose one or the other.
    • 15 pages of a review with a poor summary of the results generates the greatest number of page views. It would have been nice if they had some sort of summary or benchmark to compare the two against, rather than individual tests spread across the whole thing. Perhaps a summary chart?

      Funny, I never bothered looking at the link, but from this comment alone it was obvious that it's a Phoronix article.

      • by styrotech (136124)

        Funny, I never bothered looking at the link, but from this comment alone it was obvious that it's a Phoronix article.

        I find you can usually pick them by the Slashdot article title alone.

        If it is the latest Ubuntu benchmarked against anything else, or comparison benchmarks of some recent GPU on Linux or the latest Xorg drivers etc - 98 times out of 100 it will be Phoronix.

        Nobody else ever bothers - or if they do, it never makes it to Slashdot. Which is a shame; it would be nice to see those comparisons done

    • I really doubt people choose a mac over Linux over this kind of test.

      i really doubt many people would be dull enough to pay top dollar for mac hardware just to run linux.

      • by Anonymous Coward

        Well, Linus Torvalds is apparently "dull enough" to use a Macbook Air.

        • and linus torvalds has the same sensibilities and motivations as the average consumer, right? sheesh.

          also, did you know bill gates is running windows on his laptop? did you know sergey brin has an android phone? other shocking facts omitted for brevity.

    • comparing a well tuned video device driver versus the (usually) hastily written Linux one is a poor comparison.

      Uhhhh, why? That was the point of the test. Same hardware, different software, what is the performance difference?

      • by GoNINzo (32266)

        Uhhhh, why? That was the point of the test. Same hardware, different software, what is the performance difference?

        Er, the point of the test was to generate page views. But yes, a graph showing the clear winners and losers at the end in the summary would have been helpful. At least with Tom's Hardware, they put a summary of the different pages.

    • by savuporo (658486)
      Also, comparing a well tuned video device driver versus the (usually) hastily written Linux one is a poor comparison.

      I read this as: Linux is not for critics [penny-arcade.com], because hastily written graphics drivers that mostly suck are what you get with it?
  • Too long; don't care; Slashdotted anyway

  • by Anonymous Coward

    The graphics tests were run with Intel graphics. Linux results may have been more competitive if AMD or Nvidia graphics were used. Ubuntu 12.04 has gotten a large FPS jump in some games using AMD or Nvidia. I don't have the magazine in front of me right now.

  • Worthless... (Score:5, Insightful)

    by Thinine (869482) on Thursday August 23, 2012 @02:48PM (#41099281)
    Yet another worthless benchmarking from Phoronix (Moronix, amirite?). They switch between compilers, compiler versions, and even use Xcode itself for some of these comparisons, which makes it essentially worthless. Add to that absolutely zero investigation of the reasons for differences between the platforms (aside from the obvious mention of graphics drivers), and this is yet another piece of benchmark porn from a site dedicated to it.
    • by Gothmolly (148874)

      They're the OSNews of the 2010s.

    • by F.Ultra (1673484)
      But still, how many people out there install custom compilers and recompile everything with them, instead of using the version supplied by the distribution?
  • by tstrunk (2562139) on Thursday August 23, 2012 @03:36PM (#41100049)

    If you read the whole article you will see that there are many compute-intensive benchmarks where Linux outperforms OS X by nearly a factor of two. Saying that there is no noticeable difference is simply wrong (see Page 11, Page 12).

    • by tlhIngan (30335)

      If you read the whole article you will see that there are many computing intensive benchmarks, where Linux outperforms OSX by nearly a factor of two. Saying that there is no noticeable difference is simply wrong (see Page 11, Page 12).

      That makes sense. Mac OS X is a "microkernel" based system and does a lot of stuff by passing around Mach messages.

      OS X is also inefficient in that each process gets its own address space - for a 32-bit process, that's 4GB of address space it can use all of (no 2/2 or 3/1 user/kerne

      • by Anonymous Coward

        If you read the whole article you will see that there are many computing intensive benchmarks, where Linux outperforms OSX by nearly a factor of two. Saying that there is no noticeable difference is simply wrong (see Page 11, Page 12).

        That makes sense. Mac OS X is a "microkernel" based system and does a lot of stuff by passing around Mach messages.

        I'm afraid you are wrong in two ways here.

        One is that OS X is not a microkernel OS. It does still support Mach messages -- but only for userspace IPC. The kernel itself is monolithic. Mach code, BSD code, and Apple's custom code all live in the same address space, using function calls for internal communication. A true microkernel would partition bits and pieces into independent address spaces and use messaging to communicate between them. (It's best to think of Mach as a bit of foundation code you can b

    • by CAIMLAS (41445)

      The whole thing is a sham, but you're right: aside from the graphics drivers, Linux hands OS X its ass.

      Look through it again (with adblock on, don't let those bastards have another cent of deceptively gained ad revenue), this time mentally excluding all of the synthetic benchmarks (which all seem to be grossly wrong on this review, in favor of OS X). What do you notice? Aside from a couple linear and potentially single core tasks, OS X gets trounced. They were probably paid for these modifications to skew t

      • by jbolden (176878)

        XNU is not nearly as good a kernel as the Linux kernel. And in terms of filesystems OSX has fallen way behind everyone.

        There are areas where Apple is excellent, deep plumbing is not one of them.

  • Isn't the biggest video hassle with Linux on MacBooks the hybrid graphics?

    Rather than being able to switch back and forth, I'd prefer just disabling use of the onboard Intel graphics altogether, assuming fan control was well in hand.

    • by jo_ham (604554)

      Isn't the biggest video hassle with Linux on MacBooks the hybrid graphics?

      Rather than being able to switch back and forth, I'd prefer just disabling use of the onboard Intel graphics altogether, assuming fan control was well in hand.

      That will affect battery life though, assuming that matters to you. For day to day desktop tasks, the HD3000/HD4000 is more than adequate and sucks down much less power than the dedicated GPU. Were I running Linux on a hybrid GPU system I'd want it to be able to use the integrated GPU when the demand was low.

      • Battery life isn't an issue for me. I'd rather have a stable system that defaulted to one or the other GPU than a flaky one that tried to do the automatic switch.

  • by CadentOrange (2429626) on Thursday August 23, 2012 @04:35PM (#41100981)

    The compilation benchmarks are not comparable, as the compilers are different, not only in version number but in architecture! OS X ships with llvm-gcc, which is a different compiler from GCC. Think of it as LLVM pretending to be GCC (accepting GCC options, etc.) for backward compatibility. This would explain the huge discrepancies between the results of the compilation benchmarks.

    Disk performance is another thorny issue. The Postmark benchmark shows Ubuntu 12.04 being 3x faster than OS X 10.8 (246 tps vs 80 tps), yet the postgresql database benchmark shows OS X to be 3x faster than Ubuntu. No explanation is even attempted. Why? Readers would like to know! How can OS X be faster at a database benchmark when a raw disk benchmark shows it to be a lot slower than Ubuntu?! Perhaps there's something screwy with the configuration of Postgres on Ubuntu? Does this mean that OS X is *THE* choice for hosting busy databases? My suspicion is that this is due to fsync (http://www.postgresql.org/docs/8.1/static/runtime-config-wal.html). If fsync is enabled, the database waits for the transaction log to be flushed to disk every time a transaction is committed. It's basically down to defaults, and who knows what the default values are for Postgresql on OS X vs Ubuntu?
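If fsync behaviour really is the culprit, the concrete place the defaults can diverge is PostgreSQL's WAL durability settings. The names below come from PostgreSQL's documented runtime configuration; the values are illustrative, not taken from the article:

```
# postgresql.conf -- WAL durability knobs (illustrative values)
fsync = on                  # flush WAL to stable storage at commit
synchronous_commit = on     # 'off' trades a small crash-loss window for speed
#wal_sync_method = fsync    # the default is platform-dependent; notably, on
                            # OS X a plain fsync does not force a disk-cache
                            # flush unless wal_sync_method is set to
                            # fsync_writethrough (which uses F_FULLFSYNC)
```

That last point is one plausible explanation for the paradox: if OS X's fsync returns before data actually hits the platter while Linux's waits longer, OS X can post better transaction numbers on a benchmark while being slower at raw disk I/O.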

    The graphs raise far too many questions that are not addressed. Many of them should have raised warning flags, like the one about disk performance vs actual database performance. As such, the results are thoroughly suspect and no reasonable conclusions can be drawn. Pity, because they clearly have the kit, just not the know-how.

  • by Anonymous Coward

    Put OS X and Ubuntu on a PC-based system, then compare; or better yet, compare the averages. This would be a more accurate comparison.

  • Isn't the 'low score' region of Ubuntu (graphics performance) being worked on by the Wayland compositor? While still a release or two down the road, would that be able to improve said tests? Or is the problem broader than that?
