Intel's Linux OpenGL Driver Faster Than Apple's OS X Driver

Posted by samzenpus
from the greased-lightning dept.
An anonymous reader writes "The open-source Intel Linux graphics driver has hit a milestone: it is now faster than Apple's own OpenGL stack on OS X. The Intel Linux driver on Ubuntu 13.04 is now clearly faster than Apple's internally-developed Intel OpenGL driver on OS X 10.8.3 when benchmarked on a 'Sandy Bridge' class Mac Mini. Only some months ago, Apple's GL driver was still trouncing the Intel Linux Mesa driver."
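As a side note on how comparisons like this are usually summarized: cross-OS driver benchmarks typically reduce each test to a per-test speedup ratio and then report a geometric mean. The sketch below illustrates that arithmetic with made-up numbers; the test names and framerates are purely hypothetical, not the article's actual results.

```python
# Illustrative only: how a cross-OS driver comparison is typically summarized.
# The test names and FPS figures below are invented, NOT the article's data.
from math import prod

# frames per second per test: (Linux/Mesa, OS X) -- hypothetical values
results = {
    "OpenArena": (140.0, 115.0),
    "Xonotic":   ( 88.0,  74.0),
    "GLmark2":   (310.0, 290.0),
}

# speedup > 1.0 means the Linux driver was faster on that test
speedups = {test: linux / osx for test, (linux, osx) in results.items()}

# geometric mean is the usual aggregate, since it averages ratios fairly
geomean = prod(speedups.values()) ** (1.0 / len(speedups))

for test, s in speedups.items():
    print(f"{test}: Linux is {s:.2f}x the OS X framerate")
print(f"Geometric mean speedup: {geomean:.2f}x")
```

With these invented numbers every ratio exceeds 1.0, so the aggregate lands between the smallest and largest per-test speedup, which is the point of using a geometric rather than arithmetic mean on ratios.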
This discussion has been archived. No new comments can be posted.


  • That's great news! (Score:5, Insightful)

    by girlinatrainingbra (2738457) on Wednesday May 22, 2013 @06:25PM (#43798671)

    Well, that is great news, but if Intel played a hand in its development, then that would only make sense if Intel did NOT play a hand in helping Apple develop the Apple version of the OpenGL driver.

Since Intel is the creator of the architecture for the video hardware in question, it would be only sensible for Intel assisted development to be better than development that occurred without Intel's help.

    Either way, go go Gnu/Linux (and open source!) !!!

    • by girlintraining (1395911) on Wednesday May 22, 2013 @07:07PM (#43798927)

Nice nickname you got there. Wonder where the inspiration came from. Ahem, now, on to the commentary!

      Well, that is great news, but if Intel played a hand in its development, then that would only make sense if Intel did NOT play a hand in helping Apple develop the Apple version of the OpenGL driver.

      Compiler warning on line 1: .Comments.Item("43798671") has ambiguous syntax. Should contain (3) of type=sentence, but has (1).

Since Intel is the creator of the architecture for the video hardware in question, it would be only sensible for Intel assisted development to be better than development that occurred without Intel's help.

      Compiler error on line 3: .Comments.Item("43798671") Variable of type 'sensible' cannot be narrowed to class 'business sense'. Consider replacing with 'legal sense'. Note: include Slashdot.Comments.Inflammatory.Duopoly module to access this object.

      Either way, go go Gnu/Linux (and open source!) !!!

      Compiler warning on line 5: .Comments.Item("43798671") contains excessive punctuation. Should contain (1) of type=punctuation_exclaim, but has (4).
      Compiler warning on line 5: .Comments.Item("43798671") contains duplicate objects.
Compiler warning on line 5: .Comments.Item("43798671") 'Gnu/Linux' is deprecated. Consider using 'Linux' instead.

      • Nice nickname you got there.

Yeah. Caused a few double takes the first time he cropped up. More than a little irritating to ape the name of a well-known poster. But hey, you've reached some exalted heights. The rest of us have to make do with AC stalkers. Now, if Slashdot actually implemented the achievements mentioned on April 1, I suspect this would be one of the higher ones. So, er... congratulations, I guess...? Next up, you need to get a +5 (Troll), I suppose.

        But anyway, back on topic[*] (to the main thre

    • by fuzzyfuzzyfungus (1223518) on Wednesday May 22, 2013 @10:12PM (#43799953) Journal

      Is there any reason to suspect that Intel is withholding any assistance that Apple is requesting?

Since they are actively working on an OSS driver, they clearly don't have some sort of 'zOMG Intellectual Secrets!!!' concern (and it's not as though Apple would be averse to signing the NDAs in any case), and Apple buys a lot of Intel chips (including a pretty good mix of the higher-margin ones. They don't move Xeons for shit; but they also don't ship anything lower-end than an i5). That's not the sort of customer you play petty little games with when it comes to engineering support.

Apple also gives Intel plenty of free marketing, and doesn't take any of that Intel Inside co-branding money because of their refusal to put annoying stickers on their products. And don't forget that Apple makes for a decent hedge against Microsoft doing something Intel doesn't like.

      • by AmiMoJo (196126) *

        The difference is down to the differing driver architectures and the way the OS manages resources for OpenGL.

    • by Yvanhoe (564877)
But Intel's Linux driver is open source. That counts as helping!
    • Different kernels.
      Different user environment.
      Different services running.
      Different implementations of the OpenGL API

      But I'm sure it's the driver, and only the driver making the difference here. What a ridiculous comparison.

      • But I'm sure it's the driver, and only the driver making the difference here. What a ridiculous comparison.

        If you look at TFA, he's making both a whole-stack comparison and separately a driver version comparison.

The OS X stack appears to fare worse in most of the Linux tests, and the new driver does marginally better than the old driver.

Thank you, Intel driver folk, for reaffirming my purchasing decision (based on Linux driver support).

  • Just so you know (Score:5, Informative)

    by Anonymous Coward on Wednesday May 22, 2013 @06:28PM (#43798687)

    For all you integrated GPU haters and Intel haters... the Intel Linux drivers are straight up excellent. I do not believe there are better Xorg drivers available in Linux, including NVidia. Intel has really been diligently working to make their Xorg drivers work well and they deserve credit. For desktop work, HD video and other non-first person shooter use cases both the hardware and the drivers are a godsend and I thank Intel.

    • by niko9 (315647)

I only wish, wish, wish that I could buy a discrete card from Intel! PCI or PCI-E, I don't give a damn!

      • by Nimey (114278)

        I'm sure you could find an i740-powered card on eBay for $5.

        • According to Wikipedia [wikipedia.org], the Intel i740 was AGP.

        • i740 was AGP only, because Intel was trying to use it as a lever to push people to Slot-1 away from Socket-7, to screw AMD over. Those cards were also notorious for only working in AGP slots that happened to talk to Intel chipsets, and Intel CPUs.

          Yeah, I supported one of those products long ago - the Diamond Stealth II G460 [anandtech.com].

Ironically, I have an i740 AGP card in a Super Socket 7 motherboard with the ALi Aladdin V chipset. It works fine; never had a problem. The i740 ended up in quite a few SS7 builds since it was a low-priced card.
      • by armanox (826486)

        I have an Intel i740 in my desk if you're really that desperate...

Whilst I'm sure the OP was joking, I actually wish they made a discrete GPU. A while ago I was working on some commercial OpenGL-heavy software, and periodically you'd get a bug report (e.g. rendering glitches on HD2000). With an AMD or Nvidia glitch, you just put in an order for a card of that series, pop it in your dev machine, and fix the problem. With Intel, it inevitably ends up requiring a completely new machine just to fix the bugs. I used to have a cupboard with 6 or 7 GPUs (for AMD/Intel testing), a
Everyone seems to say that, but I have never found it to be true. For example, the multi-monitor support for my Intel HD 4000 is terrible. It rarely works right. First, it usually doesn't work: it just outputs to one display when I ask it to output to two. Then, when I can get it to work, one of the screens is usually full of static or flickers. In addition, the GPU is supposed to support three displays under Windows. I have never gotten that to work and, as I said, two often don't work. I have never had a

      • Everyone seems to say that, but I have never found it to be true. For example, the multi-monitor support for my Intel HD 4000 is terrible

        Not wanting to duel with anecdotes, but I've found the support to be excellent on the 4000 and older 915. I suspect the problem you're having is not with the drivers but with your desktop environment instead.

        Which xrandr client do you use?

Also, if one of your connections is analog, sometimes, especially with some older monitors and with some refresh rates, they are quite p

    • While OSX's Intel GPU performance is now lagging behind linux, I have to say I'm genuinely impressed by the quality of the drivers. I've never seen glitches or corruptions in rendering - and speaking as a guy who's been writing opengl for 10+ years, I've seen a lot of shit drivers. Particularly on the OSX side of things.

      Intel's drivers for OSX ( whether written by apple or intel, IDK ) always produce correct output, even if performance isn't always top notch.

  • by anthony_greer (2623521) on Wednesday May 22, 2013 @06:31PM (#43798717)

    A company that makes and designs chips is better at coding drivers to those chips than a PC maker that just sources those chips as components... Why is this shocking?

    • by danbob999 (2490674) on Wednesday May 22, 2013 @06:46PM (#43798797)

No matter who makes it, in the end you are getting more performance on Linux than on OS X. Unless you can download a better-performing driver for OS X, this is an argument for using Linux.

      • Not unless the software you need OpenGL performance for runs under Linux, too.

      • It's an argument for using Linux, sure, but not a good one. Especially on its merits alone.

      • by Dynedain (141758)

        Unless you can download a better performing driver for OS X, this is an argument for using Linux.

        ...on a particular set of hardware. Notice they're compiling and testing on Apple's low-end machine.

        And honestly if your Linux vs. OSX decision is based on a narrow difference in performance, then you probably aren't considering cheap desktop hardware to begin with.

      • Okay, sure.

        Except that nobody that cares about OpenGL performance is using an Intel integrated GPU as their sole source of rendering horsepower. If OpenGL is important to you, you spend the extra money for a real professional-graphics GPU like Quadro or FireGL, since the time saved waiting for things to render literally pays back the card's cost in a few weeks.

        Also, there's way more variables in this comparison than the driver, and Intel could also publish their own Mac OS X kexts for their GPUs, like Nvid

    • Re: (Score:3, Interesting)

      by aliquis (678370)

      No-one said it was shocking?

However, the drivers for the open-source OS are good, not only those for the proprietary one. That's nice.

    • Why is this shocking?

Well, if you've ever worked with hardware vendors you'll know that while many of them make decent hardware, they not only write terrible software but also jealously guard this terrible software as if it were something to be proud of rather than embarrassed by. Seriously, these hardware companies seem to believe that their buggy, barely functional, XP-only drivers and programs are some mega proprietary advantage, rather than the actual piece of hardware they're trying to hawk.

      A hardwa

Why can't ATI/Nvidia/Intel have their own driver downloads for OS X like they do for Linux and Windows?

    • by D1G1T (1136467)
For better or worse, Apple tries to shield its customers from the driver instability/incompatibility that has affected (mostly) DOS/Windows over the last couple of decades. Yes, they have given up a lot of choice in graphics cards, but most Mac users would rather have graphics that only run at 85% speed but crash much less often.
I guess it's because on Windows/Linux you see that there are driver bugs in each version of the driver, so all Apple really needs to do is create a test suite, not tell anyone the test suite exists, and then test each driver. If bugs are found, don't approve the driver. If there is an update but no real improvement, don't ship the update, etc.
Seeing the insane version-specific bugs that pile up, at some point this makes a lot of sense, especially since the drivers are actively maintained.

  • Great! (Score:2, Funny)

    by ArchieBunker (132337)

    Great news for all those OpenGL games out there like Minecraft and um....

    • Re:Great! (Score:5, Insightful)

      by Hes Nikke (237581) <slashdot@nOsPam.gotnate.com> on Wednesday May 22, 2013 @07:00PM (#43798879) Journal

      Did you know that you can run steam and source engine games on ubuntu now?

      • Re: (Score:2, Troll)

        by flimflammer (956759)

        Indeed, it's cool that Linux now has access to some new 7-9 year old games.

      • by aliquis (678370)

        Did you know that you can run steam and source engine games on ubuntu now?

        Steam yes.
        Number of games limited.

        Anyway what I wanted to say was: Did you know you can also run them on other distributions than Ubuntu?

        As in OpenSUSE 12.3 here. Not that I've got anything to play on it. But Steam runs. :D

        • by armanox (826486)

          I've played quite a few (TF2, World of Goo, Serious Sam 3, HL, CS) on Fedora 18 quite beautifully.

      • Sigh (Score:5, Interesting)

        by Sycraft-fu (314770) on Wednesday May 22, 2013 @09:28PM (#43799761)

        When you post stuff like that, and fanboys mod it to +5, it looks really silly. The reason isn't because it is not true, but because it is not impressive. Yes, Linux has a few games for it including some older Source games. Yay. Trying to imply that because it has Steam it has games is silly. Roughly 6 of my 163 Steam games will run on Linux and most of those are the older Source engine games.

        Having Steam doesn't mean you get games. It means there's a platform to sell games on that many Linux users will hate on (costs money, has DRM, no source code). The games themselves have to be ported and so far, not much of that has been going on.

        It does not strengthen your point when you go and make a rather silly argument. The "but it has Steam!" argument that keeps getting trotted out when someone comments on Linux and gaming reminds me of Mac users back in the 90s pointing to the 10 or so old titles you could find in the store as proof that there were plenty of games on the Mac.

        Linux gaming is not in a good state currently, and trying to mask that is silly.

I just hate it when some supposedly "hardcore" gamer redefines "games" to refer to certain watt-sucking, heatsink-busting games. FYI, there are plenty, at least hundreds, maybe even thousands, of games for Linux (if you're willing to go the grey-market emulator route). Maybe not games as visually impressive as Crysis, but they're there. A simple "apt-cache search games" or its Fedora/rpm equivalent should prove my point.

Serious question: how much of a performance impact does running a game through Steam have vs. running the application natively on MacOS? The same question could be asked of Windows, in fact. Or is it all the same, just rolled up and packaged differently but executed the same regardless?

        • by DarkXale (1771414)
Steam titles still run natively; all Steam does is provide an additional overlay, which a lot of other software (voice comms especially) also does. The performance difference is essentially negligible when there's nothing to show, and it won't have anything to show unless you explicitly trigger the correct hotkey, or on special events like LOW BATTERY or MESSAGE RECEIVED.
Now if only everything on Steam would run on all platforms that can run Steam...

        (Mac OS X suffers from the same effect.)

    • Re:Great! (Score:5, Interesting)

      by Anubis IV (1279820) on Thursday May 23, 2013 @01:04AM (#43800577)

      I just popped open the Mac App Store and took a glance at the first page of games. Just to name a few that were listed, there's Borderlands 2, CoD: Black Ops, Batman: Arkham City, Deus Ex: Human Revolution, Civ V, Bioshock, Amnesia, Witcher 2, Assassin's Creed II, and XCOM: Enemy Unknown. And if I pop open my copy of Steam, I can find pretty much all of Valve's titles, as well as a whole lot more. Granted, they're not all the latest and greatest (e.g. Bioshock, not Infinite; AC2, not AC3; Black Ops, not Black Ops II), but it's a wide selection of well-known games from a number of developers.

      Jokes like yours are funniest when they use humor to take the edge off of a point that would otherwise be painful to swallow, but yours is simply off the mark entirely. Unreal, Source, Gamebryo, id Tech, IW, and Unity engines all work with OpenGL and have a number of games out using it. There are strong rumors that Crytek already has an in-house version of CryEngine 3 running with OpenGL, and based on job listings at DICE, it looks like they're porting their Frostbite engine over as well for use with Battlefield.

      Given the disappointment that some of the major game developers have expressed (e.g. Gabe Newell's public statements) towards Windows 8, along with Microsoft's signals that DirectX may be at its end of life, is it really any surprise that all of the major game engines have already been ported or are in the process of being ported to OpenGL? Even more so when you consider that the two major smartphone OSes (i.e. the platforms on which most games today are now played) only make use of OpenGL? Not to mention that on gaming devices that support one or both of OpenGL or DirectX, all but one of those devices (Xbox) supports OpenGL in addition to or to the exclusion of DirectX? And the fact that Linux is quickly gaining recognition as a high-performance gaming platform and is getting some love from developers and publishers? Finally, is it really all of that surprising that the developers are actually making use of these game engines to put games on as many devices as possible?

      Mind you, I'm not suggesting that DirectX should be abandoned, by any means, since it's still quite powerful and is still the library that's used on one of the major consoles out today. All I mean to do is point out the folly in your assertion that OpenGL is not being utilized in games.

  • by puddingebola (2036796) on Wednesday May 22, 2013 @06:55PM (#43798861) Journal
    I could really use this, since my crappy Intel GMA 950 graphics won't play Portal on Linux. I'm sure this amazing driver update will allow it now.
    • Re:really need this (Score:4, Informative)

      by ikaruga (2725453) on Wednesday May 22, 2013 @09:11PM (#43799671)
I'm not sure if you're being sarcastic or not, but in case you aren't: I think you need a GPU with at least Pixel Shader 2.0 hardware in order to play it. Anything below that and it will crash the moment you open a portal. Unfortunately, no driver update can fix this problem.
      • by Narishma (822073)

        GMA950 does "support" PS 2.0, but only on Windows through Direct3D. I say "support" because even though it technically does support it, it has some extreme limitations in what is supported in hardware, and as soon as you exceed them, which is very easy even on the simplest of shaders, it reverts to a software implementation and the performance plummets.

Sorry, but the answer is no. I'm running 13.04, with Steam, on a laptop running the same Intel chipset they used for the test. Luckily it has an Nvidia card, so I can play the games using Bumblebee (a hack that needs to be fixed, IMHO), but Portal will not run using just the Intel graphics. It crashes and goes bye-bye.
    • This might not affect Portal in any way, but as an interesting sidenote they recently added OpenGL 2 support [phoronix.com] to the i915 hardware under Linux. :)
  • Accuracy? (Score:4, Interesting)

    by MoronGames (632186) <cam.henlin@gmai[ ]om ['l.c' in gap]> on Wednesday May 22, 2013 @07:57PM (#43799271) Journal
Okay, so now we know that the drivers themselves are faster at rendering OpenGL content, but are they accurate? I know that, in the past, both AMD and nVidia have resorted to not quite properly rendering things to get their cards to perform better in benchmarks. Does anyone know if any of that is going on here?
    • That's the difference between "gaming" OpenGL ICD and "professional" OpenGL ICD. It's perfectly possible to have a graphics card that rules the roost at rendering AutoCAD and Maya 3D, but suck out loud at gaming framerates. Most of the Quadro line of GPUs are this way - they are optimized for accuracy rather than shoving as many frames out the door as possible.

      • by drinkypoo (153816)

        The problem with your idea is that with the nVidia driver you get a slider, even on Quadro cards, and you can drag it towards performance. At which point, even the Quadro cards will compromise visual quality and let you play a game just fine. I certainly got good frame rates (for the number of processors, anyway) with both QuadroFX GPUs I've owned.

  • osx is not all that (Score:2, Interesting)

    by kcmastrpc (2818817)
    It's actually starting to show its age. I've recently switched back to windows 8 (with classic shell) and will probably never give OSX the time of day again. The fact that I have to go back to the main screen to do anything with the menu bar, task bar, and a file manager that hasn't changed in 15 years started driving me insane. There were some other quirks as well - like the END key doing something completely different in every single application I used that drove me to switch. In any case, I tried it, fo
    • Yeah, I'm not a fan of the way Mac OS developed (I started using it with Jaguar). I moved back to Linux (I can't stand Windows). Having been away from using desktop Linux for around a decade, I was pleasantly surprised when I switched back.
    • by feranick (858651)
It runs deeper than that. HFS is ancient, slow and inefficient. Memory management is a joke. I'd say enough "iOSization" of OS X, OS X should really make the leap to an innovative desktop OS. And I say this from my Mac.
      • by tyrione (134248)

It runs deeper than that. HFS is ancient, slow and inefficient. Memory management is a joke. I'd say enough "iOSization" of OS X, OS X should really make the leap to an innovative desktop OS. And I say this from my Mac.

First of all, it's HFS+ [and then some], and your comment that memory management is a joke is the real joke. Like hell Linux is an innovative OS. It's been getting worse in quality and stability since the end of 2.6.

      • Apple knows that HFS is in need of replacement, which is why they had a fully functional ZFS [wikipedia.org] on Mac OS X 10.6 at launch, but removed it at the last second because they could never come to licensing terms with Sun.

        It is still updated as a forked open source project, and there is a commercial version that has a newer version of pool / ZFS available for purchase.

  • If you need/want 3D performance you should still get a discrete video card.

  • by gnasher719 (869701) on Thursday May 23, 2013 @05:18AM (#43801273)
    A graphics driver isn't "slow" or "fast" per se. The developers benchmark important apps, look for things that keep the speed down in these important apps, and try to improve things. The effect is limited by (1) what the graphics card can do, (2) time invested by the developers, and sometimes (3) the willingness to cheat in public benchmarks. (3) shouldn't be a big factor; if ATI and NVidia posted benchmarks, I'd watch out for that.

Now an important factor is that this process will improve apps that the developers believed to be important; other apps will get fewer improvements. An app that nobody cares about might run into a speed bump that could easily be fixed, but it doesn't get fixed because nobody cares. And here we run into a problem with the posted benchmarks: they are all apps that are primarily used on Linux, and that no MacOS X user has ever heard of. Therefore, we may assume that no OpenGL developer at Apple has ever looked at these apps and tried to remove speed bumps in them. Therefore, these apps might very well be atypical.

Consider a situation where a developer can use two techniques, A and B, which should in theory run equally fast. For some reason A runs faster on MacOS X, and B runs faster on Linux. So Mac app developers tend to use A, and Linux app developers tend to use B. As a result, Mac driver developers will try to improve A, while Linux driver developers will try to improve B. That makes the speed difference bigger, Mac and Linux developers will tend even more to use one technique over the other, and driver developers will optimise more and make the difference bigger still. After a while, an app using A will run considerably faster on a Mac, while an app using B will run considerably faster on Linux. If you then port the Linux app to MacOS X, it will make you believe that the Linux drivers are faster.
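The feedback loop described above can be sketched as a toy simulation. To be clear, this is purely illustrative; the per-release gain and the starting speeds are invented, and no real driver behaves this simply.

```python
# Toy model of the optimization feedback loop (illustrative only, no real
# driver data): each release, a platform's driver team speeds up the one
# rendering technique its native apps favor, so the cross-platform gap
# between technique A (favored on Mac) and B (favored on Linux) widens.
def simulate(releases, gain=1.10):
    # speed of techniques A and B on each platform, in arbitrary units;
    # both platforms start out exactly equal
    mac = {"A": 1.0, "B": 1.0}
    linux = {"A": 1.0, "B": 1.0}
    for _ in range(releases):
        mac["A"] *= gain    # Mac apps use A, so Mac drivers optimize A
        linux["B"] *= gain  # Linux apps use B, so Linux drivers optimize B
    return mac, linux

mac, linux = simulate(releases=5)
# A Linux app (using B) ported to the Mac now looks much slower there,
# even though neither driver is "slower" across the board.
print(f"Technique B: Linux {linux['B']:.2f} vs Mac {mac['B']:.2f}")
```

After a handful of simulated releases, a B-using app measures far faster on Linux while an A-using app measures far faster on the Mac, which is exactly why a benchmark suite drawn only from one platform's apps can mislead.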
