Apple Hardware

Apple's Charts Set the M1 Ultra up for an RTX 3090 Fight it Could Never Win (theverge.com) 142

An anonymous reader shares a report: When Apple introduced the M1 Ultra -- the company's most powerful in-house processor yet and the crown jewel of its brand new Mac Studio -- it did so with charts boasting that the Ultra is capable of beating out Intel's best processor or Nvidia's RTX 3090 GPU all on its own. The charts, in Apple's recent fashion, were maddeningly labeled with "relative performance" on the Y-axis, and Apple doesn't tell us what specific tests it runs to arrive at the numbers it then uses to calculate "relative performance." But now that we have a Mac Studio, we can say that in most tests, the M1 Ultra isn't actually faster than an RTX 3090, as much as Apple would like to say it is.
  • Maker of new product talks up the performance and people aren't wowed?
    This is news?

    • Re:Wow! (Score:5, Insightful)

      by AmiMoJo ( 196126 ) on Thursday March 17, 2022 @01:05PM (#62366719) Homepage Journal

      Most benchmarks are dubious, but Apple are especially bad for not only selecting benchmarks that favour them, but also for presenting the results in the most misleading way possible.

      • Like their battery life tests that are just playing a video non-stop. No web pages or JavaScript. Hardware-offloaded video decoding and the screen, and that's it. Not real-world at all.

        • by AmiMoJo ( 196126 )

          Their GPU isn't even comparable to an Nvidia one anyway. It only supports a subset of the functionality. The guys trying to create an open source driver have documented a lot of the details. It's more like a mobile GPU.

        • It is real-world when you are on a train or a plane.

          Of course it is misleading, as the OS shuts down everything not needed for video playback.

      • by jwhyche ( 6192 )

        So basically, Apple lied. Nothing new here.

      • While I agree on the numbers, I can't help but take pause when an article on tech specs contains a statement like this:

        It feels like the chart should probably look more like this

        "Feels"??!! What the hell? How about giving us an actual chart, which "shows"?

        • While I agree on the numbers, I can't help but take pause when an article on tech specs contains a statement like this:

          It feels like the chart should probably look more like this

          "Feels"??!! What the hell? How about giving us an actual chart, which "shows"?

          Not only that, the Verge’s “extrapolation” completely changes the curve of the 3090 line, which clearly showed that its performance was flattening out as it approached its TDP of 350 W.

    • by SirSlud ( 67381 )

      News doesn't need to be novel. It just needs to inform. If this is something you already knew (I mean like really knew, like would have bet $10k on as opposed to having a pretty good guess) then give yourself a gold star or something.

  • And the Mac Pro can take more than one video card.

    Apple's own Pro chip may need a quad chip to be good.

    • Yes, it can take more than one GPU. However, they won't be Nvidia because Apple is too busy holding a grudge about some shit that nobody else even remembers.

    • From the Geekbench 5 Compute benchmark results in the article:

      215 - RTX 3090
      86 - Mac Pro with 2x Radeon Pro Vega II
      102 - Mac Studio M1 Ultra 20 core with 64 core GPU

      So, ouch.

      On Shadow of the Tomb Raider game benchmark, the RTX 3090 gets 142 fps compared to 125 for the 2x Radeon Pro, vs 108 for the M1 Ultra. So, not as bad, but not really close either.
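
      For a rough sense of scale, here is a small sketch (illustrative only, using just the figures quoted above) that expresses each result relative to the RTX 3090; it puts the M1 Ultra at roughly 47% of the 3090 on the compute score and about 76% in the game benchmark.

```python
# Illustrative only: normalize the figures quoted above to the RTX 3090.
# The absolute units differ (Geekbench score vs. fps), but the ratios are
# what a "relative performance" chart is really claiming.
results = {
    "Geekbench 5 Compute": {
        "RTX 3090": 215,
        "Mac Pro (2x Radeon Pro Vega II)": 86,
        "Mac Studio M1 Ultra (64-core GPU)": 102,
    },
    "Shadow of the Tomb Raider (fps)": {
        "RTX 3090": 142,
        "Mac Pro (2x Radeon Pro Vega II)": 125,
        "Mac Studio M1 Ultra (64-core GPU)": 108,
    },
}

for test, scores in results.items():
    baseline = scores["RTX 3090"]
    print(test)
    for system, value in scores.items():
        print(f"  {system}: {value / baseline:.0%} of the RTX 3090")
```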

      • Right, this makes sense. It would be an extraordinary coup to ship a pre-packaged GPU that outperforms what is basically the most powerful card currently on the market.

    • by dfghjk ( 711126 )

      "apples own chip pro may need quad chip to be good." ...to be COMPETITIVE with high end GPUs from other manufacturers.

      There is no doubt that the M1 Pro/Ultra is "good" already, by general computing standards. Meanwhile, even a "quad chip" version still won't run Windows apps, not good at that at all.

      • "apples own chip pro may need quad chip to be good." ...to be COMPETITIVE with high end GPUs from other manufacturers.

        There is no doubt that the M1 Pro/Ultra is "good" already, by general computing standards. Meanwhile, even a "quad chip" version still won't run Windows apps, not good at that at all.

        And who has control over whether Windows on ARM is available for ASi Macs?

        Hint: It isn't Apple.

    • Any performance above the baseline is going to require Thunderbolt eGPUs. Still a possibility, but that means the graphics vendors have to write ARM drivers and Apple has to allow them. This has not happened yet.

      • "write ARM drivers" means you simply recompile the current driver.
        Not really rocket science. Worst case the "ARM driver" has to read and write to a different address space, hence you change some constants, and that's it.

        • If it were as simple as that, it would be done by now. The default OS installation blocks third-party kexts. So either you have to get Apple to distribute it with the OS, or you have to tell your users how to go, step by step, into recovery mode and disable the "security" feature that prevents this.

          As long as it's that complex, there's probably not going to be enough incentive to get a release from AMD or NVIDIA.

          • If it were as simple as that, it would be done by now. The default OS installation blocks third-party kexts. So either you have to get Apple to distribute it with the OS, or you have to tell your users how to go, step by step, into recovery mode and disable the "security" feature that prevents this.

            As long as it's that complex, there's probably not going to be enough incentive to get a release from AMD or NVIDIA.

            I have a genuine question: in a microkernel architecture such as Mach (or, frankly, any architecture), why do GPU drivers (or really, anything) have to execute in kernel space?

            • I literally don't know enough about the architecture to say, but wouldn't DMA transfers need direct kernel-level access?

  • by guruevi ( 827432 ) on Thursday March 17, 2022 @12:08PM (#62366505)

    So for those of you who didn't RTFA, the article has 2 benchmarks, one with a very obtuse 'scoring' mechanic that doesn't tell you what is actually 'slower' and 'faster', just an overall score.

    Then there is a benchmark for a single game (Tomb Raider) which is fully emulated on the Mac and the M1 still gets relatively close to the gaming machine, but the chart seems to be reversed (all systems have fewer fps at lower resolutions) and is missing a full bar (1080p on RTX3090).

    Just a shoddy article altogether, looking for some real benchmarks from people that have a clue.

    • Ars has a number of Geekbench scores [arstechnica.com] comparing it to the RTX 3070 - the Ultra beats it in one, but in the others the 3070 is pretty substantially ahead (click on the thumbnails below the chart; there are four charts)... I would say comparing it to a 3090 seems like a stretch.

      But the Mac Studio does win handily in power efficiency over an Intel system with an RTX card.

      • by dfghjk ( 711126 )

        Power efficiency, THE critical metric when considering high end GPU performance.

    • by Moryath ( 553296 )
      If you're trying to game on Mac, chances are you have to emulate. Few studios want to waste the time porting to macOS when it's such a small percentage of the marketplace. Virtually every developer's coding station targets Windows. If they then port to Linux flavors, they can more easily land on portables (Android phones/tablets) or the Steam Deck. Either that, or they're going to port to consoles (Xbox/PS/Switch) directly.
      • by Tom ( 822 )

        If you're trying to game on Mac, chances are you have to emulate.

        Not at all.

        I'm a gamer. Have been all my life. Not as crazy anymore today because I have a job and a family and a life, but I still game, and like to explore new games all the time. I have a fairly nice library of games, too. And all of it runs on my Mac.

        Sure, there are some titles that I'd like to play and can't because the morons made them PC only. But hey, if I had a windos thingy, I'd have games that are Playstation exclusive or whatever and cry about that.

        There's plenty of great games for Mac. I'm tot

        • by jwhyche ( 6192 )

          Sure, there are some titles that I'd like to play and can't because the morons made them PC only.

          I wouldn't say they were morons. More likely they sat back, took a serious look at the market, and decided not to support a system that is less than 2% of the desktop. Developing any kind of software is expensive. Smart people who are hoping to make money develop where they will get the best return.

          • by Tom ( 822 )

            and decided not to support a system that is less than 2% of the desktop

            Welcome to the 21st century. That hasn't been true for at least 15 years.

            https://gs.statcounter.com/os-... [statcounter.com] says 15%

            https://www.statista.com/stati... [statista.com] has similar numbers

            https://netmarketshare.com/ope... [netmarketshare.com] says 10%

            The only source that has macOS anywhere near your number is the Steam Hardware Survey - https://store.steampowered.com... [steampowered.com] - with 2.6%

            As a Steam partner, you also get a hardware survey for your players, and for my game that shows higher numbers, because the game actually supports MacOS. Or in other words: If you bu

            • by jwhyche ( 6192 )

              Wow. Those numbers are all over the place. Let's just stick with the 2% since it's a more realistic number.

              • by dgatwood ( 11270 )

                2% is a laughable number. Wikipedia says 16% [wikipedia.org] of desktop/laptop computers, and that's probably pretty close.

                Apple hasn't been anywhere close to 2% of the computing market since the Intel transition. At least in the U.S. (which is the second-largest market for games, behind China), Apple is typically somewhere in the neighborhood of 7 to 13% of computer sales, depending on quarter, and Apple hardware tends to be used for more years than your average PC, so that number significantly underestimates the percentage of the computers that are in active use.

                • 2% is a laughable number. Wikipedia says 16% [wikipedia.org] of desktop/laptop computers, and that's probably pretty close.

                  Apple hasn't been anywhere close to 2% of the computing market since the Intel transition. At least in the U.S. (which is the second-largest market for games, behind China), Apple is typically somewhere in the neighborhood of 7 to 13% of computer sales, depending on quarter, and Apple hardware tends to be used for more years than your average PC, so that number significantly underestimates the percentage of the computers that are in active use.

                  To be fair, Apple makes up a negligible percentage of the computer gaming market, but that's only because too few games are available for the platform.

                  On the flip side, Apple dominates the mobile gaming market, and with M1-based Macs, you can write a single app and run it on iOS and on the Mac, so unless you don't have a mobile version of your game, if you aren't supporting at least Apple Silicon Macs, you're utterly incompetent. That should tip the balance of the computer gaming market pretty substantially in the next couple of years.

                  Doesn't the inclusion of iOS/iPadOS games (some of which are pretty good) suddenly propel the Mac into one of the most blessed-with-content gaming platforms on the planet?

              • by Tom ( 822 )

                Sorry, but are you mental?

                I've posted two sources saying 15% and 16% and one source saying 10%. And your answer is "that's all over, let's stick with 2%" ???

                You said percentage of the desktop. That's what those numbers are.

                The Steam number is the percentage of gaming desktops that subscribe to Steam. There's a huge amount of self-selection in that. THIS is the number that should be discarded first.

                • I would argue that the Steam number is closer to accurate for this case. If I'm developing a game, I don't care about total desktop market, I want to see the percentage of self-selected gamers. They are the ones who will potentially be interested in my product. Perhaps I can lure in a few other users from outside the gamer world, but this is not my target audience. Tom, you even say yourself in a previous post that Mac makes up about 5% of your players. So somewhere between 2.6% and 5% is going to be the most useful number for this discussion.
                  • by Tom ( 822 )

                    If I'm developing a game for distribution on Steam then that number is interesting.

                    There's a difference. On itch.io for example, download numbers for another small game I made have the Mac and Linux versions at around 10% each. A different self-selection is at work there.

                    And none of these catch the casual gamers.

                    So it depends on what exactly you're making, what your distribution channel is and what your target audience is. In any case, the percentage of desktops running macOS isn't 2%. The percentage of g

                  • I would argue that the Steam number is closer to accurate for this case. If I'm developing a game, I don't care about total desktop market, I want to see the percentage of self-selected gamers. They are the ones who will potentially be interested in my product. Perhaps I can lure in a few other users from outside the gamer world, but this is not my target audience. Tom, you even say yourself in a previous post that Mac makes up about 5% of your players. So somewhere between 2.6% and 5% is going to be the most useful number for this discussion.

                    However, you must agree that the potential market includes every single person who uses a Mac.

              • Wow. Those numbers are all over the place. Let's just stick with the 2% since it's a more realistic number.

                No, it's not.

                Home use is approaching 50% in the U.S., and a lot of new businesses are Mac-based.

          • You are mixing up "sales" of Macs vs. PCs with installed and used hardware.

            Macs are around 20%. And most private Mac owners play as much as Windows players do.

            And on top of that: it is actually not hard to be cross-platform. Development studios are just lazy and have odd outside pressures on them.

            I do it the other way around, and have for decades: I do not buy Windows games that are not available on Macs. So their stupid decision makes them lose twice: neither can I buy the non-existent Mac version, nor do I buy the

            • by jwhyche ( 6192 )

              Again, wow. I was just going to let this thread die off because I know better than to try to argue with zealots. Then you all start posting your fictitious numbers. I'm even willing to admit that 2% is a lowball number, keeping it real, you know.

              Then one of you chimes in with 20%, and another with 50%. Well, we all know that is bullshit. Even if you included all Macs ever made, it would never be close to 20%. As for 50%? Well that one isn't even worth talking about. Someone is on the crack.

              Then we have a

              • As for 50%? Well that one isn't even worth talking about. Someone is on the crack.

                You are the one with the substance abuse problem.

                I think Linux Desktops are even over 2% these days. MacOS is the second most popular platform next to Windows. And Windows is nowhere near the 98% marketshare it had in the XP and Win 7 days. Where do you think those people went?

                Mac adoption has been growing by leaps and bounds, especially in the home market.

                And there are a lot more living rooms than corporate offices.

                And more and more of them are sporting Macs these days, too.

                Watch TV. I don't mean shows with

                • by jwhyche ( 6192 )

                  Well, one of us in this conversation, who isn't me, believes that Macs make up 50% of desktops. So we have to question which one of us is hitting that crack pipe.

                  Linux desktops are well over 5% of the current desktop market, making Linux almost 3 times the number of Macs out there.

                  But, hey, if you want to believe that Macs make up 50% of the desktop market, who am I to say otherwise. I mean, if you are going to have delusions, might as well make them grand. Shit, I know of one Amiga user out th

        • So you basically do 30% more work for 10% more profit. Makes total sense!
          • Most of the work is no work. The compiler does it.

            You simply write your makefile to compile for Macs and Windows. Simple. No idea where you get the 30% more work from.

            And if he didn't do the extra work, he wouldn't get the extra 10% profit. It's his job to check whether that works out.

            Simple example: Windows-only means 20 hours of work per week. Windows + Mac means 26 hours per week, and you get 10% more? Well, I guess I would do the 26-hour week ... it depends on the money, of course.

            If I only make a mere 100k pe
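
            Taking those hypothetical figures at face value (20 hours a week for a Windows-only release, 26 hours and 10% more revenue with the Mac port added), a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope only; the hours and the 10% uplift are the hypothetical
# figures from the comment above, with revenue normalized to 1.0 for a
# Windows-only release.
cases = {
    "Windows only":  {"hours_per_week": 20, "revenue": 1.00},
    "Windows + Mac": {"hours_per_week": 26, "revenue": 1.10},
}

for name, c in cases.items():
    per_hour = c["revenue"] / c["hours_per_week"]
    print(f"{name}: {c['hours_per_week']} h/week, "
          f"total revenue {c['revenue']:.2f}, revenue per hour {per_hour:.4f}")

# Total revenue rises ~10% while revenue per hour drops ~15% -- whether the
# extra six hours are worth it depends on what else those hours could earn.
```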

          • by Tom ( 822 )

            Give back your nerd credentials, will you?

            I have my entire build process automated. It is literally pressing one button to build for all three platforms, then running one script to upload all three packages.

            Also, in the past 6 months, I've had one (!) platform-specific bug. So there's not much additional workload on that, either.

    • So for those of you who didn't RTFA, the article has 2 benchmarks, one with a very obtuse 'scoring' mechanic that doesn't tell you what is actually 'slower' and 'faster', just an overall score.

      Then there is a benchmark for a single game (Tomb Raider) which is fully emulated on the Mac and the M1 still gets relatively close to the gaming machine, but the chart seems to be reversed (all systems have fewer fps at lower resolutions) and is missing a full bar (1080p on RTX3090).

      Just a shoddy article altogether, looking for some real benchmarks from people that have a clue.

      It's The Verge. What did you expect?

    • So for those of you who didn't RTFA, the article has 2 benchmarks, one with a very obtuse 'scoring' mechanic that doesn't tell you what is actually 'slower' and 'faster', just an overall score.

      Then there is a benchmark for a single game (Tomb Raider) which is fully emulated on the Mac and the M1 still gets relatively close to the gaming machine, but the chart seems to be reversed (all systems have fewer fps at lower resolutions) and is missing a full bar (1080p on RTX3090).

      Just a shoddy article altogether, looking for some real benchmarks from people that have a clue.

      Exactly!

      There are a ton of really crappy benchmarking articles floating around regarding ASi Macs.

      The Verge article is but one example.

  • The Mac Studio can support devices over the Thunderbolt interface. They could actually support eGPUs and add Nvidia's power to their own for jobs that require even more computational power.

    I understand Apple's desire to control everything, but let's be honest and admit that without third-party suppliers, Apple wouldn't be nearly as popular.

    • by zekica ( 1953180 )
      Yes, but they have no drivers for any GPU except their own on ARM. A Thunderbolt AMD GPU will probably start working on Asahi Linux before it does on macOS.
      • Probably only ever on Asahi.
        Apple has made it pretty clear that they don't want third-party kexts on their computers anymore.
        They have made automated driver installation impossible, and manual installation horrifically painful.

        It was enough to make me seek out peripherals that "just worked", even when there were Arm kexts available.
        • by dgatwood ( 11270 )

          Apple has made it pretty clear that they don't want third-party kexts on their computers anymore.

          Which sucks because of how much effort is wasted rewriting perfectly working drivers for no good reason other than paranoia, but at least in theory, nothing prevents graphics card manufacturers from writing user-space graphics card drivers with DriverKit. It provides full access to PCI devices already, AFAIK.

          That said, there are probably a lot of critical missing pieces that Apple will have to implement before it will be possible to make it integrate properly into the OS (before which, all you'd be able to do would be to publish some sort of custom user client and tell the driver to send commands to the card from custom user-space apps, which isn't particularly useful).

          • nothing prevents graphics card manufacturers from writing user-space graphics card drivers with DriverKit. It provides full access to PCI devices already, AFAIK.

            It does.
            There are other caveats to being in user mode, though. It means you're now strictly limited to the APIs they expose. If you're trying to extend something that the operating system doesn't want you to extend (say, graphics), then you can sod off.

            Of course, for CUDA/OpenCL on NV/AMD eGPUs, that much should be possible.

            That said, there are probably a lot of critical missing pieces that Apple will have to implement before it will be possible to make it integrate properly into the OS (before which, all you'd be able to do would be to publish some sort of custom user client and tell the driver to send commands to the card from custom user-space apps, which isn't particularly useful).

            Ya, as long as tight system integration isn't a concern of yours, specialized apps talking to specialized user-space drivers should be perfectly feasible.

    • The Mac Studio can support devices over the Thunderbolt interface. They could actually support eGPUs and add Nvidia's power to their own for jobs that require even more computational power.

      I understand Apple's desire to control everything, but let's be honest and admit that without third-party suppliers, Apple wouldn't be nearly as popular.

      What a ridiculous statement!

      No computer platform would exist without 3rd party suppliers.

  • Are you saying it actually wins something against a 3090? Dude, that's insane.

    • by AvitarX ( 172628 )

      I'd assume its physical proximity to the other system parts would give it some advantages.

    • Yeah?

      My crappy old Xeon workstation can, in CPU-only mode, beat a decent GPU in some tasks. That doesn't mean it's a particularly good CPU; it just means the latency and setup for the tasks on a PCIe GPU are more expensive than the rather cheap task being run.

      The M1 has shared memory, so it's going to win at churning through many easy tasks that the big GPU needs shipped over the PCIe bus. The bigger the problem, the wider the gap in the 3090's favor.

      • I did some basic tests with cracking (john the ripper), and even with pretty small working sets (few hundred MB), my RTX 2060 still pretty handily trounces my 32-core M1 Max.

        I suppose there's a hypothetical system where, if you do a single operation -> GPU -> back, the M1 would probably kick ass at that, but that's not really realistic, as even with the shared memory there's still a significant amount of setup involved to map memory between the process and the GPU, bootstrap the shader, etc.
        • I'm thinking deep learning. If you're training a very small network, e.g. on MNIST, you can be done in seconds on a CPU, and it's often faster than the GPU. It would be very silly to use that as a benchmark, but it does demonstrate that you can if you really want :)
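
          A rough way to see the setup/transfer point being made here (not a real benchmark, and it assumes PyTorch is installed): time a trivially small workload on the CPU and on a GPU backend, if the machine has one. For tiny workloads the copies and kernel launches tend to dominate; scale the problem up and the GPU pulls away.

```python
# Not a real benchmark -- just illustrating that for a tiny workload the cost
# of shipping data to a GPU and launching kernels can outweigh the compute,
# so the CPU "wins". Assumes PyTorch is installed; a CUDA or Apple MPS device
# is used only if the running machine actually has one.
import time

import torch

def bench(device: torch.device, n: int = 1000, repeats: int = 200) -> float:
    x = torch.randn(n, n)                     # data starts in host memory
    start = time.perf_counter()
    for _ in range(repeats):
        y = (x.to(device) * 2.0 + 1.0).sum()  # copy + trivial elementwise work
        y.cpu()                               # copy the result back (forces a sync)
    return time.perf_counter() - start

devices = [torch.device("cpu")]
if torch.cuda.is_available():
    devices.append(torch.device("cuda"))
elif getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
    devices.append(torch.device("mps"))

for dev in devices:
    print(f"{dev}: {bench(dev):.3f} s for the small workload")

# Scale n up (or make the per-element work heavier) and the ranking flips,
# which is why tiny-workload comparisons say little about peak GPU throughput.
```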

    • That is insane, and highly unlikely, without a very big asterisk.

      Generally, I find that when they find ways for it to keep up, they've done something nasty like comparing Windows OpenGL against native ARM Metal, at which point anyone who understands the OpenGL limitations will say "holy shit - how does the 3090 even keep up?"
  • It depends on WHAT you are doing. Are you trying to play a video game at max resolution? It's not going to work on a Mac. And it was never designed for that.

    Are you trying to transcode a video efficiently? Yes, a Mac M1 Pro Ultra Uber edition will be perfect for that.

    I'm sure some uber geek will try a USB-C to RTX 3090 setup one of these days, but again, that defeats the point. Every machine configuration is good at a particular thing. Since Apple's target audience is generally artists, casual moms and kids, and even video editors, the M1 will work for them.

    Pro gamers will never consider the M1 for a variety of reasons, but performance is clearly not the main one. The main reason is that every pro game is already designed for the Intel/Nvidia or AMD world, not the M1.

    • by fazig ( 2909523 )
      The main audience for the RTX 3090 is digital artists who are into 3D rendering and texture painting.
      There it is a decently priced entry-level product, even at current prices. For gaming it's always been a waste of money compared to something like a 3080 or 3080 Ti.

      The RTX 3090 provides decent enough ray tracing performance in the viewport of (at least popular) 3D modelling software like Blender. This makes it a lot easier for the artist to judge lighting conditions in real time, speeding up workflows b
    • by antdude ( 79039 )

      Apple needs to really get into gaming to compete against Windows. Yes, it failed during Steve Jobs' era but try again!

    • Are you trying to play a video game at max resolution? It's not going to work on a Mac. And it was never designed for that.
      My video games all do that. ATM I have a 2020 M1 on my lap; it does not even get warm.

      • Mine gets very warm, lol.
        Which is fine - I knew what I was getting into when I bought a passively cooled device.
        But it's more than capable of hitting thermal throttling within a couple of minutes of heavy load.
  • Everybody knows that "Plaid" is the ultimate.

  • by MobyDisk ( 75490 ) on Thursday March 17, 2022 @01:53PM (#62366847) Homepage

    The M1 Ultra gets about 76% to 84% (Geekbench, Shadow of the Tomb Raider) of the performance of the Xeon + RTX 3090, using about a third of the power (100 W vs. 310 W, according to Apple's chart). I don't understand why Apple would need to artificially inflate the numbers, since these numbers are damn good! Yes, there are cases where consuming 3x the power to get a ~30% improvement is worth it. But anyone competing with the M1 should be quite concerned here.
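
    Taking those figures at face value (relative performance versus the Xeon + RTX 3090 box, and the wattages from Apple's chart), the performance-per-watt gap is easy to put a number on; a small sketch, treating the quoted values as ballpark:

```python
# Rough performance-per-watt comparison using the figures quoted above: the
# M1 Ultra at 76%-84% of the Xeon + RTX 3090's performance, drawing roughly
# 100 W against 310 W. Second-hand numbers, so treat the result as ballpark.
m1_watts, pc_watts = 100, 310

for m1_relative_perf in (0.76, 0.84):
    ratio = (m1_relative_perf / m1_watts) / (1.0 / pc_watts)
    print(f"At {m1_relative_perf:.0%} of the performance: "
          f"{ratio:.1f}x the performance per watt")

# Prints roughly 2.4x to 2.6x -- arguably the comparison Apple should have led with.
```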

    • I agree with you.
      When I got my M1 Max MBP, I was unsurprised to find that my RTX 2060 mobile GPU (on an i9-9980HK) handily spanked it in compute tasks by a factor of 2.
      Of course, that laptop uses ~250W to do what my MBP is doing with ~50W.

      Half the performance for 1/5th the power? That's nothing short of incredible.
      With that kind of feather in your cap, why bother with misleading comparisons against high-power discrete GPUs?
  • I think there are a few things to understand, though, before bashing Apple too hard on the benchmarks.

    First? They've suffered from really weak graphics performance across most of their product line for years now. Apple tried to excuse giving people only basic Intel video on the machines lacking the "Pro" designation as "good enough" -- and for many, it probably was. But even when you spent thousands for something like the Mac Pro cylinder machine, you wound up with a non-upgradeable GPU that quic

    • In performance per watt, there's nothing that competes with them, full stop.
      As I posted elsewhere, my 32-core M1 Max gets about half the compute performance of my RTX 2060 mobile... at roughly 1/5th of the power consumption. Which is frankly incredible. It means I could take 2 M1 Maxes, match my RTX 2060 in performance, and still be using less than half the power. If I kept adding M1 Maxes until I hit power parity, I'd be doing over twice the performance of my RTX 2060. Which is a technically brilliant feat. But using
  • ... and it seems a combination of Intel's best desktop processor and an RTX 3090, with all the other "stuff" needed (RAM, hard drives, etc.), can be had at a more favourable price.

    Plus if you want to run macOS on that kind of system, it's totally possible.

    The thing is, the market for the Mac Studio is creative professionals - it's a niche market.
    These are people who have relied on Apple hardware for decades, and Apple have, generally, delivered.
    They buy into the ecosystem because it works for them - the upfr

    • ... a sizable number of us have moved on from drinking the Cupertino Kool-Aid and are getting our work done on other, less expensive platforms. The OS and hardware are irrelevant; the software available to get the job done is all that matters, within budgetary considerations. My personal take is that I can't really justify paying such a high premium for Apple hardware anymore; I felt differently about this in 1997.
      • Fair enough, but there have been plenty of cost breakdowns done over the years, and the general consensus is that the hardware isn't *that* much more expensive.

        The elephant in the room, however, is upgradability - and Apple are currently king in that territory - king of the lack thereof.
        You want more RAM? Sorry, it's soldered onto the mainboard = no upgrade option.

        Besides, I'm more referring to those with deep pockets - who don't want to have to go through the pain barrier of switching operating systems, because

        • by hawk ( 1151 )

          >You want more RAM? Sorry, it's soldered onto the mainboard

          no, that's out of date.

          It is now attached directly to the CPU package, without passing through the mainboard.

          At this point, they would have to give up the advantages of their processor integration to be expandable.

  • This is a fight you can't win.
    But there are alternatives.

  • It's almost like they're paid to shit on Apple products. Amazing.
