Apple's Charts Set the M1 Ultra up for an RTX 3090 Fight it Could Never Win (theverge.com) 142
An anonymous reader shares a report: When Apple introduced the M1 Ultra -- the company's most powerful in-house processor yet and the crown jewel of its brand new Mac Studio -- it did so with charts boasting that the Ultra was capable of beating out Intel's best processor or Nvidia's RTX 3090 GPU all on its own. The charts, in Apple's recent fashion, were maddeningly labeled with "relative performance" on the Y-axis, and Apple doesn't tell us what specific tests it runs to arrive at whatever numbers it uses to then calculate "relative performance." But now that we have a Mac Studio, we can say that in most tests, the M1 Ultra isn't actually faster than an RTX 3090, as much as Apple would like to say it is.
Wow! (Score:2)
Maker of new product talks up the performance and people aren't wowed?
This is news?
Re:Wow! (Score:5, Insightful)
Most benchmarks are dubious, but Apple are especially bad for not only selecting benchmarks that favour them, but also for presenting the results in the most misleading way possible.
Re: (Score:3)
Like their battery life tests that are just playing a video non-stop. No web pages or JavaScript. Hardware-offloaded video decoding and the screen, and that's it. Not real world at all.
Re: (Score:3)
Their GPU isn't even comparable to an Nvidia one anyway. It only supports a subset of the functionality. The guys trying to create an open source driver have a lot of detail. It's more like a mobile GPU.
Re: (Score:2)
Makes sense, since Apple is essentially a mobile electronics company. Something like 75% of their revenue comes from iPhone, iPad and wearables [statista.com]. The Mac isn't even 9%. Not enough to worry about doing cutting-edge design that cannot bleed down to the other devices.
But they did.
Re: (Score:2)
It is real world when you are in a train or plane.
Of course it is misleading, as the OS shuts down everything not needed for video playback.
Re: (Score:3)
So basically, Apple lied. Nothing new here.
Re: (Score:2)
"Feels"??!! What the hell? How about giving us an actual chart, which "shows"?
Re: (Score:2)
While I agree on the numbers, I can't help but take pause when an article on tech specs contains a statement like this:
"Feels"??!! What the hell? How about giving us an actual chart, which "shows"?
Not only that, The Verge’s “extrapolation” completely changes the curve of the 3090 line, which clearly showed that its performance was flattening out as it approached its TDP of 350 W.
Re: (Score:2)
News doesn't need to be novel. It just needs to inform. If this is something you already knew (I mean like really knew, like would have bet $10k on as opposed to having a pretty good guess) then give yourself a gold star or something.
Re: (Score:2)
Re: (Score:2)
Not really surprising, given Apple's poor cooling. The Intel chips in their devices never really got to live up to their potential since they were in a thermal throttling hell anytime they were even remotely taxed. At least with a glorified phone processor, Apple can now get away with their crappy cooling solutions.
Ain’t nothin' poor about the cooling in the Mac Studio.
For one thing, there is a two-pound block of copper on top of the SoC, and two fans. Pretty much the entire upper half of the Mac Studio is dedicated to cooling.
and the Mac Pro can take more than 1 video card (Score:2)
and the Mac Pro can take more than 1 video card.
Apple's own Pro chip may need a quad-chip version to be good.
Re: (Score:2)
Yes, it can take more than one GPU. However, they won't be Nvidia because Apple is too busy holding a grudge about some shit that nobody else even remembers.
Re: (Score:2)
GPUs desoldering themselves from the motherboards of MacBooks about 10 or so years ago
Re: (Score:2)
"... as a result of GPU manufacturer lying about amount of heat produced and cooling required"
FTFY
Apple might ponder on that... (Score:2)
Re: and the mac pro can take more then 1 video car (Score:2)
Re: (Score:2)
Re: (Score:2)
GPUs desoldering themselves from the motherboards of MacBooks about 10 or so years ago
Closer to 20.
And the problem was nVidia couldn't make chips that were actually flat on the bottom (remember the term "coplanarity"?). So, some of them tended to desolder after several heat cycles.
Apple wasn't the only one affected. Everyone that used those GPUs had the same problems.
Re: (Score:2)
Yes, it can take more than one GPU. However, they won't be Nvidia because Apple is too busy holding a grudge about some shit that nobody else even remembers.
Pretty much every organization that isn't Nvidia has a grudge against them. There is likely a good reason.
Re: (Score:2)
Here's a classic:
Linus Torvalds telling nVidia off: https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
Linus Torvalds apologizes for years of being a jerk [arstechnica.com]
Re: (Score:3)
215 - RTX 3090
86 - Mac Pro with 2x Radeon Pro Vega II
102 - Mac Studio M1 Ultra 20 core with 64 core GPU
So, ouch.
On the Shadow of the Tomb Raider game benchmark, the RTX 3090 gets 142 fps, compared to 125 for the 2x Radeon Pro and 108 for the M1 Ultra. So, not as bad, but not really close either.
Re: (Score:2)
Right, this makes sense. It would be an extraordinary coup to ship a pre-packaged GPU that out-performs what is basically the most powerful card currently on the market.
Re: (Score:2)
"apples own chip pro may need quad chip to be good." ...to be COMPETITIVE with high end GPUs from other manufacturers.
There is no doubt that the M1 Pro/Ultra is "good" already, by general computing standards. Meanwhile, even a "quad chip" version still won't run Windows apps, not good at that at all.
Re: (Score:2)
"apples own chip pro may need quad chip to be good." ...to be COMPETITIVE with high end GPUs from other manufacturers.
There is no doubt that the M1 Pro/Ultra is "good" already, by general computing standards. Meanwhile, even a "quad chip" version still won't run Windows apps, not good at that at all.
And who has control over whether Windows on ARM is available for ASi Macs?
Hint: It isn't Apple.
Re: (Score:2)
Any performance above the baseline is going to require Thunderbolt eGPUs. Still a possibility, but that means the graphics vendors have to write ARM drivers and Apple has to allow them. This has not happened yet.
Re: (Score:2)
"write ARM drivers" means you simply recompile the current driver.
Not really rocket science. Worst case the "ARM driver" has to read and write to a different address space, hence you change some constants, and that's it.
Re: (Score:2)
If it were as simple as that, it would be done by now. The default OS installation blocks 3rd-party kexts. So either you have to get Apple to distribute it with the OS, or you have to walk your users, step by step, through booting into recovery mode and disabling the "security" feature that prevents this.
As long as it's that complex, there's probably not going to be enough incentive to get a release from AMD or NVIDIA.
Re: (Score:2)
If it were as simple as that, it would be done by now. The default OS installation blocks 3rd-party kexts. So either you have to get Apple to distribute it with the OS, or you have to walk your users, step by step, through booting into recovery mode and disabling the "security" feature that prevents this.
As long as it's that complex, there's probably not going to be enough incentive to get a release from AMD or NVIDIA.
I have a genuine question: Why, in a microkernel architecture such as Mach (or, frankly, any architecture), do GPU drivers (or really, anything) have to execute in kernel space?
Re: (Score:2)
I literally don't know enough about architecture to say this, but would DMA transfers need direct kernel-level access?
Only 2 benchmarks, one in emulation (Score:3, Insightful)
So for those of you who didn't RTFA, the article has 2 benchmarks, one with a very obtuse 'scoring' mechanic that doesn't tell you what is actually 'slower' and 'faster', just an overall score.
Then there is a benchmark for a single game (Tomb Raider) which is fully emulated on the Mac and the M1 still gets relatively close to the gaming machine, but the chart seems to be reversed (all systems have fewer fps at lower resolutions) and is missing a full bar (1080p on RTX3090).
Just a shoddy article altogether, looking for some real benchmarks from people that have a clue.
Ars has Geekbench scores (Score:2)
Ars has a number of Geekbench scores [arstechnica.com] comparing it to the RTX 3070 - the Ultra beats it in one, but in the others the 3070 is pretty substantially ahead (click on the thumbnails below the chart; there are four charts)... I would say comparing it to a 3090 seems like a stretch.
But the Mac Studio does win handily in power efficiency over an Intel system with an RTX card.
Re: (Score:2)
Power efficiency, THE critical metric when considering high end GPU performance.
Re: (Score:3)
Their actual presentation already said they weren't faster than a 3090. They'd heavily caveated it with "for some workloads", then "approaching the performance", and then "higher performance per watt".
In short, article sets up a fight of its own invention, not one that Apple ever proffered.
Re: (Score:2)
Look at the vertical axis, "Relative Performance", and the scale "50, 100, 150, 200"... it's literally utterly meaningless. But whatever it is, the M1 Ultra is shown to have about "15" more of this "relative performance" than an RTX3090 can achieve, and at substantially less power consumption (in watts).
Apple produced and displayed these "performance charts" and yeah, sure, surrounded them with a bunch of weasel language. Because otherwise it would be a bald-faced LIE.
So what we have is pure marketing GARBAGE.
Re: (Score:2)
Not mining either. Apple weren't selling a games machine, they were selling a production machine. Clue being in the name - Studio.
Their actual presentation already said they weren't faster than a 3090. They'd heavily caveated it with "for some workloads", then "approaching the performance", and then "higher performance per watt".
Exactly!
In short, article sets up a fight of its own invention, not one that Apple ever proffered.
Re: (Score:2)
In short, article sets up a fight of its own invention, not one that Apple ever proffered.
Aha, I see what you are getting at now, that makes sense.
Not only that, but the haters all call the 3090 a 500 W GPU, and then claim Apple "chopped off the chart" to "hide" the "fact" that the "3090 was twice as fast".
Problem is, the TDP of the 3090 is 350 W, not 500 W, so, at most, Apple stopped charting only about 30 W (10%) short of that TDP. Further, the performance curve clearly shows that the 3090 was levelling off sharply by the point at which Apple stopped charting.
IOW, Apple didn't lie; those who claim the RTX3090 can operate at 500 W for more than a few seconds are the real liars here.
Re: (Score:2)
IOW, Apple didn't lie; those who claim the RTX3090 can operate at 500 W for more than a few seconds are the real liars here.
Na, they lied.
My RTX 2060 Mobile molests my 32-core M1 Max in compute, by a factor of 2.
My 2080 Ti Desktop, by a factor of a little over 4. Of course, it's using 250W while doing that.
Either way, the point is, it's not even close.
If my 2080 Ti can destroy my 32-core M1 Max at 250W, the 3090 can lap it at 350.
Cut it out with your gaslighting.
Re: (Score:2)
IOW, Apple didn't lie; those who claim the RTX3090 can operate at 500 W for more than a few seconds are the real liars here.
Na, they lied.
My RTX 2060 Mobile molests my 32-core M1 Max in compute, by a factor of 2.
My 2080 Ti Desktop, by a factor of a little over 4. Of course, it's using 250W while doing that.
Either way, the point is, it's not even close.
If my 2080 Ti can destroy my 32-core M1 Max at 250W, the 3090 can lap it at 350.
Cut it out with your gaslighting.
Honestly, I think that at this point, there is so much non-optimized software and benchmarks around, that serious scrutiny is called for in benchmarking ASi-based systems.
For example, The Verge using an OpenCL rather than a Metal-based benchmark, or translated/emulated software.
Is Apple engaging in specsmanship, picking benchmarks the Mx is particularly good at? Most assuredly. But every single manufacturer does that. But is Apple out-and-out lying? I honestly doubt it.
Re: (Score:2)
Honestly, I think that at this point, there is so much non-optimized software and benchmarks around, that serious scrutiny is called for in benchmarking ASi-based systems.
This is absolutely a fair criticism to any and all benchmarks between the devices, so no argument there.
However, even with the difficulty in benchmarking them, there are analogues that can be drawn.
For example, The Verge using an OpenCL rather than a Metal-based benchmark, or translated/emulated software.
Yup. Lots of people engage in non-analogous testing methodologies like this. In the other direction, you see native Metal comparisons against OpenGL on Windows.
OpenGL being a notoriously terrible graphics library, performance wise.
Is Apple engaging in specsmanship, picking benchmarks the Mx is particularly good at? Most assuredly. But every single manufacturer does that. But is Apple out-and-out lying? I honestly doubt it.
Ok, I'll give you that they didn't "out-and-out lie".
They did out-and-out attempt
Re: It matters in some case (Score:2)
Also sound.
I generally go for silent or near silent operation.
Re: (Score:2)
Re: (Score:2)
If you're trying to game on Mac, chances are you have to emulate.
Not at all.
I'm a gamer. Have been all my life. Not as crazy anymore today because I have a job and a family and a life, but I still game, and like to explore new games all the time. I have a fairly nice library of games, too. And all of it runs on my Mac.
Sure, there are some titles that I'd like to play and can't because the morons made them PC only. But hey, if I had a windos thingy, I'd have games that are Playstation exclusive or whatever and cry about that.
There's plenty of great games for Mac. I'm tot
Re: (Score:3)
Sure, there are some titles that I'd like to play and can't because the morons made them PC only.
I wouldn't say they were morons. More likely they sat back and took a serious look at the market and decided not to support a system that is less than 2% of the desktop. Developing any kind of software is expensive. Smart people that are hoping to make money develop where they will get the most for their return.
Re: (Score:2)
and decided not to support a system that is less than 2% of the desktop
Welcome to the 21st century. That hasn't been true for at least 15 years.
https://gs.statcounter.com/os-... [statcounter.com] says 15%
https://www.statista.com/stati... [statista.com] has similar numbers
https://netmarketshare.com/ope... [netmarketshare.com] says 10%
The only source that has MacOS in your area is the Steam Hardware Survey - https://store.steampowered.com... [steampowered.com] - with 2.6%
As a Steam partner, you also get a hardware survey for your players, and for my game that shows higher numbers, because the game actually supports MacOS. Or in other words: If you bu
Re: (Score:3)
Wow. Those numbers are all over the place. Let's just stick with the 2% since it's a more realistic number.
Re: (Score:2)
2% is a laughable number. Wikipedia says 16% [wikipedia.org] of desktop/laptop computers, and that's probably pretty close.
Apple hasn't been anywhere close to 2% of the computing market since the Intel transition. At least in the U.S. (which is the second-largest market for games, behind China), Apple is typically somewhere in the neighborhood of 7 to 13% of computer sales, depending on quarter, and Apple hardware tends to be used for more years than your average PC, so that number significantly underestimates the percentage of the computers that are in active use.
Re: (Score:2)
2% is a laughable number. Wikipedia says 16% [wikipedia.org] of desktop/laptop computers, and that's probably pretty close.
Apple hasn't been anywhere close to 2% of the computing market since the Intel transition. At least in the U.S. (which is the second-largest market for games, behind China), Apple is typically somewhere in the neighborhood of 7 to 13% of computer sales, depending on quarter, and Apple hardware tends to be used for more years than your average PC, so that number significantly underestimates the percentage of the computers that are in active use.
To be fair, Apple makes up a negligible percentage of the computer gaming market, but that's only because too few games are available for the platform.
On the flip side, Apple dominates the mobile gaming market, and with M1-based Macs, you can write a single app and run it on iOS and on the Mac, so unless you don't have a mobile version of your game, if you aren't supporting at least Apple Silicon Macs, you're utterly incompetent. That should tip the balance of the computer gaming market pretty substantially in the next couple of years.
Doesn't the inclusion of iOS/iPadOS games (some of which are pretty good) suddenly propel the Mac into one of the most blessed-with-content Gaming Platforms in the planet?
Re: (Score:2)
Sorry, but are you mental?
I've posted two sources saying 15% and 16% and one source saying 10%. And your answer is "that's all over, let's stick with 2%" ???
You said percentage of desktop. That's those numbers.
The Steam number is percentage of gaming desktops who subscribe to Steam. There's a huge amount of self-selection in that. THIS is the number that should be discarded first.
Re: (Score:2)
Re: (Score:2)
If I'm developing a game for distribution on Steam then that number is interesting.
There's a difference. On itch.io for example, download numbers for another small game I made have the Mac and Linux versions at around 10% each. A different self-selection is at work there.
And none of these catches the casual gamers.
So it depends on what exactly you're making, what your distribution channel is and what your target audience is. In any case, the percentage of desktops running macOS isn't 2%. The percentage of g
Re: (Score:2)
I would argue that the Steam number is closer to accurate for this case. If I'm developing a game, I don't care about total desktop market, I want to see the percentage of self-selected gamers. They are the ones who will potentially be interested in my product. Perhaps I can lure in a few other users from outside the gamer world, but this is not my target audience. Tom, you even say yourself in a previous post that Mac makes up about 5% of your players. So somewhere between 2.6% and 5% is going to be the most useful number for this discussion.
However, you must agree that the potential market includes every single person who uses a Mac.
Re: (Score:2)
Wow. Those numbers are all over the place. Let's just stick with the 2% since it's a more realistic number.
No, it's not.
Home use is approaching 50% in the U.S., and a lot of new businesses are Mac-based.
Re: (Score:2)
You are mixing up "sales" of Mac vs. PC with installed and used hardware.
Macs are around 20%. And most private Mac owners play as much as Windows players.
And on top of that: it is actually not hard to be cross-platform. Developer studios are just lazy and have odd pressures from outside on them.
I do it the other way around, and have for decades: I do not buy Windows games that are not available on Macs. So their stupid idea makes them lose twice: neither can/do I buy the non-existent Mac version, nor do I buy the
Re: (Score:3)
Again, wow. I was just going to let this thread die off because I know better than trying to argue with zealots. Then you all start posting your fictitious numbers. I'm even willing to admit that 2% is a lowball number, keeping it real, you know.
Then one of you chimes in with 20%, and another 50%. Well, we all know that is bullshit. Even if you included all Macs ever made, it would never be close to 20%. As for 50%? Well that one isn't even worth talking about. Someone is on the crack.
Then we have a
Re: (Score:2)
As for 50%? Well that one isn't even worth talking about. Someone is on the crack.
You are the one with the substance abuse problem.
I think Linux Desktops are even over 2% these days. MacOS is the second most popular platform next to Windows. And Windows is nowhere near the 98% marketshare it had in the XP and Win 7 days. Where do you think those people went?
Mac adoption has been growing by leaps and bounds, especially in the home market.
And there are a lot more living rooms than corporate offices.
And more and more of them are sporting Macs these days, too.
Watch TV. I don't mean shows with
Re: (Score:3)
Well, one of us in this conversation, who isn't me, believes that Macs make up 50% of the desktops. So, we have to question which one of us is hitting that crack pipe.
Linux desktops are well over 5% of the current desktop market. Making Linux almost 3 times the number of Macs out there.
But, hey, if you want to believe that Macs make up 50% of the desktop market, who am I to say otherwise. I mean if you are going to have delusions, might as well make them grand. Shit, I know of one Amiga user out th
Re: (Score:3)
Three times as many Linux desktops as Macs? Maybe, but only if you try to count Chromebooks.
If you count Chromebooks then the desktop market for Linux is many, many times that of the Mac. Chromebooks make up around 15% of OS market share. They are very popular because they run on standard hardware, are cheap, and do the job. They are most popular in academic circles, where students are finding you don't need a $2000 MacBook, or even an $800 Dell. You can do just as well with a $200 to $300 Chromebook.
But back to the original question. Traditional Linux installs, not Chromebook or server, are
Re: (Score:3)
Why is that so hard to believe? We all know from your past posting history that you tend to be "reality challenged" but this one seems to be a no brainer.
Linux, in many ways now, is far superior to MacOS. Many distros have a free install and no longer take a computer science degree to install.
It runs on far more hardware than MacOS does. I can pull 20 year old hardware out of storage and there is a good chance that any random distro will run on it, or have a version that will run. Virtually, a 64-
Re: Only 2 benchmarks, one in emulation (Score:2)
Re: (Score:2)
Most of the work is no work. The compiler does it.
You simply write your makefile to compile for Macs and Windows. Simple. No idea where you get the 30% more work from.
And: if he would not do the extra work, he would not get the extra 10% profit. It is his job to check if that works out.
Simple example: Windows only means 20h of work per week. Windows + Mac means 26h of work per week, and you get 10% more? Well, I guess I would do the 26h work week ... depends of course on the money.
If I only make a mere 100k pe
Re: (Score:2)
Give back your nerd credentials, will you?
I have my entire build process automated. It is literally pressing one button to build all three platforms, then running one script to upload all three packages.
Also, in the past 6 months, I've had one (!) platform-specific bug. So there's not much additional workload on that, either.
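For what it's worth, here's a rough sketch of what that kind of "one button" build driver can look like, in Python. The make invocations, platform names, and upload script below are hypothetical placeholders, not anything from a specific toolchain:

# Minimal sketch of a "one button" multi-platform build driver.
# The make invocations and the upload script are hypothetical placeholders;
# substitute whatever your actual toolchain uses.
import subprocess
import sys

BUILD_COMMANDS = {
    "windows": ["make", "PLATFORM=windows"],
    "macos":   ["make", "PLATFORM=macos"],
    "linux":   ["make", "PLATFORM=linux"],
}

def build_all() -> None:
    for platform, cmd in BUILD_COMMANDS.items():
        print(f"Building for {platform}: {' '.join(cmd)}")
        if subprocess.run(cmd).returncode != 0:
            sys.exit(f"Build failed for {platform}")

def upload_all() -> None:
    # The "one script to upload all three packages" step.
    subprocess.run(["./upload_packages.sh"], check=True)

if __name__ == "__main__":
    build_all()
    upload_all()

Once the per-platform differences live in the build configuration, supporting three platforms is mostly a matter of running three build targets, not maintaining three codebases.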
Re: (Score:2)
So for those of you who didn't RTFA, the article has 2 benchmarks, one with a very obtuse 'scoring' mechanic that doesn't tell you what is actually 'slower' and 'faster', just an overall score.
Then there is a benchmark for a single game (Tomb Raider) which is fully emulated on the Mac and the M1 still gets relatively close to the gaming machine, but the chart seems to be reversed (all systems have fewer fps at lower resolutions) and is missing a full bar (1080p on RTX3090).
Just a shoddy article altogether, looking for some real benchmarks from people that have a clue.
It's The Verge. What did you expect?
Re: (Score:2)
So for those of you who didn't RTFA, the article has 2 benchmarks, one with a very obtuse 'scoring' mechanic that doesn't tell you what is actually 'slower' and 'faster', just an overall score.
Then there is a benchmark for a single game (Tomb Raider) which is fully emulated on the Mac and the M1 still gets relatively close to the gaming machine, but the chart seems to be reversed (all systems have fewer fps at lower resolutions) and is missing a full bar (1080p on RTX3090).
Just a shoddy article altogether, looking for some real benchmarks from people that have a clue.
Exactly!
There are a ton of really crappy benchmarking articles floating around regarding ASi Macs.
The Verge article is but one example.
If only they could get eGPU Drivers (Score:2)
The Mac Studio has the capability of supporting devices over the Thunderbolt interface. They could actually support eGPUs and add nVidia's power to their own for jobs that require even more computational power.
I understand Apple's desire to control everything, but let's be honest and admit that without third party suppliers, Apple wouldn't be nearly as popular.
Re: (Score:2)
Re: (Score:2)
Apple has made it pretty clear that they don't want third-party kexts on their computers anymore.
They have made automated driver install impossible, and horrifically painful to do manually.
It was enough to make me seek out peripherals that "just worked", even when there were Arm kexts available.
Re: (Score:2)
Apple has made it pretty clear that they don't want third-party kexts on their computers anymore.
Which sucks because of how much effort is wasted rewriting perfectly working drivers for no good reason other than paranoia, but at least in theory, nothing prevents graphics card manufacturers from writing user-space graphics card drivers with DriverKit. It provides full access to PCI devices already, AFAIK.
That said, there are probably a lot of critical missing pieces that Apple will have to implement before it will be possible to make it integrate properly into the OS (before which, all you'd be able to do would be to publish some sort of custom user client and tell the driver to send commands to the card from custom user-space apps, which isn't particularly useful).
Re: (Score:2)
nothing prevents graphics card manufacturers from writing user-space graphics card drivers with DriverKit. It provides full access to PCI devices already, AFAIK.
It does.
There are other caveats to now being in User-Mode, though. It means you're now strictly limited to the APIs they expose. If you're trying to extend something that the Operating System doesn't want you to (say, graphics), then you can sod off.
Of course, for CUDA/OpenCL on NV/AMD eGPUs, that much should be possible.
That said, there are probably a lot of critical missing pieces that Apple will have to implement before it will be possible to make it integrate properly into the OS (before which, all you'd be able to do would be to publish some sort of custom user client and tell the driver to send commands to the card from custom user-space apps, which isn't particularly useful).
Ya, as long as tight system integration isn't a concern of yours, specialized apps talking to specialized user-space drivers should be perfectly feasible.
Re: (Score:2)
The Mac Studio has the capability of supporting devices over the Thunderbolt Interface. They could actually support eGPU and add nVidia's power to their own for jobs that require even higher computational values.
I understand Apple's desire to control everything, but let's be honest and admit that without third party suppliers, Apple wouldn't be nearly as popular.
What a ridiculous statement!
No computer platform would exist without 3rd party suppliers.
Most? (Score:2)
Are you saying it actually wins something against a 3090? Dude, that's insane.
Re: (Score:2)
I'd assume its physical proximity to the other system parts would give it some advantages.
Re: (Score:3)
Yeah?
My crappy old Xeon workstation can, in CPU-only mode, beat a decent GPU in some tasks. That doesn't mean it's a particularly good CPU, it just means the latency and setup for the tasks on a PCIe GPU is more expensive than the rather cheap task being run.
The M1 has shared memory, so it's going to win in churning many easy tasks that the big GPU needs to have shipped over the PCIe bus. The bigger the problems, the wider the gap from the 3090.
Re: (Score:2)
I suppose there's a hypothetical workload where, if you do a single operation -> GPU -> back, the M1 would probably kick ass at that, but that's not really realistic, as even with the shared memory, there's still a significant amount of setup involved to map memory between process and GPU, bootstrap the shader, etc.
Re: (Score:2)
I'm thinking deep learning. If you're training a very small network, e.g. on MNIST, then you can be done in seconds on a CPU, and it's often faster than the GPU. It would be very silly to use that as a benchmark, but it does demonstrate that you can if you really want :)
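To make that concrete, here's a rough sketch in PyTorch (assuming it's installed; the tiny model and the synthetic MNIST-sized data are placeholders, not a real benchmark):

# Rough sketch: time a tiny "MNIST-sized" training pass on CPU vs. GPU.
# The model, batch size, and synthetic data are arbitrary placeholders;
# the point is only that transfer and kernel-launch overhead can dominate
# when the work per step is this small.
import time
import torch
import torch.nn as nn

def train_once(device: str) -> float:
    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()
    x = torch.randn(60000, 784)          # stand-in for MNIST images
    y = torch.randint(0, 10, (60000,))   # stand-in for labels
    start = time.perf_counter()
    for i in range(0, 60000, 256):       # one pass over the data
        xb, yb = x[i:i + 256].to(device), y[i:i + 256].to(device)
        opt.zero_grad()
        loss_fn(model(xb), yb).backward()
        opt.step()
    if device == "cuda":
        torch.cuda.synchronize()         # wait for queued GPU work to finish
    return time.perf_counter() - start

print(f"cpu:  {train_once('cpu'):.2f}s")
if torch.cuda.is_available():
    print(f"cuda: {train_once('cuda'):.2f}s")

On a network this small, the per-batch host-to-device copies and kernel launches can easily cost more than the matrix math itself, which is exactly why it would be a silly benchmark.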
Re: (Score:2)
I find generally, when they find ways for it to keep up, they do something nasty like a Windows OpenGL comparison vs. a native Arm Metal comparison, at which point, anyone who understands the OpenGL limitations will say "holy shit- how does the 3090 even keep up?"
As usual...everyone wants their horse to win (Score:3)
It depends on WHAT you are doing. Are you trying to play a video game at max resolution? It's not going to work on a Mac. And it was never designed for that.
Are you trying to transcode a video efficiently? Yes, a Mac M1 Pro Ultra Uber edition will be perfect for that.
I'm sure some uber geek will try a USB-C to RTX 3090 setup one of these days but again this is defeating the point. Every machine configuration is good at a particular thing. Since Apple's target audience is generally artists, casual moms and kids and even video editors -- the M1 will work for this.
Pro Gamers will never consider the M1 for a variety of reasons but performance is clearly not the main reason. The main reason would be that every pro game is already designed for Intel / Nvidia or even AMD's worlds -- not the M1.
Re: (Score:2)
There, it is a decently priced entry-level product, even at current prices. For gaming it's always been a waste of money compared to something like a 3080 or 3080 Ti.
The RTX 3090 provides decent enough ray-tracing performance in the viewport of (at least popular) 3D modelling software, like Blender. This makes it a lot easier for the artist to judge lighting conditions in real time, speeding up workflows b
Re: (Score:2)
Apple needs to really get into gaming to compete against Windows. Yes, it failed during Steve Jobs' era but try again!
Re: (Score:2)
Are you trying to play a video game at max resolution? It's not going to work on a Mac. And it was never designed for that.
My video games all do that. ATM I have a 2020 M1 on my lap; it does not even get warm.
Re: (Score:2)
Which is fine; I knew what I was getting into when I bought a passively cooled device.
But it's more than capable of hitting thermal throttle within a couple of minutes of heavy load.
Re: (Score:2)
If you exclusively use "ProRes", an M1 is the fastest hardware you can get.
I'm waiting for the "Plaid" version (Score:2)
Everybody knows that "Plaid" is the ultimate.
The M1 is still amazing (Score:3)
The M1 Ultra gets about 76% - 84% (Geekbench, Shadow of Tomb Raider) of the performance of the Xeon + RTX 3090, using 1/3rd the power consumption (100W -vs- 310W, according to Apple's chart). I don't understand why Apple would need to artificially inflate the numbers since these numbers are damn good! Yes, there are cases where consuming 3x the power to get a 30% improvement is worth it. But anyone competing with the M1 should be quite concerned here.
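For a rough sense of scale, here's the perf-per-watt arithmetic from those same figures in a few lines of Python (the wattages are taken from Apple's chart, so treat them as approximate):

# Back-of-the-envelope performance per watt, using the figures quoted above.
# The power numbers come from Apple's own chart, so they are approximate.
relative_perf = (0.76, 0.84)   # M1 Ultra vs. Xeon + RTX 3090 (Geekbench, Tomb Raider)
m1_watts, pc_watts = 100, 310

for perf in relative_perf:
    ratio = (perf / m1_watts) / (1.0 / pc_watts)
    print(f"{perf:.0%} of the performance -> {ratio:.1f}x the performance per watt")

That works out to roughly 2.4x to 2.6x the performance per watt even in the tests it loses, which is arguably the number Apple should have led with.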
Re: (Score:2)
When I got my M1 Max MBP, I was unsurprised to find that my RTX 2060 mobile GPU (on an i9-9980HK) handily spanked it in any compute task by a factor of 2.
Of course, that laptop uses ~250W to do what my MBP is doing with ~50W.
Half the performance for 1/5th the power? That's nothing short of incredible.
With that kind of feather in your cap, why bother with misleading comparisons against high-power discrete GPUs?
Re: (Score:2)
Shadow of Tomb Raider is 4 years old.
SOTR is still on the list of top 15 most graphically demanding games. [thegamer.com] At 4k, even a 3090 can't do 144FPS yet.
good job apple at working on a 4 year old benchmark that means nothing
The benchmark we are discussing was done by The Verge, not by Apple.
Re: (Score:2)
Shadow of Tomb Raider is 4 years old.
SOTR is still on the list of top 15 most graphically demanding games. [thegamer.com] At 4k, even a 3090 can't do 144FPS yet.
And don't forget that SoTR is emulated on M1 Macs. It is doing almost as well as an Intel machine and a high-end graphics card while running code that was machine-translated from x86-64 assembly to ARM64 assembly. That's no small feat.
All the M1 to RTX GPU comparisons have been "off". (Score:2)
I think there are a few things to understand, though, before bashing Apple too hard on the benchmarks.
First? They've suffered from really weak graphics performance across most of their product line for years, now. Apple tried to make excuses for only giving people basic Intel video on the machines lacking the "Pro" designation as "good enough" -- and for many, it probably was. But even when you spent thousands for something like the Mac Pro cylinder machine, you wound up with a non-upgradeable GPU that quic
Re: (Score:2)
As I posted elsewhere, my 32-core M1 Max gets about half the compute performance of my RTX 2060 mobile... at ~1/5th of the power consumption. Which is frankly incredible. Means I could take 2 M1 Maxes, match my RTX 2060 in performance, and still be using less than half the power. If I kept adding M1 Maxes until I hit power parity, I'd be doing over twice the performance of my RTX 2060. Which is a technically brilliant feat. But using
Trying to figure out the cost ... (Score:2)
... and it seems a combination of Intel's best desktop processor and an RTX 3090, with all the other "stuff" needed - RAM, hard drives etc. - can be had at a more favourable price.
Plus if you want to run macOS on that kind of system, it's totally possible.
The thing is, the market for the Mac Studio is creative professionals - it's a niche market.
These are people who have relied on Apple hardware for decades and Apple have, generally, delivered.
They buy into the eco-system because it works for them - the upfr
On behalf of certain creative professionals.. (Score:3)
Re: (Score:2)
Fair enough, but there have been plenty of breakdowns of cost done over the years and the general consensus is, the hardware isn't *that* much more expensive.
The elephant in the room, however, is upgradability - and Apple are currently king in that territory - king of the lack thereof.
You want more RAM? Sorry, it's soldered onto the mainboard = no upgrade option.
Besides, I'm more referring to those with deep pockets - who don't want to have to go through the pain barrier of switching operating systems, because
Re: (Score:2)
>You want more RAM? Sorry, it's soldered onto the mainboard
no, that's out of date.
It is now attached directly to the CPU, without passing through the mainboard.
At this point, they would have to give up the advantages of their processor integration to be expandable.
use the GeForce (Score:2)
This is a fight you can't win.
But there are alternatives.
Yet another Verge hitpiece (Score:2)
Re: (Score:2)
Because it was in Apple's presentation.
Go complain to Apple for starting it.
Re: (Score:2)
And it's a serious development as far as laptops are concerned. Ultimately, it's still not the best performance in a laptop. My Intel i9-9980HK w/ RTX 2060 easily beats it in any GPU compute task... Of course using around 5x the power. But that's what's amazing, and what they should be focusing on. I don't understand why they're trying to claim these things are better performing than discretes. They're not. They're not even close. But if you want a laptop that can
Re: (Score:2)
Nobody alive has managed to get it higher than 269, though.
Which is still nowhere near half of 355, but still. I can't help but see some irony in a dude with the handle Macdude using a bullshit Tesla claim to throw snark at an article calling out a bullshit Apple claim.