Apple's Mac Studio is a New Desktop for Creative Professionals (theverge.com)
Apple has announced the Mac Studio, a desktop system that looks like the Mac Mini on the outside but packs a lot more power on the inside. The Mac Studio features both Apple's M1 Max chip and a new, even more powerful processor, the M1 Ultra. Apple claims that the new device will be faster than even its top-of-the-line Mac Pro. From a report: The chassis is 7.7 inches square and 3.7 inches tall; Apple claims it "fits perfectly under most displays" and will remain quiet under heavy workloads. The rear includes four Thunderbolt 4 ports as well as a 10Gb Ethernet port, two USB-A ports, an HDMI port, and an audio jack. It supports Wi-Fi 6 and Bluetooth 5.0. The front includes two USB-C ports (10 Gbps on M1 Max systems, 40 Gbps/Thunderbolt 4 on M1 Ultra systems) and an SD card slot. The Mac Studio can support up to four Pro Display XDRs and a 4K TV, Apple says. Apple claims that the Mac Studio with M1 Max will deliver 50 percent faster CPU performance than a Mac Pro with a 16-core Xeon and 2.5 times faster CPU performance than a 27-inch iMac with a 10-core Core i9. The M1 Ultra configuration purportedly has 3.8 times faster CPU performance than that 27-inch iMac and is up to 90 percent faster than the 16-core Mac Pro. The Mac Studio with M1 Max will start at $1,999, and M1 Ultra models will start at $3,999. The Studio Display is $1,599.
Can't wait... (Score:2)
until someone figures out how to run Linux on it. I'd be all over it. Looks like a real powerhouse.
Re: (Score:2)
until someone figures out how to run Linux on it. I'd be all over it. Looks like a real powerhouse.
Here you go [asahilinux.org].
Re: (Score:2)
Thanks, I didn't know it was actually a thing. Might have to go get me one of those shiny new M1 boxes :-)
Re: (Score:2)
It's still a work in progress - I've been supporting his Patreon. But the basic functionality is there.
Re: (Score:2)
It's still a work in progress - I've been supporting his Patreon. But the basic functionality is there.
And maybe by the time the M4 Ultra is out, that Linux distro will work as well as whatever Apple-Silicon-supporting Darwin layer was working back in 2020.
Why?
Re: (Score:2)
until someone figures out how to run Linux on it. I'd be all over it. Looks like a real powerhouse.
Again, why?
macOS is a Certified Unix. Why TF is $RANDOMDISTRO Linux so much better?
Makes zero sense.
Re: (Score:2)
I like the fact that Linux is open source and not owned by some huge corporation that is tracking my every movement. I also like the amount of customization I can do with the look and feel of Linux vs Windows and OSX. To me Linux feels snappier and more responsive.
Editorial contributions (or lack thereof) (Score:2)
Ok, so just to be clear, msmash literally lifted the entire post from The Verge while making it seem like only part of the post was quoted material. https://www.theverge.com/2022/3/8/22962081/apple-mac-studio-m1-max-ultra-price-specs-processor-release-date [theverge.com]
That's really lame.
Price (Score:2)
Why TF don't Apple reduce their prices? Profit is the standard answer, but IMO they'd make a lot more by charging half the price and making 3x or more the sales because, god knows, a lot of people want a plug-and-play alternative to a Windows PC but the price-to-performance ratio of Macs just doesn't cut it right now. If Macs were more affordable MS would shit a brick, because they must realise a huge percentage of users only use Windows because they can't afford a Mac and aren't technical enough to install Linux on
Re: (Score:2)
Because people continue to buy them at the current price?
Re: (Score:2)
But hey, they've gotten ~$11,000 from me in the last 8 months, so oh well.
Re: (Score:2)
Sell 'em at a loss and make it up on volume!
You price to what the market will bear.
Re: (Score:2)
Apple had the highest net income of any company in the world in 2020 and 2021.
Imagine how dumb you have to be to believe a very obvious idea you have would be the key to them making more money, as if they aren't the quantitatively proven top experts at generating profit AND have an insane amount more data informing the decisions they're making. Maybe you can tell Steph Curry how to shoot 3s better while you're at it.
Re: Price (Score:2)
The vast majority of that income came from iPhones and the App Store, NOT Macs. And who tf is Steph Curry? Never heard of her.
Re: (Score:2)
Why TF don't Apple reduce their prices? Profit is the standard answer, but IMO they'd make a lot more by charging half the price and making 3x or more the sales because, god knows, a lot of people want a plug-and-play alternative to a Windows PC but the price-to-performance ratio of Macs just doesn't cut it right now. If Macs were more affordable MS would shit a brick, because they must realise a huge percentage of users only use Windows because they can't afford a Mac and aren't technical enough to install Linux on a lower-priced machine (and sorry, Linux is a fantastic server OS but it doesn't really cut it on the desktop due to lack of apps). It certainly isn't for Windows' technical quality or what MS laughably calls its user "experience".
You're right. I'm sure Tim Cook has never considered what the optimal prices for his products would be.
Re: (Score:2)
It's quite possible there isn't enough pent-up demand to offset the per-unit margin, or that the pent-up demand would be reduced if the price came down, strangely enough.
Same argument about lack of apps applies to both macOS and Linux, so people whose worries about application support make them skeptical of Linux should be just as skeptical of macOS. Broadly speaking, slapping a modern distro on a PC is cheap and easy to get into the browser experience and developer tools, with Steam supporting more games under
Re: (Score:2)
Same argument about lack of apps applies to both macOS and Linux
Not for many, many years.
Re: (Score:2)
Of course it does. For example, pulling up Steam's top-seller list, the top 19 entries do not support macOS. In general it looks like Linux-supported games even outnumber macOS-supported games a bit, though Windows is supported by 100% of them.
On the flip side, can I access Office from macOS? Sure, but I can also access most of everything I would need through a browser on Linux too. This application does favor macOS a little, but historically Windows is best supported.
Both macOS and Linux require the user to pause
Re: (Score:2)
If they reduce their prices and increase their volume then they're going to have to increase their production as well. Right now Apple is in the enviable position of making lots of profit by selling few computers, and it's one they've enjoyed throughout most of their history. The only time they had the cheaper computer was early in the Apple II era. They only had one down period, which admittedly was a doozy, but even then they did well in education and graphic arts, selling extremely overpriced equipment to
Re: (Score:2)
> Why TF don't Apple reduce their prices?
Poor people using Apple products is a bad look. Next thing you know ugly people will show up at an Apple store. Who wants to be part of that crowd?
Re: (Score:2)
Why TF don't Apple reduce their prices? Profit is the standard answer, but IMO they'd make a lot more by charging half the price and making 3x or more the sales because, god knows, a lot of people want a plug-and-play alternative to a Windows PC but the price-to-performance ratio of Macs just doesn't cut it right now. If Macs were more affordable MS would shit a brick, because they must realise a huge percentage of users only use Windows because they can't afford a Mac and aren't technical enough to install Linux on a lower-priced machine (and sorry, Linux is a fantastic server OS but it doesn't really cut it on the desktop due to lack of apps). It certainly isn't for Windows' technical quality or what MS laughably calls its user "experience".
You want Apple to race itself to the bottom?
Unbelievable that you aren't CEO of Apple; with that marketing acumen, obviously you would do so much better!
[rollseyes]
Where do I put all my hard drives? (Score:2)
The silicon seems capable -- though the big grille on the back tells me heat will be a factor -- but we're back to the Tube and no place for all the local storage volumes. I was hoping for a mini-tower like the Quadra 700.
I need a data truck. The silicon's powerful but I'm not a fan of cluttering a workspace with many bricks on strings.
Re: (Score:2)
Then you'll probably need to wait for the Pro upgrade. But it seems to me that most of the people I see needing a lot of local storage are already sporting RAID enclosures or NAS boxes.
Re: (Score:2)
They don't have enough I/O on that chip yet; they are presumably working on getting more and better PCIe on there eventually.
Re: (Score:2)
They don't have enough I/O on that chip yet; they are presumably working on getting more and better PCIe on there eventually.
That's why yesterday wasn't about an Apple Silicon Mac Pro, but rather a "Mac mini Pro".
Re: (Score:2)
That's why yesterday wasn't about an Apple Silicon Mac Pro, but rather a "Mac mini Pro".
Yeah, what I've learned about stuff in general is that any time something is advertised as a light, mini, or otherwise inferior version, there's a good reason for it. They have to differentiate it so as to avoid disappointing people. Meanwhile, any time something is offered as a cut-back version like that and simultaneously as "pro", "expert", etc., it's always horribly compromised and doesn't deserve such exaltation.
This isn't to say I don't think the device will be useful, or sell like hotcakes among the
Re: (Score:2)
the big grille on the back tells me heat will be a factor
Maybe. They are going for quietness, and a bigger grille might help with quietness. I'm sure it will get plenty warm while working hard but it probably won't be horrible.
I was hoping for a mini-tower like the Quadra 700.
This is current-year Apple. We're lucky they didn't make another iMac with this thing for guts. This is at least a step forward.
The silicon's powerful but I'm not a fan of cluttering a workspace with many bricks on strings.
Wait a bi
Re: (Score:2)
The silicon seems capable -- though the big grille on the back tells me heat will be a factor -- but we're back to the Tube and no place for all the local storage volumes. I was hoping for a mini-tower like the Quadra 700.
I need a data truck. The silicon's powerful but I'm not a fan of cluttering a workspace with many bricks on strings.
The big grille on the back is to keep the "impedance" of the air path low. This allows for a larger, quieter, lower-velocity cooling system.
Bottom line: it's about silence, not a need for raw airflow capacity.
Re: (Score:2)
The silicon seems capable -- though the big grille on the back tells me heat will be a factor -- but we're back to the Tube and no place for all the local storage volumes. I was hoping for a mini-tower like the Quadra 700.
I need a data truck. The silicon's powerful but I'm not a fan of cluttering a workspace with many bricks on strings.
BTW, how many Quadras had even one single card in their expensive NuBus slots?
Poor I/O and at least a 2X markup on storage! (Score:2)
Poor I/O for a pro system and at least a 2X markup on storage!
How many TB buses are there? 2? 4? 6?
Why no eSATA?
Why no M.2 slot?
Why only one disk built in, with no RAID configs?
Why the big markup on storage?
Why only one HDMI port?
Re: (Score:2)
and that color!
and the smell!!
SMP?! (Score:2)
Shitty writing. The above sentence totally made me think it was dual-processor, and I was wondering why they used two different processors.
Clicked through to comments and nobody was talking about it. I eventually figured out it's single-processor and they're just offering a choice between two different ones. Good job on hyping it, though.
Guess I am not in the target market (Score:2)
I usually use an ITX case which is 22cm x 18cm x 27cm and pretty inexpensive. Then of course there is PSU, main board, memory, processor.
You can get the price up to $2,000 with no problem. But what it gets you, admittedly in a much bigger enclosure, is great flexibility and a high-end Intel processor. Add drives (and that is plural), add memory, replace PSU (it happens). Lots of ports. Get the right cooler and fans and it's just about silent. Go to any of the usual sources and they will supply the main board
Re: (Score:3)
If you sell your Mac mini M1, your justification will be that you need a computer. This is the perfect time, right after a keynote where the Mac mini M1 is still unchanged both in specifications and in price.
Re: (Score:2)
If I hadn't bought my M1 mini just last year, I'd be all over this. Trying to figure out a justification...
You can probably sell an M1 mini for around 80% of what you paid for it.
Plus, Apple offers up to $2720 for Mac trade-ins:
https://www.apple.com/shop/tra... [apple.com]
Re: (Score:2)
I think there will be two audiences. One that wants something like the mini, where they can have a few in a small space, maybe a render farm.
I was wondering about the distinction between these two, and why someone would choose a Mini now... for something like a render farm, it seems like the Mac Studio would be the more powerful option when comparing occupation of the same physical space. But maybe not.
Depends on whether the cost of those extra cores exceeds the cost of extra Mini machines, whether they have the extra space for more machines, and how well their problem distributes across multiple machines. But realistically, yeah, for render farms, it's hard to imagine anybody buying the Mini over the Studio.
Re: (Score:2)
I think there will be two audiences. One that wants something like the mini, where they can have a few in a small space, maybe a render farm.
I was wondering about the distinction between these two, and why someone would choose a Mini now... for something like a render farm, it seems like the Mac Studio would be the more powerful option when comparing occupation of the same physical space. But maybe not.
The current M1 Minis are still good for low-end, economy applications. These "Mac mini Pros" are for relatively high-load applications that do not require PCIe card-based special-purpose hardware or gargantuan, memory-resident datasets.
It was actually very smart to offer this also with the "still pretty damned nice" M1 Max for less than 50% of the price of the Ultra-based models. Even with Apple's $1500 27" 5K display added, the Max version probably wipes the floor with the Xeon-based iMac Pro, and
re: Not quite what people wanted though, IMO? (Score:2)
I think this does fill a bit of a void in the Mac lineup, in the sense that you finally have a desktop Mac that doesn't have a built-in display but isn't just an "entry level" Mac Mini.
But the thing I always heard Mac users wishing for was some sort of "mini tower" Mac. In other words, they wanted something that had card slots and internal drive bays like the original Mac Pro towers had, except scaled down a bit, and with an affordable price tag.
Something like a "headless iMac" is probably about right, as far as
Re: (Score:2)
To be fair, a lot has changed in recent years, so maybe the PCIe card slots aren't the draw they once were?
That is what I am thinking; they simply are not needed with the throughput you can get from modern expansion ports.
There's always room for the traditional system but fewer people really need that over time, even if you are trying to build out a GPU heavy system.
This is kinda what they wanted the Cylinder Mac Pro to be (minus the cool science-fiction Dalek packaging), but Thunderbolt took too long to catch on (mostly due to Intel's approval and licensing crap), and neither the CPU nor the TB2 in it was fast enough yet.
Re: (Score:2)
I don't think this is what anyone was asking for. This is a Mac mini, except with a high end CPU and memory. What people have been asking for the longest time has been a budget friendly version of the Mac Pro - a Mac that's expandable but doesn't cost the Earth. This isn't that.
That's what they wanted back when that was the way "real computers" were built.
But times have changed. We no longer set IRQ numbers with DIP switches, don't have to worry about what N-8-1 means, or how to set the jumper blocks on our PATA hard drives. And we no longer need one single card slot in 99.998% of all use cases.
And the use cases that do need peripheral hardware tend to have I/O bandwidth and compute requirements that would quickly exceed a "midrange tower"-class machine. That leaves your "economy tower
Re: (Score:2)
Either way it's great to see Apple provide this option and let the market decide if it's a good idea or what.
I think I can sum it up in one word: Courage.
Re: (Score:2)
But it does have a headphone jack.
Re: (Score:2)
I'm going to wait to see how it does with thermal stuff, then probably bite the bullet and give Apple their 3-4k for a midrange machine with 64 GB of RAM, 2 TB SSD, and so on. I have had great luck with the M1 Mac Mini for a desktop; however, it does have limitations.
The new display is nice as well. Apple has been at/near the top of the heap when it comes to PPI and decent displays, so even though a 5K display going for ~$2k is high, it is something that will last 5-10+ years, and still be going even when
Re: (Score:2)
Sometimes, a post formatted like a slick rebuttal (I'll forgive the lack of quote tags) is nothing more than a shit post. This is just a shit post. Nothing of value was added to the conversation. Congrats dfghjk, you demonstrated that you are, in fact, a dumbass.
He's the idiot that doesn't know the difference between JIT ISA translation and bare-metal CPU emulation.
Re: (Score:2)
What really matters for this use is whether it supports discrete graphics (it really should) and whether it's offered in a config which has a lot of RAM (minimally 64G, really should be more). If not both, this is just marketing plastering over a machine that's misdesigned for this purpose.
That will be the Mac Pro. The Mac Studio is ideal for most music/media studios.
Re: (Score:2)
What really matters for this use is whether it supports discrete graphics (it really should) and whether it's offered in a config which has a lot of RAM (minimally 64G, really should be more). If not both, this is just marketing plastering over a machine that's misdesigned for this purpose.
That will be the Mac Pro. The Mac Studio is ideal for most music/media studios.
Give up. People like that think "real computers" only come one way: The way they all were in 1990.
Re: (Score:2)
It supports up to 128 GB of memory and a 64-core GPU...both of which are integrated. On the plus side, integrated memory (when done properly, as it has been with the M-series) means better performance than socketed memory. On the downside, it means no upgradeability and no repairability. And in this case, it's also shared memory with the GPU, which once again provides for some great performance, but at the cost of having to split the use of that memory. Even so, I'm struggling to think of workloads where yo
Re: (Score:2)
On the plus side, integrated memory (when done properly, as it has been with the M-series) means better performance than socketed memory.
No, it does not.
Latency is higher on the M1 Max than on any high-end competing Intel or AMD machine. This is because it uses LPDDR RAM.
Memory throughput on the M1* series is high because it has 8 channels of 32-bit RAM (equivalent to the 4 channels of 64-bit you'd see on a competing x86).
This means the SoC has roughly double the bandwidth at its root complex. This isn't because the RAM is soldered onto the package.
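If you want to sanity-check that, here's the back-of-the-envelope math as a quick Python sketch. The channel counts are the ones above; the LPDDR5-6400 and DDR4-3200 speed grades are illustrative assumptions on my part, not confirmed specs:

# Peak DRAM bandwidth = channels * (bus width in bytes) * transfer rate.
# Channel counts come from the comment above; speed grades are assumed.
def peak_bw_gb_s(channels: int, bits: int, mega_transfers: int) -> float:
    """Peak bandwidth in GB/s: channels * (bits / 8) bytes * MT/s / 1000."""
    return channels * (bits / 8) * mega_transfers / 1000

print(peak_bw_gb_s(8, 32, 6400))  # 8 x 32-bit LPDDR5-6400 -> 204.8 GB/s
print(peak_bw_gb_s(4, 64, 3200))  # 4 x 64-bit DDR4-3200   -> 102.4 GB/s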
On the downside, it means no upgradeability and no repairability. And in this case, it's also shared memory with the GPU, which once again provides for some great performance
Sharing memory with the GPU is always a downside in terms of performance.
Nobody thought to try to
Re: (Score:2)
On the plus side, integrated memory (when done properly, as it has been with the M-series) means better performance than socketed memory.
No, it does not. [...]
I'll grant that I should not have included mention of the M-series in that sentence and will accept your corrections with regards to its inclusion, but are you sure about those numbers? Anandtech indicates that the M1 has a 128-bit interface, the M1 Pro a 256-bit interface, and the M1 Max a 512-bit interface [anandtech.com], and this thread on Reddit [reddit.com] suggests 8x 16-bit channels for the M1, as opposed to the 4x 32-bit channels we'd expect on x86.
I also stand by the notion that there's a performance benefit to be had with in
Re: (Score:2)
I'll grant that I should not have included mention of the M-series in that sentence and will accept your corrections with regards to its inclusion, but are you sure about those numbers? Anandtech indicates that the M1 has a 128-bit interface, the M1 Pro a 256-bit interface, and the M1 Max a 512-bit interface [anandtech.com], and this thread on Reddit [reddit.com] suggests 8x 16-bit channels for the M1, as opposed to the 4x 32-bit channels we'd expect on x86.
OK, ya, the memory arrangements are sufficiently diverse now that I should avoid generalized answers.
Haven't read the AnandTech article (I will though), but in my channelization benchmarks, I noted that both my M1 Air and my M1 Max 64GB have 8 channels to the CPU (aggregate bandwidth climbs discretely up to 8 concurrent non-parallelizable memory-bound uncached instruction streams).
For the Air, that comes out to x64 per IC (x128 total) of LPDDR4X.
The Max is interesting, in that there must be 2 configura
Re: (Score:2)
Where are you going with this?
If anything it supports my side of the argument
Nowhere in particular. Not even really trying to argue, so much as to simply soak up what you’re saying, internalize it, process it against what else I can find, and then push back in various areas to see where I’m still getting things wrong, because, frankly, it’s clear I’m out of my depth and that you know this subject WAY better than I do. Even if you got a detail wrong along the way, that pales in comparison to the errors I made.
In that light, thanks for the thoughtful follow up,
Re: (Score:2)
Well, my recommendation is not to buy into the "Unified Memory" marketing pitch. They invented it so that they could imply that the RAM on the original M1 was enough, even though it was small in comparison to contemporary machines.
UMA has existed long before the M1, and isn't a performance improvement. It's a cost and power improvement- and a major one at that.
At this point, the UMA is the biggest thing holding the M1 Max back. If they were to, for example, make a discrete 32-core M1 Max GPU with its own, sa
Re: (Score:2)
Do we have a reason to think the M1 Ultra does not?
It's a not-insignificant change to move to non-LPDDR. They would have to reconfigure for 4 64-bit memory channels instead of 8 32-bit.
Re: (Score:2)
Serious question about advantages though. It makes sense that I would rather have 128GB of shared memory than 64GB of CPU memory and 64GB of GPU memory.
Well, it's not quite that simple.
128GB of RAM, of course, confers absolutely no performance benefit if the system + your application requires less than that.
Further, sharing the RAM also means sharing the bandwidth.
This means that a GPU busy doing some work (very quickly, I might add) will slow down the CPU significantly (you can see this in CPU+GPU benchmarks on the M1 Max already).
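As a toy illustration of that contention (all numbers below are made up for illustration, not measured M1 Max figures):

# On unified memory, GPU and CPU traffic share one bandwidth budget.
total_bw = 200.0   # GB/s, assumed shared-memory budget
gpu_use = 150.0    # GB/s, assumed demand from a heavy GPU workload
cpu_want = 100.0   # GB/s, what the CPU cores could consume running alone

cpu_gets = max(total_bw - gpu_use, 0.0)
slowdown = 1.0 - min(cpu_gets / cpu_want, 1.0)
print(f"CPU left with {cpu_gets:.0f} GB/s -> ~{slowdown:.0%} slower when memory-bound")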
Really, there's no "best" way to do it. They have different pros and cons, and are thus better for different things.
The bigg
Re: (Score:2)
One of the big advantages of Apple's CPU/GPU shared memory architecture is that the system no longer needs to *move* big chunks of data between two pools of memory - this both conserves memory and sidesteps any bus latency that might be present between the two. So it's not really a split situation; it's a share situation. They're both working from the same data store.
Also, I think I need to correct you a bit on your Cube nostalgia. The problem with the Cube was, yes, that it was overpriced, but more specific
Re: (Score:2)
One of the big advantages of Apple's CPU/GPU shared memory architecture is that the system no longer needs to *move* big chunks of data between two pools of memory - this both conserves memory and sidesteps any bus latency that might be present between the two. So it's not really a split situation; it's a share situation. They're both working from the same data store.
That is an advantage of UMA. Of course, you're ignoring the disadvantage of UMA: the lack of high-speed GDDR.
In the end UMA performs worse than fast local RAM for graphics, in general, because it's faster to *move* a big chunk of RAM than to process it many millions of times at a slower speed.
Intel chips have been UMA since ~2013. I.e., zero-copy.
It doesn't imply better performance. It does imply cheaper parts, as you don't need high speed RAM with very wide buses. Avoiding bus latency to trade for high
Re: (Score:2)
That would require them to get over the little piss-fit they've had with Nvidia for over a decade, if they really wanted to support discrete GPUs. Thunderbolt 4 GPU enclosures exist, so the hardware is there. They would just need to support eGPUs in their firmware (current M1 Macs do not), and then they would just need to sign the drivers that Nvidia has already written, but Apple refuses to do so.
I suppose you could go the AMD route as long as Apple sorted out the eGPU / thunderbolt stuff, but if we're
Re: (Score:2)
Unless I'm missing some peculiarity in the way Thunderbolt works, they wouldn't, strictly speaking, necessarily have to support eGPUs at the firmware level. I mean, they would if you want to be able to see the Apple logo and access the boot picker while a display is attached to the eGPU if you don't have any other display, but otherwise it's just like any other PCIe device hanging off a remote bus and tunneled through Thunderbolt. So the only thing strictly required should be a driver that is recompiled
Re: (Score:2)
That's not a discrete GPU. Maybe they'll get that part right next time.
Cool about the RAM though.
Re: (Score:2)
That's not a discrete GPU. Maybe they'll get that part right next time.
Cool about the RAM though.
Apple didn't call it a Discrete GPU. The GP did.
Re: (Score:2)
RAM access is faster on the Apple because it has twice the memory bandwidth (256-bit vs. 128-bit)
Not because of the locality of the RAM.
This can be easily proven by looking at RAM latency benchmarks of the M1 vs. any Intel Core chip from the last 5 years, all of which have superior latency.
This is because they use DDR, instead of LPDDR.
Apple uses low-power, high-latency RAM with a very fat bus (8x32 vs. 2x64). Location of RAM is
Re: (Score:2)
It has a discrete GPU, the Apple GPU. It's a separate GPU section of the M1 chip.
Stop it.
You've just redefined the word "discrete" so that you can fanboi a little harder.
It's fucking gross, dude.
I've got an M1 Max MBP and an M1 Air. I'm digging the new products. And the 32-core Max GPU is good (though compute performance is only about half of my RTX 2060 Mobile in my Intel laptop; it sounds like this one will possibly match it).
This is not a discrete GPU. It's the baddest ass integrated GPU that exists on the market- that's for sure. But it is not discrete.
Further, it doesn't really
Re: (Score:2)
They already have a name for a CPU with an integrated GPU core. It's called an APU.
Re: (Score:2)
That being said, I'm not particularly averse to the term APU, it's just not an industry standard. It's an AMD standard.
Re: (Score:2)
When it has more silicon and is more powerful than most "real" discrete GPUs, I will damn well call it a discrete GPU. It's just as discrete as the discrete GPU I have in my laptop I can't replace either.
I can't stop you from doing that, but it's wrong, so stop it.
"Discrete" has a definition, and it is not "discrete-equivalent performance".
It's not a fanboy move, it's a way to indicate this is absolutely nothing like the horrific Intel "integrated graphics" chips in terms of results.
Yes, it is.
AMD chips also have integrated graphics. In fact, you can get them with RDNA2.
Further in fact, the Xbox Series X with integrated RDNA2 has better performance than my 32-core M1 Max.
They're still integrated.
Changing the meaning of the word "discrete" because there exists a shitty integrated GPU (Intel) is like trying to change the definition of car because you don't
Re: (Score:2)
Either that or you're going wayyyy too deep into character, Comrade. You're going native, and not in a good way for you.
Ah, yup.
I was about to order some food. You delivering tonight? I'll make sure to tip you well.
Re: (Score:2)
When it has more silicon and is more powerful than most "real" discrete GPUs, I will damn well call it a discrete GPU.
Why? It's an integrated GPU, what's wrong with that?
Re: (Score:2)
Why? It's an integrated GPU, what's wrong with that?
How is the Apple GPU any less integrated than a "discrete" GPU integrated into a laptop, which is soldered onto the main board?
Because it's integrated into the CPU or SoC as opposed to being a separate component connected to the motherboard (soldered or slotted).
Re: (Score:2)
How is the Apple GPU any less integrated than a "discrete" GPU integrated into a laptop which is soldered onto the main board? Yet we still call it discrete....
You mean how is it less discrete.
The sentence, "how is it less integrated than a discrete" is fucking nonsensical.
It is less discrete than a discrete because it is not discrete. That's how definitions work. Funny things, really.
The GPUs in the M1*, and the A* chips they grew from, are integrated into the CPU. They're not even a separate die.
A discrete GPU is on a different package, connected via circuit boards. It is, by definition, discrete. The integrated GPU is, by definition, not discrete.
I prefer to use a term that is the closest accurate description; "discrete" is a more accurate description of the Apple M1 GPU than is the term "integrated" because of people's understanding of what the term "integrated" implies.
No it's not, you sill
Re: (Score:2)
It has a discrete GPU, the Apple GPU. It's a separate GPU section of the M1 chip.
It does not have a discrete GPU - a separate part of the chip is not what "discrete GPU" means; they all do that for obvious reasons. You can claim that the performance is the same as e.g. one of the high-end discrete Nvidia offerings, so that the old connotation of "integrated graphics sucks" based on past experiences (budget solutions) no longer applies - but it is not a discrete GPU.
It will be interesting to see benchmarks.
Re: (Score:2)
It does not have a discrete GPU - a separate part of the chip is is not what "discrete GPU" means
Wrong; "discrete GPU" refers to the memory architecture used, which is different on the M1 than on Intel chips with integrated graphics.
You can claim it's integrated, but it's a misuse of the term and a misunderstanding of the hardware.
You [makeuseof.com] can't [intel.com] make [quora.com] your [pcmag.com] own [hp.com] definition [gpumag.com].
Look, I'm not claiming it's bad. I'm just saying your definition is wrong, and - like another post phrased it - you can't choose not to call it a car just because your European car is a lot better than your GM/Chrysler/Ford. Your European vehicle would still be a car.
Re: (Score:2)
Agreed, their emulation is still not good enough and they cannot run Windows apps, even after a year and a half of the M1 in production. My mini can last a little longer since it's faster at running software that this can't run at all.
Re: (Score:2)
If a Windows app will run on Wine, it'll run on an Apple Silicon Mac using Crossover [codeweavers.com].
Obviously that's not *all* Windows apps, though.
Re: (Score:2)
I don't need Windows apps, I need actual VirtualBox VMs.
Re: (Score:2)
Windows for ARM is not a "thing"; there is no product you can buy for this purpose.
Re: So what? (Score:2)
Funny. I still get regular upgrades of VirtualBox with new features.
Re: (Score:2)
Sounds very much in the spirit of MacOS. It just works, right?
Seriously, what's the point of this post?
Re: (Score:2)
Sounds very much in the spirit of MacOS. It just works, right?
Seriously, what's the point of this post?
I agree wholeheartedly: there is seldom any reason for any of your posts!
Re: (Score:3)
they cannot run Windows apps, even after a year and a half of the M1 in production.
Rumors suggest this is because of an exclusivity deal between Microsoft and Qualcomm [theverge.com], rather than technical hurdles or a lack of desire from either Apple or Microsoft. Most indications suggest Microsoft wants to make it happen, as does Apple, but everyone's hands are tied until the deal expires, presumably sometime soon.
It's also worth noting that 1.5 years is keeping in line with precedent. Most people have forgotten or were never aware that booting into Windows wasn't supported on Day 1 of the switch to Intel
Re: (Score:2)
"I presume you're talking about Rosetta 2, in which case I beg to differ."
Beg to differ all you want. There were those here insisting it was perfect, probably including you, even better than native x86, over a year ago. Meanwhile, there are floating point problems that prevent some apps from working. It is not a matter of "slowness".
"...but if you think an emulator needs to be capable of outperforming native hardware before it's "good enough", I'll suggest you revisit your expectations."
Quite a dickish straw man you have there.
Re: (Score:2)
there are floating point problems that prevent some apps from working. It is not a matter of "slowness".
Honest to goodness, I'd love to know more. Not for the sake of argument, but simply for the sake of curiosity. Do you have more info?
"...but if you think an emulator needs to be capable of outperforming native hardware before it's "good enough", I'll suggest you revisit your expectations."
Quite a dickish straw man you have there.
To the contrary, I didn't know what you were getting at, hence why I preceded that with an "if". While "if" can be interpreted in English as "since", that was not my intent here. I really did intend it as a conditional statement.
Re: (Score:2)
They can claim all they want about the M1 performance; until I can run an x86/x64 virtualbox system on it, it is not for me.
LOL.
They can claim all they want about the Model T performance. Until I can hook up my horses to it, it's not for me.
Re: (Score:2)
Quite the computer scientist you are.
Re: (Score:2)
I run x86 docker containers all the time. Maybe that's not enough for you, but maybe it is?
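For what it's worth, here's a sketch of all it takes (assuming Docker Desktop or another engine with QEMU/Rosetta emulation is installed; the --platform flag requests the x86-64 image variant):

import subprocess

# Run an amd64 container on an Apple Silicon host; uname prints "x86_64"
# inside the container because the image is emulated rather than native ARM.
subprocess.run([
    "docker", "run", "--rm",
    "--platform", "linux/amd64",
    "alpine", "uname", "-m",
])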
Re: So what? (Score:2)
On Mac they're close to the same thing, actually, because containers don't work on Mac, so you usually run a VirtualBox VM to get container support, then connect to that via the Docker CLI.
Re: So what? (Score:2)
Maybe you're right. But almost 100% of the Mac users I know use x86 VMs.
Re: (Score:2)
So get two of these and stack them on top of each other, and have an M1 Cube!
Designed to fail (Score:2)
Yes. They do. As do serious users of other stripes. So I checked with Apple directly today, and they told me definitively that the SSD is not replaceable.
Consequently, when it dies, as it inevitably will because that's the nature of the technology, the owner is now proud to have a nice aluminum boat anchor. I was considering one of these, right up until that soggy turd of information landed in my lap.
Re: (Score:2)
Here are the Macs that support macOS Monterey:
MacBook, early 2016 and later.
MacBook Air, early 2015 and later.
MacBook Pro, early 2015 and later.
Mac Pro, late 2013 and later.
Mac Mini, late 2014 and later.
iMac, late 2015 and later.
iMac Pro, 2017 and later.
Some go back 9 years, and the iMac Pro, which is the newest line listed, was only released in 2017. All are supported in that range.
Re: (Score:2)
Consequently, when it dies, as it inevitably will because that's the nature of the technology, the owner is now proud to have a nice aluminum boat anchor. I was considering one of these, right up until that soggy turd of information landed in my lap.
Since we're talking about a desktop machine, not a laptop, if worst comes to worst, you could install an external SSD and just pretend that the internal SSD doesn't exist.
External storage over USB 4.0 or Thunderbolt 4 would be slightly slower (the M1 Max's built-in NVMe storage reportedly delivers up to 7.4 GB/sec [macperformanceguide.com] performance, and Thunderbolt 4 throughput would be limited to ~4 GB/s by the 32-gigabit-per-second PCIe bandwidth), but it would be far from a boat anchor.
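The back-of-the-envelope math, for the curious (the 7.4 GB/s figure is the reported one above; this treats TB4's usable PCIe tunnel as 32 Gb/s):

# Thunderbolt 4 tunnels ~32 Gb/s of PCIe, so external SSDs top out around 4 GB/s.
tb4_pcie_gbit = 32                  # usable PCIe bandwidth over TB4, in Gb/s
tb4_gb_s = tb4_pcie_gbit / 8        # -> 4.0 GB/s best case
internal_gb_s = 7.4                 # reported internal NVMe figure from above
print(tb4_gb_s, round(internal_gb_s / tb4_gb_s, 2))  # 4.0 GB/s, ~1.85x gap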
Not With Apple Silicon (Score:2)
Consequently, when it dies, as it inevitably will because that's the nature of the technology, the owner is now proud to have a nice aluminum boat anchor. I was considering one of these, right up until that soggy turd of information landed in my lap.
Since we're talking about a desktop machine, not a laptop, if worst comes to worst, you could install an external SSD and just pretend that the internal SSD doesn't exist.
On an M1 mac (and, you should assume, on anything later), the built-in SSD is required for the machine to function at all. Stated very roughly, the primary, internal OS install is special and must validate/authenticate other OS installs, in a way that's never happened on a Mac before (i.e. not on Intel Macs, even the ones with embedded SSDs).
Basically, now that writeable, non-volatile mass storage can be embedded and soldered to the motherboard, Apple can (in the name of security) lock the OS down, confident that the onboard storage has never been connected to (and written from) any other machine.
Re: (Score:2)
Consequently, when it dies, as it inevitably will because that's the nature of the technology, the owner is now proud to have a nice aluminum boat anchor. I was considering one of these, right up until that soggy turd of information landed in my lap.
Since we're talking about a desktop machine, not a laptop, if worst comes to worst, you could install an external SSD and just pretend that the internal SSD doesn't exist.
On an M1 mac (and, you should assume, on anything later), the built-in SSD is required for the machine to function at all. Stated very roughly, the primary, internal OS install is special and must validate/authenticate other OS installs, in a way that's never happened on a Mac before (i.e. not on Intel Macs, even the ones with embedded SSDs).
Basically, now that writeable, non-volatile mass storage can be embedded and soldered to the motherboard, Apple can (in the name of security) lock the OS down, confident that the onboard storage has never been connected to (and written from) any other machine.
This means that Apple can use the SSD to store the kind of data that would previously have to go in NVRAM or PRAM, and they'll put a ton more data in that category, now that they have so much more space. "Designed to Fail" isn't putting it quite right, but yes, a huge single point of failure has been added to the design, and arguably for Apple's benefit rather than yours.
Yes, you can still boot from an external drive, but this was unreliable in the first few versions of Big Sur*. Of course you couldn't get a straight answer out of Apple (or at an Apple Store) about whether or not it was supported (or why it randomly didn't work for some people), but things were a little better when I tried it myself last year. Still, remember that the size of the built-in storage (which you can ONLY increase at purchase, and NEVER upgrade) is a profit center for Apple, so who knows how long (or how easily) they're going to allow you to boot off a bigger, cheaper external drive?
* Some think it only worked with certain enclosures, some have no idea why it did or didn't work:
https://old.reddit.com/r/macmi... [reddit.com]
https://old.reddit.com/r/macmi... [reddit.com]
https://old.reddit.com/r/macmi... [reddit.com]
Blah, blah, Apple could do this, Apple could do that.
But they never do.
How long has the meme that "Apple will lock down Macs to the Mac App Store" been around? Well over a decade. Never happens. And yet...
This is just the same meme, different day.
How long has Apple been soldering SSDs onto motherboards? Where are the class-action lawsuits from irate Apple customers with SSD wearout problems? And the ones that would be failing would be the old-technology, low-capacity SSDs that logically should be failing
Re: (Score:2)
I thought industrial creatives cared about upgradeability and longevity - isn't this just another [well equipped, but still] disposable machine?
Not necessarily. Some do, some don't. For some use cases, performance is so important that when something new and significantly faster arrives, you get it. After all, tools like these are cheap compared to the people cost.
For others, budget is a significant constraint.