Apple Unveils Flagship M1 Ultra Desktop Processor for Its Most Powerful Computers (theverge.com)

The next generation of Apple Silicon chips has arrived with the announcement of the company's new M1 Ultra SoC (system on a chip), the latest entry in Apple's M1 chipset lineup that's even more powerful than the M1, M1 Pro, and M1 Max chips it's released so far. From a report: The key to the M1 Ultra is Apple's UltraFusion architecture -- effectively, Apple is fusing together two separate M1 Max chips into a single, massive SoC, thanks to the 2.5TB/s inter-processor connection that the M1 Max offers. That design lets Apple double virtually all the specs from its M1 Max chip on the M1 Ultra: 20 CPU cores (16 performance and four efficiency), 64 GPU cores, a 32-core Neural Engine for AI processing, and up to 128GB of RAM. All told, Apple says that the M1 Ultra offers eight times the performance of the regular M1.
  • How many PCIe lanes?

    • One Thunderbolt 3 port. So 4x PCIe Gen3 lanes.

      • How many does the chip really have? Looking down the road to, say, a pro workstation, a new cube/ashtray will fail even more.
        Even with built-in video you still need PCIe for network / storage / other I/O cards. And TB is x4 PCIe 3.0 per channel. No PCIe v4 on TB.

      • Sorry, that's wrong; it has 4 TB ports. So 4x4 lanes, 16 Gen3 lanes total.

      • One Thunderbolt 3 port. So 4x PCIe Gen3 lanes.

        Sorry.

        SIX Thunderbolt 4 Ports as realized on the Mac Studio.

        Plus 2 USB-A (3.1) ports, 10 Gb Ethernet, HDMI (2.2?), an SDXC slot, and an analog audio out with high-impedance headphone and line-level drive capability. Oh, and Wi-Fi 6 and BT 5.?

        60% faster than a 28-core Mac Pro.
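To put the port math in this subthread in one place: a back-of-the-envelope tally, assuming each Thunderbolt port tunnels a x4 PCIe 3.0 link (the port count and per-port width are the figures claimed above, not official specs):

```python
# Back-of-the-envelope PCIe bandwidth through Thunderbolt ports.
# Assumes each port tunnels a x4 PCIe 3.0 link (forum figures, not official).
PORTS = 6            # Thunderbolt 4 ports claimed for the Mac Studio (Ultra config)
LANES_PER_PORT = 4   # x4 PCIe 3.0 per port, per the comments above
GTS_PER_LANE = 8.0   # PCIe 3.0 raw rate, GT/s
ENCODING = 128 / 130 # 128b/130b line-encoding efficiency

total_lanes = PORTS * LANES_PER_PORT
gbytes_per_lane = GTS_PER_LANE * ENCODING / 8   # ~0.985 GB/s per lane
total_gbytes = total_lanes * gbytes_per_lane

print(f"{total_lanes} lanes, ~{total_gbytes:.1f} GB/s aggregate")
```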

    • by slaker ( 53818 )

      That's my first thought as well. I have a pair of 16x PCIe cards with four NVMe drives, as well as an Infiniband HBA, in my workstation at home. I didn't buy a Threadripper for the absurd number of CPU cores, but for access to enough I/O to matter. It's nice to have an absurd number of CPU cores, and I'm sure Apple will point to the Thunderbolt ports of whatever they put that M1 in, but I'm still curious to see how much their workstation product will be bound by RAM, disk, and expansion limitations.

      • by jwhyche ( 6192 )

        I didn't buy a Threadripper for the absurd number of CPU cores,

        I've never really seen the need for that many cores in a workstation chip. The things I see needing that many cores would be better done on a server-grade processor: video rendering, animation, and virtual machines. As for animation, most people I know who use Blender hand off the real work to a dedicated monster GPU.

        What am I missing?

        • The dedicated monster GPU is on the chip now.

          How monster it really is will probably be the topic of heated debate.

          And I'm not in this fight. My Ryzen 3200G is fine for my use.

          • I have a 3400G and agree it's a great all-arounder. And I'm really excited for the next-gen Ryzen RDNA2 APUs. But if you can get me Zen 3/RDNA2 performance at ARM TDPs, I would be on it in a heartbeat.

            Too bad the chipmaker is Apple, who have zero interest in making the chips available for purchase and much less in releasing drivers for them for other operating systems.
            • by dgatwood ( 11270 )

              This. If Apple wanted to pretty much end Qualcomm, AMD, and Intel, they could wipe the floor with the whole industry pretty easily at this point.

              • The margins on just selling parts will not be anywhere close to selling all those machines with all those subscription services.

                What will happen is that QCOM, Intel, AMD will play catch up, just like they did on the first 64 bit A series chips.

                • What makes you think that Q, Samsung, etc. have caught up? None of them have the power efficiencies and sustained performance even now. Samsung's phones can't even run at full power unless you're a benchmarking app or certain games.
              • This. If Apple wanted to pretty much end Qualcomm, AMD, and Intel, they could wipe the floor with the whole industry pretty easily at this point.

                Wouldn't that be something. . .?

          • The dedicated monster GPU is on the chip now.

            How monster it really is will probably be the topic of heated debate.

            Even if it comes in at the level of, say, an RTX 3090, it's tightly coupled to everything else in the system, so it's only available in the $8,000 SKU. So it's certainly not targeting gamers, and if you are doing ML or rendering you're better off spending that money on a system with multiple 3090s instead. Of course, as an appliance to run specialized applications like Final Cut it's no contest, and people will certainly pay for that.

        • by AmiMoJo ( 196126 )

          There are lots of workstation loads that benefit from many cores, like compilation and synthesis (FPGA/ASIC), modelling, AI, big data, image processing, video editing...

          Threadripper is a server chip with some features cut; basically it exists to offer a consumer product that won't eat into the lucrative server market sales.

          • by jwhyche ( 6192 )

            I'm going to have to take your word for it. Seems to me the things you mentioned would be better off done on dedicated server chips, with video and image processing loads handed off to GPU cores. Somebody must be using them, since they sell rather well.

            • by AmiMoJo ( 196126 )

              There are some aspects of video processing that can be done on a GPU, but some things need to happen on the CPU, such as effects, compositing, and the like.

              Threadrippers are basically server chips, just a little cost-reduced and tailored to retail customers, e.g. the motherboards supporting them have more M.2 sockets and 16x PCIe slots. You could buy a server chip, but you would end up paying the server premium, not just for the CPU but for the motherboard as well, and the motherboard might not be ideal fo

            • These things are going to be doing local development before pushing to larger servers/clusters. If you are working on highly parallel software, you need enough cores to push the parallelization and find out where the problems are; doing that debugging on a server can be expensive and slow because of running remotely.

            • Seems to me the things you mentioned would be better off being done on dedicated server chips.

              Why? What difference would it make running those things on EPYC as opposed to Threadripper?

        • > What am I missing?

          Threadripper is not just about core counts, but ALSO about memory channels, maximum RAM, PCIe lanes, cache sizes, and base/turbo clock speed.

          Here are some of the things you are missing:

          1. Price, Scalability, and Bang/Buck

          My 24C/48T TR 3960X was $1400, which was a good compromise between cost and performance. (1st and 2nd gen TRs are "dead end" due to AMD pulling Intel-style socket shenanigans, but there may be some deals to be found. That's how I was able to get my 12C/24T TR 1920

          • Threadrippers are workstation chips and EPYCs are server chips.

            Not just enterprise.
            But fewer cores at a higher speed vs. more cores at a lower speed (better for running server workloads).

          • While 1st and 2nd gen TRs are "dead end" due to AMD pulling Intel socket shenanigans

            What "socket shenanigans"? They doubled the number of memory channels and PCIe lanes. This wasn't some arbitrary tweak to force the purchase of a new motherboard, it was a major upgrade. You're mistaken BTW about TR being 4 channel, the top end models are 8 channel.

            HEDT is VERY niche and probably dead now with Threadripper 5000 WX being basically discount EPYC.

            In some sense that was always the case: you're not paying for the

            • > What "socket shenanigans"?

              If only there were a way to research socket types [wikichip.org] /s Since you are too lazy, here is a handy table:

              * TR 1000, 2000 are socket sTR4
              * TR 3000 is socket sTRX4
              * TR Pro 3000WX, TR Pro 5000WX is socket sWRX8 [amd.com]

              If AMD had done more planning they would have used the same socket across multiple generations of Threadrippers like they did for socket AM4.

              > You're mistaken BTW about TR being 4 channel, the top end models are 8 channel.

              I said "used to sit in between", as in the ORIGINAL Thre

        • by slaker ( 53818 )

          In my case it's a mixed workload. I moved what had been four moderately nice desktop PCs into one big system that can handle both work (several VMs that run 24x7) and hobby (prosumer camera stuff) needs. I actually do get to process 4k/120 and 8k video in Resolve Studio, which is one of those tools that uses the CPU and sometimes the GPU, but in more or less all cases benefits from having I/O split across several disks (e.g. input / scratch space / output). A lot of other prosumer graphics tools have simila

      • Too bad AMD is so slow to update Threadripper.

  • 128GB max is still not a fit for a pro workstation like the Mac Pro

    • It will do for the tasks for which it is intended by Apple, and if you want to do anything else then they don't care about you for one reason or another.

      An Apple computer, while technically an expandable general-purpose computer, is for most intents and purposes best treated as an appliance for running the big flagship apps. Apple will sometimes go out of its way to prevent you from doing what you want to do, though this is fairly rare on its desktop computers, but it won't lift one finger to help you unles

      • by leptons ( 891340 )
        >It will do for the tasks for which it is intended by Apple

        And if you need something different than what Apple says you need, they'll just tell you you're "holding it wrong".
    • 128GB max is still not a fit for a pro workstation like the Mac Pro

      For certain values of "pro." I know professional photographers who make a living just fine with substantially less ram and substantially less general computing power. Good enough for every single use case? No. Good enough for almost everybody else? Yes.

      Also, the headline is misleading, and Apple specifically said they are leaving their most powerful product line, the Mac Pro, to a future update.

      • by AmiMoJo ( 196126 )

        The real problem is the lack of upgrade options. You have to decide how much RAM you need when you buy the machine, and pay Apple prices for it.

        • If you don't know how much RAM you'll need when you buy it by now,
          what else don't you know?

          • by AmiMoJo ( 196126 )

            You may know how much you need today, but in a few years time? My current computer is a decade old, it's had a lot of upgrades.

            • You may know how much you need today, but in a few years time? My current computer is a decade old, it's had a lot of upgrades.

              In a few years, you'll be giving it to your Mom to run Mah Jongg and keep her recipes on, because there will be a new Mac that makes the performance of this one look like a Quadra 650.

              Apple now needs to wait for no one else's CPU Roadmap.

        • The real problem is the lack of upgrade options. You have to decide how much RAM you need when you buy the machine, and pay Apple prices for it.

          That is a fair assessment. It looks as if Apple is charging $400 USD to go from 32GB to 64GB (M1 Max), and $800 to go from 64GB to 128GB (M1 Ultra).

          The exact specs remain to be seen, but retail DIMMs of 64GB (2x32GB) DDR5 seem to be in the ballpark (+/-) of $600. So, maybe a $100 markup per 32GB?

          Personally, I upgraded my 2017 iMac from 8GB (! never even realized I was so low) to 64GB just recently for cheap. I'm never going to argue against upgradeability. I would, however, say that if you are someone who might

          • by AmiMoJo ( 196126 )

            They probably won't ever change those prices either. Typically Apple computers stay the same price through their lifetimes, so the gap between Apple price and everyone else's price only increases.

            Will be interesting to see what else gets soldered on, e.g. the SSD.

            It has repair ramifications too. A memory fault means replacing your entire system because the RAM and CPU are soldered to the motherboard.

            • It just makes repairs more difficult for the average person, but if you can solder the replacement part yourself it's not an issue.
              • by AmiMoJo ( 196126 )

                BGA rework isn't exactly easy, and even if you have the gear and knowledge, Apple won't sell you a replacement part anyway.

                • Are the RAM and SSD chips actually proprietary?

            • Will be interesting to see what else gets soldered on, e.g. the SSD.

              Don't know about this particular computer, but Apple has been soldering SSDs for several years now.

              I'm sure it's happened, but I have literally never heard of an Apple product with a soldered SSD dying due to the SSD (or RAM). I follow several Apple forums sporadically.

          • by hawk ( 1151 )

            > It looks as if Apple is charging $400 USD to go
            > from 32GB to 64GB (M1 Max), and $800 to go
            > from 64GB to 128GB (M1 Ultra)

            So $100 per 8GB.

            To put this into perspective, new Apple ][s started getting maxed out to all 48k when third-party memory cost dropped to $100 for 16k... Apple briefly had similar pricing, then started only shipping 48k models.

            [and early on, there were *two* 12k factory configurations: one with 3 contiguous 4k blocks, and another with a gap between the first and second bank fo
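The upgrade arithmetic in this subthread is easy to verify; a quick sketch using the USD figures quoted above (forum numbers, not a current price list):

```python
# Apple RAM upgrade pricing implied by the comments above (USD, forum figures).
upgrades = {
    "M1 Max 32GB -> 64GB": (400, 64 - 32),
    "M1 Ultra 64GB -> 128GB": (800, 128 - 64),
}

for name, (price, added_gb) in upgrades.items():
    per_gb = price / added_gb
    print(f"{name}: ${per_gb:.2f}/GB (${per_gb * 8:.0f} per 8GB)")
```

Both steps come out to the same $12.50/GB, which is where the "$100 per 8GB" figure comes from.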

    • by fermion ( 181285 )
      What is not in the summary is that this chip is being used to create a new category for Apple: a high-performance headless compact desktop. Like a Mac mini, but intended for a market between the all-in-one and future pro machines, which presumably will cost 2-3 times as much. I can see it as a replacement for my cylinder.

      The memory limit is likely due to the size; Apple often sacrifices higher specs for small size. The smaller size could also lead to an efficient personal render farm, with several cubes

    • by dgatwood ( 11270 )

      128GB max is still not a fit for a pro workstation like the Mac Pro

      This is a midrange workstation. I think it is probably safe to say that the Mac Pro will have socketed RAM. The current Intel Mac Pro can hold up to 1.5 TB of RAM, and there's no way they could realistically cram anything approaching that much RAM into a CPU package.

      • 128GB max is still not a fit for a pro workstation like the Mac Pro

        This is a midrange workstation. I think it is probably safe to say that the Mac Pro will have socketed RAM. The current Intel Mac Pro can hold up to 1.5 TB of RAM, and there's no way they could realistically cram anything approaching that much RAM into a CPU package.

        The two questions for the Mac Pro that remain are:

        1) How are they going to offer PCIe Slots?

        2) How are they going to maintain the performance advantages of Unified Memory?

    • You do realize the Mac Pro refresh hasn't happened yet, right?
    • 128GB max is still not a fit for a pro workstation like the Mac Pro

      Probably why they tossed out the sequel-bait line about the Mac Pro being left for another day.

  • "They did what? Screw it boys, we're going to 21 cores! [theonion.com]"
  • ARMCortex(TM) is old hat now. All the cool kids are using RISC-V
    • My understanding is that RISC-V implementations have been maturing rapidly over the past few years, going from being "far too slow to use in a desktop or laptop" to being merely "not yet as fast as a low-end Intel CPU". If RISC-V continues this rate of progress, then in a few years' time I imagine there will be some "fast enough" RISC-V chips for use in laptops and desktops. But, as far as I know, that time is not yet upon us.
  • Apple-only storage and RAM at extreme markup

    • Yes, it's expensive, but now at least they come with super fast CPU/GPU performance rather than Intel's slow and hot CPUs.
  • Now, I will not buy overpriced Apple hardware, but enough people will and some will actually benefit from this kind of power at this price. But the good thing is that this makes amply clear that AMD64 is no longer the only architecture if you need high performance and cannot buy a "big iron". CPU architecture has been stagnant for too long.

    • Locked-in storage and RAM and limited PCIe is not good.

      Really needs an M.2 slot.
      Really needs more than just TB ports that share bandwidth with video.

      Better networking / some way to fit a non-copper NIC inside may be needed for a workstation.

    • Now, I will not buy overpriced Apple hardware

      That's exactly the kind of hard-hitting analysis I come to Slashdot for!

      But the good thing is that this makes amply clear that AMD64 is no longer the only architecture if you need high performance and cannot buy a "big iron"

      That's been clear for quite some time.

      • by gweihir ( 88907 )

        But the good thing is that this makes amply clear that AMD64 is no longer the only architecture if you need high performance and cannot buy a "big iron"

        That's been clear for quite some time.

        Not really. There was good reason to suspect, but it may have been a one-time thing. Now that they have done it again, we can be pretty sure it is not, and others will start to invest as well.

          • Not really. There was good reason to suspect, but it may have been a one-time thing. Now that they have done it again, we can be pretty sure it is not, and others will start to invest as well.

          To expand on that, for certain workflows and jobs, GPU-based architectures have been the real deal, and more important than the CPU.

          Apple seems to have done an incredible job with their ARM chips, but the viability of ARM has been clear for some time. Apple does deserve the credit for kicking ARM squarely into the PC market, and let's see what happens over the next few years with competition.

          • by gweihir ( 88907 )

            Apple does deserve the credit for kicking ARM squarely into the PC market, and let's see what happens over the next few years with competition.

            Exactly. Competition is good for innovation and product optimization.

        • To be more precise, this is the 3rd/4th time they've done it:
          1. First they released the M1.
          2/3. Later, they released the M1 Pro and M1 Max at the same time.
          4. Today they released the M1 Ultra.

          • Oh dear, where can they go from there? Super Ultra Plus?

            • Well, they're doing M1, M1 Pro, M1 Max, M1 Ultra. Let's hope they continue with a single qualifier for the more powerful versions, if there are any. For all we know, the Mac Pro could very well launch directly with the M2 Ultra.

            • Oh dear, where can they go from there? Super Ultra Plus?

              Actually, they intentionally teased by introducing the Ultra as the "one final Member of the M1 Series."

              So no more Adjectives to be added to "M1" (thank Bob!).

    • Now, I will not buy overpriced Apple hardware, but enough people will and some will actually benefit from this kind of power at this price. But the good thing is that this makes amply clear that AMD64 is no longer the only architecture if you need high performance and cannot buy a "big iron". CPU architecture has been stagnant for too long.

      Really? Overpriced?

  • Seems to me that the names "Ultra" and "Max" really should have been swapped in this lineup (provided they don't have another name in mind for something even more powerful for the Mac Pro). Apple's product lineups have drifted further and further away from the simplicity of consumer/pro+desktop/laptop of old for what are arguably valid reasons, but this is just inexplicable.

    • by ceoyoyo ( 59147 )

      Not inexplicable. The people who vehemently demanded simplicity died or left. The ones who replaced them believe marketing research that says people get erections when products are labelled "Ultra Max Pro Xperience" so there you go.

      Just wait for the Mac DG4632 Ultra HiSpeed.

    • by jbengt ( 874751 )

      Seems to me that the names "Ultra" and "Max" really should have been swapped in this lineup

      Despite the literal meaning of "max" and "ultra", in most people's minds "ultra" is a greater superlative than "max".

      • Seems to me that the names "Ultra" and "Max" really should have been swapped in this lineup

        Despite the literal meaning of "max" and "ultra", in most people's minds "ultra" is a greater superlative than "max".

        I predict next year's more-powerful models will be called "ludicrous" and "plaid".

  • So the new Mac Studio is just an elongated Mac Mini with a better cooling system. So it's basically a Mac Maxi. So you can buy a Mac Mini Maxi M1 Max Studio.

    I own and love an M1 Pro MacBook Pro (gah!) but the naming schemes are trying way too hard to be techy and accessible. Nerds like numbers; you can still name CPUs with some denotation of performance and tier without this kind of janky confusion.

  • by drnb ( 2434720 ) on Tuesday March 08, 2022 @02:08PM (#62337213)
    The "floppy" is back, SD card on the front of Mac Studio. :-)
  • $200 for an upgrade from 512GB to 1TB of disk!!

    With other M.2 disks it's like $200 for a 2TB one and $100 for a 1TB one. Even if you go to higher-end ones, it's still maybe $150-$200 for 1TB.

    • AFAIK there's no M.2 slot in the Mac Studio, so complaining about prices and options is pointless.

      Nothing prevents you or anyone from connecting external drives via the multiple ports. The Mac Studio has four Thunderbolt 4 ports with support for:
      - Thunderbolt 4 (up to 40Gb/s)
      - DisplayPort
      - USB 4 (up to 40Gb/s)
      - USB 3.1 Gen 2 (up to 10Gb/s)

    • Seems like there should be a market for third-party "upgradeable" Macs, so we customers aren't locked into crap like this. For example, an enterprising soul buys the low-end Macs from Apple, deconstructs the parts that might be upgraded later, and re-packages it into a modular system that can be opened and upgraded at will. It's still Apple stuff, so the hardware shouldn't balk at running MacOS, but it's just packaged so we can get inside. Of course, that would necessitate a premium over an already not-cheap device.
      • Seems like there should be a market for third-party "upgradeable" Macs, so we customers aren't locked into crap like this.

        Dude! You would have loved the 20th century [cultofmac.com]!

      • Seems like there should be a market for third-party "upgradeable" Macs, so we customers aren't locked into crap like this. For example, an enterprising soul buys the low-end Macs from Apple, deconstructs the parts that might be upgraded, later, and re-packages it into a modular system that can be opened and upgraded at will. It's still Apple stuff, so the hardware shouldn't balk at running MacOS, but it's just packaged so we can get inside. Of course, that would necessitate a premium over an already not-cheap device.

        Can't do it and retain the speed of Unified Memory. Speed of Light and all that.

    • I assume this is due to the speed of the SSD. At ~7GB/s it is faster than any M.2 SSD available today. But they really could use an internal M.2 slot for a second drive, because sometimes 2GB/s is fast enough. Actually, it is almost always fast enough.
      • I assume this is due to the speed of the SSD. At ~7GB/s it is faster than any M.2 SSD available today. But they really could use an internal M.2 slot for a second drive, because sometimes 2GB/s is fast enough. Actually, it is almost always fast enough.

        What can you do with a TB4 external drive?
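A rough answer to that question, treating the ~7GB/s internal figure from the parent as given and assuming a typical ~3GB/s sustained rate for Thunderbolt NVMe enclosures (an assumption, not a spec):

```python
# Compare the quoted internal SSD speed against a Thunderbolt 4 external drive.
INTERNAL_GBPS = 7.0         # ~7 GB/s quoted above for the internal SSD
TB4_LINK_GBITS = 40         # Thunderbolt 4 link rate, Gb/s
TB4_RAW_GBPS = TB4_LINK_GBITS / 8   # 5 GB/s theoretical ceiling
TB4_PRACTICAL_GBPS = 3.0    # typical sustained rate for TB NVMe enclosures (assumption)

print(f"TB4 ceiling: {TB4_RAW_GBPS:.0f} GB/s; the internal SSD is "
      f"~{INTERNAL_GBPS / TB4_PRACTICAL_GBPS:.1f}x faster than a typical TB4 enclosure")
```

So an external TB4 drive is plenty for "2GB/s is fast enough" workloads, but it can't match the internal drive's quoted peak.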

  • That means no way to use current high-end NVMe SSDs at their highest speeds. That's a prerequisite for someone using a monster workstation like this for many, many use cases, especially at this crazy price point. And soldered-on SSDs and RAM? Pass. For the money I could build two complete PCIe 4.0-based workstations, each of which would bury this thing in any benchmark you'd use.
    • As well as keeping your house nice and toasty in the winter months.

    • For the money I could build two complete PCIe 4.0-based workstations, each of which would bury this thing in any benchmark you'd use.

      And for that money I could buy a used Jeep that would bury your workstations on any dirt road you'd use.

    • You can pay at least a $150/TB markup for Apple ones that you can't replace.

    • That means no way to use current high-end NVMe SSDs at their highest speeds. That's a prerequisite for someone using a monster workstation like this for many, many use cases, especially at this crazy price point. And soldered-on SSDs and RAM? Pass. For the money I could build two complete PCIe 4.0-based workstations, each of which would bury this thing in any benchmark you'd use.

      Oh, here we go again!

      Even more amazing, since there are almost zero benchmarks so far for the M1 Ultra, eh?

  • Odd that besides the three males at the top of the features, all of the product presenters were female.
    Even the software company "oh the Mac Studio is awesome" spots were female.

    I didn't go back to check, but I think that most were now presentation actors, not Apple staff.
    One would think that it's to promote female empowerment, but on the other hand, stiletto heels.

  • Up to 8 times the performance of the 8000.

  • by AndyKron ( 937105 ) on Tuesday March 08, 2022 @03:37PM (#62337579)
    I don't care if it's the super duper gee whiz processor. It's Apple and fuck Apple
  • by nickovs ( 115935 ) on Tuesday March 08, 2022 @07:13PM (#62338523)
    The lack of expandability is certainly a big drawback but I have to disagree with the people here complaining that the RAM is overpriced. The bandwidth of the RAM in the Ultra is rather higher than you'll get out of DDR5 on a 12th gen Intel i9. Apple's upgrade from 64GB to 128GB is $800 but if you go on NewEgg.com and search for 64GB DDR5 kits (either 2 x 32GB or 4 x 16GB) the average price is $850. In comparison Apple's price looks pretty good, especially given the higher speed.

    The pricing for the SSD options is far less compelling, although it rather depends on the endurance of the drive, since the prices are fairly in line with those of data-centre SSDs. Still, I expect most of the target market will have a huge storage server nearby, and the built-in 10Gb/s networking will mean that the SSD only needs to hold the assets for the current project.
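For what it's worth, the per-gigabyte comparison implied by this comment's numbers (the $850 NewEgg average is the poster's figure, not a live price check):

```python
# Per-GB comparison using the prices quoted in the comment above.
apple_per_gb = 800 / 64    # $/GB for Apple's 64GB -> 128GB upgrade step
newegg_per_gb = 850 / 64   # $/GB average for a retail 64GB DDR5 kit (poster's figure)

print(f"Apple upgrade: ${apple_per_gb:.2f}/GB vs retail DDR5 kit: ${newegg_per_gb:.2f}/GB")
```

On these numbers Apple's step price actually undercuts the quoted retail average slightly, which is the comment's point.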
