Apple Begins Testing Speedy M3 Chips That Could Feature 12 CPU Cores (engadget.com)
Engadget writes:
Apple is testing an M3 chipset with a 12-core processor and 18-core GPU, according to Bloomberg's Mark Gurman. In his latest Power On newsletter, Gurman reports a source sent him App Store developer logs that show the chip running on an unannounced MacBook Pro with macOS 14. He speculates the M3 variant Apple is testing is the base-level M3 Pro the company plans to release sometime next year...
[T]he M3 Pro reportedly features 50 percent more CPU cores than its first-generation predecessor.
From Gurman's original article: I'm sure you're wondering: How can Apple possibly fit that many cores on a chip? The answer is the 3-nanometer manufacturing process, which the company will be switching to with its M3 line. That approach allows for higher-density chips, meaning a designer can fit more cores into an already small processor.
mac pro? (Score:3, Informative)
mac pro?
Re: Joe_Dragon with the "room temp IQ" saves the d (Score:3)
Fahrenheit (Score:2, Funny)
Re: (Score:2)
mac pro?
Dunno.
But this does feel like the "fish or cut bait" generation of chips for getting the Mac Pro switched to M-series Apple Silicon.
Re: (Score:3)
Re: (Score:3, Funny)
Re: What's the point? (Score:5, Informative)
Re: (Score:2)
Twelve cores is an incremental improvement. The M1 has eight, so going from 5nm to 3nm will allow for two more efficiency cores and two more performance cores on the base CPU, and a corresponding bump for the Pro/Max/Ultra/etc.
I'm just hoping that the on-package RAM increases. It would be nice to see 48 gigs of RAM on the "plain" M3 as an option, with the basic model having 16 minimum, perhaps 24.
As for software taking advantage of it, there are a lot of tasks that can be split up. For example, renders are easily divided into independent rows or tiles, as in the sketch below.
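To make that concrete, here is a minimal C sketch of row-splitting a render with OpenMP. The image size and the toy shading math are made up for illustration; the point is only that independent rows let the runtime spread the work across every core it finds.

```c
/* Toy row-parallel "render": every row is independent, so OpenMP can hand
 * rows out to however many cores exist.
 * Build: cc -O2 -fopenmp tiles.c -lm */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define W 1920
#define H 1080

int main(void) {
    unsigned char *img = malloc((size_t)W * H);
    if (!img) return 1;

    #pragma omp parallel for schedule(dynamic)
    for (int y = 0; y < H; y++) {          /* rows are independent work units */
        for (int x = 0; x < W; x++) {
            /* stand-in for real shading math */
            img[(size_t)y * W + x] =
                (unsigned char)(255.0 * fabs(sin(x * 0.01) * cos(y * 0.01)));
        }
    }

    printf("rendered %dx%d, sample pixel = %d\n", W, H, img[W + 1]);
    free(img);
    return 0;
}
```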
Re: (Score:3, Interesting)
I'm just hoping that the on-package RAM increases. It would be nice to see 48 gigs of RAM on the "plain" M3 as an option, with the basic model having 16 minimum, perhaps 24.
According to the article "the chip was spotted configured with 36GB of RAM," so perhaps 48 or more max.
Re: (Score:2)
Yeah, Apple has historically under-gunned its base-level Macs with RAM, and while that was fine in the days when you could actually upgrade such things, it's a much bigger problem with baked-on RAM.
Even the 16GB in my MacBook M1 Max gives me grief sometimes if I've got a lot of nonsense going on (I have a tendency to ADHD computing with about 20 tasks at a time. Yeah, not great, but that's just how my brain rolls), and while I'd LIKE to have gotten the higher spec at the time, my finances were severely impaired.
Re:What's the point? (Score:5, Informative)
This is talking about an M processor, which is used for Macs. Phones use the A processors.
As a programmer, I'm using a 16-core/32-thread Ryzen chip. When I do builds, all 32 threads run at 100% usage, and it does wonders for reducing build times. A lot of graphics and video work can also use as many cores as you can throw at the task.
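As a rough illustration of why builds scale like that, here is a small C sketch: a pool of worker threads pulling independent "compile jobs" off a shared counter. The job count, thread count, and busy-work loop are made-up stand-ins for real compiler invocations.

```c
/* Independent jobs + one worker per hardware thread = near-linear scaling.
 * Build: cc -std=c11 -O2 -pthread jobs.c */
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

#define NJOBS 128
static atomic_int next_job;          /* shared work queue: just a counter */
static double results[NJOBS];

static void *worker(void *arg) {
    (void)arg;
    for (;;) {
        int j = atomic_fetch_add(&next_job, 1);   /* claim the next job */
        if (j >= NJOBS) return NULL;
        double acc = 0.0;            /* CPU-bound stand-in for compiling */
        for (long i = 1; i <= 5000000; i++) acc += 1.0 / (double)i;
        results[j] = acc;
    }
}

int main(void) {
    enum { NTHREADS = 32 };          /* e.g. a 16-core / 32-thread Ryzen */
    pthread_t t[NTHREADS];
    for (int i = 0; i < NTHREADS; i++) pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < NTHREADS; i++) pthread_join(t[i], NULL);
    printf("finished %d jobs, results[0] = %f\n", NJOBS, results[0]);
    return 0;
}
```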
Re: (Score:2)
Is there actually software that can take advantage of that many cores? Certainly not on a phone, right?
Final Cut Pro, Logic, Premiere, Photoshop, Maya and Resolve are a few that come to mind.
In fact, a friend of mine who has a Mac Studio with an M1 Ultra recently told me that the photo processing software he uses (name escapes me ATM) routinely lights up all the cores.
So, developers of at least some of those "the more cores the merrier" applications actually do take advantage of the hardware when it's available. And since Apple, unlike Microsoft, doesn't try to play licensing games based on the number of cores in the machine, there's little reason for them not to.
misleading comparison, an Apple mainstay (Score:2, Insightful)
"[T]he M3 Pro reportedly features 50 percent more CPU cores than its first-generation predecessor."
Since when are new processors compared to TWO generations prior? And why compare total CPU core counts when they're very different types of cores?
We all know why, of course.
Re:misleading comparison, an Apple mainstay (Score:5, Informative)
Since when are new processors compared to TWO generations prior?
From TFA:
If you recall, the M1 Pro and M2 Pro feature eight- and 10-core processors, alongside 14- and 16-core GPUs
So this is an incremental step with two more CPU cores and two more GPU cores than the M2 Pro, just like the M2 Pro was over the M1 Pro.
And why compare total CPU core counts when they're very different types of cores?
They aren't. They're the same family / architecture of chips, so a core count is an apple to apples comparison.
We all know why, of course.
No, I don't know. Are you saying the M3 is not more powerful than the M2 and they're totally trying to mislead the public? According to the article this was a leak obtained via developer logs that reported the core count, so this isn't even something Apple is claiming via marketing. It sounds like the M3 has a max of two more cores than the M2, which had two more cores than the M1. Pretty straightforward stuff here.
Re: (Score:2)
Those cores aren't all the same, though. Some are efficiency cores, some are performance cores.
We will have to see what the mix of efficiency and performance cores is for M3.
Re: (Score:2)
Are you saying the M3 is not more powerful than the M2 and they're totally trying to mislead the public?
Yes, this is marketing-speak. Is 12 cores better than 10? Is the M3 more "powerful" than the M2? The answer is -- it depends. For some workloads, yes. For some, no. Being more "powerful" or other marketing metrics of goodness are intentionally murky. The marketing leaves out that dependency and fuzziness and tacitly suggests that more cores or the current generation is better for all potential buyers.
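Amdahl's law makes that "it depends" concrete: speedup = 1 / ((1 - p) + p/n), where p is the fraction of the workload that parallelizes and n is the core count. A toy C calculation, where the parallel fractions are illustrative assumptions rather than measurements of any real application:

```c
/* Amdahl's law: for a mostly serial workload, 12 cores versus 10 is noise;
 * for an embarrassingly parallel one, it's a real win. */
#include <stdio.h>

int main(void) {
    double fractions[] = { 0.50, 0.90, 0.99 };   /* assumed parallel fractions */
    int cores[] = { 10, 12 };                    /* M2 Pro vs rumored M3 Pro */
    for (int f = 0; f < 3; f++)
        for (int c = 0; c < 2; c++) {
            double p = fractions[f];
            double s = 1.0 / ((1.0 - p) + p / cores[c]);
            printf("p = %.2f, %2d cores: %.2fx speedup\n", p, cores[c], s);
        }
    return 0;
}
```

At p = 0.50 the two extra cores buy almost nothing (1.82x vs 1.85x); at p = 0.99 they matter (9.2x vs 10.8x).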
Quantum effects? (Score:3)
Re: (Score:1)
Re:Quantum effects? (Score:5, Insightful)
I suspect part of the answer is that the way they measure feature size has changed. I don't think a modern 3nm chip is 3nm in all the same ways that, say, a 90nm chip was years ago when it was state-of-the-art. I don't really understand this myself, as most of the reporting is written by people who also don't understand it, or so it seems. I would love to hear a good explanation from someone who really knows all the relevant details.
Re: (Score:2)
I also don't know it for certain, but yes, they changed what they measure.
At super small sizes, you can make a transistor of sorts just by laying out three traces in the right way (no N or P doping), which causes manufacturing problems. My understanding is that this means you can (reliably) make a feature a bit smaller than we've previously made them, but you have to space them out a bit more too. I guess there's a slight density improvement, but not as much as you might read into "4nm to 3nm".
Re: (Score:2)
Modern chip manufacturing processes are named as a measure of transistor density, given as if we were still making the same transistors as in those 90nm chips. But we now use different transistor designs which allow for much higher-density chips without improvements to the manufacturing process.
The quoted 3nm chips will actually have much larger physical features. But does this matter? Probably not, since transistor density is what's important, and the quoted value for the node is supposed to track that density.
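A quick back-of-the-envelope C calculation shows the gap between the name and the geometry. The "ideal" figure assumes node names were literal linear dimensions; the ~1.6x "reported" figure is a rough ballpark assumed for illustration, not a vendor spec.

```c
/* If "5nm -> 3nm" were a literal linear shrink, density would scale with the
 * inverse square of the feature size. Reported gains are far smaller, which
 * is the point: the node name is a label, not a measurement. */
#include <stdio.h>

int main(void) {
    double ideal = (5.0 / 3.0) * (5.0 / 3.0);  /* ~2.78x if names were literal */
    double assumed_reported = 1.6;             /* ballpark assumption only */
    printf("ideal geometric density gain:  %.2fx\n", ideal);
    printf("assumed reported density gain: %.2fx\n", assumed_reported);
    return 0;
}
```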
Re: Quantum effects? (Score:2)
Correct. 3nm is a heuristic to convey effective progress, not to be confused with physical reality as in the historical XY-nm era. In fact, this heuristic started around 22nm, if I'm not mistaken.
Re: (Score:3)
I recall reading many years ago that below 5nm, quantum effects make transistors behave nondeterministically. Was this limitation solved?
No, they just changed the way they measure it for marketing reasons.
(just like hard disks used to be in real megabytes but aren't any more)
Re: Quantum effects? (Score:3)
Re: (Score:2)
12 cores? (Score:3)
It's easy to make a chip with 12 cores; there are chips available with 64, 80 or even 128 cores these days. The limiting factor is not the ability to produce a chip with many cores, but being able to do so within the power budget.
Re: (Score:2)
Doing it within the power budget is not a problem. You could have 128 very slow cores that would consume only 5W. The challenge is to make X *fast* cores within a given power budget.
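The scaling behind that claim: dynamic power goes roughly as V² × f, and slower cores can run at lower voltage. A toy C calculation where every voltage and frequency is a made-up illustration, not a real chip's operating point:

```c
/* Dynamic power ~ C * V^2 * f. Slow cores run at low voltage, so lots of
 * them can still undercut a handful of fast cores on total power. */
#include <stdio.h>

int main(void) {
    double fast = 1.0 * 1.0 * 3.0;   /* V^2 * f: 1.0 V at 3.0 GHz, arbitrary units */
    double slow = 0.6 * 0.6 * 0.5;   /* 0.6 V at 0.5 GHz */
    printf("12 fast cores:  %5.1f units\n", 12 * fast);   /* 36.0 */
    printf("128 slow cores: %5.1f units\n", 128 * slow);  /* 23.0 */
    return 0;
}
```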
Re: (Score:2)
Re: (Score:1)
Yes. Because the opinions of some nobody who can't even put their name on their post mean SO much to me....
Re: It's still a Mac. (Score:2)
Okay. Here's a logged-in guy to tell you: Your 'opinion' was half-assed and it revealed that you're poorly informed.
Re: (Score:1)
You're wrong.
But, good or bad, you have the spine to actually put your name on your opinions.
Re: (Score:2)
Single Core (Score:4, Insightful)
More cores are good, but better single-core performance at the same time would be good too.
Re: (Score:3)
Right now is an interesting time for CPUs. We have two competing ideas - efficiency cores and chiplets.
ARM and by extension Apple use efficiency cores to get better battery life. AMD gets similar battery life with only performance cores, using chiplets. AMD's way seems to be the best at the moment. You get more performance cores and more performance overall, at a lower cost.
I expect the M3 will be like the M2 and M1 before it: mid-range performance overall, with some halo SKUs that throw cores at a few benchmarks. Not competitive with AMD at the high end.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
ThinkPad Z series?
Re: (Score:2)
What AMD systems rival Macs right now? I really would love a good Windows system that has similar battery/performance/heat/form factor to my M1 MacBook Pro.
Curious what you need the Windows machine for? Gaming, or something else?
ARM-based Windows-compatible machines are coming, but we will likely need to wait a bit longer. Also, on the x86 side, I am curious how much of a power reduction getting rid of 32-bit backwards compatibility would bring.
Re: (Score:2)
I'm relatively new to macOS, and I bet by the time a suitable Windows replacement is available I won't even care and will have adapted any existing projects/habits to what I can do with my M1 MBP. I do have to admit, I've been surprised how much more keyboard-friendly current Windows is than macOS.
More is better of course. (Score:2)
How else am I supposed to do all these tasks at the same time on a single-core CPU like the M1?
Was I wondering? (Score:3, Insightful)
I mean, EPYCs come with 96 cores, so I don't think I was really wondering how a company managed to fit 12 on a CPU. You don't even need some special magic sauce. You just make the die a little larger.
Typical Bloomberg reporting.
Re: (Score:3)
Re: (Score:2)
> the Power Mac G4 was marketed by Apple as the first "personal supercomputers"
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Slashdot used to be a place where people knew the difference between a laptop CPU and a server CPU with a 360W TDP. Is it still true?
Yes. It's also a place where some people realise that distinction is irrelevant here, since TDP is nothing more than a design tradeoff, and you seem not to understand what it means to make a facetious comparison.
But since you want to play the TDP game, note that the M2 with maximum core count has only about 3/4 of the TDP of many Intel laptop chips which function just fine in laptops. So ... what's your point? That Apple should have had 18 cores on the M2? Fuck, the Intel P4M 2.8 had an 88W TDP, so by that logic Apple should be designing to that, too.
What did they do to the L2 cache? (Score:3)
At some point in a course I took, we profiled software that used an increasing amount of memory, and by monitoring the speed it could clearly be seen when the access was in L1, L2, L3 cache or main memory. We profiled several AMD, Apple and Intel processors. L1 and L3 cache speed was the same for all processors, but the M processors by Apple had clever engineering in them that made the L2 cache noticeably faster.
Hopefully it will one day be public knowledge what that trick is.
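For anyone who wants to reproduce that kind of measurement, here is a stripped-down C sketch of the classic pointer-chasing probe: walk a randomly shuffled cycle of growing size and time each dependent load. The sizes, iteration count, and use of rand() are rough choices for a sketch, not a rigorous benchmark.

```c
/* Latency steps up as the working set spills out of L1, then L2, then L3,
 * then into DRAM. Build: cc -O2 chase.c */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    for (size_t n = 1 << 10; n <= 1 << 24; n <<= 2) {  /* 8 KiB .. 128 MiB */
        size_t *next = malloc(n * sizeof *next);
        if (!next) return 1;
        /* Sattolo's algorithm: one random cycle visiting every slot */
        for (size_t i = 0; i < n; i++) next[i] = i;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % i;
            size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
        }
        size_t p = 0;
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (long k = 0; k < 10000000; k++) p = next[p];  /* dependent loads */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        printf("%8zu KiB: %5.1f ns/load (end=%zu)\n",
               n * sizeof *next / 1024, ns / 1e7, p);
        free(next);
    }
    return 0;
}
```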
Re: (Score:2)
It's not a trick, they just used more expensive memory for the L2 cache. They had to, because ARM's instruction density is quite low compared to AMD64, so to get similar performance you need massive caches.
It's very embarrassing... (Score:1)
While I never liked Apple products for the past decade or two, since they launched the M1 it has become exceedingly embarrassing for the competition.
I just tried someone's MacBook Air M1 to see if all the hype is true, and installed Parallels and Win11 on an 8GB RAM MBA M1, expecting it to be at best just usable for small stuff, like it is on my Windows laptops (with more RAM, though).
Bloody thing runs macOS plus Parallels and Windows 11 arm64 + Kali Linux ARM way faster than any Windows laptop I have tried. Bought a MacBook a few days later.
The M1 was so good they couldn't sell much of the M2 (Score:1)
I read that the M1 was so good Apple was unable to sell many M2s, because the performance was hardly better.
M3 is supposed to fix that issue.
Re: (Score:2)
There are a few things going on which hampered the sales of M2-based Macs.
1) Pent-up demand for a new chipset led the M1 to be extremely popular. Sales were wildly successful, more so than expected.
2) Consumers / Corporations spent their budgets on M1's; they will wait a few years for ROI before replacing these new computers with later models.
3) M2 was not that big of a leap over the M1 performance-wise; this is not simply a hardware issue, because it will take developers some time to get better at writing code that takes advantage of the new silicon.
Re: (Score:3)
ALSO:
As much as I want Apple to upgrade their complete product line every year, chances are they make more money by selling the same model computer over several years. So even if they could produce a new Mx chip every year, it may not make business sense to do so. Spreading out their computer model upgrades over 2 years seems to be the new cycle, as the MacBook will get upgraded more frequently than the Mac Mini or Mac Studio.
After spending $5000 on an M1 Max MacBook Pro (the storage upgrades are redonkulous), I'm in no hurry to upgrade every cycle anyway.
GPU cores (Score:3)
I wish journalists would stop using that marketing gimmick term "GPU cores"; it means nothing. GPUs are massively parallel. There are far more than 18 instructions that can be executed at the same time.
They are trying to make a comparison with CPUs, but it is not a valid one.
Re: (Score:2)
Re: (Score:2)
They represent nothing physical. Their 16-core GPU could be rebranded as 32 cores or 8 cores if they wanted to.
The M2 has up to 2560 ALUs, so they could claim any number up to that, but they choose to keep it in the same range as CPU cores for marketing purposes only.
By comparison, the GeForce RTX 4090 has 16384 shader processors, but I don't think Nvidia has gone as low as marketing a core number yet.
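The arbitrariness is easy to see by dividing the published ALU/shader counts by the advertised "core" counts. A trivial C calculation using the figures quoted in this thread (taken from the parent posts, not verified against spec sheets):

```c
/* "GPU cores" are just marketing groupings of ALUs; the group size is a
 * branding choice. Counts below come from the thread, not verified specs. */
#include <stdio.h>

int main(void) {
    int apple_alus = 2560, apple_cores = 16;   /* per the parent post */
    int nv_shaders = 16384;                    /* RTX 4090, per the parent */
    printf("Apple: %d ALUs / %d \"cores\" = %d ALUs per \"core\"\n",
           apple_alus, apple_cores, apple_alus / apple_cores);
    printf("Nvidia could market %d shaders as %d \"cores\" of 128 each\n",
           nv_shaders, nv_shaders / 128);
    return 0;
}
```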
Re: (Score:1)
Ahhh yes the new M series (Score:3)
My wife wanted a new Mac for work. An Air is all she needs for performance, but sadly, had she got one, even an Air with the M2 processor, a $1500 US product, it would only support a single external display.
In 2023.
A single external display on a $1500 Mac. Incredible!
So we had to get an M2 Pro, and of course it still has weird oddities. Her Intel 2017 and Intel 2019 Macs both worked with my high-end dock with the lid closed: just tap the space bar and the Macs woke up in the morning. Not so with the new M series; she has to lift the lid, unplug the dock, wait, plug in the dock, wait, and close the lid each morning.
Then there's this: Apple using DisplayPort logos but not actually complying with DisplayPort specifications https://sebvance.medium.com/ev... [medium.com]
https://www.google.com/search?... [google.com]
(of course Apple stans will defend them with a plethora of silly excuses why this is acceptable)
(and yes if you're going to skim the post, the hardware IS capable of it, the software just decides not to)
Finally there's this
https://developer.apple.com/fo... [apple.com]
You know, industry-standard docks using Thunderbolt / USB-C that cause one particular model of Mac to reboot, despite the fact that Asus, Dell, HP, Lenovo and Huawei machines and even Apple iPads work fine in docks. This one is clearly a driver fault of some kind, easily recreatable, and simply ignored by Apple. It also occurs on monitors with USB-PD / basic hubs in them.
This is the precise stuff that has 'sperglord IT nerds' like myself still whining about Apple, 20 years on. I get it: people like them, some stuff really does just work, and heck, some stuff is outright damn well designed. If you like it, you're not 'wrong'. But doing stuff 'weird' and breaking things / ignoring faults because 'that's not how it's intended' doesn't cut it.