Apple Hardware

Apple Begins Testing Speedy M3 Chips That Could Feature 12 CPU Cores (engadget.com)

Engadget writes: Apple is testing an M3 chipset with a 12-core processor and 18-core GPU, according to Bloomberg's Mark Gurman. In his latest Power On newsletter, Gurman reports a source sent him App Store developer logs that show the chip running on an unannounced MacBook Pro with macOS 14. He speculates the M3 variant Apple is testing is the base-level M3 Pro the company plans to release sometime next year...

[T]he M3 Pro reportedly features 50 percent more CPU cores than its first-generation predecessor.

From Gurman's original article: I'm sure you're wondering: How can Apple possibly fit that many cores on a chip? The answer is the 3-nanometer manufacturing process, which the company will be switching to with its M3 line. That approach allows for higher-density chips, meaning a designer can fit more cores into an already small processor.
  • "[T]he M3 Pro reportedly features 50 percent more CPU cores than its first-generation predecessor."

    Since when are new processors compared to TWO generations prior? And why compare total CPU count when they're very different types of cores?

    We all know why, of course.

    • by Dan East ( 318230 ) on Sunday May 14, 2023 @10:14PM (#63521361) Journal

      Since when are new processors compared to TWO generations prior?

      From TFA:

      If you recall, the M1 Pro and M2 Pro feature eight- and 10-core processors, alongside 14- and 16-core GPUs

      So this is an incremental step with two more CPU cores and two more GPU cores than the M2, just like the M2 was over the M1.

      And why compare total CPU count when they're very different types of cores?

      They aren't. They're the same family / architecture of chips, so a core count is an apples-to-apples comparison.

      We all know why, of course.

      No, I don't know. Are you saying the M3 is not more powerful than the M2 and they're totally trying to mislead the public? According to the article this was a leak obtained via developer logs that reported the core count, so this isn't even something Apple is claiming via marketing. It sounds like the M3 has a max of two more cores than the M2, which had two more cores than the M1. Pretty straightforward stuff here.

      • by AmiMoJo ( 196126 )

        Those cores aren't all the same, though. Some are efficiency cores, some are performance cores.

        We will have to see what the mix of efficiency and performance cores is for M3.
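
        If you want to see the split on your own machine, here is a minimal sketch in C (macOS only; it assumes the hw.perflevel* sysctl names that Apple silicon exposes, and the lookups simply fail elsewhere):

          #include <stdio.h>
          #include <sys/sysctl.h>

          /* Read an integer sysctl, or return -1 if the name is not
           * present (e.g. on an Intel Mac). */
          static int query(const char *name) {
              int value = 0;
              size_t len = sizeof(value);
              if (sysctlbyname(name, &value, &len, NULL, 0) != 0)
                  return -1;
              return value;
          }

          int main(void) {
              /* perflevel0 = performance cores, perflevel1 = efficiency cores */
              printf("P-cores: %d\n", query("hw.perflevel0.physicalcpu"));
              printf("E-cores: %d\n", query("hw.perflevel1.physicalcpu"));
              return 0;
          }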

      • Are you saying the M3 is not more powerful than the M2 and they're totally trying to mislead the public?

        Yes, this is marketing-speak. Is 12 cores better than 10? Is the M3 more "powerful" than the M2? The answer is -- it depends. For some workloads, yes; for some, no. "More powerful" and other marketing metrics of goodness are intentionally murky. The marketing leaves out that dependency and fuzziness, and tacitly suggests that more cores, or the current generation, is better for all potential buyers.
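
        To put a number on "it depends": here is a toy Amdahl's-law sketch in C. The parallel fractions are made up purely for illustration, and the formula is just the textbook S = 1 / ((1 - p) + p / n):

          #include <stdio.h>

          /* Textbook Amdahl's law: speedup on n cores when a fraction p
           * of the work parallelizes. */
          static double speedup(double p, int n) {
              return 1.0 / ((1.0 - p) + p / n);
          }

          int main(void) {
              const double fractions[] = { 0.50, 0.90, 0.99 };
              for (int i = 0; i < 3; i++) {
                  double p = fractions[i];
                  printf("p = %.2f: 10 cores -> %.2fx, 12 cores -> %.2fx\n",
                         p, speedup(p, 10), speedup(p, 12));
              }
              return 0;
          }

        At p = 0.50 the two extra cores buy almost nothing (1.82x vs 1.85x); at p = 0.99 they matter (9.17x vs 10.81x). Which kind of buyer you are is exactly what the marketing glosses over.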

  • by sinij ( 911942 ) on Sunday May 14, 2023 @09:47PM (#63521289)
    I recall reading many years ago that below 5nm, quantum effects make transistors behave nondeterministically. Has this limitation been solved?
    • No, they're just interpreting and reporting the sizes based on the capabilities they think a process of that size would have. i.e. lying.
    • by crow ( 16139 ) on Sunday May 14, 2023 @09:55PM (#63521311) Homepage Journal

      I suspect part of the answer is that the way they measure feature size has changed. I don't think a modern 3nm chip is 3nm in all the same ways that, say, a 90nm chip was years ago when it was state-of-the-art. I don't really understand this myself, as most of the reporting is written by people who also don't understand it, or so it seems. I would love to hear a good explanation from someone who really knows all the relevant details.

      • I also don't know it for certain, but yes, they changed what they measure.

        At super small sizes, you can make a transistor of sorts just by laying out three traces in the right way (no N or P doping), which causes manufacturing problems. My understanding is that this means you can (reliably) make a feature a bit smaller than we've previously made them, but you have to space them out a bit more too. I guess there's a slight density improvement, but not as much as you might read into "4nm to 3nm". [...]

      • Modern chip manufacturing processes are specified as a measure of transistor density - given as if we were still making the same transistors as in those 90nm chips. But we now use different transistor designs which allow for much higher-density chips without improvements to the manufacturing process.

        The quoted 3nm chips will actually have a much larger design node. But does this matter? Probably not, since transistor density is what is important and the quoted value for the design node is supposed [...]

      • Correct. 3nm is a heuristic to convey the effective progress, not to be confused with physical reality when compared to the historical XYnm era. In fact this heuristic started with 22nm, if I'm not mistaken.

    • I recall reading many years ago that below 5nm, quantum effects make transistors behave nondeterministically. Has this limitation been solved?

      No, they just changed the way they measure it for marketing reasons.

      (just like hard disks used to be in real megabytes but aren't any more)

    • Because the nanometer figure for a chip has gone from being the actual length of an entire transistor, to being the length of the depletion zone, then to being the length of the track, and then to being a certain offset accuracy of the UV optics' interference patterns, with nothing to do with the size of anything on the chip.
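
      A quick sanity check on why the label cannot be literal: if "Xnm" really were a linear feature size, a 5nm-to-3nm shrink would nearly triple transistor density, which is well above the logic-density gains foundries actually publish for that transition. A trivial C sketch of the ideal scaling:

        #include <stdio.h>

        int main(void) {
            double from = 5.0, to = 3.0;              /* nominal node names, nm */
            double ideal = (from / to) * (from / to); /* ideal areal scaling */
            printf("ideal density gain 5nm -> 3nm: %.2fx\n", ideal);
            return 0;
        }
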
  • It's easy to make a chip with 12 cores; there are chips available with 64, 80 or even 128 cores these days. The limiting factor is not the ability to produce a chip with many cores, but being able to do so within the power budget.

    • Doing it within the power budget is not a problem. You could have 128 very slow cores that consume only 5W. The challenge is to make X *fast* cores within a given power budget.

    • An additional critical constraint is keeping those cores fed. Memory is slow and having many cores increases the need for larger cache.
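
      A toy model of the tradeoff in the two comments above, in C: dynamic power scales roughly as n * C * V^2 * f, and higher clocks generally need higher voltage. Every constant below is made up purely for illustration:

        #include <stdio.h>

        /* Toy dynamic-power model: P ~ cores * C * V^2 * f. */
        static double power(int cores, double volts, double ghz) {
            const double c = 1.0;  /* arbitrary capacitance constant */
            return cores * c * volts * volts * ghz;
        }

        int main(void) {
            /* hypothetical operating points: slow/low-voltage vs fast/high-voltage */
            printf("128 cores @ 0.6 V, 1.0 GHz: %.1f units\n", power(128, 0.6, 1.0));
            printf("  8 cores @ 1.1 V, 4.0 GHz: %.1f units\n", power(8, 1.1, 4.0));
            return 0;
        }

      In this toy example the 8 fast cores burn almost as much power as the 128 slow ones, which is why the interesting question is fast cores per watt, not cores per die.
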
  • Single Core (Score:4, Insightful)

    by Midnight Thunder ( 17205 ) on Sunday May 14, 2023 @11:36PM (#63521445) Homepage Journal

    More cores are good, but better single-core performance at the same time would be good too.

    • by AmiMoJo ( 196126 )

      Right now is an interesting time for CPUs. We have two competing ideas - efficiency cores and chiplets.

      ARM and by extension Apple use efficiency cores to get better battery life. AMD gets similar battery life with only performance cores, using chiplets. AMD's way seems to be the best at the moment. You get more performance cores and more performance overall, at a lower cost.

      I expect M3 will be like M2 and M1 before it: mid-range performance overall, some halo SKUs that throw cores at a few benchmarks. [...]

      • What AMD systems rival Macs right now? I really would love a good Windows system that has similar battery/performance/heat/form factor to my M1 MacBook Pro.
        • by Dusanyu ( 675778 )
          I would like to know this as well. I am currently using an M2 Mini at school so I don't overheat my dorm in the summer months, because my PC doubles as a space heater.
        • by AmiMoJo ( 196126 )

          ThinkPad Z series?

        • What AMD systems rival Macs right now? I really would love a good Windows system that has similar battery/performance/heat/form factor to my M1 MacBook Pro.

          Curious what you need the Windows machine for? Gaming, or something else?

          The ARM-based Windows-compatible machines are coming, but we will likely need to wait a bit more time. Also, on the x86 side, I am curious how much of a power reduction getting rid of 32-bit backwards compatibility will bring.

          • Some games. Some .NET stuff I fiddle around with that doesn't really work the same in .NET Core. But neither of these would work, I bet, under a proper ARM Windows anyway.

            I'm relatively new to macOS and I bet by the time a suitable Windows replacement is available I won't even care and will have adapted any existing projects/habits to what I can do with my M1 MBP. I do have to admit, I've been surprised how much more keyboard-friendly current Windows is than macOS. It seems like most shortcuts for the Mac [...]
  • How else am I supposed to do all these tasks at the same time on a single-core CPU like the M1?

  • Was I wondering? (Score:3, Insightful)

    by thegarbz ( 1787294 ) on Monday May 15, 2023 @04:44AM (#63521815)

    I mean EPYCs come with 96 cores, so I don't think I was really wondering how a company managed to fit 12 on a CPU. You don't even need some special magic sauce. You just slice the die a little larger.

    Typical Bloomberg reporting.

    • Slashdot used to be a place where people knew the difference between a laptop CPU and a server CPU with a 360W TDP. Is that still true?
      • by leptons ( 891340 )
        Apple will no doubt still claim their M3 is faster than any server CPU ever; the reality distortion field is still in effect. They once (laughably) marketed their computers as "a supercomputer", so it should be obvious that facts about CPU performance don't really make a difference to them.

        > the Power Mac G4 was marketed by Apple as the first "personal supercomputers"

        https://en.wikipedia.org/wiki/... [wikipedia.org]
      • Slashdot used to be a place where people knew the difference between a laptop CPU and a server CPU with a 360W TDP. Is that still true?

        Yes. It's also a place where some people realise that distinction is irrelevant, since TDP is nothing more than a tradeoff, and you seem not to understand what it means to make a facetious comparison.

        But since you want to play the TDP game, note that the M2 with maximum core count has only about 3/4 the TDP of many Intel laptop chips which function just fine in laptops. So ... what's your point? That Apple should have had 18 cores on the M2? Fuck, the Intel P4M 2.8 had an 88W TDP, so Apple should [...]

  • by SciCom Luke ( 2739317 ) on Monday May 15, 2023 @06:19AM (#63521907)
    I am curious. I followed a course on software optimization.
    At some point in this course we profiled software that used an increasing amount of memory, and by monitoring the speed it could clearly be seen when the access was in L1, L2, or L3 cache or main memory. We profiled several AMD, Apple and Intel processors. L1 and L3 cache speed was the same for all processors, but the M processors by Apple had clever engineering in them that made the L2 cache noticeably faster.
    Hopefully it will one day be public knowledge what that trick is.
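
    For anyone who wants to reproduce that kind of profile, here is a minimal pointer-chasing sketch in C (not the course's tool, just the standard technique): it walks a randomly ordered chain over a growing working set, and the ns-per-access figure steps up as the set spills out of L1, then L2, then L3. Sizes and iteration counts are arbitrary:

      #include <stdio.h>
      #include <stdlib.h>
      #include <time.h>

      #define ITERS (1 << 24)

      int main(void) {
          for (size_t kb = 16; kb <= 64 * 1024; kb *= 4) {
              size_t n = kb * 1024 / sizeof(size_t);
              size_t *chain = malloc(n * sizeof(size_t));
              if (!chain) return 1;

              /* Sattolo's algorithm: one big random cycle, so the walk
               * below visits every slot and the prefetcher gets no
               * usable pattern. */
              for (size_t i = 0; i < n; i++) chain[i] = i;
              for (size_t i = n - 1; i > 0; i--) {
                  size_t j = (size_t)rand() % i;
                  size_t tmp = chain[i]; chain[i] = chain[j]; chain[j] = tmp;
              }

              struct timespec t0, t1;
              clock_gettime(CLOCK_MONOTONIC, &t0);
              size_t p = 0;
              for (long i = 0; i < ITERS; i++) p = chain[p];  /* dependent loads */
              clock_gettime(CLOCK_MONOTONIC, &t1);

              double ns = (t1.tv_sec - t0.tv_sec) * 1e9
                        + (t1.tv_nsec - t0.tv_nsec);
              printf("%6zu KiB: %5.2f ns/access (checksum %zu)\n",
                     kb, ns / ITERS, p);
              free(chain);
          }
          return 0;
      }
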
    • by AmiMoJo ( 196126 )

      It's not a trick, they just used more expensive memory for the L2 cache. They had to, because ARM's instruction density is quite low compared to AMD64, so to get similar performance you need massive caches.

  • While I never liked Apple products for the past decade or two, since they launched the M1 it has become exceedingly embarrassing.
    I just tried someone's MacBook Air M1 to see if all the hype is true, and installed Parallels and Win11 on an 8GB RAM MBA M1 - expecting it to be at best just usable for small stuff, like it is on my Windows laptops (with more RAM, though).

    Bloody thing runs macOS plus Parallels and Windows 11 arm64 + Kali Linux ARM way faster than any Windows laptop I have tried. Bought a MacBook a few days [...]

  • I read that the M1 was so good Apple was unable to sell many M2s, because the performance was hardly better.
    M3 is supposed to fix that issue.

    • by jsepeta ( 412566 )

      There are a few things going on which hampered the sales of M2-based Macs.
      1) Pent-up demand for a new chipset made the M1 extremely popular. Sales were wildly successful, more so than expected.
      2) Consumers / corporations spent their budgets on M1s; they will wait a few years for ROI before replacing these new computers with later models.
      3) M2 was not that big of a leap over the M1 performance-wise; this is not simply a hardware issue, because it will take developers some time to get better at writing [...]

      • by jsepeta ( 412566 )

        ALSO:

        As much as I want Apple to upgrade their complete product line every year, chances are they make more money by selling the same model computer over several years. So even if they could produce a new Mx chip every year, it may not make business sense to do so. Spreading out their computer model upgrades over 2 years seems to be the new cycle, as the MacBook will get upgraded more frequently than the Mac Mini or Mac Studio.

        After spending $5000 on an M1 Max MacBook Pro (the storage upgrades are redonkulus [...]

  • by fred6666 ( 4718031 ) on Monday May 15, 2023 @09:09AM (#63522125)

    I wish journalists would stop using that marketing-gimmick term "GPU cores"; it means nothing. GPUs are massively parallel. There are far more than 18 instructions that can be executed at the same time.
    They are trying to make a comparison with CPUs, but it is not a valid one.

    • But the GPU is broken down into "cores"; they look different from CPU cores, but it's a reasonable unit of measure for comparing the M3 to the M2 (for example).
      • They represent nothing physical. Their 16-core GPU could be rebranded as 32 cores or 8 cores if they wanted to.
        The M2 has up to 2560 ALUs, so they could claim any number up to that, but they choose to keep it in the same range as CPU cores for marketing purposes only.

        By comparison, the GeForce RTX 4090 has 16,384 shader processors, but I don't think Nvidia has gone as far as marketing a core number yet.
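
        To put the grandparent's point in numbers (taking the 2560-ALU figure quoted above at face value; the groupings below are hypothetical), the same silicon could be marketed as almost any core count:

          #include <stdio.h>

          int main(void) {
              const int alus = 2560;                /* figure quoted in the thread */
              const int claimed[] = { 8, 16, 32 };  /* hypothetical marketing splits */
              for (int i = 0; i < 3; i++)
                  printf("%2d \"cores\" -> %d ALUs per core\n",
                         claimed[i], alus / claimed[i]);
              return 0;
          }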

  • by AbRASiON ( 589899 ) * on Tuesday May 16, 2023 @06:21PM (#63527019) Journal

    Wife wanted a new Mac for work; the Air is all she needs for performance. But sadly, had she got one, even an Air with the M2 processor, a $1,500 US product, would only support a single external display.
    In 2023.
    A single external display on a $1,500 Mac. Incredible!

    So we had to get an M2 Pro, and of course it still has weird oddities. Her Intel 2017 and Intel 2019 Macs both worked with my high-end dock with the lid closed: just tap the space bar and the Macs woke up in the morning. Not so with the new M series; you have to lift the lid, unplug the dock, wait, plug in the dock, wait, and close the lid each morning.

    Then there's this: Apple using DisplayPort logos but not actually complying with DisplayPort specifications https://sebvance.medium.com/ev... [medium.com]

    https://www.google.com/search?... [google.com]
    (of course Apple stans will defend them with a plethora of silly excuses why this is acceptable)

    (and yes if you're going to skim the post, the hardware IS capable of it, the software just decides not to)

    Finally there's this

    https://developer.apple.com/fo... [apple.com]

    You know, industry-standard docks using Thunderbolt / USB-C which cause one particular model of Mac to reboot, despite the fact that Asus, Dell, HP, Lenovo, Huawei and even Apple iPads work fine in docks. This one is clearly a driver fault of some kind, easily recreatable, and simply ignored by Apple. It also occurs on monitors with USB-PD / basic hubs in them.

    This is the precise stuff that has 'sperglord IT nerds' like myself whining about Apple still, 20 years on. I get it: people like them, some stuff really does just work, and some stuff is outright well designed. If you like it, you're not 'wrong'. But doing stuff 'weird' and breaking things / ignoring faults because 'that's not how it's intended' doesn't cut it.
