Apple's M2 Chip Goes Into Mass Production for Mac (nikkei.com)

The next generation of Mac processors designed by Apple entered mass production this month, Nikkei Asia reported Tuesday, citing sources, bringing the U.S. tech giant one step closer to its goal of replacing Intel-designed central processing units with its own. From the report: Shipments of the new chipset -- tentatively known as the M2, after Apple's current M1 processor -- could begin as early as July for use in MacBooks that are scheduled to go on sale in the second half of this year, the people said. The new chipset is produced by key Apple supplier Taiwan Semiconductor Manufacturing Co., the world's largest contract chipmaker, using the latest semiconductor production technology, known as 5-nanometer plus, or N5P. Producing such advanced chipsets takes at least three months. The start of mass production came as Apple introduced new iMac and iPad Pro models using the M1. The company said the M1 offers CPU performance up to 85% faster than an iMac using an Intel chipset, and graphics performance that is twice as fast.
  • The M1 is the only Apple product that interests me so far. I hope the M2 will be competitive as well and will push Intel/AMD to create better chips. Competition is good.

    Too bad the chip can only be used in an Apple computer/phone/tablet, and with an Apple operating system.

    • Re: (Score:2, Interesting)

      by Guspaz ( 556486 )

      You'll be able to run Windows directly on it too, if Microsoft opens up the Win10 ARM licensing. Right now you're limited to running Windows on the M1 via a virtual machine, but Apple is on record saying that they'd do Bootcamp for ARM if Microsoft wanted to license it for that.

      Even then, though, there's still going to be the question of Apple's x86 emulator versus Microsoft's x86 emulator... Apple did some minor instruction set extensions on the M1 to accelerate emulation (IIRC to facilitate x86-style memo

      • I don't see why Microsoft would want to do that, just as Apple is not going to license macOS to others.
        It would just drive people away from the Microsoft world.

        • Re: M1 (Score:3, Insightful)

          Microsoft sells software. They don't care what hardware you buy.

          Apple sells hardware. They care about the software insofar as it makes you buy more hardware.

          • Of course but given the limited number of licenses they are going to sell, and the fact that 100% of those people would have bought Apple hardware and software, I don't think it's a very good strategy. Every Mac sale is a loss for Microsoft and they know it.

              Every Mac sale is an opportunity to sell an Office 365 license. I'm not sure the OEM Windows 10 license is worth that much.

              And every corporate computer sale -- whether PC or Mac -- is another Azure AD or Windows CAL license

                Every Mac sale is an opportunity to sell an Office 365 license.

                Office 365 runs natively on MacOS-M1. So there is no need for running Windows-on-Mac for that.

                MacOS is a different ecosystem with its own office productivity apps. Most MacOS users are not going to use Office 365.

                Every computer user who abandons Windows for MacOS is a loss for Microsoft.

                And every corporate computer sale -- whether PC or Mac -- is another Azure AD or Windows CAL license

                I doubt very much if MacOS users are likely to use Azure or buy Windows CAL licenses when iCloud "just works" with all their apps.

                • by Malc ( 1751 )

                  Every computer user who abandons Windows for MacOS is a loss for Microsoft.

                  This is not completely true. As a cross-platform software developer, I appreciate being able to run Windows and Visual Studio inside a VM on my Mac, or even natively occasionally. They're getting plenty of money from me via MSDN. As the sweaty bald man once said: developers, developers, developers!

                • I have a MacBook Pro.

                  Yes, I like Apple's apps (Pages, Numbers, Keynote).

                  I also have Office 365 because that is preferred by most employers. Most aren't running Pages or Numbers.

                  I run Windows via Parallels for development.

                  Being able to switch between MacOS and Windows seamlessly would be a big thing.

                  What I had hoped to read in the article is the expected performance gains of the M2 over the M1 and over Intel. That's the selling point.

                • by imgod2u ( 812837 )

                  Every Mac sale is an opportunity to sell an Office 365 license.

                  Office 365 runs natively on MacOS-M1. So there is no need for running Windows-on-Mac for that.

                  MacOS is a different ecosystem with its own office productivity apps. Most MacOS users are not going to use Office 365.

                  Every computer user who abandons Windows for MacOS is a loss for Microsoft.

                  I question this assertion. Do you have data behind this? Most places I've worked at (that weren't Apple) used Office by default and mixed Windows and Mac machines. The default office suite was MS Office.

                • by Reeses ( 5069 )

                  Office 365 runs natively on MacOS-M1. So there is no need for running Windows-on-Mac for that.

                  Except they're not the exact same product. Excel on Windows has a completely different set of capabilities from the Mac version.

                • Every Mac sale is an opportunity to sell an Office 365 license.

                  Office 365 runs natively on MacOS-M1. So there is no need for running Windows-on-Mac for that.

                  MacOS is a different ecosystem with its own office productivity apps. Most MacOS users are not going to use Office 365.

                  Every computer user who abandons Windows for MacOS is a loss for Microsoft.

                  And every corporate computer sale -- whether PC or Mac -- is another Azure AD or Windows CAL license

                  I doubt very much if MacOS users are likely to use Azure or buy Windows CAL licenses when iCloud "just works" with all their apps.

                  Microsoft Office is the de facto standard for business productivity software, and Microsoft have put in a huge amount of effort into making sure that the Mac version can do nearly everything that the Windows version can do.
                  I support hundreds of Macs across my client base. The vast majority of people using them have a Microsoft 365 Business Standard licence - which covers email, Azure AD and Microsoft Office. No one uses Apple's Pages and Numbers apps in a professional situation. Keynote is different, it use

        • Microsoft has a strategy of running its software agnostically on everything. Apple has a strategy of selling an appliance, not an operating system (as Microsoft does). Selling an OS to install in a virtual machine is Microsoft's business model. I don't think there is even a price for macOS anymore.
          • In an x86 virtual machine, yes. ARM? Not so much. They might do it at some point, but not just for Apple's machines.

        • Apple doesn't want to license macOS to others because it subsidizes the OS with their hardware.

          Microsoft doesn't use hardware pricing to subsidize their OS, and would like to sell more copies of their OS. However, server licenses make them a lot more money than consumer devices do, and Apple's devices are an even smaller niche. So Microsoft is probably going to wait and see if there ends up being enough of a market to make it worth their while.

        • by cfalcon ( 779563 )

          > I don't see why Microsoft would want to do that

          Are you confusing Microsoft with Intel?

          For the same reason Microsoft makes Windows work on x86 Mac hardware, and in the past has made Windows work on other chip types. Same reason they already support other ARM chips:
          https://support.microsoft.com/... [microsoft.com]

          Microsoft would love Windows to work flawlessly on ALL hardware, from your watch to your server.

          The real question is why won't Apple sell MacOS? The answer here is because Apple wants you to buy their hard

          • The number of people ready to purchase Windows on ARM to run in a VM on macOS is way too small for Microsoft to choose to sell Windows for ARM.

      • by amp001 ( 948513 )
        It's not about memory mapping. Apple supports a strongly-ordered mode to make executing translated x86 instructions easier (emulating x86 on weakly-ordered machines requires more instructions to be generated, so Microsoft's emulator has to live with that constraint).
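        To make the ordering point concrete, here is a minimal C++ sketch of the pattern a translator has to preserve. It is purely illustrative and is not Apple's or Microsoft's actual emulator code; the point is only that x86's total store order comes for free on x86 but must be reproduced with acquire/release operations (or barriers, or a hardware TSO mode) on a weakly-ordered ARM core.

```cpp
#include <atomic>
#include <cstdio>
#include <thread>

// On x86 (total store order), plain stores become visible in program order,
// so this producer/consumer pattern is safe with ordinary mov instructions.
// A binary translator targeting a weakly-ordered ARM core has to strengthen
// each translated access (e.g. stlr/ldar or dmb barriers) to keep the same
// guarantee -- unless the CPU offers a hardware TSO mode it can switch on.
std::atomic<int> data{0};
std::atomic<int> ready{0};

void producer() {
    data.store(42, std::memory_order_release);   // a plain store on x86
    ready.store(1, std::memory_order_release);   // a plain store on x86
}

void consumer() {
    while (ready.load(std::memory_order_acquire) == 0) {}   // a plain load on x86
    std::printf("data = %d\n", data.load(std::memory_order_acquire));
}

int main() {
    std::thread t1(producer), t2(consumer);
    t1.join();
    t2.join();
    return 0;
}
```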
      • Windows without its backwards compatibility isn't Windows at all.

        • by Guspaz ( 556486 )

          It still has its backwards compatibility, the same way macOS does: x86 emulation. The performance hit in the case of macOS has turned out to be relatively minimal, though Microsoft doesn't currently have the CPU helping them out like Apple does.

    • I hope the M2 will be competitive as well and will push Intel/AMD to create better chips.

      It won't, specifically because Intel and AMD are focused on the server market, where the real money is and always has been. The consumer market (desktop/laptop CPUs) has razor-thin margins to prevent competition from getting a foothold and becoming a threat.

      • I hope the M2 will be competitive as well and will push Intel/AMD to create better chips.

        It won't, specifically because Intel and AMD are focused on the server market, where the real money is and always has been. The consumer market (desktop/laptop CPUs) has razor-thin margins to prevent competition from getting a foothold and becoming a threat.

        Wouldn't a datacenter dissipating 70% less heat (i.e. 80% less power overall) be much cheaper to operate?
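        A back-of-the-envelope sketch of that operating-cost argument. All numbers below (power draw, electricity price, the 80% reduction) are assumptions chosen purely to show the arithmetic, not measurements of any real hardware:

```cpp
#include <cstdio>

// Hypothetical numbers, purely to illustrate the operating-cost argument:
// a rack drawing 10 kW vs. one doing the same work at 2 kW (80% less power),
// billed at $0.12/kWh, running around the clock for a year.
int main() {
    const double hours_per_year = 24 * 365.0;
    const double price_per_kwh  = 0.12;   // assumed electricity price
    const double rack_x86_kw    = 10.0;   // assumed baseline draw
    const double rack_lp_kw     = 2.0;    // assumed 80% reduction

    const double cost_x86 = rack_x86_kw * hours_per_year * price_per_kwh;
    const double cost_lp  = rack_lp_kw  * hours_per_year * price_per_kwh;

    std::printf("baseline rack: $%.0f/year, low-power rack: $%.0f/year, saving $%.0f/year\n",
                cost_x86, cost_lp, cost_x86 - cost_lp);
    // Cooling cost roughly tracks the heat dumped, so the real saving
    // is larger than the electricity bill alone.
    return 0;
}
```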

        • Data centers need more than 16GB of RAM.
          • Data centers need more than 16GB of RAM.

            More RAM is a highly likely upgrade for successive "M" chips.

          • Data centers need more than 16GB of RAM.

            The original proposition called for Intel/AMD to create better chips. You're arguing they focus on servers.
            Wouldn't they have a huge incentive to create better chips optimised for server memory requirements?

        • Wouldn't a datacenter dissipating 70% less heat (i.e.80% less power overall) be much cheaper to operate?

          If that's how it worked, then every data-center owner would be clamoring for these chips, but it's not that simple. However, there is the small matter of Apple not making servers, which prevents any consideration at all.

        • by cfalcon ( 779563 )

          > Wouldn't a datacenter dissipating 70% less heat (i.e.80% less power overall) be much cheaper to operate?

          Sure would! Maybe one day Apple will beat AMD on that metric, in the same way that they have barely inched past Intel on Performance per Watt. Of course, they beat Intel on that metric by having an absolutely laughable total power per chip that isn't able to scale for shit, running on an absolutely superior process that isn't theirs (this also helps AMD).

          ARM chips are finally getting to the point w

      • Intel and AMD are focused on the server market, where the real money is ...

        Any niche Apple serves likely involves real money too. They are not known for serving the mass commodity market with low profit margins.

        • Intel and AMD are focused on the server market, where the real money is ...

          Any niche Apple serves likely involves real money too.

          Apple got out of the server market long ago. As such, they are not going to be competing with AMD/Intel who are in the server market.

    • The M1 is the only Apple product interesting me so far. I hope the M2 will be competitive as well ...

      They could accomplish that with little more than 32GB and 64GB support. Additional architectural improvements would be a bonus.

    • This is much less of an issue today than it was 20 years ago.

      Most of what you do on your computer for "necessary" work is done via the web today. And since most code today isn't written in assembly language any more, it is now just a recompile away for whatever CPU you are using, so going from Intel to the M1 or M2 isn't as big a deal as it used to be (see the sketch at the end of this comment).

      While I would kinda wish that I could just build an M1- or M2-based computer without having to go full Apple, in reality what is going to hap
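      A minimal sketch of the "just a recompile away" point: a trivial, architecture-neutral C++ file, with the standard Apple clang invocation for a fat binary given in the comments. The file name is arbitrary, and the flags are the ordinary universal-binary workflow on macOS, nothing specific to the M1 or M2.

```cpp
// hello.cpp -- nothing architecture-specific in here, so targeting
// x86_64 or arm64 is purely a build-flag decision.
//
// On macOS with Apple clang, a single "fat" binary covering both Intel
// and Apple silicon Macs can be built with:
//   clang++ -arch x86_64 -arch arm64 -o hello hello.cpp
// and verified with:
//   lipo -info hello
#include <cstdio>

int main() {
#if defined(__aarch64__) || defined(__arm64__)
    std::printf("Running the arm64 slice\n");
#elif defined(__x86_64__)
    std::printf("Running the x86_64 slice\n");
#else
    std::printf("Running on some other architecture\n");
#endif
    return 0;
}
```

      Rosetta 2 then covers the cases where a recompile isn't available.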

    • Are you kidding? Slashdot is significantly less technical than it used to be; so you don't hear about these things here that much anymore. But there is still a decently-sized cadre of geekdom out there devoted to porting Linux to everything under the sun; if for no other reason than: "For the lulz." Remember Doom running in a DOS emulator running on Linux running from an SD card on digital cameras; and playing it on the camera's viewfinder screen using the menu buttons to move and the shutter button to s

  • 32 GB of RAM? (Score:5, Insightful)

    by javacowboy ( 222023 ) on Tuesday April 27, 2021 @10:10AM (#61319546)

    Will it support 32 GB of RAM?

    Yes, the chip is a technical marvel, but the limit of 16 GB has been an impediment for those who need to do heavy-duty work such as running multiple VMs, Docker containers, IDEs, etc.

    • Agreed. My personal cheap desktop computer from 2012 has 16 GB RAM. My professional laptop has 32. Time to move on.

      • by maynard ( 3337 )

        I run 64GB of RAM with a 16GB video card and I already want to upgrade to double the cores, double the RAM, and get a 24GB RTX 3090. Apple pretends they're selling these machines for Logic and Final Cut Pro, but there's no way 16GB shared is enough to cut 4K raw with any real color grades or heavy compositing (rough arithmetic at the end of this comment).

        I don't know who they plan to sell these machines to, but I can say with certainty it won't be people making film, video, or doing 3d modeling and animation. That market is dead to Apple with these mac
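        The rough arithmetic behind that claim, using invented but plausible numbers: uncompressed 16-bit RGBA working frames in a DCI 4K timeline. Real editors use compressed and tiled caches, so treat this as an upper bound, not a statement about any particular codec or application.

```cpp
#include <cstdio>

// Rough illustration of why editing 4K source eats unified memory quickly.
// Assumes uncompressed 16-bit RGBA working frames in a DCI 4K timeline.
int main() {
    const double width = 4096, height = 2160;
    const double bytes_per_pixel = 4 * 2;            // RGBA, 16 bits per channel
    const double frame_mb = width * height * bytes_per_pixel / (1024.0 * 1024.0);
    const double fps = 24.0;
    const double seconds_cached = 10.0;               // a modest preview cache

    std::printf("one frame   : %.1f MB\n", frame_mb);
    std::printf("10 s cache  : %.1f GB\n", frame_mb * fps * seconds_cached / 1024.0);
    // Roughly 67 MB per frame and about 16 GB for ten seconds of cached frames,
    // before the OS, the app, GPU textures and compositing layers get their share.
    return 0;
}
```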

        • The iMac has never been a video editing powerhouse. You need the expensive Mac Pro for that.

        • You're forgetting that as powerful as the M1 is, it is the absolute low-end of Apple's CPU roadmap. That's why it was released in the MacBook Air, the low-end MacBook Pro 13-inch and the Mac mini.
          You are not the target market for the M1.
          There will be a faster, more capable processor, whether it's called the M1X, or the M2, or whatever, that is targeted at professional usage. It will support more RAM and will have more GPU power, or allow the use of a separate GPU.

    • Guess what? (Score:2, Insightful)

      This product isn't for you.

    • by xjerky ( 128399 )

      Using ARM Macs for containers is kind of pointless if the production servers that they would be running on are Intel.

    • Re:32 GB of RAM? (Score:4, Interesting)

      by amp001 ( 948513 ) on Tuesday April 27, 2021 @10:46AM (#61319726)
      I certainly hope so. There are two routes to supporting more than 16GB of RAM. The simplest is using higher-density parts for the on-package RAM. We might see that in a later variant of the M1. But the alternative is something Apple filed a patent for a while back: populating the on-package RAM with lower-density parts and treating it as an L4 cache backed by off-package RAM (which could be soldered to the board in something like a MacBook Pro, or slotted in something like a bigger iMac or Mac Pro). I am hoping the M2 includes this option.
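      To see why the cache route is attractive, here is the usual average-memory-access-time arithmetic. The latencies and hit rates are made up purely for illustration and are not Apple figures:

```cpp
#include <cstdio>

// Average memory access time (AMAT) for a hypothetical setup where the
// on-package RAM acts as an L4 cache in front of slower off-package DRAM.
// All latencies and hit rates below are invented for illustration only.
int main() {
    const double l4_latency_ns   = 40.0;    // assumed on-package RAM latency
    const double dram_latency_ns = 100.0;   // assumed off-package DRAM latency
    for (double hit_rate = 0.80; hit_rate <= 1.0001; hit_rate += 0.05) {
        const double amat = hit_rate * l4_latency_ns
                          + (1.0 - hit_rate) * (l4_latency_ns + dram_latency_ns);
        std::printf("L4 hit rate %.0f%% -> effective latency %.1f ns\n",
                    hit_rate * 100.0, amat);
    }
    // With a decent hit rate the system keeps most of the latency benefit of the
    // on-package RAM while the off-package pool supplies the capacity.
    return 0;
}
```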
      • by Entrope ( 68843 )

        There is also an older, boring, but not patented alternative arrangement: Non-Uniform Memory Access. Treat both the on- and off-package RAM as independent system memory, but place data in one or the other based on how latency-sensitive it is expected to be.
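        As a sketch of what that placement decision looks like in software, here is how it is made explicit on a Linux box with libnuma (build with -lnuma). The node numbering (0 = fast/on-package, 1 = large/off-package) is an assumption for illustration, and nothing here is specific to Apple hardware:

```cpp
#include <cstdio>
#include <cstring>
#include <numa.h>   // Linux libnuma; link with -lnuma

// Sketch of the NUMA approach: expose both memory pools as separate nodes
// and let software place latency-sensitive data on the fast one.
int main() {
    if (numa_available() < 0) {
        std::fprintf(stderr, "no NUMA support on this system\n");
        return 1;
    }
    const size_t hot_bytes  = 64u << 20;    // latency-sensitive working set
    const size_t cold_bytes = 512u << 20;   // bulk data that tolerates latency

    void *hot  = numa_alloc_onnode(hot_bytes, 0);    // place on the fast node
    void *cold = numa_alloc_onnode(cold_bytes, 1);   // place on the large node
    if (!hot || !cold) {
        std::fprintf(stderr, "allocation failed\n");
        return 1;
    }
    std::memset(hot, 0, hot_bytes);    // touch pages so they are actually placed
    std::memset(cold, 0, cold_bytes);

    numa_free(hot, hot_bytes);
    numa_free(cold, cold_bytes);
    return 0;
}
```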

        • by imgod2u ( 812837 )

          NUMA works in certain specific HPC workloads, but general-purpose OSes don't have the insight into app behavior to realistically be able to schedule well in NUMA configurations. Caching turns out to work a lot better, which is why the big-iron Power and Xeon parts don't go the NUMA route for this and instead use on-package DRAM chips as caches.

    • Computers usually run short of RAM before the processor becomes otherwise outdated. RAM is not expensive (relative to a premium product), but the less you can opt for, the sooner your purchase becomes obsolete. 16GB is fine for a Facebook machine but otherwise crippling. Remember, Apple gear is theoretically high-end professional equipment.

      Limiting max RAM in no possible way benefits customers but small quantities of soldered RAM ensure they'll treat their machine like a disposable phone. I'm not off

      • RAM is not expensive (relative to a premium product)...

        Really?

        Just the memory upgrade on a new MBP is $800. That's 25% of the total cost of the fucking hardware.

        You'd have to be smoking the iWeed to believe that line when it comes to Apple.

        • by tsa ( 15680 ) on Tuesday April 27, 2021 @12:46PM (#61320288) Homepage

          That is what you paid for it, but not nearly what it is worth.

        • I don't buy Apple hardware because that choice is severely limiting for my use case, but for someone making tech money $800 is background noise.

          Apple sales are PROOF it's background noise.

          While my choices are thriftier, I recognize other use cases exist. For someone making $100/hr, eight hours' pay isn't much for a tool or a toy. For someone living at poverty level it's considerable, but poor people don't need iGadgets because (for them) they're just Fecesbook machines, but ya don't become poor by making wise

    • There are ways Apple could get around the RAM die limit. One way is to have internal RAM and external RAM, like the antediluvian IBM mainframes did, where when the pressure on internal RAM went over a threshold, it would move pages to external RAM. Another technique is using the internal RAM as a large cache, but this would require more total RAM.

      I hope they can either get more RAM on the die, or allow for on-board RAM. Shipping memory starved units may be OK for the first round of M1s, but for a lot of

    • Will it support 32 GB of RAM?

      Yes, the chip is a technical marvel, but the limit of 16 GB has been an impediment for those who need to do heavy-duty work such as running multiple VMs, Docker containers, IDEs, etc.

      I can configure a new MBP with up to 64GB of RAM today. Perhaps this was a simple oversight?

      (With a 32GB memory upgrade running $400, I wouldn't be surprised if your wallet had a fucking stroke before you even got to that upgrade step.)

    • Yes, the chip is a technical marvel, but the limit of 16 GB has been an impediment for those who need to do heavy-duty work such as running multiple VMs, Docker containers, IDEs, etc.

      Or, for that matter, for those who like to hold onto a computer for more than two or three years.

      Oh, wait - have they solved the SSD thrashing issue? If not, there's only a couple years' worth of use in one of those machines.

      (I'm not planning to give up my 2015 MacBook Pro quite yet)

    • Well, part of the reason (not all, but part) is precisely that it only supports 16GB of RAM. The use of HBM lowers memory latency, increases bandwidth and, importantly, drastically cuts power consumption, because those high-bandwidth off-board buses take a lot of power to drive.

    • by antdude ( 79039 )

      VMs? Still no VirtualBox for M1 Macs. :(

  • Wish they had gone with RISC-V. It's easy to see where Apple is heading: eventually they will make ARM-incompatible instruction sets. Why wouldn't they? Developers would get mad? What if it supports everything ARM does, plus their own extensions? Who benefits from that? Certainly not the consumer.

    • Exactly why would the ***Apple*** customer not benefit if they implement extra instructions over generic ARM?
    • Re:RISC V (Score:4, Informative)

      by algaeman ( 600564 ) on Tuesday April 27, 2021 @10:47AM (#61319732)
      If Apple is going to ditch Intel, they need a performance chip for desktops. The RISC-V architecture offers better performance per watt than ARM, but it is not anywhere close to the raw performance of ARM at the moment. Also, nobody is building real desktops for RISC-V right now, because they would then have to pay for all the peripheral components, which are included in the SoC/chipset of a mature platform like ARM/Intel/AMD.
      • Re:RISC V (Score:5, Insightful)

        by UnknowingFool ( 672806 ) on Tuesday April 27, 2021 @11:34AM (#61319962)
        Also Apple has more than a decade of experience designing with ARM. It would take Apple years and years to design a RISC V chip that would be acceptable to use in their products.
        • It would take weeks or months, not years. The RISC-V ISA is extremely similar to MIPS, and relatively close to ARM. They would only need to modify the decoder on the front end; note that they design their whole core (they do not license the core IP).

          The current problem with RISC-V is that it lacks a mature software ecosystem. That is what Apple pays for with their architectural license of ARM.

          • It would not take weeks/months for Apple to design a RISC-V chip for an Apple product considering it takes Apple a year for every new iteration of their ARM chip. And Apple has a decade of experience with ARM.
            • You are referring to a completely new iteration of the processor (I agree on that). I mean that the effort to only replace the instruction decoder in the front end is not that large.

              • As I said above: "It would take Apple years and years to design a RISC V chip that would be acceptable to use in their products." I did not say it would take years for Apple to make any RISC-V chip.
    • Why? What is the benefit of RISC-V now? My understanding is that RISC-V is years behind ARM. While RISC-V, being open source, can be improved eventually, it would need a lot of work and time to optimize it for Apple's needs.
      • RISC-V and ARM are both moving up the food chain. You see a lot of traction growing for RISC-V in the embedded space, and ARM is moving into the desktop and higher-end space, pushed by Apple. Apple did the world a favour by showing that it is possible to get serious performance from ARM and start to give x86 a run for its money.

        RISC-V is getting a lot of love from Chinese embedded chipmakers because you don't need an ARM license, which makes the chips much cheaper. $30 can buy you a dual-core 400MHz RISC-V with K

        • My view is that RISC-V has a lot of potential but does not yet have the maturity Apple would need for a product. For example, any iPhone chip today contains custom hardware like encryption, audio/video encoding/decoding, etc. While Apple could build that eventually, it would take time. With Nvidia's proposed purchase of ARM, Apple, like others, may be looking at migrating to RISC-V as a contingency.
    • eventually they will make ARM incompatible instruction sets.

      Apple has already done this. The M1 has Apple-specific instructions to boost the performance of x86 emulation.

      Developers would get mad?

      Developers don't care. Few even know. Unless you are a compiler developer, it doesn't affect you.

      Certainly not the consumer.

      The consumer certainly benefits from better performance and better emulation.

      So what is the downside? None that I can see. Apple apps only run on Apple hardware, so compatibility elsewhere isn't an issue.

      • by _merlin ( 160982 )

        The M1 has Apple-specific instructions to boost the performance of x86 emulation

        Does it actually have additional instructions? I thought the feature for helping x86 emulation was support for switching the memory coherency model to total store order.

        • Does it actually have additional instructions?

          I believe it does. It implements AMX instructions that are not part of the ARM ISA to do matrix operations.

          AMX: Apple Matrix coprocessor [medium.com]

          The instructions are not documented. They were discovered by reverse engineering.

    • ARM is the obvious choice for Apple.

      (1) Apple has a perpetual ARM license, (2) Apple has ARM design experience since they co-designed the ARM6 together with Acorn (who originally designed ARM in 1985) and VLSI Technology (now NXP Semiconductors) back in 1991, and (3) ARM is a more mature platform than RISC V.

      In short: they already have it, they know it through and through, and it is less work than the alternative. For other companies which do not have the first two advantages, the extra work might be wo
  • The color iMacs aren't even pre-orderable yet and they are producing parts for generation 2 already. Reminds me of when they announced the fourth-gen iPad less than six months after the third gen.
    • The color iMacs aren’t even pre-orderable yet and they are producing parts for generation 2 already.

      Do you know how logistics work? The color iMacs are available in a few days, meaning hundreds of thousands of units may have been shipped from China and were manufactured months ago to build inventory ahead of a launch. The M1 began chip production this time last year. This is how far in advance companies have to manufacture things.

      Reminds me of when they announced the fourth-gen iPad less than six months after the third gen.

      Apple announcing updates to products multiple times a year . . . [sarcasm] How shocking. Neither Apple nor anyone else in the industry ever does this. [/sarcasm] Most likely, the M2 will no

  • That thing came out 6 weeks ago; throw it in the garbage and get with the times!
  • Until I can launch VirtualBox with VMs for Windows and Linux, it won't matter if they have an M99 chip...

  • Or is this just desktop-level, with a GPU that is OK but can't really game at 4K-8K?

"The following is not for the weak of heart or Fundamentalists." -- Dave Barry

Working...