Apple Hardware

Apple Introduces M1 Chip To Power Its New Arm-Based Macs (theverge.com) 155

Apple has introduced the new M1 chip that will power its new generation of Arm-based Macs. It's a 5nm processor, just like the A14 Bionic powering its latest iPhones. From a report: Apple says the new processor will focus on combining power efficiency with performance. It has an eight-core CPU, which Apple says offers the world's best performance per watt of any CPU. Apple says it delivers the same peak performance as a typical laptop CPU at a quarter of the power draw. It says this has four of the world's fastest CPU cores, paired with four high-efficiency cores. It pairs this with up to an eight-core GPU, which Apple claims offers the world's fastest integrated graphics, and a 16-core Neural Engine. In addition, the M1 processor has a unified memory architecture, a USB 4 controller, media encode and decode engines, and a host of security features. These include hardware-verified secure boot, encryption, and run-time protections.
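Apple's "same peak performance at a quarter of the power" claim translates directly into a performance-per-watt figure. A quick sketch of the arithmetic; the absolute wattage below is a made-up illustration, not a number Apple published:

```python
# Illustrative perf-per-watt comparison based on Apple's claim that the M1
# matches a typical laptop CPU's peak performance at a quarter of the power.
# The absolute numbers are invented for the example.

typical_perf = 100.0           # arbitrary performance units
typical_watts = 40.0           # hypothetical laptop CPU package power

m1_perf = typical_perf         # "same peak performance"
m1_watts = typical_watts / 4   # "a quarter of the power draw"

typical_ppw = typical_perf / typical_watts
m1_ppw = m1_perf / m1_watts

print(m1_ppw / typical_ppw)    # 4.0: the claim implies 4x performance per watt
```

Whatever the real units, the ratio is what the claim pins down: quarter power at equal performance means a 4x perf-per-watt advantage.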
This discussion has been archived. No new comments can be posted.

  • by jay age ( 757446 ) on Tuesday November 10, 2020 @01:50PM (#60708404)

There was no hard data anywhere during the whole Apple event. Which Windows PCs they were comparing to was never mentioned, nor were any benchmark results.

    I'm curious what reviews will show, but for now there are many reasons to be sceptical.

    • by dfghjk ( 711126 )

I can see the lack of hard data being A reason to be skeptical, but what might the others of the "many reasons to be sceptical" be?

    • That's certainly true, but even hard data can turn out to be a lot of bullshit after someone figures out how it was constructed to be misleading. Apple usually doesn't release any hard data during their presentations and it's not really their style to do so. If you've ever seen one of their presentations for a new phone or tablet it's the same style of X amount faster and Y additional hours of battery life. Once the tech sites get their hands on a device and release benchmarks, it turns out that Apple's cla
      • by dfghjk ( 711126 )

Agreed, plus while one can argue that Apple has used optimistic spin in the past, they can't argue that Apple will suddenly be far more "optimistic" now. Also, Apple's not claiming superior performance against every x86 CPU; they're talking about low-TDP mobile CPUs. That has been their focus for quite some time and there's no reason to believe they lack expertise.

    • There was plenty of hard data. "Faster than 98% of all laptops sold", and between "up to 3.5 times faster" and "up to 2.8 times faster" than previous quad core models.
      • by vux984 ( 928602 )

        "There was plenty of hard data. "Faster than 98% of all laptops sold"

Whoop-de-fucking-doo; that's marketing garbage right there. They COST more than 98% of all laptops sold too. The MacBook Air, their cheapest laptop, STARTS at $1000.

The Dell XPS line starts at $1000 too, although you can get them for a little less on sale.

But Dell sells a lot of Inspirons, Vostros, and Latitudes for less.

So big fucking deal... laptops more expensive than what most people buy are faster than what most people buy.

Next up: $200,000 sports cars handle better than 98% of cars on the road.

        "up to 3.5 t

    • They were certainly playing with statistics to make themselves look better than they are. How many HP Streams get sold in a year, for example? Probably a lot more of those than LG Gram laptops.

There was no hard data anywhere during the whole Apple event. Which Windows PCs they were comparing to was never mentioned

They were pretty clear: PC laptops in the same class of machine as the Air and Pro, PC desktops in the same class as the mini. Class being a factor of price, size, consumer/student vs. pro focus (which translates to CPU and RAM choices, among other things), etc.

Sure, you and I can build our own monster desktop from parts, high-end CPUs and GPUs, etc. But that is not the sort of thing most people buy; it's the orange being compared against the apple.

    • There were no hard data anywhere during the whole Apple event. What Windows PCs they were comparing too were never mentioned, nor any benchmark results.

      Not sure, but they were using an EGA monitor if that helps.

  • Definitely a lot of marketing going on here. And it's hilarious that you can buy a $700 Mac Mini to drive a $7000 Pro Display XDR. But it's been a long time since there was anything interesting happening in the desktop CPU market. If they can deliver on their performance and battery life claims, it really does represent a huge change in the laptop space.

    • by AmiMoJo ( 196126 )

      Are you kidding? AMD just released Ryzen 5000 and it's destroying Intel. That's pretty interesting.

      • That's fairly interesting, it's true. (Also, hooray for AMD—I'll always have a soft spot for them.) But assuming that Apple's graphs represent anything approaching reality, they're showing much bigger year-over-year gains than we've seen in a long time. I don't know how Apple is going to scale these processors up—surely they can't keep putting ALL the memory on the SOC; 16GB is one thing, but what if you want 64GB or 512GB or 12TB like on the Mac Pro?—but I guess that's what I mean by *int

  • Comment removed based on user account deletion
  • by Improv ( 2467 ) <pgunn01@gmail.com> on Tuesday November 10, 2020 @02:05PM (#60708464) Homepage Journal

It'd be great to see ARM doing well, even if it's Apple doing it. The better battery life will be exciting, and when this hits the desktop they may be able to build massively parallel machines. Integrated graphics is unfortunate; I hope they offer something better soon. Also, their initial systems don't have much in the way of RAM. This should change over time, but a better launch would include something with 32G or 64G as higher-end options.

    • I don't see the battery life being that big of an issue. x86-64 has made huge strides in the past few years. You can easily find a laptop that lasts over 10 hours, which is more than enough for a lot of people. Most people don't need a laptop that does 17 hours away from an electrical socket. Sure, if the performance and price are on par with an x86-64 chip, then more power is welcome, but for most people, the battery life won't be the first thing they look at.

      • by amorsen ( 7485 )

        You can easily find a laptop that lasts over 10 hours, which is more than enough for a lot of people.

        Can you though?

        I have not yet found a laptop that can last more than 5 hours for my work use, and most of my work is done in a terminal window. My current laptop hovers around 10-15W, according to powertop. 7W just for the display backlight at half brightness.
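The parent's numbers roughly check out. Assuming a typical 50 Wh battery pack (a capacity chosen for illustration; the comment doesn't state one), a 10-15 W draw gives about three to five hours:

```python
# Back-of-envelope battery-life check for the parent's powertop figures.
# The 50 Wh pack capacity is an assumed value for illustration only.

battery_wh = 50.0                    # assumed battery capacity
draw_low, draw_high = 10.0, 15.0     # watts, as reported by powertop

best_case = battery_wh / draw_low    # hours at the lower draw
worst_case = battery_wh / draw_high  # hours at the higher draw

print(round(best_case, 1), round(worst_case, 1))  # 5.0 3.3
```

Which lines up with the "can't last more than 5 hours" experience, and shows why a fixed 7 W backlight cost puts a hard ceiling on what any CPU efficiency gain can deliver.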

      • by tlhIngan ( 30335 )

        You can easily find a laptop that lasts over 10 hours, which is more than enough for a lot of people.

        The laptops I see with over 10 hours of battery life usually have a big ass battery strapped to them that triples the weight and doubles the size. Not exactly something people would love to lug around for all day computing.

After all, if they need that sort of endurance, they usually aren't near places to charge; otherwise they'd be using the AC adapters far more to extend battery life.

I'd imagine there'll be better RAM configurations in higher-spec Macs. The 13" is entry level. I'd be surprised if it's not at least 32 in the 15" upwards.

I'm more curious to see how the GPU performs. If this is the end of discrete GPUs in MacBook Pros then it had better be bloody good and not just better than Intel integrated video.

      • Bloody Slashdot. Shall we expect UTF-8 support by 2030?

        • by tlhIngan ( 30335 )

          Bloody Slashdot. Shall we expect UTF-8 support by 2030?

          Unicode support has been around since 2006 or so. But a string of abuses has resulted in going from a Unicode codepoint blacklist to a codepoint whitelist. And yes, UTF-8 is supported.

          Of course, if you don't know how you can abuse Unicode to screw things up for everyone, then you really shouldn't be complaining. (And yes, Unicode abuse is what leads to those deadly text messages that crash Android and iOS phones).

They could try seeking advice from practically every website made in the past decade. There are ways the apostrophe can be made safe. Maybe one day we'll even find a way to make the £ (Sterling) safe.

          • It would be pretty easy to add some basic punctuation and science/math symbols to the whitelist - especially those used by default on iOS. However, I think those specific characters are kept on the blacklist to put a spotlight on who is using Apple hardware.

          • That doesn't exactly reflect any better upon them. I know this isn't a banking site or anything. And if my comment history goes kablooey... BFD. But sanitizing your inputs so that users can't break your site or DB with malicious "text" has been a standard best-practice for competent web developers for I-don't-even-know-how-many years. There's really just no excuse.

    • by AmiMoJo ( 196126 )

      My guess is they deliberately avoided the really high end because they won't be competitive with Ryzen for workstation type loads. Even Intel can't keep up there.

      • I think they'll start going there eventually, but it's probably better if they don't try to replace every single segment at the same time. It's pretty obvious that they can compete against the low end desktop and laptop chips from Intel and AMD just based on how well their mobile devices perform, but scaling up is a different matter entirely. Apple doesn't have the volume for those segments, so unless they have a design similar to what AMD is doing now, it doesn't make a lot of sense for them to target HEDT
      • The Mac Pro will need new chips too; Apple said they're transitioning the WHOLE line over to their silicon in the next couple of years. But those more powerful chips are definitely going to present a bigger architectural hurdle. I had originally thought they'd take on the Mac Pro first exactly because it's the most difficult, but it's the platform they have the absolute most control over. Since they went for the portable/power-efficient end first, the Mac Pro will have to be last, and we won't see those mac

It made no sense for them to transition the actual Pro Macs to ARM because the major Pro apps won't have ARM versions for some time. On top of that, many Pro apps require great CPU and GPU performance, and even if Apple's competitive on CPU performance soon, their GPUs aren't there. Besides, the 13" MacBook Air is really an iPad Pro with a keyboard and a trackpad at this point. It was always predicted because it was a really safe bet engineering-wise.
        • by Improv ( 2467 )

I maintain some open-source scientific software available on all 3 major OSs; I want to start porting my stuff so that when higher-end ARM systems show up I'm ready. There's a chance a few people might try to run things on a lower-end system (contrary to our RAM recommendations) probably starting in a month, and I really don't want to buy a system now only to discard it a few months later. It's a bit frustrating.

          (I realise not everyone is in the academic sector; thought I'd share some of our concerns there)

      • by Improv ( 2467 )

I think this competition will likely be fought not on a per-CPU basis so much as on a fleet basis, meaning they'll go with lots of low-to-mid-tier CPUs in any given workstation rather than a few very fast cores. If I'm right, I expect a big push by Apple to get developers to make their apps much more multithreaded (or composed of smaller units).
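The "many modest cores over a few fast ones" trade-off being described is the classic argument for splitting work into small independent units. A minimal sketch of that restructuring, with a stand-in workload (pure standard library; `process_item` is a placeholder, not any real API):

```python
# Sketch: structuring work as small independent tasks so throughput scales
# with core count rather than single-core speed. The workload is a
# placeholder for real per-item processing.
from concurrent.futures import ProcessPoolExecutor

def process_item(n: int) -> int:
    # Stand-in for a CPU-bound unit of work: sum of squares below n.
    return sum(i * i for i in range(n))

def process_all(items):
    # Each item is independent, so a pool can spread them across all cores.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(process_item, items))

if __name__ == "__main__":
    print(process_all([1000, 2000, 3000]))
```

The catch, as the replies note, is that this only pays off for workloads that decompose cleanly; a single serial bottleneck still wants the fastest core you can buy.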

        • by AmiMoJo ( 196126 )

          The issue is that ARM currently doesn't scale very well to massive numbers of cores. ARM has just released some new designs that improve things a bit but they are many years behind what AMD and even Intel are doing. You can't just throw more cores in and expect it to work well.

    • The better battery life will be exciting, and when this hits the desktop they may be able to build massively parallel machines

      I gotta admit, that's where I get curious.

      Performance per watt is an important metric for laptops. And Apple sells a ton of laptops, so this is very important for Apple. But there are some applications where I'm fine being "chained to a desk" and I'd rather have high-performance and don't really care about the watts.

  • No good. (Score:5, Funny)

    by Drew84WHEEE ( 1447189 ) on Tuesday November 10, 2020 @02:08PM (#60708472)
    (extremely smooth-brained Slashdot engineer chud voice) Unacceptable. I need to support 2700 desktops running Tabworks and my team has 14 additional with [whichever shitty Linux people use now for no fucking reason]. Normal humans, of which I am definitely one, have these needs too, so I predict failure. More space than a Nomad but still lame.
    • Like when the iPod was released and everyone here saying how dumb it was and it wouldn't sell.

    • whichever shitty Linux people use now for no fucking reason

      Kind of offtopic, but the reason is that it's free, works fine (if only barely), and it's not controlled by Microsoft. I like Pop OS.

  • ... since the Mini and Pro do have a fan.
  • by xonen ( 774419 )

    Would this be the year of Apple on my desktop?

  • I read in a separate article that the CPU in new Macs will have memory on-chip? 8 GB to start with, and 16 GB later, with no possibility of expansion? Just askin'. That'll be fine for most users, but not for power users. (I'm a heavy user of Adobe CC apps and have 56 GB installed.)

    If the above is true, I wonder if they'll go with some kind of NUMA architecture in their very-high-end Macs.

    • by dfghjk ( 711126 )

      Why would you not assume that higher end Macs simply get a different processor and memory solution? You think that because their first, low-TDP processor makes a design tradeoff that every other Mac will require bandaids?

      • Why would you not assume that higher end Macs simply get a different processor and memory solution? You think that because their first, low-TDP processor makes a design tradeoff that every other Mac will require bandaids?

I think that Apple tends to try to concentrate all platforms into a single architecture. I have no idea whether they will actually do that in this case; I can't read their minds. It's all speculation at this stage.

        I don't see them using different architectures in different tiers, but I guess we'll see.

        • I think if they were going to do that, they'd've announced all their Macs today. My own intuition is that they'll have different tiers and system architectures because trying to make a Mac Pro the same way you make a Macbook Air would just leave you with a garbage Mac Pro, and they've already had enough trouble there.

          • I think if they were going to do that, they'd've announced all their Macs today. My own intuition is that they'll have different tiers and system architectures because trying to make a Mac Pro the same way you make a Macbook Air would just leave you with a garbage Mac Pro, and they've already had enough trouble there.

The whole point is NOT to announce all their Macs today, so that they can sell you an 8GB Mac now and a 16GB Mac when it comes out.

    • Re:Memory on-chip? (Score:4, Interesting)

      by lactose99 ( 71132 ) on Tuesday November 10, 2020 @03:31PM (#60708770)

Yeah, RAM appears to be on-chip; apparently with that they can do a whole bunch of zero-copy between the different components (CPU, GPU, ML) for increased performance over the alternative. I'd be very interested to see the low-level OS architecture for this type of setup.
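The zero-copy idea here is that producers and consumers operate on the same physical allocation instead of copying buffers between separate CPU and GPU memory pools. A userspace analogy using Python's `memoryview`; this illustrates the concept only, not Apple's actual mechanism:

```python
# Userspace analogy for unified-memory zero-copy sharing: two "components"
# view the same underlying buffer instead of each keeping a private copy.
# Purely conceptual; not how macOS actually wires up CPU/GPU memory.

framebuffer = bytearray(8)           # one shared allocation

cpu_view = memoryview(framebuffer)   # the "CPU" writes through one view...
gpu_view = memoryview(framebuffer)   # ...the "GPU" reads through another

cpu_view[0] = 0xFF                   # no copy happens on this write
print(gpu_view[0])                   # 255: the write is immediately visible
```

With discrete GPUs, the equivalent of that write is a DMA transfer over PCIe; skipping it is where the claimed performance win comes from.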

    • You did get the "no possibility of expansion" part right.
      But it's not "8 GB to start with, and 16 GB later". It's "Buy your Mac with either 8GB or 16GB of RAM".

      • You did get the "no possibility of expansion" part right.
        But it's not "8 GB to start with, and 16 GB later". It's "Buy your Mac with either 8GB or 16GB of RAM".

        Yes, I did get that, and you are correct. But it appears to be part of Apple culture to replace your Mac with the upgraded one when it comes out. So naturally they came out with the 8 GB model first.

  • by xgerrit ( 2879313 ) on Tuesday November 10, 2020 @03:03PM (#60708660)
The battery life on the new ARM Macs sounds amazing and I'm cautiously optimistic, but 8GB of memory on the base model? And it's shared with the GPU..?? The base 13" MacBook Pro used to have 16GB of memory plus another 4GB for the GPU. I just checked and I'm burning through 12GB of memory right now with just Safari and Xcode open. And that doesn't include GPU memory usage. Factor in that ARM code uses more memory (from RISC code expansion) and the memory on these new Macs seems really anemic.
    • by seoras ( 147590 )

RISC code expansion? Can you elaborate on that, please?
Yes, I know RISC means Reduced Instruction Set, but even with a reduced instruction set the amount of code needed isn't necessarily more.
Back when the ARM2 first appeared in the Acorn Archimedes, which I bought while at Uni doing Comp Sci & Electronics, I ran a comparative test.
I took a C function and compiled it on the Archimedes, a Sinclair QL (68000) and a Sequent (x86).
Then I counted the total number of instructions each compiler produced for its C

      • by Saffaya ( 702234 )

        For your information, the MC68000 has 16 registers of 32-bit width.
        The one dumping registers on the stack was most probably the x86, not the MC68000.

      • by twosat ( 1414337 )

When I was at Canterbury University in the 1980s there were some other computer science students there who were interested in ARM. Dave Jaggar actually did his thesis on the ARM instruction set. I attended a couple of his seminars over the years and I recall him saying that it normally needed about 20% more code than a CISC computer, but it would execute the code much faster, therefore more than making up the difference. He actually ended up as ARM’s Head of Architecture Design in Cambridge, UK and de

    • You understand the idea of the "base model" right? You don't need 16GB of RAM to post shit on Slashdot all day. 8GB is more than enough for many people. Not for me on my desktop, but for many people.

  • by Lije Baley ( 88936 ) on Tuesday November 10, 2020 @03:15PM (#60708702)

    Now that's some advanced Eurospeak!

  • I have the 13" MacBook Pro for work, and I already need extra dongles for ethernet, monitors, keyboard, and mouse. The new one only gives you 2 USB ports, and one of them will be used for power! I can't wait to see what they remove on the next big release!

You know you can use a single USB-C port for a "docking station" with Ethernet, display, and multiple USB-A ports. And it can pass through power, leaving all of the other ports on the laptop available. I like the ones from Monoprice.

  • Comment removed based on user account deletion
I didn't see an answer to this question, but didn't read all the posts here. Microsoft has enough trouble and its hands full getting its OS to run its own software (Office?) on PCs, much less moving it to a new hardware design. The same goes for other non-Apple programs folks depend on. Presumably Apple will have some of its familiar software running on this new PC-level hardware, but what about non-Apple stuff?
Apple announced that all Apple programs are compiled as fat executables with both Intel and ARM binaries included.

      For 3rd-party programs, there is an emulation/translation layer called Rosetta 2. Rosetta 2 is apparently both JIT and install-time recompile of Intel -> ARM binaries. This is supposed to work for the vast majority of applications, but it will not work for Bootcamp or virtualization. I imagine we'll start seeing benchmarks soon.

      • To be more precise, anyone is/will be able to compile fat executables too. And if you don't use Apple's Xcode then it's your own damn fault, they've been telling you to switch for years now.

        Rosetta 2 is there in case you need to use x86 Mac software that's not supported anymore (i.e. the developer/company stopped making new versions), or you can't upgrade to the new version for some reason (the licensing changed, it's too expensive, it changed from paid software to software-as-a-service, etc).
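For the curious, a fat ("universal") executable is just a small big-endian header listing per-architecture slices, followed by the slices themselves. A sketch that builds and parses a minimal two-slice fat header using the standard Mach-O constants (the offsets, sizes, and alignment below are made-up example values, and a real file would carry actual Mach-O slices at those offsets):

```python
# Sketch of the fat ("universal") Mach-O header layout: one file carrying
# both an Intel and an ARM slice. Builds a minimal header in memory and
# parses it back. Header-only; no real code slices are included.
import struct

FAT_MAGIC = 0xCAFEBABE          # big-endian fat-binary magic number
CPU_TYPE_X86_64 = 0x01000007    # CPU_TYPE_X86 | CPU_ARCH_ABI64
CPU_TYPE_ARM64 = 0x0100000C     # CPU_TYPE_ARM | CPU_ARCH_ABI64
ARCH_NAMES = {CPU_TYPE_X86_64: "x86_64", CPU_TYPE_ARM64: "arm64"}

def build_fat_header(slices):
    # fat_header: magic, nfat_arch; then one fat_arch record per slice:
    # cputype, cpusubtype, offset, size, align (all big-endian uint32).
    out = struct.pack(">II", FAT_MAGIC, len(slices))
    for cputype, offset, size in slices:
        out += struct.pack(">IIIII", cputype, 0, offset, size, 14)
    return out

def list_architectures(blob):
    magic, count = struct.unpack_from(">II", blob, 0)
    assert magic == FAT_MAGIC, "not a fat binary"
    archs = []
    for i in range(count):
        cputype = struct.unpack_from(">IIIII", blob, 8 + 20 * i)[0]
        archs.append(ARCH_NAMES.get(cputype, hex(cputype)))
    return archs

blob = build_fat_header([(CPU_TYPE_X86_64, 0x4000, 100),
                         (CPU_TYPE_ARM64, 0x8000, 100)])
print(list_architectures(blob))  # ['x86_64', 'arm64']
```

The loader picks whichever slice matches the host CPU, which is why the same app bundle runs natively on both Intel and M1 Macs.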

  • by presidenteloco ( 659168 ) on Tuesday November 10, 2020 @04:45PM (#60709040)
    So, there was a brief window there in 2019-2020, after the 4 year no-escape-key debacle, where a 16" Macbook Pro became the best non-Apple-SW developer laptop again. You know, for Docker development, or unix-style development, including a good platform for controlling deployment to the cloud etc.

    When the new ARM 16" arrives, that window will be closed again, it looks like, since x86-target docker images will probably run 5x slower emulated on the ARM.

    So I guess all the developers who don't develop Apple-specific code will have to scoop up their last 16" Intel Macbook Pro before they're gone. Hopefully for a discount once the new ARM one arrives.

Probably too much to ask for a MacBook Pro with an optional x86 in there alongside the ARM, so it remains a developer machine.
  • Apple is making M1s. Well why the hell not, International Harvester and IBM did. Oh wait, they said 8 cores not 8 rounds, never mind.
  • Better skip a generation when they get to M-5. M-5 units have a tendency to develop some fairly dramatic and exciting showstopper bugs. Some deaths were involved the last time around, if my recollection is correct. And the problem can only be corrected by talking the M-5 into committing suicide. And not all of us have the unusual diction that's been useful for that sort of enterprise.

Will they be completely compatible with older software? I know Apple dropped 32-bit support in Catalina. I assume these new MacBooks will come with Big Sur.
