
Apple Silicon M1 Chip In MacBook Air Outperforms High-End 16-Inch MacBook Pro (macrumors.com) 174

The first benchmark of Apple's M1 chip shows that the multi-core performance of the new MacBook Air with 8GB RAM beats all of the 2019 16-inch MacBook Pro models, including the high-end 2.4GHz 9th-generation Intel Core i9 model. "That high-end 16-inch MacBook Pro earned a single-core score of 1096 and a multi-core score of 6870," reports MacRumors. The MacBook Air with M1 chip and 8GB RAM posted a single-core score of 1687 and a multi-core score of 7433. From the report: Though the M1 chip outperforms the 16-inch MacBook Pro models on raw CPU benchmarks, the 16-inch MacBook Pro likely offers better performance in other areas such as the GPU, as those models have high-power discrete GPUs. It's worth noting that there are likely to be some performance differences between the MacBook Pro and the MacBook Air even though they use the same M1 chip, because the MacBook Air has a fanless design while the MacBook Pro has a new Apple-designed cooling system. There's also a benchmark for the Mac mini, and it shows about the same scores: the Mac mini with M1 chip earned a single-core score of 1682 and a multi-core score of 7067.

There's also a benchmark for the 13-inch MacBook Pro with M1 chip and 16GB RAM that has a single-core score of 1714 and a multi-core score of 6802. Like the MacBook Air, it has a 3.2GHz base frequency. A few other MacBook Air benchmarks have surfaced too with similar scores, and the full list is available on Geekbench. [...] When compared to existing devices, the M1 chip in the MacBook Air outperforms all iOS devices. For comparison's sake, the iPhone 12 Pro earned a single-core score of 1584 and a multi-core score of 3898, while the highest-ranked iOS device on Geekbench's charts, the A14 iPad Air, earned a single-core score of 1585 and a multi-core score of 4647.
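As an aside, those scores also hint at how the two chips scale across cores. Here is a quick back-of-the-envelope sketch using only the numbers quoted above; the 4-performance-plus-4-efficiency core split is Apple's published M1 configuration, and the 2.4GHz Core i9 is an 8-core part:

```python
# Back-of-the-envelope scaling check using the Geekbench 5 scores quoted above.
# The M1 pairs 4 performance cores with 4 efficiency cores (Apple's published
# configuration); the 2.4GHz Core i9 in the 16-inch MacBook Pro has 8 full cores.
m1_single, m1_multi = 1687, 7433
i9_single, i9_multi = 1096, 6870

print(f"M1 multi/single scaling: {m1_multi / m1_single:.1f}x")  # ~4.4x
print(f"i9 multi/single scaling: {i9_multi / i9_single:.1f}x")  # ~6.3x
```

The M1's ~4.4x scaling over 8 cores is consistent with 4 fast cores plus 4 much smaller efficiency cores, while the i9's ~6.3x comes from 8 identical cores, but from a much lower single-core baseline.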

  • Why only Geekbench? (Score:3, Interesting)

    by Entrope ( 68843 ) on Thursday November 12, 2020 @06:27PM (#60717714) Homepage

    Every single performance comparison I've seen of the M1 versus other CPUs uses Geekbench, in either its single-core or multi-core version. This gives an extremely limited basis for comparison. Even when AMD compares itself to Intel, it uses Cinebench as the top-line number but generally provides comparisons on other benchmarks as well. Why is Apple focusing so much on Geekbench?

    • by MrNaz ( 730548 )

      Because that's how benchmarking for unreleased hardware works: a highly selective test that is cherry-picked for maximum hype value.

      There's no way the real world performance of an ARM system will beat out a high end x86 machine in everyday use with normal patterns of use.

      Now, the pattern of use for an Apple user, on the other hand... Those guys will probably buy these devices to do nothing but sit in Starbucks and show running Geekbench instances to random passers-by.

      • #jealous much?

        Geekbench is a standard. I'm sure others will surface. The thing ships this month, and you can run any test you want once they sit on the tables of your local Apple Store. If the mobile space is anything to go by, the eggheads at the Apple-acquired PA Semi have outperformed and will continue to outperform their colleagues at Intel, which is probably why Apple decided to switch. The iPad has been getting faster and faster, catching up to the MacBook Pros, and for less heat and power. Apple have
        • by Z80a ( 971949 )

          Using a bad standard to measure performance of your CPU leads to a bad CPU.

          • Do you really think Apple's engineering team used Geekbench to judge its CPU? I am sure that they have their own internal benchmarks, weighted towards the metrics that they think are important. It may be that Apple's marketing team uses Geekbench because it shows the CPU in a good light, but my bet is that the design decisions were based on other criteria.

      • by Entrope ( 68843 )

        Wasn't that the obvious implication of how I asked the question? I was hoping for someone to give a steelmanned version of Apple's logic, not just assume bad faith on Apple's part.

        • by dfghjk ( 711126 )

          Your failure is assuming that Apple is behind it at all.

        • by malvcr ( 2932649 )

          I have a Mac Mini 2012 (yep ... one of the machines abandoned by Big Sur). It has an i7 and now runs Catalina with 10GB RAM (because one of its 8GB memory sticks failed and I had a 2GB one around). And generally speaking it runs well.

          However, my work is on Linux, so I have Linux virtual machines on that Mac, and a bunch of ARM based devices with a lot of different fruit logos printed on them.

          What I think is that there is no unique working landscape that could define every possible usage. The M1

      • by dfghjk ( 711126 )

        The M1 doesn't target a "high end x86 machine", it targets a low power notebook. Also, using Geekbench isn't "cherry picking", it's just basic and it's not Apple generating "hype value".

      • by leonbev ( 111395 )

        You weren't expecting them to use video game benchmarks to show off their shiny new MacBooks, did you? The integrated GPU on the M1 processor would get thoroughly trounced by any similarly priced Windows laptop or desktop with a dedicated GPU.

        • Untrue. The much-lower-specced GPU in the DTK is very, very fast and handles loads of work without even getting warm. The M1 should be at least twice that.
          Can you whip out a $1000 Nvidia card and beat it? Sure.
          That's not the market for the M1.

          • by Z80a ( 971949 )

            How well do you think it would do against a Ryzen 3400G?

          • by leonbev ( 111395 )

            I would expect the $150 GeForce 1660 from a comparably priced Windows gaming laptop or desktop to thoroughly trounce the gaming benchmarks of the new 13" MacBook Pro with an M1. That's not a fair fight, but then neither is benchmarking your new product with prerelease software and plugins that were specially designed with Apple optimizations against their 2-year-old hardware.

          • by AmiMoJo ( 196126 )

            It would be incredible if the GPU in these things were at all comparable to similarly priced x86 laptops, which will have a dedicated chip rather than an integrated one.

            My guess is that Apple will have optimized it for video processing to keep video rendering decent, since a lot of low-tier reviewers use Macs for that, but for games... eSports maybe.

        • Re: (Score:3, Interesting)

          Benchmarks are irrelevant.
          What's relevant is user experience.

          I mostly play WoW Classic and EVE Online. It is basically impossible to buy a laptop that does not run both in absurdly high quality.

          For ordinary day-to-day use, I use Mail, Safari, Chrome. And: Eclipse. Seriously, who the funk cares if a Windows laptop is "objectively faster" in a benchmark, when my Mac simply does what I want and does it so fast (e.g. compiling a huge Java project) that I can not even go away to take a legit coffee break or write one

          • by Anonymous Coward on Thursday November 12, 2020 @09:19PM (#60718144)
            Don't know what you're using for computers but our Windows machines are usually booted and twiddling their thumbs at the login prompt even before the HDMI screens have gotten their shit together to display anything.
            • Re: (Score:3, Interesting)

              It's the corporate environment aspect. Corporate IT spent decades piling remote management software onto their Windows infrastructure boot processes. By the time everything loads and connects to the central server, it does feel like 30 minutes. Now what did these same people do when Apple hardware came along? Nothing... those machines are allowed to run without all of the boot-time crap because corporate IT only knows Windows and doesn't know how to "manage" Apple products.

              • I can confirm. I had a modest Windows 10 laptop (mid-range Lenovo), and from reboot to being able to actually *DO* anything was 8-12 minutes, depending on the policy tasks running in the background for corporate network compliance. One upgrade day (a Win10 upgrade) required 4 reboots, and it was a full 2.5 hours before I could even check email.

                While my corporate Mac that I use now has some shim shit installed on it, it is far less obnoxious than what was layered on in Windows 10.
          • by Cederic ( 9623 )

            I use my computer for many things, some of which are directly and discernibly dependent on CPU speed.

            I expressly benefit from single core speed and multi-threaded performance.

            The benchmark is irrelevant but user experience does matter, and your comparison between Mac and Windows fails miserably to really capture that.

            For instance, my Windows tablet is 'press a button and it's on' fast, and boots in under 10 seconds when I restart it from scratch. Your MacBook will, if it complies with "company policies", t

          • Most "fast" windows laptop take up to half an hour to boot, because of "company policies" - security updates, scans, encrypted hard disks, slow connection to the AD service, or what ever reason.

            Even my work laptop, which is about five years old, boots in less than a minute with all the myriad agents and things installed, and McAfee's pre-boot encryption prompt. My personal laptop boots faster than my external monitor can detect the signal and power on.

    • by Xenx ( 2211586 )
      I would assume because Geekbench is widely used for ARM benchmarking. It's almost certainly the synthetic benchmark that paints the M1 in the best light.
      • by Entrope ( 68843 ) on Thursday November 12, 2020 @07:47PM (#60717924) Homepage

        Maybe. It seems like the Geekbench scores are not very comparable across platforms. For example, the iPhone 12 Pro [geekbench.com] gets the same single-core score as the Ryzen 5 5600X [geekbench.com]. I find it difficult to believe that those two processors are equivalent for any compute-heavy task.

        • Yup, that's pretty hard to believe, but a lot points toward it being true nonetheless. We'll see when the first M1 machines get delivered next week. And it's a chip that's not even competing with the Ryzen from a pricing standpoint - I'm curious what they'll do a few generations down the road when they have chips for their higher-end machines.
          • by Entrope ( 68843 )

            I wasn't comparing the Ryzen to the M1, but to the A14. The simple physics don't make sense: How does a cell phone processor legitimately run as fast as a desktop processor with a roughly 20 times higher power budget? The differences in instruction decoding and register set size can't bridge that gap, unless you assume that AMD and Intel are secretly terrible at CPU design.
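For a rough sense of that power argument, here is a back-of-the-envelope performance-per-watt sketch. The single-core scores are the ones cited in this thread; both power figures are rough assumptions for a single-core load, not measurements:

```python
# Hypothetical perf-per-watt comparison. Scores are the Geekbench 5 single-core
# numbers cited in this thread; the power figures are rough ASSUMPTIONS for a
# single-core load, not measured values.
chips = {
    "A14 (iPhone 12 Pro)": {"score": 1584, "assumed_watts": 5},
    "Ryzen 5 5600X":       {"score": 1600, "assumed_watts": 20},
}

for name, c in chips.items():
    print(f"{name}: ~{c['score'] / c['assumed_watts']:.0f} points per watt")
```

One caveat that partly answers the question: a single loaded core draws nowhere near the whole package's power budget, so the effective single-core power gap is much smaller than a "roughly 20 times" whole-chip comparison suggests.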

            • Does a higher power budget make the clock faster than what it is, decrease cycles per instruction, decrease latency to caches and memory? No.

              I think it does let you put more stuff on the chip, more cache, more parallelism, more bandwidth, those can all affect performance, but it depends on the workload right?

              Power budget is like how much gas your car uses, when you should be thinking about weight and power. What it comes down to is whether you are putting that energy to good effect.

              • by Entrope ( 68843 )

                Are you saying that the A14 is like a go-kart compared to the Ryzen's heavy-duty pickup truck, and Geekbench simply measures the turning radius or top speed, making it a good basis for comparison?

                I guess I'll accept that as Apple's argument.

        • I think Apple is better than you think at creating its CPUs. They create fast processors, and the improvements have been more rapid than in the Intel world (mainly because they started off so far behind). The main difficulty you will find with a phone processor is that it cannot sustain the performance. So in a burst (as in a benchmark) it can do well. But because of cooling issues, it needs to be throttled after a while.

          • I think Apple is better than you think at creating its CPUs.

            So let me get this straight, you believe that Apple licensed ARM and produced an ARM chip with a 53% IPC boost over AMD's wildly praised Zen architecture, and a 65% IPC boost over Intel's Gen10?

            *breathes in deeply* bwhahahahahahahahahahahahaha
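For what it's worth, figures like that appear to fall out of dividing the Geekbench single-core scores in this thread by peak clock speed. A minimal sketch follows; the boost clocks are the commonly published figures and are assumptions here, and whether points-per-GHz is a fair proxy for IPC is exactly what this subthread disputes:

```python
# Where a "~53% IPC advantage" claim can come from: Geekbench 5 single-core
# score divided by peak clock. Scores are from this thread; the peak clocks
# are the commonly published boost figures (assumptions, for illustration).
chips = {
    "Apple M1":      {"score": 1687, "peak_ghz": 3.2},
    "Ryzen 5 5600X": {"score": 1600, "peak_ghz": 4.6},  # Zen 3
}

per_ghz = {name: c["score"] / c["peak_ghz"] for name, c in chips.items()}
advantage = per_ghz["Apple M1"] / per_ghz["Ryzen 5 5600X"] - 1
print(f"M1 per-clock advantage: {advantage:.0%}")  # ~52% on these numbers
```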

        • by AmiMoJo ( 196126 )

          It's because Geekbench tests are highly artificial and focused heavily on the CPU core, so for example the massive amount of cache that a Ryzen 5600X has doesn't benefit those tests much. Also, Geekbench is highly compiler-dependent, i.e. the better the code the compiler emits and the better tuned it is for a particular CPU, the better it will run on that platform.

          It's going to be interesting to see the excuses when real benchmarks appear showing actual application performance.

      • by dgatwood ( 11270 )

        How long does the Cinebench test run? We're talking about the Air here, so they might have skipped that benchmark for thermal reasons.

        • by Xenx ( 2211586 )
          It's a fixed workload, so it takes as long as it takes. I've never timed it when running it. But I would say my 3800X is in the ~1min range for multi-core and more like 5-10min for single-core.
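That is the nature of fixed-workload benchmarks: the work is constant and the elapsed time is the score. In miniature (purely illustrative; this has nothing to do with Cinebench's actual renderer):

```python
# A fixed-workload benchmark in miniature: constant work, variable time.
import time

def workload() -> int:
    # An arbitrary, fixed amount of computation.
    return sum(i * i for i in range(10_000_000))

start = time.perf_counter()
workload()
print(f"fixed workload took {time.perf_counter() - start:.2f}s")
```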
      • It's a combination of commonly used synthetics like n-bodies and real-world use cases like compiling with clang and manipulating images.

        Anandtech have found that its results correlate pretty strongly with SPEC results, so it's a pretty good benchmark.

        • by Xenx ( 2211586 )
          I wasn't really knocking the quality of it. Just commenting that it's well used for ARM while most of the other tools used for normal CPU benchmarking aren't designed for ARM. Those other tools might give us more real world performance info, but wouldn't look as good in comparison.
    • by dfghjk ( 711126 )

      Who says it is Apple behind these initial tests? Geekbench is well known and easy to run.

      • by Entrope ( 68843 )

        Apple has repeatedly referenced Geekbench. Maybe someone else is violating their NDA and releasing M1 benchmarks early, but I would think any third party would be more interested in a benchmark different from what Apple has used for its comparisons.

    • by EvilSS ( 557649 ) on Thursday November 12, 2020 @08:01PM (#60717974)
      You can upload the scores when you run the Geekbench test, so that's why they tend to show up on pre-release hardware. Same with 3DMark and new graphics cards. The people who have them right now probably have them for evaluation and reviews, and are under embargo, but the scores can leak through the Geekbench website. It is also one of the few widely available benchmarks that runs on ARM.
      • by AmiMoJo ( 196126 )

        Nobody seriously reviewing CPUs uses Geekbench. Its results do not translate to real-world performance. For some reason it's taken hold in the mobile space, probably because real application and game benchmarks are difficult to do properly on mobile devices and because it tends to favour Apple.

        Also the comparison here is flawed. The MacBook Pro suffers from severe thermal throttling. When running sustained multi-core loads the Intel CPU hits 99C and throttles to protect itself. A comparison with a better de

        • by EvilSS ( 557649 )
          People reviewing ARM processors, however, use it all the time. If you have a better benchmark that runs under MacOS on ARM natively and allows uploading anonymous scores (you know, so they can leak pre-release), I'd love to hear it. Otherwise, tighten up that tinfoil hat and move on. Once they drop next week we will see other published benchmarks. I dislike Macs as much as the next guy but some of you turn it into a fucking religion.
    • by beelsebob ( 529313 ) on Thursday November 12, 2020 @09:18PM (#60718142)

      Because none of the other benchmark suites have released binaries for the macOS/ARM platform yet.

    • by plasm4 ( 533422 )
      I think Cinebench only just released an Apple Silicon-compatible version of its benchmark for macOS. Also, I think the point of Geekbench is to test the CPU on its own without testing the cooling ability of the computer. I think both approaches give a different picture and that both are useful. Either way we'll start seeing Cinebench results soon.
  • If Wikipedia is right, this is a SoC with RAM bundled into the package, so you have a choice of 8GB or 16GB. That is less than what goes into a kid's PC in my house. My kids' machines start at 32GB DDR4. As consumer gear it's fine. For actual work - no thanks. I will grab a Ryzen-based machine instead.
    • by Tailhook ( 98486 )

      Also, you can't run Docker.

      • Rosetta 2

        • by brunes69 ( 86786 )

          Rosetta 2 is closed source and no one has actually used it yet, so there is still a lot to learn here re: compatibility - we simply don't know.

          Frankly, if Apple has developed an efficient way to translate x86 binaries to run on ARM, I am surprised they are not trying to monetize that outside OSX. A lot of people would buy that for Android and AWS.

          • by sl149q ( 1537343 )

            Anyone that has the DTK has been using Rosetta 2. Works quite well. Just download random Mac apps and they work (i.e. Intel-based ones that have not been released as Intel/ARM binaries).
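As an aside, a process on an Apple Silicon Mac can ask whether it is currently being translated by Rosetta 2 via the sysctl.proc_translated key. A minimal sketch (the key is simply absent on Intel Macs and older systems, so its absence is treated as "not translated"):

```python
# Check whether the current process is running under Rosetta 2 translation.
# sysctl.proc_translated reports 1 under Rosetta, 0 when running natively;
# the key does not exist on Intel Macs or pre-Big Sur systems.
import subprocess

def running_under_rosetta() -> bool:
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip() == "1"
    except (subprocess.CalledProcessError, FileNotFoundError):
        return False  # key or tool missing: native hardware or older macOS

print("Rosetta 2:", running_under_rosetta())
```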

        • Rosetta 2

          You're saying run Docker using Rosetta 2?

    • by Xenx ( 2211586 )
      I'm not speculating on how fit the M1 is for gaming on the whole. But 16GB is plenty for gaming. The only thing that could potentially cause a problem, in terms of RAM, is that it's shared between the CPU and GPU. But even AAA titles today only require 8GB, with 16GB recommended. There might be outliers, but 16GB is enough for general productivity and gaming.
      • I'm sure it will be more than plenty for all 3 games that will work... Linux is going to look like gaming nirvana next to this.
        • by Xenx ( 2211586 )
          Yeah. Gaming on a Mac is relatively limited as is. If Rosetta works as well as claimed, I would imagine the games least demanding on CPU/GPU will be fine. It's just that RAM is not likely the issue.
          • I wouldn't put too much faith in Rosetta helping out with anything interesting. macOS stopped evolving OpenGL at v2.1, in deference to Metal, so it's not going to help you run anything that requires OpenGL 3.x or 4.x. Four-year-old No Man's Sky needs OpenGL 4.x, for example, and Blender for Windows under Wine (needed because so many Blender extensions are Windows-only) will work on neither Linux nor macOS.
      • I don't see that it matters. None of this matters for anything except, maybe, Photoshop and some other design work which requires a high performance device and will actually get Mac ports. It's not an x86 CPU, so trying to compare what it can do in gaming is like saying the PS5 can run Mario games better than the Switch. No it can't. It can't run them at all. So who cares?
  • What work do your kids do that requires 32GB? For most office work, I don't see much of a need for 32GB unless you are video editing. Heck, if they close Chrome, most people I know don't need more than 8GB. Even gaming recommendations rarely dictate 32GB, as the more important factor would be the discrete GPU. Yes, Apple machines lack expansion, and years from now people might want to increase their RAM but won't be able to.
      • Re:Consumer Gear (Score:5, Insightful)

        by DamnOregonian ( 963763 ) on Thursday November 12, 2020 @07:25PM (#60717858)
        My machine hovers around 18-20GB with just basic use - which is admittedly quite a fair share of shit resident.
        With 16GB, it would be paging.
        So what does your kid do that requires 32GB?
        Discord, Twitch, Fortnite, Youtube, and 47 tabs of Chrome. All at once. That's what kids do on a PC.
        I just upgraded my nephew to 32GB because his machine would grind to a halt every time it tried to flush a few gigs to disk.

        Heck if people close Chrome, most people I know don’t need more than 8GB.

        I don't know who you know, but they sure as hell aren't the people I know.

        And as for gaming - of course. A game doesn't recommend 32GB when it commits 8GB tops. But games aren't the only thing a PC does, and they're rarely the only thing running.
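Rather than trading anecdotes, it is easy to measure your own headroom. A minimal sketch using the third-party psutil package (assumes `pip install psutil`):

```python
# Snapshot of current RAM usage, to see how close a machine runs to its limit.
import psutil  # third-party: pip install psutil

vm = psutil.virtual_memory()
gib = 1024 ** 3
print(f"total:     {vm.total / gib:5.1f} GiB")
print(f"available: {vm.available / gib:5.1f} GiB")
print(f"in use:    {vm.percent:.0f}%")
```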

        • Comment removed based on user account deletion
        • I don't know who you know, but they sure as hell aren't the people I know.

          It sounds like you know computer nerds. 8GB is still very much the stock configuration for the professional-grade PCs that sell by the hundreds of thousands to corporations around the world. It's also very much the standard laptop size. It is right now the most-sold size for PCs.

          To be clear, it's not enough for me either, but then I do more than just open a couple of tabs at a time. The girlfriend is in the next room editing her DSLR photos on her 8GB machine, and that's already more of a workload than most people stress

      • I can't speak for anyone else but I'm finding now that 16GB isn't enough for gaming. I can just barely load Rust in 16GB, for example. And I can't run the browser at the same time. For a desktop box including gaming, I now want 32GB. That seems egregious to me too, but it is what it is.

      • by jythie ( 914043 )
        Pissing contests?
    • by jythie ( 914043 )
      Well, as with many things, it depends on what 'work' one is doing. I do heavy-duty simulation stuff on a machine with only 8GB of RAM, but my work is also not very memory- or disk-intensive. On the other hand, I'm using the best Xeon chip that I could convince my boss to pay for.
  • by Ecuador ( 740021 ) on Thursday November 12, 2020 @06:30PM (#60717724) Homepage

    Seriously, it is great in Geekbench. It may be good for some people. It may be bad for other people. But haven't there been enough "news" posts about it? I'll just quote my own post from the previous story about it [slashdot.org], since we are copy-pasting stories anyway...

    Basically the M1 is better than all of Intel's desktop & laptop CPUs and most AMD laptop CPUs at Geekbench and some other benchmarks when run natively on ARM. Sure, that is irrelevant to me, as I use x86 VMs daily for development and they would be significantly slowed down on an ARM chip, but you can't really call it an outright lie, just classic marketing hyperbole, certainly not deserving a rant like this.
    Apple has done much worse, too. I am old enough to remember the switch from PowerPC to Intel CPUs. For a while they were selling both, so they had a section on the Apple website dedicated to showing off how much faster the PowerPC Macs were compared to PCs - conveniently using the Pentium 4 (instead of the AMD CPUs, which were much faster back then) and enabling AltiVec on the Mac benchmarks but no SSE on the P4. AT THE SAME TIME, the "new" Intel Mac pages linked to a different section of the website, where there were benchmarks showing off how much faster the new Intel Macs were compared to the PowerPC Macs, to warrant the upgrade! If they could pull that off with a straight face, they can pull off anything.

    PS. In the meantime I try to submit stories like Cisco bungling an entire country's e-education system on launch day [slashdot.org] that seem like quite big nerd news to me, and the editors decline them. Not enough space, I guess, after all the Apple news...

    • by EvilSS ( 557649 )
      Seriously, Cisco is pissing so many people off these days. My company is selling so much HP and Aruba because of their screw-ups. The subscription-based licensing alone has got lots of places switching vendors when they refresh.
  • by IdanceNmyCar ( 7335658 ) on Thursday November 12, 2020 @06:36PM (#60717740)

    Goes the dynamite. These many-core designs were always the future of computing. It took us a long time to start taking these risks, but I believe ARM opened things up, since we returned to more fundamental changes in architecture. Many of the gains will likely be hard to measure, such as the gains from neural cores for AI applications. As a software developer, playing with these sounds like a lot of fun. It will be interesting to see how they approach dedicated graphics, as the graphics cores currently seem very minimal. APUs from AMD have some different functionality with AMD graphics cards, but I have never personally experienced or played with it. I think Apple's best route will be making a chip with a larger graphics core count, but I have also never seen a comparison of core type with respect to heat generation. The restriction may be heat, but otherwise they could either build a single chip with 4x to 10x the graphics cores or, more radically, put two chips on the same board. The latter is probably the more unsound approach.

    No matter what, I hope this starts to silence some naysayers. Jia you ("go for it"), Apple.

    • The 5nm process shrink is the winner here, and the brains not to bet on Intel after it did a Boeing and let MIS grads run the place. Multiple processors need multiple decode and pipeline logic, and cache, and I/O - pretty much a waste of power when, in Intel's case, this faulty logic gobbles power, and more power to work around the unfixable (so far). Plus memory fragmentation. One suspects Apple took a scalpel to the speculative execution pipelines, and there's no need for x86 legacy rubbish on ARM. Profit margin - UP
  • Everyone on here raging about how the price is too high for the performance is missing the fact that the M1 was created with battery life in mind. Sure, you can run a high-end chip in an Alienware laptop, but unless it has a 50-pound battery, good luck getting more than two hours.
    • No one's missing that. Plenty of people in other articles posted to slashdot (myself included) have pointed out that the battery life is going to be excellent.

      But this particular article (out of the many slashvertisements that have been posted about this laptop) is about benchmarks - so people are calling out the obvious marketing bullshit.
  • by rsilvergun ( 571051 ) on Thursday November 12, 2020 @07:00PM (#60717812)
    MacBooks tend to have pretty bad thermals due to their ultra-thin design. It always struck me as odd to push for such a thin laptop you were going to use for production purposes. Adding an extra 1/2" doesn't make it all that much more bulky and lets your CPU breathe and run at its full potential.
  • by Camembert ( 2891457 ) on Thursday November 12, 2020 @08:01PM (#60717972)
    We can argue about the real-life validity of these benchmarks until the machines are in the hands of reviewers next week.
    But what's interesting is that these are positioned as the entry-level Macs in the range. Apple's M2 must be quite a bit faster still (and probably with 32GB on-chip as well) to position it for the more upmarket Macs.
    Regardless of how relevant Geekbench scores are, I am pretty confident that these Macs will, for most people, be a total pleasure to use with recompiled software.
    I am still enjoying a maxed-out early-2014 Air for my home use. I hope it works well for another couple of years, so I can skip to the next generation of Apple Silicon Airs.
  • Geekbench is great and all, but we want to know what really matters!
  • They came out with an Air, the Mini, and the small MacBook Pro: Apple's entry machines. The M1 isn't the end-all. I would expect that the iMac gets the next tier up, and they will finish the transition on the Pros with something Intel and AMD will have a hard time besting for a while.

    That said, it's not magic, and others could copy/catch up/contract with TSMC.

    • by cowdung ( 702933 )

      And the MacBook Pro is only "pro" in name... not a single pro feature in it.

      It doesn't even have enough USB-C ports, let alone useful stuff.

      • And yet, many pros use one to edit photos and videos for their jobs.
        • Comment removed based on user account deletion
        • by jythie ( 914043 )
          Yeah, I am not sure what 'pro features' the person is picturing, but I know a lot of engineering professionals and data scientists who use MacBook Pros for their well-paid work. Never once have I seen them hook up anything 'USB-C' beyond USB-C to HDMI adapters for connecting to one of the big screens at a meeting.
  • Given even cheap laptops can handle the vast majority of everyday computing activities without much effort, the focus on the CPU seems ... disingenuous.

    Heck, I'm using a 2010 Mac Pro cheese-grater and even this has a multi-core score of 5100 - and it's 10 years old. Granted, it uses about the same amount of power as a small third-world country ... so, you know, comparing Apples to Apples ... hmm.

    But I'm willing to bet this old banger wipes the floor with the new MacBook Air on GPU benchmarks.
    Even *that* is
