AMD Portables (Apple)

AMD Claims New Laptop Chip Is 30% Faster Than M1 Pro, Promises Up To 30 Hours of Battery Life (macrumors.com)

At CES this week, AMD announced a suite of new chips for notebooks and desktop computers, with one notable announcement being the company's new AMD Ryzen 7040 series of processors for ultrathin notebooks that will compete with Apple's M1 Pro and M2 chips. MacRumors reports: The AMD Ryzen 7040 series of chips are processors for "ultrathin" notebooks built on a 4nm process, and the highest-end chip in the family is the Ryzen 9 7940HS. The Ryzen 9 7940HS has eight cores, 16 threads, and 5.2GHz boost speeds. Announcing the new chip, AMD CEO Lisa Su made bold claims about its performance, saying it's up to 30% faster than Apple's M1 Pro chip. In specific tasks, AMD claims the chip is 34% faster than the M1 Pro in multiprocessing workloads and 20% faster than the M2 in AI tasks.

One cornerstone of Apple silicon is energy efficiency, and in that area, AMD claims the new AMD Ryzen 7040 series will offer 30+ hours of video playback in ultrathin notebooks. Built directly into the series of chips is Ryzen AI, a dedicated AI engine embedded in the processor. AMD chips configured with Ryzen AI are 20% faster in AI tasks than Apple's M2 chip while being 50% more energy efficient, according to the company.

To showcase the new chip's performance, AMD compared a high-end Intel chip, the M1 Pro, and its new Ryzen 9 7940HS processor rendering an object in the popular application Blender. In the time-lapsed video shown on stage, the M1 Pro lags behind the Ryzen 9 7940HS in rendering the object. AMD says it made its performance claims against a MacBook Pro with M1 Pro, 32GB of unified memory, and 1TB of SSD storage running macOS Monterey. The M1 Pro is not Apple's most powerful laptop chip; that title belongs to the M1 Max, which AMD did not compare its chip against.
After roasting the M1 Pro, Ian Zelbo from FrontPageTech noticed AMD running their CES keynote on multiple 14-inch MacBook Pros. "Obviously these are contracted employees, and it means nothing," he tweeted. "I just always find stuff like this hilarious."

We do too... It's akin to the "Twitter for iPhone" line on tweets that have gotten Android promoters in hot water multiple times over the past several years.
  • Hopefully all these claims play out in the real world. It's always good to have competition...

    What was not quite clear to me was whether the increased speed is always paired with lower power usage than the M1. It seemed like that was stated for only one case.

    • Re: (Score:2, Interesting)

      by fermion ( 181285 )
      AMD has been making top chips for 53 years. Apple has been making laptop chips for two years, and mobile chips for less than a decade.

      Apple also makes highly engineered machines. There is no one part that is more important. This was why it used to be funny to see people buy a machine with a 'fast' processor in a severely under-built machine. Ultimately, the measure of the AMD chip's success will be how cost-effective it is to integrate into a full laptop.

      • I do not agree that having done it longer necessarily means you can do it better. I do agree with the whole appliance approach. The M1 Max is not Apple's state of the art in processors; it is a year old. Also, I am not sure it is wise to compare yourself to a competitor that has shown such engineering excellence in your field, especially when they have the financial and engineering might to roast you. Unless you leapfrog them, and AMD has not done so here.
        • by dfghjk ( 711126 )

          LOL

          So what if the M1 Max is a year old?

          What could possibly be better than comparing yourself to recognized excellence... when you compare favorably?

          Intel had the "financial and engineering might" to "roast" AMD for decades and still do, yet AMD has done just fine comparing themselves to Intel.

          It's almost as though you think Apple is magic.

          • So what if the M1 Max is a year old

            Just two problems with that statement:

            1) They are comparing to the M1 Pro, not Max.

            2) The M2 exists already, has for some time, so why compare to an older product?

            • by Entrope ( 68843 )

              There is almost no difference in CPU performance between the M1 Pro and the M1 Max. A GPU performance comparison would be interesting in some respects, but Apple's GPUs are limited in a lot of ways.

              The M2 has slightly better single-core performance than the M1 Pro and M1 Max, but fewer cores, so the multi-core performance is worse. That raises an interesting point about AMD's performance claims: The Ryzen 7940HS is an 8-core/16-thread CPU, but the M1 Pro is a 10-core/10-thread CPU. If AMD got the adverti

          • AMD has never been real competition for Intel as a company; they have been in CPUs and chipsets.

            Intel is a much bigger business than just PCs and servers.

            Intel could compete more aggressively with AMD, but they are too busy counting their money.
      • Comment removed based on user account deletion
        • by fermion ( 181285 )
          AMD has been manufacturing chips for at least 40 years. They have been seriously designing chips for at least 30 years.
    • Question for the sequential logic designers out there: is this speed increase architectural (in that somehow they require fewer gates or stages to perform some task), or is this due only to process node?

      I mean, designs are such that basically everyone does an add or mul with the same circuit in the ALU, yes? So why would one design perform math faster than another?

    • by vlad30 ( 44644 )
      More interesting is that the comparison is made with Apple, not Intel, AMD's usual rival.
  • Laptop memory isn't replaceable today, so why not just put memory on the same package like Apple did? They already have advanced packaging technologies.
    • by Mononymous ( 6156676 ) on Thursday January 05, 2023 @06:59PM (#63183298)

      You're buying your laptops from the wrong company. Don't incentivize that nonsense.

      • My Dell XPS 13 vs. M1 MacBook Air: same crap.
        • by slaker ( 53818 )

          My 2022 ThinkPad X1 Extreme has two DIMM slots, two NVMe slots, and both an iGPU and a discrete GPU. It weighs under 2kg. I paid under $1500 for it.
          If you wanted a laptop with expansion and didn't buy one, you're part of the problem.

          • by edwdig ( 47888 )

            Pretty much all brands have soldered on RAM on at least some models nowadays. My 2018 ThinkPad T480s has one soldered on DIMM and one slot. When I bought it, to get a second DIMM slot I would've had to move to a model that weighed an extra pound. Generally if you give up some upgrade flexibility, you can get a much lighter laptop. The weight reduction meant more to me than the extra RAM flexibility.

        • by saloomy ( 2817221 ) on Thursday January 05, 2023 @08:06PM (#63183422)
          It is a trade-off. Would you rather have a laptop with off-die L1, L2, or L3 caches at a sacrifice of power, performance, etc.? Just because we used to do it that way does not mean it is the best way to do it. Laptops are lighter, thinner, and feel better because we can remove things like optical drives, battery-exchange cases, and DIMM slots.
          • by dfghjk ( 711126 )

            "Laptops are lighter, thinner, and feel better because we can remove things like optical drives, battery exchanging cases, and dimm slots"

            Optical drives, sure; "battery exchanging cases," meh; DIMM slots, bullshit.

            and "lighter and thinner" sure, but "feel better" is nonsense. Removing that stuff never made anything "feel better".

            It's almost as if you think Apple is magic.

            • by ceoyoyo ( 59147 )

              LPDDR is designed to be low power and small form factor for laptops. You solder it on instead of putting it in a module because longer signal paths induce signal timing issues and connections involve impedance problems, voltage drops, and signal timing issues.

              Soldered RAM lets laptops be lighter, smaller, faster and use less power. Those are pretty much the things that make one "feel better."

              You can expect desktops to ditch sockets as RAM gets faster too. And putting it on the die is even better than soldering.

              • by CaptQuark ( 2706165 ) on Friday January 06, 2023 @03:22AM (#63184082)

                Another new option, rather than just soldering the memory directly to the motherboard, is the new CAMM standard. It addresses the problems with upgradability, path lengths, etc. I hadn't heard much about it before LTT did a review last month. It might be old news to some but new to others.

                I've NEVER Seen a Memory Stick Like THIS - CAMM Explained [youtube.com]
                Dell CAMM DRAM: The New Laptop Standard? [storagereview.com]
                CAMM memory preview: The Dell SODIMM revolution [notebookcheck.net]

                And other reviews can be found...

                  I've been hearing about CAMM for a few months.

                  Not a big fan, especially since an upgrade means dumping your old memory. With SODIMMs you can just add another module to a free slot and keep your existing RAM. You may end up with odd totals such as 24GB, but all of your RAM, including the older modules, stays in use.

                  CAMM requires you to fully replace the previous RAM in your laptop. And since all laptops come with at least some RAM, you can't even pass CAMM modules to an older laptop and add to it. Eventually you will e

                  • This. And, they are not as efficient as on-die RAM. That, and they would not be as form-factor compliant with the thin and light on-die memory, and would consume more power.
                • by mjwx ( 966435 )

                  Another new option, rather than just soldering the memory directly to the motherboard, is the new CAMM standard. It addresses the problems with upgradability, path lengths, etc. I hadn't heard much about it before LTT did a review last month. It might be old news to some but new to others.

                  I've NEVER Seen a Memory Stick Like THIS - CAMM Explained [youtube.com]
                  Dell CAMM DRAM: The New Laptop Standard? [storagereview.com]
                  CAMM memory preview: The Dell SODIMM revolution [notebookcheck.net]

                  And other reviews can be found...

                  Great, another thing I need to check a new laptop doesn't have before I buy one.

                  This is really a solution looking for a problem. SODIMM is good enough that even a better replacement won't supplant it. It's why we still sell more DVD writers than Blu-ray writers and DVDs still outsell Blu-rays: Blu-ray was a better technical solution for a problem no one had.

                  Why can't we make laptops a few cm thicker? Last year I bought a new 15" Asus that had 2 replaceable DIMMs and 2 replaceable M.2 slots. For a few quid, I doubled

                • by ceoyoyo ( 59147 )

                  It's a clever redesign that overcomes some of the limitations of DIMM, especially in laptops, but it's just optimizing closer to the physical limits. You're still going to need traces longer than on-chip memory and almost always longer than soldered, and it still has a physical connection with all of the electrical limitations that imposes.

                  There's a place for all the solutions. DIMM for people who want easily upgradable laptops and don't care so much about form factor or speed, low power soldered for people

      • no.

        You're buying LPDDR, which has substantially lower power draw than DDR and has no socketed form factor. Even Lenovo uses soldered RAM in their small laptops, and they have socketed Bluetooth radios.

        • sockets for radios (Score:4, Informative)

          by Firethorn ( 177587 ) on Friday January 06, 2023 @12:01AM (#63183768) Homepage Journal

          They socket the radio systems - not just bluetooth, for a very simple reason: regulations.

           If they put the radio on the motherboard, then the entire motherboard has to be tested against regulatory standards for radio transmission. When they change it, it has to be tested again, in multiple countries.

           Or they can put it on a small bus as an add-on card, and then just test the add-on card. Hell, if they need a different card because a different country has its own requirements, it can be done.

          Memory being replaceable or upgradable just isn't the same drive.

          • They socket the radio systems - not just bluetooth, for a very simple reason: regulations.

            I think you are mistaken on two points: firstly the motherboards have to be tested regardless of the radio because they have GHz signals and can spew interference. Secondly, and here's where the rules are less clear to be honest (and I'm rusty), but I believe the assembly has to be tested. Just because the radios are theoretically detachable, ISTR if they're sold as a combined package ready assembled, they count as a s

        • As long as they add enough of it, it would be fine for me. My desktop has 32GB of ECC RAM, and I expect this to be enough for the next 10 years, as technological progress has slowed down somewhat.
          The memory came in two modules for 95 Euro each (summer 2022), so making a laptop with 32 GB instead of 16GB should be doable for an extra $100.

            • While I don't necessarily agree that 32GB is enough for 10 years (maybe growth has slowed of late, but that isn't necessarily an indicator of where it might go in the next 10 years), I do agree that pricing should reflect this. Yes, on-die memory is more expensive because the die has to be larger, so you get fewer chips per wafer, but only to a certain extent. Profit from memory should be held near constant.
      • Don't incentivize that nonsense.

        What nonsense? A laptop is a utility device. Heck, the entire modern PC is. Very few PCs are upgraded these days. We're well beyond the days where a simple memory upgrade is the solution to a problem. Buy the correctly spec'd laptop / PC and use it till it breaks. They no longer cost $3000 (inflation adjusted) for a base model like they did in the 90s.

        The only part of a laptop that actively needs to be removable is the SSD, and that only to recover data if it's damaged. Even then if you live your life in the

    • by grub ( 11606 ) <slashdot@grub.net> on Thursday January 05, 2023 @07:21PM (#63183336) Homepage Journal
      My Framework laptop has replaceable memory.
    • I just upgraded my MSI laptop's memory last week, from 16GB to 64GB. When I bought it 3 years ago 16GB was plenty, but I've since started working on large datasets that require more memory, so I upgraded; it cost $200. Installation took me about 10 minutes, one screwdriver, and a guitar pick as a spudger.
  • With much lower storage and RAM pricing!

    Apple storage pricing is at least 2X or more than other M.2 disks.

  • First off, it was the AV guys. Secondly, none of the pics he took had the presentation running on the screen, AFAIK. It would have been a more convincing argument if he had taken a pic of the same slide on the Mac and the stage; I couldn't see anything like that. What I really want is for these chips to get into the hands of benchmarkers for review.
  • by RussellTheMuscle ( 2783037 ) on Thursday January 05, 2023 @06:57PM (#63183294)
    Until then, it's marketing.
    • Mod parent up for truthiness. Vaporware isn't hardware.

      • 4nm vs. the M2's 5-nanometer process is a 20% difference. So just the shrink gives a 20% advantage; add extra cores and you get 32%. That is no real gain. Wait till Apple does the same shrink and ups the clock speed as well. Maybe they have booked 3nm fab slots.
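The parent's arithmetic can be sketched numerically. Note the 20% and 32% figures are the commenter's own claims, and attributing the whole remainder to cores/architecture is an assumption for illustration:

```python
# Hypothetical decomposition of the claimed gain: if the process
# shrink alone is worth 20%, how much of the 32% overall claim is
# left for extra cores and architecture? (Both inputs are the
# parent commenter's numbers, not measured data.)
shrink_gain = 1.20   # claimed 5nm -> 4nm advantage
total_gain = 1.32    # claimed overall advantage

residual_gain = total_gain / shrink_gain
print(f"left for cores/architecture: {residual_gain - 1:.0%}")
# left for cores/architecture: 10%
```

Which is the commenter's point: once the node advantage is factored out, little of the headline number is left.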
  • game on! (Score:4, Interesting)

    by aRTeeNLCH ( 6256058 ) on Thursday January 05, 2023 @07:00PM (#63183300)
    So, I just upgraded my 65W Ryzen 5 2400G with a 35W Ryzen7 5750GE "to reduce energy consumption and not let putain win" (yes, the French got that one right all along)...
    It rocks to have 16 logical cores.

    But this new laptop CPU, awesome! I don't really care whether it's really always faster at lower power consumption, or only in select cases, it matters a lot that this is highlighted at the introduction. The race is on, let's celebrate that.

    Another claim I often see is that 8 to 10 hours of battery is enough (640 minutes, anyone?), but all I can say is: the longer your battery runs, the fewer times you need to charge it, and the longer its total life will be.

    • by G00F ( 241765 )

      Huh, I've recently been looking for a low-power CPU to replace my current server (a 4-core embedded Celeron that uses about 10 watts) and didn't see that one come up. I only saw some older 4-core/4-thread parts at 35 watts; otherwise I saw the 5300G/5600G at 65 watts.

      • It's an OEM only part AFAICT.

        • It's not OEM only, but it's very expensive. I also got ECC memory, which surprisingly wasn't supported on my 2400G: all socket AM4 CPUs support it, but APUs only do when they are Pro... so I just had to get the Pro 5750GE (I'd have taken the 6-core Ryzen 5 5650, but it remained unavailable).
    • by dfghjk ( 711126 )

      "8 to 10 hours battery is enough (640 minutes anyone?), but all I can say is: the longer your battery runs, the fewer times you need to charge it, and the longer total life it will have."

      And yet 8-10 is still frequently enough, especially for a class of users who never need to operate on battery for long periods. For those users 8-10 is enough, device life is just fine and recharging isn't a burden.

      • I'm not saying 10 hours isn't enough, I'm saying more is better. Or rather, less power consumption is. With 10 hours you have to recharge each working day, where with 20 hours it's each other day. So the point where your battery capacity drops to below 8 hours (not a full working day) and you have to be careful to bring your charger is after 4 or even 6 years instead of before 2 (the 80% level is reached after 2x the time, 300 to 500 cycles or so, but below 8 hours is 40% for a 20 hour system). Nowadays, al
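The battery-life arithmetic above can be sketched with a toy model. The ~400-cycle rating, linear capacity fade, and 8-hour working day are all assumptions for illustration, not measurements:

```python
# Toy model of battery wear: capacity fades ~20% per 400 full charge
# cycles (an assumed Li-ion rating), and one full cycle is consumed
# per `rated_hours` of use. A bigger battery cycles less often AND
# can lose more capacity before dropping below a usable 8 hours.
CYCLES_TO_80_PCT = 400
WORK_HOURS_PER_DAY = 8

def days_until_below(rated_hours, threshold_hours=8):
    fade_per_cycle = 0.20 / CYCLES_TO_80_PCT
    floor = threshold_hours / rated_hours      # usable-capacity floor
    cycles = (1 - floor) / fade_per_cycle
    return cycles * rated_hours / WORK_HOURS_PER_DAY

print(round(days_until_below(10) / 365, 1))  # 10h battery: ~1.4 years
print(round(days_until_below(20) / 365, 1))  # 20h battery: ~8.2 years
```

Under these crude assumptions the 20-hour battery stays above a full working day roughly six times longer: extra rated hours buy both fewer cycles and more headroom before the capacity floor.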
      • by tlhIngan ( 30335 )

        "8 to 10 hours battery is enough (640 minutes anyone?), but all I can say is: the longer your battery runs, the fewer times you need to charge it, and the longer total life it will have."

        And yet 8-10 is still frequently enough, especially for a class of users who never need to operate on battery for long periods. For those users 8-10 is enough, device life is just fine and recharging isn't a burden.

        8-10 on the box isn't enough - if the vendor is somewhat honest (Apple is, oddly enough) then you can probably

    • by AmiMoJo ( 196126 )

      I'm hoping that Lenovo step up their next gen ThinkPads. The current ones are almost great, but lack ports and expansion options.

  • by backslashdot ( 95548 ) on Thursday January 05, 2023 @07:06PM (#63183308)

    Many years ago, when doing contract work, I was at a meeting at Apple wherein a vendor came and was setting up to make a presentation with a PC. As soon as the director walked in, he said, "What is that? You know what. We're done here. Wrap this up!" and walked out. It was pretty damn funny. In fact, my friends who work there tell me it is a pretty common occurrence at Apple that if you showed up with a non-Apple device -- even if you weren't presenting, you got kicked out or at least complained about -- no matter who you were. Even wearing a non-Apple watch is a risky thing -- won't help you, that's for sure. Contrast that with Tesla; I noticed their parking lots don't have as many Teslas as you'd expect, which I think is weird.

    • by sphealey ( 2855 )

      - - - Contrast that with Tesla; I noticed their parking lots don't have as many Teslas as you'd expect, which I think is weird. - - -

      That just might have something to do with the percentage of gross income that the respective firms pay out in salaries, and the median time on payroll of the median employee at each firm.

      • by dfghjk ( 711126 )

        As a complementary story to this, I worked for 15 years for a rapidly growing PC manufacturer that would go through phases of hiring from one industry competitor in particular (based on where the newest executive came from).

        The worst example of this was the Apple hiring phase, where the company hired many (50+) Apple engineers and managers. They were conspicuous for requiring solely Apple hardware, most notably Newtons. Not one of them ultimately contributed to the company's success and only one notably la

        • Hate to break it to you, but that would have been 25 to 30 years ago. You are getting old. Apple discontinued the Newton in 1998. A person called Steve came and remade the company, killing lots of products such as the Newton and changing the management. The years before the second coming of Steve were definitely the bad years, regardless of what you may think about the company now.
    • Wow, they were never so petulant and insecure in the '90s. Success ought to breed confidence, not arrogance.

      The arrogant always fall.

    • Well that's Apple's corporate culture for you. You're either one of us or your opinions are invalid and you're not worth listening to.

      Though that said, there's a big difference between expecting your employee to have a specific $500 fashion accessory, vs. your employee having a $35,000 core mobility device that requires infrastructure in your home.

      My company recently changed policies to reduce CO2 emissions on company lease vehicles. The new policies basically mandate electric cars or PHEVs (and the latter com

  • I have it on decent authority, that the entire presentation was done using Macs.
  • "30% faster" (title) and "up to 30% faster" (summary) are not the same thing. Which is it?

    Personally, I am up to a factor x>1 faster than the world's fastest person, with x possibly approaching infinity. Well, that is for the particular situation that (s)he is hospitalized and cannot leave the bed. But still, I am "up to infinity % faster than the world's fastest person". Impressive, eh?

    We'll have to wait for actual benchmarks before there is any point in getting too excited.
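The point about "up to" can be illustrated with made-up numbers: a vendor's headline quotes the single best benchmark, while the typical case is smaller.

```python
# Made-up per-benchmark speedup ratios of one chip over another;
# ratios above 1.0 mean "faster". None of these are real measurements.
speedups = [0.95, 1.05, 1.12, 1.30]

best = max(speedups)                            # what "up to" advertises
typical = sorted(speedups)[len(speedups) // 2]  # a middle-of-the-pack case

print(f"up to {best - 1:.0%} faster")             # up to 30% faster
print(f"typical case: {typical - 1:.0%} faster")  # typical case: 12% faster
```

Both statements are "true" of the same data, which is why independent benchmarks across many workloads matter more than the marketing number.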

  • No date was given by AMD for the new chip. Meanwhile, Apple is developing and improving the M2, and an M3 is in the works. The M1 is old at this point, and new Apple products will have the M2 and faster.
    • I just bought a 'discounted' M1 Max with 32GB of RAM around Christmas ($500 off). It may be old, but it will be more computer than I need for the next 5-7 years.

    • Exactly, the original M1 came out in November 2020; that's 2 1/4 years ago. The M2 is 35% more powerful than the M1 at graphics tasks, and it has been shipping for 8 months already. So kudos to AMD for pushing the envelope, but it's most likely that this Ryzen chip is at best comparable to the M2, which implies that it will be significantly slower than the rumored M3.

  • I need a new Windows laptop but have been agonizing for months over whether to buy an M1/M2 MacBook Air and run Windows 11 ARM in Parallels (office IT only supports Windows) or just get a good Windows laptop. The difference in performance between Apple and Intel/AMD silicon is huge, before even considering power efficiency etc.

    Intel has only now inched somewhat closer to the M1 with their 12th-gen CPUs, and a lot of those benchmarks flatter Intel compared to actual use.

    If you check scores for last 3-4 gen Intel cpus y

    • by caseih ( 160668 )

      Just went through this when needing to get a new laptop for an individual in my organization. Spent all day looking at Windows laptops in my budget. There are dozens and dozens of laptops available, mostly all garbage, poor battery life, and various other compromises. Even a few models that were highly recommended to me didn't seem all that appealing. So in the end, despite my disdain for Apple and dislike of macOS, I bought a MacBook Air M2 for the same price. Yes it's full of its own compromises, bu

  • by Casandro ( 751346 ) on Friday January 06, 2023 @09:17AM (#63184434)

    Just make the battery an external component you can slot in. This way you could offer a tiny 20 Wh battery for people who want to have a slim laptop or a normal sized 100 Wh battery for people who want to use their laptop on battery.

    Bonus points for having a secondary battery so you can replace the primary one on the go.

  • For a moment I thought that Apple proved RISC [wikipedia.org] to be the winner on efficiency and performance.

    Now, AMD claims 30% faster than the M1 and 30+ hours of video playback?

    If this is true, this could be a game changer, not being tied to RISC/ARM for energy efficiency.
  • Part of the reason is that Apple has a head start, but the bigger reason is that they have more resources brought to bear and an easier job to do.
    Intel and AMD basically have to build RISC processors inside x86 translators. This gives them a complexity-management disadvantage and a silicon efficiency/availability disadvantage, and on top of that, they are starting to get out-spent on R&D. The R&D budgets of Apple + ARM + TSMC all combine to improve Apple's processors, and their goals and

"It's the best thing since professional golfers on 'ludes." -- Rick Obidiah

Working...