
Apple Ditched Intel, and It Paid Off (cnbc.com) 101

An anonymous reader quotes a report from CNBC, written by Todd Haselton: Apple's decision to ditch Intel paid off this year. The pivot allowed Apple to completely rethink the Mac, which had started to grow stale with an aging design and iterative annual upgrades. Following the divorce from Intel, Apple has launched far more exciting computers which, paired with an ongoing pandemic that has forced people to work and learn from home, have sent Apple's Mac business soaring. It wasn't always a given. When Apple announced its move away from Intel in 2020, it was fair to question just how well Apple could power laptops and desktop computers. Apple has used in-house chips for iPhones and iPads but had been selling Intel-powered computers for 15 years. It wasn't clear how well its macOS desktop software would work with apps designed to run on Intel chips, or whether its processors would offer any consumer benefits and keep up with intensive tasks that people turned to MacBooks to run. Those fears were quickly quelled.

The first M1 Apple chip was launched in 2020 in a MacBook Air laptop. It was more powerful than Intel's chip while offering longer battery life and enabling a fanless design, which helped keep Apple's new MacBook Air even quieter. It proved to be an early success. In April 2021, CEO Tim Cook said during the company's fiscal second-quarter earnings call that the M1 chip helped fuel the 70.1% growth in Apple's Mac revenue, which hit $9.1 billion during that quarter. The growth continued in fiscal Q3, when Mac revenue was up 16% year over year. That quarter, it launched the all-new iMac, which offered a redesigned super-thin metal body that looks like a screen propped up on a stand. It's slimmer than the Intel models that came before it, while offering other benefits, like a much better webcam, great speakers and a much sharper display than the models it replaced. And Apple made the launch more exciting by offering an array of colors for the iMac, which it hadn't done since it shipped the 1999 iMac. There was a slowdown in fiscal Q4, when Mac revenue grew just 1.6%, as Apple, like all manufacturers, saw a slowdown from the burst of sales driven by the start of the pandemic and dealt with supply chain woes. But fiscal Q4 sales didn't include revenue from its most exciting new computer of the year.

Apple's fiscal Q1 earnings in January will give an indication of how well all its new computers are selling. But it's clear the move from Intel has allowed Apple to move full speed ahead with its own chip development, much like it does for iPhones and iPads, the latter of which has yet to be matched by any other tablet on the market. It's no longer beholden to delays that plagued Intel, which started to lag behind AMD with its new 7nm chips. And Apple has full control over its "stack," which means it can design new computer hardware and software together, instead of letting the power of another company's chips dictate what its computers can and can't do.

  • time will tell. (Score:3, Insightful)

    by bloodhawk ( 813939 ) on Wednesday December 29, 2021 @05:10PM (#62126227)
    I think we need to wait a few years to see whether it paid off, short term won't be the deciding factor, it will be how well they innovate and keep up with AMD and INTEL. If they sit on their hands it will look like a really bad decision in a couple of years time.
    • Re:time will tell. (Score:5, Informative)

      by Artem S. Tashkinov ( 764309 ) on Wednesday December 29, 2021 @05:38PM (#62126327) Homepage

      how well they innovate and keep up with AMD and INTEL

      You've got it backwards. Apple M1 destroyed [anandtech.com] both [anandtech.com] Intel and AMD uArchs by offering comparable or higher performance at a significantly lower power package. From the linked review:

      What's really important for the general public and Apple's success is the fact that the performance of the M1 doesn't feel any different than if you were using a very high-end Intel or AMD CPU. Apple achieving this in-house with their own design is a paradigm shift, and in the future will allow them to achieve a certain level of software-hardware vertical integration that just hasn't been seen before and isn't achieved yet by anybody else.

      • Re:time will tell. (Score:5, Informative)

        by DamnOregonian ( 963763 ) on Wednesday December 29, 2021 @06:28PM (#62126507)
        This overstates the situation.
        You're right that, in work units per watt, the M1 fucking annihilated its x86 rivals.
        However, that's the only metric where it did.
        "Comparable or higher performance" is incorrect the way you word it.
        In the benchmarks you link, they're demonstrating its (unimpeachable) superiority over CPUs with similar power envelopes.
        The high end Intel and AMD parts smoke even today's M1 Max. Of course, they use quite a lot of power, but still, a win is a win when we're discussing "who has the best performance".

        My 2-generation-old 10900HK has comparable multicore performance to my M1 Max, and significantly superior single-core performance.
        My M1 (Air, but I understand the MBP M1 wasn't significantly better) is more or less laid to waste.

        I'm not shitting on Apple here. I fucking love my M1* devices. First laptops I've bought in a decade that I was really excited about. But when you get rid of the per-watt from any of their metrics, they're middling laptop parts at best.
        • Re:time will tell. (Score:4, Interesting)

          by R3d M3rcury ( 871886 ) on Wednesday December 29, 2021 @09:29PM (#62126923) Journal

          In the benchmarks you link, they're demonstrating its (unimpeachable) superiority over CPUs with similar power envelopes.
          The high end Intel and AMD parts smoke even today's M1 Max. Of course, using quite a lot of power, but still, a win is a win when we're discussing "who has the best performance"

          I've gotta admit, this is where I start to gripe about Apple.

          Performance per watt is something very important for a laptop, of which Apple sells a lot. I can appreciate that my MacBook uses significantly less electricity to do its tasks but, ultimately, I want it to finish before 5:00PM so I can go home. My MacBook is plugged into a wall socket, so I really don't care how much power it's using.

          I'll be curious to see what Apple does with the Mac Pro. That should be Apple's "Performance, period" machine--I don't care about how much electricity it uses, I don't care that much about whether its fan comes on or not, I just want it to finish what I tell it to do as fast as possible so I can go home.

          • Well, to be fair, the M1 Max does outperform any other MacBook made... but the Intel MacBooks falling short was Apple's fault, not Intel's. Intel didn't lie to them about what its thermal requirements were; Apple decided to put it in a chassis that couldn't keep it cool enough to actually get its advertised performance.

            Anyway, I agree. I'd love to see some focus on trying to get the clock speeds up on these.
            Sure, they lose some efficiency. Sure, it'll take some work to get the architecture working stably at 5 GHz. But I think it'd be
        • by Pieroxy ( 222434 )

          My 2-generation-old 10900HK has comparable multicore performance as my M1 Max, and significantly superior single-core performance.

          While I agree in general with your post, the 10900K has similar multicore performance but far worse single core performance than the M1 Max:

          https://www.cpubenchmark.net/c... [cpubenchmark.net]

          • Couple of things.
            First, I meant 10980HK (that's the laptop part, the 10900K is a desktop part) (Can the desktop part be... slower? Looks like it's got 10 cores instead of 8)
            That's my bad.
            Second, I'm seeing mixed results online... which is odd.
            OK, so it turns out the version of Cinebench I was using is still using Rosetta- so it was translated. Those results are trash.
            Native, (R23), the Max wins in both single and multicore.

            You sir, are correct. The M1 Max is faster. Though, I swear on my life, it sur
        • The high end Intel and AMD parts smoke even today's M1 Max.
          According to Apple, it is the other way around.

          But when you get rid of the per-watt from any of their metrics, they're middling laptop parts at best.
          Not according to Apple. And at least in Europe lying in a benchmark/advertisement is forbidden.

          I did not bother to check it though ... it is not the time yet for me to buy an Mx Apple :D

      • by jwhyche ( 6192 )

        You apple fanboys are just as delusional as those old Amiga fanboys were.

      • Huge exaggeration there. In some areas Apple definitely is ahead; in others, like graphics, they are light years behind, and a little behind on core scaling as well. But they are well positioned if they can innovate on what they have; if they don't, they will rapidly be dead in the water.
      • The M1 *was* more efficient. Not faster, but more efficient. But you replied to "time will tell". AMD and Intel will come out with new CPUs next month and next year.

        Intel's R&D budget is $13 billion each year.
        Is Apple going to spend $13 billion / year and hope to keep up?

        At first, Apple's decision didn't immediately blow up in their face. Over the next 10 years we'll find out if it turns out to be a good strategic decision.

        • by teg ( 97890 )

          The M1 *was* more efficient. Not faster, but more efficient. But you replied to "time will tell". AMD and Intel will come out with new CPUs next month and next year.

          Intel's R&D budget is $13 billion each year.
          Is Apple going to spend $13 billion / year and hope to keep up?

          At first, Apple's decision didn't immediately blow up in their face. Over the next 10 years we'll find out if it turns out to be a good strategic decision.

          Apple spent almost $19 billion on R&D in fiscal 2020 [nasdaq.com]. That said, comparing R&D budgets doesn't tell the whole story.

          Apple spends a ton (by far the most, probably) of its R&D budget on other things than CPUs - both software, the devices they sell, devices they want to sell in the future - and some of the components in them. Intel also spends a lot of its budget on other things than CPU design... e.g., they spend a lot of money on the processes to create the CPUs themselves. This is the area where Intel use

          • > Also, the starting point and approach (few variants vs. a gazillion variants to maximize market segmentation) impact how much money you need to spend.

            You might find it interesting to Google semiconductor binning.
            Intel doesn't design nearly as many SKUs as it sells. Got a core that doesn't work on this 8-core CPU? You sell it as a 6-core CPU. It's unstable at the target speed? Sell it as a slower speed.

            Being able to sell all the rejects under different SKUs is advantage Intel.

            • by teg ( 97890 )

              > Also, the starting point and approach (few variants vs. a gazillion variants to maximize market segmentation) impact how much money you need to spend.

              You might find it interesting to Google semiconductor binning.
              Intel doesn't design nearly as many SKUs as it sells. Got a core that doesn't work on this 8-core CPU? You sell it as a 6-core CPU. It's unstable at the target speed? Sell it as a slower speed.

              Being able to sell all the rejects under different SKUs is advantage Intel.

              Apple already does this - there's quite a variation in the core counts on the CPU and GPU side, and I'm sure the iPad Pro also allows for yet another dimension for binning. Intel also does market segmentation on instructions and capabilities - e.g. a MacBook Air will trash any Intel laptop or desktop for many ML tasks, as every chip has an ML accelerator. Intel has this in "select Xeon cpus" only, in order to extract the most possible revenue. I'm just suggesting that doing it this way might lead to bad res

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      I think we need to wait a few years to see whether it paid off, short term won't be the deciding factor, it will be how well they innovate and keep up with AMD and INTEL. If they sit on their hands it will look like a really bad decision in a couple of years time.

      Yeah, this is a really short term payoff. Our workplace is being hobbled by the lack of printer drivers. People are having to send print jobs to Windows users to get work done, and some users are now asking for Windows workstations. Funny how some bits of Mac software aren't essential when you can't do something basic like print.

      • Re: (Score:3, Insightful)

        How is it Apple’s fault that your printer vendor is not supplying drivers?

        • by hazem ( 472289 )

          How is it the printer vendor's fault that Apple broke their hardware and software platforms so their drivers no longer work?

          • How is it the printer vendor's fault that Apple broke their hardware and software platforms so their drivers no longer work?

            It is 100% the printer vendor's fault. They knew this was coming and had the specs from Apple long before the release. In most cases, they just needed to recompile for the new arch, with no changes to the API.

            If your printer company is this incompetent, perhaps you should buy your printers from someone else.

            My printer is a Canon. I bought it at Walmart for $39. It works fine with the M1 MacBook.

            • by xwin ( 848234 )
              How does it matter whose fault this is? People can't print at that place. Corporations do not go to Walmart to get printers. They purchase print centers and copy centers. They will not change the printing hardware with every PC upgrade cycle.
              • How does it matter whose fault this is?

                It matters because it is a lot easier to switch printers than to switch laptops.

                If I google for people having printer problems with the M1, all of them are from a year ago when M1 Macs were first introduced. So it looks like this isn't a real problem anymore.

            • by jwhyche ( 6192 )

              It is 100% the printer vendor's fault.

              It's pretty much Apple's fault. They switched hardware again and another vendor decided not to throw away good money on a system that is less than 2% of the market.

      • by shking ( 125052 )
        Has your workplace tried the Gutenprint open source drivers? https://sourceforge.net/projec... [sourceforge.net]
      • I can't believe this is a real post for two reasons: who is actually printing a lot, and what printers are not supported on M1?

        Not that I care either way but the anonymous post looks like it's got an agenda.

      • by teg ( 97890 )

        I think we need to wait a few years to see whether it paid off, short term won't be the deciding factor, it will be how well they innovate and keep up with AMD and INTEL. If they sit on their hands it will look like a really bad decision in a couple of years time.

        Yeah, this is a really short term payoff. Our workplace is being hobbled by the lack of printer drivers. People are having to send print jobs to Windows users to get work done, and some users are now asking for Windows workstations. Funny how some bits of Mac software aren't essential when you can't do something basic like print.

        Why did anyone buy printers that aren't just standard Postscript or "IPP everywhere" [pwg.org] compatible? I thought most printers supported this, as some people, for reasons I just don't understand, want to be able to print from their mobile devices.
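
        For what it's worth, "IPP Everywhere" printing needs no vendor driver at all; CUPS can build the queue from the printer's own attributes. A minimal sketch (the hostname and queue name here are hypothetical):

          # Check that the printer really speaks IPP Everywhere (hypothetical hostname)
          ipptool -tv ipp://printer.local/ipp/print get-printer-attributes.test

          # Create a driverless queue; "-m everywhere" tells CUPS to generate the driver itself
          lpadmin -p office-mfp -E -v ipp://printer.local/ipp/print -m everywhere

          # Send a test job to the new queue
          lp -d office-mfp /etc/motd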

  • when everyone wants the new shiny, but I have a feeling that in the long term, tightly coupling software to hardware could turn out to be disastrous.

    On the other hand, Apple probably have enough cash to rewrite everything from scratch if they find themselves in a cul-de-sac.
    • How long a term? (Score:3, Informative)

      by SuperKendall ( 25149 )

      I have a feeling that in the long term, tightly coupling software to hardware could turn out to be disastrous.

      It's worked out spectacularly well for Apple since 2007 with the release of the first iPhone.

      In fact it worked so well that was the motivation for moving the Mac line that way as well.

      Just how long of a timeframe are you thinking of?

      I would counter and ask: is it time we re-think whether loosely coupling software and hardware makes sense anymore in a world where it's so easy to design custom hardware?

      • by dmay34 ( 6770232 )

        Just how long of a timeframe are you thinking of?

        Let's make a bet on this: In 5 years everyone (Qualcomm, Intel, AMD, and many others) will have caught up to and surpassed Apple, so much so that Apple will again be marketing their products in fluffy and non-quantifiable terms, and knock-off Chinese companies will have caught up to where Apple is today.

        • What makes you think Apple will stop developing new chips? I mean for a while the $350 iPhone SE was performing better than $1000 flagship Android phones.

        • In 5 years everyone (Qualcomm, Intel, AMD, and many others) will have caught up to and surpassed Apple

          These companies are not going to stop improving but neither is Apple. It's the "surpassing" part I'm having trouble seeing overall, if in five years they are all still stuck on Intel architecture. I think there will continue to be a lot of leapfrogging honestly. But Apple will not be much if any behind at any stage of that.

          Although I am less sure about this part, I think within five years that Apple will

        • If Intel and AMD etc. do not switch to RISC-V: no way.
          Or to ARM, in which case they would just be "just the same".

      • He wrote "disastrous", not "disastrous for Apple". Those are not the same thing.
        • He wrote "disastrous", not "disastrous for Apple". Those are not the same thing.

          Ok, that's a great point, I've not considered it from that angle.

          However I would say we've had enough experience where hardware and software are tightly coupled to know it can work well and there's no indication the end result is disastrous. I would argue it makes much better use of hardware and tends towards software that is more usable by the end user, because it doesn't have to accommodate every possible device present, past

    • by swilver ( 617741 )

      On the other hand, Apple probably have enough cash to rewrite everything from scratch if they find themselves in a cul-de-sac.

      Rewriting things from scratch takes huge amounts of time, and then you will run into an interesting reality in the software industry: adding more money does not significantly reduce the time to completion.

  • but the Apple chips are not pro ready & pros are unlikely to pay Apple's RAM pricing at 256GB+

    • but the Apple chips are not pro ready

      The existing Pro M1 laptops are able to process seven streams of 8K ProRes. [tweaktown.com]

      That's better than a lot of desktops. How are the M1 chips not "Pro Ready"?

      unlikely to pay Apple's RAM pricing at 256GB+

      It only costs $400 to go from 32GB to 64GB in a top-of-the-line M1 MacBook Pro.

      • as a non apple guy what does it cost to get 64 gig ram in an entry level mac?
        • 400 bucks for 32 gig of ram seems high these days to me but im an amd guy
          • 400 bucks for 32 gig of ram seems high these days

            Higher than the cheapest RAM yes, but remember the guy I was responding to was claiming Apple charges more for RAM than a "pro" is willing to pay. Pretty sure a "pro" is OK paying $400 for 32GB of extremely fast RAM on a system that is $3k or more.

            Also what people are not thinking of here is that buying more RAM on the M1 systems is buying RAM not just for the CPU, but the GPU also... so you get double the benefit from an expansion.

            • The way I look at it, after paying that much for the system itself being charged through the nose for RAM is just a kick in the pants. No wonder Apple is such a wealthy company.
              • For your home use it does feel like a kick in the pants, but if you are buying it for professional use then the sticker price by itself isn't what's important. What's important is whether it will pay for itself by improved productivity. For a large number of people it will.

                If you can't justify the price, then there are other options and the MacBook Air has been shown to do well for a lot of people.

                Given so many of us do different professions, what is "pro" in

      • by DamnOregonian ( 963763 ) on Wednesday December 29, 2021 @06:43PM (#62126559)

        The existing Pro M1 laptops are able to process seven streams of 8K ProRes. [tweaktown.com]

        That's a bit of a 1-trick pony, right there. MediaEngine supports ProRes, while equivalent high-performance encoders like NVENC do not.
        That makes a lot of sense, since ProRes is an Apple format.

        My desktop GPU can do 8K 10-bit HEVC encoding at 30fps.

        My M1 Max can do it at around 0.7fps.

        It's amazing what we can do when we can leverage built-in media engines.
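
        As a rough illustration of what "leveraging the built-in media engines" looks like in practice, ffmpeg can target either vendor's hardware HEVC encoder instead of the software one; exact flags, supported resolutions and quality differ per chip and driver, so treat this as a sketch:

          # Apple silicon: VideoToolbox hardware HEVC encoder
          ffmpeg -i input.mov -c:v hevc_videotoolbox -b:v 50M -tag:v hvc1 out_apple.mp4

          # NVIDIA GPU: NVENC hardware HEVC encoder
          ffmpeg -i input.mov -c:v hevc_nvenc -b:v 50M out_nvidia.mp4

          # Software libx265 for comparison (runs anywhere, far slower)
          ffmpeg -i input.mov -c:v libx265 -crf 20 out_software.mp4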

  • I need to use my mac to develop docker-based software for deployment on 64-bit Intel or AMD Linux server slices.

    From what I understand, YMMV is the term used to describe how cross-compiled / multi-arch Docker image development will work on the new Apple silicon Macs (a rough sketch of the usual workflow follows this thread).

    So far, that has scared me away from getting one. Plus 1 for eliminating the touchbar though.
    • by CODiNE ( 27417 )

      You may be able to see significant cost savings from moving that software to ARM hosts. You also may not but it's definitely worth springing for a Mini or similar to get some testing on ARM done. Never know it may be well worth it.
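
    To make the cross-compilation point above concrete: the usual way to build images that run on both an Apple silicon laptop and x86-64 servers is a multi-arch buildx build. A minimal sketch (image name and registry are placeholders; amd64 builds on an M1 go through QEMU emulation and are noticeably slower):

      # One-time: create a builder that can target multiple platforms
      docker buildx create --name multiarch --use

      # Build and push a manifest covering both architectures
      docker buildx build --platform linux/amd64,linux/arm64 \
          -t registry.example.com/myapp:latest --push .

      # Or run the amd64 variant locally to match the deployment target
      docker run --platform linux/amd64 registry.example.com/myapp:latest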

  • by berchca ( 414155 ) on Wednesday December 29, 2021 @05:27PM (#62126295) Homepage

    I'm not saying the technical points of this article aren't accurate, but boy was this a fan article. I mean, yeah, 7.4% of the market in Q2 of 2021 (per IDC) is a big bump up from, well, 7.6% of the market in Q2 of 2020.

    No, wait...

    [IDC link: https://www.idc.com/getdoc.jsp?containerId=prUS48069721 ]

    • by jon3k ( 691256 )
      They also shipped 500,000 more units than they did in Q2 of 2020 at higher margins. (6.1M vs 5.6M, 9.4% growth, per your link). The market grew, they didn't quite grow at the same rate. Given the increase in margins and increase in total units shipped I don't think anyone at Apple is losing any sleep.
  • which had started to grow stale with an aging design and iterative annual upgrades.

    Switching to M1 doesn't prevent the mac from being stale.

    • Moreover, wtf does "stale" even mean in this context? Does it mean "we haven't completely redesigned just for the sake of redesigning it in a super long time?"

  • Blah blah blah blah blah blah blah...

    • Re: (Score:2, Insightful)

      by Petersko ( 564140 )

      And an anti-Apple FanBoi said...

      "An Apple Fanboi Said... Blah blah blah blah blah blah blah..."

      Neither contributed anything meaningful.

  • So when will we get good ARM laptops that run Linux well? Yeah, I know you can kind of run it on the M1 with some missing drivers, and that there are some extra-cheap Linux ARM laptops... But what about the powerful stuff?
    • One of these might suit you. They'd be faster than the laptop I've been using Linux on daily for the last few years.

      https://www.qualcomm.com/produ... [qualcomm.com]

      Of course, I don't produce 3D movies, so I don't need 8 cores running at 6 GHz or whatever.

      • ... running either Windows 11 or Chrome OS.

        He asked about an ARM laptop running 'Linux'. Never say never but that's not a market Qualcomm are interested in.

        Rockchip RK3588 will make for a nice upgrade for the Pinebook Pro sometime in 2024.

        • Yeah they can run Windows and can run the Android UI for Linux, as you mentioned. There's also a Debian build for Snapdragon (aarch64). The Linux kernel has actually supported aarch64 for over 10 years.

          Prebuilt-images that handle the drivers and all are available for at least the Lenovo Yoga C630, Lenovo Miix 630, HP Envy x2, and ASUS NovaGo TP370QL.

          That is a bit more involved than buying a Pinebook Pro that comes with non-Android Linux pre-installed.

      • by dargaud ( 518470 )
        Thanks. Interesting. I'll have to write this down for when our aging laptops inevitably conk out.
  • Too soon to tell (Score:5, Insightful)

    by dmay34 ( 6770232 ) on Wednesday December 29, 2021 @05:44PM (#62126347)

    In the high end processor market, it's one thing to get ahead and it's another thing to stay ahead. The G4 was a year ahead of the competition when it came out, then the competition caught up and surpassed it. Same with the G5. Then Apple switched to Intel because IBM couldn't match what Intel and AMD would be offering in the late 2000's. Similarly, AMD knocked the socks off of Intel in the 2000s. Then Intel innovated and took over. Then ARM got all the spotlight with mobile chips. Now it's Apple's turn to be the leader.

    Being the leader in the high end processor world is a fleeting thing.

    • by dfghjk ( 711126 )

      and this is the problem. No one has doubted that Apple could make an interesting processor for Macs; the question is whether they can sustain it over the next decade plus. It has NEVER been done before; it always goes the opposite way.

      Good luck Apple. We all benefit from more competition and more compelling offerings. Whether Apple continues to succeed and makes great Macs we will have to see, but history is not kind to a move like this. It seems more likely that Macs fade to nothing or get integrated

      • by RazorSharp ( 1418697 ) on Wednesday December 29, 2021 @10:06PM (#62127067)

        I don't think Apple cares if Intel or AMD can beat them on the spec sheet in the future. Even when they were using Intel, because of the way their timeline worked, they often didn't have the latest and greatest offerings from Intel in their machines. People bought them anyway because they like Apple's stuff. I think they just wanted to have more control over their release schedule and product design.

        The fact that the M1 blows the competition out of the water when it comes to performance per watt is completely by design. That's what Apple prioritized and the only other chip manufacturers who placed such an emphasis on this metric (Samsung, Qualcomm) were making chips for mobile devices. For other manufacturers, laptop chips couldn't have such an exaggerated focus on this metric because they needed to be x86 for Windows or they needed to be cheap for Chromebooks.

    • by Shadow of Eternity ( 795165 ) on Wednesday December 29, 2021 @08:27PM (#62126771)

      Except they aren't the leader. If you look at ACTUAL tests someplace like OpenBenchmarking, Intel is >2x as performant as Apple is.

      • Re: (Score:2, Flamebait)

        by AmiMoJo ( 196126 )

        Performance is mid-range, but energy consumption is quite good. It's hard to compare apples-to-apples (pun intended) because MacOS is tuned for Apple's hardware much better than Windows and Linux are, but I expect that advantage will disappear in the next year or two as AMD catches up.

        Well, arguably it's not much of an advantage even now. AMD laptops are already >10 hours battery life, and while 20 hours from a Macbook is impressive it's also mostly pointless for most people.

  • by dfghjk ( 711126 ) on Wednesday December 29, 2021 @06:03PM (#62126411)

    "The pivot allowed Apple to completely rethink the Mac, which had started to grow stale with an aging design and iterative annual upgrades. Following the divorce from Intel, Apple has launched far more exciting computers which, paired with an ongoing pandemic that has forced people to work and learn from home, have sent Apple's Mac business soaring."

    Apple not only didn't "rethink the Mac", they did nothing OTHER than change the processor. Thing is, Mac users as a group do not know or care about the processors and most buyers won't be able to tell the difference. If the previous Macs were "aging and stale", the new ones are as well, only with a processor that runs even less software than before.

    Apple's Mac business has soared before with Intel, and PowerPC before it, and 68K before that. So far, Apple Silicon processors have not made previously undesirable computers compelling. They were not undesirable before, they are not irresistible now. The article is total crap.

    • by hdyoung ( 5182939 ) on Wednesday December 29, 2021 @06:46PM (#62126561)
      You're sorta wrong in terms of design. As far as year-to-year design updates go, the M1 iMac is actually quite different than the Intel designs. Lots thinner. Redesigned screen. Thermal management changed because the Intel chips run a LOT hotter than the M1. A camera that follows you as you move around. I mean, it's still an iMac. Any all-in-one is basically a large rectangle with a screen. You can only change so much. But it was way more than the usual “slightly faster processor in the same case”. That being said, if you've got a recent Intel iMac on your desk, not a ton of reasons to swap it out.
      • Re: (Score:2, Informative)

        by AmiMoJo ( 196126 )

        Basically they made it even less serviceable and removed almost all upgrade options. Welcome to the age of disposable computing.

    • they did nothing OTHER than change the processor.
      Perhaps you should read a bit about what the other changes are; you are making an idiot out of yourself.

  • by Anonymous Coward

    None of the features touted on the 2020/2021 iMacs were being held back by Intel CPUs.

    For example the extremely low resolution of Apple's web cams (which have always been USB connected internally, thus the CPU has no bearing on their functional limitations) have always been a feature due to their nickel-and-diming of manufacturing processes: the 1 megapixel "high definition FaceTime camera" was introduced in 2011 at a time when commodity web cams had 2 megapixel sensors with 8 megapixel interpolated capture

  • It has nothing to do with Apple actually releasing a functioning product after several years of garbage "Pro" devices without connectivity and shitty little touch toys. It could be powered by a potato and it would have sold.

  • by countach ( 534280 ) on Wednesday December 29, 2021 @09:59PM (#62127039)

    If Macs had become stale, it was nothing to do with Intel. More to do with terrible Apple designs, butterfly keyboards that broke all the time, an expensive touchbar that nobody cared about, lack of touch screens, lack of variety and so forth.

  • ...has allowed them to add back functional I/O ports; add back physical function keys instead of a dinky touch bar; add a better camera and a better screen. I'm sure that absolutely none of that was possible with a *shock horror* Intel CPU inside!
  • I just bought a brand-new, highest-end Intel MacBook Pro, a little under two years old (new/old stock) with a $500 discount.

    If you plan on playing any games or doing work with any serious compute demand, the M1 MacBook Pro, even one with the "M1 Max," has a really, really long way to go for people who live and work in the real world.

    Not hating on Apple, but the emulation layer to run Intel x86 apps on the M1 is a hard-stop deal breaker for us.
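
    For anyone weighing the same trade-off: the translation layer in question is Rosetta 2, and it is easy to poke at from a terminal. A quick sketch (commands are stock macOS):

      # One-time install of the x86-64 translation layer
      softwareupdate --install-rosetta --agree-to-license

      # Start an x86-64 shell under translation on an Apple silicon Mac
      arch -x86_64 /bin/zsh

      # Prints 1 when the current process is translated, 0 when native
      sysctl -n sysctl.proc_translated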

  • by Anonymous Coward

    This was on Jobs' list of "things to do". One of the last things he gave Apple before he left this world. There are a bunch of other things on there too.

    Once that list runs out Apple is doomed because they have never been successful without Jobs and never will be.

  • by i58 ( 886024 )
    So, in a world of Intel vs ARM, etc., is choice such a bad thing? Wouldn't supporting printers, etc. on ARM be a net gain for all? Why do we have to bring a flame war into something that could benefit us all? I fail to see how forcing printer manufacturers, for example, to open their view of the computing world a little is a bad thing. Tell me where I'm wrong here?
    • I fail to see how forcing printer manufacturers for example to open their view of the computing world a little is a bad thing.

      Personally:
      I fail to see how forcing printer manufacturers heads down the toilet would be a bad thing. Failing to conform to standards ought to be a crime punishable by corporate death.

      Waterboarding is too good for people who use chipping to stop after-market ink/toner being used on their hardware.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Thursday December 30, 2021 @09:04AM (#62127897)

    A decommissioning of x86 in the personal computer space is overdue. Everyone was hesitant to make the first move, also because WinTel have a nice charade going, forcing users to update their hardware every few years, because "The new Windows needs it"(TM). It's an awesome money-printing scheme that's been going on for 2+ decades and has been serving both MS and the hardware vendors quite well.

    Apple OTOH has quite a few things going in their favour:

    1.) They couldn't care less about the WinTel industry. Their whole shtick is keeping a wide berth around WinTel. In that regard, doing the switch to Intel CPUs was quite a branding stunt that could've effed things up for them. But they pulled it off nicely, mostly because they made the transition nigh hassle-free for all users and software makers.

    2.) They are the 800 pound gorilla in the Smartphone and Tablet space and had more than enough time to explore the merging of their mobile Unix (iOS) and their desktop Unix (macOS). Which are basically the same OS with different UIs and styles of user/kernel space separation. Convergence, whenever they decide to do it, is going to be a piece of cake for Apple.

    3.) Apple has obscene amounts of cash on their hands and controls just about the entire delivery chain of their mobile products, and huge parts of their desktop line (which isn't even the main source of cash for Apple anymore).

    4.) On top of that Apple is the only tech brand that is also a premium fashion brand - an upside others would kill for. Being a fashion brand moves attention away from MHz and technical specs to "Oh nice, pink and shiny! Want it!". This was the whole point of the iMac and its breakthrough emphasis on case design. Being a fashion/lifestyle brand relates wonderfully with "obscene amounts of cash" and "controlling the entire chain". With the all-out move to own silicon, they now control just about 100%.

    The signs were/are clear: There were/are more than a few incentives for Apple to get this custom silicon thing up to speed. That they took the time they needed to get everything into place shows that, at least for prepping the Apple silicon switch, they once again knew what they were doing and once again showed the world how things are done properly. Today's Apple and their product lineup is starting to smell of marketing people taking over and pushing the product people out of final decisions, a thing Steve Jobs warned about. As a result, a slow decay may be due for Apple.

    But the Apple silicon thing was being prepared long ago so I personally expect it to play out for them, Apple style.

    I hope we soon see others doing the same move. I personally would love to see premium FOSS hardware like quality-built open RISC-V laptops come about soon. I don't need x86 to do my software development, and I also expect low-power computing to be a big thing soon if the world finally gets serious about making that overdue eco-turnaround.

  • Funny, this wasn't identified as an ad. Even Realtors don't gush this much.
