Apple Ditched Intel, and It Paid Off (cnbc.com) 101
An anonymous reader quotes a report from CNBC, written by Todd Haselton: Apple's decision to ditch Intel paid off this year. The pivot allowed Apple to completely rethink the Mac, which had started to grow stale with an aging design and iterative annual upgrades. Following the divorce from Intel, Apple has launched far more exciting computers which, paired with an ongoing pandemic that has forced people to work and learn from home, have sent Apple's Mac business soaring. It wasn't always a given. When Apple announced its move away from Intel in 2020, it was fair to question just how well Apple could power laptops and desktop computers. Apple has used in-house chips for iPhones and iPads but had been selling Intel-powered computers for 15 years. It wasn't clear how well its macOS desktop software would work with apps designed to run on Intel chips, or whether its processors would offer any consumer benefits and keep up with intensive tasks that people turned to MacBooks to run. Those fears were quickly quelled.
The first M1 Apple chip was launched in 2020 in a MacBook Air laptop. It was more powerful than Intel's chip while offering longer battery life and enabling a fanless design, which helped keep Apple's new MacBook Air even quieter. It proved to be an early success. In April 2021, CEO Tim Cook said during the company's fiscal second-quarter earnings call that the M1 chip helped fuel the 70.1% growth in Apple's Mac revenue, which hit $9.1 billion during that quarter. The growth continued in fiscal Q3, when Mac revenue was up 16% year over year. That quarter, it launched the all-new iMac, which offered a redesigned super-thin metal body that looks like a screen propped up on a stand. It's slimmer than the Intel models that came before it, while offering other benefits, like a much better webcam, great speakers and a much sharper display than the models it replaced. And Apple made the launch more exciting by offering an array of colors for the iMac, which it hadn't done since it shipped the 1999 iMac. There was a slowdown in fiscal Q4, when Mac revenue grew just 1.6%, as Apple, like all manufacturers, saw a slowdown from the burst of sales driven by the start of the pandemic and dealt with supply chain woes. But fiscal Q4 sales didn't include revenue from its most exciting new computer of the year.
Apple's fiscal Q1 earnings in January will give an indication of how well all its new computers are selling. But it's clear the move from Intel has allowed Apple to move full speed ahead with its own chip development, much like it does for iPhones and iPads, the latter of which has yet to be matched by any other tablet on the market. It's no longer beholden to delays that plagued Intel, which started to lag behind AMD with its new 7nm chips. And Apple has full control over its "stack," which means it can design new computer hardware and software together, instead of letting the power of another company's chips dictate what its computers can and can't do.
time will tell. (Score:3, Insightful)
Re:time will tell. (Score:5, Informative)
You've got it backwards. Apple M1 destroyed [anandtech.com] both [anandtech.com] Intel and AMD uArchs by offering comparable or higher performance at a significantly lower power package. From the linked review:
Re:time will tell. (Score:5, Informative)
You're right that in work units per watt, the M1 fucking annihilated its x86 rivals.
However, that's the only metric where it does.
The "comparable or higher performance" claim is incorrect the way you word it.
In the benchmarks you link, they're demonstrating its (unimpeachable) superiority over CPUs with similar power envelopes.
The high end Intel and AMD parts smoke even today's M1 Max. Of course, using quite a lot of power, but still, a win is a win when we're discussing "who has the best performance"
My 2-generation-old 10900HK has multicore performance comparable to my M1 Max, and significantly superior single-core performance.
My M1 (Air, but I understand the MBP M1 wasn't significantly better) is more or less laid to waste.
I'm not shitting on Apple here. I fucking love my M1* devices. First laptops I've bought in a decade that I was really excited about. But when you get rid of the per-watt from any of their metrics, they're middling laptop parts at best.
Re:time will tell. (Score:4, Interesting)
In the benchmarks you link, they're demonstrating its (unimpeachable) superiority over CPUs with similar power envelopes.
The high end Intel and AMD parts smoke even today's M1 Max. Of course, using quite a lot of power, but still, a win is a win when we're discussing "who has the best performance"
I've gotta admit, this is where I start to gripe about Apple.
Performance per watt is something very important for a laptop, of which Apple sells a lot. I can appreciate that my MacBook used significantly less electricity to do its tasks but, ultimately, I want it to finish before 5:00 PM so I can go home. My MacBook is plugged into a wall socket, so I really don't care how much power it's using.
I'll be curious to see what Apple does with the Mac Pro. That should be Apple's "Performance, period" machine--I don't care how much electricity it uses, I don't care that much about whether its fan comes on or not, I just want it to finish what I tell it to do as fast as possible so I can go home.
Re: (Score:3)
Anyway, I agree. I'd love to see some focus on trying to get the clock speeds up on these.
Sure, they lose some efficiency. Sure, it'll take some work to get the architecture working stably at 5Ghz. But I think it'd be
Re: (Score:3)
My 2-generation-old 10900HK has multicore performance comparable to my M1 Max, and significantly superior single-core performance.
While I agree in general with your post, the 10900K has similar multicore performance but far worse single core performance than the M1 Max:
https://www.cpubenchmark.net/c... [cpubenchmark.net]
Re: (Score:2)
First, I meant 10980HK (that's the laptop part, the 10900K is a desktop part) (Can the desktop part be... slower? Looks like it's got 10 cores instead of 8)
That's my bad.
Second, I'm seeing mixed results online... which is odd.
OK, so it turns out the version of Cinebench I was using is still using Rosetta- so it was translated. Those results are trash.
Native, (R23), the Max wins in both single and multicore.
You sir, are correct. The M1 Max is faster. Though, I swear on my life, it sur
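The Rosetta-vs-native mixup above is easy to hit, because translation is invisible to the user. For what it's worth, macOS exposes a `sysctl.proc_translated` flag that a process can query to learn whether it is running under Rosetta 2. A minimal sketch in Python (it returns `None` on non-macOS hosts or Intel Macs that don't expose the sysctl):

```python
import ctypes
import ctypes.util
import platform


def rosetta_translated():
    """Return True if this process runs under Rosetta 2 translation,
    False if it runs natively, or None when the query isn't available
    (non-macOS hosts, or Macs that don't expose the sysctl)."""
    if platform.system() != "Darwin":
        return None
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    val = ctypes.c_int(0)
    size = ctypes.c_size_t(ctypes.sizeof(val))
    # int sysctlbyname(const char *name, void *oldp, size_t *oldlenp,
    #                  void *newp, size_t newlen);
    rc = libc.sysctlbyname(b"sysctl.proc_translated",
                           ctypes.byref(val), ctypes.byref(size),
                           None, ctypes.c_size_t(0))
    return bool(val.value) if rc == 0 else None
```

Running this before a benchmark would have caught the translated Cinebench results immediately.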
Re: (Score:2)
The high end Intel and AMD parts smoke even today's M1 Max.
According to Apple, it is the other way around.
But when you get rid of the per-watt from any of their metrics, they're middling laptop parts at best.
Not according to Apple. And at least in Europe lying in a benchmark/advertisement is forbidden.
I did not bother to check it though ... it is not yet time for me to buy an Mx Apple :D
Re: (Score:2)
And how much are you willing to spring for a computer that sounds like a jet engine or requires liquid cooling?
Re: (Score:2)
Re: (Score:2)
You apple fanboys are just as delusional as those old Amiga fanboys were.
Re: (Score:3)
Past tense. $13 BILLION / year (Score:2, Insightful)
The M1 *was* more efficient. Not faster, but more efficient. But you replied to "time will tell". AMD and Intel will come out with new CPUs next month and next year.
Intel's R&D budget is $13 billion each year.
Is Apple going to spend $13 billion / year and hope to keep up?
At first, Apple's decision didn't immediately blow up in their face. Over the next 10 years we'll find out if it turns out to be a good strategic decision.
Re: (Score:2)
The M1 *was* more efficient. Not faster, but more efficient. But you replied to "time will tell". AMD and Intel will come out with new CPUs next month and next year.
Intel's R&D budget is $13 billion each year.
Is Apple going to spend $13 billion / year and hope to keep up?
At first, Apple's decision didn't immediately blow up in their face. Over the next 10 years we'll find out if it turns out to be a good strategic decision.
Apple spent almost $19 billion in fiscal 2020 [nasdaq.com]. That said, comparing R&D budgets doesn't tell the whole story.
Apple spends a ton (by far the most, probably) of its R&D budget on other things than CPUs - both software, the devices they sell, devices they want to sell in the future - and some of the components in them. Intel also spends a lot of its budget on other things than CPU design... e.g., they spend a lot of money on the processes to create the CPUs themselves. This is the area where Intel use
Check out binning (Score:2)
> Also, the starting point and approach (few variants vs. a gazillion variants to maximize market segmentation) impact how much money you need to spend.
You might find it interesting to Google semiconductor binning.
Intel doesn't design nearly as many SKUs as it sells. Got a core that doesn't work on this 8-core CPU? You sell it as a 6-core CPU. It's unstable at the target speed? Sell it as a slower speed.
Being able to sell all the rejects under different SKUs is advantage Intel.
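The binning idea described above can be sketched as a simple triage function. The SKU names and thresholds here are invented for illustration, not real Intel bins:

```python
# Toy sketch of semiconductor binning: every tested die is assigned to
# the best SKU it qualifies for, so partial defects still sell.
def bin_die(working_cores: int, stable_ghz: float) -> str:
    """Map a die's test results to a (hypothetical) SKU."""
    if working_cores >= 8 and stable_ghz >= 3.6:
        return "8-core, full speed"
    if working_cores >= 8:
        return "8-core, reduced speed"
    if working_cores >= 6:
        return "6-core (defective cores fused off)"
    return "scrap"


# A die with one dead core still ships, just under a cheaper SKU:
print(bin_die(7, 3.8))  # → 6-core (defective cores fused off)
```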
Re: (Score:2)
> Also, the starting point and approach (few variants vs. a gazillion variants to maximize market segmentation) impact how much money you need to spend.
You might find it interesting to Google semiconductor binning.
Intel doesn't design nearly as many SKUs as it sells. Got a core that doesn't work on this 8-core CPU? You sell it as a 6-core CPU. It's unstable at the target speed? Sell it as a slower speed.
Being able to sell all the rejects under different SKUs is advantage Intel.
Apple already does this - there's quite a variation in the core counts on the CPU and GPU side, and I'm sure the iPad Pro also allows for yet another dimension for binning. Intel also does market segmentation on instructions and capabilities - e.g. a MacBook Air will trash any Intel laptop or desktop for many ML tasks, as every chip has an ML accelerator. Intel has this in "select Xeon cpus" only, in order to extract the most possible revenue. I'm just suggesting that doing it this way might lead to bad res
Re: (Score:3, Interesting)
I think we need to wait a few years to see whether it paid off, short term won't be the deciding factor, it will be how well they innovate and keep up with AMD and INTEL. If they sit on their hands it will look like a really bad decision in a couple of years time.
Yeah, this is a really short term payoff. Our workplace is being hobbled by the lack of printer drivers. People are having to send print jobs to Windows users to get work done, and some users are now asking for Windows workstations. Funny how some bits of Mac software aren't essential when you can't do something basic like print.
Re: (Score:3, Insightful)
How is it Apple’s fault that your printer vendor is not supplying drivers?
Re: (Score:1)
How is it the printer vendor's fault that Apple broke their hardware and software platforms so their drivers no longer work?
Re: (Score:3)
How is it the printer vendor's fault that Apple broke their hardware and software platforms so their drivers no longer work?
It is 100% the printer vendor's fault. They knew this was coming and had the specs from Apple long before the release. In most cases, they just needed to recompile for the new arch, with no changes to the API.
If your printer company is this incompetent, perhaps you should buy your printers from someone else.
My printer is a Canon. I bought it at Walmart for $39. It works fine with the M1 MacBook.
Re: (Score:2)
Re: (Score:2)
How does it matter who's fault this is?
It matters because it is a lot easier to switch printers than to switch laptops.
If I google for people having printer problems with the M1, all of them are from a year ago when M1 Macs were first introduced. So it looks like this isn't a real problem anymore.
Re: (Score:1)
It is 100% the printer vendor's fault.
It's pretty much Apple's fault. They switched hardware again, and another vendor decided not to throw away good money on a system that is less than 2% of the market.
Re: (Score:2)
Re: time will tell. (Score:2)
I can't believe this is a real post for two reasons: who is actually printing a lot, and what printers are not supported on M1?
Not that I care either way but the anonymous post looks like it's got an agenda.
Re: (Score:3)
I think we need to wait a few years to see whether it paid off, short term won't be the deciding factor, it will be how well they innovate and keep up with AMD and INTEL. If they sit on their hands it will look like a really bad decision in a couple of years time.
Yeah, this is a really short term payoff. Our workplace is being hobbled by the lack of printer drivers. People are having to send print jobs to Windows users to get work done, and some users are now asking for Windows workstations. Funny how some bits of Mac software aren't essential when you can't do something basic like print.
Why did anyone buy printers that aren't just standard Postscript or "IPP everywhere" [pwg.org] compatible? I thought most printers supported this, as some people, for reasons I just don't understand, want to be able to print from their mobile devices.
Maybe in the short term, (Score:1)
On the other hand, Apple probably have enough cash to rewrite everything from scratch if they find themselves in a cul-de-sac.
How long a term? (Score:3, Informative)
I have a feeling that in the long term, tightly coupling software to hardware could turn out to be disastrous.
It's worked out spectacularly well for Apple since 2007 with the release of the first iPhone.
In fact it worked so well that was the motivation for moving the Mac line that way as well.
Just how long of a timeframe are you thinking of?
I would counter and ask, is it time we re-think if loosely coupling software and hardware makes sense any more in a world where it's so easy to design custom hardware.
Re: (Score:2)
Just how long of a timeframe are you thinking of?
Let's make a bet on this: In 5 years everyone (Qualcomm, Intel, AMD, and many others) will have caught up to and surpassed Apple, so much so that Apple will again be marketing their products in fluffy and non-quantifiable terms, and knock-off Chinese companies will have caught up to where Apple is today.
Re: (Score:2)
What makes you think Apple will stop developing new chips? I mean for a while the $350 iPhone SE was performing better than $1000 flagship Android phones.
Hmm, last part sounds about right... (Score:1)
In 5 years everyone (Qualcomm, Intel, AMD, and many others) will have caught up to and surpassed Apple
These companies are not going to stop improving, but neither is Apple. It's the "surpassing" part I'm having trouble seeing overall, if in five years they are all still stuck on Intel architecture. I think there will continue to be a lot of leapfrogging honestly. But Apple will not be much if any behind at any stage of that.
Although I am less sure about this part, I think within five years that Apple will
Re: (Score:2)
If Intel and AMD etc. do not switch to RISC V: no way.
Or to ARM, and then they would just be "just the same".
Not even a little. (Score:4, Informative)
The success of the iPhone is almost entirely due to marketing
I saw for my own eyes how not true that was.
When I got the first iPhone I had a boss who was REALLY into Windows Mobile. He was saying that the Apple devices would never come close to what Windows Mobile was offering and that there was no way the iPhone would catch on.
After five minutes of playing with my iPhone, he had one himself within a week.
There was no amount of marketing that was going to sway this guy, But when he actually used it - bam.
That's what Apple has had for a long time, when you use them you see why people like them. The marketing is almost irrelevant.
e.g. when they bumped screen resolution beyond what the OS was designed for. All those lovingly hand-curated pixels needed to be moved.
As an iOS developer I can say that your entire statement is complete bollocks. When they introduced the retina screens, providing @2x resolution assets bothered zero people because designers are always working at much higher than output resolution (if they aren't working with vectors). There may have been a little pixel tweaking but it was pretty minor. That was never an issue and I was working on multiple apps that whole time as a consultant working directly with designers. The 1x/2x/3x system allowed for exact pixel designs with any of the screens, if that was desired.
Re: (Score:3)
I saw for my own eyes how not true that was. When I got the first iPhone I had a boss who was REALLY into Windows Mobile. He was saying that the Apple devices would never come close to what Windows Mobile was offering and that there was no way the iPhone would catch on.
It's telling you had to go back 15 years for this example. Yes, the iPhone's capacitive touch screens were a quantum leap ahead of the Windows Mobile resistive screens at the time. But Samsung et al. caught up and surpassed the iPhone 2-3 years after that. I don't know many who've switched from say Samsung to iPhone in the last 10 years.
The 1x/2x/3x system allowed for exact pixel designs with any of the screens, if that was desired.
The 1x/2x/3x system is the very definition of inflexibility that I'm talking about. Not being able to have an arbitrarily sized screen is a huge disadvantage by any metric.
Marketing is not magic, cannot last (Score:1)
It's telling you had to go back 15 years for this example.
I only went back that far to show the iPhone popularly was never about marketing.
You think it's about marketing, fine, just what marketing? What specific marketing message do you think is making people buy iPhones today.
Answer is of course, there is none. The Marketing is fluff and it doesn't really get people to buy things. All it can do is get them to think about a thing for a bit. You seem to think marketing is some kind of magic, but if that
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
He wrote "disastrous", not "disastrous for Apple". Those are not the same thing.
Ok, that's a great point, I've not considered it from that angle.
However I would say we've had enough experience where hardware and software are tightly coupled to know it can work well and there's no indication the end result is disastrous. I would argue it makes much better use of hardware and tends towards software that is more usable by the end user, because it doesn't have to accommodate every possible device present, past
Re: (Score:2)
Rewriting things from scratch takes huge amounts of time, and then you will run into an interesting reality in the software industry: adding more money does not significantly reduce the time to completion.
but the apple chips are not pro ready & ram pr (Score:2)
but the apple chips are not pro ready & pros are unlikely to pay apples ram pricing at 256GB+
How not "Pro Ready"? (Score:2)
but the apple chips are not pro ready
The existing Pro M1 laptops are able to process seven streams of 8K ProRes. [tweaktown.com]
That's better than a lot of desktops. How are the M1 chips not "Pro Ready"?
unlikely to pay apples ram pricing at 256GB+
It only costs $400 to go from 32GB to 64GB In a top of the line M1 MacBook Pro.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
400 bucks for 32 gig of ram seems high these days
Higher than the cheapest RAM yes, but remember the guy I was responding to was claiming Apple charges more for RAM than a "pro" is willing to pay. Pretty sure a "pro" is OK paying $400 for 32GB of extremely fast RAM on a system that is $3k or more.
Also what people are not thinking of here is that buying more RAM on the M1 systems is buying RAM not just for the CPU, but the GPU also... so you get double the benefit from an expansion.
Re: (Score:1)
Re: How not "Pro Ready"? (Score:2)
For your home use it does feel like a kick in the pants, but if you are buying it for professional use then the sticker price by itself isn't what's important. What's important is whether it will pay for itself by improved productivity. For a large number of people it will.
If you can't justify the price, then there are other options and the MacBook Air has been shown to do well for a lot of people.
Given so many of us do different professions, what is "pro" in
Re:How not "Pro Ready"? (Score:5, Informative)
The existing Pro M1 laptops are able to process seven streams of 8K ProRes. [tweaktown.com]
That's a bit of a 1-trick pony, right there. MediaEngine supports ProRes, while equivalent high-performance encoders like NVENC do not.
That makes a lot of sense, since ProRes is an Apple format.
My desktop GPU can do 8K 10-bit HEVC encoding at 30fps.
My M1 Max can do it at around 0.7fps.
It's amazing what we can do when we can leverage built in media engines.
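Using the throughput figures quoted in the post above, the hardware-vs-software gap works out to roughly:

```python
# Encode throughput quoted in the post above (8K 10-bit HEVC).
# These are the poster's anecdotal numbers, not a controlled benchmark.
gpu_hw_fps = 30.0   # desktop GPU with a dedicated HEVC encode block
m1_sw_fps = 0.7     # M1 Max, lacking a hardware path for this codec
speedup = gpu_hw_fps / m1_sw_fps
print(f"hardware encode is ~{speedup:.0f}x faster")  # → ~43x faster
```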
Not as good for developing server software (Score:2)
From what I understand, YMMV is the term used to describe how cross-compiled / universal docker image development on the new Apple silicon macs will work.
So far, that has scared me away from getting one. Plus 1 for eliminating the touchbar though.
Re: (Score:3)
You may be able to see significant cost savings from moving that software to ARM hosts. You also may not but it's definitely worth springing for a Mini or similar to get some testing on ARM done. Never know it may be well worth it.
Re: (Score:2)
Why buy the hardware just to try it out? AWS will let you do that much, without the hardware investment.
https://aws.amazon.com/about-a... [amazon.com]
If it's worthwhile, you can invest in a machine afterward.
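Whichever route you take, the main wrinkle with cross-arch Docker work is matching image platforms to the host. A small illustrative sketch (the `linux/amd64` and `linux/arm64` platform strings are the ones Docker's real `--platform` flag accepts; the helper itself is just a hypothetical convenience):

```python
import platform

# Map a machine string (as reported by `uname -m` / platform.machine())
# to the Docker platform string you'd pass to e.g.
# `docker buildx build --platform ...`. Covers the common spellings;
# anything else is passed through as a best guess.
ARCH_TO_PLATFORM = {
    "x86_64": "linux/amd64",
    "amd64": "linux/amd64",
    "arm64": "linux/arm64",
    "aarch64": "linux/arm64",
}


def docker_platform(machine: str = platform.machine()) -> str:
    return ARCH_TO_PLATFORM.get(machine, f"linux/{machine}")
```

On an M1 Mac, `platform.machine()` reports `arm64`, so native images would be `linux/arm64` and anything `linux/amd64` runs emulated.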
And advertising... Go! (Score:5, Insightful)
I'm not saying the technical points of this article aren't accurate, but boy was this a fan article. I mean, yeah, 7.4% of the market in Q2 of 2021 (per IDC) is a big bump up from, well, 7.6% of the market in Q2 of 2020.
No, wait...
[IDC link: https://www.idc.com/getdoc.jsp?containerId=prUS48069721 ]
Re: (Score:2)
Stale (Score:1)
which had started to grow stale with an aging design and iterative annual upgrades.
Switching to M1 doesn't prevent the mac from being stale.
Re: (Score:2)
Moreover, wtf does "stale" even mean in this context? Does it mean "we haven't completely redesigned just for the sake of redesigning it in a super long time?"
An Apple FanBoi Said.... (Score:1, Insightful)
Blah blah blah blah blah blah blah...
Re: (Score:2, Insightful)
And an anti-Apple FanBoi said...
"An Apple Fanboi Said... Blah blah blah blah blah blah blah..."
Neither contributed anything meaningful.
Re:An Apple FanBoi Said.... (Score:4, Insightful)
Going to be a toss up on who is hated here more, Apple or Intel.
Re: (Score:2)
Intel has a ways to go. Slashdot doesn't just hate Apple, it *meta-hates* Apple.
Re: (Score:2)
It's too bad too bc macOS is (was?) a great Unix-like OS and I say that as an "old" linux head. I thoroughly enjoyed my MBP '15 before I accidentally killed it. It's Linux otherwise, haven't used Windows for over 10 years as my primary.
Re: (Score:2)
Being able to run Office on a Unix is a big deal when your collaborators insist on using Word and EndNote.
Re: (Score:1)
Neither contributed anything meaningful.
Leave it to a karma whore to state the obvious, which also contributes nothing of value.
Re: (Score:2)
The worst thing is: he forgot one "blah"!
ARM laptops with Linux? (Score:2)
Maybe one of these 8CX gen 2 laptops (Score:2)
One of these might suit you. They'd be faster than the laptop I've been using Linux on daily for the last few years.
https://www.qualcomm.com/produ... [qualcomm.com]
Of course, I don't produce 3D movies, so I don't need 8 cores running at 6 Ghz or whatever.
Re: (Score:2)
... running either Windows 11 or Chrome OS.
He asked an ARM laptop running 'Linux'. Never say never but that's not a market Qualcomm are interested in.
Rockchip RK3588 will make for a nice upgrade for the Pinebook Pro sometime in 2024.
Debian do it for ya? (Score:2)
Yeah they can run Windows and can run the Android UI for Linux, as you mentioned. There's also a Debian build for Snapdragon (aarch64). The Linux kernel has actually supported aarch64 for over 10 years.
Prebuilt-images that handle the drivers and all are available for at least the Lenovo Yoga C630, Lenovo Miix 630, HP Envy x2, and ASUS NovaGo TP370QL.
That is a bit more involved than buying a Pinebook Pro that comes with non-Android Linux pre-installed.
Re: (Score:2)
Too soon to tell (Score:5, Insightful)
In the high end processor market, it's one thing to get ahead and it's another thing to stay ahead. The G4 was a year ahead of the competition when it came out, then the competition caught up and surpassed it. Same with the G5. Then Apple switched to Intel because IBM couldn't match what Intel and AMD would be offering in the late 2000s. Similarly AMD knocked the socks off of Intel in the 2000s. Then Intel innovated and took over. Then ARM got all the spotlight with mobile chips. Now it's Apple's turn to be the leader.
Being the leader in the high end processor world is a fleeting thing.
Re: (Score:2)
and this is the problem. No one has doubted that Apple could make an interesting processor for Macs, the question is whether they can sustain it over the next decade plus. It has NEVER been done before, it always goes the opposite way.
Good luck Apple. We all benefit from more competition and more compelling offerings. Whether Apple continues to succeed and makes great Macs we will have to see, but history is not kind to a move like this. It seems more likely that Macs fade to nothing or get integrated
Re:Too soon to tell (Score:4, Insightful)
I don't think Apple cares if Intel or AMD can beat them on the spec sheet in the future. Even when they were using Intel, because of the way their timeline worked, they often didn't have the latest and greatest offerings from Intel in their machines. People bought them anyway because they like Apple's stuff. I think they just wanted to have more control over their release schedule and product design.
The fact that the M1 blows the competition out of the water when it comes to performance per watt is completely by design. That's what Apple prioritized and the only other chip manufacturers who placed such an emphasis on this metric (Samsung, Qualcomm) were making chips for mobile devices. For other manufacturers, laptop chips couldn't have such an exaggerated focus on this metric because they needed to be x86 for Windows or they needed to be cheap for Chromebooks.
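That prioritization is easy to state as a metric. A trivial sketch with hypothetical numbers (purely illustrative, not measured benchmark data):

```python
# Performance per watt = benchmark score / package power.
# The figures below are invented to illustrate the trade-off only.
benchmarks = {
    "low-power laptop chip":  {"score": 1500, "watts": 15},
    "high-power desktop chip": {"score": 2400, "watts": 120},
}


def perf_per_watt(score: float, watts: float) -> float:
    return score / watts


# The desktop chip "wins" on raw score but loses 5:1 on efficiency.
for name, b in benchmarks.items():
    print(f"{name}: {perf_per_watt(**b):.1f} points/watt")
```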
Re:Too soon to tell (Score:4, Insightful)
Except they aren't the leader. If you look at ACTUAL tests somewhere like OpenBenchmarking, Intel is >2x as performant as Apple is.
Re: (Score:2, Flamebait)
Performance is mid-range, but energy consumption is quite good. It's hard to compare apples-to-apples (pun intended) because MacOS is tuned for Apple's hardware much better than Windows and Linux are, but I expect that advantage will disappear in the next year or two as AMD catches up.
Well, arguably it's not much of an advantage even now. AMD laptops are already >10 hours battery life, and while 20 hours from a Macbook is impressive it's also mostly pointless for most people.
have to say this is bullshit (Score:4, Insightful)
"The pivot allowed Apple to completely rethink the Mac, which had started to grow stale with an aging design and iterative annual upgrades. Following the divorce from Intel, Apple has launched far more exciting computers which, paired with an ongoing pandemic that has forced people to work and learn from home, have sent Apple's Mac business soaring."
Apple not only didn't "rethink the Mac", they did nothing OTHER than change the processor. Thing is, Mac users as a group do not know or care about the processors and most buyers won't be able to tell the difference. If the previous Macs were "aging and stale", the new ones are as well, only with a processor that runs even less software than before.
Apple's Mac business has soared before with Intel, and PowerPC before it, and 68K before that. So far, Apple Silicon processors have not made previously undesirable computers compelling. They were not undesirable before, they are not irresistible now. The article is total crap.
Re:have to say this is bullshit (Score:5, Insightful)
Re: (Score:2, Informative)
Basically they made it even less serviceable and removed almost all upgrade options. Welcome to the age of disposable computing.
Re: (Score:2)
they did nothing OTHER than change the processor.
Perhaps you should read a bit about what the other changes are; you are making an idiot out of yourself.
Nonsense (Score:1)
None of the features touted on the 2020/2021 iMacs were being held back by Intel CPUs.
For example the extremely low resolution of Apple's web cams (which have always been USB connected internally, thus the CPU has no bearing on their functional limitations) have always been a feature due to their nickel-and-diming of manufacturing processes: the 1 megapixel "high definition FaceTime camera" was introduced in 2011 at a time when commodity web cams had 2 megapixel sensors with 8 megapixel interpolated capture
Yep it's all the M1. (Score:1)
It has nothing to do with Apple actually releasing a functioning product after several years of garbage "Pro" devices without connectivity and shitty little touch toys. It could be powered by a potato and it would have sold.
Not Intel (Score:3)
If Macs had become stale, it was nothing to do with Intel. More to do with terrible Apple designs, butterfly keyboards that broke all the time, an expensive touchbar that nobody cared about, lack of touch screens, lack of variety and so forth.
Well I'm glad that putting in a new CPU arch... (Score:2)
Just bought an Intel MacBook Pro (Score:2)
I just bought a brand-new, highest-end Intel MacBook Pro, a little under two years old (new/old stock) with a $500 discount.
If you plan on playing any games or doing work with any serious compute demand, I realized that the M1 MacBook Pro, even one with the "M1 Max," has a really, really long way to go for people who live and work in the real world.
Not hating on Apple, but the emulation layer to run Intel x86 apps on the M1 is a hard-stop deal breaker for us.
Apple didn't "decide" to do this (Score:1)
This was on Jobs' list of "things to do". One of the last things he gave Apple before he left this world. There are a bunch of other things on there too.
Once that list runs out Apple is doomed because they have never been successful without Jobs and never will be.
Choice? (Score:1)
Re: (Score:2)
Personally:
I fail to see how forcing printer manufacturers heads down the toilet would be a bad thing. Failing to conform to standards ought to be a crime punishable by corporate death.
Waterboarding is too good for people who use chipping to stop after-market ink/toner being used on their hardware.
No sh*t, Sherlock. (Score:3)
A decommissioning of x86 in the personal computer space is overdue. Everyone was hesitant to make the first move, also because WinTel have a nice charade going, forcing users to update their hardware every few years, because "The new Windows needs it"(TM). It's an awesome money-printing scheme that's been going on for 2+ decades and has been serving both MS and the hardware vendors quite well.
Apple OTOH has quite a few things going in their favour:
1.) They couldn't care less about the WinTel industry. Their whole shtick is keeping a wide berth from WinTel. In that regard the switch to Intel CPUs was quite a branding stunt that could've effed things up for them. But they pulled it off nicely, mostly because they made the transition nigh hassle-free for all users and software makers.
2.) They are the 800 pound gorilla in the Smartphone and Tablet space and had more than enough time to explore the merging of their mobile Unix (iOS) and their desktop Unix (macOS). Which are basically the same OS with different UIs and styles of user/kernel space separation. Convergence, whenever they decide to do it, is going to be a piece of cake for Apple.
3.) Apple has obscene amounts of cash on their hands and controls just about the entire delivery chain of their mobile products, and huge parts of their deskop line (which isn't even the main source of cash to Apple anymore).
4.) On top of that Apple is the only tech brand that is also a premium fashion brand - an upside others would kill for. Being a fashion brand moves attention away from MHz and technical specs to "Oh nice, pink and shiny! Want it!". This was the whole point of the iMac and its breakthrough emphasis on case design. Being a fashion/lifestyle brand relates wonderfully with "obscene amounts of cash" and "controlling the entire chain". With the all-out move to own silicon, they now control just about 100%.
The signs were/are clear: There were/are more than a few incentives for Apple to get this custom silicon thing up to speed. That they took the time they needed to get everything into place shows that at least for prepping the Apple silicon switch they once again knew what they were doing and once again showed the world how things are done properly. Today's Apple and their product lineup is starting to smell of marketing people taking over and pushing the product people out of final decisions, a thing Steve Jobs warned about. As a result, a slow decay may be due for Apple.
But the Apple silicon thing was being prepared long ago so I personally expect it to play out for them, Apple style.
I hope we soon see others making the same move. I personally would love to see premium FOSS hardware like quality-built open RISC-V laptops come about soon. I don't need x86 to do my software development, and I also expect low-power computing to be a big thing any time soon now if the world should finally get serious about making that overdue eco-turnaround.
Gag (Score:2)