Apple Introduces M1 Chip To Power Its New Arm-Based Macs (theverge.com) 155
Apple has introduced the new M1 chip that will power its new generation of Arm-based Macs. It's a 5nm processor, just like the A14 Bionic powering its latest iPhones. From a report: Apple says the new processor will focus on combining power efficiency with performance. It has an eight-core CPU, which Apple says offers the world's best performance per watt of an CPU. Apple says it delivers the same peak performance as a typical laptop CPU at a quarter of the power draw. It says it has four of the world's fastest CPU cores, paired with four high-efficiency cores. It pairs this with up to an eight-core GPU, which Apple claims offers the world's fastest integrated graphics, and a 16-core Neural Engine. In addition, the M1 processor has a unified memory architecture, a USB 4 controller, media encode and decode engines, and a host of security features. These include hardware-verified secure boot, encryption, and run-time protections.
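For readers who want to poke at the claimed 4+4 core split themselves, here is a minimal C sketch that queries core topology through macOS's sysctl interface. The hw.perflevel* key names are an assumption (they appear only on recent macOS releases and only on Apple Silicon), so the sketch falls back to the long-standing hw.ncpu key.

    /* core_count.c -- sketch: query CPU core topology on macOS via sysctl.
     * Build: cc core_count.c -o core_count
     */
    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/sysctl.h>

    static int query_int(const char *name, int *out) {
        size_t len = sizeof(*out);
        return sysctlbyname(name, out, &len, NULL, 0); /* returns 0 on success */
    }

    int main(void) {
        int total = 0, perf = 0, eff = 0;

        if (query_int("hw.ncpu", &total) == 0)
            printf("logical CPUs: %d\n", total);

        /* perflevel0 = performance cores, perflevel1 = efficiency cores.
         * These key names are an assumption; they are absent on Intel Macs
         * and on older macOS releases. */
        if (query_int("hw.perflevel0.logicalcpu", &perf) == 0 &&
            query_int("hw.perflevel1.logicalcpu", &eff) == 0)
            printf("performance cores: %d, efficiency cores: %d\n", perf, eff);
        else
            printf("per-cluster counts not available on this system\n");

        return 0;
    }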
These are some hefty claims (Score:5, Insightful)
There were no hard data anywhere during the whole Apple event. What Windows PCs they were comparing to were never mentioned, nor any benchmark results.
I'm curious what reviews will show, but for now there are many reasons to be sceptical.
Re: (Score:2)
I can see the lack of hard data being one reason to be skeptical, but what might the other "many reasons to be sceptical" be?
Comment removed (Score:5, Informative)
Re: (Score:3)
To add to this, there are other factors Apple does not optimize for, or at least does not have to optimize for as aggressively: cost and yields.
All those other guys have to sell chips to integrators. At the end of the day that means, outside of some niche products, if they want to sell something that costs n% more, it pretty much must perform n% better in one of the two benchmarks anyone cares about right now, IPS and IP/watt.
Apple's chip designers have a captive client so to speak. So as long as the
Re: These are some hefty claims (Score:3)
Apple will no doubt still be chasing the bottom line. It's not a company known for being sloppy with its spending.
A bigger difference is that Apple gets to decide the future of the CPU. Buying from Intel, Apple is a relatively small customer vying with far larger customers to influence Intel's direction.
Re: (Score:2)
That is true. That is why I said "not way out of line."
Apple is in the consumer products market. They sell finished goods, which CPUs arguably are not. This means they are insulated from supply shocks to some degree now. They can get the silicon features they specifically want (like you point out with the lack of influence with Intel) to deliver the consumer features they are targeting. I don't doubt there are other advantages in the vertical integration for them as well.
If they pay a few more dollars per
Re: (Score:2)
Re: (Score:2)
You think this is Apple's "first try" at a CPU? Thanks for making it so easy to discard your opinions.
Re: (Score:2)
The biggest reason to be skeptical is that apparently Apple has had the fortune of designing both better CPUs and better GPUs than those who have been designing them for over half a century.... on their first try.
Ahem... They have been working on these chips for many years. The Apple A9 was released in 2015.
Re: (Score:2)
Apple is designing CPUs and GPUs since ages.
See: iPad, iPhone, etc.
Re: These are some hefty claims (Score:2)
Not their first try at all. Apple's been building these CPUs for years, and using them in devices with much lower computational requirements. The results show that they've been getting close to the claimed level of performance recently. Taking the existing known chips, doubling the core count on CPU (P cores) and GPU, and upping the frequency a bit certainly would get to the level of performance they're claiming.
Re: (Score:2)
Apple doesn't have a whole bunch of backwards compatibility they have to build into the architecture like there is with x86, and Apple's been designing their own CPUs for iPhone/iPad for about a decade now.
Re: (Score:2)
Well, that's clearly not true. Apple have been making their own CPUs and GPUs for a good decade now, just not for desktops/laptops, and the dev kits that have been sent out, which contain an *older* CPU, have benchmarked spectacularly well
Now I'm suspicious of their claims on GPUs, I'm ASSUMING they me
Re: (Score:2)
Apple has had the fortune of designing both better CPUs and better GPUs than those who have been designing them for over half a century.... on their first try.
Ignoring the last 10 years of Apple CPU and GPU development. They even mention building on this 10 years of experience. ;-)
Re: (Score:3)
Re: (Score:3)
Hefty claims of wonderful CPU goodness, but if it still can't play Fortnite, what good is it?
I guess none.
If you're 10 years old.
Re: (Score:2)
"it can't run a piece of software" is not an advantage.
Re: (Score:3)
Re: (Score:2)
Agreed; plus, while one can argue that Apple has used optimistic spin in the past, one can't argue that Apple will suddenly be far more "optimistic" now. Also, Apple's not claiming superior performance against every x86 CPU; they're talking about low-TDP mobile CPUs. That has been their focus for quite some time and there's no reason to believe they lack expertise.
Re: (Score:2)
Re: (Score:2)
"There was plenty of hard data. "Faster than 98% of all laptops sold"
Whoop-de-fucking-doo; that's marketing garbage right there. They COST more than 98% of all laptops sold too. The MacBook Air, their cheapest laptop, STARTS at $1000.
The Dell XPS stuff starts at $1000 too although you can get them for a little less on sale.
But dell sells a lot of inspirons, vostros, and latitudes for less.
So big fucking deal... laptops more expensive than what most people buy is faster than what most people buy.
Next up: $200,000 sports cars handle better than 98% of cars on the road.
"up to 3.5 t
Re: (Score:3)
Until there is independent confirmation this is just marketing bullshit. ...
That would be illegal in most countries
Re: (Score:2)
Because we all know companies, such as Apple, would never break the law. [wikipedia.org] /s
Of course Apple will CYA and won't say anything that they can be sued for. However, how do you verify Apple isn't lying???
That's what hard data is -- the ability to VERIFY that you AREN'T lying.
Re: (Score:2)
They were certainly playing with statistics to make themselves look better than they are. How many HP Streams get sold in a year, for example? Probably a lot more of those than LG Gram laptops.
Comparing the orange against the apple (Score:2)
There were no hard data anywhere during the whole Apple event. What Windows PCs they were comparing to were never mentioned
They were pretty clear. PC laptops in the same class of machines for the Air and Pro. PC desktops in the same class as the mini. Class being a factor of price, size, consumer/student vs. pro focus (which translates to CPU and RAM choices, among other things), etc.
Sure, you and I can build our own monster desktop from parts, high-end CPUs and GPUs, etc. But that is not the sort of thing most people buy; it's the orange being compared against the apple.
Re: (Score:2)
There were no hard data anywhere during the whole Apple event. What Windows PCs they were comparing to were never mentioned, nor any benchmark results.
Not sure, but they were using an EGA monitor if that helps.
Re: (Score:2)
Even Microsoft??? Microsoft can't release a patch without screwing something up! Apple already did this once when they went from Motorola to Intel. It wasn't a flawless transition, but they did it successfully. I have far more confidence that Apple can do this than Microsoft. Windows is still a mess after all these years.
Re:These are some hefty claims (Score:5, Insightful)
Re: (Score:2)
Twice, actually: 68K -> PowerPC -> x86/AMD64.
Re: (Score:2)
Apple went from 68k to PPC in the 90s. In the 00s they did PPC to Intel. Both transitions were amazingly well done.
Re: (Score:2)
Apple went from 68k to PPC in the 90s. In the 00s they did PPC to Intel. Both transitions were amazingly well done.
And this transition is easier, because they are both little-endian, both have the same floating-point format, vector operations are provided by the compiler in a way that is processor independent, and Apple now has some highly advanced compiler technology that can quite easily translate x86 to ARM code.
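To make the endianness point concrete, here is a tiny C sketch, nothing Apple-specific assumed, showing why code that peeks at byte order behaves identically on x86-64 and arm64, whereas the PowerPC-to-Intel move crossed an endianness boundary and forced byte-swapping audits on every port.

    /* endian_check.c -- both x86-64 and arm64 are little-endian, so code that
     * (perhaps unwisely) inspects the byte layout of integers ports unchanged.
     * Build: cc endian_check.c -o endian_check
     */
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void) {
        uint32_t value = 0x11223344;
        uint8_t bytes[4];
        memcpy(bytes, &value, sizeof(value));

        /* Prints 44 33 22 11 on both Intel and Apple Silicon Macs;
         * a PowerPC Mac would have printed 11 22 33 44. */
        printf("%02x %02x %02x %02x\n", bytes[0], bytes[1], bytes[2], bytes[3]);
        printf("%s-endian\n", bytes[0] == 0x44 ? "little" : "big");
        return 0;
    }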
Re: These are some hefty claims (Score:2)
Apple also dropped all 32-bit support years ago. Everything is 64-bit. This transition shouldn't be noticeable to most users.
Heck, even macOS and iOS share huge codebases.
Re: (Score:2)
"Migrating billions lines of code from one hardware platform to a new one is no easy task."
Is it a harder task than designing that entire new platform including a 16B transistor processor? You do realize that a lot of that "billions lines of code [sic]" has been running on Apple ARM processors for many, many years, right?
Re: (Score:2, Funny)
He probably never knew that "Apple's code" is written in C/Objective-C/C++ and can simply be recompiled for any conceivable architecture. Perhaps he does not know what a compiler is, or what it is used for.
Re: (Score:3, Insightful)
I have about 300 lbs. of C++ books. I don't think I need any more. C++20 is just getting so stupid on top of C++17 and C++14 and C++11 and C++0x and C++.. good god, the genealogy is longer than Moab's in the Bible. Sometimes it's just best to let it die off and move on.
ANYONE can write a compiler if they can understand how to link in STDLIB and do
Re: (Score:2)
During the presentation, there were a few programmers who talked about compiling for the new platforms. They said things like "very easy" and "it only took 10 minutes".
Re: (Score:2)
During the presentation, there were a few programmers who talked about compiling for the new platforms. They said things like "very easy" and "it only took 10 minutes".
I'm suddenly imagining one of the game company execs getting up there.
"It was easy," Mark Murray, CEO of Bat S**t Games, Inc. said. "We just took our entire programming team, locked them in a room, and slid pizzas under the door for eighteen months. At first, they complained, saying that it was unrealistic to throw out all of their OpenGL code and replace it with new code targeting Apple's Metal APIs, but then we told them that they could leave at any time they wanted. One older guy walked out the door,
Re: (Score:2)
And why exactly would a program compile fine for Intel x86 but not for M1 ARM? Using Xcode, the transition should be seamless.
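(One common case where it is not seamless: hand-written SIMD intrinsics or inline assembly. A rough C sketch of the kind of per-architecture guard such code needs; the function is purely illustrative.)

    /* add4.c -- example of code that does NOT "just recompile": the SSE path
     * only builds on x86, so an arm64 build needs a NEON or scalar fallback.
     */
    #include <stddef.h>

    #if defined(__x86_64__)
    #include <immintrin.h>          /* SSE intrinsics, x86 only */
    void add4(const float *a, const float *b, float *out) {
        _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
    }
    #elif defined(__aarch64__)
    #include <arm_neon.h>           /* NEON intrinsics, arm64 only */
    void add4(const float *a, const float *b, float *out) {
        vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
    }
    #else
    void add4(const float *a, const float *b, float *out) {
        for (size_t i = 0; i < 4; i++)
            out[i] = a[i] + b[i];   /* portable scalar fallback */
    }
    #endif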
Re: It wasn't an easy task, it took over a decade (Score:2)
Well, you sure didn't learn anything about Apple in the last 45 years. Nearly every single paragraph you wrote is factually incorrect.
Exciting (Score:2)
Definitely a lot of marketing going on here. And it's hilarious that you can buy a $700 Mac Mini to drive a $7000 Pro Display XDR. But it's been a long time since there was anything interesting happening in the desktop CPU market. If they can deliver on their performance and battery life claims, it really does represent a huge change in the laptop space.
Re: (Score:2)
Are you kidding? AMD just released Ryzen 5000 and it's destroying Intel. That's pretty interesting.
Re: (Score:2)
That's fairly interesting, it's true. (Also, hooray for AMD—I'll always have a soft spot for them.) But assuming that Apple's graphs represent anything approaching reality, they're showing much bigger year-over-year gains than we've seen in a long time. I don't know how Apple is going to scale these processors up—surely they can't keep putting ALL the memory on the SOC; 16GB is one thing, but what if you want 64GB or 512GB or 12TB like on the Mac Pro?—but I guess that's what I mean by *int
Re: (Score:2)
1.5TB of RAM like the Mac pro—I was thinking about the number of slots (12) when I typed that.
Re: (Score:2)
Yeah, I was thinking of the monitor in Canadian dollars—it's about $7000 here. It's still the same order of magnitude, though.
Benchmarks in a week or two (Score:2)
Re: (Score:2)
Oversold, but has some wins (Score:4, Interesting)
It'd be great to see ARM doing well, even if it's Apple doing it. The better battery life will be exciting, and when this hits the desktop they may be able to build massively parallel machines. Integrated graphics is unfortunate; I hope they offer something better soon. Also, their initial systems don't have much in the way of RAM. This should change over time, but a better launch would have included something with 32G or 64G as higher-end options.
Re: (Score:2)
I don't see the battery life being that big of an issue. x86-64 has made huge strides in the past few years. You can easily find a laptop that lasts over 10 hours, which is more than enough for a lot of people. Most people don't need a laptop that does 17 hours away from an electrical socket. Sure, if the performance and price are on par with an x86-64 chip, then more power is welcome, but for most people, the battery life won't be the first thing they look at.
Re: (Score:2)
You can easily find a laptop that lasts over 10 hours, which is more than enough for a lot of people.
Can you though?
I have not yet found a laptop that can last more than 5 hours for my work use, and most of my work is done in a terminal window. My current laptop hovers around 10-15W, according to powertop. 7W just for the display backlight at half brightness.
Re: (Score:2)
The laptops I see with over 10 hours of battery life usually have a big ass battery strapped to them that triples the weight and doubles the size. Not exactly something people would love to lug around for all day computing.
After all, if they need that sort of endurance, they usually aren't near places to charge otherwise they'd be using the AC adapters far more to extend battery life.
Re: Oversold, but has some wins (Score:2)
I'd imagine there'll be better RAM configurations in higher-spec Macs. The 13" is entry level. I'd be surprised if it's not at least 32 in the 15" upwards.
I'm more curious to see how the GPU performs. If this is the end of discrete GPUs in MacBook Pros then it had better be bloody good and not just better than Intel integrated video.
Re: Oversold, but has some wins (Score:2)
Bloody Slashdot. Shall we expect UTF-8 support by 2030?
Re: (Score:2)
Unicode support has been around since 2006 or so. But a string of abuses has resulted in going from a Unicode codepoint blacklist to a codepoint whitelist. And yes, UTF-8 is supported.
Of course, if you don't know how you can abuse Unicode to screw things up for everyone, then you really shouldn't be complaining. (And yes, Unicode abuse is what leads to those deadly text messages that crash Android and iOS phones).
Re: Oversold, but has some wins (Score:2)
They could try seeking advice from practically every website made in the past decade. There are ways the apostrophe can be made safe. Maybe one day we'll even find a way to make the £ (Sterling) safe.
Re: (Score:2)
It would be pretty easy to add some basic punctuation and science/math symbols to the whitelist - especially those used by default on iOS. However, I think those specific characters are kept on the blacklist to put a spotlight on who is using Apple hardware.
Re: (Score:2)
That doesn't exactly reflect any better upon them. I know this isn't a banking site or anything. And if my comment history goes kablooey... BFD. But sanitizing your inputs so that users can't break your site or DB with malicious "text" has been a standard best-practice for competent web developers for I-don't-even-know-how-many years. There's really just no excuse.
Re: (Score:2)
My guess is they deliberately avoided the really high end because they won't be competitive with Ryzen for workstation type loads. Even Intel can't keep up there.
Re: (Score:2)
Re: (Score:2)
The Mac Pro will need new chips too; Apple said they're transitioning the WHOLE line over to their silicon in the next couple of years. But those more powerful chips are definitely going to present a bigger architectural hurdle. I had originally thought they'd take on the Mac Pro first exactly because it's the most difficult, but it's the platform they have the absolute most control over. Since they went for the portable/power-efficient end first, the Mac Pro will have to be last, and we won't see those mac
Re: (Score:2)
If I remember right, the Pro desktop was the last one to leave the PowerPC platform.
Re: Oversold, but has some wins (Score:2)
Re: (Score:2)
I maintain some open-source scientific software available on all 3 major OSes; I want to start porting my stuff so that when higher-end ARM systems show up I'm ready. There's a chance a few people might try to run things on a lower-end system (contrary to our RAM recommendations), probably starting in a month, and I really don't want to buy a system now only to discard it a few months later. It's a bit frustrating.
(I realise not everyone is in the academic sector; thought I'd share some of our concerns there)
Re: (Score:2)
I think this competition will not likely be on a per-CPU basis so much as a fleet basis. Meaning they'll go with having lots of low-mid-tier CPUs in any given workstation rather than a few very fast cores. If I'm right, I expect a big push by Apple to get developers to make their apps much more multithreaded (or composed of smaller units).
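As an illustration of that pattern, here is a minimal sketch using Grand Central Dispatch from plain C; it assumes a macOS toolchain with clang's blocks extension and is only meant to show the "spread independent chunks across whatever cores exist" style, not any Apple-recommended design.

    /* par_sum.c -- sketch of fanning work out across cores with libdispatch.
     * Build on macOS: clang -fblocks par_sum.c -o par_sum
     */
    #include <dispatch/dispatch.h>
    #include <stdio.h>

    #define CHUNKS 8

    int main(void) {
        static double partial[CHUNKS];   /* one slot per chunk, no sharing */
        dispatch_queue_t q =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* Each iteration runs concurrently; the scheduler decides which
         * performance or efficiency core it lands on. */
        dispatch_apply(CHUNKS, q, ^(size_t chunk) {
            double sum = 0.0;
            for (long i = 0; i < 1000000; i++)
                sum += (double)(chunk * 1000000 + i);
            partial[chunk] = sum;
        });

        double total = 0.0;
        for (int i = 0; i < CHUNKS; i++)
            total += partial[i];
        printf("total = %.0f\n", total);
        return 0;
    }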
Re: (Score:2)
The issue is that ARM currently doesn't scale very well to massive numbers of cores. ARM has just released some new designs that improve things a bit but they are many years behind what AMD and even Intel are doing. You can't just throw more cores in and expect it to work well.
Re: (Score:2)
The better battery life will be exciting, and when this hits the desktop they may be able to build massively parallel machines
I gotta admit, that's where I get curious.
Performance per watt is an important metric for laptops. And Apple sells a ton of laptops, so this is very important for Apple. But there are some applications where I'm fine being "chained to a desk" and I'd rather have high-performance and don't really care about the watts.
No Biden will not take your M-1 (Score:2)
Doesn't matter. Biden is taking all your ARMs away.
No; as the M-1 is limited to 8 and can't be expanded by the end user, and it's all internal, it's considered acceptable, even in CA.
No good. (Score:5, Funny)
Re: (Score:2)
Like when the iPod was released and everyone here was saying how dumb it was and that it wouldn't sell.
Re: (Score:2)
whichever shitty Linux people use now for no fucking reason
Kind of offtopic, but the reason is that it's free, works fine (if only barely), and it's not controlled by Microsoft. I like Pop OS.
Curious to see how much the Air will throttle... (Score:2)
2021 (Score:2)
Would this be the year of Apple on my desktop?
Re: (Score:3)
It can be, if you're willing. But it doesn't have to be an apple. You can put oranges and bananas on your desktop too, if you want.
Re: (Score:2)
That's low-hanging fruit you bait with.
Memory on-chip? (Score:2)
I read in a separate article that the CPU in new Macs will have memory on-chip? 8 GB to start with, and 16 GB later, with no possibility of expansion? Just askin'. That'll be fine for most users, but not for power users. (I'm a heavy user of Adobe CC apps and have 56 GB installed.)
If the above is true, I wonder if they'll go with some kind of NUMA architecture in their very-high-end Macs.
Re: (Score:2)
Why would you not assume that higher end Macs simply get a different processor and memory solution? You think that because their first, low-TDP processor makes a design tradeoff that every other Mac will require bandaids?
Re: (Score:2)
Why would you not assume that higher end Macs simply get a different processor and memory solution? You think that because their first, low-TDP processor makes a design tradeoff that every other Mac will require bandaids?
I think that Apple tends to try to concentrate all platforms into a single architecture. I have no idea whether they will actually do that in this case; I can't read their minds. It's all speculation at this stage.
I don't see them using different architectures in different tiers, but I guess we'll see.
Re: (Score:2)
I think if they were going to do that, they'd've announced all their Macs today. My own intuition is that they'll have different tiers and system architectures because trying to make a Mac Pro the same way you make a Macbook Air would just leave you with a garbage Mac Pro, and they've already had enough trouble there.
Re: (Score:2)
I think if they were going to do that, they'd've announced all their Macs today. My own intuition is that they'll have different tiers and system architectures because trying to make a Mac Pro the same way you make a Macbook Air would just leave you with a garbage Mac Pro, and they've already had enough trouble there.
The whole point is NOT to announce all their macs today, so that they can sell you an 8GB mac now and a 16GB mac when it comes out.
Re:Memory on-chip? (Score:4, Interesting)
Yeah RAM appears to be on-chip, apparently with that they can do a whole bunch of zero-copy between the different components (CPU, GPU, ML) for increased performance over the alternative. I'd be very interested to see the low-level OS architecture for this type of setup.
Re: (Score:2)
You did get the "no possibility of expansion" part right.
But it's not "8 GB to start with, and 16 GB later". It's "Buy your Mac with either 8GB or 16GB of RAM".
Re: (Score:2)
You did get the "no possibility of expansion" part right.
But it's not "8 GB to start with, and 16 GB later". It's "Buy your Mac with either 8GB or 16GB of RAM".
Yes, I did get that, and you are correct. But it appears to be part of Apple culture to replace your Mac with the upgraded one when it comes out. So naturally they came out with the 8 GB model first.
Re: (Score:2)
Agreed.
I often right click on an image in Lightroom to "edit in Photoshop", and often have several photos open at the same time. Also, creating panoramas or doing stacking in Lightroom seems to take a lot of memory. Statistics routinely show 2/3 memory allocated.
A problem I'm having at the moment is with the medium-high-end Nvidia card failing. With Lightroom and/or Photoshop and a tutorial open at the same time, all using the gpu for acceleration, I frequently have the video crash (black screen) althoug
8 GBs of memory.. really? (Score:5, Insightful)
Re: (Score:2)
RISC code-expansion? Can you elaborate on that please?
Yes, I know RISC means Reduced Instruction Set, but even with a reduced instruction set the amount of code needed isn't necessarily more.
Back when the ARM2 first appeared in the Acorn Archimedes, which I bought while at Uni doing Comp Sci & Electronics, I ran a comparative test.
I took a C function and compiled it on the Archimedes, a Sinclair QL (68000) and a Sequent (x86).
Then I counted the total number of instructions each compiler produced for its C
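That sort of comparison is easy to repeat today: compile a small function to assembly for each target and count the instructions. A hedged sketch follows (the function is arbitrary and the counts will vary with compiler and flags).

    /* density.c -- a small function for eyeballing code density across ISAs.
     * Emit assembly for each target and count instructions, e.g. on a Mac:
     *   clang -O2 -S -arch x86_64 density.c -o density_x86.s
     *   clang -O2 -S -arch arm64  density.c -o density_arm64.s
     */
    #include <stddef.h>

    size_t count_even(const int *v, size_t n) {
        size_t hits = 0;
        for (size_t i = 0; i < n; i++)
            if ((v[i] & 1) == 0)   /* count even elements */
                hits++;
        return hits;
    }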
Re: (Score:2)
For your information, the MC68000 has 16 registers of 32-bit width.
The one dumping registers on the stack was most probably the x86, not the MC68000.
Re: (Score:2)
The QL's 68008 had 8 data registers, 7 address registers, 2 stack pointers (user/super). [wikipedia.org]
The ARM2 had 27 32-bit registers. [wikichip.org]
Its registers were general-purpose and windowed. Only 16 were available at any time, the topmost being the program counter.
On an interrupt, the registers were swapped to remove the need to dump to the stack (obviously not all 16, something like the first 10, hence 27 registers).
It's been 30+ years, but from memory both the 68008 and the x86 spent too much time pushing and pulling the stack.
Re: (Score:3)
When I was at Canterbury University in the 1980s there were some other computer science students who were interested in ARM. Dave Jaggar actually did his thesis on the ARM instruction set. I attended a couple of his seminars over the years and I recall him saying that it normally needed about 20% more code than a CISC computer, but it would execute the code much faster, therefore more than making up the difference. He actually ended up as ARM's Head of Architecture Design in Cambridge, UK and de
Re: (Score:2)
You understand the idea of the "base model" right? You don't need 16GB of RAM to post shit on Slashdot all day. 8GB is more than enough for many people. Not for me on my desktop, but for many people.
An CPU? (Score:3)
Now that's some advanced Eurospeak!
Only 2 USB ports? (Score:2)
I have the 13" MacBook Pro for work, and I already need extra dongles for ethernet, monitors, keyboard, and mouse. The new one only gives you 2 USB ports, and one of them will be used for power! I can't wait to see what they remove on the next big release!
Re: (Score:2)
You know you can use a single USB-C port for a "docking station" with Ethernet, display, and multiple USB-A ports. It can also pass through power, leaving all of the other ports on the laptop available. I like the ones from Monoprice.
Re: (Score:2)
What apps will these chipped computers run? (Score:2)
Re: (Score:2)
Apple announced that all Apple programs are compiled as fat executables with both Intel and ARM binaries included.
For 3rd-party programs, there is an emulation/translation layer called Rosetta 2. Rosetta 2 is apparently both JIT and install-time recompile of Intel -> ARM binaries. This is supposed to work for the vast majority of applications, but it will not work for Bootcamp or virtualization. I imagine we'll start seeing benchmarks soon.
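For anyone curious what a fat (universal) executable actually looks like on disk, here is a rough C sketch that lists the architecture slices by reading the Mach-O fat header. The magic number and CPU-type constants mirror Apple's <mach-o/fat.h> but are declared locally as assumptions so the sketch compiles anywhere, and it only handles the classic 32-bit fat header.

    /* fatcheck.c -- list the architecture slices in a universal (fat) Mach-O file.
     * All fat-header fields are stored big-endian on disk.
     * Usage: ./fatcheck /path/to/binary
     */
    #include <stdio.h>
    #include <stdint.h>
    #include <arpa/inet.h>   /* ntohl */

    #define FAT_MAGIC        0xcafebabeu
    #define CPU_TYPE_X86_64  0x01000007u
    #define CPU_TYPE_ARM64   0x0100000cu

    int main(int argc, char **argv) {
        if (argc != 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        uint32_t header[2];                    /* magic, nfat_arch */
        if (fread(header, sizeof(uint32_t), 2, f) != 2) { fclose(f); return 1; }

        if (ntohl(header[0]) != FAT_MAGIC) {
            printf("not a fat binary (single-architecture or not Mach-O)\n");
            fclose(f);
            return 0;
        }

        uint32_t slices = ntohl(header[1]);
        printf("%u architecture slice(s):\n", slices);
        for (uint32_t i = 0; i < slices; i++) {
            uint32_t arch[5];                  /* cputype, cpusubtype, offset, size, align */
            if (fread(arch, sizeof(uint32_t), 5, f) != 5) break;
            uint32_t cputype = ntohl(arch[0]);
            printf("  %s (cputype 0x%08x), %u bytes\n",
                   cputype == CPU_TYPE_X86_64 ? "x86_64" :
                   cputype == CPU_TYPE_ARM64  ? "arm64"  : "other",
                   cputype, ntohl(arch[3]));
        }
        fclose(f);
        return 0;
    }

This reports roughly what the lipo tool would, just spelled out so the on-disk layout is visible.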
Re: (Score:2)
To be more precise, anyone is/will be able to compile fat executables too. And if you don't use Apple's Xcode then it's your own damn fault, they've been telling you to switch for years now.
Rosetta 2 is there in case you need to use x86 Mac software that's not supported anymore (i.e. the developer/company stopped making new versions), or you can't upgrade to the new version for some reason (the licensing changed, it's too expensive, it changed from paid software to software-as-a-service, etc).
Laptop for developers? (Score:3)
When the new ARM 16" arrives, that window will be closed again, it looks like, since x86-target docker images will probably run 5x slower emulated on the ARM.
So I guess all the developers who don't develop Apple-specific code will have to scoop up their last 16" Intel Macbook Pro before they're gone. Hopefully for a discount once the new ARM one arrives.
Probably too much to ask for a MacBook Pro with an optional x86 in there alongside the ARM, so it remains a developer machine.
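On a related note, a translated process can at least detect that it is running under Rosetta. A short sketch using the sysctl.proc_translated key that Apple documents for this purpose (the key reads 1 under translation, 0 when native, and is absent on Intel Macs and older systems):

    /* rosetta_check.c -- is this process running natively or under Rosetta 2?
     * Build: cc rosetta_check.c -o rosetta_check
     */
    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/sysctl.h>

    int main(void) {
        int translated = 0;
        size_t len = sizeof(translated);

        if (sysctlbyname("sysctl.proc_translated", &translated, &len, NULL, 0) != 0) {
            printf("native (key not present; likely an Intel Mac or older macOS)\n");
            return 0;
        }
        printf(translated ? "running under Rosetta 2 translation\n"
                          : "running natively on Apple Silicon\n");
        return 0;
    }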
Apple is making M1s ... (Score:2)
M-1, huh? (Score:2)
Better skip a generation when they get to M-5. M-5 units have a tendency to develop some fairly dramatic and exciting showstopper bugs. Some deaths were involved the last time around, if my recollection is correct. And the problem can only be corrected by talking the M-5 into committing suicide. And not all of us have the unusual diction that's been useful for that sort of enterprise.
Re: (Score:2)
Sorry... It's from classic Star Trek. The M-5 was a computer (one of several, actually) that achieved sentience, went insane, and killed a bunch of people. And, as with most of the others, Captain Kirk saved the day by talking it into committing suicide.
https://memory-alpha.fandom.co... [fandom.com]
Software compatibilities? (Score:2)
Will they be completely compatible with older software? I know Apple dropped 32-bit in Catalina. I assume these new MacBooks will come with Big Sur.