Apple Could Use ARM Coprocessors for Three Updated Mac Models (techcrunch.com) 119
According to a Bloomberg report, Apple could be working on three new Mac models for this year. From a report: All three of them could feature an ARM coprocessor to improve security. Apple isn't switching to ARM chipsets altogether. There will still be an Intel CPU in every Mac, but with a second ARM processor alongside it. Currently, the MacBook Pro features a T1 chip while the iMac Pro features a T2 chip. On the MacBook Pro, the ARM coprocessor handles the Touch ID sensor and the Touch Bar. This way, your fingerprint is never stored on your laptop's SSD -- it remains in the T1's secure enclave. The Intel CPU only gets a positive response when a fingerprint is validated. The iMac Pro goes one step further and uses the T2 to replace many discrete controllers. The T2 controls your stereo speakers, your internal microphone, the fans, the camera and internal storage.
Re: (Score:2)
What, didn't notice that even the mighty Fruit could not get Intel to deactivate the ARM core in the chipset, a.k.a. the Intel ME?
No, the ME hardware is based on the Intel Quark [wikipedia.org] processor.
Re: (Score:2)
from TFWP: "Starting with ME 11..."
And prior to that, ARC (not ARM).
Re: (Score:2)
He's probably thinking of the old ARC chip that they used formerly.
Re: (Score:2)
He's probably thinking of the old ARC chip that they used formerly.
Still not ARM though!
Re: (Score:2)
What? It's only off by one letter! It has to be nearly the same...
Attack Surface? (Score:1)
At first glance this appears to be a whole new way to attack the machine...
Re:Attack Surface? (Score:4, Informative)
Centralized devices are the bane of security. Decentralized components that do one thing and do it well help security, by making each process easy to code and manage without conflicting with other actions. If your fingerprint scan has to be handled by the main CPU, that means your fingerprint data is going down the main CPU bus, where it is potentially visible to other applications and hacks. Compare that to, in essence, its own little computer inside the computer that does the work and sends back a good-or-bad bit, outside of what the rest of the computer is dealing with. All the intermediate data in the processing is not accessible from the rest of the computer. This in general makes things much safer.
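To make the "good-or-bad bit" idea concrete, here is a minimal Python sketch of that kind of interface. It is purely illustrative, not Apple's actual T1/T2 design; the ToyEnclave name and the hash-based matching are invented for the example.

```python
# A toy "little computer in the computer": the enclave keeps the enrolled
# template to itself, and the host only ever sees a yes/no answer, never
# the biometric data itself.
import hmac
import hashlib


class ToyEnclave:
    """Holds the enrolled template privately; exposes only a match result."""

    def __init__(self, enrolled_template: bytes):
        self._template_digest = hashlib.sha256(enrolled_template).digest()

    def verify(self, scanned_template: bytes) -> bool:
        candidate = hashlib.sha256(scanned_template).digest()
        # Constant-time comparison; only the boolean leaves the "enclave".
        return hmac.compare_digest(self._template_digest, candidate)


# Host side: all it ever learns is True or False.
enclave = ToyEnclave(b"enrolled-fingerprint-features")
print(enclave.verify(b"enrolled-fingerprint-features"))  # True
print(enclave.verify(b"someone-elses-finger"))           # False
```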
Re: (Score:2)
Centralized devices are the bane of security. Decentralized components that do one thing and do it well help security, by making each process easy to code and manage without conflicting with other actions. If your fingerprint scan has to be handled by the main CPU, that means your fingerprint data is going down the main CPU bus, where it is potentially visible to other applications and hacks. Compare that to, in essence, its own little computer inside the computer that does the work and sends back a good-or-bad bit, outside of what the rest of the computer is dealing with. All the intermediate data in the processing is not accessible from the rest of the computer. This in general makes things much safer.
Exactly!
Amazing that Slashdotters can't see that; but it is most likely due to their inherent anti-Apple bias.
Re: (Score:2)
If you are attacking it with a bat, or by soldering a bridge, then yes.
But most attacks are done with software nowadays. That means access to main memory and the bus is key, and the more complex the information going over them, the greater the chance of a problem.
Re: (Score:2)
If you are attacking it with a bat, or by soldering a bridge, then yes.
But most attacks are done with software nowadays. That means access to main memory and the bus is key, and the more complex the information going over them, the greater the chance of a problem.
Exactly.
In the TOTAL analysis of the system, getting information such as raw biometric scans, whether encrypted or not, off of the main system bus(es), be it the main memory bus, or a miscellaneous SPI or I2C bus, etc. is ALWAYS a Good Thing as far as INCREASING security (a/k/a DECREASING "Attack Vectors").
Only stupid, basement-dwelling trolls with no REAL design experience like the GP don't know that fundamental truth.
Re: (Score:2)
No, those of us with actual hardware knowledge know that ADDING hardware only increases an attack surface, you fake fuckwit troll.
As an embedded hardware/software developer for the past several Decades, I have plenty of hardware knowledge.
Theoretically, true, anything you add, whether software OR hardware, increases an attack surface. So, why would you even bother to make a comment that is true no matter what, other than to Troll yourself?
So look, NOW who's trolling???
Re: Attack Surface? (Score:2)
Re: (Score:2)
No, those of us with actual hardware knowledge know that ADDING hardware only increases an attack surface
Bullcrap. Many mission-critical systems require a separate CPU for watchdog timers (WDTs), memory scrubbing, etc.
The primary CPU, which runs application code, is always going to be the most vulnerable. Moving critical functions to a separate CPU with its own memory will make a system much harder to attack.
Re: (Score:2)
You're both right, at least to some degree. Increasing the amount of hardware does increase the attack surface, undeniably, but it also moves critical functions from the main component (with the large attack surface) to the specialised sub-component (which may have a much reduced attack surface). The system as a whole may be technically less secure as a result, but the stuff you care most about is more secure.
Re: (Score:2)
At first glance this appears to be a whole new way to attack the machine...
At second glance this appears to be removing many ways to attack the machine.
By moving access control and security to a separate dedicated chip with its own memory, rather than running it on the general purpose CPU which also runs random apps and webpages, this should make exploits more difficult.
Intel be gone (Score:2)
Only a matter of time before the Intel chips disappear completely.
Re: (Score:1)
Yeah. Hundreds or maybe thousands of years but, still, just a matter of time.
Re: (Score:2)
Perhaps I should have said, "Only a matter of time before the Intel chips disappear completely from new Mac models."
For those unable to read context.
Re: (Score:2)
What a load of bull droppings. Not all Mac users use Faecebook or any other antisocial media for that matter.
I've never even seen a FB screen or app and don't want to.
I have absolutely ZERO social media accounts.
Re: (Score:2)
I have absolutely ZERO social media accounts.
You are mistaken. You have an account on /.
And I would not be surprised if there are one or two more wiki/forum/discussion sites where you have an account. An old Delicious account perhaps, or one on Reddit?
Re: (Score:2)
I have absolutely ZERO social media accounts.
You are mistaken. You have an account on /.
And I would not be surprised if there are one or two more wiki/forum/discussion sites where you have an account. An old Delicious account perhaps, or one on Reddit?
If you consider tech forums "social media" (more like anti-social media around here!), then I have a very few of those (but no Delicious or Reddit); but I consider "social media" to mean the usual suspects: Twitter, Facebook, Snapchat, etc., which are not tech-focused. I don't even know very many of them, because I just don't give a shit about that stuff.
I only put up with Slashdot because it's like beating oneself over the head. It feels so good when you stop!
Re: (Score:2)
Well, look at it this way:
American immigration officers ask you for your social media account passwords.
Would they want your /. password, too, or not?
Re: (Score:2)
Well, look at it this way:
American immigration officers ask you for your social media account passwords.
Would they want your /. password, too, or not?
I honestly don't think so.
Re: (Score:2, Troll)
They are already getting rid of 32-bit support; soon they will fully iOSify macOS, from OS X to OS i, with an ARMed walled garden. The $5000 iMac Pro is the swan song for Intel Macs. Macs running macOS 10.14 will be full Facebook machines.
They are getting rid of 32-bit support because it is a gigantic drain on development and testing (especially the latter) to keep both architectures going.
I don't like it much myself; but I understand their point of view.
It has NOTHING to do with "iOSifying" macOS.
I hope it's not lumped on the DMI bus (Score:1)
So storage will be cut down to one PCIe x4 bus mixed with coprocessor traffic. Nice way to cut the power of the new Mac Pro (at least it has the lanes to avoid stuffing it on the DMI bus, which also carries the network and all the other I/O).
For the Macs with fewer PCIe lanes, they should lay out PCIe like this:
x16 from the CPU to a switch, and from the switch: x8 video card, x4 Thunderbolt (one bus), x4 (storage + coprocessor),
and DMI for all other I/O.
For the Mac Pro: at least one open PCIe x16 slot + an x16 video card + at least two Thunderbolt buses, and maybe x4 storage x 2 (non-boot) and x4 boot.
Sounds like an Embedded Controller (Score:3)
Re:Sounds like an Embedded Controller (Score:4, Interesting)
For the most part it is. For some crazy reason we moved to integrated systems back in the 1990s. I think it was because the OS started to support software drivers, so devices could be made much more cheaply, because things like controller boards, or support for an open protocol, could be skipped. A modem is just a D-to-A and A-to-D converter, which could have been made cheaply; the expensive part was the Hayes AT command processing, which boosted the cost way up. But if you have the driver handle that stuff, you can release a cheap modem (which could probably double as a sound card).
This came at a cost in security, though. Integrated means your OS sees everything that is going on, and any security flaw can affect everything.
Today security is getting more attention, and components are getting cheaper and smaller too, so it seems we are going back to that approach. Perhaps we will even get to the point where these things sit in a removable socket again, so we can upgrade and repair them.
Re: (Score:3)
ARM embedded controllers with a huge number of peripherals are extremely cheap now. As such they get thrown into all sorts of things. In a way it's good: they offload work from the main CPU. In another way it's bad, because they rarely even consider security in the design.
Sadly I doubt sockets will be coming back. They are expensive. Back in the day they made more sense because you might need to issue a firmware update, which meant replacing chips. That and parts were unreliable or had to be matched during m
Re: (Score:2)
Hey AmiMoJo, remember how I adopted the sig "The one straight white male in new Star Trek will be portrayed as evil or incompetent" back before Star Trek Discovery premiered? You know, because he was the only straight white male on an SJW show, and so I knew that he would ultimately have to be revealed as either evil or incompetent--because SJW's, as much as they would deny it, really HATE straight white males.
Remember how an enlightened SJW like yourself corrected my foolish misinformed view back in Octobe
Ecclesiastes 1:9. (Score:2)
Remember the NuBus DOS card you could get to run DOS at 'native' speeds?
That said, I welcome it and other similar endeavors. I wish I could buy a more 'modular' desktop for exploratory development. For my work I'd rather have a boatload of ARM cores or FPGA devices on a x16 PCI link than a video card.
Re: (Score:2)
Some porn for you... 48 MIPS cores on a PCIe board:
http://parpro.com/product/o3e-... [parpro.com]
Huh? Why not switch from Intel to AMD? (Score:2, Insightful)
Apple could kill Meltdown and still have perfect Intel compatibility by just using AMD. I am not necessarily saying they should not have the ARM coprocessor, just that using AMD instead of Intel would increase security drastically. Also because AMD doesn't have the Management Engine. They have something equivalent, but it doesn't have a full IP stack and other "niceties" like that.
Re: (Score:2)
Apple could kill Meltdown and still have perfect Intel compatibility by just using AMD. I am not necessarily saying they should not have the ARM coprocessor, just that using AMD instead of Intel would increase security drastically. Also because AMD doesn't have the Management Engine. They have something equivalent, but it doesn't have a full IP stack and other "niceties" like that.
It would take the better part of a year for Apple to "qualify" macOS for AMD CPUs.
Re: (Score:1)
Apple probably already has macOS "qualified" for AMD CPUs. If you remember back when Apple shocked the world by announcing that they were switching from PowerPC CPUs to Intel, they'd already been running various flavors of Mac OS X on Intel for years in their development skunkworks. They never stand still, and like all good companies, they continually plan for multiple contingencies.
There goes Hackintosh (Score:4, Insightful)
I strongly suspect that this change will make hardware compatibility with off-the-shelf components a thing of the past for Apple (again).
This means no more Hackintoshes, should a future macOS release require this chip to be present.
Re: (Score:1)
I admit, Linux (with its various distros) is much easier and simpler to install and use. However, I prefer the BSDs for philosophical reasons.
Re: (Score:2)
Were I to switch to BSD, I'd prefer MATE to KDE. I use lots of KDE tools and libraries, but their screens have gotten too noisy.
Re: (Score:2)
Re: (Score:2)
Honestly, I couldn't care less. OSX is a poor Linux imitation. If you want the UNIX experience, just install Linux. It is much easier and gives you a better OS to boot.
That's why all the developers who care to diversify beyond Windows write for Ubuntu.
Re: (Score:3)
And the rest use a Mac and write for CentOS or SUSE or Red Hat, in Java ...
Re: (Score:2)
Honestly, I couldn't care less. OSX is a poor Linux imitation.
Actually, since macOS is the successor of NeXTSTEP, and NeXTSTEP predates Linux, the opposite is more accurate.
SMC? (Score:2)
So how exactly is this different from the SMC (System Management Controller, for those that don't know)? AFAIK the SMC already handles these tasks.
Sounds like they're just replacing whatever the SMC used to be (I'm assuming an FPGA of some sort) with an ARM CPU?
Put the "co" in coprocessor! (Score:1)
This isn't surprising (Score:3)
Every PC has dozens of microprocessors, so adding an ARM chip into a computer is no big paradigm shift. A typical PC has a SATA controller, USB controller, video card, etc. One of the big things Intel has been good at over the years is integrating more features onto a single die. Around 2003 they started bundling wireless into the platform ("Centrino"), followed by integrated graphics. I forget when the memory controller got integrated.
Re: (Score:2)
Every PC has dozens of microprocessors, so adding an ARM chip into a computer is no big paradigm shift. A typical PC has a SATA controller, USB controller, video card, etc. One of the big things Intel has been good at over the years is integrating more features onto a single die. Around 2003 they started bundling wireless into the platform ("Centrino"), followed by integrated graphics. I forget when the memory controller got integrated.
Few computers have multiple general-purpose CPUs of different architectures. A SATA controller, a GPU, or even a northbridge or southbridge is nothing like a CPU, because they have different purposes. An ARM and an Intel (or AMD) CPU are built to do the same thing but are fundamentally incompatible (you can't even get Intel and AMD CPUs to work together well).
Given that either of those processors is capable of handling a modern OS without any trouble, there's no benefit to increasing complexity to hand off di
I think I know why they're doing this (Score:4, Interesting)
Watch for a forthcoming OS that will run macOS apps and iOS apps simultaneously, with a touchscreen for at least the laptop models. At first such a machine will be primarily for developers, replacing the iOS simulator that is now part of Xcode, but we may then see the long-awaited convergence of laptops and tablets.
Long awaited by whom? (Score:2)
I think of all the nonsense that has hit macOS in recent years in an effort to make it more iOS-like. I do not like how they removed the management controls from iTunes. I LIKED having more robust photo options. Almost everything they have added in to m
You can see how this went (Score:3)
Reasonably intelligent person: Hey, this fingerprint stuff is sensitive. Let's isolate it in separate hardware!
Non-stupid detail person: ... and since it's specialized hardware and has information we want to control let's lock it down and have it only run code we've signed!
Well-meaning idiot: ... and since it only runs our code, let's make it More Secure by having zero transparency!
Fucking worthless moron: ... and since it's More Secure, let's put it in control of more stuff! And add more software! And funnel everything through it! Let's have it run the keyboard! And the camera! And the disk!
(Intel): ... and let's give it direct network access, too!
Hacker: Pwnt!
This pattern happens over and over again at company after company. People build these "secure" enclaves to isolate things, and then as soon as they have them they blow that isolation by shoveling in every damned thing they can think of so everything can be "more secure". And since it's in charge of everything, it has to have control of everything. And then it gets cracked.
THAT'S NOT HOW IT'S SUPPOSED TO FUCKING WORK! If you have a sensitive function, you put it in its OWN FUCKING COMPARTMENT. And you give it no more privilege than it needs to do that one thing. You don't dump a shit-ton of unrelated software into a coprocessor that's trusted for everything (and, by the way, is usually pretty much invisible to the OS).
Morons.
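For what it's worth, the "own compartment, minimal privilege" rule above can be sketched in a few lines of Python. This is only an illustration of the principle; the Compartment class and its operations are invented for the example, not any real enclave API.

```python
# Each sensitive job gets its own unit that answers exactly one kind of
# request and rejects everything else, instead of one "secure" blob that
# is trusted for everything.
class Compartment:
    def __init__(self, name, handlers):
        self.name = name
        self._handlers = dict(handlers)  # the only operations this unit allows

    def request(self, op, payload):
        handler = self._handlers.get(op)
        if handler is None:
            raise PermissionError(f"{self.name}: operation {op!r} not permitted")
        return handler(payload)


# One compartment per sensitive function.
fingerprint = Compartment("fingerprint", {"verify": lambda scan: scan == b"enrolled"})
disk_keys = Compartment("disk_keys", {"unwrap": lambda blob: b"volume-key"})

print(fingerprint.request("verify", b"enrolled"))   # True
print(disk_keys.request("unwrap", b"wrapped-key"))  # b'volume-key'

try:
    fingerprint.request("unwrap", b"wrapped-key")   # not its job
except PermissionError as err:
    print(err)  # fingerprint: operation 'unwrap' not permitted
```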
After Intel ME flaws (Score:2)
Hooray for OOB vulnerabilities! (Score:2)
Re: (Score:1)
Re: (Score:3)
Of course none of that info is secure any longer, due to Spectre and Meltdown on the Intel side. Everyone is going to need to replace their CPUs. There's going to be a huge class action lawsuit (or several of them).
Apple released their OWN Spectre and Meltdown patches for the past 3 macOS versions. I haven't heard that their versions of the patches have the same issues as the Intel ones that everyone is rushing to uninstall...
Re: (Score:2, Funny)
They will; but it will cost an ARM and a leg.
Re:Next (Score:4, Funny)
(slow clap)
"Oh good, my slowclap processor made it into this thing." - GLaDOS
Re: (Score:2)
Maybe they will fix the myriad root exploits at some point too. That would be nice!
Typical bullshit Apple Hater comment.
Some drive-by hate-filled-bullshit, with zero substantiation.
Re: (Score:2)
So a fingerprint or other auth is stored externally, then tells an Intel chip, yeah, go ahead and boot.
In possession of that machine, we just fake the auth to the Intel chip and move on from there.
There is no Apple hate here; Apple *thinks* they're smarter than the rest of the world, but have become vastly insular, and a cult unto themselves.
Re: (Score:1)
EZ PZ. Monitor the transaction with a logic analyzer. Cough the stream to your favorite GPU board, or perhaps an ASIC or FPGA that knows the algorithm. Somewhere, the security key is stored. Hammer that as an alternative. Voila: unlock.
Or just find where the state is termed valid somewhere downstream of this logic path and flip (or pound) a few bits.
The sheer sanctimoniousness of inter-process systems designers galls the shit out of me. With a clever enough hammer, you can break anything, and Apple is and h
Re: (Score:2)
EZ PZ. Monitor the transaction with a logic analyzer. Cough the stream to your favorite GPU board, or perhaps an ASIC or FPGA that knows the algorithm. Somewhere, the security key is stored. Hammer that as an alternative. Voila: unlock.
Or just find where the state is termed valid somewhere downstream of this logic path and flip (or pound) a few bits.
The sheer sanctimoniousness of inter-process systems designers galls the shit out of me. With a clever enough hammer, you can break anything, and Apple is and has never been an exception to this.
Then why is the FBI ranting and raving?
They sure as HELL don't have to do so for Android phones!
And the "Knows the Algorithm" part isn't necessarily so "EZPZ" when BOTH ends are non-discoverable, like in an iPhone. It might be possible in a Mac, unless Apple has convinced them to put some custom hardware in their CPUs (which they very may well have).
With a clever enough bolt, you really do need the proper tool to remove it.
I'm not saying it's impossible; but Apple has a VERY good track-record in this regard
Re: (Score:2)
EZ PZ. Monitor the transaction with a logic analyzer.
This will only work if Apple's protocol was designed by total morons supervised by other total morons.
You can't just play back a bitstream to defeat encryption, because every transaction uses different "salt" and timestamp.
... an ASIC or FPGA that knows the algorithm.
Much of the most robust cryptography is based on published algorithms. "Knowing the algorithm" is obviously not enough. You also need time, like quadrillions of times the life of the universe. Good luck.
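To illustrate the point about salts and fresh challenges, here is a generic HMAC challenge-response sketch in Python. It is an assumption about how such a link could work in general, not Apple's actual protocol: because the host issues a new random challenge every time, a response captured with a logic analyzer yesterday only verifies against yesterday's challenge.

```python
# Generic challenge-response: replaying a captured response fails because
# the verifier never reuses a challenge.
import hmac
import hashlib
import secrets

SHARED_KEY = secrets.token_bytes(32)  # provisioned into both ends at pairing time


def sensor_respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()


def host_verify(challenge: bytes, response: bytes, key: bytes = SHARED_KEY) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


challenge_1 = secrets.token_bytes(16)
captured = sensor_respond(challenge_1)     # attacker sniffs this exchange
print(host_verify(challenge_1, captured))  # True: the legitimate exchange

challenge_2 = secrets.token_bytes(16)      # next boot: a brand-new challenge
print(host_verify(challenge_2, captured))  # False: replaying the capture fails
```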
Re: (Score:2)
So a fingerprint or other auth is stored externally, then tells an Intel chip, yeah, go ahead and boot.
In possession of that machine, we just fake the auth to the Intel chip and move on from there.
There is no Apple hate here; Apple *thinks* they're smarter than the rest of the world, but have become vastly insular, and a cult unto themselves.
In practice, Apple's "go/no-go" is a LOT more complicated than a simple logic level. That's why you can't swap out the fingerprint sensor in an iPhone/iPad: it has a unique cryptographic pairing with the SoC. I'm not sure how that's handled with the Intel CPU in the MacBook Pro, but I am sure Apple has more than just "pull this pin low for an Unlock signal".
And yes, Apple IS smarter than the rest of the world (or at least as smart as the top people in this field); at least in this regard.
Ask the FBI. BTW,
Re: (Score:2)
In possession of that machine, we just fake the auth to the Intel chip and move on from there.
I doubt if the "auth" is something as simple as a voltage change on a pin. You have no reason to believe that it is easy, or even possible, to "fake the auth".
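As a rough illustration of why "fake the auth" is harder than flipping a pin when the parts are cryptographically paired, here is a toy Python sketch. The keys and message format are invented; this is not the real Touch ID pairing scheme. The host only accepts responses authenticated with the key it was paired to, so a swapped-in sensor with a different key simply fails to verify.

```python
# Per-device pairing key: messages from the factory-paired sensor verify,
# messages from a replacement sensor do not.
import hmac
import hashlib
import secrets

paired_key = secrets.token_bytes(32)    # burned in at factory pairing
imposter_key = secrets.token_bytes(32)  # a replacement sensor has a different key


def sign(key: bytes, message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()


def host_accepts(message: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(paired_key, message), tag)


msg = b"match=ok"
print(host_accepts(msg, sign(paired_key, msg)))    # True: the paired sensor
print(host_accepts(msg, sign(imposter_key, msg)))  # False: a swapped-in sensor
```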
Re: (Score:2)
Maybe they will fix the myriad root exploits at some point too. That would be nice!
When the company gets word of a root exploit, it gets patched quickly. If you know about 10,000 of them, Apple would be really interested.
Re:Hackintosh (Score:5, Insightful)
Interesting way to make Hackintosh machines more difficult to build, but an ARM core can be emulated with QEMU.
Nice try.
macOS will still have to install on the dozens of Mac models WITHOUT an ARM coprocessor; so, for the foreseeable future, that paranoid fantasy will remain just that.
Re: (Score:2)
They'll ditch compatibility when convenient, regardless of what the community has to say. Never forget that Apple is the king of planned obsolescence.
Re: (Score:2)
They'll ditch compatibility when convenient, regardless of what the community has to say. Never forget that Apple is the king of planned obsolescence.
They've been "looking the other way" regarding the Hackintosh community since OS X debuted nearly TWENTY years ago.
I really don't think they are using the ARM coprocessors to do that, overtly; but if macOS starts DEPENDING on their presence...
Re: (Score:2)
Interesting way to make Hackintosh machines more difficult to build, but an ARM core can be emulated with QEMU.
Nice try.
macOS will still have to install on the dozens of Mac models WITHOUT an ARM coprocessor; so, for the foreseeable future, that paranoid fantasy will remain just that.
Are Hackintoshes even still a thing?
Honestly, that's the first time I've heard that term in what must have been 5 or 6 years... I don't think many have bothered on a serious level because Windows 7 was good enough and Apple made it too hard. I imagine the only people doing it now are doing it just for the LoLs.
However, welcome to the beginning of the end. The Mac user is now just an iPad user with a bigger bill; it's only a matter of time before the Intel processor is dropped and macOS and iOS become on
Re: (Score:3)
Interesting way to make Hackintosh machines more difficult to build, but an ARM core can be emulated with QEMU.
Nice try.
macOS will still have to install on the dozens of Mac models WITHOUT an ARM coprocessor; so, for the foreseeable future, that paranoid fantasy will remain just that.
Are Hackintoshes even still a thing?
Honestly, that's the first time I've heard that term in what must have been 5 or 6 years... I don't think many have bothered on a serious level because Windows 7 was good enough and Apple made it too hard. I imagine the only people doing it now are doing it just for the LoLs.
However, welcome to the beginning of the end. The Mac user is now just an iPad user with a bigger bill; it's only a matter of time before the Intel processor is dropped and macOS and iOS become one. No one does serious work on a Mac, despite your forthcoming protestations.
Hackintoshes are most CERTAINLY still "a thing", especially given how long it has been since the last Mac Pro refresh, and the fact that some have been pining for the "return of the tower" ever since the cylindrical Mac Pro came out in 2013. Plus, some people are just "cheap"...
Apple will not be dropping Intel (but they may switch to AMD, if AMD can get its power consumption down to a reasonable level); because they know they sell a LOT of MacBook Pros (and some iMacs) to people running primarily Windows or Linux, and tho
Re: (Score:1)