
Apple Could Use ARM Coprocessors for Three Updated Mac Models (techcrunch.com)

According to a Bloomberg report, Apple could be working on three new Mac models for this year. From a report: All three of them could feature an ARM coprocessor to improve security. Apple isn't switching to ARM chipsets altogether; there will still be an Intel CPU in every Mac, but paired with a second ARM processor. Currently, the MacBook Pro features a T1 chip while the iMac Pro features a T2 chip. On the MacBook Pro, the ARM coprocessor handles the Touch ID sensor and the Touch Bar. This way, your fingerprint is never stored on your laptop's SSD -- it stays in the T1's secure enclave. The Intel CPU only receives a pass/fail response when a fingerprint is validated. The iMac Pro goes one step further and uses the T2 to replace many discrete controllers: the T2 controls the stereo speakers, the internal microphone, the fans, the camera, and internal storage.
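As a rough illustration of that split, here is a minimal C sketch; the names are invented and the "enclave" is simulated in-process rather than on a real coprocessor. The point is that the stored template is private to the enclave side, and the only thing that ever crosses to the host is a single pass/fail bit.

    /* Hypothetical sketch of the host/secure-enclave split described above.
     * The "enclave" is simulated in-process: the stored fingerprint template
     * is private to this compilation unit, and the host side only ever sees
     * a single yes/no result. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* --- "enclave" side: stands in for firmware on the coprocessor --- */
    static const uint8_t stored_template[4] = {0xde, 0xad, 0xbe, 0xef};

    static bool enclave_verify(const uint8_t reading[4]) {
        uint8_t diff = 0;
        for (size_t i = 0; i < 4; i++)
            diff |= reading[i] ^ stored_template[i]; /* constant-time compare */
        return diff == 0;
    }

    /* --- host side: all the main CPU ever learns is pass/fail --- */
    int main(void) {
        const uint8_t sensor_reading[4] = {0xde, 0xad, 0xbe, 0xef}; /* simulated scan */
        puts(enclave_verify(sensor_reading) ? "fingerprint accepted" : "fingerprint rejected");
        return 0;
    }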
  • by Anonymous Coward

    At first glance this appears to be a whole new way to attack the machine...

    • Re:Attack Surface? (Score:4, Informative)

      by jellomizer ( 103300 ) on Monday January 29, 2018 @10:59AM (#56026867)

      Centralized devices are the bane of security. Decentralized components that do one thing and do it well help security by making each process easy to code and manage without conflicting with other actions. If your fingerprint scan has to be handled by the main CPU, that means your fingerprint data goes over the main CPU bus, where it is potentially visible to other applications and attacks. Compare that with what is, in essence, a little computer inside the computer that does the work and sends back a good-or-bad bit, separate from whatever the rest of the machine is doing. All the intermediate data in that processing is inaccessible from the rest of the computer. In general this makes things much safer.

      • Centralized devices are the bane of security. Decentralized components that do one thing and do it well help security by making each process easy to code and manage without conflicting with other actions. If your fingerprint scan has to be handled by the main CPU, that means your fingerprint data goes over the main CPU bus, where it is potentially visible to other applications and attacks. Compare that with what is, in essence, a little computer inside the computer that does the work and sends back a good-or-bad bit, separate from whatever the rest of the machine is doing. All the intermediate data in that processing is inaccessible from the rest of the computer. In general this makes things much safer.

        Exactly!

        Amazing that Slashdotters can't see that, but it is most likely their inherent anti-Apple bias.

    • At first glance this appears to be a whole new way to attack the machine...

      At second glance this appears to be removing many ways to attack the machine.

      By moving access control and security to a separate, dedicated chip with its own memory, rather than running it on the general-purpose CPU that also runs random apps and webpages, this should make exploits more difficult.

  • Only a matter of time before the Intel chips disappear completely.

    • Yeah. Hundreds or maybe thousands of years, but still, just a matter of time.

      • by shmlco ( 594907 )

        Perhaps I should have said, "Only a matter of time before the Intel chips disappear completely from new Mac models."

        For those unable to read context.

  • So storage will be cut down to one PCIe x4 bus shared with coprocessor traffic. Nice way to cut the power of the new Mac Pro (at least it has the lanes to avoid stuffing storage on the DMI bus, which also carries the network and all other I/O).

    For the machines with fewer PCIe lanes, they should split the lanes like this (see the sketch below):

    x16 from the CPU to a switch, then from the switch: x8 to the video card, x4 to Thunderbolt (one bus), and x4 to storage plus the coprocessor,
    with DMI for all other I/O.

    The Mac Pro should get at least one open PCIe x16 slot, an x16 video card, at least two Thunderbolt buses, and maybe two x4 storage links (one non-boot and one x4 boot).
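    As a back-of-the-envelope check on that lane budget, here it is as a small C table; the device names and lane counts are the commenter's hypothetical layout, not a real Apple design.

        /* Hypothetical PCIe lane budget from the comment above: an x16 uplink
         * from the CPU to a switch, fanned out downstream, with everything
         * else hanging off DMI. Not a real Apple layout. */
        #include <stdio.h>

        struct alloc { const char *device; int lanes; };

        int main(void) {
            const int uplink = 16; /* x16 CPU-to-switch */
            const struct alloc downstream[] = {
                {"video card",            8},
                {"Thunderbolt (one bus)", 4},
                {"storage + coprocessor", 4},
            };
            int used = 0;
            for (size_t i = 0; i < sizeof downstream / sizeof downstream[0]; i++) {
                printf("%-24s x%d\n", downstream[i].device, downstream[i].lanes);
                used += downstream[i].lanes;
            }
            printf("downstream total x%d of x%d uplink; all other I/O on DMI\n",
                   used, uplink);
            return 0;
        }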

  • by peppepz ( 1311345 ) on Monday January 29, 2018 @10:43AM (#56026745)
    PC manufacturers have been using these for a long time. They started out as 8-bit MCUs with a built-in ROM and have been getting more and more powerful with time.
    • by jellomizer ( 103300 ) on Monday January 29, 2018 @11:09AM (#56026927)

      For the most part it is. For some crazy reason we moved to integrated systems back in the 1990s. I think it was because the OS started to support software drivers, so devices could be made much more cheaply: things like controller boards, or support for an open protocol, could be skipped. A modem is just a D-to-A and A-to-D converter, which could have been made cheaply; the expensive part was the Hayes AT command processing, which pushed the cost way up. But if you have the driver handle that, you can release a cheap modem (which could probably double as a sound card).
      This came at a cost in security, though. Integrated means your OS sees all that is going on, and any security flaw can affect everything.

      Today security is getting more attention, and components are getting cheaper and smaller too, so it seems we are going back to this method. Perhaps we may get to a point where these things are on removable sockets again, so we can upgrade and repair them.
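      As a toy illustration of that Hayes AT command processing -- the firmware logic that softmodems later pushed into the host driver -- here is a few-command parser in C. The command subset and canned responses are simplified for illustration.

          /* Toy slice of a Hayes AT command interpreter: the kind of
           * firmware logic that made classic modems expensive, and that
           * softmodems moved into the host driver. Illustrative only. */
          #include <stdio.h>
          #include <string.h>

          static void handle_at(const char *cmd) {
              if (strcmp(cmd, "AT") == 0)
                  puts("OK");                        /* attention / liveness check */
              else if (strncmp(cmd, "ATDT", 4) == 0)
                  printf("DIALING %s...\nCONNECT 14400\n", cmd + 4); /* tone dial */
              else if (strcmp(cmd, "ATH") == 0)
                  puts("OK");                        /* hang up */
              else
                  puts("ERROR");                     /* unrecognized command */
          }

          int main(void) {
              handle_at("AT");
              handle_at("ATDT5551234");
              handle_at("ATH");
              return 0;
          }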

      • by AmiMoJo ( 196126 )

        ARM embedded controllers with a huge number of peripherals are extremely cheap now, so they get thrown into all sorts of things. In a way it's good: they offload work from the main CPU. In another way it's bad, because they rarely even consider security in the design.

        Sadly, I doubt sockets will be coming back. They are expensive. Back in the day they made more sense because you might need to issue a firmware update, which meant replacing chips. That and parts were unreliable or had to be matched during m…

        • by elrous0 ( 869638 )

          Hey AmiMoJo, remember how I adopted the sig "The one straight white male in new Star Trek will be portrayed as evil or incompetent" back before Star Trek: Discovery premiered? You know, because he was the only straight white male on an SJW show, and so I knew that he would ultimately have to be revealed as either evil or incompetent -- because SJWs, as much as they would deny it, really HATE straight white males.

          Remember how an enlightened SJW like yourself corrected my foolish, misinformed view back in Octobe…

  • Remember the NuBus DOS card you could get to run DOS at 'native' speeds?

    That said, I welcome it and other similar endeavors. I wish I could buy a more 'modular' desktop for exploratory development. For my work I'd rather have a boatload of ARM cores or FPGA devices on an x16 PCIe link than a video card.

  • Apple could kill Meltdown and still have perfect x86 compatibility by just using AMD. I am not necessarily saying they shouldn't have the ARM coprocessor, just that using AMD instead of Intel would increase security drastically -- also because AMD doesn't have the Management Engine. They have something equivalent, but it doesn't have a full IP stack and other "niceties" like that.

    • Apple could kill Meltdown and still have perfect x86 compatibility by just using AMD. I am not necessarily saying they shouldn't have the ARM coprocessor, just that using AMD instead of Intel would increase security drastically -- also because AMD doesn't have the Management Engine. They have something equivalent, but it doesn't have a full IP stack and other "niceties" like that.

      It would take the better part of a year for Apple to "qualify" macOS for AMD CPUs.

      • by Anonymous Coward

        Apple probably already has macOS "qualified" for AMD CPUs. If you remember back when Apple shocked the world by announcing the switch from PowerPC CPUs to Intel, they had been running various flavors of Mac OS on Intel for years in their development skunkworks. They never stand still and, like all good companies, continually plan for multiple contingencies.

  • by Anonymous Coward on Monday January 29, 2018 @11:23AM (#56027017)

    I highly suspect that this change will make hardware compatibility with off-the-shelf components a thing of the past for Apple (again).

    This means no more Hackintoshes, should a future macOS require this chip to be present.

  • So how exactly is this different from the SMC (System Management Controller, for those who don't know)? AFAIK the SMC already does these tasks.

    Sounds like they're just replacing whatever the SMC used to be (I'm assuming an FPGA of some sort) with an ARM CPU?

  • On first glance I immediately dismissed the "security" bit in the preview as click-bait... The interesting idea, at least to me, is having a low-power ARM chip act like a hybrid southbridge that serves as the CPU for simple web browsing or media playback [I think these may already exist in some form]. Unfortunately, while there is a brief mention of the ARM chip handling sound, there was not much other detail. I would guess the difficulty is in how to seamlessly transfer data and processing to the…
  • by MobyDisk ( 75490 ) on Monday January 29, 2018 @01:08PM (#56027863) Homepage

    Every PC has dozens of microprocessors, so adding an ARM chip to a computer is no big paradigm shift. A typical PC has a SATA controller, USB controller, video card, etc. One of the big things Intel has been good at over the years is integrating more features onto a single die. Around 2003 they started bundling wireless with the CPU platform ("Centrino"), followed by integrated video. I forget when the memory controller got integrated.

    • by mjwx ( 966435 )

      Every PC has dozens of microprocessors, so adding an ARM chip to a computer is no big paradigm shift. A typical PC has a SATA controller, USB controller, video card, etc. One of the big things Intel has been good at over the years is integrating more features onto a single die. Around 2003 they started bundling wireless with the CPU platform ("Centrino"), followed by integrated video. I forget when the memory controller got integrated.

      Few computers have multiple general-purpose CPUs of different architectures. A SATA controller, a GPU, or even a northbridge or southbridge is nothing like a CPU, because they have different purposes. An ARM and an Intel (or AMD) CPU are built to do the same thing but are fundamentally incompatible (you can't even get Intel and AMD CPUs to work together well).

      Given that either of those processors is capable of handling modern OSes without any trouble, there's no benefit to increasing complexity to hand off di…

  • by Applehu Akbar ( 2968043 ) on Monday January 29, 2018 @01:28PM (#56027995)

    Watch for a forthcoming OS that will run macOS apps and iOS apps simultaneously, with a touchscreen on at least the laptop models. At first such a machine will primarily be for developers, replacing the iOS Simulator that is now part of Xcode, but we may then see the long-awaited convergence of laptops and tablets.

    • Personally, I never asked for my computer and cell phone to be the same. I am very comfortable with them being different tools for different jobs. I am fine with them being optimized differently so that each can do its job as effectively as possible.

      I think of all the nonsense that has hit macOS in recent years in the effort to make it more iOS-like. I do not like how they removed management controls from iTunes. I LIKED having more robust photo options. Almost everything they have added in to m…
  • by Hizonner ( 38491 ) on Monday January 29, 2018 @02:24PM (#56028413)

    Reasonably intelligent person: Hey, this fingerprint stuff is sensitive. Let's isolate it in separate hardware!

    Non-stupid detail person: ... and since it's specialized hardware and has information we want to control let's lock it down and have it only run code we've signed!

    Well-meaning idiot: ... and since it only runs our code, let's make it More Secure by having zero transparency!

    Fucking worthless moron: ... and since it's More Secure, let's put it in control of more stuff! And add more software! And funnel everything through it! Let's have it run the keyboard! And the camera! And the disk!

    (Intel): ... and let's give it direct network access, too!

    Hacker: Pwnt!

    This pattern happens over and over again at company after company. People build these "secure" enclaves to isolate things, and then as soon as they have them they blow that isolation by shoveling in every damned thing they can think of so everything can be "more secure". And since it's in charge of everything, it has to have control of everything. And then it gets cracked.

    THAT'S NOT HOW IT'S SUPPOSED TO FUCKING WORK! If you have a sensitive function, you put it in its OWN FUCKING COMPARTMENT. And you give it no more privilege than it needs to do that one thing. You don't dump a shit-ton of unrelated software into a coprocessor that's trusted for everything (and, by the way, is usually pretty much invisible to the OS).

    Morons.
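    As a sketch of the compartmentalized design this comment argues for -- with an invented message format, not any real firmware interface -- the coprocessor answers exactly one command with a single bit and refuses everything else, rather than accreting keyboard, camera, and disk duties:

        /* Sketch of a minimal trusted command surface: one supported
         * operation, one bit out, everything else refused. The message
         * format is invented for illustration. */
        #include <stdbool.h>
        #include <stdint.h>
        #include <stdio.h>

        enum { CMD_VERIFY_FINGERPRINT = 0x01 }; /* the ONLY supported command */

        struct message { uint8_t cmd; uint8_t payload[16]; };

        static bool match_against_stored_template(const uint8_t *p) {
            (void)p;      /* stand-in for the real matcher */
            return false;
        }

        /* Dispatcher: unknown commands are refused, never "extended". */
        static uint8_t handle_message(const struct message *m) {
            if (m->cmd == CMD_VERIFY_FINGERPRINT)
                return match_against_stored_template(m->payload) ? 1 : 0;
            return 0xFF; /* refuse: keep the trusted surface tiny */
        }

        int main(void) {
            struct message good = { .cmd = CMD_VERIFY_FINGERPRINT };
            struct message bad  = { .cmd = 0x42 }; /* e.g. "also run the camera" */
            printf("verify -> %d\n", handle_message(&good));
            printf("other  -> 0x%X\n", (unsigned)handle_message(&bad));
            return 0;
        }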

  • After Intel ME flaws, Apple ME flaws?
