
Apple's AI Plans Include 'Black Box' For Cloud Data (appleinsider.com)

How will Apple protect user data while their requests are being processed by AI in applications like Siri?

Long-time Slashdot reader AmiMoJo shared this report from AppleInsider: According to The Information's sources [four different former Apple employees who worked on the project], Apple intends to process data from AI applications inside a virtual black box.

The concept, known internally as "Apple Chips in Data Centers," would involve using only Apple's own hardware to perform AI processing in the cloud. The idea is that Apple will control both the hardware and the software on its servers, enabling it to design more secure systems. While on-device AI processing is highly private, the initiative could make cloud processing for Apple customers similarly secure... By taking control over how data is processed in the cloud, Apple would find it easier to implement safeguards that make a breach much harder to pull off.

Furthermore, the black box approach would also prevent Apple itself from being able to see the data. As a byproduct, this means it would also be difficult for Apple to hand over any personal data in response to government or law enforcement data requests.

Processed data from the servers would be stored in Apple's "Secure Enclave" (where the iPhone stores biometric data, encryption keys and passwords), according to the article.

"Doing so means the data can't be seen by other elements of the system, nor Apple itself."


  • If requests are processed in a "black box", then that sounds like they may be anonymized before being processed. If so, those responses may not be as personalized as responses from the competition, and thus the experience may seem less engaging. In some ways, it's admirable that Apple appears to care about privacy, but given the trends of oversharing on social media, it seems like most people are perfectly happy to trade their privacy for even a minor boost in convenience or clout. It'll be interestin
    • Yeah, this is the thing. I like the privacy-first approach in theory, but - as we've seen with Siri - there's no evidence Apple's team has the skills necessary to produce a quality product while following it.

      • Yeah, this is the thing. I like the privacy-first approach in theory, but - as we've seen with Siri - there's no evidence Apple's team has the skills necessary to produce a quality product while following it.

        Apple didn't write Siri; they bought it. Unfortunately, it simply wasn't designed for extensibility.

        And they are every bit as frustrated by Siri as everyone else is!

        That's why Apple is finally redesigning/rebuilding Siri from the ground up...

        • Citation required.
          You have been spouting this nonsense for years.
          Apple had 10 years to make Siri extensible; only Apple's inability to write quality code stopped them.
    • by AmiMoJo ( 196126 )

      Anonymization doesn't work.

      This is a failure. It's bad for privacy, and it means that Apple can't do the processing on-device like Google does (see the sketch after this comment). Apple was late to the AI game and seems to be several years behind; Google started doing on-device processing of this stuff (voice recognition, image recognition and editing, etc.) back in the Pixel 6 days, when it introduced its first custom Tensor chip.

      Even if you ignore the privacy issues, having to send data to the cloud and back means latency will be higher. Google's
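      For what it's worth, on-device inference of the kind described here is already exposed in Apple's own SDKs. A minimal sketch using the built-in Vision classifier, where nothing leaves the machine and there is no cloud round-trip ("photo.jpg" is a hypothetical local file):

          import Foundation
          import Vision

          // Classify a local image entirely on-device with the built-in
          // Vision model; no network access is involved.
          do {
              let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "photo.jpg"))
              let request = VNClassifyImageRequest()
              try handler.perform([request])
              for observation in request.results ?? [] {
                  print(observation.identifier, observation.confidence)
              }
          } catch {
              print("classification failed:", error)
          }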

    • Practically speaking, Apple can do whatever they want with their AI cloud systems; no outside developer is going to use them. Who would lock themselves into Apple's services when they could easily use other systems that support Android and the web? Plus, Apple's release cycles are extremely slow: Apple itself won't be launching any meaningful AI features until September 2025! Who would agree to operate on that timeline when competition is this fierce? Apple's institutionally misaligned with the world of AI right now, so it
  • ...to avoid Apple
    The cloud is not a good thing.

    • -IF- you expect to use AI, where/how do you expect your data to be stored? How would Apple be any different from any other company that is considering things like large language model training? I know "hate for Apple" is A Thing on Slashdot, but in this case it seems to me that you could provide at least a little justification for this particular hate.

      • where/how do you expect your data to be stored?

        As other posters have said, on my device. There are even some small-scale LLMs that work on a single GPU in a bog-standard x86 white-box machine if you need something more complicated (see the sketch below). There's no reason for it to be cloud-only, beyond that "black box" really meaning a "black box to everyone but Apple," as per the standard with Apple devices in general.
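        To make that concrete, a sketch of driving a locally built llama.cpp binary from Swift; the binary and model paths are hypothetical, and -m, -p and -n are llama.cpp's model, prompt and token-limit flags:

            import Foundation

            // Run a quantized local model through llama.cpp; both paths are
            // hypothetical and depend on where you built/downloaded things.
            let process = Process()
            process.executableURL = URL(fileURLWithPath: "/usr/local/bin/llama-cli")
            process.arguments = [
                "-m", "/models/small-chat.gguf",    // local quantized model
                "-p", "Summarize my notes from today.",
                "-n", "64"                          // cap generated tokens
            ]
            let pipe = Pipe()
            process.standardOutput = pipe
            do {
                try process.run()
                process.waitUntilExit()
                let output = pipe.fileHandleForReading.readDataToEndOfFile()
                print(String(decoding: output, as: UTF8.self))
            } catch {
                print("failed to launch llama-cli:", error)
            }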

  • A Supreme Court decision on training data is still years out. Even if they could give OpenAI enough money to run OpenAI's models on Apple's own hardware, at that point they would be just as liable as if they let OpenAI do it.

    For the smaller, less capable models, they can train on public-domain and licensed content and run them on their own cloud, but the encryption is mostly smoke and mirrors. Yes, in theory the server hardware could create a public/secret key combo and then export the public key so you can't MitM c

    • by HiThere ( 15173 )

      I'm not really convinced that you *can't* train the AI on smaller datasets, and I rather think you must. Smaller networks can be trained faster and are easier to validate (i.e., it's easier to ensure that the data used to train the network is the right data). They *are* less capable, so you need a network of networks, and you've got to figure out how to train *that*. This probably repeats for several layers. I think people usually claim the brain uses seven layers, but I'm not sure that's a close an

  • ... a virtual black box.

    It's been said and needs to be said again: Anonymization doesn't work.

    Bayesian statistics means that, given enough time, everything can be tracked.

    ... taking control over how data ...

    If data doesn't go from the NIC directly to the NPU, then somewhere in the computer, it is plain-text and vulnerable to copying.

    Of course, the NPU will be doing private-layer-inside-public-layer encryption to guarantee data can be decrypted only by a 'known' end-point: that requires a lot of speed (see the sketch after this comment).

    ... prevent Apple itself ...

    It's bad that phone applets are always connected (to the inter
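    A minimal sketch of that "known end-point" property, using Apple's CryptoKit: the client encrypts a request so that only the holder of one specific server private key can open it. The key names are illustrative, and Apple's actual wire protocol is not public:

        import CryptoKit
        import Foundation

        do {
            // The server keeps this key in its enclave; only the public half is published.
            let serverKey = P256.KeyAgreement.PrivateKey()
            // The client uses a fresh ephemeral key per request.
            let clientKey = P256.KeyAgreement.PrivateKey()

            // Client: derive a session key from the server's public key and encrypt.
            let clientSecret = try clientKey.sharedSecretFromKeyAgreement(with: serverKey.publicKey)
            let sessionKey = clientSecret.hkdfDerivedSymmetricKey(
                using: SHA256.self, salt: Data(),
                sharedInfo: Data("ai-request".utf8), outputByteCount: 32)
            let sealed = try AES.GCM.seal(Data("Hey Siri, ...".utf8), using: sessionKey)

            // Server: only the matching private key derives the same session key.
            let serverSecret = try serverKey.sharedSecretFromKeyAgreement(with: clientKey.publicKey)
            let serverSessionKey = serverSecret.hkdfDerivedSymmetricKey(
                using: SHA256.self, salt: Data(),
                sharedInfo: Data("ai-request".utf8), outputByteCount: 32)
            let plaintext = try AES.GCM.open(sealed, using: serverSessionKey)
            print(String(decoding: plaintext, as: UTF8.self))
        } catch {
            print("key agreement failed:", error)
        }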

  • Remember The Box? The fact that it was an example of what not to do didn't change the fact that it could sell.

