
Apple To Power AI Tools With In-House Server Chips This Year (bloomberg.com)

Apple will deliver some of its upcoming AI features this year via data centers equipped with its own in-house processors, part of a sweeping effort to infuse its devices with AI capabilities. From a report: The company is placing high-end chips -- similar to ones it designed for the Mac -- in cloud-computing servers designed to process the most advanced AI tasks coming to Apple devices, according to people familiar with the matter. Simpler AI-related features will be processed directly on iPhones, iPads and Macs, said the people, who asked not to be identified because the plan is still under wraps.

The move is part of Apple's much-anticipated push into generative artificial intelligence -- the technology behind ChatGPT and other popular tools. The company is playing catch-up with Big Tech rivals in the area but is poised to lay out an ambitious AI strategy at its Worldwide Developers Conference on June 10. Apple's plan to use its own chips and process AI tasks in the cloud was hatched about three years ago, but the company accelerated the timeline after the AI craze -- fueled by OpenAI's ChatGPT and Google's Gemini -- forced it to move more quickly. The first AI server chips will be the M2 Ultra, which was launched last year as part of the Mac Pro and Mac Studio computers, though the company is already eyeing future versions based on the M4 chip.
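
The summary describes a two-tier architecture: lighter requests run on the device's own silicon, while the heavier generative tasks are shipped off to Apple-silicon servers. As a rough illustration of that kind of routing decision (the task names, the AIRouter type, and the endpoint URL below are invented for this sketch and do not reflect Apple's actual design), a minimal Swift version could look like this:

import Foundation

// Hypothetical sketch of an on-device vs. cloud routing decision.
// None of these names come from Apple; they exist only for illustration.

enum AITask {
    case textAutocorrect        // lightweight, latency-sensitive
    case photoSubjectLift       // moderate, still feasible on-device
    case longDocumentSummary    // heavy, better suited to server-class silicon
    case imageGeneration        // heavy, better suited to server-class silicon
}

enum ExecutionTarget {
    case onDevice               // run on the local Neural Engine / GPU
    case cloud(URL)             // send to an Apple-silicon server (made-up endpoint)
}

struct AIRouter {
    // Placeholder endpoint; not a real Apple service.
    let cloudEndpoint = URL(string: "https://example.invalid/apple-silicon-servers")!

    // "Simpler" tasks stay local, advanced ones go to the data center,
    // mirroring the split the report describes.
    func target(for task: AITask) -> ExecutionTarget {
        switch task {
        case .textAutocorrect, .photoSubjectLift:
            return .onDevice
        case .longDocumentSummary, .imageGeneration:
            return .cloud(cloudEndpoint)
        }
    }
}

// Usage: decide where a summarization request would run.
let router = AIRouter()
switch router.target(for: .longDocumentSummary) {
case .onDevice:
    print("Run locally on the device's accelerators")
case .cloud(let url):
    print("Send to server-class Apple silicon at \(url)")
}

The interesting design question is where that cutoff sits: a larger on-device model saves a network round trip and keeps data local, while server-class chips can run models that would never fit in a phone's memory.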

Comments:
  • by xack ( 5304745 ) on Thursday May 09, 2024 @01:38PM (#64460233)
    But with soldered RAM and storage.
  • What are the most advanced AI tasks?
    Asking for a friend who wants to update her resume.
  • Do we want our news filtered through the one AI? I've noticed ChatGPT giving the non-controversial opinion and getting upset when called on its faulty logic.

    -------

    We are reaching out to you as a user of OpenAI’s ChatGPT because some of the requests associated with the email ******@*** have been flagged by our systems to be in violation of our policies. Please ensure you are using ChatGPT in accordance with our Terms of Use and our Usage Guidelines, as your access may be terminated if
  • Simpler AI-related features will be processed directly on iPhones, iPads and Macs, said the people, who asked not to be identified because the plan is still under wraps.

    iPhones, iPads and Macs?

    Macs, maybe; but can you imagine iPhones and iPads in a data center, racked up like batteries in The Matrix, when Apple could easily just build a rackmount board they could stuff with about 8 Mx SoCs directly?

    Doesn't pass the smell test.

    • by MikeMo ( 521697 )
      The parent was trying to say that Apple intends to process most of these features on the user’s device without contacting a server, not use iPhones and iPads as servers.
      • The parent was trying to say that Apple intends to process most of these features on the user’s device without contacting a server, not use iPhones and iPads as servers.

        That makes a lot more sense!

        Duh, on my part!

        Thanks.

    • wow, just wow.
  • Has anyone tried training LLM(s) using Slashdot, Ars Technica, Wired, etc., articles and content? From what I understand so far, it's not the power of the AI alone that is important, but the content used to train it; for example, OpenAI trained on many sources, including 100,000 real books. But for very specific subject fields, why not just confine them to particular forums as a kind of tool?
