Apple's AI Research Signals Ambition To Catch Up With Big Tech Rivals (ft.com)

Apple's latest research about running large language models on smartphones offers the clearest signal yet that the iPhone maker plans to catch up with its Silicon Valley rivals in generative artificial intelligence. From a report: The paper, entitled "LLM in a Flash," offers a "solution to a current computational bottleneck," its researchers write. Its approach "paves the way for effective inference of LLMs on devices with limited memory," they said. Inference refers to how large language models, the large data repositories that power apps like ChatGPT, respond to users' queries. Chatbots and LLMs normally run in vast data centres with much greater computing power than an iPhone.
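To give a flavor of the idea the paper describes, here is a minimal, hypothetical sketch of flash offloading: keep a large weight matrix on disk (standing in for flash storage), memory-map it, and pull into RAM only the rows needed for the currently active neurons. All names, sizes, and the threshold-based sparsity rule are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: weights live on disk; only active rows are read into RAM.
import os
import tempfile
import numpy as np

rng = np.random.default_rng(0)

# Write a "large" weight matrix to disk once, as flash-resident storage.
rows, cols = 4096, 256
path = os.path.join(tempfile.mkdtemp(), "ffn_weights.npy")
np.save(path, rng.standard_normal((rows, cols)).astype(np.float32))

# Memory-map the file: the OS pages in only the rows we actually touch.
weights = np.load(path, mmap_mode="r")

def sparse_ffn(activations: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Apply only the weight rows whose activations exceed a threshold.

    With ReLU-style sparsity, most activations are near zero, so most
    weight rows never need to leave "flash" (here, the memory-mapped file).
    """
    active = np.flatnonzero(np.abs(activations) > threshold)
    loaded = weights[active]           # fancy indexing copies only active rows into RAM
    return activations[active] @ loaded

x = rng.standard_normal(rows).astype(np.float32)
y = sparse_ffn(x)
print(y.shape)  # (256,)
```

The point of the sketch is the memory trade-off: the full matrix never needs to fit in RAM, at the cost of disk reads per inference step, which is the kind of bottleneck the paper's techniques aim to manage on memory-constrained devices.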

The paper was published on December 12 but caught wider attention after Hugging Face, a popular site for AI researchers to showcase their work, highlighted it late on Wednesday. It is the second Apple paper on generative AI this month and follows earlier moves to enable image-generating models such as Stable Diffusion to run on its custom chips. Device manufacturers and chipmakers are hoping that new AI features will help revive the smartphone market, which has had its worst year in a decade, with shipments falling an estimated 5 per cent, according to Counterpoint Research.

  • by postbigbang ( 761081 ) on Thursday December 21, 2023 @10:46AM (#64096045)

    iPhone users have battery life anxiety now... wait until LLMs suck every last coulomb out of the answers to important life questions, like, Does My Butt Look Too Big In These jeans?

    • Well, the challenge for a lot of AI will be power efficiency. I expect Apple to be able to tune hardware, system software, and applications/AI to perform efficiently on their platforms. And I don't see anyone else really taking "AI at the edge" seriously. Google, Meta, Microsoft, et al. all have huge investments in clouds that will need cloud-based AI. But I do give Google credit for their computational photography and similar Android-based AI.

      And yes, your butt does look too big in those jeans.

      • Doing compute on the iPhone seems likely limited to smaller operations, I'll admit. Nonetheless, when you're trying to edge out the competition on features like picture/video enhancement, sensor refinement, etc., there might be some competitive advantages if Apple is able to control their design supply chain. Otherwise, it's like buying an EV-- how long will that thing retain a charge??

        • by dfghjk ( 711126 )

          "Otherwise, it's like buying an EV-- how long will that thing retain a charge??"
          A deep insight. Who knew battery-operated devices didn't "retain a charge" with usage?

          "...there might be some competitive advantages if Apple is able to control their design supply chain."
          You're having trouble keeping the terms you don't understand straight. Supply chains do not offer competitive advantages in "picture/video enhancement" applications.

      • Power efficiency for AI is perhaps a consideration when training in large data centers. Otherwise, it's not much of a concern. Why? Because functionality is obviously the key goal, and until that functionality is established, power efficiency doesn't matter. All key efforts right now are on creating AI models that work. In a way, AI currently has a problem similar to AVs: the systems seem to work somewhat well 99% of the time, but the remaining 1% is extremely challenging and yet important enough to preve

      • by youn ( 1516637 )

        They are doing a lot in the cloud... but both MS (phi) and Google (embedded assistant rework) are working on smaller versions that run on phones

    • by Tablizer ( 95088 )

      > Does My Butt Look Too Big In These jeans?

      Sure, but we like it! [youtube.com]

    • by dfghjk ( 711126 )

      What's most interesting is that you think your comment has any relevance whatsoever. iPhones already have AI engines and GPUs, you think they can't be used because of power consumption?

    • Is this actually true? The only time I ever really got concerned about my battery was when my previous iPhone started pushing 6 years old and the combination of lower battery capacity and more web bloat started draining it in a noticeable way. I doubt any of the people who upgrade every two years as a part of their plan have any kinds of problems. Phone battery life in general has improved a lot over the years and I really have to ask what people are doing that they have to worry about their phone battery d
      • It is not actually true, no. Android people have battery anxiety on behalf of iPhone users. MKBHD's pick for best battery life this year was the iPhone 15 Plus. All the others were also just fine.

      • My circle of friends with iPhones can't be away from something that will charge their phones for more than twenty minutes. If it's not their car, there has to be an outlet to plug it in. These are 11s, SEs, and 14s.

        I watch their anxiety, the habitual search for power for their over-app'd phones. OMG-- it's crazy to me to watch them go at it, cables in their pockets, their purses, whatever.

        I'm glad the phones are popular and their users love them; it's just that a new breed of GPU/CPU-sucking apps seems like

  • by nospam007 ( 722110 ) * on Thursday December 21, 2023 @11:57AM (#64096283)

    But I guess it will be called iA?

  • ...does this dress make me look fat?... ...no, you'd look fat in any dress...
    • by youn ( 1516637 )

      if you don't like FAT performance, you should try NTFS... or ext if you're on linux lol

  • So the likes of ChatGPT is what was required to update Siri to be capable of doing things other than setting reminders, alarms, and timers?
