Apple Is in Talks To Let Google's Gemini Power iPhone Generative AI Features (bloomberg.com)

Apple is in talks to build Google's Gemini AI engine into the iPhone, Bloomberg News reported Monday, citing people familiar with the situation, setting the stage for a blockbuster agreement that would shake up the AI industry. From the report: The two companies are in active negotiations to let Apple license Gemini, Google's set of generative AI models, to power some new features coming to the iPhone software this year, said the people, who asked not to be identified because the deliberations are private. Apple also recently held discussions with OpenAI and has considered using its model, according to the people.

  • by knoledgesponge ( 808547 ) on Monday March 18, 2024 @12:45AM (#64323695)
    And will they keep plans to have it run locally?
    • Doesn't take much CPU power to reply to the same inquiry with all the white shaded in various darker skin tones.
  • Which is a shame, their OSS stuff has looked sorta promising on paper and their systems handle models well
    • Which is a shame, their OSS stuff has looked sorta promising on paper and their systems handle models well

      This is pure Click-Bait.

    • by AmiMoJo ( 196126 )

      Siri has been neglected for a very long time. MKBHD does comparisons of Siri, Google Assistant, Alexa, and Samsung Bixby every now and then. Going back years, Siri has been struggling.

      On the other hand, I've been testing Google Gemini and it's still a long way from being good. Often it contradicts itself in the answer, for example. I am seeing some improvement, but it's still at the point where at most I'd use it as a starting point rather than accepting anything it says at face value.

      • by Dusanyu ( 675778 )
        Ironically, everyone complains about Siri, but at least in my area of the US it tends to be more reliable for finding things, whereas other virtual assistants like Amazon's tend to suggest places in Chicago over more local offerings.
        • Siri is remarkably good at a very few limited things. Apple has neglected it because the company still operates on its "A team" model: one team handles everything that is innovative at Apple, so if they aren't working on Vision Pro or some new CPU, they're working on the new Mac Pro, or (until recently) the car, or whatever, and have little time to make changes to Siri. Steve Jobs was famous for this management style, and it worked well when you had 6 products to update on an annual basis (MacBook, iMac, Mac Pro, iPad, iPhone). It doesn't work when you have 30 products.
          • Siri is remarkably good at a very few limited things. Apple has neglected it because the company still operates on its "A team" model: one team handles everything that is innovative at Apple, so if they aren't working on Vision Pro or some new CPU, they're working on the new Mac Pro, or (until recently) the car, or whatever, and have little time to make changes to Siri. Steve Jobs was famous for this management style, and it worked well when you had 6 products to update on an annual basis (MacBook, iMac, Mac Pro, iPad, iPhone). It doesn't work when you have 30 products.

            I just wonder when Apple will finally bite the bullet and do a ground-up rewrite of Siri. At this point, nothing else will do.

      • I've been testing Google Gemini and it's still a long way from being good. Often it contradicts itself in the answer, for example. I am seeing some improvement, but it's still at the point where at most I'd use it as a starting point rather than accepting anything it says at face value.

        "as a starting point rather than accepting anything it says at face value". This is the only way to use anything on the internet, whether it's generative AI, Wikipedia, Google search, social media, news sites, or anything else. In fact, it's the only correct way to use information from any source in life. If I hear something from my teacher, and it doesn't make sense to me, I question whether the teacher is right or wrong and then do the research and thinking to figure things out.

        The problem is not with

    • Based on other stories, there's likely nothing wrong with Apple's models, but they're meant to run on-device.

      Why couldn't Apple just scale them up and run them on cloud servers? Because then they wouldn't have arm's-length deniability when it comes to privacy.

      I like Apple products, I appreciate the privacy focus. But having Google as their default search engine because they're the highest bidder is an obvious privacy hole. My theory is they want a similar out. You'll be able to use Siri on-device for things

      • If you think apple gives a fuck about privacy one quarter of a fuck past it only being a marketing bullet point, you're a gullible fool. Apple doesn't give a fuck about your privacy.
        • I believe Apple believes in privacy enough for them to want to make it an effective marketing tool, so they care about it more than other companies, yes. I don't think they have as deep rooted a concern for privacy at an ethical or moral level as they say (though I DO also believe that Tim Cook believes in privacy more than most, because of the realities of being a gay man from the South). But it doesn't matter, they don't have to want to do it for philosophical reasons as long as they do it, and they do.

  • Racism embedded... (Score:3, Interesting)

    by strUser_Name ( 7991504 ) on Monday March 18, 2024 @12:59AM (#64323715)
    Baked into your mobile phone?
  • It makes sense for Google, as such a deployment would put them head to head with the market leader, ChatGPT, and it makes sense for Apple, as they want to be different from everybody else, so they can't run ChatGPT, which everybody else is running.

  • Please stop (Score:5, Insightful)

    by locater16 ( 2326718 ) on Monday March 18, 2024 @02:11AM (#64323771)
    LLMs suck. "A search engine for obscure programming documentation" is the only long lasting use I've ever come across for them. You can't even have them write pitches or introductory letters anymore because they all sound the same. Even if people don't toss them into the trash for being AI-written, yours blends straight into a hundred others just like it, defeating the entire purpose of such a thing. The user numbers are already dropping for ChatGPT and they're not going back up anytime soon.

    Let's go back to making cars self driving or telling people what kind of flower they just took a picture of. Large language models are not the road to whatever the hell "General Artificial Intelligence" is, and I'm getting tired of hearing about them.
    • by HBI ( 10338492 )

      The marketing people need to be punched in the face by reality before this goes away. It'll take a bit.

    • LLMs suck.

      They might somewhat suck, but they are improving all the time. GitHub Copilot is an excellent programmer's assistant. I've used it for everything from code generation and debugging to documentation, testing, and analysis. No, it's not a replacement for a programmer, yet.

      You CAN use LLMs to write pitches and letters. Just don't copy and paste. It is far better than starting from a blank document. Saves time and gives you many ideas.

      So keep your expectations in check and you won't be disappointed.

      And by the way, isn

    • by Dusanyu ( 675778 )
      Let's skip the self-driving cars while we're at it. I'd prefer safer streets for people who want to use micromobility and fewer two-ton death machines running about.
      • If we got to the point where self-driving cars can make mistakes, but make far fewer than human drivers do, would you say the technology is ready?

        Or must it reach perfection before it is usable?

    • LLMs suck. "A search engine for obscure programming documentation" is the only long lasting use I've ever come across for them. You can't even have them write pitches or introductory letters anymore because they all sound the same

      Have you considered that sanitizing natural language input so that it "all sounds the same" might be a feature in a huge number of use cases?

      To pick an example that's of relevance to Apple, folks in the home automation space have demonstrated that putting an LLM in front of requests made to Siri opens the door to more expressive and complicated requests. The LLM knows the format that requests need to be in to be understood by the assistant, so it can do the work of translating what you're saying to the more

      • It can even take a "single" request from you (e.g. "do X, Y, and Z") and convert it into the appropriate number of requests to the assistant, saving you the hassle of the back and forth.

        This alone is interesting, as you could have a pre-processor that essentially eliminates the N+1 query problem that can be present in such "natural language" queries. Using such a mechanism could actually result in far more scalability / far less cost.
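        A minimal sketch of that fan-out idea, assuming a hypothetical call_llm helper and a made-up JSON command shape (not any real Siri or Gemini API):

        ```python
        import json

        def call_llm(prompt: str) -> str:
            """Placeholder for whatever model you actually call; returns raw text."""
            raise NotImplementedError("wire up your LLM of choice here")

        def split_request(user_request: str) -> list[dict]:
            """Ask the model to break one compound utterance into discrete commands
            in a fixed JSON shape the assistant already understands."""
            prompt = (
                "Rewrite the request below as a JSON array of commands, each shaped "
                'like {"action": "...", "target": "...", "value": "..."}.\n'
                f"Request: {user_request}"
            )
            return json.loads(call_llm(prompt))

        def dispatch(commands: list[dict]) -> None:
            """Hand each structured command to the assistant, one call at a time."""
            for cmd in commands:
                print(f"assistant <- {cmd}")  # stand-in for the real hand-off

        # One utterance fans out into several assistant calls after a single LLM pass,
        # which is what sidesteps the N+1 back-and-forth described above:
        # dispatch(split_request("dim the lights, lock the door, and start the fan"))
        ```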

  • Very disappointing from Apple.

    But perhaps not surprising as Apple doesn't want the headache of policing and manipulating AI to be woke.

    • Siri has proven that Apple does not have the technical chops to buy and enhance code.
    • The rumors indicate that everything would run and stay on-device, just as with all of their current ML-based features, so what "privacy" concerns do you have? Just because the model comes from Google doesn't mean that it necessarily lives in the cloud.

  • suddenly extra shitty about Google's search strength. They're pissed that Apple isn't fawning over what M$ has to offer.

    • That's an interesting take. I thought that this was Apple hedging their bets out of desperation, but maybe it means that Google has some advantage that only Apple can see.
      • I have a feeling that advantage is that Google also wants to be able to run this on-device for their Pixel series of smartphones. Microsoft actively does not want to do that, because they want the user data.

        Don't get me wrong - Google does too. But they probably would like to have Apple's users for marketing numbers and metadata analytics that Apple likely doesn't care as much about, so they'll make that optional (for now)?

  • by oumuamua ( 6173784 ) on Monday March 18, 2024 @09:03AM (#64324651)
    And the unsaid news is huge! That a trillion dollar company would make that decision shows how hard it is to train these huge models. Then again, maybe it is just a ploy to train their own LLM while using Gemini.
    • Rumors suggest that this may be a case where theirs simply won't be ready in time for iOS 18's release. There was a report today that indicated they are spending over $1M/day on computing to train their own models, but that the models aren't expected to be ready for deployment until late in 2024. People expect Apple to have some sort of an answer before then, so that leaves them in the unenviable position of needing to license someone else's models, ship a half-baked model, or delay the release until their

  • This smacks of desperation on the parts of both Apple and Google. It's like the early days of the iPhone when Google and Apple paired up to shut Microsoft out of the emerging smartphone market, and they succeeded. This time, however, it seems as if both of them are desperate to counter Microsoft by learning and playing catch-up in AI together. It definitely doesn't sound as if Apple is entering this partnership because Google has more to offer than OpenAI. It feels more like a hedging move: let me help the laggard in exchange for cheap rates down the line; if it fails, I go with the big boys anyway.
    • It feels more like a hedging move: let me help the laggard in exchange for cheap rates down the line; if it fails, I go with the big boys anyway.

      One question: why is that a problem? Apple users would be getting an enhanced experience by Siri not being so unbelievably shitty from complete neglect. Google gets a fast infusion of cash with which to rapidly iterate on their product, and motivation to do so beyond what they're already working with. We all get more viable competition, forcing even Microsoft / OpenAI to step their game up too.

      Sounds like exactly what we need to prevent monopolistic market domination, just the same as what happened with
