

Apple Lets Developers Tap Into Its Offline AI Models (techcrunch.com) 14
An anonymous reader quotes a report from TechCrunch: Apple is launching what it calls the Foundation Models framework, which the company says will let developers tap into its AI models in an offline, on-device fashion. Onstage at WWDC 2025 on Monday, Apple VP of software engineering Craig Federighi said that the Foundation Models framework will let apps use on-device AI models created by Apple to drive experiences. These models ship as a part of Apple Intelligence, Apple's family of models that power a number of iOS features and capabilities.
"For example, if you're getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging," Federighi said. "And because it happens using on-device models, this happens without cloud API costs [] We couldn't be more excited about how developers can build on Apple intelligence to bring you new experiences that are smart, available when you're offline, and that protect your privacy."
In a blog post, Apple says that the Foundation Models framework has native support for Swift, Apple's programming language for building apps for its various platforms. The company claims developers can access Apple Intelligence models with as few as three lines of code. Guided generation, tool calling, and more are all built into the Foundation Models framework, according to Apple. Automattic is already using the framework in its Day One journaling app, Apple says, while mapping app AllTrails is tapping the framework to recommend different hiking routes.
"For example, if you're getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging," Federighi said. "And because it happens using on-device models, this happens without cloud API costs [] We couldn't be more excited about how developers can build on Apple intelligence to bring you new experiences that are smart, available when you're offline, and that protect your privacy."
In a blog post, Apple says that the Foundation Models framework has native support for Swift, Apple's programming language for building apps for its various platforms. The company claims developers can access Apple Intelligence models with as few as three lines of code. Guided generation, tool calling, and more are all built into the Foundation Models framework, according to Apple. Automattic is already using the framework in its Day One journaling app, Apple says, while mapping app AllTrails is tapping the framework to recommend different hiking routes.
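Based on Apple's description of the framework (Swift-native, a response in a few lines of code, guided generation built in), basic usage looks roughly like the sketch below. The type and method names follow what Apple has shown publicly, but treat the exact signatures as approximate, and the quiz use case is only an illustration of the Kahoot example above.

import FoundationModels

// Guided generation: ask the on-device model for a typed value instead of
// free-form text. @Generable is the macro Apple describes for this.
@Generable
struct QuizQuestion {
    var question: String
    var answer: String
}

// Roughly the "three lines of code" case: open a session against the
// on-device Apple Intelligence model and request a plain-text response.
func summarize(_ notes: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize these notes: \(notes)")
    return response.content
}

// Guided generation: the framework constrains the output to the struct above,
// so the app gets structured data back rather than text it has to parse.
func makeQuizQuestion(from notes: String) async throws -> QuizQuestion {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write one quiz question from these notes: \(notes)",
        generating: QuizQuestion.self
    )
    return response.content
}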
This is how it should be (Score:4, Interesting)
The future of AI should be local models.
Sure, you need big compute to train the models, but to run them, not so much, especially when run with hardware designed for it.
Re: (Score:2)
Google announced roughly the same thing, on-device models for phones, a couple weeks ago at their developer conference. The 1B model is fine for basic tasks like turning on lights, checking email, social media notifications, etc., and runs OK on midrange phone hardware. The 4B model technically runs, but at borderline unusable speed; still, it can answer questions like "how does a microwave work?" with moderate accuracy at a semi-scientific level, which is impressive. I suspect most devices will be able to run a 1B
Re: (Score:2)
On a Pixel 8, a 4B model (with ChatterUI) gets acceptable tokens per second. If you want it to write you a full article about a topic, it is too slow, but a 1B model is too stupid. If you want it to answer a shorter question, it works fine. Now the question is where the trade-off sits for grammar-checking longer texts: do you have the time and battery for a good model, or is the small model good enough?
Re: (Score:2)
My guess is phones and silicon will improve enough to make 4B models practical on mobile by 2030. If not, those requests will get forwarded to xyz cloud service. I can see a world where 7-12B cloud models are the ad-supported free tier and you either pay or self-host 70-600B yourself. I expect processing requirements to drop by half due to whatever breakthrough comes next, and then there's a long tail of improvement after that. Token verification was a major improvement.
Re: (Score:2)
By 2030 you will probably see models in the 30B range on good phones.
The interesting question is what those 30B models will be able to do. In the last few years the capabilities of small models have increased a lot; the question is how much more they will be able to increase. You won't get too much knowledge into them, it just doesn't fit. But if the model is really clever, it is still helpful, and you can add knowledge by letting it access external sources.
And I don't think you will see many 12B cloud models anymore. Cha
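The "clever small model plus external sources" point maps onto a simple retrieve-then-prompt pattern. A minimal sketch, with entirely made-up placeholder names (no Apple or Google API implied): facts come from a lookup step and get spliced into the prompt, so the model itself can stay small.

protocol SmallLanguageModel {
    func complete(_ prompt: String) async throws -> String
}

func searchNotes(_ query: String) async -> [String] {
    // Stand-in for a real retrieval step (local index, web search, or a tool call).
    return ["Photosynthesis converts light energy into chemical energy."]
}

func answerWithRetrieval(question: String, using model: SmallLanguageModel) async throws -> String {
    // Fetch facts first, then hand them to the small model so the knowledge
    // doesn't have to fit into its weights.
    let facts = await searchNotes(question).joined(separator: "\n")
    let prompt = """
    Answer using only these notes.
    Notes:
    \(facts)
    Question: \(question)
    """
    return try await model.complete(prompt)
}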
Re: This is how it should be (Score:2)
I think it is a signal to the believers that Apple is worth it. They want tomorrow's business from you. Imagine how lucrative privacy will be
If you're a lawyer, that has tons of implications.
There is gonna be a big fight, because The King won't b
Re: (Score:2)
The question is how liable Apple is for their software. Run locally, it is not a service they provide. If your mail client deletes your mail due to a bug, is Apple liable? If yes, they also have a problem if their model writes some illegal text. If they are not liable for the mail client malfunctioning, they are also not liable for the local chatbot, which especially means they can use lower guardrails. They won't drop them all, as otherwise the news would be full of "iPhones can write porn!", but a local model is not clever
Re: (Score:2)
Good luck with a drug submarine described by an LLM that can run on a personal device.
But we already have plenty of fearmongering, ironically much of it from the AI company Anthropic. My impression is that they aim to say everything is dangerous, but that the guardrails on their cloud models are safe. I wonder if they will regret this when the government decides to regulate all AI because they said it might become Skynet.
Why do regular users even need AI ?? (Score:2)
Re: (Score:2)
A lot of hype. I can do anything I want now w/o assets or Siri. Like an internet-connected washer/dryer that I don't need.
You're looking at it wrong. (/Apple/Tech companies in general).
Tech companies aren't looking to provide users with something they want or need. They're looking to create something, force-feed it to us if necessary, even through our jobs if that's what it takes, then slowly subsume the human experience into their technology. It's already happened a bit with social media. You can't go to a concert without seeing a sea of cellphones held up, with most people watching the little screen in front of them rathe
More AI Bukkake (Score:2)
Decouple features from models (Score:2)
Instead of the APIs only allowing apps to use Apple models, the models should be interchangeable. Allow apps to offer models which other apps - and Apple AI features - will use. On-device is OK sometimes, but it's not the only private option. Let me use my home ollama rig. Or let me decide how much privacy matters for a given use case and choose my own provider.
This is just more anticompetitive Apple lock-in under the guise of privacy.
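A rough sketch of the kind of decoupling this comment asks for: features code against a provider protocol and the backend is swappable, for instance a home Ollama server over HTTP. The protocol and types here are invented for illustration (only the Ollama /api/generate request shape comes from Ollama's documented API); nothing like this is currently exposed by Apple.

import Foundation

// Invented abstraction: features depend on a protocol, not on one vendor's model.
protocol TextModelProvider {
    func generate(prompt: String) async throws -> String
}

// Illustrative backend that talks to a self-hosted Ollama server
// using a non-streaming call to its /api/generate endpoint.
struct OllamaProvider: TextModelProvider {
    var baseURL = URL(string: "http://localhost:11434")!
    var model = "llama3"

    private struct GenerateRequest: Encodable {
        let model: String
        let prompt: String
        let stream: Bool
    }
    private struct GenerateResponse: Decodable {
        let response: String
    }

    func generate(prompt: String) async throws -> String {
        var request = URLRequest(url: baseURL.appendingPathComponent("api/generate"))
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(
            GenerateRequest(model: model, prompt: prompt, stream: false))
        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONDecoder().decode(GenerateResponse.self, from: data).response
    }
}

// An on-device backend could wrap Apple's framework behind the same protocol
// (e.g. a LanguageModelSession inside another TextModelProvider), letting the
// app or the user pick the provider per use case.

Whether Apple would ever let its own system features route through a third-party provider like this is exactly the lock-in question the comment raises.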