Apple to open its AI models to app builders

Apple on Monday announced that developers would be able to use Apple Intelligence models for their own apps, even as critics worry the company is falling further behind in AI.
Why it matters: Apple has been the slowest of the major tech companies to incorporate generative AI, with some features from last year still unreleased.
Driving the news: Apple software executive Craig Federighi kicked off the keynote, unveiling the new access to Apple's AI models for developers and outlining a series of design tweaks and software changes.
- As part of some otherwise modest enhancements to its custom emoji and image generation features, Apple announced that customers will now be able to draw on ChatGPT's capabilities as part of Apple Intelligence.
- "For moments when users have a specific idea in mind, they can tap Any Style and describe what they want," Apple said in a press release. "Image Playground sends a user's description or photo to ChatGPT and creates a unique image. Users are always in control, and nothing is shared with ChatGPT without their permission."
- Apple debuted a new, bolder "Liquid Glass" design for its software that the company said would be used across its products.
- The company is introducing a new naming scheme for its various operating systems. Each OS will carry a year number rather than a version number, e.g. iOS 26 and watchOS 26 for 2026.
- Apple also unveiled new tools to streamline group chats, screen calls, manage hold times and block spam, aiming to make everyday communication less chaotic.
- Apple is also redesigning the camera app, which has grown more complicated as features like Cinematic, Portrait, Slo-mo and Time Lapse were added. Those modes will remain, but the first choice users see will now be whether to take a standard photo or video.
- Apple is adding an option to use AI to take action based on what is being shown on the iPhone screen, such as identifying a product and finding its website. Previously the company's "visual intelligence" features only worked with what's seen by the camera.
Zoom in: The software for iPad, Mac, Apple Watch, Vision Pro and Apple TV will also get modest updates.
- The iPad will get a new system for multitasking and managing multiple windows, while macOS 26 Tahoe will add a dedicated phone app, in addition to the new Liquid Glass design.
- The Vision Pro headset will gain support for new accessories, including the controllers for Sony's PlayStation VR and a new Logitech stylus.
Zoom out: Apple's new features are available for developers to test starting today, with a public beta scheduled for next month before they arrive in final form this fall as part of free software updates.
Between the lines: The event opened with a few live remarks from CEO Tim Cook and Federighi before the prerecorded keynote began.
- Federighi also acknowledged the delay in using AI to improve Siri, promising further updates over the coming year and saying there would be other new AI features that work across the company's array of products.
- One protester briefly shouted toward the stage but was quickly escorted out.
The intrigue: Apple shares dropped on what began as a largely expected and modest set of announcements. Shares were recently trading at $201.12, down $2.80, or just over 1%.
Our thought bubble: Apple's improvements felt small compared to past years, and all the more so when compared to recent announcements from Google, OpenAI and others.
Editor's note: This event is still taking place. Check back for further updates.
