On Monday, Apple unveiled a slate of artificial intelligence (AI) features, collectively dubbed Apple Intelligence, at the Worldwide Developers Conference (WWDC) 2025. During the keynote, the company recapped existing AI features and introduced new ones that are already available for testing and will roll out to consumers later this year. These include Live Translation, Workout Buddy for Apple Watch, ChatGPT integration in Visual Intelligence, enhancements to the Genmoji and Image Playground experiences, and AI capabilities for Shortcuts.
Apple Introduces Foundation Models Framework to Developers
Craig Federighi, Apple's Senior Vice President (SVP) of Software Engineering, said that third-party app developers will now have access to the company's on-device foundation models, the same AI models that underpin many of Apple Intelligence's features. The Foundation Models Framework allows developers to use these proprietary AI models to build new features inside existing applications or to create entirely new apps.
Apple emphasized that because these are on-device models, the AI capabilities will work even when the device is offline, and no user data ever leaves the device. Developers will not have to pay any application programming interface (API) fees for cloud inference. The framework natively supports Swift, letting developers access the models with minimal code, and it also offers guided generation, tool calling, and other features.
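To illustrate the kind of Swift-native access Apple described, here is a minimal sketch of prompting the on-device model. The type and method names (`FoundationModels`, `LanguageModelSession`, `respond(to:)`) follow Apple's WWDC25 presentation of the framework, but exact signatures may differ from the shipping API, so treat this as an assumption-laden illustration rather than verified code:

```swift
import FoundationModels

// Create a session with the on-device foundation model.
// All inference runs locally; no network call and no API fee.
let session = LanguageModelSession()

// Send a prompt and await the model's reply.
let response = try await session.respond(
    to: "Suggest three short titles for a note about my hiking trip."
)
print(response.content)
```

Guided generation, also mentioned by Apple, builds on this by letting developers annotate a Swift type so the model's output is decoded directly into structured data instead of free-form text.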
New Apple Intelligence Features
Federighi stated that Siri will not receive the more sophisticated AI capabilities revealed at last year's WWDC until 2026, when Apple will share more details about them. This year, however, the Cupertino-based company plans to ship several additional Apple Intelligence features.
Live Translation
Live Translation is the most significant new arrival. The AI-powered feature is built into the Messages, FaceTime, and Phone apps, allowing users to communicate easily with people who speak a different language. It runs entirely on-device, so conversations never leave the user's iPhone.
In the Messages app, Live Translation can translate messages automatically. Users can have their texts translated as they type and then send them to friends and colleagues in a language the recipient understands. Likewise, when the user receives a message in a foreign language, the feature translates it instantly.
On FaceTime calls, the feature automatically adds live captions in the user's language to help them follow along. During phone calls, Live Translation translates speech and reads the translation aloud in real time.
Visual Intelligence
Beyond Live Translation, Apple is also improving Visual Intelligence. iPhone users can now ask ChatGPT questions about what their device's camera is seeing; the OpenAI chatbot interprets the scene and its context to answer. The feature can also search Google, Etsy, and other apps for similar images and products, and users can search for a product online simply by highlighting it in the camera view.
According to Apple, Visual Intelligence can also detect when the user is looking at an event and instantly suggest adding it to their calendar.
Furthermore, by pressing the same buttons used to take a screenshot, users can now share what is on their screen with Visual Intelligence and ask questions about it.
Workout Buddy
The Apple Watch is also getting an AI capability. The new workout experience, called Workout Buddy, uses a user's workout data and fitness history to offer personalized, motivational insights as they exercise. It gathers and analyzes information such as heart rate, pace, distance, personal fitness milestones, and more.
The company's new text-to-speech (TTS) model then turns these insights into spoken output. Apple says the voices were built using data from its Fitness+ trainers to deliver the right energy, style, and intonation for a workout.
Workout Buddy will be available on Apple Watch when paired with Bluetooth headphones, and it requires a nearby iPhone that supports Apple Intelligence. At launch it will be available in English for selected workout types, including outdoor and indoor running and walking, outdoor cycling, high-intensity interval training (HIIT), and functional and traditional strength training.
Genmoji and Image Playground
Genmoji and Image Playground are also getting upgrades this year. In Genmoji, users can now mix emoji together and add a text prompt to create new variants. With both Genmoji and Image Playground, users will be able to adjust expressions and personal attributes (such as hairstyle) when creating images inspired by family and friends.
Image Playground is also being integrated with ChatGPT to offer additional image styles. Users can tap Any Style and describe what they are looking for; the description is then sent to ChatGPT, which generates the image. Users must give their consent before any data is shared with the OpenAI chatbot.
Shortcuts
The tech titan is also bringing Apple Intelligence to its Shortcuts app. Users can perform tasks such as summarizing text with Writing Tools or generating images with Image Playground, and they can create responses using both the on-device and Private Cloud Compute models to feed into the rest of their shortcuts.
In a blog post, the firm provided an example: "a student can build a shortcut that uses the Apple Intelligence model to compare an audio transcription of a class lecture to the notes they took, and add any key points they may have missed." Users can also utilize ChatGPT to offer answers that contribute to their shortcuts.
Finally, the company revealed that Apple Intelligence features will become available in eight additional languages later this year: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (Traditional), and Vietnamese.