Apple Activates 'Apple Intelligence' — Live Translation, Onscreen AI, and a Watch Coach Arrive
What’s rolling out today
Apple is turning on a broad set of ‘Apple Intelligence’ features across iPhone, iPad, Mac, Apple Watch, and Vision Pro. Headline features include Live Translation in Messages, FaceTime, and the Phone app; new visual intelligence that can understand and act on what’s on your screen; and creative upgrades to Genmoji and Image Playground.
Live translation across the system
Live Translation aims to make cross-language communication feel seamless. A friend texts in Spanish and your reply can auto-translate as you type. FaceTime captions translate on the fly. Incoming voice calls can be translated and spoken aloud so you don’t need to switch apps or copy text between tools. Apple says much of this happens with on-device processing to protect privacy, and that support for more languages will arrive before the end of the year.
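For developers, the same translation machinery is reachable through Apple’s Translation framework. Here is a minimal SwiftUI sketch using the iOS 18-era translationTask API rather than whatever new surface Live Translation itself adds; the view name and message text are illustrative:

```swift
import SwiftUI
import Translation  // Apple's system translation framework

struct IncomingMessageView: View {
    let messageText: String                  // e.g. a friend's Spanish text
    @State private var translated: String?

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(messageText)
            if let translated {
                Text(translated)
                    .foregroundStyle(.secondary)
            }
        }
        // Leaving source/target nil lets the system detect the incoming
        // language and translate into the user's preferred language.
        .translationTask(source: nil, target: nil) { session in
            do {
                let response = try await session.translate(messageText)
                translated = response.targetText
            } catch {
                translated = nil  // e.g. the language pack isn't downloaded yet
            }
        }
    }
}
```

Translation runs with downloaded on-device language models, which lines up with the privacy pitch above: the text being translated doesn’t have to leave the device.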
Onscreen ‘visual intelligence’ and integrations
The new visual intelligence features let the system recognize what’s on your screen and offer relevant actions: sending a query to search partners like Google, or opening shopping apps such as Etsy, directly from an item you see. Those handoffs and defaults could shape where commerce queries flow.
System updates and design changes
The rollout ships with this year’s OS wave: iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and visionOS 26. Alongside the intelligence features, Apple is shipping a refreshed ‘Liquid Glass’ design and a number of quality-of-life tweaks.
Which devices get it and regional limits
Apple lists supported hardware as the iPhone 15 Pro models and later, including the iPhone 16 family; M-series iPads and Macs, plus the A17 Pro iPad mini; Vision Pro; and Apple Watch Series 6 or later when paired with a supported iPhone. There are regional wrinkles: notably, Live Translation on AirPods will not work in the EU at launch.
Privacy, Private Cloud Compute, and how cloud AI is handled
For complex requests, Apple may route processing to Private Cloud Compute, a server-side environment built on Apple silicon. Apple describes it as running signed, inspectable software images and discarding user data once a request is fulfilled. The approach is pitched as a way to offer cloud-level capabilities while keeping user data private, and it amounts to a bet, both technical and political, on private cloud AI.
WatchOS 26 and Workout Buddy
If you use an Apple Watch, watchOS 26 introduces Workout Buddy, a coaching experience that speaks to you mid-run or mid-ride, drawing on your personal fitness history. It starts in English, requires pairing with a supported iPhone, and its output is also available on iPhone and AirPods. This is Apple’s first wearable feature to apply contextual intelligence beyond text.
Developer access and on-device models
Developers can tap Apple’s models directly from Shortcuts, and Apple has exposed its on-device foundation model so third-party apps can build private, offline-capable features. That access matters because it lets intelligence move from system-level conveniences into the apps people actually use every day.
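To make that concrete, here is a minimal sketch against the Foundation Models framework Apple introduced alongside these releases; the helper function and prompt are illustrative, and error handling is kept deliberately thin:

```swift
import FoundationModels  // Apple's on-device foundation model framework

enum SummaryError: Error { case modelUnavailable }

// Illustrative helper: summarize a note entirely on device.
func summarize(_ note: String) async throws -> String {
    // The model can be unavailable on unsupported hardware or while
    // its assets are still downloading, so check before prompting.
    guard case .available = SystemLanguageModel.default.availability else {
        throw SummaryError.modelUnavailable
    }

    // A session carries context across turns; instructions steer behavior.
    let session = LanguageModelSession(
        instructions: "Summarize the user's note in one sentence."
    )
    let response = try await session.respond(to: note)
    return response.content  // plain-text result; nothing leaves the device
}
```

Because the model runs on device, a call like this can work offline and never sends the note’s contents to a server, which is the whole appeal for third-party apps.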
The bigger picture: practical over flashy
Apple downplayed flashy ‘AI’ hype at its recent launch and instead shipped practical, system-integrated features: live translations, onscreen actions, and image tools that are genuinely useful. Not everything arrives everywhere on day one, and some capabilities will stagger by region and language, but the strategy is clear: embed generative tech into moments you already have — typing, calling, glancing at your screen — rather than forcing people into a separate ‘AI app’.
Things to watch
Two areas to follow: search and regulation. Partnerships that let visual intelligence hand queries to services like Google or Etsy could shape commerce defaults. And the AirPods limitation in the EU hints that features may appear unevenly as Apple navigates regional rules.
If execution holds, these quiet system upgrades could quickly become indispensable parts of everyday device use.