AirPods are about to get a feature that turns them into your personal multi-language wingman. Hidden inside iOS 26 beta 6 is imagery hinting at an in-person Live Translation feature for AirPods, which users could activate with a double-press gesture. The imagery shows the greeting “Hello” displayed in multiple languages, alongside a file labeled “Translate,” signaling that Apple is gearing up to make real-time translation part of your earpiece toolkit.
This isn’t Apple stepping off brand; it’s leaning into its AI-driven future. iOS 26 already offers Live Translation in Phone, Messages, and FaceTime, so adding in-person translation through AirPods feels like the next logical leap.
The feature was already rumored by insiders like Mark Gurman, so finding traces tucked in the beta code confirms Apple isn’t just piloting the idea; it’s close to putting it in your hands. The feature appears to target both the AirPods Pro 2 and the fourth-gen AirPods. And as with other Apple Intelligence features, it will probably require an iPhone that supports on-device AI.
Functionally, this opens a whole new way to converse across borders without looking awkward. Imagine double-pressing your AirPods and suddenly hearing live translations during a face-to-face chat, no screen needed. Whether you’re ordering at a cafe, traveling, or talking across language barriers at home, this kind of seamless audio-first translation could change the game.
Bottom line: Apple is quietly making its AirPods smarter, more personal, and far more useful. Live Translation feels like more than a convenience; it’s one step closer to wearable tech blending invisibly into everyday life. And if executed well, this feature could make your AirPods not just an audio tool, but a universal communication assistant.