Yanko Design

Apple Wants To Put A Camera In Your AirPods… To Improve Siri’s Visual Intelligence

Your earbud can read your body temperature, heart rate variability, and sleep quality. No, I’m not joking: there are TWS earbuds on the market that gather medical-grade data in addition to playing music or your favorite podcast. Now, Apple wants to put a camera on them too. The AirPods Pro 3 already ships with a heart rate sensor, and brands like Amazfit and Soundcore have been quietly building health-monitoring earbuds for a couple of years now. The earbud has become a sensing platform in its own right. Apple’s next move reportedly takes that considerably further: infrared cameras baked into a premium new model, said to be called the AirPods Ultra, that would sit above the existing AirPods Pro lineup and bring computer vision to the most personal wearable most people use every day.

According to Bloomberg’s Mark Gurman, who has been tracking this story for months, the cameras won’t capture photos or video. They are infrared sensors, closer in nature to the Face ID array on the iPhone, designed to scan the environment around the wearer and feed contextual data to Siri in real time. The goal is a smarter assistant that knows what you’re looking at and what’s happening around you, without you having to describe any of it. Gurman has described the product as a “major new product category,” and the branding alone tells you something: the AirPods Ultra would sit above the AirPods Pro 3, which currently retails at $249, and would be the most expensive AirPods Apple has ever sold. The concept has been circulating since Ming-Chi Kuo first floated it in mid-2024, but the story has crystallized considerably in recent weeks, with multiple sources converging on an expected September 2026 launch window.

Image Credits: Sarang Sheth

The Apple Watch Ultra and the M-series Ultra chips established “Ultra” as Apple’s signal for extreme capability and premium positioning within a product family, and the AirPods Ultra branding carries exactly that weight. 9to5Mac noted that what was previously reported as a high-end AirPods Pro variant has shifted in the rumor landscape toward a genuinely new product tier. The reported pricing reflects that: the AirPods Ultra would cost more than the AirPods Pro 3. Apple is also reportedly developing an iPhone Ultra and MacBook Ultra for 2026, meaning the earbuds would join a broader product family refresh built around the tier. Apple is constructing a new ceiling for its entire hardware lineup, and the AirPods Ultra would sit at an intersection of audio, AI, and ambient sensing that no earbud has occupied before.

The infrared camera’s job description, as currently understood from Gurman’s reporting, is to make Siri situationally aware. Visual Intelligence on iPhone 15 Pro and newer already allows the camera to identify objects, read menus, and pull up contextual information about whatever it points at. Moving that capability to an earbud means the system could, in theory, understand your environment passively, without you reaching for your phone or issuing a voice command first. Apple’s next-generation Siri, expected to arrive alongside iOS 27, is reportedly being rebuilt around exactly this kind of ambient, context-first intelligence. The AirPods Ultra cameras would feed that system continuous environmental data, turning a passive audio device into something closer to a spatial awareness layer running alongside your daily life.

Kuo’s original 2024 report framed the camera feature around in-air gesture control, the idea that waving a hand near your head could manage calls or control playback without touching the earbuds. It was a compelling angle, and it made for a more immediately legible pitch than “cameras for Siri.” Gurman has since tempered that expectation, stating he does not expect the AirPods to support hand gestures at launch. A 2025 Apple patent did explore gesture recognition through the earbud camera system, so the underlying research exists even if the shipping product won’t lead with it. The gap between what Apple patents and what it actually ships in a first-generation product is well-established history, and gesture control reads like a capability that may surface in a second-generation AirPods Ultra rather than the first.

Visual Intelligence on the iPhone has proven genuinely useful in contained scenarios, but earbuds introduce a layer of ambient, always-on sensing that is harder to control and considerably harder to explain to the person standing next to you. The privacy implications are real, and the design challenge of making an IR camera in your ear feel considered rather than intrusive is one Apple will have to solve in both hardware and communication. The AirPods Ultra, if it lands in September 2026, would be one of the more consequential product launches Apple has attempted in years, because it represents the company’s clearest statement yet about what a wearable is actually for. The earbud went from audio device to health monitor quietly enough that most people barely noticed. Adding computer vision to the mix is considerably harder to ignore.
