The Vision Pro has had a rough go of it. Not only has it struggled to find an audience since its release in 2024, but recent reports indicate that Apple is abandoning a cheaper and lighter version to instead work on 2025’s hottest new gadget: smart glasses. If you’re reading between the lines and thinking “the Vision Pro is cooked,” I can’t say I blame you. But even if it is cooked, a new report suggests at least one facet could live on, and perhaps finally find its footing, in the new form factor.
Bloomberg’s Mark Gurman has reported that visionOS, the Vision Pro’s operating system, will make its way to Apple’s rumored smart glasses (which Bloomberg reported Apple was working on earlier this month). On the one hand, duh. Using visionOS, the only Apple operating system designed for mixed reality, is an obvious choice, especially because the Vision Pro’s UI is easily one of its best and biggest selling points. But according to the report, it’s not just a matter of porting things over: There’s a twist.
Per Bloomberg, visionOS on a pair of smart glasses will have two modes: one when the glasses are paired with your iPhone, which is stripped down and more useful on the go, and another when they’re paired with—and this is where things get interesting—a MacBook. Details about Apple’s very-much-in-development smart glasses are scarce, so that’s a meaningful hint at how they might work.
The decision to differentiate between modes suggests that Apple’s smart glasses could compete with not only existing glasses like Meta’s Ray-Ban Display, which have a simple UI for navigation, messaging, photos, videos, and phone pairing, but also bigger, more headset-like devices (such as the Vision Pro) that parallel a MacBook. What those more advanced capabilities could be is anyone’s guess, but Apple’s smart glasses, if the display is nice enough and the chip is powerful enough, could lean into entertainment, gaming, or other more compute-intensive, laptop-like features.

There’s a potential hint about UI here, too. While Apple could very well tweak visionOS to conform to different input methods on a pair of smart glasses, the OS, in its current form, is built around the Vision Pro’s UI, which combines hand and eye tracking for a novel “spatial computing” experience driven by pinches and other finger gestures. The resulting user experience feels genuinely more refined than what competitors like Meta’s Quest 3 and 3S offer. Does that mean Apple’s glasses will use hand and eye tracking? Who’s to say? But if Gurman’s reporting is accurate, the foundation for an Apple-like smart glasses UI is there.
No matter how this shakes out, one thing is clear: Though Apple may not see a ton of promise in the Vision Pro’s hardware, it clearly sees value in visionOS. And, to be honest, so do I. As responsive and novel as Meta’s Neural Band (the wristband that registers inputs for Meta’s smart glasses) is, needing to combine smart glasses with a wearable doesn’t feel ideal. If Apple can port the convenience and smoothness of visionOS to a pair of smart glasses (especially in a wearable-free way), it’s got a big leg up, and that’s not even taking into account the opportunities presented by Apple’s direct integration with iPhones and MacBooks. This is all to say that it looks like Apple may have finally figured out what to do with the Vision Pro, and the answer is turning it into a pair of smart glasses.