Six Colors


By Jason Snell

Eyes (and head) on with the Apple Vision Pro

Yes, I’ve worn it.

One of the advantages of being present at Apple Park for WWDC 2023 was that I got to experience—after having my face measured and my eyeglasses scanned—the Apple Vision Pro in a controlled environment, guided by two Apple employees.

I came away mostly impressed—but (as with so much of this product) with a bunch of questions, too. The hardware itself is striking—a dense core like a mega iPhone you wear on your face, yes, but wrapped in humane materials like a gray woven elastic band. With its folds and ridges, it suggested a comfy sweater to me—and I think that’s intentional, as Apple seems to be working very hard to keep this product from seeming sleek and soulless.

Once I put the device on my face, it seemed familiar—I’ve got a lot of experience with the PSVR and the Meta Quest 2—but also different. I started with a brief set-up process, in which my eyes followed a dot moving around in my field of view so the device could calibrate itself, and the device automatically adjusted to the width of my eyes. (Apple reps had already prepped my headset with lenses based on my glasses prescription, so I could see clearly without my glasses.)

With all that, the device snapped on—and I was sitting on a couch with my two Apple guides in chairs on either side of me. In other words, the device defaulted to showing me the actual world around me, and in remarkable detail. The cameras on the Vision Pro are impressive, as are its displays. They don’t provide a view that’s quite as clear as reality itself, but it’s remarkably good. It was easy to get used to the idea that I was seeing the real world, even though it was really just a camera image on OLED displays right in front of my eyes.

At this point in the demo, my handlers walked me through numerous activities using the device. I opened the home screen (by performing a short press on the Digital Crown above my right eye) and launched various apps, all of which floated in the middle of the room.

Honestly, what I’m most excited by about the Vision Pro might be the fact that a lot of people at Apple have devoted years to figuring out the next evolution of computer interfaces. What they’ve done is build on the last 15 years of Apple touch interfaces, adapted to a new platform. It’s familiar—yet also new.

Let’s start with the pointer. There isn’t one. Nor are you expected to poke at virtual interfaces with your fingers. Instead, the Vision Pro’s eye tracking knows exactly where you’re looking at all times. (Items subtly highlight or move forward when you look at them.) Your gaze is the pointer. Look at an app and then tap your thumb and index finger together, and you’ve done the equivalent of tapping an app icon to launch it.

Other gestures are similarly intuitive. To swipe or scroll, you just bring your thumb and index finger together and then move your hand sideways (for swiping) or up and down (for scrolling). It took me no time to understand the gestures, because they’re clearly derivative of everything I’ve learned about using an iPhone or iPad.

Each app window on the Vision Pro has a small horizontal line at the bottom, just like what you’ll find at the bottom of an iPhone screen. On visionOS, it’s a grab handle. You look at it and bring your thumb and index finger together to grab it. Then you move your hand to relocate the window in the space around you in three dimensions—you can push it farther away, bring it closer, or just choose to stow it off to the left or the right.

Among the apps I got to use was Photos, which shows off the high quality of the dual 4K displays in the Vision Pro. There was no graininess—the images looked good. Panoramas can be unfurled and wrapped around you. And of course, 3-D photos and videos look amazing. They’re intimate and personal in a way that flat images aren’t.

Apple made a big deal in the WWDC keynote video about how you can use Vision Pro to capture those videos, and I think it was a rare marketing misstep. The sight of the dad wearing a Vision Pro to record a family memory was jarring and inhumane, and that’s not the message Apple is trying to send with this product. Clearly the right use case here is that the iPhone will eventually be able to capture 3-D videos and photos—but since that iPhone doesn’t yet exist, Apple is left demoing the capture from the Vision Pro instead.

In any event, those dual displays really do 3-D content justice, and that includes sports footage—I lost it as I watched an infielder misplay a throw to first at Fenway Park from the first base dugout—and of course Hollywood movies. I got to view some clips from “Avatar: The Way of Water” and they looked fantastic. I’ve been a skeptic of 3-D movies in theaters, but on headsets they really do shine.

Given that Apple calls the Vision Pro a “spatial computer,” I should probably endorse the idea that it actually seems to work as a multitasking device. Each app is its own floating screen, and you can move and resize them with gestures. The apps are all familiar—they’re iPhone and iPad apps—and they feel quite easy to operate on the Vision Pro. I could definitely imagine being productive on a Vision Pro.

One of the key things to understand about the Vision Pro is that Apple doesn’t want you to consider it a device that cuts you off from the world. For that reason, everything defaults to displaying content layered on the real world around you. If you want a more immersive experience, you can turn the Digital Crown clockwise and the content begins to creep around you, filling your peripheral vision and ultimately even the space behind you. But immersing yourself in an alternate reality is a choice, one that can be easily dialed back, and one that can be “broken through” when someone is nearby. (Even when I took an immersive trip to Mount Hood, I could lean forward and see one of my Apple companions begin to appear.)

One of the things I hate about using VR headsets is that I have no awareness of my surroundings; the Vision Pro not only makes it easier to remain in your existing environment, but it will even let the real world break in when it needs to. I think that’s a great decision.

At one point, Apple had me scroll through a web page using Safari. The scrolling was smooth enough, but I was more impressed with how readable the type was. I’m not sure I’d declare it full-on retina resolution, but it was perfectly readable. I realize that doesn’t sound exciting, but I was deeply skeptical that type would render with enough quality for you to consider reading long documents on a VR headset. Apple’s device passed that quality test with ease.

I got to experience a FaceTime call with another Vision Pro user, which means I was speaking to her Digital Persona. (Calls with people using other Apple devices just appear in windows, but Vision Pro users are wearing something on their faces, so Apple constructs a 3-D avatar of your face and shoulders and animates it with your facial expressions as you hold the conversation.) I thought the audio portion of the FaceTime conversation was very well handled—when I slid her window to my left side, her voice moved over there, and her voice picked up room tone that matched the sound of my voice in the same room.

That said… I’m not really sold on Digital Personae as a concept. My FaceTime caller’s face fell into the uncanny valley. It looked sort of like a person, but the expressiveness was a bit wooden and weird. Apple hasn’t really shared a lot about this feature, but it seems to me that you should be able to personalize your persona (including in ways that make it look less like you in the real world), and that Apple should offer options for other avatars, such as Memoji. I’m not against the Digital Persona as a concept, but it feels like this one isn’t quite good enough yet.

Throughout the process, I never experienced any lag or the sense that the VR system couldn’t keep up. I never felt queasy or uneasy. However, I could never get the device to fit comfortably. My forehead began to hurt almost as soon as I put it on, and I never found a comfortable setting despite tightening and loosening the headband, moving the band up and down on the back of my head, and adjusting the strap on the top of my head. Apple says that it’s still working on different shapes for the piece that fits between the Vision Pro and your head, blocking out light. I hope that addresses the issue I ran into—several of my fellow media compatriots said they had similar problems with fit and comfort.

Overall, I came away from my time with the Vision Pro very impressed with the hardware and software. It’s Apple doing the Apple thing, bringing its unique combination of assets—custom chip design, a robust app platform from iOS, and an intense focus on interface and the user experience—to bear on the problem of building a mixed-reality computer. It seems to me that nobody on the planet is going to be able to match Apple at the game of building a device like this.

That said, does anyone want a device like this, at any price? Will people want to use one for work? Will people want to use them for entertainment? For all of Apple’s concern about not making a product that’s perceived as cutting you off from the world, don’t most of the use cases for this product seem lonely and solitary, and not appropriate for anyone with a partner or family at home?

I am now a believer that what Apple has built is an incredible accomplishment. This is the real deal. The unanswered question is, to what end?

If you appreciate articles like this one, support us by becoming a Six Colors subscriber. Subscribers get access to an exclusive podcast, members-only stories, and a special community.
