WWDC 2023

Hands-on with Apple Vision Pro: This is not a VR headset

This was the best headset demo I’ve ever seen. But there’s room for improvement.

Samuel Axon
This is Apple’s Vision Pro headset. It looks a bit like a particularly bulky pair of ski goggles, with the materials and design language of Apple’s AirPods Max headphones. Credit: Samuel Axon

CUPERTINO, Calif.—Going into the Vision Pro demo room at Apple’s WWDC conference, I wasn’t sure what to expect. The keynote presentation, which showed everything from desktop productivity apps to dinosaurs circling a Vision Pro user in space, seemed impressive, but augmented reality promotional videos often do.

They depict a seamless experience in which the elements of digital space merge completely with the user’s actual surroundings. When you actually put on the headset, though, you'll often find that the promotional video was pure aspiration and reality still has some catching up to do. That was my experience with HoloLens, and it has been that way with consumer AR devices like Nreal, too.

That was not my experience with Vision Pro. To be clear, it wasn’t perfect. But it’s the first time I’ve tried an AR demo and thought, “Yep, what they showed in the promo video was pretty much how it really works.”

(Quick note: Apple wouldn’t allow photos of me wearing the headset—or any other photos during the demo, for that matter. The photos in this article are of a headset put on display after Monday’s keynote.)

Getting set up

Before I was able to put on Vision Pro and try it, Apple gathered some information about my vision—specifically, that I wear contact lenses and that I'm nearsighted but not farsighted. This was to see if I needed corrective vision inserts, as glasses won't fit inside the headset. Since I was wearing contacts, I didn’t.

An Apple rep also handed me an iPhone, which I used to scan my face with the TrueDepth sensor array. This was to create a virtual avatar, called a "persona," for FaceTime calls (more on that shortly) and to pick the right modular components for the headset to make sure it fit my head.

When the headset goes on sale, you’ll be able to use your iPhone to do all this while ordering Vision Pro online. If you don’t have an iPhone, you’ll be able to go into the Apple Store, and they'll do it for you there.

As for the vision part, glasses wearers sometimes find it uncomfortable to wear VR headsets because their glasses might not fit comfortably inside. Other headsets are made large enough to accommodate glasses, but then they’re unwieldy. In typical Apple fashion, the company wants Vision Pro users to throw money at the problem. Inserts matched to your glasses prescription will fit magnetically inside the headset, so you won’t have to wear glasses or contacts at all. It seems that this will be part of the buying process for Vision Pro.

The one stain on this device’s ergonomics is that it’s tethered to this iPhone-sized battery pack. Credit: Samuel Axon

Additionally, I was able to confirm that because these magnetic inserts are not permanent, you can swap them out for different people—so if you and your spouse both need corrective lenses and each want to use the headset, you can buy one headset and get different vision inserts. Apple hasn’t said how much this will cost, though.

The iPhone was also used to scan my ears with the same sensors for an optimal Spatial Audio configuration.

By the time I was handed a headset to put on, Apple had equipped it with the right parts and data to tailor it to my specifications. After all that, the headset fit perfectly, and it took very little adjustment to get it to settle on my head. It was lightweight; I wore it for about half an hour and did not feel any physical fatigue.

The one stain on an otherwise comfortable fit was the fact that Vision Pro has a battery pack attached by a tether. Shaped and weighted like a thick iPhone, it slotted into my pocket easily, and the wire connecting the battery to the headset never got in my way—but I could still feel it there.

The best part was the interface

Here you can see cameras on the bottom of the headset that read your body language for the interface and for FaceTime calls. Credit: Samuel Axon
I was able to touch, examine, and wear the headset. An Apple representative walked me through the basic interface, and I browsed a home screen full of apps.

Vision Pro’s interface is all about eye tracking. Whenever you look at a UI element (like an X to close a window or a photo within a gallery in the Photos app), it is subtly highlighted in your view. To actually make a selection—to click, if you will—you simply tap two of your fingers together. You don’t have to hold your hand in front of the headset to do this; as long as your hand is not hidden completely behind you, it can be pretty much anywhere. To scroll, you pinch and move your fingers up and down or side to side. It feels a bit like pulling a string to open window blinds.

In my testing, the eye tracking was perfectly accurate and responsive. It reminded me of using a similar feature in PlayStation VR2, but it felt just a bit more accurate. If you’ve used well-implemented eye tracking in VR before, you know it becomes intuitive and natural almost immediately.

I’ve used headsets that relied on hand gestures before, but they never felt very natural. With Vision Pro, it feels just right. The fact that your hand can go anywhere, and that you can pinch subtly instead of making some kind of dramatic gesture, goes a long way.

If you’ve used a Meta VR headset, a PlayStation VR, or almost any PC VR device, you know how awkward it can be to carry controllers in your hands. Now that I’ve used Apple’s interface, it will be hard to go back to using controllers again. This approach is not only more immersive; it’s much more practical.

I was able to launch windows for multiple apps and arrange them around me. Moving them involved simply gazing at a small white line beneath each window, pinching (the equivalent of holding down the left mouse button on a desktop), and turning my eyes to where I wanted the window to go. I was able to place windows in an array around me, and I was even able to overlap them. Whichever one I looked at appeared in front in that moment. All of this worked well, and I had no complaints. In this respect, there’s nothing to criticize: Apple has nailed the interface.

There’s one other aspect to the interface worth noting: adjusting your immersion level.

Turning the headset’s digital crown smoothly transitions between total immersion at one extreme and absolute passthrough at the other. The headset captures your surroundings (depth perception included) and displays them on the two screens in front of your eyes. When the crown is turned all the way to passthrough, you see what you’d see if you weren’t wearing Vision Pro at all—albeit a bit darker and with just the slightest bit of softness.

As you turn the knob from immersion to passthrough, the digital elements slowly crossfade to whichever in-between state you want; it’s like changing the transparency level on a UI element in a 2D interface. If you turn the crown all the way, the digital objects disappear completely via a sort of vignette effect—kind of like those transitions in Star Wars where the initial shot disappears into a shrinking circle, revealing the next scene.

Further, Vision Pro recognizes when someone is standing or sitting near you and crossfades them into partial view, even if you’re far along the scale toward immersion. This is effective but very surreal. The face of an Apple rep sitting next to me was clearly visible, but he looked like a semi-transparent apparition floating strangely in my virtual reality environment—it was a bit like the visual effects you’ve seen movies use to represent ghosts or spirits, sort of there but sort of not.

A good display goes a long way

Most VR headsets I’ve used were held back by poor displays. For example, for all its high-end specs, the Valve Index’s LCD screens have terrible black levels, leaving the image looking washed out. Many of us have trained our eyes to ignore this when watching on a flat screen, but in VR, it drives home the “this isn’t really real” factor. Other headsets, like Sony’s first PlayStation VR or the Meta Quest, suffer from relatively low resolutions, so everything just looks a bit fuzzy.

The most impressive display in a headset I had seen up to this point was Sony’s PlayStation VR2, which has a higher-resolution OLED display that offers near-perfect black levels and HDR brightness on highlights. The Vision Pro is more akin to the PSVR2 than any of the others. As far as I could tell, it was maybe even better.

Apple hasn’t been very specific about the specs here. It has said that each eye has a display with a little over 4K resolution and that the displays are micro-OLED, hence the deep blacks. For all the brightness it’s supposed to offer, the screen seemed a little dim to me—probably 30–40 percent dimmer than reality, based on my subjective experience. It was much brighter than a Meta Quest, but it still didn’t capture the full range my eyes could see in the real world.

I asked Apple if users could turn up the brightness in the final product but was told the company didn’t have anything to announce on that and was reminded that I was testing pre-production hardware.

Still, it was much better than other headsets I’ve used on this front, even if it still wasn’t perfect.

Another problem in most AR headsets is field of view. Since many AR products are trying to project a digital image onto see-through glass in front of you, they can run into some fundamental problems of optics. The result is that digital content is usually constrained to a small box that occupies only a tiny fraction of your field of view.

By contrast, Apple’s headset (which captures the outside world with 3D cameras and presents it to you digitized via a VR-like immersive view) has no such limitation. The company wouldn’t say exactly what the FOV is, but to me, it seemed comparable to VR headsets. You don’t have your full peripheral vision, but the digital elements can go anywhere in view—not just in a limited square.

Further, the digital elements sit in 3D space. Some AR devices simply create a virtual flat panel in front of you into which all the digital stuff goes. With Vision Pro, you can arrange multiple flat panels in the room around you, and some things are fully 3D objects in that space, not just panels.

The strangest phone call I’ve ever had

We weren’t able to see this ourselves in our hands-on demo, but the front of the headset displays a recreation of your eyes and various visual patterns to indicate your state to others around you. Credit: Samuel Axon
A short time into my demo, I received a FaceTime call from an Apple rep in another room. Like me, she was wearing the Vision Pro headset.

Her talking head appeared in a small, square window that I could move anywhere I wanted in the space around me. What I was seeing wasn’t actually her, though. Rather, it was her digital persona, a 3D model of her face that moved in real-time to match her real-life expressions as captured by her own headset’s cameras.

Vision Pro looks at your eyes and mouth and uses that information in tandem with some machine-learning trickery to animate your entire face. The face is based on a TrueDepth camera scan, so it looks like you in terms of geometry and coloring—though the material of her skin and hair looked just a little off. (Think of a recent big-budget movie that digitally de-aged or otherwise recreated an actor in CGI at great expense and you'll know what to expect.)

Honestly, I found it a bit unsettling—the uncanny valley is a real phenomenon. Another journalist I spoke with after the demo said he might have preferred that Apple use something like Animojis to create a less surreal cartoon character.

I wouldn’t be surprised if Apple offers this option when the headset launches, but it makes sense that this hyperreal alternative exists because it completely retains social information. Even though the persona I was speaking to looked a little odd, everything in her facial expression and body language was preserved in such a way that I had all the same social cues I’d have if I were on a regular 2D FaceTime call with her, sans headsets. That’s an impressive achievement, even if the persona itself does look a little weird.

Additionally, she started up a collaboration session in a productivity app. I watched her make notes on a canvas live, and she showed me a 3D model she had embedded in a document. I was able to look at the model from different angles. It was pretty much exactly like those hype videos you’ve seen about collaborating in the so-called metaverse.

I’m not sure most people would pick this form of interaction over just launching Zoom and Google Docs on their laptops like they’ve been doing for years, but the cool factor was high regardless.

Consuming content

Outside that call, the demo was mostly focused on viewing content of various types within the headset’s mixed reality environment.

I launched a Photos app and used my eyes and fingers to move its app window around in space in front of me and swipe between several photos Apple had taken in advance. Some were 2D photos taken with an iPhone (including panoramic ones that wrapped around me), and others were 3D images captured with the Vision Pro. The 2D photos looked great. The 3D images were neat, but odd. They were not immersive but rather strangely exaggerated dioramas contained in a small box. It was like gazing into a little world living inside a cupboard.

I immersed myself in a virtual space (a lake by a mountain) and arranged apps like Safari around me as giant panels inside it. Using Safari, I scrolled through a webpage; text was crisp, clear, and legible.

I watched a snippet of a 3D movie (Avatar: The Way of Water) in the TV app. The movie was neat, but I’ve always felt 3D movies look a little weird, and I wish Apple had shown a 2D movie instead so I could have assessed the picture quality against a real 4K HDR TV.

I also briefly meditated with assistance from an immersive mindfulness app that darkened the room around me and produced a strange, shapeshifting orb for me to stare at while I focused.

Apple didn’t show me any games. I wasn’t able to test interactions with or extensions to the Mac or other Apple devices. I only used a few apps, and perhaps most importantly, I did not capture or create any content—I only viewed it. In fact, my interactions were mostly limited to launching apps and moving them around in my view, as well as adjusting my immersion level with the device’s digital crown. Everything else was passive viewing.

This lack of interactivity, the headset’s still relatively distant launch date, and Apple’s repeated insistence that the test hardware was pre-production technology suggested to me that the company is probably still working out a lot of things about how you’ll actually use Vision Pro as a full computing device.

That said, Apple ended the demo with a showcase that genuinely impressed me.

The portal to Jurassic Park

Allowing you to share a virtual space with dinosaurs isn’t a fresh concept. Using a Meta or PC VR headset, you might have seen one or more “experiences” that transport you to a prehistoric world where a dinosaur towers over you.

And on the AR side, popular iOS and Android apps allow you to see the true scale of some dinosaurs through the viewport of your phone screen: you hold your mobile device in front of you, and the dinosaur is placed into the rear camera’s view.

So when Apple told me the demo would end with a dinosaur encounter app, I admit I had low expectations.

The strap that you use to securely set the headset on your head is built from modular parts selected specifically to fit your head. Credit: Samuel Axon

But as soon as I launched the app, I realized it would be a little different. Slowly, part of one of the walls in the square room I sat in turned into a doorway to a 3D-rendered rocky environment with blue skies above. This viewport mapped perfectly onto the wall, and the depth perception I had suggested that it really was a door in the wall leading to a very different setting outside. High-fidelity, 3D-rendered dinosaurs walked around on the other side of this portal. Because the dinosaurs were in a likewise 3D-rendered environment, this seemed similar to the dinosaur experiences I’ve seen in other VR headsets.

But then something happened I hadn't seen before—at least not in anything close to this fidelity. One of the dinosaurs walked over and seamlessly poked through the portal and into the room. I was able to walk up to it and examine it as if it were in the room next to me, and its head turned to stare at me as I walked around the room.

The dinosaur cast a shadow in the room and was lit naturally by the lamps nearby. Placing the dinosaur in a real space like that made it much more convincing than any other VR dinosaur video or game I’d seen before.

This experience confirmed something I’ve always figured: If someone could get AR right, it would be much more impactful than VR.

First impressions: This is not a VR headset

All the rumors leading up to this announcement positioned the Vision Pro as a competitor to Meta’s VR headsets, but after using it, I don’t see it that way, for two reasons—both of which my colleague Kyle Orland went into detail about yesterday.

First, Meta is making a mass-market product. The regular Quest headset is essentially the lowest common denominator—or the minimum viable product—to keep the cost down. It’s just good enough to provide a solid VR experience, but no more than that. If it offered more, it would be pricier. Meanwhile, Apple’s Vision Pro is extraordinarily expensive. It’s not just for the high end of the market—it’s barely priced as a mass-market device at all.

But much more importantly, Meta’s headsets are first and foremost VR devices that happen to have a couple of AR features. Vision Pro, on the other hand, is primarily an AR device that just happens to have a few VR features. They’re largely offering different uses for different people.

In fact, despite all the rumors calling it VR (and despite some articles still erroneously doing so) the Vision Pro is not a VR headset at all—even though it uses VR-like techniques to present the real world to you. It can do some things a VR headset can do, but none of the use cases Apple showed me—not even any that were shown in Monday’s keynote—would be called virtual reality.

Whether it’s telepresence via FaceTime, turning the room into your personal spatial computing space, watching movies on a virtual TV mapped to the geometry of your wall, or interacting with dinosaurs that are standing next to you, it’s augmented reality. It’s all about putting virtual objects in real space.

Tellingly, Apple hasn’t announced any immersive VR games for the device. The few games the company did talk about during the WWDC keynote were 2D games played on a virtual TV.

Even when turning the crown all the way up to maximize immersion, I was still much more aware of my surroundings than in any VR headset. So if you’re expecting a successor to the Oculus and Meta Quest throne, forget about it. That’s not what Vision Pro is.

Instead, it’s a first look at something else entirely. It’s far too soon to say whether there will be a market for it, and I won't say it’s worth its ultra-steep price for most people—in part because the demo I received was highly controlled, and there was no opportunity to discover friction points or edge cases.

Regardless, even having used dozens of VR and AR headsets over the years, it was truly something I had never seen before. It’s been a while since we’ve gotten that from Apple.

