
My Life-Changing Week With Apple Vision Pro

Raymond Wong gets hands-on — and then some — in this in-depth Apple Vision Pro review.

by Raymond Wong
Inverse tech journalist Raymond Wong wearing the Apple Vision Pro for his review
Lais Borges/Inverse; Photograph by James Pero

Years before the $3,500 Apple Vision Pro was officially announced at WWDC 2023 last June, I made a bold prediction: The “killer app” — a piece of software or use case that can single-handedly justify the purchase of a new device — for Apple’s first face-worn computer would be a personal movie theater with the biggest virtual display ever and most immersive surround sound experience.

I was right. Watching movies and TV and personal 2D and 3D “spatial videos” recorded with an iPhone 15 Pro on the Vision Pro’s 100-foot virtual screen is such a mind-boggling experience it almost feels impossible that it exists in a headset that fits in your backpack. Taking off the Vision Pro will leave a temporary red imprint of its cushion around your eyes, but you will crave putting it back on to enter the stimulating augmented world of visionOS.

But that’s only part of the story. Once I started doing work inside of the headset, I discovered another killer app. I found myself able to do almost all the things I do on my MacBook Pro, like writing and editing stories, cutting together vertical videos, browsing the web, messaging on Slack, and posting to social media — without the constraints of the laptop’s fixed size and display resolution. With “spatial computing,” app windows can be as small or large as I want them — room-size even — and I can position them anywhere (against a wall, on a desk, on the ceiling or floor, or in mid-air), and they will appear through the Vision Pro’s screens as though they are actually floating in my physical surroundings. It all adds up to a serious game-changer.

Then, I received an incoming FaceTime call from Ian Zelbo while wearing the Vision Pro. On that call, it dawned on me that there is no single killer app for the Vision Pro, because it has many. Zelbo and I are digital friends — we’ve followed each other on X (formerly Twitter) and traded replies for years, but have never met in real life. But that night, our 3D “Personas” met up on FaceTime and we played a heated and very fun game of augmented reality battleship (or Sea Battle, as it’s called in Resolution Games’ Game Room). We kept laughing at how futuristic it was for both of us to be playing an augmented reality tabletop game together as if it were actually in front of us on top of our beds. It was then that I realized the Vision Pro is exactly what’s advertised on the box, so to speak. It’s a computer, and computers have many killer apps, not just one.

I lost the game of augmented reality battleship, but it was one of the most surreal mixed reality experiences I’ve ever had.

Screenshot by Ian Zelbo

Apple was famously mocked for asking the question, “What is a computer?” when it tried to redefine a “real” computer from its established definition as a machine with a traditional form factor (laptop or desktop), running a window-based graphical user interface (GUI), and controlled with a traditional input (mouse and keyboard). The ad was meant to portray how an iPad is a computer, but with a modular form factor (add a keyboard if you like), running a simpler interface (iPadOS), and controlled with modern inputs (multitouch and Apple Pencil). The ad is now meme legend, but the question will likely forever remain as long as Apple keeps making new electronic devices: What is a computer?

In 1984, a computer looked like the Macintosh; in 2007, any person would have said a MacBook or Dell laptop; in recent years, more powerful apps and better keyboard and trackpad support have made the iPad a more capable computer than it was when it launched as a giant iPhone and color e-reader in 2010. So what is a computer in 2024 and beyond? Well, if you’re Apple, the next evolution of the computer looks like Vision Pro and you control it with your eyes and hand gestures.

What It’s Like To Put A Computer On Your Face

The Apple Vision Pro in all of its first-gen glory.

Photograph by Raymond Wong

The hardest part about selling a computer that you wear on your face is that everyone’s head is different. It’s not at all as simple as selling a laptop, tablet, or smartphone made for hands that generally fall into small, medium, and large sizes. A head-mounted device or headset (Apple prefers you call it a “spatial computer” so as not to be lumped together with other mixed reality and virtual reality headsets) needs to account for not only head size and shape, but characteristics like bone structure, nose and cheek types, eye prescriptions and distances, ear sizes and distances, and hair. Apple factors in all of those variables when you use an iPhone or iPad with Face ID to complete a face scan, which then selects the right light seal for your face.

This personalization has its pros and cons. Assuming the face scan is accurate (if it’s not, you can visit an Apple Store and swap for one that fits better), it means your Vision Pro will fit your face snugly. Getting the right fit is essential to a good Vision Pro experience, especially for extended wear. With the wrong size light seal, you will likely feel more discomfort wearing the Vision Pro and/or see light leak into the headset through the sides or the nose area. Having one or both of these issues would immediately leave a bad impression. Fortunately, my scans were pretty consistent, and I got the same light seal after doing the face test several times. I recommend doing multiple scans and choosing the size that comes up most often. The downside to a light seal that best fits your face is that it makes it difficult to share your Vision Pro with another person. Giving another person a brief demo might not be the worst experience they have with the headset, but it won’t be an optimal one, either. You can imagine how this would make sharing a Vision Pro in a household or at work challenging or impossible. Light seals, whether they’re replacements or for another person, aren’t cheap: $199 each.

Getting a light seal with the right fit is essential to a good Vision Pro experience.

Photograph by Raymond Wong

Configuring the ergonomics of the Vision Pro for your face doesn’t end there. After you’ve got the right light seal, you need to secure the headset to your face, which you do with one of two included straps. The Solo Knit Band — the wide and cool-looking stretchable one that you see in all of Apple’s marketing pictures — is the default strap that comes pre-installed on the Vision Pro. It has a “fit dial” that you rotate to tighten and loosen the threaded band. This is my preferred band. It looks great, the fit dial allows for precise adjustments, and it doesn’t flatten my hair’s middle part. The tradeoff for not messing up your hair is the lack of a top strap, which many people find helps lift the Vision Pro and prevent it from sliding down and/or weighing on your face. To swap in the secondary Dual Loop Band, you simply pull the two orange tabs on the Solo Knit Band to detach it. The Dual Loop Band attaches to the spatial audio side bands with a satisfying snap. The elastic band with its Velcro straps isn’t as stylish as the Solo Knit Band, but most people seem to find it more comfortable.

The Solo Knit Band for the Apple Vision Pro.

Photograph by Raymond Wong

At WWDC 2023, I tried a band design that Apple doesn’t sell: a Solo Knit Band with a top strap. Even after trying both included bands and finding the Solo Knit Band works best for me, I keep thinking about that unreleased band and feeling that Apple should have included the top strap as an attachable piece, or at least sold the combo band separately. The fact that Vision Pro users are using zip ties to fasten additional Solo Knit Bands as top straps suggests the Vision Pro’s 625g weight is too heavy for many people. Comfortable wear is something Meta figured out a long time ago for its Quest headsets, which is why all Quest headsets (except for the Quest Pro, which uses a crown design) have shipped with a strap that includes top support. Of course, that also means you’ll always look dorky wearing a Quest headset, which gives you serious bedhead. The inclusion of the uglier Dual Loop Band is an obvious concession for the sake of wider comfort, and it’s clear to me that when future Vision Pro headsets get thinner and lighter, the Solo Knit Band will be the one and only band for everyone.

The Dual Loop Band is also included in the box. It’s got more head support, but it’ll also mess up your hair.

Photograph by James Pero

It’s easy to complain about the weight of the Vision Pro when you might literally feel it on your face. I say might because I seem to be in the minority of Vision Pro users in that the weight doesn’t usually bother me. The only time I couldn’t stand the weight and felt the drag was during FaceTime video calls, where I stared at the other people’s windows and didn’t move my head very much. I documented that on X, where I said that my neck cramped up about 45 minutes into the video call — a very engaging call with my friends Christina and Karissa — and by the last few minutes, I practically begged them to hang up because my face felt so fatigued. That was the only time I needed to take off the Vision Pro during a week of almost non-stop testing. At first, I thought maybe the headset was on too tight, so I loosened the Solo Knit Band, but I’ve since all but confirmed that reduced head movement (at least for me) leads to neck and face pain. At 625g, the Vision Pro is the heaviest of all the most popular VR headsets available. The Quest 3 weighs 515g, the Quest 2 503g, and the PlayStation VR2 560g. Only the Quest Pro weighs more than the Vision Pro, at 722g. Comfort is not guaranteed for everyone.

The digital crown on the Apple Vision Pro controls the level of Environments and the volume, and brings you to the home view of apps.

Photograph by Raymond Wong

Comfort and weight aside, the Vision Pro is truly state-of-the-art, and it’s probably, as Apple says, the most complex device it’s ever created. Every component, from the Apple-designed M2 main chip and R1 co-processor to the two postage stamp-size displays with a combined 23 million pixels to the 12 cameras, six microphones, and five sensors (for eye and hand tracking, and room and object detection), is densely packed into the ski goggle-shaped silhouette. iFixit’s teardown shows barely any room for air inside. The second I held the Vision Pro in my hand, I knew it was no toy and I would need to handle it with care, lest I fumble it and end up crying my way to expensive repairs at an Apple Store. It’s no wonder Apple includes a very fancy polishing cloth with every headset. I’m surprised there aren’t white gloves in the box, too.

To offer an analogy, the Quest 3 is like a Casio G-Shock watch. It provides a mixed reality experience; it’s generally affordable; it’s made of cheap, lightweight materials that are plenty durable. The Vision Pro is more like an Omega Speedmaster. It provides a high-fidelity mixed reality experience; it’s premium-priced; it’s made of nicer and heavier materials — and you probably gently store it in a padded lock box at night.

Apple spared no expense engineering (or over-engineering) the Vision Pro down to the tiniest details. Every button press; every turn of the fit dial or digital crown; every magnetic snap when you attach the light seal to the headset or face cushion to it; every latch click connecting the bands or detaching the cable from the external battery pack; every sound and tactile feedback is supremely satisfying. Maybe the only thing that could make the Vision Pro even more covetable is if it came in Space Black.

How I Felt After A Full Week With the Vision Pro

A tap of the thumb and index finger is a “click.”

Photograph by James Pero

If you want to get a real visual sense of what it’s like using hand gestures to control visionOS and how the operating system and apps (Vision, iPad, or on a Mac with the virtual display) work when “placed” or “pinned” to your surroundings, there are tons of videos on YouTube and social media that do a good job of that. This one went viral for good reason.

But what is the Vision Pro experience like after tests that span a week? What issues crop up when you have a bit more time with the device? Apple sent me a loaner Vision Pro three days before its February 2 release in the U.S., but as with any new platform, your experience is very much dependent on what you can actually do with the device at the time of testing. For example, there wasn’t a single YouTube app for the first batch of Apple’s hand-picked Vision Pro reviewers. By the time I got my review unit set up, Christian Selig (you may know him as the developer behind the now-dead Apollo Reddit app) had released his third-party YouTube app, Juno, and I was the first person in the world to try it (the app is barebones, but it works!). Post-launch, YouTube has confirmed it’s working on an official app for Vision Pro. There are no official Netflix or Spotify apps at launch, but if you’re buying a Vision Pro in the future, there may very well be ones in the App Store.

The volume and breadth of native Vision Pro apps and iPad-optimized apps are expanding at a rapid pace. I’ve opened the Vision Pro App Store and checked my inbox every day to find so many apps to try, and for many of those apps, there’s no way to know what to expect. Will they be “spatial” with augmented reality “volumetric” 3D objects and interfaces or will they be 2D — app windows, essentially — that merely float in your surroundings? And if apps have spatial components, will they be paywalled like Algoriddim’s djay music mixing app? Will they have ads, and if they do, what does that look like in possibly 3D? How well do eye and hand-tracking work as substitutes for touchscreen or mouse input? Trying them out is the only way to find out. It’s been fun; the App Store feels fresh again and worth a daily trip for the first time in over a decade.

What working with multiple virtual windows inside of Vision Pro is like.

Screenshot by Raymond Wong

It took me almost three days, but I wrote nearly every word of this review while wearing the Vision Pro strapped to my face. I dictated some parts and pecked at the virtual keyboard on subsequent revisions. But I typed about 95 percent of the text you’re reading using a Keychron Q1 Max mechanical keyboard paired wirelessly to the Vision Pro (sometimes with a Magic Trackpad, since Bluetooth mice aren’t supported), as well as the keyboard and trackpad on a 2021 MacBook Pro with the M1 Max chip. The last week of “living” in the Apple Vision Pro has been absolutely surreal. I am in awe every time I take off the headset and return to my non-augmented surroundings. At the same time, I’m unsurprised at how quickly computing in visionOS has become normal for me. I also didn’t get motion sickness — something I occasionally succumb to, depending on the app, when using VR headsets such as the Quest 3 and PS VR2. That doesn’t mean I won’t ever feel motion sickness while using the Vision Pro in the future, but so far, the headset’s high-resolution, fast-refresh-rate displays (they switch between 90Hz, 96Hz, and 100Hz) and software algorithms have done a great job fending off any dizziness, sweatiness, nausea, or other motion sickness-related symptoms. Apple has a whole support document if you’re prone to motion sickness in headsets or you’re concerned about it.

The learning curve for Vision Pro is so shallow that it practically renders the Vision Pro’s state-of-the-art processing power, responsive eye and hand tracking, real-time Lidar environment scanning, and everything else happening invisibly behind the scenes unmagical. Because of course controlling virtual windows and manipulating virtual 3D objects should work as naturally and easily as looking at them and tapping your thumb and index finger together to “select” or “click” something. Of course moving an app window is as easy as pinching your fingers over a bar that appears below the window and then dragging it wherever you want. And of course zooming in and out to enlarge text or examine the details in a photo or video is as natural as using two hands to pinch in and out on it. The eye and hand tracking on the Vision Pro shipping today is as polished as a 1.0 version of any new input gets. There is zero perceivable lag between touching a virtual element and manipulating it. And if there is any latency that can be improved in future hardware, it’d be so imperceptible I doubt I’d notice a difference. That’s just how smooth the input and interface for Vision Pro are. visionOS isn’t perfect out of the gate. I did run into a few bugs throughout a week of using the Vision Pro — finger taps occasionally didn’t track, or my eyes couldn’t hold their gaze on the button I wanted to select — but they’re so few and far between that it’s no different from a split-second lag or misclick of a mouse or trackpad.

Generally speaking, visionOS 1.0.2, which came out after the initial wave of embargoed Vision Pro media reviews (but prior to the headset launch date), is as stable as any of Apple’s other platforms. I can say with high confidence that any minor glitch or bug can be resolved with a reboot (you can just say “Siri, restart”) or it will be squashed in a software update soon enough. Contrary to what many people seem to think, Vision Pro is not in the unpolished state of a developer kit where the core functionality isn’t responsive or stable enough to complete the fundamental computing tasks that it’s designed to do. It’s inaccurate to call Vision Pro a developer kit if the only basis is that it’s expensive, there aren’t enough native Vision Pro apps, and the install base is small. That is the case for any groundbreaking new product.

Vision Pro is the best expression of what a headset experience with augmented reality should be.

But I want to be clear here: the Vision Pro is one of the most complete first-gen product releases that Apple has ever shipped. People remember past Apple products through rose-tinted glasses, as revolutionary products that became cultural phenomena overnight. From the Mac, to the iPod, to the iPhone, to the iPad, to the Apple Watch, to AirPods, new Apple products have always been premium-priced devices within existing categories, but with the “Apple Formula,” if you will. When Apple enters a product category, it bundles cutting-edge technologies, intuitive software interfaces, and extremely accessible inputs into gorgeous lust-worthy designs. When all of those pieces are in a finished enough state to deliver an experience that “just works” and feels effortless, allowing the tech specs to fade into the background, that’s when Apple packages everything together and ships it. Buying a first-gen Apple product is not simply getting a low-cost metal box of silicon and software thrown in as an afterthought. You’re buying the overall experience — the future before everyone has access to it — and Vision Pro is the best expression of what a headset experience with augmented reality should be.

Compare the Vision Pro to the Quest 3, Meta’s latest mixed reality headset that starts at $499, and it’s not even a fair fight. I like a lot of things about the Quest 3, but the two headsets are not in the same class. The Quest 3 is controlled with hand-based Touch Controllers with physical buttons and triggers that you use to aim at virtual icons and control virtual objects; it needs a not-insignificant tutorial if you’re not already familiar with game controllers. The Vision Pro is controlled with your eyes and fingers and requires little to no explanation; you instinctively reach out to touch and grab virtual windows and objects, and it usually works once you know where to tap your fingers and how to pinch them. On Vision Pro, your eyes are the equivalent of a mouse pointer, and your finger taps and pinches are the “clicks” and gestures on a trackpad. There is no eye tracking on the Quest 3, and hand tracking is not turned on by default (it was a beta feature on the Quest 2 and Quest Pro); its finicky implementation only highlights its secondary status to the Touch Controllers.

Some More Notes On The Vision Pro, In No Particular Order

Eye and hand-tracking is so responsive it almost feels like Jedi mind control.

Photograph by Raymond Wong

Apple Has Put Computer First, Gaming Second:

Thanks to its M2 chip, the Vision Pro is first and foremost a desktop-class, general-purpose computer that also happens to be a capable console for mobile games and AAA game streaming services (and soon via proper apps). The Quest 3 is a virtual reality game console first, with disparate computer functionality slapped together in the hopes it’ll form some kind of consistent platform someday. I’m not saying the Quest 3 is suddenly garbage, because it’s not. It’s a similarly shaped product, but the philosophies behind the primary experiences they deliver are different — and that’s okay. Apple has been making computers for nearly 50 years, with billions of devices sold and used daily; the Vision Pro builds on past Apple platforms and software experiences and works seamlessly with existing Apple hardware and services. Meta has made zero successful general-purpose computers or work services that anybody truly wants to use. Quest has a strong foothold in VR gaming, and Meta should lean into it instead of trying to compete with janky VR work apps like Horizon Workrooms that not even its own employees want to use.

Sharp Displays Have One Clear Drawback:

You may have noticed that I talked about the eye and hand tracking input of the Vision Pro before the vision part of the headset: the Micro-OLED displays. Typically, when I review a phone or tablet, I critique the display first, since it’s the main component you use to interact with the system software and consume content. But I’m talking about the Vision Pro’s displays after the input because, in my opinion, they’re really secondary to the buttery smooth gesture controls. How can the displays be secondary on a device that literally puts pixels millimeters from our eyes and is the only way to see all of visionOS’s floating windows and spatial objects? Easy: The screens are the best on any consumer headset (enterprise is a different story) — so crisp you can easily read text without pixel ghosting or smearing — and they’re bright enough to render HDR photos and videos in spectacular fidelity. But the field of view — how much of the screen you can observe — is narrower than I would like on a headset that starts at $3,500. Compared to the Quest 3 and other VR headsets, looking through the Vision Pro, even with the right light seal and correct eye calibration, still resembles looking through a pair of binoculars. The picture quality is clear in the center, but look at the periphery and you’ll see black borders; it’s similar to a vignette.

The Vision Pro’s screens are the best on any consumer headset, but the field of view could get wider in future versions.

Photograph by Raymond Wong

Widening the field of view is something I look for every time I review a new VR headset, and Meta deserves credit for improving it with each Quest headset. But Apple gets a small pass for the first-gen Vision Pro because the higher resolution, refresh rate, and brightness of its displays make up for the narrower field of view. On lower-res VR headsets, the poorer display clarity is often a distraction (and a real barrier to basic computer tasks like reading a website), and a larger field of view can’t compensate for that. Does the black vignetting around the lenses distract me? Sometimes, when I pause, look around, and fixate on it. But, at least for me, I quickly stopped noticing the black peripheral borders because I was more focused on the content before me. The precision of the spatial windows and the clarity of the content make hardware shortcomings like the FOV disappear for the most part. Here’s another analogy: It’s like the Dynamic Island or notch on an iPhone, or the crease on a foldable phone. The blemish is prominent, but after a while, you learn to ignore it.

On Navigating visionOS:

Navigating and using visionOS itself couldn’t be simpler. If you’ve used iOS or iPadOS, the operating system is immediately familiar. Pressing the digital crown brings up the “home view,” a screen of app icons laid out like on a phone or tablet. To open an app, you just look at it and tap your fingers together. It’s really that easy. To see more apps, you pinch the home view screen and swipe left and right. Apps are organized into two types: Vision Pro apps and iPad apps. Vision Pro apps are shown alphabetically in the home view, and iPad apps live in a “Compatible Apps” folder. This can be annoying because it segregates apps and means opening iPad apps takes extra taps and swipes (or you can ask Siri to open the app, which has a 50-50 chance of working unless your enunciation is spot on). I get why Apple did this — so that it’s clear which apps are “new” or might have a spatial component to them — but I really hope a future update adds a way to customize the home view the same way you can organize your home screens on iPhones and iPads. I also understand that throwing too many features into a first-gen product might be overwhelming. Apple is known for playing the long game and slowly introducing features with software updates and new hardware revisions so as not to over-stimulate.

Home view in Apple Vision Pro. The app icons just float in mid-air in your surroundings.

Screenshot by Raymond Wong

“Environments” Helped Me Stay Focused:

Apple doesn’t consider the Vision Pro a virtual reality headset, but there are some VR-like aspects with “Environments,” which are virtual recreations of scenic landscapes like the Haleakalā volcano, Joshua Tree, the beach, or even the surface of the Moon. You can think of Environments as a “curtain” that you can use to block out your surroundings while computing in visionOS. Environments can be turned completely on or off, or blended with your surroundings using the digital crown. The dozen available Environments all look pretty great. There’s a little bit of depth to each Environment, and some even have ambient sounds like light rain. My apartment is in a perpetually messy state because I work from home most of the week. It’s filled with review device boxes, camera equipment (lights and tripods), and my own personal belongings, so being able to turn up an Environment and literally cover up all the stuff in my surroundings is great for removing distractions when I’m working, watching movies, or playing games.

Apple Needs To Give Us A Clock:

I really wish there were a persistent clock in visionOS. There are two system-level ways to see the time: open Control Center or ask Siri. Especially with Environments turned up, it’s very easy to lose track of time. It’s like being in a Las Vegas casino; you forget how long you’ve worn the headset unless the low-battery notification reminds you or you can see your surroundings (if Environments are set to zero or blended). I ended up adding a clock widget to my virtual workspace using the Widgetsmith app. I’d love to see a persistent clock, maybe alongside the Control Center (just look up from anywhere to see an arrow icon to open it) or on the home view. If you wear a watch or have a clock on your desk or wall, you could just look at that for the time, but I don’t wear a watch at home and I have no clocks other than the one in my bathroom.

You can open pretty much as many app windows as you want in visionOS.

Screenshot by Raymond Wong

Siri Is Actually Useful:

Siri is shockingly good on Vision Pro. I can use my voice to open apps, close apps, close all apps, reboot the headset, and more. Whenever I’ve struggled to do something, I started asking Siri and found the voice assistant seems to have more capabilities than on my iPhone or Mac. “Siri, open Wi-Fi settings” is faster than opening Control Center or returning to the home view, clicking on the Settings app, and then opening the Wi-Fi tab. I could see future versions of Vision Pro relying on an even more advanced AI-charged Siri to control visionOS faster than with eye or hand tracking. For all the hate that Siri has gotten in the past decade for being slow or useless, I think Siri could see some real love and utility with visionOS if it gets a strong injection of large language models (LLMs).

You’ll Want A Real Keyboard For Work:

The virtual keyboard is only good for short text entry, not writing long articles like this review.

Screenshot by Raymond Wong

Typing with the virtual keyboard is not a great experience — I’ve outlined how it works several times before. You can look at each key and tap your fingers together to select it, or peck at individual keys like a person who’s just learning how to type. Both are fine in a pinch for entering short text like a PIN or password (though you’ll probably rip your hair out if your password is a combination of letters, numbers, symbols, and mixed casing). It would be nice if Apple let you place the virtual keyboard flat on a surface so that you could tap on it and get some kind of feedback like you would on an iPad. I tried to position the virtual keyboard flat, but it’s always angled. Bummer. For entering large amounts of text (like this long review), a wireless keyboard is a must. The Vision Pro works with just about any Bluetooth keyboard (Apple’s Magic Keyboard works, but not older ones that use AA batteries). At a table, a wireless keyboard and trackpad are invaluable for working in the Vision Pro, but they’re not ideal on a couch or bed. For those places, I’m considering a pocket keyboard like the ones used with smartphones, smart TVs, or home theater PCs (HTPCs). One thing I found difficult to do with the virtual keyboard was move the I-beam cursor. It turns out there’s a gesture for controlling the I-beam cursor that’s equivalent to pressing and holding the spacebar on an iPhone’s on-screen keyboard: pinch your thumb and index finger together firmly, then slide left and right to where you want to insert a character.

It Really Is A Great Virtual Display For Your Mac:

How awesome is it to have a massive virtual display inside of Vision Pro for your Mac?

Screenshot by Raymond Wong

If you own a Mac, the Vision Pro makes for an outstanding virtual screen for it. Connecting is easy. If both devices are on the same Wi-Fi network, you’ll see a pop-up appear over the Mac prompting you to connect to it. If that doesn’t appear — and sometimes it doesn’t, because no wireless handoff is perfect — you can manually connect the two through Control Center. In my 48-hour hands-on, I said I really enjoyed enlarging my Mac to a huge 4K virtual display, and I’ve been editing all my Vision Pro vertical videos in Final Cut Pro using it. The Vision Pro’s high-resolution displays really come in handy for Mac Virtual Display. If not for all of its pixels and the high pixel density, mirroring macOS with its tiny menu bar, buttons, and drop-down menus would have been a very subpar experience. Any Mac with Apple silicon will mirror its desktop as a 4K virtual display. Supported Intel Macs only mirror in the headset as a 3K virtual display. And as I’ve previously said, the best part is that the Mac virtual display is its own window that sits alongside Vision Pro and iPad apps. The keyboard and trackpad can even control the cursor across the different apps using Universal Control. The only thing you can’t do with the Mac virtual display is use eye or hand tracking to control macOS.

Personas Are Very Much A Beta Feature:

My colleagues were not ready for me to show up on Zoom with my Persona.

Screenshot by Raymond Wong

Let’s talk about Personas, the 3D avatar representation that can be used for FaceTime and other video calling apps such as Zoom and Microsoft Teams. They… look really strange. I surprised some of my friends on FaceTime and my Inverse colleagues on a Zoom call by showing up as my Persona, and the reactions were pretty universal: “You look like a ghost.” And they’re right. My Persona not only looks like a PS3 game character (remember the console’s Second-Life-like PlayStation Home online hangout feature?) but if I turn sideways or look behind my back, there’s nothing there… it’s just a mess of pixels. Even stranger is that the quality of Personas varies between Vision Pro apps; my avatar looked better on Zoom than on FaceTime. To be fair, Personas are labeled as a beta feature and they’re already looking better in the visionOS 1.1 developer beta, so Apple is clearly making improvements. That being said, as strange and goofy-looking as appearing as a Persona on a video call is, the creepy factor fades rather quickly. I directly asked everyone at the end of each call whether they felt weirded out by my polygonal reconstruction; they mostly didn’t care after a few minutes. Somebody asked me on X whether Personas are good enough to use in front of a customer on a video call. No, they are not. And I wouldn’t recommend appearing on a video call as a Persona if the person you’re calling is important (like a boss or a job recruiter) and has never experienced a Persona before, or is someone you’re close with and would prefer to show your real facial expressions to. You don’t want to kick yourself for failing to close a deal or land a job interview because you decided to show up as your 3D alter ego.

Personas are a beta feature.

Screenshot by Raymond Wong

EyeSight Might Freak People Out:

EyeSight is... kind of dim and strange-looking.

Photograph by James Pero

And speaking of 3D representations… EyeSight, the reconstruction of your eyes (captured when you scan your face to create a Persona) that’s projected on the headset’s outer display, is also really strange. The EyeSight display is a lot dimmer and lower resolution than the marketing images make it out to be. Seeing photos of my EyeSight, I don’t hate it, but I also don’t love how it looks. There’s something just a bit off about the eyes that doesn’t sell the illusion that they’re real; they don’t look like they’re aligned correctly with your face. Even with the curved glass screen and lenticular layer to give it a 3D-like look, it’s like a bad version of Metroid’s Samus Aran wearing her helmet. Apple says EyeSight is there solely for other people, so that they know when you’re looking at them and not at content. But it’s not useful if you’re using the headset alone, which is what I suspect most people will be doing. Maybe a simple indicator light would suffice on future versions.

Yes, There’s A Short Battery Life:

The battery pack has a weighted feel.

Photograph by Raymond Wong

Not that Apple is hiding this fact, but the Vision Pro’s battery life is not very long. Apple rates the external battery pack for up to 2.5 hours of streaming video and up to 2 hours of mixed use. That falls in line with my testing: long enough for most movies unless it’s a Christopher Nolan or Martin Scorsese film that tests everyone’s bladders at over 3 hours, but not long enough for continuous work on a cross-country flight from New York to California without plugging into an external battery pack or outlet. I have been consistently able to get 3 to 4 hours if I’m mostly working in Safari and iPad apps and not doing a whole lot of media playback. Out of everything about the Vision Pro, the short battery life is its biggest drawback. It’s also disappointing that the battery is not hot-swappable. If you disconnect the battery to swap in a freshly charged one, the Vision Pro shuts down and you lose your spatial windows. Plugging a large-capacity portable battery pack into the Vision Pro’s battery via its USB-C port does provide an affordable way to extend power, but then you have two battery packs connected with a cable, which is in turn connected to the headset by another cable, and that’s just not ideal, especially if you’re standing up. It’s too many batteries in your pockets. I thought Belkin’s Battery Holder Case, which clips the battery to your belt or waistband or slings over your shoulder with a strap, would alert the fashion police, but you know what? I kinda dig it. It’s functional and stylish. Look, if people can rock fanny packs in public with no shame, clipping a Vision Pro battery to your belt indoors is the least of your worries.

And About That USB-C Port For Data and Video In:

Photograph by Raymond Wong

Lastly, since I’m on the topic of ports: I really wish the Vision Pro had a USB-C data port. The headset works well enough (not perfectly) with iCloud and other cloud services via apps, and you can AirDrop files to it from other Apple devices, but a USB-C port on the headset itself would solve the last-mile “real work” problem the iPad once suffered from. I’m, of course, talking about connecting a USB-C dongle for transferring photos and videos from a memory card reader. To edit the photos in this review, I had to connect my SD card to my MacBook Pro and import them into our content management system (CMS) using the Mac virtual display. It would have been far more convenient if the Vision Pro could connect to USB-C data devices, even through the port on the battery pack, which only supports charging. And while I’m making a wishlist, maybe add video-in support via USB-C DisplayPort or HDMI for connecting a game console like a Nintendo Switch, PS5, or Xbox Series X/S directly to the Vision Pro. I’m well aware that Apple doesn’t include video-in on any of its current devices, but the company once did: iMacs released through 2013 had a feature called “Target Display Mode” that let you use their displays as monitors for other devices. Apple hasn’t shipped a device with Target Display Mode in over a decade, but I don’t want to say it’ll never bring back a feature, because it did with MagSafe, HDMI (out), and the SD card reader on MacBook Pros, and that was extremely well received.

Do We Really Want This Future?

Is a spatial computer really what we want?

Photograph by James Pero

Apple Vision Pro is expensive, but also nothing short of profound. It’s not just the most beautiful head-mounted computer ever made; it also has the most intuitive input, the most responsive interface, and the best shot at succeeding thanks to its close integration with other Apple devices (iPhones, iPads, Macs, AirPods) and services (iCloud, iMessage, FaceTime).

Skepticism of new technology, especially a new product from a company with such a strong ecosystem lock-in as Apple, is fair and necessary if we are to build the future we want, as opposed to the future that tech companies want to foist on everyone.

In the 2010s, we let tech giants like Apple, Google, Samsung, Microsoft, Amazon, and Meta dictate our relationships with computers — mobile or not — because we needed access to modern services for work, entertainment, and communication. As consumers, we did little to push back on whether we really needed bigger phones and tablets, miniature computers on our wrists, always-listening voice assistant smart speakers, and the list goes on. Similarly, developers and creators flocked to new devices hoping for the same app gold rush that the iPhone kickstarted with the App Store. For better or worse, tech companies, developers, and consumers together were too busy barreling forward into the exciting and unknown future. We told ourselves that technology would solve everything; we just needed to embrace the revolutionary Shiny New Gadget every few years.

As the deputy tech editor at Inverse, it’s my responsibility not only to review new devices like the Vision Pro that could fundamentally change how we live, and to highlight all the wondrous potential they might offer, but also to start the conversation: Is this what we really want? History is littered with failed technological advances and “magical” products that promised to transform everything, if only we blindly let companies rapidly iterate on them and ignored all the downsides. Now, with Apple kicking off the “spatial computing era,” it’s never been more important that we ask ourselves, no matter how incredible the Vision Pro’s eye and hand tracking are or how futuristic the mixed reality visionOS experience is: Do we want this future where a computer covers half our face?

Since this is my review of the Vision Pro, it’s only fair that I answer the question. My answer is: yes… but only sometimes. That is not meant to be a cop-out answer. New technology that enables us to do more computing, in new ways and faster than before, is undeniably hard to ignore. However, too much of anything, including technology, can only lead to an unhealthy and even more dystopian reality. We allowed smartphones to swallow us whole; we can and should be more intentional about when and where we summon the immersiveness of mixed reality or spatial computing.

Despite having “passthrough” cameras that let you see your surroundings, I view (no pun intended) the Vision Pro the same way I do a MacBook or laptop. There are times and places when you need the big screen and performance of a laptop — usually at home or school or the office for work, entertainment, and communication — but you’d almost never take it out on public transportation, at a bar or restaurant, or at a sporting event. You definitely would never use a laptop while sitting behind the wheel of a car, even if the vehicle has a self-driving or assisted-driving mode, and you never should use Vision Pro behind the wheel, either. (Apple even explicitly says not to do that when you first set up the headset.) We should reject Vision Pro (and headsets like it) as phone replacements today and in the future to prevent it from tightening our already self-imposed handcuffs to the digital world. Let Vision Pro be what it clearly is: a modern take on a general-purpose desktop computer, only instead of a tower under your desk or a clamshell on top of it, it’s a headset with three-dimensional virtual app experiences that can blend with your surroundings. Apple Vision Pro has set the bar for spatial computing, but let’s leave the augmented reality computing for indoors and be intentional about where and when we use it. Our reality outdoors still needs our undivided attention.
