Apple Can Beat Meta by Finally Making VR User-Friendly
How you navigate in VR has improved from the Quest 2 to the PSVR 2, but no company has entirely figured it out.
If there’s any obvious thing a new player in the virtual reality space could improve, it’s the interface and accompanying interactions that ferry users between VR apps and games. Smartphones have calcified around a launcher with a grid of apps or widgets you interact with using touch. VR headsets haven’t found that same kind of consensus yet.
Part of the issue is how we interact with these devices: controllers may be more accurate, but they're also more cumbersome. Hand tracking seems like the most natural option, but pulling it off is more complex. It makes sense that interacting with things in the virtual world should resemble how we do it outside the headset, but exactly how that happens is still up in the air.
With the Quest Pro released last year and the PSVR 2 this year, VR interfaces and interactions have taken baby steps forward, but there’s an opportunity to streamline things — and Apple might be the one to take advantage of it.
Hands and eyes
To sift through a sea of task bars and app launchers, most VR headset-makers have relied on your hands or, more recently, your eyes to navigate menus. Sometimes that’s with a controller, like Meta’s original Touch controllers, which use a ring of invisible IR lights to keep an eye on your hand position in VR. The company took that further on the Quest Pro and introduced an inside-out tracking system on the controllers themselves for improved accuracy.
Other times that’s just your hands. Hand tracking similarly relies on the headset’s built-in cameras but drops the need for clunky controllers. It’s far from ideal (your room’s lighting can dramatically affect whether your headset can see your hands), but it feels closer to how pop culture has long imagined virtual reality would work.
Meta seems keen on making hand tracking even better. A recently added experimental feature on the Quest 2 and Quest Pro lets you interact with menus and software interfaces with a tap and a swipe rather than treating your hand like a cursor. Sony went a different route with its new headset, opting for something even more streamlined. In applications that support the feature, interior cameras in the PSVR 2 let you navigate menus with just your eyes: you look at the menu item you want to select and press a controller button to confirm.
Neither option feels like the entire solution on its own. Still, they handle the increasing complexity of VR interfaces in a way that should either feel easy to the point of being instantaneous (eye tracking) or familiar enough to understand without instruction (hand tracking).
A “Macintosh Moment”
If reports are to be believed, Apple’s mixed reality headset could combine both. Bloomberg reported in January that the headset would use both hand tracking and eye tracking to interact with an interface that looks a bit like a virtual reality version of the iPhone and iPad’s familiar home screen.
Apple’s rumored headset “allows the wearer to control the device by looking at an on-screen item — whether it’s a button, app icon, or list entry — to select it. Users will then pinch their thumb and index finger together to activate the task,” Bloomberg writes.
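To make that reported interaction concrete, here is a minimal sketch of the gaze-to-target, pinch-to-confirm pattern Bloomberg describes. It is purely illustrative: the types and functions below are hypothetical stand-ins and do not reflect any Apple API.

```swift
import simd

// Hypothetical types for illustration only; not Apple's API.
struct GazeSample {
    let origin: SIMD3<Float>     // eye position in world space
    let direction: SIMD3<Float>  // normalized gaze ray
}

struct UIElement {
    let id: String
    let center: SIMD3<Float>
    let radius: Float            // simple spherical hit target
}

/// Returns the element the user is currently looking at, if any,
/// by testing the gaze ray against each element's hit sphere.
func gazeTarget(_ gaze: GazeSample, in elements: [UIElement]) -> UIElement? {
    elements.first { element in
        let toElement = element.center - gaze.origin
        let along = simd_dot(toElement, gaze.direction)
        guard along > 0 else { return false }        // element is behind the user
        let closestPoint = gaze.origin + gaze.direction * along
        return simd_distance(closestPoint, element.center) <= element.radius
    }
}

/// Activates whatever the user is looking at when a pinch is detected.
func handlePinch(isPinching: Bool, gaze: GazeSample, elements: [UIElement]) {
    guard isPinching, let target = gazeTarget(gaze, in: elements) else { return }
    print("Activated \(target.id)")   // stand-in for launching the app or tapping the button
}
```

The appeal of the pattern is that the eyes do the pointing and the hand only confirms, which is why the two input methods complement rather than compete with each other.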
Any virtual reality headset from Apple should be considered vaporware at this point. There is, however, a good amount of historical precedent for expecting Apple’s new entry into a market to set a new standard for human-computer interaction. Arguably the iPhone did that for smartphones with its operating system, App Store, and capacitive touchscreen. The Apple Watch took longer to perfect but pulled off the same thing with smartwatches using a combination of touch, voice controls, and the Digital Crown to navigate watchOS. And its UI/UX has influenced other smartwatches; the Pixel Watch is much more like the Apple Watch than any smartwatch Google has backed before.
Perhaps more importantly, Apple’s first Macintosh is often credited with popularizing the graphical user interface, an innovation that fundamentally changed how people use computers. That makes a recent claim from Vrvana founder Bertrand Nepveu all the more interesting: in an interview with Radio Canada, he called Apple’s entry into the virtual reality space a “Macintosh moment.” Apple acquired Vrvana in 2017, and Nepveu reportedly left the company in 2021, but his suggestion that Apple’s take on VR could reinvent the space the way the Macintosh did is a further sign that the interface and its interactions may be what the company has spent its time getting right.