Apple’s developer session provides insight into the many ways users will (eventually) be able to control its new Vision Pro headset, including a virtual keyboard you can type on in midair. The details come from a “Design for Spatial Input” session, in which two members of Apple’s design team educate prospective developers on best practices for designing apps for the new platform.
Apple seemingly wants users to interact with the headset primarily by looking at user interface elements and making small hand gestures with their arms relaxed in their laps. But during the developer session, Apple designer Israel Pastrana Vicente concedes that “some tasks are better suited to direct interaction,” which can involve reaching out and touching parts of the user interface (a feature Apple calls “direct touch”). There’s also support for using a physical keyboard and trackpad or a game controller.
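For developers, that gaze-and-pinch model maps onto fairly ordinary UI code. Here’s a minimal, illustrative SwiftUI sketch (the view and control names are ours, not Apple’s): on visionOS, standard controls like Button respond to a pinch while the user is looking at them, and a hover effect highlights whichever element the eyes are resting on.

```swift
import SwiftUI

// Illustrative sketch only, not Apple sample code: standard SwiftUI controls
// on visionOS respond to a "look and pinch" tap, with no reaching required.
struct RemotePlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        HStack(spacing: 24) {
            Button {
                isPlaying.toggle()   // fired by gaze + pinch, hands stay in the lap
            } label: {
                Image(systemName: isPlaying ? "pause.fill" : "play.fill")
            }
            .hoverEffect()           // highlights the control the eyes are resting on

            Button("Skip") {
                // the pinch acts as the "click" for whatever is being looked at
            }
            .hoverEffect()
        }
        .padding()
    }
}
```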
“Some tasks are better suited to direct interaction”
So let’s talk about the Vision Pro’s virtual keyboard. Apple designer Eugene Krivoruchko explains that it’s important to provide plenty of visual and audio feedback while it’s in use, to make up for the “missing tactile information” you’d get from a physical keyboard. “As your finger hovers over the keyboard, the buttons show a hover state, and the highlight gets brighter as you approach the button surface,” Krivoruchko explains. “It provides a distance cue and helps guide the finger to the target. At the moment of contact, the state change is quick and responsive, accompanied by a matching spatial sound effect.”
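As a rough illustration of that feedback loop (not Apple’s implementation — the hover range, the linear brightness ramp, and the playSpatialSound callback are all assumptions), the idea can be sketched in a few lines of Swift:

```swift
import Foundation

// Hypothetical sketch of the feedback Krivoruchko describes: the key highlight
// brightens as the fingertip approaches, and a sound plays at the moment of contact.
struct VirtualKeyFeedback {
    /// Distance (in meters) at which the hover highlight starts to appear. Assumed value.
    let hoverRange: Float = 0.05

    /// Returns a highlight brightness in 0...1 based on fingertip-to-key distance.
    func highlightBrightness(fingerDistance: Float) -> Float {
        guard fingerDistance < hoverRange else { return 0 }
        // Linear ramp: brighter as the finger gets closer to the key surface.
        return 1 - (fingerDistance / hoverRange)
    }

    /// Call when the fingertip first reaches the key surface.
    func fingerDidTouchKey(playSpatialSound: () -> Void) {
        // Quick, responsive state change paired with a matching audio cue.
        playSpatialSound()
    }
}
```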
Meta also recently rolled out a similar experimental feature, likewise called direct touch, that lets Quest VR users tap on menu buttons or its virtual keyboard. But UploadVR notes that Apple’s Vision Pro could be more accurate than Meta’s headsets thanks to its depth sensor, at least until the depth sensor-equipped Quest 3 launches later this year.
There’s also support for voice input, with the same developer session noting that focusing your gaze on the microphone icon in a search field will trigger the “Speak to Search” feature. That will likely draw on audio data from the six microphones built into the Vision Pro.
Direct touch can also be used to interact with other system elements. There’s the ability to tap and scroll as if you were using a touchscreen, and one Apple demo showed a wearer making gestures in midair to write out a word and draw a heart in Markup. While the primary interaction here is through the user’s hands, Krivoruchko explains that eye tracking is also used to augment the gesture. “You control the brush cursor with your hand, similar to a mouse pointer, but then if you look to the other side of the canvas and tap, the cursor jumps there and lands right where you’re looking. It creates a feeling of precision and helps cover a large canvas quickly,” the designer says.
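A rough sketch of that gaze-assisted cursor behavior (again, purely illustrative — the types and coordinate handling are our assumptions, not Apple’s code):

```swift
import simd

// Hypothetical model of the Markup brush cursor: hand motion nudges it like a
// mouse pointer, but a pinch while looking elsewhere teleports it to the gaze point.
struct BrushCursor {
    var position = SIMD2<Float>(0, 0)   // cursor location in canvas coordinates

    /// Relative hand movement moves the cursor, mouse-style.
    mutating func applyHandDelta(_ delta: SIMD2<Float>) {
        position += delta
    }

    /// On a pinch, the cursor jumps to wherever the user is looking,
    /// so a large canvas can be covered without big arm movements.
    mutating func pinchBegan(atGazePoint gaze: SIMD2<Float>) {
        position = gaze
    }
}
```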
We still have plenty of questions about how Apple’s expensive new Vision Pro headset will work in practice (not least whether it can pair with motion controllers from other manufacturers), but between our hands-on experiences and developer sessions like this one, the experience is starting to come into focus.
Updated June 8, 12:57 p.m. ET: Added gesture images from Apple.