
A closer look at Apple’s Vision Pro keyboard and other controls

Dec 17, 2023

By Jon Porter, a reporter with five years of experience covering consumer tech releases, EU tech policy, online platforms, and mechanical keyboards.

An Apple developer session has offered an in-depth look at the many ways users will (eventually) be able to control Apple's new Vision Pro headset, including a virtual keyboard that you'll be able to type on in midair. It comes to us thanks to the "Design for spatial input" session, in which two members of Apple's design team walk prospective developers through best practices for designing apps for the new platform.

Apple seems keen for users to mainly interact with the headset by simply looking at UI elements and making small hand gestures with their arms relaxed in their lap. But in its developer session, Apple designer Israel Pastrana Vicente admits that "some tasks are better suited to interact directly," which can mean reaching out and touching UI elements (a feature Apple refers to as "direct touch"). There's also support for physical keyboards, trackpads, and game controllers.
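To make that concrete: on visionOS, standard SwiftUI controls pick up the look-and-pinch interaction automatically, with no custom gesture code required. The sketch below is our own minimal illustration of that default behavior, not code from Apple's session.

```swift
import SwiftUI

// Minimal visionOS view: standard SwiftUI controls respond to the
// system look-and-pinch gesture automatically; no gesture code needed.
struct ControlsView: View {
    @State private var volume = 0.5

    var body: some View {
        VStack(spacing: 20) {
            Button("Play") {
                // Fires when the user looks at the button and pinches,
                // or reaches out and taps it directly.
                print("Play tapped")
            }
            Slider(value: $volume) // adjustable by pinch-and-drag
        }
        .padding()
    }
}
```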

"Some tasks are better suited to interact directly"

So let's talk about the Vision Pro's virtual keyboard. Apple designer Eugene Krivoruchko explains that it's important to offer plenty of visual and audio feedback while using it, to make up for the "missing tactile information" involved with touching a real peripheral. "While the finger is above the keyboard, buttons display a hover state and a highlight that gets brighter as you approach the button surface," Krivoruchko explains. "It provides a proximity cue and helps guide the finger to target. At the moment of contact, the state change is quick and responsive, and is accompanied by matching spatial sound effect."
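Apple hasn't published the keyboard's implementation, but the hover highlight Krivoruchko describes maps roughly onto SwiftUI's hoverEffect modifier. The sketch below is a rough approximation of a single virtual key; the onType callback and playKeyClick hook are hypothetical stand-ins, and the finer proximity-based brightening is system behavior rather than something third-party code controls directly.

```swift
import SwiftUI

// Rough sketch of one virtual key. `.hoverEffect(.highlight)` asks the
// system to highlight the view as the user's gaze or finger approaches.
struct VirtualKey: View {
    let character: String
    let onType: (String) -> Void // hypothetical typing callback

    var body: some View {
        Button(character) {
            playKeyClick() // assumed audio feedback at the moment of contact
            onType(character)
        }
        .buttonStyle(.borderedProminent)
        .hoverEffect(.highlight)
    }

    private func playKeyClick() {
        // Placeholder: a real app might trigger a spatial sound effect here.
    }
}
```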

Meta also recently rolled out a similar experimental feature, also called direct touch, that allows Quest VR users to touch menu buttons and virtual keyboards. But UploadVR notes that, thanks to its depth sensors, Apple's Vision Pro is likely to be more accurate than Meta's implementation, at least until the depth sensor-equipped Quest 3 arrives later this year.

There's also support for voice input, with the same developer session noting that focusing your eyes on the microphone icon in the search field will trigger a "Speak to Search" feature. That’ll likely draw audio data from the six microphones built into the Vision Pro.

Direct touch can also be used to interact with other system elements. You can tap and scroll as though you're using a touchscreen, and one Apple demo shows the wearer making a pen motion in midair to write a word and draw a heart shape in Markup. Although the primary input is the user's hand, Krivoruchko explains that the system also uses eye tracking to augment the gesture. "You control the brush cursor with your hand, similar to a mouse pointer, but then if you look to the other side of the canvas and tap, the cursor jumps there, landing right where you're looking. This creates a sense of accuracy and helps to cover the large canvas quickly," the designer says.
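Apps don't get raw gaze coordinates on visionOS (the system delivers a look-and-pinch as a tap at the gazed location), so the snippet below is purely a conceptual sketch of the jump-to-gaze logic Krivoruchko describes; gazePoint, delta, and the jumpThreshold tuning value are all our own assumptions.

```swift
import simd

// Conceptual model of the gaze-assisted brush cursor. All inputs here
// are hypothetical: visionOS does not expose raw gaze data to apps.
struct BrushCursor {
    var position: SIMD2<Float> = .zero

    /// Assumed tuning value: taps farther than this from the cursor
    /// teleport it to the gaze point instead of moving it relatively.
    let jumpThreshold: Float = 120

    mutating func handleTap(at gazePoint: SIMD2<Float>) {
        if simd_distance(position, gazePoint) > jumpThreshold {
            position = gazePoint // "the cursor jumps there"
        }
    }

    mutating func handleHandMove(delta: SIMD2<Float>) {
        position += delta // hand moves the cursor like a mouse pointer
    }
}
```

The threshold keeps small hand adjustments relative while a look-and-tap across the canvas teleports the cursor, which matches the "jumps there" behavior described in the session.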

We still have plenty of questions about how Apple's expensive new Vision Pro headset will work in practice (in particular, whether it can be paired with motion controllers from other manufacturers), but between our hands-on time and developer sessions like these, the picture is starting to come into focus.

Update June 8th, 12:57PM ET: Added an image of gestures from Apple.

