Building a Playground of 3D User Interfaces

by Martin Schubert and Barrett Fox • November 3rd, 2017

As mainstream VR/AR input continues to evolve – from the early days of gaze-only input to controllers and fully articulated hand tracking – so too do the virtual user interfaces we interact with. We’re moving beyond flat user interfaces (UIs) designed for 2D screens toward a future filled with spatial interface paradigms that take advantage of depth and volume.

At Leap Motion, our team is constantly pushing the boundaries of our interactive VR toolkit. Recently, we created a VR sculpture prototype to explore its performance and usability. Along the way, we experimented with how spatial UIs could be used to control aspects of that sculpture – or any piece of complex content – by creating a playful set of physical-like user interfaces.

In the spirit of open exploration, here’s a journey through our latest round of rapid prototyping and design. Along the way, we’ll see how dynamic feedback can elevate a humble button press into a compelling and intuitive virtual interaction.

A Living Sculpture

We built the Leap Motion Interaction Engine to give developers the power to define the physical laws of the virtual universe. It unlocks virtual objects and physically inspired interfaces that you can pick up, throw, nudge, swat, smoosh, or poke. The concept of a VR sculpture was a great way to put the Interaction Engine to the test. (For a deep dive into this part of the project, check out our blog.)

Once we built the sculpture, it was time to take the interactions to the next level.

From Flat Screens to VR Interfaces

When someone first puts on a hand-tracking-enabled VR headset, it often seems they’re rediscovering how to use their own hands. In a sense, they are. When we bring our hands into a virtual space, we also bring a lifetime’s worth of physical biases with us. Compelling spatial interfaces complement and build upon these expectations.

For this exploration, we wanted to focus on physicality, playfulness, and conveying the distance between hands and UI elements through dynamic feedback.

A conceptual mood board featuring interfaces both simple and complex, with a focus on physicality and play. We also explored ideas around form, affordances, and use of color accents.

Since the prototype was designed to run on mobile VR headsets, we knew it might be experienced with only 3-degree-of-freedom head tracking. This meant the user would likely be seated, so UIs needed to be within arm’s reach. With that in mind, we used curved spaces, which allow entire user interfaces to be warped into ergonomic curves.
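To make the idea concrete, one way to picture a curved space is to bend a flat panel’s local coordinates onto a cylinder centered on the seated user. The sketch below is a hypothetical Python illustration of that warp, not the Interaction Engine’s actual implementation; the function name and the 0.4 m radius are assumptions:

```python
import math

def warp_to_cylinder(x, y, z, radius=0.4):
    """Warp a flat UI-local point onto a cylinder of the given radius (meters).

    The cylinder's vertical axis passes through the user's head position.
    x is the horizontal offset along the flat panel (reinterpreted as arc
    length around the cylinder), y the vertical offset (passed through),
    and z the element's depth offset from the panel surface toward the user.
    Returns the warped (X, Y, Z) position in user-centered coordinates.
    """
    angle = x / radius            # arc length -> angle in radians
    r = radius - z                # raised elements sit closer to the user
    return (r * math.sin(angle), y, r * math.cos(angle))
```

Because arc length is preserved, elements keep their apparent size and spacing while the whole panel stays at a constant, arm’s-length distance from the user.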

Once we defined the layout, it was time to design the user interfaces themselves.

Building a Button

Since the iPhone introduced multi-touch input in 2007, we’ve seen 2D touchscreen interaction design evolve into a responsive, motion-filled language. Modern apps respond to any input with visual, audio, and sometimes even subtle haptic feedback. Taps and swipes are met with animated ripples and dynamic element resizing.

In VR, every interactive object should respond to any casual movement. Users don’t always know what to expect, and dynamic feedback helps to build a mental model of how the virtual world works, and what each action achieves. Without dynamic feedback, an interaction can feel unsatisfying and weird.

Beginning with the most fundamental of UI elements – a button – we asked what this sort of reactiveness might look like in VR with hands. While touchscreen button interactions are binary (contact vs. non-contact), pushing a button in 3D involves six distinct stages:

  • Approach. Your finger is near the button, which may start to glow or otherwise reflect proximity.
  • Contact. Your finger touches the button, which responds to the touch.
  • Depression. Your finger starts to push the button.
  • Engagement. Success! The button may change its visual state and/or make a sound.
  • Ending contact. Your finger leaves the button.
  • Recession. Your finger moves away.
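Taken together, the stages above amount to a small state machine driven each frame by the fingertip’s signed distance to the button’s contact surface. Here is a minimal Python sketch of that idea; the thresholds, state names, and event names are illustrative assumptions, not Leap Motion’s API:

```python
APPROACH_RANGE = 50.0   # mm: begin proximity feedback within this distance
ENGAGE_DEPTH = -6.0     # mm: depression depth that triggers engagement

class ButtonStateMachine:
    """Tracks a button through approach, contact, depression, engagement,
    ending contact, and recession, firing feedback events on transitions.
    Contact is the moment the distance crosses zero."""

    def __init__(self):
        self.state = "idle"
        self.events = []    # audio/visual cues to fire this frame

    def update(self, distance):
        """Advance one frame. distance is the fingertip's signed distance
        to the contact surface in mm (negative = button depressed)."""
        self.events.clear()
        if self.state != "engaged":
            if distance > APPROACH_RANGE:
                self.state = "idle"
            elif distance > 0.0:
                self.state = "approach"     # white ring rises to meet finger
            elif distance > ENGAGE_DEPTH:
                self.state = "depress"      # ring at top, button being pushed
            else:
                self.state = "engaged"      # frame changes color
                self.events.append("engage_click")
        elif distance > 0.0:                # ending contact / recession
            self.state = "approach" if distance <= APPROACH_RANGE else "idle"
            self.events.append("release_click")  # slightly higher-pitched
        return self.state
```

Latching the engaged state until contact ends is what lets a single press produce exactly one click on the way in and one on the way out.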

When your hand approaches a button in this prototype, a white ring rises up from its base to meet the contact surface. As your finger gets closer, the ring gets closer, until contact is made and the ring reaches the top of the button.
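At its core, the rising ring is a normalized distance mapping: the closer the finger, the higher the ring. A hypothetical sketch (the 50 mm activation range is an assumption):

```python
def ring_height(distance, max_range=50.0):
    """Map fingertip distance (mm) to a normalized ring height.

    Returns 0.0 when the finger is out of range (ring resting at the
    button's base) and 1.0 at contact (ring at the contact surface),
    rising linearly in between.
    """
    t = 1.0 - distance / max_range
    return max(0.0, min(1.0, t))
```

Easing the value (e.g. with a smoothstep) instead of a linear ramp would soften the ring’s motion near the endpoints.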

Depressing the button until it engages changes the color of the button frame. Along with an audio click, this confirms the successful completion of the interaction. When contact between finger and button ends, a second slightly higher-pitched click marks the end of the interaction. The white ring recedes as the user moves their hand away.

These are actually toggles (the stubborn cousin of the button). They activate all of our sculpture’s presets.

A similar approach with an expanding white inner ring was used on the sliders.

Before settling on rising (or expanding) ring feedback, we also experimented with having the button morph as a finger approached. The idea was to make the button convex at rest, affording the action of being pushed. As your finger approached, it would morph into a concave shape, metaphorically providing a key to the button’s lock.

Inspired in part by the finger-key from The Fifth Element.

This style took full advantage of the 3D nature of the UI components and felt very interesting. However, it ultimately didn’t communicate how close the finger was as effectively as the rising ring approach. At some point we would love to delve deeper into this concept, perhaps by having your hand mesh also morph – turning your fingertip into a key shape.

Physical VR User Interfaces

Beyond adding spatial UI feedback for buttons and sliders, we were also curious to see whether a common mechanical input from the physical world would be compelling in VR. Physical trackballs are highly tactile with a wide spectrum of interaction levels. They can be manipulated with a fingertip and dialed in with slow precision or can be spun with force like a Kugel Fountain. A trackball seemed like a prime candidate for virtual recreation.

This spatial user interface playset is just a glimpse of the possibilities afforded by the Interaction Engine’s physics-based foundation. What kinds of 3D UIs would you like to see, touch, and create with these tools?

Mood board image credits: Iron Man 2, Endeavor Shuttle Cockpit, Fisher Price Vintage, Matrix Reloaded, Wirtgen Milling Machine, Fisher Price Laugh and Learn, Grooves by Szoraidez, Google Material Design Palette, sketches by Martin. This is a guest post not produced by the UploadVR staff. No compensation was exchanged for the creation of this content.

  • Paul-Aristide Barré

    Is it possible to get more information via a link? Is the project available to download?

  • Brandon Russell

    This is awesome! I have some harebrained ideas…

    In the first example, the button frame should turn to a button-independent color with a naturally high value (e.g. yellow or cyan) and perhaps glow. This button-independent color would communicate ‘active’ at a glance. The way it is now, the color of the frame and the button itself blend together more or less depending on the perspective of the user. Also, the morphing button in the third example should morph to convex as you approach and concave as you press – the reverse. I believe convexity encourages touch more than concavity – something about the pleasure of depressing a volume. Also thinking of Michelangelo’s The Creation of Adam.

  • Paul-Aristide Barré

    Unity Engine ?

    • Leap Motion

      Unity Engine.

      • Доминик Ефремов

        Any way to test it out? I’d be glad to gather feedback from myself, my friends, and random people and send it to you if you need it.

        HTC Vive + LEAP motion.

  • lovethetech

    The sphere rotation from another digital artifact is not intuitive.

    Please no buttons in VR…

    • Aaron Penn

      Why no buttons??? Please tell me honestly, what is more intuitive than a button?

      • lovethetech

        Intuitive digital interactions.
        This is a 3D world without size/depth limitations.
        We lived and operated in a world without buttons for thousands of years.
        An example: if I want to turn a digital wheel, I would like a crank instead of a button (or slider) to push to spin the wheel.
        I am asking, and have asked, academia and institutions to look at that. A tool set of a few hundred tools plus some natural hand gestures should make these buttons unnecessary.
        In AR, they look intrusive and ugly.
        Buttons served well on limited 2D screens.

        • Leap Motion

          We lived without many things for thousands of years, but buttons have been ubiquitous in the physical world for well over a century. We use them to operate cars, airplanes, helicopters, computers, phones, elevators, doorbells, bank machines, and literally any other thing that involves a binary decision point. While we’re also looking towards a future where the interface is able to disappear as much as possible, buttons remain invaluable interactive tools that everyone can understand and use.

  • Cool

  • Randomly obsessing over the idea of your fingertip turning into a key. It might be more rewarding to have to rotate your hand to “unlock” a button than to gauge the depth of a “press”?
    On some level, that twist of the wrist acts as its own haptic feedback – so you feel like you’ve done something. (Just try it right now in the air!) I think this would feel much preferable to the bad experiences I’ve had trying to press a virtual button with Rift or Vive ghost hands (which is more “assume the single-finger hand shape and cautiously move the controller forward, a little more, a little more, frustrated that the software isn’t registering the press. I’d swear I’ve hit the #%@!ing button, but nothing is happening, so I’ll move my ghost finger forward even more. Rage”).
    on some level, that twist of the wrist kind of acts like it’s own haptic feedback – so you feel like you’ve done something. (just try it right now in the air!). I think this would be much preferable feeling to the bad experiences i’ve had trying to press a virtual button with Rift or Vive ghost hands (which is more “assume the single-finger-hand-shape and cautiously move controller forward, a little more, a little more, frustrated that the software isn’t registering the press. I’d swear i’ve hit the #%@!ing button, but nothing is happening so i’ll move my ghost finger forward even more. rage”)