Reality Labs is Meta’s research division, and its work is now bringing touch to the virtual world. Let’s take a closer look.
One of Meta’s Reality Labs Research teams is focused on inventing the future of interaction in augmented and virtual reality. Their goal is to create technology that solves one of the central challenges of the metaverse: “How do we touch the virtual world?”
As Meta explains in its Newsroom: “To enable this experience and bring touch to the metaverse, the team is developing haptic gloves: comfortable and customizable gloves that can reproduce a range of sensations in virtual worlds, including texture, pressure and vibration. While we’re still in the early stages of this research, the goal is to one day pair the gloves with your VR headset for an immersive experience like playing in a concert or poker game in the metaverse, and eventually they’d work with your AR glasses.”
Pioneering New Scientific Domains with Haptic Glove Research
These gloves are a challenge to build and require invention in entirely new domains of scientific research. For the past seven years, the team has been pursuing breakthroughs to make haptic gloves a reality. Here’s what Meta says about them:
- Perceptual science: Because current technology can’t fully recreate the physics of the real world in VR, we’re exploring how combined auditory, visual and haptic feedback can convince a wearer’s perceptual system that it’s feeling an object’s weight.
- Soft robotics: Existing mechanical actuators create too much heat for such a glove to be worn comfortably all day. To solve this, we’re creating new soft actuators — tiny, soft motors all over the glove that move in concert to deliver sensation to the wearer’s hand.
- Microfluidics: We’re developing the world’s first high-speed microfluidic processor — a small microfluidic chip that controls the air flow that moves the actuators. The use of air (a fluid) means we can fit many more actuators on the glove than would otherwise be possible with electronic circuitry.
- Hand tracking: Even with a way to control air flow, the system needs to know when and where to deliver the right sensations. We’re building advanced hand-tracking technology to enable it to identify precisely where your hand is in a virtual scene, whether you’re in contact with a virtual object and how your hand is interacting with the object.
- Haptic rendering: Our haptic renderer sends precise instructions to the actuators on the hand, based on an understanding of things like the hand’s location and properties of the virtual objects (such as texture, weight and stiffness) that the hand comes in contact with.
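To make the last two steps concrete, here is a minimal sketch of how a haptic renderer might map hand-tracking output to an actuator command. All names and numbers (`VirtualObject`, the 10 mm penetration scale, the pressure formula) are illustrative assumptions, not Meta’s actual system.

```python
# Hypothetical haptic-rendering step: given how far a tracked fingertip
# penetrates a virtual surface, compute a pressure command (0..1) for the
# soft actuator at that fingertip, scaled by the object's stiffness.
# This is a sketch under stated assumptions, not Meta's implementation.

from dataclasses import dataclass

@dataclass
class VirtualObject:
    stiffness: float  # 0.0 (very soft) .. 1.0 (rigid)
    texture: float    # surface roughness, 0.0 (smooth) .. 1.0 (rough)

def render_contact(penetration_mm: float, obj: VirtualObject) -> float:
    """Map fingertip penetration depth to an actuator pressure command,
    saturating at 10 mm and scaling by the object's stiffness."""
    depth_factor = min(1.0, penetration_mm / 10.0)
    return depth_factor * obj.stiffness

# Usage: a fingertip pressing 5 mm into a fairly stiff virtual ball.
ball = VirtualObject(stiffness=0.8, texture=0.3)
print(render_contact(5.0, ball))  # 0.4
```

In a real system a loop like this would run at high frequency, with the hand tracker updating penetration depth and the microfluidic controller translating each pressure command into airflow to the actuators.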
With these developments, Meta has pushed the boundaries of what is possible in haptics. The team is excited about the progress made so far and its potential to create a virtual world that all of us can touch.