Oculus Quest Robot

by Meta

Control a robot with a Meta Quest 2 VR Headset. View the robot’s camera and control servos with your hands or Quest controllers.

How to add the Oculus Quest Robot robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Virtual Reality category tab.
  5. Press the Oculus Quest Robot icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Oculus Quest Robot robot skill.

How to use the Oculus Quest Robot robot skill

User-inserted image

Use the Meta Quest 2 VR headset to view any robot's camera and control any robot's servos with your hands or controllers. Servos can be attached to finger movements, hand movements, controllers, and buttons. Additionally, there is a built-in remote desktop for controlling ARC without having to remove the VR headset. This robot skill requires the companion Oculus Quest Robot app to be installed on the headset (see the download instructions below).

Why Control Robots From VR?

While it is exciting to use this robot skill and have a robot mimic your movements with VR accessories, it is also a preview of future technology in the workplace. Robots can perform many jobs, but the computing power to fully automate those tasks is not yet available. Today, VR technology is advanced enough to allow humans to control robots remotely and accomplish tasks. This means, for example, that warehouse workers can work from the comfort and safety of their homes while improving outcomes by remotely controlling robots. As artificial intelligence improves, robots will require less human intervention, but humans will still be needed to help a robot when it gets stuck. This means one person can control many robots, complementing Synthiam's approach with our Exosphere product.

Imagine people working from home while operating robots remotely in dangerous locations, such as mining, disaster recovery, or space exploration. With Synthiam's agnostic approach, our platform can program any robot. With the addition of virtual reality remote control, we are saving lives and improving job safety. There is no better time to familiarize yourself with this technology at home, school, or the workplace. 

Downloading Quest Robot Controller App

Get the Oculus Quest Robot App: Oculus Quest Robot.apk (v0.10 June 25, 2022).

*Note: Due to complications with Meta's review process, this app must be side-loaded while we work with Meta to have it accepted for their store. Meta is currently disallowing this app because it connects to a PC. We're working to convince Meta that this useful VR app should pass their review process so it can be used by students, robot builders, and inventors. Stay tuned!

Stereoscopic Camera View

This robot skill and the Oculus Quest VR app have optional support for two cameras (left and right eye), which provides the user with a 3D stereoscopic view. If two cameras are not required, a single camera can be used instead. The position of the two cameras is critical to prevent eye strain for the viewer. It is generally advised to measure the distance between your pupils (interpupillary distance, or IPD) and use that measurement as the center-to-center spacing between the two camera lenses.
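To make the spacing concrete, here is a trivial sketch (not part of the robot skill) that turns a measured IPD into per-camera offsets from the rig's center line. The 63 mm figure is only a commonly cited average adult IPD; use your own measurement.

```python
def stereo_camera_offsets(ipd_mm: float) -> tuple:
    """Return (left, right) horizontal lens offsets in millimetres from
    the rig's center line, so the lens centers sit one IPD apart."""
    half = ipd_mm / 2.0
    return (-half, half)

# For a measured IPD of 63 mm, each lens sits 31.5 mm from center.
left, right = stereo_camera_offsets(63.0)
print(left, right)  # -31.5 31.5
```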

Using Oculus Meta Quest App with Virtual Reality

User-inserted image

When the Oculus Quest robot control app is loaded, you will find yourself on the bridge of a spaceship. The left controller moves your view around the ship to explore. The right controller interacts with the menu that connects to the robot.

1) Enter the IP address of the PC running the Synthiam ARC software. The robot skill displays the public and private IP addresses you can enter in this field. If using this robot skill on your local network, use the local IP address. If connecting over the internet through a port forwarded on your router, use the public IP address.

2) The port can be edited; the default value is 8082.

3) Pressing the connect button will load the robot control view.

About The Oculus Meta Quest 2

The Oculus Meta Quest 2 (marketed since November 2021 as Meta Quest 2) is a low-cost virtual reality (VR) headset developed by Facebook Reality Labs (formerly Oculus). It is the successor to the company's previous headset, the Oculus Quest. The Quest 2 was unveiled on September 16, 2020, during Facebook Connect 7.

As with its predecessor, the Quest 2 can run as a standalone headset with an internal, Android-based operating system not requiring a PC or high-cost graphics card. It is a refresh of the original Oculus Quest with a similar design but lighter weight, updated internal specifications, a display with a higher refresh rate and per-eye resolution, and updated Oculus Touch controllers.

Main Window

User-inserted image

The main window displays real-time tracking information from the Quest input devices. Either the controllers or hand detection can be used. If hand tracking is used, each finger's position is displayed, starting with the thumb, along with the hand's position. If controllers are used, the controller location, analog stick, button, and trigger positions are displayed.

Controller Inputs

User-inserted image

Both hand tracking and controller tracking can control their respective servos. When the controllers are set aside and hand tracking is detected, the configured hand servos are enabled. When the controllers are picked up again, the hand servos are disabled and the controller servos are enabled. This switches automatically based on the control method in use, so you can change control methods on the fly.


Press the configure button on the robot skill to view the configuration screen for this robot skill.
User-inserted image


User-inserted image

1) The servos for X (horizontal), Y (vertical), and Z (tilt) can be assigned. These servos will move with your headset to look around. Generally, you will want these servos attached to a camera so you can look around at the environment with the headset. 

2) This option pauses the headset servos while any Movement Panel is moving the robot. When checked, the headset servos are only usable while the robot is stationary. This is useful for reducing motion sickness while the robot is moving.

Left Hand & Right Hand
User-inserted image

1) The position tracking group will bind servos to the hand's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.

2) The finger tracking group allows assigning servos to each finger. Gripping each finger into a fist will move the respective servos. This is useful when using a robot hand, such as the InMoov, to control each finger individually.

Left Controller & Right Controller
User-inserted image

1) The position tracking group will bind servos to the controller's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.

2) Triggers and analog sticks can also control servos. The triggers are the index trigger and the hand (grip) trigger. The analog stick on each controller can also be assigned servos.

3) Scripts can be assigned to the A and B buttons when pressed or released.
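As an illustration of how an analog stick axis might map onto a servo range, here is a generic sketch (not the skill's actual code; Quest stick axes typically report values from -1.0 to 1.0, and a 0-180 degree servo range is assumed):

```python
def stick_to_servo(axis: float, min_deg: float = 0.0, max_deg: float = 180.0) -> float:
    """Linearly map a stick axis value (-1.0..1.0) to a servo angle.

    -1.0 maps to min_deg, 0.0 to the midpoint, and 1.0 to max_deg.
    Out-of-range input is clamped before mapping.
    """
    axis = max(-1.0, min(1.0, axis))
    return (axis + 1.0) / 2.0 * (max_deg - min_deg) + min_deg

# A centered stick holds the servo at 90 degrees; full deflection
# reaches the range limits.
print(stick_to_servo(0.0))   # 90.0
print(stick_to_servo(-1.0))  # 0.0
```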

User-inserted image

1) The TCP ports for the servo server and remote desktop are read-only; they cannot be changed.

2) Enable/Disable the remote desktop feature. This feature has no password protection, so be cautious if you enable it. Do not enable this feature on public networks. 

3) The number of frames to skip when transmitting servo positions. If too many servo positions are set at once, the EZB may brown out from an insufficient power supply.
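The frame-skip setting effectively lets only every (skip + 1)th batch of servo positions through. A minimal sketch of that idea (illustrative only; the skill's actual implementation is not published):

```python
class FrameSkipper:
    """Let only every (skip + 1)th frame of servo positions through."""

    def __init__(self, skip: int):
        self.skip = skip
        self._count = 0

    def should_send(self) -> bool:
        """Return True when the current frame should be transmitted."""
        self._count += 1
        if self._count > self.skip:
            self._count = 0
            return True
        return False

# With skip=2, every third frame is transmitted.
fs = FrameSkipper(2)
print([fs.should_send() for _ in range(6)])  # [False, False, True, False, False, True]
```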

4) Left Camera (or single camera if using only one camera)

5) Right camera (optional if using only the left/single camera)

Remote Desktop

User-inserted image

1) The remote desktop lets you move the mouse with the right controller. You can use the trigger to left-click and the hand-grab to right-click. The remote desktop feature does not provide authentication, so only use this on private networks.

2) Press the menu button on the left controller to exit the remote desktop mode.

3) The analog stick on the left controller can be used to move the floating screen window around.

Robot Control

User-inserted image

1) When viewing the robot control, the menu button on the left controller will load the menu to enable and disable specific servos.

2) In the popup menu, you can adjust the camera depth for comfort.

3) The return to lobby option does just that: returns you to the main menu lobby.

4) If enabled in the robot skill configuration, the remote desktop can remotely control ARC through the VR headset.



Yes, I've found it, let's see how I can integrate it...
it works, I was able to customize the scripts, great!
Now it can go on, my Rock Pi has also been delivered ...
Dang, that feels short sighted that they rejected the app. Thanks for the Stereoscopic support, that’s awesome! I’m definitely going to try using 2 cameras in the inMoov eyes!
Hi Jeremie,


I’m definitely going to try using 2 cameras in the inMoov
This is why it started for me.
I installed exactly these 2 cameras in my InMoov (MS LifeCam HD-3000).
It has always been my dream to operate the InMoov via VR; now, thanks to this skill, I have come a good deal closer to that dream.
Right on @UweP, it’s a shared dream then! I also want to do the same thing as I feel that humans piloting robots through VR headsets (with gloves or controllers) is going to be a part of our near future. It’s going to allow remote work to blossom and for next level exploration. 

Humans piloting robots paves the way for the robots to one day take over repetitive and low skill tasks as well once they are fully trained. This is DJ’s exosphere idea and I can completely see VR + Humanoid operation playing an important part.
@DJ I was wondering if this might help:

Although your app wasn’t accepted in the official Meta store, maybe you could add it to the Sidequest store. Sidequest is like a third party App Store for the Quest. There aren’t as many users there but still a pretty decent amount!
It requires approval for that as well. I have a ton of threads on their developer forum about this, because their original documentation says it doesn't need approval, but it does.
Crazy! I thought Sidequest was pretty open and anything goes with their platform. There are some pretty low-quality games on there. I expected the approval process to be pretty much non-existent, LOLxD That's really too bad.
Side quest gets its apps from the App Lab. It’s a sort of reviewed state for apps. But we don’t even qualify for that because our vr robot app requires a robot. Is what their feedback has been. So we added Remote Desktop and explained it connects to a pc, not a robot. So it’s the same as virtual desktop that has already been approved. In which case they must accept it now. Time will tell
This is a fantastic skill - and I love it. I am now rigging up a linear actuator based Stewart platform for a robot neck, paired with Oculus Quest 2. Things are looking good.

Any idea how to set the servos though for X-Y-Z if it's 6 DoF parallel actuators? Been fiddling with them and can't figure it out...

User-inserted image
Anybody smarter than me understand what a servo profile for a Stewart Platform might look like?

Otherwise I gotta go back to just 4 DoF, even though I already built the platform...

Any thoughts would be great!!!

What's a Stewart platform? The photo of that thing you posted?

You'll probably need to use multi-servos for each axis. To look down, you'd need to raise servos in the back while lowering servos in the front. So multi servo with one servo inverted. And to tilt left and right would be the opposite. The trouble is that there's no tilt value from the headset. So you'd need to rotate left or right, in which case you'd do the opposite of looking up or down. 

Multi servo is the way to go. Every robot skill allows multiple servos per set. It's just in the advanced button, and then you can add as many servos per axis as you wish. You can share servos across multiple axes as well. That might take a bit of tweaking to get the multiplier correct, so it doesn't jerk between the two positions. But it can be done with trial and error or sitting down with a pencil and lots of math. I go by trial and error in testing:D
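The multi-servo mixing described above (several servos per axis, some inverted, each with its own multiplier) can be sketched like this. The servo names, the -1..1 axis range, and the 0-180 degree servo range are assumptions for illustration, not the skill's actual code:

```python
def mix_axis(value: float, servos):
    """Map one axis value (-1.0..1.0) onto several servos.

    Each servo is described as (name, center_deg, multiplier); a negative
    multiplier inverts that servo, so the rear actuators of a Stewart-style
    neck move opposite the front pair. Results are clamped to 0-180.
    """
    positions = {}
    for name, center, mult in servos:
        pos = center + value * mult * 90.0
        positions[name] = max(0.0, min(180.0, pos))
    return positions

# One pitch axis drives four actuators; the rear pair is inverted.
neck = [("front_left", 90, 0.5), ("front_right", 90, 0.5),
        ("rear_left", 90, -0.5), ("rear_right", 90, -0.5)]
print(mix_axis(0.5, neck))
```

Tuning the per-servo multipliers (by trial and error, as suggested above) is what keeps the platform from jerking between positions.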
Thanks! I've been playing with different sets of multi-servos with no luck all week. Didn't realize the headset doesn't have tilt value (that's probably been my problem). It must have tilt? The multiplier might be the ticket though! Hadn't thought to try that. I'll keep you posted. Stewart platforms are most commonly used in aircraft simulators because they can roll, tilt, twist in so many different positions. Made sense to re-create it for humanoid neck.

I'm going to unveil my 8 year long ARC project in the next few months...
Experimented last night. Got Y axis working fine. There definitely is tilt value in the Oculus Quest 2 being fed, but even with multipliers (negative and positive) I can't get it to do more than jerk around if using the Y and Z axis together. Might have to just go back to 4 DoF, but the 6 parallel actuators just allow for so many smooth organic / biomimetic poses!
Hardy Dynamics

Interesting site, 

Also, I am interested in the 8-year-long ARC project you've been working on. Keep us posted, thanks.
Thanks! I've been working on a full humanoid for 8 years. Finally got it working in 3D printed PLA, now going back and re-doing the whole thing in aluminum. Will be quite interesting when I get the new version presentable!
sounds great

So, you have an aluminum 3D printer? Want to see that...