Upgrade to ARC Pro

Get early access to the latest features and updates. You'll have everything you need to unleash your robot's potential.


Oculus Quest Robot

by Meta

Control a robot with a Meta Quest 2 VR Headset. View the robot’s camera and control servos with your hands or Quest controllers.

How to add the Oculus Quest Robot robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Virtual Reality category tab.
  5. Press the Oculus Quest Robot icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Oculus Quest Robot robot skill.

How to use the Oculus Quest Robot robot skill

User-inserted image

Use the Oculus Meta Quest 2 VR Headset to view any robot's camera and control any robot's servos with your hands or controllers. Servos can be attached to finger movements, hand movements, controllers, and buttons. Additionally, there is a built-in remote desktop to control ARC without having to remove the VR headset. This robot skill requires the companion Oculus Quest Robot app to be installed on the headset (see the download section below).

Why Control Robots From VR?

While it is exciting to use this robot skill and have a robot mimic your movements with VR accessories, this is also a preview of future technology in the workplace. Robots can perform many jobs, but the computing power needed to fully automate those tasks is not yet available. Today, VR technology is advanced enough to allow humans to control robots remotely and accomplish tasks. This means, for example, that warehouse workers can work from the comfort and safety of their homes while improving outcomes by remotely controlling robots. As artificial intelligence improves, robots will require less human intervention, but humans will still be needed to help a robot when it gets stuck. This means one person can control many robots, complementing Synthiam's approach with our Exosphere product.

Imagine people working from home while operating robots remotely in dangerous locations, such as mining, disaster recovery, or space exploration. With Synthiam's agnostic approach, our platform can program any robot. With the addition of virtual reality remote control, we are saving lives and improving job safety. There is no better time to familiarize yourself with this technology at home, school, or the workplace. 

Downloading Quest Robot Controller App

Get the Oculus Quest Robot App: Oculus Quest Robot.apk (v0.10 June 25, 2022).

*Note: Due to complications with Meta's review process, this app must be side-loaded while we work with Meta to have the app accepted for their store. Meta is currently disallowing this app because it connects to a PC. We're working to convince Meta that this useful VR app should be accepted into their store so it can be used by students, robot builders, and inventors. Stay tuned!

Stereoscopic Camera View

This robot skill and the Oculus Quest VR app have optional support for two cameras (left and right eye). This provides the user with a 3D stereoscopic view. If two cameras are not required, a single camera can be used instead. The position of the two cameras is critical to prevent eye strain for the viewer. It is generally advised to measure the distance between your pupils (your interpupillary distance, or IPD) and use that measurement as the spacing between the centers of the two camera lenses.
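To make the camera-spacing advice concrete, here is a minimal sketch (plain Python, not part of the robot skill) that turns a measured IPD into left/right lens-centre offsets from the camera rig's midline:

```python
def camera_offsets_mm(ipd_mm: float) -> tuple[float, float]:
    """Return (left, right) horizontal lens-centre offsets from the rig midline.

    ipd_mm: the viewer's interpupillary distance in millimetres
    (a typical adult IPD falls roughly between 54 and 74 mm).
    """
    half = ipd_mm / 2.0
    # The left camera sits half the IPD to the left of centre, the right
    # camera half the IPD to the right, matching the viewer's eye spacing.
    return (-half, +half)
```

For example, a 64 mm IPD places each lens centre 32 mm from the midline.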

Using Oculus Meta Quest App with Virtual Reality

User-inserted image

When the Oculus Meta Quest robot control app loads, you will find yourself on the bridge of a spaceship. The left controller moves your view around the ship to explore. The right controller interacts with the menu used to connect to the robot.

1) Enter the IP address of the PC running the Synthiam ARC software. The robot skill shows the public and private IP addresses you can enter in this field. If you are using this robot skill on your local network, use the local IP address. If you are connecting over the internet through a forwarded port on your router, use the public IP address.

2) The port can be edited; the default value is 8082.

3) Pressing the connect button will load the robot control view.
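Before putting the headset on, it can save time to confirm that the ARC PC is actually reachable on that port. A small standalone sketch (plain Python, not part of the app; `arc_reachable` is a hypothetical helper name):

```python
import socket

def arc_reachable(host: str, port: int = 8082, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the ARC robot skill's port succeeds."""
    try:
        # create_connection handles DNS resolution and the timeout for us.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, unreachable host, or timeout all land here.
        return False
```

Run it with the same IP address and port you would enter in the headset; if it returns `False`, check the firewall and router port forwarding first.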

About The Oculus Meta Quest 2

The Oculus Meta Quest 2 (marketed since November 2021 as Meta Quest 2) is a low-cost virtual reality (VR) headset developed by Facebook Reality Labs (formerly Oculus). It is the successor to the company's previous headset, the Oculus Quest. The Quest 2 was unveiled on September 16, 2020, during Facebook Connect 7.

As with its predecessor, the Quest 2 can run as a standalone headset with an internal, Android-based operating system not requiring a PC or high-cost graphics card. It is a refresh of the original Oculus Quest with a similar design but lighter weight, updated internal specifications, a display with a higher refresh rate and per-eye resolution, and updated Oculus Touch controllers.

Main Window

User-inserted image

The main window displays real-time tracking information from the Quest input devices. Either the controllers or hand detection can be used. If hand tracking is used, the position of each finger (starting with the thumb) and the overall hand position are displayed. If controllers are used, the controller location, analog stick, button, and trigger positions are displayed.

Controller Inputs

User-inserted image

Both hand tracking and controller tracking can control their respective servos. When the controllers are set aside and hand tracking is detected, the configured hand servos are enabled. When the controllers are picked up again, the hand servos are disabled and the controller servos are enabled. The app switches automatically based on the control method in use, so you can change methods on the fly.


Press the configure button on the robot skill to view the configuration screen for this robot skill.
User-inserted image


User-inserted image

1) The servos for X (horizontal), Y (vertical), and Z (tilt) can be assigned. These servos will move with your headset to look around. Generally, you will want these servos attached to a camera so you can look around at the environment with the headset. 

2) The option to pause servos when the robot moves and pause the headset servos while any Movement Panel is being moved. When checked, the headset servos are only usable when the robot is not moving. This is useful to reduce motion sickness while the robot is moving.
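As a rough illustration of how these settings behave (hypothetical helper names and ranges; ARC handles this internally), a headset axis angle can be clamped and scaled into a servo position, and the whole mapping suspended while the robot moves:

```python
def angle_to_servo(angle_deg, min_deg=-90.0, max_deg=90.0,
                   servo_min=1, servo_max=180):
    """Map a headset axis angle (degrees) to a servo position, clamped to range."""
    angle = max(min_deg, min(max_deg, angle_deg))
    frac = (angle - min_deg) / (max_deg - min_deg)
    return round(servo_min + frac * (servo_max - servo_min))

def headset_positions(yaw, pitch, roll, robot_is_moving=False):
    """Return (x, y, z) servo positions, or None while the robot is moving.

    Returning None mimics the 'pause headset servos during movement' option,
    which helps reduce motion sickness.
    """
    if robot_is_moving:
        return None
    return (angle_to_servo(yaw), angle_to_servo(pitch), angle_to_servo(roll))
```

The clamp matters: turning your head past the servo's travel should pin the servo at its limit rather than command an out-of-range position.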

Left Hand & Right Hand
User-inserted image

1) The position tracking group will bind servos to the hand's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.

2) The finger tracking group allows assigning servos to each finger. Gripping each finger into a fist will move the respective servos. This is useful when using a robot hand, such as the InMoov, to control each finger individually.
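For intuition about what the finger-tracking binding does (a sketch with hypothetical names, not ARC's API), each finger's curl can be treated as a normalised value from open to fist and scaled into a servo position:

```python
def curl_to_servo(curl: float, servo_min: int = 1, servo_max: int = 180) -> int:
    """Map a normalised finger curl (0.0 = open, 1.0 = fist) to a servo position."""
    curl = max(0.0, min(1.0, curl))  # clamp tracking noise to the valid range
    return round(servo_min + curl * (servo_max - servo_min))

def hand_to_servos(curls):
    """Map the five finger curls (thumb first) to five servo positions."""
    return [curl_to_servo(c) for c in curls]
```

With an InMoov-style hand, each of the five servo positions would then drive one finger's tendon servo.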

Left Controller & Right Controller
User-inserted image

1) The position tracking group will bind servos to the controller's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.

2) Triggers and analog sticks can also control servos. The triggers are the index trigger and the hand (grip) trigger. The analog stick on each controller can be assigned servos.

3) Scripts can be assigned to the A and B buttons when pressed or released.
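Analog sticks are commonly mapped to servos with a small deadzone around centre so that stick drift does not twitch the servo. A sketch of that idea (hypothetical; the deadzone value and behaviour here are assumptions, not ARC's documented implementation):

```python
def stick_to_servo(value: float, deadzone: float = 0.15,
                   servo_min: int = 1, servo_max: int = 180) -> int:
    """Map an analog-stick axis (-1.0 .. 1.0) to a servo position with a deadzone."""
    if abs(value) < deadzone:
        value = 0.0  # ignore small drift around the stick's centre
    value = max(-1.0, min(1.0, value))
    frac = (value + 1.0) / 2.0  # rescale -1..1 to 0..1
    return round(servo_min + frac * (servo_max - servo_min))
```

Any stick reading inside the deadzone maps to the same centred servo position as a perfectly neutral stick.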

User-inserted image

1) The TCP ports for the servo server and Remote Desktop are read-only; they cannot be changed.

2) Enable/Disable the remote desktop feature. This feature has no password protection, so be cautious if you enable it. Do not enable this feature on public networks. 

3) The number of frames to skip when transmitting servo positions. If too many servo positions are set at once, the EZB may brown out because the power supply cannot keep up.

4) Left Camera (or single camera if using only one camera)

5) Right camera (optional if using only the left/single camera)
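The frame-skip option above is effectively a throttle on how often servo positions are transmitted. A minimal sketch of that behaviour (illustrative only; `FrameSkipper` is a hypothetical name, not ARC's implementation):

```python
class FrameSkipper:
    """Transmit only every (skip + 1)-th batch of servo positions.

    Dropping intermediate frames lowers the rate of simultaneous servo
    commands, which helps avoid EZB brownouts on a weak power supply.
    """

    def __init__(self, skip: int):
        self.skip = skip
        self._count = 0

    def should_send(self) -> bool:
        # Send on frame 0, then again after `skip` dropped frames.
        send = self._count % (self.skip + 1) == 0
        self._count += 1
        return send
```

With a skip of 2, one of every three tracking frames is transmitted; a skip of 0 transmits every frame.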

Remote Desktop

User-inserted image

1) The remote desktop lets you move the mouse with the right controller. You can use the trigger to left-click and the hand-grab to right-click. The remote desktop feature does not provide authentication, so only use this on private networks.

2) Press the menu button on the left controller to exit the remote desktop mode.

3) The analog stick on the left controller can be used to move around the floating screenshot window.

Robot Control

User-inserted image

1) When viewing the robot control, the menu button on the left controller will load the menu to enable and disable specific servos.

2) In the popup menu, you can adjust the camera depth for comfort.

3) The return to lobby option will do just that: return you to the main menu lobby.

4) The remote desktop can remotely control ARC through the VR headset if the robot skill configuration is enabled.



#46   — Edited
This skill is based on the "Servo And Camera Server" skill, isn't it? Otherwise I couldn't receive the video data in Unity.

Is it possible to get the Unity script from this skill so I can see what format the controller data is sent to ARC in? Then I could get on with my Unity tinkering (very, very rudimentary, I'm new to Unity). I'd love to work with this skill and not with the other.

Of course I'll keep you up to date.
My  AtomicPi Rock Pi X will be delivered in the next few days, so I'm curious how it handles a USB cam, for example...

#47   — Edited
I created a unity project for the stereo display as an example.
This is how I solved it; I'm really new to Unity, so I have no idea if there are other ways to display it.
I only found paid assets, but they all require a real video.
Unity itself can display a side-by-side 180/360 video on the skydome, but that was not the solution either.

The structure, briefly explained:

A new layer: Right
A new layer: Left

One camera, Target Eye: Right, Culling Mask: layer Left off
One camera, Target Eye: Left, Culling Mask: layer Right off

A GameObject plane for the video display

A GameObject plane as cover Right -> layer Right
A GameObject plane as cover Left -> layer Left

Arrange them accordingly: the plane for the video display, and in front of it the two cover planes for left/right.
Each eye then sees only its corresponding half of the video; the other half is always covered.
This means nothing has to be converted or broken down on the video stream.
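The cover-plane trick above splits the stereo image optically, so the stream itself never needs to be processed. For comparison, a software split of a side-by-side frame would look roughly like this (plain Python over a frame represented as rows of pixels; illustrative only, not part of the Unity project):

```python
def split_side_by_side(frame):
    """Split a side-by-side stereo frame (a list of pixel rows) into halves.

    Returns (left, right), where each is the corresponding half of every row.
    The Unity cover-plane setup achieves the same effect without touching
    the pixels at all.
    """
    width = len(frame[0])
    left = [row[: width // 2] for row in frame]
    right = [row[width // 2:] for row in frame]
    return left, right
```

Avoiding this per-frame copy is exactly the saving the layer/cover-plane approach buys.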

Here are the Unity project data:

Played around with Unity, really funny...

Whoa, roli looks pretty wild in unity - good job! I think we're starting on some of the changes to this robot skill next week. Your changes are a tremendous influence on that!
Just to let you know these changes are in development. Hopefully there will be an update in the next week or so. Stay tuned
I'm curious what's coming, I'm looking forward to it.
May I ask again quietly: could I get the script lines where you send the controller data from Unity to ARC?
The unity project is too large to add as an attachment. But the C# scripts are easy enough to identify.

This is every C# script in the project. Most scripts are for various buttons on the scenes. But the ones that matter most use the COMM folder. Here it is:
great, thank you very much.
I checked it out, it's exactly what I need...
Take a look at the robot view folder and the file robotview.cs has the information for sending the controller data.
Yes, I've found it, let's see how I can integrate it...
it works, I was able to customize the scripts, great!
Now it can go on, my Rock Pi has also been delivered ...
Dang, that feels short sighted that they rejected the app. Thanks for the Stereoscopic support, that’s awesome! I’m definitely going to try using 2 cameras in the inMoov eyes!
Hi Jeremie,


I’m definitely going to try using 2 cameras in the inMoov
This is why it started for me.
I installed exactly these 2 cameras in my InMoov (MS LifeCam HD-3000).
It has always been my dream to operate the InMoov via VR; now, thanks to this skill, I have come a good deal closer to that dream.
Right on @UweP, it’s a shared dream then! I also want to do the same thing as I feel that humans piloting robots through VR headsets (with gloves or controllers) is going to be a part of our near future. It’s going to allow remote work to blossom and for next level exploration. 

Humans piloting robots paves the way for robots to one day take over repetitive and low-skill tasks as well, once they are fully trained. This is DJ's Exosphere idea, and I can completely see VR + humanoid operation playing an important part.
@DJ I was wondering if this might help:

Although your app wasn’t accepted in the official Meta store, maybe you could add it to the Sidequest store. Sidequest is like a third party App Store for the Quest. There aren’t as many users there but still a pretty decent amount!
It requires approval for that as well. I have a ton of threads on their developer forum about this, because their original documentation says it doesn't need approval, but it does.
Crazy! I thought Sidequest was pretty open and anything goes with their platform. There are some pretty low-quality games on there. I expected the approval process to be pretty much non-existent, LOLxD That's really too bad.
SideQuest gets its apps from App Lab, which is a sort of reviewed state for apps. But we don't even qualify for that because our VR robot app requires a robot; that's what their feedback has been. So we added Remote Desktop and explained that it connects to a PC, not a robot. That makes it the same as Virtual Desktop, which has already been approved, in which case they must accept it now. Time will tell.