
Oculus Quest Robot

by Meta

Use a Meta Quest 2 VR headset to control a real robot by viewing its camera and moving its servos with natural hand gestures or controllers. This skill brings a futuristic touch to remote robot operation, enhancing safety and efficiency in a variety of work environments.

Requires ARC v11 (Updated 2/2/2025) Hardware Info

How to add the Oculus Quest Robot robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Virtual Reality category tab.
  5. Press the Oculus Quest Robot icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Oculus Quest Robot robot skill.


How to use the Oculus Quest Robot robot skill

User-inserted image

Use the Meta Quest 2 VR headset to view a robot's camera and control its servos with your hands or controllers. Servos can be attached to finger movements, hand movements, controller positions, and buttons. Additionally, a built-in remote desktop controls ARC without removing the VR headset. This robot skill requires the companion Oculus Quest Robot app to be installed on the headset (see Downloading Quest Robot Controller App below).

Transform your robot interaction experience with the Oculus Quest robot skill for Synthiam ARC. This skill enables immersive control of a robot's cameras and servos using the Meta Quest 2 VR headset, offering a hands-on approach to robotics complete with a remote desktop feature for seamless ARC control. It promises a future where remote work and safety in hazardous environments are enhanced through virtual reality.

Why Control Robots From VR?

While it is exciting to use this robot skill and have a robot mimic your movements with VR accessories, it is also a preview of future workplace technology. Robots can perform many jobs, but the computing power to fully automate those tasks is not yet available. Today, VR technology is advanced enough to let humans control robots remotely and accomplish those tasks. This means, for example, that warehouse workers can work from the comfort and safety of their homes while improving outcomes by remotely controlling robots. As artificial intelligence improves, robots will require less human intervention, but humans will still be needed to help a robot when it gets stuck. One person will then be able to supervise many robots, which complements Synthiam's approach with our Exosphere product.

Imagine people working from home while operating robots remotely in dangerous locations, such as mining, disaster recovery, or space exploration. With Synthiam's agnostic approach, our platform can program any robot. With the addition of virtual reality remote control, we are saving lives and improving job safety. There is no better time to familiarize yourself with this technology at home, school, or the workplace.

Downloading Quest Robot Controller App

Get the Oculus Quest Robot App: Oculus Quest Robot.apk (v0.10 June 25, 2022).

*Note: Due to complications with Meta's review process, this app must be side-loaded while we work with Meta to have it accepted into their store. Meta currently disallows the app because it connects to a PC. We are working to convince Meta that this useful VR app should pass their review so that students, robot builders, and inventors can use it. Stay tuned!

Meta Refuses To Accept Our App

We apologize for this method of installing the app rather than making it available in the Oculus app store. No matter how hard we plead, Meta will not accept our app in the store. The Meta Oculus Store rejects the Oculus Quest Robot app because they do not like that it controls physical hardware. We continue to appeal through their review process, but we keep receiving the same response.

User-inserted image

Stereoscopic Camera View

This robot skill and the Oculus Quest VR app optionally support two cameras (left and right eye), which give the user a 3D stereoscopic view. If two cameras are not required, a single camera can be used instead. The positioning of the two cameras is critical to prevent eye strain for the viewer. It is generally advised to measure the distance between your pupils (your interpupillary distance, or IPD) and use that measurement as the distance between the centers of the two camera lenses.
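
In other words, each lens center sits half the IPD away from the robot's centerline. A minimal sketch of that arithmetic, assuming a 63 mm IPD purely as an example (measure your own):

```python
# Simple helper for positioning a stereo camera pair: the lens centers should
# be separated by the viewer's interpupillary distance (IPD). The 63 mm value
# is only an example; typical adult IPDs fall roughly between 54 and 74 mm.
def camera_offsets_mm(ipd_mm: float) -> tuple:
    """Return (left, right) lens-center offsets from the robot's centerline in mm."""
    half = ipd_mm / 2.0
    return (-half, +half)

print(camera_offsets_mm(63.0))  # -> (-31.5, 31.5)
```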

Using Oculus Meta Quest App with Virtual Reality

User-inserted image

When the Oculus Meta Quest robot control app is loaded, you will find yourself on the bridge of a spaceship. The left controller moves your view around the ship to explore. The right controller interacts with the menu for connecting to the robot.

  1. Enter the IP Address of the Synthiam ARC software. The robot skill will show public and private IP addresses you can enter in this field. If you use this robot skill on your local network, use the local IP address. If you connect over the internet through a port on your router, use the public IP address.

  2. The port can be edited, but the default value is 8082.

  3. Pressing the connect button will load the robot control view. If the connection fails, the reachability sketch below this list can help confirm that the ARC PC is visible from the headset's network.
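
Before troubleshooting the headset, it can help to verify that something on the network can reach ARC's servo server port at all. This is a minimal sketch, not part of the robot skill; the IP address is a placeholder and the port assumes the default of 8082:

```python
# Minimal reachability check for the ARC servo server port (sketch only).
# Run this from any machine on the same network as the ARC PC.
# The IP address below is a placeholder; use the address shown in the robot skill.
import socket

ARC_HOST = "192.168.1.75"   # example local IP as displayed by the robot skill
ARC_PORT = 8082             # default servo server port used by the Quest app

def arc_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if arc_reachable(ARC_HOST, ARC_PORT):
        print(f"ARC servo server reachable at {ARC_HOST}:{ARC_PORT}")
    else:
        print("Could not reach ARC - check the IP, router port forwarding, and firewall")
```

If the check fails on your local network, a firewall on the ARC PC is the usual culprit; if it fails only over the internet, check the router's port forwarding.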

About The Oculus Meta Quest 2

The Oculus Meta Quest 2 (marketed since November 2021 as Meta Quest 2) is a low-cost virtual reality (VR) headset developed by Facebook Reality Labs (formerly Oculus). It is the successor to the company's previous headset, the Oculus Quest. The Quest 2 was unveiled on September 16, 2020, during Facebook Connect 7.

As with its predecessor, the Quest 2 can run as a standalone headset with an internal, Android-based operating system that does not require a PC or high-cost graphics card. It is a refresh of the original Oculus Quest with a similar design but lighter weight, updated internal specifications, a display with a higher refresh rate and per-eye resolution, and updated Oculus Touch controllers.

Main Window

User-inserted image

The main window displays real-time tracking information from the Quest input devices. Either the controllers or hand tracking can be used. If hand tracking is used, the position of each finger, starting with the thumb, is displayed along with the hand's position. If controllers are used, the controller location, analog stick, button, and trigger positions are displayed.

Controller Inputs

User-inserted image

Both hand tracking and controller tracking can drive their respective servos. When the controllers are set aside and hand tracking is detected, the configured hand servos are enabled. When the controllers are picked up again, the hand servos are disabled and the controller servos are enabled. This switch happens automatically based on which control method you are using, and you can change methods on the fly.
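
The switching rule described above boils down to a simple priority: active controllers win, otherwise tracked hands. This conceptual sketch only illustrates that rule; the Quest app and ARC handle it for you, and the function name is made up for illustration:

```python
# Conceptual sketch of the automatic switch between hand-tracking servos and
# controller servos. The Quest app and ARC perform this automatically.
def active_servo_group(hands_detected: bool, controllers_active: bool) -> str:
    if controllers_active:
        return "controller servos"  # controllers picked up: hand servos disabled
    if hands_detected:
        return "hand servos"        # controllers set aside: hand tracking takes over
    return "none"                   # nothing tracked: no servo group is driven

print(active_servo_group(hands_detected=True, controllers_active=False))  # -> hand servos
```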

Configuration

Press the configure button on the robot skill to open its configuration screen.

User-inserted image

Headset

User-inserted image

  1. The servos for X (horizontal), Y (vertical), and Z (tilt) can be assigned. These servos will move with your headset to look around. Generally, you will want these servos attached to a camera so you can look around at the environment with the headset.

  2. There is an option to pause the headset servos while any Movement Panel is moving the robot. When checked, the headset servos respond only while the robot is not moving, which helps reduce motion sickness while driving. The sketch after this list illustrates the idea.
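
Conceptually, each headset axis is scaled into the assigned servo's range, and updates are suppressed while a Movement Panel is active if the pause option is checked. The sketch below is illustrative only; the angle range, servo range, and function name are assumptions, not ARC's internal implementation:

```python
# Conceptual sketch of headset-to-servo mapping with the "pause while the robot
# is moving" option. The -90..+90 degree range and 1..180 servo range are
# illustrative; ARC performs this mapping for the servos you assign above.
def headset_to_servo(yaw_degrees: float, robot_is_moving: bool,
                     pause_when_moving: bool = True,
                     servo_min: int = 1, servo_max: int = 180):
    """Map a headset yaw of -90..+90 degrees onto the servo range.
    Returns None when updates are paused."""
    if pause_when_moving and robot_is_moving:
        return None  # hold the last position while a Movement Panel is active
    yaw = max(-90.0, min(90.0, yaw_degrees))
    return round(servo_min + (yaw + 90.0) / 180.0 * (servo_max - servo_min))

print(headset_to_servo(0.0, robot_is_moving=False))  # -> 90 (roughly centered)
print(headset_to_servo(45.0, robot_is_moving=True))  # -> None (paused while driving)
```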

Left Hand & Right Hand

User-inserted image

  1. The position tracking group will bind servos to the hand's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.

  2. The finger tracking group allows assigning servos to each finger. Closing each finger into a fist moves the respective servo. This is useful when using a robot hand, such as the InMoov's, to control each finger individually; a sketch of the scaling involved follows this list.
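
As a rough illustration of what the per-finger assignment does, each finger's curl is scaled into the min/max range configured for that finger's servo. The names, ranges, and the 0-to-1 curl convention below are assumptions for illustration only:

```python
# Illustrative sketch of scaling a normalized finger-curl value (0.0 = open,
# 1.0 = closed fist) into a servo position range. ARC performs this mapping
# internally from the min/max limits you configure for each finger servo.
def curl_to_servo(curl: float, servo_min: int = 1, servo_max: int = 180) -> int:
    """Clamp the curl to [0, 1] and scale it into the configured servo range."""
    curl = max(0.0, min(1.0, curl))
    return round(servo_min + curl * (servo_max - servo_min))

print(curl_to_servo(0.5))           # -> 90, a half-closed finger over the full range
print(curl_to_servo(0.5, 30, 120))  # -> 75, with a narrower per-finger range
```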

Left Controller & Right Controller

User-inserted image

  1. The position tracking group will bind servos to the controller's position. The X is horizontal, Y is vertical, and Z is the distance outward from your body.

  2. Triggers and analog sticks can also control servos. The triggers are the index trigger and the hand trigger. The analog stick for each controller can be assigned servos.

  3. Scripts can be assigned to the A and B buttons when pressed or released; a minimal example of such a script follows this list.
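
For example, a button script might announce a status and move a gripper servo to a preset position. The sketch below assumes ARC's Python scripting with the Audio and Servo helpers available in the script editor; the servo port (d2), position, and spoken text are placeholders for whatever your robot actually needs:

```python
# Hypothetical script for the right controller's A button (on press).
# Assumes ARC's Python scripting exposes the Audio and Servo helpers;
# the servo port (d2) and position are placeholders for your own wiring.
Audio.say("Gripper preset")   # audible confirmation that the button was pressed
Servo.setPosition(d2, 90)     # move an example gripper servo to a preset position
```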

General

User-inserted image

The TCP ports for the servo server and remote desktop are read-only; they cannot be changed.

  1. Enable/Disable the remote desktop feature. This feature has no password protection, so be cautious if you enable it. Do not enable this feature on public networks.

  2. The number of frames to skip when transmitting servo positions (illustrated in the sketch below this list). If servo positions are sent too frequently, the EZB may brown out due to an insufficient power supply.

  3. Left Camera (or single camera if using only one camera)

  4. Right camera (optional if using only the left/single camera)
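
The frame-skip value effectively rate-limits servo updates: only every Nth tracking frame results in positions being transmitted. A conceptual sketch of the idea, not ARC's actual implementation:

```python
# Illustration of the "frames to skip" setting: only every Nth tracking frame
# results in servo positions being transmitted, easing the load on the EZB's
# power supply. Conceptual sketch only.
def should_transmit(frame_index: int, frames_to_skip: int) -> bool:
    """Transmit on frame 0, then skip the configured number of frames."""
    return frame_index % (frames_to_skip + 1) == 0

# With frames_to_skip = 2, only frames 0, 3, 6, 9, ... are transmitted.
print([i for i in range(10) if should_transmit(i, 2)])  # -> [0, 3, 6, 9]
```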

Remote Desktop

User-inserted image

  1. The remote desktop lets you move the mouse with the right controller. You can use the trigger to left-click and the hand grab to right-click. The remote desktop feature does not provide authentication, so only use this on private networks.

  2. Press the menu button on the left controller to exit the remote desktop mode.

  3. The analog stick on the left controller can move around the floating screenshot window.

Robot Control

User-inserted image

  1. When viewing the robot control, the menu button on the left controller will load the menu to enable and disable specific servos.

  2. In the popup menu, you can adjust the camera depth for comfort.

  3. The return to lobby option will do just that: return you to the main menu lobby.

  4. If enabled in the robot skill configuration, the remote desktop lets you control ARC remotely from within the VR headset.



#41  

This robot skill is currently under development - so many of the features you are asking about haven't been completed yet. However, these items are on the list to be completed. I don't have an estimated date, but I know they're actively working on finishing this robot skill.

PRO
Germany
#42  

If you want to test my stereoscopic attempts, here is the APK file for the Oculus Quest and this skill.

ARC_3D_Test.zip

It is only a test with rudimentary functions. Only the video is received; no servo data is sent.

You can test it with a side-by-side video, e.g. one downloaded from https://stereoimage.de/download-video?download=90:europa. Just send the video to the Camera skill using the "Stream All The Video Sources" skill.

ARC must run under the IP 192.168.1.75, with the "Oculus Quest Robot" skill set to servo server port 8082.

Right controller button A toggles Mono/Stereo; right controller button B toggles Cam/Test (use the left and right sticks).

I'm really surprised how good the latency is when I integrate the Stereopi as a USB cam on the Camera Skill.

Greetings Uwe

PRO
Synthiam
#43  

Do you have to use the StereoPi, or can you use two cameras? What's special about the StereoPi that couldn't be done with two cameras?

I'm asking because if we add dual camera support, we'd need to understand what the StereoPi offers that two USB cameras don't. It's a lot of overhead to have a Raspberry Pi running only for cameras.

PRO
Germany
#44  

The StereoPi synchronizes both cameras; the two cameras must be in sync. If you can sync the cams, there shouldn't be a problem.

Yes, an extra Raspberry Pi is a bit of overhead, but I was hoping the StereoPi would stream on its own. It does, but it is far too weak and the latency is unusable. There is a possibility via UDP, where the latency was good, but it has many other disadvantages (it cannot be reached from the internet, ...). Originally I also wanted to control two servos directly via the StereoPi; that is theoretically possible, but as written, it is too weak to implement everything on the Pi.

That's how I ended up with you and ARC, and I'm very enthusiastic so far.

PRO
Synthiam
#45  

That's great information, thanks. I'll take a look at what you've done and see if it can be added to this robot skill. I'm confident it can. The frame sync won't be an issue because ARC uses the Win32 low-level video capture per frame.

There are a few other developments going on at the moment, so it might take a week or more for someone to get around to it. Please share any advancements you make in the meantime.

PRO
Germany
#46   — Edited

This skill is based on the "Servo And Camera Server" skill, isn't it? Otherwise I couldn't receive the video data in Unity.

Is it possible to get the Unity script from this skill so I can see what format the controller data is sent to ARC in? Then I could get on with my Unity tinkering (very, very rudimentary; I'm new to Unity). I'd love to work with this skill here and not with the other one.

Of course I'll keep you up to date. My AtomicPi Rock Pi X will be delivered in the next few days, so I'm curious how it handles a USB cam, for example...

Greetings Uwe

PRO
Germany
#47   — Edited

I created a Unity project for the stereo display as an example. This is how I solved it; I'm really new to Unity, so I have no idea if there are other ways to display it. I only found paid assets, but they all require a real video. Unity itself can display a side-by-side 180/360 video on the skydome, but that was not the solution either.

The structure briefly explained:

  - a new layer "Right" and a new layer "Left"
  - one camera with Target Eye: Right and the "Left" layer removed from its Culling Mask; one camera with Target Eye: Left and the "Right" layer removed from its Culling Mask
  - a GameObject plane for the video display
  - a GameObject plane as the right cover -> layer "Right"; a GameObject plane as the left cover -> layer "Left"

Arrange them accordingly: the plane for the video display, and in front of it the two cover planes for left/right. That way each eye only sees its corresponding half of the side-by-side image, and the other half is always covered. This means nothing on the video stream has to be converted or split apart.

Here are the Unity project data:

Stereo_ARC.zip

Greetings Uwe

PRO
Germany
#48  

Played around with Unity, really fun...