
Dual Camera Servo Server

by Synthiam Inc.

A server for DIY usage that allows a client to control servos and sends two stereoscopic camera streams to the client.

Requires ARC v2 (Updated 1/3/2025)

How to add the Dual Camera Servo Server robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Virtual Reality category tab.
  5. Press the Dual Camera Servo Server icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Dual Camera Servo Server robot skill.


How to use the Dual Camera Servo Server robot skill

This robot skill runs a server for DIY use that lets a client control servos while receiving two camera streams, so a custom VR app can combine the two cameras into a stereoscopic view.

C# .Net Win Forms Test App

We have provided an example C# .Net test app demonstrating how a client can connect to this robot skill server. The app can be downloaded here: Dual Camera servo Test App.zip
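
If you just want to verify connectivity without Unity, the same StreamClient class from the Comm folder can be driven from a small console program. The sketch below is only a hypothetical smoke test based on the event and method signatures visible in the Unity example further down (OnLeftImageDataReady, OnError, Start(host, port), Stop, Dispose); the Assets.Comm namespace, host address, and port are assumptions to adjust for your own project.

using System;
using Assets.Comm;

class StreamClientSmokeTest {

  static void Main() {

    var client = new StreamClient();

    // Print the size of each camera frame as it arrives.
    client.OnLeftImageDataReady += imageData => Console.WriteLine("Left frame: " + imageData.Length + " bytes");
    client.OnRightImageDataReady += imageData => Console.WriteLine("Right frame: " + imageData.Length + " bytes");
    client.OnError += (time, logTxt) => Console.WriteLine(time + ": " + logTxt);

    // Connect to the Dual Camera Servo Server (the Unity example uses port 8082).
    client.Start("127.0.0.1", 8082);

    Console.WriteLine("Press Enter to disconnect.");
    Console.ReadLine();

    client.Stop();
    client.Dispose();
  }
}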

Unity VR App

The ideal custom VR app can be created in Unity. If you are using an Oculus Quest 2, we have a Unity example that demonstrates how to bind the hand, controller, and headset positions to servos. In addition, the left and right camera images can be assigned to Raw Image game objects.

Download the C# .Net Win Forms Test App from the link above and copy the Comm folder to your Unity asset project. This folder contains the StreamClient that will be used to connect to this robot skill, receive both camera images, and transmit servo positions.


using Assets.Comm;
using Synthiam_Unity_Common;
using UnityEngine;
using UnityEngine.UI;

namespace SceneRobotView {

  public class RobotView : MonoBehaviour {

    StreamClient _streamClient;
    bool _initialized = false;
    bool _isClosing = false;

    // Latest frame received for each eye. A dedicated lock object guards each buffer,
    // because the callbacks swap the array reference itself.
    readonly object _leftLock = new object();
    readonly object _rightLock = new object();
    byte[] _toDisplayLeft = new byte[0];
    byte[] _toDisplayRight = new byte[0];

    RawImage _leftImage;
    RawImage _rightImage;

    void Start() {

      // Locate the two RawImage game objects and give each one a texture to stream into.
      _leftImage = GameObject.Find("Left Eye Image").GetComponent<RawImage>();
      _leftImage.texture = new Texture2D(1280, 720, TextureFormat.RGB24, false);

      _rightImage = GameObject.Find("Right Eye Image").GetComponent<RawImage>();
      _rightImage.texture = new Texture2D(1280, 720, TextureFormat.RGB24, false);

      // Create the stream client, subscribe to its image and lifecycle events, then connect to the robot skill server.
      _streamClient = new StreamClient();
      _streamClient.OnLeftImageDataReady += _streamClient_OnLeftImageDataReady;
      _streamClient.OnRightImageDataReady += _streamClient_OnRightImageDataReady;
      _streamClient.OnStop += _streamClient_OnStop;
      _streamClient.OnStart += _streamClient_OnStart;
      _streamClient.OnError += _streamClient_OnError;
      _streamClient.Start("127.0.0.1", 8082);
    }

    private void _streamClient_OnError(System.DateTime time, string logTxt) {

    }

    private void _streamClient_OnStart() {

    }

    private void _streamClient_OnStop() {

      if (_isClosing)
        return;
    }

    private void _streamClient_OnLeftImageDataReady(byte[] imageData) {

      if (!_initialized || _isClosing)
        return;

      lock (_leftLock)
        _toDisplayLeft = imageData;
    }

    private void _streamClient_OnRightImageDataReady(byte[] imageData) {

      if (!_initialized || _isClosing)
        return;

      lock (_rightLock)
        _toDisplayRight = imageData;
    }

    private void OnDestroy() {

      _isClosing = true;

      _streamClient.Stop();
      _streamClient.Dispose();
    }

    void Update() {

      _initialized = true;

      if (_isClosing)
        return;

      // Camera.current is used below for the headset rotation; skip the frame if no camera is rendering yet.
      if (Camera.current == null)
        return;

      // Upload the most recent frame for each eye into its RawImage texture.
      lock (_leftLock)
        if (_toDisplayLeft.Length > 0)
          (_leftImage.texture as Texture2D).LoadImage(_toDisplayLeft);

      lock (_rightLock)
        if (_toDisplayRight.Length > 0)
          (_rightImage.texture as Texture2D).LoadImage(_toDisplayRight);

      if (_streamClient.IsRunning && OVRManager.hasInputFocus) {

        // https://developer.oculus.com/documentation/unity/unity-handtracking/
        // https://github.com/handzlikchris/Unity.QuestRemoteHandTracking/blob/master/HandsDataSender.cs
        // Hand_WristRoot = 0 // root frame of the hand, where the wrist is located
        // Hand_ForearmStub = 1 // frame for user's forearm
        // Hand_Thumb0 = 2 // thumb trapezium bone
        // Hand_Thumb1 = 3 // thumb metacarpal bone
        // Hand_Thumb2 = 4 // thumb proximal phalange bone
        // Hand_Thumb3 = 5 // thumb distal phalange bone
        // Hand_Index1 = 6 // index proximal phalange bone
        // Hand_Index2 = 7 // index intermediate phalange bone
        // Hand_Index3 = 8 // index distal phalange bone
        // Hand_Middle1 = 9 // middle proximal phalange bone
        // Hand_Middle2 = 10 // middle intermediate phalange bone
        // Hand_Middle3 = 11 // middle distal phalange bone
        // Hand_Ring1 = 12 // ring proximal phalange bone
        // Hand_Ring2 = 13 // ring intermediate phalange bone
        // Hand_Ring3 = 14 // ring distal phalange bone
        // Hand_Pinky0 = 15 // pinky metacarpal bone
        // Hand_Pinky1 = 16 // pinky proximal phalange bone
        // Hand_Pinky2 = 17 // pinky intermediate phalange bone
        // Hand_Pinky3 = 18 // pinky distal phalange bone

        OVRPlugin.HandState handState = default(OVRPlugin.HandState);

        if (OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandLeft, ref handState)) {

          _streamClient.SendPacketLeftHand(
            Utilities.MapToByteHalf(handState.BoneRotations[5].x),
            Utilities.MapToByteHalf(handState.BoneRotations[5].y),
            Utilities.MapToByteHalf(handState.BoneRotations[5].z),

            Utilities.MapToByteHalf(handState.BoneRotations[8].x),
            Utilities.MapToByteHalf(handState.BoneRotations[8].y),
            Utilities.MapToByteHalf(handState.BoneRotations[8].z),

            Utilities.MapToByteHalf(handState.BoneRotations[11].x),
            Utilities.MapToByteHalf(handState.BoneRotations[11].y),
            Utilities.MapToByteHalf(handState.BoneRotations[11].z),

            Utilities.MapToByteHalf(handState.BoneRotations[14].x),
            Utilities.MapToByteHalf(handState.BoneRotations[14].y),
            Utilities.MapToByteHalf(handState.BoneRotations[14].z),

            Utilities.MapToByteHalf(handState.BoneRotations[18].x),
            Utilities.MapToByteHalf(handState.BoneRotations[18].y),
            Utilities.MapToByteHalf(handState.BoneRotations[18].z),

            Utilities.MapToByte(handState.RootPose.Position.x),
            Utilities.MapToByte(handState.RootPose.Position.y),
            Utilities.MapToByte(handState.RootPose.Position.z));
        } else if (OVRInput.GetControllerPositionTracked(OVRInput.Controller.LTouch)) {

          var controller = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);

          var thumbStick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.LTouch);

          var indexTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.LTouch);

          var handTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);

          var buttonA = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.LTouch);

          var buttonB = OVRInput.Get(OVRInput.Button.Two, OVRInput.Controller.LTouch);

          _streamClient.SendPacketLeftController(
            Utilities.MapToByte(controller.x),
            Utilities.MapToByte(controller.y),
            Utilities.MapToByte(controller.z),
            Utilities.MapToByte(thumbStick.x),
            Utilities.MapToByte(thumbStick.y),
            Utilities.MapToByteHalf(indexTrigger),
            Utilities.MapToByteHalf(handTrigger),
            (byte)(buttonA ? 1 : 0),
            (byte)(buttonB ? 1 : 0));
        }

        if (OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandRight, ref handState)) {

          _streamClient.SendPacketRightHand(
            Utilities.MapToByteHalf(handState.BoneRotations[5].x),
            Utilities.MapToByteHalf(handState.BoneRotations[5].y),
            Utilities.MapToByteHalf(handState.BoneRotations[5].z),

            Utilities.MapToByteHalf(handState.BoneRotations[8].x),
            Utilities.MapToByteHalf(handState.BoneRotations[8].y),
            Utilities.MapToByteHalf(handState.BoneRotations[8].z),

            Utilities.MapToByteHalf(handState.BoneRotations[11].x),
            Utilities.MapToByteHalf(handState.BoneRotations[11].y),
            Utilities.MapToByteHalf(handState.BoneRotations[11].z),

            Utilities.MapToByteHalf(handState.BoneRotations[14].x),
            Utilities.MapToByteHalf(handState.BoneRotations[14].y),
            Utilities.MapToByteHalf(handState.BoneRotations[14].z),

            Utilities.MapToByteHalf(handState.BoneRotations[18].x),
            Utilities.MapToByteHalf(handState.BoneRotations[18].y),
            Utilities.MapToByteHalf(handState.BoneRotations[18].z),

            Utilities.MapToByte(handState.RootPose.Position.x),
            Utilities.MapToByte(handState.RootPose.Position.y),
            Utilities.MapToByte(handState.RootPose.Position.z));
        } else if (OVRInput.GetControllerPositionTracked(OVRInput.Controller.RTouch)) {

          var controller = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);

          var thumbStick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.RTouch);

          var indexTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

          var handTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);

          var buttonA = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);

          var buttonB = OVRInput.Get(OVRInput.Button.Two, OVRInput.Controller.RTouch);

          _streamClient.SendPacketRightController(
            Utilities.MapToByte(controller.x),
            Utilities.MapToByte(controller.y),
            Utilities.MapToByte(controller.z),
            Utilities.MapToByte(thumbStick.x),
            Utilities.MapToByte(thumbStick.y),
            Utilities.MapToByteHalf(indexTrigger),
            Utilities.MapToByteHalf(handTrigger),
            (byte)(buttonA ? 1 : 0),
            (byte)(buttonB ? 1 : 0));
        }

        // Send the headset's local rotation so it can be bound to servos on the server side.
        _streamClient.SendPacketHeadset(
          Utilities.MapToByte(Camera.current.transform.localRotation.x),
          Utilities.MapToByte(Camera.current.transform.localRotation.y),
          Utilities.MapToByte(Camera.current.transform.localRotation.z));
      }
    }
  }
}
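
Each SendPacket* call transmits one byte per value, so the floating-point positions and rotations from the OVR APIs are scaled into a 0-255 range before being sent. That conversion is done by the Utilities class that ships in the Comm folder (MapToByte and MapToByteHalf in the example above); the helper below is only a hypothetical illustration of that kind of clamp-and-scale mapping, not the shipped implementation.

using System;

static class ByteMapping {

  // Hypothetical helper: clamp a value to [min, max] (assumes max > min) and scale it linearly to 0..255.
  public static byte MapRangeToByte(float value, float min, float max) {

    float clamped = Math.Min(max, Math.Max(min, value));
    return (byte)Math.Round((clamped - min) / (max - min) * 255f);
  }
}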


PRO
Germany
#17  

I've done a lot of testing now. Unfortunately, showing you a video doesn't make much sense, because a recording from the VR glasses only renders a mono image.

It works with two USB cameras; the only small problem is that the USB cameras sometimes don't deliver synchronized images.

This doesn't have to be a problem; it depends a lot on the PC (performance, USB ports). The video processing on the PC and VR side is synchronous, but the USB cams deliver their images as they please (their own timings and delays). To avoid that, the two cameras would have to be synchronized on the hardware side, which is not easy to do; only very expensive USB cameras have trigger connections for it. In our case, however, it is definitely enough to just see a stereoscopic image; we don't want to use it as a sensor.

Information on the topic:
https://docs.baslerweb.com/synchronous-free-run
https://robotics.stackexchange.com/questions/16454/trigger-two-usb-cameras-at-the-same-time
https://www.arducam.com/industrial-grade-hardware-timestamping-usb-3-camera-modules-synchronization/
https://www.robotshop.com/community/blog/show/how-to-synchronise-two-cmos-camera-modules-for-stereo-vision

But as written, just to display a stereoscopic image, this skill works perfectly well!

I tested with this little guy (more on that soon); my attempts with the Rock Pi weren't so good:

CHUWI LarkBox Pro

User-inserted image

Both cameras are displayed with almost zero delay.


PRO
Germany
#18   — Edited

Feature request: would it be possible to integrate the Movement Panel, e.g. as in the "Joystick" skill?

User-inserted image

User-inserted image

Regards Uwe


PRO
Germany
#19  

Hi DJ, is there a way to get the IP of the client that connects, e.g. in a variable?

PRO
Synthiam
#20  

Yeah, I can do that for you! We're moving offices this week, so it's a bit chaotic. It will take a few days; hope that's okay.

PRO
Germany
#21  

Yeah, don't rush. It's too hot here anyway to sit at the PC. My wife scolds me when I just sit at the PC and don't go out :-) Have fun moving...

#22  

Would this skill work with a dedicated dual-lens 3D camera? Mine has a Wi-Fi client/server built in, so I'm guessing ARC should see it, but until I try it I won't know whether I get the stereo video/images or just a single video/picture stream.