A server for DIY usage that allows a client to control servos and sends two stereoscopic camera streams to the client.
Requires ARC
v2
(Updated 1/3/2025)
How to add the Dual Camera Servo Server robot skill
- Load the most recent release of ARC (Get ARC).
- Press the Project tab from the top menu bar in ARC.
- Press Add Robot Skill from the button ribbon bar in ARC.
- Choose the Virtual Reality category tab.
- Press the Dual Camera Servo Server icon to add the robot skill to your project.
Don't have a robot yet?
Follow the Getting Started Guide to build a robot and use the Dual Camera Servo Server robot skill.
How to use the Dual Camera Servo Server robot skill
A server for DIY usage that allows a client to control servos and sends two camera streams to the client. This allows a custom VR app to use the two cameras for a stereoscopic view.
C# .NET Win Forms Test App
We have provided an example C# .NET test app demonstrating how a client can connect to this robot skill server. The app can be downloaded here: Dual Camera servo Test App.zip

Unity VR App
The ideal custom VR app can be created in Unity. If using an Oculus Quest 2, we have a Unity example to demonstrate how to bind the hand, controller, and headset positions to servos. In addition, the left and right camera images can be assigned to a Raw Image game object.

Download the C# .NET Win Forms Test App from the link above and copy the Comm folder to your Unity asset project. This folder contains the StreamClient that will be used to connect to this robot skill, receive both camera images, and transmit servo positions.
using Assets.Comm;
using Synthiam_Unity_Common;
using UnityEngine;
using UnityEngine.UI;

namespace SceneRobotView {

  public class RobotView : MonoBehaviour {

    StreamClient _streamClient;

    bool _initialized = false;
    bool _isClosing = false;

    volatile byte[] _toDisplayLeft = new byte[] { };
    volatile byte[] _toDisplayRight = new byte[] { };

    RawImage _leftImage;
    RawImage _rightImage;

    void Start() {

      _leftImage = GameObject.Find("Left Eye Image").GetComponent<RawImage>();
      _leftImage.texture = new Texture2D(1280, 720, TextureFormat.RGB24, false);

      _rightImage = GameObject.Find("Right Eye Image").GetComponent<RawImage>();
      _rightImage.texture = new Texture2D(1280, 720, TextureFormat.RGB24, false);

      _streamClient = new StreamClient();
      _streamClient.OnLeftImageDataReady += _streamClient_OnLeftImageDataReady;
      _streamClient.OnRightImageDataReady += _streamClient_OnRightImageDataReady;
      _streamClient.OnStop += _streamClient_OnStop;
      _streamClient.OnStart += _streamClient_OnStart;
      _streamClient.OnError += _streamClient_OnError;

      // Connect to the Dual Camera Servo Server robot skill (address and port as configured in ARC)
      _streamClient.Start("127.0.0.1", 8082);
    }

    private void _streamClient_OnError(System.DateTime time, string logTxt) {
    }

    private void _streamClient_OnStart() {
    }

    private void _streamClient_OnStop() {

      if (_isClosing)
        return;
    }

    private void _streamClient_OnLeftImageDataReady(byte[] imageData) {

      if (!_initialized || _isClosing)
        return;

      // Store the latest left frame; it is displayed in Update()
      lock (_toDisplayLeft)
        _toDisplayLeft = imageData;
    }

    private void _streamClient_OnRightImageDataReady(byte[] imageData) {

      if (!_initialized || _isClosing)
        return;

      // Store the latest right frame; it is displayed in Update()
      lock (_toDisplayRight)
        _toDisplayRight = imageData;
    }

    private void OnDestroy() {

      _isClosing = true;

      _streamClient.Stop();
      _streamClient.Dispose();
    }

    void Update() {

      _initialized = true;

      if (_isClosing)
        return;

      if (Camera.current == null)
        return;

      // Load the most recent camera frames into the RawImage textures
      lock (_toDisplayLeft)
        if (_toDisplayLeft.Length > 0)
          (_leftImage.texture as Texture2D).LoadImage(_toDisplayLeft);

      lock (_toDisplayRight)
        if (_toDisplayRight.Length > 0)
          (_rightImage.texture as Texture2D).LoadImage(_toDisplayRight);

      if (_streamClient.IsRunning && OVRManager.hasInputFocus) {

        // https://developer.oculus.com/documentation/unity/unity-handtracking/
        // https://github.com/handzlikchris/Unity.QuestRemoteHandTracking/blob/master/HandsDataSender.cs
        // Hand_WristRoot = 0 // root frame of the hand, where the wrist is located
        // Hand_ForearmStub = 1 // frame for user's forearm
        // Hand_Thumb0 = 2 // thumb trapezium bone
        // Hand_Thumb1 = 3 // thumb metacarpal bone
        // Hand_Thumb2 = 4 // thumb proximal phalange bone
        // Hand_Thumb3 = 5 // thumb distal phalange bone
        // Hand_Index1 = 6 // index proximal phalange bone
        // Hand_Index2 = 7 // index intermediate phalange bone
        // Hand_Index3 = 8 // index distal phalange bone
        // Hand_Middle1 = 9 // middle proximal phalange bone
        // Hand_Middle2 = 10 // middle intermediate phalange bone
        // Hand_Middle3 = 11 // middle distal phalange bone
        // Hand_Ring1 = 12 // ring proximal phalange bone
        // Hand_Ring2 = 13 // ring intermediate phalange bone
        // Hand_Ring3 = 14 // ring distal phalange bone
        // Hand_Pinky0 = 15 // pinky metacarpal bone
        // Hand_Pinky1 = 16 // pinky proximal phalange bone
        // Hand_Pinky2 = 17 // pinky intermediate phalange bone
        // Hand_Pinky3 = 18 // pinky distal phalange bone
        OVRPlugin.HandState handState = default(OVRPlugin.HandState);

        // Left hand: prefer hand tracking, otherwise fall back to the left Touch controller
        if (OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandLeft, ref handState)) {

          // Send the distal bone rotations (thumb, index, middle, ring, pinky) and the wrist position
          _streamClient.SendPacketLeftHand(
            Utilities.MapToByteHalf(handState.BoneRotations[5].x),
            Utilities.MapToByteHalf(handState.BoneRotations[5].y),
            Utilities.MapToByteHalf(handState.BoneRotations[5].z),
            Utilities.MapToByteHalf(handState.BoneRotations[8].x),
            Utilities.MapToByteHalf(handState.BoneRotations[8].y),
            Utilities.MapToByteHalf(handState.BoneRotations[8].z),
            Utilities.MapToByteHalf(handState.BoneRotations[11].x),
            Utilities.MapToByteHalf(handState.BoneRotations[11].y),
            Utilities.MapToByteHalf(handState.BoneRotations[11].z),
            Utilities.MapToByteHalf(handState.BoneRotations[14].x),
            Utilities.MapToByteHalf(handState.BoneRotations[14].y),
            Utilities.MapToByteHalf(handState.BoneRotations[14].z),
            Utilities.MapToByteHalf(handState.BoneRotations[18].x),
            Utilities.MapToByteHalf(handState.BoneRotations[18].y),
            Utilities.MapToByteHalf(handState.BoneRotations[18].z),
            Utilities.MapToByte(handState.RootPose.Position.x),
            Utilities.MapToByte(handState.RootPose.Position.y),
            Utilities.MapToByte(handState.RootPose.Position.z));
        } else if (OVRInput.GetControllerPositionTracked(OVRInput.Controller.LTouch)) {

          var controller = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
          var thumbStick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.LTouch);
          var indexTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.LTouch);
          var handTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);
          var buttonA = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.LTouch);
          var buttonB = OVRInput.Get(OVRInput.Button.Two, OVRInput.Controller.LTouch);

          _streamClient.SendPacketLeftController(
            Utilities.MapToByte(controller.x),
            Utilities.MapToByte(controller.y),
            Utilities.MapToByte(controller.z),
            Utilities.MapToByte(thumbStick.x),
            Utilities.MapToByte(thumbStick.y),
            Utilities.MapToByteHalf(indexTrigger),
            Utilities.MapToByteHalf(handTrigger),
            (byte)(buttonA ? 1 : 0),
            (byte)(buttonB ? 1 : 0));
        }

        // Right hand: prefer hand tracking, otherwise fall back to the right Touch controller
        if (OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandRight, ref handState)) {

          _streamClient.SendPacketRightHand(
            Utilities.MapToByteHalf(handState.BoneRotations[5].x),
            Utilities.MapToByteHalf(handState.BoneRotations[5].y),
            Utilities.MapToByteHalf(handState.BoneRotations[5].z),
            Utilities.MapToByteHalf(handState.BoneRotations[8].x),
            Utilities.MapToByteHalf(handState.BoneRotations[8].y),
            Utilities.MapToByteHalf(handState.BoneRotations[8].z),
            Utilities.MapToByteHalf(handState.BoneRotations[11].x),
            Utilities.MapToByteHalf(handState.BoneRotations[11].y),
            Utilities.MapToByteHalf(handState.BoneRotations[11].z),
            Utilities.MapToByteHalf(handState.BoneRotations[14].x),
            Utilities.MapToByteHalf(handState.BoneRotations[14].y),
            Utilities.MapToByteHalf(handState.BoneRotations[14].z),
            Utilities.MapToByteHalf(handState.BoneRotations[18].x),
            Utilities.MapToByteHalf(handState.BoneRotations[18].y),
            Utilities.MapToByteHalf(handState.BoneRotations[18].z),
            Utilities.MapToByte(handState.RootPose.Position.x),
            Utilities.MapToByte(handState.RootPose.Position.y),
            Utilities.MapToByte(handState.RootPose.Position.z));
        } else if (OVRInput.GetControllerPositionTracked(OVRInput.Controller.RTouch)) {

          var controller = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
          var thumbStick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.RTouch);
          var indexTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
          var handTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
          var buttonA = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);
          var buttonB = OVRInput.Get(OVRInput.Button.Two, OVRInput.Controller.RTouch);

          _streamClient.SendPacketRightController(
            Utilities.MapToByte(controller.x),
            Utilities.MapToByte(controller.y),
            Utilities.MapToByte(controller.z),
            Utilities.MapToByte(thumbStick.x),
            Utilities.MapToByte(thumbStick.y),
            Utilities.MapToByteHalf(indexTrigger),
            Utilities.MapToByteHalf(handTrigger),
            (byte)(buttonA ? 1 : 0),
            (byte)(buttonB ? 1 : 0));
        }

        // Headset orientation
        _streamClient.SendPacketHeadset(
          Utilities.MapToByte(Camera.current.transform.localRotation.x),
          Utilities.MapToByte(Camera.current.transform.localRotation.y),
          Utilities.MapToByte(Camera.current.transform.localRotation.z));
      }
    }
  }
}
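The example above relies on the MapToByte and MapToByteHalf helpers from the test app's Utilities class, which compress float values into single bytes before they are sent to the robot skill. The actual implementation ships in the Comm folder of the downloadable test app; purely as a hypothetical sketch (the input ranges of -1..1 and 0..1 are assumptions, not the app's real values), helpers of this kind could look like the following:

// Illustration only: byte-packing helpers similar in spirit to the test app's
// Utilities.MapToByte / MapToByteHalf. The assumed input ranges are guesses;
// use the Utilities class from the downloaded Comm folder in a real project.
static class MappingSketch {

  // Map a value from an assumed -1..1 range onto 0..255.
  public static byte MapToByte(float value) {
    float clamped = value < -1f ? -1f : (value > 1f ? 1f : value);
    return (byte)((clamped + 1f) * 0.5f * 255f);
  }

  // Map a value from an assumed 0..1 range onto 0..255.
  public static byte MapToByteHalf(float value) {
    float clamped = value < 0f ? 0f : (value > 1f ? 1f : value);
    return (byte)(clamped * 255f);
  }
}

In a real project, use the Utilities implementation from the Comm folder so the scaling matches what the robot skill expects on the ARC side.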
We have to test that now. It always depends on who is looking at the stereoscopic image, because every person is different; some are very sensitive and quickly get problems if the two pictures differ too much.
I have fewer problems with it. I taught myself to look at side-by-side pictures and videos using the "Crossview" technique. Once you get the hang of it, it's very entertaining: I just need to look closely at a side-by-side picture or video and the stereoscopic image appears in front of my eyes, in the middle. It's no joke; here is some information about it, e.g.:
https://en.wikipedia.org/wiki/Stereoscopy#Freeviewing https://www.flickr.com/photos/28458566@N08/16661278842/
To practice it, you can cover each of the pictures in turn; over time you won't need that anymore...
https://www.deviantart.com/aporev/art/Cross-view-Stereo-Tutorial-111098009
.
Wow, that's neat and comes in handy, I bet. I'm ordering a second camera like yours to test more with. It will be here in a few days.
When using the separate left and right cameras, can you get a menu canvas to display in the center camera for both eyes?
I have Per Eye Cameras set for the OVR Camera Rig.
My canvas menu sits under the Center camera.
The per-eye cameras render their game objects correctly, but the center camera does not display the UI menu. Any ideas?
Thanks, I forgot to include the "Per Eye Cameras" setting, I've now added it above.
I tested a little; I can display a canvas on both the left and right cameras:
- Blue cube on the "right" layer
- Green cube on the "left" layer
- Red canvas element on the "UI" or "Default" layer
I switched the canvas Render Mode to World Space and positioned the canvas.
Does this help?
As I understand it, the center camera is only displayed in the editor and has no effect on the display in the Oculus with the "Per Eye Cameras" setting.
I also found something about the problem that a canvas cannot be hidden via layers:
https://answers.unity.com/questions/1507202/camera-culling-mask-for-ui-elements-not-working.html
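If it helps, here is a rough script sketch of the setup described above. It is only an illustration: the object names ("Menu Canvas", "LeftEyeAnchor", "RightEyeAnchor") and the layer names used in this thread ("left", "right") are assumptions and must be adjusted to your own scene.

using UnityEngine;

// Sketch only: reproduces the World Space canvas / per-eye layer setup described above.
// Object and layer names are assumptions and must match your scene.
public class PerEyeCanvasSetup : MonoBehaviour {

  void Start() {
    // Put the menu canvas into world space so both per-eye cameras can render it.
    var canvas = GameObject.Find("Menu Canvas").GetComponent<Canvas>();
    canvas.renderMode = RenderMode.WorldSpace;
    canvas.transform.position = new Vector3(0f, 1.5f, 2f); // place it in front of the rig
    canvas.transform.localScale = Vector3.one * 0.01f;     // world-space canvases need scaling down

    // Each per-eye camera hides the other eye's layer but keeps UI/Default visible.
    var leftCam = GameObject.Find("LeftEyeAnchor").GetComponent<Camera>();
    var rightCam = GameObject.Find("RightEyeAnchor").GetComponent<Camera>();
    leftCam.cullingMask &= ~(1 << LayerMask.NameToLayer("right"));
    rightCam.cullingMask &= ~(1 << LayerMask.NameToLayer("left"));
  }
}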
.
I quickly created a camera rig for 2 x Microsoft LifeCam HD-3000 webcams.
Instructions for disassembling the cameras can be found here (InMoov users should know this ;-) ):
https://astrophotovideo.wordpress.com/adapting-a-webcam-to-a-telescope/
You can find the STL file here: Stereo_LiveCams_HD-3000_mount.stl
One more thing: a GoPro/EZ tripod adapter for easier testing...
STL File: Stativ_auf_EZ.stl
.
Awesome! I'll be using that when my second cam arrives.
Did you see my post above answering your question about the canvas? Might that help?
It sure does. I don't think I can have a popup menu, so I might have to make the menu a separate scene.