Servo And Camera Server

Servo & camera video server for Unity which allows a remote client to move servos and receive a video stream from a camera device.

How To Add This Control To Your Project
  1. Make sure you have the latest version of ARC installed.
  2. Select the Get button on this page to download the archive file.
  3. Double-click the downloaded archive file to run the installer.
  4. The installer will add this control to ARC.
  5. Load ARC and press the Project -> Add Control button from the menu.
  6. Choose the Servo category tab.
  7. Press the Servo And Camera Server icon to add the control to your project.


This is a servo & camera video server for Unity which allows a remote client to move servos and receive a video stream from a camera device. It is specifically useful for those creating Unity apps that operate as a client to ARC, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit.


Example Client App Source Code
Here is an example test app source that connects to localhost, moves a servo on port D2 and displays the camera video stream. The sample app is C# .NET source code and can be downloaded at this link: Test App
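The steps the test app performs can be sketched roughly as follows. ServoServerClient, Start, OnImageDataReady and SetCachedServoPosition all appear elsewhere on this page, but the connection address and the exact SetCachedServoPosition signature are assumptions here; treat this as an outline and consult the downloadable test app for the real calls.

```csharp
using System;
using EZ_Robot_Unity_DLL;

class ServoServerDemo {
  static void Main() {
    var client = new ServoServerClient();

    // Report each camera frame as it arrives from ARC
    client.OnImageDataReady += imageData =>
      Console.WriteLine("Received frame: " + imageData.Length + " bytes");

    // Connect to ARC on this machine; 8282 is the port used by the plugin
    client.Start("127.0.0.1", 8282);

    // Move the servo on port D2. The port identifier and position range
    // used here are assumptions -- see the test app for the actual signature.
    // client.SetCachedServoPosition("D2", 90);

    // Keep the console app alive while frames stream in
    Console.ReadKey();
  }
}
```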

Raspberry Pi
This plugin will work with ARC on a Raspberry Pi, but requires sudo privileges. This means you will have to launch ARC with sudo mono ARC.exe. This is because the plugin binds to a TCP port and therefore requires root privileges to act as a network server program.

Test Unity Project
I have included a test Unity project as a reference example. The example rotates a cube on the screen using the ARROW keys. The cube projects the texture from the camera stream onto itself. The arrow keys will also move the servos connected to ports D0 and D1 relative to the rotation of the cube. You can download the project here: Servo

Use In Unity
The stream client files in the "Test App" can be included in a Unity project to receive the video and move servos. The test app demonstrates how to move the servos using the client methods, and how to display the video on Windows. To display the video in Unity, follow the steps below. The video frames are loaded into a Texture2D that can be applied to any material.

1) To use this in your Unity App, copy the files from the Test App\ServoServerClient\*.cs into your Unity project.

2) Here is an example bit of Unity code that can be attached to an object in Unity:


using EZ_Robot_Unity_DLL;
using UnityEngine;

namespace SceneRobotView {

  public class RobotView : MonoBehaviour {

    // The stream client
    ServoServerClient _streamClient;

    // We want to ensure the scene has been initialized by at least one Update() event occurring before writing to a material.
    // This is because we don't want to write to a material that hasn't been initialized yet.
    bool _initialized = false;

    // To be friendly to the garbage collector, we re-use this same texture object.
    // The object gets initialized in the Start() for this scene.
    Texture2D _texture;

    // The data for the camera will arrive at a different rate than the scene.
    // To ensure there is synchronization between the two updates, we will store the camera video in this array
    // and load it into the texture on every Update()
    volatile byte[] _toDisplay = new byte[] { };

    void Start() {

      // Initialize the texture with the video format from the servo server
      _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

      // The stream client will connect to the servo server.
      // Probably a good idea to make the address a user-configurable item rather than hard-coded ;)
      _streamClient = new ServoServerClient();
      _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
      _streamClient.Start("", 8282);
    }

    // Every time the camera sends a frame, this event is raised
    private void _streamClient_OnImageDataReady(byte[] imageData) {

      // Exit if the scene has not been initialized yet
      if (!_initialized)
        return;

      // Store the image data in an array which we will use in the Update() on the material
      _toDisplay = imageData;
    }

    void OnDisable() {

      // Stop the streaming because we're closing this scene
      _streamClient.Stop();
    }

    // Called with every update frame of the scene
    void Update() {

      _initialized = true;

      // If there is data in the camera array...
      if (_toDisplay.Length > 0) {

        // ... decode the camera frame into the texture
        _texture.LoadImage(_toDisplay);

        // Apply the texture to the material of the current component
        var material = GetComponent<Renderer>().material;
        material.mainTexture = _texture;
        material.mainTextureScale = new Vector2(-1, -1);
      }
    }
  }
}

#60

It's working. 8282
Well, that's frustrating.
@fxrtst When you said the thing about the disconnect, it came back to my mind that I had an issue like that before... try removing the camera control from your project and adding it again, just a guess into the blue!!:)
#63
#64
Can you guys tell me the version of ARC you are using?
.....ok. This should go in the description under prerequisites: must use the latest beta. WILL NOT work with 1.26.

Got it working by installing the latest beta.
Keeping my fingers crossed for you...and also I would try the EZ-B camera, just to make sure!!
But I guess updating will fix your problem!!:)
#68
....yeah, I spent so many hours messing around....grrrr

Do you have this working with Bio IK?
Yes, it's all working!! But I can tell you this needs a lot of tweaking if you want to create fluid motion... I got my scene set up, so I can use the Virtual JD scene to move my own robot!!
It's a bit off, because the measurements of the JD and my own robot do not really match, but it works!!

You can set it up any way you like, I guess I can always reproduce your setup in a way!!
Let me know how it's all working for you!!:)
No fluid motion because? What are your thoughts? I wonder, the more servos you have, if it's just too much for a live setup?
#71
I've managed to break it a couple of times. Sent rotations for 3 servos real quick and it stopped responding, and I had to reboot the IoTiny to get it to work again. I can see the jerky movements in the servo positions. Not sure what that is. Could be that the positions are sent once collected, causing a bit of a lull between positions.
Did you use this method... _streamClient.SetCachedServoPosition

But yes, it's choppy. I guess to get smooth motion you would have to do something to those values!
Either way: form an array of values over time and send an equalized value every Nth frame, or send them at a given framerate... it's all options!
It all works as it should, I guess, but it still needs a great deal of tweaking!!:)
Yes, I used that method. I think you are correct, something needs to be done to those values to smooth out the motion. For sure needs some tweaking. But a great start.
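One way to tame the choppiness being discussed is to ease each servo toward its latest target at a bounded speed, sending small steps every frame instead of forwarding raw rotation values directly. This is only a sketch of that idea; the SetCachedServoPosition call is commented out because its exact signature is an assumption here, and the speed and position limits are placeholders to tune.

```csharp
using UnityEngine;

// Smooths servo motion by easing toward the latest target position a little
// each frame, instead of jumping straight to every raw rotation value.
public class SmoothedServo : MonoBehaviour {

  // Maximum degrees per second the servo is allowed to move -- tune to taste
  public float maxSpeed = 120f;

  float _current = 90f;   // last position we sent
  float _target = 90f;    // most recent requested position

  // Call this with the raw (jerky) value, e.g. derived from the cube rotation
  public void SetTarget(float degrees) {
    _target = Mathf.Clamp(degrees, 1f, 180f);
  }

  void Update() {
    // Step at most maxSpeed * deltaTime degrees toward the target this frame
    _current = Mathf.MoveTowards(_current, _target, maxSpeed * Time.deltaTime);

    // Forward the eased value to ARC; the exact SetCachedServoPosition
    // signature is an assumption -- see the test app for the real call.
    // _streamClient.SetCachedServoPosition(port, (int)_current);
  }
}
```

Alternatively, as suggested above, you could buffer the raw values and send an averaged position every Nth frame; both approaches trade a little latency for smoother motion.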
#74
The title says it all. Step right up ladies and gentlemen! Spin the cube and watch wormy's mesmerizing dance!
That seems to be super fluid...are you using the default code, or did you do anything to those servo values already?
That's default. And I think that the cube is spinning so fast that the servos are forced to update faster. But if I slow down the movement it's very jerky. Almost as if the steps are larger than a degree or two.
Ah OK, I was thinking it's only my setup that does this...
Hey guys...I tried to use the Raspberry Pi camera in Unity for the setup with the Servo-and-Camera-Server, but had no luck!
It gave me the following error in Unity 2018!
Any ideas what this might be?

SocketException: An attempt was made to access a socket in a way forbidden by its access permissions.

System.Net.Sockets.Socket.Bind (System.Net.EndPoint local_end)
SyntaxTree.VisualStudio.Unity.Messaging.UdpSocket.Bind (System.Net.IPAddress address, Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager..ctor (Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager.BindTo (Int32 port)
SyntaxTree.VisualStudio.Unity.Bridge.VisualStudioIntegration+<>c__DisplayClass9_0.<.cctor>b__0 ()
SyntaxTree.VisualStudio.Unity.Bridge.Project+<>c__DisplayClass37_0.b__0 ()
UnityEditor.EditorApplication.Internal_CallUpdateFunctions () (at C:/buildslave/unity/build/Editor/Mono/EditorApplication.cs:200)
It was an old version of the EZBPi server that caused the error in Unity...I updated to the latest version and everything works just fine!!
So now there is one thing that I would like to know...
Since my robot is using serial bus servos, is there a way to drive them directly with the ServoServerClient?
Or is there a way to access those values that are being sent?

If this is not possible, I would send the values to another server already established on the Raspberry Pi...
But if there were a way to do it with the ServoServerClient, that would be kind of cool too!!:)