
Single Camera Servo Server

by Synthiam

Servo and camera video server that allows a remote client to move servos and receive a video stream from a camera device.

Requires ARC v10 (Updated 6/14/2022)

How to add the Single Camera Servo Server robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Virtual Reality category tab.
  5. Press the Single Camera Servo Server icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Single Camera Servo Server robot skill.


How to use the Single Camera Servo Server robot skill

This is a servo & camera video server which allows a remote client to move servos and receive a video stream from a camera device. It is specifically useful for those creating Unity apps that operate as a client to ARC, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit.

Demo #1 This is an overview of how this robot skill can integrate with a Unity scene using the Unity animation tool.

Demo #2 This example uses a block in Unity that controls an EZ-Robot JD shoulder. It's a very simple example of how powerful and easy it is to control any servo in ARC. The blocks have gravity and a hinge, so as each block swings, its angle is read and pushed into ARC to move the respective servo.

Demo #3 This is Mickey's demonstration of controlling the servos through Unity and joysticks.

How It Works Code from the client (i.e., Unity in this case) connects to the camera servo skill over TCP. It streams the servo positions and receives the camera view. 

User-inserted image
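As a rough sketch of that flow (assuming the ServoServerClient class from the Test App below; the servo port D0 and the mapped value are only illustrative), a client creates the stream client, subscribes to camera frames, and sends cached servo positions back to ARC:

using EZ_Robot_Unity_DLL;

public class MinimalClientSketch {

  ServoServerClient _client;

  public void Connect() {

    // Create the client and subscribe to camera frames before starting the connection
    _client = new ServoServerClient();
    _client.OnImageDataReady += (byte[] imageData) => {
      // imageData holds one camera frame from the ARC camera device
    };

    // Connect to ARC on the local machine (8282 is the port used in the examples below)
    _client.Start("127.0.0.1", 8282);
  }

  public void MoveServo(float value) {

    // Cache one or more servo positions, then send them in a single bulk update
    _client.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _client.MapToByte(value));
    _client.SendCachedServoPositions();
  }

  public void Disconnect() {

    _client.Stop();
  }
}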

Example Client App Source Code Here is an example test app source that connects to localhost (127.0.0.1), moves a servo on port D2, and displays the camera video stream. The sample app is C# .NET source code and can be downloaded.

Download C# .Net Example Source code: Test App.zip (2020/12/17)

Test Unity Project I have included a test Unity project as a reference example. The example rotates a cube on the screen using the ARROW keys. The cube projects the texture from the camera stream onto it. The arrow keys also move the servos connected to ports D0 and D1 relative to the rotation of the cube.

You can download the project here: Unity Test.zip (2020/12/17)

Use In Unity The stream client files in the "Test App" can be included in a Unity project to receive the video and move servos. The test app demonstrates how to move the servos using the provided methods and how to display the video on Windows. To display the video in Unity, follow the steps below. The video works by becoming a Texture2D that can be applied to any material.

To use this in your Unity App, copy the files from the Test App\ServoServerClient*.cs into your Unity project.
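In condensed form, this is roughly what the full ServoCamera.cs shown further down does with each incoming frame (the class and method names here are only for illustration):

using UnityEngine;

public class VideoToMaterialSketch : MonoBehaviour {

  Texture2D _texture;

  void Start() {
    // Texture that will receive each camera frame from ARC
    _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);
  }

  // Call this with the byte[] delivered by the stream client's OnImageDataReady event
  public void ApplyFrame(byte[] frameBytes) {
    _texture.LoadImage(frameBytes);

    var material = GetComponent<Renderer>().material;
    material.mainTexture = _texture;
    material.mainTextureScale = new Vector2(1, -1); // flip vertically, as the full example does
  }
}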

User-inserted image

Examine Test Project The Unity project displays the ARC camera stream on a rotating cube, while allowing the 2D sprite to control servos D0 and D1 with its X and Y position, respectively. Clicking on the scene will move the sprite and also move the servos.

User-inserted image

Any components within the group can have their position or rotation, etc. extracted and sent to ARC. If you have a 3d model of a robot, each joint position/rotation can be sent to ARC.
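For example (a hedged sketch: the child joint name, the clamped 0-180 range, and the use of port D2 are assumptions for illustration, and the camera must still be streaming for the positions to transmit):

using EZ_Robot_Unity_DLL;
using UnityEngine;

public class JointToServoSketch : MonoBehaviour {

  ServoServerClient _streamClient;
  Transform _shoulder;

  void Start() {
    // Hypothetical child joint of this GameObject; the name is only for illustration
    _shoulder = gameObject.transform.Find("Shoulder");

    // Connect to ARC the same way the ServoCamera.cs example does
    _streamClient = new ServoServerClient();
    _streamClient.Start("127.0.0.1", 8282);
  }

  void Update() {
    // Read the joint's Z rotation and clamp it into an assumed 0-180 degree servo range
    float angle = _shoulder.localEulerAngles.z;
    byte position = (byte)Mathf.Clamp(Mathf.RoundToInt(angle), 0, 180);

    // Cache the position, then send all cached positions as one bulk update
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D2, position);
    _streamClient.SendCachedServoPositions();
  }

  void OnDisable() {
    _streamClient.Stop();
  }
}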

The most interesting thing to look at is the Scene object -> ServoCameraGroup. Notice it has child GameObjects. Those child GameObjects can be queried for their rotation or position or whatever is desired and sent to ARC as servo positions. Also, the camera image can be rendered to any material as a 2d texture.

User-inserted image

Look at the ServoCameraGroup to see the script

User-inserted image

The script ServoCamera.cs is responsible for the following.

Start

  • create an instance of the StreamClient object
  • have the StreamClient connect to ARC at an IP address (in this case the local machine, 127.0.0.1)
  • assign child GameObjects to local variables that will be used in Update (this makes the CPU happy)
  • connect to the ARC server

Update

  • obtain rotation/position/whatever data from the children and add it to the servo position cache (in this example, a sprite position)
  • send the cache of servo positions
  • display the incoming image on a material as a texture

Let's take a look at the code for ServoCamera.cs and read the comments to see how it works.


using EZ_Robot_Unity_DLL;
using UnityEngine;

public class ServoCamera : MonoBehaviour {

  ServoServerClient _streamClient;
  bool _initialized = false;
  Texture2D _texture;

  volatile byte [] _toDisplay = new byte[]{ };

  Transform _cube;
  Transform _sprite;

  /// <summary>
  /// We have this script added to a parent that has children
  /// because we can access the children's transformation or position from here to set servo positions.
  /// In Update, we'll just grab the children and use their data to send to ARC.
  /// </summary>
  void Start() {

    // This is the texture that will hold the camera image from ARC
    // We apply this texture to a cube
    _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

    // assign a local variable to the children so we don't have to search for them on each frame (makes cpu happy)
    _cube = gameObject.transform.Find("Cube");
    _sprite = gameObject.transform.Find("MoveableSprite");

    //Create a client that will connect to ARC at the specified IP address
    // Once connected, any available video data from the ARC camera will raise the OnImageDataReady event
    _streamClient = new ServoServerClient();
    _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
    _streamClient.Start("127.0.0.1", 8282);
  }

  /// <summary>
  /// This event is raised for every camera image that is received from the connected ARC server.
  /// We assign the image data to a volatile array that will be used in Update to refresh the texture with the latest image.
  /// </summary>
  private void _streamClient_OnImageDataReady(byte[] imageData) {

    if (!_initialized)
      return;

    _toDisplay = imageData;
  }

  void OnDisable() {

    _streamClient.Stop();
  }

  /// <summary>
  /// Unity runs this on every render frame.
  /// We check the keyboard for the Escape key to quit,
  /// map the sprite's X and Y positions to the D0 and D1 servo positions in ARC, respectively,
  /// and render the latest camera image to the cube's material.
  /// </summary>
  void Update() {

    _initialized = true;

    if (Input.GetKey(KeyCode.Escape))
      Application.Quit();

    // add the position of the servos to the cache based on the location of the sprite
    // We set the positions to cache in this loop rather than trying to send a position each time
    // That way we can send a bulk change which is much faster on bandwidth
    // So, add your servo positions to the cache and then send them all after
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(_sprite.transform.position.x));
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(_sprite.transform.position.y));

    // Send all the servo positions that have been cached
    _streamClient.SendCachedServoPositions();

    // Display the latest camera image by rendering it to the texture and applying to the cube's material
    if (_toDisplay.Length > 0) {

      _texture.LoadImage(_toDisplay);

      var material = _cube.GetComponent<Renderer>().material;
      material.mainTexture = _texture;
      material.mainTextureScale = new Vector2(1, -1);
    }
  }
}



#25  

This is just a quick demo that this can work out... You can see the camera stream within the Unity app, and servo positions are being transferred live to ARC. This was utilized on the IOTiny, so I don't know if this caused the latency, or maybe my rainbow coding?:)

I am using @DJSures' code snippet

_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)SomeInt);

So the weird jitter might be caused by me using SetCachedServoPosition wrong. But this is the POC that this works out and just needs some tuning, I guess!!

Great step forward, these are some exciting days!!!:D

#26  

@DJSures I tried several things to get Unity sending the servo positions live to ARC, which was partly successful...what I am experiencing are random timeouts. So the servo position is not sent, which results in a jump to the next position rather than smooth movement...just guessing this of course! I was testing on an IOTiny and on an EZB v4, but I guess the problem is with my code in Unity!

So since you had this already figured out for the Virtual Reality Plugin, would you mind sharing your code on how to drive a couple of servos live? Just to make sure I did not do anything wrong? Or should I clean my code and post it here, so you can see what I was up to?:D

#27  
using EZ_Robot_Unity_DLL;
using BioIK;
using System.Collections;
using UnityEngine;

public class CameraCubeRotate : MonoBehaviour {

    ServoServerClient _streamClient;
    bool _initialized = false;
    public float _speed = 0.5f;
    Texture2D _texture;
    float positionAbs;
    public GameObject D0;
    public GameObject D1;
    public GameObject D2;
    volatile byte [] _toDisplay = new byte[]{ };

    void Start() {
        _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

        _streamClient = new ServoServerClient();
        _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
        _streamClient.Start("127.0.0.1", 8282);
    }

    private void _streamClient_OnImageDataReady(byte[] imageData) {
        if (!_initialized)
            return;

        _toDisplay = imageData;
    }

    void OnDisable() {
        _streamClient.Stop();
    }

    void Update() {
        _initialized = true;

        if (Input.GetKey(KeyCode.Escape))
            Application.Quit();

        // Rotate this object with the arrow keys and cache servo positions mapped from its rotation
        if (Input.GetKey(KeyCode.RightArrow)) {
            transform.Rotate(new Vector3(0, -_speed, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(transform.rotation.x));
        }
        if (Input.GetKey(KeyCode.LeftArrow)) {
            transform.Rotate(new Vector3(0, _speed, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(transform.rotation.x));
        }
        if (Input.GetKey(KeyCode.DownArrow)) {
            transform.Rotate(new Vector3(_speed, 0, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(transform.rotation.y));
        }
        if (Input.GetKey(KeyCode.UpArrow)) {
            transform.Rotate(new Vector3(-_speed, 0, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(transform.rotation.y));
        }

        // Read the Bio IK joint target values and convert them to servo positions
        BioJoint joint_D0 = D0.GetComponent<BioJoint>();
        double value_D0 = joint_D0.Z.GetTargetValue();
        int position_D0 = Mathf.RoundToInt((float)value_D0);
        int positionAbs_D0 = Mathf.Abs(position_D0 - 90);
        print(positionAbs_D0);

        BioJoint joint_D1 = D1.GetComponent<BioJoint>();
        double value_D1 = joint_D1.Y.GetTargetValue();
        int position_D1 = Mathf.RoundToInt((float)value_D1);
        int positionAbs_D1 = Mathf.Abs(180 - (position_D1 - 90));
        print(positionAbs_D1);

        BioJoint joint_D2 = D2.GetComponent<BioJoint>();
        double value_D2 = joint_D2.Y.GetTargetValue();
        int position_D2 = Mathf.RoundToInt((float)value_D2);
        int positionAbs_D2 = Mathf.Abs(position_D2 - 90);

        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs_D0);
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, (byte)positionAbs_D1);
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D2, (byte)positionAbs_D2);

        // Send all the servo positions if there's been a change
        if (_streamClient.HasServoChanged)
            _streamClient.SendCachedServoPositions();

        // Display the latest camera image on this object's material
        if (_toDisplay.Length > 0) {
            _texture.LoadImage(_toDisplay);
            var material = GetComponent<Renderer>().material;
            material.mainTexture = _texture;
            material.mainTextureScale = new Vector2(1, -1);
        }
    }
}
#28  

.............any luck?

#29  

Not really, I already dug out the old script that @ptp wrote while he was still trying to set this all up by directly talking to the EZB and bypassing ARC! He wrote some really good code that I am currently trying to rewrite...he had a really nice approach to handling the servos in Unity! But I will have to go to sleep now, it's 3am already!!:D

Did you ever try the Virtual Reality plugin? It should use the same code as the camera server? Does it work smoothly? I will test this tomorrow...to see how the latency is while using the plugin!! Anyways, I am having fun learning a lot of stuff here!! Let's hope we can tie it all together!!:)

#30  

Think you're overcomplicating how it works. Given the code you posted, I'm not certain of the full picture. Put the project on Dropbox or something so I can see what you've done?

I provided a complete and working Unity example above. Did you try it? Hope so, because I took time to make it for you:)

#31   — Edited

@DJSures Yes, I used your code and am using your plugin! I am making good progress on the servo setup; this should make it easier for everyone using the scene to declare the servos and their properties! I am sure it's me complicating stuff, or messing stuff up!!:D

Would you mind sending me a private mail, so I can send you a Dropbox link? The scene contains a paid plugin, so I don't want to share the link publicly!

P.S. I ordered two LewanSoul servos so I can test all this on my EZPi too!! It would be awesome if that 16-channel servo PWM board were supported in future updates!

#32   — Edited

My theory is that Bio IK is somehow changing the translation and rotation values to something that servo server cannot translate properly? Or there is a longer delay with SendCachedServoPositions than you expect (with as many servos as are in JD?)

I'd try a simple scene with just a box in Unity and connect rotational values to this plugin to stream to a single servo and see if it works like you suspect. Then add another box and Bio IK to create an IK chain. Does it act OK or do you get the jitters? Keep the test simple and add one thing at a time until it breaks.