
Servo And Camera Server

A servo and camera video server that allows a remote client to move servos and receive a video stream from a camera device.

+ How To Add This Control To Your Project (Click to Expand)
  1. Make sure you have the latest version of ARC installed.
  2. Select the Get button in this page to download the archive file.
  3. Double-click the downloaded archive file to run the installer.
  4. The installer will add this robot skill to ARC.
  5. Load ARC and press the Project -> Add Control button from the menu.
  6. Choose the Servo category tab.
  7. Press the Servo And Camera Server icon to add the control to your project.

Manual

This is a servo and camera video server that allows a remote client to move servos and receive a video stream from a camera device. It is specifically useful for those creating Unity apps that operate as a client to ARC, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit.

Demo #1
This is an overview of how this robot skill can integrate with a Unity scene using the Unity animation tool.




Demo #2
This example uses a block in Unity that controls an EZ-Robot JD shoulder. It's a very simple example of how powerful and easy it is to control any servo in ARC. The blocks have gravity and a hinge, so as each block swings, its angle is read and pushed into ARC to move the respective servo.


Demo #3
This is Mickey's demonstration of controlling the servos through Unity with joysticks.


How It Works
The client (i.e., Unity in this case) connects to this robot skill over TCP. It streams servo positions to ARC and receives the camera view.
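The batch-and-send behavior used by the client (cache positions, then transmit them in one burst) can be sketched with a small self-contained example. The two-bytes-per-servo payload below is an illustrative assumption, not the actual ARC wire protocol — the real implementation lives in the `ServoServerClient` class of the test app — but it shows why one bulk packet is cheaper on bandwidth than one packet per servo.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of a servo-position cache that batches updates and
// flushes them as one payload, mimicking the semantics of
// SetCachedServoPosition() / SendCachedServoPositions().
class ServoCache
{
    readonly Dictionary<byte, byte> _pending = new Dictionary<byte, byte>();

    // Cache a position (1..180) for a servo port; nothing is sent yet.
    public void Set(byte port, byte position)
    {
        _pending[port] = position;
    }

    // Build one payload containing every cached (port, position) pair
    // and clear the cache. The (port, position) byte-pair layout is a
    // hypothetical format chosen only for this sketch.
    public byte[] Flush()
    {
        var payload = new List<byte>();
        foreach (var kv in _pending)
        {
            payload.Add(kv.Key);
            payload.Add(kv.Value);
        }
        _pending.Clear();
        return payload.ToArray();
    }
}

class Program
{
    static void Main()
    {
        var cache = new ServoCache();
        cache.Set(0, 90);   // D0 to centre
        cache.Set(1, 120);  // D1
        cache.Set(0, 45);   // overwrite D0 before sending; only the latest value goes out
        byte[] packet = cache.Flush();
        Console.WriteLine(packet.Length);        // 4: two servos, two bytes each
        Console.WriteLine(cache.Flush().Length); // 0: the cache was cleared
    }
}
```

Note how setting D0 twice before flushing produces only one entry in the packet — this is the same reason the Unity example writes all positions to the cache inside `Update()` and sends once at the end.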



Example Client App Source Code
Here is an example test app that connects to localhost (127.0.0.1), moves a servo on port D2, and displays the camera video stream. The sample app is C# .NET source code and can be downloaded below.

Download C# .Net Example Source code: Test App.zip (2020/12/17)


Test Unity Project
I have included a test Unity project for reference. The example rotates a cube on the screen using the arrow keys, and the cube projects the texture from the camera stream onto itself. The arrow keys also move the servos connected to ports D0 and D1 relative to the rotation of the cube.

You can download the project here: Unity Test.zip (2020/12/17)


Use In Unity
The stream client files in the "Test App" can be included in a Unity project to receive the video and move servos. The test app demonstrates how to move the servos using the methods, and how to display the video on Windows. To display the video in Unity, follow the steps below. The video works by becoming a Texture2D that can be applied to any material.

To use this in your Unity App, copy the files from the Test App\ServoServerClient\*.cs into your Unity project.



Examine Test Project
The Unity project displays the ARC camera stream on a rotating cube, while a 2D sprite controls servos D0 and D1 with its X and Y position, respectively. Clicking on the scene moves the sprite, which in turn moves the servos.



Any component within the group can have its position, rotation, etc. extracted and sent to ARC. If you have a 3D model of a robot, each joint's position/rotation can be sent to ARC.
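A joint's rotation must end up in ARC's 1-180 servo range, which is a simple linear remap. The helper below is a hand-rolled sketch of the idea (the test app's `ServoServerClient.MapToByte` performs the equivalent conversion); the assumed input range of -90 to +90 degrees and the clamping behavior are illustrative choices, not the library's documented contract.

```csharp
using System;

class ServoMap
{
    // Linearly remap a joint angle in [-90, +90] degrees to a servo
    // position in [1, 180], clamping out-of-range input.
    // The input range and formula are illustrative assumptions.
    public static byte AngleToServo(float degrees)
    {
        float clamped = Math.Max(-90f, Math.Min(90f, degrees));
        // -90 maps to 1, +90 maps to 180
        float pos = (clamped + 90f) / 180f * 179f + 1f;
        return (byte)Math.Round(pos);
    }

    static void Main()
    {
        Console.WriteLine(ServoMap.AngleToServo(-90f)); // 1
        Console.WriteLine(ServoMap.AngleToServo(90f));  // 180
        Console.WriteLine(ServoMap.AngleToServo(200f)); // clamped to 180
    }
}
```

The clamp matters in practice: a physics glitch or mocap spike that briefly reports an impossible joint angle should saturate at the servo limit rather than wrap around.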

The most interesting thing to look at is the Scene object -> ServoCameraGroup. Notice it has child GameObjects, which can be queried for their rotation, position, or whatever else is desired, and sent to ARC as servo positions. Also, the camera image can be rendered to any material as a 2D texture.


Look at the ServoCameraGroup to see the script


The script ServoCamera.cs is responsible for:

Start
- creating an instance of the StreamClient object
- having the StreamClient connect to ARC at an IP address (in this case the local machine, 127.0.0.1)
- assigning child GameObjects to local variables that will be used in Update (this keeps the CPU happy)

Update
- obtaining rotation/position/etc. data from the children and adding it to the servo position cache (in this example, a sprite position)
- sending the cache of servo positions
- displaying the incoming image on a material as a texture

Let's take a look at the code for ServoCamera.cs and read the comments to see how it works.

Code:


using EZ_Robot_Unity_DLL;
using UnityEngine;

public class ServoCamera : MonoBehaviour {

  ServoServerClient _streamClient;
  bool _initialized = false;
  Texture2D _texture;

  volatile byte[] _toDisplay = new byte[] { };

  Transform _cube;
  Transform _sprite;

  /// <summary>
  /// We have this script added to a parent that has children,
  /// because we can access the children's transformation or position from here to set servo positions.
  /// In Update, we'll just grab the children and use their data to send to ARC.
  /// </summary>
  void Start() {

    // This is the texture that will hold the camera image from ARC.
    // We apply this texture to a cube.
    _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

    // Assign local variables to the children so we don't have to search for them on each frame (keeps the CPU happy)
    _cube = gameObject.transform.Find("Cube");
    _sprite = gameObject.transform.Find("MoveableSprite");

    // Create a client that will connect to ARC at the specified IP address.
    // Once connected, any available video data from the ARC camera will raise the OnImageDataReady event.
    _streamClient = new ServoServerClient();
    _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
    _streamClient.Start("127.0.0.1", 8282);
  }

  /// <summary>
  /// This event is raised for every camera image that is received from the connected ARC server.
  /// We assign the image data to a volatile array that will be used in Update to refresh the texture with the latest image.
  /// </summary>
  private void _streamClient_OnImageDataReady(byte[] imageData) {

    if (!_initialized)
      return;

    _toDisplay = imageData;
  }

  void OnDisable() {

    _streamClient.Stop();
  }

  /// <summary>
  /// Unity runs this on every render frame.
  /// We get the keyboard input, and we map the sprite's X and Y positions
  /// to the D0 and D1 servo positions in ARC, respectively.
  /// </summary>
  void Update() {

    _initialized = true;

    if (Input.GetKey(KeyCode.Escape))
      Application.Quit();

    // Add the positions of the servos to the cache based on the location of the sprite.
    // We set the positions in the cache rather than trying to send each position individually.
    // That way we can send a bulk change, which is much faster on bandwidth.
    // So, add your servo positions to the cache and then send them all after.
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(_sprite.transform.position.x));
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(_sprite.transform.position.y));

    // Send all the servo positions that have been cached
    _streamClient.SendCachedServoPositions();

    // Display the latest camera image by rendering it to the texture and applying it to the cube's material
    if (_toDisplay.Length > 0) {

      _texture.LoadImage(_toDisplay);

      var material = _cube.GetComponent<Renderer>().material;
      material.mainTexture = _texture;
      material.mainTextureScale = new Vector2(1, -1);
    }
  }
}

PRO
USA
#114   — Edited
@mickey...are you using Flask? I'd be interested in seeing a video with that setup...I'm a visual learner too.

Unreal is a super massive download at 25 gigs!!!...so I'm also looking into free Blender as another option...much smaller footprint, which could solve some of my issues with a gaming engine. Blender is a proper animation program, so it has built-in IK, motion blending, motion controls, etc. It's all modules coded in Python. So I just need to find a way to get data out of Blender to Flask to ARC?!
#115  
OMG, you got me on that one...Blender!!!
This is a delicate subject, since I would strongly advocate for Blender. I would tell everyone to use it, to start on this platform right away! It's open source, it's powerful, and Python is integrated into the very core of its engine...
It runs on all major operating systems: Windows, macOS and Linux.
And it is free, like open-source free...free to the bare bone of it!!!

But I've spent so much time learning software that I just did not dare to walk down that road yet!
It's kind of a hard transition most of the time, and I heard that the interface does not make it easier...

BUT...if you are going down that road, I'm super curious how things will work out for you!
Positive feedback could also be a push in that direction for myself!!

And yes, sending data out from Blender should be super easy!! :)
#116  


Finally got to film and cut a demo...There is still some fine-tuning to be done regarding the IK; it's a bit harder to control with a PS4 controller than in VR. But all in all it works quite well!
The bit of jitter you see comes more from the structural build integrity of the robot than from the servos themselves!

I will make a short description of how I connect to ARC in a separate thread, but it's not difficult at all! :)
PRO
Synthiam
#117  
Oh that's great! Super awesome actually! :) Amazing how talented you guys are. When I was playing in Unity, there wasn't much I could do. I'm pretty terrible at design lol
PRO
USA
#118   — Edited
Nice job! So is this demoing pushing to Flask? Because this is really smooth motion...what is different?

I spent most of last night downloading Blender and looking around...coming from a LightWave background...once you know one animation program, you kinda know them all. Since you come from the Max universe, it should be simple to master. Most tools are there...just finding them is the trick with new software. I'll let you know my progress.
#119   — Edited
Thanks, and yes, the current version is using Flask and ARC, so it's all tried and tested to work out like shown in the demo...
The only thing is that I did not test it on any other servo setup, since ZOE is the only fully assembled robot in the house, and this setup is being run on LX-16A serial bus servos. But I guess any type of servo setup should work. It's all kinda the same thing! :)

I also tried to get my setup done using the Servo And Camera Server, which will most likely work as well or better. But there is the minor annoyance that this would take more than the one allowed plugin available in my free subscription plan, because I need the Servo And Camera Server & LX-16A plugin to get this to work...maybe there will be a developer subscription plan for ARC at some point in the future?

But it's all working as it should by using the script that I have written in ARC to get the data from the server, so all is working out great with ARC free...I will post a tutorial on how to set it up soon! :)
PRO
USA
#120  
...I look forward to that tutorial. It will help me better understand your process so I can think about a Blender application.
#122  


Hey just wanted to share this, because I think this was something you were aiming for, right?
It works with Unity, Unreal and Blender... so it is pretty versatile!

Although I am still thinking that creating custom animations for a robot using a VR setup is the best possible option.

Rigging humanoid motion directly to a robot is kinda like trying to take mocap data from a hummingbird and map it to a monkey; it's just such a different setup!
But it of course all depends on how you do it; maybe there is something I did not think of when it comes to using mocap data! :)
#123  


This is a great insight into inverse kinematics. It is in Unity...but it can be applied to any other software!
You mentioned that you wanted to build your own IK system to be independent of plugins...
This should give you a good start!! :)
PRO
Synthiam
#124  
This robot skill does all that. There are instructions at the top of this manual page on how to do it. I've also made some videos that are published on this manual page with examples.

It doesn't matter how the Unity model is controlled. The angles will be sent to ARC, which will move the respective servos. Any kind of servo can be used, including serial bus or PWM. Simply add the specific robot skill for your servo type if they're bus servos; otherwise just select the Dx ports for PWM servos.
PRO
USA
#125   — Edited
Pretty cool. I saw another review on YouTube, and the motion is only being captured from one camera angle, so there are issues with the motion when viewed from another angle...but not too bad.

@mickey in ref to mocap data:

My new approach has been retargeting. You load a .bvh file or use live streaming, and then retarget parts of the mocap data (local rotational or translational) to a bunch of nulls that represent the joints of a robot. This way the nulls do not have to inherit the world space from the mocap data. The nulls' rotational data is then sent to some math to calculate the relationship from null rotation to servo positions (1-180).

Problem: danger of exposure to a virus like COVID for an employee, or the cost of an employee (outsourcing to another country for savings).

Solution - Telepresence: This would be a great demo of a real-world use case for Exosphere. A driver (wearing a mocap suit) could stock shelves in a store thousands of miles away using a robot. The wearer could also have VR goggles so he/she could see the environment, products, etc. As they move in the suit (maybe just the upper body), those motions are streamed live to the robot.
PRO
Synthiam
#126  
Assign the joints and use the provided Map function to convert angles into servo degrees. The examples are included in this robot skill. This robot skill operates over TCP, which is what the internet uses, so it can be used over the internet and provides a camera feed.

This is essentially a telepresence skill. And if you combine this with Exosphere by following the telepresence instructions, it's already ready to go.
PRO
USA
#127  
Yeah, the challenge is on the other end. The motion capture software must be linked to Unity and then stream motion capture data "live" onto a control rig (null unit) within Unity, where the world space must be ignored. There are a lot of hurdles to overcome, including occlusion/intersections of data from human to robot, and reducing spikes in the data (filtering) without introducing latency, which in a telepresence environment could spell trouble. If it was easy then everyone would be doing it! But that's the call to challenge, isn't it?
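One common low-cost approach to the spike/latency trade-off mentioned above is an exponential moving average: each incoming sample is blended with the previous output, smoothing jitter with only one sample of state so added latency stays small (at the cost of some lag when the smoothing is heavy). A minimal sketch — the smoothing factor of 0.5 is an arbitrary illustrative choice, not a recommendation:

```csharp
using System;

class Smoother
{
    // Exponential moving average over incoming servo positions.
    // alpha = 1.0 passes input through unchanged; smaller alpha
    // smooths more but lags more.
    readonly double _alpha;
    double _state;
    bool _primed;

    public Smoother(double alpha) { _alpha = alpha; }

    public double Next(double sample)
    {
        if (!_primed) { _state = sample; _primed = true; }
        else _state = _alpha * sample + (1 - _alpha) * _state;
        return _state;
    }

    static void Main()
    {
        var f = new Smoother(0.5);
        // A one-frame spike of 180 in a steady 90 signal is halved,
        // then decays, instead of slamming the servo to its limit.
        foreach (double s in new double[] { 90, 90, 180, 90, 90 })
            Console.WriteLine(f.Next(s).ToString(
                System.Globalization.CultureInfo.InvariantCulture));
    }
}
```

A median filter over a short window rejects single-sample spikes even better, but it delays the output by half the window, which is exactly the latency a telepresence loop cannot afford; the EMA is the usual compromise.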
#128  

Quote:

This robot skill does all that. There are instructions at the top of this manual page on how to do it. I've also made some videos that are published on this manual page with examples.

It doesn't matter how the Unity model is controlled. The angles will be sent to ARC, which will move the respective servos. Any kind of servo can be used, including serial bus or PWM. Simply add the specific robot skill for your servo type if they're bus servos; otherwise just select the Dx ports for PWM servos.

@DJSures sorry, I was hijacking this thread for different topic information that I thought @fxrtst was interested in...the content I posted had nothing to do with the ARC skill; feel free to remove it or move it to another thread?
When I said plugin, I was referring to Unity/Unreal plugins and not to the ARC skill!
The ARC skill works perfectly fine!! :)
#129  
@fxrtst those problems could be easily solved by just using a VR headset...I really don't get the point of struggling with mocap! But you probably have something very specific in mind?

In VR, if you want to get really fancy and the head and hand positions are not enough for you...
you could also pump it up with a few Vive controllers!
But IK endpoints and retargeting are something that needs to be done anyway...so let's see what you have up your sleeve!! I am curious!! :D
PRO
Synthiam
#130   — Edited
We have a few customers who use Exosphere for remote telepresence and use finger haptic gloves. That's what this robot skill is being used for.

The physical robot is mirrored in Unity. When the user moves, the servoed joints can't hit each other because the Unity scene prevents it.

It'd be really great to see you guys using this stuff like some of the universities are. Their remote telepresence and Exosphere use would impress you, as it's right up the alley you're discussing.

What issues are you having with this robot skill? If you ask, I'll be able to help. You guys are pretty much the only people who use the community forum, and I want you to be successful.
PRO
USA
#131  
It's hard to get you guys inside my head. Just know whatever it is, it's going to be amazing..lol.
PRO
Synthiam
#132   — Edited
I have no doubt it will be :) It always is!

Just don't want you to think you need to reinvent what we already have or can quickly modify. The leading cause of failed robot companies is full-stack development - meaning making stuff that already exists rather than building on top of it.

Robotics seems to be the one industry that is not moving very fast for that reason. I mean, we all use operating systems and computers and programs that other people/companies make. But in robotics everyone wants to reinvent. There's a trail of robot company failures attributed to that.

Let me know what you're having trouble with and I'll help so you don't have to rebuild everything. It'll just drain your bank account and energy otherwise :)
#133   — Edited
Sorry...I just realized my Unity app has a problem!!
But I will keep you guys posted!! :)