Servo & camera video server which allows a remote client to move servos and receive a video stream from a camera device.
How to add the Single Camera Servo Server robot skill
- Load the most recent release of ARC (Get ARC).
- Press the Project tab from the top menu bar in ARC.
- Press Add Robot Skill from the button ribbon bar in ARC.
- Choose the Virtual Reality category tab.
- Press the Single Camera Servo Server icon to add the robot skill to your project.
Don't have a robot yet?
Follow the Getting Started Guide to build a robot and use the Single Camera Servo Server robot skill.
How to use the Single Camera Servo Server robot skill
This is a servo & camera video server which allows a remote client to move servos and receive a video stream from a camera device. It is specifically useful for those creating Unity apps that operate as a client to ARC, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit.
Demo #1
This is an overview of how this robot skill can integrate with a Unity scene using the Unity animation tool.
Demo #2
This example uses a block in Unity that controls an EZ-Robot JD shoulder. It's a very simple example of how powerful and easy it is to control any servo in ARC. The blocks have gravity and a hinge, so as a block swings, its angle is read and pushed into ARC to move the respective servo, as in the sketch below.
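A minimal sketch of what that block script might look like, assuming the block carries a Unity HingeJoint and a connected ServoServerClient (_streamClient, set up as in the full listing further down); the port choice is illustrative:
Code:
void Update() {
  // Read the current hinge angle as the block swings under gravity...
  float angle = GetComponent<HingeJoint>().angle;

  // ...and push it into ARC as a servo position on D0.
  // MapToByte is assumed here to scale the angle into the servo's byte range.
  _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(angle));
  _streamClient.SendCachedServoPositions();
}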
Demo #3
This is Mickey's demonstration of controlling the servos through unity and joysticks.
How It Works
Code from the client (i.e. Unity in this case) connects to the camera servo skill over TCP. It streams the servo positions and receives the camera view.

Example Client App Source Code
Here is an example test app that connects to localhost (127.0.0.1), moves a servo on port D2, and displays the camera video stream. The sample app is C# .NET source code and can be downloaded.
Download C# .Net Example Source code: Test App.zip (2020/12/17)
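If you just want to see the moving parts at a glance, here is a minimal console sketch of the same flow, assuming the ServoServerClient API used in the Unity listing below (Start, OnImageDataReady, SetCachedServoPosition, SendCachedServoPositions, Stop); the real test app in the download is the authoritative reference:
Code:
using System;
using EZ_Robot_Unity_DLL;

class MinimalClient {
  static void Main() {
    var client = new ServoServerClient();

    // Each received camera frame arrives as raw image bytes; displaying them is up to the host app.
    client.OnImageDataReady += imageData => Console.WriteLine("Frame received: " + imageData.Length + " bytes");

    // Connect to ARC on the local machine. 8282 is the port used throughout this page.
    client.Start("127.0.0.1", 8282);

    // Cache a position for the servo on port D2, then send the cached batch in one go.
    client.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D2, 90);
    client.SendCachedServoPositions();

    Console.ReadKey();
    client.Stop();
  }
}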
Test Unity Project
I have included a test Unity project for example reference. The example rotates a cube on the screen using the ARROW keys. The camera stream is projected onto the cube as a texture. The arrow keys also move the servos connected to ports D0 and D1 relative to the rotation of the cube.
You can download the project here: Unity Test.zip (2020/12/17)
Use In Unity
The stream client files in the "Test App" can be included in a Unity project to receive the video and move servos. The test app demonstrates how to move the servos using the methods, and how to display the video on Windows. To display the video in Unity, follow the steps below. The video works by becoming a Texture2D that can be applied to any material.
To use this in your Unity App, copy the files from the Test App\ServoServerClient\*.cs into your Unity project.
Examine Test Project
The Unity project displays the ARC camera stream on a rotating cube, while allowing the 2D sprite to control servos D0 and D1 via its X and Y position, respectively. Clicking on the scene will move the sprite and also move the servos.

Any components within the group can have their position or rotation, etc. extracted and sent to ARC. If you have a 3d model of a robot, each joint position/rotation can be sent to ARC.
The most interesting thing to look at is the Scene object -> ServoCameraGroup. Notice it has child GameObjects. Those child GameObjects can be queried for their rotation or position or whatever is desired and sent to ARC as servo positions. Also, the camera image can be rendered to any material as a 2d texture.

Look at the ServoCameraGroup to see the script

The script ServoCamera.cs is responsible for:
Start
- create an instance of the StreamClient object
- have the StreamClient connect to ARC at an IP address (in this case the local machine, 127.0.0.1)
- assign child GameObjects to local variables that we will use in Update (this keeps the CPU happy)
- connect to the ARC server
Update
- obtain rotation/position/whatever data from the children and add it to the servo position cache (in this example, a sprite position)
- send the cache of servo positions
- display the incoming image on a material as a texture
Let's take a look at the code for ServoCamera.cs and read the comments describing how it works.
Code:
using EZ_Robot_Unity_DLL;
using UnityEngine;

public class ServoCamera : MonoBehaviour {

  ServoServerClient _streamClient;
  bool _initialized = false;
  Texture2D _texture;
  volatile byte[] _toDisplay = new byte[] { };
  Transform _cube;
  Transform _sprite;

  /// <summary>
  /// We have this script added to a parent that has children,
  /// because we can access the children's transformation or position from here to set servo positions.
  /// In Update, we'll just grab the children and use their data to send to ARC.
  /// </summary>
  void Start() {

    // This is the texture that will hold the camera image from ARC.
    // We apply this texture to a cube.
    _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

    // Assign local variables to the children so we don't have to search for them on each frame (makes the CPU happy).
    _cube = gameObject.transform.Find("Cube");
    _sprite = gameObject.transform.Find("MoveableSprite");

    // Create a client that will connect to ARC at the specified IP address.
    // Once connected, any available video data from the ARC camera will raise the OnImageDataReady event.
    _streamClient = new ServoServerClient();
    _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
    _streamClient.Start("127.0.0.1", 8282);
  }

  /// <summary>
  /// This event is raised for every camera image that is received from the connected ARC server.
  /// We assign the image data to a volatile array that will be used in Update to refresh the texture with the latest image.
  /// </summary>
  private void _streamClient_OnImageDataReady(byte[] imageData) {

    if (!_initialized)
      return;

    _toDisplay = imageData;
  }

  void OnDisable() {
    _streamClient.Stop();
  }

  /// <summary>
  /// Unity runs this on every render frame.
  /// We check the keyboard input (Escape quits the app),
  /// and we map the sprite's X and Y positions to the D0 and D1 servo positions in ARC, respectively.
  /// </summary>
  void Update() {

    _initialized = true;

    if (Input.GetKey(KeyCode.Escape))
      Application.Quit();

    // Add the positions of the servos to the cache based on the location of the sprite.
    // We set the positions to cache rather than trying to send a position each time,
    // so that we can send a bulk change, which is much faster on bandwidth.
    // So, add your servo positions to the cache and then send them all after.
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(_sprite.transform.position.x));
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(_sprite.transform.position.y));

    // Send all the servo positions that have been cached.
    _streamClient.SendCachedServoPositions();

    // Display the latest camera image by loading it into the texture and applying it to the cube's material.
    if (_toDisplay.Length > 0) {
      _texture.LoadImage(_toDisplay);

      var material = _cube.GetComponent<Renderer>().material;
      material.mainTexture = _texture;
      material.mainTextureScale = new Vector2(1, -1);
    }
  }
}
@DJSures I just dug out the app we once made for JD, it already talks to ARC.
So with the EZ-Pi Server on the Raspberry Pi, I should be able to drive my robot with Unity and ARC once you get the plugin for the breakout board done...This would be awesome!!
My arm setup has one additional joint, but it should be fairly easy to run him like a JD for testing!!
I will make an input field for the IP address, so anyone can easily connect to their ARC's HTTP server...
The thing is, you will need a camera connected for it to work. The Unity ServoServerClient assumes you have a camera connected and streaming for the servo positions to send.
This plugin is WAY faster and better than using the http server - because it'll be much friendlier on resources as well... Doubly so if you're using it for a VR headset or something requiring fast efficient processing.
I will have to go to work now, but will look into it this evening!
I already noticed this is not a sample scene, so I will have to figure out how to set it all up...
Would you mind providing a sample scene for Unity?
I know they are quite heavy, so a Dropbox link or something would be OK!
If there is no sample scene, I guess I will figure it out myself, great progress!!!
I'm interested to know: will this stream servo positions live from Unity to ARC with this plugin?
Just connected or actually streaming from the camera for the servos to work?
Great video btw!
Would I be able to do this, but connected to ARC by streaming this data out using this plug in, then data to EZB..if yes would this be close to real time?
The app that I built in Unity had trouble sending the servo positions fast enough when I deployed it on my phone, since each servo was using a coroutine that includes an HTTP request...so I was thinking that caching the servo positions will lead to a better result! But well, my hack-and-slay methods cannot really be called coding I guess...so I will take a close look at what you have been doing here!!
Thanks a lot!! This is fun!!
So you can build all the motions and translation of motion in Unity...basically Unity can be the app that could run on your phone, or Windows or Mac, and the EZ-B will drive your robot!
You could also store the servo positions in ARC if you wanted to...
in the RobotView script...
First of all, I wanted to turn the camera stream into a UI raw image element, but the problem is that Unity does not allow Texture2D as UI textures. Only Sprites are allowed as UI textures I guess...I solved this quick and dirty by setting up a second camera with a camera RenderTexture, which allows us to display within the UI. Basically cheating by filming the cube with the second camera!
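(For anyone trying to reproduce that workaround, a minimal sketch; secondCamera and uiImage are hypothetical references assigned in the inspector:)
Code:
using UnityEngine;
using UnityEngine.UI;

public class UiCameraFeed : MonoBehaviour {
  public Camera secondCamera; // aimed at the cube that carries the ARC camera texture
  public RawImage uiImage;    // the UI element that should show the feed

  void Start() {
    // Render the second camera into a RenderTexture and hand that texture to the UI element.
    var rt = new RenderTexture(640, 480, 16);
    secondCamera.targetTexture = rt;
    uiImage.texture = rt;
  }
}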
But the main problem is that I am using a plugin for Unity to extract Inverse Kinematic rotations. This plugin is called BioIK, developed by Sebastian Starke and discovered by @fxrtst...this plugin returns the joint value, which I would like to use for my servo positions.
The problem is, I cannot get this to work properly because _streamClient.MapToByte seems to do something to those values which is not working for me!
So is there a way to just send an integer of 1 to 180 to the _streamClient?
This is how I extract my rotations...
e.g.
if (Axis_Z)
{
    BioJoint joint = this.GetComponent<BioJoint>();
    double value = joint.Z.GetTargetValue();
    int position = Mathf.RoundToInt((float)value);

    if (Value != position)
    {
        Value = position;
        Changed = true;
    }
}
The servo function will accept int ranging from 1 to 180?
I am still a NOOB so a code snippet will always be handy...but I guess I will manage!!
Will post once I got things set up!!
Btw, the Raspberry Pi camera is still painfully slow; are you planning to implement a compression feature on the EZBPi server for the camera?
That messed things up!!!
It always throws an exception at me saying it cannot convert the float, int or double to a byte!!
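(As far as I understand it, C# will not implicitly narrow a float, int or double to a byte; an explicit cast is needed, ideally with clamping, e.g.:)
Code:
// Clamp to the valid servo range first, then cast explicitly to byte.
// 'position' stands in for whatever int value the IK plugin produced.
byte servoPosition = (byte)Mathf.Clamp(position, 1, 180);
_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, servoPosition);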
Thanks for helping the NOOB!!!
I will make a quick demo video on how this is working so far!!
It's great; finally Unity and ARC share common ground!!
This is just a quick demo that this can work out...
You can see the camera stream within the Unity App, and servo positions are being transferred live to ARC.
This was utilized on the IoTiny, so I don't know if this caused the latency, or maybe my *rainbow* coding?
I am using @DJSures code snippet
_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)SomeInt);
So the weird jitter might be caused by me using the SetCachedServoPosition wrong.
But this is the POC that this works out and just needs so tuning I guess!!
Great step forward, these are some exiting days!!!
I was testing on an IoTiny and on an EZ-B v4, but I guess the problem is with my code in Unity!
So since you had this already figured out for the Virtual Reality plugin, would you mind sharing your code on how to drive a couple of servos live? Just to make sure I did not do anything wrong? Or should I clean my code and post it here, so you can see what I was up to?
Did you ever try the Virtual Reality plugin? It should use the same code like the camera server? Does it work smoothly?
I will test this tomorrow...to see how the latency is while using the plugin!!
Anyways I am having fun learning a lot of stuff here!! Let's hope we can tie it all together!!
I provided a complete and working Unity example above. Did you try it? Hope so, because I took time to make it for you.
I am making good progress on the servo setup; this should make it easier for everyone using the scene to declare the servos and their properties!
I am sure it's me complicating stuff, or messing stuff up!!
Would you mind sending me a private mail, so I can send you a Dropbox link? The scene contains a paid plugin, so I don't want to share the link publicly!
P.S. I ordered two LewanSoul servos so I can test all this on my EZPi too!! It would be awesome if that 16-channel servo PWM board would be supported in future updates!
I'd try a simple scene with just a box in Unity and connect rotational values to this plugin to stream to a single servo and see if it works like you suspect. Then add another box and BioIK to create an IK chain. Does it act OK or do you get the jitters? Keep the test simple and add one thing at a time until it breaks.
Once this is set up, we can send as many servos as we want, and test if Unity to ARC works without latency!!
It all works fine with the cube scene...so it might be some messed-up code somewhere!!
One thing I would like to know...can I change the
_streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
D0 from ServoServerClient.ServoPortEnum.D0 to a variable?
I cannot wrap my head around this!!
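(From what I can tell, ServoPortEnum is an ordinary C# enum, so its members can be stored in a variable or even computed from an index, e.g.:)
Code:
// Hold the port in a variable...
ServoServerClient.ServoPortEnum port = ServoServerClient.ServoPortEnum.D0;

// ...or derive it from an index (assuming the D ports have consecutive enum values).
port = ServoServerClient.ServoPortEnum.D0 + servoIndex;

_streamClient.SetServoPositionAndSend(port, (byte)positionAbs);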
But I can also just send you my scene so far with just one servo...
I will do this now!
I believe this is the single most powerful development with Synthiam/ARC.
We should work openly and freely to get this figured out. I think it's important that DJ take a look at the scene with the inverse kinematics plugin attached to the joints. He will know immediately what the issue is if there is one.
The Unity bridge could be a game changer for robotics. Using it Live or with recorded motions could change the way we add motion to our robots.
My 2 Cents
So if, for example, you found out how to create fluid animations, you could make one for our sample scene and we could all test it on our own projects!! Or if I made a new camera control or a nice tracking feature...
I only took the link down to respect Sebastian, the guy who wrote BioIK...but I will organize the project and will be putting it up on GitHub, so it will be open for everyone to contribute and share!!
I do not experience any trouble sending live data to my robot, but it is running on different hardware and different software...
Since Synthiam is opening up to new hardware solutions, I am happy to get this revitalized; now I can use my robot's hardware and have the ARC software process the data!
I never had visual feedback, because this would have required tearing my robot's cable management apart just for testing...now it's a whole new story!!
If anyone does not want to wait, pm me for the messy scene, so you can hack and slay yourselves!!
So the issue you are having is other hardware and software you are adding to your project?
If I set up a scene with BioIK and connect it to the EZ-B using this plugin, then everything works? I'm going to try and test later today. Something simple like one or two joints.
I will clean up the project (hopefully tonight) and put it on GitHub, so everyone can contribute!!
I guess that is the fastest way to get this to work...I will need to clean the rig a little, redo some naming conventions, and remove some *bunny hug* that piled up when I was trying to get stuff done...with that out of the way, you, @DJSures, @ptp and myself will have a nice clean sandbox to play in!!
So today, just because I rebuilt the scene, I grabbed some servos on the table...and they seem to run just fine!!
Anyways, I created a GitHub repository; it is without the plugin, but @DJSures can pm me or @fxrtst about this. I don't think that Sebastian would mind if we gave it to someone for debugging...just putting it out in the open is not so nice!!
I am really excited about where this project will take us...it's been a lot of fun so far!!
https://github.com/SwannSchilling/Synthiam_Unity
I will go and sleep now, it's 1am over here!! See you tomorrow!!
I will do a quick video on how to set it up when I am back from work!!
It all works fine now!!
(Edit: Could it be version dependent? I'm using Unity 2017.4.) DJ built the scene in the 2018 version.
I think the fbx files were only for joint positions; I don't remember doing any animation with them. Baking animations from one program to another simply removes the bone rig and bakes vertex positions on every frame, vs standard animations, which use a rig: you set keyframes on poses and the program interpolates the frames from keyframe to keyframe.
_streamClient.Start("127.0.0.1", 8282);
And it works for me; I figured the ServoServer runs on the local machine, right?
Anyways, one thing that will cause it not to work is if you did not start your camera...
You need to have the EZ-B camera connected and running!!
It's kind of funny: since I connected my robot with an Arduino Uno running EZ-B firmware, I still have to connect to my EZ-B and need to start its camera in ARC for this to work...so it basically just sits on the shelf doing nothing but streaming a video of my wall!!
But it all works, which is the only thing that counts; it's all still beta anyways...
And finally my robot can participate, and I can use ARC by just having to rewire the servo board connection and not the whole cable tree of my servos!!
...I have noticed, using the default scene with 127.0.0.1, that when I hit play in Unity and then hit stop, the plugin window in ARC updates with "disconnection on tty" 1 or 0, so it looks like they are talking to some degree, but not passing the camera and servo info.
ARC designer 1.26
ARC designer beta 3.4
It's working.
127.0.0.1 8282
Got it working by installing latest beta.
But I guess updating will fix your problem!!
Do you have this working with BioIK?
It's a bit off, because the measurements of the JD and my own robot do not really match, but it works!!
You can set it up any way you like; I guess I can always reproduce your setup in a way!!
Let me know how it's all working for you!!
But yes, it's choppy; I guess to get smooth motion you would have to do something to those values!
Either way, form an array of values over time and send an equalized value every Nth frame, or send them at a given framerate...it's all options!
It all works as it should, I guess, but it still needs a great deal of tweaking!!
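(One way to read that "equalized value" idea is a small moving-average window; a minimal sketch, with the window size picked arbitrarily:)
Code:
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class ServoSmoother {
  // Keep the last few raw positions and send their average to even out jitter.
  readonly Queue<float> _window = new Queue<float>();
  const int WindowSize = 5;

  public byte Smooth(float rawPosition) {
    _window.Enqueue(rawPosition);
    if (_window.Count > WindowSize)
      _window.Dequeue();
    return (byte)Mathf.RoundToInt(_window.Average());
  }
}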
It gave me the following error in Unity 2018!
Any ideas what this might be?
SocketException: An attempt was made to access a socket in a way forbidden by its access permissions.
System.Net.Sockets.Socket.Bind (System.Net.EndPoint local_end)
SyntaxTree.VisualStudio.Unity.Messaging.UdpSocket.Bind (System.Net.IPAddress address, Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager..ctor (Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager.BindTo (Int32 port)
SyntaxTree.VisualStudio.Unity.Bridge.VisualStudioIntegration+<>c__DisplayClass9_0.<.cctor>b__0 ()
SyntaxTree.VisualStudio.Unity.Bridge.Project+<>c__DisplayClass37_0.b__0 ()
UnityEditor.EditorApplication.Internal_CallUpdateFunctions () (at C:/buildslave/unity/build/Editor/Mono/EditorApplication.cs:200)
So now there is one thing that I would like to know...
Since my robot is using serial bus servos, is there a way to drive them directly with the ServoServerClient?
Or is there a way to access those values that are being sent?
If this is not possible, I would send the values to another server already established on the Raspberry Pi...
But if there were a way to do it with the ServoServerClient, it would be kind of cool too!!
Mickey and I had a lot of issues with data not streaming fluidly with only a few servos. I had issues with disconnects and jerky movements, and so did Mickey (re-read above). Maybe you can take a look? I personally don't need a camera feed...maybe that is taking up too much bandwidth?
+1, I would definitely like to see a hack night on the plugin's usage in Unity/ARC.
Here is a link to the servo recorder: https://synthiam.com/Support/Skills/Servo/Servo-Recorder?id=15963
Mickey was using a Raspberry Pi, and that was a terrible experience with ARC. I'll have to create a demo. But the servo positions are sent in a loop constantly. So when they are updated in Unity, they get sent to the server in near real-time in a loop, not on demand. This removes twitches or delays.
but you can use a custom Movement Panel with the saved recordings.
I think the question is missing some information. Is there anything more you can provide?
I don't think this is currently possible? The plugin just plays live?
I am currently thinking about how to publish an ARC skill taking care of the whole process, but there are a few caveats...
First of all, I am sending the servos to a REST server, using an array...this is essentially different from a streaming server! The REST server is sending a data packet once it is fully assembled, while a streaming server is updating as data comes in...kind of first in, first out! While this seems beneficial at first look, it is also a bit problematic when it comes to timing issues, causing e.g. jerky playback...
The second thing is that the array is of course very specific to my servo setup...the LX16A servo positions have readings from 0 to 100 rather than 0 to 180, and you can also set the timing for positional transitions!
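(For the range difference alone, a linear remap is enough; a sketch assuming the 0-100 scale mentioned above maps linearly onto 0-180 degrees:)
Code:
// Map a standard 0-180 degree servo position onto the LX16A's 0-100 scale.
static int DegreesToLx16a(int degrees) {
  return (int)System.Math.Round(degrees * 100.0 / 180.0);
}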
Also, you figured it out yourself already...setting up the robot in Unity is a complicated task, so each robot being assembled in Unity has a unique setup!
So all that said...if you have a rigged robot in Unity, there should not be a problem in sending or storing animations within ARC!
This way I can load and rig as many robots as I can come up with (or avatars of them) and save those Unity scenes for distribution to anyone who wants to download, 3D print, and build any of my robots. This way there is also a library of animations saved to get them started, and also the Unity pre-rigged scene for them to play around with. (minus the BioIK rig)
Maybe this is an entirely different plugin?
Also, Unreal 4.26 comes with free built-in IK rigging...no plugin needed...another reason to explore Unreal vs Unity.
But be aware that these are game engines, made for video games!
For storing and reading files and driving our robots, ARC is the way...it's small and flexible; it's made for robotics!
For the Unreal game engine you need at least an Nvidia 1060 Ti to run smoothly, and that is out of scope for any SBC!!!
@DJ are you referring to this plugin?
So I guess that is why @DJ Sures also commented...
It really does not matter from which platform you are sending your data to ARC...
Otherwise, use Unity to move servos in ARC and save them with the servo Recorder. But I'm pretty sure it'll be possible to use some animation tool in Unity. Unity must have something like that.
* Edit: yes found it: https://www.youtube.com/watch?v=sgHicuJAu3g
Google is the greatest
so...
1) use my instructions above to send data of whatever joints you want as servo positions to ARC
2) make an animation in Unity
3) play the animation
@DJ Yeah I successfully set up the project (see thread #74) in the past.
I guess the only solution is for me to use servo recorder to record the streamed motions, as there doesn't seem to be a way to save and load servo positional data in ARC from Unity. Might be enough.
Does the servo Recorder (under the hood) actually capture servo positions?
1) You can use the Unity animation tool and save animations in Unity. The servo positions are sent to ARC. You can use your Unity project to host a collection of animations if you wish. It doesn't matter if you don't want the camera view - just send the servo positions and ignore the camera view.
3) The project in the past is not the project today. You will notice a date of when it was updated, which is 2020/12/17 (today).
2) The servo Recorder records servo positions, hence the name.
Other new features in Unreal 4.26 are built-in IK (no plugin required) and...get this...you don't have to hit the play button to make things active...YAY!
I can see you are having fun exploring.
I guess an oversimplified but relatable example of this would be to create a digital replica of a physical pinball machine in Unity. Now get the virtual pinball machine in Unity to run a million simulations of games with the flippers hitting the ball in order to get the highest score. When you get the model trained, you use that model to control the physical actuators of the real pinball machine's flippers. So now you have trained a robot that can play pinball, because it has 1 million games of practice time learning how to play.
Training robots to walk is another obvious use case, as servo positions on a digital twin of a robot can be automatically tweaked to support any conditions to optimize walking in this environment. The ML-trained models can then be used on the physical robots and real-world terrains.
But for some stuff it does, like creating an animation. For navigating or using real physics for walking or interacting with the world, it doesn't.
The best use case for AAA engines is what Will is using it for: creating animations with a digital twin.
This of course is the software version of using a real twin like demonstrated here:
Remember the IK work @Mickey666maus @FXRTST and @PTP were doing 2 years ago.
This is one of the best collaborative build threads I have seen on here. https://synthiam.com/Community/Questions/Vr-Options-3693/comments
I was trying to figure out why you are planning to move away from Unity, since it offers an Asset Store where you can buy skills like e.g. a REST client, and it also natively runs lightweight IoT solutions, like WebRTC or gRPC.
So we have all the tools we need for robot/IoT building, and ARC is already plug and play!
So my guess for the reason for switching to some AAA game engine was that you are planning to lift some pretty heavy stuff...
Unreal is e.g. known for its unbelievable performance when it comes to its rendering engine!! But of course this also creates the need for massive hardware resources...so I thought it might be a bit much for some robot running an SBC!
But it's not the first time you would amaze me with something I would have never imagined!
So I am really curious what it will be!!
@Nink Thanks for the kudos on the concept. Team effort. I have resurrected some of those videos on my channel...and it got me thinking about some other ideas exploring this stuff even further. I've been fascinated with getting AI to teach robots how to move/walk/traverse. Seems Google is leading the way with this at the moment. I can see a future where you can build plug-and-play modules for robots, like a lower legs section, and the AI will calculate loads for upper torso and arm length and weight, then calculate a biped walking gait!!! That is the future!
@mickey Yay! It's hard sometimes for me to type out my thoughts clearly. Glad you get it now.
And Unreal has an asset store as well, but I'm trying to put together as much free technology as I can, i.e. no plugins needed, etc. You can use their visual programming tool called Blueprint, which is a lot like Blockly, to code. Here is my setup for connecting to an Arduino and activating the onboard LED. And it's mostly a C++ environment for coding outside Blueprint. Oh, and the Blueprints are active all over the web...you can copy one and take it apart...so it's like active code.
https://unity.github.com/
The funny thing about node-based coding...more visual, but sadly not less complicated!!
The one thing I really don't get yet is...what are you actually struggling with?
The above graph shows that you are already sending data out to the Arduino on a serial port using Unreal, so if you are using PWM servos, all you need is a servo shield and you are done!
I have been using Unity for quite a while now to send servo positions live to ARC, and it works great!
So there are three different scenarios, which are all equally legit, when using any software...
a) sending the data directly to the serial port
b) sending the data to the Servo and Camera Server
c) sending the data to a server of choice
Although it does not matter which game engine you are using for the first approach, you will lose the ability to integrate the servo positions within ARC, since ARC cannot "see" them, as they are being sent to the serial port directly...
The second approach can at the moment only be done in Unity, since DJ Sures implemented his fast and reliable streaming server, which does not use HTTP requests but a compact binary stream for data transfer!
The third approach can also be done in both game engines but requires a bit of extra setup.
It is the route that I am currently using, so I can confirm it is working great, no lag, no jitter!!
Just install Flask and use it to send the data over to ARC!
This way you are not losing your game engine of choice, and you can also have ARC process all the servo data
I got a bit caught up in making my controller app visually more appealing, as you see above...but I am hoping to show it in a few days, so you can see what I am talking about!!
I am pretty sure we can get this to work for your robot!
Unreal is a super massive download at 25 gigs!!!...so I'm also looking into free Blender as another option...much smaller footprint; this could solve some of my issues with a gaming engine. Blender is a proper animation program, so it has built-in IK, motion blending, motion controls, etc. It's all modules coded in Python. So I just need to find a way to get data out of Blender to Flask to ARC?!
This is a delicate subject, since I would strongly advocate for Blender. I would tell everyone to use it, to start on this platform right away! It's open source, it's powerful, and Python is integrated in the very core of its engine...
It runs on all major operating systems: Windows, macOS and Linux.
And it is free, like open-source free...free to the bare bone of it!!!
But I have spent so much time learning software that I just did not dare to walk down that road yet!
It's kind of a hard transition most of the time, and I heard that the interface does not make it easier...
BUT...if you are going down that road, I am super curious how things will work out for you!
Positive feedback could also be a push in that direction for myself!!
And yes, sending data out from Blender should be super easy!!
Finally got to film and cut a demo...There is still some fine tuning to be done regarding the IK; it's a bit harder to control with a PS4 controller than in VR. But all in all it works quite well!
The bit of jitter you see comes more from the structural build integrity of the robot than from the servos themselves!
I will make a short description of how I connect to ARC in a separate thread, but it's not difficult at all!
I spent most of last night downloading Blender and looking around...coming from a Lightwave background...once you know one animation program, you kinda know them all. Since you come from the Max universe, it should be simple to master. Most tools are there...just finding them is the trick with new software. I'll let you know my progress.
The only thing is that I did not test it on any other servo setup, since ZOE is the only fully assembled robot in the house, and this setup is being run on LX16A serial bus servos. But I guess any type of servo setup should work. It's all kinda the same thing!
I also tried to get my setup done using the Servo and Camera Server, which will most likely work as well or better. But there is the minor annoyance that this would take more than the one allowed plugin available in my free subscription plan, because I need the Servo and Camera Server & LX16A plugin to get this to work...maybe there will be a developer subscription plan for ARC at some point in the future?
But it's all working as it should, by using the script that I have written in ARC to get the data from the server, so all is working out great with ARC free...I will post a tutorial on how to set it up soon!
https://synthiam.com/Community/Robots/UNITY-LINK-ARC-20157
Hey just wanted to share this, because I think this was something you were aiming for, right?
It works with Unity, Unreal and Blender... so it is pretty versatile!
Although I am still thinking that creating custom animations for a robot using a VR setup is the best possible option.
Rigging humanoid motion directly to a robot is kinda like trying to take mocap data from a hummingbird and map it to a monkey; it's just such a different setup!
But it of course all depends on how you do it; maybe there is something I did not think of when it comes to using mocap data!
This is a great insight into inverse kinematics; it is in Unity...but can be applied to any other software!
You mentioned that you wanted to build your own IK system to be independent of plugins...
This should give you a good start!!
It doesn't matter how the Unity model is controlled. The angles will be sent to ARC, which will move the respective servos. Any kind of servo can be used, including serial bus or PWM. Simply add the specific robot skill for your servo type if they're bus servos. Otherwise just select the Dx ports for PWM servos.
@mickey in ref to mocap data:
My new approach has been retargeting. You load a .bvh file or use live streaming, and then retarget parts of the mocap data (local rotational or translational) to a bunch of nulls that represent the joints of a robot. This way the nulls do not have to inherit the world space from the mocap data. The nulls' rotational data is then sent to some math to calculate the relationship from null rotation to servo positions (1-180).
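(A sketch of that last "some math" step, assuming a joint's local rotation in -90..+90 degrees should land linearly on the 1-180 servo range:)
Code:
using UnityEngine;

public static class ServoMath {
  // Clamp to the joint's travel, then remap -90..+90 degrees onto the 1..180 servo range.
  public static int RotationToServo(float localAngleDegrees) {
    float clamped = Mathf.Clamp(localAngleDegrees, -90f, 90f);
    return Mathf.RoundToInt(Mathf.Lerp(1f, 180f, (clamped + 90f) / 180f));
  }
}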
Problem: danger of exposure to a virus like covid for an employee, or the cost of an employee (outsourcing to another country for savings).
Solution - Telepresence: This would be a great demo of a real-world use case for Exosphere. A driver (wearing a mocap suit) could stock shelves in a store thousands of miles away using a robot. The wearer could also have VR goggles so he/she could see the environment, products, etc. As they move in the suit (maybe just the upper body), those motions are streamed live to the robot.
This essentially is a telepresence skill. And if you combine this with Exosphere by following the telepresence instructions, it's already ready to go.
@DJSures sorry, I was hijacking this thread with different-topic information that I thought @fxrtst was interested in...the content I posted had nothing to do with the ARC skill; feel free to remove it, or move it to another thread?
When I said plugin, I was referring to Unity/Unreal plugins and not to the ARC skill!
The ARC skill works perfectly fine!!
In VR, if you want to get really fancy and the head and hand positions are not enough for you...
You could also pump it up with a few Vive controllers!
But IK endpoints and retargeting are something that needs to be done anyway...so let's see what you have up your sleeve!! I am curious!!
The physical robot is mirrored in Unity. When the user moves, the servoed joints can't hit each other because the Unity scene prevents it.
It'd be really great to see you guys using this stuff like some of the universities are. Their remote telepresence and Exosphere use would impress you, as it's right up the alley you're discussing.
What issues are you having with this robot skill? If you ask I’ll be able to help. You guys are pretty much the only people who use the community forum and I want you to be successful.
Just don’t want you to think you need to reinvent what we already have or can quickly modify. The leading cause of failed robot companies is full stack development - meaning making stuff that already exists rather than building on top of it.
Robotics seems to be the one industry that is not moving very fast because of that reason. I mean, we all use operating systems and computers and programs that other people/companies make. But in robotics everyone wants to reinvent. Every robot company has failed because of that. There's a trail of robot company failures attributed to that.
Let me know what you’re having trouble with and I’ll help so you don’t have to rebuild everything. It’ll just drain your bank account and energy otherwise
But I will keep you guys posted!!
I'm new in the community, my name is Uwe from Germany.
Since last week I have a Rolli Rover, my interests are to control it with my Oculus Quest and to see a stereoscopic live view.
I use a Stereopi V2 with an HDMI to USB cam device.
I have also successfully tested in Unity with the Servo and Camera Server skill; I get a stereo view (the image is transmitted side by side).
The latency is very good, works great so far.
One problem I encountered: the skill works when you add it to the project; it connects.
But as soon as you save the project and start it again, no connection is established, and you have to delete the skill and reinsert it; then it works again.
Is this a bug or am I doing something wrong?
Excuse my English, I'm translating with a translator, I hope you can still understand it.
Regards
Uwe
https://synthiam.com/Support/Skills/Virtual-Reality/Oculus-Quest-Robot?id=21158
But since I want to create the Unity App myself to display the video stereoscopically, I can't get any further with it.
It would be perfect if I could get the Unity scripts (sending controller data) from this skill (Oculus Quest Robot); that would help me a lot.
I use this Cam
Stereopi V2
Greetings
Uwe