Thumbnail

Single Camera Servo Server

by Synthiam

Servo & camera video server that allows a remote client to move servos and receive a video stream from a camera device.

How to add the Single Camera Servo Server robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Virtual Reality category tab.
  5. Press the Single Camera Servo Server icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Single Camera Servo Server robot skill.

How to use the Single Camera Servo Server robot skill

This is a servo & camera video server which allows a remote client to move servos and receive a video stream from a camera device. It is specifically useful for those creating Unity apps that operate as a client to ARC, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit.

Demo #1 This is an overview of how this robot skill can integrate with a Unity scene using the Unity animation tool.

Demo #2 This example uses a block in Unity that controls an EZ-Robot JD shoulder. It's a very simple example of how powerful and easy it is to control any servo in ARC. The blocks have gravity and a hinge, so as a block swings, the angle is read and pushed into ARC to move the respective servo.

Demo #3 This is Mickey's demonstration of controlling the servos through Unity and joysticks.

How It Works Code from the client (i.e., Unity in this case) connects to the camera servo skill over TCP. It streams the servo positions and receives the camera view.

Example Client App Source Code Here is an example test app source that connects to localhost (127.0.0.1), moves a servo on port D2, and displays the camera video stream. The sample app is C# .Net source code and can be downloaded.

Download C# .Net Example Source code: Test App.zip (2020/12/17)

Test Unity Project I have included a test Unity project for reference. The example rotates a cube on the screen using the ARROW keys. The cube projects the texture from the camera stream onto itself. The arrow keys will also move the servos connected to ports D0 and D1 relative to the rotation of the cube.

You can download the project here: Unity Test.zip (2020/12/17)

Use In Unity The stream client files in the "Test App" can be included in a Unity project to receive the video and move servos. The test app demonstrates how to move the servos using the methods, and how to display the video on Windows. To display the video in Unity, follow the steps below. The video works by becoming a Texture2D that can be applied to any material.

To use this in your Unity app, copy the files from Test App\ServoServerClient*.cs into your Unity project.

Examine Test Project The Unity project displays the ARC camera stream on a rotating cube while allowing a 2D sprite to control servos D0 and D1 with its X and Y position, respectively. Clicking on the scene will move the sprite and also move the servos.


Any components within the group can have their position or rotation, etc. extracted and sent to ARC. If you have a 3d model of a robot, each joint position/rotation can be sent to ARC.

The most interesting thing to look at is the Scene object -> ServoCameraGroup. Notice it has child GameObjects. Those child GameObjects can be queried for their rotation, position, or whatever is desired and sent to ARC as servo positions. Also, the camera image can be rendered to any material as a 2D texture.

Look at the ServoCameraGroup to see the script. The script ServoCamera.cs is responsible for:

Start

  • create an instance of the StreamClient object
  • have the StreamClient connect to ARC at an IP address (in this case the local machine, 127.0.0.1)
  • assign child GameObjects to local variables that will be used in Update (this keeps the CPU happy)
  • connect to the ARC server

Update

  • obtain rotation/position data from the children and add it to the servo position cache (in this example, a sprite position)
  • send the cache of servo positions
  • display the incoming image on a material as a texture

Let's take a look at the code for ServoCamera.cs and read the comments to see how it works:


using EZ_Robot_Unity_DLL;
using UnityEngine;

public class ServoCamera : MonoBehaviour {

  ServoServerClient _streamClient;
  bool _initialized = false;
  Texture2D _texture;

  volatile byte [] _toDisplay = new byte[]{ };

  Transform _cube;
  Transform _sprite;

  /// <summary>
  /// We have this script added to a parent that has children.
  /// We can access the children's transformation or position from here to set servo positions.
  /// In Update, we'll just grab the children and use their data to send to ARC.
  /// </summary>
  void Start() {

    // This is the texture that will hold the camera image from ARC
    // We apply this texture to a cube
    _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

    // assign a local variable to the children so we don't have to search for them on each frame (makes cpu happy)
    _cube = gameObject.transform.Find("Cube");
    _sprite = gameObject.transform.Find("MoveableSprite");

    //Create a client that will connect to ARC at the specified IP address
    // Once connected, any available video data from the ARC camera will raise the OnImageDataReady event
    _streamClient = new ServoServerClient();
    _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
    _streamClient.Start("127.0.0.1", 8282);
  }

  /// <summary>
  /// This event is raised for every camera image that is received from the connected ARC server.
  /// We assign the image data to a volatile array that Update uses to refresh the texture with the latest image.
  /// </summary>
  private void _streamClient_OnImageDataReady(byte[] imageData) {

    if (!_initialized)
      return;

    _toDisplay = imageData;
  }

  void OnDisable() {

    _streamClient.Stop();
  }

  /// <summary>
  /// Unity runs this on every render frame.
  /// We get the keyboard input to move the camera around,
  /// and we map the cube's X and Y rotation values to D0 and D1 servo positions in ARC, respectively.
  /// </summary>
  void Update() {

    _initialized = true;

    if (Input.GetKey(KeyCode.Escape))
      Application.Quit();

    // add the position of the servos to the cache based on the location of the sprite
    // We set the positions to cache in this loop rather than trying to send a position each time
    // That way we can send a bulk change which is much faster on bandwidth
    // So, add your servo positions to the cache and then send them all after
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(_sprite.transform.position.x));
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(_sprite.transform.position.y));

    // Send all the servo positions that have been cached
    _streamClient.SendCachedServoPositions();

    // Display the latest camera image by rendering it to the texture and applying to the cube's material
    if (_toDisplay.Length > 0) {

      _texture.LoadImage(_toDisplay);

      var material = _cube.GetComponent<Renderer>().material;
      material.mainTexture = _texture;
      material.mainTextureScale = new Vector2(1, -1);
    }
  }
}
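The Update() pattern above, writing every servo position into a cache and then pushing them in one bulk send, can be sketched as follows. This is a Python illustration of the caching idea only; the class and method names here are invented, and the real ServoServerClient internals may differ.

```python
# Sketch of the cache-then-bulk-send pattern (not the actual
# ServoServerClient implementation; names are made up).
class ServoCache:
    def __init__(self):
        self._cache = {}  # port -> desired position (degrees, 1-180)
        self._sent = {}   # last positions actually transmitted

    def set_position(self, port, degrees):
        # Cache the position instead of sending immediately;
        # clamp to the servo's valid range.
        self._cache[port] = max(1, min(180, int(degrees)))

    def send(self):
        # Send only the ports that changed since the last send,
        # as one bulk message (cheaper on bandwidth than one
        # packet per servo per frame).
        changed = {p: v for p, v in self._cache.items()
                   if self._sent.get(p) != v}
        self._sent.update(changed)
        return changed  # a real client would serialize and transmit this

cache = ServoCache()
cache.set_position("D0", 90)
cache.set_position("D1", 200)  # clamped to 180
print(cache.send())            # {'D0': 90, 'D1': 180}
cache.set_position("D0", 90)   # unchanged since last send
print(cache.send())            # {} -> nothing to transmit
```

Sending once per frame after all positions are cached is why the sample calls SendCachedServoPositions() a single time at the end of Update().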




#1   — Edited

@DJSures I just dug out the app we once made for JD; it already talks to ARC. So with the EZ-Pi Server on the Raspberry Pi, I should be able to drive my robot with Unity and ARC once you get the plugin for the breakout board done... This would be awesome!! My arm setup has one additional joint, but it should be fairly easy to run him like a JD for testing!! :)

I will make an input field for the IP address, so anyone can easily connect to their ARC's HTTP Server...

PRO
Synthiam
#2   — Edited

@Mickey, this plugin will do what you're looking for. There's example code as well to demonstrate how to get a video image from the camera to Unity + how to set servo positions from Unity. There are Map() functions in the ServoServerStream as well, which will help map Unity floats of -1 to +1 into servo degrees.

The thing is, you will need a camera connected for it to work. The Unity ServoServerClient assumes you have a camera connected and streaming for the servo positions to send.

This plugin is WAY faster and better than using the HTTP server, because it'll be much friendlier on resources as well... Doubly so if you're using it for a VR headset or something requiring fast, efficient processing.
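The Map() functions mentioned above presumably do a linear rescale from Unity's -1..+1 floats into servo degrees. A sketch of that idea in Python; the function name, the 1-180 output range, and the rounding behavior are assumptions, not the actual ServoServerClient API.

```python
def map_to_servo(x, lo=-1.0, hi=1.0, out_lo=1, out_hi=180):
    # Linearly map a Unity float in [lo, hi] to a servo degree in
    # [out_lo, out_hi]; clamp the input first so out-of-range values
    # cannot produce out-of-range positions. (Assumed behavior; the
    # real MapToByte in ServoServerClient may differ.)
    x = max(lo, min(hi, x))
    t = (x - lo) / (hi - lo)
    return round(out_lo + t * (out_hi - out_lo))

print(map_to_servo(-1.0))  # 1
print(map_to_servo(1.0))   # 180
print(map_to_servo(-2.0))  # 1 (clamped)
```

Because the result is already inside 1-180, it can then be cast to a byte safely.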

#3   — Edited

@DJSures You are a wizard!!! This is great!!! I will have to go to work now, but will look into it this evening! I already noticed this is not a sample scene, so I will have to figure out how to set it all up... Would you mind providing a sample scene for Unity? I know they are quite heavy, so a Dropbox link or something would be OK!

If there is no sample scene, I guess I will figure it out myself, great progress!!! :D

PRO
USA
#4   — Edited

Ok now I know I will be revisiting my Unity this week. I prolly will have 1001 questions as I work my way through this plugin. (I know, hard to believe.. LOL)

I'm interested to know: will this stream servo positions live from Unity to ARC with this plugin?

PRO
USA
#5  
The thing is, you will need a camera connected for it to work. The Unity ServoServerClient assumes you have a camera connected and streaming for the servo positions to send.

Just connected, or actually streaming from the camera for the servos to work?

PRO
Synthiam
#6  

See the video and example Unity application in the description above

PRO
USA
#7   — Edited

...video is worth a thousand words. Thank you. I will play around!

Great video btw!

PRO
USA
#8  

Would I be able to do this, but connected to ARC, streaming this data out using this plugin, then the data to the EZB... if yes, would this be close to real time?

#9  

@DJSures Thanks a lot for the demo scene and for the explanation of your code!! This is really cool!!! The app that I built in Unity had trouble sending the servo positions fast enough when I deployed it on my phone, since each servo was using a coroutine that includes an HTTP request... so I was thinking that caching the servo positions will lead to a better result! But well, my hack-and-slay methods can not really be called coding I guess... so I will take a close look at what you have been doing here!!

Thanks a lot!! This is fun!! :D

#10   — Edited

@fxrtst It is possible to stream the motion live. Now that @DJSures slimmed down the server to a dedicated server for camera and servo transmission, the latency will be a lot less than before! Using the HTTP server was giving very good results already!! So you can build all the motions and translation of motion in Unity... basically Unity can be the app that could run on your phone, or Windows or Mac, and the EZ-B will drive your robot!

You could also store the servo positions in ARC if you wanted to...

PRO
USA
#11  

...ok that's what I was hoping for! I can't wait to test it! Let me know how it goes for you!

#12   — Edited

StreamClient _streamClient; needs to be changed to ServoServerClient _streamClient;

in the RobotView script...

#13  

Also you will not lose the ARC functionality!! So all the work you did in ARC will be safe... :)

PRO
USA
#14  

..Nice. That's what I was hoping for! Cool. This just opened up so much more functionality, to be able to use Unity and servos and all the other things that ARC has... originally what I was hoping for...

#15   — Edited

Hey all... I made some tests with the plugin and ran into problems, which I partly solved. First of all, I wanted to turn the camera stream into a UI raw image element, but the problem is that Unity does not allow Texture2D as UI textures. Only Sprites are allowed as UI textures I guess... I solved this quick and dirty by setting up a second camera with a RenderTexture, which allows us to display within the UI. Basically cheating by filming the cube with the second camera! :D

But the main problem is that I am using a plugin for Unity to extract Inverse Kinematic rotations. This plugin is called BioIK, developed by Sebastian Starke and discovered by @fxrtst... this plugin returns the joint value, which I would like to use for my servo positions.

The problem is, I cannot get this to work properly because _streamClient.MapToByte seems to do something to those values which is not working for me!

So is there a way to just send an integer of 1 to 180 to the _streamClient? This is how I extract my rotations...

e.g.

if (Axis_Z) {
    BioJoint joint = this.GetComponent<BioJoint>();
    double value = joint.Z.GetTargetValue();
    int position = Mathf.RoundToInt((float)value);
    if (Value != position) {
        Value = position;
        Changed = true;
    }
}

PRO
Synthiam
#16   — Edited

Yes... don’t call the MapToByte. Lol. Just call the set servo function directly

#17  

I think I was trying to do that, and it messed something up... but it was late, so maybe I was just confused!! :D I am about to go back home, will try it once more then!!

The servo function will accept an int ranging from 1 to 180? I am still a NOOB so a code snippet will always be handy... but I guess I will manage!!

Will post once I get things set up!! :)

Btw, the Raspberry Pi camera is still painfully slow. Are you planning to implement a compression feature on the EZBPi server for the camera?

#18   — Edited

Oh, I remember... I did NOT try to do that. I tried feeding an integer into the MapToByte function... That messed things up!!! :D

#19  

Well, I still cannot get it going... could you give me a code snippet on how to use a float or int with the _streamClient?

It always throws an exception at me saying it cannot convert the float, int or double to byte!!

PRO
Synthiam
#20  

cast the variable to a byte with (byte)


_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)SomeInt);
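One caveat with a bare (byte) cast: in an unchecked C# context it keeps only the low 8 bits, so an int outside 0-255 wraps around silently instead of failing, which can look like random servo jumps. A small Python sketch of the difference between wrapping and clamping (the helper names are made up for illustration):

```python
def to_byte_unchecked(value):
    # What a plain (byte) cast does to an int in C# (unchecked):
    # keep only the low 8 bits.
    return value & 0xFF

def to_servo_byte(value):
    # Safer: clamp to the 1-180 servo range before converting.
    return max(1, min(180, int(value)))

print(to_byte_unchecked(300))  # 44 -- silent wraparound
print(to_servo_byte(300))      # 180
print(to_servo_byte(-5))       # 1
```

Clamping before the cast guarantees the value sent is always a valid servo position.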

#21  

OMG.... I thought it was some kind of magic!!!! :D :D :D

Thanks for helping the NOOB!!!

#22  

Got it working now... my god, sometimes it's so hard to figure stuff out when you are dependent on searching the Internet!! That was all it took!!

I will make a quick demo video on how this is working so far!! It's great, finally Unity and ARC share a common ground!! :D

PRO
USA
#23   — Edited

Yes... waiting for the video! Working on a new project that can utilize this with Bio IK.

PRO
USA
#24  

...game changer....again....30 year fantasy about to come to fruition.....CG and real world are about to collide and merge.

#25  

This is just a quick demo that this can work out... You can see the camera stream within the Unity app, and servo positions are being transferred live to ARC. This was utilized on the IOTiny, so I don't know if this caused the latency, or maybe my rainbow coding? :)

I am using @DJSures code snippet

_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)SomeInt);

So the weird jitter might be caused by me using SetCachedServoPosition wrong. But this is the POC that this works out and just needs some tuning I guess!!

Great step forward, these are some exciting days!!! :D

#26  

@DJSures I tried several things to get Unity sending the servo positions live to ARC, which was partly successful... what I am experiencing are random timeouts. So the servo position is not sent, which results in a jump to the next position rather than smooth movement... just guessing this of course! I was testing on an IOTiny and on an EZB v4, but I guess the problem is with my code in Unity!

So since you had this already figured out for the Virtual Reality plugin, would you mind sharing your code on how to drive a couple of servos live? Just to make sure I did not do anything wrong? Or should I clean my code and post it here, so you can see what I was up to? :D

#27  
using EZ_Robot_Unity_DLL;
using BioIK;
using System.Collections;
using UnityEngine;
public class CameraCubeRotate : MonoBehaviour {
    ServoServerClient _streamClient;
    bool _initialized = false;
    public float _speed = 0.5f;
    Texture2D _texture;
    float positionAbs;
    public GameObject D0;
    public GameObject D1;
    public GameObject D2;
    volatile byte [] _toDisplay = new byte[]{ };
    void Start() {
        _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);
        _streamClient = new ServoServerClient();
        _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
        _streamClient.Start("127.0.0.1", 8282);
    }
    private void _streamClient_OnImageDataReady(byte[] imageData) {
        if (!_initialized)
          return;
        _toDisplay = imageData;
      }
      void OnDisable() {
        _streamClient.Stop();
      }
      void Update() {
        _initialized = true;
        if (Input.GetKey(KeyCode.Escape))
          Application.Quit();
    
        if (Input.GetKey(KeyCode.RightArrow)) {
          transform.Rotate(new Vector3(0, -_speed, 0));
          _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(transform.rotation.x));
        }
        if (Input.GetKey(KeyCode.LeftArrow)) {
          transform.Rotate(new Vector3(0, _speed, 0));
          _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(transform.rotation.x));
        }
        if (Input.GetKey(KeyCode.DownArrow)) {
          transform.Rotate(new Vector3(_speed, 0, 0));
          _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(transform.rotation.y));
        }
        if (Input.GetKey(KeyCode.UpArrow)) {
          transform.Rotate(new Vector3(-_speed, 0, 0));
          _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(transform.rotation.y));
        }
        BioJoint joint_D0 = D0.GetComponent<BioJoint>();
        double value_D0 = joint_D0.Z.GetTargetValue();
        int position_D0 = Mathf.RoundToInt((float)value_D0);
        int positionAbs_D0 = Mathf.Abs(position_D0 - 90);
        print(positionAbs_D0);
        BioJoint joint_D1 = D1.GetComponent<BioJoint>();
        double value_D1 = joint_D1.Y.GetTargetValue();
        int position_D1 = Mathf.RoundToInt((float)value_D1);
        int positionAbs_D1 = Mathf.Abs(180 - (position_D1 - 90));
        print(positionAbs_D1);
        BioJoint joint_D2 = D2.GetComponent<BioJoint>();
        double value_D2 = joint_D2.Y.GetTargetValue();
        int position_D2 = Mathf.RoundToInt((float)value_D2);
        int positionAbs_D2 = Mathf.Abs(position_D2 - 90);
        
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs_D0);
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, (byte)positionAbs_D1);
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D2, (byte)positionAbs_D2);
        // Send all the servo positions if there's been a change
        if (_streamClient.HasServoChanged)
          _streamClient.SendCachedServoPositions();
        if (_toDisplay.Length > 0) {
          _texture.LoadImage(_toDisplay);
          var material = GetComponent<Renderer>().material;
          material.mainTexture = _texture;
          material.mainTextureScale = new Vector2(1, -1);
        }
      }
    }

PRO
USA
#28  

.............any luck?

#29  

Not really. I already dug out the old script that @ptp wrote while he was still trying to set this all up by directly talking to the EZB and bypassing ARC! He wrote some really good code that I am currently trying to rewrite... he had a really nice approach to handling the servos in Unity! But I will have to go sleep now, it's 3 am already!! :D

Did you ever try the Virtual Reality plugin? It should use the same code as the camera server? Does it work smoothly? I will test this tomorrow... to see how the latency is while using the plugin!! Anyways I am having fun learning a lot of stuff here!! Let's hope we can tie it all together!! :)

PRO
Synthiam
#30  

Think you're overcomplicating how it works. Given the code you posted, I'm not certain of the full picture. Put the project on Dropbox or something so I can see what you've done?

I provided a complete and working Unity example above. Did you try it? Hope so, because I took time to make it for you :)

#31   — Edited

@DJSures Yes, I used your code and am using your plugin! I am making good progress on the servo setup; this should make it easier for everyone using the scene to declare the servos and their properties! I am sure it's me complicating stuff, or messing stuff up!! :D

Would you mind sending me a private mail, so I can send you a Dropbox link? The scene contains a paid plugin, so I don't want to share the link publicly!

P.S. I ordered two LewanSoul servos so I can test all this on my EZPi too!! It would be awesome if that 16-channel servo PWM board would be supported in future updates!

PRO
USA
#32   — Edited

My theory is that Bio IK is somehow changing the translation and rotation values to something that servo server cannot translate properly? Or a longer delay with SendCachedServoPositions than you expect (with as many servos as are in JD?)

I'd try a simple scene with just a box in Unity and connect rotational values to this plugin to stream to a single servo, and see if it works like you suspect. Then add another box and Bio IK to create an IK chain. Does it act OK or do you get the jitters? Keep the test simple and add one thing at a time until it breaks.

#33  

@fxrtst I will look into this a bit more now... the reason why I am rewriting the script that @ptp was using back in the day is mainly because I also need to cache my servo positions when I am using my Unity app on Android. Since Android is very slow when it comes to HTTP requests, I would like to send them all at once rather than per servo... and the script @ptp wrote is a beautiful way to do so!! Once this is set up, we can send as many servos as we want, and test if Unity to ARC works without latency!!

It all works fine with the cube scene...so it might be some messed up code somewhere!! :)

PRO
Synthiam
#34  

I can only guess what your project is doing, but I imagine the cache is being set in multiple places. Put your project on a Dropbox and I'll take a look. I'm certain it will only take a minute and I'll be able to point it out.

#35  

@DJSures I will do it, just need to clean it up a little; it is quite messy at the moment! :D

One thing I would like to know...can I change the

_streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);

D0 from ServoServerClient.ServoPortEnum.D0 to a variable?

I cannot wrap my head around this!! But I can also just send you my scene so far with just one servo... I will do this now! :)

#36  

This is the code so far...using one servo works fine!! :)

using EZ_Robot_Unity_DLL;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;
using Debug = UnityEngine.Debug;
namespace BioIK
{
    public class ServoDefinitionsAndSend : MonoBehaviour
    {
        public int positionAbs;
        ServoServerClient _streamClient;
        bool _initialized = false;
        Texture2D _texture;
        volatile byte[] _toDisplay = new byte[] { };
        private static readonly ServoDefinition[] ServoDefinitions = new ServoDefinition[]
        {
            new ServoDefinition("JD_Rig_v01:Right_Rotate_Shoulder", AxisType.Z, ServoPortEnum.D0, false, 1, 180, 90),
            //new ServoDefinition("JD_Rig_v01:Right_Upper_Arm", AxisType.Y, ServoPortEnum.D1, false, 1, 180, 90),
            //new ServoDefinition("JD_Rig_v01:Right_Lower_arm", AxisType.Y, ServoPortEnum.D2, false, 1, 180, 90),
        };
        private class ServoDefinition
        {
            public readonly string JointName;
            public readonly AxisType Axis;
            public readonly ServoPortEnum Port;
            public readonly bool inverted;
            public readonly int minValue;
            public readonly int maxValue;
            public readonly int middleValue;
            public ServoDefinition(string jointName, AxisType axis, ServoPortEnum port, bool inverted = false, int minValue = 1, int maxValue = 180, int middleValue = 90)
            {
                this.JointName = jointName;
                this.Axis = axis;
                this.Port = port;
                this.inverted = inverted;
                this.minValue = minValue;
                this.maxValue = maxValue;
                this.middleValue = middleValue;
            }
            private int ClampValue(int degrees)
            {
                return degrees < this.minValue ? this.minValue : degrees > this.maxValue ? this.maxValue : degrees;
            }
            public int AdjustValue(int degrees)
            {
                var servoValue = FixRotation(degrees, this.inverted, this.middleValue);
                var clampedValue = this.ClampValue(servoValue);
                return clampedValue;
            }
        }
        private static int FixRotation(int deg, bool inverted, int middleAngle = 90)
        {
            return !inverted ? middleAngle + deg : middleAngle * 2 - (middleAngle + deg);
        }
        public enum AxisType
        {
            X = 0,
            Y = 1,
            Z = 2
        }
        public enum ServoPortEnum
        {
            D0 = 0,
            D1 = 1,
            D2 = 2
        }
        void Start()
        {
            _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);
            _streamClient = new ServoServerClient();
            _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
            _streamClient.Start("127.0.0.1", 8282);
        }
        private void _streamClient_OnImageDataReady(byte[] imageData)
        {
            if (!_initialized)
                return;
            _toDisplay = imageData;
        }
        void OnDisable()
        {
            _streamClient.Stop();
        }
        private void Update()
        {
            _initialized = true;
            int ServoNumber = ServoDefinitions.Length;
            for (int i = 0; i < ServoNumber; i++)
            {
                string Name = (ServoDefinitions[i].JointName);
                var ServoJoint = GameObject.Find(Name);
                BioJoint joint = ServoJoint.GetComponent<BioJoint>();
                print(ServoJoint);
                bool invert = (ServoDefinitions[i].inverted);
                var PortNumber = (ServoDefinitions[i].Port);
                print(PortNumber);
                var UpVector = ((int)ServoDefinitions[i].Axis);
                if (UpVector == 0)
                {
                    print("X");
                    var Value = Mathf.RoundToInt((float)joint.X.GetTargetValue());
                    if (invert)
                    { positionAbs = 180 - (Mathf.Abs(Value - 90)); }
                    else
                    { positionAbs = Mathf.Abs(Value - 90); }
                    //print(positionAbs);
                    _streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
                };
                if (UpVector == 1)
                {
                    print("Y");
                    var Value = Mathf.RoundToInt((float)joint.Y.GetTargetValue());
                    if (invert)
                    { positionAbs = 180 - (Mathf.Abs(Value - 90)); }
                    else
                    { positionAbs = Mathf.Abs(Value - 90); }
                    //print(positionAbs);
                    _streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
                };
                if (UpVector == 2)
                {
                    print("Z");
                    var Value = Mathf.RoundToInt((float)joint.Z.GetTargetValue());
                    if (invert)
                    { positionAbs = 180 - (Mathf.Abs(Value - 90)); }
                    else
                    { positionAbs = Mathf.Abs(Value - 90); }
                    //print(positionAbs);
                    _streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
                };
            }
            if (_streamClient.HasServoChanged)
                _streamClient.SendCachedServoPositions();
            if (_toDisplay.Length > 0)
            {
                _texture.LoadImage(_toDisplay);
                var material = GetComponent<Renderer>().material;
                material.mainTexture = _texture;
                material.mainTextureScale = new Vector2(1, -1);
            }
        }
    }
}
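The ServoDefinition logic in the script above centers a joint angle on the servo's middle position, optionally mirrors it, and clamps it to the min/max range. A Python transcription of that arithmetic, useful for checking expected values by hand (this is a sketch mirroring the posted C#, not part of the project):

```python
def fix_rotation(deg, inverted, middle=90):
    # Mirrors FixRotation in the C# above: offset the joint angle by the
    # servo's middle position, optionally mirroring around that middle.
    return middle + deg if not inverted else middle * 2 - (middle + deg)

def clamp(deg, lo=1, hi=180):
    # Mirrors ServoDefinition.ClampValue: keep the result in servo range.
    return min(max(deg, lo), hi)

def adjust(deg, inverted=False, lo=1, hi=180, middle=90):
    # Equivalent of ServoDefinition.AdjustValue: fix rotation, then clamp.
    return clamp(fix_rotation(deg, inverted, middle), lo, hi)

print(adjust(0))                  # 90  (joint at rest -> servo centered)
print(adjust(45))                 # 135
print(adjust(45, inverted=True))  # 45  (180 - (90 + 45))
print(adjust(120))                # 180 (clamped to the max)
```

Tracing a joint value through this pair of functions is a quick way to tell whether jitter comes from the mapping or from elsewhere (e.g. power or transport).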
PRO
Synthiam
#37  

Again, the code doesn't help me. I'd need to see your project. The code is a very small factor to the project as a whole.

#38  

There you go...delete the comment after you downloaded the whole thing!! :)

PRO
USA
#39   — Edited

Seems like this thread has been split over to the serial bus thread. Let's keep it all nice and neat in one place :) I believe this is the single most powerful development with Synthiam/ARC.

We should work openly and freely to get this figured out. I think it's important that DJ take a look at the scene with the inverse kinematics plugin attached to the joints. He will know immediately what the issue is, if there is one.

The Unity bridge could be a game changer for robotics. Using it live or with recorded motions could change the way we add motion to our robots.

My 2 Cents

#40   — Edited

@fxrtst I am totally with you on all points... my goal was always to create a Unity project where everyone can contribute, which is why I thought a JD would suit best! So there is the EZ-Robot, which is a good base for testing out new components, and whoever comes up with a new feature will be able to contribute and share!! So if, for example, you found out how to create fluid animations, you could make one for our sample scene and we can all test it on our own projects!! Or if I would make a new camera control or a nice tracking feature...

I only took the link down to respect Sebastian, the guy who wrote BioIK... but I will organize the project and put it up on GitHub, so it will be open for everyone to contribute and share!!

I do not experience any trouble sending live data to my robot, but it is running on different hardware and different software... Since Synthiam is opening up to new hardware solutions, I am happy to get this revitalized; now I can use my robot's hardware and have the ARC software process the data! I never had visual feedback, because this would have required tearing my robot's cable management apart just for testing... now it's a whole new story!! :)

If anyone does not want to wait, pm me for the messy scene, so you can hack and slay yourselves!! :)

#41   — Edited

deleted deleted deleted deleted deleted

PRO
USA
#42   — Edited

Yeah that was a good idea, I'd feel bad if Sebastian lost money on our behalf.

So the issue you are having is the other hardware and software you are adding to your project?

If I set up a scene with Bio IK and connect it to the EZB using this plugin, then everything works? I'm going to try and test later today. Something simple like one or two joints.

#43  

@fxrtst I was trying to use an implementation of the code @ptp wrote back in the days, I will drop him an email so he can participate because he knows best what he was coding and how it works! I will clean up the project (hopefully tonight) and put it to GitHub, so everyone can contribute!!

I guess that is the fastest way to get this to work... I will need to clean the rig a little, redo some naming conventions, and remove some bunny hug that piled up while I was trying to get stuff done... with that out of the way, you, @DJSures, @ptp and myself will have a nice clean sandbox to play in!! :)

#44  

@all Guys, I am terribly sorry to have caused a mess, but now I am almost 100% positive that my weird servo jumps came from underpowering the EZ-B servos! I run my robot on 5 volts and I did not pay attention when I hooked up the EZ-B to the power supply... I just thought it'd look cool to have those EZ-Robot servos attached!!

So today, just because I rebuilt the scene, I grabbed some servos on the table... and they seem to run just fine!! :D

Anyways, I created a GitHub repository; it is without the plugin, but @DJSures can pm me or @fxrtst about this. I don't think that Sebastian would mind if we gave it to someone for debugging... just putting it out in the open is not so nice!! :)

I am really excited to see where this project will take us... it's been a lot of fun so far!!

PRO
USA
#45  

....ahhh, the ole underpowered servo trick :). Sounds good. If this plugin works with BioIK then I'm happy. Gotta have some sort of IK to make all this worthwhile!

#46  

So you will find it on GitHub here

https://github.com/SwannSchilling/Synthiam_Unity

#47  

Let me know how this works for you, and drop DJ a line about the missing plugin so he can do some testing too... ;)

I will go and sleep now, it's 1 am over here!! See you tomorrow!! :)

PRO
Synthiam
#48  

I'll check it out later

PRO
USA
#49   — Edited

Okay, just got a moment to sit down with this. I am having trouble getting the EZ-B to talk with Unity. I'm sure it's IP/port related. No streaming occurring. I just added the cameracuberotate to a cube. The camera is active in ARC, and the plugin is installed and open. I press play in Unity and I can control the rotation with the arrow keys, but no projection of the camera or servo movement.

#50  

@fxrtst Are you using the GitHub scene or the ServoServer scene? I will do a quick video on how to set it up when I am back from work!! It all works fine now!! :)

#51  

Btw, can you confirm that BioIK is picking up the joint rotations when you are playing those .fbx animations exported from Lightwave? I had some trouble figuring out the differences between baked animations and animations done within Unity... but I am sure it can be sorted out easily!!

PRO
USA
#52   — Edited

I loaded the servoServer scene. I just get the white cube; I can rotate it using the arrow keys, but no servo or camera streaming data from ARC. There is no BioIK in my scene; I'm just trying to get the basic scene to work first before I start adding IK.

(Edit: Could it be version dependent? I'm using Unity 2017.4, and DJ built the scene in a 2018 version.)

I think the fbx files were only for joint positions; I don't remember doing any animation with them. Baking animations from one program to another simply removes the bone rig and bakes vertex positions on every frame, vs. standard animations, which use a rig: you set key frames on poses and the program interpolates the frames from key frame to key frame.

#53  

My Unity is up to date and everything works fine!! So updating might be a good idea! :)

PRO
USA
#54  

Yeah, I updated and I still have the same issue. I'm using 127.0.0.1, as that's the loopback IP of all computers, correct? What is the port number based on? Is 8282 just an arbitrary number?

PRO
USA
#55   — Edited
  • deleted * Not relevant information
#56   — Edited

@ptp I am using

_streamClient.Start("127.0.0.1", 8282);

And it works for me; I figured the ServoServer runs on the local machine, right?

Anyways, one thing that will cause it not to work is if you did not start your camera... You need to have the EZ-B camera connected and running!! :)

It's kind of funny: since I connected my robot with an Arduino Uno running EZ-B firmware, I still have to connect to my EZ-B and start its camera in ARC for this to work... so it basically just sits on the shelf doing nothing but streaming a video of my wall!! :D

But it all works, which is the only thing that counts; it's all still beta anyways... And finally my robot can participate, and I can use ARC by just having to rewire the servo board connection and not the whole cable tree of my servos!! :D

PRO
USA
#57  

....yeah, my camera has been on for a day now... same thing, pointed towards the wall. Using 127.0.0.1:8282 will not stream for me. I've used every possible combo, including PTP's suggestion, and I still get no stream.

PRO
USA
#58  

User-inserted image

...I have noticed, using the default scene with 127.0.0.1, that when I hit play in Unity and then hit stop, the plugin window in ARC updates with "disconnection on tty" 1 or 0, so it looks like they are talking to some degree, but not passing the camera and servo info.

PRO
USA
#59  

.....question: are you using the beta version or the release 1.26, and does it matter?

ARC designer 1.26, ARC designer beta 3.4

PRO
USA
#60   — Edited

User-inserted imageUser-inserted image

It's working. 127.0.0.1 8282

PRO
USA
#61  

Well, that's frustrating.

#62  

@fxrtst When you said the thing about the disconnect, it came back to my mind that I had an issue like that before... try removing the camera control from your project and adding it again, just a shot in the dark!! :)

PRO
USA
#63   — Edited

deleted

PRO
USA
#64   — Edited

deleted

PRO
USA
#65  

Can you guys tell me the version of ARC you are using?

PRO
USA
#66  

.....ok. This should go in the description under prerequisites... must use the latest beta. WILL NOT work with 1.26.

Got it working by installing the latest beta.

#67  

Keeping my fingers crossed for you... and I would also try the EZ-B camera, just to make sure!! But I guess updating will fix your problem!! :)

PRO
USA
#68   — Edited

....yeah, it's working... man, I spent so many hours messing around... grrrr

Do you have this working with BioIK?

#69  

Yes, it's all working!! But I can tell you this needs a lot of tweaking if you want to create fluid motion... I got my scene set up so I can use the virtual JD scene to move my own robot!! It's a bit off, because the measurements of the JD and my own robot do not really match, but it works!!

You can set it up any way you like; I guess I can always reproduce your setup in a way!! Let me know how it's all working for you!! :)

PRO
USA
#70  

No fluid motion because? What are your thoughts? I wonder if, the more servos you have, it's just too much for a live setup?

PRO
USA
#71   — Edited

I've managed to break it a couple of times. Sent rotations for 3 servos real quick and it stopped responding, and I had to reboot the IoTiny to get it to work again. I can see the jerky movements in the servo positions. Not sure what that is. Could be that the positions are sent once collected, causing a bit of a lull between positions.

#72  

Did you use this method... _streamClient.SetCachedServoPosition

But yes, it's choppy; I guess to get smooth motion you would have to do something to those values! Either form an array of values over time and send an equalized (averaged) value every Nth frame, or send them at a fixed framerate... these are all options! It all works as it should, I guess, but it still needs a great deal of tweaking!! :)
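The "equalized value" idea described here (averaging a rolling window of recent positions before sending them) can be sketched like this. Python is used for illustration only, and the `ServoSmoother` class is hypothetical, not part of the ServoServerClient API:

```python
from collections import deque

class ServoSmoother:
    """Rolling-average filter for servo positions (illustrative sketch)."""

    def __init__(self, window=5):
        self._window = window
        self._history = {}  # servo port -> deque of recent raw positions

    def smooth(self, port, position):
        # Keep only the last `window` positions for this port and return
        # their average, trading a little latency for smoother motion.
        buf = self._history.setdefault(port, deque(maxlen=self._window))
        buf.append(position)
        return round(sum(buf) / len(buf))

smoother = ServoSmoother(window=4)
raw = [90, 95, 60, 92, 94, 130, 91]  # noisy positions for one port
smoothed = [smoother.smooth("D0", p) for p in raw]
print(smoothed)  # outliers like 60 and 130 are damped
```

The same averaging could be done in the Unity `Update()` loop before calling `SetCachedServoPosition`, at the cost of a few frames of lag.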

PRO
USA
#73  

Yes, I used that method. I think you are correct: something needs to be done to those values to smooth out the motion. For sure needs some tweaking. But a great start.

PRO
USA
#74   — Edited

The title says it all. Step right up ladies and gentlemen! Spin the cube and watch wormy's mesmerizing dance!

#75  

That seems to be super fluid...are you using the default code, or did you do anything to those servo values already?

PRO
USA
#76  

That's default. And I think that the cube is spinning so fast that the servos are forced to update faster. But if I slow down the movement, it's very jerky. Almost as if the steps are larger than a degree or two.

#77  

Ah OK, I was thinking it was only my setup that does this...

#78  

Hey guys... I tried to use the Raspberry Pi camera in Unity for the setup with the Servo-and-Camera-Server, but had no luck! It gave me the following error in Unity 2018! Any ideas what this might be?

SocketException: An attempt was made to access a socket in a way forbidden by its access permissions.

System.Net.Sockets.Socket.Bind (System.Net.EndPoint local_end)
SyntaxTree.VisualStudio.Unity.Messaging.UdpSocket.Bind (System.Net.IPAddress address, Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager..ctor (Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager.BindTo (Int32 port)
SyntaxTree.VisualStudio.Unity.Bridge.VisualStudioIntegration+<>c__DisplayClass9_0.<.cctor>b__0 ()
SyntaxTree.VisualStudio.Unity.Bridge.Project+<>c__DisplayClass37_0.b__0 ()
UnityEditor.EditorApplication.Internal_CallUpdateFunctions () (at C:/buildslave/unity/build/Editor/Mono/EditorApplication.cs:200)

#79  

It was an old version of the EZBPi server that caused the error in Unity... I updated to the latest version and everything works just fine!! So now there is one thing that I would like to know: since my robot is using serial bus servos, is there a way to drive them directly with the ServoServerClient? Or is there a way to access those values that are being sent?

If this is not possible, I would send the values to another server already established on the Raspberry Pi... But if there were a way to do it with the ServoServerClient, that would be kind of cool too!! :)

PRO
USA
#80  

@DJ I would like to see a way to record the incoming data coming from Unity with this skill, as Mickey has stated in another thread. If I create a complex animation in Unity and send it through the Unity skill, at some point I would want ARC to receive and record the data, so I would not need to be connected to Unity at all times. Then this data could be played back. Almost like using the servo recorder?

Mickey and I had a lot of issues with data not streaming fluidly with only a few servos. I had issues of disconnects and jerky movements, and so did Mickey (re-read above). Maybe you can take a look? I personally don't need a camera feed... maybe that is taking up too much bandwidth?

+1, I would definitely like to see a hack night of the plugin's usage in Unity/ARC.

PRO
Synthiam
#81   — Edited

Use the servo recorder. This skill, like all skills, controls servos which can be recorded.

Here is a link to the servo recorder: https://synthiam.com/Support/Skills/Servo/Servo-Recorder?id=15963

Mickey was using a Raspberry Pi, and that was a terrible experience with ARC. I'll have to create a demo. But the servo positions are sent in a loop constantly. So when they are updated in Unity, they get sent to the server in near real-time in a loop, not on demand. This removes twitches or delays.

PRO
USA
#82  

Can I connect servo recorder recordings to a movement panel?

PRO
USA
#84   — Edited

Yeah. My idea has always been to record servo positions in Unity from animations of 3D meshes with IK (inverse kinematics). They don't need to be captured "live" (although that could be another feature). Then in Unity I can save out that file and have a way to load those servo positions into ARC. In a perfect world, that loaded file would allow me to use a custom Movement Panel or ?? to control a robot via a virtual joystick or buttons, i.e. move forward/reverse, lift arm.

I don't think this is currently possible? The plugin just plays live?

#86  

@fxrtst The way I am currently storing animation data works the following way... I am creating the movement in Unity; it's being sent to the server and stored in a file for playback!

I am currently thinking about how to publish an ARC skill taking care of the whole process but there are a few caveats...

First of all, I am sending the servos to a REST server, using an array... this is essentially different from a streaming server! The REST server sends a data packet once it is fully assembled, while a streaming server updates as data comes in... kind of first in, first out! While this seems beneficial at first look, it is also a bit problematic when it comes to timing issues, causing e.g. jerky playback...

The second thing is that the array is of course very specific to my servo setup... the LX16A servo positions have readings from 0 to 100 rather than 0 to 180, and you can also set the timing for positional transitions!

Also, as you figured out yourself already... setting up the robot in Unity is a complicated task, so each robot being assembled in Unity has a unique setup!

So all that said... if you have a rigged robot in Unity, there should not be a problem in sending or storing animations within ARC!
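The 0-to-100 vs 0-to-180 mismatch mentioned above is just a linear range remap. A minimal sketch (assuming both ranges are linear, with out-of-range inputs clamped; the helper name is illustrative):

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Linearly remap value from [in_min, in_max] to [out_min, out_max]."""
    if in_max == in_min:
        raise ValueError("input range is empty")
    t = (value - in_min) / (in_max - in_min)
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range inputs
    return out_min + t * (out_max - out_min)

# ARC-style degrees (0-180) to LX16A-style units (0-100):
print(remap(90, 0, 180, 0, 100))   # 50.0
print(remap(180, 0, 180, 0, 100))  # 100.0
```

The same one-liner works in the other direction by swapping the ranges, which is all a per-robot servo profile really needs to store.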

#87  

If you are rigging that robot with BioIK, everything should work! :)

PRO
USA
#88   — Edited

I think for me and my use cases... and keeping it universal (and simple) for any robot... I'd like to see a global Unity save-animation option that will save out the positions (maybe as a CSV file?), and that file simply loads and is read in ARC.

This way I can load and rig as many robots as I can come up with (or avatars of them) and save those Unity scenes for distribution to anyone who wants to download, 3D print, and build any of my robots. This way there is also a library of saved animations to get them started, and also the Unity pre-rigged scene for them to play around with (minus the BioIK rig).

Maybe this is an entirely different plugin? Also, Unreal 4.26 comes with free built-in IK rigging... no plugin needed... another reason to explore Unreal vs Unity.

#89   — Edited

I have only heard good things about Unreal; keep us updated... But be aware that these are game engines, made for video games! For storing and reading files and driving our robots, ARC is the way... it's small and flexible, and it's made for robotics! To run the Unreal engine smoothly you need at least an Nvidia 1060 Ti, and that is out of scope for any SBC!!! :)

PRO
Synthiam
#90  

This works with absolutely any input from any third party software that allows scripting. This is not specific to unity. This is an agnostic server that allows absolutely anything to send information to it.

PRO
USA
#91  

@mickey Hahaha. I am a little lost; I'm not trying to replace ARC! Where did you get a crazy idea like that? xD Unreal, like Unity, works the same.

@DJ are you referring to this plug in?

#92  

@fxrtst I was just exaggerating to make my point: game engines are something to be run on a desktop PC... So I guess that is why @DJ Sures also commented...

Quote:

This works with absolutely any input from any third party software that allows scripting. This is not specific to unity. This is an agnostic server that allows absolutely anything to send information to it.
It really does not matter from which platform you are sending your data to ARC... :)

PRO
Synthiam
#93   — Edited

I updated the Unity and .NET example projects in the description of this skill. They can be downloaded and run to see how the projects are set up.

The Unity project, for example, displays the ARC camera stream on a rotating cube, while allowing the 2D sprite to control servos D0 and D1 by its X and Y position, respectively.

User-inserted image

Any components within the group can have their position or rotation, etc. extracted and sent to ARC. If you have a 3d model of a robot, each joint position/rotation can be sent to ARC.

The most interesting thing to look at is the Scene object -> ServoCameraGroup. Notice it has child GameObjects. Those child GameObjects can be queried for their rotation or position or whatever is desired and sent to ARC as servo positions. Also, the camera image can be rendered to any material as a 2d texture. User-inserted image

Look at the ServoCameraGroup to see the script. User-inserted image

The script ServoCamera.cs will be responsible for the following...

Start

  • create an instance of the StreamClient object
  • have the StreamClient connect to ARC at an IP address (in this case the local machine, 127.0.0.1)
  • assign child GameObjects to local variables that we will use in Update (this makes the CPU happy)
  • connect to the ARC server

Update

  • obtain rotation/position/whatever data from the children and add it to the servo position cache (in this example, a sprite position)
  • send the cache of servo positions
  • display the incoming image on a material as a texture

Let's take a look at the code for ServoCamera.cs and read the comments on how it works:


using EZ_Robot_Unity_DLL;
using UnityEngine;

public class ServoCamera : MonoBehaviour {

  ServoServerClient _streamClient;
  bool _initialized = false;
  Texture2D _texture;

  volatile byte [] _toDisplay = new byte[]{ };

  Transform _cube;
  Transform _sprite;

  /// <summary>
  /// We have this script added to a parent that has children,
  /// because we can access the children's transformation or position from here to set servo positions.
  /// In Update, we'll just grab the children and use their data to send to ARC.
  /// </summary>
  void Start() {

    // This is the texture that will hold the camera image from ARC
    // We apply this texture to a cube
    _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

    // assign a local variable to the children so we don't have to search for them on each frame (makes cpu happy)
    _cube = gameObject.transform.Find("Cube");
    _sprite = gameObject.transform.Find("MoveableSprite");

    //Create a client that will connect to ARC at the specified IP address
    // Once connected, any available video data from the ARC camera will raise the OnImageDataReady event
    _streamClient = new ServoServerClient();
    _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
    _streamClient.Start("127.0.0.1", 8282);
  }

  /// <summary>
  /// This event is raised for every camera image that is received from the connected ARC server.
  /// We assign the image data to a volatile array that is used in Update to refresh the texture with the latest image.
  /// </summary>
  private void _streamClient_OnImageDataReady(byte[] imageData) {

    if (!_initialized)
      return;

    _toDisplay = imageData;
  }

  void OnDisable() {

    _streamClient.Stop();
  }

  /// <summary>
  /// Unity runs this on every render frame.
  /// We get the keyboard input to move the camera around,
  /// and we map the sprite's X and Y position values to D0 and D1 servo positions in ARC, respectively.
  /// </summary>
  void Update() {

    _initialized = true;

    if (Input.GetKey(KeyCode.Escape))
      Application.Quit();

    // add the position of the servos to the cache based on the location of the sprite
    // We set the positions to cache in this loop rather than trying to send a position each time
    // That way we can send a bulk change which is much faster on bandwidth
    // So, add your servo positions to the cache and then send them all after
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(_sprite.transform.position.x));
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(_sprite.transform.position.y));

    // Send all the servo positions that have been cached
    _streamClient.SendCachedServoPositions();

    // Display the latest camera image by rendering it to the texture and applying to the cube's material
    if (_toDisplay.Length > 0) {

      _texture.LoadImage(_toDisplay);

      var material = _cube.GetComponent<Renderer>().material;
      material.mainTexture = _texture;
      material.mainTextureScale = new Vector2(1, -1);
    }
  }
}


#94  

Thanks for this... I will take a quick look; I should have gone to sleep an hour ago!! :D

PRO
Synthiam
#95   — Edited

Will, you'd have to look into the Unity Asset Store to see if there's a way to save animations and such, because Unity is a different program than ARC. They have a large collection of their own assets and features.

Otherwise, use Unity to move servos in ARC and save them with the servo Recorder. But I'm pretty sure it'll be possible to use some animation tool in Unity. Unity must have something like that.

  • Edit: yes found it: https://www.youtube.com/watch?v=sgHicuJAu3g

Google is the greatest

so...

  1. use my instructions above to send data of whatever joints you want as servo positions to ARC
  2. make an animation in unity
  3. play animation
PRO
USA
#96   — Edited

@mickey I guess my point keeps getting lost. I am not talking about serving a stream of servos and camera live from one program to another. I am talking about animating the avatar (like the one shown above of my robot) in the game engine of your choice, then saving the servo positions and exporting them. Then importing them into ARC. That way you don't need a powerhouse PC connected to your robot. The game engine is only there to get animations from IK rigs, motion blending and other dynamics you can't get from Auto Position or any other animator on the market today onto a robot. I think we have said before that we are similar, but on different paths to our final solutions for using game engines.

@DJ Yeah, I successfully set up the project (see post #74) in the past. I guess the only solution is for me to use the servo recorder to record the streamed motions, as there doesn't seem to be a way to save and load servo positional data in ARC from Unity. Might be enough.

Does the servo recorder (under the hood) actually capture servo positions?

PRO
Synthiam
#97  

Will

  1. you can use the unity animation tool and save animations in unity. The servo positions are sent to ARC. You can use your Unity project to host a collection of animations if you wish. It doesn't matter if you don't want the camera view - just send the servo positions and ignore the camera view.

  2. the project in the past is not the project today. You will notice a date of when it was updated, which is 2020/12/17 (today)

  3. servo recorder records servo positions, hence the name :)

PRO
Synthiam
#98  

It's really easy. Not sure what else I can say on this... it's like, the easiest thing ever

PRO
USA
#99  

Thank you for that! That was my own special Thursday-night hack! I'm such a visual learner. Texting and typing in a forum or on a cell phone is sometimes hard to understand or to try and get a point across. The video was clear! Thanks again.

PRO
Synthiam
#100  

Hey, I'm not anything near your level, but I did this... it's fun!

PRO
USA
#101  

Yeah, there are so many really cool attributes, like gravity; there are also dynamics, where an object can interact with another; there is the ability to keep objects in Unity from penetrating themselves (self-intersecting). In Unreal they have motion blending, where you can create two different animations and blend them together seamlessly. Lots and lots of advanced features.

Other new features in Unreal 4.26 are built-in IK (no plugin required) and... get this... you don't have to hit the play button to make things active... YAY!

I can see you are having fun exploring.

PRO
USA
#102   — Edited

Quote:

This works with absolutely any input from any third party software that allows scripting. This is not specific to unity. This is an agnostic server that allows absolutely anything to send information to it.
Are the camera and servo data intertwined? Is one dependent on the other to pass info?

PRO
Synthiam
#103  

If it allows scripting - it might require some changes based on whatever language it supports. But yup - be prepared to write a bit of code at first, that's all.

PRO
Canada
#104  

This is very cool. Another use case is digital twins. This is a growing field, and being able to recreate this with Unity and ARC is extremely valuable for industry, especially when you can use machine learning (ML) to run and train a simulation a million times and then use the trained ML model to control the physical robot in real-world conditions.

I guess an oversimplified but relatable example of this would be to create a digital replica of a physical pinball machine in Unity. Now get the virtual pinball machine in Unity to run a million simulated games with the flippers hitting the ball in order to get the highest score. When the model is trained, you use it to control the physical actuators of the real pinball machine's flippers. So now you have trained a robot that can play pinball, because it has had 1 million games of practice time learning how to play.

Training robots to walk is another obvious use case, as servo positions on a digital twin of a robot can be automatically tweaked to optimize walking in a given environment. The trained ML models can then be used on the physical robots and real-world terrains.

PRO
Synthiam
#105  

That gets used a lot in universities, as it's cheaper to program a virtual robot than a real one. It was popular a number of years ago but slowly faded due to physics engine limitations. The trouble is that there's a larger game market than robot/physics market. So game engines simulate physics, but don't actually use anything close to reality. Turns out that a semi-complicated VR robot doesn't work when connected to a real one.

But for some stuff it does, like creating an animation. For navigating, or using real physics for walking or interacting with the world, it doesn't.

The best use case for AAA engines is what Will is using it for: creating animations with a digital twin.

This of course is the software version of using a real twin like demonstrated here:

PRO
Canada
#106  

I loved your puppet hack, but yeah, that is physical to physical. But what if you took virtual JD and ran him through something like Unity's robotics simulation to complete a task: https://github.com/Unity-Technologies/Unity-Simulation-Docs

Remember the IK work @Mickey666maus @FXRTST and @PTP were doing 2 years ago.

This is one of the best collaborative build threads I have seen on here. https://synthiam.com/Community/Questions/Vr-Options-3693/comments

#107  

@DJSures 

Quote:

Hey i'm not anything near your level but i did this... it's fun!
OMG!!! This is such a cool implementation!! It never ever even crossed my mind!! :D

#108  

@fxrtst I think I actually got you wrong!

I was trying to figure out why you are planning to move away from Unity, since it offers an Asset Store where you can buy skills like e.g. a REST client, and it also natively runs lightweight IoT solutions like WebRTC or gRPC. So we have all the tools we need for robot/IoT building, and ARC is already plug and play!

So my guess for the reason for switching to some AAA game engine was that you are planning to lift some pretty heavy stuff... Unreal is e.g. known for its unbelievable performance when it comes to its rendering engine!! But of course this also creates the need for massive hardware resources... so I thought it might be a bit much for a robot running on an SBC!

But it's not the first time you would amaze me with something I would have never imagined! So I am really curious what it will be!! :D

PRO
USA
#109   — Edited

@DJ I have a fun project slated in 2021 using the digital twin idea......such a great hack to get feed back from a standard servo!

@Nink Thanks for the kudos on the concept. Team effort. I have resurrected some of those videos on my channel... and it got me thinking about some other ideas for exploring this stuff even further. I've been fascinated by getting AI to teach robots how to move, walk and traverse. Google seems to be leading the way with this at the moment. I can see a future where you can build plug-and-play modules for robots, like a lower-legs section, and the AI will calculate loads for the upper torso, arm length and weight, then calculate a biped walking gait!!! That is the future!

@mickey Yay! It's hard sometimes for me to type out my thoughts clearly. Glad you get it now. And Unreal has an asset store as well, but I'm trying to put together as much free technology as I can, i.e. no plugins needed, etc. You can use their visual programming tool called Blueprint, which is a lot like Blockly, to code. Here is my setup for connecting to an Arduino and activating the onboard LED. It's mostly a C++ environment for coding outside Blueprint. Oh, and the Blueprints are shared all over the web... and you can copy one and take it apart... so it's like active code. User-inserted image

#110  

Also, something that I never realized is... there is Git for Unity!! :D

https://unity.github.com/

#111  

User-inserted image

The funny thing about node-based coding... more visual, but sadly not less complicated!! :D

PRO
USA
#112  

It's funny, every package has adopted nodes in one form or another, be it animation, surface painting or coding... I started getting familiar with it when Lightwave moved over several years ago. As long as you understand the stack, it's fairly simple. But yeah, it's pretty to look at, and the complexity is still there.

#113  

Yeah, totally... it's kind of a fresh approach to getting stuff done, and it's a big plus for a visually oriented person like myself! I don't mind coding, but a node-based GUI is just such a nice thing to get started with, right?

The one thing I really don't get yet is... what are you actually struggling with? The above graph shows that you are already sending data out to the Arduino on a serial port using Unreal, so if you are using PWM servos, all you need is a servo shield and you are done!

I have been using Unity for quite a while now to send servo positions live to ARC, and it works great!

So there are three different scenarios, which are all equally legit, when using any software:
a) sending the data directly to the serial port
b) sending the data to the servo And Camera Server
c) sending the data to a server of choice

Although it does not matter which game engine you are using, with the first approach you will lose the ability to integrate the servo positions within ARC, since ARC cannot "see" them, as they are being sent to the serial port directly...

The second approach can at the moment only be done in Unity, since DJ Sures implemented his fast and reliable streaming server, which does not use HTTP requests, but a compact binary stream for data transfer!

The third approach can also be done in both game engines, but requires a bit of extra setup. It is the route that I am currently using, so I can confirm it works great: no lag, no jitter!! Just install Flask and use it to send the data over to ARC! This way you are not losing your game engine of choice, and you can also have ARC process all the servo data.
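The actual Flask route used in this third approach is not shown in the thread, but the pattern (a small REST endpoint that accepts a batch of servo positions from the game engine) can be sketched with just the Python standard library. The `/servos` route and the JSON payload shape here are my own assumptions for illustration, not the poster's code:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ServoHandler(BaseHTTPRequestHandler):
    """Accepts POST requests with a JSON body like {"D0": 90, "D1": 45}."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        positions = json.loads(self.rfile.read(length))
        for port, value in positions.items():
            # This is where you would forward each position to the robot
            # (serial bus servo command, ARC script call, etc.)
            print(f"{port} -> {value}")
        body = json.dumps({"ok": True, "count": len(positions)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the console quiet

def serve(port=5000):
    # The game engine POSTs its cached positions here every frame or tick
    HTTPServer(("0.0.0.0", port), ServoHandler).serve_forever()
```

A Unity or Unreal client would then POST its batched servo positions to this endpoint each tick: the same batching idea the binary stream server uses, just over HTTP.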

I got a bit caught up in making my controller app visually more appealing as you see above...but am hoping to show it in a few days, so you can see what I am talking about!!

I am pretty sure we can get this to work for your robot! :)

PRO
USA
#114   — Edited

@mickey... are you using Flask? I'd be interested in seeing a video of that setup... I'm a visual learner too.

Unreal is a super massive download at 25 gigs!!!... so I'm also looking into free Blender as another option... much smaller footprint; this could solve some of my issues with a gaming engine. Blender is a proper animation program, so it has built-in IK, motion blending, motion controls, etc. It's all modules coded in Python. So I just need to find a way to get data out of Blender, to Flask, to ARC?!

#115  

OMG, you got me on that one... Blender!!! This is a delicate subject, since I would strongly advocate for Blender. I would tell everyone to use it, to start on this platform right away! It's open source, it's powerful, and Python is integrated in the very core of its engine... It runs on all major operating systems: Windows, macOS and Linux. And it is free, like open-source free... free to the bare bone of it!!!

But I have spent so much time learning software that I just did not dare to walk down that road yet! It's kind of a hard transition most of the time, and I heard that the interface does not make it easier...

BUT... if you are going down that road, I am super curious how things will work out for you! Positive feedback could also be a push in that direction for myself!!

And yes, sending data out from Blender should be super easy!! :)

#116  

Finally got to film and cut a demo... There is still some fine tuning to be done regarding the IK; it's a bit harder to control with a PS4 controller than in VR. But all in all it works quite well! The bit of jitter you see comes more from the structural build integrity of the robot than from the servos themselves!

I will write a short description of how I connect to ARC in a separate thread, but it's not difficult at all! :)

PRO
Synthiam
#117  

Oh that's great! Super awesome actually! :) Amazing how talented you guys are. When I was playing in Unity, there wasn't much I could do. I'm pretty terrible at design lol

PRO
USA
#118   — Edited

Nice job! So is this demoing pushing to Flask? Because this is really smooth motion... what is different?

I spent most of last night downloading Blender and looking around... coming from a LightWave background... once you know one animation program, you kinda know them all. Since you come from the Max universe, it should be simple to master. Most tools are there... just finding them is the trick with new software. I'll let you know my progress.

#119   — Edited

Thanks, and yes, the current version is using Flask and ARC, so it's all tried and tested to work as shown in the demo... The only thing is that I did not test it on any other servo setup, since ZOE is the only fully assembled robot in the house, and this setup runs on LX-16A serial bus servos. But I guess any type of servo setup should work. It's all kinda the same thing! :)

I also tried to get my setup done using the Servo And Camera Server, which will most likely work as well or better. But there is the minor annoyance that this would take more than the one plugin allowed in my free subscription plan, because I need both the Servo And Camera Server & LX-16A plugins to get this to work... maybe there will be a developer subscription plan for ARC at some point in the future?

But it's all working as it should, using the script that I have written in ARC to get the data from the server, so everything is working out great with ARC free... I will post a tutorial on how to set it up soon! :)

PRO
USA
#120  

...I look forward to that tutorial. It will help me better understand your process, so I can think about a Blender application.

#121  

Hey, the tutorial on how to use it is now over here...

https://synthiam.com/Community/Robots/UNITY-LINK-ARC-20157

#122  

Hey just wanted to share this, because I think this was something you were aiming for, right? It works with Unity, Unreal and Blender... so it is pretty versatile!

Although I am still thinking that creating custom animations for a robot using a VR setup is the best possible option.

Rigging humanoid motion directly to a robot is kinda like trying to take mocap data from a hummingbird and map it to a monkey; it's just such a different setup! But of course it all depends on how you do it; maybe there is something I did not think of when it comes to using mocap data! :)

#123  

This is a great insight into inverse kinematics. It is in Unity... but it can be applied to any other software! You mentioned that you wanted to build your own IK system to be independent of plugins... This should give you a good start!! :)

PRO
Synthiam
#124  

This robot skill does all that. There’s instructions at the top of this manual page on how to do it. I’ve also made some videos that are published on this manual page with examples.

It doesn't matter how the Unity model is controlled. The angles will be sent to ARC, which will move the respective servos. Any kind of servo can be used, including serial bus or PWM. Simply add the specific robot skill for your servo type if they're bus servos. Otherwise, just select the Dx ports for PWM servos.

PRO
USA
#125   — Edited

Pretty cool. I saw another review on YouTube, and the motion is only being captured from one camera angle, so there are issues with the motion when viewed from another angle... but not too bad.

@mickey in ref to mocap data:

My new approach has been retargeting. You load a .bvh file or use live streaming, and then retarget parts of the mocap data (local rotational or translational) onto a bunch of nulls that represent the joints of a robot. This way the nulls do not have to inherit the world space from the mocap data. The nulls' rotational data is then sent through some math to calculate the relationship from null rotation to servo positions (1-180).
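That null-rotation-to-servo math boils down to a linear remap with clamping. A sketch (the -90..+90 input range here is illustrative, not taken from any actual rig):

```python
def map_range(value, in_min, in_max, out_min=1, out_max=180):
    """Linearly remap a joint rotation into the servo position range,
    clamping first so out-of-range rotations can't command an invalid
    position."""
    value = max(in_min, min(in_max, value))
    span = (value - in_min) / (in_max - in_min)
    return round(out_min + span * (out_max - out_min))

# A null whose local rotation swings -90..+90 degrees maps onto 1..180:
centered = map_range(0, -90, 90)      # roughly mid-range servo position
clamped = map_range(200, -90, 90)     # clamped to the servo's maximum
```

The same remap works per joint; each null just gets its own `in_min`/`in_max` calibrated to that joint's physical travel.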

Problem: the danger of exposing an employee to a virus like COVID, or the cost of an employee (outsourcing to another country for savings).

Solution - Telepresence: This would be a great demo of a real-world use case for Exosphere. A driver (wearing a mocap suit) could stock shelves in a store thousands of miles away using a robot. The wearer could also have VR goggles so he/she could see the environment, products, etc. As they move in the suit (maybe just the upper body), those motions are streamed live to the robot.

PRO
Synthiam
#126  

Assign the joints and use the provided Map function to convert angles into servo degrees. The examples are included in this robot skill. Because this robot skill operates over TCP, which is what the internet uses, it can be used over the internet, and it provides a camera feed.

This essentially is a telepresence skill. And if you combine it with Exosphere by following the telepresence instructions, it's already ready to go.

PRO
USA
#127  

Yeah, the challenge is on the other end. The motion capture software must be linked to Unity and then stream motion capture data "live" onto a control rig (null unit) within Unity, where the world space must be ignored. There are a lot of hurdles to overcome, including occlusion/intersections of data going from human to robot and reducing spikes in the data (filtering) without introducing latency, which in a telepresence environment could spell trouble. If it was easy, then everyone would be doing it! But that's the call to challenge, isn't it?

#128  

Quote:

This robot skill does all that. There’s instructions at the top of this manual page on how to do it. I’ve also made some videos that are published on this manual page with examples.

it doesn’t matter how the unity model is controlled. The angles will be sent to ARC which will move respective servos. Any kind of servo can be used. Including serial bus or pwm. Simply add the specific robot skill for your servo type if they’re bus servos. Otherwise just select the Dx ports for pwn servos.

@DJSures sorry, I was hijacking this thread with off-topic information that I thought @fxrtst was interested in... the content I posted had nothing to do with the ARC skill; feel free to remove it or move it to another thread? When I said plugin, I was referring to Unity/Unreal plugins and not the ARC skill! The ARC skill works perfectly fine!! :)

#129  

@fxrtst those problems could easily be solved by just using a VR headset... I really don't get the point of struggling with mocap! But you probably have something very specific in mind?

In VR, if you want to get really fancy and the head and hand positions are not enough for you... you could also pump it up with a few Vive controllers! But IK endpoints and retargeting are things that need to be done anyway... so let's see what you have up your sleeve!! I am curious!! :D

PRO
Synthiam
#130   — Edited

We have a few customers who use exosphere for remote telepresence and use finger haptic gloves. That’s what this robot skill is being used for.

The physical robot is mirrored in Unity. When the user moves, the servoed joints can't hit each other because the Unity scene prevents it.

It'd be really great to see you guys using this stuff like some of the universities are. Their remote telepresence and Exosphere use would impress you, as it's right up the alley you're discussing.

What issues are you having with this robot skill? If you ask I’ll be able to help. You guys are pretty much the only people who use the community forum and I want you to be successful.

PRO
USA
#131  

It's hard to get you guys inside my head. Just know that whatever it is, it's going to be amazing.. lol.

PRO
Synthiam
#132   — Edited

I have no doubt it will be :) It always is!

Just don't want you to think you need to reinvent what we already have or can quickly modify. The leading cause of failed robot companies is full-stack development - meaning building stuff that already exists rather than building on top of it.

Robotics seems to be the one industry that is not moving very fast for that reason. I mean, we all use operating systems, computers, and programs that other people/companies make. But in robotics everyone wants to reinvent. Robot company after robot company has failed because of that; there's a trail of failures attributed to it.

Let me know what you’re having trouble with and I’ll help so you don’t have to rebuild everything. It’ll just drain your bank account and energy otherwise :)

#133   — Edited

Sorry... I just realized my Unity app has a problem!! But I will keep you guys posted!! :)

PRO
Germany
#134  

Hello, everyone,

I'm new in the community, my name is Uwe from Germany.

Since last week I have had a Rolli Rover. My interests are controlling it with my Oculus Quest and seeing a stereoscopic live view. I use a StereoPi V2 with an HDMI-to-USB cam device. I have also successfully tested it in Unity with the Servo and Camera Server skill; I get a stereo view (the image is transmitted side by side). The latency is very good; it works great so far.

One problem I encountered: the skill works when you add it to the project; it connects. But as soon as you save the project and start it again, no connection is established, and you have to delete the skill and reinsert it; then it works again. Is this a bug, or am I doing something wrong?

Excuse my English, I'm translating with a translator, I hope you can still understand it.

Regards Uwe

PRO
Synthiam
#135  

Use this robot skill with the oculus quest: https://synthiam.com/Support/Skills/Virtual-Reality/Oculus-Quest-Robot?id=21158

PRO
Germany
#136   — Edited

Yes, of course I know it, and I already have it running. But since I want to create the Unity app myself to display the video stereoscopically, I can't get any further with it. It would be perfect if I could get the Unity scripts (sending controller data) from that skill (Oculus Quest Robot); that would help me a lot.

I use this cam: User-inserted image StereoPi V2

Greetings Uwe

PRO
Synthiam
#137  

Try this robot skill to get stereoscopic video from two cameras in VR: https://synthiam.com/Support/Skills/Virtual-Reality/Dual-Camera-Servo-Server?id=21198