Servo And Camera Server


How To Download and Use This Control

  1. Make sure you have the latest version of EZ-Builder installed.
  2. Select the Install button below to download the file.
  3. Double click the downloaded file to open the installer.
  4. The installer will add this control to EZ-Builder.

This is a servo & camera video server that allows a remote client to move servos and receive a video stream from a camera device. It is especially useful for those creating Unity apps that operate as a client to EZ-Builder, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit.


Example Client App Source Code
Here is an example test app that connects to localhost (127.0.0.1), moves a servo on port D2, and displays the camera video stream. The sample app is C# .NET source code and can be downloaded at this link: Test App v2.zip
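
For orientation before the full examples, here is a minimal sketch of the client calls involved. This is not the actual test app source; it assumes the same ServoServerClient class and method names used in the examples further down this page.

Code:

using System;
using EZ_Robot_Unity_DLL;

// Minimal sketch: connect to a local servo/camera server, listen for camera frames,
// and move the servo on port D2. Names follow the examples later on this page.
class MinimalClientSketch {
  static void Main() {
    var client = new ServoServerClient();

    // Raised every time the camera sends a frame
    client.OnImageDataReady += imageData => Console.WriteLine("Frame received: " + imageData.Length + " bytes");

    client.Start("127.0.0.1", 8282);

    // Remember: the camera must be streaming before servo positions are transmitted
    client.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D2, (byte)90);
    if (client.HasServoChanged)
      client.SendCachedServoPositions();

    Console.ReadLine();
    client.Stop();
  }
}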


Raspberry Pi
This plugin will work with EZ-Builder on a Raspberry Pi, but it requires sudo privileges. This means you will have to launch EZ-Builder with sudo mono EZ-Builder.exe. This is because the plugin binds to a TCP port and therefore requires root privileges to run as a network server.


Test Unity Project
I have included a test Unity project as a reference example. The example rotates a cube on the screen using the ARROW keys. The texture from the camera stream is projected onto the cube. The arrow keys also move the servos connected to ports D0 and D1 relative to the rotation of the cube. You can download the project here: Servo Server.zip



Use In Unity
The stream client files in the "Test App" can be included in a Unity project to receive the video and move servos. The test app demonstrates how to move the servos using the client methods and how to display the video on Windows. To display the video in Unity, follow the steps below. The video arrives as a Texture2D that can be applied to any material.

1) To use this in your Unity App, copy the files from the Test App\ServoServerClient\*.cs into your Unity project.


2) Here is an example bit of Unity code that can be attached to an object in Unity:

Code:


using EZ_Robot_Unity_DLL;
using UnityEngine;

namespace SceneRobotView {

  public class RobotView : MonoBehaviour {

    // The stream client
    ServoServerClient _streamClient;

    // We want to ensure the scene has been initialized by at least one Update() event occurring before writing to a material.
    // This is because we don't want to write to a material that hasn't been initialized yet.
    bool _initialized = false;

    // To be friendly on the garbage collector, we re-use the same texture object.
    // The object gets initialized in the Start() for this scene.
    Texture2D _texture;

    // The data for the camera will arrive at a different rate than the scene.
    // To ensure there is synchronization between the two updates, we will store the camera video in this array
    // and load it into the texture on every Update()
    volatile byte [] _toDisplay = new byte[]{ };

    void Start() {

      // Initialize the texture with the video format from the servo Server
      _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);

      // The stream client will connect to the servo Server
      // Probably a good idea to use the address as a user-configurable item rather than hard coded;)
      _streamClient = new ServoServerClient();
      _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
      _streamClient.Start("192.168.1.50", 8282);
    }

    // Every time the camera sends a frame, this event is raised
    private void _streamClient_OnImageDataReady(byte[] imageData) {

      // Exit if the scene has not been initialized yet
      if (!_initialized)
        return;

      // Set the image data to an array which we will use in the Update() on the material
      _toDisplay = imageData;
    }

    void OnDisable() {

      // Stop the streaming because we're closing this scene.
      _streamClient.Stop();
    }

    // Called with every update frame of the scene
    void Update() {

      _initialized = true;

      // If there is data in the camera array...
      if (_toDisplay.Length > 0) {

        // ... set the texture from the camera array data
        _texture.LoadImage(_toDisplay);

        // Set the material to the current component
        var material = GetComponent<Renderer>().material;
        material.mainTexture = _texture;
        material.mainTextureScale = new Vector2(-1, -1);
      }
    }
  }
}

#1   — Edited


@DJSures I just dug out the app we once made for JD, it already talks to EZ-Builder.
So with the EZ-Pi Server on the Raspberry Pi, I should be able to drive my robot with Unity and EZ-Builder once you get the plugin for the breakout board done... This would be awesome!!
My arm setup has one additional joint, but it should be fairly easy to run him like a JD for testing!!:)

I will make an input field for the IP address, so anyone can easily connect to their EZ-Builder's HTTP Server...
Synthiam
#2   — Edited
@Mickey, this plugin will do what you're looking for. There's example code as well to demonstrate how to get a video image from the camera to Unity + how to set servo positions from Unity. There are Map() functions in the ServoServerStream as well, which will help map Unity floats of -1 to +1 into servo degrees.
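
Conceptually the mapping is roughly this (an illustrative sketch only, not the exact helper from the DLL):

Code:

using UnityEngine;

// Illustrative sketch only: linearly map a Unity float in [-1, +1] to servo degrees in [1, 180].
public static class ServoMapSketch {
  public static byte ToServoDegrees(float unityValue) {
    float clamped = Mathf.Clamp(unityValue, -1f, 1f);
    return (byte)Mathf.RoundToInt(Mathf.Lerp(1f, 180f, (clamped + 1f) / 2f));
  }
}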

The thing is, you will need a camera connected for it to work. The Unity ServoServerClient assumes you have a camera connected and streaming for the servo positions to be sent.

This plugin is WAY faster and better than using the http server - because it'll be much friendlier on resources as well... Doubly so if you're using it for a VR headset or something requiring fast efficient processing.
#3   — Edited
@DJSures You are a wizard!!! This is great!!!
I will have to go to work now, but will look into it this evening!
I already noticed this is not a sample scene, so I will have to figure out how to set it all up...
Would you mind providing a sample scene for Unity?
I know they are quite heavy, so a Dropbox link or something would be OK!

If there is no sample scene, I guess I will figure it out myself, great progress!!!:D
#4   — Edited
Ok now I know I will be revisiting my Unity this week. I prolly will have 1001 questions as I work my way through this plug. (I know hard to believe..LOL)

I’m interested to know: will this stream servo positions live from Unity to EZ-Builder with this plugin?
#5  

Quote:

The thing is, you will need a camera connected for it to work. The Unity ServoServerClient assumes you have a camera connected and streaming for the servo positions to be sent.



Just connected or actually streaming from the camera for the servos to work?
Synthiam
#6  
See the video and example Unity application in the description above.
#7   — Edited
...a video is worth a thousand words. Thank you. I will play around!

Great video btw!
#8  



Would I be able to do this, but connected to EZ-Builder by streaming this data out using this plugin, then the data to the EZB? If yes, would this be close to real time?
#9  
@DJSures Thanks a lot for the demo scene and for the explanation of your code!! This is really cool!!!
The app that I built in Unity had trouble sending the servo positions fast enough when I deployed it on my phone, since each servo was using a coroutine that includes an http request...so I was thinking that caching the servo positions would lead to a better result! But well, my hack and slay methods cannot really be called coding I guess...so I will take a close look at what you have been doing here!!

Thanks a lot!! This is fun!!:D
#10   — Edited
@fxrtst It is possible to stream the motion live. Now that @DJSures slimmed down the server to a dedicated server for camera and servo transmission, the latency will be a lot less than before! Using the HTTP server was giving very good results already!!
So you can build all the motions and translation of motion in Unity...basically Unity can be the app that could run on your phone, or Windows or Mac, and the EZ-B will drive your robot!

You could also store the servo positions in EZ-Builder if you wanted to...
#11  
...ok, that's what I was hoping for! I can't wait to test it! Let me know how it goes for you!
#12   — Edited
StreamClient _streamClient; needs to be changed to ServoServerClient _streamClient;

in the RobotView script...
#14  
..Nice. That's what I was hoping for! Cool. This just opened up so much more functionality, being able to use Unity with servos and all the other things that EZ-Builder has...originally what I was hoping for...
#15   — Edited
Hey all...I made some tests with the plugin, and ran into problems, which I partly solved.
First of all, I wanted to turn the camera stream into a UI raw image element, but the problem is that Unity does not allow Texture2D as UI textures. Only Sprites are allowed as UI textures I guess...I solved this quick and dirty by setting up a second camera with a RenderTexture, which allows us to display it within the UI. Basically cheating by filming the cube with the second camera!:D

But the main problem is that I am using a plugin for Unity to extract Inverse Kinematic rotations; this plugin is called BioIK, developed by Sebastian Starke and discovered by @fxrtst...this plugin returns the joint value, which I would like to use for my servo positions.

The problem is, I cannot get this to work properly because _streamClient.MapToByte seems to do something to those values which is not working for me!

So is there a way to just send an integer of 1 to 180 to the _streamClient?
This is how I extract my rotations...

e.g.

if (Axis_Z)
{
    BioJoint joint = this.GetComponent<BioJoint>();
    double value = joint.Z.GetTargetValue();
    int position = Mathf.RoundToInt((float)value);

    if (Value != position)
    {
        Value = position;
        Changed = true;
    }
}
Synthiam
#16   — Edited
Yes... don’t call the MapToByte. Lol. Just call the set servo function directly
#17  
I think I was trying to do that, and it messed something up...but it was late so maybe I was just confused!!:D I am about to go back home, will try it once more then!!

The servo function will accept int ranging from 1 to 180?
I am still a NOOB so a code snippet will always be handy...but I guess I will manage!!

Will post once I got things set up!!:)


Btw, the Raspberry Pi camera is still painfully slow; are you planning to implement a compression feature on the EZBPi server for the camera?
#18   — Edited
Oh, I remember...I did NOT try to do that, I tried feeding an integer into the MapToByte function...
That messed things up!!!:D
#19  
Well, I still cannot get it going...could you give me a code snippet on how to use a float or int with the _streamClient?

It always throws an exception at me saying it cannot convert the float, int or double to byte!!
Synthiam
#20  
cast the variable to a byte with (byte)

Code:


_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)SomeInt);
#21  
OMG....I thought it was some kind of magic!!!!:D:D:D

Thanks for helping the NOOB!!!
#22  
Got it working now...my god, sometimes it's so hard to figure stuff out when you depend on searching the Internet!! That was all it took!!

I will make a quick demo video on how this is working so far!!
It's great, finally Unity and EZ-Builder share common ground!!:D
#23   — Edited
Yes...waiting for the video! Working on a new project that can utilize this with the Bio IK.
#24  
...game changer....again....30 year fantasy about to come to fruition.....CG and real world are about to collide and merge.
#25  

This is just a quick demo that this can work out...
You can see the camera stream within the Unity App, and servo positions are being transferred live to EZ-Builder.
This was utilized on the IoTiny, so I don't know if this caused the latency, or maybe my *rainbow* coding?:)

I am using @DJSures' code snippet

_streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)SomeInt);

So the weird jitter might be caused by me using SetCachedServoPosition wrong.
But this is the POC that this can work, and it just needs some tuning I guess!!

Great step forward, these are some exciting days!!!:D
#26  
@DJSures I tried several things to get Unity sending the servo positions live to EZ-Builder, which was partly successful...what I am experiencing are random timeouts. So the servo position is not sent, which results in a jump to the next position rather than smooth movement...just guessing this of course!
I was testing on an IoTiny and on an EZ-B v4, but I guess the problem is with my code in Unity!

So since you had this already figured out for the Virtual Reality plugin, would you mind sharing your code on how to drive a couple of servos live? Just to make sure I did not do anything wrong? Or should I clean my code and post it here, so you can see what I was up to?:D
#27  

Code:

using EZ_Robot_Unity_DLL;
using BioIK;
using System.Collections;
using UnityEngine;
public class CameraCubeRotate : MonoBehaviour {
    ServoServerClient _streamClient;
    bool _initialized = false;
    public float _speed = 0.5f;
    Texture2D _texture;
    float positionAbs;
    public GameObject D0;
    public GameObject D1;
    public GameObject D2;
    volatile byte [] _toDisplay = new byte[]{ };
    void Start() {
        _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);
        _streamClient = new ServoServerClient();
        _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
        _streamClient.Start("127.0.0.1", 8282);
    }
    private void _streamClient_OnImageDataReady(byte[] imageData) {
        if (!_initialized)
          return;
        _toDisplay = imageData;
    }
    void OnDisable() {
        _streamClient.Stop();
    }
    void Update() {
        _initialized = true;
        if (Input.GetKey(KeyCode.Escape))
            Application.Quit();

        if (Input.GetKey(KeyCode.RightArrow)) {
            transform.Rotate(new Vector3(0, -_speed, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(transform.rotation.x));
        }
        if (Input.GetKey(KeyCode.LeftArrow)) {
            transform.Rotate(new Vector3(0, _speed, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, _streamClient.MapToByte(transform.rotation.x));
        }
        if (Input.GetKey(KeyCode.DownArrow)) {
            transform.Rotate(new Vector3(_speed, 0, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(transform.rotation.y));
        }
        if (Input.GetKey(KeyCode.UpArrow)) {
            transform.Rotate(new Vector3(-_speed, 0, 0));
            _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, _streamClient.MapToByte(transform.rotation.y));
        }
        BioJoint joint_D0 = D0.GetComponent<BioJoint>();
        double value_D0 = joint_D0.Z.GetTargetValue();
        int position_D0 = Mathf.RoundToInt((float)value_D0);
        int positionAbs_D0 = Mathf.Abs(position_D0 - 90);
        print(positionAbs_D0);
        BioJoint joint_D1 = D1.GetComponent<BioJoint>();
        double value_D1 = joint_D1.Y.GetTargetValue();
        int position_D1 = Mathf.RoundToInt((float)value_D1);
        int positionAbs_D1 = Mathf.Abs(180 - (position_D1 - 90));
        print(positionAbs_D1);
        BioJoint joint_D2 = D2.GetComponent<BioJoint>();
        double value_D2 = joint_D2.Y.GetTargetValue();
        int position_D2 = Mathf.RoundToInt((float)value_D2);
        int positionAbs_D2 = Mathf.Abs(position_D2 - 90);

        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs_D0);
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D1, (byte)positionAbs_D1);
        _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D2, (byte)positionAbs_D2);
        // Send all the servo positions if there's been a change
        if (_streamClient.HasServoChanged)
            _streamClient.SendCachedServoPositions();
        if (_toDisplay.Length > 0) {
            _texture.LoadImage(_toDisplay);
            var material = GetComponent<Renderer>().material;
            material.mainTexture = _texture;
            material.mainTextureScale = new Vector2(1, -1);
        }
    }
}
#29  
Not really, I already dug out the old script that @ptp wrote while he was still trying to set this all up by directly talking to the EZB and bypassing EZ-Builder! He wrote some really good code that I am currently trying to rewrite...he had a really nice approach to handling the servos in Unity! But I will have to go to sleep now, it's 3am already!!:D

Did you ever try the Virtual Reality plugin? It should use the same code as the camera server? Does it work smoothly?
I will test this tomorrow...to see how the latency is while using the plugin!!
Anyways I am having fun learning a lot of stuff here!! Let's hope we can tie it all together!!:)
Synthiam
#30  
Think you're overcomplicating how it works. Given the code you posted, I'm not certain of the full picture. Put the project on Dropbox or something so I can see what you've done?

I provided a complete and working Unity example above. Did you try it? Hope so, because I took time to make it for you:)
#31   — Edited
@DJSures Yes, I used your code and am using your plugin!
I am making good progress on the servo setup; this should make it easier for everyone using the scene to declare the servos and their properties!
I am sure it's me complicating stuff, or messing stuff up!!:D

Would you mind sending me a private mail, so I can send you a Dropbox link? The scene contains a paid plugin, so I don't want to share the link publicly!


P.S. I ordered two LewanSoul servos so I can test all this on my EZPi too!! It would be awesome if that 16-channel servo PWM board would be supported in future updates!
#32   — Edited
My theory is that Bio IK is somehow changing the translation and rotation values to something that servo server cannot translate properly? Or a longer delay with SendCachedServoPositions than you expect (with as many servos as are in JD)?

I'd try a simple scene with just a box in Unity and connect rotational values to this plugin to stream to a single servo and see if it works like you suspect. Then add another box and Bio IK to create an IK chain. Does it act OK or do you get the jitters? Keep the test simple and add one thing at a time until it breaks.
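
Something as bare-bones as this is what I mean (a rough, untested sketch reusing the client calls from DJ's example above; the camera still has to be connected and streaming in EZ-Builder):

Code:

using EZ_Robot_Unity_DLL;
using UnityEngine;

// Rough, untested sketch: one spinning box driving a single servo on D0, nothing else.
public class OneServoTest : MonoBehaviour {
  ServoServerClient _streamClient;

  void Start() {
    _streamClient = new ServoServerClient();
    _streamClient.Start("127.0.0.1", 8282);
  }

  void Update() {
    // Spin the box slowly and squeeze its yaw (0..360) into the 1..180 servo range
    transform.Rotate(new Vector3(0, 30f * Time.deltaTime, 0));
    int degrees = Mathf.Clamp(Mathf.RoundToInt(transform.eulerAngles.y / 2f), 1, 180);
    _streamClient.SetCachedServoPosition(ServoServerClient.ServoPortEnum.D0, (byte)degrees);
    if (_streamClient.HasServoChanged)
      _streamClient.SendCachedServoPositions();
  }

  void OnDisable() {
    _streamClient.Stop();
  }
}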
#33  
@fxrtst I will look into this a bit more now...the reason why I am rewriting the script that @ptp was using back in the day is mainly because I need to also cache my servo positions when I am using my Unity app on Android. Since Android is very slow when it comes to http requests, I would like to send them all at once rather than per servo...and the script @ptp wrote is a beautiful way to do so!!
Once this is set up, we can send as many servos as we want, and test if Unity to EZ-Builder works without latency!!

It all works fine with the cube scene...so it might be some messed up code somewhere!!:)
Synthiam
#34  
I can only guess what your project is doing, but I imagine the cache is being set in multiple places. Put your project on Dropbox and I'll take a look. I'm certain it will only take a minute and I'll be able to point it out.
#35  
@DJSures I will do it, just need to clean it up a little, it is quite messy at the moment!:D

One thing I would like to know...can I change the

_streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);

D0 from ServoServerClient.ServoPortEnum.D0 to a variable?

I cannot wrap my head around this!!
But I can also just send you my scene so far with just one servo...
I will do this now!:)
#36  
This is the code so far...using one servo works fine!!:)

Code:

using EZ_Robot_Unity_DLL;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;
using Debug = UnityEngine.Debug;
namespace BioIK
{
    public class ServoDefinitionsAndSend : MonoBehaviour
    {
        public int positionAbs;
        ServoServerClient _streamClient;
        bool _initialized = false;
        Texture2D _texture;
        volatile byte[] _toDisplay = new byte[] { };
        private static readonly ServoDefinition[] ServoDefinitions = new ServoDefinition[]
        {
            new ServoDefinition("JD_Rig_v01:Right_Rotate_Shoulder", AxisType.Z, ServoPortEnum.D0, false, 1, 180, 90),
            //new ServoDefinition("JD_Rig_v01:Right_Upper_Arm", AxisType.Y, ServoPortEnum.D1, false, 1, 180, 90),
            //new ServoDefinition("JD_Rig_v01:Right_Lower_arm", AxisType.Y, ServoPortEnum.D2, false, 1, 180, 90),
        };
        private class ServoDefinition
        {
            public readonly string JointName;
            public readonly AxisType Axis;
            public readonly ServoPortEnum Port;
            public readonly bool inverted;
            public readonly int minValue;
            public readonly int maxValue;
            public readonly int middleValue;
            public ServoDefinition(string jointName, AxisType axis, ServoPortEnum port, bool inverted = false, int minValue = 1, int maxValue = 180, int middleValue = 90)
            {
                this.JointName = jointName;
                this.Axis = axis;
                this.Port = port;
                this.inverted = inverted;
                this.minValue = minValue;
                this.maxValue = maxValue;
                this.middleValue = middleValue;
            }
            private int ClampValue(int degrees)
            {
                return degrees < this.minValue ? this.minValue : degrees > this.maxValue ? this.maxValue : degrees;
            }
            public int AdjustValue(int degrees)
            {
                var servoValue = FixRotation(degrees, this.inverted, this.middleValue);
                var clampedValue = this.ClampValue(servoValue);
                return clampedValue;
            }
        }
        private static int FixRotation(int deg, bool inverted, int middleAngle = 90)
        {
            return !inverted ? middleAngle + deg : middleAngle * 2 - (middleAngle + deg);
        }
        public enum AxisType
        {
            X = 0,
            Y = 1,
            Z = 2
        }
        public enum ServoPortEnum
        {
            D0 = 0,
            D1 = 1,
            D2 = 2
        }
        void Start()
        {
            _texture = new Texture2D(640, 480, TextureFormat.RGB24, false);
            _streamClient = new ServoServerClient();
            _streamClient.OnImageDataReady += _streamClient_OnImageDataReady;
            _streamClient.Start("127.0.0.1", 8282);
        }
        private void _streamClient_OnImageDataReady(byte[] imageData)
        {
            if (!_initialized)
                return;
            _toDisplay = imageData;
        }
        void OnDisable()
        {
            _streamClient.Stop();
        }
        private void Update()
        {
            _initialized = true;
            int ServoNumber = ServoDefinitions.Length;
            for (int i = 0; i < ServoNumber; i++)
            {
                string Name = ServoDefinitions[i].JointName;
                var ServoJoint = GameObject.Find(Name);
                BioJoint joint = ServoJoint.GetComponent<BioJoint>();
                print(ServoJoint);
                bool invert = ServoDefinitions[i].inverted;
                var PortNumber = ServoDefinitions[i].Port;
                print(PortNumber);
                var UpVector = (int)ServoDefinitions[i].Axis;
                if (UpVector == 0)
                {
                    print("X");
                    var Value = Mathf.RoundToInt((float)joint.X.GetTargetValue());
                    if (invert)
                    { positionAbs = 180 - (Mathf.Abs(Value - 90)); }
                    else
                    { positionAbs = Mathf.Abs(Value - 90); }
                    //print(positionAbs);
                    _streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
                };
                if (UpVector == 1)
                {
                    print("Y");
                    var Value = Mathf.RoundToInt((float)joint.Y.GetTargetValue());
                    if (invert)
                    { positionAbs = 180 - (Mathf.Abs(Value - 90)); }
                    else
                    { positionAbs = Mathf.Abs(Value - 90); }
                    //print(positionAbs);
                    _streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
                };
                if (UpVector == 2)
                {
                    print("Z");
                    var Value = Mathf.RoundToInt((float)joint.Z.GetTargetValue());
                    if (invert)
                    { positionAbs = 180 - (Mathf.Abs(Value - 90)); }
                    else
                    { positionAbs = Mathf.Abs(Value - 90); }
                    //print(positionAbs);
                    _streamClient.SetServoPositionAndSend(ServoServerClient.ServoPortEnum.D0, (byte)positionAbs);
                };
            }
            if (_streamClient.HasServoChanged)
                _streamClient.SendCachedServoPositions();
            if (_toDisplay.Length > 0)
            {
                _texture.LoadImage(_toDisplay);
                var material = GetComponent<Renderer>().material;
                material.mainTexture = _texture;
                material.mainTextureScale = new Vector2(1, -1);
            }
        }
    }
}
Synthiam
#37  
Again, the code doesn't help me. I'd need to see your project. The code is a very small factor to the project as a whole.
#38  
There you go...delete the comment after you downloaded the whole thing!!:)
#39   — Edited
Seems like this thread has been split over to the serial bus thread. Let's keep it all nice and neat in one place:)

I believe this is the single most powerful development with Synthiam/EZ-Builder.


We should work openly and freely to get this figured out. I think it's important that DJ take a look at the scene with the inverse kinematics plugin attached to the joints. He will know immediately what the issue is, if there is one.

The Unity bridge could be a game changer for robotics. Using it live or with recorded motions could change the way we add motion to our robots.

My 2 Cents
#40   — Edited
@fxrtst I am totally with you on all points...my goal was always to create a Unity project where everyone can contribute, which is why I thought a JD would suit best! So there is the EZ-Robot, which is a good base for testing out new components, and whoever comes up with a new feature will be able to contribute and share!!
So if for example, you found out how to create fluid animations you could make one for our sample scene and we can all test it on our own projects!! Or if I would make a new camera control or a nice tracking feature...

I only took the link down to respect Sebastian, the guy who wrote BioIK...but I will organize the project and will be putting it up on GitHub, so it will be open for everyone to contribute and share!!

I do not experience any trouble sending live data to my robot, but it is running on different hardware and different software...
Since Synthiam is opening up to new hardware solutions, I am happy to get this revitalized; now I can use my robot's hardware and have the EZ-Builder software process the data!
I never had visual feedback, because this would have required tearing my robot's cable management apart just for testing...now it's a whole new story!!:)

If anyone does not want to wait, pm me for the messy scene, so you can hack and slay yourselves!!:)
#41   — Edited
*deleted* *deleted* *deleted* *deleted* *deleted*
#42   — Edited
Yeah that was a good idea, I'd feel bad if Sebastian lost money because of us.

So the issue you are having is other hardware and software you are adding to your project?

If I set up a scene with Bio IK and connect it to the EZB using this plugin, then everything works? I'm going to try and test later today. Something simple like one or two joints.
#43  
@fxrtst I was trying to use an implementation of the code @ptp wrote back in the day; I will drop him an email so he can participate, because he knows best what he was coding and how it works!
I will clean up the project (hopefully tonight) and put it to GitHub, so everyone can contribute!!

I guess that is the fastest way to get this to work...I will need to clean the rig a little, redo some naming conventions, and remove some *bunny hug* that piled up when I was trying to get stuff done...with that out of the way, you, @DJSures, @ptp and myself will have a nice clean sandbox to play in!!:)
#44  
@all Guys I am terribly sorry to have caused a mess, but now I am almost 100% positive that my weird servo jumps came from underpowering the EZ-B servos! I run my robot on 5 volts and I did not pay attention when I hooked up the EZ-B to the power supply...I just thought it would look cool to have those EZ-Robot servos attached!!

So today, just because I rebuilt the scene, I grabbed some servos from the table...and they seem to run just fine!!:D

Anyways I created a GitHub repository; it is without the plugin, but @DJSures can pm me or @fxrtst about this. I don't think that Sebastian would mind if we gave it to someone for debugging...just putting it out in the open is not so nice!!:)

I am really excited where this project will take us...it's been a lot of fun so far!!
#45  
....ahhh the ole underpowered servo trick:). Sounds good. If this plugin works with Bio IK then I'm happy. Gotta have some sort of IK to make all this worthwhile!
#47  
Let me know how this works for you, and drop DJ a line about the missing plugin so he can do some testing too...;)

I will go and sleep now, it's 1am over here!! See you tomorrow!!:)
#49   — Edited
Okay, just got a moment to sit down with this. I am having trouble getting the EZB to talk with Unity. I'm sure it's IP/port related. No streaming occurring. I just added the CameraCubeRotate to a cube. The camera is active in EZ-Builder and the plugin is installed and open. I press play in Unity and I can control the rotation with the arrow keys, but there is no projection of the camera and no servo movement.
#50  
@fxrtst Are you using the GitHub scene or the ServoServer scene?
I will do a quick video on how to set it up when I am back from work!!
It all works fine now!!:)
#51  
Btw, can you confirm that BioIK is picking up the joint rotations when you are playing those .fbx animations exported from Lightwave? I had some trouble figuring out the differences between baked animations and animations done within Unity...but I am sure it can be sorted out easily!!
#52   — Edited
I loaded the ServoServer scene. I just get the white cube; I can rotate it using the arrow keys, but there is no servo or camera streaming data from EZ-Builder. There is no Bio IK in my scene, I'm just trying to get the basic scene to work first before I start adding IK.

(Edit: Could it be version dependent? I'm using Unity 2017.4; DJ built the scene in a 2018 version.)

I think the fbx files were only for joint positions; I don't remember doing any animation with them. Baking animations from one program to another simply removes the bone rig and bakes vertex positions on every frame, vs. standard animations, which use a rig: you set keyframes on poses and the program interpolates the frames from keyframe to keyframe.
#53  
My Unity is up to date and everything works fine!! So updating might be a good idea!:)
#54  
Yeah, I updated and I still have the same issue. I'm using 127.0.0.1 as that's the internal IP of all computers, correct? What is the port number based on? Is 8282 just an arbitrary number?
#55   — Edited
* deleted *
Not relevant information
#56   — Edited
@ptp I am using 

_streamClient.Start("127.0.0.1", 8282);

And it works for me, I figured the ServoServer runs on the local machine, right?

Anyways, one thing that will cause it not to work is if you did not start your camera...
You need to have the EZ-B camera connected and running!!:)

It's kind of funny: since I connected my robot with an Arduino Uno running the EZ-B firmware, I still have to connect to my EZ-B and start its camera in EZ-Builder for this to work...so it basically just sits on the shelf doing nothing but streaming a video of my wall!!:D

But it all works, which is the only thing that counts, its all still beta anyways...
And finally my robot can participate, and I can use EZ-Builder by just having to rewire the servo board connection and not the whole cable tree of my servos!!:D
#57  
....yeah my camera has been on for a day now..same thing, pointed towards the wall. Using 127.0.0.1:8282 will not stream for me. I've used every possible combo including PTP's suggestion and I still get no stream.
#58  


...I have noticed, using the default scene with 127.0.0.1, that when I hit play in Unity and then hit stop, the plugin window in EZ-Builder updates with "disconnection on tty" 1 or 0, so it looks like they are talking to some degree, but not passing the camera and servo info.
#59  
.....question: are you using the beta version or the 1.26 release, and does it matter?

EZ-Builder Designer 1.26
EZ-Builder Designer beta 3.4
#60   — Edited


It's working.
127.0.0.1 8282
#62  
@fxrtst When you said the thing about the disconnect, it came back to my mind that I had an issue like that before...try removing the camera control from your project and adding it again, just a shot in the dark!!:)
#63   — Edited
............................deleted.............
#64   — Edited
......deleted.....
#65  
Can you guys tell me which version of EZ-Builder you are using?
#66  
.....OK. This should go in the description under prerequisites...you must use the latest beta. WILL NOT work with 1.26.

Got it working by installing the latest beta.
#67  
Keeping my fingers crossed for you...and also I would try the EZ-B camera, just to make sure!!
But I guess updating will fix your problem!!:)
#68   — Edited
....yeah it's working...man, I spent so many hours messing around....grrrr


Do you have this working with Bio IK?
#69  
Yes, it's all working!! But I can tell you this needs a lot of tweaking if you want to create fluid motion...I got my scene set up, so I can use the Virtual JD scene to move my own robot!!
It's a bit off, because the measurements of the JD and my own robot do not really match, but it works!!

You can set it up any way you like, I guess I can always reproduce your setup in a way!!
Let me know how its all working for you!!:)
#70  
No fluid motion because? What are your thoughts? I wonder, the more servos you have, if it's just too much for a live setup?
#71   — Edited
I've managed to break it a couple of times. I sent rotations for 3 servos real quick and it stopped responding, and I had to reboot the IoTiny to get it to work again. I can see the jerky movements in the servo positions. Not sure what that is. Could be that the positions are sent once collected, causing a bit of a lull between positions.
#72  
Did you use this method... _streamClient.SetCachedServoPosition

But yes it's choppy, I guess to get smooth motion you would have to do something to those values!
Either way, form an array of values over time and send an equalized value every Nth frame, or send them at a given framerate...these are all options!
It all works as it should I guess, but it still needs a great deal of tweaking!!:)
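
For example, even just easing toward the target before sending might already help (an untested sketch, assuming the client is already connected and one instance is used per servo port):

Code:

using EZ_Robot_Unity_DLL;
using UnityEngine;

// Untested sketch: ease each sent position toward its target instead of jumping,
// so the servo receives small steps every frame rather than large jumps.
public class SmoothedServoSender : MonoBehaviour {
  public ServoServerClient streamClient;   // the existing, connected client
  public float smoothing = 10f;            // higher = snappier, lower = smoother
  float _smoothed = 90f;                   // last position actually sent

  public void SendSmoothed(ServoServerClient.ServoPortEnum port, int targetDegrees) {
    _smoothed = Mathf.Lerp(_smoothed, targetDegrees, smoothing * Time.deltaTime);
    streamClient.SetCachedServoPosition(port, (byte)Mathf.Clamp(Mathf.RoundToInt(_smoothed), 1, 180));
    if (streamClient.HasServoChanged)
      streamClient.SendCachedServoPositions();
  }
}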
#73  
Yes, I used that method. I think you are correct, something needs to be done to those values to smooth out the motion. For sure it needs some tweaking. But a great start.
#74   — Edited
The title says it all. Step right up ladies and gentlemen! Spin the cube and watch wormy's mesmerizing dance!
#75  
That seems to be super fluid...are you using the default code, or did you do anything to those servo values already?
#76  
That's default. And I think that the cube is spinning so fast that the servos are forced to update faster. But if I slow down the movement it's very jerky. Almost as if the steps are larger than a degree or two.
#77  
Ah OK, I was thinking it's only my setup that does this...
#78  
Hey guys...I tried to use the Raspberry Pi camera in Unity for the setup with the Servo-and-Camera-Server, but had no luck!
It gave me the following error in Unity 2018!
Any ideas what this might be?

SocketException: An attempt was made to access a socket in a way forbidden by its access permissions.

System.Net.Sockets.Socket.Bind (System.Net.EndPoint local_end)
SyntaxTree.VisualStudio.Unity.Messaging.UdpSocket.Bind (System.Net.IPAddress address, Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager..ctor (Int32 port)
SyntaxTree.VisualStudio.Unity.Messaging.Messager.BindTo (Int32 port)
SyntaxTree.VisualStudio.Unity.Bridge.VisualStudioIntegration+<>c__DisplayClass9_0.<.cctor>b__0 ()
SyntaxTree.VisualStudio.Unity.Bridge.Project+<>c__DisplayClass37_0.b__0 ()
UnityEditor.EditorApplication.Internal_CallUpdateFunctions () (at C:/buildslave/unity/build/Editor/Mono/EditorApplication.cs:200)
#79  
It was an old version of the EZBPi server that caused the error in Unity...I updated to the latest version and everything works just fine!!
So now there is one thing that I would like to know...
Since my robot is using serial bus servos, is there a way to drive them directly with the ServoServerClient?
Or is there a way to access those values that are being sent?

If this is not possible, I would send the values to another server already established on the Raspberry Pi...
But if there were a way to do it with the ServoServerClient, that would be kind of cool too!!:)