
VR Options?

Does anyone know if the Oculus Rift SDK would work with EZB if a plug-in were developed for it? It doesn't really need to be something as big and bulky as that. It's very unfortunate that Vuzix is no longer offering support for EZB; that would have worked great.

Like with the Vuzix, I'd like to pull the IMU data to move servos and transfer the camera feed from the EZ camera to the goggles, across two completely separate computers in different places.
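
On the servo side, the mapping itself is a small piece of code. A minimal C# sketch (the angle and servo ranges are placeholders, and the headset SDK would supply yaw/pitch however it exposes them):

using System;

static class HeadTracking
{
    // Map a head angle (degrees) onto a servo position, clamping to range.
    // All ranges here are placeholders; tune them per servo.
    public static int AngleToServo(double angle, double angleMin, double angleMax,
                                   int servoMin, int servoMax)
    {
        if (angle < angleMin) angle = angleMin; // clamp below
        if (angle > angleMax) angle = angleMax; // clamp above
        double t = (angle - angleMin) / (angleMax - angleMin); // normalize to 0..1
        return (int)Math.Round(servoMin + t * (servoMax - servoMin));
    }
}

// e.g. map a yaw of -45..45 degrees onto a servo accepting 1..180:
// int position = HeadTracking.AngleToServo(imuYaw, -45, 45, 1, 180);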

Any ideas or suggestions on how to build a proof of concept (POC) for this would be appreciated.



#249  

#248 Unity communicates directly with the EZ-B; the way it is set up now, it bypasses ARC, so all constraints will have to be applied directly within Unity!

#249 The script is universal, although you will obviously have to build a model inside Unity for every robot, and you will have to register the servos in the script!

PRO
Canada
#250  

@fxrtst, ptp's script is well documented. You just need to create a new ServoDefinition for each ServoPortEnum for Six (it will be a cut-and-paste job after doing one leg).
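
Roughly what that registration looks like (a sketch only; the ServoDefinition constructor arguments and the port-to-joint pairing below are assumptions, so check the actual script for the real signature):

// Illustrative only: the (port, min, max) constructor arguments shown here
// are an assumption; check ptp's script before copying.
var servos = new ServoDefinition[]
{
    new ServoDefinition(Servo.ServoPortEnum.D0, 30, 90),  // leg 1 hip
    new ServoDefinition(Servo.ServoPortEnum.D1, 10, 170), // leg 1 knee
    new ServoDefinition(Servo.ServoPortEnum.D2, 10, 170), // leg 1 ankle
    // ...cut and paste for the remaining five legs, changing ports and limits
};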

PRO
USA
#251  

Ok great. I have a lot going on and am struggling to find the time to download and test.

PRO
USA
#252  

Quote:

So we should now have control over min and max, and also over inverting the servo direction, within the plug-in? Are we completely bypassing ARC? Or will any constraints we set on servos in ARC be adhered to?

Please check my post #179

The current implementation uses the Mono SDK's EZ_B.DLL. The connection is established between Unity and the EZB controller, so ARC is not involved.

ARC EZ-Script min/max example:


# Left Gripper
SetServoMin(d6, 30)
SetServoMax(d6, 90)

The equivalent using the EZ_B SDK (DLL):


// Left Gripper
this.ezb.Servo.SetServoMin(Servo.ServoPortEnum.D6, 30);
this.ezb.Servo.SetServoMax(Servo.ServoPortEnum.D6, 90);

I didn't use those EZ_B functions, but that is not the equivalent of doing an "ARC bypass". Those functions implement the same logic I did, although if you have an application with multiple components and multiple calls to SetServoPosition, it makes more sense to initialize the limits once so that every later call respects the min/max.
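
As a sketch of that pattern (SetServoPosition's exact signature is assumed here; the min/max calls are the ones shown above):

// Initialize the limits once, e.g. at startup:
this.ezb.Servo.SetServoMin(Servo.ServoPortEnum.D6, 30);
this.ezb.Servo.SetServoMax(Servo.ServoPortEnum.D6, 90);

// Later calls, from any component, then respect those limits:
this.ezb.Servo.SetServoPosition(Servo.ServoPortEnum.D6, position);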

I'll fix the script later to use those functions and enforce good practices.

I'll improve the script to support serial connections and connections via ARC.

To summarize, you have 3 options:

  1. Connection to the EZB using a hostname/port (EZB SDK or via custom code)

Pro: Versatile: wireless, no cables.

Cons: You can't share the EZB with ARC. Wi-Fi speed is variable (interference, distance), and there is no 5 GHz Wi-Fi, only 2.4 GHz.

  2. Connection to the EZB using a serial port physically connected to the camera port

Pro: Minimum entropy: you send 2 bytes per servo directly to the controller.

Cons: You can't share the EZB with ARC. You need a physical connection, and you lose the camera port (and with it the camera).

  3. Connection to ARC's HTTP server

Pro: Versatile: wireless, no cables. You can share the EZ-B with ARC; you can use Auto Position, EZ-Scripts, and all that "boring" stuff that keeps you from coding bare-metal C or C++ :)

Cons: The Wi-Fi issues from option 1, plus more entropy: Unity/custom app -> Wi-Fi -> ARC HTTP server -> EZ-Builder execution machine -> EZ_B.DLL -> EZB.

The protocol is HTTP, which is verbose: much more than 2 bytes per servo.

You pick your poison.
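
As a starting point for option 1, a minimal connection sketch (assuming the EZ_B.DLL exposes an EZB class with Connect and IsConnected, as older SDK samples do; verify the names against the DLL version you actually use):

using EZ_B;

class EzbLink
{
    private readonly EZB ezb = new EZB();

    public void Start()
    {
        // Option 1: direct hostname/port connection; ARC is not involved.
        ezb.Connect("192.168.1.1"); // assumed address; default EZB port
        if (!ezb.IsConnected)
            return; // handle the failure in your app

        // Register the limits once, then stream positions (see above).
        ezb.Servo.SetServoMin(Servo.ServoPortEnum.D6, 30);
        ezb.Servo.SetServoMax(Servo.ServoPortEnum.D6, 90);
    }
}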

PRO
USA
#253  

Ok, thank you PTP, that's a lot to digest and it answers a lot of my questions. Obviously each of us has our own project ideas and our own plans for how to implement this in our robots.

I personally will be looking into how to get facial mocap to move servos, to work alongside your Perception Neuron plug-in. That way I'll have a full solution for face and body motion capture.

I will also be diving deep into Unity, setting up animations keyframe by keyframe and using IK/FK to build motions such as gaits quickly.

Both of these will likely be played back from files, not live. Your solution is complete. Thanks again, as always, for your valuable time in helping us solve this!

PRO
USA
#254  

The ROS framework uses a URDF (XML) description to describe a robot. That description is useful for identifying the physical robot boundaries used in navigation.

Using a description model, you separate the application from the robot.

Maybe Unity allows a "model-plugin-component" or some other method to describe the model outside of the Unity project.

Regarding the JD (Humanoid):

  1. How to improve the walk?

Per leg:

+1 servo between the ankle (Z rot) and the knee (X rot); the new servo becomes the ankle (X rot).

+1 servo between the hip (X rot) and the body; the new servo becomes the hip (Z rot). DJ created an STL to attach two rotation servos, and it is available for download.

+1 (optional?) hip servo (Y rot); this one should go inside the body, with a new adapter to the previous hip servo.

I believe both the Nao and the Bioloid have those extra servos. Having them will allow more fluid movements.

  2. Unity

We don't have an ARC kinematics API; everything is done through scripting/Auto Position.

This is not an easy task.

The ROS framework provides MoveIt; the calculations are heavy and slow, but once you wire everything up, you don't have to script the movements.

Can Unity with plugins help in these areas?

2.1) Avoiding collisions between body parts (you have servo limits, but they are not enough)

2.2) Restricting the movements when mimicking human poses, e.g. the elbow bending backwards, etc.

2.3) Moving the gripper to a specific position and calculating all the other joints to achieve the final goal?

PRO
USA
#255  

Quote:

1) How to improve the walk?

  1. I agree; for a better "walk" experience there are missing joints that exist in the human form. More joints will help. There are other ways to create a walk cycle, but it would take trial and error to get it working correctly. There need to be constraints for the feet so you can lock them flat to the ground, a component missing from Bio IK. Most other rigging systems have this. As I said before, Bio IK may not have been the best choice for a full IK system. That's why I purchased 3D Puppet, which is more like a conventional IK system. I just never got far enough to test with it. A workaround is to add more goals for the knees.

  2. Agreed, and this is why I have been trying to get a visual UI representation for animation for years. FK and IK are simply there to aid the animation process. Creating an animation lets you use the IK rig to pose your character and create keyframes. You can (as we are doing now) create a real-time floppy avatar to play with, but to me this is NOT where the power of Unity comes from. The leverage is in using IK to build animations. You can create buttons and sliders to control and blend animations using a state machine called Mecanim, which lets you transition from one animation to the next, e.g. stand idle > sit down > stand up > do a push-up, etc. (see the Mecanim sketch after this list).

2.1) Yes, every part of an object has a collision body already associated with it, meaning that if you use the physics engine built into Unity, you can keep parts from intersecting one another in the animation window (see the collision sketch after this list). These can be adjusted until there are no collisions on JD. Keep in mind that the more complex a system is, the slower it will run.

2.2) When I first set up the rig on JD, I set all the joint limits in Bio IK, so there was no way to bend an elbow backwards. Watch my original test videos; the elbow does not go backwards. These limits can all be set in Bio IK.

2.3) If you watch Sebastian's videos, he shows a humanoid robot moving its waist to try to reach the goals. JD has no waist, so his body must be the "root" when creating the IK rig. In a human body you have a "root" AND a waist joint. JD was not created this way; that is a limitation of the actual physical robot, not of my rigging or of Bio IK.
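
Re: point 2, a rough sketch of driving Mecanim from UI controls (the Animator calls are standard Unity API; the trigger and parameter names are made up for illustration):

using UnityEngine;

// Attach to the JD avatar; wire the public methods to UI buttons and sliders.
public class JdAnimationController : MonoBehaviour
{
    private Animator animator; // the Mecanim state machine lives on this component

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // UI Button: fire a transition, e.g. Idle -> SitDown -> StandUp
    public void PlaySitDown()
    {
        animator.SetTrigger("SitDown"); // hypothetical trigger name
    }

    // UI Slider: blend between animations through a blend tree
    public void SetWalkBlend(float value)
    {
        animator.SetFloat("WalkBlend", value); // hypothetical parameter name
    }
}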
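
And re: 2.1, a minimal self-collision check using those collision bodies (assuming each part already has a Collider; the component and message names are hypothetical):

using UnityEngine;

// Attach to each JD body part that has a Collider (at least one part also
// needs a Rigidbody for Unity to report collisions). Logs self-intersections.
public class SelfCollisionWarning : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        // Only report collisions within the robot's own hierarchy.
        if (collision.transform.root == transform.root)
            Debug.LogWarning(name + " collided with " + collision.gameObject.name);
    }
}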

#256  

First of all, I think we should not let ourselves be weighed down by JD's DOF restrictions; we should focus on the main goal, which is creating a virtual JD so that everyone who owns a JD can use Unity to move him as well as the IK system allows!

The second task is to explore what Unity can do for us. Having a virtual environment with a representation of our real robot should let us describe every possible movement and transfer it to our robots in real time or as playback!
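
A minimal sketch of that real-time transfer, assuming a connected EZB instance as in the option-1 sketch above (the joint-to-port pairing and angle ranges are placeholders):

using UnityEngine;
using EZ_B;

// Attach to a joint of the virtual JD; streams its rotation to one servo.
public class JointStreamer : MonoBehaviour
{
    public EZB ezb;                                   // shared, already-connected instance
    public Servo.ServoPortEnum port = Servo.ServoPortEnum.D0;

    void Update()
    {
        // Read the joint's local Z rotation (0..360) and fold it to -180..180.
        float angle = transform.localEulerAngles.z;
        if (angle > 180f) angle -= 360f;

        // Map -90..90 degrees onto servo positions 1..180 (placeholder range).
        float t = Mathf.InverseLerp(-90f, 90f, angle);
        int position = Mathf.RoundToInt(Mathf.Lerp(1f, 180f, t));

        // In practice you would throttle this rather than send every frame.
        ezb.Servo.SetServoPosition(port, position);
    }
}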

Let's not forget, JD is a testing platform for all that is yet to come! Once we have him set up, we can use JD as common ground to explore all the possibilities that Unity will offer in the future!

And yes, Unity offers a lot more than just dragging IK goals around the viewport! I will make an example scene ASAP!

And if you ask me, option one is fine! It is the easiest setup for all of us... Unity to EZ-B, as easy as it can be!