USA

Jd Humanoid Robot Controlled By Kinect

This is a video of JD being controlled by a Kinect; I still need to fix some discrepancies.

Only the head, arms and grippers are being controlled.

I have the legs ready, but I need to figure out a way to secure JD when he loses his equilibrium; it's very difficult to keep him safe if you make a wrong move.

Fishing lines? Any other ideas?



#33  

@Will,

I developed a custom ARC skill plugin.

Regarding the project, it's not stopped; it's part of a bigger framework to interact with a custom robot.

I'm breaking down the "idea" into smaller components; that way I can release and ask for feedback.

I noticed in another thread that you need help integrating face sensor data; maybe I can help with that specifically.

I'll drop you an email.

#34  

@ptp We are also working on sending motion data to the EZ-B in another thread... @fxrtst and I would be interested in streaming and processing live motion data! I guess we could just read the servo positions out of ARC and send them to another application as they arrive. But if you are getting them into ARC using the HTTP server, we could also use our application as middleware. This would let us record motion for later processing, or tweak the incoming data to create better motion for each individual robot model while streaming it live!

Does this sound like something that could work out, or is this too much cross-wiring between applications? :)

#35  

@Mickey,

If I understood well, the objective of the other thread is to control ARC via 3ds Max, and you can do that via the HTTP protocol + ARC's HTTP control.

If you need to tweak or synchronize both sides, the master side is the 3ds tool, and the ARC hardware is the mirror.

It seems you are on the right track; you don't need any custom tool/development, you only need some 3ds maxscripts and ARC.
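As a rough sketch of that idea: the 3ds tool just issues HTTP requests against ARC's HTTP server control, one per servo update. The endpoint path and parameter names below are invented for illustration only; check the HTTP server control's own documentation for the real command format.

```python
# Sketch: building the request an external tool would send to ARC's HTTP
# server to move one servo. The "/command" path and the query parameter
# names are assumptions for illustration, NOT the real ARC API.
from urllib.parse import urlencode

def build_servo_command(host: str, port: int, servo_port: str, position: int) -> str:
    """Return the URL that would move one servo (hypothetical endpoint)."""
    query = urlencode({"cmd": "servo", "port": servo_port, "position": position})
    return f"http://{host}:{port}/command?{query}"

# A maxscript frame callback could fire one such request per keyframe:
url = build_servo_command("192.168.1.10", 80, "D0", 90)
print(url)
```

The same pattern works from maxscript directly, since it only needs to format a URL and perform a GET.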

Where do you see common paths?

I can explain what components I'm working on, but the focus of this post is to understand the common areas, if they exist.

#36  

ASUS Topic:

A person (@JC) interested in the idea donated an Asus Xtion PRO to help extend support to the Asus Xtion.

Currently the Kinect V2 and RealSense are active, but only the Kinect V2 provides body-tracking data.

I got the Xtion camera last Friday, and after some tests the device works on Windows 10; I got some positive results using the native API.

@JC, thanks for the device. As I told you before, the chosen hardware is the KV2; as soon as the first components are released, I will start working on the older hardware, the KV1 and the Xtion.

#37  

@ptp... Regarding the Xtion, that is very good news indeed...

#38  

@ptp Everything you mentioned is correct!

Correct me if I am wrong! One thing that could be done is to export the live motion out of ARC through the HTTP protocol and record the motion within 3ds Max for further processing and playback. It could be as easy as using the GetServo( Port ) command and storing the value in a variable to read out after the motion has been recorded!

The other thing that could be done is to stream the motion directly from the Kinect through the HTTP server into 3ds Max, where it would get processed and passed on to ARC; more joints could be added, and the motion could be retargeted or smoothed in this process. As we are working on live control and on playing back animations with correct timing, both options would be interesting!
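The "record now, process later" half of this could be as simple as appending timestamped samples as they are read, then dumping them as CSV for 3ds Max to import. A minimal sketch; all names and sample values here are illustrative, not part of any real ARC API:

```python
# Sketch: each time a servo position is read (e.g. via ARC's GetServo),
# append a timestamped sample; the recording is later dumped as CSV for
# the 3d tool to import. Sample data below is made up for illustration.
from dataclasses import dataclass

@dataclass
class Sample:
    time_ms: int   # milliseconds since recording started
    port: str      # servo port, e.g. "D0"
    position: int  # servo position in degrees

def to_csv(samples: list[Sample]) -> str:
    lines = ["time_ms,port,position"]
    lines += [f"{s.time_ms},{s.port},{s.position}" for s in samples]
    return "\n".join(lines)

recording = [Sample(0, "D0", 90), Sample(33, "D0", 92), Sample(66, "D0", 95)]
print(to_csv(recording))
```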

@fxrtst brought up the idea of live motion capture, and we wanted to check if our projects had common ground, which I believe is the HTTP protocol! :)

#39  

@Mickey,

let's clear up some misunderstandings: HTTP is a stateless client-server protocol. One side answers on an IP/port (the server), and the other side sends a command and receives the response (the client).
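A tiny self-contained sketch of that request/response cycle, using only Python's standard library with a throwaway echo server standing in for ARC: the server answers each request independently, and nothing is pushed to the client between requests, which is why plain HTTP is a poor fit for continuous motion streaming.

```python
# Demo of the stateless request/response pattern: one request in, one
# response out, no ongoing stream. The echo server here is a stand-in,
# not ARC's actual HTTP server.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = f"you asked for {self.path}".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request console logging
        pass

server = HTTPServer(("127.0.0.1", 0), EchoHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: one command, one response, then the exchange is over.
reply = urlopen(f"http://127.0.0.1:{server.server_port}/servo?pos=90").read().decode()
print(reply)
server.shutdown()
```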

Quote:

One thing that could be done is to export the live motion out from ARC thru the HTTP protocol and record the motion within 3ds max for further processing and playback.

A similar concept is "HTTP streaming". HTTP is not well suited to streaming; it is a very versatile and always-available protocol, and there are some uses like MPEG streaming, but it is not the best option.

ARC does not generate data feeds (the exception is the EZ-B camera).

If you are talking about the servos: the servo positions come from scripts or Auto Position actions, so trying to "stream" the servo positions does not make sense; you can easily replicate those scripts (servo A pos X, servo B pos Y) on the 3d tool.

Quote:

The other thing that could be done is to stream the motion directly from the Kinect thru the HTTP server into 3ds max

If you are talking about the ARC HTTP server, it can't be used for that; the ARC HTTP server exists to allow an external client to control some ARC features, e.g. servos.

If the idea is to create a custom application that acts as an HTTP server providing a stream of the sensor data, and then use that stream within 3ds Max: I don't think that is possible. There is no standard for an HTTP stream client, plus maxscript has the usual limitations; it is not a full framework where you can build components or even modules.

So the question is: how can a 3d tool access sensor data?

First, this is outside ARC's scope; there's nothing there that can help. Two options:

  1. Existing tools, e.g. Faceshift, capture sensor data; the data is analyzed (for example, face landmarks), some "black magic" is done, and the result is exported to well-known landmark formats you can use later.

  2. 3d tool plugins (C++): each 3d tool has a different plugin architecture, so it's not easy to build one plugin that fits all tools. When you build a plugin you can access the 3d tool's API, and you can communicate at a low level, via network or hardware, with an application or sensor. So the limits are your knowledge and possibly the 3d tool's API limitations.

Coming back to your scenario: you have a different need. You are in the tool creating a virtual robot, and replicating the motion on the hardware side.

You have 2 options:

  1. You control ARC, and both worlds stay in sync (via the ARC HTTP control).
  2. You export the servo positions and the timeline to a file, and later play it back via an ARC script.
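Option 2 can be sketched like this: the exported timeline is a list of (time, port, position) keyframes, and playback walks them in order, waiting between keyframes. In ARC the playback would be a script issuing servo commands; `move_servo` below is a hypothetical stand-in, not a real ARC call.

```python
# Sketch of file-based playback: walk an exported timeline in time order,
# sleeping until each keyframe is due, then moving the servo. The timeline
# data and the move_servo callback are illustrative stand-ins.
import time

timeline = [  # (seconds from start, servo port, position) -- sample data
    (0.0, "D0", 90),
    (0.5, "D1", 45),
    (1.0, "D0", 120),
]

def play(timeline, move_servo, sleep=time.sleep):
    elapsed = 0.0
    for t, port, pos in sorted(timeline):
        sleep(t - elapsed)      # wait until this keyframe is due
        move_servo(port, pos)
        elapsed = t

# Dry run: collect the moves instead of driving hardware, skip real sleeps.
log = []
play(timeline, lambda port, pos: log.append((port, pos)), sleep=lambda s: None)
print(log)
```

Injecting `sleep` as a parameter keeps the sketch testable without waiting out the real timeline.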

Do you agree ?

#40  

@ptp It could be totally true that I misunderstand what an HTTP server is built for; I only have a vague idea of what it does and how it is used... I used a similar setup a while ago to get motion into another 3d package, where, as you said, it was converted into another file format to be processed.

I guess my conception of

Quote:

5) Processing communicates with ARC via HTTP + HTTP Server Control 6) ARC + EZB + Servos
was wrong in thinking that the HTTP server could also be used to communicate this data to 3ds Max!

And using your plugin to record motion live in EZ-Builder and clean it up in the animation package later would probably be the best way to handle this! :)

Quote:

If you are talking about the servos, the servos positions are scripted or auto-positions actions, so trying to "stream" the servo positions does not make sense, you can easily replicate those scripts (Servo A pos X, servo B pos Y) on the 3d tool.

Thanks for letting me know what's possible and what is out of the question here... The last two options you stated are exactly what I am aiming for! I guess I just got too excited about things and thought a few steps too far ahead! :)