USA
Asked — Edited

JD Humanoid Robot Controlled by Kinect

This is a video of JD being controlled by a Kinect. I still need to fix some discrepancies.

Only the head, arms, and grippers are being controlled.

I have the legs ready, but I need to figure out a way to secure JD when he loses his balance; it's very difficult to keep him safe if you make a wrong move.

Fishing lines? Any more ideas?



PRO
Synthiam
#3  

Woo! Nice:) Will this be an ARC skill plugin?

Also, is there a specific type of Kinect required? Can I use the one that came with my Xbox, or do I need a developer version, etc.?

PRO
USA
#4  

I would like to extend the control to the legs. I think I will add two screws to JD's top box near the shoulders and run fishing line to a vertical frame. If JD loses his balance, the lines hold him up. It seems an ugly hack until a better idea comes along.

I'm working on the calculations. Currently every joint is calculated independently, but I believe when you see the skeleton as a whole you can infer/adjust the joints to minimize the errors.

I have two Kinects: the Kinect V1 (Xbox 360, game console version) and the Kinect V2 (Xbox One, Windows KFW version).

The work is being done with the Kinect V2; its API has more features, although the requirements are significantly higher: a genuine, exclusive USB 3.0 channel and at least a Core i5 (I plan to move to a Core i7 to explore more).

The Kinect V1 and V2 have different APIs.

I believe it's possible to obtain similar results with the V1 (though less than with the V2).

Before, there were two versions of the Kinect: console and Windows. Last year Microsoft abandoned KFW (Kinect for Windows), so now there is only one Kinect version, and you will need to buy an adapter to use it with a PC.

I would like to hear ideas for exploring Kinect interaction via ARC.

#5  

For stability, you can use the orientation sensor to detect that a fall is occurring, and have some auto-positions for restoring to a safe orientation, as well as standing back up if it does fall. A harness to prevent falls in the first place would certainly be useful while developing those routines.
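
For example, a minimal EZ-Script sketch of that idea; the $pitch variable, the thresholds, and the "Stand Up" action name are hypothetical placeholders for whatever your orientation skill and Auto Position project actually provide:

    # Watch a hypothetical orientation variable and trigger a recovery
    # action when a fall is detected (names/thresholds are examples only).
    :watch
      IF ($pitch > 45 OR $pitch < -45)
        ControlCommand("Auto Position", AutoPositionAction, "Stand Up")
        Sleep(5000)  # give the recovery action time to finish
      ENDIF
      Sleep(100)
    Goto(watch)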

Alan

#6  

Impressed! Great job! I look forward to messing with this sometime.

PRO
USA
#7  

@PTP, is it possible to record these actions live?

PRO
USA
#8  

Can you explain what you mean?

PRO
USA
#9  

As the video shows, you're making motions live in front of the Kinect, and the servos move to follow your motions live. Would it be possible to record your live performance and play back that motion (not live)?

PRO
Synthiam
#10  

You could write a short EZ-Script that loops every 100 ms and saves each servo position to a file, separating each position with a comma.

Simply make another script that reads back the file and sets the positions.
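
A minimal sketch of those two scripts, assuming servos on D0/D1 and the standard EZ-Script file functions (FileWriteLine, FileReadLine, FileReadEnd, FileReadClose, and Split with 0-based indexing; verify the exact names against your ARC version):

    # Recorder: append the current D0/D1 positions to a file every 100 ms.
    # Stop the script manually when the performance is over.
    $file = "c:\temp\motion.txt"
    IF (FileExists($file))
      FileDelete($file)
    ENDIF
    :record
      FileWriteLine($file, GetServo(D0) + "," + GetServo(D1))
      Sleep(100)
    Goto(record)

And the playback side:

    # Playback: read the file back and replay the positions at the same rate.
    $file = "c:\temp\motion.txt"
    :play
      IF (FileReadEnd($file))
        FileReadClose($file)
        Halt()
      ENDIF
      $line = FileReadLine($file)
      Servo(D0, Split($line, ",", 0))
      Servo(D1, Split($line, ",", 1))
      Sleep(100)
    Goto(play)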

PRO
USA
#11  

Well, this would be amazing. There are about 101 new uses if someone can make this happen! It would extend the uses for the EZB!

PRO
USA
#12  

Will,

It's funny; everything started because I was lazy and wanted to create JD movement actions.

I'll create another video to demo the record & playback.

#13  

Crazy idea that is probably more work than it is worth, but an EZB project is essentially an XML file. Could we write the file of servo positions and timings as a set of Auto Position frames and actions that could be imported into an EZB project?

That would allow for simple visual editing of the routine after it is recorded.

Alan

PRO
USA
#14  

Ha! I've been throwing this idea around in my head for some time. I seriously have 101 really, really cool ideas for using this. This is basically motion capture.

PRO
USA
#15  

Any luck with recording and playback of motions?

PRO
USA
#17  

Hey!

Happy Memorial Day.

@JD, you are always on:)

@Will,

Yes. Also, I've been working on a data visualizer; I hope to have news soon.

PRO
USA
#18  

DJ! You are my hero! I'll take a look later this week. This opens up a lot of really cool uses for animatronics.

PRO
Synthiam
#20  

Thanks ptp:). I'll publish the source code when it's done, as usual. I use the EZB.Servo.OnServoMove event and record each servo movement and the elapsed millisecond timing. It's pretty straightforward.

Any progress on making this a plugin? I can assist if you'd like.

PRO
USA
#21  

You guys are great! I love this group!

#22  

Good job @ptp

I still have my Xbox 360 Kinect that I used to record motion capture within my 3D application... cannot wait to try this on my robot!

I was really excited when I managed to use my Kinect for tracking, but the downside is that you cannot really create motion capture data off it... there is a lot of manual cleaning to be done to get a good animation!

I am really surprised at how well this works within ARC; supercool stuff going on in this forum lately!:D

Portugal
#24  

Hi ptp, I would love to use the Kinect for mapping and navigation. Is that possible? Thanks.

PRO
USA
#25  

@Merne,

The other thread's setup is based on the book:

https://www.amazon.com/Arduino-Kinect-Projects-Design-Technology/dp/1430241675, Chapter 6, "Kinect-Networked Puppet": play with a physical puppet remotely using your whole body.

Original book sample:

  1. It is based on Kinect Xbox 360 hardware (deprecated).
  2. The core logic is written in Processing (https://processing.org/), which runs on Windows/Linux/macOS (Java code).
  3. The Processing code uses SimpleOpenNI, a Processing wrapper for the OpenNI framework.
  4. The OpenNI v1 framework was created by PrimeSense (Microsoft's partner for the Kinect Xbox 360, and Asus's partner for the Asus Xtion Pro Live sensor) to handle NUI sensors (Kinect, Asus, others). Apple bought PrimeSense and shut down all "open" initiatives, so OpenNI v1 is gone.
  5. Processing communicates with an Arduino via USB/serial.
  6. The Arduino code receives serial commands (i.e., servo positions) and controls the servos.

EZB version (from that post): steps 1-4 are the same; 5) Processing communicates with ARC via HTTP + the HTTP Server control; 6) ARC + EZB + servos.
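
To make step 5 concrete: the HTTP Server control accepts plain GET requests carrying EZ-Script, so the Processing sketch only needs an ordinary HTTP client. A hypothetical request (the Exec endpoint, port, and password are assumptions; check the HTTP Server control's built-in index page for the exact syntax):

    http://127.0.0.1:80/Exec?password=admin&script=Servo(D0,90)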

My version:

  1. It uses Microsoft Kinect for Xbox One (V2) hardware.
  2. The core logic is C# and C++ (Microsoft Visual Studio).
  3-4. Microsoft Kinect V2 SDK.
  5-6. SDK + EZB, or ARC skill plugin + EZB.

I'm on vacation, and I got delayed by "real" work, but it's not forgotten...

I plan to release an ARC skill plugin to control, record, and play Kinect body feeds.

#26  

@ptp.... Will your plugin also work with the Asus Xtion camera? It basically does the same thing as the Microsoft Kinect... I already have the Xtion, so I thought I would ask... Thanks

PRO
USA
#27  

@proteusy:

The Kinect is well known and used with the ROS framework for navigation; it can be used to simulate a 2D lidar or for 3D point cloud navigation.

Currently I'm using an Xbox 360 Kinect for navigation with a ROS robot (Linux), and I'm researching the Xbox One Kinect on Windows.

There are other similar sensors, the Intel RealSense and Project Tango (Android); I have all of them, and the common base is the 3D point cloud data.

It's possible to write similar capabilities on Windows, but navigation and localization is much more than handling sensor data.

It takes time, money, and human resources to do that; Linux is the choice for open-source initiatives.

PRO
USA
#28  

@Richard,

I'm using the Kinect V2 as the reference; it has more joints, and the API is more recent.

I will add support for the Kinect V1, with fewer joints and different specs.

I can borrow an Asus Xtion and use the defunct OpenNI framework; unfortunately, as with the Kinect Xbox, don't expect driver updates.

But it is doable.

PRO
USA
#29  

I'm really looking forward to this. Do you have a timeline? I've recently ordered the Kinect V2 in anticipation of this project.

#30  

@ptp, from this post, https://synthiam.com/Community/Questions/7642, which you wrote about in this thread.

I have the Xbox 360 Kinect and found SimpleOpenNI on my thumb drive. I have installed these two from the other post: Microsoft Kinect SDK - http://go.microsoft.com/fwlink/?LinkID=323588 and Processing Windows 32-bit - https://www.processing.org/download/

I have everything installed, and it is reading the Kinect in the Processing window, showing my body in the picture with the sample EZB project, but I have not tried to use any servos yet.

Would my Xbox 360 be compatible with the plugin you are considering creating for this community?

Btw, I have just purchased the book with the Kinect-networked puppet from your link, thank you. And thank you for your response(s).

#31  

Hello, I set this up using the post I linked in #31, and it does work. Be careful: use a servo not connected to anything. The reason is that when you're running this setup and are not in front of the Kinect, it will move on its own; if the detected skeleton moves, it will move the servo.

Also, you have to change the virtual port numbers in the Processing code and in the EZB project to a digital port for it to work.

@Anybody, does anyone know how to set this up so that if you are not in front of the Kinect and someone or something else moves in front of it, the servo would not move? Does it need a variable? I hope this makes sense.

Thanks, Merne

Edit: I forgot to say I do not mean to hijack ptp's thread.

PRO
USA
#32  

Any progress on this?

I'd like to know how to get the data live into ARC.

PRO
USA
#33  

@Will,

I developed a custom ARC skill plugin.

Regarding the project, it's not stopped; it's part of a bigger framework to interact with a custom robot.

I'm breaking down the "idea" into smaller components; that way I can release and ask for feedback.

I noticed in another thread you need help integrating face sensor data; maybe I can help you with that specifically.

I'll drop you an email.

#34  

@ptp We are also working on sending motion data to the EZ-B in another thread... @fxrtst and I would be interested in streaming and processing live motion data! I guess we could just read the servo positions out of ARC and send them to another application on arrival. But if you are getting them into ARC using the HTTP server, we could also use our application as middleware. This would let us record motion for later processing, or tweak the incoming data to create better motion for each individual robot model while streaming the data live!

Does this sound like something that could work, or is this too much crosswiring between applications?:)

PRO
USA
#35  

@Mickey,

If I understood well, the other thread's objective is to control ARC via 3ds Max, and you can do that via the HTTP protocol + ARC's HTTP Server control.

If you need to tweak or synchronize both sides, the master side is the 3ds tool, and the ARC hardware is the mirror.

It seems you are on the right track; you don't need any custom tool/development, you only need some 3ds MAXScripts and ARC.

Where do you see common paths?

I can explain what components I'm working on, but the focus of this post is to understand the common areas, if they exist.

PRO
USA
#36  

ASUS Topic:

A person (@JC) interested in the idea donated an Asus Xtion Pro to help extend support to the Asus Xtion.

Currently the Kinect V2 and RealSense are active, but only the Kinect V2 provides body tracking data.

I got the Xtion camera last Friday, and after some tests, the device works on Windows 10, and I got some positive results using the native API.

@JC, thanks for the device. As I told you before, the chosen hardware is the KV2; as soon as the first components are released, I will start working on the older hardware (KV1 and Xtion).

#37  

@ptp... Regarding the Xtion, that is very good news indeed...

#38  

@ptp Everything you mentioned is correct!

Correct me if I am wrong! One thing that could be done is to export the live motion out of ARC through the HTTP protocol and record the motion within 3ds Max for further processing and playback. It could be as easy as using the GetServo( Port ) command and storing the value in a variable to read out after the motion has been recorded!

The other thing that could be done is to stream the motion directly from the Kinect through the HTTP server into 3ds Max, where it would get processed and passed on to ARC; more joints could be added, and motion could be retargeted or smoothed in this process... As we are working on live control and on playing back animations correctly timed, both options would be interesting!

@fxrtst brought up the idea of live motion capture, and we wanted to check whether our projects had common ground, which I believe is the HTTP protocol!:)

PRO
USA
#39  

@Mickey,

Let's clear up some misunderstandings: HTTP is a stateless client-server protocol. One side (the server) answers on an IP/port, and the other side (the client) sends commands and receives the responses.

Quote:

One thing that could be done is to export the live motion out of ARC through the HTTP protocol and record the motion within 3ds Max for further processing and playback.

A similar concept is "HTTP streaming." HTTP is not suited to streaming; it is a very versatile and always-available protocol, and there are some uses like MPEG streaming, but it is not the best option.

ARC does not generate data feeds (the exception is the EZB camera).

If you are talking about the servos, the servo positions come from scripts or Auto Position actions, so trying to "stream" the servo positions does not make sense; you can easily replicate those scripts (servo A to position X, servo B to position Y) in the 3D tool.

Quote:

The other thing that could be done is to stream the motion directly from the Kinect through the HTTP server into 3ds Max

If you are talking about the ARC HTTP server, it can't be used for that; the ARC HTTP Server exists to allow an external client to control some ARC features, e.g., servos.

If the idea is to create a custom application that acts as an HTTP server to provide a stream of the sensor data, and then use that stream within 3ds Max, I don't think that is possible: there is no standard for an HTTP stream client, plus 3ds MAXScript has the usual limitations; it is not a full framework where you can build components or even modules.

So the question is: how can a 3D tool access sensor data?

First, it is outside of ARC's scope; there's nothing there that can help. Two options:

  1. Existing tools, e.g., Faceshift, capture sensor data; the data is analyzed (for example, face landmarks), some "black magic" is done, and then it is exported to well-known landmark formats that you can use later.

  2. 3D tool plugins (C++): each 3D tool has a different plugin architecture, so it's not easy to build one plugin to fit all the tools... When you build a plugin you can access the 3D tool's API, and you can communicate at a low level, via network or hardware, with an application or sensor. So the limits are your knowledge and possibly the 3D tool's API limitations.

Coming back to your scenario, you have a different need: you are in the tool creating a virtual robot and replicating the motion on the hardware side.

You have 2 options:

  1. You control ARC, and both worlds stay in sync (via the ARC HTTP Server control).
  2. You export the servo positions and the timeline to a file, and later play it back via an ARC script.

Do you agree?

#40  

@ptp It could totally be true that I misunderstand what an HTTP server is built for; I only have a vague idea of what it does and how it is used... I used a similar setup a while ago to get motion into another 3D package, where, as you said, it was converted into another file format to be processed.

I guess my conception of

Quote:

5) Processing communicates with ARC via HTTP + the HTTP Server control; 6) ARC + EZB + servos
was wrong in thinking that the HTTP server could also be used to communicate this data to 3ds Max!

And using your plugin to record motion live in EZ-Builder and clean it up in the animation package later would probably be the best way to handle this!:)

Quote:

If you are talking about the servos, the servo positions come from scripts or Auto Position actions, so trying to "stream" the servo positions does not make sense; you can easily replicate those scripts (servo A to position X, servo B to position Y) in the 3D tool.

Thanks for letting me know what's possible and what is out of the question here... the last two options that you stated are exactly what I am aiming for! I guess I just got too excited and thought a few steps too far ahead!:)

PRO
USA
#42  

@dj, omg! Thank you. I'm working on a full mocap-to-robot system. Perhaps this will be the foundation for expanding to a lot more axes! Thanks!

#44  

Thanks JD.

This is an awesome plugin. I am going to see if I can add this to Max. As always, you impress me with your creativity and know-how.

PRO
USA
#45  

Almost 9 months later.... The baby is almost out....

The plugin supports three devices: the Kinect Xbox 360, the Asus Xtion, and the Kinect One.

The plugin allows custom angle calculations, publishes the calculations to EZ-Script, and controls servos.

It is very difficult to find a formula (e.g., an angle calculation) that works for all devices, body shapes, and other environmental variables.

I took an extra step and added an expression editor to allow different formulas or adjustments.
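
For context, the usual starting point for a joint angle (e.g., the elbow) is the angle between the two bone vectors meeting at that joint, computed from three tracked joint positions. This is the standard geometry, not necessarily the plugin's exact formula:

    \theta = \arccos\left( \frac{(s - e) \cdot (w - e)}{\lVert s - e \rVert \, \lVert w - e \rVert} \right)

where s, e, and w are the 3D positions of, say, the shoulder, elbow, and wrist from the skeleton feed; \theta is then mapped from radians to the servo's degree range. An expression editor is useful precisely because the raw result usually needs per-device and per-body offsets and scaling.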

Notes:

  1. Kinect: the drivers are easy to find, and it's easy to set up.

  2. Asus Xtion: PrimeSense is gone, so special instructions are necessary to set up the latest drivers (OpenNI2/NiTE2). I had a blog post with the details; unfortunately the blog is down (it needs a new provider and some TLC).

@Richard R, if you have some time and want to help with the tests, drop me an email (via my user profile) and I will provide the details to set up the Xtion.

Almost bed time ...

I'll post some screenshots to reserve some slots.

PRO
USA
#51  

*** Reserved ***


PRO
Synthiam
#52  

Holy - now THAT is a seriously customizable awesome plugin! Kudos, sir. Your Kung fu is boss!

#53  

@ptp.... Awesome, thanks.... I will message you in a day or so....

Nice work!

PRO
USA
#54  

@DJ, @RR: Thanks!

@RR: Don't worry, I'll do a few more improvements.

I'll do a few more tests to catch the last bugs, and I'll release the plugin during the weekend.

PRO
USA
#55  

Guys,

Sorry for the delay, but the last few weeks I've been ultra busy.

Plugin released!

There wasn't enough time to write up all the plugin details, but I think I provided the minimum needed to use the plugin.

@Richard let me know if it works.

@DJ:

A lot of new plugins; it's a shame I don't have enough time. Thanks for sharing.

#56  

Man...this is some serious coding!

Also, that roll/pitch/yaw visualizer... Euler angles and all this rotation matrix stuff never stopped driving me nuts hahahaha!

Outstanding work!:D

#57  

Good work, ptp. Is there a way to start and stop this plugin from a script?

#58  

Great! I tried it, though I would not have been able to write such a plugin myself.

The DepthSensor plugin works fine, but in some situations the plugin stops working. I had to exit ARC and start again, or delete the plugin from the project and add it again, to continue working.

I tried it again and again, and I think when I'm very close to the sensor (50 cm?) and then move to the right, out of the sensor's view, it freezes.

Sorry for my bad English; I hope you understand what I mean.

Bye Hansi

Netherlands
#59  

Beautiful application. Great. One question: how can I lower the 180 degrees? If I try, the prompt says the servo max position cannot be less than the servo min position.

Thanks Peter

PRO
USA
#60  

Quote:

If I try, the prompt says the servo max position cannot be less than the servo min position.
Fixed

Netherlands
#61   — Edited


Wow, you are fast. Thanks. I tried it, but now I cannot choose my Xbox Kinect, only the Neuronmocap. JD's plugin still recognizes the Kinect.