
JD Humanoid Robot Controlled By Kinect
This is a video of JD being controlled by a Kinect; I still need to fix some discrepancies.
Only the head, arms, and grippers are being controlled.
I have the legs ready, but I need to figure out a way to secure JD when he loses his balance; it's very difficult to keep him safe if you make a wrong move.
Fishing lines? Any other ideas?
Also, is there a specific type of Kinect required? Can I use the one that came with my Xbox, or do I need a developer version, etc.?
I'm working on the calculations. Currently every joint is calculated independently, but I believe that when you see the skeleton as a whole you can infer/adjust the joints to minimize the errors.
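To make the per-joint approach concrete: each angle is just the angle between two bone vectors, arccos(u·v / (|u||v|)). A minimal sketch with the Kinect V2 C# SDK (illustrative only, not the actual project code):

```csharp
using System;
using Microsoft.Kinect; // Kinect for Windows SDK 2.0

static class JointMath
{
    // Elbow flexion: the angle at the elbow between the upper arm
    // (elbow -> shoulder) and the forearm (elbow -> wrist).
    public static double ElbowAngleDegrees(Body body)
    {
        CameraSpacePoint s = body.Joints[JointType.ShoulderRight].Position;
        CameraSpacePoint e = body.Joints[JointType.ElbowRight].Position;
        CameraSpacePoint w = body.Joints[JointType.WristRight].Position;

        // Vectors from the elbow toward the two neighboring joints.
        double ux = s.X - e.X, uy = s.Y - e.Y, uz = s.Z - e.Z;
        double vx = w.X - e.X, vy = w.Y - e.Y, vz = w.Z - e.Z;

        double dot = ux * vx + uy * vy + uz * vz;
        double mag = Math.Sqrt(ux * ux + uy * uy + uz * uz) *
                     Math.Sqrt(vx * vx + vy * vy + vz * vz);
        if (mag < 1e-9) return 0; // degenerate pose, e.g. joints not tracked

        return Math.Acos(dot / mag) * 180.0 / Math.PI; // radians -> degrees
    }
}
```

Treating the skeleton as a whole would mean filtering or constraining these angles together (e.g. enforcing consistent bone lengths) instead of one joint at a time.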
I have two kinects:
Kinect V1 Xbox 360 (game console)
Kinect V2 Xbox One (windows KFW version)
The work is being done with the Kinect V2; its API has more features, although the requirements are significantly higher:
a genuine, exclusive USB 3.0 channel
Core i5 minimum (I plan to move to a Core i7 to explore more)
Kinect V1 and V2 have different APIs.
I believe it's possible to obtain similar results with the V1 (though less than with the V2).
Previously there were two versions of the Kinect: console and Windows. Last year Microsoft abandoned Kinect for Windows (KFW), so now there is only one Kinect version, and you will need to buy an adapter to use it with a PC.
I would like to hear ideas for exploring the Kinect interaction via ARC.
Alan
Simply make another script that reads back the file and sets the positions.
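Here's that idea sketched in C# with the EZ-B SDK instead of an ARC script (signatures from memory, so check the SDK docs; the one-line-per-move recording format is hypothetical):

```csharp
using System;
using System.IO;
using System.Threading;
using EZ_B; // EZ-B C# SDK (EZ_B.dll)

class Playback
{
    static void Main()
    {
        var ezb = new EZB();
        ezb.Connect("192.168.1.1"); // EZ-B v4 default address

        // Hypothetical recording format: one "delayMs,port,position" row per
        // line, e.g. "250,D0,90".
        foreach (string line in File.ReadLines("recording.csv"))
        {
            string[] parts = line.Split(',');
            Thread.Sleep(int.Parse(parts[0])); // keep the recorded timing
            var port = (Servo.ServoPortEnum)Enum.Parse(
                typeof(Servo.ServoPortEnum), parts[1]);
            ezb.Servo.SetServoPosition(port, int.Parse(parts[2]));
        }

        ezb.Disconnect();
    }
}
```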
It's funny, everything started because I was lazy and wanted to create JD movement actions.
I'll create another video to demo the record & playback.
That would allow for simple visual editing of the routine after it is recorded.
Alan
Happy Memorial Day.
@JD, you are always on
@Will,
Yes, and I've also been working on a data visualizer; I hope to have news soon.
Any progress on making this a plugin? I can assist if you'd like.
I still have my Xbox 360 Kinect that I used to record motion capture within my 3D application... can't wait to try this on my robot!
I was really excited when I managed to use my Kinect for tracking, but the downside is that you cannot really create motion capture data off it... there is a lot of manual cleaning to be done to get a good animation!
I am really surprised at how well this works within ARC; supercool stuff going on within this forum lately!
Or do you have a totally different setup?
Would you mind sharing?
Thanks, merne
Thanks.
The other thread's setup is based on the book:
https://www.amazon.com/Arduino-Kinect-Projects-Design-Technology/dp/1430241675
Chapter 6
Kinect-networked Puppet. Play with a physical puppet remotely using your whole body.
Original book sample:
1) It is based on Kinect Xbox 360 hardware (deprecated).
2) The core logic is written in Processing (https://processing.org/), which runs on Windows/Linux/macOS (Java code).
3) The Processing code uses SimpleOpenNI, a Processing wrapper for the OpenNI framework.
4) The OpenNI v1 framework was created by PrimeSense to handle NUI sensors (Kinect, Asus, and others); PrimeSense was a Microsoft partner supporting the Kinect Xbox 360 and an Asus partner supporting the Asus Xtion Pro Live sensor. Apple bought PrimeSense and shut down all "open" initiatives, so OpenNI v1 is gone.
5) Processing communicates with an Arduino via USB/serial communication.
6) The Arduino code receives serial commands (i.e. servo positions) and controls the servos (see the sketch after this list).
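The book implements steps 5-6 in Processing and Arduino C; just to show the shape of that serial link in one language, here is the same idea in C# (the two-byte "id + position" framing is a hypothetical example, not the book's exact format):

```csharp
using System.IO.Ports; // .NET SerialPort

class ServoSender
{
    static void Main()
    {
        using (var port = new SerialPort("COM3", 9600))
        {
            port.Open();
            SendServo(port, servoId: 0, positionDeg: 90); // center servo 0
        }
    }

    static void SendServo(SerialPort port, byte servoId, byte positionDeg)
    {
        // One frame per command: which servo, then where to move it.
        // The receiver (the Arduino in the book) parses this and drives the servo.
        port.Write(new[] { servoId, positionDeg }, 0, 2);
    }
}
```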
EZB Version (Post)
1-4) Same
5) Processing communicates with ARC via HTTP + the HTTP Server control (see the sketch after this list)
6) ARC + EZB + Servos
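Step 5 boils down to an HTTP GET. A minimal C# sketch; note that the Exec-style endpoint and parameters here are illustrative and depend on how the HTTP Server control is configured, so check its documentation before relying on them:

```csharp
using System;
using System.Net;

class ArcHttpClient
{
    static void Main()
    {
        // Illustrative URL only -- adjust host, port, password, and endpoint
        // to match your HTTP Server control's settings.
        string script = Uri.EscapeDataString("Servo(D0, 90)");
        string url = "http://127.0.0.1:80/Exec?password=admin&script=" + script;

        using (var client = new WebClient())
        {
            string response = client.DownloadString(url); // runs the EZ-Script command
            Console.WriteLine(response);
        }
    }
}
```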
My version:
1) Microsoft Kinect for Xbox One (V2)
2) The core logic is written in C# and C++ (Microsoft Visual Studio)
3-4) Microsoft Kinect V2 SDK (see the sketch below)
5) SDK + EZB or ARC skill plugin + EZB
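For context, steps 3-4 amount to opening the sensor and polling body frames; a bare-bones sketch with the Kinect V2 SDK (not the actual plugin code):

```csharp
using System;
using Microsoft.Kinect; // Kinect for Windows SDK 2.0

class BodyFeed
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        sensor.Open();

        Body[] bodies = new Body[sensor.BodyFrameSource.BodyCount]; // up to 6
        using (BodyFrameReader reader = sensor.BodyFrameSource.OpenReader())
        {
            while (!Console.KeyAvailable)
            {
                using (BodyFrame frame = reader.AcquireLatestFrame())
                {
                    if (frame == null) continue; // no new frame yet
                    frame.GetAndRefreshBodyData(bodies);
                }

                foreach (Body body in bodies)
                {
                    if (body == null || !body.IsTracked) continue;
                    // This is where the joint angles are computed and pushed
                    // to the servos (step 5).
                    CameraSpacePoint head = body.Joints[JointType.Head].Position;
                    Console.WriteLine("Head: {0:F2} {1:F2} {2:F2} m",
                                      head.X, head.Y, head.Z);
                }
            }
        }
        sensor.Close();
    }
}
```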
I'm on vacation and got delayed with "real" work, but it's not forgotten...
I plan to release an ARC skill plugin to control, record, and play Kinect body feeds.
The Kinect is well known and used with the ROS framework for navigation; it can be used to simulate a 2D lidar or for 3D point cloud navigation.
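The 2D lidar trick is essentially taking one row of the depth image as a scan line; in ROS this is what packages like depthimage_to_laserscan do (they also correct for the camera projection). In miniature, with the Kinect V2 C# SDK:

```csharp
using System;
using Microsoft.Kinect; // Kinect for Windows SDK 2.0

class PseudoLidar
{
    // One horizontal row of the 512x424 depth image -- roughly a 2D lidar scan.
    static ushort[] MiddleRow(DepthFrameReader reader, FrameDescription desc)
    {
        var depth = new ushort[desc.Width * desc.Height];

        DepthFrame frame = null;
        while (frame == null) frame = reader.AcquireLatestFrame(); // sketch: busy-wait
        using (frame) frame.CopyFrameDataToArray(depth);

        var row = new ushort[desc.Width];
        Array.Copy(depth, (desc.Height / 2) * desc.Width, row, 0, desc.Width);
        return row; // distances in millimeters; 0 means "no reading"
    }

    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        sensor.Open();
        using (var reader = sensor.DepthFrameSource.OpenReader())
        {
            ushort[] scan = MiddleRow(reader, sensor.DepthFrameSource.FrameDescription);
            Console.WriteLine("Center distance: {0} mm", scan[scan.Length / 2]);
        }
        sensor.Close();
    }
}
```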
Currently I'm using an Xbox 360 Kinect for navigation on a ROS robot (Linux), and I'm researching the Xbox One version on Windows.
There are other similar sensors, Intel RealSense and Project Tango (Android); I have all of them, and the common base is the 3D point cloud data.
It's possible to write similar capabilities on Windows, but navigation and localization are much more than handling sensor data.
It takes time, money, and human resources to do that; Linux is the choice for open-source initiatives.
I'm using the Kinect V2 as the reference: more joints, and the API is more recent.
I will add support for the Kinect V1, with fewer joints and different specs.
I can borrow an Asus Xtion and use the defunct OpenNI framework; unfortunately, as with the Xbox Kinect, don't expect driver updates.
But it's doable.
From this post, http://www.ez-robot.com/Community/Forum/Thread?threadId=7642, that you wrote about in this thread:
I have the Xbox 360 Kinect and found SimpleOpenNI on my thumb drive; I have installed these two from the other post:
Microsoft Kinect SDK - http://go.microsoft.com/fwlink/?LinkID=323588
Processing Windows 32-bit - https://www.processing.org/download/
I have installed them, and the Processing window is reading the Kinect, showing my body in the picture with the sample EZB project, but I have not tried to use any servos yet.
Would my Xbox 360 Kinect be compatible with the plugin you are considering creating for this community?
BTW, I have just purchased the book with the Kinect-networked puppet from your link, thank you. And thank you for your response(s).
Also, you have to change the virtual ports in the Processing code and in the EZB project to a digital port for it to work.
@Anybody,
Does anyone know how to set this up so that if you are not in front of the Kinect, and someone or something else is moving in front of it, the robot would not move? Does it need a variable? Hope this makes sense.
Thanks, Merne
*Edit* I forgot to say I don't mean to hijack ptp's thread.
I'd like to know how to get the data live into ARC.
I developed a custom ARC skill plugin.
Regarding the project, it's not stopped; it's part of a bigger framework for interacting with a custom robot.
I'm breaking the "idea" down into smaller components; that way I can release and ask for feedback.
I noticed in another thread you need help integrating face sensor data; maybe I can help with that specifically.
I'll drop you an email.
We are also working on sending motion data to the EZ-B in another thread...
@fxrtst and I would be interested in streaming and processing live motion data!
I guess we could just read the servo positions out of ARC and send them to another application on arrival.
But if you are getting them into ARC using the HTTP server, we could also use our application as middleware.
This would let us record motion for later processing, or tweak the incoming data to create better motion for each individual robot model while streaming the data live!
Does this sound like something that could work out, or is this too much cross-wiring between applications?
If I understood well, the other thread's objective is to control ARC via 3ds Max, and you can do that via the HTTP protocol + ARC's HTTP Server control.
If you need to tweak or synchronize both sides, the master side is the 3ds tool, and the ARC hardware is the mirror.
It seems you are on the right track; you don't need any custom tool or development, you only need some 3ds MAXScripts and ARC.
Where do you see common paths?
I can explain which components I'm working on, but the focus of this post is to understand the common areas, if they exist.
A person (@JC) interested in the idea donated an Asus Xtion Pro to help extend the support to the Asus Xtion.
Currently the Kinect V2 and RealSense are active, but only the Kinect V2 provides body tracking data.
I got the Xtion camera last Friday, and after some tests the device works on Windows 10; I got some positive results using the native API.
@JC, thanks for the device. As I told you before, the chosen hardware is the KV2; as soon as the first components are released, I will start working on the older hardware (KV1 and Xtion).
Everything you mentioned is correct!
Correct me if I am wrong!
One thing that could be done is to export the live motion out of ARC through the HTTP protocol and record the motion within 3ds Max for further processing and playback. It could be as easy as using the GetServo( Port ) command and storing the value in a variable to read out after the motion has been recorded!
The other thing that could be done is to stream the motion directly from the Kinect through the HTTP server into 3ds Max, where it would get processed and passed on to ARC; more joints could be added, and motion could be retargeted or smoothed in this process... as we are working on live control and on playing back animations correctly timed, both options would be interesting!
@fxrtst brought up the idea of live motion capture, and we wanted to check whether our projects have common ground, which I believe is the HTTP protocol!
Let's clear up some misunderstandings. HTTP is a stateless client-server protocol: one side is the server, answering on an IP/port, and the other side sends a command and receives the response.
A related concept is "HTTP streaming". HTTP is not well suited to streaming; it is a very versatile and always-available protocol, and there are some uses like MPEG streams, but it is not the best option.
ARC does not generate data feeds (the exception is the EZB camera).
If you are talking about the servos, the servo positions come from scripts or Auto Position actions, so trying to "stream" the servo positions does not make sense; you can easily replicate those scripts (servo A to position X, servo B to position Y) in the 3D tool.
If you are talking about the ARC HTTP server, it can't be used for this; the ARC HTTP server exists to allow an external client to control some ARC features, e.g. servos.
If the idea is to create a custom application acting as an HTTP server to provide a stream of the sensor data, and to use that stream within 3ds Max, I don't think that is possible: there is no standard for an HTTP stream client, and MAXScript has the usual limitations; it is not a full framework where you can build components or even modules.
So the question is: how can a 3D tool access sensor data?
First, this is outside ARC's scope; there's nothing there that can help. Two options:
1) Existing tools, e.g. Faceshift, capture sensor data; the data is analyzed (for example, face landmarks), some "black magic" is done, and then it is exported to well-known landmark formats that you can use later.
2) 3D tool plugins (C++): each 3D tool has a different plugin architecture, so it's not easy to build one plugin that fits all the tools... When you build a plugin you can access the 3D tool's API, and you can communicate at a low level, via network or hardware, with an application or sensor. So the limits are your knowledge and possibly the 3D tool's API limitations.
Coming back to your scenario, you have a different need: you are in the tool creating a virtual robot, and replicating the motion on the hardware side.
You have 2 options:
1) You control ARC, and both worlds are in sync (via the ARC HTTP Server control).
2) You export the servo positions and the timeline to a file, and later play them back via an ARC script.
Do you agree?
It could totally be true that I misunderstand what an HTTP server is built for; I only have a vague idea of what it does and how it is used...
I used a similar setup a while ago to get motion into another 3D package, where it was, as you said, converted into another file format to be processed.
I guess my conception was wrong in thinking that the HTTP server could also be used to communicate this data to 3ds Max!
And using your plugin to record motion live in EZ-Builder and clean it up in the animation package later would probably be the best way to handle this!
Thanks for letting me know what's possible and what is out of the question here... the last two options you stated are exactly what I am aiming for!
I guess I just got too excited about things and thought a few steps too far ahead!
This is an awesome plugin. I am going to see if I can add this to Max. As always, you impress me with your creativity and know-how.
The plugin supports 3 devices: Kinect Xbox, Asus Xtion, Kinect One.
The plugin allows custom angle calculations, publishes the calculations to EZ-Script, and controls servos.
It is very difficult to find a formula (e.g. an angle calculation) that works for all devices, body shapes, and other environment variables.
So I took an extra step and added an expression editor to allow different formulas or adjustments.
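To give a feel for the kind of adjustment such an editor enables, here is a hypothetical example in C#; the names and syntax are illustrative, not the plugin's own:

```csharp
using System;

static class AngleMap
{
    // scale/offset let each user calibrate for their body and device;
    // the clamp keeps the servo inside its mechanical limits.
    public static int AngleToServo(double angleDeg, double scale, double offset,
                                   int min, int max)
    {
        double pos = angleDeg * scale + offset;
        return (int)Math.Round(Math.Max(min, Math.Min(max, pos)));
    }
}
```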
Notes:
1) Kinect
The drivers are easy to find and it's easy to set up.
2) Asus Xtion
PrimeSense is gone, so special instructions are needed to set up the latest drivers (OpenNI2/NiTE2). I had a blog post with the details; unfortunately the blog is down (it needs a new provider and some TLC).
@Richard R, if you have some time and want to help with the tests, drop me an email (see my user profile) and I will provide the details to set up the Xtion.
Almost bed time ...
I'll post some screenshots to reserve some slots.
Last post
Nice work!
@RR:
Don't worry, I'll do a few more improvements.
I'll do a few more tests to catch the last bugs, and I'll release the plugin during the weekend.
Sorry for the delay, but the last few weeks I've been ultra busy.
Plugin released!
There wasn't enough time to write up all the plugin details; I think I provided the minimum needed to use the plugin.
@Richard let me know if it works.
@DJ:
A lot of new plugins; it's a shame I don't have enough time. Thanks for sharing.
Also, that roll/pitch/yaw visualizer... Euler angles and all this rotation matrix stuff never stopped driving me nuts, hahaha!
Outstanding work!
Is there a way to start and stop this plugin from a script?
The DepthSensor plugin works fine, but in some situations the plugin stops working.
I had to exit ARC and start it again, or delete the plugin from the project and add it again, to continue working.
I tried it again and again, and I think when I'm very close to the sensor (50 cm?) and then move to the right, out of the sensor's view, it freezes.
Sorry for my bad English; I hope you understand what I mean.
Bye Hansi
Thanks Peter
Wow. You are fast. Thanks.
Tried it, but now I cannot choose my Xbox Kinect, only Neuronmocap.
JD's plugin still recognizes the Kinect.