
ptp
JD Humanoid Robot Controlled by Kinect
This is a video of JD being controlled by a Kinect; I still need to fix some discrepancies.
Only the head, arms, and grippers are being controlled.
I have the legs ready, but I need to figure out a way to secure JD when he loses his equilibrium; it's very difficult to keep him safe if you make a wrong move.
Fishing lines? Any more ideas?
@Merne,
The setup in the other thread is based on the book:
https://www.amazon.com/Arduino-Kinect-Projects-Design-Technology/dp/1430241675 - Chapter 6, Kinect Networked Puppet: play with a physical puppet remotely using your whole body.
Original book sample:
EZB version (this post):
1-4) Same
5) Processing communicates with ARC via HTTP + the HTTP Server control (see the sketch below)
6) ARC + EZB + servos
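To make step 5 concrete, here is a minimal Processing sketch in the spirit of the book's setup. The /Exec endpoint, password parameter, port, and Servo() script call are illustrative assumptions, not the documented API; check the HTTP Server control's settings for the exact URL format your ARC version expects.

```java
// Minimal Processing sketch: push a servo position to ARC over HTTP.
void setup() {
  size(200, 200);
  frameRate(20);  // ~20 servo updates per second is plenty
}

void draw() {
  // Stand-in for a Kinect joint angle: map mouse X to 1..180 degrees.
  int angle = (int) map(mouseX, 0, width, 1, 180);
  // ASSUMPTION: the endpoint path, password parameter and Servo() command
  // are illustrative -- verify against your HTTP Server control's settings.
  String url = "http://127.0.0.1:80/Exec?password=admin&script=Servo(D0," + angle + ")";
  loadStrings(url);  // fire-and-forget HTTP GET; the reply is ignored
}
```

Swap the mouse mapping for a joint angle coming from the Kinect code and the rest of the pipeline stays the same.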
My version:
I'm on vacation and got delayed with "real" work, but it's not forgotten...
I plan to release an ARC skill plugin to control, record, and play back Kinect body feeds.
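For the record/playback side, the heart of such a plugin is a list of timestamped joint snapshots replayed with their original timing. A minimal sketch of that data shape (my own illustration; the actual plugin's format may differ):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// One captured skeleton frame: a timestamp plus the servo-relevant angles.
public class BodyFrame {
    public final long timestampMs;
    public final float[] jointAnglesDeg; // one entry per mapped joint

    public BodyFrame(long timestampMs, float[] jointAnglesDeg) {
        this.timestampMs = timestampMs;
        this.jointAnglesDeg = jointAnglesDeg.clone();
    }
}

// Playback is just stepping through the frames with the recorded delays.
class BodyRecording {
    private final List<BodyFrame> frames = new ArrayList<>();

    void record(BodyFrame f) { frames.add(f); }

    void play(Consumer<BodyFrame> sendToServos) throws InterruptedException {
        long prev = -1;
        for (BodyFrame f : frames) {
            if (prev >= 0) Thread.sleep(Math.max(0, f.timestampMs - prev));
            sendToServos.accept(f);
            prev = f.timestampMs;
        }
    }
}
```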
@ptp.... Will your plugin also work with the Asus Xtion camera? It basically does the same thing as the Microsoft Kinect... I already have the Xtion, so I thought I would ask... Thanks
@proteusy:
The Kinect is well known and widely used with the ROS framework for navigation; it can simulate a 2D lidar (sketched below) or be used for 3D point cloud navigation.
Currently I'm using an Xbox 360 Kinect for navigation on a ROS robot (Linux), and I'm researching the Xbox One Kinect on Windows.
There are other similar sensors, the Intel RealSense and Project Tango (Android); I have all of them, and the common base is the 3D point cloud data.
It's possible to write similar abilities on Windows, but navigation and localization are much more than handling sensor data.
It takes time, money, and human resources to do that; Linux is the choice for open source initiatives.
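To illustrate the 2D-lidar trick: take a horizontal band of the depth image and keep, per column, the nearest valid reading, which collapses the cloud into a planar scan. A rough sketch in plain Java; the 640x480 resolution and 57-degree field of view are Kinect 360 ballpark figures, not exact specs:

```java
// Collapse a depth image into a fake 2D laser scan: for each column, keep
// the nearest valid depth within a horizontal band around the scan plane.
public class FakeLaserScan {
    static final int WIDTH = 640;                 // Kinect 360 depth width
    static final double FOV_DEG = 57.0;           // approximate horizontal FOV

    // depthMm is row-major, depthMm[row * WIDTH + col]; 0 means no reading.
    public static double[] toScan(int[] depthMm, int bandTop, int bandBottom) {
        double[] ranges = new double[WIDTH];
        for (int col = 0; col < WIDTH; col++) {
            int nearest = Integer.MAX_VALUE;
            for (int row = bandTop; row <= bandBottom; row++) {
                int d = depthMm[row * WIDTH + col];
                if (d > 0 && d < nearest) nearest = d;
            }
            ranges[col] = (nearest == Integer.MAX_VALUE)
                    ? Double.NaN                  // no return in this column
                    : nearest / 1000.0;           // millimeters to meters
        }
        return ranges; // ranges[i] is at angle (i / (double) WIDTH - 0.5) * FOV_DEG
    }
}
```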
@Richard,
I'm using the Kinect V2 as the reference: more joints, and the API is more recent.
I will add support for the Kinect V1: fewer joints and different specs (see the joint map sketched below).
I can borrow an Asus Xtion and use the now-defunct OpenNI framework; unfortunately, as with the Xbox Kinect, don't expect driver updates.
But it is doable.
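For a sense of what "more joints" means: the V1 skeleton reports 20 joints and the V2 reports 25, so code written against V2 needs a degrade path on V1. A sketch of that mapping; the joint names come from the two Microsoft SDKs, but the V1 stand-ins chosen for the extra V2 joints are my own approximation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Kinect V2 tracks 25 joints, V1 only 20. Map each V2 joint to its closest
// V1 equivalent, or null when there is none (hand tips and thumbs).
public class JointFallback {
    public static final Map<String, String> V2_TO_V1 = new LinkedHashMap<>();
    static {
        V2_TO_V1.put("SpineBase",     "HipCenter");
        V2_TO_V1.put("SpineMid",      "Spine");
        V2_TO_V1.put("SpineShoulder", "ShoulderCenter");
        V2_TO_V1.put("Neck",          "ShoulderCenter"); // nearest V1 stand-in
        V2_TO_V1.put("Head",          "Head");
        V2_TO_V1.put("HandTipLeft",   null);             // V2 only
        V2_TO_V1.put("ThumbLeft",     null);             // V2 only
        V2_TO_V1.put("HandTipRight",  null);             // V2 only
        V2_TO_V1.put("ThumbRight",    null);             // V2 only
        // Shoulders, elbows, wrists, hands, hips, knees, ankles and feet
        // share the same names in both SDKs and map straight across.
    }
}
```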
I'm really looking forward to this. Do you have a timeline? I've recently ordered the Kinect V2 in anticipation of this project.
@ptp, from this post, https://synthiam.com/Community/Questions/7642, the one you wrote about in this thread:
I have the Xbox 360 Kinect and found SimpleOpenNI on my thumb drive, and I have installed these two from the other post:
Microsoft Kinect SDK - http://go.microsoft.com/fwlink/?LinkID=323588
Processing Windows 32-bit - https://www.processing.org/download/
Everything is installed, and it is reading the Kinect: the Processing window shows my body in the picture with the sample EZB project, but I have not tried to use any servos yet.
Would my Xbox 360 Kinect be compatible with the plugin you are considering creating for this community?
BTW, I have just purchased the book with the Kinect-networked puppet from your link, thank you. And thank you for your response(s).
Hello, I set this up using my linked post in thread #31, and it does work. Be careful: use a servo that is not connected to anything, because when you are running this setup and are not in front of the Kinect, the servo will move on its own whenever the detected skeleton moves.
Also, you have to change the virtual port numbers in the Processing sketch and in the EZB project to a digital port for it to work.
@Anybody: does anyone know how to set this up so that if you are not in front of the Kinect and someone or something else moves in front of it, the servos do not move? Does it need a variable? I hope this makes sense. (One approach is sketched after this post.)
Thanks, Merne
Edit: I forgot to say, I do not mean to hijack ptp's thread.
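One way to handle the problem Merne describes, assuming a SimpleOpenNI setup like the book's, is to bind the servos to a single tracked user: each frame, pick the user whose center of mass is closest to the sensor and ignore everyone else. A sketch of that filter; the method names follow the SimpleOpenNI 1.96 API, so check them against your installed version:

```java
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  context = new SimpleOpenNI(this);
  context.enableDepth();
  context.enableUser();
}

// Return the ID of the tracked user nearest to the sensor, or -1 if none.
int nearestUser() {
  int[] users = context.getUsers();
  int best = -1;
  float bestZ = Float.MAX_VALUE;
  PVector com = new PVector();
  for (int id : users) {
    if (!context.isTrackingSkeleton(id)) continue;
    context.getCoM(id, com);          // center of mass in sensor coordinates
    if (com.z > 0 && com.z < bestZ) { // z is the distance from the camera
      bestZ = com.z;
      best = id;
    }
  }
  return best;
}

void draw() {
  context.update();
  int id = nearestUser();
  if (id != -1) {
    // Read only this user's joints and send servo commands as before;
    // anyone else crossing the frame never reaches the servos.
  }
}
```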
Any progress on this?
I'd like to know how to get the data live into ARC.