
First Post - Question About EZ-B v4 and Object Tracking

Hello all,

I have been looking through the forums and am quite impressed with everyone's work and collaborative effort! :D

I am building an InMoov, and I want to integrate object tracking with the ability to actually grasp the object. I have two cameras in the eyes, and also an Xbox 360 Kinect. I am wondering if these would be sufficient to guide the actual grasping of an object.

I am sure someone knows a solution, but my current idea is to add cameras to the wrists (each with tracking enabled through ARC, so the hand can move toward the object), plus IR sensors in the palms so the robot can detect when an object is within grasping range and trigger the grasping motion.
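In rough pseudocode, the loop I have in mind would look something like this. This is just a sketch of the idea: the sensor-reading and hand-movement functions here are placeholders I made up, not real ARC API calls, and the 5 cm grasp range is an assumed value.

```python
# Sketch of the wrist-camera + palm-IR grasp trigger idea.
# move_hand / close_hand are hypothetical stand-ins for whatever servo
# commands the real build would use; they are NOT actual ARC functions.

GRASP_RANGE_CM = 5.0  # assumed distance at which the palm IR sensor fires


def should_grasp(ir_distance_cm):
    """True when the palm IR sensor reports an object within grasping range."""
    return ir_distance_cm is not None and ir_distance_cm <= GRASP_RANGE_CM


def grasp_step(camera_offset_x, ir_distance_cm, move_hand, close_hand):
    """One iteration of the loop: steer the hand toward the tracked object,
    then close the hand once the palm IR sensor sees it in range."""
    if should_grasp(ir_distance_cm):
        close_hand()          # object in the palm: trigger the grasp motion
        return "grasping"
    move_hand(camera_offset_x)  # wrist camera keeps the hand centered on the object
    return "tracking"
```

The idea is that the wrist camera only handles the approach, and the final "close the fingers" decision comes from the palm sensor, so the grasp does not depend on the camera judging depth.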

I am really impressed with bhouston's InMoov, and am hoping he may have some input on the feasibility of this.

Thanks to everyone in advance!



Synthiam
#1  

Welcome to the forum! EZ-Robot software can support up to 256 cameras.

Your first step would be building the InMoov and connecting the EZ-Robot hardware and software to it. Once you get that far, the rest is even more fun - and you will have your own answers and a better understanding of these questions :D

#2  

@bmb3488, welcome to the community. Thanks for your comments - there are a few very impressive InMoovs and robots here, such as the InMoovs built by Richard R, merne, and Aerious, to name a few.

How far along are you with your InMoov build? We love seeing pictures of projects here - post some if you can. Have a go at the programming, and if you have specific questions about something, post them - there are lots of people here to help you out.

I haven't done exactly what you want to do, but it is doable. I built a robotic arm with a camera in the jaws and programmed it to search for an object; once the object is found, it reaches out and picks the object up. Adding an IR sensor would make it work even better. Check out this thread: https://synthiam.com/Community/Questions/6909

#3  

Thanks @bhouston! So far I have completed the skull and am just waiting on neck servos. The torso is complete, but not yet attached to the head. The eyes can follow whatever they are set to track (faces, motion, etc.), and I've got the jaw installed.

My goal is to connect the InMoov's body to a 6-wheeled wheelchair base with a vertical pole. I am planning on using Sabertooth motor controllers to interface the chair to its own EZ-B v4, alongside some ultrasonic distance sensors, so that the base can be driven independently. I would also like to use a custom wiring harness so the entire assembly breaks down easily into two distinct pieces.
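The base-driving logic I'm picturing is roughly the sketch below: mix a throttle/steer command into left/right motor power for the Sabertooth, and halt when the ultrasonic sensor reports an obstacle too close. The 30 cm stop distance and the -1..1 power scaling are assumptions for illustration, not Sabertooth-specific values.

```python
# Sketch of the independent wheelchair-base drive idea: a standard
# differential mix gated by an ultrasonic obstacle check. The actual
# Sabertooth serial values would be mapped from these -1..1 powers.

STOP_DISTANCE_CM = 30.0  # assumed minimum clearance before the base halts


def clamp(v, lo=-1.0, hi=1.0):
    """Keep a motor power value inside the -1..1 range."""
    return max(lo, min(hi, v))


def drive_command(throttle, steer, ultrasonic_cm):
    """Return (left, right) motor power, stopping if an obstacle is close."""
    if ultrasonic_cm is not None and ultrasonic_cm < STOP_DISTANCE_CM:
        return (0.0, 0.0)              # obstacle ahead: stop the base
    left = clamp(throttle + steer)     # standard differential mix
    right = clamp(throttle - steer)
    return (left, right)
```

Keeping the mixing on the base's own EZ-B v4 is what would let the wheelchair half drive on its own once the harness is unplugged from the torso.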

I am running the eyes off of two Ubisoft wide-angle motion-tracking cameras connected to an onboard barebones netbook that runs ARC (which I connect to via remote desktop).

Once the torso is mounted to the vertical pole support (attached to the wheelchair's seat bracket), I will begin building the arms. I have access to two 3D printers and keep them printing CONSTANTLY! lol.

Any input on this design is much appreciated.