You sure can. That video is faking the sorting of colors because there is no camera or color sensor. Instead, that video is merely an Arduino moving servos through a pre-programmed routine. With Synthiam ARC, you can reproduce that video more efficiently by using the same configuration with the Arduino EZB firmware.
Find the firmware that works best with your robot controller here: https://synthiam.com/Support/Hardware/Hardware-Overview
Next, to program the actions, the Auto Position is the most accessible robot skill to use for that: https://synthiam.com/Support/Skills/Servo/Auto-Position-Gait?id=20314
Finally, if you wish to use a camera to detect colors for absolute sorting, it can be done using the Camera Device: https://synthiam.com/Support/Skills/Camera/Camera-Device?id=16120
To get started, we recommend using the getting-started guide here: https://synthiam.com/Support/Get-Started/how-to-make-a-robot/plan-a-robot
*Note: we have moved your question into this new thread because it was in the wrong area.
"That video is faking the sorting of colors because there is no camera or color sensor."
I am not sure this is the case, as he throws the blocks down rather than placing them. I assume he has an overhead camera (perhaps the one we are watching the video from, but more likely one directly above where he is throwing the blocks). He is using the paper pieces to align the robot arm.
For this skill we will need to use colour detection in ARC, not Auto Position.
Check for blue; if blue exists, get the coordinates of the object from the video feed. Move the robot arm to those coordinates.
Do the same for the other colours.
We don't need the paper alignment though; we can throw the blocks anywhere in the room, and we could even attach the robot arm to a vehicle.
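The colour-detection step above can be sketched in plain Python. This is only an illustration of the idea, not ARC's actual API (in a real project the Camera Device skill would report the object location for you); the frame here is just a 2-D list of RGB tuples, and the threshold values are made up:

```python
# Sketch of the colour-detection step: scan a frame for pixels inside a
# target RGB range and return the centroid of the matching pixels.
# The frame format and thresholds are illustrative assumptions, not ARC's API.

def find_color(frame, lo, hi):
    """frame: 2-D list of (r, g, b) tuples. Return (x, y) centroid or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if lo[0] <= r <= hi[0] and lo[1] <= g <= hi[1] and lo[2] <= b <= hi[2]:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no pixels of that colour in view
    return sum(xs) // len(xs), sum(ys) // len(ys)

# Tiny 5x5 test frame with one "blue" pixel at (3, 1)
BLUE, BLACK = (20, 30, 220), (0, 0, 0)
frame = [[BLACK] * 5 for _ in range(5)]
frame[1][3] = BLUE
print(find_color(frame, (0, 0, 200), (80, 80, 255)))  # -> (3, 1)
```

Repeating the same call with red and green thresholds covers the other colours.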
Interesting video. I just watched it based on these comments, and there probably is a camera. No one can see it, but it must be overhead. Strange that they didn't show it; I can see how that could be confusing.
I disagree about the Auto Position though - you'd still need it. Once the object is picked up, the Auto Position will place it in the appropriate cup for its color. So have three actions: one each for dropping into the red, blue, and green cups.
The starting position doesn't matter when using Auto Position actions, because the transition starts from wherever the servos currently are.
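The three-actions idea above is just a lookup from colour to action name. A minimal sketch, where `run_auto_position` is a hypothetical placeholder for however your controller triggers a named Auto Position action (in ARC this would be a ControlCommand to the Auto Position skill), and the action names are made-up examples:

```python
# Sketch: dispatch the detected colour to the matching Auto Position action.
# Action names and run_auto_position are hypothetical placeholders.

ACTION_FOR_COLOR = {
    "red": "Drop In Red Cup",
    "blue": "Drop In Blue Cup",
    "green": "Drop In Green Cup",
}

def run_auto_position(action_name):
    # Placeholder: in ARC this would trigger the named Auto Position action.
    print(f"Running Auto Position action: {action_name}")

def place_block(color):
    action = ACTION_FOR_COLOR.get(color)
    if action is None:
        raise ValueError(f"No cup configured for colour {color!r}")
    run_auto_position(action)

place_block("blue")  # -> Running Auto Position action: Drop In Blue Cup
```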
So the easiest way to do this shouldn't take much longer than half an hour:
Mount an overhead camera.
Determine the servo positions that correspond to the screen coordinates, meaning pixel 10,10 is servo 30, 60 (or whatever). That way, when an object is detected at an x/y coordinate, a simple servo command can send the arm to get it.
Once the arm has the block, have an IF condition that launches the respective Auto Position action to place it in the appropriate cup.
Keep the arm out of the way until another object is detected.
Repeat.
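The pixel-to-servo step above is a linear interpolation per axis. A sketch, where all calibration numbers (frame size, servo ranges) are made-up example values you would measure on your own rig:

```python
# Sketch: linear mapping from camera pixel coordinates to servo positions.
# Calibration values below are made-up examples; measure your own by jogging
# the arm to two known pixel locations and noting the servo positions.

def make_axis_map(pix_min, pix_max, servo_min, servo_max):
    """Return a function mapping a pixel coordinate to a servo position."""
    def to_servo(pix):
        t = (pix - pix_min) / (pix_max - pix_min)  # normalize to 0..1
        return round(servo_min + t * (servo_max - servo_min))
    return to_servo

x_to_servo = make_axis_map(0, 640, 20, 160)  # base rotation servo
y_to_servo = make_axis_map(0, 480, 30, 150)  # shoulder/reach servo

print(x_to_servo(320), y_to_servo(240))  # centre of frame -> 90 90
```

With these two maps, an object detected at (x, y) translates directly into the two servo positions to send the arm to.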
Far too easy haha
Ok, yeah, you could use Auto Position to put the block in the cup, but what happens if you move the cup?
You could use object recognition or a glyph to find the cup.
Ah, touché! That's a neat idea for the cup location. So it could be even smarter by knowing where the cup is. I like it.
Thanks, Mr. DJ, and everyone in this community, for this post. Can you make a tutorial for that project? How can I command the servos to the X/Y coordinates so the end effector tracks an object's motion and grips it with the gripper? Note: I use a WidowX 250 6DOF at my college.
There are a lot of tutorials online. If we write a tutorial for the exact task, that really isn't going to help you learn (the point of college).
I doubt anyone here owns your specific robot arm, nor would anyone here likely buy one (we would build from components). You need some servos and a camera.
If you want to use your robot arm, step 1 is to get it working with ARC. I did not see a servo controller listed for your robot arm. Do you know what the included servo controller is? If not, choose a controller from the list of ARC-supported controllers. There are dozens supported, but an Arduino or an EZB will do to connect to your robot arm. https://synthiam.com/Products/ARC
I would probably recommend an EZ-B IoTiny for your project, as they are easy for beginners to use. (Edit: if you need to read the servo positions of Dynamixel servos, use the EZ-B v4, as it has hardware UART support.) You will also need a Windows PC. From the robot arm's description, it looks like there is an optional Raspberry Pi 4 you could load Windows on, but I'm not sure; any old Windows laptop will work fine. I took a quick look at your robot arm: it uses Dynamixel servos, so they are supported by ARC. You can also use ARC with the RealSense sensor that the robot arm offers as an option, but most webcams will work just fine. You just select the appropriate plug-in.
https://www.trossenrobotics.com/widowx-250-robot-arm-6dof.aspx
A quick Google search shows it uses an ArbotiX controller. This is an Arduino with proprietary firmware, so I don't believe it is supported directly. You can just swap it out with a supported Arduino, like a Mega https://synthiam.com/Support/Hardware/Arduino-Due-Mega, or an EZB, etc.
https://www.interbotix.com/arbotix-robocontroller
http://vanadiumlabs.github.io/arbotix/