Australia

Orientation Sensors To Create Intelligent Servo Feedback

Hi,

Can the Orientation sensors be placed on each servo motor so that you can move robot arms into place and save a frame without needing to use screen controls?

I want to be able to easily move the arms, head etc into each frame/pose to create specific choreography.



#1  

I am going to say no.... Not to get into specifics, but the EZ-B v4 has only 3 I2C ports, so unless you can figure out how to chain more orientation sensors I think you are out of luck, I am afraid... To be honest, I am not even sure you can use 3 orientation sensors at once....

Why doesn't the basic auto positioner control work for what you want to do?... Maybe I am missing something....

I think an easier way would be to hack the servos' potentiometers and run the wires back to the ADC ports on the EZ-B... That will give you verified servo position....
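For illustration, the pot-to-position idea boils down to a linear mapping from ADC counts to degrees. Here is a minimal Python sketch, assuming a 12-bit ADC and made-up endpoint readings; in practice you would calibrate the two endpoints by moving the servo to its limits and noting the raw values:

```python
# Sketch: convert a raw ADC reading from a hacked servo potentiometer into an
# approximate shaft angle. The resolution (adc_max) and the two endpoint
# readings are assumptions -- measure them on your own servo at both end stops
# before trusting the numbers.

def adc_to_degrees(raw, raw_at_0deg=410, raw_at_180deg=3685):
    """Linearly map a raw ADC count onto 0-180 degrees."""
    span = raw_at_180deg - raw_at_0deg
    angle = (raw - raw_at_0deg) * 180.0 / span
    return max(0.0, min(180.0, angle))   # clamp to the mechanical range

if __name__ == "__main__":
    for sample in (410, 2048, 3685):
        print(sample, "->", round(adc_to_degrees(sample), 1), "deg")
```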

Australia
#2  

Thanks for the quick reply :) I'm researching how to hack the potentiometer here:

http://www.instructables.com/id/Servo-Feedback-Hack-free/

Hopefully I'll be able to position my robot into poses with the power off (so the servos don't fight me) and save each position for easy choreography.

#3  

Hacking a servo is not too difficult. I hacked dozens making my InMoovs... granted, they were large-scale Hitec HS-805BBs.

Good luck with your project....

#4  

You really can't make robots like JD hold position without power. It will just collapse whenever you try to bend the leg servos; the weight is too great. Even apart from that, where are you going to send the pot outputs to be read for their value? It's a lot of wires, and A-D conversion is needed. You would only need one A-D converter on the EZ-B if you use analog multiplexers for the pots to attach to.
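For what it's worth, the multiplexer idea looks roughly like this as a minimal Python sketch: an 8-channel analog mux (a CD4051-type part) lets eight pot wipers share one ADC input, and three digital select lines pick which one is routed through. The pin names and the set_digital()/read_adc() helpers are placeholders, not any real API:

```python
# Sketch of the analog-multiplexer idea: eight pots share one ADC input, and
# three select lines choose which pot is connected. Replace the placeholder
# functions with whatever controller API you actually use.

SELECT_PINS = ("D0", "D1", "D2")   # hypothetical digital pins driving S0..S2
ADC_PIN = "A0"                     # the single shared ADC input

def set_digital(pin, value):       # placeholder - swap in a real call
    print(f"set {pin} = {value}")

def read_adc(pin):                 # placeholder - swap in a real call
    return 2048

def read_pot(channel):
    """Select mux channel 0..7, then sample the shared ADC pin."""
    for bit, pin in enumerate(SELECT_PINS):
        set_digital(pin, (channel >> bit) & 1)
    return read_adc(ADC_PIN)

if __name__ == "__main__":
    readings = [read_pot(ch) for ch in range(8)]
    print(readings)
```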

I also echo Richard R's suggestion to use the Auto Positioner. Using it, you can place the robot into any position you wish, and it will hold that position as long as its power source lasts (this assumes you use the "Realtime Update" option). I suppose it would be easier to simply move the various servos by hand, but you would have to suspend the robot, and doing that can easily result in poses that will not really work when actually tried. With the Auto Positioner, you can test the pose at the same time as you create it.

The one thing that would make the Auto Positioner better would be a 3-D representation of the robot that let you drag its limbs into position rather than changing a number. That would give the user the ability to change several servos at the same time; it can be somewhat difficult only being able to move one servo at a time. Better still, all the choreography could be done on the computer. Perhaps someone could make a plugin to do that.

Australia
#5  

My robot weighs around 20 kilograms.

I'll be using a Kinect sensor with a button in my hand to create some poses if necessary, but I've seen voodoo posing techniques used before in Japan. They are the fastest way, using hacked servos and a secondary trainer robot.

As for servo braking, I've been working on it for a while, but RC switches are letting me down.

Maybe someone can help with the servo brakes here:

https://www.youtube.com/watch?v=euOJe9ISXMY

United Kingdom
#6  

With regard to I2C ports: while the EZ-B may only have 3 I2C ports, you can daisy-chain I2C devices (I'd imagine this is what the EZ-B does to get 3 ports).

You can also re-address most I2C devices, so if you have two of the same you just need to re-address one in order to use both at the same time. For instance, BlinkMs come with a set I2C address, but you can run multiple BlinkMs by changing the addresses.

The limitations come down to the device's datasheet. If the device doesn't allow re-addressing, or the datasheet doesn't explain the command to send to re-address it, then you are stuck. Each device is different too; there is no one-size-fits-all command to change an I2C address.

Point is, you can have more than 3 I2C devices without a problem. If memory serves me correctly, you can have up to 128 different I2C devices on a single EZ-B.
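To make the addressing point concrete, here is a minimal, generic Python sketch of scanning an I2C bus for devices, using the smbus2 package on a Linux board (not the EZ-B itself); every device that ACKs its own 7-bit address shows up:

```python
# Generic illustration of how one I2C bus serves many devices: each device
# answers only to its own 7-bit address. Requires the smbus2 package and a
# Linux I2C bus; shown purely to demonstrate the addressing idea.
from smbus2 import SMBus

def scan_bus(bus_number=1):
    """Probe all valid 7-bit addresses and return those that ACK."""
    found = []
    with SMBus(bus_number) as bus:
        for address in range(0x08, 0x78):       # skip reserved addresses
            try:
                bus.read_byte(address)          # simple probe read
                found.append(hex(address))
            except OSError:
                pass                            # no device ACKed this address
    return found

if __name__ == "__main__":
    print("Devices found:", scan_bus())
```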

PRO
Canada
#7  

Yep, you guys are correct. Since I2C is a shared bus, you can have many different devices hooked up to it, but as @Rich mentioned they will all need a unique address.

The 4-in-1 sensor can be readdressed using the physical AD1/AD0 address selector on the PCB, which switches between 0x69 and 0x68. The maximum number of 4-in-1 sensors you can have hooked to the EZ-B is two, since there are only 2 unique addresses.
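As a small illustration of running both sensors side by side, the sketch below (again generic Python/smbus2, not EZ-B code) polls 0x68 and 0x69. The WHO_AM_I register offset is an assumption based on MPU-series parts, so check the datasheet for the actual chip on the board:

```python
# Sketch of talking to two identical IMUs on one bus, one strapped to 0x68 and
# the other to 0x69 via the address-select pin. The register offset below is
# an assumption -- verify it against your sensor's datasheet.
from smbus2 import SMBus

WHO_AM_I = 0x75          # assumed identity register offset for MPU-series parts

with SMBus(1) as bus:
    for address in (0x68, 0x69):
        ident = bus.read_byte_data(address, WHO_AM_I)
        print(f"sensor at {hex(address)} reports id {hex(ident)}")
```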

Australia
#8  

Thank you very much, guys.

I guess my original question is entirely about the fastest and easiest programming technique.

Whether it's a graphical representation of my robot on a screen (controlled by Kinect, Myo, mouse, touch screen or intelligent glove), primitive sliders or arrow buttons, or indeed a voodoo duplicate of my robot built from position-feedback servos, the end result is the point: code.

Once the code has been written, the robot will do as instructed. Therefore, the robot I choreograph does not actually need to be present during the programming phase, and the servos inside my robot do not need to be intelligent.
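Put another way, the choreography itself is just data. Here is a minimal sketch of what the recorded output could look like, with made-up joint names and a placeholder move_servo() standing in for whatever controller call is actually used:

```python
# Sketch: a choreography is just a list of frames, each frame a dict of servo
# angles plus a hold time. move_servo() is a placeholder for the real API.
import time

choreography = [
    {"hold": 1.0, "angles": {"left_shoulder": 90, "right_shoulder": 90, "head": 90}},
    {"hold": 0.5, "angles": {"left_shoulder": 40, "right_shoulder": 140, "head": 70}},
    {"hold": 0.5, "angles": {"left_shoulder": 140, "right_shoulder": 40, "head": 110}},
]

def move_servo(name, angle):         # placeholder - swap in a real call
    print(f"{name} -> {angle} deg")

def play(frames):
    """Step through each pose, then wait out its hold time."""
    for frame in frames:
        for name, angle in frame["angles"].items():
            move_servo(name, angle)
        time.sleep(frame["hold"])

if __name__ == "__main__":
    play(choreography)
```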

Your responses have helped me to draw this conclusion, so I thank you.

This is what I will do if I can afford it: https://www.youtube.com/watch?v=poPeO7xxA90

I will buy intelligent Dynamixel position-feedback servos and build a precise voodoo replica of my robot. This will allow me to create poses with the absolute clarity of tactile 3D reality, using the fastest, most precise method possible. Even a hologram with interactive force fields that reacted to my hands would be an inferior imitation of this method. The only things that would surpass it would be mind control, or a 3D motion scan of trained choreographers/fighters/acrobats who had already rehearsed a performance; even then, the code would need tightening in editing to eliminate mistakes of movement, something this voodoo technique would already have achieved through meticulous frame-by-frame posing.

Besides, real humans can do things my robots cannot, whereas the voodoo duplicate will accurately reflect joints, servo rotation limits, appendage lengths and other dimensions. Also, certain robots are not humanoid. At any rate, I see nothing that comes close on the market today, and it is almost the perfect solution if you want to quickly create a library of flawless sequences for use in many robots. It's expensive, though.

Does anyone have anything to add? It would be great if ezrobot would compete by building a cheaper version of such a voodoo programmer... :)

#9  

@Zxen This may be just the servo you need. No hardware hacking required. http://www.adafruit.com/products/1404

Australia
#10  

Thanks! So much cheaper than dynamixel!

I'm pretty sure all servos of the same make and model react to PWM signals in exactly the same way.

Do you think the PWM signals from an analog feedback servo would work with any servo? I guess the gearing in each servo model makes the degrees-of-rotation to pulse-width ratio different, right?
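A quick worked example of that ratio, assuming the common nominal 1000-2000 µs range versus a wider 600-2400 µs range; the centre pulse gives the same angle either way, but the degrees-per-microsecond ratio differs between models:

```python
# Quick check of the point above: a hobby servo maps pulse width to angle, and
# the endpoints differ between models. 1000-2000 us is the common nominal
# range; many servos actually use roughly 600-2400 us, so the
# degrees-per-microsecond ratio really is model specific.

def pulse_to_angle(pulse_us, min_us=1000, max_us=2000, travel_deg=180):
    return (pulse_us - min_us) * travel_deg / (max_us - min_us)

print(pulse_to_angle(1500))                           # ~90 deg on the nominal range
print(pulse_to_angle(1500, min_us=600, max_us=2400))  # ~90 deg, but a different ratio
```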

PRO
Canada
#11  

Yes, any servo would work; it is simply the center pin of the servo's potentiometer broken out. One thing to keep in mind is that the voltage from that pin could be higher than an EZ-B analog pin can handle, but you could scale the signal down with a resistor-based voltage divider.
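A quick sketch of the divider maths, assuming a 5 V worst-case feedback voltage and a 3.3 V analog-input limit (check what your ADC pin actually tolerates; both figures here are assumptions):

```python
# Sketch of the resistor divider mentioned above: scale a feedback voltage
# that may exceed the analog input's limit.

def divider_output(v_in, r_top, r_bottom):
    """Classic two-resistor divider: Vout = Vin * Rb / (Rt + Rb)."""
    return v_in * r_bottom / (r_top + r_bottom)

v_feedback_max = 5.0               # worst-case wiper voltage (assumed)
r_top, r_bottom = 10_000, 18_000   # example standard resistor values

v_out = divider_output(v_feedback_max, r_top, r_bottom)
print(f"{v_feedback_max} V in -> {v_out:.2f} V out")   # ~3.21 V, under a 3.3 V limit
```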

Australia
#12  

When my EZ-B controller arrives, I intend to use separate power lines for all the servos, to be on the safe side. I have a 46 A, 7.5 V transformer because I'll be using a few 40 kg.cm servos that require 9 A under load.
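As a rough sanity check on that supply, here is a back-of-the-envelope current budget; the servo counts and the small-servo draw are illustrative guesses, not figures from this thread:

```python
# Rough current-budget check for the supply described above. Substitute real
# stall/load figures from your servo datasheets.

supply_amps = 46
heavy_servos = 3        # assumed count of 40 kg.cm servos
heavy_amps = 9          # stated worst-case draw per heavy servo
small_servos = 20       # assumed count of ordinary servos
small_amps = 1.0        # assumed worst-case draw per small servo

total = heavy_servos * heavy_amps + small_servos * small_amps
print(f"worst case {total} A of {supply_amps} A available -> margin {supply_amps - total} A")
```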

I thought I would need a different microcontroller for the voodoo feedback programming, but it would be great if the EZ-B could handle it. I figured the script could simply be pasted onto the EZ-B afterwards, but that would not allow me to see the public robot move in real time alongside the voodoo doll. I intend to use all 24 channels.

If I did use Adafruit's feedback servos, can anyone tell me how to plug them in so I can see my EZ-B robot react in real time? Step-by-step images would be best if anyone is willing to draw the answer; I'm sure many people would benefit. I'd be so grateful - I get confused by lots of electronic terminology.
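In case it helps frame the wiring question, here is a minimal sketch of the mirroring loop being described: poll each feedback servo on the trainer rig, convert the reading to an angle, command the matching servo on the performing robot, and occasionally record the pose as a frame. The channel map and the read_feedback()/set_servo() helpers are placeholders for whatever controller API ends up being used:

```python
# Sketch of the "voodoo" mirroring loop: trainer pose in, performer pose out,
# with optional frame recording. All hardware calls are placeholders.
import time

CHANNELS = {"head": 0, "left_shoulder": 1, "right_shoulder": 2}   # hypothetical map

def read_feedback(channel):          # placeholder - replace with a real ADC read
    return 2048

def set_servo(channel, angle):       # placeholder - replace with a real servo call
    pass

def adc_to_degrees(raw, adc_max=4095):
    return raw * 180.0 / adc_max

def mirror_once(recorded_frames, record=False):
    pose = {}
    for name, channel in CHANNELS.items():
        angle = adc_to_degrees(read_feedback(channel))
        set_servo(channel, angle)    # the performing robot copies the trainer
        pose[name] = round(angle, 1)
    if record:
        recorded_frames.append(pose)

if __name__ == "__main__":
    frames = []
    for i in range(100):             # roughly 10 s of mirroring at 10 Hz
        mirror_once(frames, record=(i % 10 == 0))   # save one frame per second
        time.sleep(0.1)
    print(f"recorded {len(frames)} frames")
```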