
What Is The Best Remote Control Solution For My 30-DOF Robot?

Hi there,

I would like to control my humanoid robot (a custom build with 30 degrees of freedom: 2 neck, 12 arm, 2 spine, 12 leg, 1 jump, 1 pivot and 1 somersault servos; no fingers or facial movement yet) using some kind of remote control system.

I am thinking of a pair of wireless accelerometer glasses plus a couple of Wiimotes, but that might not be enough for 30 servos. I'd also like to use movement recognition, either as an alternative or in combination. I don't want controllers as expensive as a MYO armband on every limb (they are unreliable controllers anyway), or a touch screen where one finger at a time can't control 30 servos simultaneously (unless there is a great solution for that).

The first question is about headwear. I don't mind occasionally seeing through the head cam, but my vision cannot be blocked so much that I cannot see the robot itself. Therefore I need a clear idea of which headsets are currently compatible with EZ-Builder software. As far as I can tell, only the Vuzix 920VR glasses are. It is unclear whether [link=]these[/link] Vuzix STAR 1200 glasses are compatible, or any others. I'd rather avoid wired glasses if possible, but I'll try to find some old 920s if necessary.

Secondly, Wiimotes seem to have fewer buttons than an Xbox controller, though I suppose that doubles with two of them. Are they any good? I'm not sure I want my users to have to wave their arms around.

Thirdly, the mobile interface for phones and tablets seems to be the most integrated control system. I could chunk actions together into menu presets: point the left arm straight, wave, run, dance to a particular song, and other basic poses. But you can understand the limitation of not being able to express yourself properly when you can only do preset moves.

Fourthly, some kind of gesture gloves, though I don't know where to get them or whether they are compatible.

Fifthly, some kind of interface like a lighting desk, so I can place all ten fingers on sliders to control at least ten body parts at a time while preset movements are also running.

Sixthly, an Emotiv EPOC headset. I don't know whether you can just imagine a person moving in a particular way and have it control all the servos as required, but that would be cool. Even if I used all my fingers and toes and tongue and crotch, that's still only 22 control sliders.

As you can see, controlling 30 servos simultaneously is difficult, and I haven't even introduced 10 servos for fingers, 2-4 for eyes and about 8 more for the mouth and eyebrows, not to mention lighting controls, audio controls, etc.

Let's assume I will need to control 50 things on this robot. What are the BEST ways to achieve that? Thanks.
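The lighting-desk option above is essentially a bank of faders. As a rough sketch, assuming a MIDI-style fader board (7-bit values, 0-127, as read by a library such as mido), fader values could be scaled onto servo angles like this. The fader-to-servo assignments and servo names here are invented purely for illustration:

```python
# Hypothetical fader-to-servo assignments; a real setup would read
# control-change events from a MIDI library and forward the resulting
# positions to the robot controller.
FADER_TO_SERVO = {
    1: "left_shoulder",
    2: "left_elbow",
    3: "right_shoulder",
    4: "right_elbow",
}

def cc_to_degrees(value, lo=0, hi=180):
    """Scale a 7-bit MIDI control-change value (0-127) onto a servo range."""
    if not 0 <= value <= 127:
        raise ValueError("MIDI CC values are 0-127")
    return lo + (hi - lo) * value / 127

def handle_cc(control, value, positions):
    """Update the tracked position of whichever servo this fader drives."""
    servo = FADER_TO_SERVO.get(control)
    if servo is not None:
        positions[servo] = round(cc_to_degrees(value))
    return positions

positions = {}
handle_cc(1, 0, positions)    # fader fully down -> 0 degrees
handle_cc(2, 127, positions)  # fader fully up -> 180 degrees
handle_cc(3, 64, positions)   # roughly centred
print(positions)
```

A ten-fader desk would cover ten body parts at once this way, with the remaining channels left to presets.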

User-inserted image




I just had this idea, which may or may not be easier than a couple of Wiimotes:

User-inserted image

Legs would be grouped presets unless there was a second person to control the legs and other functions; otherwise the operator would have to be doing the splits, running and doing somersaults while covered in accelerometers. The head and arms are more expressive, so they need more real-time control. If finger frames were presets, the operator could simply press the thumb button to activate the selected hand pose, which would free up 8 buttons for other things.

The controllers in the image would be fully programmable to control any EZ-B inputs, of course. Four controllers plus a computer for menu options should nearly be able to control a 50-device robot in real time; each of these off-the-top-of-my-head controllers contains 12 controls that can be operated fully in real time.

Also, I don't think the Vuzix should really control the head; I would probably assign one of the thumb scrollers to that. The mouth would move as directed by speech, so it should be pre-programmed to match the sounds the words make. Blinking, winking and the closed percentage of each eye could be controlled by a couple of foot sliders.
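The "presets running while live controls override" idea above could be sketched roughly as a per-frame merge: preset animation frames supply every channel, and any channel the operator is actively driving wins. Servo names and frame values below are invented for illustration:

```python
# A hypothetical preset frame for a walking gait. In a real rig, frames
# would come from the animation player, and live_inputs from the
# controllers, refreshed every tick.
WALK_FRAME = {
    "left_hip": 95, "left_knee": 120, "right_hip": 85, "right_knee": 60,
    "head_pan": 90, "left_shoulder": 90, "right_shoulder": 90,
}

def blend(preset, live):
    """Live controller inputs win; unlisted channels fall back to the preset."""
    merged = dict(preset)
    merged.update(live)
    return merged

# Operator steers the head and left arm while the legs keep walking.
live_inputs = {"head_pan": 120, "left_shoulder": 45}
frame = blend(WALK_FRAME, live_inputs)
print(frame["head_pan"], frame["left_knee"])  # prints: 120 120
```

The same merge scales to 50 channels: presets carry the bulk, and only the expressive channels need real-time hardware.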

This is not my first choice, just an idea. I don't have the time or the money to get these made.


You don't have many options; I would use the Xbox controller.

You can script the buttons to alternate between body parts, but you will need coordinated actions rather than individual joint control.
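The button-alternating idea could be sketched like this: one button cycles which body-part group the analog sticks currently drive. The group names are hypothetical; in practice this would be scripted against whatever controller support EZ-Builder provides:

```python
# Hypothetical body-part groups the sticks can be switched between.
GROUPS = ["head", "arms", "legs"]

class ModeSelector:
    """Tracks which group a shoulder-button press has selected."""

    def __init__(self):
        self.index = 0

    def cycle(self):
        """Advance to the next group, wrapping back to the first."""
        self.index = (self.index + 1) % len(GROUPS)
        return GROUPS[self.index]

    @property
    def active(self):
        return GROUPS[self.index]

modes = ModeSelector()
print(modes.active)   # head
print(modes.cycle())  # arms
print(modes.cycle())  # legs
print(modes.cycle())  # head (wrapped around)
```

One Xbox controller's two sticks then cover the whole robot, a group at a time, at the cost of not moving two groups simultaneously.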

The other option is to use a Kinect sensor:

Basically, you map your robot's joints to the Kinect joints, and the robot can mirror your movements.
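As a rough sketch of that mirroring, assuming the Kinect SDK gives you 3D positions for skeleton joints each frame: compute the angle at a joint from the two adjacent bone segments and send it to the matching robot servo. The joint coordinates below are fabricated to show the geometry:

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c,
    e.g. the elbow angle from shoulder, elbow and wrist positions."""
    v1 = (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    v2 = (c[0] - b[0], c[1] - b[1], c[2] - b[2])
    dot = sum(x * y for x, y in zip(v1, v2))
    mag = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / mag))

# Straight arm: shoulder, elbow and wrist in a line -> about 180 degrees.
print(round(joint_angle((0, 0, 0), (0, -1, 0), (0, -2, 0))))  # 180
# Right-angle bend at the elbow -> about 90 degrees.
print(round(joint_angle((0, 0, 0), (0, -1, 0), (1, -1, 0))))  # 90
```

Each tracked angle would then be clamped to the servo's safe range before being sent, since human joints move further than most hobby servos.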

By the way, I'm still working on it.


Isn't there something with more buttons than an Xbox controller? I mean, I'd rather it had to be plugged into a wall socket and have all the joysticks, buttons and sliders I need than be portable without enough buttons. What do they use in film animatronics and stage lighting productions? What about digging machines, cranes, trains and complicated robotics machinery? The Xbox controller can't be the best. A Kinect sensor would be cool sometimes, but often you just want direct control. What about a wearable exoskeleton, or at least a voodoo duplicate that is small and sits on the desk?


Interesting invention. Wait a second: how many things can puppeteers control at a time? They are surely the masters of controlling humanoids.

The T. rex controller for Jurassic Park was a voodoo system: