Resolved by DJ Sures!

Operating Servos Under BLOCKLY

Hi all,

Just finished the hardware for my large hexapod (1 m diameter!). I am using an EZ-B v4/2 as a controller, powered by a 2S LiPo. My servos are high-quality robot servos (brushless motors): Longrunner B07K68MK3Y.

I intend to use BLOCKLY, since I gained experience with this tool during recent projects.

My current questions:

-    Does anyone have a link to a short demo program in BLOCKLY, showing the required syntax?

-    Which servo speed is meaningful, and which position values should be used for a servo running from -90 through 90?

Currently they are just moving around a bit, but not as they should.

Thanks very much in advance !



Related Hardware EZ-B v4




Wow, that is going to be amazing to see :)

Referencing the getting started guide, sounds like you have the first few steps identified. Selecting a Movement Panel is the step I believe you are on. Here’s the link:

The movement style that suits your need is gait-driven, so the Movement Panel to use is the Auto Position Movement Panel.

To start, create the frames for having the robot walk in different directions (turn left, right, walk forward, reverse). Assign the frames to the respective actions. The manual for the Auto Position explains this, and there are lots of videos so it isn't as dry :)

Lastly, once the movements are done, you can determine how the robot is controlled. If it's eventually going to be autonomous, I'd still start with simple remote control. Select a joystick or Wii controller or something.

Also, you can use the camera to start giving the robot personality and make it chase balls or point at objects it recognizes. It's going to be a lot of fun with a robot that size.


Thanks for the link, but - to be honest - it did not help:

-    With my scratch-built Hexapod I am totally within the DIY region and have to control the 18 leg servos.

-    In fact, I need to position them at defined angles.

-    I am currently calibrating these servos and found out that center position is around 105 (for the Blockly command "Set Position").

-    The available angles are not the theoretical servo range (180), but they are by far large enough.

-    Strangely, a servo SPEED value of 1 produces a servo speed approx. 3 times higher than a value of 3.
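For reference, the calibration findings above (center around 105 for a logical range of -90 through 90) can be captured in a small mapping helper. Blockly is visual, so this is a plain Python sketch; the linear 1:1 mapping and all names are assumptions for illustration, not ARC's actual API:

```python
# Hypothetical helper mapping a logical leg angle (-90..90 degrees)
# to a "Set Position" value, given a measured center of ~105.
# The 1:1 linear mapping is an assumption for illustration.

CENTER = 105          # measured "Set Position" value at the mechanical center
DEGREES_PER_UNIT = 1  # assumed ratio between position units and degrees

def angle_to_position(angle_deg, center=CENTER):
    """Convert a logical angle in [-90, 90] to a Set Position value."""
    if not -90 <= angle_deg <= 90:
        raise ValueError("angle out of range")
    return round(center + angle_deg / DEGREES_PER_UNIT)

print(angle_to_position(0))    # center servo position
print(angle_to_position(-45))  # 45 degrees below center
```

A helper like this keeps the per-servo calibration in one place instead of scattering raw position numbers through the Blockly program.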



#3   — Edited

Hi there again - To begin, you'll want to use the Auto Position Movement Panel robot skill to create a gait for walking forward, reverse, and turning left and right.

Here's a link to the getting started manual explaining what a gait is and a link to the Auto Position Movement Panel manual:

Once you create a gait, you can then control it from Blockly, or a joystick, or however you wish. But let's begin by getting your robot walking with Auto Position frames and actions.

Blockly (or any other programming language you choose) will connect the various behaviors together. WALKING is a gait, and a behavior. In this case, the behavior is controlled by the Auto Position robot skill.

Here's a link to the Auto Position manual:

PS, the speed values are reversed from what you'd expect. The manual explains more; here's a copy and paste:


Sets the speed of the servo port to speed. speed is a value between 0 (fastest) and 10 (slowest).



Thanks again for your suggestions, but this Auto Position feature is obviously something for "upright walking" robots.

My hexapod robot is based on an insect-type design --> totally different geometry and servo distribution.

But - never mind - during the last two days I have written the complete code in BLOCKLY. It is quite complex now, comprising individual servo calibration (speed & positioning), and exported to JavaScript it shows some 600+ lines so far :-).

Experimenting with servos was going straight with your hints and from there it was a pleasure to create the Blockly code.

There is only one question left: robot remote control under BLOCKLY. I could not find any block addressing something like "getkey", "if keypressed", or similar. Only the script version offers something like that.

Any suggestion how to realise robot remote control via keyboard under Blockly ?

Thanks very much in advance !




The Auto Position is used in tens of thousands of hexapod robots in ARC. I highly recommend using the Auto Position control. Create a frame for the walking movement in each direction (forward, left, right, reverse, stop) and assign them to the appropriate actions.

If you still wish to create complicated Blockly servo animations, then assign each walking loop to a Custom Movement Panel. The problem you will face is that the transitions between directions will be lost this way, and the servos will jerk and cause havoc. Here's a link to the Custom Movement Panel:

Again, I really really really recommend you use the auto position.

Anyway, once you've finished whatever Movement Panel you use (hopefully Auto Position), you can choose multiple ways to control it. The getting started guide has a section on that. Here's the link, again:

good luck - and please use the Auto Position control.



Thanks for your concern - and please don't be worried.

BLOCKLY is working marvellously here, since the mechanical behaviour of my hexapod is broken down into six basic motions (forward, backward, turn left/right, and crab left/right); no intermediate motion details are required.

These motions, as well as the overall "insect" behaviour, are broken down into nested functions (currently six levels). In addition, all control parameters are indexed (multiple layers) as well. This is a comfortable approach for creating software very close to machine (bit) level.
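The layered structure described above (basic motions built from nested, indexed functions) might look roughly like this Python sketch; all names, tables, and angle values are hypothetical illustrations, not the actual project code:

```python
# Hypothetical sketch of the layered gait structure described above:
# a top-level motion function, built from per-leg moves, built from
# per-servo moves. All tables and values are illustrative assumptions.

LEG_COUNT = 6
SERVOS_PER_LEG = 3

# Indexed calibration table: center position per servo [leg][joint]
CENTERS = [[105] * SERVOS_PER_LEG for _ in range(LEG_COUNT)]

def set_servo(leg, joint, angle):
    """Lowest level: one servo, calibrated position."""
    return CENTERS[leg][joint] + angle  # here ARC would send the command

def move_leg(leg, angles):
    """Middle level: one leg = three joints."""
    return [set_servo(leg, j, a) for j, a in enumerate(angles)]

def motion_forward(step):
    """Top level: one of the six basic motions, one gait step."""
    # alternate leg poses by step parity (purely illustrative values)
    phase = [10, -20, 15] if step % 2 == 0 else [-10, 20, -15]
    return [move_leg(leg, phase) for leg in range(LEG_COUNT)]

print(motion_forward(0)[0])  # calibrated positions of leg 0 on an even step
```

The indexed calibration table plays the role of the "multiple layers" of control parameters: each level only needs indices, never raw servo numbers.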

Looking back on a 35+ yr career as a physicist (PhD) in the space industry - having worked with some 10 programming languages, including ASSEMBLER, FORTRAN, FORTH, PASCAL, etc. - I am confident that Blockly will fulfill my needs for the current private project.

In a nutshell: Many thanks again for your suggestions and support; I will report first success concerning my walking insect by a link to a video clip.



#8   — Edited

Okay - I want you to have the best experience with your robot and surpass what was expected of it. Know that the Auto Position uses algorithms for fluent movements at each frame transition. It also binds to ARC's movement system, so all other controls can utilize the direction movements, including things like the camera, joysticks, speech recognition, etc. The Auto Position is designed specifically for movement gaits, exactly like the one you're creating.

Here is how movement directions are used in ARC:


Thanks a lot ! !

By the way: does Synthiam have a dedicated channel where I could link a video of my 1 m large beetle discovering our garden?


You can upload the video here - and create a robot profile, which is cool for getting exposure. We have a YouTube channel here.

I'm whipping up a video that explains the Auto Position in a bit more detail for ya - I'll post it in about half an hour. Just editing it now :D

BTW, super stoked to see your robot.


Here you go - check this out


Thanks very much - the code for my 1 m "insect" is now ready to run (need to wait for better weather).

But - before I forget:

You have this interesting Indoor Camera Navigation System ...

... for outdoor use, what about modifying this by:

-   Using one IR-sensitive camera on top of an outdoor robot (my large "insect").

-   Operating an omnidirectional high-power IR transmitter at the charger station (on top of a corresponding shelter).

I have an EZ-B v4/2 plus the camera.

Application: The robot would crawl through the garden until it is time for battery recharging. From then on it would stop all current activities besides what is essential in the now-active survival mode: searching for the charger station. After some minutes of looking around, it would recognise the IR beacon and crawl towards the station for charging.

This technical solution is based on real-world ADF navigation, which I used during my flying activities; quite simple and reliable.
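The ADF-style search described above (scan headings, lock onto the strongest IR return, then crawl toward it) can be sketched as follows; the sensor model and all values are invented purely for illustration:

```python
# Hypothetical ADF-style beacon homing: scan headings for the strongest
# IR return, then walk toward that bearing. The sensor is simulated.

def scan_for_beacon(read_intensity, headings):
    """Return the heading with the strongest beacon signal."""
    return max(headings, key=read_intensity)

# Simulated intensity: beacon sits at a heading of 120 degrees
def fake_sensor(heading):
    return max(0, 100 - abs(heading - 120))

headings = range(0, 360, 10)
best = scan_for_beacon(fake_sensor, headings)
print(best)  # -> 120
```

On the real robot, `read_intensity` would be replaced by a camera measurement at the current heading, and the robot would repeat the scan-and-crawl cycle as it closes in on the station.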

Local navigation to finally establish electrical contact between the robot and the charger station is another task in itself. An optical VOR-based navigation principle could be used. If working properly, it could even replace the above ADF approach.

What do you think ?

Max. range would be some 50 m.




Unfortunately IR doesn't work outside because the sun emits IR. That's the reason we called the IPS the Indoor Positioning System.

However, there are options for outdoor navigation using a high-resolution camera and a glyph - or even if the robot were a specific color. Outdoors is difficult for navigation with most lidar or Intel RealSense units as well.

Maybe even a buried electric fence and guide wires?


There is a quite simple solution for suppressing unwanted broadband sunlight: a small and extremely narrowband interference filter in front of the camera on top of the robot.

By these means, more than 99.9% of sunlight will be blocked, and the SNR for the narrowband IR transmitter is drastically increased by orders of magnitude.
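As a rough sanity check on the 99.9% figure, one can compare a 2 nm passband to the width of the solar spectrum. The flat-spectrum assumption below is a deliberate simplification, so this only shows the order of magnitude:

```python
# Rough, illustrative estimate: a flat solar background over ~300-2500 nm
# vs. a 2 nm interference-filter passband. Real spectra are not flat,
# so this is only an order-of-magnitude check.

solar_band_nm = 2500 - 300   # assumed broadband background width
passband_nm = 2              # narrowband interference filter

in_band_fraction = passband_nm / solar_band_nm
rejected = 1 - in_band_fraction
print(f"{rejected:.4%} of the broadband background is rejected")
```

The beacon's light, sitting inside the passband, is unaffected, which is where the large SNR gain comes from.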

I spent some 30+ years in the development of REAL lidar stations (emitting some kW of laser power), and we always successfully used this technique.

Receiving signals from some 10 km in the zenith direction during bright midday summer conditions was never a problem. We could easily count single photons of the backscattered laser signals, just by suppressing unwanted sunlight by these means.


Sounds really awesome - but your lasers were not in the infrared spectrum. I'm not sure how you could filter IR outside in sunlight. How would you "suppress unwanted sunlight" in the infrared spectrum outside?

Sounds like you have done some great stuff - so see if you can create an outdoor IR beacon navigation system. Looking forward to seeing what you come up with, because a lot of people would be excited :D



We used lasers in the near and far IR region (optical parametric oscillators near 1500 nm and a CO2 laser @ 10 µm, respectively) as well as visible lasers (dye laser @ 589 nm) and UV lasers (tripled Nd:YAG @ 355 nm), and more.

Physically, IR light or UV light is nothing other than visible light, just characterised by longer or shorter wavelengths.

The only difference from VIS light: the human eye does not recognise it.

The optical spectrum of the sun (intensity distribution as a function of wavelength) follows Planck's law, meaning that the maximum intensity is somewhere in the green (--> plants have green leaves: optimised absorption of sunlight).
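The position of that maximum can be checked quickly with Wien's displacement law, a standard consequence of Planck's law:

```python
# Wien's displacement law: lambda_max = b / T.
# Standard displacement constant and an effective solar temperature of ~5778 K.

b = 2.897771955e-3   # Wien's displacement constant, m*K
T_sun = 5778         # effective solar surface temperature, K

lambda_max_nm = b / T_sun * 1e9
print(round(lambda_max_nm, 1))  # ~500 nm, in the green part of the spectrum
```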

The solar spectrum, however, also extends into the UV region as well as into the IR region.

Applying a narrowband filter in front of a camera simply rejects all unwanted wavelengths of the complete solar spectrum besides a specified and extremely narrow wavelength band. Only light at this wavelength will pass to the camera chip.

In other words: for such a camera it is nearly pitch dark even in the sun at noon, but an IR LED will pass such a filter without intensity losses, finally increasing the above-mentioned SNR.

I know this stuff is a bit hard to explain, but - be assured - it works :-).

Therefore, I guess your navigation principle could really be a quite useful tool for my large insect.

We could do some joint development, if you like ...

... but at first my insect needs to walk and navigate autonomously (ultrasonic sensors and tentacles, both in the head front section).




Yes, I remember bringing the Robosapien media robot to my mom's birthday party, and she loved it as it was the size of a little kid. When she took it outside, the infrared from the sun made it go crazy, LOL!

#18   — Edited

It isn't hard to explain - we've just been unable to make it happen. There are dozens of filters we have used on the cameras, and none remove the daylight IR while still passing the IR transmitter's signal.

With every filter we've used on the cameras, the IR from the sunlight could still be seen. IR LEDs are unfortunately within the same spectrum as sunlight. I still do not know how you can filter it out. I can't find information or papers on the internet either about being able to use IR outside during the day - which is also why all drone path planning and tracking use IR at night only.

Any information that provides an example of using IR cameras and transmitters in daylight would be welcomed.

All you would need to do to use our navigation system is tell me where to get a filter that allows IR from the transmitter to go through while filtering all daylight. If you have that, it will just work outside. You can build it yourself; the hardware files are open source. Just add your custom daylight IR filter and it'll work for you.



Is it possible that you used "conventional" IR filters?

In my text I was talking about so-called interference filters (sorry that I forgot to mention this specific term). They usually have a transmission bandwidth of a few nanometers (nm). Only this class of filters is able to suppress unwanted IR from the sun as required. Others are too broadband (--> low SNR).

To be honest, I have to mention that such filters can be quite expensive; one has to check the market for surplus and other options.

A 2" diameter filter with a bandwidth of some 2 nm may cost around 2k$ (new and custom specified).


... the aperture of the camera is very small (around 2 mm), so that - theoretically - one could produce some 730 tiny filters from such a 2" element.

Considering losses from cutting etc., about 220 tiny filters could be obtained, resulting in a cost of about 10 $ per filter. NOT THAT BAD!
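The dicing estimate can be parameterised as below; the die size, wafer cost, and especially the yield fraction are assumptions for illustration, and real numbers depend on the cutting process:

```python
import math

# Illustrative dicing estimate: how many ~2 mm filters fit on a 2" element,
# and what each costs. The yield is a guess; real numbers depend on process.

wafer_diameter_mm = 50.8   # 2 inch
die_size_mm = 2.0          # square die edge length
wafer_cost_usd = 2000.0

wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
dies_theoretical = int(wafer_area // die_size_mm**2)   # area-limited count

yield_fraction = 0.45      # assumed usable fraction after cutting losses
dies_usable = int(dies_theoretical * yield_fraction)
cost_per_die = wafer_cost_usd / dies_usable

print(dies_theoretical, dies_usable, round(cost_per_die, 2))
```

Even with a pessimistic yield, the per-filter cost stays in the range of a few dollars, which is the point of the estimate above.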



#21   — Edited

Go give it a shot and let me know how it works! You'd be changing the world of outdoor robot navigation if you could do it :)

PS, here's the link to the hardware and firmware. All you'd need to do is put your special IR filter on the IR camera. The other camera is for visible light so you can see where to set up waypoints and such:


Hey guys, to clarify, it's the "daylight filter" that allows IR to pass through. The IR camera filters daylight and allows IR to pass and hit the CCD.

The Visible Spectrum Camera allows daylight to pass and has an IR blocking filter.

The IPS uses infrared light with an 850 nm wavelength. I've never used a 1500 nm LED before, but it seems like a fun thing to try!


Hi Jeremy,

In principle yes, but keep in mind that these filters are so-called "edge filters" or "cutoff filters", meaning that they remove only the short-wavelength part of the spectrum.

What I was proposing is a "bandpass filter", leaving only a very narrow "spectral window" for the light to pass. But - as said - these can be very expensive.

Bottom line: it is surely worth trying a cutoff filter first!

The LED wavelength will certainly be in the NIR, around 800 nm - 900 nm. These 1.5 µm devices are laser diodes, not LEDs anymore, and they are quite expensive as well.

In order to increase camera SNR one has another two options at the transmitter side:

-   Easy: increasing the LED power (there are enough ampères available at a charging station).

-   More demanding: adding transmitter optics (a kind of ring-shaped cylindrical lens around the LED(s)) that concentrates the LED light into a more horizontal plane and does not waste photons in directions where the robot will never be.

I guess I will approach this topic in spring 2021. Beforehand, my 1 m insect needs to "learn" to walk properly.


I'm pretty stoked to see this huge guy moving about. You'll have to connect him to Exosphere and let me take a spin sometime :D