
Omnibot/Camera Tracking:


The camera tracking isn't as "right on" as in the tutorial. I'll walk or wave my hand past the robot's head, and it will start to follow, but then it will go the other way, then back again. It's definitely not locked in like in the Omnibot tutorial.

The head also will flutter back and forth at times, and the picture is shaky.

Any hints?

Thanks, Den



United Kingdom

What is the lighting like in the room you have tested it in? I get phantom faces due to low lighting levels, and this can send the head haywire.

On the camera control, choose Processed video and watch for phantom faces (a box around nothing means it thinks it's seeing a face). You can also adjust brightness and contrast to attempt to get better recognition.

If you also have movement tracking on, you may need to tweak the settings so it doesn't pick up its own movement.

South Africa

Is the camera perhaps mounted on any of the moving parts?


Thanks Guys,

This robot has more bugs in it than a beehive!

The room is well lit. When I look at the computer screen, I see a lot of green while the head is turning.

I built a cradle for the head to sit on, connected to the servo wheel on top of the servo. The head goes back and forth, but won't lock onto anything most of the time. Maybe the weight is causing me some trouble?

Another question: in order to have the robot go straight and turn on its own, do I need to start writing scripts? I'm not familiar with this.

I have it sitting on a platform, and sometimes the wheels start turning on their own.

One thing at a time I guess!


United Kingdom

It shows green when it detects movement. When the head moves to track movement, it will detect its own movement too, causing it to get confused and go haywire. Turn off movement tracking, or change the settings to allow more time before it checks for movement again.

There is an old topic on that. I think it was more about the "noise" on the camera when movement tracking, but it turned out to be this issue. Any change in the image received by the camera is classed as movement, so if the camera itself moves and the whole image changes, it thinks it's seeing movement.

Hopefully that explained it well enough.

As for roaming around on its own, it can be done with scripts. Do you have object avoidance? Is there a set path you want it to follow? I have a basic script on my robot which makes him move forwards until he detects an object he will collide with, then he turns left and moves forwards again. The IR Object Avoidance script is in the scripts section and on the Cloud under examples; just adjust it slightly to include the movement commands. You could easily change it so the robot moves forwards at a random speed for a random time, turns in a random direction for a random time, then moves forwards again for a random time, turns again, and so on.
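To make that wander idea concrete, here is a rough EZ-Script sketch. It's untested and from memory, so treat the command names and timings as assumptions to check against the manual; it also assumes you already have a Movement Panel configured for your drive motors.

```
# Hypothetical wander loop - timings are just examples to tune
:wander
# Drive forwards for a few seconds (requires a Movement Panel)
Forward()
Sleep(3000)
Stop()
# Turn left briefly, then repeat from the top
Left()
Sleep(800)
Stop()
Goto(wander)
```

Swapping the fixed Sleep() times for randomised values would give you the random-roam behaviour described above.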

The wheels shouldn't start turning on their own. What do you have driving it, modified servos or an H-Bridge? My H-Bridge will sometimes start moving the robot without being asked to, but only when powering the robot on; once it's powered up, connected to ARC and the Movement Panel has been used, it behaves itself.
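If the H-Bridge twitches at power-on, one belt-and-braces approach is a short script run right after connecting that forces everything to a stop. This is only a sketch: the digital ports below (D0-D3) are example assumptions, so substitute whichever ports your H-Bridge inputs are actually wired to.

```
# Hypothetical connect script to settle a twitchy H-Bridge
# Stop the Movement Panel first
Stop()
# Drive the H-Bridge input pins low (D0-D3 are assumed ports)
Set(D0, OFF)
Set(D1, OFF)
Set(D2, OFF)
Set(D3, OFF)
```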


Thanks! I will definitely give that a shot. I figured the movement of the head is probably at least one cause.

I have an H-Bridge running my two motors. I'm going to have to learn to write scripts for commands; I've never done it before.

I'll give it a shot, and let you know how I made out.

Thanks again, Den

United Kingdom

We all start from somewhere. The good thing is EZ-Script is very simple, and the manual for the commands is on the right-hand side of the interface at all times.

Think of the scripts logically and you will be able to write it.

Say you wanted the robot to move around until the collision sensor detected a possible collision. Think about how the robot would do this and write it with the commands:

# Move forwards
Forward()
:loop
# Check for collision
# Get sensor info
$collision = GetADC(ADC0)
# Check against a predetermined level
IF ($collision > 200)
  # Stop and turn left
  Stop()
  Left()
  Sleep(500)
  Forward()
ENDIF
# Go back to the start and loop
Goto(loop)

This code is from memory without EZ-Script open and I haven't tested it. My code may be incorrect but you should be able to see what I am getting at.

Also, check the many examples in the Cloud, the scripting part of the forum and the examples in ARC for ideas and help with commands.


Tracking people depends on a lot of things: lighting (which you can add lights to help with), the speed of the camera (not much you can do about that except get a better camera), camera resolution (a better camera like an HD Logitech is fairly cheap, but needs an on-board computer), plus the speed of your computer to process the data (video takes up a lot of processing speed and memory resources).

A couple of ideas: look for a WiFi camera if you're not using an on-board computer, and add a light to your robot. Removing some programs from your computer, or using a separate computer only for EZ-B robots, and reducing the camera image size so it processes data faster will all help some.

United Kingdom

Camera resolution is unimportant for face tracking; low resolution will still track very well. QR codes, glyphs, custom Haar cascades etc. could probably benefit from a higher resolution, but face tracking doesn't. This has been covered in a previous topic on here.

Video processing for the basic commands in ARC doesn't require a lot of resources. I have been testing my build on my HTPC whilst streaming HD movies or FLAC music and running other things in the background too, and it has had very few problems (the only slowdown was during poorly encoded 1080p movies). My HTPC is old, with only 4GB RAM and a dual-core 2GHz AMD CPU, so it's pretty much classed as a low-end machine these days.

Lighting is the main issue with tracking anything, then for some things resolution (although the EZ-Kit camera has good enough resolution, as do pretty much all currently available cameras).


Here are some more ideas you can try: look at the camera, adjust the settings for the best picture, add light, and see what happens. In a house there are many lighting factors that can be a problem: night and day, window blinds open or closed, a dark room the robot enters.

So to fix this you need a light detector, which is a fairly simple sensor. Some use photocells and some use IR diodes; you measure lux as a voltage on an analog input and use the values to adjust for the lighting conditions. I am mostly an expert at sensors of all types, so if you need info on this let me know.
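As a rough illustration of that idea in EZ-Script: a photocell in a voltage divider feeds an analog port, and the script switches a lamp when the reading drops. This is an untested sketch; the port names (ADC0, D5) and the threshold of 80 are assumptions you would need to calibrate for your own divider.

```
# Hypothetical light-level check: photocell divider on ADC0, lamp on D5
:check
$lux = GetADC(ADC0)
# Below the (assumed) threshold means the room is dark - turn the lamp on
IF ($lux < 80)
  Set(D5, ON)
ELSE
  Set(D5, OFF)
ENDIF
Sleep(1000)
Goto(check)
```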


I am dealing with cameras a lot, and most of what you said is kind of wrong; the quality of the camera is very important. I have over 20 different cameras and have been testing them for face tracking for a long time, and the quality of the camera helps a lot. Maybe not as much on the EZ-B, which I will start testing sometime soon, but it shouldn't make any difference.

Rich, how many cameras do you have and have tested?

On video processing speed there is a simple test: open Windows Task Manager, look at the Performance tab, and check the CPU and memory usage.

United Kingdom

Here we go again...

I'm sure the resolution issue was commented on by DJ in a previous topic, and that is what I based my reply on. Personally, I will take the word of the guy who wrote the software over anyone else's. What I have is unimportant; as stated, the EZ-Kit camera is good enough for tracking, so testing other cameras would be redundant.

Windows Task Manager is not a very good tool for benchmarking and logging. It's also relative to your specific setup; a CPU load of 25% on one PC could be very different on another due to CPU speed and the number of active cores.


Very soon I will be doing some tests on the EZ-Robot Bluetooth camera, using my computer and Windows Task Manager, to see if I can take a snapshot of the video processing load (the Task Manager figures depend on how many programs are running on the computer).

Then I will post the results; if you want to run the same test yourself, you can.

Rich, you yourself said I am an expert at sensors, like you are at software.

All I do is test every type of sensor ever made, plus a lot at my work, and a camera is a sensor. A long time back I posted, at a robot club, photos and data on almost every type of webcam using OpenCV for face tracking.

I have not yet tested a Bluetooth camera on the EZ-B; that will be my next test, using face tracking and more. So right now it's hard to say who is wrong or right.

But at least with a USB webcam it makes a big difference: use a faster frames-per-second rate and higher resolution, then in software set it to a lower resolution for faster processing.


DJ may be right, but tests are much better. The only ones I have are USB types, and they may give better tracking; it may not be needed, but it can make things better.



I worked with the robot tonight. In the camera configuration, what is a good setting for the horizontal increment box? I raised it to 30 and it seems to be working better. The strange thing is that when the head follows something, the motors get a quick surge and the robot jars, like the tracking is affecting the drive control.

Also, how many frames should it check under the motion screen? I have 7.

I put the robot on a block and let the motors run. Standing behind it, I put my hand in front of the camera to see if the head would turn and the motors shut off.

Thanks, Den

OK, I may have made progress. Inside the grid there is a blue box that captured the red ball. The head keeps turning back and forth, and as the picture changes with the head movement, the ball remains in the box. The head just won't stop moving.


Until I have my camera setup made, I can't help you yet. My setup is simple: a pan and tilt on a board, using the EZ-B and a script, plus different lighting conditions. For the output, instead of motors I am using LEDs, red and green for forward and reverse. I do have a motor controller like the one DJ is selling, only with LEDs, so I might use that.

Maybe Rich or others might be able to help you with the settings.

After I have done my testing I will do a new post on my findings.

United Kingdom

Is the setting turned on that allows the camera tracking to move the robot? Upload your project file to the EZ-Cloud and I'll take a look at the settings; that's probably the easiest way.

It's detecting the ball just fine, but something somewhere is not set up correctly. The ball (with the box around it) should be in the middle of the grid; if it isn't, the camera servos will move so that it is. It could be a number of things, so it's best to upload the project file and we can see what we are dealing with.



I was working on it before I read your reply. I went and started another program from scratch. The motors started to work correctly, and I ran it across the floor; it was going around obstacles, so I was happy. I'm going to try programming the camera again and I'll upload after that. I probably won't have time until tomorrow. Thanks for the support!