
Camera Tracking

First let me say I love the EZ-B. So many possibilities! Thanks for the wonderful product.

I've been working with the camera tracking and I have a few thoughts.

On color tracking, how hard would it be to add RGB values to track any specified color? Or, better yet, real-time color sampling from the camera? Example: place any solid-color object in the center of the camera view and issue a "track this color" command. It would be great for tracking random objects during demos, or for adapting to varying lighting conditions.
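To illustrate the idea, here is a minimal sketch of "sample the center, then track that color" in plain Python. This is not ARC's API; the frame is just a 2D list of (R, G, B) tuples, and the function names are made up for the example.

```python
# Hypothetical sketch: sample the color at the frame's center, then build
# a mask of matching pixels. A frame is a 2D list of (R, G, B) tuples.

def sample_center_color(frame):
    """Average the RGB values in a 3x3 patch at the frame's center."""
    h, w = len(frame), len(frame[0])
    cy, cx = h // 2, w // 2
    patch = [frame[y][x]
             for y in range(cy - 1, cy + 2)
             for x in range(cx - 1, cx + 2)]
    n = len(patch)
    return tuple(sum(p[c] for p in patch) // n for c in range(3))

def matches(pixel, target, tolerance=30):
    """True if every channel is within `tolerance` of the target color."""
    return all(abs(p - t) <= tolerance for p, t in zip(pixel, target))

def color_mask(frame, target, tolerance=30):
    """Binary mask of pixels close to the sampled color."""
    return [[matches(px, target, tolerance) for px in row] for row in frame]
```

The `tolerance` parameter is what would let this adapt to lighting: a wider tolerance (or matching in HSV rather than RGB) tolerates brightness shifts better.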

On motion tracking in a cluttered room, the robot's movement of the camera causes it to track static background objects. I've noticed in the forums that others are adjusting the frame delay, sensitivity, or object size, or scripting tracking on and off. Would it be advantageous, or possible, to add an automatic pause to motion detection until the robot finishes moving the camera? Example: pause motion tracking while servos D8, D9, etc. are moving.
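The gating logic being requested could look something like the sketch below. Everything here is hypothetical: `MotionGate`, the servo names, and the per-frame `is_moving` dict stand in for whatever servo-status query the actual scripting API exposes.

```python
import time

class MotionGate:
    """Hypothetical sketch: suppress motion tracking while named servos
    move, plus a settle delay after motion stops."""

    def __init__(self, tracked_servos, settle_time=0.25):
        self.tracked_servos = tracked_servos   # e.g. ["D8", "D9"]
        self.settle_time = settle_time         # extra delay after motion stops
        self._last_motion = 0.0

    def update(self, is_moving):
        """Call once per frame with a dict like {"D8": True, "D9": False}."""
        if any(is_moving.get(s, False) for s in self.tracked_servos):
            self._last_motion = time.monotonic()

    def tracking_enabled(self):
        """Motion tracking runs only after the servos have settled."""
        return time.monotonic() - self._last_motion >= self.settle_time
```

The settle delay matters because the camera image keeps smearing for a frame or two after the servo itself stops.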

Thanks



#1  

I find that it's not the camera movement that triggers unwanted motion recognition, but the "resolution" routine: in other words, the processing of the frame to eliminate static deltas. I presume that is highly dependent on the processing power of the host PC.

That's why so many have different solutions.

I personally use the fine / gross sliders along with a variable used in the Sleep call at the top of my motion script for easy tuning.
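That tuning approach could be sketched roughly like this. The loop, `motion_delay`, and the callback names are all hypothetical stand-ins for the commenter's actual script; the point is just that a tunable sleep at the top of each cycle skips detections fired while the frame is still settling.

```python
import time

def motion_script(handle_motion, get_detection, motion_delay=0.3, cycles=5):
    """Hypothetical sketch: poll for motion detections, debouncing each
    cycle with a tunable `motion_delay` sleep (the 'variable in the Sleep
    call'). Returns how many detections were acted on."""
    handled = 0
    for _ in range(cycles):
        time.sleep(motion_delay)          # let the processed frame settle
        if get_detection():
            handle_motion()
            handled += 1
    return handled
```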

I really like the "wait" routines that DJ is building into the scripting language. A neat one might be something like "trackingWait(Camera, some state)", whereby the call would wait for the processed frame to clean up before continuing.
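A trackingWait-style helper could be sketched as "block until consecutive frames stop changing." In this hypothetical illustration, `get_frame` is any callable returning a flat list of grayscale pixel values; none of these names are real ARC functions.

```python
def frame_delta(a, b):
    """Mean absolute per-pixel difference between two frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def tracking_wait(get_frame, threshold=2.0, stable_frames=3, max_frames=100):
    """Hypothetical trackingWait sketch: return True once `stable_frames`
    consecutive frame deltas stay under `threshold` (the image has
    'cleaned up'); False if the scene never settles within `max_frames`."""
    prev = get_frame()
    stable = 0
    for _ in range(max_frames):
        cur = get_frame()
        stable = stable + 1 if frame_delta(prev, cur) < threshold else 0
        if stable >= stable_frames:
            return True
        prev = cur
    return False
```

The `max_frames` cap keeps the wait from hanging forever in a scene with continuous motion.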