
cyberdude
Guys, I just hooked up my camera. I hard-wired the leads to the camera circuit board (confirmed to be working) and set up a camera tracking widget.
When I have no servo movement selected in the config, the camera seems to track my hand just fine. However, when I check off camera tracking and set my neck and head servos, the processed display fills with a LOT of tracking delta and it doesn't seem to clear up. The "head" goes a little berserk as it tries to correct itself.
Here is the image...
I messed around with the sensitivity setting to no avail. It reduces the noise, but I still get a whole screen of delta noise.
Is it generally a matter of messing with size and sensitivity?
If your robot is physically set to move, you may want to use a tracking method other than "Motion", because, as you experienced, the robot moves and therefore the whole image moves.
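To picture why: motion tracking generally works by differencing consecutive frames, so anything that changes between frames counts as "motion". The rough Python/OpenCV sketch below is only an illustration of that idea (it is not the camera widget's actual code, and SENSITIVITY and MIN_BLOB_AREA are made-up stand-ins for the widget's sensitivity and size settings). A waving hand changes a small patch of pixels, while a panning or tilting head changes nearly every pixel, which is what floods the processed display with delta.
Code:
import cv2

SENSITIVITY = 25       # per-pixel difference threshold (stand-in for the "sensitivity" setting)
MIN_BLOB_AREA = 500    # ignore changed regions smaller than this (stand-in for the "size" setting)

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Difference between consecutive frames: a waving hand changes a small patch,
    # but a panning/tilting camera changes almost the entire image.
    delta = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(delta, SENSITIVITY, 255, cv2.THRESH_BINARY)

    changed_fraction = cv2.countNonZero(mask) / mask.size
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) >= MIN_BLOB_AREA]

    print(f"frame changed: {changed_fraction:.0%}, blobs over min size: {len(blobs)}")

    prev_gray = gray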
Here is the EZ-Script:
Code:
When I tried your script, the head would move up and down, and the processed video display would fill with delta pixels for a moment, then clear. So I'm not sure if that is the intended result.
I will try a few more experiments on this end this morning and report back. It seems close; I'm just not sure why, when I wave my hand in front of the camera, the tracking goes nuts and fills with room noise.
It certainly "sees" my hand moving, but it is "confused" by the tracking noise that gets generated around it, filling the frame.