User654543
USA
Asked
— Edited
Hello, I'm wondering if there is a way to use this to track a detected object, for example a two-axis head following a person. Thanks!
Related Hardware
EZ-B V4
by EZ-Robot
EZ-B v4/2 robot controller: dual Cortex ARM, Wi-Fi, audio/video, 24 servo/digital ports, I2C/UART, camera and Synthiam ARC control for custom robots
Connectivity: Wi-Fi / USB
Servos: 24
Camera: yes
Audio: yes
UART: 3
I2C: yes
ADC: 8
Digital: 24
Related Robot Skill
Darknet YOLO (Obj Detection)
by ptp
Tiny YOLOv3 CPU-only real-time object detection using a camera. Works offline; results (class and score) are available on-demand or via a script triggered on detection changes.
Requires ARC
v5

According to the manual for the darknet Yolo here: https://synthiam.com/Support/Skills/Camera/Darknet-YOLO-Obj-Detection?id=19863
There is a script that will run for every detected object. Because this robot skill is not embedded in the camera device robot skill, the pan and tilt would need to be custom scripted. That's easier than it sounds: you read the coordinates of the detected object and move the appropriate servo toward them.
If you use Tiny Yolo, it is integrated with the camera device robot skill: https://synthiam.com/Support/Skills/Camera/Tiny-Yolo2?id=20002
That means it supports the built-in tracking features of the camera device. So you need to enable servo tracking in the camera device and select the servos.
Alright, that'll definitely save some time. Appreciate the quick answer.