
ahmed eshra (Egypt) asked:
Can I use ARC to track multiple red objects and report the location of each one? They are not custom colors; they are just red.
I'd like to know what you are looking to accomplish. Maybe I can help with a new feature of some sort. What are you attempting to have your EZ-Robot do?
The Kinect can track multiple objects at one time.
So can our EZ-SDK, using either the Kinect or our camera. The EZ-SDK can do absolutely anything, and it is important for our industry and educational partners.
However, the question still remains: what is the goal? I do not understand how tracking multiple objects at the same time would integrate with the hardware.
Let me be more specific. If your robot is tracking a red ball, how do you expect it to behave with two red balls? Are there two heads on your robot? What is the robot doing with the second ball?
My question is: what are you attempting to have your robot do with more than one object at a time? This question is specifically relevant considering no one has pushed the limits of the existing camera control. Asking for more features when the existing features haven't been fully utilized is skipping a few important steps.
I "believe" what kudo48pa is trying to accomplish is to prioritize two objects within the same camera frame and "determine", based on EZ-SDK software (variables), which item to talk to, kick, or point at. Even with two faces in the same camera frame, one may be close (a large square) and one may be further away (a small square), and the programmer might want to talk to the close one and then the far one, or kick/move the closest red ball.
I hope that helps clarify the topic. I know that's a "feature" a lot of people may want to incorporate.
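For illustration only, here is a minimal EZ-Script-style sketch of that prioritization idea. The $Object1Width, $Object2Width, $Object1X and $Object2X variables are hypothetical placeholders invented for this example; ARC's camera skill does not expose per-object variables for two simultaneous detections today.

    # Hypothetical sketch: act on whichever of two detections looks closer (larger).
    # $Object1Width, $Object2Width, $Object1X and $Object2X are made-up variable
    # names for illustration only; they are not real ARC camera variables.
    IF ($Object1Width >= $Object2Width)
      # Object 1 fills more of the frame, so treat it as the closer target
      Print("Acting on object 1 at X: " + $Object1X)
    ELSE
      Print("Acting on object 2 at X: " + $Object2X)
    ENDIF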
I want to get exam papers checked.
In the EZ-SDK you can do that. I do not feel it is a feature for ARC, because it would not be a very intuitive way to allow the average user to select the object to track.
To correct your face example: the software currently selects the closest (largest) face, which means that face is in front of the other one. The face in the back is far away, and if it moved enough to get a reaction from the robot it would end up hidden behind the face in front. So a face behind another person cannot be tracked, because the robot's camera cannot see it.
I think it's best if you picture yourself as a human and pretend you're a robot. What exactly does a human do with tracking two objects? We are only able to track one focal point at a time; that's not a limitation, because we're good at it.
Using ControlCommand() you can already specify which object tracking method to use. You can also add multiple colors. There are more than enough features to fulfill your interactive requirements.
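As a rough illustration, here is a minimal EZ-Script sketch that starts the camera, enables color tracking, and reports the tracked object's position. The command and variable names used here (CameraStart, CameraColorTrackingEnable, $CameraIsTracking, $CameraObjectX, $CameraObjectY) are assumptions; take the exact names from the Camera skill's Cheat Sheet in your own ARC install, as they may differ by version.

    # Minimal sketch: start the camera, enable color tracking, and report the
    # tracked object's location. Command and variable names are assumptions;
    # confirm them against the Camera skill's Cheat Sheet in ARC.
    ControlCommand("Camera", CameraStart)
    ControlCommand("Camera", CameraColorTrackingEnable)

    :loop
    IF ($CameraIsTracking = 1)
      Print("Object at X: " + $CameraObjectX + " Y: " + $CameraObjectY)
    ENDIF
    Sleep(250)
    Goto(loop)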
Until my innovative little brain can find an appropriate and simplified use for multi-object tracking, ARC will remain human-like by limiting itself to one object (focal point) at a time.
Thanks, DJ, for the speedy and clarifying reply.
"I think it's best if you picture yourself as a human and pretend you're a robot. What exactly does a human do with tracking two objects? We are only able to track one focal point at a time; that's not a limitation, because we're good at it." Good idea! Call me Mr. Roboto.
Haha, maybe that's why I've been successful at designing robot skill controls... My alter ego is a robot! This must apply to all of you also... Guess we're all robots