
Questions With Object Tracking

Playing around with Object Tracking tonight and I can't get it to work. I watched The Robot Program 46 to make sure I was doing it right. I have trained a few objects, and when I try to track them nothing happens. I don't get a blue box on my camera screen, nor does it show that any tracking is taking place. Color and QR tracking are working, but not the object tracking. Any help would be appreciated. Thanks



PRO
Synthiam
#1  

You must be missing a step from the object tracking video. Try it again :) Have fun!

#2  

Hi, I have a similar problem. I loaded everything, got it working, and saved it. When I went back to use it again, it wouldn't run. I reloaded the objects, etc., and got it working again. I just tried it again and it's not working? No blue box.

#3  

Hi Bob,

I just played with my saved project some more and found that the back lighting is very important, even though I used the same lighting, object, and program yesterday. The blue box took a while (3-4 seconds) before it selected an object (my face), but it finally came up. The second and third objects took less time but also had a delay. I saved the project and loaded it again, and it seems to be OK.

I have a simple speech recognition script running it, but I had duplicated DJ's example during earlier tests.
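
For reference, the phrase script is just a couple of lines like the sketch below. The ControlCommand names here are from memory, so treat them as assumptions and check the Cheat Sheet tab of your own Camera control if they don't match:

    # Speech Recognition phrase: "start tracking"
    # Command name assumed - verify it against the Camera control's Cheat Sheet
    ControlCommand("Camera", CameraObjectTrackingEnable)
    Say("Object tracking enabled")

    # Speech Recognition phrase: "stop tracking"
    ControlCommand("Camera", CameraObjectTrackingDisable)
    Say("Object tracking disabled")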

Ron

PRO
Synthiam
#4  

There are important things I say in the video that I'm trying to avoid repeating. If your object does not completely fill the learning square, then the background will become part of the learned object.

The computer does not know depth. It has one eye (the camera) and can't know anything other than what is in the square. If there's background in the square, it will think that's part of the object. Understand? While learning, it assumes everything in the square IS the object.

So find objects with details that the computer can learn, not the entire object itself.

Also, lighting is important, because when the color or brightness changes, the object looks different. The computer sees the object as a two-dimensional thing, so rotating it will also make a difference.

Lastly, you MUST slowly move the object through different angles while learning in the square, so it learns what the object looks like at each angle. Because the computer sees the object in two dimensions, rotating the object makes it look like something entirely new.

Tutorials, such as The Robot Program episode in question, are a collection of instructions for learning how to do something. It takes us many, many hours to make one instructional video, and therefore I recommend using it as a resource. It isn't efficient for me to re-explain what is already covered in tutorials - especially while people are expecting new features, new support, changes, etc... I can only do one thing at a time :). So if my answer is vague and refers to the episode, it's because I'm swamped and don't have time.

#5  

Thanks DJ. The issue I have, and I think Bob has as well, is that the blue box fails to come up on the screen. The position of my face is in the same location and the lighting is the same, but the tracking box does not come on. I will try to figure out why it doesn't work consistently.

#6  

Thanks for the comments. I notice that the Camera control in the video is different from the one in the version of ARC I downloaded.

Control in the video:

User-inserted image

Control from ARC:

User-inserted image

I don't know if that makes a difference.

#7  

Hi Bob,

Have you been able to get it to work yet?

#8  

I haven't had time today; tonight, perhaps.

#9  

If you follow the 046 tutorial, it does work.

I don't know if my issue is something else, but when I reload (re-teach) the faces, it begins to work again. I am even able to have it recognize a face in real time from a picture I have taken with my phone and loaded in.

It works with my IoTiny and even with the webcam on my computer.

Try a clean project. This way the camera doesn't have anything else running.

Be sure the background is not "busy". A plain wall is best.
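
If you want to confirm whether a detection is firing at all (even when the blue box is hard to spot), a short script like this sketch will print the camera's tracking variables to the console. The variable names are what I believe the Camera control sets - confirm them in ARC's Variable Watcher before relying on them:

    # Poll the camera tracking variables once per second
    # $CameraIsTracking and $CameraObjectName are assumed names - confirm in the Variable Watcher
    :loop
    IF ($CameraIsTracking = 1)
      Print("Tracking: " + $CameraObjectName)
    ELSE
      Print("Nothing detected")
    ENDIF
    Sleep(1000)
    Goto(loop)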

PRO
Synthiam
#10  

I wrote the answer and it's in the post above. I'm not sure I can rewrite it a different way. I reread it a few times and it seems super clear to me, so I'm not sure what isn't making sense. Let me try again...

  • Consistent lighting
  • The background being trained along with the object
  • Consistent angles

Keep playing with it and you’ll make progress like you have done with the thousands of other things you’ve learned with ARC:). This is a technology that you’re using, and object training does require some understanding of how a computer works. So playing with it and experimenting is the best way.

Seriously, if you don't train the angles correctly, it'll never know what the object is unless it is at the exact angle it learned. I can't say that any differently.

This also applies to your face. Your smile. Your eyes. Your hair. Etc

It's training an image. It's learning images. Anything that isn't those images won't be detected. It only knows the images that you trained it on. It can't know anything else. I don't know how else to explain it.

And as far as the camera control looking different - it's the SAME CONTROL, but with new features. EZ-Robot adds features to our platform in every release. Discover the features and history by visiting the ARC page - click on the ARC link that's in blue.

You can discover more about ARC by clicking on the link in blue. See where it says ARC and it's in blue? Click on that - once you do, you can read more about the software, and there's even a release notes button. If you view the release notes, you will see that ARC is updated often with new features. The control has a few new features, which you can see, and they're listed in the release notes.

You can look at the differences and determine for yourself what those new features are. In this case, it’s super easy because you can see them.

Have fun! I’ll let you guys take it from here! Take care

#11  

I won't be commenting on this anymore; I seem to have hit a sore spot or something.