
Dunning-Kruger
I had briefly touched on this in one of my other threads.... Anyway, I thought of a good idea (well, I think so anyway) for a plugin or new feature for the camera control... Click steer.... What it would do is this: if you have a pan and tilt setup with the ezb4 camera, you would be able to click on an area of the camera image and the x and y servos would move to centre on the area you clicked.... For example, if you click on the top left corner of your camera's image, the x and y servos would attempt to centre the camera on that corner of the image...
Also, you could use the same principle to steer the robot while it is moving forward... Click left on the camera image and the robot will steer proportionally to the left depending on how far left in the image you clicked... And of course the same for the right... So you'd be steering the robot using your mouse and the camera image...
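To make the idea concrete, here's a rough sketch of the math I have in mind. None of this is existing ARC code; the class and method names are made up purely to illustrate.

using System;

public static class ClickSteerSketch
{
    // Turn a click position into offsets in the range -1..1, where (0, 0) is
    // the centre of the camera image.
    public static (double x, double y) NormalizedOffset(int clickX, int clickY, int imageWidth, int imageHeight)
    {
        double x = (clickX - imageWidth / 2.0) / (imageWidth / 2.0);
        double y = (clickY - imageHeight / 2.0) / (imageHeight / 2.0);
        return (x, y);
    }

    // Pan/tilt mode: nudge the x and y servos toward the clicked point.
    // gainDegrees is how far a click on the very edge of the image moves a servo.
    public static (int pan, int tilt) CentreOnClick(int currentPan, int currentTilt,
        double offsetX, double offsetY, int gainDegrees)
    {
        int pan = currentPan + (int)(offsetX * gainDegrees);
        int tilt = currentTilt - (int)(offsetY * gainDegrees); // minus: clicking above centre tilts the camera up
        return (pan, tilt);
    }

    // Drive mode: clicking further left or right steers proportionally harder in
    // that direction by slowing the wheel on that side.
    public static (int left, int right) WheelSpeeds(double offsetX, int maxSpeed)
    {
        int left = (int)(maxSpeed * (1 + Math.Min(0, offsetX)));  // a left click (negative offset) slows the left wheel
        int right = (int)(maxSpeed * (1 - Math.Max(0, offsetX))); // a right click (positive offset) slows the right wheel
        return (left, right);
    }
}

The steering part is just a proportional mix: a click dead centre keeps both wheels at full speed, and the further off-centre the click, the more the wheel on that side slows down.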
I could probably hack something up using the Remote mouse plugin... However, that would mean I could never move the camera control or it would throw everything off...
So here is a quick video of my docking routine that I mentioned above.... It uses colour tracking similar to @Bob's example... One tricky part was getting the robot to drive backwards while tracking, and while looking in a mirror...
Yes, there is a small mirror that the ezb4 camera is looking in while driving backwards... and tracking to boot... Yes, this is slightly cheating, as this robot was part of a kit meant for a mini ITX PC (Linux) and ROS... However, I wanted to see if I could mimic the Oculus ROS docking sequence using an ezb4 and ARC. It turns out my docking script works better than the Oculus one... It docks successfully 9 times out of 10... I still have the "big one" to tackle, and of course that is the autonomous navigation part...
Below is my code to do what is shown in the video:
Try this - and make sure you have the latest ARC: https://synthiam.com/redirect/legacy?table=plugin&id=58
I didn't test it yet, because I don't have any robots at my house, believe it or not! Isn't that weird? The source code is published in the plugin if you want to see how I did it. It's pretty quick and simple.
@DJ Thanks... I'll have to try it out tomorrow afternoon. I have a very early morning ... Again thank you...
.... I know, I know.... I really should get my a** in gear and learn how to do some plugins....
@Richard:
Nice touch: docking successful!
What kind of kit is that? Is the docking included too?
Charging circuit: did you build any electronics to manage the charging process?
Cheers
You can find the kit here... Oculus ROV.... If you get the basic kit, it is an awesome bot to use with an ezb4, and the LIPO charging control board comes with the basic kit to boot...
So, anyone going to take my plugin source code and make it steer?
I'll give you a hint. There's a class called EZ_Builder.EZBManager.Ezbs[0].Movement
And in there are .Forward() and .Right(), etc....
That's how you make the robot execute the directions of the added movement panel.
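For example, something roughly like this could tie a click on the camera image to those movement calls. Treat it as a sketch only: the PictureBox/MouseDown wiring is just illustrative, I'm assuming a .Left() alongside .Right(), and the published plugin source shows how the camera image is actually exposed.

using System.Windows.Forms;

public class ClickSteerHandler
{
    // Hook this up to the control that shows the camera image,
    // e.g. pictureBox.MouseDown += OnCameraMouseDown;
    public void OnCameraMouseDown(object sender, MouseEventArgs e)
    {
        var pictureBox = (PictureBox)sender;

        // -1 is the far left of the image, +1 the far right, 0 dead centre.
        double offsetX = (e.X - pictureBox.Width / 2.0) / (pictureBox.Width / 2.0);

        var movement = EZ_Builder.EZBManager.Ezbs[0].Movement;

        if (offsetX > 0.2)
            movement.Right();    // clicked well to the right of centre
        else if (offsetX < -0.2)
            movement.Left();     // clicked well to the left of centre
        else
            movement.Forward();  // clicked near the centre, keep going straight
    }
}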
@DJ I can do the basic logic coding, if/else etc., no problem... It's knowing what I am doing within Visual Studio and learning C# syntax that's the hard part....
@Rich
You'll get the hang of it. Most are based around C, so some commands are easy to grasp. It's the switching between languages for long periods at a time that can get you confused for a bit. Heck, even commenting can have a different syntax: -- // /* */.