Dunning-Kruger
I had briefly touched on this in one of my other threads.... Anyway, I thought of a good idea (well, I think so anyway) for a plugin or new feature for the camera control... Click steer.... What it would do is, if you have a pan and tilt setup with the ezb4 camera, you would be able to click on an area of the camera image and the x and y servos would move to centre on the area you clicked.... For example, if you click on the top left corner of your camera's image, the x and y servos would attempt to centre the camera on the top left corner of the image...
Also, you could use the same principle to steer the robot while it is moving forward... Click left on the camera image and the robot will steer proportionally to the left, depending on how far left in the image you clicked... And of course the same for the right... So you'd be steering the robot using your mouse and the camera image...
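The proportional-steering idea is really just a mapping from the click's horizontal position to a signed turn value. The math is language-agnostic (the actual plugin would be C#), so here's a minimal sketch in Python; the function and parameter names are invented for illustration and are not part of any EZ-Builder API:

```python
def steer_from_click(click_x, image_width, max_turn=1.0):
    """Map a horizontal click position to a signed turn value.

    Returns a value in [-max_turn, +max_turn]: negative steers left,
    positive steers right, 0.0 means dead ahead. A click at the image
    edge gives the maximum turn; a click at centre gives none.
    """
    centre = image_width / 2.0
    # Offset from centre, normalised to [-1, 1]
    offset = (click_x - centre) / centre
    # Clamp in case the click lands slightly outside the image bounds
    return max(-max_turn, min(max_turn, offset * max_turn))
```

As DJ notes below, the scale factor (`max_turn` here) would need tweaking per robot, since every robot turns at a different rate for the same drive command.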
I could probably hack something up using the Remote Mouse plugin... However, that would mean I could never move the camera control or it would throw everything off...
That's a good idea - but don't use the remote mouse. You would use any of the overlay plugins. On the panel which displays the image there is an event called "OnMouseClick". You simply need to grab the coordinates of the mouse click and perform the appropriate command (move servo or steer). Of course, steering can only be done using the HBridge with PWM. And because every robot is different, you would require many values to adjust/tweak for the turning speed, which is relative to the robot speed. Every robot will have a different speed. Also, the same applies to clicking on the window to move servos for focus. This would require some math to calculate the number of degrees based on the servo lever.
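The degree math mentioned here can be sketched as a linear mapping from pixel offset to servo offset, assuming the camera's field of view is known. A hedged, language-agnostic sketch (the FOV values and all names below are illustrative assumptions, not EZB camera specs):

```python
def click_to_servo_delta(click_x, click_y, img_w, img_h,
                         fov_h_deg=60.0, fov_v_deg=45.0):
    """Convert a click position into pan/tilt degree offsets.

    Assumes a simple linear mapping: a click at the image edge is half
    the field of view away from the current centre. Real lenses distort,
    so this is only a first approximation; centering would usually be
    refined over a few iterations rather than in one jump.
    """
    dx = (click_x - img_w / 2.0) / img_w * fov_h_deg   # pan offset, degrees
    dy = (click_y - img_h / 2.0) / img_h * fov_v_deg   # tilt offset, degrees
    return dx, dy
```

In the plugin, these offsets would be added to the servos' current positions (and clamped to the servo limits) rather than used as absolute positions.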
Wouldn't be too hard. Give it a shot!
Ok.... I will see what I can come up with.... Cheers
nice!
Ok, so to be clear... I need to download your example camera overlay and tweak the Visual Studio example to extract the "OnMouseClick" x and y coordinates? Ha, ha... I'll give it a shot. I'll load your example into Visual Studio and poke around and see if my seriously limited (actually pretty much non-existent) C# abilities can do this.... Assuming I can alter the plugin to accomplish this, would I need to publish it as a new plugin?
Correct
You would need to publish it as a new plugin if you wish to share it with others. Either way, it requires a unique guid, etc..
Crap, my 30-day trial version of VS has expired so I can't load the project....... Did I hear you mention there was a free version somewhere?
Free version is called Community Edition
https://www.visualstudio.com/en-us/products/visual-studio-community-vs.aspx
This is very similar to something I want to achieve, although more autonomous. I want to recognize an object, then drive towards it using a combination of pan/tilt and robot position adjustments, adjusting the camera as I get closer so the object is always centered in view. When the camera position indicates I have reached the object, execute an AutoPosition action to have an arm pick up the object.
I have been struggling with how to script this, and now think a plugin may be the way to go.
Alan
Hey @Alan... I guess I need to remove my 30-day trial version first before I install the one you linked to?
Probably, yes.
Alan
@thetechguru,
I do the first part of this with the camera controls. I have my InMoov robot look for an object (I use a red ball) by panning left and right. Once the object is centered, the robot moves toward it, keeping it centered, and it stops moving when it can no longer see it. It would be easy then to have it reach out and take an object handed to it. I haven't tried to get it to pick up an object off a table, but I think it is doable. I can do it with my robotic arm: it will search for, say, a red object, and once it finds and centers it in the camera, it reaches out and picks it up.

Bob, is your project posted so I can look at what it is doing?
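The search, centre, approach behaviour described above is essentially a small state machine driven by the camera tracker. A minimal sketch, assuming a tracker that reports visibility and a horizontal pixel offset (all names and the deadband value are invented for illustration):

```python
def approach_step(object_visible, offset_x, deadband=20):
    """One tick of a search/centre/approach behaviour.

    Returns (state, command). offset_x is the tracked object's
    horizontal pixel offset from image centre (negative = left).
    """
    if not object_visible:
        return ("search", "pan")          # keep panning until something is seen
    if abs(offset_x) > deadband:
        # Turn toward the object until it sits inside the deadband
        return ("centre", "turn_left" if offset_x < 0 else "turn_right")
    return ("approach", "forward")        # centred: drive toward the object
```

Stopping "when it can no longer see it" falls out naturally: losing the object drops the behaviour back into the search state, where the drive command is replaced by panning.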
Alan
Here it is,
Thanks Bob,
That will get me started.
My pleasure, always happy to share. If we go any further with this, I think we should start another thread. Although some of what you are trying to do is related to Richard's original question, I think we are close to hijacking this thread.
No worries guys... I get ideas from what you post and talk about here too....
Funny enough, I have something very similar to what Bob posted... It's an auto-docking routine for one of my rovers that uses colour tracking, a ping and an IR sensor (like the one I suggested to @Bob for his InMoov) to locate and dock to recharge.... I wrote it as a small part of my quest to achieve autonomous navigation with EZ Robot. It's a long way from being finished, but I might just make a video and post my docking routine...
So here is a quick video of my docking routine that I mentioned above.... It uses similar colour tracking to @Bob's example... One tricky part was to get the robot to drive backwards while tracking, and while looking in a mirror. Yes, there is a small mirror which the ezb4 camera is looking in while driving backwards... and tracking to boot... Yes, this is slightly cheating, as this robot was part of a kit meant for a mini ITX PC (Linux) and ROS... However, I wanted to see if I could mimic the Oculus ROS docking sequence using an ezb4 and ARC. Turns out my docking script works better than the Oculus one... It docks successfully 9 times out of 10... I still have the "big one" to tackle, and of course that is the autonomous navigation part...
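One subtlety in the mirror trick above: a mirror flips the image left-to-right, so the tracked object's horizontal offset has to have its sign inverted before it is used to steer. A minimal sketch of that correction (the names here are invented for illustration, not from Richard's actual script):

```python
def reverse_steer(offset_x, via_mirror=True):
    """Pick a steering direction while reversing toward a dock.

    A mirror flips the image left-to-right, so an object that appears
    to drift right in the mirrored view is actually drifting left of
    the robot's reversing path.
    """
    if via_mirror:
        offset_x = -offset_x            # undo the mirror's left/right flip
    if offset_x < 0:
        return "left"
    if offset_x > 0:
        return "right"
    return "straight"
```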
below is my code to do what is shown in the video
Try this - and make sure you have the latest ARC: https://synthiam.com/redirect/legacy?table=plugin&id=58
I didn't test it yet - because I don't have any robots at my house, believe it or not! Isn't that weird? The source code is published in the plugin if you want to see how I did it. It's pretty quick and simple.
@DJ Thanks... I'll have to try it out tomorrow afternoon. I have a very early morning ... Again thank you....... I know, I know.... I really should get my a** in gear and learn how to do some plugins....
@Richard:
Nice touch: Docking successful!
What kind of kit is that, and is the docking included too?
Charging circuit: did you build any electronics to manage the charging process?
Cheers
You can find the kit here... Oculus ROV .... If you get the basic kit, it is an awesome bot to use with an ezb4 and the LIPO charging control board comes with the basic kit to boot...
So, anyone going to take my plugin source code and make it steer?
I'll give you a hint. There's a class called EZ_Builder.EZBManager.Ezbs[0].Movement
And in there is a .Forward() and .Right() etc....
That's how you make the robot execute its direction of the added movement panel.
@DJ I can do the basic logic coding (if, else, etc.), no problem... It's knowing what I am doing within Visual Studio and learning C# syntax that's the hard part....
@Rich
You'll get the hang of it. Most languages are based around C, so some commands are easy to grasp. It's the switching between languages for long periods at a time that can get you confused for a bit. Heck, even commenting can have a different syntax: --, //, /* */.