
EDIT False alarm or?...as you were...
EDIT AGAIN I was right, a V5 is in development.... :)
@DJ, you forgot to mention this little addition in the release notes of the OCT 31st ARC update.... Maybe it's nothing but if you click on the "?" it gives a bit more info alluding to an ezb5 something.... "*Note This is experimental and used for the EZB v5 beta"... You sly dog, did you really think we wouldn't notice?...
One cool feature Oculus has is "Click Steer". When you click anywhere on the camera image, the pan/tilt servos attempt to centre the camera where you clicked.... If the robot is moving forward you can actually drive/steer the robot using Click Steer: wherever you click on the camera image, the robot steers towards it.... Maybe a new EZ-Robot feature...?
That wouldn't be too difficult a feature to add (click steer) as a plugin. You would only need the input from the mouse click on the video image, then some simple calculations to work out the pan/tilt movements required.
Although I guess it would be even easier for DJ to add that new control to the current camera control.
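The "simple calculations" could look something like this: a sketch that maps a click position to pan/tilt corrections. The frame size, field-of-view values, and function name are all assumptions for illustration, not EZ-B or ARC specifics.

```python
# Hypothetical sketch: map a mouse click on the camera image to the
# pan/tilt angle corrections needed to centre the camera on that point.
# Frame size and field of view are assumed values, not EZ-B specifics.

IMG_W, IMG_H = 320, 240      # camera frame size in pixels (assumed)
FOV_H, FOV_V = 60.0, 45.0    # horizontal/vertical field of view in degrees (assumed)

def click_to_servo_delta(click_x, click_y):
    """Return (pan_delta, tilt_delta) in degrees to centre on the click."""
    # Offset of the click from the image centre, as a fraction in [-0.5, 0.5]
    dx = (click_x - IMG_W / 2) / IMG_W
    dy = (click_y - IMG_H / 2) / IMG_H
    # Scale the fractional offset by the field of view to get degrees
    return dx * FOV_H, dy * FOV_V

# A click dead centre needs no movement; a click at the right edge
# needs half the horizontal field of view.
print(click_to_servo_delta(160, 120))  # (0.0, 0.0)
print(click_to_servo_delta(320, 120))  # (30.0, 0.0)
```

For steering while driving, the pan delta alone would feed the movement panel's turn direction.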
Adding a SLAM module to ARC would be awesome.
The quickest and easiest modules to work with are the Neato XV-11 lidar and an XV Lidar controller. The controller passes the information from the Neato back over USB. There are also public Python scripts so that you can see what is being done to convert this information to points on a map. For SLAM to work well, you need odometry on the wheels to measure the distance the robot has traveled. A Sabertooth/Kangaroo combo could be used for this along with wheels that have encoders. The Sabertooth/Kangaroo combo might not be strictly needed, but it sure makes things a lot easier.
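The core of what those public scripts do can be sketched in a few lines: each lidar reading is an angle and a distance, and the robot's encoder-derived pose turns it into an absolute map point. This is a generic sketch of the math, not code from any particular XV-11 script; names and units are assumptions.

```python
# Hypothetical sketch: convert one lidar sweep of (angle, distance)
# readings into absolute X/Y map points, given the robot pose estimated
# from wheel encoders. Units (mm, radians) are assumptions.
import math

def scan_to_points(scan, robot_x, robot_y, robot_heading):
    """scan: list of (angle_deg, distance_mm) pairs from the lidar.
    Returns absolute (x, y) points in mm on the map."""
    points = []
    for angle_deg, dist_mm in scan:
        # Lidar angle is relative to the robot, so add the robot heading
        theta = math.radians(angle_deg) + robot_heading
        points.append((robot_x + dist_mm * math.cos(theta),
                       robot_y + dist_mm * math.sin(theta)))
    return points

# A reading straight ahead (0 deg) at 1000 mm, robot at origin facing +X:
print(scan_to_points([(0, 1000)], 0.0, 0.0, 0.0))  # [(1000.0, 0.0)]
```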
From there, the fun begins. Storing the map as you travel through it and calculating your current map position probability based on what you see with the sensors becomes the next issue to tackle. All of this is stored in X/Y coords.
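A common way to store that X/Y map is an occupancy grid. Here is a toy sketch under assumed parameters (cell size, hit counting instead of a proper log-odds probability update) just to show the shape of the data structure.

```python
# Toy sketch of storing the map in X/Y coords: a dictionary-based
# occupancy grid where each sensor hit bumps a cell's hit count.
# The cell size and plain counting scheme are simplifying assumptions;
# real SLAM implementations typically use probabilistic log-odds updates.
from collections import defaultdict

CELL_MM = 50  # grid resolution in mm (assumed)

def update_grid(grid, points):
    """grid: dict mapping (cell_x, cell_y) -> hit count.
    points: absolute (x, y) sensor hits in mm."""
    for x, y in points:
        cell = (int(x // CELL_MM), int(y // CELL_MM))
        grid[cell] += 1
    return grid

grid = defaultdict(int)
update_grid(grid, [(120.0, 30.0), (130.0, 40.0)])
print(grid[(2, 0)])  # 2 -- both hits fall in the same 50 mm cell
```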
From there, path planning becomes the next fun part. You can use something like A* for this if you want. This is a popular path planning routine used in computer gaming. It calculates the least cost route to go from the current point to an X/Y coordinate.
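A* really is compact enough to sketch here. This is a minimal grid version assuming 4-way movement and a uniform move cost of 1, with a Manhattan-distance heuristic; a real robot would run it over the stored map rather than a hand-typed grid.

```python
# Minimal A* sketch on a grid: 0 = free, 1 = blocked. Assumes 4-way
# movement with uniform cost 1 and a Manhattan-distance heuristic.
import heapq

def astar(grid, start, goal):
    """Return the least-cost list of (row, col) cells from start to
    goal, or None if the goal is unreachable."""
    def h(a, b):  # Manhattan-distance heuristic (admissible on 4-way grids)
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    open_set = [(h(start, goal), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc), goal), g + 1,
                                (nr, nc), path + [(nr, nc)]))
    return None

# A wall across the middle row forces the path around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```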
Communicating this information over the WiFi channel used by the EZ-B could be problematic; there is a lot of data for sure. It would probably be preferable to do all of this on the computer running ARC instead of pushing all of that data through WiFi. It might also be better to build a SLAM board that attaches to the EZ-B, performs all of these functions, and passes back only the necessary information to the EZ-B, but this is just a guess based on what I have seen up to this point. SLAM hasn't been my focus up to this point, but it is definitely something that I have put some time into learning and experimenting with. It would be an amazing addition to ARC for sure.
@Richard. SLAM algorithms are graduate-level work, at least as far as my limited understanding grasps it. I would think that, at least with EZ-Robot, you would need a standardized platform with a standard sensor package to get a lot of EZers on board and to limit the tsunami of questions from users.
RoboRealm seems to be very rich with low-hanging fruit, especially the path planning section. RoboRealm Path Planning
The EZ-Robot plugin architecture is wide open for developing path planning and upgrading the floor mapping features. Personally, I've been daydreaming about what it would take to create a new floor mapping and path planning plugin.
@Levels.... I am way ahead of you... I have a video floating around I did to demonstrate crude path planning with roborealm... It uses the AVM navigator module instead of the one you posted above... It does indeed work but it's crude... SLAM is much better at this.... EZ Robot doesn't have to do ROS's version of slam... I mean DJ is capable of developing his own indoor navigation system that would be built into ARC as opposed to it possibly being a complicated add on if he used ROS's SLAM....
Here's the link of my video... AVM Navigator in roborealm
@David... I missed your post somehow.... Great information... I would be willing to mess with the Kangaroo/wheel encoders. I actually have 2 that I haven't used yet, but want to... I was also wondering how this can be done with just an ezb... I figured that a PC would have to be on board the robot, at least for using sensors like the Xtion or Kinect.... The Neato lidar may be a different story, especially with that controller you posted a link to... No matter how autonomous navigation gets done, I hope ez robot (@DJ) will be able to add this feature to ARC....
That oculus robot is quite disappointing, welcome to 2005... As for the ros - I'm always impressed with some of their Ph.D. contributions to the code base. I don't expect to create a slam control myself. With the open architecture ezrobot will license something or code around the ros library - why reinvent the wheel
specifically if there are inventors actively devoted to continuing improvements.
RoboRealm has been popular within the ezrobot community. I don't know what they have for slam - although I expect they would make something, because they're that kind of folk. The kind of folk that get $&@?$ done!
What matters to me at this point is what sensors are used for the popular ros slam module. If some are using Kinect. And some are using lidar. And some are using a blind mans cane. I'd like to know what the popular choice is surrounding the most active slam development. Because each one of those obviously requires a completely different approach to obtain data.
I'd like to know which of those ezrobot should support and voila, we are on the track to wrap a library.
slam isn't that difficult after reading about it. It's pretty much a reverse ray trace. Remember the PC 3D version of Wolfenstein? Just like that.... It's also how my original K9 navigated: keeping a collection of distances and transforming the 2d array based on the movements.
Also, the omnibot had a very, very simple 1d array, which helped him understand doorways and such. The code is in the sdk package.
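The idea of keeping a collection of distances and transforming them as the robot moves can be sketched in a few lines. This is generic 2D geometry under assumed names and units, not code from the SDK package.

```python
# Hypothetical sketch: keep stored (x, y) obstacle points valid in the
# robot's current frame after it moves by (dx, dy) and turns by dtheta
# radians. Standard 2D translate-then-rotate; units assumed to be mm.
import math

def transform_points(points, dx, dy, dtheta):
    """Re-express stored obstacle points relative to the robot's new pose."""
    out = []
    for x, y in points:
        # Shift into the new origin, then rotate into the new heading
        tx, ty = x - dx, y - dy
        out.append((tx * math.cos(-dtheta) - ty * math.sin(-dtheta),
                    tx * math.sin(-dtheta) + ty * math.cos(-dtheta)))
    return out

# A wall point 100 mm ahead; after driving forward 40 mm it should
# appear 60 mm ahead.
print(transform_points([(100.0, 0.0)], 40.0, 0.0, 0.0))  # [(60.0, 0.0)]
```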
@DJ There is no reason to dis other robot companies... The people on the forum here (including myself) are here because we are already sold on the fact that EZ Robot is the best robotics platform out there hands down... I headed down the ROS road as I wanted to learn something about autonomous navigation (SLAM). I am going to agree however, that the Oculus is a bit of a "one trick pony". That one trick (SLAM autonomous navigation) is a doozy, though. Yes, it does not offer you the "Lego set" flexibility that the ezb4 gives you... That's why I just bought the kit version (I already had an itx board, anyway).... I would not pay $1600 just to play with SLAM.... The good news is if I want I can pull the itx board and throw in an ezb at any time.....
@DJ ... Bottom line, whatever you can do to add this type of feature to ARC would be greatly appreciated.... Not just by me, but I suspect by many others as well....
P.S.... I want one of Bob's t-shirts... a white T-shirt with the ez robot logo