
***EDIT*** False alarm or?... as you were...
***EDIT AGAIN*** I was right, a v5 is in development... :)
@DJ, you forgot to mention this little addition in the release notes of the Oct 31st EZ-Builder update.... Maybe it's nothing, but if you click on the "?" it gives a bit more info alluding to an EZB v5 something.... "*Note This is experimental and used for the EZB v5 beta"... You sly dog, did you really think we wouldn't notice?...
Although I guess it would be even easier for DJ to add that new control to the current camera control.
The quickest and easiest modules to work with are the Neato XV-11 lidar and an XV Lidar controller. The controller passes the information from the Neato back over USB. There are also public Python scripts so you can see how this information is converted to points on a map. For SLAM to work well, you need wheel encoders (odometry) to measure the distance the robot travels. A Sabertooth/Kangaroo combo could be used for this along with wheels that have encoders. The Sabertooth/Kangaroo combo might not be strictly necessary, but it sure makes things a lot easier.
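For the curious, the XV-11's serial protocol has been reverse-engineered by the community, and those public Python scripts mostly boil down to unpacking 22-byte packets. Here's a minimal parsing sketch; the byte offsets and flag bits follow the commonly documented layout, but treat them as assumptions to verify against your own unit:

```python
def parse_xv11_packet(packet):
    """Parse one 22-byte XV-11 lidar packet into (angle_deg, distance_mm) pairs.

    Assumed packet layout (per community reverse-engineering of the Neato XV-11):
      byte 0      : start byte, always 0xFA
      byte 1      : index, 0xA0..0xF9 (packet n covers angles 4n .. 4n+3)
      bytes 2-3   : motor speed, little-endian, in 64ths of an RPM
      bytes 4-19  : four 4-byte readings (dist LSB, dist MSB + flags, 2 signal bytes)
      bytes 20-21 : checksum (ignored in this sketch)
    """
    if len(packet) != 22 or packet[0] != 0xFA:
        return []
    index = packet[1] - 0xA0              # packet index 0..89
    readings = []
    for i in range(4):
        lo = packet[4 + 4 * i]
        hi = packet[5 + 4 * i]
        if hi & 0x80:                     # bit 7 of the MSB flags an invalid reading
            continue
        distance_mm = ((hi & 0x3F) << 8) | lo
        readings.append((index * 4 + i, distance_mm))
    return readings
```

A real reader would also resync on the 0xFA start byte and verify the checksum before trusting a packet.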
From there, the fun begins. Storing the map as you travel through it and calculating your current map position probability based on what you see with the sensors becomes the next issue to tackle. All of this is stored in X/Y coords.
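As a sketch of the "store the map in X/Y coords" idea, here's a bare-bones occupancy-style grid that projects each lidar reading from the robot's pose into world coordinates and bins the hits into cells. The class name, cell size, and pose/scan formats are my own illustration, not anything from ARC:

```python
import math

class OccupancyGrid:
    """Minimal occupancy-style map: bins lidar hits into X/Y cells."""

    def __init__(self, cell_size_mm=50):
        self.cell_size = cell_size_mm
        self.hits = {}                        # (cell_x, cell_y) -> hit count

    def add_scan(self, pose, scan):
        """pose = (x_mm, y_mm, heading_rad); scan = [(angle_deg, dist_mm), ...]"""
        x, y, heading = pose
        for angle_deg, dist_mm in scan:
            theta = heading + math.radians(angle_deg)
            wx = x + dist_mm * math.cos(theta)    # world X of the hit
            wy = y + dist_mm * math.sin(theta)    # world Y of the hit
            cell = (int(wx // self.cell_size), int(wy // self.cell_size))
            self.hits[cell] = self.hits.get(cell, 0) + 1
```

A real SLAM map would track free space along each beam and use log-odds rather than raw counts, but the X/Y bookkeeping is the same.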
From there, path planning becomes the next fun part. You can use something like A* for this if you want. This is a popular path planning routine used in computer gaming. It calculates the least cost route to go from the current point to an X/Y coordinate.
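A* is compact enough to sketch in full. This version runs over a 2D grid of free/blocked cells with unit step costs, 4-connected moves, and a Manhattan-distance heuristic; those are standard textbook choices, not anything specific to ARC or a particular game engine:

```python
import heapq
import itertools

def a_star(grid, start, goal):
    """A* over a 2D grid: 0 = free, 1 = blocked; 4-connected, unit step cost.
    Returns the least-cost path from start to goal as a list of (row, col)
    tuples, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    tie = itertools.count()            # tie-breaker so the heap never compares nodes
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:
            continue                   # already expanded via a cheaper route
        came_from[node] = parent
        if node == goal:               # walk parent links back to rebuild the path
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(
                        open_set, (ng + h((nr, nc)), next(tie), ng, (nr, nc), node))
    return None
```

For a robot, the grid would come from the floor map, and the step cost could be weighted to keep the path away from walls.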
Communicating this information over the wifi channel used by the EZ-B could be problematic; there is a lot of data for sure. It would probably be preferable to do all of this on the computer running ARC instead of pushing all of that data through wifi. It might also be better to build a SLAM board that attaches to the EZ-B, performs all of these functions, and passes back only the information the EZ-B needs, but this is just a guess based on what I have seen up to this point. SLAM hasn't been my focus, but it is definitely something I have put some time into learning and experimenting with. It would be an amazing addition to ARC for sure.
RoboRealm seems to be very rich with low-hanging fruit, especially the path planning section. RoboRealm Path Planning
The EZ-Robot plugin architecture is wide open for developing path planning and upgrading the floor mapping features. Personally, I've been daydreaming about what it would take to create a new floor mapping and path planning plugin.
Here's the link to my video... AVM Navigator in RoboRealm
RoboRealm has been popular within the EZ-Robot community. I don't know what they have for SLAM - although I expect they would make something, because they're that kind of folk. The kind of folk that get $&@?$ done!
What matters to me at this point is which sensors are used by the popular ROS SLAM modules. Some are using a Kinect. Some are using lidar. And some are using a blind man's cane. I'd like to know what the popular choice is among the most active SLAM development, because each one of those obviously requires a completely different approach to obtaining data.
I'd like to know which of those EZ-Robot should support, and voila, we are on track to wrap a library.
SLAM isn't that difficult after reading about it. It's pretty much a reverse ray trace :). Remember Wolfenstein 3D on the PC? Just like that.... It's also how my original K9 navigated: keeping a collection of distances and transforming the 2D array based on the movements.
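The "transforming the 2D array based on the movements" step comes down to applying the inverse of the robot's motion to every stored point, so the map stays lined up with what the sensors currently see. A sketch, where the point format and motion conventions (mm and radians, robot-frame coordinates) are my own assumptions:

```python
import math

def transform_points(points, dx, dy, dtheta):
    """Re-express stored 2D map points in the robot's new frame after the
    robot translates by (dx, dy) and rotates by dtheta radians.
    Each point is (x, y) in the robot frame; moving the robot forward is
    equivalent to moving every stored landmark backward."""
    cos_t, sin_t = math.cos(-dtheta), math.sin(-dtheta)
    out = []
    for x, y in points:
        tx, ty = x - dx, y - dy                     # undo the translation
        out.append((tx * cos_t - ty * sin_t,        # then undo the rotation
                    tx * sin_t + ty * cos_t))
    return out
```

Driving 40 mm toward a wall that was 100 mm ahead leaves it 60 mm ahead; turning left 90 degrees swings a point that was dead ahead around to the robot's right.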
Also, the Omnibot had a very, very simple 1D array - which helped him understand doorways and such. The code is in the SDK package.
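A 1D distance array really can reveal doorways: an opening shows up as a run of readings that suddenly jump past the surrounding wall distance. A toy sketch of that idea (the threshold and minimum-width values are arbitrary illustrations, not anything from the SDK):

```python
def find_openings(distances, threshold_mm=1500, min_width=3):
    """Scan a 1D sweep of distances (one reading per bearing step) and
    return (start_index, end_index) runs where readings exceed the
    threshold -- candidate doorways or openings."""
    openings, start = [], None
    for i, d in enumerate(distances):
        if d > threshold_mm:
            if start is None:
                start = i                 # opening begins here
        elif start is not None:
            if i - start >= min_width:    # wide enough to drive through?
                openings.append((start, i - 1))
            start = None
    if start is not None and len(distances) - start >= min_width:
        openings.append((start, len(distances) - 1))
    return openings
```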
@DJ... Bottom line, whatever you can do to add this type of feature to ARC would be greatly appreciated.... Not just by me, but I suspect by many others as well....
P.S.... I want one of Bob's t-shirts... a white T-shirt with the ez robot logo
That being said, the lidar returns less data to process and is the preferred first approach by most. It requires a less expensive computer to process the data (an Atom- or ARM-class processor). These are less power hungry, which makes them better suited for robotics.
The lidar is a pretty inexpensive way to do this and they are very available. They also stand less of a chance to flood the communication channel.
@richard, I certainly am not dissing the company - merely disappointed with the physical robot's design and specifications for the price. Given their impressive coding ability, I had high expectations :). There's a philosophy to pricing - and believe it or not, it's a huge contributor to a product's success. Many think product pricing is set by calculating a desired markup - well, the answer is more complicated than that. For example, currently the robot is incredibly niche and therefore sells 1 per month. Now imagine that robot cost $300 or $400 - they would sell 10 or 20 per month and make more revenue and profit, respectively. Also, one important result of having a larger user base is feedback that strengthens the product, which makes it more attractive to new customers. A small user base limited by pricing doesn't promote product growth, and it isolates the feature set to what the developers decide is important rather than what the consumers want.
Many companies suffer from making the same decision - focusing on what they feel is important and not what the consumer wants. EZ-Robot's approach has been the opposite. When someone says "must be nice being the boss", my answer is "are you kidding, I have 25,000 bosses!"
EZ-Robot doesn't make decisions for its users - it's been the other way around, including with pricing. Their pricing model doesn't give companies the option to partner on the product either - and that's hugely disappointing.
Their company and technology are indeed impressive - or at least must be, for that price. It's the product pricing and specifications that I am disappointed with - not dissing. Disappointed. They are very different emotions.
Lidar controller
This is the place I go through. Shipping is a bit slow but everything has worked great that I have received. He is out of Dallas TX, which I visit a lot, so it is an easy pickup for me.
2) Feedback on servo load can only be done with servos that support it, and is therefore not EZ-B v5 specific. If Dynamixel servos support load feedback, then they could be supported with a software update. The Dynamixel support software is an open-source plugin on the website; anyone interested in adding additional support is welcome to it.
3) Grippers are now starting to ship with foam inserts to hold objects such as pens/pencils, screwdrivers, utensils, etc. I will write a post once the new grippers are online - the foam inserts will be purchasable. The design for cutting your own foam inserts will also be posted. Stay tuned.
I was wondering if you have considered add-on (piggyback) boards which could offer embedded features or functions. This way the v4 platform, or a future v5, would have much larger potential. I know onboard features offered on a footprint the size of the EZB could be very powerful (onboard subroutines, sensor interfaces, onboard memory).
My desire is still indoor navigation. I was hoping an optional piggyback board made for navigation only could become an offering.
What are your thoughts?
Ron R
I do want to ask what your ideas about the piggyback modules are.... What sort of features would they have that can't already be done with the EZB?
David posted this Lidar package. My guess is that this, or @DJ's original camera/IR-based system idea, are the likely candidates if he were to undertake an indoor nav solution...
Regarding the navigation idea, I was wondering if DJ would or could embed software to work with hardware - maybe the camera or the beacon that has been discussed in the past - to perform navigation of some sort. I know you have been working on, and are developing, a nav system for your needs. I was just looking at a way DJ could consider an EZ-Robot system. He could make it a separate package that plugs into a communication port.
My other thought is a system where a module could be plugged into the EZB, say via a com port (allowing a USB port, as requested in an earlier post), or maybe the ability to run onboard subroutines, with onboard memory, a sequencer, etc.?
Just throwing out ideas.
Ron R