EZ-B v5 Coming?

***EDIT*** False alarm, or?... as you were... :)
***EDIT AGAIN*** I was right, a V5 is in development.... :)

@DJ, you forgot to mention this little addition in the release notes of the Oct 31st EZ-Builder update.... Maybe it's nothing, but if you click on the "?" it gives a bit more info alluding to an EZ-B v5 something.... "*Note This is experimental and used for the EZB v5 beta"... You sly dog, did you really think we wouldn't notice?... :P

Synthiam
United Kingdom
#18  
That wouldn't be too difficult a feature to add (click-steer) as a plug-in. You would only need the input from the mouse click on the video image, then some simple calculations to work out the XYZ-axis movements required.

Although I guess it would be even easier for DJ to add it as a new feature of the current camera control.
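To give a feel for the "simple calculations", here's a rough Python sketch of the idea. The resolution, field-of-view numbers, and function names are assumptions for illustration, not anything from ARC:

```python
# Hypothetical click-steer math: turn a mouse click on the video feed
# into pan/tilt corrections that center the camera on the clicked point.

IMAGE_W, IMAGE_H = 320, 240      # assumed camera resolution
HFOV_DEG, VFOV_DEG = 60.0, 45.0  # assumed camera field of view

def click_to_servo_delta(click_x, click_y):
    """Return (pan, tilt) degree offsets toward the clicked point."""
    dx = (click_x / IMAGE_W) - 0.5   # normalize to [-0.5, 0.5]
    dy = (click_y / IMAGE_H) - 0.5
    pan_delta = dx * HFOV_DEG        # positive = steer right
    tilt_delta = -dy * VFOV_DEG      # positive = steer up
    return pan_delta, tilt_delta

# A click right-of-center and above-center on a 320x240 frame:
print(click_to_servo_delta(240, 60))   # -> (15.0, 11.25)
```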
#19  
Adding a SLAM module to ARC would be awesome.

The quickest and easiest modules to work with are the Neato XV-11 lidar and an XV Lidar controller. The controller passes back the information from the Neato over USB. There are also public Python scripts so that you can see what is being done to convert this information to points on a map. For SLAM to work well, you need odometry on the wheels to measure the distance the robot has traveled. A Sabertooth/Kangaroo combo could be used for this, along with wheels that have encoders. The Sabertooth/Kangaroo combo might not be strictly needed, but it sure makes things a lot easier.
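For the curious, here's a rough Python sketch of reading the XV-11's stream, based on the publicly documented 22-byte packet format those public scripts work with. The port name is an assumption, and checksum validation is omitted for brevity:

```python
import serial  # pyserial

PACKET_LEN = 22
START = 0xFA   # every XV-11 packet begins with this byte

def parse_packet(pkt):
    """Decode one packet: 4 degrees of readings plus motor RPM."""
    index = pkt[1] - 0xA0                      # packet index 0..89
    base_angle = index * 4                     # degrees
    rpm = (pkt[2] | (pkt[3] << 8)) / 64.0
    readings = []
    for i in range(4):
        b = pkt[4 + i * 4: 8 + i * 4]
        invalid = bool(b[1] & 0x80)            # invalid-data flag
        dist_mm = ((b[1] & 0x3F) << 8) | b[0]  # 14-bit distance, mm
        readings.append((base_angle + i, None if invalid else dist_mm))
    return rpm, readings

with serial.Serial("COM3", 115200, timeout=1) as port:  # port name assumed
    while True:
        if port.read(1) == bytes([START]):
            pkt = bytes([START]) + port.read(PACKET_LEN - 1)
            if len(pkt) == PACKET_LEN:
                print(parse_packet(pkt))       # checksum check omitted
```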

From there, the fun begins. Storing the map as you travel through it, and calculating the probability of your current map position based on what the sensors see, becomes the next issue to tackle. All of this is stored in X/Y coordinates.
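As a bare-bones illustration of the storage side, a grid of X/Y cells with hit counts is one simple way to start (the cell size and names here are made up):

```python
import math
from collections import defaultdict

CELL_SIZE = 0.05  # meters per grid cell (assumed resolution)

# Each (x, y) cell accumulates [hits, observations]; their ratio is a
# crude estimate of the probability that the cell is occupied.
grid = defaultdict(lambda: [0, 0])

def record_hit(robot_x, robot_y, angle_rad, dist_m):
    """Mark the cell where a range reading terminated as occupied."""
    ox = robot_x + dist_m * math.cos(angle_rad)
    oy = robot_y + dist_m * math.sin(angle_rad)
    cell = (int(ox // CELL_SIZE), int(oy // CELL_SIZE))
    grid[cell][0] += 1
    grid[cell][1] += 1
    # A fuller version would also mark the cells the beam passed
    # through as observed-but-empty (a "miss").

def occupancy(cell):
    hits, total = grid[cell]
    return hits / total if total else 0.5   # unseen cells stay at 0.5
```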

From there, path planning becomes the next fun part. You can use something like A* for this if you want. It's a popular path-planning routine used in computer gaming: it calculates the least-cost route from the current point to a target X/Y coordinate.
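For anyone who wants to see it, here's a minimal A* in Python on a grid of free/blocked cells. The grid representation is just one common choice, not anything specific to ARC:

```python
import heapq

def astar(grid, start, goal):
    """Least-cost path on a 2D grid (0 = free, 1 = blocked)."""
    def h(a, b):                      # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_set = [(h(start, goal), start)]
    came_from = {start: None}
    cost = {start: 0}
    while open_set:
        _, node = heapq.heappop(open_set)
        if node == goal:              # rebuild the path by walking parents
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if not (0 <= nx < len(grid) and 0 <= ny < len(grid[0])):
                continue              # off the map
            if grid[nx][ny]:
                continue              # blocked cell
            new_cost = cost[node] + 1
            if new_cost < cost.get(nxt, float("inf")):
                cost[nxt] = new_cost
                came_from[nxt] = node
                heapq.heappush(open_set, (new_cost + h(nxt, goal), nxt))
    return None                       # no route exists

# Example: route around a wall segment
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```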

Communicating this information over the Wi-Fi channel used by the EZ-B could be problematic - there is a lot of data, for sure. It would probably be preferable to do all of this on the computer running ARC rather than pushing all of that data through Wi-Fi. It might also be better to build a SLAM board that attaches to the EZ-B, performs all of these functions, and passes back only the information the EZ-B needs - but that is just a guess based on what I have seen so far. SLAM hasn't been my focus up to this point, but it is definitely something I have put some time into learning and experimenting with. It would be an amazing addition to ARC for sure.
#20  
@Richard, SLAM algorithms are graduate-level work, at least as far as my limited understanding grasps it. I would think, at least with EZ-Robot, you would need a standardized platform with a standard sensor package to get a lot of EZers on board and to limit the tsunami of questions from users.

RoboRealm seems to be very rich with low-hanging fruit, especially the path-planning section: RoboRealm Path Planning

The EZ-Robot plugin architecture is wide open for developing path planning and upgrading the floor-mapping features. Personally, I've been daydreaming about what it would take to create a new floor-mapping and path-planning plug-in.
#21  
@Levels.... I am way ahead of you... I have a video floating around that I did to demonstrate crude path planning with RoboRealm... It uses the AVM Navigator module instead of the one you posted above... It does indeed work, but it's crude... SLAM is much better at this.... EZ-Robot doesn't have to do ROS's version of SLAM... I mean, DJ is capable of developing his own indoor navigation system that would be built into ARC, as opposed to it possibly being a complicated add-on if he used ROS's SLAM....

Here's the link to my video... AVM Navigator in RoboRealm
#22  
@David... I missed your post somehow.... Great information... I would be willing to mess with the Kangaroo/wheel encoders - I actually have 2 that I haven't used yet but want to... I was also wondering how this can be done with just an EZ-B... I figured that a PC would have to be on board the robot, at least when using sensors like the Xtion or Kinect.... The Neato lidar may be a different story, especially with that controller you posted a link to... However autonomous navigation gets done, I hope EZ-Robot (@DJ) will be able to add this feature to ARC....
PRO
Synthiam
#23  
That Oculus robot is quite disappointing - welcome to 2005... As for ROS, I'm always impressed with some of the Ph.D. contributions to its code base. I don't expect to create a SLAM control myself. With the open architecture, ezrobot will license something or code around the ROS library - why reinvent the wheel:), specifically if there are inventors actively devoted to continuing improvements.

RoboRealm has been popular within the ezrobot community. I don't know what they have for SLAM - although I expect they would make something, because they're that kind of folk. The kind of folk that get $&@?$ done!

What matters to me at this point is which sensors are used by the popular ROS SLAM modules. If some are using a Kinect, and some are using lidar, and some are using a blind man's cane, I'd like to know what the popular choice is around the most active SLAM development. Each one of those obviously requires a completely different approach to obtaining data.

I'd like to know which of those ezrobot should support - and voila, we are on track to wrap a library.

SLAM isn't that difficult after reading about it. It's pretty much a reverse ray trace :). Remember Wolfenstein 3D on the PC? Just like that.... It's also how my original K-9 navigated: keeping a collection of distances and transforming the 2D array based on the movements.
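Roughly, that idea in Python (a sketch of the concept, not the original code): store obstacle points in the robot's own frame, and shift the whole collection the opposite way whenever the robot drives or turns:

```python
import math

points = []   # [(x, y)] obstacle points in meters, robot frame

def add_range_reading(angle_rad, dist_m):
    """Turn a sensor distance at a given angle into a stored point."""
    points.append((dist_m * math.cos(angle_rad),
                   dist_m * math.sin(angle_rad)))

def on_robot_moved(forward_m, turn_rad):
    """Re-express all stored points in the robot's new frame."""
    global points
    c, s = math.cos(-turn_rad), math.sin(-turn_rad)
    points = [((x - forward_m) * c - y * s,
               (x - forward_m) * s + y * c) for x, y in points]
```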

Also, the Omnibot had a very, very simple 1D array - which helped him understand doorways and such. The code is in the SDK package.
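As a guess at what that 1D-array trick might look like (the actual code is in the SDK, as noted - this is just a sketch): sweep a ranging sensor, then look for a run of long readings wide enough to drive through:

```python
def find_doorways(distances, open_threshold_cm=150, min_width=5):
    """Return (start, end) index runs where readings exceed the threshold."""
    gaps, run_start = [], None
    for i, d in enumerate(distances + [0]):   # sentinel closes a final run
        if d > open_threshold_cm and run_start is None:
            run_start = i
        elif d <= open_threshold_cm and run_start is not None:
            if i - run_start >= min_width:
                gaps.append((run_start, i - 1))
            run_start = None
    return gaps

# A sweep: wall, a doorway-sized opening, wall again
print(find_doorways([60, 65, 200, 210, 220, 230, 215, 70, 60]))  # [(2, 6)]
```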
#24  
@DJ There is no reason to diss other robot companies... The people on this forum (including myself) are here because we are already sold on the fact that EZ-Robot is the best robotics platform out there, hands down... I headed down the ROS road because I wanted to learn something about autonomous navigation (SLAM). I will agree, however, that the Oculus is a bit of a "one-trick pony". That one trick (SLAM autonomous navigation) is a doozy, though. Yes, it does not offer the "Lego set" flexibility that the EZ-B v4 gives you... That's why I just bought the kit version (I already had an ITX board anyway).... I would not pay $1600 just to play with SLAM.... The good news is that if I want, I can pull the ITX board and throw in an EZ-B at any time.....:)

@DJ... Bottom line, whatever you can do to add this type of feature to ARC would be greatly appreciated.... Not just by me, but I suspect by many others as well....

P.S.... I want one of Bob's t-shirts... a white t-shirt with the EZ-Robot logo:D
#25  
There are two different sensors I have seen used. One is lidar and one is the Kinect-type sensor. The Kinect-type sensor has some advantages over lidar: it can scan multiple heights at one time, where the lidar can only do one height, and it can return 3D image data, where the lidar is 2D.

That being said, the lidar returns less data to process and is the preferred first approach for most. It requires a less expensive computer to process the data (an Atom- or ARM-class processor), and those are less power-hungry, which makes them better suited to robotics.

The lidar is a pretty inexpensive way to do this, and they are readily available. They also stand less chance of flooding the communication channel.
PRO
Synthiam
#26  
Thanks Dave - lidar does seem like a more cost-effective approach, until you look at the pricing of the 360-degree scanning versions. Holy! The Kinect could never "really" easily connect to the EZ-B directly and would require a PC. Interesting though...

@richard, I certainly am not dissing the company - merely disappointed with the physical robot's design and specifications for the price. Given their impressive coding ability, I had high expectations :). There's a philosophy to pricing - and believe it or not, it's a huge contributor to a product's success. Many think product pricing is set by calculating a desired markup - well, the answer is more complicated than that. For example, the robot is currently incredibly niche and therefore sells 1 per month. Now, imagine that robot cost $300 or $400 - they would sell 10 or 20 per month and make more revenue and profit, respectively. One important result of having a larger user base is feedback to strengthen the product, which makes it more attractive to new customers. A small user base limited by pricing doesn't promote product growth, and therefore isolates the feature set to what the developers decide is important rather than what the consumers want.

Many companies suffer from making the same decision - focusing on what they feel is important and not what the consumer wants. Ezrobot's approach has been the opposite. When someone says "must be nice being the boss", my answer is "are you kidding, I have 25,000 bosses!"

Ezrobot doesn't make decisions for its users - it's been the other way around, including with pricing. Their pricing model doesn't give companies the option to partner with their product either - and that's hugely disappointing.

Their company and technology are indeed impressive - or at least must be, for that price. It's the product pricing and specifications that I am disappointed with - not dissing. Disappointed. They are very different emotions:)
#27  
You can find lidars for around $100 that do 360 degrees. The XV-11 is probably the most cost-effective due to being mass-produced.
PRO
Canada
#28  
Wow, the cheapest I'd seen was $400 on RobotShop. Dave, you have to let us in on your sources:)
#30  
https://www.getsurreal.com/shop

This is the place I go through. Shipping is a bit slow, but everything I have received has worked great. He is out of Dallas, TX, which I visit a lot, so it is an easy pickup for me.
#31  
Any chance the EZ-B v5 will have a full bidirectional interface to the Dynamixel AX-12 servos? It would be great as input to the Auto Positioner, along with feedback on servo load for feedback on grippers.
#32  
@RobertL184 You can do this now: just put an inline ADC current-draw meter in line with the servo and get the readout from an ADC port. I won't get into the details here, but that should work.
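As a rough sketch of the readout math, assuming an ACS712-style hall-effect sensor inline with the servo supply - every constant here is an assumption, so check your sensor's datasheet and the EZ-B's ADC input range (a voltage divider may be needed):

```python
ADC_MAX = 255           # assumed 8-bit ADC reading
ADC_VREF = 3.3          # assumed ADC full-scale voltage
ZERO_AMP_VOLTS = 1.65   # sensor output at 0 A after a divider (assumed)
VOLTS_PER_AMP = 0.0925  # ACS712-05B's 185 mV/A, halved by a 2:1 divider

def adc_to_amps(adc_value):
    """Convert a raw ADC reading into servo current draw in amps."""
    volts = adc_value / ADC_MAX * ADC_VREF
    return (volts - ZERO_AMP_VOLTS) / VOLTS_PER_AMP

# A gripper closing on an object shows up as a current spike:
if adc_to_amps(180) > 1.0:
    print("servo under heavy load - grip detected")
```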
PRO
Synthiam
#33  
1) The Dynamixel support for reading position data would be software, not hardware, and would therefore also apply to the EZ-B v4. The Dynamixel software update is an open-source plugin, available in the plugin section of the website. Anyone interested in adding additional support is welcome to it:)

2) Feedback on servo load can only be done on servos which support it, and is therefore not EZ-B v5 specific. If Dynamixel servos support load feedback, then they could be supported with the software update. The software for Dynamixel support is an open-source plugin on the website. Anyone interested in adding additional support is welcome to it:)

3) Grippers are now starting to ship with foam inserts to hold objects such as pens/pencils, screwdrivers, utensils, etc. I will write a post once the new grippers are online - the foam inserts will be purchasable. The design for the foam inserts, so you can cut your own, will also be posted. Stay tuned.
#34  
Hello Dj,

I was wondering if you have considered add-on (piggyback) boards which could offer embedded features or functions. That way the v4 platform, or the future v5, would have much larger potential. I know onboard features offered on a footprint the size of the EZ-B could be very powerful (onboard subroutines, sensor interfaces, onboard memory)?

My desire is still indoor navigation. I was hoping an optional piggyback board made just for navigation could become an offering.

What are your thoughts?

Ron R
#35  
@Ron I am not sure what an add-on module could do to accomplish autonomous navigation... I mean, what would it have hardware-wise that would accomplish this? Autonomous navigation is 90% software... ROS uses lidar or a Kinect and a crapload of programming to do autonomous navigation.... If EZ-Robot goes the route of the Kinect or an Xtion 3D camera, you will need to have a PC on board your robot. If DJ goes the route of a camera-based system or lidar, this might work with just the EZ-B alone. Still, it's all about programming...

I do want to ask what your ideas about the piggyback modules are... What sort of features would they have that can't already be done with the EZ-B?

David posted this Lidar package. My guess is that this, or @DJ's original camera/IR-based system idea, are the likely candidates if he were to undertake an indoor nav solution...
#36  
Hi Richard,

Regarding the navigation idea, I was wondering if DJ would or could embed software to work with hardware - maybe the camera, or the beacon that has been discussed in the past - to perform navigation of some sort. I know you have been working on and developing a nav system for your needs. I was just looking at a way that, maybe, DJ would or could consider an EZ-Robot system. He could make it a separate package that plugs into a communication port.

My other thought is a system where a module could be plugged into the EZ-B - say, a com port (allowing a USB port, as requested in an earlier post) - or maybe the ability to run onboard subroutines, with onboard memory, a sequencer, etc.?

Just throwing out ideas.

Ron R
#37  
@DJ Sures, sorry to argue with you on this, but doing the Dynamixel AX-12 right is both a hardware and a software solution. The Dynamixel AX-12 series uses a single pin to communicate bidirectionally using TTL-level serial. Most solutions I have seen so far that support Dynamixel well use a tristate transceiver to connect to the Dynamixel bus. The direction of this tristate transceiver is controlled so that when the controller sends to the Dynamixel, the controller's serial output pin is enabled to transmit on the Dynamixel bus. When the controller is not transmitting, it disables the tristate output, and the serial data from the Dynamixel bus is available on the controller's serial input pin. Once this bidirectional data link is implemented, the rest of the Dynamixel solution is software, as you stated.
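To make the software half concrete, here's a rough Python sketch of reading an AX-12's present position (control-table address 0x24) over such a half-duplex link, using Dynamixel Protocol 1.0 framing. The serial port name is an assumption, and the direction-pin function is a placeholder for whatever drives the tristate transceiver:

```python
import serial  # pyserial

PRESENT_POSITION = 0x24   # AX-12 control-table address, 2 bytes wide

def read_position_packet(servo_id):
    """Build a Protocol 1.0 READ_DATA packet for present position."""
    params = [PRESENT_POSITION, 2]            # start address, byte count
    length = len(params) + 2                  # instruction + checksum
    body = [servo_id, length, 0x02] + params  # 0x02 = READ_DATA
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])

def set_bus_direction(transmit):
    """Placeholder: drive the tristate transceiver's direction pin."""
    pass  # e.g. a GPIO write on whatever sits between the UART and bus

with serial.Serial("COM4", 1_000_000, timeout=0.05) as bus:  # port assumed
    set_bus_direction(transmit=True)
    bus.write(read_position_packet(1))
    bus.flush()                         # let the last byte leave the UART
    set_bus_direction(transmit=False)   # release the bus for the reply
    status = bus.read(8)                # FF FF ID LEN ERR LO HI CHKSUM
    if len(status) == 8:
        position = status[5] | (status[6] << 8)
        print("present position:", position)
```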

