
Ezb5 Coming?

EDIT: False alarm or?... as you were... :) EDIT AGAIN: I was right, a V5 is in development.... :)

@DJ, you forgot to mention this little addition in the release notes of the Oct 31st ARC update.... Maybe it's nothing, but if you click on the "?" it gives a bit more info alluding to an EZB v5 something: "*Note This is experimental and used for the EZB v5 beta"... You sly dog, did you really think we wouldn't notice?... :P



#1  

In several threads yesterday about WiFi issues, he mentioned that EZ-Robot needs to change WiFi chip providers because the current chip is discontinued. In the Open IoT WiFi thread he mentioned the ability to modify the WiFi so it "calls home", i.e., the EZ-B connects to ARC rather than the other way around, so you don't need to know the robot's IP address. I suspect this new checkbox is related to that feature.
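For anyone curious what "calling home" means in practice, here is a minimal, purely illustrative sketch of the pattern (this is not the actual EZ-B/ARC protocol, and the port number is made up): the PC listens on a known port and the robot initiates the connection, so you never need to know the robot's IP address.

```python
# Illustrative "call home" pattern only - not the real EZ-B/ARC protocol.
# The PC side listens; the robot connects out to it, so the robot's IP
# address never needs to be known or configured.
import socket

LISTEN_PORT = 8888  # hypothetical port

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", LISTEN_PORT))  # accept connections on any interface
    server.listen(1)
    print("Waiting for the robot to call home...")
    conn, addr = server.accept()    # blocks until the robot connects to us
    with conn:
        print("Robot connected from", addr)
```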

Alan

#2  

I did read those threads, but this made me think otherwise: "*Note This is experimental and used for the EZB v5 beta"... Maybe it was just the wording, but it seemed to be referring to a version 5 of the EZ-B. However, like you said, it could just be related to the announcement @DJ made about the open-source IoT WiFi....

#3  

I think it is about the replacement WiFi chip. That will require an EZ-B version revision, either 4.x or 5.

Alan

#4  

I hope this replacement WiFi chip can just be dropped into our current V4 EZ-Bs. I sure don't want to buy 3 new EZ-Bs to replace the ones in my robot just to avoid this issue. :(

PRO
Synthiam
#5  

You're all correct. The v5 is in development. The communication PCB of the v4 can be swapped out for it. It has features such as I/O triggers, calling home, programmability, and more. It uses a different chipset than what's currently in the Open IoT WiFi.

As for the Open IoT WiFi - you can configure it to call home. One of my favorite features of the IoT WiFi is being able to connect other EZ-Bs and your PC to a single EZ-B acting as an access point.

#6  

Sure.... wait until I am neck deep in learning ROS and then lay this on us... LOL. This is really awesome. Just when we thought you were lounging around at the cottage all summer doing nothing but jet skiing, we find out you were actually secretly working on this instead.... :P Go figure.... :)

#7  

How about a V5 with multiple connection options besides WiFi, like good old-fashioned serial RS232/TTL and/or a direct USB connection to a computer? This would be a lot easier for those of us who might want the computer on the robot right next to the EZ-B. The WiFi option could still be used for updates and such.

PRO
Synthiam
#8  

RoboHappy, if you have seen the size of the EZ-B v4 and its traces, you'd know there is no room for additional connectivity. However, we have discussed a replacement communication board with USB. You will never be able to use RS232 or TTL because the required bandwidth is too high. USB is the only alternate connection type.

#9  

@DJ. I would second the vote for a USB connection. It would be great for those with embedded computers. However, I suspect the market would be pretty small.

Alan

#10  

The USB connection would be nice! And maybe the ability to store a little code onboard? Perhaps a small 5kb storage area where you could put default code that runs when the robot loses its connection, or when you tell it to use that code.

Just some ideas. ;D

#11  

I will echo what Alan and Techno said... I would also love a USB connectivity option....

#13  

If there's no room, maybe a piggyback module with communication, onboard storage, etc.?

Ron R

PRO
USA
#14  

I would also love the option of USB, for robots with onboard computers. Or, as requested, an add-on board for the v5.

PRO
Synthiam
#15  

Thanks for the feedback - USB expansion board is on the list.

@Richard, can you share what you're using for SLAM in ROS? I hear SLAM is your main reason for choosing ROS, given the present navigation limitations of EZ-Robot. What sensors are you using? What sensors are popular in the ROS community for SLAM? Thanks!

#16  

@DJ First off, the fact that ARC currently has no ability to do autonomous navigation is not a limitation; it's just a feature that hasn't been implemented yet ;) ... Anyway, I bought the kit version of this about 2 weeks ago: Oculus Prime ROV. Just yesterday I got all the software loaded and the robot calibrated. Right now I have only been messing with the web interface to control the robot, and I have only touched on making room maps for the navigation part... So far so good. If you want, once I really get it mapping and navigating I can post my results...

To answer your question... What I wanted to do with SLAM is have my robot not just wander around aimlessly trying not to bump into anything, but be able to send it on patrol routes or to certain areas of the house and have it navigate there by itself. The Oculus ROV does this pretty well. However, unless you are willing to cough up $1600 or so Canadian for the "out of the box" full version, be prepared for a lot of work assembling and installing/calibrating ROS on the kit version (as I did)....

Bottom line, it's a lot of work just to get a ROS robot to navigate indoors, and that's with supported ROS robots like the Oculus or TurtleBot.... When it comes to designing and building your own robot, unless you're, well, you or d.cochran, it won't be an option for 95% of us... I won't even get into how difficult it is to modify your bot with, say, a ping sensor or any other sensor for that matter. ROS's main focus seems to be sensors like the Asus Xtion, Microsoft Kinect, or a lidar of some type... You are going to need to learn Arduino as well, since that is the preferred micro when working with ROS...

I have mixed feelings about ROS... It does pretty great stuff, but it takes a lot of work and effort to get there. I'll go out on a limb and say it is still probably best left to universities, research, and serious product development... One major advantage EZ-Robot has over ROS (other than the obvious ease of use and versatility) is the ability to add peripherals or add-ons and interface them quickly.... Also the ability to rapidly test and knock up code...

One side benefit is that I am really learning about network stuff: telnet, VNC, SSH, and Linux too... One last thing... I have yet again been invited to the local college here where I live to help with their InMoov build. It seems their robotics program is building one... No doubt they are using Arduino and MyRobotLab.... I will be taking my InMoov to demo in a few weeks.... They are about to discover they are doing things the hard way when I demo how the EZ-B v4 and ARC outperform Arduino and MyRobotLab in every way.... :)

#17  

One cool feature Oculus has is "click steer". When you click anywhere on the camera image, the tilt servo attempts to centre the camera where you clicked.... If the robot is moving forward, you can actually drive/steer the robot using click steer: wherever you click on the camera image, the robot steers towards it.... Maybe a new EZ-Robot feature...? :)

United Kingdom
#18  

That wouldn't be too difficult a feature to add (click steer) as a plugin. You would only need the input from the mouse click on the video image, then some simple calculations to work out the axis movements required.
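For illustration, here is a rough sketch of that calculation, assuming the servo angles map linearly across the camera's field of view (the resolution and FOV numbers are assumptions, not EZ-Robot camera specs):

```python
# Hypothetical click-steer math: convert a click on the video image into
# pan/tilt servo adjustments, assuming a linear pixel-to-angle mapping.
IMG_W, IMG_H = 640, 480    # image resolution (assumed)
FOV_H, FOV_V = 60.0, 45.0  # horizontal/vertical field of view in degrees (assumed)

def click_to_servo_delta(x, y):
    """Return (pan_delta, tilt_delta) in degrees for a click at pixel (x, y)."""
    dx = (x - IMG_W / 2) / IMG_W  # -0.5 .. +0.5 relative to image centre
    dy = (y - IMG_H / 2) / IMG_H
    return dx * FOV_H, -dy * FOV_V  # negative: clicking above centre tilts up

print(click_to_servo_delta(480, 120))  # click right of and above centre
```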

Although I guess it would be even easier for DJ to add that new control to the current camera control.

#19  

Adding a SLAM module to ARC would be awesome.

The quickest and easiest modules to work with are the Neato XV-11 lidar and an XV Lidar Controller. The controller passes the information from the Neato back over USB. There are also public Python scripts, so you can see what is being done to convert this information into points on a map (see the sketch below). For SLAM to work well, you need odometry on the wheels to measure the distance the robot has traveled. A Sabertooth/Kangaroo combo could be used for this, along with wheels that have encoders. The Sabertooth/Kangaroo combo might not strictly be needed, but it sure makes things a lot easier.
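To make the "points on a map" step concrete, here is a minimal sketch of the conversion those scripts perform (the XV-11's actual serial packet framing is omitted): every (angle, distance) pair from the lidar becomes an X/Y point relative to the robot.

```python
# Sketch of the polar-to-Cartesian conversion for a 360-degree lidar scan.
# The XV-11's real serial framing (packet headers, checksums) is omitted.
import math

def scan_to_points(scan):
    """scan: list of (angle_deg, distance_mm) pairs -> list of (x, y) in mm."""
    points = []
    for angle_deg, dist_mm in scan:
        if dist_mm <= 0:  # skip invalid readings
            continue
        a = math.radians(angle_deg)
        points.append((dist_mm * math.cos(a), dist_mm * math.sin(a)))
    return points

print(scan_to_points([(0, 1000), (90, 500), (180, 0)]))
```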

From there, the fun begins. Storing the map as you travel through it, and calculating your current position probability on that map based on what you see with the sensors, becomes the next issue to tackle. All of this is stored in X/Y coordinates.
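A toy version of the map-storage half, just to show the idea (real SLAM also has to estimate the robot's own pose at the same time, which is the hard part):

```python
# Minimal occupancy-grid idea: the map is a 2D array of cell probabilities,
# nudged up when the sensor sees an obstacle in a cell and down when the
# sensor sees through the cell. Cell size and grid extent are arbitrary here.
GRID = [[0.5] * 100 for _ in range(100)]  # 100x100 cells, 0.5 = unknown

def update_cell(gx, gy, hit, step=0.1):
    """Raise the occupancy estimate for a hit, lower it for a pass-through."""
    p = GRID[gy][gx] + (step if hit else -step)
    GRID[gy][gx] = min(0.99, max(0.01, p))  # keep the probability bounded

update_cell(10, 20, hit=True)
print(GRID[20][10])  # 0.6
```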

From there, path planning becomes the next fun part. You can use something like A* for this if you want. It is a popular path-planning routine used in computer gaming. It calculates the least-cost route from the current point to a target X/Y coordinate.
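Since A* came up, here is a compact sketch of it on a 4-connected grid (0 = free, 1 = obstacle), just to show how little code the core of the planner needs:

```python
# Compact A* on a 4-connected grid, returning the least-cost path
# from start to goal as a list of (x, y) cells, or None if blocked.
import heapq

def astar(grid, start, goal):
    h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan heuristic
    open_set = [(h(start, goal), 0, start, [start])]      # (f, g, cell, path)
    visited = set()
    while open_set:
        f, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                heapq.heappush(open_set, (g + 1 + h((nx, ny), goal), g + 1,
                                          (nx, ny), path + [(nx, ny)]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (0, 2)))  # routes around the wall via the right side
```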

Communicating this information over the WiFi channel used by the EZ-B could be problematic; there is a lot of data for sure. It would probably be preferable to do all of this on the computer running ARC instead of pushing all of that data through WiFi. It might also be better to build a SLAM board that attaches to the EZ-B, performs all of these functions, and passes back only the information the EZ-B needs, but this is just a guess based on what I have seen up to this point. SLAM hasn't been my focus so far, but it is definitely something I have put some time into learning and experimenting with. It would be an amazing addition to ARC for sure.

#20  

@Richard. SLAM algorithms are graduate-level work, at least as far as my limited understanding grasps it. I would think that with EZ-Robot you would at least need a standardized platform with a standard sensor package to get a lot of EZers on board and to limit the tsunami of questions from users.

RoboRealm seems to be very rich with low-hanging fruit, especially the path-planning section: Roborealm Path Planning

The EZ-Robot plugin architecture is wide open for developing path planning and upgrading the floor-mapping features. Personally, I've been daydreaming about what it would take to create a new floor-mapping and path-planning plugin.

#21  

@Levels.... I am way ahead of you... I have a video floating around that I made to demonstrate crude path planning with RoboRealm... It uses the AVM Navigator module instead of the one you posted above... It does indeed work, but it's crude... SLAM is much better at this.... EZ-Robot doesn't have to do ROS's version of SLAM... I mean, DJ is capable of developing his own indoor navigation system built into ARC, as opposed to what could be a complicated add-on if he used ROS's SLAM....

Here's the link to my video... AVM Navigator in roborealm

#22  

@David... I missed your post somehow.... Great information... I would be willing to mess with the Kangaroo/wheel encoders; I actually have 2 that I haven't used yet, but want to... I was also wondering how this could be done with just an EZ-B... I figured a PC would have to be on board the robot, at least for sensors like the Xtion or Kinect.... The Neato lidar may be a different story, especially with that controller you posted a link to... However autonomous navigation gets done, I hope EZ-Robot (@DJ) will be able to add this feature to ARC....

PRO
Synthiam
#23  

That Oculus robot is quite disappointing; welcome to 2005... As for ROS - I'm always impressed with some of the Ph.D. contributions to its code base. I don't expect to create a SLAM control myself. With the open architecture, EZ-Robot will license something or code around a ROS library - why reinvent the wheel :), specifically when there are inventors actively devoted to continuing improvements.

RoboRealm has been popular within the ezrobot community. I don't know what they have for slam - although I expect they would make something, because they're that kind of folk. The kind of folk that get $&@?$ done!

What matters to me at this point is which sensors are used by the popular ROS SLAM modules. Some are using Kinect, some are using lidar, and some are using a blind man's cane. I'd like to know what the popular choice is surrounding the most active SLAM development, because each one of those obviously requires a completely different approach to obtaining data.

I'd like to know which of those EZ-Robot should support, and voila, we are on track to wrap a library.

SLAM isn't that difficult after reading about it. It's pretty much a reverse ray trace :). Remember the PC 3D version of Wolfenstein? Just like that.... It's also how my original K-9 navigated: keeping a collection of distances and transforming the 2D array based on the robot's movements.
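For anyone curious, a tiny sketch of that "transform the array" step, assuming you know how far the robot moved and turned (from encoders or dead reckoning):

```python
# Sketch of re-expressing stored obstacle points in the robot's new frame
# after it moves (dx, dy) and turns dtheta degrees: undo the translation,
# then undo the rotation, so the map stays centred on the robot.
import math

def transform_points(points, dx, dy, dtheta_deg):
    """Shift and rotate (x, y) points by the inverse of the robot's motion."""
    c = math.cos(math.radians(-dtheta_deg))
    s = math.sin(math.radians(-dtheta_deg))
    out = []
    for x, y in points:
        x, y = x - dx, y - dy                       # undo the translation
        out.append((x * c - y * s, x * s + y * c))  # undo the rotation
    return out

# A wall seen 1000mm ahead is 900mm ahead after the robot drives 100mm forward.
print(transform_points([(1000, 0)], dx=100, dy=0, dtheta_deg=0))
```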

Also, the Omnibot had a very, very simple 1D array, which helped him understand doorways and such. The code is in the SDK package.

#24  

@DJ There is no reason to diss other robot companies... The people on this forum (including myself) are here because we are already sold on the fact that EZ-Robot is the best robotics platform out there, hands down... I headed down the ROS road because I wanted to learn something about autonomous navigation (SLAM). I will agree, however, that the Oculus is a bit of a "one trick pony". That one trick (SLAM autonomous navigation) is a doozy, though. Yes, it does not offer you the "Lego set" flexibility that the EZ-B v4 gives you... That's why I just bought the kit version (I already had an ITX board anyway).... I would not pay $1600 just to play with SLAM.... The good news is, if I want, I can pull the ITX board and throw in an EZ-B at any time..... :)

@DJ ... Bottom line, whatever you can do to add this type of feature to ARC would be greatly appreciated.... Not just by me, but I suspect by many others as well....

P.S.... I want one of Bob's t-shirts... a white T-shirt with the ez robot logo :D

#25  

There are two different sensors that I have seen being used: the lidar and the Kinect-type sensor. There are some advantages to the Kinect sensor over lidar. The Kinect sensor is able to scan multiple heights at one time, while the lidar can only do one height. The Kinect-type sensor can also return 3D image data, whereas the lidar is 2D.

That being said, the lidar returns less data to process and is the preferred first approach for most. It requires a less expensive computer to process the data (an Atom- or ARM-class processor). These are less power hungry, which makes them better suited for robotics.

The lidar is a pretty inexpensive way to do this, and they are readily available. They also stand less of a chance of flooding the communication channel.

PRO
Synthiam
#26  

Thanks Dave, lidar does seem like a more cost-effective approach - until you look at the pricing of the 360-degree scanning versions. Holy! The Kinect could never "really" easily connect to the EZ-B directly and would require a PC. Interesting though...

@richard, I certainly am not dissing the company - merely disappointed with the physical robot's design and specifications for the price. Given their impressive coding ability, I had high expectations :). There's a philosophy to pricing - and believe it or not, it's a huge attribute of a product's success. Many think product pricing is set by calculating a desired markup - well, the answer is more complicated than that. For example, the robot is currently incredibly niche and therefore sells 1 per month. Now, imagine that robot cost $300 or $400 - they would sell 10 or 20 per month and make more revenue and profit, respectively. Another important result of having a larger user base is feedback that strengthens the product, which makes it more attractive to new customers. A small user base limited by pricing doesn't promote product growth, and therefore isolates the feature-set to what the developers decide is important rather than what the consumers want.

Many companies suffer from making the same decision - focusing on what they feel is important rather than what the consumer wants. EZ-Robot's approach has been the opposite. When someone says "must be nice being the boss", my answer is "are you kidding, I have 25,000 bosses!"

EZ-Robot doesn't make decisions for its users - it's been the other way around, including with pricing. Their pricing model doesn't give companies the option to partner with their product either - and that's hugely disappointing.

Their company and technology are indeed impressive - or at least must be, at that price. It's the product pricing and specifications that I am disappointed with - not dissing. Disappointed. They are very different emotions :)

#27  

You can find lidars for around $100 that do 360 degrees. The XV-11 is probably the most cost effective, due to being mass produced.

PRO
Canada
#28  

Wow, the cheapest I had seen was $400 on RobotShop. Dave, you have to let us in on your sources :)

#30  

https://www.getsurreal.com/shop

This is the place I go through. Shipping is a bit slow, but everything I have received has worked great. He is out of Dallas, TX, which I visit a lot, so it is an easy pickup for me.

#31  

Any chance the EZ-B v5 will have a full bidirectional interface to the Dynamixel AX-12 servos? It would be great for input to the Auto Position control, as well as for load feedback from the servos to give feedback on grippers.

#32  

@RobertL184 You can do this: just put an inline current-draw meter in line with the servo and get the readout from an ADC port. I won't get into details here, but that should work.
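The math behind that is just Ohm's law across a shunt resistor. Here's a small sketch of the conversion, assuming an 8-bit ADC with a 3.3V reference and a 0.1 ohm shunt (check your own EZ-B and meter specs; none of these names are ARC API calls):

```python
# Convert a raw ADC reading taken across a current-sense shunt resistor
# into servo current using Ohm's law (I = V / R). Values are assumptions.
ADC_MAX = 255      # 8-bit ADC full-scale reading (assumed)
ADC_REF_V = 3.3    # ADC reference voltage in volts (assumed)
SHUNT_OHMS = 0.1   # shunt resistor value in ohms (example)

def adc_to_amps(adc_value):
    """Return the servo current in amps for a raw ADC reading."""
    volts = adc_value / ADC_MAX * ADC_REF_V  # voltage across the shunt
    return volts / SHUNT_OHMS

print(adc_to_amps(31))  # ~0.40 V across 0.1 ohm -> ~4 A
```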

PRO
Synthiam
#33  
  1. The Dynamixel support for reading position data would be software, not hardware, and would therefore also apply to the ez-b v4. The Dynamixel software update is an open-source plugin available in the plugin section of the website. Anyone interested in adding additional support is welcome to it :)

  2. Feedback on servo load can only be done on servos that support it, and is therefore not ez-b v5 specific. If Dynamixel servos support load feedback, then it could be supported with a software update. Again, the software for Dynamixel support is an open-source plugin on the website, and anyone is welcome to extend it :)

  3. Grippers are now starting to ship with foam inserts to hold objects such as pens/pencils, screwdrivers, utensils, etc. I will write a post once the new grippers are online - the foam inserts will be purchasable. The design for cutting your own foam inserts will also be posted. Stay tuned

#34  

Hello Dj,

I was wondering if you have considered add-on (piggyback) boards that could offer embedded features or functions. That way the v4 platform, or the future v5, would have much larger potential. I know onboard features offered in a footprint the size of the EZ-B could be very powerful (onboard subroutines, sensor interfaces, onboard memory).

My desire is still indoor navigation. I was hoping an optional piggyback board made for navigation only could become an offering.

What are your thoughts?

Ron R

#35  

@Ron I am not sure what an add-on module could do to accomplish autonomous navigation. I mean, what would it have hardware-wise that would accomplish this? Autonomous navigation is 90% software... ROS uses lidar or Kinect and a crapload of programming to do autonomous navigation.... If EZ-Robot goes the route of the Kinect or an Xtion 3D camera, you will need a PC on board your robot. If DJ goes the route of a camera-based system or lidar, this might work with just the EZ-B alone. Still, it's all about the programming...

I do want to ask what your ideas for the piggyback modules are... What sort of features would they have that can't already be done with the EZ-B?

David posted this Lidar package. My guess is that this, or @DJ's original camera/IR-based system idea, are the likely candidates if he were to undertake an indoor nav solution...

#36  

Hi Richard,

Regarding the navigation idea, I was wondering if DJ would or could embed software to work with hardware - maybe the camera or the beacon that has been discussed in the past - to perform navigation of some sort. I know you have been working on, and are developing, a nav system for your needs. I was just looking at a way DJ could consider an EZ-Robot system. He could make it a separate package that plugs into a communication port.

My other thought is a system where a module could be plugged into the EZ-B, say on a com port (allowing a USB port, as requested in an earlier post), or maybe adding the ability to run onboard subroutines, onboard memory, a sequencer, etc.?

Just throwing out ideas.

Ron R

#37  

@DJ Sures, sorry to argue with you on this, but to do the Dynamixel AX-12 right, it takes both a hardware and a software solution. The Dynamixel AX-12 series uses a single pin to communicate bidirectionally using TTL-level serial. Most solutions I have seen that support Dynamixel well use a tristate transceiver to connect to the Dynamixel bus. The direction of this tristate transceiver is controlled so that when the controller sends to the Dynamixel, the controller's serial output pin is enabled to transmit on the Dynamixel bus. When the controller is not transmitting, it disables the tristate output, and the serial data from the Dynamixel bus is available to the controller's serial input pin. Once this bidirectional data link is implemented, the rest of the Dynamixel solution is software, as you stated.
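For reference, the software half of that link is just framing bytes. Here is a sketch of building an AX-12 READ instruction packet per the published Dynamixel protocol 1.0 (header, ID, length, instruction, parameters, checksum); the tristate transceiver described above handles the bus direction in hardware:

```python
# Build a Dynamixel protocol 1.0 READ_DATA (0x02) packet. The checksum is
# the inverted low byte of the sum of everything after the 0xFF 0xFF header.
def ax12_read_packet(servo_id, address, num_bytes):
    """Return the raw bytes asking servo_id for num_bytes at address."""
    body = [servo_id, 4, 0x02, address, num_bytes]  # LENGTH = params + 2
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])

# Present position is two bytes starting at address 0x24 in the AX-12 table.
print(ax12_read_packet(1, 0x24, 2).hex())  # ffff0104022402d2
```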
