
EZ-B v4: Run Code On-Board Instead of Over WiFi?

Hello. I do not have an EZ-B yet, but have a general question about programming the controller. I understand that ARC + EZ-Script lets you control the robot when connected over WiFi, but is there a way to load code onto the EZ-B v4 so it actually runs on the platform? I want to try to reduce network overhead and the resources (if any) required on the control computer.

Thanks -Eric



#1  

In a word, no. Everything goes through the WiFi link. There is no storage capability on the EZ-B v4 itself. The best way to reduce the load on the WiFi system is to ensure you limit the amount of data you send to the robot. For example, don't send a constant stream of requests for a sensor's data; send one request at a reasonable interval, like every quarter or half second. Things like that. The general stuff sent to the robot to make it move should not be much of a load on your system in the first place, so I would not worry too much about that.
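For example, a rough sketch in C# of the kind of pacing I mean (the EZ-B SDK class and method names here are from memory, so treat them as placeholders and check the SDK documentation):

using System;
using System.Threading;

class PollAtInterval
{
    static void Main()
    {
        // Assumes the EZ-B SDK assembly is referenced; names are approximate.
        EZ_B.EZB ezb = new EZ_B.EZB();
        ezb.Connect("192.168.1.1");   // default EZ-B v4 access point address (assumption)

        while (true)
        {
            // One ADC request every half second instead of a constant stream.
            int value = ezb.ADC.GetADCValue(EZ_B.ADC.ADCPortEnum.ADC0);
            Console.WriteLine("ADC0 = " + value);
            Thread.Sleep(500);
        }
    }
}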

#2  

Ah OK. Yeah I will just do what I can to keep the load down. For a really cool robot platform it is interesting that all of the processing is done on a remote computer and not the ARM processor. It is simpler on the front end, but seems less efficient for a robotics platform. I know that running the code on board won't have the same capabilities, but it would be pretty awesome if you could load code onto the controller via wifi, a serial capable bootloader, or ISP. It would open some doors for much faster sensor response time and possibly allow for interrupts. Anyways... I am just rambling.

Thanks for the quick response! -Eric

#3  

@ericsims interrupts? Geez dude, you been hangin' around the Arduino camp and the '90s too long.... The ezb/ARC is fully multitasking....:P The way it works now is you can change/adjust your code on the fly (unlike the Arduino)... meaning accelerated robot programming...

Don't worry... if you hang around here long enough we will be able to "beat" the Arduino outta' ya'.....:P I am saying this because I used to use Arduino, but got bored of building boring line-following and obstacle-avoiding robots....

#4  

Haha. Yeah, I see what you mean... when you can have a bunch of threads... However, it is nice not to burn an entire thread polling one digital pin while waiting... But it's not Arduino for me; I love the LPC11U24 (which is still kinda old)... Many people cling to the Arduino for some reason...

I have the LPC11Uxx on some PCBs for projects; it's great for embedded stuff where you don't need the whole dev platform.

-Eric

#5  

@ericsims .... just ribbin' ya'.... Hope you stick around; you seem like a real educated robot guy.... welcome....

PRO
Canada
#6  

Hi Eric,

@WBS and @RR have you well covered, but I figured I'd jump in and add that there's always the ability to use the EZ-B as the master and an Arduino or LPC11Uxx as a slave device feeding the master sensor data, interfacing to addressable LEDs, etc. You can interface to a slave device via UART or I2C if you'd like and have a hybrid robotic system. A few forum members have gone this route; check out some other threads around the forum if you are curious.

#7  

@richard lol I know. And thanks! That means a lot to me!

@jeremie I will have to check that out! I thought about this before, but didn't want to expend that much effort.

I also have another question for you guys: I know that the SDK contains a lot of useful stuff, but is there some way to call programs created in ARC from the SDK? This would be the same kind of thing: I am trying to streamline my idea, but retain the ability to use ARC 'in line' with the rest of my code.

Thanks -Eric

PRO
Canada
#8  

@Eric I'm actually not sure if you can call ARC project files from the SDK; I'll have to leave that for someone else to answer. But I wanted to make you aware of a huge new ARC feature: check out this thread!

You can now write your own plugins that become custom ARC controls within the GUI. This essentially opens up ARC for all kinds of third party development.

United Kingdom
#9  

The lack of interrupts (in my opinion) is a big disadvantage with the EZ-B/ARC; if it had interrupt capability then (again in my opinion) it would be near perfect. The EZ-B's latency is a real issue for me personally.

Tony

PRO
Canada
#10  

Hey Tony,

I'm glad you were able to find a work-around for the UART communication, but I am surprised by your comment about the lack of interrupts.

Since ARC is a multi-threaded system and all variables are global, you can interrupt a process very easily. Obviously, it's not a true hardware interrupt that halts any process at any time, but you can use a "check for flag" anywhere in your EZ-Scripts that looks for a change in a single variable, then use the Halt() command to exit the script and do something else.
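To show the mechanics outside of EZ-Script, here is the same cooperative "check for flag" idea sketched in plain C# (just an illustration of the pattern, not EZ-Robot code): a worker loop tests a shared flag on every pass and exits when another thread raises it, which is the equivalent of testing a global $variable and calling Halt().

using System;
using System.Threading;

class FlagInterrupt
{
    static volatile bool stopRequested = false;   // the shared "flag" variable

    static void Main()
    {
        Thread worker = new Thread(LongRunningTask);
        worker.Start();

        Thread.Sleep(2000);      // something happens elsewhere...
        stopRequested = true;    // ...so raise the flag (the "interrupt")
        worker.Join();
    }

    static void LongRunningTask()
    {
        while (!stopRequested)   // checked on every pass, like a flag test in a script loop
        {
            // ...do one slice of work: read a sensor, move a servo, etc.
            Thread.Sleep(100);
        }
        Console.WriteLine("Flag seen, exiting (like Halt() would in EZ-Script).");
    }
}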

I'm curious where a hardware interrupt would be needed... possibly an emergency stop button? I would likely just use a button that physically cuts power in that case. Possibly a reset button? There I would use a button that momentarily interrupts the incoming power. ....curious.....I'm still very curious..... :)

#11  

I will have to check out the plugins. Somehow I missed that in my research. Thanks!

One example of using an interrupt in robotics (not necessarily in this case, more so on a child processor) is reading a digital encoder. That way you catch all of those pesky little pulses. I have also used interrupts to monitor IC communication; for example, a D latch with a latch pin that is only held high for 50 microseconds. Idk, it is just a good way to solve problems.

-Eric

#12  

I'll just throw my 2 cents in here on a couple of topics covered. A lot of us are building robots with embedded Windows computers to run ARC. There are a lot of small, low-power computers available now that are pretty cheap.

In most cases the SDK is being used to pass variables between ARC and an external app. It is certainly possible to build all of your functionality in the SDK and, with the Universal Bot SDK, run it on pretty low-end machines (a Raspberry Pi, for instance), but ARC makes the basics so easy that only those looking for a challenge go that route. The new plugin capability will probably replace a lot of the current SDK functions.

Alan

United Kingdom
#13  

Jeremie, this is my view on interrupts: it's an issue if you want to use the EZ-B v4 to build serious robots; for simpler robots the EZ-B is a truly excellent system. I personally think that an advanced robot needs sub-system nodes that feed back to the EZ-B. These sub-systems (if realized correctly) take away a huge amount of the processing overhead on the EZ-B and together make a much more powerful robotic system. Any system with multiple processors needs a way for the processors to "flag" each other very quickly and efficiently, and here is the rub: if you want to do even a simple thing like strobe the v4 from a sub-system, it can take up to 600 milliseconds to get a reliable response! In today's computing world that is an absolute lifetime. Take the sub-system PICs running at 20 MHz, which equates to around 5 million instructions per second; the PIC could process around 3 million instructions in the time it takes the v4 to respond! I know this latency is due to the WiFi overhead and the v4 processing many other things, but if there were a way of using true interrupts (on the EZ-B) so really urgent data could be processed in real time, it would turn the EZ-B into a very powerful robotic controller.

Tony

#14  

I'm with Tony on this. In my robot platform I use a Beaglebone Black to handle the communication and rapid decision making. The EZB is attached to the BB and sends commands to it and receives information back from it. The BB is attached to arduinos which do motor control, sensor readings and other input/outputs. The BB is able to react to the arduino readings of sensors and then take an action.

An example of this is the IR sensors that detect edges or drop-offs like stairs. When the robot is moving and a drop-off is detected, the motors are stopped. This happens in a couple of milliseconds. The information is then sent back to the EZB, which then passes it to ARC, letting it know that an edge was detected and the robot was stopped.

A half-second delay could cause the robot to fall down the stairs. This architecture also allows for a simplified method of controlling devices and motors from ARC or the SDK. For example, you are able to use a serial command of "move robot forward 4 inches" or "turn robot right 15 degrees". ARC or the SDK doesn't need to know how these commands accomplish their tasks; it just receives a message back that the task has been completed.
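As a rough illustration of what the high-level side of such a command/acknowledge link can look like in C# (the port name, baud rate, and command strings below are made up for the example; they are not the actual protocol my robot uses):

using System;
using System.IO.Ports;

class CommandLink
{
    static void Main()
    {
        using (SerialPort port = new SerialPort("COM3", 115200))
        {
            port.NewLine = "\n";
            port.ReadTimeout = 5000;          // give the sub-controller time to finish the move
            port.Open();

            SendAndWait(port, "FWD 4");       // "move robot forward 4 inches"
            SendAndWait(port, "TURN R 15");   // "turn robot right 15 degrees"
        }
    }

    static void SendAndWait(SerialPort port, string command)
    {
        port.WriteLine(command);
        // The high-level side doesn't care how the move is executed; it only
        // waits for the completion acknowledgement from the sub-controller.
        string reply = port.ReadLine();
        Console.WriteLine(command + " -> " + reply);   // e.g. "FWD 4 -> DONE"
    }
}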

The BB was chosen for this because it has a very reasonable price, is open-source hardware, and is impressively capable. It also will house a lot of the other features that will be proprietary to my platform, and it offers a way to house those functions more securely.

Anyway, I just wanted to give an example of where this would be nice.

I love the EZB and the ease of ARC. I do believe that it is great for most situations, but like everything there are a couple of things that could be improved on or expanded on a bit. I just happened to need some of these in this project.

#15  

@thetechguru That was some useful info, thanks! Using an embedded platform sounds good. Could you explain (or point me to another resource that could) how the SDK is used to pass variables between ARC and an external app? This might be really useful! Thanks a ton!

@Toymaker and @d.cochran Yeah interrupts are definitely more useful on a sub-system processor.

-Eric

#16  

Eric, send me an email (look at my profile) and I will send you some code today that does this. There is a class that I have built in C# that handles this and can be dropped into a project and used.

PRO
Synthiam
#17  

I think the concern is more a matter of a different perspective on approaching development with the EZ-B. Take the Honda ASIMO, for example, which uses a very similar approach. The processing speed and operating system benefits of a PC make it a superior platform for high-level processing and decision making. Due to speed limitations between the PC and a controller, there is no solution for writing microcontroller-like code on a PC. The PC is not a microcontroller, and therefore should not have code written on it that performs microcontroller-style loops or expects hardware interrupts, etc. The solution is to treat the computer as a high-level processor for doing things that micros just can't do very well or that require a great deal of code. Examples of these functions are speech, speech recognition, vision (color/glyph/object/etc.), graphical display, and controller peripherals (joysticks/etc.).

To perform lower-level functions, such as monitoring trigger switches or interrupts, combine the EZ-B with a microcontroller/Arduino. If the EZ-B is in the way of interrupt probing, consider connecting the Arduino directly to ARC via a serial port - even though that still won't solve much, because checking for a valid packet length while the packet is still transmitting produces what Tony refers to as an error. It's not an error; it's expected behavior for packet transmission.

As in Tony's application, the controller can connect to a PC serial port instead of the EZ-B. This way ARC is not blocking the communication protocol to the EZ-B while probing for serial data.
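As a sketch of that point about partial packets (the frame size, COM port, and baud rate below are assumptions for illustration only), the reading side should simply keep reading until the whole frame has arrived, rather than treating a short read as an error:

using System;
using System.IO.Ports;

class PacketReader
{
    const int PacketLength = 6;   // assumed fixed frame size from the sub-controller

    static void Main()
    {
        using (SerialPort port = new SerialPort("COM4", 57600))
        {
            port.Open();
            byte[] packet = ReadPacket(port);
            Console.WriteLine("Got full packet: " + BitConverter.ToString(packet));
        }
    }

    static byte[] ReadPacket(SerialPort port)
    {
        byte[] buffer = new byte[PacketLength];
        int received = 0;

        // A short read just means the packet is still transmitting, which is
        // expected behavior, not an error; keep reading until the frame is complete.
        while (received < PacketLength)
        {
            received += port.Read(buffer, received, PacketLength - received);
        }
        return buffer;
    }
}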

If there were an interrupt handler built into the EZ-B, it would be abused and would flood the communication channel, given the programming model being discussed here - unless it was understood that it should be used with caution.

On that note, I will add an interrupt function in the next hardware update, which I will briefly introduce at the end of this response.

Something that is overlooked in networking is latency vs speed. An example is to consider the speed of sound. Let's pretend that you're 1000 ft away from me across a calm lake. You can yell words across the lake as fast as you want, but I will only start hearing them a second later. Sure, once I start hearing them I can begin executing your commands as fast as they come. Now, this is where it gets interesting... You're expecting me to respond before sending the next commands. Well, you now have to wait 2 seconds for the response: one second while I wait for your last word to reach me, and another second for my acknowledgement to reach you. This means I also have to wait 2 seconds while my response heads to you and your next commands head to me.

So when you perform this method of communication over and over, the delays of waiting for commands due to latency create significant challenges. I'm giving this example because many people say "well, WiFi is 10 mb per second!". Yes, it's XX mb/sec in sustained throughput, but that's all that number tells you. Keep in mind that delivering even a single byte of information involves many, many ACK and SYN exchanges. More information on that can be found by researching the OSI model.
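As a tiny worked example of that point in C# (the 5 ms round trip is an illustrative assumption, not a measured EZ-B figure), a strict send-then-wait protocol is capped by latency no matter how much raw bandwidth the link has:

using System;

class LatencyMath
{
    static void Main()
    {
        double roundTripMs = 5.0;                            // assumed WiFi round-trip time
        double maxExchangesPerSecond = 1000.0 / roundTripMs;

        // Prints 200: even on a link with megabits of throughput, a chatty
        // one-command-at-a-time protocol is bounded by latency, not bandwidth.
        Console.WriteLine(maxExchangesPerSecond + " request/response exchanges per second");
    }
}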

This is also one of the challenges when designing the ezrobot camera. Some say "I have a WiFi camera and it has blah blah HD resolution". Well, what they don't have is super duper low latency compared to the ezrobot camera. Video is expected to be heavily compressed, buffered, transmitted, and buffered again, rinse and repeat. That is because it doesn't really matter if your television show or movie starts playing a second or two after the realtime content. With robotics it very much matters! This is why it's a challenge.

The true way to overcome this challenge is to treat the ezb as what it is, NOT a microcontroller. :)

In the meantime, I can leave you with this bit of information. Jeremie is evaluating some new WiFi modules at the moment, which are smaller and will provide an additional ARM processor on the top (communication) PCB of the EZ-B. If this works, we will be releasing an upgraded communication PCB which will allow custom code to be written on the second ARM, using its own communication channel to ARC. This will be exciting for you all to code directly, and will also give you first-hand experience with the challenges of communication latency.