
ericsims
USA
Asked
— Edited
Hello. I do not have an EZ-B yet, but I have a general question about programming the controller. I understand that ARC + EZ-Script allows you to control the robot when connected over WiFi, but is there a way to load code onto the EZ-B v4 so it actually runs on the platform? I want to try to reduce the network overhead and any resources required on the control computer.
Thanks
-Eric
Thanks for the quick response!
-Eric
Don't worry... if you hang around here long enough we will be able to "beat" the Arduino outta' ya'.....
I have the LPC11Uxx on some PCBs for projects; it's great for embedded stuff where you don't need the whole dev platform.
-Eric
@WBS and @RR have you well covered, but I figured I'd jump in and add that there's always the ability to use the EZ-B as the master and an Arduino or LPC11Uxx as a slave device feeding the master sensor data, interfacing to addressable LEDs, etc. You can interface to a slave device via UART or I2C if you'd like and have a hybrid robotic system. A few forum members have gone this route; check out some other threads around the forum if you are curious.
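The slave-feeding-the-master idea above boils down to a tiny framing protocol on the UART or I2C link. Here's a minimal Python sketch of one possible frame format; the header byte, field sizes, and checksum scheme are my own invention for illustration, not an EZ-B or Arduino standard:

```python
import struct

HEADER = 0xAA  # hypothetical start-of-frame marker

def pack_reading(sensor_id: int, value: int) -> bytes:
    """Frame a sensor reading: header, sensor id, 16-bit value, checksum."""
    payload = struct.pack(">BBH", HEADER, sensor_id, value)
    checksum = sum(payload) & 0xFF
    return payload + bytes([checksum])

def unpack_reading(frame: bytes):
    """Validate and decode a frame; returns (sensor_id, value) or None."""
    if len(frame) != 5 or frame[0] != HEADER:
        return None
    if sum(frame[:4]) & 0xFF != frame[4]:
        return None  # corrupted in transit
    _, sensor_id, value = struct.unpack(">BBH", frame[:4])
    return sensor_id, value
```

The master side just reads five bytes at a time and discards anything that fails the checksum, which keeps the slave free to stream readings continuously.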
@jeremie I will have to check that out! I thought about this before, but didn't want to expend that much effort.
I also have another question for you guys: I know the SDK contains a lot of useful stuff, but is there some way to call programs created in ARC from the SDK? This would be the same kind of thing. I am trying to streamline my idea but retain the ability to use ARC "in line" with the rest of my code.
Thanks
-Eric
You can now write your own plugins that become custom ARC controls within the GUI. This essentially opens up ARC for all kinds of third party development.
Tony
I'm glad you were able to find a workaround for the UART communication, but I am surprised by your comment about the lack of interrupts.
Since ARC is a multi-threaded system and all variables are global, you can interrupt a process very easily. Obviously it's not a true hardware interrupt that halts any process at any time, but you can use a "check for flag" anywhere in your EZ-Scripts that looks for the change of a single variable, then use the Halt() command to exit the script and do something else.
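The flag-polling pattern described above translates to any multi-threaded environment. Here is a rough Python analogue, where a `threading.Event` stands in for the shared ARC variable and exiting the loop stands in for `Halt()`:

```python
import threading
import time

stop_flag = threading.Event()   # stands in for a shared global variable
progress = {"steps": 0}

def script():
    """Long-running loop that polls the flag on every iteration,
    analogous to a "check for flag" + Halt() in an EZ-Script."""
    while not stop_flag.is_set():
        progress["steps"] += 1   # one unit of work
        time.sleep(0.005)

t = threading.Thread(target=script)
t.start()
time.sleep(0.05)      # let the "script" run for a moment
stop_flag.set()       # another thread "interrupts" it by setting the flag
t.join()              # the loop notices the flag and exits cleanly
```

The trade-off is exactly the one raised in the thread: the worst-case response time is one full loop iteration, not the near-instant preemption of a hardware interrupt.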
I'm curious as to where a hardware interrupt would be needed. Possibly an emergency button? I would likely just use a button that physically cuts power in that case. Possibly a reset button? I would use a button that momentarily interrupts incoming power for that. ....I'm still very curious.....
One example of using an interrupt in robotics (not necessarily in this case, more so on a child processor) is for a digital encoder; that way you catch all of those pesky little pulses. I have also used interrupts to monitor IC communication, for example a D latch whose latch pin is only held high for 50 microseconds. It's just a good way to solve problems.
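The encoder case is the clearest one: a polling loop can simply be busy elsewhere when a short pulse arrives, while an interrupt service routine fires on every edge. A toy Python model of that edge-counting idea (on a real MCU, `on_edge` would be registered as the pin-change ISR; this is purely a conceptual sketch):

```python
class EncoderCounter:
    """Counts edges delivered by a (simulated) pin-change interrupt.

    The whole point of using an ISR for this is that even a pulse only
    tens of microseconds wide still triggers the handler, whereas a
    polling loop would miss it unless it happened to be looking.
    """
    def __init__(self):
        self.ticks = 0

    def on_edge(self):
        self.ticks += 1   # keep the ISR body minimal: count and return

enc = EncoderCounter()
for _ in range(48):       # simulate 48 encoder pulses arriving
    enc.on_edge()
```

With the count accumulated in the handler, the main loop can read `ticks` at its leisure to derive distance or speed without ever losing a pulse.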
-Eric
In most cases the SDK is being used to pass variables between ARC and an external app. It is certainly possible to build all of your functionality in the SDK, and with the Universal Bot SDK, run it on pretty low-end machines (a Raspberry Pi, for instance), but ARC makes the basics so easy that only those looking for a challenge go that route. The new plugin capability will probably replace a lot of the current SDK functions.
Alan
Tony
An example of this is the IR sensors that detect edges or drop-offs like stairs. When the robot is moving and a drop-off is detected, the motors are stopped. This happens in a couple of milliseconds. The information is then sent back to the EZ-B, which relays it to ARC, letting it know that an edge was detected and the robot was stopped.
Half a second of delay could cause the robot to fall down the stairs. This architecture also allows for a simplified method of controlling devices and motors from ARC or the SDK. For example, you are able to use a serial command like "move robot forward 4 inches" or "turn robot right 15 degrees". ARC or the SDK doesn't need to know how these commands accomplish their tasks; it just receives a reply that the task has been completed.
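That high-level command style amounts to a small text protocol on the sub-controller: parse a verb and arguments, do the low-level work locally, and reply only when done. A minimal Python sketch of the idea (the verbs, argument format, and reply strings are invented for illustration; the actual protocol described in this thread is proprietary):

```python
def handle_command(cmd: str) -> str:
    """Parse a high-level motion command and return a completion reply.

    On a real sub-controller, the elided sections would drive the
    motors while watching encoders and edge sensors locally, so the
    host never deals with those details.
    """
    verb, *args = cmd.split()
    if verb == "forward":
        inches = float(args[0])
        # ...drive motors, monitor encoders/edge sensors, stop...
        return f"done forward {inches}"
    if verb == "turn":
        degrees = float(args[0])
        # ...pivot using local feedback...
        return f"done turn {degrees}"
    return "error unknown-command"
```

The host simply sends `"forward 4"` and blocks (or polls) until the `done` reply arrives, which is exactly the "it just receives a command back that the task has been completed" model.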
The BB was chosen for this because it has a very reasonable price, is open-source hardware, and is pretty impressive hardware. It will also house a lot of the other features that will be proprietary to my platform, and it offers a way to house those functions more securely.
Anyway, I just wanted to give an example of where this would be nice.
I love the EZB and the ease of ARC. I do believe that it is great for most situations, but like everything there are a couple of things that could be improved on or expanded on a bit. I just happened to need some of these in this project.
@Toymaker and @d.cochran Yeah interrupts are definitely more useful on a sub-system processor.
-Eric
To perform lower-level functions, such as monitoring trigger switches or interrupts, combine the EZ-B with a microcontroller/Arduino. If the EZ-B is in the way of interrupt probing, consider connecting the Arduino directly to ARC via a serial port. Even that still won't solve much, though, because checking for a valid packet length while the packet is still transmitting will produce what Tony refers to as an error. It's not an error; it's expected behavior for packet transmission.
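The "short packet isn't an error" point can be made concrete: the receiver should accumulate bytes until the expected length has arrived, rather than treating a mid-transmission buffer as a failure. A hedged Python sketch, where `read_chunk` stands in for a non-blocking serial read (the function names are mine, not any library's API):

```python
def read_packet(read_chunk, expected_len, max_tries=100):
    """Accumulate serial bytes until a full packet has arrived.

    Seeing fewer bytes than expected mid-transmission is normal, not
    an error: we simply keep reading. Only exhausting max_tries is a
    genuine timeout.
    """
    buf = b""
    for _ in range(max_tries):
        buf += read_chunk()
        if len(buf) >= expected_len:
            return buf[:expected_len]
    return None  # genuine timeout

# Simulate a 4-byte packet that dribbles in one byte at a time.
incoming = iter(b"\x01\x02\x03\x04")
packet = read_packet(lambda: bytes([next(incoming)]), 4)
```

Each early call sees a partial buffer and just loops again; the caller only acts once the full frame is present.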
As in Tony's application, the controller can connect to a PC serial port instead of the EZ-B. This way ARC is not blocking the communication protocol to the EZ-B while probing for serial data.
If there were an interrupt handler built into the EZ-B, it would be abused and would flood the communication channel due to the programming model being discussed here, unless it was understood that it should be used with caution.
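One way to picture the flooding concern: a noisy interrupt line firing hundreds of times a second would each time push a notification onto the shared channel. Any practical design would have to coalesce bursts, something like this Python sketch (a generic rate-limiter of my own devising, not a description of any planned EZ-B feature):

```python
import time

class ThrottledNotifier:
    """Forward interrupt events over a shared channel at most once per
    min_interval seconds, counting (but dropping) the rest of a burst."""

    def __init__(self, min_interval=0.1, clock=time.monotonic):
        self.min_interval = min_interval
        self.clock = clock
        self.last_sent = float("-inf")
        self.sent = 0
        self.dropped = 0

    def on_interrupt(self):
        now = self.clock()
        if now - self.last_sent >= self.min_interval:
            self.last_sent = now
            self.sent += 1      # would transmit one notification here
        else:
            self.dropped += 1   # coalesce the rest of the burst

# A burst of 100 interrupts at one (fake) instant sends only one message.
n = ThrottledNotifier(min_interval=0.1, clock=lambda: 0.0)
for _ in range(100):
    n.on_interrupt()
```

Without something like this, a bouncing switch alone could saturate the channel that every other ARC command depends on.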
On that note, I will add an interrupt function in the next hardware update, which I will lightly introduce in this response.
Something that is overlooked in networking is latency vs speed. As an example, consider the speed of sound. Let's pretend you're 1000 ft away from me across a calm lake. You can yell words across the lake as fast as you want, but I will only start hearing them about a second later. Sure, once I start hearing them I can begin executing your commands as fast as they come. Now, this is where it gets interesting: you're expecting me to respond before sending the next command, so you now have to wait two seconds for the response. One second while your last word travels to me, and another second while my acknowledgement travels back to you. This means I likewise have to wait two seconds while my response heads to you and your next command heads to me.
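The lake arithmetic works out as follows, using the standard speed of sound in air at roughly 20 °C (the post rounds 1000 ft to about one second per direction):

```python
SPEED_OF_SOUND_FT_S = 1125.0   # approx. speed of sound in air at ~20 degC

def round_trip_delay(distance_ft: float) -> float:
    """Total wait for one command/response exchange: one trip each way.

    Note the distance (latency) sets this floor no matter how fast the
    words themselves are spoken (throughput).
    """
    one_way = distance_ft / SPEED_OF_SOUND_FT_S
    return 2 * one_way

delay = round_trip_delay(1000)   # roughly 1.8 s across the lake
```

That floor is the whole point: yelling faster (more throughput) does nothing to shrink it, only moving closer (less latency) does.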
So when you perform this method of communication over and over, the delays created by latency add up to significant challenges. I'm giving this example because many people say, "well, WiFi is 10 Mb per second!" Yes, it's XX Mb/sec in sustained throughput, but that's it. Keep in mind that transmitting even a single byte of information involves many, many ACK and SYN exchanges. More on that can be found by researching the OSI model.
This is also one of the challenges in designing the EZ-Robot camera. Some say, "I have a WiFi camera and it has blah blah HD resolution." Well, what they don't have is super duper low latency compared to the EZ-Robot camera. Video is normally heavily compressed, buffered, transmitted, and buffered again, rinse and repeat. That's because it doesn't much matter if your television show or movie starts playing a second or two after the real-time content. With robotics, it very much matters! This is why it's a challenge.
The true way to overcome this challenge is to treat the ezb as what it is, NOT a microcontroller.
In the meantime, I can leave you with this bit of information. Jeremie is evaluating some new WiFi modules at the moment, which are smaller and will provide an additional ARM processor on the top (communication) PCB of the EZ-B. If this works, we will be releasing an upgraded communication PCB that allows custom code to be written on the second ARM, using its own communication channel to ARC. This will be exciting for you all to code directly, and will also give you first-hand experience with the challenges of communication latency.