Hi, guys. I am considering EZ-Robot as a kit to experiment with and demonstrate an AI system I am working on. Here is a video of an older version of my AI system in an RS Media robot, if you are interested. EZ-Robot seems like a good system because of its high-level approach and flexible design. I was wondering if you could help me with some questions:
I would like to run ARC in parallel with my C++ AI program. In other words, I would like them to run as separate executables that are capable of communicating with one another. I don't need or want low-level control (like controlling individual servos). I envision designing scripted motions in EZ-Builder and triggering them from my C++ program. Additionally, I would need to be able to retrieve sensor feedback in my C++ code. Does anybody have suggestions or links to ways I could do that? FYI, I am aware that ARC supports plugins, but they are in C#.
One of my concerns is encountering limitations on the number of sensors or servos you can attach to one board. To clarify, I envision making a robot similar to the rover kit, but with these possible additions: 2 cameras instead of 1, 4-5 LED blocks, 4-5 touch or ultrasonic sensors, the 4-in-1 orientation sensor, and maybe a few more I haven't thought of yet.
I noticed that touch sensors are conspicuously absent from the parts list. Can the ultrasonic sensor be used as a pseudo touch sensor, e.g. triggering when something is < 1 cm away from it? Can multiple ultrasonic sensors function without interference from each other?
My ambitions are very sensor-oriented, so the more types of sensors, the better. Are there any 3rd party sensors that you would recommend that have easy integration?
Is a microphone integrated into one of the parts, or do we have to supply a 3rd party solution for audio input?
Is it possible to get the RGB pixel data from the camera in another program? It sounds like EZ-Builder supports basic color and scan code recognition, but I may want to try my hand at something like shape recognition in the future.
It will take a good amount of time to complete my current AI version, but I am excited about integrating it into a robot in the future. Thanks for the help!