
Andy Roid
Many members of this forum have discussed and built various styles of indoor navigation. Some designs were fairly simple with minimal accuracy; others were developed with great sophistication and complexity, and of course the resulting performance was very good.
Some approaches were based on camera feedback using glyphs, colors, or objects as waypoints; others used ultrasonic distance measurements, compass headings, encoder values, LIDAR feedback, and so on, either as standalone sensors or in combination. Beacons and other point-of-location devices have also been used as "the robot is here" references.
I am curious what has worked for you and what has not. I am not looking for the details of the script you built (unless you want to share it), but rather a direction to go to build something that will work reasonably well. Information on your successes and failures would be appreciated. I have built a few scripts but never got what I really wanted. I would also like to keep the cost down, so I'd rather not use LIDAR or specialty sensors.
I have an Adventure Bot which has a camera on a servo, three ultrasonic sensors mounted front and sides, a compass which is not currently mounted but is available, and the ability to get pulsed signals from the 5-spoke wheels mounted on the continuous-rotation servos (a poor man's encoder, approx. 1.67 inches or 42.7 mm per step).
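To show how I picture using those wheel pulses, here is a rough dead-reckoning sketch in Python (illustration only; the function name update_position, the step constant, and the compass convention are my assumptions, not anything from an existing script). It combines the ~1.67-inch pulse step with a compass heading to update an estimated position:

```python
import math

STEP_INCHES = 1.67  # approximate travel per spoke pulse, as quoted above

def update_position(x, y, pulses, heading_deg):
    """Dead-reckon a new (x, y) in inches from a pulse count and compass heading.

    heading_deg -- compass heading in degrees, 0 = "north" (+y), increasing clockwise.
    """
    dist = pulses * STEP_INCHES
    rad = math.radians(heading_deg)
    return x + dist * math.sin(rad), y + dist * math.cos(rad)

# Example: from the origin, 10 pulses while heading 90 degrees (east)
print(update_position(0.0, 0.0, 10, 90))  # ~ (16.7, 0.0)
```

Obviously the error grows with every step, which is part of why I am asking what has actually worked for people.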
Open for discussion.
I look forward to feedback on this subject.
Ron
Hello @rz90208, Thanks to DJ, I was able to get my plugin loading issue resolved.
When I add your plugin and code, it runs without errors, with of course the limits you mentioned of no movement or map updates. I plan on working in that area.
I notice the initial array returned by the plugin seems to consist of numbers like 45, 44, 43 rather than the expected "-", 0, or 255.
Was your plan to initially drive the robot through the room manually and have a sensor push the 255 wall/obstacle values into the array?
Does the plugin have a means to store the recorded data for the playback run?
How do you intend to define when the desired "-" route values are entered?
Thanks, Frank
Hello Frank, I have not had time to do much work on this for a while, but the plan is to draw the initial map in Microsoft Excel, then use a VB Script to convert the spreadsheet into a single-dimension array and save it to a file I can load later.

That way, when the robot is initialized, the first thing he says is basically what anyone would say if they woke up in an unknown environment: "Where am I?" I then tell the robot, "You are at home" or "You are in the garage," and he loads the appropriate map.

The next step is where I am struggling: the robot needs to know where on the map he is located. One way is to always initialize the robot in the same place every time; another is to use vision, putting QR codes around the house and using them to triangulate his position. However you do it, once the robot knows where it is located, that position is marked in the map array. Then, when you tell the robot to go to the living room, the destination location is entered into the map and the plugin is called. The plugin processes the map with a wavefront routine and returns an array, $MyRoute. From the robot's current location you follow the dash marks "-" to the destination.

EZ-AI was to be a big piece of this, but with that project no longer in production, I have to rethink a few things.
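To make the wavefront step concrete, here is a minimal sketch in Python (the plugin itself is not written in Python; names like wavefront_route and the WALL/FREE constants are my assumptions based on the 255/0/"-" convention discussed above). It floods distances outward from the destination over a single-dimension, row-major map array, then walks downhill from the robot's current cell to recover the route cells that would be marked with "-":

```python
from collections import deque

WALL = 255  # obstacle cell, per the convention discussed above
FREE = 0    # open floor

def wavefront_route(cells, width, start, goal):
    """Return the list of (row, col) cells from start to goal, or None.

    cells -- single-dimension list, row-major, values FREE or WALL
    width -- number of columns in the grid
    start -- (row, col) of the robot's current location
    goal  -- (row, col) of the destination
    """
    height = len(cells) // width
    idx = lambda r, c: r * width + c

    # 1. Wavefront: flood outward from the goal, labelling each reachable
    #    free cell with its distance (in steps) from the goal.
    dist = [-1] * len(cells)
    dist[idx(*goal)] = 0
    queue = deque([goal])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < height and 0 <= nc < width:
                i = idx(nr, nc)
                if cells[i] != WALL and dist[i] == -1:
                    dist[i] = dist[idx(r, c)] + 1
                    queue.append((nr, nc))

    if dist[idx(*start)] == -1:
        return None  # the destination is unreachable from the start cell

    # 2. Walk downhill from the start, always stepping to a neighbour with a
    #    smaller distance value, until the goal is reached.  These are the
    #    cells a routine like this would mark with "-" in the route array.
    path = [start]
    r, c = start
    while (r, c) != goal:
        r, c = min(
            ((r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
             if 0 <= r + dr < height and 0 <= c + dc < width
             and dist[idx(r + dr, c + dc)] != -1),
            key=lambda p: dist[idx(*p)])
        path.append((r, c))
    return path

# Example: a 5x5 room with a wall segment across the middle.
if __name__ == "__main__":
    W = 5
    room = [FREE] * 25
    for col in (1, 2, 3):
        room[2 * W + col] = WALL
    print(wavefront_route(room, W, start=(4, 2), goal=(0, 2)))
```

The appeal of the wavefront approach here is that it works directly on the same single-dimension array the Excel map is converted to, and the returned cell list maps straight onto the "-" marks in $MyRoute.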
RZ90208 RichardZ