I am making a plugin to use the XV-11 LIDAR with the GetSurreal 1.2 controller in ARC, but I am having an issue at step 6.5 of the plugin tutorial. I tried creating the stub plugin and going to My Plugins, but I get a 404 error.
Just letting you know. Let me know if there is anything I can do to get the XML file that is required.
Thanks David
@Merne, the values are returned in mm, I believe. The shortest distance the LIDAR reads reliably is about 6 cm. I haven't gotten a yardstick out to measure, but I am pretty sure they are mm. Putting this over an image or stream from the camera is possible. I will first complete this module (I thought of some other things last night that could be added) and then worry about the display.
@Richard, Thanks man.
This plugin got me thinking. The way this works is the way Rafiki works in general: there are controllers plugged into USB ports on a computer. Right now I have a program that ties it all together, but I could make ARC skill plugins for each of the components and just run everything through ARC. That would let me use scripting where needed to communicate between these components of the robot. I have been considering selling individual pieces of the robot separately, which would allow someone to put just the pieces they want into their own build.
On a side note, I have been experimenting with measuring distances to objects with a single camera using a technique that first showed up around 2006. It uses a piece of paper of a known size at a known distance as a calibration mechanism. From there, it can pick up corners of objects and calculate their distance to the camera. I have some code from 2006 that runs on .NET 2.0, which I will need to modernize. This, along with my parking ping sensors and the LIDAR, should provide a pretty complete view of the world around the robot.
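The math behind it is simple triangle similarity: calibrate an effective focal length from the known-size target, then invert the same ratio to get distance. A rough C# sketch (the names here are illustrative, not from the 2006 code):

    // Calibrate once with an object of known width at a known distance.
    // Returns the camera's effective focal length in pixels.
    static double CalibrateFocalPixels(double knownWidthMm, double knownDistanceMm, double measuredPixelWidth)
    {
        return (measuredPixelWidth * knownDistanceMm) / knownWidthMm;
    }

    // Estimate distance to an object of the same known width from its apparent width in pixels.
    static double EstimateDistanceMm(double knownWidthMm, double focalPixels, double observedPixelWidth)
    {
        return (knownWidthMm * focalPixels) / observedPixelWidth;
    }

    // Example: a 215.9 mm wide sheet of paper at 500 mm that appears 300 px wide
    // gives focalPixels of about 695; the same sheet later seen at 150 px wide
    // is then about 1000 mm away.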
@David,
Instead of connecting the LIDAR to the PC, is it not possible to connect it to the EZB and get the data through UART methods?
I've been travelling and haven't responded to this yet. For your Array question...
You can push data into the EZ-Script Variable Manager as an array. Here is an example of setting a number of indexes to random integers:
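(A minimal sketch; the CreateVariableArray and indexed SetVariable names follow the plugin SDK's EZ_Builder.Scripting.VariableManager, but check the SDK for the exact signatures.)

    private void btnTest_Click(object sender, EventArgs e)
    {
        Random rnd = new Random();

        // Create (or reset) a 10-element array named $atest in the Variable Manager
        EZ_Builder.Scripting.VariableManager.CreateVariableArray("$atest", 10);

        // Set each index to a random integer between 0 and 99
        for (int i = 0; i < 10; i++)
            EZ_Builder.Scripting.VariableManager.SetVariable("$atest", rnd.Next(0, 100), i);
    }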
If you add the above code to a test button on your plugin and press it, you will see all of the data appear in the Variable Manager as $atest every time the button is pressed.
@JustinRatliff, you can get just the LIDAR units on eBay, sometimes for as low as $75.
@ptp, not with this plugin. It would be possible, but it could flood the communications pipe between the EZB and the computer.
This plugin works with the GetSurreal controller specifically.
@DJ, thank you. I will look into this.
I should have a new version of this plugin out today with the added features I mentioned.
Hey Mike, I just looked at my email and saw what you did. Thank you bud. It is appreciated. Have a Merry Christmas.
There is now a video on the plugin. I hope it helps explain things. If you have questions, let me know.
@David... Just finished watching your video of the LIDAR... Simply awesome, man.... OK, so can these data points (maybe in a later version) be saved to a file to generate a room map? And can this data be used to navigate, like SLAM, or is this beyond the scope of what can be done with this plugin programming-wise?... The only two guys I know who can possibly write this type of software are you and DJ....
The points can be written to an array and housed for building a map. Actually, the plugin is already doing this in ARC ($LIDAR). The part that interests me is that now that the data is in ARC, with ARC's ability to run everything in its own thread, the scripting that is possible in ARC, and the ability to use a lot of other components (either via plugins or otherwise), yes, I do believe SLAM would be possible. DJ has done some of this in the past with limited sensors. With the addition of this plugin and sensor, building an array that would be a map should be possible, with a lot more data available to make decisions on.
Let me explain a bit. Let's say that you took a reading at a position you called x1,y1. You have a compass, so you know which direction you are heading. You move forward for a specific amount of time or distance and take another reading at x1,y2. Based on your movement and the difference between these points, you should be able to start building a map; see the sketch below.
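To make that concrete, here is a rough C# sketch of turning one reading into a world-frame map point, assuming the robot's pose is an (x, y) position plus a compass heading in degrees, and each LIDAR reading is a beam angle and a distance like the $LIDAR data (the names are illustrative):

    using System;

    static class Mapping
    {
        public struct MapPoint { public double X; public double Y; }

        // Project one LIDAR reading into world coordinates given the robot's pose.
        public static MapPoint ToWorld(double robotX, double robotY, double headingDeg,
                                       double beamAngleDeg, double distanceMm)
        {
            // Combine the compass heading with the beam angle, then project the
            // measured distance out from the robot's current position.
            double a = (headingDeg + beamAngleDeg) * Math.PI / 180.0;
            MapPoint p;
            p.X = robotX + distanceMm * Math.Cos(a);
            p.Y = robotY + distanceMm * Math.Sin(a);
            return p;
        }
    }

    // Scan at x1,y1, move, scan again at x1,y2: both scans project into the same
    // world frame, and the accumulated points are the beginning of a map.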
As I see it, the real difference is how far out you can see. This sensor will see across a room with no issue. A ping sensor is far more limited in its range but is great up close. Other IR sensors are also limited in range, so building a map with them is difficult.
I do think it is important that you have multiple sensors validating data from each other in an autonomous robot though. This is why I use those bumper sensors and why I am looking at the single camera distance thing. I would love to see or participate in creating a SLAM module for ARC. It would be a really cool feature to have.