Asked

Hi @DJ, probably off topic, but if you were to purchase a lidar for ARC, would you buy the Hitachi-LG LDS or the RPLIDAR A1? The link at the top of this page for hardware specs appears to be outdated, but the Robotis page says a range of 120 mm to 3,500 mm, about 5% error, 300 RPM, and 1-degree resolution. https://www.robotis.us/360-laser-distance-sensor-lds-01-lidar/
The RPLIDAR appears to have a much larger range of 12 meters, and it seems easy to find, with plenty available for sale. https://www.slamtec.com/en/Lidar/A1Spec
I don't own either, so I am just trying to work out which would be the best one to purchase to work with ARC. Both seem to be about the same price, $130 to $140 Canadian, although the Hitachi is hard to find.
The RPLIDAR is my favorite. The Hitachi is okay but difficult to come by, and I find it doesn't produce as many scans per rotation. It seems the RPLIDAR has a fuller scan, if that makes sense. We're doing a YDLIDAR robot skill for a client right now - that'll probably be up in a week or so. But bang for the buck is the RPLIDAR for sure. I use it on my test robot.
Do you also have a T265? They work better together than with wheel encoders.
PS, I made a new thread for you
Oh, and don't ask about hacking the Neato XV-11. It's such a piece, ugh... I wasted a few days on it a while ago because a school had asked about it. I did have a robot skill working for it, but I won't publish it; the sensor performed so badly it would reflect poorly on our software. There aren't words for how bad it is. haha
Thanks for the thread, @DJ. Yes, I have a T265, and my Roomba had a lot of fun running around the house annoying the family with it.
OK, RPLIDAR it is!!!
Yeah, the RPLIDAR combined with the T265 is going to be wild for you to experience. I've been making a thing today - it's 3D printing the 4th modification now. I'll show it tomorrow when it's done because it'll be another few hours. It's a 3D-printed holder for the RPLIDAR and T265 together. It might be helpful for your new build.
I'd like to see an IMU used for pose telemetry. I know it'll suffer from drift, but I feel that if the IMU were combined with the wheel encoder, it would be very accurate. It could be whipped up quickly with an Arduino and a low-cost IMU. But I'm not sure about the algorithm to get distance from acceleration, or how to fuse it with the wheel encoder. I'm guessing an extended Kalman filter would be the best choice, but the advanced math is beyond my concentration.
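The basic idea looks something like this 1-D sketch: the IMU acceleration drives the prediction, and the encoder position corrects the drift. Every number here is a guessed placeholder, not from real hardware, so treat it as a starting point rather than a working filter.

```python
# Minimal 1-D Kalman filter fusing a wheel encoder (position measurement)
# with an IMU (acceleration used in the prediction step).
# All noise values are made-up placeholders; tune them on real hardware.
import numpy as np

dt = 0.02                            # assumes a 50 Hz sensor loop
F = np.array([[1, dt], [0, 1]])      # state transition for [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])  # how acceleration enters the state
H = np.array([[1.0, 0.0]])           # the encoder measures position only
Q = np.diag([1e-4, 1e-3])            # process noise (IMU drift), guessed
R = np.array([[1e-3]])               # encoder measurement noise, guessed

x = np.zeros((2, 1))                 # state: position (m), velocity (m/s)
P = np.eye(2)                        # state covariance

def step(accel_imu, pos_encoder):
    """One predict/update cycle: IMU accel predicts, encoder corrects."""
    global x, P
    # Predict: integrate the IMU acceleration forward
    x = F @ x + B * accel_imu
    P = F @ P @ F.T + Q
    # Update: pull the estimate toward the encoder's position
    y = np.array([[pos_encoder]]) - H @ x   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return float(x[0, 0]), float(x[1, 0])   # fused position, velocity
```

The extended Kalman filter everyone recommends is this same predict/correct loop generalized to 2-D position plus heading, which is where the math gets heavy.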
The SLAM in The Better Navigator requires a pose hint from some other odometry source, such as a wheel encoder or the T265. The concept behind a pose hint is to give the SLAM algorithm a place to start looking for where it thinks the robot is in the saved map. Without a pose hint, the algorithm has no idea where to begin looking for the robot in the saved map. The map will have millions of data points, and some may look similar to others. So having a secondary, albeit less reliable, source is better than nothing. The T265 works well, but it's discontinued, and we need a replacement solution.
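To illustrate the concept only - this is not the actual implementation, just a toy brute-force matcher - the hint shrinks the search from the whole map to a small window around where the robot probably is:

```python
# Toy illustration of a pose hint: score the lidar scan against the map
# only near the hinted pose, instead of over the entire map.
# The grid resolution, window sizes, and scoring are all made up.
import math

def match_score(scan_points, occupied_cells, pose):
    """Count how many scan points land on occupied map cells at this pose."""
    px, py, pth = pose
    score = 0
    for r, a in scan_points:                  # lidar (range, angle) pairs
        wx = px + r * math.cos(pth + a)       # project into the map frame
        wy = py + r * math.sin(pth + a)
        if (round(wx, 1), round(wy, 1)) in occupied_cells:
            score += 1
    return score

def localize(scan_points, occupied_cells, hint, window=0.5, step=0.1):
    """Search only a small window around the hint. Without the hint, this
    loop would have to sweep the entire map at every possible heading."""
    hx, hy, hth = hint
    best_score, best_pose = -1, hint
    x = hx - window
    while x <= hx + window:
        y = hy - window
        while y <= hy + window:
            for dth in (-0.1, 0.0, 0.1):      # small heading window too
                pose = (x, y, hth + dth)
                s = match_score(scan_points, occupied_cells, pose)
                if s > best_score:
                    best_score, best_pose = s, pose
            y += step
        x += step
    return best_pose
```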
Looking forward to seeing your design and 3D print for the lidar and the T265. I have been meaning to design a 3D model of a head for my dog robot that will hold a T265 for eyes, a microphone for a nose, and stereo speakers for ears, but I need to work on my design skills.
I thought you added glyph support for your pose hint. That appears to be the way everyone else does it: triangulate based on size and location with cheap stereo cameras. Yeah, not as nice as the T265, but that sensor also drifts. My Roomba docking station was how I recalibrated (go back to start before the next mission).
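The size part of that trick is just the pinhole relation: a glyph of known physical size appears smaller in the image the farther away it is. A rough sketch, where the focal length and glyph size are made-up calibration values:

```python
import math

# Placeholders: a real focal length comes from camera calibration,
# and the glyph width is whatever size you printed it at.
FOCAL_PX = 600.0     # focal length in pixels
GLYPH_M = 0.20       # physical glyph width in meters

def glyph_distance(glyph_width_px):
    """Estimate camera-to-glyph distance from its apparent width."""
    return FOCAL_PX * GLYPH_M / glyph_width_px

def glyph_bearing(glyph_center_x_px, image_width_px=640):
    """Estimate the horizontal angle to the glyph, left/right of center."""
    offset = glyph_center_x_px - image_width_px / 2.0
    return math.atan2(offset, FOCAL_PX)   # radians
```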
There are always the indoor positioning systems that use electromagnetic fields, Wi-Fi, Bluetooth, and other RF signals to locate position, like IndoorAtlas. I wonder if there is an open-source project that can use an IMU gyroscope/accelerometer off an Arduino.
Looks like a bunch of open-source projects got started, but I can't find anything that was actually finished or usable.
Why am I now reading papers at 11 pm on Kalman filters for establishing location? I see what you mean about the mathematics. :)
https://www.mdpi.com/1424-8220/22/2/428/htm
Re the Instructables link: that's not a pose hint, it's a complete realignment of the pose. With a glyph, it can only be done when the glyph is in the camera's view. You could keep the robot in view all the time, as a factory would. It depends on your setup. The Synthiam version is called Camera NMS Pose or something. Look at the NMS manual for more info.
The pose hint is what I described above. It needs to happen in real time, a dozen times a second. You'll have to read what I wrote about the pose hint. To summarize, at the risk of repeating myself: it's a hint, supplied every time the SLAM routine runs, of where to search in the in-memory map array.
OK, I understand the problem a little better now, and I have two wheel encoders from another build. The problem with my build is the mecanum wheels; it goes sideways (well, it is supposed to if I get it working properly), so I'm not sure wheel encoders would work very well. An old mouse-ball solution might be the trick here, but it stops working when you lift the mouse off the table, and you have no way of knowing where the physical mouse actually started from.
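Then again, from what I've been reading, the standard mecanum forward-kinematics equations can recover the sideways motion if all four wheels have encoders, so maybe encoders would work after all. A rough sketch of the math; the wheel order, signs, and geometry numbers are assumptions that depend on the roller orientation:

```python
import math

# Placeholders for the robot's geometry:
R = 0.048             # wheel radius in meters
LX, LY = 0.10, 0.12   # half the wheelbase and half the track width

def body_velocity(w_fl, w_fr, w_rl, w_rr):
    """Wheel angular velocities (rad/s) -> body-frame vx, vy, omega."""
    vx = (R / 4.0) * ( w_fl + w_fr + w_rl + w_rr)
    vy = (R / 4.0) * (-w_fl + w_fr + w_rl - w_rr)   # the sideways term
    omega = (R / (4.0 * (LX + LY))) * (-w_fl + w_fr - w_rl + w_rr)
    return vx, vy, omega

def integrate(pose, wheels, dt):
    """Dead-reckon the pose from one encoder sample (Euler integration)."""
    x, y, th = pose
    vx, vy, om = body_velocity(*wheels)
    x += (vx * math.cos(th) - vy * math.sin(th)) * dt
    y += (vx * math.sin(th) + vy * math.cos(th)) * dt
    th += om * dt
    return x, y, th
```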