NMS Lidar driver for Slamtec RPLidar A1
How to add the Rplidar (A1) robot skill
- Load the most recent release of ARC (Get ARC).
- Press the Project tab from the top menu bar in ARC.
- Press Add Robot Skill from the button ribbon bar in ARC.
- Choose the Navigation category tab.
- Press the Rplidar (A1) icon to add the robot skill to your project.
Don't have a robot yet?
Follow the Getting Started Guide to build a robot and use the Rplidar (A1) robot skill.
How to use the Rplidar (A1) robot skill
This robot skill connects to the Slamtec RPLidar A1 and pushes its scan data into ARC's NMS (Navigation Messaging System) as a Level #3 Group #1 sensor, where it can be used by Level #1 navigation viewers. Read the NMS manual page to understand how this skill can be used.

That might sound more confusing than it needs to be. All it means is that this sensor uses a laser to measure the distances of objects in a 360-degree circle around the robot. That data is pushed to another robot skill that can display it and use it for navigation. The NMS manual explains more about how to set it up, and several videos demonstrate the setup with similar hardware.
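Conceptually, one lidar scan is just a set of (degree, distance) pairs covering the circle around the robot. The sketch below is illustrative only and is not the ARC/NMS API; the function names are made up for this example.

```python
# Illustrative sketch only -- not the ARC/NMS API. A 360-degree lidar
# scan can be modeled as one distance reading (in cm) per degree.

def make_scan(readings):
    """Build a 360-entry scan from sparse (degree, distance_cm) pairs;
    angles with no laser return are left as None."""
    scan = [None] * 360
    for degree, distance_cm in readings:
        scan[degree % 360] = distance_cm
    return scan

def nearest_obstacle(scan):
    """Return (degree, distance_cm) of the closest detected object,
    or None if the scan had no returns."""
    hits = [(d, deg) for deg, d in enumerate(scan) if d is not None]
    if not hits:
        return None
    distance_cm, degree = min(hits)
    return degree, distance_cm

scan = make_scan([(0, 250), (90, 42), (180, 310)])
print(nearest_obstacle(scan))  # -> (90, 42)
```

A navigation viewer consumes this kind of data each revolution of the lidar.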
Main Window

Configuration

1) BaudRate
The baud rate for the communication. The controller included by the manufacturer defaults to 115200.
2) Offset Degrees
Corrects for the angle at which the lidar is mounted.
3) Advanced communication parsing
Include debug information in the log window. This is only recommended if asked by Synthiam to help debug a communication issue. Otherwise, this will use a lot of resources.
4) Set variables with location data
If you're not using the NMS, check this, and global variables will be created for the scan data.
5) Fake NMS Pose hint event
The Better Navigator can use the Hector SLAM pose hint, which will not require a pose sensor. If you have The Better Navigator configured to use Hector as the pose hint, you can enable this option. This will send a "fake" pose hint of 0,0 to the NMS so that the pose event will run after every lidar scan.
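The Offset Degrees setting above can be pictured as rotating every reading by a fixed amount so that index 0 lines up with the robot's true front. A minimal sketch of the idea, assuming a 360-entry scan indexed by degree (the actual correction happens inside the robot skill):

```python
def apply_offset(scan, offset_degrees):
    """Rotate a 360-entry scan so that degree 0 corresponds to the
    robot's forward direction rather than the sensor's mounting angle."""
    return [scan[(deg + offset_degrees) % 360] for deg in range(360)]

scan = [None] * 360
scan[10] = 100  # the sensor, mounted 10 degrees off, sees an object at 10
corrected = apply_offset(scan, 10)
print(corrected[0])  # the object is now reported straight ahead -> 100
```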
NMS (Navigation Messaging System) Stack
This skill operates in Level #3 Group #1. It publishes obstacle detection to the NMS. While some SLAM systems will attempt to determine the cartesian position of the robot without telemetry, it is best to combine this skill with a Group #2 sensor.

How to Use
Program the LIDAR in Blockly
Like all Synthiam controls, the Blockly programming language can be used. In the example below, the robot will speak when an object has come close to it. It will also speak the degrees of where the object was. To see a similar lidar program in action, click HERE.
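The same close-object logic as the Blockly example can be sketched in plain Python. This is not the ARC scripting API; the threshold value and the speech call are stand-ins for illustration.

```python
THRESHOLD_CM = 30  # announce when anything comes closer than this (illustrative)

def on_scan(scan):
    """scan: 360 distance readings in cm (None = no laser return).
    Returns the first (degree, distance_cm) inside the threshold, or None."""
    for degree, distance_cm in enumerate(scan):
        if distance_cm is not None and distance_cm < THRESHOLD_CM:
            return degree, distance_cm
    return None

scan = [None] * 360
scan[45] = 22
hit = on_scan(scan)
if hit is not None:
    degree, distance_cm = hit
    # Stand-in for ARC's speech output:
    print(f"Object at {degree} degrees, {distance_cm} centimeters")
```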

Cheat Sheet Shortcuts
There are several ControlCommand() commands for this robot skill. For a complete list, view the Cheat Sheet of this control. The commands start and stop the lidar.

Variables
In the configuration menu of this control, you can view the variables that this skill creates. The variables hold the most recent values of minimum/maximum distance, degrees, and running status of the lidar.
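As a rough illustration of what such summary variables hold, here is how min/max distance and the degree each was seen at could be derived from one scan. The dictionary keys here are hypothetical, not the skill's actual variable names.

```python
def summarize_scan(scan):
    """Derive summary values from a 360-entry scan (cm per degree, None =
    no return). Key names are illustrative, not ARC's variable names."""
    hits = [(d, deg) for deg, d in enumerate(scan) if d is not None]
    if not hits:
        return {"Running": True, "MinDistance": None, "MaxDistance": None}
    min_d, min_deg = min(hits)
    max_d, max_deg = max(hits)
    return {
        "Running": True,
        "MinDistance": min_d, "MinDegree": min_deg,
        "MaxDistance": max_d, "MaxDegree": max_deg,
    }

scan = [None] * 360
scan[30], scan[200] = 55, 480
print(summarize_scan(scan))
```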

I press Start, and after only a couple of seconds it returns to Start with that message. I'd appreciate any suggestions. Also, any video showing how this works together with the new Navigator would be great.
Pablo
The Better Navigator is probably a smarter choice for this device. You can read about it on its manual page and the NMS page. There are about 5-6 videos related to the NMS that you can watch. It also works the same as the Hitachi lidar, so those videos and hack videos apply to this one as well.
Pablo
This one is The Better Navigator and has more innovative path planning - and fuses multiple telemetry sensors to get a better pose estimation. It's still in development, and I'm working on it tonight while I type this. There are a few tweaks I'm finishing. I think it'll be awesome.
Actually, you can add all three sensors (realsense & lidar) for The Better Navigator to get even more data fusing lol.
I can't update the robot skill until I update ARC. The ARC NMS is being updated to a new revision, so all of the NMS robot skills need to be updated simultaneously.
Tomorrow I will test the updated navigator. Impressive work with all the recent updates!
- Fix for the offset value; it wasn't working before
- Fix for the scan preview orientation
- New option to enable a fake pose hint event; read the manual above about this option
I have a question about the lidar: does it need a clear 360-degree view?
What if, for example, the view to the rear of the robot is blocked (servo cam)? Does it still work to the front?
And one more question, if this would be the right one: RPLidar A1M8
Regards
Uwe
In The Better Navigator, you can specify the minimum distance to register. That way, if something blocks the sensor, those values are ignored.
That lidar looks like the correct one.
.... Well, it pays to read first; the answer to my question is already there:
1) Disregard values lower than
Ignore distance values less than this specified distance in CM. The distance values are provided by any NMS Level #3 Group #1 sensor. If wires or a camera block the sensor, this will ignore those values.
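That "disregard values lower than" behaviour amounts to dropping near-field readings before they reach the navigator. A minimal sketch of that filter, assuming a 360-entry scan in cm (illustrative only):

```python
def filter_min_distance(scan, min_cm):
    """Treat any reading closer than min_cm as 'no return' (None), so a
    camera bracket or wiring in the sensor's view is ignored."""
    return [d if d is None or d >= min_cm else None for d in scan]

scan = [None] * 360
scan[180] = 4    # wire 4 cm behind the lidar blocks the rear view
scan[0] = 120    # real obstacle ahead
cleaned = filter_min_distance(scan, min_cm=10)
print(cleaned[180], cleaned[0])  # None 120
```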
There it is again: read everything first ;-)
I'll do it a different way. Sometimes just talking about it brings up different avenues for how to do things. You have to work with what you have available!