Navigation Messaging System
The Navigation Messaging System (NMS) is built into the ARC framework. It allows robot skills to push sensor data (location and obstacle detection) into the messaging system, which mapping/navigation SLAM skills can then subscribe to. The NMS was designed to make navigation transparent to the user in ARC: you do not have to write complicated code because the robot skills handle navigation and obstacle avoidance for you.
NMS Stack
The NMS stack consists of sensors that feed data into the messaging system, which is received by the mapping skills.
Level #1 - Map & Navigation
This level contains the skills that take the sensor data and organize it into maps for navigating. These skills subscribe to the sensor event data and assemble a map. They then let you click on the map to instruct the robot where to navigate (either by way-point or to any destination, depending on your sensor data). A few navigation skills are...
- The Better Navigator (preferred)
- The Navigator
- EZ-SLAM
Level #3 - Sensor Groups
This level contains the sensors that feed data into the messaging system, from which the maps are made. The navigation messaging system collects data from two supported sensor input groups...
- Group #1 - Lidar/depth scanners: any sensor that can detect obstacles (laser, 360-degree lidar, ultrasonic, IR, etc.). The sensor detects objects and marks on the map where the robot cannot go (i.e., walls, sofas, chairs, etc.). Supported skills are...
- Intel Realsense D435i
- Hitachi-LG LDS Lidar
- Ultrasonic Distance
- Kinect xBox 360 Depth Mapping
- RPLidar (A1)
- JavaScript & Python commands for manually reporting obstacle detection
- Group #2 - Localization/telemetry/pose: special sensors that keep track of the robot's position (pose) in the real world (IPS, wheel encoders, Roomba movement panel, Intel RealSense T265, etc.). *Note: only one Group #2 sensor can be active at a time. Supported skills are...
- Intel Realsense T265
- Indoor Positioning System
- Wheel Encoder Counter
- iRobot Roomba Movement Panel
- NMS Faux Odometry
- Camera NMS Pose
- JavaScript & Python commands for reporting positioning manually
What Sensors Do You Need?
Depending on which NMS Level #1 mapping/navigation skill you will be using, the sensor requirements may differ. For example, The Better Navigator mapping skill requires a sensor from each group, L3G1 and L3G2 (Level #3 Group #1 and Level #3 Group #2). Check the mapping robot skill's documentation to see which NMS Level #3 group sensors it requires.
Scripting Support
NMS commands are supported by the JavaScript and Python scripting compilers in ARC. The namespace is Navigation; you can view more details in the Support section for JavaScript or Python. The script commands allow custom code to send Level #3 Group #1/#2 sensor data into the NMS, as well as to pause or stop any ongoing navigation by a Level #1 navigator.
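As a rough sketch of what manually reporting sensor data can look like, the following Python fragment converts one distance reading into an obstacle point and reports a pose. The `Navigation` class below is a stand-in stub so the sketch is self-contained; the real ARC command names and signatures differ and should be taken from the Support section for JavaScript or Python.

```python
import math

# Stand-in stub for ARC's Navigation namespace. Inside ARC, the real
# commands are provided by the scripting compiler; the method names
# used here are illustrative assumptions, not the actual ARC API.
class Navigation:
    @staticmethod
    def report_obstacles(points):
        # Level #3 Group #1: obstacle points (x, y) in cm from the start pose.
        print("obstacles:", points)

    @staticmethod
    def report_pose(x_cm, y_cm, heading_deg):
        # Level #3 Group #2: robot pose in cm/degrees from the start pose.
        print("pose:", x_cm, y_cm, heading_deg)

# Example: turn a single ultrasonic ping (distance at a sensor heading)
# into a cartesian obstacle point, then report the robot's current pose.
distance_cm = 80.0
heading_deg = 30.0
obstacle = (distance_cm * math.cos(math.radians(heading_deg)),
            distance_cm * math.sin(math.radians(heading_deg)))
Navigation.report_obstacles([obstacle])
Navigation.report_pose(0.0, 0.0, heading_deg)
```

The point of the sketch is the data shape: Group #1 reports become obstacle coordinates on the map, while Group #2 reports update the robot's pose relative to the startup origin.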
Cartesian Coordinate System
This robot skill uses cartesian coordinates to reference the robot's starting position. The starting position is always 0,0 and is defined at startup. As the robot navigates, the skill measures the distance from the starting position. The unit of measurement is in CM (centimeters). Read more about the cartesian coordinate system on Wikipedia.
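To illustrate the bookkeeping (which ARC performs internally), a minimal dead-reckoning sketch in Python: starting at (0, 0), each move of a given distance at a given heading updates the cartesian position in centimeters. The axis convention here (0 degrees pointing along +Y) and the numbers are illustrative assumptions.

```python
import math

def advance(x_cm, y_cm, heading_deg, distance_cm):
    """Advance a pose on the cartesian plane; 0 degrees points along +Y."""
    rad = math.radians(heading_deg)
    return (x_cm + distance_cm * math.sin(rad),
            y_cm + distance_cm * math.cos(rad))

# Start at the origin (0, 0), as the NMS defines at startup.
x, y = 0.0, 0.0
x, y = advance(x, y, 0.0, 100.0)   # forward 100 cm -> (0, 100)
x, y = advance(x, y, 90.0, 50.0)   # heading 90 degrees, 50 cm -> (50, 100)
print(round(x), round(y))  # -> 50 100
```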
Further Reading
Many experts have researched and documented navigation, localization, and pose estimation in robotics. While the Intel T265 has demonstrated the most significant success at inside-out localization, many other methods have been attempted over the years. The topic of research into robot navigation is far too vast to be covered in this manual. However, we'll leave you with a few papers that have greatly influenced the robot industry in this field.
- Where Am I? - Systems and Methods for Mobile Robot Positioning by J. Borenstein, H. R. Everett, and L. Feng
- Carleton Winter 2012 Position Estimation presentation
All Navigation Robot Skills
These are navigation robot skills that work with the NMS or assist with controlling the robot's movement.
Have your robot navigate using a GPS to waypoints and paths.
Use an overhead camera to track the pose location of a robot for localized navigation.
SLAM for the ARC NMS
NMS Driver for the Hitachi-LG LDS 360 degree lidar.
The indoor positioning system is a combination of IR transmitter and IR dual camera receiver. The IR transmitter is mounted on your robot, and the IR dual camera receiver is placed in a corner of the room.
Connect the D435i Intel Realsense depth cameras to the ARC navigation messaging system (NMS) for distance detection and mapping.
Use the Intel RealSense T265 tracking camera for way-point robot navigation.
Use the Microsoft Xbox 360 Kinect to send navigation/obstacle information into the ARC NMS (Navigation Messaging System).
An example project with source code that demonstrates how to create an NMS Level #1 navigation robot skill.
Generate estimated pose odometry based on the travel distance of the movement manager.
Pineapple Technology Industries plugin for the Capstone project
NMS Lidar driver for Slamtec RPLidar A1
Improved version of The Navigator based on Hector SLAM, with more features and path finding.
A navigation skill for path planning based on receiving external location and obstacle data
Wavefront path planning Navigation (shortest path to destination)
Compatible with firmware that counts wheel rotations for the left and right wheels using wheel encoders.
Run the Neato Robotics XV-11 LIDAR on your robot.