Navigation Messaging System

The Navigation Messaging System (NMS) is built into the ARC framework. It allows skills to push data (location and obstacle detection) into the messaging system, and mapping/navigation SLAM skills can then subscribe to those navigation events. The NMS was designed to make navigation transparent to the user in ARC: you do not have to write any complicated code, because the robot skills do the work of navigation and obstacle avoidance.

NMS Stack

The NMS stack consists of sensors that feed data into the messaging system and mapping skills that receive that data.

Level #1 - Map & Navigation

These skills view the sensor data and organize it into maps for navigating. There are a few skills which do this. They subscribe to the sensor event data and assemble a map. Then, they provide you the ability to click on the map and instruct the robot where to navigate (either to a way-point or to any destination, depending on your sensor data). A few navigation skills are...

Level #3 - Sensor Groups

The skills you require are sensors that feed the messaging system with the data that maps are made from. The navigation messaging system collects data from two supported sensor input groups...

  1. Lidar/obstacle scanners (optional): any kind of sensor that can detect obstacles (laser, 360-degree lidar, ultrasonic, IR, etc.). The sensor will detect objects and mark on the map where the robot cannot go (e.g. walls, sofas, chairs). Supported skills are...
  2. Localization navigation: special sensors that keep track of the robot's position (pose) in the real world (IPS, wheel encoders, Roomba Movement Panel, Intel RealSense T265, etc.). *Note: only one Group #2 sensor can be active at a time.
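To make the two groups concrete, here is a minimal sketch of the kind of data each sensor group pushes into the messaging system. The class and field names below are hypothetical illustrations for this manual, not part of the ARC API:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical types (not the ARC API) modeling the two NMS sensor groups.

@dataclass
class ObstacleScan:
    """Group #1: lidar/ultrasonic/IR obstacle detection."""
    degrees: List[float]       # angle of each reading, relative to robot heading
    distances_cm: List[float]  # distance to the detected obstacle at that angle

@dataclass
class PoseUpdate:
    """Group #2: localization (only one such sensor active at a time)."""
    x_cm: float        # position relative to the 0,0 starting position
    y_cm: float
    heading_deg: float  # robot orientation

# A Group #1 skill might push scans like this; a Group #2 skill pushes poses.
scan = ObstacleScan(degrees=[0.0, 90.0], distances_cm=[120.0, 45.0])
pose = PoseUpdate(x_cm=30.0, y_cm=10.0, heading_deg=90.0)
```

A Level #1 navigator combines both streams: pose updates place the robot on the map, while obstacle scans mark where the robot cannot go.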

What Sensors Do You Need?

Generally, a Level #1 navigator will require a Level #3 Group #2 sensor for navigating because that is the sensor that knows where the robot is. Sensor Group #1 is optional because it is used to create a no-go map of places the robot cannot go. However, your robot can still navigate pre-recorded paths with only a Group #2 sensor. It will be up to you to decide how important autonomous navigation is versus pre-recorded paths.

For Pre-recorded Paths

This option uses only sensor Group #2. The robot can only drive pre-recorded paths to pre-defined way-points within the map.

For Autonomous Navigation

This option uses both sensor Groups #1 & #2. The robot can drive any path to get to any destination within the map.

Scripting Support

There are NMS commands supported by the JavaScript and Python scripting compilers in ARC. The namespace is Navigation, and more details can be viewed in the Support section for JavaScript or Python. The script commands allow sending custom Level #3 Group #1/#2 sensor data into the NMS, as well as pausing or stopping any ongoing navigation by a Level #1 navigator.
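As an illustration of that scripting flow only, here is a small Python sketch. Every name below is a stand-in invented for this example; the real command names belong to ARC's Navigation namespace and should be looked up in the Support section:

```python
# Hypothetical simulation of the NMS scripting capabilities described above.
# This is NOT the ARC Navigation API; it only illustrates the flow of
# pushing sensor data in and pausing an ongoing Level #1 navigation.

class NavigationSim:
    """Tiny stand-in for the NMS messaging system."""

    def __init__(self):
        self.paused = False
        self.events = []

    def push_pose(self, x_cm, y_cm, heading_deg):
        # Group #2: feed a localization update into the messaging system
        self.events.append(("pose", x_cm, y_cm, heading_deg))

    def push_scan(self, degrees, distances_cm):
        # Group #1: feed an obstacle scan into the messaging system
        self.events.append(("scan", degrees, distances_cm))

    def pause(self):
        # Pause any ongoing navigation by a Level #1 navigator
        self.paused = True

nav = NavigationSim()
nav.push_pose(0.0, 0.0, 0.0)        # robot starts at the origin
nav.push_scan([0.0], [80.0])        # obstacle 80 cm dead ahead
nav.pause()                         # halt navigation while we intervene
```

The point of the sketch is the division of labor: scripts can inject sensor data or control navigation, while the Level #1 navigator subscribed to the NMS does the actual map-building and path-planning.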

Cartesian Coordinate System

This robot skill uses Cartesian coordinates to reference the robot's position relative to its starting position. The starting position is always 0,0 and is defined at startup. As the robot navigates, the skill measures the distance from the starting position. The unit of measurement is centimeters (cm). Read more about the Cartesian coordinate system on Wikipedia.
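Because positions are plain Cartesian x/y offsets in centimeters from the 0,0 start, the robot's straight-line distance from its starting position is a simple calculation. A minimal example:

```python
import math

# The robot starts at (0, 0); all positions are measured in cm from there.

def distance_from_start(x_cm: float, y_cm: float) -> float:
    """Straight-line distance from the 0,0 starting position, in cm."""
    return math.hypot(x_cm, y_cm)

# A robot at (30, 40) cm is 50 cm from where it started.
print(distance_from_start(30.0, 40.0))  # → 50.0
```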

Further Reading

Navigation, localization, and known-pose positioning of robots have been researched and documented by many experts. While the Intel T265 has demonstrated the greatest success at inside-out localization, many other methods have been attempted over the years. The topic of research into robot navigation is far too vast to be covered in this manual. However, we'll leave you with a few papers that have greatly influenced the robot industry in this field.