Navigation Messaging System

The Navigation Messaging System (NMS) is built into the ARC framework. It allows robot skills to push data (location and obstacle detection) into the messaging system, and mapping/navigation SLAM skills can then subscribe to those navigation events. The NMS was designed to make navigation transparent to the ARC user: you do not have to write any complicated code, because the robot skills handle navigation and obstacle avoidance for you.



NMS Stack

The NMS stack consists of sensors that feed data into the messaging system, which mapping skills then receive.


Level #1 - Map & Navigation

This level is where the sensor data is viewed and organized into maps for navigating. A few skills do this: they subscribe to the sensor event data and assemble a map, then let you click on the map to instruct the robot where to navigate (either a way-point or any destination, depending on your sensor data). A few navigation skills are...



Level #3 - Sensor Groups

Maps are built from sensor data fed into the messaging system, so you need sensors that supply that data. The navigation messaging system collects data from two supported sensor input groups (a scripting sketch of both groups follows the list below)...

  1. Lidar/depth scanners: any sensor that can detect obstacles (laser, 360-degree lidar, ultrasonic, IR, etc.). The sensor detects objects and marks on the map where the robot cannot go (i.e., walls, sofas, chairs, etc.). Supported skills are...

  2. Localization telemetry pose navigation: special sensors that keep track of the robot's position (pose) in the real world (IPS, wheel encoders, Roomba Movement Panel, Intel RealSense T265, etc.). *Note: only one Group #2 sensor can be active at a time.
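
For illustration, here is a minimal JavaScript sketch of how a script could feed both sensor groups into the NMS. The function names (Navigation.updateScan, Navigation.updateLocation) and their parameter orders are assumptions based on the Navigation scripting namespace described below; check the JavaScript or Python Support pages for the exact signatures.

    // Hedged sketch: pushing Level #3 sensor data into the NMS from an ARC script.
    // The function names and parameter orders are assumptions; verify them against
    // the Navigation namespace documentation in the Support section.

    // Group #1 - obstacle/scan data from a lidar or depth scanner, expressed as
    // parallel arrays of angles (degrees) and measured distances (cm).
    var degrees   = [0, 90, 180, 270];
    var distances = [120, 85, 300, 45];
    Navigation.updateScan(degrees, distances);

    // Group #2 - localization/pose data: the robot's position in cm relative to the
    // 0,0 start pose, plus its heading (illustrative parameter order). Remember that
    // only one Group #2 sensor may feed the NMS at a time.
    Navigation.updateLocation(25, 140, 90);

In practice, the robot skills listed under each group push this data for you; the script interface is only needed if you are writing your own sensor driver.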


What Sensors Do You Need?

The sensor requirements differ depending on which NMS Level #1 mapping/navigation skill you will be using. For example, the mapping skill The Better Navigator requires a sensor from each group, L3G1 and L3G2 (Level #3 Group #1 and Level #3 Group #2). Check the mapping robot skill's documentation to see which NMS Level #3 group sensors it requires.


Scripting Support

NMS commands are supported by the JavaScript and Python scripting compilers in ARC. The namespace is Navigation; you can view more details in the Support section for JavaScript or Python. The script commands allow sending custom Level #3 Group #1/#2 sensor data into the NMS, as well as pausing and stopping any ongoing navigation by a Level #1 navigator.
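
As a rough sketch of what that looks like in an ARC JavaScript script (the command names here are assumptions drawn from the Navigation namespace; confirm the exact names and signatures in the Support section):

    // Hedged sketch: stopping an ongoing Level #1 navigation from a script.
    // Command names are assumptions; see the JavaScript/Python Support pages.
    if (Navigation.isNavigating()) {
      // Abort the current route, e.g. after a higher-priority task takes over.
      // A similar command exists to pause/resume rather than abort entirely.
      Navigation.stopNavigating();
    }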


Cartesian Coordinate System

This robot skill uses cartesian coordinates to reference the robot's position relative to its starting position. The starting position is always 0,0 and is defined at startup. As the robot navigates, the skill measures the distance from the starting position. The unit of measurement is centimeters (cm). Read more about the cartesian coordinate system on Wikipedia.
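
A short arithmetic example of the convention, using plain JavaScript (the coordinate values are only illustrative):

    // The robot starts at (0, 0). All positions reported through the NMS are in
    // centimeters relative to that start pose. If the robot later reports a pose
    // of x = 120 cm, y = -45 cm, its straight-line distance from the start is:
    var x = 120;
    var y = -45;
    var distanceCm = Math.sqrt(x * x + y * y);   // ≈ 128.2 cm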

Further Reading

Navigation, localization, and pose estimation have been researched and documented extensively by robotics experts. While the Intel T265 has demonstrated the most significant success at inside-out localization, many other methods have been attempted over the years. The research into robot navigation is far too vast to be covered in this manual; however, we'll leave you with a few papers that have greatly influenced the robot industry in this field.


All Navigation Robot Skills

These are navigation robot skills that work with the NMS or assist with controlling the robot's movement.

Use an overhead camera to track the pose location of a robot for localized navigation.


SLAM for the ARC NMS


This skill is used with a movement panel. When the movement panel is used (i.e., your robot is moving forward, left, right, etc.), the floor map skill will begin drawing a line in the direction of...


NMS driver for the Hitachi-LG LDS 360-degree lidar.


The indoor positioning system is a combination of an IR transmitter and an IR dual-camera receiver. The IR transmitter is mounted on your robot, and the IR dual-camera receiver is placed in a corner of the room.


Connect the Intel RealSense D435i depth camera to the ARC Navigation Messaging System (NMS) for distance detection and mapping.


Use the Intel RealSense T265 tracking camera for way-point robot navigation.


A joystick can be connected and used with ARC to control movement and servo positions, and to assign scripts to buttons. For XInput joysticks (such as Xbox), use the Joystick (Xinput) skill. By default,...


Joystick control for XInput devices (Xbox controllers, etc.).


Use this control to assign EZ-Script to keyboard shortcuts. Any key can be assigned an EZ-Script command. By default, the arrow keys control any movement panel you have added to the project....


Use the Microsoft Xbox 360 Kinect to send navigation/obstacle information into the ARC NMS (Navigation Messaging System).


A joystick that controls any movement panel


An example project with source code that demonstrates how to create an NMS Level #1 navigation robot skill.


Generate estimated pose odometry based on the travel distance of the movement manager.


Pineapple Technology Industries plugin for the Capstone project


NMS lidar driver for the Slamtec RPLidar A1.


Improved version of The Navigator based on Hector SLAM, with more features and path finding.


A navigation skill for path planning based on receiving external location and obstacle data


Sketch a path for your robot with your finger (on a tablet) or with the mouse. Use the configuration menu to select the turning and movement speed. These values are specific to the robot and speed....


Wavefront path planning Navigation (shortest path to destination)


Compatible with firmware that counts wheel rotations for the left and right wheels using wheel encoders.


This skill allows you to control servos, movement panels, and scripts with a Wii Remote Controller. Use the controller's Home button to enable/disable the accelerometer servo movement. The directional...


Run the Neato Robotics XV-11 LIDAR on your robot.