Navigation Messaging System
The Navigation Messaging System (NMS) is built into the ARC framework. It allows robot skills to push sensor data (location and obstacle detection) into the messaging system, and mapping/navigation SLAM skills subscribe to those navigation events. The NMS was designed to make navigation transparent to the user in ARC: you do not have to write any complicated code, because the robot skills handle navigation and obstacle avoidance for you.
NMS Stack
The NMS stack consists of three levels: sensors (Level #3) feed data into the messaging system (Level #2), and mapping/navigation skills (Level #1) subscribe to that data.
Level #1 - Map & Navigation
These skills view the sensor data and organize it into maps for navigating. There are a few skills that do this. They subscribe to the sensor event data and assemble a map. Then they allow you to click on the map and instruct the robot where to navigate (either a way-point or any destination, depending on your sensor data). A few navigation skills are...
- The Better Navigator (preferred)
- The Navigator
- EZ-SLAM
Level #3 - Sensor Groups
You need sensors that feed data into the messaging system, from which the maps are made. The navigation messaging system collects data from two supported sensor input groups...
- Group #1 - Lidar/depth scanners: any sensor that can detect obstacles (laser, 360-degree lidar, ultrasonic, IR, etc.). The sensor detects objects and marks on the map where the robot cannot go (i.e., walls, sofas, chairs, etc.). Supported skills are...
- Intel Realsense D435i
- Hitachi-LG LDS Lidar
- Ultrasonic Distance
- Kinect Xbox 360 Depth Mapping
- RPLidar (A1)
- JavaScript & Python commands for manually reporting obstacle detection
- Group #2 - Localization/telemetry/pose: special sensors that keep track of the robot's position (pose) in the real world (IPS, wheel encoders, Roomba movement panel, Intel RealSense T265, etc.). *Note: only one Group #2 sensor can be active at a time. Supported skills are...
- Intel Realsense T265
- Indoor Positioning System
- Wheel Encoder Counter
- iRobot Roomba Movement Panel
- NMS Faux Odometry
- Camera NMS Pose
- JavaScript & Python commands for reporting positioning manually (see the sketch below this list)
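As an illustration of the manual-reporting commands mentioned in each group above, the minimal JavaScript sketch below pushes a pose and an obstacle scan into the NMS. The function names Navigation.updateLocation() and Navigation.updateScan() and their signatures are assumptions for illustration; check the Support section for the exact Navigation namespace commands in your ARC version.

    // Report the robot's pose manually (Group #2 data).
    // Assumed signature: x (cm), y (cm), heading (degrees).
    Navigation.updateLocation(120, -45, 90);

    // Report a manual obstacle scan (Group #1 data).
    // Assumed signature: arrays of angles (degrees) and distances (cm).
    var angles    = [0, 90, 180, 270];
    var distances = [250, 80, 300, 120]; // e.g., a wall 80 cm to the robot's right
    Navigation.updateScan(angles, distances);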
What Sensors Do You Need?
Depending on which NMS Level #1 mapping/navigation skill you will be using, the sensor requirements may differ. For example, The Better Navigator mapping skill requires a sensor from each group, L3G1 and L3G2 (Level #3 Group #1 and Level #3 Group #2). Check the mapping robot skill's documentation so you know which NMS Level #3 group sensors it requires.
Scripting Support
NMS commands are supported by the JavaScript and Python scripting compilers in ARC. The namespace is Navigation; you can view more details in the Support section for JavaScript or Python. The script commands allow sending custom Level #3 Group #1/#2 sensor data into the NMS, as well as pausing and stopping any ongoing navigation by a Level #1 navigator.
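For example, a script can pause or abort an active Level #1 navigator. The two calls below are assumed names for illustration only; verify the exact Navigation namespace commands in the Support section for your ARC version.

    // Pause any ongoing navigation (assumed command name).
    Navigation.setNavigationStatusToPause();

    // Abort navigation entirely and return to idle (assumed command name).
    Navigation.setNavigationStatusToIdle();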
Cartesian Coordinate System
The NMS uses Cartesian coordinates to reference the robot's position relative to its starting position. The starting position is always 0,0 and is defined at startup. As the robot navigates, the skill measures the distance from the starting position. The unit of measurement is CM (centimeters). Read more about the Cartesian coordinate system on Wikipedia.
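For instance, if the robot has driven to pose (120, -45), its straight-line distance from the 0,0 starting point is the usual Euclidean distance. A quick JavaScript check (the variable names are made up for illustration):

    // Distance in cm from the 0,0 starting position to pose (x, y).
    var x = 120;  // cm along the first axis
    var y = -45;  // cm along the second axis
    var distance = Math.sqrt(x * x + y * y); // ≈ 128.2 cm
    print("Distance from start: " + distance + " cm");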
Further Reading
Many experts have researched and documented robot navigation, localization, and known-pose positioning. While the Intel T265 has demonstrated the most significant success at inside-out localization, many other methods have been attempted over the years. The topic of research into robot navigation is far too vast to be covered in this manual. However, we'll leave you with a few papers that have greatly influenced the robot industry in this field.
- Where Am I? - Systems and Methods for Mobile Robot Positioning by J. Borenstein, H. R. Everett, and L. Feng
- Carleton Winter 2012 Position Estimation presentation
All Navigation Robot Skills
These are navigation robot skills that work with the NMS or assist with controlling the robot's movement.
Use the Beitian BN-880 GPS/Compass sensor for autonomous navigation between predefined waypoints. View a map of the surrounding area to specify waypoints in real time. The robot can navigate between the waypoints using the combined GPS and compass of the BN-880 sensor. *Note: This robot skill does not use the ARC NMS (Navigation Messaging System); it is a standalone robot skill that performs its own navigation. BN-880 Features Electrical Characteristics: - Receiving Format: GPS, GLONASS, Galileo,...
Use an overhead camera to track the pose location for localized navigation. The robot must have a glyph attached to its top, and the camera must be overhead. Ideally, having the camera directly overhead looking down works best, although a slight angle will work okay for most cases. Ensure there is proper lighting so the camera can detect the glyph. Anyone with a camera can use this robot skill to experience highly accurate NMS with autonomous navigation. This positioning system will work...
This robot skill does not have navigation capabilities; we recommend using The Better Navigator instead. This was created for testing and as a proof of concept if you want to make your own SLAM skill. This skill uses the ARC NMS (Navigation Messaging System) to display Simultaneous Localization and Mapping (SLAM). A lidar is recommended in conjunction with this skill to create an accurate SLAM map of the robot's environment. Verify the lidar you choose can push scan data into...
NMS driver for the Hitachi-LG LDS 360-degree lidar. This robot skill connects to the lidar and pushes the data into ARC's NMS (Navigation Messaging System) to be used with Level #1 navigation viewers. Read the NMS manual page to understand how this skill can be used. Screenshot Configuration 1) BaudRate The baud rate for communication to the USB serial adapter or Arduino. By default, the baud rate should be 230400 for the lidar. However, some USB serial converters will use different baud...
The indoor positioning system is a combination of an infrared (IR) light transmitter and a dual-camera receiver. The IR transmitter is mounted on your robot, and the dual-camera receiver is placed in a corner of the room. The dual-camera receiver is made up of a camera that captures regular image data and another that only captures infrared light. You can mount the dual camera on a tripod so it can see the room with its 100-degree wide-angle lenses. There are three variables created that return the...
Connect the D435i Intel RealSense camera to the ARC Navigation Messaging System (NMS) for distance detection and mapping. This should also work with other Intel RealSense depth camera models because the FOV and other parameters are read from the device. Main Screen 1) Select the device by serial number. This allows multiple Intel RealSense devices to be added to a project. 2) START/STOP button for connecting to the RealSense device. 3) Log window for status and errors. Image Align The Image Align...
With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 has been designed to give you the tracking performance your robot needs. This user-friendly ARC robot skill provides an easy way to use the T265 for way-point navigation. The T265, combined with this robot skill, provides your robot a SLAM, or Simultaneous Localization and Mapping, solution. It allows your robot to construct a map of an unknown environment while simultaneously keeping track of its own...
Use the Microsoft Xbox 360 Kinect to send navigation/obstacle information into the ARC NMS (Navigation Messaging System). This skill publishes obstacle scan distance data to ARC's NMS to be used in SLAM or The Navigator skills. Because the entire image depth is unnecessary for mapping, only a portion of it is processed. This prevents your robot from picking up distances of things irrelevant to its size and distance. Using the options for the detection area, tune the...
An example project with source code that demonstrates how to create an NMS Level #1 navigation robot skill. In this robot skill, the NMS scan and location (position) data are displayed. Other than demonstrating how to code a robot skill that receives NMS data, this skill is entirely useless. The code for this example robot skill demonstrates how something like The Navigator could be made. Download the source code for this robot skill here: NMS Example Source.zip This robot skill subscribes to an...
For robots that do not have odometry sensor feedback (i.e., wheel encoders, RealSense D435i, etc.), this will generate estimated pose odometry based on the travel distance of the movement manager. Configuration Menu This robot skill calculates the distance the robot is moving based on the speed. Because ARC has dynamic speeds, calibration requires a few measurements of the robot's travel distance over time. This information is entered in the config menu. 1) Speed 255 Distance - This is...
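The underlying estimate is straightforward dead reckoning: given a calibrated travel distance at full speed, the current speed and elapsed time can be scaled to an estimated distance. The sketch below assumes a simple linear speed-to-distance relationship, which is only an approximation of what the skill actually does.

    // Minimal sketch of faux odometry, assuming linear scaling with speed.
    // speed255DistancePerSec: measured cm/sec at full speed (255), from calibration.
    function estimateDistance(currentSpeed, elapsedSec, speed255DistancePerSec) {
      var fractionOfFullSpeed = currentSpeed / 255;
      return fractionOfFullSpeed * speed255DistancePerSec * elapsedSec; // cm traveled
    }

    // e.g., half speed for 4 seconds with a 30 cm/sec full-speed calibration:
    print(estimateDistance(128, 4, 30)); // ≈ 60.2 cm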
Pineapple Technology Industries plugin for their Capstone Project (a secretive lidar unit)
NMS lidar driver for the Slamtec RPLidar A1. This pushes scan data to the ARC NMS (Navigation Messaging System) as a Level #3 Group #1 sensor. This robot skill connects to the lidar and pushes the data into ARC's NMS to be used with Level #1 navigation viewers. Read the NMS manual page to understand how this skill can be used. That might sound more confusing than it needs to be; all it means is that this sensor will detect the distances of objects using a laser in a...
A better navigation skill based on Hector SLAM using ARC's NMS location/positioning and obstacle data. This skill is combined with other skills contributing navigation data to ARC's Navigation Messaging System (NMS). The lidar or depth camera data will create a map of the room(s) as the robot drives. You can then add way-points that are saved with the project. You can have the robot automatically navigate by clicking on a way-point (i.e., kitchen, sofa, or dining room). The robot will figure out a...
A navigation skill for path planning based on receiving NMS location/positioning and obstacle data. This skill is combined with other skills that contribute navigation data to ARC's Navigation Messaging System (NMS). Updated Skill This robot skill has been superseded by a newer version called The Better Navigator. Example This screenshot uses an Intel RealSense T265 with three low-cost HC-SR04 ultrasonic distance sensors. The robot was instructed to drive around the waypoints at various speeds....
Wavefront Navigation path planning (shortest path to destination). Define an array called $MyMap: assign your goal location in the array a value of 1, assign your robot's position a value of 254, assign all walls and obstacles 255, and assign open space 0. Call the Wavefront plugin. The plugin returns an array $MyRoute showing the best route, with G indicating the goal, R indicating the robot, and _ indicating the route. My plan is to use the sonar to check ahead, and if an obstacle...
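The map encoding above is easy to build in a script. This JavaScript sketch only shows how the $MyMap grid might be populated before invoking the plugin; setVar() and getVar() are ARC's standard variable commands, but the exact array layout the plugin expects should be confirmed against its documentation.

    // Build a 5x5 wavefront grid, flattened row by row.
    // 0 = open space, 255 = wall/obstacle, 1 = goal, 254 = robot.
    var map = [
        0,   0,   0,   0,   1,   // goal in the top-right corner
        0, 255, 255, 255,   0,   // a wall the route must go around
        0,   0,   0,   0,   0,
        0, 255, 255, 255,   0,
      254,   0,   0,   0,   0 ]; // robot in the bottom-left corner

    setVar("$MyMap", map);
    // ...invoke the Wavefront plugin, then read the result:
    var route = getVar("$MyRoute"); // G = goal, R = robot, _ = route cells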
This skill is compatible with Arduino UNO firmware that counts wheel rotations for the left and right wheels. Firmware This robot skill requires supporting capability on the EZB with firmware. The firmware is open source, so you can modify it for different Arduinos and microcontrollers. The Wheel Encoder Arduino UNO firmware can be found here: https://synthiam.com/Docs/Hardware/Arduino-Genuino-Uno Main Window 1. Get Wheel Values Button Manually requests the encoder values for each wheel. Please...
Run the Neato Robotics XV-11 lidar on your robot to generate a 2D distance map. The XV-11 is an inexpensive 360-degree lidar sensor. It requires a GetSurreal controller (this was written against a version 1.2 GetSurreal controller). You must download and install the Teensy serial driver. This plugin will generate an array variable called $LIDARDistance. Each element in the array corresponds to the degree angle reported by the XV-11. The value of each element is the distance reported back by the sensor...
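Since $LIDARDistance is indexed by angle, a script can scan it directly. A minimal JavaScript sketch, assuming one element per degree and that getVar() returns the array as described above:

    // Find the closest obstacle and the angle it was seen at.
    var distances = getVar("$LIDARDistance");
    var nearestAngle = -1;
    var nearestDist = Infinity;
    for (var deg = 0; deg < distances.length; deg++) {
      if (distances[deg] > 0 && distances[deg] < nearestDist) { // ignore invalid 0 readings
        nearestDist = distances[deg];
        nearestAngle = deg;
      }
    }
    if (nearestAngle >= 0)
      print("Closest obstacle: " + nearestDist + " at " + nearestAngle + " degrees");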