12. Autonomous Navigation
Based on your skill level, there are beginner and advanced solutions for navigation and object avoidance. This page provides details and links to various navigation solutions that fit your robot's requirements and budget.
This video tutorial shows you how to add waypoint navigation to a robot using a 360-degree lidar. The same result can be achieved with any of ARC's supported lidars on a robot that has a movement panel.
Navigation solutions range from low-cost single sensors to 3D depth cameras and 360-degree lidars. You can decide what type of sensors to add to your robot and choose the appropriate robot skills.
1. Simple Object Avoidance
The simplest and lowest-cost navigation method uses distance sensors to avoid walls and other objects in the robot's path. This can be done with a single sensor or several sensors. Popular sensors for this simple solution are the Ping ultrasonic and IR distance sensors.
Ultrasonic Distance Robot Skills
Infrared Distance Robot Skills
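The avoidance logic behind these skills can be reduced to a single decision per sensor reading. The sketch below is a minimal, hypothetical illustration of that loop; the function names and the 30 cm threshold are assumptions for the example, not part of any ARC skill.

```python
def avoidance_step(distance_cm, threshold_cm=30):
    """Decide a movement command from one forward distance reading.

    A reading below the threshold means an obstacle is too close,
    so the robot backs away and turns; otherwise it keeps driving.
    """
    if distance_cm < threshold_cm:
        return "reverse-and-turn"
    return "forward"

# Example readings, as a Ping-style ultrasonic sensor might report them
for reading in (120, 45, 12):
    print(reading, "cm ->", avoidance_step(reading))
```

In ARC, the equivalent decision is made inside the ultrasonic or IR distance robot skill, which issues movement commands through the project's movement panel.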
2. Autonomous Navigation
ARC contains a messaging system for navigation called the Navigation Messaging System (NMS for short). While there are many navigation sensors, they vary in cost and accuracy.
ARC's most popular SLAM solution for autonomous navigation is The Better Navigator, which can navigate between stored waypoints. The Better Navigator averages the 3D points of a set of waypoints and uses that average to navigate. The type of sensor data supplied to The Better Navigator is essential for it to perform as expected. We recommend combining a 360-degree lidar with The Better Navigator robot skill. See the list below for supported lidars, depth cameras, and many other navigation robot skills.
SLAM Navigation Robot Skills
Here is a list of ARC robot skills for navigation, including autonomous navigation and remote control. In many cases, it may be necessary to combine autonomous navigation with remote control if the robot gets stuck. Another option is to use Exosphere and have the robot request user assistance when it gets stuck.
Use the Beitian BN-880 GPS/compass sensor for autonomous navigation between predefined waypoints. View a map of the surrounding area to specify waypoints in real time. The robot navigates between the waypoints using the combined GPS and compass of the BN-880 sensor. *Note: This robot skill does not use the ARC NMS (Navigation Messaging System); it is a standalone robot skill that performs its own navigation. BN-880 Features Electrical Characteristics: - Receiving Format: GPS, GLONASS, Galileo,...
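Navigating between GPS waypoints reduces to two standard calculations: the distance to the next waypoint and the compass bearing toward it, which is then compared against the BN-880's compass heading. The sketch below shows both using the haversine formula; it is a standalone illustration, not code from the robot skill.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0-360 degrees) from the first fix to the second."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360
```

Steering then amounts to turning until the compass heading matches `bearing_deg` and driving until `haversine_m` falls below an arrival radius.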
Use an overhead camera to track the robot's pose for localized navigation. The robot must have a Glyph attached to its top, and the camera must be mounted overhead. Looking straight down works best, although a slight angle will work acceptably in most cases. Ensure there is proper lighting so the camera can detect the glyph. Anyone with a camera can use this robot skill to experience highly accurate NMS positioning with autonomous navigation. This positioning system will work...
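With the camera looking straight down, the glyph's pixel position maps to floor coordinates by a simple linear scale. The function below is a minimal sketch of that mapping, assuming the camera image spans a known floor area; the real skill handles angled cameras and glyph orientation as well.

```python
def pixel_to_floor(px, py, img_w, img_h, floor_w_cm, floor_h_cm):
    """Map a glyph's pixel position from an overhead camera to floor coordinates.

    Assumes the camera looks straight down and the image exactly spans a
    floor region of floor_w_cm x floor_h_cm.
    """
    x_cm = px / img_w * floor_w_cm
    y_cm = py / img_h * floor_h_cm
    return x_cm, y_cm

# Glyph seen at the image center of a 640x480 frame covering a 400x300 cm floor
print(pixel_to_floor(320, 240, 640, 480, 400, 300))
```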
This robot skill does not have navigation capabilities; we recommend using The Better Navigator instead. It was created for testing and as a proof of concept for anyone who wants to build their own SLAM skill. The skill uses the ARC NMS (Navigation Messaging System) to display Simultaneous Localization and Mapping (SLAM). A lidar is recommended in conjunction with this skill to create an accurate SLAM map of the robot's environment. Verify that the lidar you choose can push scan data into...
NMS driver for the Hitachi-LG LDS 360-degree lidar. This robot skill connects to the lidar and pushes the data into ARC's NMS (Navigation Messaging System) for use with Level 1 navigation viewers. Read the NMS manual page to understand how this skill can be used. Screenshot Configuration 1) Baud Rate: the baud rate for communication with the USB serial adapter or Arduino. By default, the baud rate should be 230400 for the lidar. However, some USB serial converters will use different baud...
The indoor positioning system combines an infrared (IR) light transmitter with a dual-camera receiver. The IR transmitter is mounted on your robot, and the dual-camera receiver is placed in a corner of the room. The receiver consists of one camera that captures regular image data and another that captures only infrared light. You can mount the dual camera on a tripod so it can see the room with its 100-degree wide-angle lenses. There are three variables created that return the...
Connect the Intel RealSense D435i camera to the ARC navigation messaging system (NMS) for distance detection and mapping. This should also work with other Intel RealSense depth camera models because the FOV and other parameters are read from the device. Main Screen 1) Select the device by serial number; this allows multiple Intel RealSense devices to be added to a project. 2) START/STOP button for connecting to the RealSense device. 3) Log window for status and errors. Image Align The Image Align...
With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 is designed to give your robot strong tracking performance. This user-friendly ARC robot skill provides an easy way to use the T265 for waypoint navigation. Combined with this robot skill, the T265 provides your robot with a SLAM (Simultaneous Localization and Mapping) solution, allowing it to construct a map of an unknown environment while simultaneously keeping track of its own...
Use the Microsoft Xbox 360 Kinect to send navigation/obstacle information into the ARC NMS (Navigation Messaging System). This skill publishes obstacle scan distance data to ARC's NMS for use in SLAM or The Navigator skills. Because the entire depth image does not need to be processed for mapping, only a portion is used. This prevents your robot from picking up distances of objects irrelevant to its size and distance. Using the options for the detection area, tune the...
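Processing only a portion of the depth image means keeping a horizontal band of rows at the robot's height and collapsing each column to its closest distance, which yields a 1-D scan similar to a lidar sweep. The sketch below illustrates that reduction on a plain list-of-lists depth frame; the band position and the column-minimum rule are assumptions for the example, not the skill's exact method.

```python
def depth_scan_slice(depth_frame, band_top, band_height):
    """Reduce a 2-D depth frame (rows of millimeter values) to a 1-D scan.

    Keeps only the rows [band_top, band_top + band_height) and takes the
    minimum of each column, i.e. the nearest obstacle at each horizontal angle.
    """
    band = depth_frame[band_top:band_top + band_height]
    return [min(col) for col in zip(*band)]

# Tiny 3x2 depth frame: keep the top two rows, ignore the floor row
frame = [[100, 200],
         [50, 300],
         [400, 400]]
print(depth_scan_slice(frame, 0, 2))
```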
An example project with source code demonstrating how to create an NMS Level #1 navigation robot skill. In this robot skill, the NMS scan and location (position) data are displayed. Other than demonstrating how to code a robot skill that receives NMS data, it has no practical use. The code for this example shows how something like The Navigator could be made. Download the source code for this robot skill here: NMS Example Source.zip This robot skill subscribes to an...
For robots that do not have odometry sensor feedback (i.e., wheel encoders, RealSense D435i, etc.), this skill generates estimated pose odometry based on the travel distance reported by the movement manager. Configuration Menu This robot skill calculates the distance the robot is moving based on its speed. Because ARC has dynamic speeds, calibration requires a few measurements of the robot's travel distance over time. This information is entered in the config menu. 1) Speed 255 Distance - This is...
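The calibration described above can be reduced to a simple proportion: measure how far the robot travels per second at full speed (255), then scale linearly for other speed values. The function below is a minimal sketch of that estimate under a linear-scaling assumption; the actual skill interpolates from the measurements entered in its config menu.

```python
def estimated_distance_cm(speed, seconds, cm_per_sec_at_255):
    """Estimate travel distance from a dynamic speed value.

    Assumes distance scales linearly with the 0-255 speed value, using a
    single calibration measurement taken at full speed (255).
    """
    return speed / 255.0 * cm_per_sec_at_255 * seconds

# Calibrated at 30 cm/s full speed: estimate 2 seconds at half speed
print(estimated_distance_cm(127.5, 2.0, 30.0))
```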
Pineapple Technology Industries plugin Capstone Project (secretive lidar unit)
NMS lidar driver for the Slamtec RPLidar A1. This robot skill connects to the lidar and pushes scan data into ARC's NMS (Navigation Messaging System) as a Level #3 Group #1 sensor, for use with Level 1 navigation viewers. Read the NMS manual page to understand how this skill can be used. That may sound more complicated than it is: all it means is that this sensor detects the distances of objects using a laser in a...
A better navigation skill based on Hector SLAM, using ARC's NMS location/positioning and obstacle data. This skill is combined with other skills that contribute navigation data to ARC's navigation messaging system (NMS). The lidar or depth-camera data creates a map of the room(s) as the robot drives. You can then add waypoints that are saved with the project and have the robot navigate automatically by clicking on a waypoint (i.e., kitchen, sofa, or dining room). The robot will figure out a...
A navigation skill for path planning based on receiving NMS location/positioning and obstacle data. This skill is combined with other skills that contribute navigation data to ARC's navigation messaging system (NMS). Updated Skill This robot skill has been superseded by a newer version called The Better Navigator. Example This screenshot uses an Intel RealSense T265 with three low-cost HC-SR04 ultrasonic distance sensors. The robot was instructed to drive around the waypoints at various speeds....
Wavefront navigation path planning (shortest path to destination). Define an array called $MyMap: assign your goal location in the array a value of 1, assign your robot's position a value of 254, assign all walls and obstacles a value of 255, and assign open space a value of 0. Call the wavefront plugin. The plugin returns an array, $MyRoute, showing the best route, with G indicating the goal, R indicating the robot, and _ indicating the route. The plan is then to use the sonar to check ahead, and if an obstacle...
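The wavefront algorithm itself is short: flood-fill distances outward from the goal with a breadth-first search, then walk downhill from the robot, always stepping to a neighbor with a smaller wave value. The sketch below implements that idea on a grid using the cell codes described above (goal 1, robot 254, walls 255, open 0); it returns the route as cell coordinates rather than the plugin's G/R/_ rendering.

```python
from collections import deque

GOAL, ROBOT, WALL, OPEN = 1, 254, 255, 0

def wavefront_route(grid):
    """BFS wavefront from the goal, then greedy descent from the robot.

    Returns the list of (row, col) cells on a shortest route, robot first.
    """
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    goal = robot = None
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == GOAL:
                goal = (r, c)
            elif grid[r][c] == ROBOT:
                robot = (r, c)
    # Flood-fill wave values outward from the goal, skipping walls
    q = deque([goal])
    dist[goal[0]][goal[1]] = 0
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and \
                    dist[nr][nc] is None and grid[nr][nc] != WALL:
                dist[nr][nc] = dist[r][c] + 1
                q.append((nr, nc))
    # Greedy descent: always step to a reachable neighbor with a smaller value
    route, (r, c) = [robot], robot
    while (r, c) != goal:
        r, c = min(((r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols
                    and dist[r + dr][c + dc] is not None),
                   key=lambda p: dist[p[0]][p[1]])
        route.append((r, c))
    return route

# Robot top-left, goal bottom-right, one wall in the middle
grid = [[ROBOT, OPEN, OPEN],
        [OPEN,  WALL, OPEN],
        [OPEN,  OPEN, GOAL]]
print(wavefront_route(grid))
```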
This skill is compatible with Arduino UNO firmware that counts wheel rotations for the left and right wheels. Firmware This robot skill requires supporting firmware capability on the EZB. The firmware is open source, so you can modify it for different Arduinos and microcontrollers. The Wheel Encoder Arduino UNO firmware can be found here: https://synthiam.com/Firmware/Wheel-Encoder-Firmware-NVXCXGPCQMJ Main Window 1. Get Wheel Values Button Manually requests the encoder values for...
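Converting encoder tick counts into travel distance is a fixed calculation from the wheel geometry. The sketch below shows the standard conversion, with a straight-line travel estimate as the average of the two wheels; the tick counts and wheel dimensions are example inputs, not values from the firmware.

```python
import math

def wheel_travel_cm(ticks, ticks_per_rev, wheel_diameter_cm):
    """Distance one wheel has traveled, from its encoder tick count."""
    return ticks / ticks_per_rev * math.pi * wheel_diameter_cm

def robot_travel_cm(left_ticks, right_ticks, ticks_per_rev, wheel_diameter_cm):
    """Straight-line travel estimate: the average of the two wheel distances."""
    return (wheel_travel_cm(left_ticks, ticks_per_rev, wheel_diameter_cm) +
            wheel_travel_cm(right_ticks, ticks_per_rev, wheel_diameter_cm)) / 2

# 20 ticks per revolution on a 10 cm wheel: one full turn per wheel
print(robot_travel_cm(20, 20, 20, 10))
```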
Run the Neato Robotics XV-11 lidar on your robot to generate a 2D distance map. The XV-11 is an inexpensive 360-degree lidar sensor. It requires a GetSurreal controller; I have a version 1.2 GetSurreal controller. You must download and install the Teensy serial driver. This plugin generates an array variable called $LIDARDistance. Each element in the array corresponds to the degree angle reported by the XV-11, and the value of each element is the distance reported back by the sensor...
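A common use of a degree-indexed array like $LIDARDistance is finding the closest obstacle and its bearing. The sketch below shows that lookup on a plain Python list standing in for the array; the zero-filtering rule is an assumption, since lidars often report 0 for invalid readings.

```python
def nearest_obstacle(distances, min_valid=1):
    """Return (angle_deg, distance) of the closest valid reading.

    distances is a 360-element sequence where the index is the degree angle
    and the value is the measured distance; readings below min_valid are
    treated as invalid (e.g. a 0 for a missed return) and skipped.
    """
    best_dist, best_angle = min((d, a) for a, d in enumerate(distances)
                                if d >= min_valid)
    return best_angle, best_dist

# Open room except for one object at 90 degrees, 42 units away
scan = [500] * 360
scan[90] = 42
scan[180] = 0  # invalid reading, ignored
print(nearest_obstacle(scan))
```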