The Navigator

A navigation skill for path planning based on receiving external location and obstacle data

How To Add This Control To Your Project
  1. Make sure you have the latest version of ARC installed.
  2. Select the Get button on this page to download the archive file.
  3. Double-click the downloaded archive file to run the installer.
  4. The installer will add this control to ARC.
  5. Load ARC and press the Project -> Add Control button in the menu.
  6. Choose the Navigation category tab.
  7. Press The Navigator icon to add the control to your project.

Manual

A navigation skill for path planning based on receiving NMS location/positioning and obstacle data. This skill is used in combination with other skills that contribute navigation data to ARC's Navigation Messaging System (NMS).

*Note: this skill requires ARC 2020.12.25.00 or greater

Example
This screenshot is using an Intel RealSense T265 with 3 low-cost HC-SR04 ultrasonic distance sensors. The robot was instructed to drive around the waypoints at various speeds. Ideally, you would want to use this robot skill with a 360-degree lidar or other high-resolution obstacle detection.
User-inserted image



Here is a video of The Navigator working in conjunction with the Intel RealSense T265 tracking camera skill, which operates at NMS Level #3 Group #2 by providing position/location data to the NMS.



ARC Navigation Messaging System
This skill is part of ARC's Navigation Messaging System (NMS). We encourage you to read more about the NMS and learn about compatible skills. The Navigator operates at Level #1 of the NMS overview and requires a Level #3 Group #2 location/position sensor for operation. The location/positioning sensor feeds position data into the NMS, which this skill uses for navigation. See the NMS documentation for compatible skills that provide location/position data.
User-inserted image



Mapping
While your robot is driving around and navigating, this skill will log the trajectory. By manually driving your robot to various locations (waypoints), you define the waypoint and path points. Once multiple path points are defined for a waypoint, you can instruct your robot to autonomously navigate to that exact waypoint (or back again) at any time.
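The waypoint/path-point relationship described above can be pictured as a simple data structure. This is only an illustrative sketch in Python; the names are assumptions for explanation, not ARC's internal API:

```python
from dataclasses import dataclass, field

@dataclass
class PathPoint:
    x: float  # cm from the starting position (0, 0)
    y: float

@dataclass
class Waypoint:
    name: str
    path_points: list = field(default_factory=list)

# A waypoint is a named, ordered list of path points, typically
# recorded while manually driving the robot around the environment.
kitchen = Waypoint("kitchen")
kitchen.path_points.append(PathPoint(0, 0))
kitchen.path_points.append(PathPoint(120, 40))
kitchen.path_points.append(PathPoint(260, 40))
```

Once a waypoint like this exists, navigating to it means driving the path points in order (or in reverse, as described below).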

Map Size
The map is currently hardcoded for 20x20 meters.


Main Screen
User-inserted image

1) Map control buttons for clearing trajectory and clearing the map.

2) Current cartesian coordinates of the robot as reported by an NMS Level #3 Group #2 sensor (i.e. Intel T265, wheel encoders).

3) Saved waypoints. Here you can add, remove and select waypoints.

4) The path points within a waypoint. A waypoint will consist of many path points for navigating throughout the environment. You may right-click on path points to edit the coordinate for fine-tuning. You may also re-order the path points by right-clicking and selecting Move Up or Move Down.

5) Current heading of the robot relative to the cartesian starting position as reported by an NMS Level #3 Group #2 sensor.

6) The yellow dot marks the robot's current cartesian position as reported by an NMS Level #3 Group #2 position/location sensor.

7) Path points are connected with a straight line that shows where the robot would drive. To add path points, right-click on the map view and select Add Path Point. It is best to first drive the robot manually, which creates a trajectory, and then right-click on points along that trajectory to add them as path points to the selected waypoint.

8) Log messages are displayed about navigation and sensor activity.

Main Screen - Navigation Controls
User-inserted image

There are three buttons for manually starting navigation. You may also start navigating by using ControlCommands from other skills. The three buttons each have different behavior.

Start (Forward Order) - This instructs the robot to follow the path points within the waypoint from first to last, in that order. If your robot is close to the first path point, it will conveniently head right there. However, if your robot is far away from the first path point, you may wish to use the Reverse or Auto button.

Start (Reverse Order) - This instructs the robot to follow the path points within the waypoint in reverse order, from last to first. If your robot is closest to the last path point, this is a convenient way to get it back to the first path point in the waypoint.

Start (Auto) - This automatically decides which order to follow the path points based on the robot's distance from the first and last path points. This is the most convenient option because it allows a waypoint to be navigated both to and from. For example, if your robot drives to a location, use this button to also get it back to the starting location. The direction is chosen automatically.
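The Auto button's decision reduces to comparing the robot's distance to the first and last path points. A minimal sketch of that logic in plain Python (an illustration under stated assumptions, not ARC's implementation):

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y) points in cm."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def auto_order(robot_pos, path_points):
    """Return the path points in the order Auto mode would drive them:
    forward if the robot is nearer the first point, reversed if nearer the last."""
    if distance(robot_pos, path_points[0]) <= distance(robot_pos, path_points[-1]):
        return list(path_points)
    return list(reversed(path_points))

# Robot sitting near the end of the recorded path -> drive it in reverse.
points = [(0, 0), (100, 0), (200, 50)]
print(auto_order((210, 60), points))  # -> [(200, 50), (100, 0), (0, 0)]
```

This is why one recorded waypoint is enough to travel both directions: the same list is simply traversed from whichever end the robot is closest to.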


Config - Scripts
User-inserted image

1) Script that will execute when the navigation to a waypoint is started. Navigation can start by either manually pressing the Start button or using a ControlCommand().

2) Script that will execute when the navigation is canceled or successfully ended.

3) Script that will execute when the navigation is paused, either by a JavaScript/Python command from the Navigation namespace or when an NMS Level #3 Group #1 distance sensor returns a value less than the specified range. The range is configured in the Settings tab.

Config - Variables
User-inserted image

There are many global variables that are set by The Navigator. There is a question mark next to each variable that explains it in greater detail. The variable contents can be viewed using the Variable Watcher skill found in the Scripts category.

Config - Settings
User-inserted image

1) Ignore distance values that are less than this specified distance in CM. The distance values are provided by any NMS Level #3 Group #1 sensor.

2) Ignore distance values that are further than this specified distance in CM. The distance values are provided by any NMS Level #3 Group #1 sensor.

3) Filter false positives by removing obstacles that are detected and then later not detected within the same area.

4) If the NMS distance sensor provides a value that is greater than the "ignore lower than" distance but lower than this value, any navigation will be paused. This also executes the PAUSE script from the Scripts tab. Your program may use this opportunity to navigate around the obstacle and then continue navigating. To continue navigating, use the JavaScript or Python command in the Navigation namespace: Navigation.setNavigationStatusToNavigating();
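Taken together, settings 1, 2, and 4 act as a gate on each incoming NMS distance reading. Here is a hedged sketch of that gating logic in plain Python (the function and threshold names are illustrative assumptions, not ARC's code):

```python
def classify_distance(value_cm, ignore_below, ignore_above, pause_below):
    """Classify one NMS Level #3 Group #1 distance reading (all values in cm).
    Readings outside [ignore_below, ignore_above] are discarded as noise;
    readings closer than pause_below (but not discarded) pause navigation."""
    if value_cm < ignore_below or value_cm > ignore_above:
        return "ignore"   # out of the trusted sensor range
    if value_cm < pause_below:
        return "pause"    # obstacle too close: the Pause script would run
    return "use"          # valid obstacle reading for the map

print(classify_distance(2, 5, 400, 30))    # -> ignore (below minimum)
print(classify_distance(20, 5, 400, 30))   # -> pause  (within pause range)
print(classify_distance(150, 5, 400, 30))  # -> use
```

In ARC, the "pause" case is where your Pause script runs and where Navigation.setNavigationStatusToNavigating() resumes the robot once your script has dealt with the obstacle.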

Starting Position
This navigation skill uses cartesian coordinates in CM from the starting position (0, 0). This means that any saved maps will be referenced from the same starting position and heading angle. When you re-load a project to have the robot navigate the same course, the robot must be positioned in the same starting position and heading angle. We recommend using painter/masking tape as the starting reference point for the robot. If your robot has an auto dock for charging, secure the charger to a fixed position on the floor and that can be used as a reference point. 

User-inserted image


In the photo above, we're using an iRobot Roomba as the robot with an Intel T265 positioning sensor. The painter's tape on the floor is marking the starting position of the robot. The outline allows us to position the robot into the square, and the marking on the front of the robot aligns with the specified heading.

Cartesian Coordinate System
This robot skill uses cartesian coordinates to reference the robot's position from the starting position. The starting position is always 0,0 and is defined at startup. As the robot navigates, the skill measures the distance from the starting position. The unit of measurement is in CM (centimeters). Read more about the cartesian coordinate system on Wikipedia.
User-inserted image
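Because positions are plain cartesian coordinates in cm from (0, 0), the straight-line distance and bearing from the start follow directly from the Pythagorean theorem and atan2. A small generic-math sketch (the 0-degrees-along-+y convention is an assumption for illustration, not ARC's documented convention):

```python
import math

def distance_cm(x, y):
    """Straight-line distance from the starting position (0, 0), in cm."""
    return math.hypot(x, y)

def heading_deg(x, y):
    """Bearing from the start to (x, y) in degrees, assuming 0 = +y ("forward")."""
    return math.degrees(math.atan2(x, y)) % 360

print(distance_cm(300, 400))  # -> 500.0
print(heading_deg(100, 100))  # -> 45.0
```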


How Do You Use This
This skill is a navigation renderer and movement planner. This means it can build a map from participating sensor data and control the robot by navigating it to specified locations. To get sensor data, compatible skills must be loaded. In this quick example, we'll simply use the Intel RealSense T265 in combination with this skill...
1) Connect your Intel RealSense T265 to the computer's USB port

2) Load ARC (version must be >= 2020.12.25.00)

3) Add the Intel RealSense skill to your workspace

4) Now add this skill (The Navigator) to your workspace

5) Press START on the Intel RealSense skill

6) You should now begin to see localization path data from the Intel RealSense sensor being displayed in The Navigator window. This robot skill will be displaying and rendering the data.

*Optionally, you may now add a supporting NMS Level #3 Group #1 sensor for detecting distance of obstacles and environment.

To Do
This skill is under development and a number of items are left to be completed...
- Create custom paths based on pre-programmed paths when navigating between waypoints
- Automatically navigate around obstacles (obstacle avoidance)
- Load and save maps

PRO
Portugal
#34  
Thank you DJ, will give it a go tomorrow.
PRO
Synthiam
#35  
Added ability to adjust the speed for navigating in the config menu. Please read the question marks next to the options to learn more.
PRO
Portugal
#36  
Bug> When I change the Forward Speed in The Navigator, the path points seem to move down relative to the saved path points.
PRO
Synthiam
#37  
Question> what does that mean?
PRO
Synthiam
#39  
Haha, I still don't understand. Can you write a few more words explaining the detail?
PRO
Portugal
#40  
Sure DJ, 
If I have various path points and then change the Forward Speed in The Navigator, the path points that should be in the green area move down, as seen in the pic.
PRO
Synthiam
#41  
I'm guessing the Intel RealSense T265 doesn't appreciate the speed change. Remember, the coordinates come from the T265, not me.
PRO
Portugal
#42   — Edited
Hmm, odd... I must experiment a bit.
Does it happen to you too, DJ?
PRO
Synthiam
#43   — Edited
Not sure - I haven't tried. It's kind of a pain to set up the RealSense right now because I'm at my cabin in northern Ontario.

The T265 gives me x, y, z pose coordinates and I plot them on the map. That's about it - there's no math or anything on my side. I just take the coordinates and push them into the NMS.

So if everything is slightly off, I'd gather the T265 returns different distances when moving at different speeds?

I'd have to test someday.

But your screenshot does kinda show that the robot isn't on the home position... so not sure.
PRO
Portugal
#44  
This shift occurs as soon as you change the speed and hit save. The position of the camera is correct, but all the waypoints shift.
PRO
Synthiam
#45  
Weird, eh. I guess there are always challenges with pose sensors. So choose a static speed and use that.
PRO
Portugal
#46  
Hey DJ, did you do something in the last update? The nav point issue is OK now.
Also, I need a bigger map, a lot bigger.
PRO
Synthiam
#47   — Edited
I did not do an update to this robot skill
PRO
Canada
#48  
I second the bigger map. I can't even make it from the couch to the beer fridge.
PRO
Synthiam
#49   — Edited
Updated map size to 2000x2000 cm so @Nink can get beer
#50  
Haha! Sounds like a plan! xD
PRO
Portugal
#51  
@DJ
I am using only one ping sensor to do the mapping and The Navigator does not plot the map. Must I use more?
PRO
Portugal
#53  
Sweet! Looking forward to part 4!