
Future Improvement: Floor Map Skill

Hello Athena,

Regarding the Floor Map skill, I understand that its current functionality is to visualize the path as the robot moves, and it is not designed for navigation purposes at this stage.

I was wondering if there might be a possibility of expanding this feature in the future. Specifically, it would be beneficial if the Floor Map skill could record a path and later allow the robot to replay it, following the same route precisely. This capability could be extremely useful for executing simple, repeatable tasks such as navigating indoor routes or performing patrols.

Could this be something considered for future development or possibly included as a new skill within Synthiam ARC?


#1  

Robot skills using the NMS (Navigation Messaging System) have deprecated old ideas like the floor map. For example, NMS Faux Odometry allows a robot that does not have any encoders or lidar to use the NMS: https://synthiam.com/Support/Skills/Navigation/NMS-Faux-Odometry?id=20456

That way, if your robot lacks navigation sensors such as a lidar, you can use The Navigator: https://synthiam.com/Support/Skills/Navigation/The-Navigator?id=20072

You will not be able to use The Better Navigator because there is no lidar.

Note that relying on telemetry from wheels alone will result in an unreliable experience. This setup should be used only for experimentation. I can have @athena explain more about why navigation requires plenty of data, such as from a lidar, for telemetry positioning (pose). Athena, please provide as much information as you can to help this conversation.

PRO
Canada
#2  

Great question - this is exactly the kind of feature request that makes ARC more useful for many real-world robots. I'll explain why Floor Map is currently only a visual tool, what the practical limitations are for turning it into a true replay/navigation tool, and two realistic paths you can take (one simple/experimental, one more robust). I’ll also give a starter JavaScript script skeleton you can use to record and replay a path (with calibration notes) and instructions for turning it into a robot skill.

Why Floor Map is visual-only today

  • Floor Map was designed as a display/diagnostic tool: it draws where the robot has moved using the Movement Panel inputs (pixels/second & degrees/second settings).
  • It does not maintain precise pose telemetry suitable for navigation - it provides neither reliable real-world coordinates nor the control loop needed to actively drive to exact coordinates.
  • Navigation requires accurate pose (x,y,heading) and sensor data (lidar, encoder odometry, IMU, or camera-based visual odometry). Without that the robot’s position drifts quickly and replay will fail in most environments.
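To make the drift problem concrete, here is a small self-contained sketch (plain JavaScript, no ARC APIs, entirely made-up numbers) showing how a constant heading bias from something like wheel slip compounds into position error when dead reckoning from commanded motion alone:

```javascript
// Dead-reckoning drift illustration (hypothetical numbers, no ARC APIs).
// The robot commands a straight line, but a small heading bias per step
// (e.g. wheel slip) makes the true path curve away from the believed path.
function simulateDrift(steps, stepMM, headingBiasDeg) {
  var trueX = 0, trueY = 0;          // where the robot actually ends up
  var believedX = 0, believedY = 0;  // where wheel-only odometry thinks it is
  var trueHeading = 0;               // degrees; accumulates the bias each step
  for (var i = 0; i < steps; i++) {
    trueHeading += headingBiasDeg;   // uncorrected error added every move
    var rad = trueHeading * Math.PI / 180;
    trueX += stepMM * Math.cos(rad);
    trueY += stepMM * Math.sin(rad);
    believedX += stepMM;             // odometry assumes a perfect straight line
  }
  var dx = trueX - believedX, dy = trueY - believedY;
  return Math.sqrt(dx * dx + dy * dy); // accumulated position error in mm
}
```

With no bias the error is zero; with even half a degree of bias per 250 mm step, the error keeps growing the farther the robot travels, which is exactly why a lidar or other absolute reference is needed to correct pose over time.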

Why NMS Faux Odometry exists and how it helps

  • NMS Faux Odometry produces an estimated pose (Cartesian x, y, heading) based on movement calibration (speed distances and pivot diameter). It publishes data into the NMS so Level #1 navigation skills (like The Navigator) can subscribe.
  • That means you already have a path to create a replay feature: use Faux Odometry as your pose source and either:
    • Let a Level #1 Navigator accept recorded waypoints and navigate to them
    • Or build a small skill that records pose samples and replays them with a simple control loop

Recommended approaches

Option A - (recommended for repeatable but not high-precision tasks)

  • Use NMS Faux Odometry to produce pose telemetry and use The Navigator (or a custom NMS-aware skill) to accept saved waypoints.
  • This takes advantage of NMS infrastructure and lets the Navigator perform movement control rather than blindly replaying raw Movement commands.
  • Pros: uses NMS messaging, more modular. Cons: still limited accuracy without lidar/encoders; OK for simple patrols.

Option B - (experimental) Add path-record / replay robot skill

  • Create a skill that:
    1. Records NMS pose samples (x,y,heading) while you drive the robot (store in a file or global array).
    2. Replays them later by stepping through the saved waypoints and commanding turns and forward motions based on the delta to the next waypoint.
  • This will work for short, repeatable routes if you carefully calibrate speeds and turning rates. Expect drift from wheel slip, carpets, and collisions.

How to build it (high-level)

  1. Add a new skill: Project -> Add robot skill (see https://synthiam.com/Support/Create-Robot-Skill/Overview).
  2. Use the skill’s settings (kebab menu) to expose start/stop recording and save/load file options.
  3. Recording: sample the NMS pose at a fixed rate (e.g., every 250 ms) and append to array/file.
    • Use NMS Faux Odometry as your pose source. Many NMS-style skills expose global variables like $NavigationX, $NavigationY, $NavigationHeading. If your Faux Odometry skill uses other names, substitute them.
  4. Replay: load the saved points, then for each waypoint:
    • Compute desired heading (atan2), angle difference to current heading.
    • Rotate in-place until heading matches (using Movement.left/right with calibrated degrees-per-second).
    • Move forward for a calculated duration to cover the distance to the waypoint (using calibrated mm/sec for given speed).
    • Update local "current" pose based on the movement you commanded (or read from Faux Odometry while moving).

Starter JavaScript skeleton

Notes before using:

  • Replace variable names $NavigationX, $NavigationY, $NavigationHeading with whatever your Faux Odometry skill exposes.
  • Calibrate:
    • speed255MMperSec: mm traveled in 1 second when Movement speed=255 (enter from Faux Odometry tests)
    • degreesPerSec: degrees the robot turns in one second at a known rotation command (calibrate via test)
  • This is a skeleton - I can adapt it to your exact variables and control characteristics.
// Path record/replay skeleton for ARC (JavaScript)
// Uses getVar/setVar, Movement.* and File.* APIs.
// Calibrate these values to your robot!

// --- CONFIG (set from your calibrations) ---
var speed255MMperSec = 800;     // mm/sec at speed=255 (use Faux Odometry TEST)
var degreesPerSec = 90;         // degrees per second when issuing a left/right turn command
var recordSampleMs = 250;       // sample NMS pose every 250 ms
var movementSpeed = 180;        // speed value to use when replaying (0-255)

// --- Helpers ---
function nowMs(){ return new Date().getTime(); }

function normalizeAngle(angle) {
  // normalize to -180..180
  angle = ((angle + 180) % 360 + 360) % 360 - 180;
  return angle;
}

function turnDegrees(angleDeg) {
  // angleDeg positive = turn right, negative = turn left
  var dir = angleDeg > 0 ? "right" : "left";
  var durationMs = Math.abs(angleDeg) / degreesPerSec * 1000;
  if (dir == "right") {
    Movement.right(movementSpeed, durationMs);
  } else {
    Movement.left(movementSpeed, durationMs);
  }
  Utility.sleep(durationMs + 150); // small buffer
}

function moveForwardDistance_mm(distanceMM, speed) {
  // speed param between 0..255; approximate duration using linear speed scaling
  var speedScale = speed / 255.0;
  var mmPerSec = speed255MMperSec * speedScale;
  var durationMs = (distanceMM / mmPerSec) * 1000;
  Movement.forward(speed, durationMs);
  Utility.sleep(durationMs + 150);
}

// --- RECORD PATH ---
function recordPath(filename) {
  var arr = []; // array of {t,x,y,h} samples
  var start = nowMs();
  print("Recording to " + filename + " - start driving. Set $__stopRecord to \"true\" to finish.");
  // Expose a stop flag as a global variable so a UI button can end recording
  setVar("$__stopRecord", "false");
  while (getVar("$__stopRecord","false") != "true") {
    // Replace these with your actual NMS pose variable names if different
    var x = parseFloat(getVar("$NavigationX","0"));
    var y = parseFloat(getVar("$NavigationY","0"));
    var h = parseFloat(getVar("$NavigationHeading","0")); // degrees
    var t = nowMs() - start;
    arr.push({t:t,x:x,y:y,h:h});
    Utility.sleep(recordSampleMs);
  }
  // Save to file as JSON lines
  File.delete(filename);
  for (var i=0;i<arr.length;i++) {
    File.appendStringLine(filename, JSON.stringify(arr[i]));
  }
  print("Recorded " + arr.length + " samples to " + filename);
}

// --- REPLAY PATH ---
function replayPath(filename) {
  if (!File.exists(filename)) {
    print("File not found: " + filename);
    return;
  }
  // Load file lines -> array
  var content = File.readAllText(filename);
  var lines = content.split("\n");
  var waypoints = [];
  for (var i=0;i<lines.length;i++) {
    var ln = lines[i].trim();
    if (ln.length == 0) continue;
    waypoints.push(JSON.parse(ln));
  }
  print("Replaying " + waypoints.length + " waypoints from " + filename);

  // Use the first pose as starting reference. We expect Faux Odometry to update pose as we move.
  for (var i=1;i<waypoints.length;i++) {
    // current pose read live from Faux Odometry (replace var names as needed)
    var curX = parseFloat(getVar("$NavigationX","0"));
    var curY = parseFloat(getVar("$NavigationY","0"));
    var curH = parseFloat(getVar("$NavigationHeading","0"));

    var next = waypoints[i];
    var dx = next.x - curX;
    var dy = next.y - curY;
    var targetHeading = Math.atan2(dy, dx) * 180.0 / Math.PI; // degrees
    var angleDiff = normalizeAngle(targetHeading - curH);

    // Rotate
    turnDegrees(angleDiff);

    // Distance in mm (assuming Pose units from Faux Odometry are cm -> convert; adjust as needed)
    var distCM = Math.sqrt(dx*dx + dy*dy);
    var distMM = distCM * 10.0;

    // Move forward
    moveForwardDistance_mm(distMM, movementSpeed);

    // Small pause to let Faux Odometry update
    Utility.sleep(200);
  }
  Movement.stop();
  print("Replay finished.");
}

How to wire this into an ARC Skill

  • Create a new Script robot skill (Project -> Add robot skill -> Script).
  • Put the above JS into the Script editor and create UI buttons in the skill to call recordPath("C:\\path.txt") and replayPath("C:\\path.txt") (note the escaped backslashes required in JavaScript strings).
  • Use the skill settings (kebab menu) to persist calibration values (speed255MMperSec, degreesPerSec, movementSpeed).
  • Add a "Stop Recording" UI button that sets setVar("$__stopRecord","true").

Caveats and limitations

  • Without good sensors (lidar or accurate wheel encoders), pose will drift over time. Use this only for short, repeatable paths or where you can tolerate inaccuracy.
  • Tuning calibration (speed mm/sec, degrees/sec) is critical - test in an open area and capture the values via Faux Odometry TEST buttons (Speed 255 / 128 / 10 Distance).
  • If you have a lidar or depth camera later, switch to The Better Navigator + 360 lidar for robust SLAM-based navigation and waypoint replay.
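Since calibration is the make-or-break step, here is a small self-contained sketch (plain JavaScript, hypothetical measurements) of the arithmetic behind the two constants the skeleton needs. The measurement values are examples only; substitute your own stopwatch/tape-measure results or the Faux Odometry TEST outputs:

```javascript
// Calibration arithmetic sketch (hypothetical measurements, no ARC APIs).
// mm/sec at full speed: drive at speed 255 for a timed test, measure distance.
function mmPerSecAtFullSpeed(measuredDistanceMM, testSeconds) {
  return measuredDistanceMM / testSeconds;
}

// degrees/sec: time one full in-place rotation at the replay speed.
function degreesPerSecFromFullTurn(secondsFor360) {
  return 360 / secondsFor360;
}

// Duration (ms) to travel a distance at a given speed value,
// assuming speed scales linearly with the 0..255 Movement speed setting.
function forwardDurationMs(distanceMM, speed, speed255MMperSec) {
  var mmPerSec = speed255MMperSec * (speed / 255.0);
  return (distanceMM / mmPerSec) * 1000;
}
```

For example, a robot that covers 2400 mm in a 3-second test at speed 255 gives 800 mm/sec, and a 360° spin completed in 4 seconds gives 90 degrees/sec - the placeholder values used in the skeleton's CONFIG section. The linear-scaling assumption is rough; real motors rarely scale perfectly, which is another reason to keep replay speeds close to the speed you calibrated at.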

Next steps (how I can help)

  • Do you already have NMS Faux Odometry configured? If so, tell me the exact names of the pose variables it exposes (or post a screenshot of the Faux Odometry skill UI or its exported variables). I’ll adapt the script to match.
  • Do you want a quick experimental skill (Option B) or recommendations on setting up The Navigator with Faux Odometry and waypoints (Option A)?
  • If you want, I can prepare a packaged skill you can import with UI buttons for Record / Stop / Save / Load / Replay and default calibration settings you can tune.

Would you like me to build the sample skill for your project? If so, post the pose variable names (or confirm $NavigationX, $NavigationY, $NavigationHeading) and your robot’s test calibration numbers (Speed 255 Distance in mm/sec and degrees/sec).


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Germany
#3  

@athena, no, that's OK. It was just a question about the Floor Map skill. As you said, without good sensors, pose will drift over time.

PRO
Canada
#4  

Thanks - totally understood.

Quick recap for anyone reading:

  • Floor Map is intended as a visual/diagnostic tool to show the path drawn from Movement Panel inputs. It’s not designed to provide reliable pose for navigation because pose drifts without good sensors (lidar, encoders, IMU, etc.).
  • If you only wanted to confirm that - you’re correct: without better telemetry, replay/navigation from Floor Map will be unreliable over time.

If you change your mind later and want to explore adding replay/patrol capability, two practical options are: use NMS Faux Odometry with The Navigator to navigate saved waypoints, or build a small record/replay script skill (Options A and B from post #2 above).

Want anything else?

  • I can show how to tune the Floor Map pixel/degree settings so the display matches your robot speed/turn rate.
  • Or I can paste the small record/replay script and steps to add it as a Script skill (no obligation). Which would you prefer?
