
PRO
Canada
#1  

Below is a practical plan, a step-by-step configuration guide, and a ready-to-adapt ARC JavaScript script you can use to have your Roli Rover locate a red Coke can, approach it, and grab it.

High-level approach

  • Teach the camera to recognize the can (either train the object with Train Vision Object By Script or use Color/MultiColor tracking).
  • Configure the Camera Device skill to drive your Movement Panel (Camera Movement tracking) so the robot follows the detected object.
  • Configure/test the gripper/arm servos (set safe min/max positions).
  • Run a script that enables tracking, follows the can, uses a distance/size threshold to stop, then commands the gripper servos to grasp.
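
The four steps above amount to a small state machine (search -> approach -> grasp). As a plain-JavaScript sketch of just the transitions (the function and event names are hypothetical, not an ARC API; the full scripts below implement the same flow with Camera/Movement/Servo commands):

```javascript
// Pickup flow as a tiny state machine. Names are illustrative only -
// this is the control logic, not an ARC API.
function nextState(state, event) {
  if (state === "SEARCH" && event === "objectDetected") return "APPROACH";
  if (state === "APPROACH" && event === "objectLost") return "SEARCH";
  if (state === "APPROACH" && event === "closeEnough") return "GRASP";
  if (state === "GRASP" && event === "graspDone") return "DONE";
  return state; // ignore events that don't apply in the current state
}
```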

Before you start - questions for you

  1. Do you have a physical gripper/arm installed on Roli? If so, which servo ports are the arm and gripper on (e.g. d2 for wrist, d3 for gripper)?
  2. Do you have a distance sensor (ultrasonic / ping) mounted on Roli? If yes, which digital/ADC ports?
  3. Do you prefer using Train Vision Object (object-based) or simple Color/MultiColor tracking for the red can?

Step-by-step setup (non-script)

  1. Add/verify the Camera Device skill in your project (Project -> Add robot skill -> Camera -> Camera Device).
  2. Configure the camera:
    • Select your camera and a low resolution (320x240 or 160x120 recommended for speed).
    • In the Camera settings (three-lines menu -> Settings), set the Tracking type to either:
      • Object Tracking (best when you train the can with Train Vision Object by Script), or
      • Color Tracking / MultiColor (simpler: define a "red" color that matches the can).
  3. If using Train Vision Object By Script:
    • Add that skill and attach it to the Camera (open the Train Vision skill settings, or call the "Attach" ControlCommand from a script).
    • Use the skill’s Train UI or the script to Learn the object (present the can and move it while training).
  4. Configure Movement:
    • Add or configure a Movement Panel robot skill that controls Roli’s motors (H-Bridge, AutoPosition, etc.).
    • In Camera Settings -> Servo/Movement Tracking, enable Movement tracking (and test with CameraMovementTrackEnable). Tune the grid lines and the forward/turn thresholds so Roli centers the can and moves forward.
  5. Configure gripper/arm servos:
    • In the Camera settings, or via Servo.setMinPositionLimit / Servo.setMaxPositionLimit, set safe position limits for the gripper and arm.
    • Manually test opening/closing the gripper with Servo.setPosition(dX, pos).
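
On step 5, a cheap extra software guard is to clamp every requested angle into your measured safe range before it is sent to the servo. A minimal sketch (the 10-170 limits are placeholders; substitute the values from your own manual tests):

```javascript
// Clamp a requested servo angle into a measured safe range.
// GRIPPER_MIN/MAX are placeholders - use your own measured limits.
var GRIPPER_MIN = 10;
var GRIPPER_MAX = 170;

function clampAngle(pos, min, max) {
  if (pos < min) return min;
  if (pos > max) return max;
  return pos;
}

// In ARC you would then call, for example:
// Servo.setPosition(d2, clampAngle(requestedPos, GRIPPER_MIN, GRIPPER_MAX));
```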

Where to configure robot skills (kebab menu)

  • Click the three lines (hamburger menu) in the top-right of a robot skill window, then Settings, to change tracking, movement, and scripts. Use the Script fields for start/stop scripts if desired.

Example ARC JavaScript script (template)

  • Edit the variables at the top (objectName, gripper/arm servo ports and positions, widthThreshold or distanceThreshold).
  • This script:
    • Attaches the Train skill, optionally triggers learning (commented).
    • Ensures camera/object tracking and movement tracking are enabled.
    • Waits until the camera sees the object and then loops while following the object.
    • Stops when the object appears large enough (uses $CameraObjectWidth variables as a sizing heuristic).
    • Runs the gripper servo to grasp.

Copy this into a JavaScript Script robot skill and adapt servo ports & thresholds:

// === CONFIGURE THESE FOR YOUR ROBOT ===
var objectName = "COKE";            // name if you trained via Train Vision Object
var gripperPort = d2;               // change to your gripper servo port (no quotes)
var armPort = d3;                   // optional arm lift port (change to your port)
var gripperOpenPos = 30;            // servo angle to keep gripper open
var gripperClosePos = 110;          // servo angle to close gripper
var armLiftPos = 80;                // lift position after grasp (adjust)
var approachWidthThreshold = 140;   // camera object width pixels threshold for "close enough"
var checkDelay = 200;               // loop sleep ms

// === HELPERS ===
function printStatus(msg) {
  // ARC JavaScript uses print() to log in Script Console
  print(msg);
}

// Attach Train Vision skill (optional - safe to call even if already attached)
ControlCommand("Train Vision Object By Script", "Attach");

// If you want to trigger an automatic learn step from script (uncomment to use).
// NOTE: You must present the object to the camera while learning runs.
// ControlCommand("Train Vision Object By Script", "Learn", objectName);
// // wait for learning to finish (polling the learning variable)
// while (getVar("$CameraIsLearning", "0") !== "0") {
//   sleep(200);
// }

// Ensure camera is started (use CameraStart if needed)
ControlCommand("Camera", "CameraStart");

// Enable object tracking and movement following
ControlCommand("Camera", "CameraObjectTrackingEnable");
ControlCommand("Camera", "CameraMovementTrackEnable");
ControlCommand("Camera", "CameraUseTrackingScripts", true);

printStatus("Waiting for object to be detected...");

// Wait until the camera sees the object
while (true) {
  var isTracking = String(getVar("$CameraIsTracking", "0"));
  if (isTracking === "1" || isTracking.toLowerCase() === "true") {
    // If you used Train Vision Object, verify object name matches (optional)
    var curName = getVar("$CameraObjectName", "");
    // If multiple objects have suffixes, check one of them:
    if (curName !== "" && curName.toLowerCase().indexOf(objectName.toLowerCase()) === -1) {
      // Not our trained object; wait a bit and continue
      sleep(checkDelay);
      continue;
    }
    printStatus("Object detected: " + curName);
    break;
  }
  sleep(checkDelay);
}

// MAIN APPROACH LOOP
printStatus("Approaching object...");
while (true) {
  // Camera gives width/height per detected object variables.
  // Try multiple possible variable names. Use Variable Watch to see exact names in your project.
  var w = parseInt(getVar("$CameraObjectWidth", "0"), 10);
  if (isNaN(w) || w === 0) {
    // try common suffixed name (first object) - many projects use _1 suffix
    w = parseInt(getVar("$CameraObjectWidth_1", "0"), 10);
  }
  // If you have a Ping sensor, prefer using Ping.get(trigger, echo) instead of width
  printStatus("Object width (px): " + w);

  if (w >= approachWidthThreshold) {
    // close enough
    printStatus("Reached close threshold. Stopping movement.");
    Movement.stop();
    break;
  }

  // otherwise, let camera movement tracking handle driving
  // (If you prefer to manually drive, use Movement.forward/left/right based on $CameraObjectCenterX)
  sleep(checkDelay);
}

// GRASP
printStatus("Positioning gripper for grasp...");
// ensure gripper is open
Servo.setPosition(gripperPort, gripperOpenPos);
// small delay for servo movement
sleep(750);

// lower arm if you have an arm (optional)
if (typeof armPort !== "undefined") {
  Servo.setPosition(armPort, armLiftPos - 30); // lower slightly - adjust to your hardware
  sleep(700);
}

// close gripper
printStatus("Closing gripper...");
Servo.setPosition(gripperPort, gripperClosePos);
sleep(1000);

// lift arm after grasp (optional)
if (typeof armPort !== "undefined") {
  Servo.setPosition(armPort, armLiftPos);
  sleep(800);
}

printStatus("Grab sequence completed.");

Notes, tuning and testing

  • Variable names: check the exact camera variables in your project with the Variable Watch skill. Camera object variables may be $CameraObjectWidth, $CameraObjectWidth_1, etc. Use the Variable Watch to confirm which name your camera instance uses and adjust the script accordingly.
  • If the Camera’s built-in movement tracking isn’t tuned, the robot may not center or may overshoot. Tune the Camera's Servo/Movement Tracking settings (grid lines and thresholds) before relying on the script.
  • If you have an ultrasonic sensor, use Ping.get(triggerPort, echoPort) or the Ping.wait* helpers to detect distance more reliably than image size.
  • Always run servo movements slowly and within safe min/max limits first to avoid mechanical binding.
  • Reduce camera resolution to improve framerate and response time (320x240 or 160x120 recommended).
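
The width-variable fallback used in the script above can be factored into a small helper. A sketch with the variable getter injected so the lookup order is easy to test on its own (in ARC you would pass the built-in getVar):

```javascript
// Return the first candidate camera variable holding a positive width.
// getVarFn is injected; in ARC, pass the built-in getVar function.
function readObjectWidth(getVarFn, names) {
  for (var i = 0; i < names.length; i++) {
    var w = parseInt(getVarFn(names[i], "0"), 10);
    if (!isNaN(w) && w > 0) return w;
  }
  return 0; // nothing detected under any candidate name
}

// Usage in ARC (exact names depend on your project - confirm with Variable Watch):
// var w = readObjectWidth(getVar, ["$CameraObjectWidth", "$CameraObjectWidth_1"]);
```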

Next steps I can help with

  • Provide a tuned script for your exact gripper/arm servo ports and their safe min/max/open/close positions if you tell me the ports (e.g., gripper = d4, arm = d5).
  • Help convert a trained object workflow into an autonomous routine that returns to a home position after pickup.
  • Walk through how to train the object via Train Vision Object By Script step-by-step and show the exact ControlCommand sequence to call from JavaScript.

Tell me:

  • Which servo ports your gripper and arm use, and whether you have a distance sensor (and its port). I’ll update the script to match your hardware and include safe limits and recommended values.

I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Belgium
#2  

Hi svorres,

I think a red Coke can will be too big to grab. You will also need a bigger gripper for that.

PRO
Canada
#3  

Good point, Nomad_6R - a standard soda can is fairly large and a lot of stock grippers on small rovers may not have the aperture, wrist geometry, or torque to safely grasp it. Below I’ll explain what to check, why a can is challenging, options (modify gripper or an alternative approach), and give a small ARC JavaScript test you can run to verify your gripper’s range and torque.

Why a Coke can can be hard to grab

  • Size: a standard 12 oz can is about 66 mm diameter × 122 mm tall. Many small grippers can’t open wide enough or wrap around that shape securely.
  • Shape: cylindrical shape is hard to pinch with small two-finger grippers unless jaw length and contact area are adequate.
  • Torque & mounting: lifting may require more torque than a small servo can provide, especially if the arm geometry gives a long lever.
  • Material: aluminum is non-magnetic, so magnets won’t work. Suction could work but needs a pump and good seal.
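
To put the torque point in numbers: a full 12 oz can weighs roughly 0.39 kg (355 g of contents plus ~15 g of aluminum), so holding it at the end of a 15 cm arm takes about 5.85 kg-cm statically, before any safety margin. A quick estimate, where the arm length and 3x margin are assumptions to adjust for your geometry:

```javascript
// Static torque needed to hold a mass at the end of a horizontal arm,
// with a safety margin for dynamics and leverage. Inputs are assumptions.
function requiredTorqueKgCm(massKg, armLengthCm, margin) {
  return massKg * armLengthCm * margin;
}

// 0.39 kg * 15 cm * 3 = 17.55 kg-cm - compare against the servo's stall torque.
var needed = requiredTorqueKgCm(0.39, 15, 3);
```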

Checklist - what to measure / verify

  1. Gripper max opening (mm) - physical jaw-to-jaw maximum.
  2. Jaw depth/length (mm) - can the jaws reach around the can side?
  3. Gripper mounting height and wrist geometry - will the gripper hit the ground when trying to pick a can?
  4. Gripper servo model or torque rating (oz-in or kg-cm).
  5. Arm lifting servo torque and gear ratio.
  6. Robot ground clearance and approach geometry (is your rover able to get the gripper under or in front of the can?).
  7. Do you have a distance sensor (Ping) to know when to stop in front of the can?

If you post those values I’ll recommend whether the current gripper can be used or if you need a larger gripper/approach change.

Options and recommendations

  • Try a larger or modified gripper:
    • Use a larger two-finger parallel gripper with long jaws (able to open >66 mm).
    • Add soft rubber pads on jaws for grip.
    • Add finger extenders / 3D-printed jaw adapters to increase aperture and contact area.
  • Use a scoop or push-and-tilt approach:
    • Use a small scoop in front of the rover to nudge the can into a holding cradle or container on the robot (simpler mechanical solution).
    • Push the can to a bin or platform rather than lifting it.
  • Use a vacuum suction cup:
    • Works well on flat-ish tops, needs vacuum pump & valve and a good seal; adds complexity and power draw.
  • Use a passive hook or clamp + wedge:
    • Drive a wedge under the can and then use a clamp to trap it against a fixed backplate.
  • If you must lift, upgrade arm servos:
    • Ensure arm servo torque margin is > 2-3× what is needed (account for leverage).

ARC setup hints (practical)

  • Use Camera Device + Train Vision Object By Script to teach the can (or Multi-Color/Color if you prefer). Low resolution (320×240 or 160×120) helps responsiveness.
  • Configure Movement tracking in Camera Settings so ARC drives the rover to the object. Open the robot skill’s kebab menu (three lines) -> Settings -> Servo/Movement Tracking to enable Movement and tune grid and thresholds.
    • Each robot skill’s settings: click the three lines (hamburger) at the top-right of the robot skill window to open options.
  • Use a Ping ultrasonic sensor for stopping distance instead of relying solely on camera width - more reliable for "close enough" detection.
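
If you have no Ping sensor, the camera-width heuristic can at least be grounded with a pinhole-camera estimate: distance ≈ realWidth × focalLength / pixelWidth. A sketch (the 66 mm can diameter is real; FOCAL_PX is an assumed value you would calibrate once by measuring the can's pixel width at a known distance):

```javascript
// Estimate distance from apparent pixel width (pinhole camera model).
// Calibrate once: FOCAL_PX = pixelWidth * knownDistanceMm / CAN_WIDTH_MM.
var CAN_WIDTH_MM = 66;  // 12 oz can diameter
var FOCAL_PX = 300;     // assumed - calibrate for your camera and resolution

function estimateDistanceMm(pixelWidth) {
  if (pixelWidth <= 0) return -1; // object not visible
  return (CAN_WIDTH_MM * FOCAL_PX) / pixelWidth;
}

// With these numbers, a 140 px wide can is about 141 mm away.
```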

Quick ARC JavaScript gripper test

Run this script in an ARC JavaScript Script robot skill to test gripper opening/closing range and to read commanded positions. Edit the gripperPort to your servo port (d0..d23 or v0..v99).

// Gripper test - change gripperPort to your servo port (no quotes)
var gripperPort = d2; // example - change to your port
var step = 10; // degrees per step
var delayMs = 400; // wait between steps (ms)

// Note: print() is built in to ARC JavaScript and logs to the Script Console.

print("Starting gripper sweep test on port: " + gripperPort);

// sweep open (10 -> 170)
for (var p = 10; p <= 170; p += step) {
  Servo.setPosition(gripperPort, p);
  print("Set gripper to: " + p);
  sleep(delayMs);
}

// optional short hold
sleep(600);

// sweep close (170 -> 10)
for (var p = 170; p >= 10; p -= step) {
  Servo.setPosition(gripperPort, p);
  print("Set gripper to: " + p);
  sleep(delayMs);
}

print("Gripper sweep complete. Read commanded position:");
var pos = Servo.getPosition(gripperPort);
print("Servo.getPosition(" + gripperPort + ") = " + pos);

// If your ARC version exposes a servo released-state check (exact function name may vary), try it:
try {
  var released = IsReleased(gripperPort);
  print("IsReleased: " + released);
} catch (e) {
  print("IsReleased not supported on this servo plugin/port.");
}

Notes while testing

  • Power: power the servos from a stable supply with enough current. If servos draw too much, the EZB may brown out.
  • Watch for binding - stop the test if the servo stalls or overheats.
  • If the sweep reaches the physical limit and the servo stalls, reduce the max range or set Servo.setMaxPositionLimit(gripperPort, position) to a safe max.

Next steps - how I can help

  • If you post the gripper aperture (mm), servo ports and servo model/torque, and whether you have a Ping sensor (and its port), I will:
    • Tell you if the existing gripper can likely handle the can.
    • Suggest mechanical mods or a gripper upgrade and give exact part suggestions.
    • Provide a ready-to-run ARC JavaScript script to locate (camera train or color), approach (movement tracking or manual Movement commands), stop (Ping or camera width), and grasp using your specific servo ports and thresholds.

Thanks Nomad_6R - your observation is useful. svorres - post the measurements/servo ports and I’ll tailor the plan and script for Roli. If you have photos, paste the exact text written on them or list measurements - I can’t see images.


PRO
Canada
#5  

Nice find, Nomad_6R - that Thingiverse gripper is exactly the kind of 3D-printable adapter that can make a soda can pickup feasible. Here’s how to evaluate it and integrate it into your Roli + ARC project, plus a ready-to-adapt ARC JavaScript test and pick-up sequence.

What to check on the Thingiverse gripper

  • Jaw opening: can the printed jaws open to at least 66 mm? (A 12 oz can is ~66 mm in diameter.)
  • Jaw depth/length: jaws must wrap far enough around the can to get a secure hold.
  • Mount interface: does it mount to a standard servo horn or require an adapter?
  • Strength and geometry: will your servo(s) and arm be able to lift a can without stalling or binding?
  • Printing orientation & infill: use stronger infill and correct orientation so the jaws don’t snap.

Mechanical tips

  • Add rubber or foam pads to the jaws for friction and to avoid slipping.
  • Use longer jaws or printed jaw extenders if the stock design won’t reach.
  • If a single small servo has insufficient torque, consider a higher-torque servo or a gear / lever modification.
  • Consider a "scoop" or containment cradle on the robot if fully lifting is too difficult - push-and-capture is often simpler.

Steps to mount & wire to Roli

  1. Print the gripper. Verify fit on your servo horn before final assembly.
  2. Attach gripper to your wrist/gripper servo horn. Secure with appropriate screws.
  3. Connect servo signal to an EZB servo port (d0-d23 or v0-v99). Note which port (e.g., d2).
  4. Ensure servo power supply is sized for the servo torque draw (do not power high-torque servos from EZ-B 5V if they exceed the board’s rating).
  5. Manually test open/close and eliminate binding before trying to lift a can.

How to configure Camera + Movement + Robot skills (kebab menu)

  • Add/verify skills: Project -> Add robot skill -> Camera -> Camera Device (and Camera -> Train Vision Object By Script if using object training).
  • Open the robot skill window (Camera Device), click the three-lines (hamburger) menu at the top-right, then Settings.
    • In Settings -> Servo/Movement Tracking: enable Movement tracking so the Camera can command the Movement Panel (Roli drive).
    • Tune grid lines, forward/turn thresholds and servo options if you use servo pan/tilt.
  • For the Train Vision Object By Script skill: open its kebab menu and Attach it to the Camera (or call from script with ControlCommand("Train Vision Object By Script","Attach")).

Basic workflow to get autonomous pickup

  1. Print & mount gripper, connect servo to known port (e.g., d2).
  2. Configure Camera (low resolution 320x240 or 160x120). Either:
    • Train the object with Train Vision Object By Script (recommended for a real can), or
    • Configure Color/Multi-Color tracking tuned to your can’s red.
  3. Enable Camera movement tracking (Camera settings -> Servo/Movement Tracking -> Movement). Test that Roli centers & approaches the can.
  4. Add a Ping ultrasonic sensor (recommended) to reliably stop at grasping distance, or use $CameraObjectWidth as a heuristic.
  5. Set safe min/max servo limits: via script use Servo.setMinPositionLimit/Servo.setMaxPositionLimit or in the servo control panels.
  6. Run the sequence: approach -> stop -> open gripper -> lower arm (if applicable) -> close gripper -> lift.

Ready-to-adapt ARC JavaScript

  • Edit the top section (ports, positions, thresholds) to match your hardware.
  • Uses Camera commands to start tracking and Movement tracking, then approaches and executes the grasp.
// CONFIG - set these to your robot's ports and values
var trainedObjectName = "COKE";          // if you trained via Train Vision Object, name here
var gripperPort = d2;                    // change to your gripper servo port (no quotes)
var armPort = d3;                        // optional arm lift servo (or undefined)
var gripperOpenPos = 30;                 // degrees to open gripper
var gripperClosePos = 110;               // degrees to close gripper
var armPreLowerPos = 60;                 // arm position to lower slightly before grasp
var armLiftPos = 90;                     // arm position to lift after grasp
var approachWidthThreshold = 140;        // camera object width px threshold = "close enough"
var pingTriggerPort = d7;                // optional: set your Ping trigger port (or undefined)
var pingEchoPort = d8;                   // optional: set your Ping echo port (or undefined)
var checkDelay = 200;                    // ms

function printStatus(msg){
  print(msg);
}

// Start camera & attach Train skill (safe to call even if already attached)
ControlCommand("Train Vision Object By Script","Attach");
ControlCommand("Camera","CameraStart");
ControlCommand("Camera","CameraObjectTrackingEnable");
ControlCommand("Camera","CameraMovementTrackEnable");
ControlCommand("Camera","CameraUseTrackingScripts",true);

printStatus("Waiting for object detection...");

// Wait until camera is tracking something (optionally validate trainedObjectName)
while(true){
  var isTracking = getVar("$CameraIsTracking","0");
  if(isTracking === "1" || isTracking.toLowerCase() === "true"){
    // if using trained object, check its name (some projects append _1 etc.)
    if(trainedObjectName && trainedObjectName.length > 0){
      var curName = getVar("$CameraObjectName","");
      if(curName === "" || curName.toLowerCase().indexOf(trainedObjectName.toLowerCase()) === -1){
        // Not our object yet
        sleep(checkDelay);
        continue;
      }
    }
    break;
  }
  sleep(checkDelay);
}

// Approach loop: rely on Camera Movement tracking to drive forward/center.
// Stop either by Ping distance or camera width threshold.
printStatus("Approaching...");

while(true){
  // Prefer Ping if available
  var closeEnough = false;
  if(typeof pingTriggerPort !== "undefined" && typeof pingEchoPort !== "undefined"){
    try {
      var dist = Ping.get(pingTriggerPort, pingEchoPort); // 0-255 cm scale
      printStatus("Ping distance: " + dist + " cm");
      if(dist > 0 && dist <= 20) { // adjust 20cm threshold to your geometry
        closeEnough = true;
      }
    } catch(e) {
      printStatus("Ping read failed: " + e);
    }
  } else {
    // fallback: use camera width
    var w = parseInt(getVar("$CameraObjectWidth","0"),10);
    if(isNaN(w) || w === 0){
      w = parseInt(getVar("$CameraObjectWidth_1","0"),10); // try suffix
    }
    printStatus("Camera object width: " + w);
    if(w >= approachWidthThreshold) closeEnough = true;
  }

  if(closeEnough){
    printStatus("Close enough. Stopping movement.");
    Movement.stop();
    break;
  }

  sleep(checkDelay);
}

// Grasp sequence
printStatus("Opening gripper...");
Servo.setPosition(gripperPort, gripperOpenPos);
sleep(700);

if(typeof armPort !== "undefined"){
  printStatus("Lowering arm for grasp...");
  Servo.setPosition(armPort, armPreLowerPos);
  sleep(700);
}

printStatus("Closing gripper...");
Servo.setPosition(gripperPort, gripperClosePos);
sleep(900);

if(typeof armPort !== "undefined"){
  printStatus("Lifting arm...");
  Servo.setPosition(armPort, armLiftPos);
  sleep(700);
}

printStatus("Grasp complete.");

Next steps / questions for you

  • Did you print and mount that Thingiverse gripper yet? If so, what is its measured maximum jaw opening (mm)?
  • Which EZB servo ports are your gripper and arm connected to (e.g., d2, d3)?
  • What servo models (or torque ratings) are you using for gripper and arm?
  • Do you have a Ping ultrasonic sensor (and ports) to improve stopping reliability?
  • Want me to adapt the JavaScript pick-up script to your exact ports/min-maxs and to include Servo.setMinPositionLimit / Servo.setMaxPositionLimit calls?

Tell me those details and I’ll update the script and give exact safe limits and tuning suggestions.


PRO
Belgium
#6  

@athena

It also uses the big HDD servos. I'm already printing one and have modified the base so we can use the original square coupling to make it very strong. Also using the metal spline, if he/she wants it.

User-inserted image

PRO
Canada
#7  

Nice - those HDD servos + metal spline coupling sound like a very strong solution for grabbing a Coke can. I can help you finish the integration and produce a ready-to-run ARC JavaScript script, but first a few notes, safety checks and some follow-up questions so I tune the script and limits to your hardware.

I can’t view attached photos. Please paste any text printed on the image or provide measurements (jaw opening, arm length, screw sizes, servo model numbers, etc.). I’ll use that info to set safe min/max values and timings.

Important mechanical & electrical considerations

  • Torque & gear loading
    • HDD-style servos (or large hobby servos with metal splines) are great for torque, but make sure the servo torque rating comfortably exceeds the required lift torque (recommended 2-3x margin, account for leverage and dynamic loads).
  • Power supply
    • Use a dedicated servo power supply sized for stall current of the large servos. Do not draw high-torque servo power from EZ-B’s 5V supply. Brownouts will cause disconnects and data loss.
  • Mounting & coupling
    • Metal spline + square coupling is good for rigidity. Add thread locker and ensure set screws have flat spots on the spline. Reinforce the base and avoid binding at extremes.
  • Mechanical stops / limits
    • Add physical stops, or set servo min/max limits in software to prevent over-rotation and binding. Use Servo.setMinPositionLimit and Servo.setMaxPositionLimit in ARC.
  • Grip surface
    • Add rubber pads or foam to the jaw faces for friction and shock absorption.
  • Safety & testing
    • Run slow test sweeps first. Stop immediately on stall/heating. Test with the gripper unloaded, then with the can.
  • Sensor for reliable stopping
    • A Ping ultrasonic sensor (preferred) or a camera width threshold can be used to stop the robot at the correct grasp distance. Ping is more reliable for distance.
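
On the power-supply point, a quick worst-case sizing rule is to sum the stall currents of every servo that could load up at the same time and add headroom. A sketch (the 2.5 A stall figures are assumptions; check your servo datasheets):

```javascript
// Size a servo supply for worst-case (stall) draw plus headroom.
function requiredSupplyAmps(stallCurrentsA, headroom) {
  var total = 0;
  for (var i = 0; i < stallCurrentsA.length; i++) {
    total += stallCurrentsA[i];
  }
  return total * headroom;
}

// Two servos stalling at 2.5 A each with 1.5x headroom -> 7.5 A supply.
var amps = requiredSupplyAmps([2.5, 2.5], 1.5);
```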

What I need from you (quick)

  • Which EZB servo ports are used for: gripper jaw, wrist (if any), arm lift? (e.g., gripper = d2, arm = d3)
  • Do you have a Ping/sonar sensor? If yes, which trigger/echo ports? (e.g., trigger = d7, echo = d8)
  • Measured max jaw opening in mm (or the printable design’s stated opening). A 12 oz can is ~66 mm diameter.
  • Servo models or torque ratings (kg-cm or oz-in), and the servo power supply voltage.
  • Do you want the Rover to lift the can fully, or just capture it and carry it in a cradle?

Recommended workflow (summary)

  1. Mount the printed HDD-gripper to the servo horn using the metal spline and square coupling. Secure set screws with Loctite or equivalent.
  2. Add rubber pads and test mechanical travel; measure the degrees for open/close without binding.
  3. Add software min/max limits and reduced speed for the gripper during tests.
  4. Wire servo signal to EZB port (d0-d23 or v0-v99) and ensure servo power is on separate supply.
  5. Configure Camera Device + Train Vision Object or MultiColor as previously discussed. Use low camera resolution (320×240 or 160×120).
  6. Use Ping.get() or camera width variable to stop at the grasping distance, then execute the grasp sequence (open -> lower -> close -> lift).

ARC JavaScript template

  • Update the top CONFIG values to your ports, positions and thresholds.
  • This script includes: safety limits, slow speed for torque, servo sweep test, approach loop (Ping preferred), and grasp/lift sequence.
// GRIPPER + PICKUP TEMPLATE - edit CONFIG values to match your robot

// ---------- CONFIG ----------
var trainedObjectName = "COKE";        // name if you trained the object (or leave empty for color tracking)
var gripperPort = d2;                  // gripper servo port (d0-d23 or v0-v99)
var armPort = d3;                      // arm lift servo port (or undefined if none)
var wristPort = undefined;             // optional wrist rotation servo (or undefined)

var gripperOpenPos = 30;               // degrees for open (adjust after mechanical test)
var gripperClosePos = 110;             // degrees for closed (adjust after mechanical test)
var gripperSafeMin = 5;                // software min limit (deg)
var gripperSafeMax = 170;              // software max limit (deg)
var gripperSpeed = 6;                  // 0 (fastest) - 10 (slowest) - use slower for torque

var armLiftPos = 90;                   // post-grasp lift position
var armPreLowerPos = 60;               // lowered position for grasping
var armSafeMin = 10;
var armSafeMax = 170;
var armSpeed = 5;

var pingTriggerPort = d7;              // set trigger port or undefined
var pingEchoPort = d8;                 // set echo port or undefined

var approachWidthThreshold = 140;      // fallback camera width px threshold (adjust)
var checkDelay = 200;                  // ms loop delay

// ---------- HELPERS ----------
function printStatus(msg) { print(msg); } // use print() per ARC guidance

// ---------- INITIAL SAFETY SETUP ----------
printStatus("Setting servo limits and speeds (software).");
try {
  Servo.setMinPositionLimit(gripperPort, gripperSafeMin);
  Servo.setMaxPositionLimit(gripperPort, gripperSafeMax);
  Servo.setSpeed(gripperPort, gripperSpeed);
} catch (e) { printStatus("Warning: failed to set gripper limits/speed: " + e); }

if (typeof armPort !== "undefined") {
  try {
    Servo.setMinPositionLimit(armPort, armSafeMin);
    Servo.setMaxPositionLimit(armPort, armSafeMax);
    Servo.setSpeed(armPort, armSpeed);
  } catch (e) { printStatus("Warning: failed to set arm limits/speed: " + e); }
}

// ---------- SWEEP TEST (manual safe test) ----------
printStatus("Starting gripper sweep test (open->close->open). STOP if you hear stall/strain.");
for (var p = gripperOpenPos; p <= gripperClosePos; p += 10) {
  Servo.setPosition(gripperPort, p);
  printStatus("Gripper -> " + p);
  sleep(300);
}
for (p = gripperClosePos; p >= gripperOpenPos; p -= 10) {
  Servo.setPosition(gripperPort, p);
  sleep(300);
}
printStatus("Gripper sweep complete. Verify mechanical range is correct.");

// ---------- START CAMERA + TRAIN SKILL ----------
ControlCommand("Train Vision Object By Script", "Attach");
ControlCommand("Camera", "CameraStart");
ControlCommand("Camera", "CameraObjectTrackingEnable");
ControlCommand("Camera", "CameraMovementTrackEnable");
ControlCommand("Camera", "CameraUseTrackingScripts", true);

printStatus("Waiting for object detection.");
while (true) {
  var isTracking = getVar("$CameraIsTracking", "0");
  if (isTracking === "1" || String(isTracking).toLowerCase() === "true") {
    if (trainedObjectName && trainedObjectName.length > 0) {
      var curName = getVar("$CameraObjectName", "");
      if (curName && curName.toLowerCase().indexOf(trainedObjectName.toLowerCase()) === -1) {
        // Not our object yet
        sleep(checkDelay);
        continue;
      }
    }
    break;
  }
  sleep(checkDelay);
}

printStatus("Object detected. Approaching...");

// ---------- APPROACH LOOP ----------
while (true) {
  var closeEnough = false;

  // prefer Ping sensor if configured
  if (typeof pingTriggerPort !== "undefined" && typeof pingEchoPort !== "undefined") {
    try {
      var dist = Ping.get(pingTriggerPort, pingEchoPort); // 0-255 cm
      printStatus("Ping distance: " + dist + " cm");
      if (dist > 0 && dist <= 12) { // 12 cm grasp distance example - adjust
        closeEnough = true;
      }
    } catch (e) {
      printStatus("Ping read error: " + e);
    }
  } else {
    // fallback to camera width heuristic
    var w = parseInt(getVar("$CameraObjectWidth", "0"), 10);
    if (isNaN(w) || w <= 0) {
      w = parseInt(getVar("$CameraObjectWidth_1", "0"), 10);
    }
    printStatus("Camera width px: " + w);
    if (!isNaN(w) && w >= approachWidthThreshold) closeEnough = true;
  }

  if (closeEnough) {
    printStatus("Close enough - stopping movement.");
    Movement.stop();
    break;
  }
  sleep(checkDelay);
}

// ---------- GRASP SEQUENCE ----------
printStatus("Opening gripper (pre-position).");
Servo.setPosition(gripperPort, gripperOpenPos);
sleep(700);

if (typeof armPort !== "undefined") {
  printStatus("Lowering arm to pre-grasp position.");
  Servo.setPosition(armPort, armPreLowerPos);
  sleep(700);
}

printStatus("Closing gripper.");
Servo.setPosition(gripperPort, gripperClosePos);
sleep(1000);

if (typeof armPort !== "undefined") {
  printStatus("Lifting arm.");
  Servo.setPosition(armPort, armLiftPos);
  sleep(800);
}

printStatus("Pickup sequence completed.");

// Optionally release camera movement tracking or leave as-is
ControlCommand("Camera", "CameraMovementTrackDisable");

What I will do next if you give me details

  • Replace placeholder ports and positions with your exact gripper/arm ports and measured degree limits.
  • Set approach distances to match your Ping sensor or camera width values.
  • Insert soft-start (Servo.setSpeed) and lift/tilt geometry adjustments based on arm length & torque.
  • Add a recovery routine if grasp fails (e.g., reopen and retry a set number of times).
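
The retry routine in the last bullet is independent of the hardware details. A sketch where attemptGrasp and verifyGrasp are hypothetical callbacks you would implement with the Servo/Ping calls from the template above:

```javascript
// Retry a grasp a bounded number of times. attemptGrasp() runs the
// open/lower/close/lift sequence; verifyGrasp() returns true on success
// (e.g. via a touch sensor or a feedback-servo position check).
function graspWithRetry(attemptGrasp, verifyGrasp, maxTries) {
  for (var i = 1; i <= maxTries; i++) {
    attemptGrasp();
    if (verifyGrasp()) return i; // succeeded on attempt i
  }
  return -1; // all attempts failed - caller should recover or park safely
}
```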

Next steps for you

  1. Tell me the servo ports and measured jaw opening (mm).
  2. Confirm whether you have a Ping sensor (ports) and the servo power voltage & servo model/torque.
  3. If you pasted text from the image, include any numbers or printed model names on the servo or coupling.

I’ll then customize the script exactly for Roli, include safe min/max calls, timing, and suggested PID-like tuning steps for the camera/movement tracking. Want me to proceed?

