I used an Ohmnilabs Telepresence robot to write some self-docking code as an example. I really like their docking mechanism, which uses magnets - something like this can easily be 3D printed for any robot. The downward-facing camera I used is a Logitech USB camera set at 800x600 resolution, which detected the glyph very well from a height of 4 feet.
The Robot
We have 2 of these robots in the office, but they don't get used much due to no support for mobile devices - meaning you couldn't use your phone with them. So they became perfect candidates to have ARC installed on them! Now they can be controlled by my phone and Exosphere - but most importantly, they're under my control muahhaha!! Back on topic, here are the details about connecting this robot to ARC: https://synthiam.com/GettingStarted/Build-Robots/Ohmnilabs-Telepresence-Robot-18728
Code
The script is really nothing fancy - and there are much more complicated ways of steering the robot. You could also use the camera's object rotation variable to detect the rotation of the glyph and use that in your math. But, as you know, I'm about getting stuff done quick and dirty for proof of concept. Later, if I feel up to it, I'll add additional complexity by incorporating the rotation and using dynamic turning.
Something to note about the camera variables is that they all have a 2 on the end. This is because I am using 2 cameras and renamed the variables in this camera's configuration.
Also, I changed the number of objects to be detected to 1 from the default value (which I think is 4). This prevents multiple glyphs or other objects from being detected.
Code:
var centerX = 328;   // target X pixel of the glyph when the robot is on the dock
var centerY = 300;   // target Y pixel of the glyph when the robot is on the dock
var deadband = 7;    // tolerance in pixels so small jitters don't trigger corrections

// Start glyph tracking on the downward-facing camera.
ControlCommand("Camera 2", "CameraGlyphTrackingEnable");
sleep(1000);

// Bail out if the dock glyph isn't visible at all.
if (!getVar("$CameraIsTracking2")) {
  Audio.say("I do not see the dock");
  return;
}

while (getVar("$CameraIsTracking2")) {

  if (getVar("$CameraObjectCenterX2") < centerX - deadband) {
    // Glyph is left of center - turn left until it's back inside the deadband.
    print("Turn Left");
    Movement.left(30);
    while (getVar("$CameraObjectCenterX2") < centerX - deadband && getVar("$CameraIsTracking2"));
  } else if (getVar("$CameraObjectCenterX2") > centerX + deadband) {
    // Glyph is right of center - turn right until it's back inside the deadband.
    print("Turn Right");
    Movement.right(30);
    while (getVar("$CameraObjectCenterX2") > centerX + deadband && getVar("$CameraIsTracking2"));
  }

  // If the glyph is centered on both axes, the robot is over the dock - stop and charge.
  if (getVar("$CameraObjectCenterY2") > centerY - deadband
      && getVar("$CameraObjectCenterY2") < centerY + deadband
      && getVar("$CameraObjectCenterX2") > centerX - deadband
      && getVar("$CameraObjectCenterX2") < centerX + deadband) {
    Movement.stop();
    print("Charging!");
    Audio.say("The charger is mine");
    return;
  }

  // Otherwise keep creeping forward toward the dock.
  print("Forward");
  Movement.forward(30);
  sleep(250);
}

print("Stopping");
Movement.stop();
Alternative Code
Another way I could have done it is by using the camera control's built-in dynamic turning and movement tracking. The grid lines could have been isolated to the width of the glyph. Then I would simply loop and check whether the glyph is within a specific X and Y location rather than steering myself. This way would be much, much easier to write, but I never thought about it until writing this post.
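For anyone who wants to try it, here's a rough sketch of what that alternative could look like. It assumes the camera control's movement tracking is enabled in the Camera 2 configuration (so the camera control does the steering) and reuses the same target coordinates and renamed variables from the main script - treat it as an illustration, not tested code.
Code:
// Sketch of the alternative: the camera control's built-in movement tracking
// steers the robot toward the glyph, and this script only watches for the
// glyph to land inside a target window before stopping on the dock.
// Assumes movement tracking is already enabled in the Camera 2 configuration
// and the grid lines are narrowed to roughly the width of the glyph.
var targetX = 328;
var targetY = 300;
var tolerance = 10;

ControlCommand("Camera 2", "CameraGlyphTrackingEnable");
sleep(1000);

while (getVar("$CameraIsTracking2")) {
  var x = getVar("$CameraObjectCenterX2");
  var y = getVar("$CameraObjectCenterY2");

  // The camera control does the driving; we only decide when to stop.
  if (Math.abs(x - targetX) < tolerance && Math.abs(y - targetY) < tolerance) {
    Movement.stop();
    Audio.say("The charger is mine");
    return;
  }

  sleep(250);
}

Movement.stop();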
By DJ Sures
@DJ... nice video btw. The magnets are so simple but quite effective...
Here is mine from a while back
My auto docking code...
Not trying to jack the thread... just that we need more threads like this....Code:
@DJ Let me know if you want me to re-post this in a separate thread...
I find sometimes with the version I made that the robot tries to dock at an angle. So I'm thinking of using the object rotation variable to align the robot with the glyph better. I just have to do a bit of thinking about how to do it, because you first have to line up the angle and then drive toward it. Lining up the angle would be the interesting part.
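A rough sketch of that alignment phase might look like this - it assumes the glyph's rotation is exposed as $CameraObjectRotation2 after the variable rename (check the camera's variable list for the exact name) and that 0 means the glyph is square to the camera.
Code:
// Rough sketch of a two-phase dock: line up with the glyph's rotation first,
// then hand off to the normal docking script to drive in.
// $CameraObjectRotation2 is an assumption - confirm the exact variable name
// in the Camera 2 variable list. 0 is assumed to mean square to the camera.
var rotationTolerance = 5; // degrees either side of square

while (getVar("$CameraIsTracking2")) {
  var rotation = getVar("$CameraObjectRotation2");

  if (rotation > rotationTolerance) {
    Movement.left(25);
  } else if (rotation < -rotationTolerance) {
    Movement.right(25);
  } else {
    // Lined up - stop turning and let the docking approach take over.
    Movement.stop();
    break;
  }

  sleep(100);
}

Movement.stop();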
JavaScript vs EZ-script.... A tool is a tool... It depends how one uses the tool that makes the difference.... EZ-Script is as powerful as the person using it...
I had/have the same problem with the robot occasionally trying to dock at an angle... I had mine back away to about a meter and then re-try. However, this still only fixed the angled docking problem about 50% of the time...
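That back-away-and-retry step could be sketched roughly like this - Movement.reverse() and the two-second reverse (for about a meter) are assumptions to tune for your base, and "Dock Script" is a hypothetical name for whatever script control runs the docking routine.
Code:
// Sketch of a back-away-and-retry step for angled docking attempts.
// The reverse duration is a guess for roughly a meter - tune for your robot.
// "Dock Script" is a hypothetical name; use whatever your docking script
// control is actually called.
print("Bad angle - backing away to retry");
Movement.reverse(30);
sleep(2000); // assumption: about a meter at this speed
Movement.stop();
sleep(500);

// Re-run the docking routine.
ControlCommand("Dock Script", "ScriptStart");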
I know I am getting a little too complicated, but perhaps side glyphs/IR beacons that can only be seen when the robot is approaching the docking station at an angle? If the bot sees the left or right glyph, it will know that it is approaching the docking station at an angle instead of straight on...
I am also slowly learning JavaScript and plan to start rewriting and converting my project scripts. I'll start slow and see if it makes any difference for either me or my robot.