Robot Skills Overview

ARC apps consist of Robot Skill Controls. Each skill is a behavior for the robot and runs as its own process (or node). There are skills for Wiimotes, cameras, speech recognition, and hundreds more. Skills can be added to a project workspace using the Add Skill option in ARC.


Robot Skill Categories


ADC Robot Skills

The ADC (Analog-to-Digital Converter) category in Synthiam ARC provides the capability to read analog signals from sensors, converting real-world signals into digital data. This is essential for interfacing with analog sensors and devices, allowing robots to gather precise data from the physical environment.

Displays the ADC voltage in a historical graph. The input voltage can be between 0 and +3.3/+5 VDC (depending on your TTL high-side value). The voltage is displayed as a linear value between 0 and 255 that is proportional to the input voltage. Main Window 1. Graph Display Displays the ADC value in a line graph format. 2. Value Display Displays the ADC numerical value detected (0-255). 3. Pause Checkbox When checked, the ADC readings will pause. Settings 1. Title Field This field contains the...


Displays the ADC (Analog-to-Digital Converter) voltage in a VU meter. The input voltage can be between 0 and +3.3/+5 VDC (depending on your TTL high-side value). The voltage is displayed as a linear value between 0 and 255 that is proportional to the input voltage. Main Window 1. Meter Display Displays the ADC value in a vertical meter format. 2. Pause Checkbox When checked, the ADC readings will pause. Settings 1. Title Field This field contains the title of the skill. You can change it if you'd...


ADC (Analog-to-Digital Converter) displays the voltage and value from the specified ADC port. The input voltage can be between 0 and +3.3/+5 VDC (depending on your TTL high-side value). The voltage is displayed as a linear value between 0 and 255 that is proportional to the input voltage. Main Window 1. Linear Value Readout A linear value between 0 and 255 that is proportional to the input voltage. 2. Voltage Readout The detected analog voltage in VDC. 3. Pause Checkbox When checked, the ADC...
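
Since the reading is a linear 0-255 representation of the input voltage, converting it back to volts is a simple proportion. A minimal JavaScript sketch, assuming a 3.3 V TTL high side and an example reading (use 5.0 for a 5 V controller):

```javascript
// Convert the skill's 8-bit reading (0-255) back to volts.
// Assumes a 3.3 V TTL high side; use 5.0 for a 5 V controller.
var reference = 3.3;
var reading = 192;                       // example value shown by the skill
var volts = (reading / 255) * reference; // ~2.48 VDC
print("ADC reading " + reading + " = " + volts.toFixed(2) + " VDC");
```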


This is the first robot skill created by ARCx and published to the Synthiam server online.


This is the long message for the ARCx test robot skill. This should be visible, and the robot skill is compatible with Windows, Linux, Raspberry Pi, and macOS. Those platforms should be visible for compatibility. This robot skill is a placeholder during our software development of the new ARCx platform. For more information, be sure to check out the ARCx Community Update thread. Please note, ARCx is not available for community beta at this time. We will post an update when it is available for...




Artificial Intelligence Robot Skills

The Artificial Intelligence category in Synthiam ARC empowers robots with advanced cognitive abilities. Through integration with AI services, robots can recognize objects, understand speech, and make intelligent decisions. This skill category opens up a realm of possibilities for creating smart and responsive robotic systems.

AimlBot is a localized chatbot client that uses AIML (Artificial Intelligence Markup Language). The AIML configuration files are stored locally and can be edited by hand or with an AIML editing tool. Additionally, settings for the session can be adjusted programmatically using the ControlCommand.  Main Window 1. Text Entry Field This is where you write your text entry. 2. Send button Click the Send button, or hit enter when you are ready to send your text entry to the AimlBot. 3. Response...


This is the first robot skill created from ARCx that should work just fine...


AIML2 chatbot client for ARC using the Bot Libre free open source cloud service. Bot Libre lets anyone create a chatbot for their robot with unique and customizable personalities. Get Started 1) Visit https://botlibre.com 2) Press Sign Up (don't worry, it's free) 3) Verify the email that Bot Libre will send to you. This email verification must be done in order for the API calls to work, otherwise you will receive an error when using this robot skill. 4) Activate the API by obtaining an...


Detect the percentage of positive sentiment in the provided phrase using cognitive machine learning services. The detected sentiment percentage will be displayed in the response window and assigned to the variable $SentimentPercentage. The text/phrase can be sent to the skill via ControlCommand() scripts. Main Window 1. Text Field In this field, you enter the text you want this machine learning service to evaluate. 2. Detect Button This button will send the text you entered to the machine...
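
A minimal ARC JavaScript sketch of the programmatic flow described above. The ControlCommand window title and command name are placeholders (copy the exact strings from the skill's Cheat Sheet); only the $SentimentPercentage variable is documented here.

```javascript
// Send a phrase to the sentiment skill for evaluation.
// "Sentiment" and "Detect" are placeholder names -- copy the real
// window title and command from the Cheat Sheet tab.
ControlCommand("Sentiment", "Detect", "I really enjoy working on this robot");

// Give the cloud service a moment to respond.
sleep(2000);

// The detected positive-sentiment percentage is written to this variable.
var pct = getVar("$SentimentPercentage");
if (pct > 75)
  Audio.say("That sounds very positive.");
else
  print("Positive sentiment: " + pct + " percent");
```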


Google's DialogFlow is a chatbot NLP (natural language processor). Either manually enter conversational phrases into the input field, or send phrases programmatically using the ControlCommand(). This plugin requires an internet connection, which means your computer must be connected to both the robot and the internet at the same time. The theory behind NLP is to have a conversation with a user to determine intent. You define the intents yourself in the DialogFlow configuration web console....


This is the EZ-AI client for ARC. EZ-AI is an AI framework that comes with a set of components that allows you to use natural speech to ask your robot questions, set reminders and timers, get weather information and a lot more. You can also add java plugins into the framework to extend its capabilities. This version is an early beta release of the client. It will be changing as we go through the beta test process. There are 3 components of EZ-AI. The EZ-AI Client (this plugin), the EZ-AI Server...


Give your robot a personality and conversational ability. This robot skill adds chatbot capability to your robot so you can have meaningful conversations that it will remember. Configure the personality in any way you wish so the robot can prefer specific subjects or have particular opinions. Note: THIS HAS BEEN DISCONTINUED. To use GPT-4, a newer version of this robot skill called OpenAI ChatGPT is available here. Introducing OpenAI: OpenAI's mission is to ensure that artificial...


Use Chat GPT with your robot. Give your robot a personality and conversational ability and describe images. This robot skill adds chatbot capability to your robot so you can have meaningful conversations that it will remember. Configure the personality and include scripts that the AI can run. OpenAI’s mission is to ensure that artificial intelligence benefits all of humanity. This effort is essential to training AI systems to do what humans want. A Generative Pre-trained Transformer is an...


Connect to the PandoraBot AIML2 service using your Pandora developer account. Pandorabots implements and supports development of the AIML open standard.  The Pandorabots Platform is one of the oldest and largest chatbot hosting services in the world. With this robot skill, you can create AI-driven virtual personalities to hold human-like text or voice chats.  Config Window - Authentication 1) The App ID from the Pandora Bot API Dashboard (see how to use it below) 2) The User Key from the...


This skill will send the text you enter to the Pandorabots online server via ControlCommand(). The Pandorabots online server will respond to you with a message from the artificial intelligence engine. By default, we have a Synthiam Pandorabot set up on the Pandorabots online server that you can use, or you can create your own Pandorabot. Main Window 1. Text Display This displays your text from the text entry field and the responses from the Pandorabot. 2. Send Button This button sends the text from the...


Make your robot a chatbot with personality and engagement. It will allow your robot to remember details and form emotional relationships with people. The Personality Forge chatbot platform's AI Engine integrates memories, emotions, knowledge of hundreds of thousands of words, sentence structure, unmatched pattern-matching capabilities, and a scripting language called AIScript. It's the heart and mind of all the chatbots here, and it's easy enough to work with that you need no programming experience...


Synbot is a chatbot framework that this plugin uses. Either manually enter conversational phrases into the input field, or send phrases programmatically using the provided ControlCommand(). Bot Server is installed on your computer and linked directly to the plugin. *Note: Some API requests made by this plugin require an internet connection, which means your computer must be connected to both the robot and the internet at the same time. Follow this tutorial to learn how to maintain an internet...


Use the Microsoft Translate Text cloud service to translate text into any other language supported by the API. The input text language is auto-detected. For example, you can translate the English-only responses returned by the Microsoft Cognitive Vision plugin. This service requires an internet connection. That means your EZ-B will need to be put into Wi-Fi Client Mode. Consult the appropriate learn tutorial for changing the EZ-B to Wi-Fi Client Mode or add a second USB WiFi adapter from here:...


VADER (Valence Aware Dictionary and sEntiment Reasoner) is a lexicon and rule-based sentiment analysis tool that is specifically attuned to sentiments expressed in discussion. The input text is parsed, and variables are set with the Positive, Neutral, and Negative percentages of the analysis. A Compound variable is also set with the overall sentiment. You may manually enter text into the skill, or use ControlCommand() to send text. Use Cases You can use VADER for giving your robot a...
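
A small ARC JavaScript sketch of the flow above. The ControlCommand window/command strings and the variable names ($VaderPositive, $VaderNegative, $VaderCompound) are illustrative assumptions only; the documented behavior is simply that text can be sent via ControlCommand() and that Positive, Neutral, Negative, and Compound values are placed into variables.

```javascript
// Send text to the VADER skill for analysis.
// Window title and command are placeholders -- see the skill's Cheat Sheet.
ControlCommand("VADER Sentiment", "Analyze", "The demo went surprisingly well today");
sleep(500);

// Variable names are illustrative; check the skill for the real ones.
var compound = getVar("$VaderCompound");
var positive = getVar("$VaderPositive");
var negative = getVar("$VaderNegative");

print("Positive " + positive + "%, Negative " + negative + "%, Compound " + compound);

// Branch the robot's reaction on the overall sentiment.
if (compound >= 0)
  Audio.say("Glad to hear it.");
else
  Audio.say("Sorry to hear that.");
```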




Audio Robot Skills

The Audio category in Synthiam ARC enables robots to interact with their surroundings through sound. It includes features such as speech synthesis, audio playback, and voice recognition. This skill set is crucial for creating robots that can communicate effectively using auditory cues.

This is an alternative to the Bing Speech Recognition module for ARC. This skill allows you to specify your own Azure Cognitive Service credentials using the Bing Speech Recognition cloud service. This is by far the most accurate speech recognition service that we've ever used. Main Window 1. Connect API Button Once you have entered the API key in the advanced configuration, you can connect to it with this button. 2. Pause Checkbox This checkbox pauses the detection from the audio input...


Advanced speech synthesis that uses Azure's natural-sounding voices for several languages.


Allows ARC to use UWP speech voices, change default audio devices, capture sound and route to an EZB. *** Version 14 *** Minor bug fix (memory release) *** Version 13 *** Fix for ARC 2020.02.28.00 release *** Version 12 *** Fix for ARC 2020.02.19.00 release *** Version 11 *** Fix for new ARC version *** Version 10 *** ImportUWPVoices command restored *** Version 9 *** Bug Fix (UI sync monitor object waits indefinitely, solution pulse the monitor when leaving ARC)  *** Version 8 *** Build Fix...


Speech recognition engine from Microsoft Azure using your own custom key for billing.


The Synthiam ARC Robot Skill for Azure Text to Speech is a powerful integration that enables your robot to generate human-like speech using Microsoft's Azure Text to Speech service. This skill allows you to take your robotics project to the next level by providing your robot with a natural and dynamic voice. Whether you are building a companion robot, educational tool, or any other robotic application, this skill enhances user interaction and engagement through spoken language. Applications...


English-only speech synthesis that uses a remote server to generate the audio.


This speech recognition skill for ARC uses the Bing Speech Recognition cloud service. It is one of the most accurate speech recognition services available. Two Versions of This Skill There are two versions of this robot skill: this one and the Advanced Speech Recognition. This version of the robot skill uses a shared license key with Microsoft that enables ARC users to experiment and demo this robot skill. Because this version of the skill shares a license key, users may encounter errors if...


*Note: This robot skill has been replaced with the Azure Text To Speech robot skill. Use the Microsoft Bing Text to Speech cloud service. It allows you to add breaks and change the speaking rate, volume, and pitch. This service requires an internet connection. That means your EZ-B will need to be put into Wi-Fi Client Mode. Consult the appropriate learn tutorial for changing the EZ-B to Wi-Fi Client Mode or add a second USB WiFi adapter from here: http://www.ez-robot.com/Tutorials/Course/5 Play through: Allows...


Have a conversation with your robot using your voice by navigating through menu options so your robot can perform tasks. This is similar to how menu trees on phone systems work. You can define the menu tree in the configuration to run scripts based on options selected by the user with their voice. The menu tree can have as many branches as necessary. The robot will speak each menu prompt and then optionally speak each option to the user. The microphone on the robot or PC is used for the users...


This ARC skill uses the default recording device (i.e., microphone) configured on the Windows PC to detect frequencies. The identified frequency can be assigned to a variable and used to move servos. A range can be specified to limit the servo movements between Min and Max frequencies. The type of algorithm used to detect the frequency is called the Fast Fourier transform, or FFT for short. Multiple simultaneous frequencies are heard as noise, which will produce bumpy responses in the data log. Ideally, a single...


This skill is a test of the Google Speech Service with ARC. There is some configuration necessary for testing this skill, as the Google Speech Service is paid and not free. There is a 60-day trial, which you can sign up for. This skill requires internet access, so if an I/O controller is also being used, a second WiFi connection, wired network, or direct UART connection to the PC is needed. Main Window 1. Load Credentials Button This opens the location of your Google speech recognition key...


This skill will record audio from your default audio input device and allow you to play it back through the EZ-B v4 speaker. The Settings menu for this skill will enable you to specify some effects and the sample rate. You may also adjust the recording level to begin recording audio automatically. The audio will be played once the level drops below the threshold. The most recent recording in the buffer can be exported to a SoundBoard EZ-B v4. This allows you to save the current recording from...


Use this skill to send or receive MIDI messages from musical instruments or controllers. Scripts can be added to each received note in the configuration window. Use the ControlCommand() to send MIDI messages to instruments. Any computer with a soundcard will also have a MIDI device, usually called Microsoft GS Wavetable Synth, which will output the instrument sounds through the soundcard output. Otherwise, if you have an external MIDI device, such as a drum machine or keyboard, it can be selected as...


The MP3 Trigger is a shield that connects to the EZ-B via a serial port. The MP3 Trigger takes a mini SD card with MP3s loaded on it. The MP3s can be triggered from this control. This control and hardware have been deprecated by the EZ-B v4 streaming audio feature. Use the Config button to select the digital port and baud rate of the MP3 Trigger. Note: Synthiam is not the manufacturer of this third-party hardware device, nor is EZ-Robot responsible for the operation of this third-party device....


This skill is an example, with source code, of how to play audio out of the EZ-B while making a custom robot skill. The EZ-B will output audio as a stream or byte array. View the source code of this example to see how it was done. If you are making a skill, be sure to follow the skill tutorial here: https://synthiam.com/Docs/Create-Robot-Skill/Overview If you want your robot to play audio out of an EZB that supports audio stream, have a look at the SoundBoard skill, or many others in the Audio...


The Sound movement skill is for embedded devices on your robot that have two integrated microphones for the left and right channels. This skill will allow your robot to respond to which side the sound is coming from. A script can be applied to each direction of the sound to control movement. *note: not all computers have stereo microphones or stereo microphone inputs. Verify that your computer mic input is stereo, otherwise this robot skill will receive mono audio. Main Window 1. Left...


Execute a script when sound is played out of the EZB speaker. This robot skill will also set 3 variables for the sound level that your script loop can access. The variables include Min Level, Max Level, and Average Level. Because the script is executed once when audio starts playing, it is advised that your ongoing script be contained within a loop. Once the audio has stopped playing, the script will cancel and therefore stop running your loop. *Note: To avoid infinite recursive run-away...
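
Since the script runs once when playback starts and is cancelled when playback stops, an ongoing loop is the natural pattern. A sketch in ARC JavaScript; the level variable name ($AudioAverageLevel) is an assumption for illustration, so substitute whichever Min/Max/Average variable this skill actually sets.

```javascript
// Runs when audio starts playing out of the EZB speaker.
// The skill cancels this script when playback ends, so the loop is safe.
// Per the note above, do NOT speak text from this script.
while (true) {
  // Hypothetical variable name -- use the average-level variable set by this skill.
  var level = getVar("$AudioAverageLevel");
  print("Current audio level: " + level);

  // Poll a few times per second; adjust to taste.
  sleep(250);
}
```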


Use your EZ-B's audio output to control servos! You can control many servos with this skill by using the Multi Servo option in the settings or adding multiple instances of the robot skill. Specify the scalar to increase the dynamic range of the audio relative to the servo's position. Don't worry if that sounds confusing; play with it, and see what you get. Use this skill to move the mouth of your robot. If a track is playing, you can move your robot's mouth to the audio level. Main Window 1. Level...


Use your PC's audio input device (microphone) to control servos! You can control many servos with this skill by using the Multi Servo option in the settings or adding multiple instances of the skill. Specify the scalar to increase the dynamic range of the audio in relation to the servo's position. Don't worry if that sounds confusing; play with it, and see what you get. Use this skill to move the mouth of your robot. If you speak into the microphone, you can have your robot's mouth mimic your...


Use your PC's audio output (speakers) to control servos! You can control many servos with this skill by using the Multi Servo option in the settings or adding multiple instances of the skill. Specify the scalar to increase the dynamic range of the audio relative to the servo's position. Don't worry if that sounds confusing; play with it, and see what you get. Use this skill to move the mouth of your robot. If a track is playing, you can move your robot's mouth to the audio level. Main Window 1. Level...


This soundboard will play audio files through the EZ-B v4 speaker. You may load WAV or MP3 files into the library to have multiple audio files. Use the ControlCommand() scripts to trigger specific audio files. Main Window 1. Stop Button This button stops the audio coming from the Soundboard. 2. Clean Button If audio files are deleted from the tracklist, they will leave a blank row. This button removes the blank rows from the tracklist. 3. Clipping Indicator If the volume level bar is too...
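
A hedged ARC JavaScript sketch of triggering a track programmatically. The window title and the per-track command names ("Track_0", "Stop") are placeholders; the project's Cheat Sheet tab lists the exact ControlCommand() strings generated for your tracklist.

```javascript
// Trigger a specific track in the Soundboard (EZB) skill.
// "Soundboard v4", "Track_0", and "Stop" are placeholders -- copy the
// exact window title and per-track commands from the Cheat Sheet tab.
ControlCommand("Soundboard v4", "Track_0");

// Let the clip play for a moment, then stop playback.
sleep(3000);
ControlCommand("Soundboard v4", "Stop");
```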


This Soundboard will play MP3 or WAV files out of the selected default sound output device on your computer. Load files into the tracklist and use the Play button to trigger them. This is a great solution for adding digital sound effects to your project. Main Window 1. Stop Button This button stops the audio coming from the Soundboard. 2. Clean Button If audio files are deleted from the tracklist, they will leave a blank row. This button removes the blank rows from the tracklist. 3. Track...


This Soundboard will play MP3 or WAV files out of your PCs default sound output device. Scripts can be added to the timeline of each audio file for automation. You can use the scripts to create dances and movement events that trigger at specific times along the audio file.  This is similar to the soundboard (PC) but adds the ability to apply scripts to the audio file timeline.  Configuration Scripts can be added to each audio file triggered during the playback timeline. Right-click in the...


Execute scripts based on input from any speech to text recognition.


Using the Microsoft Windows Speech Recognition Engine, this skill uses your computer's default audio input device and listens for known phrases. Phrases to be detected are manually configured in the Configuration menu, and custom actions (via script) are assigned to your phrases. Most robots make a lot of noise, so locating the audio input device on a robot is impractical. It is best to place the microphone on the controlling PC/laptop, on yourself, or somewhere in the room (away from the robot...


Execute a script when speech is synthesized. With this skill, you can create a function that will move servos or LEDs based on the spoken speech. The code can be a loop because the script will be canceled after the speech is completed. Variable The text that is currently being spoken is stored in the variable $SpeechTxt. *Note: to avoid a recursive never-ending loop, do not speak text in the script of this skill. If you do, the text will call this script, which will call this script, which will...
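
A small sketch using the documented $SpeechTxt variable. The Servo.setPosition() call and port d0 are assumptions for illustration; the key points from above are that the script may loop (it is cancelled when speech ends) and that it must not speak text itself.

```javascript
// Runs whenever ARC text-to-speech starts speaking.
// $SpeechTxt holds the phrase currently being spoken (documented above).
var phrase = getVar("$SpeechTxt");
print("Robot is saying: " + phrase);

// Flap a jaw servo while speech is in progress.
// Servo.setPosition() and port d0 are assumptions -- adapt to your own
// servo port. Do NOT speak text in this script, or it will retrigger itself.
while (true) {
  Servo.setPosition(d0, 80);
  sleep(150);
  Servo.setPosition(d0, 100);
  sleep(150);
}
```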


This text-to-speech skill will verbally speak the user-defined phrase from your PC's default audio output device or EZ-B. However, an alternate way for your robot to speak programmatically from code is to use the SayEZB() or Say() commands in Blockly, JavaScript, or EZ-Script. Main Window 1. Text Field This field contains the text you would like spoken; it can be as long as you want. 2. Say (PC Speaker) Button This will output the text-to-speech through the PC's audio output...
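
A minimal scripting sketch of the two output paths mentioned above. In ARC's JavaScript the speech calls live under the Audio namespace; the EZ-Script equivalents are Say() and SayEZB(). The phrases are illustrative.

```javascript
// Speak out of the PC's default audio output device.
Audio.say("Hello, I am your robot.");

// Speak out of the EZB's speaker instead (for EZBs that support audio streaming).
Audio.sayEZB("Now speaking from the EZB speaker.");

// EZ-Script equivalents: Say("...") and SayEZB("...").
```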


Use this robot skill to adjust the speech settings and audio effects for spoken speech synthesis on EZB index #0. Main Window 1. Voice Drop-down This drop-down contains a selection of installed voices. 2. Emphasis Drop-down This drop-down contains a selection of speech emphasis levels. 3. Rate Drop-down This drop-down contains a selection of speeds for the emphasis of speech. Do note that the Rate will have no effect unless Emphasis is configured to Not set. 4. Volume Slider This slider...


Deprecated This robot skill has been deprecated and will be removed in the next update (after 2020-11-02). However, do not be alarmed because it has been replaced with a new and better version HERE This skill will bind to the Text To Speech engine. This skill will move the specified servos to simulate a mouth movement whenever a text-to-speech script commands (i.e., Say, SayEZB, SayWait, SayEZBWait). If your robot has servos connected to simulate a mouth, this skill will move those servos...


This skill will bind to the Text-to-speech engine and move servos to simulate jaw movement when speaking without writing code. This skill will move the specified servos to simulate a mouth movement whenever an ARC text-to-speech command is executed (i.e., Say, SayEZB, SayWait, SayEZBWait). If your robot has servos connected to simulate a mouth, this skill will move those servos while speaking. If the robot skill that is speaking is not using the built-in Say() commands (such as the Azure Text To...


AKA the worst speech recognizer - by request of users who wish for unusable open dictionary offline speech recognition. Unlike the regular speech recognition control which allows pre-defined phrases or the Bing Speech Recognition which works, this is an open dictionary, allowing any combination of words. However, the implementation of this type of speech recognition is not great... not great at all! You can get acceptable results sometimes by using a handheld microphone and very well trained...


Voice Activity Detection (VAD) detects the presence or absence of human speech. This uses an advanced algorithm to detect human voice in the microphone input of the PCs default audio device. When a human voice is detected or lost, a respective script will run. Voice Detected When voice is detected, the graph will display in green and the voice start script will execute. Voice Not Detected The display will be red when there is an absence of a human voice and the voice end script will execute.


Watson Speech to Text is a cloud-native solution that uses deep-learning AI algorithms to apply knowledge about grammar, language structure, and audio/voice signal composition to create customizable speech recognition for optimal text transcription. Get Version 11 Version 11 (2020-11-03) ================== compatibility with ARC's new version Version 10 (2020-10-20) ================== Minor changes Version 9 (2020-10-08) ================== I decided to break the plugin into multiple plugins to help...


With Watson Text to Speech, you can generate human-like audio from written text. Improve the customer experience and engagement by interacting with users in multiple languages and tones. Increase content accessibility for users with different abilities, provide audio options to avoid distracted driving, or automate customer service interactions to increase efficiencies. You will need an IBM cloud account (Free Tier) Watson Text To Speech Sign up for IBM Cloud Log in to IBM Cloud IBM Cloud...


Use Microsoft Windows built-in speech synthesis and recognition engine.




Camera Robot Skills

The Camera category in Synthiam ARC facilitates visual perception for robots. With camera-related functionalities, robots can capture images, process visual data, and perform tasks such as object recognition and tracking. This category is fundamental for developing vision-based robotic applications.

This skill will overlay an image on any detected object, face, color, or glyph. Any detectable tracking type in the ARC Camera skill can be used. Simply select your image and voila! It's best to use a transparent PNG. Main Window 1. Attach/Detach Button Attaches (or detaches) the loaded image to the first instance of the Camera skill. Once attached, the overlay will display on the detected area inside the camera skill. 2. Load Image Button This button loads an image. Browse to the location...


Add this skill to your ARC project and bring the camera to life. The video will be interactive, where you can click on objects to center the camera on them. There are hot spots along the edges that move the camera as well. Main Window 1. Attach Button This button will add the Camera Click servo functionality to the Camera skill in ARC. Settings 1. Horizontal Servo These settings configure the Pan servo along the horizontal plane. 2. Vertical Servo These settings configure the Tilt servo along the vertical...


This skill uses any camera installed on your PC or robot to combine computer vision tracking with movement and data acquisition. Computer vision is an experimental technology that requires a clean and bright environment to detect objects and colors accurately. If a camera is mounted on your robot, you can use this skill to track color objects, motion, human faces, or view the camera image. The attributes of the detected image can be adjusted manually in this skill to remove any false positives....


This skill only works with the Object tracking inside the Camera Device skill. This skill displays the detected object name on the Camera Device skill video stream. Main Window 1. Attach Button This button will add the Camera Overlay functionality to the Camera Device skill in ARC. How to Use the Camera Overlay Skill 1) Add a Camera Device skill to your ARC project (Project - Add Skill - Camera - Camera Device). 2) Add a Camera Overlay skill to your ARC project (Project - Add Skill - Camera -...


This skill will save snapshots from an active camera in the camera device skill. It will save a picture to your drive or device storage. The image will be saved to a folder called My Robot Pictures on your drive in the Pictures directory. This skill can also take a snapshot at a time interval, modified in the settings menu. You may also instruct the Camera Device to take photos programmatically through code. The script: controlCommand(Camera Snapshot, CameraSnapshot) can be called to instruct...
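
A short ARC JavaScript sketch using the ControlCommand documented above. "Camera Snapshot" is the skill's window title; adjust it if the skill was renamed in your project.

```javascript
// Take a snapshot every 10 seconds from a script, using the documented command.
// Saved images land in the My Robot Pictures folder described above.
while (true) {
  ControlCommand("Camera Snapshot", "CameraSnapshot");
  sleep(10000);
}
```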


Use an EZB that supports video as a camera source for recognition, recording and more.


Use a USB camera as a video source for recognition, recording and more.


Overlay image packs onto the camera and control them using a specified control variable. GENERAL USE Choose an image pack from the drop down menu next to the Overlay button. Set the x and y positions, the width and height, and the variable that will control this image pack. In the Auto Assign tab set the min and max values of the control variable. Press Auto Assign. Start the camera. Press the Overlay button. Press the Start button. IMAGE PACKS Image packs consist of a number of images. Each...


Use the Microsoft Cognitive Emotion cloud service to describe images. The images come from the Camera Device added to the project. This plugin requires an internet connection. Please consult the appropriate lessons in the learn section to configure your EZ-B to WiFi client mode or add a second USB WiFi adapter from this tutorial. Currently Disabled This robot skill will only return an error because Microsoft has discontinued this service for their AI ethics....


Use the Cognitive Face cloud service to detect faces, describe emotions, guess age and get the person's name from a worldwide database. The images come from the Camera Device robot skill added to the project. This plugin requires an internet connection. Please consult the appropriate lessons in the learn section to configure your EZ-B to WiFi client mode or add a second USB WiFi adapter. Currently Disabled This robot skill will only return an error because Microsoft has discontinued this...


Use the Microsoft Cognitive Computer Vision cloud service to describe or read the text in images. The images come from the Camera Device added to the project. This plugin requires an internet connection. If you are using a WiFi-enabled robot controller (such as EZ-Robot EZ-B v4 or IoTiny), consult their manuals to configure WiFi client mode or add a second USB WiFi adapter. The Synthiam Cognitive Vision Robot Skill utilizes machine learning algorithms to enable robots...


You only look once (YOLO) is a state-of-the-art, real-time object detection system. This skill uses Tiny YOLOv3, a very small model suited for constrained environments (CPU only, no GPU). Darknet YOLO website: https://pjreddie.com/darknet/yolo/ Requirements: You only need a camera control; the detection is done offline (no cloud services). 1) Start the camera. 2) Check the Running checkbox. The detection will run continuously, and when the detection results change, an On Changes script is executed (check the...


Track faces from any of the ARC video sources.


This skill enables control of your robot's servos by moving the joints of your body, which are detected by a Microsoft Xbox 360 Kinect only. Servos can be assigned to each joint using the Settings window. The degrees to move each servo are automatically calculated for a joint relative to its connecting joint. For example, the wrist position in degrees is calculated based on the elbow position. The elbow position in degrees is calculated based on the shoulder position. Each joint can be assigned to...


This control allows you to broadcast live audio and video from the camera control to the Web. Live stream implements HLS protocol from Apple and works cross-browser. You will have to configure your router to access the live broadcast link from external networks.  - If you would like to serve a webpage with an embedded video stream without audio, check out the Custom HTTP Server. - If you are looking for receiving a live stream feed in camera control check out Live Stream Receiver. *Icon...


This control listens for an incoming live stream connection from the web and plays back the video and audio stream inside ARC. With this control, you can open a web page (currently supports Chrome and Firefox on desktop) from anywhere and start live streaming directly to the ARC camera control. Network configuration might be required to access the server. If you are looking to broadcast the camera feed to the web, check out the Broadcast Control. * Icon credit: Flat Icons


Deprecated: This plugin is now built into ARC. Find more info here: http://www.ez-robot.com/Tutorials/Help.aspx?id=231 Use the Microsoft Cognitive Emotion cloud service to describe images. The images come from the Camera Device added to the project. This plugin requires an internet connection. Please consult the appropriate lessons in the learn section to configure your EZ-B to...


Deprecated: This plugin is now built into ARC. Find more info here: http://www.ez-robot.com/Tutorials/Help.aspx?id=232 Use the Microsoft Cognitive Computer Vision cloud service to describe or read text in images. The images come from the Camera Device added to the project. This plugin requires an internet connection. Please consult the appropriate lessons in the learn section to...


*Deprecated: Any USB camera or EZ-B v4 camera video feed will be displayed through the Oculus Rift screen. The output image will be the same as the Camera Device display, with tracked objects, etc. The Oculus Rift has an accelerometer, which this plugin will use to move servos of your robot. When you move your head, the specified servos of your robot will move respectively. This has been tested with the Oculus Rift DK2. To look left and right, tilt your head left and right - rather than...


Omron HVC-P plugin for ARC (onboard computer). This is used in Rafiki. These are some case STL files so that you can protect this sensor while messing with it. OmronCase2.stl OmronCase1.stl OmronCaseextrusion.stl You need to install the following... Download Python_Installs.zip. Unzip it and install Python and PySerial. FTDI Friend - Adafruit is what I use to connect from the serial port to the Omron. The pin layout is as follows on the back of the Omron. Ground is the first pin toward the...


Required Download And Installation: Download Python_Install_Zip. Unzip it and install Python and PySerial. This is an updated version of the original plugin David Cochran created for the Omron HVC-P, which used the Omron EvaluationSoftware_v1.2.0 software for trained faces. My version of the plugin includes the updated EvaluationSoftware_rev.2.4.1 software for training faces. It works with both the original HVC-P and the HVC-P2. As with David Cochran's original plugin, you will need to use an ARC...


Omron HVC-P2 plugin for ARC Windows, onboard PC required. This will also work with the original HVC-P. Required Download And Installation: Download Python_Install_Zip Unzip it and install Python and PySerial. This was created to be used with a second Omron HVCP(2) camera if you are using 2 as the variables have an extension _2 in ARC. Please Note Per Omron technical support, it is recommended for each camera to have their own album of saved user faces. It is not recommended to share album data...


Integrate state-of-the-art image generation capabilities directly into your robot apps and products. DALL-E 2 is a new AI system that can create realistic images and art from natural language descriptions. Have your robot programmatically generate images from descriptions provided by speech recognition robot skills. Or, have the image from the camera sent to DALL-E for its AI interpretation. Experience the wacky world of AI image generation with your Synthiam-powered robot or webcam. How Does It Work?...


This skill superimposes a camera video stream on top of another camera video stream. Main Window 1. Active Checkbox This checkbox will add the Source Camera Device video stream on top of the Destination Camera Device video stream. Configuration 1. Source Camera Device selection This selects the camera device video stream that will be overlaid onto the destination video stream. 2. Destination Camera Device selection This selects the destination stream that the source video stream will be overlaid onto....


Using the camera, this skill will allow you to add programming commands by holding up pieces of paper with images printed on them. The concept is to allow programming instructions using visual representations. Each card has an image that represents a specific command, such as move forward, turn right, turn left, or reverse. Using cue cards, the robot can be programmed for specific movements to navigate a maze, for example. The order in which the cards are shown to the robot is stored in memory...


The QR Code Generator will create a QR Code with the text you enter. By default, the QR Code text is Synthiam, and the QR Code Graphic is the same. Using your phone's QR Code scanner app to scan the graphic will say Synthiam. This control works in conjunction with the Camera Control. QR codes, short for Quick Response codes, are two-dimensional barcodes that have become ubiquitous in various industries, including robotics. These codes consist of a pattern of black squares on a white background...


Capture the output display of a robot skill and send it to a camera device. Specify the robot skill to capture, and it will send the video to the selected camera device. The Ultrasonic Radar Scan area is captured in this screenshot below and sent to the Camera Device as a video stream. Usage - You will need a camera device added to the project. Select the camera device from the configuration menu. - In the camera device, select the Custom option in the device list. Then press the start button....


Rubik's Cube Solving Robot skill. This skill is to be combined with a specific robot project to build. Find the robot project here: https://www.thingiverse.com/thing:2471044 *** Version 5 *** Fix for ARC 2020.02.19.00 release *** First Calibrate Arms Grippers: Main Action: Demo:


Capture any area of the screen and send it to a camera device. Specify the screen area to capture, and it will send the video to the selected camera device.  You will need a camera device added to the project. Select the camera device from the configuration menu.  In the camera device, select the Custom option in the device list. Then press the start button. *Note: the display resolution scaling must be 100% for accurate capture area.


The Sighthound Cloud Detection API returns the location of any people and faces found in robot camera video. Faces can be analyzed for gender, age, pose, or emotion; and a landmark detector can find the various facial features in the detected faces, including eyes, nose and mouth, by fitting 68 landmark points to those features. *Requirement: This plugin requires ARC 2019.12.11.00 or higher Variables are set with information that has been detected. This plugin requires a Camera control to be...


Stream all video sources from any video URI protocol or codec (RTMP, m3u, m3u8, Mkv, MOV, mpg, etc.). The video stream is sent to the selected camera device. This supports webcams or any type of video device/service that provides a video feed over a network.  Protocol Types The URL can be a number of different protocol types that specify an end-point feeding a compatible codec. Some supported protocol types that can be specified in the URL are... - http://xxx.xxx.xxx.xxx:[port]/path -...


Select one of the included templates, or select your own. The images are translucent PNG files which are overlaid on each frame of the camera stream. Main Window 1. Attach/Detach Button This checkbox will add the Source Camera Device video stream on top of the Destination Camera Device video stream. 2. Status Field This field will display the status of the connection to the camera device skill and any errors that occur. 3. Load Image Button This button allows you to load your own custom target...


Display a variable on the processed camera device image. Specify the X/Y coordinates of the variable location, and the variable name. There are ControlCommand() for attaching the skill to a specific camera device, or use any available device.


Object detection is fundamental to computer vision: Recognize the objects inside the robot camera and where they are in the image. This robot skill attaches to the Camera Device robot skill to obtain the video feed for detection. Demo Directions 1) Add a Camera Device robot skill to the project 2) Add this robot skill to the project. Check the robot skills log view to ensure the robot skill has loaded the model correctly. 3) START the camera device robot skill, so it displays a video stream 4)...


In order to train objects to the camera device, the Train Object menu is used in the camera device skill. This menu normally requires human intervention to enter the object name and use the mouse. This skill allows your robot to learn objects in real time, triggered programmatically by ControlCommand() scripts. Take a look at the Cheat Sheet within the Script skill settings to see what ControlCommand() calls this skill accepts. Main Window 1. Beginning Learning Button This button will attach/detach...


Record any video source to a local file.


*Note: Vuzix has decided to no longer support the Synthiam platform with their newer products. Therefore, this control is limited to the deprecated 920VR headset, which may be found on eBay. There will be no further development on this control. For headset support, we recommend the Virtual Reality Robot. The Vuzix augmented reality control enables connectivity between your robot and the Vuzix VR glasses. When the VR module is included with the Vuzix glasses, this allows control of your robot's...




Communication Robot Skills

The Communication category in Synthiam ARC focuses on enabling robots to interact and exchange information with other devices. This includes communication protocols, data sharing, and network connectivity. It is crucial for building robots that can collaborate and communicate in various environments.

Retrieve Twitter tweets and process them to have your robot do cool stuff via Twitter posts. This skill accepts two ControlCommand() calls for retrieving either the latest mention or the latest tweet. Mention: A mention is when another Twitter account sends a message via @UserName to your account. Any tweet tagging your @username will be retrieved. Tweet: A tweet comes only from your account. This means only tweets from your account will be retrieved. Main Window 1....


This robot skill displays your system's COM/serial ports opened by script commands. The script commands for accessing COM ports are in the script manual for your specific language (EZ-Script, JavaScript, or Python) in the website's support section. If you want to display communication data in real time, use Microsoft's PortMon. This robot skill shows if a COM port is open or closed by a script. There is no way to view the communication log of the COM port in this robot skill. *Note: these are...


The Comm Expansion Configuration is an add-on to the EZ-B v4.x/2, which can be configured for two modes. This control is a utility which is used to configure the Comm Expansion module. UART Mode Enables the Comm Expansion UART mode, which allows a transparent proxy of UART data between the PC and Comm Expansion modules UART. USB Mode Connect a USB cable from the Comm Expansion adapter to the PC for direct USB communication rather than WiFi. This benefits robots with onboard computers to not be...


Let's assume you have 2 or 3 of the same robot connected to a single instance of ARC on a computer. This skill allows you to send the same commands to all the selected EZ-Bs. If all of the robots are the same, perhaps JDs, you can have them all dance at the same time, synchronized within a single project. All data that is sent to the selected master EZ-B will also be sent to all other slave EZ-Bs. Main Window 1. Pause Checkbox This will disable the Command Duplicator when selected....


Use this utility to diagnose connection issues. When this utility is loaded, verbose logging of the EZ-B communication will be displayed in this window. Main Window 1. Copy Log to Clipboard Button This button will copy the log to the clipboard so you can paste it into a word processing program. 2. Ping EZ-B Button This button will send a ping to 192.168.1.1 and look for the reply. The text field is updated with the status of the ping. 3. Load Diagnostic Report Button Allows you to...


Connect ARCx to microcontrollers with the EZB firmware installed.


Use the connection skill to connect from an EZ-B I/O controller to ARC. Connections to an EZ-B are specified as COM ports or IP addresses. There are five connections in this skill, but if your robot requires more, use the ezb connections skill. ARC supports connecting to many I/O controllers and robot products, and you can view a list of supported EZB hardware here. Main Window 1. EZB Board Index This column displays the index for each connection. Note that most skills are configured to work...


This skill allows you to add up to 255 additional EZ-B I/O controllers to the project, in addition to the five offered in the Connection control. To learn more about EZ-B connections, see the Connection skill page. Main Window 1. Add Button This button adds more connections (up to index #255). 2. Delete Button This button removes connections starting from the highest index number. 3. Connection Number This column displays the number for each connection (5 to 255). 4. Connect Button Once the COM port or...


Have you wanted to run your own MQTT Broker? Now you can with this skill. MQTT (MQ Telemetry Transport) is a publish-subscribe-based lightweight messaging protocol for use on top of the TCP/IP protocol (port 1883). The publish-subscribe messaging pattern requires a message broker. The broker is responsible for distributing messages to interested clients based on the topic of a message. How EZMQTT Works 1) Start the EZMQTT Broker 2) In the EZMQTT Client Configuration enter the Broker IP 3) In...


MQTT (MQ Telemetry Transport) is a publish-subscribe-based lightweight messaging protocol for use on top of the TCP/IP protocol (port 1883). The publish-subscribe messaging pattern requires a message broker. The broker is responsible for distributing messages to interested clients based on the topic of a message. This is an MQTT Client Control that connects to an MQTT broker and publishes/subscribes to topics. When messages are published to matching topics, the incoming data will be assigned to...


Get an RSS feed and assign each response to a variable in an array. Use the variable to speak the response or perform parsing, etc. Send a ControlCommand to the GetRSS robot skill plugin with the URL of the RSS feed. The response will be populated in global variable arrays. Variables The request of the RSS feed is initiated by the ControlCommand from a script. The response will be populated in global variables after the request. $RSSSuccess - bool This will return a true (1) or false (0) about...
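
A hedged ARC JavaScript sketch of the request/response flow above. The ControlCommand window title and command name are placeholders (copy them from the skill's Cheat Sheet); the $RSSSuccess variable and the URL-in-ControlCommand pattern are documented above, while the feed URL here is just an example.

```javascript
// Ask the GetRSS skill to fetch a feed. The command string is a placeholder --
// copy the exact ControlCommand() from the skill's Cheat Sheet.
ControlCommand("GetRSS", "GetRSS", "https://www.example.com/feed.xml");

// Give the request a moment to complete.
sleep(3000);

// $RSSSuccess is documented above: true (1) on success, false (0) on failure.
if (getVar("$RSSSuccess") == 1)
  Audio.say("The feed was retrieved. Check the response variables for the items.");
else
  print("RSS request failed -- check the URL and your internet connection.");
```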


Execute scripts based on network connection change activity. Such as if WiFi connects or disconnects. This is useful for headless SBCs mounted on robots. When the computer boots and ARC is configured to load on startup, configure this robot skill to announce to the speaker the status of the wifi connection. *Note: information on configuring an SBC for headless operation here: https://synthiam.com/Support/Tips%20to%20Make%20a%20Robot/headless-robot-sbc Configuration 1) Network Status Variable -...


Push robot status notifications to iOS (iPhone/iPad), Android, and other devices, including Windows. A PushOver.net account is required to use this robot skill. Visit https://pushover.net/ and create an account. Use this robot skill to push the statuses of your robots. Have the robot send a push notification to your mobile device when the battery is low or if the robot gets stuck - anything you can dream up. Stay in contact with your robot wherever you are. Device App Install the...


We also include two types of serial terminals in ARC. This is great for connecting and debugging serial hardware. The PC terminal will use the COM ports on your computer, and the EZB terminal will use the UART on the EZ-B. *Note: If you wish to use the PC Terminal, remember that most hardware is TTL (not RS-232). This means you will need an RS-232 to TTL USB converter to connect to your PC. They are plentiful online. Main Window 1. COM Port Drop-down This drop-down allows you to select the...


Send emails using SMTP from your robot using ControlCommand() syntax. You must first configure this skill. The SMTP server, username, etc., must be completed. Check with your SMTP server host for the correct settings. *Note: This skill will save the email username/password in the project for convenience. If you distribute your project, beware that the email credentials will also be included. Main Window 1. Status Display Displays the email address of the recipient of the email. 2. Script Start...
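
Only the fact that emails are sent with ControlCommand() is stated above; the window title, command name, and parameter order in this ARC JavaScript sketch are placeholders to illustrate the call shape, so copy the real strings from the skill's Cheat Sheet tab.

```javascript
// Send an email through the pre-configured SMTP settings.
// "Send Email" and "SendEmail" (and the argument order) are placeholders --
// copy the exact ControlCommand() from the Cheat Sheet tab of this skill.
ControlCommand("Send Email", "SendEmail",
  "someone@example.com",                 // recipient (example address)
  "Battery warning",                     // subject
  "The robot battery is getting low.");  // body
```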


This TCP Client is essentially an ASCII telnet client - without any terminal emulation (i.e., VT100). The raw data from the host is displayed. There is also an option to display all incoming data as HEX values for additional debugging. Main Window 1. Edit Button This button allows you to copy, paste, or clear the Status field. 2. Server Address Field Enter the IP address and port number of the server you would like to connect to. 3. Connect Button This connects your client to the...


If you have TCP Server enabled (under Config in the Connection), you can use this control to monitor incoming connections.


This TCP server will bind to a port and listen for incoming EZ-Script, JavaScript, or Python commands. This is a useful plugin for connecting third-party controllers and products to ARC. You may send any command of the specified language, terminated by a new line (\r\n). The response received is also terminated by a new line (\r\n). Control Command There are also ControlCommand() options for starting, stopping, and controlling debug settings. This is useful if you wish to control this plugin from...
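
To illustrate the newline-terminated request/response protocol described above, here is a hedged Node.js JavaScript sketch of an external client. The host, port, and the command sent are assumptions; use whatever port this skill is configured to listen on and any valid script command.

```javascript
// Minimal Node.js client for the ARC TCP script server.
// Host and port are assumptions -- substitute the values configured in the skill.
const net = require("net");

const socket = net.connect({ host: "127.0.0.1", port: 6666 }, () => {
  // Commands are terminated with \r\n, as described above.
  socket.write('print("hello from a remote client")\r\n');
});

socket.on("data", (data) => {
  // Responses are also terminated with \r\n.
  console.log("ARC replied:", data.toString().trim());
  socket.end();
});

socket.on("error", (err) => console.error("Connection failed:", err.message));
```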


This skill will query a Twitter feed to pick up custom phrases that you have pre-configured. Your robot can be controlled by reading tweet commands. You can configure this robot skill so that if someone tweets move forward, your robot can begin moving forward. It operates similarly to the Speech Recognition skill in that the detected phrases can execute scripts. Note: This skill requires an internet connection and a Twitter account. If you are using a WiFi-enabled EZB, you will require an...


A client to send and receive data to a WebSocket, assign the response to a variable, and execute scripts. Use ControlCommand to open a connection to a remote host and send data. The received data will be assigned to a variable and the specified script will execute. Once a connection is established to the server using the ControlCommand Open, your program may now send messages to the server via the ControlCommand Send. Any messages received from the server are stored in a variable and the script...
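
A short ARC JavaScript sketch mirroring the Open and Send ControlCommands mentioned above. The skill's window title, the server URL, and the payload are illustrative assumptions.

```javascript
// Open a connection to a remote WebSocket server, then send a message.
// "Open" and "Send" mirror the ControlCommand names mentioned above; the
// window title, URL, and payload are illustrative.
ControlCommand("WebSocket Client", "Open", "ws://192.168.1.50:8080/robot");
sleep(1000);

ControlCommand("WebSocket Client", "Send", "status request");

// Any reply from the server is stored in the skill's response variable and
// triggers the configured response script.
```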


A WebSocket server for ARC that accepts client connections and executes scripts for each message received. The received message contents will be stored in the respective variable. Configuration Menu 1) The Response Variable will hold the text response message that is sent from the client. 2) Response Script will be executed for every response that is sent from the clients. Within this script, the response message is available in the Response Variable. By default, an example script will speak...




Digital Robot Skills

The Digital category in Synthiam ARC deals with digital signal processing and manipulation. It includes functions for working with digital sensors, logic operations, and data processing. This skill set is essential for handling digital information in robotic applications.

This skill gets the digital status of a port. It detects a TTL voltage signal low (0V) or a TTL voltage signal high (+3.3V or +5V, depending on the I/O Controller). Main Window 1. Digital Status Indicator This Digital Status Indicator changes color based on the detected input voltage. The color will start as black before an I/O controller is connected. Once an I/O controller is connected, the indicated color will be red for a voltage signal low (0V) and green for a voltage signal high (+3.3V...


This skill sets the digital status of a port. It generates a TTL voltage signal low (0V) or a TTL voltage signal high (+3.3V or +5V, depending on the I/O Controller). Main Window 1. ON/OFF Button This button turns a digital port ON (green) or OFF (red). Settings 1. Name Field Enter a name for your Set Digital skill. 2. Board Drop-down Select your board number. 3. Port Drop-down Select your Digital port number. How to Use Set Digital 1) Add the Set Digital skill to your ARC project...




Display Robot Skills

The Display category in Synthiam ARC allows robots to present information visually. It includes features for controlling displays, showing images, and rendering graphical content. This skill set is valuable for creating robots that can convey information to users in a clear and engaging manner.

Full-featured Chrome web browser robot skill that can be configured using control commands. Main Window 1) Print the current page 2) Maximize or restore the window state 3) Open the current page in the PC default browser 4) The current URL or enter a custom URL (press enter to browse) 5) The web content Config Window The config window, which displays settings that can be configured for the robot skill's behavior, can be accessed by pressing the three little dots on the skill's top-left menu. 1)...


Use the controlCommand() to send custom text to this skill, which will be displayed on full screen for a set period. The configuration menu allows you to adjust the text font, text color, and background color. There is a default timeout value as well. All controlCommand() for this skill can be found in the Cheat Sheet tab. See the video below for an example of use. Main Window 1. Display Popup Nothing to interact with. Use the configuration to change parameters such as timeout, text color,...
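
A hedged ARC JavaScript sketch of pushing text to the full-screen display. The window title and command name are placeholders; the exact ControlCommand() strings are listed in the skill's Cheat Sheet tab.

```javascript
// Pop custom text onto the full-screen display for the configured timeout.
// "Display Popup" and "DisplayText" are placeholders -- copy the real
// ControlCommand() from the Cheat Sheet tab.
ControlCommand("Display Popup", "DisplayText", "Hello! I am online.");

// Example: announce and display the same status message together.
Audio.say("Hello! I am online.");
```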


Play videos in a fullscreen window that sits on top of everything else. Excellent for facial animations or user messages. Use ControlCommands to specify videos to play fullscreen. The ESC key can be pressed while a video is playing to cancel the video. Once the ESC key is pressed, this robot skill will not play another video until the Pause New Playback checkbox has been unchecked. Robot Skill Main Window 1) This checkbox will be checked when the ESC key is pressed during video playback. This...




Games Robot Skills

The Games category in Synthiam ARC introduces entertainment and gaming elements to robotic platforms. It includes functionalities for creating interactive games and activities that engage users with the robot. This category adds a playful dimension to the robot's capabilities.

This skill is a tic tac toe game that you can play with your robot. Configure script routines for many different options to add interesting behaviors as your robot kicks your butt at tic tac toe. There are even Cheat Sheet commands for additional control of the game board. For example, set up speech recognition to play with your voice! Main Window 1. Reset Game Button When pressed this will start a new game. 2. Robot Response Display This is where the robot will have a text based response. 3....




General Robot Skills

The General category in Synthiam ARC encompasses a wide range of general-purpose robot skills. It includes versatile functionalities that can be applied across different robotic applications, providing a foundation for building custom behaviors and features.

For debugging your EZ-B. This will demonstrate the speed of commands and the reliability of the connection. Use this for two reasons: 1) To verify the speed of your EZ-B connection and computer. 2) To verify the reliability (or diagnose issues) of the connection or EZ-B. Data Channel Flood Protection If you load the settings screen of the connection control, there is an option for the EZ-B v4.x/0, and EZ-B v4.x/1 called Data Channel Flood Control. The flood control will add a slight delay...


This skill helps you troubleshoot an issue you might be facing. It enables you to figure out where you might be experiencing an error, your settings, or which version you are running. *Note: The debug window at the bottom of the ARC interface can also be activated from the Options menu. When this skill is added to ARC, the bottom popup debug will be deactivated, and all log text will be routed to this skill. Main Window 1. Information Display Displays the date in YYYY-MM-DD format, time in...


If you’re having trouble connecting to an EZB or robot and the tutorials aren’t helping, you may need some assistance. In this particular case, someone from Synthiam might be able to help by reviewing your diagnostic information. How To Submit Report 1) Install the latest version of ARC. The most recent version of ARC must be installed. 2) Connect to the Internet on your PC. An internet connection will be necessary for the data to be transmitted to the Synthiam server. 3) Load ARC. 4)...


This skill was created for schools or owners of many robots with a dedicated WiFi network configuration for the EZ-Robot EZ-B v4. This robot skill does not work with Arduino or other controllers; it is only for EZ-Robot products. Use this skill to quickly change the WiFi settings of the connected EZ-B v4 inside the ARC software. If the reset button is pressed on the EZ-B v4, the WiFi settings will be set back to default. You can use this tool to configure settings for an EZ-B after reset rather...


This utility allows control over the hardware settings of the EZ-B. You can control the Bluetooth name of the EZ-B and apply any updates to settings that may have been altered.


The EZ-B v4 has an internal temperature sensor and voltage monitor. This control will display both of the values from the EZ-B v4. The battery monitor is built into the EZ-B v4. The settings for the LiPo battery protection can be configured in the Connection Control Settings.


The EZB robot controllers created by EZ-Robot have WiFi capability. Each EZB can be named to describe its use or placement within a robot. When the EZBs are powered, connect to WiFi, and obtain an IP Address, they will broadcast that information. This robot skill allows you to connect to an EZB by the name rather than IP. When a new EZB is detected by its broadcast, it will be displayed in the log window.  Multiple EZBs Some advanced robots will have multiple WiFi EZB controllers. In this case,...


This control will display the list of files currently open for reading by script file methods. This will not display files that are being written to. The files for writing are not opened because the data is appended to the file and the file position is unnecessary. Files are only open when specified to open for reading. If a file is opened for reading, it's a good idea to close it before writing to it. If you do not close the file, it will be automatically closed when a new project is loaded....


This is a very simple skill for keeping notes. It's great for keeping track of connections and project notes. Main Window Add your notes to the body of this window. The text color and background will display according to the window theme you have chosen (Options - Preferences - Window Theme). How to use Notepad 1) Add the Notepad skill to your ARC project (Project - Add Skill - General - Notepad). 2) In the main window, add the text note that you'd like to keep in your ARC project.


This skill will record your screen to a video file. You can use ControlCommand() to start or stop the recording, or simply press the skill buttons. This skill will record in the WMV1, WMV2, or H263P formats. This can be used to generate video snippets for tutorials or demonstrating how to perform certain tasks. Upload the video files to YouTube and share with the community. Main Window 1. Start/Stop Button Start recording the screen with the chosen settings. 2. Choose Button Choose the folder...
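For example, a script could capture a short clip like this; the command names StartRecording and StopRecording are assumptions, so confirm the exact names in this skill's Cheat Sheet tab:
# Assumed command names - confirm in the Cheat Sheet tab
ControlCommand(Screen Recorder, StartRecording)
Sleep(10000)
ControlCommand(Screen Recorder, StopRecording)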


The shortcut creator has been moved to the ARC Options menu. See the new Shortcut Creator manual page at /Support/ARC-Overview/Options%20Menu/shortcut-creator.


A plugin to allow people to give instructions to users in the ARC software. Updates: March 19 - Huge Update New Features - You are now able to format text - Here are the current ways to format the text: Indent Outdent Font: Bold Font: Italic Font: Underline Font: Size Align: Center Align: Left Align: Right Lists: Unordered Commands: Cut Commands: Copy Commands: Paste - Images can be added alongside text (directly in the textbox) - Image shown on the left was retained, but renamed...


*This has been deprecated because the browser it uses doesn't work well. It needs to be updated to use Chromium, which, surprisingly, is included in ARC's distribution. This is a web browser control which connects to the Synthiam community forum. The source code is available here: Tutorial65-DynamicPluginBrowser.zip




GPS Robot Skills

The GPS (Global Positioning System) category in Synthiam ARC enables robots to determine their precise geographical location. This is essential for applications that require navigation, mapping, and location-based decision-making. Robots equipped with GPS capabilities can navigate autonomously in various environments.

This robot skill has been deprecated and replaced by: https://synthiam.com/Support/Skills/Navigation/BN-880-GPS-Compass?id=21348 Read longitude, latitude, speed, and time from the u-blox NEO-6M GPS. This control will automatically initialize UART port 0 at 9600 baud for the GPS. On initial startup, the GPS must warm up to begin communicating. The GPS Status will be red and say disconnected until the GPS warms up and begins communicating with the satellites. If there are clouds or a building above...




Graphs Robot Skills

The Graphs category in Synthiam ARC focuses on visualizing data through graphical representations. It includes tools for creating charts, graphs, and visual displays of information. This skill set is useful for presenting complex data in a comprehensible manner.

Basic 4-series bar chart for comparing your robot's sensor values at a glance. EZ-Script Basic Example Usage:


Dashboard-style gauge chart with a red zone display. v0.0 initial release. This plugin is the most complex of leversofpower's chart plugins. Enjoy. Red Zone Chart (EZ-Script example)


Basic 4-series historical line chart for comparing your robot's sensor values over time. Code example 1:


The Graphy skill is used for creating graphs and plotting data using ControlCommand() scripts. Graph data can be saved as a .CSV file. Main Window 1. File Button Save the Graph data to a .CSV file 2. Remove All Series Button Clears all the graph data.  3. Graph Window Displays all the graph data. 4. Script In order to use the Graphy skill a Script must be used to populate graph data. How to Use the Graphy Skill Add Data
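As a rough sketch, a looping script could push an ADC reading into the graph; the command name AddData and the series name are assumptions, so use the ControlCommand() syntax shown in this skill's Cheat Sheet tab:
# Assumed command and series names - check the Cheat Sheet tab for the real syntax
:loop
$val = GetADC(adc0)
ControlCommand(Graphy, AddData, "ADC0", $val)
Sleep(500)
Goto(loop)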




I2C Robot Skills

The I2C (Inter-Integrated Circuit) category in Synthiam ARC facilitates communication between devices using the I2C protocol. This skill set is crucial for connecting and interacting with a variety of sensors and peripherals in a robotics system.

Driver for the EZ-Robot 3-in-1 IMU sensor. This includes a sensor for accelerometer, gyro, and compass.


*Note: The author created this robot skill for a specific use case; therefore, information on how to use it has not been published. You probably do not need to use this robot skill. A robot skill that allows communication between the EZB v4 and Arduino using I2C and can use user-made code in C#. Instructions to use this robot skill are found in the Help tab in the robot skill. The  Robot Skills plugin folder is located in the C:\ProgramData\ARC\Plugins\ in File Explorer, which is shown by going...


The BlinkM I2C multi-color LED is a popular add-on for robots. This control provides slide dials that allow custom color adjustments. For custom I2C peripherals, use the SendI2C() function in the Scripting Control. Note: Synthiam is not a manufacturer of this third-party hardware device, nor is EZ-Robot responsible for the operation of this third-party device. For operational questions, it's best to consult the device's instruction manual.
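As a heavily hedged sketch of the scripting route, the BlinkM's default address (decimal 9) and its 'n' set-color command (decimal 110) could be sent like this; the SendI2C() argument order shown here (device address followed by data bytes) is an assumption, so verify it against the EZ-Script manual before using it:
# Assumed argument order: device address, then data bytes ('n', R, G, B) - sets the BlinkM to red
SendI2C(9, 110, 255, 0, 0)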


The EZ-B already supports USB joysticks with ARC. This feature allows you to add an IR sensor to your robot and assign functions to the buttons. In future firmware, this will allow your robot to have remote control ability without needing the PC. Note: Synthiam is not a manufacturer of this third-party hardware device, nor is EZ-Robot responsible for the operation of this third-party device. For operational questions, it's best to consult the device's instruction manual.


The Sure Electronics DC-SS503V100 is an i2c dual-axis magnetic sensor. Adding this sensor to the i2c port of your EZ-B will allow your robot to understand magnetic directions. We have found that the wires on this device need to be pretty short (around 2-3 inches max). Note: Synthiam is not a manufacturer of this third-party hardware device, nor is EZ-Robot responsible for the operation of this third-party device. For operational questions, it's best to consult the instruction manual for operational...


This is an updated version of the ESP32_MPU6050 skill that supports the EZB ESP32 over WiFi and the Arduino EZB_UNO and EZB_Leonardo firmware connected through USB, with MPU6050 support. This skill uses modified versions of the EZB firmware created by DJ, has all the features of that firmware, and also supports the MPU6050. This skill provides gimbal-style servo values for every axis (like a camera gimbal) and also provides accelerometer angle values that can be used for other scripting needs....


The HMC5883 EZ-Bit module is a 3-axis compass sensor over an i2c connection. This ARC control will read data from the sensor and set the respective EZ-Script variables. The control will require an Initialization (INIT) before the sensor will return data. If the sensor is not connected, the EZ-B v4 will lock up. This control will not read data automatically. We have provided a ControlCommand() for querying the device and setting the variables. Each time you wish to receive...


This control is for the Adafruit 8x8 LED Matrix Array, which uses the HT16K33 i2c chip. Once connected to the EZ-B v4 over i2c, this control lets you create animations using the 8x8 LED Array. The Adafruit LED module can be purchased here: http://www.adafruit.com/product/870 *Note: The INIT button must be pressed once the module is connected to the i2c of the EZ-B. You may also use EZ-Script's ControlCommand() to initialize the module programmatically. Note: Synthiam is not a manufacturer of...


The Lidar-Lite Distance Sensor connects to any of the three i2c ports. The Lidar-Lite is a laser-based measurement solution supporting various applications (e.g., drones, general robotics, industrial sensing, and more). It measures distance, velocity, and signal strength of cooperative and noncooperative targets at distances from zero to more than 40 meters. *Note: You will need to solder the i2c wires from the Lidar-Lite to an EZ-B v4 I2C Peripheral Cable (25cm) to connect...


This is an i2c accelerometer that can be connected to an EZ-B that supports i2c. How It Works Interested in how the IMU Sensor works? Find out by reading this fantastic article HERE.


The MPU6050 EZ-Bit module is an all-in-one Gyro/Accelerometer/Temperature sensor over an i2c connection. This ARC control will read data from the sensor and set the respective EZ-Script variables. The control will require an Initialization (INIT) before the sensor will return data. If the sensor is not connected, the EZ-B v4 will lock up. This control will not read data automatically. We have provided a ControlCommand() for querying the device and setting the variables....


You need to flash the firmware to the Arduino. With this firmware you can use both the DHT11 and the MPU6050 sensors at the same time. EZ-Genuino_DTH11_MPU6050_Uno.zip For DHT11 wiring help, go Here. For the MPU6050, connect as shown in the picture. The variables $AngleX, $AngleY, and $AngleZ are created. Have Fun!
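For example, once the variables exist, a small script could map one of the angles to a servo; the port D2 and the direct 1:1 mapping are illustrative assumptions, so clamp or scale the value for your own servo range:
:loop
# Assumes $AngleX falls within the servo's valid position range
Servo(d2, $AngleX)
Sleep(50)
Goto(loop)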


The MPU9150 EZ-Bit module is an all-in-one Compass/Gyro/Accelerometer/Temperature sensor over an i2c connection. This ARC control will read data from the sensor and set the respective EZ-Script variables. The control will require an Initialization (INIT) before the sensor will return data. If the sensor is not connected, the EZ-B v4 will lock up. This control will not read data automatically. We have provided a ControlCommand() for querying the device and setting the...


This animator controls the Synthiam RGB 8x8 display. The animator allows you to create multiple frames using the mouse, which can be played back at specified speeds. This operates much like a flip book. *Note: The RGB 8x8 Animator skill connects to the RGB 8x8 display through an i2c interface. If the RGB 8x8 display is not connected to the i2c interface, executing the RGB 8x8 Animator will lock up the EZB. Main Window 1. Action List This section lists the animation actions you have created....


The eyes of the EZ-Robot JD Humanoid and RGB Eyes EZ-Bit contain an array of 18 RGB LEDs. This animator allows you to create colorful patterns and animations with them. *Note: The RGB Animator connects to the RGB Eyes through an i2c interface. If the RGB Eyes are not connected to the i2c interface, executing the RGB Animator will lock up the EZ-B. Main Window 1. Action List This section lists the animation actions you have created. Click on the desired action to highlight it for execution....




Infrared Distance Robot Skills

The Infrared Distance category in Synthiam ARC enables robots to measure distances using infrared sensors. This is essential for obstacle avoidance, navigation, and proximity sensing. Robots equipped with this capability can interact safely in dynamic environments.

Sharp infrared (IR) distance sensors are a great way to determine the distance to an object or wall, using reflected infrared (invisible) light. They offer excellent range and accuracy in an easy-to-use package. Their operation is not affected by soft materials, unlike ultrasonic sensors. However, they do have trouble with sunlight interference and black materials that are transparent to IR. When connecting a Sharp IR distance sensor to an EZB's ADC (Analog Digital Converter) port, this skill...


Sharp infrared (IR) distance sensors are a great way to determine the distance to an object or wall, using reflected infrared (invisible) light. They offer excellent range and accuracy in an easy-to-use package. Their operation is not affected by soft materials, unlike ultrasonic sensors. However, they do have trouble with sunlight interference and black materials that are transparent to IR. When connecting a Sharp IR distance sensor to an EZB's ADC (Analog Digital Converter) port, this skill...




Misc Robot Skills

The Misc (Miscellaneous) category in Synthiam ARC includes a collection of miscellaneous robot skills that don't fit into specific categories. It provides a versatile toolkit for handling various tasks and functionalities in robotic applications.

Start by adding the Admin Toolbox plugin to your project: The plugin control window is empty; nothing is done here. All the fun is in the new Admin tab. Default status: the project is unlocked and the ARC application is not in full-screen mode. The default lock/unlock password is admin. A password is required to lock or unlock the project. The default full-screen mode password is password. Changing lock password Changing full screen password. *) Both passwords can be changed in a single step. Once...


This ARC plugin enables you to customize angle calculations, publish the calculations to EZ-Script variables, and control servos. Supported Hardware: Kinect Xbox 360 (Kinect for Windows Runtime 1.8 - https://www.microsoft.com/en-us/download/details.aspx?id=40277) Asus Xtion Pro (OpenNI 2 and NITE 2) Kinect One (Kinect for Windows Runtime 2.0 - https://www.microsoft.com/en-us/download/details.aspx?id=44559) Additional Setup Notes for Asus Xtion: Please confirm the following Windows environment variables:


You need to flash the DHT11 firmware to the Arduino. EZ-Genuino_DTH11_Uno.zip Use Pin 4 for the signal. The variables $Temp and $Humid are created. Have fun!
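A quick way to check the readings from a Script skill, using the variables this firmware creates:
# Speak the latest DHT11 readings
SayWait("The temperature is " + $Temp + " degrees and the humidity is " + $Humid + " percent")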


Plugin to monitor system boards for temperature and voltage. Can display either Fahrenheit or Celsius. High/low voltage thresholds can be configured, as well as the normal board temperature. Icons change as voltage and temperature change, with a large numeric display. Can select any of 5 EZ-B v4 boards. Version 2 added an over-voltage warning, more error-protection routines, and a typical voltage chart. Version 3.5 added high/low voltage warnings.


The Texas Instruments eZ430-Chronos Programmable Wrist Watch has many neat features, and ARC supports some of them. The built-in accelerometer can be assigned to servo positions on the X and Y axis, much like the Wii Remote.


This control allows you to attach a GPS to your computer. Nearly any USB or RS-232 Serial GPS can be connected to your computer for this control. Simply connect the GPS to your computer and wait for it to detect the virtual COM port for the GPS. The COM port is the communication port for the GPS. Select the COM port and voila, you will begin receiving GPS information. Because adding GPS functionality is different for everyone, the GPS controls are built into EZ-Script. When creating an...


Please note that this plugin only handles the sending of messages to IFTTT. If you plan on receiving messages from IFTTT, take a look at this tutorial: https://synthiam.com/Support/Skills/Misc/IFTTT-Sender?id=15883 As this is my first plugin, it may be a bit rough... feedback and advice welcomed. Thanks to DJ Sures for the great plugin tutorial and sample code. * First, create an IFTTT account here: https://ifttt.com/ * Click on My Applets * Click on New Applet * Click the If This clause *...


This is a server for the iPhone Sensor Stream App from FNI. The Sensor Streamer will transmit all sensor data from the Apple Mobile Device to your computer. This control will accept the data and assign the data to variables. The dark green dot displays the accelerometer position. The light green displays the gyro. And the blue displays the compass heading. Pressing the config button will allow you to specify which parameters are assigned to variables. The variables may be used in EZ-Script....


This skill contains a variety of utilities for database connectivity and more. Miscellaneous Utility Plugin ====================== source code: https://github.com/ppedro74/ezrobot-playground/tree/master/plugins/MiscUtilityPlugin version 6: ARC update version 5: ARC update version 4 : Added: DataQuery function Syntax: DataQuery (connection, query, [limit], [output variable], [query argument 0], [query argument 1], [query argument n]) = Int32 Connection: Provider Name and connection string. Query:...


The Neopixel Blaster is a compact EZ-Bit which controls up to 640 neopixels on 8 different output ports. The compact footprint is designed to fit snug in a female EZ-Clip. Using this controller means you can have 80 neopixels per output port (8 * 80 = 640 neopixels). Once this control is added to your project, select the digital port that the controller is connected to. Use the available ControlCommand() from the Cheat Sheet to set colors to the neopixels. There are...


This plugin is for people with onboard computers to control the Neopixel ring from ARC. I will work on adding the ability to have this run from a script. All of this is doable through a script anyway, but I wanted to give a user interface to the work that has been done. This update displays the command that needs to be passed via serial. You can copy and paste this command. Also, it allows you to either run the light or not based on a check box. It allows you to quickly test, and then copy the...


The NEST (www.nest.com) devices (thermostat and CO/smoke detector) connect your home to the internet. With the Synthiam Nest Control, your robot can request and set the temperature of your home. ARC will require access to your Nest devices, which can be done with the following steps in ARC... Configure Nest Access 1. Press OPTIONS from the ARC top menu 2. Select NEST from Global Settings in the top menu 3. Follow the instructions to activate ARC with your Nest account Once...


Control your mouse with scripting using this skill. Be creative and use your voice, camera skill, or switches on your I/O controller to move the mouse cursor on your PC. Use the ControlCommand() for this skill to move the mouse cursor with scripts. Main Window 1. Pointer Position Display This displays the mouse cursor location (in pixels) on the screen. The upper left is the starting position (0 x 0). 2. Start Script Button Push this button to execute your code. See code samples below for...


Two instances showing different camera angles: Top Menus: Configuration - configuration form Scroll Bars: Rotate Camera - When selected, the top, bottom, and right scroll bars control the camera rotation (roll, pitch, yaw) Position Camera - When selected, the top, bottom, and right scroll bars control the camera distance from the object (x, y, z) Rotate Object - When selected, the top, bottom, and right scroll bars control the object (cube) rotation (roll, pitch, yaw) Reset - sets 0,0,0 for the selected scroll bars...


Provides two different features: An EZB Uart Serial Parser, SerialTcpBridge (Connect EZ-Robot Camera via USB) Version 2 ======================== New Feature: EZB Uart Serial Parser Results: //TODO: Improve details This plugin requires: ======================== 1) Extra hardware 1.1) Teensy 3.2 Micro-controller https://www.pjrc.com/store/teensy32.html http://www.robotshop.com/en/teensy-32-usb-microcontroller-development-board.html 1.2) JST 2.0 PH 6-Pin Male Connector...


This robot skill has been deprecated and replaced with Full Screen Video Player


This control allows you to display an image of your robot and display servo position information. *Note: To delete a servo, click on the Servo and press the DEL key on the keyboard




Movement Panels Robot Skills

The Movement Panels category in Synthiam ARC focuses on controlling and managing the movement of robots. It includes features for controlling motors, wheels, and other actuators, allowing robots to navigate and move in a controlled manner. There can only be one movement panel per project. Movement Panel Details

The AR Drone Movement Panel allows control of an AR Parrot Drone Quadcopter v1 and v2. The A and Z keys are used for Up and Down when using the keyboard mode. The Arrow keys are used to move the robot Forward, Reverse, Rotate Left, and Rotate Right. ARC software enables speech recognition, joystick control, Wii control, web control, and camera tracking - tracking faces, colors, motion, and glyphs. Video Device As mentioned in the Camera Device Manual, the AR Parrot Drone Movement Panel must...


Use this to make your robot walk or move using servos. This movement panel is used for humanoids and hexapods. Create frames and actions to animate the robot servos into pre-defined positions. A gait is the movement pattern of an animal's limbs, including a robot's, during locomotion. This robot skill allows robots to customize servo animations for gait locomotion and interacting with the world. Some robots (i.e., humanoids, hexapods, and more) use servos to move with gaits based on speed, terrain,...


A movement panel to control brushless motor controllers. These are used in hoverboards or many other robot types. This uses generic (popular amongst DIY projects) brushless motor controllers. These controllers have a pin for direction, speed, and brake for each motor. Configuration The configuration menu has options to select the ports for generic brushless motor controllers. Select the ports that are connected to the appropriate connections between the EZB and motor controller.  *Optional:...


Two or more continuous servos used as a movement panel to move the robot with movement commands.


This movement panel controls two continuous servos, which have continuous rotation. Unlike a Standard Servo, a continuous servo has its internals changed to allow continuous 360 rotation. The two parts that are removed from a standard servo to create a continuous rotation servo are: Mechanical Stop: The spindle will stop if you attempt to rotate a standard servo further than 180 degrees. That is due to a mechanical stop built onto the output gear within the servo. Continuous servos have that...


Use this skill if your robot requires a motor/movement controller that is not supported. You may assign scripts to the Forward, Backward, Left, Right, and Stop commands. Script functions are also available to you to obtain the currently selected speed of both Left and Right wheels. You may modify the speed using SetSpeed(value). The speed values may be altered by any joystick-style skill that affects speed, such as the Joystick in Mobile Interface or a physical joystick using the Joystick...
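As a sketch, the Forward script of a custom movement panel might relay the command to an unsupported motor controller over a hardware UART; the UART index, baud rate, and command bytes below are assumptions for illustration only:
# Example Forward script: send a vendor-specific "both motors forward" packet
UARTInit(0, 0, 9600)
UARTWrite(0, 0, 1, 127, 127)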


An advanced custom movement panel with more features than the regular panel. This extends the common custom movement panel by including roll left, roll right, up, and down. It is recommended to read what a movement panel is to understand better how to implement your custom code. The movement panel description document contains valuable information about how ARC integrates movement panels and how each movement direction should be interpreted. This movement panel has scripts for various...


Control the DJI Tello drone from ARC and receive the camera video feed for tracking. What Is DJI Tello? Tello is an impressive and affordable little drone that's a blast to fly and helps users learn about drones with coding education. The high-resolution onboard camera can be used for computer vision tracking within ARC. With the camera being fully compatible with the ARC camera device, there's no limit to what this drone can do! Firewall Permission When the connect button is pressed with this...


A movement panel for using a two-channel H-Bridge that drives DC motors. There is no speed control with this movement panel; the motors will be either ON or OFF. An H-Bridge is an electronic circuit that enables a voltage to be applied across a motor in either direction. It's triggered by TTL signals from a microcontroller, such as an EZ-B. These circuits are often used in robotics and other applications to allow DC motors to run forward and reverse. A Dual H-Bridge (such as the EZ-Robot 2.5 Amp...


A movement panel for using a two-channel H-Bridge that drives DC motors. This will also control the speed of the motors using PWM signals to increase the robot's mobility. An H-Bridge is an electronic circuit that enables a voltage to be applied across a motor in either direction. It's triggered by TTL signals from a microcontroller, such as an EZ-B. These circuits are often used in robotics and other applications to allow DC motors to run forward and reverse. A Dual H-Bridge (such as the EZ-Robot...


The inverted pendulum skill is designed to be used with the Sainsmart v3 balance robot, but technically you can use the Sainsmart hardware, combined with ARC, to balance any robot. You'll just need some patience to tune the PID values. In order to use this skill you'll have to install the custom EZB firmware onto the Sainsmart robot's Arduino Mega and connect it to ARC. Main window 1. PID Gain Value Sliders These sliders adjust the proportional, integral, and derivative (Kp, Ki, and Kd) values...


Control the older iRobot Discovery and 4xxx for legacy support. This robot skill is available for those who have robots built on the older Roombas. *Note: For 500 series users, please use the iRobot Movement Panel, which also has NMS compatibility and features available to newer Roombas. Screenshot EZB Communication The EZB port is hardcoded for software serial D0, which means it operates on any EZB that supports software serial. This includes Arduino, EZ-Robot controllers, and more. This...


Control the iRobot Roomba and read sensor data. This skill currently works with the 500 series robots and iRobot create. The baud rate and connection type will need to be configured in the settings menu.  This robot skill can connect to an iRobot by using either a USB adapter or a microcontroller acting as an EZB (Arduino, etc.). The connection type in the config menu will specify the way this skill communicates with the iRobot. Continue reading below... Main Window 1) Enable sensor streaming...


This skill connects the Ohmnilabs Telepresence Robot servo/wheel hardware with ARC. Windows must be installed on the robot's internal UP Board so that ARC can run. This skill connects to the Ohmnilabs bus through the COM port that's detected. Prerequisite - Installing Windows 10 The Ohmnilab robot comes with an Intel Up Board as the SBC (Single Board Computer). In order to run ARC and all of the available skills, the Windows 10 x64 (64-bit) operating system needs to be installed on the UP board inside...


*Note: This skill has been discontinued and replaced with the Ohmnilab standalone skill This control lets you control a Telepresence robot from Ohmnilabs using ARC and immediately get access to all the features that Synthiam platform offers including face detection, automated tracking, easy navigation and more. In order to use this control you need to run the following custom firmware on the robot: https://synthiam.com/Firmware/Ohmnilab-telepresence-robot-firmware-DE45ISCUODD


This control has been deprecated and replaced with iRobot Movement Panel


Movement panel for the Kangaroo and Sabertooth from Dimension Engineering. This allows a robot with wheel encoders to be used, which enables the Kangaroo's speed control and various other features. Configuration 1. Software Serial Option Checkbox This uses any EZB digital I/O port to send an emulated serial signal to control the Sabertooth. This section includes a drop-down for the serial communication speed (baud rate) and a Button to change I/O ports. *Note: Software serial should only be...


This skill supports the Sabertooth Motor Controller as a movement panel with variable speed in Simplified Serial mode. The Sabertooth Motor Controllers are high power. They come in many versions that range from 5Amp to 60Amp. The ARC software supports the Simplified Serial Mode at multiple baud rates on a configured software serial or hardware UART port. Main Window 1. Movement Panel Buttons These buttons will allow you to control the motors attached to the Sabertooth Motor Controller (left,...


The Sphero Movement Panel controls a Sphero over Bluetooth. Connect your computer to the Bluetooth signal of your Sphero. The computer will detect 2 COM ports for the Sphero; only one of them will connect to the sphero - we are unable to determine which COM port is the Sphero, so you have to guess. Sorry about that! Every time you disconnect or reboot the Sphero, the Bluetooth connection must be deleted and re-synced. This means you will have to delete/remove the Sphero Bluetooth device from...


Control UBTech Alpha robot via bluetooth and control servos. The Virtual Servos in ARC will control the UBTech Alpha robot servos. Use the configuration button on this skill to configure what servos are to be used. The Vx servos in ARC correspond to the servo IDs of the UBTech Alpha robot. For example: - V1 controls servo ID #1 - V2 controls servo ID #2 - V3 controls servo ID #3 - and so on... Main Window 1. COM Port Drop-down Select the Bluetooth COM port that corresponds with the Alpha1 robot...


The WowWee MIP robot is hackable with a UART port on the mainboard. Connect the GND, TX, and RX to an EZB's hardware UART and control it from ARC. You can power the EZB with the MIP's onboard batteries, but if more power is needed, for example for an EZ-B v4 + camera, an additional 9-volt battery can be used. Any EZB that supports a hardware UART can use this robot skill, such as the ESP32, ESP32-Cam, or EZB v4. The ESP32-Cam is preferred because it’s the smallest and has a built-in camera. *Note:...


The WowWee MIP robot is hackable with a UART port on the mainboard. Connect the GND, and the EZB TX to the MIP RX. You can power the EZB with the MIP's onboard batteries, but if more power is needed, for example for an EZ-B v4 + camera, an additional 9-volt battery can be used. Any EZB that supports a hardware UART can be used with this robot skill, such as the ESP32, ESP32-Cam, or EZB v4. The ESP32-Cam is preferred because it’s the smallest and has a built-in camera. Because the MIP does not send any...


This is a movement panel for the WowWee Rovio toy robot that allows controlling it from Synthiam ARC. The WowWee Rovio is a now somewhat dated but notable remote-controlled robot that was introduced in 2008 by WowWee, a company known for creating innovative and interactive robotic toys. The Rovio was designed to provide users with a way to remotely explore their surroundings and interact with people and objects using a combination of video, audio, and mobility features. Key Features Remote Control:...




Navigation Robot Skills

The Navigation category in Synthiam ARC is dedicated to enabling robots to navigate and move autonomously. It includes functionalities such as path planning, obstacle avoidance, and localization. This skill set is crucial for developing robots capable of exploring and navigating complex environments.

Use the Beitian BN-880 GPS/Compass sensor for autonomous navigation from predefined waypoints. View a map of the surrounding area to specify waypoints in real time. The robot can navigate between the waypoints using the combined GPS and Compass of the BN-880 sensor. *Note: This robot skill does not use the ARC NMS (Navigation Messaging System). This is a standalone robot skill that will perform its navigation. BN-880 Features Electrical Characteristics: - Receiving Format: GPS, GLONASS,Galileo,...


Use an overhead camera to track the pose location for localized navigation. The robot must have a Glyph attached to the top of itself. The camera must be overhead. Ideally, having the camera overhead looking down works best, although a slight angle will work okay for most cases. Ensure there is proper lighting so the camera can detect the glyph. Anyone with a camera can use this robot skill to experience highly accurate NMS with autonomous navigation. This positioning system will work...


This robot skill does not have navigation capabilities. We recommend using The Better Navigator instead of this. This was created for testing and as a proof of concept if you want to make your own skill for SLAM. This skill is for the ARC NMS (Navigation Messaging System) to display Simultaneous Localization and Mapping (SLAM). A LIDAR is recommended to be used in conjunction with this skill to create an accurate SLAM map of the robot's environment. Verify the Lidar you choose can push scan data into...


NMS Driver for the Hitachi-LG LDS 360-degree lidar. This robot skill connects to the Lidar and pushes the data into ARC's NMS (navigation messaging system) to be used with level 1 navigation viewers. Read the NMS manual page to understand how this skill can be used. Screenshot Configuration 1) BaudRate The baud rate for the communication to the USB serial adapter or Arduino. By default, the baud rate should be 230400 for the lidar. However, some USB serial converters will use different baud...


The indoor positioning system is a combination of Infrared (IR) light transmitter and a dual camera receiver. The IR transmitter is mounted on your robot, and the dual camera receiver is placed in a corner of the room. The dual camera receiver is made up of a camera that captures regular image data and another that only captures infrared light. You can mount the dual camera on a tripod so it can see the room with its 100 degree wide angle lenses. There are three variables created that return the...


Connect the D435i Intel Realsense camera to the ARC navigation messaging system (NMS) for distance detection and mapping. This should also work with other Intel RealSense depth camera models because the FOV and parameters are used from the device. Main Screen 1) Select the device by the serial number. This allows multiple Intel Realsense devices to be added to a project. 2) START/STOP button connecting to the RealSense device. 3) Log window for status and errors. Image Align The Image Align...


With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 has been designed to give you the tracking performance for your robot. This ARC user-friendly robot skill provides an easy way to use the T265 for way-point navigation. The T265 combined with this robot skill provides your robot a SLAM, or Simultaneous Localization and Mapping solution. It allows your robot to construct a map of an unknown environment while simultaneously keeping track of its own...


Use the Microsoft Xbox 360 Kinect to send navigation/obstacle information into the ARC NMS (navigation messaging system). This skill publishes obstacle scan distance data to ARC's NMS (navigation messaging system) to be used in SLAM or The Navigator skills. Because it is unnecessary to process the entire depth image for mapping, only a portion is processed. This prevents your robot from picking up distances of things irrelevant to its size and distance. Using the options for the detection area, tune the...


Touch tablets make controlling a movement panel much more accessible. This control requires at least one movement panel. This is only a touch control that controls movement panels. Add a movement panel to your project for your respective movement method (i.e., Modified Servos, HBridge, Roomba, AR Drone, etc.)


An example project with source code that demonstrates how to create an NMS Level #1 navigation robot skill. In this robot skill, the NMS scan and location(position) data are displayed. Other than a demonstration of how to code a robot skill to receive NMS data, this is entirely useless. The code for this example robot skill demonstrates how something like The Navigator could be made.  Download the source code for this robot skill here: NMS Example Source.zip This robot skill subscribes to an...


For robots that do not have odometry sensor feedback (i.e. wheel encoders, RealSense D435i, etc.), this will generate estimated pose odometry based on the travel distance of the movement manager. Configuration Menu This robot skill calculates the distance the robot is moving based on the speed. Because ARC has dynamic speeds, the calibration will require a few measurements of a robots travel distance over time. This information is entered in the config menu. 1) Speed 255 Distance - This is...


Pineapple Technology Industries plugin Capstone Project (secretive lidar unit)


NMS Lidar driver for the Slamtec RPLidar A1. This pushes scan data to the ARC NMS (Navigation Messaging System) as a level #3 Group #1 sensor. This robot skill connects to the Lidar and pushes the data into ARC's NMS (navigation messaging system) to be used with level 1 navigation viewers. Read the NMS manual page to understand how this skill can be used. That might sound more confusing than it needs to be. But all that means is that this sensor will detect the distances of objects using a laser in a...


A better navigation skill based on Hector SLAM using ARCs NMS location/positioning and obstacle data. This skill is combined with other skills contributing navigation data to ARCs navigation messaging system (NMS). The lidar or depth camera data will create a map of the room(s) as the robot drives. You can then add way-points that are saved with the project. You can have the robot automatically navigate by clicking on a way-point (i.e., kitchen, sofa, or dining room). The robot will figure out a...


A navigation skill for path planning based on receiving NMS location/positioning and obstacle data. This skill is combined with other skills that contribute navigation data to ARCs navigation messaging system (NMS). Updated Skill This robot skill has been superseded by a newer version called The Better Navigator. Example This screenshot uses an Intel RealSense T265 with three low-cost HC-SR04 ultrasonic distance sensors. The robot was instructed to drive around the waypoints at various speeds....


Wavefront Navigation path planning (shortest path to destination). Define an array called $MyMap. Assign your goal location in the array a value of 1, assign your robot position in the array a value of 254, all walls and obstacles a value of 255, and open space a value of 0. Call the wavefront plugin. The plugin returns an array $MyRoute showing the best route: G indicating the Goal, R indicating the Robot, and _ indicating the route. My plan is to use the sonar to check ahead, and if an obstacle...
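A hedged sketch of the setup described above, using a small 5x5 grid stored row by row; the element indexes and the plugin's ControlCommand() name are assumptions, so check the Cheat Sheet tab for the real command:
# 0 = open space, 255 = wall, 1 = goal, 254 = robot (per this plugin's convention)
DefineArray($MyMap, 25, 0)
$MyMap[2] = 1
$MyMap[12] = 255
$MyMap[22] = 254
# Assumed command name - run the wavefront calculation, then read the returned route
ControlCommand(WaveFront, Calculate)
Print($MyRoute[0])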


This skill is compatible with Arduino UNO firmware that counts wheel rotation for left and right wheels. Firmware This robot skill requires supporting capability on the EZB with firmware. The firmware is open-source so that you can modify it for different Arduinos and microcontrollers. The Wheel Encoder Arduino UNO firmware can be found here: https://synthiam.com/Firmware/Wheel-Encoder-Firmware-NVXCXGPCQMJ Main Window 1. Get Wheel Values Button Manually requests the encoder values for each...


Run the Neato Robotics XV-11 LIDAR on your robot to generate a 2D distance map. The XV-11 is an inexpensive 360-degree LIDAR sensor. This requires that you use a GetSurreal controller. I have a version 1.2 GetSurreal controller. You must download and install the Teensy Serial Driver. This plugin will generate an array variable called $LIDARDistance. Each element in the array corresponds to the degree angle reported from the XV-11. The value of each element is the distance reported back by the sensor...
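For example, a script could watch the reading straight ahead; this assumes element 0 corresponds to 0 degrees and that a threshold of 300 makes sense for your sensor's units, so adjust both for your robot:
# Stop the movement panel if something is detected close ahead
$front = $LIDARDistance[0]
IF ($front > 0 AND $front < 300)
  Stop()
ENDIF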




Power Robot Skills

The Power category in Synthiam ARC focuses on managing and monitoring the power aspects of robots. It includes features for monitoring battery levels, managing power consumption, and implementing power-related behaviors. This skill set is essential for ensuring the efficient and reliable operation of robotic systems.

Simple skill that prevents the PC from entering sleep mode. Resize the control to be as small as you need so it doesn't take up too much real estate. As long as this control is added to your project, the PC will not go to sleep.




PWM Robot Skills

The PWM (Pulse Width Modulation) category in Synthiam ARC facilitates precise control of actuators such as motors and servos by adjusting the duty cycle of the signal. This skill set is crucial for achieving fine-grained control over the movement and position of robotic components.

An EZ-B can rapidly trigger a digital port ON and OFF to simulate a TTL voltage signal low (0V) and a TTL voltage signal high (+3.3V or +5V). This method is called Pulse Width Modulation (PWM). The slider controls the value of the PWM on any digital I/O pin of the EZ-B. If you have a motor controller (H-Bridge) with an enable port, you can connect it to a Digital I/O pin and use this skill to control the speed of the motor. Alternatively, you could also control the brightness of an LED with...
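For example, the same PWM value can be set from a script with the EZ-Script PWM() command, where the value is a duty-cycle percentage from 0 to 100; port D0 here is just an example:
# Roughly half duty cycle on D0 (e.g., to dim an LED or feed an H-Bridge enable pin)
PWM(d0, 50)
Sleep(2000)
# Turn the output fully off
PWM(d0, 0)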




Rafiki Robot Skills

The Rafiki category in Synthiam ARC introduces social and emotional interaction capabilities to robots. It includes features for recognizing emotions, responding empathetically, and engaging in social interactions. This category adds a human touch to robotic interactions.

This is the front bumper sensor plugin for Rafiki from Cochran Robotics. A Cochran Robotics sub-controller for a parking sensor must be used with this control, which is part of Rafiki. THIS IS IN BETA. Four different proximity sensors are on the front of the Rafiki (left, left front, right front, and right). This plugin converts the information provided by these sensors into variables in ARC. If an object is close in a particular direction, the motor controller will not allow the robot to move or...




Remote Control Robot Skills

The Remote Control category in Synthiam ARC enables robots to be controlled remotely by users. It includes functionalities for remote operation, telepresence, and remote monitoring. This skill set is valuable for scenarios where direct user control or observation is required.

Avatar JD is a Virtual 3D simulator of JD the Humanoid Robot. You can move this Avatar JD to move your real JD Robot. Also, when you move your real robot Avatar JD will move as well (Version 9: Internal coding changes: Removed timer and used VariableManager and On Servo Movement event to get Servo Values and Scripting variables.) (Version 4: Used WPF Element host for smoother animation. Added $SIM_ global variables so you can write scripts to move Avatar JD and interact with the simulation....


Exosphere is a game-changing product that allows users to control robots remotely to complete tasks anywhere with an internet connection. You can use your phone or tablet to control a robot or add your robot to be controlled by others. This product gives you the power to help make the world a safer place and improve business efficiency. Exosphere gives any robot the ability to execute autonomous tasks or telepresence capabilities in the workplace or home with human intelligence. Exosphere...


This skill is used with a movement panel. When the movement panel is used (i.e., your robot is moving forward, left, right, etc.), the floor map skill will begin drawing a line in the direction of your robot. Use this skill to keep track of where your robot has been. This is not a navigation skill; it merely draws a line of the direction and distance that your robot has traveled over time for informational purposes. By default, the Config values specified will not work with your specific...


Remote control of the ARC Application (and your desktop) over a web browser. Use the Config button to load the configuration window. Specify the TCP Port and usernames for authentication. The default port of the HTTP server is set to 80. If you already have an HTTP server listening on port 80, there will be an error message in the status window when starting the ARC HTTP server. Main Menu Features Remote Control: View the ARC screen and interact with it via left mouse clicks. Using this...


This HTTP Server is different than the regular HTTP Server because it allows custom HTML Files. The HTML Files may contain special commands for controlling your robot Supported file extension types: .HTML .JPG .PNG .CSS *Note: Any other file extension is returned as a mime type of application/octet-stream The HTTP server will get the files from the My Documents\ARC\HTTP Server Root folder by default. This is found in your My Documents folder. In that folder, you can place HTML files, jpg...


The Interface Builder skill empowers you to create a touch-screen user interface (UI) for your robot. Use buttons, labels, pads, sliders, drop-downs, checkboxes, and displays to make a control panel to activate features. It's a powerful way to interact with your robot. With the Interface Builder skill, designing a user interface is very easy, and you can create as many pages as you like. The user is presented with navigation buttons to switch between fullscreen pages. The skill allows you to...


A joystick can be connected and used with ARC to control movement, servo positions, and assign scripts to buttons. For XInput joysticks (such as Xbox), use the Joystick (Xinput) skill. The Analog/Digital joystick #1 is configured to control a movement panel. This can be changed by using the Config button and assigning servos to the joystick instead. Each button can execute a script by either pressing down or releasing up. *Note: If your joystick is connected after the project has been...


This is a Joystick skill for XInput devices (Xbox controllers, etc.). The advantage to this over the Direct Input joystick skill is that you have access to vibration hardware and independent access to analog inputs. Analog inputs, such as the triggers, can control servos or vibration motors. *Note: If your joystick is connected after the project has been loaded, press the Refresh button to detect and load the joystick. Main Window 1. Device Drop-down This drop-down shows the Xinput devices that...


Use this control to assign scripts to keyboard shortcuts. Any key can be assigned a Script command. By default, the arrow keys control any movement panel you have added to the project. If you press the UP arrow, the forward direction of the current movement panel is executed. If you press the Down arrow, the reverse direction of the existing movement panel is executed. If you want to configure scripts to execute for any keys other than arrow keys, you can do so in the configuration menu of...


This analog joystick skill controls any movement panel. Rather than using the interface builder to create a custom interface, this can be used to provide an analog movement joystick to any movement panel. Main Window 1. Analog Joystick This red analog joystick can be manipulated with the mouse cursor or a touch screen. Configuration 1. Dead Zone Field This value is the threshold the joystick must move by to be registered as a movement. The lower the value, the more sensitive the joystick will...


An on-screen joystick that will control any movement panel in the project.


Myo Gesture Armband with support for multiple devices. This will allow control of your servos and script variables with the Thalmic Labs Myo. The gesture and accelerometer data are mapped to both servos and script variables. This works much like how the Wii Controller works with ARC. The benefit of this armband is that it physically attaches to your arm, giving you the ability to have the robot mimic your movements without having to hold a controller or push buttons. Simply move your arm...


The Remote UI Client skill for ARC enables PCs to remotely control robots using customizable interfaces created in other ARC instances, with the ability to switch between multiple interface pages. This skill enhances robot functionality by allowing for tailored control schemes and remote operation, which can be particularly beneficial for managing complex tasks or operating in challenging environments. This is the PC version of the Remote UI Client. The ARC Remote UI Client allows ARC PCs to...


Sketch a path for your robot with your finger (on a tablet) or with the mouse. Use the configuration menu to select the turning and movement speed. These values are specific to the robot and speed. Because this depends on the robots speed and not wheel encoders, it is not always accurate. With wheel slippage and incorrect timing, this is not a reliable method of navigation. This also does not use the NMS (navigation messaging system), so there are no external sensors being used. Consider...


This skill allows you to control servos, movement panels, and scripts with a Wii Remote Controller. Use the controller's Home button to enable/disable the accelerometer servo movement. The directional pad (D-pad) can control a movement panel. You can use the other buttons to activate custom scripts. *Note: There are hundreds of versions of Wii remotes from both Nintendo and aftermarket clones. Synthiam supports the standard protocol of Wii Remotes but isn't a manufacturer of them. Synthiam is...




Scripting Robot Skills

The Scripting category in Synthiam ARC empowers users to create custom scripts and behaviors for robots. It includes scripting languages and tools that allow users to define complex and customized robot actions, expanding the flexibility and capabilities of robotic systems. Here are the manuals for JavaScript, Python, EZ-Script, Blockly, and RoboScratch.

This skill gives you a console directly to the EZ-Script engine. You can execute commands in real-time. A history of your commands is saved and can be recalled using the UP and DOWN arrow keys. Main Window 1. Clear Button This button clears the console display. 2. Console Display Displays the scripts added to the console text field. 3. Console Text Field This is a text field to enter scripts into, a single line at a time, in real-time. Hit enter to execute the script. How to use Script...


This plugin will register new EZ-Script commands that can be used to dampen and filter values. For instance, if you are receiving data from a sensor to move servos, these filters will smooth the output to prevent the servo from being jerky. There are currently two filters that can be used. In order to allow multiple instances of each filter, the commands have an ID. This means you can have multiple instances of the same filter, and each instance can have a unique ID. Mean Filter Examples Here...


This skill adds a new ez-script function GetLineContainingFromArray() that will return an item from the array that contains the specified text.  Main Window 1. Start Button Once the Get Line Containing From Array Skill is loaded you will need a Script skill to access its features. Once Get Line Containing From Array code is loaded into the Script skill use the Start button to activate it.  How to Use Get Line Containing From Array 1) Add the Get Line Containing From Array Skill (Project - Add...
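A minimal sketch, assuming the parameters are (array, search text) and that $MyLog is an array that already exists in your project; verify the exact parameter order in this skill's documentation:
# Returns the first array element containing the word "fault"
$match = GetLineContainingFromArray($MyLog, "fault")
Print($match)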


*DEPRECATED (JavaScript is built into ARC as of 2020.01.23.00) This plugin is a script control which accepts JavaScript as the language. There are plans to continue adding features to this plugin, and eventually integrating the JavaScript language directly into ARC. *Note: Remember, JavaScript commands are case sensitive. This applies to variables and functions. Commands There's a whole bunch of commands, and new ones will be implemented in future updates of this plugin. The code edit...


This skill allows you to trigger scripts based on the robot's movement. It is best to understand what a movement panel is by reading this support document first. All movements are global, so if you use a movement panel of any kind, this skill will execute the script for the movement direction that was activated. For example, if you have a Modified Servo Movement Panel and press FORWARD, the respective Forward script in the settings of this skill will execute. You can assign a...


This skill will execute a script when a variable has changed. Add the variable to watch, and a respective script. When the variable has changed, the script will run. Main Window 1. Status Display Displays variable changes. Configuration 1. Variable Name Field Enter in a variable name. 2. Script Field Enter in a script that you would like to execute when the variable changes. 3. Row Manipulation Buttons Move, Delete, Insert, or Append the rows that you have entered. How to Use On Variable...
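For example, if this skill is configured to watch a variable named $BatteryLow (a name chosen here purely for illustration), any script that changes that variable will trigger the configured script:
# Changing the watched variable fires the script assigned to it in this skill
$BatteryLow = 1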


This control uses a random-like algorithm to add a unique personality to your robot. By adding script commands to the Config section, the commands will be executed within the specified High/Low timeframe (in seconds). This control can run in the background while your robot operates. The control will add unique actions that you create. For example, add script commands such as: - left(1000) - right(1000) - SetServo(d1, 20) - ServoRandom(d1, 1, 30) - ControlCommand(ScriptStart, Flash Eyes) -...


This skill is a mimic tool. You perform specific actions while the skill records communication between the EZB and your PC (movement, servo positions, digital set, etc.). It then allows you to play those actions back on demand. You can save many recordings and trigger them from scripts. To create multiple recordings, add and rename several instances of the Recorder skill. *Note: This skill isn't a macro recorder. It doesn't record your actions across the windows on your PC. It doesn't record...


*NOTE: This plugin has been deprecated because RoboScratch is now included with ARC. RoboScratch Goal The goal of this project is to introduce an entirely new graphical programming interface which does not require using the keyboard. Blocks of functions are dragged onto the screen. Specific parameters can be set for each block, such as waiting for the camera to see an object, waiting for speech, or executing a dance routine. How Is It Different What really makes RoboScratch different from other scratch-like...


The script skill is very versatile; it can interact with most ARC skills. It is also very intuitive, as it contains multiple detailed manuals for assistance with programming in its settings menu. The Script skill currently includes the following languages to program in: Blockly Javascript EZ-Script Python Main Window 1. Start/Stop Button Executes the script that is written in this skill. 2. Status Display Displays the status according to the script written in this skill. Settings -...


Execute a JavaScript or Python script to automate a process and control other robot skills.


This control holds many scripts. Rather than having one control per script, this contains many. Like the regular Script control, you can execute scripts within the Script Manager using the ControlCommand(). Here is an example that will execute a script within the Script Manager named MyScript. ControlCommand(Script Manager, ScriptStart, MyScript) To view all available ControlCommand()s, use the Control Details tab when editing the script. This will show all ControlCommands() for all controls...
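Building on the example above, a second ControlCommand() can stop the same script; the ScriptStop command name is an assumption, so confirm it in the Control Details or Cheat Sheet tab:
ControlCommand(Script Manager, ScriptStart, MyScript)
# Assumed command name for stopping the script - verify in the Cheat Sheet tab
ControlCommand(Script Manager, ScriptStop, MyScript)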


*Note: This control has been removed and is no longer available.


This is a diagnostic tool to identify scripts that are running. Many skills have script events that are triggered by activity. For example, a Wii skill will execute a script when a button is pressed. Because these scripts are executed in the background, this skill lets you monitor the activity. If your script has a loop, it may run longer than expected. Using this monitor, you can see which scripts continue to run. Main Window 1. Script Labels These labels are the titles of each script skill you...


Bind a script to servo position movements. This allows you to create scripts that run each time a servo is moved. A good use case is controlling a smart serial-style servo as a regular servo by binding to a specific servo port. If you had a servo that worked via serial UART and wanted it to operate as a regular ARC servo, you could write code here to bind to it. The alternative method would be creating your own behavior control that binds to the servo movement event. This is covered in the...
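As a rough sketch only, a bound script could read the requested position and forward it to your serial servo's protocol. The variable name $ServoPosition below is a hypothetical placeholder, not something ARC provides by that name.

// Hypothetical sketch of a script bound to a servo port.
var position = getVar("$ServoPosition");   // placeholder for however the bound position is obtained
print("Bound servo requested position " + position);
// Translate the position into your smart servo's serial protocol here and
// write it out using ARC's UART script commands (see the script manuals).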


There are many ways to present visual feedback with this skill. The Sketch Pad allows you to draw with ControlCommand() from a script. You can create scripts that draw object locations in a room or display information from the path the robot has driven. You can also display custom face emotions using simple graphic shapes such as circles, squares, and rectangles. Use your creativity to visualize robot data! *Note: For usage information, reference ARC's example folder for a SketchPad project. Main...
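The exact ControlCommand() names the Sketch Pad accepts are listed in its Cheat Sheet / Control Details tab; the sketch below only illustrates the calling pattern, and the command names and parameters shown are hypothetical.

// Illustrative pattern only - the command names below are hypothetical.
ControlCommand("Sketch Pad", "Clear");                      // hypothetical command
ControlCommand("Sketch Pad", "DrawCircle", 100, 100, 25);   // hypothetical command and parameters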


This skill operates similarly to the Servo Slide controls but adjusts the specified variable with a slider. You can also specify a script to execute when the slider value has been reached. Main Window 1. Slider Press and hold the left mouse button in this area to adjust the slider value. The slider value will be displayed here, and it will be between the minimum and maximum values set up in the skill settings. 2. Bonus Button This button is only available when a script is added to the Button...
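For example, if you configure the slider to write a variable you have named $SliderValue (the name is whatever you choose), another script could react to it; getVar() and Servo.setPosition() are assumed from ARC's JavaScript API.

// Read the user-defined slider variable and apply it, e.g. as a servo position.
var value = getVar("$SliderValue");   // variable name is whatever you configured
print("Slider is at " + value);
Servo.setPosition(d0, value);         // assumed ARC JavaScript servo call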


This is a smarter variable watcher that replaces the standard built-in version. This version uses less CPU for large projects. When large arrays are defined, it may take a long time for the grid to render the data. However, once rendered, the data will update much faster than in the ARC built-in variable watcher. 1) The name of the variable. The entries in [ brackets ] are internal values and are not variables. For example, the volume or speed is set with functions. They are displayed here for...


*Note: Deprecated because this is now a built-in function. Plugin for sorting the contents of an array using the ControlCommand() syntax. Here is example EZ-Script code using this plugin...


This has been replaced with the Sound Script robot skill: https://synthiam.com/Support/Skills/Audio/Speech-Script?id=20204


Split complex tasks into several simple scripts. Each script is a stage in completing a task, executed one after another. A large, seemingly overwhelming task is transformed into a set of smaller, more manageable microtasks that can each be accomplished independently. For example, you can use this to have your robot navigate into a room, look for an item, move toward it, pick it up, and return to another point. Each step of the task is split into stages with a script for each...


Add custom EZ-Script functions in C# using this example. The source code is available here: UserDefinedFunctionExample.zip Additional tutorial information can be found here: https://synthiam.com/Support/Create-Robot-Skill/Examples/Example-Custom-EZ-Script-Function Optionally, you can create custom JavaScript functions with these instructions: https://synthiam.com/Support/Create-Robot-Skill/Examples/custom-javascript-extension There is also example code below the video.


The Variable State Server skill allows a robot to retain important variable data between sessions, enhancing its ability to remember past interactions or states and providing users with a more consistent and personalized experience. By enabling automatic or manual saving and loading of variable states, this skill ensures that a robot can quickly restore its operational context upon startup, making it more efficient and adaptable to ongoing tasks and user preferences. If you have some variables...
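Saving and loading would typically be driven from scripts with ControlCommand(); the command names below are guesses for illustration only, so check the skill's Cheat Sheet for the real ones.

// Illustrative only - command names are hypothetical.
ControlCommand("Variable State Server", "SaveVariables");   // persist the current variable state
// ...later, for example in a connection/init script...
ControlCommand("Variable State Server", "LoadVariables");   // restore the saved state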


This skill views the known script variables and their respective values. Variable Watch automatically refreshes to show variable values. This is a great skill for debugging your programs and viewing the variable data. *Note: It is advised to run the Smart Variable Watcher robot skill instead of this. This skill may cause performance challenges with your program. Only use this skill for debugging purposes. Main Window 1. Variable Display This area displays the variable name, its type, its...




Servo Robot Skills

The Servo category in Synthiam ARC focuses on controlling servo motors, allowing precise control over the rotational movement of robotic components. This skill set is essential for tasks requiring accurate and controlled motion in robotic applications.

Create frames and actions to animate the robot's servos into pre-defined positions. Use this to have the robot wave, dance, or display interactive gestures. A gait is the movement pattern of an animal's limbs, including a robot's, during locomotion. This robot skill allows robots to use customized servo animations for gait locomotion and interacting with the world. Some robots (e.g., humanoids, hexapods, and more) use servos to move with gaits based on speed, terrain, maneuverability, and energetic...
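Actions are commonly triggered from scripts with ControlCommand(); a hedged JavaScript example, assuming the skill is named Auto Position and you have created an action called Wave, might be:

// Run a pre-built servo animation action, then stop the Auto Position skill.
ControlCommand("Auto Position", "AutoPositionAction", "Wave");   // "Wave" is an action you define
sleep(3000);
ControlCommand("Auto Position", "AutoPositionStop");             // command name assumed; confirm in the Cheat Sheet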


This skill automatically releases specified servos after a certain amount of time has passed since their last movement. This is very helpful for repetitive, quick servo movements. Main Window 1. Pause Checkbox This checkbox pauses the Auto Release skill. 2. Status Display This displays which port the auto release was applied to and the position it was last in before the auto release was activated. Settings 1. Title Field This field contains the title of...


Unlike a standard servo, a continuous-rotation servo has its internals changed to allow continuous 360-degree rotation. The two parts that are removed from a standard servo to create a continuous-rotation servo are: Mechanical Stop: The spindle will stop if you attempt to rotate a standard servo further than 180 degrees. That is due to a mechanical stop built onto the output gear within the servo. Continuous servos have that stop removed. Potentiometer: The potentiometer in a standard servo records the...
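Because the feedback potentiometer is removed, the position you send behaves like a speed/direction command rather than an angle. A hedged sketch, assuming ARC's JavaScript Servo.setPosition() call and a neutral point near the middle of the range (the exact stop value depends on your servo's calibration):

// Continuous-rotation servo: the middle position is roughly stop,
// values toward one end spin one way, values toward the other end spin the other way.
Servo.setPosition(d0, 180);   // assumed call; spin at full speed in one direction
sleep(2000);
Servo.setPosition(d0, 90);    // roughly neutral = stop; fine-tune for your servo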


Use Vx virtual servo ports to control up to 99 servos on a custom servo controller, such as an Arduino Mega equipped with custom firmware, via a serial protocol. The custom firmware can be loaded onto an Arduino or a custom microcontroller, transforming it into a specialized servo controller. The robot skill will establish a connection...
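Once connected, the virtual ports behave like regular servo ports from any script or skill. A hedged JavaScript example, assuming ARC's Servo.setPosition() call, where V1 maps to servo ID 1 on the custom controller:

// Moving a virtual port forwards the position to the custom controller.
Servo.setPosition(v1, 45);    // assumed ARC JavaScript call; v1 maps to servo ID 1
sleep(1000);
Servo.setPosition(v1, 135);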


Connect and control Robotis Dynamixel smart servos with ARC, including Arduino and Robotis controllers. This open-source plugin maps ARC's virtual servo ports (V0-V99) on any EZ-B to control Robotis Dynamixel servos. The supported servos are XL-320, AX-12, and XL430-compatible protocols. Dynamixel servos are very different from regular hobby PWM servos. Firstly, they are considered much more professional due to their design specifically for use in robotics. Each Dynamixel servo contains a...


Control Feetech serial bus smart servos over an EZB UART or PC COM port. The selected ARC Vx virtual ports must match the servo IDs in the config screen - and voila, you're off to the robot races! Main Window The main window on the project workspace will display information about active connections and errors. To configure the servos, press the ... (configuration) button in the title bar. Configuration Screen The configuration allows selecting the virtual servo ports that represent the IDs of the...


Control Feetech serial bus smart servos over an EZB UART or PC COM port. The selected ARC Vx virtual ports must match the servo IDs in the config screen - and voila, you're off to the robot races! Main Window The main window on the project workspace will display information about active connections and errors. To configure the servos, press the ... (configuration) button in the title bar. Configuration Screen The configuration allows selecting the virtual servo ports that represent the IDs of the Feetech...


This servo skill allows you to drag the mouse horizontally (left/right) on the display to move the position of a horizontal servo. The horizontal servo will only move to the minimum and maximum limits you specify in the skill's settings. The limits are both set to 90 by default. *Note: Like all robot skills, this does not save the last servo position used. To set a servo position when the robot connects, create an initialization script using the Connection skill. Main Window 1. Position...


This Synthiam ARC Robot Skill is a powerful, customizable tool that provides advanced kinematic solutions for your robot arm. This tool allows you to define and manipulate the structure of your robot arm by adding joints and bones, providing a high level of customization to suit your specific needs. It is designed to be user-friendly and intuitive, making it accessible to beginners and experienced users. Whether designing a robot for industrial applications, research, or personal projects, the...


Control Kondo KRS smart servos with the ICS 2/3.5 protocol over the EZB UART. We cannot verify this works because we do not have servo hardware to test with. This plugin is in response to a community thread regarding Kondo servos. *Note: This is experimental. The servos can be chained together and connected to the EZ-B's UART #1 on port D5. The baud rate must also be configured in the settings of the plugin. Each virtual port can be selected for the communication version based on the servo model. The...


Control the LewanSoul digital smart servos (e.g., LX-16A) with ARC. The servos must be powered appropriately and connected to the EZ-B v4 or IoTiny on the respective port. Visit the Config menu of this plugin to view the port configuration. This plugin will only operate on EZB Index #0. Demonstrated with an IoTiny using a software UART. Details ARC's virtual ports (V1..V99) can be assigned to the LewanSoul servos. - This plugin requires the RX signal wire of the servo to be connected to the TX of...


The Lynxmotion Smart Servos (LSS) are compact, modular actuators designed to be an evolution of the standard servo for use in multi-degree-of-freedom robots, animatronics, custom robot projects, and more. The servos form the core modules of the next-generation Lynxmotion Servo Erector Set (S.E.S.), and a wide variety of modular brackets, mechanics, and electronics have been designed around them. Smart means the servos provide sensor feedback for position, along with built-in safety features. The LSS...


Control servos using the Pololu Maestro servo controllers, which come in flavors of 6, 12, 18, or 24 servos. The connection for this robot skill is USB and therefore great for onboard computers. This robot skill binds to the ARC Vx virtual servo ports. The index of each Vx port is directly related to the index of the Maestro servo port. That means if you send a movement to V1, the movement is sent to Maestro servo #1. Configuration 1) Device This displays the serial numbers of all Maestro USB...


This control will allow hobby PWM servos to have positional feedback in ARC with a slight modification requiring a wire, a soldering iron, and a Phillips screwdriver. This uses the ADC port on the EZ-B (v4, Arduino, etc.). Example In this video, we remote control a JD with another JD. The servo positions from one JD are read using this robot skill, and the positions are set on the other JD. This is like using one JD as a puppet to control the other JD. Video Tutorial Watch this video tutorial to see...


Receive servo positions in real-time over an Arduino and its ADC ports. This control connects to an Arduino over I2C and receives the values of the ADC ports. The Arduino requires firmware to be programmed, which can be downloaded below. Depending on your Arduino version, the code can be modified for the number of ports. The Arduino code available here is for the Arduino Nano, a small board that fits nicely next to an EZ-B or IoTiny. It connects to the EZ-B via I2C using its A4 and A5...


This skill is similar to the Horizontal and Vertical servo skills. However, the difference between this skill and the others is that it allows you to specify a Master Servo. The slave servos will move into relative positions of the Master. If you move the master servo from ANY skill, the slave servos will move to their relative positions. You can use scripting or the checkbox to Pause and Unpause this skill whenever you like, preventing the slave servos from moving. Note that servo selection...


Inverse kinematics calculates the joint angles to position the Robotis Open ManipulatorX end-effector at a specific desired location in 3D Cartesian space. Inverse Kinematics Inverse kinematics is a computational process used in robotics to determine the joint configurations necessary to position and orient the end-effector (usually a robot arm's hand or tool) at a specific target location and orientation in three-dimensional space. It involves solving equations that map the desired end-effector...
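As a generic illustration of the idea (not this skill's internal solver), the closed-form inverse kinematics of a simple two-link planar arm can be sketched in JavaScript as below; the link lengths and target coordinates are hypothetical inputs.

// Generic 2-link planar IK sketch - illustrates the concept only.
function twoLinkIK(x, y, L1, L2) {
  var d = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2);
  if (d < -1 || d > 1) {
    print("Target is out of reach");
    return null;
  }
  var theta2 = Math.acos(d);   // elbow angle (radians)
  var theta1 = Math.atan2(y, x) - Math.atan2(L2 * Math.sin(theta2), L1 + L2 * Math.cos(theta2));   // shoulder angle
  return [theta1, theta2];
}
print(twoLinkIK(15, 10, 12, 10));   // hypothetical target and link lengths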


Move a servo using the mouse with an on-screen widget.


This skill allows you to drag your mouse or finger (on a touchscreen) to move x-axis and y-axis servos. It's like a virtual joystick. This is a great skill to use when you have servos in a pan/tilt configuration. Main Window 1. Servo Pad Hold your mouse or finger (if you have a touchscreen) on this pad to control servos in a pan/tilt configuration. Swiping left and right will move the x-axis servo. Swiping up and down will move the y-axis servo. Settings 1. Title Field This field contains...


*Note: This plugin has been deprecated and replaced with this: https://synthiam.com/Software/Manual/Servo-Recorder-15963 This works similarly to the ARC Recorder, except it is limited to recording servo positions only. You can start/stop recording or start/stop playback by either using the mouse buttons on the control or through EZ-Script. The Cheat Sheet will display available EZ-Script ControlCommands(). Such as...


This plugin allows recording and replaying servo movements at specified speeds and directions. Trigger the recording playback and speed using ControlCommand() from scripts. Realistic Motions For realistic motion recordings, we recommend creating the recordings with the Servo Pad control. Find more information about the Servo Pad here: http://www.ez-robot.com/Tutorials/Help.aspx?id=150 Recording You can store multiple recordings in this plugin. Each can be given a unique name that allows the...


This skill allows you to adjust the speed of a servo or PWM between two positions. It can be used for quickly identifying and experimenting with servo speeds for your application. This skill does not save the last setting or restore settings when the project is loaded. *Note: Before this skill can be used, the servo position must be initialized. To do this, you can simply set the position with a script command or a servo skill. If you wish to have your robot initialize a servo speed when...
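A typical initialization script sets a known position first and then applies a speed; Servo.setPosition() and Servo.setSpeed() are assumed from ARC's JavaScript API, so confirm the names and speed range in the script manual.

// Initialize a servo before relying on speed control (assumed ARC JavaScript calls).
Servo.setPosition(d0, 90);   // establish a known starting position
Servo.setSpeed(d0, 3);       // lower values are faster; confirm the range in the manual
Servo.setPosition(d0, 10);   // this move now ramps at the configured speed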


This control is an all-in-one servo view. You can add servos to be viewed in the summary. The position and speed are displayed and both can be modified. The control can be resized to fit the number of servos added.


Adding this control to your project will connect to an SSC-32 servo controller. The connection can be made directly via a PC COM port or through an EZB supporting the hardware UART or software UART capabilities, such as the EZ-Robot EZ-B v4/2 Wi-Fi Robot Controller or Arduino Mega. The virtual servo ports in ARC, labeled V0, V1, V2... V31, will be connected to the SSC-32 and used to move servos through the SSC-32. The configuration menu on this control will allow you to specify the baud rate and how the...


Control up to 127 stepper motors as servos with any stepper motor driver or motor type. This robot skill requires a low-cost, small-profile Arduino connected to each stepper motor driver. Each Arduino is given a unique ID on the optional serial network, which matches an ARC virtual servo (Vx) port. All 127 Arduinos will share the same PC COM port via the first Arduino's USB connection. An optional end-stop limit switch can calibrate the stepper's home position. For example, if the Arduino...


Control the UBTECH Alpha Robot digital smart servos (UBT-12HC) with ARC. The servos must be powered appropriately and connected to the EZ-B v4 or IoTiny on the respective port. Visit the Config menu of this plugin to view the port configuration. The virtual ports (V0..V99) in ARC can be assigned to the UBTECH servos. UART Ports - This plugin requires the RX signal wire of the servo to be connected to the TX of the selected UART or digital port (if Software UART is selected on IoTiny) - Hardware UART...


This servo skill allows you to drag the mouse vertically (up/down) on the display to move the position of a vertical servo. The vertical servo will only move to the minimum and maximum limits you specify in the skill's settings. The limits are both set to 90 by default. *Note: Like all robot skills, this does not save the last servo position used. To set a servo position when the robot connects, create an initialization script using the Connection skill. Main Window 1. Position Adjust/Display...




Ultrasonic Robot Skills

This skill enables your robot to use an ultrasonic distance sensor to detect objects within a specified range. When an object is detected, a user-defined script is executed. This skill is particularly useful for navigation, obstacle avoidance, and interactive applications. Features - Object Detection: Uses an ultrasonic distance sensor to detect objects. - Adjustable Detection Interval: Set the frequency of detection checks. - Minimum Distance Setting: Specify the minimum distance for object...
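The user-defined detection script can do anything a normal script can. A minimal hedged example that halts the current Movement Panel and announces the obstacle (Movement.stop() and Audio.say() are assumed from ARC's JavaScript API):

// Example detection script: halt movement and speak when an object is detected.
Movement.stop();                  // assumed ARC JavaScript movement call
Audio.say("Obstacle detected");   // assumed ARC JavaScript speech call
print("Ultrasonic trigger fired");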




Ultrasonic Distance Robot Skills

The Ultrasonic Distance category in Synthiam ARC enables robots to measure distances using ultrasonic sensors. This is valuable for applications such as obstacle avoidance, object detection, and proximity sensing. Robots equipped with this capability can navigate and interact safely in various environments.

Attach HC-SR04 Ultrasonic Ping Sensors to your AR Parrot Drone v1 or v2 for collision detection and avoidance. Of course, you must mount the EZ-B to the drone. We do not provide specific instructions to mount the EZ-B to your drone or how to mount the distance sensors.


Ultrasonic sensors use sonar (sound navigation ranging) to determine the distance to an object or wall, the same way some animals like bats and dolphins do. They offer excellent range and accuracy in an easy-to-use package. Unlike infrared sensors, their operation is not affected by sunlight or black materials. However, they have trouble with soft fabrics; the sonar echo works ideally when bouncing off hard surfaces. When an HC-SR04 or equivalent Ultrasonic Distance Sensor is connected to an...
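The distance can also be polled from a script. The sketch below assumes ARC's JavaScript Ping.get(trigger, echo) call and a hypothetical wiring of trigger on D0 and echo on D1; confirm the call name and returned units in the script manual.

// Poll an HC-SR04 wired to trigger D0 / echo D1 roughly once per second.
while (true) {
  var distance = Ping.get(d0, d1);   // assumed call; returned units per the manual
  print("Distance: " + distance);
  if (distance > 0 && distance < 30) {
    Movement.stop();                 // assumed movement call; stop when something is close
  }
  sleep(1000);
}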


Ultrasonic sensors use sonar (sound navigation ranging) to determine the distance to an object or wall, the same way some animals like bats and dolphins do. They offer excellent range and accuracy in an easy-to-use package. Unlike infrared sensors, their operation is not affected by sunlight or black materials. However, they have trouble with soft fabrics; the sonar echo works ideally when bouncing off hard surfaces. When an HC-SR04 or equivalent Ultrasonic Distance Sensor is connected to an...


This robot skill uses an ultrasonic distance sensor mounted on a servo to sweep up to 180 degrees for objects within proximity. If an object is detected, the robot skill will use the current movement panel to have the robot reverse or turn toward a clear direction. Ultrasonic sensors use sonar (sound navigation ranging) to determine the distance to an object or wall, the same way some animals like bats and dolphins do. They offer excellent range and accuracy in an easy-to-use package....




Virtual Reality Robot Skills

The Virtual Reality category in Synthiam ARC introduces immersive virtual experiences to robots. It includes features for integrating virtual reality technologies, allowing robots to interact with virtual environments and providing users with unique and engaging experiences.

A server for DIY usage that allows a client to control servos and sends two camera streams to the client for stereoscopic view. This allows a custom VR app to use two cameras for stereoscopic view. C# .Net Win Forms Test App We have provided an example C# .Net test app demonstrating how a client can connect to this robot skill server. The app can be downloaded here: Dual Camera Servo Test App.zip Unity VR App The ideal custom VR app can be created in Unity. If using an Oculus Quest 2, we have a...


Use the Oculus Meta Quest 2 VR headset to view any robot's camera and control any robot's servos with your hands or controllers. Servos can be attached to finger movements, hand movements, controllers, and buttons. Additionally, a built-in remote desktop controls ARC without removing the VR headset. This robot skill requires the Meta Quest 2 app installed from the Oculus Store to control a robot. Transform your robot interaction experience with the Oculus Quest Robot Skill for Synthiam ARC,...


This is a servo camera video server that allows a remote client to move servos and receive a video stream from a camera device. This is specifically useful for those creating Unity apps that operate as a client to ARC, where the camera video stream can be received and servo positions can be sent. The camera must be streaming for the servo positions to transmit. Demo #1 This is an overview of how this robot skill can integrate with a Unity scene using the Unity animation tool. Demo #2 This...


Use virtual reality with your robot to see what the robot sees and control servos by moving your head. This plugin supports all virtual reality headsets through Steam VR (OpenVR), Google Play, and iOS, including Google Cardboard, Oculus, Windows Mixed Reality, and HTC Vive. The plugin does two things: it allows you to see what the robot sees and control the servos with the pitch and yaw of the headset. This means when you move your head, the robot head can also move, mimicking your movement....