Exosphere combines telepresence, artificial intelligence, machine learning, and human assistance to give cloud-connected robots autonomous behaviors.

How To Add This Control To Your Project
  1. Make sure you have the latest version of ARC installed.
  2. Select the Get button on this page to download the archive file.
  3. Double-click the downloaded archive file to run the installer.
  4. The installer will add this control to ARC.
  5. Load ARC and press Project -> Add Control from the menu.
  6. Choose the Machine Learning category tab.
  7. Press the Exosphere icon to add the control to your project.


Exosphere gives any robot the ability to execute autonomous tasks or telepresence capabilities in the workplace or home with the aid of human intelligence. Exosphere bridges the gap between robotics, machine learning, and artificial intelligence.

Who Is Exosphere For?
Exosphere is for any robot that needs to be remotely controlled by a human to complete tasks or act as a telepresence host.

AI algorithms are beginning to replicate the way we think, including moral hesitation, judgment, and responsibility. This is why AI benefits greatly from human interaction. With close to 1 million robot connections on Synthiam’s platform, once Exosphere is publicly released, robots from around the world will be added daily with new tasks. These tasks are created for real robots with real problems to solve, which gets AI out of the lab and into the real world.

Exosphere can control as much or as little as necessary to help robots achieve their goals, even if that means using human intelligence to fill in the 10% of edge cases that robots can’t handle on their own.

User-inserted image

How Does It Work?
1. A robot is instructed to execute a task.
2. The task is submitted to Exosphere as a contract request.
3. AI/ML and humans provide a confidence bid to win the contract.
4. If AI/ML can’t control the robot to successfully perform the task, another contract is issued to a human operator to take over or assist the robot, which in turn trains the global AI knowledge base for future task requests.
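The contract flow above can be sketched as a small simulation. This is illustrative only, not the Exosphere API: the bid structure, the `award_contract` and `run_task` names, and the fallback logic are all hypothetical, chosen to mirror the four steps.

```python
# Hypothetical sketch of the Exosphere contract flow: bidders submit a
# confidence score, the highest confidence wins, and a failed AI attempt
# falls back to a human operator while logging a training episode.

def award_contract(bids):
    """Award the contract to the bidder with the highest confidence."""
    return max(bids, key=lambda b: b["confidence"])

def run_task(bids, ai_succeeds, knowledge_base):
    """Run one task; on AI failure, re-issue the contract to a human."""
    winner = award_contract(bids)
    if winner["kind"] == "ai" and not ai_succeeds:
        # Step 4: AI failed, so a human takes over, and the episode is
        # recorded to improve future AI bids.
        winner = next(b for b in bids if b["kind"] == "human")
        knowledge_base.append("human-assisted episode")
    return winner

bids = [
    {"kind": "ai", "confidence": 0.72},
    {"kind": "human", "confidence": 0.60},
]
kb = []
print(run_task(bids, ai_succeeds=False, knowledge_base=kb)["kind"])  # human
print(len(kb))  # 1
```

The key design point is that every human takeover also produces training data, so the global knowledge base grows from exactly the cases the AI could not handle.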

User-inserted image

Main Window - Control Tab

User-inserted image

1. Control and Log Tabs
This selects between the Control and Log sections of the Exosphere Skill main window.

2. Task Description Field
Enter a description of the task you are creating, with detailed instructions for the user who will accept it.

3. Submit/Cancel Task Button
Once the task is submitted, it is added to the Synthiam servers and users on the website are notified of the new task. A remote operator will then take control of the robot; it is highly recommended to monitor the robot while it is remotely operated to avoid hazards to the robot or damage to its environment. You can also use this button as an emergency stop to cancel the task while it is running.

4. Task Details Field
Displays the time the task was requested, who picked up the task, and when it was picked up.

Main Window - Log Tab

User-inserted image

1. Log Selection Drop-Down
This selects the level of information you would like to see in the Log Display window: Verbose, Debug, Error, or None.

2. Clear Log Button
Clears the Log Display Window.

3. Log Display Window
Displays log information when errors occur. The amount of information is controlled by the Log Selection drop-down.

Configuration - General Tab

User-inserted image

1. Robot Model and Description
Enter your robot's make, model, and title, then add a description of the robot's qualities.

2. Session Password and Minimum Rating
Optionally set a password and a minimum rating a user must have to control your robot.
Note: *These features have not been implemented yet* 

3. Maximum Joystick Drive Speed 
Use the drop-down to set the maximum speed of your robot when it is controlled with the joystick. If your robot is connected to ARC, you can use the on-screen joystick to test the speed.

4. Keyboard Drive Speed 
Use the drop-down to set the maximum speed of your robot when it is controlled with the keyboard. If your robot is connected to ARC, you can use the "Test Forward" and "Stop" buttons to test it.

Configuration - Scripts Tab

User-inserted image

1. Task Created, Accepted, and Completed Scripts
This section lets you add scripts that execute when your task is created, accepted, or completed.

2. User Defined Scripts
Add additional scripts to your generated Exosphere task. They will appear as playable scripts on your generated task, in the order you added them.
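The lifecycle hooks above can be sketched as a tiny event dispatcher: one script per event, fired as the task moves from created to accepted to completed. The event names mirror this tab; the dispatcher class is hypothetical, not part of ARC.

```python
# Sketch of the Scripts tab idea: register a script per task event and
# fire it when the task reaches that stage. Unregistered events are skipped.

class TaskScripts:
    def __init__(self):
        self.hooks = {}
        self.log = []

    def on(self, event, script):
        """Attach a script to one of: created, accepted, completed."""
        self.hooks[event] = script

    def fire(self, event):
        """Run the script for this event, if one was registered."""
        if event in self.hooks:
            self.hooks[event](self)

scripts = TaskScripts()
scripts.on("created", lambda t: t.log.append("announce task"))
scripts.on("completed", lambda t: t.log.append("park robot"))

for event in ("created", "accepted", "completed"):
    scripts.fire(event)

print(scripts.log)  # ['announce task', 'park robot']
```

Note that "accepted" has no script here, so that stage passes silently, which matches the tab's behavior of only running the scripts you actually add.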

Configuration - Advanced Tab

User-inserted image

1. Audio
If enabled, this section allows the user who accepts your task to hear through your robot's microphone with the settings you configure. This allows you to communicate with the user in real time.

2. Text to Speech Check-box
Once enabled, the user can write text responses that will be converted to audio. The audio will come out of your robot's speaker.

3. Video Settings
You can select an installed video device for the user to use for real-time visual feedback. You can use the check box to send video tracking information to the user as well. 

4. Telepresence Mode
Enabling this allows two-way communication between the remote operator and the robot. The remote operator's camera video and audio are displayed fullscreen on the robot, so the robot can approach someone and hold a two-way conversation with the remote operator. At the end of the session, the "rating" window is not displayed to the remote operator. A tutorial on configuring a robot for Exosphere telepresence is available by clicking here.

5. Experimental Features
These are internal development features. These features may not work and are not documented.

Configuration - User Access Tab

User-inserted image

1. Username Field
This field allows you to enter a username.

2. Access Type
Once the Add button is pressed, the entered username is added to your access list and assigned the access type selected in the drop-down.

3. Access List
This window displays the users you have added and their access type.

How to Use the Exosphere Skill

1) Add the Camera device skill to your ARC project (Project -> Add Skill -> Camera -> Camera Device).

2) Add a Movement Panel skill to your ARC project that matches your robot (Project -> Add Skill -> Movement Panels).

3) Add the Exosphere skill to your ARC project (Project -> Add Skill -> Machine Learning -> Exosphere).

4) Add a brief description of your task to the task description field in the Exosphere skill main window.

5) Click the Submit Task button. The task is now on the Synthiam servers awaiting a user to pick it up.

6) Once the task is completed, rate the user in three areas on how well the task was done. 5 stars = the best job. This helps advance the machine learning database with reinforcement learning.
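One way to picture step 6 is folding the three star ratings into a single scalar reward for reinforcement learning. The linear scaling below is purely illustrative; Synthiam's actual training pipeline is not documented here.

```python
# Hypothetical mapping from three 1-5 star ratings to a reward in [0.0, 1.0]:
# average the stars, then rescale so 1 star -> 0.0 and 5 stars -> 1.0.

def ratings_to_reward(stars):
    """Map a list of 1-5 star ratings to a reward in [0.0, 1.0]."""
    if not stars or any(s < 1 or s > 5 for s in stars):
        raise ValueError("each rating must be 1-5 stars")
    return (sum(stars) / len(stars) - 1) / 4

print(ratings_to_reward([5, 5, 5]))  # 1.0 (best job in all three areas)
print(ratings_to_reward([3, 3, 3]))  # 0.5
```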

User-inserted image



With the onset of pandemics like Covid-19, it is important for Exosphere to be compatible with existing robots, leveraging their abilities with human operators while continuing to train the knowledge base for future autonomous behaviors. Exosphere was designed as a web service and skill to allow connections from existing robots or control software, such as ROS or ARC. It only takes minutes to add any robot to Exosphere with Synthiam's ARC software. As per the Synthiam way, we'll be adding many tutorials that demonstrate how to add a variety of robots to Exosphere using ARC.

#155   — Edited
Hey, I was wondering....where is Jeremie's office?
#156   — Edited
I saw the way but

my robot froze and can't move
@Mickey666Maus  You have to find it.  I thought Jer's office was the game consoles.  just kidding Jer.
OMG...I edited my post three times to get Jerimies name right!!
I am drunk!!
@Mickey666Maus   This is in Canada not Germany it is illegal to drive Robots Drunk.
Where did everyone go? lol

come back and try it
very good admin

how do you get it to ask  a question? like directions, excuse me

Add a script in the exosphere control
#165   — Edited
ok I will try

#166   — Edited
I figured it out

User-inserted image

find Jeremy's  office today?

#168   — Edited
Added an optional two-way telepresence option to the config menu in the latest update
I will try it as soon as I'm back from the farm tomorrow.


Ok, I feel like I'm late to the party here.  We can start controlling each other's robots?  Oh my!xD
Updated to not prompt for ML reinforcement questions when used in telepresence mode
Updated for a few bug fixes
How about the custom control configuration DJ? Any plans?
#174   — Edited
Yah - with API access there are a few enterprise customers that have custom UIs. API access isn't available with Pro. It's documented on the ARC product page :)

User-inserted image

In the meantime, for Pro users: create scripts and the buttons will show up.