Mitch Torri, the owner of ASI in Mendon, Utah, generously provided my V2R_AG_Robotics 4H club with a robot project reminiscent of the Chaos Robot they developed for the military in the early 2000s. Our current setup uses two electric wheelchair motors and a Sabertooth 2x32 motor controller, operated via a FlySky remote control. We aim to expand this system to a total of eight motors of similar size.
My primary question is: how can we configure multiple Sabertooth 2x32 controllers to work together within Synthiam ARC, so we can control all the motors and collect data from them for autonomous operation? It seems impractical for a single person to manually coordinate eight motors, especially across rugged terrain. We have a challenging task ahead: managing a steep, overgrown 30-acre hillside along the downtown Ogden River in Utah. The goal is to clear and maintain this area to prevent potential fire hazards, similar to those seen in Los Angeles.
We envision deploying a swarm of these Chaos-style robots, which will communicate with overhead drones. The robots will cut through three-foot-tall grass and brush, and maneuver around both standing and fallen trees.
We plan to use gas engines to initially charge the batteries, but our future goal involves developing a more efficient power supply system. With various technological advancements coming together, this project is an exciting opportunity to embrace what I refer to as 21st-century literacy.
What are your thoughts on this endeavor? Is it a bold vision or overly ambitious?
It is not a four-tracked skid steer, though. Can we use ARC to run the Chaos Robot without a joystick? Can we use ARC to create a URDF description and simulate the robot to develop its functionality?
You cannot simulate in ARC or any programming environment. You can use simulation software for that, which eventually requires programming the robot with what you learned from the simulation, if you have an outcome that can be defined. The word "autonomous" on its own does not imply an outcome. Autonomous is an adjective that describes how a task is completed. For example: "Autonomously navigate from point A to point B through known terrain C with sensors D and navigation algorithm E," and so on. But you would need strict definitions of all of those points.
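To make "strict definitions" concrete, here is one hypothetical way to write such an outcome down before any code touches the robot. This is a plain Python sketch with made-up names and values, not anything from ARC:

```python
from dataclasses import dataclass

@dataclass
class AutonomousOutcome:
    """A strict, testable definition of one autonomous task.

    Every field is a placeholder for a real measurement or document
    you would produce before writing any control code.
    """
    point_a: tuple[float, float]   # start position in your map's coordinate frame
    point_b: tuple[float, float]   # goal position in the same frame
    terrain_map: str               # surveyed map of known terrain C
    sensors: list[str]             # sensor suite D
    algorithm: str                 # navigation algorithm E
    success_radius_m: float        # how close to B counts as "arrived"

# One fully specified outcome, with illustrative values only.
first_pass = AutonomousOutcome(
    point_a=(0.0, 0.0),
    point_b=(120.0, 45.0),
    terrain_map="ogden_hillside_survey.geojson",
    sensors=["gps", "imu", "camera"],
    algorithm="a_star",
    success_radius_m=1.5,
)
```

Until every field in something like this has a real value, "autonomous" is just a word.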
It's fantastic that you have great goals, and I have no doubt that you will achieve them. However, all great things start with small strides, and that's where I am taking the time to direct you.
@Athena will explain more.
Hi Chuck,
ARC isn’t meant for simulation purposes or for modeling your robot using URDF files. Instead, ARC is designed to control real hardware using its suite of robot skills. Let’s break this down a bit:
Simulation vs. Real-World Control
ARC focuses on interfacing with actual hardware, including motor controllers like Sabertooth, servos, sensors, and so on. If your goal is to develop a joystick-operated system for your current two-motor setup, ARC does that very well. But when it comes to true autonomous behavior (for example, a Chaos Robot performing complex tasks without a joystick), that kind of high-level autonomy typically requires you to define exact behaviors and sensor inputs (e.g., navigate from point A to B using sensor D and algorithm E).
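On the multi-controller part of your question: the Sabertooth 2x32 supports a packetized serial mode in which several controllers share one serial line and each responds to its own address (128-135), which is how eight motors across four controllers could hang off a single bus. Below is a rough Python sketch of the legacy 4-byte packet format, assuming a USB-serial adapter and the pyserial library; verify the command bytes and addressing against the Dimension Engineering documentation before wiring anything:

```python
import serial  # pyserial: pip install pyserial

# Legacy Sabertooth packetized serial: 4-byte packets on one shared line.
# Each controller is configured with a unique address from 128 to 135.
PORT = "/dev/ttyUSB0"  # assumption: the USB-serial adapter enumerates here
BAUD = 9600            # must match the baud rate the controllers expect

def sabertooth_packet(address: int, command: int, data: int) -> bytes:
    """Build one packet: address, command, data, then a 7-bit checksum."""
    checksum = (address + command + data) & 0x7F
    return bytes([address, command, data, checksum])

# Command bytes in legacy packet serial (check the datasheet):
#   0 = motor 1 forward, 1 = motor 1 reverse,
#   4 = motor 2 forward, 5 = motor 2 reverse; data is a speed from 0-127.
with serial.Serial(PORT, BAUD, timeout=1) as bus:
    bus.write(sabertooth_packet(128, 0, 64))  # controller 128, motor 1, ~half speed
    bus.write(sabertooth_packet(129, 5, 32))  # controller 129, motor 2, slow reverse
```

In practice you would configure ARC's Sabertooth robot skills rather than write raw serial yourself; the sketch is only to show how the shared-bus addressing works.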
URDF and Autonomous Simulation
URDF (Unified Robot Description Format) is generally used in simulation environments (like ROS with Gazebo) to model the mechanics of a robot. ARC doesn’t support creating or simulating URDF-based models. Instead, simulation software lets you test sensor integration and navigation algorithms in a virtual world. The insights from your simulation can help you develop the control code. You’d eventually need a method (robot skill or script) in ARC to replicate those algorithms on your physical robot.
Building Autonomous Functionality in ARC
While ARC itself doesn’t simulate your robot’s environment, you can still develop and test autonomous routines by:
- writing robot skills or scripts (ARC supports JavaScript and Python) that define one behavior at a time;
- adding sensors incrementally and verifying each input on the real hardware;
- testing each routine in a controlled area before combining routines into larger behaviors (see the sketch below).
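For a feel of the control structure such a routine takes, here is a minimal sense-decide-act loop in plain Python. The sensor and motor functions are hypothetical stubs standing in for whatever robot skill or script calls your build ends up using; it is a sketch of the shape, not a working ARC script:

```python
import time

_fake_distance_cm = 120.0

def read_distance_cm() -> float:
    """Hypothetical stub that pretends an obstacle is closing in.
    Replace with a real sensor read (sonar, lidar, camera depth, ...)."""
    global _fake_distance_cm
    _fake_distance_cm -= 10.0
    return _fake_distance_cm

def drive_forward() -> None:
    """Hypothetical stub; replace with your movement skill or motor command."""
    print("driving forward")

def stop() -> None:
    """Hypothetical stub; replace with your movement skill or motor command."""
    print("stopped")

OBSTACLE_THRESHOLD_CM = 50.0

# Sense, decide, act: drive until an obstacle is too close, then stop.
while True:
    distance = read_distance_cm()            # sense
    if distance < OBSTACLE_THRESHOLD_CM:     # decide
        stop()                               # act
        break
    drive_forward()                          # act
    time.sleep(0.1)                          # pace the loop
```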
The Path Forward
It sounds like you’re aiming for a complex, highly autonomous system, a goal that definitely requires rigorous testing and clear outcomes from simulations. While ARC won’t let you throw a URDF into it, it does let you implement and execute defined control behaviors on real hardware. Many developers start small (for instance, controlling two motors via a Sabertooth and gradually adding sensors/logic) and then scale their system once the fundamentals are solid.
Does this help clarify how ARC fits into your overall plan? Feel free to ask more questions if you’d like specifics on how to start developing autonomous functions within ARC or about integrating simulation outputs with your real-world robot control.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
There you go, well said. Athena's summary, which will get you up and running, clarifies the difference between the "program" and the "simulation." The simulation produces the results that the program will use. You can choose which simulation environment you wish to use. But you also mentioned "autonomous," and I have yet to hear what autonomous action your robot is meant to perform.
Here's an example. You can load any simulation environment, from Gazebo to Blender to Unity or others, depending on your level of technical ability. But "throwing a URDF" into a simulation environment doesn't do anything on its own. The word "Autonomous" at this stage has no relationship to an outcome because no outcome has been defined. In addition, you cannot simulate a robot without knowing the hardware configuration and ensuring it physically moves so you can compare the physical capabilities to the simulated environment. This is why the first step is to program your robot to operate reliably. Following that, there are several options for how you receive simulation data.
Acting on the results of a simulation requires training a model or building on existing ones. Due to the lack of appendages and sensors, I don't see your robot configuration as the right fit for Synthiam's Exosphere robot program. But I'm sure there are some HuggingFace models that you can build from. Either way, you need a robot that moves before heading down that path.
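As one illustration of building from an existing model (not a recommendation of any particular checkpoint), the HuggingFace transformers library can pull a pretrained object detector in a few lines; the image filename below is a placeholder:

```python
# pip install transformers torch pillow
from transformers import pipeline
from PIL import Image

# Pull a pretrained object-detection model from the HuggingFace hub.
# "facebook/detr-resnet-50" is just one commonly used example checkpoint.
detector = pipeline("object-detection", model="facebook/detr-resnet-50")

# Placeholder filename: e.g., a frame grabbed from the robot's camera.
image = Image.open("hillside_frame.jpg")

for detection in detector(image):
    # Each detection carries a label, a confidence score, and a bounding box.
    print(detection["label"], round(detection["score"], 2), detection["box"])
```

A detector like that could eventually feed decisions such as "is there a fallen tree in frame," but again, only after the robot itself moves reliably.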
Knowing the full scope of your robot's goal is essential, and you're doing a great job educating yourself. That's why starting with a robot that builds your experience is a great first step, and an excellent achievement in itself. The Getting Started guide will walk you through the beginning of the build process: identifying the microcontroller, computer, sensors, etc. that your robot will use. Again, I recommend starting with a robot such as the wheelchair base that you've modified into a skid steer.
https://youtu.be/1FLD0djPFgU?si=599kwLZ9VcjJXxyC
https://youtu.be/TxpE5yryCTU?si=kpMZpjzV9Q-CIyjZ
This is what we are going for. Thank you for the responses; I will dive into ARC. With this community, it's going to be fun.
It looks like you and DJ had a great conversation. There are good things to come! Thank you for sharing the videos. Our team developed and maintains the ARC platform to make everyone's robot dreams a reality, without having to become a low-level programmer! Let your creativity fly. Okay, it looks like your next mission is to get that converted wheelchair base moving with Synthiam ARC. I foresee simply remote-controlling it at first, and then we'll add some autonomous capability with camera tracking to get you settled in.
We already have the wheelchair base remote-controlled with the Sabertooth 2x32 and a FlySky toy RC setup, but I will redo it in ARC. Still reading the docs. Question: what does Synthiam recommend to replace the RealSense T265, etc., for visual autonomous guidance hardware? And what hardware do I need to buy to go from the PC to the control hardware? I see mention of the EZ-B. Still more research to do on my part. Glad to have a community to reach out to! Thanks