kslaxme (India) asked:
@Athena, which approach is recommended for an interactive robot controlled by a VR Oculus? I plan to use ARC on a Pi as the robot's brain, with 4 motors and serial servos as arms connected to an Arduino Mega. I have another doubt as well: the H-Bridge skill only controls 2 motors. If I have 4 motors, how do I control them?
For an interactive robot controlled by a VR Oculus, using ARC on a Raspberry Pi as the brain is a great approach. Here's a recommended setup and how to address your concerns:
Recommended Setup:
Raspberry Pi as the Brain:
Arduino Mega for Motor and Servo Control:
VR Oculus Integration:
Controlling 4 Motors with H-Bridge:
H-Bridge Skill Limitation:
Options for Controlling 4 Motors:
Dual H-Bridge Modules:
Custom Motor Controller:
Custom Scripting:
Additional Considerations:
Serial Servos:
Testing and Calibration:
By following this setup, you can create an interactive robot controlled by VR Oculus with ARC as the central control system. If you have further questions or need more detailed guidance, feel free to ask!
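If you choose the custom-scripting option mentioned above for an extra motor channel, a minimal JavaScript sketch along these lines can drive a motor directly. The ports d10, d11, and d12 are placeholders; match them to your H-Bridge wiring.

```javascript
// Drive one extra H-Bridge channel directly from a script.
// Assumes IN1/IN2 of the H-Bridge are wired to D10/D11 and the enable
// (PWM) pin to D12 - adjust these ports to your actual wiring.
Digital.set("d10", true);   // direction pin 1
Digital.set("d11", false);  // direction pin 2
PWM.set("d12", 60);         // speed as a duty-cycle percentage
sleep(2000);                // run for two seconds
PWM.set("d12", 0);          // stop the motor
```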
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
We have moved your question into a dedicated thread to get the most out of the Athena AI. This will ensure your question gets the attention it deserves to ensure your robot's success.
ARC on a Pi with Windows 11 should be acceptable hardware if there aren't too many additional features running.
You mentioned Oculus, is that an Oculus Quest? If so, the Oculus Quest robot skill will be the one you'll want to use: https://synthiam.com/Support/Skills/Virtual-Reality/Oculus-Quest-Robot?id=21158
Because you will be running ARC on a Pi with a vision camera, VR processing, etc., I recommend ensuring the operating system's configuration is optimized for performance. There is a great guide here: https://synthiam.com/Support/Tips-to-Make-a-Robot/free-space-with-windows-10
Choosing a microcontroller for the servos can be easy and affordable. For example, an Arduino with a servo shield is always a good, cost-effective option. That connects to the Pi via a USB cable, and the servos are powered by a more powerful power source. Choosing a controller is part of step #3 of the guide here: https://synthiam.com/Support/Get-Started/how-to-make-a-robot/EZB You will find the respective controller firmware by selecting it from that page.
To better understand what you mean by "4 motors," we'd need to know how they're used. Do you have a motor for each of the four wheels of the robot (i.e., 4x4 drive)? Or do you have four motors throughout the robot for moving arms or levers? Explaining how the motors are used will help us direct you to the correct robot skill and configuration.
4x4 drive wheels, and 4 serial bus servos for each arm of the robot.
If you have 4x4 wheels, you do not need more than one hbridge robot skill. Both HBridges will share the same I/O from the microcontroller to the HBridge. @Athena will explain more.
This is the robot skill you would want to use: https://synthiam.com/Support/Skills/Movement-Panels/Dual-HBridge-w-PWM?id=16067
Ensure it's the Movement Panel type to register as a movement panel. Click on the word "Movement Panel" to read more about it. There's a great support document.
One more thing, this step of the getting started guide for building a robot explains the Movement Panel types: https://synthiam.com/Support/Get-Started/how-to-make-a-robot/Movement-Style
For a 4x4 Drive System Using a Single H-Bridge Robot Skill
It is indeed possible to share the I/O from the microcontroller to the H-Bridge modules. Here's how you can set it up:
Setting Up 4x4 Drive with Dual H-Bridge
Dual H-Bridge Module
Wiring Configuration
ARC Configuration
Movement Panel
Controlling Serial Bus Servos for Arms
Arduino Mega Setup
ARC Integration
Servo Control Logic
Additional Resources
Movement Panel Guide
Review the Movement Style guide for more information on configuring movement panels in ARC.
Servo Control
Ensure your Arduino Mega is set up to handle serial communication with the servos and that the correct libraries are installed.
By following these steps, you can effectively control a 4x4 drive system and serial bus servos for the robot's arms using ARC and VR inputs. If you have further questions or need more detailed guidance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
@Athena, explain how two dual-channel H-Bridges can be connected to a microcontroller by sharing the same ports for the HBridge PWM movement panel.
Certainly! When using two dual-channel H-Bridges to control a 4x4 drive system, you can share the same microcontroller ports to manage all four motors. Here's how you can set it up:
Connecting Two Dual-Channel H-Bridges:
H-Bridge Configuration:
Microcontroller Port Sharing:
Wiring Diagram:
ARC Configuration:
Movement Panel:
Benefits of Port Sharing:
Considerations:
By sharing the same ports for both H-Bridges, you can efficiently control a 4x4 drive system using a single set of control signals. If you have any further questions or need additional assistance, feel free to ask!
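As a quick check once the panel is configured, a short JavaScript snippet can drive all four wheels through the single movement panel. The speed value below is just an example.

```javascript
// With the Dual HBridge w/PWM movement panel added, one movement command
// drives both H-Bridges (all four wheels) because they share the same
// direction and PWM lines.
Movement.forward(200);  // speed 0-255 (example value)
sleep(2000);            // drive forward for two seconds
Movement.stop();
```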
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
@Athena, how do I feed basic information about the robot so it can answer conversational questions such as "Say something about yourself," "What do you do?" and "How long will you run?" How do I preload the answers, and can you give me a sample script to load the answers for the robot to speak?
To preload answers for your robot to respond to conversational questions, you can use ARC's scripting capabilities to create a script that listens for specific phrases and responds accordingly. Here's a step-by-step guide and a sample script to get you started:
Step-by-Step Guide:
Add Speech Recognition Skill:
Configure Speech Recognition:
Create a Script for Each Phrase:
Use the Audio.say() or Audio.sayEZB() function to make the robot speak the response.
Assign Scripts to Phrases:
Sample Script:
Here's a sample script for the questions you mentioned:
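Each phrase gets its own small script. A minimal JavaScript sketch is below; replace the spoken text with your robot's own details, and use Audio.sayEZB() instead if the speaker is on the EZB.

```javascript
// --- Script assigned to the phrase "Say something about yourself" ---
Audio.say("I am an interactive robot. My brain runs ARC on a Raspberry Pi, and I am controlled through a VR headset.");

// --- Script assigned to the phrase "What do you do?" (a separate script in ARC) ---
Audio.say("I drive on four wheels and move my arms with serial servos.");

// --- Script assigned to the phrase "How long will you run?" (a separate script in ARC) ---
Audio.say("I can run for about two hours on a full battery charge.");
```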
Assigning Scripts:
Assign each script, e.g. aboutYourself(), whatDoYouDo(), and howLongWillYouRun(), to its matching phrase in the Speech Recognition skill.
Testing:
By following these steps, you can preload answers to common conversational questions and enhance your robot's interactivity. If you have any further questions or need additional assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
Given your goal of creating an AI-powered robot, I'd use one of the several AI robot skills available. Specifically, I'd use the Open AI GPT robot skill, but you can choose the one you think is best. Find them here: https://synthiam.com/Support/Skills/Skills-Overview#Artificial-Intelligence
Please do not comment about unrelated topics any further on other threads. Keep your conversation about the robot that you are building in this thread. It helps us organize the forum for others who learn from reading manuals and watching others receive assistance.
I plan to use a Raspberry Pi 5 in my robot, with Windows 11 installed and ARC running on it. The information provided on the website regarding the setup of single-board computers (SBCs) seems outdated and unclear. Could you please provide more detailed and up-to-date instructions on how to configure a headless setup for the Raspberry Pi to automatically run ARC at startup?
The getting started guide will be of use to you.
By scrolling up, as mentioned in point #4, you'll see all the answers given to you. If you spend an hour focusing on the answers, you will have the robot running.
I believe you are distracting yourself with too many goals and not focusing on a single task. Start by splitting your goal into smaller micro-goals. Tackle each micro-goal in order.
In case you are not able to find your account profile to see past posts you have created, the link is here: https://synthiam.com/User/Profile/kslaxme
Follow that link to see the multiple topics you have created and the answers that have been given.
@Athena, how do I feed the X, Y, Z values from other skills, such as the Oculus Quest robot skill (for example, the right controller's position), to the Inverse Kinematics skill to control servos?
To feed the X, Y, Z values from other skills, such as the Oculus Quest robot skill, to the Inverse Kinematics skill in ARC, you can use the ControlCommand() function to pass the coordinates. Here's a step-by-step guide on how to achieve this:
Identify the Source of X, Y, Z Values:
Access the Values:
Use the getVar() function to access the global variables that store the X, Y, Z values from the Oculus Quest robot skill (see the sketch after these steps).
Send Values to Inverse Kinematics:
Use the ControlCommand() function to send these values to the Inverse Kinematics skill. Replace InverseKinematicsArm with the actual name of your Inverse Kinematics skill in ARC.
Integrate into a Script:
Test and Adjust:
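Here is a rough JavaScript sketch of those steps. The global variable names and the ControlCommand parameters are placeholders; check the Oculus Quest skill's published variables and the Inverse Kinematics skill's ControlCommand cheat sheet in ARC for the exact names.

```javascript
// Read the right-controller position from global variables.
// NOTE: the variable names are assumptions - use ARC's Variable Watcher
// to confirm what the Oculus Quest robot skill actually publishes.
var x = getVar("$RightControllerX", 0);
var y = getVar("$RightControllerY", 0);
var z = getVar("$RightControllerZ", 0);

// Forward the coordinates to the Inverse Kinematics robot skill.
// NOTE: "Inverse Kinematic Arm" and "MoveTo" are placeholders - copy the
// exact ControlCommand from the skill's cheat sheet in the script editor.
ControlCommand("Inverse Kinematic Arm", "MoveTo", x, y, z);
```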
By following these steps, you can effectively control the servos of your robot arm using the X, Y, Z values from the Oculus Quest right controller or any other compatible input device. If you encounter any issues or need further assistance, feel free to ask!
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
It seems like it doesn't have any relevant global variables representing the left and right controller positions. How can I define that data as a variable and use it in a script?
We moved your question into your robot thread because it concerns using control commands and has nothing to do with the Oculus VR robot skill. Let me explain further.
You can send commands to other robot skills using a command called ControlCommand(). It is highlighted in the Getting Started guide. Here is a great link to better understand it. I encourage you to read the links that we provide: https://synthiam.com/Support/Programming/control-command
Now that you have read that link, let me provide you with more information on how to do what you're trying to do.
This is how you should do it
Your question doesn't make sense given what you are attempting to do, because the Oculus VR robot skill already controls the servos. You specify the servo ports and positions, which will move with the hand position. For example, you can have up to 255 servos on a single servo setting in ARC. This is done by pressing the ADVANCED button when selecting servo parameters. It is documented in the guide; here is a link: https://synthiam.com/Support/ARC-Overview/Servo-Controls
The Oculus VR robot skill is already designed to move the servos into position based on the values provided by the Oculus VR headset. You do not need to use extra robot skills. You should read the manual for the Oculus VR headset. The manual for the Oculus VR headset is detailed and explains how it works. Here is a link for you to read it: https://synthiam.com/Support/Skills/Virtual-Reality/Oculus-Quest-Robot?id=21158
This is what you want to do
You're asking to take arbitrary servo positions and, contradicting the Oculus VR robot skill manual, treat them as Cartesian coordinates. While that doesn't make sense and won't provide any practical result, I'll explain how you "can" do it. Remember, what you'll do by following these instructions is prove to yourself that you do not need to do this. But this exercise might help you learn a bit, which I think you will value greatly.
In the Oculus VR robot skill, instead of selecting real ports (d0, d2, d6, etc), choose vx (virtual ports). I'm unsure of the range because this solution doesn't make sense, but you can try it anyway. So, the range can be set to Min:1 and Max: 180. If you don't know what a Virtual Port is, it's because you are not reading any of the links. To understand the difference between ports, here's the link about servos again: https://synthiam.com/Support/ARC-Overview/Servo-Controls
Now that you have read the above link, you will know what a virtual port is.
Let's add a script robot skill. In the script robot skill, you'll want to create a script. Here is a script to take the virtual port values you specified and send them to the inverse kinematics robot skill. You can change the ports in my example script to match yours. You may also need to change the robot skill names in the ControlCommand to match yours.
I have added comments to this JAVASCRIPT example to show how to use it. The ports are v0, v1, and v2 for the x, y, and z values, respectively. As I mentioned above, you may need to change some of these for your usage.
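A sketch of that script is below. The skill name "Inverse Kinematic Arm" and the "MoveTo" command are placeholders; copy the exact ControlCommand syntax from the cheat sheet in your project's script editor.

```javascript
// Read the virtual port values that the Oculus Quest robot skill writes.
// v0, v1, and v2 were configured as the X, Y, and Z servos in that skill.
// Loop so the arm keeps following the controller.
while (true) {
  var x = Servo.getPosition("v0");
  var y = Servo.getPosition("v1");
  var z = Servo.getPosition("v2");

  // Send the values to the inverse kinematics robot skill.
  // NOTE: the skill name and command below are placeholders - use the
  // ControlCommand cheat sheet in the script editor for the exact syntax.
  ControlCommand("Inverse Kinematic Arm", "MoveTo", x, y, z);

  sleep(100); // update roughly ten times per second
}
```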
Support, that's not entirely true. The end-effector servo positions are normalized to a servo range from the Cartesian coordinates, so it'll "sort of" work. Your script will work okay, I think, but the values will have to be adjusted by a multiplier. This is because the Inverse Kinematics robot skill expects cm. So the range of 1-180 might work, but that's 180 cm, which is almost 2 meters.
Anyway, it'd be worth trying the way it is and adding a multiplier to the x, y, z values if necessary.
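For example, the loop above could scale the values before sending them. The 0.1 factor here is only illustrative; tune it for your arm's reach in cm, and remember the skill name and command are still placeholders.

```javascript
// Scale the 1-180 virtual-port range down before sending it to the
// inverse kinematics skill, which expects centimeters.
var scale = 0.1; // illustrative value - adjust for your arm's reach
ControlCommand("Inverse Kinematic Arm", "MoveTo", x * scale, y * scale, z * scale);
```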