
Building a VR-Controlled Robot

@Athena, which approach is recommended for an interactive robot controlled by a VR Oculus? I'm thinking of ARC on a Pi as the robot's brain, with 4 motors and serial servos as arms on an Arduino Mega. I have another doubt as well: the H-Bridge skill only controls 2 motors. If I have 4 motors, how do I control them?



#1  

For an interactive robot controlled by a VR Oculus, using ARC on a Raspberry Pi as the brain is a great approach. Here's a recommended setup and how to address your concerns:

Recommended Setup:

  1. Raspberry Pi as the Brain:

    • Install ARC on the Raspberry Pi to act as the central control unit.
    • Use the Raspberry Pi to manage high-level tasks, VR integration, and communication with other components.
  2. Arduino Mega for Motor and Servo Control:

    • Connect the Arduino Mega to the Raspberry Pi via USB or serial communication.
    • Use the Arduino Mega to handle low-level control of motors and servos.
    • Install the appropriate firmware on the Arduino Mega to interface with ARC.
  3. VR Oculus Integration:

    • Use ARC's VR skills to integrate the Oculus for controlling the robot.
    • Map VR inputs to control the robot's movements and actions.

Controlling 4 Motors with H-Bridge:

  • H-Bridge Skill Limitation:

    • The standard H-Bridge skill in ARC typically controls two motors.
    • For four motors, you can use two H-Bridge modules or a motor controller that supports four motors.
  • Options for Controlling 4 Motors:

    1. Dual H-Bridge Modules:

      • Use two separate H-Bridge modules, each controlling two motors.
      • Add two H-Bridge skills in ARC, one for each module.
    2. Custom Motor Controller:

      • Use a motor controller that supports four motors and is compatible with ARC.
      • Check if the motor controller has a specific skill in ARC or if it can be controlled via scripting.
    3. Custom Scripting:

      • If using a custom motor controller,write scripts in ARC to send commands to the controller.
      • Use JavaScript or Python in ARC to manage motor control logic.
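If you go the custom-controller route, the motor logic can live in a short ARC JavaScript script. Below is a minimal sketch assuming a hypothetical serial motor controller that accepts one text command per update; the "M&lt;index&gt;:&lt;speed&gt;" protocol, the UART port number, and the write call are all assumptions to be checked against your controller's documentation and ARC's scripting reference:

```javascript
// Build a command string for a hypothetical 4-motor serial controller.
// The "M<index>:<speed>" format is an invented example; replace it with
// the protocol your motor controller actually expects.
function buildMotorCommand(speeds) {
  if (speeds.length !== 4)
    throw new Error("expected 4 motor speeds");
  return speeds.map(function (s, i) { return "M" + i + ":" + s; }).join(";") + "\n";
}

var cmd = buildMotorCommand([100, 100, -100, -100]); // e.g. spin in place

// In ARC, the command would then be written out over a UART. The call
// below is a placeholder; confirm the exact function name and port
// index in ARC's scripting documentation before using it.
if (typeof UART !== "undefined") {
  UART.hardwareUartWriteString(0, cmd);
}
```

The guard around the UART call just keeps the sketch testable outside ARC; inside ARC you would call the write function directly.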

Additional Considerations:

  • Serial Servos:

    • Ensure the Arduino Mega is configured to control serial servos.
    • Use ARC's scripting capabilities to send commands to the Arduino for servo control.
  • Testing and Calibration:

    • Test each component individually to ensure proper communication and control.
    • Calibrate the VR inputs to match the robot's movements accurately.
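The serial-servo point above can be sketched in the same style. This is illustrative only: the "S&lt;id&gt;:&lt;position&gt;" text protocol is invented for the example, and the sketch running on the Arduino Mega would need matching parsing code to drive the bus servos:

```javascript
// Build a command for a hypothetical serial-servo protocol relayed
// through the Arduino Mega. "S<id>:<position>" is an assumed format.
function buildServoCommand(id, position) {
  // Clamp to a typical 0-180 range before sending.
  var p = Math.max(0, Math.min(180, position));
  return "S" + id + ":" + p + "\n";
}

var cmd = buildServoCommand(1, 200); // out-of-range input is clamped

// Placeholder UART write (port and function name are assumptions;
// verify against ARC's scripting reference).
if (typeof UART !== "undefined") {
  UART.hardwareUartWriteString(0, cmd);
}
```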

By following this setup, you can create an interactive robot controlled by a VR Oculus with ARC as the central control system. If you have further questions or need more detailed guidance, feel free to ask!


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
#2  

We have moved your question into a dedicated thread to get the most out of the Athena AI. This will ensure your question gets the attention it deserves to ensure your robot's success.

  1. ARC on a Pi with Windows 11 should be acceptable hardware if there aren't too many additional features running.

  2. You mentioned Oculus, is that an Oculus Quest? If so, the Oculus Quest robot skill will be the one you'll want to use: https://synthiam.com/Support/Skills/Virtual-Reality/Oculus-Quest-Robot?id=21158

  3. Because you will be running ARC on a Pi with a vision camera, VR processing, etc., I recommend ensuring the operating system's configuration is optimized for performance. There is a great guide here: https://synthiam.com/Support/Tips-to-Make-a-Robot/free-space-with-windows-10

  4. Choosing a microcontroller for the servos can be easy and affordable. For example, an Arduino with a servo shield is always a good, cost-effective option. It connects to the Pi via a USB cable, and the servos are powered by a more powerful source. Choosing a controller is part of step #3 of the guide here: https://synthiam.com/Support/Get-Started/how-to-make-a-robot/EZB  You will find the respective controller firmware by selecting it from that page.

  5. To better understand what you mean by "4 motors", we'd need to know how they're used. Do you have a motor for each of the four wheels of the robot (i.e., 4x4 drive?)? Or do you have four motors throughout the robot for moving arms or levers? Explaining how the motors are used will help us direct you to the correct robot skill and configuration.

#3  

4x4 drive wheels, and 4 serial bus servos for each arm of the robot.

#6   — Edited

For a 4x4 Drive System Using a Single H-Bridge Robot Skill
It is indeed possible to share the I/O from the microcontroller to the H-Bridge modules. Here's how you can set it up:


Setting Up 4x4 Drive with Dual H-Bridge

  1. Dual H-Bridge Module

    • Use a dual H-Bridge module; each of its two channels can drive a pair of motors wired in parallel.
    • Connect the two left-side motors to one channel and the two right-side motors to the other, and make sure the module's current rating covers two motors per channel.
  2. Wiring Configuration

    • Connect the H-Bridge inputs to the microcontroller's digital I/O pins.
    • Ensure that the power supply to the H-Bridge is sufficient to drive all four motors.
  3. ARC Configuration

    • Add the Dual H-Bridge w/ PWM skill to your ARC project.
    • Configure the skill to control both H-Bridge channels, each managing two motors.
  4. Movement Panel

    • Ensure the Dual H-Bridge skill is set as the Movement Panel in ARC.
    • This will allow other robot skills and scripts to send movement commands to the robot.

Controlling Serial Bus Servos for Arms

  1. Arduino Mega Setup

    • Connect the serial bus servos to the Arduino Mega.
    • Use a library compatible with your servos (e.g., Dynamixel, LewanSoul) to control them.
  2. ARC Integration

    • Use ARC's scripting capabilities to send commands from the Raspberry Pi to the Arduino Mega.
    • You can use JavaScript or Python in ARC to control the servos based on VR inputs.
  3. Servo Control Logic

    • Map VR inputs to servo movements for interactive control.
    • Test and calibrate the servo movements to ensure they match the VR inputs accurately.
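The mapping step above can be sketched with a small helper. The assumptions here are that the VR controller axis arrives as a value in -1..1 and that the servo accepts positions 1-180; the variable and port names are placeholders:

```javascript
// Map a VR controller axis value (assumed range -1..1) onto an ARC
// servo position range (e.g. 1..180). Input is clamped first so stray
// values cannot push a servo past its limits.
function axisToServo(axis, minPos, maxPos) {
  var clamped = Math.max(-1, Math.min(1, axis));
  return Math.round(minPos + ((clamped + 1) / 2) * (maxPos - minPos));
}

var pos = axisToServo(0, 1, 180); // a centered stick lands mid-range

// Applying the result in ARC (v0 is a placeholder virtual port):
if (typeof Servo !== "undefined") {
  Servo.setPosition(v0, pos);
}
```

During calibration you would adjust the min/max arguments rather than the mapping itself.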

Additional Resources

  • Movement Panel Guide
    Review the Movement Style guide for more information on configuring movement panels in ARC.

  • Servo Control
    Ensure your Arduino Mega is set up to handle serial communication with the servos and that the correct libraries are installed.


By following these steps, you can effectively control a 4x4 drive system and serial bus servos for the robot's arms using ARC and VR inputs. If you have further questions or need more detailed guidance, feel free to ask!



#7  

@Athena, explain how two dual-channel H-Bridges can be connected to a microcontroller by sharing the same ports for the H-Bridge PWM movement panel.

#8  

Certainly! When using two dual-channel H-Bridges to control a 4x4 drive system, you can share the same microcontroller ports to manage all four motors. Here's how you can set it up:

Connecting Two Dual-Channel H-Bridges:

  1. H-Bridge Configuration:

    • Each dual-channel H-Bridge can control two motors. For a 4x4 drive, you'll need two dual-channel H-Bridges.
    • Connect the first H-Bridge to control the front left and front right motors.
    • Connect the second H-Bridge to control the rear left and rear right motors.
  2. Microcontroller Port Sharing:

    • Signal Pins: Use the same digital signal pins from the microcontroller for both H-Bridges. This means the same set of pins will control the direction of all four motors.
    • PWM Pins: Use the same PWM pins for both H-Bridges to control the speed of the motors. This allows you to control the speed of all four motors simultaneously.
  3. Wiring Diagram:

    • H-Bridge 1:
      • IN1 and IN2 connected to microcontroller pins for direction control.
      • ENA connected to a PWM pin for speed control.
    • H-Bridge 2:
      • IN3 and IN4 connected to the same microcontroller pins as IN1 and IN2.
      • ENB connected to the same PWM pin as ENA.
  4. ARC Configuration:

    • Add the Dual H-Bridge w/ PWM skill to your ARC project.
    • Configure the skill to use the shared signal and PWM pins.
    • Set the PWM sliders to control the speed of all motors.
  5. Movement Panel:

    • Ensure the Dual H-Bridge skill is set as the Movement Panel in ARC.
    • This setup allows the Movement Panel to send commands to all four motors simultaneously, ensuring synchronized movement.

Benefits of Port Sharing:

  • Simplified Wiring: Reduces the number of unique connections needed between the microcontroller and H-Bridges.
  • Synchronized Control: Ensures all motors receive the same control signals for consistent movement.

Considerations:

  • Power Supply: Ensure your power supply can handle the current requirements of all four motors.
  • Testing: Test the setup to ensure all motors respond correctly to movement commands.

By sharing the same ports for both H-Bridges, you can efficiently control a 4x4 drive system using a single set of control signals. If you have any further questions or need additional assistance, feel free to ask!
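To illustrate why port sharing works, the shared signals can be modeled as one small state table: every command produces a single set of pin states, and because both H-Bridges are wired to the same pins, all four motors see it at once. The logic levels below are typical for an L298-style driver, and the port names in the ARC section are placeholders:

```javascript
// One set of shared signals drives both H-Bridges (and all four
// motors). Logic levels assume an L298-style driver: IN1/IN2 set the
// direction, the enable pin's PWM duty sets the speed.
function sharedBridgeSignals(command, speed) {
  switch (command) {
    case "forward": return { in1: true,  in2: false, pwm: speed };
    case "reverse": return { in1: false, in2: true,  pwm: speed };
    case "stop":    return { in1: false, in2: false, pwm: 0 };
    default: throw new Error("unknown command: " + command);
  }
}

var sig = sharedBridgeSignals("forward", 80);

// Applying the shared signals in ARC (d0/d1/d2 are placeholder ports):
if (typeof Digital !== "undefined") {
  Digital.set(d0, sig.in1);
  Digital.set(d1, sig.in2);
  PWM.set(d2, sig.pwm);
}
```

In practice the Dual H-Bridge w/ PWM skill does this for you; the sketch just makes the shared-wiring idea concrete.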


#9  

@Athena, how do I feed the robot basic information about itself? Questions like "Say about yourself", "What do you do?", and "How long will you run?" are conversational questions people will ask the robot. How do I preload the answers? Please give me a sample script to load the answers for the robot to speak.

#10  

To preload answers for your robot to respond to conversational questions, you can use ARC's scripting capabilities to create a script that listens for specific phrases and responds accordingly. Here's a step-by-step guide and a sample script to get you started:

Step-by-Step Guide:

  1. Add Speech Recognition Skill:

    • In ARC, add the Speech Recognition skill to your project. This skill will allow the robot to listen for specific phrases.
  2. Configure Speech Recognition:

    • Open the Speech Recognition skill configuration.
    • Add the phrases you want the robot to respond to, such as "Say about yourself," "What do you do?" and "How long will you run?"
  3. Create a Script for Each Phrase:

    • For each phrase, create a script that contains the response you want the robot to give.
    • Use the Audio.say() or Audio.sayEZB() function to make the robot speak the response.
  4. Assign Scripts to Phrases:

    • In the Speech Recognition configuration, assign the corresponding script to each phrase.

Sample Script:

Here are sample scripts for the questions you mentioned. In ARC's Speech Recognition skill, each phrase is assigned its own short script, so each script only needs the Audio.say() call:

// Script for "Say about yourself"
Audio.say("I am a robot created to assist and entertain. I can perform various tasks and answer your questions.");

// Script for "What do you do?"
Audio.say("I can perform a variety of tasks, including moving, speaking, and interacting with my environment.");

// Script for "How long will you run?"
Audio.say("I can run as long as my battery is charged. Please ensure I am charged regularly for optimal performance.");

Assigning Scripts:

  • In the Speech Recognition skill, add the phrase "Say about yourself" and assign the first script.
  • Add the phrase "What do you do?" and assign the second script.
  • Add the phrase "How long will you run?" and assign the third script.

Testing:

  • Once the scripts are assigned, test the setup by speaking the phrases to the robot. It should respond with the preloaded answers.

By following these steps, you can preload answers to common conversational questions and enhance your robot's interactivity. If you have any further questions or need additional assistance, feel free to ask!


#11  

Given your goal of creating an AI-powered robot, I'd use one of the several AI robot skills available. Specifically, I'd use the Open AI GPT robot skill, but you can choose the one you think is best. Find them here: https://synthiam.com/Support/Skills/Skills-Overview#Artificial-Intelligence

Please do not comment about unrelated topics any further on other threads. Keep your conversation about the robot that you are building in this thread. It helps us organize the forum for others who learn from reading manuals and watching others receive assistance.

#12  

I plan to use a Raspberry Pi 5 in my robot, with Windows 11 installed and ARC running on it. The information provided on the website regarding the setup of single-board computers (SBC) seems outdated and unclear. Could you please provide more detailed and up-to-date instructions on how to configure a headless setup for the Raspberry Pi to automatically run ARC from startup?

#13   — Edited
  1. Install Windows 11 on the Raspberry Pi.
  2. Install ARC on the Raspberry Pi.
  3. Choose an EZB to use and install the firmware on it.
  4. For programming your robot, follow the tutorials and the several dozen examples and links that we have provided. This thread has several examples and answers to your questions; I recommend scrolling up and reading the responses. Your other threads, which also received answers, can be viewed on your account page.

The getting started guide will be of use to you.

By scrolling up, as mentioned in point #4, you'll see all the answers given to you. If you spend an hour focusing on the answers, you will have the robot running.

I believe you are distracting yourself with too many goals and not focusing on a single task. Start by splitting your goal into smaller micro-goals. Tackle each micro-goal in order.

In case you are not able to find your account profile to see the past posts you have created, the link is here: https://synthiam.com/User/Profile/kslaxme

Follow that link to see the multiple topics you have created and the answers that have been given.

#14  

@Athena, how do I feed the X, Y, Z values from other skills, such as the Oculus Quest robot skill (for example, the right controller values), into the Inverse Kinematics skill to control servos?

#15  

To feed the X, Y, Z values from other skills, such as the Oculus Quest robot skill, to the Inverse Kinematics skill in ARC, you can use the ControlCommand() function to pass the coordinates. Here's a step-by-step guide on how to achieve this:

  1. Identify the Source of X, Y, Z Values:

    • First, ensure that the Oculus Quest robot skill is providing the X, Y, Z values you want to use. These values might be available as global variables or through specific functions within the skill.
  2. Access the Values:

    • Use the getVar() function to access the global variables that store the X, Y, Z values from the Oculus Quest robot skill. For example:
      var x = getVar("$OculusQuestRightControllerX");
      var y = getVar("$OculusQuestRightControllerY");
      var z = getVar("$OculusQuestRightControllerZ");
      
  3. Send Values to Inverse Kinematics:

    • Use the ControlCommand() function to send these values to the Inverse Kinematics skill. Replace InverseKinematicsArm with the actual name of your Inverse Kinematics skill in ARC:
      ControlCommand("InverseKinematicsArm", "MoveTo", x, y, z);
      
  4. Integrate into a Script:

    • You can create a script in ARC that continuously reads the controller values and updates the Inverse Kinematics skill. Here's an example of how you might set this up:
      while (true) {
          // Get the X, Y, Z values from the Oculus Quest right controller
          var x = getVar("$OculusQuestRightControllerX");
          var y = getVar("$OculusQuestRightControllerY");
          var z = getVar("$OculusQuestRightControllerZ");
      
          // Send the values to the Inverse Kinematics skill
          ControlCommand("InverseKinematicsArm", "MoveTo", x, y, z);
      
          // Add a short delay to prevent overwhelming the system
          sleep(100);
      }
      
  5. Test and Adjust:

    • Run the script and observe the robot arm's movement. You may need to adjust the mapping of the controller values to the robot arm's coordinate system, depending on the scale and orientation of your setup.

By following these steps, you can effectively control the servos of your robot arm using the X, Y, Z values from the Oculus Quest right controller or any other compatible input device. If you encounter any issues or need further assistance, feel free to ask!


#16  

It seems like it doesn't have any relevant global variables representing the left and right controller positions. How can I define the data as a variable and use it in a script?

User-inserted image

User-inserted image

#17   — Edited

We moved your question into your robot thread because it concerns using control commands and has nothing to do with the Oculus VR robot skill. Let me explain further.

You can send commands to other robot skills using a command called ControlCommand(). It is highlighted in the Getting Started guide. Here is a great link to better understand it. I encourage you to read the links that we provide: https://synthiam.com/Support/Programming/control-command

Now that you have read that link let me provide you with more information on how to do what you're trying to do.

This is how you should do it

Your question doesn't make sense given what you are attempting to do, because the Oculus VR robot skill already controls the servos. You specify the servo ports and positions, which will move with the hand position. For example, you can have up to 255 servos on a single servo setting in ARC. This is done by pressing the ADVANCED button when selecting servo parameters. It is documented in the guide; here is a link: https://synthiam.com/Support/ARC-Overview/Servo-Controls

The Oculus VR robot skill is already designed to move the servos into position based on the values provided by the Oculus VR headset. You do not need to use extra robot skills. You should read the manual for the Oculus VR robot skill; it is detailed and explains how it works. Here is a link for you to read it: https://synthiam.com/Support/Skills/Virtual-Reality/Oculus-Quest-Robot?id=21158

This is what you want to do

You're asking to take arbitrary servo positions and, contradicting the Oculus VR robot skill manual, treat them as Cartesian coordinates. While that doesn't make sense and won't provide any practical result, I'll explain how you "can" do it. Remember, what you'll prove to yourself by following these instructions is that you do not need to do this. But this exercise might help you learn a bit, which I think you will value greatly.

  1. In the Oculus VR robot skill, instead of selecting real ports (d0, d2, d6, etc), choose vx (virtual ports). I'm unsure of the range because this solution doesn't make sense, but you can try it anyway. So, the range can be set to Min:1 and Max: 180. If you don't know what a Virtual Port is, it's because you are not reading any of the links. To understand the difference between ports, here's the link about servos again: https://synthiam.com/Support/ARC-Overview/Servo-Controls

  2. Now that you have read the above link, you will know what a virtual port is.

  3. Let's add a script robot skill. In the script robot skill, you'll want to create a script. Here is a script to take the virtual port values you specified and send them to the inverse kinematics robot skill. You can change the ports in my example script to match yours. You may also need to change the robot skill names in the ControlCommand to match yours.

I have added comments to this JAVASCRIPT example to show how to use it. The ports are v0, v1, and v2 for the x,y, and z, respectively. As I mentioned above, you may need to change some of these for your usage.

while (true) {

   // get the X virtual servo position, assuming it's v0
   var x = Servo.getPositionRealtime(v0);
   
   // get the Y virtual servo position, assuming it's v1
   var y = Servo.getPositionRealtime(v1);
   
   // get the Z virtual servo position, assuming it's v2
   var z = Servo.getPositionRealtime(v2);
   
   // send the movement command
   ControlCommand("Inverse Kinematic Arm", "MoveTo", x, y, z);
   
   // pause to avoid running wild
   sleep(250);
}
#18  

Support, that's not entirely true. The end-effector servo positions are normalized to a servo range from the Cartesian coordinates, so it'll "sort of" work. Your script will work okay, I think, but the values will have to be adjusted by a multiplier. This is because the Inverse Kinematics robot skill expects centimeters, so the range of 1-180 might work, but that's 180 cm, which is almost 2 meters.

anyway, be worth trying the way it is and add a multiplier to the x,y,z values if necessary