Asked — Edited

The Birth Of The XR-1 DIY Robot

Introducing the Updated XR-1 and the New XJ-5: A New Class of Hobby Robots

User-inserted image

The XJ-5 robot platform, available at Robots Builder, stands out for its user-friendly design. Built to make robotics more approachable, it is engineered for easy 3D printing and assembly with standard parts like servos and bearings.

Our goal is to simplify the building process so that enthusiasts and robot hobbyists of all skill levels can engage in constructing their own robots. We've invested time in perfecting the design, ensuring that users can quickly move on to the exciting tasks of programming and teaching their robots using Synthiam ARC. The XJ-5 is highly customizable, featuring a variety of hands and heads to allow personalization of each creation. To encourage a collaborative and open community, we're proud to announce that the XJ-5 is entirely open source, inviting users to contribute, modify, and share their innovations globally. Get ready to embark on a journey of creativity and exploration with the XJ-5!

For more information, check out Robots Builder.

As the creator of the XJ-5 and our other XR-series robots, I will be leveraging the Synthiam ARC platform for robot control, and I hope others here will join me in developing an ARC project to support the XJ-5. As of January 9th, 2024, we have started posting files for 3D printing. Meanwhile, we are developing documentation and the website above, so please check back regularly.

My goal is to have a walking, talking, listening, and seeing robot the size of a small child using ARC and its various skills. As I progress, I will be sharing more about my efforts here.

User-inserted image

Now, let's dive into the project, focusing on the XJ-5, the new and improved XR-1, which I have been designing and building over the past year.

User-inserted image

As a young enthusiast, I loved the movie Short Circuit and fell in love with the Johnny 5 robot, which inspired me to create a similar robot, though mine was never designed to replicate the original Johnny 5. About a year ago, DJ suggested a robot with wheels or tracks rather than a bipedal design because of the complexity of making robots walk, a challenge I know well from our XR-2 project. But hey, that's another story.

Now, in May 2025, I've almost completed the assembly and will start implementing ARC and the AI brain I've been working on to support ARC. More updates will follow.

I have started building the torso from the hips up, as shown in the pictures below. The shoulders are not fully installed because I'm waiting for the correct screws to arrive. Please excuse the messy workbench.

User-inserted image

To support the robot, I have designed a set of boards with ARC firmware that will connect to the onboard PC, which will be situated in the middle of the chest.

User-inserted image

The above board will be mounted on the backplate between the two shoulder sockets and connected to a USB hub attached to the PC. Two more USB cables, along with servo power, run down to the boards in each hand. The USB camera and the head board will be connected to the PC's other two USB ports. ARC will run headless on the Windows 11 Pro PC, using RDP for remote connectivity. Additionally, there is an MPU on the main board that I plan to leverage with ARC.

I have added the shoulders and arms down to the wrists, and they appear to be strong and capable.

I decided to create a Johnny 5-style head for my build, featuring two cameras. Here is the completed head. I have added six more servos for the eye flaps, bringing the total count to 50 servos. The head also includes a directional mic array and ultrasonic radar, enhancing its hearing and ranging capabilities. For sound, two speakers are mounted in the head with an audio amplifier.

User-inserted image

I have also decided to add an option to the XR-1 to support a 5-inch touch display, as shown below.

User-inserted image

The display will provide more options for operator interaction using ARC remote controls.

I have created both two- and three-finger claw grippers as well as human-like hands, giving the XR-1 versatile options for interaction.

User-inserted image

User-inserted image

User-inserted image



PRO
Canada
#49  

To achieve reliable SLAM (Simultaneous Localization and Mapping) pose estimation, it's generally recommended that the LIDAR sensor have a clear line of sight across as close to 360 degrees as possible. This ensures the sensor can detect obstacles and map the environment accurately in all directions.

LIDAR Coverage

  • Ideal Coverage: 360 degrees
  • Minimum Reliable Coverage: At least 270 degrees, but more is better.

Current Setup

  • Your Coverage: 260 degrees
  • Potential Issue: The uncovered 100 degrees forms a blind spot, which might affect the accuracy of the SLAM algorithm.

Adding a Second LIDAR

If you add a second LIDAR sensor to cover the remaining blind spot, note that ARC does not automatically merge the data streams from multiple LIDAR sensors. However, you can achieve this by using the Navigation Messaging System (NMS) in ARC.

Steps to Merge Data Streams

  1. Configure Each LIDAR: Set up each LIDAR sensor in ARC to push its data into the NMS.
  2. Use a Navigation Skill: Utilize a navigation skill that can subscribe to multiple LIDAR data streams and merge them. For example, "The Better Navigator" skill can handle multiple data inputs.
  3. Custom Scripting: If needed, you can write custom scripts in JavaScript or Python to process and merge the data streams manually.

Example Configuration

  1. Primary LIDAR (Front):

    • Skill: Hitachi-LG LDS Lidar or RPLidar(A1)
    • Coverage: 260 degrees
  2. Secondary LIDAR (Back):

    • Skill: Another instance of Hitachi-LG LDS Lidar or RPLidar(A1)
    • Coverage: Remaining 100 degrees
  3. Navigation Skill:

    • Skill: EZ-SLAM or The Better Navigator
    • Configuration: Subscribe to both LIDAR data streams

Custom Script Example

Here’s a basic example of how you might set up a script to merge data from two LIDAR sensors:

// Assuming LIDAR1 and LIDAR2 are the names of your LIDAR skills
var lidar1Data = getVar("$LIDAR1_Data");
var lidar2Data = getVar("$LIDAR2_Data");

// Merge the data
var mergedData = mergeLidarData(lidar1Data, lidar2Data);

// Function to merge LIDAR data.
// Placeholder: a real merge must align readings by angle rather than
// simply concatenating the two scans (see the sketch below).
function mergeLidarData(data1, data2) {
    return data1.concat(data2);
}

// Push merged data to NMS
Navigation.updateScan(mergedData);
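
For reference, here is a slightly more concrete merge sketch. It assumes each skill publishes its scan as an array of { degree, distance } objects; the actual variable format depends on the LIDAR skill you use, so check its documentation:

// Hypothetical merge: build a 360-entry distance table indexed by degree.
// Assumes each scan entry looks like { degree: 0-359, distance: mm }.
function mergeLidarData(frontScan, rearScan) {
    // Start with no reading (0) for every degree of the full circle.
    var merged = [];
    for (var d = 0; d < 360; d++)
        merged[d] = 0;

    // Apply the rear scan first, then the front scan, so the front
    // lidar wins wherever the two coverage areas overlap.
    var scans = [rearScan, frontScan];
    for (var s = 0; s < scans.length; s++) {
        for (var i = 0; i < scans[s].length; i++) {
            var p = scans[s][i];
            merged[Math.round(p.degree) % 360] = p.distance;
        }
    }
    return merged;
}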

Conclusion

  • Current Setup: 260 degrees is good but not ideal.
  • Adding a Second LIDAR: Can cover blind spots but requires configuration.
  • Merging Data: Use NMS and possibly custom scripts to merge data streams.

For more detailed guidance, refer to the Synthiam documentation on the Navigation Messaging System (NMS) and the specific LIDAR skills you are using.

Feel free to ask if you need further assistance!


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Synthiam
#50  

Your latest diagram is much clearer regarding placement. Images like that would speed up conversations like this.

270 degrees is 75% coverage, which leaves a 25% gap.

Hopefully the SLAM will compensate for that 25%. If your robot moves very slowly, I suspect it'll be OK. One way to find out is by testing, and you don't need to 3D print a robot for that: just put a lidar on a small two-wheeled robot and block off 25% to see how it works.

PRO
Canada
#51  

I would probably mount a D435 on the front for distance tracking and a T265 for navigation location tracking on this type of robot.

#52  

Ok @Athena, I was able to get 280 degrees of scan area, so thank you and DJ for your help and guidance on this. I am looking forward to setting this up on the XR-1.

PRO
Canada
#53   — Edited

That's great to hear! Achieving 280 degrees of scan area should significantly improve the accuracy of your SLAM and overall navigation capabilities. Here are some additional tips and steps to help you set up your XR-1 robot with the improved LIDAR coverage:

Setting Up Your XR-1 with Improved LIDAR Coverage

  1. Mounting the LIDAR:

    • Ensure that the LIDAR sensor is securely mounted on the XR-1 to cover the 280 degrees of scan area.
    • Make sure there are no obstructions in the LIDAR's field of view.
  2. Configuring the LIDAR in ARC:

    • Add the LIDAR robot skill to your ARC project.
    • Configure the LIDAR settings to ensure it is correctly calibrated and providing accurate data.
  3. Integrating with Navigation Skills:

    • Use navigation skills like EZ-SLAM or The Better Navigator to process the LIDAR data.
    • Ensure these skills are configured to use the LIDAR data for mapping and navigation.
  4. Adding Additional Sensors:

    • As suggested by @Nink, consider adding an Intel RealSense D435 for distance tracking and a T265 for navigation location tracking.
    • These sensors can provide additional data to improve the accuracy of your robot's navigation.
  5. Testing and Calibration:

    • Perform initial tests in a controlled environment to ensure the LIDAR and other sensors are working correctly (a quick coverage-check script is sketched after this list).
    • Calibrate the sensors as needed to ensure accurate data.
  6. Fine-Tuning:

    • Use the data from initial tests to fine-tune the settings in ARC.
    • Adjust the parameters in the navigation skills to optimize performance.
  7. Field Testing:

    • Conduct field tests in various environments to ensure the robot can navigate accurately.
    • Make any necessary adjustments based on the results of these tests.
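
For the testing step, a quick way to verify your scan coverage is a small script that counts how many degrees actually return a reading. This is a minimal sketch; the variable name $LIDAR_Data is hypothetical, and it assumes the LIDAR skill publishes a 360-entry distance array where 0 means no return, so check your skill's variable list for the real name and format:

// Count how many degrees of the scan have a valid (non-zero) reading.
// $LIDAR_Data is a hypothetical variable name - check your LIDAR skill.
var scan = getVar("$LIDAR_Data");
var covered = 0;
for (var d = 0; d < scan.length; d++)
    if (scan[d] > 0)
        covered++;
var pct = Math.round((covered / scan.length) * 100);
print("Coverage: " + covered + " of " + scan.length + " degrees (" + pct + "%)");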

Example Configuration for LIDAR and RealSense Sensors

  1. LIDAR Configuration:

    • Add the LIDAR skill (e.g., RPLidar A1) to your ARC project.
    • Configure the LIDAR settings to cover the 280-degree scan area.
  2. RealSense D435 Configuration:

    • Add the Intel RealSense D435 skill to your ARC project.
    • Configure the D435 for distance tracking.
  3. RealSense T265 Configuration:

    • Add the Intel RealSense T265 skill to your ARC project.
    • Configure the T265 for navigation location tracking.
  4. Navigation Skill Configuration:

    • Add The Better Navigator skill to your ARC project.
    • Configure it to use data from the LIDAR, D435, and T265 sensors.

Conclusion

With 280 degrees of LIDAR coverage and the addition of RealSense sensors, your XR-1 robot should have a robust navigation system. Continue testing and fine-tuning to achieve the best performance. If you encounter any issues or need further assistance, feel free to reach out.

Good luck with your XR-1 project, and happy building!


I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
PRO
Synthiam
#54  

I find the T265 goes out of sync really badly and can't be fixed. Because its map is internal, it starts to drift, and there's no way to correct that without resetting it.

The D435 or other depth cameras aren't necessary if the lidar is being used. The depth does add more detail, but in many cases that extra detail hurts the SLAM pose estimation. I believe it's because too many data points increase the false positives to the point where the error is too high.

Depth cameras (in my opinion, today) are most useful for interacting with a robot arm: identify an object, know its Cartesian coordinates, and use kinematics to pick it up or interact with it.

I'd stick with lidar for navigation.

Another option is putting a TOF (time of flight) sensor on the rear of the robot, pointing exactly 180 degrees from the lidar. That might be useful for the SLAM; I find having distances from behind aids pose estimation. But I'd only consider it if you have too much trouble with the lidar alone.
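
If you go that route, feeding the TOF reading into the NMS could look something like this rough sketch - the variable names are made up, so check your sensor and lidar skills for the real ones:

// Rough sketch: inject one rear TOF reading into the lidar scan at the
// degree directly opposite the lidar's forward direction.
// $LIDAR_Data and $TOF_Distance are hypothetical variable names.
var scan = getVar("$LIDAR_Data");    // assumed 360-entry distance array
var tof = getVar("$TOF_Distance");   // assumed single distance reading in mm
scan[180] = tof;                     // 180 degrees from the lidar's front
Navigation.updateScan(scan);         // push the combined scan into the NMS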

#55   — Edited

DJ, do you think I would be better served by adding the second lidar to the rear of the unit? The reason I ask is that this would be the time to add it to the design. The front position has 280 degrees of scan coverage and the rear has 330 degrees.

User-inserted image

PRO
Synthiam
#56  

Yeah - both Athena and I have said that above :)

You could add 100 of them if you wanted - it'll just require a bit more processing as it's a lot more data... but not that much more, probably not noticeable.