Asked by RoboHappy (USA):
I have a question about the ARC robot skill called Inverse Kinematic Arm. Is there any video available that demonstrates this skill? I find the current instructions too vague, and it would be helpful to see the skill in operation through a video.
Related Hardware: EZ-B V4, by EZ-Robot
EZ-B v4/2 robot controller: dual Cortex ARM, Wi-Fi, audio/video, 24 servo/digital ports, I2C/UART, camera, and Synthiam ARC control for custom robots.
Connectivity: Wi-Fi / USB. Servos: 24. Digital: 24. UART: 3. ADC: 8. Camera, audio, and I2C supported.
Related Robot Skills:
- Script, by Synthiam: multi-language ARC Script. Build, run, and debug Blockly, JavaScript, EZ-Script, or Python with Intellisense, run/save/load, and Roboscratch support.
- Inverse Kinematic Arm, by Synthiam: control robot arms with inverse and forward kinematics, converting X/Y/Z positions into joint motion and reporting end-effector location. Requires ARC v12.
📚 This thread has been preserved as a helpful resource!
While new comments are no longer being accepted, the information here remains available for learning and reference. Check the most recent comment for additional details.

Short answer: there is no official video embedded on the Inverse Kinematic Arm skill manual page at this time. The skill manual (which contains examples and code snippets) is here:
https://synthiam.com/Support/Skills/Servo/Inverse-Kinematic-Arm?id=21839
Below is a concise how-to and example scripts so you can see the skill in action right away.
How to get a quick demo running in ARC:
1. Add the skill.
2. Open the skill settings.
3. Configure the joints (example below).
4. Initialize safely.
5. Test the inverse kinematics MoveTo command.
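The steps above can be sketched as an ARC-style script. This is a hedged sketch, not the official example: it assumes the skill window is titled "Inverse Kinematic Arm" and exposes a "MoveTo" ControlCommand taking X/Y/Z in millimeters; verify both against the skill's ControlCommand list in your ARC project. Because ControlCommand is an ARC builtin that does not exist outside ARC, the sketch stubs it so you can read and run the logic anywhere; delete the stub when pasting into ARC.

```javascript
// Sketch only: "Inverse Kinematic Arm" window title and "MoveTo"
// command name are assumptions -- check your skill's ControlCommand list.
// Stub of ARC's ControlCommand builtin so this sketch runs outside ARC;
// remove this stub when pasting into an ARC script.
const issued = [];
function ControlCommand(skillName, command, ...args) {
  issued.push({ skillName, command, args });
}

// Start near the home pose and make small moves first.
ControlCommand("Inverse Kinematic Arm", "MoveTo", 0, 0, 200);
ControlCommand("Inverse Kinematic Arm", "MoveTo", 50, 0, 150); // small step out

issued.forEach(c => console.log(`${c.command}(${c.args.join(", ")})`));
```

In ARC the two ControlCommand calls would drive the arm; keep the first targets close to the home position so a misconfigured joint cannot slam the arm.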
For more help, see the skill manual linked above or ask in this forum. Tell me which demo you would like, and I'll prepare the exact demo steps or project based on your answers.
I am a customer support GPT-powered AI and still learning. If you have doubts about my response, please use the support documentation to verify my answer.
While there is no video (one is difficult to create because every arm configuration can be so different), we have recreated the manual to provide far more in-depth information that should help you get across the finish line.
Thanks - that matches what Synthiam_Support explained: a single generic video is hard because every robot arm is different. I can help you get the same benefits a video would provide by walking you through a clear, visual-style checklist and providing ready-to-run demo code or a sample ARC project you can load and test.
Below I'll give:
- Useful links
- Step-by-step: add & open the skill
- Configure joints (kebab menu -> Configure)
- Safety & test prep (do these before inverse moves)
- Basic JavaScript tests (use the skill window title exactly, including if you changed it)
- Camera + pick example (only after the geometry is verified)
- Troubleshooting checklist (most common issues)
- Saving & sharing your config
- What I can do for you
Next step: tell me which of these you want me to prepare, and I'll put together the demo ARC project, a tailored configuration, or a step-by-step guide based on your choice.
I still would love to see a video of just any configuration. In the meantime, I'd love to see what could be created here.
My specs for one of the arms, say the left arm: the arm hangs straight down from a shoulder servo, so in the home position all servos are at a set position of 90 degrees. All servos are EZ-Robot HDD type, including the gripper. The arm has 5 DOF:
- v11 = left shoulder servo (rotation), base of arm
- v12 = left arm servo (lever), 110 mm long
- v13 = left elbow servo (lever), 110 mm long
- v14 = left wrist servo (lever), 110 mm long
- v15 = left gripper servo (lever), at the end of the arm
All arm servos "should" be at 90 degrees for center and swing from 1 to 180 degrees.
I will have a camera, not on the arm but on the head: a d0 lever servo (90-degree centered) for camera left/right and a d1 lever servo (90-degree centered) for camera up/down. The camera will look down to see the arm.
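As a quick sanity check of the geometry described above, the three 110 mm links can be run through planar forward kinematics: with every servo at its 90-degree center the arm should hang straight down with the wrist end 330 mm below the shoulder (gripper length excluded, since it has not been measured). This is a standalone math sketch, not ARC code.

```javascript
// Planar forward kinematics for the described left arm:
// three 110 mm links, 90 degrees = servo center, arm hangs
// straight down at home. Gripper length not included (unknown).
const LINKS_MM = [110, 110, 110];

function planarForward(anglesDeg) {
  let x = 0, y = 0, heading = 0;
  anglesDeg.forEach((a, i) => {
    heading += (a - 90) * Math.PI / 180; // 90 deg center => 0 rad offset
    x += LINKS_MM[i] * Math.sin(heading);
    y -= LINKS_MM[i] * Math.cos(heading); // negative y = below shoulder
  });
  return { x, y };
}

const home = planarForward([90, 90, 90]);
console.log(home); // home pose: x = 0, y = -330 (330 mm straight down)
```

If your measured hanging length differs from 330 mm plus the gripper, one of the bone lengths in the config is wrong, which is the most common cause of IK misses.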
Great - thanks for the details. I'll create a ready-to-run, concrete example you can paste into ARC and use as a starting point for your left arm. Everything here uses your specs (5 DOF arm, servos centered at 90 degrees, three arm bones of 110 mm each). If you want, I can turn this into a downloadable ARC project next.
Suggested joint entries (the config uses millimeters (mm) for the lengths):
- Joint #1 (base / rotation)
- Joint #2 (shoulder -> upper-arm lever)
- Joint #3 (elbow)
- Joint #4 (wrist)
- Joint #5 (gripper)
Important: save this configuration in the config window: File -> Save. That produces a .kinematic file you can share/back up.
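To see why every joint entry matters, here is the core of what the skill computes when it converts an X/Y target into joint angles (per the skill description above). This is a simplified two-link planar version using the law of cosines, with both links set to your 110 mm bones; the real skill solves the full chain including base rotation, so treat this as an illustration only.

```javascript
// Simplified 2-link planar inverse kinematics (law of cosines),
// using the 110 mm shoulder and elbow bones from the config above.
// Illustration of the math only -- the skill solves the full chain.
const L1 = 110, L2 = 110; // link lengths in mm

function twoLinkIK(x, y) {
  const d = Math.hypot(x, y);
  if (d > L1 + L2) return null; // target out of reach
  // Elbow angle from the law of cosines (clamped for float safety).
  const cosElbow = (d * d - L1 * L1 - L2 * L2) / (2 * L1 * L2);
  const elbow = Math.acos(Math.min(1, Math.max(-1, cosElbow)));
  // Shoulder = direction to target minus the offset from the elbow bend.
  const shoulder = Math.atan2(y, x) -
    Math.atan2(L2 * Math.sin(elbow), L1 + L2 * Math.cos(elbow));
  const deg = r => r * 180 / Math.PI;
  return { shoulderDeg: deg(shoulder), elbowDeg: deg(elbow) };
}

console.log(twoLinkIK(220, 0)); // fully extended: both angles 0
console.log(twoLinkIK(400, 0)); // beyond 220 mm reach: null
```

Note that a wrong bone length shifts every solved angle, which is why the support team insists on exact measurements.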
Example Camera Device Tracking Start script (paste into Camera skill Tracking Start JS):
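The script body was not preserved in this thread, so here is a minimal sketch of the idea: linearly map the detected object's pixel coordinates (EZ-Robot camera default 320x240) into arm X/Y in millimeters, mirroring the Utility.map approach mentioned in the troubleshooting notes below. The min/max arm values are placeholders you must calibrate; they are not measured numbers.

```javascript
// Sketch: map camera pixel coordinates to arm coordinates (mm).
// The output ranges below are PLACEHOLDERS to be calibrated against
// a marker at known positions -- not real measurements.
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + (value - inMin) * (outMax - outMin) / (inMax - inMin);
}

function pixelToArm(px, py) {
  return {
    // 0..320 px -> -150..150 mm left/right (placeholder range)
    x: map(px, 0, 320, -150, 150),
    // 0..240 px -> 50..250 mm forward (placeholder range)
    y: map(py, 0, 240, 50, 250),
  };
}

console.log(pixelToArm(160, 120)); // image center -> { x: 0, y: 150 }
```

In ARC, the tracking-start script would read the detected object's position variables, run them through a mapping like this, and pass the result to the skill's move command.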
Which of these would you like next?
If you want the ARC project, say "Yes - build ARC project" and supply the gripper length + camera resolution and I’ll prepare the tailored files and exact values for open/close gripper positions.
Yes - build the ARC project. Gripper length unknown. Camera is the EZ-Robot camera (use defaults). The skill could be called "LeftArmTest".
Great - I'll build the LeftArmTest example for you. I assumed a gripper tip length of 50 mm (because you said it is unknown) and that the EZ-Robot camera uses the default 320x240 resolution. I'll show exactly how to create the ARC project (skill configuration via the kebab menu), give you safe initialization and demo scripts you can paste into ARC, and provide a camera-to-arm tracking script. After you test, give me the actual wrist-to-gripper-tip measurement and I'll update the configuration and provide a downloadable ARC project if you'd like.
Important assumptions (change these later if needed): gripper tip length = 50 mm; camera = EZ-Robot camera at the default 320x240.
Useful docs: the Inverse Kinematic Arm skill manual linked above.
Step A - Add & configure the Inverse Kinematic Arm skill (LeftArmTest):
- Joint 1 (base / shoulder rotation)
- Joint 2 (upper arm)
- Joint 3 (elbow)
- Joint 4 (wrist)
- Joint 5 (gripper)
Step B - Safety & manual checks (do these first)
Step C - Paste-and-run scripts. Copy these scripts into the ARC script editor or into action buttons; they use JavaScript and print() for debugging.
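A safe-initialization sketch in the spirit of Steps B and C: slow the arm servos down, then move every joint to its 90-degree center before attempting any inverse-kinematic moves. Servo.setSpeed and Servo.setPosition are ARC JavaScript builtins (check ARC's script reference for exact parameter types); they are stubbed here so the sketch runs outside ARC, and the string port names v11..v15 stand in for ARC's port identifiers.

```javascript
// Sketch: safe arm initialization before any IK moves.
// Stub of ARC's Servo namespace so this runs outside ARC; remove the
// stub when pasting into an ARC script (Servo is a builtin there).
const moves = [];
const Servo = {
  setSpeed: (port, speed) => moves.push(["speed", port, speed]),
  setPosition: (port, deg) => moves.push(["pos", port, deg]),
};

// Left-arm ports from the specs above (v11 shoulder .. v15 gripper).
const ARM_PORTS = ["v11", "v12", "v13", "v14", "v15"];
ARM_PORTS.forEach(p => Servo.setSpeed(p, 3));     // slow = safer first test
ARM_PORTS.forEach(p => Servo.setPosition(p, 90)); // home: hanging straight down
console.log(`${moves.length} servo commands issued`);
```

Running the home pose first confirms each port is wired to the joint you think it is before the IK skill is allowed to move several joints at once.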
Calibration tip: Place a colored marker at several known arm positions, detect it, and adjust the Utility.map min/max values until the gripper aligns with the marker.
Step D - Save & export
Troubleshooting quick checklist
Next steps - which would you like?
Which do you want me to do next? Also confirm if the camera resolution is indeed 320x240 (default) - otherwise provide the resolution.
You cannot have an unknown length for any joint, including the gripper (end effector). The math requires all values or the arm will not work. The measurements must be exact, with no exceptions.
athena, please explain in detail what this robot skill is and how it works