@Mickey666, I've done some deep digging into that plugin for Maya and watched about six hours of video. It will definitely let me get my face mocap out of Maya and onto my robot. It's not live, but I think that will be OK. The setup is pretty clunky but will work. I might have to get a plugin made to get the data back into ARC. Thanks for the link.
@fxrtst, where did you get to see the tutorials? Are they included when you buy the plugin? Are those helper objects doing anything more than extracting rotational values? It seems like the plugin has a lot of cool additional features. I can get everything I need to work within 3ds Max...but I could get a student version of Maya and try the plugin if you recommend it!
It is basically a playback recorder for the Pololu servo controller, right? I guess if you can find the file where the output is written, @DjSures will get a plugin going...he already said that he is working on a frame-by-frame importer for ARC. It might be flexible, or, if it requires a strict format, you could still reformat the plugin's output to match what the importer needs!
I went to the Stan Winston School to watch three videos by the author of the plugin (not free). Yes, he has put quite a lot into the plugin.
His helper objects and naming conventions let you use any model, or you can use his prebuilt servo objects to represent the motion of a servo. You can also attach mocap data as long as you follow his naming. You can animate just about anything and attach helpers.
Some things were unclear: in the videos it seems like you can only use translational Y for most things, but I'm not sure if that's a limitation of the plugin or just of his examples.
Also, you can use the educational version, but NOT Maya Light.
And yes, after you animate, you record and spit out the code to go to the controller, but it's limited to 8 KB files. That's about 20 seconds of animation for 18 servos.
A plugin would be easy, since you can see the order of the frames and positions in a text file.
My big question: can I use that data with the Dynamixels? I will most likely use those for my professional robots. Not sure how to get those two to play together.
Hey @Djsures, here is the template. A little late, but it's here. I'll post the script and instructions on how to use it shortly. This works reliably and is robust. We downloaded a bunch of random single-object animation scenes that had joints, and it worked for everything, so hopefully it will come in handy for the community you've built here.
(P.S. Not related to the thread, but I saw that post with your old video. You really are very personable, and even though I am new to the EZ-Robot community, I would personally love to see an AskDJ series!)
Ha @Mickey666Maus, that would be amusing to see. I'm sure if I were to wire it up, it would very likely tear itself apart quickly. Well, I just ordered the PC, EZ-B v4, and associated tech suggested by @Djsures, and we will see if this thing blows up once the plugin is made. Can't wait till I can post a project showcase! Unfortunately, it'll be about a month or so till I'm allowed to, but it will be fun.
The script will take a few more days to post. It was a bit of a pain to use, so I'm having him turn it into a UI- and UX-friendly plugin that asks for conversion formulas for slider joints, lets you add multiple scenes to the same CSV, or select a particular frame range instead of defaulting to the entire scene, etc. Things that make it super easy to use, even for newbies, or for people who can't animate at all but have a JD animation someone posted and want to program their JD.
@Ramon3141 This sounds so cool! Combined with the plugin, we would have a powerful animation tool for ARC and could add so much more personality to our robots! This is a great leap forward!
I watched the videos a few more times to see how the helpers work in conjunction with the animations, i.e. the animation is tied only to the Y axis of the helpers. See pic: the brow slider attaches to the Y of the brow helper, etc.
But I thought to myself, we should be able to stay away from the complicated math of solving the Euler angles. Let the program solve the angles and just grab the values passed on to the helpers.
Mocap data on mocap skeleton -> retarget to new skeleton bones -> bone rotation to helper single axis.
So each helper slider represents a single servo. That could be written out to a CSV and loaded into ARC from a plugin.
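As a rough sketch of that step (the -1..1 helper range and the 0-180 servo range are assumptions; Ramon's actual script will differ), mapping a helper's single-axis value to a servo position and dumping frames to CSV could look like:

```python
import csv

def helper_to_servo(value, in_min=-1.0, in_max=1.0, out_min=0, out_max=180):
    """Linearly map a helper's single-axis value to a servo position.

    The -1..1 input range is an assumption; use whatever range your
    helpers actually move through.
    """
    t = (value - in_min) / (in_max - in_min)
    return round(out_min + t * (out_max - out_min))

def export_csv(path, frames):
    """Write one row per frame; column i is the position for servo i."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(frames)
```

A plugin reading that CSV back would then only need to walk the rows frame by frame.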
When PTP was working on the live data from the Perception Neuron software, it was pushing out raw Euler angles, which he created math to solve. This skips over that step. You can do the same thing for IK solving: IK to retarget, then to helper. So again you avoid the Euler angle math.
@fxrtst Hey, I guess for whatever reason I turned off notifications for this thread...so I might have a bit of a delay at times!
You are absolutely right in what you are saying: if your 3D model matches your robot, all you have to do is extract the rotational values locally and store them in the format needed for further processing!
Extracting the values is not my problem. I do have a problem correctly setting up the IK for my robot, since mechanical rigging with all the dependencies and constraints is not an easy task...this is why I still use FK to get things done.
Also, I have never used Maya; I did everything in 3ds Max so far...
You know how, when trying to create something new, you end up jumping through so many programs to get the job done? Well, I've looked into different ways to accomplish this without too much extra effort, and to make it available for everyone.
I have Maya, Max, MotionBuilder, and LightWave, all paid programs. But I've never tried Blender, which is free and open source. I found that it's heavily scriptable in Python and pretty modular. I also found several people who have managed to get Blender to talk to servos via Arduino and pySerial, live in real time. If that is possible, then we should be able to get ARC to work nicely with Blender.
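The Blender-to-servo setups I've seen work roughly like this on the Python side (port name, baud rate, and the line protocol are all assumptions; the Arduino would need a matching sketch that parses positions off the serial line):

```python
import time

def frame_to_line(positions):
    """Format one frame of servo positions as a comma-separated line."""
    return ",".join(str(int(p)) for p in positions) + "\n"

def stream_frames(port, frames, fps=30):
    """Send frames to the Arduino over serial at a fixed rate."""
    import serial  # pyserial: pip install pyserial
    with serial.Serial(port, 115200, timeout=1) as ser:
        for frame in frames:
            ser.write(frame_to_line(frame).encode("ascii"))
            time.sleep(1.0 / fps)
```

Something like `stream_frames("COM3", frames)` would then push a whole animation out, with the port name depending on your machine.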
Blender supports FBX and 3DS, so you could export your Max scene.
I've downloaded it, and it's just like all the others. Once you learn one animation program, they are all basically the same. You just have to find the buttons.
Anyway, I'll explore more, download the two-servo test, try to get my head around it, and post more here.
@fxrtst Hhhhhhmmmmmm.....I guess it kind of depends on what you are up to!
Quote:
I have Maya, Max, MotionBuilder, and LightWave, all paid programs.
All of them are available for everyone to use if you agree to the terms of a student version...so if you are not aiming for a tool that is sold in tandem with your robot, you are on the safe side! If you do plan to do so, Blender would be a very good way to go!
Quote:
Blender supports FBX and 3DS, so you could export your Max scene.
Well, the scene depends very much on parameter wiring, kind of like what the plugin you are testing in Maya right now is doing...this cannot be used cross-platform.
Quote:
Once you learn one animation program, they are all basically the same. You just have to find the buttons.
Which is kind of true when scratching the surface...but once you dig in deeper, you will have to do a lot of scripting. The tools for rigging a model, and the general workflow, are not as easy to learn as shifting a box around your workspace!
Anyway, you can get stuff done in any of those 3D applications; the main problem is that we are still lacking a way to play back our CSV files in ARC.
I will be going to Japan for three weeks...so I am off the grid, but once I am back I will be more than happy to sit down and find a solution for this. I would not mind taking a look at Blender...the last time I tried it, I found the interface a bit clumsy, but that might have changed in the meantime!
Trust me, if this solution is Max-based, then I'm just as happy! I'm going to try this Blender thing because it is already written, and I can do some testing with animation and moving the servos.
You are right, the interface is a bit clunky, but it is pretty powerful. Funny thing: I thought it was something that would never make it this far, as far as competing with Autodesk and NewTek. But it's come a long way. In fact, LightWave development is so far behind Blender, and most companies are now offering a plugin for Blender and no longer for LightWave.
Have a great trip to Japan, chat when you get back.
@fxrtst I made a quick demo of how far I've gotten with this...I built an interface in 3ds Max to control the robot and edit animations. Those animations can be exported as CSV files. The demo also shows how to use the Google Assistant with a custom wake word; you can always enter your custom Dialogflow agent and return back to the Google Assistant! I did not really have time to put it together, since I am off to Japan tomorrow, but I hope it is OK enough to show where I want to go at some point!
The animation created can be exported to a CSV file...the only thing I have not managed so far is building stable playback software. I experienced problems playing back frame-by-frame motion in Arduino and ARC, but maybe @DJSures will build a plugin?
When I am back from Japan, I will try to use a Raspberry Pi with a servo controller board...there should be a solution; it's just a matter of fine-tuning on whatever platform we will be working on!
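For what it's worth, one common cause of stutter in frame-by-frame playback is letting per-frame delays accumulate. A drift-compensating loop schedules each frame against the start time instead; a sketch, where `send_frame` is a hypothetical function that pushes one frame of positions to the servos:

```python
import time

def play(frames, send_frame, fps=30):
    """Play frames at a fixed rate, compensating for processing time.

    send_frame is whatever pushes one frame of positions to the
    servos (hypothetical here).
    """
    period = 1.0 / fps
    start = time.perf_counter()
    for i, frame in enumerate(frames):
        send_frame(frame)
        # Sleep until the scheduled time of the next frame, so small
        # delays in send_frame don't accumulate into drift.
        next_due = start + (i + 1) * period
        delay = next_due - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
```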
@Mickey, that's a great video. I like the style of the music as well.
If you give me a sample copy of your CSV, I can easily make a plugin for you. There's a new plugin I created last week for @Will that has a few anti-bounce filters. I can incorporate one of those. No need to go the Raspberry Pi route and write a bunch of code *yuck*.
@DJSures I saw the plugin. I still need to fully explore what it does, but I know it can be very useful for a lot of things!
My CSV file looks like this: 90, 86, 15, 47, etc...starting from servo D0 up to the number of servos involved! So it might be cool to implement an option to state the number of servos being used in the robot!
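A reader for that format would then be tiny; a sketch, assuming one frame per row and column i mapping to servo Di:

```python
import csv

def read_animation(path):
    """Read a CSV where each row is one frame and column i is servo Di."""
    with open(path, newline="") as f:
        return [[int(v) for v in row if v.strip()]
                for row in csv.reader(f) if row]
```

The number of servos then just falls out of the row length, rather than needing to be stated separately.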
I want to use servo control on the Raspberry Pi because it runs the Google Assistant, and a local server is running on it anyway...so, with an additional servo controller board, it will eliminate the need to run a Windows computer.
But I will definitely run it on Windows and the EZ-B, so we will have common ground to push it further; it will run in tandem with the Raspberry Pi project...because the great thing about EZ-Robot is creating and sharing within the community! That is the biggest fun of DIY robotics, right?
Here is the template spreadsheet: https://docs.google.com/spreadsheets/d/1pYTcRHukt4mjG5cuBfKYq4ako1xEgWtkc_s6ls63NYc/edit?usp=sharing
Or am I missing something here?
Have fun in Japan, one of the few countries I've not visited but high on my to-go list.