#105  

I actually went and checked out your YouTube videos, and I believe I saw the facial mo-cap thing you're talking about, @fxrtst. REALLY cool stuff. I agree that in essence it's all under the same system of logic; using mo-cap is just a step removed from live capture, and that's what allows that fluidity. I think you're right that all our work will help and will probably be consolidated, so that one day soon these are just normal features and I'll be able to mo-cap someone swing dancing and have my robot do it in a few minutes, or in real time.

That MotionBuilder blending feature actually really fascinates me. Unfortunately I've never used it, but I'd like to ask you a specific question given your extensive experience. I have to explain some context, so forgive the length, and hopefully you find it interesting.

Okay, so in essence my team is partnering with the robotic prosthetics department at a university to build a machine I designed that, through a series of simple sensors, programming, and mechanical trickery, allows one to walk, run, jump, etc. in place through 360 degrees. They call this class of machine omnidirectional treadmills, but all the ones that currently exist are useless: basically a harness that keeps you from busting your tail holds you while you lean forward and shuffle on a slippery surface, which is just... anyway. This one actually allows nearly completely authentic, unrestricted movement and has medical, VR, exercise, military, and even special-effects applications in your field, come to think of it. Anyway, long story... long, they have agreed to build a version of the machine for medical applications, with intentions of doing clinical trials with people who have paralysis or gait abnormalities (it can be passive, where it just lets you move freely; active, where it moves you; or hybrid, for rehabilitation).

Anyway, the point being, we have a presentation in a month to show off a 1/3-scale simplified proof-of-concept model my team has built. This presentation will determine how much money they put into its development lol, so yeah x) This is what started my interest here, because after I drew up the CAD designs in Fusion 360, another member of my team imported them into 3ds Max and, using public mo-cap data and mo-cap data from the lab (they use it to study gait), animated the model, basically animating it manually by just running the bare math I worked out as to what moves when, and it looks and works great!

The rub: we have a completed (largely 3D printed) machine with 15+ NEMA 17 steppers and 4 servos, which our electrical engineer team member figured out how to control via driver boards hooked up to a Pi over an SPI connection. The Pi will run Windows 10 IoT Core so that it can process the Universal Windows Platform code that ARC spits out, and voila! But without the ability to turn that animation into movement instructions we are stuck, because it will take a lot of time, money, and expertise to work out the math and programming for the kinematics (we lack abundance in all of those things).

Now my question, if you've made it this far down: how hard would it be to take the animation scenes of the biped + machine turning around, turning left, right, walking, running, doing the salsa (for real lol), etc., and blend them like a video game character? So, using a game controller, it would seamlessly transition from forward to slightly left to really left, etc. (The person who did the animation simply learned how to do it for this project; she is actually a medical resident lol. Basically, we learn whole skills only as we've had to.) THAT is what I'd love, and I'm sure it would blow them away at the presentation, increasing the chances this thing actually gets made and helps people: just hand them a game controller or something and have the machine, which is copying the animation values in real time, mimic what is essentially a game character. Of course, it can be prerecorded too if we can't figure it out by then or there is too much lag, but having the clips blended would still be so much better!

So, again, sorry for the length, but given your experience and what I've seen of what you can do, I thought it was worth the effort to ask :D Anyway, thanks man.
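P.S. To make "blending" concrete: my naive mental model is just a per-frame weighted mix of joint angles between two clips sampled at the same rate. A toy Python sketch (real game blending also aligns gait phase and handles angle wrap-around, which this ignores):

```python
def blend(clip_a, clip_b, weight):
    # clip_a / clip_b: lists of frames, each frame a list of joint angles,
    # same length, same joint order. weight 0.0 -> pure clip_a, 1.0 -> clip_b.
    # Naive sketch: no gait-phase alignment, no angle wrap-around handling.
    return [
        [(1 - weight) * a + weight * b for a, b in zip(fa, fb)]
        for fa, fb in zip(clip_a, clip_b)
    ]

# e.g. ramping weight from 0 to 1 as the controller stick moves left would
# cross-fade "walk forward" into "turn left" over a few frames.
```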

#106  

Quote:

SPI connection
I tried an SPI connection, and it seems to be kind of slow for frame-to-frame animation being sent to the microcontroller board if there are a lot of servos involved! That is why I switched to LAN!
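(What I mean by LAN: every frame just goes out as one small UDP packet, so there is no per-servo bus traffic to wait on. A minimal Python sketch; the address, port, and packet layout here are assumptions, not my actual setup:)

```python
import socket

PI_ADDR = ("192.168.1.42", 5005)   # hypothetical Pi address and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_frame(angles):
    # One frame = one datagram of comma-separated angles,
    # mirroring one CSV row of the animation.
    sock.sendto(",".join(str(a) for a in angles).encode(), PI_ADDR)
```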

Quote:

The Pi will run Windows 10 IoT Core
If you have a Pi, you can store your animation sequences directly on the Pi. Why would you have to send them to the Pi in the first place? The animations would just be a bunch of numbers in CSV format anyway...
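(Something like this minimal Python sketch is all the playback would be; the file name and set_servos() are placeholders for your own driver code:)

```python
import csv
import time

FPS = 30  # assumed export frame rate of the animation

def load_frames(path):
    # Each CSV row is one frame; each column is one servo angle.
    with open(path) as f:
        return [[float(x) for x in row] for row in csv.reader(f) if row]

def set_servos(angles):
    # Placeholder: replace with whatever writes angles to your driver boards.
    print(angles)

for frame in load_frames("salsa.csv"):  # hypothetical file name
    set_servos(frame)
    time.sleep(1 / FPS)  # naive timing; it drifts, more on that below
```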

Quote:

How hard would it be to take the animation scenes of the biped + machine turning around, turning left, right, walking, running, doing the salsa (for real lol), etc., and blend them like a video game character?
Well, here comes the hard part... and this is the one I am still struggling to get right! FK is pretty straightforward! But IK is kind of hard to set up if the robotic rig has a lot of DOF. Also, the hardest parts of the whole setup are Euler rotation spaces and gimbal locks, plus the fact that rotation has to be extracted locally, always between the child and the parent, and the gimbal order needs to be taken care of... look into this when you are thinking about extracting rotational values! Another thing: the 3ds Max biped is a wonderful thing as an animation tool, but it is sadly kind of useless for robotics, trust me... I have been losing a lot of sleep over this! A custom rig is the very best option... I had been partly successful in linking a mechanical rig to a biped and then extracting the values I needed out of that rig... but as I said, someone specialized in rigging would be very, very helpful!
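(To illustrate the local-extraction point: the value you feed a servo has to be the child's rotation relative to its parent, decomposed with an explicit gimbal order. A rough Python/numpy sketch, not my actual MaxScript; the matrix conventions are assumptions:)

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def local_euler(parent_world, child_world, order="ZYX"):
    # Local rotation = parent^-1 * child (both 3x3 world rotation matrices).
    # The Euler 'order' matters: pick one that keeps the joint's useful
    # range away from its gimbal-lock pose.
    local = np.linalg.inv(parent_world) @ child_world
    return R.from_matrix(local).as_euler(order, degrees=True)
```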

Anyways, it's Friday night, so I will look into your MaxScript tomorrow. Looks interesting... maybe too complicated for me, but let's see! :D

#107  

@DJSures

Quote:

This conversation is a dream of million wishes but no clarity.

Thanks for bearing with us; it's been a year already, I guess even longer, since this topic was brought up... :)

Actually, you would not have to worry about the 3rd party applications which will hopefully extract the data for the animations in the end; all we would need is a tool to send our data to the servos on a frame-to-frame basis, as opposed to a pose-to-pose approach like the Auto Positioner!

I would make 3ds Max write out a CSV like

90, 90, 70, 60, 55 etc

or it could be

D1, 90, D2, 90, D3, 70, D4, 60, D5, 55 etc

or

fps, 30, D1, 90, D2, 90 etc

It would not matter... the main problem I encountered was that the positions could not be sent with correct timing! If you let me know the format you would like the data to have, I will format it any way that suits the plugin! :)
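(To show what I mean by "correctly timed": a naive sleep(1/fps) after each send drifts, because the send itself takes time. Scheduling each frame against an absolute clock avoids that; Python sketch, send_frame() is a placeholder:)

```python
import time

def play(frames, send_frame, fps=30):
    # Deadline-based scheduling: transmission time and jitter no longer
    # accumulate into drift the way a plain sleep(1/fps) would.
    period = 1.0 / fps
    start = time.monotonic()
    for i, frame in enumerate(frames):
        send_frame(frame)
        delay = start + (i + 1) * period - time.monotonic()
        if delay > 0:
            time.sleep(delay)
```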

#108  

Ok, so a lot of information there. I've got a bit of an idea of what you are trying to accomplish. I've seen those 360 treadmills for VR and video games. I'm having a problem understanding how the mo-cap is applied to the 3D printed treadmill. Or are you looking for a virtual representation on the screen of what the treadmill is doing?

Using MotionBuilder is a quick way to merge several motion capture clips together, but I would not say it's seamless; the files need to somewhat match to get it looking "real" between transitions. There are new neural network programs being written and explored.

Your time frame is pretty short, and getting even a portion of what you want to accomplish will be challenging in that time. But the plug-in is something these guys have been working on for a year... so who knows.

#109  

Quote:

I tried an SPI connection, and it seems to be kind of slow for frame-to-frame animation being sent to the microcontroller board if there are a lot of servos involved! That is why I switched to LAN!

Hmm, our engineer thought it would be the easiest approach, plus we found these great SPI stepper driver breakout boards that let us save on GPIO pins, but if we encounter that slowness problem I'll definitely mention switching to LAN. I'd rather wait until we're sure it's too slow, because as it is this machine has been crazy expensive to build lol

Quote:

If you have a Pi, you can store your animation sequences directly on the Pi. Why would you have to send them to the Pi in the first place? The animations would just be a bunch of numbers in CSV format anyway...
Hmm, could you elaborate? I meant using the Pi to read the control code: the UWP code made from UniversalBot, the open-source Microsoft Windows Universal Robot SDK built to run on Windows, which I believe is at the heart of a lot of ARC functionality. The CSV files are great, but only if we run them through something like the Auto Positioner made frame-to-frame instead of pose-to-pose, like you said in your post to DJ; since a joint can be at a different angle at frame x than at frame y, playing the frames directly is what gets you the fluid movement of the animation as well... well, that's the idea anyway.

lol, have I mentioned I am a biologist by training? I'm doing most of this through inductive reasoning, largely. Now that you know my particular needs, if you have a superior method, please do not hesitate to critique. I'd really appreciate any help at this point; I have pretty much everything done physically and in hardware/wiring etc., but I have one month to turn animations into movement, and my engineer, bless him, is something of a weekend warrior, so he can't dedicate any more time beyond the base hardware configs etc.
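(As I understand the difference: pose-to-pose stores sparse key poses and lets the player interpolate; frame-to-frame stores every sample explicitly, which is what preserves the fluidity. A toy sketch of expanding poses into frames, assuming plain linear interpolation:)

```python
def frames_from_poses(poses, steps_between):
    # poses: sparse keyframes, each a list of joint angles.
    # Returns dense per-frame angles via linear interpolation, making the
    # in-between value of every joint at every frame explicit.
    frames = []
    for a, b in zip(poses, poses[1:]):
        for step in range(steps_between):
            t = step / steps_between
            frames.append([(1 - t) * x + t * y for x, y in zip(a, b)])
    frames.append(list(poses[-1]))
    return frames
```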

Quote:

the 3ds Max biped is a wonderful thing as an animation tool, but it is sadly kind of useless for robotics, trust me... I have been losing a lot of sleep over this!
Ha, I very strongly sympathize with the sleep loss here, my friend. I'm not a coder or engineer by any substantial means, yet I have a few weeks to accomplish what much more talented people in this domain, such as yourself, have been struggling with for more than a year. Fortunately, I know enough to have ideas and to ask people who really know what they're doing in a particular field how to get it done. Collaboration is a beautiful thing! Anyway, yeah, essentially we used the biped to manually work out the forward kinematics, but due to the nature of the machine, IK becomes something of a nightmare. I'm not even going to think about it for now lol

Quote:

Friday night, so I will look into your MaxScript tomorrow. Looks interesting... maybe too complicated for me, but let's see! :D
Have fun then, buddy. If I recall, Friday nights in Berlin were always great fun. Cheers!

#110  

A couple of programs: IKinema, which is easier and cheaper than MotionBuilder, and Mixamo, which is free, for getting animations onto a character... both the motions and the characters are free to download. Then those FBXs go into IKinema to transition between motions, to get your animation as seamless as you can get.

#111  

Quote:

...also the hardest parts of the whole setup are Euler rotation spaces and gimbal locks, plus the fact that rotation has to be extracted locally, always between the child and the parent, and the gimbal order needs to be taken care of... look into this when you are thinking about extracting rotational values!

PTP had to resolve the Euler angles in the plug-in he made for the Perception Neuron. The way they process movement, using IMUs as the source of motion for their bones, you have to do some serious math to get the rotational values. Which he did, because he's a genius. But he is someone to chat with. Gimbal lock in relation to parent/child I had not thought of. Hmmm, scratches head. I may go back to my original, bulkier idea of capturing motion using what we call a slave system. It's basically a suit you wear with pots at all your joints, like a complex joystick you wear... lol. Anyway, I'm thinking out loud again.
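(Electronics-wise the slave suit idea is simple: each pot gets read through an ADC and mapped straight to a joint angle. A rough Pi sketch assuming an MCP3008 ADC on SPI, which is my assumption here, not the original rig:)

```python
import spidev  # common Raspberry Pi SPI binding

spi = spidev.SpiDev()
spi.open(0, 0)                 # bus 0, chip-select 0
spi.max_speed_hz = 1_000_000

def read_pot(channel):
    # Standard MCP3008 single-ended read: start bit, channel select,
    # 10-bit result split across two bytes.
    raw = spi.xfer2([1, (8 + channel) << 4, 0])
    return ((raw[1] & 3) << 8) | raw[2]    # 0..1023

def pot_to_degrees(value, lo=0.0, hi=180.0):
    # Map the 10-bit reading onto the joint's servo range.
    return lo + (hi - lo) * value / 1023
```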

#112  

Amazing, I found that video enthralling; I appreciate it. We've actually already been courting data scientists for when the full-scale build begins, because we plan on correlating (using various methods we've studied and have had suggested to us by learned advisers) the EMG data from the lower legs, the motion capture data, and the kinematics of the machine, so it can learn how people walk, run, etc. The goal is to eliminate lag and expand functionality by having the machine anticipate where you're most likely going based on the slight variations in your movements and neuromuscular impulse signals.

But yeah, for the purposes of this 1/3-scale simplified proof-of-concept presentation, I think the programs you mentioned will do haha. I just need to make sure we cobble together a sufficiently capable script that can gather the data and plug it in before I set to work on that lol. I will forgo sleep for days and manually plug in every key value for every joint for every frame if I have to, but by god it will get done one way or another. sleep stress :D

As far as the mo-cap data: we took data from public sources (that's how we animated the machine with salsa steps) and from the lab we're working with, and used it to drive a biped. Then, using keyframes and divine patience, my partner, who had a little 2D animating experience (but is actually a doctor and had no clue what she was doing at first), keyframed the entire movement sequence of the machine to correspond to the steps of the mo-cap-driven biped. Then we simply removed the biped, and voila! Definitely brute force, but it worked to create animations of the machine's movements with little more than some light math, curve editing, manual positioning, and LOTS of keyframes, since it always got it wrong on its own.

Anyway, thanks for the advice, and if anything else crosses your mind, I'd greatly appreciate it. I've looked over your amazing work on Alan, and his apparently naggy wife lol, so I feel you'd understand passion projects that push the bounds of your knowledge and abilities.

Thanks again, man.

Ooh, you're a movie guy; have you seen Pacific Rim? You know that thing they get into to control the bots? In the scenes where they're strapped in, they have these things strapped to their feet that move. Think along those terms but MUCH more complicated. Just a fun reference similarity someone once pointed out to me.