
#113  

@fxrtst I've never seen that clip before...it never crossed my mind that you could use a neural network to drive motion; I guess this is the way to go! All I could think of so far is triggering motion by sound levels, like pausing movement while the narrator is muted...I was also thinking it could be beneficial to use TensorFlow for emotion recognition in the content, like Dani Thomas did, but I guess soon we will have more interesting things coming up in that field!
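
Just to make the sound-level idea a bit more concrete, something along these lines could work as a gate (an untested sketch, assuming the Python sounddevice package and NumPy; hold_pose and resume_motion are only placeholders for whatever actually drives the servos):

import numpy as np
import sounddevice as sd

THRESHOLD = 0.02  # RMS level below which the audio counts as "muted narrator"

def hold_pose():      # placeholder: freeze the current animation frame
    print("quiet - holding pose")

def resume_motion():  # placeholder: continue the motion playback
    print("sound - playing motion")

def audio_callback(indata, frames, time, status):
    # indata is a float32 block from the microphone; gate the motion on its loudness
    rms = float(np.sqrt(np.mean(indata ** 2)))
    if rms < THRESHOLD:
        hold_pose()
    else:
        resume_motion()

with sd.InputStream(channels=1, samplerate=16000, callback=audio_callback):
    sd.sleep(60_000)  # listen for one minute

The threshold would of course need tuning to the actual narration levels.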

@Ramon3141 Sorry man, I am no engineer at all, and the device I was talking about being slow at serial communication is a less-than-$10 device!:D So you hand-poked all the rotations into your device...which is a good thing to do to get the presentation up and running! But the main question still persists: how did you play it back on your machine?;)

#114  

@Mickey666Maus I'm not either, but we engineer things anyway, right? lol

You mean hand-poked as in the animation, or in real life? I haven't manually done it yet, but I'm prepared to do it just to get the machine to move in at least a programmed arrangement. If in about a week or so I don't see a closing route for a script to get CSV data and a plug-in to get UWP code, then I will start doing it manually, which will take many, many sleepless days and nights lol. If you mean in the animation, as in manually moving the model to match the biped, I don't know, my partner is the magician there, but whatever she did worked beautifully. We have a bunch of scenes with all kinds of movement that aren't optimized kinematics (as I said earlier, that will require machine learning) but they work well enough to salsa, so hey x) Ha well, engineer or not, I appreciate the advice as always, and the friendly discussion over such interesting topics.

I must admit, I think beyond just driving motion, machine learning in animation and video game design alone will leave us with games that evolve differently every time you play them, characters that move and act differently according to experience, and so much more. I'm very much looking forward to seeing what machine learning will bring us, and to using it too!

#115  

@Ramon3141 Ok, this is kind of the main point...since you are using the same setup as I am, I was wondering what I have missed so far that my motion is still not fluid at all! My next approach would have been to use the Raspberry Pi in conjunction with an I2C servo breakout board and to store the data on the Pi! Right now I am still sending it to the microcontroller board over LAN...

So if you have found any way to create fluid frame-to-frame motion, it would be cool if you could share how it was accomplished!:)

#116  

Quote:

you mean hand-poked as in the animation or in real life
I think extracting the rotational values is secondary; getting fluid frame-to-frame motion is the main task!

#117  

I also checked your MaxScript now...could you please attach a reference scene or setup instructions? Does it need a biped with mocap in the scene? It looks like someone took a great deal of time to get this thing coded, but then again maybe I am too limited in MaxScript to understand it?

#118  

I see where the miscommunication was. Okay, yeah, besides some G-code and basic motion tests, my physical machine has not moved an inch. My virtual model moves smoothly, and by creating a very high frame rate and pushing that data through a kind of Auto Position-type plug-in, I suspect the choppiness will go away once the UWP code is on the Pi and the Pi executes the directions to the NEMAs. Personally I think sending data to the board through anything is a bad move; I can see a bunch of places where things can go wrong. That's why I'm trying this Windows 10 IoT Core + UWP code thing. Granted, it might end miserably lol

Let me ask you, where does your interest in this topic stem from? Maybe that can help me better understand. At the very least I've made some friends on the Autodesk forum (they've been helping me figure out the code and the best route to take) that I can relay things to. Friendly bunch of people they are, happy to help.

Well, that code is cobbled together from a group of people. I've only gotten it to work with the small test animation at the top.

Before I started seeking help, my partner and I made something like this, which extracts the rotational data relationships. It's ugly code with a bunch of errors, but it produced data, so that spurred on refinements. I think we are close. It's above my head at this point, but from what I can follow, my buddy at the Autodesk forum who is working on fixing the code you saw before is getting very close, maybe even by tomorrow. Let's see.


(
	-- Write one frame of rotation data for a single bone: world rotation for the root,
	-- otherwise the rotation relative to its parent, as Euler angles.
	fn writeData bn toFile =
	(
		data = if bn.parent == undefined then bn.transform.rotation as eulerAngles else (bn.transform*inverse(bn.parent.transform)).rotation as eulerAngles
		format ",%,%,%" data.x data.y data.z to:toFile
	)
	
	-- Write one CSV row per bone: name, parent name, then x,y,z Euler angles for every
	-- frame of the animation range, and recurse into the children.
	fn recursiveWriteData bn toFile =
	(
		format "%" bn.name to:toFile
		format ",%" (if bn.parent != undefined then bn.parent.name else undefined) to:toFile
		for t = animationRange.start to animationRange.end do
		(
			at time t
			(
				writeData bn toFile
			)
		)
		format "\n" to:toFile
		for obj in bn.children do recursiveWriteData obj toFile
	)
	
	-- Ask the user where to save the CSV.
	fn getFileName =
	(
		return getSaveFileName caption:"Save..." types:"Excel(*.csv)|*.csv|"
	)
	
	fn main =
	(
		root = undefined
		if selection.count != 1 then
		(
			messageBox "Please select just the root joint and re-execute the script." title:"Error: rootJoint"
			return -1
		)
		else
		(
			root = selection[1]
		)
		fileName = getFileName()
		if fileName == undefined do
		(
			messageBox "Please select a filename for the file to be saved." title:"Error: fileName"
			return -1
		)
		-- Open the file for writing (openFile defaults to read mode, which would drop the
		-- output), or create it if it does not exist yet.
		file = if doesFileExist fileName then openFile fileName mode:"w" else createFile fileName
		recursiveWriteData root file
		
		flush file
		close file
	)
	main()
)

As far as a test file goes, I can't really share one or I could be sued :/ Not yet, at least. In about a month I am going to post it here (as long as we actually get it working lol).

Oh, and no, the parts of it that work should work with any scene that has a single animation entity in it. Try the test code at the beginning; that works to create a basic hip and leg joint animation that the following code extracts from, but there are a couple of errors, like a mapping error in that CSV code that I can't figure out, etc. I've only been coding for about 6 months, out of necessity for my different projects; before that I couldn't print hello world in Python lol. So we might be the blind leading the blind here, but at least we will be able to hear twice as much!
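
For what it's worth, a quick sanity check on the receiving end can help pin down that kind of mapping error. Something like this little Python sketch (just an illustration, the file name is made up) reads the CSV back the way the script writes it: one row per joint, name and parent first, then x,y,z Euler angles for every frame:

import csv

def load_rotations(path):
    joints = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            name, parent = row[0], row[1]
            values = [float(v) for v in row[2:]]
            # regroup the flat x,y,z stream into per-frame triples
            frames = [tuple(values[i:i + 3]) for i in range(0, len(values), 3)]
            joints[name] = {"parent": parent, "frames": frames}
    return joints

if __name__ == "__main__":
    data = load_rotations("test_export.csv")   # placeholder file name
    for name, info in data.items():
        print(name, "parent:", info["parent"], "frames:", len(info["frames"]))

If the frame counts or the x,y,z grouping come out wrong here, the problem is in the export; if they look right, the problem is further down the pipeline.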

#119  

Ramon, finally got it now. Great that the Dr. had some background in 2D animation! And yeah, I got the Pacific Rim connection. I work with Legacy Effects a lot and have seen the setups in person.

Mickey666, cool video. Interesting to see how all this plays into robotics. So many possibilities!

#120  

@Ramon3141 Yes, getting the animation up and running on a virtual model is not a problem at all...but playing it back on the physical machine is!

The 3ds Max script would only make sense with a scene at hand; otherwise I am not able to decipher the code...

Sending the data to a server was an idea to sync the live and the playback side of things...you could send the whole motion at once in a single big array and let your device play it back! This would eliminate the need to have all your motions stored on the device itself! With the microcontroller board this is not possible due to the limited onboard memory, but the Pi has more than enough to do so! So my next approach will be the Raspberry Pi in conjunction with an I2C servo breakout board.
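
To sketch out what I mean on the Pi side (an untested illustration only, assuming a PCA9685-type I2C breakout and Adafruit's adafruit_servokit package): the whole clip arrives once over the network as JSON, and playback then runs from the Pi's own memory at a fixed rate, so no per-frame LAN or serial round trip can cause stutter.

import json
import socket
import time

from adafruit_servokit import ServoKit

FPS = 30
kit = ServoKit(channels=16)   # 16-channel PCA9685-style servo board

def receive_clip(port=5005):
    """Accept one connection and read the whole motion clip as a JSON list of frames,
    where each frame is a list of servo angles in degrees."""
    with socket.socket() as srv:
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            buf = b""
            while chunk := conn.recv(4096):
                buf += chunk
    return json.loads(buf)

def play(frames):
    """Step through the buffered frames at a fixed rate, straight from local memory."""
    period = 1.0 / FPS
    for frame in frames:
        start = time.monotonic()
        for channel, angle in enumerate(frame):
            kit.servo[channel].angle = angle
        time.sleep(max(0.0, period - (time.monotonic() - start)))

if __name__ == "__main__":
    play(receive_clip())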

It might also just be the wrong approach, and servos should not be driven frame by frame at all; the Auto Position that @DjSures implemented is a robust solution, so maybe I will also try to just make the pose-to-pose movements really tight, to kind of mimic natural movement?
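
Something like this is what I have in mind for the tight pose-to-pose idea (a minimal sketch, not the Auto Position code itself): ease between two poses with a smoothstep curve so the servos accelerate and decelerate instead of snapping from one pose to the next; apply_pose is just a placeholder for whatever writes the angles.

import time

def smoothstep(t):
    """Ease-in/ease-out: goes 0 -> 1 with zero velocity at both ends."""
    return t * t * (3 - 2 * t)

def apply_pose(angles):   # placeholder: send the angles to the servos
    print(["%.1f" % a for a in angles])

def move_between(pose_a, pose_b, duration=0.5, steps=25):
    """Blend from pose_a to pose_b over the given duration using smoothstep easing."""
    for i in range(1, steps + 1):
        s = smoothstep(i / steps)
        blended = [a + (b - a) * s for a, b in zip(pose_a, pose_b)]
        apply_pose(blended)
        time.sleep(duration / steps)

if __name__ == "__main__":
    move_between([90, 45, 120], [60, 80, 100])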

My interest in robotics was sparked here within this community...EZ-Robot got me into robotics, and since I am hooked now I just cannot let go!:D