#1  

What's wrong with the original thread?

#2  

@DJ_Sures Good question.

The original thread title was "Camera Interference With servo Movement?" That issue and the reasons for it have been addressed, and the thread has taken a new direction centered on remote robot control with 3ds Max, specifically the software methods (along with robot design) to achieve that goal. I felt a more descriptive title was in order so readers can better understand what the thread will be about from this point on, not to mention making future searches more relevant. In reality there never was any "interference" from a camera in the first place, nor would the new direction further address that issue.

That being said, in hindsight, it may have been an error on my part to start the new thread myself, since it is under my name now. In that regard, can the thread author be changed to Mickey666Maus? Or perhaps it would be better if this thread were deleted entirely, to give him the opportunity to start the new thread instead for purposes of author continuity?

#3  

Hey guys, I would have re-opened it again anyway...just being on the train to Berlin without WiFi kept me from doing it! ;) The old thread went a little off topic, so I guess it was a good idea to start a new title so everyone can eyeball right away what's going on here! I don't care too much about being the author; it's a community, so it's more about sharing than owning, I guess! :D

We started off discussing the problem of the Sleep() command not being real-time, because Windows is not a real-time OS...which causes trouble if you are scripting a frame-by-frame animation where a Sleep() is called every 33 ms.
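(For anyone who wants to see the jitter for themselves, here is a rough, hypothetical check you can paste into the MaxScript Listener; it is not part of our scripts, just an illustration of the timing problem:)

    -- Time 30 nominal 33 ms sleeps; Windows scheduling usually stretches the total
    t0 = timeStamp()                -- milliseconds since 3ds Max started
    for i = 1 to 30 do sleep 0.033  -- MaxScript's sleep takes seconds
    t1 = timeStamp()
    format "30 frames, nominally 990 ms, took % ms\n" (t1 - t0)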

So @WBS00001 proposed a vector approach between keyframes, rather than sending the exact position to a servo every single frame! The code, and the discussion of the problem, can be found here: Thread #9584

While working on this problem, which could also be a general problem when animating movement outside of the Auto Position control and just using EZ-Script to create movement, we got more specific, to the point of creating motion in an external animation package called 3ds Max...so we decided to open this thread for further discussion!

3ds Max is 3D modeling and animation software by Autodesk. It is free to use for three years if you are a student, or for personal use...there are tons of tutorials on the Internet, so give it a try if you haven't already checked it out! ;)

I modeled and rigged a robot in 3D; it can be animated in 3ds Max and the motion can be transferred either in real time through ARC's HTTP Server, or played back directly within ARC by importing the motion using an EZ-Script that @WBS00001 coded!

We still have to do some testing and hopefully clean up my messy Maxscript...but I guess we are closing in! :)

#5  

The above video shows how far I have it wired up by now...My goal is to model and print a robot directly from within 3ds Max, then animate it with the software and play the animation back from ARC, using different motions for different commands, moods, etc.!

And now comes my messy piece of code...hope you can throw in some ideas on how to clean it up, @WBS00001!

#6  

    /* Creating a textfile for output */
    global txt_file = createFile "C:\Users\swann\Desktop\movement.txt"

    /* Unregistering previous definitions */
    --unregisterRedrawViewsCallback servo_r
    unregisterTimeCallback servo_r

    /* HTTP socket open */
    rollout httpSock "httpSock" width:0 height:0
    (
      activeXControl port "Microsoft.XMLHTTP" setupEvents:false releaseOnClose:false
    )
    createDialog httpSock pos:[-100,-100] --Create, then destroy, just to initialize the ActiveX control
    destroyDialog httpSock

    /* Send the values from the GUI */
    fn servo_r =
    (
      /* First servo */
      a = "http://192.168.178.20/Exec?password=admin&script=Servo(D10, "
      b = ($HandRechts.pos.z + 9) * 10
      c = ")"
      d = a + b as string + c

      /* HTTP socket connect */
      httpSock.port.open "GET" d false
      httpSock.port.send()

      /* Second servo */
      e = "http://192.168.178.20/Exec?password=admin&script=Servo(D8, "
      f = ($ArmLinks.pos.z + 9) * 10
      g = ")"
      h = e + f as string + g

      /* HTTP socket connect */
      httpSock.port.open "GET" h false
      httpSock.port.send()

      /* Third servo */
      i = "http://192.168.178.20/Exec?password=admin&script=Servo(D6, "
      j = ($SchulterRechts.pos.z + 9) * 10
      k = ")"
      l = i + j as string + k

      /* HTTP socket connect */
      httpSock.port.open "GET" l false
      httpSock.port.send()

      /* Fourth servo */
      m = "http://192.168.178.20/Exec?password=admin&script=Servo(D2, "
      n = ($Augen.pos.z + 9) * 10 as integer
      o = ")"
      p = m + n as string + o

      /* HTTP socket connect */
      httpSock.port.open "GET" p false
      httpSock.port.send()

      --Arm to textfile
      u = f as string
      v = ("Servo(D8," + u + ")") as string
      format "%\n" v to:txt_file
      print (format "%\n" v)

      --Shoulder to textfile
      w = 33.3333333333 as string
      x = ("Sleep(" + w + ")") as string
      y = j as string
      z = ("Servo(D6," + y + ")") as string
      format "%\n" z to:txt_file
      format "%\n" x to:txt_file
      print (format "%\n" z)
      print (format "%\n" x)
    )

    --registerRedrawViewsCallback servo_r
    registerTimeCallback servo_r

#7  

registerRedrawViewsCallback servo_r — the RedrawViewsCallback would update the real-world robot live any time a virtual joint is being rotated...

registerTimeCallback servo_r — the TimeCallback is called whenever the time slider is moved to the next frame.

b = ($HandRechts.pos.z + 9) *10 — the weird .pos.z + 9) *10 comes from the size of my GUI... :D

As you can see, I had to spell out all my servos separately. It would be kind of cool if this could be strung together somehow, like having a variable for how many servos there are in total, which would be how many times the script looks for those values...assuming proper naming within the 3D model is given! :D

Also, it would be really awesome to have a global variable that gives the .txt file a different name every time the script is called...I get an error if I try to run it more than once, because the file already exists, of course!
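(One possible fix, sketched; the timestamp-based name and the path are just examples, not part of the script above:)

    -- Build a fresh file name per run so createFile never hits an existing file
    global RunStamp = substituteString (substituteString (localTime as string) ":" "-") "/" "-"
    global txt_file = createFile ("C:/Users/swann/Desktop/movement_" + RunStamp + ".txt")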

Hahahaha, and sorry about the smiley faces in my code...I really don't know how to get rid of them! ;)

#8  

Just wanted to let you know I'm working on it. I have to do a bit of reading on Maxscript to find the proper statements to redo the code, since I'm not familiar with that language. It should be possible to set up some parameters initially and use a loop to go through the servos one after another, regardless of how many there are or how they are rearranged in the future. It may even be possible to read something from the rigging information in the model and get the info automatically.

To eliminate the "Winky" emoticons that pop up you will need to put the code between the code and /code markers. Some of the character combinations in the regular text are interpreted as emoticons otherwise. In this case I believe it's the quote symbol followed by the close parenthesis symbol.

Quick questions ...

  1. Are you connected to the EZ-B via an ARC project when you run the Maxscript program?
  2. If so, in which mode are you connected to the EZ-B, AP mode or Client mode?
#9  

All of this sounds very good...I did all the coding using copy and paste from stuff I found on ScriptSpot, etc. ...I can get it done, but as you can see, it's like a little child trying to speak grown-up language! :D

I am connected to ARC with the HTTP Server running while executing the Maxscript; this moves the servos according to the RedrawViewsCallback or TimeCallback, whichever is registered while the Maxscript is operating!

I am connected in client mode.

I guess for just animating and saving the string to a text file, none of the above would matter, but it is kind of convenient to have the robot executing the movement from frame to frame while recording it to a text file, just to make sure the motion works properly! :)

#10  

Thanks for the information. That is what I thought your configuration would be, but I wanted to be sure. Even though I don't have Maxscript, I can still run tests, and I have found you can run scripts and Frames or Actions from the Auto Positioner via the HTTP Server directly. Probably anything else as well. What this means is that we have a choice here of doing everything from Maxscript, as opposed to sending strings of commands and having a script running in ARC to interpret those commands.

On the other hand, we still need a way to save the movements for playback later; therefore, we still need to save something to a file. At the moment, I'm thinking the best way might be to save the actual commands, such as Servo(D0,10) (like you were doing originally), instead of using the character groups I had suggested, such as D0|10. That way, there would be the option of re-sending those recorded commands through a simple 3ds script, or having an ARC script read the file and execute each command itself.
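(For illustration, a few recorded lines in that command format might look like this; the values are made up:)

    Servo(D8,45)
    Servo(D6,72)
    Sleep(33.3333333333)
    Servo(D8,46)
    Servo(D6,71)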

Each method has its drawbacks. To play back through a script in 3ds means, of course, that you need 3ds running. That may be difficult or impossible when playing back on a different computer or a tablet. I believe, however, I can write a standalone program that would do the same thing, so that would not be so much of a problem. Whatever the case, an HTTP connection must be established via some sort of network, and that may be the biggest problem when using a different computer. One advantage of this method is that you would not have to switch back and forth between ARC and 3ds when doing a playback operation.

If played back directly through ARC, nothing else is needed. The ARC script, however, would have to interpret the commands and run them through a selection from an If-ElseIf list, since there is no way to directly run a command the way you can through the HTTP Server method. However, since there won't be a large variety of commands, I don't believe that will be much of a problem.

Having said all that, it may be best to do both. That way you have a choice depending on the situation: for development work, use the HTTP method via a 3ds script; for playback to show others, use the ARC script method.

Overall, then, it looks like the biggest hurdle will still be creating the command strings in Maxscript in the easiest way I can come up with, as well as making the gathering of the data for those strings as generic as possible. For that, I need to educate myself more on Maxscript.

Anyway, think it over and let me know your opinions. Assuming I haven't confused the issue too much. :D

#11  

No, you haven't confused it at all...

Using an external animation package gives us the option of using sophisticated rigging methods, Inverse Kinematics, and a non-destructive workflow when fine-tuning the timing and position of our motion...so it is obviously kind of tempting to build a rig in 3ds Max which allows us to remote-control the real model live!

But my overall idea was to build motion to use from within ARC directly. As you stated, it would be an issue once you relocate your robot and use it while ARC is running as a mobile version, or when using a different computer.

Quote:

...and have found you can run scripts and Frames or Actions from Auto Positioner via the HTTP server directly.
This is something that did not cross my mind...it could be interesting, as you mentioned that the Auto Positioner operates slightly differently than if we used an EZ-Script to control servo positions, right? If those Frames could be saved for later use, it could be something that might work out!

Quote:

On the other hand, we still need to have a way to save the movements for playback later.
But this is what I was actually aiming for...I would like to have a way to store my pre-recorded animations within ARC directly, to use them with my robot in any given environment!

One last thing that just came to my mind...3ds Max is software for building animations and rendering them out, or exporting those motions to a specific file format; it does not work in real time either! So if I move my model in 3ds Max, or if I play back my animation, there is always a lag...it is never as smooth as the finally rendered animation would be, since the modeling/animation software is involved, the Maxscript is running, the HTTP Server is running, plus ARC with EZ-Script is up to translate the motion to those servos!

So I guess we should aim to get rid of all these tasks which would have to be up and running, and try to keep it as clean and easy as possible, right? :)

#12  

@Mickey666Maus I'm beginning to get a handle on Maxscript. At least as it relates to the script you presented. In that light I have a question: When does the script actually run? That is to say, what triggers the script to run when you are moving the virtual robot in 3ds?

I'm thinking perhaps you trigger it manually to run once when you move the robot to a new position for the next key frame, but I don't know if that's correct.

#13  

Once I manually start the script, it runs in the background, and depending on the RedrawViewsCallback or the TimeCallback it either sends the servo positions based on viewport updates or on time-slider movement! I need to manually terminate the script to stop it from running in the background!

#14  

Post deleted since it served its purpose in trading email addresses.

#15  

Wish you guys could have continued here on this thread. This was getting good. Hopefully you are making progress!

I'm leaving my example of what I was able to do as a proof of concept; maybe it will interest someone.

This video demonstrates live and recorded facial motion capture with the Asus sensor, outputting channels to a servo.

The motion uses fast-noise filters to take out a lot of the jitters (the motion-capture filtering was needed because it loses my lips a lot; the camera gets tricked by my goatee). This also captures eye motion based on pupil location.

I then tied my motions to a puppet, in this example an alien I created. This puppet was rigged in the animation package Maya. One of the rotational values (in this case the jaw) was output to an Arduino with a servo. So when my jaw opened, the puppet's jaw opened, then the servo moved to that position.

I want to continue with this project, as my vision of ALAN's progression involves this type of motion capture and prerecorded motions. These can be called upon when a sensor detects emotions.

I'm looking for anyone interested in working with me on this in the next 30 days or so. Anyone who can code in Python or write sketches for the Arduino. Notify me off-forum if you are interested and we can speak further.

In the meantime, have fun watching me make some funny faces!

#16  

@fxrtst We're still at it. We've just been corresponding via email instead of posting every little back-and-forth between us. I should be posting soon with a Maxscript for real-time motion from the virtual robot in 3ds. Probably tonight. But it's jittery as yet. Maxscript has a host of similarities to Python, so translation to something else Python-based might not be too difficult. I'm not sure if Maya can use Maxscript directly, but maybe so, since it's the same company. I've heard talk of 3ds supplanting Maya in the future, but that will be a tough slog since Maya is the industry standard.

Currently, the main goal of the project has shifted to recording motions from the movements of the virtual robot for playback on a real robot, like JD for example (through ARC). The motions are first created with the animation features of 3ds. The animation is run as the various movements are recorded into a file. The method used for that will store much, much less data than the real-time recordings do now. That's mainly because it will be based on a vector model rather than the frame-by-frame approach used now. The new vector method can also be used to drive the robot directly, but it will not be real-time, since it can't send a specific bit of movement data until the virtual robot has finished making that particular movement. I'll explain what is proposed in the vector method when I post the code later tonight.

Another goal is to be able to create a robot in 3ds, test it, and make what is necessary to build it via 3D printing. Then use what is being developed now to run the real robot from the virtual robot, and to create series of movements for the real thing from animations made in 3ds. With 3ds' animation capabilities, you can make the virtual robot do a lot of things without having to map out every little detail manually: just capture the movements that are done automatically by the animation software. That'll be Mickey666Maus' domain, however. I'm just here to help with the code.

After all this, I may see if it's feasible to do somewhat the same thing in Blender with its built-in Python. 3ds is great, but pricey for the long run unless you're making money from it.

Good luck with what you're doing, it looks great!

#17  

@fxrtst This looks super cool; recording motion capture myself has always been one of my greatest desires...I will check on that sensor you mentioned! Is there a thread where you explain how you accomplished it? Or will you perhaps make one for us? :) The Kinect linked up to record motion also seems worth exploring! I had my Kinect linked up to MotionBuilder a long time ago...but the noise of the motion and the lack of real 3D capture made me stop working on this method for extracting motion data.

You seem to know a lot about animating, rigging, and modeling in a 3D environment...as you know, these days 3ds Max and Maya are getting easier to use side by side, so I guess you should be able to use your Maya scenes from within 3ds Max with a little tweaking! I think it should also be possible to script the same thing we are doing for 3ds Max in Maya...but I don't know anything about its scripting language!

Awesome project...let's try to string it all together if possible!

#18  

@WBS00001 Can't wait to see what you came up with! :)

#19  

Sounds very much like what I'm trying to do here. Good question whether Maya and Max merge or one gets dropped. Autodesk owns them both, and I'm surprised that they continue to develop them all. I'm not sure if Maya and Max share the same plug-in architecture; my first thought is probably not. The motion-capture program I am using only has a plug-in for Maya to stream live, but recorded motions can be sent to any animation package. Someone just needs to write a Python plug-in for them. This software was bought last year by a big company and they no longer support it, but it is a full-fledged working program for now. I believe they used it in the new Star Wars film for the female alien with the thick glasses.

I want to use it for controlling and recording motions for robots. Then I'll move on to full-body capture. They use the same rotational values.

#20  

There comes a time in every software project when you have to stop tinkering with it and send it out into the world. This is very difficult for programmers to do. And so it is with the first pass of the Maxscript code I have written for this project. This will probably be a long post, so don't say you weren't warned.

By way of disclaimer, I should say I had never worked with Maxscript or Python before this, so the code is not as elegant as it could be. I still have a lot to learn. Additionally, I could have combined some operations in the code, but I decided to use separate variables to hold intermediate values (instead of putting the functions in-line) for better readability.

Basically, as the code now stands, it is working software with debug statements sprinkled through it. The debugging statements can be turned on and off by setting the state of a global variable. Likewise with other functions. Here it is:



  /* Declare ALL Global Variables Here */
Global SaveToFile  =True  --Saves data into text file (TextFileName)
Global SendToEZ    =True  --Send HTTP commands to ARC HTTPServo control
Global PrintOutput =True  --Show servo Instructions generated (in Listener)
Global DebugScript =False --Enables the debug features

Global TextFile
Global Servo_R
Global FileExists  =False
Global SockPrefix  ="HTTP://"--Standard web prefix  
Global SockAddress ="NN.NNN.NNN.NNN"    --Address from HTTP Server in ARC.
Global EZ_Logon ="Exec?password=admin&" --Change password (admin) as needed
Global TextFileName ="C:\movement.txt"  --Output File (Change as needed).
Global EZ_CommandScript ="Script="
Global EZ_CommandAction ="Action="  --Still working on these last 2.
Global EZ_CommandFrame  ="Frame="

Global ScriptCommPrefix =SockPrefix+SockAddress+"/"+EZ_Logon+EZ_CommandScript
Global ActionCommPrefix =SockPrefix+SockAddress+"/"+EZ_Logon+EZ_CommandAction
Global FrameCommPrefix  =SockPrefix+SockAddress+"/"+EZ_Logon+EZ_CommandFrame

  -- Array of servo Numbers, Order listed = order processed
Global ServoNumbers = #("D6", "D8", "D10" )

  -- Array of servo Objects. Order Must match Order of servo Numbers, above
Global ServoObjects  = #($SchulterRechts, $ArmLinks, $HandRechts)

Rollout StopButton "Stop" --Generates a UI with a button
(
  Button Btn "Stop Script"
  On Btn Pressed Do
  (
    DestroyDialog StopButton --Get rid of the User Interface
    Close TextFile --Close The File Variable
    SliderTime =1 --Move the slider to force a callback
    SliderTime =0
  )--End On Btn Pressed
)--End Rollout
CreateDialog StopButton Pos:[1540,600] --  X-Y Position Coords.

  /* Creating a textfile for output */
If SaveToFile == True Then
(
  FileExists =DoesFileExist TextFileName --If True, file already exists
  If FileExists == True Then
  (
      If DebugScript Then Print ("Pos 1 "+TextFile as string)
    TextFile =OpenFile TextFileName mode:"r"
      If DebugScript Then Print ("Pos 2 "+TextFile as string)
    If TextFile != undefined Then --File successfully opened
    (
        If DebugScript Then Print("Pos 3 "+TextFile as string)
      Close TextFile
        If DebugScript Then Print("Pos 4 "+TextFile as string)
      TextFile =(OpenFile TextFileName mode:"a")--Open for writing
        If DebugScript Then Print("Pos 5 "+TextFile as string)
    )--End If TextFile
  )
  Else --Doesn't exist so try to create it.
  (
    TextFile =CreateFile TextFileName
    Sleep 1 --Delay 1 sec. just to be safe.
    If TextFile != undefined Then FileExists =True
  )--End If FileExists
)--End If SaveToFile


--unregistering previous definitions
--UnregisterRedrawViewsCallback Servo_R
UnregisterTimeCallback Servo_R --In case it's still registered from before

--Http socket open
Rollout HttpSock "HttpSock" width:0 height:0
(
  ActiveXControl Port "Microsoft.XMLHTTP" SetupEvents:false ReleaseOnClose:false
);
createDialog HttpSock Pos:[-100,-100]; --Just to get it initialized.
destroyDialog HttpSock;

Fn Servo_R =  -- Send the values from the Animator by scanning the Arrays
(
    --Local Variables. Declaring them first (unconditionally) keeps them
    --function-scoped; wrapping them in If DebugScript made the assignments
    --below create globals instead.
  Local SleepVal  ="0"
  Local SleepStr  =""
  Local ServoStr  =""
  Local ServoVal  =1.0
  Local ValueStr  =""
  Local HttpStr   =""

  Local NextServo =1
  While NextServo <= ServoNumbers.Count Do
  (
    ServoStr  ="Servo(" +ServoNumbers[NextServo] +"," --Begin forming the servo string
    ServoVal  =ServoObjects[NextServo].pos.z
    ValueStr  =((ServoVal + 9) *10) as string
    ServoStr  =ServoStr +ValueStr +")" --Finish forming the servo string
    HttpStr   =ScriptCommPrefix +ServoStr --To send to the ARC

    If SendToEZ == True Then --HTTP socket connect and send
    (
      HttpSock.Port.Open "GET" HttpStr False;
      HttpSock.Port.Send(); --HTTP Server control in ARC must be running
    )--End If SendToEZ
    If PrintOutput Then Print(Format "%\n" HttpStr)

    SleepVal =33.3333333333 as string
    SleepStr ="Sleep(" + SleepVal + ")"
    HttpStr  =ScriptCommPrefix +SleepStr

    If SendToEZ == True Then --HTTP socket connect and send
    (
      HttpSock.Port.Open "GET" HttpStr False;
      HttpSock.Port.Send(); --HTTP Server control in ARC must be running
    )--End If SendToEZ

    If SaveToFile AND FileExists Then
    (
      Format "%\n" ServoStr to:Textfile
      Format "%\n" SleepStr to:TextFile
    )--End If SaveToFile

    If PrintOutput Then
    (
      Print(Format "%\n" HttpStr)
      Print(Format "%\n" ServoStr)
      Print(Format "%\n" SleepStr)      
    )--End If PrintOutput

    NextServo =NextServo +1
  )--End While NextServo
)--End Fn Servo_R

--registerRedrawViewsCallback Servo_R
registerTimeCallback Servo_R


When using the code, it is best to employ a mono-spaced font so as to take advantage of the indentation. It lines up okay here, since the code window uses a mono-spaced font. However, things like keywords and comments will not show up properly, because the code window is made for displaying ARC script only. As I write this I can't see how it will look, since there is no preview function here, so I'll just have to discuss it as I think it will appear. In a Maxscript editor window, you can choose Options, then click Use Monospaced Font. Unfortunately, it seems you have to set that every time you open the editor window, unless you change the settings in the Maxscript.Properties file. Which I did.

First off, what does it do? As I said, it's a first pass. It is basically the code which was put together by Mickey666Maus, who did a pretty good job of it. What I have done is address some problems he had with it and introduce formatting and good coding practices, as well as some additional functionality.

What it does is take movements from a virtual robot created in 3ds and generate a series of statements designed to be sent to ARC to control a robot. As it stands now, it will control the robot in real time, following the movements of the virtual robot. This is done by sending thousands of Servo() and Sleep() instructions in pairs.

These instructions get to ARC in one of two ways. One is via an HTTP Server port: the HTTP Server control is added to the project and set to run. At the 3ds end, HTTP statements are constructed and sent along to the HTTP Server via a network. Upon arrival, they are executed immediately, one after another. No script is needed at the ARC end to do this; instruction execution is built in, provided the string sent is in the correct format.

The second way to send the information is via a disk file. One of the functions of the above script is to send these same instructions to a disk file. This file can be loaded into a script variable, decoded, and run line by line via a script. This accomplishes the two initial goals of the project. It's not perfect: the real-time movements are not smooth, and with other things also running, it can get jerky, with hesitations. Previous posts in this thread discuss the reasons for that.

Even though the real-time method is not as polished as we would like, we are going to live with it for now and concentrate on a method which should provide smoother movements with far fewer instructions. I call this the vector method.

What the next pass at the Maxscript software will do is monitor the virtual servos' movements. The overall series of movements will begin with a starting point for all the servos. Then the script will watch each one to see when it has either stopped or changed direction. When that happens, an end point is considered to have been established for that movement. An instruction set will be generated and sent to the robot via the HTTP Server and (if desired) also to a text file. A new start point will be established at the previous end point, and the sequence begins again.

Additionally, timers will be running that allow a sleep period to be calculated to go along with the vector instruction. Finally, the time and distance moved will be used to determine a speed setting for the servo. How that last point will be done has not been determined yet. The obvious answer would be to use the ServoSpeed command, but even a small change in that setting seems to have a disproportionately large effect on servo speed, so using it may not work out. This method will not be real-time either; the robot will be one movement behind. This is because the movement instruction group cannot be calculated or sent until that movement has completed. Of course, that only matters when comparing it to the virtual robot. If using the disk file, you will never know it's happening.
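To make the stopped-or-reversed detection concrete, here is a minimal sketch of the idea (my own variable names and layout, not the actual next release; it only prints the vector it would send):

    global LastPos  = #(0.0, 0.0, 0.0) -- last sampled position, one slot per servo
    global LastDir  = #(0, 0, 0)       -- -1 falling, 0 idle, +1 rising
    global SegStart = #(0.0, 0.0, 0.0) -- position where the current segment began
    global SegTime  = #(0, 0, 0)       -- timeStamp() when the segment began

    fn checkServo idx curPos =
    (
      local dir = 0
      if curPos > LastPos[idx] then dir = 1
      else if curPos < LastPos[idx] then dir = -1
      if dir != LastDir[idx] do -- stopped or reversed: an end point is established
      (
        local elapsed = timeStamp() - SegTime[idx] -- milliseconds for this segment
        format "servo %: % -> % in % ms\n" idx SegStart[idx] LastPos[idx] elapsed
        SegStart[idx] = LastPos[idx] -- the new segment starts at the old end point
        SegTime[idx]  = timeStamp()
        LastDir[idx]  = dir
      )
      LastPos[idx] = curPos
    )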

Now, a word about the use and setup of the Maxscript.

Up front is a list of global variables which are used throughout the script for data manipulation and storage, as well as for control of feature selection. The first few are for that feature selection:

Global SaveToFile — When set to True, the data generated by the 3ds program and this script goes to a file.

Global SendToEZ — When set to True, the HTTP commands are allowed to be sent to the ARC HTTP Server control.

Global PrintOutput — When set to True, the servo instructions generated are displayed in the Listener.

Global DebugScript — When set to True, the debug features are enabled.

The next part of the code that needs to be described is the servo number - servo name relationship. The servo number is the usual designation for a servo port: D1, D10, etc. The servo name is the name given to the virtual servo in the 3ds animation. For example, in the current setup, these are $SchulterRechts, $ArmLinks, and $HandRechts. There are two places these values are placed into variables, lines 24 and 27. They are:


Global ServoNumbers = #("D3", "D4", "D5")
Global ServoObjects = #($SchulterRechts, $ArmLinks, $HandRechts)

This is where you would place your servo numbers and servo names.

                   NOTE: ORDER IS CRITICAL! 

When these arrays are used to access the positions of the virtual servos, they are used in pairs: first to first, second to second, and so on. Having these lists allows the script to go through the servos in a loop, making for compact code. Several methods to do this were considered, including one which would have done it all automatically, but this was considered the most flexible. You can easily insert a new servo in the list or move everything down one as needed. Just follow the examples and be careful to maintain the one-to-one correspondence. Also, the servos in the list are processed in the order they are listed, so if you need a different order for some reason, keep that in mind.

If you decide to modify the code, which is fine, be aware of certain restrictions concerning Maxscript. The big concern during software development is that there is no built-in way to stop the script once it has been started. This seems like a big oversight by the people who make the software, but they no doubt have their reasons. And you WILL need to stop the script and restart it during development. As far as I have been able to tell from the web, there is only one way to stop it: crash the script. To this end, I have put in a mechanism to do just that.

One of the great things about Maxscript is that you can make and display UIs (user interfaces) right from the script. This means you can create a window populated with buttons, display areas, menus, drop-down lists, etc., right from the script, then make them pop up wherever you want for the user to see data and provide input as needed. When you are done with them, you can make them vanish as well.

When the DebugScript variable is set to True, one of the things that happens is that a small window pops up shortly after you start the script. This window has a button on it which says "Stop Script", and that is just what it means. When you click it, the program will crash and you will get a crash report in the Listener window. The script is now stopped. You can modify the script as you like and restart it. If you don't do that and you restart the script, what you will actually be doing is starting a second copy of it; the first one will still be running. If something like an open file exists in that first copy, you will not be able to access the file at all in the second copy, because you will not be able to set a variable to it with an open command. Nor will you be able to close it or delete it. You're reading the voice of experience here. :D

One more thing about the Stop Script pop-up window: it can be positioned wherever you like when it appears. The call to make it show up is on line 44: CreateDialog StopButton Pos:[1540,600]

The numbers at the end are X-Y coordinates in pixels. On my screen that places it about midway up the screen and to the far right. You will want to adjust it for your screen. It's important that it is moved away from the editor or Listener windows, because they will tend to cover it up. Remember it's small, so you may have to look for it the first time.

Lines 12, 13, and 14 have places for you to put information specific to your setup.

Line 12 is the address from the HTTP Server control. It is the address in blue on the control itself. The upper one. Put it in place of the "NN.NNN ..." designations: Global SockAddress ="NN.NNN.NNN.NNN"

Line 13 is the execution and logon data. Unless you change the password in the HTTP Server control, you will not need to change this line. Global EZ_Logon ="Exec?password=admin&"

Line 14 is the file path and name of the file to which you intend to write the data generated by the script. You can call it whatever you wish and locate it wherever you like. Just be sure the script has write privileges to it: Global TextFileName ="F:\Users\Bills\Desktop\movement.txt"

That's it for now. Questions and comments welcome!

#21  

I have no idea what this script says, but nonetheless I'm blown away by its description. It must be an elegant and complex masterpiece, judging from what you say and what little I actually understand of the code. You continue to amaze me. If I had just a tenth of your brain power and tried to apply it with your skill and drive, I'd be exhausted most of the day. How do you do it? ;)

#22  

I know, right, Dave?!? Amazing! Impressive indeed. Hats off to you, good sir!

#23  

@Mickey666Maus, yes, somehow I missed one of your posts. Thanks for the compliment. I've been working with a program called Lightwave for many, many years. Once you know one, you kinda know them all. The interesting thing is that animation is very much like animatronics, which is what I do. For 20 years I have wanted to bring them together. The technology is finally here, but I don't know how to do any coding. I do have amazing ideas from my animatronic days that lend themselves to robotics, though. The key to getting clean mocap is filtering....kinda like turning an animation path from linear to a spline (curve); the curve filters out the high and low noise. The next solution is to build a telemetry suit where there is a potentiometer at each joint to act as an input. The old-fashioned way, and a bit restricting. But my interest now lies in facial mocap for advancing other test robots I have here.

Worst case, if I can't find a partner, I'll start paying someone, which is always good motivation!

#24  

@Dave Schulpius Thank you for the praise. When you enjoy something, it's not work; the knowledge comes easily and the hours pass all too swiftly. I enjoy programming. My meager efforts pale in comparison to your B9, though. That IS a work of art, inside and out. Heck, this little program isn't even 150 lines. It will expand, however ... soon. :) The real head-banging, "why did I ever get into this?!" stuff is about to start. Fun fun fun!

@fxrtst Edited: (I should know better than to try to post when I haven't had nearly enough sleep.) Thank you as well for your kind words. I would be glad to help with the coding you need, but it would be pretty much on a catch-as-catch-can basis. At least what you need and what Mickey666Maus needs seem to dovetail, so much of what I would do can be used on both projects (I think). Assuming 3ds Maxscript can run under Maya. I'll have to look into that.

#25  

@WBS00001 THIS IS AWESOME! I checked the Maxscript, and I guess it will take me some time to figure it out...IF I can figure it out! :D Looks super cool; I cannot wait to try it out! As I am attending my dad's birthday in southwest Germany right now, it will be hard to wait until I can do further testing!

It is so cool that you found a way to debug and stop/crash the script...this has always been a pain! I will run a quick test now to see if I can get it up and running; putting it all together will have to wait until I get back home, I guess!

Talking about Maxscript and creating GUI objects...this would also be a great idea: creating an interface that lists the servos in the script, plus creating the dependencies right at startup!

But I don't want to start daydreaming; this is an awesome piece of code...I'll try to put myself into decryption mode now!

Can't thank you enough for your help! Let's have fun with this...I will create a rig for a JD asap so we can get some animation going! :D

And yes...if you need help with 3ds Max, let me know! I know it inside out! Also, once you reach the limit of your copy, there should have been at least three new versions released by then; they should give you the new ones for free to learn and study too! I would not worry too much; these days Autodesk seems to be very kind to students and enthusiasts! :)

If you give me the servo structure of your Meccano, I will rig him up too... Let's rock! :D

#26  

@fxrtst It's kind of cool to think of it this way; maybe you could just download and install 3ds Max to give it a try, so we would all stay on the same platform while doing this? I know there is a servo plugin for Maya, and some other cool stuff that 3ds Max does not offer and vice versa, but as you said, in the end all of those applications sort of work the same way!

One of my goals would of course be IK/FK rigging of the virtual robot model, and importing some clean mocap data (.bip, .bvh, .fbx, or the like) and linking it to my robot model to get readable motion!

Let's see how far we can push this! :)

Also, I think Max and Maya integration should be on its way... ;)

#27  

And so it begins.

A flaw (or three) has been discovered in the code submitted in Post #21 of this thread. The code window of that post now contains the revised version. If you are using it, please go there and copy and paste it, replacing the old code with the new.

You can find that Post HERE.

#28  

I do indeed have Max installed; I have used it many times. The reason I am using Maya is that the program I am using only has a plug-in for Maya (and I believe MotionBuilder) to stream live, which interests me more than pre-recorded motions, for uses inside and outside of robotics. I certainly can use prerecorded motions for a lot of things too. That's why I am following here as well. If I have recorded motion inside this program, I can save it as FBX and then use it with Max or Lightwave or whatever program, as long as I can tie the programs and servos together. Exactly how are you doing this now, from Max to ARC? Remember, I'm no programmer, but I know hardware and programs...i.e., layman's terms please. If you have already described how you are doing this, forgive me. Perhaps a video as well.

Also, I believe ptp was working on streaming motion capture directly to DJ.

The sensor I was telling you about is also no longer available, as Apple bought the company two years ago. It uses the same tech as the Kinect, TOF (time of flight), but at a higher resolution. Not as high as the Kinect 2, but I believe that one uses something different than TOF.

The company below has the Kinect 2 working with motion capture; he has affordable body and face capture, and he is accessible to chat with.

http://brekel.com/

I believe this is very noisy, something the program I use filters out.

#29  

On the same thread concept, I had proposed a visual animation plugin for Auto Position. That way, creating animations is a little more intuitive. Here is a link to that thread. It used a lot of CGI-based animation parameters.

Animation plug in

#30  

@fxrtst

Quote:

Exactly how are you doing this now from Max to ARC?
Without going into excessive (and probably confusing) detail, it is a matter of placing an HTTP Server control in your project and sending a properly formatted text string to it. The HTTP Server control is found under the General tab in the Controls window. If you have a functioning network up and running, it should show you two numbers (in blue) on that control. It is the upper one we will be concerned with; that is the IP address used by the control. The lower one is the address of your router.

Normally a program is used to send something to it. In this thread, that program is the one I presented the code for in Post 21. However, for purposes of this discussion we will use a browser to do it. Start up your browser and, if possible, set it to a blank page.

Now you will put an address in the Address Bar at the top of your browser. This is where you usually put in web addresses. I will provide the address so you can simply copy and paste it in.

Here we are talking about that "properly formatted text string" I mentioned earlier. The string consists of two basic parts: the HTTP (Hypertext Transfer Protocol) part and the data part. The HTTP (and IP) part is this: HTTP://NN.NNN.NNN.NNN/

That's the actual IP (Internet Protocol) address of the HTTP Server control in your ARC project. You will need to replace the NN.NNN.NNN.NNN part with the number in blue from your HTTP Server control (the upper one). Don't worry if yours doesn't have exactly the same pattern as shown above.

Then comes the data part. Everything after the last slash is data of some kind. It is up to the receiving software component to interpret it and do something accordingly. In this particular example, the first part of the data is the Command and Login information. In this case it is Exec?password=admin&

The Command is Exec (Execute), and the login is password=admin. The question mark and ampersand are separator characters between the parts of the data.

The final part is the ARC Script Instruction that will be sent to be executed. In this example it will be to move a servo: Servo(D1,60)

That's it. So putting it all together then: HTTP://NN.NNN.NNN.NNN/Exec?password=admin&Servo(D1,60)

It is this line you will copy and paste into the address area of your browser. Again, you will want to replace the NN.NNN ... part with the address from your HTTP Server control. Also, you can make the servo port number whichever you wish, as well as change the 60 to whatever you wish. Be sure to save the string in which you replaced the values somewhere, like Notepad, because it will often be overwritten in the address bar if something is wrong.

Now, be certain that a servo is attached to the D-number port you have in the instruction part of the address string (or change the number in the string to a servo of your choice). Then connect to your EZ-B and press Start at the bottom of the HTTP control. If all is well, you should see a green area appear. It is ready to receive commands.

The servo should move when you press Enter in your browser address bar (assuming it is not already at 60 degrees). Additionally, "OK" should appear in the main body of your browser window. Change the number at the end of the address (the servo degree number) to make it move to a different position.

That's it. You just use the basic part of the string: HTTP://NN.NNN.NNN.NNN/Exec?password=admin&

And add whatever Script Instruction you wish to perform. No script is needed at the ARC end. The HTTP Server control does what is necessary.

All that is what my Maxscript program does. It takes data from the virtual robot as it moves along the animation timeline, creates the appropriate text string, then sends it over the network to be received by the HTTP Server control and executed. As currently applied, it sends out thousands of these, since it moves the servo(s) in tiny increments.

Basically it is a simple "plug in the appropriate data and send it" kind of thing. All the messy details are taken care of by the Maxscript routines and the HTTP Server control within ARC.
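For completeness, the same request can be fired from Maxscript rather than a browser. This is only a sketch; it assumes the HttpSock ActiveX rollout from the Post 21 code has already been created, and the NN address still needs replacing as described above:

    -- Hypothetical one-off send, reusing the HttpSock rollout from the main script
    cmd = "HTTP://NN.NNN.NNN.NNN/Exec?password=admin&Servo(D1,60)"
    HttpSock.Port.Open "GET" cmd False
    HttpSock.Port.Send() -- the HTTP Server control in ARC must be running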

The professor will now entertain questions ... :D

#31  

@fxrtst Sounds very good; if you already have 3ds Max installed, we have common ground for our discussion! It is quite understandable that you would like to stay on the platform integrating your favorite plug-in, which also supports a special piece of hardware you are using, but since you obviously know a lot about exporting motion data in different file formats cross-platform, it should be fairly easy for you to use our build for your needs too!

I saw your post about implementing some animation features for ARC and was very interested, but since the main purpose of ARC is to be a multi-purpose robotics platform rather than an interface for processing correctly timed motion, I guess it makes a lot of sense to research how we can use a piece of software specially built to fulfill this task and tie the strings together! Not saying, of course, that it wouldn't be great if there were a way to build those animations directly within ARC!

I remember using a Kinect and some early build of the software that Brekel provided to transfer live mocap into MotionBuilder, but I was a bit disappointed by the noise and the overall unsatisfying results I was getting...but this was way before they had two Kinects tied together. I thought of this as a solution to get real 3D mocap, but was far from getting the math done to do it by myself! :D

MotionBuilder is well integrated with 3ds Max, so if this is one of your main platforms, even better! :)

I am traveling right now, but once I get back home my plan is to make a JD rig, so we will have a common testing ground for the community...I will share the .max scene, and I could also make a quick tutorial on how to load it, start the script, and keyframe animation.

WBS00001 is currently working on improving the code; you could help by finding ways to import motion, since this seems to be your main field!

I guess one of the biggest advantages of rigging up our robots in our animation packages is the advanced FK/IK options, plus all the joint dependencies that could make robots with a high number of servo joints work smoothly!

Let's see how far we can push this! ;)

#32  

This was just a one-click search on YouTube, but I guess there is a lot of info on how 3ds Max and MotionBuilder are tied together....probably the most common ground would be the .fbx file format? I would also be interested in any ideas on how to import motion, live or pre-recorded! I used to make animations in 3ds Max years ago...I just stopped because it was eating up too much of my time back then! ;)

#33  

I was using a lot of pre-recorded motion in those clips back then...

#34  

That was oddly disturbing. I felt bad for the main character. If real, what a sad way to exist. :(

#35  

@wbs00001 thank you, I'll try this out later today!

@mickey666maus, fun video! Yes, .fbx seems the most universal way to move the motions around. I can get live streaming into MotionBuilder, but again it would have to be recorded motion to bring into Max, unless there is a live bridge between the two.

FK/IK would absolutely be a great way to create animations and keyframes. The two-Kinect setup works really well, but the recording volume remains small, like a 7-foot cube. The iPi software does an amazing job at recording motion, and the motion is filtered. The price is higher than Brekel, but the results are a lot better. There are several options for camera choices as well. It's still under development, but development is slow.

MotionBuilder works great for getting the motions into our packages. We should use it as our main central mocap processor (when we get to mocap).

#36  

@DaveSchulpius I consider your pity for the protagonist in my short a great compliment...but rest assured, I know him personally and he is doing just fine! :D

#37  

@fxrtst I also used MotionBuilder back in the day to get live data streamed... Using the Kinect, we should also keep an eye on getting it hooked directly to ARC, as discussed here in the forum! The HTC Vive could also be a good DIY solution for clean mocap data, although it does not really track all the joints, I guess...I will check what Brekel offers; as I said, it has been a long time since I took a look at their offerings!

It's been a quantum leap since then, I guess! :)

#38  

Gentlemen, I found this project. Although it's from 2011 and probably superseded today, and it uses Blender, not Max, it does describe in detail how to get from one program to the servo, using Python and writing out to serial. Some useful info here.

Blender to servo

#39  

When I was playing with RoboRealm, I had D.Cochran hook me up with streaming data into ARC from RoboRealm and vice versa: the EZ camera went into RoboRealm, RoboRealm then did facial and object recognition, and that data was passed back to ARC as a variable. Why can't we just have data stream from Max to ARC via variables, like PTP did with his live Kinect setup?

Kinect and JD

#40  

@fxrtst This is exactly what we are doing... I will make a tutorial video once I have a little time on my hands! The beauty of it is that there is no need to stream the data to a serial port, because of the HTTP Server that ARC supplies! :)

#41  

@fxrtst Thank you for the links. They were most informative.

@Mickey666Maus What I have seen of the code involved in the links fxrtst provided is that all of them use the same basic method we are using in the 3ds project. For example, the Blender project simply sends the servo position to the controller card as the virtual servo moves. The difference is that it does not use the network to do it, as we do. And that makes a big difference in the smoothness of the operation. However, there are steps that can be taken to smooth out operations when using the network method as well. Some may be mathematical, in fact. But I'll explore them a bit later. For now I'll concentrate on doing better with what we have, as described below.

Anyway, all that got me wondering what would happen if I used the file generated by the Maxscript directly in ARC. So I whipped up a quick ARC script to read the file and run the servos accordingly.

What I found was that the servos run much smoother (as expected) when the servo commands come directly from a file instead of coming in through the HTTP Server control via the network.

Additionally, I found that the 33.3333 ms sleep period between frames (groups of servo positions) was too long and was causing some of the rough movement. When I changed it down to 10 ms, for instance, the movement was smoother.

This, in turn, led me to the realization that we do not need to send the sleep period every time we send out a servo command. Something I should have realized before, but I was too blinded by science to notice it, I guess. :D

So now I just send a sleep command once at the beginning of the output, and that's it at first. Unless and until a change in sleep period occurs, only groups of servo position commands are sent. This allows for faster transmission and can result in smoother operation. Also, the sheer fact that the Maxscript program is sending out groups of servo position commands is a delay in itself: it will be some X milliseconds before another command for a given servo is sent again, as the other servos are processed and sent. The more servos, the greater the delay. That can have a pronounced impact on the smoothness of servo operation. However, it can also be used to calculate the appropriate sleep period: the more servo positions being sent, the lower the sleep period. Such a thing can be calculated either at the ARC end or the Maxscript end. For now it will be at the Maxscript end.
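As a sketch of how that calculation might look at the Maxscript end (my own rough formula and names, not the final implementation):

    -- Scale the per-frame sleep down by the estimated cost of the commands sent
    fn sleepForFrame servoCount msgCostMs fps =
    (
      local remaining = (1000.0 / fps) - (servoCount * msgCostMs)
      amax remaining 0.0 -- clamp so the sleep never goes negative
    )
    -- e.g. 3 servos at ~5 ms each, 20 fps: sleepForFrame 3 5.0 20.0 --> 35.0 ms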

The tests so far have been with only 3 servos. There will be much greater delays when 23 servo positions are sent. I'm thinking I will need to introduce software steps that check each servo for a change in position and send nothing for servos whose position has not changed, as opposed to sending every one out every time regardless. Basically, this uses the delta-compression principle of only sending data which has changed since the last time data was sent.
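A minimal sketch of that delta-compression step (the names are mine; it assumes positions are already rounded to whole degrees):

    global LastSentPos = #(-1, -1, -1) -- one slot per servo; -1 means nothing sent yet

    fn positionChanged idx degrees =
    (
      if LastSentPos[idx] != degrees then
      (
        LastSentPos[idx] = degrees
        true  -- caller builds and sends the Servo() string
      )
      else false -- unchanged: skip this servo entirely this frame
    )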

I will enter new code in Post 21 when I get things going on the Maxscript end, and I'll post the ARC script I am using to test the direct file-reading method when I am done with testing.

EDIT: Oh yes, I forgot to mention, I had to save the servo positions in the Movement.txt file in a different format from the proper one for ARC script. Instead of Servo(D1,60) I had to use Servo,D1,60. This is due to an ongoing bug in the script interpreter. Anything in a string should not affect the code; however, a close-parenthesis symbol in a string will be interpreted as actual code. If you try to use it like that, you get an error saying you're missing a parenthesis. So you can't have one in any string. In this case, that string would come from the Movement.txt file, hence the change in format. It is read okay, but as soon as some instruction tries to act on it, you get the error. Since I had to change the close paren to something else, I decided to use a comma, because there already was a comma in the statement and I was using that comma in the Split function to grab the data out of the string. To make it simpler still, I also changed the open paren to a comma.
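On the Maxscript side, that change amounts to nothing more than a different Format pattern (a sketch using the variable names from the Post 21 code):

    -- Instead of: Format "%\n" ServoStr to:TextFile  (which writes Servo(D1,60))
    Format "Servo,%,%\n" ServoNumbers[NextServo] ValueStr to:TextFile
    -- writes parser-safe lines such as: Servo,D1,60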

I posted about this bug, and others, over a year ago (8/12/2015) and it's still there. It was one of the reasons I came up with my text scrubber program.

#42  

@WBS00001 Good to see you are getting closer...step by step! :)

I also realized that the motion run from within ARC seemed to work a lot better than the directly translated motion. I was also guessing the delay is caused by the network; in the Blender example a COM port was used, so data transmission was a lot faster. As you mentioned, it's milliseconds piling up to the point where things get messy!

Also, I made a mistake on my side: when I jammed the robotic arm rig together while on the train, I changed the FPS within the 3ds Max project. I had been rolling the idea back and forth in my head whether or not a high frame rate could cause problems in data transmission, and I reduced it from 30 FPS to 20 FPS. I am really sorry that I forgot to tell you about this...so the motion in our testing scene is 100 frames long and is set for 20 FPS, meaning it is a motion lasting 5 seconds...which should make it 50 ms per frame, right? So each frame would get a 50 ms Sleep() period...
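(Just restating that arithmetic as a quick Listener check:)

    -- 100 frames at 20 FPS: total duration and per-frame period
    frames = 100
    fps = 20.0
    format "duration: % s, per frame: % ms\n" (frames / fps) (1000.0 / fps)
    -- prints: duration: 5.0 s, per frame: 50.0 ms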

I just did some testing, and it does not work at all. As you said, 33 ms already seems to be too slow...if I put in a sleep value of 50 ms, the script takes 15 seconds to execute instead of the five it is supposed to!

The 10 ms setting seems to be pretty close, but it runs kind of too fast...3 seconds instead of the five seconds it should run!

FPS Converter

I will wrack my brain now to try and find out why this is the case... :D

#43  

Still could not figure out why the Sleep() has such an unexpected effect on the animation...but since you said you are about to try a different approach, I just linked the rendered motion, so you have a better comparison of how it affects the real-world model! :)

#44  

I made a 3ds Max scene with EZ-Robot's JD. It comes with an animation that you can see in the video I made for timing reference, but it is also fully customizable if you want to build your own motion! It has just been made to provide a common testing ground for pushing our idea of transferring animation from this 3D application into ARC, so only the head and the arms are set up! No IK has been added yet...but it should be fun to try! :)

User-inserted image

#45  

You can find the .max scene here...it was a bit bulky for a direct upload!;) JD_Rig

#46  

@Mickey,

One question: can you export the animation (JD) to an FBX file?

Do you know if an FBX file has all the components plus the motion?

#47  

@ptp I can export it as an FBX file...it should have everything included! Will do it right now! :D

#48  

It does export the animation, but it does not export a functioning rig... But that's how .fbx works, I guess! The dependencies should still be there, I figure!

Give it a try! ;)

JD_Fbx

#49  

Yes, FBX kinda bakes the animation down almost into single frames. The reason is so you can rig in one package, animate in another, and render out frames in your favorite renderer, all with the FBX. All translation and rotational values for each item and/or joint are recorded with FBX.

PRO
USA
#50  

This is what I did at the end of this facial capture video... I ran the motion capture in Faceshift, live into Maya on a rigged character I built, then recorded the animation and exported via FBX, then imported it into Lightwave to render out the final frames...seen at the end. Funny thing is, you have to have the correct FBX plug-in for all packages, and Autodesk updates them all the time. So if I use FBX 1.2.3.4 and don't have the same version in Lightwave, it does screwy things. In this case the Y and Z were inverted, so I had to parent to a null and rotate it.

FBX was created by Autodesk, the company that bought up all the remaining 3d packages. They created the FBX format for studios with an existing pipeline, as a way to transition from one package to another.

PRO
USA
#51  

As a reminder, we don't need to be concerned with translation, only the rotational values of joints. Translation in world space won't even be used. That's 50% less data and values to be concerned with.

#52  

Man, this is awesome! So it seems like you know a lot about all the different ways of setting up animations, morph targets, rigging, and all the different ways of getting stuff moved across different platforms!

Not to mention the great job you did lighting, modeling, and rendering the model with all the high displacement-mapped details!

Maybe you could check the rig I did for the JD in max...I am really struggling to get the IK right, just trying to find a solution which would work out for our idea of using the virtual JD rig to drive the real one...

And since you mentioned working on the Guardians Of The Galaxy Sequel over at the other post, you must be a very talented person in every field! I would die to be on the set and check how they get all this awesome stuff done! :D

#53  

It came out super great...cannot stop watching! :D:D :D

PRO
USA
#54  

Thanks! My actual job is makeup and makeup effects, but in 1997 I hurt my back and was off work a year, so I taught myself CGI. I'm always coming up with POCs merging all the stuff I know into different areas: sculpting, 3D printing, mocap, animation, animatronics, and now robotics.

PRO
USA
#55  

Heyya fellas any progress?

#56  

There is an axiom in software development which says "10% of your problems will take 90% of your time." That has, indeed, been the case the last couple of weeks or so. I knew zero about Maxscript when I started, but still managed to come up with something, thanks largely to the script Mickey666Maus already had.

However, as the nitty-gritty phase of making things work came to pass, I ran into more and more time-consuming problems. I continue to run into them. So I have had to back up and punt fairly often, looking for solutions and testing, testing, testing. Maxscript is sorely lacking in certain desirable features, despite having something like 8000 commands and variations. Or maybe because of it. And, of course, life keeps interfering with the fun.

Presently, I have finally overcome the major ones I have encountered. Those I've not overcome still bug me (every pun intended), but I have found work-arounds that will have to do for now.

I had hoped to have an update posted this week. But then, I had hoped to have one posted last week too. Ain't gonna happen this week either so my sights are set on next week. Currently, the next "release" will include just the recording and playback functions. After that the real time motion mimicking.

Things are slow but steady. Both of us are still quite enthusiastic about the project, so it hasn't lost steam, just some of the momentum. :)

PRO
USA
#57  

That's great! Carry on gentlemen!

#58  

You're an inspiration, WB! I love your axiom about the 10%. It's so true. Like you, I'm fighting a motor control PID issue with my B9 arm. It's turning out to be some minor settings that are throwing off the entire elbow motor tune. I don't know what your process for finding bugs in scripts is. Do you have strategies and methods, or do bugs just pop up so you have to backtrack and rewrite sections to make them go away? With PID motor control issues like the ones I'm fighting, engineers usually have strategies or methods for finding the best settings in the software for smooth motor movement. They use the Z-N method (Ziegler-Nichols), the C-C method (Cohen-Coon), the T-L method (Tyreus-Luyben), etc. Most often, engineers in frustration resort to the method I use all the time: the WAG method (Wild A** Guessing). I was just curious if this last method ever pops up in your field of interest? :)
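
For reference, the classic Ziegler-Nichols recipe boils the whole tune down to two measured numbers: the ultimate gain Ku at which the loop oscillates steadily, and the oscillation period Tu:

Kp = 0.6 * Ku,   Ti = Tu / 2,   Td = Tu / 8

Everything past that is tweaking, i.e. WAG territory. :)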

#59  

@fxrtst I just saw the clip you posted over at the other thread...amazing! I will have to take a look at that sequencer you used! I hope we will be able to transfer correctly timed motion from 3ds max to ARC soon, thanks to @WBS00001, who made this project possible...my rudimentary scripting abilities are way too limited to create something as sophisticated! :)

I will supply a rigged JD with animation control so everyone interested will have a common testing scene for creating motion to be played back on a JD! :)

I am on the train for eight hours today...so hopefully I can make some progress! :D

Good to see that you are still interested! :)

PRO
USA
#60  

Yes, still interested. The sequencer is free software created by Flowbiotics. It controls any brand of SSC-32. For that example I used the new USB version from Lynx. You can also connect 4 inputs or use a PS controller.

#61  

@Dave_Schulpius Thank you for your thoughtful praise. This rambling post may disabuse you of the notion of my being an inspiration (or even of any use at all), however.:D But I thought I should answer in some way before more time passes.

WAGs seem to be a way of life in many fields, not the least of which is software. That's where all that testing comes in. You just try anything out of frustration. The thing you try can often turn out to be the source of the problem, even though you thought it couldn't be. You just could not see how it could be the source of the problem until you see that it IS the source of the problem. Then you can backtrack and figure it out. Sort of like writing a mystery novel backwards: start at the end and figure out how that end came about.

The advantage in software, however, is that a guess can be tried and implemented much more easily than in a mechanical system, and the results can be seen immediately. That is one of the reasons I changed from electronics to software as my fun thing of choice. Same mental challenge, none of the physical mess. Not to mention instant results.

In this particular case, most of my problems with the software are simply out of ignorance. It's the usual steep learning curve and, with enough research and testing, I manage to get things going ... eventually. Other times I just have to stop and think of another way to proceed. Good or bad, just another way. And then there are the "that's impossible" things that occur. Eventually I find it is indeed possible, and it's because I forgot a comma or a pair of parentheses at a certain point, or had to enclose some part of the code in them at another. Little things that can take forever to track down.

Mainly, though, it's a process of getting better at analyzing the symptoms and coming up with an appropriate diagnosis. Getting better at looking at the pieces, and at what the crash report is really saying, is what finally gives me a clue as to what's wrong.

I think that is much like what you are doing with the tuning control. Controlling a process through a PID loop is one of the hardest things to do, period. People study that problem all their lives. Some get degrees in the field. Others write copious numbers of books on the subject and come up with all sorts of equations.

I was sent to a couple of courses in PID control at Allen-Bradley by the company I worked for at the time, so I could better learn how to control a group of machines that worked in tandem to produce a product. What I learned from that, plus talking to people who had done that sort of thing for years, as well as educating myself on the formulas and techniques, came down to one thing: all that stuff just gets you close. After that you have to tweak it. Seemingly endlessly.

And that is where you are now, apparently. The advantage someone who has done that sort of thing for years would have in your situation is the ability to read the symptoms and come up with a diagnosis. They know, by observing what is happening, which part of the control is making it happen like it is. That is to say: is it the Proportional, the Integral, or the Differential portion of the PID control process which needs adjusting? That is the trick.

I had hoped to be of more help when writing a response to your post by looking more closely at your posts on the B-9, but I just have not had the time to do so. Usually that sort of control is for connecting disparate units of something and adjusting the overall process, so that the input side can be adjusted to get the output side to do what is desired. Because there is a delay involved in that sort of process, the PID controller is employed to better respond to variations in the overall loop, especially at the input, so as to maintain a good average output. They don't try for perfection, just a good statistical average. Usually the process is so slow there is time to do that.
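
For reference, the textbook form of that loop, with e(t) being the error between setpoint and measurement:

u(t) = Kp * e(t) + Ki * integral(e dt) + Kd * de/dt

The three tuning knobs are exactly the P, I, and D portions mentioned above.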

In your case the process is fast, but it has repeatable variations in loading as the arm moves through its course. The tuning process is trying to smooth out those variations for better motion. At least I think that's what is happening. As I said, I had hoped to get a better grasp on the problem before posting.

Having said all that, in my usual overly wordy way: were it my problem, I would back up and punt. I have no real idea why the first arm seems to have worked out so well compared to the second one. Perhaps it only seems to have worked out that way. Regardless, I would go back and see if there isn't some way to use a linear actuator for the operation. I know you have space constraints, but I would be looking for a way to overcome that. For example, it is not currently possible to place servos in the fingers of a robot hand, so the movement is done through a linkage such that the servos can be placed in the arm or the palm. Similarly, it may be possible to locate the actuator vertically someplace where there is more room and link it to the slide mechanism in some creative way. Some of them have built-in potentiometers, which could eliminate the need for that thin variable resistor you are using now. Come to think of it, it's even possible that device is what is making this tuning so much more difficult. Maybe this one is noisier or has flat spots on it. It is what provides the feedback for the loop, is it not?

Sorry if I'm way off here in suggesting a linear actuator; I know it's been suggested before, but it would be my device of choice here. Hopefully this lengthy missive has been of some small benefit nonetheless. When I have time, I will look more closely at the problem, but that is no help to you NOW when you need it. So I have offered what I can off the top of my odd but reasonably functional head. :D By the time I get around to being of better assistance, you will have already solved the problem anyway.

Cheers, and try to have fun, difficult though it may be sometimes.

#62  

Well, WB, as usual you've astounded me. Thanks for the thoughtful reply to my post comparing our two processes. I'm flattered and thankful you're willing to lend me your brainpower and expertise on my issue. However, I don't want to hijack this thread, and I know you have a lot of other things on your plate.

I've been able to fine-tune my arm's PID to the point that I think it's acceptable. I really won't know how close it is till I have it mounted on the B9 and start feeding it movement commands from ARC. That won't be for weeks or months down the road. At that time I'll post the results in my project thread. If needed, and you have the time, I'd welcome your experienced and wise input. Thanks again! :)

#63  

Hey guys...since I am more or less finished with my rig of the virtual JD, which will be used to drive the animation of the real-world model via a Max Script that @WBS00001 is currently developing, I thought I would post the progress to get more people interested in our project!

The software used is currently available on the Autodesk website for download, and it can be used for free on a student licence...I will attach the scene, so hopefully anyone interested in animating motion for a JD can have fun fooling around with it!

The rig offers custom blending of forward and inverse kinematics for the legs; all the rest is straight forward kinematics! :D

There will still be some tweaking until @WBS00001 and I find the final working solution, but I could not resist posting it here...hope you guys are gonna like it! ;)

PRO
USA
#64  

Looking very nice! Great job. I look forward to rigging and animating my own robots and characters with this plug in.

#65  

Oops...forgot to post the link to the 3ds max scene! :D

https://we.tl/uxPnm06AUU

#66  

@fxrtst good that you will be with us on this...let's try to get further into creating correctly timed and readable motion for our robot rigs! To me this is one essential element in creating an emotionally believable character for the robots we are trying to build!

There are tons of good tutorials on this topic, but this one from TED kind of nails it down to the very core of it... ;)

PRO
USA
#67  

Absolutely correct, and the main reason I'm also trying to bring facial motion capture into this whole thing. Ease-in and ease-out are crucial to animation and animatronics (my world). We've had to create animatronic puppets that behave like real animals or creatures. The mind knows what it has seen for thousands of years. It's, shall we say, in our programming. It's also one of the reasons we all react to Boston Dynamics robots. They mimic nature. And that's where I want to continue to take robotics. But the foundation has to be importing real motion.

#68  

Motion has a big impact on how believable a character is...this is a very good talk on the topic!

Sorry for the re-post! Could not help it... :)

PRO
USA
#69  

.....yes perfect example.

PRO
USA
#70  

@Mickey666Maus

Can you export (Virtual JD) to 3ds format ?

#71  

@Mickey666Maus

Here is the script I mentioned in my recent email. One thing, though. I had said you would need to modify the first 2 lines if you placed the text file somewhere different than I did. It should have read to change just the first line. Originally it was 2 lines, but I combined them into 1. Let me know if there are problems with it or you need more info.

I also realized I could include the modified text at the end of the script so you could just cut and paste it into Notepad and save it. So I did. Be sure to cut it and not just copy it so it will be gone from the script.


ClearVariables()

$ServoDataPath ="C:\Users\Public\Documents\Test\ServoData.txt"
DefineArray($ServoDataArray,1,"" ) #Define an array with 1 element (used later)
 
If(FileExists($ServoDataPath))
  FileReadReset($ServoDataPath) #Set to beginning of file
  $TheText =FileReadLine($ServoDataPath) #Read a line of text
  Print($TheText)
  $ServoDataArray[0] =$TheText
  $NextLine =1
  RepeatWhile(FileReadEnd($ServoDataPath) =0) #Read until end of the file
    $TheText =FileReadLine($ServoDataPath)
    AppendArray($ServoDataArray,$TheText)
    Print($ServoDataArray[$NextLine]) #So you can see the stored data
    $NextLine++  
  EndRepeatWhile  
  FileReadClose($ServoDataPath) #ALWAYS close the file after use
    
    #Now the data is in an array.
  
    #If you wish to get the individual numbers from each line of text,
    #the Split function can be used.
    
  $HighServoArray =GetArraySize("$ServoDataArray") #Find out how big it is
  $NextLine =0 #Now read from the array, character by character
  RepeatWhile($NextLine <= $HighServoArray)
    $NextNum =0
    Print($ServoDataArray[$NextLine])   
    $TheNumStr =Split($ServoDataArray[$NextLine],";",$NextNum) 
    Print($TheNumStr)
    RepeatWhile($TheNumStr != "") 
      $TheNumStr =Split($ServoDataArray[$NextLine],";",$NextNum)
        #Here is where you would do whatever it is you want to do with
        #the value just extracted. I'll just print it.  
      Print($TheNumStr)
        #I put a sleep here if you want to actually see the numbers as
        #they go by.
      Sleep(500)
      $NextNum++        
    EndRepeatWhile
    $NextLine++
  EndRepeatWhile
EndIf

#The modified text:
0;0;0;0;0;0;0;0;0;
0;0;0;0;0;1;1;1;1;
1;1;2;2;2;2;3;3;3;
4;4;5;5;6;6;7;7;8;
9;9;10;11;12;13;14;14;15;
16;17;18;20;21;22;23;24;25;
27;28;29;31;32;34;35;37;38;
40;41;43;45;46;48;50;52;53;
55;57;59;61;63;64;66;68;70;
72;74;76;78;80;82;84;86;88;
90;92;94;96;98;100;102;104;106;
108;110;112;114;116;117;119;121;123;
125;127;128;130;132;134;135;137;139;
140;142;143;145;146;148;149;151;152;
153;155;156;157;158;159;160;162;163;
164;165;166;166;167;168;169;170;171;
171;172;173;173;174;174;175;175;176;
176;177;177;177;178;178;178;178;179;
179;179;179;179;179;180;180;180;180;
180;180;180;180;180;180;180;180;180;
180;

PRO
USA
#72  

@wbs00001

I believe the code below is wrong.


$HighServoArray =GetArraySize("$ServoDataArray") 

Replace with


$HighServoArray =GetArraySize($ServoDataArray)

#73  

@ptp

Thanks for your observation but the script help says: *Note: The variable passed as a parameter must be in quotations. See the example below. Example: $x = GetArraySize($myArray) :)

PRO
USA
#74  

@wbs0001

Wow, learning something new every day...

A string, for me, is delimited by quotation marks, so I didn't think twice...

It breaks the common pattern, although some languages have an Eval method with code inside a string.

#75  

@WBS00001 this is great help! I just spent so much time wrenching my brain on how to get this going; seeing your code explains why I could not get it done! :D

Thanks for helping me with all the coding; without it I would be stuck instantly and the whole project would be at a dead stop!

Did you see that @Anthony from XLRobots.com just posted a thread where he is modeling a virtual robot in 3ds max...it would be kind of cool to integrate this into our project since he is using the same platform for modeling and rigging!

Johnny-5 Made For 3d Print Ezbv4 Controlled Robot Kit

@ptp Hey, I just read your explanation of the different communication types available for talking to servos, really something I had missed so far...thanks for the links and for doing the research on it!

I can export the Virtual JD to the .3ds format, but I doubt the wiring will still work...the whole idea is to animate in 3ds max and send the data out to ARC, which can be done via HTTP as a live control, or by having 3ds max write the animation to a disk file which is then processed by ARC! The .3ds will most likely contain all the objects and the corresponding animation, but it will not carry the parameter wiring, and of course not the script which handles sending the servo rotations to the disk file or to the ARC HTTP server...
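
For anyone curious what the live HTTP route looks like from MaxScript, here is a minimal sketch of a single position update via the .NET WebClient. The URL, password, and command syntax are illustrative placeholders only; they depend entirely on how the HTTP server is configured:

wc = dotNetObject "System.Net.WebClient"
pos = 90 -- servo degrees pulled from the rig
-- placeholder endpoint, not the real ARC API:
wc.DownloadString ("http://127.0.0.1/Exec?password=admin&script=Servo(D0," + (pos as string) + ")")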

#76  

@Anthony So, just in case you pay us a visit, I thought I'd repost some of my progress modeling my robot in 3ds max, and the revisions of my servo hinge for the Tower Pro SG90 9g micro servo...I did not want to flood your thread, so if you see this I would be happy to have you on board! ;)

User-inserted image

User-inserted image

User-inserted image

#77  

@Mickey666Maus

Glad to do it. You're very welcome. I had to make yet another change to the code I posted. After a much-needed good night's sleep, I realized I had left in some debugging code that prevented the last line from being loaded into the array. (Yeah, that's my excuse and I'm sticking to it :) ) These are the lines of code that I removed:


    If(FileReadEnd($ServoDataPath) =0)
      $TheText =FileReadLine($ServoDataPath)
    EndIf

#78  

@WBS00001 finally I had the time to test your code...it works like a charm! Good to know how to handle this; I would never have coded this myself! :D

I got some error messages, so I added some lines of code; now it works for driving the servo from the file...

Needless to say, it still runs very choppy, as too many different servo positions are being sent and no accurate timing is given! But that is another story, and I am curious how you will tame this beast! ;)

We could also try to clean up the array on the 3ds max side and just store the keyframes in the disk file...or the servo number first, followed by a tandem of the keyframe value plus the frame number of the keyframe? Just an idea that came to my mind a while ago...

$HighServoArray =GetArraySize("$ServoDataArray") #Find out how big it is
  $NextLine =0 #Now read from the array, character by character
  RepeatWhile($NextLine <= $HighServoArray)
    $NextNum =0
    If($NextLine < $HighServoArray) #it gave me an error code here without this guard
      Print($ServoDataArray[$NextLine])
      $TheNumStr =Split($ServoDataArray[$NextLine],";",$NextNum)
      Print($TheNumStr)
    Else
      Print("finished")
    EndIf
    RepeatWhile($TheNumStr != "")
      $TheNumStr =Split($ServoDataArray[$NextLine],";",$NextNum)
        #Here is where you would do whatever it is you want to do with
        #the value just extracted.
      If($TheNumStr = "") #and it gave me an error code here if the variable was blank
        Print("empty")
      Else
        Servo(D0,$TheNumStr)
        Print($TheNumStr)
      EndIf
        #I put a sleep here if you want to actually see the numbers as
        #they go by.
      Sleep(5)
      $NextNum++
    EndRepeatWhile
    $NextLine++
  EndRepeatWhile
EndIf #closes the If(FileExists(...)) from the original script

#79  

So this is a new version of the Virtual JD; it is a 3ds max scene with an animation for testing purposes. It will let you control your JD with the animation package, so you can transfer motion to ARC via a script that @WBS00001 is currently developing!

The new scene has some changes in the naming conventions to make it more suitable for the script which will be reading the rotation values! ;)

https://we.tl/6Vm37jJ2nF

PRO
USA
#80  

Any luck with smoothing out the motions?

#81  

@WBS00001 is still at it! He asked me some days ago if I wanted to check his progress on the script, but as my coding is so limited, I told him I might rather wait a little and get my hands on the finished piece of software, since there will be further revisions! The problem persists: ARC runs on Windows, and there is no real-time clock to precisely drive those servos... @WBS00001 took a swing at using ticks rather than the sleep() command, which means the timing might not be exactly the same on every PC; depending on the tasks running and the hardware differences of each machine, there is a tolerance in how fast the individual PC can process those tasks! But it also means the integrity of the motion will be preserved, and you could think of it like fine-tuning your old vinyl record player, I guess...

Ease-in and ease-out motion seems to be tricky...but he already told me that it is more than possible to get this correctly timed too! ;)

The last option would be to transfer the animation's data string to some breakout board which runs its own independent timer and can process the data correctly! This would be the last option because we are trying to stay within ARC's given mechanics, so everyone in the community can use it without needing to purchase extra hardware! :)

Thanks for showing us you are still interested, @WBS00001 is still rocking it! :D

#82  

@fxrtst EDIT: I see Mickey666Maus has posted while I was typing up my reply. That's fine. He has encapsulated well what I am saying below in more technical detail. So ...

The short answer is ... well, there is no short answer. Would that getting smoother operation were the main problem. Things have progressed beyond that relatively simple goal, however. I was getting pretty good results when working with just 3 servos in my initial testing, smooth motion and all. However, scaling up to 10 is not working out so well. I shudder to think how it will be with 24.

My whole system revolves around running each servo in its own script, thus its own thread. The movements of the servos are tied together by "ticks" sent out from a master timer (counter), also running in its own thread. Each servo script sets itself up to wait for a certain tick, then, when it comes, does a series of small moves based on frame-by-frame information from a file generated by the 3ds script, with each frame's data timed to run from subsequent ticks. All that is set up ahead of time such that it all resides in memory (actually in an array).

I had theorized that each servo script working in its own thread would mean each gets a uniform slice of time to work, thereby helping to make up for the lack of a real-time series of ticks. But that's not quite working at the moment.

So the problem has moved from smoothing things out to just getting the servos to follow the movements generated by the 3ds script. And that has gotten more complicated, since I had to add data for each frame for each servo in order to have any hope of the playback system mimicking the motions of the 3ds model at all. That is, with regard to the speed and direction of motion as the servos go from point A to point B to point C, etc. All this to achieve the non-linear motion the model shows as the virtual servos make their movements. IF I can get that going, then I can get back to work on smoothing things out.

If I stuck with linear movement, as I originally had, things would be fine, but that does not mimic the movements of the 3ds model very well. It gets the servos from point A to point B but bypasses all the nuances in the movements. Still, in the end that may be all that can be achieved, given the real-world vs virtual-world translation of movements involved. The next level up would be to use a separate controller board and send commands to it from the EZ-Scripts. Not a great solution, but one which holds promise of better servo control, since it would have a real-time clock to work with.
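
Put concretely, linear playback between two recorded points (tA, pA) and (tB, pB) is just:

p(t) = pA + (pB - pA) * (t - tA) / (tB - tA),   for tA <= t <= tB

so any ease-in/ease-out the model performs between those two points gets flattened out.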

Probably much more than you wanted to know. :) But I wanted to throw all that out there via this post, to see if others have any ideas.

PRO
USA
#83  

No, you can never give too much info! I didn't think you could do the project without an external board. Timing is critical and, as you said, once you start stacking up servos it gets worse. Ease-in and ease-out do indeed soften the animation and really are key to a successful setup.

Do you guys have keyframes on every single frame? Or do you have a keyframe on, say, frame 0 and then a keyframe set for frame 30? I would assume keyframes on every frame would bog down the system, even if you are looking at just the rotational values and not the translation values.

PRO
USA
#84  

@wb, can you contact me off forum at, fxrtst@aol.com

Thanks!

#85  

@WBS00001 it's been quite a while, but have you had some luck with this code and getting the kinks worked out?

PRO
Synthiam
#86  

After reviewing this thread, it seems WBS has done almost everything that was asked. The Max Script he posted in this thread will export a text file of servo positions, which is exactly what you're all asking for. All that is really needed is a way to import it into ARC.

After reviewing FBX, I don't think it's the right direction to go. FBX contains way too much information and doesn't identify the exact position of each joint, nor is there a way to map it to a servo. I can look into it further, but I believe WBS' script is the way to go.

Let me look into getting 3ds Max so I can explore it more someday in the future.

#87  

Okay, well, since it kind of sputtered out and WBS said it didn't work past two servos, I figured it was a dead project, but alright. On importing that into ARC: will that be a possible plugin? From a text file?

For your work on importing the data into ARC/the Auto Position controller, this should produce a .csv file in the same vein. Hope it helps with getting that data imported. Looking forward to seeing it.


 (
	-- write one node's parent-relative rotation for the current time
	fn writeData bn toFile =
	(
		data = if bn.parent == undefined then bn.transform.rotation as eulerAngles else (bn.transform*inverse(bn.parent.transform)).rotation as eulerAngles
		format ",%,%,%" data.x data.y data.z to:toFile
	)
	
	-- walk the hierarchy, writing one CSV row per node: name, parent, then x,y,z per frame
	fn recursiveWriteData bn toFile =
	(
		format "%" bn.name to:toFile
		format ",%" (if bn.parent != undefined then bn.parent.name else undefined) to:toFile
		for t = animationRange.start to animationRange.end do
		(
			at time t
			(
				writeData bn toFile
			)
		)
		format "\n" to:toFile
		for obj in bn.children do recursiveWriteData obj toFile
	)
	
	fn getFileName =
	(
		return getSaveFileName caption:"Save..." types:"Excel(*.csv)|*.csv|"
	)
	
	fn main =
	(
		root = undefined
		if selection.count != 1 then
		(
			messageBox "Please select just the root joint and re-execute the script." title:"Error: rootJoint"
			return -1
		)
		else
		(
			root = selection[1]
		)
		fileName = getFileName()
		if fileName == undefined do
		(
			messageBox "Please select a filename for the file to be saved." title:"Error: fileName"
			return -1
		)
		-- createFile both creates and overwrites; openFile would open the file read-only
		file = createFile fileName
		recursiveWriteData root file
		
		flush file
		close file
	)
	main()
) 
PRO
Synthiam
#88  

Ah okay! Now we’re onto something.

I have a really neat idea, since you're familiar with Max Script and I'm in the dark: what if you exported a CSV with your script for me, and I created a plugin?

I think what Will was looking for with FBX is like recreating 3ds Max! The goal is to take an export of animation joint movements, convert it into servo degrees, and import it into ARC - not recreate a HUGE existing animation package.

If the problem and the suggested solution are both identified and relate to each other, I can create a plugin. The FBX suggestion would have ezrobot recreating 3D design and animation software, and that's not our forte - it's also a significant feat, since Autodesk already has a great software package.

So let’s get a working and dynamic Max Script that exports joint servo degrees as frames and I’ll make a magic import plugin.

Once we get that working - I’d like to explore how we can connect ARC directly to max 3ds and move the robot servos in real-time.

PRO
Synthiam
#89  

One last thing, regarding my mention in the other thread about using Auto Position: your response sounded discouraged about creating positions for each frame.

It occurred to me throughout this conversation that you might think a frame is like an animation frame, where there are 30 per second or something...

The Auto Position calls a frame a resting goal position of servos. An action consists of a few frames. The algorithm interpolates the servo positions between two frames. Frames are goal positions - not per second or per movement. The system calculates that for you.

Are you sure it’s not the Auto Position you’re looking for?

PRO
USA
#90  

The reason I suggested FBX is that it is an interchange file format created by Autodesk, so that any animation software (3ds Max, Maya, MotionBuilder, Lightwave, etc.) can share animation data. Basically you are logging the scene (lights, camera, angles of joints, etc.) in the FBX file.

Obviously the only things useful to us are the joint positions on bones at certain frames, so we can use those to map to our robots.

Here is where I'd like us to be someday.

Here is a plug-in written in 2009 by Dan Thomson: https://www.highend3d.com/maya/plugin/servo-tools-for-maya-for-maya

...and it is demonstrated here:

PRO
Synthiam
#91  

Now I have no idea what's going on. What is Maya? Or MotionBuilder? What program do you actually want this for? I'm not sure I know what it is you need anymore. Can someone explain the exact problem you're wanting to solve, and with what software and what format?

This conversation is a dream of a million wishes but no clarity. I'm not sure I can help you at this stage. Identify among the few of you what it actually is you need and let me know once you agree :)

#92  

That sounds great, actually. Let me fiddle with it, run it by some friends who have the magic, and make sure it works reliably by running it through a few different configs. Then I'll send it over tomorrow once I know it works, and you can do your magic.

I think that'd be really great as a consumer, to get this software working with yours. So, I started out as a scientist, got sick of the publish-or-perish academia malarkey, and decided to take my ideas, make them real, try to do some good, and help people. As an inventor early in my "mad scientist" career I've had to start learning a bit of everything. Where before I studied neuroscience and could barely wire up an Arduino, now I have to do and learn everything from code to MIM manufacturing techniques. For me and other makers, real-time control with animation would be a great tool to create truly new things without having to learn freaking EVERYTHING.

I appreciate your double-checking. Yeah, I read through your manual, watched a bunch of videos (good stuff, BTW) and played with the software. It's not that it's failing me algorithmically; it's that, as I said, the kinematics in this machine are VERY weird. Setting a position goal would get it to move to that position, but there are 17 joints in total, and for the NEXT movement to be completed correctly, some parts have to move away from each other and others have to move in not entirely intuitive ways. In short, I would love to just buy your kits and stay 100% within the platform, but needing NEMAs, the alien kinematics, etc. extends past your original vision, I think, though it's so cool that the platform is evolving complexity like this. As a biologist, I'm all about that self-organizing complexity, ha :D:D :D

#93  

Ah, didn't see the last few bits. No, nothing has really changed. I agree with @fxrtst that .fbx would be ideal; it's simply a matter of quantifying what that means. Like you said, it has a lot of complicated data in each file. MotionBuilder, Maya, and 3ds Max are just different animation/game-design-type suites that Autodesk has, and .fbx is (with certain limitations) interchangeable among them. I think taking .fbx files should be the goal, like @fxrtst said, someday, but starting with the route we discussed is a good starting point. Frankly, getting .fbx working through ARC has a lot of potential. It'll expand the range of interested consumers to entire categories that are being ignored. Would you agree @fxrtst? I think we should start with this plugin and then work up to general .fbx-guided robot programming, as that will be a crazy involved process with a bunch of unknowns that'll probably fizzle out again like it did last year with the last guy, WBS, that tried. Instead, with a bit of Max Script and magic plugin stuff from DJ, we can have something that works to build off of as a community and not lose steam, since it will take a long time to get .fbx ARC programming completely realized.

PRO
USA
#94  

@DJ, it's OK, no worries.

These are all exactly the same concept, although if you're not coming from an animation background it's hard to wrap your head around.

Autodesk owns 3D Studio Max, Maya, and MotionBuilder, so the animation systems are basically the same. The .FBX files from all these programs are the same. Everything we have been discussing, be it 3D Studio Max or Maya, uses the same animation processes.

The endgame is the same: exporting animations (or real-time motion) from a virtual robot inside an animation program and applying the rotational values to matching servos.

How we get there doesn't matter to me. Which program doesn't matter to me. I added the links above because someone has already created a Python plug-in that does this with an Arduino, and sometimes reverse engineering gives some or all of the answers.

I built the above system from Dan Thomson in 2012 and got a servo to move in real time by moving a virtual jaw in Maya (video somewhere?). But I just followed his directions. I have no coding skills, or I would have tackled a plug-in for ARC a long time ago :(

It has been a dream, as I said, for 30 years. But there are others in this thread thinking the same way as me, so I assume I'm not the only one who sees value in a system like this.

That said, this is an extremely time-consuming concept, and you have a business to run. Perhaps some of the fellas here on the forum would like a stab at moving it forward (PTP?)

#95  

Yeah, this is fair. I think starting out with a plug-in would get the train rolling and attract users with the skills to take the software there. Basic group psychology: show people something works and mass-psych forces take over. "Hey look, fire cooks food" caught on, and now I'm heating up pizza bagels in my microwave.

I haven't had the dream for 30 years, but I would love to see this work done as well. I'm working with my buddies on cleaning up the script that produces the CSV file. It works-ish at the moment, but there are a few irregularities that should be worked out tomorrow. As long as we all agree, and DJ is still willing, then we can, in a sense, program a robot with an animation. That, I think, should grab the attention of the more skilled users.

Anyway, let me know if this is cool with everyone. Unfortunately my skills are pretty limited, so I can't help much beyond this light data-extraction stuff, but I'll do what I can!

#96  

Hey guys, great to see this dead thread is hot again! :D

I had a lot of stuff going on in my life lately, and @WBS00001 had an accident, which is the reason he could no longer work on this...

But it's good timing to dig this out again, since I have been doing a lot of robotics lately, including continuing work on exporting animation to my robot out of 3ds max! Writing the servo positions to a .txt file is not a problem at all...live control is also possible! IK is possible...I made a fully rigged JD for testing purposes! I wrote all this in Max Script and can share it if needed! At the moment my robot is FK-rigged and I am exporting the servo positions as a plain .txt file. I am still struggling with an IK setup...someone specialized in mechanical rigging within 3ds max would be highly appreciated!

The main problem, though, still persists, which is sending a lot of data, correctly timed, to the EZ-B or any other microcontroller board...the EZ-B will not send it with the correct timing, since Windows is a task-based operating system. And the other microcontroller boards I tried could not handle the traffic due to their limited onboard memory, and dropped out due to memory buffer overflow!

At the moment I am sending the positions frame by frame via LAN, rather than all the positions in one coherent array...I had a bit of trouble with Wifi and switched to LAN for speed and stability. I also realized that even when sending 30fps to the servos, the motion seems a bit choppy and needs to be smoothed out: just enough smoothing that the movement is still carried out correctly, evening it out only a tiny bit between the frames!
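
What I have in mind for the smoothing is nothing fancy; a minimal MaxScript sketch of an exponential blend (the function name and the alpha value are just illustrative and need tuning per rig):

-- blend each new sample with the previous smoothed one; alpha near 1 = less smoothing
fn smoothSamples raw alpha:0.7 =
(
    out = #(raw[1])
    for i = 2 to raw.count do append out (alpha * raw[i] + (1.0 - alpha) * out[i-1])
    out
)
smoothSamples #(90.0, 92.0, 97.0, 95.0, 91.0)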

#97  

Aw, WBS? Poor guy. I just had a bad one of those myself last year that put me out of commission for a while too.

Anyway, interesting info @Mickey666Maus, thanks! Now, if you got frame info from 3ds Max, per frame, and ported that directly through the Auto Positioner at an equivalent number of "frames", and used the UWP code output to run your board to command the servos (or in my case NEMAs and servos), wouldn't that resolve your choppy issue?

Reasoning: I think the final solution to full live replication and .fbx-file servo programming is going to rely on essentially creating a script that is pretty much the backbone of 3ds, but one that only selects certain values, then sorts that data in such a way that it can be coded to automatically set the positions at a frame rate equivalent to that of the animation. Honestly, I have no idea how to do that, but I'm glad I could help reignite the discussion.

I can see this extrapolated in the future: people programming robotics through a hybrid animation/robot-programming suite, where one can design things like lights, sounds, and sensor responses in an animation and have a real machine carry out those instructions, without having to become a master electrical engineer and animator at the same time. And that would just be the start! Well, we're a ways away for now, but let's see if we can get that train chugging.

@Mickey666Maus you said you have the code to export as text. Could you share it? I'm an absolute novice and would love to compare notes, basically to see if we can optimize this method, and once it's established, build the complexity from there. This is cool. Ha.

#98  

So I got some help from my friends in the Autodesk community. This is what we have so far:

Let's start with a simple animation first:


delete objects
(
	p0 = point name:#hip pos:[0,0,100] wirecolor:red
	p1 = point name:#knee pos:[0,0,50] parent:p0 wirecolor:orange
	p2 = point name:#foot pos:[0,0,0] parent:p1 wirecolor:yellow

	s0 = p0.rotation.controller = rotation_script()
	s0.setexpression "eulerangles 0 (-(sin (t*0.1))*30) 0"

	c0 = p0.position.controller = position_script()
	c0.setexpression "[t*0.05,0,100 + sin(t*0.1)*10]"

	s1 = p1.rotation.controller = rotation_script()
	s1.setexpression "eulerangles 0 (-(sin (t*0.1))*60) 0"
)

Then do the same thing but with transforms instead of just positions


struct JointPos (time, tm) 
struct JointData (node, name, transforms = #()) 

fn collectJointPositions joints range: = 
(
	if not iskindof range Interval do range = animationrange
	
	data = for joint in joints collect (JointData node:joint name:joint.name)
	
	for t = range.start to range.end do at time t 
	(
		for d in data do append d.transforms (JointPos time:t tm:d.node.transform)
	)
	data
)

/* make a test scene **********/

delete objects
(
	p0 = point name:#hip pos:[0,0,100] wirecolor:red
	p1 = point name:#knee pos:[0,0,50] parent:p0 wirecolor:orange
	p2 = point name:#foot pos:[0,0,0] parent:p1 wirecolor:yellow

	s0 = p0.rotation.controller = rotation_script()
	s0.setexpression "eulerangles 0 (-(sin (t*0.1))*30) 0"

	c0 = p0.position.controller = position_script()
	c0.setexpression "[t*0.05,0,100 + sin(t*0.1)*10]"

	s1 = p1.rotation.controller = rotation_script()
	s1.setexpression "eulerangles 0 (-(sin (t*0.1))*90) 0"
	
	global root = p0
)

/* collect the data **********/

joints = join #() root 
data = collectJointPositions joints

/* reset scene and restore animation per frame *******/

for joint in joints do 
(
	c = joint.controller = createinstance prs
	notifydependents c
)

/* restore data **************************************/

animate on for d in data do
(
	for p in d.transforms do at time p.time d.node.transform = p.tm  
)

Quote:

because we apply transforms in hierarchical order we are good. but we can do everything in local space.
So here is how to store the CSV data and then retrieve it:

 
data_file = getdir #temp + @"\joints_data.csv"

fn writeDataToCSV data file =
(
	fs = createfile file
	for d in data do
	(
		format "%" d.name to:fs
		for p in d.transforms do 
		(
			format "," to:fs
			for i=1 to 4 do for j=1 to 3 do format " %" p.tm[i][j] to:fs -- 12 floats matrix3 for every frame
		)
		format ";\n" to:fs
	)
	close fs
	file 
)

writeDataToCSV data data_file
edit data_file -- show in editor to check

fn readDataFromCSV file =
(
	if iskindof file String and doesfileexist file do
	(
		fs = openfile file
		local data = #()
		while not eof fs do
		(
			ss = readline fs 
			ps = filterstring ss " ,;"
			d = JointData()
			
			d.name = ps[1]
			deleteitem ps 1
			t = 0

			n = 1 
			/****** restore a matrix3 from 12 floats **********/
			for k=1 to ps.count/12 do 
			(
				row1 = [execute ps[n], execute ps[n+1], execute ps[n+2]]; n += 3  
				row2 = [execute ps[n], execute ps[n+1], execute ps[n+2]]; n += 3  
				row3 = [execute ps[n], execute ps[n+1], execute ps[n+2]]; n += 3  
				row4 = [execute ps[n], execute ps[n+1], execute ps[n+2]]; n += 3  
				
				append d.transforms (JointPos time:t tm:(matrix3 row1 row2 row3 row4))
				t += 1
			)
			append data d
		)
		data
	)
)

dd = readDataFromCSV data_file

/* reset scene and restore animation per frame *******/

joints = join #() root
for joint in joints do 
(
	c = joint.controller = createinstance prs
	notifydependents c
)

/* restore data **************************************/

animate on for d in dd do
(
	d.node = getnodebyname d.name
	for p in d.transforms do at time p.time d.node.transform = p.tm  
)

So this way you can also get positional data which, if the plugin is made with sufficient complexity, could seriously simplify the user process. I don't think this is the best way to go about this route, but I've reached far beyond my technical ability and am frankly flailing about in empty dark space at this point, lol. What do you guys think?

Also @Mickey666Maus, since you know Max Script: does this follow the basic path you took? Any suggestions?

#99  

@Ramon3141 This is not at all my approach, but I am really happy about it...another point of view and another swing at this is more than appreciated!

The way I did it was basically just to set up Expose Transform helpers and wire parameters...then the time callback function was used to periodically send the retrieved values either to the ARC server (or any other custom server) to move the servos, or to write them to a .txt file for playing the animation back!
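
A stripped-down sketch of that callback pattern (the node name, servo channel, and the 90-degree offset are all placeholders; a real rig needs its own mapping):

-- fires whenever the scene time changes; reads one joint in parent-local space
fn sendServoPose =
(
    rot = in coordsys parent $JD_Head.rotation as eulerAngles
    format "Servo,D0,%\n" ((90 + rot.z) as integer) to:listener -- or to a file / HTTP call
)
registerTimeCallback sendServoPose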

I am not at home right now, but I will take a look at what the code does once I am! Good to have you here; let's get things going! :D

PRO
USA
#100  

@mickey666maus Glad to see you back. So, basically we have a system on the EZB where we can connect a boatload of servos, but it works because not all servos are getting position data all at once?! Is it because 3ds Max is outputting data for every single frame for every single servo? And in Auto Position you are telling it to move servo 12 from 90 to 180 degrees at a particular speed, and that's the only data sent? I'm trying to wrap my head around why there is too much data. I'm thinking like mocap data, where you have data on every single frame?

The project I'm working on is connecting a Perception Neuron mocap suit to a robot to get it to move. PTP managed to take the mocap glove and get three joints of one finger to move 3 servos in real time. But what happens when I scale it up to 18 axes? And the Kinect plug-in will get several axes to move live?! What is different about 3ds Max?

I think I need to fully understand why there is a data overflow. Any additional info would be appreciated.

#101  

@fxrtst that sounds great...and I guess for live control you guys will not run into too many difficulties! To eliminate jitter and smooth out the servo movements, you will have to filter the incoming data anyway, and it will be as live as possible, restricted only by how many frames you can get out of your system! Very interesting, I would like to see the setup! Playback has the difficulty that even at 30fps the motion does not look smooth...I used a smoothing function to even it out...but that can also affect the motion at some point! Also, the ARC approach has timing issues; I don't know if @DJSures has something in mind? Frame-by-frame animation is the very tricky part which has not been tackled yet, while a pose-to-pose animating system like the Auto Positioner works very well...but getting realistic motion is not really possible that way!

I will try to make a little clip of my progress and post it here; it works better visually, I guess...so you can see where I stand at the moment! Frame-to-frame animation is still my ultimate goal!

Good to see we are back on track! ;)

#102  

I'm with you on frame-to-frame animation @mickey666maus. I'd love live control, but with frame-by-frame animation we can create and tune completely natural, fluid movements. I used mocap data to control a biped and then keyframed my machine's movements to those of the biped; it means no fussing with insane kinematics math for a new configuration, no timing issues, etc.

@mickey666maus looking forward to you getting a chance to check out that code. Though my coding skills are weak, I think the logic is the most straightforward workaround for the system; then again, I may simply be confusing straightforward with primitive, lol. We've gotten the script to work partially, but the trick seems to be in the nitty-gritty details. Let's see!

thanks @mickey666maus, happy to join the community.

PRO
USA
#103  

Yes a video would be great!

And yes on filtering. Any mocap will have errors, and filtering is a must. My past experience was with a facial mocap program called FaceShift, a live setup for puppeteering characters.

There were three filters: small, medium, and large. The large one produced quite a lag in the live feed, but the animation was so smooth! But this used the Carmine 1.09 sensor and FaceShift to produce the data, which was super noisy. That's been replaced with a good webcam and new software, which creates the same results as the sensor, or even better. Instead of depth it uses 2D facial tracking.

My 30-year-long yearning has been for a live/record-and-playback system that can be used for animatronics, the film industry, and now robots. This is really the only way to get realistic motion into robots. Once the recorded and cleaned-up data is run through the robot, it would be great to have a MotionBuilder-type program that splices two or more motions together on the fly. As an example: a robot detects that a person is sad. The robot could then call up the sad face and body mocap to reflect the user, taking its current servo positions and merging/transitioning them into the "sad" mocap animation in real time.

PRO
USA
#104  

Don't get me wrong, I think they are both related (live capture and keyframe animation). Be it live or hand-keyframed, I think the processes will be similar and piggyback on one another.

The video above with Maya, the Arduino, and Dan Thomson's plug-in: he used IK to set up that animation. Even though it's going live back out to the Arduino, he still hand-animated that walk cycle and just played it back out. Live, recorded, or exported out as a text file and loaded into ARC are all still the same technique. Everything we share here will help in all these endeavours.

#105  

I actually went and checked out your YouTube videos, and I believe I saw the facial mocap thing you're talking about @fxrtst. REALLY cool stuff. I agree that, in essence, it's all under the same system of logic. Really, using mocap is just a step removed from live, which allows that fluidity. I think you are right that all our work will help and probably be consolidated, so one day soon these are just normal features and I'll be able to mocap someone swing dancing and get my robot to do it in a few minutes, or in real time.

That MotionBuilder blending feature actually really fascinates me. Unfortunately I've never used it, but I'd like to ask you a specific question, given your extensive experience. I have to explain some context, so forgive the length, and hopefully you find it interesting.

Okay, so in essence my team is partnering with the robotic prosthetics department at a university to build a machine I designed that, through a series of simple sensors, programming, and mechanical trickery, allows one to walk, run, jump, etc. in place, in 360 degrees. They call this class of machine omnidirectional treadmills, but all the ones that currently exist are useless: basically a harness that keeps you from busting your tail holds you while you lean forward and move on a slip surface, which is just... sleep. Anyway, this one actually allows nearly completely authentic, unrestricted movement, and it has medical, VR, exercise, military, and even special-effects applications in your field, come to think of it.

Anyway, long story... long: they have agreed to build a version of the machine for medical applications, with intentions of doing clinical trials with people who have paralysis or gait abnormalities (it can either be passive, where you move and it just lets you move freely; active, where it moves you; or hybrid, for rehabilitation). The point being, we have a presentation in a month to show off a 1/3-scale simplified proof-of-concept model my team has built. This presentation will determine how much money they put into its development, lol, so yeah x) This is what started my interest here, because after I drew up the CAD designs in Fusion 360, another member of my team imported it into 3ds Max and, using public mocap data and mocap data from the lab (they use it to study gait), animated it, basically manually animating the model by just running the bare math I worked out as to what moves when. But it looks and works great!

The rub: we have a completed (largely 3d-printed) machine with 15+ NEMA 17s and 4 servos, which our electrical engineer team member figured out how to control via driver boards hooked to a Pi over an SPI connection. The Pi will run Windows 10 IoT Core so that it can process the Universal Windows Platform code that ARC spits out, and voila! But without the ability to turn that animation into movement instructions, we are stuck, because it will take a lot of time, money, and expertise to work out the math and programming for the kinematics (we lack abundance in all those things).

Now my question, if you've made it this far down: how hard would it be to take the animation scenes of the biped + machine turning around, turning left, turning right, walking, running, doing the salsa (for real, lol), etc., and blend them like a video game character? So, using a game controller, it would seamlessly transition from forward to slightly left to hard left, etc. (The person who did the animation simply learned how to do it for this project; she is actually a medical resident, lol. Basically we learn whole-skill-type things only as we've had to.) Because THAT is what I'd love, and I'm sure it would blow them away at the presentation, increasing the chances this thing actually gets made and helps people. Just hand them a game controller or something and have the machine, which is just copying the animation values in real time, mimic what is essentially a game character. Of course, it can be prerecorded too if we can't figure it out by then or there is too much lag, but having them blended would still be so much better! So, again, sorry for the length, but given your experience and what I've seen of what you can do, I thought it was worth the effort to ask :D Anyway, thanks man.

#106  

Quote:

SPI connection
I tried an SPI connection and it seemed kind of slow for frame-to-frame animation being sent to the microcontroller board when there are a lot of servos involved! That is why I switched to LAN!

Quote:

The pi will run window 10 Iot core
If you have a Pi, you can store your animation sequences directly on the Pi; why would you have to send them to the Pi in the first place? The animations would just be a bunch of numbers in CSV format anyways...

Quote:

How hard would it be to take the animation scenes of the biped + machine turning around, turning left, right, walking, running, doing the salsa (for real lol) etc and blend them like a video game character?
Well, here comes the hard part...and this is the one I am still struggling to get right! FK is pretty straightforward! But IK is kind of hard to set up for a robotic rig with a lot of DOF...also, the hardest parts of the whole setup are Euler rotation spaces and gimbal locks, plus the fact that rotation has to be extracted locally (always the child relative to the parent) and the gimbal order needs to be taken care of...look into this when you are thinking about extracting rotational values! Another thing: the 3ds max biped is a wonderful animation tool, but it is sadly kind of useless for robotics. Trust me, I have been losing a lot of sleep over this! A custom rig is the very best option...I was partly successful in linking a mechanical rig to a biped and then extracting the values I needed out of that rig...but as I said, someone specialized in rigging would be very, very helpful!
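
The parent-relative extraction itself can stay small; a sketch (the default XYZ gimbal order is assumed, other orders need their own handling, and the node name is just an example):

-- rotation of a node relative to its parent, as Euler angles
fn localEuler n =
(
    tm = (if n.parent == undefined then n.transform else n.transform * inverse n.parent.transform)
    tm.rotation as eulerAngles -- XYZ order by default
)
localEuler $JD_Head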

Anyways, it's Friday night, so I will look into your Max Script tomorrow. Looks interesting...maybe too complicated for me, but let's see! :D

#107  

@DJSures

Quote:

This conversation is a dream of million wishes but no clarity.

Thanks for bearing with us; it's been a year already, I guess even longer, since this topic was brought up... :)

Actually, you would not have to worry about the 3rd party applications which will hopefully extract the data for the animations in the end; all we would need is a tool to send our data to the servos on a frame-to-frame basis, as opposed to the pose-to-pose approach of the Auto Positioner!

I would make 3ds max write a CSV like

90, 90, 70, 60, 55 etc

or it could be

D1, 90, D2, 90, D3, 70, D4, 60, D5, 55 etc

or

fps, 30, D1, 90, D2, 90 etc

It would not matter...the main problem I encountered was that the positions could not be sent with correct timing! If you let me know the format you would like the data to have, I will format it any way that suits the plugin! :)

PRO
USA
#108  

Ok, so a lot of information there. I've got a bit of an idea of what you are trying to accomplish. I've seen those 360 treadmills for VR and video games. I'm having a problem understanding how the mocap is applied to the 3D printed treadmill. Or are you looking for a virtual representation on the screen of what the treadmill is doing?

Using MotionBuilder is a quick way to merge several motion capture clips together, but I would not say it's seamless; the files need to somewhat match to get it looking "real" between transitions. There are new neural network programs being written and explored.

Your time frame is pretty short, and getting even a portion of what you want to accomplish will be challenging in that time. But the plugin is something these guys have been working on for a year...so who knows.

#109  


Quote:

I tried an SPI connection and it seems to be kind of slow for frame-to-frame animation being sent to the microcontroller board when there are a lot of servos involved! That is why I switched to LAN!

Hmm, our engineer thought it would be the easiest, plus we found these great SPI stepper driver breakout boards that let us save on GPIO pins, but if we encounter that slowness problem I'll definitely mention switching to LAN. I'd rather wait to make sure it's too slow because, as it is, this machine has been crazy expensive to build lol

Quote:

If you have a Pi, you can store your animation sequences directly on the Pi; why would you have to send them to the Pi in the first place...the animations would just be a bunch of numbers in CSV format anyways...
Hmm, could you elaborate? I meant using the Pi to read the control code the UWP code made from UniversalBot, the open-source Microsoft Windows Universal Robot SDK made to work on Windows, which I believe is at the heart of a lot of ARC functionality. The CSV files are great, but unless we run them through something like the Auto Positioner made, like you said in the post to DJ, frame-to-frame instead of pose-to-pose - to account for the different degrees a joint might be at in frame x as opposed to frame y - we won't get the fluid movement of the animation... well, that's the idea anyway. lol Have I mentioned I am a biologist by training? I'm doing most of this largely through inductive reasoning. Now that you know my particular needs, if you have a superior method, please do not hesitate to critique. I'd really appreciate any help at this point. I have pretty much everything done physically and in hardware/wiring etc., but I have one month to turn animations into movement, and my engineer, bless him, is something of a weekend warrior, so he can't dedicate any more time beyond the base hardware configs etc.

Quote:

The 3ds max biped is a wonderful thing as an animation tool, but it is sadly kind of useless for robotics, trust me...I have been losing a lot of sleep over this!
Ha, I very strongly sympathize with the sleep loss here, my friend. I'm not a coder or engineer by any substantial means, yet I have a few weeks to accomplish what much more talented people in this domain, such as yourself, have been struggling with for more than a year. Fortunately, I know enough to have ideas and to ask people who really know what they're doing in a particular field how to get it done. Collaboration is a beautiful thing! Anyway, yeah, essentially we used the biped to manually work out the forward kinematics, but due to the nature of the machine, IK becomes something of a nightmare. I'm not even going to think about it for now lol

Quote:

Friday night, so I will look into your MaxScript tomorrow, looks interesting...maybe too complicated for me, but let's see! Grin
Have fun then buddy; if I recall, Friday nights in Berlin were always great fun. Cheers!

PRO
USA
#110  

A couple of programs: one is IKinema, which is easier and cheaper than MotionBuilder, and there's also Mixamo, which is free, for getting animations onto a character...you have both motions and characters that are free to download. Then those FBXs go into IKinema to transition between motions to get your animation as seamless as you can get.

PRO
USA
#111  

Quote:

...also the hardest part of the whole setup is Euler rotation spaces and gimbal lock, plus the fact that rotation has to be extracted locally, always between child and parent, and the gimbal order needs to be taken care of...look into this when you are thinking about extracting rotational values!

PTP had to resolve the Euler angles in the plugin he made for Perception Neuron. The way they use IMUs as the source of movement for their bones, you have to do some serious math to get the rotational values. Which he did, because he's a genius. But he is someone to chat with. Gimbal lock in relation to parent/child I had not thought of. Hmmm, scratches head. I may go back to my original, bulkier idea of capturing motion using what we call a slave system. It's basically a suit you wear with pots at all your joints. It's like a complex joystick you wear...lol. Anyways, I'm thinking out loud again.

#112  

Amazing, I found that video enthralling, I appreciate it. We've actually already been courting data scientists for when the full-scale build begins, because we plan on correlating - using various methods we've studied and have had suggested to us by learned advisers - the EMG data from the lower legs, the motion capture data and the kinematics of the machine, to get it to understand how people walk, run etc., and to eliminate lag and expand functionality by having the machine know where you're most likely going based on the slight variations of your movements and neuromuscular impulse signals. But yeah, for the purposes of this 1/3 scale simplified proof-of-concept design presentation, I think the programs you mentioned will do haha. I just need to make sure we cobble together a sufficiently capable script that can gather the data and plug it in before I set to work on that lol. I will forego sleep for days if I have to manually plug in every key value for every joint for every frame, but by god it will get done one way or another. sleep stress :D

As far as the mocap data: we took the data from public sources (that's how we animated the machine with salsa steps) and from the lab we're working with, and used it to run a biped. Then, using keyframes and divine patience, my partner, who had a little 2D animating experience (but is actually a doctor and had no clue what she was doing at first), keyframed the entire movement sequence of the machine to correspond to the steps of the biped controlled via mocap. Then we simply removed the biped and voila! Definitely brute force, but it worked to create animations of the machine's movements with little more than some light math, curve editing, manual positioning and LOTS of keyframes, since it always got it wrong on its own.

Anyway, thanks for the advice, and if anything crosses your mind, I'd greatly appreciate it. I've looked over your amazing work on Alan, and his apparently naggy wife lol, so I feel you'd understand passion projects that push the bounds of your knowledge and abilities.

thanks again man.

Ooh, you're a movie guy - seen Pacific Rim? You know that thing they get into to control the bots? Well, in the scenes where they are strapped in, they have these things strapped to their feet that move. Think along those terms but MUCH more complicated; just a fun reference similarity someone once pointed out to me.

#113  

@fxrtst I've never seen that clip before...it never crossed my mind that you could use a neural network to drive motion, I guess this is the way to go! All I could think of so far is triggering motion by sound levels, like a pause in movement while there is a muted narrator...I was also thinking that it could be beneficial to use TensorFlow for emotional content recognition like Dani Thomas did, but I guess soon we will have more interesting things coming up within that field!

@Ramon3141 Sorry man, I am no engineer at all, and the device I was talking about being slow at communicating with the serial port is a less than $10 device! :D So you hand-poked all the rotations into your device...which is a good thing to do to have the presentation up and running! But the main question still persists! How did you play it back on your machine? ;)

#114  

@Mickey666Maus I'm not either, but we engineer things anyway, right? lol

You mean hand-poked as in the animation, or in real life? I haven't manually done it yet, but I'm prepared to do it just to get the machine to move, at least in a programmed arrangement. If in about a week or so I don't see a closing route for a script to get CSV data and a plugin to get UWP code, then I will start manually doing it, which will take many, many sleepless days and nights lol. In the animation, if you mean manually moving the machine to match the biped, Idk, my partner is the magician there, but whatever she did, it worked beautifully. We have a bunch of scenes of all kinds of movement that aren't optimized kinematics (as I said earlier, that will require machine learning) but they work, and well enough to salsa, so hey x) Ha, well, engineer or not, I appreciate the advice as always and the friendly discussion of such interesting topics.

I must admit, I think that beyond just motion driving, machine learning in animation and video game design alone will leave us with games that evolve differently every time you play them, characters that move and act differently according to experience, and so much more. I'm very much looking forward to seeing what machine learning will bring us, and to using it too!

#115  

@Ramon3141 Ok, this is kind of the main point...since you are using the same setup as I am, I was wondering what I have missed so far, because my motion is still not fluid at all! My next approach would have been to use the Raspberry Pi in conjunction with an I2C servo breakout board and to store the data on the Pi! For now I am still sending it to the microcontroller board, using LAN...

So if you have found any way to create fluid frame-to-frame motion, it would be cool if you could share how it was accomplished! :)

#116  

Quote:

You mean hand-poked as in the animation, or in real life
I think extracting rotational values is secondary; getting fluid frame-to-frame motion is the main task!

#117  

I also checked your MaxScript now...could you please attach a reference scene, or setup instructions? Does it need a biped with mocap in the scene? It looks like someone took a great deal of time to get this thing coded, but then again, maybe I am too limited in MaxScript to understand it?

#118  

I see where the miscommunication was. Okay, yeah - besides some G-code and basic motion tests, my physical machine has not moved an inch. My virtual model moves smoothly, and by creating a very high frame rate and moving that data through a kind of Auto Position type plugin, I suspect the choppiness will go away once the UWP code is on the Pi and the Pi executes the directions to the NEMAs. Personally, I think sending data to the board through anything is a bad move; I can see a bunch of places where things can go wrong. That's why I'm trying this Windows 10 IoT Core + UWP code thing. Granted, it might end miserably lol

Let me ask you, where does your interest in this topic stem from? Maybe that can help me better understand. At the very least, I've made some friends on the Autodesk forum (they've been helping me figure out the code and the best route to take) that I can relay things to. A friendly bunch of people they are. Happy to help.

Well, that code is cobbled together from a group of people. I've only gotten it to work with the small test animation at the top.

Before I started seeking help, my partner and I made something like this, which extracts rotational data relationships. It's ugly code with a bunch of errors, but it produced data, so that spurred on refinements. I think we are close. It's above my head at this point, but from what I can follow, my buddy at the Autodesk forum who is working on fixing the code you saw before is getting very close. Maybe even by tomorrow. Let's see.


(
	-- Write one frame of a node's Euler rotation as three CSV columns.
	-- Root nodes use their world rotation; children are written relative to the parent.
	fn writeData bn toFile =
	(
		data = if bn.parent == undefined then bn.transform.rotation as eulerAngles else (bn.transform*inverse(bn.parent.transform)).rotation as eulerAngles
		format ",%,%,%" data.x data.y data.z to:toFile
	)
	
	-- One CSV row per node: name, parent name, then x,y,z for every frame.
	-- Recurses depth-first through the node's children.
	fn recursiveWriteData bn toFile =
	(
		format "%" bn.name to:toFile
		format ",%" (if bn.parent != undefined then bn.parent.name else undefined) to:toFile
		for t = animationRange.start to animationRange.end do
		(
			at time t
			(
				writeData bn toFile
			)
		)
		format "\n" to:toFile
		for obj in bn.children do recursiveWriteData obj toFile
	)
	
	fn getFileName =
	(
		return getSaveFileName caption:"Save..." types:"Excel(*.csv)|*.csv|"
	)
	
	fn main =
	(
		root = undefined
		if selection.count != 1 then
		(
			messageBox "Please select just the root joint and re-execute the script." title:"Error: rootJoint"
			return -1
		)
		else
		(
			root = selection[1]
		)
		fileName = getFileName()
		if fileName == undefined do
		(
			messageBox "Please select a filename for the file to be saved." title:"Error: fileName"
			return -1
		)
		-- openFile defaults to read mode, so writing to an existing file silently failed;
		-- open for writing instead (this overwrites any previous export).
		file = if doesFileExist fileName then openFile fileName mode:"w" else createFile fileName
		recursiveWriteData root file
		
		flush file
		close file
	)
	main()
)
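
Just to make the format concrete: each row is one bone - name, then parent, then x,y,z appended for every frame, with children following recursively on their own rows. A two-bone scene over three frames would come out roughly like this (names and values invented):

Hip,undefined,0.0,0.0,0.0,0.0,0.0,2.5,0.0,0.0,5.0
Knee,Hip,10.0,0.0,0.0,12.0,0.0,0.0,14.0,0.0,0.0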

As far as a test file, I can't really, or I could be sued :/ Not yet at least; in about a month I'm going to post it here (as long as we actually get it working lol)

Oh, and no - the parts of it that work should work with any scene with a single animated entity in it. Try the test code at the beginning; that works to create a basic hip and leg joint animation that the following code extracts from, but there are a couple of errors, like a mapping error in that CSV code that I can't figure out, etc. I've only been coding for about 6 months, out of necessity for my different projects; before that I couldn't print hello world in Python lol. So we might be the blind leading the blind here, but at least we will be able to hear twice as much!

PRO
USA
#119  

@Ramon3141 Finally got it now. Great that the Dr. had some background in 2D animation! And yeah, got the Pacific Rim connection. I work with Legacy Effects a lot and have seen the setups in person.

@Mickey666Maus cool video. Interesting to see how all this plays into robotics. So many possibilities!

#120  

@Ramon3141 Yes, getting the animation up and running on a virtual model is not a problem at all...but playing it back on the physical machine is!

The 3ds max script would only make sense with a scene at hand; otherwise I am not able to decipher the code...

Sending the data to a server was an idea to sync the live and the playback thing...you could send the whole motion at once in a single big array and let your device play it back! This would eliminate the need to have all your motions stored on the device itself! With the microcontroller board this is not possible due to limited onboard memory, but the Pi has more than enough to do so! So my next approach will be the Raspberry Pi in conjunction with an I2C servo breakout board.

It might also just be the wrong approach, and servos should not be driven frame by frame; the Auto Position that @DjSures implemented is a robust solution, so maybe I will also try to just make the pose-to-pose movements really tight, to kind of mimic natural movements?

My interest in robotics was sparked here within this community...EZ-Robot got me into robotics, and since I am on the hook now I just cannot let go! :D

#121  

@Ramon3141 This sounds like a very good starting point...if you guys could attach the code and the 3ds max scene here, I can take a look at what it does and we can try to figure out where to go! It is very good that you already have friends over at the 3ds max community, because MaxScript is so poorly documented that help always comes in handy when you are stuck! I love 3ds max, but the biped animation system can drive you nuts at times...it is so very easy to import motion, but extracting it to drive those servos can be quite a pain!

Anyways, it's late and I will have to leave here...but I am happy 3ds max is hot gossip again! I am more than willing to contribute! :)

#122  

@Ramon3141 And btw, I do not need the real deal...just attach any scene that works; it's always a pain to try to get these code snippets to work if they turn out not to be of any use in the end! :)

#123  

Oh, lots of interesting stuff @Mickey666Maus. I have to hit the sack tonight, but I'll respond first thing, since finding a solution to this problem is apparently my point in life this week lol

#124  

I was checking once more on the real-time motion capture project that @fxrtst and @ptp have going on, and the IMU capturing approach might be interesting for you guys too...there is also a plugin they made for ARC!

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4029684/

PRO
Synthiam
#125  

If you want, I can make a plugin that accepts your TCP data to move servos directly. Would that help?

All I need are the specifics and I can make it

#126  

Also @Ramon3141, I was reading your info once more, and now I understand correctly: your friend built an animation rig which is a virtual representation of the real machine, and she animated this rig to match the mocap-driven biped as closely as possible, right? Which is a very good approach, since the 3ds max biped is by no means a working representation of your real machine; it's a very good tool to represent the mocap data, though!

Now I fully understand why you cannot share the 3ds max scene, but as I said...any working scene will do, so I can take a look!

#127  

I just checked those NEMA servos with a direct Ethernet connection...cheap stuff! :D

#128  

@DJSures My build uses EZ-Robot HD servos and, I am ashamed to say...a few of those cheap Tower Pro micro servos, which will be replaced soon! I just burned so many in the beginning of my robot life that I was afraid to use those EZ-Robot micro servos! :)

#129  

@fxrtst I know that you have been researching this topic deeply already, but did you ever stumble across this product?

https://animservo.com/about-animservo/

PRO
USA
#130  

@mickey666, no I have not. Interestingly, it seems to have been developed by Stan Winston's f/x shop, which is now Legacy Effects - who I work for when on my Marvel films. Seems they have training at the Stan Winston school of effects. I want to look into it a bit deeper.

Maya aside, I’ve used something from a company called Brookshire. They created software and hardware that I’ve been using for years. The interface is clunky and old but you can record servos and then put them on an sd card and into their playback system called RAPU. I could not get it to play at all with EZB, but PTP found the issue and got it working for me. I’m surprised that after 15 years nothing better has come along. Especially with robotics and servos are so mainstream right now.

Anyways thanks for the link going to go explore it a bit more!

PRO
Synthiam
#131  

Quote:

I would make 3ds max to write a CSV like

90, 90, 70, 60, 55 etc

or it could be

D1, 90, D2, 90, D3, 70, D4, 60, D5, 55 etc

or

fps, 30, D1, 90, D2, 90 etc

That would be perfect. I can make an import plugin to read that data and load it as frames, which can be applied to the Auto Position; that way there's smoothness on the frame transitions.

I’ll do that tomorrow for you - stay tuned.

#132  

@mickey666maus here you go buddy. This is an old concept model, so it's safe enough, but just let me know as soon as you download it so I can kill the link :)

https://drive.google.com/file/d/1jbYp27GBkVqAFb7LfVsrZPQ-bpymilGb/view?usp=sharing

https://drive.google.com/file/d/1xs_duO20QnaV6rohLy0T4m5CCODjVXqt/view?usp=sharing

Quote:

It might also just be the wrong approach, and servos should not be driven frame by frame; the Auto Position that @DjSures implemented is a robust solution, so maybe I will also try to just make the pose-to-pose movements really tight, to kind of mimic natural movements?

That's what I was suggesting: just set the pose-to-pose movements at the same frequency as the actual frame rate and, logically speaking, it should be smooth as butter, my friend. There could be some hardware issues, but that's not a timing-code problem I think, and it would be minimized if you eliminate lag and com errors by running through different boards. That's why we want to run it off of a Pi that has Windows 10 IoT Core, since it'll be able to play the code that the Auto Positioner plugin produces.
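
On the timing side, a rough Python sketch of the usual trick (set_frame() is just a placeholder callback, nothing from ARC): schedule every frame against an absolute deadline, so sleep jitter can't pile up from frame to frame:

import time

FPS = 30  # assumed playback rate

def play_frames(frames, set_frame):
    # frames: a list of servo-position tuples; set_frame: placeholder output callback.
    start = time.perf_counter()
    for i, frame in enumerate(frames):
        deadline = start + i / FPS              # absolute target time for this frame
        delay = deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)                   # jitter no longer accumulates
        set_frame(frame)                        # a late frame is sent immediately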

Quote:

My interest in robotics was sparked here within this community...EZ-Robot got me into robotics, and since I am on the hook now I just cannot let go

That's fantastic! I'm sure @Dj would be happy to read that. Originally I began investigating this platform as a solution to a problem, but I have since grown to love the work that @Dj, his team and the community have done. Personally, though my body of knowledge and best skill sets are in biotech, I plan on getting much more involved in this community and that sweet-looking EZ-B v4 board once this presentation is finished!

Quote:

I was reading your info once more, and now I understood correctly. Your friend build an animation rig which is a virtual representation of the real machine, this rig she animated in a way to match the mocap driven biped as close as possible right? Which is a very good approach since the 3ds max biped is by no means a working representation of your real machine, its a very good tool to represent the mocap data though!

Yes, exactly. My dream is a software setup where I can just import (cleaned-up) mocap data and have the machine move accordingly, or even better, import the data, blend the motions and be able to play it like a game character -_- I rue the day I went into biology lol haha

Anyway, @mickey666maus and @DjSures, I have taken the code that me and the community here and at another forum have made, and I am hiring a MaxScript expert. It's a simple job, so it should be done soon; by tonight or tomorrow we will have a script that will pump out a .csv file that is organized, reproducible, dynamic and can handle not just rotational joints but also things like slide joints. (A person will first have to take the distance the slide joint moves, as calculated by the script, work out the number of rotations per unit distance their machine requires, and simply plug that value in to get the degrees of rotation for that joint, specific to their machine.) Kind of like how you configure the firmware on a 3D printer, if anyone here has ever done that. Anyway, I'm excited! It will open so many doors for really NEW things for people with fewer resources, or just make things much faster!

PRO
Synthiam
#133  

That’s amazing! With an example copy of the csv, I’ll create a plugin to import into ARC’s auto position! Awesome

I recommend the csv include the port names (ie d0 or d2, etc) as the header (first line of the file)

And every consecutive line should be the degrees

If a time stamp is necessary, it can be the first column of each line. I believe a time stamp would be important for you guys.
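
For example, something shaped like this (made-up values; I'm reading the time stamp as the first column of every line, in milliseconds):

time,D0,D1,D2
0,90,90,70
33,92,88,70
66,94,86,71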

#134  

Thanks @DJsures, yeah, you tell me exactly what you need and I'll relay it. I've already got the freelancer working on it, but if you're willing to wait on the plugin till it's done, in say a day or two, I can make sure it will work best and be as user friendly as possible.

So include: port names, lines of degrees (good, we had that already), and a time stamp.

... got it. The guy I hired is an expert at MaxScript, C++ and C# with electronics experience, so I think it's gonna be wicked. Let me know if there are any other features that would benefit the community. I personally will get everything I need from what's been discussed, but it'll be worth spending a little extra $$$ if it can be made much more user friendly and have wider applications.

If there is nothing else, I'll be posting the file tomorrow, tomorrow night at the latest, and if everything is copacetic we will have ourselves a wicked new ARC plugin. :D

PRO
Synthiam
#135  

That’s real great!

This guy might also be interested in creating a plugin to unify the solution that he comes up with - you might want to consider that. I'm by no means revoking my effort to have a plugin created for you - it's just that sometimes having someone own the whole process gets things done quicker.

I do feel that a file export is the best idea - although there's an option of having his code send data to an ARC skill plugin via TCP directly. I do think the file is the best option.

#136  

I'll ask about the TCP, perhaps making it optional. Those using anything with belts or actuators will need to take the non-rotational joint types, like sliding joints, and do that "1 degree of rotation equals 10mm" kind of thing that is specific to their build, but I told him to make it so you'd only need to input the ratio in one box and it'll calculate the rest and provide everything in degrees of rotation. My example file will have such a feature and you'll see what I mean - if it has those kinds of joints. If it doesn't, TCP would work great I think. I've been talking to him about the general R&D work my team does and am considering hiring him as a regular programmer, so if this works out via file, then going up to the next step would be less of an impediment.

Alright, I'll post an update on the progress of the code, or the code plus the example file itself, same time tomorrow.

PRO
Synthiam
#137  

All TCP does is bypass the loading and saving of the data in a CSV file as a middle-man

The advantage to having a CSV file is that there's an extra layer of redundancy against data loss. Also, a CSV file can be manually edited for quick tweaks - or combined with other CSV files to create a longer animation.

With TCP, the data would move directly from the animation software into ARC's plugin. If it's TCP, I would recommend a simple packet format such as...


0x45 // header
0x5A // header
uint32 TIMESTAMP_MS
byte BOARD_INDEX
char[3] PORT_NAME // i.e. D0, D2, V2, etc
uint16 DEGREES // I'd make this 16 bit so it can accommodate higher precision servos
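
Purely as an illustration, that packet packed and unpacked with Python's struct module (big-endian is my assumption - the format above doesn't pin down byte order):

import struct

# Header bytes 0x45 0x5A spell "EZ". Byte order is assumed big-endian.
PACKET = struct.Struct(">2sIB3sH")  # header, timestamp_ms, board, port, degrees

def pack(timestamp_ms, board_index, port_name, degrees):
    return PACKET.pack(b"\x45\x5A", timestamp_ms, board_index,
                       port_name.encode().ljust(3), degrees)

def unpack(data):
    header, ts, board, port, degrees = PACKET.unpack(data)
    assert header == b"\x45\x5A", "bad header"
    return ts, board, port.rstrip(b"\x00 ").decode(), degrees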

#138  

I see the benefits of both now, will relay this copy+pasta style.

PRO
Synthiam
#139  

Also note that if the desired communication into ARC is via TCP, the ARC skill plugin can also be responsible for saving a CSV. So that's possible as well...

The downside to TCP is that ARC would need to always be running, with the robot project loaded... And if you want the robot to move in real time, it'll have to be connected to a robot (but that's optional).

#140  

Quote:

The guy I hired is an expert at MaxScript, C++ and C# with electronics experience
Where did you hire that guy? It seems to be a good option when there is anything to be done that is out of scope for getting stuff done yourself!

I got the files; reading your last posts, am I right that those scenes do not relate to the MaxScript examples that you posted earlier on?

What you sent me is a solid mechanical rig. I will use this to show how I extract rotational values by setting everything up for the base object, and this way you can also get the rotations for all the other joints...But I guess the guy that you hired will figure it out anyways!

#141  

@DJSures That might be the way to go...even at 30fps there is still the need to have a transition in between frames, and this might be the biggest issue! At the moment I am sending integer values and I am smoothing them out in between frames like this:

filter = 0.05; // 0.01 to 1.0
d0_sh = d0_sh * (1.0 - filter) + d0 * filter;

So I can tune a smooth transition...this at times affects the movement, since the next values are being sent before the final position is reached, but since I know the framerate I can kind of eyeball how much filter needs to be applied!
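
That filter is just an exponential moving average, so the step size shrinks as you approach the target - a quick worked example in Python (numbers made up):

# Exponential smoothing toward a new target, as in the snippet above.
pos, target, filt = 90.0, 120.0, 0.05
for frame in range(5):
    pos = pos * (1.0 - filt) + target * filt
    print(pos)  # approx. 91.5, 92.9, 94.3, 95.6, 96.8 - creeps toward 120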

#142  

I put a slider in the scene which reads the rotational values; the values are read out from the Expose Transform helper object. They are connected using the Wire Parameters dialog! Also, you need to convert to degrees, because the Expose Transform helper extracts the rotational values in radians! It's all set up, and it can be applied to all your other joints as well!

https://we.tl/c0JpSvnszv

User-inserted image

#143  

Finding a MaxScript expert is not easy lol, so I used Upwork. Freelance specialists from all over the world, in every discipline you can think of. We use them occasionally when we encounter a technical problem that no one on the team is trained in. It hurts me every time, because I'm so cheap that if they started charging for air I'd just hold my breath, but... THE WORK MUST GO ON! lol

Yes, they do not relate. That was just scrabbled-up bits of code from different sources trying to work out the problem; the best we had gotten to fully work was the first code I posted, back in #118. It's sloppy, doesn't handle different kinds of joints and is formatted horribly, but it was a start! Once our programmer is done, though, we will have a good, reliable script that will be clean and dynamic for @DJ to make a plugin for. Oh, a quick update for you guys: he sent me a first run of the output data for the first ten frames and it looks good. It still needs some work to format it for the plugin and to have a time stamp, a line for ports at the top, etc., but he should be done and it should all be posted by tomorrow evening. I'll post an example of the CSV file tomorrow before I finish my contract for this job with him, and hopefully, if @Dj is on and takes a look, he can give the go-ahead or corrections so that its format suits the needs of the plugin.



@mickey666maus Ah, that slider is pretty cool man! Looks good. I'll go ahead and forward it to him, just in case it helps him. Either way, once he is done I will post the example .csv file and the code for anyone who wants to improve and expand upon it in the future, to include TCP and any other feature. How would you go about converting the sliding joint, in your opinion? We took similar routes to get rotational degrees I think, but my thought on the slider was to calculate the distance it travels along the local axis it's confined to and have that placed in the CSV file separately, so one can apply a ratio formula that is specific to their machine to know how many degrees it takes to move 10 units of distance or something. Like I told @Dj, that's roughly how you calibrate the firmware on a 3D printer, and it seems like the most versatile route. What do you think? What would be your approach?
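
To show what I mean by the ratio formula, a tiny Python sketch (the calibration number is invented - every machine would measure its own, like steps/mm in 3D printer firmware):

# Hypothetical machine-specific calibration value.
DEGREES_PER_MM = 36.0  # invented example - measured per machine

def slide_to_degrees(distance_mm):
    # Convert a slide joint's travel (from the CSV) into motor rotation.
    return distance_mm * DEGREES_PER_MM

print(slide_to_degrees(10))  # 360.0 degrees, i.e. one full rotation per 10 mm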

#144  

Question for whenever, @DJSures: So I'll be taking the output from your plugin and running it through a Pi that's running Windows 10 IoT Core, so it can process the UWP code. My question is - because I am ignorant of the specifics here, and my engineer is busy rewriting the SPI on our stepper boards for the Pi ATM - what, if anything, will he have to do to the UWP output that your software produces to get it to run? What I mean is: your platform is largely servo-centered, which makes sense, but I have 4 servos and more than a dozen NEMAs. So between having the UWP code on the Pi and executing it to the stepper driver boards, do you suspect any complications my engineer will have to work around, given the nature of your software platform? I don't know anything about UWP, so I'm sorry if this is a stupid question; my permanent excuse to you engineer types is "I'm a biologist" lol. Just thought I'd ask.

PRO
Synthiam
#145  

Do you need to run this on UWP with the Pi? Could you not spend an additional $100 and get a real mobile PC that runs full Windows 10? I ask because the time and dollar costs of developing an application will be significantly greater than the $100 or $200 difference.

If you throw in a real PC that's tiny and portable (i.e. a 4"x4" Voyo), you can run full ARC and never need to write a line of code.

The plugin won't work with UWP. It can import Auto Position frames into ARC, which you can export to the UWP library and write a program around. But wow, that's a lot of work...


Also, I don't know MaxScript, so I cannot comment on the above code. All I need is an example of the CSV file to begin the plugin planning and dev.

#146  

You had me at "never need to write a line of code" Lol. I'm already many thousands into this particular project, so at this point I'll sell my own mother to finish it. Okay, so, I know you're running a business, but if you can, to clarify:

Using a smaller real PC, we can just run ARC right off of the PC. No problem, will do. I have 4 micro servos and a dozen NEMAs, dual stepper driver SPI boards and, most importantly, an engineer that's too busy to handle this aspect. So, what's the smoothest, fastest route between the PC and the motors (stepper driver boards and servos included)?
As in: if I don't use the Pi, how do I run these motors, given that PCs tend not to come with GPIO pins? As I said before, I'd love to just buy the EZ-B v4 and run it through there, but... 12 NEMAs lol. Advice? What do I tell my engineer to do, basically. confused

PRO
Synthiam
#147  

Well, an EZ-B v4 :) That's my incentive to assist on this topic - ezrobot operates its business through the financial transaction of product goods.

Also, the UWP library, or any of the ezrobot software, will require the EZ-B v4 or IoTiny... even if it runs on a Pi. The Pi is not a robot controller. The Pi may be acceptable at quick linear processing, but it is absolutely not a hardware controller, because its GPIO is slow and very CPU intensive - and it will always require a complementary hardware controller.

Since you will have an embedded PC in the system, I'd recommend a USB communication method for the EZ-B vs its native WiFi. An affordable Prolific USB UART adapter does the trick. Find out more here: https://synthiam.com/Tutorials/Lesson/76?courseId=4

#148  

Haha, slick. Okay, well, you just sold yourself a board, and if it works, then probably a couple dozen more - but fortunately with USF's money :D:D :D

Alright, thanks @DJSures for the info and customer support. My programmer is finishing up the script; he has sent me his most recent revision, has got all the math working well and is now just revising the format to fit your general description. As soon as he sends me a final draft I'll post the sample file here; I'll have him tweak it if need be, otherwise we'll be golden. Tonight or tomorrow, same time.

#149  

@Ramon3141 Hey, I tried to set up the extraction for the distance real quick...but I had some crashes in 3ds max; this sometimes happens if there are hierarchical issues with linking and wiring parameters! I can try again tonight if you want me to. If it is all sorted out by then, please let me know!

In my experience, 3ds max mostly runs stable if you freeze the transformations of all objects involved before setting up all the parameter wiring!

#150  

@Ramon3141 I set up another helper to extract the positional values, hope this helps! I found the scene to be unstable...so saving often is a good idea!

https://we.tl/1hJkHHbNmz

User-inserted image

PRO
USA
#151  

@Mickey666, I've done some deep digging into that plugin for Maya and watched about 6 hours of video. That thing will definitely allow me to get my face mocap out of Maya and onto my robot. It's not live, but I think that will be ok. The setup is pretty clunky but will work. I might have to get a plugin made to bring the data back into ARC. Thanks for the link.

#152  

@fxrtst Where did you get to see the tutorials? Are they included when you buy the plugin? Are those helper objects doing anything more than extracting rotational values? It seems like the plugin has a lot of cool additional features. I can get everything I need to work within 3ds max...but I could get a student version of Maya and try the plugin if you recommend it!

It is basically a playback recorder for the Pololu servo controller, right...I guess if you can find the file where the output is written down, @DjSures will get a plugin going...he already said that he is working on a frame-by-frame importer for ARC. It might be flexible, or if it requires a strict format, you could still reformat the output of the plugin to match what the importer needs! :)

PRO
USA
#153  

I went to the Stan Winston school to watch 3 videos by the author of the plugin (not free). Yes, he has put quite a lot into the plugin.

His helper objects and naming conventions allow you to use any models, or he has prebuilt servos with which you can represent the motion of a servo. You can also attach mocap data as long as you follow his naming. You can animate just about anything and attach helpers.

Some things were unclear, as in the videos it seems like you can only use translational Y for most things, but I'm not sure if that's the plugin or just his examples.

Also, you can use the educational version, but NOT Maya LT.

And yes, after you animate, you record and spit out the code to go to the controller, but it's limited to 8k files. That's about 20 secs of animation for 18 servos.

A plugin would be easy, as you can see the ordering of frames and positions in a text file.

PRO
USA
#154  

My big question: can I use that data with the Dynamixels? I will most likely use those for my professional robots. Not sure how to get those two to play together.

#155  

Hey @Djsures, here is the template. A little late, but it's here :) I'll post the script and instructions on how to use it shortly. This works reliably and is robust. We downloaded a bunch of random single-object animation scenes that had joints, and it worked for everything, so hopefully it will come in handy for the community you've built here.

https://docs.google.com/spreadsheets/d/1pYTcRHukt4mjG5cuBfKYq4ako1xEgWtkc_s6ls63NYc/edit?usp=sharing

(P.S. Not related to the thread, but I saw that post with your old video. You really are very personable, and even though I am new to the EZ-Robot community, I personally would love to see an AskDJ series!)

#156  

@Ramon3141 Looking forward to seeing the script! That is a huge amount of data...if you wire it incorrectly, your machine will make a really sweet mess! :)

#157  

Ha @Mickey666Maus, that would be amusing to see x) I'm sure if I were to wire it then it would very likely tear itself apart quickly haha. Well, I just ordered the PC, EZ-B v4 and associated tech suggested by @Djsures, and we will see if this thing blows up once the plugin is made. Can't wait till I can post a project showcase! Unfortunately, it'll be about a month or so till I'm allowed to, but it will be fun. The script will take a few more days to post; it was a bit of a pain to use, so I'm having him turn it into a UI- and UX-friendly plugin that asks for conversion formulas for slider joints, allows you to add multiple scenes to the same CSV, or select a particular frame range instead of defaulting to the entire scene, etc. Things that make it super easy to use, even for newbies, or even for people who can't animate at all but have a JD animation someone posted and want to program their JD.

#158  

@Ramon3141 This sounds so cool! Combined with the plugin, we would have a powerful animation tool for ARC, and we could add so much more personality to our robots! This is a great leap forward! :D

PRO
USA
#159  

I watched the videos a few more times to see how the helpers work in conjunction with the animations, i.e. animation is tied only to the "Y" axis of the helpers. See pic. The brow slider attaches to the "Y" of the brow helper, etc.

But I thought to myself, we should be able to stay away from complicated math to solve the Euler angles. Let the program solve the angles and just grab the passed-on values from the helpers.

Mocap data on mocap skeleton ------> retarget to new skeleton bones ------> bone rotation to helper single axis.

So each helper slider represents a single servo. That could be written out to a CSV and loaded into ARC from a plugin.
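
The remaining math is then just a linear remap of each helper's Y value into servo degrees, roughly like this Python sketch (all the ranges are placeholders, not anything from the plugin):

# Map a helper's Y value onto a servo angle; all ranges are placeholders.
def y_to_degrees(y, y_min=-1.0, y_max=1.0, d_min=0.0, d_max=180.0):
    y = max(y_min, min(y_max, y))  # clamp to the rig's travel
    return d_min + (y - y_min) / (y_max - y_min) * (d_max - d_min)

print(y_to_degrees(0.0))  # 90.0 - mid-range Y lands at the servo's midpoint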

When PTP was working on the live data from Perception Neuron's software, it was pushing out raw Euler angles, which he created math to solve. This skips over that step. You guys can do the same thing for IK solving: IK to retarget, then to helper. So again, you avoid the Euler angle math/solving.

I might be missing something here or....?!

User-inserted image

#160  

@fxrtst Hey, I guess for whatever reason I turned off notifications for this thread...so I might have a bit of a delay at times! :)

You are absolutely right in what you are saying: if your 3D model matches your robot, all you would have to do is extract the rotational values locally and store them in the format needed for further processing!

Extracting the values is not my problem; I have a problem correctly setting up the IK for my robot, since mechanical rigging with all the dependencies and constraints is not an easy task...this is why I still use FK to get things done.

Also I never used Maya, I did everything in 3ds max so far... :)

PRO
USA
#161  

You know how, when trying to create something new, you end up jumping through so many programs to get the job done? Well, I've looked into different ways to accomplish this without too much extra effort - and to make it available for everyone.

I have Maya, Max, MotionBuilder and LightWave, all pay-for-use programs. But I've never tried Blender, which is free and open source. I found that it's mostly built on Python and pretty modular. I also found several people who have managed to get Blender to talk to servos via Arduino and pySerial, live in real time. If that is possible, then we should be able to get ARC to work nicely with Blender.
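
From what I can tell, those live setups are only a handful of lines - roughly this kind of thing (an untested sketch; the port, baud rate and object name are all assumptions):

# Rough Blender sketch: stream one object's Z rotation over serial each frame.
import math
import bpy
import serial  # pyserial, talking to e.g. an Arduino sketch that drives the servo

ser = serial.Serial("COM3", 115200)  # port name and baud rate are assumptions

def send_pose(scene, *args):  # extra handler args vary by Blender version
    obj = scene.objects.get("ServoArm")  # hypothetical object name
    if obj is not None:
        deg = int(math.degrees(obj.rotation_euler.z)) % 360
        ser.write(bytes([deg & 0xFF]))  # toy one-byte protocol, one servo

bpy.app.handlers.frame_change_post.append(send_pose)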

Blender supports FBX and 3DS, so you could export your Max scene.

I've downloaded it, and it's just like all the others. Once you learn one animation program, they are all basically the same. You just have to find the buttons.

Anyways, I'll explore more, download the test with the two servos, try to get my head around it and post more here.

#162  

@fxrtst Hhhhhhmmmmmm.....I guess it kind of depends on what you are up to!

Quote:

I have MAYA MAX MotionBuilder and Lightwave, all pay for use programs.
All of them are available for everyone to use when you agree to the terms of a student version...so if you are not aiming for a tool which is sold in tandem with your robot, you are on the safe side! If you plan to do so, Blender would be a very good way to go!

Quote:

Blender supports FBX and 3ds so you could export your max scene.
Well, the scene depends very much on parameter wiring, kind of the same thing which the plugin you are testing in Maya right now is doing...this cannot be used cross-platform.

Quote:

You learn one animation program they are all basically the same. You just have to find the buttons.
Which is kind of true when scratching the surface...but once you dig in deeper, you will have to do a lot of scripting; the tools for rigging a model, and the general workflow, are not as easy to learn as shifting a box around your workspace!

Anyways, you can get stuff done in any of those 3D applications; the main problem is that we are still lacking a way to play back our CSV files in ARC.

I will be going to Japan for three weeks...so I am off the grid, but once I am back I will be more than happy to sit down and find a solution for this. I would not mind taking a look at Blender...the last time I tried, I found the interface a bit clumsy, but that might have changed in the meantime! :)

PRO
USA
#163  

Trust me, if this solution is Max-based, then I'm just as happy! I'm going to try this Blender thing because it is already written and I can do some testing with animation and moving the servos.

You are right, the interface is a bit clunky, but it is pretty powerful. Funny thing, I thought it was something that would never make it this far, as far as competing with Autodesk and NewTek. But it's come a long way. In fact, LightWave development is so far behind Blender. And most companies are now offering a plugin for Blender and no longer for LightWave.

stress

Have a great trip to Japan, chat when you get back.

#164  

@fxrtst I made a quick demo of how far I've got this...I built an interface in 3ds max to control the robot and edit animations. Those animations can be exported as CSV files. The demo also shows how to use the Google Assistant with a custom wake word; you can always enter your custom Dialogflow agent and return to the Google Assistant! I did not really have time to put it together, since I am off to Japan tomorrow, but I hope it is OK enough to show where I want to go at some point! :)

PRO
USA
#165  

Very cool! You certainly have made it past the first step of POC!

#166  

@fxrtst If only I knew what POC is! :D

The animation created can be exported to a CSV file...the only thing I have not managed so far is building stable playback software. I experienced problems playing back frame-by-frame motion in Arduino and ARC, but maybe @DJSures will build a plugin? :)

When I am back from Japan, I will try to use a Raspberry Pi with a servo controller board...there should be a solution; it's just a matter of fine-tuning on whatever platform we will be working on! ;)

PRO
USA
#167  

Proof Of Concept = POC

Have fun in Japan - one of the few countries I've not visited, but it's high on my to-go list.

PRO
Synthiam
#168  

@Mickey that's a great video - like the style of the music as well.

If you give me a sample copy of your CSV, I can easily make a plugin for you. There's a new plugin I created last week for @Will that has a few anti-bounce filters. I can incorporate one of those. No need to go the Raspberry Pi route and write a bunch of code, yuck

#169  

@DJSures I saw the plugin; I still need to fully explore what it does, but I know it can be very useful for a lot of things! :)

My CSV file looks like this: 90, 86, 15, 47, etc...starting from servo D0 up to the number of servos involved! So it might be cool to implement an option to state the number of servos being used in the robot!

I want to use servo control on the Raspberry Pi because it runs the Google Assistant, and a local server is running on it anyways...so with an additional servo controller board it will eliminate the need to run a Windows computer.

But I will definitely run it on Windows and the EZ-B, so we will have common ground to push it further; it will run in tandem with the Raspberry Pi project...because the great thing about EZ-Robot is creating and sharing within the community! That is the biggest fun of DIY robotics, right? :)