All of this sounds very good... I did all the coding by copying and pasting from stuff I found on ScriptSpot etc., so I can get it done, but as you can see, it's like a little child trying to speak grown-up language!
I am connected to ARC with the HTTP server running while executing the Maxscript; this moves the servo according to the RedrawViewsCallback or the TimeCallback, whichever is registered while the Maxscript is operating!
I am connected in client mode.
I guess for just animating and saving the string to a text file, none of the above would matter, but it is kind of convenient to have the robot executing the movement frame by frame while recording it to a text file, just to make sure the motion works properly!
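For reference, the core of what I'm running looks roughly like this (a trimmed-down sketch: the object name $Servo_D0 is a placeholder for the rig part driving the servo, and the URL, port and password are placeholders for whatever your HTTP Server control in ARC actually shows):
Code:
-- send the current servo position to ARC's HTTP server on every viewport redraw
wc = dotNetObject "System.Net.WebClient"

fn sendServoPosition =
(
    -- map the X rotation of the rig part to a 0-180 servo position
    local ang = ((($Servo_D0.rotation as eulerAngles).x) as integer) + 90
    wc.DownloadString ("http://localhost:80/Exec?password=admin&script=Servo(D0," + (ang as string) + ")")
)

registerRedrawViewsCallback sendServoPosition  -- fires on viewport updates
-- use registerTimeCallback instead to fire on time-slider movement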
Thanks for the information. That is what I thought your configuration would be but I wanted to be sure. Even though I don't have Maxscript, I can still run tests and have found you can run scripts and Frames or Actions from Auto Positioner via the HTTP server directly. Probably anything else as well. What this means is we have a choice here of doing everything from Maxscript, as opposed to sending strings of commands and having a script running in ARC to interpret those commands.
On the other hand, we still need to have a way to save the movements for playback later. Therefore we still need to save something to a file. At the moment, I'm thinking the best way might be to save the actual commands such as Servo(D0,10) (like you were doing originally) instead of using the character groups I had suggested, such as D0|10. In this way, there would be the option of re-sending those recorded commands through a simple 3ds script, or having an ARC script read the file and execute each command itself.
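Something along these lines is what I picture for the recording side (a rough sketch: it samples every frame of the animation range and appends one command per frame to a file; the object name and path are placeholders):
Code:
fn recordAnimation obj portName fileName =
(
    local f = createFile fileName
    -- evaluate the rig at each frame of the active animation range
    for t = animationRange.start to animationRange.end do
    (
        at time t
        (
            local ang = (((obj.rotation as eulerAngles).x) as integer) + 90
            format "Servo(%,%)\n" portName ang to:f
        )
    )
    close f
)
recordAnimation $Servo_D0 "D0" "c:\\temp\\motion.txt"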
Each method has its drawbacks. Playing back through a script in 3ds means, of course, that you need 3ds running. That may be difficult or impossible when playing back on a different computer or a tablet. I believe, however, that I can write a standalone program to do the same thing, so that would not be much of a problem. In either case, an HTTP connection must be established via some sort of network, and that may be the biggest problem when using a different computer. One advantage of this method is that you would not have to switch back and forth between ARC and 3ds when doing a playback operation.
If played back directly through ARC, nothing else is needed. The ARC script, however, would have to interpret the commands and run them through a selection from an If-ElseIf list, since there is no way to run a command directly the way you can through the HTTP server method. But since there won't be a large variety of commands, I don't believe that will be much of a problem.
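The ARC side of that might look something like this (a sketch from memory, so double-check the file functions against the EZ-Script manual; the path is a placeholder and I'm only showing two ports):
Code:
# read the recorded commands line by line and dispatch through If-ElseIf
FileReadReset("c:\temp\motion.txt")
:loop
IF (FileReadEnd("c:\temp\motion.txt"))
  FileReadClose("c:\temp\motion.txt")
  Halt()
ENDIF
$line = FileReadLine("c:\temp\motion.txt")
# pull the port and position out of a line like Servo(D0,90)
$port = Split(Split($line, "(", 1), ",", 0)
$pos = Split(Split($line, ",", 1), ")", 0)
IF ($port = "D0")
  Servo(D0, $pos)
ELSEIF ($port = "D1")
  Servo(D1, $pos)
ENDIF
Goto(loop)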
Having said all that, the best path may be to do both. That way you have a choice depending on the situation: for development work, use the HTTP method via a 3ds script; for playback to show others, use the ARC script method.
Overall, then, it looks like the biggest hurdle will still be creating the command strings in Maxscript in the easiest way I can come up with, as well as making the gathering of data for those strings as generic as possible. For that, I need to educate myself more on Maxscript.
Anyway, think it over and let me know your opinions. Assuming I haven't confused the issue too much.
No, you haven't confused it at all...
Using an external animation package gives us the option of sophisticated rigging methods, inverse kinematics, and a non-destructive workflow when fine-tuning the timing and position of our motion... so it is obviously kind of tempting to build a rig in 3ds Max which allows us to remote-control the real model live!
But my overall idea was to build motions to use from within ARC directly since, as you have stated, it would be an issue once you want to relocate your robot and use it while ARC is running as the mobile version, or when using a different computer.
Quote:
...and have found you can run scripts and Frames or Actions from Auto Positioner via the HTTP server directly.
This is something that did not cross my mind... it could be interesting, as you mentioned that Auto Positioner operates slightly differently than if we used an EZ-Script to control servo positions, right? If those frames could be saved for later use, it could be something that might work out!
Quote:
On the other hand, we still need to have a way to save the movements for playback later.
But this is what I was actually aiming for... I would like to have a way to store my pre-recorded animations within ARC directly, to use them with my robot in any given environment!
One last thing that just came to my mind: 3ds Max is software for building animations and rendering them out, or for exporting those motions to a specific file format; it does not work in real time! So if I move my model in 3ds Max, or if I play back my animation, there is always a lag... it is never as smooth as the final rendered animation would be, because the modeling/animation software is involved, the Maxscript is running, the HTTP server is running, and ARC with EZ-Script is up to translate the motion to those servos!
So I guess we should aim to get rid of as many of these tasks that would have to be up and running as we can, and try to keep it as clean and easy as possible, right?
@Mickey666Maus I'm beginning to get a handle on Maxscript. At least as it relates to the script you presented. In that light I have a question: When does the script actually run? That is to say, what triggers the script to run when you are moving the virtual robot in 3ds?
I'm thinking perhaps you trigger it manually to run once when you move the robot to a new position for the next key frame, but I don't know if that's correct.
Once I manually start the script, it runs in the background; depending on whether the RedrawViewsCallback or the TimeCallback is registered, it sends the servo position on either a viewport update or a time-slider movement! I would need to manually terminate the script to stop it from running in the background!
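(In code terms, the start/stop pattern looks roughly like this; the print is just a stand-in for the actual send function:)
Code:
fn onTimeChange = ( format "frame: %\n" sliderTime )  -- stand-in for the real send function
registerTimeCallback onTimeChange    -- from here on it runs in the background
-- ...
unregisterTimeCallback onTimeChange  -- manual termination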
Post deleted since it served its purpose in trading email addresses.
Wish you guys could have continued here on this thread. This was getting good. Hopefully you are making progress!
I'm leaving my example of what I was able to do as a proof of concept; maybe it will interest someone.
This video demonstrates live and recorded facial motion capture with the Asus sensor, outputting channels to a servo.
The motion uses fast noise filters to take out a lot of the jitters (heavy filtering was needed because the tracker loses my lips a lot; the camera gets tricked by my goatee). It also captures eye motion based on pupil location.
I then tied my motions to a puppet, in this example an alien I created. This puppet was rigged in the animation package Maya. One of the rotational values (in this case the jaw) was output to an Arduino with a servo. So when my jaw opened, the puppet's jaw opened, and then the servo moved to that position.
I want to continue with this project, as my vision of ALAN progressing involves this type of motion capture and prerecorded motions, which can be called upon when a sensor detects emotions.
I'm looking for anyone interested in working with me on this in the next 30 days or so: anyone who can code in Python or write sketches for the Arduino. Notify me off-forum if you are interested and we can speak further.
In the meantime, have fun watching me make some funny faces!
@fxrtst We're still at it. Just that we have been corresponding via email instead of posting every little back-and-forth between us. I should be posting soon as far as a Maxscript for real-time motion from the virtual robot in 3ds. Probably tonight. But it's jittery as yet. Maxscript has a host of similarities to Python, so translation to something else Python-based might not be too difficult. I'm not sure if Maya can use Maxscript directly, but maybe so since it's the same company. I've heard talk of 3ds supplanting Maya in the future, but that will be a tough slog since Maya is the industry standard.
Currently, the main goal of the project has shifted to recording motions from the movements of the virtual robot for playback into a real robot, like JD for example (through ARC). The motions are first created with the animation features of 3ds. The animation is run as the various movements are recorded into a file. The method used for that will store much, much less data than the real-time recordings do now. That's mainly because they will be based on a vector model rather than a frame-by-frame animation as it is now. The new vector method can also be used to drive the robot directly, but it will not be real-time since it can't send a specific bit of movement data until the virtual robot has finished making that particular movement. I'll explain what is proposed in the vector method when I post the code later tonight.
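Just so it's not totally abstract until then, one way to picture the difference (this is not the actual code I'll be posting, and it assumes a simple keyable rotation controller such as TCB or Linear rotation on a placeholder object):
Code:
-- dump only the keyframes of a rotation controller instead of every frame,
-- giving (time, angle) pairs the robot can interpolate between on its own
fn dumpKeys obj =
(
    for k in obj.rotation.controller.keys do
        format "% : %\n" k.time ((k.value as eulerAngles).x)
)
dumpKeys $Servo_D0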
Another goal is to be able to create a robot in 3ds, test it, and produce what is necessary to build it via 3-D printing. Then use what is being developed now to run the real robot from the virtual robot and create a series of movements for the real thing from animations made in 3ds. With 3ds' animation capabilities, you can make the virtual robot do a lot of things without having to map out every little detail manually; just capture the movements that the animation software performs automatically. That'll be Mickey666Maus' domain, however. I'm just here to help with the code.
After all this, I may see if it's feasible to do somewhat the same thing in Blender with its built-in Python. 3ds is great but pricey in the long run unless you're making money from it.
Good luck with what you're doing, it looks great!