
VR Options?

Does anyone know whether the Oculus Rift SDK would work with the EZ-B if a plugin were developed for it? It doesn't really need to be something as big and bulky as the Rift. It's very unfortunate that Vuzix no longer offers EZ-B support; that would have worked great.

As with the Vuzix, I'd like to pull the IMU data to move servos, and stream the camera feed from the EZ camera to the goggles across two completely separate computers in different places.

Any ideas or suggestions on how to POC this would be appreciated.
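One rough way to POC the IMU-to-servo half would be a small script that reads the headset's yaw/pitch and pushes servo positions to ARC over its HTTP Server. Everything below is a hedged sketch: the host address, the `/Exec` endpoint and its `script` parameter, and the D0/D1 port assignments are assumptions, not the documented ARC API; check the HTTP Server settings in ARC for the real URL format.

```python
import urllib.parse
import urllib.request

ARC_HOST = "http://192.168.1.50"  # hypothetical address of the machine running ARC

def angle_to_servo(deg, lo=1, hi=180):
    """Clamp a signed angle in degrees (nominally -90..90) into the 1..180 servo range."""
    return max(lo, min(hi, int(round(deg + 90))))

def send_head_pose(yaw_deg, pitch_deg):
    # D0 = pan servo, D1 = tilt servo (hypothetical port assignments)
    script = "Servo(D0, {})\nServo(D1, {})".format(
        angle_to_servo(yaw_deg), angle_to_servo(pitch_deg))
    url = ARC_HOST + "/Exec?" + urllib.parse.urlencode({"script": script})
    urllib.request.urlopen(url, timeout=1.0)  # fire-and-forget GET to ARC
```

The camera feed going the other way would be a separate stream; this only covers head tracking driving the servos.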



PRO
USA
#81  

I used Mickey's JD model to demonstrate the Unity process from beginning to end. Fairly quick process.

Edit: I watched the video back and I refer to JD as DJ at the beginning, sorry for the confusion.

Edit: Obsolete. Watch the next video from Mickeymaus; this plugin is no longer needed. He has connected Unity and ARC via the HTTP server.

:P

#82  

@fxrtst et voilà!:D

I am happy we are making progress here! Once we have the Virtual JD as a common ground, it will be the developing platform for future ideas! Bringing Unity up was a very cool thing...I am having a lot of fun already!

Getting a full IK setup going is something I will only believe once we've got it. Gimbal lock and Euler angle order are tricky things to resolve, but I think we might be able to get it done!

You should place a camera in your scene so you will have the correct view once you hit the play button, since the final product will be the game view. That view can be built as a standalone, so everybody on the forum will be able to have fun with a virtual JD!:)

PRO
USA
#83  

Beautiful. Ok, you've connected the rotation to the sliders. So how did you connect to the EZ-B? Do you have a script in ARC? How are you mapping the rotational values to real servo positions (1-180)? Can you share the ARC and Unity project files?

PRO
USA
#84  

I'm pretty sure I can get the IK working. I knew Unity was the way to go after you discovered the HTTP server hooks.

I'm going to set up an arm from JD with IK and see what happens:)

PRO
USA
#85  

I finally got the hang of the IK. I merged your arm in Lightwave, reset the pivot points, and exported it as FBX for Unity. This shows Realistic mode, which gives the objects a sort of mass. I have not yet tried to connect servos, but there's no reason it shouldn't work (since you already have the EZ connection). I still need to check that the actual angles of each part are available to hook into. I'll try with ArdUnity.

BTW, it turns out (as far as I can tell) that you have to run the IK in scene mode, because the goal object can't be moved in game mode.

#86  

@fxrtst This looks VERY promising! Good to see that Bio IK gets the job done!:D

Could you share the new merged JD file as an .fbx? It'll save me time and we will have a common model to work from... Btw, which screen-capture software are you using? I would like to use it for future clips too!

The handling of Unity within ARC is pretty straightforward: all you need to do is connect your EZ-B in your preferred Wi-Fi mode, start the Http Server, and have the servos connected as you are used to.

In Unity it is a bit more tricky. I wrote some code to take the values from the sliders and send them to ARC every time a slider is changed. So I guess the next step is to rig our JD in Bio IK like you did and see how to run that code in tandem with ARC and Bio IK...
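The send-on-change slider logic described above, reduced to a Python sketch of the idea (in Unity it would be C# inside a slider's onValueChanged handler). The `/Exec` endpoint, its `script` parameter, and the `Servo()` EZ-Script call are assumptions to be checked against the actual Http Server skill:

```python
import urllib.parse

class ServoSlider:
    """Mirrors one UI slider to one servo, sending only when the value changes."""

    def __init__(self, port, host="http://127.0.0.1"):
        self.port = port        # e.g. "D0" (hypothetical port name)
        self.host = host
        self.last_sent = None   # last position actually sent to ARC

    def build_url(self, pos):
        script = "Servo({}, {})".format(self.port, pos)
        return self.host + "/Exec?" + urllib.parse.urlencode({"script": script})

    def on_value_changed(self, value):
        pos = max(1, min(180, int(round(value))))
        if pos == self.last_sent:
            return None         # unchanged after rounding: skip the request
        self.last_sent = pos
        url = self.build_url(pos)
        # urllib.request.urlopen(url, timeout=0.5)  # enable with ARC's Http Server running
        return url
```

Sending only on change keeps the HTTP traffic down, since slider drags fire many tiny updates that round to the same servo position.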

I will buy the plugin later today. If you can share your scene I can save some time, but I can also try it myself...I will have to learn how it works anyway!;)

This looks very good...great progress, and I guess we are really onto something this time!:D

PRO
Canada
#87  

Wow, that is really cool @fxrtst. I am wondering if you are going to need some form of feedback loop back into the model to know when the instructions have been executed on the EZ-B. We don't have position data from HDD servos, so if we move the model on the assumption that the servo completed its move before we send the next one, we could get all sorts of strange results. Maybe some form of delay needs to be "calculated" based on the complexity of the requested move.
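The "calculated" delay idea could be sketched like this: with no position feedback, estimate the move time from the largest joint excursion and an assumed servo speed, then wait that long before sending the next pose. Both the speed constant and the function shape are assumptions for illustration, not measured EZ-Robot data.

```python
SERVO_SPEED_DEG_PER_SEC = 300.0  # assumed figure; measure your own servos

def move_duration(current_pose, target_pose, speed=SERVO_SPEED_DEG_PER_SEC):
    """Seconds to wait before the next pose: the largest joint move dominates,
    since all servos travel in parallel at roughly the same max speed."""
    largest = max(abs(t - c) for c, t in zip(current_pose, target_pose))
    return largest / speed
```

For example, a pose change whose biggest joint move is 90 degrees would get a wait of 90 / 300 = 0.3 s before the next command.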

PRO
USA
#88  

@Nink, there are a few use cases for how this could be used. One is "live", where the user just moves sliders around to move the robot, like Mickey has demonstrated, i.e. "pose animation". Then you have an IK-rigged robot like my demonstration: less posing, but still kind of just moving the robot around (follow the ball). The power comes in the animation process (not yet shown): after using a combination of pose, IK, and FK and recording the motions, you could fine-tune the animation by simply moving keys around.

The servos only move at a finite max speed. So when using the model in Real Time or Live mode, you would be able to adjust the speed of the live playback to match the real robot's servos, making strange results less likely. In both of our videos where we move servos, I can tell you that the movement is very, very close to 1:1, with no real lag.
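One way to keep the virtual model from outrunning the physical servos' finite max speed is to rate-limit the commanded position each frame. A minimal sketch, with the 300 deg/s figure as a stand-in assumption:

```python
def limited_step(current, target, dt, max_speed=300.0):
    """Advance current toward target by at most max_speed * dt degrees,
    so the commanded pose never moves faster than the real servo can."""
    step = max_speed * dt
    delta = target - current
    if abs(delta) <= step:
        return target           # close enough: snap to the target this frame
    return current + step if delta > 0 else current - step
```

Called once per frame (dt is the frame time), this makes the on-screen model and the physical servo arrive at the target together instead of the model jumping ahead.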

@Mickey, yeah, the code for Unity, both the get/post portion and the sliders, is what I was interested in looking at; if you could post the Unity scene that would be great. How do you want to share these assets? Dropbox or?

I'm in Mr. Mom mode this week as my wife is travelling for work, so I will be in kid mode for a good portion of the week. I will try to get some things done as time permits.

I'm using Bandicam for screen capture.

Last night's success: capturing angle data from each part. There is no worry about the IK hijacking the rotational values; they are passed along normally. Yay.

In this video: a two-servo setup, one for the upper arm and the other for the lower arm.
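Hooking each part's angle to its servo could follow a per-joint calibration pattern like this sketch. The joint names, offsets, invert flags, and port comments are all hypothetical, since a rig's zero pose rarely lines up with a servo's center:

```python
def joint_to_servo(angle_deg, offset=90, invert=False):
    """Map a rig joint's signed angle to a 1..180 servo position."""
    a = -angle_deg if invert else angle_deg
    return max(1, min(180, int(round(a + offset))))

# Hypothetical per-joint calibration table for the two-servo arm.
ARM_RIG = {
    "upper_arm": {"offset": 90, "invert": False},  # -> e.g. port D2
    "lower_arm": {"offset": 90, "invert": True},   # -> e.g. port D3
}

def pose_to_servos(angles):
    """angles: {joint_name: current rotation in degrees read from the rig}."""
    return {name: joint_to_servo(angles[name], **cal) for name, cal in ARM_RIG.items()}
```

Reading the rig angles every frame and pushing them through a table like this is one way to scale from two servos to the full JD without per-joint special cases.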