Does anyone know if the Oculus Rift SDK would work with EZB if a plug-in were developed for it? It doesn't really need to be something as big and bulky as that headset. It's very unfortunate that Vuzix no longer offers support for EZB; that would have worked great.
Like the Vuzix, I'd like to pull the IMU data to move servos and transfer the camera feed from the EZ camera to the goggles, across two completely separate computers in different places.
Any ideas or suggestions on how to POC this would be appreciated.
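As a starting point for the POC, one way to bridge the two machines is to map headset orientation to servo positions and push them over the network. This is a minimal sketch, assuming the headset side can read yaw/pitch in degrees and the robot side listens for UDP packets; the port, field names, and the 0-180 servo range are all placeholder assumptions, not anything from the Oculus SDK or ARC.

```python
import json
import socket

def imu_to_servo(angle_deg, lo=0, hi=180, center=90):
    """Clamp an IMU angle (degrees, roughly -90..90) to a servo position.

    Assumes the servo's neutral position is 90; adjust per robot.
    """
    return max(lo, min(hi, int(round(center + angle_deg))))

def send_head_pose(sock, addr, yaw_deg, pitch_deg):
    """Send the mapped pan/tilt positions as a small JSON UDP packet.

    The "pan"/"tilt" field names are hypothetical; the receiver on the
    robot PC would decode them and drive the servos through ARC.
    """
    msg = json.dumps({
        "pan": imu_to_servo(yaw_deg),
        "tilt": imu_to_servo(pitch_deg),
    }).encode("utf-8")
    sock.sendto(msg, addr)

# Example: head turned 30 degrees right, tilted 15 degrees down.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_head_pose(sock, ("127.0.0.1", 9999), 30.0, -15.0)
```

UDP keeps latency low, which matters more than reliability here: a dropped pose packet is immediately superseded by the next one, so there is no point retransmitting stale head positions.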
Sorry I'm not contributing as much as I'd like, but I'm extremely new to all this and would likely get in the way more than anything. So is the idea to be able to model a robot in Unity, then map servos for control through ARC? Oculus support?
Yes, this is the goal...Oculus support is also something I would really like to add to this project!
@fxrtst I found out how Sebastian made the IK react to the ground...first he made a traditional animation, then he created a bone system in Unity and set up BioIK as usual. Finally, a Projection objective is added. This objective modifies the walk cycle through BioIK so that the ground is always taken into account!
So we should use our 3D animation package to create the walking motion traditionally, and then use BioIK to modify those motions!
I will create an example using JD...
So cool! I envisioned using a telepresence robot with an Oculus headset and Oculus Touch (or gloves) to control the hands! Think of the possibilities if this works!
I was able to get my robot's head to move around by using Bigscreen to log in through Oculus to my desktop and then manually manipulating the servos through ARC. It's a ridiculously clunky and inefficient way of doing telepresence, but it shows this concept IS very possible!
@mickey That was the missing piece. He did not include documentation for the other objectives, so I was wondering how the Projection objective worked and what it was for.
JD is definitely a challenge for the first setup in our tests, in part due to his physical limitations and missing joints. I'm really looking forward to setting up some animations and getting a real robot to reproduce them live.
The next step for the Virtual JD will be adding reference joints, so Unity will recognize him for further animations. After that we can hardcode animations, such as a walk cycle...which can then be altered live by the BioIK plugin.
Also, I will add a recorder so motion can be recorded and played back...
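A recorder like that boils down to capturing timestamped joint snapshots and replaying them with the original timing. Here is a minimal sketch of that idea; the joint names and the `apply` callback are placeholders for whatever Unity/ARC actually exposes, not a real API.

```python
import time

class MotionRecorder:
    """Record timestamped servo snapshots and replay them in real time."""

    def __init__(self):
        self.frames = []   # list of (elapsed_seconds, {joint: position})
        self._start = None

    def start(self):
        self.frames = []
        self._start = time.monotonic()

    def capture(self, positions):
        """Record the current joint positions (a dict) with a timestamp."""
        self.frames.append((time.monotonic() - self._start, dict(positions)))

    def play(self, apply, speed=1.0):
        """Replay frames through apply(positions), preserving relative timing."""
        last_t = 0.0
        for t, positions in self.frames:
            time.sleep(max(0.0, (t - last_t) / speed))
            apply(positions)
            last_t = t

# Hypothetical usage: capture two head poses, then replay them sped up.
rec = MotionRecorder()
rec.start()
rec.capture({"head_pan": 90, "head_tilt": 90})
rec.capture({"head_pan": 120, "head_tilt": 75})
rec.play(print, speed=10.0)
```

Storing elapsed time per frame (rather than a fixed frame rate) means the playback keeps the original motion timing even if capture happened at an irregular rate, and the `speed` factor gives slow-motion or fast replay for free.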
I tried to find out whether Microsoft's Augmented Reality headsets could be linked to EZ-Robot, since both the software and hardware are from Microsoft. If this has been discussed previously, ignore this reply.
Thanks Ellis