Does anyone know if the SDK of the Oculus Rift package would work with EZB if there was a plug-in developed for it? It doesn't really need to be something as big and bulky as that. It's very unfortunate that Vuzix is no longer offering support for EZB; that would have worked great.
Like the Vuzix, I'd like to pull the IMU data to move servos and transfer the camera feed from the EZ camera to the goggles across two completely separate computers in different places.
Any ideas or suggestions on how to POC this would be appreciated.
Something strange happened when I pasted the code; it should look like this! I guess formatting it caused that...
That Arduino code is very useful, but not in the way you think, because it's the most terrible implementation of a protocol I've seen. I think it's best to design our own. So, it looks like we'll revisit supporting other software - unless you think Unity is the way to go.
@mickey For you when you get up! I think you are 9 hours ahead of me?!
Yeah, that's strange that the code did not paste correctly. Were you able to connect to ARC/Unity using that code in the second image? If so, we are halfway there!
@DJ, yeah, I can say that code is really half the story, since the other half is the lackluster plug-in. What I can reiterate is that the plug-in is unstable, unfinished, poorly documented, and not well supported (their last post on the Google Plus forum for the plug-in was 7/17). It seems he might be in litigation with his two colleagues over the software.
But it seems Mickey has found a way to hook Unity to ARC via an HTTP server. That cuts out the flaky plug-in I've been testing.
I personally think Unity is the way to go as far as choosing a program, especially if it's accessible by ARC. I think it's important that we choose free software so everyone can use it. Other animation software can cost thousands of dollars per user license, unreachable for many.
That really leaves us with Blender and Unity.
Thoughts, comments?
After doing a bit more research on Blender and watching some videos, it looks like it uses its own game engine (BGE) to connect to the outside world, with PySerial and a few Python scripts to get it going.
My only concern is that it seems to be a slower connection than Unity, kind of like the difficulty Mickey was having with 3D Studio Max. I have no idea why that would be or how to get around it.
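Just to make the PySerial idea concrete, here is roughly what one of those little Python scripts would boil down to. This is only a sketch: the port name, baud rate, and the two-byte channel/position framing are assumptions for illustration, not a protocol anyone has agreed on.

    # Rough sketch of a script a BGE logic brick (or any Python loop) could run.
    # The port name, baud rate, and 2-byte framing below are assumptions.
    import math
    import serial  # pyserial

    ser = serial.Serial("COM3", 9600, timeout=0.1)  # assumed port and baud rate

    def angle_to_servo(angle_rad):
        """Convert a joint angle in radians to a clamped 0-180 servo degree value."""
        deg = math.degrees(angle_rad) + 90          # assume 0 rad = servo centered at 90
        return int(max(0, min(180, deg)))

    def send_servo(channel, position_deg):
        """Send one channel/position pair as two raw bytes (made-up framing)."""
        ser.write(bytes([channel & 0xFF, position_deg & 0xFF]))

    # Example: drive servo 0 from a head-joint rotation of 0.35 rad (~20 degrees)
    send_servo(0, angle_to_servo(0.35))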
If we look at your original use case:
"Creating a POC with Alan and Alena to be used as long distance Avatars. Business guy in Tokyo calls into corporate headquarters for meeting. At the headquarters is an Alan in the meeting room. Guy in Tokyo has on VR headset and can see EZ robot camera image from Alan live, and as the guy moves his head, Alan moves his head in unison so he can see everybody at the meeting. As he speaks so does Alan using our audio servo control board. "
Are you really looking at controlling yaw, pitch, and roll of your robot remotely and feeding video back into a headset? If that is the case, I think the 3D rendering that mimics this is a bit of a red herring. If you have the data you could also feed it into a 3D space so you have an avatar that mimics the movement for others, but is it really needed for the use case?
Maybe all you really need is an old Vuzix, as you previously stated: pull the data off the Vuzix and feed it into servos, and pull the video feed off the EZ-B camera and feed it into the Vuzix. https://synthiam.com/Support?id=166
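Just to make that mapping concrete, the math on the servo side is pretty small. A rough Python sketch, where the +-60 degree head range and the 0-180 servo scale are assumptions picked for illustration:

    # Map headset yaw/pitch (degrees) to two 0-180 servo positions for the neck.
    # The +-60 degree usable head range is an assumption; tune it to the robot.
    def head_to_servos(yaw_deg, pitch_deg, half_range=60.0):
        def to_servo(angle):
            clamped = max(-half_range, min(half_range, angle))
            # -half_range..+half_range maps to 0..180, with 90 as centered
            return int(round(90 + clamped * (90.0 / half_range)))
        return to_servo(yaw_deg), to_servo(pitch_deg)

    # Example: looking 30 degrees right and 10 degrees down
    yaw_servo, pitch_servo = head_to_servos(30.0, -10.0)
    print(yaw_servo, pitch_servo)   # 135 75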
@fxrtst Hahaha, I should be...but it seems like I am stuck on Japanese time! That is why I am up again already!
I was getting really excited when I gave the whole Unity concept a second thought, since it offers one thing that all the other options don't: portability!
When I posted my success driving my robot's servos within 3ds Max, I was kind of wondering why nobody else wanted to follow that path, but the truth is... nobody wants to go through the whole process of installing some new software and trying to understand a scene someone else created in some unknown environment, while having to learn how the user interface works beforehand.
Well, with Unity you do not have to: the final product can be exported as a standalone version which everybody can use out of the box!
You are so right about this, @fxrtst, Unity is the key, and I will put some work in! We can all still reuse the work we previously made in Blender, Maya, 3ds Max, etc., and connect it to ARC through Unity!
@DJSures This could be a great option to build new interactive ways to explore your product... at the very least it would give visual feedback of JD's movements, either being dragged around in the viewport for live control, or interacting with objects in a virtual setting! I have seen only a few robotic platforms, e.g. the Nao robot, offering this! I think it would have a huge impact on consumers!
Anyway, I managed to connect Unity to ARC, no plugin required, NO ARDUINO!
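Since the ARC side is just an HTTP server, anything that can make a web request can talk to it, which is exactly what Unity does with its own web request API. Here is the idea in a few lines of Python; the host, port, endpoint path, and the Servo() command are placeholders, not ARC's documented interface, so check them against however the HTTP server is actually set up in ARC.

    # Minimal sketch of sending one script command to ARC over HTTP.
    # ARC_HOST, the /Exec path, the query parameter, and the Servo() syntax
    # are assumptions for illustration; use whatever the ARC HTTP server exposes.
    import urllib.parse
    import urllib.request

    ARC_HOST = "http://192.168.1.10:80"     # assumed address of the ARC machine

    def send_to_arc(script_text):
        query = urllib.parse.urlencode({"script": script_text})
        url = ARC_HOST + "/Exec?" + query   # hypothetical endpoint
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.read().decode("utf-8", errors="replace")

    # Example: ask ARC to move the servo on port D0 to position 90
    print(send_to_arc("Servo(D0, 90)"))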
I will wire up the virtual JD; I still have it fully rigged somewhere...so we can have a common playground!
I managed to get the JD with all the hierarchical dependencies into Unity. My custom controls do not work, but the goal would be to re-rig him in Unity anyway, to learn about the process!
@mickey Ah, still in Japan! I thought you were back! I'm still up, 11 pm West Coast USA time. I think you sleep like I do, or rather don't sleep much.
So glad to hear the test was a success (getting Unity to chat with EZ). I brought in a rigged Lightwave object and ran into the same issue; just have to re-rig. You can bake motions and bring them in via FBX if you animate in Max. But if you want to animate in Unity, you've got to re-rig. It's fun and easy with the IK rigger I mentioned earlier.
I'm glad we are both on the same page with Unity. I agree with you on portability, and it's free, etc. I'm glad you can see what this can mean for EZ products. Nothing like this has ever been commercially available. It opens a massive world for robotics, including machine learning for robots.
@nink This does apply to my original title of this thread, but we have wandered back to a path that crosses over with a prior thread started by Mickey about using animation programs to move robots. For me this goes waaayyy past VR and avatars to include motion capture, machine learning, and advanced robot movements. As stated before, this has been a 30-year pursuit. All my POCs lead in one direction, and for the very first time I can just about reach it with my fingertips. Most exciting!