
VR Options?

Does anyone know if the Oculus Rift SDK would work with the EZB if a plugin were developed for it? It doesn't really need to be something as big and bulky as that. It's very unfortunate that Vuzix is no longer offering support with the EZB. That would have worked great.

Like with the Vuzix, I'd like to pull the IMU data to move servos and transfer the camera feed from the EZ camera to the goggles, across two completely separate computers in different places.

I'd be grateful for any ideas or suggestions on how to POC this.
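The head-tracking half of that POC boils down to mapping headset IMU angles onto servo positions. Here is a minimal sketch of that mapping; the angle range, servo range (EZ-B style 1-180), and centering are assumptions for illustration, not the Oculus or EZB API:

```python
def imu_to_servo(angle_deg, min_angle=-90.0, max_angle=90.0,
                 servo_min=1, servo_max=180):
    """Clamp an IMU angle and linearly rescale it to a servo position.

    Assumed ranges: headset yaw/pitch in [-90, 90] degrees,
    servo positions in [1, 180] (EZ-B convention).
    """
    clamped = max(min_angle, min(max_angle, angle_deg))
    span = max_angle - min_angle
    pos = servo_min + (clamped - min_angle) / span * (servo_max - servo_min)
    return round(pos)
```

The same function would be called once per axis (yaw for the neck pan servo, pitch for the tilt servo) on every IMU sample received from the goggles' computer.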



PRO
Canada
#257  

If you go with option 1, are we giving up all the existing functionality in EZ-Robot (the ability to script, use existing peripherals, plugins, etc.)? Is there an option 4: MQTT from Unity to EZ-Robot? It has the pros and cons of option 3, but I think we are going to need a protocol to interface not just from Unity => MQTT or HTTP etc. => EZ-Robot => EZB, but also from EZB (ultrasonic sensor, images, etc.) => EZ-Robot (as well as all other peripherals attached to EZ-Robot like head-mounted displays, mice, controllers, etc.) => MQTT => Unity.

I can't imagine I will always want to be clicking a ball with a mouse to move the model and the robot. The ultimate goal would be a bi-directional digital twin (move the virtual JD and the physical robot moves; move the physical robot's position with EZ-Robot and the virtual robot changes position). With digital servos like Dynamixels you can even go one step further and track servo position => EZB => EZ-Robot => MQTT => Unity, so you can program the virtual model just by moving the robot's limbs. The MQTT protocol at that point could also transmit to other robots (move one physical robot arm and three physical robot arms plus the virtual robot arm all move): multi-robot synchronized movements...
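One way the digital-twin fan-out could look on the wire: every participant (ARC, Unity, other robots) publishes and subscribes to the same topic layout on a broker. The topic scheme and payload fields below are made up for illustration; any MQTT client (e.g. paho-mqtt) could carry them:

```python
import json

def servo_message(robot_id, servo_port, position):
    """Build a (topic, payload) pair announcing one servo's position.

    Hypothetical topic layout: robots/<robot_id>/servo/<servo_port>.
    """
    topic = f"robots/{robot_id}/servo/{servo_port}"
    payload = json.dumps({"position": position})
    return topic, payload
```

With every client subscribed to the wildcard `robots/+/servo/#`, moving one physical arm would publish one message, and every virtual and physical twin listening on the broker would follow.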

User-inserted image

#258  

@Nick I totally agree with you. My point was just that the more we get involved in a heavy setup, the slower the progress toward reaching the first goal will get!

We all have different things in mind; @ptp just mentioned we should add more joints to JD to get better DOF...

I think your idea is great, but we should first get JD properly running; second would be to integrate ARC using its native HTTP server, since it is a single click to add it to everyone's project...

After this is accomplished, we can move on to integrating our ideas, e.g. the MQTT protocol, raising JD's DOF, or Oculus Rift control...there are a million possibilities out there! :)

#259  

@all I already started to look into animating objects in Unity. It works well to animate the IK goals, but the calculated joint motion will only be seen once Unity jumps into play mode. I am not sure yet if there is a way to actually see your robot make those moves while you are animating; I might ask the author of BioIK about this...

I will look deeper into this, and also into setting up constraints in Unity, so we will not experience any weird flips or jumps in servo rotation when setting up animations for JD! :)
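The "weird flips" problem is usually a joint taking the long way around the circle (e.g. rotating -340 degrees instead of +20). In Unity this would be handled by joint constraints, but the underlying idea can be sketched as a shortest-rotation step clamped to the joint's limits; the 0-180 default range is an assumption matching hobby servos:

```python
def step_toward(current, target, lo=0.0, hi=180.0):
    """Move from current to target via the shortest signed rotation,
    clamped to the joint limits [lo, hi] to avoid flip-arounds."""
    # Wrap the difference into (-180, 180] so we always pick the short way.
    delta = (target - current + 180.0) % 360.0 - 180.0
    return max(lo, min(hi, current + delta))
```

For example, asking a joint at 170 degrees to reach a target that wrapped to 10 degrees moves it directly, instead of spinning the servo through its mechanical limit.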

PRO
Canada
#260  

No worries @Mickey666Maus, I was just making sure we all knew we were making a conscious decision to move off EZ-Robot as the platform. There are a lot of plugins etc. for Unity3D, it is cross-platform, and it's the leading 3D game dev engine, so it would make a great development platform. It also gets us off our dependence on Windows and .NET.

If we go back to the original post in this thread, @fxrtst's goal was a virtual robot connected to an Oculus Rift and an EZB. Unity supports the Oculus, so that would solve his goal as well. I guess this would be our new architecture.

User-inserted image

#261  

@Nick I really love those visual flowcharts you are making. Is it an app or do you just make them up yourself? It makes it very easy to see how stuff branches out and to not lose focus on what has to be addressed to keep up a good pace of progression!

I got kind of excited about the idea that even if we are only a small team, like the four of us mainly participating in this thread...so you, @ptp, @fxrtst and me, if each of us had the equivalent of one hour a day to put into this, we would have at least 28 hours a week to work on getting this ahead!

@ptp set up a project at Bitbucket where we can keep things up to date with Git, and we could try to find the best fields for each of us to divide what has to be solved!

I guess @fxrtst and I would be good for modeling and animating, and you guys could help with coding?

So for this I guess it would be very good to have clean, structured goals! :)

This seems to be very promising, and a big leap for all of us who are into robotics! :D

PRO
USA
#262  

@all, just my opinion. After reading through the options that PTP has laid out, I guess I was hoping that we had an option #3 with the ability to have a tethered serial mode for onboard computers. I would definitely miss using the abilities of ARC. A mixed-mode environment is important to me, i.e. the ability to use both regular RC servos and Dynamixels, especially for advanced robots.

As far as advancement of the model, I'm with Mickey on this one. Getting everything to work as a POC is more important than having everything rigged properly with working dynamics, etc. There's no reason to spend energy on perfecting the model if we are unable to use Unity and robots the way we want.

The next step is animation. If we can't get that to work correctly, then Unity will only be useful as a live puppet-playing machine. It would solve the title of this thread, but I think we all agree we are way past that with our current exploration.

A small digression, but an important one for me. Let me say I'm in awe of anyone who can play an instrument, read sheet music, or code. I struggle with understanding these things. My head is full of the thousands of things I can do proficiently, like 3D animation, 3D printing, makeup and effects, animatronics, video editing, motion capture, and on and on.

As an artist, I'm a visual learner. I'm challenged daily by sequences, and when I ask a question for clarity or an explanation about something, imagine you are standing in front of a child who knows nearly nothing. As an example, I need to know what every component is and where it goes in sequence. In our little project here, I have no idea where the C# script that PTP made goes or how it is applied to the scene. I also struggle with setting up or understanding how the HTTP server works.

A detailed step-by-step description of the procedures would be helpful to me, i.e. "first...place this here...second, add this there...then connect this to that." Usually after I study something, I eventually get it.

Imagine if I handed you a paintbrush, prosthetics, makeup, and glue, then pushed you into a room and said "Now age Brad Pitt to 75 years old, oh, and you have three hours to complete it." Where do you start?

#263  

Yes, this is exactly what I would like us to do: get the virtual JD running first...in a way that we will all just have to open the project in Unity and hit play!

So the way @ptp is setting it up now is very direct...there is no way for it not to work, because Unity is talking directly to the EZ-B! I hope I can get to work on the scene a bit tonight or tomorrow; it's almost there. And you do not need to understand every aspect of the script, since @ptp coded it in a way that, once set up, it does not need to be attended to in any way...

Once everything runs smoothly, we can progress to tethering by cable, MQTT, or the HTTP server...but this should be the step AFTER we have all the rest sorted out, otherwise we will have too many loose ends.

Correct me if there is a better approach...:)

PRO
USA
#264  

@nink, love the visual flow charts!