New Zealand

Noob Joining In (With Question About Video Feed)

Hi!

I'm a graduate student (in computer science and human-computer interfaces), pretty new to the robot scene, and pretty excited about it. I'm not too sure where to start and was told to look in this forum.

I'm planning on getting a Rover in the next day or two and I would like to overlay, on top of the camera video feed, some computer-generated graphics.

Would anyone know if this is possible to do with the EZ-Robot platform?

Thanks for any thoughts and advice!



New Zealand
#9  

Hi, folks.

First of all, Bob looks like one cool robot! My studies actually have more to do with navigation than robotics per se, but looking at the stuff that people have done with EZ-Robot, I may throw myself at building a robot from scratch once I get past my current project. At the moment, however, I need to focus on what I can use the robot for rather than putting one together, and I only have the Rover that I bought yesterday... which brings me back to my original question regarding overlays.

The iPhone app works well enough but ARC is not displaying the video feed from the Rover's camera so I am unable to test out the augmented reality features Gunner pointed out. I'm not sure why that is: I selected the Rover in the Video Device pull-down menu and played with the settings a bit but all I see is the 3x3 grid over a blank background...

I may start chewing on the SDK if that's the only way to fiddle around with things but I'm guessing I should be able to try the augmented reality features out via ARC, first...? Has anyone had success with the augmented reality feature through ARC? If so, was the overlaid image always aligned with the glyph? (That is, if the image is an arrow pointing away from me when I view the glyph on one side, will it be pointing towards me when I view the glyph from the opposite side?)

Thanks for any thoughts and advice!

Canada
#10  

Hey Hilad, sorry for the drama:)

Couple of questions,

A) Are you viewing (or trying to view) the video stream via the iPhone or a PC?
I don't have an iPhone, rather an Android... but I've had no luck locating any EZ-B app for it, so I don't know what kind of capabilities either would have.

B) Is this the Brookstone Rover you are referring to? As I understand it, DJ has reverse-engineered the signals from the supported toys, including any video stream. But there may be different versions of the Rover... just as there is with the Parrot AR.Drone, which does something different and thus has no video.

Another guess would be that something else may be receiving the video stream? Doubtful that it would cause your issue, but it is something I am familiar with in the PC realm (e.g. Skype or EZ-B will not grab video if the factory software, say Logitech's, or another app is using it at the same time). Are you able to test the video stream on the PC outside of ARC?

As for the augmented glyph thing... while I am sure more is possible with coding... as far as I have tested, the glyph is one of four pre-programmed patterns. And the augmented image you input to represent one or more of the four seems to match the detected size as seen from the camera, but not tilt or angle.
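If it helps to picture that scale-only behaviour, here's a rough sketch in Python (not the actual ARC/EZ-SDK code, which is C#; the function name and sizes are made up for illustration): the overlay is centred on the detected glyph and scaled by the glyph's apparent width, with no rotation or perspective applied.

```python
def overlay_rect(glyph_box, overlay_size):
    """Compute where to draw an overlay image so it is centred on a
    detected glyph and scaled to the glyph's apparent size.

    glyph_box:    (x, y, w, h) bounding box of the glyph in the frame.
    overlay_size: (w, h) native size of the overlay image.
    Returns (x, y, w, h) for the scaled overlay. No rotation or
    perspective is applied - matching the scale-only behaviour above.
    """
    gx, gy, gw, gh = glyph_box
    ow, oh = overlay_size
    scale = gw / ow                      # scale by apparent glyph width only
    new_w, new_h = ow * scale, oh * scale
    cx, cy = gx + gw / 2, gy + gh / 2    # centre of the detected glyph
    return (cx - new_w / 2, cy - new_h / 2, new_w, new_h)

# A glyph seen at twice the overlay's native width:
print(overlay_rect((100, 100, 200, 200), (100, 50)))
# -> (100.0, 150.0, 200.0, 100.0)
```

So walking toward the glyph grows the picture, but tilting the glyph wouldn't skew it.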

You can experiment with any web cam on a PC.

Soon the long weekend will be over and other, more knowledgeable, answers may come your way:)

Gunner

New Zealand
#11  

Hi, Gunner.

A) I've tried viewing the video stream both on an iPhone (it works with the Rover app) and on a PC (using ARC, through which I was able to control the motor but was not able to view the video).

B) Yes, it is the Brookstone Rover. I'm guessing I'm using the same version as the one that was reverse engineered because I think there are only two versions (the earlier one is white while the newer one, Rover 2.0, is black).

I'm not running anything else that would be using the video stream, to the best of my knowledge. I have not tested the video stream on the PC outside of ARC. I haven't done much video streaming work - are there particular approaches or websites you can recommend for this sort of thing?

Regarding the glyphs, when you say that the match is made according to "size as seen from camera, but not tilt or angle," does that mean that the image used to replace the glyph is essentially shown as given - without distortion - except maybe scaled to match the distance from the camera to the glyph? This is in contrast to an object that is fixed to the glyph - both positionally and orientation-wise - in 3-space, so that walking around the glyph would let you view other sides of the object.

Thanks for following up - and absolutely no apologies needed for the drama:)

Hitlad

Canada
#12  

On your PC, just plug in almost any old webcam (USB) and install drivers if required... EZ-B should then see the cam as an option... this will be one way of confirming that the EZ-B end works. And you can do some basic tests with glyphs at least.

As for that, they seem to be just a simple picture overlay, scaled to size but that is it, no distortion. But that alone might mean the capacity is there for more... methinks:) I am not sure about 3-space awareness and rotation, etc. That seems like it would require much more sensor input to supply spatial awareness than what can be gleaned off a 2D glyph. Although you could place multiple glyphs in a cube form, reference each with an appropriate picture, and simulate a 3D object. Well, on the horizontal plane at least (only 4 glyphs).
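That cube idea could be as simple as a lookup from glyph pattern to a pre-rendered view of the object - a hypothetical sketch in Python (the glyph IDs and image names here are invented for illustration; the actual ARC/EZ-SDK is C# and only tracks the four built-in patterns):

```python
# Map each of the four glyph patterns, placed on the sides of a
# physical cube, to a pre-rendered view of the same object.
FACE_IMAGES = {
    1: "robot_front.png",   # glyph 1 glued to the front face
    2: "robot_right.png",
    3: "robot_back.png",
    4: "robot_left.png",
}

def image_for_glyph(glyph_id):
    """Return the pre-rendered view to overlay for a detected glyph,
    or None if the pattern is not one of the four known glyphs."""
    return FACE_IMAGES.get(glyph_id)

print(image_for_glyph(3))  # -> robot_back.png
```

Walking around the cube, the camera sees a different glyph on each side and swaps in the matching view, faking rotation on the horizontal plane.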

I know that there is an interest in getting Microsoft's wonderful little Kinect tied in with EZ-B, but only DJ knows for sure:)

Gunner

New Zealand
#13  

Hi, Gunner.

Thanks for the info.

One (extremely noob) question I have is about EZ-B: I was under the impression that I could do just about everything with the SDK using the Rover without any additional hardware. Having ARC control the Rover motor directly without any additional hardware sort of reinforced that perspective. My supervisor had pointed me to the Rover and I just assumed that was all I needed to invest financially.

But, I'm guessing I came in with the wrong assumptions and was not clear on some of the basics. So, if I understand correctly then, I'll minimally need the EZ-B. From there, I can add on all sorts of stuff, like wheels and cameras, etc., using either parts from the site or a sort of all-in-one by getting a Rover. Is that correct? Kind of makes sense...

Thanks again, Hitlad

PRO
Synthiam
#14  

I'd take what hazbot says with caution. He's looking for a lazy robot, not easy robot:).

A block development environment is single threaded and single process. I encourage hazbot to use a different robot platform for single-threaded, simple processes. Designing a GUI with the scalability for both hazbot and people willing to learn is challenging. Thousands of users would object to hazbot's opinion. I will delete any future references to block single-thread development opinions from hazbot, as the repeated comments are disruptive and do not apply to the ez-robot goals.

If hazbot were a developer or GUI designer, he'd recognize the challenges. So instead it's a constant repeat:(. My suggestion to anyone willing to script advanced features is to use your energy in a productive manner - compared to disruptive forum complaints.

So the answer is simple: if you want a lazy robot, buy Lego and be disappointed with being unable to achieve the features of ez-robot:)

Ps, welcome to the site!:) Learning EZ-Script is quite easy. You don't need to start with it. I suggest following the tutorials and learn the controls. With only the controls, your robot can do amazing things. That won't even require ez-script! And new features are always added:)

Canada
#15  

@hitlad

I may be using some of the wrong terminology and adding to your confusion, sorry :blush:

When I referenced EZ-B, I was referring to the GUI... but I think I am wrong there... let's see if I can get this right:

EZ-B = The controller board (The hardware part that I got in the kit, that I interface all the bits and pieces with)

ARC = The GUI software (free download. The part I use on my computer)

EZ-SDK/EZ-Script = The... not GUI:)... software (free download. I have no experience with this... I should probably download it and at least look at it.)

Since I do not have ANY of the "toys/pre-built robotic test bases":), including the Rover, I have no hands-on experience... but I believe you can control any and all of the Rover's functions with just the builder or SDK software. You only need the EZ-B (board) if you also want to add other sensors and such to the Rover, like sonic, IR, touch, etc. - basically taking the Rover's built-in capabilities way past its native function, which is already enhanced with the EZ-"software" interface.

Hope you find what you need... and discover the additional benefits of the whole EZ-Robot experience.

Keep us updated on your study:)

Gunner

New Zealand
#16  

Hi, Gunner.

Thanks for the clarification.

I'll see how far I can get with just the Rover and the SDK, then. If everything works out, I'm guessing I'll end up liking the system and being invested in it enough to reach into my student savings and buy an EZ-B to build some sort of R2 robot for myself after the navigation study:D

For now, I guess I'll have to shift away from my iOS development work and get sorted with C# or VB.

Thanks again - I really appreciate your time and help!

I'll definitely keep you posted:)

Hitlad