Hi,
I would like to extend the pyJD (https://github.com/BrutusTT/pyJD) project to support video streaming for the JD Humanoid Robot. Is there any way to retrieve the camera images without using the EZ-B software stack?
The idea is to integrate the robot into a YARP-based experiment setup. From the information on the forum I was able to control the servos via telnet, but now I'm stuck on accessing the camera. Unfortunately, using Windows and ARC is not an option for the setup, but I don't mind tinkering with low-level communication with the robot.
Cheers, BrutusTT
I think that the EZ-B's camera is like a WiFi camera, though I'm not sure whether the stream is encoded or not.
It is a custom video stream that DJ developed for performance, but he has stated in the past that he would share how to connect. I forget whether he was going to explain how to connect a custom camera to the EZ-B or how to use custom software to read from the EZ-B, though.
Alan
The code is pretty straightforward for creating a library for any application. Take a look at the UniversalBot in the software section. Let me know if you have any questions.
Thanks for the pointer. I found the class but was not able to test it yet because the battery of our robot died. I will let you know whether I got it working once that problem is solved.
Cheers, BrutusTT
Batteries are finally here
I had a look at EZBv4Video, which looks like the stuff I need. However, I cannot figure out which port I need to connect to, as it seems to be neither the telnet nor the HTTP port. Scanning with nmap also did not reveal another port that I could use.
I know that in ARC you can activate a video stream port, but since I cannot use ARC I cannot flip that switch.
Is there any other method to activate the video stream port? Did I miss something?
Video is port 24.
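From Python that's just a plain TCP socket. Here's a minimal sketch of grabbing raw bytes from the stream (the address is an assumption: in AP mode the EZ-B v4 typically answers at 192.168.1.1, so use whatever IP your robot has; the stream also seems to start as soon as a client connects):

    import socket

    EZB_HOST = "192.168.1.1"   # assumption: default EZ-B v4 address in AP mode
    VIDEO_PORT = 24

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((EZB_HOST, VIDEO_PORT))

    # Keep reading raw bytes into a buffer for parsing later.
    buf = b""
    while len(buf) < 64 * 1024:
        chunk = sock.recv(4096)
        if not chunk:
            break
        buf += chunk
    sock.close()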
Alan
Thanks
I finally managed to fill my image buffer, starting with the EZIMG magic header. The only question remaining is:
How do I decode the buffer into something that I can save as an image or image stream? The easiest for me would be to convert the EZ image to an OpenCV image.
Any ideas where to look for the decoding?
Ok, I guess I found the solution.
The buffer contains a JPEG/JFIF-encoded image, and after removing the first couple of bytes it's working.
In case someone wants to do it as well:
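A minimal sketch with OpenCV: it just scans for the JPEG start-of-image marker 0xFFD8 and drops everything before it (here `buf` is the raw buffer read from port 24, as in the snippet above):

    import cv2
    import numpy as np

    # A JPEG/JFIF stream starts with the SOI marker 0xFFD8; strip the
    # EZIMG header bytes in front of it and hand the rest to OpenCV.
    start = buf.find(b"\xff\xd8")
    if start < 0:
        raise ValueError("no JPEG start-of-image marker in buffer")

    data = np.frombuffer(buf[start:], dtype=np.uint8)
    frame = cv2.imdecode(data, cv2.IMREAD_COLOR)  # BGR array, or None on failure
    if frame is not None:
        cv2.imwrite("frame.jpg", frame)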
The header includes the image size as well, which you can use. Again, I would recommend looking at the UniversalBot code, as it's open source; I suggested it earlier.
I did have a look at the UniversalBot code, but either it lacks the mentioned information or I was not able to find it. After all, I'm not a C# programmer. The overall idea of how to work with the video stream I got from the EZBv4Video class.
But the last missing piece I don't see: how is the EZIMG header constructed, and how should it be read?
The 4 bytes immediately after the header are the length of the frame in bytes, as an unsigned 32-bit int. The good news is that C# has many similarities to C++.
Here are some additional details...
Header...
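Putting the pieces from this thread together, each frame starts with the ASCII tag EZIMG followed by the unsigned 32-bit frame length. A sketch of reading it in Python (the little-endian byte order is an assumption here, matching C#'s BitConverter default, and parse_header is just an illustrative helper):

    import struct

    EZIMG = b"EZIMG"               # 5-byte magic tag at the start of each frame
    HEADER_LEN = len(EZIMG) + 4    # tag + unsigned 32-bit payload length

    def parse_header(buf):
        """Return the frame length, assuming buf starts with a full header."""
        if len(buf) < HEADER_LEN or not buf.startswith(EZIMG):
            raise ValueError("buffer does not start with a complete EZIMG header")
        (length,) = struct.unpack_from("<I", buf, len(EZIMG))
        return length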
And here's some documentation added to the parsing loop...
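A Python sketch of such a parsing loop, under the same assumptions as the header snippet above (the frames helper name is mine; it resynchronizes once and then yields one JPEG payload per frame):

    import struct

    EZIMG = b"EZIMG"
    HEADER_LEN = len(EZIMG) + 4

    def frames(sock):
        """Yield raw JPEG payloads from a socket connected to port 24."""
        buf = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                return
            buf += chunk
            # Resync: drop any partial frame that was mid-flight when we
            # connected. After the first hit this find() always returns 0,
            # because each frame is consumed exactly up to the next header.
            start = buf.find(EZIMG)
            if start < 0:
                continue
            buf = buf[start:]
            # Emit every complete frame currently sitting in the buffer.
            while len(buf) >= HEADER_LEN:
                (length,) = struct.unpack_from("<I", buf, len(EZIMG))
                end = HEADER_LEN + length
                if len(buf) < end:
                    break  # frame not fully received yet
                yield buf[HEADER_LEN:end]
                buf = buf[end:]

Each yielded payload can then go straight into cv2.imdecode, as in the earlier snippet.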
The loop searching for the header and the removal of previous bytes from the buffer only consume CPU for the first frame. This is because your first frame may contain only partial data, depending on where in the FIFO the connection was established. Once your code finds the header and removes the leading bytes from the buffer, it will be in sync for each consecutive frame, meaning the first byte after clearing the last frame out of the buffer will be the next image header.
Once the data is synchronized with the code logic, the header will always be the first byte after the last frame, so don't worry about the CPU cost of the loop that finds the header.
Ok, thanks. Got it working now. Hope I can release a new version with video soon.
Awesome!