
EEGbiofeedback
Hi,
Just did a search through the forum and didn't see anyone using this here yet.
This is the smartest sensor I've seen so far.
https://www.youtube.com/watch?v=k43xJs3Roqg
Here's one person's use (https://plus.google.com/u/0/110913548697960843112) that's quite similar to what I want to do:
" I use the Tango for full autonomous behaviors around my house. I can start the robot in any room in my house and tell it with speech recognition to go to any other location in my house and it can safely travel to that location. I can sit on the couch, take off my shoes, and say "take them to the closet" and it will take my shoes to the closet......once it gets there it can't do anything with the shoes since I haven't given it arms yet........but the arms are very close to being added into the design.
My favorite trick so far is telling it to go wake up my kids in the morning......the robot travels to the bedrooms and sounds an alarm until they get up and use the touch screen face to turn off the alarm.
I have been creating the developer SDK for all aspects of the robot in Java, but a Unity SDK will be coming soon. "
So far only about 3,000 of these have been sold, and only 5 of us are using them for robotics. I'll create a tutorial once I get it interfaced with the EZ-B v4 via USB and can turn on an LED.
Best,
Bill
That would be really cool. Just did a search on YouTube and found someone has already started part of what you suggested. This PT app can combine multiple people's renderings and stitch them all together. Making them into a VR game is a great idea.
Here's someone's Project Tango crude scan of a big house. You can explore it in VR on this page: PT scan of a house you can explore
Just found the entire Project Tango C API and SDK: Project Tango C API
FUN!
Any info on how to integrate this would be super cool...man, I tried 3D tracking some years ago when I wanted to composite a 3D model into a real-life shot, and it was so difficult to get the jitter out of it and the lighting right! I always thought the most you could get out of VR would be to re-texture your environment in real time and add 3D elements! Just imagine turning your flat into a spaceship or a beach, or just hanging out with friends from all around the world!
With robotics we could finally have the option to point out any specific location within an ever-changing environment...this type of tracking is so fast it is unbelievable!
Good digging there @EEGbiofeedback, guess there is something to be found!
Is it me, or has there just been this explosion of technology? I remember sitting around for ten years waiting for something to happen (sans EZB). Or maybe it's people thinking differently about the use of technology. I know this isn't new; I had ReconstructMe from years ago and I have Skanect. But the way the software splices the data between devices is amazing. The tracking of a room full of people is outstanding. Let's get something going with the EZB!
:)
What do you think the route would be for EZ-B integration? a) through the plugin building process: https://synthiam.com/Tutorials/UserTutorials/146/1
b) Ask someone in the Google developer community to continuously output all sensor data to the device's built-in storage that holds pictures, etc. An ini file could then be read and parsed via EZ-Script (see the sketch after this list). (This option is blasphemy among trained engineers, but it's really easy to interface with when one doesn't need microsecond response times and timing.)
c) an easier way?
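For what option (b) could look like on the tablet side, here's a minimal Java sketch that dumps the latest pose to an ini-style file for something like EZ-Script to poll. Everything here is an assumption for illustration: the file name, the key names, and the placeholder values standing in for what the Tango SDK's pose callback would actually supply.

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class PoseFileWriter {

    // On the tablet this would point at the built-in storage that holds
    // pictures etc.; a relative path keeps the sketch runnable anywhere.
    // The file name is made up.
    private static final File POSE_FILE = new File("tango-pose.ini");

    public static void writePose(double x, double y, double z, long timestampMs) {
        try (FileWriter out = new FileWriter(POSE_FILE, false)) {
            // Rewrite the whole file each update so the reader always
            // sees one complete, current pose.
            out.write("[pose]\n");
            out.write("x=" + x + "\n");
            out.write("y=" + y + "\n");
            out.write("z=" + z + "\n");
            out.write("timestamp=" + timestampMs + "\n");
        } catch (IOException e) {
            // A missed update just means the reader keeps the last pose.
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        // Placeholder values standing in for a real Tango pose update.
        writePose(1.25, -0.40, 0.02, System.currentTimeMillis());
    }
}
```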
Here's a video someone did with its camera matching capabilities:
In other news, I did an extensive dead reckoning test in an environment I had previously shown the device. It was accurate to within 3 cm of the starting location. This would make self-charging very easy.
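Just to make that concrete, here's a toy Java sketch of the distance check a self-charging routine could run once the device has relocalized; the dock coordinates and pose numbers are hypothetical.

```java
public class DockCheck {

    static final double DOCK_X = 0.0, DOCK_Y = 0.0; // saved dock position (meters)
    static final double TOLERANCE_M = 0.03;         // ~3 cm, per the test above

    // True once the reported pose is within tolerance of the dock.
    static boolean atDock(double x, double y) {
        return Math.hypot(x - DOCK_X, y - DOCK_Y) <= TOLERANCE_M;
    }

    public static void main(String[] args) {
        System.out.println(atDock(0.021, -0.017)); // true: inside 3 cm, stop driving
        System.out.println(atDock(0.300, 0.100));  // false: keep approaching
    }
}
```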
I have a feeling this device is just a few projects below the top of DJ's fun todo list. But I'll see how far I can take it in the meantime.
Best,
Bill
I have hidden my credit card from myself to prevent impulsively buying one of these tablets........
Alan
I think option A. Does the device have a way to output information (serial or other)? It looks like the USB 3 port is only active when docked. I suppose you could transmit the data via WiFi, but that would be pretty slow for the amount of data that would need to be transferred. If I read this right, it looks like you would have to design the robot to also contain the dock, which might not be that big of a deal.
I just skimmed things so it is entirely possible that I missed something.
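For the WiFi route, something like this Java sketch could run on the tablet and stream poses to the PC running the EZ-B software. The host address, port, and line format are all assumptions for illustration; a plugin on the PC side would parse these lines.

```java
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class PoseStreamer {
    public static void main(String[] args) throws IOException {
        // Hypothetical address of the PC on the same WiFi network.
        try (Socket socket = new Socket("192.168.1.50", 8765);
             Writer out = new OutputStreamWriter(
                     socket.getOutputStream(), StandardCharsets.UTF_8)) {
            // In the real app this loop would be driven by Tango pose callbacks.
            for (int i = 0; i < 10; i++) {
                out.write("POSE " + (i * 0.1) + " 0.0 0.0\n");
                out.flush(); // one complete pose per line
            }
        }
    }
}
```

Pose updates alone are only a handful of numbers per frame, so WiFi should handle those easily; it's the depth/point-cloud and mesh data that would eat the bandwidth, which is probably where the docked USB 3 port would matter.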
Ooooh... I found Alan's credit card. Where can I order?
This is one that I will wait on until someone else writes a plugin for it. My hands are full and this would just take me down another path that I don't have time to work on right now. I will be digging into the SDK some though (as I can't resist seeing what is out there). My hope is that they will open it up to other hardware before too long and allow something like this to run on an on-board computer instead of this device.