
EEGbiofeedback
Hi,
Just did a search through the forum and didn't see anyone using this here yet.
This is the smartest sensor I've seen so far.
https://www.youtube.com/watch?v=k43xJs3Roqg
Here's one person's use that's quite similar to what I want to do: https://plus.google.com/u/0/110913548697960843112
" I use the Tango for full autonomous behaviors around my house. I can start the robot in any room in my house and tell it with speech recognition to go to any other location in my house and it can safely travel to that location. I can sit on the couch, take off my shoes, and say "take them to the closet" and it will take my shoes to the closet......once it gets there it can't do anything with the shoes since I haven't given it arms yet........but the arms are very close to being added into the design.
My favorite trick so far is telling it to go wake up my kids in the morning......the robot travels to the bedrooms and starts and alarm until they get up and use the touch screen face to turn off the alarm.
I have been creating the developer SDK for all aspects of the robot in Java, but a Unity SDK will be coming soon. "
So far only about 3,000 of these have been sold, and only 5 of us are using them for robotics. I'll create a tutorial once I get it interfaced with the EZ-B v4 via USB and can turn on an LED.
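For anyone curious, here's roughly the shape of that LED test in Java once a serial stream to the EZ-B is open. This is only a sketch: the opcode and framing below are placeholders I made up for illustration, not the actual EZ-B v4 serial protocol, so check the EZ-B docs for the real command bytes.

```java
import java.io.IOException;
import java.io.OutputStream;

// Minimal sketch of an "LED on" test over a raw serial stream to the EZ-B v4.
// NOTE: the opcode and framing here are placeholders, NOT the real EZ-B
// protocol -- substitute the documented command bytes.
public class LedTest {
    // Hypothetical opcode for "set digital port state" (an assumption).
    private static final byte CMD_SET_DIGITAL = (byte) 0x01;

    public static void setLed(OutputStream ezb, int port, boolean on) throws IOException {
        // Hypothetical framing: [opcode][port][state]
        ezb.write(new byte[] { CMD_SET_DIGITAL, (byte) port, (byte) (on ? 1 : 0) });
        ezb.flush();
    }
}
```

The OutputStream would come from whatever USB serial driver ends up working over the OTG cable.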
Best,
Bill
That's awesome - I've never seen it. Where do you get one?
Here is info on Project Tango: https://developers.google.com/project-tango/
Requires a specific model tablet (link on that page).
Alan
Yes, it is a tablet. Lenovo is coming out with a phone with built-in Project Tango sensors in June. This tablet uses the NVIDIA Tegra K1 with 192 CUDA cores packed in. Not sure if they'll be able to do that in a phone without significant heat issues.
I'm currently doing the recommended full charge prior to first use.
Meanwhile, I'm looking through the EZ-B tutorials on interfacing via USB serial. I'm still waiting on a Micro USB OTG to USB 2.0 adapter cable on order from Amazon.
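While I wait for the cable, here's the minimal check I plan to run first: enumerating what the tablet sees on the USB bus. This uses the standard Android USB host API (nothing EZ-B specific); actually opening the device also needs a runtime permission request, which I've left out of this sketch.

```java
import android.content.Context;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.util.Log;

import java.util.HashMap;

// Sketch: list USB devices visible through the OTG adapter, to confirm
// the tablet actually sees the EZ-B's USB serial interface.
public class UsbProbe {
    public static void listDevices(Context context) {
        UsbManager manager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
        HashMap<String, UsbDevice> devices = manager.getDeviceList();
        for (UsbDevice device : devices.values()) {
            Log.i("UsbProbe", "vendorId=" + device.getVendorId()
                    + " productId=" + device.getProductId());
        }
    }
}
```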
I placed my order last night with ground delivery scheduled to arrive on the 18th, but they overnighted it. I really love Google! They are an excellent example of a big, mature company that knows the difference between power and control.
DJ, are you ordering one?
I'll keep you posted,
Bill
The Mobile App Builder worked: I made a servo slider operate a servo on the EZ-B v4!
Naturally, there wasn't access to the depth sensors, IMU, Area Learning (floor mapping), etc.
DJ, I have some very supportive acquaintances on Google's Project Tango Developer Forum who would provide whatever you'd like to drop into your EZ-Mobile App Builder. I think this group would be very impressed with what you've done with EZ-Robot. Here's a link to the Project Tango Developers Forum if you care to join.
https://plus.google.com/communities/114537896428695886568
Here's Google's sample code on GitHub:
https://github.com/googlesamples/tango-examples-c
https://github.com/googlesamples/tango-examples-java
https://github.com/googlesamples/tango-examples-unity
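If you want a feel for the Java side before cloning those repos, here's a minimal sketch of subscribing to pose updates, following the pattern in tango-examples-java. Exact class and constant names can shift between SDK releases, so treat this as an outline rather than gospel.

```java
import android.content.Context;
import android.util.Log;

import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoEvent;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.TangoXyzIjData;

import java.util.ArrayList;

// Sketch: log device poses (position relative to where the Tango service
// started), following the tango-examples-java pattern.
public class PoseLogger {
    public static Tango startPoseUpdates(Context context) {
        Tango tango = new Tango(context);

        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_CURRENT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);

        ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
        framePairs.add(new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                TangoPoseData.COORDINATE_FRAME_DEVICE));

        tango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
            @Override
            public void onPoseAvailable(TangoPoseData pose) {
                double[] t = pose.translation; // meters from the start point
                Log.i("PoseLogger", "x=" + t[0] + " y=" + t[1] + " z=" + t[2]);
            }

            @Override
            public void onXyzIjAvailable(TangoXyzIjData xyzIj) { /* depth, unused here */ }

            @Override
            public void onFrameAvailable(int cameraId) { /* camera, unused here */ }

            @Override
            public void onTangoEvent(TangoEvent event) { /* status, unused here */ }
        });

        tango.connect(config);
        return tango;
    }
}
```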
Thanks,
Bill
I will be watching this closely. Please post info as you have it available.
I think these are using the Intel RealSense cameras, but I could be wrong. Looks very interesting for fairly close real-world identification and mapping.
I'm really curious about this! Yeah, I will be looking more into it shortly. It's such an amazing demo. I don't know how I couldn't be intrigued.
No problem,
Just found this valuable link describing the core features of the device.
Area learning is the most relevant feature to us. The device uses a 6-DOF IMU to record relative motion from a starting location, then compensates for drift by remembering what it has previously seen and comparing that to its current IMU dead reckoning. It's the virtual equivalent of compensating for floor-mapping drift by using IR sensors to keep the robot from hitting a wall: the real-time IMU combined with real-time wide-angle vision can tell when the robot is about to hit a stored virtual wall it has remembered.
Project Tango applicable to robotics (how it works)
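To make the correction idea concrete, here's a toy sketch of that loop. This is just the concept, not Tango's actual implementation (which is internal to the service): IMU deltas accumulate drift, and each visual re-recognition of a stored landmark pulls the estimate back toward the remembered position. The 0.5 gain is an arbitrary illustration; a real system would weight by confidence.

```java
// Toy illustration of drift correction: dead reckoning plus occasional
// visual fixes. Not Tango's real algorithm -- concept only.
public class DriftCorrector {
    private double x, y; // current position estimate, meters

    // Integrate relative IMU motion; this part drifts over time.
    public void applyImuDelta(double dx, double dy) {
        x += dx;
        y += dy;
    }

    // The camera recognized a previously stored landmark: blend the
    // estimate toward the position remembered for it.
    public void applyVisualFix(double landmarkX, double landmarkY) {
        final double gain = 0.5; // arbitrary for illustration
        x += gain * (landmarkX - x);
        y += gain * (landmarkY - y);
    }

    public double getX() { return x; }
    public double getY() { return y; }
}
```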
Best,
Bill
You know, I was thinking... How amazing would it be to convert this 3D depth data into a virtual world and use the camera images as the texture? You could use the camera to move around a room and save the "world", then walk through the world in virtual reality using an Oculus Rift.
Think about it... You could make your house into a game and share it with the world. Or make any room a game and share it. That's awesome.
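The core geometry step for that would be placing each depth point into a fixed world frame using the pose the device reports. A minimal sketch, assuming the pose arrives as the translation (x, y, z) and quaternion (x, y, z, w) arrays that TangoPoseData provides, and ignoring the device-to-depth-camera extrinsics for simplicity:

```java
// Sketch: transform a depth point from the device frame into the fixed
// "world" (start-of-service) frame, so scanned points can be accumulated
// into one model. Texturing from camera images would come after this step.
public class DepthToWorld {
    // Rotate point by unit quaternion (qx, qy, qz, qw), then translate.
    public static double[] toWorld(double[] p, double[] rotation, double[] translation) {
        double qx = rotation[0], qy = rotation[1], qz = rotation[2], qw = rotation[3];

        // v' = v + 2*qw*(u x v) + 2*(u x (u x v)), with u = (qx, qy, qz)
        double cx1 = qy * p[2] - qz * p[1];   // u x v
        double cy1 = qz * p[0] - qx * p[2];
        double cz1 = qx * p[1] - qy * p[0];
        double cx2 = qy * cz1 - qz * cy1;     // u x (u x v)
        double cy2 = qz * cx1 - qx * cz1;
        double cz2 = qx * cy1 - qy * cx1;

        return new double[] {
            p[0] + 2.0 * (qw * cx1 + cx2) + translation[0],
            p[1] + 2.0 * (qw * cy1 + cy2) + translation[1],
            p[2] + 2.0 * (qw * cz1 + cz2) + translation[2],
        };
    }
}
```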