EEGbiofeedback
Hi,
Just did a search through the forum and didn't see anyone using this here yet.
This is the smartest sensor I've seen so far.
https://www.youtube.com/watch?v=k43xJs3Roqg
Here's one person's use that's quite similar to my desires: https://plus.google.com/u/0/110913548697960843112
" I use the Tango for full autonomous behaviors around my house. I can start the robot in any room in my house and tell it with speech recognition to go to any other location in my house and it can safely travel to that location. I can sit on the couch, take off my shoes, and say "take them to the closet" and it will take my shoes to the closet......once it gets there it can't do anything with the shoes since I haven't given it arms yet........but the arms are very close to being added into the design.
My favorite trick so far is telling it to go wake up my kids in the morning......the robot travels to the bedrooms and starts an alarm until they get up and use the touch screen face to turn off the alarm.
I have been creating the developer SDK for all aspects of the robot in Java, but a Unity SDK will be coming soon. "
So far there are only about 3,000 of these sold and only 5 of us using them for robotics. I'll create a tutorial once I get it interfaced with the EZ-B v4 via the USB and can turn on an LED.
Best,
Bill
That's awesome - I've never seen it. Where can I get one?
Here is info on Project Tango: https://developers.google.com/project-tango/
Requires a specific model tablet (link on that page).
Alan
Yes, it is a tablet. They're coming out with a phone with built-in Project Tango sensors in June, made by Lenovo. This tablet uses the NVIDIA Tegra K1 with 192 CUDA cores packed in. Not sure if they'll be able to do that in a phone without significant heat issues.
I'm currently doing their recommended fully charge thing prior to first use.
Meanwhile, I'm looking through the Ez-B tutorials on interfacing via USB serial. I'm still short one Micro USB OTG to USB 2.0 Adapter Cable on order from Amazon.
I placed my order last night with ground delivery scheduled to arrive on the 18th but they overnighted it. I really love Google! They are an excellent example of a big mature company who knows the difference between power and control.
DJ, are you ordering one?
I'll keep you posted,
Bill
The Mobile App Builder worked to make a servo slider operate a servo on the EZ-B v4!
Naturally there wasn't access to the depth sensors, IMU, Area Learning (floor mapping) etc.
DJ, I have some very supportive acquaintances on Google's Project Tango Developer Forum who would give you whatever you'd like to drop into your Ez-Mobile App Builder. I think this group would be very impressed with what you've done with Ez-Robot. Here's a link to the Project Tango Developers Forum if you care to join.
https://plus.google.com/communities/114537896428695886568
Here's Google's sample code on git: https://github.com/googlesamples/tango-examples-c https://github.com/googlesamples/tango-examples-java https://github.com/googlesamples/tango-examples-unity
Thanks,
Bill
I will be watching this closely. Please post info as you have it available.
I think these are using the Intel RealSense cameras, but I could be wrong. Looks very interesting for fairly close real-world identification and mapping.
I'm really curious about this! Yeah, I will be looking more into it shortly. It's such an amazing demo. I don't know how I couldn't be intrigued.
No problem,
Just found this valuable link describing the core features of the device.
The area learning is the most relevant to us. It uses a 6DOF IMU to record relative motion from a starting location, then compensates for drift by remembering what it has previously seen and comparing that to its current IMU dead reckoning. So it's the virtual equivalent of compensating for floor-mapping drift by using IR sensors to keep from hitting a wall: the real-time IMU combined with real-time wide-angle vision can tell if the robot is about to hit a stored virtual wall it remembered.
Project Tango applicable to robotics (how it works)
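The drift-compensation idea above can be sketched in a few lines: dead-reckon with the IMU, and when a previously stored landmark is re-recognized, snap the pose estimate back by the observed error. This is just a toy illustration of the concept (plain Python, not the Tango API; all names and numbers are made up):

```python
# Toy sketch of IMU dead reckoning corrected by landmark re-recognition.
# This is NOT the Tango API; functions and values are invented to
# illustrate the concept.

def dead_reckon(pose, delta):
    """Integrate one IMU-style relative motion (dx, dy) into the pose."""
    return (pose[0] + delta[0], pose[1] + delta[1])

def correct_with_landmark(pose, observed, remembered):
    """On re-seeing a stored landmark, shift the pose by the difference
    between where the landmark appears in our drifted frame (observed)
    and where we originally stored it (remembered)."""
    return (pose[0] + remembered[0] - observed[0],
            pose[1] + remembered[1] - observed[1])

# True motion: four 1 m legs of a square, returning to the start.
# Each IMU reading drifts by +0.05 m in x, so dead reckoning alone
# ends up about 0.2 m away from the true start point.
pose = (0.0, 0.0)
for dx, dy in [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]:
    pose = dead_reckon(pose, (dx + 0.05, dy))
print(pose)  # drifted estimate, roughly 0.2 m off in x

# The robot re-recognizes the landmark it stored at (0, 0), which it
# now "sees" at its drifted position, and snaps its estimate back.
pose = correct_with_landmark(pose, observed=pose, remembered=(0.0, 0.0))
print(pose)  # back to (0.0, 0.0)
```

Tango's area learning does this continuously in 6DOF against a whole stored feature map rather than a single landmark, but the correction principle is the same.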
Best,
Bill
You know I was thinking... How amazing would it be to convert this 3D depth data into a virtual world and use the camera images as the texture. You could use the camera to move around a room and save the "world". Then walk through the world in virtual reality using an Oculus Rift.
Think about it... You could make your house into a game and share it with the world. Or make any room a game and share it. That's awesome
That would be really cool. Just did a search on YouTube and found someone has already started part of what you suggested. This PT app can combine multiple people's renderings and stitch them all together. Making them into a VR game is a great idea.
Here's someone's Project Tango crude scan of a big house. You can explore it in VR on this page: PT scan of a house you can explore
Just found the entire Project Tango C API and SDK: Project Tango C API
FUN!
Any info on how to integrate this would be super cool... man, I tried 3D tracking some years ago. I wanted to integrate a 3D model into a real-life shot, and it was so difficult to get the jitter out of it and the lighting right! I was always thinking the most you could get out of VR would be to re-texture your environment in real time and add 3D elements! Just imagine turning your flat into a spaceship, a beach, or just hanging out with friends from all around the world!
With robotics we could finally have the option to specifically point out any location within an ever changing environment...this type of tracking is so fast it is unbelievable!
Good digging there @EEGbiofeedback, guess there is something to be found!
Is it me, or has there just been this explosion of technology? I remember sitting around for ten years waiting for something to happen (sans EZ-B). Or maybe it's people thinking differently about the use of technology. I know this isn't new; I had ReconstructMe from years ago and I have Skanect. But the way the software splices the data between devices is amazing. The tracking of a room full of people is outstanding. Let's get something going with EZ-B!
:)
What do you think the route would be for Ez B integration? a) through the plugin building process: https://synthiam.com/Tutorials/UserTutorials/146/1
b) Ask someone in the Google developer community to output and continuously update all sensor data to the device's built-in drive that holds pictures etc. An ini file could be read and parsed via Ez Script. (This option is blasphemy among trained engineers, but it's really easy to interface with when one doesn't need microsecond response times and timing.)
c) an easier way?
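For a sense of what option (b) might look like, here's a minimal sketch: one process periodically dumps pose values into an INI file, and another polls and parses it. The file name and keys are hypothetical; on the robot side the reader would be Ez Script rather than Python.

```python
# Minimal sketch of option (b): a bridge process writes sensor values
# to an INI file, and the control loop polls and parses it. File name
# and keys are made up for illustration.
import configparser
import os
import tempfile

def write_pose(path, x, y, heading):
    """Writer side (would run on the tablet): dump the latest pose."""
    cfg = configparser.ConfigParser()
    cfg["pose"] = {"x": str(x), "y": str(y), "heading": str(heading)}
    with open(path, "w") as f:
        cfg.write(f)

def read_pose(path):
    """Reader side (the control loop): parse the latest pose."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    section = cfg["pose"]
    return (float(section["x"]), float(section["y"]),
            float(section["heading"]))

path = os.path.join(tempfile.gettempdir(), "tango_pose.ini")
write_pose(path, 1.25, -0.5, 90.0)
print(read_pose(path))  # (1.25, -0.5, 90.0)
```

Note there's no file locking and any updates between polls are lost, which is exactly why engineers frown on it; for slow, non-critical telemetry it works fine.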
Here's a video someone did with its camera matching capabilities:
In other news, I did an extensive dead reckoning test where I had previously shown the device an environment. It was accurate within 3cm of the starting location. This would make self-charging very easy.
I have a feeling this device is just a few projects below the top of DJ's fun todo list. But I'll see how far I can take it in the meantime.
Best,
Bill
I have hidden my credit card from myself to prevent impulsively buying one of these tablets...
Alan
I think A. Does the device have a way to output information (serial or other)? It looks like the USB 3 port is only active when docked. I suppose you could transmit the data via WiFi, but that would be pretty slow for the amount of data that would need to be transferred. If I read this right, it looks like you would have to design the robot to also contain the dock, which might not be that big of a deal.
I just skimmed things so it is entirely possible that I missed something.
Ooooh... I found Alan's credit card. Where can I order?
This is one that I will wait on until someone else writes a plugin for it. My hands are full and this would just take me down another path that I don't have time to work on right now. I will be digging into the SDK some though (as I can't resist seeing what is out there). My hope is that they will open it up to other hardware before too long and allow something like this to run on an on-board computer instead of this device.
Quick! Buy it before Alan catches up with you: https://store.google.com/product/project_tango_tablet_development_kit
I didn't have the money for it either but I committed my first born child to Google. It's a new program they offer if techiness is in your DNA.
Just found this:
Regarding waiting for someone else to build a plugin, that was my intention also but we all know what path is paved with good intentions.
Bill
This is exactly why I made it public that I wouldn't be working on this one. I have enough plugins to complete and just decided that the others will never get finished if I go down this rabbit hole...
Regarding interfacing with it: I've seen Bluetooth, a micro USB OTG to female USB cable, and I connected with it through WiFi with EZ-B's Android interface builder.
I wasted a day searching for "Project Tango interfacing, USB, bluetooth connectivity". Then I went, "Duh, it's an Android OS." There are thousands of links and how-tos out there. The differences are adding the portions from the SDK for loading the Area Learning files etc.
Cheers,
Bill...
Here's the bot body I've begun installing the Ez B / Project Tango (PT) brain into. I'm going to start by making it just remote controlled via that super cool iPhone Sensor Streamer app interface DJ added to ARC. Then when someone breaks the PT code, I'll add that device. I still have about 2 months left of construction of the new home / office that this thing will interface with.
It's 19x16x6 inches or 48.26x40.64x15.24 cm
This is that cart project I've been planning for about two years.
Best,
Bill
@Bill,
Did you find any information or videos using the Tango with Robots ?
I found these 2 links (not positive views):
https://www.reddit.com/r/computervision/comments/44o2vl/psa_do_not_buy_a_project_tango_dev_kit/
https://np.reddit.com/r/oculus/comments/45bl1i/problems_and_limitations_of_google_tango_slam/
I've been keeping an eye on the project and Google group. Added tango to my google keyword search so I get updates. The mental barrier I'm having at the moment is finding a practical application for robotics considering its form factor, availability, price, and development challenges.
Although I really, really want to see a plugin utilize the data. And the hacker in me wants to jump in and play with this thing. The responsible adult with a list of immediate ezrobot priorities says otherwise.
This project needs hackers like you to identify use case scenarios.
The tango v2 will be one to take seriously...
And that assumes that this is not one of those Google projects that lives just long enough to be useful and then they kill it. If they don't find a way to monetize this beyond selling special purpose tablets, it may not continue.
I do wonder how much of the same technology is being used in their self-driving car project(s). If there is a lot of overlap, that could keep it alive even for a small development community.
Alan
It seems to be an awesome piece of tech... I don't think this will be ditched, but it could also be outdated soon, or there might be a lot of updates to the product!
It's still some sort of a developer's edition right now, but it is a big leap in a direction which all of us would like to be heading!
Still super stoked!
If the black tablet wasn't out of stock I might not have hidden my credit card. My Nexus 7 is having some issues so I might need a new tablet anyway, but I don't want a white one.
Hahahaha Alan you are the best!
Don't start searching... you might find it!
Makes sense, DJ. You have that whole office build thing going on. We're going on two years of delays for our office/home. Just installed the staircase Tuesday. Whew, was that hair-raising! It came from Wisconsin to CA in two 1-ton pieces and had to be corkscrewed in through our sky roof. Everything fit perfectly within millimeters. CHALK ONE UP FOR ENGINEERS!
All right, I'll give it my best shot at building a plugin. I'm off to learn MS Developer Studio Express. If I can wrap my head around Apple's Xcode, I should be able to do anything. Xcode was about as steep of a learning curve as 6502 assembly on my Atari 400.
Regarding others using PT for robots, this guy has the coolest completed consumer level robot I've seen. He built his own robot controller and is threatening to create a PT robotic SDK, in all his spare time. He's a very busy robotics engineer so that's likely to take some time. https://plus.google.com/110913548697960843112
Here are some others: https://plus.google.com/u/0/s/project%20tango%20robot/top
Alan, PT has a slot for a SIM card so I would assume it could be used as a phone. I don't see anyone using them for that yet.
Wish me luck,
Bill
Good luck Bill....if you can accomplish it will definitely be a piece of hardware that I would be willing to invest in!
Ahh, Johnny Lee, famous for his work as a student using the Wii to create 3D views and other tracking hacks.
He's one sharp dude!
I hope they don't kill this one. I have a feeling the Lenovo phone they're releasing next month is going to have some big problems with heat and battery life. The 192 CUDA core processor is a power pig.
I'm setting up MS Visual Studio Community 2015. It's taking most of the day. MS doesn't make things easy.
Cheers,
Bill
OK, now the controller is working with the custom movement panel.
I'm controlling it with the keyboard. I'm also powering it from the 5v BEC from the Sabertooth and it seems to be working fine.
I love your SuperDroid base. It's built seriously tough... wish I could afford one. This is a great project you've got going...
Lenovo just announced that they will be making the first Project Tango phone, available unlocked at BestBuy for $499 in September (even without a carrier, this might be a better Robot solution than a tablet due to the size, although it is going to be a big phone - 6.4" display). Having the 4G and ability to create a hotspot might actually be really useful in a robot.
http://www.droid-life.com/2016/06/09/lenovo-phab2-pro-project-tango/
Alan