Thumbnail

Intel Realsense T265

Use the Intel Realsense T265 tracking camera for way-point robot navigation

How To Add This Control To Your Project
  1. Make sure you have the latest version of ARC installed.
  2. Select the Get button in this page to download the archive file.
  3. Double click the downloaded archive file to execute the installer.
  4. The installer will add this control to ARC.
  5. Load ARC and press the Project -> Add Control button from the menu.
  6. Choose the Navigation category tab.
  7. Press the Intel Realsense T265 icon to add the control to your project.

Manual

With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 is designed to deliver reliable tracking performance for your robot. This user-friendly ARC robot skill provides an easy way to use the T265 for way-point navigation.

The T265 combined with this robot skill provides your robot with a SLAM (Simultaneous Localization and Mapping) solution. It allows your robot to construct a map of an unknown environment while simultaneously keeping track of its own location within that environment. Before the days of GPS, sailors would navigate by the stars, using their movements and positions to successfully find their way across oceans. VSLAM uses a combination of cameras and an Inertial Measurement Unit (IMU) to navigate in a similar way, using visual features in the environment to track its way around unknown spaces with accuracy. All of these complicated features are taken care of for you in this ARC robot skill.


Update Firmware
The sensor may require a firmware update.

1) Visit the Realsense GitHub page, scroll to the bottom of the page, and install Intel.Realsense.Viewer.exe from here: https://github.com/IntelRealSense/librealsense/releases/latest
User-inserted image


2) Click the hamburger settings icon and select Install Recommended Firmware
User-inserted image




Robot Skill Window
The skill has a very minimal interface because it pushes data into the ARC Navigation Messaging System (NMS) and is generally used by other robot skills (such as The Navigator).
User-inserted image


1) Drop-down to select Realsense device by the serial number. This is useful if there are multiple devices on one PC.

2) START/STOP the Intel T265 connection. 

3) The confidence of the tracking status between 0 (low) and 3 (highest). In a brightly lit room with many points of interest (not just white walls), the tracking status will be high. Tracking will be low if the room does not have enough light and/or detail for the sensor to track.

4) Log text display for errors and statuses.

Config Menu
User-inserted image


1) Mounted Radius Offset (mm) is the distance in mm of the T265 from the center of the robot. A negative number is toward the front of the robot, and a positive number is toward the rear. The sensor must be facing 0 degrees toward the front of the robot. The sensor must not be offset to the left or right of the robot.

2) Enable Video Stream will send the fisheye b&w video from the T265 to the selected camera device. The selected camera device robot skill must have Custom specified as the input device. Also, the camera device will need to be started to view the video.

3) Distortion Correction will use a real-time algorithm to correct the fisheye lens, which isn't always needed and is very CPU intensive.
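
To illustrate what the Mounted Radius Offset does, here is a minimal Python sketch of the trigonometry. This is not ARC code; the function name and units are made up for illustration, and the sign convention follows the description above (a negative offset places the sensor toward the front of the robot):

```python
import math

# Sketch only: shift the sensor's reported position back along the heading
# so the pose describes the robot's center of rotation rather than the camera.
# Convention (from the config description): negative offset = sensor mounted
# toward the FRONT of the robot.
def robot_center_pose(sensor_x_mm, sensor_y_mm, heading_deg, offset_mm):
    h = math.radians(heading_deg)
    return (sensor_x_mm + offset_mm * math.cos(h),
            sensor_y_mm + offset_mm * math.sin(h),
            heading_deg)
```

For example, with the sensor mounted 160 mm ahead of the center of rotation (offset -160), a reported sensor pose of (0, 0) at heading 0 degrees corresponds to a robot center at (-160, 0). Without this correction, spinning in place traces a circle (the "donut" effect mentioned later in the comments).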

Video Demonstration
Here's a video of the Intel RealSense T265 feeding The Navigator skill for way-point navigation.



ARC Navigation Messaging System
This skill is part of the ARC Navigation Messaging System (NMS). We encourage you to read more about the messaging system and its available skills HERE. This skill sits in Level #3 Group #2 of the diagram below and contributes telemetry positioning to the cartesian positioning channel of the NMS. Combine this skill with Level #3 Group #1 skills for obstacle avoidance; for Level #1, The Navigator works well.
User-inserted image





Environments
The T265 will work both indoors and outdoors. However, bright direct light (sunlight) and darkness will affect performance. Much like our eyes, the camera is susceptible to glare and a lack of resolution in the dark. Because the camera's visual data is combined with the IMU, the camera must have reliable visible light. If the camera cannot detect the environment, the algorithm will be biased toward the IMU and will experience drift, which greatly reduces the sensor's accuracy.

Screenshot
Here is a screenshot of this skill combined with The Navigator in ARC while navigating through a room between two way-points.
User-inserted image



Starting Location
The T265 does not include a GPS/Compass or any ability to recognize where it is when initialized. This means your robot will have to initialize from a known location and direction to reuse saved maps. Make sure you mark the spot on the ground with masking tape where the robot starts from.

How To Use This
1) Connect your Intel RealSense T265 camera to the computer's USB port

2) Load ARC (version must be >= 2020.12.05.00)

3) Add this skill to your project

4) Now we'll need a Navigation skill. Add The Navigator to your project

5) Press START on the Intel RealSense skill and the skill will begin mapping your robot's position

How Does It Work?
Well, magic! Actually, the camera is quite interesting: it breaks the world down into a point cloud of features and remembers those visual features so it can re-align itself on its internal map. The onboard processor is what Intel calls a VPU (Vision Processing Unit). Here's a video of what the camera sees.
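
The re-alignment idea can be illustrated with a toy Python sketch. This is purely conceptual, not Intel's actual algorithm or data format: the camera stores compact descriptors of visual features along with where they were seen, and later matches what it currently sees against that stored map to re-align itself.

```python
# Conceptual sketch of feature-based relocalization (NOT Intel's algorithm).
# Binary descriptors are compared by Hamming distance (count of differing bits).
def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical map: descriptor -> (x, y) where that feature was first seen.
landmark_map = {
    0b10110010: (1.5, 0.0),
    0b01101101: (3.2, 2.1),
}

def relocalize(observed_descriptor: int):
    # Pick the stored landmark whose descriptor is closest to the observation.
    best = min(landmark_map, key=lambda d: hamming(d, observed_descriptor))
    return landmark_map[best]
```

A slightly noisy observation of a known feature still matches the right landmark, which is how the device can snap back onto its map after drifting.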

PRO
Canada
#80   — Edited
I have spent a bit of time playing with this. It is really good! I like the way it follows the path you set to get around obstacles. I do get a bit of drift the longer it is away from home.

Question: is there a way to tell the robot it is currently in a certain location when it drifts? I start in the docking station and turn Realsense on. Then I tell the roomba to move back about 1 foot from the docking station and then give waypoint voice commands. It knows where it is because I started realsense when it was docked. Now it runs around my house but starts to drift. I guess I could send it back to a point in front of the docking station, dock using ControlCommand Seek Dock on the roomba, then stop realsense, start it again, back up one foot again, etc., but it would be good if there was a way to tell realsense "I am here, please reset your tracking location to here now."

We talked about a glyph as being a way to fix position with the camera. Maybe if we had 2 cameras working we could triangulate and get a position. I thought about some lines on the ground in various locations that the robot could follow; when it crosses one it starts to follow it and finds a dash code etc. to recalibrate, using a sensor like this https://cnclablb.com/adjustable-line-tracking-sensor-module.html   But I think what I need is a way to send a command to Navigator to say HELLO I AM HERE. Maybe something like a ControlCommand("The Navigator", "ResetWayPoint", "Kitchen");
PRO
Synthiam
#81  
@Nink, I mentioned previously that I'll add a ControlCommand() for you - it is on the todo list. However, I don't think it's going to help you, and it will cause much larger challenges to face. But that being said, I'm interested in how you'll overcome them. 

You see, the T265 requires a cartesian X/Y and Heading Degree for mapping. This means you will have to specify the absolute exact coordinate (including heading degree). I'm sure there's a way you can calculate it with a glyph size and such. But be forewarned that it won't be as simple as you may think.

*PS Note: moved the above comment into this thread because it's regarding the Intel T265 and not a robot build.
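
For anyone attempting the glyph approach, the math might look like this sketch. All names here are hypothetical (not an ARC API); it assumes the glyph's map position is known, that distance and bearing to the glyph have been estimated from its apparent size, and that the robot's heading has already been recovered from the glyph's orientation:

```python
import math

# Sketch only: recover the robot's absolute map pose from a glyph whose
# map position (glyph_x, glyph_y) is known. bearing_deg is the direction of
# the glyph measured relative to the robot's current heading.
def pose_from_glyph(glyph_x, glyph_y, distance, bearing_deg, robot_heading_deg):
    # The glyph lies `distance` away along (heading + bearing); subtract that
    # vector from the glyph's known position to get the robot's position.
    a = math.radians(robot_heading_deg + bearing_deg)
    return (glyph_x - distance * math.cos(a),
            glyph_y - distance * math.sin(a),
            robot_heading_deg)
```

The resulting X/Y/heading is the absolute coordinate the T265 reset would need, which is exactly why small errors in the measured glyph distance or heading make this harder than it first appears.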
PRO
Canada
#82  
OK thanks @DJ. BTW I tried telling the robot to go to a spot close to the dock, then seek the dock, then stop the realsense camera and start Realsense again to recalibrate, but after a couple of times doing this it will just hang. If I unplug realsense, wait 30 seconds, and plug it back in again, it will work when you tell it to start again.  

Question: can you disable and enable a USB port in ARC to reinitialize the driver, or can you call a batch file to potentially use something like devcon to disable the port in Windows and then re-enable it so it resets each time you stop and start it?  


 User-inserted image
PRO
Portugal
#84  
Bye the way, i have a Ryzen 5 with 16GB onboard my roomba and the T265 does not drift anymore.
PRO
Canada
#85  
Thanks, I am not sure how to execute that code from ARC. I did a google and found I can disable and enable with the PNPUTIL command, but Windows wants to reboot in between (I guess they forgot what the PnP in PNPUTIL stands for).
Example for my device:
 pnputil /disable-device "USB\VID_8087&PID_0B37\952322111114" 
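
As an alternative, the devcon utility mentioned above can often disable and re-enable a device without a reboot (whether a reboot is required depends on the driver). A rough batch sketch, using the instance ID from the post above; devcon ships with the Windows Driver Kit and needs an elevated prompt:

```shell
:: Sketch only: power-cycle the T265's USB driver with devcon.
:: The "@..." instance ID below is the one posted above; adjust for your device.
devcon disable "@USB\VID_8087&PID_0B37\952322111114"
timeout /t 30 /nobreak
devcon enable "@USB\VID_8087&PID_0B37\952322111114"
```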

I could always just create a USB hardware add-on that allows me to turn it on and off with a relay (this is actually starting to sound like the easiest method).

With regards to drift, I played with the offset for a while and concluded -160 was a good level (when I go into a spin it doesn't create a donut), but yeah, drift is still a problem.   

Ryzen 5 must chew some power. What are you using for a battery? I have blown out 4 LiPos running all this stuff (my 4S is now a 3S and my 3S's are now 2S's). I have now switched back to an old sealed lead battery and plug in when docked so the computer is always on. This seems to work OK and I get about 2 hours of play time. I am going to add some springs and magnets this week so it will auto-charge (just a matter of finding parts).  

I really want to get this as autonomous as possible.
PRO
Portugal
#86  
Great idea with the relay. The code is just an example I found; I think it is written in C++.
My setup with this mboard is recent and I am still in the build phase. I use 2 Xunzel SOLARX-8 gel batteries in parallel.
PRO
Canada
#87  
heh, I found out how to fix the camera hanging when you turn Realsense off and back on: put a 1-minute delay in the script between turning it off and on. Sometimes it recovers quickly, sometimes slowly, but 1 minute seems to work all the time for me. More testing.  

I need to work on distance sensors, maybe laser sensors, as these ultrasonics are garbage; they will smash into something or see something that is not there. Also, I think we need a way to navigate with individual sensors: if something is toward the left, turn right; toward the right, turn left; directly in front, stop. When you have sensors directly on the side you really don't care as much about the distance from the wall beside you. Also, when you are going fast they don't work as well as when going slow. 

I also really need to add a panic button using an Arduino accelerometer to shut off the roomba battery if it starts to go crazy. That seems to happen a lot :-)
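
The left/right/front rule described above can be sketched as a small decision function. This is a hypothetical Python illustration, not ARC code; the function name and threshold are made up:

```python
# Sketch only: map three distance readings (mm) to a simple avoidance action.
# Priorities: an obstacle dead ahead always wins; otherwise steer away from
# the closer side.
def avoid(left_mm, front_mm, right_mm, threshold_mm=300):
    if front_mm < threshold_mm:
        return "stop"
    if left_mm < threshold_mm:
        return "turn_right"
    if right_mm < threshold_mm:
        return "turn_left"
    return "forward"
```

In a real script the returned action would be translated into Movement calls, with the NMS navigation paused while the maneuver runs.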
PRO
Portugal
#88   — Edited
Nink, I use the roomba's built-in IR sensor array to avoid obstacles coming from the left, right, or front and try to navigate past them. It is very crappy code and is still in "beta", but it works.

Code:

import time

cleft = 0
cright = 0
fleft = 0
fright = 0
bleft = 0
bright = 0

while (True):

    if getVar("$RoombaLightBumperCenterLeft")==1:
        Navigation.SetNavigationStatusToPause();
        print "Bumper Center Left";
        cleft += 1;
        print cleft
        #time.sleep(2);
        if cleft == 2:
            Movement.Right(30,3500);
            Movement.Forward(30,1500);
            Movement.Stop();
            Navigation.SetNavigationStatusToNavigating();
            cleft = 0
            cright = 0
            fleft = 0
            fright = 0
            bleft = 0
            bright = 0
            print cleft

    if getVar("$RoombaLightBumperCenterRight"):
        Navigation.SetNavigationStatusToPause();
        print "Bumper Center Right";
        cright += 1;
        print cright
        #time.sleep(2);
        if cright == 2:
            Movement.Left(30,3500);
            Movement.Forward(30,1500);
            Movement.Stop();
            Navigation.SetNavigationStatusToNavigating();
            cleft = 0
            cright = 0
            fleft = 0
            fright = 0
            bleft = 0
            bright = 0
            print cright

    if getVar("$RoombaLightBumperFrontLeft")==1:
        Navigation.SetNavigationStatusToPause();
        print "Bumper Front Left";
        fleft += 1;
        print fleft
        #time.sleep(2);
        if fleft == 2:
            Movement.Right(30,3000);
            Movement.Forward(30,1500);
            Movement.Stop();
            Navigation.SetNavigationStatusToNavigating();
            cleft = 0
            cright = 0
            fleft = 0
            fright = 0
            bleft = 0
            bright = 0
            print fleft

    if getVar("$RoombaLightBumperFrontRight")==1:
        Navigation.SetNavigationStatusToPause();
        print "Bumper Front Right";
        fright += 1;
        print fright
        #time.sleep(2);
        if fright == 2:
            Movement.Left(30,3000);
            Movement.Forward(30,1500);
            Movement.Stop();
            Navigation.SetNavigationStatusToNavigating();
            cleft = 0
            cright = 0
            fleft = 0
            fright = 0
            bleft = 0
            bright = 0
            print fright

    if getVar("$RoombaLightBumperRight")==1:
        Navigation.SetNavigationStatusToPause();
        print "Bumper Right";
        bright += 1;
        print bright
        #time.sleep(2);
        if bright == 2:
            Movement.Left(30,2000);
            Movement.Forward(30,1500);
            Movement.Stop();
            Navigation.SetNavigationStatusToNavigating();
            cleft = 0
            cright = 0
            fleft = 0
            fright = 0
            bleft = 0
            bright = 0
            print bright

    if getVar("$RoombaLightBumperLeft")==1:
        Navigation.SetNavigationStatusToPause();
        print "Bumper Left";
        bleft += 1;
        print bleft
        #time.sleep(2);
        if bleft == 2:
            Movement.Right(30,2000);
            Movement.Forward(30,1500);
            Movement.Stop();
            Navigation.SetNavigationStatusToNavigating();
            cleft = 0
            cright = 0
            fleft = 0
            fright = 0
            bleft = 0
            bright = 0
            print bleft
    else:
        Navigation.SetNavigationStatusToNavigating();
        time.sleep(1);
        cleft = 0
        cright = 0
        fleft = 0
        fright = 0
        bleft = 0
        bright = 0
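
Since the six branches above differ only in which variable they watch, which way they turn, and for how long, the logic could be collapsed into a lookup table. A sketch in plain Python (the durations mirror the script above; `plan_maneuver` is a hypothetical helper, not an ARC call):

```python
# Sketch only: table-driven version of the repeated bumper branches above.
# Each Roomba light-bumper variable maps to (turn direction, turn milliseconds).
MANEUVERS = {
    "$RoombaLightBumperCenterLeft":  ("Right", 3500),
    "$RoombaLightBumperCenterRight": ("Left",  3500),
    "$RoombaLightBumperFrontLeft":   ("Right", 3000),
    "$RoombaLightBumperFrontRight":  ("Left",  3000),
    "$RoombaLightBumperRight":       ("Left",  2000),
    "$RoombaLightBumperLeft":        ("Right", 2000),
}

def plan_maneuver(bumper_var):
    # Returns (direction, milliseconds) for a tripped bumper, or None.
    return MANEUVERS.get(bumper_var)
```

The main loop would then iterate over MANEUVERS, count consecutive trips per bumper, and execute the looked-up turn, removing the six near-duplicate blocks.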
PRO
Canada
#89  
wow, great idea. I didn't even know the roomba had a sensor array; I thought it just had a microswitch-activated bumper. Cool.
PRO
Canada
#90  
Spent some time playing with the sensors on the Roomba - wow, what a wealth of data!  
Makes you wonder why the original Roomba smashes into the furniture with all these sensors on board.  

Thanks for the sample code and your idea to navigate around stuff using the onboard sensors @proteusy, very cool. I had a play, and if you run around really slowly you can avoid some stuff, but it still managed to knock over my green screen, dive under the treadmill, smash into my chair, and get tangled up in the power cords under the desk. How a Tesla navigates at 100 km an hour blows me away.  

That said, I think I need to add a bunch more sensors, a LIDAR, and lots of code to make sense of all this data. If the wheels are spinning and it is not moving, I am probably tangled up or have smashed into something.
PRO
Synthiam
#91  
Updated the manual to include instructions for upgrading the firmware.
PRO
Synthiam
#92  
Updated with a dropdown to select from multiple realsense devices.
PRO
Synthiam
#93   — Edited
Fixed Intel's bug so the device can be started and stopped without needing to restart the project.

Also, this robot skill will only display tracking cameras in the drop-down.
PRO
Synthiam
#94   — Edited
- New ControlCommand for resetting position and setting specific positions (i.e. recalibrate)

- Bug fix and small performance improvement
PRO
Synthiam
#96  
- Added a button to reset position to zero rather than having to use the control command
PRO
USA
#97  
@DJ,

Do you think if I place the T265 in the chest of an InMoov it will work correctly, or does it need to be at ground level?


Thank you, Cheers
PRO
Synthiam
#98  
I’m guessing it will work fine. It can’t have anything blocking it, so no hands can be in front. Otherwise it won’t know where it is when the robot drives around.
PRO
USA
#99  
Thank you DJ, I will try it soon.