Intel Realsense T265

Use the Intel Realsense T265 tracking camera for way-point robot navigation

How To Add This Control To Your Project
  1. Make sure you have the latest version of ARC installed.
  2. Select the Get button on this page to download the archive file.
  3. Double click the downloaded archive file to run the installer.
  4. The installer will add this control to ARC.
  5. Load ARC and press the Project -> Add Control button from the menu.
  6. Choose the Navigation category tab.
  7. Press the Intel Realsense T265 icon to add the control to your project.


With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 is designed to provide the tracking performance your robot needs. This user-friendly ARC robot skill provides an easy way to use the T265 for way-point navigation.

*Note: this skill requires ARC 2020.12.12.00 or greater

The T265 combined with this robot skill provides your robot a SLAM, or Simultaneous Localization and Mapping, solution. It allows your robot to construct a map of an unknown environment while simultaneously keeping track of its own location within that environment. Before the days of GPS, sailors would navigate by the stars, using their movements and positions to successfully find their way across oceans. VSLAM uses a combination of cameras and Inertial Measurement Units (IMUs) to navigate in a similar way, using visual features in the environment to track its way around unknown spaces with accuracy. All of these complicated features are taken care of for you in this ARC robot skill.
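To make use of the pose the camera tracks, navigation skills typically reduce the 6-DOF orientation (reported as a quaternion) to a heading angle. Below is a minimal sketch of that conversion, assuming a Z-up axis convention; the T265's native coordinate frame differs, so in practice the axes may need remapping depending on how the sensor is mounted.

```python
import math

def quaternion_to_yaw(w, x, y, z):
    """Convert a unit quaternion to a yaw (heading) angle in radians.

    Standard yaw extraction for a Z-up convention; remap axes as needed
    for the T265's native frame.
    """
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Identity quaternion: facing straight ahead (0 radians)
print(quaternion_to_yaw(1.0, 0.0, 0.0, 0.0))  # 0.0

# Quaternion for a 90-degree turn about the vertical axis
yaw = quaternion_to_yaw(math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(math.degrees(yaw))  # 90.0
```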

Robot Skill Window
The skill has a very minimal interface because it pushes data into the NMS (Navigation Messaging System) and is generally used by other robot skills (such as The Navigator).
User-inserted image

1) START/STOP the Intel T265 connection. Due to a bug in Intel's SDK, the sensor cannot be stopped and restarted in the same session; the only way to restart the T265 connection is to close and reload ARC. We understand this is unconventional and would normally be fixed. However, Intel has stated that they "stopped development and any additional bug fixes on the T265."

2) The confidence of the tracking status between 0 (low) and 3 (highest). In a brightly lit room with many points of interest (not just white walls), the tracking status will be high. Tracking will be low if the room does not have enough light and/or detail for the sensor to track.
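For scripts that react to this value, a tiny helper that maps the confidence codes to labels can be handy. The labels below are illustrative only, following the 0 (low) to 3 (highest) scale described above:

```python
# Illustrative labels for the T265 tracking confidence codes (0-3).
# These names are an assumption, not taken from the skill's interface.
CONFIDENCE_LABELS = {0: "Failed", 1: "Low", 2: "Medium", 3: "High"}

def describe_confidence(code):
    """Return a human-readable label for a tracker confidence code."""
    return CONFIDENCE_LABELS.get(code, "Unknown")

print(describe_confidence(3))   # High
print(describe_confidence(99))  # Unknown
```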

3) Log text display for errors and statuses.

Config Menu
User-inserted image

1) Radius Offset (mm) is the distance in mm of the T265 from the center of the robot. A negative number is toward the front of the robot, and a positive number is toward the rear. The sensor must be facing 0 degrees toward the front of the robot. The sensor must not be offset to the left or right of the robot.
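The correction this offset enables is simple trigonometry: given the robot's heading and the sensor's distance from the center along the forward axis, the reported position is translated back to the robot's center. The skill's actual implementation is not published; the sketch below follows the sign convention described above, where a negative number means the sensor is toward the front.

```python
import math

def sensor_to_robot_center(x_mm, y_mm, heading_rad, radius_offset_mm):
    """Translate a sensor-reported position to the robot's center.

    A negative radius_offset_mm means the sensor is mounted toward the
    front of the robot, so the center lies behind the reported position.
    """
    cx = x_mm + radius_offset_mm * math.cos(heading_rad)
    cy = y_mm + radius_offset_mm * math.sin(heading_rad)
    return cx, cy

# Sensor mounted 100 mm in front of center (offset -100),
# robot facing along +X: the center is 100 mm behind the sensor.
print(sensor_to_robot_center(1000.0, 0.0, 0.0, -100.0))  # (900.0, 0.0)
```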

Video Demonstration
Here's a video of the Intel RealSense T265 feeding The Navigator skill for way-point navigation

ARC Navigation Messaging System
This skill is part of the ARC navigation messaging system. We encourage you to read more about the messaging system HERE to understand the available skills. This skill is in group #2 of level #3 in the diagram below, and it contributes telemetry positioning to the cartesian positioning channel of the NMS. Combine this skill with Level #3 Group #1 skills for obstacle avoidance; for Level #1, The Navigator works well.
User-inserted image

The T265 will work both indoors and outdoors. However, bright direct light (sunlight) and darkness will affect performance. Much like our eyes, the camera is susceptible to glare and a lack of resolution in the dark. Because the camera's visual data is combined with the IMU, the camera must have reliable visible light. If the camera cannot detect the environment, the algorithm will be biased toward the IMU and will experience drift, which greatly degrades the sensor's accuracy.

Here is a screenshot of this skill combined with The Navigator in ARC while navigating through a room between two way points.
User-inserted image

Starting Location
The T265 does not include a GPS/Compass or any ability to recognize where it is when initialized. This means your robot will have to initialize from a known location and direction to reuse saved maps. Make sure you mark the spot on the ground with masking tape where the robot starts from.

How To Use This
1) Connect your Intel RealSense T265 camera to the computer's USB port

2) Load ARC (version must be >= 2020.12.05.00)

3) Add this skill to your project

4) Now we'll need a Navigation skill. Add The Navigator to your project

5) Press START on the Intel RealSense skill and data will begin mapping your robot's position

How Does It Work?
Well, magic! Actually, the camera is quite interesting: it breaks the world down into a point cloud of features and remembers those visual features so it can re-align itself on its internal map. The processing is done on what Intel calls a VPU (Vision Processing Unit). Here's a video of what the camera sees.

We'll be creating tutorials and examples in the future.

Ok, I was just marking major destinations, not the path to destinations. I guess I need to spend some more time mapping.
Ha ya. If you try to guess coordinates it'll lead to disaster lol. Drive the robot and watch the trajectory.

@DJ have you seen this on the T265 Intel page?

What is wheel odometry and how does it help T265 navigate?

Wheel odometry is the use of sensors to measure how much a wheel turns; it can be used to estimate changes in a wheeled robot's position. The T265 has wheel odometry support built in, allowing it to use the data from these sensors to refine the position data of the robot. Providing robotic wheel odometer or velocimeter data over USB to TM2 will make the tracking much more robust on wheeled robots, which otherwise can experience many tracking failures. We consider odometer input to be a requirement for robust tracking on wheeled robots.
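Wheel odometry by itself is plain dead reckoning, which is why it drifts: any error in the wheel diameter, tick count, or wheel slip accumulates into the pose. A sketch of the standard differential-drive update, using hypothetical robot dimensions (not from any datasheet), makes that clear:

```python
import math

# Hypothetical robot geometry (assumptions for illustration only)
TICKS_PER_REV = 360          # encoder ticks per wheel revolution
WHEEL_DIAMETER_MM = 70.0     # drive wheel diameter
WHEEL_BASE_MM = 230.0        # distance between the two drive wheels
MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Update a differential-drive pose estimate from encoder tick counts."""
    d_left = left_ticks * MM_PER_TICK
    d_right = right_ticks * MM_PER_TICK
    d_center = (d_left + d_right) / 2.0          # distance the center moved
    d_theta = (d_right - d_left) / WHEEL_BASE_MM  # change in heading (rad)
    # Advance along the average heading during the step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Both wheels advance equally: straight-line motion along the heading
x, y, theta = odometry_step(0.0, 0.0, 0.0, 100, 100)
print(round(x, 1), y, theta)
```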
#44   — Edited
I think what Intel is implying is that their sensor can be used to calibrate wheel odometry on robots.

Once you get the calibration right on a specific surface or in a specific environment, you can remove the sensor.

I envision you don’t want to put a $250 sensor on every robot doing minor tasks like vacuuming but you could calibrate it with the T265 and then send it on its way.
#45   — Edited
* Moved your comments to correct thread 

In the NMS are some additional reading links that talk about the challenges with navigation and pose. I highly encourage you read them prior to this discussion. 

Intel is very clear in that paragraph you copied. It clearly says the T265 is compatible with wheeled robots. It does NOT state that it works with wheel encoders. It can correct wheel odometry, but it cannot accept input from a wheel encoder. It can be used to correct a wheel-encoder robot's pose because wheel-encoder robots have terrible accuracy, which is why the T265 exists.

I really recommend reading the additional links in the NMS page to catch up on the difference between a wheeled robot and a wheel encoder. And also the challenges of the two etc

*Edit: the API seems to have an older interface that allows wheel odometry input. The early release of the T265 does mention you can push wheel odometry data into the API and it gets fused with the T265. So that must be an early attempt at converting the old-school wheeled ROS guys to VSLAM.
#46   — Edited
Sorry, my English with technical terms is terrible; I thought that wheel odometry meant a wheel with an encoder, as they state: "Wheel odometry is the use of sensors to measure how much a wheel turns".
Oh and Happy New Year Everyone!!
Your English is great - don't think I ever noticed anything else:)
Hey there Nink, my Roomba does more than just vacuum; it keeps the raccoons and squirrels terrified of knocking over the patio garbage can, LoL!
Hi DJ, I moved the T265 camera position to the edge of the Roomba, and when I spin 360 degrees in place I get a circle like in the pic. Could you make an offset for the camera as you did with the US sensors?
User-inserted image
It would be interesting to see how accurate the orientation data is in order to do this. Playing with the sensor, it does have orientation yaw, pitch, and roll. X, Y, and Z axis tracking would also be nice for drone tracking and robot arms. I guess you could always mount the sensor in the centre of the Roomba on your pole and reduce the offset.

I would love if we had D435 support. Watch this video.

The long term goal would be to map the room with the T265 and D435. Use object recognition to identify and find the object. Use T265 data to go to the object and use data from the D435 to calculate the exact location and orientation of the object, then use inverse kinematics to calculate how to pick it up and run a bunch of simulations, and finally use the robot arm and gripper to pick up the object. Mounting the T265 on the robot arm would verify our calculations as we pick up the object, and it could also be used to train the robot using ML to improve the IK calculations.

So: GPU/TPU support (Nvidia Jetson?) for accurate object recognition and IK calculations, D435i support for the 3D point cloud, and the T265 for location, orientation, and movement of the robot arm in 3D space.
#51   — Edited
I had the cam on the pole but it creates some vibration and having an offset parameter lets you place the camera where you want.
How are you using your T265? 
The Jetson is ARM based, so not ARC compatible.
#52   — Edited
Well, it could run on Linux and ARM; it used to :-). But I get that DJ doesn't want to support 2 distros, as there is a lot of effort involved and that = costs. That doesn't prevent someone adding Jetson support as an accessory, though, and offloading all the GPU requirements to the Nano.

Right now I'm just playing with the T265 (not enough hours in the day), but my goal is a robot that can do some simple tasks around the house: pick up shoes and put them away, vacuum without smashing up the wife's furniture or getting stuck under the coffee table, and most important, GET ME A BEER.

what are your plans @proteusy?
I am currently working on the "Go to work for me" script.:p
Updated for radius offset in MM from the center of the robot. Read the manual above for more detail or use the question mark in the config menu.
What is the performance and power consumption like on the stick computer @proteusy? Are you able to Remote Desktop in OK?
Since we are only getting telemetry data from the T265, I am wondering if it would be better to just pull the data off the T265 with a Pi or stick PC and send it to The Navigator on a remote desktop. I mounted a NUC with 2 x 3-cell LiPos in series for ~22 V to run everything off; it works fine but battery life is short.
#56   — Edited
For now I use the Roomba's battery only for the Roomba and have 4 x 3.7 V 3500 mAh MR18650 cells for the rest. The Intel stick is surprisingly fast and the nominal consumption is around 1.2 A (just the stick). All together I get around 2 hours of play. My next step is to buy a li-ion with 5500 mAh for the Roomba and run everything from there.
I am currently working on obstacle avoidance when using The Navigator's way points.
#57   — Edited
With my Roomba I am using the Roomba battery to run the EZB4 (or Tiny IO) with the EZ camera and Terminator brain LED lights; it makes only a 5 minute reduction in the Roomba's runtime with the vacuum and brush motors on (1 hour 15 minutes).
The chrome skull is fairly heavy, about 8 pounds, so I bet without it I'd get at least 1 hour 45 minutes.
Moved from the Roli to the Roomba. It zig-zags a lot when going to way points. Do other people have that problem?
User-inserted image

Where is the best place to position the RealSense? I did play with the new offset a bit.
Adjust the speed to be a little slower. The position is fine. Use a measuring tape for the offset from the center.

I have a todo to add speed adjustment settings for moving and turning. It'll come in time:)
Yes, I have it too. As DJ suggested, go slower for now.