Intel Realsense T265

by Intel

Use the Intel Realsense T265 tracking camera for way-point robot navigation

How to add the Intel Realsense T265 robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Navigation category tab.
  5. Press the Intel Realsense T265 icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Intel Realsense T265 robot skill.

How to use the Intel Realsense T265 robot skill

With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 is designed to deliver the tracking performance your robot needs. This user-friendly ARC robot skill provides an easy way to use the T265 for way-point navigation.

The T265 combined with this robot skill provides your robot with a SLAM (Simultaneous Localization and Mapping) solution. It allows your robot to construct a map of an unknown environment while simultaneously keeping track of its own location within that environment. Before the days of GPS, sailors navigated by the stars, using their movements and positions to find their way across oceans. Visual SLAM (VSLAM) navigates in a similar way, combining cameras and an Inertial Measurement Unit (IMU) to track visual features in the environment and move through unknown spaces with accuracy. All of these complicated details are taken care of for you in this ARC robot skill.

Update Firmware
The device sensor may require a firmware update.

1) Visit the Realsense GitHub page, scroll to the bottom of the page, and install the Intel.Realsense.Viewer.exe from here:
User-inserted image

2) Click the hamburger settings icon and select Install Recommended Firmware
User-inserted image

Robot Skill Window
The skill has a very minimal interface because it pushes data into the NMS (Navigation Messaging System) and is generally used by other robot skills (such as The Navigator).
User-inserted image

1) Drop-down to select Realsense device by the serial number. This is useful if there are multiple devices on one PC.

2) START/STOP the Intel T265 connection. 

3) The confidence of the tracking status, from 0 (lowest) to 3 (highest). In a brightly lit room with many points of interest (not just white walls), the tracking confidence will be high. It will be low if the room does not have enough light and/or detail for the sensor to track.
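If you are experimenting with the camera outside of ARC, Intel's librealsense Python bindings expose the same value as `tracker_confidence` on a pose frame. A small helper for turning the 0-3 value into a readable label might look like this (the label names are our own shorthand, not ARC's):

```python
def confidence_label(level: int) -> str:
    """Map the T265 tracker confidence value (0-3) to a readable label."""
    labels = {0: "failed", 1: "low", 2: "medium", 3: "high"}
    return labels.get(level, "unknown")
```

For example, `confidence_label(3)` returns `"high"`, matching the best-case tracking state described above.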

4) Log text display for errors and statuses.

Config Menu
User-inserted image

1) Mounted Radius Offset (mm) is the distance in mm of the T265 from the center of the robot. A negative number is toward the front of the robot, and a positive number is toward the rear. The sensor must be facing 0 degrees toward the front of the robot. The sensor must not be offset to the left or right of the robot.
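For illustration, here is one way the offset geometry could be applied in plain Python: shift the camera's reported position along the robot's current heading by the offset so the pose refers to the robot's center rather than the sensor. This is a sketch of the math implied by the description above, not the skill's actual code; the sign convention follows the skill (negative = sensor toward the front).

```python
import math

def robot_center(cam_x_mm, cam_y_mm, heading_rad, radius_offset_mm):
    """Shift the camera's reported (x, y) in mm back to the robot's center.

    radius_offset_mm follows the skill's convention: negative means the
    sensor sits toward the front of the robot, positive toward the rear.
    """
    return (cam_x_mm + radius_offset_mm * math.cos(heading_rad),
            cam_y_mm + radius_offset_mm * math.sin(heading_rad))
```

With the sensor mounted 50 mm ahead of center (offset -50) and the robot facing along +x, a camera reading of x = 100 mm places the robot's center at x = 50 mm.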

2) Enable Video Stream sends the black-and-white fisheye video from the T265 to the selected camera device. The selected Camera Device robot skill must have Custom specified as the input device, and the camera device must be started to view the video.

3) Distortion Correction uses a real-time algorithm to correct the fisheye lens distortion. It isn't always needed and is very CPU intensive.
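To illustrate why this is CPU intensive: correcting a fisheye image means computing, for every output pixel, where in the source image to sample. The toy sketch below assumes an equidistant fisheye model (image radius proportional to ray angle), a common approximation; the skill's actual algorithm and the T265's calibrated distortion model may differ.

```python
import numpy as np

def undistort_equidistant(img, f):
    """Map an equidistant-fisheye grayscale image (HxW array) to a
    rectilinear view, given a focal length f in pixels.

    For every output pixel we compute the ray angle it should represent,
    find the fisheye radius for that angle, and sample the source there
    (nearest neighbor) - one trig evaluation per pixel, per frame.
    """
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.indices((h, w), dtype=np.float64)
    dx, dy = xs - cx, ys - cy
    r = np.hypot(dx, dy)                 # rectilinear radius of each pixel
    theta = np.arctan2(r, f)             # ray angle for that radius
    r_src = f * theta                    # equidistant fisheye radius
    scale = np.divide(r_src, r, out=np.ones_like(r), where=r > 0)
    sx = np.clip(np.round(cx + dx * scale), 0, w - 1).astype(int)
    sy = np.clip(np.round(cy + dy * scale), 0, h - 1).astype(int)
    return img[sy, sx]
```

At the T265's 848x800 fisheye resolution, that is roughly 680,000 samples per frame, which is why the option is off by default.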

Video Demonstration
Here's a video of the Intel RealSense T265 feeding The Navigator skill for way-point navigation

ARC Navigation Messaging System
This skill is part of the ARC Navigation Messaging System (NMS). We encourage you to read more about the messaging system and the available skills HERE. This skill sits in Level #3, Group #2 in the diagram below and contributes telemetry positioning to the cartesian positioning channel of the NMS. Combine it with Level #3, Group #1 skills for obstacle avoidance; for Level #1, The Navigator works well.
User-inserted image

The T265 will work both indoors and outdoors. However, bright direct light (sunlight) and darkness will affect performance. Much like our eyes, the camera is susceptible to glare and a lack of resolution in the dark. Because the camera's visual data is combined with the IMU, the camera must have reliable visible light. Without the camera being able to detect the environment, the algorithm will be biased toward the IMU and will experience drift, which greatly degrades the sensor's accuracy.

Here is a screenshot of this skill combined with The Navigator in ARC while navigating through a room between two way points.
User-inserted image

Starting Location
The T265 does not include a GPS/compass or any ability to recognize where it is when initialized. This means your robot will have to initialize from a known location and direction to reuse saved maps. Mark the robot's starting spot on the ground with masking tape.

How To Use This
1) Connect your Intel RealSense T265 camera to the computer's USB port

2) Load ARC (version must be >= 2020.12.05.00)

3) Add this skill to your project

4) Now we'll need a Navigation skill. Add The Navigator to your project

5) Press START on the Intel RealSense skill, and data will begin mapping your robot's position
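Outside of ARC, you can sanity-check the camera with Intel's librealsense Python bindings (pyrealsense2). The sketch below defines a `stream_pose()` function you can call with a T265 attached; because the T265 reports translation in meters while ARC's NMS works in millimeter coordinates, a small conversion helper (`meters_to_mm`, our own name) is included. This is a sketch against the public pyrealsense2 pose API, not the skill's internal code.

```python
def meters_to_mm(value_m):
    """Convert a T265 translation component (meters) to millimeters."""
    return value_m * 1000.0

def stream_pose(frames_to_read=10):
    """Print a few pose samples from an attached T265.

    Requires the pyrealsense2 package and a connected T265; the import
    lives inside the function so the rest of the module stays usable
    without the hardware.
    """
    import pyrealsense2 as rs

    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)  # the T265's 6-DoF pose stream
    pipe.start(cfg)
    try:
        for _ in range(frames_to_read):
            frames = pipe.wait_for_frames()
            pose = frames.get_pose_frame()
            if pose:
                d = pose.get_pose_data()
                print("x=%.1f mm  z=%.1f mm  confidence=%d"
                      % (meters_to_mm(d.translation.x),
                         meters_to_mm(d.translation.z),
                         d.tracker_confidence))
    finally:
        pipe.stop()
```

If `stream_pose()` prints steadily changing coordinates as you move the camera, the device and driver are working and ARC's skill should be able to connect to it as well.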

How Does It Work?
Well, magic! Actually, the camera is quite interesting: it breaks the world down into a point cloud of features and remembers those visual features so it can re-align itself on its internal map. The processing runs on what Intel calls a VPU (Vision Processing Unit). Here's a video of what the camera sees.
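One concrete piece of that pipeline: the camera reports its orientation as a quaternion, while a waypoint navigator ultimately needs a heading angle. A standard conversion looks like the sketch below; it assumes a Z-up convention, whereas the T265 itself uses a Y-up frame, so in practice the axes would need remapping first.

```python
import math

def quat_to_yaw(w, x, y, z):
    """Heading (rotation about the vertical axis) from a unit quaternion,
    using the standard ZYX Euler convention with Z up."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
```

The identity quaternion (1, 0, 0, 0) yields a heading of 0, and a quarter turn about the vertical axis yields pi/2, which is the kind of value a navigator compares against the bearing to the next waypoint.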


Hi DJ, for the T265: when starting ARC, most of the time I need to remove the skill and install it again for the camera to be recognized. Is this a known issue, or an isolated case? Thanks
#119   — Edited
Yeah, that's one of the most significant issues with this device. You can ping Intel about it, but they haven't seemed to act on those complaints. Super frustrating, right?

It doesn't seem to be an issue if it's the only Intel RealSense connected. It seems to happen when there are two RealSense devices connected.
OK good to know that. Thanks DJ.
v26 updated

- hides an error from Intel's driver that sometimes raises when shutting down the camera

- displays coordinates in the status every 100 updates
Does anyone know where to buy the Intel Realsense T265 now? It looks like this is one of the sensors that got the ax from Intel, unfortunately. I was looking forward to having one of these to mess around with.

Does anyone know if there is a direct replacement for this sensor in the works?

User-inserted image

User-inserted image
Wow, over $800 CAD on eBay. I guess when we heard they were being discontinued we should have stocked up. It's disappointing that once again Intel left us high and dry. These are used in several commercial robots, so I assume those companies have grabbed whatever supply is on the market.

Perhaps it’s time we had support for another product.
#125   — Edited
Hey - who called it? :) I shouldn't be happy about being right, but I saw the writing on the wall. The robot industry is code first, usability last. I met with the CTO of RealSense and told him that we could give them tens of thousands of customers. He said, no thanks; he'd prefer to focus on ROS. Well, how did that turn out? Sorry, but I don't see anyone doing anything with ROS. Well, I should clarify: I don't see enough people doing anything exciting or reproducible. I think they sold 300 Intel RealSense tracking cameras. If marketed, priced right, and bundled with ARC, they would have sold tens of thousands or more. That might not be the millions of units they were hoping for, but it is an excellent start for a business case.

Someone writes a post that's twenty-eight pages long about moving a servo with ROS. If it takes 35 weeks to learn how to drive a servo with ROS, how many Intel RealSense cameras do you think they will sell? For some reason, there's this conflicting issue with robotics and businesses. A business is about decreasing internal costs and increasing revenue; the difference between the two is profit. The CEO wants the most efficient and cost-effective solution for any business function. However, with robot companies, their CEOs are CTOs. So, the result is "a difficult robot system means a difficult business system." *confused*

How do you run a successful business like that? You're telling me that using software like ARC that is easy, quick, and extensible is not professional? Because to be professional, you need 38 guys in a room writing 5 million lines of code to "test" a prototype that will be thrown out in a month. <- this is the unfortunate reason for "why we can't have nice things" with robot products.

Take a look at Boston Dynamics, for example. @Nink, you posted the documentation, so I took a look. My head was spinning, haha - no wonder they can't find a business model. The darn thing is impossible to use without a Ph.D. engineering team. And boy, don't Ph.D. engineering programmers have the best business ideas? Hahaha, *sarcasm*. Inventions are fueled by creativity, and creativity doesn't involve complexity. So simplicity inspires creativity. Want new products or business models? Give the responsibility to creative teams, not engineering teams.

Anyway - if you read the NMS manual, you don't need a T265 for The Better Navigator. You can use other options for providing a pose hint.

<rant over> lol
@EzAng wow, $325 USD is like $9 million Canadian
Hmm, why focus on ROS and not on ARC? That is strange; I wonder what the reasons are...
@EzAng That Amazon link doesn't ship to Canada, so you need a US redirect address like MyUS + import duties + additional shipping, so I guess about $500+ Canadian. Perhaps if someone wanted to buy all 17, they could resell them for $700 CAD. The cheapest I could find them on eBay was $700 Canadian from China, but stuff from China takes months now. Not sure if there would be demand, though.
Hi DJ,
Is it possible for the skill to remember the last serial number connected, so I can start the device programmatically with the Start command instead of selecting it manually each time? Thanks.
We can look into adding that ability if possible. I can see how it could be useful if the Intel product supports it. Thanks for your feedback.
Would appreciate some input. I have a small tracked vehicle that I want to get around on its own. I have an RPi 3+, Arduino Mega and Uno, ESP32 dev board, EZ-B v4, T265, encoded motors, ultrasonic sensors, and a dual H-bridge. What would be the best way to get the T265, sensors, and H-bridge talking to one another? Or, what would you do?
Hi @DSBaily.

The piece of the puzzle you are missing is a Windows single-board computer like a Rock Pi X or similar. You really want the T265 plugged directly into a USB port on the PC, so the PC really needs to be on the vehicle.

You can also connect the EZ-B directly to the PC, or you can use an Arduino, either via a serial connection or an ESP32 wirelessly.
Thanks Nink

That is what I thought. I may use sonics and encoders and see how well it does. 

Thanks again.
You can put Windows on the RPi 3+, I think. But either way, the T265 should be combined with a 360-degree lidar or depth camera. The T265 is okay but not nearly as good as using an actual lidar, IMO.

As for having the robot move, you need to add the HBridge PWM Movement Panel to your project. But, I would first recommend following the getting started guide, which introduces what a Movement Panel is because a Movement Panel is how the robot moves. Once a Movement Panel is added, it registers itself into the ARC framework, and then all other robot skills can move the robot. Here's an excellent place to start:
I do have a Slamtec RPLIDAR A1M8 (2D, 360-degree, 12-meter scanning); will that work? I think I tried it last year but never got it to work. My large robot uses a T265 and D435 with a LattePanda and works well. Trying to use what I have in stock.