
Intel Realsense T265

by Intel

Use the Intel Realsense T265 tracking camera for way-point robot navigation

How to add the Intel Realsense T265 robot skill

  1. Load the most recent release of ARC (Get ARC).
  2. Press the Project tab from the top menu bar in ARC.
  3. Press Add Robot Skill from the button ribbon bar in ARC.
  4. Choose the Navigation category tab.
  5. Press the Intel Realsense T265 icon to add the robot skill to your project.

Don't have a robot yet?

Follow the Getting Started Guide to build a robot and use the Intel Realsense T265 robot skill.

How to use the Intel Realsense T265 robot skill

With its small form factor and low power consumption, the Intel RealSense Tracking Camera T265 is designed to give your robot reliable tracking performance. This user-friendly ARC robot skill provides an easy way to use the T265 for way-point navigation.

The T265 combined with this robot skill provides your robot with a SLAM (Simultaneous Localization and Mapping) solution. It allows your robot to construct a map of an unknown environment while simultaneously keeping track of its own location within that environment. Before the days of GPS, sailors navigated by the stars, using their movements and positions to find their way across oceans. VSLAM (visual SLAM) navigates in a similar way, using a combination of cameras and an Inertial Measurement Unit (IMU) to track visual features in the environment and find its way around unknown spaces with accuracy. All of these complicated features are taken care of for you in this ARC robot skill.

Update Firmware

The sensor may require a firmware update.

  1. Visit the Realsense GitHub page, scroll to the bottom of the page, and install the Intel.Realsense.Viewer.exe from here: https://github.com/IntelRealSense/librealsense/releases/latest User-inserted image

  2. Click the hamburger settings icon and select Install Recommended Firmware User-inserted image

Robot Skill Window

The skill has a very minimal interface because it pushes data into the NMS and is generally used by other robot skills (such as The Navigator). User-inserted image

  1. Drop-down to select the Realsense device by serial number. This is useful if there are multiple devices on one PC.

  2. START/STOP the Intel T265 connection.

  3. The confidence of the tracking status, between 0 (lowest) and 3 (highest). In a brightly lit room with many points of interest (not just white walls), the tracking status will be high. Tracking will be low if the room does not have enough light and/or detail for the sensor to track.

  4. Log text display for errors and statuses.

Config Menu

User-inserted image

  1. Mounted Radius Offset (mm) is the distance in mm of the T265 from the center of the robot. A negative number is toward the front of the robot, and a positive number is toward the rear. The sensor must be facing 0 degrees toward the front of the robot and must not be offset to the left or right of the robot. (A sketch of the geometry follows this list.)

  2. Enable Video Stream will send the fisheye b&w video from the T265 to the selected camera device. The selected Camera Device robot skill must have Custom specified as the input device, and the camera device will need to be started to view the video.

  3. Distortion Correction will use a real-time algorithm to correct the fisheye lens; it isn't always needed and is very CPU intensive. (An illustrative sketch follows this list.)
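
To make the radius offset concrete, here is a rough sketch of the geometry in Python (an illustration of how such an offset could be applied, not the skill's actual code). The robot-center position is recovered by sliding the reported position along the heading by the signed offset:

import math

def center_from_sensor(x_mm, y_mm, heading_deg, radius_offset_mm):
    # Per the convention above: negative offset = sensor ahead of center,
    # positive = behind, so the center sits at sensor + offset * forward
    h = math.radians(heading_deg)
    return (x_mm + radius_offset_mm * math.cos(h),
            y_mm + radius_offset_mm * math.sin(h))

For Distortion Correction, "correcting the fisheye lens" means remapping the image onto a rectilinear grid. The skill's internal algorithm isn't published; as a point of comparison, the equivalent operation with OpenCV's fisheye model looks like this (K and D are placeholders for the camera's actual calibration):

import cv2
import numpy as np

K = np.eye(3)         # 3x3 camera matrix from calibration (placeholder)
D = np.zeros((4, 1))  # fisheye distortion coefficients (placeholder)

img = cv2.imread("fisheye_frame.png")
h, w = img.shape[:2]
map1, map2 = cv2.fisheye.initUndistortRectifyMap(K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
undistorted = cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)

Every output pixel is resampled from the source image each frame, which hints at why the option is CPU intensive.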

Video Demonstration

Here's a video of the Intel RealSense T265 feeding The Navigator skill for way-point navigation.

ARC Navigation Messaging System

This skill is part of the ARC Navigation Messaging System (NMS). We encourage you to read more about the messaging system and the available skills HERE. This skill is in Level #3, Group #2 in the diagram below, and it contributes telemetry positioning to the cartesian positioning channel of the NMS. Combine this skill with Level #3, Group #1 skills for obstacle avoidance; for Level #1, The Navigator works well. User-inserted image

Environments

The T265 will work both indoors and outdoors. However, bright direct light (sunlight) and darkness will affect performance. Much like our eyes, the camera is susceptible to glare and loses resolution in the dark. Because the camera's visual data is combined with the IMU, the camera must have reliable visible light. Without the camera being able to detect the environment, the algorithm will be biased toward the IMU and will experience drift, which greatly degrades the sensor's accuracy.

Screenshot

Here is a screenshot of this skill combined with The Navigator in ARC while navigating through a room between two way-points. User-inserted image

Starting Location

The T265 does not include a GPS/compass or any ability to recognize where it is when initialized. This means your robot will have to initialize from a known location and direction to reuse saved maps. Mark the spot on the ground where the robot starts with masking tape.

How To Use This

  1. Connect your Intel RealSense T265 camera to the computer's USB port

  2. Load ARC (version must be >= 2020.12.05.00)

  3. Add this skill to your project

  4. Now we'll need a Navigation skill. Add The Navigator to your project

  5. Press START on the Intel RealSense skill and data will begin mapping your robot's position. (The skill can also be started from a script; see the sketch after this list.)
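
The skill can also be started and stopped from a script via ControlCommand(). The exact command strings should be taken from ARC's Cheat Sheet for this robot skill; the names below are assumptions for illustration only:

# Command names are assumed for illustration - check ARC's Cheat Sheet
# for the strings this robot skill actually advertises
ControlCommand("Intel Realsense T265", "Start")
ControlCommand("Intel Realsense T265", "Stop")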

How Does It Work?

Well, magic! Actually, the camera is quite interesting: it breaks the world down into a point cloud of features and remembers those visual features so it can re-align itself on its internal map. It uses what Intel calls a VPU, a Vision Processing Unit. Here's a video of what the camera sees.



#1  

Wow that is what you were working on earlier!:D

PRO
Synthiam
#2  

Updated for maps to grow as necessary

#3  

That's great.  Perhaps what I wrote earlier can just attach to this...if there is any value add left.  This may render my musings moot I guess.  I wish I had known (story of my life)...would not have spent time on it if the skill was a done deal.

Does this skill provide an API to the map?  If so, 2D or 3D?  So curious.

PRO
Synthiam
#4   — Edited

Updated to fix a bug with the growing map

Martint, there will be access to the map in a few more versions. Stay tuned... I'm adding things as we go over the next few days. However, accessing the map is somewhat irrelevant because you merely specify where you want the robot to go, and it uses whatever Movement Panel is added to take you there.

Movement panels allow any robot to be moved by any other skill. So it takes control - that's how ARC works. The robot can be a drone, a humanoid with gait, a hexapod with gait, an hbridge, continuous rotation servos, etc., etc., etc... it doesn't matter. It just goes.

This isn't the place for that conversation, however. But you can read about what a Movement Panel is in the support section here: https://synthiam.com/Support/ARC-Overview/movement-panels

#5  

Thanks.  I get it on the movement, and why it is somewhat irrelevant for that, especially inside a single open room.

I am very interested in the map for other reasons...so I will stay tuned.  Here is one below.  This is more of a nav thing so I'll go read what you guys have on that too.

If you are moving from point A in a bedroom to point B in a living room...there are obstacles between A and B, both known (walls, doors, hallways, furniture, etc.) and unknown (people, pets, laundry), that come into play. The map presumably is, or will play, a central part in this. Obstacle detection is already in other skills. Fusing a bunch of disparate obstacle detection sensors is another piece (I use 3 types). Pathfinding is a piece too...and pruning the path down, eliminating all the useless points in the middle of a path to find sensible free diagonals. I personally like to layer some kind of force field on top too, so the bot will drive more reasonable paths down the center of open space (like hallways) instead of driving the shortest path (like pathfinding algos usually provide). To me, the best path is not the shortest, but the easiest. This obviously can be done by modifying the terrain cost around everything bad before the pathfinding does its path.

Anyway...to anyone out there, which parts of this still need skills, if any, or is it all already there? I suppose I could have left out some skills in that process that you guys have encountered. I don't want to pester and will put a cork in it and RTFM if that is the etiquette preference here; just hint away.

Portugal
#6   — Edited

That looks promising. What approach are you thinking of taking regarding the map, if any? I was thinking it would be cool to drive around to create the map with a ToF sensor like the VL53L0X cheap laser. What do you guys think?

Portugal
#7  

Another idea: a simple DIY lidar that is enough for our needs:

#8   — Edited

Hi proteusy...when you mentioned something looking promising...were you referring to the skill DJ is working on or something in my last post?  Sorry...had to be the skill, my bad.

Portugal
#9  

Hey martint, when I was posting, the last post was from DJ. Didn't read yours yet :)

#10  

Gotcha...the skill does look promising. I really like the T265 so far but haven't figured out how to do mapping with it yet. It does it automatically on the sensor, and DJ is right...you can use it without having access to the maps for some use cases. The mapping and sensor questions you brought up are a big topic with lots of different and valid opinions. I am happy to jump into it and give mine if and when it's the time and place...my guess is others are already way ahead of me on it and have other great solutions.

Historically, I have been a sonar array nerd (every 30 degrees) and am now adding a depth cam to that mix for more precision up front. I am still trying to assess what's been done and what needs more effort so I can see if there is a way to help.

Portugal
#11  

Yes, I think we all began in robotics with sonars, bump sensors... From what I understand, the T265 does some kind of vslam internally with the two fisheye cams to help localize the camera in a known space. Can you confirm that, martint?

#12  

Yes. That's what it does...it seems to do it very well, but I just got it a few weeks ago. I have a lot more to learn, but it is one of my fav sensors.

I tried out a Marvelmind system on my bot initially before I got the T265...it was a lot more trouble and a waste of $450 compared to this.

I wrote about my experiences so far with the T265 in a doc at the bottom of my Ava project under "Additional Resources".

PRO
Synthiam
#13  

@proteusy - the new Navigation Messaging System in ARC does just that. It has events that skills can connect to for contributing data into the navigation messages. This means anything from a lidar, or sharp ir sensor, or ultrasonic ping, or 360 degree lidar, or bump sensor.... It means absolutely anything can contribute data and The Navigator will receive it and process it.

You can even combine the Intel RealSense skill with a wheel encoder skill and send both to the Navigation Messaging System to improve accuracy.

#14  

Damn, way to keep staying ahead of the crowd on robot innovation, DJ!

PRO
USA
#15  

@DJ, I'm trying to stay away from another Intel purchase and you are making my life difficult :)

Nice job on the plugin... plus the "bird's eye" documentation.

PRO
Synthiam
#16  

Ya. After being burned with the ezbv5 joule, I didn’t want to do this. That realsense has been sitting on my desk for a long time. Finally decided to give it a go... and I have to say wow!! It works really well

PRO
Synthiam
#17  

Version 10: Minor bug fix

PRO
Canada
#18   — Edited

Very cool. I know what I want for Christmas now. Any thoughts about supporting the D435 Camera as well?

#19  

I have been very interested in the D435 camera and would probably have bought it had I known the T265 was coming.  I chose a different camera at the time (an Orbbec) because I evaluated depth cams in isolation on their own merits.  There were no tracking cams.  The D435 is bulky, but the main D435 component inside the housing looks to be a lot smaller, so I am very interested in getting one and taking it apart.  The huge factor for me is that the SDK looks to have strong support for integrating the streams from the D435 and the T265 together, and odometry from motor encoders too.  For 3D vision purposes, integrating the depth and color streams with the T265 streams is VERY beneficial.  To me, this makes this sensor combo perhaps one of the best sensor combos yet.  I REALLY hope they produce a single product in the future that does both.

I bought an Orbbec depth cam instead and have used multiple Orbbecs (2) at the same time on one bot. I spoke with a rep who says they have hooked up many at the same time to one PC and that you can context switch between them in a few millis. The Orbbec has some positives (small size, no interference with each other, and they say it works outside too); I think the Orbbec (and some others) is a better depth cam if considered only by itself (just opinion here). However, because of the potential for the D435 and T265 to be tightly integrated and time-synced by their SDK, I think this makes the D435 (or any subsequent Intel depth cams) a VERY enticing option.

PRO
Synthiam
#20  

Version 11: updated sample rate that is reported to the NMS

PRO
USA
#21  

My T265 should be here in a couple days! Ordered direct.

PRO
Canada
#22  

I sent a link to Santa and begged for a Christmas Present so fingers crossed it will be under the tree on the 25th.

PRO
Synthiam
#23  

Haha - geez, I should be a realsense reseller :D. Anyway, check this...

PRO
Synthiam
#24  

Here's some video out of the fish eye as well - so you can see how the camera breaks the image down. Quite interesting...

PRO
USA
#25  

Christmas came early...too bad I don't have a wheeled robot for it..... :)

User-inserted image

PRO
USA
#26  

That's pretty cool...the fisheye video.

PRO
Synthiam
#27  

It doesn't have to be a wheeled robot. Remember, ARC uses movement panels. Any robot with a Movement Panel will work

PRO
Synthiam
#29  

What about your giant hexapod? :D

Oh, and you need a silly long usb cable if the robot doesn't have a PC installed. I used a USB 3.0 active extension cable: https://www.amazon.ca/CableCreation-Meters-Extension-Extender-Female/dp/B0179MXKU8/ref=sr_1_5?crid=3B86YRQ5Z4T4T&dchild=1&keywords=usb+3+active+extension+cable&qid=1607543830&sprefix=usb+3+active%2Caps%2C203&sr=8-5

Might be able to get one today from bestbuy or a local computer shop.

PRO
Canada
#30   — Edited

We have to find a way to make the T265 wireless. It hurts me every time I see DJ have to swing that USB cable around the robot LOL :D

As you mentioned DJ, this sensor is likely more ideally used with an onboard SBC/PC.

PRO
Synthiam
#31   — Edited

Jer, usb 3 is too fast for wireless - so no wireless usb 3 hub exists. The only options are a usb cable for testing, or an onboard pc

PRO
USA
#32  

The Hex is dead for now...but he gets a resurrection on my Youtube series prolly around Jan/Feb. Much stronger servos for the upgrade.

User-inserted image

The Latte Panda has a USB 3 port...wonder how much CPU is used by Real Sense?

#33  

When I get the hardware, before I start building the final home for it, I plan on making a trailer for my Roli to carry a small windows tablet and maybe a bigger battery for Roli too.

PRO
Synthiam
#34   — Edited

@will, The realsense doesn't use any CPU because the processing is done in the sensor using a VPU. The only processing will be ARC's navigation skills, which use 1 or 2% on my i7 home PC from 2012

@alan, a trailer, eh? That'll be pretty fun to see! Hope that happens soon. You could easily fit a Rock Pi X with an additional LiPo on the Roli and run headless. Here are two links that will be useful:

  1. setting up windows for SBC (save performance and storage space): https://synthiam.com/Support/Install/free-space-with-windows-10

  2. set up SBC for headless: https://synthiam.com/Support/Install/headless-robot-sbc

Portugal
#35  

This beauty just arrived! Let's have some fun!!!!!

User-inserted image

PRO
Canada
#36  

haven't received my Lidar or T265 yet :-).  Santa must have Covid  (hoping the neato XV11 works with NMS by the time it arrives).

PRO
Canada
#37  

T265 turned up. Testing my luck on the stairs. Need to strap some LiPos together so I can power my PC, so doing it DJ style for now. User-inserted image

PRO
Synthiam
#38  

Nice!!! Using The Navigator with it?

PRO
Canada
#39  

Trying to work it out. Setting the way points and moving between them. Unfortunately, I don't have a good open space. The 16-year-old is in the basement playing video games (he yells really loud), my wife is on the main floor reading and wants quiet, and my daughter is tutoring via Zoom upstairs, so I found a quiet nook, but there are lots of obstacles and I keep running into walls, bookshelves, tables, etc. Need to write an obstacle avoidance script for sonar, but my LIDAR will be here soon, I hope. Roli is not exactly nimble either, but I needed something I could build a platform on for the PC and battery. May have to steal the Roomba from my telepresence robot.

BTW how do you get the fisheye camera to work in ARC?

PRO
Synthiam
#40  

No no

First drive your robot manually around. Drive to the first way point. Mark home. Mark each significant corner/turn. Then mark the end.

do that for each way point.

the key is to FIRST drive the robot TO the way point. Then mark the path.

also. Mark home on the ground. Use painters tape so you can reuse the maps.

instructions above will help.

PRO
Canada
#41  

Ok, I was just marking major destinations, not paths to destinations. I guess I need to spend some more time mapping.

PRO
Synthiam
#42  

Ha ya. If you try and guess coordinates it’ll lead to disaster lol. Drive robot and watch the trajectory.

Portugal
#43  

Thanks @Nink

@DJ have you seen this on the T265 Intel page?

What is wheel odometry and how does it help T265 navigate?

Wheel odometry is the use of sensors to measure how much a wheel turns, it can be used to estimate changes in a wheeled robots position. T265 has wheel odometry support built in, allowing it to use the data from these sensors to refine the position data of the robot. Providing robotic wheel odometer or velocimeter data over USB to TM2 will make the tracking much more robust on wheeled robots, which otherwise can experience many tracking failures. We consider odometer input to be a requirement for robust tracking on wheeled robots.

PRO
Canada
#44   — Edited

I think what Intel is implying is that their sensor can be used to calibrate the wheel odometry on robots.

once you get the calibration right on a specific surface or in a specific environment then you can remove the sensor.

I envision you don’t want to put a $250 sensor on every robot doing minor tasks like vacuuming but you could calibrate it with the T265 and then send it on its way.

PRO
Synthiam
#45   — Edited
  • Moved your comments to correct thread

In the NMS page are some additional reading links that talk about the challenges of navigation and pose. I highly encourage you to read them prior to this discussion.

Intel is very clear in that paragraph that you copied. It clearly says the T265 is compatible with wheeled robots. It does NOT state that it works with wheel encoders. It can correct wheel odometry but it cannot accept an input from a wheel encoder. It can be used to correct wheel encoded robot pose because wheel encoder robots have terrible accuracy - which is why the T265 exists.

I really recommend reading the additional links in the NMS page to catch up on the difference between a wheeled robot and a wheel encoder. And also the challenges of the two etc

*edit: the api seems to have an older interface that allows wheel odometry input. The early release of the T265 does mention you can push wheel odometry data into the api and it gets fused with the T265. So that must be an early attempt at converting the old school wheeled ros guys to vslam
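
For anyone curious, that older interface is exposed in the librealsense Python bindings, and the repository ships a worked example along these lines (t265_wheel_odometry.py). A trimmed sketch, assuming pyrealsense2 is installed and you have a calibration JSON describing where the odometer sits on the robot:

import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
dev = cfg.resolve(rs.pipeline_wrapper(pipe)).get_device()

# The wheel-odometry interface hangs off the pose sensor
wheel_odometer = dev.first_pose_sensor().as_wheel_odometer()

# The calibration JSON (odometer mounting extrinsics) is passed as raw bytes
with open("calibration_odometry.json") as f:
    calib = [ord(c) for c in f.read()]
wheel_odometer.load_wheel_odometery_config(calib)  # spelling per the API

pipe.start(cfg)
try:
    while True:
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            print(pose.get_pose_data().translation)
        # Feed the measured wheel velocity (m/s, odometer frame) to the T265
        v = rs.vector()
        v.x = 0.1  # example velocity
        wheel_odometer.send_wheel_odometry(0, 0, v)  # sensor id, frame number, velocity
finally:
    pipe.stop()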

Portugal
#46   — Edited

Sorry, my English with technical terms is terrible; I thought that wheel odometry meant a wheel with an encoder, as they state: "Wheel odometry is the use of sensors to measure how much a wheel turns". Oh, and Happy New Year everyone!!

PRO
Synthiam
#47  

Your English is great - don't think I ever noticed anything else :)

#48  

Hey there Nink, my Roomba does more than just vacuum; it keeps the raccoons and squirrels terrified of knocking over the patio garbage can, LoL!

Portugal
#49  

Hi DJ, I moved the T265 camera position to the edge of the Roomba, and when I spin 360 deg in place I get a circle like in the pic. Could you make an offset for the camera as you did with the US sensors? User-inserted image

PRO
Canada
#50  

It would be interesting to see how accurate the orientation data is in order to do this. Playing with the sensor, it does have orientation yaw, pitch and roll. X, Y and Z axis tracking would also be nice for drone tracking and robot arms. I guess you could always mount the sensor in the centre of the Roomba on your pole and reduce the offset.

I would love if we had D435 support. Watch this video.

The long term goal would be: map the room with the T265 and D435. Use object recognition to identify and find the object. Use T265 data to go to the object, and use data from the D435 to calculate the exact location and orientation of the object. Now use inverse kinematics to calculate how to pick it up and run a bunch of simulations; finally, use the robot arm and gripper to pick up the object. Mounting the T265 on the robot arm would verify our calculations as we pick up the object and could also be used to train the robot using ML to improve the IK calculations.

So GPU TPU support (Nvidia Jetson?) for accurate object recognition and IK calculations. D435i support for 3D point cloud and T265 for location and orientation and movement of robot arm in 3D space.

Portugal
#51   — Edited

@Nink I had the cam on the pole, but it creates some vibration, and having an offset parameter lets you place the camera where you want. How are you using your T265? The Jetson is ARM based, so not ARC compatible.

PRO
Canada
#52   — Edited

Well, it could run on Linux and ARM; it used to :-). But I get that DJ doesn't want to support 2 distros, as there is a lot of effort involved and that = costs, but that doesn't prevent someone from adding Jetson support as an accessory and offloading all the GPU requirements to the Nano.

right now just playing with the T265 (not enough hours in the day), but my goal is a robot that can do some simple tasks around the house. Pick up shoes and put them away, vacuum without smashing the wife's furniture up or getting stuck under the coffee table, and most important: GET ME A BEER.

what are your plans @proteusy?

Portugal
#53  

I am currently working on the "Go to work for me" script.:p

PRO
Synthiam
#54  

Updated for radius offset in MM from the center of the robot. Read the manual above for more detail or use the question mark in the config menu.

PRO
Canada
#55  

What is the performance and power consumption like on the stick computer, @proteusy? Are you able to Remote Desktop in OK? Since we are only getting telemetry data off the T265, I am wondering if it would be better to just pull the data off the T265 with a Pi or stick PC and send it to The Navigator over a remote desktop. I mounted a NUC with 2 x 3-cell LiPos in series for ~22v to run everything off; it works fine but battery life is short.

Portugal
#56   — Edited

For now I use the Roomba's battery only for the Roomba and have 4 x 3.7V 3500mAh MR18650 cells for the rest. The Intel stick is surprisingly fast, and the nominal consumption is around 1.2 amps (just the stick). All together I get around 2 hours of play. My next step is to buy a li-ion with 5500mAh for the Roomba and run everything from there. I am currently working on obstacle avoidance when using The Navigator's way points.

#57   — Edited

With my Roomba I am using the Roomba battery to run the EZB4 (or Tiny IO) with the EZ camera and Terminator brain LED lights; it makes only a 5-minute reduction in the Roomba's runtime with the vacuum and brush motors on (1 hour 15 minutes). The chrome skull is fairly heavy, about 8 pounds, so I bet without it, it would get at least 1 hour 45 minutes.

PRO
Canada
#58  

Moved from the Roli to the Roomba. It zig-zags a lot when going to way points. Do other people have that problem? User-inserted image

Where is the best place to position the RealSense? I did play with the new offset a bit.

PRO
Synthiam
#59  

Adjust the speed to be a little slower. The position is fine. Use a measuring tape for the offset from the center.

I have a todo to add speed adjustment settings for moving and turning. It'll come in time :)

Portugal
#60  

Yes, I have it too. As DJ suggested, go slower for now.

#61  

My realsense is on the way, so I will be joining the fun soon.  I just need to build a little trailer or mount so my Roli can carry a windows tablet.

I'll just use ultrasonic for object avoidance for now.  Probably upgrade to Lidar at some point though.

Alan

PRO
Synthiam
#62  

Woooo - I have a few updates planned for the NMS as well.

PRO
Canada
#63  

Love to see stereo camera support, and a way to test if we are in the exact location and get back on track if not, with a glyph or something like that. D435 would be nice as well :-)

PRO
Synthiam
#64   — Edited
  1. what do you mean by "stereo camera support"? The t265 does not have a camera useful for tracking or any visualization. The t265 cameras are used for identifying points for vslam. See the earlier posts where I displayed the camera video

  2. What do you mean by "test if we are in exact location"? The T265 returns a location pose (x,y,z) relative to the starting location when initialized. As per the instructions, your robot will always need to start at the same starting location as this is how the T265 was designed. Synthiam doesn't make the T265, it is designed by Intel.

PRO
Canada
#65  

1. The two fisheye cameras. If you load up the Intel tools you get the view from the two cameras. Would be nice if we could use them.

2. Having to line up the robot is not really practical in the real world. If the robot does not know where it is, there should be a way to auto align to a known way point via some method, i.e. look for a glyph and go "OK, I know where I am now," and then choose a known path from that waypoint to the new destination.

Portugal
#66   — Edited

You can't use the T265 cameras as they are only for vslam. The charging station is the starting point for my robot.

PRO
Synthiam
#67  
  1. this is what the fish eye looks like using the intel sdk. What would you want to do with that?

  2. My advice is to have the robot search for a glyph and align itself, then start the intel t265 tracking. There's no way to modify the tracking data. The data returned by the t265 is the data from the sdk made by Intel. The alignment of the robot in the real world is quite practical because that's how it's being used. The robots start from a docking station and begin navigating from there.

There's also a huge bug with the intel sdk, where you can't start, stop, and start the driver again. ptp made good points about working with Intel being difficult because they abandon projects easily; he's accurate. What we have from the t265 is about as good as it'll get.

The one thing I can sort-of think of is having to apply an offset to the t265 coordinates to "re-align" itself to the home position. Or I guess, to any position that you specify. But, you'd have to be REALLY REALLY accurate on whatever offset you provide. I can make a ControlCommand that allows you to specify the offset, which essentially would be the "new position". Then, if your glyph is at 100cmx100cm @ 90 degrees heading from home, you can specify that is where the robot is and it would re-align itself to that position. Now, remember that the degrees also matter to the t265. If the degrees are off, then the t265 will be off.

This is also why I am much more fond of indoor positioning from external sensors (ie cameras) to know where the robot is. You'll notice that even game console (ie Wii, Sony PlayStation) motion controllers use a camera for tracking alongside inertia sensors. This is because, unlike for a human or animal, having a robot know its position is somewhat of a philosophical discussion.
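
For what it's worth, the math behind such a re-align offset is just a rigid transform. A hypothetical Python sketch (the function names are illustrative, not a skill API): given the pose the T265 currently reports and the pose you know the robot is actually at (e.g., from the glyph), compute the transform once, then apply it to every subsequent pose:

import math

def make_offset(known, reported):
    # known/reported are (x, y, heading_deg); returns the rigid transform
    # (dx, dy, dtheta) that maps the reported pose onto the known pose
    dth = math.radians(known[2] - reported[2])
    c, s = math.cos(dth), math.sin(dth)
    rx = reported[0] * c - reported[1] * s
    ry = reported[0] * s + reported[1] * c
    return (known[0] - rx, known[1] - ry, dth)

def apply_offset(pose, off):
    # Correct a raw T265 pose with the stored transform
    x, y, deg = pose
    dx, dy, dth = off
    c, s = math.cos(dth), math.sin(dth)
    return (x * c - y * s + dx, x * s + y * c + dy, deg + math.degrees(dth))

As noted above, any error in the supplied position or heading gets baked into every pose that follows, which is why the measurement has to be extremely accurate.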

PRO
Canada
#68  

Stereo Camera access was to triangulate glyph if that makes sense.

Portugal
#69  

That start, stop and start issue rarely happens to me. A camera reset button and some message output would be nice to have in the T265 plugin.

PRO
Synthiam
#70  

@nink, did you watch the video? that camera video feed will never ever detect a glyph or anything. Their SDK does not provide a usable image. Also, the fish eye angle would not work with a glyph so that'll be out.

@proteusy, what do you mean by a camera reset button? The issue with the Intel SDK is that when you stop the t265, it cannot be restarted without restarting ARC. It is a bug in the intel sdk, not arc's robot skill. I don't have control over the t265 driver and their hardware problems as it's a product made by Intel. We get the brunt of their bugs :(

Portugal
#71  

I know what you meant. The reset I was talking about is the factory reset to erase the slam data from the camera.

PRO
Synthiam
#72  

I couldn't find anything like that in the sdk either. It's quite incomplete, as ptp predicted for an intel product. We got what we got I guess :)

Portugal
#73   — Edited

@DJ, I found this:

rs2::config cfg;
rs2::pipeline pipe;
cfg.resolve(pipe).get_device().hardware_reset();

https://github.com/IntelRealSense/librealsense/issues/4113
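
The same reset is exposed in the Python bindings for anyone who wants to test it outside ARC; a minimal sketch, assuming pyrealsense2 is installed:

import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    print(dev.get_info(rs.camera_info.name))
    dev.hardware_reset()  # power-cycles the device over USB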

PRO
Canada
#74  

FYI here is an example of the T265 fisheye camera using Fiducial (essentially a glyph) to detect location.  https://github.com/IntelRealSense/librealsense/tree/master/examples/pose-apriltag

PRO
Synthiam
#75   — Edited

Neither of those work with the c# .net wrapper. Maybe one day I can look into converting the c++ to c#. That's just a lot of work for right now. I'll add it to the list :)

PRO
Synthiam
#76  

Added config option to send fish eye data to the specified camera device

*Note: requires ARC 2021.02.08.00 or greater

Portugal
#77  

Works great DJ, thanks. Now to see if we can do some glyph detection.

PRO
Canada
#78  

Nice work! I noticed when I was reading the Boston Dynamics Spot documentation on GraphNav that it uses a similar stereo fisheye camera and IMU (although it has 5 of them: two front, one on each side and one rear). It aligns to a fiducial (up to 4m away) and then recalibrates its position. The T265 works great for our wheeled robots running around the living room, but as soon as you get to larger legged robots moving long distances, they drift as they move further away from the start location. In order to recalibrate, a fiducial of known size is required so the robot can triangulate its new position.

This seems like a really primitive way to do this, but environments change (a pallet moved in a warehouse, etc.), so you need something guaranteed to lock on to so you can move along the next edge to the next way point. We need a better solution than this in the future to deal with drift, the need to recalibrate, and the need to enter into a known path. I guess ML is the long term answer, but in the short term the world will be filled with fiducials.

About 11 min mark onwards.

PRO
Synthiam
#79  

It’s interesting that with so much funding their solution was to spend more money by adding more sensors - because that’s how large the problem is to solve.

Put it this way: the problem of a robot knowing where it is in the world is so big that Boston Dynamics merely threw a bunch of redundant vslam at it, LOL. And most likely a bias-shifting complementary filter fusing the sensor data. The sensor that starts to drift first would begin getting a lower weight in the fusing calculation.

Anyway - the much larger conversation on this topic is not about sensors, but AGI. Artificial general intelligence.

Even if you close your eyes and I move you to a new location, you’ll have no idea where you are. The robot needs some level of self in relation to the world to create an adaptive internal model. Like how we do :)

PRO
Canada
#80   — Edited

I have spent a bit of time playing with this. It is really good! I like the way it follows the path you set to get around obstacles. I do get a bit of drift the longer it is away from home.

Question: is there a way to tell the robot it is currently in a certain location when it drifts? I start in the docking station and turn the RealSense on. Then I tell the Roomba to move back about 1 foot from the docking station and then give waypoint voice commands. It knows where it is because I started the RealSense when it was docked. Now it runs around my house but starts to drift. I guess I could send it back to a point in front of the docking station, dock using the Roomba's Seek Dock ControlCommand, stop the RealSense, start it again, back up one foot again, etc., but it would be good if there was a way to tell the RealSense: I am here; please reset your tracking location to here now.

We talked about a glyph as being a way to fix this with the camera. Maybe if we had 2 cameras working we could triangulate and get a position. I thought about putting some lines on the ground in various locations that the robot could follow: when it crosses them, it starts to follow and finds a dash code, etc., to recalibrate, using a sensor like this: https://cnclablb.com/adjustable-line-tracking-sensor-module.html   But I think what I need is a way to send a command to The Navigator to say HELLO, I AM HERE. Maybe something like ControlCommand("The Navigator", "ResetWayPoint", "Kitchen");

PRO
Synthiam
#81  

@Nink, I mentioned previously that I'll add a ControlCommand() for you - it is on the list todo. However, I don't think it's going to help you and will cause much larger challenges to face. But that being said, I'm interested in how you'll overcome them.

You see, the T265 requires a cartesian X/Y and Heading Degree for mapping. This means you will have to specify the absolute exact coordinate (including heading degree). I'm sure there's a way you can calculate it with a glyph size and such. But be forewarned that it won't be as simple as you may think.

*PS Note: moved the above comment into this thread because it's regarding the Intel T265 and not a robot build.

PRO
Canada
#82  

OK thanks @DJ. BTW, I tried telling the robot to go to a spot close to the dock, then seek the dock, then stop the RealSense camera and start it again to recalibrate, but after a couple of times doing this it will just hang. If I unplug the RealSense, wait 30 seconds and plug it back in again, it will work when you tell it to start again.

Question: Can you disable and enable a USB port in ARC to reinitialize the driver, or can you call a batch file to potentially use something like devcon to disable the port in Windows and then re-enable the USB port so it resets each time you stop and start it?

User-inserted image

Portugal
#83  

I use the same method. I found this code, dunno if it helps...

https://github.com/ralight/usb-reset/blob/master/usb-reset.c

Portugal
#84  

By the way, I have a Ryzen 5 with 16GB onboard my Roomba, and the T265 does not drift anymore.

PRO
Canada
#85  

Thanks. I am not sure how to execute that code from ARC. I did a Google and found I can disable and enable with the PNPUTIL command, but Windows wants to reboot in between (I guess they forgot what the PnP in PNPUTIL stands for). Example for my device: pnputil /disable-device "USB\VID_8087&PID_0B37\952322111114"
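
If it helps, those same two pnputil calls can be shelled out from a short Python script (run from an elevated prompt; as noted, some Windows builds still insist on a reboot between disable and enable). The instance ID is whatever Device Manager reports for your unit:

import subprocess
import time

DEVICE_ID = r"USB\VID_8087&PID_0B37\952322111114"  # your instance ID here

subprocess.check_call(["pnputil", "/disable-device", DEVICE_ID])
time.sleep(5)  # give the device a moment to power down
subprocess.check_call(["pnputil", "/enable-device", DEVICE_ID])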

I could always just create a USB hardware add-on that allows me to turn it on and off with a relay (this is actually starting to sound like the easiest method).

With regards to drift, I played with the offset for a while and concluded -160 was a good level (when I go into a spin it doesn't create a donut), but yeah, drift is still a problem.

The Ryzen 5 must chew some power. What are you using for a battery? I have blown out 4 LiPos running all this stuff (my 4S is now a 3S, and my 3S's are now 2S's). I have now switched back to an old sealed lead battery and plug in when docked so the computer is always on. This seems to work OK and I get about 2 hours of play time. I am going to add some springs and magnets so it will auto charge (just a matter of finding parts) this week.

I really want to get this as autonomous as possible.

Portugal
#86  

Great idea with the relay. The code is just an example I found; I think it is written in C. My setup with this mboard is recent and I am still in the build phase. I use 2 Xunzel SOLARX-8 gel batteries in parallel.

PRO
Canada
#87  

Heh, I found out how to fix the camera hanging when you turn the RealSense off and back on: put a 1 minute delay in the script between turning it off and on. Sometimes it is quick, sometimes slow, but 1 minute seems to work all the time for me. More testing.

I need to work on distance sensors, maybe laser sensors, as these ultrasonics are garbage. It will smash into something or see something that is not there. Also, I think we need a way to navigate with individual sensors: if something is towards the left, turn right; towards the right, turn left; directly in front, stop. When you have sensors directly on the side, you really don't care as much about the distance from the wall beside you; also, when you are going fast they don't work as well as when going slow.

I also really need to add a panic button off an Arduino accelerometer to shut off roomba battery if it starts to go crazy. That seems to happen a lot :-)

Portugal
#88   — Edited

Nink, I use the Roomba's built-in IR sensor array to avoid obstacles coming from the left, right or front and try to navigate past them. It is very crappy code and is still in "beta" but it works.

import time

# (ARC variable, direction to turn away, turn time in ms) for each Roomba
# light-bumper zone. A hit on the left steers right and vice versa; the
# center zones get the longest turns.
bumpers = [
    ("$RoombaLightBumperCenterLeft",  "right", 3500),
    ("$RoombaLightBumperCenterRight", "left",  3500),
    ("$RoombaLightBumperFrontLeft",   "right", 3000),
    ("$RoombaLightBumperFrontRight",  "left",  3000),
    ("$RoombaLightBumperRight",       "left",  2000),
    ("$RoombaLightBumperLeft",        "right", 2000),
]

counts = {}  # consecutive hits seen per sensor

while True:
    hit = False
    for name, direction, ms in bumpers:
        if getVar(name) == 1:
            hit = True
            # Obstacle detected: pause The Navigator and count the hit
            Navigation.SetNavigationStatusToPause()
            print name
            counts[name] = counts.get(name, 0) + 1
            if counts[name] == 2:
                # Two hits in a row: steer away, drive past, resume navigating
                if direction == "right":
                    Movement.Right(30, ms)
                else:
                    Movement.Left(30, ms)
                Movement.Forward(30, 1500)
                Movement.Stop()
                Navigation.SetNavigationStatusToNavigating()
                counts = {}
    if not hit:
        # Nothing detected this pass: make sure we are navigating, clear counts
        Navigation.SetNavigationStatusToNavigating()
        counts = {}
        time.sleep(1)
PRO
Canada
#89  

Wow, great idea; didn't even know the Roomba had a sensor array. I thought it just had a microswitch-activated bumper. Cool.

PRO
Canada
#90  

Spent some time playing with the sensors on the Roomba; wow, what a wealth of data! Makes you wonder why the original Roomba smashes the furniture with all these sensors on board.

Thanks for the sample code and your idea to navigate around stuff using the onboard sensors, @proteusy; very cool. I had a play, and if you run around really slowly you can avoid some stuff, but it still managed to knock over my green screen, dive under the treadmill, smash into my chair and get tangled up in the power cords under the desk. How a Tesla navigates at 100 km an hour blows me away.

That said, I think I need to add a bunch more sensors, a LIDAR and lots of code to make sense of all this data. If the wheels are spinning and it is not moving, I am probably tangled up or have smashed into something.

PRO
Synthiam
#91  

Updated manual to include instructions suggesting to upgrade the firmware

PRO
Synthiam
#92  

Updated with a dropdown to select from multiple realsense devices.

PRO
Synthiam
#93   — Edited

Fixed Intel's bug so the device can be started and stopped without needing to restart the project.

Also, this robot skill will only display tracking cameras in the drop-down.

PRO
Synthiam
#94   — Edited
  • New ControlCommand for resetting position and setting specific positions (i.e. recalibrate)

  • Bug fix and small performance improvement

PRO
Synthiam
#96  
  • Added a button to reset position to zero rather than having to use the control command
#97  

@DJ,

Do you think if I place the T265 in the chest of an InMoov it will work correctly, or does it need to be at ground level?

Thank you, Cheers

PRO
Synthiam
#98  

I'm guessing it will work fine. It can't have anything blocking it, so no hands can be in front. Otherwise, it won't know where it is when the robot drives around.

#99  

Thank you DJ, I will try it soon.

PRO
Synthiam
#101  

Quote:

when it came to the roughly 10 customers who bought RealSense products in relatively low volumes. There were only a few that were buying multiple units, like 10 to 40 at a time

well I feel deceived. I was told directly in a zoom call by the realsense cto that customers were purchasing thousands at a time for volume discounts.

#102  

Looks like Xiaomi will need to re-design their new CyberDog. That was clearly a RealSense on its face: https://www.theverge.com/2021/8/10/22618043/xiaomi-cyberdog-robot-dog-quadruped-specs-price

PRO
Canada
#103  

There are 5 RealSenses in every Boston Dynamics Spot. I hope they stocked up.

PRO
Synthiam
#104  

Boston dynamics bought at least 40 lol O_o

It's amazing the poor ethics of businesses to lie about their sales volume - boom, you've been busted

PRO
Canada
#105  

I believe BD has only sold about 400 Spots worldwide. Several other dogs use them as well, but not in volume. They are also used in various restaurant food service bots, delivery bots, warehouse bots, etc., but again not in volume.

I am surprised they did not sell the IP to a robot company or spin it off to a startup for a royalty fee on every sale. There is probably value for a small startup to continue to sell and develop with an existing customer base.

This makes me wonder if there is some other issue (China manufacturing, chip shortage, shipping issues, etc.)

PRO
Synthiam
#106  

Boston dynamics sold 400? Or 4?

#107  

Thought I'd pass this along... it was mentioned in another group. A company called Stereolabs makes a similar product called the ZED. Might be worth a look?

PRO
Synthiam
#108  

I do not believe there is a similar product on the market for navigation & mapping. The other devices (i.e. the Stereolabs ZED) are depth-sensing cameras and therefore require external applications for navigation/SLAM mapping.

If the comparison is depth camera functionality, then there are other options. But feature to feature, there aren't any.

Portugal
#109  

I wonder why, and why now. Just last week Xiaomi released its robot CyberDog that will storm the markets at a very low price. They use a D455 RealSense camera. I wonder, will they sell the RealSense technology to Xiaomi? But as always, the tech market is what it is and always evolving. I just hope the prices drop and I get some cameras.

PRO
Synthiam
#110   — Edited

I think the robot dog market peaked. Even at a lowered price, value can't be added to something with no purpose. I doubt a robot dog company will generate enough profit on revenue to justify the high cost of owning that technology, especially for a single product function.

It's not financially feasible for a product company to own the technologies. The only company that I think came close to buying the guts of RealSense would be Boston Dynamics - but that ship has sailed since they can't figure out a product/revenue model. They're like the Honda Asimo team: being used for marketing, and they will fade away soon. Their videos are receiving less attention. People now want to know: what's it for? What does it do?

RealSense IP will most likely remain in Intel's hands and end up in their dusty museum like everything else. Intel isn't a company known for selling or sharing IP.

I don’t think it’s all gloom and doom. The removal of realsense from the market will most likely stimulate a demand for a similar product line.

#111  

I think the robot cyberdog would make an awesome security guard, patrol and alarm system. That could be a market for them to explore.  However you would have to have two of them. One to watch the other so it wouldn't get stolen. LOL.

PRO
Synthiam
#112   — Edited

Security patrol robots have been attempted for the last 40 years. They aren’t effective because

  1. criminals don’t get scared by a robot with a camera.

  2. cameras already exist and can be mounted on buildings (cheaper)

  3. hiring humans means they can chase or discourage or offer help etc etc etc. humans do way more than a robot. Such as clean and check doors to see if they’re locked. Oh man there’s a million reasons why a human security guard is better than a robot

  4. humans don’t need to recharge their batteries every 25 minutes :)

  5. humans don’t get stuck on a pebble in the parking lot

  6. humans can get wet in the rain

  7. thieves most likely Won’t shoot a human. But they’ll hit a robot with a bat

#113  

LOL DJ. Great list. Mostly love #4. You should have met some of the humans I've worked with over the years. xD

PRO
Synthiam
#114  

Ha ha - that’s so true. Robots have their place, it’ll just take a bit more time :)

PRO
Canada
#115   — Edited

Interesting news today: it looks like all the RealSense stereo depth (D series) cameras that we know and love will continue to live on!

LinkedIn post by Steve Crowe. Unfortunately, the T series (tracking cameras) is EOL.

PRO
Canada
#116   — Edited
PRO
Synthiam
#117  

Updated to the latest Intel library - which they say has bug and performance fixes

PRO
Colombia
#118  

Hi DJ, for the T265: when starting ARC, most of the time I need to erase the skill and install it again for it to recognize the camera. Is this something already identified, or is it an isolated case? Thanks

PRO
Synthiam
#119   — Edited

Yeah, that's one of the most significant issues with this device. You can ping intel about it, but they haven't seemed to act on those complaints. Super frustrating, right?

It doesn't seem to be an issue if it's the only intel realsense connected. It seems to happen when there are two realsense devices connected.

PRO
Colombia
#120  

OK good to know that. Thanks DJ.

PRO
Synthiam
#121  

v26 updated

  • hides an error from Intel's driver that sometimes raises when shutting down the camera

  • displays coordinates in the status every 100 updates

PRO
Canada
#122  

Does anyone know where to buy the Intel RealSense T265 now? It looks like this is one of the sensors that got the axe from Intel, unfortunately. I was looking forward to having one of these to mess around with.

Does anyone know if there is a direct replacement for this sensor in the works?

User-inserted image

User-inserted image

PRO
Canada
#123  

Wow, over $800 CAD on eBay. I guess when we heard they were being discontinued we should have stocked up. It's disappointing that once again Intel left us high and dry. These are used in several commercial robots, so I assume they have grabbed whatever supply is on the market.

Perhaps it’s time we had support for another product.

PRO
USA
#124  

Here you go:

https://www.amazon.com/s?k=Intel+Realsense+T265&crid=SCY3NZM4N1NY&sprefix=intel+realsense+t265%2Caps%2C467&ref=nb_sb_noss_1

PRO
Synthiam
#125   — Edited

Hey - who called it? :) I shouldn't be happy about being right, but I saw the writing on the wall. The robot industry is code first, usability last. I met with the CTO of RealSense and told him that we could give them tens of thousands of customers. He said no thanks; he'd prefer to focus on ROS. Well, how did that turn out? Sorry, but I don't see anyone doing anything with ROS. Well, I should clarify: I don't see enough people doing anything exciting or reproducible. I think they sold 300 Intel RealSense tracking cameras. If marketed, priced right, and bundled with ARC, they would have sold tens of thousands or more. That might not be the millions of units they were hoping for, but it is an excellent start for a business case.

Someone writes a post that's twenty-eight pages long about moving a servo with ROS. If it takes 35 weeks to learn how to drive a servo with ROS, how many Intel RealSense cameras do you think they will sell? For some reason, there's this conflicting issue with robotics and businesses. A business is about decreasing internal costs and increasing revenue - the difference between the two is profit. The CEO wants the most efficient and cost-effective solution for any business function. However, with robot companies, their CEOs are CTOs. So, the result is "a difficult robot system means a difficult business system."

How do you run a successful business like that? You're telling me that using software like ARC that is easy, quick, and extensible is not professional? Because to be professional, you need 38 guys in a room writing 5 million lines of code to "test" a prototype that will be thrown out in a month. <- this is the unfortunate reason for "why we can't have nice things" with robot products.

Take a look at Boston Dynamics, for example. @Nink, you posted the documentation, so I took a look. My head was spinning, haha - no wonder they can't find a business model. The darn thing is impossible to use without a Ph.D. engineering team. And boy, don't Ph.D. engineering programmers have the best business ideas? Hahaha, sarcasm. Inventions are fueled by creativity - and creativity doesn't involve complexity. So, therefore, simplicity inspires creativity. Want new products or business models? Give the responsibility to creative teams, not engineering teams.

Anyway - if you read the NMS manual, you don't need a t265 for The Better Navigator. You can use other options for providing a pose hint.

<rant over> lol

PRO
USA
#126   — Edited

https://www.amazon.com/Intel-Realsense-T265-Webcam-USB/dp/B07QFWQ1VS/ref=sr_1_1?crid=SCY3NZM4N1NY&keywords=Intel+Realsense+T265&qid=1648428945&sprefix=intel+realsense+t265%2Caps%2C467&sr=8-1

17 left

PRO
Synthiam
#127  

@EzAng wow, $325 USD is like $9 million Canadian

Portugal
#128  

Hmm, why focus on ROS and not on ARC? That is strange; I wonder what the reasons are...

PRO
Canada
#129  

@EzAng That Amazon link doesn't ship to Canada, so you need a US redirect address like MyUS + import duties + additional shipping, so I guess about $500+ Canadian. Perhaps if someone wanted to buy all 17 they could resell them for $700 CAD. The cheapest I could find them on eBay was $700 Canadian from China, but stuff from China takes months now. Not sure if there would be demand though.

PRO
Colombia
#130  

Hi DJ, is it possible for the skill to keep the latest serial number connected, so I can start the device programmatically with the start command and not have to select it manually each time? Thanks.

#131  

We can look into adding that ability if possible. I can see how it could be useful if the Intel product supports it. Thanks for your feedback.

#132  

Would appreciate some input. I have a small tracked vehicle that I want to get around on its own. I have an RPi 3+, Arduino Mega and Uno, an ESP32 dev board, an EZB-v4, a T265, encoded motors, ultrasonic sensors, and a dual H-Bridge. What would be the best way to get the T265, sensors, and H-Bridge talking to one another? Or, what would you do? Thanks

PRO
Canada
#133  

Hi @DSBaily.

The piece of the puzzle you are missing is a Windows single board computer like a Rock Pi X or similar. You really want the T265 plugged directly into a USB port on the PC, so the PC really needs to be on the vehicle.

You can also connect the EZB directly to the PC, or you can use an Arduino either via a serial connection or an ESP32 wirelessly.

#134  

Thanks Nink

That is what I thought. I may use sonics and encoders and see how well it does.

Thanks again. Scott

PRO
Synthiam
#135  

You can put windows on the RPI 3+, I think. But either way, the T265 should be combined with a 360-degree lidar or depth camera. The T265 is okay but not nearly as good as using an actual Lidar, IMO.

As for having the robot move, you need to add the HBridge PWM Movement Panel to your project. But I would first recommend following the getting started guide, which introduces what a Movement Panel is, because a Movement Panel is how the robot moves. Once a Movement Panel is added, it registers itself into the ARC framework, and then all other robot skills can move the robot (see the sketch below). Here's an excellent place to start: https://synthiam.com/Support/Get-Started/how-to-make-a-robot/make-a-diy-robot
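
Once a Movement Panel is registered, any script or robot skill can drive the robot through the shared movement API - the same calls the obstacle-avoidance script earlier in this thread uses:

# With an HBridge PWM Movement Panel added, these calls move the robot itself
Movement.Forward(30, 1500)  # speed, duration in ms
Movement.Right(30, 1000)
Movement.Stop()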

#136  

I do have a Slamtec RPLIDAR A1M8 2D 360-degree 12-meter scanning lidar; will that work? I think I tried it last year but never got it to work. My large robot uses the T265 and D435 with a LattePanda and works well. Trying to use what I have in stock.

PRO
Synthiam
#137  

Here’s the robot skill for the rplidar a1: https://synthiam.com/Support/Skills/Navigation/RPI-Lidar-A1?id=20980

if using the rplidar, there's no need for the t265. I find The Better Navigator and the rplidar together are enough.