Intel RealSense T265 Adventures-Bot Part 3

DJ Sures

Canada

This is a test with the Intel RealSense T265 tracking camera for localization with the EZ-Robot AdventureBot. I use 3 ultrasonic sensors as well to demonstrate the navigation messaging system, but that's not the full point of the video. Ultrasonic sensors are terrible for mapping :) A 360-degree lidar will do a much better job. I'll get to that in the future... but for now, we're playing with localization and waypoints.

So, the fact that this robot can get back to exactly where it started and the RealSense maps that position... wow, I have to say wow! My USB 3.0 active extension cable is only 16 feet long, so that's as far as I can go without putting the RealSense in the robot. For the time being, this is how we test. It's also a good test because the wheels slip like crazy, so it demonstrates why wheel encoders are bad news bears.

Preparation for Part #4: removing false positives and filtering the data from the lidar, then adding a SLAM algorithm that treats detected objects as static only for a certain period of time. This allows things in the way, like humans, to move through the scene without becoming part of the map. I wanted to make a robot that can navigate around my bedroom to each nightstand, closet, and doorway, so I used a combination of the 360 lidar and 3 ultrasonic distance sensors to scan my bedroom. I threw a bunch of clothes on the floor to simulate obstacles, which you can see around the closet.

User-inserted image
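As a rough illustration of the "detected objects are only static for a certain period of time" idea, here is a minimal Python sketch of an occupancy grid whose cells expire unless the sensor keeps re-confirming them. The cell size, time-to-live, and method names are assumptions made up for the example, not the actual ARC implementation.

```python
import time

class DecayingObstacleGrid:
    """Occupancy grid where cells expire unless a sensor keeps re-confirming them."""

    def __init__(self, cell_size_cm=5.0, ttl_seconds=10.0):
        self.cell_size = cell_size_cm       # assumed grid resolution
        self.ttl = ttl_seconds              # assumed lifetime of an unconfirmed obstacle
        self.cells = {}                     # (col, row) -> time of last detection

    def add_detection(self, x_cm, y_cm):
        # Quantize the detection into a grid cell and refresh its timestamp.
        key = (int(x_cm // self.cell_size), int(y_cm // self.cell_size))
        self.cells[key] = time.monotonic()

    def occupied_cells(self):
        # Drop cells that have not been re-observed within the TTL, so a person
        # who walked through the room fades out of the map instead of staying.
        now = time.monotonic()
        self.cells = {k: t for k, t in self.cells.items() if now - t <= self.ttl}
        return list(self.cells.keys())
```

Anything permanent (walls, furniture) keeps getting re-detected on every scan, so its cells never expire; a passing person stops being detected and drops out after the TTL.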

On closer inspection, you can see the waypoints and the navigation paths I took to create them. I'm implementing a path-finding algorithm right now for the next update, which splits the map into quadrants based on your defined robot size. It then finds a path using your robot's width.

User-inserted image
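For anyone curious what a planner like that might look like in its simplest form, here is a hedged Python sketch: a breadth-first search over a grid whose cell size matches the robot's width, so any route it returns is wide enough for the robot to fit. The function name and the `blocked` set of obstacle cells are assumptions for the illustration, not the actual ARC planner.

```python
from collections import deque

def find_path(blocked, start, goal, cols, rows):
    """Breadth-first search over a grid sized so one cell equals the robot's width.

    blocked: set of (col, row) cells the robot cannot enter (obstacles grown by
    half the robot's width), so any path found is wide enough to drive through.
    """
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        current = frontier.popleft()
        if current == goal:
            break
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (current[0] + dx, current[1] + dy)
            if not (0 <= nxt[0] < cols and 0 <= nxt[1] < rows):
                continue                      # stay inside the mapped area
            if nxt in blocked or nxt in came_from:
                continue                      # obstacle, or already visited
            came_from[nxt] = current
            frontier.append(nxt)
    if goal not in came_from:
        return None                           # no corridor wide enough for the robot
    path, node = [], goal
    while node is not None:                   # walk parent links back to the start
        path.append(node)
        node = came_from[node]
    return path[::-1]
```

Swapping the plain queue for a priority queue keyed on distance-plus-heuristic turns the same skeleton into A*.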

Here it is with filtering to remove false positives. This gives the path-finding system fewer false positives to worry about.

User-inserted image
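One simple way to get that effect, sketched below in Python under the assumption that detections have already been quantized into grid cells: require a cell to be hit several times before it is reported to the path-finding system, so a single stray lidar or ultrasonic return never becomes an obstacle. The threshold of 3 hits is an arbitrary value chosen for the example.

```python
from collections import defaultdict

class HitCountFilter:
    """Report a cell as an obstacle only after it has been detected min_hits times."""

    def __init__(self, min_hits=3):
        self.min_hits = min_hits            # assumed confirmation threshold
        self.hits = defaultdict(int)        # (col, row) -> number of detections

    def add_detection(self, cell):
        # Returns True once the cell has been seen often enough to trust it.
        self.hits[cell] += 1
        return self.hits[cell] >= self.min_hits

    def confirmed_cells(self):
        # Only confirmed cells are passed on to the path-finding system.
        return [c for c, n in self.hits.items() if n >= self.min_hits]
```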

Part #3: We give waypoints names.

Part #2: In this video, I demonstrate multiple waypoints and controlling them through control commands.

Part #1: Just testing this Intel RealSense T265 and seeing how it works with the ARC NMS (Navigation Messaging System).



PRO
USA
#1   — Edited

Weren't you guys developing a lidar SLAM module? I remember reading you were having trouble with a supplier or something back in 2019. Are you going to develop the software around an existing piece of hardware? If so, which one?

This is getting exciting.

PRO
Synthiam
#2  

We were working with Hitachi/LG on a lidar, but the cost and volume were terrible. The biggest issue we found was that we couldn't get any SLAM software to work correctly. They would all go out of sync after some time, and the maps would warp and twist.

We gave up because there was no real way for us to get positioning...

Until now! The RealSense seems to solve that.

As for a 360 lidar... it really doesn't matter what we choose. The new Navigation Messaging System in ARC makes any sensor at all work. Absolutely anything will work.

PRO
USA
#3  

Awesome! So excited!

#4  

Wow, looks like I may be getting that RealSense cam. Very exciting news for navigating!

PRO
Portugal
#5  

Spot on, DJ! Any chance of getting the odometry from the Roomba into The Navigator?

PRO
Canada
#6   — Edited

If anyone finds a good low-cost lidar that works, I would love to hear about it. There are a few at RobotShop for under $100 US, but I don't want to waste money on something if it doesn't work very well.

PRO
USA
#7  

Very good part 1.

Thanks,

EzAng

PRO
Portugal
#9  

Lol. Will there be any conversion factor (cm/s), or do you just push that raw, sloppy data straight in? :)

PRO
Synthiam
#10  

Haha - well, the conversion will be standard across all Roombas because it's the number of ticks per wheel rotation. I don't recall it off the top of my head, but it's published in their documentation.
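For reference, a minimal sketch of that conversion in Python. The constants below are taken from the iRobot Create 2 Open Interface spec (508.8 encoder counts per wheel revolution, 72 mm wheel diameter); verify them against the documentation for your particular Roomba model before relying on them.

```python
import math

# Create 2 Open Interface values; check your own model's documentation.
TICKS_PER_REV = 508.8       # encoder counts per wheel revolution
WHEEL_DIAMETER_MM = 72.0    # drive wheel diameter

MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV   # roughly 0.44 mm per tick

def ticks_to_cm(delta_ticks):
    """Convert a change in encoder counts into distance travelled, in cm."""
    return delta_ticks * MM_PER_TICK / 10.0
```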

PRO
Synthiam
#11   — Edited

Updated with video #2 of the series demonstrating waypoints. Look at the top of this robot description for Part #2 of this series.

#12   — Edited

Is that RealSense camera readily available? So many of the web companies I want to buy from have delays and back orders; I can never get what I want. I really like how clear that camera is, too!

Edit: Oh, I see that video is just the EZ Cam, right? LOL! It looks good when there's lots of light in the room! The RealSense is just providing the radar-map part.

PRO
Canada
#13  

Looking forward to part 3. Is the LattePanda or Rock Pi X powerful enough to run this so you can cut the cord? I'd love to see some obstacle avoidance in the next demo as well.

With colour ball tracking and a few of these bots that know where their goal posts are, a game of RoboCup would be fun (although a little expensive; I guess we need to find a cheaper sensor).

If I were a real estate agent selling condos during Covid, I would be all over this. An upgraded Roomba robot sits charging in the corner; you click on the floor plan and the robot drives into the room. You can take over and do some basic controls (access the camera controls to look around), then click on another area of the floor plan and off you go. Let the buyers keep the robot when they buy the condo as a telepresence / security robot / vacuum that actually works and doesn't smash into the furniture.

PRO
Synthiam
#14  

The location processing is done within the sensor, so the LattePanda or any SBC would work with this. It just needs USB and the respective robot skills.

#15  

I REALLY enjoyed the vids, DJ! It gave me much relief that you had experimental results similar to what I have seen this sensor is capable of. You never know how something will work until you test it in a given environment... so it's really great to see it working for someone else, and integrated with the other skills too!

It's also great to see it working without the encoders for someone else... which is how I plan on using it too. The SDK supports integrating encoders into the SLAM algorithm on the sensor, so this would be a great (but not necessary) addition to the skill in a future version. It would probably improve the accuracy even further when trying to drive correct paths, especially in visually uninteresting areas. I have not been a fan of encoders in the past either, but from everything I've read, they can play an important part in SLAM, in particular as one part of the pose probability calculation. For me, so far, I am happy with the accuracy without encoders in my office. From what I have read, the lack of encoders can become more of an issue when the bot is moving through hallways and other areas that have fewer interesting visual features to focus on (at multiple angles). Does my assessment seem accurate?

This is all exciting stuff. I am looking ahead to your next prototype and video. Here is a bit of a wishlist of sorts. I would think others like me will want to mount the T265 on a moving head, likely pan-and-tilt. This complicates the whole nav thing if the head is turned away from and changing relative to the driving path... maybe that is already handled somehow. I am trying to integrate a compass into this mix, which may not be necessary in the short run but I think is desirable for me and others. I have used one in the past to keep a bot on a proper heading while driving or between waypoints. Do you have any thoughts on compass integration? In future discussions, once more people get on board with the T265, my guess is we will all start talking more about integrating depth cams and persistent 2D or 3D grid maps. Is that the path forward as you or others see it?

#17  

You are of course right about the compass accuracy and all that, and the need for some filter. I'd also agree the T265 has to be the very dominant partner in the calculation. I think the issue I am having is that the axes on the T265 (specifically the direction of the yaw axis at 0, its "north") are arbitrary... it's whatever the bot was pointed at when it last reset its map (which, by default, is when the device powers up). I keep feeling like I need to correlate the two. So far, I have been taking a bunch of compass readings on startup (with the face forward and level) and averaging them, so I can use this to translate from T265 to world. Perhaps there is a better way.
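For what it's worth, that startup-averaging idea can be written down in a few lines of Python: take a handful of compass samples while the bot is stationary, compute a circular mean (so readings near the 359/0 wrap don't average to nonsense), and treat the result as a fixed offset between the T265's arbitrary yaw frame and world headings. The numbers and function names here are only illustrative, and the sign of the yaw term depends on which rotation convention your setup uses.

```python
import math

def circular_mean_deg(headings):
    """Average compass headings correctly across the 359 -> 0 wrap-around."""
    s = sum(math.sin(math.radians(h)) for h in headings)
    c = sum(math.cos(math.radians(h)) for h in headings)
    return math.degrees(math.atan2(s, c)) % 360.0

def t265_yaw_to_world(yaw_deg, world_offset_deg):
    """Convert a T265 yaw (0 = wherever the map was last reset) into a compass heading.

    Depending on your yaw sign convention you may need to subtract yaw_deg instead.
    """
    return (world_offset_deg + yaw_deg) % 360.0

# At power-up the robot sits still, facing forward and level, while we sample
# the compass. The T265 yaw is ~0 at that moment, so the averaged compass
# reading becomes the fixed offset between the two frames.
startup_samples = [87.0, 91.5, 89.2, 90.8, 88.6]   # made-up readings for illustration
world_offset = circular_mean_deg(startup_samples)

print(t265_yaw_to_world(45.0, world_offset))        # heading after the bot yaws 45 degrees
```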

It may just be a human problem of me not wanting to abandon compass headings and use the arbitrary T265 yaw axis. I am used to thinking about and calculating so many things using compass headings. I am used to being able to tell my bots to look, rotate, or drive on a particular heading. If a bot looks to one side and sees something in a cam, I am also used to calculating the bearings and distances to those things, remembering them, and building a situational awareness of things around the bot in memory. Maybe I need to abandon caring about headings and just use the T265's zero yaw, but it is hard to; I want to know which way the bot is going in terms I can relate to! I also have the desire to build and store a map that uses a non-arbitrary coordinate system with a "north".

If you imagine your ideal bot, what do you think you would do?

PRO
Canada
#18   — Edited

Digital compasses do not work very well indoors because they use a combination of a magnetometer, accelerometer, and gyroscope to calculate direction, and this is heavily influenced by external factors. The IndoorAtlas guys, who invented indoor location tracking using a compass, turn this flaw to their advantage. The magnetic fields generated by electrical wiring, metal frames, drink machines, DJ's pinball machines, etc. will interfere with the compass on the robot. IndoorAtlas combines this discrepancy with other positioning technology, using the unique magnetic fingerprint of where you are in the building at the time to increase the accuracy of their location tracking. https://www.indooratlas.com/positioning-technology/

PRO
Synthiam
#19  

You're both right. However, the discussion about the compass is moot unless the application specifically requires one for some reason, because the Intel T265 does a much better job of maintaining its heading in degrees.

Having the compass might be a human problem, like you said. But even we humans have a very vague idea of north, and it doesn't play a vital role in our lives.

If the goal of the robot's navigation is to visit different waypoints using the T265, there's no need for a compass.

Actually, other than basic obstacle avoidance, there's not much else needed. The T265 takes care of pretty much everything. It's quite wonderful!

#20  

I am so tempted to buy that T265 now instead of waiting for Boxing Day!

PRO
Synthiam
#21  

I should be getting some sort of commission on those purchases :D

PRO
USA
#22  

Excellent part 2 demonstration!

I need to try the Navigation Messaging System (NMS) that is built into ARC soon; I need to purchase a T265.

I have been using the ultrasonic sensor and the Arduino with the ARC serial monitor, measuring the speed of sound and using the ultrasonic sensor for echolocation.

Thanks again,

waiting for part 3

EzAng

PRO
Synthiam
#23  

Part #3 is posted at the top of this thread.

#24  

Oh yes, part 3 is fantastic. Waiting for 4!

PRO
USA
#25   — Edited

When is part four coming? This is fun.

PRO
Synthiam
#26  

Working on a few things for it. The next few steps are a bit bigger and will take a bit of time to finish up :)

PRO
USA
#27   — Edited

Part 4, I bet, will be even more exciting.

Thanks

EzAng

PRO
Synthiam
#28  

I already answered you.

PRO
USA
#29  

Did I miss part 4?

PRO
Synthiam
#30  

There is no part 4

*Moved your comment to this thread because it was originally posted on The Navigator robot skill.

PRO
Portugal
#31  

There is no part 4... yet! :)