ptp
Hello guys.
I'm trying to set up a base template project to use the T265 and lidar plugins. Can someone share a project or scripts to navigate and use the lidar data to avoid obstacles?
Which lidar? The Hitachi-LG LDS? https://synthiam.com/Support/Skills/Navigation/Hitachi-LG-LDS-Lidar?id=20086
Use The Navigator robot skill for the NMS: https://synthiam.com/Support/Skills/Navigation/The-Navigator?id=20072
Unless you want to get CoreSLAM or Hector SLAM built as a robot skill: https://github.com/mikkleini/slam.net
That author also did a great C# path-planning port.
The Navigator will execute scripts and stop navigation based on detected objects. It doesn't do path planning. See my previous response about the path planning by that author; he did a good job porting it to C#.
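For readers curious what grid-based path planning involves, here is a minimal A* sketch over a 2D occupancy grid. This is illustrative only; it is not the C# port referenced above, and `astar` is a hypothetical name:

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 2D occupancy grid (0 = free, 1 = obstacle).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue                          # already expanded via a cheaper path
        came_from[cur] = parent
        if cur == goal:                       # reconstruct path by walking parents
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None                               # no route exists
```

A real planner would also inflate obstacles by the robot's radius before searching.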
The Navigator robot skill page has info on how it works with the NMS, but I suspect you'll figure it out without needing to read it.
In your example use case, the T265 publishes Cartesian coordinates to the NMS. The lidar publishes distance data as a laser scan to the NMS. The Navigator subscribes to the NMS and uses both of those data streams to create a map and plot paths and waypoints. The Navigator uses the ARC Movement Panel service to control the robot with whatever Movement Panel is installed in the project.
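The geometry behind fusing those two streams can be sketched roughly as follows. This is an assumed illustration (the function name and signature are hypothetical, not ARC's NMS API): each scan return is rotated by the robot's heading and translated by its position to obtain world-frame obstacle points.

```python
import math

def scan_to_world(pose_x, pose_y, pose_theta, ranges, angle_min, angle_inc, max_range):
    """Project a laser scan into world coordinates using the robot pose.
    ranges[i] is the distance measured at angle angle_min + i * angle_inc
    (relative to the robot's heading pose_theta)."""
    points = []
    for i, r in enumerate(ranges):
        if r <= 0 or r >= max_range:
            continue  # drop invalid or out-of-range returns
        a = pose_theta + angle_min + i * angle_inc
        points.append((pose_x + r * math.cos(a),
                       pose_y + r * math.sin(a)))
    return points
```

A mapping skill would then rasterize these points into an occupancy grid.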
@DJ and @ptp Nice project by mikkleini, just what we need. Could that Hector SLAM and path planning be integrated into The Navigator?
Regarding SLAM, I'm still using gmapping (http://wiki.ros.org/gmapping) with encoders (odometry), an IMU (EKF-fused odometry), and lidar data for robot localization (Monte Carlo). There are at least two interesting options:
I'm trying to stay close to the ARC path with the existing plugins.
I don't think either would be integrated into The Navigator. They'd work best as their own robot skills; that way they could work together. If too much is built into a single skill, it limits the flexibility.
Currently I am trying to use my Roomba with ROS Kinetic, and later integrate the YDLIDAR X2 that I bought. I am still waiting for the lidar to test it with ARC. ptp, isn't there a topic in ROS that we could use to send and receive coordinates from RViz to ARC? We could have an RPi or Jetson Nano handling navigation with ROS and an SBC running ARC.
@DJ:
The distance data seems to be accumulating in the Navigator view, i.e. not clearing. It only updates after you press "Clear obstacle map". Is this the expected behavior? I'm not getting the expected results. Do you have plans to demo the T265 & lidar with an example project template that navigates between waypoints while avoiding obstacles?

The NMS accumulates laser scan data and raises the event when it receives position data. I'll give you example code tonight.
The NMS page explains that laser scan data requires pose position data as well. The pose position data is prioritized by the NMS.
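That accumulate-until-pose behavior can be pictured with a tiny sketch. The class and callback shape here are assumed from the description in this thread, not ARC's actual implementation:

```python
class NmsBuffer:
    """Toy model of the NMS behavior described above: laser scans
    accumulate until a pose update arrives, at which point one combined
    event is raised and the scan buffer is cleared."""

    def __init__(self, on_event):
        self.scans = []
        self.on_event = on_event  # callback(pose, scans)

    def push_scan(self, scan):
        # scans alone never raise an event; they just accumulate
        self.scans.append(scan)

    def push_pose(self, pose):
        # a pose triggers the combined event and resets the buffer
        self.on_event(pose, self.scans)
        self.scans = []
```

This models why the view only clears on a new pose: without position data, scans have no frame to be placed in.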
Apologies for the delay; a few things came up. I'll get to it this evening.
Take a look at this: https://synthiam.com/Support/Skills/Navigation/NMS-Example?id=20640
Your SLAM module, which I think you're working on, will use this model. Your robot skill will subscribe to an NMS event that contains combined scan and location data.
@DJ: Thanks for the example project.
I'm trying to put some pieces together. Initially I was looking for a script to navigate using the lidar, combining the existing plugins: The Navigator, T265, and Hitachi lidar.
But the deeper I go down the rabbit hole, the more issues and missing pieces I find when using the existing plugins as a base project for autonomous navigation.
I'm not a big fan of Intel support. I believe the T265 has more features, like mapping and re-localization, although it seems the T265 is in auto-pilot mode (minimal support) with no roadmap/future.
You mentioned here: https://synthiam.com/Community/Questions/Hey-DJ-what-ya-got-cooking-20622
A few questions: What are you planning as navigation enhancements? Can you share your roadmap/improvements for the existing plugins? Do you have new hardware (and/or a replacement for the T265)?

Cartographer can provide 2D odometry of decent quality using only a low-cost 360-degree LDS with a pretty low data rate (5-7 Hz). An IMU and an additional odometry source (for example, wheeled-platform odometry or visual odometry) can increase the resulting map quality over large environments. But for indoor mapping of roughly 50-60 square meters they are not so important: Cartographer's internal loop-closure algorithm is capable of keeping such small maps consistent.
@DJ,
To piggyback on ptp's question, I am about to get back into working on the robot again now that the family issues have stabilized a bit. I have been thinking about the scripting needed for obstacle avoidance, either with the lidar or with the D435 if I can get it working (even though Intel says the 435i and 435 are identical from the SDK perspective, there must be some difference, since mine simply refuses to work properly in ARC). Anyway, I have an outline of what I want the script to do, but it is quite complex if I try to use the script to determine the obstacle's shape or size to figure out the best way around; otherwise it will do a lot of trial and error, going back and forth until it finds a way past. I don't want to put a huge amount of effort into figuring out the scripts if your roadmap has obstacle avoidance in the near term (next few months).
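A trial-and-error avoider like the one described here often starts from something as simple as sector minimums over the scan. This is a hypothetical sketch (not ARC script syntax), assuming the scan sweeps right-to-left as in the ROS LaserScan convention:

```python
def steer_from_scan(ranges, safe_dist=0.5):
    """Toy reactive avoidance: split the scan into right/center/left
    sectors and return a drive command based on the closest return
    in each sector."""
    n = len(ranges)
    third = n // 3
    right = min(ranges[:third])            # first third: right of the robot
    center = min(ranges[third:2 * third])  # middle third: straight ahead
    left = min(ranges[2 * third:])         # last third: left of the robot
    if center > safe_dist:
        return "forward"                   # path ahead is clear
    # blocked ahead: turn toward the side with more clearance
    return "left" if left >= right else "right"
```

Determining an obstacle's shape and size to plan around it, as opposed to reacting sector by sector, is exactly where a path planner over an accumulated map becomes worthwhile.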
Alan
We're experimenting with a few NMS updates that allow the map to be global. This way there can be other skills for path planning, rather than the feature being built into The Navigator, etc.