Very simple explanation! Thank you
Dude, this is amazing. Thank you for the video.
I hope it will be useful information 😁
@@stepbystep-robotics6881 it definitely is 👍👍
Great video. We tried FAST-LIO2 before, but it quickly drifted.
I have seen that a couple of times, but after restarting it and mounting it properly on the vehicle, it works most of the time.
Well done!
Thank you to PCBWay for the nice 3D printed part. It’s really neat, and the delivery was fast!
Nice video!
As I understand it, AMCL publishes map to odom tf, while FAST-LIO publishes odom to robot's base frame tf. Will this method work well even if the robot platform experiences significant roll or pitch movements, such as in two-wheeled balancing robots?
Hi Rachid, thanks for the video! I see you're using Ubuntu 20.04 with ROS1 Noetic and ROS2 Galactic, but ROS2 Humble is recommended for the LiDAR. How did you manage to run all these versions on the same system?
@@AbdrahamaneKOITE Hi, yes, we can install ROS2 Humble from source. I have two ROS2 distributions on the same system, Galactic and Humble.
@@stepbystep-robotics6881 Great, thank you!
Super sweet technology, thank you for sharing!!
very nice explanation of the tech and how you put it together!
I am glad it is useful somehow 😊
Hi, great video. Do you have any videos on how to program LOAM, other SLAM, or FAST-LIO using a Livox Avia LiDAR?
Can you explain a bit how you extract the laser scan from the FAST-LIO registered point cloud? And what’s the relay?
The laser scan is extracted using a package called pointcloud_to_laserscan.
The term “relay” in my package just means repeating the Odometry topic from FAST-LIO, but replacing the frame_id with “odom”, and also publishing the odom to base_link TF.
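For anyone curious, here is a minimal sketch of what such a relay node could look like in rclpy. This is my own illustration, not the actual fl_nav2_helper code; the input topic /Odometry and the output topic /odom are assumptions.

```python
# Minimal relay sketch (illustration only, not the author's actual node).
# Assumption: FAST-LIO publishes nav_msgs/Odometry on /Odometry; we re-publish
# it on /odom with frame_id "odom" and broadcast the odom -> base_link TF.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry
from geometry_msgs.msg import TransformStamped
from tf2_ros import TransformBroadcaster


class OdomRelay(Node):
    def __init__(self):
        super().__init__('odom_relay')
        self.pub = self.create_publisher(Odometry, '/odom', 10)
        self.tf_broadcaster = TransformBroadcaster(self)
        self.sub = self.create_subscription(Odometry, '/Odometry', self.callback, 10)

    def callback(self, msg: Odometry):
        # Rewrite the frame names so Nav2 sees a standard odom -> base_link chain.
        msg.header.frame_id = 'odom'
        msg.child_frame_id = 'base_link'
        self.pub.publish(msg)

        # Broadcast the same pose as the odom -> base_link transform.
        t = TransformStamped()
        t.header = msg.header
        t.child_frame_id = 'base_link'
        t.transform.translation.x = msg.pose.pose.position.x
        t.transform.translation.y = msg.pose.pose.position.y
        t.transform.translation.z = msg.pose.pose.position.z
        t.transform.rotation = msg.pose.pose.orientation
        self.tf_broadcaster.sendTransform(t)


def main():
    rclpy.init()
    rclpy.spin(OdomRelay())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```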
Cool video! I wonder if FAST-LIO can be used with a depth camera (with depth image to point cloud conversion). I have a special version of the Asus Xtion that is depth-only instead of the typical RGB-D.
If the point cloud from the depth camera is fast enough, I think it might work. And you still need an IMU too.
@sencis9367 That’s what I already had. Also, it’s depth-only, so visual odometry was not available here. I plan to mount cameras on the front and back of the robot, as I have two of them, and then write a C++ ROS2 node to combine the point clouds into a single cloud message.
@sencis9367 I have tested the Asus Xtion (640x480 depth resolution) with various laser odometry SLAM packages, but not all of them. I ported SSL_SLAM to ROS2, and this seems to give the best result. Still, the problem remains when the robot rotates: the estimated odometry just flies off as the robot starts to rotate. My guess was the narrow horizontal FoV compared to a 3D LiDAR, but something like the Livox Horizon or MID-40 seems to work despite a similarly narrow FoV. I have yet to test it with KISS-ICP and other IMU-coupled SLAM.
@@TinLethax In my opinion, KISS-ICP is useless for real-world applications. It is fast and easy to understand, but it has no backend optimisation mechanism to handle long-term drift.
Nice one
What's the minimum vertical angle of the lidar?
As in, if it's mounted flat, can it see the ground? It seems like you have some ground data in your neighborhood scan.
The min. FoV angle is -2 degrees. The lidar is mounted flat, and yes, with this minimum FoV angle it will see the ground, but only farther away from the robot.
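As a rough check (my own estimate, assuming a mounting height of around 0.3 m): a beam pointing 2 degrees downward first reaches flat ground at about h / tan(2°) ≈ 0.3 / 0.035 ≈ 8.6 m ahead of the robot, which is why ground points only show up at some distance in the scan.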
Great video! Could you also share the code for your fl_nav2_helper package please?
Great video
Does it work with 2d lidars?
FAST-LIO only works with a 3D lidar. But for Nav2, just a 2D laser scan is enough.
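For reference, a minimal sketch of how the 3D cloud can be flattened into that 2D scan with the pointcloud_to_laserscan package. The topic names and parameter values here are my own assumptions, not the exact setup from the video.

```python
# Minimal ROS2 launch sketch (assumed topic names and parameter values).
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='pointcloud_to_laserscan',
            executable='pointcloud_to_laserscan_node',
            name='cloud_to_scan',
            remappings=[
                ('cloud_in', '/cloud_registered'),  # registered cloud from FAST-LIO (assumed topic)
                ('scan', '/scan'),                  # 2D scan consumed by Nav2
            ],
            parameters=[{
                'target_frame': 'base_link',  # project the cloud into the robot frame
                'min_height': 0.1,            # drop ground points
                'max_height': 1.0,            # drop points above the robot
                'range_min': 0.3,
                'range_max': 30.0,
                'use_inf': True,
            }],
        ),
    ])
```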
Hi, what is the maximum speed that your robot can achieve? And what maximum speed have you tested with this algorithm? Thanks :D
Max speed with the 18V battery is 1.3 m/s.
They’re brushed DC motors, so with a higher voltage it could go faster. The AT Motor Driver that I am using can go up to 24V, or a 6-cell battery.
I have tried running it at full speed outdoors, and the odometry from FAST-LIO is still pretty reliable :) they really did a good job.
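(As a rough estimate of my own, not a measured figure: if a brushed DC motor's speed scales roughly linearly with voltage, 24V would give about 1.3 × 24/18 ≈ 1.7 m/s.)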
@@stepbystep-robotics6881 Thank you for your answer. In addition to depending on the battery voltage, does the robot's speed also depend on the ROS processing speed? If I'm not mistaken, robots developed on ROS only have a maximum speed of about 3 m/s?
@@mr2tot I don’t think any ROS robot is limited to 3 m/s, because the robot’s physical speed doesn’t relate to the ROS processing speed. The vehicle’s physical speed depends on the ESC or motor driver.
From my point of view, I don’t use ROS for the whole control level; sometimes we need realtime data at the low level, and then it’s better to use an RTOS. ROS is better for middle- to higher-level software, I feel.
I wonder how the point cloud would look with the Livox MID-360 for a wire mesh fence. Our baseball field is surrounded by 6-foot-high mesh fencing. I'm in the process of building a mower to do the field and need a way for the mower to see the perimeter.
I believe the wire mesh could be detected by the MID-360. It can detect some power lines, so in the case of a planar fence, it should be no problem.
The lidar itself is not too expensive; you'd better give it a try. 🙂
How do we run this on a GPU?
Nice project! I'm working on a similar project for an agriculture robot here in Japan. It would be nice if I could contact you to get some advice.
❤❤❤❤