Robust Odometry and Fully Autonomous Driving Robot | ATCrawler x FAST-LIO x Navigation2

  • Published: 27 Nov 2024

Comments • 38

  • @wongpokhim
    @wongpokhim 3 months ago

    Very simple explanation! Thank you

  • @fra5715
    @fra5715 5 months ago +2

    Dude, this is amazing. Thank you for video.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago +1

      Hope it will be useful information 😁

    • @fra5715
      @fra5715 5 months ago

      @@stepbystep-robotics6881 it definitely is 👍👍

  • @madeautonomous
    @madeautonomous 5 months ago +2

    Great video. We tried FAST-LIO2 before, but it quickly drifted.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago +1

      I have seen that a couple of times, but after restarting it and mounting it on a proper vehicle, it works most of the time.

  • @PCBWay
    @PCBWay 5 months ago +3

    Well done!

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago

      Thanks to PCBWay for the nice 3D-printed part. It's really neat, and the delivery was fast!

  • @김영욱-n5u1r
    @김영욱-n5u1r 3 months ago +1

    Nice video!
    As I understand it, AMCL publishes map to odom tf, while FAST-LIO publishes odom to robot's base frame tf. Will this method work well even if the robot platform experiences significant roll or pitch movements, such as in two-wheeled balancing robots?

  • @AbdrahamaneKOITE
    @AbdrahamaneKOITE 27 days ago +1

    Hi Rachid, thanks for the video! I see you're using Ubuntu 20.04 with ROS1 Noetic and ROS2 Galactic, but ROS2 Humble is recommended for the LiDAR. How did you manage to run all these versions on the same system?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  27 days ago +1

      @@AbdrahamaneKOITE Hi, yes, you can install ROS2 Humble from source. I have two ROS2 distros on the same system, Galactic and Humble.

    • @AbdrahamaneKOITE
      @AbdrahamaneKOITE 27 days ago

      @@stepbystep-robotics6881 Great, thank you!

  • @nanastos18060
    @nanastos18060 5 months ago

    Super sweet technology, thank you for sharing!!

  • @TNERA
    @TNERA 5 months ago

    very nice explanation of the tech and how you put it together!

  • @F2MFireDraftingDesignsIn-ch4tw
    @F2MFireDraftingDesignsIn-ch4tw 2 months ago

    Hi, great video. Do you have any videos on how to program LOAM, SLAM, or FAST-LIO using a Livox Avia lidar?

  • @thomasluk4319
    @thomasluk4319 5 months ago +2

    Can you explain a bit how you extract the laserscan from the FAST-LIO registered point cloud? And what's the relay?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago +1

      The laserscan is extracted by the pointcloud_to_laserscan package.
      The “relay” in my package just repeats the Odometry topic from FAST-LIO, replacing the frame_id with “odom”, and also publishes the odom to base_link TF.
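The idea behind the pointcloud_to_laserscan conversion mentioned above can be sketched without ROS: keep only the 3D points inside a height band, project them onto the horizontal plane, and keep the closest hit per angular bin. This is a simplified, dependency-free sketch of the concept, not the actual package implementation; the height band and angular resolution below are assumed values.

```python
import math

def cloud_to_scan(points, angle_min=-math.pi, angle_max=math.pi,
                  angle_inc=math.radians(1.0), z_min=-0.1, z_max=0.5):
    """Project 3D points inside a height band onto a 2D scan,
    keeping the closest range per angular bin (inf means no hit)."""
    n_bins = int(round((angle_max - angle_min) / angle_inc))
    ranges = [math.inf] * n_bins
    for x, y, z in points:
        if not (z_min <= z <= z_max):
            continue  # outside the height slice used for the 2D scan
        r = math.hypot(x, y)          # planar range
        a = math.atan2(y, x)          # bearing
        idx = int((a - angle_min) / angle_inc)
        if 0 <= idx < n_bins and r < ranges[idx]:
            ranges[idx] = r           # keep the nearest return per bin
    return ranges

# a single point 2 m straight ahead, inside the height band
scan = cloud_to_scan([(2.0, 0.0, 0.2)])
```

The real package additionally handles range limits, timestamps, and frame metadata, but the core projection is this binning step.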

  • @TinLethax
    @TinLethax 5 months ago +1

    Cool video! I wonder if FAST-LIO can be used with a depth camera (with depth-image-to-point-cloud conversion). I have a special version of the Asus Xtion that is depth-only instead of the typical RGB-D.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago +1

      If the point cloud from the depth camera is fast enough, I think it might work. And you still need an IMU too.

    • @TinLethax
      @TinLethax 5 months ago

      @sencis9367 That's what I already had. Also, it's depth-only, so visual odometry was not available here. I plan to mount the cameras on the front and back of the robot, as I have two of them, and then write a C++ ROS2 node to combine the point clouds into a single cloud message.

    • @TinLethax
      @TinLethax 5 months ago

      @sencis9367 I have tested the Asus Xtion (640x480 depth resolution) with various laser-odometry SLAM packages, but not all of them. I ported SSL_SLAM to ROS2, and this seems to give the best result. Still, the problem remains when the robot rotates: the estimated odometry flies off as soon as the robot starts rotating. My guess is the narrow horizontal FoV compared to a 3D LiDAR, but something like the Livox Horizon or MID-40 seems to work despite a similarly narrow FoV. I have yet to test it with KISS-ICP and other IMU-coupled SLAM packages.

    • @PointLab
      @PointLab 5 months ago +1

      @@TinLethax In my opinion, KISS-ICP is useless for real-world applications. It is fast and easy to understand, but it has no backend optimization mechanism to handle long-term drift.
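The two-camera merge discussed in this thread amounts to transforming each cloud from its sensor frame into a common base frame and concatenating the results. Below is a minimal sketch with hypothetical mounting poses (a front camera 0.2 m ahead of base, a rear camera 0.2 m behind facing backwards); a real ROS2 node would use tf2 and sensor_msgs/PointCloud2 for this instead.

```python
import math

def transform_points(points, yaw, tx, ty):
    """Rotate each (x, y, z) point by yaw about the z-axis, then
    translate: sensor frame -> robot base frame (planar mounting)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [(c * x - s * y + tx, s * x + c * y + ty, z)
            for x, y, z in points]

# hypothetical mounting poses, not taken from the video:
# front camera at +0.2 m, rear camera at -0.2 m facing backwards
front = [(1.0, 0.0, 0.1)]   # a point 1 m ahead of the front camera
rear  = [(1.0, 0.0, 0.1)]   # a point 1 m ahead of the rear camera
merged = (transform_points(front, 0.0, 0.2, 0.0)
          + transform_points(rear, math.pi, -0.2, 0.0))
```

After the transform, the front point lands at x = +1.2 m and the rear point at x = -1.2 m in the base frame, so a single concatenated cloud covers both directions.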

  • @FactorSocial4797
    @FactorSocial4797 3 months ago

    Nice one

  • @hrmny_
    @hrmny_ 5 months ago +1

    What's the minimum angle of the lidar vertically?
    As in if it's mounted flat can it see the ground? Seems like you have some ground data in your neighborhood scan

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago

      The min. FOV angle is -2 degrees. The lidar is mounted flat, and yes, with this min FOV angle it will see the ground farther away from the robot.
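The -2 degree minimum elevation angle above implies a simple geometry: a lidar mounted level at height h first hits flat ground at roughly h / tan(2°). A quick check (the 0.5 m mounting height is an assumed value, not taken from the video):

```python
import math

def ground_ring_distance(mount_height_m, min_elev_deg=2.0):
    """Distance at which the lowest beam (min_elev_deg below horizontal)
    of a level-mounted lidar intersects flat ground."""
    return mount_height_m / math.tan(math.radians(min_elev_deg))

d = ground_ring_distance(0.5)  # assumed 0.5 m mounting height
```

At 0.5 m the lowest beam only reaches the ground about 14 m out, which matches the observation that ground points appear far from the robot.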

  • @nicholastan170
    @nicholastan170 2 months ago

    Great video! Could you also share the code for your fl_nav2_helper package please?

  • @sebastianrada4107
    @sebastianrada4107 5 months ago +1

    Great video
    Does it work with 2d lidars?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago

      FAST-LIO only works with 3D lidars, but for Nav2 a 2D laserscan is enough.

  • @mr2tot
    @mr2tot 5 months ago +2

    Hi, what is the maximum speed your robot can achieve? And what maximum speed have you tested with this algorithm? Thanks :D

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago +2

      Max speed with an 18V battery is 1.3 m/s.
      They are DC brushed motors, so with a higher voltage it could go faster. The AT Motor Driver I am using can go up to 24V, or a 6-cell battery.
      I have tried running it at full speed outdoors, and the odometry from FAST-LIO is still pretty reliable :) They really did a good job.

    • @mr2tot
      @mr2tot 5 months ago +1

      @@stepbystep-robotics6881 Thank you for your answer. Besides the battery voltage, does the robot's speed also depend on the ROS processing speed? If I'm not mistaken, robots developed on ROS only have a maximum speed of about 3 m/s?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago

      @@mr2tot I don't think any ROS robot is limited to 3 m/s, because the robot's physical speed doesn't relate to ROS processing speed. The vehicle's physical speed depends on the ESC or motor driver.
      From my point of view, I don't use ROS for the entire control stack; sometimes we need realtime data at the low level, and then it's better to use an RTOS. ROS fits better from the middle to the higher software levels, I feel.

  • @Flynn3778
    @Flynn3778 5 months ago

    I wonder what the point cloud would look like with the Livox MID-360 for a wire-mesh fence. Our baseball field is surrounded by 6-foot-high mesh fencing. I'm in the process of building a mower for the field and need a way for the mower to see the perimeter.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  5 months ago

      I believe the wire mesh could be detected by the MID-360. It can detect some power lines, so in the case of a planar fence, it should be no problem.
      The lidar itself is not too expensive, so you'd better give it a try. 🙂

  • @KognitiveLearn
    @KognitiveLearn 1 month ago

    How do we run this on a GPU?

  • @sutanmuhamadsadamawal715
    @sutanmuhamadsadamawal715 5 months ago

    Nice project! I'm working on a similar project for an agriculture robot here in Japan. It would be nice if I could contact you for some advice.

  • @salh2665
    @salh2665 5 months ago

    ❤❤❤❤