Robust Odometry and Fully Autonomous Driving Robot | ATCrawler x FAST-LIO x Navigation2

  • Published: 7 Feb 2025
  • This video shows how to set up FAST-LIO, a highly robust LiDAR-inertial odometry system, on our ATCrawler.
    The 3D map and point cloud data can be generated from this package.
    Navigation2 is used for indoor-outdoor navigation, and the 2D map is created with the slam_toolbox package. A rough launch sketch of how these pieces fit together is included at the end of this description.
    This video is sponsored by PCBWay. They provide full-featured custom prototyping: PCB, PCBA, 3D printing, CNC machining, laser cutting, and sheet metal bending.
    For more details, please check it out at
    pcbway.com/g/9...
    You will get $5 of New User Free Credit for your first order.
    Timeline:
    0:00 Intro
    0:45 FAST-LIO Intro
    2:08 LIVOX MID-360 review
    2:50 3D printed design for MID-360
    4:19 Hardware overview
    5:05 Install and configure
    6:44 Odometry testing
    9:04 Making 2D map
    10:26 Start Navigation
    11:52 Outdoor Mapping
    13:17 End
    For more details about the ATCrawler and the AT Motor Driver, the parts I am using in this video, please check out,
    attraclab-shop...
    For more details about FAST-LIO, please check out,
    / @marslabhku1418
    For more details about Navigation2, please check out,
    github.com/ros...
    #robots #ugv #mobilerobots #ardupilot #programming #ai #technology #pcbway #3dprinting #webapplicationdevelopment
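
    The description above outlines the overall pipeline: FAST-LIO provides odometry and a registered point cloud, pointcloud_to_laserscan converts that cloud into a 2D scan, slam_toolbox builds the 2D map, and Navigation2 drives the robot. The ROS2 Python launch file below is only a rough, assumption-based sketch of that wiring, not the launch files used in the video; executable names and config paths are placeholders that may differ in your install.

    # sketch_bringup.launch.py (illustrative sketch only)
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            # FAST-LIO: LiDAR-inertial odometry + registered point cloud.
            Node(package='fast_lio', executable='fastlio_mapping',
                 parameters=['mid360.yaml']),  # placeholder config path
            # Convert the registered cloud into a 2D laser scan.
            Node(package='pointcloud_to_laserscan',
                 executable='pointcloud_to_laserscan_node',
                 remappings=[('cloud_in', '/cloud_registered')]),
            # Build the 2D occupancy map from the laser scan.
            Node(package='slam_toolbox', executable='async_slam_toolbox_node'),
            # Navigation2 itself is usually started from nav2_bringup's launch
            # files once the map and odometry sources are available.
        ])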

Comments • 51

  • @ntenn4763
    @ntenn4763 4 days ago

    Great job!

  • @fra_the_maker
    @fra_the_maker 8 months ago +2

    Dude, this is amazing. Thank you for the video.

  • @wongpokhim
    @wongpokhim 6 months ago

    Very simple explanation! Thank you

  • @PCBWay
    @PCBWay 8 months ago +4

    Well done!

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago

      Thank you to PCBWay for the nice 3D printed part. It’s really neat, and the delivery was fast!

  • @nanastos18060
    @nanastos18060 8 months ago

    Super sweet technology, thank you for sharing!!

  • @madeautonomous
    @madeautonomous 8 months ago +2

    Great video. We tried FAST-LIO2 before, but it quickly drifted.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago +1

      I have seen that a couple of times, but after restarting it and placing it on a proper vehicle, it works most of the time.

  • @TNERA
    @TNERA 8 months ago

    Very nice explanation of the tech and how you put it together!

  • @김영욱-n5u1r
    @김영욱-n5u1r 5 months ago +2

    Nice video!
    As I understand it, AMCL publishes map to odom tf, while FAST-LIO publishes odom to robot's base frame tf. Will this method work well even if the robot platform experiences significant roll or pitch movements, such as in two-wheeled balancing robots?

  • @VishalChandila-n8x4c
    @VishalChandila-n8x4c 18 days ago

    Hi guys, your robot's tracks look amazing. Can you tell me where I can get them as well? I am using some 3D printed stuff right now.

  • @georgebethel1320
    @georgebethel1320 25 days ago

    Please, could you share an image of what your TF tree looks like? My base_link keeps jumping around, but the TF shows no error.

  • @F2MFireDraftingDesignsIn-ch4tw
    @F2MFireDraftingDesignsIn-ch4tw 5 months ago

    Hi, great video. Do you have any videos on how to set up LOAM, SLAM, or FAST-LIO using a Livox Avia lidar?

  • @nicholastan170
    @nicholastan170 5 months ago

    Great video! Could you also share the code for your fl_nav2_helper package please?

  • @thomasluk4319
    @thomasluk4319 7 months ago +2

    Can you explain a bit how you extract the laser scan from the FAST-LIO registered point cloud? And what’s the relay?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  7 months ago +1

      The laserscan is extracted with a package called pointcloud_to_laserscan.
      The term “relay” in my package just means repeating the Odometry topic from FAST-LIO but replacing the frame_id with “odom”, and also publishing the odom to base_link TF. A rough sketch of that idea follows below.
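
      The node below is only a minimal sketch of that relay idea, not the author's actual fl_nav2_helper code: it re-publishes the FAST-LIO odometry with frame_id "odom" and broadcasts the odom -> base_link TF. The input topic name (/Odometry) and frame names are assumptions to adapt to your setup.

      # odom_relay.py (illustrative sketch)
      import rclpy
      from rclpy.node import Node
      from nav_msgs.msg import Odometry
      from geometry_msgs.msg import TransformStamped
      from tf2_ros import TransformBroadcaster

      class OdomRelay(Node):
          def __init__(self):
              super().__init__('odom_relay')
              self.pub = self.create_publisher(Odometry, 'odom', 10)
              self.tf_broadcaster = TransformBroadcaster(self)
              # FAST-LIO usually publishes its odometry on /Odometry.
              self.create_subscription(Odometry, 'Odometry', self.cb, 10)

          def cb(self, msg: Odometry):
              # Re-stamp the frames so Nav2 sees a standard odom -> base_link chain.
              msg.header.frame_id = 'odom'
              msg.child_frame_id = 'base_link'
              self.pub.publish(msg)

              # Broadcast the same pose as the odom -> base_link transform.
              t = TransformStamped()
              t.header = msg.header
              t.child_frame_id = 'base_link'
              t.transform.translation.x = msg.pose.pose.position.x
              t.transform.translation.y = msg.pose.pose.position.y
              t.transform.translation.z = msg.pose.pose.position.z
              t.transform.rotation = msg.pose.pose.orientation
              self.tf_broadcaster.sendTransform(t)

      def main():
          rclpy.init()
          rclpy.spin(OdomRelay())

      if __name__ == '__main__':
          main()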

  • @Wolf-dx6jk
    @Wolf-dx6jk 2 months ago +1

    What software do you use to actually tell your motors how they should rotate? You only point your robot at a point on the map where it should go, and it goes. But what data does the robot use as input to understand which direction it should move, and where is this data stored before being converted into motor rotation?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  2 months ago

      Sorry, maybe I didn't explain robot navigation in enough detail.
      The robot that I am using is the one from my previous video here ruclips.net/video/xYo6Bpzbto8/видео.htmlsi=TwPnH5ht5e6_Qp3H
      Simply speaking, the motors are driven by a PWM signal, and to generate the PWM I have a ROS package that makes it easy to send the command.
      For the map and goal point, I am using the Navigation2 package, which is the best-known navigation stack in ROS2; you can try searching for that.

    • @Wolf-dx6jk
      @Wolf-dx6jk 2 months ago

      @ Sorry, but I still didn’t get it.
      After you define the point in your room, the robot starts moving to that point. So the motor drivers must be connected somehow to get information about the path the robot is going to follow. I’m trying to understand which package you use to convert the path to the goal into the sequence of commands the motors must execute, and how to connect the motors so they consume this sequence as an input signal (or maybe after parsing).
      For example, I found the shim_controller package among the nav2 packages (I guess it’s a PWM controller though). If this controller automatically converts the path into a sequence of commands, then obviously I need to find out how to make my robot receive signals from this package. But I’m not sure I am on the right path.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  2 months ago +1

      @@Wolf-dx6jk Okay, let me explain a bit more.
      I made a package called md100a_ros2_utils github.com/attraclab/md100a_ros2_utils
      This package listens on the /cmd_vel topic. Once any publisher sends /cmd_vel, the md100a_ros2_utils package converts it into a PWM signal for the motor driver to move the wheels, as I have explained in this video ruclips.net/video/xYo6Bpzbto8/видео.htmlsi=jc-wiGhkRY6EyIlo&t=515 (a rough sketch of that conversion follows after this reply).
      So you have to understand how we can control this robot with the /cmd_vel topic first, before moving to the next step.
      Next, I said I was using the Navigation2 package, which handles localization (knowing the position on the map) and navigation (moving the robot by sending a goal point). This package does everything about generating the path and creating /cmd_vel. If you already have a static map and a decent odometry source, then once you run the Navigation2 package you just need to pick a point on the map, and boom, you will get /path and /cmd_vel topics from it. The package keeps sending /cmd_vel until the robot reaches the goal on the map.
      I am not sure about your ROS2 knowledge, but Navigation2 is a well-known package in ROS2, so maybe you can try checking it out; I think it could help.
      Good luck ;)
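
      The node below is only a rough, assumption-based sketch of that /cmd_vel to PWM idea, not the actual md100a_ros2_utils code: the wheel separation, maximum wheel speed, and the final hardware write are placeholders you would replace for your own motor driver.

      # cmd_vel_to_pwm.py (illustrative sketch)
      import rclpy
      from rclpy.node import Node
      from geometry_msgs.msg import Twist

      WHEEL_SEPARATION = 0.30   # meters, assumed value
      MAX_WHEEL_SPEED = 1.3     # m/s at full PWM, assumed value

      class CmdVelToPwm(Node):
          def __init__(self):
              super().__init__('cmd_vel_to_pwm')
              self.create_subscription(Twist, 'cmd_vel', self.cb, 10)

          def cb(self, msg: Twist):
              # Differential-drive mixing: linear + angular -> left/right wheel speeds.
              left = msg.linear.x - msg.angular.z * WHEEL_SEPARATION / 2.0
              right = msg.linear.x + msg.angular.z * WHEEL_SEPARATION / 2.0
              # Scale to a signed duty cycle in [-100, 100] percent.
              left_pwm = max(-100.0, min(100.0, 100.0 * left / MAX_WHEEL_SPEED))
              right_pwm = max(-100.0, min(100.0, 100.0 * right / MAX_WHEEL_SPEED))
              # A real driver node would write these values out over serial/GPIO.
              self.get_logger().info(f'left {left_pwm:.0f}%, right {right_pwm:.0f}%')

      def main():
          rclpy.init()
          rclpy.spin(CmdVelToPwm())

      if __name__ == '__main__':
          main()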

    • @Wolf-dx6jk
      @Wolf-dx6jk 2 months ago

      @@stepbystep-robotics6881 Oh, thank you very much. That’s a really clear explanation.
      However, I’ve got another question. Is fl_nav2_helpers your own package which I can’t get anywhere?

  • @FactorSocial4797
    @FactorSocial4797 5 months ago

    Nice one

  • @AbdrahamaneKOITE
    @AbdrahamaneKOITE 3 months ago +1

    Hi Rachid, thanks for the video! I see you're using Ubuntu 20.04 with ROS1 Noetic and ROS2 Galactic, but ROS2 Humble is recommended for the LiDAR. How did you manage to run all these versions on the same system?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago +1

      @@AbdrahamaneKOITE Hi, yes, we can install ROS2 Humble from source. I have two ROS2 distributions on the same system, Galactic and Humble.

    • @AbdrahamaneKOITE
      @AbdrahamaneKOITE 3 months ago

      @@stepbystep-robotics6881 Great, thank you!

    • @junyang1710
      @junyang1710 2 months ago +1

      @@stepbystep-robotics6881: Must we install it from source? Is it possible to just use a Docker image?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  2 months ago

      @@junyang1710 I haven't tried it with Docker, but it seems possible as well.
      If you can install the package inside Docker and manage to get the ROS topics back to the local environment, that could work too, I guess.
      I don't think the package has a pre-built Docker image, so we would need to make our own.

  • @hrmny_
    @hrmny_ 8 months ago +1

    What's the minimum vertical angle of the lidar?
    As in, if it's mounted flat, can it see the ground? It seems like you have some ground data in your neighborhood scan.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago

      The min. FOV angle is -2 degrees. The lidar is mounted flat, and yes, with this min FOV angle it will see the ground, but only at some distance away from the robot (a quick worked example follows below).
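
      As a rough sanity check of that geometry (the 0.3 m mount height is just an assumed example, not the ATCrawler's actual dimension): a beam angled 2 degrees below horizontal first reaches the ground at mount_height / tan(2°).

      # fov_ground_distance.py (illustrative sketch)
      import math

      mount_height = 0.3   # meters above ground, assumed example
      min_fov_deg = 2.0    # downward angle below horizontal (MID-360 min FOV)

      distance = mount_height / math.tan(math.radians(min_fov_deg))
      print(f"Ground first enters the scan at roughly {distance:.1f} m")  # ~8.6 m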

  • @mr2tot
    @mr2tot 8 months ago +2

    Hi, what is the maximum speed that your robot can achieve? And what maximum speed have you tested with this algorithm? Thanks :D

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago +2

      Max speed with an 18V battery is 1.3 m/s.
      They are brushed DC motors, so with a higher voltage it could go faster. The AT Motor Driver that I am using can go up to 24V, or a 6-cell battery.
      I have tried running it at full speed outdoors, and the odometry from FAST-LIO is still pretty reliable :) They really did a good job.

    • @mr2tot
      @mr2tot 8 months ago +1

      @@stepbystep-robotics6881 Thank you for your answer. In addition to depending on the battery voltage, does the robot's speed also depend on the ROS processing speed? If I'm not mistaken, robots developed on ROS only have a maximum speed of about 3 m/s?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago

      @@mr2tot I don’t think any ROS robot is limited to 3 m/s, because the robot’s physical speed doesn’t depend on ROS processing speed. The vehicle’s physical speed depends on the ESC or motor driver.
      From my point of view, I don’t use ROS for the whole control stack; sometimes we need real-time data at the low level, and then it is better to use an RTOS. ROS is better for middle- to high-level software, I feel.

  • @sebastianrada4107
    @sebastianrada4107 8 months ago +1

    Great video!
    Does it work with 2D lidars?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago

      FAST-LIO only works with a 3D lidar. But for Nav2, a 2D laserscan is enough.

  • @TinLethax
    @TinLethax 8 months ago +1

    Cool video! I wonder if FAST-LIO can be used with a depth camera (with depth-image-to-point-cloud conversion). I have a special version of the Asus Xtion that is depth-only instead of the typical RGB-D.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago +1

      If the point cloud from the depth camera is fast enough, I think it might work. And you still need an IMU too.

    • @TinLethax
      @TinLethax 8 months ago

      @sencis9367 That's what I already had. Also, it's depth-only, so visual odometry was not available here. I plan to mount the cameras on the front and back of the robot, as I have two of them, and then write a C++ ROS2 node to combine the point clouds into a single cloud message.

    • @TinLethax
      @TinLethax 8 months ago

      @sencis9367 I have tested the Asus Xtion (640x480 depth resolution) with various laser odometry SLAM packages, but not all of them. I ported SSL_SLAM to ROS2, and this seems to give the best result. Still, the problem remains when the robot rotates: the estimated odometry just flies off as the robot starts to rotate. My guess is the narrow horizontal FoV compared to a 3D LiDAR, but something like the Livox Horizon or MID-40 seems to work despite a similarly narrow FoV. I have yet to test it with KISS-ICP and other IMU-coupled SLAM.

    • @PointLab
      @PointLab 7 months ago +1

      @@TinLethax In my opinion, KISS-ICP is useless for real-world applications. It is fast and easy to understand, but it has no backend optimization mechanism to handle long-term drift.

  • @Wolf-dx6jk
    @Wolf-dx6jk 2 months ago

    Hi again! Did you change anything in the slam_toolbox files to get the map created from your laserscan topic?

    • @Wolf-dx6jk
      @Wolf-dx6jk 2 months ago

      Sorry for being annoying, but I really need your help. At 9:47 I saw you are using a custom config file. Can you please tell me what you changed in the original one to make everything work with the Livox MID-360?

    • @Wolf-dx6jk
      @Wolf-dx6jk 2 months ago

      Okay, I saw my first map, but I guess I misunderstood how to use the static transform publisher.
      My transforms look like this: body (default frame of the laserscan) -> Odometry -> map -> odom.
      But this only lets me see my map right after initialization, not beyond that.
      So I'm here again to ask: how do you transform your laserscan frame to odom?
      It would be best if you could share the TF tree from RViz that works for creating the map.

  • @Flynn3778
    @Flynn3778 8 months ago

    I wonder how the point cloud from the LIVOX MID-360 would look for a wire mesh fence. Our baseball field is surrounded by 6-foot-high mesh fencing. I am in the process of building a mower to do the field and need a way for the mower to see the perimeter.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  8 months ago

      I believe the wire mesh could be detected by the MID-360. It can detect some power lines, so in the case of a planar fence it should be no problem.
      The lidar itself is not too expensive, so you might as well give it a try. 🙂

  • @KognitiveLearn
    @KognitiveLearn 3 months ago

    How do we run this on a GPU?

  • @salh2665
    @salh2665 7 months ago

    ❤❤❤❤

  • @sutanmuhamadsadamawal715
    @sutanmuhamadsadamawal715 8 months ago

    Nice project! I am working on a similar project for an agricultural robot here in Japan. It would be nice if I could contact you to get some advice.