Visual and LIDAR based SLAM with ROS using Bittle and Raspberry Pi

  • Published: 16 Aug 2024
  • Article on Hackster.io
    www.hackster.i...
    0:00 Intro
    0:30 Quick demo
    0:53 Theory
    2:23 Elon Musk appearance :)
    2:58 Visual SLAM with ORB-SLAM2
    5:11 Hector SLAM
    7:03 Ending
    The hardware for this video was kindly provided by Seeed Studio; check out the RPLIDAR A1M8 LIDAR, Raspberry Pi 4 2GB, and other hardware at the Seeed Studio store!
    www.seeedstudi...
    www.seeedstudi...
    More articles about SLAM:
    What Is Simultaneous Localization and Mapping?
    blogs.nvidia.c...
    LSD-slam and ORB-slam2, a literature based explanation
    / lsd-slam-vs-orb-slam2-...
    RPLIDAR and ROS programming- The Best Way to Build Robot
    www.seeedstudi...
    Driver and launch files for Bittle:
    github.com/AIW...

Comments • 57

  • @Hardwareai
    @Hardwareai  17 days ago +1

    SLAMTEC RPLIDAR A1M8 with free shipping (affiliate link):
    s.click.aliexpress.com/e/_DeVJ2LN

  • @witchking64
    @witchking64 3 years ago +2

    Thanks for this! I'm looking into incorporating vision/POV into movement, so your explanations help a lot

  • @lujustin668
    @lujustin668 1 year ago

    Great video, makes the responsibility question easy to understand

  • @Tetsujinfr
    @Tetsujinfr 3 years ago +1

    Thanks for your efforts on testing those components in the wild. Always good to see real-world perf vs marketing-video BS. Keep up the good work!

    • @Hardwareai
      @Hardwareai  3 years ago

      Thank you for the feedback! Real-world performance-wise, Bittle actually exceeded my expectations - I didn't think it would even be able to start with the LIDAR and Pi 3 connected, because of the current draw, let alone walk/crawl and even (do not try this at home) jump. Hector SLAM performance was adequate. ORB-SLAM2 didn't perform well - in the video you can see the Path messages are mostly correct, but the occupancy map doesn't make any sense. That could be an issue with settings/camera height, but even if those issues are resolved, the camera still moves way too much during walking, so I don't think monocular SLAM is an option for Bittle. Would be interesting to try stereo.

    • @Tetsujinfr
      @Tetsujinfr 3 years ago

      @@Hardwareai Usually blur is a killer for feature matching. FYI, I am using a ZED camera @ 100 fps and it supports pretty fast movements, provided there is enough light (it also has an IMU to help track very fast/short rotation-type movements). I have not experimented with LIDAR, but your testing is cool, and achieving something with an RPi is quite something.
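
(As an aside on the blur point above: a minimal OpenCV sketch, not from the video, of ORB feature matching between two consecutive frames. ORB-SLAM2 ships its own ORB extractor, but the idea is the same - on a blurred frame pair the keypoint and match counts collapse, which is when tracking gets lost. The file names are placeholders.)

# Hedged sketch only: ORB feature matching between two consecutive frames.
import cv2

# Two consecutive grayscale frames from the robot camera (placeholder paths).
frame_a = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
frame_b = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)          # the same feature type ORB-SLAM2 relies on
kp_a, des_a = orb.detectAndCompute(frame_a, None)
kp_b, des_b = orb.detectAndCompute(frame_b, None)

# Brute-force Hamming matcher with cross-check, as is typical for binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)

# On a motion-blurred pair this count drops sharply compared to a sharp pair.
print("keypoints: %d / %d, matches: %d" % (len(kp_a), len(kp_b), len(matches)))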

  • @marktadlock5428
    @marktadlock5428 1 year ago

    Thank you for being helpful about slam software

  • @andrewa6713
    @andrewa6713 3 years ago +3

    Good evening sir 😊 I was wondering if you could make a video about integrating Hector SLAM with the navigation stack?

    • @Hardwareai
      @Hardwareai  3 years ago +2

      Good morning :) I could, but aren't there existing videos on this topic already? I made this particular one because 1) I wanted to try the LIDAR and camera on Bittle, and 2) there isn't much information on the internet about V-SLAM for ROS

    • @andrewa6713
      @andrewa6713 3 years ago

      I would be very thankful if you could send me one of those videos on this topic. Actually, you were the only one who made this kind of video. Anyway, thank you sir for replying 😊

  • @antonwinter630
    @antonwinter630 3 years ago

    thanks for sharing the results.

  • @iwoaugustynski9265
    @iwoaugustynski9265 3 years ago

    Another inspiring video. Great work!

  • @ahmedaly6698
    @ahmedaly6698 3 years ago +1

    Hi, you said in the video that Hector SLAM can be used to publish odometry to be used later in the navigation stack.
    Can you make a video or give guidelines on how to integrate Hector SLAM with the navigation stack?
    Thank you in advance (I've finished Hector SLAM and am controlling the robot using commands from the keyboard).
    Side note: it is for a graduation project.

    • @Hardwareai
      @Hardwareai  3 years ago +1

      Right. Well, this video actually turned out to be more popular than I expected, so a continuation is a possibility indeed. Unfortunately, I am super busy at the moment and will probably finish the TinyML series first, even though that one wasn't very popular.

    • @ahmedaly6698
      @ahmedaly6698 3 years ago

      @@Hardwareai If you aren't able to make a video at the moment, but you know the main steps for integrating the two, it would be extremely helpful if you could just tell me those main steps.
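
(For reference, a minimal untested sketch of one common route: bridging hector_mapping's pose output to nav_msgs/Odometry so the navigation stack can consume it. The topic name slam_out_pose and the frame name base_link are the usual hector_mapping/ROS defaults and are assumptions here; some hector_mapping builds can also publish odometry directly via a pub_odometry parameter.)

#!/usr/bin/env python
# Hedged sketch, not from the video: republish Hector SLAM's pose estimate as
# nav_msgs/Odometry so the navigation stack (e.g. move_base) can use it.
import rospy
from geometry_msgs.msg import PoseStamped
from nav_msgs.msg import Odometry


class HectorPoseToOdom:
    def __init__(self):
        self.pub = rospy.Publisher("odom", Odometry, queue_size=10)
        # "slam_out_pose" is the usual hector_mapping pose topic; adjust if yours differs.
        rospy.Subscriber("slam_out_pose", PoseStamped, self.pose_cb)

    def pose_cb(self, msg):
        odom = Odometry()
        odom.header = msg.header                  # frame_id comes from Hector (usually "map")
        odom.child_frame_id = "base_link"         # assumption: robot base frame name
        odom.pose.pose = msg.pose                 # position + orientation from scan matching
        # Twist is left at zero here; a fuller bridge would differentiate consecutive poses.
        self.pub.publish(odom)


if __name__ == "__main__":
    rospy.init_node("hector_pose_to_odom")
    HectorPoseToOdom()
    rospy.spin()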

    • @andrewa6713
      @andrewa6713 3 years ago

      What a coincidence ... it will be something great as it’s for my graduation project too.

  • @chiragsorate839
    @chiragsorate839 3 years ago

    Can you please make a video on a step-by-step implementation of this SLAM technique?

    • @Hardwareai
      @Hardwareai  3 years ago

      I have published a supplementary article for more details. Is there something that needs more detailed explanation?

  • @HemangJoshi
    @HemangJoshi 2 years ago

    Awesome bro...,👍👍👍

  • @ppowell1212
    @ppowell1212 3 years ago +1

    Thanks for the video. I run a small nonprofit theatre. I'm looking to navigate small wheeled scenery pieces with an electric motor on and off stage - driving themselves from point A to point B and back again (they don't need to move fast). What you are doing seems similar; I just want to do it with wheeled platforms. Any thoughts on a model I could emulate to accomplish this?

    • @Hardwareai
      @Hardwareai  2 years ago

      Model? I don't think a neural network model is necessary for that. Wheeled navigation will be a bit easier: just get a LIDAR + motors with encoders. ROS packages are available for making the navigation stack work with this hardware.
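
(To illustrate the "motors with encoders" part above: a rough, untested dead-reckoning sketch for a differential-drive base that turns encoder tick deltas into nav_msgs/Odometry. All constants and the encoder-reading function are placeholders for real hardware.)

#!/usr/bin/env python
# Hedged sketch only: differential-drive dead reckoning from wheel encoders.
import math
import rospy
from nav_msgs.msg import Odometry
from tf.transformations import quaternion_from_euler

TICKS_PER_REV = 360.0   # placeholder: encoder ticks per wheel revolution
WHEEL_RADIUS = 0.03     # placeholder: wheel radius in metres
WHEEL_SEP = 0.15        # placeholder: distance between the wheels in metres


def read_encoder_deltas():
    # Placeholder: replace with real reads from your motor driver / encoder counters.
    return 0, 0


def main():
    rospy.init_node("encoder_odometry")
    pub = rospy.Publisher("odom", Odometry, queue_size=10)
    x = y = theta = 0.0
    rate = rospy.Rate(20)
    while not rospy.is_shutdown():
        dl_ticks, dr_ticks = read_encoder_deltas()
        # Convert tick deltas to wheel travel, then integrate the planar pose.
        d_left = 2.0 * math.pi * WHEEL_RADIUS * dl_ticks / TICKS_PER_REV
        d_right = 2.0 * math.pi * WHEEL_RADIUS * dr_ticks / TICKS_PER_REV
        d_center = (d_left + d_right) / 2.0
        theta += (d_right - d_left) / WHEEL_SEP
        x += d_center * math.cos(theta)
        y += d_center * math.sin(theta)

        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, theta)
        odom.pose.pose.orientation.x = qx
        odom.pose.pose.orientation.y = qy
        odom.pose.pose.orientation.z = qz
        odom.pose.pose.orientation.w = qw
        pub.publish(odom)
        rate.sleep()


if __name__ == "__main__":
    main()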

    • @ppowell1212
      @ppowell1212 2 years ago

      @@Hardwareai Thanks. Forgive me, what is an ROS package?

  • @Tamingshih
    @Tamingshih 1 year ago

    nice job!

  • @melihaslan9509
    @melihaslan9509 2 years ago

    Awesome I learned something new

  • @MemoxCid
    @MemoxCid 4 months ago

    How did you run RVIZ on your Raspberry so fast? I can't run RVIZ on my Raspberry because it is so slow... Great video

    • @Hardwareai
      @Hardwareai  4 months ago

      I think I mentioned in either the video or the article that I don't run RVIZ on the Raspberry Pi? It's not recommended to do that. Instead, you run the image / LIDAR / robot driver ROS nodes on the robot itself and run RVIZ on your PC.
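
(A minimal sketch of that split, under the usual ROS1 assumptions: roscore plus the LIDAR/camera/driver nodes run on the Pi, while RVIZ and this test listener run on the PC, with ROS_MASTER_URI on the PC pointing at the Pi and ROS_IP set on both machines. The /scan topic name is the common rplidar_ros default.)

#!/usr/bin/env python
# Hedged sketch: PC-side listener confirming that topics published on the Pi arrive
# over the network. Before running anything on the PC, point it at the Pi's master:
#   export ROS_MASTER_URI=http://<pi-ip>:11311
#   export ROS_IP=<pc-ip>
# and on the Pi set ROS_IP=<pi-ip>. RVIZ on the PC uses the same environment.
import rospy
from sensor_msgs.msg import LaserScan


def scan_cb(msg):
    # Just confirm data is arriving from the Pi over the network.
    rospy.loginfo("got %d ranges from %s", len(msg.ranges), msg.header.frame_id)


if __name__ == "__main__":
    rospy.init_node("pc_side_scan_listener")
    rospy.Subscriber("/scan", LaserScan, scan_cb)
    rospy.spin()

(If the range counts print on the PC, RVIZ on the same machine will see the scans too.)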

    • @MemoxCid
      @MemoxCid 4 months ago

      @@Hardwareai Is it possible to do this communication between the Raspberry and the PC using Docker on both?

    • @Hardwareai
      @Hardwareai  4 months ago

      @@MemoxCid yes, here is the link wiki.ros.org/docker/Tutorials/Docker

  • @pewpewpistol
    @pewpewpistol 3 years ago

    That's a really small lidar! Where can I look that up?
    5:40 mark, for later

    • @Hardwareai
      @Hardwareai  3 years ago +1

      You can get it here www.seeedstudio.com/RPLiDAR-A1M8-360-Degree-Laser-Scanner-Kit-12M-Range.html

  • @kimagurerobo
    @kimagurerobo 3 years ago

    Awesome!!! Thank you!

  • @grqfes
    @grqfes 1 year ago

    I'm trying to use SLAM with an array of phototransistors close to the ground, which would pick up the white lines on the playing field for RoboCup soccer. This would also let it navigate autonomously. Do you think it's possible? How difficult would it be to set that up? I have around a 5-month deadline

    • @Hardwareai
      @Hardwareai  1 year ago

      Okay, considering that was 9 months ago (sorry, I was busy with my MSc at that time; replying to all the old comments now)...
      That would be a very interesting task - but you will need a highly custom algorithm for that. Are you okay with using a cheap camera module(s) instead?

  • @levbereggelezo
    @levbereggelezo 3 years ago

    Very good 👍

    • @Hardwareai
      @Hardwareai  3 years ago

      Thank you! Cheers!

    • @DSEC_Siddhanthvjain
      @DSEC_Siddhanthvjain 9 months ago

      @@Hardwareai Can we use this SLAM on VOXL with the help of ROS? Please help

  • @devanshuchaudhary5700
    @devanshuchaudhary5700 3 years ago

    Was visual SLAM using ORB-SLAM performed completely on the Raspberry Pi, or on a separate master PC?

    • @Hardwareai
      @Hardwareai  3 years ago

      The answer is in the video ;)
      But yes, it is performed completely on the R Pi - that's why I'm using the newer Raspberry Pi 4 for ORB-SLAM.

    • @devanshuchaudhary5700
      @devanshuchaudhary5700 3 years ago

      Will it work on Ubuntu 20.04?

  • @pafcunha100
    @pafcunha100 3 years ago

    Hi bro, first of all, nice work!
    Can I use Ubuntu MATE 20.04.1 LTS?
    Also, could you make a step-by-step video on installation and usage?
    Thanks

    • @Hardwareai
      @Hardwareai  3 years ago +1

      Thank you!
      Well, is there any reason to use 20.04 instead of 18.04 currently? 18.04 is more stable and, from my experience running it on the Raspberry Pi, has almost the same features. I understand 20.04 is newer, but unless it has features that you require and that are not present in the previous version, newer is not always "better" :)

    • @osaydhussein258
      @osaydhussein258 2 years ago

      Did it work for you ?

    • @DSEC_Siddhanthvjain
      @DSEC_Siddhanthvjain 9 months ago

      @@osaydhussein258 Can we use this SLAM on VOXL with the help of ROS? Please help

  • @AryanKapur0605
    @AryanKapur0605 2 years ago

    Can you please make a human-following car using this hardware?

    • @Hardwareai
      @Hardwareai  2 years ago

      You mean Bittle?? It's not a car though xD

  • @michaelzajac5284
    @michaelzajac5284 1 year ago

    It's a free download, isn't it?