Open-source SLAM with Intel RealSense depth cameras

  • Published: Sep 5, 2024

Comments • 42

  • @lucasguo8090
    @lucasguo8090 3 years ago +5

    08:06, available SLAM solutions
    25:33, global shutter for rapid moving platforms

  • @ChicagoBob123
    @ChicagoBob123 5 years ago +3

    The issue is the extremely limited range. If it worked well at 30 meters it would be a lot more useful. It also has issues with sunlight reflections.

  • @Hasan...
    @Hasan... 5 years ago +6

    I don't get it. The tracker was used separately (additionally) to provide IMU data to the visual odometry. But this presentation is from May 28, 2019. Why didn't he use the D435i, which has a built-in IMU? Shouldn't that provide both IMU and visual odometry together as a single unit?

    • @patrickpoirier1877
      @patrickpoirier1877 5 years ago +1

      Adding the T265 provides a VIO-based odometry topic to the system, making tracking much more accurate than IMU alone, which is prone to drift, as shown in the first video.

    • @mattanimation
      @mattanimation 5 years ago +1

      The tracker outputs visual odometry computed on its onboard ASIC, whereas the D435i outputs IMU data that would then need to be fed into an algorithm running on the host device's CPU or GPU.
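
The distinction in this reply can be sketched with the pyrealsense2 Python bindings: the T265 hands the host a ready-made 6-DoF pose, since the visual-inertial odometry runs on the camera itself. A minimal sketch, assuming a T265 is plugged in and the `pyrealsense2` package is installed (this will not run without the hardware):

```python
# Sketch: reading the T265's on-camera 6-DoF pose via pyrealsense2.
# No host-side SLAM is involved; the VIO runs on the camera's ASIC.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)  # the T265's pose stream
pipe.start(cfg)
try:
    frames = pipe.wait_for_frames()
    pose = frames.get_pose_frame()
    if pose:
        data = pose.get_pose_data()
        print("position:", data.translation)
        print("velocity:", data.velocity)
        print("tracker confidence (0-3):", data.tracker_confidence)
finally:
    pipe.stop()
```

With a D435i, by contrast, one would subscribe to the raw accelerometer/gyro streams and run a filter or SLAM package on the host to obtain a comparable pose.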

  • @gustavovelascoh
    @gustavovelascoh 3 years ago +2

    Nice video. At first I thought it was "Realsense with Jake Gyllenhaal"

  • @jamesdavidson11
    @jamesdavidson11 3 years ago +1

    SLAM = Simultaneous Localization And Mapping

  • @ChicagoBob123
    @ChicagoBob123 4 years ago +1

    Where is the link to the next speaker?

  • @natethegreatest1000
    @natethegreatest1000 1 year ago

    How does this software account for moving objects, such as other people walking around in that office?

  • @kappapride6226
    @kappapride6226 10 months ago

    Does it work on D435 and L515 cameras?

  • @yankoaleksandrov
    @yankoaleksandrov 4 years ago

    Can I run a drone with a Jetson Nano and a stack of T265 and D400 cameras and use them for indoor navigation?

  • @tienhoangngoc7867
    @tienhoangngoc7867 4 years ago

    Can I ask what kind of SLAM this project uses?

  • @TheRockeyAllen
    @TheRockeyAllen 4 years ago +3

    Can I use these products for commercial applications? Can I make a robot with an Intel T265 and Jetson Nano and just sell it? Or will that require licenses and stuff?

    • @robnierreyes7197
      @robnierreyes7197 3 years ago

      depends on who you sell it to :)

    • @TheRockeyAllen
      @TheRockeyAllen 3 years ago

      @@robnierreyes7197 oh... where can I find more information on this?

    • @robnierreyes7197
      @robnierreyes7197 3 years ago +2

      @@TheRockeyAllen you need to know the regulations for the field where you are deploying your robotics solution. You need to look for ISO standards and such regulatory documents. If you are selling a surgical robot to a hospital then you need to follow medical standards. If you are selling a drone to consumers then you need to follow aviation regulations, etc.

  • @omnigeddon
    @omnigeddon 1 year ago +1

    Basically, buy a used Xbox Kinect and gain color and depth instead of just depth. You can buy a Kinect for 3 dollars.

  • @smtabatabaie
    @smtabatabaie 5 years ago +7

    That's great. Where can I find the source code or a sample?

    • @froimv
      @froimv 5 years ago +3

      I second the motion.

  • @laisan86
    @laisan86 4 years ago +3

    I would like to have the complete documents and source code files to read. Where can I get them?

    • @olawlor
      @olawlor 4 years ago +6

      A T265 overview is here, basically a paper version of this talk, with links to the T265 ROS node source code: dev.intelrealsense.com/docs/intel-realsensetm-visual-slam-and-the-t265-tracking-camera
      SLAM with D435i setup guide for ROS: github.com/IntelRealSense/realsense-ros/wiki/SLAM-with-D435i
      RTAB-Map (the underlying SLAM library) source code: github.com/introlab/rtabmap
      This also needs the Point Cloud Library: github.com/PointCloudLibrary/pcl

    • @laisan86
      @laisan86 4 years ago

      @@olawlor Thanks a lot!

  • @KK-fh1ds
    @KK-fh1ds 3 years ago

    Good presentation

  • @darthmop2293
    @darthmop2293 4 years ago

    Hi, I have some questions. How did you make the sensor system? Which hardware did you use? Can I build it with a Jetson Nano and a battery pack? Is there a German person to talk to?

    • @TheBigCrag123
      @TheBigCrag123 2 years ago

      Hey, I have a similar project in mind. Have you managed to finish your project in the meantime?

  • @_aawawaa_
    @_aawawaa_ 4 years ago +2

    I tried this with my RealSense D435i on a Jetson Nano. It lost tracking easily, even when I moved it slowly by hand. Does anyone have a suggestion to fix the problem?

    • @Frankx520
      @Frankx520 4 years ago

      I just bought a D415 for my Jetson Nano robot, and I saw your comment... So, did your tracking get any better?

    • @_aawawaa_
      @_aawawaa_ 4 years ago

      @@Frankx520 It got better after I set the parameters recommended here: wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping
      Don't use the launch file included in the RealSense sample folder; read the RTAB-Map wiki and GitHub so you can tune it better.
      Good luck~
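
The tutorial linked in this reply boils down to launching the camera driver and RTAB-Map separately, with the depth and color topics remapped. A hedged sketch of the two commands, assuming the stock `realsense2_camera` launch file's topic names; exact argument names can differ across `rtabmap_ros` versions:

```shell
# Terminal 1: start the RealSense driver with depth aligned to color.
roslaunch realsense2_camera rs_camera.launch align_depth:=true

# Terminal 2: start RTAB-Map, pointing it at the camera's topics.
roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start" \
    depth_topic:=/camera/aligned_depth_to_color/image_raw \
    rgb_topic:=/camera/color/image_raw \
    camera_info_topic:=/camera/color/camera_info
```

Tracking loss on fast motion is often improved by tuning RTAB-Map's odometry parameters rather than the launch file shipped with the RealSense samples, as the comment suggests.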

    • @Frankx520
      @Frankx520 4 years ago

      @@_aawawaa_ COOL! I am also trying to add two encoders and an IMU to the robot car. I hope I can get a good result :) Thanks for the link!!!

  • @ChicagoBob123
    @ChicagoBob123 5 years ago +2

    Also, the current cameras have no build for the Pi 4.

  • @OEFarredondo
    @OEFarredondo 5 years ago

    Shukran habibi (thanks, my friend)

  • @ericpham8205
    @ericpham8205 3 years ago

    The sensor was not correctly designed. Use spatial sensing without a camera.

  • @somatosthatine
    @somatosthatine 3 years ago

    Steve Carell explaining SLAM...

  • @EliSpizzichino
    @EliSpizzichino 1 year ago

    3 years later... are there any better solutions on the market? I can't believe Intel or someone else hasn't come out with a single camera doing everything.

    • @TULEYDIGITAL
      @TULEYDIGITAL 3 months ago

      Closest would be the HTC Vive Ultimate -> Calibrate -> Run it in SteamVR in headless mode -> Use the OpenXR package in Python, installed via pip, to read the pose

  • @RowdyCoders
    @RowdyCoders 5 years ago

    👏👏👍

  • @goofybits8248
    @goofybits8248 4 years ago

    👌👍❤