matlabbe
  • 27 videos
  • 474,583 views
RTAB-Map 3D LiDAR SLAM with the NetherDrone
More about this drone in this paper:
"NetherDrone: a tethered and ducted propulsion multirotor drone for complex underground mining stope inspections" (Editor's Choice)
cdnsciencepub.com/doi/abs/10.1139/dsa-2023-0001
More about autonomous exploration with this drone:
"TAPE: Tether-Aware Path Planning for Autonomous Exploration of Unknown 3D Cavities Using a Tangle-Compatible Tethered Aerial Robot"
ieeexplore.ieee.org/abstract/document/9844242
ruclips.net/video/nROO0BFK4lc/видео.html
Thanks to Vertige for letting us test our drone!
vertige-escalade.com
RTAB-Map: introlab.github.io/rtabmap/
00:00 Mapping
05:55 Final Point Cloud
1,427 views

Videos

Testing my teleop skill with Unitree Go1 quadruped robot in simulation! (Multi-camera VSLAM)
758 views · 11 months ago
Launch file demo: github.com/introlab/rtabmap_ros/blob/master/rtabmap_demos/launch/demo_unitree_quadruped_robot.launch
Car Mapping with RTAB-Map on CAT Vehicle Simulator (CitySim)
1.9K views · 2 years ago
(Re-uploaded with simulated terrain fixed) The simulator: github.com/jmscslgroup/catvehicle, based on github.com/osrf/citysim See github.com/introlab/rtabmap_ros/tree/master/rtabmap_demos/launch/demo_catvehicle_mapping.launch for the configuration used and installation steps to modify CAT Vehicle URDF with appropriate sensors. 0:00 Intro 00:52 Mapping streets 7:30 Mapping parking tower 12:40 Dr...
Drone Visual-Based 2D Indoor Navigation with ROS
6K views · 2 years ago
You can try it! It is Open Source on Github: github.com/matlabbe/rtabmap_drone_example 0:00 Phase 1: Guided Exploration (Mapping) 2:16 Phase 2: Autonomous Navigation (Patrolling) 5:58 Full screen visualization
RTAB-Map for iOS - LiDAR Scanner App
13K views · 3 years ago
For best results, iPhone or iPad with LiDAR sensor required. Available on App Store: apps.apple.com/ca/app/rtab-map-3d-lidar-scanner/id1564774365 Free and Open Source: introlab.github.io/rtabmap 0:00 Mapping 1:39 Post-Processing 3:00 VR/AR Visualization 4:15 Append Mode 5:52 Complete Scan Result 7:25 Raw LiDAR Point Cloud 8:42 Without LiDAR Mode
LiDAR-SLAM on Self Racing Cars Dataset (Velodyne HDL-32)
2.1K views · 3 years ago
The dataset: data.selfracingcars.com/ RTAB-Map is the SLAM approach used. For more info and to know how to reproduce this video, read this post: answers.ros.org/question/366097/anyway-to-use-the-map-data-from-this-website-for-slam/
Four Kilometers Walk in Forest (an uncut real-time visual SLAM demo)
20K views · 4 years ago
RTAB-Map ( introlab.github.io/rtabmap ): an open source Simultaneous Localization And Mapping (SLAM) library Stereo images recorded with a MYNT EYE S camera: www.mynteye.com/ 00:00 Mapping 40:00 - Parameters 50:05 - Closing the large loop 52:40 - Closing the small loop 54:40 - Final loop closure Google Map comparison with view similar to 57:16 - www.google.com/maps/place/Mont-Bellevue, Sherbroo...
RTAB-Map Tango High Texture Resolution, Asus ZenFone AR
3.4K views · 6 years ago
Example of RTAB-Map Tango output with max texture resolution. To compare with output from RTAB-Map Tango with default settings, see sketchfab.com/models/16bcf7b60e7b46209154aecfd073e5e6 (note how we cannot read the small names on the wall unlike in this video).
3D Scanning of a Cottage with a Phone
119K views · 7 years ago
For more info: github.com/introlab/rtabmap/wiki/Multi-Session-Mapping-with-RTAB-Map-Tango Sketchfab model: skfb.ly/6p7YJ (Updated Nov 2019: skfb.ly/6OyUy) Erratum: the phone is "Phab2Pro", not "Phan2Pro"
Tango ROS Streamer + RTAB-Map
6K views · 7 years ago
Quick demo of integration of Tango ROS Streamer and RTAB-Map. See wiki.ros.org/rtabmap_ros/Tutorials/Tango ROS Streamer for details.
RTAB-Map Tango 0.11.14 on Phab2Pro
20K views · 7 years ago
Result on Sketchfab: skfb.ly/6nryX App: play.google.com/store/apps/details?id=com.introlab.rtabmap Project: introlab.github.io/rtabmap/
RTAB-Map: Example Odometry Optical Flow (KITTI 07 dataset)
2.9K views · 7 years ago
RTAB-Map on Google's Project Tango
6K views · 8 years ago
Port of RTAB-Map SLAM approach on Google's Project Tango. The application is available on Google Play Store: play.google.com/store/apps/details?id=com.introlab.rtabmap Project page: introlab.github.io/rtabmap/
RTAB-Map: Robust Graph Optimization with GTSAM and Vertigo
7K views · 8 years ago
For more information, see this tutorial: github.com/introlab/rtabmap/wiki/Robust-Graph-Optimization Open source: RTAB-Map: introlab.github.io/rtabmap/ GTSAM: collab.cc.gatech.edu/borg/gtsam Vertigo: openslam.org/vertigo.html
Hand-Held Stereo SLAM
4.9K views · 9 years ago
Demo showing RTAB-Map mapping with a stereo camera: -Top-left: the features tracked to estimate the camera position -Bottom-left: loop closure detection -Center: 3D Map with camera trajectory -Right: Top-view 2D map with ground (light gray) and obstacles (black) More info: github.com/introlab/rtabmap/wiki/Stereo-mapping
Outdoor stereo SLAM with RTAB-Map
111K views · 9 years ago
Outdoor Autonomous Robot Navigation with a Stereo Camera
15K views · 9 years ago
Speeding up RTAB-Map
13K views · 9 years ago
IROS 2014 Kinect Challenge Winner: SLAM approach used
28K views · 9 years ago
Robot Mapping using Laser and Kinect-like Sensor
62K views · 10 years ago
RTAB-Map: Multi-session RGB-D mapping with hand-held Kinect
5K views · 10 years ago
Kinect odometry comparison: SIFT, SURF, FAST/BRIEF and ICP
6K views · 10 years ago
RTAB-Map RGB-D mapping (desktop example)
13K views · 10 years ago
VVindow : Kinect virtual reality
1.4K views · 10 years ago
RTAB-Map RGB-D mapping with global loop closure detection only
5K views · 11 years ago
Simultaneous localization and mapping with a Kinect camera - Journée de la recherche 2013
541 views · 11 years ago
Find-Object inverted search
794 views · 11 years ago

Comments

  • @cekuhnen
    @cekuhnen 4 months ago

    I am still experimenting with RTAB-Map and have some questions - Mathieu Labbé, would you be open for a quick Zoom chat at some point?

  • @GuilhermeGomes2
    @GuilhermeGomes2 5 months ago

    Do you recommend any tutorial for me to start?

  • @2612cristhian
    @2612cristhian 8 months ago

    What type of RGB-D camera do you use?

    • @matlabbe
      @matlabbe 8 months ago

      In this example it was a simulated RGB-D camera.

  • @forrestallison1879
    @forrestallison1879 9 months ago

    What did you build this on? ROS?

    • @matlabbe
      @matlabbe 9 months ago

      Yes, on ROS. The drone uses the open source PX4 Autopilot, interfaced to ROS with the mavros package. This video shows off the mapping part, done by the rtabmap_ros package (using libpointmatcher under the hood for scan matching) with an Ouster OS1 lidar.

    • @forrestallison1879
      @forrestallison1879 9 months ago

      @@matlabbe Very nice. I wish I weren't bothered that ROS has a big stack of bad design decisions (or ones I don't personally like); it would be more fun to work with. Glad it's working so well for so many, though.

  • @Flamenawer
    @Flamenawer 9 months ago

    Would be nice if you integrated a 3D planner.

    • @matlabbe
      @matlabbe 9 months ago

      We did integrate with a planner called TAPE (see reference in description, I added a video link ruclips.net/video/nROO0BFK4lc/видео.html) that we tested on that drone.

    • @Flamenawer
      @Flamenawer 9 months ago

      @@matlabbe Thanks, I'll check it out.

  • @DaveShap
    @DaveShap 9 months ago

    This is really pretty

  • @martincerven
    @martincerven 9 months ago

    Very cool!

  • @i_am_rajee
    @i_am_rajee 9 months ago

    Wow

  • @madeautonomous
    @madeautonomous 9 months ago

    Amazing!

  • @cekuhnen
    @cekuhnen 10 months ago

    How is a loop closure detected? Most of my scans suffer from drift and misalignment.

    • @matlabbe
      @matlabbe 10 months ago

      Loop closures are detected visually. Enough discriminative visual features should be detected to close the loop. While scanning, if you come back to a previously scanned area, try looking at it from the same point of view as the first traversal to maximize the chance of detecting a loop closure. Once a loop closure is detected, you should see "Loop closure detected" appear, with the background changing to green or yellow like at 0:49.
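The appearance-based idea described in this reply can be sketched in a toy form. This is purely illustrative, not RTAB-Map's actual implementation (which uses a bag-of-visual-words on image features with Bayesian filtering); here each "frame" is just a list of visual words, and a loop closure is declared when a new frame's word histogram is similar enough to a past one:

```python
# Toy appearance-based loop-closure detection (bag-of-words voting).
# Illustrative sketch only, NOT RTAB-Map's code.
from collections import Counter
from math import sqrt

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count histograms."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def detect_loop(new_words, past_frames, threshold=0.6):
    """Return the index of the best-matching past frame, or None."""
    new_hist = Counter(new_words)
    best_idx, best_sim = None, 0.0
    for i, words in enumerate(past_frames):
        sim = cosine_similarity(new_hist, Counter(words))
        if sim > best_sim:
            best_idx, best_sim = i, sim
    return best_idx if best_sim >= threshold else None

frames = [["door", "window", "plant"], ["rock", "tree", "tree"]]
print(detect_loop(["door", "plant", "window"], frames))  # 0 (revisited place)
print(detect_loop(["car", "road"], frames))              # None (new place)
```

This also shows why revisiting an area from the same point of view helps: the same physical spot seen from a different angle produces different visual words, so the histograms no longer match.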

    • @cekuhnen
      @cekuhnen 10 months ago

      @@matlabbe I will try this again tomorrow. I teach interior design and this app looks very promising.

  • @i_am_rajee
    @i_am_rajee 10 months ago

    Long time, no see!

  • @yuan0816
    @yuan0816 11 months ago

    I can convert OBJ format to DAE with Meshlab (I use the RTAB-Map iOS app), but I didn't get any texture. I need some help!

    • @matlabbe
      @matlabbe 11 months ago

      Can you see the texture when you open the OBJ in Meshlab? If so, maybe an option in Meshlab's DAE exporter needs to be enabled to export with texture.

  • @martincerven
    @martincerven 11 months ago

    Did you use a rosbag, or is rtabmap running directly on the robot?

    • @matlabbe
      @matlabbe 11 months ago

      This is in simulation (Gazebo)

    • @martincerven
      @martincerven 11 months ago

      I thought it was your house 😅 but I guess you made a mesh from a real building for the Gazebo model, right?

    • @matlabbe
      @matlabbe 11 months ago

      @@martincerven Yeah, it is a 3D scan of a real apartment that I took, imported into Gazebo.

  • @sumitsarkar4517
    @sumitsarkar4517 11 months ago

    So here you are doing only visual localisation on a pre-built .db file... Are you using multiple camera feeds for localisation (since it's titled Multi-camera VSLAM)?

    • @matlabbe
      @matlabbe 11 months ago

      In this video it is SLAM, not localization: the db is built as the robot is moving. Yes, the robot's three RGB-D cameras (front, left, right) are used.
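For reference, the real configuration is the demo launch file linked under the video above. A hedged sketch of how multiple RGB-D cameras are typically fed to rtabmap in ROS1 (topic names are placeholders, and package/nodelet names should be checked against your rtabmap_ros version) looks like this: one `rgbd_sync` per camera pairs RGB and depth into an `RGBDImage`, and the rtabmap node subscribes to the numbered `rgbd_image` topics:

```xml
<!-- Hedged sketch, not the actual demo launch file; see
     rtabmap_demos/launch/demo_unitree_quadruped_robot.launch
     for the real configuration. Topic names are placeholders. -->
<launch>
  <!-- One rgbd_sync nodelet per camera -->
  <node pkg="nodelet" type="nodelet" name="rgbd_sync_front"
        args="standalone rtabmap_sync/rgbd_sync">
    <remap from="rgb/image"       to="/camera_front/color/image_raw"/>
    <remap from="depth/image"     to="/camera_front/depth/image_raw"/>
    <remap from="rgb/camera_info" to="/camera_front/color/camera_info"/>
    <remap from="rgbd_image"      to="/rgbd_image0"/>
  </node>
  <!-- ...repeat for the left (/rgbd_image1) and right (/rgbd_image2) cameras... -->

  <node pkg="rtabmap_slam" type="rtabmap" name="rtabmap">
    <param name="subscribe_rgbd" value="true"/>
    <param name="rgbd_cameras"   value="3"/>
    <remap from="rgbd_image0" to="/rgbd_image0"/>
    <remap from="rgbd_image1" to="/rgbd_image1"/>
    <remap from="rgbd_image2" to="/rgbd_image2"/>
  </node>
</launch>
```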

  • @madeautonomous
    @madeautonomous 11 months ago

    Very cool! Might try it later. Was checking out their Unitree 4D LiDAR L1... but it seems ROS2 support is not easy yet.

    • @matlabbe
      @matlabbe 11 months ago

      It looks like a great lidar for that price!

  • @yuan0816
    @yuan0816 11 months ago

    How do I export the RTAB-Map .db file and import it into Gazebo? Is there any tutorial? Thank you (sorry about my poor English, I'm from Taiwan).

    • @matlabbe
      @matlabbe 11 months ago

      If you use the RTAB-Map iOS app, you can directly export an OBJ file after the assembled mesh has been created. If the database contains a textured mesh (from the iOS app for example), in rtabmap-databaseViewer you can do File->Export optimized mesh. If you have a db without a mesh generated, you may generate one following the instructions here: github.com/introlab/rtabmap/wiki/Multi-Session-Mapping-with-RTAB-Map-Tango#7-export-the-mesh Afterwards, convert the OBJ to DAE with Meshlab. Once you have the DAE, you can create a model for Gazebo. As an example, the environment in this video is the "apt" model from this repo: github.com/matlabbe/rtabmap_drone_example/tree/master/models/apt

    • @yuan0816
      @yuan0816 11 months ago

      @@matlabbe Thanks for your reply!! I will try it later.

  • @sumitsarkar4517
    @sumitsarkar4517 11 months ago

    Just a query... Can I do the whole thing on ROS2, or is it specific to ROS1? Also, the loop closure and odometry images of RTAB-Map look inverted! (Sorry, I may be wrong!!)

    • @matlabbe
      @matlabbe 11 months ago

      The cameras seem upside down on the robot. The demo is specific to ROS1 (the rtabmap launch could be ported to ROS2 though). I am not sure there is an official ROS2 simulator for that robot (it doesn't look very convenient to use the controller on ROS2 right now: ruclips.net/video/YSedTUxI0wc/видео.html)

  • @denzle83
    @denzle83 11 months ago

    I'd be interested to know what sensors were used to get the initial textured point cloud.

    • @matlabbe
      @matlabbe 11 months ago

      The simulated textured environment was captured using the RTAB-Map Android app on a Google Tango phone. I don't actively support the Android app anymore, so I would refer you to RTAB-Map for iOS instead: ruclips.net/video/rVpIcrgD5c0/видео.html

  • @ur-techpartner_de
    @ur-techpartner_de 1 year ago

    Hi matlabbe, awesome work. Are you planning to have a ROS2 branch, since PX4 officially supports the XRCE-DDS agent with ROS2 now? What are the important changes we need to run this without mavros, using Nav2 and ROS2?

  • @bartoszziembicki6199
    @bartoszziembicki6199 1 year ago

    Can "rtabmap_ros::VoxelLayer" and "rtabmap_ros::StaticLayer" be used also as layer plugins in global/local costmaps in Nav2 (ROS2)?

    • @mathieulabbe4889
      @mathieulabbe4889 1 year ago

      I don't think so, we would need to port them to ROS2.

  • @achraf1647
    @achraf1647 1 year ago

    Hey, I want to ask you some questions please, can I get any contact? I really need it.

    • @matlabbe
      @matlabbe 1 year ago

      You can ask on the forum official-rtab-map-forum.206.s1.nabble.com/ or on github github.com/introlab/rtabmap_ros/issues

  • @kopterflug-inspection
    @kopterflug-inspection 1 year ago

    Would it be possible to support Intel RealSense l515?

  • @pavanchowdarycherukuri
    @pavanchowdarycherukuri 1 year ago

    What is meant by "However, the camera of the robot should follow similar paths than the iPad/iPhone did (e.g. same height, same orientation) so that visual localization can work"? Can't we use the RTAB-Map created on the iPad/iPhone for other robots?

    • @matlabbe
      @matlabbe 1 year ago

      We can, by following that comment to get localization between the robot and the iPad/iPhone map.

    • @pavanchowdarycherukuri
      @pavanchowdarycherukuri 1 year ago

      @@matlabbe Thank you for the reply. I am trying to create a map using an iPad and use it to localize my mobile manipulator (it has an RGB-D camera and a 2D lidar). Could you please let me know if there are any useful references so that I can do this? Thank you.

  • @dragonlaw2008
    @dragonlaw2008 1 year ago

    Works amazingly, great app 😊 but for some reason I can't work out how to adjust the clipping plane. If I get too close to an object, the object disappears. How do I adjust the clipping plane?

    • @mathieulabbe4889
      @mathieulabbe4889 1 year ago

      There is no option to do that; we would need to expose that functionality in the UI.

    • @dragonlaw2008
      @dragonlaw2008 1 year ago

      @@mathieulabbe4889 Thanks for the reply. There is another issue I'm having: after I finish the LiDAR scan and try to build the mesh, it disappears. The only way I can bring it back is clicking the "close visualisation" button at the bottom. Is there an option I am missing in settings, such as making the scene size bigger?

  • @____blank____
    @____blank____ 1 year ago

    Just beautiful.

  • @hariharanramamurthy9946
    @hariharanramamurthy9946 1 year ago

    Hi, when I launch Gazebo I am getting errors: "ERROR [px4] Startup script returned with return value: 32512" and "ERROR [param] Parameter SYS_RESTART_TYPE not found". I have no idea what causes them.

  • @KensonLeung0
    @KensonLeung0 1 year ago

    This is awesome. How do you make your apt into a Gazebo world (how did you generate the .dae file)?

    • @matlabbe
      @matlabbe 1 year ago

      I scanned it with RTAB-Map for Google Tango, but it can now also be done with RTAB-Map for iOS (you should have an iPhone Pro or iPad Pro with LiDAR). You can then export an OBJ from the app. To convert to DAE, I used the free Meshlab software: open the OBJ, then export as DAE. Hope it will help you get nice textured environments in Gazebo!

    • @KensonLeung0
      @KensonLeung0 1 year ago

      @@matlabbe Thanks for the instructions. I am using a Samsung S22 Ultra (not Tango-supported, but ARCore with the Depth API). RTAB-Map on the Play Store reported Tango Core outdated and terminated. I guess I have to build from source... Do you have build-from-source instructions for the iOS/Android apps?

    • @matlabbe
      @matlabbe 1 year ago

      @@KensonLeung0 I don't provide the Android app on the Play Store for Android phones other than Google Tango phones because it is too hard to support. I got a version "kinda" working on a Huawei P30 Pro and a Samsung Note10+, both having a ToF camera, but since the ToF camera is not officially integrated in ARCore, I am not satisfied with the results, and we have to re-code/re-test on every different phone, which is unsustainable (I cannot buy every Android phone...). It seems there is no ToF camera on the S22 Ultra; I think Samsung stopped putting one in their new phones. That is why I recommend iOS now: Apple did a good job integrating the LiDAR camera, with the same API for all their phones. ARKit is also a lot closer to Google Tango than ARCore is. If I remember well, I think Apple bought the company behind Google Tango back in the day... so ARKit is kinda Google Tango 2.0.

    • @KensonLeung0
      @KensonLeung0 1 year ago

      @@matlabbe I just looked into the Android code and I can see the effort you've put into handling the individual Android models. I agree that the iOS one is much more consistent and sustainable. Thank you, I will work on the iOS one.

  • @moseschuka7572
    @moseschuka7572 2 years ago

    Is it possible to implement object detection, recognition, pose estimation and tagging while doing this? If so, how?

  • @hariharanramamurthy9946
    @hariharanramamurthy9946 2 years ago

    Hi, I read your question about the D435 map error on the rtabmap forum. Can you say why you used "subscribe_scan"? Isn't the D435 an RGB-D sensor? Please answer this. Also, can you share the repo used for the simulation? I am trying to implement the same, but I don't understand any part of rtabmap from the documentation.

  • @pablodavidarandarodriguez163
    @pablodavidarandarodriguez163 2 years ago

    So, have you used any additional information apart from the pictures that the camera is obtaining? That is, do you fuse the information from the camera images with any other sensor information (maybe GPS or so)?

    • @matlabbe
      @matlabbe 2 years ago

      Only stereo images and the IMU from the camera.

    • @pablodavidarandarodriguez163
      @pablodavidarandarodriguez163 2 years ago

      @@matlabbe The same camera had both sensors? Was it your phone or a different device?

    • @matlabbe
      @matlabbe 2 years ago

      @@pablodavidarandarodriguez163 This was with a MYNT EYE camera.

  • @gulhankaya5088
    @gulhankaya5088 2 years ago

    Hello, how do I use the map I created? I am using the application in an autonomous vehicle. How will it traverse the map I created when I put the vehicle at the starting point?

  • @chemistry2023
    @chemistry2023 2 years ago

    A great full 3D map!

  • @meinotherside1
    @meinotherside1 2 years ago

    Amazing project. I have some markers or landmarks along the trajectory; how do I configure rtabmap so that my mapping is aligned with respect to these marks? Thanks.

    • @mathieulabbe4889
      @mathieulabbe4889 2 years ago

      You can enable ArUco/AprilTag detection in the settings of the app. They will be used for loop closure detection, but they won't be used to transform the map into some world coordinate frame that the tags would be in.

    • @meinotherside1
      @meinotherside1 2 years ago

      @@mathieulabbe4889 Thank you, but when I try to optimize with these settings, RTAB-Map closes. Why? I use the rtabmap standalone for Windows.

    • @meinotherside1
      @meinotherside1 2 years ago

      @@mathieulabbe4889 And finally, is it possible to use only markers for loop closure?

    • @mathieulabbe4889
      @mathieulabbe4889 2 years ago

      @@meinotherside1 Yes, you can disable visual loop closure by setting Preferences->Vocabulary->Maximum words per image... to -1.
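The GUI setting mentioned in this reply maps to RTAB-Map's parameter system. A hedged sketch of the equivalent parameters for a marker-only setup follows; the marker-related names are assumptions and should be verified against the output of `rtabmap --params` for your version:

```ini
; Marker-only loop closure sketch; verify names with `rtabmap --params`.
; Kp/MaxFeatures = -1 disables the visual bag-of-words vocabulary
; (the GUI's "Maximum words per image" setting).
Kp/MaxFeatures = -1
; Enable fiducial marker detection (assumed parameter names):
RGBD/MarkerDetection = true
Marker/Dictionary = 20
; OpenCV ArUco dictionary id; 20 = DICT_APRILTAG_36h11
Marker/Length = 0.16
; printed tag size in meters
```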

    • @meinotherside1
      @meinotherside1 2 years ago

      @@mathieulabbe4889 Thank you for your answers. I will try.

  • @dikmugget
    @dikmugget 2 years ago

    I bet seeing those loop closures happen was an amazing feeling and vindication of your hard work! :)

  • @zeeshan5735
    @zeeshan5735 2 years ago

    rosrun package_name Offboard does not work... the terminal has no output, and the drone doesn't start floating. Can you tell what could have gone wrong?

    • @matlabbe
      @matlabbe 2 years ago

      Just tested again on ROS Noetic / Ubuntu 20.04 and it works as expected. Make sure you are using the right versions of mavros/PX4, as in the README of the example.

  • @yw9926
    @yw9926 2 years ago

    Is it possible to export accelerometer, gyro and magnetometer measurements and poses with timestamps?

  • @uuu12345-z
    @uuu12345-z 2 years ago

    What dataset did you use?

    • @matlabbe
      @matlabbe 2 years ago

      I recorded it next to where I live.

  • @elmichellangelo
    @elmichellangelo 2 years ago

    To think this was made a couple of years ago. Big up to you.

  • @Gfish17
    @Gfish17 2 years ago

    How many terabytes? Can we get small handheld versions of LiDAR like in Scanner Sombre?

    • @matlabbe
      @matlabbe 2 years ago

      The final map size is around 1 GB (at 15:22 you can actually see 965 MB), but the raw point clouds have been decimated with a 20 cm voxel in this example. Handheld LiDAR solutions already exist, though they are maybe too expensive for gaming. A low-cost version could be possible; see for example official-rtab-map-forum.206.s1.nabble.com/Kinect-For-Azure-L515-ICP-lighting-invariant-mapping-td7187.html
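The voxel decimation mentioned in this reply is a generic point-cloud technique: keep one representative point per fixed-size cube. A minimal sketch (illustrative only, not RTAB-Map's implementation, which relies on PCL-style voxel filters):

```python
# Voxel-grid decimation sketch: average all points that fall
# into the same voxel-sized cube (20 cm by default, as above).
from collections import defaultdict

def voxel_downsample(points, voxel=0.2):
    """Return one averaged point per occupied voxel."""
    bins = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        bins[key].append((x, y, z))
    # Average each voxel's points coordinate-wise.
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in bins.values()]

cloud = [(0.01, 0.02, 0.0), (0.05, 0.03, 0.01), (1.0, 1.0, 1.0)]
print(len(voxel_downsample(cloud)))  # 2: the first two points share a voxel
```

The trade-off is exactly the one described above: a coarser voxel (20 cm here) shrinks the map dramatically at the cost of geometric detail.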

    • @Gfish17
      @Gfish17 2 years ago

      @@matlabbe interesting 🤔

    • @KensonLeung0
      @KensonLeung0 2 years ago

      Get a VLP-16.

    • @Gfish17
      @Gfish17 2 years ago

      @@KensonLeung0 Like Scanner sombre in real life.

    • @KensonLeung0
      @KensonLeung0 2 years ago

      @@Gfish17 That would be fun. How about getting wearable glasses with the RTAB-Map Android app? (To ensure good quality, pick a phone compatible with ARCore and a ToF depth camera, then mirror to the glasses.)

  • @GospodinJean
    @GospodinJean 2 years ago

    Which software is that?

    • @matlabbe
      @matlabbe 2 years ago

      RTAB-Map for Google Tango, but it is now available on iOS (LiDAR required).

  • @MarkRuvald
    @MarkRuvald 2 years ago

    What drone did you use?

    • @matlabbe
      @matlabbe 2 years ago

      This is a scaled-down version (0.6× the size) of the default simulated drone from the PX4 project (www.arducopter.co.uk/iris-quadcopter-uav.html).

  • @MarkRuvald
    @MarkRuvald 2 years ago

    Awesome! This is with a monocular RGB sensor, right? What compute requirements does this have? GPU, or is CPU possible too?

    • @matlabbe
      @matlabbe 2 years ago

      This is RGB-D; a depth camera is required to see obstacles accurately. The GPU is only used for the simulator and visualization; all navigation is done on the CPU. To use a less powerful computer, you may have to switch the visual odometry approach to a lighter one (ideally a VIO approach).

    • @jonsundar
      @jonsundar 2 years ago

      @@matlabbe How do you think using a stereo camera in place of RGB-D would affect the output? Great job, by the way.

    • @matlabbe
      @matlabbe 2 years ago

      @@jonsundar With stereo, you cannot "see" white/textureless walls, which are common in indoor environments. As a workaround, tape a pattern on the walls at the height at which the drone will fly; then it will be able to avoid them!

    • @jonsundar
      @jonsundar 2 years ago

      @@matlabbe Thank you for answering that. I am just at the conception stage of a mobile application for the building construction and infrastructure industry. Your tech will certainly be the future in construction applications 👍

  • @matlabbe
    @matlabbe 2 years ago

    Update: the bug at 3:56 doesn't seem to appear anymore in the latest PX4 versions. On Noetic with PX4 v1.12.3, it could fly fully autonomously for over 60 min without any sign of that bug. PX4 v1.8.2 was used in this video under Melodic.

  • @KennyDu
    @KennyDu 2 years ago

    How can I build an iOS scan app from your open source?

    • @matlabbe
      @matlabbe 2 years ago

      See github.com/introlab/rtabmap/wiki/Installation#ios

  • @JabiloJoseJ
    @JabiloJoseJ 2 years ago

    What camera are you using?

    • @matlabbe
      @matlabbe 2 years ago

      In this video it was an old Bumblebee2 stereo camera (yes, the same name as the Transformer).

  • @MarkRuvald
    @MarkRuvald 2 years ago

    How would you compare RTAB-Map with ORB-SLAM2?

    • @matlabbe
      @matlabbe 2 years ago

      You can check this paper for a comparison with standard datasets: introlab.3it.usherbrooke.ca/mediawiki-introlab/images/7/7a/Labbe18JFR_preprint.pdf