NanoSLAM: Enabling Fully Onboard SLAM for Tiny Robots

  • Published: 8 Feb 2025

Comments • 5

  • @RC_Ira • 1 year ago +1

    Very interesting video, awesome work!🤩👍

  • @krishnapranav9123 • 3 months ago

This is very insightful, thank you.

  • @Flare1107 • 11 months ago +1

    Are the loop-closure scan and reference scan required because of a lack of precision in the odometry, or is that typical of all robotics? I know there are plenty of issues with real-world vs. simulated location/trajectory in quadrupeds. If a more sensitive ToF sensor were used, maybe an RGB-D camera, would you still lack precision? Or could you include redundant positioning with a secondary IMU? (See the first sketch after the comments.)

  • @vigneshbalaji21 • 1 year ago +1

    Very nice :) taking care of visual odometry drift with a graph-based approach. One question: can the onboard IMU be used to take care of this drift? I ask because, for this graph to do SLAM, it needs a closed graph path. Could the IMU make a difference, maybe with a Kalman filter? (See the second sketch after the comments.)

  • @kunaldesarda1095 • 1 year ago

    Hey, compliments on the research work.
    Just one question: as you are using plain cardboard boxes and there apparently is not much in the maze, how is the drone able to localize? (See the third sketch after the comments.)
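
On @Flare1107's question about why loop closure is needed: any odometry, however precise, accumulates error without bound over time, so a better ToF sensor or an RGB-D camera would only slow the drift, not remove it; a loop-closure constraint is what makes the error bounded. Below is a minimal 1-D pose-graph sketch with made-up numbers (not from the paper) showing how a single loop-closure edge redistributes accumulated odometry error:

```python
# Minimal 1-D pose-graph example with hypothetical numbers: the robot
# drives +1, +1, -1, -1 metres back to its start, but every odometry
# edge over-reads by 5 cm, so dead reckoning ends 0.20 m from the origin.
# One loop-closure edge (x4 == x0) lets least squares spread that error
# over the whole trajectory.
import numpy as np

n = 5                                   # poses x0..x4
odo = [1.05, 1.05, -0.95, -0.95]        # biased measurements (truth: +1,+1,-1,-1)

dead = np.concatenate(([0.0], np.cumsum(odo)))
print("dead-reckoned poses:", dead)     # ends at 0.20, not 0.0

rows, rhs = [], []
for i, z in enumerate(odo):             # odometry edges: x[i+1] - x[i] = z
    row = np.zeros(n); row[i + 1], row[i] = 1.0, -1.0
    rows.append(row); rhs.append(z)
row = np.zeros(n); row[4], row[0] = 1.0, -1.0   # loop closure: x4 - x0 = 0
rows.append(row); rhs.append(0.0)
row = np.zeros(n); row[0] = 1.0                 # anchor the first pose: x0 = 0
rows.append(row); rhs.append(0.0)

x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
print("optimized poses:   ", x.round(3))  # end error drops from 0.20 to 0.04
```

With equal weights the soft closure edge shrinks the end-point error from 0.20 m to 0.04 m; a real back end weights edges by their covariances, so a confident loop closure pulls the loop arbitrarily close to shut.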
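On @vigneshbalaji21's question: an IMU fused through a Kalman filter helps over short horizons, but it is still dead reckoning; any residual accelerometer bias double-integrates into position error that grows roughly quadratically with time. A quick numeric sanity check (the bias value is assumed for illustration, not measured):

```python
# Hypothetical sanity check: a stationary accelerometer with a small
# constant bias, double-integrated the way dead reckoning would do it.
# Position error grows ~0.5*b*T^2, which is why IMU + Kalman filtering
# smooths drift but cannot bound it the way a loop closure can.
import numpy as np

dt, T = 0.01, 60.0          # 100 Hz IMU samples over one minute
bias = 0.02                 # assumed residual accel bias, m/s^2
steps = int(T / dt)

vel = np.cumsum(np.full(steps, bias)) * dt   # accel -> velocity
pos = np.cumsum(vel) * dt                    # velocity -> position

print(f"position error after {T:.0f} s: {pos[-1]:.2f} m")   # ~36 m
print(f"closed form 0.5*b*T^2:          {0.5 * bias * T**2:.2f} m")
```

A Kalman filter that also sees odometry can estimate part of the bias, but both inputs are relative measurements, so the fused estimate still drifts; the closed graph path (loop closure) is what supplies the missing absolute constraint.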
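On @kunaldesarda1095's question: even visually plain cardboard walls carry geometric information in a depth scan; a corner constrains both translation axes and the heading, so matching a new scan against a reference scan can recover the relative pose. A rough illustration using textbook point-to-point ICP (hypothetical data; the paper's actual scan-matching pipeline may differ):

```python
# Rough ICP sketch with made-up data: two perpendicular "cardboard walls"
# (an L-shaped corner) sampled as 2-D points. Even with zero texture, the
# corner geometry pins down rotation and both translation axes, so scan
# matching can localize against a reference scan.
import numpy as np

def icp_2d(src, dst, iters=25):
    """Rigidly align src (N,2) onto dst (M,2); returns (R, t)."""
    R, t = np.eye(2), np.zeros(2)
    for _ in range(iters):
        cur = src @ R.T + t
        # brute-force nearest-neighbour correspondences (fine for a sketch)
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        # closed-form rigid fit between matched point sets (Kabsch)
        mu_c, mu_m = cur.mean(0), matched.mean(0)
        H = (cur - mu_c).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:        # guard against a reflection
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        R, t = dR @ R, dR @ t + (mu_m - dR @ mu_c)
    return R, t

# Reference scan: an L-shaped corner made of two 2 m walls.
s = np.linspace(0.0, 2.0, 40)
ref = np.vstack([np.c_[s, np.zeros_like(s)], np.c_[np.zeros_like(s), s]])

# New scan: the same corner observed with a 5 deg, (0.10, -0.05) m offset.
th = np.deg2rad(5.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
scan = ref @ R_true.T + np.array([0.10, -0.05])

R_est, t_est = icp_2d(scan, ref)
resid = np.abs(scan @ R_est.T + t_est - ref).max()
print(f"max alignment residual: {resid:.4f} m")   # ~0: the offset is recovered
```

A long featureless corridor, by contrast, would leave one translation direction unconstrained, which is why mazes with corners and junctions are friendlier to scan matching than open, empty spaces.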