Follow-Me AGV based on SLAM and Dynamic Navigation in Large-Scale Environments

  • Published: 2 Mar 2020
  • Autonomous Ground Vehicles for assistance in logistics centers
    The Follow-Me AGV project uses an intelligent mobile robot to support warehouse workers in picking goods and products. To accomplish all tasks necessary in this context, the robot can build a map of large-scale environments via SLAM and navigate to any place within this map. Furthermore, the robot can follow the worker during order picking using a classical 2D laser scanner. During navigation, the robot automatically avoids all dynamic and static obstacles in the environment. The video shows the different strengths of our navigation pipeline during an on-site test at the Hong Kong Science Park. Thank you to RV Technology for the great cooperation and for letting us share the videos.
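
For illustration only, here is a minimal sketch of how leg candidates could be extracted from a single 2D laser scan so the robot has a person position to follow. The function name, thresholds, and clustering rule are assumptions for the example, not the detector used in the Follow-Me AGV project.

    import math

    # Minimal leg-candidate extraction from a 2D laser scan (illustrative only).
    def detect_leg_candidates(ranges, angle_min, angle_increment,
                              jump_threshold=0.10, min_width=0.05, max_width=0.25):
        """Return (x, y) centroids of scan clusters whose width matches a leg."""
        # Convert valid range readings to Cartesian points in the scanner frame.
        points = []
        for i, r in enumerate(ranges):
            if math.isfinite(r) and r > 0.0:
                a = angle_min + i * angle_increment
                points.append((r * math.cos(a), r * math.sin(a)))
        if not points:
            return []

        # Split the scan into clusters wherever consecutive points jump apart.
        clusters, current = [], [points[0]]
        for prev, cur in zip(points, points[1:]):
            if math.dist(prev, cur) > jump_threshold:
                clusters.append(current)
                current = []
            current.append(cur)
        clusters.append(current)

        # Keep clusters whose extent is roughly leg-sized and return their centroids.
        candidates = []
        for c in clusters:
            if min_width <= math.dist(c[0], c[-1]) <= max_width:
                candidates.append((sum(x for x, _ in c) / len(c),
                                   sum(y for _, y in c) / len(c)))
        return candidates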

Comments • 6

  • @julyarifianto · 2 years ago

    Awesome !!

  • @alesss_mc · 2 years ago +1

    Hello, I really like how your robot works. I am interested in the follow-me algorithm: once you have the position of the person tracked by leg detection, how do you implement speed control and obstacle avoidance to follow the person?
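
One common way to turn a tracked person position into motion commands is a proportional controller on distance and bearing plus a reactive safety layer driven by the raw scan. The sketch below uses made-up gains, limits, and thresholds and is not the FZI navigation pipeline; in a full system the command would usually go through a local planner that respects the robot footprint and dynamics.

    import math

    # Follow-me velocity controller sketch: person position (x, y) is given in the
    # robot frame (e.g. from leg detection); ranges is the latest laser scan.
    # All gains and thresholds below are illustrative assumptions.
    def follow_cmd(person_x, person_y, ranges,
                   standoff=1.0, k_lin=0.6, k_ang=1.5,
                   v_max=1.0, w_max=1.2, stop_dist=0.4):
        dist = math.hypot(person_x, person_y)
        bearing = math.atan2(person_y, person_x)

        # Proportional control on the distance error and the bearing to the person.
        v = k_lin * (dist - standoff)
        w = k_ang * bearing

        # Reactive safety layer: scale speed down as the nearest obstacle gets close.
        nearest = min((r for r in ranges if math.isfinite(r)), default=float("inf"))
        if nearest < stop_dist:
            v = 0.0
        else:
            v *= min(1.0, (nearest - stop_dist) / stop_dist)

        v = max(0.0, min(v, v_max))          # never drive backwards in this sketch
        w = max(-w_max, min(w, w_max))
        return v, w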

  • @GustavoRezendeSilva · 3 years ago

    Nice.
    What is the maximum environment size that the robot is able to map? How much memory does the robot computer have?
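
This question was not answered in the thread, but a rough back-of-the-envelope estimate shows that a 2D occupancy grid itself is rarely the memory bottleneck; the numbers below are invented for illustration, not project data.

    # Memory footprint of a 2D occupancy grid map (example values only).
    width_m, height_m, resolution_m = 500.0, 500.0, 0.05   # 500 m x 500 m at 5 cm
    cells = int(width_m / resolution_m) * int(height_m / resolution_m)
    bytes_per_cell = 1                                      # e.g. an int8 occupancy value
    print(f"{cells:,} cells -> {cells * bytes_per_cell / 1e6:.0f} MB")
    # 100,000,000 cells -> 100 MB, so even a very large site fits in ordinary RAM;
    # in graph-based SLAM the practical limit is typically the pose-graph
    # optimisation and loop closures rather than the grid itself.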

  • @understandwithakashs1367 · 3 years ago

    AWESOME.
    Is the position accuracy of your system in the millimeter range?
    Could adding encoders get us there?

    • @FZIchannel · 2 years ago +1

      The current position accuracy is about 50 mm. This is due to the accuracy of the laser scanners themselves (30 mm) in combination with the error correction of the SLAM algorithm. Additional odometry sensors would improve the accuracy, but only to a certain degree.
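
As a toy illustration of the "only to a certain degree" remark: fusing two independent position estimates by inverse-variance weighting reduces the error, but not arbitrarily far, and real odometry drift is correlated over time, so the practical gain is smaller still. The numbers below are examples, not measurements from this system.

    # Inverse-variance fusion of two independent scalar position estimates,
    # e.g. a scan-matching pose and wheel odometry (example numbers only).
    def fuse(est_a, var_a, est_b, var_b):
        """Return the fused estimate and its variance for two independent estimates."""
        w_a = var_b / (var_a + var_b)
        fused = w_a * est_a + (1.0 - w_a) * est_b
        return fused, (var_a * var_b) / (var_a + var_b)

    # ~50 mm std dev from laser/SLAM fused with ~20 mm std dev from odometry:
    _, var = fuse(0.0, 0.05 ** 2, 0.01, 0.02 ** 2)
    print(f"fused std dev: {var ** 0.5 * 1000:.1f} mm")   # about 18.6 mm
    # Better than either source alone, but millimetre accuracy would require far
    # better sensors: fusing uncorrelated noise only shrinks the error so much.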

  • @chentean206 · 3 years ago +1

    Hello, could you share some code and your hardware interfaces? Thank you.