DQN (Deep Q-Network) theory and implementation. Using DQN with ROS and Gazebo.

  • Published: 5 Oct 2024
  • In this tutorial I explain the basics of the DQN (Deep Q-Network) and how to use it with ROS. We will do a simple cart-pole (inverted pendulum) simulation in Gazebo to see how DQN works.
    ROS: Noetic
    The project files are here:
    drive.google.c...
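
    For reference before watching, below is a minimal, generic DQN sketch in PyTorch. It is not the project code from the Drive link; the state/action sizes and hyperparameters are placeholders for a cart-pole-like task.

    # Minimal, generic DQN sketch (PyTorch). Not the project code; sizes and
    # hyperparameters are placeholders for a cart-pole-like task.
    import random
    from collections import deque

    import torch
    import torch.nn as nn
    import torch.optim as optim

    class QNetwork(nn.Module):
        def __init__(self, state_dim=4, n_actions=2):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(state_dim, 64), nn.ReLU(),
                nn.Linear(64, 64), nn.ReLU(),
                nn.Linear(64, n_actions),
            )

        def forward(self, x):
            return self.net(x)

    class DQNAgent:
        def __init__(self, state_dim=4, n_actions=2, gamma=0.99, lr=1e-3):
            self.q = QNetwork(state_dim, n_actions)
            self.target_q = QNetwork(state_dim, n_actions)
            self.target_q.load_state_dict(self.q.state_dict())
            self.optimizer = optim.Adam(self.q.parameters(), lr=lr)
            self.replay = deque(maxlen=10000)  # experience replay buffer
            self.gamma = gamma
            self.n_actions = n_actions

        def act(self, state, epsilon):
            # Epsilon-greedy action selection.
            if random.random() < epsilon:
                return random.randrange(self.n_actions)
            with torch.no_grad():
                return int(self.q(torch.as_tensor(state, dtype=torch.float32)).argmax())

        def store(self, s, a, r, s2, done):
            self.replay.append((s, a, r, s2, done))

        def train_step(self, batch_size=64):
            if len(self.replay) < batch_size:
                return
            batch = random.sample(self.replay, batch_size)
            s, a, r, s2, done = [torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch)]
            a = a.long()
            # Q-learning target: r + gamma * max_a' Q_target(s', a') for non-terminal s'.
            with torch.no_grad():
                target = r + self.gamma * self.target_q(s2).max(dim=1).values * (1.0 - done)
            q_sa = self.q(s).gather(1, a.unsqueeze(1)).squeeze(1)
            loss = nn.functional.mse_loss(q_sa, target)
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()

        def sync_target(self):
            # Periodically copy the online weights into the target network.
            self.target_q.load_state_dict(self.q.state_dict())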

Comments • 31

  • @cmtro
    @cmtro 2 years ago +1

    This is science! Thx...

    • @robotmania8896
      @robotmania8896  2 years ago +1

      Hi cmtro!
      I also think machine learning is a very promising field!

  • @pullmed
    @pullmed 2 years ago +1

    Great as always

  • @matiassandacz9145
    @matiassandacz9145 1 year ago

    Hello! Great job on your work! I'm currently working on a similar topic for my master's thesis, and I have a few questions for you:
    1) Have you experimented with utilizing the IMU plugin to extract the angle of the pole?
    2) Have you explored the use of a plugin to publish to the /cmd_vel topic in order to control the movement of the cart?
    3) In your opinion, do you believe that resetting the Gazebo environment adequately resets the physical properties of the model, such as forces, gravity, and link and joint states, back to their initial states?
    Thank you in advance!

    • @robotmania8896
      @robotmania8896  1 year ago

      Hi Matías Sandacz!
      Thanks for watching my video!
      1) No, I have not. It is possible to extract the angle of the pole using an IMU, but it adds complexity to the simulation. Since I wanted to focus on the learning method itself, I obtained the angle from the link pose (see the sketch below).
      2) No, I just use a node to publish to the /cmd_vel topic.
      3) Yes, I believe that by resetting the Gazebo world, the physical properties are also initialized. But you have to be careful with initialization: when several processes are running, depending on the order in which they are initialized, another process may still be outputting values from the previous simulation when a new one begins, which may cause a problem.
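
      As an illustration of 1) and 2), here is a minimal ROS Noetic sketch (not the exact project code); the link name "cart_pole::pole_link" and the control logic are assumptions:

      #!/usr/bin/env python3
      # Rough sketch: read the pole angle from /gazebo/link_states and publish a
      # cart command on /cmd_vel. The link name "cart_pole::pole_link" and the
      # bang-bang control are placeholders, not the project's actual logic.
      import rospy
      from gazebo_msgs.msg import LinkStates
      from geometry_msgs.msg import Twist
      from tf.transformations import euler_from_quaternion

      class CartPoleInterface:
          def __init__(self):
              self.pole_angle = 0.0
              rospy.Subscriber("/gazebo/link_states", LinkStates, self.link_states_cb)
              self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)

          def link_states_cb(self, msg):
              # Guard against early callbacks, before Gazebo lists the link.
              if "cart_pole::pole_link" not in msg.name:
                  return
              q = msg.pose[msg.name.index("cart_pole::pole_link")].orientation
              # Use the pitch of the pole link as the pole angle.
              _, pitch, _ = euler_from_quaternion([q.x, q.y, q.z, q.w])
              self.pole_angle = pitch

          def push_cart(self, velocity):
              cmd = Twist()
              cmd.linear.x = velocity
              self.cmd_pub.publish(cmd)

      if __name__ == "__main__":
          rospy.init_node("cart_pole_interface")
          iface = CartPoleInterface()
          rate = rospy.Rate(20)
          while not rospy.is_shutdown():
              # Trivial bang-bang command just to show the plumbing.
              iface.push_cart(0.5 if iface.pole_angle < 0.0 else -0.5)
              rate.sleep()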

  • @mikaeltrana8426
    @mikaeltrana8426 12 days ago

    Really nice! I am using Jazzy and Harmonic and tried to rewrite your code for those, but I have a hard time resetting the model in Gazebo during training. When I use the available reset service, the model disappears. Have you managed to get it to work in the newer versions?

    • @robotmania8896
      @robotmania8896  11 days ago

      Hi Mikael Trana!
      Thanks for watching my video!
      Yes. I think you should use the "/world/<your world name>/set_pose" service to reset the model pose.
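
      For example, a rough sketch of calling it from Python via the gz CLI (the world name "cart_world" and model name "cart_pole" are assumptions; check "gz service -l" for the exact service name in your setup):

      # Rough sketch: reset a model pose in Gazebo (gz sim / Harmonic) by calling
      # the world-level set_pose service through the "gz service" CLI.
      # "cart_world" and "cart_pole" are assumed names, not from the project.
      import subprocess

      def reset_model_pose(world="cart_world", model="cart_pole"):
          request = f'name: "{model}", position: {{x: 0.0, y: 0.0, z: 0.0}}'
          subprocess.run(
              [
                  "gz", "service",
                  "-s", f"/world/{world}/set_pose",
                  "--reqtype", "gz.msgs.Pose",
                  "--reptype", "gz.msgs.Boolean",
                  "--timeout", "2000",
                  "--req", request,
              ],
              check=True,
          )

      if __name__ == "__main__":
          reset_model_pose()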

    • @mikaeltrana8426
      @mikaeltrana8426 11 days ago

      @@robotmania8896 Thanks for the answer! Hm, yes, I tried that, but I only managed to set the pose of the model, not the links/joints that move during training.

    • @robotmania8896
      @robotmania8896  10 days ago

      @@mikaeltrana8426 I understand. I did this with a car-like robot that is not that sensitive to link/joint positions. So, I guess in my case the link poses were not initialized either. I think it should be possible, but I don’t know right now how to do it.

    • @mikaeltrana8426
      @mikaeltrana8426 10 days ago

      @@robotmania8896 Ok, thanks for your answer! If you run into a solution I would be happy if you posted it. Thanks again for the great videos!

  • @j_sudi
    @j_sudi 2 years ago

    Thank you for your remarkable work and your pedagogy. How can I cite your work?

    • @robotmania8896
      @robotmania8896  2 years ago +2

      Hi Joyce sudi!
      Thanks for watching my video!
      I would be grateful if you would cite my channel name, the video title, and the URL of this video, like this:
      (robot mania : DQN (Deep Q-Network) theory and implementation. Using DQN with ROS and Gazebo, ruclips.net/video/9GLVB6Trn10/видео.html)

  • @TrueVasember
    @TrueVasember 11 months ago

    I'm new to this, and where are you using ROS here exactly?
    And what projects can I do with only Gazebo and very few physical parts?
    I can code, btw.

    • @robotmania8896
      @robotmania8896  11 months ago

      Hi WhySoSerious!
      Thanks for watching my video!
      I am using ROS to control the pole. You can do a differential drive robot simulation; it only requires two wheels and a body. But you may also take a project from my channel and use it as a starting point.

  • @aravindprakash4753
    @aravindprakash4753 6 months ago

    Can you specify the computer specifications that were used for training?

    • @robotmania8896
      @robotmania8896  6 months ago

      Hi ARAVIND PRAKASH!
      Thanks for watching my video!
      I used Ubuntu 20.04 with an RTX 2060.

  • @collinsokara1372
    @collinsokara1372 6 months ago

    I have downloaded the project file and I have also installed Gazebo. How do I run your simulation in my Gazebo? I am inserting the path, but it's not coming up.

    • @robotmania8896
      @robotmania8896  5 months ago

      Hi Collins Okara!
      Thanks for watching my video!
      If you are launching Gazebo for the first time, it may take a while before it actually starts. Do you have any errors in the terminal?

  • @brendanbrowne2103
    @brendanbrowne2103 1 year ago

    Hey, I am trying to follow your code but with a different Gazebo environment. Can I reach out to you for help?

    • @robotmania8896
      @robotmania8896  1 year ago

      Hi Brendan Browne!
      Thanks for watching my video!
      Yes, here is my e-mail: robotmania8867@yahoo.com

  • @christianvaldes1117
    @christianvaldes1117 1 year ago

    Hello, when trying to load your model I get the error "Could not load controller 'cart_controller' because controller type 'effort_controllers/JointPositionController' does not exist".
    I was wondering if you had any clue as to how to go about it.

    • @robotmania8896
      @robotmania8896  1 year ago

      Hi Christian Valdes!
      Thanks for watching my video!
      I think you don’t have the required packages. Please try this:
      $ sudo apt install ros-noetic-ros-control
      $ sudo apt install ros-noetic-ros-controllers

    • @christianvaldes1117
      @christianvaldes1117 1 year ago

      @@robotmania8896 Hi, you were right.
      My issue now is loading the /gazebo/link_states.
      I receive an error when trying to get the state of the cart through cart_pole::cart_link on lines 192 and 196.

    • @christianvaldes1117
      @christianvaldes1117 1 year ago

      this is the specific error
      ValueError: "/gazebo/get_link_properties 'cart_pole::cart_link'" is not in list

    • @christianvaldes1117
      @christianvaldes1117 1 year ago

      I noticed you get the same error but on a different line, line 272 to be precise. Did you alter your file to run it?

    • @robotmania8896
      @robotmania8896  1 year ago +1

      @@christianvaldes1117 This error occurs at the very beginning of the simulation, when the learning node starts but Gazebo is not yet publishing link states. I think it has a negligible effect on the learning, so I left it as it is.
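
      If you want to avoid the message, a rough sketch of one way to do it (not what I do in the video) is to wait until the link actually appears in /gazebo/link_states before querying it:

      # Rough sketch: block until Gazebo publishes the link before reading its state.
      # Assumes rospy.init_node(...) has already been called; the link name
      # "cart_pole::cart_link" follows the error message above.
      import rospy
      from gazebo_msgs.msg import LinkStates

      def wait_for_link(link_name="cart_pole::cart_link", timeout=10.0):
          deadline = rospy.Time.now() + rospy.Duration(timeout)
          while rospy.Time.now() < deadline:
              msg = rospy.wait_for_message("/gazebo/link_states", LinkStates, timeout=timeout)
              if link_name in msg.name:
                  return msg.pose[msg.name.index(link_name)]
          raise rospy.ROSException(link_name + " was not published within the timeout")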

  • @CathyNorris-y7t
    @CathyNorris-y7t 29 days ago

    Lilly Track

    • @robotmania8896
      @robotmania8896  27 days ago

      Hi Cathy Norris!
      Thank you for watching my video!

  • @StoneGeoffrey-n5n
    @StoneGeoffrey-n5n 28 days ago

    Viola Villages

    • @robotmania8896
      @robotmania8896  27 days ago

      Hi Stone Geoffrey!
      Thank you for watching my video!