Creating Simulink reinforcement learning environment and training agent walkthrough

  • Published: 21 Oct 2024

Comments • 11

  • @codeandrelax6792
    @codeandrelax6792 2 years ago +1

    Well done Ivan. Thank you for the video

  • @suneetd9388
    @suneetd9388 13 days ago

    Can you try creating a model and applying RL without using the predefined examples? That would be great; any simple example would be helpful.
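
    For readers wondering what a from-scratch setup involves: the video uses MATLAB/Simulink, but the core pieces of a custom environment (reset, step, reward) can be sketched in plain Python. The environment below is hypothetical and not from the video — a one-dimensional "move the position toward zero" task, exercised with a random policy:

    ```python
    import random

    class MoveToTargetEnv:
        """Hypothetical 1-D environment: the agent nudges a position toward 0."""
        def __init__(self):
            self.pos = None

        def reset(self):
            # Start at a random position in [-1, 1] and return the observation.
            self.pos = random.uniform(-1.0, 1.0)
            return self.pos

        def step(self, action):
            # action is -1, 0, or +1; reward is negative distance to the target 0.
            self.pos += 0.1 * action
            reward = -abs(self.pos)
            done = abs(self.pos) < 0.05
            return self.pos, reward, done

    # Random-policy rollout, just to exercise the environment interface.
    env = MoveToTargetEnv()
    obs = env.reset()
    total_reward = 0.0
    for _ in range(100):
        action = random.choice([-1, 0, 1])
        obs, reward, done = env.step(action)
        total_reward += reward
        if done:
            break
    print(total_reward)
    ```

    Once an environment exposes these three operations, any standard RL algorithm (DDPG, PPO, etc.) can be trained against it; the Simulink workflow in the video wraps the same structure in blocks.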

  • @rominazarrabiekbatani8776
    @rominazarrabiekbatani8776 1 year ago

    Hi Ivan, thanks for the video. Is the local-function .m file provided in the example package?

  • @sangeethaep942
    @sangeethaep942 7 months ago

    Hi Sir, may I know why there is no change in the reward under any of the conditions?

  • @loyjiehao
    @loyjiehao 5 months ago

    Can you make a video on how to replace DDPG with PPO? I am stuck on that, because it seems like PPO is a "PG" (policy gradient) algorithm instead of an AC (actor-critic) one.
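
    A note on the question above: PPO is in fact an actor-critic method — it trains a critic (value function) whose advantage estimates feed the policy update. Here is a minimal NumPy sketch of PPO's clipped surrogate objective (the numbers are hypothetical, and this is generic Python, not the MathWorks API):

    ```python
    import numpy as np

    def ppo_clipped_objective(logp_new, logp_old, advantages, eps=0.2):
        """Clipped surrogate objective from PPO (to be maximized).

        The advantages would come from a learned critic (e.g. via GAE);
        only the policy objective itself is shown here.
        """
        ratio = np.exp(logp_new - logp_old)              # pi_new(a|s) / pi_old(a|s)
        unclipped = ratio * advantages
        clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantages
        # Taking the elementwise minimum makes the update conservative:
        # the ratio cannot push the objective up beyond the clip range.
        return np.mean(np.minimum(unclipped, clipped))

    # Hypothetical batch: both actions got more likely; advantages differ in sign.
    logp_old = np.log(np.array([0.2, 0.5]))
    logp_new = np.log(np.array([0.3, 0.6]))
    adv = np.array([1.0, -1.0])
    print(ppo_clipped_objective(logp_new, logp_old, adv))
    ```

    The key structural difference from DDPG is that PPO's actor is stochastic and its critic estimates state values rather than Q-values, so the two agents are not drop-in replacements for each other.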

  • @alibaniasad8535
    @alibaniasad8535 1 year ago

    Hi, thank you for your video. Can you please tell me if the agent can learn during testing, or how to train the agent solely in simulation after it has been trained? Thanks.

    • @IvanGarcia-cx5jm
      @IvanGarcia-cx5jm  1 year ago +1

      If what you mean is that the agent keeps learning when used in production, then I have not seen that yet, although it would be nice to have. Keeping the agent learning might carry a performance and complexity penalty. Maybe a custom agent with custom training could help. This example might be related:
      www.mathworks.com/help/releases/R2022b/reinforcement-learning/ug/train-reinforcement-learning-policy-using-custom-training.html

    • @IvanGarcia-cx5jm
      @IvanGarcia-cx5jm  1 year ago

      Or see the "Develop Custom Agents and Training Algorithms" section on this page as well:
      www.mathworks.com/help/releases/R2022b/reinforcement-learning/training-and-validation.html?s_tid=CRUX_lftnav
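
      The custom-training idea in these replies can be sketched generically. Below is a hypothetical Python illustration (not the MathWorks API) of an agent that keeps updating from experience while deployed — the kind of online learning the question asks about, with the performance cost made explicit by the extra update on every interaction:

      ```python
      import random

      class OnlineBanditAgent:
          """Hypothetical epsilon-greedy agent that never stops learning.

          act() can be called in 'production', and learn() keeps refining the
          value estimates — the custom training loop simply never ends after
          deployment.
          """
          def __init__(self, n_actions, epsilon=0.1, lr=0.1):
              self.q = [0.0] * n_actions
              self.epsilon = epsilon
              self.lr = lr

          def act(self):
              # Mostly exploit the current estimates, occasionally explore.
              if random.random() < self.epsilon:
                  return random.randrange(len(self.q))
              return max(range(len(self.q)), key=lambda a: self.q[a])

          def learn(self, action, reward):
              # Incremental update: this extra computation on every step is
              # the runtime cost of learning in production.
              self.q[action] += self.lr * (reward - self.q[action])

      # Simulated deployment: action 1 pays off more, and the agent adapts online.
      random.seed(0)
      agent = OnlineBanditAgent(n_actions=2)
      for _ in range(500):
          a = agent.act()
          reward = 1.0 if a == 1 else 0.2
          agent.learn(a, reward)
      print(agent.q)
      ```

      A real Simulink setup would replace the reward line with feedback from the plant, but the loop structure — act, observe, update, repeat indefinitely — is the same idea the linked custom-training example formalizes.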

  • @apoorvpandey3D
    @apoorvpandey3D 1 year ago

    Hi, I am trying to hover a drone using the same logic but am running into some errors. It would be really helpful if you could provide some help. Can I get your email for contact?

    • @IvanGarcia-cx5jm
      @IvanGarcia-cx5jm  1 year ago

      I don't give out my contact info, but I usually answer questions in this chat. If you still have the error, you can paste it here with a brief description and I can help that way.