Autonomous U-turns with Direct Perception

  • Published: 29 Sep 2024
  • An extension of DeepDriving (deepdriving.cs.princeton.edu) for two-way traffic with U-turns.
    The host car can have one of two goals at any moment: either continue ahead in the current direction of travel or make the the next available U-turn. The host car's "goal" is analogous to a navigation system providing directions (e.g., make the next right, take the next exit). To gather training data for this setting, TORCS was modified to allow for two-way traffic with grass medians. At occasional breaks in the medians are turning lanes where U-turns can be completed. At test time, the "s" or "u" keys may be pressed to trigger a goal of "Straight" or "U-turn" respectively.
    In this video, the host car autonomously completes U-turns under different traffic conditions. The only input to the CNN is the bottom left image. In the visualization window to the right, green rectangles in the middle lane represent the ground truth locations of the grass medians. The probability bar displays p(U), the probability output by the CNN of being next to a viable turning lane. The visualization window also displays the current "Goal" of the system. When this goal is to make the next U-turn, the "Intent" of the controller ("Searching", "Slowing", "Waiting", or "Turning") is shown.
    In the second clip, there is dense oncoming traffic in the opposing lane. Therefore, the host car must wait until the lane clears before turning.
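The goal-and-intent logic described above can be sketched as a simple state machine. This is a hypothetical reconstruction for illustration only: the function name, thresholds, and input signals are assumptions, not taken from the actual controller code.

```python
# Hypothetical sketch of the U-turn controller's intent progression
# ("Searching" -> "Slowing" -> "Waiting" -> "Turning"), active while the
# goal is "U-turn". Thresholds and signal names are assumed, not from
# the original implementation.

def next_intent(intent, p_u, speed, oncoming_clear,
                p_thresh=0.5, slow_speed=5.0):
    """Advance the controller intent by one step.

    p_u            -- p(U), the CNN's probability of being next to a
                      viable turning lane
    speed          -- host car speed (arbitrary units)
    oncoming_clear -- True when the opposing lane is free of traffic
    """
    if intent == "Searching":
        # Cruise until the CNN signals a likely median break.
        return "Slowing" if p_u > p_thresh else "Searching"
    if intent == "Slowing":
        # Decelerate toward the turning lane.
        return "Waiting" if speed <= slow_speed else "Slowing"
    if intent == "Waiting":
        # Hold until the opposing lane clears (as in the second clip,
        # where dense oncoming traffic forces the car to wait).
        return "Turning" if oncoming_clear else "Waiting"
    return "Turning"  # Execute the U-turn maneuver.
```

Under this reading, dense oncoming traffic simply keeps the controller in "Waiting" until `oncoming_clear` becomes true, which matches the behavior shown in the second clip.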

Comments • 8

  • @ekbastu 7 years ago +1

    Simply awesome :)
    How much computational power and time are needed to train?

    • @ariseffai 7 years ago +1

      Just a couple of hours fine-tuning the original DeepDriving model with a Tesla K40c :)

  • @faridalijani1578 4 years ago

    Do you have the pretrained prototxt and caffemodel CNN files?

  • @ashokElluswamy 7 years ago

    During the second U-turn (around 1:10), the estimated host car position seems to be quite far off from the ground truth. Given that, I was surprised that the host car still successfully performed the U-turn maneuver. Was there any other trick that allowed it to execute the turn in spite of the estimation errors?

  • @eatsleepplayrepeat 7 years ago

    What if you want to make that U-turn but there is traffic behind you?

  • @anet9535 7 years ago

    Hi Ari, I really like the DeepDriving project from your team and want to study it further. Currently I'm looking at the original code available on the project homepage, but it seems to be outdated.
    For example, the dynamic lane detection, which is visible in the video with the dashcam, is not implemented, and the U-turn functionality is also missing. Is there any chance to get the latest code?

    • @ariseffai 7 years ago +1

      Thanks for your comment. The code for the main results of the paper is available. The U-turn code is not online, but I will check into that.

    • @anet9535 7 years ago

      Thanks for the clarification. Is your team actively working on this topic? I'm looking forward to reading and discussing new material about it, and to seeing how the project has developed since the paper's release.