Visualizing the NEAT Algorithm - 1. Evolution

  • Published: 1 Jan 2025

Comments • 64

  • @ixenroh
    @ixenroh 3 years ago +21

    Thank you, YouTube algorithm. As an AI student and fan, this is amazing, and the orchestral music is a cherry on top. Instant sub

  • @pascalbercker7487
    @pascalbercker7487 3 years ago +56

    I don't know if it was intentional or not - but the word "fruits" mutated to "furits" in your video - and that made them feel almost more "alive" in my mind and I started rooting for the little "furits" hoping that the evil snake would not find them! The music was perfect for this unfolding evolutionary drama!

    • @earnings_cc
      @earnings_cc 2 years ago +3

      You should look up "furries".

    • @inanitas
      @inanitas 2 years ago +3

      @@earnings_cc Those are not living beings lol

  • @OneShot_cest_mieux
    @OneShot_cest_mieux 2 years ago +2

    Amazing! There is a lack of NEAT content on YouTube, and you just fixed it (very good visuals, by the way).

  • @alexandruionut4209
    @alexandruionut4209 2 years ago +2

    This was so well put together. Great music and great representation. Congratz. :)

  • @hxhelm
    @hxhelm 3 years ago +12

    Very impressive! I'm trying to get into visualizing my nets and was wondering if there is a github repository for this project that one could look at?

  • @maksimon519
    @maksimon519 2 years ago +4

    The visualization is cool! How did you implement it?

  • @lorenzotinfena
    @lorenzotinfena 3 years ago +2

    It also has some sensual content, great video!

  • @2000franci
    @2000franci 3 years ago +1

    What is the input to the nets? Do you feed in the whole grid? Also, do the nets make a decision on each frame of the game?

  • @stefanwinter4865
    @stefanwinter4865 3 years ago +3

    Amazing visualization!

  • @mazen6773
    @mazen6773 8 months ago

    How can I display the changes to the neural network in each generation like this?

  • @JohnnyCodes
    @JohnnyCodes 3 years ago +5

    Very cool! I always wanted to learn how to make a NEAT network. Any way you can share the code for this?

  • @fredericobsantos
    @fredericobsantos 3 years ago +5

    I'm trying to do something with neuroevolution as well and I have been searching for a way to visualize the changes in the network itself. How did you do these neural network graph visualizations + animations? They are great!

    • @hemanthkotagiri8865
      @hemanthkotagiri8865 2 years ago +3

      Hey Santos. The tool he is using here is Manim, an open source Python tool created by 3b1b. There's a community version too, which is extensively documented. You can get started there!
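
      For anyone who wants to try it, a minimal Manim Community Edition sketch of drawing a small network of dots and edges might look like this (the layout, colors, and class name are arbitrary examples, not the video's actual code):

      # Minimal Manim Community Edition sketch: draw a tiny 3-2 network graph.
      # This is an illustration only, not the code used in the video.
      import numpy as np
      from manim import Scene, Dot, Line, Create, BLUE, GREEN

      class TinyNetwork(Scene):
          def construct(self):
              # place three input nodes on the left and two output nodes on the right
              inputs = [Dot(np.array([-3, y, 0]), color=BLUE) for y in (-1, 0, 1)]
              outputs = [Dot(np.array([3, y, 0]), color=GREEN) for y in (-0.5, 0.5)]
              # fully connect inputs to outputs
              edges = [Line(i.get_center(), o.get_center(), stroke_width=2)
                       for i in inputs for o in outputs]
              self.play(*[Create(d) for d in inputs + outputs])
              self.play(*[Create(e) for e in edges])

      Rendering is done from the command line, e.g. manim -pql tiny_network.py TinyNetwork; animating a mutation is then a matter of playing Create or FadeOut on the affected edges.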

  • @ducnguyenminh6736
    @ducnguyenminh6736 3 years ago +3

    Love the music tho

  • @SocialPrime
    @SocialPrime 1 year ago

    Absolutely beautiful and terrifying at the same moment.

  • @TheMazyProduction
    @TheMazyProduction 3 years ago +1

    Underrated channel. Just subbed

  • @zix2421
    @zix2421 4 months ago

    Neat is so cool, I love it very much.

  • @manojkothwal3586
    @manojkothwal3586 2 years ago +1

    This is so Myelinating........... 👌👌👌👌👌👌👌👌

  • @marcel2711
    @marcel2711 1 year ago

    what's your fitness function?

  • @saaddahmani1870
    @saaddahmani1870 3 years ago +1

    Really great presentation.... thanks.

  • @Friendsofirony
    @Friendsofirony 2 years ago

    Can you teach us how you did the animation?

  • @TrifourceGuardian
    @TrifourceGuardian 3 years ago +2

    This is fascinating. How did you make the cool visuals btw?

    • @manojkothwal3586
      @manojkothwal3586 2 years ago +1

      Manim (Mathematical Animation Engine), a library by Grant Sanderson, aka 3blue1brown.

  • @carloselfrancos7205
    @carloselfrancos7205 3 years ago +1

    SO EPIC !!!!!!!! Really cool video

  • @Graverman
    @Graverman 3 years ago +2

    Great video!!! Btw, I'm curious: how long does it take to simulate 100 generations?

    • @semperzero
      @semperzero  3 years ago +4

      It depends on how large the population is and on how many times you make each gene play. With a pop of 150 and each gene playing 10 times, it was under 5 minutes for the first 100 generations, but after 400-500 gens it started taking 30-60 seconds per generation. The key is to cache the game's steps and just kill it if it gets caught in a loop.
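
      As a rough illustration of that loop-killing idea (the game object, its fields, and step() are hypothetical names, not the project's actual API):

      # Rough sketch: hash the full game state every step and end the run
      # if an identical state repeats, since a deterministic policy is then
      # stuck in an infinite loop. `game`, `snake`, `food`, `direction` and
      # `step()` are hypothetical names, not the actual project's code.
      def play_with_loop_kill(game):
          seen = set()
          while not game.over:
              state = (tuple(game.snake), game.food, game.direction)
              if state in seen:      # same state seen before -> loop, kill the run
                  break
              seen.add(state)
              game.step()            # advance one frame using the network's move
          return game.score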

  • @dianaiuliana3734
    @dianaiuliana3734 3 years ago +4

    This is amazing!!!!

  • @eugeniopolanski2390
    @eugeniopolanski2390 2 years ago

    Amazing. How did you do the neural network video? That's not only very cool, but also useful.

  • @ramanShariati
    @ramanShariati 1 year ago

    GREAT work. Beautiful.

  • @ferramatis
    @ferramatis 3 years ago +1

    Does it manage to fully win the game at some point?

    • @semperzero
      @semperzero  3 years ago +2

      At the moment, no. But if I were to keep training it for, say, 10,000 more generations, then considering the current direction of the evolution, I think it would learn to extend the zigzag at the bottom right from bottom to top and cover the entire map with that movement pattern, completing the entire game that way.

  • @Gister
    @Gister 2 years ago

    How were you able to visualize the genetic structure?

  • @dragolov
    @dragolov 3 years ago +2

    Deep respect!

  • @VikramRaja-u8v
    @VikramRaja-u8v 1 year ago

    Amazing. I have a question: does neat-python allow adding hidden layers via the config file, or does the net evolve hidden layers by itself?

  • @peteroliver7975
    @peteroliver7975 3 years ago +1

    What was the exact selection criterion? It seems like it was finding the dot and avoiding obstacles, but other than the boundary I'm not sure what obstacles you refer to. I would like to see the criterion be simply 'the least number of steps to reach the dot', as that doesn't seem to be the measure you chose. In fact, as the generations increase, it seemed to become more efficient initially, then less and less efficient. Your thoughts?

    • @dietrevich
      @dietrevich 3 years ago

      The obstacle he's referring to is the snake itself: coming into contact with its own body.

  • @Djellowman
    @Djellowman 1 year ago

    How'd you make the animations?

  • @F00000x
    @F00000x 5 months ago

    Great video!

  • @ahmedyasser8416
    @ahmedyasser8416 3 years ago +1

    How do you reward the genes?

    • @semperzero
      @semperzero  3 years ago +1

      I make each gene play the game 50 times and compute its average result.
      pastebin.com/rudkrxcL
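
      In neat-python terms, that averaging might be sketched roughly like this (play_game is a hypothetical stand-in for the snake simulation; the actual code is in the pastebin link above):

      # Sketch of averaging fitness over many plays with neat-python.
      # play_game(net) is a hypothetical function returning one game's score.
      import neat

      def eval_genomes(genomes, config, plays=50):
          for genome_id, genome in genomes:
              net = neat.nn.FeedForwardNetwork.create(genome, config)
              scores = [play_game(net) for _ in range(plays)]
              genome.fitness = sum(scores) / plays   # average dampens lucky/unlucky runs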

    • @ahmedyasser8416
      @ahmedyasser8416 3 years ago

      @@semperzero
      Thanks for replying ❤
      I'm asking because I'm trying to make the same project for learning purposes.
      I'm rewarding each gene for eating and for getting near the apple (see the sketch below). I take it your approach of making each one play 50 times is to avoid randomness, am I right? Please correct me.
      I think this is why it took me 200 generations to make them learn to eat. However, I'm currently stuck with the snakes moving clockwise as close as possible to the wall and eating the apple from the bottom side, which causes them to hit their tail after a certain score... any suggestions?
      I think playing 50 games wouldn't be the solution at this point.
      Can I also ask about your population size?
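
      For reference, a shaped per-step reward of the kind described above might be sketched like this (the names and weights are purely illustrative, not the commenter's actual code):

      # Hypothetical sketch of a shaped reward: big bonus for eating,
      # a small bonus/penalty for moving toward/away from the apple.
      def step_reward(prev_dist, new_dist, ate_apple):
          reward = 0.0
          if ate_apple:
              reward += 10.0                        # main objective
          reward += 0.1 * (prev_dist - new_dist)    # closer -> positive, farther -> negative
          return reward

      One option is to keep the shaping term small relative to the eating bonus, so that circling near the apple never out-scores actually eating it.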

    • @semperzero
      @semperzero  3 years ago +1

      @@ahmedyasser8416 Yes, it is to give a lesser weight to lucky and unlucky games.
      I think the best approach would be to re-learn the snake from zero and not make changes after you have already trained it for 200 generations.
      My pop size was somewhere around 150.
      Now, the biggest problem I encountered was choosing good hyperparameters (mutation rate, mutation power, etc.). To overcome this and find good hyperparameters I built 2 engines, one using hyperopt random search (hyperopt.rand.suggest; a rough sketch of that side is below) and one using NEAT itself as a genetic search engine. In short, I tried 100 random sets of hyperparameters, and for each one of them I trained the snake up to generation 100 or 200 (you can also train the snake multiple times) and took the best results.
      To make the genetic search engine using NEAT, I just made a different project in which the neural network had 1 input of value 1, no bias, no way to add or delete neurons, and so on. The output had 6 neurons, one for each hyperparameter.
      You can stop snakes that hit a plateau very early on, or that don't go past certain thresholds (for example, more than 5 fruits on average by gen 40, more than 10 by gen 100, etc.).
      To change the hyperparameters, all you need is a function like this:
      def set_config(self, confg):
          self.config.genome_config.bias_mutate_power = confg["bias_mutate_power"]
          self.config.genome_config.bias_mutate_rate = confg["bias_mutate_rate"]
          ....
      Call the function after you initialize the config and before you initialize the population:
      self.p = neat.Population(self.config)
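
      The hyperopt side of that setup could be sketched like this (fmin, hp, and rand.suggest are hyperopt's real API; the search-space bounds and train_snake are hypothetical placeholders):

      # Sketch of random hyperparameter search with hyperopt.
      # train_snake(params) is a hypothetical stand-in that trains NEAT to
      # ~generation 100 with those hyperparameters and returns average fitness.
      from hyperopt import fmin, hp, rand

      space = {
          "bias_mutate_power": hp.uniform("bias_mutate_power", 0.1, 2.0),
          "bias_mutate_rate": hp.uniform("bias_mutate_rate", 0.1, 0.9),
          # ... remaining hyperparameters, one hp.* entry each
      }

      def objective(params):
          # would call set_config(params) before building neat.Population
          return -train_snake(params)   # negate: hyperopt minimizes the objective

      best = fmin(fn=objective, space=space, algo=rand.suggest, max_evals=100)
      print(best)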

  • @MrWrklez
    @MrWrklez 3 years ago

    Extremely well done

  • @Dewdimpple
    @Dewdimpple 3 years ago +2

    This is neat

  • @DimchanskyLive
    @DimchanskyLive 1 year ago

    How is the input state encoded?

    • @DimchanskyLive
      @DimchanskyLive 1 year ago

      I found the answer here: ruclips.net/video/h9JZ0YHtKWQ/видео.html

  • @Friendsofirony
    @Friendsofirony 3 years ago

    Hey, I've got an open source project I thought you might be interested in. How can we get in touch? Can I link you?

    • @semperzero
      @semperzero  3 years ago

      Alex.semperzero@gmail.com
      Will check it out tomorrow

  • @l.halawani
    @l.halawani 2 years ago

    Hi there, I've been working on my custom evolving algorithm for a while and I'm stuck on one question; maybe you could help me, please! Or anyone in the community?
    If I've got a connection from input to output with two other neurons on it, i.e.:
    (IN)---->-----(N1)------>-----(N2)--->-----(OUT)
    And say N1 gets disabled in a mutation: should that cause N2 to also be disabled automatically, or does NEAT allow keeping disconnected deep neurons?
    I can't find the answer anywhere....
    I'd so appreciate it. I tried on Stack Exchange forums and in Facebook groups, but there doesn't seem to be anyone who knows this well enough to answer.

    • @semperzero
      @semperzero  2 years ago

      From my knowledge, a node and its edges are interdependent in the mutation process. If a node gets disabled, then all its edges get disabled as well, but it will not affect other nodes. The reason is that a new connection may be made with N2 in the future.
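
      As a toy illustration of that bookkeeping (a sketch of the idea only, not neat-python's internal representation; the class and function names are made up):

      # Toy sketch: connection genes carry an "enabled" flag, as in the NEAT
      # paper. Disabling a node only disables the edges touching it; other
      # nodes such as N2 remain and can be reconnected by later mutations.
      from dataclasses import dataclass

      @dataclass
      class ConnectionGene:
          in_node: int
          out_node: int
          weight: float
          enabled: bool = True

      def disable_node(node_id, connections):
          for c in connections:
              if node_id in (c.in_node, c.out_node):
                  c.enabled = False   # edges die with the node
          # note: no other node is touched here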

  • @matterb6049
    @matterb6049 2 years ago

    👋 to continue

  • @Mhrn.Bzrafkn
    @Mhrn.Bzrafkn 1 month ago

    Perfect 👌🏻

  • @minos99
    @minos99 3 years ago +2

    I give thanks to thee Almighty Algorithm for this content you've fated me to partake. Bless the Creator with endless likes and may you protect us from dislikes. Amen.

  • @bluewhale37
    @bluewhale37 2 years ago

    Revolution

  • @teenspirit1
    @teenspirit1 8 months ago

    NEAT is much more exciting to me than transformer nonsense.

  • @timothylincoln8200
    @timothylincoln8200 3 years ago +1

    excellent video!

  • @yokmp1
    @yokmp1 3 years ago +2

    Dramatic

  • @Karan_Thakkar
    @Karan_Thakkar 3 years ago +1

    Very cool

  • @moooooou
    @moooooou 7 months ago

    god made by human

  • @KhazaeeTech
    @KhazaeeTech 2 years ago

    Where is the source code? Dislike

  • @StreakyFly
    @StreakyFly 1 year ago

    Amazing visualization!