- Videos: 36
- 302,137 views
Finn Eggers
Germany
Joined 13 May 2014
Using NEAT to play Snake
I used NEAT to evolve a network that learns to play Snake.
You can download the code including the libraries here:
github.com/Luecx/SnakeAI
8,057 views
Videos
Using a Genetic Algorithm to learn a smaller version of BlockuDoku
6K views · 5 years ago
AI Library: github.com/Luecx/AILibrary/blob/master/src/boids_model/BoidSwarm.java
The game: github.com/Luecx/SudoBlock
Neat - Java Implementation 8 - Evolving
3.3K views · 5 years ago
source code: github.com/Luecx/NEAT/tree/master/vid 7 and 8/src
Neat - Java Implementation 9 - Optimization
2.7K views · 5 years ago
source code: github.com/Luecx/NEAT/tree/master/vid 9/src
Neat - Java Implementation 7 - Client and Species
2.4K views · 5 years ago
source code: github.com/Luecx/NEAT/tree/master/vid 7 and 8/src
fixed Genome code: github.com/Luecx/NEAT/blob/master/vid 7 and 8/src/genome/Genome.java
NEAT - Java Implementation 6 - Calculating
2.7K views · 5 years ago
Source code: github.com/Luecx/NEAT/tree/master/vid 6/src
NEAT - Java Implementation 5 - Mutations
3.2K views · 5 years ago
Full source code: mega.nz/#!WjRWAAwA!JyFKkR_AJREKlyQz8ZQWK_mV2JSJbdxWrxIPd4Ughtw
Only frame classes: mega.nz/#!jzRSQIyZ!n8_OTdO5krupybdbq2ww3_7eLhNwnUWREX0lF2Y9VGU
add_sorted method: mega.nz/#!u7ZG1aZT!reTMP5CCAgxXBvRj6iFafA7S59a2fcO31-EgD-rReig
Neat - Java Implementation 4 - Distance function and crossover
3.8K views · 5 years ago
source code: mega.nz/#!W3RxGIKa!Ajx2r9hwYQc3fT-Pt0ntSnCI1raXCTqLjPieur5m_D8
Neat - Java Implementation 3
4.6K views · 5 years ago
Source Code: mega.nz/#!22RFXSaI!BPk7-xoy7e-UlKHGEJ3h6vew95XjpvOaslon0FJpc7E
Neat - Java Implementation 2
5K views · 5 years ago
Source files: mega.nz/#!mjAkCKZZ!fOT1jgTjQb3NQMt9wFmRdDJNBAbbaYbYAAAAAAAAAAA
Neat - Java Implementation 1
11K views · 5 years ago
I am back after a while. Hope you enjoy it. The next videos are coming in the following days. Feel free to ask any questions in the comments below. Code download: mega.nz/#!biAG1KrI!fOT1jgTjQb3NQMt9wFmRdDJNBAbbaYbYAAAAAAAAAAA
NEAT - Introduction
85K views · 6 years ago
Please give me some feedback. Again, my mic quality is not amazing, but I hope you are fine with that. MarI/O: ruclips.net/video/qv6UVOQ0F44/видео.html
Genetic algorithm - 4: Flappy bird
1.6K views · 6 years ago
Full code: www.mediafire.com/?179s7f7w36he3 Again, my mic just messed up a little bit. I am really sorry for that.
Genetic algorithm - 2: Implementation 1
981 views · 6 years ago
My mic just screwed up at the beginning. So NO, that's not me saying some weird words. Full code in the last video :)
Genetic algorithm - 3: Implementation 2
602 views · 6 years ago
Genetic algorithm - 1: Introduction
2.8K views · 6 years ago
NN: The Problem of the vanishing gradient
886 views · 6 years ago
Neural networks library [Java] 6 - 3D (2D) MNIST
865 views · 6 years ago
Neural networks library [Java] 5 - Transformation Layer
397 views · 6 years ago
Neural networks library [Java] 4 - Dense Layer
615 views · 6 years ago
Neural networks library [Java] 3 - Input and Output
545 views · 6 years ago
Neural networks library [Java] 2 - Layer construct
938 views · 6 years ago
Neural networks library [Java] 1 - Structure
1.6K views · 6 years ago
Neural networks tutorial: Fully Connected 11 [Java] - Some projects
5K views · 6 years ago
Neural networks tutorial: Fully Connected 10 [Java] - Saving and loading
4.3K views · 7 years ago
Neural networks tutorial: Fully Connected 9 [Java] - Mnist dataset
10K views · 7 years ago
Neural networks tutorial: Fully Connected 8 [Java] - Advanced learning
6K views · 7 years ago
Neural networks tutorial: Fully Connected 7 [Java] - Backpropagation implementation
11K views · 7 years ago
Thanks a lot. The subject was presented in an interesting and useful way. The source code of the GUI is missing, though. Can you add it to the web?
I want to have an offspring from you, where you will be a more fit parent
Why not pass the average of the parents' weights to the offspring? Doesn't that improve network diversity compared to copying the gene from only one parent?
Hello Finn, I'm trying to implement Pac-Man with NEAT and I really struggle with it. If you can help me I would very much appreciate it, please let me know.
Does anyone know why in the last slide he killed 6 with fitness 442 but did not kill 5 and 7?
THANK YOU. This presentation was AWESOME, I understood it very well, thank you thank you thank you so much
love you, needed this soo much!
Everything is great except for the part about speciation. And it's not because of the animation! You're trying to be specific while also referring to future videos for the details you're being specific about. Don't repeat it five times. Just say that you categorize the genomes by a certain method that you'll show later, and you'd skip a lot of confusion, and time!
Wow, great idea with the inputs to the neural network. I was looking for info on how to do that!
I have worked on this again with a C++ project and a better NEAT implementation, and found a better way; maybe it is of interest for you. Instead of just doing that, create a space of about 7x7 around the snake's head. The important part is to rotate that data based on the direction of the snake: basically pretend that the snake is not changing direction but is always going up while the board rotates. This way the output is either left, forward, or right, relative to the snake's point of view. I got a perfect game with that.
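The rotation trick this comment describes can be sketched roughly like this. This is only an illustration in Java; the class and method names are hypothetical, not from the video's or the commenter's code:

```java
// Sketch of the egocentric-view idea: instead of the snake changing direction,
// pretend it always moves "up" and rotate the local view around its head.
public class SnakeView {
    // Rotate an n x n grid 90 degrees clockwise, k times.
    static int[][] rotate(int[][] grid, int k) {
        int n = grid.length;
        int[][] out = grid;
        for (int r = 0; r < ((k % 4) + 4) % 4; r++) {
            int[][] next = new int[n][n];
            for (int y = 0; y < n; y++)
                for (int x = 0; x < n; x++)
                    next[x][n - 1 - y] = out[y][x];
            out = next;
        }
        return out;
    }

    // direction: 0=up, 1=right, 2=down, 3=left. Rotate the local view so the
    // snake's heading becomes "up"; the network's outputs (left/forward/right)
    // are then always relative to the snake's point of view.
    static int[][] egocentric(int[][] localView, int direction) {
        return rotate(localView, (4 - direction) % 4);
    }
}
```

With this representation the output layer shrinks to three actions (left, forward, right), which is the property the commenter credits for the improvement.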
Pronunciation, dang it... You said "gõringurschaaan" instead of "derivation". In instances such as this you might cause someone to waste days because they heard you say something incorrect. Enunciate, please.
Gotta agree on this one, the pronunciation is rough! If you start making tutorials again, you might want to check how natives pronounce words, or simply ask DeepL or something :) Also, why don't you record chunks instead of the whole thing at once? That way you can explain small pieces and make them sound proper instead of getting confused with words and "ähms" all the time.
I'm not sure why you're using ArrayList<T> and HashSet<T> at the same time. IMHO a single HashSet<T> will work just fine.
There’s no problem with sigmoid; all activation functions have their uses.
Sorry, I'm having a little trouble in understanding how to ensure that the nodes have consistent ids. Should the function that creates them take as arguments the ids of the input and output nodes of the connection it is splitting? What about more complex structures with many hidden nodes that interact? But generally though, great video. Definitely earned a like and subscribe!
doesn't matter, I actually solved the issue. I'm currently implementing a simple version of a neural network that can be used for NEAT in python if anyone would want to take a look
GREAT VIDEO, thank you!
Great overview, thanks for this
Can you provide the whole code for this project... I am interested in rewriting it in C++.
Thank you very much for this introduction, It was very helpful
Hi, I'm writing my term paper on the learning process of artificial intelligences with NEAT. At 12:13 you explain mutate_node, but in the official NEAT docs I only find mutate_add_node and mutate_delete_node. Have there been updates in that regard? Furthermore, I can't find any information in the docs about the other mutations, like mutate_weight_shift. Were they possibly renamed in an update, or replaced entirely? Thanks in advance.
Thanks a lot!
Finn, thanks for the great video series about networks. Love your engine!
I found a problem: if you set the initial weights randomly from 0 to 1, the network breaks, because the weighted sum (weights * prevNeuron) is so high that output_derivative is approximately 0. Hence delta will be 0, and the weights and biases will not be updated.
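The saturation effect this comment describes is easy to check numerically. A minimal sketch using the standard logistic sigmoid (nothing here is taken from the tutorial's library code):

```java
// Demonstrates why all-positive initial weights saturate a sigmoid neuron:
// a large pre-activation pushes the derivative s * (1 - s) toward zero,
// so the backpropagated delta (and thus every weight update) vanishes.
public class SigmoidSaturation {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // Sigmoid's derivative expressed via its own output: s * (1 - s).
    static double sigmoidDerivative(double x) {
        double s = sigmoid(x);
        return s * (1.0 - s);
    }
}
```

At a pre-activation of 0 the derivative is 0.25, but with, say, 100 inputs of 1.0 and weights around 0.5 the weighted sum is near 50 and the derivative is on the order of 1e-22, so deltas effectively become zero. A common remedy is to draw initial weights from a range centered on zero, for example scaled by 1/sqrt(fanIn).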
Hello. What resources did you use to learn NEAT? Have you only read the original paper, or are there other great sources to learn from?
Mainly the original paper as well as a few Google results. I also scanned through Stack Exchange for answers.
Hello Finn, I will start learning neural networks from this series of videos. I know little about them, but I am a good Java programmer and I have a good background in AI; I have worked before with ACO (ant colony optimization) algorithms.
thanks man
Ooooh only one representative client instead of comparing to every member of the species. Brilliant idea, thanks.
How can I do the following? Please help ASAP. *1.* Make an artificial neural network with dynamic input and binary output... *2.* Make a self-organizing map with dynamic input and binary output... Use only C++ or Java.
Hey, nice presentation! Could you please provide a link to the formula that calculates genome distances, to sort genomes into species? Thanks!
This is actually a part that the original paper left pretty open. I did some further research and also asked on Stack Exchange, but I was unable to find it. I also don't remember the exact method, but I think I am doing something like this: the distance of a genome to a species is the distance of the genome to the representative of the species, which I consider the FIRST one to enter the species.

For each genome g:
    Go through each existing species s:
        If distance(g, s) < some threshold:
            Add g to s
            Break
    If no species found:
        Create a new species with g as the representative
That’s a simple, yet probably not ideal, solution which works well though, and which I used.
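The representative-based scheme sketched in the pseudocode above could look like this in Java. The single-value Genome stand-in and the threshold constant are assumptions for illustration, not the video's actual code:

```java
import java.util.ArrayList;
import java.util.List;

// Speciation sketch: a genome's distance to a species is its distance to the
// species' representative, defined as the FIRST genome to enter that species.
public class Speciator {
    static final double THRESHOLD = 4.0;  // assumed value, tuned per problem

    // Minimal stand-in genome: a single number. A real NEAT genome would
    // compute distance from matching/disjoint/excess genes and weight deltas.
    static class Genome {
        final double value;
        Genome(double value) { this.value = value; }
        double distance(Genome other) { return Math.abs(value - other.value); }
    }

    static List<List<Genome>> speciate(List<Genome> population) {
        List<List<Genome>> species = new ArrayList<>();
        for (Genome g : population) {
            boolean placed = false;
            for (List<Genome> s : species) {
                // s.get(0) is the representative: the first member added.
                if (g.distance(s.get(0)) < THRESHOLD) {
                    s.add(g);
                    placed = true;
                    break;
                }
            }
            if (!placed) {
                List<Genome> created = new ArrayList<>();
                created.add(g);  // g becomes the new species' representative
                species.add(created);
            }
        }
        return species;
    }
}
```

Comparing against one representative instead of every member keeps the speciation pass linear in the number of species, which is the point made in another comment below about this design choice.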
I love how the voice and its volume always change.
well, that's neat
Hey, great series! Love it. One thing though: I think you got the batch training concept wrong. Batch training is done so the weights are NOT updated as we train on each input of the batch. We accumulate the delta weights and apply them all at the end of the batch. The reason we do this is that our final aim is to minimize cost (error) over the whole dataset (which is hard, hence batches). If this is confusing, I'm not sure how to explain it better, but you can check 3Blue1Brown's 4-video playlist on neural networks. He explains this in the 3rd or 4th video, I think. Good luck all.
You are correct, I am sorry for this. It has been a long time and I realised this was very wrong :P
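The corrected batch behaviour (accumulate gradients over the batch, apply one update at the end) can be illustrated with a deliberately tiny model, a single weight w in y = w * x with squared error. This is only a sketch of the idea, not the tutorial's code:

```java
// Batch (accumulated) gradient update for a one-weight linear model y = w * x
// with loss 0.5 * (pred - y)^2. Gradients are summed over the whole batch and
// the weight is updated exactly once at the end, as the comment above says.
public class BatchUpdate {
    static double trainBatch(double w, double[] xs, double[] ys, double lr) {
        double gradSum = 0;
        for (int i = 0; i < xs.length; i++) {
            double pred = w * xs[i];
            // d/dw of 0.5 * (pred - y)^2 is (pred - y) * x
            gradSum += (pred - ys[i]) * xs[i];
        }
        // single update per batch, using the batch-averaged gradient
        return w - lr * gradSum / xs.length;
    }
}
```

Updating after every sample instead (stochastic updates) also works, but it optimizes each sample's error in turn rather than the batch's average error, which is the distinction the commenter is drawing.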
Sorry for asking so many questions, but these are the last ones...
1. How is the selection of the species made? How do you score the results to then mutate that selection?
2. What data do the nodes (neurons) contain? Do they contain the weighted sum, like a normal neuron does?
3. How many types of NEAT algorithms are there?
4. I read in the paper that there are some formulas you didn't mention. What are those, like the fitness formula? Why are those formulas useful?
Those are the last questions :)
I have a question. What does the encoding help us with? Does the NEAT encoding scheme only help us visualize the genome in a genetic form, or is there another use? Can you explain it to me?
The encoding is the principle by which genomes are compared, which then serves for speciation.
I noticed you have a lot of getters and setters. I would suggest looking at the project Lombok: you can generate getters and setters by using annotations. You can even generate constructors that initialize final values by using the @AllArgsConstructor annotation before the class, or generate data classes. There are more things it can do, but those are the basics :)
The presentation was good, but I believe your explanation suffered from the lack of video editing. Some cuts could have made it all clearer and shorter.
Awesome work. After evolving, purging old links and nodes not used by any genome improves performance quite a lot.
I got it working; the missing files are not needed. I also implemented a thread pool so rateClient can be parallelized. As for the game itself, I changed the inputs to 8: [0-6] are the number of free spaces in a direction (n, nw, w, sw, s, se, e, ne) plus [7], which is still the direction towards the food. Slightly better results than with just 4 inputs, though slower to evolve. The best result I got was 109, starting with a genome where all outputs are connected to all inputs and only weight and weight-shift mutations were allowed. Again, thank you for taking the time to post this.
Best explanation on the internet. I'm doing an ecosystem project right now and tried to look up and understand NEAT algorithms, but they were all far too technical for me. The way you explained it gave me hope that I can implement this step by step; thank you. Right now I have a simple feedforward network with no hidden layers for each of the creatures (just input/output and the directions they move). I suppose the next step would be to give the network prototype some type of static method to generate a new hidden neuron/connection. This is going to be tough. Even still, this kind of genetic algorithm fused with neural networks interests me way more than gradient descent/backpropagation calculus, so I think this path will be worth it for me in the long run, as I just find the topic so much more interesting. It would be great if this algorithm could be optimized even further somehow.
This algorithm has been improved even further :) The algorithm is still the same, although it can be applied to larger networks. It's called HyperNEAT. HyperNEAT does basically the same thing with the genomes, although they represent different information. I haven't looked into HyperNEAT in depth, but you may want to do that. It scales way better with larger networks.
@@finneggers6612 I'll definitely check that out after I implement the simpler version first! Question: I'm still in the beginning stages and I have a Neuron class and a Brain class configured. The Neuron can store a connection object, and the Brain uses static methods to generate the initial network and has an instance method to form a connection between two neurons. In your instructions you mentioned the importance of the 'innovation number'. In my neuron's connection object I have both an innovation number and a path value like [1-6] showing which neurons are connected. I'm having a hard time seeing why I would need both a path and an innovation number. Would just having the path stored be sufficient to check if a previous connection has been made in the global brain? Or does the innovation number/id play some important role later on, since it just keeps incrementing over time? Also, can connections only occur one layer up in this algorithm? Meaning a neuron at layer 1 can only connect with a hidden neuron at layer 2, and not bypass it and connect with a neuron at layer 3? The inputs start out connecting directly with the outputs with no hidden neurons, but if a hidden neuron is dynamically created, do connections have to go through it to reach the output layer?
@@bribes_for_nouns The innovation number plays an important role in computing the distance function between two genomes and sorting them into the same "species". Connections are basically just a computational path between two neurons; the connection itself gives no information about which connection is "older", meaning which connection was created first. Generally, older connections are weighted differently compared to newer ones; that's why the innovation number matters. Also, NEAT does not know anything about layers; neurons are created by splitting connections. As far as I know, HyperNEAT works with layers.
It looks like in your while loop, while walking the connections, you are missing an else after the if (in1 == in2) and before the if (in1 > in2). Otherwise, when in1 == in2, index_g1 will be incremented twice: once for the equality and once for not being greater, since both ifs will execute in the same iteration. I haven't finished watching the rest of the series, so the suspense is building up. Anyway, great series.
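For reference, here is a sketch of how the two-pointer walk usually looks with the `else` in place. Plain innovation-number arrays stand in for the genomes' gene lists, and the method only counts matching genes; this is not the video's exact code:

```java
// Two-pointer walk over two genomes' connection genes, both sorted by
// innovation number. The else-if chain ensures exactly one pointer advance
// path per iteration, avoiding the double increment described above.
public class GeneWalk {
    static int countMatching(int[] g1, int[] g2) {
        int i1 = 0, i2 = 0, matching = 0;
        while (i1 < g1.length && i2 < g2.length) {
            int in1 = g1[i1], in2 = g2[i2];
            if (in1 == in2) {        // same gene in both genomes
                matching++;
                i1++;
                i2++;
            } else if (in1 > in2) {  // disjoint gene in g2
                i2++;
            } else {                 // disjoint gene in g1
                i1++;
            }
        }
        return matching;
    }
}
```

In a full NEAT distance function the two non-matching branches would also tally disjoint genes, and whatever remains in the longer genome afterwards counts as excess genes.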
I love the work; I definitely want to watch through the series. I am especially confused about the "calculating" part, because the random mutations cause cycles in my graphs. But why do you have Jordan Peterson in your NEAT playlist? I don't have anything against the guy, but it looks out of place.
Yeah, I had that problem too. I solved it by assigning an x coordinate: input nodes had an x value of 0 and output nodes an x value of 1. I only allowed new connections from a node with a smaller x to one with a higher x. This solves the problem entirely.
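The x-coordinate rule from this reply can be sketched like this (the names are hypothetical; only the rule itself comes from the comment):

```java
// Cycle prevention via node coordinates: inputs sit at x = 0, outputs at
// x = 1, and split nodes in between. Allowing connections only from a lower
// x to a strictly higher x makes every edge "forward", so no cycle can form.
public class AcyclicRule {
    static class Node {
        final double x;
        Node(double x) { this.x = x; }
    }

    static boolean connectionAllowed(Node from, Node to) {
        return from.x < to.x;  // forward-only edges can never close a cycle
    }

    // A node created by splitting a connection sits halfway between its ends.
    static Node splitNode(Node from, Node to) {
        return new Node((from.x + to.x) / 2.0);
    }
}
```

A nice side effect of the halfway placement is that a split node automatically satisfies the forward-only rule with respect to both endpoints of the connection it replaced.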
Thank you!
very good explanation
I used NEAT to evolve my COC
I suck at chess, and checkers
Guys got nothing on me
Losers
Thanks, easy to follow👏🏾👏🏾
Hi, I do not know if you are still around, but judging by your SOF you seem to be =) Why did you decide not to use your NEAT implementation?
This is amazing. You really are good at teaching these things. I would love to see a similar series on reinforcement learning methods like actor-critic or deep Q-learning.
thanks
Why do you assign zero to the replacement index right after getting a replacement index from neat? It's always zero in that case. (0:40)