Neural networks library [Java] 1 - Structure

  • Published: Dec 20, 2024

Comments • 16

  • @necromanhd7716
    @necromanhd7716 6 years ago +2

    Great video! And I liked the last one.

  • @John-bb5ty
    @John-bb5ty 6 years ago

    7:32 I don't understand... so the last two layers are completely linear in terms of transmitting their error function? The second-to-last layer is only connected to one other individual neuron, same as the very last layer (the orange output layer), so the last two layers appear to be identical?

    • @finneggers6612
      @finneggers6612  6 years ago

      Well, the idea is that the last layer is like the first layer: it only stores the data.
      This is easier and more flexible to implement.
      The output layer gets an error function assigned,
      so it can therefore calculate the error of its output.
      But its output is the same as the output of the previous layer.
      Each layer calculates the error signals of the previous layer,
      so the second-to-last layer actually needs to do some calculations to feed its error backwards.
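
The structure described in the reply above can be sketched in Java. This is a minimal illustration, not the video's actual code; all class and field names here are hypothetical. It shows an output layer that only mirrors the previous layer's output, owns the error function, and passes its error signal straight through, while real backprop work would happen in the layers before it:

```java
// Hypothetical sketch: the output layer stores data and computes the
// network error; the layer before it does the weighted backprop work.
public class LayerSketch {

    // Base layer: stores its output and an error signal for backprop.
    static abstract class Layer {
        double[] output;
        double[] errorSignal;
        Layer prev;

        Layer(int size) {
            output = new double[size];
            errorSignal = new double[size];
        }

        // Each layer computes the error signals of the PREVIOUS layer.
        abstract void backprop();
    }

    // The output layer only stores data (its output mirrors the previous
    // layer's output) and owns the error function; here the derivative
    // of squared error, (out - target).
    static class OutputLayer extends Layer {
        OutputLayer(int size) { super(size); }

        void computeError(double[] target) {
            for (int i = 0; i < output.length; i++) {
                errorSignal[i] = output[i] - target[i];
            }
        }

        @Override
        void backprop() {
            // No weights between the last two layers: the error signal
            // passes straight through to the previous layer.
            System.arraycopy(errorSignal, 0, prev.errorSignal, 0, errorSignal.length);
        }
    }

    public static void main(String[] args) {
        Layer hidden = new Layer(2) {
            void backprop() { /* a real hidden layer would use its weights here */ }
        };
        OutputLayer out = new OutputLayer(2);
        out.prev = hidden;

        // Forward pass: the output layer just mirrors the hidden layer.
        hidden.output = new double[]{0.3, 0.8};
        System.arraycopy(hidden.output, 0, out.output, 0, 2);

        out.computeError(new double[]{0.0, 1.0});
        out.backprop();
        System.out.println(hidden.errorSignal[0]); // identical to out.errorSignal[0]
    }
}
```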

  • @julianabhari7760
    @julianabhari7760 6 years ago

    I would really like it if you could show us the new functions for this new structure, and how to implement it in the previous structure

    • @finneggers6612
      @finneggers6612  6 years ago

      The thing is, you cannot implement it the way we did before.
      But think about it this way: the backpropagation algorithm remains the same, no matter what.
      What changes are the activation functions. You can take pretty much any non-linear function and its derivative and use that one.
      The calculation of the error signals for the output neurons is given in the link in the description. Does this help you?
      The problem with the old code is that the "calculate" method runs through all layers and uses the same activation function for all of them. We could totally have different activation functions in one single network.
      Also, the dimensions are bad. Every layer has only one single dimension, and that makes it really hard to extend the network to things like convolution.
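
The per-layer activation idea from the reply above can be sketched in Java like this. The names (`Activation`, `DenseLayer`) are hypothetical, not from the video's library; the point is that each layer owns its own function/derivative pair instead of one global function shared by a single "calculate" loop:

```java
import java.util.function.DoubleUnaryOperator;

// Hypothetical sketch of per-layer activation functions.
public class ActivationSketch {

    // Pair of f(x) and f'(x); any non-linear function works.
    static class Activation {
        final DoubleUnaryOperator f, df;
        Activation(DoubleUnaryOperator f, DoubleUnaryOperator df) {
            this.f = f;
            this.df = df;
        }
    }

    static final Activation SIGMOID = new Activation(
        x -> 1.0 / (1.0 + Math.exp(-x)),
        x -> { double s = 1.0 / (1.0 + Math.exp(-x)); return s * (1 - s); });

    static final Activation RELU = new Activation(
        x -> Math.max(0, x),
        x -> x > 0 ? 1.0 : 0.0);

    // Each layer carries its own activation, so one network can mix them.
    static class DenseLayer {
        final Activation act;
        DenseLayer(Activation act) { this.act = act; }

        double[] forward(double[] in) {
            double[] out = new double[in.length];
            for (int i = 0; i < in.length; i++) {
                out[i] = act.f.applyAsDouble(in[i]);
            }
            return out;
        }
    }

    public static void main(String[] args) {
        // Different activation functions in one single network:
        DenseLayer hidden = new DenseLayer(RELU);
        DenseLayer output = new DenseLayer(SIGMOID);
        double[] y = output.forward(hidden.forward(new double[]{-1.0, 0.0, 2.0}));
        System.out.println(java.util.Arrays.toString(y));
    }
}
```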

  • @oj0024
    @oj0024 6 years ago

    OMG, you are the best channel. The tutorials are so great, and I can simply implement everything in C++. Thanks for the first nice low-level-language neural network tutorials. (By low-level I mean neither Python nor R.)

    • @finneggers6612
      @finneggers6612  6 years ago +1

      You are welcome! :)
      The thing is, I think you understand neural networks best if, at the beginning, you code them yourself without any libraries.
      I was sitting next to another student yesterday who has been working with networks for maybe a year, and when I asked him something about convolutional layers he didn't even know how they are built.
      I mean, he knows how to use them, but he does not really know what calculations are actually happening and what makes them so amazing.

    • @oj0024
      @oj0024 6 years ago

      Finn Eggers I feel the same way too. Also, I think it's more fun.

    • @oj0024
      @oj0024 6 years ago

      Also, I have another question: are you German? Your accent sounds like it.

    • @finneggers6612
      @finneggers6612  6 years ago

      Oj Craftet yes I am :)

    • @oj0024
      @oj0024 6 years ago

      Finn Eggers me too XD

  • @oj0024
    @oj0024 6 years ago +2

    Come on, please continue the series.

    • @finneggers6612
      @finneggers6612  6 years ago +2

      Oj Craftet I will as soon as possible

    • @oj0024
      @oj0024 6 years ago

      Finn Eggers nice nice nice