But What Is A Neural Network?

  • Published: 7 Sep 2024

Comments • 32

  • @captainjj7184
    @captainjj7184 13 days ago

    Finally found someone with amazing illustrative skills, an ingenious way of conveying ideas, and a great voice too lol. Thank you so much, this just un-fried my brain after days of trying to find this specific logic gate NN concept out there. Brilliant presentation, and thank you for sharing!

  • @GabrielCapano
    @GabrielCapano A month ago +1

    This is the best and simplest explanation of how a neural network works I have ever seen, thanks!

  • @cipher_angel
    @cipher_angel A month ago +2

    How does this have only 28 views? This is great content bro. Keep it up.

    • @Jackson_Zheng
      @Jackson_Zheng  A month ago

      @@cipher_angel I know right? 🤣 Took me ages to programmatically animate everything in Manim too. Guess it might just be the titles and thumbnails.

    • @cipher_angel
      @cipher_angel A month ago

      @@Jackson_Zheng Impressive work sir. It'll pay off.

  • @ashali2226
    @ashali2226 A month ago +1

    Brilliant stuff man!!

  • @joshhunt4431
    @joshhunt4431 A month ago

    Nice video Jackson, your editing is so good now!

  • @ramanShariati
    @ramanShariati A month ago

    Bro keep up the good work. It takes time...

  • @kipchickensout
    @kipchickensout A month ago

    Very nice, wouldn't say it explained it *much* better to me than the other videos but it was definitely in a format where I would've watched an hour or more xd

  • @muhammadamjad4046
    @muhammadamjad4046 A month ago

    this channel is hidden gold

  • @hiddendrifts
    @hiddendrifts A month ago

    idk what it is but smth about the way you speak makes me think of some british guy telling a friend at the bar about his day

  • @Adhil_parammel
    @Adhil_parammel A month ago

    Instead of weights, there is myelin thickness in neurons to determine signal strength. Neuron activation is based on the accumulated signal strength reaching a critical level.

  • @SahilThakur-p6p
    @SahilThakur-p6p A month ago +1

    woah great content man...thought for a sec you were one of those million subs channel...keep up the quality!

  • @jorget8855
    @jorget8855 7 days ago

    You should have used op amps to model ANNs, because ANNs are analog systems. Logic gates don't work because: 1. You can't model 'weights' on the input signals. 2. The output signal swings fully high or low (for example, 3.3V to 0V) when input thresholds are met, typically 0.7*Vin for high and 0.3*Vin for low; there is no in-between. This behavior won't work as an activation function because you need continuity, with some gradual slope between the high and low transition, for information to adequately propagate through the network to function as an ANN; otherwise it would just be an ordinary combinational digital circuit. Op amps meet all the ANN criteria when you model each unit with negative feedback, the feedback resistors being the weights of each input neuron. The outputs of the op amps will be similar to the ReLU function.
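The comment's point about hard thresholds versus a gradual slope can be sketched numerically. This is a minimal illustration, not anything from the video; the function names and operating values are made up. A step function (logic-gate-like) has zero derivative almost everywhere, so no learning signal can pass through it, while a ReLU-like activation has a usable slope:

```python
def step(x):
    """Hard threshold: output snaps fully high or low, like a logic gate."""
    return 1.0 if x > 0 else 0.0

def relu(x):
    """Continuous activation: a gradual slope lets gradients propagate."""
    return x if x > 0 else 0.0

def finite_diff(f, x, eps=1e-6):
    """Numerical derivative df/dx via central differences."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

# A single neuron: weighted sum of inputs, then activation.
w, x = 0.8, 0.5   # weight (cf. a feedback resistor), input signal
pre = w * x       # pre-activation = 0.4

# Gradient of each activation at the operating point:
print(finite_diff(step, pre))  # 0.0 -> no learning signal through a gate
print(finite_diff(relu, pre))  # ~1.0 -> gradient flows
```

This is exactly why training by gradient descent needs the in-between region the comment describes: with the step function, nudging the weight never changes the output, so there is nothing to descend.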

  • @ashbeigian233
    @ashbeigian233 A month ago

    Clicked on this video by accident, but I have to say this is one of the best videos I have ever seen outlining logic gates and neural networks. Keep it up man, your stuff is going to take off soon if you keep putting out this type of quality content that anyone, in the field or not, can find value in.

  • @zerosaturn416
    @zerosaturn416 25 days ago

    underrated video

  • @jagermon
    @jagermon A month ago

    I liked this, very clear. Subscribed.

  • @HugoBossFC
    @HugoBossFC A month ago

    Nice video

  • @jason9522
    @jason9522 26 days ago

    Such a good video, really interesting! I was wondering if the code for the Neural Simulator @4:00 would be available somewhere? I would like to build such demonstrations myself but I'm not sure how to get there yet :)

  • @rajbunsha8834
    @rajbunsha8834 A month ago

    Why does it have so few views? Keep explaining more about ml in such fashion, you are sure to grow.

  • @midpiano3067
    @midpiano3067 A month ago +2

    Seriously? All of this work has only 500 views???

  • @Adhil_parammel
    @Adhil_parammel A month ago

    Logic gates: law of the excluded middle, 1/0.
    Neural networks: fuzzy logic.
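The contrast this comment draws can be made concrete. A minimal sketch, with made-up function names: Boolean logic only admits 0 or 1 (the law of the excluded middle), while fuzzy logic works with degrees of truth in [0, 1], much like a neuron's continuous activations. A common fuzzy conjunction is the minimum of the two truth degrees:

```python
def bool_and(a, b):
    """Boolean AND: inputs are strictly 0 or 1, nothing in between."""
    return a and b

def fuzzy_and(a, b):
    """Fuzzy AND (min t-norm): inputs are degrees of truth in [0, 1]."""
    return min(a, b)

print(bool_and(1, 0))       # 0
print(fuzzy_and(0.9, 0.4))  # 0.4 -> partial truth survives the operation
```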

  • @godmodedylan5563
    @godmodedylan5563 A month ago

    the start was really good but you just need a way better outro for this video.

  • @NOTMEVR
    @NOTMEVR 26 days ago

    I've been researching this topic for a while now, trying to build one in a Roblox game out of logic gates, and I guess this helped 🤣💀😅

  • @wawan_ikhwan
    @wawan_ikhwan 26 days ago

    Speaking of the energy comparison with a real brain:
    actually, I'm wondering why computer systems nowadays don't use analog computing.
    I mean, neural networks tolerate inexact values, so analog technology's imprecision is not a concern.
    Instead, Nvidia capitalism is on top currently.

    • @Jackson_Zheng
      @Jackson_Zheng  26 days ago

      @@wawan_ikhwan Dedicated ASICs by Groq are being made that perform almost as well. The issue is noise, I think, and the fact that digital components are cheap to mass-manufacture and a lot smaller than analog components, since analog hasn't had the same amount of R&D as digital for like the last 3 decades.

    • @wawan_ikhwan
      @wawan_ikhwan 26 days ago

      @@Jackson_Zheng Noise and inexact values are the same thing, so noise is not a concern either