Deep Learning (CS7015): Lec 2.2 McCulloch Pitts Neuron, Thresholding Logic

  • Published: 19 Nov 2024

Comments • 46

  • @sumukhagc5528
    @sumukhagc5528 3 years ago +50

    This is the first NPTEL lecture I found that is useful.

  • @Muskan-wi2zz
    @Muskan-wi2zz 3 years ago +10

    Can't thank enough. The concepts are explained in an absolutely amazing manner.

  • @AtulSharma-hy9yo
    @AtulSharma-hy9yo 5 years ago +14

    Best explanation I have found on the internet so far.

  • @kanikagarg
    @kanikagarg 5 months ago

    If you’ve also studied psychology, these concepts are easy and relatable. Great explanation, by the way.

  • @rushikeshkorde2673
    @rushikeshkorde2673 4 years ago +1

    Very nice, sir. Your way of teaching is awesome.

  • @alyaalblooshi9476
    @alyaalblooshi9476 5 years ago +1

    Simple and informative.

  • @UtkarshSinghchutiyaNo1
    @UtkarshSinghchutiyaNo1 21 days ago

    Why don't we have such professors in NIT?

  • @rishabhsetiya
    @rishabhsetiya 3 years ago +6

    I have read in several articles that excitatory inputs are assigned weight 1 and inhibitory inputs are assigned weight -1. Just mentioning this for the information of other students.
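
The weighted formulation mentioned in the comment above can be sketched in Python. This is a sketch of the variant described in those articles, not of the lecture's own model; the function name and signature are illustrative, my own choices.

```python
# Weighted MP-neuron variant (an assumption drawn from the articles cited
# in the comment above): excitatory inputs carry weight +1, inhibitory
# inputs weight -1, and the neuron fires when the weighted sum reaches
# the threshold.
def mp_neuron_weighted(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# x1 AND (NOT x2): weight +1 for x1, -1 for x2, threshold 1.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mp_neuron_weighted((x1, x2), (1, -1), 1))
```

With these weights the neuron fires only for x1=1, x2=0, which matches the ANDNOT function discussed later in the thread.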

  • @ganeshwaichal1
    @ganeshwaichal1 2 years ago

    Love you, teacher... great explanation.

  • @mohammadrasheed9247
    @mohammadrasheed9247 5 years ago +1

    Great explanation!

  • @GHamsa-e5h
    @GHamsa-e5h 5 days ago

    The concepts are given in a very simple and clear manner. Thank you. I just need a clarification for my doubt. For the function x1 AND !x2, the threshold is given as 1. If x1=1 and x2=0, the sum is 1, the threshold is satisfied, and it fires: OK. But if x1=0 and x2=1, the sum is also 1, the threshold is satisfied, and it fires: NOT OK. How is this taken care of? The same for x1=1 and x2=1: the sum is 2, the threshold is satisfied, and it fires: also NOT OK. Please clarify.

  • @harshitarajoria-k8u
    @harshitarajoria-k8u 1 month ago

    Omg.....what a teacher he is.....🤌

  • @rohitprasad7418
    @rohitprasad7418 5 years ago +2

    A very nice explanation. Thank you.

  • @pranjalnama2420
    @pranjalnama2420 2 years ago

    amazing lecture

  • @ashutoshpatil26
    @ashutoshpatil26 5 years ago +2

    Thank you, sir.

  • @MohitSharma-vd1eh
    @MohitSharma-vd1eh 5 years ago +1

    Nice explanation.

  • @ZakirHussain-nd4fw
    @ZakirHussain-nd4fw 8 months ago

    The intro music is just like the Doordarshan Shaktimaan TV show.

  • @madhuvarun2790
    @madhuvarun2790 3 years ago +4

    Fantastic lecture. I have a doubt: how is the threshold for x1 AND !x2 equal to 1? Since x2 is an inhibitory input it should always be 0, so whether x1 is assigned 1 or 0 the resulting Boolean operations would be (1 AND 0) and (0 AND 0), which are still 0 (because the operation mentioned is AND). Could anyone explain, please?

    • @MrMopuri
      @MrMopuri 3 years ago +4

      When x2 is 0, the resulting Boolean operations become (1 AND 1) and (0 AND 1). Note that it is !x2 (NOT x2). Hence the threshold is 1 (x1 has to be 1) for the output to be 1. Hope this is clear.

    • @himanshu5891
      @himanshu5891 3 years ago +4

      x2 is an inhibitory input, meaning that if it is 1 then y=0, irrespective of the values of the other inputs. So it is effectively connected to the other inputs by an AND operation as !x2 (NOT x2), which inverts the input: if x2 is 0 then !x2 is 1, and vice versa.
      If x2=1 then !x2=0, so y=0 for any value of x1.
      If x2=0 then !x2=1, so for x1=1, y=1.
      So the threshold is x1+x2=1+0=1.

    • @shyammarjit9994
      @shyammarjit9994 2 years ago

      @@MrMopuri thanks for the explanation.
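
The rule explained in this thread can be checked with a minimal Python sketch. This assumes the absolute-inhibition convention stated on the lecture slide (any inhibitory input equal to 1 forces the output to 0); the function name is illustrative, not from the lecture.

```python
# MP neuron with absolute inhibition: if any inhibitory input is 1 the
# output is 0; otherwise the neuron fires when the sum of the excitatory
# inputs reaches the threshold.
def mp_neuron(excitatory, inhibitory, threshold):
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# x1 AND (NOT x2): x1 excitatory, x2 inhibitory, threshold 1.
for x1 in (0, 1):
    for x2 in (0, 1):
        y = mp_neuron([x1], [x2], 1)
        print(f"x1={x1} x2={x2} -> y={y}")  # fires only for x1=1, x2=0
```

This also answers the doubt raised elsewhere in the comments: x1=0, x2=1 and x1=1, x2=1 never fire, because x2=1 inhibits the neuron before the threshold is even checked.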

  • @sanketkamta106
    @sanketkamta106 5 years ago +3

    What are the thresholds for ANDNOT and NOR? Can anyone explain?

    • @Musical_Era3
      @Musical_Era3 5 years ago

      @Gokul Gopakumar A threshold of zero means, I think, that it fires for every binary input.

    • @chhaprichandu
      @chhaprichandu 5 years ago +2

      @@Musical_Era3 While experimenting, I have found that the boundaries make more sense if we take the transformed values instead of the raw input values. For example, while plotting x_1 ^ !x_2, if we plot the graph of x_1 and x_2, the decision-boundary problem arises as you mentioned. However, if we plot x_1 vs !x_2, this problem is solved. In the latter case the threshold also changes, which I think can be handled.

    • @deepakkumarsisodia7092
      @deepakkumarsisodia7092 4 years ago +9

      Think of x2 as a power switch, where x2=1 means the power is OFF and x2=0 means the power is ON.
      When the power switch is OFF (x2=1), the output is ALWAYS 0, irrespective of what the other input and the threshold value are. Thus, out of the 4 possible inputs (i.e. 00, 01, 10, 11), 01 and 11 are ruled out because x2 is 1 in both of them.
      The applicable inputs are 00 and 10 (because x2 is 0 in both, i.e. the power is ON). For input 00 the output is 0; for input 10 the output is 1. Therefore, for all applicable inputs the threshold is 1.
      **At the bottom of the slide it's clearly written: if any inhibitory input is 1, the output will be 0.**

    • @mratanusarkar
      @mratanusarkar 4 years ago +1

      @@deepakkumarsisodia7092 That power switch analogy was great!!

    • @mratanusarkar
      @mratanusarkar 4 years ago +2

      I was really confused, as I didn't get what an inhibitory input implied and missed the footnote.
      Finally, I got it: if any inhibitory input is 1, the output becomes zero regardless of the other inputs and conditions.
      So that solves it.
      I'm leaving a link to an article on Towards Data Science here:
      towardsdatascience.com/mcculloch-pitts-model-5fdf65ac5dd1
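
Under the same absolute-inhibition convention, the ANDNOT and NOR thresholds asked about at the top of this thread can be sketched directly (function name illustrative, my own):

```python
def mp_neuron(excitatory, inhibitory, threshold):
    # Inhibitory inputs are absolute: any 1 among them forces the output to 0.
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# ANDNOT (x1 AND !x2): x1 excitatory, x2 inhibitory, threshold 1.
# NOR (!x1 AND !x2): both inputs inhibitory, threshold 0. With no
# excitatory inputs the sum is always 0, so the neuron fires exactly
# when nothing inhibits it, i.e. when x1 = x2 = 0.
for x1 in (0, 1):
    for x2 in (0, 1):
        andnot = mp_neuron([x1], [x2], 1)
        nor = mp_neuron([], [x1, x2], 0)
        print(f"x1={x1} x2={x2}  ANDNOT={andnot}  NOR={nor}")
```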

  • @dipali0010
    @dipali0010 3 years ago

    Hello, please tell me which book I should refer to.

  • @sumanacharya3014
    @sumanacharya3014 2 years ago

    Sir, I want to ask: if you use three inputs x1, x2, and x3, then the decision boundary is a plane, because it matches the plane equation x1+x2+x3+d=0, but you called it a hyperplane. How is that, sir? Please explain.

    • @kamleshkumarsingh9758
      @kamleshkumarsingh9758 1 year ago

      A line (2D) and a plane (3D) are themselves special cases of a hyperplane; in n dimensions the decision boundary is an (n-1)-dimensional hyperplane.
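
A quick sketch makes the boundary concrete for the three-input case (function name illustrative, my own):

```python
# The MP neuron's decision boundary is x1 + ... + xn = theta, which is a
# hyperplane: for n = 3 inputs it is an ordinary plane in 3D, i.e. the
# 3-dimensional special case of a hyperplane.
def fires(xs, theta):
    return 1 if sum(xs) >= theta else 0

# 3-input AND (theta = 3): the plane x1 + x2 + x3 = 3 separates the classes.
print(fires((1, 1, 1), 3))  # 1: on the boundary, fires
print(fires((1, 1, 0), 3))  # 0: below the boundary
```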

  • @dhruvinchawda439
    @dhruvinchawda439 9 months ago

    Can anyone explain why the threshold for the tautology at 11:02 is zero?

    • @svk0071
      @svk0071 5 months ago

      A tautology implies that the output is always TRUE (1), irrespective of the inputs. So the threshold is 0: even if both x1 and x2 are 0, the sum 0 still meets the threshold and the output is 1.
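
The tautology case can be verified over all binary inputs (a sketch; function name my own):

```python
# With threshold 0 the neuron fires for every input combination: the sum of
# binary inputs is never negative, so sum >= 0 always holds -- a tautology.
def fires(xs, theta):
    return 1 if sum(xs) >= theta else 0

print(all(fires((x1, x2), 0) == 1 for x1 in (0, 1) for x2 in (0, 1)))  # True
```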

  • @narengs9790
    @narengs9790 4 years ago

    But how will the equation look for NAND? Can you kindly explain?
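
One possible answer, sketched under the absolute-inhibition convention (my own construction, not from the lecture): a single MP neuron cannot produce NAND directly, since NAND must output 1 when exactly one input is 1, yet any active inhibitory input forces the output to 0. Composing NOT neurons with an OR neuron works, because NAND(x1, x2) = (!x1) OR (!x2).

```python
def mp_neuron(excitatory, inhibitory, threshold):
    # Absolute inhibition: any inhibitory 1 forces the output to 0.
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

def mp_not(x):
    # NOT: a single inhibitory input with threshold 0.
    return mp_neuron([], [x], 0)

def mp_nand(x1, x2):
    # NAND = (!x1) OR (!x2); the OR neuron has threshold 1.
    return mp_neuron([mp_not(x1), mp_not(x2)], [], 1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mp_nand(x1, x2))  # 1 for every row except x1=x2=1
```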

  • @Harry-vr5vz
    @Harry-vr5vz 5 years ago

    Why were weights not included?

    • @thanioruvan4556
      @thanioruvan4556 1 year ago +3

      This is not a perceptron; this is the MP neuron model, which doesn't have weights.

    • @umang9997
      @umang9997 1 year ago

      @@thanioruvan4556 True. Different videos on YouTube suggest otherwise, but the fact is that the MP neuron does not have weights.

  • @Harry-vr5vz
    @Harry-vr5vz 5 years ago +3

    Why were weights not included?

    • @DerEddieLoL
      @DerEddieLoL 4 years ago +1

      You don't need weights if there is only one connection going out from each x_i, right?

    • @prithvip6360
      @prithvip6360 3 years ago +3

      Because MP neurons don't have weights.

    • @asjadnabeel
      @asjadnabeel 1 year ago

      Because MP neurons are the early model, which doesn't have the concept of weights.