Lecture 1 | The Perceptron - History, Discovery, and Theory

  • Published: 27 Aug 2019
  • Carnegie Mellon University
    Course: 11-785, Intro to Deep Learning
    Offering: Fall 2019
    Slides: deeplearning.cs.cmu.edu/docume...
    For more information, please visit: deeplearning.cs.cmu.edu/
    Contents:
    • Course Logistics

Comments • 23

  • @HetThakkar-h8h 28 days ago +1

    This was absolutely brilliant. A masterclass in lecture content design.
    Very well pieced together -> great flow -> Wow moment towards the end -> evokes a lot of curiosity

  • @bishallakha2454 4 years ago +32

    This series is the best lecture series on deep learning. I've gone through lots of lectures, but nothing like this. So comprehensive, so insightful; it provides an in-depth perspective. Thanks a lot to Carnegie Mellon University for making such great content publicly available.

  • @ambujmittal6824 4 years ago +15

    A piece of advice: go through all the slides for a lecture before watching it. You will be amazed by how much more you are able to grasp.

  • @mostafanakhaei2487 4 years ago +5

    The professor is great! I really enjoyed his lectures, and I appreciate his ability to convey the class material.

  • @ian-haggerty 3 months ago

    Loving this series! Such a talented lecturer.

  • @user-wy7wl5on7l 11 months ago +1

    There were some minor inaccuracies around the neuroscience, which may have been correct a few decades ago, mind you, but overall the lecture was quite good.

  • @user-wy7wl5on7l 11 months ago +1

    Very good, but a comment on 37:29: you would have been correct at the time of this statement, but a year ago a bombshell neuroscience paper published in the Harvard journal of medicine reported that the axon can also receive information, which somewhat damages the analogy but may in itself be insightful.

  • @oussamaoussama6364 4 years ago +3

    Best introduction to deep learning I've seen so far.

  • @jijie133 4 years ago +3

    Great!

  • @NisseOhlsen 4 years ago +4

    Around 38:40 it is claimed that the human brain does not grow any new brain cells. The phenomenon of neurogenesis disproves that claim. In fact, as I understand it, we create new brain cells constantly throughout our lives, especially through rigorous exercise followed by mental stimulation. Please correct me if I am mistaken.

    • @Enem_Verse 3 years ago

      I think you may be mistaken. He is a very experienced professor at CMU, so I'm inclined to trust him. But maybe you are right; I don't know for sure.

  • @anuraglahon8572 5 years ago +4

    Done

  • @user-or7ji5hv8y 4 years ago +3

    Great lecture!

  • @wonjaechoi2762 4 years ago +7

    I hope I can take this lecture someday as a student at CMU...

  • @debayondharchowdhury2680 4 years ago +2

    Great Lecture...

  • @Enem_Verse 3 years ago +3

    The start was boring, but the last 30 minutes were awesome.

  • @ogsconnect1312 4 years ago +2

    Awesome!

  • @namanjain8970 3 years ago

    Why do neural networks have hidden layers? If they were not hidden, we would know all the parameters, which factors the model is learning from, and how it is learning. We would then have more control over it and more understanding of its decisions; we could even help networks perform better by manually setting the weights of some hidden-layer neurons we are confident about.
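
    A note on the question above: "hidden" only means the layer sits between the input and the output; its weights are not secret and can be inspected or even set by hand. Below is a minimal NumPy sketch (my own illustration, not anything from the lecture) in which the hidden weights are hand-chosen and fully visible, and the hidden layer is exactly what lets the network compute XOR, a function a single perceptron with no hidden layer cannot represent:

        import numpy as np

        def step(x):
            # Threshold activation, as in the classic perceptron.
            return (x > 0).astype(float)

        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

        # Hidden layer: unit 1 computes OR(x1, x2), unit 2 computes AND(x1, x2).
        # Nothing here is hidden from us: the weights are plainly visible,
        # and in this sketch they are set manually rather than learned.
        W_hidden = np.array([[1.0, 1.0],
                             [1.0, 1.0]])
        b_hidden = np.array([-0.5, -1.5])

        # Output unit: OR minus AND is exactly XOR.
        w_out = np.array([1.0, -1.0])
        b_out = -0.5

        h = step(X @ W_hidden + b_hidden)  # hidden activations, inspectable
        y = step(h @ w_out + b_out)
        print(y)  # [0. 1. 1. 0.] -> XOR of the two inputs

    In practice the hidden weights are learned, and fixing them by hand usually hurts rather than helps; the power of hidden layers comes precisely from letting the network choose its own intermediate features.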

  • @germangonzalez3063 3 years ago

    Why are pages 66 and 67 skipped? I would like to know about memory in loops.

  • @pranjalgupta2072 4 years ago +3

    "more fat means more smart"
    eats while watching this :)
    Nice lecture.

  • @tantzer6113 3 years ago +1

    He says that before 2016, Siri and other systems "pretended" to do speech recognition and their results were a joke. Not really: speech recognition was already pretty good; I had been using it and was happy. Perhaps he looked at speech recognition prior to around 2010 and mistakenly assumed it remained at the same level until 2016.

  • @leooram1959 1 year ago +1

    Interesting talk. Now, for the sake of sanity, stop moving the camera!