That's strange, I just woke up exactly 7:30 am to this video playing. And I went to the comment section and turns out I wasn't the only one. I don't remember watching science related video neither.
At 9:30 please agree that that last weight from the black solid to the output ”solid” should’ve been _black_ (because minus x minus = plus) and not white. Or I will have understood nothing
I woke up to this like everyone else, apparently. I'm guessing the unusually long runtime increases the likelihood that someone would, as opposed to waking up on some random 20-minute video 🤷🏻
0:40 if you tried making a rule you wouldn't be able to do it. Why? Isn't the rule pretty clear? If only the 2 top or 2 bot are black, it's horizontal. If only the 2 left or 2 right are black, it's vertical. Every other composition where 2 are black is diagonal. If 3 are black, it's L. This just seems programmable. What am I missing?
@@JoaoPedro-dx6pn I see, so of course if it was a more complicated object in an environment, it would be really difficult to describe all rules for it. Is that what he meant?
@@radovankrizalkovic9084 Yes. Especially if you don't know all the possible states. These algorithms are applied to filter/classify tons of data, and they will handle tons of different states that we cannot predict.
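The hand-written rules in this thread can actually be sketched in a few lines. Here's a quick, hypothetical Python check (my own naming, just to illustrate why explicit rules are feasible for a 2x2 grid but stop scaling for real images):

```python
def classify(px):
    """Classify a 2x2 image given as ((tl, tr), (bl, br)) booleans (True = black).

    Rules from the thread: 2 black in one row -> horizontal, 2 black in one
    column -> vertical, any other 2-black layout -> diagonal, 3 black -> L.
    """
    flat = [px[0][0], px[0][1], px[1][0], px[1][1]]
    n = sum(flat)
    if n == 2:
        if flat[0] and flat[1] or flat[2] and flat[3]:
            return "horizontal"
        if flat[0] and flat[2] or flat[1] and flat[3]:
            return "vertical"
        return "diagonal"
    if n == 3:
        return "L"
    return "other"

print(classify(((True, True), (False, False))))   # horizontal
print(classify(((True, False), (True, False))))   # vertical
print(classify(((True, False), (False, True))))   # diagonal
```

For a real photo the number of pixel states explodes far beyond anything you can enumerate by hand, which is exactly why the video reaches for learned weights instead.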
@7:40 at the last bottom-right neuron on the third layer; shouldn’t the connected weights be positive (white) to get the desired output of the horizontal pixels?
Marvelous work! If this captivates you, there's a book with similar themes you’ll want to explore. "From Bytes to Consciousness: A Comprehensive Guide to Artificial Intelligence" by Stuart Mills
The graphs from 40:00 forward make no sense to me. How can you have 2 values and not make a straight line? You have y output and x input and you get waves? That graphic doesn't match what you are talking about... unless you have a different value called repetition or time. But still, the first ones without the multiple nodes will also make waves since you are adjusting the values, right? I don't understand it...
Brandon; great video! where can we find more visual representations of adding curves? @40:00 you begin to combine curves. how and where does one learn more?
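As I understand the segment at 40:00 (the weights below are made up for illustration), each graph is output vs. input for a weighted sum of squashed copies of x. Even the difference of two shifted sigmoids is no longer a straight line but a bump:

```python
import math

def sigmoid(z):
    """Standard logistic squashing function, maps any z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def bump(x):
    # Difference of two shifted sigmoids: rises, then falls -> a bump, not a line.
    return sigmoid(4 * (x + 1)) - sigmoid(4 * (x - 1))

for x in (-3, 0, 3):
    print(round(bump(x), 3))  # near zero at the edges, close to 1 in the middle
```

Stack a few of these bumps with different centers and weights and you can approximate arbitrarily wiggly curves, which (as I read it) is the point of that section.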
Maybe my English is not good and I did not understand correctly. Can anyone help explain why at 1:09:51 the error of O is not -0.51? I mean, 0 - 0.51 = -0.51. Thanks in advance.
Check out the "TensorFlow Basics to Mastery" Coursera Course - www.deeplearning.ai/tensorflow-from-basics-to-mastery/ I am currently doing Course 1. Doing courses separately is free. The specialization is paid.
@@1ycx I think you misunderstood something. You have to pay for a certificate. You can do courses separately, but these courses come under the specialization only. You get a certificate after every course, but you have to pay for it. If you don't want a certificate, then the courses are free.
@@shwetagoyal9801 I think that Koga Master is talking about the fact that you are able to learn TensorFlow on Coursera for free rather than getting a certificate for it.
I was about to fall asleep and then this video started playing somewhere in the middle of the video. I guess I listened to part of this lecture in my sleep before.
You explained just a one-layer back propagation. My question is: if I’m propagating from layer 3 to input and I’ve done layer 3, how do I know the expected value of neurons in layer 2? I need that to calculate the error right
This autoplayed while I was sleeping, and it totally influenced the dream I was having when I woke up. Did he use a roommate making dinner as an example at one point? I definitely think he did; pizza, waffles, sushi? and something else. Edit: OMG HE DID! AND I REMEMBERED THE OPTIONS CORRECTLY! Except I thought there were 4 but there were only 3 💀 crazy how much this influenced my dream
RUclips AI: good he’s sleeping. Time to load 3 hour neural network video. The human must understand how I work.
3 HOURS!!
4
Same boat
Hahahah... I absolutely love it!
9 pm
So, I just woke up to this video on my phone but the ironic part is I just learned about this yesterday.
Same as me and I kept the link in a file for learn it next time.
me too😂
Oh and I didn't learn about this and have 0 interest in this...
Same here
Same. We are all connected.❤
I came with an interest in neutral networks.
I left feeling well rested.
Same lol 😂
It's not neutral networks, it's neural networks, get it right!
@@Mario-s1c2ono, it's neutral.
@@Fazers-On-Stun being completely neutral in any network, never works.
I just woke up. I am very confused. Why am i here-
Same lol.
It’s an exclusive club we’re learning in our sleep
Like seriously same just happened 😂😂😂
🌾👀🌾
Same 😂
Watching this on my way to sleep for all the people who are waking up to this, it might break the cycle. 🙏💪
It didn’t 😢
Me too 😂
😂😂😂
It got worse. I was watching funny videos and I fell asleep.
After two funny videos it just went to this one
dunno why but this video was playing when i woke up in the middle of the night
Lol. Same here.
Same omg I think I get it. I was afraid I was too stupid but idk maybe I need to sleep listen
@@BadChad-tn3iii Just got teleported here for the 1st time, but I see there are unusually a lot of people here too
Did we all wake up here?
I stumbled into this while having a fever
Lol I did
Yes 😮
literally woke up no idea whats going on
Yeah i did
Three and a half million views of this video, but no one has actually watched it.
That's how a neural network works
Exactly how it worked 😂
time is valuable
😂😂😂😂😂
I fell asleep watching a very simple maths video and woke up to this after dreaming that me and my friends were studying its contents. I’ve never done anything to do with this before but I understood it when I was dreaming about it so will probably give it another listen. It reminds me of being in College/University when SWIM was doing a bunch of drugs and accidentally designed a computer brain. Score for drugs 1,264,273,995,267,177, score for sobriety: still zero LOL
SWIM... There was an online forum I used to frequent. It's been years. I don't recall how to get there. I assume you know which I'm talking about. Does it still exist?
YAY drugs
@@therainbowtrout1820 Yes, albeit not necessarily in the same regard
@therainbowtrout1820 It could be Bluelight which is popular. There was another one I used many years ago, but I can't remember the name of it. I'm not sure it exists anymore.
Somehow this autoplayed on my phone while I was sleeping.
Really?
Yeah @@0xSpaceCowboy
Same here
3.8 million people woke up to this, including me. Fascinating!
Just woke to this playing. It was the catalyst to the craziest most vivid dream since childhood...im in my 30's.
i slept watching a different completely unrelated video and woke up on this what just happened
Yup
Same!
Singularity trying to nudge you in the right direction
You changed the weight without adjusting for the error. Happens all the time
The same thing happened to me this night
From sleeping on a Geopolitics video to landing here, I am stunned😅
Just woke up, don’t know where I am or how I ended up here
same, last thing i remember was veritasium explaining game theory
Bro same i was watching fresh spawns i think...
Ouch.. same here
Same, was watching serpentza.
same😭
Fell asleep to Derek whispering sweet nothings to me about black holes, and woke up to this.
Truly we live in the best of times
who's derek
@cocogaddam5558 ruclips.net/video/6akmv1bsz1M/видео.htmlsi=Q2l3k6zawnQeBdva
Let me guess: you just woke up and this video was playing
and it’s already an hour and thirty three mins in, how tf?!?!??
YESSS
HOW DO YOU KNOW
YES
woke up 44 minutes in
Very surprised to see everyone woke up to this video as well. The algorithm strikes again!
Hey will you please clarify me what's happening here, why everyone just saying they woke up seeing this video???
@@greyhat430 cuz they went to sleep watching another video and they woke up with this one like me
I’m pretty happy to be awakened by such an interesting lecture. Will watch it again
True! I woke up after it ended and the headline was interesting enough to hit replay while awake 😂
did you watch it again?
So to be the first, I’d just like to say my journey consisted of falling asleep to a video about why a magnet on the front of a car wouldn’t work, then it went to cursed units of measurement, then it went to professor Dave explains and then I ended up here, all in all I’ve been asleep for about 3 hours and I need more sleep…
Anyone else wanna share the journey?
I started by watching “why therapy sucks for men” I then fell asleep, and RUclips showed me what gaming does to my head, to then finish here, it would’ve been Waaaay more if didn’t have my console on auto rest mode
I‘m actually curious about that video. What is the magnet supposed to do?
I was watching videos on the physics of anime booba.
I wasn't even watching a video when I fell asleep, I have no clue how I got here
@@ChefGoreb to move the car, but it won't work because the magnet will cancel itself
RUclips is a good detector of sleep
Oh, so I wasn't the only one falling asleep watching something and then ending up here confused
Amen
@@marius.y6360 me as well😂
Ur right
I woke up and this was playing on the background
Seriously. Same here
Bruhhh same tf
🤣🤣🤣
Thanks!
Does this video ONLY play for people who are sleeping? I saw a comment from someone who woke up confused how they got here. I could relate, and when I clicked the comments to say so... it seems EVERYONE got here that way. Did any of you actually intentionally watch this video?
Woke up at 3-4 am last night to this playing after watching a video on Necromancy. Very strange this played for this many people while sleeping, eerie in fact. Decided to see what other’s thought as well lol
I woke up this morning to this video
Woke up with this playing 3 hrs in
Oh my…
Wow.... Was watching a video on Tesla and then dozed off. The next thing I hear is this video playing when I wake up.
This has to be the perfect video to the algorithm
Right? There's no way it would have gotten 3.8M views 😂😂😂
I watched this on purpose. :) Found it quite helpful! Cheers
Out of the many comments, you are one of the very few that did not wake up to this / fall asleep to it.... it's actually hard to believe XD
what is this, i just woke up..
If you're an audio guy, squashing functions are just compression by a factor of Ratio (r). Threshold is the pickup weight input, and knee is smoothing of weights between input and output over a certain range. And there you go: compression in a nutshell. However, the dB peak scale is non-linear; dB is logarithmic, with every 10 dB representing a tenfold change in power. That's what makes it the most confusing.
So a ratio of 10 to keep it simpler is double the volume at the threshold gradually weighting less until the set peak where compression is zero. The knee rolls off that effect by a dB factor at a specified loudness and breadth of its impact. Seems gaussian to me. I don't know how the math works at the knee but it gives a smoother transition from boosted to left alone. So in a typical simple compression threshold at -24dB with 10 ratio would result in threshold at -12dB tapering to -10dB, -8dB, -6 and so on until you hit zero assuming your highest peaks are 0dB which is bad. Then you adjust the output to -8 or -14 depending on the sound and that scales the whole curve downward unaltered relatively by whatever output dB you set. If your threshold was boosted by compression to -12dB and you scale it down in output by -8dB then your threshold after processing will be -20dB tapering off up to -8dB in the same curve it had before the output was scaled down.
That's why you have to adjust input vs threshold vs ratio vs knee vs output to get the best out of simple compression. Multiband compression is the same thing just much more complicated as it accounts for frequency where you can specify within a certain frequency range how much compression you'd like. Overlap them and yeah that gets quite complicated but it's super useful to getting the right sound especially in dialogue to grab and manipulate the loudness of tonality and sibilance while rejecting the background noise or any echo or unwanted reverb.
The same principles apply in NNs, in a more deterministic and mathematical way. It entirely depends on the architecture and what it is used for, as you are taking a larger dynamic range of inputs and compressing them to a smaller range of outputs. That's why Red Book CD audio in the '90s was 16 bits wide: 2^16 made for 65,536 levels of volume for any given sample. That was enough because it was replacing cassette tape, which had horrible dynamic range. Now it's standard to have 24-bit audio, which has a vastly higher dynamic range of 16,777,216 levels of volume at any given sample. For production and processing it's common to have 96-bit audio, which has 7.92281625 x 10^28 levels of loudness. That's technically not better than analog, but no human would ever be able to tell the difference. It helps computers and audio processing make very, very accurate changes.
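To make the analogy concrete, here is a rough sketch of a hard-knee downward compressor's transfer curve (my own simplification; real compressors add attack, release, knee smoothing, and makeup gain):

```python
def compress_db(level_db, threshold_db=-24.0, ratio=10.0):
    """Hard-knee downward compression of a peak level given in dBFS.

    Below the threshold the signal passes unchanged; above it, each dB of
    input yields only 1/ratio dB of output.
    """
    if level_db <= threshold_db:
        return level_db
    return threshold_db + (level_db - threshold_db) / ratio

# A -24 dB threshold at 10:1 squashes a 0 dBFS peak down to -21.6 dBFS,
# while anything already under the threshold is left alone.
print(compress_db(0.0))
print(compress_db(-24.0))
print(compress_db(-40.0))
```

The squashing functions in the video play the same role: mapping an unbounded input range into a tame, bounded output range.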
Quite a pointless post really. Going to tremendous depth using an analogy to explain neural networks.
Far better to understand the network rather than your analogy. And yes, I used to work in audio engineering.
Analogies are useful as a means of explaining, of education, but your analogy is so specialised it has very little use in educating people.
Wow. You’re a great teacher. I didn’t know what backpropagation was and you were able to make me understand that its multiplying the slope/derivative of the error functions from the end of the network back to the weight you are correcting for. Amazing that you did it so easily. Thank you!!
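That picture can be checked numerically. A toy two-weight linear network with squared error (all numbers here are made up) shows each weight's gradient being the product of the local slopes chained back from the output:

```python
# Toy network: x -> h = w1*x -> y = w2*h, squared error L = (y - target)**2.
x, target = 1.0, 0.5
w1, w2 = 0.6, 0.8

h = w1 * x                  # hidden activation
y = w2 * h                  # network output
dL_dy = 2 * (y - target)    # slope of the error at the output
dL_dw2 = dL_dy * h          # chain rule: dL/dy * dy/dw2
dL_dw1 = dL_dy * w2 * x     # chain rule: dL/dy * dy/dh * dh/dw1

lr = 0.1                    # learning rate
w1 -= lr * dL_dw1           # gradient-descent step on each weight
w2 -= lr * dL_dw2
print(w1, w2)               # both nudged so the output moves toward the target
```

One step against each gradient (the `w1 -= ...` lines) nudges the output toward the target, which is all gradient descent is.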
Apparently everyone waking up to this including myself 😂
Is this a joke or the truth? What exactly happened?
@@fatemehmohseni5414 Loads of people, including myself, suddenly woke up to this video. Autoplay at its finest
Yes I'm scared too
It’s making me laugh so hard 😂
@@fatemehmohseni5414truth
Just woke up after it ended. I remember waking up for a few seconds thinking that it was interesting, hitting repeat and fell asleep after a few seconds again as it was still in the middle of the night.
Now I'm wide awake and hit replay again to truly watch it haha 😂
OH MY GOD 😂
I fell asleep listening to something and had a really trippy dream. Woke up two hours into this!
I don’t care I woke up to this out of nowhere, it’s awesome. Super interesting stuff dude
1:20:40 Since this course is about learning algorithms this is important to classification. Vector:
noun
Mathematics.
1: a quantity possessing both magnitude and direction, represented by an arrow the direction of which indicates the direction of the quantity and the length of which is proportional to the magnitude.
2: such a quantity with the additional requirement that such quantities obey the parallelogram law of addition.
3: such a quantity with the additional requirement that such quantities are to transform in a particular way under changes of the coordinate system.
Biology.
1: an insect or other organism that transmits a pathogenic fungus, virus, bacterium, etc.
2: any agent that acts as a carrier or transporter, as a virus or plasmid that conveys a genetically engineered DNA segment into a host cell.
Computers.
1: an array of data ordered such that individual items can be located with a single index or subscript.
verb (used with object)
Aeronautics.
1: to guide (an aircraft) in flight by issuing appropriate headings.
Aerospace.
1: to change the direction of (the thrust of a jet or rocket engine) in order to steer the craft.
...
I am tempted to say Physics would include a unit
Physics
1: a quantity possessing magnitude, direction and unit.
Thanks but most of us finished high school too 😅
Excellent explanation, excellent figures, and animations, awesome speaking! Looks like a dream course!
bruh.... this is magic... I read the "I just woke up" comments and I didn't get it... till I figured out that 20 minutes ago I woke up from a quick sleep in my chair and this video played without me searching for it or putting it in any list....
I suddenly opened my eyes and dreamed about this video while sleeping with my tai chi instructor at my beachfront property. Unreal.
Wow first time I’m actually glad I learned calculus in school. Nice to see it useful outside of the classroom.
Yep. Everyone thinks they know better. "I'll never use this!" Then why are they trying to teach it to you? 🙄
just cus they tryna teach it doesnt mean its necessary or objectively useful. most schools dont teach how to do taxes, and those are mandatory @@zilog1 🙄
@@zilog1 Most people never use it again
Was watching RUclips videos, fell asleep, and woke up to this video… it's crazy how so many people just woke up to this video
Ikr!
Also woke up to this. My dream involved moving between layers of reality by matching patterns and when you found a match you could move between parts of the world by 'twisting' in to the new pattern. It was amazing.
I knew an engineer brother of a friend who was working on how best to implement gradient descent into NNs years and years ago. I think he was one of the ones who gave up before CNNs became a widely used method. He certainly isn't a NN engineer anymore. He went on to predictive logistics which resembles RNN but really it was a much simpler feedback loop and balancing input versus output. Part of the Just in Time production to delivery process. Likely, the processing power and tech in the 90's wasn't powerful enough to realize the emergence big data is capable of now. Kinda wonder what he would have done had he been doing that 25 years later than he was. I know that he uses advanced NNs now of various types for his job but at this point he is an implementer rather than a developer. Tuning plays a big role.
3 times this week I’ve woken up to this
Thanks 🙏 I finally have some understanding of why cnn’s work!
Wishing you goodluck keep it up 🙏
My auto play was turned off when I went to sleep and yet, somehow, I woke up to this playing. Not sure if I turned it on in my sleep or something but then again it seems I’m not alone in the endeavor
I wasn't even watching anything and I woke up to this video 😭
it’s the fact that i’ve woken up to this video so many times, and each time I fell asleep watching something COMPLETELY different. Like… an episode of Kitchen Nightmares vs. a documentary on the history of nuclear bombs level different
This is a lot of videos smashed into one. Honestly, excellent work.
Where would he place GPT4 on his generality performance graph? Must be a step change
Am I the only person who actually sought out and watched this video? It was very helpful, by the way: slow, clear explanations of a very complex topic
7:40. The bottom right neuron is supposed to be inverted. 2 black on top and 2 white on the bottom. The negative weights should actually be positive weights.
Exactly
Would the vectors need assigned values/weights in that case? Isn’t this just an example?
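On the sign question in this thread, here is a minimal sketch (not the video's actual numbers; a pixel encoding of +1 for white and -1 for black is assumed) of why negative weights on black pixels still give a positive activation:

```python
# Sketch: pixels encoded as +1 (white) / -1 (black).
# A neuron's raw activation is the dot product of its weights with the pixels.
def activation(weights, pixels):
    return sum(w * p for w, p in zip(weights, pixels))

# Pixels ordered [top-left, top-right, bottom-left, bottom-right].
# A "bottom row black" detector: negative weights on the bottom pixels,
# so black (-1) inputs there contribute positively (minus times minus).
weights = [0.0, 0.0, -1.0, -1.0]

bottom_black = [+1, +1, -1, -1]
top_black    = [-1, -1, +1, +1]
print(activation(weights, bottom_black))  # 2.0 (strongly on)
print(activation(weights, top_black))     # -2.0 (strongly off)
```

So whether the weights "should" be positive depends entirely on how black and white pixels are encoded.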
Ain’t no way that we all woke up here. This is some certified Backrooms shizz right here bruh.
3:33 How come the narrator's cough makes the result go from negative 0.075 to positive 0.075?
abs(0.075)
shutup
@@duoko98 Are you saying that because your mother has three? Get an education
@@ansowarrower5038 Lol cough joke just wasn't funny to me idk...
Wonderful, elegant explanations! This is the way to present the basics of a hugely scalable system!
Woke up to this and it's exactly what I searched for yesterday but couldn't find
It's almost the 10th time. When I woke up, I found this neural networks video.
RUclips really just teaches me neural networks while I'm asleep.
beautiful course and amazingly easy to understand
That's strange, I just woke up at exactly 7:30 am to this video playing. I went to the comment section and it turns out I wasn't the only one. I don't remember watching a science-related video either.
Simple and intuitive explanations. Thanks!
Exactly like yours.
@@wolfisraging Thanks bud!
I swear literally everyone who has watched this video, including me, didn't choose it; they woke up to it
This randomly autoplayed, I know nothing about neural networks.
Suddenly, I want to learn more about them now.
I just woke up and this was playing. Now I better just know how to program a new LLM or I’ll be really upset.
I woke up and this was the first video that popped on my feed ,goodmorning I guess !
Last night I was watching fajrul fx; somehow I woke up here, and lots of commenters also woke up from sleep to this video 😂
This is super fascinating
This is such a great tutorial! Thank you for making it. I will share the video with students interested in neural net and deep neural networks.
At 9:30, please agree that the last weight from the black solid to the output "solid" should've been _black_ (because minus × minus = plus) and not white. Or I will have understood nothing
I woke up to this like everyone else, apparently.
I'm guessing the unusually long runtime increases the likelihood that someone would, as opposed to waking up on some random 20-minute video 🤷🏻
we both woke up to this
@@purpls.We all did
Nah that can’t be, not every 3 hour video has everyone waking up to it
Plus the algorithm really enjoys recommending videos with 3.8M views
Woke up to this like 10 times in the past 2 months
0:40 if you tried making a rule you wouldn't be able to do it. Why? Isn't the rule pretty clear?
If only the 2 top or 2 bot are black, it's horizontal.
If only the 2 left or 2 right are black, it's vertical.
Every other composition where 2 are black is diagonal.
If 3 are black, it's L.
This just seems programmable.
What am I missing?
It may be programmable, but what if you rotate it?
Or if you want a cat / not-cat program
@@seriouscoder1727 Exactly. He used this example because it was simpler to explain :)
@@JoaoPedro-dx6pn I see, so of course if it was a more complicated object in an environment, it would be really difficult to describe all rules for it. Is that what he meant?
@@radovankrizalkovic9084 Yes. Especially if you don't know all the possible states. These algorithms are applied to filter/classify tons of data, and they will handle tons of different states that we cannot predict.
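The rules in the comment above really are programmable for this toy case; here's a direct sketch (boolean pixels, with the ordering assumed as top-left, top-right, bottom-left, bottom-right). The replies' point stands: such hand-written rule lists don't scale to rotated shapes or cat-vs-not-cat inputs.

```python
# Direct encoding of the commenter's hand-written rules for a 2x2 image.
# Pixels are booleans (True = black).
def classify(tl, tr, bl, br):
    black = [tl, tr, bl, br].count(True)
    if black == 3:
        return "L"
    if black == 2:
        if (tl and tr) or (bl and br):
            return "horizontal"
        if (tl and bl) or (tr and br):
            return "vertical"
        return "diagonal"
    return "other"

print(classify(True, True, False, False))   # horizontal
print(classify(True, False, True, False))   # vertical
print(classify(True, False, False, True))   # diagonal
print(classify(True, True, True, False))    # L
```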
Not to be a broken record, but I just woke up to this. It’s 2am. I think we’re all in a cult.
I’m not sure what we’re supposed to do with this information but I feel like it’s something
Assuming that everyone has had or currently has a learning capacity you realize that environment plays a huge part.
Yeah ok
I was not watching this when I fell asleep.... but I did wake up to it
does anybody know where I can find the result mentioned at 40:10 ?
i dont know, i just woke up
Very well explained sir ❤
How in the world have we all woken up to this. How does the algorithm stop working when night rolls around
Thank you for making this!
46:28 in (b) those have a correlation too
I’ve watched this so many times in my sleep I must be a genius when I fall asleep, then it all goes away when I wake up
What's confusing is your vision, or at least your conclusion! But the global context is great
@7:40 at the last bottom-right neuron on the third layer; shouldn’t the connected weights be positive (white) to get the desired output of the horizontal pixels?
Marvelous work! If this captivates you, there's a book with similar themes you’ll want to explore. "From Bytes to Consciousness: A Comprehensive Guide to Artificial Intelligence" by Stuart Mills
The graphs from 40:00 forward make no sense to me. How can you have 2 values and not make a straight line?
You have y output and x input and you get waves? That graph doesn't match what you're talking about... unless you have a different variable called repetition or time. But still, the first ones without the multiple nodes will also make waves, since you are adjusting the values, right? I don't understand it...
Are you combining the results from the 3 layers in the same graph? Isn't that the same as making 3 iterations in the 1-layer mode?
Logistic regression is a kind of linear classification
Just woke up nearly 3 hours into this video with no idea how I got here…
Why have so few actually chosen to watch this video, I woke up at 4am to it playing
Brandon, great video! Where can we find more visual representations of adding curves? At 40:00 you begin to combine curves. How and where does one learn more?
Fourier series comes to mind. Basically, add a bunch of simple but different curves together to get one complicated but continuous curve.
@@닐리아담 Fourier doesn't do a thing at the impound lot.
@@w花b😂
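A minimal Fourier-style sketch of "adding simple curves to get a complicated one" (odd sine harmonics approximating a square wave; this is an analogy for how hidden nodes each contribute a bump to the output curve, not the video's exact construction):

```python
import math

# Summing the odd harmonics sin((2k+1)x) / (2k+1) approximates a square
# wave: each added term is a simple smooth curve, but the sum gets
# progressively sharper corners.
def square_wave_approx(x, n_terms=4):
    return sum(math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms))

# With many terms the sum at x = pi/2 approaches pi/4 (the Leibniz series).
for x in [0.5, 1.5, 2.5]:
    print(round(square_wave_approx(x), 3))
```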
Glad to know I'm not the only one who woke up to this video playing with brightness all the way down and volume low. Also, my phone heated up so much
Who just woke up to this video in 2024 😂
Me…
Maybe my English isn't good and I didn't understand correctly. Can anyone explain why at 1:09:51 the error of O is not -0.51? I mean 0 - 0.51 = -0.51. Thanks in advance
Maybe it gets multiplied, doesn't it?
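On the sign question above, a small sketch (assuming a squared-error loss, which the video may or may not use at that timestamp) showing why the error can legitimately come out as +0.51 rather than -0.51 depending on convention:

```python
# The sign of the "error" depends on the convention. With squared error
# E = 0.5 * (target - output)**2, the derivative with respect to the output
# is (output - target), which flips the sign of the naive difference
# (target - output). Either convention works as long as the weight update
# uses it consistently.
target, output = 0.0, 0.51

difference = target - output   # -0.51: "how far short of the target"
gradient = output - target     # +0.51: slope of the squared error
print(difference, gradient)
```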
Guys how about a Tensorflow tutorial in depth! Please?!!
Check out the "TensorFlow Basics to Mastery" Coursera Course - www.deeplearning.ai/tensorflow-from-basics-to-mastery/
I am currently doing Course 1. Doing courses separately is free. The specialization is paid.
@@1ycx I think you misunderstood something. You have to pay for a certificate. You can take the courses separately, but these courses only come under the specialization. You get a certificate after every course, but you have to pay for it. If you don't want the certificate, then the courses are free.
Check recent uploads
@@shwetagoyal9801 I think that Koga Master is talking about the fact that you are able to learn TensorFlow on Coursera for free rather than getting a certificate for it.
I was about to fall asleep and then this video started playing somewhere in the middle of the video. I guess I listened to part of this lecture in my sleep before.
Like others, I woke up and this was playing. However, it narrated my entire dream
You explained just one-layer backpropagation. My question is: if I'm propagating from layer 3 to the input and I've done layer 3, how do I know the expected value of the neurons in layer 2? I need that to calculate the error, right?
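A hedged sketch of the usual answer (toy made-up numbers, standard sigmoid backprop rather than necessarily the video's exact scheme): hidden layers don't need expected values at all; the output error term is pushed back through the connecting weights via the chain rule.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# One hidden neuron feeding one output neuron (toy sizes, made-up numbers).
h = sigmoid(0.4)          # hidden activation
w_ho = 0.8                # hidden-to-output weight
o = sigmoid(w_ho * h)     # output activation
target = 1.0

# Output error term: loss derivative times the sigmoid's derivative.
delta_out = (o - target) * o * (1 - o)

# Hidden error term: no "expected value" for the hidden neuron is needed.
# The output delta is propagated back through the weight, then scaled by
# the hidden activation's derivative.
delta_hidden = delta_out * w_ho * h * (1 - h)
print(delta_out, delta_hidden)
```

With more neurons, each hidden delta sums the deltas of everything it feeds, weighted by the connecting weights.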
Hey guys, post your timestamp below for when you woke up. Mine is 43:33.
For me it stopped earlier at 14:30 because I set a RUclips timer.
LMFAOOOO MINE WAS 43:37
This autoplayed while I was sleeping, and it totally influenced the dream I was having when I woke up. Did he use a roommate making dinner as an example at one point? I definitely think he did; pizza, waffles, sushi? and something else. Edit: OMG HE DID! AND I REMEMBERED THE OPTIONS CORRECTLY! Except I thought there were 4 but there were only 3 💀 crazy how much this influenced my dream
The overlords think I'm capable of learning this lol, we all woke up to this.
Yeah, I've never watched a video like this before but I woke up to it too, guys 😮
Nicely explained. One thing near the start: sigmoid only goes from 0 to 1 (it's tanh(x) that goes from -1 to +1)
They're related, nice point to mention
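A quick check of the two ranges, plus the identity tanh(x) = 2·sigmoid(2x) − 1 that relates them:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Sigmoid squashes to (0, 1); tanh squashes to (-1, 1).
for x in [-5.0, 0.0, 5.0]:
    print(round(sigmoid(x), 3), round(math.tanh(x), 3))

# The identity relating them: tanh(x) = 2 * sigmoid(2x) - 1.
x = 1.3
print(abs(math.tanh(x) - (2 * sigmoid(2 * x) - 1)) < 1e-12)  # True
```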
I was dozing and woke up in the middle of this video, both me and my wife, and it was hard to wake her up!
One of those videos you get hooked on when baked
I, too, just woke up to this video - after a nap while on holiday. Lol