
Convolution Intuition

  • Published: 14 Oct 2019
  • In this video, I provide some intuition behind the concept of convolution, and show how the convolution of two functions is really the continuous analog of polynomial multiplication. Enjoy!
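
The description's claim (convolution as the continuous analog of polynomial multiplication) is easy to check numerically in the discrete case. A minimal sketch of my own, assuming NumPy is available: the coefficients of a product of two polynomials are exactly the discrete convolution of their coefficient lists.

```python
import numpy as np

# Coefficients listed from the constant term up:
# p(x) = 1 + 2x + 3x^2,  q(x) = 4 + 5x
p = [1, 2, 3]
q = [4, 5]

# Coefficient of x^n in p(x)q(x) is sum_k p[k] * q[n-k]:
# exactly the discrete convolution of the coefficient lists.
prod = [sum(p[k] * q[n - k]
            for k in range(len(p))
            if 0 <= n - k < len(q))
        for n in range(len(p) + len(q) - 1)]

print(prod)               # [4, 13, 22, 15]  ->  4 + 13x + 22x^2 + 15x^3
print(np.convolve(p, q))  # same numbers from NumPy's discrete convolution
```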

Comments • 148

  • @Keithfert490
    @Keithfert490 4 года назад +125

    Idk if this helped with my intuition, but it did kinda blow my mind.

  • @influential7693
    @influential7693 2 года назад +15

    "The result is not important, what's important is the process." - Dr. Peyam. Sir, you are extremely motivational to me.

  • @quantphobia2944
    @quantphobia2944 4 года назад +16

    OMG, this is the simplest explanation of convolution I've ever come across, thank you so much!!!

  • @dougr.2398
    @dougr.2398 4 года назад +29

    General comment: Convolution can be thought of as a measure of self-similarity. The more self-similarity between and within the two functions, the larger the convolution integral's value. (There is the group theory connection.)

    • @drpeyam
      @drpeyam  4 года назад +5

      Interesting!

    • @dougr.2398
      @dougr.2398 4 года назад +1

      Dr Peyam yes! That is why & how it applies to biology and music theory!!

    • @blackpenredpen
      @blackpenredpen 4 года назад +9

      Wow! I didn't know about that! Very very cool!

    • @chobes1827
      @chobes1827 4 года назад +2

      This intuition makes a lot of sense because for each x the convolution is essentially just an inner product between one function and another function that has been reflected and shifted.

    • @dougr.2398
      @dougr.2398 4 года назад +1

      Chobes 182, so if the function and its shifted values are strongly correlated (or even equal), the convolution integral approaches the integral of the square of the function. The more dissimilar the shifted and unshifted values are, the more the integral can be greater or less than the integral of the square.
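
The "reflect, shift, and take an inner product" picture in this thread, and the self-similarity reading, can be checked numerically. A small sketch of my own, assuming NumPy; the arrays are arbitrary examples:

```python
import numpy as np

f = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, 1.0, 0.25])

# Convolution: at each shift x, the inner product of f with a
# *reflected* and shifted copy of g, i.e. sum_k f[k] * g[x - k].
print(np.convolve(f, g))

# Cross-correlation skips the reflection; it equals convolution
# against the reversed kernel.
print(np.correlate(f, g, mode="full"))
print(np.convolve(f, g[::-1]))          # same values as the line above

# Self-similarity: correlating f with itself peaks at zero lag,
# where the value is sum(f**2) -- the "integral of the square".
auto = np.correlate(f, f, mode="full")
print(auto.max(), np.sum(f**2))         # both 30.0
```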

  • @blackpenredpen
    @blackpenredpen 4 года назад +25

    So who is convolution? I still don’t get it.

    • @drpeyam
      @drpeyam  4 года назад +11

      Oh, it’s just the integral of f(y) g(x-y) dy, a neat way of multiplying functions
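
For reference, the formula quoted in this reply and its discrete counterpart from the polynomial picture, written out as standard definitions (my transcription, not a quote from the video):

```latex
\[
  (f * g)(x) \;=\; \int_{-\infty}^{\infty} f(y)\, g(x-y)\, dy,
  \qquad
  (a * b)_n \;=\; \sum_{k} a_k\, b_{n-k}.
\]
```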

    • @blackpenredpen
      @blackpenredpen 4 года назад +13

      Dr Peyam
      Lol, I know. Notice I asked "who": I remember my students asked me who convolution is before, because it's taught after Laplace.

    • @dougr.2398
      @dougr.2398 4 года назад +2

      Self-similarity..... see my other postings in the comments, please!!

    • @blackpenredpen
      @blackpenredpen 4 года назад

      I just did! Thank you! That is so cool!

  • @MrCigarro50
    @MrCigarro50 4 года назад +14

    Thanks for this video. For us statisticians, this is a very important result, for it is related to finding the distribution of the sum of two random variables. So, in general, I wish to express our appreciation for your efforts.
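
The statistics connection mentioned here can be illustrated with a discrete example: for independent X and Y, P(X+Y=s) = sum_k P(X=k) P(Y=s-k), which is a discrete convolution of the two probability mass functions. A sketch of my own (not from the video), assuming NumPy, with two fair dice:

```python
import numpy as np

die = np.full(6, 1 / 6)            # PMF of a fair die on the values 1..6

# P(X + Y = s) = sum_k P(X = k) * P(Y = s - k): a discrete convolution.
pmf_sum = np.convolve(die, die)    # supported on the totals 2..12

for total, p in enumerate(pmf_sum, start=2):
    print(total, round(p, 4))      # peaks at 7 with probability 6/36
```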

  • @bballfancalmd2583
    @bballfancalmd2583 4 года назад +4

    Dear Dr. Peyam, THANK YOU!! In engineering we're taught how to use convolution, but never learn where the hell it comes from. Your explanations are like a brain massage 💆‍♂️. Thank you, thank you! You know an explanation is good when it not only answers a question I hadn't even thought of, but also opens my mind to other ways of thinking about math. So much fun! Danke!!

  • @area51xi
    @area51xi Год назад

    This arguably might be the most important video on youtube. I wanted to cry at the end from an epiphany.

    • @drpeyam
      @drpeyam  Год назад

      Thanks so much 🥹🥹

  • @gastonsolaril.237
    @gastonsolaril.237 4 года назад +1

    Damn, this is amazing, brother. Though I'll need to watch this video 2 or 3 more times to connect the dots.
    Keep up the good work! Really, yours is one of the most interesting and useful YouTube channels I've been subscribed to.

  • @stevenschilizzi4104
    @stevenschilizzi4104 2 года назад +2

    Brilliant explanation! Brilliant - makes it look so natural and so simple. Thanks heaps. I had been really curious about where it came from.

  • @arteks2001
    @arteks2001 2 года назад +1

    I loved this interpretation. Thank you, Dr. Peyam.

  • @ibrahinmenriquez3108
    @ibrahinmenriquez3108 4 года назад

    I can surely say that I am continuously happy to see you explaining these ideas. Thanks.

  • @erayser
    @erayser 4 года назад +1

    Thanks for the explanation! The convolution is quite intuitive to me now

  • @user-ed1tg9rj1e
    @user-ed1tg9rj1e 4 года назад

    Great video!!! It really helps build intuition for convolution!

  • @yhamainjohn4157
    @yhamainjohn4157 4 года назад +2

    One word comes to mind: Great! Bravo!

  • @mattetor6726
    @mattetor6726 3 года назад +2

    Thank you! The students you teach are very lucky :) And we are lucky to be able to watch your videos

  • @dvixdvi7507
    @dvixdvi7507 3 года назад

    Awesome stuff - thank you for the clear explanation

  • @sciencewithali4916
    @sciencewithali4916 4 года назад

    Thank you so much for the baby-step explanation! It became completely intuitive thanks to the way you've presented it! We want more of this awesome content.

  • @apoorvvyas52
    @apoorvvyas52 4 года назад +1

    Great intuition. Please do more such videos.

  • @prettymuchanobody6562
    @prettymuchanobody6562 Год назад

    I love your attitude, sir! I'm motivated just hearing you speak, let alone how well you explain the subject.

  • @lauralhardy5450
    @lauralhardy5450 Месяц назад

    Thanks Doc, easy to follow. This is a good generalisation of convolution.

  • @bipuldas2060
    @bipuldas2060 3 года назад

    Thank you. Finally understood the intuition behind this pop operation called convolution.

  • @klam77
    @klam77 4 года назад

    very enjoyable! good stuff!

  • @mnada72
    @mnada72 3 года назад

    That clarified convolution once and for all 💯💯

  • @Debg91
    @Debg91 4 года назад

    Very neat explanation, thanks! 🤗

  • @camilosuarez9724
    @camilosuarez9724 4 года назад

    Just beautiful! thanks a lot!

  • @LuisBorja1981
    @LuisBorja1981 4 года назад

    Dirty puns aside, really nice analogy. Never thought of it that way. As always, brilliant work.

  • @sheshankjoshi
    @sheshankjoshi 9 месяцев назад

    This is wonderful. It does really make sense.

  • @corydiehl764
    @corydiehl764 4 года назад +1

    Okay, I really see what you did there, but I feel like what makes this really suggestive to me is looking at each power of x as a basis function. Wow, this is so much more abstract and interesting compared to the way I usually look at it, as a moving inner product.

  • @monsieur_piyushsingh
    @monsieur_piyushsingh Год назад +1

    You are so good!!!

  • @gf4913
    @gf4913 4 года назад

    This was very useful, thank you so much

  • @polacy_w_strefie_komfortu
    @polacy_w_strefie_komfortu 4 года назад +25

    Very interesting. I wonder if we can draw other intuitions from polynomial functions and transfer them to general analytic functions. In any case, an analytic function can be approximated locally by a Taylor series, but here the analogy seems to work not only locally but over the whole range.
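
One way to test the comment's point that the analogy works globally and not just locally: sample two integrable functions on a grid and approximate the continuous convolution by a discrete convolution scaled by the grid spacing. A rough numerical sketch of my own; the Gaussian and box functions are arbitrary choices:

```python
import numpy as np

dx = 0.01
x = np.arange(-5, 5, dx)

f = np.exp(-x**2)                       # a Gaussian, integral sqrt(pi)
g = np.where(np.abs(x) <= 1, 0.5, 0.0)  # a box of unit area

# Riemann-sum approximation of (f*g)(x) = integral of f(y) g(x-y) dy:
# the discrete convolution of the samples, scaled by dx.
approx = np.convolve(f, g) * dx

# Sanity check: the integral of f*g equals (integral of f)(integral of g).
print(approx.sum() * dx)   # ~ sqrt(pi) * 1 ~ 1.77, up to discretization error
```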

  • @ronaktiwari7041
    @ronaktiwari7041 3 года назад

    Subscribed! It was wonderful!

  • @vineetkotian5163
    @vineetkotian5163 3 года назад

    I wasn't really understanding convolution...just had a broad idea of it.... this video made my mind click😎🔥.. insane stuff

  • @maestro_100
    @maestro_100 2 года назад

    Wow!, Thank You Very Much Sir....This Is A Very Nice Point Of View!!!

  • @kamirtharaj6801
    @kamirtharaj6801 4 года назад

    Thanks man......finally understood why we need convolution theorem

  • @chuefroxz9408
    @chuefroxz9408 4 года назад

    very helpful sir! thank you very much!

  • @lambdamax
    @lambdamax 4 года назад +1

    Hey Dr. Peyam, I had this issue in undergrad too! Thank you for the video. Out of curiosity: for convolutional neural networks, when they talk about the "window" in convolving images, would the "window" be analogous to getting the coefficient of a particular degree in this example?
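
Not an authoritative answer to the question above, but a sketch of the analogy in 1-D: the window at output position n collects exactly the products whose indices sum to n, the same bookkeeping as collecting the coefficient of x^n. (In practice, many deep-learning frameworks implement the un-flipped version, cross-correlation, and still call the layer a convolution.) Assuming NumPy; the signal and kernel below are made up:

```python
import numpy as np

signal = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
kernel = np.array([0.25, 0.5, 0.25])

# "Sliding window" view: output[n] gathers signal[k] * kernel[j]
# over all index pairs with k + j = n -- the coefficient-of-x^n rule.
out = np.zeros(len(signal) + len(kernel) - 1)
for n in range(len(out)):
    for k in range(len(signal)):
        j = n - k                       # kernel index, so that k + j = n
        if 0 <= j < len(kernel):
            out[n] += signal[k] * kernel[j]

print(out)
print(np.convolve(signal, kernel))      # matches
```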

  • @ShubhayanKabir
    @ShubhayanKabir 4 года назад +1

    You had me at "thanks for watching" 😍🤗

  • @DHAVALPATEL-bp6hv
    @DHAVALPATEL-bp6hv 4 года назад

    Convolution is, for most mortals, a mathematical nightmare and absolutely non-intuitive. But this explanation makes it more obvious. So thumbs up!!!

  • @skkudj
    @skkudj 4 года назад

    Thanks for good video - from Korea

  • @corydiehl764
    @corydiehl764 4 года назад

    Now I'm really curious whether this interpretation could be used to give a more intuitive picture of Volterra series analysis, which is my favorite analysis technique that I learned in electrical engineering.

  • @alexdelarge1508
    @alexdelarge1508 Год назад

    Sir, with your explanation, what was an esoteric formula now has some real form. Thank you very much!

  • @visualgebra
    @visualgebra 4 года назад

    Most interesting, dear Professor.

  • @danialmoghaddam8698
    @danialmoghaddam8698 2 года назад

    Thank you so much, the best one I've found.

  • @DHAVALPATEL-bp6hv
    @DHAVALPATEL-bp6hv 4 года назад

    Awesome !!!!

  • @jaikumar848
    @jaikumar848 4 года назад +13

    Thanks a lot, Dr. Peyam! Convolution is a really confusing topic for me.
    I would like to ask:
    is convolution useful for mathematicians?
    As far as I know, it is part of digital signal processing.

    • @drpeyam
      @drpeyam  4 года назад +10

      So many applications! To get the distribution of the random variable X+Y, to solve Poisson’s equation, etc.

    • @klam77
      @klam77 4 года назад +2

      @@drpeyam Here's "the" classic video on convolution from the engineering-school perspective: ruclips.net/video/_vyke3vF4Nk/видео.html
      You'll have to forgive the "cool 70s disco" look of the professor; it was indeed the 70s, so he looks the part. (But Prof. Oppenheim is/was the guru on signals and systems theory.) This is immensely useful math. Immensely.

    • @sandorszabo2470
      @sandorszabo2470 4 года назад

      @@klam77 I agree with you. The "real" intuition of convolution comes from Signals and systems, the discrete case.

    • @klam77
      @klam77 4 года назад +2

      @@sandorszabo2470 Hello. But Prof. Peyam's picture is nearly the same: where he talks about convolution in terms of multiplying two polynomials, Prof. Oppenheim talks about "linear time invariant" systems, which produce polynomial-like sums as "outputs" of multiple inputs in the LTI context. Almost the same idea! But yes, historically the original intuition came from the engineering department side.
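
The signals-and-systems reading discussed in this thread (the output of a linear time-invariant system is the input convolved with the impulse response) fits in a few lines. A toy FIR sketch of my own, assuming NumPy:

```python
import numpy as np

h = np.array([1/3, 1/3, 1/3])                  # impulse response: 3-tap moving average
x = np.array([0.0, 1.0, 0.0, 0.0, 2.0, 0.0])   # input: two scaled, shifted impulses

# LTI output: y[n] = sum_k x[k] * h[n - k] = (x * h)[n].
y = np.convolve(x, h)
print(y)   # shifted, scaled copies of h added up: linearity + time-invariance
```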

  • @burakbey21
    @burakbey21 Месяц назад

    Thank you for this video, very helpful

    • @drpeyam
      @drpeyam  Месяц назад

      You are welcome!

  • @user-mz6hc5cv8x
    @user-mz6hc5cv8x 4 года назад

    Thanks for the video, Dr. Peyam! Can you show the Fourier and Laplace transforms of a convolution?

  • @dgrandlapinblanc
    @dgrandlapinblanc 4 года назад

    Thank you very much.

  • @Handelsbilanzdefizit
    @Handelsbilanzdefizit 4 года назад +2

    When I expand a function f(x) into an infinite series, and also the function g(x),
    and then I build the convolution of these two power series in your discrete way, with sigmas and indices:
    is the resulting series the same as what I get by expanding the continuous version (f*g)(x) into a series?

    • @corydiehl764
      @corydiehl764 4 года назад

      That was my realization from the video too. Now that I think about it, I think that's the result of multiplying two Taylor series both centered at a point a.
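
A numerical footnote to this exchange (a sketch, not a full answer to the question above): convolving the Taylor coefficients of two series is their Cauchy product, and it reproduces the Taylor coefficients of the pointwise product of the functions, e.g. e^x · e^x = e^(2x). Assuming NumPy:

```python
import math
import numpy as np

N = 8
a = np.array([1 / math.factorial(n) for n in range(N)])   # Taylor coeffs of e^x

# Cauchy product: c_n = sum_k a_k * a_(n-k), i.e. a discrete convolution.
c = np.convolve(a, a)[:N]

target = np.array([2**n / math.factorial(n) for n in range(N)])  # coeffs of e^(2x)
print(np.allclose(c, target))   # True
```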

  • @ranam
    @ranam 2 года назад

    Brother, I know mechanical engineers can find resonance, but when I thought deeply about it: resonance is a slow accumulation of energy, built up in small installments, when the frequencies match. If you strike a 50 Hz tuning fork you get the same frequency of vibration on another tuning fork, so they both vibrate; if you strike it harder the amplitude changes, so loudness is a human factor while the frequency stays the same. The languages humans speak throughout the world only resonate your eardrum for a few seconds.
    My question is about the harmonics (the fundamental frequency) and the overtones that follow it. Take a word in any language: according to convolution, the signal is scaled, queued and stacked, so convolution can be used to model resonance. When your eardrum vibrates, electrical signals are carried to the brain, like a tuning fork; eardrums vibrate within the audible spectrum of 20 Hz to 20,000 Hz. So resonance is caused by the words we speak, and within the audible range the eardrums vibrate and we make sense of words. I have seen in one video on YouTube that the harmonics in any sound cause resonance, which could be modelled by convolution. Recall that resonance is destructive: the slow, steady accumulation of sound energy in a mass builds up high stress and high internal energy, and the system fractures or collapses. But our eardrums, hearing the sounds of human languages, keep vibrating, so why do they not fracture or rupture like a wine glass under continuous exposure? I am not talking about loud sound above 80 dB, but sound within the audible range of 20 Hz to 20,000 Hz under continuous exposure; why is that not damaging? Again, not failure by high energy, but by a small one in synchronisation with the air.
    I tried it on my students: when I told them to be quiet in class they did not listen, so I took my phone and played a 14,000 Hz tone, and they said it was irritating. The idea of resonance is "small effort but large destruction", just like the Tacoma bridge, where the wind slowly accumulated energy in the bridge and collapsed it. So my conclusion is a question: can an audible frequency, under continuous exposure to a human ear, cause bleeding ("small effort but great destruction")? Sorry for the long story; if you managed to reach this point you must be as curious as me. One more part, assuming the person is in a coma or brain-dead and cannot move away when the sound irritates them: is the eardrum shaken by the harmonics in the words we speak or by the overtones? I know the harmonic is the fundamental frequency and the overtones follow it, and under slow, steady accumulation of sound energy it resonates and could damage the eardrums, again "small effort but big destruction". So my question is simple: does the human ear normally respond to the harmonics or to the overtones, which, according to convolution, could be a disaster with minimal effort? 🙏🙏🙏🙏 I could be wrong here, because harmonics can also be used to construct sound; so which are the troublemakers, and which one gives the response curve when the two signals are convolved as the eardrum oscillates?

  • @cactuslover2548
    @cactuslover2548 3 года назад

    My mind went boom after this

  • @omerrasimknacstudent5049
    @omerrasimknacstudent5049 2 года назад

    I understand that convolution is analogous to the multiplication of two polynomials. The intuition here is to express any signal f in terms of its impulses, just like the coefficients of a polynomial. It makes sense, thanks. But I still do not understand why we convolve a signal when it is filtered. Couldn't we just multiply the signal by the filter point-wise?

  • @bat_man1138
    @bat_man1138 3 года назад

    Nice bro

  • @Aaron-zi1hw
    @Aaron-zi1hw 11 месяцев назад

    love you sir

  • @prasadjayanti
    @prasadjayanti 2 года назад

    It made sense to me in some way... I still want to know the advantage of 'reflecting' and 'shifting' a function and then multiplying it with another function. If we do not 'reflect', then what? Shifting I can understand; we have to keep moving the window everywhere.

  • @maxsch.6555
    @maxsch.6555 4 года назад +1

    Thanks :)

  • @Handelsbilanzdefizit
    @Handelsbilanzdefizit 4 года назад +2

    2:35 You should handle less coefficients and more coffeeicents ^^

  • @Mau365PP
    @Mau365PP 4 года назад +1

    7:13 What do you mean by *f* and *g* being *"continuous polynomials"*?

    • @drpeyam
      @drpeyam  4 года назад +1

      Think of a polynomial as an expression of the form sum a_n y^n and what I mean is an expression of the form sum a_x y^x where x ranges over the reals
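
A purely formal way to write down the reply above (heuristic only; as noted elsewhere in the comments, "continuous polynomials" are not a rigorous object): multiplying two such expressions, substituting x = u + v in the inner integral, and collecting the "coefficient" of y^x recovers the convolution.

```latex
\[
  \left(\int f(u)\,y^{u}\,du\right)\left(\int g(v)\,y^{v}\,dv\right)
  = \iint f(u)\,g(v)\,y^{u+v}\,du\,dv
  = \int\!\left(\int f(u)\,g(x-u)\,du\right) y^{x}\,dx
  = \int (f*g)(x)\,y^{x}\,dx .
\]
```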

  • @linushs
    @linushs 3 года назад

    Thank you

  • @secretstormborn
    @secretstormborn 4 года назад

    amazing

  • @jonasdaverio9369
    @jonasdaverio9369 4 года назад +2

    Is it called the convolution because it is convoluted?

    • @drpeyam
      @drpeyam  4 года назад +1

      Hahaha, probably! But I’m thinking more of “interlacing” values of f and g

    • @dougr.2398
      @dougr.2398 4 года назад

      Convolution is a term that really might better be described as “self-similarity”. It even has application to music theory! (THERE is the Group Theory connection!!! And even Biology!!!)

  • @dougr.2398
    @dougr.2398 4 года назад

    My profs at The Cooper Union, 1967-1971, liked to say that the variable integrated over is "integrated out", which I hold is not accurate, as it has only vanished in appearance: the functions evaluated at each point of the "integrated out" variable contribute to the sum, as do the end points. Only the explicit variable vanishes; it "goes away". By the way, Dr. Tabrizian, what is the "f hat" you refer to in the Fourier transform description of the convolution? Please explain.

    • @drpeyam
      @drpeyam  4 года назад +1

      Fourier transform

    • @dougr.2398
      @dougr.2398 4 года назад

      Dr Peyam thanks!

  • @patryk_49
    @patryk_49 4 года назад

    Wikipedia says the symbol in your thumbnail means something called "cross-correlation", which is similar to convolution. I hope somewhere in the future you will make a video about that.

  • @blurb8397
    @blurb8397 4 года назад

    Hey Dr Peyam, can we perhaps see a more rigorous definition of what you mean by “continuous polynomials”, how functions can be described in terms of them, and how that leads to the convolution?
    I would also love to see how this connects to the view of convolution in terms of linear functionals, as Physics Videos By Eugene made an extensive video on that which at least I didn’t really understand...
    Anyhow, thanks a lot for this!

    • @drpeyam
      @drpeyam  4 года назад

      There is no rigorous definition of continuous polynomials, they don’t exist

    • @blurb8397
      @blurb8397 4 года назад

      @@drpeyam Couldn't we define them as an integral average?
      Like the definite integral from zero to n of a(t) * x^t dt, all of that divided by n to "cancel out" the "dt" part, if we look at it from a dimensional-analysis perspective, as is done in physics.

  • @krzysztoflesniak2674
    @krzysztoflesniak2674 2 года назад +3

    Remark 1: This one is pretty nice: ruclips.net/video/QmcoPYUfbJ8/видео.html
    ["What is convolution? This is the easiest way to understand" by Discretised]
    It is in terms of integration of processes with fading intensity, but it is amenable for economic interpretation as well.
    Remark 2: This multiplication by gathering indices that sum up to a constant is crucial for the Cauchy product of two infinite series instead of polynomials (Mertens theorem).
    Remark 3: This convolution is with respect to time. In image manipulation the convolution is with respect to space (a kind of weighted averaging over pixels). That "spatial convolution" in the continuous case leads to an integral transform. One of the functions under convolution is then called a kernel. Just loose thoughts.

  • @BootesVoidPointer
    @BootesVoidPointer 2 года назад

    What is the intuition behind the differential dy appearing as we transition to the continuous case?

    • @krzysztoflesniak2674
      @krzysztoflesniak2674 2 года назад

      It tells you to integrate with respect to y while keeping x fixed (the result is a function of x). Integration with respect to y is the continuous analog of summation over the index (also called y at the end of the presentation, to highlight the jump from the discrete to the continuous case).

  • @adambostanov4822
    @adambostanov4822 Год назад

    So what is the result of the convolution of those two polynomials?

  • @leonardromano1491
    @leonardromano1491 4 года назад

    That's cool and gives a quite natural vector product for vectors in R^n:
    (u*v)_i = Sum(0 ≤ k ≤ i) of u_k * v_(i-k)

  • @ventriloquistmagician4735
    @ventriloquistmagician4735 3 года назад

    brilliant

  • @coolfreaks68
    @coolfreaks68 2 месяца назад

    Convolution is the integral of *f(τ)·g(t-τ) dτ*.
    *(τ, τ+dτ)* is an infinitesimally small time interval over which we assume the values of *f(τ)* and *g(t-τ)* remain constant.
    *f(τ)* is the evolution of f(t) until the time instant *τ* .
    *g(t-τ)* is the version of g(t) which came into existence at the time instant *τ* .

  • @Brono25
    @Brono25 3 года назад

    I could never find an explanation of why (graphically) you have to reflect one function, multiply both, and integrate. I see, it's to keep the indices always summing to the same value?

  • @aneeshsrinivas9088
    @aneeshsrinivas9088 2 года назад

    Do alternate notations for convolution exist? I hate that notation for convolution, since I love using * to mean multiplication and do so quite frequently.

  • @amirabbas_mehrdad
    @amirabbas_mehrdad 3 года назад

    It was amazing, but at the moment you replaced the coefficients with the function itself I didn't actually understand how you did it. Can anyone make it clear for me? Thanks.

  • @poutineausyropderable7108
    @poutineausyropderable7108 4 года назад

    Does this mean that if you convolve a function with 1 you get a Taylor series?

    • @poutineausyropderable7108
      @poutineausyropderable7108 4 года назад

      That means you could get the Taylor series of sin^2(x), which would be useful for solving differential equations by solving for a Taylor series. You could also continue the values of sin x out at the infinities.

    • @poutineausyropderable7108
      @poutineausyropderable7108 4 года назад

      Oh, so I finally understood. f and g aren't time functions; they are the formulas for the elements of the Taylor series. sin x isn't f; f is i^(k-1) * (1/k!) * (k mod 2).

  • @matthewpilling9494
    @matthewpilling9494 4 года назад

    I like how you say "Fourier"

  • @allyourcode
    @allyourcode 3 года назад

    I feel that this definitely helped me. Not really sure why you began discussing the continuous convolution though. The whole polynomial discussion is perfectly applicable in the context of discrete convolution. Anyway, for whatever reason, motivating with polynomial multiplication somehow did it for me. Thanks!
    I'm also finding it helpful in higher dimensions to think in terms of multiplying polynomials (the number of variables = the number of dimensions): To find the coefficient for x_1^n_1 * x_2^n_2, you multiply coefficients of the input polynomials where the exponents add up to n_1 and n_2.
    This kind of explains why you need to flip the "kernel" (in all dimensions) when you think of convolution as a "sliding dot product": when you flip the kernel, the coefficients that you need to multiply "pair up" (such that the exponents add up to n_i).
    Also, I really like your sanity check: the two arguments MUST sum to x! Sounds gimmicky, but I'm pretty sure that will help me to remember.

  • @mustafaadel8194
    @mustafaadel8194 3 года назад

    Actually, you showed us the similarity between the two formulas; however, I didn't understand convolution from that similarity 😥

  • @Muteen.N
    @Muteen.N 2 года назад

    Wow

  • @SIVAPERUMAL-bl6qv
    @SIVAPERUMAL-bl6qv 4 года назад

    Why is convolution used?

  • @krishnamishra8598
    @krishnamishra8598 3 года назад

    Convolution in one Word ???? Please answer!!!

  • @wuxi8773
    @wuxi8773 4 года назад

    This is math, simple and everything has to make sense.

  • @austinfritzke9305
    @austinfritzke9305 4 года назад

    Was watching this at 1.5x and laughed out loud

  • @yashovardhandubey5252
    @yashovardhandubey5252 4 года назад +3

    It's hard to believe that you can take time out of your schedule to answer YouTube comments.

    • @drpeyam
      @drpeyam  4 года назад +3

      Thank you! :)

  • @vineetkotian5163
    @vineetkotian5163 3 года назад

    Sir, I can't seem to practice this subject the right way... I'm worried the question might get twisted in the exam and my brain will freeze.

  • @f3ynman44
    @f3ynman44 3 года назад

    a_k * b_(x-k) looked like a Cauchy product. Is this a coincidence?

  • @fedefex1
    @fedefex1 4 года назад

    How can I write a continuous polynomial?

    • @drpeyam
      @drpeyam  4 года назад

      With convolution :)

    • @dougr.2398
      @dougr.2398 4 года назад

      Dr Peyam what a convoluted reply!!! :)

    • @patryk_49
      @patryk_49 4 года назад

      I think it would be analogous to a normal polynomial:
      P(x) = integral(a(t) * x^t) dt

    • @dougr.2398
      @dougr.2398 4 года назад

      Patryk49 what is a normal polynomial? Is there a correspondence to a normal subgroup?

    • @dougr.2398
      @dougr.2398 4 года назад

      Here’s one answer: mathworld.wolfram.com/NormalPolynomial.html

  • @mrflibble5717
    @mrflibble5717 2 года назад

    I like your videos, but the whiteboard writing is not clear. It would be worthwhile to fix that, because the content and presentation are good!

  • @luisgarabito8805
    @luisgarabito8805 Год назад

    Huh? 🤔 interesting.

  • @luchisevera1808
    @luchisevera1808 4 года назад

    My professor 7 years ago showed this by sliding a triangle into a rectangle until everything became convoluted

  • @dougr.2398
    @dougr.2398 4 года назад +1

    You have a good French accent!

  • @zhanggu2008
    @zhanggu2008 4 года назад

    This is good, but it feels like a start, and the goal of a convolution is not explained: why do this, and why use polynomial coefficients?

  • @forgetfulfunctor2986
    @forgetfulfunctor2986 4 года назад +1

    convolution is just multiplication in the group algebra!

    • @LemoUtan
      @LemoUtan 4 года назад

      Just what I was thinking! I only recently started reading up on group modules and thus getting my jaw slowly pulled down whilst watching this

    • @dougr.2398
      @dougr.2398 4 года назад

      forgetful functor please explain or at least partially illuminate the Group Theory connection?

    • @LemoUtan
      @LemoUtan 4 года назад

      @@dougr.2398 If I may, this may help (straight to the examples in the wikipedia article about group rings): en.wikipedia.org/wiki/Group_ring#Examples

  • @gosuf7d762
    @gosuf7d762 4 года назад

    If you replace x with e^(iθ), you see the convolution theorem.
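
Spelling this comment out numerically: evaluating the "polynomial" at x = e^(iθ) is essentially the convolution theorem, and in the discrete case the DFT of a zero-padded linear convolution equals the product of the DFTs. A sketch of my own assuming NumPy; the arrays are arbitrary:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0, 7.0])

full = np.convolve(a, b)                 # linear convolution, length 6
n = len(full)

# Convolution theorem: FFT(a * b) = FFT(a) . FFT(b) after zero-padding.
lhs = np.fft.fft(full)
rhs = np.fft.fft(a, n) * np.fft.fft(b, n)
print(np.allclose(lhs, rhs))             # True
```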

  • @dougr.2398
    @dougr.2398 4 года назад

    You were right to both hesitate and then ignore the possibility that you had misspelled “coefficients”. English is difficult because it is FULL of irregularities.... this is one instance of a violation of the rhyme “I before E (edited 12-12-2023) except after C or when sounding like “Eh” (“long” A) as in Neighbor and Weigh”. Had you bothered to worry about that during the lecture, it would have impeded progress and the continuity (smile) of the discussion.

  • @elmoreglidingclub3030
    @elmoreglidingclub3030 Год назад +1

    I do not take drugs. Never have. But now I feel like I’m on drugs. What’s the point of all this??