Tutorial: Convolution sum

  • Published: 11 Jan 2025

Comments • 44

  • @RoseHulmanOnline  11 years ago +6

    The method shown here is based on what physically happens in the system. Each input sample triggers its own scaled and shifted (delayed) impulse response, and these are all added together to form the output. Look particularly at 7:42, where you can see the effect of the delay in the argument of h: it appears as h[n-1]. The other method you are thinking of ("flip and slide") is based on the convolution sum equation; there the delay shows up as a negated time index (k), and that is why h must be flipped.
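The superposition view described in this comment can be sketched in a few lines of Python. This is a minimal illustration, not code from the video; the example signal values are made up, and signals are plain lists indexed from 0:

```python
# Convolution as superposition: each input sample x[k] launches its own
# copy of the impulse response h, scaled by x[k] and delayed by k samples;
# the output is the sum of all those copies.

def conv_superposition(x, h):
    y = [0] * (len(x) + len(h) - 1)     # output length is len(x) + len(h) - 1
    for k, xk in enumerate(x):          # each input sample...
        for m, hm in enumerate(h):      # ...adds xk * h, shifted right by k
            y[k + m] += xk * hm
    return y

print(conv_superposition([1, 2], [1, 2, -1]))  # → [1, 4, 3, -2]
```

No flipping appears anywhere: the delay k is applied directly to h, exactly as the video does at 7:42.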

  • @Tobiasz931  11 years ago +9

    Thank you so much for these videos! I passed my exam only thanks to them (it's not really my field, as I study IT)! They also helped a lot of my friends.

  • @bryandavis2571  9 years ago +4

    Great video, this is way easier than the way I learned it.

  • @karlchua9188  9 years ago +1

    Taking my DSP lectures this year, and this helped me a lot! Too bad my professor cannot really explain this well. Thank you very much, Rose-Hulman. Cheers from the Philippines!

    • @eggxecution  1 year ago

      Currently studying for boards and understanding this for the first time; I've been struggling 😂

  • @EssayWriting-h2c  10 months ago

    Thanks, I have been struggling to understand this concept, but you made it easy!

  • @sdavid78  11 years ago +1

    In the example at 6:00, h[n] = {1, 2, -1} with the second term underlined, indicating h[-1] = 1, h[0] = 2, h[1] = -1. Is it possible for the impulse response of an LTI system to be defined for negative n ("h[-1] = 1")? Doesn't that mean the impulse response has a value before the impulse occurs?

    • @suyashmisra7406  2 years ago

      Yes; such systems are called non-causal systems. In practice, you can implement one if you add enough delay.
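One way to handle a non-causal h like the one asked about is to carry each signal's starting index alongside its values. A small sketch (illustrative only; the input values are made up): the output's starting index is just the sum of the two starting indices, so an h that starts at n = -1 makes the output start one sample before the input does.

```python
# Convolution for signals with an explicit starting index. A non-causal h,
# e.g. h[-1]=1, h[0]=2, h[1]=-1, is simply h = [1, 2, -1] with start = -1.

def conv_indexed(x, x_start, h, h_start):
    y = [0] * (len(x) + len(h) - 1)
    for k, xk in enumerate(x):
        for m, hm in enumerate(h):
            y[k + m] += xk * hm
    return y, x_start + h_start        # output starts at the sum of the starts

y, y_start = conv_indexed([1, 1], 0, [1, 2, -1], -1)
print(y, y_start)  # → [1, 3, 1, -1] -1  (y is nonzero on n = -1..2)
```

Here the output has a sample at n = -1 even though the input starts at n = 0, which is exactly the non-causal behavior the question describes.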

  • @isaroque1773  9 years ago +1

    OMG, thank you so much for this video. In 5 minutes I understood the convolution summation in a much simpler way. :D

  • @christerranaldo906  3 years ago

    Thanks! Now that I have seen an example, I understand it much better.

  • @seaburyneucollins688  3 years ago

    Wow, so this is what my textbook was trying to explain to me? I regret spending so much time trying to decipher that load of gibberish, when I could have just watched this video instead!

  • @aokay720  3 years ago

    Thank you so much for taking the time to help me with this!

  • @RoseHulmanOnline  11 years ago +2

    Reply to Trần Hồng Phúc: The two formulas are equivalent: your equation (Xmax + Hmax) - (Xmin + Hmin) + 1 = (Xmax - Xmin) + (Hmax - Hmin) + 1 = (Xmax - Xmin + 1) + (Hmax - Hmin + 1) - 1 = Xlength + Hlength - 1 = the equation in the video.
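The equivalence of the two length formulas can be checked numerically. A minimal sketch with made-up index ranges (any Xmin/Xmax/Hmin/Hmax would do):

```python
# Check that the index-based formula and the length-based formula agree:
# (Xmax + Hmax) - (Xmin + Hmin) + 1  ==  Xlength + Hlength - 1
Xmin, Xmax = 0, 3      # x defined on n = 0..3  -> Xlength = 4 (example values)
Hmin, Hmax = -1, 1     # h defined on n = -1..1 -> Hlength = 3

Xlength = Xmax - Xmin + 1
Hlength = Hmax - Hmin + 1
assert (Xmax + Hmax) - (Xmin + Hmin) + 1 == Xlength + Hlength - 1  # both give 6
```

Both sides differ only by regrouping terms, so they agree for every choice of index ranges.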

  • @nikoofayyaz8811  4 months ago

    Thanks for the helpful video. Just a question: what is the center if the number of samples in h[n] is even?

  • @YewJiaMing  11 years ago

    It was really easy to understand; besides, the method introduced in the example is really convenient.

  • @SwathiMenta  12 years ago

    Amazing video! Was of tremendous help! Thank you :)

  • @tranhongphucdt  11 years ago

    Generally, the length of y = (max index of x + max index of h) - (min index of x + min index of h) + 1

  • @morendav  11 years ago

    Is there a reason that you did not need to flip the LTI system h due to the negative sign in front of the k in h[n-k]?
    Please let me know, as I am confused.
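For comparison with the superposition method in the video, the convolution sum can also be evaluated directly, with the flip visible in the h[n-k] index. A minimal sketch with made-up signal values:

```python
# Direct evaluation of the convolution sum y[n] = sum_k x[k] * h[n-k].
# The negated index (n - k) is exactly the "flip" of the flip-and-slide
# view; the superposition method bakes that flip into the shifts, so you
# never flip anything by hand. Both give identical results.

def conv_sum(x, h):
    y = []
    for n in range(len(x) + len(h) - 1):
        acc = 0
        for k in range(len(x)):
            if 0 <= n - k < len(h):     # keep only terms where h[n-k] exists
                acc += x[k] * h[n - k]
        y.append(acc)
    return y

print(conv_sum([1, 2], [1, 2, -1]))  # → [1, 4, 3, -2]
```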

  • @tanjuthechill4871  3 years ago

    For given y[n] and h[n], what will the input x[n] be?

  • @deathbypenguins  9 years ago

    This is a technique very different from what my professor taught us. A good shortcut, but I don't think my professor would be too impressed with it... Lol.

  • @tranhongphucdt  11 years ago

    Your tutorial is very understandable and useful, but your formula for calculating the length of y is only right in this case; in other cases it will be wrong. Could you take a look again?

  • @sahilgoyal1124  12 years ago

    Nice video... good job, helped me a lot.

  • @andrewdavis6191  9 years ago

    Elegant explanation! Thank you.

  • @CoupedUpGenny  9 years ago

    How would you multiply it without a shift?

  • @usmanhari7800  11 years ago

    This video solved my 1 year old problem

  • @xoraxera  8 years ago

    Thank you so much! This was really helpful!

  • @mcculloughmusprime  12 years ago

    Wow. Discrete convolution is a lot simpler than continuous.

  • @volkerblock  11 years ago

    Right picture: delta, or h[n-5]?

    • @RoseHulmanOnline  11 years ago

      h[n-5]... I caught this problem earlier and have the "CORRECTION" in the video description.

  • @HarryXiVlog  10 years ago

    Helps a lot! Thanks!

  • @xMrJanuaryx  8 years ago

    I don't understand why it's x[k]h[n-k]. Where does the -k come from? Why not +k?

    • @xMrJanuaryx  8 years ago

      OH! Because it's x[+k]; if it were x[-k], then it would be h[n+k]!
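The substitution behind this observation can be made precise with a change of variable in the standard convolution sum (a short derivation, not from the video):

```latex
y[n] \;=\; \sum_{k} x[k]\, h[n-k]
\;\;\xrightarrow{\;m \,=\, n-k\;}\;\;
\sum_{m} x[n-m]\, h[m]
```

So the roles of x and h are interchangeable: either signal may carry the negated index, which is why flip-and-slide works the same whichever signal you flip.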

  • @electrical4th371  8 years ago

    Nice tutorial.

  • @al.qasimi  11 years ago

    Thank you so much

  • @ArcaneKn1ght  12 years ago

    Chuck Norris is drawing these graphics.

  • @volkerblock  11 years ago

    Good video, thank you.

  • @killZtheterrannoob  12 years ago +2

    Thanks, very useful XD

  • @CreativeBangla  5 years ago

    Thank you so much. :)

  • @김뫄뫄-f2u  8 months ago

    Thank you (감사합니다)

  • @deweymoorejr  7 years ago

    Very nice.

  • @harry4676  2 years ago

    Thank you.