Log, log or ln?
log. Only fake fans use base 10 logarithm (and by fake fans I mean most scientists).
log = base 10
log x = base x
ln = base e (maths)
log e = base e (mechanics)
I like Ln for base-e, and Log for base-10. Some math acronyms are not the same as English worldwide, and sometimes they can be confused.
ln for sure
ln
I think you just made me believe in the power of imaginary numbers
I am glad to hear :)))
Imaginary numbers are our real imaginary friends :)
There are a lot of series or integrals that are easily calculable thanks to complexx ;) (sum of cos(kx), integral of cos(ax)exp(bx) ... )
It's more like that power of series expansion. Indeed, complex analysis is paramount.
lol for true..
Hey! I saw the thumbnail of the video and decided to try it myself, and it came out quite differently so I think it's worth sharing.
Consider the function f(x)=x on the interval [-π,π], continued periodically over R. We'll compute its Fourier coefficients:
A(n) = 0 for all n, because f is odd.
B(n) = 1/π * int(-π,π) x*sin(nx) dx = (integration by parts, with u = x and dv = sin(nx) dx) = 1/π * ( [-x*cos(nx)/n]{-π,π} - int(-π,π) -cos(nx)/n dx ) = ... (boring computation, a lot of things cancel) = 2*(-1)ⁿ⁺¹/n.
Therefore, because f is piecewise smooth and continuous on (-π,π), its Fourier series converges pointwise there and we get:
f(x) = x = 2*sum(n=1,∞) (-1)ⁿ⁺¹/n * sin(nx)
For every x in (-π,π). Divide by 2:
x/2 = sum(n=1,∞) (-1)ⁿ⁺¹/n * sin(nx)
For every x in (-π,π). Now, substitute y=x+π and we get:
(y-π)/2 = sum(n=1,∞) (-1)ⁿ⁺¹/n * sin(n(y-π))
For every y in (0,2π). Multiply both sides by -1:
(π-y)/2 = sum(n=1,∞) (-1)ⁿ/n * sin(n(y-π))
For every y in (0,2π).
Lastly, recall the angle addition formula for sin: sin(a+b) = sin(a)cos(b) + sin(b)cos(a). Substituting a=ny and b=-nπ we get:
sin(n(y-π)) = sin(ny)cos(-nπ) + sin(-nπ)cos(ny) = sin(ny)*(-1)ⁿ + 0*cos(ny) = (-1)ⁿ*sin(ny)
Substituting this back in our sum, the two instances of (-1)ⁿ cancel out and we're left with:
(π-y)/2 = sum(n=1,∞) sin(ny)/n
For every y in (0,2π).
1 is in (0,2π) so we can substitute y=1 and finally
(π-1)/2 = sum(n=1,∞) sin(n)/n
QED.
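A quick numerical sanity check of that last line, as a rough Python sketch (the cutoff N is an arbitrary choice, since the convergence is only about O(1/N)):

```python
import math

# Partial sums of sin(n)/n should creep up to (pi - 1)/2 ≈ 1.0708.
N = 10**6
partial = sum(math.sin(n) / n for n in range(1, N + 1))
print(partial)              # ≈ 1.0708...
print((math.pi - 1) / 2)    # ≈ 1.0707963...
```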
WOW! THIS IS AMAZING!!! I DIDN'T THINK OF THAT! THANK YOU!!!
Can I make a video of your solution and I will credit you in the video?
@@blackpenredpen of course!! I'd be honored!
Amazing proof, thank you!!! By the way, I have a question about the considered series: can we prove its convergence without calculating sum (and without going to complex plane)? Integral test doesn't work as function sin(x)/x isn't decreasing; comparison test also doesn't help as well.
@@regulus2033 Hey, you can see that sum sin(n)/n converges by the Dirichlet test: 1/n decreases to zero and |sum_{n=1}^N sin(n)| is bounded independently of N.
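In case anyone wants the boundedness part spelled out, here is a sketch via the standard closed form for the sine partial sums:

$$\left|\sum_{n=1}^{N}\sin n\right| = \left|\frac{\sin(N/2)\,\sin((N+1)/2)}{\sin(1/2)}\right| \le \frac{1}{\sin(1/2)} \approx 2.09,$$

independent of N, and since 1/n decreases to 0 the Dirichlet test applies.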
I absolutely had no idea how to solve this one until you gave the formula for sin x, and then all of it suddenly became so obvious... anyway, that was great
Don't we just love the complex world?! : )
I always wondered about that series and never thought the proof is that simple. Thank you very much.
at 3:45 do you mean z can't be 1? The alternating harmonic series for z=-1 converges to the value given by the formula, while z=1 gives a divergent series
Yes, I messed up. I forgot that I was using -Log(1-z) instead of Log(1+z).
And thanks for pointing out!
Yeah, I noticed that, too. But I knew what he meant, and in any case, it doesn't affect his result.
Fred
ffggddss yea. I think I have used the version ln(1+x) too often lol
@@blackpenredpen OK, well then be more careful to mix them up, so you don't get stuck on one of them to the exclusion of the other ;-)
. . . . NAAAAH!
Fred
ffggddss yea. Will do!!
So cool! Love your excitement as always!!
: )))))
Thank you!
One of my friends works at our state library and for my birthday this year he got me a series of math exercise books from 1917. One of the problems is the following integral: integrate (tan(x)*log(cot(x))) from 0 to pi/4;
This turned out to be way harder than I thought and I'd really like to see you solve it :D
When you move to the complex world, but you know the result must end up real:
'All's well that ends well'
"let me just do the math" - amen!
Glad to hear! : )
8:01 "I am still on the bottom, but this is enjoyable so I don't mind."
There's no way that wasn't deliberate...
You made my day ! Mind blowing improper integral infinite series.
Loïc Magnien thank you!! I am glad to hear!!
First time here!!!
How didn't I find your videos before?
...
Thanks for the Maths!!!!
In my Mathematics Faculty in Brazil we did not learn how to deal with that sum the way you did.
I remember it was much more difficult to understand.
Your explanation was
♾ Times Better!!!
Thanks from a Mathematics Teacher in Brazil!!!
🙏📚🎉🥇🏆💡📖💡
I'd choose ln. We are on the same wavelength as of late!
lol, I actually like "log" the most
blackpenredpen would it confuse people to tell which ones they use?
I like that: starting with a sum of real numbers, going through the complex world, and ending up with a real answer!
Reminds me of Umar Khayyam solving third-degree polynomial equations by going through the complex numbers without knowing or defining complex numbers at all!
The classic discrete unnormalized sinc function. Very useful in signal analysis.
Sinc!!! The Cardinal Sin function. Love the name. It also appears in spectral numerical methods.
@@u.v.s.5583 So THAT'S what the c stands for. Cardinal. I just call it sinc (pronounced as sink).
It's more amazing to me that you can split the original sum into 2 separate sums. In a way, you are changing the order of operation, but it doesn't end up mattering because of the specific way you change it.
Complex definition of sine of theta. Brings back memories of WhiteChalkRedChalk and the red and black stripes tee.
Interesting, but can you find all the functions for which int(-inf, +inf) equals sum(-inf, +inf)?
Was thinking the same thing before he even unveiled the final bonus. Tough question
I have absolutely no reason for guessing this, but maybe such functions have to have recursive and infinite derivatives.
@@brandonklein1 All piecewise constant functions of the form f([x]) where [x] is the floor function will do. So no need for differentiability.
That I am not sure.
Oh, I know f(x)=0 works well : )
@@u.v.s.5583 Great point!
I'm impressed by how you write on the board
Thank you! : )
As usual...---> A W E S O M E ♥️♥️♥️
great video Mr. Prof.!
school dudes don't understand why 'sin n' is not equal to 'sin²'
You went to homeschool, pavel?
@@mahdipourahmad3995 , nope lol
3:16 Let me just do the math.
definitely!
My brain says cancel the n
My mind is telling me no
But my body, my body is telling me YEAH
that's a sin
badumtss
sinn/n
cancels the n
sin
If we may go further, let's also cancel the 'n' in 'sin' with the 'n' in 'n=1' under Σ (sigma), and we obtain 'si'. Due to the impossibility of 1 being equal to an empty set, '=1' will vanish.
Now, using a translating function, we can transform Σ (sigma) into 'S', and cancel with 's' in 'si', it comes to an 'i'.
Now we have '∞ i' only where i = √-1. Commutative, we get it solved as i×∞ . Please advise.
@@alexdemoura9972 Congrats! Here's your hardhat and unpaid internship
I’m going to try to buy your shirt during back to school. I love it. Greetings from Ecuador!
Thanks man! I was looking for exactly that!
Can you find the integral of
log(-x) for me?
[Note: ln x = log(x)]
It is carrying a 1/ln x!!
Amazing. Who would ever guess the sum is pi
Yea, exactly! It seems like if we just randomly guess "pi" as the answer to hard problems, then we might have a good chance!
If you integrate the function in the summation (where sin(0)/0 is set to 1), do you get the same value? If so, that function has the property that its summation is the same as its integral, which leads to the question: what other functions have this property?
11:28 “so good” lmao
It should be noted that this works because e^i and e^-i both have absolute value 1 but are not equal to 1, so the series still converge on the boundary of the disc.
10:47 (647 s) - It's called the sinc function. So you should write
int_{-oo...oo} sinc(x) dx = sum_{n=-oo...oo} sinc(n).
I can follow the maths but I really cannot fathom how the two are the same. Taking the integral to be approximated by the Riemann sum with vanishingly small error, does this result not imply that only rectangles near integer values on the number line contribute to the integral? Alternatively, the contribution of non integer rectangles must cancel precisely but just looking at a graph of sin(x)/x suggests this can't be the case. I must be missing something fundamental here. Any insight is greatly appreciated :)
hahahasan I was thinking something similar. I think that due to the cyclical nature of the function, the integers at which the rectangles over approximate the area of the function have to precisely cancel out with the rectangles that under approximate the area. Seems strange to me that this would be the case too, but the rectangles formed by the height of the function at higher/lower values are mostly negligible, so I suppose this contributes to it as well
I was thinking the same but what helped me understand it better was to think of the integral as the area, and the summation as heights of lines corresponding to x for the real valued function sin(x)/x. Areas and lengths can't really be compared.
Maybe also, think of a simple rectangle. 6 by 4 maybe. The area is 24 yet I can choose lots of line segments (vertical or horizontal) whose sum could be a lot higher, or a lot less.
Is it somehow possible to show that the difference between the integral and sum is 0 without solving for both of them?
Thanks man! This has been haunting my dreams! But why are we allowed to break up the infinite sum into two parts at 4:29?
Since neither diverges
The complex plane has no order, so does it make any sense to define a summation "from a to b" in the complex plane?
You can think of an integral as basically the continuous extension of the sum from discrete math, in other words, in this infinite sum you sum over all integers but in the integral you “sum” over all real numbers. If the integral and the sum evaluate to the same thing, that would imply that only the integer values contribute to the final result, and all the non-integer real values of sin(x)/x “sum” to 0. Right?
great classic sum
Yup!!!!
Btw, I love your username.
Man that was cool. Thank you.
I don't even know what you say in all of your videos, but it still interests me. I should learn math.
Or you could split
sum(exp(in)/n) = - Log(1-exp(i))
into real and imaginary parts. You need to express 1-exp(i) = 1 - cos1 - i*sin1 in polar coordinates:
1-exp(i) = sqrt(2-2cos1)*exp(i*(1-pi)/2),
which gives you:
sum(sin(n)/n) = (pi-1)/2 ≈ 1.07
sum(cos(n)/n) = -1/2*log(2-2cos1) ≈ 0.042.
Very different-looking results!
(How do you find the angle? [0, 1, 1-exp(i)] is an isosceles triangle, where the angle at the vertex at 1 is 1rad).
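A small numerical check of both closed forms, as a Python sketch (standard-library cmath/math only; the truncation point is arbitrary):

```python
import cmath, math

z = cmath.exp(1j)                        # e^i
closed = -cmath.log(1 - z)               # principal Log, as above
partial = sum(z**n / n for n in range(1, 100001))

print(closed.imag, (math.pi - 1) / 2)                     # sum sin(n)/n ≈ 1.0708
print(closed.real, -0.5 * math.log(2 - 2 * math.cos(1)))  # sum cos(n)/n ≈ 0.0420
print(partial)                           # close to `closed`
```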
Cool video as always! 😎
At 7:40 , why do you only consider the first solution to ln(-1) and not (2n +1)*pi*i ?
I know the sum can only converge to one value but what would tip me off to it being solely pi*i?
I'm trying to prove the note you mentioned at 2:32 but I'm stuck. can you help me?
Log(-1) is πi + 2kπi, right? So how do you know for sure k=0 gets you the correct answer?
Watch the video about the complex logarithm linked in the description under the video.
It's like doing the integral of 1/(1+x^2) from 0 to 1: we get arctan(x), then plug in 1 and 0, and we take arctan(1) = pi/4 instead of 5pi/4.
And yea. You can see Peyam’s video on complex log and principal branch for more details.
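For what it's worth, Python's cmath picks exactly that principal value (a tiny illustration, nothing more):

```python
import cmath

print(cmath.log(-1))  # 3.141592653589793j, i.e. pi*i (the principal value)
# The other branches pi*i + 2*pi*i*k exist, but the principal branch takes k = 0,
# just like taking arctan(1) = pi/4 rather than 5*pi/4 in the analogy above.
```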
Notice how he didn't explain how to get the sum of sinc(n) over all the integers. He used symmetry: since sinc(n) is an even function, the sum over the negative integers is also (pi-1)/2. So, if you take the sum over the nonzero integers, you get 2((pi-1)/2), which is pi-1. And to finish it off he added the value of the limit of the sinc function at zero, which is 1. So he added (pi-1)+1, which equals pi. That's how he got pi from the sum.
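Written out in one line (just restating the comment above):

$$\sum_{n=-\infty}^{\infty}\frac{\sin n}{n} = \underbrace{1}_{n=0\text{ (as a limit)}} + 2\sum_{n=1}^{\infty}\frac{\sin n}{n} = 1 + 2\cdot\frac{\pi-1}{2} = \pi.$$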
Did I just watch a mathematician divide by 0 and give an answer? Unacceptable!
Dividing the power series expansion of sin(x) by x leaves 1.
@@lesnyk255 to cancel something in a fraction, you need to be sure that the quantity you divide by CANNOT be 0
@@seroujghazarian6343 sin(x) = x - x^3/3! + x^5/5! - x^7/7! + ...
so for non-zero x, sin(x)/x = 1 - x^2/3! + x^4/5! - ...
As x approaches 0, higher order terms in sin(x)/x vanish, leaving 1
@@lesnyk255 approaches 0 WITHOUT being 0
@@seroujghazarian6343 The function f(x) = 1 - x^2/3! + x^4/5! - x^6/7! + ... is neither indeterminate nor undefined at x=0, so we may evaluate it there, whence f(0) = 1.
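A little illustration of that point (hypothetical snippet; `sinc_series` is just a made-up helper name):

```python
import math

def sinc_series(x, terms=6):
    # 1 - x^2/3! + x^4/5! - ... : the power series of sin(x)/x,
    # which is perfectly well defined at x = 0.
    return sum((-1)**k * x**(2 * k) / math.factorial(2 * k + 1) for k in range(terms))

print(sinc_series(0.0))                       # exactly 1.0
print(sinc_series(1.0), math.sin(1.0) / 1.0)  # both ≈ 0.8414709848
```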
Hi, what do you think about the series $\sum \sin(n)^n$? I am not even sure that sin(n)^n tends to 0. We know sin(n) can be as close as we want to 1, but never equal to 1, so the nth power could still be small!
Can we prove that it indeed converges to the principal solution and not to one of the other solutions?
A wild integral appears!
Great work
Isn't this a special case of Parseval's theorem?
I think I have an explanation for this equality. Since 2𝝅 (the whole circumference) is irrational, stepping by 1 infinitely along the circle eventually scans all the points of the circumference, which is the definition of the integral. Right?
Did I write something that stupid?
You are the Jackie Chan of maths
I am honored to hear that!
A very nice one!
lol I thought the bonus was going to be a tie-in to the Clausen function
Can you prove that the sin(x)/x integral is equal to the sin(n)/n series directly without evaluating it to pi?
Why does that first note hold true? Where did it come from?
did you find the answer by any chance?
Great video!
What about the series sum from 1 to infinity of 1/(n^(1+abs(sin(n))))? Does it converge or diverge? 🤔🤔
Converges. From 1 to inf., the terms get exponentially small after n=1.
Basically it's 1/n^(z(n)),
where z(n) > 1.
This condition guarantees that it converges.
Marco Malabarba (1+abs(sin(n))) > 1 for every n, since sin is only 0 at integer multiples of pi (and no integer n ≥ 1 is one), so abs(sin(n)) > 0. So by the p-series test the whole thing converges, I think.
@@federicovolpe3389 I don't know; we could use a similar argument for the series 1/(n^(1+1/n)) (as 1/n is always bigger than zero for all the integers), but it diverges
But couldn't I choose ln(-1) to be - πi? Then the argument wouldn't be valid anymore. The branch cut for the principal branch of the logarithm doesn't matter to the series around 1, so it could have been chosen at φ=0.999π or something. Just assuming one value *at* the branch cut seems very dangerously lax
A video on combinatorics and permutations !!!
Good video
What is Sum 1/Prime numbers ?
1/2+1/3+1/5+1/7...=?
It diverges: en.wikipedia.org/wiki/Divergence_of_the_sum_of_the_reciprocals_of_the_primes
I'm a younger student so forgive me if this is a stupid question, but: the Gregory-Leibniz series states that sigma n=1 to infinity of sin(n)/n = pi/4. Why do you need to do sigma of negative infinity to infinity of sin(n)/n to get pi?
I love this approach. I was thinking more along the lines of Fourier series for a triangle wave but this works too.
BPRP at 6:05: “In this case we can just look at the branch cut from -pi to pi”
Also BPRP at 7:42: log(-1)=pi*i
🤨🤔🤔🤨🤔🤨🤨🤔🤔🤨🤨🤔🤨🤔🤔🤨
Yea, branch cut from (-pi, pi]
In the series z^n/n = -ln(1-z), for |z| < 1.
maybe you could follow up with an expansion of this video for the sum of 1 to infinity of sin(nx)/(pi*x) and it converging to the Dirac distribution? thanks :)
i would use sum of harmonics is pi-pix/2 from 0 to pi
What about cos(n) /n?
Really cool!
Any solution without using the complex logarithm (which is a pretty tricky "function")?
sum( (1 + 1/2 + 1/3 + ... + 1/n) * sin(nx)/n ) from n=1 to infinity
Does this converge, and how? If you can, thanks
bprp writes "log" as "loy"
This result bothers me. It seems to me the answer should be zero for the following reasons:
First, for any randomly selected positive integer n, the value of sin(n) will be some random value between -1 and +1. Since the sine curve itself is essentially symmetric, there is nothing to favor positive values over negative values. Therefore, the summation of sin(n) for all positive integers n should be zero.
Now, would this change for the summation of sin(n)/n? I don't think so, because the summation of 1/n diverges. We can start the summation at any positive integer n, and the sum is not finite. Again, this suggests to me the positive and negative values will cancel out, leaving you with zero.
Blackpenredpen, but uses a marker!!
Yea
why not just use the comparison test and compare it to 1/x and we will clearly see that it diverges by p-series without doing any of this work?
I didn't know that the integral of a (just even?) function from -inf to inf is the sum of that function. So cool.
By the way, the integral from -inf to inf is 0 when the function is odd.
The area under the function is twice...
He never said that
It's true in this case, but not in general
@@lelouch1722 thanks
I've seen that function called sinc(x) online.
Yes. It’s a fantastic one! I plan to do more on it.
I think I saw it in Boyd's book on spectral methods. C stands for Cardinal.
Note |z| < 1.
Where do you come from, sir?
The land of black and red pens!
where did you learn that
sin n / n, cancel the n to get sin
There are far more pleasurable ways to get sin, my son!
can you please do a video on
if y = ln(ln(ln(ln(.....ln(x))
what's dy/dx
So how does Log(e^i) cancel to i? Is Log ln or something?
If you used the ordinary (multivalued) log, you would have to write "+2*pi*i*n, where n is an integer" in your answer, because it has infinitely many possible values.
Thyron Dexter but I don't see how the Log cancels, because Log is base ten.
@@athanasiuscontramundum4127 nope, "log" is base ten, "ln" is base e, and "Log" with a capital L is the principal branch of the complex logarithm (also base e). This is just a convention.
Thyron Dexter oh, I had no idea that was a thing. Thank you.
Amazing!
Really interesting
hi ..... what about Sum of abs(sin n)/n ..... thanks
Should diverge. + infinity.
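One standard way to see the divergence (a sketch, not claiming it's the only way): use |sin n| ≥ sin²n = (1 − cos 2n)/2, so

$$\sum_{n\ge 1}\frac{|\sin n|}{n} \ \ge\ \sum_{n\ge 1}\frac{1-\cos 2n}{2n},$$

and since Σ cos(2n)/n converges (Dirichlet test again) while Σ 1/(2n) diverges, the right-hand side, and hence the original series, diverges to +∞.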
Unexpected Peyam!!!
Lol, he is my usual guest tho! : )
It would be more appropriate to say that lim sin(x)/x -> 1 as x -> 0, since it's the limit that makes sense, not sin(0)/0.
The power series you used is valid if abs(z) < 1.
abs(e^i) is a real number (as every absolute value is), therefore it can be used in inequalities even though the number e^i itself can't. Let me know if you need further explanation.
@@rulekop e^i is a complex number. Its size (not its absolute value) is a real number. Its absolute value is not defined, since I cannot determine whether e^i > 0, as the complex numbers cannot be ordered. I think that this reasoning is correct, unless I miss something. The power series requires |z| < 1.
We all know that ln(e^x) = x...
But why also log(e^x) = x ?
We are not sure here that log has "e" as its base
Can someone explain?
Awesome.
Really nice[btw prefer ln :)]
blackpenredpenbluepen
Sin(n)/n is not convergent, so how is its sum finite?
That is perhaps because its sign alternates as n -> inf
Well, its integral is convergent, just look at the graph. From this you can conclude that its sum is convergent using the integral test. Because sin(x) always has the same set of values after a period 2pi, but x is not periodic, when x grows indefinitely, the denominator grows arbitrarily large compared to the sin(x) (which always stays in range [-1;1]) and so there is a limit.
Don't believe in WolframAlpha?
In Germany:
ein = 1
ei = egg.
I want to buy a t-shirt. Do you send to Brazil?
yes! : )
so in the end :
dx = 1 get that :v ?