Determine if the series Σ sin²(1/n) converges or diverges. This question was on my Calc 2 final exam. #calculus #blackpenredpen #math
Let me ask you: how do you know the derivative of sin(x)? If you've studied even the basics of calculus, you know it relies on the fact that the limit of sin(x)/x as x goes to 0 equals 1. So the derivative depends on that fact, and using something to prove a fact that its own truth depends on makes no sense.
This might be off and contain a couple of mathematical leaps of faith, but if I look at the Taylor series for sin(x), then in a very close neighbourhood of 0, sin(x) is almost exactly x. Then it basically converges the same way sum (1/n^2) does. Would that make sense?
blackpenredpen He probably means the part where you wrote down sin²(1/∞) (1st try). I know (hope?) that this was just for didactic purposes, but it still hurts a bit. Maybe you should mention that this is a sloppy notation when doing this.
Confusing, as the limit of sin θ as θ approaches infinity oscillates between -1 and 1 and never settles on a single value. I do not see how squaring said function changes anything. This should also diverge.
Kenneth Gee Notice that you have to keep adding 2π to θ for the values of sin θ to cycle. However, here θ = 1/n steadily decreases from 1 toward 0 as n approaches infinity. Therefore sin(1/n) does not oscillate: each n gives a different, ever-smaller input, as opposed to the usual sin(n), where the angle can revisit values of the form n + 2kπ for integer k.
@@QasimKhan-nd8og I disagree that you need to add 2π in order for sin θ to oscillate. 2π is approximately 6.28 and definitely between 1 and infinity. Everyone just assumes that fractions and irrational numbers are not included in a summation series. I still see that sin θ is cyclical and never settles on a single value. This is the nature of trig functions.
But as Qasim Khan pointed out, you're not taking the lim(θ→∞) sin(θ). You're taking lim(n→∞) sin(1/n). sin(x) is a continuous function, so you can bring the limit inside, giving lim(n→∞) sin(1/n) = sin(lim(n→∞) 1/n) = sin(0) = 0
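A quick numeric illustration of this reply (a Python sketch of my own, not from the video): the inputs 1/n only sample sine on (0, 1], where it is positive and increasing, so sin(1/n) decreases steadily toward 0 rather than oscillating.

```python
import math

# sin(1/n) for n = 1..10: the argument 1/n slides down from 1 toward 0,
# so sine is only ever evaluated on (0, 1], where it is positive and increasing.
terms = [math.sin(1.0 / n) for n in range(1, 11)]

# Every term is positive and strictly smaller than the previous one: no oscillation.
assert all(terms[i] > terms[i + 1] > 0 for i in range(len(terms) - 1))
print(terms)
```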
how many points will I get for proving that this sum converges by graphical methods? (assuming that I didn't know how to apply the theorems) Let's start with plotting the function f(x) = sin²(x). We know that the largest value of x is 1 (for n = 1), thus the range of x is 0 < x ≤ 1
11:22 Answer:
Use the same LCT with sum 1/(n^2)
lim n->inf of (abbreviation will be ->) (cos (1/n) - 1)/(1/n^2)
by taylor expansion -> (1/(n^2) - 1/(n^4) + (1/n^6) ...)/(1/n^2) = 1 - 1/(n^2) + 1/(n^4)...
-> 1
Thus since sum 1/(n^2) converges, so does the sum of cos(1/n) - 1
Just wanted to thank you for making these videos. You make calculus interesting and using your channel to review for linear algebra is making my life so much easier.
This can also be done with trig identities: cos(x) - 1 = cos(2·(x/2)) - 1 = cos^2(x/2) - sin^2(x/2) - sin^2(x/2) - cos^2(x/2) = -2sin^2(x/2), and the rest goes like in the video.
I don't understand that taylor expansion... isn't that the expansion of 1/(1/n^2+1)? Also if you check the sign the numerator is negative since cos(0+) < 1 while the denominator is positive, how can the result of the limit be positive? Anyway I would solve it like this:
lim n->inf of (cos (1/n) - 1)/(1/n^2) -> lim m->0+ of (cos(m)-1)/m^2 -> -1/2 (which is a common limit)
If you don't like the common limit you can get it by using lim n->0 sin(n)/n -> 1
lim m->0+ of (cos(m)-1)/m^2 -> (cos^2(m)-1)/m^2/(cos(m)+1) -> -sin^2(m)/m^2/(cos(m)+1) -> -(sin(m)/m)^2/(cos(m)+1) -> -1^2/2 = -1/2
and if you want to use the taylor expansion:
the expansion of cos(1/n)-1 is -1/(2n^2) + 1/(24n^4) - 1/(720n^6) + ...
so you get lim n->inf of (cos(1/n) - 1)/(1/n^2) -> (-1/(2n^2) + 1/(24n^4) - 1/(720n^6) + ...)/(1/n^2) = -1/2 + 1/(24n^2) - 1/(720n^4) + ... -> -1/2
Anyway: since the limit n->inf of (cos(1/n)-1)/(1/n^2) and the sum of 1/(n^2) converge then the sum from n=1 to inf of cos(1/n) - 1 also converges
Edit: the expansion should be 1/(1/n^2+1) and not 1/(x+1) (I didn't do the sub)
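A numeric sanity check of the corrected limit (my own Python sketch): the LCT ratio (cos(1/n) - 1)/(1/n²) does settle near -1/2.

```python
import math

def lct_ratio(n: float) -> float:
    """The limit-comparison ratio (cos(1/n) - 1) / (1/n^2) discussed above."""
    return (math.cos(1.0 / n) - 1.0) * n * n

# The ratio approaches -1/2, matching the Taylor expansion -1/(2n^2) + ...
for n in (10, 100, 1000):
    print(n, lct_ratio(n))

assert abs(lct_ratio(10_000) + 0.5) < 1e-6
```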
👏🏼👏🏼👏🏼👏🏼👏🏼👏🏼👏🏼
just sin(x) < x, so sin²(1/x) < 1/x²; we have a positive series bounded term by term by one that converges
"Maybe I should just give up" - sounds like a good solution to me.
Hey I took my Calc 2 final and got an A! I want to thank you so much! Your videos gave me ideas to solve some of the harder problems! :D
You're welcome!!! I am glad to hear your success! Yay!!
Well done! And well done BPRP for providing inspiration :-)
David Gould thank you!!
Another way to reach the same conclusion. sin(x) < x when x > 0 ==> the summation of sin(1/n^2) < the summation of 1/n^2 which converges, so by Direct Comparison Test, it must converge as well.
that sound when you're drawing sad faces, i'm dying
The best way to learn is by showing us when stuff doesn't work. Please do that more!
I agree. Then you know what not to do, so you don't fall into traps.
use sin(x) < x; then sin^2(1/n) < 1/n^2, and sum 1/n^2 converges
but then you would need to recite the proof of this fact, as it may not be as famous as lim sin(x)/x
@@vladimir0681 well to be fair the proof is fairly easy: d/dx sin(x) is cos(x) and d/dx x is 1. Since cos(x) ≤ 1 everywhere and sin(0) = 0, sin(x) can never climb above x for x ≥ 0.
@@vladimir0681 "but then you would need to recite the proof of this fact"
IMO sin(x) < x for positive x should be considered common knowledge on a calc 2 final.
Just use small angle approximation, sin²(x)=x²
Likewise for "The Limit", just use sin(θ)=θ, θ/θ=1
@@waynetraub3 what do you mean by "only avaiable using the derivative of sin(x)" ? It seems to me that x happens to approach sin(x) whenever x gets close to 0. I don't see the link with derivatives. Knowing that, sin^2(1/n) has the same behaviour as 1/n^2 for large n and therefore the series converges. I don't see why the piece of information about x being the first term of the taylor series of sin(x) is relevant here. you don't need to consider terms of higher orders to draw a conclusion.
@@waynetraub3 You can show the limit of sinx/x as x goes to 0 is 1, the small angle approximation follows directly from this
The way I did it is that for large n, sin^2(1/n) ≤ 1/n^2, so the series converges by comparison.
Kyro ya, that’s what I said
@@avdrago7170 Not quite. Kyro is essentially saying that you can drop the equality; sinx is strictly less than x, when x > 0.
You said this at the end, but not at the beginning. Kind of a moot point, really.
Fred
@@ffggddss ya I was talking about the end
That is a very good method as well tbh. I like both methods. The LCT method is pretty good too.
It's always true, not just for large n
when 0
sin²(1/n) ~ 1/n², because 1/n -> 0 as n goes to infinity,
and since sin²(1/n) is always positive we can use comparison; we know the series of 1/n² converges, so by comparison the series of sin²(1/n) converges
That just says it is bounded, but sin²(x) isn't monotonic, so it could just oscillate between multiple different values and diverge in that sense
@@oliver_pryce x = 1/n here :)
if you know the power series for sin(x) as x goes to zero:
sin(x) = Σ(-1)^k * x^(2k+1)/(2k+1)! with k ranging from 0 to infinity, so sin(x) = x - x^3/3! + x^5/5! - ...
so sin(x) ~ x as x goes to 0.
1/n goes to 0 as n goes to infinity,
so sin(1/n) ~ 1/n as n goes to infinity,
and sin²(1/n) ~ 1/n².
Since Σ1/n² converges, by comparison Σsin²(1/n) converges too.
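The two facts used above, sin(x) < x for x > 0 and sin(x)/x → 1 as x → 0, are easy to check numerically (a Python sketch of my own):

```python
import math

# sin(x) < x for every x > 0, and sin(x)/x -> 1 as x -> 0+;
# together these are why sin^2(1/n) behaves like 1/n^2 for large n.
for x in (1.0, 0.1, 0.01, 0.001):
    assert math.sin(x) < x        # strict inequality on the positive axis
    print(x, math.sin(x) / x)

# The comparison ratio sin^2(1/n) / (1/n^2) at n = 10^4 is already ~1.
ratio = math.sin(1e-4) ** 2 * 1e8
assert abs(ratio - 1.0) < 1e-6
```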
@@gillesphilippedeboissay109 Oh yes, thanks. I guess I wasn't really thinking, because 1/n is always going to be at most 1, and so it will be monotonic.
Bruce Lee of calculus. Same outfit
: )))))
@@blackpenredpen that face scares me to this day! STOP
Thought exactly the same lol
Using complex analysis, I was able to find a series that converges substantially faster to the same value than this representation does. It's the sum from n=1 to infinity of (-1)^(n+1) * 2^(2*n-1) * zeta(2*n) / (2*n)!
Given t > 0, sin(t) < t.
∀ x: sin(x)≤x → ∀ x: sin²(x)≤x² →
Σ sin²(x) ≤ Σ x²
for x=1/n
Σ sin²(1/n)≤ Σ1/n² & Σ1/n² ≠∞
∴
Σ sin²(1/n) converges
small note: sin(x) ≤ x only for non-negative x, but the rest is right
@@blackbacon08 working in the real numbers means there's no chance of the sum of squares of numbers diverging to -∞
@@xinpingdonohoe3978 well, if you consider alternating sums it can happen, I guess.
@@dylandiaas this series we have though cannot be alternating as every term is nonnegative.
Using the fundamental theorem of engineering: sum n{1,inf} sin²(1/n) = sum 1/n² = π²/6
More seriously though, sin x < x for x > 0, so direct comparison works as well.
hey bprp, can you teach us fourier transforms? thx!
Coming soon! (after Xmas)
blackpenredpen Cool!
yess!!!
Hella yes, that'd be so fucking cool.
The sad and angry faces + the dramatic sounds take the problems to another dimension.
I would've shown it's between the sum of 0s and the sum of n^-2, because
0 ≤ sin(θ) ≤ θ for θ in [0, 1], so 0 ≤ sin²(1/n) ≤ 1/n².
for the last one it's absolutely convergent, as -x²/2 ≤ cos(x) - 1 ≤ 0, so |cos(1/n) - 1| ≤ 1/(2n²)
The terms are all positive; the largest argument of the sine function is 1 (radian); which means:
0 < sin(1/n) < 1/n,
for all n ≥ 1 [This can be easily shown using the Taylor series for sine.]
So by comparison test with ∑₁^∞ (1/n²), it converges. Furthermore:
0 < ∑₁^∞ sin²(1/n) < ∑₁^∞ (1/n²) = ⅙π²
"NOW TRY"
∑₁^∞ (cos(1/n) - 1)
This, too, converges.
lim[n→∞] (cos(1/n) - 1)/(1/n²) = lim[x→0⁺] (cos(x) - 1)/x² → (1-1)/0 → 0/0 [sub. x = 1/n], so use l'Hôpital
once: -sin(x)/(2x), which we know → -½
So by LCT with ∑(1/n²), it converges.
Fred
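As a numeric footnote to Fred's "NOW TRY" part (my own Python sketch): the partial sums of Σ(cos(1/n) - 1) do settle toward a finite negative value, as the LCT predicts.

```python
import math

def partial(N: int) -> float:
    """Partial sum of cos(1/n) - 1 for n = 1..N (every term is negative)."""
    return sum(math.cos(1.0 / n) - 1.0 for n in range(1, N + 1))

for N in (10, 100, 1000, 10_000):
    print(N, partial(N))

# The gaps between successive partial sums shrink: the sums decrease to a limit.
assert abs(partial(10_000) - partial(1_000)) < abs(partial(1_000) - partial(100))
```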
: )))
That is exactly the first thing I thought to do when I saw this, but was too lazy to do the work lol. Thanks for doing it for me haha!
except I also added that 0 < sin(1/n)^2 < 1/n^2
Good job. You can also do the limit in "now try" multiplying and dividing by (cos(x)+1).
@Poo Guy Michael: Thanks for the kind words!
@MarcoMate87: Yes! Good catch!!
Fred
Very nice!
I'm not sure if that was allowed in your calc 2 final exam, but you could use the fact that sin(x) ≤ x for all x ≥ 0:
sum from n:=1 to infinity of sin^2(1/n) = sum from n:=1 to infinity of (sin(1/n))^2 ≤ sum from n:=1 to infinity of (1/n)^2,
which converges.
Why so much work? -x ≤ sin(x) ≤ x, so sin^2(1/n) ≤ 1/n^2.
Hello, here's a solution from France: remark that sin^2(1/n) = (1/n + o(1/n))^2 = 1/n^2 + o(1/n^2) as n grows to infinity, in Landau's notation. So sin^2(1/n) is equivalent to 1/n^2, whose series converges. So our series also converges.
No way, a Rouxel?? You seem good at maths; where are you in your studies? I'm in prépa myself.
sin(1/n) is equivalent to 1/n because 1/n goes to 0 as n goes to infinity, so sin(1/n)^2 is equivalent to 1/n × 1/n = 1/n^2, which converges according to Riemann (p-series with α > 1)
For positive x, sin(x) < x, so for positive n, sin(1/n) < 1/n. Since both sides are positive, you can square them to get sin^2(1/n) < 1/n^2. Since the sum of 1/n^2 converges, so does our sum by the DCT.
|sin x| ≤ |x| for all x.
this was the approach I took too before watching the video. It seems much simpler. You don't even need the absolute value in this case, because n is always positive, meaning the range of theta is (0, 1], where sin is always positive.
1:12 damn I almost peed myself there
Learning order notation and its rules makes this easy because sin(x) = O(x) for small x. You don't need to know anything more other than sum 1/(n^2) converges.
At 6:10 you actually would not need to know that sin(x)/x -> 1: once you get that the limit as n approaches infinity of sin(1/n) is 0, and the other factor stays bounded, the product is 0, so you do not need to evaluate the second limit there. No harm in doing it, just adding that.
11:23 It can be reduced to your series using trig identities:
cos(1/n) = cos^2(1/(2n)) - sin^2(1/(2n))
1 = cos^2(1/(2n)) + sin^2(1/(2n))
cos(1/n) - 1 = cos^2(1/(2n)) - sin^2(1/(2n)) - cos^2(1/(2n)) - sin^2(1/(2n))
cos(1/n) - 1 = -2sin^2(1/(2n))
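A quick spot-check of the identity derived above (my own Python sketch):

```python
import math

# cos(x) - 1 = -2 sin^2(x/2), checked at a few sample points, including x = 1/n.
for x in (1.0, 0.5, 1.0 / 7, 0.01):
    lhs = math.cos(x) - 1.0
    rhs = -2.0 * math.sin(x / 2.0) ** 2
    assert abs(lhs - rhs) < 1e-12
print("identity holds at sampled points")
```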
sin^2(1/n)
Just substituting 1/n=t
The limits will become 0 to 1
It will definitely converge.
the series of 1/n diverges even though its term 1/n = t also goes to 0, so that substitution alone isn't enough
Σ(n=1,inf)(sin(1/n²)) also converges for the same reason:
lim(n->inf) ((sin(1/n²))/(1/n²))=lim(u->0)(sin(u)/u)=1 with u=1/n²
I think I would love to have you as a prof during lectures/office hours, but would hate to have you as a prof during exams lol
I would just say that since 0 < sin(1/n) ≤ 1/n for n ≥ 1, Sum(sin(1/n)^2) ≤ Sum(1/n^2), which converges.
|cos(1/n) - 1| ≤ 1/(2n²)
An easier way: recognize that cos(1/n) - 1 is always negative and 1/n^2 is always positive, so cos(1/n) - 1 < 1/n^2 for all n.
On the positive axis we have sin(x) ≤ x.
Yeah that's the easy way.
can we just use
sin(x) < x for 0 < x < pi,
so sum (sin^2 1/n) < sum 1/(n^2)
it's not that complicated: if you notice that the sequence in the series is asymptotic to another sequence whose partial sums you know better, like in this case, just use asymptotic comparison: they have the same character, meaning that if one is convergent/divergent the other one is also convergent/divergent.
in this case, sin^2(1/n) as n goes to infinity is asymptotic to 1/n^2, whose series converges (to pi^2/6, I believe); thus Σ sin^2(1/n) converges.
I would use integral test, then u sub u = 1/x, then integral test back to a summation, sinc^2, bound it above by sinc, and we know this converges.
Okay, I've watched a fair number of videos where it is stated that "you cannot use L'Hopital's Rule here" (as happens at 5:08 in this video), and I am wondering why not. I understand that derivatives are used in L'Hopital's Rule, therefore it would be circular reasoning to use L'Hopital's Rule on a function to find the derivative of that function. However, once L'Hopital's has been proven, why can't it be used for any convergence problem, e.g. the (sin x)/x limit shown in the video?
Because the standard proof of the derivative of sinx uses the same limit of limx->0 sinx/x. A mathematician would ask: Why would you use L'hopital's rule to find a limit that you already need/found for the derivative of sinx? That is why it is circular.
technically you cannot apply the L'H rule because n is always a natural number, which means we are talking about sequences, not functions. Sequences are not continuous and therefore not differentiable.
sin(1/n) = O(1/n), so sin^2(1/n) = O(1/n^2), which is the general term of an absolutely convergent series, so it converges
Now lets try some physics problem blackpenredpen.
A horizontal shaft rotates in bearings at its ends. At its midpoint is keyed a disk weighing 40 lbs, whose center of gravity is 0.1 inch from the axis of rotation. If a static force of 200 lbs deflects the shaft and disk through 0.1 inch, determine the critical speed of rotation of the shaft.
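Not an official answer, just a back-of-the-envelope sketch in Python, assuming the standard single-disk model: the static test gives the shaft stiffness k = F/δ, and the critical speed equals the lateral natural frequency ω = √(kg/W). (In this simple model the 0.1-inch eccentricity sets the whirl amplitude, not the critical speed.)

```python
import math

# Assumed single-disk (Jeffcott-style) model; all names below are my own labels.
W = 40.0       # disk weight, lb
F = 200.0      # static test force, lb
delta = 0.1    # measured static deflection, in
g = 386.4      # gravitational acceleration, in/s^2

k = F / delta                    # shaft stiffness, lb/in
omega = math.sqrt(k * g / W)     # critical angular speed, rad/s
rpm = omega * 60.0 / (2.0 * math.pi)
print(f"k = {k:.0f} lb/in, omega = {omega:.1f} rad/s, N_c = {rpm:.0f} rpm")
```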
I scanned comments, didn't see this question: At 3:52, are you not assuming the limit exists to imply the individual terms converge? Is it because that once you factor the product, it is apparent that the individual terms converge, which implies that the product converges?
just use the integral test to reach the Dirichlet integral, which converges.
An easy way to know what to compare the series with is to do a Taylor expansion and compare with the first term.
since sin(x) ≤ x for x ≥ 0, direct comparison with Σ 1/n² settles it
sin^2(1/n)= sin(1/n )* sin(1/n) < 1/n * 1/n ( it is better to use that inequality than limit comparison because it gives you a better sense of understanding, also it is less work)
Try : sum ( 1 to inf. ) 1/n^(1+1/n)
sin x = x MINUS x^3/6: these two terms of the Taylor series suggest that this series actually grows slower at infinity than 1/n^2, so it converges. Solvable in your head.
The answer is also extremely apparent when you treat sin(x) as the Taylor series of sin(x) and then do a little algebra to show that the dominant term is the 1/x^2 term. This is very similar to the LCT though
Furthermore, the general term of sin^2(1/n) series is actually LESS than general term in series of (1/n^2), because of next higher-order term in sin series. Another comparison, and term by term, the series is less than the p series for p = 2.
Much quicker way... sin(n) will ALWAYS be less than n, so since the sum (n=1-->infinity) (1/n)^2 converges, then the sum(n=1-->infinity) (sin(1/n))^2 must converge as each corresponding term is smaller. Since it is not just 1/n^2 you would not be able to find the point it converges TO, but you know it will converge. Would write more thoroughly with a better editor, but for comments...
sin^2(1/n)/(1/n^2) tends to 1 as n tends to ∞
abs(sin^2(1/n)) ≤ 1/n^2
In [0, π/2], (2/π)·x ≤ sin(x) ≤ x.
(sin(1/n))^2 ≤ (1/n)^2
This is part of my Calc1 course also multivariable lol
I noticed that sin x < x for positive x.
You could also argue that when n -> inf, sin 1/n is equivalent to 1/n and the whole thing is equivalent to 1/n^2 which means it converges
What do you mean by “is equivalent to”. To prove anything you must show it directly or use a theorem.
@@Erik20766 "Equivalent to" means that you take the first terms of its series expansion. Sorry for the late reply.
@@asameshimae6850 "Sorry for the late reply" 2 years late
1:26 I'd honestly be disappointed if it did diverge; I expect more from my professor than a puny divergent series.
Or you could just know that the sum of the inverse squares is pi^2/6 and that sin(x) is slightly less than x as x gets close to 0. This gives an upper bound of pi^2/6, and a series of positive terms with bounded partial sums converges.
You can also use squeeze theorem by noting that 0≤sin^2(1/n)≤1/n^2, so their partial sums also follow the inequality, so by squeeze theorem, since both of the others converge, so does the series in question. Then, no difficult limits are required like when using the LCT.
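The squeeze on partial sums is easy to watch numerically (my own Python sketch): term by term 0 ≤ sin²(1/n) ≤ 1/n², so the partial sums stay sandwiched below π²/6.

```python
import math

N = 100_000
s_sin = s_inv = 0.0
for n in range(1, N + 1):
    t = math.sin(1.0 / n) ** 2
    assert 0.0 <= t <= 1.0 / n ** 2   # the termwise inequality from the comment
    s_sin += t
    s_inv += 1.0 / n ** 2

print(s_sin, s_inv, math.pi ** 2 / 6)
assert s_sin <= s_inv <= math.pi ** 2 / 6
```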
I think if you define sin x as a power series in the first place, then you could use L'Hospital's rule to evaluate lim as x ---> 0 of (sin x)/x = 1.
My pre-watch guess is “yes”, I guess that 1/n being inside a squared function which tends to 0 in the limit makes the overall sum converge. QED
For the limit of sin(x)/x when x tends to 0. You can see that as a derivative and say that [sin(x)-sin(0)]/(x-0) tends to cos(0) when x tends to 0 ! :) Good video again !
That's circular reasoning
Please excuse me if this is a stupid question, but isn't the sum at 8:20 the same thing as ζ(2) (the Riemann zeta function) and thus the result is π²/6 ?
Chrona Mew you are correct!
To clarify: you are finding the limit of the SUM of all the infinite terms, right, not just the limit of one single sin²(1/n) term? In the latter case the answer would be zero times zero, or zero.
i use the sin^2(x) = (1/2)(1-cos(2x)) property instead, but yeah if this came up on a test i'd probably take a big L here since I didn't think of using LCT until seeing the second attempt.
essentially sum of 1/n², save the first couple of terms
But what if I factored sin² into (1 - cos(double angle))/2, all divided by... that is, (1 - cos(2/n))/2?
I can then separate them.
Then the series over n of the constant 1 is divergent.
Therefore, the whole series is divergent.
Correct?
You can use the squeeze theorem to prove the limit of sin(theta) over theta
it's 1
If you simply try the Taylor series to first order, you can easily show that this series is dominated by Σ 1/n², hence convergent.
of course you can use l'hopital for sin(t)/t -> 1. all you have to do is define sin via its power series, which is very common in calculus.
The power series for sine is found using the derivative of sine, and the derivative of sine is found using the limit as x->0 of sin(x)/x, so it’s circular logic
@@YorangeJuice you can simply define sine via its power series. No circular logic at all
@@nasekiller where does that power series come from? You need the derivative of sine to derive it, unless you have a way of coming up with its power series without the derivative of sine
@@YorangeJuice yeah, but we already know the power series so we can use it. The definition is then justified by proving it actually has all the properties of sine.
@@nasekiller “we already know the power series so we can use it” yes, but HOW DO WE EVEN KNOW IT IN THE FIRST PLACE?? It didn’t materialize out of nothing, we didn’t just make lucky guesses to derive it. It’s because we used the derivative of sine to derive it, idk how many times I have to keep repeating this, it’s circular logic
I used some of my university Maths during my career, but very little to none of my Calculus, I'm always surprised by how much I remember after, ummmnmnnnmmn, about 50-odd years.
Can you do a video where you actually find that sum?
This series obviously converges because sin(1/n)^2 is equivalent to 1/n^2 as n goes to infinity. That's simple.
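That equivalence is exactly the LCT ratio; a small Python sketch (the helper name is made up) shows it heading to 1:

```python
import math

def lct_ratio(n):
    """Ratio sin^2(1/n) / (1/n^2) used in the limit comparison test."""
    return math.sin(1.0 / n)**2 / (1.0 / n**2)

# As n grows, sin(1/n) ~ 1/n, so the ratio tends to 1.
for n in (10, 100, 1000):
    print(n, lct_ratio(n))
```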
It's better to avoid writing statements like $1/\infty$ as this reinforces incorrect arguments used by undergraduate students.
Also, it is easy to solve this problem if we just prove that A < |sin(x)/x| < B for some A > 0 and all x satisfying |x| < 1. This is easily shown by checking that the derivative of sin(x)/x is positive to the left of zero and negative to the right, so the function peaks at 0 and stays above sin(1) on the interval.
Hi Steve now you’re my favourite youtuber :)
I’d like to ask you for something.
Can you calculate the surface area and volume of the solid obtained by rotating the Bernoulli lemniscate
(x^2+y^2)^2=2a^2(x^2−y^2) around the x-axis (from x=0 to x=2)?
oh wow. Did you come up with an answer yet?
x·sin²(1/x) − Si(2/x), where Si is the sine integral — differentiating this gives back sin²(1/x), so it's an antiderivative.
If the series is known to be convergent, then it would be really cool to calculate the sum. Could you do that....? 😋💖
By the way, the question you asked at the end is just (−2) times the subject of the video.
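Spelling that out with the half-angle identity:

```latex
\cos x = 1 - 2\sin^2\!\left(\frac{x}{2}\right)
\quad\Longrightarrow\quad
\cos\!\left(\frac{1}{n}\right) - 1 = -2\,\sin^2\!\left(\frac{1}{2n}\right),
```

so Σ (cos(1/n) − 1) is (−2) times a sum of squared sines of the same shape as the one in the video.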
Whenever I see a sin(?) or cos(?), I immediately put down -1
I tried to change it into an integral but could not go further
You CAN use L'Hospital's rule at 5:24. You're wrong, respectfully, because the limits of the numerator and denominator are both zero, so the rule applies and it's not circular reasoning. I don't get why you think it would be circular?
Let me ask you: how do you know the derivative of sin(x)? If you've studied even the basics of calculus, you know that deriving it uses the fact that the limit as x goes to 0 of sin(x)/x equals 1. So L'Hospital here depends on that very fact, and using something to prove a fact on which its own truth depends makes no sense.
Sin^2(1/n)
Loved the Doraemon intro in the beginning. Brought so many childhood memories.
: ))))))
This might be off and contain a couple of mathematical leaps of faith, but if I look at the Taylor series for sin(x), then in a very close neighbourhood of 0, sin(x) is almost exactly x. Then it basically converges the same way sum (1/n^2) does. Would that make sense?
Thank you!
could just prove that sin^2(1/n) is always less than 1/n^2, which converges by p-series.
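The bound being used is |sin x| ≤ |x|, hence sin²(1/n) ≤ 1/n²; here's a quick term-by-term check in Python (helper name invented here):

```python
import math

def terms(N):
    """First N terms of the series and of the dominating p-series."""
    a = [math.sin(1.0 / n)**2 for n in range(1, N + 1)]  # sin^2(1/n)
    b = [1.0 / n**2 for n in range(1, N + 1)]            # 1/n^2
    return a, b

a, b = terms(1000)
# |sin x| <= |x| for all real x, so each sin^2(1/n) <= 1/n^2.
print(all(x <= y for x, y in zip(a, b)))
```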
One slappy face for the infinity boi
Tom G ?
blackpenredpen He probably means the part where you wrote down sin²(1/∞) (1st try). I know (hope?) that this was just for didactic purposes, but it still hurts a bit. Maybe you should mention that this is a sloppy notation when doing this.
The TFD (test for divergence) and the LCT seem to require a lot of intuition about whether 0 is an acceptable answer.
I'm fairly certain that, by definition, the LCT only works if lim n->infinity of a_n/b_n is > 0 and =/= infinity.
Confusing, as the limit of sin(theta) as theta approaches infinity oscillates between -1 and 1 and never settles on a single value. I do not see how squaring the function changes anything. This should also diverge.
Kenneth Gee Notice that theta has to sweep through multiples of 2pi for sin theta to oscillate. However, here theta = 1/n constantly decreases from 1 toward 0 as n approaches infinity. Therefore sin(1/n) does not oscillate: each n gives a new, ever-smaller input, unlike the usual sin(n), whose inputs keep passing through values of the form n + 2k*pi for integer k.
@@QasimKhan-nd8og I disagree that you need to add 2*pi in order for sin theta to oscillate. 2*pi is approximately 6.28 and definitely between 1 and infinity. Everyone just assumes that fractions and irrational numbers are not included in a summation series. I still see that sin theta is cyclical and never settles on a single value. This is the nature of trig functions.
But as Qasim Khan pointed out, you're not taking the lim(θ→∞) sin(θ). You're taking lim(n→∞) sin(1/n).
sin(x) is a continuous function, so you can bring the limit inside, giving
lim(n→∞) sin(1/n) = sin(lim(n→∞) 1/n) = sin(0) = 0
Always try to tie to 1/n² ^^
how many points will I get for proving that this sum converges by graphical methods? (assuming that I didn't know how to apply the theorems)
Let's start with plotting the function f(x) = sin^2(x). We know that the largest value of x is 1 (at n = 1), so the range of x is 0 < x ≤ 1.
Question: why can't you use L'Hospital's rule for sin(x)/x? Isn't the rule proved via Rolle's theorem? So it wouldn't be circular reasoning, would it?
Could it be done more easily with big-O notation? Can you please do a video on big-O notation?
Do you have any great book for limit?
I thought he was going to use the small angle approximation for the third attempt and troll us, lol 😂😂
It's not an approximation, it's Taylor expansion and it's totally correct to do it this way
@@chat7897 It is an approximation if phrased as sin(x) ~= x, like many engineers and physicists do :)
Is it possible that there are summations like that where the answer of divergent versus convergent is unknowable?