I'm addicted to the way you pronounce "real analysis" at the intro of every video
"Reelin' In The Years" ~ Steely Dan
I don't have any master who teaches me the basics of real analysis better than you ... Thank you very much. I'm from India.
Many thanks for this video!
Best explanation of the theorem I have seen anywhere! 👍
You're very welcome!
Awesome, this is perhaps the cleanest proof of the BW theorem I've ever seen! Please keep it up!
The only German teacher I have found who explains mathematics in a humanly understandable language.
Thanks a lot :)
0:25 BW theorem
5:22 every bounded sequence has at least one accumulation point (check the textbook to verify)
True for Complex Numbers, interesting!
Simple and to the point!
I use this theorem to trade and it works beautifully.
What do you mean? :D
Thank you so much for this awesome content. I took a Real Analysis class three years ago and I am using your videos to brush up on a bit of proof-based math prior to taking more advanced courses this Fall. Please keep it up. It would be absolutely fantastic if you could add partial differential equations and stochastic calculus crash courses similar in style and delivery to this one.
Thank you very much! Both things are on my list. However, I am already producing many series in parallel such that other ones have to wait a little bit longer.
This is also binary search :)
I am seeing a pattern. We are showing that if we run the binary search long enough, the nested intervals shrink to a single point, and the subsequence is sandwiched inside them.
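A tiny numerical sketch of that sandwich picture (my own illustration, not from the video): each bisection halves the interval length, so d_k - c_k = (d_0 - c_0)/2^k, which tends to 0.

```python
# Minimal sketch: repeated bisection halves the interval length, so the nested
# intervals [c_k, d_k] shrink towards a single point.  The starting interval
# [0, 1] and the number of steps are arbitrary choices; which half we keep does
# not matter for the lengths, so we simply always keep the left one here.
c, d = 0.0, 1.0
for k in range(1, 11):
    d = (c + d) / 2          # keep the left half [c_k, (c_k + d_k)/2]
    print(f"k = {k:2d},  d_k - c_k = {d - c:.10f}")
# After k steps the length is (d_0 - c_0) / 2**k, which goes to 0 as k grows.
```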
Thank you very much for the video, very helpful!!
You are welcome!
Nice! I love it.
thanks
Great video as always!
Thanks!
Welcome! And thank you very much :)
Hi sorry for the stupid question, but how do we know that we can choose each a_{n_{k}} such that n_{k+1} > n_{k}? Does this just follow directly from the fact that each new bisection contains infinitely many members - I can see it but I am not sure how to write that intuition down...
Yes, it follows from the fact that you have infinitely many members to build your sequence :)
The proof is incomplete. He skipped this important step.
How does the proof work for a constant sequence?
What about the sequence a_n = (nπ) mod 1, i.e. the fractional part of nπ?
What about that? :)
@brightsideofmaths does it have infinitely many accumulation values?
@@agostonkis1365 I guess it has.
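For what it's worth (my own remark, not from the video): since π is irrational, the fractional parts nπ mod 1 are equidistributed in [0, 1] by Weyl's theorem, hence dense there, so every point of [0, 1] is an accumulation value of that sequence. A rough numerical check, with an arbitrary number of terms and bins:

```python
import math

# Rough numerical check (illustration only, not a proof): the fractional parts
# of n*pi fill [0, 1] roughly evenly, matching the equidistribution of
# n*alpha mod 1 for irrational alpha.
N = 10_000                                   # arbitrary number of terms
values = [(n * math.pi) % 1 for n in range(1, N + 1)]

counts = [0] * 10                            # ten equal subintervals of [0, 1]
for x in values:
    counts[min(int(10 * x), 9)] += 1
print(counts)                                # each bin holds roughly N/10 terms
```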
Question: You say we take the left interval with infinitely many members, but the new interval with c1 and d1 is the one on the right-hand side. That is confusing me right now.
We take the one with infinitely many members. In the picture, it's the one on the right-hand side.
I wonder whether there is a proof using the least upper bound property (that every bounded sequence has a least upper bound in the real numbers)?
1:30 how do you know which half contains infinitely many elements of the sequence? That's a rather large step in the proof.
You don't have to know.
Since we are given a sequence, and a sequence is by definition a mapping from N to R, there are infinitely many sequence members x_1, x_2, ..., x_n, ...
Also suppose the sequence is bounded by [c, d].
How do we choose a half after we bisect [c, d]? That is, do we choose [c, (c+d)/2] or [(c+d)/2, d]?
Consider the left half. Either it contains infinitely many sequence members x_i, in which case we pick it and we are done with the choice, or it does not. (This is true by logic: either p or not p.) If the left half does not contain infinitely many x_i, then choose the right half. Automatically we know there must be infinitely many x_i in the right half. Why? Because otherwise, if there were only finitely many x_i in the right half, the original sequence would have only finitely many terms (since we assumed the left half contains only finitely many x_i), which is a contradiction.
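To see the whole construction in one place, here is a small sketch of the bisection argument run on a concrete bounded sequence (a_n = (-1)^n (1 + 1/n) is just an arbitrary example; the counting threshold is a crude stand-in for "infinitely many", since a computer can only inspect finitely many terms). It keeps a half that still contains many members and then picks the smallest index larger than the previous one, so the subsequence indices are strictly increasing.

```python
# A minimal sketch of the bisection construction, run on the concrete bounded
# sequence a_n = (-1)^n * (1 + 1/n) (an arbitrary example with accumulation
# values -1 and +1).  "Infinitely many members in a half" cannot be verified by
# a computer, so we approximate it by counting among the first N terms; this
# only illustrates the idea, it is not the proof itself.

def a(n):
    return (-1) ** n * (1 + 1 / n)

N = 100_000                   # how far we look (stand-in for "infinitely many")
c, d = -2.0, 2.0              # a_n lies in [c_0, d_0] = [-2, 2] for all n
n_k = 0                       # last chosen index; indices must strictly increase
subsequence = []

for k in range(1, 11):
    mid = (c + d) / 2
    left_count = sum(1 for n in range(1, N + 1) if c <= a(n) <= mid)
    # Keep the half that (apparently) contains infinitely many members.
    if left_count > N // 4:   # crude numerical stand-in for "infinitely many"
        d = mid
    else:
        c = mid
    # Pick the smallest index larger than the previous one landing in [c_k, d_k].
    n_k = next(n for n in range(n_k + 1, N + 1) if c <= a(n) <= d)
    subsequence.append((n_k, a(n_k)))

for n, value in subsequence:
    print(f"n_k = {n:4d},  a_(n_k) = {value:+.6f}")
# The indices n_k strictly increase and a_(n_k) approaches the accumulation
# value -1, which lies in every interval [c_k, d_k] of this run.
```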
Hey, What book do you follow (and/or suggest) for real analysis part?
I don't follow a particular one, but I can suggest "Introductory Real Analysis" by A. N. Kolmogorov.
Fine, I'll check that.
Thanks :)
Does the definition of a_n_k at the end of the proof rely on the axiom of choice?
Probably not but since I assume it in the foundations, we can just use it here. Of course, your question is useful if you want to see where the axiom of choice is actually needed.
In order to avoid the axiom of choice, you have to make the "choice" how to define a_n_k more precise.
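For instance (my own formulation, not taken from the video), one choice-free way to fix the indices is to always take the smallest admissible one; the set below is a non-empty subset of N, so it has a minimum by well-ordering:

```latex
% Explicit, choice-free selection of the subsequence indices:
% the index set is a non-empty subset of \mathbb{N}, hence has a minimum.
n_{k+1} := \min\{\, n \in \mathbb{N} : n > n_k \ \text{and}\ a_n \in [c_{k+1}, d_{k+1}] \,\}
```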
Could anyone tell me why we can define the subsequence with a_n_k belonging to [c_k, d_k]? See 4:34.
I ask because I think not every bounded sequence has a subsequence with a_n_k in [c_k, d_k].
The interval always contains infinitely many sequence members. That is how we choose c_k and d_k. Does this help?
@@brightsideofmaths Thank you for your reply! This is very smart! I think I understand now.
if the sequence elements are chosen at random from an interval does the sequence have infinitely many accumulation points? or do we call it an accumulation interval?
"Random" can mean a lot of things :) Which distribution do you choose?
Anyway: You get out a sequence in the end. It could have few or many accumulation values. The question would then be: What is the probability?
@@brightsideofmaths Yes, I was thinking of the uniform distribution in this case; however, my statistics knowledge is very crude (limited to one semester of biostatistics), and hopefully your playlist on probability will smooth out some of the gaps in my understanding!
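A small simulation sketch on top of that (my own illustration, with an arbitrary seed, sample size and bin count): for terms drawn independently and uniformly from [0, 1], every subinterval keeps getting hit, which reflects the fact that such a sequence is almost surely dense in [0, 1], so with probability 1 every point of the interval is an accumulation value.

```python
import random

# Simulation sketch: draw terms independently and uniformly from [0, 1] and
# count how often each of 20 equal subintervals is visited.  Every bin keeps
# being hit, matching the fact that an i.i.d. uniform sequence is almost
# surely dense in [0, 1].
random.seed(0)            # arbitrary seed, only for reproducibility
N = 100_000               # arbitrary number of terms
counts = [0] * 20
for _ in range(N):
    x = random.random()
    counts[min(int(20 * x), 19)] += 1
print(counts)             # each bin is visited roughly N/20 = 5000 times
```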
❤️🤸🙌🤗
Your proof is incomplete. A subsequence requires that the indices n_k are increasing. So, at each step, you have the restriction n_{k+1} > n_k.
That is exactly included in the definition of a subsequence.
You are presenting a proof that you didn’t discover, for students that are trying to learn. But you omit an important step. So, you are not helping.
@@videolome I really don't understand what you mean. We've defined subsequences in Part 9.
Because [c_k, d_k] always contains infinitely many sequence members, we can always find an a_n_k within this interval whose index n_k is greater than n_{k-1}.