- 280 videos
- 653,235 views
Advanced LAFF
Added 25 Jun 2019
Authoring for Active Learning
Quick video to share our experiences with online courses. Intended audience: other instructors at UT.
This was a hastily created video, so don't judge our course by the quality of it!
For information on Advanced Linear Algebra: Foundations to Frontiers, see ulaff.net
A representative review by a learner:
"It was a very unique experience which I will never forget. I felt more engaged and involved than any other online or even on-campus class I have ever attended. Thanks Robert, Maggie, and TAs for making a memorable experience and providing a env to learn and grow. The structure, exercises, leaving room for thinking about a problem by pausing/cutting videos into two and may other things, pushe...
Views: 758
Videos
High performance Implementation of Cholesky Factorization
1.8K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.1.1 Subspace Iteration, implementation
1.5K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.3.6 Putting it all together, Part 2
1.8K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.1.1 Power Method, implementation
1.4K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.3.5 Implicitly shifted QR algorithm
3.6K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.1.1 Power Method to compute second eigenvalue, implementation Part 2
786 views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.1.1 Power Method to compute second eigenvalue, implementation Part 1
1.7K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
9.2.1 Gershgorin Disk Theorem, Part 2
2K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
9.2.1 Gershgorin Disk Theorem, Part 1
4K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
11.2.3 Reduction to bidiagonal form
2.9K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
11.3.3 One-sided Jacobi's method
1.7K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
11.3.2 Jacobi's method, Part 2
1.6K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
11.3.2 Jacobi's method, Part 1
2.4K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
11.3.1 Jacobi rotation
2.6K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers. Robert van de Geijn and Maggie Myers. For more information: ulaff.net
11.2.4 Implicitly shifted bidiagonal QR algorithm
2.3K views · 4 years ago
11.2.2 A strategy for computing the SVD, Part 3
1.3K views · 4 years ago
11.2.2 A strategy for computing the SVD, Part 2
1.1K views · 4 years ago
11.2.2 A strategy for computing the SVD, Part 1
1.3K views · 4 years ago
11.2.1 Computing the SVD from the Spectral Decomposition, Part 2
952 views · 4 years ago
11.2.1 Computing the SVD from the Spectral Decomposition, Part 1
1.1K views · 4 years ago
11.1.1 Linking the SVD to the Spectral Decomposition
2.1K views · 4 years ago
10.3.6 Putting it all together, Part 1
1.3K views · 4 years ago
10.3.3 Simple tridiagonal QR algorithm
1.8K views · 4 years ago
10.3.1 Reduction to tridiagonal form
3.2K views · 4 years ago
10.2.2 Simple shifted QR algorithm, Part 1
3.1K views · 4 years ago
Dear Professor, this course material is very helpful, however I am still not 100% clear about the cos(theta) part. In a realistic case, how can we design the matrix A such that the cos(theta) will be small enough?
thanks for this informative video
thank you big guy !
great
That was so clear thank you
Minor error on the right side, second-to-last bullet point: it should be dim(N(\lambda I - A)) = 0. The null space notation N(·) was missed out.
Life saver!! Fantastic lecture, Professor. I have a graduate exam in numerical analysis and I couldn't quite grasp the intuition behind Householder QR Factorization. Truly amazing! Best, from UWaterloo.
Very clear explanation! I'm implementing a tiny linear algebra library and this channel has been super helpful to me in the process 💗
Thanks, you cooked me up real good
Thanks sir <3
explained it very well!!!
how did he go from 'v', which was a row matrix, to a column matrix at 2:55?
how does the mirroring onto the standard basis relate to the upper triangular matrix?
if one knows the 2-norm of x and the standard basis vector e, then beta·e directly gives the required mirror vector along e, right? Then why do we need to represent the "mirror" operation as a matrix in terms of u in the first place?
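For readers following these Householder comments, here is a minimal NumPy sketch (all variable names are illustrative, not from the course materials) of the reflector H = I - 2uuᵀ/uᵀu and how it maps x onto a multiple of e₁:

```python
import numpy as np

# Illustrative vector; ||x|| = 5.
x = np.array([3.0, 4.0, 0.0])
e1 = np.zeros_like(x)
e1[0] = 1.0

# Target is beta * e1; the sign choice avoids cancellation when forming u.
beta = -np.sign(x[0]) * np.linalg.norm(x)
u = x - beta * e1                  # u is normal to the "mirror" hyperplane

# Householder reflector: symmetric, orthogonal, and H @ x == beta * e1.
H = np.eye(len(x)) - 2.0 * np.outer(u, u) / (u @ u)

print(np.round(H @ x, 10))         # [-5.  0.  0.]
```

The matrix form matters because in QR factorization the same H must also be applied to all the remaining columns of the matrix, not just to x.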
Why is it * and not + ?
why does it make no sense at all?
Which part do you not understand? I think the proof is quite straightforward. Although at 4:23 I think what Prof. Robert may have meant (I could be wrong) is that the general strategy is to show that if max f(x) <= alpha and max f(x) >= alpha, then the only way these two inequalities can be satisfied is if max f(x) = alpha (i.e. think of it as max f(x) has been "sandwiched" to alpha). That was how he derived that ||A||_2 = max(d_1, d_2), where A is a 2-by-2 diagonal matrix with d_1 and d_2 as entries.
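A quick numerical check of the claim in that comment (NumPy; the diagonal entries are illustrative values I chose, not from the lecture):

```python
import numpy as np

# For a diagonal matrix, the 2-norm (largest singular value)
# equals the largest |d_i|.
A = np.diag([3.0, -5.0])
two_norm = np.linalg.norm(A, 2)
print(two_norm)  # 5.0
```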
Very good explanation.
Amazing, I was struggling with this one for the last two days.
Thank you
amazing
Frobeanius is making me lose all my -marbles- beans.
2-Norm of a matrix is Frobenius Norm?
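A quick NumPy check suggests the answer to that question is no in general; the two norms differ even for the identity matrix (example values are mine, not from the video):

```python
import numpy as np

# Frobenius norm vs. 2-norm (largest singular value) of the 2x2 identity.
A = np.eye(2)
fro = np.linalg.norm(A, 'fro')   # sqrt(1 + 1) = sqrt(2)
two = np.linalg.norm(A, 2)       # largest singular value = 1
print(fro, two)
```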
Well, I didn't get it from the first problem. Should I know LU decomposition first to understand it? If it were in the form of a normal matrix I could have solved it. Can anyone suggest something, please?
what do you mean by LU decomposition?
great explainer
Thank you for this very clear and comprehensive explanation!
Splendid! I wish I could like the video more than once!
Thanks! You are a saviour for tomorrow's Matrix Computation quiz.
It's not good. Why don't you write something on the board?!
Nice class, thanks!
I love how simple, straight to the point, and short his explanation is. Thank you sir!
Hi Professor van de Geijn, is it possible to generalize this inequality when A is rank-deficient?
This video was actually amazing! Wow, needs more exposure. Thanks professor!
did not help santa
Nice, very well explained!
Thank you teacher!
This is what I wanted. Amazing sir.
Thank you!!!
The best explanation of Householder QR algorithm
Best explanation I saw until now. Thanks!!
helpful👍
excellent explanation. Thanks
You are an absolute fooookin legend.. Whoever this guy Robert van de Geijn is, I want to say thanks lol
Great explanation Sir 👏
There's a little-known way to visualize subspaces, or vector spaces. You can stretch the width of the x axis, for example, in the right line of a 3D stereo image, and also get depth, as shown below. L R |____| |______| This is because the z axis uses x to get depth. Which means that you can get double depth in the image... 4D depth??? :O P.S. You're a good teacher!
What a brilliant video !! Thank you so much. had been looking everywhere to extend my 3x3 imagination and here you give it beautifully. thank you so much !!
Thank you very much! Very good explanation! 👍
different intro tune. liked it!
Is it easy to code such an algorithm? Maybe you have some references I could look at? I am interested in the SVD of the covariance matrix in order to approximate a rigid body's basis to calculate rotations with respect to the XYZ axes. I have been going round in circles with no success, the main problem being that the three eigenvectors constantly change direction as the object is rotated, so it's impossible to compute smooth rotations. I tried QR, the Power Method, Euler angles, and Quaternions with similar results, hence no success. I am hoping that the Polar Decomposition addresses the issue but am not so sure. Any help will be appreciated.
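One possible sketch of the polar-decomposition route this commenter mentions (NumPy; the function name and example values are mine, and this is only a starting point, not a full answer to the smooth-rotation problem):

```python
import numpy as np

def polar_rotation(A):
    """Nearest rotation to A via the SVD: A = U S Vt, polar factor R = U @ Vt."""
    U, S, Vt = np.linalg.svd(A)
    R = U @ Vt
    if np.linalg.det(R) < 0:      # flip one column to avoid a reflection
        U[:, -1] = -U[:, -1]
        R = U @ Vt
    return R

# Example: A = (true rotation) @ (symmetric positive definite stretch);
# the polar factor recovers the rotation exactly.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
A = R_true @ np.diag([2.0, 1.0])
R = polar_rotation(A)
print(np.allclose(R, R_true))  # True
```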
In short, the reduced form just means that we leave out the irrelevant 0s from the sigma matrix, and its corresponding parts in the other two? Sounds simple enough.
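That reading of the reduced SVD can be checked directly in NumPy (the matrix here is an illustrative random example): the economy form shrinks the factors while still reconstructing A exactly.

```python
import numpy as np

# Full vs. reduced ("economy") SVD shapes for a tall 5x3 matrix.
A = np.random.default_rng(0).standard_normal((5, 3))
U_full, s_full, Vt_full = np.linalg.svd(A, full_matrices=True)
U_red, s_red, Vt_red = np.linalg.svd(A, full_matrices=False)

print(U_full.shape, U_red.shape)  # (5, 5) (5, 3)
# The reduced factors still reproduce A:
print(np.allclose(U_red @ np.diag(s_red) @ Vt_red, A))  # True
```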
Amazing! Thanks for this course!