Advanced LAFF
  • Videos: 280
  • Views: 653,235
Authoring for Active Learning
Quick video to share our experiences with online courses. Intended audience: other instructors at UT.
This was a hastily created video, so don't judge our course by the quality of it!
For information on Advanced Linear Algebra: Foundations to Frontiers, see ulaff.net
A representative review by a learner:
"It was a very unique experience which I will never forget. I felt more engaged and involved than in any other online or even on-campus class I have ever attended. Thanks Robert, Maggie, and TAs for making a memorable experience and providing an env to learn and grow. The structure, exercises, leaving room for thinking about a problem by pausing/cutting videos into two and many other things, pushe...
Views: 758

Videos

High-Performance Implementation of Cholesky Factorization
1.8K views · 4 years ago
Advanced Linear Algebra: Foundations to Frontiers, by Robert van de Geijn and Maggie Myers. For more information: ulaff.net
10.1.1 Subspace Iteration, implementation
1.5K views · 4 years ago
10.3.6 Putting it all together, Part 2
1.8K views · 4 years ago
10.1.1 Power Method, implementation
1.4K views · 4 years ago
10.3.5 Implicitly shifted QR algorithm
3.6K views · 4 years ago
10.1.1 Power Method to compute second eigenvalue, implementation Part 2
786 views · 4 years ago
10.1.1 Power Method to compute second eigenvalue, implementation Part 1
1.7K views · 4 years ago
9.2.1 Gershgorin Disk Theorem, Part 2
2K views · 4 years ago
9.2.1 Gershgorin Disk Theorem, Part 1
4K views · 4 years ago
11.2.3 Reduction to bidiagonal form
2.9K views · 4 years ago
11.3.3 One sided Jacobi's method
1.7K views · 4 years ago
11.3.2 Jacobi's method, Part 2
1.6K views · 4 years ago
11.3.2 Jacobi's method, Part 1
2.4K views · 4 years ago
11.3.1 Jacobi rotation
2.6K views · 4 years ago
11.2.4 Implicitly shifted bidiagonal QR algorithm
2.3K views · 4 years ago
11.2.2 A strategy for computing the SVD, Part 3
1.3K views · 4 years ago
11.2.2 A strategy for computing the SVD, Part 2
1.1K views · 4 years ago
11.2.2 A strategy for computing the SVD, Part 1
1.3K views · 4 years ago
11.2.1 Computing the SVD from the Spectral Decomposition, Part 2
952 views · 4 years ago
11.2.1 Computing the SVD from the Spectral Decomposition, Part 1
1.1K views · 4 years ago
11.1.1 Linking the SVD to the Spectral Decomposition
2.1K views · 4 years ago
10.3.4 Implicit Q Theorem, Part 2
787 views · 4 years ago
10.3.4 Implicit Q Theorem, Part 1
1.1K views · 4 years ago
10.3.6 Putting it all together, Part 1
1.3K views · 4 years ago
7.1.1 Poisson's equation, Part 3
598 views · 4 years ago
10.3.3 Simple tridiagonal QR algorithm
1.8K views · 4 years ago
10.3.2 Givens rotations
8K views · 4 years ago
10.3.1 Reduction to tridiagonal form
3.2K views · 4 years ago
10.2.2 Simple shifted QR algorithm, Part 1
3.1K views · 4 years ago

Comments

  • @hannaliu3576 · 15 days ago

    Dear Professor, this course material is very helpful, however I am still not 100% clear about the cos(theta) part. In a realistic case, how can we design the matrix A such that the cos(theta) will be small enough?

  • @gemy6188 · 18 days ago

    thanks for this informative video

  • @henrydl2296 · 1 month ago

    thank you big guy!

  • @keziah8337 · 1 month ago

    great

  • @ibrahimhebib9588 · 1 month ago

    That was so clear thank you

  • @winstonong9593 · 1 month ago

    Minor error on right side second last bullet point. Should be dim(N(\lambda I - A))=0. Missed out the null space notation N( . ).

  • @InoceramusGigas · 1 month ago

    Life saver!! Fantastic lecture, Professor. I have a graduate exam in numerical analysis and I couldn't quite grasp the intuition behind Householder QR Factorization. Truly amazing! Best, from UWaterloo.

  • @__amkhrjee__ · 2 months ago

    Very clear explanation! I'm implementing a tiny linear algebra library and this channel has been super helpful to me in the process 💗

  • @edwardelric8852 · 2 months ago

    Thanks, you cooked me up real good

  • @Unknownfor13 · 2 months ago

    Thanks sir <3

  • @Unknownfor13 · 2 months ago

    explained it very well!!!

  • @gk4539 · 3 months ago

    how did he go from 'v', which was a row matrix, to a column matrix at 2:55?

  • @rav2n · 3 months ago

    how does the mirroring onto the standard basis relate to the upper triangular matrix?

  • @rav2n · 3 months ago

    if one knows the 2-norm of x and the standard basis e, then beta e directly gives the required mirror vector along e, right? Then why do we need to represent the "mirror" operation as a matrix in terms of u in the first place?

  • @rav2n · 3 months ago

    Why is it not + and is * ?

  • @rav2n · 3 months ago

    why does it make no sense at all?

    • @winstonong9593 · 2 months ago

      Which part do you not understand? I think the proof is quite straightforward. Although at 4:23 I think what Prof. Robert may have meant (I could be wrong) is that the general strategy is to show that if max f(x) <= alpha and max f(x) >= alpha, then the only way these two inequalities can be satisfied is if max f(x) = alpha (i.e. think of it as max f(x) has been "sandwiched" to alpha). That was how he derived that ||A||_2 = max(d_1, d_2), where A is a 2-by-2 diagonal matrix with d_1 and d_2 as entries.
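
      The sandwich argument in this reply can be checked numerically. A minimal sketch (not from the course materials; it assumes NumPy, and uses absolute values so that negative diagonal entries are also covered):

      ```python
      import numpy as np

      # For a diagonal matrix A = diag(d1, d2), the 2-norm (the largest
      # singular value of A) equals max(|d1|, |d2|) -- the value the
      # sandwich argument above pins down.
      d1, d2 = 3.0, -5.0
      A = np.diag([d1, d2])

      two_norm = np.linalg.norm(A, 2)   # largest singular value of A
      expected = max(abs(d1), abs(d2))

      print(two_norm, expected)
      ```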

  • @princeardalan · 3 months ago

    Very good explanation.

  • @waqasbaig993 · 3 months ago

    Amazing, I was struggling with this one for the last two days.

  • @mouaadelyalaoui642 · 3 months ago

    Thank you

  • @kiiiko8575 · 4 months ago

    amazing

  • @danser_theplayer01 · 4 months ago

    Frobeanius is making me lose all my -marbles- beans.

  • @PreyumKumar · 4 months ago

    2-Norm of a matrix is Frobenius Norm?

  • @uphadhayay · 4 months ago

    Well, I didn't get it from the first problem. So should I know LU decomposition first to understand it? If it were in the form of a normal matrix I could have solved it. Can anyone suggest, please?

    • @vas_show · 2 months ago

      what do you mean by LU decomposition?

  • @aidanahram8245 · 4 months ago

    great explainer

  • @AnnikaMeyer-fm2qu · 4 months ago

    Thank you for this very clear and comprehensive explanation!

  • @realnameverified416 · 4 months ago

    Splendid! I wish I could like the video more than once!

  • @hiteshkumar5847 · 5 months ago

    Thanks, you are a saviour for tomorrow's Matrix Computation Quiz.

  • @rommeltzer · 5 months ago

    it's not good, why don't you write something on the board?!

  • @crisnc29 · 5 months ago

    Nice class, thanks!

  • @EngSeifHabashy · 5 months ago

    I love how simple, straight to the point, and short his explanation is. Thank you sir!

  • @xiaoweilin8184 · 6 months ago

    Hi Professor van de Geijn, is it possible to generalize this inequality when A is rank-deficient?

  • @cakefactoryy · 7 months ago

    This video was actually amazing! Wow, needs more exposure. Thanks professor!

  • @jackdevereux6254 · 7 months ago

    did not help santa

  • @tech_reforcae_mn · 9 months ago

    Nice, very well explained!

  • @AfsarPervez · 9 months ago

    Thank you teacher!

  • @sattikbiswas2187 · 9 months ago

    This is what I wanted. Amazing sir.

  • @pilot615 · 9 months ago

    Thank you!!!

  • @pilot615 · 9 months ago

    The best explanation of Householder QR algorithm

  • @lirabin · 9 months ago

    Best explanation I saw until now. Thanks!!

  • @ShwethaVme23b073 · 9 months ago

    helpful👍

  • @ViniciusSantos-bn6ov · 10 months ago

    excellent explanation. Thanks

  • @malikialgeriankabyleswag4200 · 10 months ago

    You are an absolute fooookin legend.. Whoever this guy Robert van de Geijn is, I want to say thanks lol

  • @eveleenkaur4349 · 11 months ago

    Great explanation Sir 👏

  • @VolumetricTerrain-hz7civ · 11 months ago

    There's an unknown way to visualize subspaces, or vector spaces. You can stretch the width of the x axis, for example, in the right line of a 3D stereo image, and also get depth, as shown below. L R |____| |______| This is because the z axis uses x to get depth. Which means that you can get double depth in the image.... 4D depth??? :O P.S. You're a good teacher!

  • @mamaslostsoul · 11 months ago

    What a brilliant video!! Thank you so much. Had been looking everywhere to extend my 3x3 imagination and here you give it beautifully. Thank you so much!!

  • @aezo6285 · 11 months ago

    Thank you very much! Very good explanation! 👍

  • @tjeaue · 1 year ago

    different intro tune. liked it!

  • @markdebono1273 · 1 year ago

    Is it easy to code such an algorithm? Maybe you have any references I could look at? I am interested in the SVD of the covariance matrix in order to approximate a rigid body's basis to calculate rotations with respect to the XYZ axes. I have been going round in circles with no success, the main problem being that the three eigenvectors constantly change direction as the object is rotated, so it's impossible to compute smooth rotations. Tried QR, the Power Method, Euler angles, and Quaternions with similar results, hence no success. I am hoping that the Polar Decomposition addresses the issue but am not so sure. Any help will be appreciated.

  • @Wortigon2000 · 1 year ago

    In short, the reduced form just means that we leave out the irrelevant 0s from the sigma matrix, and its corresponding parts in the other 2? Sounds simple enough.
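
    This is not from the videos themselves, but the shape difference the comment describes can be seen in a quick NumPy sketch:

    ```python
    import numpy as np

    # A tall 5x3 matrix: in the full SVD, U is 5x5 and Sigma is 5x3 with
    # two all-zero bottom rows; the reduced form drops those zero rows
    # and the matching trailing columns of U.
    A = np.arange(15, dtype=float).reshape(5, 3)

    U_full, s_full, Vt_full = np.linalg.svd(A, full_matrices=True)
    U_red, s_red, Vt_red = np.linalg.svd(A, full_matrices=False)

    print(U_full.shape, U_red.shape)  # (5, 5) vs (5, 3)

    # Both variants reconstruct A; the reduced one just carries less.
    A_rebuilt = U_red @ np.diag(s_red) @ Vt_red
    ```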

  • @sadegh5429 · 1 year ago

    Amazing! Thanks for this course!