Hello, Professor. Thank you for your lectures. Will they cover classical combinatorial optimization problems such as the TSP, the VRP, or scheduling problems? I would like to learn more about these for my future studies. Thank you!
Hi Prof, in the proof that x^T A x >= 0 for PSD matrices, why can we write x as a linear combination of all the eigenvectors of A? Is it always true that an n-by-n PSD matrix has n perpendicular eigenvectors?
Yes, exactly. Any symmetric matrix (positive semidefinite or not) has an orthonormal basis of eigenvectors, so we can express any vector as a linear combination of them.
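A quick numerical sketch of this step (the 2-by-2 matrix A and the vector x below are my own toy example, not from the lecture): expand x in the orthonormal eigenbasis of A and check that x^T A x equals the sum of lambda_i * c_i^2, which is nonnegative when all eigenvalues are nonnegative.

```python
import numpy as np

# Toy symmetric PSD matrix (eigenvalues 1 and 3, both >= 0)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues and an orthonormal set of eigenvectors
# (as the columns of V) for any symmetric matrix
eigvals, V = np.linalg.eigh(A)

# Orthonormality: V^T V = I
assert np.allclose(V.T @ V, np.eye(2))

# Expand an arbitrary x in the eigenbasis: x = sum_i c_i v_i
x = np.array([0.7, -1.3])
c = V.T @ x

# x^T A x computed directly vs. via sum_i lambda_i * c_i^2
quad_direct = x @ A @ x
quad_spectral = np.sum(eigvals * c**2)
assert np.allclose(quad_direct, quad_spectral)
```

Since every term lambda_i * c_i^2 is nonnegative when the lambda_i are, the quadratic form is nonnegative for every x.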
Does the proof of quadratic-form semidefiniteness still hold when the matrix A has repeated eigenvalues? In that case, can any x still be written as a linear combination of eigenvectors?
The key property of a symmetric matrix is that it can be written as a sum of scaled outer products of orthogonal vectors: M = \sum_i \lambda_i v_i v_i^T, where the v_i are orthogonal to each other. Repeated eigenvalues are fine: the corresponding eigenspace just has dimension greater than one, and we can still choose an orthonormal basis for it. Is that your question?
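To make the decomposition concrete, here is a small sketch (the 3-by-3 symmetric matrix M is my own example) that rebuilds M from the sum of scaled outer products of its orthonormal eigenvectors:

```python
import numpy as np

# Example symmetric matrix (my own choice, for illustration)
M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Orthonormal eigenvectors are the columns of V
eigvals, V = np.linalg.eigh(M)

# Spectral decomposition: M = sum_i lambda_i * v_i v_i^T
M_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, V.T))
assert np.allclose(M, M_rebuilt)
```

The reconstruction works regardless of whether eigenvalues repeat, since `eigh` always returns a full orthonormal set of eigenvectors for a symmetric matrix.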
The way you drew the span of x and y is misleading: it suggests the span is only the first and third quadrants of the xy-plane, when it is actually the whole plane.
29:24 should it be det M not equal to 0?
Yes, thank you! That's a typo; it should be \det M \neq 0.
good point... I need to get some better drawing skills...!
@constantine.caramanis Great video, really appreciate it, thanks!
Sheldon Axler offers free lectures on Linear Algebra Done Right on YouTube. No more confusion about linear algebra! Let's finish it :p