Lecture 3 | Convex Optimization I (Stanford)

  • Published: 16 Nov 2024

Comments • 59

  • @shiv093
    @shiv093 5 years ago +110

    10:10 Convex functions
    12:50 Examples on R
    15:23 Examples on R^n and R^mxn
    20:09 Restriction of a convex function to a line
    28:43 Extended-value extension
    31:09 First order condition
    35:39 Second-order conditions
    37:20 Examples
    49:38 Epigraph and sublevel set
    52:10 Jensen's inequality
    57:21 Operations that preserve convexity
    59:17 Positive weighted sum & composition with affine function
    1:02:05 Pointwise maximum
    1:04:39 Pointwise Supremum
    1:08:13 Composition with scalar functions
    1:13:31 Vector composition

    • @nileshdixit9672
      @nileshdixit9672 4 years ago +1

      You really deserve more than just likes

    • @shiv093
      @shiv093 4 years ago +4

      @@nileshdixit9672 Honestly, I put them so that it helps me revise topics quickly. Happy that it is helping others too.

    • @mariomariovitiviti
      @mariomariovitiviti 4 years ago

      keep liking this one to keep it up

  • @jamalahmedhussein1341
    @jamalahmedhussein1341 10 years ago +130

    start from 10:10

    • @rogeryau3115
      @rogeryau3115 7 years ago +2

      No problem. I usually play at 1.5 speed so I will get ready after 10 minutes.

  • @jackeown
    @jackeown 4 years ago +21

    In case anyone else was confused: at 41:20 the "softmax" he describes there is different from the "softmax" in deep learning. The deep learning softmax should probably be called something like "softargmax" instead.
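
    For anyone who wants to see the difference concretely, here is a minimal sketch assuming NumPy (the example values are made up): Boyd's "softmax" is the smooth log-sum-exp approximation of the max, while the deep-learning one returns a probability vector.

      import numpy as np

      def log_sum_exp(x):
          # Boyd's "softmax": a smooth, convex approximation of max(x).
          m = np.max(x)                    # shift for numerical stability
          return m + np.log(np.sum(np.exp(x - m)))

      def softargmax(x):
          # Deep-learning "softmax": a smooth stand-in for argmax (one-hot-ish).
          e = np.exp(x - np.max(x))        # same stability shift
          return e / e.sum()

      x = np.array([1.0, 2.0, 5.0])
      print(log_sum_exp(x))   # ~5.07, close to max(x) = 5
      print(softargmax(x))    # probability vector peaked at index 2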

  • @samw.6550
    @samw.6550 6 years ago +17

    15:50 Norm
    18:00 trace (inner product)
    28:43 Extended-value extension
    31:12 differentiable functions

  • @jaimelima2420
    @jaimelima2420 3 years ago +2

    The real 'inspirational' essence is what he says at 32:50 and keeps saying for two minutes or so. Thanks for sharing.

  • @benjamingoldstein14
    @benjamingoldstein14 3 days ago

    Professor Boyd mentions around 20:00 that the spectral norm of a matrix X is a very complicated function of X, namely the square root of the largest eigenvalue of X^T X. I would mention, however, that this norm has a very simple geometric interpretation: it is the maximum factor by which X can "stretch" a vector through multiplication. Just as the largest eigenvalue of a matrix is the maximum factor by which the matrix can stretch a vector if you don't allow for rotation, the largest singular value is the most that the matrix can stretch any vector if you do allow for rotation. It therefore also has the interpretation of the length of the longest semi-axis of the ellipsoid that is the image of the unit L2 ball under (left) multiplication by X.
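
    A quick numerical illustration of that stretch interpretation (a sketch assuming NumPy; the matrix is just random):

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.standard_normal((3, 3))

      # Spectral norm: largest singular value, i.e. sqrt of the largest
      # eigenvalue of X^T X.
      sigma_max = np.linalg.norm(X, 2)

      # Maximum stretch factor ||Xv|| / ||v|| over many random directions v.
      stretch = max(np.linalg.norm(X @ v) / np.linalg.norm(v)
                    for v in rng.standard_normal((10000, 3)))
      print(sigma_max, stretch)   # stretch approaches sigma_max from below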

  • @mhpt74
    @mhpt74 4 years ago +4

    Great teacher and wonderful sense of humor!

  • @personalchannel6382
    @personalchannel6382 11 years ago +6

    Professor Boyd is super smart and definitely a researcher who has done a large number of rigorous proofs. But even then, no one comes close to David Snider, my professor at the University of South Florida, in conveying mathematics to engineering students. Not even Prof. Boyd.
    Snider is retired now; he wrote all the math books we studied in grad school.
    This course is awesome, though :). I love its rigor; it is very helpful for PhD students coming up with proofs for their theorems.

    • @izzyece707
      @izzyece707 9 years ago +2

      abuhajara do you suggest a specific book that conveys the material well?

  • @JTMoustache
    @JTMoustache 1 year ago

    Composition rule mnemonic (for f = h ∘ g):
    1) The rule only tells you when f has the SAME curvature as h; there is no rule making f the opposite of h.
    2) h has to be monotone.
    3) The direction of monotonicity is an equality test on the curvatures: if the curvature of g matches that of h, h should be increasing; otherwise decreasing.
    Outside of that there is no simple rule. (A quick numerical sanity check follows below.)
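
    Here is that sanity check: a sketch assuming NumPy, using midpoint convexity on a grid (which can refute convexity but not prove it).

      import numpy as np

      def midpoint_convex(f, xs):
          # Crude test: f((x+y)/2) <= (f(x)+f(y))/2 for all sample pairs.
          return all(f((x + y) / 2) <= (f(x) + f(y)) / 2 + 1e-9
                     for x in xs for y in xs)

      xs = np.linspace(-2.0, 2.0, 41)

      # h = exp (convex, increasing), g = x^2 (convex): rule says h(g(x)) is convex.
      print(midpoint_convex(lambda x: np.exp(x**2), xs))    # True

      # h = exp(-u) (convex, decreasing), g = x^2 (convex): rule gives no
      # conclusion, and indeed exp(-x^2) is not convex.
      print(midpoint_convex(lambda x: np.exp(-x**2), xs))   # False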

  • @grunder20
    @grunder20 13 years ago +3

    Craving more of this kind of stuff.

  • @emilywong4601
    @emilywong4601 6 years ago +3

    I studied optimization, including linear algebra techniques, at Golden Gate University in a major called Computational Decision Analysis from 1999 to 2003. We used SAS, Unix, and Excel Solver.

  • @abhimanyu3244
    @abhimanyu3244 6 years ago +2

    Thank you, Prof. Boyd!

  • @DarkDomnu
    @DarkDomnu 12 years ago +2

    this guy is a winner.

  • @moxopal5681
    @moxopal5681 3 years ago +2

    1:05:50 This is extremely useful to know if you are studying control theory.

  • @rogeryau3115
    @rogeryau3115 7 years ago +3

    2:06 That blink, that grimace.

  • @rabeamahfoud3225
    @rabeamahfoud3225 7 years ago +16

    I'm a Ph.D. student in electrical engineering, studying this book by myself. These lectures are so helpful for getting started with convex optimization. How can I get the homework Professor Boyd is talking about?

    • @guoweih7339
      @guoweih7339 5 years ago +11

      You can find the textbook, assignments, and solutions on this page: see.stanford.edu/Course/EE364A/94

    • @revooshnoj4078
      @revooshnoj4078 5 years ago +3

      @@guoweih7339 thanks man

    • @happydrawing7309
      @happydrawing7309 5 years ago +1

      @@guoweih7339 thank you sir.

    • @rodfloripa10
      @rodfloripa10 3 years ago +1

      Did they change the assignments, since the answers are available?

    • @kenahoo
      @kenahoo 2 years ago +4

      @@rodfloripa10 No - students just have to realize that at this level, assignments are a tool for learning, not a tool for getting grades.

  • @7nard
    @7nard 15 years ago +2

    Good lectures. However, if you are just dropping by like me and want to skip the chatter about admin and classroom issues, start at the 10:10 mark.

  • @engr.aliarsalan2628
    @engr.aliarsalan2628 7 years ago +2

    Really Helpful.

  • @gamalzayed2247
    @gamalzayed2247 11 months ago

    Thank you for this nice lecture ❤

  • @annawilson3824
    @annawilson3824 1 year ago

    24:17 if you have no idea whether the function is convex or not: generate a few lines, plot, and look!
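
    That trick is easy to automate; a minimal sketch assuming NumPy and matplotlib, with log-sum-exp standing in for the function under test:

      import numpy as np
      import matplotlib.pyplot as plt

      # f is convex iff g(t) = f(x + t*v) is convex for every x and v, so
      # plotting g along a few random lines is a quick (informal) visual check.
      f = lambda z: np.log(np.sum(np.exp(z)))   # log-sum-exp, known convex

      rng = np.random.default_rng(0)
      t = np.linspace(-3.0, 3.0, 200)
      for _ in range(5):
          x, v = rng.standard_normal(4), rng.standard_normal(4)
          plt.plot(t, [f(x + ti * v) for ti in t])
      plt.xlabel("t"); plt.ylabel("g(t) = f(x + t v)")
      plt.show()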

  • @manueljenkin95
    @manueljenkin95 2 years ago +1

    13:58, the domain restriction to R++ is important. x^3 is not convex on R, I think.
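
    For the record, the second-derivative test confirms this:

      f(x) = x^3, \qquad f''(x) = 6x < 0 \text{ for } x < 0,

    so x^3 is convex on R_{++} (indeed on R_+) but not on all of R.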

  • @ismailelezi
    @ismailelezi 7 years ago +5

    I feel like a noob. I am understanding the main points, but still, the examples are totally non-obvious.
    Of course, it is a '300' class, so I should have expected that.

  • @janiceliu5473
    @janiceliu5473 3 years ago +1

    Shouldn't the determinant of the Hessian of the quadratic-over-linear function be exactly 0, rather than just greater than or equal to 0? 41:00
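
    Yes: for f(x,y) = x^2/y with y > 0 the Hessian is a rank-one positive semidefinite matrix, so its determinant is exactly zero,

      \nabla^2 f(x,y) = \frac{2}{y^3}
      \begin{bmatrix} y^2 & -xy \\ -xy & x^2 \end{bmatrix}
      = \frac{2}{y^3} \begin{bmatrix} y \\ -x \end{bmatrix}
                      \begin{bmatrix} y \\ -x \end{bmatrix}^T \succeq 0,
      \qquad \det \nabla^2 f(x,y) = 0,

    and the "greater than or equal to 0" on the slide is correct, just not tight.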

  • @shiladityabiswas2803
    @shiladityabiswas2803 4 years ago +3

    if Ryan Reynolds became a prof

  • @Alex-if6mv
    @Alex-if6mv 10 months ago

    He says 'very painful' as if he knows a lo-o-ot about pain! 😀

  • @anamericanprofessor
    @anamericanprofessor 6 years ago +1

    Is there an active link to the class notes that are presented in these lectures? It would be more leisurely to watch the videos and then write down notes afterwards.

    • @fexbinder
      @fexbinder 5 years ago +1

      web.stanford.edu/~boyd/cvxbook/ The book and the lecture slides are publicly available on the Stanford website.

  • @cherishnguyen506
    @cherishnguyen506 8 years ago +2

    I understand that Ax = λx makes λ an eigenvalue of A. So how can X^(-1/2) V X^(-1/2) have λ as its eigenvalue?

    • @akshayramachandran7857
      @akshayramachandran7857 5 years ago +1

      Didn't understand your question. Is it:
      1. Why is the eigenvalue of X the same as that of X^(-1/2) V X^(-1/2)?
      OR
      2. Why does an eigenvalue exist for X^(-1/2) V X^(-1/2)?
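
      In case it helps later readers: the λ_i in that example are the eigenvalues of X^(-1/2) V X^(-1/2) (a symmetric matrix, so its eigenvalues exist and are real), not of X. They enter through the factorization used to restrict log det to a line, valid for X ≻ 0:

        X + tV = X^{1/2}\bigl(I + t\,X^{-1/2} V X^{-1/2}\bigr) X^{1/2}
        \;\Longrightarrow\;
        \log\det(X + tV) = \log\det X + \sum_{i=1}^{n} \log(1 + t\lambda_i).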

  • @Abuwesamful
    @Abuwesamful 7 years ago +2

    Someone is asking what diag(z) is (the diagonal matrix whose diagonal entries are the components of z). How can the professor be sure the students are following if they do not actually understand such a primitive item in that equation?

  • @hayderatrah
    @hayderatrah 11 years ago +11

    The pace is ridiculously fast. The book (which is well written) must be read first before one can cope with these videos.

  • @000HakunaMatata000
    @000HakunaMatata000 12 years ago +2

    @10:20

  • @rashilalamichhane9750
    @rashilalamichhane9750 2 years ago

    49:48 epigraphs

  • @DarkDomnu
    @DarkDomnu 12 years ago +2

    hilarious prof.

  • @summerland232
    @summerland232 12 years ago +2

    A lefty :D

  • @saiftazir
    @saiftazir 6 years ago +1

    Extended-value extensions: ruclips.net/video/kcOodzDGV4c/видео.html

  • @muratcan__22
    @muratcan__22 6 years ago +2

    32:00 wow, damn

  • @learningsuper6785
    @learningsuper6785 7 years ago +12

    This guy is the kind of professor I would avoid taking classes from at all costs. He spends too much time talking about things that don't help you understand the subject, like debating with himself whether a concept is obvious, and there's a lot of hand-waving where he really should have drawn a graph on a piece of paper.
    His Coursera convex optimization course is even worse. I'd recommend reading a book rather than watching his videos to learn the subject.

    • @maxwellstrange4572
      @maxwellstrange4572 6 years ago +13

      I think he's super entertaining; he makes the material make sense and seem interesting when it would otherwise be dry.

    • @ElektrikAkar
      @ElektrikAkar 6 years ago +3

      Like it or not, he is the superhero of convex optimization and has the best material. Even the trivial examples he gives may help broaden one's perspective.

    • @sridharthiagarajan479
      @sridharthiagarajan479 6 years ago +7

      Disagree, it's quite entertaining, engaging and doesn't skimp on the important stuff

    • @wuzhai2009
      @wuzhai2009 5 years ago +1

      Disagree. Knowing that it is hard to make things precise shows that he knows way too much. At the points where you say he is 'hand-waving', I suggest you delve deeper, and you will appreciate why he said what he said.

  • @daweiliu6452
    @daweiliu6452 7 years ago +1

    The first ten minutes is total crap.