Bjarne Stroustrup - The Essence of C++

  • Published: 23 Nov 2024

Comments • 767

  • @napat9
    @napat9 4 years ago +397

    7:20 - B. Stroustrup introduction
    11:40 - What Stroustrup keeps in mind about C/C++
    16:28 - Timeline history of programming languages
    25:00 - Resource management
    29:50 - Pointer misuse
    31:34 - Resource handles and pointers
    34:30 - Error handling and resources
    35:25 - Why do we use pointers?
    41:10 - Move semantics
    43:50 - Garbage collection
    47:06 - range-for, auto, and move
    49:16 - Class hierarchies
    51:58 - OOP/Inheritance
    54:10 - Generic Programming: Templates
    59:48 - Algorithms
    1:01:10 - Function objects and lambdas
    1:03:08 - Container algorithms
    1:03:42 - C++14 concepts
    1:10:32 - "Paradigms"
    1:13:56 - C++ challenges
    1:15:28 - C++ more information
    1:17:00 - Questions and answers

  • @vimalk78
    @vimalk78 10 years ago +1145

    the essence starts at 7:27

    • @deusbuda
      @deusbuda 9 years ago

      vima78 \o/

    • @Bastro3000
      @Bastro3000 9 years ago +1

      +vima78 Yeaahhh!!!

    • @asterwolf805
      @asterwolf805 9 years ago

      +vima78 Thanks. :D

    • @bra5081
      @bra5081 9 years ago +38

      +vima78 Unless you wanna know where the fire extinguishers are, in case of fire :p

    • @Bastro3000
      @Bastro3000 9 years ago +4

      Brad Haircut LOL

  • @bartolo5
    @bartolo5 3 years ago +35

    Been reading a lot of Stroustrup C++ books and I love how crystal clear his explanations are. For every "but how would you...?" that pops up in your head he explains it right away in detail.

  • @بهنامگلی
    @بهنامگلی 4 years ago +13

    I feel he is a gentleman, and besides his efforts and talents, his sincerity, honesty and simplicity are great.
    I started learning C++ this year; it has been 7 months now.
    And within those 7 months I read almost 7 reference books, and after all of them I read his book. It was a very good experience and I think I am better now at designing and thinking than I was, and the only thing that helps me keep moving forward is the solemn desire to solve one of humanity's problems and make life a better place for everyone.

  • @maxwellstrange4572
    @maxwellstrange4572 6 years ago +137

    Such a smart guy, I love that he's concerned with the language being approachable to casual users as well.

    • @IamusTheFox
      @IamusTheFox 6 years ago +6

      Maxwell Strange look at what language he started with.

    • @charlyRoot
      @charlyRoot 4 years ago +3

      ahk is approachable. Cpp is very not.

    • @blatrump
      @blatrump 4 years ago +5

      Uhm, yes, so damn concerned that he and the C++ committee did not solve this with a more than 20-year head start.

    • @chastitywhiterose
      @chastitywhiterose 4 years ago

      Yeah I like Bjarne and listen to all his talks.

    • @firewallfighter5598
      @firewallfighter5598 3 years ago

      89888889998889888888

  • @TopShelfization
    @TopShelfization 3 years ago +18

    I cannot thank this man enough for creating the only language I can comfortably express myself in. The only language that comes close (for me) is probably ECMAScript.

  • @greatsea
    @greatsea 9 years ago +96

    I'll just throw this out there. Recently I started learning C++ and some C after having achieved some degree of proficiency in VBA Excel and Python and here is the deal: If you are serious about learning the 'science' of programming, that is programming at the conceptual level, then choose either C or C++ as at least one of the languages you are learning. ((I'm leaving aside for a moment the obvious importance of the functional languages.)) I had always read that Python(and even Java and C#) is easier but I had no reference for this until I started learning C++. Obviously it is not going to be practical for everyone but if you want to be a true student of procedural process then go for C or C++. It flexes your brain way more than the higher level stuff.

    • @ruskodudesko9679
      @ruskodudesko9679 8 years ago +14

      C++ is high level - if you are not using abstraction then you are most likely still using procedural C techniques. I agree man, C++ is not easy; it is very challenging and a huge language with huge libraries, and it takes years to master.
      good luck bro! - get into Scott Meyers' series, it's great!

    • @WouterStudioHD
      @WouterStudioHD 4 years ago +1

      @Calum Tatum Depends. For performance critical tasks, like realtime audio for example, your example languages are just not fast enough. Also, C# is running on a runtime (CLR) that is written in C++. So what do you recommend for the person writing that runtime? C++ is hard, but it's also a really important language.

    • @syndromeX
      @syndromeX 3 years ago +2

      If you want a procedural programming language, try Pascal.. very similar to C

    • @cobalt2489
      @cobalt2489 3 years ago +1

      @@WouterStudioHD rust go brrrrrr

  • @eddieoconnor4466
    @eddieoconnor4466 5 years ago +17

    Love how this man can describe / explain things and make sense!

  • @gojalsewnath6448
    @gojalsewnath6448 7 years ago +5

    I never knew this genius but I understand that he is capable of reading your mail at will and monitoring your I/O at the same time. And I really like the way he has programmed his haircut, hair by hair! I don't know much about programming, but somehow it was just cool to listen to him.

  • @cokeforever
    @cokeforever 1 year ago

    Bjarne is a legend! It is so nice to listen to this intelligent man talk about the language that at some point back in time changed my life...

  • @thebudkellyfiles
    @thebudkellyfiles 6 years ago +79

    A modern legend. Every time I struggle to understand C++ I think about him inventing it.
    Geeeeeeeez....

    • @julija5949
      @julija5949 6 years ago +5

      Does it help?

    • @BedroomPianist
      @BedroomPianist 6 years ago +5

      He invented it but don't be fooled, programming languages are an iterative process and C++ has been around for 40 some-odd years.

    • @markteague8889
      @markteague8889 6 years ago +1

      Bedroom Pianist More like 30 odd years. In a sense, Objective C beat C++ to market since it was beginning to be used commercially while C++ was still just an internal intellectual curiosity at Bell Labs. One could argue that Objective C isn’t really an entirely new language, but more like C with library and pre-processor extensions.

    • @WouterStudioHD
      @WouterStudioHD 4 years ago +2

      @@markteague8889 Objective-C sucks. It's a mess of weird syntax and it lacks the generic power of C++.

    • @markteague8889
      @markteague8889 4 years ago +1

      Wouter I know the original syntax was akin to that used by SmallTalk (maybe the first OO language) which employed the message passing paradigm. As I understand it, modern flavors of Objective C support both styles of syntax for invoking methods against objects.

  • @rahulmathew8713
    @rahulmathew8713 6 years ago +990

    The only man who actually understands C++ 🤣🤣🤣

    • @jscorpio1987
      @jscorpio1987 5 years ago +102

      If some people would listen to what he has to say and actually take the time to understand his books, not just read, but UNDERSTAND, that wouldn’t be the case.
      Instead, people just learn C++ at a very basic level as if it’s Python or something, and use every single feature of the language in their projects for no rhyme or reason and then call C++ a horrible language if something goes wrong.

    • @wolowizardyt
      @wolowizardyt 5 years ago +30

      J T Calm down, computer scientist

    • @jorgebastos2431
      @jorgebastos2431 4 years ago +31

      No doubt he's a genius, but the syntax is horrible.

    • @madyogi6164
      @madyogi6164 4 years ago +48

      @@jorgebastos2431 The syntax is actually very nice, very 'well defined'. The coder doesn't have much room for 'syntax' mistakes. Compilers catch them. Of course a lot depends on its creators as well... I love the language for classes and the possibility to separate interface from implementation... Resource control and the speed of solutions when they're finally ready...

    • @TheDanielLivingston
      @TheDanielLivingston 4 years ago +19

      SosukeAizenTaisho C++ isn’t “backwards compatible” with C, it’s a *superset* of C. Big difference.

  • @mennekamminga600
    @mennekamminga600 9 years ago +67

    Inspired me to update my codebase: zero new/delete, fewer lines, hundreds of bugs killed.

    • @nihkz494
      @nihkz494 7 years ago +3

      Menne Kamminga gg
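
That experience matches the talk's resource-management theme: owning raw new/delete can usually be replaced by containers and smart pointers, so cleanup happens automatically. A minimal illustrative sketch (not code from the talk; Widget and make_widgets are made-up names):

    #include <memory>
    #include <string>
    #include <vector>

    struct Widget {
        std::string name;
        explicit Widget(std::string n) : name(std::move(n)) {}
    };

    // The vector and the unique_ptrs own everything; destruction is automatic,
    // even if an exception is thrown part-way through.
    std::vector<std::unique_ptr<Widget>> make_widgets() {
        std::vector<std::unique_ptr<Widget>> widgets;
        widgets.push_back(std::make_unique<Widget>("first"));
        widgets.push_back(std::make_unique<Widget>("second"));
        return widgets;  // moved out cheaply; no delete anywhere
    }

    int main() {
        auto widgets = make_widgets();
        // widgets and their contents are released at end of scope
    }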

  • @panduevank8356
    @panduevank8356 7 years ago +6

    His accent is the best I've ever heard. It's so classy.

  • @davizitopa7252
    @davizitopa7252 4 years ago +4

    I won't watch any of my teachers or preachers lecturing for more than 20 minutes. This is 1 hour and 40 minutes long and I don't think Dr. Stroustrup is a gifted orator, but I will watch it to the end. The difference is the level of respect; I don't think any other living man deserves quite as much respect from me as Dr. Stroustrup.

    • @bartolo5
      @bartolo5 3 years ago

      Indeed, it's interesting. He even has a monotone voice delivery. What keeps us listening is the level of precision and relevancy with which he delivers all the information.

  • @tahmina23
    @tahmina23 6 years ago +5

    Omg ... watch his first videos "The Design of C++" then watch this. His speaking skills improved DRAMATICALLY! He's awesome.

    • @kristypolymath1359
      @kristypolymath1359 4 years ago +2

      He probably thought "English is too simple. Why would they teach such a language as a first one, there isn't enough thinking involved" :D

  • @krischalkhanal2842
    @krischalkhanal2842 3 years ago +7

    Bjarne Stroustrup is such a great guy. He knows programming by heart; every nook and cranny, he knows.

  • @dreamhollow
    @dreamhollow 1 year ago

    I'm glad we are as connected as we are to be able to see this interview even if we are far, far away from Edinburgh.

  • @seektherapy70
    @seektherapy70 6 years ago +3

    Bjarne Stroustrup needs to speak more often. He is brilliant and very humble. Almost too humble, because I have popped into IRC's C++ channel and defended him and his programming language many times. The programmers learning C++ in the channel seem to think they know more about his programming language than the man who created it, ugh.... Irritating!

  • @dream_emulator
    @dream_emulator 4 years ago +13

    This is so awesome. Understanding like 10%, but still super interesting to follow.

  • @g.b.3427
    @g.b.3427 4 years ago +3

    C++ is and will be my favourite programming language forever! This man is my Hero! Long life to Him!

  • @metalore
    @metalore 5 years ago +359

    When a James Bond villain gives a lecture on programming, I listen.

    • @II_xD_II
      @II_xD_II 4 years ago +6

      @Man With No Name he said james bond not dragon ball z

    • @corriedebeer799
      @corriedebeer799 4 years ago +2

      He does have a bit of an Ernst Stavro Blofeld vibe about him

    • @lambda3925
      @lambda3925 1 year ago

      ​@@corriedebeer799 No, maybe the old Blofeld though.

  • @azynkron
    @azynkron 3 years ago +4

    Only a genius can make something complex and abstract sound simple.

  • @Elite7555
    @Elite7555 4 years ago +10

    1:22:35 He is completely right about that. Well, partially. Many professors aren't involved in any active software development anymore. One could argue even many professors just don't know any better. We still learned how to do linked lists with nullptr instead of using something sensible like optional. We have never seen the good parts of C++.

    • @biskitpagla
      @biskitpagla 3 years ago +2

      we weren't even taught linked lists with nullptr lmao our prof still uses NULL
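
For what it's worth, the nullptr-vs-optional point can be made concrete: a lookup that returns std::optional (C++17) makes "not found" part of the return type, instead of a raw pointer that might be null. A small hedged sketch with illustrative names only:

    #include <optional>
    #include <string>
    #include <unordered_map>

    // "Not found" is encoded in the type, so the caller cannot silently
    // dereference a null pointer the way they could with a raw pointer result.
    std::optional<std::string> find_nickname(
        const std::unordered_map<std::string, std::string>& nicknames,
        const std::string& user)
    {
        if (auto it = nicknames.find(user); it != nicknames.end())
            return it->second;
        return std::nullopt;
    }

    int main() {
        std::unordered_map<std::string, std::string> nicknames{{"bjarne", "the creator"}};
        if (auto nick = find_nickname(nicknames, "bjarne"))
            return 0;   // *nick holds the value here
        return 1;       // not found
    }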

  • @kennethcarvalho3684
    @kennethcarvalho3684 2 years ago

    Simply great.. He is such a good confident teacher. This video is a must watch for all programmers.

  • @m_r__r_o_b_o_t
    @m_r__r_o_b_o_t 4 years ago +3

    I love the way this guy talks

  • @kingofcastlechaos
    @kingofcastlechaos 4 years ago +6

    Great video UofE. My favorite quote starts at 07:36, and he nails it. Thank you sir for C++ and everything else!

  • @tomgreg2008
    @tomgreg2008 10 years ago +65

    Bjarne starts at 7:30 fyi

  • @Ram-iz1zp
    @Ram-iz1zp 3 years ago +11

    so this is the guy that made my CS classes so difficult. he looks exactly how I imagined he would look

  • @asicdathens
    @asicdathens 1 year ago +2

    K&R and Stroustrup were my favorite books back in college. Thank you a lot, Prof. Stroustrup.

  • @raulrrojas
    @raulrrojas 4 years ago +1

    All of us who understand a little more about computing than "normal" people know how important C++ is for today's computing. Thank you very much, Mr Stroustrup.

  • @TheSchiffReport
    @TheSchiffReport 3 years ago +8

    the man himself talking about C++

  • @tianyangren
    @tianyangren 2 years ago +1

    People's names at 17:00
    Assembler: David Wheeler
    Fortran: John Warner Backus
    Simula & OOD: Kristen Nygaard
    C: Dennis Ritchie
    C++: Bjarne Stroustrup

  • @GregoryLewis
    @GregoryLewis 8 years ago +23

    I'm a Microsoft Certified C++ Programmer. Love the language.

  • @MikkoRantalainen
    @MikkoRantalainen 3 years ago +2

    1:34:17 I think the best way forward would be to specify lots of old C++ features as deprecated and automatically generate a warning if a deprecated feature is used. That would allow slowly removing parts of C++ that have better alternatives today.
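
C++ already has a building block for part of this idea: the standard [[deprecated]] attribute (since C++14) makes compilers warn when a marked declaration is used. A minimal sketch of how a codebase might steer people away from an old interface (the function names are made up for illustration):

    #include <memory>

    // Old interface: caller owns a raw pointer and must remember to delete it.
    [[deprecated("use make_widget(), which returns a unique_ptr")]]
    int* make_widget_raw() { return new int{42}; }

    // Preferred replacement: ownership is explicit in the type.
    std::unique_ptr<int> make_widget() { return std::make_unique<int>(42); }

    int main() {
        auto w = make_widget();            // fine
        (void)w;
        // int* old = make_widget_raw();   // compiles, but with a deprecation warning
    }

Removing old language features themselves is harder than deprecating library functions, which is presumably why the comment talks about a slow, warning-driven process.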

  • @mariogamer86
    @mariogamer86 8 years ago +17

    This man is a legend and C++ is a great language, maybe the most important programming language.

    • @joestevenson5568
      @joestevenson5568 5 years ago +1

      not to downplay C++, but C is easily more important than C++

    • @naimcool36
      @naimcool36 4 years ago

      @@joestevenson5568 not to downplay c++ is C

  • @ahmadalghooneh2105
    @ahmadalghooneh2105 4 years ago +1

    Love and Respect, I encourage all those who want to learn C++ deeply to read his book 4th edition!

  • @rickyjoebobby1
    @rickyjoebobby1 4 years ago +14

    he's one of the few who can use words like encapsulate and you know he isn't trying to pull a trick

    • @mikhail5002
      @mikhail5002 4 years ago +4

      So you think when people use words with more than two syllables in them they are trying to trick you?

  • @tannerbarcelos6880
    @tannerbarcelos6880 5 years ago +3

    C++ is tough. I’m about to head into data structures this fall, which is based on c++ and it’s pretty intimidating but honestly, it’s all doable. I definitely won’t be working in c++ as my career, as I will be going into front end and UX but knowing c++ and how powerful it is, is interesting

    • @jayant9151
      @jayant9151 5 years ago

      follow bisquit channel for c++ magic

  • @berajpatel8081
    @berajpatel8081 4 years ago

    Thank you .... Morgan Stanley , speaker Bjarne Stroustrup & Dave Robertson @ University of Edinburgh

  • @the_arung
    @the_arung 4 years ago

    His talks are so dense! Packs in a wealth of information and insights that need real dedication to follow.

  • @TheDavidlloydjones
    @TheDavidlloydjones 4 years ago

    Start is at 7:27..
    Apparently they don't have any video editors at Edinburgh.

  • @soppaism
    @soppaism 4 years ago +25

    I can understand those sighs... a most pragmatic guy invents a language that gets used in the most impractical ways.

  • @chastitywhiterose
    @chastitywhiterose 4 years ago +1

    It’s cool that Bjarne Stroustrup uses Cygwin on Windows. I used to use it a lot too. These days I use msys2 because it lets me create programs that don’t require the Cygwin DLL file but it still gives me that Unix environment I got used to from my Ubuntu days.

  • @rturrado
    @rturrado 4 years ago

    Example at 47:37: the use of Value_type and V is a bit odd. You may think they're the same type, since values of one and the other are compared directly, yet they are different types.

  • @mohammadawad8323
    @mohammadawad8323 3 years ago +6

    At 49:43, the code for the ranged loop is wrong, it should be for(auto& x : s) x->draw();

    • @MikkoRantalainen
      @MikkoRantalainen 3 years ago +1

      Some of the other examples were also missing semicolons.
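
For readers following along without the slides, the corrected loop has roughly this shape, assuming s is a container of pointers to a Shape base class (the names here are illustrative, not the slide's exact code):

    #include <iostream>
    #include <memory>
    #include <vector>

    struct Shape {
        virtual void draw() const { std::cout << "shape\n"; }
        virtual ~Shape() = default;
    };

    struct Circle : Shape {
        void draw() const override { std::cout << "circle\n"; }
    };

    void draw_all(const std::vector<std::unique_ptr<Shape>>& s) {
        for (auto& x : s)   // x refers to each smart pointer in turn
            x->draw();      // virtual dispatch through the pointer; note the semicolon
    }

    int main() {
        std::vector<std::unique_ptr<Shape>> s;
        s.push_back(std::make_unique<Circle>());
        s.push_back(std::make_unique<Shape>());
        draw_all(s);
    }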

  • @tylerminix2028
    @tylerminix2028 8 years ago +56

    C++ is everything you could want it to be; it's a beautiful language.

    • @markotikvic
      @markotikvic 7 years ago +14

      Tyler Minix "Beautiful" as in "it's what's on the inside that counts," because it's horrendous to look at 😀

    • @harehudi5117
      @harehudi5117 7 years ago +6

      Frankly you are insane

    • @chilinouillesdepommesdeter819
      @chilinouillesdepommesdeter819 6 years ago

      It's pretty powerful, but also hard to control

  • @DjoleNINJA
    @DjoleNINJA 3 years ago

    Good material before a sleep session, this story is quite a lullaby.

  • @HiAdrian
    @HiAdrian 10 years ago +90

    29:54 _look I'm a java programmer!_ 😊
    LOL!

  • @tomhollins9266
    @tomhollins9266 5 years ago +1

    When these vids are made: 1) use two cameras, one for the person, one for the presentation; 2) show the person on the left or right, using only 10% of the width of the video, while 90% of the screen shows the presentation; 3) make sure that when the presenter is POINTING, one can see what is being pointed at. It looks like we have film students making the cuts who don't know what is being said, so the cut to the presenter happens exactly when the viewer should be seeing what the presenter is talking about. Just something to ponder.

  • @anamica4766
    @anamica4766 4 years ago +2

    some of the comments are so disappointing. Respect the man.

    • @iAmCodeMonkey
      @iAmCodeMonkey 3 years ago

      People these days don't respect the past. Sad to see really.

  • @afterthesmash
    @afterthesmash 6 years ago +1

    Long ago, I read a book by Edsger W. Dijkstra and, more or less, I've never had a problem with buffer overflows ever since. I read that book in the late 1980s, and the book was already old news then. Dijkstra's recommendation to error handling was this: if the operation can't legally be done, don't do the operation. No need to create a new control path, you'll be surprised how fast your loop exits the bottom once it's no longer able to do any legal operation. Then at the bottom, you check whether your contract was satisfied, and if not, you return an error condition. I hated exceptions with a passion-and refused to use them-until exception safety was properly formulated (mostly by David Abrahams) in the context of formal RAII. I continue to prefer Dijkstra's model, because it simply forces you to think more clearly, and implement your loops more cleanly. The only drawback is that it's hard to avoid defining a handful of boolean variables, and you can end up checking those variables almost every statement.
    Supposing C++ had a lazy &&=, the structure looks somewhat like this:
    bool function() {
        bool ok = true;
        ok &&= first_thing();   // function not called unless ok is true
        ok &&= second_thing();  // function not called unless ok is true
        return ok;
    }
    Mathematically, _every_ operation you attempt has a precondition on whether it can be executed legitimately.
    But we learned way back in the 1970s-the age of Dennis Ritchie-not to write code like this:
    for (p = p_begin, s = s_begin; s < s_end; ++s) {
        if (p < p_end) *p++ = *s;
    }
    My God! All those extra loop iterations doing _nothing_ you might execute for no good reason if (p_end-p_begin)
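
C++ has no &&= operator, but the same Dijkstra-style structure can be written today with ordinary short-circuit &&. A rough sketch under the comment's assumptions (first_thing and second_thing are placeholders for real operations that report success or failure):

    #include <iostream>

    // Placeholder steps standing in for the comment's first_thing()/second_thing().
    bool first_thing()  { std::cout << "first\n";  return true;  }
    bool second_thing() { std::cout << "second\n"; return false; }

    // Each step runs only while everything so far has succeeded; the final
    // value of ok is the postcondition check at the bottom.
    bool do_work() {
        bool ok = true;
        ok = ok && first_thing();    // short-circuit: not evaluated unless ok is true
        ok = ok && second_thing();
        return ok;
    }

    int main() { return do_work() ? 0 : 1; }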

    • @afterthesmash
      @afterthesmash 6 years ago

      Perhaps this is a good time to point out that on YouTube, if you nest the bold and italic syntax in the wrong way, editing a comment, making no change, and saving it again is not idempotent (it keeps wrapping more and more escape characters - incorrectly - with each additional unchanged save). Man, that takes talent. Broke-the-mold talent. Couldn't get i, j, and k into the correct alphabetical order if your life depended upon it talent.

    • @afterthesmash
      @afterthesmash 6 years ago

      Also, I hope people read carefully enough to realize that I was exactly half serious: half serious because this *does* actually work, half not serious because it only works for a stubborn iconoclast willing to put up with certain extreme forms of surface ugliness - and stubborn iconoclasts (like Knuth and Bernstein and every second Haskell programmer) have always managed to write software with orders of magnitude fewer bugs than competitive endeavors from lower down in the iconoclasm pecking order, so it's really not such a special value add.

    • @xybersurfer
      @xybersurfer 4 years ago

      @@afterthesmash the problem with avoiding exceptions, is that you are just manually implementing the error handling mechanism already present in the language. you are also wasting the return value which could be used for something more natural. the big advantage of exceptions to me is that you can abstract them away. this way you are left more with code describing task. this is much cleaner, than code checking for errors, with the actual task buried somewhere inside. i'm sure you could make it work like most things, but it looks like it results in a lot of unnecessary code

    • @afterthesmash
      @afterthesmash 4 years ago

      @@xybersurfer xybersurfer Naturalness only exists in a cultural context. You choose to operate within a software culture with a weirdly enlarged notion of "error". If I tried to sign up on Facebook and I entered my "Allan Stokes" as my preferred username, would the system create me a new account, or would it display an "error" that my username is already taken by some other Allan Stokes of no particular distinction? That counts as an "error" in what sense, precisely? Was it an error that some other Allan Stokes wanted my personal name? Was it an error that I also wanted the same? Was it an error that Facebook did not originally design their social network so that all users were known by a globally unique 128-bit GUID, preventing this kind of land-rush clash for all time? I count _none_ of these things as errors. Not everybody can have everything at the same time in the world we actually live in.
      Since I'm already a decade or two late boarding the Facebook ark, it's now a hail Mary for me to obtain the simplest rendition of my own name in this sphere. It's for the same reason we send Santa a list, and not just the single item Ferrari Testarossa. Santa might be short on Ferrari's this year, especially in light of the present circumstance in Milan lately. So it's a wise policy to provide Santa with options B through F, as well. Was it an "error" for me to want a Ferrari Testarossa? Or was it merely the first interaction in a distributed algorithm to probe the maximal intersection point between my list of desires and Santa's available merchandise?
      Likewise, if my algorithm wants 10 GB of DRAM working-space so as to run faster, is it an "error" if my request can not presently be granted by Santa's elves? I don't count that as an error, either.
      Let's suppose I'm only asking for 256 bytes, and malloc returns NULL. Under those conditions, whatever my subroutine is trying to accomplish is probably not viable. How does the OS know that this is now an error as you see it, because your program design has boxed you into a corner with no recourse but to barf up a
      White Flag of Surrender message box? The OS does _not_ know this. (Is it a coincidence that "throw" and "barf" are somewhat synonymous? I think not.) What the OS _could_ know for certain is that you just called free on a pointer it can't find on any active allocation list, which is _definitely_ an error. And if your program has just called free on a non-thing (type II error), what confidence do you now have that you haven't previously called free on a thing you intended to continue using (type I error)? Practically NIL. This is now Hunt the Wumpus territory. "It is now pitch dark. If you proceed, you will likely fall into a pit."
      But this isn't actually defined in the standard as an "error". The standard instead says that free's behaviour is now undefined. And once _any_ function commits undefined behaviour, this condition can not be erased short of exiting the current process (although in theory, attempting to exit your process can now also be undefined)-and now the state of your Unix process tree can potentially also become undefined and it's probably now time to pay homage to CTRL-ALT-DEL. In practice, exit is usually engineered to not depend on much, so it usually won't fail to exit your process, to some number of nines (typically more for BSD with a belt-and-suspenders attitude dating back to the original graybeard; fewer for Linux, which regards too many nines as a type II cultural error that greatly hinders the effective evolution rate).
      This is the funny thing about error culture in computer science. The unambiguous errors are considered so severe, that the cultural response is "why even contemplate continuing to press forward?" whereas completely foreseeable circumstances-like two people trying to sign up on Facebook with the same username-are reified into the barf loop (aka culturally re-parented into the class of "exceptionable" error events).
      My personal favourite case study is the Therac-25, a Canadian radiation therapy machine which either killed or badly injured at last six people. It had two beam strengths: regular and lethal (100× beam power). There was a turntable inside to rotate a bullet-proof vest in front of the lethal beam. On striking the bullet-proof target, the lethal beam would kick out some other modality of therapeutic radiation as a byproduct. Patient saved. In Dijkstra's world, you have ONE place in the code which is capable of unleashing lethal beam energy. Under what condition are you not allowed to do this? When the bullet-proof vest has _not_ been rotated into position. Otherwise, the patient experiences an extreme shock and _literally_ runs screaming out of the treatment room (as did a fatally wounded Roy Cox in actual fact). But you see, the software was designed to ensure that the correct turntable position was already activated and properly in place long _before_ subroutine release_the_radioactive_hounds was called upon to do its dirty work.
      Mechanically, the turntable implemented _three_ separate microswitches to ensure that the software could reliably detect its true position (not just two sets like your microwave oven door).
      They got carried away with thinking about what the software was trying to _do_ (treat the patient) and forget to think clearly about what the software was trying to _not_ do (kill the patient). There's no excuse on God's green earth for a software culture which accepts _anything_ other than a last conditional statement to check the turntable position (and every _other_ survival-critical precondition, if there were any more) before the ONE line of code capable of unleashing the 100× lethal beam. We were a good 35 years into the software industry before these fatal accidents happened. Dijkstra had completely set this straight already by the 1960s.
      In ZFS or PostgreSQL-or any other life-critical application (as viewed by the data under storage)-the implementation culture is strongly biased to think first and always about what you simply MUST NOT DO (e.g. create a race condition on flush to disk). Further up the software stack, the culture flips into: call the extremely sophisticated and fast-moving API as best as you can and mostly hope for the best. If a 7'2" Santa Claus with a 30" vertical leap power-blocks your layup attempt, you click you heels through the handy-dandy uncluttered exception mechanism, and return to Kansas, hardly any the wiser, but more or less still alive. Betcha didn't know Santa had that move. Good thing your ruby slippers are maximally topped up on reindeer credits.
      Only here's the downside: people who live in Kansas begin to develop an extremely muzzy sense of what a true error actually looks like. "Error" becomes a hopelessly muddled class of eventualities where it seems more convenient to click your heels together than to write a robust code block with a clear sense of the hazards it is actually navigating. And then this becomes further reified into what "uncluttered" code ought to properly look like, because we are at the end of the day cultural creatures, who thrive on cultural uniformity

    • @afterthesmash
      @afterthesmash 4 years ago

      @@xybersurfer Long ago I attended a pretentious university (Waterloo, Ontario) where the faculty was determined to teach freshman and sophomores the One True Way, which in that year was the Pascal programming language, with a top-down design methodology. Everyone I knew with any talent was conniving to get an account on any machine with a viable C compiler, while the faculty was conniving to keep access to these machines as limited as possible for the sake of "sparing the children". But no high-integrity system is ever designed top down. You mainly work bottom up at first from your core integrity guarantees. And then once you have a coherent API over your core integrity layer, you work top down, but still not as I was originally instructed. The top-down portion is architected from your test and verification analysis. The best modularity is the maximally testable modularity, and not the modularity that produces the least amount or the cleanest amount of code. In particular, your abstraction boundaries only need to be clean enough, rather than conceptually pristine (pristine when you can swing it, but not at the cost of making the full system less testable). The Linux people thought that ZFS had too many layering violations, and that with the benefit of hindsight and some fundamentally better algorithms deep down, they could kick ZFS to the curb with Btrfs. Actual outcome: Btrfs was removed from Red Hat Enterprise Linux 8 (RHEL) in May 2019.
      Btrfs team: We know pristine (good), we know clutter (bad).
      ZFS team: We know core coherence layering (hard, but doable with a clear head), we know testability at scale (hard, but doable with sustained cultural pragmatism).
      From Rudd-O: "ZFS will actually tell you what went bad in no uncertain terms, and help you fix it. ... btrfs has nothing of the sort. You are forced to stalk the kernel ring buffer if you want to find out about such things."
      That's a guaranteed fingerprint of a click-your-heels-to-return-to-Kansas design ethos. At which point all you can do is emit a somewhat cryptic message to the OS ring buffer.
      The original inviolable rationale for why top-down-design as I was once taught in Pascal was going to rule the world is that it's fundamentally based on divide and conquer. And what can possibly defeat divide and conquer? We had hundreds of years-if not _thousands_ of years-of history with this technique in mathematics. And it truly rocks in that domain. Because mathematics-of all human things-most excels at abstraction from messy circumstance. By emulating mathematics to the ultimate extent, so too can software _pretend_ to exist in this austere world.
      The original ZFS design team at Oracle knew how to design the system around their validation framework, because they were _deeply_ immersed in the world of the shit hitting the fan. Before Matt Ahrens proposed ZFS, a typo in the configuration of the Solaris volume manager caused the system to lose all home directories for 1000 engineers. And that was far from the only domain of extreme suffering. Original design principle: to escape the suffering, the administrator expresses intent, not device mappings.
      Now I'm circling back to the original problem. When you're muzzy-headed about what constitutes a conceptually coherent error class (because you've lumped sick world & sick dog under "click-your-heels" through the underground exception-return expressway) there's barely any possible way to think clearly about application intent. Oh, but you say "the exception return object can be arbitrarily complex to handle this". And right you would be, only there goes your pristine and uncluttered code base, for sure. And now what have you really gained?
      Because you are already welded into the exception mindset, you're almost certainly going to try to report the damage on first contact with a 7'2" Santa Claus guarding the post. And you're return an object to encode error 666: denied by Giant Black Goliath in a red deerskin vest. I've actually played around with writing code where I would continue to attempt to do all things the code can legally do _after_ something has already gone fatally wrong (fatally wrong meaning there is no remaining chance to achieve your subroutine's mandatory post-condition) and then reporting back _all_ the resources and actions that proved unsuccessful in the quest to accomplish the assigned task. (There is no possible way to implement this coding strategy that you would not decree to be hopelessly cluttered.) And this is interesting, because when you do get six errors at all once (illegal host name, can not allocate memory, no route to host, private key does not exist) then you're immediately pretty sure you entered double-insert mode in vi and there's an extra "i" character somewhere in your application's configuration file.
      Suppose you tell a human minion: deliver this yellow envelope to 555 Hurl St, last door on the right at the end of the long yellow hallway on the third floor and then grab me a muffin from the bakery downstairs and your minion comes back to report "no such address" (with no muffin) and doesn't point out that while the hallway on the third floor was long, it was orange instead of yellow and you didn't even check whether there was a bakery down below, because unfortunately the hallway was orange instead of yellow, so you didn't even _think_ about completing the rest of the assigned errand. "And there was no bakery down below" would have you checking your street address first of all. "And there _was_ a bakery down below" would have you checking your floor number first of all (while happily munching your hot muffin as a side bonus).
      Human contingency is _so_ different than code contingency, and I just think this is wrong-headed all around.
      But modern computer languages really do tend to compel divide and conquer, and I conceded that it's extremely difficult to enact human contingency in a way that's clean and compatible with this reductionist software engineering ethos.
      I've seen lots of unnecessarily cluttered code. Almost always because of a woolly conceptual foundation, where you're half in one system of coping and half in another system of coping. But I've gradually evolved to where I almost never count clean expression of preconditions or postconditions (all the way down to the level of individual statements) as active code clutter. My eye now regards such code as a somewhat prolix signature of a job well done, no ruby slippers involved.

  • @rickpontificates3406
    @rickpontificates3406 2 years ago +1

    As a programmer, I'll tell you what really ticks me off about these high level OOP languages.. I'm pretty good at logic and program flow, but my productivity is drastically slowed down by having to remember, or look up, all the INSANE syntax, variable passing, and multiple file dependencies of C++

    • @earx23
      @earx23 2 years ago +2

      It's an amazingly powerful language, but also gives you the opportunity to shoot yourself in the foot at every step on the way. To use it simply and safely, you'll probably start using smart pointers, copying strings instead of referencing.. undoing its performance potential. Still, you can do everything, including shooting yourself in the foot.

  • @MicrosoftsourceCode
    @MicrosoftsourceCode 9 years ago +1

    ***** I would also listen to this video. He will explain why we got so many different kinds of compilers. You could skip to the end where he does a question and answers.
    I find I can listen to some videos without the sound and learn a lot.
    I do this when I am washing my clothes or cooking.
    that's what I call real multitasking.

    • @MicrosoftsourceCode
      @MicrosoftsourceCode 9 years ago

      ***** what the hell are you doing talking to your fucking self? Or is ***** another person as I have long suspected?

    • @MicrosoftsourceCode
      @MicrosoftsourceCode 9 years ago

      ***** Sadly you will never make a programmer with your split personally behaviour. You need your all spanked until your proper Bonobo blue black

    • @MicrosoftsourceCode
      @MicrosoftsourceCode 9 years ago

      ***** The answer to what dim wits?

    • @MicrosoftsourceCode
      @MicrosoftsourceCode 9 years ago

      ***** Well tell your fucking puppeteer to shove you back in his suitcase.

    • @jistAnothaTuber
      @jistAnothaTuber 9 years ago

      MScode ™
      "I find I can listen to some videos without the sound and learn a lot."
      How are you able to listen to a video without sound?

  • @JaYb97716
    @JaYb97716 8 years ago +4

    I found this a rather simple lecture for university students. For college/A level students it's very useful. However Bjarne is a genius.

  • @ruskodudesko9679
    @ruskodudesko9679 8 years ago +11

    my C++ professor is kicking my ass this semester
    one thing I recently learned is that a member function of an object can take an object of its own type, for example:
    if I have:
    class Long
    and a member function
    void mult(const Long&)
    I can go
    void Long::mult(const Long& L) {
        number += L.number;
    }
    this line of code really confused me until my professor said "do you have a driver's license?" - I go yeah - he said "so do I, but are they the same?" - no
    "the objects are the same way: they are the same thing but have different information in them"
    I thought that was really cool and a cool way to look at it
    just thought I would share my experience :P

    • @scottmichaud
      @scottmichaud 4 years ago

      Yup... and "number" can even be private. People tend to think that private members hide data from other objects, but that's not the goal. The goal is to hide members from people who don't know how Long works internally. Two Long objects share the same code, so they assume that the person who wrote them knows how to handle the two of them interacting. It's when you start interacting with other classes, which are probably written by other people at other times, that private starts to say "Eh... but do they *really* know what Long uses that private field (or method) for?"

    • @illlanoize23
      @illlanoize23 4 years ago

      rusko dudesko yeah took me a minute to understand the this pointer and how it can point to private data for any new object created

    • @brianplum1825
      @brianplum1825 4 years ago

      Experienced programmers often write the same code as "this->number += L.number"; it makes it explicit that "number" is a member variable.

    • @scottmichaud
      @scottmichaud 4 years ago

      @@brianplum1825 yeah... then you can search for this and hopefully get all members... but you can't rely upon it... especially if multiple developers.
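
A compilable sketch of the kind of class being described (illustrative, not the professor's exact code; the original mult() actually adds, so it is called add() here), showing a member function that takes another object of its own type, reads its private data, and uses the explicit this-> spelling mentioned above:

    #include <iostream>

    class Long {
    public:
        explicit Long(long long n) : number(n) {}

        // Both objects share the same class code, so a member function may
        // read the other Long's private member directly.
        void add(const Long& other) { this->number += other.number; }

        long long value() const { return number; }

    private:
        long long number;
    };

    int main() {
        Long a{2}, b{40};
        a.add(b);                        // a and b are distinct objects of the same type
        std::cout << a.value() << '\n';  // prints 42
    }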

  • @James_Bowie
    @James_Bowie 3 years ago +42

    How to introduce a speaker: "Ladies and gentlemen, our guest speaker is X. Please welcome him/her to the stage." Then get off!

  • @Proceso-Digital-TV
    @Proceso-Digital-TV 8 years ago +1

    wow, this is unbelievable!! I never thought about sitting in front of this eminent sir; can't believe this, because he was to me one of the geniuses of my Turbo C/C++ book, amazing! ohh, no matter the fact that I'm sitting at home :P

  • @eugenemartynenko617
    @eugenemartynenko617 9 years ago +13

    If the Scottish accent doesn't turn you on, please press this button! It'll save you much time ---> 7:20

  • @andrewlankford9634
    @andrewlankford9634 6 years ago +9

    The essence of C++ is int & whatsit::whazzat( HUH? ) -> *= ....

  • @rudyardkipling4517
    @rudyardkipling4517 4 years ago

    you can mix and match, university of simplicity here, use your old codebase in a library, dll, etc.. and just call them from new C++ versions if you do not feel like updating the code

  • @youugotssponged
    @youugotssponged 9 years ago +19

    Haha, legendary Bjarne "First of all, i dont go round killing people"

    • @jlmurrel
      @jlmurrel 4 years ago +1

      What did he mean by this comment?

    • @dynamo58
      @dynamo58 3 years ago

      К6🕺🕺🕺😞🕺🕺🗨️🕺🕺🕺😞🕺🕺🕺🕺🗨️🕺🕺🕺😞🕺😞🕺🕺🕺😞🕺😞🕺🕺🕺🕺😞😞🕺🤔🤣🤣😒🤣🤣🤣🐩🐩🐴🏢🏔️🏣🏔️🏕️🏔️🏔️🏔️🏔️🏕️🏔️🏪🏔️🏕️🏔️🏔️🏕️🏔️🏔️🏕️🏔️🏕️🗾🏔️🏕️🏔️🏕️🗺️🏔️🏕️🏔️🏔️🏔️🏪🏔️🏔️🏔️🏔️🏔️🏔️🏔️🏖️🏔️🏛️🏔️🏣🏖️🍋🍋🍋🍋🍋🍎🍋🍋🍋🐂🍋🍈🍌🍐🥑🍍🍍🥑🍍🍍🍍🍍🥖🍍🍍🍑🍍🍍🍍🥑🍍🍑🍍🍍🥖🍍🍍🥖🍍🍍🍍🍐🍍🍍🥑🍍🍍🍍🍊🍊🍊🐘🍊🍉🍉🍉🐹🍉🦒🍌🍍🍍🥐🍌🍏🍌🍐🍌🥐🍌🍌🌶️🍌🍌🦒🦒🥑🦒😀😀🍐🍋🍋🍐🐹🍐🍐🍐🏛️🍐🍐😛🙂😅🏖️🍋🍋🍋🍋🍋🍎🍎🍋🍌🍋🍋🥑🍎🍋🥑🍋🍋🍋🍋🍋🍋🍋🍋🍊🍋🍋🍋🥑🍋🍋🍊🍋🍋🍋🍋🍋🍊🍋🥑🍋🍋🍉🍋🍋🍋🍋🍋🍋🍋🍎🍋🍋🍋🍋🍋🍊🍋🍋🍋🍋🍋🍎🍋🍋🍎🍋🍋🍋🍋🍋🍋🍋🍋🍋🥑🍋🍋🍋🍋🍉🍋🍋🍋🏣😃😃🌶️😃😃😃😃😛🍍🍍🍍🥖🍍🥓🍍🥖🍍🍍🍍🌏🌏🥞🌏🌏🌏🏢🏔️🏔️🏔️🏪🏢🏔️🏔️🗻🏔️🏪🏔️🏢🏪🏔️🌍🥨🌍🌍🍊🍌🥐🍌🍌🍌🍊🍌🍌🥐🏔️🏔️🎉♦️🏑🏑🎽♣️👗🧤🧤🛍️🧤🧤🧣🧣👘🧤🧤👢🧤🧤👘🧤🧤🕶️🧤👢🧤👔👔👖👖🧣🚫🚻🚺🚺🧢🏁🏁🏁🏁↘️🏁🏁📵🚻🚻🚻🚻🛃🛃🚻O:-):-|O:-)O:-):-|O:-):-|🇦🇸:-(🇧🇯:-(🇦🇲🛅🎌🎌🎌⬅️🚺🚻🚻🚩🚩🚻🚻📵🚻🏳️🏁🚻O:-):-|O:-)8-):-P:-|8-):-|8-):-|8-):O:-|:O:-[:-D:-DB-):-|:-|:-D:-[:-[:-D:-|O:-):-|:-|:-|8-):-|:-(:-(O:-):-|:-|8-):-|O:-)8-):-|O:-):-|O:-):-{:-DO:-):-|O:-)🚻🚻↘️🚻:-P🇧🇪:-P🇦🇲:-)🇧🇪:-P:-P:-|:-|🇦🇸♿♿💎🎩👛🎩🚰🚰🚰🚹🚻➡️⚠️⚠️⚠️↔️↘️⚠️👔👔🎳👔♿👛♿🎓🔇👠👡🧣👡🧣🧤🧤🧤🚻⚠️☢️🔇➡️☣️☣️🔕☣️🎙️🚷📱🎶🎈🏔️🏕️🏔️🏪🏛️🏔️🏪🎉🎉🎃🍞🗺️🥑🍍🍐🍑🍈🐖🍈🦍🦍😚🐒😊🐒😚🐒🤐🐵🐺🐐🐺🐄🐺🐺🦄🐄🐺🐄🐺🦍🍊🐫🐗🌳🍇🍇🍇🐙😶🤭🐬🐬🤧🤭🤭🤤🤭🤭🤧🤭🐵🙄🐵🐵😙😙😏😙😜😙😏😪😙😪😋🐙😘😘😜😜🐗🤧😜😜🦄😜😜😜😜😜😜🦄😆😗🤣🍌🍉🍉🍉🍊🍌🍌🍊🍌🍌➡️🚹♿🚹🔈🚰🚰🏳️🇦🇨🏳️🇦🇨🏳️🏳️🇦🇨🏳️🏳️🏳️🏳️🏳️🏳️🇦🇷🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🏳️🎌🎌🎌Любимый человек

  • @zod6594
    @zod6594 4 years ago +1

    It’s crazy how c++ is moving closer towards java/c# where even the creator says to avoid using pointers and to have automatic garbage collection. With memory being so high(even phones have 4+ GB), I guess it makes sense nowadays.

  • @TheRundownGames
    @TheRundownGames 9 years ago +2

    Just want to say something: do not argue with him, as in the guy talking about C++, because he is the creator of C++ and knows more than all you 6 year olds.

  • @Quadromodo
    @Quadromodo 9 years ago +14

    I have a degree in computer science with games programming, I'm leaving this lecture at 46 mins in cos I need a beer and my brain is sludge!

  • @wclewis123
    @wclewis123 4 years ago +2

    When C and C++ were developed we were already using the Extended Telephone Switching Language (ETSPL), which was PL/I based, strongly typed, with a simple and clear method to interface with hardware. That he ignores that thread in his description of languages is curious. So much of what is said is untrue.

  • @BryonLape
    @BryonLape 8 years ago +157

    7:25 - to skip the blah, blah at the start.

    • @pattty847
      @pattty847 8 years ago +7

      thank you lmao

    • @qn565
      @qn565 5 years ago +1

      thank you! you saved 7 minutes and 25 seconds of my life

    • @jlmurrel
      @jlmurrel 4 years ago

      @@qn565 - yes! 7 minutes and 25 seconds we'll never get back.

  • @int-64
    @int-64 4 years ago +6

    THE MAN THE MYTH THE LEGEND

  • @vikasdangwal3080
    @vikasdangwal3080 7 years ago +1

    What a great answer about not writing a language like Java 😄😄

  • @afterthesmash
    @afterthesmash 6 years ago

    7:00 Tell them what you're going to tell them, tell them, tell them what you told them: done to perfection-that shiny trader is going to really enjoy this lecture.

  • @Elite7555
    @Elite7555 3 years ago +2

    I watch this talk at least once per year, just to remind me what it means to be a (good) programmer.

  • @josephscottadams39
    @josephscottadams39 6 years ago +1

    I just watched this entire talk and I honestly can say most of what was said was beyond my understanding. Wanting to learn C++ but it seems too hard?

    • @stanrogers5613
      @stanrogers5613 4 years ago +2

      It's really not. The problem is, as the man said, that it's been around a long time, and has been, in a way, several _different_ related languages over the years. Keep in mind how very slow and clunky computers were when C++ was introduced - a lot of the original language relied on the programmer to do most of the heavy lifting. That is, you needed to think about the machine almost as much as the problem you were working on. These days, if you use the right parts of C++, you can offload most of the "thinking about the machine" part to the compiler and just concentrate on the problem you're solving. The compiler itself can be a much larger program than was possible in the early days, and it will compile your modern code in seconds or minutes rather than weeks. Most of this talk was about why it's better to use the newer language features when you can rather than the features that were there so that your steam-and-hamster-powered 4MHz processor with maybe a couple of megabytes of memory (huge for the time) wouldn't choke trying to compile it.

  • @timangus
    @timangus 3 years ago

    I'm pretty sure I attended this talk.

  • @kylepoe5139
    @kylepoe5139 8 years ago +2

    Well, I only watched the first 30 minutes due to an unfortunate attention span, but great lecture! I would also suggest watching this at x1.25, flows a little better.

  • @jstevens9724
    @jstevens9724 2 years ago +1

    I've just started messing about with an ESP32 embedded CPU, so I am having to learn a bit of C++. I would only call myself an amateur programmer at best. I started out on Python for ease of access, which is nice, but soon hit the performance limitations. So I did the research on lots of other languages to weigh up the pros and cons (they all have them, and they will be different for different people), and
    I settled on D. This was an interesting lecture, but it does reinforce the conclusion that C++ really wants to be D; it's just going to take 10++ years to get there. Anyway, I hope I can avoid the pitfalls highlighted by Bjarne.

    • @earx23
      @earx23 2 years ago

      Is D still alive? I used it 10 years ago and it was nice, but today I settle for Rust and have more grip on performance.

  • @aydinhoe6355
    @aydinhoe6355 6 years ago +7

    the man who created chaos

  • @TheyRiseBand
    @TheyRiseBand 4 years ago +43

    Oh, I thought the essence was to weed out Comp Sci majors.

    • @gamergo9
      @gamergo9 3 years ago +2

      Best comment on YouTube.

    • @snesmocha
      @snesmocha 1 year ago

      … seriously I still don’t understand why people get so frustrated with c++…

  • @FroL_Onn
    @FroL_Onn 2 years ago

    Wow! Concepts and modules should have been finalized before cpp17!!!

  • @bhanuxhrma
    @bhanuxhrma 4 years ago +1

    Watching it in lockdown!!

  • @troyc333
    @troyc333 3 years ago

    brilliance beyond my software engineering skills

  • @erawanthewise8227
    @erawanthewise8227 3 years ago

    All hail Bjarne the creator of our generation!

  • @mahfuzulhaquenayeem8561
    @mahfuzulhaquenayeem8561 4 years ago

    Thanks a lot for this video. Keep uploading more of such content. Good luck...

  • @holdenmcgroin8917
    @holdenmcgroin8917 6 years ago +5

    Got to have that hair to be a legend

  • @hyqhyp
    @hyqhyp 9 years ago +19

    The Morgan Stanley guy's biggest achievement in life is getting his managers to switch from C to C++. There is something very sad about that ...

    • @emmanuelkofyagyapong6382
      @emmanuelkofyagyapong6382 7 years ago

      hyqhyp is there any article about it?

    • @Jezze2
      @Jezze2 7 years ago +5

      Actually, he said he got them to move from Fortran to C and to C++, but I still agree with your point...

    • @aseemawad4294
      @aseemawad4294 6 years ago +11

      Making a large company change their primary programming language is a serious achievement.

    • @bextract0
      @bextract0 6 years ago

      hyqhyp wth.. he invented one of the most commonly used languages of modern time. What are you smoking?

    • @lincolnsand5127
      @lincolnsand5127 4 years ago +3

      @@bextract0 He's talking about the guy at the beginning. Not Bjarne. You completely misunderstood his very clear comment lol.

  • @weeb3856
    @weeb3856 9 years ago +22

    Haha. His Danish accent is lovely! xD

    • @helgegrelck2394
      @helgegrelck2394 6 years ago

      Yea, but his vocabulary is great and on the other hand his Danish name Bjarne is pronounced wrongly. It is pronounced 'Byarne', where the e is pronounced like e in 'error' :) . Bjarne means Bjørn ('byoern' with an ö like in birn or firn ) , and bjørn means 'bear' . Touche :)

    • @mysund
      @mysund 5 years ago

      Strange how so many Danish computer scientists have influenced the programming languages...

    • @helgegrelck2394
      @helgegrelck2394 4 years ago

      Shallex ... Well yes quite... but the English ‘Y’ is equivalent to the ‘J’ in all the other Germanic languages (Dutch, Flemish, Danish, Icelandic, Faroese, Norwegian, Swedish, German). So it's more like ‘Byarne’, but without saying ‘by arne’ ... cause ‘Arne’ is another Danish male name :)

  • @Hans_Magnusson
    @Hans_Magnusson 1 year ago

    @8:is I completely agree on the description of the subset of human species 😂😂
    No wonder I had some serious problems over the last decade!
    The only observation I would like to share is: engineers are found in more places than science…
    You have MBA engineers as an example
    In closing, I am far from being an expert (thank you lord) but I am starting to appreciate C++.
    I have no clue if I am getting smarter or what happened

  • @Minotauro_di_Chieti
    @Minotauro_di_Chieti 4 years ago +1

    Love Christopher Guest, he's the man of a thousand faces!!!

  • @hikkenwayans
    @hikkenwayans 8 years ago +1

    EXCELLENT lecture!!!

  • @roadracer1584
    @roadracer1584 3 years ago +4

    It appears Python has overtaken C/C++ as the most popular programming language. Almost all of the young engineers and computer scientists I encounter are proficient at programming in Python and have almost no experience with C/C++.

    • @postmeme44
      @postmeme44 2 years ago +9

      This only increases the value of C++ programmers

    • @doublesushi5990
      @doublesushi5990 1 year ago

      assuming people want C++ programmers @@postmeme44

  • @riccardomassa7006
    @riccardomassa7006 1 year ago

    this man is my hero.

  • @zxxNikoxxz
    @zxxNikoxxz 7 years ago +7

    He looks like the "beast" from Kung Fu Hustle

  • @lolzwoot
    @lolzwoot 6 years ago

    At 1:13:08 he spoke about the creator of OOP; I couldn't understand the name properly. I thought it was Alan Kay who invented it, but apparently Bjarne states otherwise.
    I'm very curious to read more about the man he talks about; if anyone can write his name, that would be lovely.

    • @larsjessen
      @larsjessen 6 years ago +1

      Kristen Nygaard who together with Ole-Johan Dahl invented Simula

  • @matthewfairley4101
    @matthewfairley4101 3 years ago

    I wondered what Gail Porter was doing recently.

  • @madyogi6164
    @madyogi6164 3 years ago

    Bjarne was always one of my Gurus. When I turned on this video, I was wondering if I would be able to make any sort of comment on it. Well:
    14:16 "Complexity of code should be proportional to the complexity of the task"...
    What failed during the last 20 years, and why do we have so much "N Gigabyte Fat" crapware...

  • @runtimejpp
    @runtimejpp 2 years ago

    i love his very subtle sarcasm

  • @jakobullmann7586
    @jakobullmann7586 3 years ago

    34:19 What does he mean here? ("There are some behind the scene") How are there pointers behind the scene?

  • @saulocpp
    @saulocpp 10 years ago +1

    Thanks for sharing the lecture!

  • @dikkepikenhardeballe
    @dikkepikenhardeballe 4 years ago +3

    The essence of C++ is C.

  • @AlexSmith-fs6ro
    @AlexSmith-fs6ro 5 years ago

    His talk starts at 7:20.

  • @MECHANISMUS
    @MECHANISMUS 3 years ago

    @The University of Edinburgh
    You should either add real English captions or take down what you call English captions but is in fact just another auto-generated version.