As a programmer, I'll tell you what really ticks me off about these high-level OOP languages: I'm pretty good at logic and program flow, but my productivity is drastically slowed down by having to remember, or look up, all the INSANE syntax, variable passing, and multiple file dependencies of C++.
It's an amazingly powerful language, but it also gives you the opportunity to shoot yourself in the foot at every step of the way. To use it simply and safely, you'll probably start using smart pointers and copying strings instead of referencing them, undoing its performance potential. Still, you can do everything, including shooting yourself in the foot.
***** I would also listen to this video. He explains why we got so many different kinds of compilers. You could skip to the end where he does questions and answers. I find I can listen to some videos without watching and learn a lot. I do this when I am washing my clothes or cooking. That's what I call real multitasking.
My C++ professor is kicking my ass this semester. One thing I recently learned is that a member function of a class can take an object of its own type. For example: if I have class Long and a member function void mult(const Long&), I can write void Long::mult(const Long& L){ number += L.number; }. This line of code really confused me until my professor said "do you have a driver's license?" - I go yeah - he said "so do I, but are they the same?" - no - "the objects are the same way: they are the same kind of thing but have different information in them". I thought that was really cool and a cool way to look at it. Just thought I would share my experience :P
Yup... and "number" can even be private. People tend to think that private members hide data from other objects, but that's not the goal. The goal is to hide members from people who don't know how Long works internally. Two Long objects share the same code, so they assume that the person who wrote them knows how to handle the two of them interacting. It's when you start interacting with other classes, which are probably written by other people at other times, that private starts to say "Eh... but do they *really* know what Long uses that private field (or method) for?"
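A minimal sketch of the idea in the two comments above, using a made-up Long class (not from the talk): a member function can take a const reference to its own type, and it can read the other object's private members, because access control in C++ is per class, not per object.

#include <iostream>

// Made-up Long class, only to illustrate the point above: a member
// function may take a const reference to its own type, and it may read
// the other object's private members, because access control in C++
// is per class, not per object.
class Long {
public:
    explicit Long(long n) : number(n) {}

    // "this" object is modified; L is another Long, passed by const reference
    void mult(const Long& L) { number *= L.number; }

    long value() const { return number; }

private:
    long number;   // private, yet visible to every Long member function
};

int main() {
    Long a{6};
    Long b{7};
    a.mult(b);                      // a may read b.number directly
    std::cout << a.value() << '\n'; // prints 42
}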
Wow, this is unbelievable!! I never thought I would sit in front of this eminent sir; can't believe this, because to me he was one of the geniuses of my Turbo C/C++ book, amazing! Oh, no matter the fact that I'm sitting at home :P
You can mix and match - university of simplicity here - use your old codebase in a library, DLL, etc., and just call it from new C++ versions if you do not feel like updating the code.
It's crazy how C++ is moving closer towards Java/C#, where even the creator says to avoid using pointers and to have automatic garbage collection. With memory being so plentiful (even phones have 4+ GB), I guess it makes sense nowadays.
Just want to say something: do not argue with him, as in the guy talking about C++. He is the creator of C++ and knows more than all you 6-year-olds.
When C and C++ were developed we were already using the Extended Telephone Switching Language (ETSPL), which was PL/I-based, strongly typed, with a simple and clear method of interfacing with hardware. That he ignores that thread in his description of languages is curious. So much of what is said is untrue.
7:00 Tell them what you're going to tell them, tell them, tell them what you told them: done to perfection. That shiny trader is going to really enjoy this lecture.
It's really not. The problem is, as the man said, that it's been around a long time, and has been, in a way, several _different_ related languages over the years. Keep in mind how very slow and clunky computers were when C++ was introduced - a lot of the original language relied on the programmer to do most of the heavy lifting. That is, you needed to think about the machine almost as much as the problem you were working on. These days, if you use the right parts of C++, you can offload most of the "thinking about the machine" part to the compiler and just concentrate on the problem you're solving. The compiler itself can be a much larger program than was possible in the early days, and it will compile your modern code in seconds or minutes rather than weeks. Most of this talk was about why it's better to use the newer language features when you can rather than the features that were there so that your steam-and-hamster-powered 4MHz processor with maybe a couple of megabytes of memory (huge for the time) wouldn't choke trying to compile it.
Well, I only watched the first 30 minutes due to an unfortunate attention span, but great lecture! I would also suggest watching this at x1.25, flows a little better.
I've just started messing about with the ESP32 embedded CPU, so I am having to learn a bit of C++. I would only call myself an amateur programmer at best. I started out on Python for ease of access, which is nice, but soon hit its performance limitations. So I did the research on lots of other languages to weigh up the pros and cons - they all have them, and they will be different for different people - and I settled on D. This was an interesting lecture, but it does reinforce the conclusion that C++ really wants to be D; it's just going to take 10++ years to get there. Anyway, I hope I can avoid the pitfalls highlighted by Bjarne.
Yea, but his vocabulary is great, and on the other hand his Danish name Bjarne is pronounced wrongly. It is pronounced 'Byarne', where the e is pronounced like the e in 'error' :). Bjarne means Bjørn ('byoern', with an ö like in birn or firn), and bjørn means 'bear'. Touché :)
Shallex ... Well yes, quite... but the English 'Y' is equivalent to the 'J' in all the other Germanic languages (Dutch, Flemish, Danish, Icelandic, Faroese, Norwegian, Swedish, German). So it's more like 'Byarne', but without saying 'by arne'... cause 'Arne' is another Danish male name :)
@8:is I completely agree with the description of that subset of the human species 😂😂 No wonder I had some serious problems over the last decade! The only observation I would like to share is: engineers are found in more places than science… You have MBA engineers, as an example. In closing, I am far from being an expert (thank you lord) but I am starting to appreciate C++. I have no clue if I am getting smarter or what happened.
It appears Python has overtaken C/C++ as the most popular programming language. Almost all of the young engineers and computer scientists I encounter are proficient at programming in Python and have almost no experience with C/C++.
At 1:13:08 he spoke about the creator of OOP; I couldn't understand the name properly. I thought it was Alan Kay who invented it, but apparently Bjarne states otherwise. I'm very curious to read more about the man he talks about; if anyone can write his name, that would be lovely.
Bjarne was always one of my gurus. When I turned on this video, I was wondering if I would be able to make any sort of comment on it. Well: 14:16 "Complexity of code should be proportional to the complexity of the task"... What failed during the last 20 years, and why do we have so much "N Gigabyte Fat" crapware...
@The University of Edinburgh You should either add real English captions or take down what you call English captions, which are in fact just another auto-generated version.
7:20 - B. Stroustrup introduction
11:40 - What Stroustrup keeps in mind about C/C++
16:28 - Timeline history of programming languages
25:00 - Resource management
29:50 - Pointer misuse
31:34 - Resource handles and pointers
34:30 - Error handling and resources
35:25 - Why do we use pointers?
41:10 - Move semantics
43:50 - Garbage collection
47:06 - range: for, auto, and move
49:16 - Class hierarchies
51:58 - OOP/Inheritance
54:10 - Generic Programming: Templates
59:48 - Algorithms
1:01:10 - Function objects and lambdas
1:03:08 - Container algorithms
1:03:42 - C++14 concepts
1:10:32 - "Paradigms"
1:13:56 - C++ challenges
1:15:28 - C++ more information
1:17:00 - Questions and answers
papa bless
Thanks, Napat!
Bless you
I shouldn't have to look in comments or description for this type of thing
@@themodfather9382 But you just did xd
the essence starts at 7:27
vima78 \o/
+vima78 Yeaahhh!!!
+vima78 Thanks. :D
+vima78 Unless you wanna know where the fire extinguishers are, in case of fire :p
Brad Haircut LOL
Been reading a lot of Stroustrup C++ books and I love how crystal clear his explanations are. For every "but how would you...?" that pops up in your head he explains it right away in detail.
I feel he is a gentleman, and besides his efforts and talents, his sincerity, honesty and simplicity are great.
I started learning C++ this year, 7 months ago now.
And within those 7 months I read almost 7 reference books, and after all of them I read his book; it was a very good experience, and I think I am better now at designing and thinking than I was. The only thing that helps me keep moving forward is the solemn desire to solve one of humanity's problems and make life a better place for everyone.
Such a smart guy, I love that he's concerned with the language being approachable to casual users as well.
Maxwell Strange look at what language he started with.
AHK is approachable. C++ is very much not.
Oh, yes, so damn concerned that he and the C++ committee did not solve this with a more-than-20-year head start.
Yeah I like Bjarne and listen to all his talks.
I cannot thank this man enough for creating the only language I can comfortably express myself in. The only language that comes close (for me) is probably ECMAScript.
I'll just throw this out there. Recently I started learning C++ and some C after having achieved some degree of proficiency in VBA Excel and Python, and here is the deal: if you are serious about learning the 'science' of programming, that is, programming at the conceptual level, then choose either C or C++ as at least one of the languages you are learning. (I'm leaving aside for a moment the obvious importance of the functional languages.) I had always read that Python (and even Java and C#) is easier, but I had no reference for this until I started learning C++. Obviously it is not going to be practical for everyone, but if you want to be a true student of procedural process then go for C or C++. It flexes your brain way more than the higher-level stuff.
C++ is high level - if you are not using abstraction then you are most likely still using procedural C techniques - I agree man C++ is not easy, it is very challenging and a huge language with huge libraries, and takes years to master
Good luck bro! Get into the Scott Meyers series, it's great!
@Calum Tatum Depends. For performance critical tasks, like realtime audio for example, your example languages are just not fast enough. Also, C# is running on a runtime (CLR) that is written in C++. So what do you recommend for the person writing that runtime? C++ is hard, but it's also a really important language.
If you want a procedural programming language, try Pascal... very similar to C.
@@WouterStudioHD rust go brrrrrr
Love how this man can describe / explain things and make sense!
I never knew this genius, but I understand that he is capable of reading your mail at will and monitoring your I/O at the same time. And I really like the way he has programmed his haircut, hair by hair! I don't know much about programming, but somehow it was just cool to listen to him.
Bjarne is a legend! It is so nice to listen to this intelligent man talk about the language that at some point back in time changed my life...
A modern legend. Every time I struggle to understand C++ I think about him inventing it.
Geeeeeeeez....
Does it help?
He invented it but don't be fooled, programming languages are an iterative process and C++ has been around for 40 some-odd years.
Bedroom Pianist More like 30 odd years. In a sense, Objective C beat C++ to market since it was beginning to be used commercially while C++ was still just an internal intellectual curiosity at Bell Labs. One could argue that Objective C isn’t really an entirely new language, but more like C with library and pre-processor extensions.
@@markteague8889 Objective-C sucks. It's a mess of weird syntax and it lacks the generic power of C++.
Wouter I know the original syntax was akin to that used by SmallTalk (maybe the first OO language) which employed the message passing paradigm. As I understand it, modern flavors of Objective C support both styles of syntax for invoking methods against objects.
The only man who actually understands C++ 🤣🤣🤣
If some people would listen to what he has to say and actually take the time to understand his books, not just read, but UNDERSTAND, that wouldn’t be the case.
Instead, people just learn C++ at a very basic level as if it’s Python or something, and use every single feature of the language in their projects for no rhyme or reason and then call C++ a horrible language if something goes wrong.
J T Calm down, computer scientist
No doubt he's a genius, but the syntax is horrible.
@@jorgebastos2431 The syntax is actually very nice, very 'well defined'. The coder doesn't have much room for 'syntax' mistakes; compilers catch them. Of course a lot depends on its creators as well... I love the language for classes and the possibility to separate interface from implementation... Resource control and the speed of solutions when they're finally ready...
SosukeAizenTaisho C++ isn’t “backwards compatible” with C, it’s a *superset* of C. Big difference.
Inspired me to update my codebase: zero new/delete, fewer lines, hundreds of bugs killed.
Menne Kamminga gg
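A minimal sketch of what that kind of new/delete cleanup typically looks like, assuming a made-up Widget type (not from the talk): owning raw pointers replaced by std::unique_ptr and std::vector, so explicit delete calls, and the leaks they invite, disappear.

#include <memory>
#include <vector>

struct Widget { int id = 0; };   // made-up type, purely for illustration

// Before: manual new/delete; easy to leak on an early return or an exception.
Widget* make_widget_old(int id) {
    Widget* w = new Widget;
    w->id = id;
    return w;                    // the caller must remember to delete it
}

// After: ownership is expressed in the type; no new/delete in user code.
std::unique_ptr<Widget> make_widget(int id) {
    auto w = std::make_unique<Widget>();
    w->id = id;
    return w;                    // ownership moves to the caller
}

int main() {
    std::vector<std::unique_ptr<Widget>> widgets;
    for (int i = 0; i < 10; ++i)
        widgets.push_back(make_widget(i));
    // no delete anywhere: everything is released when the vector goes out of scope
}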
His accent is the best I've ever heard. It's so classy.
I see what you did there
I won't watch any of my teachers or preachers lecturing for more than 20 minutes. This is 1 hour and 40 minutes long and I don't think Dr. Stroustrup is a gifted orator, but I will watch it to the end. The difference is the level of respect: I don't think any other living man deserves quite as much respect from me as Dr. Stroustrup.
Indeed, it's interesting. He even has a monotone voice delivery. What keeps us listening is the level of precision and relevancy with which he delivers all the information.
Omg ... watch his first videos "The Design of C++" then watch this. His speaking skills improved DRAMATICALLY! He's awesome.
He probably thought "English is too simple. Why would they teach such a language as a first one, there isn't enough thinking involved" :D
Bjarne Stroustrup is such a great guy. He knows programming by heart; every nook and cranny, he knows.
I'm glad we are as connected as we are to be able to see this interview even if we are far, far away from Edinburgh.
Bjarne Stroustrup needs to speak more often. He is brilliant and very humble. Almost too humble, because I have popped into IRC's C++ channel and defended him and his programming language many times. The programmers learning C++ in the channel seem to think they know more about his programming language than the man who created it, ugh.... Irritating!
This is so awesome. Understanding like 10%, but still super interesting to follow.
C++ is and will be my favourite programming language forever! This man is my hero! Long life to him!
When a James Bond villain gives a lecture on programming, I listen.
@Man With No Name he said james bond not dragon ball z
He does have a bit of a Ernst Stavro Blofeld vibe about him
@@corriedebeer799 No, maybe the old Blofeld though.
Only a genius can make something complex and abstract sound simple.
1:22:35 He is completely right about that. Well, partially. Many professors aren't involved in any active software development anymore. One could argue even many professors just don't know any better. We still learned how to do linked lists with nullptr instead of using something sensible like optional. We have never seen the good parts of C++.
we weren't even taught linked lists with nullptr lmao our prof still uses NULL
Simply great. He is such a good, confident teacher. This video is a must-watch for all programmers.
I love the way this guy talks
Great video UofE. My favorite quote starts at 07:36, and he nails it. Thank you sir for C++ and everything else!
Bjarne starts at 7:30 fyi
Thank you
So this is the guy that made my CS classes so difficult. He looks exactly how I imagined he would look.
hahaha you made my day
lmfaoooooo
K&R and Stroustrup were my favorite books back in college. Thank you a lot, Prof. Stroustrup.
All of us who understand a little more about computing than "normal" people know how important C++ is for today's computing. Thank you very much, Mr Stroustrup.
the man himself talking about C++
People's names at 17:00
Assembler: David Wheeler
Fortran: John Warner Backus
Simula & OOD: Kristen Nygaard
C: Dennis Ritchie
C++: Bjarne Stroustrup
I'm a Microsoft Certified C++ Programmer. Love the language.
1:34:17 I think the best way forward would be to specify lots of old C++ features as deprecated and automatically generate a warning if deprecated feature is used. That would allow slowly removing parts of C++ that have better alternatives today.
That's why I learn Rust.
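The deprecation idea above already works at the library level via the standard [[deprecated]] attribute (available since C++14); a minimal sketch with made-up function names:

#include <cstdio>

// Old interface, kept for source compatibility but flagged so that every
// use produces a compiler warning pointing at the replacement.
[[deprecated("use print_greeting(const char*) instead")]]
void print_greeting_old() {
    std::puts("hello");
}

// Newer replacement.
void print_greeting(const char* name) {
    std::printf("hello, %s\n", name);
}

int main() {
    print_greeting("world");
    print_greeting_old();   // warning: 'print_greeting_old' is deprecated
}

That only covers library names; deprecating core language features themselves would still rely on compiler warnings rather than an attribute.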
This man is a legend, and C++ is a great language, maybe the most important programming language.
not to downplay C++, but C is easily more important than C++
@@joestevenson5568 not to downplay c++ is C
Love and respect. I encourage all those who want to learn C++ deeply to read his book, 4th edition!
he's one of the few who can use words like encapsulate and you know he isn't trying to pull a trick
So you think when people use words with more than two syllables in them they are trying to trick you?
C++ is tough. I'm about to head into data structures this fall, which is based on C++, and it's pretty intimidating, but honestly it's all doable. I definitely won't be working in C++ as my career, as I will be going into front end and UX, but knowing C++ and how powerful it is is interesting.
Follow the Bisqwit channel for C++ magic.
Thank you.... Morgan Stanley, speaker Bjarne Stroustrup & Dave Robertson @ University of Edinburgh
His talks are so dense! Packs in a wealth of information and insights that need real dedication to follow.
The start is at 7:27.
Apparently they don't have any video editors at Edinburgh.
I can understand those sighs... a most pragmatic guy invents a language that gets used in the most impractical ways.
It’s cool that Bjarne Stroustrup uses Cygwin on Windows. I used to use it a lot too. These days I use msys2 because it lets me create programs that don’t require the Cygwin DLL file but it still gives me that Unix environment I got used to from my Ubuntu days.
Example at 47:37: the use of Value_type and V is a bit weird. You may think they're the same type, since values of one and the other are compared directly, yet they are different types.
At 49:43, the code for the range-for loop is wrong; it should be for (auto& x : s) x->draw();
Some of the other examples were also missing semicolons.
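A small self-contained version of that corrected range-for loop, assuming a hypothetical Shape hierarchy and a vector s of smart pointers (none of this is taken from the slides; it is just enough to make the line runnable):

#include <iostream>
#include <memory>
#include <vector>

// Hypothetical class hierarchy, just to make the corrected loop runnable.
struct Shape {
    virtual void draw() const = 0;
    virtual ~Shape() = default;
};

struct Circle : Shape {
    void draw() const override { std::cout << "circle\n"; }
};

struct Triangle : Shape {
    void draw() const override { std::cout << "triangle\n"; }
};

int main() {
    std::vector<std::unique_ptr<Shape>> s;
    s.push_back(std::make_unique<Circle>());
    s.push_back(std::make_unique<Triangle>());

    for (auto& x : s)   // x is a reference to each unique_ptr in s
        x->draw();      // virtual dispatch picks the right draw()
}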
C++ is everything you could want it to be; it's a beautiful language.
Tyler Minix "Beautiful" as in "it's what's on the inside that counts," because it's horrendous to look at 😀
Frankly you are insane
It's pretty powerful, but also hard to control.
Good material before a sleep session, this story is quite a lullaby.
29:54 _look I'm a java programmer!_ 😊
LOL!
When these vids are made:
1) Use two cameras, one for the person, one for the presentation.
2) Show the person on the left or right using only 10% of the width of the video, while the other 90% of the screen shows the presentation.
3) Make sure that when the presenter is POINTING, one can see what is being pointed at.
Looks like we have film students making cuts who don't know what is being said, so the cut to the presenter happens when the viewer should be seeing what the presenter is talking about. Just something to ponder.
some of the comments are so disappointing. Respect the man.
People these days don't respect the past. Sad to see really.
Long ago, I read a book by Edsger W. Dijkstra and, more or less, I've never had a problem with buffer overflows ever since. I read that book in the late 1980s, and the book was already old news then. Dijkstra's recommendation to error handling was this: if the operation can't legally be done, don't do the operation. No need to create a new control path, you'll be surprised how fast your loop exits the bottom once it's no longer able to do any legal operation. Then at the bottom, you check whether your contract was satisfied, and if not, you return an error condition. I hated exceptions with a passion-and refused to use them-until exception safety was properly formulated (mostly by David Abrahams) in the context of formal RAII. I continue to prefer Dijkstra's model, because it simply forces you to think more clearly, and implement your loops more cleanly. The only drawback is that it's hard to avoid defining a handful of boolean variables, and you can end up checking those variables almost every statement.
Supposing C++ had a lazy &&=, the structure looks somewhat like this:
bool function () {
bool ok = true;
ok &&= first_thing(); // function not called unless ok is true
ok &&= second_thing(); // function not called unless ok is true
return ok;
}
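C++ has no lazy &&= operator, but the same short-circuit structure can be written in standard C++ with plain assignment and && (a sketch with placeholder first_thing/second_thing/third_thing functions):

#include <cstdio>

// Placeholder steps, purely for illustration.
bool first_thing()  { std::puts("first_thing");  return true;  }
bool second_thing() { std::puts("second_thing"); return false; }
bool third_thing()  { std::puts("third_thing");  return true;  }

// Same structure as the hypothetical &&= above, in standard C++:
// && short-circuits, so later steps are not called once ok is false.
bool function() {
    bool ok = true;
    ok = ok && first_thing();
    ok = ok && second_thing();
    ok = ok && third_thing();   // not called here, because second_thing failed
    return ok;                  // at the bottom, the caller checks whether the contract held
}

int main() {
    std::printf("ok = %s\n", function() ? "true" : "false");
}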
Mathematically, _every_ operation you attempt has a precondition on whether it can be executed legitimately.
But we learned way back in the 1970s-the age of Dennis Ritchie-not to write code like this:
for (p = p_begin, s = s_begin; s < s_end; ++s) {
if (p < p_end) *p++ = *s;
}
My God! All those extra loop iterations doing _nothing_ you might execute for no good reason if (p_end-p_begin)
Perhaps this is a good time to point out that on YouTube, if you nest the bold and italic syntax in the wrong way, editing a comment, making no change, and saving it again is not idempotent (it keeps wrapping more and more escape characters-incorrectly-with each additional unchanged save). Man, that takes talent. Broke-the-mold talent. Couldn't get i, j, and k into the correct alphabetical order if your life depended upon it talent.
Also, I hope people read carefully enough to realize that I was exactly half serious: half serious because this *does* actually work, half not serious, because it only works for a stubborn iconoclast willing to put up with certain extreme forms of surface ugliness - and stubborn iconoclasts (like Knuth and Bernstein and every second Haskell programmer) have always managed to write software with orders of magnitude fewer bugs than competitive endeavors from lower down in the iconoclasm pecking order, so it's really not such a special value add.
@@afterthesmash The problem with avoiding exceptions is that you are just manually implementing the error handling mechanism already present in the language. You are also wasting the return value, which could be used for something more natural. The big advantage of exceptions to me is that you can abstract them away; this way you are left more with code describing the task. This is much cleaner than code checking for errors, with the actual task buried somewhere inside. I'm sure you could make it work, like most things, but it looks like it results in a lot of unnecessary code.
@@xybersurfer xybersurfer Naturalness only exists in a cultural context. You choose to operate within a software culture with a weirdly enlarged notion of "error". If I tried to sign up on Facebook and I entered my "Allan Stokes" as my preferred username, would the system create me a new account, or would it display an "error" that my username is already taken by some other Allan Stokes of no particular distinction? That counts as an "error" in what sense, precisely? Was it an error that some other Allan Stokes wanted my personal name? Was it an error that I also wanted the same? Was it an error that Facebook did not originally design their social network so that all users were known by a globally unique 128-bit GUID, preventing this kind of land-rush clash for all time? I count _none_ of these things as errors. Not everybody can have everything at the same time in the world we actually live in.
Since I'm already a decade or two late boarding the Facebook ark, it's now a hail Mary for me to obtain the simplest rendition of my own name in this sphere. It's for the same reason we send Santa a list, and not just the single item Ferrari Testarossa. Santa might be short on Ferrari's this year, especially in light of the present circumstance in Milan lately. So it's a wise policy to provide Santa with options B through F, as well. Was it an "error" for me to want a Ferrari Testarossa? Or was it merely the first interaction in a distributed algorithm to probe the maximal intersection point between my list of desires and Santa's available merchandise?
Likewise, if my algorithm wants 10 GB of DRAM working-space so as to run faster, is it an "error" if my request can not presently be granted by Santa's elves? I don't count that as an error, either.
Let's suppose I'm only asking for 256 bytes, and malloc returns NULL. Under those conditions, whatever my subroutine is trying to accomplish is probably not viable. How does the OS know that this is now an error as you see it, because your program design has boxed you into a corner with no recourse but to barf up a
White Flag of Surrender message box? The OS does _not_ know this. (Is it a coincidence that "throw" and "barf" are somewhat synonymous? I think not.) What the OS _could_ know for certain is that you just called free on a pointer it can't find on any active allocation list, which is _definitely_ an error. And if your program has just called free on a non-thing (type II error), what confidence do you now have that you haven't previously called free on a thing you intended to continue using (type I error)? Practically NIL. This is now Hunt the Wumpus territory. "It is now pitch dark. If you proceed, you will likely fall into a pit."
But this isn't actually defined in the standard as an "error". The standard instead says that free's behaviour is now undefined. And once _any_ function commits undefined behaviour, this condition can not be erased short of exiting the current process (although in theory, attempting to exit your process can now also be undefined)-and now the state of your Unix process tree can potentially also become undefined and it's probably now time to pay homage to CTRL-ALT-DEL. In practice, exit is usually engineered to not depend on much, so it usually won't fail to exit your process, to some number of nines (typically more for BSD with a belt-and-suspenders attitude dating back to the original graybeard; fewer for Linux, which regards too many nines as a type II cultural error that greatly hinders the effective evolution rate).
This is the funny thing about error culture in computer science. The unambiguous errors are considered so severe, that the cultural response is "why even contemplate continuing to press forward?" whereas completely foreseeable circumstances-like two people trying to sign up on Facebook with the same username-are reified into the barf loop (aka culturally re-parented into the class of "exceptionable" error events).
My personal favourite case study is the Therac-25, a Canadian radiation therapy machine which either killed or badly injured at least six people. It had two beam strengths: regular and lethal (100× beam power). There was a turntable inside to rotate a bullet-proof vest in front of the lethal beam. On striking the bullet-proof target, the lethal beam would kick out some other modality of therapeutic radiation as a byproduct. Patient saved. In Dijkstra's world, you have ONE place in the code which is capable of unleashing lethal beam energy. Under what condition are you not allowed to do this? When the bullet-proof vest has _not_ been rotated into position. Otherwise, the patient experiences an extreme shock and _literally_ runs screaming out of the treatment room (as did a fatally wounded Roy Cox in actual fact). But you see, the software was designed to ensure that the correct turntable position was already activated and properly in place long _before_ subroutine release_the_radioactive_hounds was called upon to do its dirty work.
Mechanically, the turntable implemented _three_ separate microswitches to ensure that the software could reliably detect its true position (not just two sets like your microwave oven door).
They got carried away with thinking about what the software was trying to _do_ (treat the patient) and forgot to think clearly about what the software was trying to _not_ do (kill the patient). There's no excuse on God's green earth for a software culture which accepts _anything_ other than a last conditional statement to check the turntable position (and every _other_ survival-critical precondition, if there were any more) before the ONE line of code capable of unleashing the 100× lethal beam. We were a good 35 years into the software industry before these fatal accidents happened. Dijkstra had completely set this straight already by the 1960s.
In ZFS or PostgreSQL-or any other life-critical application (as viewed by the data under storage)-the implementation culture is strongly biased to think first and always about what you simply MUST NOT DO (e.g. create a race condition on flush to disk). Further up the software stack, the culture flips into: call the extremely sophisticated and fast-moving API as best as you can and mostly hope for the best. If a 7'2" Santa Claus with a 30" vertical leap power-blocks your layup attempt, you click your heels through the handy-dandy uncluttered exception mechanism, and return to Kansas, hardly any the wiser, but more or less still alive. Betcha didn't know Santa had that move. Good thing your ruby slippers are maximally topped up on reindeer credits.
Only here's the downside: people who live in Kansas begin to develop an extremely muzzy sense of what a true error actually looks like. "Error" becomes a hopelessly muddled class of eventualities where it seems more convenient to click your heels together than to write a robust code block with a clear sense of the hazards it is actually navigating. And then this becomes further reified into what "uncluttered" code ought to properly look like, because we are at the end of the day cultural creatures, who thrive on cultural uniformity
@@xybersurfer Long ago I attended a pretentious university (Waterloo, Ontario) where the faculty was determined to teach freshman and sophomores the One True Way, which in that year was the Pascal programming language, with a top-down design methodology. Everyone I knew with any talent was conniving to get an account on any machine with a viable C compiler, while the faculty was conniving to keep access to these machines as limited as possible for the sake of "sparing the children". But no high-integrity system is ever designed top down. You mainly work bottom up at first from your core integrity guarantees. And then once you have a coherent API over your core integrity layer, you work top down, but still not as I was originally instructed. The top-down portion is architected from your test and verification analysis. The best modularity is the maximally testable modularity, and not the modularity that produces the least amount or the cleanest amount of code. In particular, your abstraction boundaries only need to be clean enough, rather than conceptually pristine (pristine when you can swing it, but not at the cost of making the full system less testable). The Linux people thought that ZFS had too many layering violations, and that with the benefit of hindsight and some fundamentally better algorithms deep down, they could kick ZFS to the curb with Btrfs. Actual outcome: Btrfs was removed from Red Hat Enterprise Linux 8 (RHEL) in May 2019.
Btrfs team: We know pristine (good), we know clutter (bad).
ZFS team: We know core coherence layering (hard, but doable with a clear head), we know testability at scale (hard, but doable with sustained cultural pragmatism).
From Rudd-O: "ZFS will actually tell you what went bad in no uncertain terms, and help you fix it. ... btrfs has nothing of the sort. You are forced to stalk the kernel ring buffer if you want to find out about such things."
That's a guaranteed fingerprint of a click-your-heels-to-return-to-Kansas design ethos. At which point all you can do is emit a somewhat cryptic message to the OS ring buffer.
The original inviolable rationale for why top-down design as I was once taught in Pascal was going to rule the world is that it's fundamentally based on divide and conquer. And what can possibly defeat divide and conquer? We had hundreds of years - if not _thousands_ of years - of history with this technique in mathematics. And it truly rocks in that domain. Because mathematics - of all human things - most excels at abstraction from messy circumstance. By emulating mathematics to the ultimate extent, so too can software _pretend_ to exist in this austere world.
The original ZFS design team at Sun Microsystems knew how to design the system around their validation framework, because they were _deeply_ immersed in the world of the shit hitting the fan. Before Matt Ahrens proposed ZFS, a typo in the configuration of the Solaris volume manager caused the system to lose all home directories for 1000 engineers. And that was far from the only domain of extreme suffering. Original design principle: to escape the suffering, the administrator expresses intent, not device mappings.
Now I'm circling back to the original problem. When you're muzzy-headed about what constitutes a conceptually coherent error class (because you've lumped sick world & sick dog under "click-your-heels" through the underground exception-return expressway) there's barely any possible way to think clearly about application intent. Oh, but you say "the exception return object can be arbitrarily complex to handle this". And right you would be, only there goes your pristine and uncluttered code base, for sure. And now what have you really gained?
Because you are already welded into the exception mindset, you're almost certainly going to try to report the damage on first contact with a 7'2" Santa Claus guarding the post. And you return an object encoding error 666: denied by Giant Black Goliath in a red deerskin vest. I've actually played around with writing code where I would continue to attempt to do all the things the code can legally do _after_ something has already gone fatally wrong (fatally wrong meaning there is no remaining chance to achieve your subroutine's mandatory post-condition) and then report back _all_ the resources and actions that proved unsuccessful in the quest to accomplish the assigned task. (There is no possible way to implement this coding strategy that you would not decree to be hopelessly cluttered.) And this is interesting, because when you do get six errors all at once (illegal host name, can not allocate memory, no route to host, private key does not exist) then you're immediately pretty sure you entered double-insert mode in vi and there's an extra "i" character somewhere in your application's configuration file.
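Roughly the shape of that experiment, as a hedged sketch (all of the helper names below are invented for the illustration):

#include <string>
#include <vector>

// Hypothetical checks -- stand-ins for whatever the subroutine can still legally probe.
bool hostname_is_legal(const std::string& host);
bool can_allocate_buffers();
bool route_exists(const std::string& host);
bool private_key_present();

// After the mandatory post-condition has become unreachable, keep probing
// everything that can still be probed and report the whole list of failures.
std::vector<std::string> connect_and_report(const std::string& host)
{
    std::vector<std::string> failures;
    if (!hostname_is_legal(host))  failures.push_back("illegal host name");
    if (!can_allocate_buffers())   failures.push_back("can not allocate memory");
    if (!route_exists(host))       failures.push_back("no route to host");
    if (!private_key_present())    failures.push_back("private key does not exist");
    return failures;  // several of these at once usually points at one mangled config file
}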
Suppose you tell a human minion: deliver this yellow envelope to 555 Hurl St, last door on the right at the end of the long yellow hallway on the third floor, and then grab me a muffin from the bakery downstairs. Your minion comes back to report "no such address" (with no muffin) and doesn't point out that while the hallway on the third floor was long, it was orange instead of yellow - and they didn't even check whether there was a bakery down below, because the hallway was orange instead of yellow, so they never even _thought_ about completing the rest of the assigned errand. "And there was no bakery down below" would have you checking your street address first of all. "And there _was_ a bakery down below" would have you checking your floor number first of all (while happily munching your hot muffin as a side bonus).
Human contingency is _so_ different from code contingency, and I just think this is wrong-headed all around.
But modern computer languages really do tend to compel divide and conquer, and I concede that it's extremely difficult to enact human contingency in a way that's clean and compatible with this reductionist software engineering ethos.
I've seen lots of unnecessarily cluttered code. Almost always because of a woolly conceptual foundation, where you're half in one system of coping and half in another system of coping. But I've gradually evolved to where I almost never count clean expression of preconditions or postconditions (all the way down to the level of individual statements) as active code clutter. My eye now regards such code as a somewhat prolix signature of a job well done, no ruby slippers involved.
As a programmer, I'll tell you what really ticks me off about these high-level OOP languages: I'm pretty good at logic and program flow, but my productivity is drastically slowed down by having to remember, or look up, all the INSANE syntax, variable passing, and multiple file dependencies of C++.
It's an amazingly powerful language, but it also gives you the opportunity to shoot yourself in the foot at every step of the way. To use it simply and safely, you'll probably start using smart pointers and copying strings instead of referencing them, undoing its performance potential. Still, you can do everything, including shooting yourself in the foot.
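To illustrate that trade-off with a made-up toy (nothing here is from the talk; Session and the two functions are invented names):

#include <memory>
#include <string>

struct Session { std::string user; };

// "Simple and safe" style: the smart pointer owns the object and the string
// parameter is taken by value, i.e. copied from the caller.
std::unique_ptr<Session> open_session(std::string user)
{
    auto s = std::make_unique<Session>();
    s->user = std::move(user);
    return s;
}

// The "performance potential" style: pass by const reference and hand back a
// raw pointer -- no copies, but now the object's lifetime is entirely your problem.
Session* open_session_fast(const std::string& user);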
***** I would also listen to this video. He will explain why we got so many different kinds of compilers. You could skip to the end where he does questions and answers.
I find I can listen to some videos without the sound and learn a lot.
I do this when I am washing my clothes or cooking.
that's what I call real multitasking.
***** what the hell are you doing talking to your fucking self? Or is ***** another person as I have long suspected?
***** Sadly you will never make a programmer with your split personality behaviour. You need your arse spanked until you're proper Bonobo blue black
***** The answer to what dim wits?
***** Well tell your fucking puppeteer to shove you back in his suitcase.
MScode ™
"I find I can listen to some videos without the sound and learn a lot."
How are you able to listen to a video without sound?
I found this a rather simple lecture for university students. For college/A-level students it's very useful. However, Bjarne is a genius.
my C++ professor is kicking my ass this semester
one thing I recently learned is that a member function of a class can take a parameter of its own type, for example:
if I have:
class Long
and a member function
void add(const Long&)
I can go
void Long::add(const Long& L){
    number += L.number;
}
this line of code really confused me until my professor said "do you have a driver's license?" - I go yeah - he said "so do I, but are they the same?" - no
"the objects are the same way the are the same thing but have different information in them"
I thought that was really cool and a cool way to look at it
just thought I would share my experience :P
Yup... and "number" can even be private. People tend to think that private members hide data from other objects, but that's not the goal. The goal is to hide members from people who don't know how Long works internally. Two Long objects share the same code, so they assume that the person who wrote them knows how to handle the two of them interacting. It's when you start interacting with other classes, which are probably written by other people at other times, that private starts to say "Eh... but do they *really* know what Long uses that private field (or method) for?"
rusko dudesko yeah took me a minute to understand the this pointer and how it can point to private data for any new object created
Experienced programmers often write the same code as "this->number += L.number"; it makes it explicit that "number" is a member variable.
@@brianplum1825 yeah... then you can search for this-> and hopefully get all the members... but you can't rely upon it... especially with multiple developers.
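For anyone following the thread, here is a minimal self-contained version of the toy class being discussed (Long and number are just the example names from above, with the this-> style the replies mention):

class Long {
public:
    explicit Long(long n) : number(n) {}

    // A member function can take a parameter of its own class type, and it may
    // touch the private member of *both* objects: access control in C++ is
    // per-class, not per-object.
    void add(const Long& L) {
        this->number += L.number;   // this-> makes the member access explicit
    }

    long value() const { return number; }

private:
    long number;
};

// Usage: Long a(2), b(3); a.add(b); now a.value() == 5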
How to introduce a speaker: "Ladies and gentlemen, our guest speaker is X. Please welcome him/her to the stage." Then get off!
Wow, this is unbelievable!! I never thought I'd sit in front of this eminent sir; I can't believe it, because to me he was one of the geniuses of my Turbo C/C++ book. Amazing! Oh, never mind the fact that I'm sitting at home :P
The internet is amazing!
If a Scottish accent doesn't turn you on, please press this button! It'll save you much time ---> 7:20
The essence of C++ is int & whatsit::whazzat( HUH? ) -> *= ....
you can mix and match - university of simplicity here - use your old codebase in a library, DLL, etc., and just call it from newer C++ versions if you do not feel like updating the code
Haha, legendary Bjarne "First of all, i dont go round killing people"
What did he mean by this comment?
It’s crazy how C++ is moving closer towards Java/C#, where even the creator says to avoid using pointers and to have automatic garbage collection. With memory being so plentiful (even phones have 4+ GB), I guess it makes sense nowadays.
Just want to say something: do not argue with him - as in, the guy talking about C++ - because he is the creator of C++ and knows more than all you 6-year-olds.
I have a degree in computer science with games programming, I'm leaving this lecture at 46 mins in cos I need a beer and my brain is sludge!
When C and C++ were developed we were already using the Extended Telephone Switching Language (ETSPL), which was PL/I based, strongly typed, with a simple and clear method to interface with hardware. That he ignores that thread in his description of languages is curious. So much of what is said is untrue.
7:25 - to skip the blah, blah at the start.
thank you lmao
thank you! you saved 7 minutes and 25 seconds of my life
@@qn565 - yes! 7 minutes and 25 seconds we'll never get back.
THE MAN THE MYTH THE LEGEND
What a great answer about not writing a language like Java😄😄
7:00 Tell them what you're going to tell them, tell them, tell them what you told them: done to perfection-that shiny trader is going to really enjoy this lecture.
I watch this talk at least once per year, just to remind me what it means to be a (good) programmer.
I just watched this entire talk and I can honestly say most of what was said was beyond my understanding. I want to learn C++, but it seems too hard.
It's really not. The problem is, as the man said, that it's been around a long time, and has been, in a way, several _different_ related languages over the years. Keep in mind how very slow and clunky computers were when C++ was introduced - a lot of the original language relied on the programmer to do most of the heavy lifting. That is, you needed to think about the machine almost as much as the problem you were working on. These days, if you use the right parts of C++, you can offload most of the "thinking about the machine" part to the compiler and just concentrate on the problem you're solving. The compiler itself can be a much larger program than was possible in the early days, and it will compile your modern code in seconds or minutes rather than weeks. Most of this talk was about why it's better to use the newer language features when you can rather than the features that were there so that your steam-and-hamster-powered 4MHz processor with maybe a couple of megabytes of memory (huge for the time) wouldn't choke trying to compile it.
I'm pretty sure I attended this talk.
Well, I only watched the first 30 minutes due to an unfortunate attention span, but great lecture! I would also suggest watching this at x1.25, flows a little better.
I've just started messing about with an ESP32 embedded CPU, so I am having to learn a bit of C++. I would only call myself an amateur programmer at best. I started out on Python for ease of access, which is nice, but soon hit the performance limitations. So I did the research on lots of other languages to weigh up the pros and cons - they all have them, and they will be different for different people - and I settled on D. This was an interesting lecture, but it does reinforce the conclusion that C++ really wants to be D, it's just going to take 10++ years to get there. Anyway, I hope I can avoid the pitfalls highlighted by Bjarne.
Is D still alive? I used it 10 years ago and it was nice, but today I settle for Rust and have more grip on performance.
the man who created chaos
Oh, I thought the essence was to weed out Comp Sci majors.
Best comment on RUclips.
… seriously I still don’t understand why people get so frustrated with c++…
Wow! Concepts and modules should have been finalized before C++17!!!
Watching it in lockdown!!
brilliance beyond my software engineering skills
All hail Bjarne the creator of our generation!
Thanks a lot for this video. Keep uploading more of such content. Good luck...
Got to have that hair to be a legend
The Morgan Stanley guy's biggest achievement in life is getting his managers to switch from C to C++. There is something very sad about that ...
hyqhyp is there any article about it?
Actually, he said he got them to move from Fortran to C and to C++, but I still agree with your point...
Making a large company change their primary programming language is a serious achievement.
hyqhyp wth.. he invented one of the most commonly used languages of modern time. What are you smoking?
@@bextract0 He's taking about the guy at the beginning. Not Bjarne. You completely misunderstood his very clear comment lol.
Haha. His Danish accent is lovely! xD
Yea, but his vocabulary is great; on the other hand, his Danish name Bjarne is pronounced wrongly. It is pronounced 'Byarne', where the e is pronounced like the e in 'error' :). Bjarne means Bjørn ('byoern', with an ö like in birn or firn), and bjørn means 'bear'. Touché :)
Strange how so many Danish computer scientists have influenced the programming languages...
Shallex ... Well yes, quite... but the English ‘Y’ is equivalent to the ‘J’ in all the other Germanic languages (Dutch, Flemish, Danish, Icelandic, Faroese, Norwegian, Swedish, German). So it's more like ‘Byarne’, but without saying ‘by arne’... because ‘Arne’ is another Danish male name :)
@8:is I completely agree with the description of that subset of the human species 😂😂
No wonder I had some serious problems over the last decade!
The only observation I would like to share is: engineers are found in more places than science…
You have MBA engineers, for example.
In closing, I am far from being an expert (thank you lord) but I am starting to appreciate C++.
I have no clue if I am getting smarter or what happened
Love Christopher Guest, he's the man of a thousand faces!!!
EXCELLENT lecture!!!
It appears Python has overtaken C/C++ as the most popular programming language. Almost all of the young engineers and computer scientists I encounter are proficient at programming in Python and have almost no experience with C/C++.
This only increases the value of C++ programmers
assuming people want C++ programmers @@postmeme44
this man is my hero.
He looks like the "Beast" from Kung Fu Hustle
At 1:13:08 he spoke about the creator of OOP; I couldn't understand the name properly. I thought it was Alan Kay who invented it, but apparently Bjarne states otherwise.
I'm very curious to read more about the man he talks about; if anyone can write his name, that would be lovely.
Kristen Nygaard, who together with Ole-Johan Dahl invented Simula.
I wondered what Gail Porter was doing recently.
Bjarne was always one of my gurus. When I turned on this video, I was wondering if I would be able to make any sort of comment on it. Well:
14:16 "Complexity of code should be proportional to the complexity of the task"...
What failed during the last 20 years, and why do we have so much "N-gigabyte fat" crapware...
i love his very subtle sarcasm
34:19 What does he mean here? ("There are some behind the scene") How are there pointers behind the scene?
Thanks for sharing the lecture!
The essence of C++ is C.
His talk starts at 7:20.
@The University of Edinburgh
You should either add real English captions or take down what you call English captions, which are in fact just another auto-generated version.