Looking for books & other references mentioned in this video?
Check out the video description for all the links!
Want early access to videos & exclusive perks?
Join our channel membership today: ruclips.net/channel/UCs_tLP3AiwYKwdUHpltJPuAjoin
Question for you: What’s your biggest takeaway from this video? Let us know in the comments! ⬇
I learnt a lot in the early 80s, as a young developer, from the experienced people of the time, and I'm still passing it on to new young developers/architects/analysts/managers today... Love this!
Alan Kay is also critical of modern programming practices, referring to them as a "pop culture". By which he means: we are obsessed with the latest fads in languages/frameworks/methodologies, but completely fail to consider the richer body of knowledge that has accumulated over the decades.
got a reference for that?
@@mohamedfouad6492 queue.acm.org/detail.cfm?id=1039523
@@zetaconvex1987 thank you
Don't forget that 'pop culture', like fashion and fads, moves in a circle...
What was once 'in' and 'hip' will be again, if you just wait a decade or two.
Tempted to throw out that gaudy leather jacket? Don't. It'll become fashionable again before long.
Maybe that's what's really happening here? We're not doing or inventing anything REALLY new; that's rare.
Rather, we keep refreshing the same old ideas every couple of years to fool ourselves into thinking we're actually moving forward. We like that feeling of momentum. It makes it feel like we're at the bleeding edge... instead of merely factory-line workers.
@Richard Gleaves It doesn't matter whether an idea is good or not. What matters is whether the idea SELLS well or not. :)
Once again a fantastic talk that reminds you of all the things you could not remember that you knew.
This audience is absolutely brain dead for humor.
Besides the excellent topics he's explaining very well (in ALL his talks), the jokes are hilarious.
I've found this to be common at programming conventions. Sometimes it's a language barrier, sometimes the devs take themselves way too seriously, and I've seen some hilarious presenters.
Well, it's not a comedy show, so the audience isn't mic'd up. Unless it's an absolute knee-slapper, you don't hear them.
I attended this one. This talk was on the main stage (you can tell how large it is when Kevlin walks around). It was seriously a huge room. I promise you we thoroughly, loudly enjoyed this talk. It's just not going to sound right on a recording when 2/3 of the chairs are empty, and the back half of the room didn't even have chairs.
IO and keeping the code small and simple - I ran into that as a problem many times.
I had to do some data sorting, and the construct I got the data in was so unwieldy that it took me over 100 lines of code... I ended up rewriting it twice, and then I rolled my own data structure and could reduce it to a single screen's worth of code that was easier to understand.
And then during code review they complained that having nested loops and using sort inside would be too slow. Turns out no - their code that was calling my function took LONGER just to call my function, because they decided to first write a lot of tracing line by line, instead of preparing the information and writing it in a single statement. Sorting 1,000 elements based on multiple weighted conditions and re-sorting them after each interaction was still faster than their calling code just logging out the data they were giving to the function.
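For what it's worth, here's a minimal sketch of that kind of multi-condition weighted sort (Python, since the original language isn't stated; the field names and weights are invented for illustration):

```python
# Hypothetical sketch: sort records by several weighted conditions at once.
# Field names and weights are invented for illustration; the original code
# (and language) isn't shown in the comment above.

def weighted_score(item, weights):
    """Combine several criteria into one sortable score."""
    return sum(weight * item[field] for field, weight in weights.items())

def sort_items(items, weights):
    # A single sorted() call with a key function replaces the nested
    # compare-and-swap logic that tends to balloon past 100 lines.
    return sorted(items, key=lambda item: weighted_score(item, weights), reverse=True)

items = [
    {"priority": 3, "age_days": 10, "size": 5},
    {"priority": 1, "age_days": 2, "size": 9},
]
weights = {"priority": 10.0, "age_days": 0.5, "size": -1.0}
print(sort_items(items, weights))
```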
The argument against nested loops is hardly worth listening to. I once had a real-time filter that required a pointer-based list with five levels or so... and an equal number of loops, most of which did only one iteration most of the time, because the conditions for the iterator failed in almost every case. I have never been able to come up with a better solution to that code... it's simply forced by the problem, which was a time series of physical data that had to be compared against a four-dimensional coincidence pattern (similar to the way GPS determines a position and time from four or more satellite signals). The solution was simply to calculate estimators for the possible temporal and spatial displacement in the data for all sensors and then to check in the data stream (by jumping back and forth) whether there was any recorded event during the estimated intervals. Most of the time there wasn't, so the loops never spent much work. The code ran at 40% of total CPU power on my elderly laptop for an estimated one million real-time events per second (which was a conservative assumption). The original proposal had called for a small supercomputer farm with over 100 CPUs. Go figure.
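Roughly the shape being described, as I read it (a hedged sketch; the sensor data layout, window offsets, and function names are invented, not taken from the commenter's code):

```python
# Hypothetical sketch of the coincidence-filter shape described above: for each
# event on a reference sensor, estimate the interval in which a matching event
# could appear in every other sensor's stream, jump straight to that interval,
# and bail out as soon as one window is empty. The nesting mirrors the number
# of sensors, but most iterations do almost no work.

import bisect

def coincidences(streams, windows):
    """streams: sorted lists of event timestamps, one list per sensor.
    windows: per-sensor (lo, hi) offsets relative to the reference event."""
    hits = []
    for t0 in streams[0]:                          # reference sensor
        match = [t0]
        for stream, (lo, hi) in zip(streams[1:], windows):
            i = bisect.bisect_left(stream, t0 + lo)    # jump into the estimated interval
            j = bisect.bisect_right(stream, t0 + hi)
            if i == j:                             # nothing recorded in the window
                match = None
                break
            match.append(stream[i])
        if match is not None:
            hits.append(match)
    return hits

# Three sensors; only the first reference event has matches on both the others.
print(coincidences([[0.0, 5.0], [0.1, 9.0], [0.2]], [(0.0, 0.3), (0.1, 0.4)]))
```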
I've watched some of these video's now. His talks contain a lot of hidden humor.
He's British. Of course. Brits are masters of subtle humour.
Also: videos*.
the audience not responding to any of it is the weirdest thing about this I think
@@mirageowl I think it is only a sound problem, I mean we don't hear it. It is not possible that the audience doesn't get it - this much. :)
@@mirageowl Noticed this on a few videos. The audience audio just isn't captured that well. In some of the videos you can see the audience physically laughing while hearing nothing. He also pauses for laughter sometimes which looks weird when you can't hear it.
those who do not know their history are bound to repeat it
I'm working on a system with database migrations larger than Unix.
Brilliant and funny.
Amazing talk thank you so much for sharing this.
7:00, it's funny if you have more than one singleton _of the same type_. But they can be of different types, which is acceptable. Anyway, the way singletons are implemented is bad - so don't use them.
13:30, this statement would make sense if completeness covered ALL POSSIBLE situations. Then it could be sacrificed in favor of simplicity: when you decide to walk away from completeness, you end up simple (therefore likely to be correct), fast (fewer things to handle) and consistent (simple code).
41:30, yes, this is the main principle of OO. Of course, other things can arise from it, if they don't "get in the way", such as a concept around the class, reuse of its code, and so on.
We can't tell people not to use singletons. They are forced on us by either the hardware or the software requirements. In any case, I am not aware of a single language in use that can't implement singletons rationally. What no language can do is enforce a singleton against a malicious programmer. After all, absolutely nothing stops a programmer from copying the singleton code line by line. The point of patterns is, of course, not to prevent the unpreventable. The point is to point out problems that result from accidental misuse of a resource. In the case of singletons, a simple print or log statement in the initializing code will do. If the log contains more than one singleton init call, then we know that something went wrong.
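Something like this is presumably what's meant (a sketch in Python; the class name and logging setup are invented for illustration):

```python
# Hypothetical sketch: a lazily created singleton whose initialiser logs itself,
# so a second construction shows up immediately as a duplicate line in the log.

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

class Config:
    _instance = None

    def __init__(self):
        # If this line ever appears twice in the log, something bypassed instance().
        log.info("Config singleton initialised (id=%s)", id(self))

    @classmethod
    def instance(cls):
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

a = Config.instance()
b = Config.instance()
assert a is b   # only one init line should have been logged
```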
nice presentation 👌
Great talk! Loved the humor.
Nice explanation of how new programming technologies aren't new at all.
There's a proverb in Russian:
Everything new is well-forgotten old. (всё новое это хорошо забытое старое)
It applies to many things (or it wouldn't be a proverb).
Part of the problem here is the fervent need of the publishing houses to come up with new and, specifically, more voluminous books for the new generations... and the bleeding need of professors to recommend said books for reasons of material and personal gain.
20:10 Yeah, tell that to Google and their Stadia.
I miss Samizdat!
On the subject of Perl, I would never dream of disagreeing that it's awful... but at the same time I would highly recommend Larry Wall's books on the language.
Excel at 50:23
31:51 "... it was partly knocked down by that abomination that people call Perl"
Guys, you do know that you've been able to write VERY clean and clear Perl code for a long time now, right? Since the 'use English;' pragma, that is.
It just doesn't push you (read: encourage you) to do it the way other languages do; it has to be your choice.
Great talk, thanks!
Back in the mid-90's, I replaced my Sed & Awk with Perl 5.
Very interesting talk.
Badass
Praise for a smart man attributing software engineering to Margaret Hamilton!
Excellent presentation!
2:45 Here we see the famous statue of Shakespeare badly hyperextending his right knee.
What does this contribute to?
32:42 - 32:55 is essentially the key take-away from the talk. The rest is just setting up and making the case.
To the art of making yourself look like the smartest man in the room by applying advanced ranting techniques.
The speaker is brilliant; I kept alternating between ahahahahaha and ahaaaaaaaaa…; the audience sucks.
That commit strip comic (36:10) demonstrates a profound ignorance of what a specification is. We don't speak in code, we speak in predicates and relations. Quicksort is not a specification of what it means to sort a list, the PTAS algorithm for the knapsack problem is not a specification of what the knapsack problem is, the walking unification algorithm is not a specification of what it means to unify terms, etc. Specifications, at their core, are relations between inputs and outputs, and programs are subrelations of these which are functional, or at least partially functional, so that they give deterministic outputs on well-defined inputs. Given a specification in terms of a relation, R, a program satisfying this specification will be the X solving the inequalities (X ⊆ R & X · X° ⊆ id & id ⊆ X° · X). While R is often very simple, X can end up being arbitrarily complex and won't be unique. Treating X as a specification unto itself is irresponsible and somewhat hypocritical on the part of Mr. Henney. If we treat our code as a description of what we're trying to accomplish, and our code, in total, is too big for us to "fit in our cognitive bounds" (21:30), then we won't ever be able to understand what we're even trying to do, and, apparently, what we're trying to do will change whenever we change our code.
The above description of specifications comes from the Algebra of Programming/Squigol (See "Program Design by Calculation"), but there's also dependent types and other formalization techniques where the specification is something the code needs to satisfy; code is not a specification unto itself. Treating code as its own specification is both ignorant, and a bit defeatist as it implies that, even when communicating to a god-like entity capable of coding anything you want, perfectly, you won't be able to do much better than reading it the program which does what you want. This obviously isn't true.
There's an entire field called program synthesis dedicated to generating programs from formally well-defined specifications. See the Synquid system, for instance.
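To make the spec-versus-program distinction concrete, here is a small sketch using sorting as the running example (nothing in it comes from the talk or the comment; the function names and the permutation check are my own illustration):

```python
# Hypothetical sketch: the *specification* of sorting is a relation R between
# input and output (the output is ordered and is a permutation of the input);
# quicksort is merely one program X with X ⊆ R, not the specification itself.

from collections import Counter

def satisfies_sort_spec(xs, ys):
    """The relation R: ys is ordered and is a permutation of xs."""
    ordered = all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1))
    permutation = Counter(xs) == Counter(ys)
    return ordered and permutation

def quicksort(xs):
    """One of arbitrarily many programs satisfying R."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

data = [3, 1, 2, 1]
assert satisfies_sort_spec(data, quicksort(data))
```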
They had a running implementation of LISP in the late 50s
The seminal Lisp paper was a report of what they had done and what they wanted to do. The paper included the code for an interpreter. It was published in 1960, but it was written in 1959.
I'm here for the Lisp
Every time he talks about singletons he says the same thing: "a system has either 0 or 2 or more singletons present, not 1!". However, most of my projects have a single all-static class (not really a singleton, but close) named "xxApplication" containing the xxWindowManager and other stuff that usually goes into singletons. I went for that following the adage "when in doubt, do as Qt does". Also, I have that one project that uses a real singleton pattern in one place, because I had to fix it in a hurry and that was the best thing I could think of at the time. It is still there today, but it is also the only singleton in that code. So 100% of my code has 0 or 1 singletons, but never more.
unreal
Re-usability at its finest: 6:48 and ruclips.net/video/FyCYva9DhsI/видео.html
Rococo is an architectural style, older than old.
Let's get these likes up!
British humour. Brilliant
Tough crowd. I would have melted at 2:30
Totally hilarious and so true!
windup
Mostly good presentation -- except for the Perl bashing. Perl code can be extremely elegant AND readable at the same time. Just bashing it b/c that's fashionable nowadays is kind of lame. Plus, you have probably never heard of Raku (formerly known as Perl 6), which is a stricter and more modern form of Perl.
var test = old MyClass();
...Yeah, I don't think that's true
:=
Legacy system... :D
And the ONE language you never even mention, which excels at ALL these principles you speak of, is FORTH. I have never seen any language that does as good a job of solving ANY software engineering problem as Forth. PERIOD.
And... all you hear about it is silence.
YES. I couldn't agree more.
That's funny
Bashing Perl and JavaScript for free.
debt -- see Hamilton
Watching this in 2023. "Procedural is bad". Looks at Rust.
I don't think this is unique to programming; it's all evolutionary thinking.
He opens by saying that there's too much new stuff for him to learn, although he already knows it, so just tell him what's different. There are subtle ironies there that might lead him to more sophisticated reasoning, instead of his repertoire of pointless snide remarks. Most new programmers don't care what the differences between his knowledge and the new stuff are; it's not just about him. It's also not as simple as that; language never stops evolving. I find most of his talks are just incredibly condescending straw men. "Iterative design process was a bit of a revelation was it? Ahh actually no." Is anyone actually saying that?
He seems to nearly understand that technology goes through cycles of refinement and reuse of old ideas in new contexts, but then he makes a snide remark like "we have a very weak sense of history, we live in a constant state of astonishment and rediscovery". No, some of us are just younger than you, and not all of compsci is figured out. Also, as long as these topics are relevant, they're going to be re-taught and re-invented. That's good. I think this is a talk in need of a topic. If the topic is supposed to be history, then I'd rather have it presented in a way that didn't sound like The Register on a bad day.
Dennis K Take machine learning, for example: it's really exploding, both academically and in the scientific and business worlds. There is constant delight and astonishment about how we're making tiny discoveries and breakthroughs and applying these techniques to new things.
Visual Studio will soon incorporate AI into its coding. IntelliCode. This is subtle, but has far reaching consequences.
"Been done." He'll say, as part of some sarcastic quip. "Seen it all before."
An argument that can be applied to every new technology that has changed the world. Either a composite or a remix of existing ideas. Like language itself, context matters. We don't have a weak sense of history, and to live in a constant state of astonishment means we are behaving exactly like those who made the things he thinks we haven't learned about.
Unless he wants to enforce some kind of authoritarian principles on technology itself. What is technology but a manifestation of ideas, described and modelled in language so that we may retransmit those ideas? Language, in a Chomskyian sense, is a very good model for this kind of thinking, a kind of etymology of discovery. What right have we to enjoy a new book, when the words which make it are borrowed from Latin, the theme from Shakespeare, the style from Iain Banks?
No, be in constant astonishment. Kevlin Henney wants to curb our enthusiasm, whereas I think we need a great deal more of it.
"Is anyone actually saying that?" yes, there is. I actually was taught in college that iterative design was INVENTED AFTER waterfall was being used as The Way. I understand your remarks, but his talks are spot on for me in many things. maybe my experience is just too different to yours.
There's a difference between "The Way" (shall we say, "commonly accepted practice") and "invention". So he's found a reference to a thing before it became standard practice. 50% of software developers have less than 5 years' experience. He *does* know this. And to that 50%, he might appear to be an expert on language and software, but his methods seem crafted to instil doubt rather than educate.
How does he *not* know about the universal application of Generative Grammar? How does he not know how people learn? You can have a revelation in the culture of anything, but the idea doesn't have to be new. We can compose a completely original sentence, get excited about it, and yet it's built entirely on old ideas. And it's also disingenuous, because we have Turing Machines, which can model anything - by his flawed thinking, all the ideas have already been invented for us.
The Old is the New New? What a muppet. What a hack.
Machine learning is not new; if I'm not mistaken it was studied in the 60s, but the lack of computing power and data made it inefficient. Now we have the technology and data to see its potential, and we are getting good results and improving on previous work, but the idea is not new.
I had to stop watching once he started bashing multithreading with no actual facts.