The Last Programming Language

  • Published: 25 Nov 2024

Comments • 2.3K

  • @sandywyper 4 years ago +577

    This is the weirdest episode of Star Trek I've ever seen.

    • @QuadHealer 4 years ago +15

      LOL! I read your comment before seeing the whole video, and I was confused, but after having seen the video, I must agree :-) But does it not go beyond just Star Trek - I mean isn't it a sonic screwdriver from Doctor Who at one point? Or am I mistaken? In any case, thank you for making me laugh!

    • @EvenStarLoveAnanda 4 years ago +3

      I know, the ray-gun was a dead giveaway. ;0)

    • @shawno66 4 years ago +5

This is the funniest YT post I've ever seen. I now just say that randomly.

    • @Skeeve-Magick 4 years ago +1

      @@QuadHealer isn't there also Indiana Jones and Dallas referenced?

    • @blahuhm6782 4 years ago +4

      *best episode of Star Trek

  • @dialNforNinja 4 years ago +139

    "Perfection is achieved not when there is nothing left to add, but when there is nothing left to take away." - Antoine de Saint-Exupery

    • @horusfalcon 4 years ago +7

      So that's why this Antoine guy was always taking things! Why, just the other day he stole my lunch! (Admittedly, my lunch was somewhat less than perfect.)

    • @nik4520 4 years ago +6

      Time for assembly

    • @sorvex9 3 years ago +6

      “If you achieve perfection, then something is wrong”

    • @zyansheep 3 years ago

      @@nik4520 assembly is a dsl

    • @chrisE815 3 years ago +2

      They clearly never designed a European automobile

  • @spookypen 4 years ago +168

    The YT algorithm brought me here, staying for the whole thing. I don't know how to program a computer at all.

    • @diamondtroller1253 4 years ago +5

      rip

    • @excitedbox5705 4 years ago +5

      Never too late to better yourself ;D

    • @adamnealis 4 years ago +6

      Before I watched this, I thought I had at least a basic idea as to how to programme a computer.

    • @dewinchy 4 years ago +4

      That is what I call a tough guy!

    • @jomar_macedo 4 years ago

Almost the same thing here, but I've already written some poorly executed but working software in college. Not that it made me understand anything here any better, but I'm going for the whole thing.

  • @bjornkihlberg2103 4 years ago +476

    If we decide to choose "one language", we're going to find ourselves stuck with the popular option, not the right option.

    • @rubenb8653 4 years ago +28

      ooof this comment needs likes. this is totally true.

    • @insertoyouroemail 4 years ago +36

Mr. Martin thinks it's going to be Clojure. It's not going to be Clojure. Sorry, but if I'm told I have to use an untyped language, I (and many others) will simply take our ball and go home. The idea of a previous generation having picked a language for me to use today is frankly repulsive. I wouldn't want to impose my ideals on future generations like that.

    • @rubenb8653 4 years ago +18

@@insertoyouroemail Yeah, but it's probably going to be very subjective anyway.
      I like C, for example; it gets a lot of hate.

    • @excitedbox5705 4 years ago +9

@@rubenb8653 C is a great language in terms of syntax, but the feature set is very limited. If C++ had C syntax I would be so happy. I hate ::; I think it ruins the look and conveys zero meaning. -> or > or even . give you a hint that things are related, and function syntax with (arg1, arg2) conveys meaning. In C#, < > looks aggressive and points away from your objects, while _Keyword conveys that something belongs to it. These kinds of syntax structures are important: they make sense even once you know the language, so you don't have to think, and the code flows much more easily. If I told you that from now on ? was to be used as a comma and = as a period, every time you used them you would have to dedicate thought to it, just like a door you instinctively want to open the wrong way.

    • @paulzheng7663 4 years ago +8

@@insertoyouroemail You use fiat currency, drive a car on roads, work a job for fiat, go to your doctor to be prescribed poisons, eat food that lacks nutrition, believe what the government wants you to believe...
      We have been imposed on. What should we do? What will you do?

  • @solidstate0 3 years ago +4

    22 orders of magnitude and humans still urinate in telephone boxes

  • @idiosinkrazijske.rutine 4 years ago +120

    Damn, guys from the '50s were smart.

    • @franciscogerardohernandezr4788 4 years ago +23

      They had dreams of time travel and spaceships, but then the 70s drugs and the left relegated all these goals to fiction.

    • @normanhenderson7300 4 years ago +3

francisco gerardo hernandez rivera, weren't some of them under the influence and practicing leftists, or actual leftists? That did not necessarily negate the power of their analysis in these matters. Although I would not suggest being under the influence of LSD, weed (lab-grown especially), cocaine, methamphetamines, and such things.

    • @franciscogerardohernandezr4788 4 years ago +18

      @@normanhenderson7300 Not only drugs, but day to day financial stress steer people's mind away from noble ventures.

    • @PflanzenChirurg 4 years ago +2

      @@franciscogerardohernandezr4788 it turns out what u own, owns u :)

    • @winstonsmith77 4 years ago +8

      Not all Boomers are bad

  • @RayDrouillard 4 years ago +50

    Twenty-two orders of magnitude more power. What did we do with that power? Argue with strangers, look at pictures of cats, and pr0n.

    • @qwerty13380 4 years ago +4

      Have you been looking in my windows?

    • @puppetsock 4 years ago +5

That is true. Also, much that is plainly trivial or drab: every episode of Gilligan's Island, The Love Boat, MASH, and uncountable hours of discussion of them. Makeup tutorials. ASMR.
      But also there are:
      - Google Maps
      - Many many thousands of hours of university lectures online
      - Many many thousands of hours of quality entertainment such as symphonies, Shakespeare, etc., online
      - Live webcams of a family of falcons that lives on an office tower in downtown Toronto
      - Computer language translation that, while laughably imperfect, can be useful to language learners
      - Downloadable classic books for free
      - Downloadable new books, as cheap as reasonably achievable and fast
      - The possibility of learning at home and working at home for a significant fraction of the population
      Humans do a wide range of activities, from the horrible to the exemplary, from the drab to the intense, from the forgettable to the unforgettable. And we're starting to do all of this online.

    • @mihailmilev9909 4 years ago +1

      @@puppetsock well said fellow human

    • @clieding 4 years ago +2

      This is painfully funny as I am also guilty of often squandering this awesome resource by: “arguing with strangers, sharing pictures of cats, and looking at porn.” - trained computer programmer.

    • @ShainAndrews 4 years ago +1

      Adding pet raccoons and foxes. Your comment is two weeks old, we acquired additional resources since then to support it.

  • @Pariatech 4 years ago +29

17 minutes in and still engaged! You have a way of entertaining and informing that is pretty rare in the IT field.

    • @anthonyb9147 2 years ago

Went through the whole hour and didn't even notice it go by. He is great at public speaking.

  • @tomrkba4685 4 years ago +96

    "What have we done with that power?"
    CAT VIDEOS!

    • @TheNortonio 4 years ago +3

      Grand Theft Auto, Minecraft, Fortnite... sad to say. The cat videos are pretty cool though. Have you seen the cat vs cucumber videos? Awesome!

    • @arnox4554 4 years ago +2

      A worthy use of power.

    • @strictnonconformist7369 4 years ago +1

      According to a Seagate CEO, you forgot porn.

    • @TheNortonio 4 years ago

      Strict NonConformist anathema!!!

    • @lkledu2 4 years ago +1

Google's whole data center and infrastructure, used to put up 10 hours of Nyan Cat (I believe someday we'll be able to do a month-long live stream of just that).

  • @makinggreatbread 4 years ago +11

Entertaining. I started programming in '73 (COBOL, Fortran, RPG, assembly, machine code) and hadn't given those days much thought until watching this. A trip back in time. Loved it!

  • @ericpmoss 4 years ago +29

    Grumpy Lisp programmer here... Common Lisp "won't die" because it's extensible and multi-paradigm, and not driven by super-cool, super-pure solutions to toy problems. There should be a way to (a) write a leveled Lisp where things like garbage collection are optional, and layered on top of simpler layers, (b) make a processing architecture that lends itself to it, such as cdr-coded caching; (c) pull some of the commercially driven, non-Lispy clutter out of it, and (d) put some big-time effort into improving the compilers and interpreters.
    For the price of a single stealth bomber, we could do some beautiful things.

    • @derekfrost8991 4 years ago +1

      I love lisp, I use it for my personal accounts etc.. :)

    • @UNR3S7 4 years ago +4

      lisp is eternal for the same reason as emacs. You put all the things into one thing, and have a good foundation at the very core of the thing. And you let people do whatever they want with it. If you have a good foundation all customization leads to realizing the good way.
      I think that the biggest mistake is assuming that people are not smart enough to realize the good. "you shall not rebind because our bindings are good" well, if you rebind and the original bindings were good then you will realize your mistake eventually. "you shall not assign because it is not pure" well, let them assign and suffer the consequences.
      Lack of confidence in the foundation is what leads people to restrict branching. (and some business reasons, but business will ruin everything)

    • @dejohnny2 4 years ago +2

      I like Dr Racket.

    • @rmsci4002 4 years ago +1

      Clojure already exists, runs on JVM, CLR. Can be compiled to JavaScript via ClojureScript. Next step is it running on LLVM. GRAAL compiler can help this.
      Common Lisp is missing this independence/interdependence, VM, etc.

    • @vinapocalypse 5 months ago

@@rmsci4002 Common Lisp does not prescribe which platform it should be implemented on. If you want CL on the JVM there's Armed Bear Common Lisp, but the implementations that compile to machine code are extremely effective at doing so. SBCL compiles forms directly to machine code, so you can incrementally develop, compiling a function at a time, and come out at the end with very fast, dynamic Common Lisp.
      Common Lisp is out there and being used; it's just waiting for the rest of the world to catch up.

  • @brawndo8726 4 years ago +94

    43:29 The Dark Knight Falls

    • @PatriceStoessel 4 years ago +3

      seems to fall asleep

    • @brawndo8726 4 years ago +4

      @@PatriceStoessel lol

    • @mikhailpopovic1705 4 years ago +4

      his central nervous system has capitulated ... I feel suddenly to be a flying rodent

    • @Rebel101 4 years ago

      Lol

    • @darnell8897 4 years ago +5

      Why do we fall, Bruce?

  • @robrick9361 4 years ago +68

    This is like the opposite of dementia.

    • @larrydillard8163 4 years ago +3

      My thought exactly. If you are short on time, he takes the gloves off at 33:20 - and it gets more fun.

    • @reallylordofnothing 9 months ago

      This comment is gold

  • @joerogers4227 4 years ago +17

I am now almost 80. I have a 4-year degree from San Diego State University in office automation. From 1981 to 1989, when I graduated at age 47, I studied many languages through junior college and SDSU: BASIC, COBOL (2 semesters), Fortran, Prolog (which at the time was called a fuzzy language), and RPG II. My biggest project was programming in dBase III and IV. I wrote a labor accounting program for the Public Works Department on Miramar Naval Air Station (NAS). We compiled it with Clipper and it could fit on a floppy disk. I liked that program because it had clear steps and modules I created to do the job. I did my best to avoid what we called spaghetti code, or jumps all over the place. Through my career I met people who were involved in the computer world as pioneers. One was Dr. Hershey, whose specialty was Fortran and who developed the Hershey fonts. I also met Dr. Hamming, the developer of the Hamming code for forward correction of transmitted data. Dr. Hershey was retired from the Naval Postgraduate School, and Dr. Hamming was a professor there. I liked programming in dBase IV but disliked working with Prolog, as it was not as structured. I remember when I was at Mesa Junior College in San Diego, I got up one night at 2:00 am and worked remotely on my COBOL program, and I got a return message from a tech there asking if I ever slept.

  • @CubOfJudahsLion 4 years ago +15

Aside from constraints, some paradigms *did* add something. The functional paradigm, for example, came with pattern matching and unification, lambdas (aka anonymous functions), and functions as first-class citizens. The logic paradigm (aside from also using matching and unification) adds backtracking and domain-constrained searches. While non-imperative languages may seem academic, the truth is that some of their features have found a place in mainstream programming, and there are plenty of books about programming "functionally" in C#, JavaScript, Python, etc. It's more likely that we'll build an eclectic future than a strictly imperative one.

    • @aoeu256 1 year ago +1

      I can think of many more paradigms that delete things: total functions (functions must terminate), reversable programming (all functions must be reversable via always returning their input as well as their output), concurrency paradigms, linear types preventing you from duplicating vars, constraint based programming, etc...

  • @MotiviqueStudio 4 years ago +20

    Zen and the art of disdain for everything.
    Sometimes "we" want a proprietary language for "our" own reasons. So there's a lot that has to happen to get to where a last language is a discussion. Goto line 1

    • @monkfoobar 4 years ago

      I met a guy who fixed his motorcycle using the pull top from a beer can.

    • @jakykong 4 years ago +1

      Yeah, I think his argument is fundamentally based on the flawed premise that any general purpose language actually could encompass everything in real life (as opposed to theoretically). Even if you look in the algol-inspired family (of which C is a member - for someone going as far back as PDP-11's, I'm surprised he didn't mention algol. :P ), they express different concepts preferentially.
      For example, if the problem you're solving is in a domain that is cleanly solved by arrays, you're way better off writing it in Python than in Java. On the other hand, if the problem you're trying to solve is large-scale collaboration, Java has the rigorous type system to enforce contracts, so even though you'll be doing more array fiddling by hand (instead of being expressive with comprehensions and slicing), you can coordinate more developers since the boundaries and interfaces are less fuzzy.
      Clojure is interesting, for much the same reason most Lisp-family languages are (I'm pretty fluent in Common Lisp); they're good at metaprogramming. That is _why_ they are capable of doing pretty much everything: Nobody writes code in "pure" unadulterated Lisp. They write macros that write code in "pure" Lisp. And those macros are used to define subset languages much more akin to each of the various paradigm languages.

    • @absalomdraconis 3 years ago

      @@jakykong : You arguably could make his "last programming language", but you'd have to reject his own rejection of assembly. Ultimately, the "last language" that he speaks of would be more naturally a derivative of Forth than of Lisp.
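The macro point above — a Lisp program being ordinary data that other code can rewrite before evaluation — can be sketched in a few lines of Python. A toy illustration, not any real Lisp; `evaluate` and `double_literals` are invented names:

```python
import operator

# A prefix expression is just a nested Python list: ["+", 1, ["*", 2, 3]]
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Evaluate a nested-list expression; numbers evaluate to themselves."""
    if isinstance(expr, (int, float)):
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

def double_literals(expr):
    """A 'macro': rewrite the program (which is just data) before evaluating."""
    if isinstance(expr, (int, float)):
        return expr * 2
    op, *args = expr
    return [op] + [double_literals(a) for a in args]

prog = ["+", 1, ["*", 2, 3]]           # (+ 1 (* 2 3))
print(evaluate(prog))                  # 7
print(evaluate(double_literals(prog))) # (+ 2 (* 4 6)) = 26
```

Because the program is a plain data structure, "macros" need no special machinery — they are just functions from programs to programs.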

  • @ww1593 4 years ago +73

If this guy had been my teacher in high school, I would've chosen a career in computer science 12 years ago! Better late than never.

    • @nick18303 4 years ago +2

Misael d.w I'm 32 and just started learning this year, and I too would have loved him as a teacher, but at least we have him now.

    • @ww1593 4 years ago +3

      @@nick18303 the power of the internet and youtube !

    • @terrythompson7535 4 years ago +4

      You gotta love the way this guy talks. He's so excited and optimistic. You can tell he's very passionate

    • @terrythompson7535 4 years ago +3

      @Peter Mortensen The science does not back that up. Now people who understand the release of dopamine and how it aids memory are saying that making the learning more entertaining results in people learning faster.

    • @ww1593 4 years ago +2

@Peter Mortensen I agree with you, but the reason I made this comment is also because of his breadth of knowledge. He is the perfect balance, in my opinion. Most CS channels are simply career youtubers that want to be celebrities.

  • @Ed64 4 years ago +15

    I love the Commodore 64 showing up at minute “42”. Awesome talk as usual, Uncle Bob!

  • @KpxUrz5745 3 years ago +1

    With all that experience, clearly you are a man who wears many hats, which you changed 8 or 10 times here.

  • @this-abledtheextravertedhe5299 4 years ago +10

I'm 50 years old and learning something other than DOS for the first time since 1989 🤣 I chose Python for many of the reasons you listed. I'm off to explore Lisp and Clojure.
    Thank you 😊

  • @billbez7465 4 years ago +19

Wow, this is classic. I've been a language junkie for years (actually, a couple of decades), and Bob describes this in a more informative and interesting way than I've heard in a long time.

  • @karenparker3086 4 years ago +34

    A lot of this reminds me of physics circa-1880 or so. Physicists were seriously discussing the possibility that there was no more new physics to discover, just applications of existing stuff and engineering. Then along about 1890-95 X-rays and radioactivity were discovered, setting off a wild ride of new discovery that we’re still on to this day. Don’t be too quick to assert that there’s nothing new under the sun, in any field.

    • @dontworrybehappy5139 4 years ago +4

      The new stuff, in my opinion, will be the computer programs doing the programming.

    • @normanhenderson7300 4 years ago +1

Don't Worry Be Happy, I dabbled in that idea when I was introduced to the art. I am sure the more sophisticated programmers can accomplish that.

    • @rmsci4002 4 years ago

      Those were part of it, but it was the ultraviolet catastrophe leading up to quantum physics that pushed the envelope.

    • @rmsci4002 4 years ago

      @@dontworrybehappy5139 that is an old idea... from Lisp.

    • @dontworrybehappy5139 4 years ago

      @@rmsci4002 It is a problem that just hasn't been attacked with vigor yet. I've worked in the field of computer programming for decades, and I would guess that 95% of the code being written today could easily be generated by intelligent software created in the very near future. The hardest part of the software would be the interface code that had to interface with and gather the requirements from the project managers (ha ha).

  • @TuringTest37 4 years ago +9

    Check out Julia. It has light-weight syntax, is fully homoiconic, compiles on the fly using LLVM down to optimal machine code, interfaces easily with C, Java, R, Fortran and others. And speed? Within 10-20% of compiled C. Often 20 times faster than python for the same numerical problem.

    • @galtbarber2640 2 years ago

Python is way slow; even Perl is much faster than Python while still being slow.

  • @alastairbowie 4 years ago +17

I've had a growing interest in Common Lisp as of late. The metaprogramming aspects of Lisp are quite interesting to me. I also like the minimal syntax and how it feels like quite an expressive language.
    I've been writing a text-based game using a Common Lisp REPL smartphone app I found, which has been pretty enjoyable.
    Cool vid.

  • @Dmytro-kt3fr 2 years ago +4

Astonishing video, terrific analysis of programming; it definitely went into my saved videos. Most of us have lost track of what paradigms and languages are, drowning in frameworks and tech choices while forgetting the basic ideas behind everything. I'll need to prep a "ted talk" at my company covering the things covered by Robert.

  • @0xggbrnr 4 years ago +36

    “But, the materials we manipulate with our hands, the code, is virtually the same...” *Uncle Bob enters the Matrix suddenly and starts listing programming languages* 😂

  • @shableep 4 years ago +10

    44:11 WebAssembly and WASI are the thin layer virtual machine that abstracts the hardware layer (or, "host" as they put it), and almost any language can be compiled to it. All the languages can communicate to each other via WASI. It's a thin virtual machine that invites whatever language that compiles to it to bring their own runtime code if needed. Though it was designed for the web, it was the web that created the excuse for multiple mega corporations to cooperate with each other to agree on a standard language to compile to that would bring more native-like processing power to web apps. That technology is now getting adopted server side, and for many other environments outside of the web browser. It's really exciting technology and I think you should look into it if you haven't already.

    • @RuslanKovtun 10 months ago

Yeah, I'm also waiting for WebAssembly to mature. But I'm really sold on the point that the code is the data and the data is the code. We should definitely be able to modify the code on the fly (think of all the macros running at compile time in Rust, Jai, Nim, etc.). The problem is that programs aren't compilers, and they can't produce code other than at compile time. Although I guess Uncle Bob made a logical mistake when he talked about how unpopular permissive languages are and then assumed that the perfect language is permissive.

  • @benjaminmelikant3460 4 years ago +13

    Warning: text wall incoming
    Reflecting on the idea that we "need to pick a language and all decide to use it across the entire domain", I ended up with a few thoughts. Bob mentions biology, chemistry, mathematics, and others as domains where representation of information was eventually boiled down to a single representation rather than many representations. I see two issues with this. The first is that, even within these disciplines, I don't believe that to be 100% true. I can't speak much of most of these domains, but mathematics is an interesting case in my opinion. Certainly the notation of information for calculus is much different than the notation for basic arithmetic. Yes, calculus builds on basic arithmetic, but most people wouldn't understand a geometric, algebraic, trigonometric, or calculus formula simply by knowing the basic arithmetic operations. There are different representations in each sub-domain because each sub-domain has a unique goal in mind.
    And this is where my second point comes in. There are so many languages within the sphere of computer science because there are so many different things to express. I have seen it in other comments already, but I think most software developers will realize that the base language requirements for something like a game engine or a real-time embedded OS are much different than the requirements for a web UI. They are all within the domain of software development / computer science of course, but they are all unique sub-domains within that domain. I would argue that there is no such thing as a true "general-purpose" language; that is, that no single language is perfect for all programming tasks; rather, most languages are, to one degree or another, domain-specific or specialized languages. If we all sat down and decided "okay, our one and only programming language is going to be a virtual machine language" for instance, in what language do we write the virtual machine? We would, at best, need two implementations of the same language: one targeting bare metal so that we could write a VM layer on top of the hardware (or at least a bootstrapper of sorts), then the language itself with its runtime within the virtual machine. The option would exist to move the virtual machine into hardware, but that only moves the problem of needing a "bare-metal" language to the domain of hardware development; it doesn't eliminate the issue. Furthermore, hardware-based language abstraction makes it impossible to harness the raw power of the hardware when you need it; you are throwing out the potential to utilize hardware to its fullest, unless your language does allow bare access to the hardware, in which case I would ask what differentiates this language from something like C, which can be used to write high-level software but can be mixed with assembly to access hardware features that you may not otherwise be able to access? True, it isn't exactly the same, but I feel like it violates the concept you are going for. Certainly you couldn't do this in an ultra-simplistic-syntax language.
    I understand where you are coming from; if we could worry less about such and such language playing nice with this other language and reading source code would be simple because we all just know the same language, we could in some way eliminate issues with compatibility and such, but this is how I see the other side of that coin I guess. I enjoyed the talk a great deal and don't want to take anything away from it. This is just my "other side of the fence" perspective on this issue.

    • @bertvanbrakel 4 years ago +3

      I think we need to make it easier to be more compatible. Our toolchains should probably align more.
Look at the state of dependency management. There are so many different tools to do it, and so many ways it's implemented; however, at its core it's a rather simple problem. Either we come up with a single component to do this and make it customisable per problem requirement, or we break it down into smaller parts and allow easier ways to interface multiple dependency tools in a sane way.
      After decades of programming, I'm still finding one has to cobble together a bunch of random stuff and scripts to make things work together. It shouldn't be such a PITA. There are common themes across all the languages and tools I've come across, and it has always seemed stupid that different software tribes just end up reinventing the wheel.

    • @absalomdraconis 3 years ago

@@bertvanbrakel : The problem with dependency management is that you need to embrace the idea of multiple sources providing the same resource, which you then have to score so that you can choose between them. It can even cause a recompilation storm, as your newly installed program or code provides some component that allows the components it depends on to themselves provide additional components. People rarely want to deal with this, including the people who should be writing the supporting tools. Among other things, it's naturally a logic-language field, and the only things folks are normally aware of in that field are the oft-neglected Make and Prolog.

    • @absalomdraconis 3 years ago

      A "last language" could probably be created, but it would fundamentally go against his rejection of machine code. In fact, at the end of the day it would probably be more of a Forth than his beloved Lisp variants.

  • @nontimebomala2267 4 years ago +2

No way this computer scientist is going to listen to this man talk for an hour about... a topic that was beaten bloodless three decades ago.

    • @Evan490BC 4 years ago

Yes, but you (and I) are not his target audience. Most programmers don't even know what a continuation or a monad is.

  • @lfmtube 4 years ago +2

I watched your video with great interest. Thank you for posting it. I used to work in Argentina with a PDP-11/04 running the RT-11 operating system. The CPU had 4K of memory, an LA34 teleprinter with paper tape, and a VT100 terminal. Over 40 years later, I still remember those great days. :) I was 19 years old, reading magnetic tapes from mainframes and converting them to microfiche as a replacement for the huge printed lists that were generated in those days. All this at a speed of 36 thousand records per minute. It still amazes me how we could do all of that with such limited resources.

  • @qualia765 4 years ago +4

    After a bit of thought, _I think_ that:
    0) This is a very good video that I enjoyed a lot.
    1) I'm not so sure it will converge on one language, but if it does, it will be a very big language, in the sense that it can handle both the fine details and complicated stuff, like assembly, and the big overview of things, like Python.
    2) I think the future is heading in the direction of increased use of multithreading, not as a limit but as something that is already needed, due to the rise of neural networks and 3D graphics, and undoubtedly others.
    3) The syntax will be straightforward, it will run very fast, and it will allow modification while running.
    4) Mainly, most of the things mentioned in the video should be true, except that they can be bypassed so that more hardware-level stuff can happen. So a very, very hybrid of all things.
    5) It will be a textual language. (Although I will still advocate for graphical programming of a textual language.)
    So those are my opinions, with my limited knowledge.

  • @jorisherry 4 years ago +13

    The last programming language will come around the same time as the last spoken language.

    • @clickrick
      @clickrick 4 года назад

      Pretty much true!

    • @1crazypj
      @1crazypj 4 года назад

      @Deep Throat Personally, I've always thought a Mandarin/English hybrid far more likely (and have for at least the last 40+ years).
      Historically, China has been forced to open its borders for 'trade' but has always closed them again after throwing out foreigners.
      Today, they have too much of everything and the technology to do pretty much whatever they really want to; there is already speculation about whether the US could take on a re-armed China (plus, China is where bureaucracy was invented, and look how worldwide that is.)

  • @DanielBos
    @DanielBos 4 года назад +9

    Yay Forth! I've worked with a lot of niche languages, but Forth was the one that blew my mind.

    • @Mark.Brindle
      @Mark.Brindle 4 года назад

      Used it commercially from 82 to 85. Loved it. These days, I would treat it as a high level assembler.

    • @DaveRoberts308
      @DaveRoberts308 4 года назад +2

      Every programmer needs to wrap their head around both Forth and Lisp. Once you do, you’ll really understand some profound truths about the nature of computation.

    • @DanielBos
      @DanielBos 4 года назад

      @@DaveRoberts308 Haven't used Lisp itself much, but I did write in a Scheme dialect commercially for a few years. call/cc (call-with-current-continuation) is another eye-opener that can fundamentally change your understanding, once you grasp what it does.

    • @DaveRoberts308
      @DaveRoberts308 4 года назад

      @@DanielBos Yep, agreed.

    • @ThePandaGuitar
      @ThePandaGuitar 4 года назад

      Dave Roberts Used LISP for years. What profound truths? Stop circle-jerking. A continuation is just a stack snapshot. Heck, even Ruby has continuations.
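
    The stack model that makes Forth an eye-opener, as this thread suggests, can be sketched in a few lines. This is a toy illustration in Python, not real Forth; the handful of words supported here (`+`, `*`, `dup`) were chosen just for the demo:

    ```python
    # Toy postfix (Forth-style) evaluator: every "word" transforms an
    # explicit data stack, which is the whole programming model.
    def run(words):
        stack = []
        for w in words:
            if w == "+":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif w == "*":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif w == "dup":          # a classic Forth word: duplicate top of stack
                stack.append(stack[-1])
            else:                     # anything else is a number literal
                stack.append(int(w))
        return stack

    print(run("2 3 + dup *".split()))  # [25] -- (2 + 3) squared
    ```

    There is no operator precedence and no parser to speak of: the order of the words *is* the order of the operations, which is much of what gives Forth its mind-bending minimalism.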

  • @Blink_____
    @Blink_____ 4 года назад +7

    The point about the "hot new thing" being decades old is kind of striking when you consider other aspects of our culture. For the past decade "retro" has been a big boom. Movies and vinyl especially. To a lesser extent, fashion has been coming around too, and I'm pretty sure we are gonna start seeing coke-bottle glasses again before the 2030s hit.

    • @rmsci4002
      @rmsci4002 4 года назад

      Has nothing to do with it being retro. It is because it was based on solid mathematical principles. Those don't wear out.

    • @rudyardkipling4517
      @rudyardkipling4517 4 года назад

      sounds about right, the hot new thing would be in her 20s :p

  • @DieterGribnitz
    @DieterGribnitz 4 года назад +17

    Wow, this is the first time I've ever heard the specs of a MacBook read aloud that seem impressive at a reasonable price point. Turns out the trick is to compare it to a machine from three or four decades ago.

    • @BattousaiHBr
      @BattousaiHBr 4 года назад +3

      Also, he VASTLY underestimated the CPU performance, since he assumed the same IPC (instructions per clock). Modern hardware is leaps and bounds faster than decades-old hardware even when running at the same frequency.

  • @saf271828
    @saf271828 4 года назад +53

    "BASIC ... slow."
    No. Full stop. I'm not particularly a fan of BASIC for other reasons, but even I'm smart enough not to confuse a language (syntax, semantics) with its implementation (Microsoft's 8-bit and 16-bit implementations). The very first BASIC implementations to come out of Dartmouth were **compiled**, not interpreted, and were, bluntly, fast.
    If you think BASIC is slow, it's because you grew up using Commodore or Apple II BASIC interpreters, which were utter tripe as interpreter technology goes. Someone who advocates for VM-based language implementations should already know this, and it was a staggering punch to the gut to see this error made.

    • @islandcave8738
      @islandcave8738 4 года назад +3

      QBasic was the first general-purpose language I learned. Prior to that, I had a little bit of exposure to the domain-specific language CNC. And prior to that I had exposure to the syntax of the WordPerfect 5.1 markup language (I don't know if that's what it's called, but for lack of a better name, I am calling it that).

    • @RBLevin
      @RBLevin 4 года назад +5

      Also, Python is BASIC 2020. Do people complain that Python is slow?

    • @rameynoodles152
      @rameynoodles152 4 года назад +15

      @@RBLevin Yes. They do complain that Python is slow. Python is INCREDIBLY slow. It's so slow that I don't even know how it's possible to make a language that slow.

    • @RBLevin
      @RBLevin 4 года назад +4

      Back in the DOS days, I used Microsoft BASIC PDS (Professional Development System). No IDE. Just a compiler. It was fast as hell.
      A C programmer at Unisys insisted that BASIC was too slow to do anything useful. I told him I could develop a telecom app with it that used Xmodem to transfer files. He laughed. We placed a gentlemen's bet on that. I won. Had it running in no time and it kept pace with the telecom apps of the time (Procomm, Telix, etc.). Doubt Visual Basic could do that today.
      Another incredibly fast BASIC was Borland's Turbo BASIC, which became Power BASIC. I think that's still around but it never made the transition to GUIs.

    • @newstar346
      @newstar346 4 года назад +2

      @@RBLevin There is a GUI version of PowerBASIC.

  • @horusfalcon
    @horusfalcon 4 года назад +3

    A great deal of the programming done to control real-world inputs and outputs is symbolic in nature, built in the form of a ladder logic diagram that uses symbols geared originally to be understandable by electricians and instrument technicians. A lot of these symbols are graphical, and their interconnect is very often expressed graphically. This was an interesting "keynote".

  • @Radioposting
    @Radioposting 4 года назад +11

    The fly in the ointment of language theory and practicality could be GPT-3 (OpenAI). GPT-3 boasts that it can write code based on natural-language explanations fed to the AI by non-programmers. Do we really want mid-level corporate managers writing code? In the words of Lt. Uhura (Star Trek III), "Be careful what you wish for... You may get it."

    • @lfmtube
      @lfmtube 4 года назад +2

      I think the path will be in the direction of business specialists writing their own procedures using low-code based services. And true high-level developers are going to write the microservices or modules to use as the basis for those procedures.

    • @Radioposting
      @Radioposting 4 года назад

      @@lfmtube Maybe the new model will be similar to airline pilots. Other than takeoff and landing, they really don't do much other than supervising the flight computer.

    • @johnaweiss
      @johnaweiss 4 года назад

      "Do we really want mid-level corporate managers writing code?"
      No, we want all humans writing code.

  • @espenskeys
    @espenskeys 4 года назад +4

    The code is the data - the data is the code - we used to call that "speed programming" in the Scandinavian Demoscene back in the day

  • @abj136
    @abj136 4 года назад +2

    You missed a major paradigm, that of Excel. It removes the decision of what to update and keeps everything consistent for you. Many programming languages have been developed based on this, though nothing mainstream that I’m aware of.

    • @TAHeap
      @TAHeap 2 года назад

      @@Alex-wk1jv "Excel would be declarative functional logic programming"
      Weeel ... not really! At least not in a usable way.
      Now, if you were to add lambdas/binding and some explicit syntax for array formulae, then it might be getting there...
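
    The spreadsheet paradigm this thread describes (you declare formulas; the system decides what to recompute and keeps everything consistent) can be sketched minimally. This is a toy illustration in Python, not how Excel is actually implemented; the `Sheet` class and its method names are invented for the demo:

    ```python
    # Minimal "spreadsheet" sketch: cells hold either constants or formulas,
    # and reads recompute on demand, so results are always consistent with
    # the latest inputs -- no manual update step.
    class Sheet:
        def __init__(self):
            self.values = {}     # cell name -> constant
            self.formulas = {}   # cell name -> function of the sheet

        def set(self, name, value):
            """Assign a constant to a cell."""
            self.values[name] = value
            self.formulas.pop(name, None)

        def define(self, name, formula):
            """Assign a formula (a function taking the sheet) to a cell."""
            self.formulas[name] = formula

        def get(self, name):
            if name in self.formulas:
                return self.formulas[name](self)
            return self.values[name]

    sheet = Sheet()
    sheet.set("A1", 2)
    sheet.set("A2", 3)
    sheet.define("A3", lambda s: s.get("A1") + s.get("A2"))
    print(sheet.get("A3"))  # 5
    sheet.set("A1", 10)
    print(sheet.get("A3"))  # 13 -- A3 stays consistent automatically
    ```

    Real spreadsheet engines cache results and track dependency graphs instead of recomputing everything, but the programmer-visible contract is the same: you never decide *when* to update, only *what* each cell means.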

  • @homelessrobot
    @homelessrobot 4 года назад +4

    The observation that paradigms are subtractive does not imply that new paradigms always subtract from the semantic space of the immediately previous generation. You can always go as far back in the stack of historical restrictions as you want to establish a new set. And construction is entirely subtractive anyhow. Constructing a mathematical theorem, you are eliminating all other possible/valid constructions in the theory to say/prove something specific. Constructing a theory, you start with axioms/suppositions whose function is to 'subtract possible preconditions'. When you write a program, you are subtracting from 'the set of all possible programs'.

  • @kevinfredericks2335
    @kevinfredericks2335 4 года назад +124

    "time runs in a single direction"
    Quantum Engineers: hold my beer

    • @Dystisis
      @Dystisis 4 года назад +7

      The idea of time having a direction is a confusion based on a metaphor. Time is also not a dimension or a medium.

    • @cybervigilante
      @cybervigilante 4 года назад +8

      The purpose of time is to make sure you can't unscramble eggs.

    • @Kasas90
      @Kasas90 4 года назад +5

      S = k ln(Ω). Here you go.

    • @stefanhensel8611
      @stefanhensel8611 4 года назад +9

      @@cybervigilante Time is the way of the universe to make sure not everything happens at once. And space is the way of the universe to make sure not everything happens to you.

    • @LaughingOrange
      @LaughingOrange 4 года назад +2

      He said dimension, which to my understanding is true even in quantum mechanics.

  • @KentHambrock
    @KentHambrock 4 года назад +21

    I love how ADHD this keynote is. It makes great points while continuing to be very weird.

    • @nikthefix8918
      @nikthefix8918 2 года назад

      Yeah he reminds me of Lewis Black the standup comic.

    • @ricardo.mazeto
      @ricardo.mazeto 2 года назад +3

      I have ADHD, and I just noticed after reading your comment how I paid attention to the whole video without getting distracted, which is a rare thing. 😅

  • @m.p.jallan2172
    @m.p.jallan2172 4 года назад +2

    Fascinating, thanks. Prof. D. Brailsford on the Computerphile series once said that the current low-level binary/instruction-set design is not likely to change much, already being the smallest way to convey information.

  • @Joel-mp2oo
    @Joel-mp2oo 4 года назад +15

    One main thing I take away from this, is I really need to start learning LISP... Oh and we all need to unite ! 😁

    • @AndersJackson
      @AndersJackson 4 года назад +3

      Start using Emacs, and you are running a Lisp REPL that has an editor you can configure with Lisp.
      Then take a look at Magit, and at Org-mode with literate programming via Babel (an extension of Org-mode).

    • @Collaborologist
      @Collaborologist 4 года назад +2

      Clojure

    • @Evan490BC
      @Evan490BC 4 года назад +1

      @@Collaborologist Or Scheme/Racket.

  • @tiagoweber9438
    @tiagoweber9438 4 года назад +12

    Fascinating: entertaining and informative.

  • @bloguetronica
    @bloguetronica 4 года назад +11

    Props to this man. Thanks to his lecture on clean code, I managed to review some code I'd written, break every long function into its components, and now my program has improved substantially. I'll apply these principles in the future. They made the code far more readable and maintainable.

    • @nycfpv
      @nycfpv 3 года назад +3

      IMO single responsibility is the key to building good software. If you follow anything, follow this.

  • @ffggddss
    @ffggddss 4 года назад +12

    The only language I've encountered that I didn't hear him mention, is Ada.
    Or did he, and I missed it?
    "Anybody remember core [memory]?"
    Yes! And while I can't say I remember this, my dad related to me a time *_before_* there was core memory!
    (He got to do some thesis work on Johnny von Neumann's computer, ca 1950-51. I was just a toddler.)
    Fred

    • @WilliamHostman
      @WilliamHostman 4 года назад +3

      I remember a number of odd things that are older than me... like rope memory and core memory. And the need to refresh them. I also remember typing punch cards for college students when in Jr. High, because the only time the thing was free was during their classes.
      I know that the average digital watch these days has a better CPU than my TS-1000....

    • @ffggddss
      @ffggddss 4 года назад +2

      @@WilliamHostman To be clear, I wasn't saying that I remember things that are older than me; core memory came into being when I was alive, but too small to know what a computer even was.
      The first computer my dad worked on, before core memory existed, had some way of using a grid of "pixels" (not a term that existed then) on a CRT screen as random-access memory. I never quite understood what he was describing, but one of its pitfalls was that when some part of that screen got "burned in," you had to write your program so as to avoid that region of memory!
      My own first use of a computer was in college, when you typed your program on punch (Hollerith) cards, submitted your "deck," and hoped your output, on 14" (or 17"?)-wide tractor-feed computer paper, would arrive for pickup outside the computer room in a day or two.
      Fred

    • @NUCLEARARMAMENT
      @NUCLEARARMAMENT 4 года назад +2

      @@ffggddss It sounds like Williams Tubes to me.

    • @WilliamHostman
      @WilliamHostman 4 года назад

      @@ffggddss Most of us remember things from before our birth - because we have this thing called language, which allows us to share experiences we weren't present for. Like remembering that George Washington was one of the first US general officers.
      Video has made that even better than language alone. Remembering those things isn't abnormal... at least not since the 1930s.

    • @ffggddss
      @ffggddss 4 года назад

      @@WilliamHostman I adopt a narrower meaning of "memory." What you're describing, I would call a memory of hearing or reading or seeing a video or movie about it, not actually remembering the thing itself.
      Fred

  • @TedSeeber
    @TedSeeber 4 года назад +1

    I learned PDP-8 Assembly, emulated on a PDP-11, with our only attached printer an old teletype. Freshman Year of College.
    Long before OO, I was doing polymorphism in 6502 assembly on the Apple IIe. It was a neat way to get software sprites on a machine whose high resolution mode was just another memory space.

  • @richardgreen7225
    @richardgreen7225 4 года назад

    Programming notations deal with different relationship concepts. A typical "language framework" contains at least 4 notations.
    > Procedure (steps to run when header is invoked)
    > Data (data-type, data-structure, sometimes meta-data)
    > input-output (mapping from storage [database] onto memory variables and vice-versa)
    > format (notation indicates how the programmer wants the data to be displayed)
    In addition, a framework might also include notations or conventions for:
    > Build (make) - How to assemble components.
    > Wiring - Identifies which actors will participate in a collaboration.
    > Dialog - Given a stimulus in a context, what is the response. (menu, chat-bot)
    > Plan tree nodes - Given a set of pre-conditions are true, what transitional code (if any) creates a post-condition. The "tree" has a root "goal". Many tree-leafs (threads) of a plan may be running simultaneously.
    - Almost all of these "notations" can be written in "simple English" - a notation that looks like written English with some structural conventions. Format requires a mark-up notation which allows the "fill-in-the-blank" convention to be expressed with minimal excursions from WYSIWYG.

  • @michaeladair6557
    @michaeladair6557 4 года назад +12

    Legit O.G. nerd, I love this guy. Did you see the Batman suit in the background? Subscribed!

  • @rickmisk
    @rickmisk 4 года назад +10

    Any Lisp will do. I prefer the minimalist approach with Scheme

    • @cstacy
      @cstacy 4 года назад +1

      I could argue with a few things he said and didn't say, but in the end it would be a Lisp variant, yes.

    • @Collaborologist
      @Collaborologist 4 года назад

      Clojure

    • @rmsci4002
      @rmsci4002 4 года назад +1

      Too limited. Clojure is much more practical.

  • @Rfc1394
    @Rfc1394 3 года назад +3

    At about 8:30 is the statement, "I'm not at all convinced that there are any more surprising languages out there." Reminds me of the (probably apocryphal) story of the U.S. Commissioner of Patents who thought there would be no need for the Patent Office in the future because everything that could be invented already had been, back in the 1890s. We have no idea what revolutionary change we can't even imagine is coming along that will completely blindside us (in a good way).

    • @jondor654
      @jondor654 9 месяцев назад

      Perhaps ironic; notice the Play-Doh.

  • @dale116dot7
    @dale116dot7 4 года назад +1

    My favourite programming language is assembly. Grew up on 6502 assembly. Moved to 68HC11 and 68HC08. Still program lots of assembly on MC9S08 and MC9S12.

  • @alexmad69
    @alexmad69 Год назад

    Scala is the best: strong functional support for immutability and functions as arguments, parallel collections, implicits, higher-kinded types, powerful pattern matching, and macros for metaprogramming.

  • @handris99
    @handris99 4 года назад +9

    I remember him saying elsewhere that "in the end we are all going to be programming in LISP" :)
    Edit: Just finished the thing. Looks like the apple didn't fall far from the tree xD Clojure it is for now. I should take a look at it.

    • @absalomdraconis
      @absalomdraconis 3 года назад

      I'll make a prediction now: in the end, even if what we program in acts similarly to LISP or Clojure (which it probably won't in important ways), it'll look like C. Some people say that syntax is a solved problem and not an issue, but it is quite clear that the solution in question does _not_ look like Lisp.

  • @albertbatfinder5240
    @albertbatfinder5240 3 года назад +8

    “The code is the data, the data is the code”. Wasn’t the very first great insight the realisation that the code and data could live side by side in the computer, and the second breakthrough (debugging at 3am after the first great insight was implemented) that the code should never be treated as data?
    Call me conservative, but I never want the code being executed to be different from the code I wrote down. You might as well ask me to feed my code through an Enigma machine before compiling it. Turing loved to modify a bit of code on the fly. Yeah, and Turing couldn't see the need for subroutines. Axiom: if you're settling on a language that Turing would approve of, you have definitely gone wrong, because the rest of us ain't Turings.

    • @absalomdraconis
      @absalomdraconis 3 года назад

      There is some actual value in what he's talking about (aka self-modifying code), but you tend to see it restricted to dynamically loaded libraries (which are so useful that they even existed in the DOS days, in the form of things like overlays) and things like Forth (which isn't really a _language,_ but instead a _command interface_ that provides one). Self-modifying code in real-world use is very rare because it's rarely of value.

    • @zyansheep
      @zyansheep 3 года назад

      Sounds like lisp
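
    The "code is the data" idea this thread debates can be made concrete with a toy s-expression evaluator. This is a sketch in Python for accessibility, not Lisp itself; `evaluate` and the mini-language it accepts are invented for the demo:

    ```python
    # A program represented as ordinary nested lists -- so any other code
    # can inspect, transform, or generate it before it is evaluated.
    def evaluate(expr):
        """Evaluate a tiny s-expression held as nested Python lists."""
        if isinstance(expr, (int, float)):
            return expr
        op, *args = expr
        vals = [evaluate(a) for a in args]
        if op == "+":
            return sum(vals)
        if op == "*":
            out = 1
            for v in vals:
                out *= v
            return out
        raise ValueError(f"unknown operator: {op}")

    program = ["+", 1, ["*", 2, 3]]   # the program is just data
    print(evaluate(program))          # 7

    # Because it is data, we can rewrite it like any other list:
    doubled = ["*", 2, program]
    print(evaluate(doubled))          # 14
    ```

    This is the tame end of homoiconicity: macros and metaprogramming fall out of it, while the risky "self-modifying machine code" that the parent comment objects to is a separate (and largely abandoned) practice.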

  • @petersuwara5432
    @petersuwara5432 4 года назад +12

    Bob forgot about Swift replacing Objective-C...

    • @johnheaney3349
      @johnheaney3349 4 года назад +1

      He also forgot about Objective-C getting garbage collection.

    • @ceedob
      @ceedob 4 года назад +5

      This video is from 2011. Swift first came out in 2014

    • @petersuwara5432
      @petersuwara5432 4 года назад +2

      @@ceedob it is indeed! I just looked at the Posted date. So much has changed since then.

    • @carsten_
      @carsten_ 4 года назад

      Objective-C never got GC. ARC is by definition not GC.

  • @bhangrafan4480
    @bhangrafan4480 4 года назад +1

    For some decades I had understood that the next development in computing was going to be increasingly parallel architectures, which would require new innovations in programming for massively parallel processing.

  • @yapdog
    @yapdog 2 года назад +1

    Great video. First time viewer, but I just had to subscribe; I love your insights and we're of similar mind. I look forward to checking out your previous vids.
    This video really hit home because I'm in the middle of developing a platform that will make new language creation and integration very easy. So, if there are new language classes that can be discovered (I believe there are) they will likely be discovered on my platform since it will be accessible to many different kinds of programmers. Yeah, I know, big claims, no verifiable data. However, I actually *do* have some new classes that I intend to develop on it, but *anyone* will be able to create their own. When it's ready to roll, I'll contact you privately. Until then, wish me luck 😁

  • @ilemming
    @ilemming 4 года назад +11

    I see lots of skepticism in the comments from people who seemingly never spent sufficient time writing Clojure.
    Those who scoff at Lisp because of parentheses are missing the significant point of Lisp. Just listen to any renowned computer scientist and programmer (of the past or of modern times), be that Alan Kay, Yukihiro "Matz" Matsumoto, Guy Steele, Paul Graham, John Carmack, or even Donald Knuth. None of them, not a single one, ever scoffed at Lisp.
    Those who complain there are no jobs and Lisp is not popular perhaps don't even know that Clojure is the most popular language in its category of programming languages with strong FP emphasis. It is more popular than Haskell, OCaml, F#, Purescript, Elm, Erlang, Elixir, and even Scala. It is being used in production and not for small things. Apple using Clojure (I think for their payments system); Walmart has built their massive receipt processing in Clojure; Cisco has built and actively developing their security infrastructure in Clojure; Funding Circle building a peer-to-peer lending platform; Nubank - the biggest digital bank in South America actively using Clojure; The list goes on: Grammarly, Pandora, CircleCI, Spotify, Pitch, Netflix, and even NASA.
    Clojure is a hosted language; it can potentially be extended to work on top of any PL ecosystem. It already works on top of the JVM, .NET, and JavaScript. These days Clojurists are figuring out interop with Python and R. There are Clojure-like languages that work with Lua, Go, Rust, Erlang, etc.
    Bob is right, Clojure has some characteristics of the unifying language of the future. Whatever language you're using right now, I can assure you, there will be some Clojure-like Lisp dialect that can work on top of it, unless it exists already. Do you want to be a programmer, or do you want to be a programming language linguist who has to learn a new language syntax every six months? Like Rich Hickey once said: "Make programmable programs and solve problems, not puzzles."

    • @matthoyt3240
      @matthoyt3240 4 года назад

      I'll give you one huge hint about where languages are going. Go look at what's been added to Java and C#. Go look at Rust, Swift, and Kotlin. We are slowly marching down the path of the ML family of languages (Haskell, SML, F#, and OCaml), not Lisp. Lisp is very good for some things, just like Prolog. What most businesses want is a language you can come back to after a few years of not touching it and at least figure out what it does, and easily edit. With a Lisp, it could be even more problematic due to the macro system and the lack of static type checking. The reason there are Lisp-like languages available everywhere is that they are easy to write.
      BTW the most popular impure functional programming language is JavaScript, by a lot. The most popular impure functional programming language on the JVM is Kotlin. We are seeing a new wave of programming languages due to the multi-core problem and security issues. Clojure is one of the earlier ones. The VMs are actually holding us back right now because the creators don't want to break backward compatibility, which is understandable. Hopefully this will change at some point, given the success of Go, Swift, and Rust, where a large company was willing to take a risk.

    • @ilemming
      @ilemming 4 года назад

      ​ @Matt Hoyt Wherever languages "are going" - Lispers keep innovating as well. It may even feel sometimes that they are one step ahead. Most of those "new features" you mentioned, getting added to C#, Java, Kotlin, etc. are mostly syntax additions. Syntax, interestingly, complects meaning and order, often in a very unidirectional way. Gerald Jay Sussman makes that point in "Structure and Interpretation of Computer Programs," the book first published in 1979. And even today, it has not lost its relevance. I have never heard anyone, any single established and renown computer scientist ever (not 40 years ago, nor a decade ago, and neither today) saying that SICP is outdated and shouldn't be used to teach programming.
      You can argue all day long about what's better - statically or dynamically typed PLs, languages with HKTs or without, generics or no generics, etc. But what usually doesn't get discussed much is the importance of simplicity. And Clojure is one of the very few languages that made simplicity the cornerstone of its philosophy.
      "Simplicity is hard work. But, there's a huge payoff. The person who has a genuinely simpler system - a system made out of genuinely simple parts, is going to be able to affect the greatest change with the least work. He's going to kick your ass. He's gonna spend more time simplifying things up front and in the long haul he's gonna wipe the plate with you because he'll have that ability to change things when you're struggling to push elephants around."
      And that's the ultimate truth about being a successful programmer - to strive for simplicity.
      And again, anyone who feels that programming is about learning new syntax every six months - sure, enjoy your ride. Programming is much more than that, though. I'm not saying Clojure is the definitive answer and the ultimate silver bullet. I like other programming languages as well. But you seem to be measuring all Lisp dialects with the same yardstick. Lisp as you (or most programmers) may have known it is perhaps slowly dying, yes, but more and more Lisps emerge every day. Those who don't see that are usually those who chose not to care about these things, that's all.
      Lisp is not "very good for some things". It's not a discrete solution - it's an idea. One of the most magnificent ideas in computer science. And like Uncle Bob stated: "it just refuses to die". Because good ideas are not meant to die, they may evolve, but won't die.

    • @KirillKhalitov
      @KirillKhalitov 4 года назад

      @@ilemming "Gerald Jay Sussman makes that point in "Structure and Interpretation of Computer Programs," the book first published in 1979. And even today, it has not lost its relevance". mitadmissions.org/blogs/entry/the_end_of_an_era_1/
      But I think the book is still great.

  • @jerelull9629
    @jerelull9629 4 года назад +4

    LISP was SO fun within EMACS: the editor I never had to exit once I got logged in; compile, link, test & debug in one place. I could even do my email without exiting the editor. After all these years, I've realized that every language can do all the same things as other languages, albeit differently. A *REALLY* strange language was APL: very tight, difficult to analyze or debug, which is what I hated about it.

    • @dahdahditditditditditditda7536
      @dahdahditditditditditditda7536 4 года назад +1

      Long live the EMACS. Does RMS still have anything to do with it? Been retired for a while ...

    • @jerelull9629
      @jerelull9629 4 года назад

      @@dahdahditditditditditditda7536 RMS would be a factor on VAX and possibly Alpha machines. A dozen years ago, before I retired, DEC had an editor as customizable as EMACS, and similarly self-documented, meaning we spent hours searching for features.

  • @dojcubic
    @dojcubic 4 года назад +9

    UML for JAVA programmers seems to be his favorite book. He has a copy in every room and the most unusual places.

    • @inxiti
      @inxiti 4 года назад +1

      dojcubic I thought the same. I looked it up, and it turns out he wrote it.

    • @pqnet84
      @pqnet84 4 года назад

      @@inxiti he has many unsold copies he uses to keep things from moving with wind inside his house :-D

    • @CaedmonWalters
      @CaedmonWalters 4 года назад

      @@pqnet84 lol

    • @JohnSmith-ox3gy
      @JohnSmith-ox3gy 2 года назад

      @@pqnet84
      A "paperweight" made of paper.
      Amusing on two levels.

  • @valdisk3502
    @valdisk3502 4 года назад +2

    Some salient features missing from Bob's talk: delimited continuations (Scala), communicating sequential processes (Go), sparse memory (human brain). These are all great features that are not mere restrictions on the venerable omnipotent LISP. Sparse memory also manifests itself as content-addressable storage; just wait for my Web3 platform to see it. Also, I have two meta-features up my sleeve, one for processors (microcode-level programming) and one for the language AUM, my last-language candidate, which offers an order of magnitude more freedom than ML. In AUM, you can define 2+3 to mean: take the character "+", repeat it three times, and call the duplication function (called "2") on it, producing the two-element list ("+++" "+++"). It is supposed to be as fast as C after compilation, but more compact, both the binaries and the source. There is still room to improve the syntax and functionality of current languages; I am willing to share my research with qualified partners.

    • @ElixirVitae
      @ElixirVitae Год назад

      I’m very intrigued by your comment. Can you tell me more about what you are doing and where I can go to learn about your work?

  • @MarcoAntoniotti
    @MarcoAntoniotti 4 года назад +9

    The favorite language of Italian programmers is "Monicelli". But there is NO way non-Italian programmers can grok its beauty :)

    • @rmsci4002
      @rmsci4002 4 года назад

      How is it generally beautiful then if it is limited to some natural language?

    • @Evan490BC
      @Evan490BC 4 года назад

      @@rmsci4002 Because Italian is a beautiful language. (I'm not Italian by the way.)

  • @thebutlah
    @thebutlah 4 года назад +13

    I wonder if Rust's notions of lifetimes and ownership, baked into the language itself instead of existing as convention, would count as a "new thing".

    • @driftonAloft
      @driftonAloft 4 года назад

      Not really; it's a paradigm of Rust, an enforcement of rules.

    • @duffahtolla
      @duffahtolla 4 года назад +2

      It's core to the language and it's a severe constraint. So I would think yes. Just as capabilities are for Pony.

  • @NiktNobody
    @NiktNobody 4 года назад +10

    We will need new paradigms and new programming languages for new computers. For example, for quantum computers.

    • @elcugo
      @elcugo 4 года назад

      There is no evidence that quantum computers are more powerful than Turing machines. We will probably need DSLs for them, however.

    • @eventhisidistaken
      @eventhisidistaken 4 года назад +1

      Fundamentally, you need to be able to set the state of qubits and read them. I view quantum computers the same way I view graphics hardware: you have special code you write to run on the hardware, controlled by general-purpose code. It might be kind of like CUDA or a shading language - yes, you need to write some parts in a special language, but a general-purpose language is still in control.

    • @sebastianmestre8971
      @sebastianmestre8971 4 years ago

      @@elcugo You are right in that there is no proof that quantum computability is different from regular old computability.
      But there are some problems for which we know linear time solutions on quantum computers, but only exponential time solutions on traditional computers.

    • @taragnor
      @taragnor 4 years ago

      @@elcugo : The very nature of what quantum computers can do makes them more powerful than Turing machines. Shor's Algorithm is one good example of a quantum machine beating the hell out of a standard computer. Really the main limit of quantum programming is coming up with quantum algorithms to take advantage of all the amazing speed you can get, because programming things in quantum is far harder than writing a normal algorithm.

    • @rmsci4002
      @rmsci4002 4 years ago

      There are already people arguing that Lisp is the most suitable language for quantum computing.

  • @TheHeraldOfChange
    @TheHeraldOfChange 4 years ago +26

    So the ultimate question out of all this is: if you had to choose one language, what would be your "go to" language? 😜😜😜

    • @tack3132
      @tack3132 4 years ago +4

      C would be mine.

    • @martinisdn541
      @martinisdn541 4 years ago +4

      C

    • @monetize_this8330
      @monetize_this8330 4 years ago +2

      a (simple) macro assembler that supports multiple cpu variants.
      I guess C is a better assembler, and far more portable than any language in existence.
      C++ is a joke.

    • @AnthonyDunk
      @AnthonyDunk 4 years ago +4

      I've used a lot of languages over the years (including C, C++, C#, Java, Kotlin, Objective C, PHP, JavaScript). But if I had to choose a single language it would probably be C#. It's the closest we have at the moment to an easy-to-learn and very productive language running on a virtual machine that can do it all. You can code applications for the desktop, mobile (using Xamarin), and web (.NET). It can even interface to old code (Interop to C DLLs is now pretty easy). The only real drawback is that it's controlled (or at least driven) by Microsoft, and is somewhat Windows-centric. However there are open source projects like Mono that implement C# and can be run on Linux and Mac OSX.

    • @monetize_this8330
      @monetize_this8330 4 years ago

      @@AnthonyDunk I was onboard with .Net from version 1.1 but since v3.5 I gave up because of the constant patching of the frameworks. I'd rather use C++ than .Net

  • @jrherita
    @jrherita 1 year ago

    I’m more of a hardware guy with some bouts of programming since the early 1980s.. this is an amazing video and take on languages!

  • @mattsimon4167
    @mattsimon4167 4 years ago +1

    For a good example of graphical/modular programming there are Verilog and VHDL, which can program FPGAs, though they are both electronic and domain-specific, and progress there has focused on abstracting the graphical into traditional text languages.

  • @IMANGHAFOORI
    @IMANGHAFOORI 4 years ago +8

    The V lang would be the last one. As fast as C, as modern as Go, and lots of other amazing stuff

    • @rameynoodles152
      @rameynoodles152 4 years ago +2

      Jesus, thank you for mentioning this language. This is pretty incredible.

  • @dukeofearl8078
    @dukeofearl8078 4 years ago +39

    I must disagree with forced garbage collection. Such a language would not meet the requirements of real time apps (and possibly not embedded apps).
    I also disagree that compiled languages bind us to the hardware.
    The ultimate language needs to be compiled to machine code to run efficiently on shared processors (eg containers).
    D and Rust are the best fits I’m aware of.

    • @emmetallen5685
      @emmetallen5685 4 years ago +5

      Maybe something like a superset of rust?
      I think something like this would check off most of the boxes.

    • @hemangandhi4596
      @hemangandhi4596 4 years ago +1

      @@emmetallen5685 what would you add to Rust in the superset? A build target for the JVM?

    • @alienm00sehunter
      @alienm00sehunter 4 years ago

      Something like Rust might be the alternative for programs that can't handle a VM

    • @AndersJackson
      @AndersJackson 4 years ago +1

      You are talking about OCaml now, you know.
      It compiles to native or virtual machine code, and to JavaScript and much more. It is garbage collected, but runs on small machines like Arduinos.
      It is also the basis of an operating system, or actually, the program IS the operating system: you boot into your program.
      It is a functional programming language, and has an object-oriented extension (though hardly ever used). It also has extensions to the syntax, so you can hack new syntax structures into it in a standard way, like a preprocessor, written in OCaml, which changes the AST with the preprocessor, not the text.

    • @hemangandhi4596
      @hemangandhi4596 4 years ago +1

      @@AndersJackson Rust has all of that too. Also, when do you boot into OCaml? I've never heard of it.
      But welp, I guess you could tweak the OCaml compiler to stack-allocate most things and ref-count to basically make it close to GC-free too, if that was necessary (I think the optimizations for math lead to something like this?)

  • @jitspoe
    @jitspoe 4 years ago +108

    "We want our language to run in a virtual machine." "We don't want to be close to the metal" "We want our language to be garbage collected"
    You've never done game development, where you have to fit an entire frame's worth of calculations in a 16ms (or smaller) budget with no spikes, have you?

    • @homelessrobot
      @homelessrobot 4 years ago +10

      The whole premise is bunk IMO. "We keep coming back to lisp", no, we really don't. "emacs exists" doesn't really support this either.

    • @SimGunther
      @SimGunther 4 years ago +15

      @@homelessrobot GC imposed languages are like being trapped in a prison as one would not be trusted to make their own decisions. Just let us either turn off the GC ourselves or not impose it at all while the low level programmers write the garbage collectors that we can use wherever and whenever we want.

    • @CodesGuide
      @CodesGuide 4 years ago +3

      @@SimGunther That's actually a great idea. More freedom to us coders while we choose the tools best for the job at hand.

    • @devcybiko
      @devcybiko 4 years ago +11

      I've written such code - and you're right. Programming on the bare metal gives the best results. But if your processor was fast enough, and you had enough memory - you could program in a more symbolic language. And don't forget - JVM-like languages can be compiled to machine code (and implement JIT compilation and Hot Spots). It's not unreasonable to contemplate a single programming language that meets Bob's requirements.

    • @devcybiko
      @devcybiko 4 years ago +10

      Also, consider that Bob did allow for Domain Specific Languages. Real Time applications (of which I consider a video game to be) might require a language like C/C++ and they would then become a DSL.

  • @petermaquine8173
    @petermaquine8173 4 years ago +1

    Hi to all colleagues. I remember AutoLISP (AutoCAD); I used it to make scripts for plans for French nuclear plants. I remember Turbo Prolog; I used it to illustrate a project for a university. Good old times, because yes, it's always the same things that we old devs are seeing decade after decade. Who remembers Fred?

  • @you2449
    @you2449 4 years ago +1

    As a fiction writer (and non-programmer) I needed a programming language for a major plot point in a story.
    I chose Cobol w/out knowing anything about it. It just sounded right.
    And now your description of its weirdness (and absurdity?) confirms it was the right, and only, choice. So glad. (Cuz I wasn't about to go back and change it.)
    Loved all the rest of this too.

    • @utubewatcher806
      @utubewatcher806 4 years ago +4

      Of course, Cobol gets bashed, and yet every $ transaction, every flight booked, etc., goes through a line of Cobol code.

    • @you2449
      @you2449 4 years ago

      @@utubewatcher806 That's pretty awesome! Thx.,
      Yet It Lives!

  • @andygolem5514
    @andygolem5514 4 years ago +8

    This is the best episode of Rick and Morty I've ever watched!

  • @SimGunther
    @SimGunther 4 years ago +10

    44:04 I'm with the crowd on this one. The "write once, run everywhere" philosophy is ironically antiquated given that we're supposed to "run away from the hardware". GC without a large amount of lag or spikes in compute time, and with a smaller footprint than manually managed memory, is a noble goal, but I'll stick with playing in my pointer garden without a VM, thx. :)
    P.S. you still can get memory leaks in a garbage collected language, like WTF??

    • @homelessrobot
      @homelessrobot 4 years ago +5

      And even if you could completely eliminate memory leaks, memory is just a special case of resource management. There is no general strategy for automatic resource management; it's basically the halting problem.

    • @oysteinsoreide4323
      @oysteinsoreide4323 4 years ago +1

      C# has a big advantage since it is compiled to intermediate language, and compiled to native machine code on the machine running the code.

    • @oysteinsoreide4323
      @oysteinsoreide4323 4 years ago

      @SimGunther Garbage collection also has its costs, but there are ways to optimize garbage collection.

    • @oysteinsoreide4323
      @oysteinsoreide4323 4 years ago

      @SimGunther Yes, GC is not a way to say that you don't have to think about resources. GC doesn't save bad code.

    • @homelessrobot
      @homelessrobot 4 years ago +2

      @@oysteinsoreide4323 The problem comes when you write a thing that is isomorphic to dynamic memory allocation, but it's too abstracted from memory primitives for the garbage collector to infer this.
      Or, when something about your memory usage patterns prevents the freeing of memory from actually leading to being able to use the memory again later (memory fragmentation, generally).
      GC is /supposed/ to be a way to say that you don't have to think about memory management; but even this isn't really true.

  • @kevinleesmith
    @kevinleesmith 4 years ago +3

    Very nostalgic and forward looking at the same time. Just how I like it.
    My past started in 1970 on a PDP-11/34 running RT-11 and programming in Fortran. Ahhhh. Heady days. Happy days. Working 48 hours straight was commonplace because when you were in the groove, you just kept going. For me, C was as good as it got.
    Now I'm depressed and will have to drink a lot of whisky :-(

    • @Murph5456
      @Murph5456 4 years ago

      Poor guy. You needed the whisky then. RSX-11M was the place to be.

  • @frankgerlach4467
    @frankgerlach4467 11 months ago

    If you are looking for a "standard language" to document algorithms, I suggest Algol, Pascal or Ada for imperative.
    LISP for functional.
    Prolog for logic.

  • @antonnym214
    @antonnym214 3 years ago +1

    Thank you! Bob, you are a national treasure! I started on the TRS-80 in 1977 with BASIC, Z80 and APL. (Yes, APL on the TRS-80!) Then I wrote a language called R-code to control multiple "warbots" fighting each other in a virtual maze. Then I wrote a simplified Structured BASIC-like language called L.I.M. (Limited Instruction Model OR Less Is More). I'm convinced simple is best, even if it is less impressive to coders and their friends.

  • @Alkis05
    @Alkis05 4 years ago +6

    12:15 Dwarf Fortress's hydraulic computer is Turing complete. They implemented Space Invaders in Dwarf Fortress.

  • @Amipotsophspond
    @Amipotsophspond 4 years ago +10

    This feels like one of those "in the future year of 2010, mailmen will deliver mail by jet pack, and your wife will cook your roast by radioactive decay, with the help of your robot slave" predictions. Good history, but too much authority.

  • @RayDrouillard
    @RayDrouillard 4 years ago +16

    "Programming happens one step at a time."
    Very true, but a neural net pushes this concept, particularly if the net itself is implemented in hardware.

    • @adultlunchables
      @adultlunchables 4 years ago +5

      Hate to disagree here but I think you mean that neural nets push the concept of what PROGRAMS can be... However, they do not push the concept of what PROGRAMMING can be.

    • @handris99
      @handris99 4 years ago

      I've been thinking about this idea deeply for at least 2 years now. Uncle Bob is definitely right about this in the way that there are certainly elements of Computation that Require Sequential Processing (CRSP) (the result of a previous computation is required for the next step because of the laws of math) and quantum computing is going to turn this paradigm on its head. But let's put quantum aside for a moment. What I've been thinking about deeply is whether there is a way to condense the parts of code that represent one step in a CRSP into one (or a limited number of) layers in a binary neural net. Then the full code would basically be clipped together from these parts, loaded into a neural processor, then executed: the CRSP parts in connected layers, the other parts just in parallel. The motivation behind this is that it would allow distributed computing at a massive scale with maximal code reuse, load optimization, auditing, verification and a bunch of other things. Any opinions?
      @RayDrouillard neural nets are implemented in programming languages, but more importantly their training algorithms certainly are.

  • @maniacal_engineer
    @maniacal_engineer 4 years ago +1

    I am glad he mentioned Forth. I loved Forth, which is small and fast and good for things that need small and fast

  • @samuraichef5055
    @samuraichef5055 3 years ago +1

    We could all return to "Muddle". After all, that is what Zork was implemented in.

  • @Peregringlk
    @Peregringlk 4 years ago +4

    Garbage collection is not the only way to completely get rid of memory leaks.

    • @ftraple
      @ftraple 4 years ago +1

      Yes, modern C++ does this very well.

    • @harrybarrow6222
      @harrybarrow6222 4 years ago

      You can use (automatic) reference counting, but it is still possible to lose data structures in ways that confuse a simple reference counter.

    • @Peregringlk
      @Peregringlk 4 years ago

      @@harrybarrow6222 Or make sure all your objects follow RAII patterns extensively and you're done.

    • @harrybarrow6222
      @harrybarrow6222 4 years ago

      @@Peregringlk Sorry, that seems a little naive. If people were good at manually managing resource allocation and deallocation we would never have memory leaks. 😀

    • @Peregringlk
      @Peregringlk 4 years ago

      @@harrybarrow6222 Constructors and destructors should manage resource allocation, not people. As you say, people are very bad at that; me too, since I'm people. So C++ tells all of us: never put new/malloc/delete/free outside these two and you are basically done. There are always exceptions, of course, because not every single object can be tied or bound to a scope or another object's lifetime, but the impact is dramatically reduced, and without loss of performance (unlike garbage collection).

  • @squashtomato
    @squashtomato 4 years ago +6

    Getting a "we live in a society" vibe from this guy.

  • @joaquinfabrega
    @joaquinfabrega 4 years ago +4

    I am not involved in programming, however I found this video interesting.

  • @ryanlyle9201
    @ryanlyle9201 4 years ago

    I'm not sure why I'm here and I don't understand all of it but the personality of this channel is highly infectious. You are a fun guy to watch.

  • @mrspock2al
    @mrspock2al 4 years ago

    Boy do you bring back memories... I'm a new subscriber and a retired programmer. I learned on a PDP 8 writing assembler and something called "Focus". Absolutely hated Cobol. Had to translate it to assembler to understand it. I've written assembler, Fortran, Snobol, Cobol, PL1 (liked it), IBM EDL, Pascal, Perl, Powershell, Basic, C and an assortment of shell languages. Got out of programming about the time "oo" became the rage - and never looked back. My career went into sysadmin, networking and virtualization and loved it.
    You've got the coolest toys... sonic screwdriver, phaser, communicator and a copy of Forbidden Planet.

  • @unforgive2n
    @unforgive2n 4 years ago +6

    Rust seems to introduce a new way of getting rid of GC, via the concepts of "ownership" and "borrowing",
    and it's pretty fast,
    although it's a bit syntax-heavy

    • @Peregringlk
      @Peregringlk 4 years ago +4

      I don't know Rust, but ownership and borrowing seem like a reinvention of RAII and smart pointers, respectively.

    • @unforgive2n
      @unforgive2n 4 years ago

      ​@@Peregringlk in Rust, memory is managed through a system of ownership with a set of rules that the compiler checks at compile time. None of the ownership features slow down your program while it’s running.
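A minimal sketch of what "checked at compile time" means in practice, assuming nothing beyond the Rust standard library (the function names are hypothetical):

```rust
fn total_len(v: &Vec<String>) -> usize {
    // Borrowing: we only read through a shared reference; no copy, no transfer.
    v.iter().map(|s| s.len()).sum()
}

fn consume(v: Vec<String>) -> usize {
    // Ownership: `v` is moved in and freed when this function returns.
    v.len()
}

fn main() {
    let words = vec![String::from("own"), String::from("borrow")];

    let n = total_len(&words); // lend `words`; the caller can still use it
    assert_eq!(n, 9);

    let count = consume(words); // move `words`; the caller gives it up
    assert_eq!(count, 2);

    // Using `words` here would be a compile error, not a runtime crash:
    // println!("{:?}", words); // error[E0382]: borrow of moved value
}
```

The rules cost nothing at run time: the compiler simply refuses programs where a value is used after being moved or mutated while borrowed.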

    • @donjindra
      @donjindra 4 years ago

      That "ownership" concept is one reason I hate Rust.

    • @unforgive2n
      @unforgive2n 4 years ago +2

      @@donjindra Then probably you don't understand it well, because it amazed me

    • @donjindra
      @donjindra 4 years ago

      @@unforgive2n True, I don't understand the language well. But it shouldn't be hard to understand a programming language well. There are too many quirks. Too many things to remember. Too many things that just get in the way of writing a program. No language is self-documenting. But Rust fails miserably on that front, imo. I see no benefit to it whatsoever. The 'ownership' concept itself is simple to understand. Any good programmer already uses the concept to a degree.

  • @BlackburnBigdragon
    @BlackburnBigdragon 4 years ago +6

    I've always wondered why the industry never standardized computer languages. If you're someone who, with no knowledge about computers, wants to go out and learn programming, you're pretty much lost right out of the gate, because there's so many options. Researching which one to learn is often difficult because they all have so much overlap in features, and purpose. And asking for help is usually worthless because every person you talk to will give a different set of recommendations. It feels like the wild west out there as far as languages are concerned, and it seems that you just have to close your eyes and pick one out of a bin full of them. It's very confusing for someone just starting out learning.

    • @YuFanLou
      @YuFanLou 4 years ago +4

      The same reason Esperanto never took off. Every attempt to “standardize” only creates a new language. One should start out sticking to one language as a “mother tongue”, and learn the so very many other languages as domain needs arise.

    • @emzaet391
      @emzaet391 4 years ago +2

      I'm learning C++ now. It took a long time to choose

    • @RolfSchlup
      @RolfSchlup 4 years ago +2

      @@emzaet391 In my opinion, C++ is the best, most versatile language. It is a compiled language, whereas all the others require a virtual machine or runtime to run

    • @ZeWaka
      @ZeWaka 4 years ago +1

      @@RolfSchlup uhh, that's a very broad overgeneralization

    • @dannygjk
      @dannygjk 4 years ago +1

      @@ZeWaka If you had to choose only one language for everything then C would probably be the best choice. I think that was the gist of what he wanted to say. Forth is relatively simple but to use it experience with assembly language is almost compulsory. That would make learning and using Forth much easier.

  • @dougb70
    @dougb70 4 years ago +11

    49:47 Life is massively parallel. Biotech is developed in parity. CPUs have potentially thousands of cores. The next language should align with that goal. So the talk about linearity lacks foresight IMHO.

  • @davisgloff
    @davisgloff 4 years ago +1

    I'm not a programmer (at all,) but I was fascinated by this.. I kept hoping there would be something I understood. There wasn't, and just that fact kept me watching and interested...
    By the way, regardless of my ignorance, I LOVE your presentation!!

  • @davivify
    @davivify 4 years ago

    I think the biggest revolutionary leap in programming languages, imo, is object orientation. By far. It's helped me to do things that would be very difficult without it. I started out with classes as containers and that was very good in terms of statically organizing my code/data. Thence onto inheritance, though I never really did much with that until... polymorphism. That was the key that allowed me to code abstractions. And helped me hugely in creating code that told you what it was. Allowed me to do a complex undo/redo system by designing the underlying architecture separately from the constellation of operations, each with their mate. And the beauty is it's extensible. Allowed me to create a token list for a lexer/parser, where I create the list separately from the gazillion token types I use or could create in the future.
    When designing complex software it's crucial, imo, to understand the design paradigm *before* committing to code. How much time I've wasted, how much trouble I got myself into without a full and thorough understanding of the problem domain. And therefore writing garbage code. Of course, an OOP doesn't do this for you. You need to put in the time with whatever design tools are available. Employ Use-Case charts, state diagrams, flowcharts, interaction diagrams. Use whiteboards, crayon on glass, dinner napkins, backs of envelopes, etc. Have long discussions with your bloodhound - whatever works, but get that paradigm understood ! Then design using the OOP structures, and thankfully they're there for just that purpose !

  • @ProgrammingMadeEZ
    @ProgrammingMadeEZ 4 years ago +4

    Glad to see you posting more stuff Uncle Bob!

  • @blenderpanzi
    @blenderpanzi 4 years ago +10

    I guess the lifetimes of Rust take away the same thing as functional languages do, but just not as much?
    And there is one more paradigm: logic-oriented programming languages (e.g. Prolog). Those aren't based on lambda calculus, but on mathematical logic, if I remember what I learned at university correctly. Not sure how general purpose that is.

    • @gzoechi
      @gzoechi 4 years ago +3

      It seems he hasn't looked at Rust yet, but I think he mentioned Prolog early on in this video.

    • @brymusic1542
      @brymusic1542 4 years ago +3

      I used to program almost exclusively in Prolog. It can be general purpose, though its real strength is in symbolic processing. It didn't really catch on because there was little support for it in mainstream IDEs, so newer programmers aren't likely to be enticed by it.

    • @MrJamesstarr
      @MrJamesstarr 4 years ago +3

      Rust is probably the latest example of a new programming language with a new “thing”

    • @brawndo8726
      @brawndo8726 4 years ago +2

      Interesting observation. Rust is both generous, yet incredibly strict with assignment. Is it enough to be considered a new paradigm 🤷‍♂️

  • @alexisandersen1392
    @alexisandersen1392 4 years ago +10

    43:42 RIP Batman.

  • @lorisg
    @lorisg 4 years ago +1

    We, as industrial automation programmers, already made our choice back in 1993. The IEC 61131-3 standard defines a group of 5 languages, and therefore we benefit from the advantages of the great unification. In the real world only 2 of these 5 are widespread: LD, a graphical language for simple applications, and ST, a text language for complex applications.

  • @GeorgijTovarsen
    @GeorgijTovarsen 2 years ago +1

    I think the problem with just using one language would be that there are mutually exclusive features that are desirable for different applications (a very basic example is no-GC for building a kernel and GC for building a web server). I started to think that maybe we could start making a unified type system. Adding constraints to the type system would make it possible to make modules interchangeable between different languages (not just any language, though)