Oops, I have an error I need to correct in this video: I mentioned the name Dennis McIlroy, but his name was Douglas McIlroy. I guess seeing Dennis Ritchie's name so much during the course of this video my brain got caught in a loop... anyway, apologies for the error.
I like your analysis, I'd like some more videos about your days working at AT&T in the earlier days of computers!
I agree, would be fantastic.
Invaluable experience and insight from someone who was actually there back then. Thanks for the wonderful content, as always!
The voice of experience. Thanks again for another great episode DJ Ware! 🥰
Great video. I’ve read Raymond’s book and thought I was generally familiar with the concept, but this corrected some misconceptions. The commentary about which tenets Bell Labs and others agreed with or adhered to was very interesting.
Since it sounds like Linux isn’t really a prime example of the UNIX philosophy in action, it would be great to get your take on the general state of code quality in F/OSS vs. proprietary software.
People make too much of this. It's not like the tablets from Mt. Sinai, it's just a grace note that kind of holds everything together conceptually. And people always get it wrong: it's not "everything is a file", it's "the system (at least tries to) treat everything as if it was a file" - very different. In practice it makes for a great open system since you can fix/alter/adjust almost everything pretty easily by editing a system file. Brian Kernighan refutes the standard Multics narrative btw and (I think) so did Dennis Ritchie.
Hello sir, I was watching this video after DT's video on unix philosophy and I love your take so much!
Thank you for taking time and effort to produce this video.
This is an amazing talk, and I can only wish all young programmers would hear it.
There's so much crucial knowledge to learn from Bell Labs, and I worry that their insights will be forgotten - if they haven't been already, truth be told. This is why history is so very important. We need to learn from the past, and become inspired to create the future. Not just going through the daily grind for a paycheck, as most seem to settle with. Therein lies regression.
Really appreciate your talks DJ Ware, thanks for what you do and keep it up. :)
There is always room for their new ideas too, and yes, I would hate to see this lost, but there is some really cool work going on, and NixOS is one example.
Thanks for a great video. I think the unix philosophy is more important today than ever.
I'm over in the UK, but back in the late 1990s I was working for Lucent Technologies and was lucky enough to get a visit out to Bell Labs in New Jersey some time around 1998-1999 for a week of training for one of our CTI (Computer Telephony Integration) products.
It was an amazing place - I remember specifically walking down a long corridor filled with small glass-walled cubicles and every single cubicle was covered in someone's hobby or obsession. So one would be covered in pictures of butterflies, another might have Star Trek, etc... lots of people walking around in white coats too.
I'm of a similar age to you, this year I celebrate 40 years as a "techie" in the telecoms, IT and cyber-security industry. I think it's easy for "old guys" like us to appreciate The UNIX Philosophy because we've been around computers for such a long time. I started working on computers back in 1982 when my first (proper) employer gave me day release one day a week for college, and I started programming Z80 CPUs in machine code, and I was lucky enough to start using (SCO) UNIX on telecoms servers back around 1989-ish.
I think you need that kind of experience to get a perspective on just how fast computer technology has developed since back then. Back in those days, we had limited CPU power, RAM and storage and therefore there was a constant limit on just how many programs you could load and run - which meant that modular software had to be the norm.
Most of this video reminds me of things I have tried to impress upon workplaces that I was tasked to improve - and, as you can imagine, I was met with intransigence, particularly when conflicting with the survival instinct of managers who depend on a level of unmanageable complexity to justify employment and promotion ("sand-bagging"). So I really love hearing someone that a lot of people trust talk about Brooks.
The only gripe I have is Linux (predictably), as in other videos you have (sort of) sung the praises of dkms. You are very well aware, obviously, that the kernel can be shaved down to almost nothing. But it isn't, because of the insane growth of devices that users expect to be recognized out of the box. It is hard to express to non-Linux users that the hardware "driver" model actually works better in Linux - that you are more likely to have hardware recognized by the Linux kernel or in a module without any action, rather than depending on (presumably) competent EEs' terrible code to talk to the kernel, as is the case with Windows. This is a business decision, IMHO, as it allows MS to pick winners and set the barrier to entry into the ecosystem, where Linux is solving a completely different problem.
Anyway, summing the above para: while it is a "monolithic" kernel, all kernels from the major OSs are hybrids. Linux's kernel is increasingly modular, and size comparisons are therefore increasingly misleading (and compiled code is dropped, as you've seen on Phoronix). Even bogeymen like systemd are actually more modular than detractors say. There are small executables that run parts of its system, e.g., journalctl and systemctl. Units are the definition of modular, and parts of systemd that touch different subsystems are isolated and use the filesystem, network or good ol' IPC to communicate (gummiboot, now systemd-boot, is the most reliable boot manager in the UEFI age that I've used; as far as I'm concerned grub is undead). This is all a bit of a digression, but I just meant to remind you that you have "sort of" sung the praises of modularity in Linux before. While it could be better, in many cases where I tried to instill the good practices from Bell Labs that you talk about here, I was answered with "this isn't an open source project," which is completely irrelevant, of course; but it shows that FOSS (and yeah, FLOSS) is closer to the mark than most commercial software, where "kanban" is a certification and not a philosophy.
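To make that modularity concrete, a minimal sketch - note the unit name is an assumption, since distros name the OpenSSH unit differently (sshd here):

    systemctl status sshd    # one small tool queries a unit's state
    journalctl -u sshd       # a separate small tool reads that same unit's logs

Two independent executables cooperating over the same subsystem's interfaces, which is exactly the kind of composition being argued for.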
I appreciate your take on the UNIX philosophy. It is certainly more nuanced than how it is usually presented. I too watched DT's video on the matter, but found it disappointingly superficial. The key takeaway from his video (for me) was "it doesn't really matter". But hearing you I see the U. philosophy as an ideal (even applicable, in part, to one's own life!). Yes, perhaps Linux didn't follow it and it is too late now to implement it to the letter. However, the "spirit" of the U. philosophy can still be held as a model. Btw, I've seen Gnome take only about 550 MB... but without systemd, of course! :)
This may speak more to DT's experience (or lack thereof) with the plumbing, programming, or the underlying philosophy of Operating Systems or systems in general. It may make sense to him as an end user, but agreed, it was very superficial.
@@HandyWyo A lot of it comes down to "you had to have been there". I'm a similar age to DJ and have been in the IT industry as a techie for 40 years now. Back then, the limited CPU, RAM and storage of computer systems meant that you had to make the software modular because you could only do a few "processing things" at a time.
My first connection to the Internet was over dial-up modem some time around 1992 on a Commodore Amiga A1200 running at 14 MHz with around 8MB RAM - but even with that I could browse the (early) Web with the "iBrowse" browser on the Amiga.
One reason I repair, refurbish and collect really old IBM Thinkpads is I like making a computer that might be more than 20 years old do something useful today, and (Gentoo) Linux lets me do that. You get an appreciation of just how far we've come with computers in 20, 30 or 40 years, and how we've taken having immense computing power on a device that fits in our pockets for granted.
Awesome material. Time for me to do more digging into it.
Thanks for your background info! Your videos are extremely helpful to me. I've been coding since 1984 on different systems, primarily DOS and then Windows. I've been playing with Linux since 1998 but at a relatively superficial level, at least compared to my Microsoft experience. Your background information is giving me the mental hooks to start digging down and really see what's going on. Thank you for sharing your insights!
I'm a computer user at this point who's learning Linux out of pure concern about becoming institutionalized on Windows. I've learned my way around the bash command line, and watching this I'm thinking this philosophy is exactly why I'm upskilling to make the switch gradually. The modern internet is full of too much fucking bloat, and so are our operating systems.
31:04 Loved your comments on Linux bloat. Your channel is much underrated.
Really interesting. Thanks DJW 👍
Thanks for another great video. I enjoy your style of filling in gaps in knowledge and weaving history and experience into the subjects you cover. This was very informative and entertaining.
Thanks Eznix much appreciated my friend.
Wow! Two of my favorite educators. You gentlemen are the best.
Another great video from a veteran in the industry & giving back
One of the biggest mistakes I ever made was going to work at a company where people would rewrite and rewrite their code to make it prettier.
Before going to work there, I wrote code that worked and didn't care whether or not it was the "most elegant" approach. In one company, I was there a year and a half before I missed a deadline, and that one was by only one day. After working there, I find it nearly impossible to resist rewriting everything to try to maximize elegance, and as a result it is difficult to actually accomplish much.
Me and my beloved OpenBSD machines would like to thank you for another brilliant video. You're the best, Sir. These things have never been more important than now.
Thanks Michael, good ideas usually stand the test of time as you noted in OpenBSD :)
DJ WARE - MUCH RESPECT ✌
One point in the Unix philosophy is that input and output are text based as much as possible. This is somewhat implied in what you mention: "the output of one program can serve as input of many different other programs". The huge advantage is that if you don't know what to expect in the output, or you don't know what to search for, you can, as a human, grep or awk until you find what you need, and then know how to further process it.
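Two throwaway examples of that kind of exploration (the process name and the 80% threshold here are just placeholders I picked):

    ps aux | grep '[f]irefox'                  # find a process; the brackets stop grep matching its own entry
    df -h | awk '$5+0 > 80 { print $1, $5 }'   # list filesystems more than 80% full

Neither tool knows anything about the other's output format beyond "it's lines of text", which is the whole point.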
I just started watching but I already PREDICT its going to be a GREAT VIDEO to listen 2!
The Unix philosophy seems to mesh well with my working knowledge of cybernetics (currently reading Stafford Beer). I also watched an obscure tech YouTuber argue that modern programming education neglects the data portion and focuses on functions. He argued that failure to wrap one's head around large data, and how it must be navigated, is the fundamental cause of the disparity between newly educated programmers and professional programmers. Ex: learning about the producer/consumer model may describe a little bit of the solution dynamic, but it absolutely does not get across the actual problem of concurrent data access in a large set of data.
I can see how UNIX philosophy encourages a focus on the data, with its second trait, expecting output to be input for unknown programs. Riffing off of these points, maybe modern computer science students would benefit from interleaving system/enterprise architecture frameworks focused on business logic alongside learning computing theory.
Here's the video I mentioned above: https://www.youtube.com/watch?v=tCNPPj5aqOE
Also, I wonder, where is this Unix philosophy (UP) now? Should specific organizations/ecosystems develop for the broader UP-compliant network, or does it count if it is still specific and closed within one organization/ecosystem? As in, is it okay for differing organizations/ecosystems to follow different implementations, e.g. breaking from POSIX compliance in favor of new standards? I'm thinking about NuShell, and I found their tabular data frame format *amazing* and conducive to UP-style script development, but it's a soft break away from the greater ecosystem of bash and POSIX compliant shell commands.
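For contrast with NuShell's typed tables, the classic POSIX approach recovers structure purely by column position - a sketch (the column numbers assume the usual ls -l layout):

    ls -l | awk 'NR > 1 { print $9, $5 }'   # name and size by position; breaks if the format ever shifts

NuShell carries that structure in the pipeline itself, which is the trade-off against staying POSIX compatible.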
Thanks Weaver, will have a look at it. I was on the POSIX committee for a bit representing Burroughs; omg, I hated those meetings, it was just people trying to bend it to their way of thinking... and yeah, the Unix Philosophy was at a bit higher level than POSIX. Fill a room full of nerdy engineers, ask a question about standards, then run. :)
We had a system which ingested something like 650 MB/sec of data when I joined the project; it grew to well over 1.2 TB/sec over the course of the project, and there were plans to handle 2.4 TB/sec. A complex problem for the time. The solutions we found to work and scale were parallel streams, minimizing transformations, and parallel writes to chunked storage pools to aggregate bandwidth across the ingest pools. Was this the best solution? Probably not, but I notice CERN does something very similar today. Oh, and one last thing: we deployed our apps in swarms, so they moved themselves to the data elements, and never moved the data from its landing space.
Thank you very much. Very interesting as always.
Composable systems are also developed in a culture that expects interoperability, which ends up meaning that the developers are more experienced by the time the system has to inter-operate with the user.
I've been catching up on your awesome videos, DJ. I'm not sure if it was this one where you expressed dislike at referring to a shell as a command-line interface. I was a bit surprised at how strongly you said the Shell was not a CLI. The Amiga computer called the lowest-level script interpreter at boot the CLI before you added a feature-enhanced "Shell" with command-history and such, equivalent of Bash or KornShell. But shells were still enhanced instances of command-line interfaces, which I always thought was more descriptive and less intimidating to the layperson than Terminal, TTY, Shell. It's the interface that takes typed-in command lines rather than a graphical user interface and had a resurgence in use I think from JavaScript frameworks like Vue-CLI. I think it's a good thing the concept is making a resurgence and think it's the rightful heir to the concept Terminal, TTY, Shell, SSH and all the other cryptic things we've called it in the past. Curious about your thoughts.
Recently, the videos have included referrals to Gifted and Inspired One-Offs, whom the vast majority of computer users will sadly never know of. Their contributions seem to have been major ones too. Thank You.
Very interesting to hear your experience and thoughts on the original guiding principles of Unix, DJ. I think the availability of very fast CPUs and very large amounts of memory has caused a real sea change in the approaches to systems and applications programming - not always for the better, eh? Perhaps it is progress, but I can't help but get a nagging feeling that somehow the fundamentals are being forgotten or at least are less understood. Cheers from Wisconsin!
Cheers Andrew, and yeah, it does seem odd talking about saving a few megabytes of memory when our systems today come with 4GB, 8GB, 16GB and up and up... but it all adds up. What was the Dirksen quote - a billion here, a billion there, pretty soon you're talking real money? I guess the same can be said for system resources as well. Thanks again for the kind words.
Another video very well made and very well explained!
Thank you so much sir 👍!
Thanks Felix as always appreciate the kind words
Thanks for your input on the topic.
As I see it, Linux is a large scale community project driven by enthusiasts, not necessarily great engineers. Besides, having strict quality controls on such a large scale open community project is probably practically unfeasible. That's why we got what we got, and alternatives like BSD are generally better engineered.
While I appreciate being able to read the source code of the software I use, I wonder if the whole free software thing hasn't made all the software cruft more viable. My theory is it has, for a bunch of reasons: (1) it incentivizes growth because, as you said, people equate line counts with greater functionality, and the line counts became visible now; (2) it incentivizes growth and complexity as a protective mechanism - if it's complex, users will be more likely to trust that the complexity is warranted (even when it isn't, but that's hard to figure out) and will not run away to roll their own software; (3) it disincentivizes code removal for social reasons - it's not "nice" to delete someone else's "contribution", and a maintainer of some open source software might be less likely to try and keep the codebase very clean, as it isn't his/hers but rather collectively owned (and again, it's not "nice" to even do trimming edits to someone else's code).
Well said, thanks for putting things in perspective
Thanks for making me fix my display manager startup chaos!
Just finished watching your video; it was informative and I learned some new things. My guess would be an emphasis on small programs and simplicity using the Unix shell environment, with shell scripts to tie things together. Giving the user more control with init scripts (BSD style) as opposed to monolithic systemd, for example. Now as far as the Linux kernel goes, yeah, it's hopeless. But I guess the best we can do is focus on the user space for the most Unix-like environment. Then we have those multiple man pages and the sloppiness of Linux. Hmmm... so in this case, what's your solution to get the best Unix-like experience? I personally just ended up opting for Mac OS (super un-Unixy, but a more Unix-like shell environment with the convenience of commercial GUI apps if I need them)... But I hate a lot of things about dealing with corporate Apple products and the consumer-based closed-source type things. So I guess that leaves us with the BSDs? Linux is the Wild West, but Slackware seems able to tame the wild horse of the Linux world and make the most Unix out of it. Seems like Slackware is the closest thing to Unix while having the vast support of the Linux ecosystem that BSD lacked. What are your thoughts on this? I'm sure you try to stay true or close to Unix in your own personal work environment.
Nothing wrong with Slackware, it's an awesome distro... the maintainer has had some health issues and it slowed the pace of new releases for a while. And yep, Slackware tried to pull Linux back to heel (heel being the Unix Philosophy); unfortunately the rest of the world saw dollar signs and went chasing after complexity. And yeah, I try. I wasn't one of the UNIX devs or a member of the team, just a guy trying to support the people who bought AT&T hardware and used UNIX to solve business problems. But I made many friends at Bell Labs and Western Electric (will have to spill that story sometime).
and Western Electric, now there is a truly fascinating story, one I will have to share in a future video if I can figure out how to work it in.
@@CyberGizmo Looking fwd. What is your main workstation at home? ...I'm guessing Debian?
Actually, The Mythical Man-Month aged pretty well; it's one of the top books I recommend to new devs (and not only the late-programmers antipattern; the silver bullet and the second-system syndrome are also well known, and the stuff with the paper bug ledger is still very educational).
And thanks for the original article about Unix; the piece about fast prototyping still hits hard, even with today's popular "agile" (kind of) methods. Wish someone had told me about this at the beginning...
Would love a take from you on the other systems of today, since you commented on Linux in this one. What about the BSDs of today, or Mac OS?
I can just close my eyes and picture the Dude telling me about UNIX philosophy
The thumbnail picked to illustrate "philosophy" is a piece of 🎨🎭🌌🧠!
Please make more videos like this one👍
Hey DJ - where, in computing history, do you come from? I respect your thoughts and perspective on many topics and wonder if I just don't know what you've done in computing or technology - or is 'DJ Ware' just yer old Razor 1911 handle?? :P
At AMEX during Y2K, I had the opportunity to do QA on their COBOL code that covered AMEX card transactions. I got to see that code written in the 70s and 80s had pretty much followed the 'independent' and 'reusable' design: each page stood alone, was a maximum of 24 lines, and did not depend on anything other than getting input and sending output in response. The biggest things I found were inputs that would never be received, outputs that never went anywhere, and once in a while infinitely looped code, code that called itself, or code that would never get called - probably units whose calling code got deleted in the years after they were written.
Working on analyzing Y2K code at Wells Fargo was somewhat similar, but nowhere near as simple or stand-alone or dependency-free as the Amex stuff. I found check transaction code with potential errors that had never been caught in 20 some odd years - of course, none bad enough to alert end users, apparently.
Amex was very easy to work with - as a QA guy I would only have to observe the programmers do their own testing following my QA updates; at Hell's Cargo it was quite a complex process to follow the logic of all the object transactions, and also to work with a Karen supervisor.
iNTEL was the worst, imho. Their idea seemed to be to throw money and programmers at a software project, let it develop for a couple of years, and then cancel it. They were apparently GREAT at hardware projects; but software just seemed to be a big problem for them.
One night, Ken and Dennis were working late, arguing about what to call their new operating system. After many hours of hot debate, they finally decided to knock off around 9:00 and go for a drink. As they walked into their favorite watering hole (the BTL Rathskeller), the bouncer at the door saw their jeans and T-shirts and attempted to enforce the dress code (plastic pocket protector and belt-mounted calculator required) by pointing at them and saying, "You: nix!", meaning of course that they couldn't come in. And so history was made.
15:25 Even the diagrams (in the pointers chapter). They are _pic_(1) programs.
Who am I to argue with you regarding the bloat in "man"? But that is at the developer or maintainer level. For the user, "man [section] page" is still sufficient in the majority of cases, so we as users don't notice until we need something special. Actually, I cannot recall ever having used one of the exotic options, and I am not sure that, if I needed something special, I could grep and awk my way to the desired output. However, the increase in program complexity cannot be denied. It is a tradeoff whether it is this, or a string of pre- and post-processing utilities.
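For instance, picking crontab only because it happens to live in two sections:

    man 5 crontab               # the file format, not the crontab(1) command
    man ls | grep -n -- '-R'    # or search a page's plain-text output for one flag

So the section argument plus a little grep usually gets you there without touching the exotic options.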
What happened to System 4? Also, I gotta admit that the "Nuke and Pave" approach to code makes a lot of sense. My only counter would be that if you do it really right, code doesn't have to expire. There are a number of utilities that have been around for decades that fit this criterion, although I'll admit that those examples are the exception and not the rule.
The bloated idea... not so sure about that. Unix hit at just the time that Moore's law kicked into high gear, so adding things to it was not really so negative. I love the Unix system - it is, or was, beautiful... and I mean the early Unix system, when it was open, text-based, self-documenting, multi-user (which at the time was important), and the file paradigm was really amazing. Then the workstation manufacturers took it over and hid most of the text-based stuff, and made it a lot more complicated, to the point that people couldn't really understand it - in ops you needed expensive contractors, and in design you had to take expensive classes to figure out how to manage the system.
I really enjoy these discussions of the IT industry, thanks.
Interesting video. Notable that the purpose of modular and reusable code was already there. Nowadays everybody in the TDD camp talks about it. =)
Is it fair, though, to compare RAM sizes and such, Linux vs UNIX? I mean, Linux can do a lot more than Unix could back in the day: various hardware support, graphical user interfaces that need to be accommodated, etc. etc.
When I started my studies in 2006 they had a UNIX computer pool in physics. Sorry, but it was archaic. The user interface was just painful. There is no reason why users should have to go through the pain of working with computers like they did in the 70s just to feel more nerdy.
Yeah, true, considering some bloated Linux commands where the man page stretches over page after page... What I have heard is that the BSD dialects have far fewer flags for their commands.
...PS call me crazy but I think Slackware Linux is closer to the Unix Philosophy than BSD and its bureaucracy! :) A small home-brew team brought the best of the Unix Philosophy into Linux and stayed true to it!
Good point, and one I missed: the founders of both Slackware and Debian tried to push for the Unix Philosophy. They did manage to do it for their distros, but sadly it didn't catch on with some of the others. Thanks J.D. for bringing that up.
@@CyberGizmo Debian has messed up on this point though (perhaps after Ian)! :) Definitely. Slackware = small niche group doing amazing things. Sadly I don't have it installed on any of my computers at home.
I think the comparison with OS/360 doesn't make a lot of sense. IBM had a totally different approach with its mainframe OS: it was made for just one platform and never intended to be portable, and IBM had a lot of resources, so they could maintain a large monolithic system like that. Also, because it only had one purpose, making parts easily replaceable was just not necessary. For IBM it actually made sense to have a system that is difficult to evaluate: they rented their systems and did all the maintenance themselves, so making it difficult for their customers to reverse engineer stuff made sense for them. And trustworthiness wasn't an issue for them; people trusted IBM anyway. And it generally was successful: it was sold for 6 years in 21 major versions (according to the number of the last released version, which is 21.8).
Unix had a totally different usage: it was intended for the underpowered minicomputers of the time, was intended to be portable, and it was originally a small side project at AT&T that only had a few developers working on it.
What is the connection between Unix and Linux?
Do you think Apple is willing to dump code?
In most board rooms, it's easier and more profitable to sell a large, complex, proprietary system - especially if you have someone from IBM, or a complicit accounting firm, on the Board.
Sadly, all too true, Tom.
wow dt got mentioned by the Linux yt master himself!
I understood 26:12 to be talking about the shell
28:45 Probably the "small" part of the Unix philosophy is outdated with regard to RAM, except on embedded systems. Modern PCs have at least 4GB, usually 8 or 16GB, so no one cares if the OS needs 5MB or 500MB.
Composability in UNIX is such an ugly mess, with each program having its own mind-numbingly confusing protocol. The problem is that UNIX wasn't written with non-technical people in mind.
What do you think of Terry Davis?
2:19 *cough*systemd*cough*
31:25 Blame GNU, not Linux.
A point in the Unix Philosophy which you did not mention in this video is the principle of least surprise: user interfaces should build as much as possible on similarities with previous programs. (Explanation for YouTube, not for you.) Unfortunately this philosophy is largely abandoned by GUIs and applications (LibreOffice, KDE, not to mention Gnome), and on the system level by (for example) systemd. I still can't understand why it is "systemctl action target" (systemctl restart apache2) when it has been /etc/init.d/apache2 restart for decades.
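Side by side, the inversion is easy to see (both commands taken straight from the complaint above):

    /etc/init.d/apache2 restart    # SysV style: the target comes first, then the action
    systemctl restart apache2      # systemd style: tool, then action, then target

Decades of muscle memory point the other way, which is exactly what least surprise warns against.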
Lol man mpv
Everything is a file.
man pages still suck