Awesome presentation! I found myself nodding in agreement as you moved through the progression of your skills and understanding. I am very interested in your opinion of AI and your prediction of how the software engineering business will change in the future. Thanks.
I started out dabbling in Forth and Lisp, but a prof who was a huge enthusiast for Wirth languages left a huge impression on me. I loved writing code in Modula-2. Initial compiles threw 3-4x more errors than the Unix C I was using at the time. Very frustrating, but it helped me understand that I should spend more time thinking about what I was writing before my hands landed on the keyboard. Retiring soon; I need to re-discover Lisp, but I'll be open-sourcing some FreePascal projects too.
Same here: TI-99, VIC-20, 64, Atari, Mac. Same start, same game. Interesting. Suffering from machine details brought me to write for middleware. First proprietary middleware like REALbasic, then Java, then Processing. Since then I feel no need to move on, because code and libs that I wrote 20 years ago still compile and run fine. What I learned: make it cleaner, cleaner, cleaner. And: read, teach, learn, discuss, produce; all together they bring you further. If you leave any of those five out, it slows your development.
No Amiga? The leap from the C64 to the Amiga was huge in my 15-year-old eyes. Still lots of good memories of both computers. Everybody was trying to make the most impressive demos on the C64; it had no use other than to show your skills. But it was fun!
@@jensBendig Me too. I didn't have one, but an older boy in my street did. I bought the C64 from my brother and pimped it up with DolphinDOS instead. The Amiga was a bit too expensive for me at that time.
Great memories there Dave. Thank you for the reminder of how and why many of us got into computers and the programming thereof. I started out on a Casio FX702P with a cassette interface for backups and a small thermal printer. I would regularly blast out long reams of silly quotes using a custom font design program I wrote in the minuscule 1K of RAM. Fun and games too on a single-line LCD display.
Excellent video, and I can relate to a lot of what you said. I started life as a physicist and ended up with a career in designing and operating global data communications networks for the delivery of real-time financial information and facilitating low-latency financial transactions across the world. That included working for companies such as Reuters, where speed, accuracy, and impartiality were important. I was continuously learning new things, which made my whole working life fun. One thing I learnt was that it was OK to make mistakes, because that's how you learn; but fail fast, learn from the mistake, apply the learning, and move on. Networks for real-time financial services need to be always available with optimal low-latency communications paths. Their design and operation require elevated levels of redundancy and diversity, with a robust, "risk managed" change management process, allowing network changes, software updates etc to be applied without impacting service. So continuous testing over the journey to deliver the network change, with an effective fallback process if something unforeseen occurs, was essential.
My first computer was the PET 2001, limited to 8K; then the Apple II, up to 64K RAM and with 2 disk drives. PAINFUL to program any database application in Applesoft BASIC and Apple DOS 3.3. But then came the Z80 SoftCard for the Apple, and CP/M, and... dBASE II! WOW, a stroke of genius and a GOOD programmable "relational" database, and my career as a database programmer was launched.
I appreciate the info. I am a systems engineer and cloud architect, and CEO of a tech company. I know how to develop software but am not passionate about it. My expertise is in infrastructure, which I truly enjoy. I work alongside software engineers, and they offer perspectives outside of my purview, which is extremely valuable.
Hey Dave. Just wanted to say that your book, Continuous Delivery, was the first thing I picked up 8 years ago when I started my move from traditional infrastructure/app support towards "build/release", which eventually became DevOps. Today, I have the privilege of leading a top-tier DevOps team, and the principles are as valid today as they were back then. I'm thrilled to see you're still out there teaching.
I vividly remember trying to explain to someone how to succeed at a workplace I'd left, which I was suggesting they apply for. During that conversation I realized why I'd been successful, though I hadn't really meditated on it before. If you can get into a mentor program, I recommend it. You will not only help someone get a good start in your industry, you will also learn a lot about yourself while doing it.
It looks like you had a lot of fun projects to work on during your career Dave. That makes one keen to go to work and not dread it. The Apricot gig must have been a dream job.
Apricot was a nice place to work. We did some cool stuff. I have been very lucky through my career and nearly always chose my next job based on being interested and excited by the prospect of the work, which helped me to find interesting things to work on.
Had a prof who taught a class in 370 assembler. He taught beautiful machines and the beauty of assembly - the relationship with your code was simple and it never cheated on you.
I found a listing of a program to play Reversi on the BBC Micro in the September 1983 issue of Your Computer. I wonder if that was the one you referred to, Dave. It was submitted by A.P.Walrond, Pitney, Somerset.
What is Mechanical Sympathy? Besides that, I am amazed to see how abstraction can reduce entangled logic and how tests are crucial for good software output. I am still building myself up to that.
"Mechanical Sympathy" is a term that we stole from Formula 1 racing. In the 1970's the best racing driver was called Jackie Stewart, he was asked in an interview if you needed to be an engineer to be a successful racing driver, he said "No, but you do have to have *mechanical sympathy*". We were building one of, if not the, fastest financial exchanges in the world. We learned that to make the most of the hardware, we needed to design our code to take full advantage of it, from processor caching to disk-drive mechanics to Network/OS boundaries. I describe some of this in this early video from my channel: ruclips.net/video/0reMVgn6kRo/видео.html
The Intel 486 did not do concurrent issue. It was heavily pipelined, but instruction sequencing wasn't that big a deal, except for instructions with larger latencies, which perhaps benefited from being issued early. I don't even recall if the 486 would allow other instructions to issue while it finished a multiply, blocking only when the result of the multiplication became needed. That would have been an important use case. Most of the important instructions on the 486 had single clock-cycle latencies. Maybe the floating point unit executed concurrently, but I never did any FP coding on the 486. I recall the 486 as easy to program in assembly language. It was really the Pentium where you had to carefully schedule compatible instructions for dual issue, or your performance sucked.
Not quite true: pick the wrong instruction order on the 486 and you blew the pipeline, and so hit performance severely. It was described in some detail in the Intel spec sheet when the processor was released. So you needed to worry about what order to place instructions in as you were programming, even when the order made no other difference to the meaning of the code. That was what made me decide to change.
@@ContinuousDelivery I found some old stuff about this, and the main problems involved delays in address generation and result forwarding from the previous instruction. Forwarding from ALU to ALU was generally seamless, but forwarding from fresh register values to address generation definitely involved stalls. This latter case is not a pipeline break, but a dependency stall. Not all forwarding paths were instantaneous. Address generation is more complex, and I could almost imagine an actual pipeline stall here. But on the 486, my guess is that these are actually dependency stalls as well. This kind of careful dependency hoisting is certainly tedious to have to do by hand. But it wasn't conceptually very complicated, from what I've managed to find. You would likely recover 80% of the lost cycles inherent in a naive ordering with a simple peephole optimizer. That would not be true if the issue was real pipeline stalls. It's not that common that you had a tight loop not already dominated by DRAM memory stalls, making these extra stalls look not so big. I hand-coded a loop to search for character tokens, reading a fairly simple data structure from an input pointer. If an instruction was calculating an address or fetching anything from memory, it was hoisted as far up the instruction sequence as it would go. Already I was only leaving a few percent on the table by not carefully analyzing all the specific forwarding stalls. It's different for an encryption block where you grind away on a small chunk of data with few memory accesses to slow you down for other reasons. But the 486 was simple enough that the compilers tended to do a good job anyway. I found few reasons to complain about my Watcom C/C++ compiler.
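The effect being debated here is easy to demonstrate even on modern hardware. A minimal C++ sketch of the general principle (dependency chains versus independent work for the pipeline; this is not 486-specific address-generation behaviour): compile without `-ffast-math` so the compiler must preserve floating-point order rather than restructure the loops itself.

```cpp
// Dependency stalls in miniature: the first loop is one serial chain, so
// every add waits on the previous result; the second interleaves four
// independent accumulators the pipeline can issue while adds are in flight.
#include <cstdio>

int main() {
    const int N = 1 << 20;
    static float data[1 << 20];
    for (int i = 0; i < N; ++i) data[i] = 0.5f;

    // Naive ordering: each iteration depends on `sum` from the last one.
    float sum = 0.0f;
    for (int i = 0; i < N; ++i) sum += data[i];

    // Scheduled ordering: four independent chains, combined at the end.
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    for (int i = 0; i < N; i += 4) {
        s0 += data[i];
        s1 += data[i + 1];
        s2 += data[i + 2];
        s3 += data[i + 3];
    }
    float sum2 = (s0 + s1) + (s2 + s3);

    std::printf("%f %f\n", sum, sum2);  // may differ in the last bits: float adds aren't associative
}
```

Timing the two loops separately shows the interleaved version running noticeably faster on most pipelined CPUs, which is exactly the kind of win the hand-hoisting described above was chasing.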
I became a programmer in my later years; a dream come true, finally. The times have changed for sure, though. I do some limited amounts of 'assembly'-style programming, so to speak, and I occasionally get looked at like I'm either an alien or a warlock (much like doing HLSL gets the same reaction) by others. Understanding just the fundamentals of how systems work, from one level to the other, has made an immense and usually measurable difference in my code quality and speed; if you think it's all magic spells, you'll never learn to structure data and architect to your structures' strengths. And everyone has an overall data structure pattern; some just model it after certain Italian pasta. This video here is very good landscaping for how to become good. Take it to heart.
I used to copy BASIC programs out of all kinds of magazines. Typos in the MAGAZINE were really cruel. In my case, I bought a Heathkit H-89 when I was in the US Navy. Six months later, the Navy had a contract with Zenith Data Systems - and eight Z-89s arrived at my command - with nobody knowing how to do anything with them. An officer saw me one day with my Heathkit manual - which showed a picture of the H-89 on it - and then I was in charge of installing them in the command, connecting them to printers, etc.. The Navy is fine with you going home - AFTER your work is done.
One lesson I learned is that Conway's law is true. That also means that management decides how efficient software development is. Developers' influence is quite limited.
My mother realised that her idiot son wasn't such a dolt when I wrote the tennis game on my Dad's Spectrum within a couple of hours. I've also run big teams and have been thinking about what makes good leadership and how to pass that on. Poor leadership is ubiquitous, even in our politicians. Has it truly been 40 years? Yes, time flies. I need another lifetime to achieve what I want to do.
One learning you had, about running the build on a loop, I saw reflected in how GoCD chose the material (commit): take the latest commit available and build it. This was much more of a pipeline than what the "pipelines" that came after it were doing. In an abstract sense, the GoCD pipelines did not leak 😁. Today's pipelines are often leaky because of the lack of representation of inputs and outputs.
I came here from a 20yo YouTuber dishing out "advice" to this man, who has more experience than twice that YouTuber's lifetime... haha. And here I will stay. Cool stuff, and new sub.
Dave, is your book Modern Software Engineering the kind of book that works as an audiobook, or does it have a bunch of diagrams or code that don’t translate well to audio and I should read it instead?
Yes, I think it works well in its audiobook form. There are diagrams, and the audio version comes with a link to a PDF containing them. Mostly I think it still works without, though you will lose the value of the code examples.
Yay! Z80 machine code comrade! Was your assembler, like mine, a pencil and an exercise book, or did you use one of those posh software ones? We had far fewer computers than kids back then, so we "invented" pair programming... even when typing in listings from Your Computer. Still, going back to assembler is kinda nice though... in my spare time, I'm working on a BBC-inspired AVR assembler in TypeScript... working in JS and assembler at the same time is the perfect way to troll the largest group of "real programmers" ;)
I learned BASIC in the same period using a Casio FX-802P computer. But the difference between us is that I lived in a country that didn't allow me to develop, while you were in the UK!
My 25 years of so-called software engineering: it is NOT engineering. Outside of my team, there is NO formal, universal, mathematical, proven concept that stands the test of time. What exists is jargon and poorly defined terms.
I think that is true of most SW dev, but I also think that I have experienced genuine SW engineering, and I try to describe what I mean in my book "Modern Software Engineering". I think that our problem with engineering is that we often don't really know what "engineering" is in other disciplines, and most software is not developed in a very organised way. I define SW engineering like this: "Engineering is the application of an empirical, scientific approach to finding efficient solutions to practical problems", and for SW dev it is built on two pillars: "Optimising for Learning" and "Optimising to Manage Complexity".
@@ContinuousDelivery Working in computer science since the ZX80/81 days, and I agree fully. The problem with our engineering discipline is that everyone wants a Ferrari for Fiat money, and unlike aero and automotive engineering there are no fines or jail time if you mess up, so financial pressures seem to be the driver: as cheap as possible, no future planning beyond the next quarter's returns. Like yourself, I got to do it right for about 15 years, where we had full control and were left to do the job correctly. Sadly it was not the norm.
Hi, any old timers here who want to give me some advice? What are your thoughts on Terraform? I've just delivered my first project using it, and in retrospect I think it actually added more complexity and failure points than it was worth. How do we feel about declarative infrastructure as code? It sounded great to me, but I am realizing that it may have sold a false bill of goods.
Terraform is just a tool. It allows you to express your notions of how infrastructure should be set up in an executable way, and having the computer do stuff that you'd otherwise have to do manually is always a good thing. Literally _always_. If your mental model of setting up infrastructure looks like molasses, so will your Terraform files. That's not something Terraform or any other technology can fix, and doing infrastructure setup manually won't fix it either. You do have modules, with Terraform, to structure your infrastructure as code in an understandable and navigable way. And ideally you shouldn't set up everything in one go, but split your configuration into multiple parts which use data blocks to describe and implement dependencies. The inevitable consequence of _not_ maintaining infrastructure as code, whether with Terraform or other tools, and doing infrastructure setup manually, is snowflake servers. And it doesn't scale. Think about disaster recovery. If all of a sudden you need to rebuild everything from scratch, and all you have are three-day-old, manually created backups, then depending on the size of your infrastructure you might need weeks, sometimes very many weeks, and you'll still have significant data loss. With a fully automated infrastructure-as-code setup, provided there's proper quality assurance in place, you only need days for even quite large setups, and the data loss might very well be zero - because you can afford the additional complexity required for zero data loss, which you could not manage manually. Aside from a very few and very straightforward steps which are performed manually (essentially providing certificates and keys, which are generated and maintained offline for security reasons), we are able to recover a setup consisting of several Kubernetes clusters running thousands of distinct pods within mere hours, with infrastructure as code. Without it, it would take weeks. There's no question, from the perspective of the business, whether the effort we spend maintaining that highly automated setup makes sense, given that one hour of downtime translates to millions lost, and a few weeks comes in at a few billions.
@@a0flj0 Thanks for the response. I can see the value at larger scales. My project was a single app running on EKS. I think I was in the uncanny valley, where it was just enough to warrant IaC but not enough to get value out of it. I would counter your statement that it's *always* better to have IaC. I'd actually rather have a really good README and a set of shell scripts than a Terraform monolith that fails to tear itself down 30% of the time.
Your story sounds familiar. You've got 10-ish years on me. Are we talking about paid work, or just learning and fiddling around? My first computer was an Atari 400 with the touch-pad keyboard, circa 1979-1982 +/-. I learned a lot at age ~12-14, and then (gasp) came a Coleco Adam with a Z80 (circa 1983-1984-ish, depending on supply and where you lived). I made that thing sing and do stuff that seemed impossible for the times. The first time I got paid for my skills was 1988. Up to then and since, I learned so many things that eventually turned into a lucrative career in the telecommunications/cellular industries, and that became internet, and that became ... My advice to the younger generations: pick a focus. Pick a passion. Start somewhere. Evolve. The keys are persistence and your infinite curiosity.
Newbie. I wrote my first Algol program in 1970. I then did years of various assembly languages for payphones and early cellular phones, before switching the company to Pascal (!) and then C (using an early compiler that required us to do our own optimisation - "Don't use switch statements!"). I finished my days writing obscure low-level bits of the phones that most of the world uses (though I imagine there are only echoes of it left now).
Indeed, most programs published in magazines wouldn't work straight away. So basically you learned to debug, or to figure out the general idea and write your own code.
A little bit, enough to know that I probably don't "understand" it but I do have a working model in my head that, nearly, satisfies me. There are several theoretical physicists who think that information is at the core of reality. I think that you can make a reasonable argument that what we see as superposition in quantum physics is really just our interpretation of multiple copies of information in different parts of the quantum multiverse. I talk a bit about quantum computers in this video ruclips.net/video/oTKpLKMTdT8/видео.html
In your opinion, will there be people who are just starting their careers today talking about their 40, 30 or 20 years of experience in software engineering, or will there only be AI?
I think that there will still be people involved, but they won't be coding in the same way. I imagine the shift will be similar to the shift from Assembler to high-level languages. AI development will be human-driven for a long time to come; in my imagination, coding will be a lot like BDD - specify accurately what we want in a kind of constrained grammar, but leave the actual implementation to the AI. We will need to solve the reproducibility problem of AI along the way, though. How can you build anything complex if you get something different every time you ask the same question? What's the role of version control? This is why I think people will be involved: we will need to specify what we want in the form of tests, so that we can verify that we got it!
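To make "specify what we want in the form of tests" concrete, here is a minimal sketch using Catch2's BDD-style macros; the scenario and the ShoppingCart type are invented for illustration. The test pins down the required behaviour, and any implementation, human- or AI-written, must make it pass.

```cpp
// Executable specification: behaviour is stated up front; the implementation
// (here a trivial hand-written one) merely has to satisfy it.
#include <catch2/catch_test_macros.hpp>

struct ShoppingCart {             // hypothetical system under test
    int items = 0;
    void add(int n) { items += n; }
};

SCENARIO("items can be added to a shopping cart") {
    GIVEN("an empty cart") {
        ShoppingCart cart;
        WHEN("two items are added") {
            cart.add(2);
            THEN("the cart contains exactly two items") {
                REQUIRE(cart.items == 2);
            }
        }
    }
}
```

Whether the body of ShoppingCart is written by a person or generated by an AI, the specification stays stable and versionable, which is one answer to the reproducibility question above.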
I think your vision is too short-term; that's 10 years at most. In 40 years' time AI will probably foresee what software is needed itself; at worst it will need a human to tell it in hand-wavy terms what is required.
All the learning points you discuss seem to be things you SHOULD do, things that worked well, and so on. You say your experience at LMAX reinforced things you had begun learning beforehand, but you don't mention anything you learnt at LMAX which invalidated or suppressed previous ideas or ways of thinking you had had up to that point. Things you learnt not to do (that aren't simply the opposites of things you learnt you should do).
I guess I've learned very little in comparison, in about the same amount of time. I'm more confused now than when I started writing COBOL in the 90s. Things seem more difficult and slower when it comes to actually writing something. Although yes, the devices and tools are cooler. But the day to day experience is terrible really.
Are YOU a Great Programmer? or a Regular Programmer with GREAT Habits. Download my FREE guide on how to adopt 30 of my favourite Programming Habits. Get yours HERE ➡ www.subscribepage.com/great-programmer
1. 1:27 We get to create tiny worlds inside our computers
2. 2:03 Build a mental model for how things REALLY work
3. 5:37 Always strive to write clear, tidy, well-structured, readable code
4. 6:54 Good design is well structured and navigable
5. 8:05 Abstraction is at the core of what we do
6. 8:14 Abstraction is risky without sound foundations
7. 8:21 Distributed programming is MUCH more difficult
8. 9:44 Automate or look up the details, focus your skills elsewhere
9. 11:27 Automate repetitive tasks, especially if they are complicated
10. 12:24 Messages, events and asynchrony help tame distributed systems
11. 12:38 Understand the next layer down before abstracting
12. 13:25 Fast feedback via continuous integration is invaluable
13. 14:07 Test behaviour, not implementation
14. 14:34 XP (eXtreme Programming) works well for big projects, as well as small
15. 15:54 You have to be good at a lot of things to build great software
16. 16:20 Continuous delivery works for VERY complex problems
17. 16:34 Software engineering needs software engineers
18. 17:41 Good software engineering depends on mechanical sympathy
Thanks so much, @hardi.stones! This helps a lot! I'm a visual learner, so seeing it in text format helps me digest the neat stuff Dave taught us.
@@hardi.stones dude you're a legend
My story is similar. I started on a 48K ZX Spectrum behind the iron curtain at the time. I quickly filled the entire memory with code, and realized the need to switch to assembly, which allowed me to learn how the CPU worked. I was around 13, so didn't immediately have that much success with Z80 machine code. It was my uncle, an assistant professor, who brought his PC home. As soon as the borders opened to the West, we bought enough components to build a PC and I switched to C, Pascal, then C++. I also had a knowledgeable coworker early in my career who taught me about inheriting from public interfaces by overriding virtual methods, and I slowly became an architect, instead of just a smart coder. Unit tests and CI came later. I've always specialized in documents (PDF, image processing, DOCX, AI page segmentation). I'll have to read your book now.
Two principles of abstraction that have guided me well in the last 30+ (cough) years:
1. Understand one layer above, and one layer below.
2. One problem that cannot be solved by abstraction is having too much abstraction (often due to a lack of understanding of the required layers; see #1).
I wonder what kind of programs AI writes. Since AI learns from legacy code, which must be OOP, I guess most of its code would be templates of OOP programs doing various things. So much for the great debate over OOP vs structured programming.
@@raylopez99 why "must" legacy code be OOP? OOP was new at one point and all legacy code was either written in LISP or ALGOL. lots of legacy code exists in many paradigms.
Great story, very much in line with my own. I started in depot repair with Data General Nova computers, and learned assembly programming on paper-tape editors and assemblers. Then I progressed to an OSI Challenger 6502, again in assembly language... but I built it! I wrote a network bridge on my first PC clone in 1983, bridging a little-endian minicomputer to little-endian Novell networks, including my homebrew Ethernet driver. Users could transparently retrieve and store WordPerfect documents between environments. I settled into C, but over my 50 years of programming I've used many SQL systems, FoxPro, Delphi (Object Pascal), PHP and more JavaScript than I care to remember. To this day in retirement, I still program in C/C++ and server-side PHP, mainly as a hobby; I wake every day with yet another new project or algorithm to fill my days.
Thanks for your story, good memories.
Apricots were the first PCs I used, writing FORTRAN applications for engineers. We had started with buying time on CDC mainframes in bureaus, then it was cheaper to have a PDP11/45 in house, and then going smaller again to the Apricot. It had the advantage of more RAM available to the program than other PC clones. The major lesson I learned was how to do useful work with very little RAM and "primitive" output devices. For example, we did earthquake analysis of nuclear power stations on non-linear soils on the PDP with just 392kB of RAM and a 20MB disk the size of a washing machine, and plotted 3d structures using hidden line algorithms on Tektronix screens and Calcomp paper plotters. Happy days!
Similar story for myself. My uncle loaned me a ZX81 for a few weeks when I was 11; later that year my parents got me an Amstrad CPC464 for Christmas. I was immediately hooked on BASIC and learning Z80. A few years later I moved up to an Atari ST and started working with C and 68000. University at 17; 4 years later I left with an HND and a degree in CS. It's pretty amazing to think my 30-year career in computing actually stems from what many saw as a toy in the home computer boom. I owe so much to my parents for getting me that 8-bit box of discovery - without it my life would have been very different.
I also started on a Timex Sinclair 1000. I found a working one at a second-hand store, and I now have it in a shadow box mounted in my office. It's where I got my start.
I built a ZX80 from a kit as it was cheaper. Later I became interested in the opcode listing ("195 jp NN" etc.). Then a ZX81 PROM and a Spectrum. And an Atari ST... you see where this is going 😅. I am now a teacher at a university and teach Atmel assembler, both in a basic course and a much larger project course. Mission accomplished 🎉
My Dad bought me a ZX Spectrum in 1983 when I was 6. I feed my family now thanks to the dev skills I learned way back then. Thanks Dad ❤
Fascinating video, thanks for this!
The first code I wrote was in FORTRAN II (2) in 1971. Punchcards, IBM System/360 mainframe, 24 hour turnaround. I worked for 35 years, writing software, managing development. I got burned out in the early 90s and then the Internet came along....WOW.
Started in 1977 with PL/I and H Assembler on my first computer, a corporate IBM 370/168 MP with a huge 16 megabytes of main memory. I learned a lot from large-scale programming in that particular environment that still rings true today. A lot of the technology "discovered" lately was invented back then but only known by a few of the folks behind the glass walls.
This sounds right to me: as I got more experienced in the principles of computer science, I found each one related to something - research or a paradigm - that was established maybe 30 or even 50 years earlier. Even the prettiest ideas nowadays are mostly borrowed from the past.
Finding the Continuous Delivery book in the library in 2013 was the moment that really changed my life. From that moment, I could easily name the things the companies I worked for were doing wrong, and I knew how to fix them. This book gave me not only the knowledge, but also lots of courage.
We started the same way. I wrote my first program on a ZX81 Sinclair exactly 40 years ago in the summer of 1984. I had the 16K extension :-)
However, I had started a few months before that, learning the BASIC language and writing programs on a piece of paper.
My dream job was to become an analyst programmer. I graduated as a software engineer in 1994 and have been a professional programmer since then.
Whoaaaa!! You had the 16k extra! You lucky bastard😂
My brother started with the ZX80, I think it was; his next one was the ZX81, and a while later the 16K extension too :) After that came the Commodore 16, while my other brother bought the Commodore 64 in 1985, but he was only interested in playing games 😊. It was amazing how many friends we suddenly had - friends we didn't even know we had - as word spread quickly through our classes and our street that we had a C64.
Wow. What an amazing video. Thank you very much. I am happy to see that YouTube can still produce content of this high quality.
Great story. I envy his generation in many ways. When I started, all the abstractions, good and bad, were already there and had been for a long time. Now, after 16 years of work, I'm still fuzzy on how many things work. Oh, I can talk a decent game about how machines work, but this guy really knows. Very inspiring and fun to hear these stories.
We had a similar kind of start... I started poking around in BASIC in 1977 on a Model 1 TRS-80, then a VIC-20 in 1981; I had games out commercially in 1982, and a few decades of embedded-systems experience at companies here in Silicon Valley. I really like hearing those stories, from the low-level assembly language all the way to the concurrent systems and RTOSes. After 40 years you can look back on all of it like it was a blur and marvel at the speed with which systems evolved and we had to adapt to them. Good times; wouldn't change them for anything.
Awesome! I was born in 1977 and started using computers, including BASIC, in 1986 at age 9. Turned it into a career and still going at 47!
@@JarheadCrayonEater Good to hear that 😊
I have an electronics background and a keen interest in coding. Could you suggest a concise roadmap for me?
@@technicalaamiriqbal8991 Assuming you want to leverage your electronics and learn to code...
It's very much like learning the language of electronics, how did you learn that? Schematics, components, grouping components, systems, etc. You built it up.
I taught at the University for 10 years in the Computer Science department and in their graduate multimedia program, so I do have experience teaching this subject to ppl wanting exactly what you're asking and wanting to do.
It starts with picking up a language or two. Usually C is what is used in the embedded-systems industry; C++ is rarer, but also used. Other languages seem to come and go. The first step is to learn a language and an environment. For embedded systems you can start with Arduino. Arduino looks like C++, but it is a similar language called Wiring, and it isn't the same thing. What I do is have a stub that calls functions that will be located in a different file: I open a new tab in the project, name the file (name).cpp, and write C++ in there (see the sketch below). Wiring is similar to C++ but it does some pretty horrible things to it that will frustrate you. (Ask me how I know, LOL)
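A minimal sketch of that tab technique, assuming the Arduino IDE (the file and function names here are invented for illustration):

```cpp
// ---- sketch.ino: the Wiring stub. Declare and call, nothing else. ----
void runBlink(int pin, unsigned long intervalMs);  // defined in blink_logic.cpp

void setup() {
    pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
    runBlink(LED_BUILTIN, 500);  // all real logic lives in the C++ tab
}

// ---- blink_logic.cpp: a separate tab, compiled as plain C++. ----
#include <Arduino.h>  // .cpp tabs need this; .ino files get it implicitly

void runBlink(int pin, unsigned long intervalMs) {
    digitalWrite(pin, HIGH);
    delay(intervalMs);
    digitalWrite(pin, LOW);
    delay(intervalMs);
}
```

Keeping the .ino stub trivial means the Arduino preprocessor has almost nothing to rewrite, and everything in the .cpp tab behaves like standard C++.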
The next step is to start learning data structures and the concepts of structured code. Learn about flow charts and algorithms that manipulate data, such as sorting and searching... Also learn the idioms in the language that facilitate them... this may seem boring, but it helps engrain the fundamentals by learning from examples. The way code is done today is google, copy, and paste, and little effort is put into learning how to read code and understand it... but that is crucial.
Next: learn about code architecture, UML (or other modeling notations if UML is outdated, but you'll still encounter it, so it's worth learning), APIs, abstraction, libraries, code reuse (both definitions of this), interfaces, and design patterns. If you are using ARM, then look into CMSIS. There are so many rabbit holes, and that is why I suggested a gentler ramp to understanding by playing with Arduino first.
Best advice is to take small steps; don't try to get ahead too quickly. If you try to take a step and it is too difficult, then you skipped ahead too quickly and are likely missing fundamental knowledge. Do *not* allow yourself to stay confused for too long; go do something else instead and make a note to come back and try it later when you're able to tackle it. Good luck. If you're interested in more, go check out my channel, find my emails and DM me.
@@technicalaamiriqbal8991 , I also have an extensive background in electronics, and actually use programming with it.
It depends on what you're looking to do. Desktop, web (browser, server, both/full-stack, etc), embedded, or control systems (SCADA/PLC).
As far as getting started, assuming you have no programming experience, you're going to want to start with a basic starter course. I'd look at courses on Udemy, or similar, since they have a wide range of topics to choose from and do a fairly decent job of going over fundamentals in certain areas.
First thing to do is decide what you're wanting to accomplish with programming. Then look for online courses that you can follow along with to build something in that area.
I'm self-taught, nearly 100%, and have been a Senior Software Engineer for 8 years, after working in many other industries where my programming experience was used to make our lives easier in the field. So there's no clearer path I can give you than what I suggested above.
Hope that helps, and feel free to ask any other questions. I'm an open book.
Hi there! I've been at this for over 40 years as well. I first learned assembly language from a radio station technician in the Arctic when I was a kid. Over time, I worked my way through most of the programming languages on x86 architecture, starting with the 8088. In the 1980s, I coded for the Swedish military (details declassified in 2022, but the content remains classified). They provided me with a cover job, so no one knew the full extent of what I was really doing. I've done extensive inline assembly for large corporations, including space applications. It’s been quite the journey! - I was hit by a massive stroke a year ago, and now I can only do music .. can't complain :)
Sorry to hear about your illness, but it sounds like you had some fun, enjoy the music!
That is some extraordinary experience compression algorithm! 20 million minutes of expertise compressed into 20.
Now I understand how HR can demand graduates with 24 years' experience in software development.
Wow... that was awesome... I had a Radio Shack computer when I was a kid and wrote my first lines of C, C++ and Java in 1997... this video made me realize I know a lot more than I thought I did... Thank you.
I want a lot more career reflections please!!! This was incredible
I, too, went from BASIC to 6502 assembly on an Apple ][ because I needed faster computing for a game! And I love the term "mechanical sympathy" to describe understanding how the underlying technology works.
Dave! Thank you so much for your words of wisdom!
I've been programming for about 30+ years myself -- since I was knee high to a grasshopper! My dad upgraded his old Intel something 286 CPU to a 386SX and let me use the old one to build a DOS 5.0-based PC with him! He gave me his old yellow monochrome monitor and upgraded to the 4-color TANDY... or maybe the 16-color EGA? (Late 80s.) He showed me how I could write code in DOS's batch file scripting language and how I could use ASCII alt codes to build a graphical border around text ... and the rest, as they say, is history! I've been hooked ever since! The ability to make manifest all the neat ideas in my head and (effectively) create something out of nothing was so coooool!
Again, well done, brother!
Fascinating tale and amazing experience. Thank you for sharing, and please continue this series. I find it invaluable!
I concur with your final comments on LMAX. They were good times. Thanks for the video Dave
I remember typing in the Reversi game out of a magazine. The computer's logic was okay, but it didn't take long for me to be beating it regularly, so I rewrote the logic, and then I had a real challenge again. That was on my Atari 400, back around 1983.
Very valuable. Got onto your channel a few years ago, and love your book. You're definitely a big part of my software journey. Thank you.
Thank you, Dave, for a good summary of a great life experience and important lessons. Ordered the book, eagerly awaiting the delivery.
My father purchased an Amstrad Schneider CPC 464 in the late 1980s and learned from magazines how to make quizzes, e.g. capitals of the world - I was around 10 at the time, and mostly typed in the code 😊
Then came high school, a 486, Turbo Pascal and Delphi 1.0 on Windows 3.1 ❤
To understand this in a better light, I ran the video at 19x speed, so that I wouldn't miss out on anything of your (40×1440×360) minutes.
Many thanks for sharing; it requires resilience and guts to stick to one job for 40 years.
In today's computing too, abstraction and understanding at macro and micro levels do matter...
Great stuff Dave. My early work included working on a private network, implementing a comms stack up to the TCP level. And before that, ZX80 assembler to test network hardware for the same network.
Haha, a later learning point: when I found I had been doing deep unpicking of the 'finer' points of Ellis and Stroustrup for fun all weekend, for a solid six-plus months, I decided my life had to change...
Later big gains were agile methods, test first development / TDD and BDD, and automated testing in the Rails world. RSpec was a huge lasting influence.
And I'm still learning, thankfully. It's one reason why IT is so compelling as a career.
My first computer was also a ZX81 (also 1K) :)
ZX81 for me as well. An Amstrad 1512 later (I learned C on that - Zorland compiler). A 3 month City & Guilds course primarily using COBOL :)
I've been programming professionally all my life (40+ years) but was laid off (not sure why) last year and am now having trouble finding another job. I've come close - in the final 2 at one place and the final 3 at another - but no prizes for 2nd or 3rd place. Weird times. I thought I'd be programming professionally for as long as I wanted to, but times change. Hopefully things will pick up.
Sorry to hear that, good luck in your search.
@@ContinuousDelivery Thanks
This is brilliant. Such a wealth of knowledge, information and experience. Thank you for sharing.
My developer journey started a lot later than yours, in the early 2000s. I came to a lot of your conclusions in my work building complex systems software in a large enterprise. However, the latter was so inflexible that, despite demonstrating that my approaches worked, work there became unbearable, and my career terminated early about 5 years ago. I started growing organic vegetables and built a working business. Even in that field many of the learnings can be applied... Only now am I starting to rekindle my deep interest in the computer world. Better late than never. Thanks for the video. Perhaps if I had searched for a better place to apply my skills, I would still be working in the software industry. Now I administer our home network, fiddle with Minecraft servers for and with my kids, and tinker with electronics to monitor the weather and other such things. I guess I have made my peace with the software industry...
Me? 33 years, from DOS in 1991 to Windows in 1998. First app? A VT220 terminal emulator (for DOS) with scripting capability. Oh, and using Borland C/C++ from then till now (Embarcadero C++ Builder).
3:50 Dr. Dobb's Journal, C/C++ Journal; I got a CD from the C Users Group and a CD called Technical Arsenal. It has a disassembler that's still in use, but for Windows.
@@colinmaharaj50 Loved Borland as a company. I learned C++ and also Assembler and Pascal with their Turbo IDE
I could listen to you all day. Your content delivery is impeccable
That's very kind. Thank you :)
Following a similar journey to yours, Dave. Learn the basics, get burnt with abstractions, learn to abstract well while exploring testing, now moving through XP-fueled coaching and consultancy to help other teams scale their efforts in CD/testing.
Remarkably similar experience with microcomputers starting in the 70's. Also fortunate to have had some access to the IBM 370 and the Cyclotron Unix computers at Cornell, which exposed me to the formal systems of mainframes. It was such a magical time.
Thank you, a wonderful video. As for me, I started as a software engineer in 1985. My head hurts thinking back on all the changes.
I found that the most important skill is establishing collaboration amongst team members. I try to treat everyone like family. Even the cranky ones.
I once worked with a gentleman with an odd personality. Many didn't like him and didn't want to work with him. I started digging into who he was inside. Found out he was a submariner in the Navy, over the top intelligent, and was a quiet person. We became work friends and collaborated in great product delivery. Working with him was an honor.
TI-99/4a was my first computer and programming experience
$50 on clearance back in the day!
@@danielbarnes3406 yep. Thanks to Jack Tramiel. I didn't have any way of saving programs unfortunately so my programming was very limited.
@@FlyFisher-xd6je that was an awesome computer especially with the extended basic cartridge.
@@amirnathoo4600 I have grown to really appreciate it as an adult. I own a few of them and a PEB. I also have some of the modern add-ons. Fun stuff.
Awesome presentation! I found myself nodding in agreement as you moved through the progression of your skills and understanding. I am very interested in your opinion of AI and your prediction of how the software engineering business will change in the future.
Thanks.
I started out dabbling in Forth and Lisp, but a prof who was a huge enthusiast of Wirth languages left a huge impression on me. I loved writing code in Modula-2. Initial compiles threw 3-4x more errors than the Unix C I was using at the time. Very frustrating, but it helped me understand that I should spend more time thinking about what I was writing before my hands landed on the keyboard. Retiring soon; I need to rediscover Lisp, but I'll be open-sourcing some FreePascal projects too.
Same here: TI-99, VIC-20, 64, Atari, Mac. Same start, same game. Interesting. Suffering from machine details brought me to write for middleware - first proprietary middleware like REALbasic, then Java, then Processing. Since then I feel no need to move on, because code and libs that I wrote 20 years ago still compile and run fine. What I learned: make it cleaner, cleaner, cleaner. And: read, teach, learn, discuss, produce - together they bring you further. If you leave any of those 5 out, it slows your development.
No Amiga? The leap from the C64 to the Amiga was huge in my 15-year-old eyes. Still lots of good memories of both computers. Everybody was trying to make the most impressive demos on the C64; it had no use other than to show your skills. But it was fun!
@@TinusTegenlicht no amiga. But envied amiga owners.
@@jensBendig Me too, I didn't have one, but an older boy in my street did. I bought the C64 off my brother and pimped it up with DolphinDOS instead. The Amiga was a bit too expensive for me at that time.
Fascinating!
To see the shoulders of the giants you are standing on as a professional in the modern IT field is almost a spiritual experience.
Great memories there Dave. Thank you for the reminder how and why many of us got into computers and the programming thereof.
I started out on a Casio FX702P with a cassette interface for backups and a small thermal printer. I would regularly blast out long reams of silly quotes using a custom font-design program I wrote in the minuscule 1K of RAM. Fun and games too on a single-line LCD display.
Excellent video and I can relate with a lot of what you said. I started life as a physicist who ended with a career in designing and operating global data communications networks for the delivery of real time financial information and facilitating low latency financial transactions across the world. That included working for companies such as Reuters, where speed, accuracy, and impartiality were important. I was continuously learning new things which made my whole working life fun. One thing I learnt was it was ok to make mistakes because that's how you learn, but fail fast, learn from the mistake, apply the learning, and move on. Networks for real time financial services need to be always available with optimal low latency communications paths. Their design and operation require elevated levels of redundancy and diversity, with a robust "risk managed" change management process, allowing network changes, software updates etc to be applied without impacting service. So, continuous testing over the journey to deliver the network change etc, with an effective fallback process if something unforeseen occurs was essential.
My first computer was the PET 2001, limited to 8K, then the Apple II, up to 64K RAM and with 2 disk drives. PAINFUL to program any database application in Applesoft BASIC and Apple DOS 3.3. But then came the Z80 SoftCard for the Apple, and CP/M, and... dBASE II! WOW, a stroke of genius and a GOOD programmable "relational" database, and my career as a database programmer was launched.
I appreciate the info. I am a systems engineer and cloud architect, and CEO of a tech company. I know how to develop software but am not passionate about it. My expertise is in infrastructure, which I truly enjoy. I work alongside software engineers and they offer perspectives outside of my purview which is extremely valuable.
This is very helpful. Thank you, sir.
Hey Dave. Just wanted to say that your book, Continuous Delivery, was the first thing I picked up 8 years ago when I started my move from traditional infrastructure/app support towards "build/release", which eventually became DevOps. Today, I have the privilege of leading a top-tier DevOps team, and the principles are as valid today as they were back then. I'm thrilled to see you're still out there teaching.
That’s really cool to hear. I wish you continued success!
Good advert for the book. Not going to get the book, but will subscribe to get some of your wisdom
I vividly remember trying to explain to someone how to be successful at a workplace I'd left, one I was suggesting they apply for. During that conversation I realized why I'd been successful, though I hadn't really reflected on it before.
If you can get into a mentor program, I recommend it. You will not only help someone get a good start in your industry, you will also learn a lot about yourself while doing that.
It looks like you had a lot of fun projects to work on during your career Dave. That makes one keen to go to work and not dread it. The Apricot gig must have been a dream job.
Apricot was a nice place to work. We did some cool stuff. I have been very lucky through my career and nearly always chose my next job based on being interested and excited by the prospect of the work, which helped me to find interesting things to work on.
Had a prof who taught a class in 370 assembler. He taught us beautiful machines and the beauty of assembly - the relationship with your code was simple, and it never cheated on you.
a very relatable story. Thank you for sharing, Dave
amazing and inspiring teachings as usual!
I found a listing of a program to play Reversi on the BBC Micro in the September 1983 issue of Your Computer. I wonder if that was the one you referred to, Dave. It was submitted by A.P.Walrond, Pitney, Somerset.
Thank you, I will take a look; it is about the right time.
Thank you, great subject matter
18:00 Summary. Notice how what's important is very different from what is asked in interviews.
What is mechanical sympathy? Besides that, I am amazed to see how abstraction can reduce entangled logic and how crucial tests are for good software output. I am still building myself up to that.
"Mechanical Sympathy" is a term that we stole from Formula 1 racing. In the 1970's the best racing driver was called Jackie Stewart, he was asked in an interview if you needed to be an engineer to be a successful racing driver, he said "No, but you do have to have *mechanical sympathy*".
We were building one of, if not the, fastest financial exchanges in the world. We learned that to make the most of the hardware, we needed to design our code to take full advantage of it, from processor caching to disk-drive mechanics to Network/OS boundaries.
I describe some of this in this early video from my channel: ruclips.net/video/0reMVgn6kRo/видео.html
The Intel 486 did not do concurrent issue. It was heavily pipelined, but instruction sequencing wasn't that big a deal, except for instructions with larger latencies, which perhaps benefited from being issued early. I don't even recall if the 486 would allow other instructions to issue while it finished a multiply, blocking only when the result of the multiplication became needed. That would have been an important use case. Most of the important instructions on the 486 had single clock-cycle latencies. Maybe the floating point unit executed concurrently, but I never did any FP coding on the 486. I recall the 486 as easy to program in assembly language.
It was really the Pentium where you had to carefully schedule compatible instructions for dual issue, or your performance sucked.
Not quite true: pick the wrong instruction order on the 486 and you blew the pipeline, and so hit performance severely. It was described in some detail in the Intel spec sheet when the processor was released. So you needed to worry about what order to place instructions in as you were programming, even when the order made no other difference to the meaning of the code. That was what made me decide to change.
@@ContinuousDelivery I found some old stuff about this, and the main problems involved delays in address generation and result forwarding from the previous instruction. Forwarding from ALU to ALU was generally seamless, but forwarding from fresh register values to address generation definitely involved stalls. This latter case is not a pipeline break but a dependency stall; not all forwarding paths were instantaneous. Address generation is more complex, and I could almost imagine an actual pipeline stall there, but on the 486 my guess is that these were dependency stalls as well. This kind of careful dependency hoisting is certainly tedious to do by hand, but it wasn't conceptually very complicated from what I've managed to find. You would likely recover 80% of the cycles lost to a naive ordering with a simple peephole optimizer; that would not be true if the issue were real pipeline stalls. It wasn't that common to have a tight loop not already dominated by DRAM memory stalls, which makes these extra stalls look not so big. I hand-coded a loop to search for character tokens, reading a fairly simple data structure from an input pointer. If an instruction was calculating an address or fetching anything from memory, it was hoisted as far up the instruction sequence as it would go. Already I was only leaving a few percent on the table by not carefully analyzing all the specific forwarding stalls. It's different for an encryption block, where you grind away on a small chunk of data with few memory accesses to slow you down for other reasons. But the 486 was simple enough that the compilers tended to do a good job anyway; I found few reasons to complain about my Watcom C/C++ compiler.
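To make the hoisting concrete, here is a minimal C sketch of the same idea - a toy of my own, not the actual token-search loop, and the function name is made up: the load feeding the next iteration is issued before we test the current value, so the fresh-register-to-address-generation dependency has time to resolve.

    #include <stdio.h>
    #include <string.h>

    /* Toy illustration of dependency hoisting: the load for the NEXT
       iteration is issued before the work on the CURRENT value,
       hiding load/address-generation latency behind useful work. */
    static int count_tokens(const char *buf, size_t n, char tok)
    {
        int hits = 0;
        if (n == 0)
            return 0;
        char cur = buf[0];                 /* first load issued up front */
        for (size_t i = 0; i < n; i++) {
            char next = (i + 1 < n) ? buf[i + 1] : 0;  /* hoisted load */
            if (cur == tok)                /* work on the previous load */
                hits++;
            cur = next;
        }
        return hits;
    }

    int main(void)
    {
        const char *text = "a,b,c,d";
        printf("%d\n", count_tokens(text, strlen(text), ','));  /* prints 3 */
        return 0;
    }

In hand-written 486 assembly you would do the same reordering instruction by instruction; in C a decent compiler's scheduler (like Watcom's back then) usually did it for you.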
I became a programmer in my later years, a dream come true finally.
The times have changed for sure, though. I do some limited amounts of 'assembly'-style programming, so to speak, and occasionally get looked at like I'm either an alien or a warlock (much like doing HLSL gets the same reaction) from others.
Understanding just the fundamentals of how systems work, from one level to the next, has made an immense and usually measurable difference in my code quality and speed. If you think it's all magic spells, you'll never learn to structure data and architect to your structures' strengths.
And everyone has an overall data-structure pattern; some just model it after certain Italian pasta.
This video is very good landscaping for how to become good - take it to heart.
9:41 - RISC - Real Important Stuff In Compiler (I worked in a proprietary/88K/PPC shop which wrote its own compilers).
I started with the Amstrad CPC, copying code from the same mags
I used to copy BASIC programs out of all kinds of magazines. Typos in the MAGAZINE were really cruel. In my case, I bought a Heathkit H-89 when I was in the US Navy. Six months later, the Navy had a contract with Zenith Data Systems - and eight Z-89s arrived at my command - with nobody knowing how to do anything with them. An officer saw me one day with my Heathkit manual - which showed a picture of the H-89 on it - and then I was in charge of installing them in the command, connecting them to printers, etc.. The Navy is fine with you going home - AFTER your work is done.
One lesson I learned is that Conway's law is true. That also means that management decides how efficient software development is. Developers' influence is quite limited.
Thank you for your video. Could you please make a video about LeetCode software engineering interviews?
My first exposure to computers was seeing a Commodore PET in the headmaster's office and being allowed to see a computer program.
My mother realised that her idiot son wasn't such a dolt when I wrote the tennis game on my Dad's Spectrum within a couple of hours.
I've also run big teams and have been thinking about what makes good leadership and how to pass that on. Poor leadership is ubiquitous, even in our politicians.
Has it truly been 40 years? Yes, time flies. I need another lifetime to achieve what I want to do.
One learning you had - about the build on a loop - I saw reflected in how GoCD chose the material (commit): take the latest commit available and build it. This was much more of a pipeline than what the "pipelines" that came after it were doing. In an abstract sense, the GoCD pipelines did not leak 😁. Today's pipelines are often leaky because of a lack of representation of inputs and outputs.
MC6800 assembler was my starting point :)
I had an old Heathkit with one
I came here from a 20-year-old YouTuber dishing out "advice" to this man, who has more experience than twice that YouTuber's lifetime... haha. And here I will stay. Cool stuff, and new sub.
Dave, is your book Modern Software Engineering the kind of book that works as an audiobook, or does it have a bunch of diagrams or code that don’t translate well to audio and I should read it instead?
Yes, I think it works well in its audiobook form. There are diagrams, and the audio version comes with a link to a PDF containing them. Mostly I think it still works without, though you will lose the value of the code examples.
Yay! Z80 machine code comrade!
Was your assembler, like mine, a pencil and an exercise book, or did you use one of those posh software ones?
We had far fewer computers than kids back then, so we "invented" pair programming... even when typing in listings from Your Computer.
Still, going back to assembler is kinda nice... in my spare time, I'm working on a BBC-inspired AVR assembler in TypeScript... working in JS and assembler at the same time is the perfect way to troll the largest group of "real programmers" ;)
I learned BASIC in the same period using a Casio FX-802p computer.
But the difference between us is that I lived in a country that did not allow me to develop, while you were in the UK!
Started in 1986 on a Commodore Plus/4 at home and KC 85/2 in computer lab.
ZX81, and PEEK/POKE assembler programming! Great times
My 25 years of so-called software engineering: it is NOT engineering. Outside of my team, there is NO formal, universal, mathematical, proven concept that stands the test of time. What exists is jargon and poorly defined terms.
It's just glorified programming.
I think that is true of most SW dev, but I also think that I have experienced genuine SW engineering, and I try to describe what I mean in my book "Modern Software Engineering". I think that our problem with engineering is that we often don't really know what "engineering" is in other disciplines, and most software is not developed in a very organised way.
I define SW engineering like this: "Engineering is the application of an empirical, scientific approach to finding efficient solutions to practical problems" and for SW dev it is built on two pillars, "Optimising for Learning" and "Optimising to Manage Complexity"
@@ContinuousDelivery Working in computer science since the ZX80/81 days, and I agree fully. The problem with our engineering discipline is that everyone wants a Ferrari for Fiat money, and unlike aero and automotive engineering there are no fines or jail time if you mess up, so financial pressures seem to be the driver - as cheap as possible, with no future planning beyond the next quarter's returns. Like yourself, I got to do it right for about 15 years, where we had full control and were left to do the job correctly. Sadly it was not the norm.
You are amazing. Thanks for your time.
Hi, any old-timers here who want to give me some advice? What are your thoughts on Terraform? I've just delivered my first project using it, and in retrospect I think it actually added more complexity and failure points than it was worth. How do we feel about declarative infrastructure as code? It sounded great to me, but I'm realizing that it may have sold me a false bill of goods.
Terraform is just a tool. It allows you to express your notions of how infrastructure should be set up in an executable way, and having the computer do stuff that you'd otherwise have to do manually is always a good thing. Literally _always_.
If your mental model of setting up infrastructure looks like molasses, so will your terraform files. That's not something terraform or any other technology can fix, and doing infrastructure setup manually won't fix it either. You do have modules, with terraform, to structure your infrastructure as code in an understandable and navigable way. And ideally you shouldn't set up everything in one go, but split your configuration into multiple parts which use data blocks to describe and implement dependencies.
The inevitable result of _not_ maintaining infrastructure as code, whether with terraform or other tools, and doing infrastructure setup manually, is snowflake servers. And that doesn't scale. Think about disaster recovery: if all of a sudden you need to rebuild everything from scratch, and all you have are three-day-old, manually created backups, then depending on the size of your infrastructure you might need weeks, sometimes very many weeks, and you'll still have significant data loss. With a fully automated infrastructure-as-code setup, provided there's proper quality assurance in place, you only need days for even quite large setups, and the data loss might very well be zero - because you can afford the additional complexity required for zero data loss, which you could not manage manually.
Aside from a very few and very straightforward steps which are performed manually (essentially providing certificates and keys, which are generated and maintained offline for security reasons), we are able to recover a setup consisting of several kubernetes clusters running thousands of distinct pods within mere hours, with infrastructure as code. Without it, recovery would take weeks. There's no question from the business perspective whether the effort we spend maintaining that highly automated setup makes sense, given that one hour of downtime translates to millions lost, and a few weeks would come in at a few billions.
@@a0flj0 Thanks for the response. I can see the value at larger scales. My project was a single app running on EKS. I think I was in the uncanny valley where it was just enough to warrant IaC but not enough to get value out of it. I would counter your statement that it's *always* better to have IaC: I'd actually rather have a really good README and a set of shell scripts than a Terraform monolith that fails to tear itself down 30% of the time.
My learning points are continuous. You could say I practice Continuous Learning. Even when learning that something is not new!
It's inspiring to see engineers build things from scratch.
I always enjoy this content
Your story sounds familiar. You've got 10-ish years on me. Are we talking about paid work, or just learning and fiddling around? My first computer was an Atari 400 with the touch-pad keyboard, circa 1979-1982. I learned a lot from age ~12 to 14, and then (gasp) came a Coleco Adam with a Z80 (circa 1983-1984, depending on supply and where you lived). I made that thing sing and do stuff that seemed impossible for the times. The first time I got paid for my skills was 1988. Up to then and ever since, I have learned so many things that eventually turned into a lucrative career in the telecommunications/cellular industries, and that became internet, and that became... My advice to the younger generations: pick a focus. Pick a passion. Start somewhere. Evolve. The keys are persistence and your infinite curiosity.
I remember Apricot machines in magazines like PCW. They always looked great, in my opinion.
They were one of the more innovative PC manufacturers, a good place to work too.
I feel like a dinosaur, because my first program was written 50 years ago, in 1974!
Newbie. I wrote my first Algol program in 1970. I then did years of various assembly languages for payphones and early cellular phones before switching the company to Pascal (!) and then C (using an early compiler that required us to do our own optimisation - "Don't use switch statements!"). I finished my days writing obscure low-level bits of the phones that most of the world uses (though I imagine there are only echoes of it left now).
@@TonyWhitley My first programming language was also ALGOL, and then PL/1 and etc. etc. etc… You're a bigger dinosaur than me!
Focal. 1968. DEC PDP-8. 4K & a teletype. The machine was later timeshared to a 2nd terminal.
@@DaveZeichner I remember teletype, and that was before punch cards! You're a very, very big dinosaur! Long life!
I used Focal a tiny bit but mostly PDP8 assembler, in 1971 or so. We had a *second* bank of 4k core store 😎
Indeed, most programs published in magazines wouldn't work straight away. So basically you learned to debug, or figured out the general idea and wrote your own code.
Thank you, sir.
What a very interesting video!
Glad you enjoyed it
Brill vid cheers!
For me it was a Trash-80 in an otherwise useless computer class at the local junior college while I was a senior in High School (Grade 12).
I think my first computer experience was a TRS-80. It must have made an impression because I’m still in computers 40 years later.
Kindly upload your course online - classes and labs
My courses are uploaded online here: courses.cd.training
@@ContinuousDelivery Offering your courses for free could be beneficial for remote jobs, or just for enjoyment.
Awesome, lessons from an 8bit world 😀
If I boil down everything I've learned over the years as a software dev, it is: "No code is the best code."
I'm a giant ear, waiting to hear your words of knowledge
Hi giant ear, hope you can hear me. Thanks for listening.
Best Ballmer impersonation: CAN YOU BELIEVE IT, REVERSI?!
12:33 Do you understand quantum mechanics?
A little bit, enough to know that I probably don't "understand" it but I do have a working model in my head that, nearly, satisfies me.
There are several theoretical physicists who think that information is at the core of reality. I think that you can make a reasonable argument that what we see as superposition in quantum physics is really just our interpretation of multiple copies of information in different parts of the quantum multiverse. I talk a bit about quantum computers in this video ruclips.net/video/oTKpLKMTdT8/видео.html
A timeline would be a great addition.
In your opinion, will there be people just starting their careers now who will one day talk about their 40, 30 or 20 years of experience in software engineering, or will there only be AI?
I think that there will still be people involved, but they won't be coding in the same way. I imagine the shift will be similar to the shift from assembler to high-level languages. AI development will be human-driven for a long time to come; in my imagination, coding will be a lot like BDD - specify accurately what we want in a kind of constrained grammar, but leave the actual implementation to the AI. We will need to solve the reproducibility problem of AI along the way, though. How can you build anything complex if you get something different every time you ask the same question? What's the role of version control? This is why I think people will be involved: we will need to specify what we want in the form of tests so that we can verify that we got it!
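For illustration, here's a toy C sketch of what "specify with tests" could look like - the greeting function and its expected output are hypothetical, invented just for this example: the executable check is the specification, and any implementation that passes it, whether written by a human or generated by an AI, is acceptable.

    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    /* Stand-in for whatever implementation the AI produces. */
    static void greeting(char *out, size_t cap, const char *name)
    {
        snprintf(out, cap, "Hello, %s!", name);
    }

    /* The executable "spec": if this passes, the implementation is
       acceptable, however it was generated. */
    int main(void)
    {
        char buf[64];
        greeting(buf, sizeof buf, "Ada");
        assert(strcmp(buf, "Hello, Ada!") == 0);
        puts("spec passed");
        return 0;
    }

The interesting part is that the test, not the generated code, becomes the artefact we version-control and reason about.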
I think your vision is too short-term; that's 10 years at most. In 40 years' time AI will probably foresee what software is needed itself; at worst it will need a human to tell it in hand-wavy terms what is required.
All the learning points you discuss seem to be things you SHOULD do, things that worked well, and so on. You say your experience at LMAX reinforced things you had begun learning beforehand, but you don't mention anything you learnt at LMAX which invalidated or suppressed previous ideas or ways of thinking you had had up to that point. Things you learnt not to do (that aren't simply the opposites of things you learnt you should do).
Inspiring.
I guess I've learned very little in comparison, in about the same amount of time. I'm more confused now than when I started writing COBOL in the 90s. Things seem more difficult and slower when it comes to actually writing something. Although yes, the devices and tools are cooler. But the day to day experience is terrible really.