Just because resources are abundant doesn’t mean you should waste them. This applies to everything in life.
yeah, this. Games that took 8 GB now take 30 GB. Great if YOU have all the bandwidth and shit, but some people really have bad speeds.
Hurr durr
Agreed. I can't even play Warzone, cause by the time I have it updated, the next one's out 😢
time, being the most valuable
get off your high horse.
@@boar6615 Why are you so angry?
Programmers should 100% still be thinking about storage. When the game is 150 GB, it doesn't matter if you have a 1 TB HDD, it's still not enough.
Yeah
The fact that 1 TB of storage nowadays can't hold more than 10 AAA games is just insanity
Btw, most of the storage in large games just comes in the form of high-resolution textures and models. Not really binary size.
Some of that comes down to the 3D art side of things: there is an art to texture bashing, where you can bash textures together to make something new and save on storage space, at the cost of slowing things down.
But many games simply don't do that at all, and I think that's partly because they are building for very low-end machines.
I'm not a huge fan of how large games have gotten, but the people acting like getting an SSD is some prohibitive cost are overreacting a bit. With most games now retailing for $70, it costs less than the price of the game to get a 2 TB SSD
The fact that memory management is less of a concern now is how we end up with applications like Teams using multiple gigs of system RAM while doing absolutely nothing.
It's also the fact they are effectively using a whole-ass browser to render their application.
Unused memory is wasted memory. Almost all of it is cache that is freed whenever the OS needs it.
@@uwuLegacy except it's not. The idea that used RAM gets freed whenever it's needed is a big utopia; in reality, your system will likely just freeze
@@TheCollectiveHexagon to some extent, they're correct. Cache isn't really used memory, as it can more or less be cleaned up instantly. Windows doesn't report it as used; it only puts the faintest of lines on the bar to differentiate truly free from cache-"free". So in that sense, it is free.
The problem is that when everyone thinks like that, the cache becomes useless. A side effect of programs needing more memory is that they become less and less tolerant of swapping, as it takes that much more time to page a program in or out under memory pressure.
Unused memory is wasted memory; that's why OSes fill it with cache. But wasted memory is also wasted memory, and a program should not take more than its fair share, because it doesn't know what other programs are competing for it. "Oh, I'll just use 3 GiB to calculate a square root, it's fine since it would just be cache anyways" only holds true until suddenly Chrome or Resolve wants that space.
The worst part? They had to code in word editors and there were like 8 reliable languages
the good old days when no one harassed you about Rust
Word editors?
No, we used stuff like vi, emacs, or on IBM mainframes, edit or xedit.
I still do this when writing Lua, and I find it my preferred method for HTML, CSS, and JS too. Thanks to things like Notepad++, nowadays I get the best of both worlds
@@akulkis I thought those were word/text editors, what more is there? I'm actually serious... (Microsoft Word doesn't count)
@@DaRetroFam that's a direct interpreter! Like the command line. Even then, there is usually some limited ability to backspace without deleting, and to insert. Unless you're going back to the days of switch flipping; if so, then by 'paper', do you mean punch cards?
"We don't ever have to worry about storage on our personal computers..." *cries in any modern AAA game*
As if a MODERN AAA game is what you'd use for programming
To install other games I use an external HDD, which is slow compared to my internal SSD, which has a lower capacity.
If you’re the type of programmer that never thinks about the things listed here. You’re the type of programmer AI is going to replace
I agree, the people programming in assembly and solving complex problems in languages like Rust, C++, C, and others are the ones AI can't replace. That's why I'm focusing my energy on low-level learning rather than the high-level shit. I'm a noob and even I can see this
Calling them programmers is an insult
@@FineWine-v4.0 script kiddies more like
If you think those programmers are less replaceable by AI than those who learned memory optimization or whatever, you are wrong. Every developer will be replaceable, faster and faster. I hope you realize that we can already prompt-generate and deploy web services, and that will extend to whatever you're coding in not even years, but months.
@@guilhemgmescudi ever try asking AI to code C or C++? I know for a fact you haven't, since I have, and it's always ruined my entire codebase with its ugliness and terrible CPU and memory usage
Me, embedded developer crying here😢😢😅
why do you code in bed
@@josegarcia2762
1. It's comfy
2. You can code in your pajamas, your underwear, or even nude, and nobody cares.
Every byte counts..
@@akulkis🤣😂
@@josegarcia2762 900IQ comment
Game studios watching the video: 300 GB and no optimization it is.
Why is this so true?
@@lightning_11 because modern rendering techniques require multiple "textures" for a single model, effectively exploding the size of the game.
I agree that game sizes are absolutely ridiculous right now, but it's not without reason.
@@NostraDavid2 if you extract assets from the average modern game, you'll find: a bunch of duplicates of the same texture; texture maps that are only 1/4 filled; 36 LOD variants of a texture used only once in a single quest; a 16000x16000 noise texture which could be generated procedurally; 5 identical cube meshes with slightly different scales; and, of course, one really detailed mesh for a background object which never gets loaded, because the player can never get close enough for the LOD switch. Not to mention pre-rendered cutscenes which could easily be implemented in-engine, but instead we get a bunch of uncompressed 4K videos
Extreme SIMD techniques with optimal CPU architecture controls
COD series be like:
Strong programmers create good optimization
Good optimization makes weak programmers
Weak programmers create bad optimization
Bad optimization makes strong programmers
Crazy
LMAOOOOO
This is the reason why I think of old programmers as wizards. They could develop these complex algorithms and solutions out of thin air. Our lives wouldn't be as convenient today without their breakthroughs. I am eternally grateful.
"We dont need to be efficient the consumer can just buy more storage and better internet." That's you bro
Having too much storage also brings problems... for example, web apps that use 500 MB just for displaying a text schedule.
Electron be like
@@Rudxain javascript*
@@underscore. JS can be efficient if compiled to bytecode, and if "manual" object management is implemented (to reduce allocations, and pause the garbage collector).
In theory, languages aren't slow or fast, it's the implementations and optimizations that matter. Although some langs are easier to optimize, JS isn't one of them
JS Frameworks be like.
@@Rudxain I've never heard of JavaScript being compiled to bytecode. I've always seen JavaScript being interpreted on the spot (meaning compiled and immediately executed). Bytecode refers to Java, which actually is compiled. But JavaScript is nowhere near Java. It shares a similar name, but that's where the similarities end. Comparing JavaScript to Java is like comparing apples to oranges.
There's also one superset of JavaScript, called TypeScript. While TypeScript interpreters are not natively supported, TypeScript can be transpiled to native JavaScript code, which is what Node.js understands. Many people inaccurately label that as compiling, which actually makes sense, but you're not compiling into machine code, you're translating from one programming language into another. It's also worth noting that any JavaScript code is valid TypeScript code.
📁 Hello world in electron - 338MB
Hello world in C - 17 KB
Hello world server in C - 18 KB
Hello world in Assembly - 1.2 KB
At least, that's what "ls -lh" told me when I made those programs on my university's server, which I happen to know is running Fedora.
@@mage3690 if you're using Windows and don't set the right compiler options and use the wrong includes, a hello world could easily be over a MB in C++
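For anyone who wants to reproduce those numbers, here's a minimal sketch. The gcc invocations and the resulting sizes are assumptions: they vary a lot by OS, libc, and toolchain, which is exactly the point of this thread.

```c
/* hello.c: the program behind the size comparison above. */
#include <stdio.h>

int main(void) {
    puts("Hello, world");
    return 0;
}
```

Assumed builds on a Linux box with GCC: `gcc hello.c -o hello` gives a dynamically linked binary in the rough 16-20 KB range; `gcc -Os -s hello.c -o hello` optimizes for size and strips symbols, shaving it further; `gcc -static hello.c -o hello` links libc in and lands much closer to the megabyte territory described in the Windows reply above.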
Comparing hello world in Electron to hello world in C is oranges to apples. Electron is doing a hell of a lot more than just executing your code.
@@Friskni I'm actually completely unfamiliar with electron, what the hell is it doing?
@@petevenuti7355 Electron is used for building desktop applications, e.g. Spotify and Discord.
It relies on Chromium for rendering, which is the open source browser tooling that Chrome is built upon.
So imagine you wanted to say hello to a girl, but instead you got a whole makeup team, a camera crew, and a choreographed dance. You definitely did not just use the simple words, and you definitely did not do it cheaply.
Hi, small correction: The fast inverse square root wasn't written by John Carmack. From what I can find, it seems that the programmer who brought the algorithm to id Software was Brian Hook, though he wasn't the one who invented it. There seem to be a lot of people involved to some degree in its creation.
It being credited to Carmack was a product of guess work around programming forums.
Anyway, love the short!
If I'm not mistaken, Gary Tarolli has the first credit for this from back in 94. He worked at 3dfx
That's why this creator is now on my don't-recommend list. Sloppily researched bad "content" like this needs to be removed from YT. Waste of time!
@@kf7558 I think most people believe Carmack wrote that algorithm. The only reason I know it was someone else is his interview on the Lex Fridman podcast, where he explicitly mentions that he didn't invent it
There are a lot more errors in the video. Like, the reason we had the fast inverse square root algorithm wasn't that we didn't have libraries; it was that the hardware didn't have an instruction for it. Now it does, and it's much faster.
@@PaintsAreOp I'm pretty sure Intel CPUs had the hardware, or at least got it not long after this (it probably helped on other ports, though). Seems like it was a holdover from the 70s, a function written for MS BASIC. John Carmack definitely used it, but he's said in interviews he had no clue where it came from.
Debugger: ❌
print("wtf"): ✅
John Carmack did not come up with the fast inverse square root. Someone else on his team did
Close, but no. The piece of code was likely created by Greg Walsh in the late 80s, and the theory to use bit-fiddling with Newton iterations was created by William Kahan and K. C. Ng in the mid 80s.
tell me you're a web dev without telling me you're a web dev
Storage is no problem of a modern programmer.
node_modules: Hold my beer 🍺
We need more normal (old-school) programmers with modern resources
Translation: we need more real programmers, most modern programmers don't know how to program.
@@AnotherSkyTV we have coders not programmers!
@@AnotherSkyTV and also the software dev answer: our algorithms are so powerful that you will need to get a new computer (available in 2 years + some ahem ahem "minor fixes" ahem...) LOL
"Cant relate" you are not alone 💀💀
This is the reason why people are compelled to waste 899 dollars on a new phone every year: lazy developers wasting resources.
@@savagesarethebest7251 blaming the workers instead of the executives' greed, perfect...
So imagine the world if we were writing software like before with the hardware of today. I feel like that world wouldn't even know what a frame drop is
This is what I've been preaching for 2 decades. I do write a hell of a lot of software in that way and man, it's fast. Gimme C and a terminal or a raw framebuffer.
Temple OS
"programmes don't need to think about memory management anymore" is one of the biggest lies someone couls ever say
"storage is never really a concern" ... Are you telling me while I'm cleaning up my drive because it's getting full.
I'm an embedded engineer. I have to worry about these things 😅
This is exactly the reason why modern programmers don't have a clue what they are doing, creating bulky, memory-heavy shit that uses lots of storage and is inefficient at IOPS.
Knowing exactly what you are doing, and knowing how stuff will end up on the stack (knowing what the stack is and does) after compiling your code, will outperform all the suckers that don't know.
Code written by someone who knows the drill will be small, low on memory usage, will do what it's supposed to do, and will be way faster than everything else. And on top of that it has a lower TCO. Hardware isn't free!
Ultimate debugger tool: print('LOL')
guilty
_😂_
I always do
Print(“********”)
I also use "ROFL" sometimes
printf("sus imposter amongus XD
");
John Carmack did not write the inverse square root function. It was another engineer at id.
My grandfather was a programmer and I learned programming from him, and I still debug my code with pen and paper. Actually, I have never used a debugger in my life 😅
Cool
Same. I used ChatGPT when I was learning a new language and had syntax errors. Once I gained proficiency, I ditched the AI and used my good old brain, and if lots of number crunching is involved, pen and paper
Work hard, not smart. Got it.
Carmack did NOT write the fast inverse square root algorithm!
Source: himself, on Lex Fridman's podcast
In other words, modern programmers are unskilled compared to those from the last century.
That's not it at all.
In the same way a modern soldier couldn't survive without supplies as well as the ancients. We have better supply lines, faster transports, and delivery of attacks, but even American soldiers ain't in as good of shape as Spartans.
Modern programmers have trash collectors, faster dev times, and prettier programs (Doom '93 looks good, but have you seen Eternal?).
@@rya3190 tbh I don't even know how to use debuggers and just understand my own code
@@super---. And there are Marines who'd burn down a forest with their cooking...
That's normal. 3 decades ago, people who programmed applications had limited resources and were passionate about it. Computers, smartphones, and the internet were not a God-given thing in every household at that time. Only a few people had them, and fewer companies used them. Now the demand for programmers has increased, so obviously the quality has decreased. People aren't entering the tech industry because they absolutely love it; they go into tech because they have a small attraction to it and there are many jobs in it. So for the majority it's a career, not a passion.
Programmers in 2050: in the old days, programmers used to write code in an IDE 🤯 and had a deep understanding of their code
I am currently using a MacBook Pro from 2015. I have 2 GB of storage left and 8 GB of RAM. My GPU and CPU are total shit, and I still manage to code games that run at up to 60 fps.
This way I can make sure that my optimisation is perfect and works on every machine anyone is realistically using.
The stuff even works on school PCs
Me with 20 KB of RAM and 64 KB of FLASH memory 😭😭😭
Bro, you should upgrade your desktop! It's been a little outdated, me thinks! 😂😂😂
Are you an embedded bro?
TI-84?
That's why, in my opinion, old programs and software run much faster... and I still don't understand why modern programmers can't make their programs more efficient like in the past
I am glad I was taught coding by ppl who had begun with punched cards. It was drilled into us to understand & solve our coding solutions with pen & paper away from the machine. This “old school” approach is golden
This sounds like low- vs high-level programming. Low level (assembly) requires knowing how memory, binary, registers, and system calls work.
The fast inverse sqrt, for example, works thanks to knowledge of how floats are represented in binary and how the computer does math. So knowing your full tech stack helps in optimizing, because you know exactly what the computer does even when writing high-level code.
I do think that old-school programmers often had mathematics backgrounds. Most programming doesn't require a PhD in math, but if you do know math, you see programming very differently. Just like when you know assembly, you see all higher-level languages differently.
I literally used that last month to do a fast float-to-int conversion.
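For reference, here's the code this thread keeps arguing about: a minimal sketch of the widely circulated Quake III version of the fast inverse square root (authorship disputed, as the corrections elsewhere in these comments point out). It works precisely because of the float bit layout described above; this sketch uses memcpy instead of the original's undefined pointer cast, and on modern CPUs the dedicated rsqrt hardware makes the trick obsolete, as another commenter notes.

```c
#include <stdint.h>
#include <string.h>

/* Fast inverse square root, Quake III style: reinterpret the float's
 * bits as an integer, use the magic constant to get a rough guess,
 * then refine the guess with one Newton-Raphson step. */
float Q_rsqrt(float number) {
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);       /* read the float's bit pattern  */
    i = 0x5f3759df - (i >> 1);      /* the famous magic constant     */
    memcpy(&y, &i, sizeof y);       /* bits back to float            */
    y = y * (1.5f - x2 * y * y);    /* one Newton-Raphson iteration  */
    return y;                       /* approximately 1/sqrt(number)  */
}
```

Q_rsqrt(4.0f) returns roughly 0.499, within a fraction of a percent of the true 0.5.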
For storage availability, this remains true until you begin collecting datasets for training ML models. Fortunately, new PCIe Gen5 server SSDs can also be installed in workstations. A 10 GbE connection is not sufficiently fast for handling this volume of data. While cloud solutions are advantageous due to their computational power for training, data set privacy is a significant concern in the cloud.
I remember the days when I was compiling my first 3D software written in C on 1.44 MB floppy disks in the early 90s.
Storage is an issue when you're in the cloud, as it can be costly.
While it is true that today you can get away with not knowing what your code is actually doing, it is also still true that knowing everything about how a computer works is the only way to make insanely good quality software.
"You don't have to worry about debugging"
Me: debugging JS code and confused about where the error is pointing.
John Carmack is a GENIUS, but he was not the one who invented that thing. That is a hoax.
My professor scolded us when a binary search algorithm written in Python took 1 MB of space; he made us rewrite the whole assignment of 20 programs in C 😂
Because 1: CPU clock cycles are STILL a bottleneck. 2: No matter how much memory you have, you can and WILL fill it. The more space-efficient your code is, the more stuff you can do before you fill up memory.
You mean the program, not the data, right?
On an old 8-bit CPU from the 80s it might fill at most half a K (yes, ½ kilobyte), but nowadays, even in C++ on Windows, a hello world easily takes over a megabyte if you don't know how to set the compiler options!
He should've demanded that all programs were written in C to begin with. That's when you really learn to program. You don't learn that in Python.
Note for the first one: fairly new programmer here. One of my physical drives is both my C: and D: drive (partitioned separately), and the C: partition is low on storage, to say the least, most of the time (about 9 GB free if I'm lucky). So actually, storage is still a factor, especially when things default to the C: drive; I have a lot of symlinks in use just to keep the machine running. Need to do a big cleanout at some point but never find the time
Memory management is very important.
Actually, John Carmack did not come up with the fast inverse square root algo. He said it himself; it was a colleague of his that did it. Sad that nobody remembers his name.
Learning programming in 2023 is learning that you don't need to program anything because of everything
Embedded programmers switch to assembly to burp out every air bubble of storage...
Agreed that the modern day programming environment is much more advanced, but it also means the deliverables are much more complicated.
not much different other than more pixels maybe...
They are not! The deliverables are exactly the same, and with the tools, easy in comparison. Try drawing a sprite to the screen of a C64, or worse, an Amiga! It's many lines of assembly.
"Cant relate.." LMAO! me either
neither*
I was wondering how you would know these things, how old are you... I can relate; once, my only operating system was DOS on a floppy disk :D
@@burakdogmus I meant the last comment he had on knowing your own code deeply.
I forget most of how my own frameworks function if I stop working on them for a while.
@@b0exi Only on a coding thread....
(English is not my first language)
@@tally3018NEITHER is it mine
With the exception of storage, many of the things you listed actually do still need to be taken into consideration for certain classes of developers, namely those developing kernels and/or bootloaders. When you're working on bare metal, many of those memory management abstractions, high-level APIs, and garbage collectors that you mention aren't there, because they depend on your OS existing in order to work properly.
When you're developing for UEFI, of course, then the UEFI Boot Services do provide some APIs in order to be able to do something like read a kernel off the ESP, but once you call `exit_boot_services()`, that all disappears, and then you're left with needing to reimplement all of that from scratch.
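To make that handoff concrete, here's a rough sketch in gnu-efi-style C (names per the UEFI spec; error handling and the usual get-map/exit retry loop are omitted, so treat it as an illustration rather than a working loader):

```c
#include <efi.h>
#include <efilib.h>

EFI_STATUS EFIAPI efi_main(EFI_HANDLE ImageHandle, EFI_SYSTEM_TABLE *SystemTable) {
    UINTN MapSize = 0, MapKey, DescSize;
    UINT32 DescVersion;
    EFI_MEMORY_DESCRIPTOR *Map = NULL;

    InitializeLib(ImageHandle, SystemTable);

    /* First call fails with EFI_BUFFER_TOO_SMALL but reports the size. */
    uefi_call_wrapper(BS->GetMemoryMap, 5, &MapSize, Map, &MapKey,
                      &DescSize, &DescVersion);
    MapSize += 2 * DescSize;   /* headroom: the allocation below can grow the map */
    Map = AllocatePool(MapSize);
    uefi_call_wrapper(BS->GetMemoryMap, 5, &MapSize, Map, &MapKey,
                      &DescSize, &DescVersion);

    /* Point of no return: once this succeeds, Boot Services are gone,
     * and the allocator, console, and drivers used above all vanish. */
    uefi_call_wrapper(BS->ExitBootServices, 2, ImageHandle, MapKey);

    for (;;) ;  /* a real loader would jump to its kernel here */
}
```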
Storage is a huge issue. Programmers no longer optimize their games like they used to
I do software development for all sorts of systems these days, from embedded to enterprise cloud. Even if I'm using a beefy system or high level language that has a garbage collector or abstracts away a lot of those low level operations, knowing how these things work still means I will be aware of how to best leverage their advantages. If I can consume less of a VM's resources in a cloud app, that can result in big savings at scale. If you write code, you still ought to know how the magic works.
“Can’t relate” is the only part I related to
I find that keeping my computer science fundamentals sharp is what sets me apart from my colleagues. Especially now since a lot of companies are working with a revenue mindset, they want to pull as much performance out of their hardware as possible, so languages like Python, Ruby and JS aren’t cutting it anymore. I can work with systems programming languages effortlessly which a lot of modern devs don’t seem to be able to do.
Tbh, dealing with these things will make you a better programmer. After I started learning assembly and studied algorithms and data structures in C I became a much better developer.
Despite the common misconception, Carmack didn't really write fast inverse sqrt: en.m.wikipedia.org/wiki/Fast_inverse_square_root
Dave Plummer has an excellent video on this topic: ruclips.net/video/Fm0vygzVXeE/видео.htmlsi=3hem9dcAHneVmKNF
Counterpoints:
Embedded
OS dev
Performance critical applications
Writing those APIs
Writing those GCs
Carmack didn't invent the fast inverse square root algorithm, Greg Walsh did based on a paper written by William Kahan. Also, systems programmers still frequently use "old school" programming techniques for both hardware limitations and performance reasons.
I work with large datasets that I hold in RAM and manually handle pointers to access that data due to weird memory constraints. I may be a modern programmer, but I can 100% relate to the pains of the older programmers
not a matter of age but performance...
I can almost guarantee there’s a better way to do whatever it is you’re doing, but hey if you like pain nobody’s gonna stop you
I think memory should still be considered important today for any developer. Don't waste it. That's the difference between a great developer and a good one: understanding the machine fully and working in harmony with it.
I started on BASIC. It was so incredibly slow because it was being interpreted on the fly. For serious things you needed to do proper compiling, or else Christmas would come before things had finished running.
Now, with languages like Python, compiling on the fly is so normal that many don't realise that packaging with a compiler would speed things up so much.
I can relate to having to have a deep insight into the program, but let me remind you, that also included the hardware, system calls, etc. I was programming my Acorn Electron when I was 6 and I only had 32K and a 4-bit wide bus for the RAM on that thing… and cassette storage.
I like exploring the history of programming, as it helps me get a larger view of what's going on in this field.
Please do more of these videos !
Enough storage, until the Docker images start to pile up.
I'd love to see a video discussing the 8 fallacies of distributed computing. That's something modern programmers should be worried about that wasn't a problem in the old days.
COD developers be like: Well if it's not a problem anymore...
Memory management and storage optimization are still 100% required. Just not in Python/Lua; in C/C++/Rust, you 100% still need to do it.
John said on the Lex Fridman podcast that he did not write the inverse square root algorithm.
The major cost in my previous job was database cost, so storage is also critical. The difference is that how you store is the main concern, not how much you store. Memory management is also important nowadays in the cloud
Debugging is a skill every programmer should learn. I have seen too many kids writing code and debugging using ChatGPT, and then they have no clue when something doesn't work.
I never thought printf() would become the second-worst debugging method.
That's because schools don't teach it. Same with unit testing. It's a shame, because both have been massively important during development.
Using printf to debug is like using a screwdriver to cut a steak. Yeah it works, but it's such a shitty way of doing things.
Unless you have some pretty simple programs, I don't know how you would debug with ChatGPT. It's pretty unreliable.
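For anyone in this thread who never got past printf, here's a minimal sketch of the debugger workflow in C with gdb. The file name and the bug are made up for illustration; the gdb commands shown are standard.

```c
/* buggy.c: build with debug info, then let gdb find the crash.
 *
 *   gcc -g buggy.c -o buggy
 *   gdb ./buggy
 *   (gdb) run          <- crashes with SIGSEGV
 *   (gdb) backtrace    <- shows the faulting line immediately
 *   (gdb) print p      <- inspect the variable: $1 = (int *) 0x0
 *
 * No print statements were needed to make the diagnosis. */
#include <stdio.h>

int main(void) {
    int *p = NULL;      /* the kind of bug printf debugging circles for hours */
    printf("%d\n", *p); /* null dereference: instant SIGSEGV */
    return 0;
}
```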
Absolutely... and even older programmers coming into domains like data often don't know the internals of ETL/ELT tools, or how to both write optimal code and architect data in the most responsive way for a downstream operation or an end-user analytics query.
Having ChatGPT on speed dial may be OK for a first draft, but you'd better know how to reframe your follow-up inquiries, and why things work the way they do...
storage is always a concern especially on the web where every byte costs money to both send and host :D
You forget text editors 💀
I cannot stand programmers who have the mindset "we have so much storage, it's OK to be wasteful".
No, it isn't, because when everybody thinks like that, we suddenly don't have enough storage.
I miss the days when video game programmers would compress their files. Games are literally more than 2x bigger in storage size these days than they were 8 years ago.
It's not even just storage. It often makes it faster as well, because storage devices are slow, and loading less data and simultaneously decompressing it can be significantly faster.
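A minimal sketch of that trade-off using zlib's one-shot API (compress/uncompress are real zlib calls; the tiny in-memory "asset" is made up for illustration, and a real game would stream with zlib's deflate functions instead):

```c
#include <stdio.h>
#include <string.h>
#include <zlib.h>  /* link with -lz */

int main(void) {
    /* Stand-in for an asset; repetitive data compresses well. */
    const Bytef asset[] = "AAAAABBBBBCCCCCAAAAABBBBBCCCCC imagine a texture here";
    uLong srcLen = sizeof asset;

    Bytef packed[256];
    uLongf packedLen = sizeof packed;
    if (compress(packed, &packedLen, asset, srcLen) != Z_OK) return 1;

    Bytef restored[256];
    uLongf restoredLen = sizeof restored;
    if (uncompress(restored, &restoredLen, packed, packedLen) != Z_OK) return 1;

    /* Fewer bytes read from the slow device, a little CPU to inflate them. */
    printf("original %lu bytes, stored %lu bytes, intact: %s\n",
           (unsigned long)srcLen, (unsigned long)packedLen,
           memcmp(asset, restored, srcLen) == 0 ? "yes" : "no");
    return 0;
}
```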
One course that I greatly appreciate long after I graduated uni is the IoT course. Ironically, the IoT course taught me more about performance and optimization than all the other programming-related ones, because it taught me how to write C code for an ATmega32 before eventually leveling up to more complex microcontrollers.
Fun fact: all of these still apply for microcontroller programmers.
2023, computer science student, and I sometimes use pen and paper when debugging because it's easier to focus and think like a computer
If you've ever dealt with load shedding for 18 hours a day without any power backups, you can imagine the struggles programmers faced. Constant interruptions meant less coding time and more unexpected breaks, making storage a lower priority. Such conditions could definitely deter someone from pursuing programming. Nowadays, with advancements in storage technology, programmers have more freedom with space and don't have to worry as much about running out. However, problems can still happen. For example, there was a case where a 1TB backup drive from a well-known company suddenly malfunctioned, causing significant data loss that wasn’t due to viruses or user error. This shows how crucial reliable storage is and how tough losing a lot of data can be. Plus, the cost and time involved in backup solutions can still be a big hurdle for individuals
Someone could argue that this is better than now
Many modern developers thinking they don't have to care about these things is a huge issue. I also really don't like how many "coders" on social media today seem to think development is only Python and JavaScript, don't learn from these people.
Was it John Carmack who wrote the fast inverse square algorithm or was it Michael Abrash?
Storage is still a concern. Some devices do not have a lot of storage. Plus, it's good practice to keep files as small as possible.
That's the main difference between C and C++. C is a faster, more resource-efficient language. But you have to KNOW what you're doing and you have to do it all. C++ has built-in safety features and it performs many functions for you. But that means it is generally slower and takes more computer resources.
While I miss those early days, I don't miss how hard it was to get an application to "production" quality, and how hard distribution was. I started programming in 1978-1979 on the Apple II and the TRS-80 (yes, we had 2 computers; I was lucky). We could understand the whole working of the computer and use every inch of it, in exchange for hundreds of hours of tuning. To distribute it, we had to take out ads in magazines and go to the in-person equivalent of Discord servers. Today, we can build and distribute the equivalent application in hours, or spend a couple of days on it and get the equivalent of a development that would have taken 2 years. We make it available to so many people by just publishing it on the internet.
Distribution was easier: you just had a binary, no additional files and library dependencies. Yes, you had to put it on an expensive floppy, but that was it. I loathe today's installers and the thousands of separate files and runtimes that need to be maintained and hog space like crazy. We've gotten worse, not better.
Carmack didn't write that code. Another person at id did.
We're not stuck doing low level stuff. However, we are stuck trying to call the modern high-level api. By the time I understand that I need to instantiate the fastRendererFactory implementation of the rendererFactory interface to instantiate the renderer in order to call render(picture) I am quite jealous of the guys who just write a renderer.
I have 9 GB of storage left, my system runs out of memory all the time, and I program low-level code. This video doesn't apply to me.
This stuff SHOULD still be important. Have you seen the ludicrous amount of memory that things like MS Teams use? Or the amount of storage required for modern "retro-inspired" games?
We need to do better
ok older programmer here, please corrupt me as fast as you can -- How do you debug with ChatGPT? Are you just pasting code and asking it to do static analysis or is it dynamic somehow, with breakpoints and different input tests? Intrigued!!
There are still many widely used languages that don't have garbage collection. Mainly C and C++ to my knowledge. Knowing how your memory is used and released makes you a better programmer in my opinion.
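In that spirit, here's a minimal C sketch of what "knowing how your memory is used and released" means day to day: pair every malloc with exactly one free, and check that the allocation actually succeeded, because no garbage collector will save you.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t n = 1000;
    double *samples = malloc(n * sizeof *samples);
    if (samples == NULL) {       /* allocation can fail; handle it */
        perror("malloc");
        return EXIT_FAILURE;
    }

    for (size_t i = 0; i < n; i++)
        samples[i] = (double)i / (double)n;
    printf("last sample: %f\n", samples[n - 1]);

    free(samples);               /* skip this and the memory leaks */
    samples = NULL;              /* guard against dangling reuse   */
    return EXIT_SUCCESS;
}
```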
Us old guys used to fight over how to render polygons
If I encounter another text chat application that hogs 1 GB of RAM, I swear programmers will have to worry about storage just as they worry about their kneecaps.
One of my biggest programming feats was making a full NES game with only 40KB (yes, kilobytes) of storage. And on that system you only had 2KB of RAM. I actually enjoy programming with heavy limitations a lot more
Dude says, “A bunch of reasons that software is soooo bloated now.” 😂
The deep understanding is what I strive for
Yup, and all of that together is the reason why the workload modern computers can manage has not grown proportionally with their "theoretical" capabilities, if at all[*]: the art and knowledge of how to use the available resources efficiently has been lost.
[*] If I could give you a glimpse of what we did 40 years ago with a 6809 CPU, 512 KB of main memory accessed with only 16 address lines via a trivial but highly efficient "MMU", a 40 MB hard disk, and some "glass TTYs" connected via 9600-baud serial lines, you might understand what I mean.
But APIs such as DirectX 12 were developed from their previous versions, right? So memory management and graphics calculations are not totally obsolete. The majority of programmers don't have to think about how those things work, but those APIs' developers have to, right?
As someone who codes in assembly for embedded applications, I bow my head to the old masters
Carmack's version of the fast inverse square root function is pure genius in just a few lines. Damn.
OG programmers were built different.
Nowadays people whine if a language allows them to make mistakes, and blame it on the languages.
Are those "back in the day" people using assembly?