@@thewhitefalcon8539 Indeed, and FPGAs in general are a great step because they allow faster iteration than a printer would, but they don't allow for the degree of complexity that such a device would bring to the table. Although, it'd be interesting to see someone make a more generic device that could be used in that way. Of course, the computing power of a modern GPU can't realistically be tested on any current FPGA technology, but it'd still be a useful tool in developing something new.
The "use a game engine" thing breaks my heart. It's like "don't take the liberty of making APIs that are pleasant for you to use, use these garbage designed-by-committee APIs with shitty unintuitive graphical interfaces that, while intended to make things easier, make them orders of magnitude more tedious and hellish". Like, NO. And then to add insult to injury, except for rare cases, you can almost always tell a game was made in an engine because it has a certain uninspired generic look and feel to everything. It's like the consensus is "home-cooked meals are too much of a hassle, just use TV dinner trays and add your own sauce packets if you want to change it up a little".
The difference between CPUs and GPUs is also stage of development. CPUs did eventually converge and found common ground, but GPUs are still constantly evolving. Not too long ago they were still only for graphics, and they are slowly becoming parallel computing units. My point is, I think GPUs will eventually converge into a more static architecture. Then we might start seeing some quality of life improvements to shader programming.
Metal and Swift (with C, C++, Objective-C and ARM64 interop) on Apple Silicon, running macOS, is an awesome platform. Xcode sucks (but you don't have to use it), and the market's far smaller than on Windows, but if you wanted a modernized Amiga 500, Apple Silicon is pretty close. You can't have your cake and eat it.
No thx, when the modernized Amiga costs billions and the manufacturer forces its shitty online spying services on you in order to get basic system updates. Image search "apple tech support azis" for a visual representation of the situation.
Coding straight to hardware sounds like a nightmare, nothankyouverymuch. Would you like to have a different target per vendor, per year? I wish someone somewhere had written a shading language that compiles to anything... oh wait, Unreal and Unity both do it. They have their own (graph-based) language and it compiles to many different targets: Metal, OpenGL, DirectX, Vulkan, etc. These in turn are interpreted by drivers which you can install for your specific card, so you don't need to go back to the engine every time your card's architecture changes (which is, what, yearly?). I thought by proprietary systems he meant libraries, because yeah, I'd agree there. PhysX turned into a lot of features that most people never saw, for example, back when it was locked to a single card series for no reason other than marketing.
ok but what's the problem with these libraries? they handle the link between the game and the hardware, they figure out 3D for you, shaders and all that stuff. they are big and complicated for a reason, and they aren't even close to the problem that's plaguing triple-A games.
I completely disagree with this. One could roll with Vulkan, which is the industry effort between all vendors except Apple, and call it a day. SPIR-V is an awesome standard that can be compiled from many different shading and compute languages. There are proprietary extensions, but they are not necessary to make a game renderer with average needs. Really, the frustrating part of the industry is the CUDA and Metal ecosystems, which are proprietary by nature. That wastes resources. Open stuff like GL (which sucked) and Vulkan is totally different.
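To make that concrete, here's a minimal C sketch of handing a SPIR-V blob to Vulkan; the device, code, and size_in_bytes parameters are assumed to come from elsewhere. Vulkan only ever sees the SPIR-V words, so the shader could have been written in GLSL, HLSL, or anything else with a SPIR-V backend.

#include <vulkan/vulkan.h>

/* Sketch only: wrap a SPIR-V blob, whatever language it was compiled from. */
VkShaderModule make_shader_module(VkDevice device,
                                  const uint32_t *code, size_t size_in_bytes) {
    VkShaderModuleCreateInfo info = {0};
    info.sType = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = size_in_bytes;   /* codeSize is in bytes, even though pCode is uint32_t* */
    info.pCode = code;

    VkShaderModule module = VK_NULL_HANDLE;
    if (vkCreateShaderModule(device, &info, NULL, &module) != VK_SUCCESS)
        return VK_NULL_HANDLE;
    return module;
}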
I think language is very important, living in the world today. Back then, we had programmers: people who wrote the code that "talked" to the computer. Today, we have developers: dealing with dependencies, libraries, configs and a crazy amount of documentation to go through just to get a grasp of the software. The things that were supposed to help become a burden, because the people who made them don't understand problems they have never encountered.
I honestly disagree with the idea that it would be completely impossible to make your own shader language or rendering API. At least on the AMD Linux stack it's completely possible, since it's so damn open. Hell, there is already someone doing just that: there's a project I heard about called OpenDX that's trying to implement DirectX on Linux natively, without any Vulkan translation. There is zero reason you couldn't skip DirectX and do your own thing instead. But why? Honestly, I don't think the APIs are the problem; they ARE mostly standardized. The only weird exceptions are consoles and Macs; everything else just uses Vulkan and OpenGL, and even the proprietary APIs on those consoles are basically clones of Vulkan plus extra stuff. If you want the APIs to be simpler, just use a wrapper around the more complex APIs like wGPU or WebGPU. I am not a game dev, but that's just how I see it. I could be wrong.
It's called a transpiler. Write the code you want, transpile it into the code you need. Nobody is stopping you from transpiling a custom scripting language for shaders.
I absolutely love Battlefield 2042 and COD MW2, which he mentioned. Completely disagree with his "did not produce anything useful" take on them. They are technical and artistic marvels IMO!
If you can't have a universal shading language I guess WGSL just doesn't exist. These languages are so similar that you can transpile between them pretty easily. And if you can't have C for GPUs someone should tell Embark Studios to stop working on rust-gpu. A lot of this sounds like a problem of constantly chasing cutting edge graphics rather than good gameplay. The best game I've played in the last few years is still Tunic, not some graphical behemoth.
Golden rule: you never, ever make a library that should be (and is) made by a chip manufacturer. It's not that they don't want you to have control; try to write one of those libraries and see how much your life sucks when the chip manufacturer releases their next major version. The same logic applies to native OS APIs and languages like Swift and the Windows native API.
There are already dozens of cross language shader compilers that let you convert any shader into any language with 0 changes. This is a complete non issue and doesn't matter at all.
@@ujin981 This channel exists to share the ideas, thoughts and personality of Jonathan Blow. I also want to save those moments from oblivion. You're free to think that it's crap. Thanks for watching!
@@BlowFan and you're free to delete comments with legitimate criticism. btw, there are no ideas in this video. but you're a Blow fan, so you won't get it. congrats, you've earned a "don't recommend this channel".
We memed on his stream saying if someone wants to steal the compiler they would have to gunfight him :D He shows what weapons he has on stream sometimes.
I don't think that is necessary. OpenGL ES 3.0 is available everywhere anyway, since every browser uses it, and that is.. nice. In fact, in my opinion the old OpenGL ES 2.0 was good enough once we understand that it is the assets that matter. If someone wants to work with a latest-gen low-level API with minimal effort, I think the way to go would be a Vulkan subset plus some wrappers to translate the Vulkan API to DirectX and Metal. We kind of need a "Vulkan ES".
@@gruntaxeman3740 Here in the comments someone said how much you have to write just to output a triangle in Vulkan. In another video Jonathan was complaining about having to email Nvidia and pray they fix something stupid that an intern did and that people now blame on you. OpenGL wouldn't fix that.
@@kuklama0706 The fix is to not use an API that isn't a commodity yet. OpenGL ES 3.0 has worked everywhere for a couple of years now; OpenGL ES 2.0 has been everywhere for over a decade. If those limitations are not an option, then it means more complexity: middleware, or manually porting the code everywhere and solving the problems yourself.
@@kuklama0706 Also, writing code against a subset of Vulkan is one option: no fancy features, just the basics. Just like we got "miniGL" in the 90s, we could have a "miniVulkan" and port that everywhere.
The answer is simple. A commercial vendor is financially incentivized to support their products. They can afford to pay their employees to devote their full time to working on those products. If their game engines suck, they go out of business. Open source products, on the other hand, are not beholden to anything. They have little to no accountability and their developers are usually unpaid volunteers. If you ran a gaming company whose survival depended on an external product, you'd be a fool to develop your game using open source products.
I agree, but you're leaving out the anti-competitive nature of these companies, like how Facebook has most people locked in to their services. There's a huge cost to changing technology, so a declining product (or a less rapidly improving one) won't necessarily lose customers (what company is willing to take the hit of their most expensive workers needing time out to retrain?). And you'd have to be a fool to not realise these companies, and more specifically their shareholders, know this. So when quarterly profits are the game, the end user won't win. No one in actual control is worried about being the bag-holder at the end, since they are all protected from any consequences.
Shady and VCC coming to the rescue! I hope we can all ditch the atrocity that is DXC, support SPIR-V more, and write C programs instead of stupid shading languages.
The programming world these days is too fragmented: multiple shading languages (MSL, HLSL, GLSL etc.), multiple frontend frameworks (React, Solid, Vue, Svelte etc.), multiple programming languages for each platform (Kotlin for Android, Swift for iOS, C# for M$ etc.), multiple programming paradigms (Clojure, Rust, C++, Go, Haskell, OCaml etc.). While all of that is good to have, the incompatibilities and "platform specificness" of each of those make the whole field a real big headache for programmers and very fragmented.
It’s not even that. In the 70s there was far more fragmentation but the spirit of free software allowed for interoperability. The problem today is corporate greed and special interest groups. Locking source code in a cage while holding monopolies. It’s a lot like the engineered, faux diversity movement, that leads to destruction and fragmentation of actual diversity. Real diversity is something organic, free and beautiful.
Idk, wouldn't the alternative be to just have C everywhere? But I kind of agree as well. I think that React is good enough for frontend development, and that any alternative needs to have a competitive advantage. The only one that I found was Elm.
@@majorhumbert676 In the early 80s, C compilers were expensive and slow; Turbo Pascal was the way to go for a very long time. C, however, started to show its age in the mid 90s. That was clearly visible in web development, where every request that hit cgi-bin launched a new process. That is when scripting languages and Java started to become popular: scripts run in the same process as the HTTP server, and Java application servers worked the same way. In frontend development, Svelte and SolidJS have a competitive advantage: they don't use a virtual DOM. Elm is also good. I use React, but the reason is the toolkits made for it. If I need to create a product without existing toolkits, I will use something else.
Nonsense. There have always been a ton of different languages, and even as chrome dome says in this video, hardware used to be even weirder and more dissimilar in the past meaning that simply porting something was a non trivial task. For all the hype about there being so many different languages, HLSL and GLSL for example are so close that automatic machine conversion from one to another is trivial AND ALREADY EXISTS. Good luck converting an Atari Jaguar game to the Sega Saturn as easily as that. This is much ado about nothing. Not even remotely a problem.
@@youtubesuresuckscock Agreed. This is not new, and things are easier now. In the past we really had, at best, raw pixel data and text, and design documents on paper. Today, we can create a whole game in a fully portable way without building different versions. Just make it WebAssembly / OpenGL ES 3.0, and limit WebAssembly memory usage to 256 MB. We can also have something running on the DOM side as long as we keep total memory consumption below 384 MB. If someone wants more, the cost is adding middleware or manually writing low-level code.
Drivers were a mistake. If user space processes were able to interact with the hardware by simply writing data into memory-mapped registers and issuing interrupts (that's all a driver really is), developers simply wouldn't put up with crappy high-level APIs, undocumented/non-free GPU architectures and proprietary shading languages any longer. We traded freedom and low complexity for developer convenience.
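For anyone who hasn't done it, "writing data into memory-mapped registers" looks roughly like the C sketch below. The 0xFE000000 base and the register offsets are made up for illustration, not a real GPU's register map, and modern kernels restrict /dev/mem for good reasons; this is only meant to show what driverless programming means.

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/mem", O_RDWR | O_SYNC);
    if (fd < 0) { perror("open /dev/mem"); return 1; }

    /* Map one page of (hypothetical) device registers into user space. */
    volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                   MAP_SHARED, fd, 0xFE000000);
    if (regs == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    regs[0x10 / 4] = 1;                 /* poke a made-up command register */
    uint32_t status = regs[0x14 / 4];   /* read back a made-up status register */
    printf("status: 0x%08x\n", (unsigned)status);

    munmap((void *)regs, 4096);
    close(fd);
    return 0;
}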
No they weren't. Anyone who lived through early computing knows how completely broken it used to be when game developers actually had to either buy or WRITE THEIR OWN GAMEPAD DRIVERS for SPECIFIC GAMEPADS just to get support into their game. It was a joke. Guess what happened? Games constantly shipped that didn't work with your gamepad. It was a complete disaster. High level APIs supported by vendor drivers are one of the biggest success stories in the history of computing. Things are better than they've ever been. Only clowns who know nothing about the history of computers think things are worse.
Mike Abrash literally spent MONTHS on a "to the metal" port of Quake 1 for the Verite graphics card because it didn't support a generic API with a vendor supplied driver. Guess what? It was a COMPLETE WASTE OF TIME and no one even played that worthless port. It was an utter disaster. High level APIs implemented by vendor drivers are far more awesome than anything the dude in this video will ever be responsible for.
We have an actual HISTORY of companies doing exactly what you're suggesting. S3 Virge and Verite graphics let you write bare metal software for them and you had complete access to everything transparently, and they were UTTER FAILURES.
Sounds like a problem with (catabolic) capitalism and the declining rate of innovation. Why would hardware manufacturers want to take up an otherwise counterproductive, intermediary position between the users and the hardware they themselves pay for? To squeeze out additional profit, because hardware manufacturers are too cynical about the potential profitability of future innovations (assuming they have any innovations at all). There seems to be a general creative and innovative bankruptcy across the West and the increasingly parasitic nature of proprietary hardware indicates that.
2:26 I've never seen any concrete evidence that WFH decreases productivity. If you are talking about people just not working, they do that in an office too. I walked round a large organisation at about 10:30 and half of one department was reading Reddit. If you are talking about communication, a huge amount of it happens over Teams/Slack/email anyway. 3:52 It isn't an "excuse". That is someone being an actual software engineer, looking at what they have to work with and trying to make the best decision for the project. How many games has this guy actually released? Two and a bit games?
After Blow closed his first studio, he worked as a contractor for game studios with large budgets. Games he worked on include Oddworld: Munch's Oddysee, Deus Ex: Invisible War and Thief: Deadly Shadows.
There is TONS of evidence that work from home is basically bullshit. It's been covered in many major journalistic outlets, and it's also obvious common sense.
@@fourscoreand9884 "Games he worked on include Deus Ex: Invisible War" Oof. You mean the game that, whenever you transition through a level, just loads a new Windows process of itself, and then kills the old one? And I'm supposed to learn about optimization from him? I'm still waiting on Jai to release, btw.
"Big and compllicated system" he is more projecting than "giving advice". OpenGL is not really that compllicated. DirectX also not that complicated. It does require a level of comptetence. For those who can't deal with those, they have game engines.
Not only for those that can't, but also for those that don't want to because it would take a lot of time that can otherwise be used on other parts of game development (or just not used, and the game released sooner/at all)
So in one video you're bashing open source by painting it badly (even though you were empirically proven wrong), and now you're bashing proprietary. Pick a side already.
What dialects? We have only three C++ compilers left: Clang, GCC and MSVC. And they are pretty compatible. In production you never want to use the latest features anyway.
"Capitalism is trash, let me eat cake" by Jonathan Blow. Proprietary this and that complicating any real meaningful improvements, literally peoples times being FULL with learning and wading through all the bad decisions, systems and restrictions created in the past/present. P.S. @someone else with your garbage defenses of 'this system' you're wrong and Idgaf if you wanna argue, stay clownin for THE major impediment to human progress. K thnx bai.
A little bit more than one year ago, I would have been certain that this was written by an AI. Now however, I am certain of the opposite; an AI would never have written a text this badly.
Lol, I don't know why youtube is recommending me Jon Blow videos. This guy hasn't made a single good game EVER, so listening to him critique other devs' programming skills is somewhat amusing. Worse than that though, he was a programmer on Deus Ex Invisible War, the worst Deus Ex game, so he contributed to making the sequel to one of my favorite games of all time much, much worse than the first one. For someone who has been in game dev for about 30 years, he never really picked up what it takes to make a quality game. Meanwhile timeless classics like Doom were made by a handful of nerds screwing around.
I don't work in gamedev, but I do work in an enterprise setting as a software engineer. A depressingly small amount of my team's developer hours goes towards writing useful features or functionality. Over half, sometimes over three quarters is spent updating dependencies, fixing issues in our deployment pipeline, and migrating from one deprecated API/technology to a newer system. API migrations especially has been killing us this past year, it feels like every month some other team is contacting us to say "this API you're using is going away, you have to move to this other system with a totally different set of endpoints with different response objects, oh and by the way a feature you relied on no longer exists".
Add on top of that, our team's budget has been cut in half. So we're being asked to do tons of work in the backend which doesn't deliver any value to our customers, AND we have half the developers we used to. If gamedev is anything like that, I totally understand why AAA companies spend years making a mediocre product. Modern software development is a dependency nightmare.
this is why you should never switch to a new engine or API in the middle of a project, just wait for the next one.
Why aren't they using versioning in their APIs? You're normally supposed to support the old one as well as the new one.
@@HunterD510 That's not how the world works. In that case we'd still have Windows 95 for example.
There is a limit to how much you can support and sometimes you need to restructure things in order to avoid complexity and scale to the extent that the demand requires. It could be that the cost to benefit ratio was also too low for some features, leading to their removal.
@@Ghorda9 I'm not sure that what you're saying is even possible for the commenter. It sounds like they work inside enterprise web development. If that's the case, what would your solution be for an API vendor that gives a few months notice and then shuts down the server where the API is running? In the shitshow that is webdev that's a very real problem where your only choice is to switch to a new API in the middle of a project. Also, a project within webdev is rarely finished since there's always something to change, fix, or add.
@@guylianwashier4583 wasn't really aimed at online stuff, just everything else. Not really familiar with "enterprise"
Bro casually found a red dot sight in his cake
This is soo spot on. Every time I ask this question to a graphics dev, I get the response "uhh I don't know man, I haven't thought about it tbh".
Also I'm glad that influential people such as JB are finally talking about all the nasty ways proprietary systems get in the way of good engineering. The future is open software running on open hardware. This is obvious, but it's also EXTREMELY hard to make it work in the modern world! Humanity will probably lose centuries figuring this out and it just breaks my heart every time I think of it...
I remember the days of the C-compiler lock-in. When I found out about gcc (7 years after it first came out, shame on me), that was a revolution!
We should learn to NEVER depend on vendor-owned programming languages. It is the ultimate lock-in.
The problem with the GLs is that they are not compiled languages but interpreted, meaning that an intermediate layer that translates an open GL (pun intended) into a vendor-specific one will always suffer a performance hit.
Not saying he's wrong, but he's kinda missing the mark. You shouldn't write for a GPU architecture directly because they change very often in ways that make them not backward compatible. For example, AMD Rx 5000 series had very little in common with its immediate predecessor, the Rx 500 series. Same with Nvidia between the 1000 and 2000 series. Compare that to CPUs: anything written for the 80386 back in 1985 would still work on a processor made in 2024. Half the point of games on PC is that I can still play games I bought in 2003 when I had Windows XP on a brand new computer today.
Should have told that to BioWare for the original Neverwinter Nights 2, which could only run on a very specific GPU.
@@Ghorda9 That's not true at all.
@@kahaneck When I had the game it couldn't run on anyone's system, and when I looked it up it mentioned that it used a feature of a graphics card that was deprecated as soon as the next one came out.
@@kahaneck this was also in the mid 2000s.
Which, to be fair, they could do in the graphics market. They could just make one ISA and continue extending it like has been done for x86. But they don't. So why is that?
I have similar sentiments to Blow here, but I concluded, "fuck it, do some modern retrogame coding. Agon Light looks nice." So now I am coding a mix of Lua and Forth. Lua is the entire build system and provides all higher-order metaprogramming, asset handling, etc. Forth is the compilation target, assembler and REPL debugging environment - I do not write complex Forth words, I write fairly dumb ones that Lua adds static checks for and copy-pastes variants of, so that hardly anything confusing or hard to read is done with Forth(and Forth itself is no slouch, but it doesn't have garbage collection, runtime type checks, string manipulation functions, etc. - it is a macroassembler operating over static memory structures, that bootstraps into a Lisp if you work at it; vs Lua being the most popular "embedded Lisp" around, one that is Forth-like in its dedication to a small core). The Agon is an eZ80 driving ESP32 graphics firmware over a serial protocol, so it has a good deal more power than vintage 8-bits, with an open design. There is a working emulator, multiple vendors making boards. You can reasonably understand 100% of the system and make targeted low-level optimizations without going waist-deep in assembly. If I want to retarget off of Agon, I just adapt it to a different Forth system and I/O layer.
This is the stuff you do when you want to enjoy programming again, I think. Don't do modern connected things - they are poisoned by complex protocols and vendors who want to sell them. This is something the Forthers are right about. Chuck Moore will tell you that floating point is there to sell complex FPUs.
I am programming in Vulkan right now for my thesis, and the burden that Vulkan brings compared to OpenGL is unbearable. The fact that you have to write a Vulkan program with 1500+ lines of code just to show a single triangle is kinda insane to me. At university we had to write a software renderer, and I feel like I coded that faster than I got Vulkan up and running with the triangle. Don't get me wrong, I love learning about all the low-level stuff, but because of all of this, Vulkan and DX12 are so complex.
Don't even get me started on trying to debug a single line where I left out just one StructType assignment, which can throw off the whole application (at least the validation layers can help with this). There is just so much information that you have to keep in your head, and it really distances you from actually solving the problem. That could of course be a skill issue, but that's just how I see it. When I programmed in OpenGL it seemed much clearer to me and I could focus more on the problem.
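To give a feel for the boilerplate, here is a small C sketch of just the first step (creating the instance); it is not a full triangle program. Every struct has to carry the right sType, which is exactly the kind of mistake only the validation layers will tell you about.

#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkApplicationInfo app = {0};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;  /* forget this and behavior is undefined */
    app.pApplicationName = "triangle";
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info = {0};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    /* Still ahead before a triangle appears: physical device selection, logical
       device and queues, surface and swapchain, image views, render pass,
       pipeline (with SPIR-V shader modules), framebuffers, command pool and
       command buffers, and the fences/semaphores that drive the frame loop. */
    vkDestroyInstance(instance, NULL);
    return 0;
}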
And even the very best AAA games specifically optimized for Vulkan have really only a minimal benefit compared to GL/DX11. So the complexity is high, the room for error is high, it is easy to make something worse than the old runtimes, and the benefit to the consumer even when it's done right is minimal. Low-level APIs have been a complete disaster, just another exercise in ladder pulling.
@@doltBmB Yes, that is one of the most mind-boggling things too.
And Vulkan is still less complex than the GPU drivers or the GPU hardware architecture itself
There are quite a few ways to make Vulkan less painful of an experience though
@doltBmB I don't know. Vulkan has run my games more smoothly and with higher FPS. In Path of Exile I can even switch the render API and I can directly feel it runs better.
I mean, Vulkan probably has lots of ways to do it wrong, so that you can have the capability to do it extra right.
John: "... shitty home computers..."
Retro PC enthusiasts: WHAT
Micros were always shit. RadioShack only sold Tandy units because they figured they could repurpose the ones they couldn't sell as computer kits into cash registers. The rest of the markets, let's be frank, were just poorer nations that couldn't afford to import IBM products at tens of thousands of dollars per unit. Unless you were working on a bathroom turnstile gate or an anti-theft alarm, these micros were trash for general computing use.
@@reecesx Enthusiasts: BLASPHEMY
@@perfectionbox Also enthusiasts: will collect junk, rationalize how amazing it is, cave to an overpriced eBay auction. Then 3 months later, realizing they don't have so much as the basic engineering skills to repair the device, they'll shove a Pi inside the enclosure, subsequently decimating the product and its history. The only remaining markers of life will be some dude sperging, "REPLACE THE CAPS BRO"
Abstraction from Comprehension is the modern problem.
All of these kids shouldn't be immersed in technology without being taught its foundations FIRST.
The biggest problem we faced on the last 2 AAA titles I had to work on from home was that the sheer amount of data to crunch and build was prohibitively large.
One game had 300 GB of data to cook, and Unreal Engine took 24+ hrs to cook it. Just changing a shader could take 4 hrs to cook due to a terrible dependency system. The best you could hope for was some changes that you could do in 60 seconds, but that's still terrible because you have to put your mind on hold for 60 seconds while you are waiting for the result of your last change... and 60 seconds is a long time to do nothing and not get distracted by something else.
As for languages... I liked a lot of stuff about Jonathan's new language, but it's missing some key features like inheritance. But worse than that, he has an amazing feature that lets the compiler run code to generate code, and this makes the language too easily abused. Codebases in that language will be unreadable because of how much people will abuse the generated code and how much you need to use it to get around missing features in the language.
The decision to not include inheritance is extremely deliberate; he does not like inheritance and considers OO a bad path to go down. I'm starting to agree: it's slow and makes code horrible to read.
@@SpeedfreakUK I agree it was deliberate, but a language that denies people the choice of a very commonly used pattern is unlikely to catch on. The mess you can make with his compile-time #insert shenanigans is far beyond any mess people make with a hierarchy. And the polymorphic functions are going to be a lot messier than a cleanly designed low-depth hierarchy.
I think Jonathan Blow made an alternative to inheritance: The "using" keyword. This works just like inheritance except you can choose to inherit by using a pointer instead of in-lining the members. This means that you can mitigate some of the bloat of your application while still using inheritance. This is like the flyweight design pattern except there's minimal refactoring cost when introducing it to your code.
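A rough C analogy of what that means (this is not Jai syntax, and the struct names here are made up): the "base" can either be embedded inline, like classic inheritance, or held through a pointer so many objects share one record, flyweight-style.

#include <stdio.h>

typedef struct { float x, y; int hp; } Entity;

/* Inline base: each Door carries its own Entity. */
typedef struct { Entity base; int locked; } Door;

/* Pointer base: many doors can share one Entity record. */
typedef struct { Entity *base; int locked; } DoorRef;

int main(void) {
    Entity shared = { 0.0f, 0.0f, 10 };
    Door d = { { 1.0f, 2.0f, 10 }, 1 };
    DoorRef dr = { &shared, 0 };

    /* With Jai's "using" you'd reportedly write d.hp directly; in C you spell out the path. */
    printf("%d %d\n", d.base.hp, dr.base->hp);
    return 0;
}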
Unreal is also a great example of a project that is negatively affected by the lack of reflection/metaprogramming support in C++. To integrate your changes with the editor, Unreal first needs to generate an "Intermediate" directory. This process requires running a parser over your code and traversing the output AST. While this works reliably in Unreal, it is an added layer of complexity that results in a longer build process than normal: the program needs to be run through a parser twice, once for the reflection and again for the final build. The build process would probably be a lot more straightforward if Unreal were made in Jai instead, which supports this kind of metaprogramming out of the box. You simply cannot make a game engine like Unreal without reflection/metaprogramming.
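That generated-reflection idea can be faked in plain C with the old X-macro trick; this isn't how Unreal's header tool actually works, just a minimal sketch of getting both the struct and a field-name table (the kind of thing an editor needs) out of a single list.

#include <stdio.h>

/* One field list, expanded twice: once to declare the struct,
   once to build a name table that tooling could walk at runtime. */
#define PLAYER_FIELDS \
    X(int,   health)  \
    X(float, speed)

typedef struct {
#define X(type, name) type name;
    PLAYER_FIELDS
#undef X
} Player;

static const char *player_field_names[] = {
#define X(type, name) #name,
    PLAYER_FIELDS
#undef X
};

int main(void) {
    for (size_t i = 0; i < sizeof player_field_names / sizeof *player_field_names; ++i)
        printf("field: %s\n", player_field_names[i]);
    return 0;
}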
I'm really looking forward to this language anyway, if it ever releases. It probably won't get wider adoption though like you said.
Little did he know, but in the near future he would install Win 11 and find out he can no longer move the Start menu list. Win 11 is, for me, the practical example of "software in decline".
I don't fully understand what these companies get from that kind of control. Like, nobody is going to be exclusively Mac OS so it would seem to me that Metal does nothing except create barriers to entry, no barriers to exit because people are already writing their games in DX12 or whatever.
You'd be very wrong on that actually! Sure, most gamedevs won't exclusively develop for macOS because the target audience is tiny, but there are quite a few successful mobile games that only get released for iPhone. And you've absolutely seen the inverse happen, where a game only releases on Windows or on a specific console.
I've also worked for a company doing sports computer vision stuff, and a lot of that software is written to run smoothly on a MacBook by exclusively using all the proprietary Apple tech. It's all Swift and it would be impossible to port to anything else.
A large portion of B2B software in general doesn't need to care about lock-in.
Apple likes having complete control over their ecosystem as that core idea has gotten them to where they are today. They don't care if it makes porting a game to their system hard since developers will do it anyways to get access to their millions of users.
You answered your own question: "does nothing except create barriers to entry" is exactly what they gain. Maybe they want their products to be perceived as exclusive.
@@pokefreak2112 Can you name some of those iPhone-specific games (so I could check more on them)? Thanks.
Game developers write games against a game engine, or at least against some higher-level layer, so that they don't need to care about maintaining the Metal/Vulkan/DirectX/OpenGL mess.
I don't think it is 'They want to be in control of what languages run on their thing'; it is 'They need to control what languages run on their thing to raise the money to hire literally thousands of highly talented people to make the thing at all.' That's why we don't have community-driven 4090s. No one would spend 1 billion dollars on design, synthesis, layout, verification, and tapeout of a new GPU just to improve the environment programmers work in. Especially not the people who actually have that 1 billion dollars, since if they spent money that way they wouldn't have it in the first place. With that said, tech companies are laying off massively in 2024. There are tons of talented people who don't work at a tech giant anymore. Maybe this is the chance for the community-owned GPU.
Apple doesn't want you to use Metal on other hardware. IBM open-sourced their architecture; no one attributes PCs to IBM anymore, and in fact IBM nearly went under. These companies do need a return on their investment, and part of that is securing their IP to some degree. Of course you can go too far, like Apple and their home button thing with Samsung.
It's funny because the net result for IBM is that they essentially gave us the PC architecture out of the goodness of their heart, and nobody remembers them for it!
@@Domarius64 because it was mostly by mistake, not out of the goodness of their heart. The current state of affairs was not their plan.
@MadsterV Yes, the goal was that if they open-sourced the architecture, it would become more popular and they would make more money by being the biggest supplier of that hardware, but it didn't work out. I use them as an example of why Metal was invented specifically to be different and closed off.
Part of the solution: go fully open source for the development stack of games; that way you know what code runs and you can optimize, build an interface, etcetera much more easily. You can keep the assets of the game proprietary, that is fine. Developers need to get paid, everybody understands that. But there is no good reason to have secret code in the development stack. I wholeheartedly agree that the GPU should use open-source software. AMD's drivers are open source (on Linux you can use them fully open source with good performance), but the firmware is not yet. It is a good thing that Unreal and Unity opened up their source code (through licensing, which I don't mind), and of course we have Godot. It makes me somewhat optimistic about the future of software becoming more open source.
In regard to Intel's C compiler, that led to false benchmarks. We can speculate about whether Intel intended for that to happen, but it was not properly optimized for AMD, using the wrong instructions on AMD CPUs back then. Regardless, it is better to just use open-source tools when they are available; it protects us from getting locked in.
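For what it's worth, the sane way to pick instructions is to dispatch on CPU feature flags rather than on the vendor string. A tiny sketch using the GCC/Clang builtins (not Intel's actual dispatcher, just the idea):

#include <stdio.h>

int main(void) {
    __builtin_cpu_init();  /* populate the CPU feature data used by the checks below */
    if (__builtin_cpu_supports("avx2"))
        puts("use the AVX2 code path");
    else if (__builtin_cpu_supports("sse2"))
        puts("use the SSE2 code path");
    else
        puts("use the plain C fallback");
    return 0;
}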
I don't think WFH is a hindrance. Sure, there are slacking asses, but there was a ton of slacking in the office too. What the office did not have is peace and quiet. I envy you if while being constantly interrupted you are able to entertain complex thought, and keep in your mind thousands of lines of code at the same time. I surely can not. I surely am far more productive at home than at the office.
100% agree. Being in an office just changes how people slack off. They'll learn to LOOK busy, not actually be busy. Slacking off is a sign of the company being bad, not the employees. If the employees felt like you treated them well, they'd feel obliged to return the favor and work harder.
The problem is the streamlining of interfaces. As for graphics card vendors, well, it's their intellectual property; sure, if the stuff were not so complex, most graphics card vendors would have open-sourced the shader work. I personally don't care, I just need an API at the OS level which simply works.
Standardization comes with commodification. We won't see a uniform approach until these things become nothing more than appliances.
also standardization comes with stagnation. Don't want that myself.
Jon is turning into an open-source fanboy? Yay!
only hypothetically :P
03:32 there's Shady, which compiles into (Vulkan's) SPIR-V. It's the most advanced one I know of. There are other projects which compile into GLSL(ES)+HLSL+SPIR-V, but they all have tradeoffs.
Regarding game systems:
Endless worlds and adaptive gameplay already ran under DOS 3. OK, sometimes they required 8 floppies with various levels, but nevertheless. Goldrush 2, King's Quest IV and Monkey Island 2 had very decent graphics back when Voodoo cards, AdLib sound cards and EGA monitors were the top game systems on PC.
At that time Amigas were more hyped, and consoles like Sega and Nintendo were being introduced, but PCs always stood their ground, especially since the Xboxes with AMD chips. And Linux, without support from the major gamedev studios, is also able to run most games.
And game development is mostly language agnostic, except for the hardware-accelerated stuff. That needs drivers, but it doesn't depend on a specific chipset.
Well, I looked up open GPUs... pretty dismal. There are a LOT of projects, but probably nothing that'll even run Half-Life. I figure the real problem is lack of fab access: if it were easy to get something down in real silicon, we'd have hundreds of halfway decent open GPUs on the jungle site and Nvidia's newest cards would be half the price they are now.
At least these proprietary systems are trying to achieve something meaningful, even in a sub-par way. But then there are proprietary systems like most modern console TRCs that don't provide any actual value to the player or developer in what they are trying to do, and are purely a sink for development and computing resources.
"Let me write my own programming language and OS" - JB. "Why can't I finish an indie game in 8 years?" - also JB
When he does finish a game it's the best game ever. He's done it twice.
@@thewhitefalcon8539 True. Still funny to me watching his reaction in the Noclip documentary on The Witness to the question of whether his next game will take 8 years :D
I don't think the 'C runs on everything' argument is true. Beyond maybe the most basic stuff, you can't take a VAX program and run it on an Atari 130XE. If the goal were just to have one syntax for all GPU programming, rather than ensuring that 90% of all games run smoothly on DAY ONE of a new GPU launch, we would already have a great unified GPU programming framework. Game programming is far from the most severe vendor lock-in on a GPU; general-purpose computing is much worse.
I work from home better than in the office.
@@Bobo-ox7fj Offices suck in general. Being at home is way nicer.
I think the problem with Intel's compiler is that it's no longer the best at optimization and they make it annoying to acquire. If I could just go to their website and click a link to download it without any other garbage then others would use it too. Granted I've only ever used it to test things with because it's exceedingly slow and even gcc has surpassed it in terms of optimizing, let alone all the LLVM based languages out there. It just doesn't have the competitive edge that it had 25 years ago.
As for proprietary systems, unless you want absolute control over everything, just use an existing game engine and all of these problems are handled for you. Though, I do wish that someone would develop an open source GPU. There's really not a lot of open source hardware out there that's any good, and there's just about zero chance you can have something "printed" up at a moment's notice because it takes so much infrastructure to actually manufacture chips. What we really need is a (3D printer)-style revolution in hardware production. If you could just download a new processor design, stick in a silicon wafer or some such nonsense, hit print and be testing it by the end of the day, it would revolutionize computing.
An FPGA GPU would be an intermediate step
@@thewhitefalcon8539 Indeed, and FPGAs in general are a great step because they allow faster iteration than a printer would, but they don't allow for the degree of complexity that such a device would bring to the table. Although, it'd be interesting to see someone make a more generic device that could be used in that way. Of course, the computing power of a modern GPU can't realistically be tested on any current FPGA technology, but it'd still be a useful tool in developing something new.
Photolithography is pretty cool
@@Muskar2 Yeah, even if you're not making a processor it can be fun.
The "Use a game engine" thing breaks my heart, it's like "Don't have the liberty to make apis that are pleasant for you to use, use these garbage designed-by-committee apis with shitty unintuitive graphical interfaces that, while being intended to make it easier, make it orders of magnitude more tedious and hellish". Like, NO.
And then, to add insult to injury, except in rare cases you can almost always tell a game was made in an engine, because it has a certain uninspired, generic look and feel to everything.
It's like the consensus is "Home cooked meals are too much of a hassle, just use TV dinner trays and add your own sauce packets to it if you want to change it up a little"
The difference between CPUs and GPUs is also stage of development. CPUs did eventually converge and found common ground, but GPUs are still constantly evolving. Not too long ago they were still only for graphics, and they are slowly becoming parallel computing units. My point is, I think GPUs will eventually converge into a more static architecture. Then we might start seeing some quality of life improvements to shader programming.
Metal and Swift (with C, C++, Objective-C and ARM64 interop) on Apple Silicon, running macOS, is an awesome platform. Xcode sucks (but you don't have to use it), and the market's far smaller than on Windows, but if you wanted a modernized Amiga 500, Apple Silicon is pretty close.
You can't have your cake and eat it.
No thanks, when the modernized Amiga costs billions and the manufacturer forces its shitty online spying services on you in order to get basic system updates.
Image search "apple tech support azis" for visual representation of the situation.
> Xcode sucks (but you don't have to use it)
We don't??
An ARM licensing style GPU would be awesome.
Coding straight to hardware sounds like a nightmare, nothankyouverymuch.
Would you like to have a different target per vendor, per year?
I wish someone somewhere would have written a shading language that compiles to anything... oh wait, Unreal and Unity both do it. They have their own (graph-based) languages which compile to many different targets: Metal, OpenGL, DirectX, Vulkan, etc. These in turn are interpreted by drivers you can install for your specific card, so you don't need to go back to the engine every time your card's architecture changes (which is, what, yearly?).
I thought by proprietary systems he meant libraries, because yeah, I'd agree there. PhysX, for example, turned into a lot of features most people never saw back when it was locked to a single card series, for no reason other than marketing.
I work on backend APIs, and dependency management is a NIGHTMARE. Changing the version of any dependency breaks ten more dependencies.
jon's rants are satisfying
It sounds like the problem is just the amount of expertise you need to work with these systems, not whether vendors are acting as middlemen.
For simple games, why not bring back software renderers and bypass all this crap? Surely CPUs can do circa-2000 graphics now.
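To make that concrete, here's a minimal sketch of the idea: a pure-CPU rasterizer that fills one flat-shaded triangle into a pixel buffer and writes it out as a PPM image. No driver, no vendor API, no shading language; the resolution, colors and edge-function inside test are all just illustrative choices, not a production renderer.

```cpp
// Minimal CPU rasterizer sketch: one flat-shaded triangle into a PPM file.
#include <cstdint>
#include <fstream>
#include <vector>

struct Vec2 { float x, y; };

// Signed area term: which side of edge (a,b) the point c lies on.
static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

int main() {
    const int W = 640, H = 480;
    std::vector<uint8_t> fb(W * H * 3, 20);          // dark background

    Vec2 v0{100, 400}, v1{320, 60}, v2{560, 420};     // triangle in pixel coords

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            float w0 = edge(v1, v2, p), w1 = edge(v2, v0, p), w2 = edge(v0, v1, p);
            if (w0 >= 0 && w1 >= 0 && w2 >= 0) {      // inside all three edges
                uint8_t* px = &fb[(y * W + x) * 3];
                px[0] = 230; px[1] = 120; px[2] = 40; // flat orange fill
            }
        }
    }

    std::ofstream out("triangle.ppm", std::ios::binary);
    out << "P6\n" << W << " " << H << "\n255\n";
    out.write(reinterpret_cast<const char*>(fb.data()), fb.size());
}
```

Circa-2000 features (texturing, a depth buffer, perspective-correct interpolation) are more loops and math layered on this same pattern, which is why a modern CPU can brute-force that era of graphics for a simple game.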
OK, but what's the problem with these libraries? They handle the link between the game and the hardware; they figure out 3D for you, shaders and all that stuff.
They are big and complicated for a reason, and they aren't even close to the problem that's plaguing triple-A games.
I completely disagree with this. One could roll with Vulkan, which is the industry effort between all vendors except Apple, and call it a day. SPIR-V is an awesome standard that can be compiled from many different shading and compute languages. There are proprietary extensions but they are not necessary to make a game renderer with average needs. Really, the frustrating part of the industry is that CUDA and Metal ecosystem that is proprietary by nature. This wastes resources. Open stuff like GL (which sucked) and Vulkan is totally different.
I think language is very important in the world today. Back then, we had programmers: people who wrote the code that "talked" to the computer.
Today, we have developers: dealing with dependencies, libraries, configs and a crazy amount of documentation just to get a grasp of the software. The things that were supposed to help become a burden, because the people who made them don't understand problems they have never encountered.
I honestly disagree with the idea that it would be completely impossible to make your own shader language or rendering API. At least on the AMD Linux stack it's completely possible, since it's so damn open. Hell, there is already someone doing just that: a project I heard about named OpenDX is trying to implement DirectX on Linux natively, without any Vulkan translation. There is zero reason you couldn't skip implementing DirectX and do your own thing instead. But why? Honestly, I don't think the APIs are the problem; they ARE mostly standardized. The only weird exceptions are consoles and Macs; everything else just uses Vulkan and OpenGL, simple as that, and even the proprietary console APIs are basically clones of Vulkan plus extra stuff. If you want the APIs to be simpler, just use a wrapper around the more complex APIs, like wgpu or WebGPU. I am not a game dev, but that's just how I see it; I could be wrong.
It's called a transpiler. Write the code you want, transpile it into the code you need. Nobody is stopping you from transpiling a custom scripting language for shaders.
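In that spirit, here's a rough sketch of what the transpile route can look like today, assuming the SPIRV-Cross C++ library (header path, option names and API details vary between versions, so treat this as illustrative): whatever frontend you like emits SPIR-V, and SPIRV-Cross turns it back into GLSL source.

```cpp
// Hedged sketch: turn a compiled SPIR-V blob back into GLSL with SPIRV-Cross.
#include <spirv_glsl.hpp>   // header location depends on how SPIRV-Cross is installed
#include <cstdint>
#include <fstream>
#include <iostream>
#include <vector>

// Read a .spv file into the 32-bit words SPIRV-Cross expects.
static std::vector<uint32_t> load_spirv(const char* path) {
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    std::vector<uint32_t> words(static_cast<std::size_t>(in.tellg()) / sizeof(uint32_t));
    in.seekg(0);
    in.read(reinterpret_cast<char*>(words.data()), words.size() * sizeof(uint32_t));
    return words;
}

int main(int argc, char** argv) {
    if (argc < 2) { std::cerr << "usage: spv2glsl shader.spv\n"; return 1; }

    // Whatever produced shader.spv (HLSL via DXC, GLSL via glslang, a homegrown
    // language...) doesn't matter here; SPIR-V is the common interchange format.
    spirv_cross::CompilerGLSL glsl(load_spirv(argv[1]));

    spirv_cross::CompilerGLSL::Options opts;
    opts.version = 310;   // e.g. target GLSL ES 3.10
    opts.es = true;
    glsl.set_common_options(opts);

    std::cout << glsl.compile() << "\n";  // emit GLSL source text
    return 0;
}
```

The same library also has HLSL and MSL backends, which is roughly how a lot of engines ship one shader codebase across DirectX, Vulkan and Metal.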
This is my assumption as to why Apple is so uptight about proprietary hardware and Metal. They're trying to force their way out of it.
I absolutely love Battlefield 2042 and COD MW2 he mentioned. Completely disagree with his "did not produce anything useful" take on them. They are technical and artistic marvels IMO!
Laughing in SNES
If you can't have a universal shading language I guess WGSL just doesn't exist. These languages are so similar that you can transpile between them pretty easily.
And if you can't have C for GPUs someone should tell Embark Studios to stop working on rust-gpu.
A lot of this sounds like a problem of constantly chasing cutting edge graphics rather than good gameplay. The best game I've played in the last few years is still Tunic, not some graphical behemoth.
5:53 I see an editing mistake. That one frame of a misplaced image. Annoying...
1:17 'Murica.
Everybody is drowning in complexity. You have to draw a line and quit.
Intellectual Property and its consequences
Tactical Cake!
jon blows tactical cake
Set your expectation lower! As low as possible
The companies don't want to go down Nokia's path; or else we can assume innovation counts as blasphemy in the tech space.
Golden rule: you never, ever write a library that should be (and is) made by a chip manufacturer. It's not that they don't want you to have control; try to write one of those libraries yourself and see how much your life sucks when the chip manufacturer releases its next major version. The same logic applies to native OS APIs and languages like Swift and the Windows native API.
We need some LLGPU project, like LLVM but for GPUs
What the fuck do you think Vulkan is?
OMG, I just thought it was focused on graphics @@trejohnson7677
@@trejohnson7677 It's an API specification more than a software stack; it's more like the C standard library, you need a platform willing to implement it.
So writing a universal shading language is allegedly impossible, but then the SDL3 team must be totally insane to try it...
There are already dozens of cross-language shader compilers that let you convert any shader into any language with zero changes. This is a complete non-issue and doesn't matter at all.
@@youtubesuresuckscock and yet this channel exists to spread crap over blow fans
GLSL is the universal shading language
@@ujin981 This channel exists to share the ideas, thoughts and personality of Jonathan Blow. I also want to save those moments from oblivion. You're free to think that it's crap. Thanks for watching!
@@BlowFan And you're free to delete comments with legitimate criticism. Btw, there are no ideas in this video, but you're a Blow fan, so you won't get it. Congrats, you've earned a "Don't recommend this channel".
Is Jon a gun guy? That's rad
We memed on his stream saying if someone wants to steal the compiler they would have to gunfight him :D He shows what weapons he has on stream sometimes.
How about community owned... software renderer?
I don't think that is necessary.
OpenGL ES 3.0 is available everywhere anyway, since every browser uses it, and that is... nice. In fact, in my opinion the old OpenGL ES 2.0 was good enough, once we understand that it is the assets that matter.
If someone wants to work with a latest-gen low-level API with minimal effort, I think that would be a Vulkan subset, plus some wrappers to translate the Vulkan API to DirectX and Metal.
We kind of need a "Vulkan ES".
Would that be the start of communism again?
@@gruntaxeman3740 Here in the comments someone said how much you have to write just to output a triangle in Vulkan.
In another video Jonathan was complaining about having to mail Nvidia and pray they would fix something stupid that an intern did and that people now blame on you. OpenGL wouldn't fix that.
@@kuklama0706
The fix is to not use an API that isn't yet a commodity. OpenGL ES 3.0 has worked everywhere for a couple of years; OpenGL ES 2.0 has been everywhere for over a decade.
If accepting limitations is not an option, then that means more complexity: middleware, or manually porting code to every platform and solving the problems yourself.
@@kuklama0706
Also, writing code against a subset of Vulkan is one option: no fancy features, just the basics. Just like we got "miniGL" in the 90s, we could have a "miniVulkan" and port that everywhere.
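For anyone who hasn't seen the verbosity referenced up-thread, here's a hedged taste: a minimal sketch using only core Vulkan 1.0 calls that does nothing except create and destroy an instance. Everything a "miniVulkan" would try to hide still sits on top of this.

```cpp
// Creating a Vulkan instance -- before any device, swapchain, pipeline,
// shader module, render pass or command buffer even enters the picture.
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app = {};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "mini-vulkan-sketch";
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    // A real triangle still needs: physical device selection, logical device and
    // queue, surface, swapchain, image views, render pass, pipeline layout,
    // graphics pipeline, framebuffers, command pool/buffers, sync objects...
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```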
The answer is simple. A commercial vendor is financially incentivized to support their products. They can afford to pay their employee to devote their full time to work on those products. If their game engines suck, they go out of business. Open source products, on the other hand, are not beholden to anything. They have little to no accountability and their developers are usually unpaid volunteers. If you ran a gaming company where its survival depended on an external product, you'd be a fool to develop your game using open source products.
I agree, but you're leaving out the anti-competitive nature of these companies, like how Facebook has most people locked in to their services. There's a huge cost to changing technology, so a declining product (or a less rapidly improving one) won't necessarily lose customers (what company is willing to take the hit of their most expensive workers needing time out to retrain?). And you'd have to be a fool not to realise these companies, and more specifically their shareholders, know this. So when quarterly profits are the game, the end user won't win.
No one in actual control is worried about being the bag-holder at the end, since they are all protected from any consequences.
Shady and VCC coming to the rescue! I hope we can all ditch the atrocity that is DXC, support SPIR-V more, and write C programs instead of stupid shading languages.
Also check out Slang, a C#-like shading language.
There is also rust-gpu. I guess all of those systems still need tons of work.
The programming world is too fragmented these days: multiple shading languages (MSL, HLSL, GLSL, etc.), multiple frontend frameworks (React, Solid, Vue, Svelte, etc.), multiple programming languages for each platform (Kotlin for Android, Swift for iOS, C# for M$, etc.), multiple programming paradigms (Clojure, Rust, C++, Go, Haskell, OCaml, etc.). While all of that is good to have, the incompatibilities and platform-specificness of each makes the whole field a real headache for programmers and very fragmented.
It’s not even that. In the 70s there was far more fragmentation but the spirit of free software allowed for interoperability.
The problem today is corporate greed and special interest groups. Locking source code in a cage while holding monopolies.
It’s a lot like the engineered, faux diversity movement, that leads to destruction and fragmentation of actual diversity. Real diversity is something organic, free and beautiful.
Idk, wouldn't the alternative be to just have C everywhere?
But I kind of agree as well. I think that React is good enough for frontend development, and that any alternative needs to have a competitive advantage. The only one that I found was Elm.
@@majorhumbert676
In the early 80s, C compilers were expensive and slow. Turbo Pascal was the way to go for a very long time.
C, however, started to show its age in the mid-90s. That was clearly visible in web development, where every request that hit cgi-bin launched a new process. That is when scripting languages and Java started to become popular: scripts ran in the same process as the HTTP server, and Java application servers worked the same way.
In frontend development, Svelte and SolidJS have a competitive advantage: they don't use a virtual DOM. Elm is also good.
I use React, but the reason is the toolkits made for it. If I need to create a product without existing toolkits, I will use something else.
Nonsense. There have always been a ton of different languages, and even as chrome dome says in this video, hardware used to be even weirder and more dissimilar in the past, meaning that simply porting something was a non-trivial task.
For all the hype about there being so many different languages, HLSL and GLSL for example are so close that automatic machine conversion from one to another is trivial AND ALREADY EXISTS. Good luck converting an Atari Jaguar game to the Sega Saturn as easily as that.
This is much ado about nothing. Not even remotely a problem.
@@youtubesuresuckscock
Agreed. This is not new, and things are easier now. In the past we really had, at best, raw pixel data, text, and design documents on paper.
Today, we can create a whole game in a fully portable way without building different versions: just target WebAssembly / OpenGL ES 3.0 and limit WebAssembly memory usage to 256 MB. We can also have something running on the DOM side, as long as we keep total memory consumption below 384 MB.
If someone wants more, the cost is adding middleware or manually writing low-level code.
Didn't he whine about open source software?
Well, he said in this video that it's probably impossible.
@@bogdan2529 He sounds like a doomer to me. I still like him though.
He is right, open source software is awful.
@@limarchenko96 software in general is awful
@@Tezla0 Software made by AI will be the solution.
Leverage AI LLMs as a translator and reference puller, bonus if you have an in house all-accessible custom AI to bridge the gaps.
good
Drivers were a mistake. If user-space processes were able to interact with the hardware by simply writing data into memory-mapped registers and handling interrupts (that's all a driver really is), developers simply wouldn't put up with crappy high-level APIs, undocumented/non-free GPU architectures and proprietary shading languages any longer.
We traded freedom and low complexity for developer convenience.
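For what it's worth, the model described above looks roughly like the sketch below. The device node and register offsets are entirely made up; real GPUs don't publish a stable register map like this, which is exactly what the replies that follow object to.

```cpp
// Hedged sketch of the "just poke memory-mapped registers" model. The device
// node and register layout here are HYPOTHETICAL, purely for illustration.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

// Hypothetical register offsets for an imaginary display controller.
constexpr std::size_t REG_FB_BASE  = 0x00; // framebuffer physical address
constexpr std::size_t REG_FB_PITCH = 0x04; // bytes per scanline
constexpr std::size_t REG_ENABLE   = 0x08; // write 1 to start scanout

int main() {
    // Map the (imaginary) device's register window into user space.
    int fd = open("/dev/imaginary-gpu", O_RDWR);
    if (fd < 0) { std::perror("open"); return 1; }

    void* mapping = mmap(nullptr, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (mapping == MAP_FAILED) { std::perror("mmap"); return 1; }
    volatile uint32_t* regs = static_cast<volatile uint32_t*>(mapping);

    // "Programming the hardware" is then just volatile stores:
    regs[REG_FB_BASE  / 4] = 0x10000000; // hypothetical framebuffer address
    regs[REG_FB_PITCH / 4] = 640 * 4;
    regs[REG_ENABLE   / 4] = 1;

    munmap(mapping, 4096);
    close(fd);
    return 0;
}
```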
No they weren't. Anyone who lived through early computing knows how completely broken it used to be when game developers actually had to either buy or WRITE THEIR OWN GAMEPAD DRIVERS for SPECIFIC GAMEPADS just to get support into their game. It was a joke. Guess what happened? Games constantly shipped that didn't work with your gamepad. It was a complete disaster.
High level APIs supported by vendor drivers are one of the biggest success stories in the history of computing. Things are better than they've ever been. Only clowns who know nothing about the history of computers think things are worse.
Mike Abrash literally spent MONTHS on a "to the metal" port of Quake 1 for the Verite graphics card because it didn't support a generic API with a vendor supplied driver. Guess what? It was a COMPLETE WASTE OF TIME and no one even played that worthless port.
It was an utter disaster.
High level APIs implemented by vendor drivers are far more awesome than anything the dude in this video will ever be responsible for.
We have an actual HISTORY of companies doing exactly what you're suggesting. S3 Virge and Verite graphics let you write bare metal software for them and you had complete access to everything transparently, and they were UTTER FAILURES.
Even on platforms where you have direct access to submitting commands by hand on Linux, the amount of work you'd need is still insanity.
OK but you be the one to port Cyberpunk 2077 to 30 different GPU architectures
Basically skill issue
Sounds like a problem with (catabolic) capitalism and the declining rate of innovation. Why would hardware manufacturers want to take up an otherwise counterproductive, intermediary position between the users and the hardware they themselves pay for? To squeeze out additional profit, because hardware manufacturers are too cynical about the potential profitability of future innovations (assuming they have any innovations at all). There seems to be a general creative and innovative bankruptcy across the West and the increasingly parasitic nature of proprietary hardware indicates that.
2:26 I've never seen any concrete evidence that WFH decreases productivity. If you are talking about people just not working, they do that in an office. I walked round at about 10:30 in a large organisation and half of one department was reading reddit. If you are talking about communication, a huge amount of it happens over teams/slack/email anyway.
3:52 It isn't an "excuse". That is someone being an actual software engineer and looking at what they have to work with and trying to make the best decision for the project.
How many games has this guy actually released? Two and a bit?
He's made tons of games. It's just those two that were successful
@@Hwioo Ok. Thanks for the correction.
After Blow closed his first studio, he worked as a contractor for game studios with large budgets. Games he worked on include Oddworld: Munch's Oddysee, Deus Ex: Invisible War and Thief: Deadly Shadows.
There is TONS of evidence that work from home is basically bullshit. It's been covered in many major journalistic outlets, and it's also obvious common sense.
@@fourscoreand9884 "Games he worked on include Deus Ex: Invisible War"
Oof. You mean the game that, whenever you transition through a level, just loads a new Windows process of itself, and then kills the old one? And I'm supposed to learn about optimization from him?
I'm still waiting on Jai to release, btw.
"Big and compllicated system" he is more projecting than "giving advice".
OpenGL is not really that compllicated.
DirectX also not that complicated.
It does require a level of comptetence.
For those who can't deal with those, they have game engines.
Not only for those that can't, but also for those that don't want to because it would take a lot of time that can otherwise be used on other parts of game development (or just not used, and the game released sooner/at all)
@@mousepotatoliteratureclub Sure, but him saying the graphics APIs are too complicated is just wrong.
So in one video you're bashing open source by painting it badly (even though you were empirically proven wrong), and now you're bashing proprietary software.
Pick a side already.
Stuck with dialects of C++? At some point many just do C.
What dialects? We have only three C++ compilers left: Clang, GCC and MSVC. And they are pretty compatible. In production you never want to use the latest features anyway.
@@llothar68 He mentioned shader languages. But I'm referring to the C++-based standards, like Apple's Metal vs OpenGL.
So, the downfall of the free market of both ideas and profit is Cronyism.
"Capitalism is trash, let me eat cake" by Jonathan Blow.
Proprietary this and proprietary that, complicating any real, meaningful improvement; people's time is literally FULL of learning and wading through all the bad decisions, systems and restrictions created in the past and present.
P.S. @someone else with your garbage defenses of 'this system' you're wrong and Idgaf if you wanna argue, stay clownin for THE major impediment to human progress. K thnx bai.
A little bit more than one year ago, I would have been certain that this was written by an AI. Now however, I am certain of the opposite; an AI would never have written a text this badly.
This is a very sad person!
He's just of a different personality type than you
Do you guys also like Musk?
"F**k Nvidia" (c) Linus
Lol, I don't know why youtube is recommending me Jon Blow videos. This guy hasn't made a single good game EVER, so listening to him critique other devs' programming skills is somewhat amusing. Worse than that though, he was a programmer on Deus Ex Invisible War, the worst Deus Ex game, so he contributed to making the sequel to one of my favorite games of all time much, much worse than the first one. For someone who has been in game dev for about 30 years, he never really picked up what it takes to make a quality game. Meanwhile timeless classics like Doom were made by a handful of nerds screwing around.