I needed to empty out some space on my drive. From what I gathered, indie devs do so much better at optimization. Just look at the recent poor optimization of Cities: Skylines 2. Most of the time the devs argue that home computers are underperforming, to deflect the criticism.
Yeah, but indie games are generally not that large in scope either. Look at the ones which are, like Valheim: its optimization is alright. Now compare it to NMS, a AAA game. Or look at Choo-Choo Charles, an indie game whose optimization is not that great. Granted, it was made relatively quickly and by just one guy, but still.
Dude, imagine being the most-sold game of all time and having optimization so bad that a huge chunk of your community is dedicated to fixing it. Minecraft is truly not a heavy game; if only the devs addressed it, it would be playable.
Devs relying on pure hardware power instead of skill at squeezing as much as possible out of that hardware is the bane of most modern games (especially bigger ones). For example, remember how Crash Bandicoot did more than even Sony believed was possible? Meanwhile we have bland-looking games that look maybe 5% better than games from a few years ago but take dozens of times more disk space and struggle to run even on the best hardware consumers could theoretically access. Old ARK takes over 430GB on disk, looks like crap, plays even worse, and is plagued with more bugs than some actual Early Access games I've played (combined); Fitgirl's repack is about 43GB. Starfield is almost unplayable without Nvidia's DLSS, and Todd has the nerve to say it's because their game pushes the technology to its limits and PC players should just upgrade (ignoring the fact that consoles run the game better in general, even though their hardware is at most medium-grade compared to gaming PCs). Modern devs are just lazy. They self-learned from poor-quality tutorials on YT or some such and think themselves great developers.
In the future, games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs whilst maintaining realistic graphics.
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that; Minecraft is a pretty poorly optimised game, community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@jeff_7274 Bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without RTX LOL. But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they port the Java edition to C++ too and make them both equal.
@@jeff_7274 Compared to vanilla Java, perhaps. With my setup I can get ~60 fps in a jungle at 64 chunks on Bedrock, and 30 on the same seed with 32 chunks in Java, so you are right there. But with the Sodium mod and a few others I can get around 100-300 fps. So there is still a long way to go.
6:00 Please don't use Big-O notation like O(1); it's O(c), constant. O(n) is O(c·n), and O(log n) is O(c·log n). It's always the constant time needed for one element, multiplied by the number of elements n. ALWAYS
The blinking light is from World of Warships. Don't play it, the game has jumped off a cliff somewhere around 2021. Arguably even earlier, when devs reworked a class to a point where it's simply permabanned from any high level competitive play.
Meanwhile "Alan Wake II" Triangles? What Triangles? There's only 1 Trillion triangles on the player's perspective tho. Surely your entry level 4090 can handle it, right?
The games you show at the beginning are doing all the things you mention in the rest of the video. That's not "why" they are "unoptimized". And "optimized" is often an ambiguous term. Games may be optimized to run their workload as well as possible (given the time constraints and skills of the whole team) but still won't run on a potato PC, because that is not part of their design goal. If you include a path tracer in your engine, you'd better be "optimized". But if you have to run on the Switch, it makes more sense to drop the path tracer and maybe use a simple renderer with a few light sources.
@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions regarding all of this out in the public domain, a significant portion of which has come from developers working on AAA games. You don't need access to the source code that game X uses to implement screen-space reflections when the studio who made game X literally held a presentation discussing their approach in GDC one year, you can just go watch their presentation.
"Optimisation is easy just use the 20 new unrealengine5 meme effects that require a 4090 to run!" This video is why modern videogames are the way they are lmao
He should definitely have talked about baked lighting in the video. It makes games run so much better, and it looks awesome if done right. Many beginner devs just use Lumen since it's now enabled by default in UE5, and those devs don't know any better than to use this extremely slow, performance-consuming technique. Also a fun fact: I once saw a tutorial on "how to remove the 'lighting needs to be rebuilt' error", and the dude literally just told the viewers to select all the lights and make them dynamic. Talk about good performance...
@@paper_shreds I see some people calling baked lighting "faking it", as if it's somehow inferior to real-time lighting even when the results are better. I can't wait until they realize that all 3D graphics are built on trickery.
@@SomeRandomPiggo I feel like those people saw some presentation about open-world games at some point, where dynamic light sources are the only option, and that made them think it's the best way.
@@TheJohn_Highway R6 used to actually look good though. They've reduced the graphics over the years and over-sharpened it for E-sports. That's a bad example anyways, Half-Life Alyx and CS2 both use baked lighting and look photoreal sometimes.
I work as a programmer for a small indie studio and my job has really made me wonder how AAA companies make videogames. You often don't see this a lot with indie games, because if your studio has 1 - 3 programmers and someone doesn't know what they're doing, you just don't have a videogame. But AAA companies seem to always struggle with the absolute silliest things, like a single weapon in Destiny 2 breaking 36 times, or the Freddy Fazbear model in Security Breach having straight up movie-level polygons, or games like CoD or RDR2 having absolutely zero compression on anything, or GTA 5 Online literally having an infinite loop on the BOOT. How do these things happen??? EVIDENTLY they have talented engineers on the teams for all of these games, but all of these mistakes are so stupid, so first-semester-of-college basic, that I wonder who these people are actually hiring. Same goes for security too. It's definitely a lot of extra planning, but not much work to make your remotes secure, your client-server communication reliable and safe (P2P is harder though). But then these games never do it, end up with massive cheating problems, and then end up having to shell out millions for a rootkit garbage ""anticheat"". It's baffling.
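On the GTA Online point specifically: a well-known third-party analysis traced most of that load time to a parser that rescanned a large JSON blob from the beginning for every token it read. Here is a minimal C# sketch of that accidental-quadratic shape (hypothetical data, nothing like Rockstar's actual code):

using System;
using System.Diagnostics;
using System.Linq;

class QuadraticParseDemo
{
    static void Main()
    {
        // A comma-separated blob standing in for the big JSON file.
        // Keep it modest: the quadratic version blows up fast as this grows.
        string data = string.Join(",", Enumerable.Range(0, 50_000));

        // Quadratic: every token chops off the front and COPIES the tail,
        // so total work is roughly n + (n-1) + ... = O(n^2) in the input size.
        var sw = Stopwatch.StartNew();
        int slow = 0;
        string rest = data;
        while (rest.Length > 0)
        {
            int comma = rest.IndexOf(',');
            rest = comma < 0 ? "" : rest.Substring(comma + 1);
            slow++;
        }
        Console.WriteLine($"copy-the-tail: {slow} tokens in {sw.ElapsedMilliseconds} ms");

        // Linear: walk a single cursor forward over the same buffer instead.
        sw.Restart();
        int fast = 0;
        for (int i = 0; i < data.Length; )
        {
            int comma = data.IndexOf(',', i);
            i = (comma < 0 ? data.Length : comma) + 1;
            fast++;
        }
        Console.WriteLine($"single-cursor: {fast} tokens in {sw.ElapsedMilliseconds} ms");
    }
}

Both loops count the same tokens; only the second finishes in time proportional to the input.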
I think the problem with massive games is that their jumble of code ends up connecting a lot of seemingly separate things. Something we see as a small glitch could be an unimaginable tangle of code that is just waiting to break the entire game.
I will say something dumb, but I think it's because they don't hire good programmers that play games; instead they hire activists that really hate gamers, or just lazy developers.
@@theemperor4814 I don't think anyone who hates gamers or games would work in the industry; it's an absolutely insane amount of work not many people truly understand. But I wouldn't put it past an unqualified or untrained team to be in charge of these things. Having a novice audio or graphic designer be responsible for importing assets and forgetting to compress files, or having more junior developers write less critical code which is never cross-examined, could easily cause this issue.
@@zeelyweely1590 The thing is, they love games and hate gamers, but they want what you call diversity. Look at X (before Elon) having an excess of workers; that's the majority of the gaming industry now.
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies not to optimize their games, because of the overreliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response to that is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
The pinnacle of modern game optimization has got to go to doom eternal that game put doom 2016 and all other modern games that rely on bloated file sizes to shaaaaaaaame
I'd argue games developed by Nintendo themselves on the Switch takes the cake, but I guess that depends on what requirements you're going for here.
What about Factorio?
@@what42pizzaits not a AAA games. If we were talking about indie games, then youd see most of them being well optimised. Like factorio
@@reglan_devOkay, I have two big problems with that. First off, why do indie games not qualify for the pinnacle of modern game optimization? Also, Do you know how much better optimized Factorio is than any other indie game?
@@what42pizza 1. The video and the commenter both mentioned AAA games. As we all know, modern AAA scene is famous for massive amount of unoptimised games.
Indie games on the other hand, are known for the fact how well done they are, and optimised. Therefore they are a pinnacle of game optimisation, however the video wasnt talking about them.
2. Yes, i know how well optimised factorio is. However, its not the only very well optimised game out there too
- As a game dev obsessed with optimization (it's kinda my only responsibility at my workplace), I'm indescribably upset with the current technical state of modern games. Back in the day, with limited hardware specs and a harsh development reality, people had to give their best simply to make the game playable. I've been excited about that ever since I saw it working as a kid. Nowadays games are slapped with the most generalized optimization techniques, which aren't even polished to fit the game well enough, lol.
- Today, with so many high-level languages with large overhead, game engines aimed at designers who freak out at the mere thought of using their keyboard instead of their mouse, the industry falling to its lowest and becoming yet another business area, publishers' ridiculous tendency to develop a lot of generic, over-casualized games in a foolish desire to conquer new users, and overall shitty educational quality, games end up with so much overhead and technical debt that you could easily fit 2 or even 3 old games in there, both memory- and performance-wise.
- I have always been driven into the industry by this, and I remained committed to optimization all along. I treat it like an art, which it really is. It's a shame seeing the industry bloated with people so far from understanding how games work that Betelgeuse appears to be within arm's reach by comparison, but this is the reality. Less competition as well, which, oh man, do I abuse.
- Thanks for making the effort to read, whoever you are. Much appreciated :D
What's even more infuriating is seeing gamers and even some benchmark channels say things like "Oh, this 2023 game runs at 60fps on medium settings on a $900 PC? Then it's optimized!" Meanwhile, that game may have 2012 graphics at absolute best and absolutely no physics or interactive environments to speak of.
There's this braindead notion that a game's optimization is judged by when it was released rather than by what it looks like and how much physics or interactive environment it has. As a result, devs use modern hardware not to push graphics and technology forward, but to become lazier.
Imagine if we told devs 12 years ago that in the future video games would visually look 15% better but will run quite literally 500% worse
People don't often consider a game's optimization in their reviews, or praise it all that much.
Take NMS: procedurally generated, yet it runs much better than most titles. Valheim is less optimized, but it's an indie title and has made large strides in improving its optimization. RDR2 runs pretty well too, same with GTA V. Doom, obviously. Distance is another well-optimized game, same with "The Entropy Center". Forza Horizon 4 and 5 are very well optimized.
But people will just say "FC5 is well optimized" or stuff like that.
While a few studios no doubt demonstrate a degree of technical incompetence, I firmly believe the vast majority of these cases simply boil down to poor management. Tight deadlines, feature creep, loose vision, quick turnaround to avoid investor pullout, higher ups that are disconnected from the pipeline and complexities that arise out of short-sighted decisions, the list goes on.
With "industry" now being the operative word for the gaming industry, it's no surprise we see the same practices being utilised in larger projects to corner cut so long as people keep buying; and more so than any other sector of products I've seen, the consumer-base for games is far and away the most willing to put up with this capitalistic downward spiral.
@qma2275 Watching the video and hearing a lot from you got me really interested in diving deep into these topics. I really want to learn more about optimization and graphics: what actually works in terms of optimisation, and everything that happens behind a game. So I'd like to try any recommendations you've got for a beginner like me!
That's why AAA makes billions while you work for minimum wage as an indie dev lmao
"Our game is running just fine, maybe it's time to upgrade."
Todd Howard, to PC gamers with i9s and 4090 Tis
What pissed me off the most regarding Starfield and Todd is when he said they optimized the game pretty well but a few weeks later they released a patch that increased performance by quite a lot. His own team proved that he was talking BS.
Not that it needed much proving anyway because we have Cyberpunk where I could achieve literally double the FPS. And Night City is not even remotely comparable to Neon or Akila.
Okay you can't fill a room with cheese in Cyberpunk but who even really cares.
@@valentinvas6454 Fr, unpaid modders released optimization patches like that the same night it released 😂 They also added DLSS and other much-needed stuff.
@@valentinvas6454 Dude you're comparing apples to oranges. This is so ignorant on so many levels I don't even know what to say.
@@ged-4138 Care to elaborate?
@@ged-4138 You don't know what to say because you don't have anything to say. So next time just keep it to yourself.
Most companies' idea of "optimization" is to just raise the system requirements.
I've always joked that if a Windows programmer today was asked to write an exact copy of Space Invaders, it would require a 2.4GHz i5 CPU, 8GB of RAM, a graphics card supporting Pixel Shader 3.0, DirectX 11, and a minimum of 20GB of hard drive space.
And you'll not be wrong 😂
Now programmers just don't even want to do hard work. I just hope everything becomes normal in the near future; otherwise we're gonna have to keep upgrading every year and keeping up with the trend even if we don't want to. I mean, devs could make the game for the lowest hardware, using that hardware, and by lowest I don't mean an Intel Atom but rather what most people use on average and what most people are capable of using, and use those combinations of hardware to build games or applications.
@@chry003 One of the problems is the reliance on programming "environments" that aim to make programming easier at the cost of increased requirements. I get it, you want to use whatever will make your life easier, but using these often gives a more inefficient result.
There's a small utility for older versions of Windows called TaskArrange that lets you change the order of the buttons on the Task Bar, since older versions of Windows lacked this ability. The whole thing is about 50K, programmed in pure assembly. I once downloaded a program that did the same thing, with maybe an extra feature or two, but it required the .Net Framework. The program was 500K. I know in today's world, 500K is nothing, but the point is that it was ten times larger than the one made in assembly. Plus the extra size of the .Net Framework. All to do the work of a 50K program.
For another example, take the PS2 versions of GTA3, and GTA: Liberty City Stories. Both games use essentially the same map, so they should run about the same in an emulator. However, Liberty City Stories requires a faster system to run smoothly, than GTA3 does.
Another thing is that often when someone says that something can't be done, what they mean is that there's no pre-made function to do that. Within the limits of the system's hardware, software can do ANYTHING the programmer wants it to. Windows can't run Mac software, right? Well, if you write a Mac emulator, it can.
DX11? It'd need DX12, Windows 10, and an RTX card, cuz we don't use the old-fashioned shading style anymore.
Imagine if hardware was developed just like modern games:
“- Whoa! It POSTs!
- Ship it!
- But there are issues and the clock is unstable.
- We will fix it via driver updates!”
Shh! Don't give them ideas!
It's consumers that enabled this behavior. If people stopped buying broken games due to their FOMO, we'd be doing much better
If they could, they would. Fortunately, casual gamers aren't the only ones buying motherboards and computer parts, I guess? Professionals and big corporations do too, and they probably want a stable and well made part.
You described every new GPU launch since 2015! The RX480 blowing out motherboard PCIe slots by pulling more than 75W is one of the more recent examples of something being fixed by a driver update :P
Developers often don't even optimize... they wait to see if idiots will buy overpowered hardware first.
Code is written by idiots, suboptimally... they try to get the product done first.
At BlizzCon many years ago, Blizzard told a story: they had created a new city for World of Warcraft called Dalaran, but they discovered that rendering this city was ridiculously slow.
On further investigation they found the problem was a toy shop, which had a model of the city as one of the toys. To save time, the programmer had simply referenced the normal city model, scaled it down, and rendered it.
This led to a huge problem: if you looked at the toy store, the game would resize the city and render it as a model. That of course included the toy shop and its model city, which caused it to resize the city again, effectively causing infinite recursion.
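For anyone wondering how that happens in code, here is a minimal sketch (invented names, not Blizzard's actual engine) of a scene graph that contains itself, and the depth guard that keeps rendering it from recursing forever:

using System;

class Model
{
    public string Name = "";
    public Model[] SubModels = Array.Empty<Model>();

    public void Render(int depth = 0)
    {
        // Without this guard, a model that (indirectly) references itself
        // recurses until the stack overflows.
        const int MaxDepth = 2;
        if (depth > MaxDepth) return;

        Console.WriteLine($"{new string(' ', depth * 2)}rendering {Name}");
        foreach (var sub in SubModels)
            sub.Render(depth + 1);
    }
}

class RecursiveCityDemo
{
    static void Main()
    {
        var city = new Model { Name = "Dalaran" };
        var toyShop = new Model { Name = "ToyShop" };
        var toyCity = new Model { Name = "ToyCity (reuses the city model)", SubModels = new[] { city } };
        toyShop.SubModels = new[] { toyCity };
        city.SubModels = new[] { toyShop }; // the cycle: city -> shop -> toy -> city

        city.Render(); // terminates only because of the depth guard
    }
}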
that's crazy. thanks for sharing.
Wouldn't the game just not work then? It has to render an infinite number of the cities.
Perhaps not because they get smaller?
But somehow this makes me think about how some big mobile games are optimised to run on comparatively weaker CPUs.
I can't wait until the Silicon Limit is reached and we finally start actually optimising more rather than demanding more and more RAM, higher and higher clock speeds, and yet more storage.
Don't get me wrong, there are issues with over-optimisation or pre-optimisation from a maintenance standpoint, but *some* optimisation would be nice.
There is no issue with over-optimization; an optimized game will run perfectly for years. Skyrim is the peak example.
@@fireloop69 I mean Skyrim is a BAD example, it's not a well optimised game, and it doesn't run perfectly.
But also my issue was that there *aren't* over optimisations, instead games and such just demand higher spec machines.
@@cptnraptor Understandable. Well, take RDR2 as an example then; one may say it's over-optimised, but it still looks better than most modern games while running smoother as well.
@@fireloop69 RDR2 is definitely the perfect example
Tried to run Fortnite on a GTX 1050 recently; it looks like shit. Meanwhile I can run CoD WW2 on medium-low with incredible performance! And don't get me started on other well-optimized games like BioShock, Black Mesa, etc.
For those who learned OpenGL: derive your objects from the same base class, so when you apply a shader you can do it for all objects at once through an overridden function (a common problem).
This comment was approved by real amecian patriots!!!
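A minimal C# sketch of the pattern being described (made-up names, plain console output standing in for actual OpenGL calls): give every drawable a common base class so one loop can apply a shader to everything at once:

using System;
using System.Collections.Generic;

abstract class Drawable
{
    // One overridable hook shared by every object in the scene.
    public virtual void ApplyShader(int shaderId) =>
        Console.WriteLine($"{GetType().Name}: binding shader {shaderId}");

    public abstract void Draw();
}

class Cube : Drawable
{
    public override void Draw() => Console.WriteLine("drawing cube");
}

class Terrain : Drawable
{
    public override void Draw() => Console.WriteLine("drawing terrain");
}

class SceneDemo
{
    static void Main()
    {
        var scene = new List<Drawable> { new Cube(), new Terrain() };
        foreach (var obj in scene) // one pass applies the shader to every object
        {
            obj.ApplyShader(42);
            obj.Draw();
        }
    }
}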
I would credit every single Japanese video game developer in this department. They release some of the most optimised games ever. Special shoutout to Metal Gear Solid V; I played that game on 2 GB of RAM.
Oh yes, I especially like how even cutting enemies into 100-plus pieces only lags slightly.
I have a 1.5 GHz CPU and an integrated GPU. It runs MGR. Just how.
This video was super cool. Why is this channel so underrated. Hope it blows up!
The demonstration about "what if light was slow" was also amazing
Simply quality content: straight to the point and engaging. Luckily I was already subscribed to this channel.
There is a very important thing that is often forgotten about big-O notation: it ignores constants, and those constants might be huge. And that matters a lot if your code doesn't operate on big datasets.
explain
@@MehmetMehmet-y8c Big-O is a quick-and-dirty notation meant to eyeball how an algorithm scales with the number of inputs. Imagine I have some hypothetical algorithm, and after careful analysis I determine that the number of operations required to complete goes: n^3 + 10n^2 + 100n. With big-O notation you're making the rather dirty observation that eventually n^3 dominates and everything else doesn't matter. But when is "eventually"? And how can 100n be ignored for small n? Big-O is flawed in that sense.
@@ef3675 i understand what you mean. good example
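To put concrete numbers on that polynomial (my arithmetic, not from the thread):

f(n) = n^3 + 10n^2 + 100n
f(5) = 125 + 250 + 500 = 875                    (the "ignorable" 100n term is 57% of the work)
f(100) = 10^6 + 10^5 + 10^4 = 1,110,000         (n^3 is now roughly 90% of the work)

So the O(n^3) summary only starts telling the truth once n is well past the point where the lower-order terms stop mattering.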
I think graphics card companies also have something to do with games not being optimized, so as to sell stronger GPUs so that people can run a terribly optimized game.
Actually, GPU vendors tend to work with game studios to optimize their games to run on their hardware so their hardware looks better in reviews. Your comment reminds me of Nvidia Gameworks though. It was this black box Nvidia gave to devs to do certain effects, such as hair. But, it was pretty much Nvidia abusing brute force tessellation to make subpar effects. Nvidia had better tessellation than AMD, so games ran faster on Nvidia cards. AMD at the same time made open source ways to do the same thing, though looking better, and running faster... even on Nvidia hardware.
This entire bullshit of a theory is actually being taken seriously by some people. It blows my mind how unreasonably moronic people have become. Releasing unoptimized games really hurts the reputation of the devs, and this in turn could hurt their sales badly enough to shut them down. It's quite literally devs releasing the game in a broken state so they can get money real quick from all the preorders. At the end of the day, devs release broken games to make a quick buck, not to save some GPU vendor's ass.
@@xeridea Game sponsorship only tells part of the story. Almost all, if not all Ubisoft games are AMD sponsored, yet if we're talking CPUs, Intel CPUs handle the Assassins Creed and Far Cry games a lot better, an i9 9900k significantly outperforms a 5950X which came out a year later and usually trades blows with the 10900k. GPUs however, I believe Nvidia's usually do just as good a job, and in some cases even better, like in Far Cry 6 which has Ray Tracing.
Games that don't even look as good as the original Crysis (2007) run like crap on today's hardware... just look at Starfield. An empty planet and a few pebbles... no terrain, no vegetation and yet it can drop to 50 FPS on my high end PC. It's ridiculous.
@@xeridea I remember this.
Wrong title, it should have been: The LOST Art of Game Optimization
It's not lost, though. AAA games aren't the only games on the market, and even then there are well-optimized AAA games.
@@crestofhonor2349 And for common people, AAA games are the only ones that matter.
@@crestofhonor2349
Name 1 optimized AAA game made after 2016
Asphalt 8/9 is crazy optimized if you’ve ever played it. Glorious graphics, solid 60 fps all the time even on pretty old computers and mobiles, very responsive.
True. Unoptimized games are not the fault of the software engineers but of the management.
Speaking from experience as a dev myself. Boy, if you saw our codebase. Lol. Tight deadlines, so we are cutting corners. I don't want to, but I need to xD
I just about cried when I saw you put the subdivision modifier on the door handle. I think I felt genuine pain.
You deserve much more subs and views! Amazing video!!
100%, I was confused by how this can only have 300 views, great work
Look, this is why I say "More Hardware doesn't make a better game" The argument should be "Better Optimization makes a better game"
The big-O notation describes the growth of a function, not the actual execution speed. For example, hash maps have a lookup time complexity of O(1), whereas linear arrays have a lookup time complexity of O(n). However, if the hashing function is slow, array lookups will most likely outperform map lookups for lower values of n: a constant cost of 10,000 operations is still O(1), yet it's slower than n operations for any n < 10,000.
That is to say, it would be incorrect to state that the big O notation accurately represents the real-world performance of an algorithm outside of big data. The correct way to locate slow functions is by profiling.
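A rough sketch of how you could check that crossover in practice (hypothetical setup; absolute numbers depend on key length and hardware, which is exactly why profiling beats guessing):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class LookupCrossoverDemo
{
    static void Main()
    {
        foreach (int n in new[] { 4, 64, 4096 })
        {
            // Same string keys in a flat array and in a hash map.
            var keys = new string[n];
            var dict = new Dictionary<string, int>(n);
            for (int i = 0; i < n; i++) { keys[i] = "key" + i; dict[keys[i]] = i; }

            string target = keys[n - 1]; // worst case for the linear scan
            const int reps = 100_000;
            long sink = 0;

            var sw = Stopwatch.StartNew();
            for (int r = 0; r < reps; r++)
                for (int i = 0; i < n; i++)        // O(n): compare until found
                    if (keys[i] == target) { sink += i; break; }
            double linearMs = sw.Elapsed.TotalMilliseconds;

            sw.Restart();
            for (int r = 0; r < reps; r++)
                sink += dict[target];              // O(1): but pays to hash the key
            double dictMs = sw.Elapsed.TotalMilliseconds;

            Console.WriteLine($"n={n,4}: linear {linearMs:F2} ms, dict {dictMs:F2} ms (sink={sink})");
        }
    }
}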
Never understood this O shit, tbh.
Game devs have it easy with ample RAM and CPU and they still fuck it up. Try embedded development where resources are extremely limited and crashes can potentially cause catastrophic failures.
Dude, I heard the accent, went straight to the channel description, and saw what I was hoping for. A HUNGARIAN youtuber making English content? I'm going to browse through this channel very thoroughly 😂 Keep it up, you're doing something great 💯
Nintendo is great with compression techniques, it's crazy. They truly are the best at making compressed video game file formats.
They had to be; the Nintendo Switch is comparable to a 2023 midrange smartphone in terms of power.
That's intentional; Nintendo's law is to always make games that are fun regardless of graphics. That's why the Wii games have held up so well compared to PS2-3 games.
In the 90s, due to the limited resources on computers, devs had no choice but to optimize software (not just games) before release; the internet for updates was pretty much nonexistent.
Remember how little RAM some programs used, like Adobe's PDF reader; heck, look at PDF reader alternatives and you will see some use less RAM overall.
Heck, Windows did not show seconds on the taskbar clock, as the CPU would have needed to render a new number every second and that would cause a performance hit. While this was more true for Win9x, it still shows that devs had to make sure the code they wrote was good from the start.
But these days, because we can have 64GB of RAM and 2TB SSDs, devs don't really bother optimizing software. Why waste time optimizing when you have so many resources?
Nowadays games:
- 98 GB for just a fighting game (yes, Tekken, I am talking about you);
- DirectX 12 that wrecks the graphics for systems with actually strong chips but old graphics cards (like the GTX 970), making them run slow or creating really annoying visual artifacts due to the low resolution applied (another example: Tekken 5 looks way prettier than Tekken 8 in low quality; also, SF6 doesn't need DirectX 12, it could have had DirectX 11 and Vulkan versions for performance too).
Seems like the industry only cares about path tracing in real-time rendering.
It's not DX12's fault; it's the fault of lazy developers used to DX11's abstractions.
Tbf, with Tekken 8 you can just delete the story files (30+ GB).
Vulkan > DX12. Btw, you can use DXVK or VKD3D in some cases to fix stutter and memory leaks in older titles.
I gave it a like just for the effort that you put into the video.
What a great work dude!
I jokingly made the remark that AAA studio leads just say "Our game isn't unoptimised, just buy a 4090" behind the scenes.
And then Todd Fuckwad said it in an actual interview. Fuck everything that guy stands for. And then he dares to get disappointed at the Game Awards every time it doesn't win. Dude thinks the sun shines out of his ass.
This was surprisingly interesting… Thank you!
WOW! That's a highly concentrated quality video. So much meaningful content in so little time.
Hey! What's the tool you are using at 3:50? This seems way better than having to boot up photoshop every time I want to use smoothness textures (damn Unity smoothness on Albedo Alpha :v) Great video!
The number one thing that works for optimization is frame generation. Anybody can now do it with any game and any graphics card with a program on Steam called Lossless Scaling. That's what happens when AMD makes FSR frame gen open source.
KSP2 devs: Let's make a ridiculously accurate rocket-building game! And the map is the whole Solar System, just for good measure!
Also KSP2 devs: Recommended for 1080p60 is an RTX 3080.
what
what
what
what
what
Very informative video. Some parts of the video are really hard to hear due to the heavy accent; the autogenerated subs can only do so much.
There are mid-2000s games that look and play amazingly on what we today consider low-end hardware.
Thanks, this was a pretty cool rundown.
This is underrated. Great video, sir!
I'm not really from a rich place, nor do I have money as a student, so my PC is kinda meh... and I admire this a lot. As a programmer myself, I've really dived deep into this lately.
I just can't play a game slower than 40-50 fps... or with drops.
New games almost entirely lack the proper optimization to make them playable and acceptable for players to enjoy. Those who still do it well are passionate artists...
wow, someone who knows what they're talking about, very refreshing
6:07 I got curious what game this is and wrote some code to analyze all existing games -- "a game that isn't dying or anything, or at least that's what the devs like to think" -- is it World of Warships?
A beautiful but forgotten art, sadly 🤧 Well, at least indie devs still try in that department.
This video basically explained why I'll never learn programming. The numbers immediately made my head hurt.
Quality content right here! Thank you for sharing this!
Raytracing can be more efficient than the typical rasterization when dealing with a huge quantity of triangles.
Yeah, but you still need special hardware to do so. There surely is a better way to do it, just like we all managed before raytracing while still looking good.
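A back-of-envelope version of that trade-off (my numbers, not from the video): rasterization has to touch every triangle, while tracing rays against a bounding volume hierarchy costs roughly one tree descent per ray:

raster    ~ O(T)           (T = triangles submitted)
ray trace ~ O(P · log T)   (P = rays, about one per pixel)

At 1080p, P ≈ 2·10^6. For T = 10^9 triangles, P · log2(T) ≈ 2·10^6 · 30 = 6·10^7 steps, versus 10^9 triangles for the raster pass: roughly a 16x gap in step counts. The catch is the constant per step: BVH traversal steps are far more expensive than raster setup, which is why dedicated RT hardware (or clever LOD schemes like Nanite) is needed before the asymptotics win.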
To give some feedback:
I think the "Code" section was a bit too fast especially you explaining what we are actually trying to accomplish. I didn't get it before rewatching that part a few times
I think we should focus on software more than hardware now; hardware is at its peak and it will never improve to the point of actually... well, improving. I have a 3050 laptop; it's pretty sluggish in certain games but runs others like a dream. Is there a huge graphical difference? No. So why does it do that?? Fucking optimization.

The 3050m GPU is definitely not the peak I was talking about, but it's pretty damn close: just add some extra VRAM, a few more CUDA cores, a little faster bandwidth, and boom, you just made yourself something that can run literally everything (spoiler: that exists, it's called the 3060). And if the 3050 by itself can run everything, some games at lower settings, a lot more at higher, then I don't see a single reason we should "improve" past the 3090, much less the 4090.

If we're talking realistically, the 4090 can last people decades! What more do we need from a graphics card? Time travel? It's a graphics card; it runs games at 4K ultra with RT on. I seriously do NOT see anything we could improve. If game companies put care into optimization, I wouldn't have to upgrade my laptop for another decade, and those with much more powerful GPUs, like a 3070 or a 4070, shouldn't have to upgrade until those GPUs simply break.
9:26 "pretty good optimization"? no, that's insane optimization. what a simple solution as well. now, whenever i think a program can't be more optimized, i'll slap myself and remember this example and try to optimize it more.
A YouTube video once said the reason DX12 is so "bad" is that the burden of optimisation is now on the game devs, who don't have much experience with drivers.
What about Vulkan...?
@@BOT-fq9bu Also a low-level API; roughly speaking, you would have to do the same amount of work.
If your game sucks it's not the API's fault, it's yours.
We can't even run games at 4K 165 fps without upscaling unless it's a 4080 or 4090. Sucks that GPU prices are getting higher when we're not even getting good performance at native resolution.
4K is a lot of pixels to cover; one needs those high-priced GPUs to render it.
Research floating point operations.
buy a 1920x1080 monitor.
At this point in time, graphics have reached their peak. In a lot of games you would never be able to tell the difference between high settings and ultra settings except for a drop in FPS; hell, in some games medium settings look just as good as high settings unless you compare them side by side. It's time that game studios realized that individually rendered fish scales aren't something players want; the fish could literally be a collection of oval objects with a gray texture and players would be happy. Optimization is needed more than ever, given that most people will never buy a graphics card more powerful than the RTX 20 series, and people like me who still run the GTX 16 series probably won't upgrade to a 20-series card for years to come.
This title feels like it should be "The Lost Art of Game Optimization".
7:43 all that work for "dunkirk"
UE5 "optimized" games to the point where they require either top-end hardware or upscaling and frame generation to be playable... the future is bright!
Very awesome video. Such an accessible introduction to optimization
9:22
You aren't using the full performance benefits of dictionaries. By accessing the value by key instead of iterating over the values, you get an execution time of 0.25 ms:
Dictionary<string, List<string>> wordsDictionary = new Dictionary<string, List<string>>();

void LoadWords()
{
    string[] lines = File.ReadAllLines("words_alpha.txt");
    foreach (string line in lines)
    {
        // Key each word by its undelimited Morse code.
        string morse = Translate(line, "");
        if (!wordsDictionary.ContainsKey(morse))
        {
            wordsDictionary.Add(morse, new List<string>());
        }
        wordsDictionary[morse].Add(line);
    }
}

string Translate(string input, string divider = " ")
{
    string result = "";
    foreach (char c in input)
    {
        // lettersDictionary (declared elsewhere) maps each character to its Morse code.
        lettersDictionary.TryGetValue(c, out string morse);
        result += morse + divider;
    }
    return result;
}

List<string> TranslateInvalidCode(string input)
{
    input = input.Replace(" ", "");
    // Single O(1) lookup instead of scanning every entry.
    return wordsDictionary[input];
}
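For context, a minimal usage sketch of the code above (the lettersDictionary initializer and the sample input are mine, not the original commenter's):

// Hypothetical setup: map each letter to its Morse code, build the index once, then query.
Dictionary<char, string> lettersDictionary = new Dictionary<char, string>
{
    { 's', "..." }, { 'o', "---" } // ...and so on for the rest of the alphabet
};

LoadWords(); // one O(n) pass over the word list to build the index
List<string> matches = TranslateInvalidCode("... --- ..."); // O(1) lookup, e.g. ["sos"]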
C. You are using Unity.
Thought it was the subdivision that broke shit but 3 MILLION?!!? HOW DO YOU DO THAT BY ACCIDENT
Thank you very much, now I know why I am getting 3 fps👌🏻
Someone needs to send this to the team working on ready or not 😂
yes!! 2 years ago the performance of that game started to go downhill: from always over 100fps on ultra to almost never going above 80fps, too many drops to 60fps or below, lots of stuttering, and increased input lag.
Removing the old maps and game modes in the 1.0 update and worse performance are the worst issues of Ready or Not to me, it's just so unfortunate.
I did not understand a word from 5:46 to 10:17. 10/10
get your ears checked then lol
Try understanding some bitches, we can understand him fine
All of this in 10 minutes, insane video
On low budgets, developers export the game engine together with the game they make. Making Pong now requires exporting a custom Unreal Engine 3 build along with it.
People always say to stop optimizing games, that it's stupid; well, they need to be optimized
Dunkirk is a small city in France, on the English Channel, in which the British troops were trapped during the invasion of France. There is also a movie about this with the same name. @worldsinmotion
Cyberpunk runs on the Steam Deck... but people want to tell me it is impossible to run it on the PS4 with the add-on and 2.0 update? NO WAY... This game was a crime
The Steam Deck is more powerful than the PS4. So yes... it runs on a Steam Deck and not a PS4.
@@jairit1606 and the Legion Go is more powerful than the Steam Deck; i played every quest and the add-on on it. Really freaking good game. (and handheld PC)
Ah yes, game optimization, the thing every developer ever forgot about
Hi! Good video, but you used the wrong Godot logo at about 2:00. That's all
You mean those extra teeth?
@@llllllXllllll yes, that logo is very old and not used anymore
GPU: I fear no Pixel, but that thing *Shows Door Handle*.... it scares me.
1:51 It can be c. actually ;) Try the same in UE5 with Nanite :D
Should have watched till the end xD Great video! ^_^
really good video, but you need to find some way to reduce the peaks whenever you pronounce an S, as they hurt my ears a bit...
If it's so hard to estimate and so hard to do, why do game companies hype up the gaming community with announcements and release dates? Get the job done and then announce!
Thanks man I loved it
Moore’s Law is only dying if you’re talking about transistor density, not compute power density, which is what we should really be measuring and which isn’t slowing down any time soon.
Give this man a world of warships sponsor
@1:50 C. You are using Unity
Topic is good but I can't understand a single word; I'm depending on subtitles only 😢
nice video
Good video, man! CFBR
I needed to empty out some space on my drive. From what I gathered, indie devs do so much better at optimization. Just look at the recent poor optimization of Cities: Skylines 2. Most of the time the devs argue that home computers are underperforming to deflect the criticism.
Yeah, but indie games are generally not that large in scope either. Look at the ones which are, like Valheim. Its optimization is alright. Now compare it to NMS, a AAA game. Look at Choo-Choo Charles, an indie game. Optimization is not that great in it. Granted, it was made relatively quickly and by just one guy, but still.
Dude, imagine being the best-selling game of all time and having optimization so bad that a huge chunk of your community is dedicated to fixing it
Minecraft is truly not a heavy game; if only the devs addressed it, it would be playable
They have been working on it. I think they also focus on good practice more than performance.
Nanite is actually a really bad way to do LOD scaling, as it manipulates existing meshes in real time, requiring extreme amounts of CPU usage.
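For contrast, the traditional approach Nanite replaces is a handful of pre-authored meshes swapped by camera distance, with near-zero per-frame CPU cost. A minimal sketch (the Mesh type and the distance thresholds are placeholders, not any engine's actual API):

// Classic discrete LOD selection: pick one of a few pre-built meshes by distance.
class Mesh { /* stand-in for an engine mesh */ }

class LodGroup
{
    // Assumed sorted by ascending maxDistance; thresholds are illustrative.
    readonly (float maxDistance, Mesh mesh)[] levels;

    public LodGroup(params (float maxDistance, Mesh mesh)[] levels)
    {
        this.levels = levels;
    }

    public Mesh Select(float distanceToCamera)
    {
        // A few comparisons per object per frame; the meshes themselves never change.
        foreach (var (maxDistance, mesh) in levels)
            if (distanceToCamera <= maxDistance)
                return mesh;
        return levels[levels.Length - 1].mesh; // beyond every threshold: cheapest mesh
    }
}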
Devs relying on pure hardware power instead of skill at squeezing as much as possible out of that hardware is the bane of most modern games (especially bigger ones).
For example, remember how Crash Bandicoot did more than even Sony believed was possible?
Meanwhile we have bland-looking games that look maybe 5% better than games from a few years ago but take dozens of times more disk space and struggle to run even on the best hardware consumers could theoretically access.
Old ARK takes over 430GB on disk, looks crap, plays even worse, and is plagued with more bugs than some actual Early Access games I've played (combined). FitGirl's repack is about 43GB.
Starfield is almost unplayable without Nvidia's DLSS, and Todd has the nerve to say it's because their game pushes the technology to its limits and PC players should just upgrade (ignoring the fact that consoles run the game better in general even though their hardware is at most medium-grade compared to gaming PCs).
Modern devs are just lazy. They self-learned from poor-quality tutorials on YT or some shit and think themselves great developers.
In the future games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs whilst maintaining realistic graphics
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that; Minecraft is a pretty poorly optimised game, and community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@@MrMoon-hy6pnBedrock was pretty well optimized. You can get 60 fps at a render distance of 84 chunks on the right specs.
@jeff_7274 bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without rtx LOL
But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they port the Java edition to C++ too and make the two equal.
@@jeff_7274 compared to vanilla Java, perhaps. With my setup I get ~60 fps in a jungle at 64 chunks on Bedrock and 30 in the same seed at 32 chunks on Java, so you are right there. But with the Sodium mod and a few others I can get around 100-300 fps. So there is still a long way to go.
6:00 Please don't use Big-O notation like O(1); it's O(c), constant. O(n) is O(cn). Or O(log cn)
It's always the constant time needed for one element, multiplied by the number of elements n. ALWAYS
No, you don't understand how Big-O notation works.
The definition of f in O(g(x)) is: there exist a multiplier M and a threshold for x such that 0 ≤ f(x) ≤ M·g(x) whenever x is past the threshold.
O(c) belongs to O(1). They're sets.
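For reference, the standard textbook definition (a known fact, not from this thread): f ∈ O(g) means there exist M > 0 and a threshold x₀ such that 0 ≤ f(x) ≤ M·g(x) for all x ≥ x₀. Since M absorbs any constant factor c, O(c) = O(1) and O(c·n) = O(n); dropping constants is built into the definition, which is why writing O(1) is correct.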
Comment for the algorithm, awesome content!
No way wreckfest made it on a thumbnail
What game is it that you mention at 6:12?
The blinking light is from World of Warships.
Don't play it; the game jumped off a cliff somewhere around 2021. Arguably even earlier, when the devs reworked a class to the point where it's simply permabanned from any high-level competitive play.
A sacred art lost to time..
Meanwhile "Alan Wake II"
Triangles? What Triangles? There's only 1 Trillion triangles on the player's perspective tho. Surely your entry level 4090 can handle it, right?
Having an RTX 3060 be the MINIMUM GPU requirement for a game to run screams shoddy optimization
Modern game devs: I will pretend I did not see that
Really good video dude! Noice
The games you show at the beginning are doing all the things you mention in the rest of the video. That's not "why" they are "unoptimized". And "optimized" is often an ambiguous term. Games may be optimized to run their workload as well as possible (given the time constraints and skills of the whole team) but still won't run on a potato PC, because that is not part of their design goal. If you include a path tracer in your engine, you'd better be "optimized". But if you have to run on Switch, it makes better sense to drop the path tracer and maybe use a simple renderer with a few light sources.
Unless you have access to the source code and assets of those games, your claim is just speculation.
@@swh77 Because in his video he is just talking about basic stuff. Not what goes into optimizing a raytracer, shader passes and so on.
@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions regarding all of this out in the public domain, a significant portion of which has come from developers working on AAA games. You don't need access to the source code that game X uses to implement screen-space reflections when the studio who made game X literally held a presentation discussing their approach in GDC one year, you can just go watch their presentation.
Fascinating video!
"Optimisation is easy just use the 20 new unrealengine5 meme effects that require a 4090 to run!"
This video is why modern videogames are the way they are lmao
he should definitely have talked about baked lighting in the video. it makes games run so much better, and it looks awesome if done right. Many beginner devs just use Lumen, since it's now enabled by default in UE5, and those devs don't know any better than to use this extremely slow, performance-hungry technique.
Also a fun fact: i once saw a tutorial on "how to remove the 'lighting needs to be rebuilt' error" and the dude literally just told the viewers to select all the lights and make them dynamic. talk about good performance...
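For anyone newer to this, "baking" just means paying the lighting cost once, offline, and reading the result back at runtime. A toy sketch of the idea (the names and the falloff model are mine, not any engine's API):

// Toy lightmap bake: evaluate all static lights once per texel offline;
// at runtime, "lighting" is a single array read instead of a per-light loop.
class ToyLightmap
{
    const int Width = 256, Height = 256;
    readonly float[] texels = new float[Width * Height];

    public void Bake((float x, float y, float intensity)[] staticLights)
    {
        for (int ty = 0; ty < Height; ty++)
        for (int tx = 0; tx < Width; tx++)
        {
            float total = 0f;
            foreach (var light in staticLights)
            {
                float dx = tx - light.x, dy = ty - light.y;
                // Simple inverse-square falloff; real bakers also trace bounces and shadows.
                total += light.intensity / (1f + dx * dx + dy * dy);
            }
            texels[ty * Width + tx] = total;
        }
    }

    // Per frame: no light loop at all, just a lookup.
    public float Sample(int tx, int ty) => texels[ty * Width + tx];
}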
@@paper_shreds I see some people calling baked lighting "faking it", as if it's somehow inferior to real-time lighting even when the results are better. I can't wait until they realize that all 3D graphics are built on trickery
I feel like those people saw some presentation about open-world games at some point, where dynamic light sources are the only option, and that made them think it's the best way @@SomeRandomPiggo
@@SomeRandomPiggo
I believe that baked lighting gets a bad rep because nearly every modern game with baked lighting looks terrible (see: R6 Siege)
@@TheJohn_Highway R6 used to actually look good though. They've reduced the graphics over the years and over-sharpened it for E-sports. That's a bad example anyways, Half-Life Alyx and CS2 both use baked lighting and look photoreal sometimes.
You know it IS quality content when you hear that accent
I work as a programmer for a small indie studio and my job has really made me wonder how AAA companies make videogames.
You often don't see this a lot with indie games, because if your studio has 1 - 3 programmers and someone doesn't know what they're doing, you just don't have a videogame. But AAA companies seem to always struggle with the absolute silliest things, like a single weapon in Destiny 2 breaking 36 times, or the Freddy Fazbear model in Security Breach having straight up movie-level polygons, or games like CoD or RDR2 having absolutely zero compression on anything, or GTA 5 Online literally having an infinite loop on the BOOT. How do these things happen???
EVIDENTLY they have talented engineers on the teams for all of these games, but all of these mistakes are so stupid, so first-semester-of-college basic, that I wonder who these people are actually hiring.
Same goes for security too. It's definitely a lot of extra planning, but not much work to make your remotes secure, your client-server communication reliable and safe (P2P is harder though). But then these games never do it, end up with massive cheating problems, and then end up having to shell out millions for a rootkit garbage ""anticheat"". It's baffling.
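On the GTA 5 Online point: the infamous loading-time bug (dissected publicly by a modder and later patched by Rockstar) wasn't literally an infinite loop; it was accidentally-quadratic string scanning while parsing a large JSON file. Roughly this shape, as a C# caricature (not Rockstar's actual code):

// Accidentally quadratic: re-measuring the input on every token, like sscanf
// calling strlen internally for each of tens of thousands of JSON entries.
int CountEntriesSlow(string json)
{
    int entries = 0;
    for (int i = 0; i < json.Length; i++)
    {
        int remaining = MeasureRemaining(json, i); // O(n) work on every iteration
        if (json[i] == '{') entries++;
    }
    return entries;
}

int MeasureRemaining(string json, int from)
{
    int n = 0;
    for (int i = from; i < json.Length; i++) n++; // walks to the end on each call
    return n;
}

// The fix: track position and length once, making the whole parse O(n).
int CountEntriesFast(string json)
{
    int entries = 0;
    foreach (char c in json)
        if (c == '{') entries++;
    return entries;
}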
I think the problem with massive games is that their jumble of code ends up connecting a lot of seemingly separate things. Something we see as a small glitch could be an unimaginable tangle of code that is just waiting to break the entire game
I will say something dumb, but I think it's because they don't hire good programmers who play games; instead they hire activists that really hate gamers, or just lazy developers
@@theemperor4814 I don't think anyone who hates gamers or games would work in the industry, it's an absolutely insane amount of work not many people truly understand; but I wouldn't put it beyond an unqualified or untrained team being in charge of these things.
Having a novice audio or graphic designer be responsible for importing assets and forgetting to compress files, or having more junior developers write less critical code that is never cross-examined, could easily cause these issues.
@@zeelyweely1590 the thing is, they love games and hate gamers, but they want what you call diversity. Look at X (before Elon) having an excess of workers; that's the majority of the gaming industry now
@@zeelyweely1590 just look at Concord
u mean to say my GTX 1050 will run GTA VI if optimized well❤
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies not to optimize their games, because of the overreliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
Mojang needs to see this
Java Minecraft has so many issues stemming from the fact that it is largely single-threaded. Bedrock Minecraft, however, does not have this issue
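A sketch of what multithreading buys here, in the same C# used earlier in this thread (the chunk format and method names are made up for illustration):

// Toy illustration: generate "chunks" on worker threads so the main game
// loop never stalls on heavy terrain generation.
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ChunkLoader
{
    // Thread-safe handoff from workers back to the main thread.
    readonly ConcurrentQueue<int[]> finishedChunks = new ConcurrentQueue<int[]>();

    public void RequestChunk(int chunkX, int chunkZ)
    {
        // Heavy generation runs on the thread pool, off the main thread.
        Task.Run(() => finishedChunks.Enqueue(GenerateChunk(chunkX, chunkZ)));
    }

    public void MainThreadTick()
    {
        // The game loop only does the cheap part: integrating finished work.
        while (finishedChunks.TryDequeue(out int[] chunk))
            UploadToWorld(chunk);
    }

    int[] GenerateChunk(int x, int z) => new int[16 * 16 * 256]; // stand-in for real terrain gen
    void UploadToWorld(int[] chunk) { /* stand-in for inserting into the world */ }
}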