It's crazy to think what developers were able to pull off on the PS3 with such a foreign architecture. Uncharted, God of War 3, TLOU, Killzone, etc. You just don't see that nowadays.
How about WoW looking good and running fine on integrated graphics 15+ years ago. I don't mean the style, just that it ran fine, looked good and was fluid on bare-minimum hardware.
@@joee7452 All MMORPGs are CPU-intensive games. You can have a 4090 and still be at 30 fps in Dornogal, Valdrakken, Stormwind, or Orgrimmar at the lowest graphics settings on 1080p.
This is a dangerous feedback loop, where PC hardware is improving very slowly and by very little while games become more and more demanding, because optimization has been replaced with AI voodoo.
I still remember calling this out with the first gen of frame gen and people telling me I was just jealous because I use AMD. Vindication is a wonderful thing. We will never see a 1080 ti value card again.
Depends on whether you consider it obligatory to have RT/PT enabled to enjoy a game. If not, AMD has great value, and with RT off you don't really need upscaling (unless you aim for above-60-fps 4K).
As a gamedev, if my game doesn't run at 60fps+ at max settings on my 3060 12GB, that means I fucked up my job and I gotta go fix that shitty code and those shaders. I'm tired of these studios ignoring optimization; hardware improvements keep slowing down, leaving software optimization as the primary way to improve performance for all systems.
As a commission machine builder, it's equally frustrating having to tell a client that playing a new game they wanna get into, at a reasonable level, is gonna cost them a kidney for the machine to run it on! I ain't moving machines anymore because of it!
Agreed. It's kinda disgraceful how badly optimised most games are for the hardware they're supposed to run on. I don't wanna have to tell you how much it cost for me to buy a rig that could run Cyberpunk 2077 in native 4k at a locked 120fps & without dlss but I'm sure you can imagine lol. The cost of modern gaming is getting outta hand.
What I can see is that every game company is picking Unreal Engine 5 for a quick buck, so they don't need an in-house engine, and not doing any optimization themselves... All the games I've seen with performance problems are UE games!
Fixing tech debt is such a hard thing for companies to understand. It's not something visual that can make a company money; they always want new features. I think we're getting to the point where either we create some groundbreaking new technology or we have to optimize the software we already have.
Nvidia's GREED is breaking PC gaming by not already offering 16GB and 32GB of VRAM on $300 and $400 cards. As of 10 June 2023, 8GB of GDDR6 cost $27, so 32GB = $108; the small 4070 chips cost about $100 and the 4090 chips about $200 to make at TSMC. I mean, a 4070 32GB for $400 would be the right price. The board costs $30 and the cooler $14 at wholesale prices.
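For anyone who wants to sanity-check that comment, here is its bill-of-materials arithmetic as a small script. All the prices are the commenter's claimed figures, not official costs, and the total deliberately ignores VRM, PCB assembly, R&D, logistics and margins:

```python
# Rough BOM sketch using only the prices quoted in the comment above
# (GDDR6 spot price and die costs are the commenter's claims, not official data).
GDDR6_PER_8GB = 27           # USD per 8GB, claimed mid-2023 spot price
DIE_COST_4070 = 100          # USD, claimed TSMC cost for the small 4070 die
BOARD, COOLER = 30, 14       # USD, claimed wholesale prices

vram_32gb = GDDR6_PER_8GB * (32 // 8)            # 4 modules -> $108
bom = vram_32gb + DIE_COST_4070 + BOARD + COOLER
print(f"32GB VRAM: ${vram_32gb}, partial BOM: ${bom}")  # -> $108 and $252
```

Even taken at face value, this covers only part of a card's real cost, which is worth keeping in mind when judging the "$400 would be the right price" conclusion.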
@@arenzricodexd4409??? The usage of ray tracing in games wasn't yet a twinkle in Jensen's eyes when most raster techniques were invented. That's like saying that we invented biplanes because jet fuel was too expensive to produce.
PC already has tons of great games, though, and new ones not only keep coming, but most of them could run on a potato. Even if new GPU models stopped being made today, new games would keep being made at the same pace for at least another decade. And if games stopped being produced at all, the existing library is so vast that it would take a lifetime to dig through it without reaching the bottom.
@@evilbabai7083 Which good games are exclusively available for PC? There are no big and famous PC games; it just gets the console ports. Stop lying to yourself.
@@Thor86- Excluding or including those that were ported to consoles later, or those that were ported before but today are only available on PC? And I wonder if you count day-and-date multiplatform as "console ports". 😂 Also, "thinking big" is a console thing, where games are made to appeal to as wide an audience as possible, leaving little to no space for those who want something more unique. PC is home to the most acclaimed FPS, RTS, 4X, sandboxes, and simulators. Games are not afraid to be extreme here: hardcore or casual, deep or simplistic, with ultra-realistic graphics or no graphics at all.

But if you insist, let's start with the popular games that were born on PC and were either ported to consoles or had sequels there later: Doom, Quake, Wolfenstein, Minecraft, PUBG, Battlefield, Far Cry, Half-Life, Command and Conquer, Hitman, Diablo, Divinity, Baldur's Gate, The Elder Scrolls, Fallout, Witcher, Valheim, Palworld, War Thunder, XCOM, Stellaris, Cities Skylines, Civilization, Mount and Blade, STALKER. That's just off the top of my head, things I've touched personally. Same goes for games that are PC-exclusive today: Counter-Strike, Garry's Mod, DOTA, RUST, Factorio, Space Engineers, Elite, StarCraft, Total War, X4, Crusader Kings, Hearts of Iron, Men of War, Foxhole, ARMA, Squad, Ready or Not, Escape from Tarkov, Kenshi, Dwarf Fortress, Highfleet, Quasimorph and so on. Again, that's only what I've touched personally and can recall without looking anything up. My all-time favourite games on PC are Space Rangers 2 and Barotrauma; I've sunk hundreds of hours into them in a single breath.

You might say, "wait a minute, that list seems weird, I've never heard of most of those games", but that's exactly the point: I have tons of options to scratch my itches beyond mainstream games, and other PC players will have lists of their own. For instance, I'm not bringing up really old games like Vangers, Hard Truck 2, Heroes of Might and Magic, Commandos; super vast or obscure things like World of Warcraft, EVE Online, Star Citizen; any VR games, most of which are PC exclusives, like Boneworks or Half-Life: Alyx; or any mods and total conversions, some of which are basically different games (and some of which became their own games, like Counter-Strike, PUBG and DayZ). On Steam alone there are more than 100k games available and 50 games released every day, so even if 99.9% are utter trash or don't fit you, that's still more than a dozen games per year that you will enjoy.

At the moment, it's easier to count which games you can't play on PC, because: multiplatform is on the PC; PC has tons of games released on it (some of which later make it to consoles); most of the XB/PS exclusives are already on PC and will keep arriving there eventually; and PC is not split into generations, so once a game is there, you'll always be able to play it. That leaves out Nintendo exclusives and older console games in general, as well as older PC games with outdated DRM or specific hardware/OS requirements, and delisted games as well... and that's where emulation and the jolly roger come into play, but I'm not going to elaborate or my comment will vanish) Let's just say PC can run more XB/PS/Nintendo exclusive games than their modern consoles can.
It will definitely hurt their sales... the majority of people still use mid-range hardware anyway. It really does make games like Jedi Survivor, Dragon's Dogma 2, etc. underperform, and people don't mind skipping them. Indies are taking the spotlight because the games are accessible to anyone. Hades 2 happened because everyone could play the first game, even on Intel HD Graphics.
It'll fuck up sales. I've played every MH game in existence and I'm concerned... lol. I have a 7900 XTX and 7800X3D and idk if I'll be able to lock a native 60 at 1440p ultra. I bet I can't.
Those recommended specs for Wilds aren't mid-range anymore; they were mid-range when they released 6 years ago or whatever. My PC is mid-range: I bought it almost a year ago for ~$2k and it's above the recommended specs. And for those worried, if you look at the latest showcases, the game clearly runs at around 60 fps on a PlayStation 5. Btw, DD2's latest patch basically fixed the game; I play it at around 120 fps with a 7800 XT and Ryzen 7700.
@@albert2006xp If a game runs at 30FPS on the latest hardware, that's poor optimization. It's like trying to fill a 5L jug with 10L of water, then blaming the jug for its incompetence.
@@Delpnaz "Latest hardware" is vague. Also no game just runs at 30 fps on anything you could call "latest hardware", you made it run at that by setting it up in a way your card is not meant to run at. Trying to run render resolutions you aren't cleared for at max settings.
I find enjoyment in older games nowadays. Playing older masterpieces instead of newer, unoptimized cash-grab games is a breath of fresh air from the current market. It's only a matter of time before I run out of older games to play, but if it does come to that, perhaps it's time I take a break from gaming entirely and focus on my life more.
You won't run out of games, we have reached that point in time where there are more games to play than ever, even if you just take the PS3 era and work backwards to 8bit days. That's more games to last a lifetime. Plus yeah going out to touch grass and improve your life is a good thing.
You're more likely to have a wife and kids before you run out of older games to play. So, it's less about running out of games but running out of free time before you even reach the end of the backlog, lol
It's what I've been saying, mainly because it impacts me directly. So please do what you usually did before: pay employees, calculate and share ad revenue, keep sponsorships symbiotic. Isn't that just basic business? Really, it once again has a domino effect of unfairness and comparing yourself to others, which is hard to fix since it's a traumatic experience; what's worse, everything becomes chaotic and full of distrust.
The RTX 5050 has an amazing feature where it has a 50% chance of turning on. I think it's a cool feature because it will increase the lifespan of the graphics card in the long run!
@@kitajecafrica5814 And the reason for both is that consumers will carelessly spend their money on anything just because it's "new", whether there is any improvement/innovation or not.
Yeah, but Ngreedia isn't helping game developers either. Developers want to use VRAM to get "better and easier" optimization, but Nvidia keeps selling expensive cards without enough VRAM. Just look at the 5080 leak: 16GB for such a big price?
@@fall1n1_yt True, but I guess we can assume it will be $1200 yet again. And even if Nvidia is gracious and sells it for $1000, that's still way too high a price if the 5090 really has 32GB.
Enough OSes and games have been created that it's no longer just creation and destruction; there's a swelling of revival, because everybody knows all the best games have already been made...
Uh, no. The present situation is truly unprecedented. For example, in October 2001 the most powerful consumer GPU in the world was the GF3 Ti500, and it cost $349 at launch, which is about $620 in today's money. Nvidia lowered the price of their debut GPU, the RIVA TNT, from $199 to $149 four months after launch and more than a year before the launch of the TNT2. 'Can it run Crysis' was a running meme in the late '00s and early '10s, but the reality is that Crysis launched in a stable state, ran pretty well on reasonably priced hardware, and ran great on a two-generation-old flagship. I've played Crysis on a 7800 GTX, and the 7800 GTX was 2 years and 2 generations old when Crysis launched.
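If you want to reproduce that inflation adjustment, the standard method is to scale by the consumer price index. A quick sketch using ballpark US CPI-U values (roughly 177.7 for October 2001 and roughly 315 for 2024; check BLS data for exact figures):

```python
# Inflation-adjusting the GF3 Ti500 launch price with approximate CPI-U values.
# The CPI numbers are ballpark assumptions, not official figures.
cpi_oct_2001 = 177.7
cpi_2024 = 315.0
ti500_launch = 349
adjusted = ti500_launch * cpi_2024 / cpi_oct_2001
print(f"${adjusted:.0f} in 2024 dollars")  # ~$619, matching the ~$620 above
```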
Games look 15% better than in 2015 but perform 500% worse. Amazing progress, because it's totally important to have those 50k-polygon eyelashes in case someone zooms in on them with a microscope on an 8K TV.
It's not the games progressing faster than the hardware; it's the gaming industry regressing with bad management. Experienced devs quit to make their own studios, and inexperienced devs are expected to meet the same deadlines, so optimization becomes an afterthought. And besides optimization, they're also expected to implement shops, which have to be the first thing that works flawlessly.
People act like developers going rancid means all the human capital went to the gallows. Not how it works. The nice thing about legit devs leaving is more talent going to more creatively free, underdog institutions, perhaps with a chip on their shoulder about the big corpo management practices. So more great games (even if they’re not so glittery) that won’t be AAA prices.
Games already look good enough. I'd rather have games run nicely at well over 90fps than have 60fps as the standard. Developers spend way too much money on better graphics while the gameplay is TOTAL 🗑️
Yeah, but a lot of games, especially on console, aren't even hitting 60fps and usually run at 30fps. Luckily for us PC gamers, we have more settings to play with to get that frame rate up.
Here's what I thought. We have so many great games. I have unfinished Witcher 3, Kingdom Come Deliverance, Horizon Zero Dawn. I want to replay Skyrim with mods. I've never played Cyberpunk. I have a 4070 laptop, and I don't think I'll need an upgrade for a couple of years.
The GPU market is now mostly the AI market, with gamers as an afterthought, so games making hardware matter less would also make sense. I mean, what else can they even do with graphics at this point that people would actually notice and appreciate? I wouldn't mind if, in like 10 years, integrated graphics became the standard and the big-boi GPUs were only used for workstations. We can't go much smaller than 3nm, with 1nm being about the smallest possible.
@@Ven0mSRT Skyrim with mods is such a mood. I recently reinstalled it for the first time since 2018 and tried surfing Nexus for a bit, and I haven't been this eager to jump into a game since I finished BG3.
Let's be honest, current top-of-the-line hardware is more than enough for gaming. What we need is not better hardware but companies making good games.
Good news is there's a massive library of old games. And there's nothing stopping developers from using older engines and graphics techniques that looked almost as good. It's about the quality of the game, not just the graphics.
Yep, my RTX 3080 runs BFV at 4K ultra at 60+fps and it looks breathtaking, and then runs BF1 at 4K 120Hz. Yet it can barely hit 60fps at 1080p in some of these new games with shit like ray tracing baked in. I'll be playing older games for many years.
They're actually the hard work of other people; even the pics were taken by fansites that made a real effort to attend events. It wouldn't be right for me to take all the praise.
I only upgrade once every 4 years. I don't buy games at release; I get them later at a 50-70% discount, when they're more optimized, the bugs are fixed, and all the DLC is included. By then I can finally run them at high fps without upscaling.
Yes. Some of the best games ever were released between 2012 and 2020: an endless list of games that all run at 4K on a mid-range GPU. The problem is that if everyone did that, game devs would make even less money and modern games would get even more trashy. These devs need to focus on playability and stop trying to break new ground visually.
7:33 - the hardware is "slowing down" because gamers don't find new games as exciting as the games of the past, so they decide not to upgrade their GPUs just for the sake of graphics. Gamers yearn for more legitimate fun, more replayability, and innovation in games.
Honestly, I've got a huge backlog from 2014 to 2024, and I'm sure I'll have plenty more to add. There are so many games to play. Diversify your taste in genres and you'll have tens of thousands of hours of content to enjoy.
New is better: this year's games are better than last year's, and the games about to launch are better than the ones already out. Why do you think people watch the Game Awards? For the trailers. That's how the industry works. A game is good until it's out; GTA6 is the best game ever until it launches, and then GTA7 is the best game ever.
It's either older games, or letting released games mature for a year. They usually run 2x better after 10 patches, let alone if Denuvo gets removed after that year. I have many games on my HDD, maturing :)
Easy to say. I have a 1060, my PC is 7 years old, and next year is my 50th birthday, so I wanted a new PC for the next 7 or 8 years. But the prices are crazy, and between Intel screwing up the 13th and 14th generations and AMD screwing around with the 9000 series, even if I had the money I wouldn't buy any of it.
I'm still on my i7-920 w/1050 ti and 8GB RAM (used to be 12GB until recently when one of the DIMMs or slots died or something.) Will probably get a $2-2.5k prebuilt maybe around next September, right before MS drops support for Win 10.
Ghost of Tsushima, GTA5, Horizon Zero Dawn... so many games that have been incredibly well optimized! Now developers focus on ray tracing, DLSS and FSR and neglect optimization.
A big problem is that game developers are, firstly, less competent overall, but also that they're seemingly making games on the assumption that hardware improvements and costs are still progressing at the pace they used to. They're throwing insane amounts of detail and effects at games to produce eye candy, assuming everyone is on a 4K screen, while the actual hardware people have is barely better than a decade ago, when 1080p was the norm. They're also seemingly no longer treating consoles as the base spec a game should run on and then adding bells and whistles to the PC release; games are made for the top-end PCs that would have existed if Moore's law were still in effect. Meanwhile, on the low end, gaming has never been better. Because we're so deep into diminishing returns, low-spec machines like the Switch and Steam Deck, which either use PC low settings or get specifically and appropriately ported to, punch well above their weight.
1080p is still the norm. At least where I live, more than 70% of screens sold are still FullHD, only 22% are QHD, and outside the TV section fewer than 8% are 4K.
I take issue with the console-argument. You can't have your cake and eat it too - on one side, people were complaining when games didn't look different from the console releases despite better hardware capabilities. On the other side people are complaining now when devs put all the bells and whistles that PC-master race idiots have been asking for. Which is it? What is it players actually want? Not enough eye candy - players complain. Too much eye candy - players complain even more. There's simply no winning here.
Game developers are not any less competent, they're just less incentivized. It's true for all software development, really -- optimization is a long, difficult and costly process that typically doesn't earn the company any extra money (unless you're optimizing the internal company infrastructure), so it's the thing the companies will wanna spend the least time and money on. The upper limit to how poorly optimized a game can be is and will always be the hardware.
The "5nm" and "3nm" sizes are marketing nonsense anyways. The actual transistors have size of more than 20nm. Over the past 20 years they have become essentially 3D instead of just 2D allowing chip manufacturers to improve the density per area. So you will definitely see sub "1nm" cause these are just marketing terms.
@@gerardotejada2531 You say that as if the OP claimed the opposite... perhaps it wasn't obvious to you that it's a measurement, so you assumed he thought otherwise.
As an electronic engineer I can tell you what's going on here. The reason they say Moore's Law is dead has nothing to do with manufacturers not pursuing it; they are up against physics. It's not about improving technology, it's about running into brick walls in the physics, which is something you cannot change with technology. They're hitting problems like quantum tunneling and other issues at the quantum scale. Things are just too small, to put it simply. It's not something you can fix with some "new tech". What the manufacturers are not doing is coming clean about this actual physical limit they have reached. That's why you don't see huge improvements in recent generations: they say Moore's Law is dead because they cannot apply it anymore due to the limits imposed by physics. Any improvement you see now is about fine-tuning architecture (how their products function internally and how the sections interact), not about increasing actual performance at the individual transistor level.

Tunneling begins to show at 4nm (not 1nm like that reddit post said). What happens is that electrons simply appear on the other side of the transistor junction instead of passing through it; they literally disappear and reappear on the other side. It's one of the weird things in quantum mechanics. So manufacturers are simply increasing die size and adding more transistors, but the efficiency gains are eaten at the same rate the transistor count grows, due to tunneling leakage. On paper (especially for the shareholders) more transistors looks good, but it doesn't really yield more performance because of the problems at the quantum scale.

The reality is that now that they've reached these limits of physics, there's no possibility of the kind of substantial performance improvements we saw in the past. Back then, being able to make things that small was the limit: a technological, manufacturing issue. Now that the limit set by physics has nearly been reached, no technology or manufacturing process can overcome it. That's why they're called "laws of physics". Once you reach the limits those laws outline, that's it. Game over.
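For the curious, the textbook picture behind that comment (the standard WKB tunneling result, nothing specific to any foundry's process) is that the probability of an electron tunneling through a barrier falls off exponentially with the barrier's width d:

```latex
T \;\approx\; e^{-2\kappa d},
\qquad
\kappa = \frac{\sqrt{2m^{*}\,(V_0 - E)}}{\hbar}
```

where m* is the electron's effective mass and V_0 - E is the barrier height above the electron's energy. The exponential dependence is the whole story: shrinking an insulating barrier by even a fraction of a nanometer multiplies leakage dramatically, which is why tunneling goes from negligible to dominant over just a few process nodes.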
And game developers don't understand that things should move forward. We expect computers to be responsive at all times and more energy efficient. But I disagree that we can't see improvements; here's how to get them:
1. Ditch x86 and replace it with a more efficient architecture: something designed for pipelining from the start, with good code density, and whose instruction parsing is designed so later pipeline stages know what happens as early as possible.
2. Shared RAM for CPU/GPU etc., so the same data doesn't need to be moved around.
3. For high-performance computers, ditch ATX/PCI and the current structure of high-power desktops. How about a single board filled with chips/chiplets on both sides, with large heat sinks on both sides? That allows optimizing chip placement by distance and spreading heat evenly. Or an even fancier design: one board at the bottom, two boards raised from it to form a kind of "tunnel" with heat sinks on many sides and 180mm fans at both ends of the "tunnel". The ATX/PCI standards really aren't designed for current technology.
4. Optimized hardware for different tasks. We have hardware for graphics, tensors, video decoding, etc., and there's probably other common work that could be handled by dedicated circuits, so computers can be optimized for different types of loads.
5. The rest of the performance improvements can be done in software. To help that happen, the interface between hardware and software should be fully standardized and documented: developers can improve software when it sits on a stable platform. For experienced developers, focusing on portability has always been more important than focusing on performance.
You know what, maybe it’s a good thing. I’ve been upgrading computers for decades and it was not cheap. I have several computers and laptops that are less than 5 years old. I will just use these for the next 10 years and play older games and enjoy them without that urge to upgrade.
No it hasn't. Only clueless people on social media think so. Games are hitting their performance targets. You just can't accept what the performance targets are. For the large majority of people with GPUs ending in 60 and 1080p monitors that's 60 fps 720p render resolution.
@@albert2006xp I just don't understand why they complain in the US. They earn four times more on average than in my country, graphics cards are even slightly cheaper there, and somehow I can afford good hardware
I recently reinstalled Doom 3 with a resolution compatibility mod. I played it on an ultrawide OLED. Guys, this game is 20 years old, and to my eyes it looks genuinely BETTER than any of the upscaled shimmering blurry garbage we're getting today. Yes, it has low poly models and flat textures. But the art direction, purposeful lighting design, and ultraclean super sampled image quality makes up for it. No FSR/DLSS artifacting. No TAA blurring. No weird shimmering anywhere in sight. Just smooth, high framerate gameplay. Modern games just look so disgusting despite the massive technological leaps. What is going on?
Politely disagree. Cyberpunk 2077 on my new 4080S+7800X3D+1440p OLED screen max settings looks absolutely gorgeous as well as Metro Exodus almost as if looking through a window it's insane. Hopefully you'll see this beauty for yourself.
Yeah, it's refreshing to play older games from time to time without feeling like you're losing your eyesight. I hate games that force TAA; I'll take jagged edges over TAA any time.
I built a new main PC back in June using a Ryzen 7900 (no X), as I do a lot of Handbrake compression of DVDs and Blu-rays. I installed Windows 10 and everything else using the iGPU before taking the RX 6700 XT out of my older PC, but just for fun I installed a few games from Steam first (nice and quick pulling them over my LAN from the older PC). Half-Life 2, which is also ~20 years old, ran at 90-120 FPS (my TV's max is 120Hz) on the iGPU with just 2 RDNA compute units! Even with all settings on High (the maximum), the polygon count is still very low and the textures are very low resolution by today's standards. Yet that game is still great fun to replay today if you haven't touched it in the last decade or so.
I honestly believe that we hit ceiling when it comes to graphical fidelity. I honestly see no difference between games that are coming out now and the ones that came out 5 or so years ago. I don’t understand how are they getting harder to run when they look basically the same. We truly do live in a time of diminishing returns.
True. RDR2, MGSV and others look almost the same as newer games, plus these older games have better gameplay in most respects while being much easier to run.
We definitely hit diminishing returns in how games look, but not in how they behave and move. I still think lighting and field of view could be heavily improved upon, but I don't think our technology is there yet. Games still feel very flat. That's why I hope developers stick to strong art direction for the time being.
There are improvements; they're just smaller and cost more processing power. We're long past the 2000s and early-to-mid 2010s, when you'd see significant improvement every year, for several reasons. Also, making high-fidelity games is harder, and building an entire modern game engine is extremely difficult, which is why only a handful exist.
5:00 "games are progressing way faster" ??? WHAT They literally regressing... Terrible developers programming on nasa pc, dont even play own games and using just every library under the sun to not code something on own with 0 optimization to save money on development...
I mean, that's only a portion of games, and those are the greedy AAA ones. Indie and other studios (Saber Interactive, Arrowhead and FromSoftware) are putting out very good games like Black Myth: Wukong, Space Marine 2, Helldivers, Hades 2, Manor Lords and so on. 😮💨
Lazy software optimization isn't advancement. Games should be using Vulkan and be correctly optimized. Doom Eternal on id Tech is a perfect example of a correctly optimized game: my 3060 can run it at 1440p ultrawide with RT enabled.
Vulkan can be a mess. It's fine if your game is triple-A like Doom Eternal; if not, we'll see a repeat of OpenGL, where the majority of game development ended up favoring Nvidia hardware.
@@arenzricodexd4409 They maintained an engine that ran a smooth 60fps in the 360/PS3 era. What these guys do is nothing short of magic. I'd guess such a successful port to the PS3 was as much of a feat as a port to Vulkan is nowadays. I'm not exactly sure what they do or how, but it seems to have been done that way for many years already. It's a shame that we're only now starting to realize and appreciate the amazing technology that goes into a game.
It's a shame that ALL game devs favor DX over everything else... it's the most cost-effective option at the moment, and AI has helped them through the culling of software developers. Game devs don't need people who can tweak optimization or alternatives like Vulkan, because they rely on Nvidia, which targets Microsoft DX as the default option. I don't think AAA publishers will delay their games in favor of optimization anymore; AI upscaling is here to stay.
@@nickochioneantony9288 I honestly doubt it's the developers making that decision. Devs do care about what they produce. It's most likely management and shareholders who seldom care about anything other than profit.
@@nickochioneantony9288 Game devs don't use DX because of Nvidia; they use DX because MS provides a lot of tools and help when they develop their games with it. The Khronos Group only oversees the development of Vulkan; it doesn't provide extra tools or help to game developers. For that, developers have to rely on IHV support, and this is one of the issues. Another issue is things like extensions.
@why-m3g What are they going to do? Intel's APUs have been garbage for years, and when their graphics cards came out, they didn't even have working drivers for video games. Games just weren't working with their graphics cards, remember? Nvidia is competing with itself and there's no stopping it for now. The only reason AMD was able to claw its way back in the CPU market is that Intel got complacent.
@@bumperxx1 Driver support takes time. Intel's GPU hardware is already better than AMD's. Just give it time; they fumbled badly with CPUs and are currently scrambling.
DirectX was acceptable, but RT and DLSS and all that AI shit is unacceptable. Game devs use AI upscaling as an excuse to spend less time optimizing the game, whereas DirectX brought better graphics and better efficiency, which is good.
I still have a PC that runs on a 500W PSU. Also, you can run a 4090 and 14900K on an 850W PSU with plenty of overhead left for overclocking. But people think they need a 1600W PSU, which is only going to guarantee that their poorly designed 12VHPWR connector sets their house on fire when it starts melting.

If you have only as much power delivery capacity as you need, the PSU shuts off much sooner: as a connector degrades, its contact resistance rises and it dissipates more heat at the same current, pushing the total draw up. But if you have 750W of unneeded headroom, the draw has to climb an extra 750W before the PSU shuts off, by which point your house has already burned down.

The other concerning thing is that Nvidia specifies a max power draw some 40-50% in excess of what their upper-tier GPUs are actually observed to draw. This is dangerous because any protections on the GPU might not kick in until you exceed that max, by which point a hardware fault on the GPU could have the whole PC on fire. You see, the safety features on your PSU really only keep the PSU itself from bursting into flames, not anything connected to it, same as the circuit breakers in your house; there's a common misconception to the contrary. A PSU will keep feeding a short or an overheated connector until the power or current draw exceeds the PSU's own limit.
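To put rough numbers on why an oversized PSU won't save a bad connector, here's an illustrative I²R calculation. The resistance values are assumptions for illustration (lumped total path resistance, not per-pin specs), not measurements of any real cable:

```python
# Illustrative I^2 * R heating at a 12VHPWR-style connector.
# Resistance figures are assumed for illustration, not measured specs.
current = 600 / 12             # a 600W GPU on a 12V rail draws ~50A in total
for r_ohms in (0.001, 0.010):  # healthy vs. degraded lumped contact resistance
    watts = current**2 * r_ohms
    print(f"{r_ohms*1000:.0f} mOhm contact -> {watts:.1f} W dissipated")
# 1 mOhm -> 2.5 W (fine); 10 mOhm -> 25 W (enough to melt plastic), yet a PSU
# whose over-current protection only watches total draw never notices either case.
```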
@@WeatherMan2005 Yes, but... you can build a budget/low-mid-tier PC right now, with parts at most a gen or two old, that will run circles around the best thing you could build 10 years ago, while consuming like half the power, for around 500 dollars.
I just finished my download session for COD BO6 (60GB including Hardcore mode and the new maps), plus an additional 24GB of Nvidia shader/texture preloading, all because I updated my 3090 Ti drivers. 😅
The devs don't really have much of a choice. Most of the unoptimized games releasing today are from AAA companies that are slaves to the deadlines created by the shareholders.
@@RedWolfenstein I had Nvidia cards from 2006-2022, and ATi before that; my first AMD card has been the 6700 XT. Maybe the drivers were mature enough by then, but honestly, for the price I paid at the time ($325, and it came with two games), the experience has been even better than what I had with Nvidia. Granted, I was coming from a non-RT, non-DLSS card, and that's really where Nvidia is better these days, but I actually really like AMD cards. I mean, I'll be open to whomever when the time for an upgrade comes, but I stand by the "buy whatever is the best deal in your price range" mentality.
@@Torso6131 I have this card with the 5700X because I thought it would be enough. I'm thinking of upgrading because it's hard to compete anymore when your PC can barely run the game.
@@account-2239 A 6700 XT? It's been fine enough for me in anything competitive, but I also have it paired with a 7800X3D now (I had an i7 6700K before; what an upgrade lol). But I'm mostly sticking to stuff like CS2 and TF2, maybe a little Overwatch every now and again. What are you playing in multiplayer that the 6700 XT isn't enough for, if you don't mind me asking? I've been sticking to single-player games as of late (Baldur's Gate 3 has consumed my last year, Alan Wake 2 behind it) and it mostly ran those quite well.
We don't like the increased prices, but we dislike cards without DLDSR+DLSS and with poor RT even more, especially when they're not even at the half price they'd deserve to be and instead want to charge close to Nvidia prices.
Not sure if you're going to see this, but given how well regarded the 1080 Ti is, I decided to check how much performance improved over the 10-year period from 2006 to 2016. Back in 2006 Nvidia launched the world's first modern graphics card, the 8800 GTX; it was twice as fast as any other GPU available at the time and it only cost $600. Literally one year later Nvidia die-shrank the chip from 90nm to 65nm and released it as the 8800 GT, selling it at $350 for about 90% of the performance. Things were advancing so fast back then that by 2009 you could buy a GPU from either AMD or Nvidia that matched the 8800 GT's performance for $100.

Now, comparing the 8800 GTX to the 1080 Ti to see how much progress was made in 10 years, it turns out the 1080 Ti is at least 15x faster than the 8800 GTX. But comparing the 1080 Ti to the RTX 4090 in pure rasterization performance, the 4090 is only 3.3x faster than the 1080 Ti. Those are terrible numbers, especially on price-to-performance; after all, we never got 1080 Ti performance for $100. And I doubt the 5090 is going to change these terrible numbers much either. So yeah, this hobby is going to end up getting killed by stupidly high prices, terrible performance gains, and terrible developer optimization.
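Taking the comment's own multipliers at face value, the implied compound annual growth rates are easy to check (the 15x and 3.3x figures and the year spans are the commenter's claims, not benchmark data):

```python
# Compound annual growth implied by the multipliers quoted above.
# The multipliers and year spans are the commenter's claims, not measurements.
def cagr(multiplier: float, years: float) -> float:
    return multiplier ** (1 / years) - 1

print(f"8800 GTX -> 1080 Ti: {cagr(15.0, 10.0):.0%}/yr")  # ~31%/yr over 10 years
print(f"1080 Ti -> RTX 4090: {cagr(3.3, 5.5):.0%}/yr")    # ~24%/yr over ~5.5 years
```

Interestingly, the per-year growth rate didn't collapse as hard as the raw multipliers suggest; the bigger change is that price per frame stopped falling the way it did in the 8800 GT era.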
I upgraded from a 1070 this year, mostly because I needed 1440p for YouTube shenanigans... meanwhile there are people who bought a new GPU every damn year for whatever reason, lmao. It still played damn Cyberpunk at medium-to-high settings at 1080p, 60 fps.
There are limits to hardware development, but some gamers and game developers should take their heads out of their asses. They must be crazy to think it's normal to have a several-kilowatt rack of water-cooled hardware in the next room just to run a game. The sanest way to look at hardware and software progress is to compare what we get at the same wattage.
@@gruntaxeman3740 Yup, good point. As far as I'm concerned, 250 watts is as high as a GPU should ever go; the fact that the RTX 4090 has a TDP of almost double that is insane to me. Unfortunately, using performance per watt as a metric makes the new GPUs look even worse. As for the terrible state of game optimization, fun fact: I absolutely blame Nvidia for that, since they're the ones who started pushing the stupid DLSS AI upscaling and temporal anti-aliasing that these incompetent game devs have been abusing as a replacement for actual optimization.
@@flydeath1841 About that much, yes. I compare by computer PSU; for me, the upper limit is 750 watts. That means at peak load, with everything continuously at 100%, the system should use at most 50% of that, to leave room for PSU aging and to keep it stable through peak currents. I'm currently planning my next computer and doing some engineering here: I start from the PSU and pick something with the lowest idle watts. I also run the computer at full load for long stretches, rendering and sometimes compiling software, so I see it as a 375W heater. Cooling is the only way to remove heat from the enclosure efficiently. Will my apartment turn into a sauna, or should I invest in air conditioning too? While I can build a system reliable enough at that power limit, it may still be inconvenient. And I consider this very high power, as my current old computer's PSU is only 240 watts (a 30-watt GPU is sweet).
Developers don't care to optimize their trash. Modern games may look more realistic, but they look worse than so many older games; the chase is pathetic. I'm gonna rock my 970 till it dies. I can run anything I want at 60fps at 1080p, and I feel no need to upgrade graphics; I only do when I have to. I was happy with 1024x768 until I couldn't get a new monitor in that aspect ratio.
It's like what has happened in the auto industry. Prices are sky high and quality is not keeping up with price. What we need is better value at lower prices. As for computer graphics, I'd say we have decent graphics quality right now. The problem is more a lack of quality games. Games are being pushed out half baked or not even baked.
@@5-OTT.s-ON-A-KICK There's nothing wrong with upscaling as such; it's simply another technique for making a game viable across a larger variety of players, and FSR and DLSS are really just AMD's and Nvidia's formalizations of a technique that's been available to game developers for a long time. Games are on the one hand a creative effort and on the other hand technical, where devs should optimize for a sweet spot between performance and quality. We don't need ray tracing, etc., to have a good game; that's mainly the industry trying to find ways to justify new GPU purchases.
@@SG-js2qn Well, now we have companies using DLSS to make their GPUs seem faster. The 4050 is slower than the 3050 and only faster with DLSS on. What's the point of buying a game when the devs make an optional feature mandatory? I'm pretty sure DLSS was made either to make games playable at 4K or to get more performance out of an old GPU. Now it exists so new GPUs can hit playable fps at 1080p.
The thing with the AMD Ryzen 7 7700X and 9700X is that the older one is a 105W CPU running higher base clocks, while the newer one is a 65W CPU. This means that while performance is very similar, near identical, huge optimizations were made for energy efficiency, and that's a big difference. The 5nm vs 4nm process surely helps.
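The textbook relationship behind that trade-off is that a CPU's dynamic (switching) power scales with capacitance, voltage squared, and frequency:

```latex
P_{\text{dyn}} \;\approx\; \alpha \, C \, V^{2} f
```

where alpha is the activity factor and C the switched capacitance. Because higher clocks also require higher voltage, power grows much faster than linearly with frequency near the top of the voltage/frequency curve, which is why shaving a few hundred MHz of base clock can cut the TDP from 105W to 65W while barely denting benchmark results.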
or you could lose a little performance and undervolt the 7700x to become more energy efficient (not sure why anyone would do this but it is possible) .
@@VintageCR The technology itself is very useful for applications like laptops, small PCs, thin clients and such. While that doesn't directly fit high-end gaming, people still use those devices to play games. I don't think the 7700X can be undervolted to 65W without reducing clocks and disabling boost, as it's made on TSMC's 5nm process; performance would suffer a lot. The 9700X, however, is 65W out of the box and could probably be undervolted even further. I don't disagree with you; I commented because this channel used these processors as a showcase of minuscule gains, missing the point completely. I just don't like misleading content. Sure, we can take one metric and chew on it, but these processors aren't the sweet spot for pure gaming anyway. It's a half-blind approach.
@@jarivuorinen3878 No, you're right; I may have worded my comment misleadingly. My point was basically: why even manufacture a higher-powered CPU when the next gen (or not even the next gen) is coming packed with better efficiency? It's such a waste to keep producing multiple products (CPUs), with the added wafer defects, when they offer only marginal performance increases.
@@VintageCR There is some news that AMD will stop producing the 7000 series soon; TSMC makes the chips and their costs have risen. Sorry, I can't find the source right now, so take it with a grain of salt. I also didn't mean that you misled anyone; I meant the channel, the content creator, used those processors as a bad example.
Hey guys, just so you know, the reason the Ryzen 9000 series barely improved on the 7000 series (sorry if I'm butchering the names) is that Windows was actually bottlenecking it. It was a Windows issue, not an AMD issue, and it was fixed a few weeks to a month ago, though Linux STILL posts higher CPU benchmarks with AMD than Windows does.
An element you left out, and the actual problem right now, is the modern AAA game dev... they don't code anymore or have a deeper understanding of computer science. Modern game devs use SDKs like Unreal, Unity, etc. that let them build AAA games without having to code 95% of it. The end result sold to the end user is a bloated, one-size-fits-all codebase the actual game doesn't fully need, which makes the hardware work a lot harder to achieve the same results it would get if optimized properly. Optimizing games requires the deeper computer-science and coding knowledge a typical modern game dev doesn't possess. When built right and optimized even moderately, there's no reason a mid-range GPU can't put out max settings for a brand-new AAA game at high frames.
The gaming industry has shifted its budgets to Hollywood stars instead of game developers; I think it's clear to see. To think that I cheered for Keanu Reeves... and then Jedi: Survivor ended up a mess of a game because they were busy paying Cameron Monaghan instead of the Respawn team.
As a student in college for game development, I very much appreciate working with Unity. Yes, there's less coding for stuff like collision or handling movement, but only coding 5%? Nah. If you aren't using assets for menus or movement, you still need to code a lot of that yourself; the engine just makes it easier to connect everything and handle it. It's like a house: Unity or Unreal put up the foundation and the roof, but YOU need to add the walls, furniture, plumbing, doors, etc. And it DOES lower the barrier to game development. So many fantastic indie games use Unity or Unreal and run flawlessly: Cuphead, The Messenger, Sea of Stars and Ori and the Blind Forest all run on Unity, for example, and all of them run on my old tower with a GTX 970, and on Switch, with little to no issue. That being said, optimization is sadly not as widespread as it used to be, for sure. But engines like Unity or Unreal are not the problem; it's how they are used. And hardware is seldom to blame; it's the lack of time, care or experience of the developers.
5% of coding? Where do you people come up with these headcanons? It took me two months to make my first indie horror game in Unity, and the coding part was literally 90%. Stop pretending to be a game dev.
Dude, it's like Moore's law but in reverse! Our hardware is improving at half the pace it used to, but it costs twice as much! Exponential... decay... Nah, "Moore's law is dead" because it's more profitable for you to make us believe so.
I would say something like an RX 6600 is just enough for 1080p. Who cares about 4K? I don't even notice a difference when I use a 4K monitor unless I squint really hard.
Similar situation: if it can run Total War, Space Marine 2, Elden Ring, and Armored Core 6, it's all I need for years to come. I've only played Space Marine 2 from that list... it got me back into gaming.
I remember buying a little GTX 950 back in the day for just 150 USD, and I was set to play at 1080p, high settings, 60FPS. Now? Budget cards are basically nonexistent.
It's also a shame about the Ada Lovelace GPUs that there's an RTX 4050 laptop chip but no RTX 4050 desktop variant. Compare that to Turing, when we had the GTX 16XX series as budget options alongside the RTX 20XX series.
I still have my GTX 970. Wonderful little card; it served me fantastically until I got myself a 1080 Ti, and it's still working just fine. But you ain't getting more than 20fps out of that little guy in modern games at 1080p, where I used to get 100+ fps in games like The Crew.
For those wondering about budget GPUs, there is the RTX 3050 6GB with a shit memory bus for about $200. There... you get the "RTX" brand without any capability to actually run ray tracing. That's how low Nvidia has sunk in terms of consumer respect. I don't even have the words to tell you how underwhelming the RTX 3050 runs... it's just very disappointing. I remember the 1050 being a beast, and then the 1650 Super taking budget builds to the next level.
@@nickochioneantony9288 Exactly... Nvidia's current "budget" option is the 4060, right around 300 dollars depending on where you look... 5 years ago that was used 1070/1070 Ti money.
I built my PC back in the beginning of 2023 with an RX 6600: great 1080p card and relatively cheap. I recently upgraded to an RX 7800 XT, and because I'm still gonna stay at 1080p for a while, I thought, "well, this is a 1440p card, so it should be pretty overkill for 1080p and last a long time." The card is already struggling to run some recent releases. I played the Silent Hill 2 remake at 1080p native, everything maxed out minus ray tracing: an average of 70fps but with constant stutters, and in some extreme (albeit rare) cases my fps dropped to 45-50. And I know what you might say: "well, yeah, you had everything cranked to the max!" But dude, it's a 7800 XT at 1080p, and I wasn't even using ray tracing. SH2 is gorgeous, for sure, but it's also mostly closed environments with low draw distances outside because of the fog; this game should be running way, waaay better on this card. There are other cases too: I'm also struggling to maintain 60fps in Final Fantasy 16, and as beautiful as it is, it looks less complex than SH2 on a technical level. This industry is fucking cooked; games have become 100% dependent on upscaling and fake frames, both with noticeable downsides. Expensive-ass hobby.
That's why you always have to be vigilant when hunting for a reasonable bargain. When the RTX 3090 FE first came out through BB stores, the price was set at $699.00 for months; at that time, Nvidia CEO Jensen Huang didn't believe he could get customers to pay much more than $700.00. Micro Center has come through once again for me with a 3090 Ti @ $700.00 (backroom *open box).
AI upscaling was supposed to let video games be played at 4K. We've been talking about 4K gaming since the PS4 Pro; I should not be seeing "1080p" in a game requirements table in 2024.
Problem with this is that 4K monitors with decent refresh rates end up costing quite a bit. A lot of people would rather stick with 1080p or 1440p at 100+ fps than spend an arm and a leg on a 4K monitor that will most likely only do 60.
Gaming has gotten too big too fast. It has to balance itself out by shrinking a little: AAA titles must get smaller, and people should stop overrating them and give more chances to indie titles.
This is IMO very true. AAA titles take long years and billions of dollars to create, so they almost have to be generic to be safe bets for investors. And hardware became more expensive partially just because of demand as well.
Maybe try VR; there are a lot of smaller-scale games which are innovative and fun with the VR tech (Quest 3 & PCVR). They're not too taxing on hardware either.
Most games could afford to be shrunk down; I wish companies would ease off on quantity over quality. I hope semi-open and open-linear titles become more common. The next Dragon Age game seems smaller in scope than the previous one, so I'm excited, since the previous game is why I developed open-world fatigue. I think indies get their fair share of love and hype from people; it's the AA and smaller AAA games like Prince of Persia, Astro Bot, Hi-Fi Rush, and Unicorn Overlord that tend to get overlooked.
The newest games these days feel so unoptimized that AAA studios are becoming reliant on upscaling and frame generation instead of optimizing games to run better on older hardware. The two games I've been playing are Red Dead Redemption 2 and Forza Horizon 5, and I bring them up because if you compare Red Dead Redemption 2 to the newest games, you'll notice it looks and performs way better. Forza Horizon 5, meanwhile, is the most optimized game I have ever played: you can use old graphics cards on it and it still runs decently well. Basically, I'm saying game development shouldn't rely on upscaling and frame generation instead of optimization; upscaling and frame generation would be far better used to help older graphics cards and even consoles. I blame the higher-ups in game development for these problems, and I say it's better to delay a game to give it more time to be optimized for better performance.
I run everything I play on 1080p native on a 40 inch tv... No FSR, no DLSS, no ray tracing, no motion blur... If I can't run a stable 60fps like that with max or almost max settings, I don't need it... Kinda makes sense now that the last AAA game I bought was Fallout 4 in 2016...
Everybody brings up Red Dead 2 as some super-optimized masterpiece. Which it is, but the reality is that the game will still destroy a 4090 at 4K when maxed. And the 2080 was the best card when it launched on PC.
@@mojojojo6292 Incorrect. It can get around 100 fps at 4K ultra max settings without DLSS. Even my 1070 can play it at 60 fps at around medium-high settings.
11:20 When I was in college a few years back, quantum tunneling was theorized to start being an issue at 2nm. If it's now only an issue at 1nm, that's a pretty notable improvement.
@@raven679-m3i Ray tracing is available even on the PS5, and that upscaling stuff is just a marketing gimmick; if they wanted, they could add it even to the PS5 with just a software update. It's not that big of a deal.
But if they sold more reasonably priced normal chips to people who can actually afford them, wouldn't they make more than by just selling stupidly overpriced AI chips to corps?
@@AMDRyzen57500F Because of the demand from AI companies, supply will continue to be lower than demand, and in any market where demand is high and supply is low, prices will be high.
@@AMDRyzen57500F The pricing is very different; the margins are hugely greater for corps. Even if they sold vast amounts to consumers, it might not make a difference. I may be wrong.
There's a channel called @HighYield that goes into a lot of the nitty-gritty of process nodes and such. He did a recent video about gate-all-around architecture, and what I take from it is that for progress to remain exponential, innovation must now also become exponential. Of course, that's pretty much impossible without a steady linear increase in IQ, but they're making more profit margin than ever, so R&D could compensate somewhat. As you said, progress will slow down but not stop.
Machine learning denoising, basically. IIRC DLSS 2-3 and FSR 2-3 are pretty similar technologies, but DLSS and XeSS have hardware-accelerated machine-learning denoisers. Machine learning is pretty cool; I had a work project where my team was coding a machine-learning tool for our sales people to use (I was a chemist, so learning to code, not my full-time gig though), and it was pretty neat: you train it on all the old data, it makes a "decision" based on that, and in theory it only ever gets better. Once AMD allocates hardware to machine learning for FSR, I bet it'll make a fairly sizeable leap in upscaling quality, and it might even boost performance, since the GPU won't be spending as many async compute resources on upscaling. Of course, then you're taking up hardware space that could have been used for general performance, so maybe it'll be a bit of a wash. I think that's why DLSS generally outperforms FSR slightly on Nvidia cards.
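To make the idea concrete, here's a toy learned upscaler in the same spirit. This is a minimal sketch for illustration only, not any vendor's actual network (real DLSS/FSR-style pipelines also ingest motion vectors, depth and frame history): a cheap bilinear upscale plus a small convolutional network that predicts a corrective residual.

```python
# Toy single-frame learned upscaler: cheap upsample + learned residual.
# Illustrative only; real DLSS/FSR-style pipelines are temporal and far larger.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUpscaler(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.scale = scale
        self.refine = nn.Sequential(           # tiny detail-recovery network
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        base = F.interpolate(low_res, scale_factor=self.scale,
                             mode="bilinear", align_corners=False)
        return base + self.refine(base)        # predict a residual over the base

frame = torch.rand(1, 3, 540, 960)             # a 960x540 render target
print(ToyUpscaler()(frame).shape)              # -> torch.Size([1, 3, 1080, 1920])
```

Training would minimize the difference between the network's output and native-resolution frames, which is exactly the "learn from old data" loop described above.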
It doesn't help that games are coming out with graphics that are far past the point of diminishing returns, with awful usage of TAA and upscaling that makes those improved graphics basically pointless.
It would be better to focus on strong art direction. I've seen people crap on the new Ghost of Yotei, but it has amazing draw distance, artistic lighting, and atmosphere. Horizon Forbidden West still looks better than a lot of current-gen-only games and doesn't need all that fancy upscaling. Even games on my Switch can look fantastic when they work within the hardware's limitations.
As a Taiwanese, I appreciate your sharing. Taiwan number 1! However, just a reminder that long working hours have led to burnout and physical health problems for TSMC staff. When you hold your graphics card, remember it really comes from the blood, sweat and tears of TSMC. I hope they can form a union.
@@paulboyce8537 Taiwan is not the West; they're trade partners with it. Officially, they are part of China. As soon as we have our own foundries, they will be cast aside.
I feel like, same as with phones, computer chips are reaching their ultimate form, where they've hit their full potential and just can't get any better. Time to invent new technology (or new physics) 😂
Personal opinion: we have reached a cap for 1440p and even 4K gaming. Who needs more than 175fps at 1440p? Who needs more than 80 at 4K? What is there even left to improve? The next step is VR and the insane resolutions those glasses require; until that happens, we'll be stuck in the current swamp. Edit: the bad game performance is not a hardware issue but a lack of optimization from devs trying to save money.
@@gurnery4219 I get that, but correct me if I'm wrong: don't esports players run these titles on the lowest settings with no shadows? Even a 4070 Ti Super should push 200+ at 1440p that way.
@Scorptice Tbf, shadows are a bigger thing in CS2 atm than they used to be in CS:GO. Personally, 60fps is perfectly fine for single-player games, but for competitive I want as much as I can get. I don't feel graphics look nicer enough to justify the performance hit most of the time. I can't see 8K taking off for another decade, until TV companies make it affordable, and even then, do we even need it? We've definitely hit some sort of cap, and I agree to an extent.
The problem isn't that hardware is slowing down; it's that optimization for games has gone out the window. CPUs and GPUs run Warframe at 1000 FPS and it looks great at 4K. It's just that the newer game engines are horribly optimized.
My son is still using my 2017 GTX 1080 Ti build for Warzone and Rocket League at 1440p high settings, over 120fps. Me, I'm using GeForce Now 🙄 on my MacBook Air.
My son plays only Minecraft and Roblox; he's never wanted to play anything else. He gets a locked 1080p75 at all times on the GTX 750 Ti. I ask him every couple of months if he wants a better graphics card and he's like, dude... what's wrong with mine? Lol
Soon cloud gaming like GeForce Now will become more common. A friend of mine already moved to it because he didn't want to keep playing catch-up with the hardware; one year of the subscription is cheaper than a 3050 6GB.
@@arenzricodexd4409 Bro, the input lag though. It will never be solved, because of propagation and equipment delay; even in the same house as the server, with expensive equipment, there's delay. I can see it strictly for single-player games, but you cannot play a single FPS with remotely competitive input lag. Plus the image quality has compression artifacts, like watching Netflix etc. I think single-player games on handhelds are where it will excel: I imagine streaming Cyberpunk at 1080p with path tracing to a handheld would be a 10x better experience than running it locally on the handheld. Might actually be a great use of the tech. It can't be universally useful, unfortunately.
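A rough latency budget shows why the lag floor is real. All the numbers below are assumed, typical-order-of-magnitude figures, not measurements of GeForce Now or any other service:

```python
# Back-of-envelope cloud-gaming latency budget (all figures are assumptions
# for illustration, not measurements of any specific streaming service).
FIBER_KM_PER_MS = 200          # light in fiber travels ~200 km per millisecond
distance_km = 500              # assumed one-way distance to the data center

budget_ms = {
    "network propagation (round trip)": 2 * distance_km / FIBER_KM_PER_MS,
    "video encode + decode":            8,
    "server render + frame capture":    8,
    "display + input sampling":         10,
}
total = sum(budget_ms.values())
print(f"~{total:.0f} ms added on top of local play")  # ~31 ms in this sketch
```

The propagation term is the only one fixed outright by physics, but the equipment terms never reach zero either, which is the commenter's point about competitive FPS play.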
@@christophermullins7163 Personally I can't imagine myself going the way of the cloud. But for more casual gamers out there it might become a more sensible way to play games from time to time, rather than spending big bucks on a dedicated PC or console.
@@arenzricodexd4409 I imagine the remote gameplay client could easily run on TVs out of the box. (Maybe some do already?) Something like a "Steam TV" or "Xbox TV" that you can play any modern game on. Also, remote gaming on a handheld would give far better battery life than running natively... every modern device becomes a gaming device. The tech has its place for sure.
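To put some rough numbers on the input-lag debate above, here is a minimal latency-budget sketch. Every figure is an illustrative assumption, not a measurement of any particular service, but the structure of the sum is why streaming can feel fine in single-player and hopeless in competitive shooters:

```python
# Rough cloud-gaming latency budget; every number is an illustrative guess.
stages_ms = {
    "input to server (one-way network)": 15,
    "server render + video encode": 12,
    "video stream back (one-way network)": 15,
    "client decode + display": 10,
}
added = sum(stages_ms.values())
print(f"Latency added on top of local play: ~{added} ms")  # ~52 ms here
# The two network legs never go to zero, no matter how good the hardware is.
```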
As someone who studies economics, the higher prices make sense from both a political and an economic standpoint. In fact, it's important to have higher prices when there are shortages (higher profit margins), because those profits can fund more R&D investment to meet consumer demand. And the consumers aren't just us gamers; they need to supply other applications outside of video games too. More competition is important, but for such a complicated manufacturing process, it'll be a while. However, the USA subsidizing manufacturing doesn't really help in the long run. In the long run, they really need to reduce taxes and regulation: lowering the cost of production matters much more than throwing money at the problem, especially for increasing competition, since more competitors can better cover shortages.
@@George_C-e4v I just don't buy new games that are too demanding. If I can't run a game at native 1440p above 120 fps on high settings, I buy it later, once I have a much more powerful PC.
Not to mention newer games forcing RT/Lumen to be enabled at all times, tanking performance on low-end GPUs; some cards can't run them at all because of mesh shader requirements 😂
If everyone were to skip multiple generations, component sales would plummet. Then Nvidia and AMD would pivot even more towards their corporate customers, and PC gaming would get less innovation plus higher prices. Developers crafting better games that run on old hardware is the only hope for the future. For now, playing older games is the obvious option.
It is of course a pure coincidence that these annual price increases happen at a point in time when ALL products are made by A SINGLE manufacturer (TSMC)...
Get an Intel GPU so they can speed up driver improvements. They perform great when they run smoothly; only certain old games have issues, since Intel doesn't have a long history of driver work. In a few years, Intel will have better GPUs than AMD. Hopefully they actually compete with Nvidia, because AMD basically bent over and gave up when they announced their high-end GPU plans failed, which is actually why they won't be releasing high-end cards.
The sad thing is that most people out there have NO IDEA about real price-to-performance and will spend, even take out loans, to get these cards. The delusion of the last three generations has been incredible. Nvidia, Intel, Asus, etc. know this.
One thing that will likely improve chip performance massively is photonic chips, as they will bump up clock speeds without increasing heat output. But we won't see this until the mid-to-late 2030s. We are getting glass substrates by 2026 to replace organic ones, and glass substrates can be used with photonics.
Literally makes no sense. Games cater to the majority of players. Even a 1060 can play new games nowadays. Consoles also exist and aren't stronger than a new PC (though they can be stronger than a PC from 4 years ago). The problem here is with players. They simply don't understand what their system is capable of and what resolution and settings it's made for nowadays.
It is not that games are progressing at a faster rate than hardware; it is that copy-paste engines like Unreal are being used to spew out games as quickly as possible without having to optimise them. As long as it works, it is released.
Not every game has to be a hyper-realistic showcase that requires a 4090 to run. Battlefield 1-level graphics from 8 years ago are still perfectly acceptable.
I literally bought the 1070 for around $550 or so plus tax at Micro Center… months after it came out. So saying $380 is so fundamentally and patently false that it has to be a deliberate lie.
As soon as frame interpolation was made a thing, it was only a matter of time until developers started relying on it to hit playable framerates. That time has already arrived.
In the US prices don't differ much, but here in South Africa a 4070 Super costs $850 on sale and a 7900 GRE costs $750 on sale; that's a huge price difference. The 4090 is $2650 on sale... That's not even affordable; I can buy a car for less.
@@ElRabito The 4090 has not been discontinued just yet, so stock hasn't diminished and the price hasn't yet changed because of that. $2650 is a very cheap price for a 4090 in South Africa.
I wish more games would opt for a cartoony art style. Not only does it age better, but it also upscales significantly better when needed. Wind Waker, running at 480p with a CRT filter on still looks pretty dang good. Team Fortress 2 at native 4k looks incredible and came out in what, 2007? Dunno why Borderlands is so GPU heavy these days though. Guess it's that Unreal backend. Maybe it's the lack of "fine detail" but they look great even at lower res.
Valve is working on a new IP, "Deadlock", and the graphics look pretty simple, nothing crazy. Honestly, for a MOBA-shooter hybrid it's the most fun I've had in a comp game in years. Tbh I hate ultra-realistic-looking games with motion blur and a lot of post-processing. CS:GO was the best: grungy and dark like most Source games, but not too realistic, especially in its physics.
@@supawithdacream5626 Yeah, I started playing around in Deadlock a little bit. Kinda reminds me of Monday Night Combat from back in the day, which is a good thing. Unfortunately I don't have a ton of time to dedicate to multiplayer shooters like I used to though.
@@Torso6131 yup I feel that, some of the best years of gaming are already behind us anyways if I walked away from modern games I wouldn’t regret anything tbh
@@supawithdacream5626 I basically play three to four games a year at this point. Baldur's Gate 3 was incredible, Tears of the Kingdom as well (even if it felt a little too similar to Breath of the Wild), but multiplayer stuff has just passed me by. It bums me out that more games aren't using custom engines anymore. I feel like UE5's sheer heavy backend is causing so many issues these days.
I purchased my 3080 for $699 on launch day at Micro Center. This is just greed, plain and simple. They used both Covid and shortages to justify price increases and never went back.
As GPUs get more advanced, devs have more incentive to be even lazier with optimization. If this trend continues, in a few years you'll need a 4090-level GPU to run new AAA games at 1080p low, 60 FPS (with frame gen). I'm frankly sick of these badly optimized titles. It makes you want to quit AAA games entirely and just play indie games.
Why would you run native when you can run DLSS and make a game more demanding and graphically intensive? Without DLSS (and upscaling on consoles as well) a game like Alan Wake 2 would have to release in the year 2030 to run.
@@dansmith1661 It's their most successful game and they managed to fund a risky cult classic project most gamers are too dumb for and will make their money back. We would rather play better games now, none of us know when this life is over.
Underrated channel. I watch more of this guy's videos than almost any other channel, AND his channel probably has the fewest subs. How does that happen?
Optimization is dead. This is when you remember that GTA V ran on a PS3...
The PS3 was a powerful system; that's why it was expensive.
@@hercufles Yes, it was powerful, but it was also incredibly difficult to develop for, which makes GTA V running on it as well as it did even more impressive.
And TAA blurriness in motion everywhere.
Did you know TSMC's 5nm and 3nm chips only cost about 5 percent more? So why do the GPUs go up 40 percent?
Exactly
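Back-of-the-envelope math on why a small chip-cost bump can't explain a 40% card-price jump. The BOM share below is a made-up illustrative number, not real cost data:

```python
# If the GPU die is only a fraction of the card's total cost, a 5% die-cost
# increase moves the whole card's cost by far less. Shares are assumptions.
die_share_of_card_cost = 0.35   # assume the die is 35% of total card cost
die_cost_increase = 0.05        # the ~5% node-cost bump mentioned above
card_cost_increase = die_share_of_card_cost * die_cost_increase
print(f"Implied card cost increase: {card_cost_increase:.1%}")  # ~1.8%
# A 40% retail increase therefore has to come from somewhere other than the die.
```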
Optimization has been replaced with AI shenanigans, fake frames and upscaling.
I love my fake frames I don't know what you're talking about😂
Native Purist🚨 ALERT 🚨Native Purist 🚨ALERT 🚨Native Purist 🚨ALERT🚨 Native Purist 🚨ALERT🚨Native Purist 🚨ALERT🚨
@@dqskatt bot
@@dqskatt🤡
@@dqskatt Games always look better at native, and if stutters occur at native, they will occur with DLSS and frame gen too.
Been saying DLSS was very dangerous because it would push dev optimizations aside, here we are
Good call. Thank you, Nvidia.
DLSS is no different from raster. Raster was created because ray tracing was too demanding to run in real time.
And DLSS just straight up looks bad.
@@arenzricodexd4409 It's completely different. It looks like shit.
We don't need better graphics, we need good PC games
@@evilbabai7083Which good games are exclusively available for Pc? There are no big and famous games Pc, it gets just the console ports, stop lieing yourself.
@@Thor86- Excluding or including those that were ported to consoles later, or those that were ported before but today are only available on PC? And I wonder if you count day-and-date multiplatform releases as "console ports". 😂
Also, "thinking big" is a console thing, where games are made to appeal to as wide audience as possible, leaving little to no space for those who wanted something more unique.
PC is a home for the most acclaimed FPS, RTS, 4X, sandboxes, simulators. Games are not afraid to be extreme here - hardcore or casual, deep or simplistic, with ultra realistic graphics or no graphics at all.
But if you insist, let's start with popular games that were born on PC and were either ported to consoles or had sequels there later: Doom, Quake, Wolfenstein, Minecraft, PUBG, Battlefield, Far Cry, Half-Life, Command and Conquer, Hitman, Diablo, Divinity, Baldur's Gate, The Elder Scrolls, Fallout, The Witcher, Valheim, Palworld, War Thunder, XCOM, Stellaris, Cities: Skylines, Civilization, Mount and Blade, STALKER. That's just off the top of my head, things I've touched personally. Same goes for games that are PC-exclusive today: Counter-Strike, Garry's Mod, DOTA, RUST, Factorio, Space Engineers, Elite, StarCraft, Total War, X4, Crusader Kings, Hearts of Iron, Men of War, Foxhole, ARMA, Squad, Ready or Not, Escape from Tarkov, Kenshi, Dwarf Fortress, Highfleet, Quasimorph and so on. Again, that's only what I've touched personally and can recall without looking anything up. My all-time favourite PC games are Space Rangers 2 and Barotrauma; I've sunk hundreds of hours into them in one breath.
You might say: "Wait a minute, that list seems weird, I've never heard of most of those games", but that's exactly the point: I have tons of options to scratch my itches beyond just mainstream games, and other PC players will have lists of their own.
For instance, I'm not bringing up really old games like Vangers, Hard Truck 2, Heroes of Might and Magic, Commandos; super vast or obscure things like World of Warcraft, EVE Online, Star Citizen; any VR games, most of which are PC exclusives, like Boneworks or Half-Life: Alyx; or any mods and total conversions, some of which are basically a different game (and some of which were made into their own games, like Counter-Strike, PUBG and DayZ).
On Steam alone there are more than 100k games available and around 50 games are released every day, so even if 99.9% are utter trash or don't fit you, that's still more than a dozen games per year that you will enjoy.
At the moment, it's easier to count which games you can't play on PC because:
- multiplatform is on the PC;
- PC has tons of games released on it (some of which can later make it to the consoles);
- most XB/PS exclusives are already on PC and will keep being released there eventually;
- PC is not split into generations, so once a game is there, you'll always be able to play it.
That leaves out Nintendo exclusives and older console games in general, as well as older PC games that have outdated DRM or require specific old hardware/OS, and delisted games as well... and that's where emulation and the jolly roger come into play, but I'm not going to elaborate on that or my comment will vanish.) Let's just say PC can run more XB/PS/Nintendo exclusive games than their modern consoles can.
@@Thor86- World of Warcraft. 😂😂😂😂😂😂😂
@@evilbabai7083 Essaying on YouTube is crazy. Love the passion 😃
You know something's wrong when the recommended settings list frame generation as a requirement for 60FPS.
It will definitely hurt their sales... the majority of people still use mid-range hardware anyway.
It really does make games like Jedi: Survivor, Dragon's Dogma 2, etc. underperform, and people don't mind skipping them.
Indies are taking the spotlight because their games are accessible to anyone. Hades 2 happened because everyone could play the first game, even on Intel HD Graphics.
It'll fuck up sales. I've played every MH game in existence and I'm concerned... lol. I have a 7900 XTX and 7800X3D and idk if I'll be able to lock 60 fps native at 1440p ultra. I bet I can't.
I'm still playing old Battlefield and indie games. I'm maxing out my monitor's 175Hz at all times.
I don't think I'll be able to run the game at all. I'm probably going to have to buy it on console, which sucks. @@fuckyoutubrforchangingthis
Those recommended specs for Wilds aren't mid-range anymore; they were mid-range when that hardware released 6 years ago or whatever. My PC is mid-range: I bought it almost a year ago for ~$2k, and it's above the recommended specs.
And for those worried: if you look at the latest showcases, the game clearly runs at around 60 fps on a PlayStation 5. Btw, DD2's latest patch basically fixed the game; I play it at around 120 fps with a 7800 XT and a Ryzen 7700.
Games aren't getting more demanding, just less optimized.
No, they are getting more demanding, just like at every stage of gaming history.
@@albert2006xp If a game runs at 30 FPS on the latest hardware, that's poor optimization. It's like trying to pour 10L of water into a 5L jug, then blaming the jug for its incompetence.
@@Delpnaz "Latest hardware" is vague. Also no game just runs at 30 fps on anything you could call "latest hardware", you made it run at that by setting it up in a way your card is not meant to run at. Trying to run render resolutions you aren't cleared for at max settings.
@@albert2006xp no game runs at 30 FPS?
Did you even bother watching the video?
@@Delpnaz Do you not understand the concept of settings, resolutions? It doesn't run at 30 fps unless YOU set it to run at 30 fps.
I find enjoyment in older games nowadays. Playing older masterpieces instead of newer, unoptimized cash-grab games is a breath of fresh air from the current market. It is only a matter of time before I run out of older games to play, but if it comes to that, perhaps it is time I take a break from gaming entirely and focus more on my life.
You won't run out of games. We have reached a point in time where there are more games to play than ever; even if you just take the PS3 era and work backwards to the 8-bit days, that's enough games to last a lifetime. Plus, yeah, going out to touch grass and improve your life is a good thing.
You're more likely to have a wife and kids before you run out of older games to play. So it's less about running out of games and more about running out of free time before you even reach the end of the backlog, lol.
With Game Pass I will never run out of good old games.
based
Exactly how I feel about movies too
The problem is not the hardware; the games do not use the hardware optimally.
I can't tell you how annoying it is when, on a high-end PC, I get a lag spike while my CPU sits at 20% usage, and the frame spikes are CPU-related.
Yeah, it's a joke. £800 for a GPU just to run new games at 1440p is a joke. I'm done with this era of gaming.
@@ESKATEUK Life fell off after 2019 lmao
@@mikemwps you’re not wrong mate lmao
Can't wait for the RTX 5050 8GB with the performance of a 3060 Ti.
with 50-50 performance sometimes it can, sometimes it can't.
@@jahithber1430😂😂 your joke landed with me bud. Thanks
my 5050 uses 1500 W when idle 😔
Starting at 300 dollars
The RTX 5050 has a very amazing feature where it has a 50% chance of turning on. I think it's a cool feature because it will increase the lifespan of the graphics card in the long run!
gaming industry is scammers paradise
It's way worse with Phones
@@kitajecafrica5814 And the reason for both is that consumers will carelessly spend their money on anything just because it's "new", whether there is any improvement/innovation or not.
It's not the hardware, it's the game developers; they're not optimizing games like they used to.
Guilty as charged, oops.
Yeah, but Ngreedia isn't helping game developers either. Developers want to use VRAM to get "better and easier" "optimization", but Nvidia is selling expensive cards without enough VRAM. Just look at the 5080 leak: 16GB for such a big price?
@@ladrok97 No price was mentioned in the leaks.
@@fall1n1_yt True, but I guess we can assume it will be $1200 yet again. Even if Nvidia is gracious and sells it for $1000, that's still way too high a price if the 5090 really has 32GB.
Indeed.
PC gaming is in a constant cycle of creation and destruction.
Enough OSes and games have been created that it's no longer just creation and destruction; there's a swelling revival too, because everybody knows the best games have already been made.
Over time it will become trendy for streamers to go back and be the only person streaming some archaic game made before they were born.
Gaming as a whole has the same problem on consoles, but the PS5 is limited to below RX 6700 + Ryzen 7 3700X performance.
Uh, no. The present situation is truly unprecedented. For example, in October 2001 the most powerful consumer GPU in the world was the GF3 Ti500, and it cost $349 at launch, which is about $620 in today's money. Nvidia lowered the price of their debut GPU, the RIVA TNT, from $199 to $149 four months after launch and more than a year before the launch of the TNT2.
"Can it run Crysis" was a running meme in the late '00s and early '10s, but the reality is that Crysis launched in a stable state, ran pretty well on reasonably priced hardware, and ran great on a two-generation-old flagship. I've played Crysis on a 7800 GTX, and the 7800 GTX was two years and two generations old when Crysis launched.
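For anyone who wants to check the inflation math in the comment above, here's a quick sketch. The CPI values are approximate annual US CPI-U figures (assumed, not exact monthly data):

```python
# Scale historical GPU launch prices into today's dollars via a CPI ratio.
CPI = {2001: 177.1, 2024: 313.7}  # approximate US CPI-U annual averages

def in_2024_dollars(price: float, year: int) -> float:
    """Convert a historical price to 2024 dollars by the CPI ratio."""
    return price * CPI[2024] / CPI[year]

print(f"GF3 Ti500: $349 in 2001 -> ${in_2024_dollars(349, 2001):.0f} today")
# Prints roughly $618, matching the ~$620 figure quoted above.
```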
Games look 15% better than in 2015 and perform 500% worse. Amazing progress, because it's totally important to have those 50k-polygon eyelashes in case someone zooms in on them with a microscope on an 8K TV.
fr. I want bigger worlds, not individually rendered beard hairs on a random NPC.
It's not that games are progressing faster than the hardware; it's that the gaming industry is regressing.
With bad management, experienced devs quit to start their own studios, and inexperienced devs are expected to meet the same deadlines, so optimization is an afterthought.
Besides optimization, they are also expected to implement shops, and those have to be the first thing to work flawlessly.
Yup. Microtransactions over optimisation.
Throw in the rapid financialization of the gaming industry over the past 15 years and you’ve pretty much perfectly described the current environment
People act like developers going rancid means all the human capital went to the gallows. Not how it works. The nice thing about legit devs leaving is more talent going to more creatively free, underdog institutions, perhaps with a chip on their shoulder about the big corpo management practices. So more great games (even if they’re not so glittery) that won’t be AAA prices.
DEI and woke
What single player game not made by Ubisoft has shops? I'll wait.
Games already look good enough; I would rather have games run nicely at well over 90fps than at 60fps as the standard. Developers spend way too much money on better graphics while the gameplay is TOTAL 🗑️
this is facts
Yeah, but a lot of games, especially on console, aren't even hitting 60fps; they usually run at 30fps. Luckily for us PC gamers, we have more settings to play with to get that frame rate up.
Here's what I thought. We have so many great games. I have unfinished Witcher 3, Kingdom Come Deliverance, Horizon Zero Dawn. I want to replay Skyrim with mods. I've never played Cyberpunk. I have a 4070 laptop, and I don't think I'll need an upgrade for a couple of years.
The GPU market is now mostly the AI market, with gamers as an afterthought, so games making hardware matter less would make sense. I mean, what else can they even do with graphics at this point that people would actually notice and appreciate? I wouldn't mind if, in like 10 years, integrated graphics are the standard and the big-boy GPUs are only used for workstations. We can't go much smaller than 3nm, with 1nm the smallest possible.
@@Ven0mSRT Skyrim with mods is such a mood. Recently reinstalled it for the first time since 2018 and tried surfing Nexus for a bit, and I haven't been this eager to jump into a game since I finished BG3.
Let's be honest, current top-of-the-line hardware is more than enough for gaming.
What we need is not better hardware, but companies making good games
Good news is there is a massive library of old games. And there is nothing stopping developers from using older engines and graphics tech that looked almost as good. It's about the quality of the game, not just the graphics.
As long as you're using Nvidia cards. AMD are really bad at it.
@@gamzillio I have used both and each has their ups and downs. It depends. For me the important things are price, reliability, and support.
@@gamzillio Nvidia is technically better in old games, but does the difference between 300 FPS and 400 FPS matter?
Yep, my RTX 3080 runs BFV at 4K ultra at 60+fps and it looks breathtaking, and it runs BF1 at 4K 120Hz. It can barely hit 60fps at 1080p in some of these new games with shit like ray tracing baked into them. I'll be playing older games for many years.
I only upgrade once every 4 years. I don't buy games at release; I get them later at a 50-70% discount, when they're more optimized, the bugs are fixed, and they come with all the DLCs. And then I can finally run them without upscaling at high fps.
Yes. Some of the best games ever released between 2012 and 2020. An endless list of games that all run at 4K on a mid-range GPU. The problem is that if everyone did that, game devs would make even less money and modern games would get even trashier. These devs need to focus on playability and stop trying to break new ground visually.
My rig is a 5800X (got it for $130) and a 2060. I play PS3/360-era games and keep my old RX 570 for the good memories.
If buying isn't owning, then pirating isn't stealing.
This is the way.
Sparking Zero will be the only new game I'll have bought, even preordered, since at least the PS3 days.
7:33 - the hardware is "slowing down" because gamers do not find new games as exciting as the games of the past, so they decide not to upgrade their GPUs just for the sake of graphics. Gamers yearn for genuine fun, more replayability, and innovation in games.
Play old games, upgrade less.
In time hopefully devs will learn that if they don't optimize, then they're selling a niche product
Honestly. I've got a huge backlog from 2014 to 2024. And I'm sure I'll have plenty more to add. There are so many games to play.
Diversify your taste in genres and you will have tens of thousands of hours of content to enjoy.
"New is better": this year's games are better than last year's, and the games about to launch are better than the ones already out. Why do you think people watch the Game Awards? For the trailers. That's how the industry works. A game is good until it's out; GTA6 is the best game ever until it launches, and then GTA7 is the best game ever.
It's either older games, or letting new releases mature for a year. They usually run twice as well after 10 patches, let alone if they remove Denuvo after that year. I have many games on my HDD, maturing :)
Easy to say. I have a 1060, my PC is 7 years old, and next year is my 50th birthday, so I wanted a new PC for the next 7 or 8 years. But the prices are crazy, and between Intel screwing up the 13th and 14th generations and AMD screwing around with the 9000 series, even if I had the money I wouldn't buy any of it.
I'm still on my i7-920 w/1050 ti and 8GB RAM (used to be 12GB until recently when one of the DIMMs or slots died or something.) Will probably get a $2-2.5k prebuilt maybe around next September, right before MS drops support for Win 10.
The prices are really pretty sad 😔 The pandemic ended two years ago and prices are still high.
Because 5nm is more than twice as expensive as 7nm (which was the bleeding-edge node during the pandemic).
That's how inflation works
because wars.
Certainly not because of the people that throw money at them and beg for something more expensive... that has nothing to do with it.
Ukraine's war has taken a toll on gas prices and what not. That unfortunately affects every aspect of the economy. At least where I live.
Ghost of Tsushima, GTA5, Horizon Zero Dawn... so many games that were incredibly well optimized! Now developers focus on ray tracing, DLSS and FSR and neglect optimization.
A big problem is that game developers are, firstly, less competent overall, but also that they're seemingly making games on the assumption that hardware performance and prices are still improving at the pace they used to.
They’re throwing insane amounts of detail and effects at games to produce eye candy, and assuming everyone is on a 4K screen, while the actual hardware people have is barely better than a decade ago when 1080p was the norm.
They’re also seemingly no longer aiming for consoles as the base specs that a game should be running on, then adding bells and whistles to the PC release.
Games are made for top-end PCs that would have existed if Moore's law were still in effect.
Meanwhile on the low end, gaming has never been better. Because we’re so deep into diminishing returns, low spec machines like the Switch and Steam Deck that either use PC low settings or that need to be specifically and appropriately ported to are punching well above their weight.
1080p is still the norm; at least where I live, more than 70% of screens are still Full HD, only 22% are QHD, and outside the TV section fewer than 8% are 4K screens...
I take issue with the console-argument. You can't have your cake and eat it too - on one side, people were complaining when games didn't look different from the console releases despite better hardware capabilities. On the other side people are complaining now when devs put all the bells and whistles that PC-master race idiots have been asking for. Which is it? What is it players actually want? Not enough eye candy - players complain. Too much eye candy - players complain even more. There's simply no winning here.
@@totalermist Developers missed the middle ground just like your reply. Many just want it to look "good enough" without making their PC convulse.
Game developers are not any less competent, they're just less incentivized. It's true for all software development, really -- optimization is a long, difficult and costly process that typically doesn't earn the company any extra money (unless you're optimizing the internal company infrastructure), so it's the thing the companies will wanna spend the least time and money on.
The upper limit to how poorly optimized a game can be is and will always be the hardware.
Indeed.
The "5nm" and "3nm" sizes are marketing nonsense anyways. The actual transistors have size of more than 20nm. Over the past 20 years they have become essentially 3D instead of just 2D allowing chip manufacturers to improve the density per area. So you will definitely see sub "1nm" cause these are just marketing terms.
That is true. It is just a metric for comparison, not of reality. Intel 7 or "10nm", lol, come on. Just call it 10a already and stop pretending lol
Yep.
Which is why Intel's 14nm was keeping up for so long.
Nanometers is not a marketing word
@@gerardotejada2531 You say that as if the op claimed the opposite.. perhaps it's not obvious enough to you that it's a measurement so you assume he may have thought otherwise.
@@qswaefrdthzg the 3nm size has transistors that are around 5 nanometers in scale, so it is nowhere near as far off base as you claim
As an electronics engineer, I can tell you what's going on here. The reason they are saying Moore's Law is dead has nothing to do with manufacturers giving up. They are up against physics. It's not about improving technology; it's about running into brick walls in the physics (something you cannot change with technology). They are hitting problems such as quantum tunneling and other issues at the quantum scale. Things are just too small, to put it simply. It is not something you can fix with some "new tech". What the manufacturers are not doing is coming clean about this actual limit of physics they have reached. This is why you do not see huge improvements in recent generations. They say Moore's Law is dead because they cannot apply it anymore due to the limits imposed by physics. Any improvement you see now comes from fine-tuning architecture (how their products function internally and how the sections interact), not from increasing actual performance at the individual transistor level.
Tunneling begins to show at 4nm (not 1nm, as mentioned in that Reddit post). What happens is that electrons simply appear on the other side of the transistor junction instead of passing through it; they literally disappear and reappear on the other side. It's one of the weird things in quantum mechanics. So manufacturers are simply increasing die sizes and adding more transistors. However, efficiency is diminished at the same rate as the transistor count increases, due to tunneling. On paper (especially for the shareholders) more transistors looks good, but it doesn't really yield more performance, because of the problems at the quantum scale. The reality is that now that these limits of physics have nearly been reached, there is no possibility of the substantial performance improvements we saw in the past. Back then, being able to make things that small was a technological manufacturing problem. Now that the limit set by physics has nearly been reached, no technology or manufacturing process can overcome it. That's why they're called "laws of physics": once you reach the limits those laws outline, that's it. Game over.
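For readers who want the textbook version of that tunneling argument: for an idealized rectangular potential barrier (a standard approximation, not any foundry's actual gate geometry), the transmission probability decays exponentially with barrier width $d$:

$$T \approx e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar}$$

where $m$ is the electron mass, $V_0$ the barrier height and $E$ the electron energy. The exponential is the whole story: leakage doesn't creep up as insulating layers get thinner, it explodes, which is exactly the brick wall described above.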
And game developers don't understand that things should keep moving forward. We expect computers to stay responsive and become more energy efficient.
But I disagree that we can't see improvements. Here is how to get them:
1. Ditch x86 and replace it with a more efficient architecture: something designed for pipelining from the start, with good code density, and with instruction parsing designed so that what happens in later pipeline stages is known as early as possible.
2. Shared RAM for CPU/GPU etc., so there's no need to move the same data around.
3. And for high-performance computers, ditch ATX/PCI and the current layout of high-power desktops. How about a single board filled with chips/chiplets on both sides, with large heat sinks on both sides? That allows optimizing chip placement by distance and spreading heat evenly. Or an even fancier design with one bottom board and two boards raised from it to form a kind of "tunnel", with heat sinks on several sides and 180mm coolers at both ends of the tunnel. The ATX/PCI standards really weren't designed for current technology.
4. Optimized hardware for different tasks. We have dedicated hardware for graphics, tensors, video decoding etc., and there is probably other work common enough to deserve dedicated circuits, so computers can be optimized for different types of loads.
5. The rest of the performance improvements can be done in software. To help that happen, the interface between hardware and software should be fully standardized and documented. Developers can improve software when it sits on a stable platform; for experienced developers, portability has always mattered more than performance.
You know what, maybe it’s a good thing. I’ve been upgrading computers for decades and it was not cheap. I have several computers and laptops that are less than 5 years old. I will just use these for the next 10 years and play older games and enjoy them without that urge to upgrade.
Fanboys will always defend the insane high pricing.
They are rich shits. They don't know the pain of playing games at 30fps
"Nvidia Fanboys will always defend the insane high pricing."
There. Fixed it for you.
@@morgueblackRadeon is also guilty of it but to a lesser degree, the main issue there is stupid launch prices that are quickly discounted.
@@morgueblack "Nvidiots will always be Nvidiots"
More like gamers are delusional, thinking materials and labour will be cheap forever.
Optimization in video games has declined even faster than hardware performance has increased.
No it hasn't. Only clueless people on social media think so. Games are hitting their performance targets. You just can't accept what the performance targets are. For the large majority of people with GPUs ending in 60 and 1080p monitors that's 60 fps 720p render resolution.
@@albert2006xp I just don't understand why they complain in the US. They earn four times more on average than in my country, graphics cards are even slightly cheaper there, and somehow I can afford good hardware
@@carthagonova4132 Lots of these normies that don't prioritize gaming don't consider it important enough to be a big purchase.
Agreed, the optimization in the latest games is dogshit, except for God of War Ragnarok; its optimization is good.
I recently reinstalled Doom 3 with a resolution compatibility mod. I played it on an ultrawide OLED.
Guys, this game is 20 years old, and to my eyes it looks genuinely BETTER than any of the upscaled shimmering blurry garbage we're getting today. Yes, it has low poly models and flat textures. But the art direction, purposeful lighting design, and ultraclean super sampled image quality makes up for it. No FSR/DLSS artifacting. No TAA blurring. No weird shimmering anywhere in sight. Just smooth, high framerate gameplay.
Modern games just look so disgusting despite the massive technological leaps. What is going on?
Tbf, doom 3 had John Carmack, giga brain coding hyper genius, to code it up.
Those times are over. I also kinda miss the days of simple graphics but clean. My eyes hurt a lot from playing newer games.
Politely disagree. Cyberpunk 2077 on my new 4080S+7800X3D+1440p OLED screen max settings looks absolutely gorgeous as well as Metro Exodus almost as if looking through a window it's insane. Hopefully you'll see this beauty for yourself.
Yeah, it's refreshing to play older games from time to time without feeling like I'm losing my eyesight. I hate games that force TAA; I will take jagged edges over TAA any time.
I built a new main PC back in June using a Ryzen 7900 (no X), as I do a lot of Handbrake compression of DVDs and Blu-rays.
I installed Windows 10 and everything else using the iGPU before taking the RX 6700 XT out of my older PC, but just for fun I installed a few games from Steam (nice and quick getting them over my LAN from the older PC).
I ran Half-Life 2, which is also ~20 years old, and it ran at between 90 and 120 FPS (my TV's max is 120Hz) on the iGPU with just 2 RDNA compute units!
With all settings on High (the maximum), the polygon count is still very low and the textures are very low resolution by today's standards. Yet that game is still great fun to replay today if you haven't touched it in the last decade or so.
I honestly believe we've hit a ceiling when it comes to graphical fidelity. I see no difference between games coming out now and ones that came out 5 or so years ago. I don't understand how they're getting harder to run when they look basically the same. We truly live in a time of diminishing returns.
I couldn't agree more
True. RDR2, MGSV and other games look almost the same as newer games, plus these older games have better gameplay in most respects while being much easier to run.
We've definitely hit diminishing returns in terms of how a game looks, but not in how games behave and move. I think lighting and field of view could still be heavily improved, but our technology isn't there yet. Games still feel very flat. That is why I hope developers stick to strong art direction for the time being.
There are improvements; they're just smaller and cost more processing power. We are long past the 2000s and early-to-mid 2010s, when you would see significant improvement every year, for several reasons. Also, making high-fidelity games is harder; building an entire modern game engine is extremely difficult, which is why only a handful exist.
5:00 "games are progressing way faster" ??? WHAT They literally regressing... Terrible developers programming on nasa pc, dont even play own games and using just every library under the sun to not code something on own with 0 optimization to save money on development...
I mean, that's only a portion of games, and they're the greedy AAA ones; indie and other studios (Saber Interactive, Arrowhead and FromSoftware) are putting out very good games like Black Myth: Wukong, Space Marine 2, Helldivers, Hades 2, Manor Lords and so on. 😮‍💨
@@Goldomnivore Sorry man... I have a VERY different definition of an indie game :D
He meant in terms of being intensive
@boris Hades and Manor lords are indie
@@Goldomnivore Where did I say they are not...
Jensen's Law is overwriting Moore's Law: GPU prices double every two years.
Lazy software optimisation isn't advancement. Games should be using Vulkan and be properly optimised. id Tech's Doom Eternal is a perfect example of a correctly optimised game; my 3060 can run it at 1440p ultrawide with RT enabled.
Vulkan can be a mess. It is fine if your game is triple-A like Doom Eternal. If not, we will see a repeat of OpenGL, where the majority of game development ended up favoring Nvidia hardware.
@@arenzricodexd4409 They managed to maintain an engine that ran smooth-60fps in the 360/PS3 era. What these guys do is nothing short from magic. I would guess that such a successful port to the PS3 was as much of a feat as a port to Vulkan is nowadays. I'm not exactly sure what they do or how, but it seems that it has been done for many years already. It is a shame that we are just now starting to realise and appreciate the amazing technology that goes behind a game.
It is a shame that ALL game devs favor DX over everything else... it is the most cost-effective option at the moment, and AI has helped them through the culling of software developers.
Game devs don't need people who can tweak optimization in alternatives like Vulkan, because they rely on Nvidia, which targets Microsoft's DX as the default option.
I don't think AAA publishers will delay their games in favour of optimization anymore; AI upscaling is here to stay.
@@nickochioneantony9288 I honestly doubt it's the developers making that decision. Devs do care about what they produce. It's most likely management and shareholders who seldom care about anything other than profit.
@@nickochioneantony9288 Game devs did not pick DX because of Nvidia. They use DX because MS provides a lot of tools and help when they develop their games with DX. The Khronos Group only oversees the development of Vulkan; it does not provide extra tools or help to game developers. For that, developers have to rely on IHV support, and this is one of the issues. Another issue is things like extensions.
I never thought I would say this but.... INTEL, DO SOMETHING!
Give it 2 years; their graphics cards will be better than AMD's.
@why-m3g what are they going to do? Intel's APUs have been garbage for years, not to mention when their graphics cards came out, they didn't even have working drivers for video games.
Games weren't working with their graphics cards, do you remember?
Nvidia is competing with itself and there's no stopping it right now. The only reason AMD was able to claw its way back in the CPU market is that Intel got complacent.
@@bumperxx1 Driver support takes time. Intel's GPU hardware is already better than AMD's. Just give it time; they fumbled badly with CPUs and are currently scrambling.
They did... Fired 15% of people...
@@ldope3904 so within 2 years they will be on par with team green?
Me watching this on a GTX 1050 Ti and a 3rd-gen i7: how interesting... xD
GPU makers will always find a way to force consumers to upgrade: it was DirectX back then, then it shifted to RT, and now AI.
DirectX was acceptable, but RT and DLSS and all that AI shit is unacceptable.
Game devs use AI upscaling as an excuse to spend less time optimizing the game.
Whereas DirectX brought better graphics and better efficiency, which was good.
@@AMDRyzen57500F That's not an upscaling issue though?
It's a dev issue.
you got it.
RT and AI will not make me upgrade; they are useless.
I remember when a whole PC could run on a 550-watt PSU; now 550 watts is for the GPU alone 🤣
550... my stepson plays Fortnite on an OptiPlex SFF with a 1030... he's got a 180W PSU.
@@unholysaint1987 pretty sure he means modern high end cards and not old low end cards
I still have a PC that runs on a 500W PSU. Also, you can run a 4090 and a 14900K on an 850W PSU with plenty of overhead left for overclocking. But people think they need a 1600W PSU, which is only going to guarantee that their poorly designed 12VHPWR connector sets the house on fire when it starts melting. If you have only as much power delivery capacity as you need, the PSU will shut off much sooner when a fault pushes the draw past its limit. But if you have 750W of unneeded headroom, a failing connector has to dissipate enough extra power to fill that 750W before the PSU shuts off, by which point your house has already burned down. The other concerning thing is that Nvidia specifies a max power draw some 40-50% in excess of what their upper-tier GPUs are actually observed to draw. This is dangerous because any protections on the GPU might not kick in until you exceed that max, by which point a hardware fault on the GPU could have the whole PC on fire.
You see, the safety features on your PSU are really only there to keep the PSU itself from bursting into flames, not to protect anything connected to it, the same as the circuit breakers in your house. But there is a common misconception to the contrary. A PSU will keep feeding a short or an overheating connector until the power or current draw exceeds the PSU's limit.
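A small sketch of the headroom argument above; the component wattages are ballpark assumptions, not measured figures:

```python
# PSU headroom sketch: a bigger PSU means a fault must dissipate more extra
# power before over-power protection trips. Wattages are ballpark guesses.
peak_system_draw = 450 + 253 + 100  # GPU + CPU + rest of system, in watts

for psu_watts in (850, 1600):
    fault_margin = psu_watts - peak_system_draw
    print(f"{psu_watts} W PSU: a fault can sink ~{fault_margin} W "
          f"before the PSU shuts off")
# 850 W leaves tens of watts of fault margin; 1600 W leaves ~800 W,
# which a melting connector can happily turn into heat first.
```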
Nah, it's 600W.
@@WeatherMan2005 yes but... you can build a budget/low-mid tier pc right now, with parts that are at most a gen or two old, that will run circles around the best thing you could build 10 years ago, while consuming like half the power, for around 500 dollars
It's the optimization that devs these days are noob af at, not the hardware that can't compete.
I just finished my download session for COD BO6: 60GB including Hardcore mode and the new maps, plus an additional 24GB for Nvidia to preload shaders, all because I updated the 3090 Ti's drivers. 😅
The devs don't really have much of a choice. Most of the unoptimized games releasing today are from AAA companies that are slaves to the deadlines created by the shareholders.
Talk to the Nvidia boys, they keep eating them cards up.
That's because they are better cards.
@@RedWolfenstein I had Nvidia cards from 2006-2022 and ATI before that; my first AMD card has been the 6700 XT. Maybe the drivers had matured enough, but honestly, for the price I paid at the time ($325, and it came with two games), the experience has been even better than what I had with Nvidia.
Granted, I was coming from a non-RT, non-DLSS card, and that's really where Nvidia is better these days, but I actually really like AMD cards. I mean, I'll be open to whomever when the time for an upgrade comes, but I stand by the "buy whatever is the best deal in your price range" mentality.
@@Torso6131 I have this card with the 5700X because I thought it would be enough. I'm thinking of upgrading because it's hard to compete anymore when your PC can barely run the game.
@@account-2239 A 6700 XT? It's been fine for me in anything competitive, but I also have it paired with a 7800X3D now (I had an i7 6700K before; what an upgrade lol).
But I'm mostly sticking to stuff like CS2 and TF2. Maybe a little overwatch every now and again.
What are you playing in multiplayer that the 6700 XT isn't enough for, if you don't mind me asking? I've been sticking to single-player games as of late (Baldur's Gate 3 has consumed my last year, with Alan Wake 2 behind it) and it ran those quite well.
We don't like the increased prices, but we like cards without DLDSR+DLSS and with poor RT even less, especially when they're not half the price they deserve to be and actually charge close to Nvidia prices.
Not sure if you're going to see this, but given how well regarded the 1080 Ti is, I decided to check how much of a performance improvement there was in the 10-year period from 2006 to 2016.
So back in 2006, Nvidia launched the world's first modern graphics card, the 8800 GTX. It was twice as fast as any other GPU available at the time and cost only $600. Literally one year later, Nvidia die-shrank the chip from 90nm to 65nm and released it as the 8800 GT, selling it at $350 for about 90% of the performance.
Things were advancing so fast back then that by 2009 you could buy a GPU from either AMD or Nvidia that matched the 8800 GT's performance for $100.
Now, comparing the 8800 GTX to the 1080 Ti to see how much progress was made in 10 years: it turns out the 1080 Ti is at least 15x faster than the 8800 GTX.
When we compare the 1080 Ti to the RTX 4090 in pure rasterization performance, the RTX 4090 is only 3.3x faster than the 1080 Ti. Those are terrible numbers, especially on price to performance; after all, we never got 1080 Ti performance for $100.
And I doubt the 5090 is going to do much to change these terrible numbers either. So yeah, this hobby is going to end up getting killed by stupidly high prices, terrible performance gains, and terrible developer optimization.
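To compare those multipliers fairly you have to annualize them, since the periods differ. A quick sketch taking the 15x and 3.3x figures from the comment at face value (the year spans are approximate):

```python
# Turn a total speedup over a period into a compound annual growth rate.
def annual_rate(total_speedup: float, years: float) -> float:
    return total_speedup ** (1 / years) - 1

print(f"8800 GTX -> 1080 Ti (10 yrs): {annual_rate(15, 10):.0%}/yr")   # ~31%
print(f"1080 Ti -> 4090 (~5.5 yrs):  {annual_rate(3.3, 5.5):.0%}/yr")  # ~24%
# The yearly pace has slowed, but the bigger collapse is in price to
# performance: the dollars-per-frame side of the comparison above.
```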
I upgraded from a 1070 this year, mostly because I needed 1440p for YouTube shenanigans... meanwhile there are people who bought a new GPU every damn year for whatever reason lmao. It still played damn Cyberpunk at medium-to-high settings at 1080p, 60 fps.
@@ZealothPL lol I am still using an RX480, mainly due to the fact that right now all that I am playing are older/indie games.
There are limitations on hardware development, but some gamers and game developers should take their heads out of their asses. They must be crazy to think it's normal to have a several-kilowatt rack of water-cooled hardware in the next room just to run a game.
The sanest way to judge hardware and software progress is to compare what we get at the same wattage.
@@gruntaxeman3740 Yup, good point. As far as I'm concerned, 250 watts is as high as a GPU should ever go; the fact that the RTX 4090 has a TDP of almost double that is insane to me. Unfortunately, using performance per watt as a metric makes the new GPUs look even worse. As for the terrible state of game optimization, fun fact: I absolutely blame Nvidia for that, since they were the ones who started pushing DLSS AI upscaling and temporal anti-aliasing, which incompetent game devs have been abusing as a replacement for actual optimization; hence the terrible state.
@@flydeath1841
About that much, yes. I compare computer PSUs; for me the upper limit is 750 watts. That means peak load, with everything continuously at 100%, should use at most 50% of that, to leave room for PSU aging and to stay stable through current peaks. I'm currently planning my next computer and doing some engineering here: I start from the PSU and pick the one with the lowest idle watts. I also run the computer at full load for long stretches, rendering and compiling software, so I know it's a 375W heater. Cooling is the only way to move that heat out of the enclosure efficiently. Will my apartment turn into a sauna, or should I invest in air conditioning too? Even if I can build a reliable system within that power limit, it may still be inconvenient.
And I consider this very high power, as my current old computer's PSU is only 240 watts (a 30-watt GPU is sweet).
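Since the comments above propose performance per watt as the honest metric, here is a minimal sketch of it. The fps and power numbers are invented placeholders, not benchmarks:

```python
# Performance per watt: divide a performance score by board power.
# Both cards below are hypothetical; plug in real benchmark data to use this.
cards = {
    "older 250 W card": {"fps": 60, "watts": 250},
    "newer 450 W card": {"fps": 100, "watts": 450},
}
for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.3f} fps/W")
# 0.240 vs 0.222: 66% more fps for 80% more power is a perf/watt regression.
```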
Developers don't care to optimize their trash. Modern games may look more realistic, but they look worse than so many older games; the chase is pathetic.
Gonna rock my 970 till it dies. I can run anything I want at 60fps at 1080p, so I feel no need to upgrade graphics; I only do when I have to. I was happy with 1024x768 until I couldn't get a new monitor in that aspect ratio.
It's like what has happened in the auto industry. Prices are sky high and quality is not keeping up with price. What we need is better value at lower prices.
As for computer graphics, I'd say we have decent graphics quality right now. The problem is more a lack of quality games. Games are being pushed out half baked or not even baked.
baked lighting and shadows? huh?
@@5-OTT.s-ON-A-KICK That's a joke, right? When they want to, the studios can easily work within the limitations of current tech.
@@SG-js2qn Yeah, but they're too lazy; like now Valve is doing this BS upscaling stuff in CS2, like bruh.
@@5-OTT.s-ON-A-KICK There's nothing wrong with the upscaling, as that is simply another technique for making a game viable across a larger variety of players. FSR and DLSS are really just formalizations by AMD and Nvidia of a technique that's been available to game developers for a long time. Games are on the one hand a creative effort, and one the other hand technical, where they should be trying to optimize for a sweet spot between performance and quality. We don't need ray tracing, etc., to have a good game. That's mainly just the industry trying to find ways to justify new GPU purchases.
@@SG-js2qn Well, now we have companies using DLSS to make their GPUs seem faster. The 4050 is slower than the 3050; it's only "faster" with DLSS on. What's the point of buying a game when the devs make an optional feature mandatory? I'm pretty sure DLSS was made either to make games playable at 4K or to squeeze more performance out of an old GPU; now it exists so that new GPUs can reach playable fps at 1080p.
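For context on why upscaling buys so much speed: render cost scales roughly with pixel count, and the internal render resolutions are much smaller than the output. A sketch using the commonly cited per-axis scale factors (treat them as approximations):

```python
# Fraction of output pixels actually rendered at common upscaler presets.
# Per-axis scale factors are the commonly cited ones; treat as approximations.
per_axis_scale = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}
for mode, s in per_axis_scale.items():
    print(f"{mode}: renders {s * s:.0%} of the output pixels")
# Quality ~44%, Balanced ~35%, Performance 25% -- hence the big fps gains.
```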
The thing with the AMD Ryzen 7 7700X and 9700X is that the older one is a 105W CPU and runs higher base clocks, while the newer one is a 65W CPU. This means that while performance is very similar, near identical, huge optimizations were made for energy efficiency. That is a big difference, and the 5nm-to-4nm process change surely helps.
Also don't forget that Windows isn't really optimized for the newer chips. Some updates even bring more performance.
or you could lose a little performance and undervolt the 7700x to become more energy efficient (not sure why anyone would do this but it is possible) .
@@VintageCR The technology itself is very useful for applications like laptops, small PCs, thin clients and such. While this doesn't directly apply to high-end gaming, people still game on those devices. I don't think the 7700X can be undervolted to 65W without reducing clocks and disabling boost, given it's on TSMC's 5nm process; performance would suffer a lot. The 9700X, however, is 65W out of the box and could probably be undervolted even lower.
I don't disagree with you. I simply commented because this channel used these processors as a showcase of minuscule gains, missing the point completely, and I just don't like misleading content. Sure, we can take one metric and chew on that, but it's not like these processors are the sweet spot for pure gaming anyway. It's a half-blind approach.
@@jarivuorinen3878 No, you are right; I may have worded my comment misleadingly. My point was basically: why even manufacture a higher-powered CPU when the next gen (or not even the next gen) is coming packed with better efficiency? It's such a waste to keep producing multiple CPU lines, with the added wafer defects, when they offer only marginal performance increases.
@@VintageCR There is some news that AMD will no longer produce the 7000 series soon. TSMC makes the chips, but their costs have risen. I'm sorry I can't find the source right now, so take it with a grain of salt. I also didn't mean that you misled anyone; I mean the channel, the content creator, used those processors as a bad example.
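Side note, since the efficiency point keeps coming up: perf-per-watt is just a benchmark score divided by power draw, so the claim is easy to sanity-check. A minimal sketch, with placeholder numbers rather than real benchmark results:

```python
# Back-of-the-envelope perf-per-watt comparison. The scores and wattages
# below are PLACEHOLDERS for illustration, not measured results.
cpus = {
    "7700X": {"score": 100.0, "watts": 105.0},  # assumed roughly equal perf
    "9700X": {"score": 100.0, "watts": 65.0},   # at the 65W rating
}

for name, c in cpus.items():
    # points per watt = how much work you get for each watt consumed
    print(f"{name}: {c['score'] / c['watts']:.2f} points per watt")

# With equal scores, dropping from 105W to 65W is roughly a 1.6x gain:
print(f"efficiency gain: {105 / 65:.2f}x")
```

With identical performance, the whole generational improvement shows up in that one ratio, which is the commenter's point about the 9700X.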
hey guys, just so you know, the reason the Ryzen 9000 series was barely improving on the 7000 series (sorry if I'm butchering the names) is that Windows was actually bottlenecking it. It was a Windows issue, not an AMD issue. It was fixed a few weeks to a month ago, although Linux STILL posts higher CPU benchmarks with AMD than Windows does.
an element you left out, and the actual problem right now, is the modern AAA game dev... they don't code anymore or have a deeper understanding of computer science. Modern game devs use SDKs like Unreal, Unity, etc. that let them build AAA games without having to code 95% of it. The end result sold to the end user is a bloated one-size-fits-all codebase, most of which the actual game doesn't need, and that requires hardware to work a lot harder to achieve the same results it would get if optimised properly. Optimising games requires the deeper computer science and coding knowledge a typical modern game dev doesn't possess.
When "built right" and optimised even moderately, there is no reason a mid-range GPU can't put out max settings for a brand-new AAA game at high frames.
Yes, that too. Unoptimised code is one of the biggest problems, and there are bugs everywhere.
rust devs be like
The gaming industry has shifted budgets to Hollywood stars instead of game developers. I think it is clear to see.
To think that I cheered for Keanu Reeves, and then we got the mess of a game that is Jedi: Survivor because they were busy paying Cameron Monaghan instead of the Respawn team.
As a student in college for game development, I very much appreciate working with Unity. Yes, there's less coding for stuff like collision or handling movement, but only coding 5%? Nah. If you aren't using assets for menus or movement, you still need to code a lot of that yourself. It just makes it easier to connect everything and handle it. It's like a house: Unity or Unreal put up the foundation, frame and roof, but YOU need to add the walls, furniture, plumbing, doors, etc.
And it DOES lower the barrier to game development. So many fantastic indie games use Unity or Unreal and run flawlessly. Stuff like Cuphead, The Messenger, Sea of Stars and Ori and the Blind Forest run on Unity for example. All of these games run on my old tower that has a GTX 970 and on Switch with little to no issues.
That being said, optimization is sadly not as widespread as it used to be, for sure. But engines like Unity or Unreal are not the problem. It's how they are used. And hardware is seldom ever to blame; it's the lack of time, care or experience of the developers.
5% of coding? Where do you people come up with these headcanons??? It took me two months to make my first indie horror game in Unity, and the coding part was literally 90%. Stop pretending to be a game dev.
Dude, it's like Moore's law but in reverse! Our hardware is improving at half the pace it used to, but its cost is twice what it used to be! Exponential... decay...
Nah, "Moore's law is dead" cus it is more profitable for you to make us believe so.
@ARCx9 for most people you also don't need a GPU that costs more than 300€. Most used resolutions on Steam are 1366x768 and 1920x1080.
I would say that something like a RX 6600 is just enough for 1080p
Who cares about 4K? I don't even notice a difference when I use a 4K monitor unless I squint my eyes really hard.
@@AMDRyzen57500F Oi now, calm down. You gonna upset the
@ARCx9 the thing is that we aren't at that point yet. So Moore's law is still alive, so prices shouldn't be what they are right now
This is why I support PC handhelds. They are exciting and make games fun.
My 11400F and RX 6600 is running everything fine for me. Then again I grew up in the 1990s so everything modern looks good to me.
That is the most "just enough" setup, I love it!
@@AMDRyzen57500F 99% of the games I play are pre Cyberpunk so I just told myself anything more would be a waste.
Similar situation: if it can run Total War, Space Marine 2, Elden Ring, and Armored Core 6, it's all I need for years to come. I've only played Space Marine 2 from those… it got me back into gaming.
@@anderotaola7515 Space Marine 2 is the first game in a long time tempting me to pay full price. Holding out for a sale.
@@AMDRyzen57500F it is not "just enough". Gamers in reality don't need anything faster than an RX 6600.
I remember buying a little GTX 950 back in the day for just 150 USD, and I was set to play at 1080p, high settings, 60fps. Now? Budget cards are basically nonexistent.
It's also a shame about the Ada Lovelace GPUs: there is an RTX 4050 laptop chip but no RTX 4050 desktop variant. Compare that to Turing, when we had the GTX 16XX series as budget options alongside the RTX 20XX series.
I still have my GTX 970. Wonderful little card; it served me fantastically until I got myself a 1080 Ti. And it's still working just fine. But you ain't getting more than 20fps out of that little guy in modern games at 1080p, when I used to get 100+ fps in games like The Crew.
for those wondering about a budget GPU, there is the RTX 3050 6GB, with a shit memory bus, for about $200.
There... you get the "RTX" brand without any capability to actually run ray tracing. That is how low Nvidia has sunk in terms of respect for consumers.
I don't even have the words to tell you how underwhelmingly the RTX 3050 runs... it's just very disappointing.
I remember the 1050 being a beast, then the 1650 Super taking budget builds to the next level.
@@nickochioneantony9288 exactly... nvidia's current "budget" option is the 4060, right around 300 dollars depending on where you look... 5 years ago that was used 1070/1070 ti money
@@infernal-toad it's 'cause the desktop RTX 4060 is actually a 4050. It's based on the xx107 chip; 106 was always the xx60-series die and 107 the xx50.
I built my PC back in the beginning of 2023 with an RX 6600. Great 1080p card and relatively cheap. I recently upgraded to an RX 7800XT and because I'm still gonna stay at 1080p for a while, I thought "well, this is a 1440p card, so it should be pretty overkill for 1080p and last a long time".
Card is already struggling to run some recent releases. I played the Silent Hill 2 remake at 1080p native, everything maxed out minus ray tracing: an average of 70fps, but with constant stutters, and in some extreme (albeit rare) cases my fps dropped to 45-50. And I know what you might say, "well, yeah, you had everything cranked to the max!", but dude, it's a 7800 XT at 1080p, and I wasn't even using ray tracing. SH2 is gorgeous, for sure, but it's also mostly closed environments and low draw distances when outside because of the fog. This game should be running way, waaay better on this card. There are other cases; I'm also struggling to maintain 60fps in Final Fantasy 16, and as beautiful as it is, it looks less complex than SH2 on a technical level.
This industry is fucking cooked; games have become 100% dependent on upscaling and fake frames, both with noticeable downsides. Expensive-ass hobby.
That’s why you always have to be vigilant when hunting for a reasonable bargain. When the RTX 3090 FE first came out through BB stores, the price was set at $699.00 for months. At that time, Nvidia CEO Jensen Huang didn’t believe he could get customers to pay much more than $700.00.
Micro Center has come through once again for me with a 3090 Ti @ $700.00 (backroom, open box).
Sounds like a Windows problem. I can't believe it's 2024 and people are still using the worst operating system ever made.
@@mattseaton5832 Let me guess: you use Linux 😂. Enjoy trying to run any multiplayer games on that thing.
@@goatguythepowerful no I suffer through windows currently. I just don't pretend it isn't terrible
I can charge WHATEVER I want.
Because in the end, YOU GUYS BUY IT ANYWAY.
LOLLLLL!!! THANKS FOR THE LEATHER!
speaking of leather.. how's the fleet of leather-jacket-wearing, flesh-eating cyborg assassins coming along?
Exactly! I would too! Gimme all that money dummies! 😁
You just get more with Nvidia.
I haven’t bought a new GPU since 2017. Outside has way better graphics
@@javascriptkiddie2718 Broke talk. I have a 7900 XTX, and given that AMD chose not to even compete at the high end, I'm going with Nvidia when I upgrade.
AI upscaling was supposed to let video games be played at 4K. We've been talking about 4K gaming since the PS4 Pro. I should not see "1080p" in a game requirements table in 2024.
Problem with this is that 4K monitors with decent refresh rates end up costing quite a bit. A lot of people would rather stick with 1080p or 1440p at 100+ frame rates than spend an arm and a leg on a 4K monitor that will most likely only do 60.
Problem is, if you look at Steam’s hardware survey, a LARGE majority of players still use 1080p, with 1440p right behind it.
Moore’s law was killed by shareholder profits.
Gaming has gotten too big too fast. It has to balance itself out by shrinking a little. AAA titles must get smaller, and people should stop overrating them and give more chances to indie titles.
This is IMO very true. AAA titles take many years and billions of dollars to create, so they almost have to be generic to be safe bets for investors. And hardware became more expensive partially because of demand as well.
Maybe try VR - there’s a lot of smaller scale games which are innovative and fun with the VR tech (Quest 3 & PCVR). They’re not too taxing on hardware either.
Most games could afford to be shrunk down. I wish companies would ease off on quantity over quality. I hope semi-open and open-linear titles become more common. This next Dragon Age game seems to be smaller in scope than the previous game, so I'm excited, since the previous game is why I developed open-world fatigue.
I think indies get their fair share of love and hype from people. It is the AA and smaller AAA games like Prince of Persia, Astro Bot, Hi-Fi Rush, and Unicorn Overlord that tend to get overlooked more.
Too bad that 90% of indie games suck.
@@goatguythepowerful that 10% is still way more games than AAA, not to mention a good chunk of AAA games also suck balls nowadays.
Newer games these days feel unoptimized, to the point that triple-A studios are becoming reliant on upscaling and frame generation instead of optimizing their games to run better on older hardware. The two games I have been playing are Red Dead Redemption 2 and Forza Horizon 5.
The main reason I bring these two up is that if you compare Red Dead Redemption 2 to the newest games, you'll notice it looks and performs way better; and Forza Horizon 5 is the most optimized game I have ever played, to the point that old graphics cards still run it decently well.
Basically, I'm saying game development shouldn't rely on upscaling and frame generation but on optimization; upscaling and frame generation would be put to far better use on older graphics cards and even consoles. I do blame the higher-ups in game development for these problems, and I say it's better to delay a game to allow more time for it to be optimized for better performance.
I run everything I play on 1080p native on a 40 inch tv... No FSR, no DLSS, no ray tracing, no motion blur... If I can't run a stable 60fps like that with max or almost max settings, I don't need it... Kinda makes sense now that the last AAA game I bought was Fallout 4 in 2016...
hell yea man, RDR2 and Metro Exodus were peak
Forza horizon 5 is crazy, I was able to get 60 fps with a 1050 ti on high settings.
Everybody brings up Red Dead 2 as some super-optimised masterpiece. Which it is, but the reality is that game will still destroy a 4090 at 4K when maxed. The 2080 was the best card when it launched on PC. Yo
@@mojojojo6292 Incorrect, it can get around 100 fps at 4K ultra max settings without DLSS. Even my 1070 can play it at 60 fps at around medium-high settings.
AAA studios should focus way less on huge budgets and insane graphics, and start putting focus again on actual passion projects and solid+fun gameplay
Talk to the shareholders about that. They are the real ones running half of the studios these days.
11:20 When I was in college a few years back, quantum tunneling was theorized to start being an issue at 2nm. If it's now only an issue at 1nm, that's a pretty notable improvement.
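For context on why shrinking makes this worse: in the standard WKB approximation, the probability of an electron tunneling through a barrier of width d and height Φ falls off exponentially with the barrier width, so every reduction in insulating thickness increases leakage sharply. This is the generic textbook result, not a claim about any specific node:

```latex
T \;\approx\; \exp\!\left(-\frac{2\,d\,\sqrt{2m\Phi}}{\hbar}\right)
```

Halving d doesn't halve the leakage, it squares-roots the suppression, which is why the "tunneling wall" keeps getting quoted at whatever node the gates get thin enough.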
Meanwhile PS5 Pro for $700 💀
@@raven679-m3i Ray tracing is available even on the PS5, and that PSSR upscaling is just a marketing gimmick; if they wanted, they could add it to the PS5 with just a software update. It's not that big of a deal.
800 euro 🤡
$700 and it doesn't even come with the disc drive.
Bargain tbh, a mid tier GPU now is that price before you even build the rest of the pc
But the problem is that a $700 PC with well picked parts will always outperform the ps5 pro, which makes the consoles less attractive compared to PCs
why is he saying billions when it clearly says millions?
*50 series going to be crazy expensive!*
Because we are competing with corporations who want AI chips. So if you want your GPU or CPU, you have to pay up.
@@RealityCheck6969 Yep very true, more profitable for them to sell as AI chips to corps.
But if they sold more reasonably priced normal chips to people who can actually afford them, wouldn't they make more than by just selling stupidly overpriced AI chips to corps?
@@AMDRyzen57500F Because of the demand from AI company's supply will continue to be lower than demand. In any market when demand is high and supply is low prices will be high.
@@AMDRyzen57500F The pricing is very different; the margins are hugely greater for corps. Even if they sell vast amounts to consumers, it may not make a difference. I may be wrong.
A company will never make its products cheaper if every time they raise prices people rush to buy them as fast as possible! I wouldn't either tbh.
There's a channel called @HighYield that goes into a lot of the nitty-gritty of process nodes and such. He did a recent video about gate-all-around architecture, and what I take from it is that for progress to remain exponential, innovation must also now become exponential. Of course this is pretty much impossible without a steady linear increase in IQ, but they are making more profit margin than ever, so R&D could compensate somewhat. As you said, progress will slow down but not stop.
AI, AI, AI, AI, but what is AI doing in the end?
Machine-learning denoising, basically. IIRC DLSS 2-3 and FSR 2-3 are pretty similar technologies, but DLSS and XeSS have hardware-accelerated machine-learning denoisers.
Machine learning is pretty cool. I had a work project where my team was coding a machine-learning-based tool for our salespeople to use (I was a chemist, so learning to code wasn't my full-time gig), and it was pretty neat. You train it on all the old data and it makes a "decision" based on that, and in theory it only ever gets better.
Once AMD allocates hardware to machine learning for FSR, I bet it'll make a fairly sizeable leap in upscaling quality, and it might even boost performance, since the GPU won't be burning async compute resources on upscaling as much. Of course, then you're taking up hardware space that could have been used for general performance, so maybe it'll be a bit of a wash. I think DLSS generally outperforms FSR slightly on Nvidia cards for that reason.
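To make the distinction concrete, here's the dumb baseline any DLSS/FSR-style upscaler has to beat: plain spatial interpolation with no motion vectors, no history buffer, no learned model. A toy sketch (function name and sizes are mine, purely illustrative):

```python
# Toy baseline only: bilinear upscaling of a single "frame". Real temporal
# upscalers like DLSS/FSR 2+ also feed in motion vectors and previous
# frames; this is just the naive spatial fallback they improve on.
import numpy as np

def bilinear_upscale(frame: np.ndarray, scale: int) -> np.ndarray:
    """Upscale a (H, W) image by an integer factor with bilinear sampling."""
    h, w = frame.shape
    ys = np.linspace(0, h - 1, h * scale)   # target rows in source coords
    xs = np.linspace(0, w - 1, w * scale)   # target cols in source coords
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = frame[np.ix_(y0, x0)] * (1 - wx) + frame[np.ix_(y0, x1)] * wx
    bot = frame[np.ix_(y1, x0)] * (1 - wx) + frame[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

low_res = np.random.rand(540, 960)        # pretend 960x540 internal render
high_res = bilinear_upscale(low_res, 2)   # -> 1920x1080 output
print(high_res.shape)                     # (1080, 1920)
```

Everything past this baseline (temporal accumulation, the ML denoiser/reconstruction pass) is where the quality difference between FSR, DLSS, and XeSS actually lives.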
It doesn't help that games are coming out with graphics far past the point of diminishing returns, with awful use of TAA and upscaling that makes those improved graphics basically pointless.
It would be better to focus on strong art direction. I have seen people crap on the new Ghost of Yotei, but it has amazing draw distance, artistic lighting, and atmosphere. Horizon Forbidden West still looks better than a lot of current-gen-only games and doesn't need all that fancy upscaling. Even games on my Switch can look fantastic when they work within the hardware limitations.
It's still cheaper than when every cryptominer hoarded all the GPUs.
As a Taiwanese, I appreciate your sharing. Taiwan number 1! Just a reminder, though, that long working hours have led to burnout and physical health problems for staff at TSMC. When you hold your graphics card, remember it really comes from the blood, sweat and tears of TSMC. I hope they can form a union.
Taiwan is the cornerstone for the West. It is also a great prize for China. You do get noticed.
❤❤❤ TSMC is nice!
Bro I didn't know squat about Taiwan but you having the best chip manufacturer in the world is wild.
@@paulboyce8537 Taiwan is not the West; they are trade partners. Officially, they are part of China. As soon as we have our own foundries, they will be cast aside.
I don't care, they had a choice and they took this one
Do I remember when the GTX 1080 Ti dropped?
Pepperidge Farm remembers.
I feel like, the same as phones, computer chips are reaching their ultimate form, where they have hit their full potential and just can't get any better. Time to invent new technology (or new physics) 😂
Personal opinion: we have reached a cap on 1440p and even 4K gaming. Who needs more than 175fps at 1440p? Who needs more than 80 at 4K? What is there even to improve upon? The next step is VR and the insane resolutions those headsets require. Until that happens we will be stuck in the current swamp.
Edit: The bad game performance is not a hardware issue but a lack of optimization from the devs to save money.
There are monitors that can display 4K at 240Hz, so yeah, there is still room for improvement.
@@cvd1 the difference between 240 and 165 is negligible at best
Esports and competitive gaming needs 1440p over 175hz. It's how I play CS2 and I can't play it any other way.
@@gurnery4219 I get that, but correct me if I'm wrong: esports players run these titles at the lowest settings with no shadows? Even a 4070 Ti Super should push 200+ at 1440p that way.
@Scorptice tbf, shadows in CS2 are a bigger thing atm than they used to be in CS:GO.
Personally, 60fps is perfectly fine for single-player games, but in competitive I want as much as I can get.
I don't feel graphics look nicer for the performance hit most of the time. I can't see 8K taking off for another decade, when TV companies make it affordable, and even then, do we even need it? We've definitely hit some sort of cap, and I agree to an extent.
The only thing monsterous about the 50 series will be the prices.
The power draw as well... 600w on the 5090....
The problem isn't that hardware is slowing down; it's that game optimization is going out the window. CPUs and GPUs run Warframe at 1000 FPS and it looks great at 4K. It's just that newer game engines are horribly optimized.
I remember when the GTX 1060 6GB was better than the GTX 980 4GB; today's RTX 4060 8GB, with its bad memory, is not even close to the 3080 (even the 10GB one).
My son is still using my 2017 GTX 1080 Ti build for Warzone and Rocket League at 1440p high settings, over 120fps.
Me using GeForce Now 🙄 on my MacBook Air.
My son plays only Minecraft and Roblox. He's never wanted to play anything else. He gets a locked 1080p75 at all times on the GTX 750 Ti. I ask him every couple of months if he wants a better graphics card and he's like, dude, what's wrong with mine? Lol
Soon cloud gaming like GFN will become more common. A friend of mine already moved to GFN because he did not want to play catch-up with the hardware. One year of a GFN sub is cheaper than a 3050 6GB.
@@arenzricodexd4409 bro, the input lag tho. It will never be solved because of propagation and equipment delay; even being in the same house as the server, with expensive equipment, adds delay. I can see it strictly for single-player games, but you cannot play an fps game with remotely competitive input lag. Plus the image quality has compression artifacts, like watching Netflix etc.
I think single-player games on handhelds are where it will excel. I imagine streaming Cyberpunk at 1080p with path tracing on a handheld would be a ten times better experience than playing it locally on the handheld. Might actually be a great use of the tech. It can't be universally useful, unfortunately.
@@christophermullins7163 personally I can't imagine myself going the way of the cloud. But for more casual gamers out there it might become a more sensible way to play games from time to time, rather than spending big bucks on a dedicated PC or console.
@@arenzricodexd4409 I imagine the remote gameplay engine could easily run on TVs out of the box. (Maybe some do already?) Such as a "Steam TV" or "Xbox TV" that you can play any modern game on. Also, remote gaming on a handheld would provide far better battery life than running natively... all modern devices become a gaming device. The tech has its place for sure.
As someone who studies economics, the higher prices make sense on both the political and the economic side. In fact, it's important to have higher prices when there are shortages (higher profit margins), because those profits can fund more R&D investment to meet consumer demand. And the consumers aren't just us gamers; they need to supply applications well beyond video games.
More competition is important, but for such a complicated production process, it'll be a while. However, the USA subsidizing manufacturing doesn't really help in the long run. Looking long term, they really need to reduce taxes and regulation: reducing the cost of production matters much more than just throwing money at the problem, especially for increasing competition, so that more competitors can cover shortages.
IP is a huge issue also. It's been holding back innovation for centuries.
It's a good thing that it's such a small upgrade from generation to generation, means we don't need to buy a new PC that often.
yeah, but you are playing at medium settings or using upscaling techniques with your 800€ graphics card. I don't like that.
@@George_C-e4v Bro, it's not that the hardware we have now sucks. It's awesome; bad optimization is the reason system requirements are getting insane!
@@George_C-e4v I just don't buy new games that are too demanding. If I can't run a game at native 1440p above 120 fps on high settings, I just buy the game whenever I get a way more powerful PC.
Until newer games force RT/Lumen to be enabled at all times, tanking performance on low-end GPUs; some can't even run them at all because of mesh shader requirements 😂
If everyone were to skip multiple generations, component sales would plummet. Then Nvidia and AMD would pivot even more towards their corporate customers, and PC gaming would get less innovation plus higher prices. Developers crafting better games that run on old hardware are the only hope for the future. For now, playing older games is the obvious option.
5090 is gonna be a performance monster but it's also going to be a price monster.
Modern games are very unoptimized for modern hardware. GT6 was able to run on the PS3 at a native 1440x1080 at over 50fps.
Brothers it’s time to get up, vex just dropped a new vod
It is of course a pure coincidence that these annual price increases happen at the point in time when ALL products are made by A SINGLE manufacturer (TSMC)...
Yeah
TSMC power. They can charge high prices because there is no competition.
Erm actually Intel makes their own silicon wafers 🤓🤓👆👆
But seriously, Intel is not that attractive for normal consumers
Get an Intel GPU so they can speed up driver improvement. They perform great when they run smoothly. Only certain old games have issues, as Intel doesn't have a long history of driver experience.
In a few years, Intel will have better GPUs than AMD. Hopefully they'll actually compete with Nvidia, because AMD basically bent over and gave up when they announced their high-end GPU effort failed, which is actually why they won't be releasing a high end.
@@AMDRyzen57500F yes, which is why their CPUs use more than double the power of the AMD equivalent
the sad thing is that most people out there have NO IDEA about real price-to-performance and will spend, even take out loans, to get these cards. the delusion of the last 3 generations has been incredible.. nvidia, intel, asus, etc know this.
One thing that will likely improve chip performance massively is photonic chips, since they bump up clock speeds without increasing heat output. But we won't see this until the mid-to-late 2030s. We are getting glass substrates around 2026 to replace organic ones, and glass substrates can be used with photonics.
Using sand product to make a different sand product. Nice
Greed. Greed destroys everything. Eventually none of us will be able to play any new games.
gotta toss ego in there too; they're both equally guilty and destructive here. I think, at least.
Literally makes no sense. Games cater to the majority of players. Even a 1060 can play new games nowadays. Consoles also exist and aren't stronger than a new PC (though they can be stronger than a PC from 4 years ago). The problem here is with players. They simply don't understand what their system is capable of and what resolution and settings it's made for nowadays.
@@albert2006xp Your delusion is astounding.
@@BleedingMem0ry On a post that claims companies will release games nobody can play. That's wild. You guys are out of your minds.
I saw the new games and I don't want that cultural rot.
I feel like PC gaming can be better optimized; the Steam Deck is evidence of this.
Meanwhile, more people than ever have a gaming PC.
It is not that games are progressing at a faster rate than hardware; it is that copy-paste engines like Unreal are being used to spew out games as quickly as possible while not having to be optimised. As long as it works, it is released.
Not every game has to be a hyper-realistic game that requires a 4090 to run; Battlefield 1's level of graphics from 8 years ago is still perfectly acceptable.
BF1 indeed still looks amazing! It's krayzie
Even earlier. Even bf3 still looks great
0:18 I joined PC gaming around 2018.........What do you mean the 1070 was $380 when it came out?
I literally bought the 1070 for around 550 or so plus tax at Micro Center… months after it came out. So saying $380 is so fundamentally and patently false that it has to be a deliberate lie.
The launch price for a 1070 was $379 if you look it up. I also got my 1080 for 400 around 2020, so it's not far off.
@@smallpeople172 I bought a 1070 at 380-ish back then too.
@@Error-kh4cx It was a simpler time wasn't it.
@@Nito-San the GTX 1070 was 500-550 euros in Europe, so they're spreading false information; the RX 480 8GB was like 300 and the 4GB version 240.
As soon as frame interpolation was made a thing, it was only a matter of time until developers started relying on it to hit playable framerates.
That time has already arrived.
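For anyone curious what "fake frames" means at the crudest level, here's a toy sketch: blending two real frames to synthesize an in-between one. Real frame generation (DLSS 3, FSR 3) uses motion vectors and optical flow rather than a straight blend, which is exactly why a naive version like this ghosts badly on anything that moves; everything below is illustrative only:

```python
# Toy "frame generation": average two rendered frames to fake a middle one.
# Real interpolators warp pixels along motion vectors instead of blending.
import numpy as np

def fake_inbetween(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Blend two (H, W, 3) frames 50/50 to synthesize a middle frame."""
    return (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2

# 30 real frames in, 59 frames out: real, fake, real, fake, ...
rendered = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
            for _ in range(30)]
output = []
for a, b in zip(rendered, rendered[1:]):
    output.append(a)
    output.append(fake_inbetween(a, b).astype(np.uint8))
output.append(rendered[-1])
print(len(output))  # 59 "frames" from 30 real ones
```

Note the catch the commenters are pointing at: the fake frames nearly double the fps counter but add nothing to input responsiveness, since the game only sampled your controls for the 30 real frames.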
Nvidia GeForce Now will answer all your questions 😅
In the US prices don't differ much, but here in South Africa a 4070 Super costs $850 US on sale and a 7900 GRE costs $750 US on sale; that's a huge price difference. The 4090 is $2650 US on sale... that's not even affordable. I can buy a car for cheaper.
It's basically the same thing in Brazil :/
Similar in Australia
The 4090 isn't produced anymore. That's why the price goes up to these crazy values......
@@ElRabito The 4090 has not been discontinued just yet, so stock hasn't diminished and the price hasn't changed due to that. $2650 is a very cheap price for a 4090 in South Africa.
$2650 on sale? that's insane
I wish more games would opt for a cartoony art style. Not only does it age better, it also upscales significantly better when needed. Wind Waker running at 480p with a CRT filter on still looks pretty dang good. Team Fortress 2 at native 4K looks incredible, and it came out in what, 2007? Dunno why Borderlands is so GPU-heavy these days though. Guess it's that Unreal backend.
Maybe it's the lack of "fine detail" but they look great even at lower res.
Valheim
Valve is working on a new IP, "Deadlock", and the graphics look pretty simple, nothing crazy. Honestly, for a MOBA-shooter hybrid it's the most fun I've had in a comp game in years. Tbh I hate ultra-realistic-looking games with motion blur and a lot of post-processing. CS:GO was the best: it looked grungy and dark like most Source games, but not too realistic, especially in physics.
@@supawithdacream5626 Yeah, I started playing around in Deadlock a little bit. Kinda reminds me of Monday Night Combat from back in the day, which is a good thing.
Unfortunately I don't have a ton of time to dedicate to multiplayer shooters like I used to though.
@@Torso6131 yup I feel that, some of the best years of gaming are already behind us anyways if I walked away from modern games I wouldn’t regret anything tbh
@@supawithdacream5626 I basically play three to four games a year at this point. Baldur's Gate 3 was incredible, Tears of the Kingdom as well (even if it felt a little too similar to Breath of the Wild), but multiplayer stuff has just passed me by.
It bums me out that more games aren't using custom engines anymore. I feel like UE5's sheer heavy backend is causing so many issues these days.
I purchased my 3080 for $699 on launch day at Micro Center. This is just greed, plain and simple. They used both Covid and shortages to justify price increases and never went back.
Still holding on to my 2070 Super.. performance is still respectable in 2024
As GPUs get more advanced, devs get more incentive to be even lazier with optimization. If this trend continues, a few years from now you'll need a 4090-level GPU to run new AAA games at 1080p low, 60 FPS (with frame gen). I'm frankly sick of these badly optimized titles. It makes you want to quit AAA games entirely and just play indie games.
No way. If by "in a few years" you mean in 5, 6, even 7 years, then yeah. Not in a "few" years.
Stop buying lazy "AAA" games, they've been awful for past decade
I did that.
I only buy single player AA games. They are much better.
@@gruntaxeman3740 based
I am SO glad I built my pc about 6 months ago. Got the 7900xtx and 7800x3d at the correct prices
Games these days don't look as good as they did back in the 2012-2016 era.
They do.. but at what cost? Playability is non-existent.
me who only plays games from the ps3/360 era
@@5-OTT.s-ON-A-KICK I'm going as far back as the PS2
@@337x I lost my display adapter for the PS2, yk, the one with the audio-looking plugs; atm I have my homebrew PS Vita xd
Back when SLI was a thing.
2020: 2080 Ti, 4K native, 50-70FPS.
2024: 4080, 4K DLSS 2+3, 70-90FPS...
DLSS has ruined the state of games terribly...
This^
Why would you run native when you can run DLSS and make a game more demanding and graphically intensive? Without DLSS (and upscaling on consoles as well) a game like Alan Wake 2 would have to release in the year 2030 to run.
@@albert2006xp They should have waited so it would not have been butchered like that, and maybe it could have recouped its costs. It still has not broken even.
@@dansmith1661 It's their most successful game, and they managed to fund a risky cult-classic project most gamers are too dumb for; they will make their money back. We would rather play better games now; none of us knows when this life is over.
Great video! I wish it would get more views considering the wonderful work you put in.
literally love ur content, because you present the average gamer's POV, which makes it relatable to me.
Underrated channel.
I watch more of this guy's videos than almost any other channel, AND his channel probably has the lowest subs.. how does that happen?
@@christophermullins7163 you see their reasoning: they always mention either companies, fans, or smth they've been doing now, but never my rights or fairness