The problem is devs are overthinking graphics. Yes, having huge graphical improvements each generation is nice, but we need good gameplay, storytelling and optimization. Right now we have games that tank your fps trying to be as photorealistic as they can be, with bad gameplay and little to no optimization (don't tell me relying on upscaling is optimization!)
They are not overthinking. They are cutting costs by using RT, DLSS and frame gen, and getting money from partnering with companies like Epic (UE5 Lumen) and Nvidia. It's easy money for everyone.
THANK YOU. Going from Low to Medium settings used to make it look like you were playing a DIFFERENT GAME, and if you went from Ultra to High, or even High to Medium, there were BIG FPS gains. NOT ANYMORE, especially since DLSS and FSR gave publishers the excuse to cut COSTS by killing traditional optimization.
No, the problem is that with the increasing complexity and feature count of engines, many artists never gain low-level knowledge and rely on existing platforms maintained by limited developer teams that constantly need to cater to new people. So basically lack of expertise + stupid deadlines.
Nobody asked for graphics; the industry did that to itself, creating the illusion of value through visuals. Black Myth: Wukong would have been a 1-2 year project at most if it weren't so visually heavy. It also shows that future games won't even have to care that much about optimizing for framerate: with FSR/DLSS and frame generation, games will keep spending on graphics and effectively "force" FSR/DLSS and frame gen, as Wukong did, to perform as expected.
Spot on. I have no issue with FSR or PSSR being used to let a rock-solid 4K30 game be playable at 1440p60, I think that's a fair trade-off, but then you see games abusing FSR to the point that PS5 and Series X games have an internal resolution of 720p! It's crazy. Immortals of Aveum is actually a fun game as well, but going UE5 ruined it on console, as it had no chance of hitting 60fps without FSR at a base resolution of 720p!
It's a great way for the CEO of a console company to try and persuade people to stop pursuing tech improvements, because those improvements eat into the cost and effort of a console 10x more than they do on a PC, where you can simply adjust settings. You guys eat up marketing so easily while thinking you're too informed to buy into it.
It's interesting when you look at it from the perspective of memory improvements. The PS3 had 256MB of RAM, plus 256MB set aside just for graphics. The PS4 has 8GB plus 2GB. That's a 30x improvement. THIRTY X! O_O Now, the leap from PS4 to PS5 was....2x....it went from 8GB to 16GB. A bit underwhelming from a memory standpoint. So while the PS4 could store 30x the textures the PS3 could, the PS5 only doubled texture capacity from the PS4. Crazy.
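For what it's worth, here is a quick sanity check of those multipliers using the figures quoted above, as a rough Python sketch; it compares the PS3's 256MB main pool against the PS4/PS5 unified pools and ignores OS reservations and exact memory splits, which vary by source.

```python
# Rough generational RAM multipliers, using the figures from the comment above.
# Assumes we compare the PS3's 256MB main pool against the PS4/PS5 unified pools;
# OS reservations and the exact memory splits are ignored here.
MIB_PER_GIB = 1024

ps3_main_mib = 256
ps4_unified_mib = 8 * MIB_PER_GIB
ps5_unified_mib = 16 * MIB_PER_GIB

print(f"PS3 -> PS4: {ps4_unified_mib / ps3_main_mib:.0f}x")    # 32x (the '30x' above)
print(f"PS4 -> PS5: {ps5_unified_mib / ps4_unified_mib:.0f}x")  # 2x
```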
@@marsdenit2845 Just a reminder that Minecraft is still the most popular game in the world. Gameplay is the most important part of a game, not graphics.
I'm glad that the graphics are plateauing. Hopefully now, the focus will shift to improving optimisation, gameplay and work conditions. (Spoiler: the focus will be MTX)
Very simple answer: I do want better hardware, but not for better graphics, for bigger scale. Keep current resolution, models, lighting and so on, but add more detail to the maps (more props, furniture, interiors), more actors that are more interactive, farther view ranges, etc. There's still a lot of growth that can happen in how games look; it's not all about resolution and raw texture quality or the number of triangles.
Everything you just described is still graphics though, and there are improvements being made in that realm. That's part of why things like mesh shaders are useful, just to name one example. A lot of stuff like this is happening in game development; it's just more of a literacy issue now. It's sort of like how people might know they like one song in a genre and not another, but might not actually have a way to convey why that is. Gamers might be able to recognize that something is improved or different now, but they can no longer articulate why, because the difference is no longer a jump from "barely being able to represent something that looks like a thing" to "being able to represent the thing". It's also like how you can tell when CGI in films is bad, but someone watching would struggle to tell you why it looks bad and sticks out. Game environments generally are denser, they've got way more stuff in them, and farther view distances got solved a long time ago for everything but foliage with an infinite reversed depth buffer; the only thing left to push there is maybe higher-fidelity LODs, and that's it. All of this stuff is improving, you just don't have the literacy about how games are made to be able to articulate or grasp what's different now.
I can't wait for AI speech algorithms to be implemented in gaming. Just imagine a whole story based on an AI given a small script, scaling the story based on how you react.
@@adeptalakay I want to talk to an npc not chatgpt. Chat bots are not at the level where they can fit seamlessly into a game world while making sense and not hallucinating
I think physics simulation could use a major overhaul. It's been an afterthought for quite a while now. Even the games we recognize as having really good physics exhibit floaty objects that rarely express their true weight. I'd like to see the industry turn their sights in that direction. Audio has also fallen behind. Some games sound great, but only in comparison to other games. Much like RTGI which bounces light, I'd like to hear sound that bounces as well. It's not often that we have a game that does that. More often, if a sound is coming from my right I only hear it on the right, as if there were a void on my left that sound cannot reflect from. Maybe these things are not important enough to the consensus, but I for one have been wishing for these improvements.
I think physics simulation is a hard thing to do not because of limitations, but because making physics an integral part of your game is not easy. Half-Life 2 and Half-Life: Alyx are about the only examples where really good physics simulation actually meant something for gameplay. I wish more games did that.
@@keatonwastaken I’m currently playing Control, which is why the subject is fresh on my mind. The physics are great in that game, but even so, objects seem to have one predefined weight, which takes me away from the experience a bit. I’d say even if physics aren’t critical to the experience, I think it would make games more immersive. Regarding a new frontier in gaming, that’s among my top picks.
@@LilMissMurder3409 Absolutely, especially open world games. It's back to the drawing board for sure when it comes to that, because it would be too costly and unreasonable to mocap filler NPCs. Inevitably I think AI will be a big part of this gaming renaissance, as much as I hate to admit it. I can see how it would be beneficial, but it's also a big can of worms. I can feel other commenters readying their pitchforks now lol, and I can't blame them.
Graphical fidelity is good enough. They need to make the physics way better. It's like the picture is getting slightly better, but less and less interactive compared to older games. Arkham Knight to this day looks graphically amazing, and when you turn on Nvidia PhysX, it looks way cooler than any ray tracing we have today.
Agreed. The Nvidia PhysX smoke in Arkham Knight still looks better than the smoke in almost all games today. Felt like a taste of the future back then. But here we are, in the future, and we are still seeing the same old, non-interactive, billboarded transparent textures meant to represent smoke, and it's just sad.
I don't consider the smoke PhysX interactivity. I consider a world like Zelda's, where the world reacts in a common-sense way to your actions, an example of interactivity. That smoke was nice, but it held no gameplay importance; it was just fancy FX.
That's just a result of ageing, unfortunately. The more you experience the higher your standards become, and the less novel everything is. What was once exciting and unexpected is now dull and predictable. And there is a foreshortening aspect when you look to the past. The past contains all the great things you loved, it is full of them. The present only contains the things you love right now, which are inevitably fewer. Same reason as why music was always "better" in the past.
@@calmhorizons You aren't wrong, but I don't fully agree with you. It's pretty obvious when you compare games released, say, 2004-2011 to games released 2017-2024 that the quality and quantity of good games has dropped significantly. Same with music. I'm mostly listening to old music, and discovering new stuff all the time, so it's not nostalgia in any way. The only thing one can argue about is survivorship bias, in that all the bad stuff from the past has been forgotten/ignored and only the good stuff remembered and filtered out. That works to your advantage though, since you know the old stuff is going to be pretty good for that reason.
@Skumtomten1 Most old music is terrible too, and the same goes for whatever genre or year you're in: most music is bad, a good number is decent, and a few are very good. Obviously you're only listening to the good ones.
@@calmhorizons Predictable doesn't equate to bad. Music only has so many note combinations and chord progressions, but a new, unique spin can still intrigue someone who prefers his generation's music. Pokémon, Mario Kart, Mario Party, Smash Bros.: the video game market is full of predictable but still great releases with new tweaks. One Piece can be predictable, but it's still the greatest work of fiction in modern history.
@@thunderstar254 True to a point. But you might be discounting the vast gulf between possible combinations and plausible combinations. There is a reason why we see the same flavours, melodic intervals, story beats and visual motifs repeated over and over in art and entertainment - evolution furnished us with a limited margin of acceptable interests and constrained sense organs. To make a crude example - it doesn't matter how many variations of shit flavoured ice cream you make, it ain't gonna sell. 😁
I would 100% love it if the world just decided that graphics right now are good enough. Why push progress 1mm forward if that 1mm costs me $1000 just to keep up? The answer is "because they want that $1000 from me again and again as they very slightly change almost nothing but performance cost". The industry is very clearly and very deliberately sabotaging performance in order to force consumers to buy new expensive parts if they want to keep up. They might actually add slight improvements to games, but they're so minimal. The only actual change I've noticed in newer games is that they run worse and look worse on my hardware than older games do. That's all there is to it. It's never been about making prettier games. It's always been about making consumer hardware obsolete.
I think this is a bit of a stretch. It's not that manufacturers and publishers are colluding to make hardware obsolete; it's that hardware evolution continues steadily, and prettier games attract consumers, which in turn cost more resources to make, which in turn influences publishers to crunch development time to save money. It's a perfect storm of pressures, and in the AAA space that means 4-6 year old hardware becomes obsolete when, for example, games like Alan Wake 2 are designed around mesh shaders - and stand out visually for pushing the needle forward. Your observations aren't wrong, but the narrative is both simpler and more nuanced than "the industry is against its consumers." The truth is that there's more computational power in the hands of developers than ever before, but those resources are being squeezed to get games out the door quickly, relying on cheap techniques like TAA to buff out the visual imperfections. I'm not sure of the solution. Times are tough right now, new games have very much stagnated, and we need more publishers willing to take on riskier ideas than yet another remake of their next greatest hit. Alleging a conspiracy doesn't help anyone.
Graphics can take a backseat once we finally have ray tracing doing all the lighting as the norm. That's going to be a huge graphical leap that can't be overstated.
“The enemy of art is the absence of limitations.” - Orson Welles. I believe the digital age has disintegrated the mindset of optimization. Games no longer have to work before they go in the box or fit on the cartridge or disc. Games can be 200 GB, run poorly, and get patched later. Broken indeed.
@@CurtOntheRadio I'd say it is. Anyone could write some code that does a thing, but not everyone can optimize it to run well. Like how anyone can dip their hands in paint and print them on paper, but not everyone can make a beautiful painting from that.
@@captainjimo Hmm. Then everything is art too. I'd venture Welles wasn't speaking about optimisation so much as about art being shaped by constraints: only having these few notes, these few instruments, these few locations, whatever, i.e. working within material limits that constrain you and getting the most out of them. Optimisation is more about removing constraints. Arguably the art comes in working within constraints, whatever they are, and getting the most out of them, not removing the constraints (which is what optimisation does, and which is largely a technical, objective, engineering job).
Fuck graphics. What pains me is how environments are sterile and uninteractive, how physics in games is dead, and how AI has been literally the same for over a decade now. Burnout Paradise STILL has the best real-time car deformation system in a racing game, and that game came out 16(!) years ago. Then there are titles like Red Faction: Guerrilla, which had an insanely destructible and interactive environment, or The Force Unleashed, which combined multiple middleware technologies to reach the devs' design objectives. Devs are pushing for visual stuff like what's mentioned in the video that can be barely visible, meanwhile you play a current-gen title and while going through foliage you pray, watching whether your character model will interact with that foliage properly or just ghost through it lmao. What the actual fuck.
@@arkgaharandan5881 That too. Devs really need to scale down on fancy polygons and provide more meaningful and interactive experiences, and by interactive I mean everything that includes physics systems. Everyone is slowly becoming Ubisoft when it comes to the bloat of safe slop designed on sterile corporate templates. The fact that you can launch a decade-old title, look at what its physics engine does, be genuinely impressed, and then not find a game 15 years later that comes even a bit close in that regard is fucking insane. The only titles that push something like that are some meme indie tech demos that market themselves around a singular physics-based gimmick, and that's it. When was the last time we had a cool gameplay-centered innovation in a big-budget title? The Nemesis system in Shadow of Mordor, 10 years ago, by Monolith. Nemesis is also AI-based, and looking at their previous work on F.E.A.R. in that regard, no wonder they were the last studio that even tried doing something around AI. Again, what the fuck is going on? It's like the entire industry became McDonald's tier.
Car destruction isn't even the fault of the developers; it's on the car makers, who hate seeing their cars destroyed. Most racing games license cars, and often the manufacturers get to decide whether or not they want a detailed destruction model.
@@crestofhonor2349 I know about the patent; stop excusing talentless, worthless Western developers who could design some other gameplay-centered innovation. Same with licenses and racers: there are games that use custom vehicles, and those STILL aren't close to Burnout Paradise in that regard.
@@bliglum I just started playing Titanfall 2 completely maxed out and upscaled to freaking 5760x2400 over the weekend, and it's as smooth as butter. It's been in my back catalog for a while.
@@bliglum A couple months ago I treated myself to the full Half-Life bundle - it amazed me that a 20-year old game can still look that good. It was money well spent (and righted a wrong, since I'd pirated the original back in the day 😅 )
@@bliglum It's the digital equivalent of digging through the software bargain bin at PC retailers back in the day. That's another thing that has died out.
@@kalebgross1310 It's possible. If you are still at 1080p, a 1080 Ti can easily do it at High at least, if not Ultra. Playing the same game at a higher resolution is such a game-changing experience, and even more so if you can skip upscaling.
I occasionally have to remind myself that I'm in an echo chamber of AAA games with ever increasing system requirements. Outside of that echo chamber, the "real world" loves the Switch and the Steam Deck, and many hugely successful games run on fairly humble machines. The "plateau" is actually quite a nice place.
I had to upgrade from my 5-year-old 2060 Super to a 4060 Ti for some AI VRAM work, despite all the community trashing on it. Turns out there's really no game I play that actually struggles on it, and that small bump from the 2060S is all I needed. Most of my regular FPS games are CPU-reliant. Most of the others are anime-esque games that don't push graphics, or stylized like Overwatch, and the most demanding AAA games I like to play and revisit, RDR2 and CP2077, run smoothly. The games I've gotten the most hours out of in the last 5 years are Factorio, Genshin, etc. I've realized how far I am from the expectations of AAA gamers who push 4K, RT, etc., and it was warping my mind for a while, thinking I'd made bad purchases based on other people.
@@lancevance6346 That's not their point though. You can't just say "oh, the 4060 Ti is amazing", because it's not. You don't make a card with literally the same performance as the last-gen card, when even the 3060 had 12GB of VRAM, and put a higher price on it. Second, Nvidia is not the good guy in this story; they profit tons of money off consumers and have become the richest company in the world right now, even Apple didn't make money like they do. So shitting on PC hardware companies is completely valid right now.
@@lancevance6346 And even if you like AAA games from 5 years ago, those games ran on GPUs from 5 years ago too. Now even a 4060 Ti can't run most AAA games well at 1080p, so what will it do next year, or in 5 years? When you bought the 2060 Super, didn't it run RDR2?
This seems like a commonly held misconception which doesn't hold much weight under scrutiny. Which games specifically are you referring to? Which games do you play? Have you genuinely not played any top tier games over the past 10 years or so?
@@steviewonder0850 While there will always be good and great games coming out to play, there has also been an exponential increase in shovelware and bad games over the past decade. For every Baldur's Gate 3 or Helldivers 2, we get a sea of trash sports games, unfinished games that take 1-2 years to actually be completed, and games annoyingly butchered by microtransactions.
@@darthwoody9917 I don't know. There was so much shovelware during the PS2 and Wii era. The DS, Wii, and PS2 were known for just having so much shovelware dropped onto them.
Difference is those games usually weren't flagship titles made by large AAA devs. Activision & EA USED to actually make decent games; now they pump out slop.
It'd be interesting to see AAA game budgets separate the development budget from the marketing budget. I swear half of their entire budget is just going to marketing and ads to sell poorly designed games that aren't fun in the first place.
It's not unusual for the most heavily advertised games to have a marketing budget 20-30% higher than what was spent on multiple years of development, and even after all of that, how many games can you remember any of the advertising for? You're not far off on the 50% ratio, even at AA levels.
Your best marketing is your consumer: give them a good time/experience and word spreads fast. It does not matter if it's a device, a restaurant, a hotel, a beach, etc. The person buying any product is your marketing.
We reached a certain plateau back when the 3080/3090 was king; you can play at 1440p with DLSS Quality for years to come. In 2-3 generations the visual representation will reach a point where no more power is needed. I'll back this up with "no one wants photorealism in games": a good art style is much more important. People do not want games to look like the real world; the opposite is the case. The same can be said of freeze-frame comparisons between 4K and 1440p: you cannot tell whether a game is running at 4K or 1440p on a monitor. The pixel density is already high enough for ~27" displays; a 4K display adds almost nothing at a super high performance cost.
I remember when the 20 series was coming out and everyone was talking about the potential of 4K gaming. Nearly 3 gens later and not even the most expensive cards can run 4K natively at a consistently smooth rate yet.
True 4K that fully benefits from the higher resolution is a scam for consumers. It exists almost nowhere. Facts:
1. Sub-pixel detail. Pixels look noticeably better when there is more information per pixel blended down into one pixel. Keyword: "supersampling".
2. Camera resolution. A camera has separate pixels for red, green and blue behind a filter, and all of them are counted in those megapixels, so a good 4K image actually requires an 8K camera.
3. YouTube and streaming. They use such heavy compression that a 4K video looks about as good as Full HD. Those resolution settings are really quality settings with more headroom per pixel, so they can be downscaled if necessary. True 4K on YouTube requires an 8K upload.
4. The human eye. Human eye resolution is about 1 arcminute. Just do the math based on viewing distance and screen size to see what you can actually resolve. In living room conditions, 4K usually doesn't matter; a large display at arm's length, or an image projected onto a wall in a movie theatre, is where it matters.
5. GPUs suck at small triangles. That is very wasteful and requires LOD levels, and having more LOD levels is also wasteful because the GPU can't benefit from instancing as easily, which makes memory bandwidth the bottleneck.
In short, there isn't much 4K gaming that really benefits from 4K. It is lacking even in camera technology. So while frame generation easily sucks, frame upscaling does not. A game can be made for 720p-1080p and upscaled from that to the panel resolution, and that is how it is done in reality. PC gamers try to play games without upscaling, and that is when they see the truth: the game is made to be played from a sofa in a living room on a console, running internally at 900-1080p and upscaled from there. If a game runs at 30fps at 1600x900 on PS5, a large 4K display at native resolution and 60fps requires almost 12x more GPU power and bandwidth. And I'm not even talking about ray tracing or path tracing yet.
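For what it's worth, that "almost 12x" figure checks out as a naive pixel-throughput estimate; here's a rough sketch assuming cost scales linearly with pixels per second, which ignores CPU limits, bandwidth and fixed per-frame costs.

```python
# Naive pixel-throughput comparison behind the "almost 12x" estimate above.
# Assumes GPU cost scales roughly linearly with pixels rendered per second.
base_pixels   = 1600 * 900    # assumed PS5 internal resolution
base_fps      = 30
target_pixels = 3840 * 2160   # native 4K
target_fps    = 60

ratio = (target_pixels * target_fps) / (base_pixels * base_fps)
print(f"~{ratio:.1f}x the pixel throughput")  # ~11.5x
```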
Eh, I've been playing at native 4K OLED since the GTX 1080 Ti. I'm still playing Baldur's Gate 3 at native 4K on my 1080 Ti. But next year I'll upgrade to a 5090.
@@slopedarmor 4K rendering is possible, but it is expensive. Just buy hardware 4 times as powerful as the latest-gen console and you get, at 4K, the same framerate and upscaling ratio the console gets at Full HD. But the content itself is generally made for the console's internal, pre-upscale resolution. Btw, I just started playing an older game from 2010. That game's content is targeted at 1280x720, so I set the game resolution to 1280x720 and forced MSAA to max from the driver (because it has a forward rendering pipeline), and that gives the best image. At higher resolutions it just looks like crap: text gets smaller, there's a lot of texture upscale filtering up close, and there are graphic elements optimized for 1280x720. There's even a grain effect optimized for 720p. So the game actually looks much better when running at the resolution it was targeted at. It's just smooth everywhere, and the limited texture resolution and other assets don't distract. Settings forced to high from the driver keep pixel quality high, so pixels don't look bad when they're larger on screen.
What I find crazy is that the graphics sometimes aren't even "that great" and you're still getting such a performance hit. On top of that, GPUs are becoming so HUGE, yet they STILL can't give you "good performance" at 4K.... I mean, tech is supposed to be getting "smaller", NOT bigger.
Have you seen smartphones in the last 10 years? They do the same as GPUs, getting bigger every year. But I do agree: if the top-of-the-line GPU, the RTX 4090, can't do 4K 60 in every modern game, then what's the point of it being the best?
@@CurtOntheRadio Well, OP wants good performance at 4K... I've personally given up on that. 1440p all day, every day. 4K is just about good for indie games; AAA devs can't be arsed to optimize their games.
I was very happy growing up with PS1, NES and N64, GBC and so on as a kid. Never complained about graphics. That's probably why the graphics don't matter to me at all today. I care much more about the story, the characters, the music and gameplay. And high fps of course. Nice graphics are just an added bonus--if they don't make the game run like trash.
@@Thunderhawk51 Same. I still love 2D pixel art as much as 3D, but enjoy good games of any genre from any era. Currently mid-way through my first playthrough of Bloodborne on PS4 connected to a plasma TV and it looks and plays great.
I think it will really vary a lot depending on the age of the person commenting. For me it was without a doubt the Nintendo 64. The OG Playstation was great, but all the geometry warping just looked bad to me, it wasn't the 3D revolution I wanted. The N64, however, just looked so high fidelity and smooth, and the geometry warping was nowhere to be seen, it just blew my mind. Dreamcast and PS2 were both great as well, but nowhere near as transformative to gaming for me as N64 was.
@@K31TH3R N64 was still too fuzzy and low fidelity for me. Ocarina of Time was amazingly atmospheric for the time, but in general, the generation of 3D was novel for how it transformed gameplay more so than visuals and I thought most N64 games were ugly too. For me, the Dreamcast was the first console that made 3D appeal on an aesthetic level.
As with all forms of technology, you reach a point of diminishing returns where "2x the power" starts to seem like "1.5x the power", then "1.25x the power". I'll relate it to 3D modeling: we used to see a new GPU generation take multiple minutes off render times; now a new GPU generation may take only 10 seconds off, or even less. But when you look at the percentage difference, it's still about the same, because 10 seconds is twice as fast as 20 seconds, but it's still only 10 seconds at the end of the day.
Show me a system that requires exponentially less energy to produce twice as much of ANYTHING. A perpetuum mobile still doesn't exist and never will. 😅
Going from 800x600 to 1078 at the same framerate is about a 40% increase in GPU power, or you could play at 800x600 and have better visuals but the same fps as on your previous GPU. Mind-boggling... a GPU should be 100% better, and preferably at a higher resolution, for the upgrade to pay off.
Yup, this is why I have been enjoying VR, because the Quest's mobile hardware increases by 2x every three years and you can see a real generational leap, like Resident Evil 4 to Batman: Arkham Shadow. Sadly, I think the Quest 4 will be the last massive generational leap for VR, and every model after that will bring incremental improvements like sharpness and FOV.
Imagine your son wants a gaming PC, nothing too fancy, just enough to enjoy modern games, and it costs you €1500. I bought my first PC 15 years ago for €500, and I could play everything. A kid will not be able to afford PC gaming anymore, because we added a bit of grass in the distance.
Imagine thinking prices from 15 years ago would remain the same. We could say the same about gas prices, groceries, housing prices, rent, taxes, healthcare. Maybe the issue is that prices have gone up but wages have not, and that's the reason why the world is slowly crumbling.
You can still play everything on PC with a low-end build, like your $500 one, but you have to stay at 1080p and turn down all the settings. Everyone knows that experiences are best enjoyed at max settings. That's why people buy better hardware: because this is their MAIN hobby. It's like golf clubs. You only need 1 driver, but people into it will have 5, and each of them can cost $2000.
My only hobbies are watching stuff online to save money and gaming; entertainment-wise, everything is free. If I can't afford a desktop or laptop, I only buy a phone and use that until it falls apart. I literally only work, sleep, save money and volunteer. I have no other hobbies and I go nowhere; I can't afford it, and I can't afford most basics like housing. It's very doable to afford a phone with no monthly cost and a computer, using free internet at food places. Until jobs pay, there is no reason to go anywhere or be involved with anyone. Entertainment is free and great, and a laptop or desktop is more than just a gaming machine.
Just made a PC for a friend with an RTX 4060 and an i3-13100F for €650, running a Gen 4 M.2 drive. Inflation is crazy, but if you look for the best price across countries and websites you can save a few hundred bucks.
Another problem is that companies aren't innovating anywhere near like they used to. They're playing it safe, with trend-chasing at the forefront. Also, the whole release-it-now, fix-it-later mentality is hurting the industry. Optimization really hasn't been a thing since at least the 6th generation of consoles, maybe the 7th. Graphics are fine now; I don't need them to be any better. We need smoother framerates and innovative gameplay that doesn't involve sleazy, greedy, FOMO tactics. I've only ever bought two revised consoles: the PS2 Slim (simply because my fat model was having trouble reading dual-layered DVDs) and the PS3 Slim (because my original 80GB model got stolen).
The extra performance should be used for gameplay, not graphics. Real AI-powered NPCs, full physics integration in gameplay interactions, gameplay-relevant path tracing (using reflections for something)... The only purely graphical performance we still need is to achieve current PC graphics in VR; after that, I would be fine with graphics no longer getting better. Half-Life: Alyx is so close already...
I have a high-end PC with a 4090, yet I see myself playing my hacked OLED Switch more, especially in the past 3 months. We all know Nintendo doesn't care about high framerates, and each game has cartoon visuals; I don't mind one bit as long as it's fun. And if you run a Switch game on PC with Yuzu or Ryujinx, Luigi's Mansion 3 looks like a CGI film with a couple of tweaks at 120fps.
I like both. Not everything has to be super unique in terms of art style. Also realistic games don't always age poorly, it just depends on how you hide your limitations
I don't know what sort of tech The First Descendant is using, but it was quite funny: I had RTX ray tracing enabled and was near a light source admiring how the light was bouncing off my character, and I even told my friend, "Wow, look at this, it probably wouldn't be bouncing off my character like that without ray tracing on". Later on, I was hitting some performance hiccups in battle so I turned my settings down, came across the same light by happenstance, and regular old rasterization looked exactly the same lol.
Rasterization doesn't actually have anything to do with real-time global illumination. There are multiple software based ways of approximating the effect or doing it literally, the stuff like RT Cores that are a part of the newer cards are really just dedicated hardware to accelerate ray-intersection tests and bounding volume hierarchy traversal. It basically just lets them do raytracing faster using hardware acceleration.
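To make the reply above a bit more concrete, here is a toy Python sketch of the kind of ray/box intersection test that BVH traversal performs and that RT cores hardware-accelerate. It's purely illustrative: real traversal code also handles precision, traversal order and the triangle tests at the leaves.

```python
# Toy version of the ray vs. axis-aligned-bounding-box "slab test" used during
# BVH traversal: the kind of intersection test RT cores exist to accelerate.
def ray_hits_aabb(origin, direction, box_min, box_max):
    t_near, t_far = float("-inf"), float("inf")
    for axis in range(3):
        # Guard against division by zero for rays parallel to an axis.
        inv_d = 1.0 / direction[axis] if direction[axis] != 0.0 else float("inf")
        t1 = (box_min[axis] - origin[axis]) * inv_d
        t2 = (box_max[axis] - origin[axis]) * inv_d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    # Hit if the entry point is before the exit point and in front of the ray.
    return t_far >= max(t_near, 0.0)

# A ray from the origin pointing down +X against a small box sitting at x = 5..6.
print(ray_hits_aabb((0, 0, 0), (1, 0, 0), (5, -0.5, -0.5), (6, 0.5, 0.5)))  # True
```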
Flat-panel TVs/displays have followed the same trend, unsurprisingly. SD to HD to UHD (and the variants of HD in between) showed pretty clear improvements, but for the jump from 4K to 8K the conditions and the content being shown have to be just right for it to have any impact. It also means that as the years rolled on, 4K TVs went from massively expensive home-theatre-like luxuries to cheap consumable items at a fraction of the cost. Will gaming hardware follow suit?
It feels like more people are looking to legacy hardware/emulators to get their gaming fixes, these days, because of diminishing returns, and because of the state of AAA gaming, as set by the publishers.
Yeah we might be seeing a trend of diminishing returns in graphics, but the problem is that hardware requirements aren't following the same trend. Despite being able to play beautiful games today, a game 5 years into the future that might even look stylistically worse won't even be able to run on my PC. So frankly we don't even have the option to go "yeah I'm comfortable with my hardware today", we need to spend more and more money just to be able to play games that look identical to the games we could previously play.
What I have been noticing is that games just look pretty, with less interaction with the world and less animation. I don't know how taxing or demanding it is to make foliage bounce off characters rather than clipping through them, or to have destructible environments, a world that is affected by battles, or more than 5 NPC models. Every time there are fewer and fewer animations.
Physics systems are pretty demanding from both a technical standpoint and a coding standpoint. I don't think you remember Nvidia PhysX, but pretty much everyone hated it because of the performance cost, even though it did add a bunch of physics interactions.
@@crestofhonor2349 Surprising how we haven't improved physics performance in the 15 or so years since PhysX was a household name... "RTX" is the new gimmick word now.
Scaling projects up is not a linear increase; costs rise exponentially. PS1-era devs didn't have to deal with facial expressions, for example, but nowadays we are not only looking at very expensive motion-capture performances for facial expressions, you also have to worry about hair simulation, subsurface light scattering, muscle movements, pores stretching, and the list goes on. And if any of these is neglected, you get a mismatch and a dissonance between how things look and how you expect them to look and behave. A lot of the advancements we see now have to do with automating these time- and cost-consuming endeavours (RT, MetaHumans, etc.), but it's getting more and more expensive and companies keep pushing bigger and bigger worlds and projects.
Many classic games pre-2000 were developed with dev teams under 50, and they usually developed their own engine at the same time. The original Baldur's Gate and its engine was made by approx. 60 people over 18 months. CP77 was in development for nearly 7 years and peaked at around 500 people on the team.
@@com.7869 The 2080 Ti released more than two years before the PS5. The PS5 is between a 2060 and a 2070 performance-wise; that's a far cry from what a 2080 Ti can do. If anything, the PS5 is equivalent to the lowest tier in the current Nvidia line, the 4060 Ti, and that's being very optimistic.
@@janbenes3165 3 years; the 2080 Ti came out in September of 2018 if I remember correctly. You're thinking of 3 generations; we get a new generation roughly every 2 years, or we did.
@@janbenes3165 Can a PS5 run 1440p native at max settings at 60+ FPS consistently? That's my problem with that statement, and why I find DF often misleads people with that conclusion. I had a 6700 non-XT and it can do that all day before you even think about applying FSR. The problem with DF is that they pass off the upscaling and act like "yeah, it's equivalent to XXX GPU" because said upscaler manages to bring it back to 1440p 60, when in reality the AMD Oberon (the name of the PS5 APU btw) is rendering at 1080p. Sometimes even lower.
This is a handful of issues. Part of it is that graphics are so advanced that trying to advance them further in a perceivable way takes a ton of power. Another part is that the research and development needed to improve hardware significantly is getting more costly, time consuming and difficult. Early PS4 games still look fantastic. We're well past the point of "these realistic graphics will age badly". Even plenty of PS3 games still look good. We can keep pushing graphical fidelity further and further over the years, but we can leave most of that to tech companies as manufacturing technology improves. I think a lot more effort should be placed on optimization instead: making newer tech like ray tracing easier to run so it's more viable, reducing development costs, development times, etc. Doing all that is good for everyone. "Buying more cars is awesome, but we should really fix up the ones we have," so to speak. Beyond that, I'd love to see advancements in tech outside of graphics. Physics engines, NPC counts, world scale, more options for affecting the world, bigger stories, etc. I'm looking towards Hello Games, and... frankly, the Beyond Skyrim mods for expanding the scope of games in a polished way. Remember Red Faction Guerrilla? Did everyone just stop caring about pushing interactable destruction? The only other games I've seen do that are Teardown and technically BeamNG. Shoot, even Flight Sim 2024 is massively expanding scope. Assetto Corsa Evo is expanding on the sim racing genre. This is what I want more of. Don't take graphics off the stove, but put them on the back burner. They can still improve, but it's not that important anymore.
I think DF said it best yesterday when I listened to them: "the teraflop war is over". The games I feel have been transformed most by ray tracing are old ones like Quake 2, Portal and Minecraft, and I'm really looking forward to Half-Life 2 RTX. Personally, I'm happy with less fidelity and games that take 3 years or less to make, as opposed to 4-7 years in the triple-A world of gaming. I enjoyed Cyberpunk 2077 and The Witcher 3, but have enjoyed the FromSoft Souls games more, which are technically inferior to CDPR's output.
That made me think. I was buying a high-end GPU roughly every 5 years to play the newest games. But at some point games can't just keep getting "better", so I guess there should also be a point where my GPU is still high-end years later.
Kinda happened with the 3090 & 6900. The new cards are more powerful, but we've entered an era where cost is so high that they will stay extremely relevant for a good few years yet; they're still 4K cards for the most part in a world where upscaling and frame gen are the new normal.
That would sure be nice, but if UE5 is any indication, we'll still be getting games more demanding on hardware at a similar rate without necessarily improving the visuals (as is the case with modern games starting to require TAA/DLSS as a crutch for optimizing performance).
It sounds great to me. I upgraded and am pleased with this machine. It pains me to see cards like the 4090 starting to be called 1440p cards (I'd like one).
@@pedropierre9594 My 4090 is a literal space heater. Great for the winter but during the summer I'll use framerate caps and power limiters to make it more efficient.
@@minnidot Hahaha who would call it a 1440p card? It has 24GB of VRAM and DLSS is best experienced on 4K displays. The reality is when you push graphics to the maximum (i.e. path traced Cyberpunk) you are dealing with 1080p upscaled to 4K and you need frame generation. The 4090 will be playing games perfectly fine at 4K for another 5+ years. It won't be keeping up with the latest and greatest ultra PC settings but it will still be light years ahead of what the consoles will be able to achieve for the foreseeable future.
Slightly off topic: I had gotten so used to my Windforce graphics card running Overwatch 1 at under 30 FPS. I was CONSISTENTLY landing headshots as Hanzo. My friend watched me play a bunch and said, "Dude, you could go pro!" and gave me his 1080. I was ecstatic and set everything up. I literally couldn't land a headshot for shit afterwards, so I went back to my Windforce and immediately went back to dunking on people. Then OW2 came out and now I have the 1080 installed. What does any of this have to do with anything? No clue, but lower frames worked better with my brain; dunno if that's an actual thing that others experience or not. Great friend, we still hang out and talk.
I haven't finished the video yet, so I don't know if you addressed it, but another classic example of a wonderfully optimized, beautiful title getting absolutely brutalized by modern "features" for a minuscule visual improvement is The Witcher 3. I played through the whole game at 1080p low/medium settings on a GTX 970 when it first came out to get 60fps, and then when I got my RTX 3080 in 2020, I downloaded all the texture packs, skyboxes and lighting mods, booted it up in 4K, and played through the whole game again getting 90 to 120 FPS, absolutely blown away by the visual experience. CD Projekt Red then updated the game to DirectX 12 and incorporated those mods into the base settings, and the same exact system can barely get 40 to 50 FPS with the same settings on the new version. If there's a visual improvement between the classic version that runs smooth as butter and the DirectX 12 version that is stuttery and horrible to play, it is splitting hairs, and not worth losing a single frame of performance. It's also shitty that all the new players who go to try out the game in its current version for the first time will get the shittiest experience possible. To answer your question mid-video: if you gave me games that looked like The Witcher 3 (with better NPC character models, though) or RDR2, and could run comfortably at 90-120 FPS at 4K on an upcoming 5070 or 8800 XT, then I would be perfectly happy with that if it meant the games got more FUN and polished from a gameplay standpoint.
The Witcher 3 current-gen update is so goddamn CPU intensive, I just cannot understand why. I'm getting the same CPU performance as I did at the original launch, which I played on an i5-4690K, and my current CPU is the i5-13400F. It eats up my CPU more than Cyberpunk running path tracing.
Witcher 3 is also not artistically designed for ray tracing, which hinders the benefits a lot. Hardware Unboxed did a video talking about the visual comparisons and improvements possible with ray tracing, and they rated Witcher 3 as "Different, but not necessarily better".
I don't really care about graphics, I care about gameplay. That being said, I still hope the ceiling gets higher and higher. We are in a dip right now where small improvements take a lot of performance, but hopefully as time goes on these processes get more efficient and new games come in that raise the ceiling even more.
I remember wanting games to be more immersive, but not necessarily graphically. When I played ME2 I was imagining branching narratives, dialogues, etc. That was the future for me, what "realistic" meant: not visuals but content and interactivity. Old CRPGs are better in story and dialogue than most games; this aspect has not evolved but devolved. Only games like BG3 have shown a glimpse of hope for more realistic and open interaction.
It's amazing how an RPG from 30 years ago (Ultima 7) manages to have a world that feels more alive than pretty much anything made since (partly I blame the original Diablo, which was _marketed_ as an RPG despite really being a dungeon crawler / hack'n'slash game - that really lowered the standards for what could be called an RPG). Ultima 7 NPCs have actual lives, sleep, eat, go to work, react to the environment around them (ex., opening windows during the day, lighting candles at night, etc.), have huge and complex dialogue trees, etc.. Nowadays people just accept "RPGs" where characters stand in the same place and repeat the same 3 lines over and over.
@@RFC3514 There is a big opportunity for this kind of game, but they need decent graphics and voice acting to go mainstream. I think Larian is the best hope, but I'm sure that if they get another big hit, some other studio will come along.
@@digitalsublime - Voice acting was another thing that hurt RPGs around the Diablo era. They couldn't fit audio for the huge dialogue trees of games like Ultima 7 into a CD, so they made most dialogues very linear. In fact, the dialogue in some older games was generated dynamically, so it couldn't really be pre-recorded (unless it was assembled from individual words, and that usually sounded too robotic). Maybe now with decent voice synth and generative text AI we'll see games with more complex dialogue systems. Just don't ask the NPCs to count the number of Rs in "strawberry". 😜
Great take on the current dissatisfaction generally felt among gamers, Daniel :) To me, another "pressure point" being felt with many of the latest games is that going from High to Medium makes them look like they belong in the early PS4 era rather than the early PS5 era. A few years ago we had the whole "you don't need Ultra settings for gaming" movement in the community, which was right at that time. But that period came right after a period where a very popular Nvidia 70-series card (GTX 970, then 1070) or the AMD equivalent could run games from the same release year at mostly Ultra settings. These were $400 cards (or thereabouts) and were massively popular. Then the RTX series began and we needed that "Ultra settings intervention", because new GPUs became prohibitively expensive for many people. Just look at how much higher the peak adoption rate was for the 970 and 1070 compared to the 2070 or 2070S. And developers seem to be between a rock and a hard place in regards to how they should prioritize their development time: on ray tracing, which runs too slowly on most people's hardware, or on building a graphically impressive lighting setup with pre-ray-tracing techniques like they used to before RTX. The games no longer look impressively current-gen on Medium settings without ray tracing, and even a 4070 has a hard time running High settings in some of these games with some ray tracing enabled. And AMD has been slower to develop the ray tracing capabilities of their GPUs, so their cards generally still rely on good rasterization and baked lighting implementations. So it's kind of a transition period between two lighting implementations, one that is stretched out or dragging on because the newer process nodes (newer than 7nm) aren't bringing the massive efficiency and area gains we used to see. ...And Nvidia is busy building chips for the AI market, because companies can afford to pay much more for an NPU than gamers can justify spending on a GPU...
Overall good points; here are a couple of presentation improvement points. Again though, thanks for the discussion. 1. I wish you went fullscreen with the source. It makes it easier to see and improves quality. Edit: 2. Thanks for linking the original sources. YouTube compression will degrade the footage every time you re-upload it.
The funny thing is that when you displayed the example around the 4:16 mark, I thought the right side looked better. This reminds me of how in Monster Hunter World you can disable Volumetric Rendering to stop the backgrounds from being blurred and washed out. There are some newer features that don't look good, even if they are trying to be realistic in some cases.
Flatscreen gaming graphics may have plateaued to an extent, but as someone who plays a fair bit of VR there is plenty of room for improvement on that front. There are very few AAA quality titles and mods which transform flatscreen games into VR (like Cyberpunk) look fantastic and have amazing potential but even a 4090 can struggle to run them to an acceptable degree. I feel like that might be the area to focus on more in future.
With VR you're only ever going to get what you get as a result of improvements made for normal monitor use. VR is extremely niche, hardly anyone is interested in it, and that isn't likely to change. Sure, its market share is increasing a lot as it advances and it's a lot more prevalent now, but that's an increase from a small number to a less small number. VR is cool and I hope it gets more attention for those into it, but Ready Player One isn't going to become reality. The Apple Vision, for example, is probably their biggest failure ever, and the metaverse failed spectacularly.
@@paulc5389 The AVP can't even play games and wasn't designed to, so that's kind of irrelevant; it's hardly a shock that what is basically a VR iPhone at $3500 didn't go mainstream. The Quest 2 sold 20m units, which is not that far below Xbox Series sales (28m); sure, it's still niche, but there's definitely a market there. I think VR conversions of flatscreen games are a potential goldmine, as they don't require extra investment or specific dev work and can work surprisingly well; it's just that the hardware needs to catch up. Once we get affordable GPUs that can run games like Cyberpunk in VR with the same fidelity as flatscreen (probably 2 gens away), I think it will explode, as it's a truly incredible and transformative way to experience games.
@@paulc5389 Nvidia doesn't really have to do anything in particular; better GPUs for flatscreen gaming = better GPUs for VR. It's just extremely hard to run VR games at high fidelity on current hardware, as it's basically like trying to play at 8K on a flatscreen. But there are little tricks they can do for optimisation, like foveated rendering, which will only get better. VR is still in its infancy, really; probably the equivalent of a PS2 right now compared to what will eventually be possible.
Great video as always! I think part of the issue is resolution and the other part of it is VRAM. This is the first time we have ever gone 4x in resolution (see the sketch below). PS1 and PS2 had the benefit of being able to change the console's output resolution to suit your CRT screen and know it would still look good, PS3/360 jumped from basically 480p to 720p, then we moved to 1080p, and now it's 4K; that has eaten away at half the GPU uplift we got this gen. Then you add in that RAM/VRAM barely moved! In Mass Effect Legendary Edition, the main benefit both ME2 and ME3 get is the HUGE improvement to texture quality. On PS3, devs had 480MB of usable RAM (32MB went to the OS), PS4 gave them 5.5GB, this gen they have 13GB, and when you factor in the resolution increase etc., how much has that been eaten into? And with Nvidia keeping 8GB cards relevant 4 years longer than they should have, with the likes of the otherwise excellent 3070, you end up in a situation like Black Myth: Wukong, a game that uses the latest GPU technologies and can look incredible, yet in places looks like a PS3 game with terrible textures. I would argue the single biggest upgrade in the Horizon Zero Dawn remaster is the increase in texture quality, alongside the improved animation in cutscenes and better framing of secondary/side-quest conversations (based on the video DF did), and in all honesty, other than more VRAM, none of those improvements actually needed more GPU power; the PS4 version of Horizon Forbidden West still has the same incredible motion capture and well-framed conversations as the PS5/PC version. While I'm a sucker for better lighting and incredible micro detail like the peach fuzz on Aloy's face (it does look so impressive), does it actually make the game better? No, not really. The gameplay and story (my god, what a story!!!!) are why I fell in love with HZD.
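As a rough illustration of that "first 4x jump" point, here are the pixel-count multipliers between the common output targets mentioned above (assuming widescreen 480p; internal render resolutions obviously vary per game):

```python
# Pixel-count multipliers between common output resolution targets.
# Assumes widescreen 480p (854x480); actual internal render resolutions vary.
resolutions = {
    "480p (854x480)":    854 * 480,
    "720p (1280x720)":   1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
    "4K (3840x2160)":    3840 * 2160,
}
names = list(resolutions)
for prev, cur in zip(names, names[1:]):
    print(f"{prev} -> {cur}: {resolutions[cur] / resolutions[prev]:.2f}x the pixels")
# 480p -> 720p:  ~2.25x
# 720p -> 1080p: ~2.25x
# 1080p -> 4K:    4.00x
```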
But... modern consoles are NOT, NOT, 4K though. The vast majority of the time, to maintain a stable 30-40fps, they'll run at 1080p. Even if you set it to quality mode, the highest resolution point you'll hit isn't 4K at all, but 1660p; it's only capable of upscaled 1440p, and it hardly ever runs even that. A friend and I hooked up the PS5 to his computer and ran some programs to read its performance. We ran several games at multiple settings and it averaged 1080p nearly 70% of the time, even with the two "4K"-capable games at the time.
From the 360/Ps3 era onward the console companies have been lying about the resolution of the consoles. The consoles almost never render anywhere near the resolution listed on the box. The console simply upscales the image. Your "1080p" console was in fact doing 540p-800p upscaled. Your "4k" console was in fact doing 900p-1600p upscaled.
@@iprfenix Consoles like the SNES and PS1 could all output at 480i and yet they chose to do 240p. Plus even then resolutions weren't consistent. There were a whole host of pretty strange resolutions during the analog days of CRTs. Resolutions were never static. Even the PS2, which could go all the way up to 1080i and did support 480p out of the box, often rendered at 480i. It often took two 240p images and interlaced them to create a 480i image unlike the other consoles which often used a 480p frame buffer and then output a 480i or 480p image. Plus even those that ran at 480p didn't always run at 480p and could have an internal resolution of 448p and just use the fact that there's overscan to hide the missing pixels. There isn't a single generation where your console ever always output at native resolution
PS2 era still ran at mostly 480i (which for rendering costs is largely equivalent to 240p) and earlier consoles mostly ran at 240p. Also modern consoles very rarely run any game at native 4k (stuff like the Quake 2 rerelease does run 4k120 on everything but the Series S), it's mostly stuff like 1200p upscaled to 4k with either checkerboard rendering or FSR and 30 fps.
Looking at how some recent games were made, I was shocked at how some games lack basic optimizations like LODs or even better file compression (Call of Duty *cough cough*), abusing real-time features or building the shader cache while the game is running just to avoid a longer loading screen (at the cost of stuttering during gameplay), and all this in a lot of AAA games that had a long or very long development time.
If developers create a game that's too demanding, the PS5 won't be able to run it, which would lead to low sales and reduced income. As a result, they design games that the PS5 can handle, even if that means fewer improvements compared to previous titles. Similarly, when viewing photos, you won't notice much difference between a 10MP image and a 100MP image unless you zoom in. While it's a significant improvement on paper, can you actually see the difference?
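The megapixel comparison has a simple numeric intuition behind it: linear detail only grows with the square root of the pixel count. A tiny sketch, with illustrative numbers only:

```python
# Why 10MP vs. 100MP looks similar until you zoom: linear detail grows with
# the square root of the pixel count, not with the pixel count itself.
import math

mp_small, mp_large = 10, 100
pixel_gain = mp_large / mp_small
linear_gain = math.sqrt(pixel_gain)
print(f"{pixel_gain:.0f}x the pixels, but only ~{linear_gain:.1f}x the linear detail")
# 10x the pixels, but only ~3.2x the linear detail
```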
19:19 Not only are we not there yet, but designers are having to work with two lighting systems, which makes life a lot harder. See that HU video for heaps of examples where prioritising one means the other suffers (i.e. the game is mainly designed with non-ray-traced lighting in mind, so when you turn ray tracing on you get weirdly lit spaces). To nail both you have to carefully design the lighting of each environment using both approaches and use tricks to keep either one from going wrong.
Yeah this transition period is pretty brutal. And sadly, because RTX is still so expensive and underpowered for normal customers, we'll be here for another 10 years until RT gets commonplace and at a decent power, at which point the developers could finally switch and only develop with RT in mind and make it a hard requirement.
Actually, having full RT in every game without performance problems would speed up making games, as you would no longer have to bake lighting, shadows, etc.
It's bigger than people think, because there is a lot of time spent authoring and lighting a game using rasterization, as well as all the time spent making new rasterized effects to imitate what ray tracing does.
I wouldn't mind if performance didn't improve beyond this point (at least for a very long time). It would allow developers the time to actually improve and master the rendering features we currently have. The reason most people want more performance today is that games perform poorly. But at the same time, games perform poorly because developers haven't had the time to master what we currently have. It's a chicken-and-egg kind of situation, where our economic system (feeding on non-stop growth) requires consumers to WANT the new products. But in reality, just like game development takes longer these days, so do optimizations to engines, to games, etc. It feels like the kind of rendering optimization improvement developers made over a 5-year period on the PS2 would take 20 years today due to the added complexity of modern rendering. I would personally like to spend less money on hardware, and allow developers to become more focused on creative graphics engineering.
It's wild to think that a majority of gamers would put story so low. It explains why most people stick to AAA I guess? For me, gameplay>story>audio>art direction>level design>graphics. And I think a majority of indie gamers would be similar.
For me, graphics help with immersion. I'm playing Metaphor right now, and I love it, but the graphics take me out of it a lot, especially exploring dungeons where it's just a room and a hallway with poor textures.
4090 pricing may have been exorbitant, but in terms of performance? It's fantastic. So is the 4080. When the 5090 releases, I'll own that as well. The GPUs aren't the problem. Either your ego is, or your wallet.
@@huskers1278 It's a scam just based on the prices. They really should regulate the market when there's no competition and slap a 50% tax on all Nvidia stuff, they are printing money with their monopoly.
we are at a point where something truly revolutionary (hardware wise) would need to happen. Traditional polygonal rasterization is essentially completely plateaued. I'd personally argue that this has been the case for almost a decade, with the GTX 1080ti being the "beginning of the end". If you play at 1080p, that card is STILL all you need. It's insane.
This is also a problem for improving graphics with PC games. If you make a game that requires a 4090 to run at 1080 60 then your market is 4090 owners only.. i.e. hardly anyone. And that lack of willingness to upgrade is only made worse by Nvidia's greed and poor specs on cards that are actually affordable. If I was going to go Nvidia I wouldn't touch anything below a 4070 ti super or any 50 series card that has less than 16GB vram. And for a lot of people that's simply not affordable.
I like your style! Also, I'm a broke Brazilian gamedev with an 8th-gen Core i5, 32GB RAM, and a GTX 1050, and I'm very happy that my notebook can tank a lot of modern games. Also, I think realistic graphics are overrated, and the "only dogs can hear it" thing is something I've been saying since the PS3, because games have been trading innovation for graphics since then.
In general the focus around games has been too much on graphics. It's the first thing everyone will complain about. I'd much rather see some innovation in game design instead of chasing "photo realism". Or a shift of focus from graphical fidelity to improving stuff like NPC AI or physics which have absolutely stagnated or even regressed. The reality is that games don't need state of the art graphics to be fun and successful, you see it with many indies and especially Nintendo.
It's funny how game AI has barely improved at all (and is still rubbish, frankly) even as we're on a supposed revolution of self-driving cars, talking machines, even generalised AI. Much of that is hype and BS, of course, but you'd think we might already get at least some improvement in game AI before the Robots supposedly make lawyers and artists redundant. But nope. Much easier to do it in games than realworld, too, as you have all the variables and a very limited 'world' in which to operate. Driverless cars (supposedly) and yet driving games have awful AI opponents. Same in shooters, everything. Garbage. If game devs can't do it after forty years, what makes anyone imagine Elon can do it "by next year". lol
@@CurtOntheRadio It's weird isn't it? Seems like a much easier canvas to work with than the real life applications you mentioned. The big difference though is that they have dedicated research where unimaginable sums of money go into. So they are and will be more advanced than what you see in games where it's clearly not been a large topic of focus for a while.
@@vintatsh True, it's not a fair comparison. Buuuut, we might at least expect some improvement in game AI long before we hand over our children to be educated by the Robots, say. Or fire all the lawyers. Even if it needs a cloud subscription. It at least suggests some further scepticism is warranted about the use and integration of AI more generally, imo. I keep trying to think of ways to use AI in games and keep coming up against this issue: how do you swap data between traditional and AI without losing the point of the AI, or the use of the traditional compute? Like, say, you could have AI be a shopkeeper, or blacksmith, so you can better deal with them, be more inventive, maybe - say in Skyrim type. But any result would still have to be passed back to the traditional compute running the game, and it would all need be defined as variables the trad game code 'understands'. Yet if you limit AI to output the trad game understands have you gained anything, really? You can't broaden choices easily, you can't invent new things and all in all I think it's difficult to find applications. Though maybe sports games might be one - driving, tennis, whatever. Blah blah blah But where is it? Where is any game with a novel, 'proper' AI component?
"Several games with AI on the market today implement highly sophisticated forms of AI to elevate the player experience. Games with the best AI often elevate the gaming experience in cool ways. For example, The Last of Us: Part II uses advanced AI to power its enemies, providing them with an ‘awareness state.’ This means that if an enemy sees one of their comrades killed without actually seeing the culprit, that particular enemy will be on alert and more vigilant as they plot to take their revenge. Expect the future of AI gaming will include much smarter NPCs since NPCs have always been one core use case for AI in games." TLOU2 is pretty good re NPCs, I thought. Def a step up from most. Not sure if they just mean 'well programmed' here though. And there is this: ruclips.net/video/PYtmFF02OH4/видео.html
@@CurtOntheRadio I feel like that will be a „next-gen“ type development. I fully expect the PS6 and whatever the next Xbox will be to have a highly capable neural engine to handle the processing of AI interactions, and that will be their defining feature. The PS5 Pro already has a 300 TOPS machine learning block after all for PSSR. That is going to be the point where even AAA developers will start to embrace AI use-cases in games, once the average player has the hardware. It seems like the next logical evolution. As to how exactly it‘s going to be integrated into game design I‘m also a little bit confused, especially regarding storytelling, but we‘ll see.
@@albert2006xp I just did a new build and my old one had a 7th-gen i3 and a 1050. My buddy still plays on that PC lol, it's fine for medium-low graphics at 1080p on most games, 60+ fps. New games are optimized for people with $8000 PCs. Not very many people can even play the new games they are releasing, not because they don't want to but because they don't have $3500 laying around to play the new game at not even the best quality.
Daniel, this will be your most underrated video (especially from 10:01 time mark and onwards), mark my words. 😉 I think that, yes, game design needs a reset, a redesign of the business model for development of new games. Everybody in the gaming industry should pedal back on the "eye candy" development, and respective higher HW requirements involved. What everyone needs is "better" games, not "prettier" games. And considering where the economy is right now, and where it's going (not for the better), when only a portion of the userbase can afford that elusive top 10% performance incredibly expensive hardware, it stops making sense going in the current direction.
Yes, I would prefer if graphics development stopped for a time, because in these times we're only seeing a slightly nicer picture but at bigger and bigger costs, I'd prefer if developers focused more on physics, gameplay, AI of NPCs (i.e. making NPCs react more naturally), face expressions. But more post-processing effects, adding 5 times the polygons to make a tree more realistic, that isn't important for me. Also about Ray-tracing, if it's too expensive, I think it would be better if developers used a combination of: the tricks that they used before + an optimized Ray-tracing.
Also Daniel, Is this a quiet admission of your own culpability in this graphics treadmill? Your channel is about analyzing graphics in games with a fine toothed comb after all. Without graphics what do you have to talk about? 😅
Modern games are already starting to run like piss on my 3090 despite not looking any better than RDR2 which runs in native 4K with raytracing above 60fps without breaking a sweat.
Small indie game developer here (or at least starting to develop games). I think one of the checkpoints to get to that point is draw distance. The best 2 examples of that I can think of are, for one, the whole Ghillie suit and the grass at long distance thingy in DayZ, and the other is how a whole lot of games are set in these sort of "valleys" or "canyons" with high walls that conveniently stop the player from looking at details at long distances (although nowadays it's more of an artistic tool for representing different ideas of a storyline). I just think that the day I can have the exact same visuals while looking through a x14 scope as when looking at my character's feet, or the day I can have an AC-130 style mission where I can see the whites of a character's eyes while zooming in, is the day that we're there (although I tend to play 2010-ish games, so maybe we're already there and I don't know it yet).
When I played DayZ in 2014-15, me and a friend were talking about the grass issue at distance and he had an interesting idea: at longer distances, just move the ground texture up, so if you are prone you are under the ground texture. This way you won't stick out, and realistically someone that far away shouldn't be able to see you anyway.
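That trick amounts to hiding prone players at the same distance where the grass stops being drawn, which has the same effect as raising the distant ground texture. A rough sketch with made-up names and numbers:

```python
GRASS_DRAW_DISTANCE = 120.0  # metres; beyond this the grass is no longer rendered

def prone_sink_offset(distance_to_viewer: float, is_prone: bool) -> float:
    """How far (metres) to visually sink a prone player for a given viewer,
    so they aren't left lying on a bald ground texture with no grass around
    them. Applied per viewer, only beyond the grass draw distance."""
    if not is_prone or distance_to_viewer < GRASS_DRAW_DISTANCE:
        return 0.0
    return 0.3  # tuned by eye; roughly the height the missing grass would cover
```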
The software side of things is also important. As it currently stands we can't really get a huge leap like we used to, because we are already representing most objects in games as realistically as possible using Physically Based Rendering. Can't really get wow'd anymore when you can't get any more real than real. It's up to techniques like path tracing, as well as realistic character rendering and animation, to give us that wow factor. Which is why I think GTA 6 is going to be crazy 👀
It took you a minute, but you're finally getting the point. People don't see the value in upgrading and spending more money on minor improvements when their current hardware is technically already good enough and should be getting better performance if games were optimized better. Some of us are adults with families and other responsibilities and must make responsible purchases. We watch videos like this to see what's new and to help us make better choices. Honest benchmarking and optimization videos do far more good than just saying we need more powerful hardware.
it’s probably because devs wanted 32GB of VRAM for the PS5/XSX generation, but only got 16GB. The jump from 360 to PS4 in VRAM was 16x, PS4 to PS5 was 2x…
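For reference, the raw generational ratios being referred to (approximate unified memory totals; the 360's eDRAM and OS reservations are ignored):

```python
# Approximate total console memory per generation, in GB, just for the ratios.
memory_gb = {"Xbox 360": 0.5, "PS4": 8, "PS5": 16}

print(f"360 -> PS4: {memory_gb['PS4'] / memory_gb['Xbox 360']:.0f}x")  # 16x
print(f"PS4 -> PS5: {memory_gb['PS5'] / memory_gb['PS4']:.0f}x")       # 2x
```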
High vram alone doesn't help if the gpu is not fast enough to process the saved data and ship it onto the monitor. The clockspeed will always be the first bottleneck and the higher it is, the more energy it drains, the more durable it has to be and the better the cooling has to be, which leads to bigger and bigger parts (cooling solutions).
@@thelazyworkersandwich4169 Yeah ofc they don't complain, because no one tells them to code a game for console that runs at 1440p or 4K at at least a stable 60fps, WITHOUT disabling/reducing a dozen effects, like shadows, draw distance, LOD, etc. until it turns into a mushy mass of indistinguishable hot garbage.
Why are we encouraging stopping? Comparing ridiculous game settings to optimized settings is not fair, some of those settings are indeed extra useless but we have a lot more things that we could be doing. Path Tracing with more rays, more bounces, more polygons, etc. We have so far still to go... Graphics are NOT good enough until the actors on screen look identical to a movie.
@@SauceageTF Zelda looks absolutely terrible, and it not being on a real platform with real graphics is the reason I'll never play that, not even emulated. GOWR looked a little dated and could use an uplift. Very low quality characters for 2024 and you can see polygons all over the place. It felt like a last-gen game for sure. Whereas Forbidden West actually looked closer to current gen. Path Tracing is the most important thing. That's what makes me happy to look at a game nowadays. Alan Wake, Cyberpunk, even solid RT like Silent Hill 2, The Casting of Frank Stone, other UE5 stuff etc. is making scenes tie together so well with lighting, I can't look at that ugly gamey fake lighting anymore. You can see it's 3D models in a fake raster scene, it's so icky.
@@albert2006xp Graphics were never about realism. Pushing for it is a waste of time. The best artists never pushed for realism. They understand realism doesn't make a painting any better. Starry night doesn't look like any night I ever witnessed on Earth.
@@SauceageTF Realism or just sheer detail and believability. You can make something fantastical but it still needs to tie together. You can't play an abstract painting.
"Starting to get to the era of diminishing returns?" I just question the word "starting" In the late 90's through the early 2000's dropping £400 on a new graphics card made a HUGE difference Often games just would not play on 3 year old hardware. they wouldn't install, wouldn't boot, you'd get an error message telling you your GFX and CPU - were not good enough and that was that. A CPU and GFX that was top of the range 3 years earlier, could not play a game at all. That wasn't a rare occurrence. So you'd stump up for new hardware, and immediately you could see just why your old hardware could not perform on this new game Now, as my daughter plays the latest games on my hand-me-down almost 9 year old 1070 - that was MID RANGE in early 2016. I look over her shoulder as she plays and think "I wouldn't care if I had that back, those graphics look just fine to me" Yeah it could look better... but not $800 better. Not even close. Not ever.
So what you're saying is if that old PC and my PC with a 4080 super were running cyberpunk side by side you wouldn't think that my pc running it in ultra wide 1440p at 60fps with path tracing would be worth the upgrade vs what.. minimum settings at 1080p and 60fps?
There were often times when new tech dropped that just wasn't supported on older hardware, like deferred rendering, tessellation, pixel shaders, or many other things. Those massive jumps were just because everyone was developing new tech for GPUs. Today we rarely get new tech for a GPU, the only recent addition being ray tracing. Games that use hardware-based ray tracing will do a similar thing to older GPUs from a few years ago.
@@MainChannel1999 so you'd take the 1070 if you had to play in ultra wide exclusively with a 4080? cmon bro. anyways, that's what the extra 16:9 monitor is for. options 💁♂
@@crestofhonor2349 it's already happening, and it's even happening for current GPUs from other brands.. I ditched a 7900XTX for the 4080, yes I occasionally get a tiny bit less FPS in other games, but anything ray traced and I'm going from sub 60 to 100, except for cyberpunk, where I went from sub 40 to 60.. but that's proper RT (PT)
We have reached a plateau in game graphics. The only benefit in getting newer GPUs is for 3D rendering and video editing software for demanding companies that put out a ton of content frequently. I had to subscribe because you made some great points! 👍🏾
This is all I'll say, my priorities for games.
--- Top ---
Gameplay
Optimization and performance
Game-feel, UX etc.
--- Mid ---
Art direction - Includes:
- Writing
- Visuals
- Music and Sound
- etc
--- Bottom ---
''Realism''
Make games more optimized so that we have more people that can play them at acceptable framerates. Most of these new graphics aren't even that good, a lot of it is sacrificing art direction as well.
One thing to note is that consoles (the PS3) at one time offered a price-to-performance balance, including a Blu-ray player, media center, and gaming device all wrapped together for a decent budget price.
@@jiggerypokery2962 Ps3 was clearly overpriced at launch, but it was the cheapest bluray drive at the same time... Many sales were only driven by the bluray drive itself.
The problem may be looking at consoles that way. Back in the day many people bought a PS2 or Xbox for its home media capabilities, such as DVD and Blu-ray playback. Of course the world has changed now, but then so too should the console's marketing. What about selling consoles like phone plans? That way you could jam more premium components in and have the overall cost be higher but not less affordable. You could also package it with big brands for the modern age. So let's say if you buy an Xbox subscription you get the new Xbox, Game Pass, Netflix, Spotify Premium, YouTube Premium and Amazon Prime. That way your entertainment and online quality-of-life services are all covered under the one payment, much like how your DVD and game playing device was all in one unit in the 90's and early 2000's.
Not true, a lot of games are built first with console in mind and then pushed with some extras on PC. Vegetation is one of those options where the engine is just pushed to do a lot, and that's called ultra settings.
Reaching the hard limit on computing power would be really weird and really cool, because suddenly the entire software world revolves around these limitations and we could theoretically see a greater amount of optimization across the board. A world of frozen computing power, ironically, could see its performance improve over time as more people get more experience squeezing as much power out of the silicon as possible, which in turn could mean that software gets faster to make and therefore cheaper.
I remember spending $200 or less to go from 8bit to 16bit to 32bit, and it was so amazing how much better everything was. Recently I spent $600 for a 4070 Super to upgrade my old $500 2070 Super and there's really no difference. So not only are we not seeing massive change, we're spending massive change! Instead of spending $600 to upgrade your gpu, spend it on a great ultrawide HDR QOLED monitor, because *that* is a noticeable upgrade
This topic revolves around a few things as context. Resolution is hitting the point of optimal clarity. 4K, depending on your screen size and viewing distance, is already below the limit of human vision. 1080p is the same, especially for TVs at a typical distance. You can see a bit of a jump moving to 4K, but not that drastic. There is a difference between video and rendered graphics, and you will see a bigger jump for rendered graphics moving between 1080p and 4K. So really, we theoretically only need the power to render photorealistic graphics at 4K to be at optimal clarity.

The next improvement will be simulation fidelity. Having the power to not only render photorealism, but to accurately simulate the physical world. Water that flows and behaves like real water. Foliage that moves and reacts realistically to the objects that pass through it. Things like that... We will eventually hit the point of optimal clarity and simulation, and at that point, increased performance from a gaming perspective will basically plateau. Only a change in how we game would shake things up.

At the current level of game engine design and capability, I believe that hitting the point of performance where a xx70 series card has the power of the suspected 5090 specs would basically be the next leveling-off point. You would still have the xx80/90 series to handle any additional features that may come around, but the mid-level cards would have some serious grunt. Current CPUs are not too bad, especially at the higher end. We would need better parallelization of the game engines to push those CPUs to their real limit.

So in a few generations, with better parallel processing built into the game engines, we will be doing good. Simulation will be primarily CPU based with our current architecture, so better parallelization will be key to better performance. All that said, I think we will start to see gaming performance level off for a few years, until we get better game engines that push simulation higher and better take advantage of multicore processing, possibly dividing out the main game thread tasks to more cores, as that is the current bottleneck for several games.
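On the 'dividing out the main game thread tasks' point, the usual shape is to fan independent per-entity work out to workers each frame and join before the stages that need all the results. A structural sketch only; real engines use native job systems, and Python's GIL means this particular snippet won't genuinely scale across cores:

```python
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity, dt):
    # Placeholder for per-entity simulation work (animation, AI, particles...).
    entity["t"] = entity.get("t", 0.0) + dt
    return entity

def simulate_frame(entities, dt, pool):
    # Fan out independent updates, then join before dependent stages
    # (collision resolution, render submission) that need every result.
    return list(pool.map(lambda e: update_entity(e, dt), entities))

if __name__ == "__main__":
    entities = [{"id": i} for i in range(10_000)]
    with ThreadPoolExecutor(max_workers=8) as pool:
        entities = simulate_frame(entities, 1 / 60, pool)
```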
I think the developers are to "blame". You can have amazing visuals without a 4090; development gets more and more expensive, so developers should focus on smaller but better optimized games.
Thank You, and even with a BEAST like the 4090 some of these games DO NOT look that good in person. That is why in most cases I turn RTX off, because it is a SMEARY BLURRY MESS and makes the game run like crap, but it looks good in stills and YouTube videos where the compression algorithm hides the flaws. As soon as you pan sideways in Cyberpunk or Dying Light 2 you can see the Reflections SLOWLY CRAWLING and SMEARING as they try to catch up, because it takes MULTIPLE frames of data to make them NOT look like a bunch of PIN HOLES.
Silent Hill 2 doesn't look good but Demon's Souls does. HOWEVER, Demon's Souls is a PS3 game in design. Make Elden Ring look like that and how would that run? It's a great looking game, but since its world is 2 generations old, they could ramp up the graphics. Is it a bad thing? No, but context is key.
Someone analyzed SH2 at a deep engine technical level and found they waste a huge amount of resources for almost no impact on image quality. Unlike the previous games, they don't even use the fog as a curtain to hide more aggressive LOD (level of detail) and gain performance; hell, they don't use custom LODs at all, they use Nanite alone, which is way slower than traditional hand-made LODs and especially slow for vegetation. But hand-made LODs take time to make, and "time is money", so they just decided not to do it. This guy also made a few tweaks to the game graphics using UE5 console commands and was able to bring performance way up without changing graphics that much, more or less similar to what Daniel did. Why didn't the original developers do those optimizations? The answer could be lack of time or, even worse, they didn't care or they didn't know better.
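For context, tweaks like that are usually just scalability console variables. A hypothetical Engine.ini snippet with a few commonly used UE cvars; the exact names and sensible values vary by engine version and title, so treat this as an illustration rather than a recipe for SH2:

```
[SystemSettings]
r.ScreenPercentage=75
r.ViewDistanceScale=0.8
foliage.LODDistanceScale=0.75
sg.ShadowQuality=2
```

Roughly: r.ScreenPercentage lowers the internal render resolution, r.ViewDistanceScale and foliage.LODDistanceScale pull LOD transitions closer, and sg.ShadowQuality drops the shadow scalability group a notch.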
I have a 7950X3D + XTX combo at 1440p and.... I use a framecap in most games. I had an RX 580 at 1080p, and I was perfectly happy with it, but during the pandemic work from home made me not satisfied with reading text (I'm a programmer) on 1080p, so I bought three 1440p165Hz monitors and a 6700XT. The only reason I upgraded from that 5950X + 6700XT combo is because I could. I'm perfectly happy with the level of graphics I'm seeing now. I generally target 90FPS in single player titles. I have a lot of games, that aren't even that old, maxed out, and many with community made HD texture packs because that's what hold a lot of games back. So yes, you're perfectly right. Until I can get full path tracing without reflections being rendered at a noticeably lower resolution, or the denoising is so abysmal that all reflections shimmer, then I'm not going to bother upgrading. 1440p full path tracing, with sufficient rays and bounces, with no upscaling at 90FPS or bust.
I have pretty much the same gear, play a lot of older games at 4K, newer ones like FF16 at 1080p. No need to upgrade if you get 1080p 60. I got the 4K screen for work. It's just sad when developers ignore the massive number of gamers still rocking a 1080 or a 2070.
@@JGComments They aren't ignoring them. Some people just take it as a personal attack if they can't run everything at the highest settings at the highest resolution. 1080, 2060, RX 5700, RX 6600 or 2070 class hardware will still run mostly everything at 1080p High and in the very latest titles 1080p Medium. And it still looks fantastic.
@@andersjjensen that's pretty much true 95% of the time. If you can run a game 1080 60 fps on Medium you should be happy. It's insanely expensive to try to keep up if you expect 4k 120 with all the bells and whistles.
Well, I'm currently on an i7-920 w/1050 ti so... I plan to get a modern gaming rig before Win 10 support stops next October. After I do that, I'll be happy if I don't need to upgrade for another dozen years.
I feel like the same thing is kind of happening with display technology. The difference between Standard definition, and 1080p was huge, but from 1080p to 4K was less dramatic. Sure, you can drop a huge chunk of change on 8K, but is it really worth it?
If developers stop the practice of developing games that perform best based on hardware that is not even in the market yet that would be a good thing. Personally, I lost my enthusiasm for gaming a while back when it felt like developers (especially AAA studios) decided that good looking games mattered more than good or innovative gameplay. Probably only those who grew up in the era when gaming started going mainstream in the 70's and 80's would get where I am coming from.
Developers don't make any of the decisions you attribute to them in your comment. The R&D or production teams: 1 artist, 3 devs, 1 analyst, 1 tester, 1 dev-leader. Those teams make up less than 60% of the company staff. Whatever you think devs do, they don't make decisions. Especially the strategic ones.
When was this "a while back" ? 30 years ago ? Because that's what AAA studios/developers did since forever. Literally. And the current situation is much better than it was in the '90s and early 2000s, where games would literally not run on 3 year old hardware. Nowadays almost all games can still run on the midrange GTX 1060 from 8 years ago. This will be even more true when RTX 2060 reaches 8 years old.
@@p4r4g0n I'm almost with you on that one, but I did have some exceptions (DOOM 2016 and Kingdom Come: Deliverance, from what I remember). And in the Warcraft 3 days, we were pirating it (it was common in my area, pretty poor country). It's funny that for Warcraft 3, I had it installed for several months before I could play it, because I didn't have a GPU initially.
We were headed that way until all this upscaling garbage got pushed onto us. 1080p 60fps was standard a decade ago, yet now new games require mid tier cards to even hit 1080p (native) 30fps. Sad times.
As someone who has been Playing at 4K Native since 2014, it is AMAZING and I can NEVER GO BACK. The sad part is that with the poor performance in most modern Titles even My RTX4090 can struggle to get 4K 120 unless I turn down some settings or use DLSS, and I HATE SCALERS, many of them smear and blur, ruining the Crispness of 4K. The saddest part is that with some Older but still REALLY GOOD looking titles I can run TRIPLE 4K Surround at max settings and get 80-90fps, and yet on some modern games you can struggle to get 60 FPS on ONE SCREEN with graphics and Gameplay that are no better than what came out 6-8 years ago.
I've been saying this for a while now, I think this is good. Chipmakers can keep developing insane hardware at exorbitant prices for server farms and we can just be happy with what we have. It's a win-win.
Disgusting attitude. We should not be happy with anything until graphics look much better than real life. This comment section and this ragebait video is gross and anti-gamer. Bunch of casual gamers who just play in the evening with a controller in their hand trying to ruin it for real gamers.
@@albert2006xp it's so strange to see this anti-progression attitude among PC gamers in the last decade or so. I can't imagine anyone saying 20 years ago that they'd rather ATI didn't release the 9700 Pro because their 7500 was going to be obsoleted, or 15 years ago people waxing poetic about how 1 CPU core is way more than enough and why would you ever want a Core 2 Duo? Hardware was obsoleted so much faster back then, funnily enough.
@@darudesandstorm7002 20 years ago, there was real and very tangible graphics improvement to be had from new hardware. Now, even the most demanding games still typically run fine on a mid range card from 2 generations ago, and there's often barely any noticeable visual difference between medium and ultra settings. My personal opinion is that I will only buy a product if there's actual improvement to be had by owning it. If the only improvement for owning a card that's five times more expensive than my current one is so minute that I couldn't see it without stopping the game to pixel peep, I don't think it's worth it.
@@darudesandstorm7002 Buying new PCs every new release would make a lot of PC gamers abandon the hobby because of unaffordability. There is hardly any longevity in GPUs now on the lower end; I cannot imagine back then.
The PC market has a much larger inventory to choose from for upgrades, which is why incremental changes aren't met with as much scrutiny. They can pick and choose which part needs an upgrade. Consoles' only path to upgrade is a pro model 4 years later or the next generation 6-8 years later, which is why they garner a louder outcry.
I have dumped thoughts across multiple comments into this one comment in case it gets highlighted: Its not broken. Its maturing. PC hardware, mobile hardware, its nearing feature completion for the customer base, and the vendors are struggling to find meaningful innovation out of these parts. Unfortunately what we're seeing is that rather than commoditization like the tech is supposed to do, the vendors are trying the opposite tactic and trying to push the updates to the high end. Same thing happening with GPUs and phones. As a customer base we have to push back on the planned obsolescence and hold onto our hardware and the devs need to defocus pushing the hardware. To answer your question, yes and no, we are generally happy with the level of graphics in GPUs now. But we would get bored after a while. I personally think we can continue with incremental graphical improvement but slow it down a lot. We dont need the change to be this fast anymore and we can't afford the GPUs anyways. If the cost of graphics is killing games you've gone too far as a developer. There are other types of innovation besides performance. By reducing power consumption, you can get innovations in size and form factor. That's how we're getting ITX PCs and PC handhelds like the Steam Deck. Thats how we're getting integrated graphics being able to run games. Imagine every single PC made can run PS3 quality games now instead of needing a discreet GPU. Considering usability concerns for the customer like that is how we get the 'reset' in the gaming hardware business model. Also Daniel, Is this a quiet admission of your own culpability in this graphics treadmill? Your channel is about analyzing graphics in games with a fine toothed comb after all. Without graphics what do you have to talk about? 😅
The next jump is gonna be AI NPCs, a truly living world unique to each person. I don't know what kind of computing power that would require, but I don't see graphical improvements as a thing right now until you get into fully realized actual VR gaming.
You don't need much power. What you need is tons of VRAM. 16GB minimum to have the textures and the AI stuff running. 20GB would be the sweet spot for hi-res textures and amazing AI perf for fast responses.
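Back-of-the-envelope for why VRAM would be the constraint, assuming a locally hosted language model and counting weights only; KV cache, runtime overhead and the game's own textures come on top:

```python
def model_weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate VRAM for the model weights alone."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1024**3

for params in (3, 7, 13):
    print(f"{params}B params @ 4-bit: ~{model_weights_gb(params, 4):.1f} GB")
# ~1.4 GB, ~3.3 GB, ~6.1 GB, all of which has to share the card with textures.
```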
They will ruin that as well with woke AI training. They should just make more interesting sidequests and hire better writers and allow them to dare new things that are normal in movies and literature.
They're literally the same... I have a PS4 Pro and a friend has the PS5. Putting them hooked up to a computer and running some programs...the PS4 Pro runs about 95% of what the PS5 does. PS5 basically just has rt and can run 30-45fps rather than locked at 30. They're the same
PS4 Pro crawls running Cyberpunk not even holding 30 fps at 1080p with some dynamic resolution and lowest settings possible. Meanwhile PS5 runs the game at native 1440p with high settings and ray traced local shadows (which are cheap) then upscaled to 4K at locked 30fps. But sure, PS5 is barely more powerful.
@@phattjohnson Bruh, they both run in predominantly 1080p-ish at 30-40-ish fps, and some titles can be 1440p-ish upscaled to around 1660p-ish...and run 30-45fps... The only difference is PS5 has rt in some games, and has a few new titles. That's it! Lol
As a PCVR gamer, the fact that we literally need to render the scene twice (once for each eye), with each eye pushing upwards of 3000p (when using high resolution headsets), is the ONLY reason I have shelled out each cycle for a 90 series Nvidia card. If I wasn't into VR I'd go back to my old ways of skipping a generation of card and only going as high as a 60 or 70 series. I did breathe a sigh of relief this year though, because the performance I'm getting out of my 4090 is so smooth in VR that I might actually be able to skip the 5090. I agree with your assessment that we are at the end of Moore's Law, and the fact that video cards can cost upwards of 2K is insane...
I'd be happy if it stagnated. It also might give a chance to optimize the software side. Not only has there been less and less noticeable difference with each generation of hardware, but most of the hardware isn't being fully utilized and/or is badly optimized on the software side. A lot of the hardware that's still being used also doesn't get any firmware upgrades anymore cause the dev team on that gets moved to developing for the newer generation. When that happens the companies also don't open source their stuff so other people can work on it. I don't know where I've read this as I read a ton, but I remember there being news of a 'bug' or an 'optimization' that was discovered or done on what we would consider ancient hardware (I think it was for hardware that comes from the 90's). This happens more than we can imagine, especially since the hardware changes so rapidly.
I for one think that hitting a plateau in performance gains for a while would be a boon to gaming, in a way. With developers having more limits again, they’ll hopefully be pushed into becoming more creative with what they have on hand. They’re hitting the soft cap on computing power, now they can refocus back onto the gameplay itself, the art and style of making a game, and less on graphical fidelity…. Hopefully 🤞
We need more gameplay and more player modes: split-screen couch co-op, save states, customized story alt-routes, various graphics settings, easier customizable and authorized mod support, fast SSDs, longevity of hardware, ease of maintenance and upkeep, and electrical efficiency. I think the next step really is games having a 2D/3D TV mode, a portable mode for portable gaming devices, and a VR mode for those who own Meta Quest, PSVR2, or Apple VR type devices. Meaning games designed with that first-person immersion safely in mind.
I 100% believe you're right, the path we need to take is better gameplay etc. Graphics has been the focus for 10-ish years and actual gameplay has overall stagnated; that is something that can be improved with current hardware.
performance *MUST* be the priority going forward;
games shouldn't cost millions of dollars and run sub 30fps.
It's in the hardware companies best interest to push graphics.
Bro you know that a freaking developer can cost a 100k per year and even more? You need millions of budget to create a game.
Blockbuster films cost millions and run sub 30 fps. What's the difference? Both most of today's film hits and most popular games are crap.
@@classicallpvault8251 🤣
laughable comparison, you don't _play_ movies.
@@maervo4179 and somewhere in that budget, they can make room for optimization
🙂
"Only dogs can hear the difference" is a GREAT way to describe this predicament
Also, even a dog can tell the difference between older well-made games vs some of the *expletive* being released (and shuttered in failure).
It's interesting when you look at it from the perspective of Memory improvements. The PS3 had 256MB of RAM, plus 256MB set aside just for graphics. The PS4 has 8GB plus 2GB. That's a 30x improvement. THIRTY EX! O_O
Now, the leap from PS4 to PS5 was....2X....it went from 8GB to 16GB. A bit underwhelming from a memory standpoint. So, while the PS4 could store 30x the textures that the PS3 could, In contrast, the PS5 only doubled texture capacity from the PS4. Crazy.
256gb thats a lot 😀
@@ZdzichuRaczkaEgzorcysta Augh! MEGAbyte, MEGAbyte. XD
Sorry, typo. I fixed it. =P
And this is why it makes so much sense to play retro games these days.
If you are a boomer
@@marsdenit2845 Just a reminder, that minecraft is still the most popular game in the world. Gameplay is the most important part in games, not graphics.
@@Boris-Vasiliev
Which has received a ridiculous amount of updates and is not really a "retro" game.
Retro gaming is better @@marsdenit2845
@@marsdenit2845 Oh dear, another ignorant deluded child.
Very simple answer: I do want better hardware. But not for better graphics, but for bigger scale. Keep current resolution, models, lighting, and so on, but add more detail to the maps(more props, furniture, interiors), more actors that are more interactive, farther view ranges etc.
There's still a lot of growth that can happen to games as far as their looks go, it's not all about resolution and raw texture quality or number of triangles.
i cant wait for AI speech algorithms to be implemented into gaming. just imagine a whole story based on AI given a small script and scaling the story based on how you react
more clustered scene = higher cpu & vram requirements
Best part about that type of improvement is it doesn't have to lock anyone out from a hardware perspective. Like render distance in minecraft
@@adeptalakay I want to talk to an npc not chatgpt. Chat bots are not at the level where they can fit seamlessly into a game world while making sense and not hallucinating
I think physics simulation could use a major overhaul. It's been an afterthought for quite a while now. Even the games we recognize as having really good physics exhibit floaty objects that rarely express their true weight. I'd like to see the industry turn their sights in that direction. Audio has also fallen behind. Some games sound great, but only in comparison to other games. Much like RTGI which bounces light, I'd like to hear sound that bounces as well. It's not often that we have a game that does that. More often, if a sound is coming from my right I only hear it on the right, as if there were a void on my left that sound cannot reflect from. Maybe these things are not important enough to the consensus, but I for one have been wishing for these improvements.
I think physics simulations is a hard thing to do not because of limitations but because making physics an integral part of your game is not easy.
Half-Life 2 and Half Life ALYX are like the only examples where really good physics simulations actually meant something for gameplay, I wish more games did that.
That, and the uncanny valley of character faces, movements and overall realism in the way NPCs interact with the world.
CPU requirements increase a lot over time, yet physics, NPCs and AI are always the same....
@@keatonwastaken I’m currently playing Control, which is why the subject is fresh on my mind. The physics are great in that game, but even so, objects seem to have one predefined weight, which takes me away from the experience a bit. I’d say even if physics aren’t critical to the experience, I think it would make games more immersive. Regarding a new frontier in gaming, that’s among my top picks.
@@LilMissMurder3409 Absolutely, especially open world games. It’s back to the drawing board for sure when it comes to that, because it would be too costly and unreasonable to mocap filler NPC’s. Inevitably I think AI will be a big part of this gaming renaissance, as much as I hate to admit it. I can see how it would be beneficial, but it’s also a big can of worms. I can feel other commenters readying their pitchforks now lol, and I can’t blame them.
Graphical fidelity is good enough. They need to make the physics way better. It's like the picture is getting slightly better, but less and less interactive compared to older games. Arkham Knight to this day looks graphically amazing, and when you turn on Nvidia PhysX, it looks way cooler than any raytracing we got today.
This 100x. I’ll take Generational increases in interactivity over any visuals any day. What makes games fun is interactivity not visuals.
Agreed. The Nvidia Physx smoke in Arkham Knight still looks better than the smoke in almost all games today.. Felt like a taste of the future back then..
But, here we are, in the future. And we are still seeing the same old, non-interactive, billboard transparent textures meant to represent smoke, and it's just sad..
I don't consider the smoke PhysX interactivity. I consider a world like Zelda, where the world reacts in a common-sense way to your actions, an example of interactivity. That smoke was nice but held no gameplay importance, just fancy fx.
No it is not good enough. You're going to have to buy a new PC, get over it. We've been doing this for 30 years.
HL2 still has some of the best physics.
Seems games took a step back in the last 10 years xD.
Digital Foundry made a vid about Ageia Physx Card.
My hardware is just a means to an end. I just want good games, and it feels like there are fewer of those every year.
That's just a result of ageing, unfortunately. The more you experience the higher your standards become, and the less novel everything is. What was once exciting and unexpected is now dull and predictable. And there is a foreshortening aspect when you look to the past. The past contains all the great things you loved, it is full of them. The present only contains the things you love right now, which are inevitably fewer. Same reason as why music was always "better" in the past.
@@calmhorizons You aren't wrong but I don't fully agree with you. It's pretty obvious when you look at games being released say 2004-2011 and compare it to games being released 2017-2024 that the quality and quantity of good games has dropped significantly.
Same with music. I'm mostly listening to old music, and discovering new stuff all the time, so it's not nostalgia in any way. The only thing one can argue about is survivorship bias, in that all the bad stuff from the past has been forgotten/ignored and only the good stuff remembered and filtered out. That works to your advantage though, since you know the old stuff is going to be pretty good for that reason.
@Skumtomten1 Most old music is terrible too, and the same goes for whatever genre or year you're in. Most music is bad, a good number is decent, and a few are very good. Obviously you're only listening to the good ones.
@@calmhorizons Predictable doesn't equate to bad. Music only has so many note combinations and chord progressions, but a new unique spin can still intrigue someone who prefers his generation's young music. Pokemon, Mario Kart, Mario Party, Smash Bros: the video game market is full of predictable but still great releases with new tweaks. One Piece can be predictable but it's still the greatest work of fiction in modern history.
@@thunderstar254 True to a point. But you might be discounting the vast gulf between possible combinations and plausible combinations.
There is a reason why we see the same flavours, melodic intervals, story beats and visual motifs repeated over and over in art and entertainment - evolution furnished us with a limited margin of acceptable interests and constrained sense organs. To make a crude example - it doesn't matter how many variations of shit flavoured ice cream you make, it ain't gonna sell. 😁
I would 100% love it if the world just decided that graphics right now are good enough.
Why push progress 1mm forward if that 1mm costs me 1000$ just to keep up?
The answer is "because they want that 1000$ from me again and again as they very slightly change almost nothing but performance cost"
The industry is very clearly and very deliberately sabotaging performance in order to force consumers to buy new expensive parts if they want to keep up.
They might actually add slight improvements to games, but they're so minimal. The only actual change I've noticed in newer games is that they run worse and look worse on my hardware than older games do.
That's all there is to it.
It's never been about making prettier games. It's always been about making consumer hardware obsolete.
I think this is a bit of a stretch. It's not that manufacturers and publishers are colluding to make hardware obsolete, it's that hardware evolution continues steadily, and prettier games attract consumers, which in turn cost more resources to make, which in turn influences publishers to crunch development time to save money. It's a perfect storm of pressures, and in the AAA space that means 4-6 year old hardware becomes obsolete when, for example, games like Alan Wake 2 are designed around mesh shaders - and stand out visually for pushing the needle forward.
Your observations aren't wrong, but the narrative is more simple and more nuanced than "the industry is against its consumers."
The truth is that there's more computational power in the hands of developers than ever before, but those resources are being squeezed to get games out the door quickly, and rely on cheap techniques like TAA to buff out the visual imperfections. I'm not sure of the solution. Times are tough right now, new games have very much stagnated, and we need more publishers willing to take on more risky ideas than yet another remake of their next greatest hit. Alleging a conspiracy doesn't help anyone.
Graphics can take a backseat when we finally have raytracing doing all the lighting as the norm. That’s going to be a huge graphical leap that can’t be understated.
“The enemy of art is the absence of limitations.” - Orson Welles
I believe the digital age has disintegrated the mindset of optimization. Games no longer have to work before it goes in the box or fit on the cartridge or disc. Games can be 200 Gb, run poorly, and get patched later. Broken indeed.
Ja
Is optimisation art?
@@CurtOntheRadio i'd say it is.
anyone could write some code that does a thing, but not everyone can optimize it to run well
Like how anyone can dip their hands in paint and print them on paper, but not everyone can make a beautiful painting from that
@@captainjimo Hmm. Then everything is art too. I'd venture Welles wasn't speaking about optimisation so much as art is about the constraints - only having these few notes, these few instruments, these few locations, whatever ie working within material limits that constrain one and getting the most out of it.
Optimisation is more about removing constraints. Arguably the art comes in working within constraints, whatever they are, and getting the most out of it - not removing the constraints (which is what optimisation does, and which is largely a technical, objective, engineering job).
@@CurtOntheRadio Everything isn't art, if that was the case there wouldn't be art schools.
Fuck graphics, what pains me is how environments are sterile and uninteractive, how physics in games is dead, and how AI has been literally the same for over a decade now. Burnout Paradise STILL has the best realtime car deformation system in a racing game and that game came out 16(!) years ago, then titles like Red Faction Guerrilla had an insane destructible and interactive environment, or The Force Unleashed combined multiple middleware tech to reach the devs' design objectives.
Devs are pushing for visual shit like mentioned in video that can be barely visible meanwhile you play current gen title and while going through foliage you pray observing if your character model will interact with that foliage properly or if your character will just ghost through that shit lmao. What the actual fuck.
Morrowind's map, while smaller than modern games', felt so much more like a real place that you could explore everywhere.
@@arkgaharandan5881 That too. Devs really need to scale down on fancy polygons and provide meaningful and more interactive experiences and by interactive I mean everything that includes physics systems. Everyone slowly becomes Ubisoft when it comes to bloat of safe slop designed on sterile corporate templates.
The fact that you can launch a decade-old title, look at what its physics engine does and be impressed, yet not find a game 15 years later that comes even a bit close in that regard, is fucking insane. The only titles that push something like that are some meme indie tech demos that market themselves around a singular physics-based gimmick and that's it.
When was the last time we had cool gameplay centered innovation in a big budget title? Nemesis system in Shadow Of Mordor 10 years ago by Monolith. Nemesis is also AI based and looking at their previous work with FEAR in that regard no wonder they were the last studio that even tried doing something around AI.
Again what the fuck is going on, its like entire industry became MCDonalds tier.
Car destruction isn't even the fault of the developers, it's on the car makers who hate seeing their cars destroyed. Most racing games license cars to use and often the manufacturers get to decide whether or not they want a detailed destruction model.
@@doooodeh Nemesis system has been completely locked to just WB because they patented the tech so no one else can use it
@@crestofhonor2349 I know about the patent, stop excusing talentless worthless western developers that could design some other gameplay-centered innovation. Same with licenses and racers: there are games that use custom vehicles and those STILL aren't close to Burnout Paradise in that regard.
There's lots of great older games on Steam. I'm going through my backlog.
It's great fun to load up older games with upgraded modern hardware!
You can max them out, run at 4K+ resolutions, at the highest framerates possible.
@@bliglum I just started playing TitanFall 2 completely maxed out and up-scaled to freaking 5760x2400 over the weekend and it's as smooth as butter. It's been in my back catalog for awhile.
@@bliglum A couple months ago I treated myself to the full Half-Life bundle - it amazed me that a 20-year old game can still look that good. It was money well spent (and righted a wrong, since I'd pirated the original back in the day 😅 )
@@LilMissMurder3409 Hell yeah!
No better time to explore the backlog than now.
@@bliglum It's the digital equivalent of digging through the software bargain bin at PC retailers back in the day. That's another thing that has died out.
Imagine devs spending so much on nice graphics, only to apply forced TAA which makes it a blurry mess anyway.
Imagine having one gpu for 10+ years and still being able to play the newest games on highest settings.
It's probably doable, but discouraged by the hardware manufacturer mafia.
Sounds like a Twilight Zone nightmare.
The highest setting is a gross exaggeration
Never
@@kalebgross1310 It’s possible if you are still at 1080p; a 1080 Ti can easily do it at high at least, though not ultra.
Playing the same game at a higher resolution is such a game-changing experience, especially if you can skip upscaling.
I occasionally have to remind myself that I'm in an echo chamber of AAA games with ever increasing system requirements. Outside of that echo chamber, the "real world" loves the Switch and the Steam Deck, and many hugely successful games run on fairly humble machines. The "plateau" is actually quite a nice place.
Well duh play other games besides AAA.
I had to upgrade from my 2060 Super of 5 years to a 4060ti for some AI VRAM work. Despite all the community trashing on it. Turns out, there's really no game I play that actually struggles on it and that small bump from 2060S is all I needed. Most of my regular FPS games are CPU reliant. Most of the others are anime-esque that don't push graphics, or stylized like Overwatch, and the actual most demanding AAA games I like to play and revisit, RDR2 and CP2077 are the most demanding out of them all. Which all run smoothly. The game I've gotten most hours out of in the last 5 years is Factorio, Genshin etc. I've realized how far I am from the expectations of AAA gamers who push 4k, RT etc and it was warping my mind for a while thinking I made bad purchases, based on other people.
iceyy, cool to see you here, how's the X3D testbench videos going?
@@lancevance6346 That's not their point though. You can't just say the 4060 Ti is amazing, because it's not.
You don't make a card with literally the same performance as the last-gen card, when even the 3060 had 12 GB of VRAM, and put a higher price on it.
Second, Nvidia is not the good guy in this story, because they actually profit tons of money from consumers; they've become the richest company in the world right now, even Apple didn't earn like them,
so shitting on PC hardware companies is so valid rn
@@lancevance6346 And even if you like AAA games from 5 years ago, those games worked on GPUs from 5 years ago too. Now even the 4060 Ti can't run most AAA games at 1080p, so what will it do next year, or 5 years from now?
Didn't the 2060S run RDR2 when you bought it?
Doesn't matter how good the hardware gets when the games we get nowadays are not up to scratch..
This seems like a commonly held misconception which doesn't hold much weight under scrutiny. Which games specifically are you referring to? Which games do you play? Have you genuinely not played any top tier games over the past 10 years or so?
@@steviewonder0850 While there will always be good and great games coming out to play, there has also been an exponential increase in shovelware and bad games over the past decade.
For every Baldur's Gate 3 or Helldivers 2, we get a sea of trash sports games, unfinished games that take 1-2 years to actually be complete, and games annoyingly butchered by microtransactions.
@@darthwoody9917 I don't know. There was so much shovel ware during the PS2 and Wii era. The DS, Wii, and PS2 were known for just having so much shovel ware dropped onto them
@@darthwoody9917 Trust me there was no shortage of awful games 30 years ago.
Difference is those games usually weren't flagship titles made by large AAA devs. Activision & EA USED to actually make decent games, now they pump out slop
It'd be interesting to see AAA game budgets separate the development budget from the marketing budget. I swear half of their entire budget just goes to marketing and ads to sell poorly designed games that aren't fun in the first place.
It's not unusual for the biggest games' advertising budgets to be 20-30% higher than what was spent on multiple years of development, and even after all of that, how many games can you actually remember the advertising for? You're not far off with the 50% ratio, even at AA levels.
Half? Those are rookie numbers
They also tend to locate their studios in hipster cities that have egregiously expensive living costs and overheads.
Could be said the same for movies
Your best marketing is your consumer, give them a good time/experience and word spreads fast. Does not matter if its a device, a restaurant, a hotel, a beach, etc. The person buying any product is your marketing.
We reached a certain plateau back when the 3080/3090 was king. You can play at 1440p with DLSS Quality for years to come. In 2-3 generations the visual representation will reach a point where no more power is needed. Backing this up with "no one wants photorealism in games": a good art style is much more important. People do not want games to look like the real world; the opposite is the case. The same goes for freeze-frame comparisons between 4K and 1440p: you cannot tell whether a game is running at 4K or 1440p on a monitor. The pixel density is already high enough for ~27" displays; a 4K display adds almost nothing at a super high performance cost.
I remember when the 20 series was coming out and everyone was talking about the potential of 4K gaming.
Nearly 3 gens later and not even the most expensive cards can run 4K natively at a consistently smooth rate yet.
True 4K that fully benefits from the higher resolution is a scam for consumers. It exists almost nowhere.
Facts:
1. Sub-pixel detail. Pixels look noticeably better when there is more information per pixel blended down into one pixel. Keyword: "supersampling".
2. Camera resolution. Cameras have separate red, green and blue photosites behind a colour filter, and all of them count toward the megapixel figure, so a good 4K image actually needs roughly an 8K sensor.
3. YouTube and streaming. The compression is so heavy that a 4K video looks barely better than Full HD. The resolution options are really quality settings: higher resolutions get more bitrate headroom per pixel and can be downscaled if needed. True 4K on YouTube basically requires uploading 8K video.
4. Human eye. Human visual acuity is about 1 arcminute. Just do the math with your viewing distance and screen size to see what you can actually resolve (see the worked numbers below). In living-room conditions, 4K usually doesn't happen. A large display at arm's length, or an image projected onto a cinema wall, is where it matters.
5. GPUs suck at small triangles. They're very wasteful, which forces LOD levels, and having more LOD levels is also wasteful because the GPU can't benefit from instancing as easily, which turns memory bandwidth into the bottleneck.
In short, there isn't much "4K gaming" that really benefits from 4K. Even camera technology falls short of it.
So while frame generation easily falls flat, frame upscaling actually doesn't. A game can be built for 720p-1080p and upscaled from that to the panel resolution, and that is how it's done in reality. PC gamers try to play without upscaling, and that's when they see the truth: the game is made to be played from a sofa in a living room on a console, rendering internally at 900-1080p and upscaled from there. If a game runs at 30fps at 1600x900 on PS5, a large 4K display at native resolution and 60fps needs almost 12x more GPU power and bandwidth. And I'm not even talking about ray tracing or path tracing yet.
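A quick back-of-the-envelope check of those two numbers, as a minimal sketch. Assumptions are mine, not the commenter's: GPU cost scaling linearly with pixels-per-second, ~1 arcminute acuity, and a 65" 16:9 TV viewed from 3 m as the illustrative living-room case.

```python
# Minimal sanity check of the numbers above. Assumptions: GPU cost scales
# linearly with pixels-per-second, human acuity is ~1 arcminute, and the
# example TV is a 65" 16:9 panel (~1.43 m wide) viewed from 3 m.
import math

def cost_ratio(w1, h1, fps1, w2, h2, fps2):
    """Naive GPU-cost ratio: pixels per second at the target vs. the baseline."""
    return (w2 * h2 * fps2) / (w1 * h1 * fps1)

def resolvable_columns(screen_width_m, distance_m, acuity_arcmin=1.0):
    """How many horizontal pixels the eye can distinguish at this distance."""
    rad_per_pixel = math.radians(acuity_arcmin / 60.0)
    pixel_size_m = distance_m * math.tan(rad_per_pixel)
    return screen_width_m / pixel_size_m

# 1600x900 @ 30fps (PS5 internal) -> 3840x2160 @ 60fps native: ~11.5x the work.
print(round(cost_ratio(1600, 900, 30, 3840, 2160, 60), 1))

# From 3 m away you resolve roughly 1600 columns on a 65" TV, well short of
# 3840 -- which is why native 4K adds so little from the sofa.
print(round(resolvable_columns(1.43, 3.0)))
```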
Eh, I've been playing at native 4K OLED since the GTX 1080 Ti. I'm still playing Baldur's Gate 3 at native 4K on my 1080 Ti. But next year I will upgrade to a 5090
@@slopedarmor
4K rendering is possible, but it is expensive. Buy hardware four times as powerful as the latest-gen console and you get the same framerate and upscaling ratio the console achieves at Full HD. But the content itself is made for the console's pre-upscaling resolution anyway.
Btw, I just started playing an older game, made in 2010. Its content is targeted at 1280x720.
What I did: I set the game resolution to 1280x720 and forced MSAA to max from the driver (possible because it has a forward rendering pipeline), and that gives the best image. At higher resolutions it just looks like crap: the text gets smaller, there's lots of texture upscale filtering up close, and some graphic elements are optimized for 1280x720. There's even a grain effect tuned for 720p.
So the game actually looks much better when running at the resolution it was targeted at. It's just smooth everywhere, and the limited texture resolution and other assets don't distract. Settings forced high from the driver keep per-pixel quality up, so the larger on-screen pixels don't look bad.
What I find crazy is the Graphics sometimes aren't even "that great" and you're getting such a performance hit.
As well, that GPU's are becoming so HUGE, yet they STILL can't give you "Good Performance" at 4k.... I mean, tech is supposed to be getting "smaller", NOT bigger.
The 2080 Ti graphics card has 18.6 billion transistors on a 754 mm² die.
The 4090 has 76 billion on 609 mm².
Have you seen smartphones in the last 10 years? They do the same as GPUs, getting bigger every year
But I do agree: if the top-of-the-top GPU, the RTX 4090, can't do 4K 60 in every modern game, then what's the point of it being the best
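For what it's worth, a quick division on the transistor figures quoted above (sketch only, using those commonly cited die sizes) suggests the silicon itself did keep "getting smaller": density rose roughly 5x, even though the cards and their coolers got physically bigger.

```python
# Rough density check on the figures above (18.6B transistors / 754 mm^2 for
# the 2080 Ti vs. 76B / 609 mm^2 for the 4090).
density_turing = 18.6e9 / 754   # ~24.7 million transistors per mm^2
density_ada    = 76.0e9 / 609   # ~124.8 million transistors per mm^2
print(round(density_ada / density_turing, 1))  # ~5.1x denser
```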
@@CurtOntheRadio So 4 times has much transistors and it's sitll not reliable at 4K
@@machintrucGaming Does it have to be? That's your own arbitrary target. And is it unreliable? It's surely better than your 1050Ti.
@@CurtOntheRadio Well OP wants good performance at 4K... I've personally given up on that. 1440p all day everyday. 4K is just about good for indie games. AAA can't be arsed to optimize their game.
The first time I saw a PS2 game as a kid I could hardly believe what I was seeing.
I was very happy growing up with PS1, NES and N64, GBC and so on as a kid. Never complained about graphics. That's probably why the graphics don't matter to me at all today. I care much more about the story, the characters, the music and gameplay. And high fps of course. Nice graphics are just an added bonus--if they don't make the game run like trash.
@@Thunderhawk51 Same. I still love 2D pixel art as much as 3D, but enjoy good games of any genre from any era. Currently mid-way through my first playthrough of Bloodborne on PS4 connected to a plasma TV and it looks and plays great.
I was much more impressed by the GameCube TBH. PS2 was convenient to play DVD's on 😅
I think it will really vary a lot depending on the age of the person commenting. For me it was without a doubt the Nintendo 64. The OG Playstation was great, but all the geometry warping just looked bad to me, it wasn't the 3D revolution I wanted. The N64, however, just looked so high fidelity and smooth, and the geometry warping was nowhere to be seen, it just blew my mind. Dreamcast and PS2 were both great as well, but nowhere near as transformative to gaming for me as N64 was.
@@K31TH3R N64 was still too fuzzy and low fidelity for me. Ocarina of Time was amazingly atmospheric for the time, but in general, the generation of 3D was novel for how it transformed gameplay more so than visuals and I thought most N64 games were ugly too. For me, the Dreamcast was the first console that made 3D appeal on an aesthetic level.
As with all forms of technology, you reach a point of diminishing returns where "2x the power" starts to seem like "1.5x the power", then "1.25x the power"
I'll relate it to 3D modeling: we used to see a new GPU generation take multiple minutes off render times; now a new GPU generation may only take 10 seconds off, or even less. But when you look at the percentage difference it's still about the same, because 10 seconds is twice as fast as 20 seconds, but it's still only 10 seconds at the end of the day
Show me a system that requires exponentially less energy to produce twice as much of ANYTHING.
A perpetuum mobile still doesn't exist and never will. 😅
Going from 800x600 to 1078 at the same framerate takes about a 40% increase in GPU power, or you could stay at 800x600 and have better visuals at the same fps as on your previous GPU. Mind-boggling... a GPU should be 100% better, and preferably at a higher resolution, to pay off
Yup, this is why I have been enjoying VR, because the Quest's mobile hardware improves 2x every three years and you can see a real generational leap, like Resident Evil 4 to Batman: Arkham Shadow. Sadly I think the Quest 4 will be the last massive generational leap for VR, and every model after that will bring incremental improvements like sharpness and FOV.
Imagine your son wants a gaming PC, nothing too fancy, just enough to enjoy modern games, and it costs you 1500€. I bought my first PC 15 years ago for 500€, and I could play everything. A kid will not be able to afford PC gaming anymore, because we added a bit of grass in the distance
This!! So true.
Imagine thinking 15 years ago prices would remain the same. We could say the same about gas prices, groceries, housing prices, rent, taxes, healthcare. Maybe the issue is that prices have gone up but wages have not and that's the reason why the world is slowly crumbling
You can still play everything on PC with a low end, but like your $500...you have to be on 1080p and turn down all settings. Everyone knows that experiences are best enjoyed at the max settings. Thats why people buy better hardware because this is their MAIN hobby. Its like golf clubs. You only need 1 driver but people into it will have 5. And each of them can cost $2000.
My only hobbies are watching stuff online to save money, and gaming; entertainment-wise everything is free. If I can't afford a desktop or laptop I only buy a phone and use that until it falls apart. I literally only work, sleep, save money and volunteer. I have no other hobbies, I go nowhere, I can't afford it, and I can't afford most basics like housing. It's very doable to afford a phone with no monthly cost and a computer, and to use free internet at food places. Until jobs pay, there is no reason to do or go anywhere or be involved with anyone. Entertainment is free and great, and a laptop or desktop is more than just a gaming machine
Just built a PC for a friend with an RTX 4060 and an i3-13100F for 650€, running a Gen 4 M.2. Inflation is crazy, but if you look for the best price across countries and websites you can save a few hundred bucks
Another problem is that companies aren't innovating anywhere like they used to. They're playing safe with trend chasing at the forefront. Also, the whole release it now, fix it later mentality is also hurting the industry. Optimization really hasn't been a thing since at least the 6th generation of consoles, maybe the 7th.
Graphics are fine now. I don't need them to be any better. We need smoother framerate and innovative gameplay that doesn't involve sleazy, greedy and FOMO tactics.
I've only ever bought two re-visioned consoles, the PS2 slim (simply because my fat model was having trouble reading dual layered DVDs) and the PS3 slim (because my original 80GB model got stolen).
The extra performance should be used for gameplay, not graphics.
Real AI powered NPCs, full physics integration on gameplay interactions, gameplay relevant path tracing (using reflections for stuff or something)...
The only purely graphic performance we need now is to be able to achieve current PC graphics on VR, after that, i would be fine with graphics no longer getting better, Half-Life Alyx is so close already...
AI, physics and density are all CPU limitations. Maybe when Nvidia joins the CPU market things will get better
Developing NPC/ world ai is something Ive wanted for a long time. That has stagnated for like 15 years
That actually takes effort from the developers, unlike using stuff like ray tracing or making grass be seen from a little farther away.
At this point, I don't care anymore for realistic graphics. I prefer unique art style which always tend to age better.
You can blame the mindless NPC consumers for the focus on ultra realistic graphics. They have no standards and only care about le pretty graphics.
Both can happen at once; I think Cyberpunk is a good example of that where it looks unique and realistic but it's not soulless.
Good for you. I want games to get closer and closer to perfect realism. I want to look at better graphics.
I have a high-end PC with a 4090, yet I see myself playing my hacked OLED Switch more, especially in the past 3 months. We all know Nintendo doesn't care about high framerates, and every game has cartoon visuals; I don't mind one bit as long as it's fun. And if you run a Switch game on PC with Yuzu or Ryujinx, Luigi's Mansion 3 looks like a CGI film with a couple of tweaks at 120fps.
I like both. Not everything has to be super unique in terms of art style. Also realistic games don't always age poorly, it just depends on how you hide your limitations
I don't know what sort of tech The First Descendant is using, but it was quite funny: I had RTX ray tracing enabled and was near a light source admiring how the light was bouncing off my character, and I even told my friend, "Wow, look at this, it probably wouldn't be bouncing off my character like that without ray tracing on". Later on, I was hitting some performance hiccups in battle so I turned my settings down, came across the same light by happenstance, and regular old rasterization looked the exact same lol.
First Descendant is Unreal Engine 5. The game does use a combination of UE5's Lumen and ray traced shadows
Rasterization doesn't actually have anything to do with real-time global illumination. There are multiple software based ways of approximating the effect or doing it literally, the stuff like RT Cores that are a part of the newer cards are really just dedicated hardware to accelerate ray-intersection tests and bounding volume hierarchy traversal. It basically just lets them do raytracing faster using hardware acceleration.
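To make the comment above a bit more concrete, here's an illustrative sketch (not any engine's or vendor's actual code) of the kind of operation RT cores accelerate in fixed-function hardware: a ray vs. axis-aligned bounding box "slab" test, the basic step of BVH traversal, executed millions of times per frame.

```python
def ray_aabb_hit(origin, direction, box_min, box_max):
    """True if the ray origin + t*direction (t >= 0) passes through the box."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        t1, t2 = (lo - o) / d, (hi - o) / d   # assumes d != 0 on every axis
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far and t_far >= 0.0

# A ray shot roughly down +X hits a unit box centred at (5, 0, 0).
print(ray_aabb_hit((0, 0, 0), (1.0, 0.01, 0.01),
                   (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))   # True
```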
Flat panel TVs/displays have followed the same trend, unsurprisingly. SD to HD to UHD (and the variants of HD in between), have shown pretty clear improvements, but in the jump from 4k to 8k the conditions and content being shown have to be just right for it to have any impact. This just means as the years rolled on 4k tv's went from massively expensive home theatre-like luxuries to cheap consumable items for a fraction of the cost. Will gaming hardware follow suit?
It feels like more people are looking to legacy hardware/emulators to get their gaming fixes, these days, because of diminishing returns, and because of the state of AAA gaming, as set by the publishers.
Yeah we might be seeing a trend of diminishing returns in graphics, but the problem is that hardware requirements aren't following the same trend. Despite being able to play beautiful games today, a game 5 years into the future that might even look stylistically worse won't even be able to run on my PC. So frankly we don't even have the option to go "yeah I'm comfortable with my hardware today", we need to spend more and more money just to be able to play games that look identical to the games we could previously play.
What I've been noticing is that games just look pretty, with less interaction with the world and less animation. I don't know how taxing or demanding it is to make foliage bounce off characters rather than clipping through them, or to have destructible environments, or a world that is affected by battles, or more than 5 NPC models. Every time there are fewer and fewer animations...
That's very game dependent.
Physics systems are pretty demanding from both a technical standpoint and a coding standpoint. I don't know if you remember Nvidia PhysX, but pretty much everyone hated it because of the performance cost, even though it did add a bunch of physics interactions
Destructible environments are very game engine dependent
@@crestofhonor2349 Surprising how we haven't improved the physics performance in the 15 or so years since PhysX was a household name.. it's the new gimmick word now. "RTX".
@crestofhonor2349 The last game I know of that uses physx is Control. That game had really good physics and it's well optimized too
Scaling projects up is not a linear increase; costs rise exponentially. PS1-era devs didn't have to deal with facial expressions, for example, but nowadays we're not only looking at very expensive motion-capture performances for facial expressions, you also have to worry about hair simulation, subsurface light scattering, muscle movements, pores stretching, and the list goes on. And if any of these is neglected, you get a mismatch and a dissonance between how things look and how you expect them to look and behave. A lot of the advancements we see now have to do with automating these time- and cost-consuming endeavours (RT, MetaHumans, etc.), but it's getting more and more expensive and companies keep pushing bigger and bigger worlds and projects.
Very true.
Many classic games pre-2000 were developed with dev teams under 50, and they usually developed their own engine at the same time.
The original Baldur's Gate and its engine was made by approx. 60 people over 18 months.
CP77 was in development for nearly 7 years and peaked at around 500 people on the team.
00:40 Mate, the pc was pushing beyond what the ps5 can do three years before the ps5 even launched.
Ps5 beats the 1080ti no problem? You might wanna rethink that statement.
@@com.7869 I mean.. it does though. But not by much, considering there was a 2-year gap.
@@com.7869 the 2080 Ti released more than two years before the ps5.
The ps5 is between a 2060 and 2070 performance wise, that's a far cry from what a 2080 Ti can do.
If anything, the ps5 is equivalent to the lowest tier in the current Nvidia line, the 4060 Ti, and that's being very optimistic.
@@janbenes3165 3 years? The 2080 Ti came out in September of 2018 if I remember correctly. You're thinking 3 generations; we get a new generation roughly every 2 years, or we did
@@janbenes3165 Can a PS5 run 1440p native at max settings at 60+ FPS consistently? That's my problem with that statement, and why DF's conclusion often misleads people.
I had a 6700 non-XT and it can do that all day before you even think about applying FSR. The problem with DF is that they gloss over the upscaling and act like "yeah, it's equivalent to XXX GPU" because said upscaler manages to bring it back to 1440p 60. When in reality, the AMD Oberon (the name of the PS5 APU, btw) is rendering at 1080p. Sometimes even lower.
This is a handful of issues.
Part of it is that graphics are so advanced that trying to advance them further in a percievable way takes a ton of power. Another part is that doing research and development to improve hardware significantly is getting more costly, time consuming and difficult.
Early PS4 games still look fantastic. We're well past the point of "these realistic graphics will age badly". Even plenty of PS3 games still look good. We can keep pushing graphical fidelity further and further over the years, but we can leave most of that to tech companies as manufacturing technology improves.
But I think a lot more effort should be placed on optimization. Making newer tech like ray tracing easier to run so its more viable. Reducing development costs, development times, etc. Doing all that is good for everyone.
"Buying more cars is awesome, but we should really fix up the ones we have. " so to speak.
Beyond that, I'd love to see advancements in tech outside of graphics. Physics engines, NPC counts, world scale, more options for affecting the world, bigger stories, etc. I'm looking towards Hello Games, and... frankly, the Beyond Skyrim mods for expanding the scope of games in a polished way. Remember Red Faction Guerilla? Did everyone just stop caring about pushing interactable destruction? The only other games I've seen do that are Teardown and technically BeamNG. Shoot, even Flight Sim 2024 is massively expanding scope. Assetto Corsa Evo is expanding on the sim racing genre. This is what I want more of.
Don't take graphics off the stove, but put them on the backburner. They still can improve. But its not that important anymore.
I think DF said it best yesterday when I listened to them: "the teraflop war is over".
The games I’ve felt have been transformed most by Ray tracing are old ones like Quake 2, Portal, Minecraft and I’m really looking forward to Half-life 2 Rtx.
Personally I’m happy with less fidelity and games that take 3 years or less to make as opposed to 4-7 years in the triple a world of gaming. I enjoyed Cyberpunk 2077 and Witcher 3 but have enjoyed the Fromsoft souls games more which are technically inferior to CDPR’s output.
That made me think. I was buying a high-end GPU like every 5 years to play the newest games. But at some point, games can't just get "better", so I guess there should be also a point where my GPU is still high-end years later.
My 1030 can still play all the newest games at 4k ultra and ray tracing. Pretty good card you should buy it.
kinda happened with 3090 & 6900, the new cards are more powerful but we've entered an era where cost is so high that they will be extremely relevant for a good few years yet, still 4k cards for the most part in a world where upscaling and frame gen are the new normal.
That would sure be nice, but if UE5 is any indication, we'll still be getting games more demanding on hardware at a similar rate without necessarily improving the visuals (as is the case with modern games starting to require TAA/DLSS as a crutch for optimizing performance).
@@gorillagroddgaming😭😭😭😭🔥🔥
@@gorillagroddgaming Sure , at 1 or 2 fps :P
"we're gonna stop here" sounds great to me for like 5 years. I just upgraded my pc after 10 years. Can you imagine.
i built my pc in 2013, lasted me till 2021, and now feeling how much more heat and power my second build makes, its crazy
It sounds great to me. I upgraded and am pleased with this machine. It pains me to see cards like the 4090 starting to be called a 1440p card (id like one).
Console generations do already last 5 years or more though
@@pedropierre9594 My 4090 is a literal space heater. Great for the winter but during the summer I'll use framerate caps and power limiters to make it more efficient.
@@minnidot Hahaha who would call it a 1440p card? It has 24GB of VRAM and DLSS is best experienced on 4K displays. The reality is when you push graphics to the maximum (i.e. path traced Cyberpunk) you are dealing with 1080p upscaled to 4K and you need frame generation. The 4090 will be playing games perfectly fine at 4K for another 5+ years. It won't be keeping up with the latest and greatest ultra PC settings but it will still be light years ahead of what the consoles will be able to achieve for the foreseeable future.
Diminishing returns is the best way to describe modern gaming.
Slightly off topic: I had gotten so used to my Windforce graphics card running Overwatch 1 at under 30 FPS. I was CONSISTENTLY landing headshots as Hanzo. My friend watched me play a bunch and said: "Dude, you could go pro!" and gave me his 1080. I was ecstatic and set everything up. Literally couldn't land a headshot for shit afterwards, so I went back to my Windforce and immediately went back to dunking on people. Then OW2 came out and now I have the 1080 installed. What does any of this have to do with anything? No clue, but lower framerates worked better with my brain; dunno if that's an actual thing that others experience or not. Great friend, we still hang out and talk.
I haven't finished the video yet, so I don't know if you addressed it, but another classic example of a wonderfully optimized, beautiful title getting absolutely brutalized by modern "features" for a minuscule visual improvement is The Witcher 3. I played through the whole game at 1080p low/med settings on a GTX 970 when it first came out to get 60fps, and then when I got my RTX 3080 in 2020, I downloaded all the texture packs and skybox and lighting mods, booted it up in 4K, and played through the whole game again getting 90 to 120 FPS, absolutely blown away by the visual experience. CD Projekt Red updated the game to DirectX 12 and incorporated those mods into the base settings, and the same exact system can barely get 40 to 50 FPS with the same settings on the new version. If there's a visual improvement between the classic version that runs smooth as butter and the DirectX 12 version that is stuttery and horrible to play, it is splitting hairs, and not worth losing a single frame of performance. It's also shitty that all the new players that go to try out the game in its current version for the first time will get the shittiest experience possible.
To answer your question mid video, if you gave me games that looked like the Witcher 3 (with better NPC character models though) or RDR2, and could run comfortably 90-120 FPS at 4K on an upcoming 5070 or 8800xt, then I would be perfectly happy with that if it meant the games got more FUN and polished from a gameplay standpoint.
AMEN brother most modern games are like that now and it is SAD
The Witcher 3 current gen update is so god damn cpu intensive I just cannot understand why. I’m getting the same cpu performance as I did on the original launch which I played on an i5 4690k!, and my current CPU is the i5 13400f. It is eating my cpu up more than cyberpunk running path tracing.
Witcher 3 is also not artistically designed for ray tracing, which hinders the benefits a lot. Hardware Unboxed did a video talking about the visual comparisons and improvements possible with ray tracing, and they rated Witcher 3 as "Different, but not necessarily better".
@@ElysaraCh The worst part is that you can turn off RT and you still have a massive performance penalty playing the "upgraded" version.
Didn't it also change the artstyle to be worse than before?
I don't really care about graphics, I care about gameplay. That being said, I still hope the ceiling gets higher and higher. We are in a dip right now where small improvements take a lot of performance, but hopefully as time goes on these processes get more efficient and new games come along that raise the ceiling even more.
I remember wanting games to be more immersive, but not necessarily graphically. When I played ME2 I was visualizing branching narratives, dialogues, etc. That was what "realistic" meant to me for the future: not visuals, but content and interactivity.
Old CRPGs are better in story and dialogue than most games; this aspect has not evolved but devolved. Only games like BG3 showed a glimpse of hope for more realistic and open interaction.
It's amazing how an RPG from 30 years ago (Ultima 7) manages to have a world that feels more alive than pretty much anything made since (partly I blame the original Diablo, which was _marketed_ as an RPG despite really being a dungeon crawler / hack'n'slash game - that really lowered the standards for what could be called an RPG).
Ultima 7 NPCs have actual lives, sleep, eat, go to work, react to the environment around them (ex., opening windows during the day, lighting candles at night, etc.), have huge and complex dialogue trees, etc.. Nowadays people just accept "RPGs" where characters stand in the same place and repeat the same 3 lines over and over.
@@RFC3514 There is a big opportunity for this kind of games, but they need decent graphics and voice acting for them to be main street. I think Larian is the best hope, but I'm sure if they get another big hit, some other studio will come out.
@@digitalsublime - Voice acting was another thing that hurt RPGs around the Diablo era. They couldn't fit audio for the huge dialogue trees of games like Ultima 7 into a CD, so they made most dialogues very linear.
In fact, the dialogue in some older games was generated dynamically, so it couldn't really be pre-recorded (unless it was assembled from individual words, and that usually sounded too robotic).
Maybe now with decent voice synth and generative text AI we'll see games with more complex dialogue systems.
Just don't ask the NPCs to count the number of Rs in "strawberry". 😜
Great take on the current dissatisfaction generally felt among gamers, Daniel :)
To me another "pressure point" being felt with many of the latest games, is that going from High to Medium makes the latest games look like they rather belong in the early PS4 era, instead of early PS5 era.
A few years ago we had the whole "you don't need Ultra settings for gaming" movement in the community, which was right at that time. But that period came right after a period where a very popular Nvidia 70-series card (GTX 970, then 1070) or the AMD equivalent could run games from the same release year as the cards themselves at mostly Ultra settings. These were $400 cards (or thereabouts) and were massively popular. But then the RTX series began and we needed that "Ultra-settings intervention", because new GPUs became prohibitively expensive for many people. Just look at how much higher the peak adoption rate was for the 970 and 1070 compared to the 2070 or 2070S.
And the developers seem to be between a rock and a hard place in regards to how they should prioritize their development time: on ray tracing, which runs too slowly on most people's hardware, or on building a graphically impressive lighting setup with pre-ray-tracing techniques like they used to before RTX.
The games no longer look impressively current-gen on medium settings without ray-tracing, and even a 4070 has a hard time running High-settings in some of these games with some raytracing.
And AMD has been slower to develop the ray tracing capabilities of their GPUs. So their cards generally still rely on good rasterization and baked lighting implementations.
So it's kind of a transition period between two lighting implementations, one that is stretched out or dragging on because the newer process nodes (newer than 7 nm) aren't bringing the massive efficiency and area gains we used to see. ...And Nvidia is busy building chips for the AI market, because companies can afford to pay much more for an NPU than gamers can justify on a GPU...
Overall good points. Here are a couple of presentation improvement points. Again though, thanks for the discussion.
1. I wish you went fullscreen with the source. It makes it easier to see and improves quality.
Edit:
2. Thanks for linking the original sources. YouTube compression will degrade it every time you upload.
The funny thing is that when you displayed the example around the 4:16 mark, I thought the right side looked better. This reminds me of how in Monster Hunter World you can disable Volumetric Rendering to stop the backgrounds from being blurred and washed out. There are some newer features that don't look good, even if they are trying to be realistic in some cases.
Yeah I thought the contrast made the tree textures pop more - seemed more realistic and crisper on the right.
It does
Flatscreen gaming graphics may have plateaued to an extent, but as someone who plays a fair bit of VR there is plenty of room for improvement on that front. There are very few AAA quality titles and mods which transform flatscreen games into VR (like Cyberpunk) look fantastic and have amazing potential but even a 4090 can struggle to run them to an acceptable degree. I feel like that might be the area to focus on more in future.
Exactly what I think!
With VR you're only ever going to get what you get as a result of improvements made for normal monitor use. VR is extremely niche and hardly anyone is interested in it, and that isn't likely to change. Sure, its market share is increasing a lot as it advances and it's a lot more prevalent now, but that's an increase from a small number to a less small number. VR is cool and I hope it gets more attention for those into it, but Ready Player One isn't going to become reality. The Apple Vision, for example, is probably their biggest failure ever, and the metaverse failed spectacularly.
@@paulc5389 AVP can't even play games and wasn't designed to so that's kind of irrelevant, hardly a shock what is basically a VR Iphone at $3500 didn't go mainstream.
The Quest 2 sold 20m units which is not that far below Xbox Series sales (28m), sure it's still niche but there's definitely a market there.
I think VR conversions of flatscreen games is a potential goldmine as it doesn't require extra investment or specific dev and can work surprisingly well, it's just the hardware needs to catch up. Once we get affordable GPUs that can run games like Cyberpunk in VR with the same fidelity as flatscreen (probably 2 gens away) i think it will explode as it's a truly incredible and transformative way to experience games.
@steviewonder0850 oh for sure. I just mean don't ever expect it to be the main focus of Nvidia et al.
@@paulc5389 Nvidia doesn't really to do anything in particular, better GPUs for flatscreen gaming = better GPUs for VR. It's just extremely hard to run VR games in high fidelity on current hardware as it's basically like trying to play in 8K flatscreen. But there are little tricks and things they can do for optimisation like foveated rendering which will only get better. VR is still in its infancy really, probably like the equivalent of a PS2 right now compared to what will eventually be possible.
Great video as always! I think part of the issue is resolution and the other part is VRAM. This is the first time we have ever gone 4x in resolution. PS1 and PS2 had the benefit of being able to change the output resolution to match your CRT and still look good; PS3/360 jumped from basically 480p to 720p, then we moved to 1080p, and now it's 4K. That has eaten away at half the GPU uplift we got this gen.

Then you add in that RAM/VRAM barely moved! In Mass Effect Legendary Edition, the main benefit both ME2 and ME3 get is the HUGE improvement in texture quality. On PS3, devs had 480MB of usable RAM (32MB for the OS), PS4 gave them 5.5GB, and this gen they have 13GB; when you factor in the resolution increase etc., how much of that has been eaten into? And with Nvidia keeping 8GB cards relevant 4 years longer than they should have, with the likes of the otherwise excellent 3070, you end up in a situation like Black Myth: Wukong, a game that uses the latest GPU technologies and can look incredible, yet in places looks like a PS3 game with terrible textures.

I would argue the single biggest upgrade in the Horizon Zero Dawn remaster is the increase in texture quality, alongside the improved animation in cutscenes and the better framing of secondary/side-quest conversations (based on the video DF did). In all honesty, other than more VRAM, none of those improvements actually needed more GPU power; the PS4 version of Horizon Forbidden West still has the same incredible motion capture and well-framed conversations as the PS5/PC version. While I'm a sucker for better lighting and incredible micro detail like the peach fuzz on Aloy's face (it does look so impressive), does it actually make the game better? No, not really. The gameplay and story (my god, what a story!!!!) are why I fell in love with HZD.
But...modern consoles are NOT, NOT, 4K though. The vast majority of the time, to maintain a stable 30-40fps, they'll run at 1080p. Even in quality mode, the highest resolution point you'll hit isn't 4K at all, but 1660p. It's only capable of upscaled 1440p...and it hardly ever runs that. A friend and I hooked the PS5 up to his computer and ran some programs to read its performance. We ran several games at multiple settings and it averaged 1080p nearly 70% of the time...even with the two "4K"-capable games at the time.
From the 360/Ps3 era onward the console companies have been lying about the resolution of the consoles. The consoles almost never render anywhere near the resolution listed on the box. The console simply upscales the image. Your "1080p" console was in fact doing 540p-800p upscaled. Your "4k" console was in fact doing 900p-1600p upscaled.
@@iprfenix Consoles like the SNES and PS1 could all output at 480i and yet they chose to do 240p. Plus even then resolutions weren't consistent. There were a whole host of pretty strange resolutions during the analog days of CRTs. Resolutions were never static. Even the PS2, which could go all the way up to 1080i and did support 480p out of the box, often rendered at 480i. It often took two 240p images and interlaced them to create a 480i image unlike the other consoles which often used a 480p frame buffer and then output a 480i or 480p image. Plus even those that ran at 480p didn't always run at 480p and could have an internal resolution of 448p and just use the fact that there's overscan to hide the missing pixels.
There isn't a single generation where your console ever always output at native resolution
PS2 era still ran at mostly 480i (which for rendering costs is largely equivalent to 240p) and earlier consoles mostly ran at 240p. Also modern consoles very rarely run any game at native 4k (stuff like the Quake 2 rerelease does run 4k120 on everything but the Series S), it's mostly stuff like 1200p upscaled to 4k with either checkerboard rendering or FSR and 30 fps.
There are so many games from 2012 that I'm just in love with.
Looking into how some recent games were made, I was shocked at how some lack basic optimizations like LODs or even better file compression (Call of Duty *cough cough*), abuse real-time features, or build the shader cache while the game is running just to avoid a longer loading screen (at the cost of stuttering during gameplay), and all this in a lot of AAA games that had long or very long development times.
If developers create a game that’s too demanding, the PS5 won’t be able to run it, which would lead to low sales and reduced income. As a result, they design games that the PS5 can handle, even if that means fewer improvements compared to previous titles. Similarly, when viewing photos, you won’t notice much difference between a 10MP image and a 100MP image-unless you zoom in. While it's a significant improvement, can you actually see the difference?
19:19 not only are we not there yet, but designers are having to work with two lighting systems that make life a lot harder. See that HU video for heaps of examples where prioritising one means the other suffers (i.e. the game is mainly designed with non-raytracing in mind, so when you turn it on you get weirdly lit spaces).
To nail both you have to carefully design the light of each environment using both approaches and use tricks to avoid one going wrong.
Yeah this transition period is pretty brutal. And sadly, because RTX is still so expensive and underpowered for normal customers, we'll be here for another 10 years until RT gets commonplace and at a decent power, at which point the developers could finally switch and only develop with RT in mind and make it a hard requirement.
Actually, having full RT in every game without performance problems would speed up making games, as you would no longer have to bake lighting, shadows, etc.
Exactly. We need to get to full path tracing and just throw away raster mode rendering.
It's bigger than people think, because a lot of time is spent authoring and lighting a game using rasterization, as well as making new rasterized effects to imitate what ray tracing does
I wouldn't mind if performance didn't improve beyond this point (at least for a very long time). It would allow developers the time to actually improve and master the rendering features we currently have. The reason most people want more performance today is that games perform poorly. But at the same time, games perform poorly because developers haven't had the time to master what we currently have. It's a chicken-and-egg kind of situation, where our economic system (feeding on non-stop growth) requires consumers to WANT the new products. But in reality, just as game development takes longer these days, so do optimizations to engines, to games, etc. It feels like the kind of rendering optimization improvement developers made during a 5-year period on the PS2 would take 20 years today due to the added complexity of modern rendering. I would personally like to spend less money on hardware, and allow developers to become more focused on creative graphics engineering.
I don't give a f about graphics. Level design > gameplay > art direction > story, and the rest is details
also sound don't forget bout sound design
@@Zero-j5f Indeed I forgot. It may be even more important than the story. Bug fix is also important
It's wild to think that a majority of gamers would put story so low.
It explains why most people stick to AAA I guess?
For me, gameplay>story>audio>art direction>level design>graphics.
And I think a majority of indie gamers would be similar.
For me, graphics help with immersion. I'm playing Metaphor right now, and I love it, but the graphics take me out of it a lot, especially when exploring dungeons where it's just a room and a hallway with poor textures.
Finally someone who gets it thank you
Gaming is in a weird place right now.
Gaming industry is dead
@@RAKUNTU It has been dead for over 10 years to me
After 2013 everything got worse
Somebody tell Nvidia to stop skimping on Silicon and VRAM! 40 Series was a Scam!
4090 pricing may have been exorbitant, but in terms of performance? It's fantastic. So is the 4080. When the 5090 releases, I'll own that as well. The GPUs aren't the problem. Either your ego is, or your wallet.
@@huskers1278 just wants to brag about owning a 4090
A scam was 3.5gb. Being poorly priced isn't a scam lol
@@huskers1278 It's a scam just based on the prices. The market really should be regulated when there's no competition, with a 50% tax slapped on all Nvidia stuff; they are printing money with their monopoly.
Some of the 40 series was a scam, some models weren't. Choose wisely.
we are at a point where something truly revolutionary (hardware wise) would need to happen. Traditional polygonal rasterization is essentially completely plateaued. I'd personally argue that this has been the case for almost a decade, with the GTX 1080ti being the "beginning of the end". If you play at 1080p, that card is STILL all you need. It's insane.
This is also a problem for improving graphics with PC games. If you make a game that requires a 4090 to run at 1080 60 then your market is 4090 owners only.. i.e. hardly anyone. And that lack of willingness to upgrade is only made worse by Nvidia's greed and poor specs on cards that are actually affordable. If I was going to go Nvidia I wouldn't touch anything below a 4070 ti super or any 50 series card that has less than 16GB vram. And for a lot of people that's simply not affordable.
I like your style!
Also, I'm a broke Brazilian gamedev with an 8th-gen Core i5, 32GB of RAM, and a GTX 1050, and I'm very happy that my notebook can handle a lot of modern games. I also think realistic graphics are overrated, and the "only dogs can hear it" point is something I've been making since the PS3, because games have been trading innovation for graphics since then
so glad you touched on this concept of diminishing returns cause ive been thinking the exact same thing for past 8 years
It feels like gaming is going backwards.
It is. It’s because of the juice. They give money to game studios if they promote ace mixing, crimers, trains etc.
This and last year was some of the most fun I had gaming. Idk what u people are playing but there are sooooo many amazing games coming out.
It's not going backwards.. it's going nowhere.
It's called TAA
@gorillagroddgaming also true. Based
In general the focus around games has been too much on graphics. It's the first thing everyone will complain about.
I'd much rather see some innovation in game design instead of chasing "photo realism". Or a shift of focus from graphical fidelity to improving stuff like NPC AI or physics which have absolutely stagnated or even regressed. The reality is that games don't need state of the art graphics to be fun and successful, you see it with many indies and especially Nintendo.
It's funny how game AI has barely improved at all (and is still rubbish, frankly) even as we're on a supposed revolution of self-driving cars, talking machines, even generalised AI. Much of that is hype and BS, of course, but you'd think we might already get at least some improvement in game AI before the Robots supposedly make lawyers and artists redundant. But nope.
Much easier to do it in games than realworld, too, as you have all the variables and a very limited 'world' in which to operate. Driverless cars (supposedly) and yet driving games have awful AI opponents. Same in shooters, everything. Garbage. If game devs can't do it after forty years, what makes anyone imagine Elon can do it "by next year". lol
@@CurtOntheRadio It's weird isn't it? Seems like a much easier canvas to work with than the real life applications you mentioned. The big difference though is that they have dedicated research where unimaginable sums of money go into. So they are and will be more advanced than what you see in games where it's clearly not been a large topic of focus for a while.
@@vintatsh True, it's not a fair comparison.
Buuuut, we might at least expect some improvement in game AI long before we hand over our children to be educated by the Robots, say. Or fire all the lawyers. Even if it needs a cloud subscription.
It at least suggests some further scepticism is warranted about the use and integration of AI more generally, imo.
I keep trying to think of ways to use AI in games and keep coming up against this issue: how do you swap data between traditional and AI without losing the point of the AI, or the use of the traditional compute? Like, say, you could have AI be a shopkeeper, or blacksmith, so you can better deal with them, be more inventive, maybe - say in Skyrim type.
But any result would still have to be passed back to the traditional compute running the game, and it would all need be defined as variables the trad game code 'understands'.
Yet if you limit AI to output the trad game understands have you gained anything, really? You can't broaden choices easily, you can't invent new things and all in all I think it's difficult to find applications. Though maybe sports games might be one - driving, tennis, whatever.
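One hedged sketch of how that hand-off could look (all names here are hypothetical, not a real API): the model is only ever allowed to fill in a fixed schema of actions the traditional game code already understands, and anything outside that vocabulary falls back to canned behaviour. Whether that "gains anything" is exactly the open question above, but it shows where the boundary would sit.

```python
import json

ALLOWED_ACTIONS = {"SELL_ITEM", "BUY_ITEM", "GIVE_RUMOR", "REFUSE"}

def ask_model(prompt: str) -> str:
    # Stand-in for whatever text generator you would actually call.
    return '{"action": "GIVE_RUMOR", "line": "Heard wolves near the old mill."}'

def shopkeeper_turn(player_utterance: str) -> dict:
    raw = ask_model(
        f"Player says: {player_utterance!r}. Reply as JSON with 'action' "
        f"(one of {sorted(ALLOWED_ACTIONS)}) and a short 'line' of dialogue."
    )
    try:
        reply = json.loads(raw)
        if reply.get("action") in ALLOWED_ACTIONS and isinstance(reply.get("line"), str):
            return reply                  # trad game code can act on this safely
    except (json.JSONDecodeError, AttributeError):
        pass
    return {"action": "REFUSE", "line": "Hmm, can't help you with that."}  # fallback

print(shopkeeper_turn("Know anything interesting?"))
```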
Blah blah blah
But where is it? Where is any game with a novel, 'proper' AI component?
"Several games with AI on the market today implement highly sophisticated forms of AI to elevate the player experience. Games with the best AI often elevate the gaming experience in cool ways. For example, The Last of Us: Part II uses advanced AI to power its enemies, providing them with an ‘awareness state.’ This means that if an enemy sees one of their comrades killed without actually seeing the culprit, that particular enemy will be on alert and more vigilant as they plot to take their revenge. Expect the future of AI gaming will include much smarter NPCs since NPCs have always been one core use case for AI in games."
TLOU2 is pretty good re NPCs, I thought. Def a step up from most. Not sure if they just mean 'well programmed' here though.
And there is this:
ruclips.net/video/PYtmFF02OH4/видео.html
@@CurtOntheRadio I feel like that will be a „next-gen“ type development. I fully expect PS6 and whatever the next Xbox will be, to have a highly capable neural engine to handle the processing of AI interactions and that will be their defining feature. The PS5 Pro already has a 300 tops machine learning block after all for PSSR.
That is going to be the point where even AAA developers will start to embrace AI use-cases in games, once the average player has the hardware. It seems like the next logical evolution. As to how exactly it's going to be integrated into game design, I'm also a little bit confused, especially regarding storytelling, but we'll see.
Yes Daniel, I would be happy to run my pc until the wheels came off.
Fans, maybe..
@@christophermullins7163 Until the fan blades came off *
No, I wouldn't. We need to be better. The PC is the most important thing, you can afford to upgrade it once every 8 years.
@@albert2006xp I just did a new build; my old one had a 7th-gen i3 and a 1050. My buddy still plays on that PC lol, it's fine for medium-low graphics at 1080p in most games, 60+ fps.
New games are optimized for people with $8000 PCs. Not very many people can even play the new games being released, not because they don't want to but because they don't have $3500 lying around to play the new game at not even the best quality
Daniel, this will be your most underrated video (especially from 10:01 time mark and onwards), mark my words. 😉
I think that, yes, game design needs a reset, a redesign of the business model for developing new games. Everybody in the gaming industry should back-pedal on the "eye candy" development and the higher hardware requirements it involves. What everyone needs is "better" games, not "prettier" games. And considering where the economy is right now, and where it's going (not for the better), when only a portion of the userbase can afford that elusive top-10%-performance, incredibly expensive hardware, it stops making sense to keep going in the current direction.
Yes, I would prefer if graphics development stopped for a time, because in these times we're only seeing a slightly nicer picture but at bigger and bigger costs, I'd prefer if developers focused more on physics, gameplay, AI of NPCs (i.e. making NPCs react more naturally), face expressions. But more post-processing effects, adding 5 times the polygons to make a tree more realistic, that isn't important for me. Also about Ray-tracing, if it's too expensive, I think it would be better if developers used a combination of: the tricks that they used before + an optimized Ray-tracing.
Also Daniel, Is this a quiet admission of your own culpability in this graphics treadmill? Your channel is about analyzing graphics in games with a fine toothed comb after all. Without graphics what do you have to talk about? 😅
Modern games are already starting to run like piss on my 3090 despite not looking any better than RDR2 which runs in native 4K with raytracing above 60fps without breaking a sweat.
Small indie game developer here (or at least starting to develop games). I think one of the checkpoints to getting to that point is draw distance. The best two examples I can think of are, one, the whole ghillie suit and grass-at-long-distance thing in DayZ, and the other, how a whole lot of games are set in these "valleys" or "canyons" with high walls that conveniently stop the player from looking at details in the distance (although nowadays it's more of an artistic tool for representing different ideas in a storyline). I just think that the day I can have the exact same visuals while looking through a 14x scope as when looking at my character's feet, or the day I can have an AC-130-style mission where I can see the whites of a character's eyes while zooming in, is the day that we're there (although I tend to play 2010-ish games, so maybe we're already there and I don't know it yet).
Sensible comment. You're in the wrong place. ;)
When I played DayZ in 2014-15, a friend and I were talking about the grass issue at distance and he had an interesting idea:
at longer distances, just move the ground texture up, so if you are prone you are under the ground texture.
This way you won't stick out, and realistically someone that far away shouldn't be able to see you anyway.
Finding out about the grass in DayZ still leaves a sour taste in my mouth
The software side of things is also important. As it currently stands, we can't really get a huge leap like we used to, because we already represent most objects in games as realistically as possible using Physically Based Rendering. You can't really get wowed anymore when you can't get any more real than real. It's up to techniques like path tracing, as well as realistic character rendering and animation, to give us that wow factor.
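For anyone unfamiliar with the term, here's a rough, illustrative sketch of what "Physically Based Rendering" usually boils down to: a textbook Lambert diffuse term plus a Cook-Torrance/GGX specular term, driven by albedo, roughness and metalness. Real engines run shader variants of this kind of math; this snippet is only an assumed, minimal CPU-side version of the standard formulas.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def pbr_shade(n, v, l, albedo, roughness, metallic, light_color):
    n, v, l = normalize(n), normalize(v), normalize(l)
    h = normalize(tuple(a + b for a, b in zip(v, l)))            # half vector
    ndl = max(dot(n, l), 1e-4)
    ndv = max(dot(n, v), 1e-4)
    ndh = max(dot(n, h), 0.0)
    hdv = max(dot(h, v), 0.0)

    a2 = (roughness * roughness) ** 2
    d = a2 / (math.pi * (ndh * ndh * (a2 - 1.0) + 1.0) ** 2)     # GGX normal distribution
    k = (roughness + 1.0) ** 2 / 8.0
    g = (ndl / (ndl * (1 - k) + k)) * (ndv / (ndv * (1 - k) + k))  # Smith geometry term
    f0 = [0.04 * (1 - metallic) + c * metallic for c in albedo]    # base reflectance
    f = [c + (1 - c) * (1 - hdv) ** 5 for c in f0]                 # Fresnel-Schlick

    spec = [d * g * fc / (4.0 * ndl * ndv) for fc in f]
    diff = [(1 - fc) * (1 - metallic) * c / math.pi for fc, c in zip(f, albedo)]
    return [(dc + sc) * ndl * lc for dc, sc, lc in zip(diff, spec, light_color)]

# A rough, reddish dielectric lit almost head-on by a white light.
print(pbr_shade(n=(0, 0, 1), v=(0, 0, 1), l=(0.3, 0.3, 1),
                albedo=(0.8, 0.2, 0.2), roughness=0.6, metallic=0.0,
                light_color=(1.0, 1.0, 1.0)))
```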
Which is why I think GTA 6 is going to be crazy 👀
It took you a minute, but you're finally getting the point. People don't see the value in upgrading and spending more money on minor improvements when their current hardware is technically already good enough and should be getting better performance if games were optimized better. Some of us are adults with families and other responsibilities and must make responsible purchases. We watch videos like this to see what's new and to help us make better choices.
Honest benchmarking and optimization videos do far more good than just saying we need more powerful hardware.
it’s probably because devs wanted 32GB of VRAM for the PS5/XSX generation, but only got 16GB. The jump from 360 to PS4 in VRAM was 16x, PS4 to PS5 was 2x…
For what? I haven't heard a single dev complain about the PS5's RAM size. Even equivalent PCs rarely go past 8.
High vram alone doesn't help if the gpu is not fast enough to process the saved data and ship it onto the monitor.
The clockspeed will always be the first bottleneck and the higher it is, the more energy it drains, the more durable it has to be and the better the cooling has to be, which leads to bigger and bigger parts (cooling solutions).
The PS5/XSX doesn't even have 16 gigs of VRAM, it has 16 gigs of shared memory between GPU and CPU.
@@thelazyworkersandwich4169 Yeah ofc they don'tt complain because no one tells them to code a game for console that runs in 1440p or 4k at AT least 60fps stable, WITHOUT disabling/reducing a dozen effects, like shadows, draw distance, lod, etc. until it turns into a mushy mass of indistinguishable hot garbage.
@@esmolol4091 Yeah that's my point......
Why are we encouraging stopping? Comparing ridiculous game settings to optimized settings is not fair, some of those settings are indeed extra useless but we have a lot more things that we could be doing. Path Tracing with more rays, more bounces, more polygons, etc. We have so far still to go... Graphics are NOT good enough until the actors on screen look identical to a movie.
Graphics don't need to be realistic. Art direction is far more important. GOW and The Legend of Zelda franchise prove this.
@@SauceageTF Zelda looks absolutely terrible and it not being on a real platform with real graphics is the reason I'll never play that, not even emulated. GOWR looked a little dated and could use an uplift. Very low quality characters for 2024 and you can see polygons all over the place. It felt like a last gen game for sure. Where as Forbidden West actually looked closer to current gen.
Path tracing is the most important thing. That's what makes me happy to look at a game nowadays. Alan Wake, Cyberpunk, even solid RT like Silent Hill 2, Casting of Frank Stone, and other UE5 stuff is making scenes tie together so well with lighting that I can't look at that ugly, gamey fake lighting anymore. You can see it's 3D models in a fake raster scene, it's so icky.
@@albert2006xp Graphics were never about realism. Pushing for it is a waste of time. The best artists never pushed for realism. They understand realism doesn't make a painting any better. Starry night doesn't look like any night I ever witnessed on Earth.
@@SauceageTF Realism or just sheer detail and believability. You can make something fantastical but it still needs to tie together. You can't play an abstract painting.
@@SauceageTF I doubt gamers will put up with every game looking like a Jackson Pollock. Art direction is but one pillar upon which a game is created.
"Starting to get to the era of diminishing returns?"
I just question the word "starting"
In the late 90's through the early 2000's dropping £400 on a new graphics card made a HUGE difference
Often games just would not play on 3 year old hardware. They wouldn't install, wouldn't boot; you'd get an error message telling you your GFX and CPU were not good enough, and that was that.
A CPU and GFX that was top of the range 3 years earlier could not play a game at all.
That wasn't a rare occurrence. So you'd stump up for new hardware, and immediately you could see just why your old hardware could not perform on this new game.
Now, as my daughter plays the latest games on my hand-me-down almost 9 year old 1070 - that was MID RANGE back in 2016 - I look over her shoulder as she plays and think "I wouldn't care if I had that back, those graphics look just fine to me"
Yeah it could look better... but not $800 better. Not even close. Not ever.
So what you're saying is if that old PC and my PC with a 4080 super were running cyberpunk side by side you wouldn't think that my pc running it in ultra wide 1440p at 60fps with path tracing would be worth the upgrade vs what.. minimum settings at 1080p and 60fps?
@@definitelyfunatparties Get rid of the ultra wide and I'd choose your setup
There were often times when new tech dropped that just wasn't supported on older hardware, like deferred rendering, tessellation, pixel shaders, or many other things. Those massive jumps happened because everyone was developing new tech for GPUs. Today we rarely get new tech for a GPU, with the only recent addition being ray tracing. Games that use hardware-based ray tracing will do a similar thing to older GPUs from a few years ago.
@@MainChannel1999 so you'd take the 1070 if you had to play in ultra wide exclusively with a 4080? cmon bro. anyways, that's what the extra 16:9 monitor is for. options 💁♂
@@crestofhonor2349 it's already happening, and it's even happening for current GPUs from other brands.. I ditched a 7900XTX for the 4080, yes I occasionally get a tiny bit less FPS in other games, but anything ray traced and I'm going from sub 60 to 100, except for cyberpunk, where I went from sub 40 to 60.. but that's proper RT (PT)
We have reached a plateau in game graphics. The only benefit in getting newer GPUs is for 3D rendering and video editing software for demanding companies that put out a ton of content frequently. I had to subscribe because you made some great points! 👍🏾
This is all I'll say, my priorities for games.
--- Top ---
Gameplay
Optimization and performance
Game-feel, UX etc.
--- Mid ---
Art direction - Includes:
- Writing
- Visuals
- Music and Sound
- etc
--- Bottom ---
"Realism"
Make games more optimized so that we have more people that can play them at acceptable framerates. Most of these new graphics aren't even that good, a lot of it is sacrificing art direction as well.
One thing to note is that consoles (the PS3) at one time offered a real price-to-performance balance, including a Blu-ray player, media center, and gaming device all wrapped together for a decent budget price.
The PS3 was the most expensive console Sony ever put out adjusted for inflation, and it almost killed the brand. What are you talking about?
@@jiggerypokery2962 The PS3 was clearly overpriced at launch, but it was also the cheapest Blu-ray player available at the same time... Many sales were driven by the Blu-ray drive alone.
@@jiggerypokery2962 True, but you also can't overlook that standalone Blu-ray players were almost as, or even more, expensive than the PS3.
@@CrazyDoodEpicLeaves A Sony Blu-ray player at the time of the PS3's release... $1000
The problem may be looking at consoles that way. Back in the day many people bought a PS2 or Xbox for its home media capabilities, such as DVD playback (and later the PS3 for Blu-ray).
Of course the world has changed now, but then so too should the console's marketing. What if you sold consoles like phone plans? That way you could jam more premium components in and have the overall cost be higher without it feeling any less affordable.
You could also package it with big brands for the modern age. So let's say if you buy an Xbox subscription you get the new Xbox, Game Pass, Netflix, Spotify Premium, YouTube Premium and Amazon Prime. That way your entertainment and online quality-of-life services are all covered under the one payment, much like how your DVD and game playing device was all in one unit in the 90's and early 2000's.
"I have a $3000 PC because ultra is the way the developer made the game to be experienced" justification x10
Not true, a lot of games are built first with consoles in mind and then pushed with some extras on PC.
Vegetation is one of those options where the engine is simply told to do a lot more, and that gets called ultra settings.
Reaching the hard limit on computing power would be really weird and really cool, because suddenly the entire software world would revolve around those limitations and we could theoretically see a greater amount of optimization across the board. A world of frozen computing power, ironically, could see its performance improve over time as more people get more experience squeezing as much power out of the silicon as possible, which in turn could mean that software gets faster to make and therefore cheaper.
They'll pivot to software-as-a-service/live service games, and they'll have fewer updates with vanishingly smaller performance gains between them 🤣
Never underestimate the hardware/software mafia ingeniously ripping you off again and again. They always find a way
I remember spending $200 or less to go from 8bit to 16bit to 32bit, and it was so amazing how much better everything was. Recently I spent $600 for a 4070 Super to upgrade my old $500 2070 Super and there's really no difference. So not only are we not seeing massive change, we're spending massive change!
Instead of spending $600 to upgrade your GPU, spend it on a great ultrawide HDR QD-OLED monitor, because *that* is a noticeable upgrade
This topic revolves around a few things as context.
Resolution is hitting the point of optimal clarity. 4K, depending on your screen size and viewing distance, is already beyond what human vision can resolve. 1080p is the same, especially for TVs at a typical distance. You can see a bit of a jump moving to 4K, but not that drastic. There is a difference between video and rendered graphics, and you will see a bigger jump for rendered graphics moving between 1080p and 4K.
So really, we theoretically only need the power to render photo realistic graphics at 4k to be at optimal clarity.
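To put rough numbers on that, here's a quick back-of-the-envelope check (a sketch only: the 65" screen, the 8-foot viewing distance, and the ~60 pixels-per-degree figure often quoted for 20/20 acuity are all assumptions, not measurements):

```python
import math

def pixels_per_degree(horizontal_res, screen_diagonal_in, aspect=(16, 9), distance_in=96):
    """Approximate pixels per degree of visual angle for a flat screen viewed head-on."""
    # Screen width from the diagonal and aspect ratio
    aw, ah = aspect
    width_in = screen_diagonal_in * aw / math.hypot(aw, ah)
    # Horizontal field of view subtended by the screen, in degrees
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return horizontal_res / fov_deg

# Assumed setup: 65" 16:9 TV viewed from ~8 feet (96 inches)
for label, res in [("1080p", 1920), ("4K", 3840)]:
    ppd = pixels_per_degree(res, 65, distance_in=96)
    print(f"{label}: ~{ppd:.0f} pixels per degree")
```

With those assumptions, 1080p lands right around the commonly quoted ~60 ppd acuity threshold and 4K well past it, which is why the jump looks modest for video; rendered graphics still benefit more because aliasing and shimmer are easier to notice than raw detail.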
The next improvement will be simulation fidelity. Having the power to not only render photorealism, but to accurately simulate the physical world. Water that flows and behaves like real water. Foliage that moves and reacts realistically to the objects that pass through them. Things like that...
We will eventually hit the point of optimal clarity and simulation, and at that point, increased performance from a gaming perspective will basically plateau. Only a change in how we game would shake things up.
At the current level of game engine design and capability... I believe that hitting the point where a xx70 series card has the power of the rumored 5090 specs would basically be the next leveling-off point. You would still have the xx80/90 series to handle any additional features that may come around, but the mid-level cards would have some serious grunt.
Current CPUs are not too bad, especially at the higher end. We would need better parallelization of the game engines to push those CPUs to their real limit. So in a few generations, with better parallel processing built into the game engines, we will be doing well.
Simulation will be primarily CPU-based with our current architecture... so better parallelization will be key to better performance.
All that said...
I think we will start to see gaming performance level off for a few years, until we get better game engines, that push simulation higher, and better take advantage of multicore processing. Possibly dividing out the main game thread tasks to more cores, as that is the current bottleneck for several games.
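As a purely conceptual illustration of what "dividing out the main game thread tasks to more cores" can mean, here's a minimal sketch (the subsystem names and timings are invented, Python's GIL means this shows structure rather than real speedup, and actual engines use native job systems with proper dependency graphs):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-frame subsystem updates that are independent within a frame.
# In a real engine these would be jobs scheduled on a native task graph.
def update_physics(dt):   time.sleep(0.002)  # stand-in for real work
def update_animation(dt): time.sleep(0.002)
def update_ai(dt):        time.sleep(0.002)
def update_audio(dt):     time.sleep(0.001)

SUBSYSTEMS = [update_physics, update_animation, update_ai, update_audio]

def run_frame_parallel(pool, dt):
    # Fan the independent updates out to worker threads, then wait for all of
    # them before the render/submit step, which still runs on one thread.
    futures = [pool.submit(system, dt) for system in SUBSYSTEMS]
    for f in futures:
        f.result()

with ThreadPoolExecutor(max_workers=4) as pool:
    start = time.perf_counter()
    for _ in range(60):            # simulate 60 frames
        run_frame_parallel(pool, dt=1 / 60)
    print(f"60 frames in {time.perf_counter() - start:.3f}s")
```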
I think the developers are to "blame": you can have amazing visuals without a 4090. Development gets more and more expensive, so developers should focus on smaller but better optimized games.
Thank you, and even with a BEAST like the 4090 some of these games DO NOT look that good in person. That is why in most cases I turn RTX off, because it is a SMEARY BLURRY MESS and makes the game run like crap, but it looks good in stills and YouTube videos where the compression algorithm hides the flaws. As soon as you pan sideways in Cyberpunk or Dying Light 2 you can see the reflections SLOWLY CRAWLING and SMEARING as they try to catch up, because it takes MULTIPLE frames of data to make them NOT look like a bunch of PIN HOLES
Demon's Souls Remake and Silent Hill 2 are two examples of what current gen COULD look like.
But they remain an exception.
Both were soulless and pointless remakes
Crimson Desert as well.
SH2 doesn't look that good relative to its performance imo, Demon's Souls Remake definitely looks really good though
Silent Hill 2 doesn't look good but Demon's Souls does
HOWEVER, Demon's Souls is a PS3 game in design.
Make Elden Ring look like that and how would that run? It's a great looking game, but since its world is two generations old, they could ramp up the graphics. Is that a bad thing? No, but context is key
Someone analyzed SH2 at a deep engine technical level and found they waste a huge amount of resources for almost no impact on image quality. Unlike the previous games, they don't even use the fog as a curtain to hide more aggressive LOD (level of detail) and gain performance; hell, they don't use custom LODs at all, they use Nanite alone, which is way slower than traditional hand-made LODs and especially slow for vegetation.
But hand-made level of detail takes time to make, and "time is money", so they just decided not to do it.
This guy also made a few tweaks to the game's graphics using UE5 console commands and was able to bring performance way up without changing the graphics that much, more or less similar to what Daniel did.
Why didn't the original developers do those optimizations? The answer could be lack of time or, even worse, they didn't care or didn't know better.
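For anyone wondering what "UE5 console commands" look like in practice, these are the kinds of cvars people commonly drop into Engine.ini under [SystemSettings] to claw back performance. The values are illustrative guesses, not the specific tweaks from that analysis, and cvar names and behaviour can vary between engine versions:

```ini
[SystemSettings]
; Render at 75% of output resolution (internal upscale)
r.ScreenPercentage=75
; Pull in overall draw distance slightly
r.ViewDistanceScale=0.8
; Switch foliage to lower detail levels sooner
foliage.LODDistanceScale=0.75
; Drop the shadow scalability group one notch
sg.ShadowQuality=2
```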
I have a 7950X3D + XTX combo at 1440p and... I use a frame cap in most games. I had an RX 580 at 1080p and I was perfectly happy with it, but working from home during the pandemic made me unsatisfied with reading text (I'm a programmer) at 1080p, so I bought three 1440p 165Hz monitors and a 6700XT. The only reason I upgraded from that 5950X + 6700XT combo is because I could.
I'm perfectly happy with the level of graphics I'm seeing now. I generally target 90FPS in single player titles. I have a lot of games, that aren't even that old, maxed out, many with community-made HD texture packs, because that's what holds a lot of games back.
So yes, you're perfectly right. Until I can get full path tracing without reflections being rendered at a noticeably lower resolution, or the denoising is so abysmal that all reflections shimmer, then I'm not going to bother upgrading. 1440p full path tracing, with sufficient rays and bounces, with no upscaling at 90FPS or bust.
I have pretty much the same gear, play a lot of older games at 4K, newer ones like FF16 at 1080p. No need to upgrade if you get 1080p 60. I got the 4K screen for work. It's just sad when developers ignore the massive number of gamers still rocking a 1080 or a 2070.
@@JGComments They aren't ignoring them. Some people just take it as a personal attack if they can't run everything at the highest settings at the highest resolution. 1080, 2060, RX 5700, RX 6600 or 2070 class hardware will still run mostly everything at 1080p High and in the very latest titles 1080p Medium. And it still looks fantastic.
@@andersjjensen that's pretty much true 95% of the time. If you can run a game 1080 60 fps on Medium you should be happy. It's insanely expensive to try to keep up if you expect 4k 120 with all the bells and whistles.
Well, I'm currently on an i7-920 w/1050 ti so... I plan to get a modern gaming rig before Win 10 support stops next October. After I do that, I'll be happy if I don't need to upgrade for another dozen years.
I feel like the same thing is kind of happening with display technology. The difference between Standard definition, and 1080p was huge, but from 1080p to 4K was less dramatic. Sure, you can drop a huge chunk of change on 8K, but is it really worth it?
If developers stopped the practice of developing games that perform best on hardware that isn't even on the market yet, that would be a good thing. Personally, I lost my enthusiasm for gaming a while back, when it felt like developers (especially AAA studios) decided that good looking games mattered more than good or innovative gameplay. Probably only those who grew up in the era when gaming started going mainstream in the 70's and 80's will get where I am coming from.
Developers don't make any of the decisions you attributed to them in your comment.
The R&D or production teams: 1 artist, 3 devs, 1 analyst, 1 tester, 1 dev-leader. Those teams make up less than 60% of the company staff.
Whatever you think devs do, they don't make decisions. Especially the strategic ones.
When was this "a while back" ? 30 years ago ? Because that's what AAA studios/developers did since forever. Literally. And the current situation is much better than it was in the '90s and early 2000s, where games would literally not run on 3 year old hardware. Nowadays almost all games can still run on the midrange GTX 1060 from 8 years ago. This will be even more true when RTX 2060 reaches 8 years old.
@@Winnetou17 Have not bought a game on release since Warcraft III.
@@p4r4g0n I'm almost with you on that one, but I did have some exceptions (DOOM 2016 and Kingdom Come: Deliverance, from what I remember). And in the Warcraft 3 days we were pirating it (it was common in my area, pretty poor country).
It's funny that with Warcraft 3, I had it installed for several months before I could play it, because I didn't have a GPU initially.
I'd love to be able to run games at native 4K at least. Even 1080p is hard to run on top-tier hardware. I'm waiting for more performance.
We were headed that way until all this upscaling garbage got pushed onto us. 1080p 60fps was standard a decade ago, yet now new games require mid tier cards to even hit 1080p (native) 30fps. Sad times.
>1080p is hard to run on top-tier hardware.
This is giga exaggeration tbh. It’s not that bad.
As someone who has been playing at 4K native since 2014, it is AMAZING and I can NEVER GO BACK. The sad part is that with the poor performance in most modern titles even my RTX 4090 can struggle to get 4K 120 unless I turn down some settings or use DLSS, and I HATE SCALERS, many of them smear and blur, ruining the crispness of 4K. The saddest part is that with some older but still REALLY GOOD looking titles I can run triple 4K Surround at max settings and get 80-90fps, and yet on some modern games you can struggle to get 60 FPS on ONE SCREEN, with graphics and gameplay that are no better than what came out 6-8 years ago
how is 1080p hard to run?
@@haewymetal go look at Monster Hunter: Wilds PC recommended specs. An rtx 4060 for 1080p 60fps with frame generation (so like 40fps)
I've been saying this for a while now, I think this is good. Chipmakers can keep developing insane hardware at exorbitant prices for server farms and we can just be happy with what we have. It's a win-win.
Disgusting attitude. We should not be happy with anything until graphics look much better than real life. This comment section and this ragebait video is gross and anti-gamer. Bunch of casual gamers who just play in the evening with a controller in their hand trying to ruin it for real gamers.
@@albert2006xp It's so strange to see this anti-progression attitude among PC gamers in the last decade or so.
I can't imagine anyone saying 20 years ago that they'd rather ATI didn't release the 9700 Pro because now their 7500 is going to be obsoleted, or 15 years ago people waxing poetic about how 1 CPU core is way more than enough and why would you ever want a Core 2 Duo?
Hardware was obsoleted so much faster back then, funnily enough.
@@darudesandstorm7002 20 years ago, there was real and very tangible graphics improvement to be had from new hardware. Now, even the most demanding games still typically run fine on a mid range card from 2 generations ago, and there's often barely any noticeable visual difference between medium and ultra settings.
My personal opinion is that I will only buy a product if there's actual improvement to be had by owning it. If the only improvement for owning a card that's five times more expensive than my current one is so minute that I couldn't see it without stopping the game to pixel peep, I don't think it's worth it.
@@darudesandstorm7002 Buying a new PC every new release would make a lot of PC gamers abandon the hobby because of unaffordability. There is hardly any longevity in lower-end GPUs now; I cannot imagine how it was back then.
The PC market has a much larger inventory to choose from for upgrades, which is why incremental changes aren't met with as much scrutiny. They can pick and choose which part needs an upgrade. Consoles only path to upgrade is a pro model 4 years later or the next generation 6-8 years later, which is why they garner a louder outcry.
Just a huge video, thanks man!
I have dumped thoughts across multiple comments into this one comment in case it gets highlighted:
It's not broken. It's maturing. PC hardware, mobile hardware, it's nearing feature completion for the customer base, and the vendors are struggling to find meaningful innovation out of these parts. Unfortunately, rather than the commoditization the tech is supposed to go through, the vendors are trying the opposite tactic and pushing the updates to the high end. The same thing is happening with GPUs and phones. As a customer base we have to push back on the planned obsolescence and hold onto our hardware, and devs need to stop focusing on pushing the hardware.
To answer your question, yes and no, we are generally happy with the level of graphics in GPUs now. But we would get bored after a while. I personally think we can continue with incremental graphical improvement but slow it down a lot. We dont need the change to be this fast anymore and we can't afford the GPUs anyways. If the cost of graphics is killing games you've gone too far as a developer.
There are other types of innovation besides performance. By reducing power consumption, you can get innovations in size and form factor. That's how we're getting ITX PCs and PC handhelds like the Steam Deck. That's how we're getting integrated graphics that can run games. Imagine every single PC made being able to run PS3-quality games without needing a discrete GPU. Considering usability concerns for the customer like that is how we get the 'reset' in the gaming hardware business model.
Also Daniel, is this a quiet admission of your own culpability in this graphics treadmill? Your channel is about analyzing graphics in games with a fine-toothed comb, after all. Without graphics what do you have to talk about? 😅
well said!
He talks about performance, not just graphics. They're part of analyzing performance
The next jump is gonna be AI NPCs, a truly living world unique to each person. I don't know what kind of computing power that would require, but I don't see graphical improvements as a thing right now until you get into fully realized actual VR gaming.
You don't need much power. What you need is tons of VRAM. 16GB minimum to have the textures and the AI stuff running. 20GB would be the sweet spot for hi-res textures and great AI performance with fast responses
They will ruin that as well with woke AI training. They should just make more interesting sidequests, hire better writers and allow them to dare to try new things that are normal in movies and literature.
@@currywurst2434 Boy do I have news for you about people who write literature.
There is no way I'm going to sit with those things on my head. Not going to happen.
'AI NPCs', my guy, we already have AI NPCs. AI is a marketing term slapped on things we've had for decades.
Between the PS4 Pro and PS5 there are no big graphical differences.
They're literally the same... I have a PS4 Pro and a friend has the PS5. Hooking them up to a computer and running some programs, the PS4 Pro does about 95% of what the PS5 does. The PS5 basically just has RT and can run at 30-45fps rather than locked at 30. They're the same
There are huge differences not being emphasized due to so much cross gen support
@@GrySgtBubba The differences might not mean much to you but to say they're "literally the same" is objectively wrong mate.
PS4 Pro crawls running Cyberpunk not even holding 30 fps at 1080p with some dynamic resolution and lowest settings possible.
Meanwhile PS5 runs the game at native 1440p with high settings and ray traced local shadows (which are cheap) then upscaled to 4K at locked 30fps.
But sure, PS5 is barely more powerful.
@@phattjohnson Bruh, they both run in predominantly 1080p-ish at 30-40-ish fps, and some titles can be 1440p-ish upscaled to around 1660p-ish...and run 30-45fps... The only difference is PS5 has rt in some games, and has a few new titles. That's it! Lol
As a PCVR gamer, the fact that we literally need to render the scene twice (once for each eye), with each eye pushing upwards of 3000p on high-resolution headsets, is the ONLY reason I have shelled out each cycle for a 90-series Nvidia card.
If I wasn't into VR I'd go back to my old ways of skipping a generation of card and only going as high as a 60 or 70 series.
I did breathe a sigh of relief this year though, because the performance I'm getting out of my 4090 is so smooth in VR that I might actually be able to skip the 5090.
I agree with your assessment that we are at the tail end of Moore's Law, and the fact that video cards can cost upwards of 2K is insane...
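To put that dual-eye rendering load in rough numbers (a sketch: the ~3000x3000 per-eye figure is an assumed round number for a high-res headset, and it ignores the extra supersampling VR runtimes typically apply):

```python
# Rough pixels-per-second comparison: flat 1440p monitor vs a high-res PCVR headset.
flat = 2560 * 1440 * 90        # 1440p at 90 fps
vr_eye = 3000 * 3000           # assumed ~3000x3000 per eye (illustrative)
vr = vr_eye * 2 * 90           # two eyes, 90 Hz refresh

print(f"1440p @ 90 fps : {flat / 1e6:,.0f} Mpix/s")
print(f"VR @ 90 Hz     : {vr / 1e6:,.0f} Mpix/s ({vr / flat:.1f}x the flat-screen load)")
```

Under those assumptions the headset pushes roughly five times the pixels of a 1440p monitor, which is why VR keeps justifying top-tier cards even as flat-screen gains feel marginal.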
This seems great. It means less worrying about graphics, and more focus on gaming!
The best part about diminishing returns in visuals is that hopefully gameplay, physics, animations, and destruction start ramping up
I'd be happy if it stagnated. It also might give a chance to optimize the software side.
Not only has there been less and less noticeable difference with each generation of hardware, but most of the hardware isn't being fully utilized and/or is badly optimized for on the software side. A lot of hardware that's still in use also doesn't get firmware updates anymore, because the dev team for it gets moved to developing for the newer generation. When that happens the companies also don't open source their stuff so other people can work on it.
I don't know where I've read this as I read a ton, but I remember there being news of a 'bug' or an 'optimization' that was discovered or done on what we would consider ancient hardware (I think it was for hardware that comes from the 90's). This happens more than we can imagine, especially since the hardware changes so rapidly.
I for one think that hitting a plateau in performance gains for a while would be a boon to gaming, in a way. With developers having more limits again, they'll hopefully be pushed into becoming more creative with what they have on hand. They're hitting the soft cap on computing power; now they can refocus on gameplay performance itself, the art and style of making a game, and less on graphical fidelity... Hopefully 🤞
We need more gameplay and more player modes: split-screen couch co-op, save states, customized story alt-routes, various graphics settings, easier and officially authorized mod support, fast SSDs, longevity of hardware, ease of maintenance and upkeep, and electrical efficiency.
I think the next step really is games having a 2D/3D TV mode, a portable mode for handheld devices, and a VR mode for those who own Meta Quest, PSVR2, or Apple VR type devices - meaning games are designed with that first-person immersive perspective in mind from the start so it can be delivered safely.
I 100% believe you're right that the path we need to take is better gameplay etc. Graphics have been the focus for 10-ish years and actual gameplay has overall stagnated; that is something that can be improved with current hardware