@@BasePuma4007 he's got a point, Indy Jones looks 10 years old at the very least. animations and visuals are not on todays standard, let alone on UE5's standard. that engine works great for Doom, but it absolutely doesn't work for realistic portrayals like Indy Jones. RTX 4090 and 7800X3D here, no matter how good your system is, if a game looks dated, it looks dated.
Thank you John for shouting out the HDR DLSS issue. I was pulling my hair out trying to figure out why DLSS looked so borked for me and never thought to try turning off HDR. Thx brotha
@@muneebhamza8917 @killik69 It is not fixed. HDR+PT produce shimmer and also frame gen works only if you restart the game and never go to settings again. We need another patch.
@@zybch Back then 8 year old hardware would not run quake 3. You would have had to upgrade constanly. Computers were expensive. Everybody was exited. Not alot of complaining also. Fun times
Target hardware is the Xbox series X. If after 4 years you still dont have a pc as capable as that console. Why would you expect to play AAA games at higher settings than the xbox??
@@ElZamo92 the 1080ti is 4 generations old! You can’t expect to be able to play modern game with a card that came out in 2017. You want a 2024 game to conform to a 7 year old card?
@@MaxIronsThird 1080p performance fsr on low should easily stay over 30 fps and potential up to 60. It's a pretty solid performer. Edit: I thought it had 16gpu cores but turns out it is 12. We are in igpu territory so I think 1080p perf low 30fps is likely the average and not the minimum.
@@christophermullins7163FSR performance mode at 1080p 30FPS would look like an Xbox 360 game. People in that camp would be better forking over the money for a Series X if they really wanted this.
im old enough to remember everyone bitching about shader requirements on old geforces fx cards and games requireing it like halo pc and stuff. nothing changes. people will always complain the old cards have had an insane life span its fair enough to complain about prices and gfx cards being price gouged. but complaining of these new tech and becoming standard and old cards aging out is silly.
It's not about having to upgrade, it's the fact that the game doesn't give you the option to turn RT off which is honestly "kinda cringe" and anti-consumer. All it does is artificially make it so non RT cards can't run it, even when they would have been more than capable of doing so. AMD for example have a bunch of recent cards such as the RX 7800 that are pretty beefy, but don't support RT features, this is a card that came out a year ago, and it's unable to play this game and potentially a bunch of others if this disgusting trend catches on.
And it doesn’t help that graphics technology has been hitting the point of diminishing returns like a freight train for the last ten years. It makes the lines blurry between one side of the argument and the other
Agreed. We can scream and kick all we want about prices, gouging, exclusive, blah blah blah. But there is just no logical reason to hold back an entire industry to support the bottom 20% of cards left.
@@lau6438bottom 20% cards left? Seriously? The majority of people are using 4060Ti with 8GB Vram, let alone it having a miniscule memory bus of 128bit lol. Industry is still not ready for the mandatory RT requirement in games, like what MachineGames "proudly" did.
I wonder if in 5-10 years when games will use RT as a basic lighting system across the board people will still complain it doesn't run on max settings on their 1050s.
@ dude a 3060 or a 4060 which is arguably the most popular card runs the game just fine on low textures which don’t look bad. You also have a lot of budget options in other brands with more vram.
@@Dawdo-wf1xs on low... So 2007 graphics.... Great
3 дня назад+12
It’s just the way things are going, even integrated GPUs will end up with dedicated RT cores in not too long I’m sure, and consoles have them already And the two versions of lighting is too much really yeah. I’m currently developing with a two year or so release target and only using RT.
If workflows can get streamlined on the order of 50% it's a no brainer, and it just looks so much better.
2 дня назад
@@MaxIronsThird Ah that's ace, I didn't know that!
2 дня назад
@@BasePuma4007 The bonus is that in Unreal, using the path-tracer to bake lighmaps that work in conjunction with Lumen is on the roadmap, so the future includes being able to combine ray and path tracing with GPU-baked lightmaps, which is essentially ideal.
@ Unreal Engine is moving away from light maps I believe. For shadows / direct illumination they will use "MegaLights" in the future, which supports an arbitrary number of lights, probably based on ReStir. By default it uses RT shadows internally rather than virtual shadow maps.
@1:40 are you sure? I think the nvida 6800 supported dx9c, it was ATI that was SOL, if you bought a $700 X800 XT PE, you couldn't play shader model 3 games like bioshock, which wasn't even full UE3 but still required it.
It just shows how awesome the 10 series was. The GOAT nvidia series. A gpu from 2000 would melt with a game made in 2007. But a gpu that's from 2016 can still run a game from 2024 somewhat well.
Except this one, or any game that uses mesh shaders like Alan Wake 2. Anyone serious about PC gaming still using an 8 year old GPU might as well get a console at this point, as current gen consoles are better than the 10 series. And current gen consoles have k&m support.
Terrible comparison. The leap from 2000 to 2007 was far bigger than any leap afterwards. The current technology doesn't improve much compared to old days
@@acurisurThe hell are you talking about? A gtx 1650 card with 4 gb of vram runs Alan Wake 2 just fine in 1080 p with medium to low settings. Yeah, you get the mesh shader bullshit warning but you ignore it and the game boots up.
Mores law has somewhat reached the point of diminishing returns compared to the early 2000s. Powerful GPUs are much harder and more expensive to manufacture.
To me, this game really exposes why the GPU market sucks so hard right now. Basically, games like this (and to a lesser extent Alan Wake 2) are really showing why it's (still) important to buy a GPU with the most up-to-date feature set if you want to have something that will really last a good long time into the future. The fact that a 2060 Super can still run titles using the most modern rendering tech, while the RX 5700 XT (which was its direct price competitor) simply cannot, is a perfect example of this. The 5700 XT was a bit faster at the time it released, but because of the feature set, the 2060 Super was the more "future-proof" card in the long run. But at the same time, games like this also expose how important it is to get a card with a lot of VRAM overhead if you want it to last. And now we're in a situation where you can't get both an acceptable amount of VRAM overhead AND the most modern feature set without paying over $700. AMD gives you enough VRAM but its core performance in RT titles is just not good enough, while Nvidia is just lol with their absurd VRAM stinginess. As somebody who is still rocking a 2070 Super, the minimum upgrade that makes any sense to me would be a 4070 ti Super, because it's the cheapest card with both enough VRAM and a future-looking feature set/RT performance. If you'd asked me back when I bought my 2070S (at its launch) I would have never guessed that in 5 1/2 years time I would need to pay nearly $800 to get something that was a worthwhile upgrade. I'm really, really hoping that AMD closes the gap with Nvidia in RT performance with the 8000 series. Like, I know that they're not shooting for the high end with this series, but that's fine, I want a midrange card anyway. I just want the $500 AMD card to give me RT performance that's competitive with the $500 Nvidia card.
While I agree that AMD's RT performance leaves a lot to be desired, I also think most games just slap the feature over the top of a traditional raster pipeline without much consideration for performance. Games like Indiana Jones. Metro Exodus (the RT re-release) and Avatar, especially the latter- which isn't brought up in these discussions enough -prove without a shadow of a doubt that RT can be a great help to the development of a game, look spectacular and make a meaningful difference in the quality of visuals, and RUN GREAT with fantastic optimisation even on AMD's RDNA2 GPUs like the RX 6800 XT. Even RE Village for example uses extremely performant RTGI which makes a meaningful difference to visual quality, whilst performing very well for all GPUs. It just seems that for RT features, the focus is all in the wrong place- it's focused too much on impressing people, rather than what it's supposed to be, a better solution than traditional global illumination, shadows, etc that can produce better results much easier, and streamlining the development process. When games are built with it as a core part of the development like Avatar, Indiana Jones and Metro Exodus, they just happen to run extremely well? That's not a coincidence. All the games so far that were developed like that run well because they aren't just focused on showing off. On the other hand, we have games that are just RT "showcases", which perform awful even on a 4090 and require DLSS framegeneration to even be acceptable on the most expensive hardware available.
Kinda makes me regret buying a 7800 XT. Because, despite the fact that this game is very playable on that GPU, most games that require ray tracing going forward... simply won't be. And I'm not made of money.
unfortunately.... this is why people keep buying nvidia.... hopefully the new 8800xt would be a lot better in other areas other than just raster performance...
@@killerhurtalot Another reason is DLDSR. It hardly takes up resources because of the tensor cores. It's fantastic. You can even combine it with DLSS and oftentimes looks better then Native.
3070, 3080 despite having superior RT cores, are all slower than even a770 in this game due to lack of vram, in a perfect world 4070 should've had 16gb then that would make our choice easier, but it's not, and there's that slight chance it will end up lacking of vram in the future.
It literally performs like 4fps slower in this game with the regular ray tracing vs the rtx 4070. Path tracing isn't available on AMD cards. God you people are weird.
Nobody who bought a 2060 thought they were future proofing. Indeed I knew my 3080 TUF was a one generation card due to the picked up pace of RT. Although the 4090 is fantastic I find myself looking forward to a 5090.
The conversation around this and RT in general is why I don't understand why RT was marketed towards consumers the way it was. RT is a development tool first and its a massive plus it makes games look better (overall if it's all used correctly). The work flow is so much easier with rt and it's amazing that it can run real time these days
Exactly. RT cuts down the workload of developers by half. Everyone whould be behind this tech if it wasn't for the prohobitively expensive hardware costs that come along with it. Also the clear planned obsolesence of the RT cards being starved of VRAM when the main requirement for RT heavy games is... VRAM.
@@backgammonbacon consumers always have to buy new cards. Being a PC gamer is knowing you have to upgrade your rig every 3 to 4 years. The first raytracing hardware is 6 years old now. We are well past the reasonable upgrade period for people to either come aboard or get left behind.
@@annonannnonymus Consoles needed a new gimmick, so Nvidia (the company that has never manufactured a single RT console) started pushing for RT? Do people think before commenting?
The two games to my knowledge that have required RT capable systems (Indiana Jones and Metro Exodus Enhanced) are both unsurprisingly the two games where the ray traced lighting really makes a big difference in lighting accuracy and immersion, while still running good. Ray tracing is great and is not a gimmick. There's so many games where it is half heartedly implemented with RT shadows or RT reflections paired with traditional lighting techniques. The end result is an image that is slightly more accurate (but barely noticeable) with a 20-30% reduction in performance... With Indiana Jones, the graphics arent that good, not terrible, but not all that great, but the RTGI totally transforms the presentation and makes it so much more immersive and beautiful.
It wasn’t until the Egypt level that I realized just what a difference the RT lighting was making to the game, using your lighter to explore the pitch-black tombs is incredible
@@ScubaSnacks Yeah that is where it really pays off. That section in the blue tent at the beginning of the Egypt level really showed off how much better RTGI is with lighting. It just makes everything look so much more realistic. The bounce lighting and scattering looks so much better than traditional shadow maps and ambient occlusion.
Funny how we have the pc master race gloat about how pc graphics are so good and consoles hold them back yet they complain when games want to be true next gen and move the bar forward and their card doesn’t work
Turing came out 6 years ago, games have not required its feature set. You had 6 whole years to keep your aged hardware alive. If you don't have an RT capable card by now you have made an error. But that said, you still have time. This is just one game.
funny how i'm playing Indiana Jones on RTX2060-6gb even at 1440p/67%res without crashing, the fix is on my channel, with LS framegen im playing at 1440p+DLSSQ and getting 80fps, used "CRU" (custon monitor res) now im playing at smooth 80hz 1440p, and the game plays and looks good lol.
Well.... I'm on a Titan X Pascal and a i7 5930K. it's lasted me all this time.... It's finally time to bite the bullet and get a 9800x3D and a RTX 5090!
Dude I dunno where you live but where I live you can buy a used RTX 2060 super for £150 upfront and there'd be no reason to need a CPU upgrade with that. Of course if you do go with the top of the line stuff all over again it'll probably last you just as long if not longer so it's hardly a bad idea if that's what you were going for (I really can't tell what the time of your comment was and I'm not trying to be rude or combative I promise).
@oliverdowning1543 Yeah, exactly that. I like to get the best of the best hardware at the time and have it last and future proof myself. And who knows if I will get the chance to do this again in my life. I might as well do it now whilst I have the money... Sad to see my old rig go but 10 years... Can't grumble at that really can you...
@@lucasnookeryeah, totally not a bad approach. I've yet to have been in a situation where I can really do that myself but definitely when I can it seems the way to go.
Back around the days of releases like Splinter Cell Pandora Tomorrow or Splinter Cell Chaos Theory, shader models were constantly being iterated on and hardware could only support the latest available shader models, so it wasn't uncommon to launch something like Pandora Tomorrow on a GPU that was outdated by several years and the game being unable to run at all with the error message "your GPU doesn't support shader model 1/2/3, so you can't run this game, period."
Considering what I saw on PS5 pro in F1 2024 and Fortnie do You think guys some "basic path tracing" its feasible on PRO like 30 fps and upscaled res to 1440p? 😅 Please answer only people with programming knowledge not like Pc master race just saying console is trash I know its not a super powerfull😅 Edit: Some guys test medium Path tracing on Rtx 2080 ti at 1440p (upscaled) and they got like 30-45 fps so on PS5 pro something similar seems possible casue 2080 ti is even slower than PS5 🤔 😆?
I doubt it. Sometimes even regular Ray Tracing is difficult for them to add while keeping a decent fps/res. F1 is just not a very demanding game. It's a last gen game essentially with added RT... It's an easy game to run on PC But it's not full Ray Tracing because they're missing RT for Shadows. Sometimes Pro has to use lower than the lowest PC settings of RT. It's possible but it'll only be on highly optimized/less graphically impressive games like the 2 you mentioned.
Regular ultra settings runs around 3 - 3.5 times faster than max full RT... I am referring to a video where 4090 at native 4k is 100fps Ultra vs 30fps path tracing. PS5 pro certainly has a worse raster to RT ratio than 40 series so I would go so far as saying that PS5 pro regular quality settings would run somewhere between 5 - 8 times as fast as the max path tracing mode. You do the math lol what resolution would you need to run if you need 8x performance compared to 1800p? Likely below 720p for 60 fps
No matter what happens now, rasterisation-only cards probably have at most a couple more years of support from new games, anyway. It’s hard to imagine that Sony won’t want to go all in on ray tracing with the PlayStation 6, and that will probably be the final straw for new games also dragging along old PC GPUs with weak or non-existent hardware RT. And naturally studios will start working and planning with that future in mind well before the PS6’s launch date.
@@leocomerford 💯 what I been telling everyone I mean you can see this coming from a mile away but yet these people are still going to be caught with their pants down
Id Tech is amazing as always. Doom: The Dark Ages will be using the latest Id Tech and it will be even better than Motor, the new Id Tech derivative engine.
Thank you for the correct messaging here. Today's gamers don't seem to be aware that these requirements are generous: you need at least a 6 year old NVIDIA card or 4 year old AMD card. Last gen APUs can run this game, as can the ASUS ROG Strix mobile platform, as well as the Steam Deck. Games need to evolve, and this always happens, like when hardware rendering became a requirement in the 90s, or when pixel shaders became a requirement in the early 2000s, and those were a lot less forgiving in their minimum requirements than this game! You weren't going to play Return to Castle Wolfenstein on 6 year old hardware.
@@GreyDeathVaccine This is a false blanket statement. The flagships are more expensive, but we have $250-330 mid range cards today. Throughout the 2000s, we also had $250-400 mid range cards, and if you account for inflation, that's a lot more in today's money.
When graphical fidelity was rapidly advancing and games were looking exponentially better every year, it made sense that the latest games needed better hardware. These days all we're getting is side-grade games that look the same as their competitors but require better hardware to finish the developers' jobs. It's not the same.
To put it into future perspective, as hardware becomes more and more developed in the RT department, eventually ray tracing and path tracing will become more optimized and be far easier to run - it will be to future hardware as performant as rasterized solutions are to current hardware now. This is a necessary step in that direction, even if it means a lower frame rate threshold on current generation hardware, progress doesn't happen without some initial growing pains. And if people hold onto legacy equipment for too many generations, they will eventually be left behind by technology evolution.
You folks are insane. You're basically the definition of a mindless consumer. You're supporting a ridiculous practice. You literally WANT people locked out of playing games and to bring back the marketing practice of making games be so tech heavy you (even unnecessarily so) you need to upgrade just to play then. It's insanity, and you're EXACTLY the kinds of consumer a company like NVIDIA wants. You will literally go on the offensive and rationalise BS company practices and you genuinely think it makes rational sense. You need to take a step back. NVIDIA has been force feeding RT down throats for a loong time now and NO it is not needed and doesn't HAVE to be used. It is a nice feature IF wanted. You people are the natural enemy of those that fight to keep these companies in check and not purposefully gatekeeping games. You think it's a coincidence that certain games would require RT. Don't be so naive you're playing into their hands
We need shadow and LOD distance to have higher settings available, and I don't care if it's a mod. Yes you can do the lodscale, but in the settings menu would be nice. I wish it actually had a larger pool size available for 16-24GB VRam cards, but I understand if it has no visual effect. "Need" is probably not accurate 😂. It absolutely feels like Indiana Jones, and I love it.
Except Indy devs were lazy enough not to add any fall back for RX 5000 or GTX users while Alan Wake 2 had support for them and later added compatibilty for GTX 10 series as well. Indiana Jones' optimization is very overrated
@@DragonOfTheMortalKombat Dude it's not dev laziness, it's the fact that in game development you have priorities and supporting 8+ year old hardware for a tiny portion of people is not very high on the list of those priorities.
@@DragonOfTheMortalKombat creating games with ray tracing in mind allows you to achieve artistic solutions that would otherwise be unfeasible or very difficult. Most people who complain like this don't understand anything about hardware, game development, or coding in general. the GTX 1000 series is now too old and lacks features and raw performance, I can understand what you say about the rx 5000 since theoretically a 5700xt (for example) still has decent performance but in this case it's the amd hardware that is always a bit behind and on certain occasions you pay for it. In the past even 3 years for a gpu were a lot so I wouldn't complain much if after 8 years a gtx 1060 can't play this game (just a single game lol).
@@monkfishy6348 I know you dont know shit about game dev if you think a game fully made from scratch with RT in mind is harder to make that making the game with basic prob light system and then adding RT........ and btw Alan Wake 2 devs didnt added any fallbacks, people found that they could run it but it will run like shit, so devs just added a switch so people could run it even if it was never meant to be play like that
@@DragonOfTheMortalKombat Alan Wake 2 still runs like absolute crap on 10 series GPUs. Just because they made those cards compatible doesn't mean you should play it.
The only problem is the bare minimum RTX 2060 Super is $250, which you could get an Xbox Series S for cheaper than that. That's slightly different than finding a cheapo GPU and barely squeaking by, now you're EXPECTED to pony out hundreds just for the lowest settings possible. Maybe the AMD and Intel options are cheaper, but that still seems over the top given there's the 2050/2060 that use raytracing and are similarly excluded.
@ Exactly, it seems excessive for just the GPU to cost that much to hit minimum. Before, you could get a PC of comparable price and get decent results. The Cryptomining jacked up the price and we see the results here.
@@veilmontTV If they're not covering something that brings out the console waring children, this type of articulate nerd is par for the course around here. ❤
Dlss, high preset, full RT frame gen off, at 2560x1600, with fps limit of 75 puts my gpu utilization at 45% on my ad103 laptop. Really the main issue is that I’m using over 13gb vram. Initially I was experiencing BSODs whenever I was about to enter or was in long cutscenes. My short term solution was to lower the resolution all the way beforehand and I was able to get through. I tried reinstalling my drivers and I think that might have fixed it, fingers crossed. The fact this game uses so much vram is really an issue for me. My gpu has so much more left to give but it would need even more vram to do it. That’s a problem. High looks okay but there are so many other games I’ve played recently that use a fraction of the vram, are fully ray traced, and look way better (cyberpunk) I think there needs to be some more work done to optimize the vram usage in this game. On a side note, this gam taxes the cpu like crazy.
Alienware m16R2 4070 (8gb) 32gb RAM Core Ultra 9 I can barely put anything to medium without getting the vram warning in the settings, even with DLSS on performance. Running 1080p. Would be nice if they provided the vram usage numbers in the settings as you toggle stuff on/off and up/down. I’m getting a good framerate over 120 mostly but I’d like the game to look a little better. Guess I’m held back by the 8gb vram on my GPU.
I usually agree with all the DF viewpoints, but this one leaves me more against for once. Yes direct X 9.0c came out and 2.5 years later was required. In that time GPU's progressed from GeForce 6800 Ultra to a GTX 8800 Shaders = 16 increased to 128 TMUs = 16 increased to 32 ROPs = 16 increased to 24 Bandwidth = 35.2GB/s to 86.4GB/s Core clock also increased from 425Mhz to 576Mhz, so in total: 10.8x Shader improvement 2.7x Texture improvement 2.0x Pixel improvement 2.4x Bandwidth improvement = 4.47x average improvement over 2.5 years = 1.79x increase per year 6800 Ultra = $489 8800 GTX = $599 Price increase was 1.22x for a 4.47x improvement RTX came out 2018 and in 2024 we have our 1st games, so 6 years. RTX 2080 to RTX 4080: Shaders = 2944 increased to 9728 TMUs = 184 increased to 304 ROPS = 64 increased to 112 Bandwidth = 448GB/s to 716.8GB/s Core clock also increased from 1710Mhz to 2505Mhz, so in total: 4.8x Shader Improvement 2.4x Texture improvement 2.5x Pixel improvement 1.6x Bandwidth Improvement = 2.83x average improvement over 6 years = 0.47x increase per year RTX 2080 = $699 RTX 4080 = $1,199 Price increase was 1.71x for a 2.83x improvement So the time period is longer, but the technological improvement over even twice the time period is much less. The killer is the weakest a PC can use is a RTX 2060 / RX 6600, and yet it runs just fine on a Series S. We are talking about a 4Tflop AMD GPU with inferior Ray tracing abilities, but the PC needs a nearly 9Tflop GPU from the same architecture. That makes no sense. Then consider other lighting options wouldn't of taken much effort and the result is laziness and a artificially forced GPU upgrade. Is the upgrade expensive? No not really. Is it REALLY a requirement? Definitely not. A modern say Splinter Cell (Direct X 9.0C or Direct X 8. no 9.0, 9.0a, or 9.0b support)
@@annonannnonymus I don't believe there really is. The Series S has 1536 RDNA2 shaders but cut down to 1280 shaders. Closest is a downclocked RX 6500XT. That has 1024 shaders (So less) but is clocked at 2.60Ghz to the Series S 1.56Ghz. RX6400 is a little weaker. RX 6500XT is faster. Either way that's around a RX 480 or GTX 970 and that's 10 years old. It is just strictly using RTX features to force newer hardware. In this specific games case, the Xbox Series was the only focus. PC and other options are a after thought. At least in my opinion.
>"other lighting options wouldn't have taken much effort" You're wrong and missing the point. Because a lot of effort was taken away by using RTGI, developers were able to focus on other optimizations. They even mention it in the video.
@@Kumoiwa I respect your viewpoint, and my original statement is I mostly agree with DF, but I can't see this being the case when there are multiple engines out there with various options. It isn't like the past where you didn't have enough power and had to bake shadows into the textures themselves. How much more work would it be in Unity or UE5 for example to use global illumination instead? Its basically just a option to tick box. Game development is too long as it is, but this specific area would of taken basically no extra time at all. You could argue the engine is more tailored and diversity is good, but that's the same argument SquareEnix used with Final Fantasy XIII and look at how that development went.
At some point people gotta accept that their rig is old and either live with it or upgrade. If someone was running an 8 year old gpu in the early 2010s or prior they definitely weren't going to be playing modern AAA games and everyone accepted that. People got too comfortable with the fact that the hardware in consoles during the ps3/360 and ps4/xbone era were really weak which is why even a mid tier pc could run circles around them by the time pascal cards arrived.
"If someone was running an 8 year old gpu in the early 2010s or prior they definitely weren't going to be playing modern AAA games and everyone accepted that." Go take a look at a 2010 game and compare it to the graphical fidelity of a 2002 game. Everyone accepted it because graphical fidelity was improving massively very quickly. Meanwhile, these days developers are asking consumers to spend MORE money than they would have in 2010 for what is essentially a side-grade to tech like DLSS or RT, so they have to do less work.
Times have changed a lot since the late 90s and early 00s. And so have the prices of high-end video cards. And I'm not complaining with my 3080Ti and Nvidia shares, but I dont feel *this* game is the poster-child for upgrading your GPU tbh...
I can see why people are annoyed as yes, the cards to do RT have been around for 6 or more. What people quickly forgot is that new gpus have not been viable for most of that time due to price gouging and sustained high prices. Covid and bit coin really destroyed the markets. As a older adult I have the disposable income to handle that but others may not. Even a ps5 was inflated until recent times. So 6 year old tech was the only viable tech for so many. I think devs have to relised the reality of the past 5 years.
A complete straw man argument. If a game is literally playable fine without an UNNECESSARY feature then you're just being gullible and arguing against your own consumer interests. People like you are dangerous to this trying to keep these companies and their underhanded gatekeeping techniques in check. Literally arguing FOR forced upgrading with new features. Insane
I remember having a gtx 870m, was stuck at d3d feature level 11_0, I couldn't play GoW 2018 because it needed feature lvl 11_1. Had to wait a year until I got a RX580. Unfortunately ray tracing is where game development is heading to, and fortunately this tech is now 6 years old or so, so we can't really blame some developers for forcing it. We have to accept that we can't use our graphics cards forever, even just released cards will eventually have to be sunset.
"Why Devs Ramming Technical Problems Up My Ass Is a Good Thing" Fun game, don't get me wrong, but it still lacks the polish I expect. They need to fix the DLSS issues at the very least.
Gamers are complaining about optimization issues with current games but this game was easier to optimize because it was limited scope of targeted platforms. Perhaps developers will put out more performant games if people move on from their decade old hardware to raytracing hardware.
Wait. I thought enabling Nvidia HDR would cause up to a 20% performance hit. And you would have to run DDU to get rid of the performance hit as just toggling it off would not get rid of the hit.
Maybe not the best RT game but I like seeing the RT requirements by default, budget cards have RT now so it’s about time atleast GI now be done as a standard by Ray tracing. It’s scalable and can be performant, win win.
You folks are insane. You're basically the definition of a mindless consumer. You're supporting a ridiculous practice. You literally WANT people locked out of playing games and to bring back the marketing practice of making games be so tech heavy you (even unnecessarily so) you need to upgrade just to play then. It's insanity, and you're EXACTLY the kinds of consumer a company like NVIDIA wants. You will literally go on the offensive and rationalise BS company practices and you genuinely think it makes rational sense. You need to take a step back. NVIDIA has been force feeding RT down throats for a loong time now and NO it is not needed and doesn't HAVE to be used. It is a nice feature IF wanted. You people are the natural enemy of those that fight to keep these companies in check and not purposefully gatekeeping games. You think it's a coincidence that certain games would require RT. Don't be so naive you're playing into their hands
@ ironically I made this same post you made to me, to DF’s Alex a couple of times for calling for mandatory RT in games because I don’t think it always look good or worth the perf cost to alienate millions. I made this comment in light of RT, I’ve been into Ray Tracing as a technology before Nvidia made it their marketing for overpriced products and sure way to make some upgrade yearly for playble framerates when using RTX features. I’m not that guy that wants todo things for corps to benefit, I just have a thing for RT. some games have a fall-back for trash GPUs but to make the switch and finally start making RTGI, Standard will take a lot more than making it a choice. The iPhone has HW RT as we speak, that says a lot about where the gps market should be, some just won’t upgrade until their gpu kicks the bucket. I’m fine with some games not supporting a 1060 because it needs RT-HW but if they have a fallback in place than sweet. Games are better off with RT but it’s hard to get there with so many ways to skin the cat and variety from pointless to neck braking. One feature I can vouch is RTGI, because that will get you closes to look like Path Tracing, second RT reflections. the goal is to Ray Trace everything eventually, been waiting since around early 2010 so to see it come close in some games mean a lot to me than you and that’s fine
I'm really enjoying Indiana Jones on PC. I have a 7700x, 7900XT , 32gb ram and playing 1440p. I have everything maxed (supreme) and it runs great! I keep the frames locked at 90 and it's smooth as butter. I have a 240hz monitor but for single player games like this i cap my fps around 90-120, it's more than fine for me.
Ahhh yes, welcome to modern gaming guys! Where games have a budget of over 400 millon dollars but developers can't be bothered to bake some lights like they used to. Get ready to spend 2000 bucks on a new gpu to run the latest half assed experience with upscaled resolutions, fake frames and mandatory ray tracing.
Well said! And the fact that they never said it requires RTX until two days before launch is a slap in the face. People should be talking about that, but I'm the only one who noticed it, I guess.
Why continue to support outdated technology when RT capable cards have been on the market for 5+ years now, some of you modern gamers complaint too much, when I was young games often drop support of hardware less than 2 years old, imagine trying to run a 1995 game on 1990 hardware, or 2005 game on 2000 hardware.
@@DripDripDrip69 all they have to do is add a software mode also the industry has evolved past needing to upgrade every 2 years these days gpus last atleast ten years the 1080 ti can run almost any game and has the same perf as the 3060 but these devs dont bother to support the have 400 mllion dollar budget but wont allocate 500$ for software rt This Is Stupid On top of that the 5700xt a 4 year old gpu cant run it either even though it has excellent performance
So, I installed the game on my 7700 (PBO +200Mhz CO -35), with 32GB and 3080Ti and am getting around 55-65fps with 4K/All Ultra (in the forest section in the beginning of the game), except Texture Pool Size @High (Ultra shows warning on VRAM usage) and DLSS @Quality. As soon as I touch RT, the system goes to like 5fps even in the menu section. CPU utilisation is about 20-25%. Guessing it's all about VRAM. Even on 1440p/Ultra/High with RT @medium is unplayable.
I'm all for HWRT becoming the standard. It's been over 6 years and 4 generations since HWRT launched on PC, and almost 4 years since consoles had it; even my school laptop's integrated graphics support it (it's not very playable, but neither are most traditional raster games tbf). Standardizing HWRT for lighting would go a long way towards simplifying the lighting pipeline, which I expect to grant better performance and maintainability. I bet UE5 ditching SW Lumen and investing those resources into HW Lumen might speed that up, too, though Epic probably wouldn't do that as it would eat into Fortnite's existing playerbase (as if migrating to UE5 didn't already do that). Plus nearly all GPUs with HWRT have mesh shader support as far as I'm aware, so if you want to use those while you're at it, you're probably home free. My main concerns are twofold: VRAM requirements, and pricing. We need to keep VRAM requirements under control. We still have 8GB cards around, and if we can't ship to, say, a 4060 Ti, then it just muddies the waters on supported hardware and makes everyone's lives harder for no good reason (players and developers alike). Definitely seems like investing in an aggressive memory streaming solution is a good idea (as bad a rep as Unreal gets, something like their virtual textures seems like a step in the right direction), so I'll keep that in mind when working on my pet project game engine. Pricing also needs to be considered. There are legitimate reasons people are still using pre-HWRT hardware (think 10 series, 1080 Ti my beloved) and legitimate reasons to not upgrade. In some countries new hardware is crazy expensive, or maybe they don't have the money, or maybe they do have the money and choose to spend it on things they deem more important... all kinds of reasons to take into account. I'm not super worried because my engine is probably not going to ship games for a few years (it's a learning and resume project right now), but I can see why I would be worried if you're, say, Epic (going back to them), who needs Fortnite to run on a lot of machines.
Pro tip @DF Clips: High pass filter at 200Hz completley solves this issue. Most people's voices don't get low enough to lose detail when high pass filtering up that high anyway and it stops people's rooms from rattling or speakers from getting blown out in the process.
@@peteparker22 Maybe you are not a PC gamer but Alex is the only video game tech reviewer in the entire youtube that is genuinely passionate about PC gaming graphics. he is like a gift from heavens to PC gamers and we PC gamers should protect him at all cost.
I do not really play with ray tracing on most games anyway. So playing at ultra settings on all settings except ray tracing and download texter shader option, I'm still getting way over 60 frames up to 100 sometimes. Son able to get very smooth experience with my 4070 super . I am playing at 1440p with no DLSS or frame gen and I'm getting those numbers on frames. And a 5600 X.
Devs should label their settings with the year that they expect hardware to run it at 60FPS. That way when someone in 2024 tries to run a game using the 2030 setting we will all know it’s the user that’s the problem not the game.
Once I turned off Low Latency mode in the control panel it was smooth sailing. Beautiful game. Playing Path traced supreme on quality DLSS. Getting around 90 fps.
Any game in years is a damn overrated opinion 😂 there are tons of gamers just released this year which are more fun like Astrobot, Metaphor, Black myth, Stellar blade 😂 If you think solving puzzles is fun just go play sudoku or you know play any modern AAA game lots of puzzles 😂
@@Teja only thing funnier is seeing a response that names all playstation associated games as proven examples of alternative fun games lol (even though metaphor isn't Playatation only game, I'm sure you don't associate xbox with it) 😅
finally, i've repeated this stuff in the comments of many videos but many people seem to know nothing about hardware. The reality is that 99.9% of those who complain about this game don't want to admit (or don't understand) that their gpu is really obsolete. For the record, almost all of the 50 most used gpu (not considering the integrated ones) support hardware raytracing and allow you to play this game well, regardless of how old they are. The only "absurd" thing is the use of vram which limits many nvidia gpu, no big deal anyway since even with medium or high textures the texture pool is more than sufficient.
It's very promising that consoles are able to do 1080p+ with RT at 60 fps locked on a game which actually looks quite good. Also, spicy Alex is best Alex.
How can we know the RT in this game is enhancing visual because there's no traditional lighting fallback? for all we know it's not improving visual at all while sacrificing performance.
Game runs 60 fps on a 4 year old console with ray tracing at on average 1800p if you are a pc player with a rig less powerful than a console then why are you complaining about performance?? Upgrade or suffer. Been that way since pc gaming has existed.
@@00-JT And what if it run at higher fps with no visual impact? watch HUB video about RT, and see majority of RT titles out there doesn't improve visual at all while tanking your framerate, if the option is that and 100fps, i'd take 100fps any day
@@stefannita3439 And how can we say for sure, sure it run fine, but what if we can run it at higher framerate by turning off RT with minimal visual impact, you know like many RT titles out there?
Still haven’t seen a good enough game to make me want to upgrade from my 1080 ti. Even on latest gen hardware, games are coming out completely half baked with bad optimization and requiring upscaling tech to get decent frame rates. May as well not even upgrade at all if that’s the state of pc gaming we’re in.
I have a 3080 10GB and as long as I keep the textures around medium, I'm getting well over 60 fps at 1440P with many other settings set to high. Looks amazing and plays great. I had no expectations of this game and only downloaded it since it's on PC Gamepass. Nvidia really nerfed the original 3080 as it has the horsepower to run pretty much all modern games at 1440P high settings but the Vram is holding it back from its full potential. I'm conflicted on forcing RT as in this game, it does look great and runs pretty ok, but I feel bad for the majority of gamers who still don't have an RT capable GPU. That said, to play devil's advocate, we are heading into the 4th gen of RT capable cards and the devs can't wait forever to push the industry forward with making RT implementations standard. However, I'm a few hours in and actually enjoying it despite never watching a single film in the franchise. Maybe I'll actually watch them now.
@@TruthHurts1519 Yeah I'm most likely going to do that. Besides, none of the streaming platforms I'm currently subscribed to have them available anyway.
That's why they are constantly testing on and mention Nvidia cards that are several generations old and only available used at this point (which means that the other big N isn't seeing a penny from people buying those). In other words: Keep your silly conspiracy theories to yourself.
indeed, Nvidia have been caught red handed before infiltrating enthusiast communities with PR firms and handing out "free samples" for "benchmarking" in exchange for promotion of their products. legitimately, Nvidia cannot be trusted with a single thing they say. not that you should trust any corporation, but Nvidia deserves the extra skepticism given their extremely shady history.
@@no1DdC ... you're conveniently leaving out the fact they ONLY test on Nvidia cards. no ARC, no RX. the only time i've ever seen an RX card in a DF video is Vs consoles, or Vs Nvidia in an obviously stacked test like Xenia emulation.
@@anarchicnerd666 AMD only has a GPU market share of 12% at this point - and dropping. Intel's is so low, it's not even measurable. That's why they stopped mentioning those - it's just isn't worth it. I wish that AMD was stronger too, since nobody benefits from an Nvidia monopoly other than Nvidia, but they've unfortunately dropped the ball pretty hard, at least in this area. Their CPUs are doing rather fine - and if you've noticed, DF is routinely using those. What's "obviously stacked" about the Xenia emulation test?
I don't agree, but it's mostly because i'm more interested in getting a game running well than getting it running well on the latest hardware, i don't understand how we can simultaneously praise a technology like DLSS for getting people more frames while praising something that demands people spend hundreds potentially thousands on new hardware, people did that for hallmark games like Crysis, not lighting. On the dev side sure, rendering lighting is a heavy process and on the fly RT makes it easier for devs during the art phase of development but i see two problems, we're passing more of the art of lighting to technology than the artist leading to similar looking games, and AAA has a quality problem, maybe not with Indiana Jones but passing a lack of optimisation onto the player has never been cool and still isn't cool.
@@SilverEye91 People bought hardware for Crysis because of the whole package, but most people can't tell the difference between RT and non RT side by side.
The part about passing the art of lighting to technology is ridiculous, especially in regards to this game, that takes a lot of creative liberties when it comes to its lighting, trying to mimic the look of the movies. The dynamic lighting is what makes this possible in the first place.
Regardless of the additional work and greater scope of support, I don't see why a baked model can't be offered, even if it comes later. Modders would be able to do it. It may be good for advancing the ubiquity of RT and expediting the development process to meet deadlines, but since PC is all about pro-consumer choice, scalability and lowest common denominator spec configurations etc then it only makes business sense to be as robust as possible. They will definitely want to expand their reach as long term, full price sales begin to stagnate and I wouldn't be surprised if a Switch 2 miracle port comes along that completely throws any notion of "just upgrade your PC to play the latest and greatest games" out the window. So there's absolutely financial incentive to do so.
i think this is quite elitist and simply not true, because there is a large fan-base that does not get to play it; a lot of people would play it not only for the cutting edge graphics my GF is having to play it on my PC because her GPU does not support RT; the point being that she would prefer to just play the game, rather than having better graphics; that said i know that there are development decisions that impact this and Cyberpunk does that in a way that i find better for this generation, that game supports both old-school methods aswell as top of the line RT
@@DragonOfTheMortalKombat The RTX cards came out in 2018 I believe. If you haven't upgraded yet, how can you expect to play a AAA 2024 game? That's like someone asking Rockstar to make GTA 6 and asking Insomniac to make Wolverine run on the PS4 from 2013 or PS4 pro from 2016. And the games will come out in 2025-2026 And it's as simple as, if you have not bought a PS5 after 4 years, you can't expect to play the latest games.
i don't have a problem with the graphics getting better with time nor i think that a PC from years ago would run 2024 games at ultra, my point is that it's disapointing to see that there are no options for non RT cards, it's not something that most people can play and then you get extra goodies with RT; correct me if i'm wrong but all the games from the best graphics of 2023 had non RT settings; more options are almost always better and this case is one of them
If you thought hardware based RT requirements weren't coming for games, you were mistaken. The entire point of RT has been to replace prebaked lighting/shadows, SSR, and SSAO. A 16gb RTX 3060 in nearly 2025 is not an expensive card anymore. If you're a budget gamer, you'll be fine with 2nd gen RT cores.
Indiana Jones is surprisingly easy to run even on an RDNA2 GPU which let's face it have the weakest support for Hardware Ray Tracing so it was really nice to see it maintain above 60 FPS in places like the Vatican Daytime where an explorable city with many NPCs. It was a really breath of fresh air that there is a new game not using Unreal Engine 5 and its fast on a 6-Core 5600 no X CPU and now considered a budget GPU entry RX 6700 XT with the likes of the NEXGEN GPUs next year. Doom 2016 could even run on a Q9550 Quad Core + RX 460 4GB GPU, so that's an indicator how good Id Engine has been lately.
Similar setup here, but a 6650XT instead. Over the last little while it’s been clear that I was on borrowed time and it’s time for a proper upgrade all round. I could push on with the 5600 for a while yet, but I’m done with RDNA2 at this point. Will wait until the new hardware in January and make decisions then. Putting off playing this until then.
@@radosuaf If you check my videos, you should see I uploaded a video of the game running it in the Vatican City even going in busy areas with lots of NPCs. Should give you an idea showing full metric in Id Engine's metrics.
@@VesiustheBoneCruncher Yeah, at this point moving forward its best you wait for NEXGEN options especially with RDNA4 or even that B580 if its any good. 8GB VRAM is really a bottleneck.
@@evergaolbird The game runs on a Series S with RDNA2 architecture. So there should be settings to get a similar experience on a Ryzen 5600 with 16 Gigs of RAM and a RX 5700 XT 8 GB card. I know the hardware is old ... so is Series S. But the consoles show that the game and the engine is capable to run on low end machines.
The vram issue is more of a concern for me than the mandatory RT. This is more or less nvidias fault for being deliberately stingy. I do find it ridiculous that my 6GB RTX 4050 can't even launch the game let alone play it on low settings
You could also take responsibility for *your decision to buy a GPU with 6GB* when there were surely other options. You still have a great card. It just can't play *everything* but I'd say you shouldn't expect it to
Man so many haters in this comment section. This channel has always focused on high-end graphics and alex said it the best, if you want to play the LATEST AND GREATEST than your going to need a gpu from the last 2-4 years. Yes people aren't made of money (i only have a 4090 becuase i am a smd solder technician and bought mine broken and fixed it). This is something that digital foundy literally called out in a video recently, the cost of gpus is way to high. They even call out the lack of vram on 40 series... That is not being a shill. Stop getting angry because you cant afford a gpu and play the 1000s of games without rt that are fantastic.
After 7 years of RT GPUs the whining is kind of stupid. They've been around for a long time now and we've always known this was coming...IE more titles needing RT...
yeah like literally i have a laptop with a gpu of 4gb vram and i know that this gpu is old and yk i have many other games thta i can play with my gpu and cpu literally stop complaining that your old gpu can't run there are other games than this one that your gpu or pc can run and play smoothly
Ok but at some point games like this had to start coming out because this is very much a large part of the point of hardware support for ray tracing. The fact that it's so commonplace now means that developers can lean on it to make better games with less restricted cinematography and more resources put into other things because they don't have to spend ages faking lighting. It sucks that that means some people can't play the game but it would suck more if we couldn't have had the game (at the same quality level because resources that went into making a great game would've had to be put instead into all the work that rasterised lighting requires).
I see you've never watched a DF performance and optimization review before. And developers don't custom-configure and program hardware and graphics engines for individual people. Try to repair your attention span and watch the actual channel for context, not just podcast clips.
Right? They live in first world countries, the get the hardware for free and say that it is a good thing rt required when it is a technology you only see differences when comparing side by side...
we need sli again and cards only taking up 2 slots then we wouldn't have to rely on using upscaling and dlss or at least pure raw performance increase for gaming bruteforce
I love that they bring up that your gpu used to go completely out of date in 3 years. From best gpu, to not meeting minimum specs. 6 years is plenty of time to keep up with the hobby
It's a dumb point because graphics tech also used to advance rapidly in the space of 3 years. The last few years nothing has changed at all. New games coming out don't look much better than the old ones.
The first raytracing cards came into the consumer space in 2018. If you want to play the latest games you know that as a PC gamer you have to upgrade your system every 3 to 4 years. If you haven't upgraded to a raytracing capable card by now that's on you. At this point it's not unreasonable for developers to expect users to have some level of raytracing support. The fallback options that have existed up until this point take way longer to program than a raytraced solution and it's just not worth the hassle for developers. Indy also actually runs well with it's mandated raytracing on the worst raytracing card a base 2060. It's doing about 50fps at 1080p native on low settings.
I have an RT capable card (3080) and always disable RT features with the exception of things like DLSS as it's usually not worth the hit in performance, and I know for a fact a lot of others do the exact same thing. Making RT a requirement is ridiculous, especially this early in RT's life where most devs still don't know how to implement it properly without breaking something else.
Some things are inevitable. A major new hardware feature becoming a game requirement, especially years after it was found in most consoles. It's a question of when rather than if. Also inevitable is the whining from people with aging hardware. I can recall these kinds of complaints all the way back to when games first began requiring support for VGA and had no support for EGA and Tandy color modes. Then it was audio cards. Then 3D hardware acceleration. More recently it was SSD storage.
It is a good thing. It means fewer and fewer people will be playing games because of absurd cost involved (on PC), so NVIDIA shilling sites will go out of business. On more serious note, I can't think of any reason why exclusiveness is good for anyone. It applies to single console games and RT-only games all the same.
Sorry buy newer graphics card RT hardware is like 6 years or older now. I did pc gaming in the early 2000 and you need a new graphic card every 2 years or you couldn't play the latest games because your card didn't support the latest pixel/vertex shader.
In this case it's reasonable because the GPUs are old and affordable but it could become a very slippery slope as the mins get higher and the GPUs only get more expensive.
RT games are not ab exclusive thing. Not having the requirements is not the same as not being able to play it *at all*. People had ample warning to understand RT was going to become a big thing. Back in the day, a new shader model would make your card useless. The *vast* majority of games don't need RT, but if you want to play those shiney new, high fidelity games, you will need a shiney high fidelity card.
@@nulian yeah but in early 2000s, every year was a massive jump in texture quality and overall graphical fidelity, which ray tracing simply isn't, that's why people are mad.
Long live NON-UE5 game engines!
Indeed. Idtech does it better, Again! Looking forward to Doom - Dark Age
It's a non UE5 game
Dog 🐕 watering shitty game engine..game looks like trash . And still takes up huge vram😂
@@AponTechy Poor kiddo doesn't have a system that can play it and says it's trash out of spite.
@@BasePuma4007 he's got a point, Indy Jones looks 10 years old at the very least.
animations and visuals are not on todays standard, let alone on UE5's standard.
that engine works great for Doom, but it absolutely doesn't work for realistic portrayals like Indy Jones.
RTX 4090 and 7800X3D here, no matter how good your system is, if a game looks dated, it looks dated.
Thank you John for shouting out the HDR DLSS issue. I was pulling my hair out trying to figure out why DLSS looked so borked for me and never thought to try turning off HDR. Thx brotha
I'm using AutoHDR as a fallback. It isn't perfect, but everything works
I use dlss 3.8.xx its way better in quality and also RTX HDR as a Fallback.
that issue has been addressed in the latest patch
they fixed it with the latest patch
@@muneebhamza8917 @killik69 It is not fixed. HDR+PT produce shimmer and also frame gen works only if you restart the game and never go to settings again. We need another patch.
Remember when Quake 3 didn’t have a software render backend? You NEED a 3D card?1?
Back when such a card didn't take 3 month wages to afford...
@@zybch There's always a PS5 or Xbox to solve that.
@@zybch Back then 8 year old hardware would not run quake 3. You would have had to upgrade constanly. Computers were expensive. Everybody was exited. Not alot of complaining also. Fun times
Just reminded me of having everything but the latest soundcard.
@@zerone9623 The whining and negativity is really irritating these days. I really wish people would just not buy things they complain about.
Of course, Hardware RT will make Richard's head more reflective.
The indirect specular lighting is immaculate!
😂😂😂😂
His baked head lighting is enough.
Only if he uses the right RT polisher. i think nvidia sells one
@@testertester4968 I need to be able to look directly at his head and see the ceiling light reflected that isn't in view.
Target hardware is the Xbox series X. If after 4 years you still dont have a pc as capable as that console. Why would you expect to play AAA games at higher settings than the xbox??
I’m not expecting to play it on my 1080ti at higher settings than the Xbox, I just want to be able to run it.
@@ElZamo92 the 1080ti is 4 generations old! You can’t expect to be able to play modern game with a card that came out in 2017. You want a 2024 game to conform to a 7 year old card?
@@00-JT nearly 8. It's like expecting to be able to play the first Call of Duty on a card you bought for Duke Nukem 3D.
But the Xbox can't even run the game. They have it at lower than lowest settings for rtgi, why isn't that an option on the pc version?
@@MarstedR yet it still looks and runs great. Any modern card will run the game as long as you stay within the VRAM budget.
"We've got a GPU for you, it's called RTX 2060" - Don 'Alex Battaglia' Mattrick 😂
RX6400 is able to "run" this game
@@MaxIronsThird 1080p performance fsr on low should easily stay over 30 fps and potential up to 60. It's a pretty solid performer.
Edit: I thought it had 16gpu cores but turns out it is 12. We are in igpu territory so I think 1080p perf low 30fps is likely the average and not the minimum.
@@christophermullins7163FSR performance mode at 1080p 30FPS would look like an Xbox 360 game. People in that camp would be better forking over the money for a Series X if they really wanted this.
im old enough to remember everyone bitching about shader requirements on old geforces fx cards and games requireing it like halo pc and stuff. nothing changes. people will always complain the old cards have had an insane life span
its fair enough to complain about prices and gfx cards being price gouged.
but complaining of these new tech and becoming standard and old cards aging out is silly.
It's not about having to upgrade, it's the fact that the game doesn't give you the option to turn RT off which is honestly "kinda cringe" and anti-consumer. All it does is artificially make it so non RT cards can't run it, even when they would have been more than capable of doing so. AMD for example have a bunch of recent cards such as the RX 7800 that are pretty beefy, but don't support RT features, this is a card that came out a year ago, and it's unable to play this game and potentially a bunch of others if this disgusting trend catches on.
And it doesn’t help that graphics technology has been hitting the point of diminishing returns like a freight train for the last ten years. It makes the lines blurry between one side of the argument and the other
Agreed. We can scream and kick all we want about prices, gouging, exclusive, blah blah blah. But there is just no logical reason to hold back an entire industry to support the bottom 20% of cards left.
@@lau6438bottom 20% cards left? Seriously? The majority of people are using 4060Ti with 8GB Vram, let alone it having a miniscule memory bus of 128bit lol.
Industry is still not ready for the mandatory RT requirement in games, like what MachineGames "proudly" did.
@@gamzillio The famous 4060 TI without rt cores.
8 years in GPU terms is an absolute lifetime
You're gpu owes you nothing at this point
I would love it if I could keep using my GTX 980ti, even if it is relegated to current Budget level standards.
Id say it's 2 life times considering consoles are on a 3-4 year update cycle
I remember when Battlefield 2 came out and I had a Ti 4200. BF2 required SM 1.4. I was stuck on 1.3. No dice.
No dice 😂
you mean no DICE.
"Why having a monopoly in the GPU Market is a good thing"
I was thinking that also
Most poular gpus 3060 and 4060 and rt is a good thing? Lmfao, total disconnect from reality, enjoy your 5 fps with rt on your 4060
I wonder if in 5-10 years when games will use RT as a basic lighting system across the board people will still complain it doesn't run on max settings on their 1050s.
100% they will
Of course they will, because Nvidia will sell cards marked as X070 that in reality will be X050 inside.
No, everyone will own $10000 90 series gpu by then... Check the most popular gpus and 4090 aint it... Which is only capable of running rt full on
@ dude a 3060 or a 4060 which is arguably the most popular card runs the game just fine on low textures which don’t look bad. You also have a lot of budget options in other brands with more vram.
@@Dawdo-wf1xs on low... So 2007 graphics.... Great
It’s just the way things are going, even integrated GPUs will end up with dedicated RT cores in not too long I’m sure, and consoles have them already
And the two versions of lighting is too much really yeah. I’m currently developing with a two year or so release target and only using RT.
they already have
If workflows can get streamlined on the order of 50% it's a no brainer, and it just looks so much better.
@@MaxIronsThird Ah that's ace, I didn't know that!
@@BasePuma4007 The bonus is that in Unreal, using the path-tracer to bake lighmaps that work in conjunction with Lumen is on the roadmap, so the future includes being able to combine ray and path tracing with GPU-baked lightmaps, which is essentially ideal.
@ Unreal Engine is moving away from light maps I believe. For shadows / direct illumination they will use "MegaLights" in the future, which supports an arbitrary number of lights, probably based on ReStir. By default it uses RT shadows internally rather than virtual shadow maps.
Digital Foundry on RUclips, should require RT, that would be a real very good thing!
At least require HDR!
@1:40 are you sure?
I think the nvida 6800 supported dx9c, it was ATI that was SOL, if you bought a $700 X800 XT PE, you couldn't play shader model 3 games like bioshock, which wasn't even full UE3 but still required it.
It just shows how awesome the 10 series was. The GOAT nvidia series. A gpu from 2000 would melt with a game made in 2007.
But a gpu that's from 2016 can still run a game from 2024 somewhat well.
Except this one, or any game that uses mesh shaders like Alan Wake 2. Anyone serious about PC gaming still using an 8 year old GPU might as well get a console at this point, as current gen consoles are better than the 10 series. And current gen consoles have k&m support.
Im on a Vega 64 and doing just fine lol
Terrible comparison. The leap from 2000 to 2007 was far bigger than any leap afterwards. The current technology doesn't improve much compared to old days
@@acurisurThe hell are you talking about? A gtx 1650 card with 4 gb of vram runs Alan Wake 2 just fine in 1080 p with medium to low settings. Yeah, you get the mesh shader bullshit warning but you ignore it and the game boots up.
Mores law has somewhat reached the point of diminishing returns compared to the early 2000s. Powerful GPUs are much harder and more expensive to manufacture.
Is the DLSS/HDR issue with Indiana Jones on W11? I'm playing now with HDR and not seeing that issue.
They are probably already patched this
To me, this game really exposes why the GPU market sucks so hard right now. Basically, games like this (and to a lesser extent Alan Wake 2) are really showing why it's (still) important to buy a GPU with the most up-to-date feature set if you want to have something that will really last a good long time into the future. The fact that a 2060 Super can still run titles using the most modern rendering tech, while the RX 5700 XT (which was its direct price competitor) simply cannot, is a perfect example of this. The 5700 XT was a bit faster at the time it released, but because of the feature set, the 2060 Super was the more "future-proof" card in the long run.
But at the same time, games like this also expose how important it is to get a card with a lot of VRAM overhead if you want it to last.
And now we're in a situation where you can't get both an acceptable amount of VRAM overhead AND the most modern feature set without paying over $700. AMD gives you enough VRAM but its core performance in RT titles is just not good enough, while Nvidia is just lol with their absurd VRAM stinginess.
As somebody who is still rocking a 2070 Super, the minimum upgrade that makes any sense to me would be a 4070 ti Super, because it's the cheapest card with both enough VRAM and a future-looking feature set/RT performance.
If you'd asked me back when I bought my 2070S (at its launch) I would have never guessed that in 5 1/2 years time I would need to pay nearly $800 to get something that was a worthwhile upgrade.
I'm really, really hoping that AMD closes the gap with Nvidia in RT performance with the 8000 series. Like, I know that they're not shooting for the high end with this series, but that's fine, I want a midrange card anyway. I just want the $500 AMD card to give me RT performance that's competitive with the $500 Nvidia card.
While I agree that AMD's RT performance leaves a lot to be desired, I also think most games just slap the feature over the top of a traditional raster pipeline without much consideration for performance. Games like Indiana Jones. Metro Exodus (the RT re-release) and Avatar, especially the latter- which isn't brought up in these discussions enough -prove without a shadow of a doubt that RT can be a great help to the development of a game, look spectacular and make a meaningful difference in the quality of visuals, and RUN GREAT with fantastic optimisation even on AMD's RDNA2 GPUs like the RX 6800 XT.
Even RE Village for example uses extremely performant RTGI which makes a meaningful difference to visual quality, whilst performing very well for all GPUs. It just seems that for RT features, the focus is all in the wrong place- it's focused too much on impressing people, rather than what it's supposed to be, a better solution than traditional global illumination, shadows, etc that can produce better results much easier, and streamlining the development process. When games are built with it as a core part of the development like Avatar, Indiana Jones and Metro Exodus, they just happen to run extremely well? That's not a coincidence. All the games so far that were developed like that run well because they aren't just focused on showing off.
On the other hand, we have games that are just RT "showcases", which perform awful even on a 4090 and require DLSS framegeneration to even be acceptable on the most expensive hardware available.
Direct Storage API was supposed to help solve this the vram issue but well. Who knows what happened there.
@@layzie45 No it wasn't, that's has a whole other purpose that's not even remotely related to vram saturation.
Hopefully what we're seeing with Intel might shake things up in that regard because certainly for budget hardware the B580 seems very much to do both.
A 2060 doesn't have the vram to run it.
Kinda makes me regret buying a 7800 XT. Because, despite the fact that this game is very playable on that GPU, most games that require ray tracing going forward... simply won't be. And I'm not made of money.
unfortunately.... this is why people keep buying nvidia.... hopefully the new 8800xt would be a lot better in other areas other than just raster performance...
I would've went AMD a long time ago but Ray Tracing and DLSS are game changers.
@@killerhurtalot Another reason is DLDSR. It hardly takes up resources because of the tensor cores. It's fantastic. You can even combine it with DLSS and oftentimes looks better then Native.
3070, 3080 despite having superior RT cores, are all slower than even a770 in this game due to lack of vram, in a perfect world 4070 should've had 16gb then that would make our choice easier, but it's not, and there's that slight chance it will end up lacking of vram in the future.
It literally performs like 4fps slower in this game with the regular ray tracing vs the rtx 4070. Path tracing isn't available on AMD cards. God you people are weird.
The real issue is the VRAM gatekeeping. The early adopters with a 2060 won't get to use the ray tracing future proofing they were hoping for.
Nobody who bought a 2060 thought they were future proofing. Indeed I knew my 3080 TUF was a one generation card due to the picked up pace of RT. Although the 4090 is fantastic I find myself looking forward to a 5090.
2060 and future proofing don't really go well in the same sentence
@@Dawdo-wf1xs The 2060 Super can still run this game decently.
@@Silikone yeah I meant the 2060 og
@@Dawdo-wf1xs OG 2060 is doing just fine, look on my channel. if we had AMD frame gen i could have nice 80-90fps on 6GB-2060
2007 PTSD Activated
I forgot about the Core 2 Extreme until now, thanks.
The conversation around this and RT in general is why I don't understand why RT was marketed towards consumers the way it was. RT is a development tool first and its a massive plus it makes games look better (overall if it's all used correctly). The work flow is so much easier with rt and it's amazing that it can run real time these days
The console manufacturers needed a new marketing gimmick, clarity be damned
Exactly. RT cuts down the workload of developers by half. Everyone whould be behind this tech if it wasn't for the prohobitively expensive hardware costs that come along with it. Also the clear planned obsolesence of the RT cards being starved of VRAM when the main requirement for RT heavy games is... VRAM.
Because the consumers have to buy the card. If the GFX card came with the game then I would be with you but it doesn't.
@@backgammonbacon consumers always have to buy new cards. Being a PC gamer is knowing you have to upgrade your rig every 3 to 4 years. The first raytracing hardware is 6 years old now. We are well past the reasonable upgrade period for people to either come aboard or get left behind.
@@annonannnonymus Consoles needed a new gimmick, so Nvidia (the company that has never manufactured a single RT console) started pushing for RT? Do people think before commenting?
The two games to my knowledge that have required RT capable systems (Indiana Jones and Metro Exodus Enhanced) are both unsurprisingly the two games where the ray traced lighting really makes a big difference in lighting accuracy and immersion, while still running good.
Ray tracing is great and is not a gimmick. There's so many games where it is half heartedly implemented with RT shadows or RT reflections paired with traditional lighting techniques. The end result is an image that is slightly more accurate (but barely noticeable) with a 20-30% reduction in performance...
With Indiana Jones, the graphics arent that good, not terrible, but not all that great, but the RTGI totally transforms the presentation and makes it so much more immersive and beautiful.
Sorry but its a gimmick
@@boscotheman82 Sorry but it's not. You being so unobservant as to not notice the obvious upgrade to lighting accuracy and immersion is your problem.
It wasn’t until the Egypt level that I realized just what a difference the RT lighting was making to the game, using your lighter to explore the pitch-black tombs is incredible
@@ScubaSnacks Yeah that is where it really pays off. That section in the blue tent at the beginning of the Egypt level really showed off how much better RTGI is with lighting. It just makes everything look so much more realistic. The bounce lighting and scattering looks so much better than traditional shadow maps and ambient occlusion.
@@BasePuma4007current baked in lighting honestly looks good enough that i cant bring myself to care that its more realistic or whatever
Funny how we have the pc master race gloat about how pc graphics are so good and consoles hold them back yet they complain when games want to be true next gen and move the bar forward and their card doesn’t work
100% meme level clown car of opinions
Turing came out 6 years ago, games have not required its feature set. You had 6 whole years to keep your aged hardware alive. If you don't have an RT capable card by now you have made an error. But that said, you still have time. This is just one game.
I'm fairly sure that those were two very different groups of PC players saying that. We aren't some sort of unified hivemind.
pc players are the biggest crybabies
funny how i'm playing Indiana Jones on RTX2060-6gb even at 1440p/67%res without crashing, the fix is on my channel, with LS framegen im playing at 1440p+DLSSQ and getting 80fps, used "CRU" (custon monitor res) now im playing at smooth 80hz 1440p, and the game plays and looks good lol.
RTX HDR is really a killer feature
Indiana Jones and Great Circle of PC Requirements
Indiana Jones and the mystery of Ray tracing shoved down your throat
Well.... I'm on a Titan X Pascal and a i7 5930K. it's lasted me all this time.... It's finally time to bite the bullet and get a 9800x3D and a RTX 5090!
Dude I dunno where you live but where I live you can buy a used RTX 2060 super for £150 upfront and there'd be no reason to need a CPU upgrade with that. Of course if you do go with the top of the line stuff all over again it'll probably last you just as long if not longer so it's hardly a bad idea if that's what you were going for (I really can't tell what the time of your comment was and I'm not trying to be rude or combative I promise).
@oliverdowning1543 Yeah, exactly that. I like to get the best of the best hardware at the time and have it last and future proof myself. And who knows if I will get the chance to do this again in my life. I might as well do it now whilst I have the money...
Sad to see my old rig go but 10 years... Can't grumble at that really can you...
PC is so wack
@@lucasnookeryeah, totally not a bad approach. I've yet to have been in a situation where I can really do that myself but definitely when I can it seems the way to go.
Exclamation point exclamation point
Back around the days of releases like Splinter Cell Pandora Tomorrow or Splinter Cell Chaos Theory, shader models were constantly being iterated on and hardware could only support the latest available shader models, so it wasn't uncommon to launch something like Pandora Tomorrow on a GPU that was outdated by several years and the game being unable to run at all with the error message "your GPU doesn't support shader model 1/2/3, so you can't run this game, period."
Considering what I saw on PS5 pro in F1 2024 and Fortnie do You think guys some "basic path tracing" its feasible on PRO like 30 fps and upscaled res to 1440p? 😅 Please answer only people with programming knowledge not like Pc master race just saying console is trash I know its not a super powerfull😅
Edit: Some guys test medium Path tracing on Rtx 2080 ti at 1440p (upscaled) and they got like 30-45 fps so on PS5 pro something similar seems possible casue 2080 ti is even slower than PS5 🤔 😆?
I doubt it. Sometimes even regular Ray Tracing is difficult for them to add while keeping a decent fps/res. F1 is just not a very demanding game.
It's a last gen game essentially with added RT... It's an easy game to run on PC
But it's not full Ray Tracing because they're missing RT for Shadows. Sometimes Pro has to use lower than the lowest PC settings of RT.
It's possible but it'll only be on highly optimized/less graphically impressive games like the 2 you mentioned.
I forgot to mention that fps and Ray Tracing are reliant on the CPU and with the outdated Zen 2 it'll be difficult for path tracing.
lol not a chance
Regular ultra settings runs around 3 - 3.5 times faster than max full RT... I am referring to a video where 4090 at native 4k is 100fps Ultra vs 30fps path tracing. PS5 pro certainly has a worse raster to RT ratio than 40 series so I would go so far as saying that PS5 pro regular quality settings would run somewhere between 5 - 8 times as fast as the max path tracing mode. You do the math lol what resolution would you need to run if you need 8x performance compared to 1800p? Likely below 720p for 60 fps
@@christophermullins7163 🤔
No matter what happens now, rasterisation-only cards probably have at most a couple more years of support from new games, anyway. It’s hard to imagine that Sony won’t want to go all in on ray tracing with the PlayStation 6, and that will probably be the final straw for new games also dragging along old PC GPUs with weak or non-existent hardware RT. And naturally studios will start working and planning with that future in mind well before the PS6’s launch date.
@@leocomerford 💯 what I been telling everyone I mean you can see this coming from a mile away but yet these people are still going to be caught with their pants down
Id Tech is amazing as always. Doom: The Dark Ages will be using the latest Id Tech and it will be even better than Motor, the new Id Tech derivative engine.
Thank you for the correct messaging here. Today's gamers don't seem to be aware that these requirements are generous: you need at least a 6 year old NVIDIA card or 4 year old AMD card. Last gen APUs can run this game, as can the ASUS ROG Strix mobile platform, as well as the Steam Deck. Games need to evolve, and this always happens, like when hardware rendering became a requirement in the 90s, or when pixel shaders became a requirement in the early 2000s, and those were a lot less forgiving in their minimum requirements than this game! You weren't going to play Return to Castle Wolfenstein on 6 year old hardware.
Today's gpus are more expensive than in the past.
@@GreyDeathVaccine This is a false blanket statement. The flagships are more expensive, but we have $250-330 mid range cards today. Throughout the 2000s, we also had $250-400 mid range cards, and if you account for inflation, that's a lot more in today's money.
@@GreyDeathVaccine nope
I mean the only reason the new Intel GPU makes waves at the moment is that there weren't decent 250-dollar midrangers for the past couple of years.
When graphical fidelity was rapidly advancing and games were looking exponentially better every year, it made sense that the latest games needed better hardware.
These days all we're getting is side-grade games that look the same as their competitors but require better hardware to finish the developers' jobs. It's not the same.
To put it into future perspective, as hardware becomes more and more developed in the RT department, eventually ray tracing and path tracing will become more optimized and be far easier to run - it will be to future hardware as performant as rasterized solutions are to current hardware now. This is a necessary step in that direction, even if it means a lower frame rate threshold on current generation hardware, progress doesn't happen without some initial growing pains. And if people hold onto legacy equipment for too many generations, they will eventually be left behind by technology evolution.
You folks are insane. You're basically the definition of a mindless consumer. You're supporting a ridiculous practice. You literally WANT people locked out of playing games and to bring back the marketing practice of making games be so tech heavy you (even unnecessarily so) you need to upgrade just to play then. It's insanity, and you're EXACTLY the kinds of consumer a company like NVIDIA wants. You will literally go on the offensive and rationalise BS company practices and you genuinely think it makes rational sense. You need to take a step back. NVIDIA has been force feeding RT down throats for a loong time now and NO it is not needed and doesn't HAVE to be used. It is a nice feature IF wanted. You people are the natural enemy of those that fight to keep these companies in check and not purposefully gatekeeping games. You think it's a coincidence that certain games would require RT. Don't be so naive you're playing into their hands
@@TheycallmeMrWonka "Yeah, the game looks and runs like shit but hey, we have DLSS! You like DLSS right? That's right. Eat it up piggy."
We need shadow and LOD distance to have higher settings available, and I don't care if it's a mod. Yes you can do the lodscale, but in the settings menu would be nice. I wish it actually had a larger pool size available for 16-24GB VRam cards, but I understand if it has no visual effect. "Need" is probably not accurate 😂.
It absolutely feels like Indiana Jones, and I love it.
alan wake 2 required mesh shaders....while it didnt require RT....this essentially made it so you needed the same GPUs has this game
Except Indy devs were lazy enough not to add any fall back for RX 5000 or GTX users while Alan Wake 2 had support for them and later added compatibilty for GTX 10 series as well.
Indiana Jones' optimization is very overrated
@@DragonOfTheMortalKombat Dude it's not dev laziness, it's the fact that in game development you have priorities and supporting 8+ year old hardware for a tiny portion of people is not very high on the list of those priorities.
@@DragonOfTheMortalKombat creating games with ray tracing in mind allows you to achieve artistic solutions that would otherwise be unfeasible or very difficult. Most people who complain like this don't understand anything about hardware, game development, or coding in general. the GTX 1000 series is now too old and lacks features and raw performance, I can understand what you say about the rx 5000 since theoretically a 5700xt (for example) still has decent performance but in this case it's the amd hardware that is always a bit behind and on certain occasions you pay for it. In the past even 3 years for a gpu were a lot so I wouldn't complain much if after 8 years a gtx 1060 can't play this game (just a single game lol).
@@monkfishy6348 I know you dont know shit about game dev if you think a game fully made from scratch with RT in mind is harder to make that making the game with basic prob light system and then adding RT........ and btw Alan Wake 2 devs didnt added any fallbacks, people found that they could run it but it will run like shit, so devs just added a switch so people could run it even if it was never meant to be play like that
@@DragonOfTheMortalKombat Alan Wake 2 still runs like absolute crap on 10 series GPUs. Just because they made those cards compatible doesn't mean you should play it.
The only problem is the bare minimum RTX 2060 Super is $250, which you could get an Xbox Series S for cheaper than that. That's slightly different than finding a cheapo GPU and barely squeaking by, now you're EXPECTED to pony out hundreds just for the lowest settings possible. Maybe the AMD and Intel options are cheaper, but that still seems over the top given there's the 2050/2060 that use raytracing and are similarly excluded.
You could almost buy a used Series X for that, and it plays fine on that console.
@
Exactly, it seems excessive for just the GPU to cost that much to hit minimum. Before, you could get a PC of comparable price and get decent results. The Cryptomining jacked up the price and we see the results here.
I think hardware RT requirements in 2024 (almost 2025) is perfectly cromulent, especially given how long RT has been around.
I have a good vocabulary and I had to look up cromulent.
@@veilmontTV not cromulent bro
It will only embiggen the push for rt standards across all hardware
@@veilmontTV If they're not covering something that brings out the console waring children, this type of articulate nerd is par for the course around here. ❤
@@veilmontTVYou shoulda watched more of The Simpsons...
Dlss, high preset, full RT frame gen off, at 2560x1600, with fps limit of 75 puts my gpu utilization at 45% on my ad103 laptop. Really the main issue is that I’m using over 13gb vram. Initially I was experiencing BSODs whenever I was about to enter or was in long cutscenes. My short term solution was to lower the resolution all the way beforehand and I was able to get through. I tried reinstalling my drivers and I think that might have fixed it, fingers crossed.
The fact this game uses so much vram is really an issue for me. My gpu has so much more left to give but it would need even more vram to do it. That’s a problem. High looks okay but there are so many other games I’ve played recently that use a fraction of the vram, are fully ray traced, and look way better (cyberpunk) I think there needs to be some more work done to optimize the vram usage in this game. On a side note, this gam taxes the cpu like crazy.
Where is the Soul Reaver Remastered video ??
Thanks for answering my question so thoroughly! Always a pleasure to watch the DF Direct Weekly! :)
Why are the cutscenes so terrible? Massive stutter & screen tear otherwise runs really well.
What's your hardware? I'm seeing nothing of this sort on my old RTX 2080.
@@no1DdC 4090 5950x 3d
@@nrosko That seems exceedingly unlikely. Drivers up to date? Is your G-Sync display properly configured?
@@no1DdC Its not g-snc. I'm not the only one reporting this issue.
@@nrosko I find it odd that you bought the most expensive consumer card from Nvidia, but not the right screen to go with it.
Alienware m16R2 4070 (8gb) 32gb RAM Core Ultra 9
I can barely put anything to medium without getting the vram warning in the settings, even with DLSS on performance. Running 1080p. Would be nice if they provided the vram usage numbers in the settings as you toggle stuff on/off and up/down. I’m getting a good framerate over 120 mostly but I’d like the game to look a little better. Guess I’m held back by the 8gb vram on my GPU.
I usually agree with all the DF viewpoints, but this one leaves me more against for once.
Yes direct X 9.0c came out and 2.5 years later was required.
In that time GPU's progressed from GeForce 6800 Ultra to a GTX 8800
Shaders = 16 increased to 128
TMUs = 16 increased to 32
ROPs = 16 increased to 24
Bandwidth = 35.2GB/s to 86.4GB/s
Core clock also increased from 425Mhz to 576Mhz, so in total:
10.8x Shader improvement
2.7x Texture improvement
2.0x Pixel improvement
2.4x Bandwidth improvement
= 4.47x average improvement over 2.5 years
= 1.79x increase per year
6800 Ultra = $489
8800 GTX = $599
Price increase was 1.22x for a 4.47x improvement
RTX came out 2018 and in 2024 we have our 1st games, so 6 years. RTX 2080 to RTX 4080:
Shaders = 2944 increased to 9728
TMUs = 184 increased to 304
ROPS = 64 increased to 112
Bandwidth = 448GB/s to 716.8GB/s
Core clock also increased from 1710Mhz to 2505Mhz, so in total:
4.8x Shader Improvement
2.4x Texture improvement
2.5x Pixel improvement
1.6x Bandwidth Improvement
= 2.83x average improvement over 6 years
= 0.47x increase per year
RTX 2080 = $699
RTX 4080 = $1,199
Price increase was 1.71x for a 2.83x improvement
So the time period is longer, but the technological improvement over even twice the time period is much less.
The killer is the weakest a PC can use is a RTX 2060 / RX 6600, and yet it runs just fine on a Series S. We are talking about a 4Tflop AMD GPU with inferior Ray tracing abilities, but the PC needs a nearly 9Tflop GPU from the same architecture. That makes no sense.
Then consider other lighting options wouldn't of taken much effort and the result is laziness and a artificially forced GPU upgrade.
Is the upgrade expensive? No not really. Is it REALLY a requirement? Definitely not.
A modern say Splinter Cell (Direct X 9.0C or Direct X 8. no 9.0, 9.0a, or 9.0b support)
Tell me, is there a gpu equivalent to the series s with rtx/RDNA 2 features?
@@annonannnonymus I don't believe there really is. The Series S has 1536 RDNA2 shaders but cut down to 1280 shaders.
Closest is a downclocked RX 6500XT. That has 1024 shaders (So less) but is clocked at 2.60Ghz to the Series S 1.56Ghz.
RX6400 is a little weaker. RX 6500XT is faster. Either way that's around a RX 480 or GTX 970 and that's 10 years old. It is just strictly using RTX features to force newer hardware.
In this specific games case, the Xbox Series was the only focus. PC and other options are a after thought. At least in my opinion.
>"other lighting options wouldn't have taken much effort"
You're wrong and missing the point. Because a lot of effort was taken away by using RTGI, developers were able to focus on other optimizations. They even mention it in the video.
@@Kumoiwa I respect your viewpoint, and my original statement is I mostly agree with DF, but I can't see this being the case when there are multiple engines out there with various options. It isn't like the past where you didn't have enough power and had to bake shadows into the textures themselves.
How much more work would it be in Unity or UE5 for example to use global illumination instead? Its basically just a option to tick box.
Game development is too long as it is, but this specific area would of taken basically no extra time at all.
You could argue the engine is more tailored and diversity is good, but that's the same argument SquareEnix used with Final Fantasy XIII and look at how that development went.
At some point people gotta accept that their rig is old and either live with it or upgrade. If someone was running an 8 year old gpu in the early 2010s or prior they definitely weren't going to be playing modern AAA games and everyone accepted that. People got too comfortable with the fact that the hardware in consoles during the ps3/360 and ps4/xbone era were really weak which is why even a mid tier pc could run circles around them by the time pascal cards arrived.
"If someone was running an 8 year old gpu in the early 2010s or prior they definitely weren't going to be playing modern AAA games and everyone accepted that."
Go take a look at a 2010 game and compare it to the graphical fidelity of a 2002 game.
Everyone accepted it because graphical fidelity was improving massively very quickly. Meanwhile, these days developers are asking consumers to spend MORE money than they would have in 2010 for what is essentially a side-grade to tech like DLSS or RT, so they have to do less work.
This kinda makes me wish Halo had transitioned to Id Tech instead of Unreal
Considering you were able to buy a 3070 less than a year ago consistently makes the vram requirements a little rough.
Times have changed a lot since the late 90s and early 00s. And so have the prices of high-end video cards. And I'm not complaining with my 3080Ti and Nvidia shares, but I dont feel *this* game is the poster-child for upgrading your GPU tbh...
Yes, they overlook prices.
I can see why people are annoyed as yes, the cards to do RT have been around for 6 or more.
What people quickly forgot is that new gpus have not been viable for most of that time due to price gouging and sustained high prices.
Covid and bit coin really destroyed the markets.
As a older adult I have the disposable income to handle that but others may not.
Even a ps5 was inflated until recent times.
So 6 year old tech was the only viable tech for so many.
I think devs have to relised the reality of the past 5 years.
Complaining about 10 series GPU not being able to run this is like complaining the game didn't release on PS4.
I mean, come on.
A complete straw man argument. If a game is literally playable fine without an UNNECESSARY feature then you're just being gullible and arguing against your own consumer interests. People like you are dangerous to this trying to keep these companies and their underhanded gatekeeping techniques in check. Literally arguing FOR forced upgrading with new features. Insane
If they release the PS6 with the same specs as the PS5, but make it so that the new games only run on PS6, would you be complaining?
I remember having a gtx 870m, was stuck at d3d feature level 11_0, I couldn't play GoW 2018 because it needed feature lvl 11_1. Had to wait a year until I got a RX580. Unfortunately ray tracing is where game development is heading to, and fortunately this tech is now 6 years old or so, so we can't really blame some developers for forcing it. We have to accept that we can't use our graphics cards forever, even just released cards will eventually have to be sunset.
"Why Devs Ramming Technical Problems Up My Ass Is a Good Thing"
Fun game, don't get me wrong, but it still lacks the polish I expect. They need to fix the DLSS issues at the very least.
Try not using a 10 year old gpu lil bro
@UT_Alx Strawman , nice try lil bro.
@@UT_Alx I use a 4070 Ti Super, lil sis.
@@WhatDaFaust I refuse to believe you are unironically calling the game unoptimized then.
@@KalElinaboxWhy am I even trying to engage with people that don't even have a profile picture, 90% chance this is AI.
complaints like this make me wonder how these people wouldve survived back when crysis was first released
It's easy for people to say stuff like that when they can afford it
Gamers are complaining about optimization issues with current games but this game was easier to optimize because it was limited scope of targeted platforms. Perhaps developers will put out more performant games if people move on from their decade old hardware to raytracing hardware.
What optimisation? Shits broken.
@@ElZamo92 Broken? On a potato PC maybe. I'm getting a locked 60fps on my aging Ryzen 3600X with a 2080 Super at 1440p.
@@ElZamo92 What's your setup?
It also runs better because it uses the idTech engine instead of the dumpster fire known as Unreal Engine 5.
Wait. I thought enabling Nvidia HDR would cause up to a 20% performance hit. And you would have to run DDU to get rid of the performance hit as just toggling it off would not get rid of the hit.
Where did you read this utter horseshit?
Running the game on an rx 6600 at low settings with forced RT will surely bring up the experience on a whole new level (sarcasm)
Pixel aRT
It looks and runs great. It's not like anyone is complaining about the Series X version, which runs at 1800p at settings below PC's minimum.
the game runs really well
Maybe not the best RT game but I like seeing the RT requirements by default, budget cards have RT now so it’s about time atleast GI now be done as a standard by Ray tracing. It’s scalable and can be performant, win win.
You folks are insane. You're basically the definition of a mindless consumer. You're supporting a ridiculous practice. You literally WANT people locked out of playing games and to bring back the marketing practice of making games be so tech heavy you (even unnecessarily so) you need to upgrade just to play then. It's insanity, and you're EXACTLY the kinds of consumer a company like NVIDIA wants. You will literally go on the offensive and rationalise BS company practices and you genuinely think it makes rational sense. You need to take a step back. NVIDIA has been force feeding RT down throats for a loong time now and NO it is not needed and doesn't HAVE to be used. It is a nice feature IF wanted. You people are the natural enemy of those that fight to keep these companies in check and not purposefully gatekeeping games. You think it's a coincidence that certain games would require RT. Don't be so naive you're playing into their hands
@ ironically I made this same post you made to me, to DF’s Alex a couple of times for calling for mandatory RT in games because I don’t think it always look good or worth the perf cost to alienate millions. I made this comment in light of RT, I’ve been into Ray Tracing as a technology before Nvidia made it their marketing for overpriced products and sure way to make some upgrade yearly for playble framerates when using RTX features. I’m not that guy that wants todo things for corps to benefit, I just have a thing for RT. some games have a fall-back for trash GPUs but to make the switch and finally start making RTGI, Standard will take a lot more than making it a choice. The iPhone has HW RT as we speak, that says a lot about where the gps market should be, some just won’t upgrade until their gpu kicks the bucket. I’m fine with some games not supporting a 1060 because it needs RT-HW but if they have a fallback in place than sweet. Games are better off with RT but it’s hard to get there with so many ways to skin the cat and variety from pointless to neck braking. One feature I can vouch is RTGI, because that will get you closes to look like Path Tracing, second RT reflections. the goal is to Ray Trace everything eventually, been waiting since around early 2010 so to see it come close in some games mean a lot to me than you and that’s fine
wait so less options is OK on Indiana Jones but it's not for ps5 pro enhanced games ?
There are more confusing options on PS 4 ++ (aka PS5 pro)😂😂😂 than any other console
That's a great point, actually.
Finally someone notices the hypocrisy.
A ps5 pro option doesn't force the entire game's art to be remade again. Braindead comparison.
Different things
No HDR with Dlss with all the money I've spent on hardware and my monitor, sounds great😢
Use RTX HDR
I'm really enjoying Indiana Jones on PC. I have a 7700x, 7900XT , 32gb ram and playing 1440p. I have everything maxed (supreme) and it runs great! I keep the frames locked at 90 and it's smooth as butter. I have a 240hz monitor but for single player games like this i cap my fps around 90-120, it's more than fine for me.
My son has a new pc with an rtx 4060 with 8gb of RAM. How future proof is this or should I start looking at upgrading in the short term?
As you can see from the video, if you don't compromise on textures and resolution, the 4060 isn't even ready for the present...
Ahhh yes, welcome to modern gaming guys! Where games have a budget of over 400 millon dollars but developers can't be bothered to bake some lights like they used to. Get ready to spend 2000 bucks on a new gpu to run the latest half assed experience with upscaled resolutions, fake frames and mandatory ray tracing.
Well said! And the fact that they never said it requires RTX until two days before launch is a slap in the face. People should be talking about that, but I'm the only one who noticed it, I guess.
There is a stupendous amount of ignorance behind this comment. You do not know what you're talking about.
Why continue to support outdated technology when RT capable cards have been on the market for 5+ years now, some of you modern gamers complaint too much, when I was young games often drop support of hardware less than 2 years old, imagine trying to run a 1995 game on 1990 hardware, or 2005 game on 2000 hardware.
Not only that but the Series S literally has an Option that’s worse that’s just not in the Game for PC Gamers. Like… why.
@@DripDripDrip69 all they have to do is add a software mode also the industry has evolved past needing to upgrade every 2 years these days gpus last atleast ten years the 1080 ti can run almost any game and has the same perf as the 3060 but these devs dont bother to support the have 400 mllion dollar budget but wont allocate 500$ for software rt This Is Stupid On top of that the 5700xt a 4 year old gpu cant run it either even though it has excellent performance
So, I installed the game on my 7700 (PBO +200Mhz CO -35), with 32GB and 3080Ti and am getting around 55-65fps with 4K/All Ultra (in the forest section in the beginning of the game), except Texture Pool Size @High (Ultra shows warning on VRAM usage) and DLSS @Quality. As soon as I touch RT, the system goes to like 5fps even in the menu section. CPU utilisation is about 20-25%. Guessing it's all about VRAM. Even on 1440p/Ultra/High with RT @medium is unplayable.
you should set textures to medium
Rt is the default in this game. You trying to turn on Path tracing and that is too much for your gpu.
The game plays great on my Xbox Series X 😁😁
I'm all for HWRT becoming the standard. It's been over 6 years and 4 generations since HWRT launched on PC, and almost 4 years since consoles had it; even my school laptop's integrated graphics support it (it's not very playable, but neither are most traditional raster games tbf). Standardizing HWRT for lighting would go a long way towards simplifying the lighting pipeline, which I expect to grant better performance and maintainability. I bet UE5 ditching SW Lumen and investing those resources into HW Lumen might speed that up, too, though Epic probably wouldn't do that as it would eat into Fortnite's existing playerbase (as if migrating to UE5 didn't already do that). Plus nearly all GPUs with HWRT have mesh shader support as far as I'm aware, so if you want to use those while you're at it, you're probably home free.
My main concerns are twofold: VRAM requirements, and pricing.
We need to keep VRAM requirements under control. We still have 8GB cards around, and if we can't ship to, say, a 4060 Ti, then it just muddies the waters on supported hardware and makes everyone's lives harder for no good reason (players and developers alike). Definitely seems like investing in an aggressive memory streaming solution is a good idea (as bad a rep as Unreal gets, something like their virtual textures seems like a step in the right direction), so I'll keep that in mind when working on my pet project game engine.
Pricing also needs to be considered. There are legitimate reasons people are still using pre-HWRT hardware (think 10 series, 1080 Ti my beloved) and legitimate reasons to not upgrade. In some countries new hardware is crazy expensive, or maybe they don't have the money, or maybe they do have the money and choose to spend it on things they deem more important... all kinds of reasons to take into account. I'm not super worried because my engine is probably not going to ship games for a few years (it's a learning and resume project right now), but I can see why I would be worried if you're, say, Epic (going back to them), who needs Fortnite to run on a lot of machines.
The Dachsjäger still has this low end rumble on his mic. Shakes my room on my big subwoofer haha.
Pro tip @DF Clips: High pass filter at 200Hz completley solves this issue. Most people's voices don't get low enough to lose detail when high pass filtering up that high anyway and it stops people's rooms from rattling or speakers from getting blown out in the process.
I play Indiana Jones through GFN Ultimate, which is 4080 and in 1080p it plays perfect on highest settings. Everything maxed and looks beautiful.
The thing is there isnt a very cheap gpu with RT acceleration, RX 6600 have a poor RT performance, the Arc B570 and B580 thank god will change that
Just buy an Xbox if that’s your budget…
all this 8GB talk, meanwhile my 3050 mobile only has 4GB... don't know how they are even allowed to have the same names...
Someone give John a five hour energy drink before they film these things.
Why John is always the excited one, when it comes to Sega and Retro stuff.
@@WheeledHamster Sure but I've seen multiple clips now where he's falling asleep when others are talking.
John is tired because John is busy making great content all the time. Give my boy a break 😂
Honestly I don't blame him. Listening to Alex talk PC stuff puts me to sleep
@@peteparker22 Maybe you are not a PC gamer but Alex is the only video game tech reviewer in the entire youtube that is genuinely passionate about PC gaming graphics. he is like a gift from heavens to PC gamers and we PC gamers should protect him at all cost.
I do not really play with ray tracing on most games anyway. So playing at ultra settings on all settings except ray tracing and download texter shader option, I'm still getting way over 60 frames up to 100 sometimes. Son able to get very smooth experience with my 4070 super . I am playing at 1440p with no DLSS or frame gen and I'm getting those numbers on frames. And a 5600 X.
Devs should label their settings with the year that they expect hardware to run it at 60FPS. That way when someone in 2024 tries to run a game using the 2030 setting we will all know it’s the user that’s the problem not the game.
😂
Crytek attempted to predict the future with Crysis and its settings. We all know how well that went.
Once I turned off Low Latency mode in the control panel it was smooth sailing. Beautiful game. Playing Path traced supreme on quality DLSS. Getting around 90 fps.
Indiana Jones looks great, but even more importantly, is more fun to play than any game in years. It's just a great video game.
Any game in years is a damn overrated opinion 😂 there are tons of gamers just released this year which are more fun like Astrobot, Metaphor, Black myth, Stellar blade 😂
If you think solving puzzles is fun just go play sudoku or you know play any modern AAA game lots of puzzles 😂
@@Teja only thing funnier is seeing a response that names all playstation associated games as proven examples of alternative fun games lol (even though metaphor isn't Playatation only game, I'm sure you don't associate xbox with it) 😅
I have a non super 2060 in my laptop. Does that mean I can't play this?
finally, i've repeated this stuff in the comments of many videos but many people seem to know nothing about hardware. The reality is that 99.9% of those who complain about this game don't want to admit (or don't understand) that their gpu is really obsolete. For the record, almost all of the 50 most used gpu (not considering the integrated ones) support hardware raytracing and allow you to play this game well, regardless of how old they are. The only "absurd" thing is the use of vram which limits many nvidia gpu, no big deal anyway since even with medium or high textures the texture pool is more than sufficient.
Amazing video, thank you Digital Foundry.
It's very promising that consoles are able to do 1080p+ with RT at 60 fps locked on a game which actually looks quite good. Also, spicy Alex is best Alex.
The lack of simulated iris for the player is pretty silly. All this fancy lighting technology but the players eye doesn’t adjust to anything.
a good hdr implementation can help
How can we know the RT in this game is enhancing visual because there's no traditional lighting fallback? for all we know it's not improving visual at all while sacrificing performance.
Game runs 60 fps on a 4 year old console with ray tracing at on average 1800p if you are a pc player with a rig less powerful than a console then why are you complaining about performance?? Upgrade or suffer. Been that way since pc gaming has existed.
Performance is fine lol
@@00-JT And what if it run at higher fps with no visual impact? watch HUB video about RT, and see majority of RT titles out there doesn't improve visual at all while tanking your framerate, if the option is that and 100fps, i'd take 100fps any day
@@stefannita3439 And how can we say for sure, sure it run fine, but what if we can run it at higher framerate by turning off RT with minimal visual impact, you know like many RT titles out there?
@@tomthomas3499 dude RT will very very soon be in every game and will be the standard get over it.
Still haven’t seen a good enough game to make me want to upgrade from my 1080 ti. Even on latest gen hardware, games are coming out completely half baked with bad optimization and requiring upscaling tech to get decent frame rates. May as well not even upgrade at all if that’s the state of pc gaming we’re in.
I have a 3080 10GB and as long as I keep the textures around medium, I'm getting well over 60 fps at 1440P with many other settings set to high. Looks amazing and plays great. I had no expectations of this game and only downloaded it since it's on PC Gamepass. Nvidia really nerfed the original 3080 as it has the horsepower to run pretty much all modern games at 1440P high settings but the Vram is holding it back from its full potential. I'm conflicted on forcing RT as in this game, it does look great and runs pretty ok, but I feel bad for the majority of gamers who still don't have an RT capable GPU. That said, to play devil's advocate, we are heading into the 4th gen of RT capable cards and the devs can't wait forever to push the industry forward with making RT implementations standard.
However, I'm a few hours in and actually enjoying it despite never watching a single film in the franchise. Maybe I'll actually watch them now.
Watch the movies after it 😂
@@TruthHurts1519 Yeah I'm most likely going to do that. Besides, none of the streaming platforms I'm currently subscribed to have them available anyway.
Ive got a 3080. I put all everything to high 1440 dlss and locked it to 72 fps (half my screen 144). Its been fine.
Im just glad i can just barely squeak out running this game on a 2060.. it runs pretty well too. It supposed to not even boot without a 2090 super
After they finished this video, nvidia gave them their new 5000 series cards...😂
Nvidia be like : Good boy here's your candy
That's why they are constantly testing on and mention Nvidia cards that are several generations old and only available used at this point (which means that the other big N isn't seeing a penny from people buying those).
In other words: Keep your silly conspiracy theories to yourself.
indeed, Nvidia have been caught red handed before infiltrating enthusiast communities with PR firms and handing out "free samples" for "benchmarking" in exchange for promotion of their products. legitimately, Nvidia cannot be trusted with a single thing they say. not that you should trust any corporation, but Nvidia deserves the extra skepticism given their extremely shady history.
@@no1DdC ... you're conveniently leaving out the fact they ONLY test on Nvidia cards. no ARC, no RX. the only time i've ever seen an RX card in a DF video is Vs consoles, or Vs Nvidia in an obviously stacked test like Xenia emulation.
@@anarchicnerd666 AMD only has a GPU market share of 12% at this point - and dropping. Intel's is so low, it's not even measurable. That's why they stopped mentioning those - it's just isn't worth it.
I wish that AMD was stronger too, since nobody benefits from an Nvidia monopoly other than Nvidia, but they've unfortunately dropped the ball pretty hard, at least in this area. Their CPUs are doing rather fine - and if you've noticed, DF is routinely using those.
What's "obviously stacked" about the Xenia emulation test?
I don't agree, but it's mostly because i'm more interested in getting a game running well than getting it running well on the latest hardware, i don't understand how we can simultaneously praise a technology like DLSS for getting people more frames while praising something that demands people spend hundreds potentially thousands on new hardware, people did that for hallmark games like Crysis, not lighting.
On the dev side sure, rendering lighting is a heavy process and on the fly RT makes it easier for devs during the art phase of development but i see two problems, we're passing more of the art of lighting to technology than the artist leading to similar looking games, and AAA has a quality problem, maybe not with Indiana Jones but passing a lack of optimisation onto the player has never been cool and still isn't cool.
Bro, why did you think people did that for Crysis if not graphics?
@@SilverEye91 People bought hardware for Crysis because of the whole package, but most people can't tell the difference between RT and non RT side by side.
@@cikame That's not true at all.
The part about passing the art of lighting to technology is ridiculous, especially in regards to this game, that takes a lot of creative liberties when it comes to its lighting, trying to mimic the look of the movies. The dynamic lighting is what makes this possible in the first place.
@@Letucen_nik I'm so sick of non 3d developers acting like they know what the fuck they're talking about.
Regardless of the additional work and greater scope of support, I don't see why a baked model can't be offered, even if it comes later. Modders would be able to do it. It may be good for advancing the ubiquity of RT and expediting the development process to meet deadlines, but since PC is all about pro-consumer choice, scalability and lowest common denominator spec configurations etc then it only makes business sense to be as robust as possible.
They will definitely want to expand their reach as long term, full price sales begin to stagnate and I wouldn't be surprised if a Switch 2 miracle port comes along that completely throws any notion of "just upgrade your PC to play the latest and greatest games" out the window. So there's absolutely financial incentive to do so.
i think this is quite elitist and simply not true, because there is a large fan-base that does not get to play it; a lot of people would play it not only for the cutting edge graphics
my GF is having to play it on my PC because her GPU does not support RT; the point being that she would prefer to just play the game, rather than having better graphics; that said i know that there are development decisions that impact this and Cyberpunk does that in a way that i find better for this generation, that game supports both old-school methods aswell as top of the line RT
did you watch the video at all?
cyberpunk came out 4 years ago, this is a new game. raytracing is the standard now
True Let them alienate all people without RT with 8GB VRAM or less. Then they'll cry when game underperforms
@@DragonOfTheMortalKombat The RTX cards came out in 2018 I believe. If you haven't upgraded yet, how can you expect to play a AAA 2024 game?
That's like someone asking Rockstar to make GTA 6 and asking Insomniac to make Wolverine run on the PS4 from 2013 or PS4 pro from 2016. And the games will come out in 2025-2026
And it's as simple as, if you have not bought a PS5 after 4 years, you can't expect to play the latest games.
i don't have a problem with the graphics getting better with time nor i think that a PC from years ago would run 2024 games at ultra, my point is that it's disapointing to see that there are no options for non RT cards, it's not something that most people can play and then you get extra goodies with RT; correct me if i'm wrong but all the games from the best graphics of 2023 had non RT settings; more options are almost always better and this case is one of them
If you thought hardware based RT requirements weren't coming for games, you were mistaken. The entire point of RT has been to replace prebaked lighting/shadows, SSR, and SSAO. A 16gb RTX 3060 in nearly 2025 is not an expensive card anymore. If you're a budget gamer, you'll be fine with 2nd gen RT cores.
It only comes with 12 GB at most I believe
Indiana Jones is surprisingly easy to run even on an RDNA2 GPU which let's face it have the weakest support for Hardware Ray Tracing so it was really nice to see it maintain above 60 FPS in places like the Vatican Daytime where an explorable city with many NPCs. It was a really breath of fresh air that there is a new game not using Unreal Engine 5 and its fast on a 6-Core 5600 no X CPU and now considered a budget GPU entry RX 6700 XT with the likes of the NEXGEN GPUs next year.
Doom 2016 could even run on a Q9550 Quad Core + RX 460 4GB GPU, so that's an indicator how good Id Engine has been lately.
No CPU bottlenecks with 5600? I have a 11900K + 6800 XT and wonder if I can run 1440p60...
Similar setup here, but a 6650XT instead. Over the last little while it’s been clear that I was on borrowed time and it’s time for a proper upgrade all round. I could push on with the 5600 for a while yet, but I’m done with RDNA2 at this point. Will wait until the new hardware in January and make decisions then. Putting off playing this until then.
@@radosuaf If you check my videos, you should see I uploaded a video of the game running it in the Vatican City even going in busy areas with lots of NPCs.
Should give you an idea showing full metric in Id Engine's metrics.
@@VesiustheBoneCruncher Yeah, at this point moving forward its best you wait for NEXGEN options especially with RDNA4 or even that B580 if its any good.
8GB VRAM is really a bottleneck.
@@evergaolbird The game runs on a Series S with RDNA2 architecture.
So there should be settings to get a similar experience on a Ryzen 5600 with 16 Gigs of RAM and a RX 5700 XT 8 GB card.
I know the hardware is old ... so is Series S. But the consoles show that the game and the engine is capable to run on low end machines.
The vram issue is more of a concern for me than the mandatory RT. This is more or less nvidias fault for being deliberately stingy. I do find it ridiculous that my 6GB RTX 4050 can't even launch the game let alone play it on low settings
You could also take responsibility for *your decision to buy a GPU with 6GB* when there were surely other options. You still have a great card. It just can't play *everything* but I'd say you shouldn't expect it to
Man so many haters in this comment section. This channel has always focused on high-end graphics and alex said it the best, if you want to play the LATEST AND GREATEST than your going to need a gpu from the last 2-4 years. Yes people aren't made of money (i only have a 4090 becuase i am a smd solder technician and bought mine broken and fixed it). This is something that digital foundy literally called out in a video recently, the cost of gpus is way to high. They even call out the lack of vram on 40 series... That is not being a shill. Stop getting angry because you cant afford a gpu and play the 1000s of games without rt that are fantastic.
After 7 years of RT GPUs the whining is kind of stupid. They've been around for a long time now and we've always known this was coming...IE more titles needing RT...
Yeah but I want to run Cyberpunk 2077 on my PS2 its NOT FAIR!!! 😢
yeah like literally i have a laptop with a gpu of 4gb vram and i know that this gpu is old and yk i have many other games thta i can play with my gpu and cpu literally stop complaining that your old gpu can't run there are other games than this one that your gpu or pc can run and play smoothly
This games 1080p is insanely good. I detect zero TAA blur.
How easy it's to talk when companies send you all the hardware,more personalization on the graphical settings and more optimization
8 years, EIGHT YEARS
Ok but at some point games like this had to start coming out because this is very much a large part of the point of hardware support for ray tracing. The fact that it's so commonplace now means that developers can lean on it to make better games with less restricted cinematography and more resources put into other things because they don't have to spend ages faking lighting. It sucks that that means some people can't play the game but it would suck more if we couldn't have had the game (at the same quality level because resources that went into making a great game would've had to be put instead into all the work that rasterised lighting requires).
I see you've never watched a DF performance and optimization review before. And developers don't custom-configure and program hardware and graphics engines for individual people. Try to repair your attention span and watch the actual channel for context, not just podcast clips.
Right? They live in first world countries, the get the hardware for free and say that it is a good thing rt required when it is a technology you only see differences when comparing side by side...
@@MaxIronsThird right, ppl act like this come out of nowhere
we need sli again and cards only taking up 2 slots then we wouldn't have to rely on using upscaling and dlss or at least pure raw performance increase for gaming bruteforce
I love that they bring up that your gpu used to go completely out of date in 3 years. From best gpu, to not meeting minimum specs. 6 years is plenty of time to keep up with the hobby
It's a dumb point because graphics tech also used to advance rapidly in the space of 3 years. The last few years nothing has changed at all. New games coming out don't look much better than the old ones.
I'm confused Indiana Jones released on December 9th.That was yesterday.How did the release last week
Preorder
The first raytracing cards came into the consumer space in 2018. If you want to play the latest games you know that as a PC gamer you have to upgrade your system every 3 to 4 years. If you haven't upgraded to a raytracing capable card by now that's on you. At this point it's not unreasonable for developers to expect users to have some level of raytracing support. The fallback options that have existed up until this point take way longer to program than a raytraced solution and it's just not worth the hassle for developers. Indy also actually runs well with it's mandated raytracing on the worst raytracing card a base 2060. It's doing about 50fps at 1080p native on low settings.
I have an RT capable card (3080) and always disable RT features with the exception of things like DLSS as it's usually not worth the hit in performance, and I know for a fact a lot of others do the exact same thing. Making RT a requirement is ridiculous, especially this early in RT's life where most devs still don't know how to implement it properly without breaking something else.
Some things are inevitable. A major new hardware feature becoming a game requirement, especially years after it was found in most consoles. It's a question of when rather than if. Also inevitable is the whining from people with aging hardware. I can recall these kinds of complaints all the way back to when games first began requiring support for VGA and had no support for EGA and Tandy color modes. Then it was audio cards. Then 3D hardware acceleration. More recently it was SSD storage.
If you can't afford a decent PC then just buy a console.. Got a 3070ti last year and my PS5 been collecting dust ever since literally.
Well if you were gonna use the PC for mostly gaming and can't afford something "decent", then yeah I agree
HDR doesn’t work on PC if you enable DLSS? I must not have noticed I thought it looked great.
It is a good thing. It means fewer and fewer people will be playing games because of absurd cost involved (on PC), so NVIDIA shilling sites will go out of business.
On more serious note, I can't think of any reason why exclusiveness is good for anyone. It applies to single console games and RT-only games all the same.
Sorry buy newer graphics card RT hardware is like 6 years or older now.
I did pc gaming in the early 2000 and you need a new graphic card every 2 years or you couldn't play the latest games because your card didn't support the latest pixel/vertex shader.
In this case it's reasonable because the GPUs are old and affordable but it could become a very slippery slope as the mins get higher and the GPUs only get more expensive.
RT games are not ab exclusive thing. Not having the requirements is not the same as not being able to play it *at all*.
People had ample warning to understand RT was going to become a big thing. Back in the day, a new shader model would make your card useless.
The *vast* majority of games don't need RT, but if you want to play those shiney new, high fidelity games, you will need a shiney high fidelity card.
@@nulian Sounds like you’re defending bad market practices because you had to endure worse in the past. Makes you sound pretty unintelligent.
@@nulian yeah but in early 2000s, every year was a massive jump in texture quality and overall graphical fidelity, which ray tracing simply isn't, that's why people are mad.