Same here, I got my 7900xt a year ago and I expect it will probably last me 5 years before I even think about upgrading. Probably even longer than that. By the time I really need to upgrade to keep up with current games, I can just drop a better chip into AM5 and swap out the graphics card and I'll get a ton of mileage out of my machine.
As someone who bought a laptop with an rtx 3080 16gb in 2021, when there was an 8gb 3080, I remember people online kept saying you don't need the 16gb vram, that gpu can't do 4k anyway. Joke's on them. I use my laptop with an LG C4 and every day I thank myself for going for the 16gb vram variant.
At the end of the day you can have the best of the best and it means fack all if the game devs put out unfinished/unoptimized games, then scramble after release to patch and fix major issues. Look at Jedi Survivor: great game, but it has a lot of issues, and no matter what hardware you have, the issues won't change.
I played through it recently on 4080S. Played pretty decently but kept having to disable the RT in certain sections cos it would just crash the game. Thought it was my overclocks but same thing happened after disabling them and other games are fine including RT heavy Alan wake 2.
Instead of reviewers saying buy this card to play this game, they should be screaming at game developers to stop putting out unoptimized games. All the "get this card" talk is just rewarding lazy game developers.
I enjoyed the video. I've used AMD video cards for the past 10 to 15 years. They have always been in my mind a better investment than the nvidia equivalent. Earlier this year I upgraded to the 7900xt from the 5700xt and I've been enjoying it. I plan to keep it for a while.
The 4070 and 4070 Super were VRAM deprived from the start. So were the 3070 and 3080, and I said so when they were announced. The performance of all these GPUs is quite good until you're out of VRAM, but out is out. Texture popping is way worse than giving up some ray tracing performance, especially given how distracting and overblown ray tracing usually looks.
The good thing I found out when going for a mid-range GPU instead of high end: I have more money left to buy games. High-end GPUs probably give more FPS at higher detail, but only if you have an equivalently high quality monitor to display it. At the end of the day, what's the point of having a high-end gaming PC if you can't comfortably afford every game you want to play?
@nikhilnischay3955 I built a whole new system, I got a 9700X. So far it's been perfectly capable of keeping up with the GPU and all I did was enable PBO, didn't do any manual overclocking.
@@USAFWolf Hey, thanks for the reply. I've been eyeing the 9700x for a couple of weeks now, especially since the price for it has dropped and normalized around $320. I have an 8000 Hz polling mouse which I use on Halo Infinite and The Finals, and the CPU usage on my 5800X3D skyrockets into the 90-100% range. When playing single player games like Banishers: Ghosts of New Eden, Warhammer Space Marine 2, and Control there are times when GPU usage drops to the 80-90% range, sometimes lower when you're in a village/city in Banishers and in Horizon Zero Dawn Remastered and Forbidden West. And in NFS Heat CPU usage is always sitting in the 90% range. I didn't think I would need to upgrade a 5800X3D, but using an 8 kHz polling rate mouse and turning on frame gen in single player games the CPU usage goes up 😵💫 I'm just going to get the 9700x after the motherboard and RAM prices go down a bit and call it a day. The 5800X3D is great at upping the 1% lows, but the CPU usage sitting in the 80-95% range is terrifying.
So many YouTubers and reviews are pushing Nvidia. It was really hard to find user feedback on prebuilts with AMD cards with Windows 10 EOL, Black Friday timing, and 2025 tariffs looming. I bought one with a 7900GRE because $600 is about the limit I can spare for a graphics card. The real need and priority was getting a Windows 11 compatible desktop for the household; the Windows 10 EOL made me reenter PC gaming slightly earlier than expected. Nobody in the household was complaining about the Series X except for drifting/broken controllers.
@@frogboyx1Gaming Thanks. I'm glad I found one with no bloatware, within budget, and in time for the Windows 10 EOL/tariffs. It even helps that the household mouse, keyboards, and cables carry over. We just need an external drive. The gaming monitor is about the only thing that can be considered unnecessary or a luxury. Skipping a few dining-out or takeout orders should just about cover the cost.
@@Silver-h4m I’ve heard something similar to that and I hope it’s true. I’m probably wrong but I don’t see the 5080 being a very successful card for NVIDIA in 2025. I’ll just have to hold on to my good old 2080 TI until then.
I was quite shocked tonight to see that when I turned DLSS from performance to balanced on my 4080S, my FPS in Indy at 4K with full RT and frame gen tanked from mid seventies to single figures. Thought the game was crashing then saw the message in the options screen saying I didn't have enough VRAM. Fortunately I'm happy enough to play at 4K performance but was quite surprised to see it's really on the edge of VRAM usage even at 4K. This is after upgrading from a 10gb 3080 which also became VRAM starved long before it should have. I doubt I'm gonna chase the highest fidelity in the future with Nvidia taking advantage of their market dominance to charge the earth for just one part of a gaming PC. Hopefully AMD can step up and provide decently priced and specced mid range performance and Intel can join them.
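The jump described above from DLSS performance to balanced is bigger than it sounds, because each mode renders internally at a fraction of the output resolution before upscaling. A rough sketch of the math, assuming the commonly cited per-axis scale factors (2/3 for quality, 0.58 for balanced, 0.50 for performance; actual games may vary):

```python
# Internal render resolution per DLSS mode, before upscaling to the output.
# The per-axis scale factors below are the commonly cited defaults (assumption).

SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.50}

def render_res(out_w, out_h, mode):
    # Scale each axis by the mode's factor and round to whole pixels.
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

perf = render_res(3840, 2160, "performance")  # (1920, 1080)
bal = render_res(3840, 2160, "balanced")      # (2227, 1253)
extra_pixels = bal[0] * bal[1] - perf[0] * perf[1]
```

At 4K that's roughly 35% more pixels per frame for balanced than performance, which is why a card already near its VRAM limit can fall off a cliff when stepping up one mode.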
Thanks for this great video. I personally don't play high-frame-rate games, but I have always found the AMD cards the best value for my needs. I was finally able to get an MSI Radeon RX-7900XTX Gaming Trio Classic and expect to use it for a few more years. Please keep making these great videos.
I feel Nvidia purposely shorted the RAM on their cards in an attempt to get people to purchase another card sooner than they expected to have to. As for AMD? I'm still rocking an RX 590 in many games with a min spec of a 5700XT, and I'm doing it at 1440p. I haven't purchased the Indiana Jones game yet, but I'm able to play Starfield, DD2, the Final Fantasy XVI demo, & Stalker 2. I pay little attention to FPS. I install a game! If it plays, I'm good! If it doesn't, I uninstall it & play one that I can.
I bought the RX 7600 because it was all I could afford in my country. I just want to play the games, I don't worry about all the fancy graphics. It's about the game play for me.
I bought the 7800xt 2 months ago and it's a banger. 16gb vram is really the minimum I would go for at 1440p, seeing that my 7800xt uses over 12gb in some games at 1440p.
What do you think about future games putting raytracing as a requirement? I’ve been outta the hardware loop for a long time but do amd cards have ray tracing?
It’s more than simply better ray tracing and upscaling…it’s also power draw for me. I was very much looking to purchase a 7900xtx, but when I looked at the idle power draw I had to pass. That shit adds up on your electric bill, and I also don’t want my room feeling like a sauna. The 6800xt and 6950xt were awesome but had my room hot and my bill high.
While watching this video, my rx7900XTX is drawing 22W on a dual monitor setup. I have to say it cracks me up when people drool over $800-1000 GPUs but have to pass because they can't afford the electricity. I have 3 setups running all the time: one with a rx6950XT, one with a rx7900XT and one with a rx7900XTX. I can not say that I notice them on the electricity bill.
@@dividedpersona And $800-1000 GPUs are not cheap GPUs. If you can afford that, you can afford the electricity they draw; if you can't, then you can't afford the GPU. I live in the cold part of the world. Do you think my fluctuating power draw to heat my home uses less power than a couple of PCs??? It does not.
@@Audiosan79 brother.... energy costs depend on multiple variables. Your costs do not reflect those of the majority of the world, or even of your own local city. Plus I use my PCs for more than gaming; as the total number of PCs I have scales up, the operational costs do as well. Y'all really need to get your heads out of your asses.
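The idle-draw argument in this exchange is easy to put numbers on. A quick sketch, where the wattages, daily hours, and the $0.30/kWh rate are all assumptions you'd swap for your own values:

```python
# Rough yearly electricity cost of a GPU's idle draw.
# All inputs are assumptions; plug in your local rate and usage.

def yearly_cost(watts, hours_per_day=8.0, price_per_kwh=0.30):
    # watts -> kW, times hours per year, times price per kWh
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

low_idle = yearly_cost(22)    # ~$19/year at 22 W idle
high_idle = yearly_cost(100)  # ~$88/year at 100 W idle
difference = high_idle - low_idle
```

So both sides have a point: the difference is real but modest for one PC at cheap rates, and it scales with the number of machines and the local price per kWh.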
Nvidia nerfs their cards every generation, yet their fanboys are in denial. AMD is the mid tier king. Intel starting to look interesting on the low end now. Should be able to get 5-7 years out of a card.
You remind me of my old mechanic friend who was a pro Chrysler/Dodge Mopar man and went on and on about the Hemi engine vs Ford/GM engines being not that good, heh. I am not 100% sure about the future, but Intel's B580 might win the low end market, and I think you have a point about the mid range market with AMD. Then you have that new RTX 5090 coming out with 32 gigs, but the 5080 will have 16GB. I am still doing fine with my 7800xt; it's good enough for my needs.
Yup, the Nvidia bros overpay by over $200 and want to make excuses and deflect to raytracing or pathtracing, not to mention the 4090 can't pathtrace properly and upscaling is necessary.
For better image quality, the option to path trace, and a win every time RT is on? Stop it. AMD is fine and great value, but for the upscaler and RT... Nvidia any day, all day.
@@c523jw7 the only thing Nvidia really has over AMD is AI, and that's not hard for AMD to implement. Even Intel's XeSS is going to overtake DLSS once they get better fps.
You say "deflect to RT" cus you know your card can't RT. You're coping. At the end of the day if you bought a 4080 or 90 you could have bought the best AMD card... we skipped it for a reason 😂. What's your setup man? @steven7297
After owning my GTX 1070 for 8.5 years, I'm finally going to upgrade when AMD & NVIDIA release their new cards in January. I'm leaning heavily towards RDNA 4 but will consider the 5070 Ti too. I think the 8800 XT will either be a slightly better 4070 Ti Super for $500 or a 4080 for $600. I doubt they'll provide 4080 level performance but we'll see. Gotta say, I'm proud of how well my 1070 has held up. I was able to max everything out at 1080p for so long. Now I have to dial settings back to mostly medium on newer games for a high refresh rate experience. The games still look respectable, honestly. I just want to go back to maxing everything out & I'm also interested in trying ray tracing. The NVIDIA 10 series was truly legendary...
I'll be honest man. Your 1070 will get shat on by every other midrange gpu thats currently out. If you bought a 5060 you would feel a massive difference despite it being a mid tier card. Stick with a 70 series card from either nvidia or AMD I tend to feel like it's a good balance between future proof and playing games at high to ultra setting @1080p + 1440P with no issues.
It depends on the resolution you are playing at and the games you like. If you are a competitive gamer, I guess you don't need ray tracing, since you will lower settings to get more fps. But for someone like me who likes graphics, plays on a QD-OLED monitor and plays single player games, I'd rather have the Nvidia just for the ray tracing and the better upscaler. But if you get AMD it is good; I had a 6800 xt and it ran games like Call of Duty and other fps games beautifully. Great cards.
Great video! Budget-conscious gamers need to hear this more (and let's be honest here, majority of us gamers don't want to spend more than we really have to). AMD deserves more love from gamers. I also support Intel especially with their delivery of a value product, the Intel B580 card.
Look at the price of the XTX. Honestly I think now, with it being the best 24GB GPU on the market behind a 4090, it's probably a nice grab for someone. That's why I'm keeping mine, because I love it, & if I do run into vram limits I'm covered in that regard. Next generation is offering us fuck all apart from a 5090 32GB. AMD, God knows what they're doing after an 8800XT 16GB.
Have my new Sapphire Pulse 7900xt sitting unopened, just waiting for CES25 to see if the 8800xt really is as good as the rumors say. Hopefully it's clearly as good as the rumors, or clearly worse than the 7900xt, cause I want the card for some longevity, and the question is whether the extra 4gb or the better raytracing gives me more of it. Hopefully it's an easy decision lol. This is my first PC btw; I'm using my Series X to play games atm, so it's not something where I absolutely need it now. Can wait another month or two if needed.
You can't just look at the price alone and then decide what to buy. I look at the price and the full list of features the GPUs are giving us, and then decide if it's worth buying. I'll buy NVIDIA GPUs because they offer many features the AMD GPUs don't. It doesn't help to have slightly more VRAM and a slightly lower price with an AMD GPU then. And now with the RTX 5000-series GPUs, NVIDIA is going to give us even more new features that will be pretty good to have. So, the RTX 5070 Ti is going to be my next GPU.
This is the first time I got recommended a video from this channel, but I've been in both situations over the years... those where it was really important you prioritized vram and those where it didn't really matter much. Admittedly being stuck on the PS3 console generation for so long, with 256MB of vram, probably helped hold things back for a while. 🤣 I feel like vram doesn't cost enough for Nvidia to always fall on the low vram side though. 🤔
I have a 4090 so I'm not really focused on the numbers at all. We all know 8GB is pretty crappy now, but I guess it would be sort of interesting to see performance differences at 1440p between 12GB and 16GB. I haven't played Indiana Jones, but I assume 12GB still generally holds up if I had to guess? There's also a difference between a card allocating more than 12GB of memory when it's available and the game actually running into framerate/stuttering issues when only 12GB is available.
I am building a Threadripper system for creative and storage purposes. I had to switch to a water cooled GPU so that it would fit in the case and prioritize the PCIe 5 NVMe adapters; the one I already had was a 4090. My 7900 XTX was too large and airflow would be blocked. Despite none of my AMD cards ever having a problem with my Samsung G8 240 Hz 4K monitor, the 4090 is a total trainwreck, simply doesn't work sometimes. Most of the reddit threads put the blame on Samsung. Also, the color and HDR looked like crap. If my Dell monitor has trouble, I will sell it and get one of ASRock's blower-style creator 7900 XTXs and sell the 4090. AMD is simply more compatible and has better graphical fidelity.
This is confusing: yes, the 12gb of VRAM on the 4070 Super is right at the edge of what's acceptable, but the game he's using as an example, Indiana Jones & the Great Circle, performs better on the 4070 Super than it does on the GRE, RT on or off. That said, the 4070 Super for 600 bucks compared to the GRE at nearly a flat 500 is a better deal.
I'm actually pretty happy with my nitro pure 7800xt paired with a 7800x3d, which I bought a year ago for 580 euro after watching a couple of your videos :) I actually played the new Indiana game on ultra at 80-100fps+ without AFMF2, and that card runs really cool (at 100% usage, 55C with a 71C hotspot). Nvidia gpus were/are just too expensive in central Europe.
I actually intend on going AMD next gen. The only thing that makes me second-guess is seeing every game made with UE5, which has baked-in ray tracing you can't turn off (I would if I could; I think it's a stupid gimmick that costs too much performance for some reflections). Let's see how the new generation reviews.
Some games don't really look good with RT. But some games look magical with RT. The Witcher 3 being one. The difference between RT off and on is amazing.
Yes, because of the new 8gb patch that allows better compatibility on 8gb cards, but at what cost 🤨 I'm sure someone will be along soon with a comparison video of hidden low textures or pop-in when you turn around in the game etc.
7900 XT Merc here, couldn't care less about RT. Feels like a marketing ploy more than anything. I just can't be bothered to pay for the equivalent GPU from Nvidia that costs double. I have a friend who may ditch his 4090 for a 5090; this is the only way I'll be jumping ship, because I know I can get a fair price on the 4090 from my friend. I just won't pay retail for Nvidia. My EVGA 3060 XC in my work PC was $200 used; I repasted the card and it's been great. I would never have paid for this thing new. I really wanted to try out Nvidia, that's why I bought that one. FSR: it looks worse than no FSR in a lot of games with the 7900XT for me. DLSS: looks sharp on the 3060 and makes a meaningful improvement with the lower end card. Maybe a nice Nvidia card would change my opinion. I would consider a 3090 if people would lay off the pipe with the $700 used pricing. I think both cards are great for me and what I play. I don't see this 7900XT leaving my hands anytime soon.
At what point are we going to start holding these game developers accountable for just being lazy and not optimizing their games? Most of this is what the problem is.
Certainly games are coming out that lean too hard on upscaling technology to cover for their coding, but there is a limit to how much optimization can be done. Minimum vram requirements are gonna increase across all resolutions, and Nvidia has been happy to skimp on it.
Looking forward to the 8800xt. Thinking of taking the dive and upgrading my 5700xt. Of course the motherboard and CPU will follow; that's this year's project. People that constantly want high end will get over that high quickly. I still run my 5700xt with satisfaction for the few games I play. Do I need Indiana Jones with all the ray tracing? Don't think so. And I run my Elite Dangerous at 1440p easily.
RT and PT are both fluff. They barely make a difference in most cases. Only people who have been educated on what the differences are, and are actively looking for them, will notice when a game has RT or PT. RT and PT won't matter until they're easily achievable on consoles and $300-$500 graphics cards. I think the best bang-for-buck buy of this graphics card cycle was the 7900xt. It gets me 4k 60 in most games and at least 1440p 120 in my fps games with high settings.
I upgraded from a 1070 TI to a 7900 XT. The price point was way too good to pass up & I don't care for ray tracing. Nvidia have lost their mind trying to sell us 8GB cards & trying to get me to sell my organs to buy a good GPU from them. I'm just not putting up with it. I hope AMD & Intel change the GPU market & make Nvidia stop price gouging.
That's what Daniel Owen is saying: if you are buying at around $700+, Nvidia makes sense. That's the 4070 Ti Super, 4080, 4080 Super and 4090. But Daniel also mentioned not to buy the 4080 and 4090 since the 5090 and 5080 are almost out.
Recently scored a 7900XTX for just over $700 and honestly? I'm probably going to keep going with AMD or Intel GPUs moving forwards. Nvidia is getting way too expensive for minimal performance boosts
It's really good to try new things, my friend. I noticed I actually use more AMD features than I would with Nvidia. Once you truly see that AMD does a great job, it really makes you wonder why everyone keeps recommending Nvidia when AMD offers great performance at a lower cost with more VRAM.
@@frogboyx1Gaming One of the things I've noticed when gaming on my main rig or my secondary rig that uses a 6600XT is the AMD Adrenaline software is much more user friendly than some of Nvidia's features. Having an easy way to tune your cpu and gpu is a big plus in my book. And yeah, being able to get a 24gb card for nearly half the price of an 80 class card with similar performance should be a no brainer! Thank you for posting this video and spreading more awareness about this!
No brand can convince me to switch from AMD unless they have a feature equivalent to AFMF2 that can be used on ALL games without needing game developer implementation. AFMF2 is just too good.
They don't even sell AMD cards here currently, all out of stock. Even the Nvidia cards are mostly out of stock, just some 4060 / 4070 cards left that people are avoiding because of the 8gb vram limitations. Most shops are just selling through remaining stock before all the new launches. Currently I have a 3080 10gb and all the games I play run great, even newer games. I am interested to see how the new AMD cards perform. Hopefully the rumours of them being about 4080 Super level, and not priced outrageously like the Nvidia 50 series cards will be, actually happen. AMD are great at botching launch pricing. I wouldn't mind a decent card with 16gb+ vram; I don't care who makes it.
AMD has been better for years now at every bracket except the 3090 and 4090. My friends who listened to me and bought a 5700xt have been playing until recently, while the poor fellas who bit for Nvidia and the 2060 got crushed in two years. Same with the guys who paid 1000 euros for a 3070, 3070ti or even a 3080 (8-10gb) because those were "better at RT". Everyone who bought a 6800 or 6800xt has been playing non stop for years, and the Nvidia owners have been getting slayed since Hogwarts Legacy and RE4. Heck, even the 6700xt is now better in Indiana Jones than those Nvidia 3080s which cost 3 times as much. And now, for the third time in a row, people are buying 8-12gb Nvidia cards for a kidney when they could get a 7800xt and above. People don't learn, or they apparently want to pay more for a card just to use it for 2 years and be forced into upgrading, while the other option lasts twice as long or even more.
Funny to think that almost all AMD cards are better even at Ray Tracing than their Nvidia counterparts two years after release😂 (excluding the x90 series)
I saw one YT short of a guy saying AMD bros should be worried about these mandatory RT games but i completed Indiana Jones and the Great Circle on the 7800xt/5700x3d and it was a great experience at 1800p ultra with dynamic resolution enabled targeting 70 fps (i went 1800p dynamic over 1440p native cause im on 4k tv and want more sharpness/clarity) besides one small temple area on Sukhothai where it dipped to 50s. Maybe we should be worried about other engines but still. at least we know Doom The Dark Ages is gonna run like butter
you even game man? happens with a lot of games. they use way too much Vram at launch, then it quickly gets "fixed"... that being said, 12gb will soon not be enough
There is a reason AMD cards are cheaper. Always go with the Nvidia card. Can you run the game with a 7800xt or a 7900xt with path tracing on? Not everyone is genuine when they suggest particular hardware.
Great call Frog! I mean, I posted that 12GB wasn't enough weeks ago & look at the hate I got for it 🤣 I can see shit coming man! 16GB I think might be ok for a while yet, but Indiana has really tested the 4080 Super. After some updates you can now use the Ultra texture pool on the 4080 Super, which is better. Like I said, they totally fucked up texture pool sizes at launch, and with Ray Reconstruction coming in a future update, that will reduce vram even further. So a 4080 Super 16GB is still fine, but 12GB GPUs won't be able to max out the game at all. I hate the fact that Nvidia & even AMD will continue to sell 8GB GPUs; it's crazy when they will be DOA.
@@robertmyers6488 With a 7900XT you can enable FG & not worry about vram running out. I've run into a few games now that cause the 4080 Super 16GB to run out of vram lol, so I had to disable FG to keep under the limit. That's at 4K.
AMD has always been better at longevity than Nvidia; Nvidia cards will need an upgrade sooner, it's always been that way, and that's why I switched to AMD. But honestly only the 4090 does raytracing well. I don't like having to game under 60fps and having to use upscaling, which lowers the resolution I game at. So I turn it off and play at buttery smooth FPS.
Most definitely. With Nvidia you are more than likely to be locked out of the next gen tech features too; they do that every generation to get ppl to upgrade. With AMD I think you don't feel the same pressure to upgrade every generation with how they work. So I do think, as I've always said, for longevity & bang for your buck, buy AMD.
Not really. DLSS upscaling still works beautifully on 2018 NVIDIA cards. AMD still doesn't have a proper answer for DLSS and when they do lots of the old hardware will be locked out.
@@Phil_529 I'm not as bothered about DLSS. I'd rather have more vram so I can game at 4K, & AMD gives you that. A lot of ppl don't know this, but at 4K I've found even the 4080 Super runs out of vram if I'm using FG in games like Stalker 2 and Veilguard. Massive FPS drop. It can even happen in Alan Wake 2; although it was fine when I played it, it did alter my settings once at the start & I wondered why. So that's not good at all imo. The 5080 16GB, I'd have massive concerns about that GPU already.
@@RJTHEGAME Bruh, you're just making stuff up. Veilguard uses 13GB at 4K maxed with RT and FG, and STALKER 2 uses 11GB. You aren't gaming at 4K max without upscaling in modern demanding games. In STALKER 2 the 7900XTX only averages 45fps at 4K native; the 4090 gets 61fps.
@@Phil_529 Lol, it uses more than 13GB at 4K once you put FG on and use fade focused for texture sizes, even with DLSS quality on. I ain't making stuff up at all. I don't care what the 4090 does; this isn't some competition between an £800 GPU and one that is costing over 2 grand atm.
Much less so the 7800XT. People are rightfully ripping into Ngreedia for nonexistent generational upgrades, but if you look at how little performance this card gains over the 6800XT, or how much of a flop RX 7000 is in general, you'll learn Intel is the only honest, pro-customer GPU provider right now.
I remember telling you to buy the 7900xtx and you laughed and said "I would never pay over $1k for a GPU". Then you proceeded to buy a 4070 LOL!! And you complained the Nvidia GPU was too weak.
Great Circle is a horribly unoptimized game; they don't even have a non-RT fallback for older cards. As much as 12GB sucks, what is even worse is an unusable upscaler at QHD and RT capability 1.5 gens behind, which affects UE5 too.
Indiana Jones has settings of High, Ultra, Very Ultra and Supreme. On Supreme at 4K I am getting 60+ FPS with my 7800XT, and with my $850.00 4070 Ti it's 1440p ultra settings. Don't buy a 12 GB graphics card unless it's Intel's $250.00 B580, and even then don't think it's a 4K card.
DLSS is just too good to pass up; games look the same as or better than native in plenty of AAA titles, and nothing comes even close in terms of fidelity... if you have over $400 to spend, Nvidia is the only option.
Guys, mark my words, the FOMO is going to get him when a 5070 Ti is beating his card and he has no upgrade path. He's going to crack. He already has second-hand FOMO from the Witcher trailer lol.
We are at the end of the generation... you found 1 game the 4070 Super may struggle with. The 4070 Super can beat the 7800 and 7900 GRE in any PT game and heavy RT game to date. Can the 7800, or the 7900xtx for that matter, even TRY to run path tracing in this game?? The answer is NO. It's not like this game runs better with PT on the AMD cards. Come on, Frog.
But that's what I'm saying: the AMD cards aren't beating the 4070 Super at PT at 1440p. I think for non-RT the 7800 and GRE are probably better investments overall @@frogboyx1Gaming
Um, you can't even pathtrace properly with a 4090, so stop with the bs. Secondly, both cards cost over $200 less and get about the same or more fps across all games. Stop touting raytracing like Nvidia isn't also bad in a lot of titles, not to mention that in every competitor GPU matchup AMD has, Nvidia loses in raster performance and gets less fps in almost every single multiplayer game.
I ditched my 6700xt and switched to Nvidia (4070 Super) cos FSR is dogwater. I chose Nvidia cos of DLSS and Blender. I don't care about RT. If FSR4 and XeSS2 get closer to DLSS, I am ditching Nvidia. Games today come with terrible AA. Also, I'd rather listen to Daniel than Froggy the flip-flopper on any buying advice.
For a solid year I kept trying to find a good deal for the 4070 Ti 16gb, no joy!
Enough is enough, I went AMD.
Merc10 Black, XFX 7900, 20gb
I love my 7900 xt. I think I'll wait for a 5080 super that has more than 16gb of vram, or I might just get an 8800 xt; I hear those are gonna be great at RT, and FSR is supposed to get better. The Adrenalin driver software is just goated.
@@CombatMedic1O DLSS is the goated thing, making games look even better than native in several cases
@@CombatMedic1O The 8800xt will be around 7900xt raw power with less vram, better RT, and a better price. With a 7900xt I would skip RDNA 4 and wait for RDNA 5, which will probably have gddr7 and much better perf gains. You should be fine with the 7900xt at 1440p until RDNA 5, or something with good value from Nvidia or Intel. New games are too demanding for serious RT and GPUs are still not at that level, so I wouldn't be bothered with that. For basic RT, even mandatory RT, the 7900xt is more than enough.
@roki977 I thought the 8800xt was gonna be on par with the 4080 Super. Well, you think I'll be good for another year or so with the 7900xt at 1440p gaming? Cuz I'm worried about UE5 games. Anything with RT or heavy occlusion eats my gpu.
Since when was there a 4070 Ti 16 GB? I thought only the Ti Super had 16 GB.
Just went ahead and got a 7900xtx on Black Friday for under $700. It should hold me over a couple years.
You're going to love it!
what resolution u game?
Dude I'm still rocking a 6800xt on 1440p and it's still a beast. Currently replaying the last of us and I just love it.
It's not going to hold you over for a couple years. You already bought it, and it's two years old. You should have bought it two years ago, bro, it was 900 freaking dollars, bro.
@Typhon888 7900xtx will hold him just fine.
I tossed a 7800xt in my 7600x3d build, I don't see any issues with games for a while.
where you get a 7600x3d?!
@@MoeMan216 Microcenter exclusive part I believe, I have one about an hour from me.
I’m building the same set up for my son for Christmas. It will be his first gaming PC.
Green with Envy! There are No Micro Centers in Canada, thus No Ryzen 7600X3D's available here!
I love my Hellhound 7900 GRE. Just loaded up STALKER 2 and it plays like a dream, it's awesome. I get to max out all 1440p games and they just run well: when I cap my framerate, my 1% lows are one fps lower than my average, and my frametime graph is laser straight on everything. It's paired with a 5800X3D.
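For anyone wondering what a "1% low" actually is: it's just the average of the slowest 1% of frames, converted to FPS. A minimal sketch (function names are mine, not from any capture tool):

```python
def one_percent_low_fps(frametimes_ms):
    # Average the slowest 1% of frames, then convert ms -> FPS.
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

def average_fps(frametimes_ms):
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

# A capped run with one 20 ms hitch per 100 frames (made-up numbers):
frames = [16.7] * 99 + [20.0]
print(average_fps(frames), one_percent_low_fps(frames))
```

This is why a capped framerate gives you 1% lows right under your average: capping flattens the frametime graph, so the "worst" frames are barely worse than the rest.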
7900 XT and 7800X3D here, and I can barely get more than 50 fps in Stalker 2 at 3440x1440
@@cezarh5473 Well, you're pushing nearly 5 million pixels at 3440x1440, about a third more than my 1440p screen.
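For the record, the pixel math here is plain resolution arithmetic: 3440x1440 is nearly 5 million pixels total, roughly a third more than standard 2560x1440.

```python
uw = 3440 * 1440    # ultrawide: 4,953,600 pixels per frame
qhd = 2560 * 1440   # standard 1440p: 3,686,400 pixels per frame
extra = uw - qhd    # 1,267,200 more pixels, ~34% more work per frame
print(extra, round((uw / qhd - 1) * 100))
```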
That's freaking awesome my friend. GRE is a dam beast
Bought a 7900 XT about a year and a half ago; compared to Nvidia it's been night and day. Every game has run buttery smooth, and with 20GB of VRAM I don't have to worry about it struggling for years.
That’s what I’m talkin’ about!
What do you even need 20GB for? The card can't do heavy RT, and 4K games are using 12GB, or 16GB tops
@anitaremenarova6662 mods
No driver issues then, like everyone talks about?
@@stevenspencer5979 no driver issues in 17 months
I remember, Frog. I loved my Sapphire Pulse 7900 GRE. I loved it so much that I sold it… and put that money towards a Sapphire Nitro+ 7900 XTX.
I still love mine , it runs so well
Nah, the 7900 XT is better because of that lower price to performance 😅
@@JahonCross It has worse price to performance, yes. The 7900 XT is great tho and still better on that front, obviously; seen it for $600 on Amazon for one day lol
@ Either way, you’re set. Thank you, AMD!
Yoo I just ordered the 7900xtx nitro plus, I hope it's as good as the reviews say.
I am super glad I bought the 7900 XT; it will be a long, long time before I replace it. I don't play games with ray tracing enabled. I don't like the way it looks anyway
Love that card as well!
If Indiana Jones is any indication, more and more games will make ray tracing a requirement with no fallback to rasterization. Unless you don't care about modern AAA gaming, that 7900 XT isn't going to last as long as you think.
@@03chrisv AMD can do Lumen pretty well. It's nowhere near as broken as path tracing; it's something that actually works. No real problem
Same here, I got my 7900xt a year ago and I expect it will probably last me 5 years before I even think about upgrading. Probably even longer than that. By the time I really need to upgrade to keep up with current games, I can just drop a better chip into AM5 and swap out the graphics card and I'll get a ton of mileage out of my machine.
ray tracing is overrated/not worth it in 95% of games
As someone who bought a laptop with an RTX 3080 16GB in 2021, when there was also an 8GB 3080 variant, I remember people online kept saying you don't need the 16GB VRAM, that GPU can't do 4K anyway. Jokes on them. I use my laptop with an LG C4, and every day I thank myself for going for the 16GB VRAM variant
Learn to appreciate what you have
At the end of the day you can have the best of the best and it means fack all if the game devs put out unfinished/unoptimized games, then scramble after release to patch and fix major issues. Look at Jedi Survivor: great game, but it has a lot of issues, and it doesn't matter what hardware you have. The issues won't change.
@@richardpierre7946 that game will always be broken gold
"Broken Gold"👌🏾 - Perfect way to describe Survivor.... Wow
@@quiltingrox thank you.
I played through it recently on a 4080S. It played pretty decently, but I kept having to disable the RT in certain sections cos it would just crash the game. Thought it was my overclocks, but the same thing happened after disabling them, and other games are fine, including the RT-heavy Alan Wake 2.
Instead of reviewers saying buy this card to play this game, they should be screaming at game developers to stop putting out unoptimized games. All the "get this card" talk is just rewarding lazy game developers.
my 6800xt waits for the 8800 xt
I enjoyed the video. I've used AMD video cards for the past 10 to 15 years. They have always been in my mind a better investment than the nvidia equivalent. Earlier this year I upgraded to the 7900xt from the 5700xt and I've been enjoying it. I plan to keep it for a while.
I'm always happy to hear when people are enjoying their AMD card.
The 4070 and 4070 Super were VRAM deprived from the start. So were the 3070 and 3080, and I said so when they were announced. The performance of all these GPUs is quite good until you're out of VRAM, but out is out. Texture popping is way worse than giving up some ray tracing performance, especially given how distracting and overblown ray tracing usually looks.
Exactly my friend
The good thing I found out when going for a mid-range GPU instead of high end: I have more money left to buy games. High-end GPUs probably give more FPS at higher details, but only if you have an equivalently high-quality monitor to display it. At the end of the day, what's the point of a high-end gaming PC if you can't comfortably afford every game you want to play?
I agree
i bought an R7 7700X / 4060 Ti 8GB PC for 650 Canadian (450 American) and I'm considering switching out the graphics card for a 7800 XT or 7900 GRE
C'mon over to Team Red you'll enjoy the Ride & the Better Pricing!!
@@michaeloneill1360 a worthwhile gpu would cost more than the pc i bought lol
@@deadrift886 So True!
9800X3D with the 5090 is gunna be a beast
A 9800x3D is a complete waste of money with a 5090.
Yeah well it's gonna cost like $4000
@@robertmyers6488 Not at 1440p
I picked up a 7900XTX over Black Friday. Loving it so far, its an absolute monster.
Same here. What cpu are you using? My 5800x3d is a bit of a bottleneck for the 7900xtx at 1440p in a few games
@@nikhilnischay3955 Hmmm. My XTX runs at 100% when gaming and I have a 7600X. CPU usage usually 60-70%. And I play 1440p ultra everything
@nikhilnischay3955 I built a whole new system, I got a 9700X. So far it's been perfectly capable of keeping up with the GPU and all I did was enable PBO, didn't do any manual overclocking.
@@USAFWolf Hey, thanks for the reply. I've been eyeing the 9700X for a couple of weeks now, especially since its price has dropped and normalized around $320. I have an 8000 Hz polling mouse which I use in Halo Infinite and The Finals, and the CPU usage on my 5800X3D skyrockets into the 90-100% range. When playing single-player games like Banishers: Ghosts of New Eden, Warhammer 40,000: Space Marine 2, and Control, there are times when GPU usage drops to the 80-90% range. Sometimes lower when you're in a village/city in Banishers and in Horizon Zero Dawn Remastered and Forbidden West. And in NFS Heat, CPU usage is always sitting in the 90% range. I didn't think I would need to upgrade a 5800X3D, but with an 8 kHz polling rate mouse and frame gen turned on in single-player games, the CPU usage goes up 😵‍💫
I’m just going to get 9700x after the motherboard and RAM prices go down a bit and call it a day. 5800x3D is great at upping the 1% lows but the CPU usage sitting in the 80-95% range is terrifying.
So many YouTubers and reviewers are pushing Nvidia. It was really hard to find user feedback on prebuilts with AMD cards with Windows 10 EOL, Black Friday timing, and 2025 tariffs looming. I bought one with a 7900 GRE because $600 is about the limit I can spare for a graphics card. The real need and priority was getting a Windows 11 compatible desktop for the household. The Windows 10 EOL made me reenter PC gaming slightly earlier than expected. Nobody in the household was complaining about the Series X except for drifting/broken controllers.
That's awesome brother
@@frogboyx1Gaming Thanks. I'm glad I found one with no bloatware, within budget, and in time for the Windows 10 EOL/tariffs. It even helps that the household mouse, keyboards, and cables carry over. We just need an external drive. The gaming monitor is about the only thing that can be considered unnecessary or a luxury. A few fewer dining-out or takeout orders should just about cover the cost.
Besides the 5090, what GPUs are coming out with more than 16GB of VRAM next generation?
5080 Super should have 24gb if the leaks are correct. Though it won't be out until 2026
@@Silver-h4m I’ve heard something similar to that and I hope it’s true. I’m probably wrong but I don’t see the 5080 being a very successful card for NVIDIA in 2025. I’ll just have to hold on to my good old 2080 TI until then.
I was quite shocked tonight to see that when I turned DLSS from performance to balanced on my 4080S, my FPS in Indy at 4K with full RT and frame gen tanked from mid seventies to single figures. Thought the game was crashing then saw the message in the options screen saying I didn't have enough VRAM. Fortunately I'm happy enough to play at 4K performance but was quite surprised to see it's really on the edge of VRAM usage even at 4K. This is after upgrading from a 10gb 3080 which also became VRAM starved long before it should have.
I doubt I'm gonna chase the highest fidelity in the future with Nvidia taking advantage of their market dominance to charge the earth for just one part of a gaming PC.
Hopefully AMD can step up and provide decently priced and specced mid range performance and Intel can join them.
I am truly sorry this happened to you. This is exactly why I went with the 7900XTX. No more surprises.
The problem is we get too dependent on DLSS to fix everything and most people are fine with that until they turn it off or select a higher quality
Thanks for this great video. I personally don't play high-frame-rate games, but I have always found AMD cards the best value for my needs. I was finally able to get an MSI Radeon RX 7900 XTX Gaming Trio Classic and expect to use it for a few more years. Please keep making these great videos.
@@johnpaulbacon8320 thank you.
You also talked about the Nvidia App well before everyone started noticing the weird FPS behaviour
Yeah it's crazy how some hillbilly mechanic can see this stuff but the tech bros are having trouble figuring it out.
im still on the 6950 XT, not sure if I will upgrade to the 8000 series, as UDNA, which is coming after, sounds intriguing
Exactly my friend you still have plenty of scaling left in that card especially at 1440p
@@frogboyx1Gaming yup exactly been playing at 1440p
I'm so glad I went with the Sapphire Pure 7900GRE over the 4070 Super.
I feel Nvidia purposely shorted the RAM on their cards in an attempt to get people to purchase another card sooner than they expected to. As for AMD? I'm still rocking an RX 590 on many games with a min spec of a 5700 XT, and I'm doing it at 1440. I haven't purchased the Indiana Jones game yet, but I'm able to play Starfield, DD2, the Final Fantasy XVI demo, and Stalker 2. I pay little attention to FPS. I install a game! If it plays, I'm good! If it doesn't, I uninstall it and play one that I can.
I bought the RX 7600 because it was all I could afford in my country. I just want to play the games, I don't worry about all the fancy graphics. It's about the game play for me.
Smart move.
I bought the 7800 XT 2 months ago and it's a banger. 16GB of VRAM is really the minimum I would go for at 1440p, seeing that my 7800 XT uses over 12GB in some games at 1440p
What 4k monitor are you rocking?
LG C2 OLED
My monitor is the LG 39GS95QE 39-inch UltraGear OLED Curved
What do you think about future games putting raytracing as a requirement? I’ve been outta the hardware loop for a long time but do amd cards have ray tracing?
Yes
Indiana Jones has RT and it runs just fine on AMD cards
It's more than simply better ray tracing and upscaling... it's also power draw for me. I was very much looking to purchase a 7900 XTX; however, when I looked at the idle power draw, I had to pass. That shit adds up on your electric bill, and I also don't want my room feeling like a sauna. The 6800 XT and 6950 XT were awesome but had my room hot and my bill high.
While watching this video, my rx7900XTX is drawing 22W on a dual monitor setup.
I have to say it cracks me up when people drooling over $800-1000 GPU's, but have to pass because they can't afford the electricity.
I have 3 setups running all the time: one with an RX 6950 XT, one with an RX 7900 XT, and one with an RX 7900 XTX. I can't say that I notice it on the electricity bill.
@ brother 🤦🏾…. Not everyone has cheap utilities. Plus, I call cap on no change in your electric bill.
@@dividedpersona And $800-1000 GPUs are not cheap GPUs. If you can afford that, you can afford the electricity they draw. If you can't, then you can't afford the GPU.
I live in the cold part of the world. Do you think my fluctuating power draw to heat my home uses less power than a couple of PCs??? It does not.
@@Audiosan79 brother.... energy costs depend on multiple variables. Your costs don't reflect those of the majority of the world, or even of your own local city. Plus I use my PCs for more than gaming. As the total number of PCs I have scales up, the operational costs do as well. You all really need to get your heads out of your asses.
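For what it's worth, the running cost both sides are arguing about is easy to ballpark. A rough sketch; the 400 W draw, 4 h/day, and $0.30/kWh figures are illustrative assumptions, not anyone's actual numbers:

```python
def monthly_cost_usd(watts, hours_per_day, usd_per_kwh, days=30):
    # kWh used per month, times the local electricity rate
    kwh = watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

# Assumed example: 400 W card, 4 h/day gaming, $0.30/kWh
# -> 48 kWh -> about $14/month
print(monthly_cost_usd(400, 4, 0.30))
```

Swap in your own wattage and rate; the answer scales linearly, which is why both "it's nothing" and "it adds up" can be true depending on where you live and how many rigs you run.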
Nvidia nerfs their cards every generation, yet their fanboys are in denial. AMD is the mid tier king. Intel starting to look interesting on the low end now. Should be able to get 5-7 years out of a card.
Absolutely brother
AMD is only good at making their CPUs; their GPUs are just okay
AMD is beating themselves with RX 6000 being much better price-performance options than current gen
You remind me of my old mechanic friend who was a pro-Chrysler/Dodge Mopar man and went on and on about the Hemi engine vs Ford/GM engines not being that good, heh. I am not 100% sure about the future, but Intel's B580 might win the low-end market, and I think you have a point about the mid-range market with AMD. Then you have that new RTX 5090 coming out with 32 gigs, but the 5080 will have 16GB. I am still doing fine with my 7800 XT; it's good enough for my needs.
@cybernit3 absolutely brother but sadly I am a mechanic and I am a Chevy bro
@cybernit3 7800XT is an awesome well rounded card. I loved the one i tested
Yup, the Nvidia bros overpay by over $200 and want to make excuses and deflect to ray tracing or path tracing, not to mention the 4090 can't path trace properly and upscaling is necessary
For better image quality and the option to path trace, and you win every time when RT is on.
Stop it.
AMD is fine and great value, but yeah, upscaler and RT... Nvidia any day, all day.
@@c523jw7 Not better image quality, but better ray tracing, yes. Does that mean I need to pay $200 extra for one feature? I think not
@@c523jw7 Did you not just deflect to ray tracing?? Reread my post and see how you're coping/deflecting
@@c523jw7 The only thing Nvidia really has over AMD is AI, and that's not hard for AMD to implement. Even Intel's XeSS is going to overtake DLSS once they get better fps
You put "deflect to RT" cus you know your card can't RT. You're coping. At the end of the day, if you bought a 4080 or 90 you could have bought the best AMD card... we skipped it for a reason 😂.
@steven7297 What's your setup, man?
Just bought a new rig with a 9800x3d and 7800xt. Hope this holds up for a few years running 1440p.
Absolutely brother that's a solid build and you will most likely get access to FSR4 AI upscaling next year.
@@frogboyx1Gaming thanks!! Wanted to upgrade to a 4080 super or 7900 xt but it cost 1100 USD here in our country!! Too inflated!!! :(
After owning my GTX 1070 for 8.5 years, I'm finally going to upgrade when AMD & NVIDIA release their new cards in January. I'm leaning heavily towards RDNA 4 but will consider the 5070 Ti too. I think the 8800 XT will either be a slightly better 4070 Ti Super for $500 or a 4080 for $600. I doubt they'll provide 4080 level performance but we'll see. Gotta say, I'm proud of how well my 1070 has held up. I was able to max everything out at 1080p for so long. Now I have to dial settings back to mostly medium on newer games for a high refresh rate experience. The games still look respectable, honestly. I just want to go back to maxing everything out & I'm also interested in trying ray tracing. The NVIDIA 10 series was truly legendary...
I'll be honest, man. Your 1070 will get shat on by every other midrange GPU that's currently out. If you bought a 5060 you would feel a massive difference despite it being a mid-tier card.
Stick with a 70-series card from either Nvidia or AMD. I tend to feel it's a good balance between future-proofing and playing games at high to ultra settings at 1080p and 1440p with no issues.
Where I am in Malaysia (where your latest Ryzen CPUs are made), the RTX 4070 Super is $550 while the RX 7900 GRE is $630
It depends on the resolution you play at and the games you like. If you are a competitive gamer, I guess you don't need ray tracing, since you will lower settings to get more fps. But for someone like me who likes graphics, plays on a QD-OLED monitor, and plays single-player games, I'd rather have the Nvidia just for the ray tracing and the better upscaler. But if you get AMD, it is good. I had a 6800 XT and it ran games like Call of Duty and other FPS titles beautifully. Great cards.
I once bought into the ray tracing hype with a 4070 Ti, and it's overhyped. Now I'm with a 7900 GRE and love it
It really is a great card
I'm with you, still on a 2600X + RX 590; I can play Stalker 2 at 1080p. Still waiting for AMD's 2026 next-gen GPUs.
My 7900 XT has crashed every day for about a week. I just saw a video about tuning it. Will see if this works
Over the last 2 years, my 6800XT, 6950XT, 7900XT and 7900XTX have never had a single crash. That is on 4 different systems.
@ Color me jealous. This is a brand new build, and I just made the switch from Nvidia to AMD
@@AnthonyCarrierRUclips If you had any Nvidia drivers on that windows install, that could be your problem.
@ nah complete new build from scratch
@@AnthonyCarrierRUclips Then you might have a defective GPU, or there is an issue with an app/game/OS.
Rocking my 79xt till 2030 , that’s when raytracing will be good
Great video! Budget-conscious gamers need to hear this more (and let's be honest here, majority of us gamers don't want to spend more than we really have to). AMD deserves more love from gamers. I also support Intel especially with their delivery of a value product, the Intel B580 card.
Pretty much! Gamers can just get more value than they actually know by just getting an AMD card
Look at the price of the XTX.
Honestly, I think with it being the best 24GB GPU on the market behind the 4090,
it's probably a nice grab for someone.
That's why I'm keeping mine: because I love it, and if I do run into VRAM limits, I'm covered in that regard.
Next generation is offering us fuck all apart from a 5090 32GB.
AMD, God knows what they're doing after an 8800 XT 16GB.
It's a thirsty beast though. It sucks more power down than my 5800x3d and 6950xt combined lol
@@jogonbro Yeah, it does use well over 400 watts when overclocked; at stock it pulls about 403 watts
Have my new Sapphire Pulse 7900 XT sitting unopened, just waiting for CES 25 to see if the 8800 XT really is as good as the rumors say. Hopefully it's either as good as the rumors or clearly worse than the 7900 XT, cause I want the card for some longevity, and will the extra 4GB or the better ray tracing give me better longevity? Hopefully it's an easy decision lol. This is my first PC btw; I'm using my Series X to play games atm, so it's not something I absolutely need now. Can wait another month or two if needed.
Definitely better RT, and the 8800 XT could easily have 7900 XTX raster performance too, so a fine enough trade-off
I guarantee you that going from the Xbox Series X to that 7900XT it's going to blow your freaking mind.
Indiana Jones is an addictive game, and it's running to my liking above 60 FPS on my RX 7600 XT; it hasn't even dipped below 60 yet
Nice brother that 7600XT 16gb card is a beast for sure
You can't just look at the price alone and then decide what to buy. I look at the price and the full list of features the GPUs give us, and then decide if a GPU is worth buying.
I'll buy Nvidia GPUs because they offer many features the AMD GPUs don't. It doesn't help to have slightly more VRAM and a slightly lower price with an AMD GPU then. And now with the RTX 5000-series GPUs, Nvidia is going to give us even more new features that will be pretty good to have.
So, the RTX 5070 Ti is going to be my next GPU.
I'm always using high or photorealistic settings in Star Citizen. You should try that with your card. It's really awesome.
This is the first time I got recommended a video from this channel but I've been in both situations over the years... Those where it was really important you prioritized vram and those where it didn't really matter much. Admittedly being stuck on the PS3 console generation for so long with 256mbs of vram probably helped hold things back for while. 🤣
I feel like VRAM doesn't cost enough for Nvidia to always land on the low-VRAM side, though. 🤔
I have a 4090, so I'm not really focused on the numbers at all. We all know 8GB is pretty crappy now, but I guess it would be sort of interesting to see performance differences at 1440p between 12GB and 16GB. I haven't played Indiana Jones, but I assume 12GB still generally holds up, if I had to guess??
There's also a difference between a card allocating more than 12GB of memory because it's available and the game actually running into framerate/stuttering issues when only 12GB is available.
I am building a Threadripper system for creative and storage purposes. I had to switch to a water-cooled GPU so that it would fit in the case and prioritize the PCIe 5 NVMe adapters. The one I already had was a 4090. My 7900 XTX was too large and airflow would be blocked. Despite none of my AMD cards ever having a problem with my Samsung G8 240 Hz 4K monitor, the 4090 is a total trainwreck: it simply doesn't work sometimes. Most of the Reddit threads put the blame on Samsung. Also, the color and HDR looked like crap. If my Dell monitor has trouble, I will sell it, get one of ASRock's blower-style creator 7900 XTXs, and sell the 4090. AMD is simply more compatible and has better graphical fidelity.
Daniel is really the last person to be called an Nvidia shill
I strongly disagree with recommending a 12gb 4070 over a 16gb 7800XT.
This is confusing. Yes, the 12GB of VRAM on the 4070 Super is right at the edge of what's acceptable, but the game he's using as an example, Indiana Jones and the Great Circle, performs better on the 4070 Super than it does on the GRE, RT on or off.
That said, the 4070 Super at 600 bucks, compared to the GRE at nearly a flat 500, is the better deal.
I'm actually pretty happy with my Nitro Pure 7800 XT paired with a 7800X3D, which I bought a year ago for 580 euro after watching a couple of your videos :) I actually played the new Indiana game on ultra at 80-100+ fps without AFMF2, and that card runs really cool (at 100% usage, 55C with a 71C hotspot). Nvidia GPUs were/are just too expensive in central Europe
Hell yeah brother. Thank you for putting your faith in me it means a lot to me.
@frogboyx1Gaming No problem 💪 i put my trust in you, and i don't have any regrets about that.
I got the 7800xt because I don't like that Nvidia cards don't come with much RAM unless you pay a LOT of money. They charge too damn much
I actually intend on going AMD next gen. The only thing that makes me second-guess is seeing every game made with UE5, which has baked-in ray tracing you can't turn off (I would if I could; I think it's a stupid gimmick that costs too much performance for some reflections). Let's see how the new generation reviews
For me Nvidia for the cuda cores as I use them for AI so is an easy choice
I stayed away from Nvidia because of the connector issues. No amount of ray-tracing was worth potentially burning down my house.
Some games don't really look good with RT. But some games look magical with RT. The Witcher 3 being one. The difference between RT off and on is amazing.
Who would've thought that the company working directly with Nvidia would have good RT lmfao
I'm rocking Indiana Jones on my RTX 3070, no issues at all 😊
Yes, because of the new 8GB patch that allows better compatibility on 8GB cards, but at what cost 🤨 I'm sure someone will be along soon with a comparison video of hidden low textures or pop-in when you turn around in the game, etc.
@ All the games they said the 3070 can't run... I play them flawlessly till the end of the game.
@@Ballagun With medium-low textures lol
@ high and ultra setting 😎
My 4090 is gonna hold me over for another 4 years easily. How long is that 7900GRE or 7800XT gonna last?
7900 XT Merc here, couldn't care less about RT. Feels like a marketing ploy more than anything. Just can't be bothered to pay for the equivalent GPU from Nvidia that costs double. I have a friend who may ditch his 4090 for a 5090. This is the only way I'll be jumping ship, because I know I can get a fair price on the 4090 from my friend. I just won't pay retail for Nvidia; my EVGA 3060 XC in my work PC was bought used for $200, and I repasted the card and it's been great. I would never have paid for this thing new. I really wanted to try out Nvidia, that's why I bought that one.
FSR - It looks worse than no FSR on a lot of games with the 7900XT for me
DLSS- looks sharp on the 3060 and makes a meaningful improvement with the lower end card.
Maybe a nice Nvidia card would change my opinion. I would consider a 3090 if people would lay off the pipe with the $700 used pricing. I think both cards are great for me and what I play. I don't see this 7900XT leaving my hands anytime soon.
7900XT was my first AMD card absolutely great experience
At what point are we going to start holding these game developers accountable for just being lazy and not optimizing their games? Most of this is what the problem is.
Certainly games are coming out that lean too hard on upscaling technology to paper over their optimization, but there is a limit to how much optimization can be done. Minimum VRAM requirements are gonna increase across all resolutions, and Nvidia has been happy to skimp on it.
Looking forward to the 8800 XT. Thinking of taking the dive and upgrading my 5700 XT. Of course the motherboard and CPU will follow; that's this year's project. People that constantly want high end will get over that high quickly. I still run my 5700 XT with satisfaction for the few games I play. Do I need Indiana Jones with all the ray tracing? Don't think so.
And I run my Elite Dangerous at 1440p easily.
It will be a beast.
RT and PT are both a bunch of hype. They barely make a difference in most cases. Only people who have been educated on what the differences are and are actively looking for them will notice when a game has RT or PT. RT and PT won't matter until they're easily achievable on consoles and $300-$500 graphics cards. I think the best bang-for-buck buy of this graphics card cycle was the 7900 XT. It gets me 4K 60 in most games, and at least 1440p 120 in my FPS games at high settings.
It's all cool and everything, then you see the GPU market share and the Steam hardware charts.
I upgraded from a 1070 Ti to an RX 7900 XT. The price point was way too good to pass up, and I don't care for ray tracing. Nvidia have lost their mind trying to sell us 8GB cards and trying to get me to sell my organs to buy a good GPU from them. I'm just not putting up with it. I hope AMD and Intel change the GPU market and make Nvidia stop price gouging.
I completely agree, the price of those cards are just insane.
There's the ones with vision, and there's the impulsive ones. The former buy AMD, the latter buy Nvidia.
Everyone should buy AMD, but if you are looking at mid-range 1440p, Intel is the best for the money
That's what Daniel Owen is saying: if you are buying at around $700+, Nvidia makes sense. That's the 4070 Ti Super, 4080, 4080 Super, and 4090. But Daniel also mentioned not to buy the 4080 and 4090, since the 5090 and 5080 are almost out.
I think im getting the 5070 ti when it releases next year
OK hope it works out for you.
I just want DLSS, frame generation and >16 GB vram. Still sitting on 2070S
Recently scored a 7900XTX for just over $700 and honestly? I'm probably going to keep going with AMD or Intel GPUs moving forwards. Nvidia is getting way too expensive for minimal performance boosts
It's really good trying new things, my friend. I noticed I actually use more AMD features than I did Nvidia's. Once you truly see that AMD does a great job, it really makes you wonder why everyone keeps recommending Nvidia when AMD offers great performance at a lower cost with more VRAM
@@frogboyx1Gaming One of the things I've noticed when gaming on my main rig or my secondary rig that uses a 6600XT is the AMD Adrenaline software is much more user friendly than some of Nvidia's features. Having an easy way to tune your cpu and gpu is a big plus in my book. And yeah, being able to get a 24gb card for nearly half the price of an 80 class card with similar performance should be a no brainer! Thank you for posting this video and spreading more awareness about this!
No brand can convince me to switch from AMD unless they have a feature equivalent to AFMF2 that can be used on ALL games without needing game developer implementation.
Afmf2 is just too good.
That's why I bought the 4070 Ti Super 16GB; 12GB is simply not good enough anymore.
4070 ti super for 800. I'm happy as can be
Awesome brother enjoy your card that's all that matters.
They don't even sell AMD cards here currently, all out of stock. Even the Nvidia cards are mostly out of stock; just some 4060/4070 cards left that people are avoiding because of the 8GB VRAM limitations. Most shops are just selling through remaining stock before all the new launches. Currently I have a 3080 10GB and all the games I play run great, even newer games. I am interested to see how the new AMD cards perform. Hopefully the rumours of them being about 4080 Super level, and not priced outrageously like the Nvidia 50-series cards will be, actually come true. AMD are great at botching launch pricing. I wouldn't mind a decent card with 16GB+ VRAM. I don't care who makes it.
They limit production of the 4000 cards so they can sell the 5000 cards. It's always like that.
AMD has been better for years now at every bracket except the 3090 and 4090. My friends who listened to me and bought a 5700 XT have been playing until recently, while the poor fellas who went for Nvidia and the 2060 got crushed in two years.
Same with the guys who paid 1000 euros for a 3070, 3070 Ti, or even a 3080 (8-10GB) because those were "better at RT". Everyone who bought a 6800 or 6800 XT has been playing non-stop for years, and the Nvidia owners have been getting slayed since Hogwarts Legacy and RE4. Heck, even the 6700 XT is now better in Indiana Jones than those Nvidia 3080s that cost 3 times as much. And now, for the third time in a row, people are buying 8-12GB Nvidia cards for a kidney when they can get a 7800 XT and above. People don't learn, or they apparently want to pay more for a card just to use it for 2 years and be forced into upgrading, while the other option lasts twice as long or even more
Funny to think that almost all AMD cards are better even at Ray Tracing than their Nvidia counterparts two years after release😂 (excluding the x90 series)
Right it's funny to watch them defending it.
well, we're at a point where I'm expecting the 4090 to be a 1080p medium card by Xmas..
sounds about right
I doubt it. If the 5080 can't beat it, then the 4090 will still hold its value; the 5090 will be the high-end card
I saw one YT short of a guy saying AMD bros should be worried about these mandatory RT games, but I completed Indiana Jones and the Great Circle on a 7800 XT/5700X3D and it was a great experience at 1800p ultra with dynamic resolution enabled, targeting 70 fps (I went 1800p dynamic over 1440p native because I'm on a 4K TV and want more sharpness/clarity), besides one small temple area in Sukhothai where it dipped to the 50s. Maybe we should be worried about other engines, but still. At least we know Doom: The Dark Ages is gonna run like butter
It’s great to see that your 7800XT is working great for you!
I like how you're conveniently ignoring all the UE5 games where AMD is way behind; this one has weak RT
@anitaremenarova6662 every UE5 game i have played has been just fine LOL
@@frogboyx1Gaming On a high-end card sure but the 7800XT/7900GRE is struggling compared to 4070S.
Do you even game, man? It happens with a lot of games. They use way too much vram at launch, then it quickly gets "fixed"... that being said, 12gb will soon not be enough
no, no he doesn't lol
There is a reason AMD cards are cheaper. Always go with the Nvidia card. Can you run the game with a 7800xt or a 7900xt with path tracing on? Not everyone is genuine when they suggest particular hardware.
Waiting for 5090.. 🥰
Even Intel has come alive: a 12GB graphics card for 250.00 is a bargain.
Great Call Frog!
I mean I posted 12GB wasn't enough weeks ago & look at the hate i got for it
🤣
I can see shit coming man!
16GB I think might be OK for a while yet, but Indiana has really tested the 4080 Super
After some updates you can now use the Ultra texture pool on the 4080 Super, which is better
Like I said they totally fucked up texture pool sizes at launch and with Ray Reconstruction coming in a future update, that will reduce vram even further
So a 4080 Super 16GB still fine, but 12GB GPUs won't be able to max out the game at all
I hate the fact that Nvidia & even AMD will continue to sell 8GB GPUs, it's crazy when they will be DOA
@RJTHEGAME exactly and everyone said I was a stupid AMD fanboy for speaking common sense
@@frogboyx1Gaming Yeah lol. it's more looking at things objectively
Why I'm holding on to the 7900 XT and XTX that I have.
@@robertmyers6488 👍
@@robertmyers6488 With a 7900XT you can enable FG & not worry about vram running out
I've run into a few games now that cause the 4080 Super 16GB to run out of vram lol
So had to disable FG to keep under the limit
That's at 4K
Wait, I can only buy an Nvidia GPU to use it how 'YOU' tell me I have to use it?
Dude you put yourself on a high pedestal!
AMD has always been better at longevity than Nvidia. Nvidia cards will need an upgrade sooner; it's always been that way, and that's why I switched to AMD. But honestly only the 4090 works best at ray tracing. I don't like having to game under 60fps and having to use upscaling, which lowers the resolution to game at. For that I turn it off and play at buttery smooth FPS
Most definitely with Nvidia you are more than likely to be locked out of the next gen tech features also
They do that every generation to get ppl to upgrade
With AMD I think you don't feel the same pressure to upgrade every generation with how they work
So I do think as i've always said, for longevity & bang for your buck, buy AMD
Not really. DLSS upscaling still works beautifully on 2018 NVIDIA cards. AMD still doesn't have a proper answer for DLSS and when they do lots of the old hardware will be locked out.
I don't use said features, who cares about fake frames. Every game has an upscaler now, UE5 especially
@@Phil_529I'm not as bothered about DLSS
I'd rather have more vram so I can game at 4K & AMD give you that
A lot of ppl don't know this but at 4K i've found even the 4080 Super is running out of vram if I'm using FG in games like Stalker 2 and Veilguard
Massive FPS drop
It can even happen in Alan Wake 2
Although it was fine when I played it, it did alter my settings once at the start and I wondered why
So that's not good at all imo, 5080 16GB i'd have massive concerns about that GPU already
@@RJTHEGAME Bruh you're just making stuff up. Veilguard uses 13GB at 4K maxed with RT and FG and STALKER 2 uses 11GB.
You aren't gaming at 4K max without upscaling in modern demanding games. In STALKER 2 the 7900XTX only averages 45fps at 4K native; the 4090 gets 61fps.
@@Phil_529 Lol it uses more than 13GB at 4K once you put FG on and use fade focused for texture sizes even with DLSS quality on
I ain't making stuff up at all
I don't care what the 4090 does this isn't some competition between a £800 GPU and one that is costing over 2 grand atm
Only weeks away from new GPU announcements, nobody should buy a GPU at the moment
Much less so the 7800XT. People are rightfully ripping into Ngreedia for nonexistent generational upgrades, but if you look at how little performance this card gains over the 6800XT, or how much of a flop RX 7000 is in general, you'll learn Intel is the only honest pro-customer GPU provider right now.
Vram is just where it's at rn.
100 plus fps so I think not bad 😊
Nvidia wins sales and features set
Amd wins budget and youtube comment sections
I say this: AMD 8800 video card vs. whatever Nvidia 5000. I'm not looking back at the 7900 GRE, sorry.
That being said, I'm not counting either one out.
I remember telling you to buy the 7900xtx and you laughed and said "I would never pay over $1k for a GPU". Then you proceeded to buy a 4070 LOL!! You complained the Nvidia GPU was too weak.
I paid 933 dollars for my 7900XTX
i bought 7800xt and love it but i haven't played Indiana Jones
it runs great on 7800xt just no pathtracing
@@Ireland-m5p It also runs great on the 4070 or 4070S with no path tracing. 80+ fps 1440p native ultra settings. This video is silly.
Great Circle is a horribly unoptimized game; they don't even have a non-RT fallback for older cards. As much as 12GB sucks, what is even worse is an unusable upscaler at QHD and RT capability 1.5 gens behind, which affects UE5
Indiana Jones has settings of High, Ultra, Very Ultra and Supreme. On Supreme at 4K I am getting 60+ FPS with my 7800XT, and at 1440p ultra settings with my 850.00 4070 Ti. Don't buy a 12GB graphics card unless it's Intel's 250.00 B580, and even then don't think it's a 4K card.
Absolutely brother
DLSS is just too good to pass up. Games look the same or better than native in plenty of AAA titles; nothing comes even close in terms of fidelity... if you have over 400$ to spend, Nvidia is the only option
XeSS does, higher tier battlemage will sell like hot cakes
@@anitaremenarova6662 xess is better than fsr, but still way behind DLSS
guys mark my words, the FOMO is going to get him when a 5070 Ti is beating his card and he has no upgrade path. He's going to crack. He already has second-hand FOMO from the Witcher trailer lol
We are at the end of the generation..you found 1 game the 4070 super may struggle with.
The 4070 super can beat the 7800 and 7900gre on any pt game and heavy rt game to date.
Can the 7800 or 7900xtx for that matter TRY to run path tracing on this game??
The answer is NO. It's not like this game runs better with pt on the amd cards.
Come on Frog.
@c523jw7 it's been more than that LOL Black myth wukong same thing Alan Wake 2 basically every single PT game.
@c523jw7 we have literally only had 4 PT games this year LOL
But that's what I'm saying, the AMD cards aren't beating the 4070 Super at PT at 1440p. I think non-RT the 7800 and GRE are better investments overall probably@@frogboyx1Gaming
@@frogboyx1Gaming but I agree froggy, 2025 onward 12GB is gonna be a challenge. But over this 2-year generation... were the 7800 and GRE better??
um you can't even path trace properly with a 4090, so stop with the bs. Secondly, both cards cost over $200 less and get about the same or more fps across all games. Stop touting ray tracing like Nvidia isn't also bad in a lot of titles, not to mention that against every competitor GPU AMD has, Nvidia loses in raster performance and gets less fps in almost every single multiplayer game
Nah amd gpu costs higher here than nvidia. Like 200 dollars
I ditched my 6700xt and switched to Nvidia (4070 Super) cos FSR is dogwater. I chose Nvidia cos of DLSS and Blender. I don't care about RT. If FSR4 and XeSS2 get closer to DLSS, I am ditching Nvidia. Games today come with terrible AA
Also, I'd rather listen to Daniel than froggy the flip flopper on any buying advice