As someone who built their first PC just weeks ago, I personally went with the 6700 XT and I've been absolutely blown away by the performance; plus, the 12GB of VRAM is always a plus for less than 400 dollars.
@@WithMay I highly recommend going 1440p at the same time if you haven't already. Mine needed so much CPU power at 1080p that it stuttered a bit. 1440p is buttery smooth. I went with a Koorui 1440p 144hz monitor from Amazon and I've been super happy with that as well.
I really wish more outlets would remind audiences of how expensive GPUs have become. Nvidia -60 cards used to be budget-friendly at $199-249 USD. These days, a 3060 goes for $300-350, which is the price point of past -70 cards. It used to be that only novelty cards like the TITAN or the ARES dual-GPU cards cost $1,000 USD or more, but now that has become normal for flagship-level cards. It just feels unfair how far the market has shifted from what was expected just 10 years ago.
All these outlets are bought off, they won't give bad reviews because they won't get review samples anymore. PC gaming is just a complete ripoff nowadays. If Nvidia or AMD thinks I am going to pay $600 for a midrange GPU they are out of their mind. My next "upgrade" will be the next Nintendo Switch.
I was a bit taken aback when I saw the price of the 4090 was around 1600-2000 dollars. Then I remembered the days when people were buying two or three $1200 Titans to SLI in their enthusiast builds. The enthusiast budget hasn't really moved from that $2000-ish range. However, seeing the 70 series cards cost $600 minimum is insane.
What infuriates me so damn much about this is that, first of all, the tech industry takes everybody's grandma for a ride, getting ignorant, naive parents to overpay for crap (I really don't care if smug hipsters get ripped off; iPhones and MacBooks are sometimes a victimless crime), but also because if you get enough of the consumers to be stupid and accept being fucked in the ass, it fucks everybody over equally. So I really don't have the free market option of just switching to AMD if they get simps to pay $600 for what's basically a 4060, because AMD is just going to do the same. Case in point: 80-level performance is now $900. No, I don't care that they called it "the 7900XT". It's like Steve at Gamers Nexus said: if you just move the entire gauge, you didn't improve performance; the reality stays the same. And in this case it's that 80/800XT-level performance that cost about $650-700 two years ago now literally costs $900. It's not a 7900XT. It's an RX 7800. They gave their RX 7800 a different name.

So because nVidia buyers are all such cucks, but they outnumber everyone else, you get this wave of people not even smart enough to make a Dunning-Kruger remark tipping the scale by sheer mass into market stupidity. And it's not even drug dealing; it isn't like this is healthcare or heroin or military gear where there's some broader reason why the consumer literally needs it to survive or is clinically crazy and stupid. It just exemplifies this notion companies have that gamers are the market least deserving of being taken seriously or respected, and then gamers do stupid shit like this that makes those execs seem like they're right. I mean, why would you respect someone who pays that kind of money for a graphics card and is too dumb to notice what they're doing? So I can't even do anything about that beyond buying used at this point, which is mainly a problem because I didn't want to replace my PSU, and something like the 6900XT uses too much power (not that it matters; Lovelace efficiency sucks and so does the 7800XTX). I'm so mad about this because back in 2020, when nVidia was being insultingly overpriced (imo; I had no idea the average gamer is this dumb), I at least had the option to switch brands, and it was fairly competitive. I still like my 5700XT, and it still runs everything but Cyberpunk at 1440p (the game itself is just kind of a mess; I think they finally fixed the bugs and mostly optimized it, but you're still talking like 35-45fps on what's basically a 1080 Ti), so it isn't like I'm suffering, but soon enough this may be a problem for my system. If the corpos are just renaming everything while scamming me, I'm basically just going to keep doing what I did all pandemic long and sit things out until my graphics card breaks.

But the other thing about it is, it's not just the price being worse. 8GB on a 4070 is so bad I literally keep forgetting it; it's like my brain does some trim function, as if 8GB had to be in error, but no, it really is an 8GB card somehow. That's not even budget at this point, dude. That's like putting 2GB on the GTX 1060. It's clearly not enough by modern standards. I play at 1440p and it already maxes out; in fact, I sometimes question whether some of the texture pop-in I got in Cyberpunk is due to it exceeding 8GB. There are games right now that use over 8GB on ultra settings at 1080p.

So it's not just the fact it costs substantially more; you're getting a substantially worse product too, with worse bus width ON TOP OF the inferior VRAM. So the new xx70, the slot that last gen was practically intro-to-4K, high-refresh 1440p, and a midrange card, is two years later basically a budget 1080p card, and even at that it's questionable how long it's going to last. Bear in mind this was the memory config on the RX 480; this VRAM standard is nearing a decade old now. It also pisses me off that literally my number one pet peeve about games, smudgy, bad texture work, still stagnates solely because of nVidia's self-defeating greed, which also makes their RT somewhat useless, since cranking up RT just hits VRAM limits anyway. Textures are literally the no. 1 most noticeable thing to me, so at least I have AMD to thank for pushing that all these years so devs can keep upping the standard. I just hope enough devs finally stop giving a shit about nVidia users' whining, and those users learn to deal with the fact that they're paying premium prices for budget hardware and therefore should either stop buying that hardware or accept that they need to turn their graphics settings down, because the GPU they chose to buy is pretty crappy and they don't get that part of the premium experience. Like, I'm not gonna buy an AMD GPU and then review-bomb some hard-working dev team just because I can't understand why my fps is worse with RT ultra on; these kids need to learn the same about their bad textures and stuttering and crashing with 8GB.
It always performed well. On release it was plus or minus 5% against the 3080 in regular rasterization performance, and now it competes near the 3080 Ti with its fine wine drivers. People were just so focused on the 30 series that even when we finally had on-par competition in the GPU market, they still bought 30 series.
@@zackmandarino1021 The mind share is strong. I'm trying very hard to go AMD and love the fine wine that is historically proven, but that fucking little voice in my head always inches me towards Nvidia, and I hate it. I really wanted AMD to knock the RX 7000 series out of the park and take the performance crown with their MCM design, but they came up short of the hypothetical performance targets that were possible for the 7900 XTX, which would have had a 5-10% performance ADVANTAGE over the 4090. They ended up coming in at 80-90% of it in rasterization performance, so nearly 15-25% below their own targeted goals. They are at least less expensive, but not by a large enough discount that gamers are happy and the cards are flying off the shelves like they had the potential to. I get it, oversupply and market demand and all that. But it really was a missed opportunity to capture the high-end market and mind share.
@@androidx99 This is exactly why I personally see Intel as the future. Why compete for the best card when you can compete with last generation's cards at half the price? I understand that many people want the best card, but for almost everything, the best card is way overkill.
I’ve had my 6800 xt for a while now and it’s been a great buy. No issues, plenty of performance, and vram. I don’t feel like it’s too much money or going to be obsolete soon.
I'm thinking of upgrading from my RTX 3060 Ti to an RX 6800 XT. I can see 8GB of VRAM isn't going to cut it; it's fine if I stick to 1080p, but I'm already seeing that games are going to require 10 to 12GB of VRAM. Also, the RX 6800 XT is a more powerful GPU, and sometimes I play games at 1440p or 4K. What I'll probably do is swap out the RX 580 4GB in my wife's computer, put the RTX 3060 Ti in that, and then my main system will get the RX 6800 XT.
@@Lucromis My issues were coil whine, drivers, and micro stutter. But I found that undervolting the card got rid of the coil whine, switching to DisplayPort got rid of the micro stutter, and saving my overclock/undervolt settings sorted out the drivers. So in the beginning I was semi-regretting my purchase, thinking I should've gotten the 3080, but now I am experiencing perfect gaming with no issues at all.
Same. And after seeing the issues the 3070/3080 were having with their VRAM in modern games, I knew I made a good choice. I was also laughing at the 1080 performance comparison because that's what I upgraded from. The only gripe I have is that I can't get consistent updates on AMD Adrenaline. Not sure what the issue is, but considering most of the driver updates now are focused on the 7000 series, it's not a big deal. I do miss GeForce Experience though. But overall I couldn't be happier.
I like how Linus and the guys over at LTT have taken benchmarking to such a level that we only need one review to compare. Including cards all the way back to the 10 series is brilliant, guys. Keep this up.
You should still look at numbers from other people. Everyone makes mistakes, all people have a price, everybody lies. Linus himself says that you shouldn't trust them blindly, and he's more than right with that.
Being a fan of this card feels like being a Chad, but here goes... IT'S A GOOD CARD. When he said "shockingly stable", he's not kidding. It runs like a sports car: all the fat has been trimmed off, and its efficiency is almost beautiful to watch under load. For it to contend with a 3080 at $600 MSRP, especially after last season's crypto crash and the chip shortage, is also pretty astounding. The AI and performance-boost patches are also worth considering. I mean, let's be honest here: what does your average gamer/worker really want in a card? Raw POWER, or sleek performance within a reasonable price range? I think this is a win. Check back on this in a few years; I won't edit this OP lol
As someone with a 1080 just looking into the available upgrade options, I think you’re right. It’s hilarious to wade through all the negativity and entitlement online, but that’s largely par for the course for social media these days. Sure, I’d prefer it to be cheaper, but ultimately the power and efficiency of the 4070 will give a huge performance boost over my current card.
@@plain-bagel I'm in the same boat, 1080 here. Was looking at a 4070 Ti for $1,000 but realized I don't need that extra performance. However, I can't go under $700 for a 4070 in Sweden at the moment, even on Black Friday... oh well.
The 4070 really is in a bad spot because the card itself *IS* very good, but the starting price was fucked and now everyone just blanket dislikes the 40 series. As a $500-$550 card to upgrade to from a previous gen, lower spec card, it's *fine*.
I keep telling people Nvidia is only better at the very top end (i.e. the 4090). If you don't need the very tippy-top in terms of performance, AMD is better (also, RT is very overrated; I've never noticed much benefit from using it). If you're on a budget, AMD demolishes Nvidia. Edit: Also, to you Linux users out there, AMD is generally better than Nvidia due to better driver compatibility.
@@u2popmart And with AMD fine wine, you may have it for a while. My brother is STILL using the 8GB RX 480 Red Devil that he bought at its launch, and it's still going strong. Now, he is cheap and it is very old, but his card is a great example of how AMD cards age. His card is impressive. I am the idiot who went out and bought an RTX 4080.
And VRAM. I mean, it's still shame on NVIDIA, but the fact of the matter is that there is a lot of performance left on the table with the 8GB framebuffer on 30-series cards.
@@gokublack8342 Well, they are also better at the $499 price point with the 4070 if you are playing games that support ray tracing and frame generation. Otherwise, just get the RX 6650 or RX 6800 and be happy, or if you want to be really cheap, the RX 6500 with custom interlaced resolutions turned on.
It just isn't fast enough for the cost - especially with the efficiency of the 40 series. I'm skipping both, however. Until they make the 4080 $800, nothing is worth upgrading my 6900 XT over.
@@redpilljesus Why would you even want to upgrade your 6900XT? That card is fast as hell and I don't see a reason why you would need to upgrade it so soon.
Same, I got a Founders 3080 for $700 during the peak rona era, and I really don't feel like the 40 series is making any compelling arguments for an upgrade (which I'm happy about, no FOMO).
I miss the days when the 70 tier was just the 80 tier from the last generation at a lower price, possibly with a different RAM amount. Really kinda makes me feel better about the 2080 I bought off a friend for $150 a few weeks ago.
I understand this, but take MSRP and inflation into account. For example, in 2014 the GTX 970 had an MSRP of $330, which would be about $420 today, so we can't really keep the same prices forever; it doesn't make sense. I would understand if the 4070 had launched at $500, but $600 is just nuts. The GTX 1060 launched at $299 in 2016, which was very reasonable; that would make it about $375 today, so $400 makes sense honestly. So that's the only place Nvidia keeps it fair: the segment that sells on volume, not margins. Everything else is just insane.
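A quick sanity check of that inflation math, as a minimal sketch; the cumulative inflation multipliers below are rough assumptions for illustration, not official CPI figures:

```python
# Rough inflation check for GPU launch MSRPs. The multipliers are
# approximate cumulative US inflation to 2023, assumed for illustration.
INFLATION_TO_2023 = {2014: 1.28, 2016: 1.26}

def adjusted_msrp(price_usd: float, launch_year: int) -> float:
    """Convert a launch-year MSRP into approximate 2023 dollars."""
    return price_usd * INFLATION_TO_2023[launch_year]

print(f"GTX 970  ($330, 2014) ~ ${adjusted_msrp(330, 2014):.0f} today")  # ~$422
print(f"GTX 1060 ($299, 2016) ~ ${adjusted_msrp(299, 2016):.0f} today")  # ~$377
```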
I always switched GPUs when I noticed my old GPU wasn't holding up anymore. I still remember how I tried running Assassin's Creed Unity on my GTX 760 and it ran like crap at 1080p. I then switched to the 970, which was a terrific card for its price and served me for many years. Unfortunately, it broke, and I then bought the 1080, because the price dropped massively when the RTX 20 series got announced. I'm using that 1080 to this day, and of course you have to do some tweaks, but so far there hasn't been a game where I felt the urge to change my GPU because it wasn't holding up anymore. I'm currently playing Borderlands 3 at 1440p medium-to-high settings and it runs at about 80-100 fps on my 120Hz TV. Maybe things will change once devs leave the old consoles behind, as PS5s and Xbox Series Xs are now widely available. But for now, the 1080 is still good. Which is a shame, because I actually wanted to wait for the 4070; but when I saw the price, I immediately lost interest. Thanks NVIDIA.
I've been running my trusty 970 since that card launched, upgrading from a 660 Ti. Every GPU generation I've been looking to upgrade, but I've always held back: the 2080 due to RTX being experimental and overpriced, the 3080 due to supply issues during covid, the 4080 due to the price and the 4090 being too tempting in comparison... though I never bit the bullet. I just started doing UE5 game development and the 3.5GB of VRAM on my 970 is crippling me, so I'm finally pulling the trigger. Honestly, the 970 is such a legendary card; it has kept up at 1080p, and even some games at 1440p, for so long. But now AAA games and technical programs just destroy it. Gonna be a hell of a leap for me, and I will finally get to use my 144Hz 1440p monitor to its full potential.
@rugbyf0rlife I've been using my 970 for 9 years. It's been an absolute machine. I'm kinda sad to replace it actually but it's time for a total overhaul. I'm hoping to still find a use for it and the entire old machine in some capacity because it all still works
@@rugbyf0rlife I'm still running a 2GB GTX 960, and you basically perfectly described what I've been going through as well. The jump is going to be insane, but hey, at least the 960 can still hold a candle to most AAA games... aside from Cyberpunk 2077 and similarly demanding titles 😅
You're impressed by a duopoly where AMD is not competing, always diving 1% under Nvidia's price so they can both keep fleecing consumers. Good for you, I guess.
Yeah, I switched from a 3070 to a 6950XT, and then I got a great deal to switch to a 7900xtx, and I'm super happy. Don't care about DLSS, I just want raw 4K gaming 😊
I mean, even the comparison itself is weird. That just goes to show how amazing the 1000 series was all around! Well, seems like I'll keep rocking my 1080 too, unless I find an amazing deal on a used 3070.
This review of an Nvidia product made me a bigger Intel and AMD fan, as I am very surprised by the performance of older AMD cards after all the updates, and I can't believe that Arc can actually place somewhere on these charts with pretty good fps.
The Arc A770 16GB is actually really good; most games can run with the settings on highest and still achieve 60fps, even if that is with XeSS or FSR. I'm maxing out Hogwarts with ray tracing at 4K and still getting playable frame rates. Lots of Nvidia and AMD cards can't even do that, due to a lack of VRAM on Nvidia's side and a lack of ray tracing performance on AMD's. Intel is the sweet spot right now, and I only expect future launches to be even better.
@Chronometer I hope that's not mostly ownership bias, but it does make me hopeful for the mainstream market. It's no problem in my eyes if Nvidia screws off and creates some kind of "luxury gaming" tier of the market, as long as there remain compelling offers from other entities.
The thing with Arc is that it's effectively Vega 30, at least according to Chips and Cheese. It's really good when it gets loaded, hence why you'll see how the A770 competes with a 6700XT or a 3070 in TimeSpy, or how when running something stupid like Hogwarts Legacy with RT at native 4K it loses so comparatively little performance that it slumps its way to 3090 tier in some way at 23 FPS to the latter's 25 (going off of day one TPU review). The problem is that games aren't synthetic benchmarks, and most people tune their settings to match their card. So, while an A380 beats a 1060 and almost matches a 580 on Cyberpunk at 1080p Medium, most PC gamers don't want to play it at 36 FPS. So the settings go down to low, and the A380 gets around 50 FPS while the latter two get 70+. Arc is like a random grab bag of candy you get on Halloween, and you hit up every non-normal neighborhood in town. You'll get great candy in small amounts from the rich. Someone will home make some surprisingly good chocolate. A person will slip in Tootsie Rolls with razor blades and get you in the hospital.
Honestly the only reason I'm not buying an arc is the lack of support for legacy games, of which I have 500+. I might just make a legacy computer and use an arc or AMD build for new games post 2020.
@@jemiebridges3197 Intel has come a long way with their drivers and the hardware itself can more than handle older games. I doubt you would actually need to build a second computer with older hardware. I have as many legacy games as you, though I haven't tried them all yet. When I first got the card I'd have said something different, but with the improvements I'm confident it would work for you
Honestly, I'd love to see you use DCS as a comparison for some of these cards, simply because it's a free to play full fidelity flight sim and it would be good to see some benchmarks
What I really like about DCS is how well it proves the point about system RAM. That game is a monster when it comes to memory. I think it was the first game to recommend 32GB, and not only that, you can find benchmarks out there comparing different RAM speeds and timings, and the difference between bottom-of-the-barrel 2133MHz and 4400MHz preemo RAM kits was literally like the difference between a 6600XT and a 6800XT. That game is super sensitive to RAM apparently, and intensive af.

I do like it when studios really push things to the limit. Like, did you know Age of Empires actually has a 2560x1440 display resolution? Microsoft actually left a resolution for it back in the 1990s, when every screen was 640x480. Pure wizardry. So I don't think it's a bad thing at all to make a game that the most high-end hardware barely takes advantage of, which is why I like the Overdrive fully path-traced tech demo of Cyberpunk. I just don't like it when manipulative, lying corpos then go and try to bully tech journalists for treating it like a tech demo instead of acting like it's the only performance which counts. I think I would've had a much better reception to RT in general if they'd just introduced it the way they're introducing fully path-traced scenes, and emphasized performance otherwise. It's like how I was cooler receiving 3D V-Cache, because it's a one-trick pony, so I may get a 5950X anyway.

The average gamer in general doesn't seem to understand that not all games use system resources the same way, and DCS World vs Civ 6 vs Cyberpunk vs GTA V is a really good example of that: you can have all kinds of engine limits and resource utilization that make any one game not really indicative of the fuller experience. RAM speed might not matter in lots of games, but in some it does, and it's easier to make an informed choice when the answer to "does it matter" is "sometimes, but those times it counts, it might count a lot." It's like why the 5800X3D is literally superior to the 12900KS in certain titles, but other times loses pretty badly. It isn't always that one is just "faster" or "better"; they perform differently depending on engine and title.

Honestly, I'm a bit shocked how much compatibility there is at all sometimes. Maybe for all my bitching, it would be so much worse in tech if we had a dozen different companies. I guess it's really nice for GPU coolers, but not having standards is such a nightmare; it's why USB existed in the first place, I guess. It's part of why spec listings from these brick-and-mortar companies, and sadly even Steam system requirements sometimes, are so useless: "i5, 8gb of RAM" is practically meaningless to me at this point, and RAM is definitely an area where SIs like to cheap out. DCS is a great example of why that matters.
My 980Ti is old enough to be in second grade and I haven't yet felt particularly compelled to upgrade it. 45-55 fps in Cyberpunk 2077 at 2560x1080 with all the non-RT eye candy cranked to ultra is still good enough for me, for now.
@@ntzt2150 I kinda dislike buying used tech, because it's a gamble. You never know what crap they did with it. Maybe they stressed it to the limit or dropped it _(wink)_ But well, maybe I'll have to adapt..
Feels good being a 6800xt user. Saving several hundred dollars over an NVIDIA card while seeing better or similar non-RT performance is a great feeling :)
This video finally convinced me to upgrade my GTX1070, but for the first time in my life to a team red GPU, the RX 6800 XT looks like such a great deal now
Not sure where you are based, but in the UK there is only a small difference in price across the 6800 and 6900 (used), and if you want new, I've seen some good deals on the 6950xt. Worth a look!
Just upgraded to the 6800 XT a month ago (From a 2070 Super) and it was well worth it. I'd be lying if I said I didn't have to do some tuning with software to get it up to standards but that was mainly my own fault with not keeping my PC well optimized over the past 3 years. I bought this rather than the 6950 XT because the thermals and power consumption are infinitely better, and imo, that outweighs the raw "value per frame" advantage that the 6950 has. 6800 XT is plenty enough to run all my games at 1440p with max graphics settings, would recommend.
They should also add the 3080 Ti, and that would make the 4070 look even worse. Anyone who took advantage of back-to-school free upgrades from a base 3080 to a 3080 Ti on pre-built sites made off like bandits, and RT/DLSS can't hide that comparison. Wait a few months: when Nvidia ends the rebate program to AIB partners for initial 4070 sales, the price jumps will make this card a hard pass.
this makes me feel fine about the 7900XT I just bought a few weeks ago. was really torn between it and the 4070/4070TI but ultimately decided I wanted more vram for the long haul. have been really enjoying it so far.
That was definitely the right move. 12GB of VRAM is not enough even for 1440p these days; I am consistently seeing more than 12GB used in The Last of Us, FH5, RE4, and Hogwarts Legacy.
Why would it cost €800? The 4070 Ti can be had for €880-900, so I am expecting this to cost €650-700. Also note that comparing new vs used cards is a bit unfair in the price department. Edit: it can be had for €659, officially and on third-party sites such as Caseking if you are in Germany; of course, regional variances apply.
I'm just now upgrading from my R9 390. Outside of driver conflicts, I've been able to run pretty much anything at 1080p on it. Have a 3060 coming in tomorrow tho
Yeah, I remember the RX 580 8GB being equal to a GTX 1060 6GB when it was released. But now it performs like a 1070, if not slightly better. Same story with the RX 570.
It also helped that both the PS4 and Xbox One were GCN-based, and since almost every game is designed primarily for console, almost every game engine ended up being optimized for that architecture.
Getting 6800 XT performance... for $10 more, you can get a 6950 XT. Why are people buying Nvidia? I mean, I bought the 4090, coming from a Radeon R9 295x, but if this were my budget I wouldn't even consider Nvidia.
I just upgraded to a 4070 from a 2060 Super in my small form factor build, and I am very happy. It only needs a single 8-pin, runs cool, and it's quiet. The performance uplift from my previous GPU is also quite good: I went from 1689 to 2619 in the Heaven benchmark. I use Blender, so I had no choice but to go with Nvidia, and I am glad there was finally an option that didn't require a new case and power supply.
I'm in a similar boat: my GTX 1080 is starting to show its age, and the only GPU that is at least twice as fast as my 1080 and will fit in my case is the 4070. So now the question is, do I buy now, or wait and see what AMD has to offer with their mid-range RDNA 3?
@@Sid-Cannon Just an opinion from a stranger on the internet, so take it for what it's worth: if you actually need the extra performance today, go for it. Assess your needs and base your choice on what meets your criteria. It's hard to make a decision based on a GPU which doesn't exist.
I'm really glad they focused on how Nvidia is raising prices in line with performance gains for the second time in the last three generations. What I'm sad about is them not pointing out NVIDIA's record profits last quarter (even with the crypto bubble over, a new record). They're exploiting the lack of competition to simply raise prices, knowing they have a near monopoly; record profit after record profit has been posted. They learned the market can be squeezed for more, and they're squeezing hard.
What can we say... we voted with our wallets. Look at Intel... thanks, AMD... and now look at Nvidia. Nvidia stock is trading at too high a P/E because everyone thinks they're unbeatable now.
It's capitalism. They're simply doing what all companies do. They are actually obligated to by their shareholders. The prices are high because people are willing to buy them at that price. Be angry at the buyers. They're the ones causing the price rise. Not the company.
Yeah. Actually, if nobody bought the cards, they would have no choice but to lower the price; the fact is, there are still people who buy at that price. Personally, I will keep my 2070S until a better deal comes along.
I thought the 4070 was gonna beat my 6900xt, which I got for $550 this generation, but damn, AMD is killing it! Every day I'm becoming more pleased with going red.
@@paulelderson934 I didn't get second hand tho. It was a refurbished model with 2 year warranty but yea the second hand market is crazy rn in my region. I can see there's still that refurbished model available for like $400 now
Kinda crazy how I’m in the same boat with so many others. Got a 1080 when it came out and it’s still going strong, but the 4070 finally seems like the time to upgrade. Planning to hang on to this one for quite a while as well!
Hell, you can get a new one for under $550 if you look in the right places. That's for a card with 15% greater performance and more VRAM, as long as you don't crank up RT to get sub-60 frame rates anyway.
Yeah, I grabbed my red devil 6800xt new for $513 back in January and I can't believe how good of a decision that was lol. Micro center is the goat for that deal.
>4000 is a much better value than 2000 / 3000 series no its not. 2000 / 3000 were priced that way because you could've mined eth with them and actually you could've returned ALL YOUR INVESTMENT ON YOUR OVERPRICED AF 2000 / 3000 CARD + MAKE A PROFIT. only stupid ngreedia fanbois don't understand that 2000 and 3000 ngreedia cards were overpriced ONLY BECAUSE THEY COULD'VE MADE MONEY to their owners THEN. now mining is DEAD. so 4000 series prices make 0 SENSE. same with ayymoode.
Did you watch the video? The only place it was even comparable was Photoshop. Everywhere else it performs worse than a 3060 lol. What sort of fanboy copium is this?
This 4070 launch has cemented amd graphics in my build this year. I am upgrading from i5-7600 with a 1070 to i5-13600k with either 6800XT or 7800/7700 series that are probably coming this summer
What I find amazing is that the 6800XT is about 14% ahead in performance at 1080p. That's similar to the difference between the 6800 and the 6800 XT. So given that the 6800 has always been faster than a 3070... it's like Nvidia just closed the gap to their competitor's last gen with a whole new generation. It's terrible to see. Especially considering that you can get a 6800 for $500 or even less. Even if someone wants an Nvidia GPU, it might make more sense to wait for next generation; it's unlikely they can make a worse deal than this time. Otherwise, AMD just offers the better deal currently. I'm definitely happy with my 6800. Also, having 16GB of VRAM can be pretty handy when ray tracing gets used more, because it actually requires more VRAM in practice, which is something Nvidia won't tell you before selling you 12GB or less.
I've been loving my RX 6800 XT! Didn't realize the software updates had been so influential; kinda crazy that it outperforms a 3080 nowadays, according to the data here.
AMD has been generous with software updates. On my RX 580 I got, over the years: FreeSync/VR support, sharpening for upscaling, and FSR. This GPU still rocks at the resolutions it was made for, and it even stretches to 1440p with old games, or with FSR for new ones.
Please stop buying this Kool-Aid. It isn't the software updates, it's the larger VRAM sizes. Please go watch Hardware Unboxed's 3070 vs RX 6800 2023 revisit to see some eye-opening stuff!
@@eaman11 My 980ti died recently and I'm temporarily running an RX5500XT. When its replacement arrives, it'll be an Nvidia card, but I will absolutely miss those AMD drivers; they're so good! Chill is witchcraft, I love it!
RX 6800 XT: about 550 dollars????? RTX 3080: about 810 dollars?!?!?!? What the hell? :-O Personally, I was kinda pissed that my GTX 750Ti couldn't play "Doom Eternal"... so I installed "Binary Domain"... had fun... and didn't spend a penny ;-)
Having an RTX2070 seems still like a solid option for my workload; Factorio, an MMO every once in a while, video editing, and basic programming. But damn, holding a freshly unboxed GPU is one of the few joys in life :D
Pro tip: don't buy anything new with less than 12GB of VRAM at this point. That is really the important thing to note. It doesn't matter how many CUDA cores you have if you just don't have the texture memory.
Hardware Unboxed did a good demonstration of just what happens when you're hitting VRAM limits. Stuttery, unplayable messes where all those cores are meaningless.
@@LiminalSpaceMan192 It depends on the target resolution. Needing more than 12GB at 1440p is a >50% increase for, as far as I can tell, no perceptible increase in graphical fidelity... That's definitely worryingly bad from a technical point of view.
For me, my 2070 still runs above 60 FPS in every normal game I want to play, but the area I would really want to have better performance in is VR. It would be nice if VR stats were included in the videos, since that's really the only thing that makes me want a better graphics card.
It's definitely not the most accurate, but I use the 4K stats for VR. Unfortunately, VR GPU testing should have its own production, as resolution differs considerably between headsets (even among high-end models), as I'm sure you know. But I couldn't agree more: we need more VR data, especially nowadays!
My brother-in-law is using my brother's old 1080; he has no complaints about Hogwarts Legacy and is enjoying it with his son. He's a big board game guy, but his friends and their kids (now that the kids are older) are starting to dip into video games again.
My 1080Ti is still strong. Will most likely upgrade my 9th gen/1080Ti system when 14th/15th gen comes out and will pair nicely with a greatly reduced in price 4090 :D
nVidia has become too lazy and complacent due to years of majority control of the GPU market. That's why I ditched Nvidia this time and bought a 7900xtx, even though I had always bought Nvidia since 2010.
I upgraded from a 1080 to the 4070. I didn't mind lowering settings over the years, but I wanted to go back to playing modern games in 4K since the release of the 3080 which was never in stock. Regardless, I saved $100 waiting, and the 8 pin requirement saves time.
@@mjtomsky6387 Yes, specifically the Asus Dual OC model. I think it has one of the best coolers of all time: it stays at 33°C at idle without the fans even spinning. Great energy efficiency for a card that is 3080-level.
Thank you for including the 1080; I am still rocking it for 1440p, got it during the pandemic. Keep comparing old hardware against modern hardware.
I also have a KFA2 1080 OC and I'm actually surprised how well it still runs. I play RE4 Remake on medium settings at 1440p/144Hz and get between 70-100 frames thanks to the latest driver. When I first played the chainsaw demo, I thought I'd have to buy a new card because I was only getting about 30-40fps, but now it's completely fine. :D
Great review! Clearly a lot to be a bit grumpy about, especially the cost increase relative to the performance boost since last gen, but this still feels like a good option in niche circumstances. My brother is on the now-ancient 4790K with a 1070 Ti, holding out for DDR5 to become more accessible and standard before upgrading his CPU and mobo, but desperate for a better GPU to carry him through that. He's obviously gonna be CPU bottlenecked a lot, but even so, 3080-class performance for $600 that sounds like it will work with his 750W supply feels like a great quality-of-life improvement until he can sort out the rest of his rig. His rig is actually a hand-me-down from me; I upgraded everything after I was lucky enough to get a 3080 FE back in 2020, but there was a short window where I just slotted the 3080 into that 4790K setup, and it definitely ran things a WHOLE lot nicer. This feels like a perfect bridge GPU in that (very particular) case.
I'm really feeling good about that 6900XT I bought 5 months ago, especially now that I've got my CPU and mobo up to date too, with a 7600X and SAM enabled. Feels like it would've taken the combined cost of my GPU, CPU, mobo, and RAM just to get an Nvidia GPU that could compete, and even then it would take a couple of decades for the power cost difference to make up the gap.
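That kind of payback claim is easy to sanity-check with a minimal sketch; the wattage gap, electricity price, and hours played below are all illustrative assumptions, not measurements:

```python
# How long does a GPU's lower power draw take to offset a price difference?
# All numbers below are illustrative assumptions, not measured values.
def payback_years(price_diff_usd: float, watt_diff: float,
                  hours_per_day: float = 3.0, usd_per_kwh: float = 0.15) -> float:
    """Years of gaming until energy savings cover the extra purchase price."""
    kwh_saved_per_year = watt_diff / 1000 * hours_per_day * 365
    return price_diff_usd / (kwh_saved_per_year * usd_per_kwh)

# e.g. a card that costs $300 more but draws 100 W less:
print(f"{payback_years(300, 100):.1f} years")  # ~18 years under these assumptions
```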
Interested in upgrading my 2080ti to a 7900xt, so I'm very interested in SAM. Have you done any testing with and without SAM enabled? Should I go for it? Wanting the upgrade to make the most of a 4K 120Hz LG OLED.
Same! I initially wanted to upgrade from my 590 to a 7900, but the early price was so high in my country compared to the actual MSRP. Luckily, I then found a 6900xt at a very good price; so happy with it now.
@@sexyfishproductions I know you didn't ask me, but it's really up to you whether you want to turn it on. Usually it's a 3% performance increase in most games at 4K, but some can be up to 15%.
@@sexyfishproductions My 6900xt has a lot less stuttering with SAM. As for direct fps boosts, idk; I think it did give me a bunch of fps depending on the game, especially in Forza. I have a 5800x, btw. But hey, Resizable BAR also exists for Nvidia, so at the end of the day it depends on what you wanna pay for a card lol.
I wish you'd included the 6700xt on your graphs. With 12GB of VRAM, it's actually been edging out the 3070 in some newer games. It would be a nice point of reference in approximately the same tier
That LTT Store mention had me like 🤔👀🤣, and that "Shadow Realm" reference gave me a flashback to junior high, playing Yu-Gi-Oh in the lunch room and telling a kid, "I summon Dark Magician, prepare to be sent to the shadow realm".
"All of you Pascal owners out there that are finally gonna bite the bullet and upgrade" Right about that one! I finally felt it was time to upgrade my 1070Ti, so I went with a Merc 310 7900 XTX a couple of weeks ago for the same price as reference new, and I love it! Not this time Ngreedia
Went with the RTX 4070 Ti. Gamers paid $2k last year for the RTX 3090 Ti and even bought the 3090 for that price from scalpers. Nvidia is so nice they made it $800 for the same performance, if not better, at 1440p lol. Thanks Nvidia. I actually don't give af as long as it performs like a high-end GPU with way better power efficiency.
@@xikirito_6809 Nah, I'll just buy another one in 2 years when 4K OLED high-fps monitors come out. Just be the best version of yourself. It's better than crying about prices, especially when the product you get is worth the performance gains.
What a lot of people seem to forget is that it wasn't until this time last year that most folks were finally able to get their hands on a 3080/90... soooo there's that. And I'm not even entertaining all of the other rabbit holes I could certainly wander down. Most folks are just good now...
Hopefully Nvidia will start to lose money on production and marketing and won't be able to profit because people have stopped buying their top-of-the-line cards, and they'll end up in financial trouble. It feels like the only way they'll actually deliver and stop the crack smoking with their pricing is if they have to do everything it takes to squeeze out as many sales as possible to get back on track.
@@charlesm.2604 It would help if people covering tech make an effort to not review nvidia cards, not include said cards in graphs, and cover as little news about nvidia as possible.
@@rimuru2035 I don't feel a news outlet should choose to not cover a particular brand. That would make them not valuable to me personally. I can make my own decision regarding what to do with the news.
You were spot-on about the GTX 1080 people holding on to their cards. I am still using mine for gaming, but I am looking at GPUs for rendering in Rhino3D/Cycles. I find the new card prices a bit much.
Amazing, in 3 years of development Nvidia managed to lower the TDP of the 3080. Just, incredible work! Their team should be immensely proud! With gains like these we might even have Path Tracing in 10 years!
I would love to see a 6950XT stacked against the 4070, given they are regularly going on sale for $600 currently! I decided to go AMD route because of the subpar price to performance with Nvidia recently.
If it's current prices you're comparing, you can. Not sure how common the "sales" are, but sometimes they're basically permanent, in which case it's a valid comparison.
One thing that deserves more focus is power dissipation. The 40 series looks a lot more efficient in terms of power consumption than the 30 series, so while the 3080 may have a small edge in performance over the 4070, it uses 50-100% more power to get it.
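A rough frames-per-watt comparison makes the point; the fps and board-power figures below are ballpark assumptions for illustration, not measured results:

```python
# Performance-per-watt comparison. The fps and board-power numbers are
# assumed ballpark figures for illustration, not benchmark results.
cards = {
    "RTX 3080": {"fps": 100, "watts": 320},
    "RTX 4070": {"fps": 95,  "watts": 200},
}
for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.3f} fps/W")
# Under these assumptions, the 4070 delivers ~50% more frames per watt
# while the 3080 draws ~60% more power for a ~5% performance edge.
```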
Each 40 series review that comes out has me scared I made a mistake buying a 3080 this past Christmas. This one had me especially concerned, since it was supposed to be the card most comparable to the 3080. Glad to see again that I made the right call. I'm beyond happy with my purchase (especially since I came from a 1060 hahaha) now that I see they perform basically the same for the use I give them.
Same here. I have zero remorse for buying a 3080 at MSRP even this far into the 4000-series. I have a feeling it will serve me well all the way through the 5000-series too.
You chose right for bang per buck. Try buying a 3080 Ti Strix OC in Feb '22... you don't wanna know how much it dropped in a month. But it was a birthday present to myself, so it doesn't sting as much (it does, I'm kidding myself) :)
I have a 4070 Ti. I play my games in peace on my beautiful OLED 1440p 175Hz monitor. People spend so much money on iPhones every year, which barely have a performance increase. I am a simple man: I see the 4070 Ti is nearly identical or better at 1440p than the RTX 3090 Ti (the fastest GPU a year ago) while being significantly cheaper, so I buy it (:
Would be nice to see a price per frame comparison (adjusted for inflation) of the last few GPU generations to see if we're actually getting more bang for our buck.
@@redfoottttt You take the price of any given GPU and divide it by the average number of frames per second it generates in a given game at a given set of video settings; that gives you comparable price-per-frame values.
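A minimal sketch of that calculation; the card names, prices, and fps figures below are hypothetical placeholders, not benchmark results:

```python
# Price-per-frame as described above: price divided by average fps in a
# given game at given settings. All values here are hypothetical.
def price_per_frame(msrp_usd: float, avg_fps: float) -> float:
    return msrp_usd / avg_fps

examples = {"Card A": (599, 120), "Card B": (499, 95)}  # (MSRP, avg fps)
for card, (msrp, fps) in examples.items():
    print(f"{card}: ${price_per_frame(msrp, fps):.2f} per fps")
```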
I love the tons of data that are in the reviews now because of the lab, but sometimes it gets a bit confusing. I personally would prefer having a graph separated between what's competitive and everything else: for example, having only the 4070, 3070, 3080, and 6800XT, and maybe the 1080Ti, always at the top of the chart in a fixed position. It would help show the performance difference against the products viewers should compare with when making a buying decision. Great work by the lab in this review and the Ryzen 7800X3D review; it's just that the data can feel a bit overwhelming sometimes.
I agree, it would be nice if there were some "gold standard" type GPUs in a fixed position on the graph, and we could use them to ground our understanding of where [new GPU] fits.
I went from a 1080 bought when it was released to a 4070 yesterday. If I get like 7 years out of it again, I am happy. I like the rather low power consumption
I’ve been team green since my Geforce 6800GT (2005ish I think) and I’m finally starting to consider AMD for my next upgrade. Their recent driver update history is confidence inspiring and I’m done with shelling out the premium team green price.
It's tough, man. Team green products just work; I've been using them since the 9800GT. Not as long as some, clearly, but still: they just work. I hear only horror stories about AMD cards and their stability and drivers, but it's very quickly getting to the point where the $$$ outweighs the chance of issues.
I'm in a similar boat. I've been using nVidia since way back when I got two 660 Tis (when SLI was quite a bit more popular than today). I have a GTX 1070 in my machine, and current cards are just so much better. I train AI models on my 1070, and it's a bit lackluster, especially compared to the RTX series options. However, $599 for a 70-level card??? The GTX 970 launched at $329. The current pricing is just completely unreasonable.
Buying a new GPU doesn't involve just replacing the GPU, so it makes sense that the 1080 is going to be around for a while, especially considering graphics probably aren't going to get exponentially better anymore and the 1080 runs most modern games at regular settings pretty well.
The data in these vids is getting more and more impressive. I'll say tho, another upgrade y'all could make to the figures is incorporating uncertainty bands. That one note about +/- 400 for Arc made me wonder whether there were any significant differences between those cards at all, or if the results were just an arbitrary ordering based on a single draw. I know doing repeated runs of the same tests takes much more time, but for really important reviews the return might be worthwhile!
@@jackrametta I suggested using distribution curves for some of their frame data as well, rather than multiple bars for different values. They seem to respond to criticism of their data presentation with the defence that it needs to be widely understood, and that there are simply too many normies who won't understand high-quality data for it to be worthwhile.
@@kanjakan just a normal distribution/bell curve (google it). It’s pretty self explanatory when you look at it, but it’s hard to describe. It’s essentially a plot of frequency vs value. You’d have your frame rates as the x axis, same as their current bar charts, then the y axis will be what percentage of the total time spent testing was spent at each frame rate. If you had each test rig colour coded and semi transparent, then overlaid them, it would provide heaps of immediate visual information about what to expect for each setup. It would look like upside down “U” shapes with varying curvatures where the further to the right the peak is and the less spread the curve is, the better. I don’t think it requires collecting any more information than they do already.
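For what it's worth, that overlaid frequency-vs-value idea can be sketched in a few lines; the data here is synthetic (a real version would ingest per-frame fps logs from each test rig):

```python
# Sketch of the overlaid frame-rate distribution idea described above,
# using synthetic data. Means and spreads are made up for illustration.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
runs = {
    "Card A": rng.normal(120, 8, 10_000),   # (mean fps, spread, samples)
    "Card B": rng.normal(95, 15, 10_000),
}
for label, fps in runs.items():
    # Semi-transparent, density-normalized histograms overlay cleanly,
    # approximating the smooth frequency-vs-fps curves described above.
    plt.hist(fps, bins=100, density=True, alpha=0.5, label=label)
plt.xlabel("Frame rate (fps)")
plt.ylabel("Fraction of test time")
plt.legend()
plt.show()
```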
Wow, it's still surprising how well the GTX 1080 has lasted over the years, which is why it's staying in my PC for another couple of years! Even for 1440p mid settings.
@@squirrelsinjacket1804 I will upgrade to a 4070 eventually, but for now games like Cyberpunk, Battlefront 2, and TF2 run surprisingly well at 1440p. In Cyberpunk I can get 70+ fps at mid-to-low settings, because the card is water cooled and overclocked like crazy.
I am a lifelong AMD card user, and I can't remember the last time I cared about sitting through a video card review in its entirety. But I watched and really enjoyed this entire video. The writers did a great job!
What they don't tell you about DLSS3 on the 4070s is that it uses CUDA cores instead of dedicated hardware. That's their dirty little secret. You can tell this by the 4090 giving 2-4x frame generation, while the 4070 only gives ~1.5x frame generation. That means you're actually losing real frames to gain fake frames. The 4090, however, does not lose real frames; it simply adds more frames on top of real frames.

This is actually quite a deceptive change, because it means frame generation does not increase frame rate equally across different cards, making frame generation far less effective on the 4070s. If you notice on the graph, the 4070 went from 36 to 58 fps with frame generation. Since you need an equal number of generated frames and real frames, this means that with frame generation on, the 4070 is actually producing only 29 real frames instead of 36. That means your real-frame latency actually drops below even 30fps levels with frame generation on. This can only be explained by them secretly using CUDA cores to do the frame generation, rather than the dedicated frame generation hardware on the 4090.

This is a big freakin' deal, because it seriously undermines the benefit of even having DLSS3. I would consider this kind of deception not just borderline illegal, but actually illegal, as they have fundamentally altered the promised nature of how a core feature functions: real hardware in the higher-tier cards, and what amounts to emulation in the lower-tier cards, all without actually disclosing it. Like a child being fed and having the spoon switched from delicious cake to vegetables at the last minute, this is exactly what Nvidia has done with the 4070 and consumers, hoping you don't notice. This is class-action-lawsuit levels of deception, like back when Nvidia made the GTX 970 with 4GB of VRAM, but 512MB of it was segmented from the main pool and thus virtually useless. The 4070 doesn't have real frame generation, it has EMULATED frame generation that comes at the expense of core performance.
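A minimal sketch of the arithmetic above, assuming (as the comment does) that frame generation strictly alternates one generated frame per rendered frame; that 1:1 assumption is the commenter's, not a confirmed implementation detail:

```python
# If frame generation inserts exactly one generated frame per rendered
# frame (an assumption), the rendered-frame rate is half the displayed rate.
def rendered_fps(displayed_fps_with_fg: float) -> float:
    """Assumes a strict 1:1 alternation of rendered and generated frames."""
    return displayed_fps_with_fg / 2

print(rendered_fps(58))  # 29.0 - below the 36 fps measured with FG off
```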
Just because Ngreedia didn't meet your expectations with this card does not mean they did anything illegal lol. Don't know if you were being serious with that or not.
Glad I got my 7900XTX at MSRP. I had a 3060 Ti and I loved it (1440p high at 144Hz was more than enough for me), but these 40 series cards aren't worth the money this generation.
I upgraded because I stupidly sold my PC last year…I literally only have a GPU because i was waiting for the 7800X3D until sites other than Best Buy listed them $100-$300 over MSRP 😭
@@dauntae24 oh for sure. If you’ve got the money go for it! No shame here. I just found AMD’s value proposition more enticing, again if you can find the products at or around MSRP
Who would've thought the 40 series was going to make the 30 series look a lot more desirable.
I still remember all those videos about how RTX 40 would be the new Pascal LOL. More like the new Turing.
@@DragonOfTheMortalKombat MORTAL KOMBAAAAAAAAT!
The problem is releasing new GPU series too soon. If the 40 series had released 3-4 years after the 30 series, it would be much, much better, at a similar or slightly higher price.
"Who would've thought"??? Saw this coming miles away after that report from last year which claimed that NVidia were actually part of the shortage problem, stockpiling the 30 series.
They have to sell the 30 series fast while keeping the 40 series appealing. How? Sell the 40 series at high prices so the 30 series looks appealing, but keep some features, like Frame Generation, exclusive to the 40 series.
@@Saver310 FATALITY!!!!!!!!!!!!!!
The way that the 1080 is still able to show up on graphs just shows how much of a monster it was
Mine just died and I am so sad about it.
i have a 1080 lol
It shows how stagnant computer graphics and today's GPU market are.
It’s literally a high end gpu, that’s the only reason why it’s still good even more than 5 years later
I've been using a 1080ti since the day it came out, and I can safely say I've been able to keep up with games flawlessly to this day.
9:25 We're so glad you're using our website, Linus! Thanks for featuring us!
Congrats!
So happy for u guys! 🥰
The best website I've found. Recommend to all my friends.
_So easy to find out free games out in the wild, hehehe_
Your site is awesome, thank you for the great job!
This is the way
I have found the RTX 4070 is really good for 3D rendering in programs like Bryce, Daz Studio, and the Iray render engine (it also rendered really fast in 3ds Max, but I only had an upgrade trial copy, because my licensed copy of 3ds Max is a decade old and is only used for modeling). My personal benchmark was a scene I created specifically to punish GPUs. I rendered the same scene in Daz Studio using the Iray engine at 4K resolution, with 4K textures on every object, set for a total of 3500 iterations. My results for the three cards I have: the RTX 2070 Super's render time was 6 hours and 43 minutes, the RTX 3070 took 3 hours and 22 minutes, and the RTX 4070 cut it down to 44 minutes total!!! (I was shocked at how fast the 4070 rendered the scene.) I made no changes to the file before rendering, so all three cards were rendering the exact same scene. I know this was not a formal bench test, but I thought I would share the results for the 3D modelers out there, since most card reviews focus on games over 3D programs. Hope the info helps.
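For context, a quick script to turn those quoted render times into relative speedups (the numbers are taken directly from the comment above):

```python
# Relative speedups from the render times quoted above (times in minutes).
times = {"RTX 2070 Super": 6 * 60 + 43, "RTX 3070": 3 * 60 + 22, "RTX 4070": 44}
baseline = times["RTX 2070 Super"]
for card, minutes in times.items():
    print(f"{card}: {minutes} min, {baseline / minutes:.1f}x vs 2070 Super")
# ~2x going to the 3070, and ~9x going to the 4070 for this Iray scene.
```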
Nice, the benchmark I'm looking for. Using an AMD card and wanted to upgrade to AMD again (for gaming and rendering/designing), but I've noticed that Nvidia has major advantages for 3D rendering.
I love the small changes you guys made to the graphs! Last review, there were a bunch of people saying they couldn't make out which product was being reviewed in the graphs, and just highlighting the product under review is a very nice touch!
Exactly!
Thank you for pointing this out. Totally agree it's so much better!
I still think it would have been clearer if they had one section for each manufacturer.
Edit: also it would probably be nice if they had an icon or a photo for each graphics card
@@Ben.N Totally agree! it's also kinda confusing that they are sorted by speed/score instead of keeping the order the same - if i want to compare the 4070 to e.g. a 7900xt i keep having to pause and search for them with each new slide
I'm in agreement with this; I didn't realize while I was watching, but in hindsight the graphs were wildly easy to understand
Thanks a lot for including the Arc 770 on this. Got one a few months back and all the data out there is out of date, so being able to compare it to other newer gpus is nice.
Really glad to be able to use this review as an updated Arc 770 reference.
(The different typography for Intel Arc in the graphs makes it stand out. It's a feature, not a bug)
Arc is looking good with these graphs
So excited for battlemage and celestial. We might have cheap GPUs again after all
My Arc has impressed me when gaming at 1440p and 4K
who buys arc???
intel will be launching some lower end gpus soon so we will see updated benchmarks in those reviews.
You can get a 6950 XT for slightly more than a 6800 XT these days. That would have been interesting to see on the benchmarks.
@@lennybarentine6425 Linus isnt either, nvidia does not like him
GN also mentioned this on their review of the 4070.
I just finished a "last generation weapon" build, 6950xt + 5800X3D, targeting high-refresh 1440p.
No regrets.
@@lennybarentine6425 Linus has shit on Nvidia countless times before, what are you talking about.
@@lennybarentine6425 he literally uses a 7900 xtx
@@FerralVideo I'm doing the same thing for 1080p 240hz
I've only been watching LTT videos for a few years, but I have to thank you for finally highlighting the product featured in the video, instead of just tossing it in a list. Even just that one detail makes it so much easier for me to compare what's being featured to what it was tested against, instead of having to pause the video every few seconds to read each graph in its entirety. So, thank you.
04:14 "This test is very CPU bound, but it is still funny how well Arc did." - shout out to the editors for this one.
Backhanded compliment for sure! 🤣
Dang, the RX 6800 XT's performance is now REALLY competitive; AMD is doing a great job updating drivers, turning a good product into an incredible product for free!
Yeah I have people saying i'm crazy thinking the 6800xt performs on par with a 3080ti now, but it does.
So from now on, when a game releases as crap full of issues and the devs upgrade it for "free", I expect you to say this too, right?
Unless ofc it's not AMD, then it doesn't matter 😂😂
@@Freestyle80 6800XT didn't launch as crap. It launched as a 3080 competitor, and now it's a 3080ti competitor.
If a game launches as a 9/10 and then 2 years later becomes a 10/10 for free, then yeah, we're all going to celebrate that.
Look at Terraria. Over a decade of free support adding tons of new features and content.
I picked one up a few weeks before Thanksgiving for $540, and have no regrets picking the 6800xt
Snagged a factory OC'd ASRock RX 6800 XT Taichi running at 2.49GHz for $520. So far it's running better than many RX 6900 XTs. I feel lucky!
As someone that built their first PC just weeks ago, I personally went with the 6700 XT and I've been absolutely blown away by the performance, and the 12GB of VRAM is always a plus for less than 400 dollars.
I'm also on a 6700XT, I got mine used for $250. Absolutely love it
I am also a 6700 XT owner and absolutely happy; I can play every game at 2K with the highest settings (yes, Hogwarts Legacy and Cyberpunk included)
Glad to hear so many people are happy with this card, because I just bought it!
@@WithMay I highly recommend going 1440p at the same time if you haven't already. Mine needed so much CPU power at 1080p that it stuttered a bit. 1440p is buttery smooth. I went with a Koorui 1440p 144hz monitor from Amazon and I've been super happy with that as well.
@@EricFixalot I got a 1440P 165 Hz monitor from ASUS on sale for 329 euros 🥰
Seeing how good the 6800XT is doing right now I'd love a video on how last-gen cards from both AMD and Nvidia are doing right now vs at launch!
My 6800xt is really great rn.
I am still using a 5700 XT. It's not bad, and I actually benefit from AMD's FSR feature, which allowed me to play the Diablo 4 beta relatively smoothly
You forgot that 6800xt eats more power than a 4070
Strix LC 6800xt here and it's a killer card.
@@danieldaniel1210 more powah means more better?
Thanks for including the A770! We need to keep including them in tests.
Yeah absolutely loved it!
I really wish more outlets would remind audiences of how expensive GPUs have become. Nvidia -60 cards used to be budget-friendly between $199-249 USD. These days, a 3060 goes for $300-350 which is the price point of past -70 cards. It used to be that only novelty cards like the TITAN or ARES dual-GPU cards were $1,000 USD or more, but now that has become normal for flagship-level cards.
It just really feels unfair how the market has shifted so far from what was expected just 10 years ago
All these outlets are bought off, they won't give bad reviews because they won't get review samples anymore. PC gaming is just a complete ripoff nowadays. If Nvidia or AMD thinks I am going to pay $600 for a midrange GPU they are out of their mind. My next "upgrade" will be the next Nintendo Switch.
The advantage of being a gamer since the 1990's is that I don't need a reminder, I remember very well.
Inflation, economic depression, supply chain issues, etc… I personally think it’s crazy how people keep going “Back in the day…”
@@aj0413_ the only crazy thing is how gullible you are
@@aj0413_ I was just thinking this
I was a bit taken aback when I saw the price of the 4090 was around 1600-2000 dollars. Then I remembered the days when people were buying 2 or 3 $1200 Titans to SLI in their enthusiast builds. The enthusiast budget hasn't really moved from that $2000ish range. However, seeing the 70 series cards cost $600 minimum is insane.
already dropped to 500, black friday it will be sub 400
What infuriates me so damn much about this is, first of all, that the tech industry takes everybody's grandma for a ride, getting ignorant, naive parents to overpay for crap (I really don't care if smug hipsters get ripped off; iPhones and MacBooks are sometimes a victimless crime). But also, if you get enough consumers to accept being fleeced, it screws everybody over equally. So I don't really have the free-market option of just switching to AMD: if Nvidia gets simps to pay $600 for what's basically a 4060, AMD is just going to do the same. Case in point, 80-level performance is now $900, and no, I don't care that they called it "the 7900 XT." It's like Steve at Gamers Nexus said: if you just move the entire gauge, you didn't improve performance; you turned the gauge, but the reality stays the same. In this case, 80/800 XT-level performance that cost about $650-700 two years ago is now literally $900. It's not a 7900 XT, it's an RX 7800; they gave their RX 7800 a different name. And because Nvidia buyers outnumber everyone else, their sheer mass tips the scale into market stupidity. This isn't healthcare or heroin or military gear where there's some broader reason the consumer literally needs it to survive; it just exemplifies the notion companies have that gamers are the market least deserving of being taken seriously or respected, and then gamers do stupid stuff like this that makes those execs seem right. I mean, why would you respect someone who pays that kind of money for a graphics card and is too dumb to notice what they're doing?
So I can't really do anything about that beyond buying used at this point, which is mainly a problem because I didn't want to replace my PSU, and something like the 6900 XT uses too much power (not that it matters; Lovelace efficiency sucks and so does the "7800 XTX"). I'm so mad about this because back in 2020, when Nvidia was being insultingly overpriced (in my opinion; I had no idea the average gamer was this dumb), I at least had the option to switch brands, and it was fairly competitive. I still like my 5700 XT, and it still runs everything but Cyberpunk at 1440p (the game itself is just kind of a mess; I think they finally fixed the bugs and mostly optimized it, but you're still talking 35-45 fps on what's basically a 1080 Ti), so it isn't like I'm suffering, but soon enough this may become a problem for my system. If the corpos are just renaming everything while scamming me, I'm basically going to keep doing what I did all pandemic long and sit things out until my graphics card breaks.
But the other thing is, it's not just the price being worse: 8GB on a 4070 is so bad I literally keep forgetting it, like my brain runs some trim function as if 8GB had to be an error, but no, it really is an 8GB card somehow. That's not even budget at this point, dude. That's like putting 2GB on a GTX 1060; it's clearly not enough by modern standards. I play at 1440p and it already maxes out; in fact I sometimes question whether some of the texture pop-in I got in Cyberpunk was due to exceeding 8GB. There are games right now that use over 8GB on ultra settings at 1080p. So it's not just that it costs substantially more; you're getting a substantially worse product too, with worse bus width ON TOP OF the inferior VRAM. So the new xx70, the slot that last gen was practically intro-to-4K, high-refresh 1440p, and a midrange card, is two years later basically a budget 1080p card, and even there it's questionable how long it's going to last. Bear in mind this was the memory config on the RX 480; this VRAM standard is nearing a decade old now. It also pisses me off that my number one pet peeve about games, smudgy bad texture work, still stagnates solely because of Nvidia's self-defeating greed, which also makes their RT somewhat useless when cranking up RT just hits VRAM limits anyway. Textures are literally the most noticeable thing to me, so at least I have AMD to thank for pushing that all these years so devs can keep raising the standard. I just hope enough devs stop catering to the whining, and Nvidia users learn to deal with the fact that they're paying premium prices for budget hardware: either stop buying it or accept turning the settings down, because the GPU they chose is pretty crappy and they don't get that part of the premium experience. I'm not going to buy an AMD GPU and then review-bomb some hard-working dev team just because I can't understand why my fps is worse with RT ultra on; these kids need to learn the same about their bad textures, stuttering, and crashing with 8GB.
@@pandemicneetbux2110 damn, dude
@@pandemicneetbux2110 Enjoyable rant, you should edit and post somewhere for internet bux.
@@pandemicneetbux2110 you seem genuinely knowledgeable, but the 4070 has 12gb of ram, it was even stated in this video.
I thought I was supposed to be excited by the 4070 benchmark, but I was shocked to see how well the RX 6800 XT performs now.
It always performed well. On release it was within plus or minus 5% of the 3080 in regular rasterization performance, and now it competes near the 3080 Ti with its fine wine drivers. People were just so focused on the 30 series that even when we finally had on-par competition in the GPU market, they still bought 30 series
@@zackmandarino1021 The mind share is strong. I'm trying very hard to go AMD, and I love the fine wine that is historically proven, but that fucking little voice in my head always inches me towards Nvidia and I hate it. I really wanted AMD to knock the RX 7000 series out of the park and take the performance crown with their MCM design, but they came up short of the hypothetical targets that were possible for the 7900 XTX, which would have had a 5-10% performance ADVANTAGE over the 4090. They ended up coming in at 80-90% of its rasterization performance, so nearly 15-25% lower than their own targeted goals. They are at least less expensive, but not by a large enough discount that gamers are happy and the cards are flying off the shelves like they had the potential to do. I get it, oversupply and market demand and all that. But it really was a missed opportunity to capture the high-end market and mind share.
thats probably because the 4070 is a 60 series card or 50 series card with old 80 series pricing.
@@androidx99 This is exactly why I personally see Intel as the future. Why compete for the best card when you can compete with last generations cards for half the price? I understand that many people want the best card, but for almost everything the best card is way overkill.
@@kius5033 Sadly I'm like one good BF game away from spending $1200 on a 4080. But no BF and no GTA 6, so no need
Feels like a 3080 renamed 😂
With almost half the power draw
@@jaydeep-p and frame generation
Honestly with my psu that might be the selling point for me personally.
People who are going to buy a 3080 now get a better "3080". Meanwhile, people who bought a 3080 last month now suffer from buyer's remorse.
@@jaydeep-p And a higher price and less bandwidth🤦🤦
The intro has more tension and emotion than the entirety of Twilight
Damn funny
This joke is older than I can remember
Say my name
nah. the ryzen 4070 is better.
Much more affordable
More transistors too
@@fesyukiryzen 42069 has more trans sisters though
@@MusicalWhiskey fr
@@MusicalWhiskeydo you have a bestbuy link?? :)
Thank you NVIDIA for convincing me to pick up a 6800 XT for my next build.
What card do you have now? The 6800 XT seems like a nice upgrade for a 1080ti.
Yeah, you can get it for a lot cheaper than the 4070 (at least in Europe)
I've got a 6900xt and it's been great.
I just got myself a 6700xt up from a 2060 and it feels like every game runs like butter
@@nightfr09 6950 XT here, picked it up for about $700. Thing is flying even when undervolted.
A month ago i bought the 6800xt and am quite pleased with it. Especially after seeing this
I’ve had my 6800 xt for a while now and it’s been a great buy. No issues, plenty of performance, and vram. I don’t feel like it’s too much money or going to be obsolete soon.
I'm thinking of upgrading from my RTX 3060 TI and getting the RX 6800 XT. I can see 8GB of VRAM isn't going to cut it. It's fine if I can still play on 1080p but I'm already seeing that games are going to require probably 10 to 12gb of VRAM. Also the RX 6800 XT is a more powerful GPU and sometimes I play games at 1440p or 4K.
What I'll probably do is swap out the RX 580 4GB in my wife's computer and put the RTX 3060 Ti in that, and then my main system will get the RX 6800 XT.
@@Lucromis My issues were coil whine, drivers, and micro stutter. But I found that undervolting the card got rid of the coil whine, switching to DisplayPort got rid of the micro stutter, and saving my overclock/undervolt settings sorted out the drivers. So in the beginning I was semi-regretting my purchase, thinking I should've gotten the 3080, but now I am experiencing perfect gaming with no issues at all.
6800XT was my favorite GPU I’ve owned, moved on to the XTX but 6800xt still holds up really well 😊
Same. And after seeing the issues the 3070/3080 were having with their VRAM in modern games, I knew I made a good choice.
I was also laughing at the 1080 performance comparison because that's what I upgraded from.
The only gripe I have is that I can't get consistent updates on AMD Adrenaline. Not sure what the issue is, but considering most of the driver updates now are focused on the 7000 series, it's not a big deal. I do miss GeForce Experience though. But overall I couldn't be happier.
I like how Linus and the guys over at LTT have taken benchmarking to such a level that we only need one review to be able to compare. Including everything all the way back to the 10 series is brilliant, guys. Keep this up.
You should still look at numbers from other people. Everyone makes mistakes, all people have a price, everybody lies. Linus himself says that you shouldn't trust them blindly, and he's more than right with that.
@@Donnerwamp if he himself lies then it's possible he is lying about lying. Makes you think.
@@Donnerwamp yeah the point being blindly believing anyone is completely foolish
@@oaw117 no, no NO… I will not waste 2 hours thinking about this now
Except other reviewers have higher quality benchmarking. Just flat out.
Being a fan of this card feels like being a Chad, but here goes... IT'S A GOOD CARD. When he said "shockingly stable" he's not kidding. It runs like a sports car: all the fat has been trimmed off, and its efficiency is almost beautiful to watch under load. For it to contend with a 3080 at $600 MSRP, especially after last season's crypto-fail and the chip shortage, is also pretty astounding. The AI and performance-boost patches are also worth considering. I mean, let's be honest here, what does your average gamer/worker really want in a card? Raw POWER, or sleek performance within a reasonable price range? I think this is a win. Check back on this in a few years, I won't edit this OP lol
just bought a 4070, ill also be giving an update in 5 years
how's it working so far?
As someone with a 1080 just looking into the available upgrade options, I think you’re right. It’s hilarious to wade through all the negativity and entitlement online, but that’s largely par for the course for social media these days. Sure, I’d prefer it to be cheaper, but ultimately the power and efficiency of the 4070 will give a huge performance boost over my current card.
@@plain-bagel I'm in the same boat, 1080 here. Was looking at a 4070 Ti for $1000 but realized I don't need that extra performance; however, I can't go under $700 for a 4070 in Sweden atm, even on Black Friday... oh well
The 4070 really is in a bad spot because the card itself *IS* very good, but the starting price was fucked and now everyone just blanket dislikes the 40 series. As a $500-$550 card to upgrade to from a previous gen, lower spec card, it's *fine*.
I am impressed how much performance AMD could squeeze out of their cards through software updates.
I keep telling people Nvidia is only better at the very top end, i.e. the 4090. If you don't need the very tippy-top in terms of performance, AMD is better (also, RT is very overrated; I've never noticed much benefit from using it), and if you're on a budget AMD demolishes Nvidia. Edit: Also, to you Linux users out there, AMD is generally better than Nvidia due to better driver compatibility
I’m still happy with my 6700 XT it’s not even overclocked but meets all my needs
@@u2popmart And with AMD fine wine you may have it for a while. My brother is STILL using the 8GB RX 480 Red Devil that he bought at its launch, and it's still going strong. Now, he is cheap and it is very old, but his card is a great example of how AMD cards age. I am the idiot who went out and bought an RTX 4080.
and VRAM
I mean, it's still shame on NVIDIA but the fact of the matter is that there is a lot of performance left on the table with the 8 GB framebuffer on 30-series cards.
@@gokublack8342 Well, they are also better at the $599 price point with the 4070 if you are playing games with ray tracing and frame generation support. Otherwise just get the RX 6650 or RX 6800 and be happy, or if you want to be really cheap, the RX 6500 and turn on custom interlaced resolutions.
This makes me feel even better about upgrading to a 7900 XT (at $800) instead of waiting to see how the 4070 performs.
It just isn't fast enough for the cost - especially with the efficiency of the 40 series.
I'm skipping both, however. Until they make the 4080 $800, nothing is worth upgrading my 6900 XT over.
@@redpilljesus Why would you even want to upgrade your 6900XT? That card is fast as hell and I don't see a reason why you would need to upgrade it so soon.
Same, i got a founders 3080 for $700 during peak rona era, and i really don't feel like the 40 series is making any compelling arguments for an upgrade (which I'm happy about, no FOMO)
Yeah, just wait until nvidia crash and burn with these prices...
@@josephjoestar515 idk man, 1st world things. I have a 3090, but for content creation I've thought about upgrading because AV1 encoding is 😩
I miss the days when the 70 tier was just the 80 tier from the last generation at a lower price, possibly with a different RAM amount. Really kinda makes me feel better about the 2080 I bought off a friend for $150 a few weeks ago.
Good snag. I got my 2080 new and it still performs like a champ.
Damn that is a good deal
3080Ti for 75% msrp factory sealed. Pretty happy about it.
Uh? Isn't the point of the video that this is mostly a 3080 at a lower price?
I understand this, but take MSRP and inflation into account. For example, in 2014 the GTX 970 had an MSRP of $330, which would be about $420 today, so we can't really keep the same prices forever; it doesn't make sense. I would understand if the 4070 launched at $500, but $600 is just nuts. The GTX 1060 launched at $299 in 2016, which was very reasonable; that would be about $375 today, so $400 makes sense honestly. So that's the only tier where Nvidia keeps it fair, the one that sells on volume rather than margins. Everything else is just insane.
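For anyone who wants to sanity-check that math, here's a minimal sketch; the inflation multipliers are rough assumptions picked to land on the figures quoted above, not official CPI numbers:

```python
# A minimal sketch of the inflation math in the comment above. The
# multipliers are rough assumptions chosen to match the quoted figures,
# not official CPI data.
CPI_TO_TODAY = {2014: 1.27, 2016: 1.25}

def adjusted_msrp(launch_price: float, launch_year: int) -> float:
    """Scale a launch MSRP into today's dollars with the rough multiplier."""
    return launch_price * CPI_TO_TODAY[launch_year]

print(f"GTX 970  (2014, $330) -> ~${adjusted_msrp(330, 2014):.0f}")  # ~$420
print(f"GTX 1060 (2016, $299) -> ~${adjusted_msrp(299, 2016):.0f}")  # ~$375
```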
I always switched GPUs when I noticed my old GPU wasn't holding up anymore. I still remember trying to run Assassin's Creed Unity on my GTX 760, and it ran like crap at 1080p. I then switched to the 970, which was a terrific card for its price and served me for many years. Unfortunately, it broke, and I then bought the 1080 because the price dropped massively when the RTX 20 series got announced. I'm using that 1080 to this day, and of course you have to do some tweaks, but so far there hasn't been a game where I felt the urge to change my GPU because it's not holding up anymore. I'm currently playing Borderlands 3 at 1440p medium-to-high settings and it runs at about 80-100 fps on my 120Hz TV. Maybe things will change once devs leave the old consoles behind, as PS5s and Xbox Series X are now widely available. But for now, the 1080 is still good. Which is a shame, because I actually wanted to wait for the 4070, but when I saw the price I immediately lost interest. Thanks NVIDIA.
your gpu would not hold up to metro exodus enhanced edition
I've been running my trusty 970 since that card launched, upgrading from a 660 TI.
Every GPU generation I've been looking to upgrade, but I've always held back.
2080 due to RTX being experimental and overpriced.
3080 due to supply issues with covid
4080 due to the price and the 4090 being too tempting in comparison... though I never bit the bullet.
I just started doing UE5 game development and the 970's 3.5GB of usable VRAM is crippling me, so I'm finally pulling the trigger.
Honestly the 970 is such a legendary card and has kept up at 1080p and even some games at 1440p for so long. But now AAA games and technical programs just destroy it. Gonna be a hell of a leap for me, and will finally get to use my 144hz 1440p monitor to its full potential.
@rugbyf0rlife I've been using my 970 for 9 years. It's been an absolute machine. I'm kinda sad to replace it actually but it's time for a total overhaul. I'm hoping to still find a use for it and the entire old machine in some capacity because it all still works
@@rugbyf0rlife I'm still running a 2GB GTX 960 and you basically perfectly described what I've been going through as well. The jump is going to be insane, but hey, at least the 960 can still hold a candle to most AAA games... aside from Cyberpunk 2077 and similarly demanding titles 😅
This is basically a what a 4060 should be.
It even has a 192 bit bus just like every other 60 series.
4070ti was supposed to be the 4070 and the 4070 was supposed to be a 4060. Skipping this generation. Not paying 600 for a 4060.
In a world where nvidia isn't as greedy as they are today this card would be the 4060 and it'd be priced appropriately. Not here tho
I am glad that I got a 4070 Ti for 830 euros. Not counting the VRAM, it is faster than a 3090
cant wait for an rtx 4040 with a 64 bit bus which will cost 369.99 and will be on par with a 2060, truly impeccable value
what I'm actually impressed with is arc a770 and the amd cards performance
I am laughing my ass off with my A770. 37% less performance @4k while it costs ~289$ vs 600$.
You're impressed by a duopoly where AMD is not competing, just diving 1% under Nvidia's price so they can both keep fleecing consumers. Good for you, I guess.
Yeah I have switched from 3070 to 6950XT and I had a great deal to switch to 7900xtx and I'm super happy - dont care about dlss I just want raw 4K gaming 😊
Battlemage will be fun
@@doniscoming 4k is only good for the desktop space ngl. Anything past 1800p isn't worth it unless you refuse to enable antialiasing
Considering I have had Nvidia's 1080 for about 5 or 6 years, this was a fun video to see it in comparison with a new GPU :)
It's shocking after all this time it's only a 60% difference.
it is shocking.
I mean, even the comparison itself is weird. That just goes to show how amazing the 1000 series was all around..!! Well, seems like I'll keep rocking my 1080 too, unless I find an amazing used 3070 deal.
those 1080 boards man, those were gold!
I'm still on 1070, with a 1440p monitor. But considering a new AMD GPU soon.
Would love to see an energy bill comparison; the RTX 4070's peak 185 W power draw is very compelling compared to the 6800 XT's ~290 W (rough cost math sketched after this thread).
Imagine undervolting the 4070... I've seen people with the 4070 under load eating 130W...
I mean, if I had a 6800 XT I would definitely undervolt; as long as it stays at max 250W I'd be fine with it...
@@TheGoodDoctorSRB You can also undervolt the 4070
@@lachlanokeefe8020 I got the 4070 for a while now, and sure enough, consumption under load is 130W... Eats as much as most cpu's.....
@@bradhaines3142 Your 6950xt will be slower than the 4070 lol.
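As a rough idea of what that energy-bill comparison might look like: a minimal sketch where the peak power draws come from the thread above, and the daily hours and per-kWh price are pure assumptions to swap for your own:

```python
# A rough sketch of the energy-bill difference. The peak power draws come
# from the comment above; the hours and electricity price are assumptions.
HOURS_PER_DAY = 3.0     # assumed gaming hours per day
PRICE_PER_KWH = 0.30    # assumed price per kWh, swap in your local rate

def yearly_cost(watts: float) -> float:
    """Annual electricity cost for a card drawing `watts` under load."""
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

delta = yearly_cost(290) - yearly_cost(185)  # 6800 XT vs RTX 4070 peak draw
print(f"~{delta:.0f} per year difference under these assumptions")
```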
Props to the writers. The script for the slap-in-the-face transition to the LTT Store genuinely had me burst out laughing.
And Yvonne with the banana for scale :D
Then there was Yvonne with the banana!
literally
ok?
You know one of my favorite things about LTT videos? Clear audio at a proper volume level.
underrated fact 💯
This! I appreciate it so much lol. Maybe I'm just picky or maybe I'm holding channels to higher standards.
i just got new Bluetooth earphone modules and i found myself testing them on LTT videos. i second that.
For like 90% of tech, absolutely, just don't trust his advice on monitors, man has been misleading in that department more than once.
@@Zer0--Zer0 in what way do you mean?
is the ryzen 4070 a good pair with my rtx 7800x3d
yes:)
Yes
5:17 - Linus Kink Tips
I love how the psu standard got changed to accommodate power spikes and then the next generation of gpus stopped having power spikes
This review of an Nvidia product made me a bigger Intel and AMD fan, as I am very surprised by the performance of older AMD cards after all the updates, and I can't believe that Arc can actually sit somewhere on these charts with pretty good fps
Arc a770 16GB is actually really good, most games can run with the settings on highest and still achieve 60fps even if that is with XeSS or FSR. I'm maxing out Hogwarts with ray tracing at 4k and still getting playable frame rates. Lots of Nvidia and AMD cards can't even do that due to a lack of VRAM on Nvidia's side and a lack of ray tracing on AMDs. Intel is the sweet spot right now and I only expect future launches to be even better
@Chronometer I hope that's not mostly ownership bias, but it does make me hopeful for the mainstream market. It's no problem in my eyes if Nvidia screws off and creates some kind of "luxury gaming" tier of the market, as long as there remain compelling offers from other entities.
The thing with Arc is that it's effectively Vega 30, at least according to Chips and Cheese. It's really good when it gets fully loaded, hence why the A770 competes with a 6700 XT or a 3070 in TimeSpy, or why, when running something stupid like Hogwarts Legacy with RT at native 4K, it loses so comparatively little performance that it slumps its way to 3090 tier, at 23 FPS to the latter's 25 (going off the day-one TPU review). The problem is that games aren't synthetic benchmarks, and most people tune their settings to match their card. So, while an A380 beats a 1060 and almost matches a 580 in Cyberpunk at 1080p Medium, most PC gamers don't want to play it at 36 FPS. So the settings go down to Low, and the A380 gets around 50 FPS while the latter two get 70+.
Arc is like a random grab bag of candy you get on Halloween, and you hit up every non-normal neighborhood in town. You'll get great candy in small amounts from the rich. Someone will home make some surprisingly good chocolate. A person will slip in Tootsie Rolls with razor blades and get you in the hospital.
Honestly the only reason I'm not buying an arc is the lack of support for legacy games, of which I have 500+. I might just make a legacy computer and use an arc or AMD build for new games post 2020.
@@jemiebridges3197 Intel has come a long way with their drivers and the hardware itself can more than handle older games. I doubt you would actually need to build a second computer with older hardware. I have as many legacy games as you, though I haven't tried them all yet. When I first got the card I'd have said something different, but with the improvements I'm confident it would work for you
Honestly, I'd love to see you use DCS as a comparison for some of these cards, simply because it's a free to play full fidelity flight sim and it would be good to see some benchmarks
What I really like about DCS is how well it proves the point about system RAM. That game is a monster when it comes to memory; I think it was the first game to recommend 32GB. Not only that, there are benchmarks out there comparing different RAM speeds and timings, and the difference between bottom-of-the-barrel 2133MHz kits and 4400MHz premium kits was literally like the difference between a 6600 XT and a 6800 XT. That game is apparently super sensitive to RAM, and intensive af. I do like it when studios really push the limit. Did you know Age of Empires actually supports a 2560x1440 display resolution? Microsoft left that resolution in back in the 1990s, when every screen was 640x480. Pure wizardry. So I don't think it's a bad thing at all to make a game that the most high-end hardware barely takes advantage of, which is why I like the fully path-traced Overdrive tech demo in Cyberpunk. I just don't like it when manipulative, lying corpos then bully tech journalists for treating a tech demo like a tech demo, rather than the only performance that counts. I think I would've received RT much better in general if they'd introduced it the way they're introducing fully path-traced scenes and emphasized performance otherwise. It's like how I was cooler on 3D V-Cache because it's a one-trick pony, so I may get a 5950X anyway. The average gamer doesn't seem to understand that not all games use system resources the same way, and DCS World vs Civ 6 vs Cyberpunk vs GTA V is a really good example: you can have all kinds of engine limits and resource-utilization patterns that make any one game not really indicative of the fuller experience. RAM speed might not matter in lots of games, but in some it does, and it's easier to make an informed choice when the answer to "does it matter" is "sometimes, and when it counts it might count a lot." It's like why the 5800X3D is literally superior to the 12900KS in certain titles but loses pretty badly in others. It isn't always that one part is just "faster" or "better"; it performs differently depending on the engine and title.
Honestly I'm a bit shocked how much compatibility there is at all sometimes. Maybe, for all my bitching, it would be so much worse in tech if we had a dozen different companies. It's really nice for GPU coolers, I guess, but not having standards is such a nightmare; that's why USB existed in the first place. It's just crazy, the variety we have and how much performance can vary sometimes. It's part of why system requirements from brick-and-mortar companies, and sadly even Steam sometimes, are so useless; "i5, 8GB of RAM" is practically meaningless to me at this point, and RAM is definitely an area where SIs like to cheap out. DCS is a great example of why that matters.
As someone who is still running a GTX 1080, I loved seeing how the other hardware compares XD, gives me a good baseline for upgrading
GTX 1070 here. It's still good for my current PC & monitor, but I can't upgrade when the 3070 is still 600-1100 € in my country 😔
My 980Ti is old enough to be in second grade and I haven't yet felt particularly compelled to upgrade it. 45-55 fps in Cyberpunk 2077 at 2560x1080 with all the non-RT eye candy cranked to ultra is still good enough for me, for now.
@@phaolo6 Just get a used one. The Radeon 6000 series used are way cheaper than 3000nvidia where I live. Just gotta be smart about it.
My fellow 1080 brothers! Stand strong!
@@ntzt2150 I kinda dislike buying used tech, because it's a gamble. You never know what crap they did with it. Maybe they stressed it to the limit or dropped it _(wink)_
But well, maybe I'll have to adapt..
Feels good being a 6800 XT user; saving several hundred dollars over an NVIDIA card and seeing better or similar non-RT performance is a great feeling :)
You ain't missing much on RT; it's wow for 10 mins, then unnoticeable.
Even better when you see how well the 16 GB of VRAM on AMD cards like my 6800 XT has aged compared with the 8 or 10 GB on the 30 series from Nvidia 🥰
Lol, keep believing that.
@@markmanderson with the launch of rtx remix a lot of titles are going to need that rt performance though
@Kim Faes F*Kim 🤣
This video finally convinced me to upgrade my GTX1070, but for the first time in my life to a team red GPU, the RX 6800 XT looks like such a great deal now
Not sure where you are based, but in the UK there is only a small difference in price across the 6800 and 6900 (used), and if you want new, I've seen some good deals on the 6950 XT. Worth a look!
I'm in the same boat
Still rocking a 1070! Watercool and tune the balls off it. it's like a Fiesta XR2i (hopefully without the stolen and crashed in a ditch bit).
Just upgraded to the 6800 XT a month ago (From a 2070 Super) and it was well worth it. I'd be lying if I said I didn't have to do some tuning with software to get it up to standards but that was mainly my own fault with not keeping my PC well optimized over the past 3 years. I bought this rather than the 6950 XT because the thermals and power consumption are infinitely better, and imo, that outweighs the raw "value per frame" advantage that the 6950 has. 6800 XT is plenty enough to run all my games at 1440p with max graphics settings, would recommend.
there are no "teams" ..... you just buy what makes more sense ;-)
With the launch of every new GPU I just get more and more happy I bought a Sapphire Pulse 6800 in 2020 for MSRP.
It would be nice to have the 6950 xt included in your graphs, although the 6800 xt is already a very good comparison.
They need to add the 6900xt
@@Aznrob869 They need to add all cards dating back to 1999 (GeForce 256's release date).
They should also add the 3080 Ti, and that would make the 4070 look even worse. Anyone who took advantage of back-to-school free upgrades from a base 3080 to a 3080 Ti on pre-built sites made off like bandits, and RT/DLSS can't hide that comparison. Wait a few months: when Nvidia ends the rebate program to AIB partners for initial 4070 sales, the price jumps will make this card a hard pass.
@@SirWolf2018 cards? What about iGPU and SOCs???!?!?!?
@@amistrophy :D
this makes me feel fine about the 7900XT I just bought a few weeks ago. was really torn between it and the 4070/4070TI but ultimately decided I wanted more vram for the long haul. have been really enjoying it so far.
It's been a lot more attractive since they lowered the price, for sure.
That was definitely the right move. 12GB of VRAM is not enough even for 1440p these days, as I am consistently seeing more than 12GB used in The Last of Us, FH5, RE4, or Hogwarts
the 7900xt and 7900xtx outperform it anyway, even in some raytracing scenarios, so you should feel fine/good with your choice.
What CPU are you using, if I can ask? I'm currently thinking about upgrading from my GTX 1080 to a 7900 XT, and maybe wanna pair it with a 5900X
@@lucidlucifer8605 Ryzen 7 3700X
Similar performance as the 3080, just a tiny bit better... Difference is that the 4070 probably costs €800, while I paid €500 for my used 3080 (:
Or you can get a used AMD card
@@antonhelsgaun nope lol
Well, DLSS 3.0 is kinda neat, but idk if thats worth €300
Why would it cost 800€? The 4070ti can be had for 880-900, I am expecting this to cost 650-700.
Also note that comparing new vs used cards is a bit unfair in the price department.
edit: it can be had for 659€, officially and on third party sites such as caseking, if you are in germany, ofcourse regional variances apply.
Can we truly say it's similar performance when DLSS 3 fakes frames between GPU-rendered ones to make the fps sound better?
The ryzen 4070 is too expensive
Fr
Used ryzen 4070s are more adorable
@@fesyuki lol
AMD's 'fine wine' tech is a force of its own. I still remember when AMD released the 7970; it was amazing how long that card held up
Fine wine was because they were using the same base tech for years. So they could bring the improvements from later GPUs to older ones
I'm just now upgrading from my R9 390. Outside of driver conflicts, I've been able to run pretty much anything at 1080p on it. Have a 3060 coming in tomorrow tho
Yeah, I remember the RX 580 8GB being equal to a GTX 1060 6GB when it was released. But now it performs like a 1070, if not slightly better. Same story with the RX 570
It also helped that both ps4 and xbox one were GCN based and since almost every game is designed primarily for console, almost every game engine ended up being optimized for that architecture
ran those in crossfire for years loved it!
been planning on getting a 7900XT to replace my 1070, and this just solidifies my decision.
Why not XTX bro?
If you can afford the extra money, buy the 7900 XTX instead.
if you turn on DLSS 3 and ray tracing it gets clobbered, but to each their own
@@jette24 What also gets clobbered is your wallet, by spending almost double the price just for an AI crutch
@@jette24 Pretty sure that 7900XT is faster than 4070 even with RT. And DLSS 3 simply isn't mature right now.
Getting 6800 XT performance... For $10 more, you can get a 6950 XT.
Why are people buying Nvidia? I mean I bought the 4090 going from the Radeon R9 295x but if this was my budget I wouldn't even consider Nvidia
I just upgraded to a 4070 from a 2060 Super in my small form factor build, and I am very happy. It only needs a single 8-pin, runs cool, and it's quiet. The performance uplift from my previous GPU is also quite good: I went from 1689 to 2619 in the Heaven Benchmark. I use Blender, so I had no choice but to go with Nvidia, and I am glad there finally was an option that didn't require a new case and power supply.
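For context, those Heaven scores work out to roughly a 55% uplift. A tiny sketch using just the two numbers quoted above:

```python
# Tiny sketch: the Heaven Benchmark uplift quoted in the comment above
# (scores from the comment, nothing measured by me).
old_score, new_score = 1689, 2619
ratio = new_score / old_score
print(f"Uplift: {ratio:.2f}x (+{(ratio - 1) * 100:.0f}%)")  # ~1.55x, +55%
```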
You made me feel better about myself. In my country the 4070 Ti goes for up to 1600 dollars, so yes, I just spent 1600 dollars on a 4070 Ti
@@codeman8113 I'm not quite sure how I helped, but I am happy to be of service 😄
@@Kristian.H.Nielsen you being happy with a 4070 made me feel better haha
I'm in a similar boat, my GTX 1080 is starting to show its age and the only GPU that is at least twice as good as my 1080 that will fit in my case is a 4070. So now the question is do I buy now or wait and see what AMD has to offer with their mid range RDNA 3 ?
@@Sid-Cannon Just an opinion from a stranger on the internet, so take it for what it's worth: if you actually need the extra performance today, go for it. Assess your needs and base your choice on what meets your criteria. It's hard to make a decision based on a GPU which doesn't exist.
I'm really glad they focused on how Nvidia is raising prices in step with performance gains for the second time in the last 3 generations. What I'm sad about is them not pointing out Nvidia's record profits last quarter (even with the crypto bubble over, a new record). They're taking advantage of the lack of competition to simply raise prices, knowing they have a near monopoly; record profit after record profit has been posted. They learned the market can be squeezed for more, and they're squeezing hard.
What can we say... we voted with our wallets. Look at Intel... thanks, AMD... and now look at Nvidia. Nvidia stock is trading too high compared to its P/E because everyone thinks they're unbeatable now.
It's capitalism. They're simply doing what all companies do. They are actually obligated to by their shareholders.
The prices are high because people are willing to buy them at that price. Be angry at the buyers. They're the ones causing the price rise. Not the company.
@@e5211 Bad press is part of capitalism, and covering for a company like you're doing doesn't help anything.
Do you mind sharing the source for this record?
Yeah, actually, if nobody bought the cards they would have no choice but to lower the price; the fact is there are still people who buy at that price
Personally I will keep my 2070S until a better deal comes along
Dang Linus 0:22 you're huge
I thought 4070 was gonna beat my 6900xt which I got for $550 this generation but damn AMD is killing it! Everyday I'm becoming more pleased with going red
Second hand 6800XT & 6900/6950XT are amazing value.
@@paulelderson934 I didn't get second hand tho. It was a refurbished model with 2 year warranty but yea the second hand market is crazy rn in my region. I can see there's still that refurbished model available for like $400 now
It should get more powerful as they continue to improve the drivers too
and more VRAM
@@gokublack8342 The 6900 XT seems mature enough
Kinda crazy how I’m in the same boat with so many others. Got a 1080 when it came out and it’s still going strong, but the 4070 finally seems like the time to upgrade. Planning to hang on to this one for quite a while as well!
I just bought a 4070, upgrading from a 1080 as well lmao, gonna set my dad up with my 1080 so we can play some games together.
@@gamingsfinest3356how’d it go?
His dad owned him on
4fortnight
My 1080 Ti was a fucking beast. Sadly, I came across a nasty deal on the 4070 and bought it for $500 flat, no tax, so I just had to bite the bullet
Im rocking a 1060 at the moment still and even that is able to keep up remarkably well.
Great review LTT. You guys have convinced me to buy a second-hand RX 6800 XT!
Hell, you can get a new one for under $550 if you look in the right places.
For a card that has 15% greater performance and more VRAM, as long as you don't crank up RT to get sub-60 frame rates anyway.
@@Torso6131 even the 30 series wasn’t ready for RT
Yeah, I grabbed my red devil 6800xt new for $513 back in January and I can't believe how good of a decision that was lol. Micro center is the goat for that deal.
I bought mine used for 450$ incredibly worth it! Do it now!
>4000 is a much better value than 2000 / 3000 series
No it's not. The 2000/3000 series were priced that way because you could mine ETH with them and actually recoup YOUR ENTIRE INVESTMENT ON YOUR OVERPRICED AF 2000/3000 CARD + MAKE A PROFIT.
Only stupid ngreedia fanbois don't understand that 2000 and 3000 ngreedia cards were overpriced ONLY BECAUSE THEY COULD MAKE MONEY for their owners THEN.
Now mining is DEAD, so 4000 series prices make 0 SENSE. Same with ayyMD.
What's even more crazy is how the A770 can hang with these cards.
Yea but Intel needs to work on the software before i even think about buying anything from them
@Welshmanshots They have been. Still needs more work but since launch they have done a bunch of updates to it.
Did you watch the video? The only place it was even comparable was Photoshop. Everywhere else it performs worse than a 3060 lol. What sort of fanboy copium is this?
This 4070 launch has cemented AMD graphics in my build this year. I am upgrading from an i5-7600 with a 1070 to an i5-13600K with either a 6800 XT or one of the 7800/7700 series cards that are probably coming this summer
i highly recommend the 6800xt if you build before the new gen AMD cards come out!! i love mine to death. got it a few weeks ago and it rips 1440p
What I find amazing is that the 6800 XT is about 14% ahead in 1080p performance. That's similar to the difference between the 6800 and the 6800 XT. So given that the 6800 has always been faster than a 3070, it's like Nvidia just closed the gap to their competitor's last gen with a whole new generation. It's terrible to see, especially considering that you can get a 6800 for $500 or even less.
Even if someone wants an Nvidia GPU, it might make more sense to wait for the next generation. It's unlikely they can offer a worse deal than this time. Otherwise, AMD just offers the better deal currently.
I'm definitely happy with my 6800. Also, having 16GB of VRAM can be pretty handy when ray tracing gets used more, because it actually requires more VRAM in practice, which is something Nvidia won't tell you before selling you 12GB or less.
@@drew3045 thanks for the advice! I am upgrading to transition from 1080 to 1440 so glad to hear it performs!
Might as well get a Ryzen to go with it
@@lathrin I just might! But the bang for buck on the 13600k is pretty enticing so we will see.
I've been loving my RX 6800 XT! Didn't realize the software updates had been so influential; kinda crazy that it outperforms a 3080 nowadays, according to the data here.
AMD has been gorgeous with software updates. I have an RX 580, and over the years I got Freesync, VR support, sharpening for upscaling, and FSR. This GPU still rocks at the resolutions it was made for and even stretches to 1440p with old games, or with FSR for new ones.
Please stop buying this Kool-Aid. It isn't the software updates, it's the larger VRAM sizes. Please go watch Hardware Unboxed's 3070 vs RX 6800 2023 revisit to see some eye-opening stuff!
I knew that it would happen, especially considering the RTX 3080's tiny 10GB frame buffer.
@@eaman11 My 980 Ti died recently and I'm temporarily running an RX 5500 XT. When its replacement arrives, it'll be an Nvidia card, but I will absolutely miss those AMD drivers, they're so good! Chill is witchcraft, I love it!
RX 6800 XT: about 550 dollars ?????
RTX 3080: about 810 dollars ?!?!?!?
what the hell? :-O
personally I was kinda pissed that my GTX 750Ti couldn't play "Doom Eternal"...so I installed "Binary Domain"...had fun...and didn't spend a penny ;-)
Having an RTX2070 seems still like a solid option for my workload; Factorio, an MMO every once in a while, video editing, and basic programming. But damn, holding a freshly unboxed GPU is one of the few joys in life :D
Same here. I currently have an RTX 2060 still does the job haha.
For me the 2070 can't cut it anymore, upgraded to a 1440p 144hz monitor and its a struggle trying to get high refresh rates on any modern AAA games
My factories never get that big before I get lost. But my gaming laptop doesn't even spin the fans up for it.
And here I'm playing on a GTX 980 at 1440p, which unfortunately isn't good enough anymore for lots of games, especially with just 4GB of VRAM.
2080 Ti, and I have to say, I'd love an upgrade for VR stuff. Aside from that, I'd be happy with a 2070. DLSS is amazing for prolonging a GPU's life.
Pro tip: don't buy anything new that has less than 12GB of VRAM at this point. That is really the important thing to note. It doesn't matter how many CUDA cores you have if you just don't have the texture memory
Hardware Unboxed did a good demonstration of just what happens when you're hitting VRAM limits. Stuttery, unplayable messes where all those cores are meaningless.
Games that need more than 8 GB of VRAM are just badly optimized if you ask me
@@LiminalSpaceMan192 It depends on the target resolution; needing more than 12GB at 1440p is a >50% increase for, as far as I'm able to tell, no perceptible increase in graphical fidelity... That's definitely worryingly bad from a technical point of view.
I've only played a couple of games that required 8+GB at 1440p, unless you use near-useless AA and motion blur/heavy shadows. But I'm sure glad to have 12
@Travis Clark The 7900XTX has 24GB of VRAM and for half the price ;)
For me, my 2070 still runs above 60 FPS in every normal game I want to play, but the area I would really want to have better performance in is VR. It would be nice if VR stats were included in the videos, since that's really the only thing that makes me want a better graphics card.
It's definitely not the most accurate, but I use the 4K stats for VR. Unfortunately, VR GPU testing would need its own production, as resolution differs considerably between headsets (even for high-end models), as I'm sure you know. But couldn't agree more! Need more VR data, especially nowadays!
Well Nvidia sure is making my 1080 I bought 7 years ago look like an excellent investment
My 1080 is still goin strong too lol
COPIUM
My brother-in-law is using my brother's old 1080, has no complaints about Hogwarts Legacy, and is enjoying it with his son. He's a big board game guy, but his friends and their kids (now that the kids are older) are starting to dip into video games again
my 1080 TI startin to really struggle these days lol
this 4070 looks juicy tbh
My 1080 Ti is still strong. I'll most likely upgrade my 9th gen/1080 Ti system when 14th/15th gen comes out, which will pair nicely with a greatly reduced-in-price 4090 :D
6950xt definitely looking good over a 4070. You guys should have included that in the comparison since the price point is almost the same now
Was kind of strange seeing the 7900 XT but not the 7900 XTX, even though the RTX 4090 was on the chart...
Nvidia has become too lazy and complacent due to years of majority control of the GPU market.
That's why I just ditched nvidia this time and bought 7900xtx. Even though I have always bought nvidia since 2010.
@@hammerheadcorvette4 But then they include one of the worst price-to-performance cards on there: the 2080
You should check out HUB review then, they did exactly this 👍🏻
Newegg has XFX 6950xt for $600 right now.
I would've liked to see more of AMD's last gen in the benchmark. For example, the 6700 XT has the same amount of VRAM but is a lot cheaper
I upgraded from a 1080 to the 4070. I didn't mind lowering settings over the years, but I wanted to go back to playing modern games in 4K since the release of the 3080 which was never in stock. Regardless, I saved $100 waiting, and the 8 pin requirement saves time.
Are you happy with the 4070? Looking forward to buy one too
Bro same I wanted to buy a used 3080 to replace my old 1080 but I wasn't feeling like changing my PSU
@@mjtomsky6387 Yes, specifically the Asual Dual OC model. I think it has one of the best coolers of all time. It stays at 33c at idle without the fans even spinning. Great energy efficiency for a card that is 3080-level.
@@suspectedcrab Aight thank you, do you have multiple monitors?
If you do, may I know the resolution and if your gpu is struggling with these?
could you tell me what CPU you use with the 4070 pls ?
Thank you for including the 1080. I am still rocking it for 1440p; got it during the pandemic. Keep up the old-hardware comparisons with modern hardware.
I love the 1080. I know hating Nvidia is the cool thing to do right now and a lot of that hate is deserved, but the 1080 and 1080 Ti were amazing GPUs.
my 1080ti scoffs at this weak attempt at getting me to stop running it 24/7 OC'd to 2050MHz @ 1.093v lmao
@@doomdimensiondweller5627 people don't hate NVIDIA for their 900 or 1000 series GPU, they hate them for 2000+ series, which I totally agree with
@@mr_confuse I know, I think most of the criticism of Nvidia is justified.
i also have a kfa2 1080 oc and im actually surprised how well it still runs.
I play RE4 Remake on medium settings at 1440p/144Hz and get between 70-100 fps thanks to the latest driver. When I first played the Chainsaw Demo I already thought I'd have to buy a new card because I was only getting about 30-40 fps, but now it's completely fine. :D
AMD just got my money for a 7900 XT. I was on a 3070, but that 8GB of VRAM really held the card back.
Good choice. I'll be keeping my rx6800 for now
go with 7900 xtx
5:23 please make a 69 pack for $420
Great review! Clearly a lot to be a bit grumpy about, especially the cost increase to performance boost since last gen, but in the right scenario this still feels like a good option in niche circumstances. My brother is on the now ancient 4790k with a 1070 ti, and holding out for ddr5 to be more accessible and standard before upgrading his cpu and mobo etc, but is desperate for a better gpu to carry him through that. He’s obviously gonna be cpu bottlenecked a lot, but even so, a 3080 for $600 that sounds like it will work with his 750W supply feels like a great quality of life improvement until he can sort the rest of his rig. His rig is actually a hand me down from me, I upgraded everything after I was lucky enough to get a 3080 FE back in 2020, but there was a short window where I just slotted the 3080 into that 4790k setup and it definitely ran things a WHOLE lot nicer. This feels like a perfect bridge gpu in that (very particular) case
I'm really feeling good about that 6900 XT I bought 5 months ago, especially now that I got my CPU and mobo up to date too with the 7600X and SAM enabled. Feels like it would've taken the combined cost of my GPU, CPU, mobo, and RAM just to get an Nvidia GPU that could compete, and the power-cost savings would take a couple of decades to make up the difference.
Interested in upgrading my 2080ti with a 7900xt - very interested in SAM now that I have a 7900 - have you done any testing with and without SAM enabled? Should I go for it? Wanting the upgrade to make the most of a 4K 120Hz LG OLED
same! initially wanted to upgrade to 7900 from 590 but the early price was so high in my country compared to actual MSRP, luckily then i found 6900xt in a very good price, so happy with it now
@@sexyfishproductions eh, could only find 6900xt at 700$.
i found a 3080 at 400$ so i think that was better value.
@@sexyfishproductions I know you didn't ask me but it's really up to you if want to turn it on. Usually a 3% performance increase on most games at 4k but some can be up to 15%.
@@sexyfishproductions My 6900xt has a lot less stuttering with SAM, as for direct fps boosts idk, i think it did give me a bunch of fps depending on the game, especially in forza
I have a 5800x btw
But hey, Resizable BAR for Nvidia also exists, so at the end of the day it depends on what you wanna pay for a card lol
love to see the 6800XT stomping the 4070 - considering that the 6800XT is going for as low as 250$ on eBay 🤣
Ewww amd
I think you're looking at current bids. Can't see one that sold for less than $450 recently
@@alyoshamikhaylov7651 ew, a fanboy
what is crazy is gamers nexus has 4070 beating the 6950xt in most stuff...odd...
Show me where I can buy a 6800xt for 250$ and I'll buy it right now.
I wish you'd included the 6700xt on your graphs. With 12GB of VRAM, it's actually been edging out the 3070 in some newer games. It would be a nice point of reference in approximately the same tier
Did you watch it?
That mention LTT Store had me like 🤔👀🤣 and that “Shadow Realm” reference just made me flashback to junior high playing Yugioh in the lunch room and telling a kid “I summon Dark Magician prepare to be sent to the shadow realm”
"All of you Pascal owners out there that are finally gonna bite the bullet and upgrade" Right about that one! I finally felt it was time to upgrade my 1070Ti, so I went with a Merc 310 7900 XTX a couple of weeks ago for the same price as reference new, and I love it! Not this time Ngreedia
Went with the RTX 4070 Ti. Gamers paid $2k last year for the RTX 3090 Ti and even bought 3090s for that price from scalpers. Nvidia is so nice they made it $800 for the same performance, if not better at 1440p lol. Thanks Nvidia, I actually don't give af as long as it performs as a high-end GPU with way better power efficiency.
@@Sam-fs8sx If you bought it not long ago you could refund it so you don't have to intake large amounts of copium.
@@xikirito_6809 Nah, I'll just buy another one in 2 years when 4K OLED high-fps monitors come out. Just be the best version of yourself. It's better than crying about prices, especially when the product you get is worth the performance gains.
What a lot of people seem to forget is that it wasn't until this time last year that most folks were finally able to get their hands on a 3080/90... soooo, there's that. And I'm not even entertaining all of the other rabbit holes I could certainly wander down. Most folks are just good now...
Hopefully Nvidia will start to lose money on production and marketing, fail to profit because people stopped buying their top-of-the-line cards, and end up in financial trouble. It feels like the only way they'll actually deliver and stop the crack smoking with their pricing is if they have to do everything it takes to squeeze out as many sales as possible to get back on track.
@@charlesm.2604 It would help if people covering tech made an effort not to review Nvidia cards, not include said cards in graphs, and cover as little Nvidia news as possible.
@@charlesm.2604 Not going to happen. Nvidia users are like Apple users: blind sheep. Nvidia will continue to earn more.
@@rimuru2035 I don't feel a news outlet should choose to not cover a particular brand. That would make them not valuable to me personally. I can make my own decision regarding what to do with the news.
I would have loved to see the 7900 XTX included in the bench....
15% slower on average while costing 40% less
You were spot-on about the GTX 1080 people holding on to their cards. I am still using mine for gaming, but I am looking at GPUs for rendering in Rhino3D/Cycles. I find the new card prices a bit much.
Amazing, in 3 years of development Nvidia managed to lower the TDP of the 3080. Just, incredible work! Their team should be immensely proud! With gains like these we might even have Path Tracing in 10 years!
Path tracing is already available in Cyberpunk 2077 for Nvidia 40-series cards. No need to wait for it.
@@BlueTocho ........ my god
I would love to see a 6950 XT stacked against the 4070, given they are regularly going on sale for $600 currently! I decided to go the AMD route because of the subpar price-to-performance with Nvidia recently.
I'm sorry, you can't compare MSRP against sale price like that lmao
@@dougr8646 you definitely can buddy
If it's current prices you're comparing, you can. Not sure how common the "sales" are, but sometimes they're basically permanent, in which case it's a valid comparison
Yeah whole world isn't your country
@@dougr8646 yeah you kinda can
Love to see the old cards retested to see how they stack up!
One thing that deserves more focus is power dissipation. The 40 series looks a lot more efficient in terms of power consumption than the 30 series, so while the 3080 may have a small edge in performance over the 4070, it uses 50-100% more power to do it.
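To put that efficiency point in numbers, here's a minimal perf-per-watt sketch in Python. The fps and wattage figures below are illustrative assumptions, not measurements from this review.

```python
# Rough perf-per-watt comparison; all numbers are assumed placeholders,
# not benchmark data from the video.
cards = {
    "RTX 4070": {"avg_fps": 100.0, "board_power_w": 200.0},  # assumed
    "RTX 3080": {"avg_fps": 105.0, "board_power_w": 330.0},  # assumed
}

for name, c in cards.items():
    fps_per_watt = c["avg_fps"] / c["board_power_w"]
    print(f"{name}: {fps_per_watt:.3f} fps/W")

# Even if the 3080 were ~5% faster, drawing ~65% more board power
# leaves it far behind in fps per watt, which is the commenter's point.
```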
Each review that comes out of the 40 series has me scared I made a mistake buying a 3080 this past Christmas. This one had me especially concerned, since it was supposed to be the most comparable one to the 3080. Glad to see again that I made the right call. I'm beyond happy with my purchase (especially since I came from a 1060 hahaha) now that I see they perform basically the same for the use I give them.
Same here. I have zero remorse for buying a 3080 at MSRP even this far into the 4000-series. I have a feeling it will serve me well all the way through the 5000-series too.
You chose right for bang per buck. Try buying a 3080 Ti Strix OC in Feb '22... you don't wanna know how much it dropped in a month, but it was a birthday present to myself so it doesn't sting as much (it does, I'm kidding myself) :)
Same, I got a 3080 Ti around Christmas for $600 and thought I screwed up.
I have a 4070 Ti. I play my games in peace on my beautiful-looking OLED 1440p 175Hz. People spend so much money on iPhones every year, which barely have a performance increase. I am a simple man: I see the 4070 Ti is nearly identical or better at 1440p than the RTX 3090 Ti (the fastest GPU a year ago) while being significantly cheaper. I buy it (:
@@Killshot15 I paid 3x that in £ the year before for the Strix. You got a bargain on any model at that price.
0:47 Video what now? Video who? Excuse me Jake😳👉👈
Nicely done on highlighting the GPU you are reviewing this time :D
Would be nice to see a price per frame comparison (adjusted for inflation) of the last few GPU generations to see if we're actually getting more bang for our buck.
So according to this logic, if a card could pull down 2,000 fps it should cost $2k, right? And what about 4,000 fps then, I wonder.
@@redfoottttt You take the price of any given GPU and divide it by the average frames per second it generates in a given game at a given set of video settings; that gives you a price-per-frame value you can compare across cards.
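To make that concrete, here's a minimal sketch of the price-per-frame math, with the inflation adjustment the earlier comment asked for. The prices and fps values are hypothetical placeholders, not real benchmark data.

```python
# Price-per-frame sketch: MSRP dollars per average fps.
# All inputs are hypothetical, for illustration only.

def price_per_frame(msrp_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame-per-second of average performance."""
    return msrp_usd / avg_fps

def inflation_adjust(price_usd: float, cumulative_inflation: float) -> float:
    """Express an older MSRP in today's dollars (0.15 == 15% total inflation)."""
    return price_usd * (1.0 + cumulative_inflation)

# Hypothetical: a $499 last-gen card averaging 90 fps vs a $599 new card at 110 fps.
old_gen = price_per_frame(inflation_adjust(499.0, 0.15), 90.0)
new_gen = price_per_frame(599.0, 110.0)
print(f"old gen: ${old_gen:.2f}/fps, new gen: ${new_gen:.2f}/fps")
```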
Relative to the flagship cards, 90 series, the 4070 is MUCH worse perf/$ than the 3070 was.
A -70 tier card at 600 bucks, UNACCEPTABLE, I don't wanna fkn hear it
5:15 WHAT DID HE SAYYYYY
I love the tons of data that are in the reviews now because of the lab, but sometimes it gets a bit confusing. I personally would prefer having a graph separated between what's competitive and everything else. For example, always having only the 4070, 3070, 3080, 6800 XT, and maybe the 1080 Ti at the top of the chart, in positions that don't change. It would help show the performance difference against the products viewers should compare when making a buying decision. Great work by the lab in this review and the Ryzen 7800X3D review; it's just that the data can feel a bit overwhelming sometimes.
I agree, it would be nice if there were some "gold standard" type GPUs in a fixed position on the graph, and we could use them to ground our understanding of where [new GPU] fits.
I went from a 1080 bought when it was released to a 4070 yesterday. If I get like 7 years out of it again, I am happy. I like the rather low power consumption
Def not getting 7 years out of 12GB of VRAM unless you're on 1080p.
@@puffyips I am not going anywhere near 4K anytime soon because it just offers no value to me at all, so I think I'm fine
I've been team green since my GeForce 6800 GT (2005ish, I think), and I'm finally starting to consider AMD for my next upgrade. Their recent driver update history is confidence-inspiring, and I'm done shelling out the team green premium.
6950xt is up for $649 right now if you were saving for a 4070
It's tough, man. Team green products just work; I've been using them since the 9800 GT. Not as long as some, clearly, but still, they just work. I hear only horror stories about AMD cards and their stability and drivers, but it's very quickly getting to the point where the $$$ outweighs the chance of issues.
I got a 6700 XT upgrade recently from a GTX 1080. I play at 1080p and love it, looking into 1440p monitors now to get even more out of it lol
@@mattmanix5104 I only hear horror stories about Nvidia cards and their stability, and blowing up and melting cables.
I'm in a similar boat. I've been using nVidia since way back when I got two 660 ti's (when SLI was quite a bit more popular than today). I have a GTX 1070 in my machine and current cards are just so much better. I train AI using my 1070, and it's a bit lack-luster, especially compared to RTX series options. However, $599 for a 70-level card??? The GTX 970 launched at $329. The current pricing is just completely unreasonable.
The 1080 is one hell of a card, it still haunts newer cards
Still using my 1080 as well.
@@travisthebeast11 same
Still using my 1080 ti
1080 ti gang
Buying a new GPU doesn't involve just replacing the GPU, so it makes sense that the 1080 is going to be around for a while, especially considering graphics probably aren't going to get exponentially better anymore and the 1080 runs most modern games at regular settings pretty well.
ah another review of a card i wouldn’t buy (cause i got 0 money)
Yep, same here brother😅
Me too 😂
Agreed
Agreed
Agreed
The data in these vids is getting more and more impressive. I'll say tho, another upgrade y'all could make for the figures is incorporating uncertainty bands. That one note about +/- 400 for Arc made me wonder if there were any significant differences between any of those cards, or if the results were just an arbitrary ordering based on a single draw. I know doing repeated runs of the same tests takes much more time, but for really important reviews the return might be worthwhile!
Plus, then you could switch from bar charts to the much more pleasing dot and whisker style plots.
@@jackrametta I suggested using distribution curves for some of their frame data as well, rather than multiple bars for different values. They seem to respond to criticism of their data presentation with the defence that it needs to be widely understood, and that there are simply too many normies who won't understand high-quality data for it to be worthwhile.
@@jonathanodude6660 Lol that sounds about right. I have no idea what a distribution curve or whisker plot is.
@@kanjakan Just a normal distribution/bell curve (google it). It's pretty self-explanatory when you look at it, but it's hard to describe. It's essentially a plot of frequency vs value. You'd have frame rates on the x axis, same as their current bar charts, and the y axis would be the percentage of total test time spent at each frame rate. If each test rig were colour-coded and semi-transparent, then overlaid, it would provide heaps of immediate visual information about what to expect from each setup. It would look like upside-down "U" shapes with varying curvatures, where the further right the peak and the less spread the curve, the better. I don't think it requires collecting any more information than they already do.
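For anyone curious what that would look like, here's a small sketch of the overlaid-distribution idea using simulated frame-rate samples; the numbers are made up, and the smoothing method (a kernel density estimate) is just one way to draw the curves.

```python
# Overlaid frame-rate distributions for two hypothetical cards, using
# simulated samples; a real review would use logged frametime data.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
samples = {
    "Card A": rng.normal(loc=110, scale=8, size=2000),   # fast and consistent
    "Card B": rng.normal(loc=95, scale=15, size=2000),   # slower, more spread
}

xs = np.linspace(40, 160, 400)
for name, fps in samples.items():
    density = gaussian_kde(fps)(xs)                       # smooth bell-curve estimate
    plt.fill_between(xs, density, alpha=0.4, label=name)  # semi-transparent overlay

plt.xlabel("frames per second")
plt.ylabel("share of test time")
plt.legend()
plt.show()
```

The further right a curve's peak and the narrower its spread, the faster and more consistent the card, which is exactly the at-a-glance comparison the comment describes.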
6800 xt looks impressive, it's really cheap first and second hand where I live
Main takeaway from this Video
Yeah, definitely gonna stick with my trusty 6800XT for a bit. Great price-performance.
Wow, it's still surprising how well the GTX 1080 has lasted over the years. Which is why it's staying in my PC for another couple of years! Even for 1440p at mid settings.
Bruh, time for an upgrade unless you're happy at 1080p.
@@squirrelsinjacket1804 I will upgrade to a 4070 eventually, but for now games like Cyberpunk, Battlefront 2, and TF2 run surprisingly well at 1440p. For Cyberpunk I can get 70+ fps at mid-low settings because the card is water-cooled and overclocked like crazy.
@@calebmartin_ Yea, that works... I feel like once you make the move to 4K and max everything out, it should be on a proper 40+ series card.
Stop it! It's not by choice.
@@squirrelsinjacket1804 you don’t have to play newer demanding games at max settings. If you don’t need a new PC, why buy one?
I am a lifelong AMD card user, and I can't remember the last time I cared about sitting through a video card review in its entirety. But I watched and really enjoyed this entire video. The writers did a great job!
>4000 is a much better value than 2000 / 3000 series
No it's not. The 2000 / 3000 series were priced that way because you could've mined ETH with them, and you actually could've recouped ALL YOUR INVESTMENT ON YOUR OVERPRICED AF 2000 / 3000 CARD + MADE A PROFIT.
Only stupid Ngreedia fanbois don't understand that 2000 and 3000 series cards were overpriced ONLY BECAUSE THEY COULD'VE MADE MONEY for their owners THEN.
Now mining is DEAD, so the 4000 series prices make 0 SENSE. Same with ayymoode.
What they don't tell you about DLSS3 on the 4070 is that it uses shader cores instead of dedicated hardware. That's their dirty little secret. You can tell this because the 4090 gives 2-4x frame generation, while the 4070 only gives ~1.5x. That means you're actually losing real frames to gain fake frames. The 4090, however, does not lose real frames; it simply adds more frames on top of the real ones. This is quite a deceptive change, because it means frame generation does not increase frame rate equally across different cards, making it far less effective on the 4070. If you look at the graph, the 4070 went from 36 to 58 fps with frame generation. Since generated frames are produced one-for-one with real frames, with frame generation on the 4070 is actually producing only 29 real frames instead of 36.
That means your real-frame latency actually drops below even 30fps levels with frame generation on. This can only be explained by them secretly using CUDA cores to do the frame generation, rather than the dedicated frame-generation hardware in the 4090. This is a big freakin' deal, because it seriously undermines the benefit of even having DLSS3. I would consider this kind of deception not just borderline illegal but actually illegal, as they have fundamentally altered the promised nature of how a core feature functions: real hardware in the higher-tier cards, and what amounts to emulation in the lower-tier cards, all without actually disclosing it. Like a child being fed and having the spoon switched from delicious cake to vegetables at the last second, this is exactly what Nvidia has done with the 4070 and consumers, hoping you don't notice.
This is class-action-lawsuit levels of deception, like back when Nvidia made the GTX 970 with 4GB of VRAM but segmented 512MB of it from the main pool, making it virtually useless. The 4070 doesn't have real frame generation; it has EMULATED frame generation that comes at the expense of core performance.
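For what it's worth, the arithmetic in that comment is easy to reproduce. This sketch assumes the commenter's premise that frame generation presents exactly one generated frame per rendered frame; whether that premise actually holds is the disputed part.

```python
# The comment's frame-generation math, under the (disputed) premise that
# exactly half of all presented frames are generated.

def rendered_fps(presented_fps: float) -> float:
    """If every other presented frame is generated, half are really rendered."""
    return presented_fps / 2.0

fg_off = 36.0  # fps with frame generation off, per the comment
fg_on = 58.0   # fps with frame generation on, per the comment

real = rendered_fps(fg_on)
print(f"FG on: {fg_on} presented fps, but only {real:.0f} rendered fps")
print(f"FG off: {fg_off} rendered fps -> {fg_off - real:.0f} fewer real frames with FG on")
```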
Just because Ngreedia didn't meet your expectations with this card does not mean they did anything illegal lol. Don't know if you were being serious with that or not.
Glad I got my 7900 XTX at MSRP. I had a 3060 Ti and I loved it (1440p high at 144Hz was more than enough for me), but these 40-series cards aren't worth the money this generation
I upgraded because I stupidly sold my PC last year… I literally only have a GPU because I was waiting for the 7800X3D, until sites other than Best Buy listed them $100-$300 over MSRP 😭
Great! I have an RX 7900 XT and it seems the updates are making a big difference!
It depends on the individual consumer. 😩
Neither is the 7900XTX
GPU pricing is a joke.
@@dauntae24 oh for sure. If you’ve got the money go for it! No shame here. I just found AMD’s value proposition more enticing, again if you can find the products at or around MSRP
Every new 40 series card makes me happier and happier that I went with the RX 7900 XTX for this generation.
Minus the vapor chamber issue, yes
Really hoping to see a 7900 re-review in a few months to see if it's gotten as much love as the 6000 series in terms of driver updates.