I have an RTX 3090 Ti and I'm in no way upgrading any time soon. It's consistent, and I doubt there will be a big enough jump in gaming to make it irrelevant or truly challenge its quality of gameplay.
Got a used 3080 Ti for $620 and I'm riding this until the 5000 generation at least. As shown here, there's not much difference until you get to the 4080, which simply isn't worth buying given the relative performance between the 4080 and the 3080 Ti. Further, I play at 1440p only, as many of us do, and most of my games are 2+ years old. Combined with the fact that I don't use ray tracing much, there's plenty of performance for my tastes.
Love how you call my (probable) upgrade GPU old.... My guy, my 980 (not the Ti, just the basic 980) has been kicking hard with my 1440p, triple-monitor gaming lol.
And it's going to get even closer in FPS with AMD FSR 3.0. FSR 3.0 is the reason I didn't upgrade to RTX 4000; I'm gonna wait for either RTX 5000 or AMD RX 7000.
Be honest and take the 3080 to the Hogsmeade map too; he only shows the 4070 Ti in Hogsmeade, not the 3080. There is no bus problem with the 4070 Ti; it was a bug and possibly a shortage of RAM, it's not even certain. And while testing the new fixes at 2K (because I'm very happy with this), I'm measuring an average of 150 FPS with DLSS Balanced and ultra ray tracing. Nvidia should sue you for not being fair and demonstrating the cards at different locations. Shame on you.
I said this before....his test isn't a real world scenario. If you find your 4070ti doesn't run Hogwarts well at 4k why the hell wouldn't you use DLSS to render at 2k and upscale to 4k? Problem solved....I get his point but if you actually own the card you're going to make it work for you and be happy with what you have. It's still way too expensive though....imho.
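On the "render at 2K and upscale" point above: DLSS 2's modes render internally at fixed fractions of the output resolution before upscaling. A minimal sketch of that arithmetic in Python, assuming the commonly cited per-axis scale factors for each mode (approximate; individual games can override them):

```python
# Internal render resolution for DLSS 2 modes at a given output resolution.
# Scale factors are the commonly cited per-axis ratios (approximate).
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal resolution DLSS renders before upscaling."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALES:
    w, h = render_resolution(3840, 2160, mode)
    print(f"4K output, DLSS {mode}: ~{w}x{h} internal")
# Quality at 4K lands at roughly 2560x1440 -- the "render at 2K" described above.
```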
Hey, BMWs are cool though. Best cars ever! I was showing a specific issue with the 4070 Ti there; the 3080 Ti did not demonstrate those same problems in the same way, only the 4070 Ti. The 3080 Ti didn't have high FPS there, but it wasn't stuttering that badly either.
DLSS 3 seems to increase VRAM usage, so if the game was already maxing out VRAM, turning on DLSS 3 can really destroy your game. This may be the reason for what you experienced.
Thank you. I decided to get an RX 7900 thanks to you. Was going to get a 4070, but the stuttering put me off. You can get a 7900 cheaper than the 4070 now.
My setup (Nov 2020): i9-10900K @ 5.2, 32GB DDR4, RTX 3090, 2x 1TB 970s, 1000W PSU. It's in its 3rd year of use, and with a 120+ game backlog spanning 2010-2023, I'm just sitting here enjoying games, so far at 1440p 165Hz at max settings, most in the 130-165 FPS bracket (my monitor caps at 165 FPS/165Hz). Just under 10 can't hit that, but if I lower everything to high instead of ultra, they do. I'm halfway done with my games; by the time the RTX 6090 is out I'll be done with all of them. Great card for years. Btw, the roughly 10 games that can't do the 130-165 bracket are not all new games; some are from 2016-2020.
I have it doing 4K120 ultra in quite a few titles. Granted, it's not doing that in modern titles or broken games, but I was easily getting max in native raster Witcher 3 before they did the RT update, for instance.
@@abd1692 I mean I have yet to come across a game I couldn't max out with a 3080ti, Cyberpunk is really the only exception in my library. With a few tweaks, I was able to max native fps on Witcher 3 again, with RT off. I realize that's a long way of saying I agree. lol
A seller was clearing some stock and sold me a sealed-box 3080 Ti for $650. I instantly bought it. I think it's a big deal considering prices here are almost double due to duty and taxes, so a vanilla 4070 would cost a lot more than the deal I got. My outgoing card was a 1660 Ti, and I usually keep cards for a long time.
I have played Hogwarts Legacy on my 4070 Ti at 1440p ultra quality, RTX on, and DLSS and frame generation both turned on. My CPU is a 12700F and I've encountered far fewer issues than you claim the 4070 Ti has. It's possible the card you have is defective, but that is not normal performance for that card.
The problem comes as soon as your VRAM usage gets close to or over 12GB, because the bus can't keep up anymore pushing all the information into VRAM. If you lower your settings you won't see the problem, but you're paying a shit ton of money for what should then be a 3090. Just accept reality.
I'd rather have a 3080 Ti if I do ultimately stick with Nvidia, as I don't use ray tracing, especially in fast-paced games where you won't even notice it when you're more concerned about enemy players coming around a building or sniping from a rooftop. Probably the only real advantage to something like ray-traced reflections in Call of Duty or Battlefield would be the glint off a sniper scope's lens giving away their position.

If, however, I stick with my decision to switch to AMD for the rasterized price-to-performance of the RX 7900 XTX, I'd get the Sapphire Tech Nitro+ RX 7900 XTX. I even got the prescription for my glasses for the VR headset I want, the BigScreen Beyond, as you can't wear glasses with it, but they do have prescription lens inserts. Either the RTX 3080 Ti or the RX 7900 XTX would be a beast in its own right and shouldn't have any issues running DCS and War Thunder in VR, let alone older VR horror games like Dreadhalls.

Watching Markiplier and DashieGames play a VR horror game, even on a 65" LG C2 OLED, isn't the same as experiencing what they experienced. I'll bet it's absolutely intense. And I've heard from VR users I've encountered in War Thunder Simulator Battles that you don't realize the scale of everything until you're seeing the game from the actual pilot's perspective in VR rather than on a monitor. Apparently the F-14 is an absolutely enormous aircraft.
Yeah, I'm curious why that happened. I wonder if it's a bug or DLSS 3 causing this in that area of the game. In Cyberpunk it was working a lot better at full power.
Of course a 3080 is still a good GPU; even the 10GB version is pretty damn good today. My friend has one paired with a 5600X at 2560x1440 and he can run any game out there at medium/high settings with 6-8GB VRAM usage and 60-70+ FPS. And the 3080 12GB is pretty much a perfect 1440p GPU that will be solid for the coming 2-3 years. If you're OK with playing at medium-high settings instead of all ultra, then a 12GB VRAM GPU will be enough for the coming 5 years lol.
I'm torn between swapping my reference 6800 XT for a 3080 Ti or not. I can buy one for ~600 and think I can get the same amount for the 6800 XT. Any thoughts?
A 6800 XT overclocked gives the exact performance of a 3080 Ti without ray tracing. At worst you'd see a 5-10 FPS difference. You also have 16GB of VRAM, which you'll increasingly need for smoother gameplay.
Asking if the 3080 Ti is still relevant? Hell, I'm still over here on a 1080 Ti squeezing 50-60 FPS at 1440p in Hogwarts with FSR 2 Quality and high settings. A 3080 Ti should be able to double that.
Valid point on the intentionally gimped 192-bit bus, but a 70-class GPU is not meant for ultra settings, let alone in 4K, and I'm sure you will have no problems with optimized settings at 4K, or even ultra with DLSS Quality. Most people who buy a 4070 Ti will have a 1440p high-refresh-rate monitor and frankly won't give a damn about how it's broken at 4K max settings with ray tracing. Also, this is the worst-case scenario in some of the most demanding and badly optimized games.
Yes, this. The 4070ti is NOT a 4K GPU. The only real 4K GPU in existence is the 4090, but you'd have to be an idiot to pay that much to play games...or, have zero life. NOBODY plays in 4K on PC anyways. For 1440p, the 4070ti will last you MANY years.
@@MrChologno Well, it is what it is with today's market. The reality is, if you badly need a new GPU now, it's not a bad call to get a 4070 Ti, especially if you mostly play competitive esports-type games and don't care if you need to use DLSS for heavier new triple-A single-player games.
Check out Our Sponsor Jawa: bit.ly/jawaCT
Take $10 off with code WELCOME10
How about doing a vid showing how easy it is to use Jawa. This is definitely the target audience for it. BTW, love how you go in depth explaining your point and not just rehashing other YouTubers' stuff!
Hi, I guess links are not allowed on your YouTube channel :)? You could test the 3090 vs 4070 Ti at 4K max settings + ray tracing in the Resi 4 demo; it took 14GB allocated + 14.5GB dedicated VRAM on my 3090 Ti :=), best regards
Which I find very disturbing. There are too many of these phony scalpers hyping up ridiculous prices in the comment section as a bargain deal.
Sad that so many folks buy into these marketing influencers' nonsense scams.
Just a friendly reminder that a year ago a 3080 Ti used to cost more than a 4090 does today. I wouldn't call it "old".
Yah haha I chuckled when he said it was “old”
@@wakingfromslumber9555 *walks with head high as proud owner of 1080ti since 2017*
@@sunteraze ha, the 1080 Ti is still one of the best-value graphics cards Nvidia has ever released. Since the 1080 and 1080 Ti, things have been going downhill for them.
Totally bizarre to call a GPU from 2021 'old', especially when the current gen of cards are barely an improvement (except at the very top end). But the marketing is working I guess.
Crypto boom and the pandemic.
The 3080 Ti is a beast, in 2023 and beyond.
I have to say, I thoroughly enjoy my 3080 I got for $460 in early 2023. Runs 1440p like a boss at high frame rates.
Yup. Bought a 3080 Ti used for 600€ 2 months ago, with 2 years of warranty left, and it's amazing value. I sold my 3080 (10Gb) for 500€, so the upgrade cost me only 100€. Nowadays, with more mature drivers, the 3080 Ti is much closer to 3090 in gaming performance.
Changed the thermal pads and it's now running very cool, with good OC. Decent value upgrade in today's market, even from an original 3080 (which I bought for 740€ at release in 2020).
lolol, did nearly the same: sold my 3080 ROG Strix 10GB for 500€, found a guy selling a Zotac 4070 Ti AMP Extreme AIRO for 700 bucks, added 200$ and got the 4070 Ti.
On average it's 10-20 FPS better than the 3080 at 3440x1440 in 10/10 games, and while my 3080 ROG Strix easily pulled 330-360W, the 4070 Ti barely touches 260-270W and in most games just pulls 180-220W.
I just swapped from a 3080 to a 3080 Ti and it was a night and day difference at 4K; the 3080 Ti is insanely fast.
No man.
I swapped from a 3080 10GB Zotac Trinity to a 3080 Ti 12GB EVGA Ultra FTW3 a year back.
At 4K Ultra the difference is somewhere between 12-17%.
In real results in the games that I play, that means an average bump from ~73FPS to ~85FPS.
It's barely even noticeable.
I love how something like the 3080 Ti is "old" when the most-used GPUs are 1630s and other 10-series cards lol.
Some people are still laughing with their 1080 Tis, as all these games still haven't made them obsolete. They just don't get RT.
Many buyers are tainted by consumerist ideas planted in their mind by companies or hell knows what
@@ArtisChronicles RT can make a huge difference for visuals, the 1080Ti on raster aged pretty well.
The 1080 Ti aged like fine wine, it's crazy.
The point is, if the 4070 Ti had more VRAM or a wider bus, you would get significantly more frames, longevity, and stability. You are paying 800 dollars, which is usually xx80-tier card pricing. Why would people defend Nvidia giving you LESS while you pay more?? We're talking about VRAM/bus, which are cheap, not the die size, which is the bulk of the expense. Anything below the 4090 is gimped, either on power or VRAM. The 4080 would be acceptable if it were $1,000 or lower, but it's not. Pretty sure they gimped the 4080 on VRAM so there can be a 4080 Ti with 20GB of VRAM.
Open-box 4080s are readily available at Best Buy if you keep checking. Got my open-box 4080 for 895 and my open-box 4090 for 1299, both reference models.
Nobody is defending Nvidia, but the not enough VRAM nonsense is getting ridiculous. Bottom line is that Nvidia doesn't care what you or anyone thinks about their choices for VRAM as everyone will still buy their cards. At 1440p, 12GB is and will be plenty of VRAM for the foreseeable future. You want to waste money on more (i.e. 3090 and 3090ti owners that got royally butt-raped)? Get a 4080/4090 or AMD card (yah...right).
@@karlhungus545 this comment is gonna age like a bag of milk and it's gonna be glorious
@@ArtisChronicles 😆 Yah, we'll see how fast the 4K market on PC grows, you're right. Actually, you're not, I'm right.
@@karlhungus545 So... you're fine getting less. That's all you gotta say. Wouldn't you rather have 16GB on the 4070 Ti instead of 12GB????? It is evident from the video that if it had 16GB, it would be smooth with good frames in Cyberpunk at 4K instead of stuttery. And guess what? That translates to you being able to keep and use that card for much longer at 1440p, for an 800 dollar card.
Bought my 3080 Ti after the 4090 released. Only got it for $800. This thing is a beast at 1440p and 4K.
Watch for people who will say your card is already outdated and you should go for more VRAM... but god I hate my 7900 XT, I wish I had bought a 4080... or even a 4070 Ti, pretty sure I would be able to play games at least....
@@thedaeron3877 What's the issue with your 7900 XT? I am loving it; I am getting better performance out of it than a 3080 Ti.
@@wakingfromslumber9555 😂
🤞
You must only game too.
@@dangerous8333 and programming
So if I can get one used for 400, should I jump on it? I have a 3060 Ti and a Ryzen 5 3600. I'm worried about a CPU bottleneck.
I like to think there's at least one Nvidia fanboi out there losing sleep over frame generation, and I'm going to love seeing FSR 3 run on Nvidia's 3000 series.
Yessssss me too
You mean the Fake Frames version for AMD. Hmmmm, it will be interesting to see if AMD optimizes it more for their GPUs or if it will be basically the same across the board.
@@NBWDOUGHBOY FSR is anti-aliasing and upscaling, not frame generation.
@@x8jason8x He said FSR 3. FSR 3 will be the 3rd version of FSR which will include the Frame Generation Technology from AMD
@@NBWDOUGHBOY Looking at it, it seems that you'll be able to turn it off, like nVidia with frame generation. I don't see the problem, it's not as if they're going to force people to use the interpolation.
Bought a 3080 Ti in Jan for $480; pretty good performance for the price I got it for.
perfect 1440p gpu.
Was gonna ask you to compare these cards actually, since it's a more proper test: comparing a 384-bit bus vs a 192-bit bus with the same amount of VRAM while trying to max out VRAM usage.
Agreed, the 384-bit bandwidth makes a huge difference at 4K.
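To put rough numbers on that bandwidth gap: peak memory bandwidth is just bus width times the memory's effective data rate. A back-of-the-envelope sketch, assuming the commonly published GDDR6X specs for these cards (worth verifying against a spec sheet):

```python
# Peak theoretical memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) * effective data rate in Gbps per pin.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Published GDDR6X configs (assumed here, double-check against spec sheets):
cards = {
    "RTX 3080 10GB": (320, 19.0),
    "RTX 3080 Ti":   (384, 19.0),
    "RTX 4070 Ti":   (192, 21.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# -> roughly 760, 912, and 504 GB/s respectively.
```

Ada's much larger L2 cache is meant to compensate for the narrower bus, which works until the working set (4K plus ray tracing, say) spills past the cache, which lines up with the behavior discussed in this thread.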
If it wasn't for the power efficiency upgrade, the 4000 series so far (with the exception of the 4090) would probably not be a worthwhile upgrade if you already have a 3080 or above.
what about dlss3
As has been the case with buying two generations in a row since the dawn of time.
dlss3, av1 encoding, power efficiency and better rt cores
@@MrKasenom I have a 3080 Ti. I don't use DLSS at all, just native 2K, and it plays all games very well with plenty of FPS.
Just undervolt your 30-series card and the power efficiency isn't an issue. There's also a mod to get FSR 3 on 30-series Nvidia cards which will let you use frame generation, so you're not even losing out on DLSS 3. People paying full price for 40-series cards are just throwing their money away.
I've been saying the same about my 3080ti lol, but people won't believe it even with you showing them.
so I live in Korea, and recently I purchased a 3080ti (brand new) for $644.99. Would you guys consider this a good price? My friends say I got super lucky because this store was selling their remaining 3080ti's for cheaper to make room for the new 4000 series.
If you got it brand new it's a good price, depending a lot on how much a 3080 Ti went for in Korea.
Nowadays you can buy a 3080 for around 400 EUR, because everybody believes in the magical "intelligence" of cache and frame generation. Meanwhile the 3080 is pushing everything with raw power; even the super broken dev version of Atomic Heart with full ray tracing is fully playable on it, thanks to the raw VRAM bandwidth (760GB/s). Regarding frequencies, a 3080 with good cooling can boost even to 2GHz vs the declared 1.7GHz.
If the 4070 Ti had the same bus width as the 3080 Ti, it would be a much better GPU, but the way it is now, it won't have a long lifespan.
The 4070 Ti is basically a glorified 12GB 3060 lol.
@@IAmWadih What?! Do you even look at benchmarks? 🤣 The 3080ti is old, power-hungry, mined-on tech. Only a true idiot would buy one of those.
@@karlhungus545 you ate glue in kindergarten, didn't you?
@@karlhungus545 guess I'm a true idiot lol
@@JakeNach You're man enough to admit it. Important step...
I recently sold my Asus TUF 3070 8GB OC for $325 and got a Gigabyte Gaming 3080 Ti 12GB OC for $470; initially I was looking at the 4070, but that would have cost me a few hundred more. Think I am good to go for another 1-2 years now.
I literally just bought a used 3080Ti for $550 this month, August 2023. I replaced a 10GB 3080 purchased for $550 last year and justified the upgrade by selling the 3080 on again. All in all I'm out about $100. At $550 3080Ti is technically a decent value for price/performance, has enough memory, and my system is built to support a high end card so TDP isn't really a major concern. 400W TDP is nothing to scoff at but it's not nearly a 3090Ti or 4090 anyway. As it only requires a pair of 8 pin PCIe power connectors, it can only draw so much.
3080Ti is a decent little jump from the 3080, and while not massively faster, there are some games where the difference is surprising.
12GB isn't exactly an ideal amount but it's far better than any 8GB GPU and PLENTY of memory for 1440p in almost every possible scenario. Even new and very demanding games don't usually use over about 8GB of memory. The outlier games I play are the Resident Evil Remakes which will happily gobble as much memory as they can get. And those ran just fine at max settings locked 75FPS Freesync on the 10GB 3080. Would 16GB be nice? Yeah but 12GB is serviceable for now and the ray-tracing performance advantage is worth it to me.
Are there better values available than a $550 3080Ti? Probably. But if you want to keep using Nvidia hardware and don't want to buy a compromised, price fixed, or dangerous video card, the 3080Ti is as good as it gets. Poor 3070 and 3070Ti are very gimped at 8GB despite their actual performance easily justifying more memory. As well 4060 and 4070 are lame ducks that have very underwhelming performance, even my 10GB 3080 is faster than those cards. 4070Ti is a decent performer but it averages $1000 for premium models and even the cheapest ones are $800 for what is essentially a mid range card that should be about $400-500 max.
That basically leaves the 3080, 3080 12GB, and 3080Ti as the only reasonable options from Nvidia right now for anyone that runs high end hardware but wants to spend a SANE amount on a GPU. And that's ONLY if you buy them USED. If you don't have anything locking you down to Nvidia right now, go AMD.
People today seem to forget how important VRAM specs are. I may sound like an old dad (haha), but back in the day when I bought my GTX 970, I had stuttering issues when going over that really weird 3.5GB of full-speed memory. The 970 had 4GB in total, but only 3.5GB was full speed; the remaining 500MB was much slower. When I played GTA V and it started raining, it pushed beyond 3.5GB and really started stuttering very badly, so I had to lower some graphics options just because of the rain stutter; otherwise it ran very well.
The 4070 Ti's 192-bit bus may not be a huge issue right now at 1440p, but when you see how it performs right now at 4K, you know the same thing is going to happen at 1440p in a year or two. Remember, games don't get easier to run (newer games, obviously, not the ones already released) but require more powerful hardware as time goes on. If you are OK with spending $800-1100 on a card only to need an upgrade in 2 years, then so be it, but I'm not in that camp, and I'm pretty sure the vast majority isn't either.
In my opinion the VRAM should never be the bottleneck on a graphics card; the GPU should always be the limit, not the VRAM. You pay all that money for the GPU but then don't get all the performance you paid for, because Nvidia decided to put a bad VRAM configuration on it to save a buck while charging higher prices.
Shameful thing is Ngreedia don't pay very much for their VRAM in the first place, as they buy so much; I saw somewhere it works out at about $2 per GB. So an extra $8 to give the 4070 Ti 16GB.
No DLSS 3 though; they made the 30 series so good they have to nerf it to replace it. At this point I'm glad I got an AMD GPU. Heard that FSR 3.0 might get announced 2 weeks from now, on the 23rd of March.
For some reason the 3080 ti costs more than a 4090 in the webshops over here 😂
This man hates the 4070ti more than people hated the RX 6500xt lol.
It is $h!t-tier and you know it.
If you're paying that much, I also feel you're being taken for a ride by Nvidia.
They just released a patch for Hogwarts today, hopefully it should resolve a lot of the issues seen on PC
Tbh the most mind-blowing thing for me is how people use Hogwarts as an example of how new games will obsolete a lot of GPUs while this game is broken af. One of the first patches had "memory leak issue fixed"; the new patch has "Fixed several memory leaks", "Improve VRAM usage", "Fixed lighting optimization with Nvidia drivers", "Fixed streaming issues", and all of this stuff, but no, people are saying that 2-year-old hardware is too old to carry this. If this game had released with those kinds of issues on consoles, especially on XSS, those consoles would probably crash in the main menu. Also worth mentioning: Dead Space Remake also has memory issues that Digital Foundry found; not sure why this is happening with these new games.
@@Extreme96PL Funny, as you complain about it being broken, I hear Atomic Heart crying in the distance...
Funny thing about how "broken" HL is, it's still one of the most complete triple A game releases in a long time. So don't expect future releases to be better, especially not when UE5 becomes the standard engine.
@@x8jason8x If a product is broken then the content doesn't matter, and if devs don't care about their product and release it broken, I see no reason to care about it either. Just don't use this broken product as an example to test another product, because that means most if not all problems will be the fault of the game, not the hardware. I'll say it again: let people pre-order more triple-A games, I'm sure that will force devs to care, especially if their lazy port already paid off in pre-orders. Not sure what you mean about Atomic Heart; it's probably the best optimized game this year together with Returnal. Both of them look great and work great, and both use UE4 too, so as you see, games on UE4 can be optimized, devs just need to care instead of being lazy.
@@Extreme96PL You... you really don't understand... this "broken" game, as you call it... is the best optimized triple a game released in *_years,_* but somehow you don't like it as a benchmark? That's nonsense my guy. It's exactly what should be used as a benchmark standard.
@@x8jason8x Best optimized game? Wtf, how can this be the best optimized game if even the devs said in the patch log that they fixed multiple critical issues, and no one knows how many issues are still there? The game can't even use the CPU properly, because in highly populated areas you're always CPU-bound even with really good CPUs, and even people with high-end PCs much more powerful than consoles get FPS drops and stutters lol. Do you think the Arkham Knight PC port was good too before they fixed it after almost 2 years? Or the GTA IV port, which was never fixed and still has issues?
I just bought a used 3080 Ti for $630 (USD). It came in near perfect condition, and I have a 750W Platinum PSU, so I undervolted it and OC'd the memory, and I pretty much have a cheaper 4070 Ti with a bigger memory bus, similar efficiency, and amazing performance with the bells and whistles turned on. I highly recommend getting a used 3080 Ti for less than 700 and saving the money for other parts of a build.
I'm pumped. I was looking to get one a couple of months ago and they were all in the 700-750 range. Going to look at a soldier's 3080 Ti he didn't get to enjoy because he got deployed for 2 years, for $550. That will hold me over great at 3440x1440 till next gen, and I still have to sell off the 5700 XT and 6700 XT I'm sitting on as backups.
So the 4070 ti is the GTX 970 all over again. But this time the price is atrocious also. I think if this was a 4060 ti and $400 to $500 we would easily dismiss these observations by saying a 4060 ti is not expected to be 4k ultra RT capable.
Jesus my small mid-range 3060 ti has a memory bus of 256 bits
All the new RTX 4000 and RX 7000 models need to be half of launch MSRP. That's what a fair price is. Anything more and you overpaid.
LOL…used my OLD 1080 for 6+ years. Loving my Strix 3080ti and don’t plan to upgrade for some time to come. Skipping 4000 series is an easy choice…we shall see what happens with the 5000 series but won’t be upgrading until a 5090 at least…I like to make it last as long as possible and bump to the next level…and the next level is a 90 series.
I’m waiting patiently for 5080 Ti.
I’m an xx80 Ti guy since 980 Ti. I had 1080 Ti and 2080 Ti. I’m currently using a 3080 Ti. I’ll skip the 4000 series this time.
I love it... "look at my hands" ... "I'm not doin nothing"... You're Great, absolutely love your channel. Thanks man !!
I got used 3080ti in late 2022 after being disappointed by 4090 and both 4080s. It would be so funny if my 80 class gpu beat next generation 80 class in games with high vram usage.
I got a 2070 and holding strong with a 5600x on a b450 board, I might just wait for 2nd hand prices to get better.
Nothing wrong with the 2070 at all, still a very good card. FSR3 should actually help a lot as well.
I think nvidia will eventually drop out of the low/mid end market once people get tired of paying scam level prices and stop buying them.
Yea I'm done with them. Next PC is full AMD/Ryzen.
From the looks of it, the RTX 3080 Ti may be old, but it still gets the job done.
Funny how you and Classical Technology both call the 3080Ti "old". One generation behind is not old by any means.
@@harveybirdmannequin Because of how fast tech moves, a 2-year-old card IS old. On release the 3080 Ti was considered a 4K card. 2 years later it's basically a 1440p card and on the lower end of midrange.
@@Dempig but the price is still not low end!! That's what's wrong with the whole GPU market!! 😂😎💰❤️❤️❤️🤨🏈
@@DirtyMoney1971 It went from being unavailable to out of reach in price.
@@Dempig Okay, you're on one there bud, 3080ti is still a very capable 4k card. Lower end of midrange? LOL! The lower end of midrange for nVidia has always been 50 class skus. It's quite obviously still in the top 5 nVidia cards, top 10 across all cards.
If you shut ray tracing off with the 3080ti you can get 80-120 fps on 4k Ultra settings in most games with DLSS 2
The 3080 Ti, even used, will be more expensive than the 4070 Ti, so Nvidia made a mistake on the VRAM for the 4070 Ti.
4070ti with 192bit bus and 12gb in 2023 for $800+ LOL
How much cope does it take to swallow this? Who buys this garbage?
So you're saying that thumbnail is faster than my 7900 XTX Liquid Devil? Hmmm… I doubt that.
The RTX performance is better and the card is also faster in Blender.
I never use a blender so that means nothing to me😅
@@rikogarza1729 He's obviously not talking to you. However, my white RTX 4090 Asus Strix blows your GPU out of the water in everything.
@@ZackSNetwork In a twist, I have a 4090 Asus Strix on a water block, in black, and it's also overclocked, so yeah, I sunk your battleship.
I’m happy with my “old” 3080 Ti, it’ll remain in my pc for years to come, it is perfect for 1440p
It's perfect for 4K, especially if the game is made by a competent studio and uses DLSS. For example, I play RDR2 at 4K optimised settings (leagues ahead of console settings) with DLSS Quality and almost never drop below 100 FPS; same with HZD, Shadow of Mordor with the 4K texture pack DLC, God of War, etc.
it’s not that the high end 3000 series cards have become obsolete, it’s that games are not made well anymore.
@@George-um2vc What you say is true, the 3080 Ti was meant to be a 4K card, and it is still a beast (basically a 3090 with half the VRAM), but shitty optimization, forced releases, and lazy developers make it look as if it is already obsolete, and Nvidia is using frame generation to make the 4000 series look tough (which I think could easily be utilised on the 3000 series too). So it is in Nvidia's interest to force the still appealing high-end 3000 series out of the market, to increase the crappy sales of the 4000 series. In addition, maybe it's a conspiracy theory or not, newer drivers make older cards slower to make the 4000 series look better in comparison to the 3000 series.
@@mikerowave1986 Have to agree with you on all that. What you said about drivers scares me; we know Nvidia does not have our best interests at heart, and they could well be nerfing the 3000 series through drivers. At least we can always roll back, right?
@@George-um2vc Also awful CPU bottlenecks because of bad optimization, even on 10 year old hyperthreaded quad cores.
@@saricubra2867 Yes, and that's a huge part of what I was referring to about games just not being made well. Devs do not put effort in to make their games take advantage of multi-core/multi-thread CPUs, which just bottlenecks your GPU no matter how powerful it is and leaves lord knows how much performance on the table. Sad times.
Bro what, the 1080 is still pretty capable in 2023, tf u mean "old gpu"? The 3080 Ti is still very much capable, and I wouldn't count 3 years as old by any means.
I have a 3080 Ti and am fairly happy with it. When buying, I was on the fence about whether I should go with a 6900 XT instead.
I feel like if AMD can just be a little more consistent, we could have a case like Intel vs AMD in CPUs but with Nvidia and GPUs over the next few years, if Nvidia starts to get lazy and cheap out on new GPUs the way Intel did with CPUs. It's not the same obviously, since the 4090 is pretty crazy, but if they only put effort into the highest model, they could end up losing market share.
Ngreedia already cheaped out hence this video
So I've watched other channels and I don't see others having the same experience you have with the 4070 Ti.
Well, to be fair, I'm pushing it at 4K; most other channels mainly focus on 1440p or just 4K benchmarks, without taking into account the VRAM limits when presented with ray-tracing-heavy games at max settings.
Cuz he’s full of crap.
He has no content ideas so he’s riding the Nvidia hate wave. Can’t wait til this channel die lol
@@derricksmith3973 I've seen this issue before, but it's not related to the GPU; I've seen it when there is a bad network connection. I don't think that is his experience, but as an IT engineering professional, I don't find this content helpful without knowing more about his setup/variables.
@@BowlersRant I agree.
Nice to see a frametime graph.
3080 and the variants above it will carry over a consumer for a while similar to the 980ti especially if you're gaming at 1440p. If you plan on doing 4K gaming with RT and DLSS Quality enabled, then maybe consider a 4090.
Sold my 1080 Ti for a 3080 (300€ to 850€) one month ago.
Video editing: 100%, gaming: 0%.
To be honest, compared to the 1080 Ti I am not impressed with the 3080 either! It is not 3x as I expected it to be, but something like 40% better.
But I won't deal with the 3080 Ti power consumption!
Video editors, don't only count CUDA cores (as I did) to calculate your render times; look up the board's "CUDA compute capability" number and see the real differences there.
CT, thanks for your work brother!
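If you'd rather query that compute capability number programmatically than dig through spec tables, here's a minimal sketch; it assumes a CUDA-enabled PyTorch install (my assumption, not something from the comment; pynvml exposes the same query):

```python
# Print the CUDA compute capability of each visible GPU.
# Assumes a CUDA-enabled PyTorch install (an assumption, not from the comment).
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        major, minor = torch.cuda.get_device_capability(i)
        print(f"{torch.cuda.get_device_name(i)}: compute capability {major}.{minor}")
else:
    print("No CUDA device visible")
# For reference: Pascal (1080 Ti) is 6.1, Ampere (3080/3080 Ti/3090) is 8.6.
```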
@@Achilleas7 LOL! If you got zero gaming performance increase with a 3080, you're doing something mega wrong. A base 3080 10gb is 40% faster than a 2080ti *_at worst..._* you should take your system to a professional and have it optimized.
@@x8jason8x
Next time you respond to someone what you think they should do, make sure you qualify as a reader first, and then turn to writer!
Read again and try not to fail this time! Editing: 100%, gaming: 0%. I don't game! Go back to gaming and let professionals have their talk without you interfering with them!
@@Achilleas7 You need to "read again". I specifically addressed what you said. As to "professionals", you most certainly are not. I build for a living, and you are talking out of your neck.
@@Achilleas7 Next time, if you mean to say you do zero gaming, say that specifically.
3080 TI is still the KING 384 BIT POWER !!!!!!!! OVER 10000 CORES IS STILL THE KING !!!!!!
I just popped a 3080 ti in a test system today, and was impressed again. It's so much better than the newer 4070 and below it's not even funny!
@@ClassicalTechnology I recommended a 3080Ti to a friend with a 5900X. I compared it against the 1080Ti (352 bit memory bus and 11GB of VRAM) and the 3090 (number of CUDA cores). The only good Ampere card. The rest of the lineup, wrecked by RDNA2.
A 3080Ti is considered "old" already? LOL. Come on dude. A 1080Ti is still a beast. So yes, the 3080Ti is *more* than capable and will continue to be so for several years.
"But hey, a 70-class GPU was never meant to perform at 4K ultra settings in the first place, anyway".
Yeah, and this card was never meant to be a 70-class GPU in the first place either. It should have been the RTX 4080; at least it was intended to be. Don't ever forget that.
You can make a fool of yourself all day long, looking for a reasonable point to justify your bad call in purchasing this piece of crap, as well as your eagerness to remain a shameless company simp. If that makes you feel somewhat better, good for you, I guess...
But first of all, if the kind of thought presented in the first line ever had merit as an argument, it would be pointless to seek innovation; we should've just kept aiming at 1080p 60 fps as flagships did in the past. Now we're starting to see 8K resolution becoming a thing, and Nvidia is there trying to push you a crippled GPU that is already incapable of performing well in some titles if you max out your settings at 1440p, let alone 4K, and that just a few weeks after it launched. This hardware is overpriced, outdated horsesh*t. And you keep telling Nvidia, "oh yeah, give us some more of that! Keep going, and don't forget to charge a premium for no reason at all, 'cause we all love you just that way."
Well then, no wonder we've been seeing all those ridiculous GPU prices for quite a while. You just won't stop asking for it.
Oh, and have I told you this card was initially intended to be an 80-class GPU? I beg your pardon if I haven't.
Second, Nvidia itself advertised this garbage as comparable or superior to an RTX 3090, which in turn was advertised as capable of 8K gaming in the Ampere generation. I can't help but imagine the jokes people at Nvidia come up with about how insanely easy it is to troll your sorry asses.
I have a 3080 Ti with a Ryzen 7 5800X; no issues running 60 fps at 4K, and 120 fps in most games.
If this were an AMD title messing up, even if it were the developer's fault rather than the drivers', people would be out with their pitchforks and fire.
I totally agree, but remember... only AMD has 'problems'; on Nvidia they're called 'bugs' xD
Late last week I was in need of a Windows 11 key and for the life of me I couldn’t recall who I’ve heard say time and time again “for a nice discount on a windows key….” And then it hit me. Thanks for the nice discount.
I'm on a 3080 Ti; the last GPU I had was a 980 Ti. Next time I upgrade my GPU, I'll probably wait until maybe the 7080? 😂
Problem is that a 3080 Ti runs roughly 1000-1500 dollars; 1500 for a Founders Edition. I can grab a Gigabyte 4070 Ti for $900.
Or get a used 3080 ti for less than $700. Save some money
Just went from a 1660ti to a 3080ti, my new pc will be arriving later this week. Hoping to see a big performance increase
Definitely! My 3080 Ti is such a huge leap over my 2070, so you will enjoy the new card. I play demanding VR racing simulators on one of the highest-resolution headsets out there and it's so smooth.
I went from an Alienware X51 R2 with a 1660 Ti to an Origin Chronos V2 with a 3080 Ti and love it! I won't upgrade until at least late 2025, when the 5090 or its Ti version comes out.
I remember going from a 1660 Ti to a 3080 Ti; super performance jump.
5:00 is probably a VRAM limitation. Hogwarts is very VRAM- and RAM-hungry: at 1440p ultra settings it takes around 9 GB of VRAM and around 20 GB of system RAM.
I'm glad the 3080 is getting a lot of love. I'm torn between the 3080 and the less expensive RX 6900.
I have a TUF 3080 12GB OC and it is great. I get 100+ FPS playing Sniper Contracts 2 at max settings at 3840x1600, and both the Ryzen 9 5900X and the 3080 stay below 65 °C.
The 7900 XT is becoming very reasonable at $799, and they're throwing in a copy of The Last of Us as well. I won't be buying this gen, but I'd definitely consider it at that price point.
@@x8jason8x Yeah Jason, below $800 here in St. Louis. Looks like I can still get the game on Amazon with the $649 6950 too. Also, Jason, I'm only running an 11700KF. It's a 3840x1080 44-inch ultrawide, and it can kill a 3070 Ti or an RX 6600 in Cyberpunk and MS Flight Sim. Not an fps over 30.
I'm totally satisfied with my RTX 3080 10GB, and I play most games at 4K ultra.
I'd personally go with the 6900 because of the VRAM. It's faster anyway, too.
found one for 500, 1 year old. having a blast
I have an RTX 3090 Ti and am in no way upgrading any time soon. It's consistent, and I doubt there will be a big enough jump in gaming to make it irrelevant or truly challenge its quality gameplay.
Just got an EVGA 3080 Ti FTW3 Ultra for about 480 USD, with receipt and a 2-year replacement warranty. Did I do good, y'all?
Genuine question: if I don't get the EVGA version, then what's the 2nd or 3rd best option to go for?
I would like to know: have you tried using a 12VHPWR connector on the 3080 Ti FE with an ATX 3.0 PSU?
Got a used 3080ti for $620 and I'm riding this until 5000 generation at least. As shown here, there's not much difference until you get to the 4080, which simply isn't worth buying given the relative performance between the 4080 and the 3080ti. Further, I play at 1440p only, as many of us do, and most of my games are 2+ years old. Combined with the fact that I don't use ray tracing much, there's plenty of performance for my tastes.
Love how you call my (probable) upgrade GPU old... My guy, my 980 (not the Ti, just the basic 980) has been kicking hard with my 1440p triple-monitor gaming lol.
Second month of enjoying my 4070 Ti: getting 170 fps in Atomic Heart, maxing out my 240 Hz monitor on most of the older stuff.
And it's going to get even closer in FPS with AMD FSR 3.0. FSR 3.0 is the reason I didn't upgrade to RTX 4000; I'm gonna wait for either RTX 5000 or AMD RX 7000.
I'm happy with my RTX 3090 at 4K60, but should I upgrade or wait for the 5000 series?
If you've got a 3090, you don't need an upgrade. Don't waste your money.
@@Valueshooter ok thanks for the advice
@@pipikakachu By the way, if you've got money to burn, the only card worth upgrading to would be the 4090.
@@Valueshooter Nah, I don't have money to burn. I would sell the 3090 and add a bit of money to get a 4090, but I don't think it's worth it.
@@pipikakachu Hold off for the 5090, get that two-generation jump. Hopefully it's not $2,000 MSRP though.
Is the 4070 12GB at $600 a better or worse buy vs a used 3080 Ti 12GB at around the same price?
DLSS 3 for 4070 Ti / FSR 3.0 for 3080 Ti - The playing field is even now. Let's see who's the boss.
3090 Ti user here. I'm glad I own it. I will not upgrade to the 40 series.
What software do you use to show CPU, RAM, GPU, MEM, etc.? Thanks!!!
MSI Afterburner; just download it directly from MSI and nowhere else.
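If you'd rather log those numbers than overlay them, a minimal Python sketch can poll the same counters Afterburner reads, via NVML (assumes an Nvidia GPU and the pynvml package; purely illustrative):

```python
# Minimal sketch: poll GPU load, VRAM use, and temperature once a
# second via NVML (assumes pynvml and an Nvidia GPU); Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percent
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # bytes
        temp = pynvml.nvmlDeviceGetTemperature(
            handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"GPU {util.gpu:3d}% | VRAM {mem.used / 2**30:4.1f} GiB "
              f"| {temp} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Note the same caveat that applies to overlays: this reports allocated VRAM, not what the game strictly needs, so treat the numbers as an upper bound.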
Went from a 1080 Ti bought back in 2017 to a 3080 Ti bought used for 440 €.
Very good deal, I think.
Just picked up a 3080ti for $350 today.
Yeah, I think I got a good deal.
I have the 3080 TK with 16 GB of VRAM; how much better will they be?
What hardware monitor are you using?
Be honest and take the 3080 to the Hogsmeade map; he only shows the 4070 Ti in Hogsmeade, not the 3080. There is no bus problem with the 4070 Ti; it was a bug and possibly a shortage of RAM, and even that isn't certain. And I was testing the new fixes at 2K, because I'm very happy with this: I'm measuring an average of 150 fps with DLSS Balanced and ultra ray tracing. Nvidia should sue you for not being fair and demonstrating the cards at different locations. Shame on you.
I said this before... his test isn't a real-world scenario. If you find your 4070 Ti doesn't run Hogwarts well at 4K, why the hell wouldn't you use DLSS to render at 2K and upscale to 4K? Problem solved... I get his point, but if you actually own the card, you're going to make it work for you and be happy with what you have.
It's still way too expensive though... imho.
Hey, BMWS are cool though. Best cars ever!
I was showing a specific issue with the 4070 Ti there; the 3080 Ti did not demonstrate those same problems in the same way, only the 4070 Ti. The 3080 Ti didn't have high FPS there, but it wasn't stuttering that badly either.
Yes, it's way too expensive, that's for sure. Nvidia needs some spanking.
How does resizable BAR factor into all this? Thanks.
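One way to check whether Resizable BAR is actually active on your card is to compare the BAR1 aperture to the VRAM size: with ReBAR on, the aperture spans roughly the whole VRAM pool, while without it you typically see the legacy 256 MB window. A sketch, again assuming the pynvml package:

```python
# Sketch: compare the BAR1 aperture to total VRAM. With Resizable BAR
# active, BAR1 covers roughly the whole VRAM pool; without it, the
# aperture is typically the legacy 256 MB window. Assumes pynvml.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"BAR1 total: {bar1.bar1Total / 2**20:.0f} MiB")
print(f"VRAM total: {mem.total / 2**20:.0f} MiB")
pynvml.nvmlShutdown()
```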
Love my 3080 Ti FE. Thing is a beast.
Got mine for 450 USD, new in box, in March 2024. Best value/performance GPU.
DLSS 3 seems to increase VRAM usage, so if the game was already maxing out VRAM, turning on DLSS 3 can really destroy your game. This may be the reason for what you experienced.
Turning DLSS upscaling up a little (0.25-0.30 seems like a sweet spot to me) will fix the issue.
@@tunatuna4710 You mean upscaling or sharpness? Cuz turning up upscaling is like going from Quality to Balanced.
@@kelvino3990 I mean sharpness.
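For context on what those modes actually do to render resolution: the commonly published per-axis DLSS scale factors are roughly Quality 0.667, Balanced 0.58, Performance 0.50, and Ultra Performance 0.333. A quick sketch of the arithmetic (mode names and ratios per Nvidia's published defaults; the helper function is just for illustration):

```python
# Minimal sketch of DLSS 2/3 per-axis render-scale ratios (Nvidia's
# commonly published defaults); computes the internal resolution each
# mode renders before upscaling to the output resolution.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Balanced renders internally near 1440p, then upscales.
print(internal_resolution(3840, 2160, "Balanced"))  # (2227, 1253)
print(internal_resolution(3840, 2160, "Quality"))   # (2561, 1441)
```

This is why "render at 2K and upscale to 4K" describes DLSS Balanced/Quality at a 4K output fairly well, and why the internal resolution (not the output) largely sets the VRAM and shading load.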
Just got an i7 13700K for free, so I decided to upgrade my GPU from a 3060 Ti to an ASUS 3080 Ti OC, and got that for 550, which is a steal.
Thank you. I decided to get an RX 7900 thanks to you. I was going to get a 4070, but the stuttering put me off. You can get a 7900 cheaper than the 4070 now.
Me still playing 1440p ultra on my 1080Ti getting 60+ FPS on Sons of The Forest.
Thanks, you helped me make up my mind; got a used (not abused) 3080 Ti for 470 bucks.
My setup, Nov 2020:
i9 10900K @ 5.2 GHz
32 GB DDR4
RTX 3090
2x 1 TB 970s
1000 W PSU
It's in its 3rd year of use, and across 120+ games (2010-2023) I'm just sitting back enjoying them, so far at 1440p 165 Hz at max settings, most in the 130-165 fps bracket (my monitor caps at 165 fps / 165 Hz).
Just under 10 can't hit that, but if I lower them to all high instead of ultra, they do.
I'm halfway done with my games. By the time the RTX 6090 is out, I'll be done with all of them. Great card for years.
Btw, of the roughly 10 games that can't do the 130-165 bracket, not all are new; some are 2016-2020 games.
Love my Suprim X 3080 Ti, and I'm happy it will perform even better with FSR 3.
What CPU are you using?
Picked up an EVGA 3080 Ti XC3 Hybrid on the dip just before they broke off from Nvidia. Sure it’s no 4090 but it is sufficient for 1440p Gsync gaming.
I have it doing 4K120 ultra in quite a few titles. Granted, it's not doing that in modern titles or broken games, but I was easily getting max in native raster Witcher 3 before they did the RT update, for instance.
@@abd1692 I mean, I have yet to come across a game I couldn't max out with a 3080 Ti; Cyberpunk is really the only exception in my library. With a few tweaks, I was able to max native fps in Witcher 3 again, with RT off.
I realize that's a long way of saying I agree. lol
A seller had some stock to clear and sold me a 3080 Ti, sealed box, for $650. I instantly bought it. I think it's a big deal considering prices here are almost double due to duty and taxes, so a vanilla 4070 would cost a lot more than the deal I got. My outgoing card was a 1660 Ti, and I usually keep cards for a long time.
I have played Hogwarts Legacy on my 4070 Ti at 1440p ultra quality, RTX on, with DLSS and frame generation both turned on. My CPU is a 12700F, and I've encountered far fewer issues than you claim the 4070 Ti has. It's possible the card you have is defective, but that is not normal performance for that card.
You've made a bad purchase. Just get over it.
The problem comes as soon as your VRAM usage gets close to or over 12 GB, because the bus can't keep up anymore pushing all the information into VRAM. If you lower your settings you won't see the problem, but you paid a shit ton of money for what should then be a 3090. Just accept reality.
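The bus-width argument is easy to put numbers on. Using Nvidia's published specs (192-bit bus at 21 Gbps effective GDDR6X for the 4070 Ti, 384-bit at 19 Gbps for the 3080 Ti), a back-of-the-envelope sketch:

```python
# Back-of-the-envelope memory bandwidth from published specs:
# bandwidth (GB/s) = bus width in bytes x effective data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 21.0))  # 4070 Ti: 504.0 GB/s
print(bandwidth_gb_s(384, 19.0))  # 3080 Ti: 912.0 GB/s
```

So the older 3080 Ti has roughly 80% more raw memory bandwidth; the 4070 Ti's larger L2 cache hides some of that gap, but less so at 4K, where working sets spill out of cache more often.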
I think it's an AMD CPU issue. Same problems in Cyberpunk, Witcher 3, and Hogwarts.
@@jjlw2378 You're 100% wrong, but keep making nonsense up if you have to, I guess.
@@x8jason8x Look into it. You'll find many posts talking about it.
I’d rather have a 3080Ti if I do ultimately stick with Nvidia, as I don’t use ray tracing, especially in games that are fast paced as you won’t even notice the ray tracing when you’re more concerned about enemy players coming around a building or sniping from a rooftop. Probably the only real advantage to something like ray traced reflections in something like Call of Duty or Battlefield, would be the glint off of a sniper scope’s lens which would give away their position.
If, however I stick with my decision to switch to AMD for the price to performance of AMD’s RX 7900 XTX in rasterized performance, I’d get the Sapphire Tech Nitro+ RX 7900 XTX. I even got the prescription for my glasses for the VR headset I want, as you can’t wear glasses with it, but they do have prescription lens inserts. It’s the BigScreen Beyond.
As either the RTX 3080 Ti or the RX 7900 XTX would be a beast in its own right, and shouldn't have any issues running DCS and War Thunder in VR, let alone older VR horror games like Dreadhalls. Watching Markiplier and DashieGames play a VR horror game, even on a 65” LG C2 OLED, isn't the same as experiencing what they experienced. I'll bet it's absolutely intense.
And I’ve heard from VR users I’ve encountered in War Thunder Simulator Battles that until you are seeing the game from the actual pilot’s perspective and not from a monitor, you don’t realize the scale of everything until you’re in VR. As apparently the F-14 is an absolutely enormous aircraft.
Isn't that 4070 Ti faulty? It's only using 95 W instead of the 285 W TDP; better look into that. At one point it even went down to 75 W.
Yeah, I'm curious why that happened. I wonder if it's a bug, or DLSS 3 causing this in that area of the game.
In Cyberpunk it was working a lot better, at full power.
Hehe ☺ I enjoy your work , thanks.
The 3080 Ti is on average 550 USD where I live; should I get it? The 4070 Ti costs double, btw, about 1100 bucks.
Of course a 3080 is still a good GPU; even the 10GB version is a pretty damn good GPU today. My friend has one paired with a 5600X at 2560x1440, and he can run any game out there at medium/high settings with 6-8 GB of VRAM usage and get 60-70+ fps in all of them.
And the 3080 12GB is pretty much a perfect 1440p GPU that will be solid for the coming 2-3 years. If you're OK with playing at medium-high settings instead of all-ultra, then a 12 GB VRAM GPU will be enough for the coming 5 years lol.
I'm torn between trading my reference 6800 XT for a 3080 Ti or not. I can buy one for ~600 and think I can get the same amount for the 6800 XT. Any thoughts?
A 6800 XT overclocked gives the exact performance of a 3080 Ti without ray tracing; at worst you'd see a 5-10 fps difference. You also have 16 GB of VRAM, which you'd want for smoother gameplay.
Keep them 4070ti videos coming!! Nvidia's greed is beyond greedy
Asking if the 3080 Ti is still relevant? Hell, I'm still over here on a 1080 Ti squeezing 50-60 fps at 1440p in Hogwarts with FSR 2 Quality and high settings. The 3080 Ti should be able to double that.
Valid point on the intentionally gimped 192-bit bus, but a 70-class GPU is not meant for ultra settings, let alone at 4K, and I'm sure you will have no problems with optimized settings at 4K, or even ultra with DLSS Quality. Most people who buy a 4070 Ti will have a 1440p high-refresh-rate monitor and frankly won't give a damn about how it's broken at 4K max settings with ray tracing. Also, this is the worst-case scenario in some of the most demanding and badly optimized games.
I would agree, but $800+ for 1440p only?
@@MrChologno You can't apply reason to corporate greed. The 4070 Ti is a 1440p GPU. If you want 4K, it's the 4080 or 4090 you want.
@@ZackSNetwork 4090 sure. 4080 is trash at the current price point.
Yes, this. The 4070 Ti is NOT a 4K GPU. The only real 4K GPU in existence is the 4090, but you'd have to be an idiot to pay that much to play games... or have zero life. NOBODY plays in 4K on PC anyway. For 1440p, the 4070 Ti will last you MANY years.
@@MrChologno Well, it is what it is with today's market. The reality is, if you badly need a new GPU now, it's not a bad call to get a 4070 Ti, especially if you mostly play competitive e-sports titles and don't care if you need to use DLSS for heavier new triple-A single-player games.
If you're playing on a 34" UW 3440x1440 and need a GPU, the 3080 Ti/3090 is the sweet spot. I much prefer these UWs to 4K when gaming.
UW looks dumb to me; it only looks good in racing games. First-person games look terrible in UW.
@@Dempig You're talking crap.
I'm only playing at 1080p. Is it good to buy a used 3080 Ti for $450???