4:32 last game, TW Warhammer - the 4080 is better than the 4090, it has more fps, but someone put 67% for the 4080 and 99% for the 4090 lol
I bought the 4080 but came
from the 1070 😂. Paid a lot but won't be upgrading for a very long time
You should have bought a 4090 and been set for years.
@@zacthegamer6145 No lmao. His 4080 is plenty, no need to spend a few hundred more for a 4090. Of course the 4090 is top of the line, but in this gen of gaming the 4080, if paired with the right specs, will last him plenty. I got the 4080/Ryzen 9 7950X. I'm loving it, hope you are enjoying that 4080 man!!
@@nothim-7161 Not when the 5000 series comes out. The 4090 is the only card worth buying. Your 4080 is 30-40% slower than a 4090, with 8GB less VRAM.
@@zacthegamer6145 compared to my 3080 it’s much better. I mean how much more do you need? Lmao you’re acting like any game out rn even needs a 4090. You’re not even getting everything out of the 4090. Maybe a couple years down the road🤣
@@nothim-7161 A lot of games max out even a 4090. Just keep living under a rock and justifying your purchase. The 4080 is the worst price/performance card of the 4000 series while the 4090 is the best. The 4090 will be equivalent to a 5080 next gen, meanwhile the 4080 will be beaten by a 5060 Ti/5070!
The Last of Us at 4K native, ultra settings uses 17GB of VRAM. Let UE5 games come out and your 4080 will be a compromise!
I've upgraded to 4080 from 3070 (aka 2080ti with less vram), sold it for 1/3 the price of 4080 so yeah it was kinda less expensive but still not cheap. For me it was worth it since I'm definitely skipping 5000 series and won't be running out of vram for that time.
Games are already hitting 15GB of VRAM. I give the 4080 a year and it will be stutter city.
@@mikem-gb3pj Bullshit
You're probably confusing this with RAM. There are games that can take up 16GB if you have 32GB. But just because the memory is being occupied doesn't mean you absolutely need it. And it's the same with graphics cards. A 4090 will almost always use more VRAM just because it can, but that doesn't mean it's needed. There are also two different values: first the memory that is reserved, and then the memory that is actually used. And no game needs more than 16GB of VRAM, that's absolute bullshit.
With the 4080 you can definitely skip a generation. A very good card and definitely more useful than a 4070 TI because of the memory. 16GB is definitely enough, especially with Nvidia cards, which have significantly better memory management.
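The reserved-vs-used distinction above is real: games and runtimes grab VRAM in big blocks up front and keep freed space cached for reuse, so monitoring overlays report the reserved figure, which overstates what the game actually needs. A toy Python model of the difference (the 1GB block size and the MB figures are made up for illustration):

```python
class CachingAllocator:
    """Toy model: reserves VRAM in whole 1 GB blocks, reuses freed space."""
    BLOCK = 1024  # MB per reservation block

    def __init__(self):
        self.reserved = 0  # what a monitoring overlay would report
        self.used = 0      # what the app actually needs right now

    def alloc(self, mb):
        self.used += mb
        # grow the reservation in whole blocks only when truly out of space
        while self.reserved < self.used:
            self.reserved += self.BLOCK

    def free(self, mb):
        self.used -= mb  # reservation is kept for reuse, not returned

gpu = CachingAllocator()
gpu.alloc(3500)   # load a level's textures
gpu.free(1500)    # leave the area; blocks stay reserved
gpu.alloc(800)    # new area fits inside the cached space
print(gpu.used, gpu.reserved)  # 2800 used, but 4096 still shown as occupied
```

So an overlay showing "16GB in use" on a 4090 doesn't prove a 16GB card would run out; only actual allocation failures or stutter do.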
Also went for the 4080 after using the 3060 Ti. I think the 16GB VRAM will be enough until a new console generation hits the market. We'll see.
@@mikem-gb3pj Well, it depends on which resolution you're playing at. Hogwarts Legacy at 1440p sucks up approximately 7GB of VRAM. If you're playing 4K, then okay lol.
This video is a perfect example of why I'm skipping 40 series and just sticking it out with my $700 3080 12gb. The 4070ti isn't enough of an upgrade for me and the 4080 is too expensive. If I'm going to spend $1200, I'll just save it and wait for next gen.
Remember not everyone has a 3080 so for someone making a new build or upgrading a weaker build the 4070ti is a better choice if it’s around the same price
the 4000 series are for people who still have a 2000 series or 1000 series GPU.
Those who have a 3000 series GPU can skip until the 5000 series.
@@BeatmasterAC well said
@@BeatmasterAC lol i went from a 3070 to a 4070ti and the performance gain is massive
@@iDTecKt
You went from a 3070 to a 4070 Ti.
That means a 45% average performance gain for roughly an extra $400 (depending on the custom model).
That's kinda meh. Not worth it in my opinion.
In my situation: I have a Palit RTX 2080 Super GameRock Premium (€769 in 2019). If I upgraded to an RTX 4080, the performance gain would be 153% on average (for 1440p gaming), for €500-600 more (depending on the custom model). Much more worth it.
If I jumped to an RTX 4090, the average performance gain would be 211% for €950 extra.
Still worth it.
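The upgrade math in this thread can be sketched in a few lines (the FPS ratios and prices are the commenter's claims, not verified benchmarks):

```python
def upgrade_value(old_fps, new_fps, extra_cost):
    """Return (percent performance gain, extra cost per percent gained)."""
    gain_pct = (new_fps / old_fps - 1) * 100
    return gain_pct, extra_cost / gain_pct

# The commenter's three scenarios, normalized to old card = 100 fps:
print(upgrade_value(100, 145, 400))  # 3070 -> 4070 Ti: ~€8.9 per percent
print(upgrade_value(100, 253, 550))  # 2080 Super -> 4080: ~€3.6 per percent
print(upgrade_value(100, 311, 950))  # 2080 Super -> 4090: ~€4.5 per percent
```

By this measure the 2080 Super to 4080 jump is the cheapest per percent of performance, which matches the commenter's conclusion; the weakness of the metric is that it ignores resale value of the old card.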
Damn, I think it's time I upgrade. I'll probably wait for the 5000 series as I only have 1080p monitors. Once I do upgrade... I'll need to buy a new monitor as well.
I game at 1080p, settings not on highest, so 400 egg-cooking watts or even 300 watts are a waste of money and melting ice caps!
4070 the perfect sub-200W card.
True, but not at the price Nvidia wants. It should be $100 cheaper, then it's a great card.
lol that's why you undervolt the cards, my 4070 Ti lost about 5 fps for 90 watts of power saved
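That undervolting trade-off is easy to put in numbers as frames per watt (the figures below are hypothetical, loosely based on the comment's claim of ~5 fps lost for ~90 W saved on a 4070 Ti):

```python
def efficiency(fps, watts):
    """Frames rendered per watt drawn."""
    return fps / watts

# Hypothetical stock vs undervolted 4070 Ti figures:
stock = efficiency(120, 285)        # stock power limit
undervolted = efficiency(115, 195)  # ~5 fps lost, ~90 W saved
print(f"stock: {stock:.2f} fps/W, undervolted: {undervolted:.2f} fps/W")
```

A ~4% performance loss for a ~32% power cut is why undervolting is popular on this generation of cards.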
I'm rocking an Asus OC 4080 and on max settings for most games I pull about 200-250 watts, but I also have a 1000W PSU so I'll handle it either way. If you're upgrading your card it's a no-brainer to upgrade the other stuff as well lol.
This is not 1440p but 4K. At 1440p in Cyberpunk, without path tracing but fully maxed with DLSS Quality, I get 90-100 fps with my 4070 Ti. For the results of this video, either you are in a severe bottleneck or the real resolution is 4K, not 1440p.
I get 70 min, 85 avg and 105 max with path tracing
@@iDTecKt Me too! I don't understand if the video is deliberately misleading to make these cards appear less powerful (why, I don't know) or if he mislabeled the title. Those results are not 1440p with the 4070 Ti; it is much more powerful than what you see in the video.
My 4070 Ti gets better performance than what's shown in this video, so I was thinking the same thing.
@@EddieMcclanahan The results of this benchmark are simply false; the author of the video has peddled benchmarks done at 4K resolution, pretending they are 1440p. I don't know why he faked this, but it doesn't give real information about these video cards at 1440p.
@@agentpurple5602 I'm pretty sure those are all pretend benchmarks, and the dude who made this video didn't actually benchmark anything, he just stitched together some fake number counters. Actual benchmarking takes a lot of time, and no one who puts in that much work skips spending at least 5% of the benchmarking time on presenting the data. But these fake benchmark videos obviously only took like 5 minutes to produce, and they net a lot of views. It's kinda sad.
The problem is the processor, not the GPU; with the current generation of processors the experience and the performance are better.
what
Messed up the last game, it seems to be showing the RTX 4090's framerate for the 4080.
I don't understand.. I'm only getting like 84 fps in Far Cry 6 at 1080p Ultra settings, with my 4070 Ti paired with an i5 13600K and 32GB of DDR5 6400. Any ideas?
maybe RT is on? turn it off to get more frames.
also turn dlss on
ubisoft game fps optimize usually worse due to my experience on AC
lol using i5
I fixed my issue @@Greepled. I ended up getting around 170+fps. Nothing wrong with the i5 13600k, busta.
i5 still shit@@Lethaldot
What if SLI would have existed, buying two Rtx4080 around 2x600 euros (1200) total, is the same price as one 4080...
Why is the 4080 FPS increase in% not correct?🤔
Math is OFF on 4080. 🤔
I am surprised that the majority didn‘t realize that 😂
Which is why the 4090 is the better buy
@@TheRealMrCN do you even math? The 4080 increase number should be higher
@@TheRealMrCN If you have double the money easily available to spend.
Yeah, when did the 4080 get that good
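The complaint in this thread is simple percentage arithmetic: an uplift label should be (new / baseline − 1) × 100, so if the 4080's raw fps is closer to the 4090's, its percentage must be too. A sketch with hypothetical FPS values (the video's real figures aren't quoted in the comments):

```python
def uplift(baseline_fps, card_fps):
    """Percentage increase of card_fps over baseline_fps."""
    return (card_fps / baseline_fps - 1) * 100

# Hypothetical: a 4070 at 100 fps, a 4080 at 141 fps, a 4090 at 199 fps.
# The on-screen labels should then read +41% and +99%:
print(f"4080: +{uplift(100, 141):.0f}%")  # 4080: +41%
print(f"4090: +{uplift(100, 199):.0f}%")  # 4090: +99%
```

Any label where the faster card's fps and percentage disagree in ordering (like the Warhammer result mentioned above) fails this check immediately.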
4090 cpu Bottleneck
My favourite game is Total War: Warhammer 3 and I'm using an Asus Nvidia 1080.
Is the Nvidia 4070 or 4070 Ti a good investment?
Can't say for Warhammer, but I came from a 1080 Ti to a 4070 Ti at 1440p, and I think it was worth it. At 1440p I was dropping settings hard to stay at native resolution, and I was looking to upgrade to a 3080 Ti or 3090 but they were more expensive.
@@InTeRsHD Thanks for the reply.
I found out that the 4070 Ti at times performs better than a 3090.
Which brand of 4070 do you use and which PSU? Do you have any coil whine with the 4070?
@@jamrocks464 I'm using a 4070 Ti TUF Gaming OC Edition and a Corsair RMx Shift series 850W. I have less coil whine than my 1080 Ti ROG Strix and my old Corsair HX1000i.
@@InTeRsHD Thank you very much for the reply! I might consider the 4070Ti from Gainward or Gigabyte! :)
It would be a solid jump in performance over your 1080 for sure! Especially if you are playing at 1440p
the 4080 was a worthy investment. Plus i only paid $929 on Amazon so yup.🎉🎉🎉🎉
How did you buy a 4080 for $929? Which card model?
Where Imma buy it now
@@Frenchfrys17 FOREAL NO WAYYYYY
Here is a Gigabyte Gaming OC 4070 Ti for 1080p at 144Hz, G-Sync, all on ultra, ray tracing on, rocking very high fps. Perfect future-proof 1080p card, good 1440p card and barely a 4K card.
Someone got 120 with DLSS and frame gen on a 4070 at 1440p; I think the Ti and above are capable of 4K.
It's strange, my 4070 Ti Zotac AMP Extreme paired with a 12600KF is running Cyberpunk at 90-100 in Full HD, but with everything on ultra and DLSS on..
Fake
@@brobiv2452 lol hes at 720p upscaled to 1080p 😆
On 1080p your card will be insanely bottlenecked, i bet the 4070 ti usage is around 70%. Try at least 1440p
@@VoidElSidBS bottleneck? you mean underutilized?
Are you using DLAA or DLSS? For me DLSS Quality is 120-130 fps, DLAA 90-100 fps, frame gen all off.
The 4090 is 3x more expensive than a 4070 but hardly ever double the performance :/ As a 4090 owner, I feel like I made a gigantic mistake.
This is a test at 1440p.
This card is HEAVILY CPU bottlenecked at anything lower than 4K. Not to mention, it's a HALO product. They are never a value purchase.
In more demanding (future) scenarios the 4090 will completely crush the 4070, it will be the difference between butter smooth and unplayable.
just pray your connector does not melt.. 4090 is the best if you can afford it.
true its the sale of the year
What crappy times we live in when a €1k GPU (4070 Ti) struggles to reach 60 fps in 1440p gaming..
The 4090 is twice the 4070 😎 clearly!!!
Greetings, but with which processor?
And again nobody noticed. Look at the numbers at the last test.
RT is so not needed by me. Cook an egg with all that heat.
In the first benchmark, you got the 4070 to 4080 comparison wrong; the difference is 41%, not 26%.
The first two he fucked up on
Wrong percentage for 4080
I guess you should change your cpu
Even with a 4080 your fps is so low
Let's be honest.
Are the 4070 Ti and 4080 really bad for the price? Yeah, probably not cheap, but hey, the 3080 was only at MSRP for 2 months. And it wasn't really a good graph.
I really don't recommend the 4090 if it's not for work. But if you already have the 3090, then I really don't recommend the jump.
But if you don't have the 3090, the 4090 is totally worth it.
Many on the street always mentioned the 7900xtx. But I'll be honest, a friend gave me one (not the founders) and I didn't really use it.
The drivers are only a small part of the problem. The real problem is: I don't see uses for AMD GPUs outside of gaming, and maybe junior data science done with low or only basic knowledge of drivers (yes, that excuse about native drivers, used by the useless.
A fairly common line of thought among junior data scientists and sometimes mid-level ones. With high dismissal rates).
At least 1 in 5 times they make you lose the data load. If they pay you first rather than someone else, because you promised them the job 2 hours early, and it fails 1 in 9 times (and it's your fault), that's something (it gives you room to save face), but with a client "it will always be your fault" and you can't afford that luxury.
And as for using the GPU just for gaming, well, I'll be honest, only 21% use it that way (the same people with the constant complaints about the economy).
GPUs are a luxury product aimed at the middle class, not the lower class. And much of its cost factors in the average salary of the middle class.
Where are you pulling these bs numbers from 😂
Not a good benchmark: in The Witcher 3 you set RT off, so the 5900X becomes an insane bottleneck. You buy a 40 series for RT on, so the 5900X is too slow I guess.
4090 just a monster
I must ask, what the hell type of game is going to use anywhere near 24GB?
Having more vram than you need wouldn't hurt
4k
vr
It ain't about how much you need; you don't see swap files in use in memory, but they are still there, using the extra space until they are recalled, in a similar fashion to RAM.
Do you see the impact? Probably not, but when pushing the limits, having more RAM and VRAM gives you stability: your game won't crash if it goes over the limit even for a second, and you don't have to wait for a patch to fix it. It also improves loading times for pre-rendered textures or models you've already loaded, like going back to town in MMORPGs. Also, when adding mods to a game you see the benefit of having extra RAM and VRAM, like Skyrim in 4K with 300 mods, or Cities: Skylines, and probably more to come in the future. And that's just games, no need to mention professional workstations and video production.
I'm glad for the 4090 FE I bought last week at my local Best Buy. I don't have to worry about running out of VRAM in any games, especially the RE4 remake, which does eat up 17GB of VRAM!
Stopping to watch the full video, you got a lot of the percentages wrong, both + and -.
Wrong results
4080 and 4090: I tried both in-game and you can't see a difference, your eye can't tell.
Those technical tests are only for showing performance numbers; you only see a big upgrade if you come from, say, a 3060 to a 4080.
idk i saw quite the difference as far as max settings, 4k gaming, etc with the 4080 and im coming from a 3080. Its def worth upgrading from the 30 series to the 40. If youre coming from a cheaper/older card and dont wanna spend a shit ton just go for a 3070/3080/4070. but theres def a difference lol.
@@nothim-7161 Appreciate the honest comment. Getting my 4080 today really looking forward to test it out!
There is a difference if you play Flight Simulator or do rendering (Blender), but if you play Counter-Strike or other light games you won't see any difference, because the FPS is already so high that the limit is not the GPU.
The 4080 gets the same as what I get on my 3080 in CoD with it overclocked.
Cyberpunk is an amazing game. Thank you for including it in your video.
4080 on lock :(((((
Where I'm from, the 4080 is 2x more expensive than the 4070, but it's not even 100% more performance, so the 4080 is not worth it at all.
4090 king.
NVIDIA and the rest need to strive for best performance for 200 or less Watts. Saved power to charge phones/cars. Over 250W should be banned IMO, and I'm not exactly a greenie.
The market sets the game
Nobody is forcing anyone to get those cards above 200W.
They say you can't tell the difference
FAKE !!!
how?
5900x bottleneck
4070 so bad
The 4070 Ti is nice from what I've seen.
it just depends on what youre coming from. say someone coming from 20 series or low 30 series its a nice upgrade. I went to a 4080, but thats bc i came from 3080 so a 4070 or ti wouldve been pointless.
In €/fps the 4070 is actually the best choice. With a ~€600 starting price vs €1150+ for a 4080, there is no contest: 150% of the fps for 200% of the euros to pay 🤐
@@francesca251 I just bought a 4070, coming from an RX 5600 XT. The normal prices here in the EU are €700 for the 4070 and €850-900 for the Ti version. I was weighing a used 3080 against a new 4070 for €550, and after seeing the power draw comparison I just bought the 4070. But €200 more for the Ti version with just 12GB of VRAM is kinda expensive for that price.
First!!!!!
Somehow the RTX 4070 managed to perform better than my poor graphics card
Most likely a CPU bottleneck
Bro, can u share that FPS monitor scene?