I'm curious, how many of you actually care about Ray Tracing performance? Also, I'm going to be streaming BF V with Ultra RT on Twitch in half an hour if anyone is interested: www.twitch.tv/dawiddoestechstuff
I game at 1080p ultra settings on 3080fe and asus rog 27" curved 144hz monitor. Went from a 1080ti, and its much better. And it affords me time to wait for nice high res/frame rate monitor.
The one thing I love about having a 3080 12gb on 1080p is I just have story mode games maxed out with rtx and still get 80+fps and plus crazy high fps on multiplayer games it's the best of both worlds and 1080p still looks great too me
@@null1245 it's more the increased bandwidth and increase in cuda cores (even though small still more then 10gb) but it's mostly the memory bandwidth increase
I think it's funny that they make 360hz monitors but no company optimizes the games for them like what the fuck do they expect us to play csgo forever? Like come on 60 looks like a picture show now and it's not easy to go back
@@Kaiserin_Amy yeah but when you are used to high FPS you never really want to go back so seeing a game struggle at 60 doesn't even make me want to touch it I need atleast 120fps now
@@trevino4195 Well I also played at high fps, and while fps games need high fps (thats a weird sentence) I still am fine with lower fps in lower paced games and rather enjoy the graphics. But you can just turn down settings if you want higher fps.
We already have games that are demanding enough on 1080p that we see better results even from a 4090 over the 3090ti. It just older games you’ll have bottle necks on the cpus side. But Fortnite with the unreal 5.1 engine actually sees better results from using a 6900xt vs a 7900 xt and xtx at 1080p granted it’s in the 400+fps range we still see the previous gen playing at nearly 100 frames slower which means there is some sort of performance increase. Now if we keep seeing jumps in engine technology we could see the newer cards and cpus needing to be buffed to handle these engines even at 1080p. Imagine a game that actually utilizes the unreal 5 engine to its fullest. You’d need a newer high end card in order to play at 144 fps at 1080p. The problem is cpus won’t be able to handle it if we keep going with the small performance increases we have been seeing gen to gen from amd and intel a like as their cpus can’t handle a lot of 1080p games with these high end cards already.
3080 was for 1080p what the 4090 is today for 4K: a high FPS bliss that only gets dampened when some "Hogwarts Legacy" mess arrives (see 3:33 for Crysis). 1080p players who bought the 3080 at launch price were playing smart for the long term. Well done!
Finally, a benchmark on a resolution that is most commonly used. I hate how the bigger creators feel the need to push for 4k and 8k content. 1080p is still AMAZING.
I've really come to enjoy how your content is aimed at the more average consumer. It's nice to not just see the biggest and baddest new tech stuff! You really help add to the ecosystem of tech RUclipsrs
1080p is great for monitors 27” and below. So the card may be overkill for now but it will still be relevant several years down the road and really get your money’s worth.
hey, i have a doubt. im going to buy a 3080 pc. But i have a 1080p monitor. I dont understand what is overkill. Does it mean my monitor will be damaged using such advanced graphic card?
@@sanjaysatishkrishna9480 It depends on what you want. Your monitor will not be damaged, but it will not give you the benefits of using a stronger GPU. Point of GPUs like 3080 is they give you playable FPS on 4K resolution(which is 4x bigger than 1080p). If you use them on smaller resolution you're gonna get more FPS (if your monitor supports that of course). Think of it Weaker GPU - Less FPS Stronger GPU - More FPS Smaller Resolution - More FPS Bigger Resolution - Less FPS The bigger your resolution is, the better GPU you'll need to get playable FPS at that resolution. But your not gonna get more FPS out of your monitor if it doesn't support it. 60Hz monitor - 60FPS max 144Hz monitor - 144FPS max etc.
Me: *Is using a 3080 to play Terraria.* Uhhhh..... (I also do CAD, so that's why I have the card. But it's funny using an $800 GPU to play a 2D game that's locked to 60 FPS.)
PSA: Anyone with a 3000 series card should undervolt it!! You save up to 100 watts while retaining near exact performance, regardless of your resolution! Also worthwhile to do as these cards don't overclock well at all.
Interesting. Just got a 3080 and am intrigued. How would one do this, and what kind of savings are we talking about? I'm probably only a 15 hours a week gamer, if that.
@@davidober1199 That's awesome. There are many good tutorials on RUclips. "Optimum tech" does a great one, he often undervolts for small builds. I went with his recommended 850 Mhz to 850 mV for basically any 3080. It can be adjusted but unlike an overclock, almost all undervolts are totally stable. It only takes a moment to do in MSI afterburner. On my Evga 3080 I was able to save roughly 100 watts while only reducing my performance 1-2%. I did a few different benchmarks to compare.
@@davidober1199 Nice, mine's actually the same one. I had already planned on undervolting any 3080 I was able to get, but feel it's even more worth it with this card as it has some of the highest power draw of all 3080s. According to EVGA the card is for a 750w or above power supply, so you're golden.
@@Elseworldz game wise its decent, but oh my God is it horribly optimized. There is no reason why a 3080 should be getting less than 100fps on it in 1080p. It doesn't even look fantastic compared to some much better optimized titles
From Ryzen 7 1700x and gtx 1060 to core i9 10850k and Rtx 3080 how times change Awesome videos dude. I like how you have progressed, keep doing what you are doing
There is no “overkill”, cuz you could always find some sh!t unoptimized “game” that couldn’t do 100 fps in 1080p where all other games could in some midrange gpu Well, in normal life of course ppl get this-performance cards for 4K😃
@@03e-210a yes it can, but remember to get i9-12900KS, disable all e-cores and performace ones except one and then overclock it to max possible frequency to run this crap single-core-only “game”😉😂
It's not overkill for really high refresh rate 1080p gaming, I'd say at least 165hz, preferably even more (unlike 3090 which is just wasted on 1080p, not so much due to its performance increase over 3080, but because paying upwards of $1500 for a 24gb vram card for that resolution is really wasteful), however in quite a number of cases you will still be limited by the single thread performance of even the fastest cpus available today (which was shown in this video as well - everytime gpu usage was below 90%, even though cpu wasn't even hitting 50% - that just means one thread was pegged to the max and was limiting the rest of the system), so I'd say the true calling for the 3080 is 1440p 144-165hz.
@Gothera Gaming i run a 1080p laptop with an external 24" 1080p display and even that doesn't struggle with 1080p. Can't imagine how smooth things must be with a 10700k
Meanwhile, my brain just goes: 3080 @ 1080p just means it won't start to feel slow for as long, and also leaves the option for going up to 1440p or even 4k if I decide I want to down the line. Seriously, for some people, futureproofing things when they have the budget on hand is a LOT better of an idea, than taking the cheaper option now and then upgrading later.
Kinda disagree, futureproofing is kinda bad imo, you shouldnt buy something overkill for what you need now and expect it to last a long time. 2 years from now you can buy a gpu as strong as 3080 but for half the price and with more ram/features. Buy something bang for buck now probably the 3060 which is rumored to be as good as 2080 and 12gb, then buy 4060 which is probably gonna be a 3080 but with better rt, more ram, alot lower wattage, new warranty etc. You spent alot less money basically after you sell your 3060.
@@fran117 That's not the case tho. Again, I said SOME PEOPLE. In general, futureproofing doesn't work. However, when it comes to overkill tech for a 'low' resolution, it can in fact save money in the long run (although that wasn't what I said either), especially for those that AREN'T bothering with reselling cards. You are also operating off the assumption that we will see similar hardware jumps as the recent gen in the next few and that prices will remain stable. That's far from guaranteed (especially with, IIRC, at least one AIB saying NVIDIA's MSRP is not realistic). To say nothing of RT not being a factor for probably 90% of players in the first place. I specifically said 'better', not 'cheaper'. Some people are in financial situations that are unstable or otherwise not guaranteed to have the money to do an upgrade in the next few years. Going overkill NOW means you have a better safety net for the long run. Additionally don't have to worry about part compatibility for the theoretical upgrade in the future. Who knows, that 4k series GPU might require PCIE 4.0, which means you might be screwed if you are on an older Mobo, unless you want to spend MORE money to upgrade that while you are at it, which would probably also mean needing a new CPU. Ya gotta factor in the fact that upgrading in the future might mean needing additional parts on top, and factor that into the cost as well. Current trends with wattage don't really point toward a decrease in wattage from Nvidia, really.
@@jtnachos16 with amd getting rich and intel coming into gpu market, im expecting only good things coming out next gen. But i guess you are right that some people wont be able to upgrade every gen.
As someone who owns a 3080, and 1080p panel, I can say 3080 is not an overkill. In games like Cod, apex legends, bf, I can get 240 frames+ at 95-99% gpu utilization. You will need to pair your 3080 with a beefy cpu however.
Very , very good video !! I am also an " advocate " for 1080p resolution when gaming ( either on console or pc ) . Many games cannot max out at 1080p on a high end PC like this one . Though many do NOT know about the benefits of Super Sampling !! If you own an NVIDIA GPU and your game CAN be played maxed out at 1440p then it is advisable to use so called DSR ( VSR in the case for AMD gpu's ) . Basically you render the game INTERNALLY at 1440p ( 2K ) and then the image is Super Sampled on your 1080p Monitor or TV . By the way , when applying such a " trick " you should consider DISABLING any Anti Aliasing ( eg SMAA , SSAA , TXAA or TAA ) . Note that FXAA may not affect either performance nor distort image quality .You get a MUCH better quality image as Super Samplings has been proven to be better than Anti Aliasing ( do not take my word for it , search on RUclips ! ) . And what is even more important AND interesting Frame Time ( not to be confused with frame rate , FPS ), often becomes much smoother and thus giving a better experience. Look at how the 3080 RTX fares when playing at 1080p VS 1440p VS 2160p . Some older games ( gem's included eg GTA V , Rise of the Tomb Raider ) can play at a 4K ( native resolution , hence can be rendered internally at 4K ) . I think the reason for this is that 4K gaming is NOT cpu bound ( given that your cpu is a recent one and does not cause a so called " cpu bottleneck " ) Now, Coming back to consoles, let's talk about PS5 or XBOX Series X. Yes some games CAN be played at 4K ... But how many at 60 FPS ?? Moreover take Marvel Avengers released a few months ago. The PS5 can achieve a SOLID 60 FPS only at a 1080p resolution . And this game was released BEFORE the launch of the PS5 . So , games that will be released in just a year from now , how will they fare ?? Games become more and more demanding over time . The PS5 and it's counterpart will STRUGGLE to maintain a solid 60 FPS at 1080p much sooner than what most gamers think , especially with games making use of Ray Tracing technology !!
Hey man awesome stuff! I'm liking your vids! The stock shortage was sort of a blessing in disguise, I was prepared to spend $700+ on a 3080 but I went for a $300 rx 5600xt instead and have been really impressed at 1080p!
I love today that it's totally viable to game 1080p, while 1440p and 4k are also very reasonable. It just makes 1440p and 4k feel more premium and budget 1080p doesn't feel like you're really downgrading.
Loving all the videos! You have my thumb(s) up. Another great one 👍 Dawid! Thank you good sir. I'm happy you finally got your 3080. As they say. Better late than never and good things come to those that wait.
@@razortech9260 Some games seem to just give up and act as if there's a bottleneck on the lowest settings. My computer has a Ryzen 9 3900x and a GTX 1070, and I like to play games at pretty low settings. I know that a 3900x is overkill for the GPU that I have but certain games just do not push it up to full usage. I know that none of the cores even max out in games so it's not a bottleneck. Maybe there's a way to fix that but yeah it's not gonna be a case of just getting 1000 frames in every game.
DiscreetLuka Depend on games/maps, I have 3080, i7-10700k and 280hz. Even I use low settings i cant get stable fps or reach to 280fps. So for 360hz monitor, beside csgo or valorant, i dont think you can get minium fps of 250
@@qwe29362 That seriously depends on the games you play, and I’ve seen benchmarks with what I’ll be getting with the 3080 and I should be able to reach 360fps without a problem on low settings on most games. I’m surprised you keep a stable 280fps, is it because your GPU is bottlenecking your CPU?
You need to look at the 1% Lows and then factor in the Monitors Refresh Rate, for example when you said BF5 is getting 170FPS Average it doesn't mean now you should get a 240Hz Monitor.. No, now you get a 144hz Monitor and G-Sync or V-Sync Lock the Game to play at a Constant 144FPS to match your 144Hz Monitor.
Og man! Getting an 120HZ gaming projector for Christmas. 1080p 120'' screen, 120Hz all the way. I just do not care for 4K. Give me the 120s man!!!!! Excellent video mate!
I think it matters for the sake of future proofing your system. You can pretty much play anything at least 60fps 1080p with the 1080ti. The problem is that the 1080ti won’t be able to hold on to that in a couple years, the 3080 on the other hand will be able to handle 1080p gaming for a very long time, with RTX enabled as well.
Speaking of your accent, I didn't give it much thought and watched your channel for like a year before I realized you weren't from the EU but Canada. I'm even usually pretty good with accents lol.
Unironically agree. Minecraft RTX doesn't have nearly as many benchmarks as it should given how popular the game is and how demanding the RTX effects on it are.
There really needs to be a channel that just overmods the shit out of games and then benchmarks them with high-end CPUs and GPUs to see how well it performs.
the main reason i still choose 1080p with my 3080 is the frame rate dips. if i average around 180-200fps and it only dips to 150-170 then it isnt really noticeable. but in 1440p i only average about 150-160fps. so the frame rate dips become much more noticeable when they drop down to 110-115. personally i still never use ray tracing at all. id rather get 180+fps with max settings. im a frame rate above all else kind of person.
Hey Dawid you can unlock Escape from tarkovs fps to make it whatever your monitor supports (or your gpu can pump out) Vox_e has a short vid about how to do it if you want to for the next test
@Dawid - Hi, first time viewer here. Obviously being new, I don't know if this is like an inside joke but I really don't think you need to be so hard on yourself. I thought your presentation of this video was on par with some much bigger and older channels and your analysis and conclusion was spot on for a video with this title. Correct me if I'm wrong but what I saw in this video is someone who knows what he's talking about, on the whole looks very confident and comfortable in front of the camera, but in those few brief moments, lets his insecurities poke through in the form of (verbal and written) comments putting himself down when I saw absolutely nothing wrong.
Overkill? Maybe for now but wait and see how far games will push things in the next two or three years. It would be a pretty nice investment especially since there are many 1080p monitors that are high quality and have very high refresh rates.
Fun fact: if you have a 1660 Super and Call of Duty: Black Ops Cold War, you can actually toggle ray tracing on and off, and the VRAM slider actually increases when you enable ray tracing. Cool little bug IMO.
@@JudeTheRUclipsPoopersubscribe it's not about how he sounds like. it's about his name. Though how he says 'thirty' *does* sound similar the the Welsh 'dd'.
@@mattparker9726 the only dawid I know irl is from Poland lol, the Welsh drag out the word more like thiiirty, instead of fiddy like how dawid says it lol. I lived in Wales for 2 years so I've got a good idea how they speak.
The nice thing is you are not limited to 1080p if you have one of these beasts. BTW, I listen to your videos, half because of the way you pronounce 3080, etc! :)
I bought a 3080 to play at 1440p 60+ FPS in new games for the foreseeable future. But I also enjoy 144+ FPS in multiplayer games. Also helps that graphics don't matter as much in that case. And it's fun to mess around with lower resolutions, ray tracing, dlss, etc.
To uncap eft you have to turn vsync on in game, trust me it will make sense, then you will have to go to nvidia control panel > manage 3d settings > program settings > add escape from tarkov (by clicking add, browse, then go to desktop and click open file location on bsg launcher, then go to the battlestate games folder at the top then eft live where the eft exe file is and add it) > scoll to v sync and turn it off. You can no go back to the game, turn on v sync if you haven't already, and test to see what fps you get, it should be unlimited now. For some reason this bypasses the 120fps set limit
i know that video is kinda old by now, but the 3080 does make sense at 1080p for niche sections. i do simracing, my screen for the sim setup is a 1080p screen.... 49in super ultra wide. racing sims also dont feel great under 120fps so for a good experience with an ultra wide, or triple screens. you need quite a bit of GPU beef to drive those frames.
i think one thing I noticed no one talks about is the performance hit from using nvidia broadcast on both cards, and the fact that you can’t get that from an and card
Unless you really really reallyyyy care about high fps in story based AAA games with everything maxed then fine 3080 for 1080p in that regard makes sense,kinda? but man if someone plans on getting 3080 do yourself a favor and at least upgrade to 1440p. 4k is where these cards shine and imo it should be like this 3070-1080p, 3080-1440p or 4k, 3090 - 4k.
"Imma video this if it kills me" I'm afraid how often this phrase applies to Dawid, Patron Saint of Hardware E-Waste, may our GTXs be blessed by his OEM-ness.
On a competitive gaming point of view, an extra graphic detail is good but a disadvantage. You should be able to see fast with less graphics. Hence FPS matters the most. This is where rtx 3080 at 1080p really shines in a world where AMD does not exist.
I would think it's not overkill. Since a lot of games will actually make use of the full power of the GPU even at 1080p. It also makes for a lot more versatile of a build having a powerful GPU. When it comes to which games you can play, what quality settings, and being able to reach high refresh rate gaming in 1080p. Edit: Allt his being said, it's better to use AMD 6800XT or 6900XT (most likely) for 1080p gaming. In most titles.
Resolution vs fps is very personal. I prefer smoothness from high fps with everything on, than high resolution. I use 24 inch monitor, so 1080p vs 2k isnt a game changing to me (from de distance i play). But i understand some people have a better perception with resolution, and dont see much diference above 120hz.
Huh, this video strangely applies to me. A couple years ago I got this monitor because I was planning on building a budget PC meant for 1080p gaming but years later I never got to do it and I actually got a job when I turned 18 and am actually building a 3080 t.i. build and was planning on getting a 1440p monitor replace the same exact monitor you have in this video, lol
Key words are: "GOING into the future IS GOING to be more relevant" The 3k series is the new baseline and is very rapidly going to be eclipsed as developers and cardmakers come to grips with ray tracing, much like, say, early straight-wing jet fighters from WW2 were rapidly eclipsed by fighters designed from the ground up to be jets. 3k series hype beasts are going to be bag holders within a year or less
Would love to see a similar video, but looking @ Hitting 240hz constant and how settings are lowered to achieve this and what impact it makes visually and to the smoothness , every review on u tube seems to be max setting only...
Well that's because all maxed out settings is actually running the game at 4k and then downscaling it. If he wanted genuinely 1080p performance, he should have left all forms of anti-aliasing off so the card would only render the native res.
@@DawidDoesTechStuff Because every one is doing it. I like the humour and wit mate. Please just review old crap and make me laugh. You are the only guy on here with a clue about originality. I once subbed to all of the "big boys" only to find that they all copy each other and do the same crap week in week out.
I'm curious, how many of you actually care about Ray Tracing performance?
Also, I'm going to be streaming BF V with Ultra RT on Twitch in half an hour if anyone is interested: www.twitch.tv/dawiddoestechstuff
this is a video about a graphics card. I think.
me
Dawid
monkey power
I game at 1080p ultra settings on 3080fe and asus rog 27" curved 144hz monitor. Went from a 1080ti, and its much better. And it affords me time to wait for nice high res/frame rate monitor.
1080p is still great,not everyone cares about high resolution,some just want high fps
At which point high end GPUs aren't the play. Really fast CPUs, tho...
And if you want high FPS RT gaming, then eat your heart out. 😅
There are monitors now overclockable to 400 Hz lol
Maximum FPS 60. More than that its just useless, no matter what.
@@professionalhater3348 ?????
Ah yes,
Seeing the rtx 3080 performing on a resolution similar to the number of cards produced
I feel like that's a pretty generous number. 😅
100 cards worldwide, yikes.
@@DawidDoesTechStuff im sure 24p would be more realistic
1920 x 1080 = 2.073.600
idk, 2 million stocks doesn't sound so little to me 😆
@@BortaMaga 1080p, so the joke is that only 1080 were produced
Overkill:
YES
Great value:
YES
Available:
...
Haha!! Yeah, all these launches almost feel like a figment of my imagination at this point.
Overkill? Overkill will be when games at max settings and ray tracing are at 1080p 240 hertz
@@vntamed700 1440p*
@@adriano760 yea but we are talking about if the RTX 3080 is overkill for 1080p
Hotel trivago
3:19Ah, my favourite game "Ass Valhalla".
69
@@paratoxik922 420
Its a running joke
LMFAO
and another favourite of mine: Crysis Recash grab
The one thing I love about having a 3080 12gb on 1080p is I just have story mode games maxed out with rtx and still get 80+fps and plus crazy high fps on multiplayer games it's the best of both worlds and 1080p still looks great too me
Do you recon the extra 2gb makes a difference
@@null1245 it's more the increased bandwidth and increase in cuda cores (even though small still more then 10gb) but it's mostly the memory bandwidth increase
@@null1245 and 2 gh is great for slight future proofing
I mean you can get 130+ frames on sotr on 1440p with a undervolt and max settings. 3080 could prob push 200 on 1080p lmaoooo
Using fps unlocker I'm getting 90fps in open world and 120fps indoors and smaller areas in Elden Ring at max setting in 4k with my 3080 12gb.
Crysis Remastered be like:
*_Bruh, nobody used high refresh back in 2007, so ya don't need it in 2020 either_*
I think it's funny that they make 360hz monitors but no company optimizes the games for them like what the fuck do they expect us to play csgo forever? Like come on 60 looks like a picture show now and it's not easy to go back
because 360hz is for esport, you dont really need 360fps in a game like cyberpunk. Not that it doesn't need optimization.
@@Kaiserin_Amy yeah but when you are used to high FPS you never really want to go back so seeing a game struggle at 60 doesn't even make me want to touch it I need atleast 120fps now
@@trevino4195 Well I also played at high fps, and while fps games need high fps (thats a weird sentence) I still am fine with lower fps in lower paced games and rather enjoy the graphics. But you can just turn down settings if you want higher fps.
@@Kaiserin_Amy yeah I can't turn down Graphics either the only thing that's going to fix my problem is a RTX 3080 at 1080p lol
I don’t hate the orange lips lol. It matches the monitor!
Das sehe ich genauso.
Hi Anna. Hope you are having a great day.
@@bladeparsons7296 She deserves a salute after tazing poor Dawid, she might be dangerous, gotta stay on her good side.
@@bladeparsons7296 Hehe, I wish, things were so simple back then.
@@GoolagCEO ja ich sehe das auch genau so XD :)
Get one of these for 1080p and you'll be running on ultra for the next 6 years. Or 3 depending on how well pc ports are optimized.
We already have games that are demanding enough on 1080p that we see better results even from a 4090 over the 3090ti. It just older games you’ll have bottle necks on the cpus side. But Fortnite with the unreal 5.1 engine actually sees better results from using a 6900xt vs a 7900 xt and xtx at 1080p granted it’s in the 400+fps range we still see the previous gen playing at nearly 100 frames slower which means there is some sort of performance increase. Now if we keep seeing jumps in engine technology we could see the newer cards and cpus needing to be buffed to handle these engines even at 1080p. Imagine a game that actually utilizes the unreal 5 engine to its fullest. You’d need a newer high end card in order to play at 144 fps at 1080p. The problem is cpus won’t be able to handle it if we keep going with the small performance increases we have been seeing gen to gen from amd and intel a like as their cpus can’t handle a lot of 1080p games with these high end cards already.
3080 was for 1080p what the 4090 is today for 4K: a high FPS bliss that only gets dampened when some "Hogwarts Legacy" mess arrives (see 3:33 for Crysis).
1080p players who bought the 3080 at launch price were playing smart for the long term. Well done!
Finally, a benchmark on a resolution that is most commonly used. I hate how the bigger creators feel the need to push for 4k and 8k content. 1080p is still AMAZING.
I've really come to enjoy how your content is aimed at the more average consumer. It's nice to not just see the biggest and baddest new tech stuff! You really help add to the ecosystem of tech RUclipsrs
Your humor is so smooth that I can't even tell if it's intentional or not.
1080p is great for monitors 27” and below. So the card may be overkill for now but it will still be relevant several years down the road and really get your money’s worth.
27 is too big for 1080p. 24" is where i draw the line.
hey, i have a doubt. im going to buy a 3080 pc. But i have a 1080p monitor. I dont understand what is overkill. Does it mean my monitor will be damaged using such advanced graphic card?
@@sanjaysatishkrishna9480 It depends on what you want. Your monitor will not be damaged, but it will not give you the benefits of using a stronger GPU.
Point of GPUs like 3080 is they give you playable FPS on 4K resolution(which is 4x bigger than 1080p).
If you use them on smaller resolution you're gonna get more FPS (if your monitor supports that of course).
Think of it
Weaker GPU - Less FPS
Stronger GPU - More FPS
Smaller Resolution - More FPS
Bigger Resolution - Less FPS
The bigger your resolution is, the better GPU you'll need to get playable FPS at that resolution.
But your not gonna get more FPS out of your monitor if it doesn't support it.
60Hz monitor - 60FPS max
144Hz monitor - 144FPS max etc.
@@sanjaysatishkrishna9480 overkill means it works extremely good, almost too goos
@@sanjaysatishkrishna9480 it means your card will run at full load, 1080p easily
Why isn’t this channel more popular? Seriously your videos are the best!
Saturated market
Me: *Is using a 3080 to play Terraria.*
Uhhhh.....
(I also do CAD, so that's why I have the card. But it's funny using an $800 GPU to play a 2D game that's locked to 60 FPS.)
Haha!! Hey, I'm sure it's the best damn Terraria performance ever. 😂
*Insert 37 different mods you should use*
The real trick is to use the 3080 to play oldschool runescape lmao
Lol. Kinda like me with my 2070 Super doing PS2 and Wii emulation.
You can actually unlock the framerate by enabling frame skip or something like that. Only tested for 144Hz though
PSA: Anyone with a 3000 series card should undervolt it!! You save up to 100 watts while retaining near exact performance, regardless of your resolution! Also worthwhile to do as these cards don't overclock well at all.
Seems the 3000 series cards are the vegas of this generation... Except they're actually good at playing games.
Interesting. Just got a 3080 and am intrigued. How would one do this, and what kind of savings are we talking about? I'm probably only a 15 hours a week gamer, if that.
@@davidober1199 That's awesome. There are many good tutorials on RUclips. "Optimum tech" does a great one, he often undervolts for small builds. I went with his recommended 850 Mhz to 850 mV for basically any 3080. It can be adjusted but unlike an overclock, almost all undervolts are totally stable. It only takes a moment to do in MSI afterburner. On my Evga 3080 I was able to save roughly 100 watts while only reducing my performance 1-2%. I did a few different benchmarks to compare.
@@zsookah3 Very interesting. I have a 750w PSU. Do I need 850? This is my card: EVGA GeForce RTX 3080 10 GB FTW3 ULTRA GAMING Video Card
@@davidober1199 Nice, mine's actually the same one. I had already planned on undervolting any 3080 I was able to get, but feel it's even more worth it with this card as it has some of the highest power draw of all 3080s. According to EVGA the card is for a 750w or above power supply, so you're golden.
YES the 3080 is great and when the 4080 come out maybe ill be able to buy one
The 4080 or 4090 is projected to be 70% faster than the 3090
More like 40%
@@mustdie27 yeah probably tbh
@@buzzball4864 Also projected to be 70% more the price of the 3090 lmao
@@mustdie27hopefully the stock situation will get better by then
Ass Valhalla
😁
Sad. I love the game.
@@Elseworldz game wise its decent, but oh my God is it horribly optimized. There is no reason why a 3080 should be getting less than 100fps on it in 1080p. It doesn't even look fantastic compared to some much better optimized titles
@@itzpayday1238 its ubisoft
@@itzpayday1238 I thought i had broke drivers or something when i turned Valhalla on for the first time and saw i was getting around 85 fps.
TL;DR:
RTX off: 4k.
RTX on: 1080p.
cant wait to try out the 3080 on my 1080p monitor :))
The memory usage on BFV is equal to the number of rtx 3080 available on stores.
Me: has some intel integrated graphics, happy with 35 fps on minecraft
Dawid: is a 3080 overkill?
on a 4th gen i7 I was able to get 60 fps in fortnite how tf do you get 35 in minecraft
@@allahgatorsswamp269 optifine is key, if you’re not using it, why the hell are you still reading this comment?
From Ryzen 7 1700x and gtx 1060 to core i9 10850k and Rtx 3080
how times change
Awesome videos dude. I like how you have progressed, keep doing what you are doing
There is no “overkill”, cuz you could always find some sh!t unoptimized “game” that couldn’t do 100 fps in 1080p where all other games could in some midrange gpu
Well, in normal life of course ppl get this-performance cards for 4K😃
aka cyberpunk 2077 LOL
And gta 4.
Can it run crysis tho?
can it run saints row 2?
@@03e-210a yes it can, but remember to get i9-12900KS, disable all e-cores and performace ones except one and then overclock it to max possible frequency to run this crap single-core-only “game”😉😂
this is the res I'm gonna play with my 3080
This is one of few channels i did NOT unsubscribe for reviewing completely unavailable GPU's yesterday
Dawid gets a pass from me to as he never comes across as a smug git.
It's not overkill for really high refresh rate 1080p gaming, I'd say at least 165hz, preferably even more (unlike 3090 which is just wasted on 1080p, not so much due to its performance increase over 3080, but because paying upwards of $1500 for a 24gb vram card for that resolution is really wasteful), however in quite a number of cases you will still be limited by the single thread performance of even the fastest cpus available today (which was shown in this video as well - everytime gpu usage was below 90%, even though cpu wasn't even hitting 50% - that just means one thread was pegged to the max and was limiting the rest of the system), so I'd say the true calling for the 3080 is 1440p 144-165hz.
Incredible how people don't consider 1080p to be high res. Just recently I got a computer with a 1080p screen and it just looks so damn good
Same
@Gothera Gaming i run a 1080p laptop with an external 24" 1080p display and even that doesn't struggle with 1080p. Can't imagine how smooth things must be with a 10700k
Meanwhile, my brain just goes:
3080 @ 1080p just means it won't start to feel slow for as long, and also leaves the option for going up to 1440p or even 4k if I decide I want to down the line.
Seriously, for some people, futureproofing things when they have the budget on hand is a LOT better of an idea, than taking the cheaper option now and then upgrading later.
Kinda disagree, futureproofing is kinda bad imo, you shouldnt buy something overkill for what you need now and expect it to last a long time. 2 years from now you can buy a gpu as strong as 3080 but for half the price and with more ram/features. Buy something bang for buck now probably the 3060 which is rumored to be as good as 2080 and 12gb, then buy 4060 which is probably gonna be a 3080 but with better rt, more ram, alot lower wattage, new warranty etc. You spent alot less money basically after you sell your 3060.
@@fran117 That's not the case tho. Again, I said SOME PEOPLE.
In general, futureproofing doesn't work. However, when it comes to overkill tech for a 'low' resolution, it can in fact save money in the long run (although that wasn't what I said either), especially for those that AREN'T bothering with reselling cards.
You are also operating off the assumption that we will see similar hardware jumps as the recent gen in the next few and that prices will remain stable. That's far from guaranteed (especially with, IIRC, at least one AIB saying NVIDIA's MSRP is not realistic). To say nothing of RT not being a factor for probably 90% of players in the first place.
I specifically said 'better', not 'cheaper'. Some people are in financial situations that are unstable or otherwise not guaranteed to have the money to do an upgrade in the next few years. Going overkill NOW means you have a better safety net for the long run. Additionally don't have to worry about part compatibility for the theoretical upgrade in the future.
Who knows, that 4k series GPU might require PCIE 4.0, which means you might be screwed if you are on an older Mobo, unless you want to spend MORE money to upgrade that while you are at it, which would probably also mean needing a new CPU. Ya gotta factor in the fact that upgrading in the future might mean needing additional parts on top, and factor that into the cost as well. Current trends with wattage don't really point toward a decrease in wattage from Nvidia, really.
@@jtnachos16 with amd getting rich and intel coming into gpu market, im expecting only good things coming out next gen. But i guess you are right that some people wont be able to upgrade every gen.
As someone who owns a 3080, and 1080p panel, I can say 3080 is not an overkill. In games like Cod, apex legends, bf, I can get 240 frames+ at 95-99% gpu utilization. You will need to pair your 3080 with a beefy cpu however.
I am just here to hear him say "thirty" over and over again 👍
Ten thiddy
tenthiddy.com
Tree fiddy
@@SomeDudeQC thiddy eighty
Would'a loved to have seen this video made today
See that Cyberpunk 2077 performance at 1080p
yes its overkill at 1080P until you throw Cyberpunk at it
Very , very good video !! I am also an " advocate " for 1080p resolution when gaming ( either on console or pc ) . Many games cannot max out at 1080p on a high end PC like this one .
Though many do NOT know about the benefits of Super Sampling !! If you own an NVIDIA GPU and your game CAN be played maxed out at 1440p then it is advisable to use so called DSR ( VSR in the case for AMD gpu's ) .
Basically you render the game INTERNALLY at 1440p ( 2K ) and then the image is Super Sampled on your 1080p Monitor or TV .
By the way , when applying such a " trick " you should consider DISABLING any Anti Aliasing ( eg SMAA , SSAA , TXAA or TAA ) . Note that FXAA may not affect either performance nor distort image quality .You get a MUCH better quality image as Super Samplings has been proven to be better than Anti Aliasing ( do not take my word for it , search on RUclips ! ) .
And what is even more important AND interesting Frame Time ( not to be confused with frame rate , FPS ), often becomes much smoother and thus giving a better experience. Look at how the 3080 RTX fares when playing at 1080p VS 1440p VS 2160p . Some older games ( gem's included eg GTA V , Rise of the Tomb Raider ) can play at a 4K ( native resolution , hence can be rendered internally at 4K ) .
I think the reason for this is that 4K gaming is NOT cpu bound ( given that your cpu is a recent one and does not cause a so called " cpu bottleneck " )
Now,
Coming back to consoles, let's talk about PS5 or XBOX Series X. Yes some games CAN be played at 4K ... But how many at 60 FPS ??
Moreover take Marvel Avengers released a few months ago. The PS5 can achieve a SOLID 60 FPS only at a 1080p resolution . And this game was released BEFORE the launch of the PS5 . So , games that will be released in just a year from now , how will they fare ??
Games become more and more demanding over time . The PS5 and it's counterpart will STRUGGLE to maintain a solid 60 FPS at 1080p much sooner than what most gamers think , especially with games making use of Ray Tracing technology !!
I’m wearing the exact same hoodie as you. I literally got it yesterday.
For the love of God Dawid can you please start recording gameplay at 60fps??
You want make videos to be in 60FPS?
@@DawidDoesTechStuff yes
@@DawidDoesTechStuff oh yes we'd like that
@@DawidDoesTechStuff Engrish
i know right!!! its always just choppy in the gameplay sections!
THANK YOU SO VERY MUCH FOR THE 1080P BENCHMARKS.
Again Dawid brings some reality into the situation with a touch of humour
Hey man awesome stuff! I'm liking your vids! The stock shortage was sort of a blessing in disguise, I was prepared to spend $700+ on a 3080 but I went for a $300 rx 5600xt instead and have been really impressed at 1080p!
I love today that it's totally viable to game 1080p, while 1440p and 4k are also very reasonable. It just makes 1440p and 4k feel more premium and budget 1080p doesn't feel like you're really downgrading.
Loving all the videos! You have my thumb(s) up. Another great one 👍 Dawid! Thank you good sir. I'm happy you finally got your 3080. As they say. Better late than never and good things come to those that wait.
May as well use 1440p at this resolution, some great displays in the £200-£300 range
Making the videos I always find useful. Thank you sir.
You should have done one where you put all the graphics settings on Lowest to see how crazy the FPS can get
I really want to see that
I already feel bad for that cpu...
@@ShmoothFruits lol. It has to be done, in the name of science
@@razortech9260 Some games seem to just give up and act as if there's a bottleneck on the lowest settings. My computer has a Ryzen 9 3900x and a GTX 1070, and I like to play games at pretty low settings. I know that a 3900x is overkill for the GPU that I have but certain games just do not push it up to full usage. I know that none of the cores even max out in games so it's not a bottleneck. Maybe there's a way to fix that but yeah it's not gonna be a case of just getting 1000 frames in every game.
Bruh this guy grows his channel.faster than my nails grow back ... Subscribed 10 months ago and he was at 18k ... Give me some tips dude
I mean, I’m getting a 1080p 360Hz monitor, so I’ll probably need a 3080 to push that many frames
DiscreetLuka Depend on games/maps, I have 3080, i7-10700k and 280hz. Even I use low settings i cant get stable fps or reach to 280fps. So for 360hz monitor, beside csgo or valorant, i dont think you can get minium fps of 250
@@qwe29362 That seriously depends on the games you play, and I’ve seen benchmarks with what I’ll be getting with the 3080 and I should be able to reach 360fps without a problem on low settings on most games. I’m surprised you keep a stable 280fps, is it because your GPU is bottlenecking your CPU?
whats the point of having that many frames? You won't realize the difference between a 240 and a 360
@@rebelrin6686 I have a friend who has the PG259QN 360Hz and a 240Hz monitor and I can really tell the difference between the two.
DiscreetLuka Yes, you can notice the smoothness. But it’s just a matter of time your eyes get used to it
Good vid! Gave me the info I needed. I'll hold out as long as I can with my 2070 Super. Perhaps I can wait until "4000 series".
PC elitists are triggered when people still game at 1080p lol
No?
@@SussyBaka-em9rq yes?
I guess that's why you call them elitist snobs, and ignore them.
1080p is perfect for performance in esports titles where you need more frames
I'm still here gaming at 900p with RX570 lmao
I really needed this video, thank you!
You need to look at the 1% Lows and then factor in the Monitors Refresh Rate, for example when you said BF5 is getting 170FPS Average it doesn't mean now you should get a 240Hz Monitor.. No, now you get a 144hz Monitor and G-Sync or V-Sync Lock the Game to play at a Constant 144FPS to match your 144Hz Monitor.
no you should cap it at 150 or 160 because it will not always give you 144 fps. so its better to have it at 10 fps more than your refresh rate
Og man! Getting an 120HZ gaming projector for Christmas. 1080p 120'' screen, 120Hz all the way. I just do not care for 4K. Give me the 120s man!!!!! Excellent video mate!
I think it matters for the sake of future proofing your system. You can pretty much play anything at least 60fps 1080p with the 1080ti. The problem is that the 1080ti won’t be able to hold on to that in a couple years, the 3080 on the other hand will be able to handle 1080p gaming for a very long time, with RTX enabled as well.
😁👍💯
Speaking of your accent, I didn't give it much thought and watched your channel for like a year before I realized you weren't from the EU but Canada. I'm even usually pretty good with accents lol.
When is minecraft getting added to these benchmarks
Unironically agree. Minecraft RTX doesn't have nearly as many benchmarks as it should given how popular the game is and how demanding the RTX effects on it are.
There really needs to be a channel that just overmods the shit out of games and then benchmarks them with high-end CPUs and GPUs to see how well it performs.
Minecraft RTX makes the rtx 2060 super bring to its knees... especially my 1050ti...
this is the first 3080 review which hits the nail on the button. thumbs up
the main reason i still choose 1080p with my 3080 is the frame rate dips. if i average around 180-200fps and it only dips to 150-170 then it isnt really noticeable. but in 1440p i only average about 150-160fps. so the frame rate dips become much more noticeable when they drop down to 110-115. personally i still never use ray tracing at all. id rather get 180+fps with max settings. im a frame rate above all else kind of person.
Hey Dawid you can unlock Escape from tarkovs fps to make it whatever your monitor supports (or your gpu can pump out) Vox_e has a short vid about how to do it if you want to for the next test
Hope he notices this, it's a good idea.
3080 1080p 144Hz gamer here 😂
is it any good on a 144hz monitor?
Does rtx 3080 is good for 1080p?
Same
3080 1080p 165hz 0.5ms gamer here :p
@@axezz9451 how is it
@Dawid - Hi, first time viewer here. Obviously being new, I don't know if this is like an inside joke but I really don't think you need to be so hard on yourself. I thought your presentation of this video was on par with some much bigger and older channels and your analysis and conclusion was spot on for a video with this title.
Correct me if I'm wrong but what I saw in this video is someone who knows what he's talking about, on the whole looks very confident and comfortable in front of the camera, but in those few brief moments, lets his insecurities poke through in the form of (verbal and written) comments putting himself down when I saw absolutely nothing wrong.
I hope you see this buddy because I think you're doing a brilliant job. Just keep doing what you're doing man, great content.
Overkill? Maybe for now but wait and see how far games will push things in the next two or three years. It would be a pretty nice investment especially since there are many 1080p monitors that are high quality and have very high refresh rates.
well 1 year later and it's not overkill anymore if you are into high refresh rate gaming.
Fun fact: if you have a 1660 Super and Call of Duty: Black Ops Cold War, you can actually toggle ray tracing on and off, and the VRAM slider actually increases when you enable ray tracing. Cool little bug IMO.
Dies it cripple the performance? Lol
@@ArtisChronicles I have no idea. I think it's just a bug. The game has a metric ton of bugs and glitches
Dawid I've always been curious about your heritage, are you Welsh?
I was born and raised in Namibia.
Lol he sounds nothing like a Welsh person
@@DawidDoesTechStuff wait really? I'd never guess that.
@@JudeTheRUclipsPoopersubscribe it's not about how he sounds like. it's about his name. Though how he says 'thirty' *does* sound similar the the Welsh 'dd'.
@@mattparker9726 the only dawid I know irl is from Poland lol, the Welsh drag out the word more like thiiirty, instead of fiddy like how dawid says it lol. I lived in Wales for 2 years so I've got a good idea how they speak.
The nice thing is you are not limited to 1080p if you have one of these beasts.
BTW, I listen to your videos, half because of the way you pronounce 3080, etc! :)
"red lips"
"I Can't Get No Satisfaction ..."
Since when do I hear audio from texts...
I bought a 3080 to play at 1440p 60+ FPS in new games for the foreseeable future. But I also enjoy 144+ FPS in multiplayer games. Also helps that graphics don't matter as much in that case. And it's fun to mess around with lower resolutions, ray tracing, dlss, etc.
Man, my pc just apparently became a loser cause of its 2070 super T_T
nah don’t sweat it fam that card is still solid
Its still an awesome card, a pc with a 2070 super is still high end imo, you dont need to upgrade yet
Cries in 2060super T_T
Ya'll having RTX's? I'm still sitting here with my 1080 from 2 generations ago..
@@ShmoothFruits bruh my girl is still running a GTX 970 wym
To uncap eft you have to turn vsync on in game, trust me it will make sense, then you will have to go to nvidia control panel > manage 3d settings > program settings > add escape from tarkov (by clicking add, browse, then go to desktop and click open file location on bsg launcher, then go to the battlestate games folder at the top then eft live where the eft exe file is and add it) > scoll to v sync and turn it off. You can no go back to the game, turn on v sync if you haven't already, and test to see what fps you get, it should be unlimited now. For some reason this bypasses the 120fps set limit
i almost died when he called the 2070 super a loser card
Lmao
Im sitting here with a gtx 1050ti be like *leave me alone*
i know that video is kinda old by now, but the 3080 does make sense at 1080p for niche sections. i do simracing, my screen for the sim setup is a 1080p screen.... 49in super ultra wide. racing sims also dont feel great under 120fps so for a good experience with an ultra wide, or triple screens. you need quite a bit of GPU beef to drive those frames.
Not everyone cares about resolution. I prefer having high frames while playing any game on ultra settings
1080p 60fps is good enough for me hahaha. Keeps my GPU and other pieces cool
Exactly.
yo I have the same card and 1080p too i’m glad you made this
i think one thing I noticed no one talks about is the performance hit from using nvidia broadcast on both cards, and the fact that you can’t get that from an and card
Me: Hippity hoppity you're 240hz monitor is now my propertie!
Also me when watching my pocket: Nevermind forget it!
Hahahahahhahahhaha you are so funny and these are really original and fun comments that make me happy
Original - no
Spelling - also no
Well done.
@@tolvi2322 yes i have to play this game called Reddit so original with you....
Got a 1080p 144hz, ips panel, 1ms monitor for my rtx 3080 i9 12900k 😁. Now i can play any game with ray tracing without the tension of low fps
damn the 2070 super really is a good card still!
Loving the Flight of the Navigator T. 👌🏻
"you dont need to worry about a 3080" ok well ill sit here with my 3080 and my 1080p monitor thank u very much
Ikr. If you got a 240hz monitor 3080 is worth it no matter the resolution
@@nelsonlol5218 mines 144 hz, but still. Im happy to get that fps on max settings on nearly every game
@@BadChocMilky yh usually to get stable 240fps its required to lower all settings. So yeah you could increase setting until you get 144fps
Unless you really really reallyyyy care about high fps in story based AAA games with everything maxed then fine 3080 for 1080p in that regard makes sense,kinda? but man if someone plans on getting 3080 do yourself a favor and at least upgrade to 1440p. 4k is where these cards shine and imo it should be like this 3070-1080p, 3080-1440p or 4k, 3090 - 4k.
5:20
there is no way on earth bf5 uses that much ram
does it?
Lmao
most likely background processes as MSI Afterburner just sees how much RAM your computer is using, not how much RAM the game is using
"Imma video this if it kills me"
I'm afraid how often this phrase applies to Dawid, Patron Saint of Hardware E-Waste, may our GTXs be blessed by his OEM-ness.
On a competitive gaming point of view, an extra graphic detail is good but a disadvantage. You should be able to see fast with less graphics. Hence FPS matters the most. This is where rtx 3080 at 1080p really shines in a world where AMD does not exist.
I would think it's not overkill. Since a lot of games will actually make use of the full power of the GPU even at 1080p. It also makes for a lot more versatile of a build having a powerful GPU. When it comes to which games you can play, what quality settings, and being able to reach high refresh rate gaming in 1080p.
Edit: Allt his being said, it's better to use AMD 6800XT or 6900XT (most likely) for 1080p gaming. In most titles.
i have a 3090 and a1080p monitor
How does it look? I bet you have pretty high fps? Am looking to get a RTX3080 with 360hz screen 1080p.
I really wanted to know about this as I'm spinning my head over these 2K monitor prices.
Microsoft Flight Simulator: *no*
Resolution vs fps is very personal. I prefer smoothness from high fps with everything on, than high resolution. I use 24 inch monitor, so 1080p vs 2k isnt a game changing to me (from de distance i play). But i understand some people have a better perception with resolution, and dont see much diference above 120hz.
Me still rocking a 1060 and not knowing what ray tracing is
I'm not all that interested in it just yet. Sure it's fairly doable now on both AMD and Nvidia, but my RX 480 can still afford to age a little longer.
Very informative. You sir get my like.
Seems like kind of an unfair performance using a 2070S. Would be much more interested in a 2080 Ti comparison
msrp wise they are closer though. 2080ti should be compared with 3090 for genereal uplift
Best one would obviously be the 2080 super, since that was the previous 700-800 dollar card.
Huh, this video strangely applies to me. A couple years ago I got this monitor because I was planning on building a budget PC meant for 1080p gaming but years later I never got to do it and I actually got a job when I turned 18 and am actually building a 3080 t.i. build and was planning on getting a 1440p monitor replace the same exact monitor you have in this video, lol
Key words are: "GOING into the future IS GOING to be more relevant"
The 3k series is the new baseline and is very rapidly going to be eclipsed as developers and cardmakers come to grips with ray tracing, much like, say, early straight-wing jet fighters from WW2 were rapidly eclipsed by fighters designed from the ground up to be jets. 3k series hype beasts are going to be bag holders within a year or less
Lol no.
Would love to see a similar video, but looking @ Hitting 240hz constant and how settings are lowered to achieve this and what impact it makes visually and to the smoothness , every review on u tube seems to be max setting only...
3:20 Lol nice
exactly what i needed, thank you!!
when you realised even with a 3080, you still cant play few years old game at 1080p 144hz lol
Well that's because all maxed out settings is actually running the game at 4k and then downscaling it. If he wanted genuinely 1080p performance, he should have left all forms of anti-aliasing off so the card would only render the native res.
@@outlawfps3948 unless he specifies the game is using 4x msaa, you claim wont stand
@@Simon-tr9hv Dawg just go in the game and you can see that setting is turned on when using a certain preset
I'm so surprised you were ever embarrassed by the way you say thiddy. It's perfect. Never stop
me: a 1280x720 gamer that still wants a 3080 for overkill fps
everyone that hears this
✋🧿👄🧿👍
Dont sleep on Ben Q , its better than the other famous companies out there.
Oh god no. Here comes the 2000 "I've got a 3080" videos. I'm disappointed.
I mean, I've been talking for months about how I ordered one, so I'm not sure why you're disappointed. 😅
@@DawidDoesTechStuff Because every one is doing it. I like the humour and wit mate. Please just review old crap and make me laugh. You are the only guy on here with a clue about originality. I once subbed to all of the "big boys" only to find that they all copy each other and do the same crap week in week out.
Really like the video and gained a new sub. Is there 24” version of this model?
Show me the thiddy's
(I only care about the thiddys now)
10thiddys
Once I can get a 3070 for MSRP Im going to play that at 1080p I just want to max out my monitor in apex legends