the real bottleneck of the 3080 is the supply from Nvidia right now
That's facts
he has 10% of nvidia's 3080 stock
@@meunknown69420 10% of 50 units XD
This is why I get a laptop...
While keeping the price of the 2080 Ti at $1400.....
Nothing else bottlenecks your system like your wallet does..😄😄
Truth has been said
Fax
😂😂😂😂
it's either the wallet or the GPU's stock issues
My wallet doesn't bottleneck anything. Do you hear me?
Man, the production quality of these videos is just out of this world.
ik bro i can't stop watching them
He is so levelheaded you could use him to hang up shelves
@@jackpeendog8697 Yup, I initially began following him for the tech content but then I continued watching his videos for the production quality as well.
It's not only the quality, I would say, but how clearly and to the point he explains things: no screaming, no weird FX graphics, he just goes straight to the point. He knows what he's talking about.
@@elvisortiz6813 Yeah his presentation is pretty good as well, I agree.
Imagine buying the ultimate gpu and cpu combo, but not having any money left to upgrade your 60hz monitor.
I feel like this haha. My whole build is done (10900k / 3080) but I am currently using my crappy 1080p 60Hz monitor since the new one hasn't arrived yet. Should be here Tuesday tho!
@@dethndots nice! What monitor is it?
Well, a high refresh rate monitor isn't the only place where the fastest GPU and CPU would make sense. There's VR, and video or photo editing, where a faster refresh rate monitor won't change the experience that much
I have a 144Hz monitor but it's a 24" 1080p TN panel from 6-7 years ago. Now I want to upgrade to 4K, but the 144Hz options are either extremely expensive or not out yet. Don't know if I should wait for the new 4K monitors that are said to be releasing next year or get a 1440p one in the meantime...
@@_Booker_DeWitt it depends on the size of the screen you want to buy.
the intro is sick!
so am i
Yet so random lol
Nah
@@ampere7220 you should get tested
@@ThamasTheDankEngine now that you mention it
beautiful cinematography!!!
fr
the intro wow just wow
Umm, are we gonna talk about that intro? What the actual... just perfect
This is by far the best explanation of bottlenecking that I've ever seen. Well done.
The production quality and content quality are absolutely top class. You're absolutely one of the few who push for perfection, and it really shows
Exactly what I needed to know. You seem like you're a step ahead every time :D
You should have included the R5 3600; I think most people building a new system or upgrading are probably looking at the 3600.
The 10600K is better than a 3600, not by a massive amount without an OC, but they would be similar
I just bought a 3600, they are an absolute bargain. Will upgrade this time next year to a 5800 or similar. From everything I’ve seen the 3600 is the real sweet spot for sure. And I’m not certain the 5600 will really be worth the extra cost at the moment
@@Daeyae the 10600K is only marginally better in games, but in anything else the 3600 would dunk on it, meaning if you wanted to record gameplay or stream, the 3600 is the better CPU
@@knightsljx 5-20% more FPS isn't marginal, especially for a CPU, and the 10600K also has better all-core speeds, meaning it's better than the 3600 for multitasking as well, especially with a nice cooler and a good OC and tuning
@@Daeyae Your points aren't really that valid.
1. The 3600 does have better multitasking capabilities... there are more factors than core speed, lol.
2. 5-20% is a big range, and not very apparent except at lower resolutions (you probably wouldn't be on 1080p with a 3080...)
3. Higher cost (a good OC plus coolers and motherboards cost more)
4. The 3600 costs 200 bucks... the 10600K costs almost 300...
5. The 3600 is much older. A better comparison would be the upcoming 5600.
Conclusion: you don't know what you're talking about
Seriously Ali, you’re taking tech YouTube to the next level. The effort and research you put into your videos while making them very enjoyable to watch is impressive. Keep up the great work!
welcome to the channel with sick intros and lit b-rools
I love b-rools
@@breadkrum6860 I love the A-rools
Absolutely love the intro!
This is much more helpful than many PC channels, and a more practical look vs. a certain channel that drops truck loads of data and analysis. Subbed.
Something like a 7700K, 8700K or 9700K would've been interesting. Those were among the most popular CPUs not too long ago.
7700K owner here, pls help
this
Agreed... sitting on an 8700K currently. But I'd imagine it would fall somewhere around the 10600K since I kinda won the lottery with a 5.1GHz overclock that's held incredibly stable for 2.5 years.
@@genericname4924 4770k right here with a 1080
@@benji182 Same.. still rocking a 1080 currently.
"this video is a primer for upcoming cpu releases"
He can't keep a straight face saying that. I'll take that as a sign that the new Ryzen CPUs are good.
This is the best explanation of bottlenecking I've heard from techtubers. Thank you for clarifying it! I wish an overclocked 7700K had been tested here too.
My college housemate (who doesn't really know much about PC building) recently got a 3080 (needed it for his deep learning stuff), and it's funny that another mutual friend of ours was telling him that his choice of the Ryzen 3600 will bottleneck his GPU. Lol
Well, if he's using something like TensorFlow or Keras, his CPU won't matter because CUDA is what's important
Actually, if he is doing 4K gaming at ultra settings, there's no bottleneck there, with smooth 60fps across virtually all games.
How can he not know about PC building but need a 3080 for deep learning stuff? Something doesn't add up.
@@waifuhunter9815 not really sure which software he uses but yeah, he got the 3080 for the CUDA cores. He was thinking of getting a 2080 Ti, but couldn't find a good deal for one.
@@zaxmaxlax You would be surprised to know there are a lot of programmers who know little about computer hardware.
At this point it's sorcery that Ali comes out with vids that explain stuff perfectly just when we need it. Absolute Sorcery
Who is Ali?
@@raghavendrarao6287 Ali Sayed is Optimum Tech's name
This video is so badass. One of the most aesthetically pleasing and informative videos ever. Thank you!!
I haven't even watched the video yet and already liked it. You are a class act, bro.
I do like these kinds of videos. Ali, you are one of the few who do such deep dives, which is appreciated since I always think I need more than I “actually” do.
Great video Ali as always. Just missing the R7 3700X 8C/16T in this comparison. No CPU like this in your chart
Production quality is seriously stellar. Given the relatively small size of your channel, the production is on the same level as someone like MKBHD
As always, a huge thanks for making this and I'll see you Ali in the next one.
Been watching your content for a couple months now and I just want to say that it is of the highest quality. Keep it up!
One maybe interesting thing to note is that if your GPU is maxed out, your latency will increase quite a bit. I've noticed this in my own testing and heard others report the same, so I cap my FPS just below what I can maintain almost all the time.
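To put rough numbers on that capping idea, here's a minimal sketch (the framerates are hypothetical, purely for illustration) of the per-frame headroom you buy by capping a few FPS below what the GPU can sustain — that small slack is what keeps frames from queuing up:

```python
# Rough sketch of why capping a few FPS below your sustainable average helps.
# The numbers are hypothetical; the point is the per-frame slack the cap creates.

def frame_time_ms(fps: float) -> float:
    """Convert a framerate into the time budget per frame."""
    return 1000.0 / fps

sustainable_fps = 180.0   # what the GPU can hold almost all the time (assumed)
capped_fps = 170.0        # cap set just below that

headroom = frame_time_ms(capped_fps) - frame_time_ms(sustainable_fps)
print(f"Uncapped frame time: {frame_time_ms(sustainable_fps):.2f} ms")
print(f"Capped frame time:   {frame_time_ms(capped_fps):.2f} ms")
print(f"Headroom per frame:  {headroom:.2f} ms")
# With ~0.3 ms of slack per frame the GPU finishes early and idles briefly,
# so frames don't pile up in the render queue and input latency stays low.
```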
Great video! Loved the intro as well. Exactly what I was wondering, as I am building my first PC with the 3080 in mind and don't want bottlenecking. I'm surprised the 3950X isn't way ahead. I wish I knew exactly what parts I need for my build so I can start deal hunting.
I have a 3090 and a Ryzen 5600X and was using a 1080p ultrawide, wondering why the FPS was so bad and dipping so much. Also, in BF5 my GPU usage was at 40-60% while my CPU was at 90-100%. I recently sold the monitor and got a 4K OLED and it's so much better. The GPU is now being utilized 98% and the CPU is down to 30-50%. My average FPS is obviously a little lower, but not by much, and the FPS is way more consistent. CPU bottleneck is definitely a thing.
I think you are the only techtuber who tackled CPU bottlenecking with the 3080, thank you! 2600 user here :D
The Ryzen 5 2600 would do well with a 3080 at 4K. Hell, it does nicely enough at 1440p already, since you should never hit 100% GPU utilization (or even 98%) anyway, as that impacts latency negatively.
@Ping Pong That is a solution, but it does not work on all games. It also does not work if you are GPU Bound. Only Reflex does.
@Ping Pong I did even better, I watched GN's and HU's videos on them.
The thing with Ultra Low Latency mode is that some games with wacky API implementations don't work with it. So not most new games, but a few old ones and some OpenGL titles. Still, it's good tech.
I myself just use manual FPS limiters :P.
I got the 2600 3 months ago... should I be worried about it, or wait for the next gen?
Correct. My 2700X/RTX 3080 saw no worse performance at 4K compared to the highest tier CPUs I've seen in YouTube benchmarks.
@@heinzgame6400 If it serves you well, don't replace it. Remember, a few percent in a benchmark doesn't matter. If your experience is good, be happy.
Just the insane quality and ambience of the video make it such eye candy that you want to watch it till the end, and the info provided is always useful
This is why I will continue to use my 8700k, at 5.0 GHz I just have not seen enough performance increase to justify a new processor and motherboard.
or your wallet bottlenecked
I still have a 6700K, but mine isn't stable past 4.2, so I'm weighing my options. Also my monitor is 1440p 100Hz
@@gixxxer1k strange, my 6700K runs very stable and cool @4.7 with default voltage.
@@gixxxer1k sell the 6700K and motherboard and RAM and get something better. You can still get good money for it.
4 cores ain't cutting it anymore.
Lots of games will max it out, and I mean a lot
I literally just posted on another comment. I'm clocked identical to you @ 5.0 with my 8700K and have no cooling issues at all. I game on a 1440p 165Hz monitor and play really only competitive FPS titles. Anyone who knows CPUs knows an 8700K @ 5.0 is insane, but I'm kinda questioning my 3080 purchase looking at 6-core older tech. I'm wondering if a 3070 might have been a better choice for me. I don't play games where I need to see in 4K; I like clicking foreheads in multiplayer first-person shooters lol, and I love my 1440p monitor. So confusing lol.
Thanks OT. Content is always of the highest tier. PS I am doing a T1 build based off yours haha :D
My current 2020 gaming PC:
CPU: Intel Core i7 4790K @ 4.4GHz (all core) @ 1.22v
RAM: 16GB 2400MHz DDR3 Corsair Vengeance Pro
Graphics card: EVGA RTX 2080 Ti XC Gaming Hybrid (watercooled)
Motherboard: Asus MAXIMUS VII HERO Z97, socket LGA1150
Monitor: Acer Predator XB271HK 4K Nvidia G-Sync 60Hz IPS gaming monitor
I have no problems playing all games at max settings using my 4K G-Sync monitor. Most users exaggerate CPU bottlenecking and think they need a 12-core CPU for "future proofing"...
True, for 4K it is; at 1080p you'll be losing a lot of FPS
Well done Ali! Another great video that's useful for the community. Your videos are some of the best among PC YouTubers. Always succinct and to the point
The real bottleneck is how much FPS your monitor can show.
8k 12 fps
Max is 360 hz
@_Nism0 if you’re on a 60hz, 1080p monitor then it doesn’t matter. This GPU is so powerful you could run anything above that monitor without it breaking a sweat. 1% lows won’t even matter lol
$>Monitor>GPU>CPU
@jordy rozon if you had a 1060 on a 1080p/60Hz monitor that's fine, but by upgrading the monitor to 1080p/240Hz or 1440p/144Hz you gain nothing, as the 1060 probably can't do 60+ on the highest settings in newer titles. By getting the 3080, you still gain the performance and can supersample on your monitor. You can always buy a monitor afterwards; that's better than getting the better monitor first, unless you can actually utilise it in some way, like if you play CS:GO
Congrats on 400K subs. New subscriber here, and loving your content and video editing.
Man your video is literally peace giving ❤️
Thank you very much. Great video as always. Good information also.
You should look more into frametime consistency. I've found from my experience that consistency is more desirable than a high framerate, although the two are usually related.
This is a question I always had and never fully understood, but this video is the best thing ever. I just needed the understanding; now I can go ahead and test it out myself! Also, the visuals are absolutely epic
Salute to all my fellow gamers who bought the r5 3600 like myself
I snagged a new system with the 3600 and a 6800xt woot
@@freakshow861
That's great
Have fun 👍🏽
Thanks for this video! You just helped me save quite a lot of cash by not upgrading my CPU to the latest gen.
@1:45 u already got a 5600x, nice job mate!!
@Ping Pong you mean rendering
Good video my man. Better than most popular tech YouTube videos, with excellent knowledge content.
My favorite channel on youtube
Favorite intro yet! Nice work:)
Send that card on over, so I can see how my e8400 handles it! lmao
I think at that point the shipping costs would be more expensive than just buying one for himself.
Man I wish more people would watch videos like yours and stop dropping ‘bottleneck’ into every PC build conversation
I was just watching JayzTwoCents bottlenecking RTX 2080ti with the i3 xD
The amount of time you must spend doing all the research, the experiments and making the video... well appreciated, mate
Who's the author of the music? That's what we need to know. Great vid too I guess :D
Sounds like something from Com Truise or HOME
Probably Home, as they allow people to use their music royalty free
Coooool intro man!
What about a 4770K at 4K? I think it's interesting to see how old CPUs do at the high end. I've never felt the need to upgrade, but people rarely test 4K, which is where I'm at.
Thanks so much, you are the only content creator that focuses a lot of content on the Ryzen 3 3300X. I truly do appreciate it!! Thanks so much from 🇨🇦
Damn, the production quality is so good it's a shame to upload in 1080p, time for 4K!
It is in 4k
Your cell/monitor is bottlenecking this video
@@NeVErseeNMeLikEDis It shows up as 1080p on my s10
Me 720 60
@@QwoRE112 turn up the resolution in your settings I get 1440p 60 on my S9+
Great video as always man! You always post great videos that answer all the questions I have sitting in the back of my mind, e.g. will a 650W PSU be enough for a 3080.
That intro was ᗩ E S T ᕼ E T I ᑕ
af
Loved that intro!
Thanks for some amazing videos!
The 10600K is awesome if you're just gaming
for now...
coughs ryzen 5 5600 *cheap* *gud* *included cooler*
The upcoming 5600x just beat it in single core performance. Sooooooo....... Intel is at DeadLake.
@muhammad ehtasam not yet shown, but I would assume it does.
@muhammad ehtasam no, it beat the 10700, not the i9. That's the 5900X, maybe the 5800X too
You deserve more subs. Very nice content.
THAT INTRO HAD ME LIKE 👁👄👁
Such a good explanation. Good job!
I’d love to see a 4 core vs 6 core
And 6 core vs 8 core
with Ryzen 5000’s 8 core ccx layout
Great video. Thanks for the insight. It has helped me recommend a CPU to a friend who is looking at purchasing a 3080 (when they become available).
did they get one?
@@Fazed_ no lmao. Friend ended up buying a 3090 cuz it was available, and hasn't been able to get a CPU still. This was 3 months ago now lmao
@@FT099 did he get it from a scalper or legit?
@@Fazed_ the 3090 he got legit. We used to have tons of them in stock cuz they are so ridiculously priced for such little performance gain. That was 3 months ago, and they are no longer in stock even at that price. My 3080, however, is from a scalper. I got it around 5 months ago now. It cost me $200 Canadian more than MSRP to buy it, and it was totally worth it to me knowing how fucked up the stock is now. I got to play several high end games and got to enjoy RTX. And after 2 months, I decided to start mining on it cuz it was sitting there idle. I have since recovered $500+ Canadian from it. I would absolutely do it again.
I will not be buying an rtx 3080 in the near or distant future, shit is mad expensive lmao...but I can't help but watch OT's videos
It's fine, it's out of stock anyway
What price would you suggest for a new generation of GPUs?
@@IvanC14 the price I would like to see depends on how much money I have lol (but probably around $200-250 for a new budget tier from Nvidia... or at least lower the price of current gen budget cards)
@@jetjet6560 the prices for the budget cards are definitely gonna drop. Also, don't forget the 3060 Ti coming out, and the 2060/1660 Ti are gonna be really cheap too
Production value never disappoints!
THAT INTROOOOOOO
This is the type of experiments that I want to see from the reviewers. thanks a lot
Da intro 🤤🤤🤤
We're you a cinematographer or a DOP in a past life? I'm just in awe of how good the footage looks.
Were*
@@entruh4520 Ouch
So I'm playing on a 4K TV at 60Hz... meaning that my R5 2600X is perfectly fine for that?
Yes, from those benchmarks we can see a common occurrence: the CPU bottleneck mostly shows up at 1080p.
At 1440p there is only a little bottleneck. Going with the trend and the common knowledge that the higher the resolution and graphical settings, the more the GPU matters, you should be experiencing very little to no bottleneck at 4K High/Ultra and only 60Hz with that CPU for now.
I love all his videos, especially since they tackle the SFF side. I wish he would do a standalone ASUS ROG Strix X570-I review, since there really aren't any videos (or any good ones out there) just on the board itself. The new Ryzen 5000 series is coming out soon, so it would be a good time. With the amazing quality of his videos and in-depth reviews, it's a NO BRAINERRRRR!
Damn your fan base loves u lmao
So you're saying my i7 6700k is probably not gonna blow the doors off my expectations hahahaha. Damn it.
Thanks for this. Have to settle for my 2600X due to stock issues. The video helped confirm some of the performance I'm getting.
Another top tier video by OT. Keep the great content coming!
I finally got my RTX 3080.. but it was looking like early 2021 for my 5900x preorder to finish my build.. I was thinking of giving in and buying a 5900x for $900 on ebay but I ended up buying a 3700x for $300 instead. I can always sell the 3700x once my 5900x arrives.
3080 is a 4K card in 2020/2021. Anything less is wasting its potential. 3700x is perfect for it.
Thank you for this video. It has helped as I am a R5 2600x owner.
This is such a fucking well made video. Thank you so much for this, exactly what I needed to know you explained in full detail with several demonstrations. You earned yourself a new subscriber.
Really excited for your initial Zen 3 reviews coming up, and hoping that the non-X SKUs follow soon after.
Hands down the best intro for a tech video I’ve ever seen
I have the ryzen 5 2600x, was so curious about this. Thanks so much!!
Loving the music choices used in this video.
“You’d want your system to be GPU bound rather than CPU bound”
It's a bit more nuanced than that: you'll get the highest framerate being GPU-bound, but with a sufficiently fast processor, you'd want to be CPU-bound to minimize total system lag in fast-paced games.
That's what Nvidia Reflex Low Latency / framerate capping to prevent maximum GPU usage does (enforce a virtual CPU-bound condition), and you do see latency improvements vs. 99-100% GPU usage.
Check out battle (non)sense’s videos on the topic.
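To illustrate why the render queue is what hurts when you're fully GPU-bound, here's a toy model (my own simplified assumption, not from the video or Nvidia's documentation) comparing input-to-display latency with a couple of queued frames vs. a framerate cap that keeps the queue empty:

```python
# Toy latency model (assumed numbers, heavily simplified): when the CPU submits
# frames faster than the GPU can render them, each queued frame adds a full GPU
# frame time of latency. Capping below the GPU's limit keeps the queue empty.

def latency_ms(cpu_frame_ms: float, gpu_frame_ms: float, queued_frames: int) -> float:
    """Very rough estimate: CPU work + frames waiting in the queue + GPU render."""
    return cpu_frame_ms + queued_frames * gpu_frame_ms + gpu_frame_ms

gpu_frame_ms = 7.0   # GPU needs ~7 ms per frame (~143 FPS), assumed
cpu_frame_ms = 4.0   # CPU prepares a frame in ~4 ms (~250 FPS), assumed

# GPU-bound: the CPU outruns the GPU, so frames sit in a queue (say 2 deep).
gpu_bound = latency_ms(cpu_frame_ms, gpu_frame_ms, queued_frames=2)

# Capped just below GPU speed: the queue stays empty and the GPU never saturates.
capped = latency_ms(cpu_frame_ms, gpu_frame_ms, queued_frames=0)

print(f"GPU-bound latency: ~{gpu_bound:.0f} ms")  # ~25 ms
print(f"Capped latency:    ~{capped:.0f} ms")     # ~11 ms
```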
Really cool intro with the PS1 feature, nice!
Solution: game in 4K. I have tested my Sandy Bridge 2600K (a 10 year old CPU) vs my 9900K with my RTX 3070 and found in most titles less than a 5% difference in performance. I had to RMA the motherboard for my 9900K, so I set up my old rig with the 2600K but installed the 3070, and was pleasantly surprised to see that I was getting about the same FPS. I game on an LG OLED 48" CX running it at 3840x1600/1400 ultrawide aspect and found there to be no performance loss in titles like Mafia: Definitive Edition, Red Dead Redemption 2, etc. Sure, if I was gaming CS:GO at 1080p I would not be getting the 400 FPS, but who cares. We are not professionals, we are just hobbyists gaming for the fun of it.
I was surprised by the result. I thought for sure I would lose like 25% performance or more: 1) because of the CPU, but also because Sandy Bridge is PCI-E 2.0.
I will surely keep my i7 7700 non-K, but right now I'm pairing it with an RTX 3060 12GB.
I was going to get a 5800X for my next upgrade and thought pairing it with an RTX 3080 would be ideal, but I just noticed that the 5600X would actually be the sweet spot. I'm not gonna change what I planned on doing, simply because I do video and photo editing, but it's actually insane seeing these high framerates from 6-core chips!
Most of the games even today won't get much use out of these extra cores
@@MatthiasW97 I totally changed my mind on that point and got a 5900X. Those 12 cores actually all get used in recent titles and smooth everything out, plus multitasking is better
I think the production is getting progressively better with each video.
Great video, but I really, really wish you had chosen 1080p low on all games. That truly shows the difference between CPUs in CPU-bound scenarios. The 5950X and 5900X are high-core-count exceptions to this due to their amazing IPC, architectural and cache changes
P.s. beautiful intro man. Love your channel.
Love your content, I hope you blow up in viewers and subscribers even further in the future.
THATS A MADLAD INTRO, MAD RESPECT MAN✊
Great video!
Inb4 Ryzen 5000 series absolutely slaughters every CPU tested in this video. I am currently using an R5 2600 (which I got 2 years ago), so I think it may finally be time for me to upgrade this November! :)
Clean, clear and a lot of content. Great
Been hitting the curls 💪
Thumbs up for the intro alone!