👉 Would you like us to include 'The Finals' in our usual gaming suite? Please let us know... or you can just hit the *LIKE* button. Cheers! 🍻
If you buy a 4060 over a 6700 XT you have problems... or you're just an NVIDIA fanboy.
Nah, I'm just a broke little boy who can't afford a PSU upgrade to accommodate the XT.
I'm just a fanboy who likes their GPU cool
Why do you care so much about how I spend my money?
@@Blobby762 Because the 4060 is an absolutely trash GPU; even its predecessor had 12 GB on a 192-bit bus, while the 4060 has a 128-bit bus and 8 GB. It's literally a 50-class GPU under a 60-class sticker and price.
But hey, if you're happy being scammed, I won't judge.
@@laszlozsurka8991 I welcome the discussions here, and please try to keep it civil, boys. Attack the idea, not the person.👍
And yes, I do agree that it is the RTX 4050 but 'rebranded' by leatherjacket man to be the 4060 as I've shown here - ruclips.net/video/mqTigMKpuJw/видео.html
There's also a rumor about a 7600 XT, so we'll have more options at the $300 price point
Yes, it'll be very interesting to see how it runs when released.
Would be interesting to see the Arc in this comparison.
Yeah, I currently don't have one but will definitely look into it if I'm able to acquire one (at a decent price). I'm hoping their issues will have been resolved by then.
As always, very nice video!! Keep em coming
Thanks for the feedback bro
I see the RX 6700 XT has way more FPS, so you can even use a really high-end high-refresh-rate monitor. Plus, the 0.1% lows are better on the AMD card and the smoothness is better; for competitive gameplay you get the same lows as on the RTX 4060, but it's smoother, because the 0.1% lows are the kind of thing that makes you feel stuttering on high-refresh-rate monitors.
What does power consumption have to do with competitive gaming? If you get an RTX 4090, nobody talks about its huge power consumption, or about the RTX 4070 Ti drawing over 300 watts.
There is no point in saying "the card doesn't draw much power" if you want the most performance possible; it doesn't matter then, because you don't drive a supersport car and complain about the fuel usage, right?
So the RTX 4060 has less VRAM, and sometimes it has better 1% lows but then worse 0.1% lows and worse latency overall, so it's the worst thing to call it the better option.
It's a no-brainer to pick the most FPS possible, and we all know the RX 6700 XT performs way better than the RTX 4060, so there is just one card to pick if you only have $300 to spend, and it's the RX 6700 XT. This YouTuber is a hardcore Nvidia fanboy who speaks nonsense, which is why you hear him make excuses for why the RTX 4060 is better when it doesn't even matter.
99% of the frames are better on the RX 6700 XT, with better 1% and/or 0.1% lows; if you don't believe it, then watch Apex Legends, CS:GO or Rainbow Six Siege. It's kind of stupid to think competitive players have to turn all graphics to low and that this makes them better players, instead of having enough FPS for good settings like better textures or draw distance, which really matter - and here we have the RX 6700 XT as the winner again.
Do not buy the RTX 4060; it's a very bad card. Go for the RTX 4070 instead if you're a hardcore Nvidia fanboy, but even then you can get an RX 7800 XT and have way more performance for less or the same price.
Hi there, I noticed that your comments lean heavily towards one manufacturer; however, since you've written that long comment... I'll respect the time you've put in and answer as objectively as I can. Please note that if you continue to attack the person and not the idea, I will have no choice but to moderate you and your comments.
1%, 0.1% lows - these are summarised, simplified versions of frametime results. For competitive players, consistency of low frametimes matters more than any of those standard 'summarised' metrics. Whilst these simplified metrics are good for presentation, actual frametime behaviour matters more in competitive gameplay - erratic frametimes affect a player's precision (note I didn't say accuracy). Unfortunately, most PC hardware enthusiasts are used to these metrics, which is OK for single-player games but not as useful for competitive-style players. This is why competitive players usually prefer stability (given that a certain frametime has been achieved, e.g. 8 ms or less). If you are going by these metrics (avg, 1% and 0.1%), you need to look at the figures together, and the gaps between them must be small for a result to be considered better.
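As an aside for anyone new to these metrics: here's a minimal sketch (plain Python, with made-up frametime numbers, not data from this video) of one common way 1% and 0.1% lows are derived from a raw frametime log. Note how a handful of spikes barely move the average but crater the lows - exactly the 'erratic frametimes' problem described above. Exact definitions vary slightly between capture tools.

```python
def low_fps(frametimes_ms, fraction):
    """Average FPS over the worst `fraction` of frames (slowest first)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(frametimes_ms) * fraction))  # at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms  # milliseconds -> FPS

# 997 smooth ~7 ms frames plus three 25 ms hitches (hypothetical numbers)
frametimes = [7.0] * 997 + [25.0] * 3

avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
print(f"avg:      {avg_fps:.0f} FPS")                     # ~142 FPS
print(f"1% low:   {low_fps(frametimes, 0.01):.0f} FPS")   # ~81 FPS
print(f"0.1% low: {low_fps(frametimes, 0.001):.0f} FPS")  # 40 FPS
```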
Power Consumption - Previously I didn't cover this metric; however, over the last year, more and more people have been looking at it and requesting that I include it in our benchmarks. This is because of the global economic situation, wherein some regions have been heavily affected by electricity price increases. This is why I mentioned in the outro, "...power consumption might be - that's for you to decide".
RTX 4090 Gamers - Most of these guys would not be my typical audience. They usually buy the most expensive Nvidia card and game - they generally don't look back or watch YT videos to verify the purchase afterwards. And I agree, they will not look at power consumption.
RTX 4060 - Different games behave differently on different cards (especially at competitive settings), which is why I made this channel. Main tech YTers usually don't do this; instead they just set everything to ultra, run benchmarks and make charts - most of them don't even play these multiplayer games, whilst I actively play the games listed here. As mentioned, there is NO one card (between the two) that is superior in ALL the competitive games shown here.
Competitive Settings - it's not necessarily all low settings. FOV is usually subjective (in competitive play), and draw distance can sometimes be disadvantageous in some competitive games. Most of my settings are based on the usual settings from my viewers and some research done by ALBU Performance.
Hardcore Nvidia fanboy - I have a disclaimer in my description if you're curious (which you don't really see from the main YTers). But hey, I'll take that as a compliment - I've previously been called an AMD fanboy on my AMD-related content, so I guess that makes us even. Seriously though, it doesn't benefit me to be biased towards one manufacturer... if we want better products, we need to highlight the issues of different cards - that's the only way to make things better.
Ok that's it.. please keep it civil. Happy New Year! 👍
Seems a bit unfair to pit the previous-gen 6700 XT against the current-gen 4060.
And yet it still destroys the 4060 💀
Also, just an FYI: in Fortnite, DX12 favors AMD, as for some reason Nvidia's implementation of DX12 isn't very good.
Fortnite doesn't usually like AMD cards, I was told 😂 I have a 6600 and it drops frames a lot
@@mx8719 misread your comment.
The 6600 is pretty low-tier for AMD. I would suggest lowering settings, as I can't replicate this on a 7800 XT.
It could also be that you're on a slow SSD / HDD.
Or you're CPU-bottlenecked, which is the most likely cause, judging by the graphics card.
@@Triro It's not the settings; it runs at 160-ish frames on high settings and even higher when I drop to medium - it just randomly drops to like 40 every few minutes. My CPU is a Ryzen 5 4500, but even when I drop to performance settings it changes nothing; it'll still drop to 40 for one second. My SSD is a WD Black NVMe, which is my main boot drive.
@@Triro It's only Fortnite though, and I don't really know why 😂 most other games run fantastic on high-ultra settings at 1080p
@@mx8719 Fortnite is more CPU-bound, since it's a competitive game.
Other games like Cyberpunk, Metro Exodus etc. aren't; they're more GPU-bound.
Having a decently specced CPU is pretty much needed to avoid any bottlenecks.
Bro, you don't know how to use AMD. 6:17 - click on fan tuning: 30% at 55 degrees, 45% at 65 degrees, 50% at 70, 60% at 75 degrees. You're welcome - now you're rid of the stutters. You need to know that what matters most is the core temp: when you see 74, 75, 76, you have 95+ on the hotspot. At 97 degrees it starts to throttle, but even before that point the GPU performs worse if it's not cooled.
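For what it's worth, the curve quoted above is just a piecewise-linear map from core temperature to fan duty. Here is a minimal sketch in plain Python (purely illustrative - this is not AMD Adrenalin's actual fan-tuning API) showing how speeds interpolate between the quoted points:

```python
# Hypothetical illustration only - points taken from the comment above.
CURVE = [(55, 30), (65, 45), (70, 50), (75, 60)]  # (core temp °C, fan %)

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan duty; clamp outside the curve's range."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    if temp_c >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(68))  # 48.0 - between the 65 °C and 70 °C points
```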
I used to do custom fan curves in my previous videos, but viewer feedback was that they wanted to see stock settings, hence why it's stock.
And I've tried it with a fan curve to eliminate that issue too - same result in Warzone. It's not a thermal issue, unfortunately. The AMD hotspot can go up to 110C max as per spec.
@@OverseerPC-Hardware It shut down for me in some games on stock when it hit 97 degrees hotspot. This card has to stay at 55-60 degrees; otherwise it's harmful - the stock curve is way too ambitious and focused on low noise. I honestly don't believe them; they want you to purchase another one sooner.
I didn't know the background, sorry if I sounded harsh. And second, I haven't played Warzone - the lows are constant, I get it, but I'm talking about the stutter rate per minute, let's say. It won't feel smoother at 90 degrees hotspot, guaranteed.
I've been testing AMD cards for quite some time; usually it's fine even if the hotspot goes close to 100C.
In your case, I would be looking at your PSU. I remember I had a lower-tier PSU before and had the same issue... I thought it was my GPU, but it turned out the PSU couldn't handle micro power spikes, even though the wattage was within recommendation.
Warzone does this every now and then - it ends up with pretty crappy performance, they fix it... and then it's back again, hence why I cover it regularly. If you look at my previous videos, it was night and day vs Nvidia (heavily favouring AMD).
Appreciate the feedback bro. As I mentioned, I just set it to stock settings because some people felt I was giving AMD an unfair advantage by using fan curves. Normally, I have fan curves set on my daily PC 👍
@@OverseerPC-Hardware A fan curve fixed this for me, and I've never had an issue since. Thanks for your info. I'll check out other videos too, and you even get a subscribe for a nice, informative, cool-mannered reply!
Thanks for the support bro 👍
I guess you know what you are doing (much better than me LOL 😀), but if the concern is the worse 1% lows on the AMD card, wouldn't a better solution be to set the max GPU frequency a bit lower? For example, 2450 MHz instead of above 2500? I assume the average FPS will drop, but if the 1% lows improve, the trade-off can be worth it in a competitive game.
Thanks for the suggestion, I will look into this when I'm back from holiday. I might just publish this as a short if I can't fit it in a video. 👍
What if I want to use the card for streaming? I'm really indecisive between these 2 cards and I want to use one to stream; CS2 is my main game. And I also don't really care about quality, only performance. Which do you prefer?
We did some livestreaming tests previously, feel free to check it out:
RX 6700 XT
H265 - ruclips.net/user/liveC-OqE9X4g6M?feature=share
ruclips.net/user/livegtf621XovZ4?si=OHqWq1wd9jOQBxgz
RTX4060
AV1 - ruclips.net/user/liveb4eAmUiVMqg?si=MXoP6OarHFVG7ZZA
It's not CS2, but at least it should give you some idea. 👍
For CS:GO you will for sure pick the RX 6700 XT, because it performs way better there, and if you set the FPS limit to your monitor's refresh rate, it gets even better for streaming.
There is no real difference in streaming between the RX 6700 XT and the RTX 4060, just the performance hit they take - and yes, they both take one.
This is not a question of which card you should get; it is a question of why you would buy an RTX 4060 over an RX 6700 XT when the Nvidia card delivers less performance and has less VRAM (very important for the future, because there are games out now that easily use over 8 GB of VRAM at 1080p).
CS2 runs fine on an RX 6400 lol.
Get the 4060 if you want anything more than just gaming.
You can still play at ultra settings if you want.
What about the lows with the 7800 XT - are they better?
I haven't tried it out but might do a quick run later
If going Nvidia, the RTX 3060 12 GB is a better card than the 4060. I mean, watch your own video: most of the time the 1% lows are consistently worse on the 4060 - it's got no bus width and runs on 8 PCIe lanes. The RTX 2060 Super beats the 4060 in heavy loads half the time; of course, it's got twice the bus width and uses 16 PCIe lanes. And the build quality of the ELSA cards is pretty good - I've got a few myself; they are well made. ELSA makes a 2060 Super for $175, and half the time under a heavy load it beats the 4060. Throw an Unreal Engine 5 game at it and see what happens.
As much as I love to shill the RX 6700 XT for how well it performs for me in games, I can't really recommend AMD right now. For 3 months I've been having multiple issues with their drivers - and when I say drivers, I mean multiple driver versions. I even rolled back to a version from December 2022, released more than a year ago, and it just keeps crashing on me, even when doing only browser tasks. I suggest that if you're going the AMD route, go with the 7000 series right now. Their latest drivers have not been good for 6000 series users; the r/Amd subreddit would agree with me on this.
I'd have to agree with you on this one, bro. I think the current drivers have left the 6000 series behind. When I was reviewing the 7800 XT, it went really well... I even thought they had fixed the FN issue then; however, I realised with this video that the fix wasn't universal across all AMD GPUs.
Is image sharpening a boost?
It's not, but I've always had it on because it improves visibility a lot (subjective). And this one has been superior ever since it was introduced years back, compared to Nvidia's attempt at a copy (sharpening filter).
A comparative video of that would be interesting - what the game offers compared to that AMD feature, and AMD vs Nvidia. Thanks for the advice, I will try it. Is there any game where that image-sharpening option should not be applied? Thank you for your video and response; I will follow your channel @@OverseerPC-Hardware
I have it turned on by default. For the games I play (which are all the games here), I always prefer having it on - try it, night and day difference. If it's too sharp for you, just lower the strength to your liking.
Rx 6700xt 💪🏻
Don't buy a $300 graphics card - get the $260 Intel Arc A770.
AMD is better. I have the ASRock Challenger D OC on Linux, running 2860 MHz at 186 watts stock; 26C at idle, 61C gaming in my mITX case. I play JRPGs and ARPGs, no shooters, so the RX 6700 XT has more power because of the extra VRAM!
Buy a proper 1440p GPU instead.
So if I spend $300 on a GPU, I still get stuttering in shitty Cheaterzone?
Unfortunately, yes. It wasn't always like this.. however in the current state, it is.
I hate life. My dad bought an RTX 3060 2 years ago because it was the best on the market, and now it's FUCKING ASS for Fortnite - and not only do I have to buy a new GPU, I have to buy a new CPU and a new motherboard BECAUSE THE CPU I HAVE IS FROM 2012
I think the GPU is still good, bro. FN changed its game engine to UE5 in Chapter 4, and that's probably why your old CPU wasn't working as well in Performance Mode. If you upgrade your platform, the GPU should still be good. 👍
@@OverseerPC-Hardware The RTX 3060 is ass, and of course the CPU from 2012 is ass too, but for 2024 the RTX 3060 is a bad card, and the RTX 4060 in your video is the same level of shit card. So 6 or 8 GB of VRAM was already not enough in 2023, and you think 2024 will change anything?
Even if he got the 12 GB version of the RTX 3060, it is still an ass card, because it is only as powerful as an RX 5700, and we all know that card is old and doesn't have enough power to deliver 144 FPS in modern games. And low settings are for no one, because low settings just make you a worse player - your graphics matter a lot; "Draw Distance" and "FOV" matter a lot but eat more FPS. If you can see the enemy earlier, you get an advantage, instead of not seeing anything and getting wrecked, then crying that your 1% lows were good.
As a person who has been on the 3060 12 GB since launch: it's still got enough horsepower for another generation AT 1080p. The RX 7600 and RTX 4060 perform marginally better but have less VRAM on a smaller bus.
You'll benefit more from a platform change. With a newer CPU/platform (AM4/AM5, LGA1700) you'll have access to PCIe 4.0, ReBAR and DDR4/5 memory, which is much faster - with LGA1700 spanning a few Intel generations, same with AM4 and (hopefully) AM5. The cost of an upgrade from the 3060, even if you sell it on, is a minimum of ~$450. It's only around the 4070 and 7700 XT/7800 XT that you'll see a performance leap worth the effort.
On the 3060, with DLSS Balanced at 1080p + DLSS Ray Reconstruction, you can play Cyberpunk with Med/High ray tracing at 70-80 FPS, and with DLSS Performance you can even go to RT Overdrive.
TL;DR: the 3060 is fine at 1080p so long as you're fine playing with settings. It's really that old CPU that's ruining your Fortnite experience, and because it's a competitive game, FPS matters more than graphical fidelity - at 1080p you're more likely CPU-bound.
@@allxtend4005 What are you on about? The 3060 is way better than what most people have.
Copying PC Builder with identical titles and thumbnails lolololol
Just experimenting with the algorithm bro.
Title style, yes... but the thumbnail? I'd say mine is quite different.
Thanks for the feedback and viewing the video. 👍
This guy has no idea what competitive play, or even hardware, really is; he just knows he's a fanboy.
This channel is really bad when it comes to reviews.