Didn't actually expect that card to run anything modern, count me surprised. The end result does look like an oil painting or a slightly melted wax sculpture, but it runs, so that's a definite win.
Passed on a 1060 3GB to another YouTuber (Mikes Unboxing) to see if 3GB could still kick it today. Yep, still does the trick (with some graphical compromises).
I'm rocking the Asus TUF FX505DD laptop with a Ryzen 5 3550H and a GTX 1050 3GB, and to be fair the 3GB VRAM limitation doesn't affect me at all, since I don't own any games from after 2021, with the exception of Sons of the Forest, and even that runs absolutely okay.
I still use this exact card and am pretty pleased. I don't play very many new AAA games, so my opinion is likely very skewed compared to other players on PC, but I'm okay with running expansive games with all settings on performance or low. If you're like me and you mainly play games that released before 2019, this card is highly recommended for the price, but it is barely cutting it in 2023 and will soon be near unusable.
Just get a 6700 XT. The 7700 and 7800 XT are apparently barely going to be an upgrade, and it's the main reason AMD isn't releasing them. They already released the 7800 and 7800 XT but lied and called them the 7900 XTX and 7900 XT.
I upgraded from this exact model of 1060 in December to an Arc A770!! It honestly still performed pretty good, I was even able to get Star Citizen to 40 fps on low.😂 Very surprising to see the Gigabyte model when the video started, thought Random Gaming stole mine🤣
This video came out just after I built my new PC with a 4070 Ti, and although the upgrade from my old 1060 3GB was a night-and-day difference, I only started to feel like my GPU was in need of an upgrade in the past year. Up until now, the card held up pretty damn well.
Also, the GT 440 3GiB had an impressive amount of memory for its time, especially for a low-end card from five generations earlier that sits well below the GTX 1060.
I'm still trying to scrape by with this thing even after a new build a couple of months back, but by the end of the year I'm going to have to say goodbye to this card that got me through so much in the past three years. Thinking about its older brother, the 2060 Super.
The upgrade from my GTX 660 to the 1060 3GB at the end of 2018 was huge; got it for 120 EUR. Soon the low VRAM started to irritate me, so I bought a 1070 in 2019 and used it until a few months ago. Whole different story!
With spare parts I strapped in an RX 5700 XT with the same CPU, and I shit you not, in games like Cyberpunk, The Outer Worlds and Spider-Man the card is maxed out.
I miss the good old days when you could use 700-series cards as your main GPU. I had a 780 Ti for about 2 months as my only card after selling my 2060 Super for basically what I bought it for and preordering another GPU at only a little over MSRP. It served me well, but since the 700 series isn't supported anymore you can only use it for older games (although, due to the 3GB of VRAM, at medium/low 1080p), and for new games probably with a lot of settings on low.
@@tilburg8683 What surprises me more is that until 2020, with a GTX 750, you could play any new game without any problems and didn't need 6-8GB of video memory to play at a comfortable FPS.
I have one of these cards, but last year I replaced it with an RX 6600 because newer titles, mainly Cyberpunk and Escape from Tarkov, struggled on it. Cyberpunk just ran hard against the VRAM limit and then some, so minor stutters accompanied most long drives, even if the average framerate was about 40 otherwise. Tarkov stuttered less for the most part, but it's too competitive to live with those struggles, especially since the VRAM kept me from raising LOD to max, which has the odd side effect that any magnifying optic (even ones with optional magnification that are set to 1x) WILL set LOD to max on the fly, causing the card to suddenly need to load in a bunch of stuff that wasn't getting rendered before, freezing the game for a split second. Right when you want to shoot someone. Strangely enough, the VRAM never bottlenecked it and GPU usage was always glued to 100%; it just put a hard limit on how much stuff could be loaded in at any point in time. It still overclocked nicely and was well cooled.
Still rocking the 1060 6GB; the only real problems seem to be driver related. I want to upgrade, but I bought a premade last time, when they were about the same price as the GPU alone, and I know I'll probably need to upgrade my PSU, which is a large part of the cost of a new system reusing some of my parts. I think I'm stuck until something goes out; I'm not a heavy gamer, so I still have some older game options.
Costco has had some screaming deals on prebuilts, off and on, since the end of 2022, some even with a 3060 12GB for $800 all-in. So it may be worth keeping tabs on that, in case they list a deal that undercuts a build or doesn't cost much more than an upgrade.
What driver issues are you having? There is a unique Vulkan driver bug which kills Yuzu on the 900 series, but I'm fairly certain the 10 series is not affected; they had a similar bug but fixed it.
I really don't get the criticism against FSR and why it's inferior to DLSS. FSR is saving a lot of cards out there, bringing their owners the possibility to enjoy even the latest games to an acceptable degree. It's an "anti planned obsolescence" technology and we should be grateful to have it.
I finally got hold of some games that have FSR and I thought that 'Quality' and 'Balanced' looked close enough to native to use it without issue. Sure, it's not as good as DLSS and sometimes XeSS can have the edge. But FSR 3.0 is up next and promises to improve on a lot of the issues. At the end of the day, it works on Intel/AMD/Nvidia and costs us not a penny to use for extra fps. Some game devs implement it far better than others, seems to be the problem.
True, I mean if you're budget constrained you're likely to have a small monitor, where the image is not that bad unless you put it on Ultra Performance mode.
Some updates ago, I watched the 1060 3GB and 7970 3GB running Hogwarts Legacy. The GeForce had good frames at 1080p Quality FSR, but the textures were often completely missing, leaving just a hole in their place. The 7970 didn't have that problem; instead, anything above 720p Balanced FSR was too much for it. I tried it on my R9 380 4GB though and was pleasantly surprised. While it couldn't hold 60fps in its dreams, it was still pretty stable gameplay at around 40-50 (can't remember now), with all the textures looking fine at 1080p Quality FSR. It was a good gaming experience. So I'd really like to see that comparison with a 1050 Ti; they might just be too close in many situations. If not in frame rate, then in stability and/or overall looks.
My first PC build was at the end of my undergrad and used this card with a Ryzen 3 1200, and it was a midrange dream until Shadow of the Tomb Raider stuttered like a madman. It was a great time though.
Back when I was building my first ever PC (I'm still using it), the 1060 3GB version was, due to discounts, cheaper than the GTX 1050 Ti, which was my only option for the budget. So I'm using the GTX 1060 3GB version with an i5 9400F. It ran pretty well, at least for what it cost me (the total build cost around 45k INR, which inclusive of all taxes should be around $450).
Tough choice, but a good discount. The 1050 Ti is pretty slow, the 1060 3GB too VRAM starved. I decided to sidestep the choice and get the 970, which has plenty of VRAM but other gremlins: the driver quality has been suffering with unique bugs lately (Yuzu is broken), and it doesn't do variable refresh rate displays. Got it for €80 used in 2019. I'll be replacing it now; it's been becoming less reliable in spite of a repaste/repad, and then the fan controller failed, but it's nearly 10 years old, so, fine I guess? Aiming for €160 for the 2060 Super, which will take some hunting, but if I get too frustrated I can grab the standard 2060 for around €130. The 3060 12GB is the dream but well outside my capability.
I just replaced my GTX 970 Strix with an RX 7900 XT Hellhound. In the end the memory was the thing holding it back the most. I can't imagine having only 3 ☠️
I really love the untextured trees in that Halo game there... I actually wish they would make a cartoony Halo game with nonexistent textures like that.
Hey, been watching your channel for a while now. I'm giving my buddy my 3060 and saving up for an RX 7900 XT, so I'm going back to this card for a bit. Thanks for the vid!
Had the GTX 1060 3GB in my Dell prebuilt that I upgraded from (I know, I shouldn't have bought a prebuilt), but it's good for older games like The Settlers 3 and AVP2, even CS:GO. After that I saved up for a custom rig from PC Specialist; now I'm running a Ryzen 5 2600, 16GB of RAM and a Palit single-fan 2060. I only play at 1080p, so I'm happy with my current setup. PS: sorry to have rambled so much.
I have a GTX 1050 3GB version in a laptop and I can say it is very good. It surprises me almost every day with its performance. I played Uncharted, Hogwarts Legacy, Death Stranding, newer and old Assassin's Creeds, RDR2 and a lot of other less demanding/older games. I play at 1080p medium and usually lock the newer games to 30 fps via RivaTuner; I hate stuttering/frame drops from 30 fps. But with overclocking and undervolting both the GPU and CPU, the games run very well, no dips below 30 or stuttering. So I have to say 3GB VRAM still holds up, unless the game is very badly optimized. But then you just have to wait for patches and play it then.
Oh, and a tip for playing at 30 fps: turn off in-game vsync and the fps lock, turn on half refresh rate vsync (if you have a 60Hz display) in the Nvidia Control Panel, and lock the game to 30 fps via RivaTuner.
This is why PC gaming is at its worst as well, because it's people like you who say "games are so unoptimized, omg I can't even run them well on my 8 YEAR OLD PC!!" The answer is no, 3GB is definitely NOT holding up; even 4GB of VRAM is dead and irrelevant in 2023, so idk what you are smoking if you think 3GB still holds up in 2023, especially when even a non triple-A title like Halo Infinite has already shut out people with only 3GB of VRAM.
I was playing Halo Infinite on a 1060 3GB since the game's release. It was a graphical struggle even running on LOW, but I ended up working my way up to Platinum 5 in ranked Slayer, so I guess that speaks to its playability. The card had a lot of issues loading larger maps (think BTB), but I stuck with ranked Slayer most of the time anyway. The nail in the coffin was when 343 released Season 3 of Infinite, where they cut off people with less than 4GB of VRAM.
Hey, I've still got this card! Bought it in 2017 when money was very tight, and it's served me well. Looking at the more mainstream RDNA3 cards and the RTX 4060, then I'll look at changing.
Still using my R9 290 4gb from 2015 and videos like this only reaffirm to me that I've no reason to upgrade other than for VR maybe. Me back in 2015 thought my system would've been dead in 7 years lmao, love my "old" PC.
Still running this card in 2024! Honestly, because I'm only playing older games at the moment, I'm well off. However, I do want to play Cyberpunk for instance, and I'll need to get a new card someday even if I have barely any money left most of the time. I don't want to play Cyberpunk with bad settings. I don't want to do that to myself.
I didn't know that this video was about the 1060. It's a surprise, given that I have a 1060. However, the one that I have is the Superclocked version, which has 6GB, instead of the 3. For what it is, it works like a dream, even to this day. I can run games at medium and high to reach around 45-60fps (depending on the game).
The 1060 6GB is the original card and very nice. SC or not SC is a tiny difference, no matter, though EVGA is always just pleasant and fairly reliable, so good choice.

The 3GB is Nvidia's cost-cutting scam released quite a while later; it misses so much capability that it should have had a more distinctive name. You find many people not paying attention, or not understanding the difference, and ending up with a much inferior product. DX11-heavy testing and not enough warning from the press are also to blame.

Nvidia's scammy naming goes on, ever more brazenly. The 3060 (12GB) got silently supplanted by the 3060 8GB, which is missing a bunch of hardware and loses 20% performance, making it compete with the original 2060 (6GB) and not always winning, and falling well short of the 2060 Super, at a measly €30 discount from the real 3060. The full 12GB version of the 3060 is very nice though!

Sometimes there are nice surprises. The 2060 Super is not a 2060; it's effectively a more power-frugal 2070. Very good. I will not complain about over-delivering.
According to some benchmark pages, an RX 6500 XT would be just as fast, and considering the negative reviews when it launched, you might be able to get one for less money. And even if not, it probably uses less power and it's a lot newer.
I first read the title as "When you have 32GB of RAM in 2023..." and was confused and worried. Surely I hadn't fallen that far behind the curve on that front yet?
I never did understand the 3G choice for this card. All I can figure is that there were a bunch of 512 MB memory chips floating around for really cheap.
@@FacialVomitTurtleFights I think some of the 970s had a full 4GB; at least mine never showed 3.5 in any games, nor did it slow down when going over 3.5GB usage. Pretty pleased with it, and it definitely gained itself a place on my shelf even after its retirement. Can't say the same for my 4690K; that poor thing didn't age so well.
@@FacialVomitTurtleFights That cannot be done, I don't think. The 970 uses the same chip as the 980 and the same physical memory bus layout as the 980, with 256 data pins. But they split the bus into a 224-bit bus and a 32-bit bus to artificially make it slower, and there's chip-select logic so the two cannot run at the same time. If you wanted to reuse the chip from the 1060 and split its 192-bit bus into two chip-select buses 96 bits wide, one carrying 3GB and another 1.5GB for a total of 4.5GB and some cost saving, it would also necessarily become horribly slow.

The 970 just about works out OK, since the last half-a-gig sits effectively idle, just storing the desktop window manager, and it's still faster to have that slow half-gig than to fetch data via PCI Express. As far as games are concerned, you're missing maybe 200MB of fast VRAM compared to a full 4GB card, and they might not even have a metric to detect the difference. I wouldn't expect to see this trick done ever again, especially now that they can just lock down performance because the firmware is hardware signed.

I think the reason they made the 1060 3GB is that Nvidia sometimes releases something REALLY good and then has a hard time competing against itself, with people plainly not upgrading for 3-4 generations. So they try to release cards with deliberately not enough VRAM, even if they have to scam consumers into buying them by reusing a product name and not differentiating them well enough. If a card is too slow, you just dial down the settings. If you have a DX12/Vulkan title that needs a base amount of VRAM that you don't have, well, that's game over. Planned obsolescence in the truest sense of the word.
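To put rough numbers on the split-bus point above, here is a back-of-the-envelope sketch. It assumes the 970's nominal 7.0 GT/s GDDR5 and uses the standard peak-bandwidth formula (bus width in bytes times transfer rate); the helper function is just for illustration.

```python
# Peak GDDR bandwidth: (bus width in bits / 8) bytes per transfer * transfers/s.
def bandwidth_gbs(bus_bits, transfer_gts):
    """Peak bandwidth in GB/s for a GDDR bus at the given transfer rate (GT/s)."""
    return bus_bits / 8 * transfer_gts

full_256 = bandwidth_gbs(256, 7.0)   # what the spec sheet implies
fast_seg = bandwidth_gbs(224, 7.0)   # the 3.5 GB partition
slow_seg = bandwidth_gbs(32, 7.0)    # the last 0.5 GB partition

print(full_256, fast_seg, slow_seg)  # 224.0 196.0 28.0
```

So the 3.5 GB segment still gets ~196 GB/s, while the last half-gig is stuck at 28 GB/s, slow for VRAM but still several times faster than fetching over PCIe 3.0 x16 (~16 GB/s), which matches the comment's point.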
I'd very much like to know how it fares in Baldur's Gate 3. The game calls for a 970 minimum, and the 1060, iirc, ever so slightly outperforms the 970 in almost every category except VRAM, where the 970 has 4.
I still have a 3GB 1060 kicking around; it was OK when it was new, but time has not been kind to it. It's great for older titles, but as you said, games these days aren't made for low-VRAM configs, and the sacrifices required to get playable fps are ridiculous. If I want to play Minecraft I'll load Minecraft, but when it's hard to tell the difference between Minecraft and something like Hogwarts, it's too much for me.
I mean, did you watch the video? The sacrifices were minimal and still ended with OK performance in most of the games. The only issues were Dead Space needing FSR, which looked fine but had stutters, and Star Wars, which needed FSR but looked decent and ran well. It's coming to the end, sure, but you can still game with various new titles and they look OK. It's not like he turned all the games into 240p.
This is my exact GTX 1060. It's been a great card for the past few years, but I'm looking to finally upgrade to something with a bit more power and VRAM. This card will still be used in a home server when I upgrade, but it's been good to me.
My textures looked like that on a GTX 1650 on low settings, and it was because of the high-res texture package that installs with the game. I removed the texture pack and it looked a lot better. I think you need 6GB to run those textures, and you have to choose not to install them, otherwise you get them anyway.
Have you tested the 6600 Lite yet? It only has 2 outputs, an HDMI and a DisplayPort, but I understand it still runs at the same clock speeds, at only 170 US dollars.
My first GPU in a PC I built from scratch. God, do I regret that purchase. I went to 8GB within 1.5 years and stayed there until I picked up a 3080 Ti a few months ago.
I have a GTX 1060 3GB. I spent around 90 Canadian dollars on it. Great for boomer shooters and big budget games from several years ago, like Gears of War 4 and Call of Duty WWII. My main PC has an AMD Radeon RX 6600.
Still rocking the 1060 3GB with an i3 8100 after 5 years; mostly playing 4-10 year old games now though, sometimes new games I like, if they're optimised to run at least 45fps average.
I was rocking a 2060 3GB GPU (in a laptop) for quite a while before my most recent upgrade. I never ran into a game I wasn't capable of running. VRAM is important, but definitely not as big of a deal as people say.
It depends, but your scenario is a laptop. Even some DX11 games during the PS4 era like Dark Souls 3, the VRAM usage was more than 4 GB at 1080p Ultra. Shadow of the Tomb Raider at 1080p Ultra was more than 6 GB VRAM. 3 GB VRAM was more than good enough for the early era of DX11 and overkill with DX 10 and older. It really depends...
Still using an Asus GTX 1060 3GB in my secondary PC for light gaming. Still pretty decent for competitive games like Valorant, but AAA games will struggle a lot unless you play at a lower resolution and settings.
Try hooking up a 32GB memory kit. I think it's the nature of some games to cache previously used textures in system memory, especially judging by the system memory usage.
I upgraded from this to a 3080 FTW3 that I got for $350. It was a night and day difference (obviously). The 1060 made a decent card for my GF's "getting into PC gaming" PC I put together, though.
5:08 Frametime is always more important than fps IMO. I can tolerate low but stable fps but not stutters.
Same.
I can, situationally. Like I was fine with a low frame rate in Elden Ring before I upgraded, because I knew that it's capped at 60fps anyway, and I wasn't going to lose any fights I was remotely prepared for by running at only 45fps on my old card. Seeing that it was steady, and wouldn't dip on me when some giant enemy swooped down out of the sky, was more important.
For those twitchy FPS titles, of course, I was looking for 100fps or better. Very fortunately, COD:MW, Cold War, and Doom (2016) were all incredible performers on even that old low-profile GPU.
Totally agree. A stable 30 FPS is better than an unstable 30-45fps.
I think 30 FPS just feels too laggy, but I agree that prioritising consistent frametime is more important.
As a virtual reality player, I gotta agree
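The frametime-over-fps point in this thread can be shown with a tiny sketch (all numbers here are invented for illustration): two runs with nearly identical average fps can feel completely different once a few long frames sneak in.

```python
# Why average fps hides stutter: compare a steady run against one
# with a few 200 ms hitches. Frametimes are in milliseconds.
def stats(frametimes_ms):
    """Return (average fps, worst single frametime in ms)."""
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    return round(avg_fps, 1), max(frametimes_ms)

steady   = [33.3] * 60                  # locked ~30 fps, every frame alike
stuttery = [25.0] * 57 + [200.0] * 3    # mostly 40 fps, but three big hitches

print(stats(steady))    # (30.0, 33.3)
print(stats(stuttery))  # (29.6, 200.0)
```

Both runs average about 30 fps, but the second spends three frames frozen for a fifth of a second each, exactly the kind of spike a frametime graph exposes and an fps counter hides.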
It's good nvidia learned not to cheap out on the RAM on new cards... OH WAIT 😂
To be fair, I don't think it's unfair to _not_ have to dedicate an SSD to a GPU because developers want to take the Crysis entitlement up another notch.
@@ZeroHourProductions407 Can you put that in English?
Oh, and if you're suggesting that VRAM is expensive you're wrong.
And if you're suggesting that video game graphics shouldn't improve over time, you're an anomaly.
Are you trying to suggest that PC games shouldn't be able to produce console levels of fidelity?
And by the way, there is no SSD that is 16 gigabytes, except if you want to call a thumb drive an SSD.
@@ZeroHourProductions407 8GB was midrange in 2017.
Your point and blaming devs in general is stupid.
Geordie, I'll learn to trust you when you learn forgery is a crime, punishable by blah blah.
I'm just kidding. Nvidia has stuck us with more or less 4-8 GB VRAM for the past 4 years (excepting the flagships, which cost as much as actual ships) and it's embarrassing.
Look at Intel's $350 Arc card with 16GB of VRAM. VRAM is not expensive, but I see tons of simps in comment sections saying "oh, Nvidia is just making smart business decisions by ripping off their customers."
Yeah, well those smart decisions caused me (a 20-year Nvidia fanboy) to jump ship. When I have to replace my current GPU (hopefully in the year 2030 😅) I'm going to be looking at Intel's Battlemage.
@@Boogie_the_cat heyyy look when I built my first pc that was actually an option lol 16gb, 32gb and 64gb SSD's
I always felt that the 3GB VRAM buffer, together with the cut-down GPU, made the GTX 1060 3GB a bad deal, especially considering that the RX 480 8GB had an MSRP just 30 USD higher and performs a lot better these days relative to even the GTX 1060 6GB.
If you were young and foolish like me, you bought the 1060 because there were A TON of nvidia fanboys to skew your decision.
@@burtiq Nvidia's mindshare was very strong back then (and still is, though these days it feels like more people are willing to consider AMD). On one hand, going with what people around you recommend makes sense and is often an OK choice, but it can backfire if those people aren't well informed themselves and/or care more about you being on "their team" than about you getting good value for money.
@@burtiq
I bought my old 1060 in 2017 because even though I wanted an RX 580, I couldn't get one for less than £330 due to miners, while I could (and did) buy a 6GB 1060 for £180. Nothing to do with fanboyism; you just can't justify paying that price difference.
@@darthwiizius I bought my 1060 when they were $20 apart, purely because of fanboyism.
And the post was about the 480, not the 580.
Wasn't this card considered a great value back then? 🤣
It's had a pretty long run, but nothing lasts forever. The performance is just borderline acceptable in the reviewed games. I caved about a month ago and picked up a cheap RX6600 to tide me over. (Very happy with it, tbh) Hopefully the situation in the sub-$300 area improves in the coming years.
The 6600 is the new budget king
My first ever dedicated GPU! Had a Palit Dual model, which has kinda mediocre cooling. Served me well until I bought an RX 6600. Donated it to my cousin to upgrade his ancient R7 250X, and oh boy, how happy he is. He mainly plays esports titles and some older MMORPGs, and it serves him well too.
Me with my 4GB of vram feeling superior
😂
*With a 1050ti*
Yeah, as I have a 290X on my 2nd setup.
Yes, me with my GTX 750TI 2GB 🤣💀
Me here chillin with gt 730
Had the 6GB version; pretty good for the time, and nice resale value when I switched to a 1070 at almost no cost.
I used one (Zotac single fan) for 3 years and all it cost me was some thermal paste and elbow grease (bought it used, so I re-pasted it, then again a day before I sold it); got exactly what I paid for it. Best value card ever; didn't even use much electricity.
There are several versions of this card: one "good" 6GB card and a "bad" one, the same with the 3GB variant, and there's even a 5GB version.
Still rocking 1060 6gb i7 laptop 🔥
I had a 3 GB EVGA 1060 as one of my very first cards. It eventually found its way into a Christmas PC present for my niece and nephew a couple of years ago.
same, had a 3GB Zotac
Same thing's going to happen with the 3060 Ti and 3070/3070 Ti versus the 6700 XT and 6800/6800 XT.
Once the Nvidia cards start getting older, the performance gap will increase and favour the larger-VRAM Radeon cards; we're already seeing the 6800 beat the 3070 in RT.
@@yahyasajid5113 comparing a 3070 to a 3gb 1060 shows you have no clue what you’re even talking about
@@dylanzachary683 not sure which comment you're reading, clearly I drew no comparison of the 3070 to the 1060
I was comparing it directly to the 6800
@@yahyasajid5113 The thing is though, once those cards get older at least the Nvidia cards will still work
Great content as always. It just occurred to me. Maybe you could make a video comparing modern low settings to old ultra settings. That'd be interesting. It's infuriating how modern low settings can look much worse than older games set to ultra, yet require more VRAM and processing power to deliver that inferior experience while the older titles run much better.
We already know the result between most remasters and originals (because why would they do a remaster unless it's a cashgrab), but something like Portal 1 vs Portal 2, Just Cause 2 vs 3, Trine 2 vs 4, Doom 2016 vs Eternal
@@RadioactiveBlueberry Portal 2 is a very beautiful game, and the highest settings can be run on a new RDNA2 iGPU, so clearly not a very high performance hit.
Went from a 7-year-old G4560 + 1050 2GB build to a 5600X + 3060 12GB build and it's such a massive difference. The old build can still handle esports titles well, but not without tinkering with a bunch of settings.
I imagine that the g4560 was also a massive bottleneck for the 1050
currently i'm stuck with r5 2600 and gtx 1050 while waiting for new gpus to come. i have probably maxed out what this setup can do with system optimizatons + overclocking and... it works (just don't expect it even launching smth new)
@@checkyboxbruh that's not even that bad, i have a i7-4700 and gtx 750ti😭, my r5 5600 and 6750xt are coming tomorrow tho🎉
@@RotcodFox Without a doubt. Had an i3-6100 combined with a GTX 750 Ti and the CPU was the bottleneck. And this was at least 3-4 years ago. Had the CPU replaced with an i5-6400T and it was a major step up with regard to consistency in performance.
@@obi-wankenobi8023 How is your experience?
I suspect this card could live its best life used as an encoder for a home media center OR as an emulation GPU for older console games. Most old consoles had very low VRAM to begin with, and if you stick with 1080p or lower, you may have a great deal of fun.
I just wanted to point out that reapplying thermal paste did not help the card in Halo. It was the drivers. Actually, your temperatures jumped up 10c after reapplying paste, likely because the drivers were actually letting the card get its workout; but it running so much better while being 10c hotter shows it wasn't the paste, and it probably would've been better to leave that out so newcomers don't think slapping on paste will fix their driver issues. Besides, 58-59c is VERY cool for a GPU. Not sure why thermal paste was even a thought.
Jesus, you are really on the ball; every time I think about a video you should make, you are already hard at work. Never stop these awesome videos. PS: I really want to see you do a review of the old Quadro K6000 or the Maxwell Quadros, since those have a massive amount of VRAM for their time, and even by today's standards, and I really want to see how they do in recent games.
i'll take a look :)
Recently I got a Tesla M40 at home, but a workstation conversion is needed.
"You guys have 3GB VRAM?" - A 1050 owner.
3:13 this pop-in jeeeez
Love the way you make videos for your viewers and not for views or anything else..Keep up the good work mate ❤
thanks :)
I actually had this card until a few days ago, when I upgraded to a used 5700 XT! I must say it served me well despite how much everyone hated it.
still use it, and I just ordered a used 3070 ti today, gonna be a hell of an upgrade
Didn't actually expect that card to run anything modern, count me surprised. The end result does look like an oil painting or a slightly melted wax sculpture, but it runs, so that's a definitive win.
Big fucking facts. I just bought a 7900xt 20gb for 800$ and tbh I don’t need it 😂 was surprised wow
it can run gtav and bf5 great, but kinda sucks at bf2042
Passed on a 1060 3GB to another RUclipsr (Mikes Unboxing) to see if 3GB could still kick it today. Yep, still does the trick (with some graphical compromises).
im rocking the asus tuf fx505dd laptop with a ryzen 5 3550h and a gtx 1050 3gb in it and to be fair the 3gb vram limitation doesnt affect me at all since i do not own any games past the year 2021 with the exception of sons of the forest and even that runs absolutely okay .
I still use this exact card and am pretty pleased. I don't play very many new AAA games, so my opinion is likely very skewed compared to other players on pc, but I'm okay with running games that are expansive with all settings on performance or low. If you're like me and you mainly play games that released before 2019, this card is highly recommended for the price, but it is barely cutting it in 2023 and soon will be near unusable.
I had the 1060 3gb for such a long time. It's good to see that it can still run games!
That's my config today ! 12400f paired with my glorious GTX 1060 3gb. I plan to change with a RX 7700 XT card in a few months.
just get a 6700xt. The 7700 and 7800xt are barely going to be an upgrade apparently, and its the main reason AMD are not releasing them. They released the 7800 and 7800xt already but lied and called em the 7900xtx and 7900xt.
Nice! I still have the 12400f too. It’s incredible
Get rx 6800, 16gb vs 12 on 7700xt, should be even in perf, only difference would be power usage
@@noticing33 yeah, it's my second choice... I wait for a sale
I upgraded from this exact model of 1060 in December to an Arc A770!! It honestly still performed pretty good, I was even able to get Star Citizen to 40 fps on low.😂 Very surprising to see the Gigabyte model when the video started, thought Random Gaming stole mine🤣
How is a770 working? I am thinking of getting one
The most impressive thing in your comment was the fact that you willingly bought an Arc GPU.
@@NoThisIsntMyChannelarc gpu isn't that bad, it's just not polished for older games
Nice which one did you get?
This video came out just after I built my new pc with a 4070ti, and although the upgrade from my old 1060 3gb was a night & day difference, I only started to feel like my gpu was in need of an upgrade in the past year. Up until now, the card held up pretty damn well.
Im still playing on mx110 which is 2gb vram and i am happy
Also, the GT 440 3GiB had an impressive amount of memory for its time, especially for a low-end card five generations before the GTX 1060, and lower end than it.
I'm still trying to scrape by with this thing even with a new build a couple of months back, but by the end of the year I'm going to have to say goodbye to this card that got me through so much in the past three years. Thinking about its older brother, the 2060 Super.
The upgrade from my gtx 660 to the 1060 3gb at the end of 2018 was huge, got it for 120 eur. Soon the low vram started to irritate me, bought a 1070 in 2019 and used it until a few months ago, whole diff story!
i traded this exact card for some tree. it served me well and that was some damn good weed for it.
😂
1060 3gb here. I have yet to find a game worth the upgrade.
Currently still using this card, does me well still my cpu is holding me back more in some more modern games i play (i5-4670)
With spare parts I strapped an Rx 5700 xt with the same CPU and I shit you not in games like cyberpunk, the outer worlds, spiderman, the card is maxed out
Man, this is R9 290x I got about a decade ago is still kicking & I don't know how, but I'm not going to question it.
I remember back when a 1060 3GB was your main system GPU, doesn't seem like long ago but it was. I think I first started watching in the G3258 days.
I had a 2GB 760 up until recently, power was never its issue but holy crap that VRAM really became a problem in the last year or two
I miss the good old days when you could use 700 series cards as your main gpu. I had a 780ti for about 2 months as my only card after selling my 2060 Super for basically what I bought it for and preordering another gpu at only a little over MSRP. It served me well, but since the 700 series isn't supported anymore, you can only use it for older games, despite the fact it could still run them (although, due to the 3GB of VRAM, at medium/low 1080p), and for new games probably a lot of settings on low.
@@tilburg8683 What surprises me more is that until 2020, having a gtx 750, you could play any new game without any problems and did not need to have 6-8GB of video memory to play with a comfortable FPS
on the Halo Infinite clip, take a look at the GPU Power value. No wonder such a big improvement.
I have one of these cards, but last year I replaced it with an RX 6600 because it struggled on newer titles, mainly Cyberpunk and Escape from Tarkov.
Cyberpunk just ran hard against the VRAM limit and then some, so minor stutters accompanied most long drives, even if the average framerate was about 40 otherwise.
And Tarkov stuttered less for the most part, but it's too competitive to even live with those struggles, especially since the VRAM kept me from raising LOD to max, which has the odd side effect that any magnifying optic (even ones with optional magnification that are set to 1x) WILL set LOD to max on the fly, causing the card to suddenly need to load in a bunch of stuff that wasn't getting rendered before, freezing the game for a split second. Right when you want to shoot someone.
Strangely enough the VRAM never bottlenecked it and the GPU usage was always glued to 100%, it just put a hard limit on how much stuff can be loaded in at any point in time. It still overclocked nicely and was well cooled.
Time is running fast. Nowadays 1060 is low end gpu..
To be fair, its life as a mid-range card was like 4-5 years, when gpus like the gtx 960 went from mid to low in 1-2 years :(
@UnjustifiedRecs the 6gb version cost over 300Dollars in 2016, pretty much midrange.
I would have said it was a low end card at its time, but this card is still my daily driver and it still plays cod somehow.
Still rocking the 1060 6gb, only real problems seem to be driver related. I want to upgrade but I bought a premade last time when they were about the same price as the GPU alone and I know I'll probably need to upgrade my PSU which is a large part of the cost of a new system reusing some of my parts. I think I'm stuck until something goes out, I'm not a heavy gamer so I still have some older game options.
Costco has had some screaming deals on prebuilds, off and on, since the end of 2022, some even with a 3060 12GB for $800 all-in. So it may be worth keeping tabs on that. In case they list a deal that undercuts a build or isn't costing much more than an upgrade.
What driver issues are you having? There is a unique Vulkan driver bug which kills Yuzu on the 900-series but i'm fairly certain 10-series is not affected, they had a similar bug but fixed it.
I really don't get the criticism against FSR and why it's inferior to DLSS. FSR is saving a lot of cards out there, bringing their owners the possibility to enjoy even the latest games to an acceptable degree.
It's an "anti planned obsolescence" technology and we should be grateful to have it.
I finally got hold of some games that have FSR and I thought that 'Quality' and 'Balanced' looked close enough to native to use it without issue. Sure, it's not as good as DLSS and sometimes XeSS can have the edge. But FSR 3.0 is up next and promises to improve on a lot of the issues. At the end of the day, it works on Intel/AMD/Nvidia and costs us not a penny to use for extra fps. Some game devs implement it far better than others, seems to be the problem.
Fsr definitely more friendly to the pocket
True, I mean if you are budget-constrained you likely have a small monitor, where the image is not that bad unless you put it on ultra performance mode
It's amazing to see how much more my card, the GTX 1060 with 6GB, can do compared to the 3GB version.
Thanx for another great video , i always look forward to your content , cheers .
Some updates ago, I've watched 1060 3GB and 7970 3GB running Hogwarts Legacy. Geforce had good frames at 1080p quality fsr, but the textures were often completely missing, leaving just a hole in their place. 7970 though, it didn't have that problem - as anything above 720p balanced fsr was too much. I've tried it on my R9 380 4GB though and was pleasantly surprised. While it couldn't hold 60fps in its dream, it was still pretty stable gameplay at around 40-50, can't remember now, with all the textures looking fine at 1080p quality fsr. It was a good gaming experience. So I'd really like to see that comparison with a 1050ti, they might just be too close in many situations. If not in frame rate, then in stability and/or overall looks.
I bought the $87 2GB and it seems to work fairly well.
My first pc build was at the end of my undergrad and used this card with a ryzen 3 1200, and it was a midrange dream until shadow of the tomb raider stuttered like a mad man. It was a great time though
Back when i was building my first ever pc (i am still using it) the 1060 3gb version was due to discounts cheaper than gtx 1050ti which was my only option for the budget
So i am using the gtx 1060 3gb version with an i5 9400f
It ran pretty well, at least for what it cost me (the total build cost around 45k INR, which, inclusive of all taxes, should be around $450)
Tough choice, but good discount amount. 1050ti is pretty slow, 1060/3gb is too VRAM starved. I decided to side step the choice and get the 970, which has plenty of VRAM but it has other gremlins - the driver quality has been suffering with unique bugs lately (Yuzu is broken), and it doesn't do varying refresh displays. Got it for 80€ used in 2019. I'll be replacing it now, it's been becoming less reliable in spite of repaste/repad and then fan controller failed, but it's nearly 10 years old, so, fine i guess? Aiming for 160€ for the 2060Super, which will take some hunting, but if i get too frustrated i can grab the 2060 standard for around 130€. 3060/12GB is the dream but well outside my capability.
i upgraded from 2gb 750ti to 3070 and it's already considered low Vram lmao
haha im going from the 3gb 1060 to a 3070 ti, and im so curious, how long will it hold up, 1060 lasted 6 years, and still going strong
Bruh, I remember when 2gb was considered high-end
I remember when 512mb cards were over $300.
In the 2005-2012 era probably.
@@NikosM112 thereabouts, first 2gb card were released in 2008
I still use a 2GB GTX 960
I remember 2012 when I bought my HD7770 with 1GB I felt 2 gb was overkill :)))
This guy did the first resident evil 4 test after the game got cracked. Guess is a pirate here, welcome to the gang.
I just replaced my gtx 970 strix by a rx 7900xt hellhound .
In the end the memory was the thing holding it back the most..
I can't imagine having only 3 ☠️
I really love the untextured trees in that halo game there... I actually wish they would make a cartoony halo game with non existent textures like that
Nintendo 64 Halo
Hey been watching your channel for a while now im giving my buddy my 3060 and saving up for a RX 7900 xt so im going back to this card for a bit so thanks for the vid
Had the Gtx 1060 3gb in my Dell prebuilt that I upgraded from (I know, I shouldn't have bought a prebuilt), but it's good for older games like The Settlers 3 and AVP2, even CS:GO. After that I saved up for a custom rig from PC Specialist; now I'm running a Ryzen 5 2600, 16gb of ram and a 2060 Palit single fan card. I only play at 1080p, so I am happy with my current setup. PS: sorry to have rambled so much.
I have a gtx 1050 3GB version in a laptop and I can say it is very good. Surprises me almost every day with performance. I played Uncharted, Hogwarts Legacy, Death Stranding, newer and old Assassin's Creeds, RDR2 and a lot of other less demanding/older games. I play at 1080p medium and usually lock the newer games to 30 fps via Rivatuner. I hate stuttering/frame drops from 30 fps. But with overclocking and undervolting both the gpu and cpu, the games run very well, no dips below 30 or stuttering. So I have to say 3gb vram still holds up, unless the game is very badly optimized. But then you just have to wait for patches and play it then
Oh, and a tip for playing at 30 fps: turn off in-game vsync and the fps lock, turn on half refresh rate vsync (if you have a 60hz display) in nvidia control panel, and lock the game to 30 fps via Rivatuner
this is why PC gaming is at its worst as well, because it's people like you who say "games are so unoptimized omg i cant even run it well on my 8 YEAR OLD PC!!" The answer is no, 3GB is definitely NOT holding up; even 4gb vram is dead and irrelevant in 2023, so idk what you are smoking if you think 3gb still holds up in 2023, especially when even a non triple-A title like Halo Infinite already shut out people with only 3gb vram
@@Eleganttf2 Cry more.
@@wyterabitt2149 you sure do
I was playing Halo Infinite on a 1060 3GB since the game's release. It was a struggle on graphics even running on LOW but i ended up working my way up to Platinum 5 on ranked slayer so i guess that speaks to the ability to play. The card had a lot of issues loading larger maps (think BTB) but i stuck with ranked slayer most of the time anyway.
The nail in the coffin was when 343 released Season 3 of Infinite, where they cut off people with less than 4GB of VRAM.
Hey, I've still got this card! Bought it in 2017 when money was very tight and it's served me well. Looking at the more mainstream RDNA3 cards and the RTX 4060, then I'll look at changing
I still have a gtx 950 that I bought in 2015 or 2016. It has acceptable performance for 720p, but the 2gb of vram fills up in almost every game.
My friend just gave me a great deal on a 6gb 1060 for my home media server. Sold it to me for $25! Looking forward to learning linux with that build.
Would like to see how RX590 or Radeon VII would do in 2023. They had 8gigs of VRAM too if I'm not wrong
Still using my R9 290 4gb from 2015 and videos like this only reaffirm to me that I've no reason to upgrade other than for VR maybe.
Me back in 2015 thought my system would've been dead in 7 years lmao, love my "old" PC.
And now developers are like "yo, we need you to install an nvme in your gpu so you can run our game at 480p"
That's EA for you...
Wtf what games are these?
I miss my r9 280 😭 overclocked very well just a shame no dx12
Still running this card in 2024! Honestly, because I'm only playing older games at the moment, I'm well off. However, I do want to play Cyberpunk for instance, and I'll need to get a new card someday even if I have barely any money left most of the time. I don't want to play Cyberpunk with bad settings. I don't want to do that to myself.
I didn't know that this video was about the 1060. It's a surprise, given that I have a 1060. However, the one that I have is the Superclocked version, which has 6GB, instead of the 3. For what it is, it works like a dream, even to this day. I can run games at medium and high to reach around 45-60fps (depending on the game).
1060/6GB is the original card and very nice. SC or not SC tiny difference, no matter, though EVGA is always just pleasant and fairly reliable, so good choice. 3GB is NVidia's cut cost scam released a longer time later, it misses so much capability that it should have had a more distinctive name. You find many people not paying attention or not understanding the difference and hooking up with a so much inferior product. DX11 heavy testing and not enough warning by the press also to blame.
NV's scammy naming goes more brazenly on. 3060 (12GB) got silently supplanted by the 3060/8GB which is missing a bunch of hardware and loses 20% performance, making it compete for performance with the original 2060 (6GB) and not always winning, and being well short of 2060Super in performance, at measly 30€ discount from the real 3060. The full 12GB version of 3060 is very nice though!
Sometimes there are nice surprises. 2060Super is not a 2060, it's effectively a more power frugal 2070. Very good. I will not complain about over-delivering.
Will be interesting to see how it will handle Unreal 5 engine games
Yeah I don’t think it’ll go well haha
According to some benchmark pages, an RX 6500 XT would be just as fast - but considering the negative reviews when it launched, you might be able to get one for less money. And even if not, it probably uses less power and it's a lot newer.
Nice content as always😁
I think I'll give my 2060 a try on my channel😅
I would like to see a second part of this video, targeting native 720p @60 fps and 1080p @30 fps gameplay.
I first read the title as "When you have 32GB of RAM in 2023..." and was confused and worried. Surely I hadn't fallen that much behind the curve on that front yet?
Bought one for my friend. They've been enjoying it. it's a decent deal
What do you think is the best to buy second hand:
GTX 1080 8gb
or
RTX 2060 6gb
I literally still run one exactly, but its age is really visible. I'm gonna upgrade very soon
I never did understand the 3G choice for this card. All I can figure is that there were a bunch of 512 MB memory chips floating around for really cheap.
Seems like nvidia should have reused gtx 970 memory or something, 3.5gb (+.5?) or whatever haha
@@FacialVomitTurtleFights I think some of the 970s had the full 4gb; at least mine never showed 3.5 in any games, nor did it slow down when going over 3.5gb usage. Pretty pleased with it, and it definitely gained itself a place on my shelf even after its retirement. Can't say the same for my 4690k, that poor thing didn't age so well
@@joey_f4ke238 They had 4GB of memory, but the last .5GB was slower than the first 3.5GB.
Google something like "970 vram 3.5gb?" to find out more.
@@FacialVomitTurtleFights That cannot be done i don't think. 970 uses the same chip as 980 and same memory bus layout physically as the 980, with 256 data pins. But they split the bus into the 224-bit bus and 32-bit bus to artificially make it slower, and there's a chip select logic so the two cannot run at the same time. If you wanted to reuse the chip from the 1060 and split its 192 bit bus into two chip-select buses 96 bit wide, one carrying 3GB and another 1.5GB for a total of 4.5GB and some cost saving, it would also necessarily become horribly slow. The 970 just about works out OK, since the last half-a-gig become effectively idle and just storing the display window manager, and it's still faster to have that slow half-gig than to fetch data via PCI Express. As far as games are concerned, you're missing maybe 200MB fast VRAM compared to a full 4GB card, or they might not even have a metric to detect the difference. I wouldn't anticipate to see this trick done ever again, especially now that they can just lock down performance because the firmware is now hardware signed.
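The split-bus explanation above can be sanity-checked with a quick back-of-envelope calculation. This is just a sketch, assuming the 970's stock 7 Gbps effective GDDR5 data rate (not a figure stated in the thread):

```python
# Rough peak-bandwidth math for the GTX 970's partitioned memory bus,
# assuming a 7 Gbps effective GDDR5 data rate per pin.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

full_256 = bandwidth_gbs(256, 7.0)  # a full 980-style 256-bit bus -> 224.0 GB/s
fast_224 = bandwidth_gbs(224, 7.0)  # the 970's fast 3.5 GB partition -> 196.0 GB/s
slow_32  = bandwidth_gbs(32, 7.0)   # the 970's slow 0.5 GB partition -> 28.0 GB/s

print(full_256, fast_224, slow_32)
```

The numbers illustrate why the last half-gig is best left idle: the 32-bit partition delivers only a seventh of the fast partition's bandwidth, though still quicker than fetching over PCI Express.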
I think the reason they made the 1060/3gb is because NVidia's problem is that they sometimes release something REALLY good and then have a hard time competing against themselves, with people plain not upgrading for 3-4 generations. So they try to release cards with deliberately not enough VRAM, even if they have to scam consumers into buying them by reusing a product name and not differentiating them well enough. If a card is too slow, you just dial down the settings. If you have a DX12/Vulkan title that needs a base amount of VRAM that you don't have, well that's game over. Planned obsolescence in truest sense of the word.
The hardware requirements of these new games are inexcusable, considering the graphical fidelity of those games... or lack thereof.
Yeah, especially on the raster graphics side
try testing the 2060 next, we gotta see how well it does in 2023
I'd very much like to know how it fares in Baldur's Gate 3. The game calls for a 970 minimum; the 1060, iirc, ever so slightly outperforms the 970 in almost every category except VRAM, where the 970 has 4
I still have a 3gb 1060 kicking around, it was ok when it was new, but time has not been kind to it. It’s great for older titles, but as you said, games these days aren’t made for low vram configs, and the sacrifices required to get playable fps are ridiculous, if I want to play Minecraft I’ll load Minecraft, but when it’s hard to tell the difference between Minecraft and something like hogwarts, it’s too much for me.
Me too - EVGA. Keeping it for old times sake
I mean, did you watch the video? The sacrifices were minimal and still ended with OK performance in most of the games. Only issues were Dead Space needing fsr, which looked fine but had stutters, and Star Wars that needed fsr but looked decent and ran well.
It's coming to the end sure, but you can still game with various new titles and it look OK.
It's not like he turned all the games into 240p.
This is my exact gtx 1060. It's been a great card for the past few years, but I'm looking to finally upgrade it to something with a bit more power and vram. This card will still be used in a home server when I upgrade, but it's been good to me.
Depends on which games you play . If you play older titles like league etc u don't need
My textures looked like that on a gtx 1650 on low settings, and it was because of the high texture package that installs with the game. I removed the texture pack and it looked a lot better. I think you need 6 gb to run those textures, and you have to choose not to install them, otherwise you get them anyway.
Have you tested the 6600 lite yet? It only has 2 outputs, an HDMI and a DisplayPort, but I understand it still runs at the same clock speeds, at only 170.00 US dollars.
I am amazed at the load it can run with this 3gb. The 1060 3G was a beast of its time.
i had 3gb just last year and honestly i didnt have big issues with most games. some new big games had stutters (forza mostly)
I had this card. It was much faster than the 1050-Ti, but VRAM was an issue in 2018 already.
i would love to see kerbal space program 2 in your benchmark list as it is literally the most graphically demanding game
My first GPU in a PC i build from zero. God, do I regret that purchase. Went into 8GB within 1.5 years and stayed there until I picked the 3080Ti a few months ago.
I have a GTX 1060 3GB. I spent around 90 Canadian dollars on it. Great for boomer shooters and big budget games from several years ago, like Gears of War 4 and Call of Duty WWII. My main PC has an AMD Radeon RX 6600.
It's sad that my 760 has 4gb but a 1060 has 3gb.
Back in 2016 I bought a 6GB version of a GTX 1060 because even at that time, seven years ago, 3GB were... discouraging, to say the least.
I beat Hogwarts Legacy using this card. I put everything on low except textures on ultra, which made the game look decent, and also used FSR obviously.
Ooh I had this one, ran very well and cool. But warzone 1 was the game that made me retire it because of the 3gb buffer…
Still rocking a 1060 3gb with an i3 8100 for 5 years now. Playing mostly 4-10 year old games these days, sometimes new games I like, if they're optimised to run at least 45fps average
I was rocking a 2060 3GB GPU (in a laptop) for quite a while before my most recent upgrade. I never ran into a game I wasn't capable of running. VRAM is important, but definitely not as big of a deal as people say.
I agree
Just be cautious with the settings
It is a big deal. 4gb vram is the minimum you'd want even for modern esports titles.
@@NikosM112 not really. 3GB runs most games perfectly fine aslong as you're not maxing out settings lol
It depends, but your scenario is a laptop. Even some DX11 games during the PS4 era like Dark Souls 3, the VRAM usage was more than 4 GB at 1080p Ultra.
Shadow of the Tomb Raider at 1080p Ultra was more than 6 GB VRAM.
3 GB VRAM was more than good enough for the early era of DX11 and overkill with DX 10 and older.
It really depends...
@@takehirolol5962 That's why you shouldn't be an idiot and crank everything to ultra.
Still using an Asus GTX 1060 3 GB in my secondary pc for light gaming. Still pretty decent for competitive games like Valorant, but AAA games will struggle a lot unless you play at a lower resolution and settings.
Had a pavilion 1060 3gb for a while till i got the 3050 8gb, it gets a lot of hate but man it kicks ass
Try hooking up a 32gb memory kit; I think some games by nature tend to cache previously used textures in system memory, especially judging by the system memory usage.
Me with 2GB of VRAM xD
I upgraded from this to a 3080 FTW 3 that I got for $350. It was a night and day difference (obviously)
The 1060 made a decent card for my GF’s “getting into PC gaming” PC I put together though
Still using a 970ti 3GB, just about holding on. Want the next gpu I get to last, so will probably go with a 4080 or rx 7900 xtx