This is true for me. Most of my installed PC library is from the previous decade or earlier (with a few awesome exceptions) and on the console side, I bought a Series X specifically because of the backwards compatibility program.
the main reason i am holding on to aging hardware (literally running it until it dies) is because i am poor AF on top of everything getting a price hike with each new release
@@TheRealDlo and there are increasing numbers of people who fit into this category. Prices are going up but wages aren't. We're living in a new Gilded Age, people... for every rich person there are a hundred poor, so that they may hoard the wealth. I think this (broke ppl) is the real reason that the RX 580 has such good numbers on Steam. Because it's recently become a "budget" (
I feel ya. Was on a 3770k & 760GTX for the longest. Current machine is a used workstation with a refurb card. No way I'd buy new (esp nvidia) unless the vid card was makin bank. Scanner and printer are at least a decade old. But with the weird new subscription stuff companies are going with they really would have to take them from my cold dead hands.
And because we don't need to since our existing systems still play all the games we like. I used to NEED to upgrade to get performance needed to play what I wanted. Paying big bucks to get a difference in performance that I don't even notice? no thanks.
No lies detected. I am playing Space Marine 2 in 4K on medium settings. I can play DCS at max settings. I run an RTX 2070 8GB GDDR5. I have 48GB of 3200MHz RAM by Crucial and an Intel i7 6700K 4.00GHz CPU. I have had this PC for 8yrs only changing from an 6150Ti to the RTX 5 years ago.
Yes we getting rinsed and even Nvidia shaft the retailers that move their stuff. We had a Black Friday last week and Nvidia did not give retailer permission to discount on the 40 serires..... The 40 series has just been an abusive relationship from NVidia and I hope that everyone sees the light and boycots the 40 series....... poor perfomance per $$ (especially if you are trying to upgrade from 3060 ti), and the Adobe like customer abuse...... I will be AMD for next couple of years when I upgrade and sell on, cause Nvidia can go and kick rocks now..... they are troughing/trolling/rinsing their customer base and they can get Fkd!
1060 here. I haven’t upgraded because the value proposition for “mid range” cards has been obliterated. Hoping things get better, but the greed has killed my interest in upgrading to a brand new card.
1060 3gb here, the 3gb was 40% cheaper than the 6gb model here, It still serves me well but i'm looking at the 3060 12gb or the 7600xt 16gb to upgrade. There is no point in waiting anymore because the vram requirements are getting higher and higher. At the end of the day unless you have the best of the best and spend as much on a GPU as the entire rest of your system, you can't even play the newest releases without issue.
My main reason to not upgrade is just the fact that i dont see modern AAAs worth playing i often just find myself replaying older titles which my current gpu can handle pretty well.
This. The games are all woke garbage these days. No point in playing any of them. The few games that aren't total garbage and filled with woke cancer usually run fine on the GPU I have. (RTX 2070 Super).
Yup, I bought the mass effect trilogy at Xmas on sale. Working thru that at the moment. A witcher 3 Ng+ play after that and then maybe a final Skyrim playthrough with plenty of visual mods(I recently upgraded to a rx5700xt)
The Pascal architecture was really good I'm still on a 1080 that a bought a few years ago for 200 bucks but I saw some 1080ti at 200-250 it's really a great deal for someone who wants a cheap gpu.
I got a gtx1080 since it came out and i can run everything decently on 3440x1440. Not on high settings ofcourse, about mid settings. Games still look good at those settings. I did however upgrade my cpu to 7800x3d this year, but there isnt much difference compared to 6850k. Loading times are a bit faster and maybe 10fps more at best.
I came here to say the same thing but you beat me to it, But yes, I'm rockin a 1070 In my desktop, and I have a laptop with a 1080. And they both truck along just fine.
was rocking a 1070ti, built a new pc for wifey, decided to upgrade the gpu to a 6750xt and while it's good the old 1070ti can still comfortably play anything i throw at it locked in 60fps, at max i have to go down from ultra to high in a few more demanding games and i can totally live with that.
No reason to upgrade, all the games are trash, I play 5+ year old games, literally no reason to play anything new besides Indie games which don't require monster hardware anyways.
No reason for you to upgrade. You may not like modern games, and that’s okay, but I hate to break the bad news to you but it turns out the industry is booming more than ever and selling more games than ever. So the vast majority do not feel this way
@AngerMaker413 great but I can still play all the games my friends are playing... with high frame rates... better load times than my Xbox brethren... and it's called a gtx 1060! Yeah I want a 1080, but 1060 still cuts the mustard to this day!
@@Tipman2OOO I went from a 3050ti 4gb to a 4060 16gb and it made a worlds of difference for my experience. Those older cards still hold up well, no doubt, but at this point it’s only a matter of time before you start having to turn everything on low. The 1060 is pushing 8 years old now, but by the time you’re due for an upgrade you could still get by with a 3070 or something like that for another 8 years
Even if they do require monster hardware, the graphics and visual appearance on many games are worse than games from 6+ years ago. Just garbage from these dev studios. I'll happily play RDR2 over and over than spend money on these new games or an overpriced GPU.
The problem with the inflation argument is that incomes do not keep up with it. We are losing purchasing power constantly. Companies are forced to raise their prices because of the endless money printing, but we, the people, make less and less of that money for ourselves. It's still at least as hard to make a given amount of money today as it was in 2016 or 2012, but the prices are leaps and bounds above what they were back then. And that's in every aspect of our lives, meaning that luxury goods like computer hardware become a much lower priority.
@@loganmedia4401 Let's say you used to be able to pay everyone at every step of the process from start to getting a graphics card on a store shelf with an average of $150 per card. The required profit to keep R&D competitive, make it worthwhile for those who invest in your company, and have a little cushion is $50. After tax, you sell the cards to retailers and the third parties for $250. They for about 280-300. Now, every step is 1.5-2x more expensive. It costs 300 to get a card produced, tested, packaged, and shipped to a store, despite your toughest optimisations. R&D is also more expensive. Let's say $75. Due to tax, you have to sell for $450. Retailers for $480-500+. You complain that they won't sell the card for $300, making a $150/33% loss on every card sold. Why would they waste their time on low-cost, low-profit cards if they want to survive? It's no wonder most of the focus is on high-end products. That's the only way to sustain the business and keep developing high-quality products. It's unfortunate, but it's not their fault. They're not printing money, debasing the currencies, destroying opportunities for business, etc.
Prices are all up because corporations see a world war as an excuse to gouge... they see the sun rising another day as an excuse to gouge the poor and pay their shareholders.
The majority of inflation after the first couple of years of repairing the worldwide supply line of goods has been out and out gouging of the consumer. In the graphics cards industry higher prices have also come from huge demands of crypto and AI.
I think part of it too is that the average buyer has gotten wise to the "stack shifting" scheme we've seen with the last few generations, and are content with waiting longer for those full ~80% generational jumps within their chosen price bracket. They can fudge the labeling all they want, but $1000+ cards have no place within the normal consumer lineup and shouldn't even be on the radar for most people.
10 series was the last launch that made sense. 80Ti if you game, Titan if you’re developing. No universe retail market should be dealing with $1k+ GPU’s
If it runs windows it's plenty. I come from i7 2600 to Ryzen 5 5600 and I'm planing to hold to it for a steady 5-6 years or untill AM 6. If I can run windows and play chess then it's fine. I don't plan on playing the next Crysis on ultra settings.
@@rictr7421nah Not really. The RTX2080TI is better than the RTX4060. Shure one was the high end at the time and one is the low end, but nowadays it isn’t reasonable to buy high end. Mid tier now is the old high end because they are still insane with prices. The RX7800XT is not 100% faster. It’s more like 40%.
You don't even get to own AAA titles--they "sell" them to you on Steam than take them away from you when the community is't spending enough on microtransactions. The best games still are the ones you have the solid media for.
@@PCgamerChannelFor you its amazing. For us its not. I was thinking heavily and ordered an RTX3050 6GB version. Also got a Series S. Why? Because i looked up on my steam library and the newest game i play is Insurgency:Sandstorm. I'm not interested in modern bs. Starfield with its "They/Them" pronounce etc. Newest game i finished was the Dead Island 2 on the Xbox. I had no problem with performance there
@@patriktoth6258 I got 2 gaming pcs 2 ps5s and 2 switches what was your point you were trying to make? XD. I also got a 4080 super gpu bought for over 1,000 dollars 2 months ago dont assume things then flex a weak pc.
Going from 1998-1999 and looking at what was available just 5 years later, you were going from running a couple 3dfx (R.I.P.) Voodoo 2s in SLI at the high end, to the GeForce 6800 Ultra. Those kinds of massive leaps in such a short span of time have been gone for a while now.
yes, but there were massive leaps in resource-intensiveness. you could literally not be able to smoothly/acceptably play with a high-end card like voodoo or riva tnt2 anymore after 2-3 years. gta 3 was unplayable pretty much, for example. my 1080 ti is still strong even upgrading to 1440p, and it's over 7 years old tech. games aren't all that different anymore, unless we're talking 4k plus raytracing.
Yes but you do realize that a lot of people use 1440p? Literally every big RUclipsr you see uses 1440p. Because it's the new standard. 1080p is fading away.
I finally had to give up my 2060 because it was starting to show its age and run hot even after good cleanings and maintenance. And I have a few UE5 titles now and they really made my pc burn up and howl to run them.
Same here. I got GPU envy from watching RUclips, but my old 2060 super still plays everything i like and no new titles demand anything better. I won't notice any better performance, but WILL notice the hit in my checkbook.
Single most impactful cause is price. GPUs since 2020 have been valued at over double what they are actually worth. Buy used. Know the pitfalls. Get deals.
The price to performance increase per generation has decreased greatly over time. I’m not even counting pascal cards as their performance increase was an outlier. But from 2080 to 3080/90 the performance increase DID NOT match the price increase & even less so for the 30 to 40 series. This is not to say they didn’t increase the performance but it is not arguable that generational improvements of cpu/gpu have stagnated greatly while the prices have increased greatly
@@ahiwalter9153 Be glad you even get a new next generation card at any price. They don`t owe you anything. I think the kind off performance you get these days is simply insane. And who says it should match anything......it`s just in your head.
It really isn’t insane lol…maybe if you’re talking a 3090 to 4090 upgrade but that’s a $2,000 gpu so it’s to be expected. Any other segment of the 40 series cards are pretty minor increases while still charging $600 for a 8gb card lol & the lukewarm sales are evident that this generation had one compelling option for people looking to upgrade. BYou really need to do more research or cause your statements are wildly incorrect.
I built a very good pc, myself, for 1400cad 2 christmas back. Still going strong. I wanna buy a 7800xt soon and i will he good for atleast 7 more years.
I used to work summers and buy a medium to high end rig, now I have a full time job and just a gpu of similar performance cost as much as the whole rig would have 10 years ago, that's why I still have a gtx 1070
Still running my GTX970. The 4gb Vram is starting to become an issue in many new games, good thing there's still plenty of good old games and optimized new ones that it can run well. Was going to upgrade for Starfield, glad I didn't.
A GTX 970 doesn't have 4GB of VRAM. It has 3.5GB of VRAM. It's why I went with the R9 390 back in 2016. It was $300 and had 8GB of VRAM and the GTX 970 Ti FTW was well over $300 and it only had 3.5GB of VRAM. The R9 390 was a much better deal. Paired it with a i7 6700 and 16GB of RAM. Doom 2016 ran super smooth and looked amazing.
@@wulfone5961 I fell for the shitty Nvidia marketing it seems. It actually does have 4gb of vram, as it says everywhere, but the last 512 of those are significantly slower making it in effect a 3.5gb card. WTF Nvidia?!? Reasons to avoid them in the future, I guess. A friend of mine fell for their 3050 6gb as well.
@@LarsaXL The thing is, I had run starfield with the game's included DLSS option, it shouldn't be working but it did work and given me some 25-40 fps most of the time. Stuttering lags around some parts that required assets loads are expected, but so far I only find that sound files were the issues. The rest of the game runs perfectly fine.
I'm on an RTX 3090, but I still keep my GTX 1080 Ti Founders Edition as a backup. You never know when those power hungry bricks will give up the ghost with their insane power peaks that can go up to 600W or beyond. I've already had to change TWO PCIE-E power cables due to partially melted connectors. I can only guess it's due to these intermittent power peaks that happen for milliseconds and saturate the capacity of the wires, which in turn melts the plastic around them. Now I only run my RTX 3090 undervolted. Nvidia should really revise the power consumption and delivery on their newer high-end graphics cards.
@@bgtubberThe 3090 is still a beast. I sold my 3090 Asus Strix and 3080 EVGA FTW3 Ultra 12gb. Those things pulled far to much power it was insane. Along with not having the performance anymore to make it reasonable to me. I upgraded to an Asus 4090 Asus Strix white and a 4070 Super FE.
I have always thought paying through the nose to play a handful of games that you may not even be interested in didn't make sense. I think many titles seven years or older with a high frame rate at 1080 still look amazing.
I’ve been loyal to my 1080 TI for seven years now. I snagged this gem for $650 back in mid-2017. Now, if we leap forward seven years, the equivalent price-to-performance card, the RTX 4080 (non TI) , is hitting the shelves at a whopping $1,000. It’s baffling, really. The audacity of game developers these days is something else-they seem to be living in a bubble if they think the average Joe can just drop $1,400 on a PC setup that struggles to run modern games at 60FPS without resorting to frame generation. It’s like they expect us to break the bank just to keep up with the latest titles. And let’s not even get started on the cost of a full rig upgrade. With the way prices are skyrocketing, you’d need to take out a second mortgage just to afford a top-tier graphics card. It’s a slap in the face to gamers everywhere. We’re not all made of money, and it’s about time developers and manufacturers realized that. Gaming used to be an accessible hobby, but at this rate, it’s turning into a luxury few can afford.
the equalent card is the Rx7800xt cost even less then 650$ nowdays around 500$. for 650 you can get a Rx 7900 Gre, the RTX 4070ti cost too much and RTX 4080 still cost too much. Try to use your brain over your fanboyism, the better products are not always Intel or Nvidia, there is a Option it is called AMD and the Hardware and Software from AMD is as good if not better then ther Competition. When i hear people say that DLSS is a selling point, so DLSS is not really a selling point because you get locked out if a new generation of Nvidia cards is released, FSR on the other hand is then a selling point because you do not get Locked out if AMD release a New generation and i wounder how RTX 3000 user use now FSR instead of DLSS ... Great selling point for them i see.
if we take into account real inflation (aka non-fake inflation), that 4080 is certainly cheaper today than the 1080 ti was 7 years ago. this doesn't mean that purchasing-power hasn't eroded for many fixed-income people, so it may feel less affordable for many.
It’s getting to the point where if developers continue to embrace the latest and greatest, they are going to seriously hamper their sales. This is already happening with some AAA titles. And I fear that Frame Gen has just become an excuse to release badly designed/optimized games.
I got my RX 7900 xtx for 750$ new, great deal considering the absurd prices Nvidia Gpu's have. Won't upgrade at least 2 generations from AMD. Why AMD? Because Nvidia doesn t give a rat's shit about it's main consumers, us gamers.
If the 2070 Super had 12 GB or more of VRAM (almost nothing did at the time), it would still be a great GPU. Even with 8GB, it's still very good. Mine gets used regularly with a Ryzen 5 5600 for photo editing.
@@rangersmith4652 mines one of the best performing 2070 super ever tested on 3dmark. Holds several records lol. I got lucky and got a very overclockable card
The power connector wasn't stupid, it was the people that were stupid. Leave your PC half plugged in to the receptacle and let me know how that goes (don't do it, you will burn your house down).
@@apersonontheinternet8006I get what you're saying but a well designed connector for high power loads should make it impossible for current to flow if it is not fully seated.
I'd say, ideally, the time to upgrade your old GPU is when the new GPU's 1% lows are your old card averages in raw rasterization assuming you've got a CPU (and RAM) to back up you new purchase. Also the price (to performance) is a huge deal. Only after that i'd consider wattages,temps and features.
I have had a 5700xt, which is basically a slightly better 1080ti, at least for more recent games, for almost 4 years now and I am not planning to upgrade anytime soon. It still plays 1080p perfectly.
@coulsenbailey3829 It really is luck of the draw. I used to have a GTX 960 that was so whiny it drowned out the monitor audio. And an Athlon 3000G fan that basically came with a free rodent orchestra the way it "REEEEEEE"d every active second. I've had more powerful cards and fans that were fine.
@@Mc3ks My gpu was getting roubleshooted for 2 days (7900XT) so I put my old 1080 TI back in my PC, cleaned it up because it was a blower cooler version from Palit lol. It still ran Elden Ring DLC at 1440p high for the two days I used it. It will always remain in my gaming memory because it was the first high-end GPU I ever bought. I went from a 750 TI to RX 480...but then I tried 1440p on it and sold it after just 2 months to upgrade to 1080 TI. When I was on 750 TI I used 4k as a joke setting to see how little fps I got. When I used the 1080 TI, I played Deux EX:Mankind Divided that way lol. Probably one of the best GPUs ever released. With inflation included, the RTX 4090 would need to be around 1050-1100€ with the same performance it has right now to match the 1080 TI in price/performance/generational leap. That's just crazy.
At 1080p medium settings with FSR maybe. I upgraded from a 2080ti when I stopped getting over 75 fps at 1440p medium. I didn't buy the top of the line gpu at the time to play 60fps medium quality like a console does.
My 1070 TI's front fan stopped working causing overheat crashes and the solution was to tape two 120mm fans to the side after removing fan shroud. Temps are now way lower than they were when stock fans was actually working. I plan on upgrading in 2028 for the 7000 series. Hoping for a 7060 TI 32GB that is 150 teraflops for under 600 dollars.
I’m fine with graphics the way they are. I just want games to run right on release and compiling shaders at the start of the game. Speaking of Elden ring, yesterday it started randomly freezing for a second then fast forwarding to catch up…
I played both uncharted and the last of us part 1 I let compile shaders but the games still stutter frequently even after beating them a couple of time and I have an 8gb gpu so it wasn't vram , elden ring didn't stutter for me anymore after the first 5 hours
"future proofing" was a huge PC building trend between 2015 and 2018 or so, and so far it turns out to have been a great decision. a 980ti will still run most games, and the guys who got a 5820k are likely still happily gaming on them.
Future proofing doesn't exist in PC hardware land. There's only best bang for buck. I've owned a 980ti almost a decade ago. And I'm happy Ive moved on to a 1080ti and a 3080. Next will be something 5080. And really a 3080 crushes a 980ti, let alone features like DLSS
I bought my 1080Ti a few weeks after launch, and it's going strong today. It's no longer in my main gaming PC, or even my secondary. It's in the machine I daily drive -- the one I'm using right now.
As other people have said in the comments, this video is over analysing, the main reason people aren't upgrading is because GPU's are ridiculously expensive now. Nvidia (and AMD to a lesser extent) have priced themselves out of the market. Now most people are waiting until a new mid-range card offers up roughly twice the performance of their present card, so we are seeing people skipping generations. We all want to feel like we are getting good value for money and it just doesn't feel that way anymore.
Majority of players didn't chase new hardware even before the whole 2018+ era of price hikes. Most games on the market dont need strong cards that is enthusiast level. I am still on my 1080 because there isnt any new games worth a stronger card right now and a lot of studios are actually starting to optimise their games enough that i can enjoy some really impressive visual games even with that old card at stable 60FPS. And i think that is key optimisation is finally getting better and the GPU market needs a kick in its nuts HARD.
tehnically , from High to Max your GPU will notice in usage ( if you use frame limiter ) or framerate , but YOU will barely notice an improvement even if you play the game just for that at that moment
My RX 480 totally stood the test of time running something like Metro Exodus, Doom Eternal, and DX12 Fortnite. There's room for making these older cards shine brighter if people keep innovating on what game engines are capable of, not on a graphical level by a technical level in rendering smarter graphics to save and use resources as necessary. Here's to hoping my 6600 XT lives as long, especially since I'm now into undervolting.
The rx6600xt is often overlooked but it's a beast of a card for 1080p especially coming from a rx480. I was real happy with my 150usd rx6600xt coming from a 1060 6gb. Good luck hope it last as long as the 480.
@@MrNota500 I can push for 1440p and even 1728p/1800p with enough given resources on my 6600 XT. I feel like I should of waited on the 7600, but this is doing well enough in letting me play older games I usually play finally at 2160p. Pushing it to 4320p is even crazy when it keeps a solid 60 frames.
As someone who went from RX 590 to 6650XT on the main comp, and RX 480 to 6650XT on 2ndary comp, I feel ya. Great cards for great price. (Got both of mine on sale, by the by)
I alao jjave a 6600xt. I thought i would be stuck playing on medium setting. I keep pushing it and she ALWAYS deliver. Well, i am not playing the latest game either. But i am running a HEAVY modded fallout 4 with all the 8k HD textures and she is stable AF. I played FF rebirth recently, all at max. Never stuttered even with mods. I am really satisfied. When i builty PC it was my weakest link. I dont need to, but ad soon as i can get a deal for either a 6800xt or 7800xt, i will jump on it. If i get a 7800xt, i will start buying pieces to built my next gaming pc around it. But frankly, i am confident i can play for another 4-5 years with my 6600xt.
Still running a 1080ti in my boys rig. It does everything he wants/needs and i'll be damned if im paying the inflated greedy prices they are asking for new gpu's at present.
Back in the days going from a Radeon HD 4870 to a HD 5870 was an uplift of 90%. And that in a year. I had the HD 4870 and saw the performance of the HD 5870, i had to buy. These days it doesn't really get much better. You can do 6-8 years with a gpu, if you buy the best one. But nowadays i just buy mid class and use it for 4 - 6 years.
My conspiracy theory is that game developers are intentionally not optimizing their game to force people to buy newer hardware. I wouldn't be surprised if Intel/Nvidia/AMD invest into gaming companies.
That's because major process node jump (65nm/55nm to 40nm). 5870 to 6970 performance jump is like 10 to 15 percent only since both based on the same 40nm process.
@@soniablanche5672that is what max setting/ultra setting is. Some people think when they can't run the ultra setting with good performance then the optimization is shit.
@@arenzricodexd4409 ultra, high and even some low settings do not make any visual difference besides with more fps, or sometimes not changing fps for good or worse. worst yet, games look the same or worse as older games while running terribly lol. cyberpunk 2077 is still nvidia's rtx playground and interactive advertisement. poor all intel, amd and nvidia users who do not have the latest top end rtx. forced to eat up that one ~24gb nvidia overdrive update lol.
Last year I updated from an RX 580 to a RX 6700 XT. I do not regret the upgrade, but so far I havent played a lot of the new shiny, demanding titles even when my card can perfectly run those games at 60+ fps. I still keep playing TL2, MH World, MH Rise, Genshin Impact and Halo MCC. Tbh, outside of a few new titles like Helldivers 2, i'm not interested on the latest and greatest, at least not at the absurd pricepoint the are tying to sell us ($70.00 for an usually incomplete game is outrageous!).
Recently built a PC after using a laptop/console for years...went for a 5600 paired with a 6750 XT and I think it's great so far! Best you can get for £300 in my opinion.
No, $70 isn't that big of an ask although I'll agree with you that the shipping of incomplete games is a big problem. In the 90's games went from anywhere between $50-$80, adjust that for inflation and you're looking at $100-$160. Those games didn't cost near as much to develop even when accounting for inflation. Gamers are being unreasonable.
Dunno about highebd cards but 7800xt runs faster underclocked than stock an takes 80-180w from the wall depending of the game ofc. At stock it takes 220-280. No idea why they are sold on arrow to the knee mode.
@@colto2312its worth upgrading to a polaris card, its $50 for +100% perf and works well on Linux been at it as a daily for ~2 years. Only thing I couldn't get to work is opencl GPU computing.
I'm a lot better now in finding games that work properly on the hardware I'm using. A friend gifted me a second gen i5 dell Inspiron laptop with Intel HD 3000 graphics. It was kind of refreshing looking at metacritic PC games lists from 2006 and back. Got to check tons of games I missed out on from not having a good PC at the time. I don't know why I never did this when I got my first proper gaming computer
Rx 6600 user here. Bought it last year at around $200 + shipping and import fees. It has served me faithfully for 1080p, and can even run most stuff well at 1440p, so long as you don't go for Ray Tracing, and min-max your settings. I will stick with it for as long as it lasts me.
I'm in the same boat. I used to stick with nVidia's GTX *50 stuff (750, 1050, then 1650 was my last before switching over to AMD because the RTX 3050 was so much more expensive than the RX 6600 in my country). The RX 6600 was such a huge upgrade and is the first time I felt like I actually had a gaming PC. I'm also surprised at how efficient it was. I have a kill-a-watt and the consumption of my PC with an RX 6600 was slightly less than the power consumption of my PC with the 1650 at 1080p resolutions (the RX6600 starts consuming way more once we get to higher resolutions and settings, but at that point the 1650 can no longer keep up in terms of performance.)
I've never had a problem with this card. Fixed rate 90FPS > 120FPS variable rate. The human eye can't distinguish anything over 55 FPS so even 90FPS is overkill and people do it for bragging rights. The excessive cost to get to 4k isn't worth it unless you're video editing--4k is silly for gaming. Honestly if you're a pro gamer you want lower resolution without particle effects regardless of how fast your setup is . A gamer with a 10k computer running 4k with full effects will lose to a gamer with a 1k setup running no particle effects at HD.
@@swallowedinthesea11 Nice. Congrats on the high end PC. Why do you need to OC, though? The i9 and the 4090 will run any game that's out right now at stock speeds without breaking a sweat.
My brother inlaw who works at IT is a big believer in saving money by keeping his hardware going for as long as possible even when he can easily be playing with the latest and greatest hardware every year. Currently he is using a 1050 on a few home computers his kids use. Of course when he does upgrade he makes sure it will last a long time like his current hardware. Right now am using the RTX 2060 since 2019 and it runs everything on ultra and high except for a few of the very latest games where i have it to turn it down to high and medium settings for STARFIELD and CYBERPUNK 2077. I thought i would have issues with the 6gb vram but STARFIELD runs perfectly at 1080p or 1440p. Thanks for the great commonsense video and happy gaming! ☕
Been running a 2060S since Spring 2020, still plenty good enough for my uses. The only game I know I can't tweak to acceptable performance is AW2, that game is deigned to run at a locked 30FPS at high on a 2060S so I'll wait until I upgrade in a year or three. When I can't run the latest AAA releases that'll just give me motivation to peruse my astonishingly good backlog. I know cards after Pascal and RDNA went up in price but they do also seem to last longer, I never got more than 3 years out of my 2 previous cards (GTX770 and GTX1060).
@@Lord_Muddbutter Sorry to hear that. I hope the upgrade is bringing the gaming joy. I was fortunate enough to find one of those settings guide on RUclips which gave just the right advice on running STARFIELD so i have been very happy with its performance. Am saving up for a new pc to replace my i7 8700 since i do want to get back to enjoying things at high and ultra when it comes to new games but i have more than enough games in my STEAM and GOG library to not feel like am missing out.
Still rockin a RTX 2060. My boys PCs have GTX 1660s. These cards are more then powerful enough to run what we play. Newest game I play is 7 Days to Die. They both play Roblox and Minecraft. No 4090s needed here.
I'm still using a processor from 2011 (AMD Phenom II X4 970 Black Edition). Upgraded from a GTX460 768MB to a GTX 960 2GB in 2016. The GTX 480, Nvidia's top of the line card in 2010, was $499.99 ! The GTX 470 was $349.99
As someone who only play games at 1080p and isn't concerned with Ray Tracing and FSR or DLSS, I see no point in upgrading my 6600xt. Cyberpunk 2077 at Max settings comfortably runs at 50-75 fps and with a bit of overclocking, it doesn't drop below 55fps.
@@chaz-e No, we have more than 3, but only 4 of them are currently running/used. We had 3 PCs in our house in the 90s when I was a kid growing up so I don't understand your surprise. That was 30 years ago. My PC is 5 years old and was mid-range when I built it then for a few hundred bucks. My oldest daughter's is running on my previous 12yo 8-core CPU with my previous 7yo GPU. My youngest daughter's rig is running on a 9yo APU that I bought off Ebay for $25, and I'd put in the HP OEM RX 550 GPU that I bought for a couple bucks off Ebay to beef up her graphics a bit. My wife is running a rig that I bought off my little brother for $500 a few years ago with a Ryzen 5 2400 and a GTX 1050ti, I wanted to help him out a bit so I overpaid for it. Then we have the two prebuilts my wife was running previously that are just sitting collecting dust and cobwebs - a dual-core we bought in 2012 and a quad-core we bought in 2017, just little cheap ~$300 jobs. We also have her late father's old Mac Pro just sitting collecting dust, it's only worth ~$200 nowadays. Nobody here is running anything fancy like $2000 gaming rigs. It's all hand-me-downs or built from cheap used parts off Ebay. I have a few misc CPUs and RAM and a motherboard as well ready to be built into another rig if the need arises. We didn't go out shopping one day and spend thousands of dollars on several brand-new high-end PCs. It's been a slow accumulation of parts over the last 10 years as I've upgraded my rigs and our daughters received hand-me-down parts. Both my wife and I are self-employed and work from home on our computers so the PCs have paid for themselves many times over. We can't do what we do without them. We want our daughters to be as comfortable with computers as we are, having grown up with them, so we made sure they each had one after a certain age via the hand-me-down chain. When I upgraded my rig our oldest received my existing rig, when I upgraded again, our youngest received her rig, then received my rig, and I always have the newest hardware. There's not a piece of hardware in this house that is younger than 5 years old, and most of it is closer to a decade old. We're not driving around in Ferraris over here - we're clunking around in 15 year old Hondas with 150k miles on them. Literally.
Got 5700xt year ago used for $120 I don't feel like I need anything stronger than that. If it's an unoptimized game then I will just skip it, rest works great. And older games run beautifully at 4k/60fps or more.
@@Dregomz02 Yeah it still holds up pretty well at 1080p just fine for most things - but then games that only use Mesh Shaders are burnt. I guess Alan Wake 2 released a patch though that improves its performance on GPUs that don't support mesh shaders, which was probably a good idea. The 5700XT is still plenty decent though, even if it doesn't hold a candle to mid-range GPUs of today, like the 4070 SUPER, which is what I've been eyeing for a while in the event that there are enough funds for me to buy one.
I have a GTX 1070. Hasn't been anything worth replacing it since GPU prices went nuts. Pricing has finally dropped and a 4060 is starting to look very appealing to me.
just upgraded from i5 6600k/R9 390 to a 7800x3D/4080 SUPER two days ago and it has been life changing graphics look so good. it took 8 years like the title said
This is awesome. I went from a 9900k/1080 TI to a 13700k/RX 7900XT last year and it made a world of difference. I can only imagine how you feel. Actually, before that, I was on a i5 4670 with a 750 TI so I think I can kinda empathize lol. Enjoy!
I'm on a 5700xt, my brother is on a 1070ti, one friend is also on a 1070ti and another friend is rocking 1080 and all of them are still churning on perfectly fine
@@Sirpesari i get that but I meant as in a new buyer if they decide to buy these types of GPUs they can easily go for newer ones that in some cases are roughly 10+ fps more and some of GPUs are cheaper than the infamous GTX 1080/TI on used market in some countries not to mention they'll get driver support more
@@scarfaceReaper Depends on a market really, gtx 1080ti still kicks the snot out of 3060 for example in rasterisation and for example here in Finland you can have an used gtx 1080ti for under 200e while a new 3060 costs over 300e
It actually does make sense for these companies to really lean into the hardware being improved by software angle due to the example of how much of an improvement we are seeing with the adoption by data centers and the enterprise sector of AMD technologies like ROCm, AMD Vitis, ZenDNN, and Amuse AI.
I loved my old 1060 6GB, most cheapest single fan noise monster version (Zotac) it made my old GTX770 it replaced look like a PS1 by comparison. And it did it while using half the electricity, Pascal was a huge leap forward in performance and efficiency, it doubled FPS in games over the 960 in real world testing.
@Bsc8 AMD is no longer a serious player in the GPU market... not by market share anyways... If you look at the Steam Hardware survey, a few years ago the most used AMD card used to be around #15 on the chart with Nvidia commanding the top 14... Today, AMD has dropped to #31 before you see their most prolific dedicated GPU.... which is the RX 580... yes, the top 30 cards are all Nvidia.
I feel you. I bought the quad-core 7700k months before Intel abandoned Kaby Lake and introduced (mainstream) hexa-core. There's still opportunity cost to consider.
I tried opening some games on my Widows 7 OS computer the other day and was unable to so because Steam doesn't support it any longer. With that said in order to operate ANY computer game (that is accessed through Steam) you will need to upgrade your OS to at least Windows 10! I just dropped $125 on a Windows 11 OS Home edition on Amazon. I'm waiting as long as possible before I have to upgrade my GTX1080 GPU.
Part of the problem is the performance gains are not what they used to be, and so in order to push performance, they have to make the product itself bigger and use more silicon and run at a much higher wattage, or they can't get enough extra performance comared to previous generations. The 5090 is looking like it will be a 600W TDP, compared to the 1080 TI which was only 250W, less than half the amount. This necessarily increases the price of the product.
What is this GPU, lol? 2080 Ti was the most powerful 5yo GPU It is around 20% taster than 6700 non XT (ps5 GPU) It doesn't make 6700 look slow And apparently, 1200$ for GPU only and 500$ for the whole system makes no difference to certain somebody
@@angrynimbus270 I have a ps5 and Xbox series x, they never fail to disappoint with their mediocre performance. Console GPUs are roughly on par with a GTX 1070, idk what the AMD equivalent would be. So yeah, slow compared to a 2080 Ti.
Consoles generally get a pretty modest performance increase due to uniformed architecture. I don't think the current consoles (Xbox Series X, PS5) are considered "slow" at this point, plus "Pro" models are in the pipeline.
My i7-6700k/EVGA 1080Ti Black SC is still running strong, pushing a 34" ultrawide GSync at 1440p. Have i7-12700k/3080Ti and i9-13900k/4090 systems waiting on build, but no hurry. Best to all.
"Visuals" is a loaded term. Because photorealism is cool, but applying it to say, mincraft doesnt really do much. What we need is for visuals to go in the direction of simulation. That means fewer pre-programmed animations such as a flag in the wind, or fog off a pond etc. Or no more clipping because materials would actually interact. Stuff would simulate based on environment factors. I think if we worked in that direction, it would be a visual fidelity revolution because when things are simulated it 'clicks' in your brain and becomes much more real. Subconciously, your aware that every time you shoot that laser cannon, the same sparks animations are used over and over when the laser hits a wall. But as soon as its simulated, it would feel like your playing at "ulber ultra settings" even though the textures and lighting are the exact same. These are the things that would make me consider upgrading to the latest flagship just to experience it.
@@scrubscrub4492semi pro people will keep buying nvidia cards. And going forward majority of those buying gaming gpu like geforce or radeon will be those that use it beyond gaming. Even AMD know this hence with 7900XTX they start talking about running professional apps on those GPU. The sales from gamer will shrink.
@@clockworklegionaire2135 intel gpus actually have very good price performance, even better than amd. but they are selling them with very little profit margins, i dont think theyll be able to keep it up for very long. the drivers are a problem too...
I changed my gtx560 (zotac 2GB) in 2014 to a gtx970 (msi gaming oc), got scammed by vRAM but still used that card a lot. Then in 2022 i went for an rx6750xt (XFX Merc Black) cause the 3070 i wanted only had 8GB of vRAM. Considering i have a card that was close to a 3070 with 2022 drivers and it's now better than a 3070ti in 1080p@144Hz maxed out raster with 2024 drivers (but also only 3% slower than an rx6800 after my oc+uv tweaks): i'm going to skip 1440p and upgrade monitor + gpu to max out 4k@144Hz in the future with rdna6 and cheap oled displays that dont burn in.
I game 5k-7k between 45-70fps with my rtx 3090 with OC so I'm like when ps5 pro comes out that will be closer to tf Wise to rtx 3090 and with frame gen fsr3 I still don't see the GPU upgrade at the moment worth it although I'm curious on spec's for 5080 or 5080ti hmm?!?!?
Bought a 1080ti off my buddy about 2 years ago to upgrade the ancient r9 290x. Its quite an upgrade from that, and still performs very well👌 with a i7 6700k.
I think the introduction of handheld PCs like the Steam Deck did contribute a lot. Now, developers need to keep in mind that their games should be playable on APUs which means you can get along with a decent dedicated GPU all the time.
10:30 Totally agreed, I've recently upgraded from 1650 Super to 7800XT and I find myself indulging in some 2D RPG's / Remasters that would run fine even on integrated hardware
i had a Titan card for over 5 years and it ran all new games in 4k ultra but at 30-40fps. i bought a budget 3060 rtx prebuilt. it ran 4k games depending on the game can get 50-60 fps. you really don't need anything stronger than 3060 unless you want 4k and 60+ fps. if you are a 1k 2k gamer just get 3060. I might build a custom for 5060 rtx cards or intel battle mage cards.
The point is that you can keep that card many years and dont have to buy new after 3-4years. I have rtx 1070 ti for 8years and planning to buy battlemage when released
recently going from a 1080 to 3080, i noticed i could bump a couple settings up and get higher fps, but visually i honestly can't tell much of a difference. if there wasn't a number in the corner of the screen telling me the 3080 is better, i wouldn't be able to tell you which is which
To touch on the point of resolution and such. I feel the reason for 1080p still being so common is the fact of the matter is, that the vast majority of people can not tell or see the difference between 1080p or a significantly higher res like 4k or even 1440p. There could also be a argument that to some, higher resolutions just look more noisy and hurt the eyes.
I've also heard that most people can't tell the difference between 1080, 1440, or 4K... but I keep thinking there's some context missing. If I'm sitting about 6' away from a 60 inch monitor, the differences are obvious. If I'm sitting 10-12' away, not so much. That said, I'm perfectly happy playing games at 1280x800 on a Steam Deck (unless any in-game text fonts are too small to read easily).
The average global personal income is $9,733 per year. That's about $800/month. Worldwide, who the hell is going to spend $1000 on just a graphics card? most people can't even afford a $500 for an entire computer. The disconnect here is crazy.
Im still on my GTX 1080 and it just works when games have options like fsr / xess . i can run starfield on 1440p medium at about 60fps with frame gen and it doesnt feel choppy.
Im still using my 1650Ti laptop and yeah i have to lower settings in some games but literally every game i have runs fine, it might be a niche in the sense that i dont really play new AAA games but this ol laptop just keeps on going!
Only now I'm replacing my PC which has served me for the last 14 years, due to compatibility issues... Sure, it is firmly in the potato category (i7- 870, GT 240) but you'd be surprised how well it still works - and if software wasn't stopped being supported I'd likely use it for years more! And, if anyone is wondering, I'm getting on the GTX 1080ti badwagon - I was able to get a used PC with one for ~370$ :)
Bro y'all made Nvidia the monopoly. They have 90% market share and they are the most valuable company. Also the 4060 uses 115W compared to the 1080 Ti's 250W.
I just upgraded from a 980 2 weeks ago. Literally only did it so I could play gray zone. It ran 90% of games. It struggled with new titles like BF2042. But those older games it was still great. Was an amazing card for me. I’ll never bad mouth it, I bought it in 2013 and it’s been great for a decade
My RX 580 was still kicking in 2022, running all the games I enjoy playing at medium or even high graphics at an acceptable frame rate at 1080p. It will still even able to handle VR games that I play. I ended up upgrading to a 3060 in 2022 because it was on sale for a great price at my local best buy and I admittedly jumped onto the ray tracing hype (I've only ever used RT in Minecraft and then never touched it again lol, but at least it has a decent bit more performance). Many old GPUs can definitely still hold their own even for some modern games. The 1080 Ti is still excellent for example and can even thrash the shit out of the budget RTX cards in some ways and you can find them used on eBay for less than $200 sometimes.
There is an article on Eurogamer that states that games older than 6 years account for 60% of play time.
if one looks at PC only, its even more than that.
That's because people don't wanna work these days, unemployment my man
The most recent game I played was Red Dead Online and I did not see any newer games with better graphics, even Cyberbug 2077.
@@greg8909 Cyberpunk 2077 graphics are WAY ahead of red dead redemption 2
This is true for me. Most of my installed PC library is from the previous decade or earlier (with a few awesome exceptions) and on the console side, I bought a Series X specifically because of the backwards compatibility program.
the main reason i am holding on to aging hardware (literally running it until it dies) is because i am poor AF on top of everything getting a price hike with each new release
Foreal. I upgraded from an i72600k and HD 7870. To a 1900f and 1660ti. I the upgraded last year to a 10900f and 6800xt. I'm done.
This take makes sense to me if you just dont have the money.
@@TheRealDlo and there are increasing numbers of people who fit into this category. Prices are going up but wages aren't. We're living in a new Gilded Age, people... for every rich person there are a hundred poor, so that they may hoard the wealth.
I think this (broke ppl) is the real reason that the RX 580 has such good numbers on Steam. Because it's recently become a "budget" (
I feel ya. Was on a 3770k & 760GTX for the longest. Current machine is a used workstation with a refurb card. No way I'd buy new (esp nvidia) unless the vid card was makin bank. Scanner and printer are at least a decade old. But with the weird new subscription stuff companies are going with they really would have to take them from my cold dead hands.
get a job and your life back
We aren't upgrading because the price is too much, simple as that.
And because we don't need to since our existing systems still play all the games we like. I used to NEED to upgrade to get performance needed to play what I wanted. Paying big bucks to get a difference in performance that I don't even notice? no thanks.
@@DonkeyHotey-l2e This is the real truth.
No lies detected. I am playing Space Marine 2 in 4K on medium settings. I can play DCS at max settings. I run an RTX 2070 8GB GDDR5. I have 48GB of 3200MHz RAM by Crucial and an Intel i7 6700K 4.00GHz CPU. I have had this PC for 8yrs only changing from an 6150Ti to the RTX 5 years ago.
Yes we getting rinsed and even Nvidia shaft the retailers that move their stuff. We had a Black Friday last week and Nvidia did not give retailer permission to discount on the 40 serires..... The 40 series has just been an abusive relationship from NVidia and I hope that everyone sees the light and boycots the 40 series....... poor perfomance per $$ (especially if you are trying to upgrade from 3060 ti), and the Adobe like customer abuse...... I will be AMD for next couple of years when I upgrade and sell on, cause Nvidia can go and kick rocks now..... they are troughing/trolling/rinsing their customer base and they can get Fkd!
thanks to covid and Putin everything is so expensive nowadays
1060 here. I haven’t upgraded because the value proposition for “mid range” cards has been obliterated. Hoping things get better, but the greed has killed my interest in upgrading to a brand new card.
1060 3gb here, the 3gb was 40% cheaper than the 6gb model here, It still serves me well but i'm looking at the 3060 12gb or the 7600xt 16gb to upgrade. There is no point in waiting anymore because the vram requirements are getting higher and higher. At the end of the day unless you have the best of the best and spend as much on a GPU as the entire rest of your system, you can't even play the newest releases without issue.
I refuse to upgrade to a GPU that cost more than my car.
But its your father's car, you don't own a car.
I hope u get a new car lol
Bishop that’s not a car that’s a scooter and the battery pack is on fire
bro drives a yangwang
Where are you finding a car that cheap lol
My main reason to not upgrade is just the fact that i dont see modern AAAs worth playing i often just find myself replaying older titles which my current gpu can handle pretty well.
Same. With so much bland, uninspired woke schlock, it's getting harder and harder to justify PC upgrades.
😂 this is failed thinking. This year games are mainly next gen only.
This. The games are all woke garbage these days. No point in playing any of them. The few games that aren't total garbage and filled with woke cancer usually run fine on the GPU I have. (RTX 2070 Super).
Yup, I bought the mass effect trilogy at Xmas on sale. Working thru that at the moment. A witcher 3 Ng+ play after that and then maybe a final Skyrim playthrough with plenty of visual mods(I recently upgraded to a rx5700xt)
If I can't run new Witcher when it comes out I'll upgrade then
My GTX 1070 OC (8GB) still holds on.
Can't turn on all the new reflections and shinny things, but still gives me decent frames in modern games.
I also have 1070, not even OC and in 1080p still runs most games on high settings 60+fps or a mix of medium/high. I'm fine with that.
The Pascal architecture was really good I'm still on a 1080 that a bought a few years ago for 200 bucks but I saw some 1080ti at 200-250 it's really a great deal for someone who wants a cheap gpu.
I got a gtx1080 since it came out and i can run everything decently on 3440x1440. Not on high settings ofcourse, about mid settings. Games still look good at those settings. I did however upgrade my cpu to 7800x3d this year, but there isnt much difference compared to 6850k. Loading times are a bit faster and maybe 10fps more at best.
I came here to say the same thing but you beat me to it, But yes, I'm rockin a 1070 In my desktop, and I have a laptop with a 1080. And they both truck along just fine.
was rocking a 1070ti, built a new pc for wifey, decided to upgrade the gpu to a 6750xt and while it's good the old 1070ti can still comfortably play anything i throw at it locked in 60fps, at max i have to go down from ultra to high in a few more demanding games and i can totally live with that.
Um... Am I the only that doesn't upgrade just because GPUs are stupidly expensive now and you pay a lot of money for very little in return?
Nop u are not the only one 2000 was meh 3000 was omg I want it but price 😢 and low ram 4000 is 🥵 but price 😢 so we w8 until gtx is dead
If you want to achieve a specific frame rate or change resolutions it makes a lot of sense to upgrade. Not gonna argue the expensive part :)
@@TheRealDlo try but let me ask you this has ther bean a game that you love but camt play ? Like wicher 3 😂 I sometimes feel meh u know
Nope. It’s gonna be a WHILE before my 4070ti is going to be replaced
That's not being a good Consoomer!
Must have Current Thing!
No reason to upgrade, all the games are trash, I play 5+ year old games, literally no reason to play anything new besides Indie games which don't require monster hardware anyways.
No reason for you to upgrade. You may not like modern games, and that’s okay, but I hate to break the bad news to you but it turns out the industry is booming more than ever and selling more games than ever. So the vast majority do not feel this way
@AngerMaker413 great but I can still play all the games my friends are playing... with high frame rates... better load times than my Xbox brethren... and it's called a gtx 1060! Yeah I want a 1080, but 1060 still cuts the mustard to this day!
@@Tipman2OOO I went from a 3050ti 4gb to a 4060 16gb and it made a worlds of difference for my experience.
Those older cards still hold up well, no doubt, but at this point it’s only a matter of time before you start having to turn everything on low.
The 1060 is pushing 8 years old now, but by the time you’re due for an upgrade you could still get by with a 3070 or something like that for another 8 years
@@AngerMaker413 nah modern games suck major ass. there's reason the top streaming games on twitch are 7+ years old.
Even if they do require monster hardware, the graphics and visual appearance on many games are worse than games from 6+ years ago. Just garbage from these dev studios. I'll happily play RDR2 over and over than spend money on these new games or an overpriced GPU.
The problem with the inflation argument is that incomes do not keep up with it. We are losing purchasing power constantly. Companies are forced to raise their prices because of the endless money printing, but we, the people, make less and less of that money for ourselves. It's still at least as hard to make a given amount of money today as it was in 2016 or 2012, but the prices are leaps and bounds above what they were back then. And that's in every aspect of our lives, meaning that luxury goods like computer hardware become a much lower priority.
They're not forced to raise their prices.
@@loganmedia4401 Let's say you used to be able to pay everyone at every step of the process from start to getting a graphics card on a store shelf with an average of $150 per card. The required profit to keep R&D competitive, make it worthwhile for those who invest in your company, and have a little cushion is $50. After tax, you sell the cards to retailers and the third parties for $250. They for about 280-300.
Now, every step is 1.5-2x more expensive. It costs 300 to get a card produced, tested, packaged, and shipped to a store, despite your toughest optimisations. R&D is also more expensive. Let's say $75. Due to tax, you have to sell for $450. Retailers for $480-500+. You complain that they won't sell the card for $300, making a $150/33% loss on every card sold. Why would they waste their time on low-cost, low-profit cards if they want to survive? It's no wonder most of the focus is on high-end products. That's the only way to sustain the business and keep developing high-quality products. It's unfortunate, but it's not their fault. They're not printing money, debasing the currencies, destroying opportunities for business, etc.
Prices are all up because corporations see a world war as an excuse to gouge... they see the sun rising another day as an excuse to gouge the poor and pay their shareholders.
The majority of inflation after the first couple of years of repairing the worldwide supply line of goods has been out and out gouging of the consumer. In the graphics cards industry higher prices have also come from huge demands of crypto and AI.
@@jimralston4789 AMEN!!@
I think part of it too is that the average buyer has gotten wise to the "stack shifting" scheme we've seen with the last few generations, and are content with waiting longer for those full ~80% generational jumps within their chosen price bracket. They can fudge the labeling all they want, but $1000+ cards have no place within the normal consumer lineup and shouldn't even be on the radar for most people.
10 series was the last launch that made sense. 80Ti if you game, Titan if you’re developing. No universe retail market should be dealing with $1k+ GPU’s
I buy a new PC every 10 year, because only after 10 year I will see the new PC speed and capacity can be at least doubled or more on every benchmark.
This is the way
Well it’s going to be far more than double, you can double on performance easily every 5 years or less.
Hence I'm still running a 2600K over clocked to 4.5Ghz and a 1080 ti, runs fine on everything at 1080P, 2k, 4k is over rated IMO
If it runs windows it's plenty. I come from i7 2600 to Ryzen 5 5600 and I'm planing to hold to it for a steady 5-6 years or untill AM 6. If I can run windows and play chess then it's fine. I don't plan on playing the next Crysis on ultra settings.
@@rictr7421nah Not really.
The RTX2080TI is better than the RTX4060. Shure one was the high end at the time and one is the low end, but nowadays it isn’t reasonable to buy high end.
Mid tier now is the old high end because they are still insane with prices. The RX7800XT is not 100% faster. It’s more like 40%.
When it comes to big AAA titles, there is hardly anything worth playing these days, so no need for better hardware.
You don't even get to own AAA titles--they "sell" them to you on Steam than take them away from you when the community is't spending enough on microtransactions. The best games still are the ones you have the solid media for.
I disagree. Can give you 30+ games that are amazing on my ps5 alone.
@@PCgamerChannellist them please
@@PCgamerChannelFor you its amazing. For us its not. I was thinking heavily and ordered an RTX3050 6GB version. Also got a Series S. Why? Because i looked up on my steam library and the newest game i play is Insurgency:Sandstorm. I'm not interested in modern bs. Starfield with its "They/Them" pronounce etc. Newest game i finished was the Dead Island 2 on the Xbox. I had no problem with performance there
@@patriktoth6258 I got 2 gaming pcs 2 ps5s and 2 switches what was your point you were trying to make? XD. I also got a 4080 super gpu bought for over 1,000 dollars 2 months ago dont assume things then flex a weak pc.
Going from 1998-1999 and looking at what was available just 5 years later, you were going from running a couple 3dfx (R.I.P.) Voodoo 2s in SLI at the high end, to the GeForce 6800 Ultra. Those kinds of massive leaps in such a short span of time have been gone for a while now.
yes, but there were massive leaps in resource-intensiveness. you could literally not be able to smoothly/acceptably play with a high-end card like voodoo or riva tnt2 anymore after 2-3 years. gta 3 was unplayable pretty much, for example. my 1080 ti is still strong even upgrading to 1440p, and it's over 7 years old tech. games aren't all that different anymore, unless we're talking 4k plus raytracing.
@@memedbengul4350 It was a double-edged sword, for sure.
Back then you couldn't even be sure your GPU and sound card would work on next year's game let alone 7 years in the future
1080p is also still the most widely used rez.
Also, you are wrong that "graphics don't matter anymore" - It's the continuingly rising prices and the flat or even falling wages that is the problem.
Yes but you do realize that a lot of people use 1440p? Literally every big RUclipsr you see uses 1440p. Because it's the new standard. 1080p is fading away.
@@AMDFan-s1yRUclipsrs are a small minority of gamers
@@KaitouKaiju did you read what I just said? Everyone is moving away from 1080p.
@@AMDFan-s1yeveryone in your bubble maybe
My 2060 is doing fine, and I dont have plans to change it since most of the games I play are old
I finally had to give up my 2060 because it was starting to show its age and run hot even after good cleanings and maintenance. And I have a few UE5 titles now and they really made my pc burn up and howl to run them.
Same here. I got GPU envy from watching RUclips, but my old 2060 super still plays everything i like and no new titles demand anything better. I won't notice any better performance, but WILL notice the hit in my checkbook.
same
The 3060 is still #1 on Steam. Im rocking my 3060ti im sure another 2 generations at the rate GPU development is progressing.
Single most impactful cause is price. GPUs since 2020 have been valued at over double what they are actually worth.
Buy used. Know the pitfalls. Get deals.
Double? Have you compared fps/$? I did and they are worth the dollar. I wish they were overpriced but they are not. FE were real bargain.
To you maybe. But who are you deciding if it is worth it or not.
Just pay up, or else forget it.
The price to performance increase per generation has decreased greatly over time. I’m not even counting pascal cards as their performance increase was an outlier. But from 2080 to 3080/90 the performance increase DID NOT match the price increase & even less so for the 30 to 40 series. This is not to say they didn’t increase the performance but it is not arguable that generational improvements of cpu/gpu have stagnated greatly while the prices have increased greatly
@@ahiwalter9153 Be glad you even get a new next generation card at any price.
They don`t owe you anything.
I think the kind off performance you get these days is simply insane.
And who says it should match anything......it`s just in your head.
It really isn’t insane lol…maybe if you’re talking a 3090 to 4090 upgrade but that’s a $2,000 gpu so it’s to be expected. Any other segment of the 40 series cards are pretty minor increases while still charging $600 for a 8gb card lol & the lukewarm sales are evident that this generation had one compelling option for people looking to upgrade. BYou really need to do more research or cause your statements are wildly incorrect.
Proud owner of gtx 1060 6gb. 🤗
Me too 😀
I ran that for years until it died on me. My upgrade? A 1080ti :)
1050ti
RX580 here everything ok
Same
Still kickin and 60fpsing 95% of Steam library
Great card, soon it will dethrone 750ti from my personal favourite list.
It just keeps on going and it aged like fine wine.
Good job AMD ❤
Still love my 590 but wish 500 series had rocm support. Did AMD recently say they weren't going to have updates for 500 series anymore?
Gave mine to my girlfriend when I upgraded to 7800xt, it runs all of her games great still
The real reason?
PC Price went from $1500 to $2500 in 4 years.
$1500 still gets you an amazing system. Something like an 7800x3d and rx7800xt, probably even better.
I built a very good pc, myself, for 1400cad 2 christmas back. Still going strong.
I wanna buy a 7800xt soon and i will he good for atleast 7 more years.
And wages have not increased that much over the past decade!
I used to work summers and buy a medium to high end rig, now I have a full time job and just a gpu of similar performance cost as much as the whole rig would have 10 years ago, that's why I still have a gtx 1070
PCs havent gone up in price......Your dollar is worth LESS , you can buy less now...........fjb
Still running my GTX970. The 4gb Vram is starting to become an issue in many new games, good thing there's still plenty of good old games and optimized new ones that it can run well.
Was going to upgrade for Starfield, glad I didn't.
A GTX 970 doesn't have 4GB of VRAM. It has 3.5GB of VRAM. It's why I went with the R9 390 back in 2016. It was $300 and had 8GB of VRAM and the GTX 970 Ti FTW was well over $300 and it only had 3.5GB of VRAM. The R9 390 was a much better deal. Paired it with a i7 6700 and 16GB of RAM. Doom 2016 ran super smooth and looked amazing.
Funny that Starfield is actually playable on 960, yes, it look shit, extremely buggy and yes its boring, but none the less it does work.
@@Saviliana it is? I wasn't aware. That is good to know.
@@wulfone5961 I fell for the shitty Nvidia marketing it seems.
It actually does have 4gb of vram, as it says everywhere, but the last 512 of those are significantly slower making it in effect a 3.5gb card.
WTF Nvidia?!?
Reasons to avoid them in the future, I guess. A friend of mine fell for their 3050 6gb as well.
@@LarsaXL The thing is, I had run starfield with the game's included DLSS option, it shouldn't be working but it did work and given me some 25-40 fps most of the time. Stuttering lags around some parts that required assets loads are expected, but so far I only find that sound files were the issues. The rest of the game runs perfectly fine.
I'm on an RTX 3090, but I still keep my GTX 1080 Ti Founders Edition as a backup. You never know when those power hungry bricks will give up the ghost with their insane power peaks that can go up to 600W or beyond. I've already had to change TWO PCIE-E power cables due to partially melted connectors. I can only guess it's due to these intermittent power peaks that happen for milliseconds and saturate the capacity of the wires, which in turn melts the plastic around them. Now I only run my RTX 3090 undervolted. Nvidia should really revise the power consumption and delivery on their newer high-end graphics cards.
Have you undervolted?
@@CuttinInIdaho Yes, I undervolted it after the cables started melting at the connectors, lol. BTW, I have never overvolted the card.
@@bgtubberThe 3090 is still a beast. I sold my 3090 Asus Strix and 3080 EVGA FTW3 Ultra 12gb. Those things pulled far to much power it was insane. Along with not having the performance anymore to make it reasonable to me. I upgraded to an Asus 4090 Asus Strix white and a 4070 Super FE.
My 3080 finally died after years of overheating and power spikes.
@@mikeclarke3990 Oof. :(
The price percentage margin from a 1080 ti to a 4090 is 128%, in what world does nVidia think they are himathy?
I have always thought paying through the nose to play a handful of games that you may not even be interested in didn't make sense. I think many titles seven years or older with a high frame rate at 1080 still look amazing.
Yep they really do.
Only cause there is no price competition..
Competition is coming, slow but catching up, and form the other side of the ocean. Just like the 80s.
I’ve been loyal to my 1080 TI for seven years now. I snagged this gem for $650 back in mid-2017. Now, if we leap forward seven years, the equivalent price-to-performance card, the RTX 4080 (non TI) , is hitting the shelves at a whopping $1,000. It’s baffling, really. The audacity of game developers these days is something else-they seem to be living in a bubble if they think the average Joe can just drop $1,400 on a PC setup that struggles to run modern games at 60FPS without resorting to frame generation.
It’s like they expect us to break the bank just to keep up with the latest titles. And let’s not even get started on the cost of a full rig upgrade. With the way prices are skyrocketing, you’d need to take out a second mortgage just to afford a top-tier graphics card. It’s a slap in the face to gamers everywhere. We’re not all made of money, and it’s about time developers and manufacturers realized that. Gaming used to be an accessible hobby, but at this rate, it’s turning into a luxury few can afford.
650 in 2017 is equivalent to 831 today due to inflation, so the prices have increased, just not as much as people believe.
the equalent card is the Rx7800xt cost even less then 650$ nowdays around 500$. for 650 you can get a Rx 7900 Gre, the RTX 4070ti cost too much and RTX 4080 still cost too much.
Try to use your brain over your fanboyism, the better products are not always Intel or Nvidia, there is a Option it is called AMD and the Hardware and Software from AMD is as good if not better then ther Competition.
When i hear people say that DLSS is a selling point, so DLSS is not really a selling point because you get locked out if a new generation of Nvidia cards is released, FSR on the other hand is then a selling point because you do not get Locked out if AMD release a New generation and i wounder how RTX 3000 user use now FSR instead of DLSS ... Great selling point for them i see.
if we take into account real inflation (aka non-fake inflation), that 4080 is certainly cheaper today than the 1080 ti was 7 years ago. this doesn't mean that purchasing-power hasn't eroded for many fixed-income people, so it may feel less affordable for many.
It’s getting to the point where if developers continue to embrace the latest and greatest, they are going to seriously hamper their sales. This is already happening with some AAA titles. And I fear that Frame Gen has just become an excuse to release badly designed/optimized games.
I got my RX 7900 xtx for 750$ new, great deal considering the absurd prices Nvidia Gpu's have. Won't upgrade at least 2 generations from AMD. Why AMD? Because Nvidia doesn t give a rat's shit about it's main consumers, us gamers.
EVGA 2070 Super here. Still going strong at 1440p.
If the 2070 Super had 12 GB or more of VRAM (almost nothing did at the time), it would still be a great GPU. Even with 8GB, it's still very good. Mine gets used regularly with a Ryzen 5 5600 for photo editing.
@@rangersmith4652 mines one of the best performing 2070 super ever tested on 3dmark. Holds several records lol. I got lucky and got a very overclockable card
Also a 2070s user. I am gonna upgrade with 5000 series but I still love my 2070s.
I had a ASUS Strix 2080 Super.
Due to the vram of 8gb.
I upgraded to a new ASUS TUF 4070 for 400 dollars USD/ 600 Dollars CAD.
same with a 1080ti
Another reason not to upgrade was the stupid 12-pin power connector.
16 but yes
The power connector wasn't stupid, it was the people that were stupid. Leave your PC half plugged in to the receptacle and let me know how that goes (don't do it, you will burn your house down).
@@apersonontheinternet8006I get what you're saying but a well designed connector for high power loads should make it impossible for current to flow if it is not fully seated.
@@KaitouKaiju engineers make an idiot proof product then the world makes a better idiot.
And no, that isn’t how it works but whatever you say.
I'd say, ideally, the time to upgrade your old GPU is when the new GPU's 1% lows are your old card averages in raw rasterization assuming you've got a CPU (and RAM) to back up you new purchase. Also the price (to performance) is a huge deal. Only after that i'd consider wattages,temps and features.
I have had a 5700xt, which is basically a slightly better 1080ti, at least for more recent games, for almost 4 years now and I am not planning to upgrade anytime soon. It still plays 1080p perfectly.
I remember putting that GPU in my nephews qhd 120hz rig.
Great card for the price but it's showing its age now.
Rx5700 isn't bad. Got it for an upgrade for a gaming rig I found in a pawn shop. Works pretty well
i have 5700xt aswell, thing kicks ass. i was lucky to build my pc right before covid when all the parts prices went up
@@DreadMewsame here, but im definitely buying either a 8800xt or a 5070 next year
I can confirm my Vega 56 is still somehow pulling along just fine
I couldn't stand my vega 56 because of how terrible It whined
@coulsenbailey3829 It really is luck of the draw.
I used to have a GTX 960 that was so whiny it drowned out the monitor audio.
And an Athlon 3000G fan that basically came with a free rodent orchestra the way it "REEEEEEE"d every active second.
I've had more powerful cards and fans that were fine.
@@coulsenbailey3829 sorry to hear that, mine has been really good to me.
Him ram baby
Hbm
1080ti with 11gb is a ridiculously powerful card nearly 8 years later. Feels like it came out yesterday
Doubt it. Depending on the game you play, you could be really pushing it to its breaking point.
@@namecannotbeblank8920Your doubt is misplaced.
It's fine for playing even _Star_ _Citizen_ (Alpha, resource-heavy, graphically intense).
@@Mc3ks My gpu was getting roubleshooted for 2 days (7900XT) so I put my old 1080 TI back in my PC, cleaned it up because it was a blower cooler version from Palit lol. It still ran Elden Ring DLC at 1440p high for the two days I used it.
It will always remain in my gaming memory because it was the first high-end GPU I ever bought. I went from a 750 TI to RX 480...but then I tried 1440p on it and sold it after just 2 months to upgrade to 1080 TI. When I was on 750 TI I used 4k as a joke setting to see how little fps I got. When I used the 1080 TI, I played Deux EX:Mankind Divided that way lol.
Probably one of the best GPUs ever released. With inflation included, the RTX 4090 would need to be around 1050-1100€ with the same performance it has right now to match the 1080 TI in price/performance/generational leap. That's just crazy.
It's a good card even today, but consumes slot of power and lacks dlss and other Nvidia features
At 1080p medium settings with FSR maybe. I upgraded from a 2080ti when I stopped getting over 75 fps at 1440p medium. I didn't buy the top of the line gpu at the time to play 60fps medium quality like a console does.
Still rocking a 1080ti thanks EVGA
My 1070 TI's front fan stopped working causing overheat crashes and the solution was to tape two 120mm fans to the side after removing fan shroud. Temps are now way lower than they were when stock fans was actually working. I plan on upgrading in 2028 for the 7000 series. Hoping for a 7060 TI 32GB that is 150 teraflops for under 600 dollars.
I’m fine with graphics the way they are. I just want games to run right on release and compiling shaders at the start of the game. Speaking of Elden ring, yesterday it started randomly freezing for a second then fast forwarding to catch up…
Yup dont care if it compiles them 10 minutes long as the game runs smooth.
I played both uncharted and the last of us part 1 I let compile shaders but the games still stutter frequently even after beating them a couple of time and I have an 8gb gpu so it wasn't vram , elden ring didn't stutter for me anymore after the first 5 hours
@@purehollow my issue turned out to be the controller driver. It would quickly disconnect and reconnect causing the game to freeze.
Yea :) I agree with this all the way.
"future proofing" was a huge PC building trend between 2015 and 2018 or so, and so far it turns out to have been a great decision. a 980ti will still run most games, and the guys who got a 5820k are likely still happily gaming on them.
Future proofing doesn't exist in PC hardware land. There's only best bang for buck. I've owned a 980ti almost a decade ago. And I'm happy Ive moved on to a 1080ti and a 3080. Next will be something 5080. And really a 3080 crushes a 980ti, let alone features like DLSS
@@UmVtCg OK, how much did Nvidia pay you to say this?
@@Ang3lUki There is no such thing as future proof
@@mitsuhh Did neither of you watch the video?
@@Ang3lUki do you know what future proof means?
I bought my 1080Ti a few weeks after launch, and it's going strong today. It's no longer in my main gaming PC, or even my secondary. It's in the machine I daily drive -- the one I'm using right now.
As other people have said in the comments, this video is over analysing, the main reason people aren't upgrading is because GPU's are ridiculously expensive now. Nvidia (and AMD to a lesser extent) have priced themselves out of the market. Now most people are waiting until a new mid-range card offers up roughly twice the performance of their present card, so we are seeing people skipping generations. We all want to feel like we are getting good value for money and it just doesn't feel that way anymore.
Majority of players didn't chase new hardware even before the whole 2018+ era of price hikes.
Most games on the market dont need strong cards that is enthusiast level. I am still on my 1080 because there isnt any new games worth a stronger card right now and a lot of studios are actually starting to optimise their games enough that i can enjoy some really impressive visual games even with that old card at stable 60FPS.
And i think that is key optimisation is finally getting better and the GPU market needs a kick in its nuts HARD.
tehnically , from High to Max your GPU will notice in usage ( if you use frame limiter ) or framerate , but YOU will barely notice an improvement even if you play the game just for that at that moment
My RX 480 totally stood the test of time running something like Metro Exodus, Doom Eternal, and DX12 Fortnite. There's room for making these older cards shine brighter if people keep innovating on what game engines are capable of, not on a graphical level by a technical level in rendering smarter graphics to save and use resources as necessary. Here's to hoping my 6600 XT lives as long, especially since I'm now into undervolting.
The rx6600xt is often overlooked but it's a beast of a card for 1080p especially coming from a rx480. I was real happy with my 150usd rx6600xt coming from a 1060 6gb. Good luck hope it last as long as the 480.
I have RX 6900XT, and the RX 480 remains as the reserve
@@MrNota500 I can push for 1440p and even 1728p/1800p with enough given resources on my 6600 XT. I feel like I should of waited on the 7600, but this is doing well enough in letting me play older games I usually play finally at 2160p. Pushing it to 4320p is even crazy when it keeps a solid 60 frames.
As someone who went from RX 590 to 6650XT on the main comp, and RX 480 to 6650XT on 2ndary comp, I feel ya.
Great cards for great price. (Got both of mine on sale, by the by)
I alao jjave a 6600xt. I thought i would be stuck playing on medium setting. I keep pushing it and she ALWAYS deliver. Well, i am not playing the latest game either. But i am running a HEAVY modded fallout 4 with all the 8k HD textures and she is stable AF.
I played FF rebirth recently, all at max. Never stuttered even with mods. I am really satisfied. When i builty PC it was my weakest link. I dont need to, but ad soon as i can get a deal for either a 6800xt or 7800xt, i will jump on it.
If i get a 7800xt, i will start buying pieces to built my next gaming pc around it. But frankly, i am confident i can play for another 4-5 years with my 6600xt.
Still running a 1080ti in my boys rig. It does everything he wants/needs and i'll be damned if im paying the inflated greedy prices they are asking for new gpu's at present.
Back in the days going from a Radeon HD 4870 to a HD 5870 was an uplift of 90%.
And that in a year.
I had the HD 4870 and saw the performance of the HD 5870, i had to buy.
These days it doesn't really get much better.
You can do 6-8 years with a gpu, if you buy the best one.
But nowadays i just buy mid class and use it for 4 - 6 years.
My conspiracy theory is that game developers are intentionally not optimizing their game to force people to buy newer hardware. I wouldn't be surprised if Intel/Nvidia/AMD invest into gaming companies.
That's because major process node jump (65nm/55nm to 40nm). 5870 to 6970 performance jump is like 10 to 15 percent only since both based on the same 40nm process.
@@soniablanche5672that is what max setting/ultra setting is. Some people think when they can't run the ultra setting with good performance then the optimization is shit.
RTX 3090 to RTX 4090 was 70% in 2 years. I see your point.
@@arenzricodexd4409 ultra, high and even some low settings do not make any visual difference besides with more fps, or sometimes not changing fps for good or worse. worst yet, games look the same or worse as older games while running terribly lol.
cyberpunk 2077 is still nvidia's rtx playground and interactive advertisement. poor all intel, amd and nvidia users who do not have the latest top end rtx. forced to eat up that one ~24gb nvidia overdrive update lol.
The only reason I upgraded from an RX 580 to an RX 6600 is to save electricity.
As long as we stick together and refuse to upgrade, what are pc game companies going todo not sell to 60% ish of PC users.
Last year I updated from an RX 580 to a RX 6700 XT. I do not regret the upgrade, but so far I havent played a lot of the new shiny, demanding titles even when my card can perfectly run those games at 60+ fps. I still keep playing TL2, MH World, MH Rise, Genshin Impact and Halo MCC. Tbh, outside of a few new titles like Helldivers 2, i'm not interested on the latest and greatest, at least not at the absurd pricepoint the are tying to sell us ($70.00 for an usually incomplete game is outrageous!).
Recently built a PC after using a laptop/console for years...went for a 5600 paired with a 6750 XT and I think it's great so far! Best you can get for £300 in my opinion.
this is honestly why i haven't upgraded. the games i play are still good on my rx580. even Elden Ring lol
I'm on the same boat. Got a new card, still playing old games for the most part.
No, $70 isn't that big of an ask although I'll agree with you that the shipping of incomplete games is a big problem. In the 90's games went from anywhere between $50-$80, adjust that for inflation and you're looking at $100-$160. Those games didn't cost near as much to develop even when accounting for inflation.
Gamers are being unreasonable.
Long live my R9 390/spaceheater.
I run an R290 until 2021. I got a deal on an R580 or I'd have run it a couple more years.
The R9 390 is nowhere near as power hungry as the new high end GPUs. 275W used to be a lot, not so much nowadays.
can you get yours working well on linux? i had to give it up
Dunno about highebd cards but 7800xt runs faster underclocked than stock an takes 80-180w from the wall depending of the game ofc. At stock it takes 220-280. No idea why they are sold on arrow to the knee mode.
@@colto2312its worth upgrading to a polaris card, its $50 for +100% perf and works well on Linux been at it as a daily for ~2 years. Only thing I couldn't get to work is opencl GPU computing.
I still use a 1080ti and i7 7700k. My pc still runs as smooth as the day I first booted it up
You gave me flashbacks of when I was trying to use my 9800gt and GTX 260 way past their prime. Upgrading to the HD 6870 felt huge!
I'm a lot better now in finding games that work properly on the hardware I'm using. A friend gifted me a second gen i5 dell Inspiron laptop with Intel HD 3000 graphics. It was kind of refreshing looking at metacritic PC games lists from 2006 and back. Got to check tons of games I missed out on from not having a good PC at the time. I don't know why I never did this when I got my first proper gaming computer
I rocked a 1060 3gb until this year and everything still played very well between 30 to 60 fps on some mainstream titles
Rx 6600 user here. Bought it last year at around $200 + shipping and import fees. It has served me faithfully for 1080p, and can even run most stuff well at 1440p, so long as you don't go for Ray Tracing, and min-max your settings. I will stick with it for as long as it lasts me.
Same but xt.
I'm in the same boat. I used to stick with nVidia's GTX *50 stuff (750, 1050, then 1650 was my last before switching over to AMD because the RTX 3050 was so much more expensive than the RX 6600 in my country). The RX 6600 was such a huge upgrade and is the first time I felt like I actually had a gaming PC. I'm also surprised at how efficient it was. I have a kill-a-watt and the consumption of my PC with an RX 6600 was slightly less than the power consumption of my PC with the 1650 at 1080p resolutions (the RX6600 starts consuming way more once we get to higher resolutions and settings, but at that point the 1650 can no longer keep up in terms of performance.)
I've never had a problem with this card.
Fixed rate 90FPS > 120FPS variable rate. The human eye can't distinguish anything over 55 FPS so even 90FPS is overkill and people do it for bragging rights. The excessive cost to get to 4k isn't worth it unless you're video editing--4k is silly for gaming. Honestly if you're a pro gamer you want lower resolution without particle effects regardless of how fast your setup is . A gamer with a 10k computer running 4k with full effects will lose to a gamer with a 1k setup running no particle effects at HD.
@@jasonwaterfalls6145 Have fun missing out! I have an i9 and an RTX 4090! I'm watching tutorials on how to OC my CPU.
@@swallowedinthesea11 Nice. Congrats on the high end PC. Why do you need to OC, though? The i9 and the 4090 will run any game that's out right now at stock speeds without breaking a sweat.
Long live my good'ol RX580... Still playing and keeping my room warm and cozy 😂😂😂
My brother inlaw who works at IT is a big believer in saving money by keeping his hardware going for as long as possible even when he can easily be playing with the latest and greatest hardware every year. Currently he is using a 1050 on a few home computers his kids use. Of course when he does upgrade he makes sure it will last a long time like his current hardware.
Right now am using the RTX 2060 since 2019 and it runs everything on ultra and high except for a few of the very latest games where i have it to turn it down to high and medium settings for STARFIELD and CYBERPUNK 2077. I thought i would have issues with the 6gb vram but STARFIELD runs perfectly at 1080p or 1440p.
Thanks for the great commonsense video and happy gaming! ☕
Coming from someone who just upgraded from a 2060 6gb. It runs Starfield on 1440p like mashed potatoes
Been running a 2060S since Spring 2020, still plenty good enough for my uses. The only game I know I can't tweak to acceptable performance is AW2, that game is deigned to run at a locked 30FPS at high on a 2060S so I'll wait until I upgrade in a year or three. When I can't run the latest AAA releases that'll just give me motivation to peruse my astonishingly good backlog. I know cards after Pascal and RDNA went up in price but they do also seem to last longer, I never got more than 3 years out of my 2 previous cards (GTX770 and GTX1060).
@@Lord_Muddbutter Sorry to hear that. I hope the upgrade is bringing the gaming joy. I was fortunate enough to find one of those settings guide on RUclips which gave just the right advice on running STARFIELD so i have been very happy with its performance. Am saving up for a new pc to replace my i7 8700 since i do want to get back to enjoying things at high and ultra when it comes to new games but i have more than enough games in my STEAM and GOG library to not feel like am missing out.
@@77Arcturus More power to you.
Just few weeks ago upgraded gtx1080 to 7800xt.
And 1080 aint done yet.. Still fine for 1080p so goes to girlfriends pc.
Still rockin a RTX 2060. My boys PCs have GTX 1660s. These cards are more then powerful enough to run what we play. Newest game I play is 7 Days to Die. They both play Roblox and Minecraft. No 4090s needed here.
I'm still using a processor from 2011 (AMD Phenom II X4 970 Black Edition). Upgraded from a GTX460 768MB to a GTX 960 2GB in 2016.
The GTX 480, Nvidia's top of the line card in 2010, was $499.99 !
The GTX 470 was $349.99
GTX 480, the grill master! I got mine for 200$ on a deal in 2011. Still rocks hard on my second rig, new Indie games also run perfectly fine.
As someone who only play games at 1080p and isn't concerned with Ray Tracing and FSR or DLSS, I see no point in upgrading my 6600xt.
Cyberpunk 2077 at Max settings comfortably runs at 50-75 fps and with a bit of overclocking, it doesn't drop below 55fps.
Or just turn down some settings to medium and you'll get solid 60+ fps
@@scarfaceReaper can't tell the difference between 55 and 60-70ish
@@benedictjajo Lucky
I'm running an RX 5700XT, my 13yo daughter an RX 570, and my 9yo daughter an RX 550.
You have 3 PCs in single house?
@@chaz-e No, we have more than 3, but only 4 of them are currently running/used. We had 3 PCs in our house in the 90s when I was a kid growing up so I don't understand your surprise. That was 30 years ago.
My PC is 5 years old and was mid-range when I built it then for a few hundred bucks. My oldest daughter's is running on my previous 12yo 8-core CPU with my previous 7yo GPU. My youngest daughter's rig is running on a 9yo APU that I bought off Ebay for $25, and I'd put in the HP OEM RX 550 GPU that I bought for a couple bucks off Ebay to beef up her graphics a bit. My wife is running a rig that I bought off my little brother for $500 a few years ago with a Ryzen 5 2400 and a GTX 1050ti, I wanted to help him out a bit so I overpaid for it. Then we have the two prebuilts my wife was running previously that are just sitting collecting dust and cobwebs - a dual-core we bought in 2012 and a quad-core we bought in 2017, just little cheap ~$300 jobs. We also have her late father's old Mac Pro just sitting collecting dust, it's only worth ~$200 nowadays. Nobody here is running anything fancy like $2000 gaming rigs. It's all hand-me-downs or built from cheap used parts off Ebay. I have a few misc CPUs and RAM and a motherboard as well ready to be built into another rig if the need arises.
We didn't go out shopping one day and spend thousands of dollars on several brand-new high-end PCs. It's been a slow accumulation of parts over the last 10 years as I've upgraded my rigs and our daughters received hand-me-down parts.
Both my wife and I are self-employed and work from home on our computers so the PCs have paid for themselves many times over. We can't do what we do without them. We want our daughters to be as comfortable with computers as we are, having grown up with them, so we made sure they each had one after a certain age via the hand-me-down chain. When I upgraded my rig our oldest received my existing rig, when I upgraded again, our youngest received her rig, then received my rig, and I always have the newest hardware. There's not a piece of hardware in this house that is younger than 5 years old, and most of it is closer to a decade old.
We're not driving around in Ferraris over here - we're clunking around in 15 year old Hondas with 150k miles on them. Literally.
Got 5700xt year ago used for $120 I don't feel like I need anything stronger than that. If it's an unoptimized game then I will just skip it, rest works great. And older games run beautifully at 4k/60fps or more.
@@Dregomz02 Yeah it still holds up pretty well at 1080p just fine for most things - but then games that only use Mesh Shaders are burnt. I guess Alan Wake 2 released a patch though that improves its performance on GPUs that don't support mesh shaders, which was probably a good idea. The 5700XT is still plenty decent though, even if it doesn't hold a candle to mid-range GPUs of today, like the 4070 SUPER, which is what I've been eyeing for a while in the event that there are enough funds for me to buy one.
I ain’t readin all ahh
I went from a GTX285 to a 1060 to a 4070ti. That was like over 20 years and the 1060 is still going on strong in a side project linux box.
Same, 760, 1060 now 3060ti
I have a 1060 and 3600 AMD CPU collecting dust in the garage.
I have a GTX 1070. Hasn't been anything worth replacing it since GPU prices went nuts. Pricing has finally dropped and a 4060 is starting to look very appealing to me.
just upgraded from
i5 6600k/R9 390 to a
7800x3D/4080 SUPER
two days ago and it has been life changing graphics look so good. it took 8 years like the title said
This is awesome. I went from a 9900k/1080 TI to a 13700k/RX 7900XT last year and it made a world of difference. I can only imagine how you feel.
Actually, before that, I was on a i5 4670 with a 750 TI so I think I can kinda empathize lol. Enjoy!
Still using my undervolted Vega 56 on a 10th gen i5 for gaming. :)
I'm on a 5700xt, my brother is on a 1070ti, one friend is also on a 1070ti and another friend is rocking 1080 and all of them are still churning on perfectly fine
Still good cards but they are being beaten in some if not most games by low budget cards
@@scarfaceReaper Sure but why buy something new that just barely beats the old one that still works
@@Sirpesari i get that but I meant as in a new buyer if they decide to buy these types of GPUs they can easily go for newer ones that in some cases are roughly 10+ fps more and some of GPUs are cheaper than the infamous GTX 1080/TI on used market in some countries not to mention they'll get driver support more
@@scarfaceReaper Depends on a market really, gtx 1080ti still kicks the snot out of 3060 for example in rasterisation and for example here in Finland you can have an used gtx 1080ti for under 200e while a new 3060 costs over 300e
The 1080 is the best card ever
A GPU that was a good performer at 1080p back in 2015 or whenever will still be good performer at 1080p in 2024.
Small, talented indie studios have changed everything for me, personally. Most of the games I enjoy now can be played on hardware from 12 years ago.
It actually does make sense for these companies to really lean into the hardware being improved by software angle due to the example of how much of an improvement we are seeing with the adoption by data centers and the enterprise sector of AMD technologies like ROCm, AMD Vitis, ZenDNN, and Amuse AI.
Still rocking a GTX1060....but looking forward to Battlemage/8800XT
Sadly there will be no rdna4 cards higher tier than rx8700xt, kinda like rdna1. I'll wait for rdna5 but i will probably upgrade to rdna6.
@@Bsc8
If the 8700XT has the value of the 5700XT I might buy one, that old RDNA card was a beast for the price.
I loved my old 1060 6GB, most cheapest single fan noise monster version (Zotac) it made my old GTX770 it replaced look like a PS1 by comparison. And it did it while using half the electricity, Pascal was a huge leap forward in performance and efficiency, it doubled FPS in games over the 960 in real world testing.
@@darthwiizius your right tho! AMD has the opportunity to bring another rx5700xt in terms of value, let's hope they dont screw us with the rx8700xt :)
@Bsc8 AMD is no longer a serious player in the GPU market... not by market share anyways...
If you look at the Steam Hardware survey, a few years ago the most used AMD card used to be around #15 on the chart with Nvidia commanding the top 14...
Today, AMD has dropped to #31 before you see their most prolific dedicated GPU.... which is the RX 580... yes, the top 30 cards are all Nvidia.
I bought a 1080 5 months before the ti came out. I am still mad about it because i am still using the 1080.
If you still using it you didnt lose anything
Lol you're fine. It's still good at 1080p. I only upgraded from mine because I wanted 4k 60fps
I feel you. I bought the quad-core 7700k months before Intel abandoned Kaby Lake and introduced (mainstream) hexa-core.
There's still opportunity cost to consider.
heck the RX480 8gb in my i5 4690k second computer is still doing fairly good
I tried opening some games on my Widows 7 OS computer the other day and was unable to so because Steam doesn't support it any longer. With that said in order to operate ANY computer game (that is accessed through Steam) you will need to upgrade your OS to at least Windows 10! I just dropped $125 on a Windows 11 OS Home edition on Amazon. I'm waiting as long as possible before I have to upgrade my GTX1080 GPU.
Part of the problem is the performance gains are not what they used to be, and so in order to push performance, they have to make the product itself bigger and use more silicon and run at a much higher wattage, or they can't get enough extra performance comared to previous generations. The 5090 is looking like it will be a 600W TDP, compared to the 1080 TI which was only 250W, less than half the amount. This necessarily increases the price of the product.
pascal architecture really one of the best
*The Best*
I still use my 1080Ti- even for VR.. but….
Don’t try and use a PC adapter for a PSVR2 on it.
Pascal too old an architecture to be supported or work… 😢
When a 5 year old GPU can still make the most up to date console look slow, yeah, there’s still some use left there
What is this GPU, lol?
2080 Ti was the most powerful 5yo GPU
It is around 20% taster than 6700 non XT (ps5 GPU)
It doesn't make 6700 look slow
And apparently, 1200$ for GPU only and 500$ for the whole system makes no difference to certain somebody
@@angrynimbus270 I have a ps5 and Xbox series x, they never fail to disappoint with their mediocre performance. Console GPUs are roughly on par with a GTX 1070, idk what the AMD equivalent would be. So yeah, slow compared to a 2080 Ti.
@@chincemagnetSaying a ps5 is on par with a 1070 is just dumb
Consoles generally get a pretty modest performance increase due to uniformed architecture. I don't think the current consoles (Xbox Series X, PS5) are considered "slow" at this point, plus "Pro" models are in the pipeline.
My i7-6700k/EVGA 1080Ti Black SC is still running strong, pushing a 34" ultrawide GSync at 1440p. Have i7-12700k/3080Ti and i9-13900k/4090 systems waiting on build, but no hurry. Best to all.
I run a 580 until a year ago. I got an RX 6800 and will roll with it for another 3-5 years.
"Visuals" is a loaded term. Because photorealism is cool, but applying it to say, mincraft doesnt really do much. What we need is for visuals to go in the direction of simulation. That means fewer pre-programmed animations such as a flag in the wind, or fog off a pond etc. Or no more clipping because materials would actually interact. Stuff would simulate based on environment factors. I think if we worked in that direction, it would be a visual fidelity revolution because when things are simulated it 'clicks' in your brain and becomes much more real. Subconciously, your aware that every time you shoot that laser cannon, the same sparks animations are used over and over when the laser hits a wall. But as soon as its simulated, it would feel like your playing at "ulber ultra settings" even though the textures and lighting are the exact same. These are the things that would make me consider upgrading to the latest flagship just to experience it.
Here is a concept for Nvidia....make their video cards affordable so people don't have to hold on to their old 'good enough' video card.
then how will Jensen keep himself clothed in leather jackets?
Here is another concept
The price will remain high as long as people keep buying
Stop buying GeForce, get a Radeon or an Arc.
@@scrubscrub4492semi pro people will keep buying nvidia cards. And going forward majority of those buying gaming gpu like geforce or radeon will be those that use it beyond gaming. Even AMD know this hence with 7900XTX they start talking about running professional apps on those GPU. The sales from gamer will shrink.
@@scrubscrub4492 Arc? 🤣🤣
@@clockworklegionaire2135 intel gpus actually have very good price performance, even better than amd. but they are selling them with very little profit margins, i dont think theyll be able to keep it up for very long.
the drivers are a problem too...
I changed my gtx560 (zotac 2GB) in 2014 to a gtx970 (msi gaming oc), got scammed by vRAM but still used that card a lot. Then in 2022 i went for an rx6750xt (XFX Merc Black) cause the 3070 i wanted only had 8GB of vRAM. Considering i have a card that was close to a 3070 with 2022 drivers and it's now better than a 3070ti in 1080p@144Hz maxed out raster with 2024 drivers (but also only 3% slower than an rx6800 after my oc+uv tweaks): i'm going to skip 1440p and upgrade monitor + gpu to max out 4k@144Hz in the future with rdna6 and cheap oled displays that dont burn in.
First. And I'm keeping my 4090 for 10 years. I'll end up running at low settings eventually but at least I can max out textures.
Thats a Smart play Honestly.
I game 5k-7k between 45-70fps with my rtx 3090 with OC so I'm like when ps5 pro comes out that will be closer to tf Wise to rtx 3090 and with frame gen fsr3 I still don't see the GPU upgrade at the moment worth it although I'm curious on spec's for 5080 or 5080ti hmm?!?!?
The power draw of 400 watts for ten years... Holy shit that's not as good an idea as you think
You'd save money buying a 60 class card when it's beating the 4090
@@veilmontTV It usually draws more like 200 to 250 watts. Newer games can get up to over 400 watts.
Bought a 1080ti off my buddy about 2 years ago to upgrade the ancient r9 290x. Its quite an upgrade from that, and still performs very well👌 with a i7 6700k.
its crazy that it took two generations until the 3080 TI to just double the performance. and it cost way more.
1660s here,5 years and still decent for 1080 p
My RTX 3080 10gb is doing just fine and will so for many years to come.
7:07 my man Kryzzp 🥰💪😇👍glad you are referencing him in the video as he’s a great game benchmarker 💪🥳🤩!
I think the introduction of handheld PCs like the Steam Deck did contribute a lot. Now, developers need to keep in mind that their games should be playable on APUs which means you can get along with a decent dedicated GPU all the time.
10:30 Totally agreed, I've recently upgraded from 1650 Super to 7800XT and I find myself indulging in some 2D RPG's / Remasters that would run fine even on integrated hardware
Ah yes!, I'm still rock'in an MSI RX580 8G. works for everything I play.
i had a Titan card for over 5 years and it ran all new games in 4k ultra but at 30-40fps. i bought a budget 3060 rtx prebuilt. it ran 4k games depending on the game can get 50-60 fps. you really don't need anything stronger than 3060 unless you want 4k and 60+ fps. if you are a 1k 2k gamer just get 3060. I might build a custom for 5060 rtx cards or intel battle mage cards.
The point is that you can keep that card many years and dont have to buy new after 3-4years. I have rtx 1070 ti for 8years and planning to buy battlemage when released
@@ketsi3079 that is the point but you got some idiot streamers and youtubers that claim you need 4090 or 4080. you dont.
"why gamers don't upgrade pc for 10 years"
Wallet left the chat
recently going from a 1080 to 3080, i noticed i could bump a couple settings up and get higher fps, but visually i honestly can't tell much of a difference. if there wasn't a number in the corner of the screen telling me the 3080 is better, i wouldn't be able to tell you which is which
To touch on the point of resolution and such. I feel the reason for 1080p still being so common is the fact of the matter is, that the vast majority of people can not tell or see the difference between 1080p or a significantly higher res like 4k or even 1440p. There could also be a argument that to some, higher resolutions just look more noisy and hurt the eyes.
I've also heard that most people can't tell the difference between 1080, 1440, or 4K... but I keep thinking there's some context missing. If I'm sitting about 6' away from a 60 inch monitor, the differences are obvious. If I'm sitting 10-12' away, not so much. That said, I'm perfectly happy playing games at 1280x800 on a Steam Deck (unless any in-game text fonts are too small to read easily).
The average global personal income is $9,733 per year. That's about $800/month.
Worldwide, who the hell is going to spend $1000 on just a graphics card? most people can't even afford a $500 for an entire computer.
The disconnect here is crazy.
Me, who owns a GTX 1050Ti 😂
I upgraded my kids to GTX 1070 last year, and used one of their GTX 1050 to upgrade my PC. Whatever gets you through the night.
lol same
i'm still rockin' a gtx 760 2GB from '13, hopefully getting a rtx 3080 before too long
🤡🤡🤡
@@Rebe-Caufman don't be so hard on yourself
Im still on my GTX 1080 and it just works when games have options like fsr / xess . i can run starfield on 1440p medium at about 60fps with frame gen and it doesnt feel choppy.
Im still using my 1650Ti laptop and yeah i have to lower settings in some games but literally every game i have runs fine, it might be a niche in the sense that i dont really play new AAA games but this ol laptop just keeps on going!
Only now I'm replacing my PC which has served me for the last 14 years, due to compatibility issues... Sure, it is firmly in the potato category (i7- 870, GT 240) but you'd be surprised how well it still works - and if software wasn't stopped being supported I'd likely use it for years more!
And, if anyone is wondering, I'm getting on the GTX 1080ti badwagon - I was able to get a used PC with one for ~370$ :)
a whole pc, monitor and all? if so, that's a steal.
Blame it on Nvidia for monopolizing the GPU industry and bringing out unnecessary crap like the RTX 4060 w/8gb ram for a ridiculous $300.00 USD MSRP.
Bro y'all made Nvidia the monopoly. They have 90% market share and they are the most valuable company. Also the 4060 uses 115W compared to the 1080 Ti's 250W.
Why Gamers Aren't Upgrading Their GPUs? Because you must sell your organs in order to be able to afford it.
I remember when a new $200 GPU could play any AAA on high quality.
yeah, like around 1890 😅
I just upgraded from a 980 2 weeks ago. Literally only did it so I could play gray zone. It ran 90% of games. It struggled with new titles like BF2042. But those older games it was still great. Was an amazing card for me. I’ll never bad mouth it, I bought it in 2013 and it’s been great for a decade
My RX 580 was still kicking in 2022, running all the games I enjoy playing at medium or even high graphics at an acceptable frame rate at 1080p. It will still even able to handle VR games that I play. I ended up upgrading to a 3060 in 2022 because it was on sale for a great price at my local best buy and I admittedly jumped onto the ray tracing hype (I've only ever used RT in Minecraft and then never touched it again lol, but at least it has a decent bit more performance).
Many old GPUs can definitely still hold their own even for some modern games. The 1080 Ti is still excellent for example and can even thrash the shit out of the budget RTX cards in some ways and you can find them used on eBay for less than $200 sometimes.