I went from 0 interest in the B570 to being sold on it to being turned off of it by the end of this video
Fluffy definitely helps the interest
yeah i mean unless you have an older CPU, just get a second-hand 6700 XT and it will be better than this. you lose a considerable amount of performance if you have something like a Ryzen 3600 or 5600, and if you cannot use Resizable BAR on your mobo, STAY CLEAR of the B570 and B580
@@rohanjosem unfortunately a lot of people will just not buy a 2nd hand card, their loss IMO
@@flamingscar5263 ikr i got myself a 6750XT for 216 USD
@@rohanjosem yeah 6700 xt is still a great offering, selling for $120-ish in the marketplace of my country
came for the budget card, stayed for the fluffy cat
yeah i came, too
None of those GPUs match the cat. Nvidia, AMD, Intel . . . they're all frauds and hacks!
*Fluffy the cat
If you’re gonna put cats in your video like this, you might as well use red arrows and insane color saturation as well.
don't tell me intel will use cute cats to sell their GPUs
that intro in the top right was incredible...
unironically some great cinematography
Fluffy is a veteran of both world wars, i can see it in his droopy eyes.
are you referring to both of philip's divorces?
@@thehen101 Now there are two of them?
@@thehen101 two?!
@@Isaax this is getting out of hand!
you know there was a cat that hopped between warships back in the day?
Unsinkable Sam :)
phillip is reaching the Jensen singularity: "The more you buy the more you save"
😂
Soon we’ll be hearing AI in every third sentence he utters.
haha, but he's not wrong. with how much silicon advancements have slowed, a 24gb amd 9070 xt or a used nvidia 3090 will probably both last into 2030 (the only risk is if the ps6 has 32gb of ram, then games will likely start using even more vram than they do now).
I don’t have a clue what this video was about I was too busy watching the cat
Same. Came for the card, stayed for the car.
all i learned from this video is that fluffykins is eepy
and I'd like two dozen more videos just like this, please
philip plz fix
@@brixxconnor3411 vroom
I'm gonna have to watch this twice now because I was too focused on your cat the first time. Very clever ploy, Philip
Very manipulative but I don't care more cat content!
We love a mouse petting the cat 3:23
fluffy is looking absolutely great compared to when you first showed him off.
his fur shows you take great care of him and make sure he eats well.
raytracing of his fur improved
The generational gains are still there, nvidia's pricing is just god awful. The 40 series was able to produce insane gains in transistor density and a huge boost in clocks, so they could make a chip with faster-than-3080 performance at less than half the die size. The problem is they then charged $600 for it instead of $350.
because the wafer price increase is crazy, especially TSMC's. their 5nm cost is almost double their 7nm. even if you cut your die size in half there is little saving in cost when the new node's price doubles from the previous one.
@@arenzricodexd4409 It's so depressing that literally no other semiconductor manufacturer can compete with them. Intel is in such shambles that Arrow Lake effectively has to be manufactured there while waiting for an 18A that might or might not come. Samsung is theoretically at feature parity, but they're so much smaller (as weird as calling something that would be a one-to-one reference for your bog-standard Cyberpunk megacorp "small") than TSMC that they barely factor into the equation. Frankly, there's zero chance of these prices coming down from market forces.
@@EbonySaints TSMC having hundreds of clients over the years finally bears fruit for them. tons of customers means tons of experience dealing with various designs. nvidia has been asking TSMC to do crazy things since the late 90s. Intel should have heeded Jensen's advice from more than a decade ago about opening their fabs to others. back then intel was ahead of everyone else and thought they would stay ahead and keep that advantage to themselves only....until 14nm happened.
as for samsung....it seems the company has issues across various divisions. their exynos has been underwhelming for the last several years. their fabs started losing competitiveness with the 7nm/8nm generation, and they've had terrible yields since the 5nm era. even samsung started looking to TSMC to produce their exynos, but TSMC rejected samsung's proposal fearing their secret recipe would get stolen by samsung.
TSMC N4 is easily 2.5-3 times more expensive than Samsung 8nm, which is being sold at bargain prices. If you want faster performance you'll have to pay up, and it'll only get worse. N2 is rumoured to be another 1.5-2x more expensive than N4. This is just unsustainable: AMD can fix pricing once, but at some point it won't be possible to keep scaling raster performance as process nodes continue to explode in cost.
@@pctechloonie give me a break, you have apple, the greediest company on earth, putting 3nm silicon in a full mini computer for $500. If they can sell a ~300mm² 5nm chip for $400, they could sell it for $350. Intel is over there selling it for $250.
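To put rough numbers on the wafer math in this thread, here's a toy cost-per-good-die calculation; the wafer prices, die sizes, and yields below are illustrative guesses, not TSMC's actual figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Standard approximation for gross dies on a round wafer."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float,
                      yield_rate: float = 0.8) -> float:
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Guessed ballparks: ~$10k for an older 7nm-class wafer, ~$19k for a
# newer 5nm-class wafer (roughly "almost double", per the comment above).
print(f"600mm2 die, old node: ~${cost_per_good_die(10_000, 600):.0f}")  # ~$139
print(f"300mm2 die, new node: ~${cost_per_good_die(19_000, 300):.0f}")  # ~$121
# Halving the die size saves far less than half once the wafer price doubles.
```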
After 3:05 i completely lost interest in what philip was talking about
car showed up
car showed up
car showed up
car showed up
car showed up
what a thumbnail, phillip
Cat is being violated
He changed it 😭
@@zaidlacksalastname4905 why did he remove the lick
what was it?
@@KingLich451 licking the flow-through vent on the GPU
Daily B580 user here - it is nice to see you actually do some videos about Intel gpu products :D
The best reason for the B580 and 70 to exist, from a consumer's perspective, is that they push down higher-tier cards' pricing through disruption. I hope more comes out of intel to slap the guy selling a 16GB card in 2025 for 1000 USD at the knees
I really hope so because I refuse to pay 1k usd for 16gb of vram
From what I've seen it brought up the Arc cards pricing to be in line with the overpriced AMD and NVIDIA cards. (by retailers)
Perfect timing to help my insomnia, excited to check this out. thank you for continuing to make content, and you appear happier, which is nice in a parasocial way
Misleading video, this is a cat video
Also, the problem with both of these cards is just availability, real street price and the cpu overhead in some games
My GTX 1080 is still alive in my dedicated VR computer. The idea of using a single GPU for 10 years would have been insane to me in the past.
It's still a good card for anything not literally brand new and very unoptimised.
1080 is such a good card especially if you bought it on sale when 20 series was coming out.
@@MisakaMikotoDesu as someone stuck with a GTX970 til the 2020 cryptoboom was over, I wish I'd just shelled out for a 1080.
Remember naively thinking at the time:
"Wow!! If this is how fast a GTX1080 is, I can't even imagine how fast a GTX1180 will be!! 😮 That'll be the time to upgrade for sure!!"
@@razvandavid1510 I hate people throwing "unoptimized" around as if it's fact
The truth is, as it's always been: games are optimized for the current gen consoles, so medium settings, just about
Anything above that historically has never been optimized for, hell, CDPR openly said they didn't even test cyberpunk's ultra settings because they didn't have hardware fast enough to run it no matter how hard they tried
@@flamingscar5263 Cyberpunk 2077 is part of the new slop generation with AI upscaling and frame gen technology. It's very real that unoptimization is a larger problem than it was in the past because everyone is just slapping on AI upscaling to a game otherwise running on 60fps 720p native with current gen hardware. Check out Threat Interactive, he breaks down how bad current games are with optimization, or mainly how bad UE5 is but everyone is using it
Fluffy really stole the show.
As someone who lives in a country with steep import taxes on any purchase over $200, with very limited access to second hand sources like Ebay and even Aliexpress and an atrocious second hand market at home (used RX580s over ~$100), the supposed death of $200 hero GPUs is frankly terrifying.
It feels like it's a miracle the B570 and 80 exist at all… and that's kinda sad.
I think there is still a space for at least the $200 segment. People jump into the hobby without much of a budget, and getting something that isn't last-generation, on its way out the door, or niche would be nice for sure. Getting into the space with just a $500 total budget allowed me to cram in a 1060 3gb for $130, and that was really good performance for nearly 3 years (granted, paying $20 more for an RX470 may have been a better idea overall, but the point stands). Otherwise, that entire market segment is at the behest of the used market for the region, which for many countries may not really be a good option (and places like aliexpress feel subpar for GPUs vs CPUs).
people want the GPU to be cheap of course, but making it cheap is becoming harder. honestly we don't know how long intel can keep cards like the B570 at $220. just look at the B580: if you look at newegg pricing, most cards go from $370 to $410. there is no model priced at or near the $250 MSRP.
@@arenzricodexd4409 the price issue of the B570 and B580 is because there's too big of a demand and also because of scalpers. Also idk what to say about the issue of making cheap GPUs when both AMD and Nvidia pump up their prices so much each year.
My old RX 580 died yesterday. It was in mine, my brother's and then my sister's computers since the release. It still provided more than enough performance for today's teenager who needs to run Office suite and some occasional Sims 4 gaming nights.
I still have my RX580! I remember doing some heavy gaming back when it was new, and for a season I went full 1440p for everyday use (even though my monitor isn't even 1080p), but the real bottleneck I have is everything else: an FX-8320E (💀), 8GB DDR3 (💀), and an aging 2TB HDD (💀). Oh, and the only fan in the case is the CPU cooler (💀💀💀)
I've been working for years to save enough for an upgrade, and the one thing I think is going to be carried over from my old PC to the new one is that RX580, until AMD releases a GPU that's like super fast at 1440p and above _without_ upscalers or frame generation
1:02 vr mentioned
200$??
Intel!??
Vr!!!?????
Good lord I'm bout to bust 😫💦
Watched this part until … I no longer needed to
I just bought a 3060 Ti from a panic seller for just 200€. The second-hand market is really the best option for most right now, especially considering that even intel video cards are higher in price after taxes, so actually getting one new for 200€ is impossible.
damn bro i got mine for 600 back in the day, that is a damn good deal
@@quantum5661 And it is still under warranty for another year, and the seller only used it for 3 months
Depends... A secondhand 3090 Ti costs 1000€+ over here, which makes no sense. The 7900 XTX trades for 800€ used and 900€ new... And as someone looking to upgrade from a 3080, let me tell you, seeing them go for about 400€ used is insanity.
Used high end sucks, man...
@@leviathan5207 the 3090Ti ends up being expensive because of CUDA. and ultimately it is the last consumer-grade card with NVLink support, which is very useful if you want to combine VRAM with another 3090Ti. no matter how impressive those 4090s are, they don't have NVLink
always has been
LOVE the frequent uploads recently 🎉
I'm sure not many people know this, but I really love how you make your own music. Your videos have such an inimitable style. Incredible off-the-wall intro hook. Slow zoom into the cat while you're still talking. You're like a wizard in a tower slowly going mad by his magic, and for that I'll always show up for what you have to say
I'm still on a 1050ti, it's been outdated for so long but I'm determined to keep it going
I just upgraded from a Pentium G4560 with the gtx 1050 2gb 😅
Built a new rig with the Ryzen 5 7600 and the Arc B580, also upgraded the monitor from an old 1080p 60hz to a new 1440p 155hz. Works really really good together with the R5 7600 and Arc B580! Can highly recommend this combo. 👍
Never cared about upgrading the gpu until VR came.
It's about more than how a still image looks; the upgrade in both immersion and gameplay, plus the future-proofing that AR/VR brings, is what makes me upgrade in the first place.
I personally praise the B580 more, because it actually beats the competition for less while offering more of that precious vram. Fortunately I no longer shop entry level, but for those who do it's an amazing card: capable of 1440p, with all the potential to age nicely. Sadly this "bet" on aging nicely is what budget gamers need to make these days.
The "for less" part unfortunately only applies in the US (it's 320-340 in euroland), and pairing it with a budget system with a lower-end or older CPU in it will kneecap the Arc cards. Still best to look at the used market instead in that situation
@Knaeckebrotsaege yeah I know. They are almost DOA in Europe. Cheapest ones I saw were in Germany and France for like 319€. I would still buy it though, just for the vram alone. Unless you wanna get an RX6800... I recently got one for a friend for 380€
@@Kiyuja I personally have an RX 6800 in my main rig (as the non-alternative was a 3070 with just 8GB back then) and honestly can't wait to get rid of it. All the annoying driver and/or Adrenalin issues I've had over the past 2.5 years made it painfully obvious why nvidia has the market share it does. The hardware side seems great, the software side is a letdown; basically a repeat of why I stopped buying ATI/AMD back in 2011 after my disastrous driver experience with the HD 6870. I was rooting for Intel to stir up the market, but it looks like it'll take several more years for that to happen without all the caveats that come with them right now (ReBAR being a requirement, and the whole overhead thing, both disqualifying them from the budget/older builds they're aimed at) ... and that's if Intel's GPU division doesn't throw in the towel.
Basically: the situation is just about as "meh" as it's been for the past half-decade
@@Knaeckebrotsaege I personally use a 4070 Super and am in no way interested in Radeon products, but my friend seemed pretty happy with his new card (he used to have a 1060). I love Nvidia's driver, because it lets me enhance older games that I play from time to time; other drivers don't have this functionality.
This was mostly about budget gaming tho, and honestly Intel does deliver an ok product now for being a newbie to the market. There is a lot of room to grow, but for casuals it's fine: no worrying about vram, and most modern games are DX11, 12 or Vulkan anyway. Yes, the ReBAR thing is kind of an issue for some people, but the biggest problem honestly is that almost no games have XeSS, though this will get better over the next years.
the adjusted for inflation GPU hero!
Excellent cat video. The guy making background voice gave it a unique feel.
1:31 the horrendous compromise is the cpu overhead with anything less than a 9800X3D. What's funny is that if I replaced my 1080ti, which is paired with a first-gen ryzen, with a B570, it would probably perform at like half of the 1080ti's level, even though in theory it's supposed to have similar performance, just because of the amount of cpu power it needs to function at its regular rate.
You do make a good point.
I bought my 2060 Super in 2021 for £550. It still competes with the 30 series quite nicely & plays most if not all games at 1080p with medium/high combinations of settings, and it has lasted almost an entire console generation, which makes that price tag more worthwhile. So you are right, we do get more out of the cards these days, but we get fewer leaps too. The 50 series is artificially leaping now because that's probably the most cost-effective way to find performance, instead of R&D.
to get more performance and efficiency we need a node shrink. 3nm did not happen this gen for either AMD or nvidia because Apple pretty much gobbled up all of the 3nm supply. worse, it seems apple will stick to 3nm for quite a while, because even apple is not willing to pay the expensive price TSMC is asking for their 2nm stuff.
i played on the 2060 XC for 6 years and it performed very well. i changed because i have a 280hz monitor now and it wasn't cutting it in fps games
3:20 the moment my focus started waning from the topic
9:54 best way to hit the 10 min mark 🤣🤣🤣
0:28 thank you for saying asterisks instead of asterixes
I think it's essentially cyclical, there was a huge bump in raw performance from 1995 -> 2016. Since then performance gains have had diminished returns and the focus has drifted towards figuring out new spins on already existing technologies, be it ray tracing or AI (all of which have existed in rudimentary forms for 30ish years). What's likely happening is we figure out a new paradigm for graphics technology, we optimise it, we get diminishing returns, then find a new paradigm and so on and so on.
8:00 I disagree with this. If you buy an expensive card now, you are expecting to play games at high settings and framerates. In four years, yes, that card can still easily run games, but are you happy going from 1440p 120 fps with ray tracing to 1080p 60 fps? Just buy what you need now and upgrade when the games you care about don't run at the settings you want.
that’s built on the assumption that all consumers looking for a mid range 300-400 dollar card are expecting the highest level of performance. fallacy of composition. future proofing is a really important part of buying the right pc parts (within reason)
you can tell this is a good video because fluffy is in it
Had to rewatch it because the first time i was focused on the Fluffy and then again to hear those boring tech news. (Thanks for the work you do Philip!)
glad you kept the new format
Felt good to see the conclusion align with my original perspective
Just sharing my experience:
Due to my humble budget, I was actually using an RX470 until I got a 1080ti a couple of months ago. Sure, it's not packing the latest features like RT or DLSS, but I'm really happy with what I have. I'm not saying that I wouldn't buy a new graphics card if I had the money for it, but I'm not losing sleep over it. I got my "gen-on-gen uplift", it's just an uplift from the past, and for now that's enough for me, because now I have a lot more games in my library that I can enjoy in higher fidelity, and I can actually run games I wasn't able to before!
To each their own or whatever the saying is.
Have a nice day ✌️😉
I'm going to have to watch this twice now because I was too focused on your human the first time. Very clever ploy, Kitty.
Nice cat review! Make more of this stuff!
5:17 Instantly got goosebumps with the Who's Lila? music
God I love Philip Tech Tips. I genuinely look forward to these videos.
Best Marlon Brando godfather cat moment yet.
Finally! The video we've been waiting years for 🤩
Amazing video Philip, another installment in my favourite series!
I've begun to wonder as well if $400 or so will become the new class of hero cards; it's a shame, but tech often seems to ebb and flow like this. In the spirit of yesteryear, however, I am vainly hoping to see a desktop Ryzen APU with that new 8060S iGPU. APUs just tickle something in my brain and they have for years, largely because of those old 2200G videos you made all those years ago when I was much younger.
We can only hope that AMD will decide to be innovative this gen and not stagnant! 🙏
Haha wtf is that ending 😂
Hilarious 😂
I've said it before but I'll say it again: I really like this style of video. I feel like seeing your face talking, with expressions, makes the video much more enjoyable than the classic stock photo/AI image slideshow
My problem with calling the GTX 1080 Ti an 8-year graphics card (meaning it satisfied you for 8 years) is that now it's a 1080p card, sometimes needing low presets in the most demanding games, but when it was released it was a 1440p card at all-max settings. So you either went for a resolution downgrade (ew) or you got the 1080Ti as a 1080p card and played many games with a major CPU bottleneck, which negates the value point
A big selling point for the Intel cards over other budget options from previous generations, for me, is the AV1 support. I love encoding videos in AV1, it saves so much space for game clips
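For anyone curious what that workflow can look like, here's a minimal sketch that re-encodes a clip through the card's hardware AV1 encoder; it assumes an ffmpeg build with Intel QSV support, and the filenames and quality value are made up:

```python
import subprocess

# Hypothetical filenames; assumes ffmpeg was built with Intel QSV enabled
# (the av1_qsv hardware encoder, usable with Arc GPUs in recent ffmpeg).
cmd = [
    "ffmpeg",
    "-i", "clip_raw.mp4",       # source game clip
    "-c:v", "av1_qsv",          # hardware AV1 encode on the Arc card
    "-global_quality", "30",    # quality target: lower = better but larger
    "-c:a", "copy",             # keep the audio stream untouched
    "clip_av1.mkv",
]
subprocess.run(cmd, check=True)
```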
I enjoy videos from philip2 very much. Sad they get so few views :(
I bought my pc 9 years ago with a gtx 970, an i7 6700k and 16gb ram, for €1200 in 2015. Upgraded to a 1080ti in 2019 for €400. I kept that till this year. Finally upgraded.
Still using a GTX 1060 6gb. I don't play many AAA games, so I've only had to skip a handful of games that can't hold a stable 60fps. In hindsight, I should have bought at least a 1070. I thought I wasn't going to play many games anymore, but then the indie market started pumping out great games. I've been meaning to upgrade my machine for 3-4 years already, but now I finally need to, with Windows 10 at end of life. It just feels weird to "upgrade" to another 1080p machine, and upgrading to 1440p is already quite a steep investment.
That is objectively the best YouTube thumbnail ever.
i clicked on the video to see a new budget card, stayed around for philip's yapping and his fluffy cat :D
fluffykins demands cuddles.
also, totally agree with the viewpoint that under certain circumstances (like the hit that the 1080ti was/is) it makes more sense to pay more. Just two problems: 1. we don't know which product will be "the next 1080ti", and 2. even not considering inflation, price brackets have shifted and top-tier cards are significantly more expensive. With the incoming crop, the most price-wise comparable product would be the 5080 (non-Super/Ti). Doubtful that would become the practical successor of the 1080ti.
That was my line of thinking as well and how I ended up with a used 3090 when mining died. Best $500 I've ever spent on tech in my life, still have no desire to upgrade.
It was a relief when the cat popped up after you looked down and said _"Hello Fluffy..."_ How you managed to get a cat in repose to look British is amazing.
The script of this video on the teleprompter: "[...]and their XeSS upscaling is (CAT ENTERS SCENE HERE) nice as well"
thanks for pointing out that trying to squeeze the most acceptable performance out of a given moment might end up being a detriment to the near-future experience. it's hard to predict how software will power-creep past the hardware we buy, but if you have the means, spending more tends to last longer too.
That cat looks British
The same thought pushed me to the halo card while having an FHD monitor. It might not be used to its full potential for gaming, but the experience it provides will stay relevant far longer than cards that exhaust their lifetime on UHD gaming with every latest release. It also helps me be quicker in productivity work.
I like getting low-mid tier hardware and upgrading more often. Having an excuse to renew feels good. Currently got a 3060ti (this was £370 when I got it in 2021, which is at the very top end of my "mid" tier budget! It was 2021 okay, prices were wild and stock alerts were needed to even have a chance at getting a card).
I think Fluffy's commentary really explained this situation to me very well.
Intel needs to beef up their control panel features and I’m in - e.g. CMAA2 should be added to their driver, they should add a good GPU scaler like FSR1, a good FPS limiter like Nvidia has, and toss in the open source AMD CAS while they’re at it.
Something a lot of people sleep on is how nice a feature Intel's Smooth Sync is for non-VRR users. Anyhow, very excited about these cards, I just think Intel can and should go further.
still happily using my 1070 and dreading the value prospect of any "new" gpu releases, this new arc stuff is the first time in a while where i felt like the situation is finally getting better.
DO NOT THE CAT!!
Ew, okay this comment is a bit too far even for my standards, leave the animal alone.
@UsernameDoesntCareyou don't get it
DO THE CAT
@UsernameDoesntCare what
THE
When I said this was "a cat and mouse situation", this isn't what I meant.
I respect philip's opinions on computer hardware more than any journalist or other youtuber. So fun to see him complain there was no card like this for years, then publish this video. Like a really long and weird foreshadowed reveal.
Philip begins his second grind and rise to YouTube fame soon
I came for Philip - I stayed for Fluffy! Such a cutie!
The last 5-10 seconds will live in memory until the end of time.
You're such a casual.
subbed
ive been waiting since February 23, 2019. the day the 1660ti came out (my current card). and then. the world ended in 2020. and HALF A DECADE LATER i can finally purchase a new one. i may wait one more for good measure. but great video.
As a former RX 480 and HD 7850 owner I've simply shifted to looking at the used market. If I can get a GPU with 2x the performance of my old one (or whatever I feel like I need to get acceptable performance) at around $150-200 used, that's what I'll get. Which is why I bought a used 5700XT around April 2023 for $150. I made that purchase with Elden Ring at 1440p in mind and it's been great for that. Sure, unoptimized UE5 titles like Marvel Rivals and The Finals need low settings and aggressive upscaling to run well and look like ass, but mediocre F2P multiplayer games aren't going to get me to upgrade as long as they run in some capacity.
In fact I probably just won't play games with higher requirements. If a game doesn't run on affordable hardware I'll just wait for the hardware it runs on to become affordable, however many years that might take.
That cat made my day. SO precious. Forgot this video was about GPU's for a sec.
I spent $400 for a new rx 6800 more than a year ago and have been loving it, 16gb of vram is so nice to have and great raster performance at 1440p, if all $400 cards could aspire to that level $400 would really be the best price to perform
As someone that has had a 1060 3gb since 2017, the 1080ti-tier performance would be a huge upgrade for me
that thumbnail is phenomenal
I bought a 2080ti about 5 years ago and was very happy being able to run all games at max. I can still run games at near-max settings; the only thing it struggles with is raytracing. I am finally thinking of upgrading to a 5080 or 5070 after the reviews come out
1:55 I had Crossfire R9 290's. The card was amazing, honestly. I think I paid like $399 for them in 2014? Upgrading from a 9800 GTX+ and an Nvidia 640 GT was a heck of an improvement. Crossfire was such a pain in the butt though; I should have just bought an R9 290X and been happy. About time to upgrade that system's successor now.... the 2070 Super is actually still doing pretty well for my tastes. It's just in those big AAA games where I want to play maxed out, I can't anymore, even with DLSS. Which is nuts, honestly. And I upgraded to 1440p, which hurts, haha. I almost want to buy one of these B570 just to support Intel. But...
Interesting point on longevity. I used my GTX1070 from end of 2017 to just now.
I love that you just focus on the cat. You know your innanet well, sir.
your cat is such a sweetheart
interesting thoughts on gpu prices, I must say. I haven't been paying attention to gpu price-to-performance ratios for that long (and not that much generally), but if you spend a lot upfront and it lasts you for a lot of time, then I guess you could justify the higher price.
Electricity cost is another concern, though, so you may need to account for efficiency and power usage as well, which I didn't see you talk about in this video
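To put rough numbers on that efficiency point, here's a toy yearly-cost estimate; the wattages, daily hours, and electricity price are illustrative assumptions, not measured figures:

```python
def yearly_power_cost(card_watts: float, hours_per_day: float = 2,
                      eur_per_kwh: float = 0.30) -> float:
    """Rough yearly electricity cost of gaming on a given GPU."""
    kwh_per_year = card_watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# Illustrative gaming power draws (guesses; check real reviews):
print(f"150W-class card: ~€{yearly_power_cost(150):.0f}/year")  # ~€33
print(f"250W-class card: ~€{yearly_power_cost(250):.0f}/year")  # ~€55
# A ~€20/year gap takes many years to offset a large upfront price difference.
```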
your explanations are stellar.
Can relate. My RX 6800 is over 4.5 years old at this point, and is still just behind a 4070 and possibly the upcoming 5070. Price/performance stagnated after the RX 6000/RTX 3000 series.
That means by the time the RTX 6000 series comes out, the 6800/4070/5070 performance class will have hovered around the $550-600 price bracket for at least 6 years.
that cat thinks you're lonely cuz you're talking to yourself xD
my cat was on my lap watching this video with me, it felt like catception when his cat jumped up on his lap
best part @ 5:19
Fluffy looks so cute and, well, fluffy with his winter furr
I completely lost track of everything the moment fluffy came onscreen I kept watching the entire rest of the video with fluffy being my sole focus.
the added Vram for sure is nice to see
All this really depends on who you are asking. If you're talking to the guy who cares about visual details, then future-proofing might be good. If you talk to the guy who primarily wants to join their friends in the latest multiplayer hit, getting by is really all that's needed. I'm playing on a GTX 960 6GB because my friends are not into graphically demanding games. For those pretty games I have a PS5 and the option to resell a title I don't enjoy.
What's the banger chiptune playing in the background at 2:00 - 3:40? Also, nice cat review.
Bro its 1am this is awesome
I've had a ROG Strix 1070ti for 7 years and it was still doing great. I changed because I was doing a huge CPU upgrade and did the GPU just for the sake of it; I never really needed to upgrade, but my current GPU is kinda nice in more demanding games. Wasn't needed tho