Low-end shoppers should WAIT until big companies like Amazon buy a good computer in bulk and put it on sale, usually for the holidays. I'm sure there'll be plenty of great deals on computers this holiday season. Just be patient and wait a few months, especially when the next generation is right around the corner.
I wonder if you guys have some idea about monitor stuff as well? The recent QD-OLED and WOLED monitors and TVs (like the LG C4 and G4) are being marketed even here in Southeast Asia, but it's WEIRD because you can't find any of the heavily marketed stuff in any online store at all, and local stores always have an out-of-stock sign on them even though there are ads saying they're out now. It's like they're marketing them but not selling them at all. And even when I ask stores, I get no proper answer as to why they're not available. It's just weird.
@@henrythegreatamerican8136 That's what I'm planning on doing: wait for Prime Day and pick up the RX 7900 XT and Ryzen 7 5800X3D as a high-end AM4 upgrade.
The counterpoint to your argument is that Nvidia still needs to sell GPUs... If their new cards don't represent better price/performance than their last generation, they won't sell.
@@cairnex4473 "If their new cards don't represent better price/performance than their last generation then they won't sell." People buy Nvidia regardless of price/perf. Assume Nvidia will boost RT performance, which is becoming a big deal. They could also launch a new generation of DLSS that only works on the 5000 series. In other words, this is about more than just raster perf. If the 5090 is 30% faster than the 4090, it's an $1800 - $2000 GPU because of the boost to everything else: a 30% boost in RT performance, plus an even bigger boost from a new version of DLSS along with the added tensor cores. Nvidia is going to keep selling top-end GPUs because the upper 20% by wealth or household income can easily afford them, and that's probably a good 50+ million families around the world. With as big as gaming has gotten, EVEN WITH Microsoft screwing things up, there'll be enough sales there. Yes, if Nvidia launches a new gen of DLSS that only works on the 5000 series, there'll be bitching and moaning from YouTubers and in the comments section, and then people will rush out to buy the 5000 series anyway. But hey, maybe after the 5000 series is out for a while the 4090 drops in price to $1500 and they'll only cost $1700 from retailers.
I recently built a computer and used the XFX 6800 that was shown in the video. I've had no problems, except the card is huge! Make sure your case can fit a 340 mm GPU. It's a really great card!
Any time I hear the question "Buy Now or Wait", it always comes down to this: 1- If you need a card, any card, just to play games, and you have none right now, then just buy. There will always be price drops and sales coming. Get what you need to play; that will tide you over for now. I'd start with a used 30xx or 6xxx GPU at 1080p. 2- If you want the best of the best, just get what's out. They very rarely decrease in price before new stuff comes out. The 4090 won't get cheaper till the 5090 is out, and then you'll be wanting to buy that instead. Just suck it up and pay the money, if that's what you really want. Or don't. 3- If you need to upgrade or replace an existing card, that's where it gets trickier. You really have to know what performance you want and which card is best for it, then watch the market for a while to see the price fluctuations. Typically cards are at their best prices when demand is down between releases (like right now) or once newer cards are out and no one wants the old card. For instance, if you are on a 10xx or older card, the past year has been a great chance to upgrade to a 30xx card. Still is, if you can find a new one, but if you're happy with used there's plenty of stock available.
Yeah, for me it's difficult because I have a 6600 and I want AV1 encoding, which I see a lot of people being polarized about. I do tons of encoding and decoding, so I'd like my storage cost nearly halved and my performance up by a third (looking at the 7600 XT), plus 16 GB of VRAM for eventual AI workloads. Apparently there are some NPUs in the 7600 XT, but from what I've heard there isn't much use for them yet; perhaps in the future? Also, the reason I look at the x600 series is the lower power envelope: I have a tiny PC case to stay mobile. I know that game devs and GPU companies are perfectly able to make very efficient graphics and I want to put my money there, to kinda tell them to fxxk off with the huge thermonuclear cards!!
@silverhawkroman In your situation, I'd probably get the 3060 12GB or the 7700 XT. The 3060 would only be better for VRAM and encoding, but on par graphically. The 7700 XT beats it solidly.
@@ElladanKenet Lol, there's no 7700 XT shorter than 266 mm in my country, and I only have a max length of 250 mm. Oh, and I forgot to mention that I run 5 screens (preferably 6, but moving from the 580 to the 6600 kinda ruined that), and I heard Nvidia cards don't do more than 4.
Sure, if you don't give a fuck about the quality of the screen you're playing on... I'm waiting for a 6090 and a very impressive 8K TV to come out next year or in 2026. I'd have to wait even longer if I weren't going to upgrade until 8K 120Hz is a thing...
The only time it's really worth waiting is if Black Friday or a new release is really close. Otherwise the waiting can be infinite: wait for the next discount/deal... hmm... new release soon... too expensive, wait for a deal... it's been so long, maybe wait a couple months for the next generation, etc. Afterwards you'll always feel a bit of buyer's remorse with PC parts, no matter when you purchase.
I think the 6800 is vastly underrated. It outperforms a 2080 Ti and has all the new DX bells and whistles, so it should be relevant for at least another 4 years.
@@sdnnvs Anyone who has used a 6000-series card knows the drivers have been great. AMD is the entire reason Nvidia remade their control panel, ffs. I'm so sick of people spewing this trash... grow tf up.
That segment starting at 16:40 … none of what you talk about should be causing a massive drop in sales in any one period. US Generally Accepted Accounting Principles (GAAP) dictate that any credits (known as MDF in the industry) are recognized in the period in which they are earned regardless of when the cash/product is rebated back to the retailer. This may have been an issue with respect to a big drop in cash flow, but in terms of reported revenue it’s a complete non-factor. Another way of explaining this is that in a particular period, if a retailer sold a card for $500 as in your example, but then earned $50 in MDF credits from AMD, the revenue recognized from AMD would have been the $400 BOM kit to the OEM minus the $50 credit to the retailer. They would have only recognized $350 in revenue on that sale even though they’re receiving $400 in cash from the OEM. When the retailer cashes in those accumulated credits, they’re effectively redeeming that $50 credit sitting on AMD’s balance sheet. This is essentially gift card accounting for B2B transactions. Revenue is recognized when merchandise changes hands, but the cash flow is going to be different because of the issuance and redemption of those credits/gift cards.
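The gift-card analogy in the comment above can be sketched in a few lines. This is just an illustrative toy, using the hypothetical numbers from the example ($400 BOM kit, $50 MDF credit); the function name and structure are made up for illustration, not any real accounting system:

```python
# Toy sketch of the MDF revenue-recognition idea described above.
# Assumption: AMD sells a $400 BOM kit to the OEM, and the retailer
# earns a $50 MDF credit in the same period the sale happens.

def recognize_sale(bom_price: float, mdf_credit: float) -> dict:
    """Per the accrual principle described, revenue is recognized net of
    the credit in the period it is earned, while the cash received is
    still the full BOM price. The unredeemed credit sits on the balance
    sheet as a liability, like an outstanding gift card."""
    return {
        "cash_in": bom_price,               # full $400 received from the OEM
        "revenue": bom_price - mdf_credit,  # only $350 recognized as revenue
        "credit_liability": mdf_credit,     # $50 owed back to the retailer
    }

sale = recognize_sale(400.0, 50.0)
print(sale)  # {'cash_in': 400.0, 'revenue': 350.0, 'credit_liability': 50.0}
```

When the retailer later redeems the credit, cash goes out but no revenue changes hands, which is why reported revenue and cash flow diverge in the way the comment describes.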
Nobody said it caused the drop in sales. A year ago the same models were still at high prices; this year prices have fallen. So if they're booking $50 in retailer credits, it's no longer a $400 BOM kit but a reduced-margin $300, so the AIB can profit and retailers can sell cheaper while claiming new credits. Proportionately the numbers are worse, since there are still fixed costs in the business to pay. Falling sales and lower margins, at the same time as falling console sales, lower revenue and profit.
It isn't that AMD is selling fewer cards; they're selling them for less money, i.e. margins are lower due to the overproduction of RDNA 2. The RX 6600 was a $329.99-MSRP GPU... it has been regularly selling for $199 for quite a while now.
As someone who's about to finish a Bachelor's in Business Administration (Major in Accounting), I loved your comment. Good job trying to explain the GAAP accrual-basis principle to people unfamiliar with it.
@@Toshinben Mate, you're "confusing physics with chemistry". While accounting and finance are both part of the larger field of business (like physics and chemistry are part of science), they are two different branches that DO relate to each other to form the whole picture, but they are separate branches/schools of study.
I bought a 7900 XT for a Linux build I made. Very happy with it, beastly device; the price was a bit much, but not nearly as crazy or inflated as the Nvidia parts.
The only caveat is still how far support for AMD GPUs lags in professional workloads like editing, rendering, and AI models that still rely on CUDA. Though software like DaVinci Resolve, Agisoft Metashape, and Blender are thankfully bucking that trend in a major way.
The 7900 XT is a complete waste of money, and it has no future-proofing because of its weak and useless raytracing capabilities. The 4070 Ti, 4080, and 4090 are all better deals than the 7900 XT, and that's a fact.
I believe that the low-end (and midrange) product cycle is way slower than the high-end. People don't care about new products, they care about affordable GPUs. They watch like a hawk when prices drop, they won't watch Jensen rambling about his famous friends or elephants.
@@tringuyen7519 because they have access to all of the people who actually are willing to commit that kind of cash. Don't get me wrong, they are moving cards. I just mean to the bulk of the market there's very little that actually feels like a good deal. I think most people are reluctantly caving and just biting the bullet on their purchase.
All my two dozen actively gaming friends only care about 200 - 400 USD desktop dGPUs and 800 - 1300 USD laptops. It's kind of like those companies try to target the 1% of richest people on Earth.
@@Kynareth6 Unfortunately, they know they can get a lot of people to spend beyond their means if they gatekeep/paywall good performance in the latest games. Add to that the fact that it IS a lot harder to produce a good-value low-end card now... especially since people's build-quality expectations are really high these days: compare GPU coolers now to those from 10 years ago, and they are NOT the same. Mind you, I do think something like a 4070 that looks like a janky PCB with a cheap cooler zip-tied to it could actually sell well if it didn't thermal throttle and provided significant enough savings. At the end of the day these companies have decided that if you don't want to spend $500+, you should stick with the older generation that they overproduced. It's very frustrating, especially with how VRAM-crippled older cards are now for the most part.
I have a 1070ti, and currently don't mind putting my settings mid/low. Will have to upgrade eventually but for now I can play all the games I want to play.
Agreed. Still rockin' a 1070 from the same timeframe. The only game I play where the frame drops start to become annoying is Helldivers 2, so other than that... I'm doing a major upgrade right now for $2k+ (cuz if I'm getting a new GPU I'm just gonna get a whole new computer; I doubt this case will even fit the newer cards, and the PSU I know is definitely not going to cut it).
I was going to do a similar upgrade from a 6700 XT to a GRE, but then I ended up looking at the 7900 XT, because might as well since I'm spending $550+, and what I ended up buying was a 7900 XTX and 7800X3D build, lol. It got out of hand quickly.
I think from a performance upgrade, the 7900xtx is the only reasonable upgrade from a 6700xt. Everything else feels like it's not a big enough improvement
@@Greenalex89 Yea man, I got the 6700 XT for $160 from a local guy a year ago, and since I'm on a 3440x1440 monitor I needed all that performance. I upgrade every 4 or 5 years; I'd had my RX 580 since 2017 and just recently upgraded in mid-2023.
I've had a 6800 XT for two years now; I won't need to move off it until I upgrade from 1440p to a high-refresh 4K OLED. That won't happen for a while.
I ALMOST bought one last month (coming from a 5600 XT I've had since 2021). Then my friend called me, guess he heard I was gonna upgrade, and he offered me his 2080 Ti for £200. Couldn't say no, even though I HATE Nvidia. I haven't bought Nvidia (new) since the GTX 970. Now I can chill though; I'll buy a 7800 XT when they bottom out in price, I guess. Or a 7900 GRE.
@@Jwasin3_1 I highly doubt people are mining anywhere near the levels you think they are, and also they don't overclock to mine, they underclock. Your comment is FULL of misconceptions.
I was on a $230 570 8GB for the longest time, up until the start of this year, when I "upgraded" to a 6600 cause it was selling for $230... Honestly, I don't think I need to upgrade unless my old roster of games somehow gets updated. Edit: for reference, the 7600 8GB is a $400 product in local currency, so... you be the judge.
Nothing to judge here, really. Would've done the same thing for a budget build. The 6600 is plenty for older titles and it's a good buy in most regions I've looked at (mostly €-prices). No need to go with a 7600(XT) as of yet.
Well, I'm one of these people. I understand that new chips with semi-good yields on a smaller manufacturing process cost more. But I don't see the reason to pay for new games: what cost 20-30€ ten years back now goes for 60-80€. Game developers are getting ten times greedier than any graphics card manufacturer could even dream of. I don't buy any game over 40€; most of the time I wait till they're 20-25€ max, because a card I buy once, but I don't have 60 or 70€ lying around every 4-6 weeks to buy this or that game I want to play. Even games I'll invest a couple hundred hours into I won't buy at full price. But a decent GPU every 2-3 years is a must. I'm someone who must max out raytracing/pathtracing in any game I play, and for that I need enough power. I still haven't bought Alan Wake 2 because it's 40€ at key sites; I'm waiting for it to drop to 25€-ish. I'm not buying top end, but high mid-tier to low high-end. Just bought a 4070 Ti Super for 900€ because at the moment it's the best deal price-to-performance-wise if you need DLSS/Frame Gen and raytracing. I'll probably get the 5070 Ti in 12 months (if it gets 16 GB of VRAM), if Nvidia brings out a new exclusive DLSS feature. Otherwise I'll wait for the 6000 series.
@@williehrmann Nvidia: 60% profit margin. Sony: 7% profit margin. Who is greedier? Also, 10 years back they were $60. I remember Assassin's Creed 3 being $60, and that was 2012.
@@jiggerypokery2962 I doubt that Nvidia has a 60% profit margin. Where do you get those numbers from? But yeah, there were some overpriced games even 10 years ago. Back then, though, you'd get a lot of games for 20-30€ sometimes 6 months after release. Now I have to wait 2 years before I can buy them.
@@williehrmann I got it from the Moore's Law Is Dead YouTube channel, where he shows a tweet from TechEpiphany with Nvidia's margins at minute 15. It's 64.73%; it just happened to be something I watched today, but on second look I see it's only for the month of May. Last year's net profit margin for Nvidia was 48.85%; it can be found by googling "Nvidia's profit margin 2024". But in short, I'd rather play a great game on a 470 than a bad game on a 4090. Games are what matter most in gaming, and in a perfect world the rewards would show that.
It's because the 7900 XTX is a higher-value product in terms of component quality and longevity. Nvidia doesn't allow their AIBs much pricing headroom at all, so they have to cut every corner to make even a little profit. That's why EVGA left. From a software standpoint, the 4080 is simply too weak to handle raytracing at 4K without upscaling. But why buy such an expensive card when you'll have to play at a lower res using upscaling? That's so bad. RT doesn't even look better, just different. Only PT looks better, but even the 4090 can't handle it, with a laughable 20 fps at 4K. Current-gen Nvidia GPUs can't handle their own features, and next-gen Nvidias will have 50-series-exclusive features, locking out 40-series owners, same as now with the 30 series. So from a pure gaming standpoint there's just no reason not to prefer raster performance. And given the 24 GB of VRAM on the 7900 XTX (yes, overkill), you don't have to lower the res to still be able to play at 4K for the next 5 years. But with the 4080 and its stingy 16 GB, it's already looking bad right now in some titles at 4K (additional texture mods far exceed 16 GB). Marketing-wise, AMD should lower prices, but from a technical and hardware standpoint the 7900 XTX is just the higher-value product and will last much longer than a 4080. If you don't believe it, just wait and see. Even if both cards were the same price, for my gaming needs and the expected longevity of both cards, the 7900 XTX would still be the better choice by far, because of the 24 GB of VRAM alone. I'm not even talking about the ridiculous power connector on the 4080/4090 that can burn your house down if it's 1 mm off.
Maybe I'm crazy... but I feel like if you needed to upgrade your GPU you already have by now. So in that sense for new builds/budget conscious buyers they can still feast in the sub $500 price point, but anyone above that would be served waiting until Blackwell/RDNA4. Just my thoughts.
Tom, you often talk about "overkill" when discussing the 4090 or the upcoming 5090. But if you're an avid simracing enthusiast like me and want to go max settings with the upcoming Pimax Crystal Super VR, there is no such thing as overkill.
Same here. But the problem with iRacing is their 10+-year-old engine that works at 60Hz (+ 6 steps within one Hz, so it's like 360Hz but not really). Triple 4K is hard for any game, and in this scenario a new/faster GPU will help. They announced a new engine is in the works when they hired a bunch of devs, but that is years away. Remember, they announced rain was coming a year+ ago, and we got it two months ago. The problem with ACC is the graphics engine (UE4). Currently, a 5800X3D is "enough CPU" for big online grids. I really hope that when AC 2 comes out, it will be optimized for new 8-core CPUs with a decent graphics engine built with VR in mind. It should come out this year, but I'm very skeptical; maybe early access, like with ACC. We are getting GTRevival, which should lean very strongly to the sim side, and they said they are going away from UE5 because "120Hz in VR is not possible" -Ian Bell. But this is also at least a year away. Rennsport is dead, IMO. AMS2 has no online presence. So we simracers are in a weird spot right now: while prices of equipment are the lowest they have ever been (wheels, pedals, rigs...), we don't have any decent sim that utilizes new CPUs and GPUs.
VR really is about the only gaming use case where the 5090+ is worth it. But right now I'm playing Helldivers 2 at 1440p Super wide 120hz, so unless I get back into VR flight sims (oh hai Heatblur F-4E) I probably won't really need to upgrade the GPU.
Yes, I agree. I like the peace of mind that if I buy a VR game, it'll perform really well on my Quest 3. Just recently got Skyrim VR and knowing that I'll be able to play it with the addition of mods is worth the "overkill" title.
@@VoldoronGaming It's using TSMC 6nm. It's the cheapest node in the whole RDNA 3 lineup. The more expensive chiplet GPUs (7700XT and up I believe) use both 5nm and 6nm.
@@VoldoronGaming N6, which the low end uses, is dirt cheap at this point. The die is unlikely to cost more than $15, and that's assuming all dies with defects are just tossed out. VRAM has been dirt cheap for well over a year now at about $3 per GB.
@@VoldoronGaming LOL, Voldoron, you again, the ultimate AMD shill. You dare say the "node is expensive" but couldn't say the same thing about Nvidia using TSMC's 5nm? Bahahaha, you're a joke with your double standard.
Gross revenue should be decently profitable, but net profit? That's tricky without the minutiae, like the cost of shipping parts per unit to OEMs, volume per shipment, and potential tariffs if there are any import/export duties.
The RX 6700 XT is a fantastic card! The only reason why I upgraded was that I had a special occasion at the time and I wanted the highest end card AMD produced to celebrate. (A Sapphire RX 7900 XTX Nitro+)
What pain? Gaming isn't even Nvidia's main moneymaker anymore. AI is their bread and butter. At this point selling gaming GPUs is more like a side hustle.
I believe the real reason AMD is predicting equal or lower earnings in the 2nd half of the year is that they might try to make a move on Nvidia's market share at mainstream price points with the RX 8000 series. If the leaks from Moore's Law Is Dead are anything to go by (and they have proven to be in the past), the 8800 XT might offer 7900 XT-level rasterization performance as well as +30% raytracing performance at $500, compared to the 7900 XT's $700 (and $900 launch price). Considering that RDNA 4 is not a clean-slate design but more of a "bug fix" for RDNA 3, profit margins would have to be minimal to achieve that scale of generational price-to-performance uplift. If AMD is making a move on market share, then even if desktop GPU sales are expected to rise significantly, AMD would not expect a revenue increase (and potentially even a revenue decrease) due to extremely low profit margins. AMD would therefore be trading short-term revenue for market share by lowering margins. This is something they have done repeatedly in the past, and it makes sense considering their current market share is likely holding them back from piercing into the Nvidia core-buyer portion of the market (the mainstream/non-hardware-enthusiast market). Although this assessment is based on a logical line of thought, I do hope for every gamer's sake that I am right and we might see some major competition in the desktop GPU market.
I don't have big expectations for RDNA 4: something similar/equal to a 7900 XT but more efficient, and no more than $600, ideally $500 or $550; more than $600 would be a flop... I'd jump to it at $600 or less, which would be a huge jump for me, but if it costs more than that it wouldn't make sense over the 7900 XT because of less VRAM and maybe a bit less performance.
Just wait for the RTX 5090. There's no point in buying a 2-year-old 4090; no one should buy a 4090 now. These types of cards should be bought when they first come out.
@@josh93824 Sorry, I don't have a precise expectation in that regard. It probably depends on whether Nvidia has tightened the thumbscrews on third parties, who hardly earn anything already, or whether the new contracts leave a little bit of wiggle room.
I don't understand how the sales numbers prove the 7600 XT is selling well, considering all the other cards on the list sold even better, like the 4070, which went from 60 to 320 units sold?
Lots of folks are buying the 7600 XT 16GB for running LLMs. That 16 GB of memory lets you run a Llama 7B model at 30 tokens/sec. Check out LM Studio's ROCm version if any of you have that card.
It’s spiking because of the 16 GB of VRAM. If they release the 8800 XT for $500/550, you’ll be getting 75-80% of the 4090 for almost a quarter of the price. AMD will sell a lot of them. This is a golden opportunity to take midrange market share.
That's the optimistic guess. If it comes out around the XTX, but with 16GB, better RT and, say, 100W less power draw for $550 then it will be an absolute smash-hit. But if it comes out a little less than the 7900XT, better RT, but still 300W-ish TDP and still $550 then it's going to be a somewhat lukewarm reception to say the least.
The same reason why I just splurged on a 4090 now. Found one locally for $1250. Maxed out on 4K frames for now, and this will be my final upgrade before everything gets too expensive.
Not really. Some, yes, but the 4060 specifically goes from 80 to 110 to 130 and doesn't move along with the other cards. I suppose you might say the RX 7600 XT isn't gaining in popularity but rather the 4060 is losing it, but the result is the same.
High end sales are flatlining b/c the cost of living is astronomical. 84% of Americans are living paycheck to paycheck. Less money for needs/ necessities and even less money for wants. We all out here just trying to survive
Still using a 6700 XT on the AM4 platform, and it does the job for me. I'm in no hurry to upgrade either my 5900X or the 6700 XT; I'm still getting over 140 fps in the games I play at 1440p.
As a 1080 Ti/8600K owner, I'll be waiting at least for Zen 5 to stabilize in supply/price. My G-Sync monitor is 3440x1440 80Hz, so running games at medium works fine for my taste in delivered framerates. Perhaps I'll even wait for a next-gen QD-OLED display to get a 34+ inch 4K 120Hz panel before I invest in a computer upgrade. Paying a premium for extra performance above 80 FPS on my monitor doesn't make much sense, so better to save up and get a setup that's in "sync".
I'd be glad to see RDNA4 launch in the next 3 months, but I'm a bit skeptical. I wonder how much 7900xt/xtx stock there is on shelves. AMD is likely waiting for that to clear.
Gotta love how absolutely non-applicable all of these prices are to the European market. Everything is still overpriced to hell over here; it's almost a $200 difference between what a card costs in the US and over here.
I don't know how to square your statements / sources saying that AMD cards have been selling incredibly well, with Lisa Su's statements that GPU sales are down. Claims like "RDNA 3 is selling VERY well" don't make any sense with these results.
You cannot kick MDF down the road to a later quarter; that's not how it works. This is just bad sales. If sales are equal for the 7600 and the 4060, how do you reconcile that with the Steam survey data, where the 7600 doesn't appear but the 4060 does? You can attribute many problems to the Steam survey, but in most cases it largely represents the macroeconomics of the GPU market in general.
It's pretty weird how he keeps trying to make the 7600 look better in sales and usage than it really is. That GPU was a market flop. Mindfactory is the only place in Germany where AMD has a lead that big, let alone the rest of Germany, or the world.
It is pretty much confirmed at this point that AMD will not be competing in the high end next generation. You've seen Nvidia's pricing when AMD is close: the 3090 was $1600 when the $1000 6900 XT was only 10% weaker in raster. The 4090 is effectively $1800 when the now-$850-900 7900 XTX is only 20% weaker in raster... so how the FUCK do you think they're going to price things when the strongest AMD card will go up against the 5070?
The 6800 is still my favorite product from AMD. I had it constantly below 180 W after Ancient Gameplays' optimization video. The best affordable 16 GB card you can get, especially if found secondhand for $250-$300.
Those low-priced 7800 XTs you showed from Newegg are refurbished/open-box used cards. Kinda misleading, tbh. You rarely see new 7800 XTs below $480, and the premium cooler models almost never get discounted.
Right... as if opening a box or repairing a product back to QC standard is somehow worse than the gamble of a new, non-validated unit. You do realize that a refurbished product is validated to meet QC, right? Refurbished products, whether third-party authorized or first-party, are also far less likely to be DOA or fail than a new product. New products are tested by batches; bad units get missed in some batches, then get sent to a customer to be DOA or fail early. ALL refurbished units are required to fully pass QC as if they were new, or they never leave the facility. I have years of refurb experience, one of which was in QC, so I know firsthand, at least within the SE U.S. I shouldn't have to explain why an unused GPU in an open box isn't a problem. But hey, by all means, if you don't believe my experience, that's fine. Keep enjoying your cranial enema!
@@fateunleashed9680 You completely miss the point. At no point was I speaking about the quality of the 7800 XT cards. I feel that the price of the 7800 XT is misrepresented here; I have no problem with refurb or used products.
I bought the 7900 XT when it first came out for around $900. The 7900 XTXs were all purchased by scalpers and had a $500 markup that I refused to pay. I don't regret it a bit; it's been a great card. At around $600, if you can find one, it's a steal.
Going by bang for the buck, I'd say there are currently only 5 GPUs worth buying:
1. Intel Arc A380 6GB for $110
2. RX 6600-6650 XT 8GB for $200-$220
3. RX 6800 16GB for $360
4. RX 7900 GRE 16GB for $540
5. RTX 4070 12GB for $550
Everything else has either sold out, doesn't deliver what it should, or is utter luxury, because you're already into 4K gaming territory starting with option #3. And I very much doubt it makes sense to buy anything beyond an RTX 4070 right now, because the introduction of the RTX 50X0 series will likely come with a price drop for the RTX 40X0 series.
I finally picked up a replacement for my Vega 64, though I don't do much gaming these days, and what I do is esports at 4K 144Hz. I decided on a 7900 XT since it was selling for $100 less than the cheapest in my locale, and I wanted something that would last as long as my Vega has, at a similar inflation-adjusted price. My alternative for the same money from Nvidia was an RTX 4070 Ti (not the Super variant), so it was a bit of a non-contest.
Holding on to my 7900 XTX. Nvidia failed with their Super cards but won the wallets of the nvidiots. The 5080 at best is going to land between a 4080 and 4090, just with better RT than the 4090. My card is gonna last a while.
Here in Europe I've seen the 4090 below MSRP for a month now. It currently sits at around $2k USD converted, which includes a 25% sales tax; at launch it was around $2.2k USD, which it had never dipped below before. I'm using ASUS TUF models as the reference here. The 4080 Super has also dropped, by about $200-300 converted.
@@Dave-dh7rt yes, 25%.. gotta love it XD At least with my private company using high end gaming pc hardware, I can deduct it all. But regular consumers aren't so lucky.. This is in Denmark by the way, Europe's avg sales tax rate is around 21%
Well, if you follow basically any hardware news channel, the rumor that RDNA 4 is basically going to be a bug fix, really only improving RT perf, is already out there, so it sets people's expectations. RDNA 5 is the one with the potential to be a letdown, because by then you'll have had basically two generations of cards in the same performance range that made relatively little gain over the 6000 series, and many 6000- and 7000-series owners are going to be looking at the 9000 series for a new breath of life, given that most people look to upgrade every 2 generations or so (HW Unboxed survey). So if AMD doesn't deliver at least, I'd say, a 30-50% uplift, they might be looking at a serious market-share loss as people jump ship to Nvidia, assuming Nvidia makes decent gen-on-gen gains that add up to the 30-50% upgraders are looking for. If AMD's price is even remotely acceptable and Blackwell doesn't sell well, it could throw Nvidia into a 3000-series situation where they cut prices and people start jumping, and AMD would have to keep the consumer GPU team on life support because the revenue stream from upgraders wouldn't come. RDNA 5 is going to be a big turning point for AMD with the consumers who upgrade every 2-3 generations. Only time will tell how well the new architecture performs and whether, in the eyes of the price/performance-conscious consumer, they win out over Nvidia in the midrange and keep their customer base.
Depends on what you're looking for. If the top card comes out at around a 7900XT in performance, with much better RT, 16GB and a 100W reduction in power usage for, say, $550, then a lot people are going to find that attractive. If you're looking for "a 5090 competitor, but cheaper", then yeah, sure, you're going to be disappointed.
@@user-eu5ol7mx8ythe founding fathers owned slaves and the original version of the Constitution said that only land owning white males could vote ... Soooo, I kinda doubt that they would give a shit about the blanket disenfranchisement of non-wealthy Americans.
The nasty part of this is that this tax will trickle down to buyers across the globe as well since the manufacturers are American, so by doing this they're just raising the prices worldwide
I think it's a horrible time right now to buy a 40 series GPU without a good discount. Buying a 4090 is especially pointless without an excessive discount of at least 50% because otherwise if you go top end, you might as well wait for a 5090 instead.
Wait what????????? Should I wait for a 5090???????????? How about sending me a couple thousand USD and I'll be the first one in line at Micro Center. Otherwise I'll wait for the performance of the 4090 to drop down closer to $1000 USD and buy that, because that's good enough for 4K gaming, and I'm never in my life going to be 8K gaming, because why? Just to spend money? 4K gaming is something I would do on a 65" 4K OLED or similar quality. Otherwise I'm 2K gaming on a PC monitor, and for that all I need is the power of the 4080 or 4080 Super to drop down closer to $600 USD, and then I'll have an outstanding 2K gaming rig at a price I can stomach paying. If that takes one or two generations of new products, so be it, but 2K gaming doesn't warrant a $1000 GPU. For 1080p gaming, something along the lines of a 4060 Ti or 4070 at around $300 USD with RT performance suited to 1080p would be nice.
The problem with GPUs is that they need the dies for commercial purposes, so they price the dies at a premium for consumers. Until AI and other processing becomes more efficient and doesn't need as much compute, this is going to be an ongoing problem.
I made the mistake of buying a 4090 a year after it released, waiting to see if a Ti variant would come, and it never did. I will not be waiting again; I'm getting a 5090 the moment they drop.
I've had a couple chances to get a 4090 for pretty cheap, like $1300, and have passed. The melting cables. The power requirements. The lack of games and software that really need it for either work or games. It's not really needed, and a bit not ready for prime time. The 5090 or 5080, maybe.
@@douglasmurphy3266 I pay attention to sales on Amazon, Newegg, and the Microcenter I have nearby. Microcenter has good sales on open-box returns. Newegg sometimes has good sales with coupon codes. And Amazon has hidden sales in the third-party seller tab that can be brand new sometimes. If you need one right this second you won't see these. I got a 6950 XT for a friend for $450, for example.
For us ML hobbyists the 5090 really depends on the amount of VRAM. It will need a minimum of 48GB to really be any upgrade over the 4090, as even with that card VRAM is already the biggest bottleneck. Otherwise we're getting to the point where a decently kitted out Mac Studio with 128GB or more unified RAM is a better deal than a PC with nvidia GPU.
Not wrong. But stepping outside the Nvidia ecosystem for AI/ML is always an adventure in dealing with software and library compatibility. If you’re writing your own code it’s mostly okay as long as you’re not dependent on CUDA-only libraries. Running code from GitHub or HuggingFace is very hit and miss though. I will say that if VRAM is an issue and you’re willing to step outside the CUDA sandbox, Radeon Pro has some pretty attractive value offerings. A W7900 GPU gets you 48GB of VRAM for the low price of $3,600 on Amazon right now. 128GB of unified memory on a Mac Studio is a pretty sweet idea though. The only drawbacks to that system are the cost (>$10k) and the absurdly low power draw (read: clock frequency) given the intended use case. But if you absolutely need >48GB, it’s a great solution and costs significantly less than enterprise GPUs like the H100.
Buy any top die: play the silicon lottery. My Microcenter open-box special XTX has a world-record benchmark. I bought it a week after launch (it had been returned over the wonky launch reference-cooler orientation problem) and put a water block on it for my first custom loop.
@@Chronically_ChiII If the warranty is there, I tend to be positively inclined toward refurbished products. I had an excellent experience with 2 HD7950s refurbished by Sapphire, including one that I OCed when it was in my system and that still performed pretty well for a 9-year-old card when I used it again in 2021 to build a makeshift PC for my SO, with no repasting or anything whatsoever, handling (with compromises, but still) games from the PS4 era like Horizon Zero Dawn. Given I bought those 2 cards for 100€ each while they weren't that old at the time, it's still probably one of my best purchases ever. I suspect not every refurbisher is the same, but I'd be inclined to trust either the manufacturer (Sapphire in this case) or a reputable company (like Microcenter, which I wish we had here in France).
@@Chronically_ChiII Refurbished/open box is good as long as there is a good way to RMA/return the card if it truly doesn't work. Especially for top dies/cards at the top of the stack. (4090 / XTX)
I picked up a 7800 XT for $425 from Microcenter about 4 months ago and it has been a great card. I would love about double the VRAM as a heavy VR user and club photographer in VRChat; VRChat can eat even a 4090's full VRAM buffer thanks to player-created avatars and worlds that aren't anywhere near as optimized as a commercially made game. I could also use some more CPU performance; my 7700X is already starting to feel the pain of increased CPU usage in games.
@@puffyips Yes, but the increase for the price isn't worth it for me atm. Single-core is still a massive limitation in VR, so I'm hoping to pick up a 9800X3D when it comes out. Code optimization feels like a lost art in modern gaming: getting 30fps at the equivalent of 1440p in VR entirely because of high CPU time while my GPU sits at 50% usage. And frame doubling and upscaling tech cause increased motion sickness in VR thanks to motion jitter.
You should try using the DLSS Enabler mod (FSR3), playing max settings plus path-traced RT with your 7900 XTX. On my 1440p OLED I get 70-100 fps all the time. That's also using the NOVA LUT mod. The visual fidelity is the best I have ever seen in gaming.
I just got into PC gaming; I was primarily a console gamer for 25 years. I have a laptop with an i9-13000H, a 4060 with 8GB VRAM, and 16GB of RAM. It can run anything I've tried really well, but it definitely shows it's a mid-range GPU on a couple of newer games from the past year, obviously depending on your settings preferences. VERY happy with it, and ultimately glad I bought a laptop that is mostly comparable to a Series X/PS5, as opposed to a Series X and a cheap laptop. I'll probably end up building a desktop in 3-5 years using an Nvidia 5080, or whatever else is powerful and affordable at that point in time.
@@mlsasd6494 Don't hate the player, hate the game. If AMD isn't winning over the average consumer when it comes to GPU purchases, that's on them. No company is owed their business. Besides, they do buy AMD CPUs, so there is that.
@@GRIGGINS1 I think my intentions were not quite clear. I just wanted to point out that the comment is redundant, as AMD has less margin, less market share, and a less stable customer base, which obviously leads to a lot lower revenue. I think everyone should know that Nvidia is the market leader in the GPU segment. Also, as you pointed out, I don't think AMD cares too much. Companies don't have feelings, and they are killing it on the CPU side while also participating in the AI rush. Sure, more money is always better, but I think they are currently happy with their position.
I'm surprised the high end took this long to slow down. Most people cannot afford a high-end PC; midrange makes a lot of sense to me. At the high end you might as well push for a high-refresh-rate 4K panel as opposed to a wide 1440 or regular 1440 panel, and last year, mind you, I remember the price gap for panels being big. I'll keep my 2.7K panel and 6900 XT for another generation, probably lol.
The fact that people are still wanting a 90-class card after everything Nvidia has done is really gross. Sure hope your cards don't catch fire or melt on ya. If they do, you can rest assured that Nvidia will most definitely not have your back.
The 3090 Ti didn't have the issue, despite actually sucking down 600 watts to the 4090's 450. Sour grapes aside, there are valid reasons to get a 4090, and not just ray tracing, which I couldn't give a crap about. The review websites are full of crap, though, quoting "average" increases at resolutions like 1080p and 1440p with data points that are heavily CPU-constrained. The 40 series is better cooled than the 7000 series and more power efficient, Reflex is actually a pretty big deal, and in my use case (VR), NVENC was a huge deal for wireless encode before the recent Virtual Desktop updates, and there are about 5 years of VR titles in which AMD was basically unsupported. The 4090 is roughly 40% stronger than the 7900 XTX, not the 10% Moore's Law talks about. It's very, very difficult to make use of all of that at 1080p, but in VR, supersampling can fix problems like awful anti-aliasing through brute force incredibly effectively. Plus, titles like MSFS and ACC genuinely *need* that level of performance to run anywhere near acceptably.
I too am considering an RX 7600 XT. The RX 7600 XT is like a 1080 Ti++ in many regards (I was pondering getting a 1080 Ti first, but 11GB is not enough), and it's above the 1080 Ti for gaming. Its power consumption is acceptable, though I would have wanted it lower. Where it shines is content creation, depending on your workflow; in DaVinci Resolve it really shines, which is nice to have for content creation in general. Some other things will take longer (sometimes quite a bit) than Nvidia, but it can be done. I've always been an Nvidia user, but tried an RX 590 (100 bucks at the time, so why not). Had a driver issue one time, but just rolled back to the previous driver until they fixed it. I also really like Dr. Lisa Su, she is an inspiration for us all :)
Short answer is ‘yes’ for me on the 5090. I’m not interested in the 4090 as an upgrade to the 3070 at this stage in its lifecycle, so for me it’s going to be a no brainer to go for the 5090 next year instead and really future proof myself.
With most people struggling to buy groceries and pay bills the high end will be dead until the economy stops being crap. Since AMD is skipping high end this year they need to price RDNA 4 correctly and not pull another RDNA3.
@@benjaminlynch9958 You would be surprised how many poor people buy $1500+ cards with credit/financing or savings. If only the rich bought these cards, they would sell 80% fewer of them. I know 2 homeless people who couch surf; one has a 3090, the other a 7900 XTX. It's insane.
It would actually be cool to be able to activate RT without losing half the fps. This is something that even an Nvidia 4090 cannot do. I am curious about Intel Battlemage in that regard.
@@Simon_Denmark You seem to have missed that intel's arc rt implementation is the most efficient of all three major gpu vendors. When activating rt intel's solution looses the least amount of fps while nvidia's models often loose half or even more fps. Yes this is indeed something.
@@aladdin8623 Yes but Intel is very new in the dGPU market and Nvidia is on their 3rd* generation of RT GPU’s. I wouldn’t hold my breath. The overall FPS will still be lower with Intel’s GPU’s.
@@Simon_Denmark Only because the 4090 compensates for its RT deficiency with brute force on a massive die. Don't forget Intel didn't produce a high-end GPU competitor, only a mid-tier lineup.
Nice, I bought a 6800 for slightly more than referenced by yourself (GPU's generally are slightly more expensive here), assuming that 359 is without VAT (tax).
If it's true that AMD isn't adjusting their financial statements to account for rebates when they're earned, that could mean they're playing spicy with GAAP compliance, which could be an interesting story.
It’s pretty much a sure thing they’re compliant with GAAP accounting policies. There’s less than zero chance an auditor wouldn’t catch something that significant. I think the more likely scenario is that unit sales are just down hard (particularly semi-custom). On a related note, I really wish tech YouTubers would reach out to accounting / finance experts when discussing financial results and when accounting policies like this come into play. It’s so easy to get the story wrong if you don’t have that accounting / finance knowledge base.
The only way to make sure you get the best bang for the buck is to keep waiting until the day before you die. If you need to upgrade, then upgrade. Something better is always about to be released.
I'd love to swap my 2070 Super for say a 7800 XT but the one thing holding me back and waiting is the fact I'll lose DLSS support. Which so far, DLSS has been incredible allowing me to continue playing games that look fantastic at 4k...on a 2070 Super...I just don't think I'll get that kind of longevity switching to any of the AMD cards
I would expect Nvidia to price the 5090 such that it doesn't even move the 4090 off of $1600, much less anything down the stack. AMD's sweet-spot pricing usually comes 6 months to a year after a product is released: they get MSRP out of as many people as they can, then they drop to market-based pricing underneath the "official" price. Example: the 6750 XT made no sense at $550; after things settled down it dropped to $330, making it the best value in new budget builds, and I've seen it drop to $300 just recently. If price is your concern, you'd be waiting 6 months just to need to wait another 6 months for the decent pricing to materialize. Buy something now and keep the box nice and clean so you can sell it in a year.
[SPON: Directly support MLID by buying the Shargeek 170: shrsl.com/4i5a6 ]
From what you are hearing, is MSRP going to be $1000 for the 5080 and $1500 for the 5090, or is a $2000 MSRP expected on the 5090?
I backed the Shargeek 170 on Kickstarter and have been very happy with it. I approve this sponsor.
The 4090 will not lower in price because no one has an answer against it yet. The 5090 will just be even more expensive.
As long as people buy them, they will continue to be expensive. Only one way to make your voices heard, boycott.
@@cairnex4473 They're selling plenty of AI cards; they barely care about the PC builder market, they just use it as a testing ground.
@@cairnex4473 Nvidia sells a lot of GPUs!
AI ones!
They don't need "poor" gamers.
Thanks Tom!!! I literally went to Newegg to get a 6800; I traded in my 6650 XT and literally paid $220 for it. What a deal, man! Thanks for bringing it up!
I recently built a computer and used the xfx 6800 that was shown in the video. I have had no problems, except the card is huge! Make sure your case will fit a 340 mm gpu. It's a really great card!
I figuratively went to Newegg and did the same thing.
Any time I hear the question "Buy Now or Wait", it always comes down to this:
1- If you need a card, any card, just to play games, and you have none right now, then just buy. There will always be price drops and sales coming. Get what you need to play; that will tide you over for now. I'd start with a used 30xx or 6xxx GPU at 1080p.
2- If you want the best of the best, just get what's out. They very rarely decrease in price before new stuff comes out. The 4090 won't get cheaper til the 5090 is out, and then you'll be wanting to buy that instead. Just suck it up and pay the money, if that's what you really want. Or don't.
3- If you need to upgrade or replace an existing card, that's where it gets more tricky. You really have to know what performance you want and the card that's best for that, then watch the market for a while to see the price fluctuations. Typically cards are at their best prices when demand is down between releases (like right now) or once newer cards are out and no one wants the old card. For instance, if you are on a 10xx or older card, the past year has been a great chance to upgrade to a 30xx card. It still is, if you can find a new one, but if you're happy with used there's plenty of stock available.
Yeah, for me it's difficult because I have a 6600 and I want AV1 encoding, which I see a lot of people being very polarized about. I do tons of encoding and decoding, so I'd like to have my storage cost nearly halved and my performance up a third (looking at the 7600 XT), plus 16GB VRAM for eventual AI workloads. Apparently there are NPUs in the 7600 XT, but from what I heard there isn't much use for them yet; perhaps in the future? Also, the reason I look at the x600 series is the lower power envelope: I have a tiny PC case to stay mobile. I know that game devs and GPU companies are perfectly able to make very efficient graphics, and I want to put my money there, to kinda tell them to fxxk off with the huge thermonuclear cards!!
@silverhawkroman in your situation, I'd probably get the 3060 12gb or 7700 xt. The 3060 would only be better for vram and the encoding, but on par graphically. 7700xt beats it solidly.
@@ElladanKenet Lol, there's no 7700 XT shorter than 266mm in my country, and I only have a max length of 250mm. Oh, and I forgot to mention that I run 5 screens (preferably 6, but moving from the 580 to the 6600 kinda ruined that), and I heard Nvidia cards don't do above 4.
Sure if you don't give a fuck about the quality of the screen you're playing on... I am waiting for a 6090 and a very impressive 8K TV to come out next year or 2026. I'd have to wait even longer if I wasn't going to upgrade until 8k 120hz is a thing...
The only time it’s really worth waiting is if Black Friday or a new release is really close. Otherwise the waiting can be infinite: wait for the next discount/deal... hmm, new release soon... too expensive, wait for a deal... it's been so long, maybe wait a couple months for the next generation, etc. Afterwards you'll always feel a bit of buyer's remorse with PC parts, no matter when you purchase.
I think the 6800 is vastly underrated. It outperforms a 2080 Ti and has all the new DX bells and whistles, so it should stay relevant for at least another 4 years.
also vastly unavailable outside america.
The drivers were garbage... so it can't be considered underrated.
They didn't make many; yields were good and the higher-end Navi 21 models were selling out, so few 6800s were around.
@@sdnnvs ah, more driver FUD
@@sdnnvs Anyone who has used a 6000 card knows the drivers have been great. AMD is the entire reason Nvidia remade their control panel ffs. I'm so sick of people spewing this trash...grow tf up.
That segment starting at 16:40 … none of what you talk about should be causing a massive drop in sales in any one period. US Generally Accepted Accounting Principles (GAAP) dictate that any credits (known as MDF in the industry) are recognized in the period in which they are earned regardless of when the cash/product is rebated back to the retailer. This may have been an issue with respect to a big drop in cash flow, but in terms of reported revenue it’s a complete non-factor.
Another way of explaining this is that in a particular period, if a retailer sold a card for $500 as in your example, but then earned $50 in MDF credits from AMD, the revenue recognized from AMD would have been the $400 BOM kit to the OEM minus the $50 credit to the retailer. They would have only recognized $350 in revenue on that sale even though they’re receiving $400 in cash from the OEM. When the retailer cashes in those accumulated credits, they’re effectively redeeming that $50 credit sitting on AMD’s balance sheet.
This is essentially gift card accounting for B2B transactions. Revenue is recognized when merchandise changes hands, but the cash flow is going to be different because of the issuance and redemption of those credits/gift cards.
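If it helps, the period-of-sale math described above can be sketched in a few lines. This is a toy illustration of the gift-card-style treatment using the thread's hypothetical numbers ($400 BOM kit, $50 MDF credit), not AMD's actual books or policy:

```python
# Toy sketch of MDF credit accounting as described above.
# Numbers are the hypothetical $400 BOM kit / $50 credit from the thread.

def recognize_sale(bom_price: float, mdf_credit: float):
    """One BOM-kit sale with an earned MDF credit.

    Returns (revenue, cash_received, credit_liability):
    revenue is recognized net of the credit in the period of sale,
    while the credit sits on the balance sheet until redeemed.
    """
    revenue = bom_price - mdf_credit   # recognized now, net of credit
    cash_received = bom_price          # OEM still pays the full BOM price
    credit_liability = mdf_credit      # redeemed later, hitting cash flow then
    return revenue, cash_received, credit_liability

revenue, cash, liability = recognize_sale(400, 50)
print(revenue, cash, liability)  # 350 400 50
```

The point the sketch makes is the same as the comment's: reported revenue ($350) and cash received ($400) diverge in the period of sale, with the $50 gap resolved later when the credit is redeemed.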
Noice Mr. Finance. Keepin' us all smarter than the plebs.
Nobody said it caused the drop in sales. A year ago there were new models still at high prices; this year the same models' prices have fallen. So if they're booking $50 in retailer credits, it's no longer a $400 BOM kit but a reduced-margin $300, so the AIB can profit and retailers can sell cheaper while claiming new credits. Proportionately the numbers look worse, since there are still fixed costs in the business to pay.
Falling sales and lower margins at the same time as falling console sales lower revenue and profit.
It isn't that AMD is selling less; they are selling for less money, i.e. margins are lower due to the overproduction of RDNA2. The RX 6600 was a $329.99 MSRP GPU... it has been regularly selling for $199 for quite a while now.
As someone who's about to finish a Bachelor's in Business Administration (Major in Accounting), I loved your comment.
Good job trying to explain the GAAP accrual-basis principle to people unfamiliar with it.
@@Toshinben Mate, you're "confusing physics with chemistry".
While accounting and finance both come from the larger field of business (as physics and chemistry do from science), they are two different branches that DO relate to each other to form the whole picture, but are separate schools of study.
I bought a 7900 XT for a Linux build I made, and I'm very happy with it. Beastly device; the price was a bit much, but not nearly as crazy or inflated as the Nvidia parts.
The only caveat is still how far support for AMD GPUs lags in professional workloads like editing, rendering, and AI models that still rely on CUDA.
Though software like DaVinci Resolve, Agisoft Metashape, and Blender are thankfully bucking that trend in a major way.
I got one too, amazing card; upgraded from a 2070 Super. The extra VRAM is extremely useful for messing around with AI.
@@handlemonium All fair points, but not everyone has that particular use case. I would suggest the majority don't.
Sry bro but they're both high af. 😅
7900xt is a complete waste of money and it has no future proofing because of its weak and useless raytracing capabilities
4070 ti, 4080, 4090 all are better deals than 7900xt and it is a fact
RDNA2 was a generation of 4 flagships, lol. 6900/6950XT, 6800XT, and 6800 were all beast mode GPUs.
Once the crash happened the prices were insane. I got a 6800 XT for $470.
I think you meant "RAGE MODE" 😆
@@rick_takahashilmao
6800XT was in my mind the best card of the generation.
Got my 6800XT for 509 Euro with the Starfield Premium bundle... So effectively 409 Euro 😂
I believe that the low-end (and midrange) product cycle is way slower than the high-end. People don't care about new products, they care about affordable GPUs. They watch like a hawk when prices drop, they won't watch Jensen rambling about his famous friends or elephants.
Yeah doesn't matter how fast a GPU is if you can't buy one anyway.
@@TheDarksideFNothing Nvidia would disagree.
@@tringuyen7519 because they have access to all of the people who actually are willing to commit that kind of cash.
Don't get me wrong, they are moving cards. I just mean to the bulk of the market there's very little that actually feels like a good deal. I think most people are reluctantly caving and just biting the bullet on their purchase.
All my two dozen actively gaming friends only care about 200 - 400 USD desktop dGPUs and 800 - 1300 USD laptops. It's kind of like those companies try to target the 1% of richest people on Earth.
@@Kynareth6 Unfortunately, they know they can get a lot of people to spend beyond their means if they gatekeep/paywall good performance in the latest games.
Add to that the fact that it IS a lot harder to produce a good-value low-end card now, especially since people's build-quality expectations are really high these days. Compare GPU coolers now to 10 years ago: they are NOT the same.
Mind you, I do think something like a 4070 that looks like a janky PCB with a cheap cooler zip tied to it could actually sell well if it didn't thermal throttle and provided a significant enough savings.
At the end of the day these companies have decided that if you don't want to spend $500+ then you should stick with the older generation that they overproduced.
It's very frustrating, especially with how VRAM crippled older cards are now for the most part.
Still using a 1080 Ti since 2017 and still happy with it. I feel that it lacks some power here and there, but these prices are way too high.
I have a 1070ti, and currently don't mind putting my settings mid/low. Will have to upgrade eventually but for now I can play all the games I want to play.
@@MrBouncingDeath RDNA2 (especially used) was/is everyone’s saving grace
@@puffyips I actually bought a 7800 XT for a nice price (refurb), with an AMD 7800X3D, because I got an urge for something new.
Agreed. Still rocking a 1070 since the same timeframe. The only game I play where the frame drops start to become annoying is Helldivers 2, so other than that... doing a major upgrade right now for $2k+ (cuz if I'm getting a new GPU I'm just gonna get a whole new computer; I doubt this case would even fit the newer cards, and I know the PSU is definitely not going to cut it).
Still using a pair of VooDoo 2's in SLI. Don't see any reason to upgrade yet. Hitting well over 100fps in Quake II.
ZOOOooooooommmm...
Upgraded from a 1080 to a 7900GRE two weeks ago. Very happy so far. I’ll upgrade once I drop below minimum recommended specs.
I was going to do a similar upgrade from a 6700 XT to a GRE, but then I ended up looking at the 7900 XT, because might as well since I'm spending $550+, and what I ended up buying was a 7900 XTX and 7800X3D build lol. It got out of hand quickly.
I think from a performance upgrade, the 7900xtx is the only reasonable upgrade from a 6700xt. Everything else feels like it's not a big enough improvement
@@Greenalex89 yea man, 6700xt I got for $160 from a local guy a year ago and since I'm on a 3440x1440 monitor I needed all that performance. I upgrade every 4 or 5 years. Had my rx580 since 2017 and just recently upgraded mid 2023.
I'm waiting for RDNA 5. My 6800 XT is just fine for the games i play.
I've had a 6800 XT for two years now; I'm not needing to move off it until I upgrade from 1440p to a high-refresh 4K OLED. That won't happen for a while.
My GPU still runs everything well, but it's been 4 years so the upgrade itch is strong lol
Oh, that's more than fine.
Same.
@@user-eu5ol7mx8y Wait for next gen at least
0:20 7800 XT user here!
I ALMOST bought one last month (going from 5600XT since 2021). Then my friend called me, guess he heard I was gonna upgrade, and he offered me his 2080Ti for £200. Couldn't say no.
Even though I HATE Nvidia. Not bought Nvidia (new) since the GTX970.
Now I can chill though, will buy a 7800XT when they bottom out in price I guess. Or a 7900GRE.
See if the connectors melt again before I buy 5080 or 5090
That's one of the reasons why I recently decided to buy a 4070. I don't want to take unnecessary risks, even though the improved connector looks better.
@@kkevus The connectors also burn because some people overclock to mine Bitcoin and run the cards way too hard lol
@@Jwasin3_1 I highly doubt people are mining near the levels you think they are, and they don't overclock to mine, they underclock. Your comment is FULL of misnomers.
@@NahBNah I did it to my 4090 . Never again lol
I was on a 570 8GB ($230) for the longest time, up until the start of this year, when I "upgraded" to a 6600 because it was selling for $230... honestly I don't think I need to upgrade unless my old roster of games somehow gets updated.
Edit: for reference, the 7600 8GB is a $400 (local currency) product, so... you be the judge.
Nothing to judge here, really. Would've done the same thing for a budget build. The 6600 is plenty for older titles and it's a good buy in most regions I've looked at (mostly €-prices). No need to go with a 7600(XT) as of yet.
I'm kinda the same; I had a 590 and got a 6700.
The RX 6600 is a solid card, I expect that to be what the average person has in terms of performance for the past 3 generations.
I bought a used 6800 for 340€ in almost mint condition.
Great GPU!
$250 from April 2017 when the RX 580 launched is equivalent to $320 from April 2024
The 7600XT is going for as low as $320
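That inflation adjustment is just a cumulative multiplier. A rough sketch, where the ~28% cumulative 2017-to-2024 figure is inferred from the $250 → $320 comparison above rather than taken from official CPI tables:

```python
# Toy inflation adjustment matching the comparison above.
# The 28% cumulative figure is inferred from $250 -> $320, not official CPI data.
cumulative_inflation = 0.28          # assumed 2017 -> 2024 price-level change
price_2017 = 250                     # RX 580 launch price, April 2017
price_2024 = round(price_2017 * (1 + cumulative_inflation))
print(price_2024)  # 320
```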
I find it crazy people will step over their injured grandmother to give Nvidia $2000 for a graphics card only to complain games are $70 not $60.
Well, I'm one of those people. I understand that new chips with semi-good yields on a smaller manufacturing process cost more. But I don't see the reason to pay full price for new games: games which cost 20-30€ 10 years back are now going for 60-80€. Game developers are getting 10 times greedier than any graphics card manufacturer could even dream of. I don't buy any game over 40€; most of the time I wait till they are 20-25€ max, because a card I buy once, but I don't have 60 or 70€ lying around every 4-6 weeks to buy this or that game I want to play. Even games I will put a couple hundred hours into I won't buy at full price. But a decent GPU every 2-3 years is a must. I'm someone who must max out ray tracing/path tracing in any game I play, and for that I need enough power. Still haven't bought Alan Wake 2 yet because it is 40€ at key sites; waiting for it to drop to 25ish. I'm not buying top end, but high mid-tier to low high-end. Just bought a 4070 Ti Super for 900€ because at the moment it is the best deal price-to-performance-wise if you need DLSS/Frame Gen and ray tracing. I'll probably get the 5070 Ti in 12 months (if it gets 16GB of VRAM) if Nvidia brings out a new exclusive DLSS feature. Otherwise I will wait for the 6000 series.
@@williehrmann Nvidia 60% profit margin. Sony 7% profit margin. Who is greedier? Also 10 years back they were $60. I remember Assassin's Creed 3 being $60 and that was 2012.
@@jiggerypokery2962 I doubt that Nvidia has a 60% profit margin. Where did you get those numbers from? But yeah, there were some overpriced games even 10 years ago. But back then you'd get a lot of games for 20-30€ sometimes 6 months after release. Now I have to wait 2 years before I can buy them.
@@williehrmann I just found it on the Moore's Law is Dead youtube channel, where he shows a tweet from TechEpiphany with Nvidia's margins at minute 15. It's 64.73%. It just happened to be something I watched today, but on second look I see it's only for the month of May. Last year's net profit margin for Nvidia was 48.85%; it can be found by googling "Nvidia profit margin 2024".
But in short I'd play a great game on a 470 before I'd play a bad game on a 4090. Games are what is most important in gaming and the rewards should show that in a perfect world.
Microtransactions, season passes for solo games, premium "packs", subscriptions: that's why people complain when the game is already $60-70.
The 4080 came down by $200 with the Super, and AMD didn't lower the 7900XTX at all in response. Had they dropped by $200, I would own one already.
They did drop prices by 100$?
It's because the 7900xtx is a higher value product in terms of components quality and longevity. Nvidia doesn't allow much headroom at all for pricing for their AIBs, so they have to cut every corner to make at least a little profit. That's why EVGA left.
From a software standpoint, the 4080 is simply too weak to handle raytracing at 4K without upscaling. But why buy such an expensive card when you will have to play at a lower res using upscaling? That's so bad. RT doesn't even look better, just different. Only PT looks better, but even the 4090 can't handle it, with a laughable 20 fps at 4K.
Current-gen Nvidia GPUs can't handle their own features, and next-gen Nvidia will have 50 series exclusive features, blocking 40 series owners, same as now with the 30 series.
So from a pure gaming standpoint there is just no reason not to prefer raster performance. And given the 24 gigs of VRAM on the 7900xtx (yes, overkill), you don't have to lower the res to still be able to play at 4K for the next 5 years. But with the 4080 and its stingy 16 gigs, it's already looking bad right now in some titles at 4K (additional texture mods far exceed 16 gigs).
Marketing-wise AMD should lower prices, but from a technical and hardware standpoint the 7900xtx is just the higher-value product and will last much longer than a 4080. If you don't believe it, just wait and see.
Even if both cards were at the same price, for me, my gaming needs, and the expected longevity of both cards, the 7900xtx would still be the better choice by far, because of the 24 GB of VRAM alone. I'm not even talking about the ridiculous power connector of the 4080/4090 that can burn your house down if it's 1mm off.
Wait for 2,500 USD 5090 that you can't buy, b/c Nvidia doesn't make gaming GPUs anymore?
NVIDIA does!
$999 5070 will be super seller!
😂😂😂
Or $1500 5080…. If 5090 at $2500 is too much for some ”poor” gamers…
@@haukionkannel I don't think 5080 will be 1500. More like 1200.
Maybe I'm crazy... but I feel like if you needed to upgrade your GPU you already have by now. So in that sense for new builds/budget conscious buyers they can still feast in the sub $500 price point, but anyone above that would be served waiting until Blackwell/RDNA4. Just my thoughts.
I would've liked the 7900 XTX at $799 at launch, and by now it should probably be a little less than that.
Tom you often talk about "overkill" when talking about the 4090 or upcoming 5090.
But if you are an avid simracing enthusiast like me and want to go max settings with the upcoming Pimax Crystal Super VR, there is no such thing as overkill.
Same here. But the problem with iRacing is their 10+ year old engine that works at 60Hz (+ 6 steps in one Hz; so it's like 360Hz but not really). Triple 4K is hard for any game, and in this scenario, new/faster GPU will help. They announced a new engine is in the works, when they hired a bunch of devs, but that is years away. Remember, they announced rain is coming a year+ ago, and we got it two months ago.
The problem with ACC is the graphics engine (UE4). Currently, a 5800X3D is "enough CPU" for online big grids. I really hope when AC 2 comes out, it will be optimized for new 8-core CPUs and a decent GFX engine with VR in mind. It should come out this year, but I'm very skeptical. Maybe early access, like with ACC.
We are getting GTRevival, which should be on the sim side very strongly. And they said they are going away from UE5 because "120Hz in VR is not possible." -Ian Bell. But this is also at least a year away.
Rennsport is dead, IMO.
AMS2 has no online presence.
So we, simracers, are in the weird spot right now. While prices of equipment are the lowest they have ever been (wheels, pedals, rigs,...), we don't have any decent sim that utilizes new CPUs and GPUs.
@@alpha007org just play BeamNG lol
@@Wasmachineman Hehe. BeamNG is more of a sandbox and not a racing sim.
VR really is about the only gaming use case where the 5090+ is worth it.
But right now I'm playing Helldivers 2 at 1440p Super wide 120hz, so unless I get back into VR flight sims (oh hai Heatblur F-4E) I probably won't really need to upgrade the GPU.
Yes, I agree. I like the peace of mind that if I buy a VR game, it'll perform really well on my Quest 3. Just recently got Skyrim VR and knowing that I'll be able to play it with the addition of mods is worth the "overkill" title.
Grabbed a 7900XT refurb a month ago for £500, having no regrets about that given the 7800XT is still £500.
You're going to need the extra money to pay for your massive power bill. Crazy inefficient card.
@@mintymusdo you even own one or are you going off of non undervolted benchmarks? Lmao
@@puffyips Are you saying they are efficient?
@@mintymus this comment has been sponsored by Nvidia
@@exglued2394 If that helps you feel better, go ahead and believe that. AMD GPUs suck.
I have no idea how low end RDNA3 couldn't be massively profitable when component costs are somewhere around $100
The node is expensive.
@@VoldoronGaming It's using TSMC 6nm. It's the cheapest node in the whole RDNA 3 lineup. The more expensive chiplet GPUs (7700XT and up I believe) use both 5nm and 6nm.
@@VoldoronGaming N6, which the low end uses, is dirt cheap at this point. The die is unlikely to cost more than $15, and that's assuming all dies with defects are just tossed out. VRAM has been dirt cheap for well over a year now at about $3 per GB.
@@VoldoronGaming LOL Voldoron you again the ultimate AMD shill, you dare said the "node is expensive" but couldnt said the same thing about Nvidia using TSMC's 5nm ? bahahaha you're a joke with your double standard
Gross revenue should be decently profitable, but net profit? That's tricky without the minutiae, such as the cost of shipping parts per unit to OEMs, volume per shipment, and potential tariff costs if there are any import/export duties.
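As a sanity check on the "around $100" component-cost claim, here's a back-of-the-envelope bill-of-materials sum. The die (~$15) and VRAM ($3/GB) figures come from the thread above; the board, cooler, and assembly line items are pure assumptions for illustration:

```python
# Back-of-the-envelope BOM for a low-end RDNA 3 card.
# Die and VRAM figures come from the comments above;
# the remaining line items are illustrative guesses only.
bom = {
    "die_n6": 15,          # small N6 die, generous defect assumption
    "vram_8gb": 8 * 3,     # 8 GB of GDDR6 at ~$3/GB
    "pcb_vrm": 25,         # board + power delivery (assumed)
    "cooler": 20,          # dual-fan cooler (assumed)
    "assembly_test": 15,   # manufacturing/test overhead (assumed)
}
total = sum(bom.values())
print(f"Estimated BOM: ${total}")  # lands right around the ~$100 ballpark
```

Even doubling the guessed line items leaves plenty of room under a $270 street price, which is the point the thread is making.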
Got myself a 6700XT a few months back, pretty happy with the performance since I only have a 1080p monitor anyway
The RX 6700 XT is a fantastic card!
The only reason why I upgraded was that I had a special occasion at the time and I wanted the highest end card AMD produced to celebrate. (A Sapphire RX 7900 XTX Nitro+)
You also need a pretty beefy card for VR gaming if you want high resolutions
Let these companies feel the pain. We want 4080 performance for under $700
Nvidia has almost 80% margin
You will have, just need to wait till 2025.
What pain? Gaming isn't even Nvidia's main moneymaker anymore. AI is their bread and butter. At this point selling gaming GPUs is more like a side hustle.
@MuShinnen The claim is semi-accurate; the 4080 Super dropped the price by $200 and recently sold for $949, according to a Videocardz article.
Update vs the 4080.
@@aberkae still overpriced imo
My goal is buying used from people upgrading to the new cards.
hoping to get a used 4080 or 4080 super for like $600 later this year on a site.
Yup I’m selling my 4070 ti super for $600 to offset the cost of a 5080. Of course depending if it’s even logical to upgrade depending on specs.
I believe the real reason AMD is predicting equal or lower earnings in the 2nd half of the year is that they might try to make a move on Nvidia's market share at mainstream price points with the RX 8000 series. If the leaks from Moore's Law Is Dead are anything to go by (and they have been proven right in the past), the 8800XT might offer 7900XT-level rasterization performance as well as +30% raytracing performance at $500, compared to the 7900XT's $700 (and $900 launch price).

Considering that RDNA 4 is not a clean-slate design but more of a "bug fix" for RDNA 3, profit margins have to be minimal to achieve this scale of generational price-to-performance uplift. If AMD is making a move on market share, then even if desktop GPU sales are expected to rise significantly, AMD would not expect a revenue increase (and potentially a revenue decrease) due to extremely low profit margins. Therefore, AMD would be trading short-term revenue for market share by lowering margins.

This is something they have done repeatedly in the past, and it makes sense considering their current market share is likely holding them back from piercing into the Nvidia core-buyer portion of the market (the mainstream, non-hardware-enthusiast segment). Although this assessment is based on a logical line of thought, I do hope for every gamer's sake that I am right and we might see some major competition in the desktop GPU market.
learn to use paragraphs
I don't have big expectations for RDNA 4: similar/equal to a 7900 XT but more efficient, and no more than $600, ideally $500 or $550; more than $600 would be a flop. I'd jump on it at no more than $600, which would be a huge jump for me, but if it costs more than that it wouldn't make sense over the 7900 XT because of less VRAM and maybe a bit less performance.
Just wait for the RTX 5090. There's no point in buying a 2-year-old 4090. No one should buy a 4090 now. These types of cards should be bought when they first come out.
Nvidia will price it in a way such that the customer loses, no matter what you decide to do. It's been like that for some years now.
@@petergplus6667 what do you think a higher-end 5090 will cost on release? I'm thinking of something like an MSI Suprim X Liquid, if it's even out at release.
@@josh93824 Sorry, I don't have a precise expectation in that regard. It probably depends on whether Nvidia tightened the thumb screws on 3rd parties, who hardly earn anything already, or whether the new contracts leave a little bit of wiggle room.
I don't understand how the sales numbers prove the 7600 XT is selling well considering all the other cards on the list sold even better like the 4070 which went from 60 to 320 units sold?
And there's still what feels like an infinite supply of used RTX 3060 12GB cards available for $250-$300 or even $200-$225 if you haggle it down.
@@handlemonium And new ones for around $280. I remember Tom talking about the overbuild on the 3060 chip over a year ago.
This channel is a bit devoid of critical thinking skills, and it's full of post hoc rationalization.
because one retailer isn't the whole picture, which is what people fail to address, both 7600s are market failures
Lots of folks buying 7600xt 16GB for running LLMs. That 16GB memory allows one to run llama 7b model at 30 tokens/sec.
Check out LM Studio's ROCm version if any of you have that card.
It’s spiking because of the 16gb vram.
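The 7B-on-16GB claim is plausible on paper: quantized weights are small. A rough sketch of the sizing, where the 4-bit quantization width and the overhead multiplier are my assumptions for illustration, not measured LM Studio numbers:

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Quantization width and overhead multiplier are illustrative assumptions.
def model_vram_gb(n_params_b, bits_per_weight=4, overhead=1.3):
    """Weights footprint in GB, padded for KV cache and activations."""
    weight_gb = n_params_b * bits_per_weight / 8  # billions of params -> GB
    return weight_gb * overhead

llama7b = model_vram_gb(7)  # ~4.5 GB: fits a 16 GB card with room to spare
print(f"{llama7b:.1f} GB needed; fits in 16 GB: {llama7b < 16}")
```

The spare capacity is what lets you run longer contexts or bigger quants, which is why the 16GB card is attractive over the 8GB one for this use case.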
If they release the 8800 XT for $500/550, you'll be getting 75-80% of the 4090 for almost a quarter of the price of the RTX 4090. AMD will sell a lot of them. This is a golden opportunity to take midrange market share.
If the 8800xt is equivalent to a 4080, it's going to cost $750 at least. Don't expect price drops. New generations will stack on top of existing inventory.
That's the optimistic guess. If it comes out around the XTX, but with 16GB, better RT and, say, 100W less power draw for $550 then it will be an absolute smash-hit. But if it comes out a little less than the 7900XT, better RT, but still 300W-ish TDP and still $550 then it's going to be a somewhat lukewarm reception to say the least.
So the 8800XT will end up a 5070 competitor for raster.
@@ehenningsen RTX 5070 will come with a price increase. Expect it to cost 699-799
Buy now or pay 25% more later. Hmmm.
Did he not mention this at all???
The same reason why I just splurged on a 4090 now. Found a 4090 locally for 1250. Maxed out on 4K frames for now, and this will be my final upgrade before everything gets too expensive.
care to explain for those of us that have no idea what you are talking about?
That mindfactory data had every gpu roughly doubling, are you sure the stats were for equal time periods?
Not really. Some yes, but specifically the 4060 goes from 80 to 110 to 130 and doesn't move along with the other cards. I suppose you might say that the RX 7600 XT is not gaining in popularity, but rather the 4060 is losing it, but the result is the same.
Don't buy now or when the 5090 comes out. It's going to be the same BS as before.
Until the end of time and way worse than we can imagine.
5090 will be sold out for first year just like 4090 was.
@@jaggsta was that really for an entire year?
@@jaggsta sure, there are a lot of nvidiots out there.
@@ofon2000 with Nvidia withholding stock, that's not a surprise.
High end sales are flatlining b/c the cost of living is astronomical. 84% of Americans are living paycheck to paycheck. Less money for needs/ necessities and even less money for wants. We all out here just trying to survive
Still using a 6700xt on the AM4 platform, and it does the job for me. I am in no hurry to upgrade either my 5900x or the 6700xt; still getting over 140 fps in the games I play at 1440p.
As a 1080ti/8600k owner I will be waiting for Zen 5 to stabilize in supply/price at least. My G-Sync monitor is 3440x1440 80Hz, so running games at medium works fine for my taste in delivered framerates.
Perhaps I might even wait for a next-gen QD-OLED display to get 34+ inch 4K 120Hz before I invest in a computer upgrade. Paying a premium for extra performance above 80 FPS for my monitor doesn't make much sense, so better to save up and get a setup that is in "sync".
I'd be glad to see RDNA4 launch in the next 3 months, but I'm a bit skeptical. I wonder how much 7900xt/xtx stock there is on shelves. AMD is likely waiting for that to clear.
It feels like the 7900xtx is selling well. I could imagine 7800xt and 7900gre sales are blocking 7900xt sales a little bit.
gotta love how absolutely non-applicable all of these prices are for the European market. Everything is still overpriced to hell over here. It's almost a $200 difference between what a card costs in the US and over here.
Why the 7900 GRE isn't selling more is completely strange to me.
I don't know how to square your statements / sources saying that AMD cards have been selling incredibly well, with Lisa Su's statements that GPU sales are down. Claims like "RDNA 3 is selling VERY well" don't make any sense with these results.
Did you not read the slide?
Did you even watch the video
Specific AMD cards have been selling really well. Overall, sales not good.
@@sulimanthemagnificent4893 I read the slide. What's your point?
Bought the 7900 XTX before they all go away
Amazing GPU, zero problems on linux
Got the nitro+. Not bullshitting, best GPU I've ever had in 20 years. Quality and performance are outstanding
Happy with the 3090 I got from Microcenter for 760. I’m not upgrading until I can get a 4090 for less than 1K.
You cannot kick Mindfactory data down the quarter. That's not how it works. This is just bad sales. If the sales are equal for the 7600 and 4060, how do you reconcile the data in the Steam survey, where there is no 7600 but the 4060 is there? You can attribute many problems to the Steam survey, but in most cases it largely represents the macroeconomics of the GPU market in general.
Different things… the 4060 has sold more over the long run… the 7600 has picked up in sales in recent weeks…
It's pretty weird how he keeps trying to make the 7600 look better in sales and usage than it really is. That GPU was a market flop; Mindfactory is the only place in Germany where AMD has a lead that big, let alone the rest of Germany, and the world.
Only place where sales matter is mindfactory.
There seems to be some weird techtuber push for AMD cards. Dude runs a 4090 in his main rig 🙄 do as I say, not as I do.
When you look at budget prebuilt gaming PCs, you see a ton of 4060s. Way more than AMD GPUs.
Just got this video recommended. Your contacts in distribution give amazing context to purchase decisions. Love the sales trends.
Glad it was helpful!
I hope so much that Blackwell will be reasonably priced.
And then you woke up and your coffee was cold.
It is pretty much confirmed at this point that AMD will not be competing in the high end next generation. You've seen Nvidia's pricing when AMD is close: the 3090 was $1600 when the $1000 6900XT was only 10% weaker in raster. The 4090 is effectively $1800 when the, now, $850-900 7900XTX is only 20% weaker in raster... so how the FUCK do you think they're going to price things when the strongest AMD card will go against the 5070?
The 6800 is still my favorite product from AMD. I had it constantly below 180W after Ancient Gameplays' optimization video. The best affordable 16 GB card you can get, especially if found secondhand for $250-$300.
Those low-priced 7800XTs you showed from Newegg are refurbished/open-box used cards. Kinda misleading tbh. New 7800s you rarely see below $480, and the premium cooler models almost never get discounted.
Right... as if opening a box or repairing a product back to QC standard is somehow worse than the gamble of a new, non-validated unit. You do realize that a refurbished product is validated to meet QC, right? A refurbished product, be it 3rd-party authorized or 1st party, is also far less likely to be DOA or to fail early than a new product. New products are tested by batches; bad units get missed in some batches, then get sent to a customer to be DOA or fail early. ALL refurbished units are required to fully pass QC as if new, or they never leave the facility. I have years of refurb experience, one of which was in QC, so I know first hand, at least within the SE U.S. I shouldn't have to explain why an unused GPU in an open box isn't a problem. But hey, by all means, if you don't believe my experience, that's fine. Keep enjoying your cranial enema!
@@fateunleashed9680 You completely missed the point. At no point was I speaking about the quality of the 7800XT cards. I feel as though the price of the 7800XT is misrepresented here. I have no problem with refurb or used products.
Why wait for the 5090 when you can wait for the 6090?
I bought the 7900xt when it first came out for around $900. The 7900XTXs were all purchased by scalpers and had a $500 markup that I refused to pay. I don't regret it a bit; it's been a great card. At around $600, if you can find one, it's a steal.
I paid 900 quid for my Liquid Devil 7900XTX 10 months ago. xD
Your time has value too. So it's not a bad deal :) Plus the 7900xt will last quite some time
Just snagged a used 6800 for $315 + shipping. Building a comp with it early next week, glad I chose correctly 😊
Good Lord that's an INSANE deal! Good job!
@@MooresLawIsDead jayz2cents got it for 260 a few days back 😂
Ding ding ding ding
Don't feel like you need DisplayPort 2.1 when you're shopping; seriously, there is no noticeable difference.
Going by bang for the buck, I'd say there currently are but 5 GPUs worth buying:
1. Intel Arc A380 6GB for $110
2. RX 6600-6650 XT 8GB for $200-$220
3. RX 6800 16GB for $360
4. RX 7900 GRE 16GB for $540
5. RTX 4070 12GB for $550
Everything else either has sold out, doesn't deliver what it should, or is utter luxury because you already are into the 4k gaming range starting with option #3.
And I very much doubt that it makes sense to buy anything beyond an RTX 4070 right now, because the introduction of the RTX 50x0 series will likely come with a price drop for the RTX 40x0 series.
@MooresLawIsDead Maybe some of those high range (4090) purchases are being made by AI developers / enthusiasts?
I will have my eyes on a used 7900xtx when the next gen are released. Not interested in Nvidias power connector.
I got a 7900xt. I will wait for the 5090 to drop to upgrade. I would buy AMD but no big increase in performance.
Rdna5 will be released at the end of next year and will be a giant leap in performance. Supposedly 300+ cu.
I finally picked up a replacement for my Vega 64. Though I don't do much gaming these days, and what I do play is esports at 4K 144Hz, I decided on a 7900XT, as it was selling at $100 less than the cheapest alternative in my locale and I wanted something to last as long as my Vega has, for a similar inflation-adjusted price.
My alternative for the same money from Nvidia was an RTX 4070TI (not the super variant) so it was a bit of a non-contest.
I watch a LOT of tech content... MLID is my favorite. It's awesome. TY for your hard work!
Downloading this video so I can watch it at home in a few hours as I cook my curry!
Working at a Micro Center somewhere in the Midwest, I'm calling BS on AMD out-selling Nvidia. Not from what I'm seeing from the past 30 days.
If he said that, you would have a point. Watch again, and listen.
He is always biased towards amd
Holding on to my 7900XTX. Nvidia failed with their Super cards but won the wallets of the nvidiots. The 5080 at best is going to land between a 4080 and 4090, just with better RT than the 4090. My card is gonna last a while.
Lol, does the 5090 come with its own case? Because I assume it will be double the size of the last one... like 7 slots lol
They probably made the 4090 coolers so big so they didn't have to re-tool for the 5090.
Yes, and it needs its own electrical plug..
@@Greenalex89 Really, you would assume it should have its own 1500W power supply as well.
Here in Europe I've seen the 4090 drop below msrp for a month now. It currently sits at around 2k USD converted, which includes a 25% sales tax, and at launch it was around 2.2k USD which it has never dipped below before. I'm using ASUS TUF models as reference here. The 4080 Super also dropped by about 2-300 USD converted
25% SALES TAX?????????????
@@Dave-dh7rt yes, 25%.. gotta love it XD
At least with my private company using high end gaming pc hardware, I can deduct it all. But regular consumers aren't so lucky.. This is in Denmark by the way, Europe's avg sales tax rate is around 21%
@@Dave-dh7rt Why do you think they have free health care?
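For anyone converting those tax-inclusive EU prices: with a 25% VAT, the pre-tax price is the sticker price divided by 1.25, not the sticker price minus 25%. A quick sketch using the figures from the comment above:

```python
# Strip a 25% VAT off a tax-inclusive sticker price.
# Common mistake: subtracting 25% of the sticker price over-removes the tax.
def ex_vat(price_incl, vat_rate=0.25):
    """Return the pre-tax price given a tax-inclusive price."""
    return price_incl / (1 + vat_rate)

print(ex_vat(2000))  # current ~$2000 sticker -> $1600 before tax
print(ex_vat(2200))  # launch ~$2200 sticker -> $1760 before tax
```

So the Danish 4090 street price is actually in line with, not above, US pre-tax pricing once the VAT is backed out.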
Wait for RDNA 4 if you’re looking to be sorely disappointed.
i wont be if i can get 7900 gre overclocked performance for $400
Well, if you're following basically any hardware news channel, the rumour that RDNA 4 is basically going to be a bug fix, really only improving RT perf, is already out there. So that sets people's expectations.
RDNA 5 is the one that has the potential to be a letdown, because by then you've had basically 2 generations of cards in the same performance range, having made relatively little gains over the 6000 series, and many 6000 and 7000 series owners are gonna be looking to upgrade to the 9000 series for a new breath of life, given that most people look to upgrade every 2 generations or so (HW Unboxed survey).
So if AMD doesn't deliver at least, I'd say, a 30-50% uplift, they might be looking at serious market share loss as some people jump ship to Nvidia, assuming Nvidia has decent gen-on-gen gains, which could add up to the 30-50% that upgraders are looking for. If Nvidia's price is even remotely acceptable, and if Blackwell doesn't sell well it could throw Nvidia into a 3000-series situation where they cut prices, then people are going to start jumping, and AMD is gonna have to keep the consumer GPU team on life support because the revenue stream from people upgrading isn't going to come.
RDNA 5 is going to be a big turning point for AMD with most consumers who upgrade every 2-3 generations. Only time will tell how well the new architecture performs and whether, in the eyes of the price/performance-conscious consumer, they win out over Nvidia in the midrange and keep their customer base.
Depends on what you're looking for. If the top card comes out at around a 7900XT in performance, with much better RT, 16GB and a 100W reduction in power usage for, say, $550, then a lot people are going to find that attractive. If you're looking for "a 5090 competitor, but cheaper", then yeah, sure, you're going to be disappointed.
@@Joker-no1fz lol, you wish, for only $400; prepare to be disappointed when you find it's not at that price, mark my words.
@@Eleganttf2 Oh well, I'll buy used, idc dude. My last GPU was used.
I can't believe recent Nvidias still only have 12 GB of VRAM when my GTX 1080 Ti has 11!?!
The US will add a 25% GPU import tax soon. I don't know if waiting is a good idea.
It's infuriating what these idiot politicians are doing to this country. The Founding Fathers must be turning in their graves.
@@user-eu5ol7mx8y the founding fathers owned slaves, and the original version of the Constitution said that only land-owning white males could vote... Soooo, I kinda doubt that they would give a shit about the blanket disenfranchisement of non-wealthy Americans.
@@user-eu5ol7mx8y but muh china bad
The nasty part of this is that the tax will trickle down to buyers across the globe as well, since the manufacturers are American, so by doing this they're just raising prices worldwide.
But but china bad
I think it's a horrible time right now to buy a 40 series GPU without a good discount. Buying a 4090 is especially pointless without an excessive discount of at least 50% because otherwise if you go top end, you might as well wait for a 5090 instead.
Wait what?????????
Should I wait for a 5090???????????? How about sending a couple thousand USD and I'll be the first one in line at Micro Center.
Otherwise I'll wait for the performance of the 4090 to drop down closer to $1000 USD and buy that, because that's good enough for 4K gaming, and I'm never in my life going to be 8K gaming. Why would I? Just to spend money? 4K gaming is something I would do on a 65" 4K OLED or similar quality. Otherwise I'm 2K gaming on a PC monitor, and for that all I need is the power of the 4080 or 4080 Super to drop down closer to $600 USD; then I'll have an outstanding 2K gaming rig at a price I can stomach paying. If that takes one or two generations of new products, so be it, but 2K gaming doesn't warrant a $1000 GPU.
For 1080p gaming, something along the lines of a 4060 Ti or 4070 at around $300 USD with the RT performance for 1080p would be nice.
The problem with GPUs is that they need the dies for commercial purposes, so they price the dies at a premium for consumers.
Until AI and other processing becomes more efficient and doesn't need as much compute, this is going to be an ongoing problem.
5090 will be $4000
🤣🤣🤣🤣
5060 will be $600 and give 30fps in 1080p! :D
I made the mistake of buying a 4090 a year after it released, waiting to see if a Ti variant came, and it never did. I will not be waiting again; I'm getting a 5090 the moment they drop.
I've had a couple of chances to get a 4090 for pretty cheap, like $1300, and have passed. The melting cables, the power requirements, the lack of games and software that really need it for either work or games. It's not really needed, and it's a bit not ready for prime time. The 5090 or 5080, maybe.
I'm on a 2080. There's a lack of games I'm interested in to justify an upgrade.
If the 5070 has 16gb I'll get it
Do you have a friend who sells all his stuff when rent is due? Most 4090's I see under $1750 are bogus sellers on Amazon
@@douglasmurphy3266 I pay attention to sales on amazon, Newegg, and Microcenter I have nearby. Microcenter has good sales on open box returns. Newegg has good sales with coupon codes sometimes. And Amazon has hidden sales in the third party seller tab that can be brand new sometimes. If you need one right this second you won't see these. I got a 6950 xt for a friend for $450 for example.
@@douglasmurphy3266 Amazon is a terrible place to buy used goods. Ebay is where its at for used GPUs.
For us ML hobbyists the 5090 really depends on the amount of VRAM. It will need a minimum of 48GB to really be any upgrade over the 4090, as even with that card VRAM is already the biggest bottleneck. Otherwise we're getting to the point where a decently kitted out Mac Studio with 128GB or more unified RAM is a better deal than a PC with nvidia GPU.
Not wrong. But stepping outside the nVidia ecosystem for AI/ML is always an adventure in dealing with software and library compatibility. If you’re writing your own code it’s mostly okay as long as you’re not dependent on CUDA-only libraries. Running code from GitHub or HuggingFace is very hit and miss though.
I will say that if VRAM is an issue and you're willing to step outside the CUDA sandbox, Radeon Pro has some pretty attractive value offerings. A W7900 GPU gets you 48GB of VRAM for the low price of $3,600 on Amazon right now.
128GB of unified memory on a Mac Studio is a pretty sweet idea though. The only drawbacks to that system are the cost (>$10k) and the absurdly low power draw (read: clock frequency) given the intended use case. But if you absolutely need > 48GB, it’s a great solution and costs significantly less than enterprise GPU’s like the H100.
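To put the 48GB threshold in context, the same napkin math shows why 24GB tops out around mid-size models. The 4-bit quantization figure is my assumption, and this counts weights only, ignoring KV cache:

```python
# Why 24 GB vs 48 GB matters for local inference:
# weights alone for a 4-bit 70B model already exceed a 24 GB card.
def weights_gb(n_params_b, bits_per_weight=4):
    """Weights-only footprint in GB for a model of n_params_b billion params."""
    return n_params_b * bits_per_weight / 8

b70 = weights_gb(70)  # 35 GB of weights
print(f"70B @ 4-bit: {b70:.0f} GB -> fits 24 GB: {b70 < 24}, fits 48 GB: {b70 < 48}")
```

So a hypothetical 48GB 5090 would open up a whole model class that a 24GB 4090 simply cannot load, which is the commenter's point about VRAM being the real bottleneck.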
The XTX for $750 is refurbished/open box, which is a gamble to buy.
Buy any top die: play the silicon lottery
My microcenter open box special XTX has a world record benchmark.
I bought it a week after launch (it had been returned for the wonky launch reference cooler orientation problem) and put a water block on it for my first custom loop.
Just went to check, still stands, current WR Furmark Knot (VK) score 13669
So you recommend refurbished?
But only from microcenter?
@@Chronically_ChiII If the warranty is there, I tend to be positively inclined toward refurbished products.
I had an excellent experience with 2 HD 7950s refurbished by Sapphire, including one that I OCed when it was in my system, and that still performed pretty well for a 9-year-old card when, in 2021, I used it again to build a makeshift PC for my SO with no repasting or anything whatsoever, handling (with compromises, but still) games from the PS4 era like Horizon Zero Dawn.
Given that I bought those 2 cards for 100€ each while they weren't that old at the time, it's still probably one of my best purchases ever.
I suspect not every refurbisher is the same, but I'd be inclined to trust either the manufacturer (Sapphire in this case) or a reputable company (like Micro Center, which I wish we had here in France).
@@Chronically_ChiII Refurbished/open box is good as long as there is a good way to RMA/return the card if it truly doesn't work, especially for top dies/cards at the top of the stack (4090/XTX).
I picked up a 7800xt for $425 from Micro Center about 4 months ago; it has been a great card. I'd love about double the VRAM as a heavy VR user and club photographer in VRChat. VRChat can eat even a 4090's full VRAM buffer, thanks to player-created avatars and worlds not being anywhere near as optimized as a commercially made game. I could also use some more CPU performance; my 7700x is already starting to feel the pain of increased CPU usage in games.
Would the 7800x3d help? Or are you thinking Zen 6 high core count and V-Cache all on a single CCD?
@@puffyips Yes, but the increase for the price isn't worth it for me atm. Single-core is still a massive limitation in VR; I'm hoping to pick up a 9800x3d when it comes out. Code optimization feels like a lost art in modern gaming: getting 30fps at the equivalent of 1440p in VR entirely because of high CPU time while my GPU sits at 50% usage. And frame doubling and upscaling tech cause increased motion sickness in VR thanks to motion jitter.
Holy cow..
I am loving my 7900 XTX and I'm excited for FSR 3.1! I am very curious to see what AMD launches next on the GPU side, and when.
You should try using the DLSS Enabler mod (FSR3), playing max settings plus path-traced RT with your 7900 XTX. On my 1440p OLED I get 70-100 fps all the time. That's also using the NOVA LUT mod. The visual fidelity is the best I have ever seen in gaming.
I have the 7900 XTX and I don't care about any upscaling; that's why I bought it.
I like to keep up on this kind of stuff even though my 6950xt (bought on sale late last year) is doing just fine
I think I'll wait high key...I'll still watch this all the way through though lol
So glad I got a 6800xt when it was available. The unit is killing it.
Great video as always!
I just got into PC gaming after being primarily a console gamer for 25 years. I have a laptop with an i9-13000H, a 4060 with 8 GB of VRAM, and 16 GB of RAM. It can run anything I've tried really well, but it definitely shows it's a mid-range GPU on a couple of newer games from the past year, depending on your settings preferences.
VERY happy with it, and ultimately glad I bought a laptop that is mostly comparable to a Series X/PS5, as opposed to a Series X plus a cheap laptop. I'll probably end up building a desktop in 3-5 years using an Nvidia 5080, or whatever else is powerful and affordable at that point in time.
Reminder: Nvidia's gaming revenue was almost 3x AMD's gaming revenue this quarter.
The fact is the general public buys Nvidia more. Other YouTubers have gone over this.
If you sell more products at higher margins, to a less price-sensitive crowd (budget AI buyers), you have more revenue. Great revelation.
@@mlsasd6494 Don't hate the player, hate the game. If AMD isn't winning over the average consumer when it comes to GPU purchases, that's on them. No company is owed their business. Besides, those consumers do buy AMD CPUs, so there is that.
@@GRIGGINS1 I think my intentions were not quite clear. I just wanted to point out that the comment is redundant: AMD has lower margins, less market share, and a less stable customer base, which obviously leads to much lower revenue. Everyone should know by now that Nvidia is the market leader in the GPU segment. Also, as you pointed out, I don't think AMD cares too much. Companies don't have feelings, and they are killing it on the CPU side while also participating in the AI rush. Sure, more money is always better, but I think they are currently happy with their position.
@@mlsasd6494 Ah, I see. Thanks for clarifying your point. I agree.
I'm surprised the high end took this long to slow down. Most people cannot afford a high-end PC, so midrange makes a lot of sense to me. At the high end you might as well push for a high-refresh-rate 4K panel as opposed to an ultrawide 1440p or regular 1440p panel. Last year, mind you, the price gap between panels was big. I'll probably keep my 2.7k panel and 6900 XT for another generation lol.
Great video Tom
The 7900 GRE is $525 right now as well.
6750 XT for $300, 6800 for $360, 7900 GRE for $525, 4070 Super for $570.
These are the ones worth buying right now.
The fact that people still want a 90-class card after everything Nvidia has done is really gross. Sure hope your cards don't catch fire or melt on ya. If they do, you can rest assured that Nvidia will most definitely not have your back.
People have more money than sense.
The 3090ti didn't have the issue, despite actually sucking down 600 watts to the 4090's 450.
Sour grapes aside, there are valid reasons to get a 4090. And not just ray tracing, which I couldn't give a crap about.
The review websites are full of crap, though. Quoting "average" increases at resolutions like 1080p and 1440p with data points that are heavily CPU-constrained.
The 40 series is better cooled than the 7000 series and more power efficient, Reflex is actually a pretty big deal, and in my use case (VR), NVENC was a huge deal for wireless encode before the recent Virtual Desktop updates. There are also about 5 years of VR titles in which AMD was basically unsupported.
The 4090 is roughly 40% stronger than the 7900 XTX, not the 10% Moore's Law Is Dead talks about. It's very, very difficult to make use of all of that at 1080p, but in VR, supersampling can fix problems like awful anti-aliasing through brute force incredibly effectively.
Plus, titles like MSFS and ACC genuinely *need* that level of performance to run anywhere near acceptably.
I too am considering an RX 7600 XT. The RX 7600 XT is like a 1080 Ti++ in many regards (I was pondering getting a 1080 Ti first, but 11 GB is not enough). The RX 7600 XT is above the 1080 Ti for gaming. Its power consumption is acceptable, though I would have wanted it lower. Where it shines is content creation, depending on your workflow; it really shines in DaVinci Resolve, and it's nice to have for content creation in general. Some other things will take longer (sometimes quite a bit) than on Nvidia, but they can be done. I've always been an Nvidia user, but I tried an RX 590 ($100 at the time, so why not). I had a driver issue once, but just rolled back to the previous driver until they fixed it. I also really like Dr. Lisa Su; she is an inspiration for us all :)
I’m waiting on Arm processors to take over so we can get native BootCamp on M-Series and Hackintosh on non-M-Series… It’s coming!!!
By that point, RISC-V would probably be worth consideration, unless you like the Apple Tax.
Short answer is ‘yes’ for me on the 5090. I’m not interested in the 4090 as an upgrade to my 3070 at this stage in its lifecycle, so for me it’s going to be a no-brainer to go for the 5090 next year instead and really future-proof myself.
With most people struggling to buy groceries and pay bills, the high end will be dead until the economy stops being crap. Since AMD is skipping the high end this year, they need to price RDNA 4 correctly and not pull another RDNA 3.
People buying $1,000+ graphics cards aren’t struggling to buy groceries and pay bills. LOL.
Yeah, I'm hoping RDNA 4 will be fine in terms of its price-to-performance ratio.
@@benjaminlynch9958 You would be surprised how many poor people buy $1,500+ cards with credit/financing or savings; if only the rich bought these cards, they would sell 80% fewer of them. I know two homeless people who couch surf: one has a 3090, the other a 7900 XTX. It's insane.
@@benjaminlynch9958 a lot of (tech) layoffs recently
Guess I'll go back to playing Pokémon Blue on my original Game Boy on the toilet...
I upgraded from my beloved 1080 Ti to a 4090, and I'm really hoping the latter has as good a lifespan.
Blackwell will have HORRIBLE price-to-performance gains and RDNA 4 will be barely a bump to raster… glad I bought my RDNA 2 card 😅
It would actually be cool to be able to activate RT without losing half your fps. This is something that even an Nvidia 4090 cannot do. I am curious about Intel Battlemage in that regard.
@@aladdin8623 You’re expecting Battlemage to be better than the 4090 in regard to RT? That’s something.
@@Simon_Denmark You seem to have missed that Intel's Arc RT implementation is the most efficient of all three major GPU vendors. When activating RT, Intel's solution loses the least fps, while Nvidia's models often lose half their fps or even more. Yes, this is indeed something.
@@aladdin8623 Yes, but Intel is very new in the dGPU market and Nvidia is on their 3rd generation of RT GPUs. I wouldn’t hold my breath. The overall FPS will still be lower with Intel’s GPUs.
@@Simon_Denmark Only because the 4090 compensates for its RT deficiency with brute force on a massive die. Don't forget Intel didn't produce a high-end GPU competitor, only a mid-tier lineup.
Nice. I bought a 6800 for slightly more than the price you referenced (GPUs are generally slightly more expensive here), assuming that $359 is without VAT (tax).
If it's true that AMD isn't adjusting their financial statements to account for rebates when they're earned, that could mean they're playing spicy with GAAP compliance, which could be an interesting story.
It’s pretty much a sure thing they’re compliant with GAAP accounting policies. There’s less than zero chance an auditor wouldn’t catch something that significant.
I think the more likely scenario is that unit sales are just down hard (particularly semi-custom).
On a related note, I really wish tech YouTubers would reach out to accounting/finance experts when discussing financial results and when accounting policies like this come into play. It’s so easy to get the story wrong if you don’t have that accounting/finance knowledge base.
The only way to make sure you get the best bang for the buck is to keep waiting until the day before you die. If you need to upgrade, then upgrade. Something better is always about to be released.
Another melting GPU power cable, yay.
I'd love to swap my 2070 Super for, say, a 7800 XT, but the one thing holding me back is that I'll lose DLSS support. So far, DLSS has been incredible, allowing me to keep playing games that look fantastic at 4K... on a 2070 Super. I just don't think I'd get that kind of longevity switching to any of the AMD cards.
I would expect Nvidia to price the 5090 such that it doesn't even move the 4090 off $1,600, much less anything down the stack.
AMD's sweet-spot pricing usually comes 6 months to a year after a product is released: they get MSRP out of as many people as they can, then they drop to market-based pricing underneath the "official" price. Example: the 6750 XT made no sense at $550; after things settled down it dropped to $330, making it the best value in new budget builds, and I've seen it drop to $300 just recently.
If price is your concern, you'd be waiting 6 months just to then wait another 6 months for the decent pricing to materialize. Buy something now and keep the box nice and clean so you can sell it in a year.
25% tariffs are coming too.
@@Paelmoon I hate tariffs. But I wonder, will this affect prices globally or just US prices?