The "price cuts" by Nvidia are just laughable at this point. The higher-end cards are now sitting at prices where you could maybe argue their launch MSRP should have been. The lower-end cards haven't even hit MSRP. Even the 4-year-old 1660 Super is still 250€ new, which, surprise surprise, is also above its launch MSRP of 230€. I mean, you've gotta be shitting me with that stuff.
I know these are just rumors, but the fact that the RTX 4060 was specified with a power draw of 240 watts (and should probably have a better architecture) yet is worse than an RTX 3070 with a power draw of 220 watts is just hilarious.
Why? Leaks are putting the performance level of the 4060 a complete tier above the 3070, as in 3080 levels of performance, and you cannot see why the power draw increases 20 watts over a 3070? Moving up a tier in performance at a cost of only 20 more watts seems pretty reasonable to me.
Remember that TSE puts an extreme load on the GPU that normally doesn't happen, so in gaming and "normal" use cases it could be 200-220 watts (somewhere around that number... I don't care, I need a faster GPU). But if these power consumptions are true: 4060 = 240 / 4060 Ti = 260-280 / 4070 = ~320-350 / 4070 Ti = 350-400 / 4080 = 400-450 / 4080 Ti and 4090 I don't even want to know.
"Current cuts have not moved inventory as much as they'd like." Of course not, Nvidia; your crap was WAY overpriced. Gotta keep dropping those prices. Not to mention, the Merge is just around the corner; people are being smart to see what impact all that has on the market. As for these 4060/Ti "leaks"... I have my doubts.
They won't do it. It makes no sense to sell warehouses of GPUs for less than what it cost to make them. Even if they manage to sell everything, they will oversaturate the market to the point where nobody will buy the next-gen cards. Why would you buy a 3000 card just to replace it with a 4000 six months later? Not in this economy. They will just cut 4000 series production and make you pay extra for the next gen. Layoffs will most likely follow to cut costs.
@@harryshuman9637 That wasn't what I was implying. What I'm implying is that they're still trying to get those high profit margins even though the people willing to spend that much are gone. The cuts thus far are a joke when you can get a 6700 XT for 3060 money, or a 6600 XT for the same price as a damn 3050, or when they're trying to sell the 3080 Ti for what its original MSRP should have been. I don't think further cuts will place them at a loss. Just less money than they'd ideally like to be making.

Besides, say they're deciding between losing money on the product or continuing to sit on it. It would be even dumber to sit on the product. If you have it priced at cost and people still aren't buying, it's unreasonable to think that they'll start buying as time goes on if you just keep the price where it is. That product only becomes worth less. It makes no sense for them to sit on a bunch of excess inventory; they'd be better off selling at a loss. It's been known to happen, they write that loss off on their taxes, and it's so much less of an issue right now considering how much bloody money they made in the mining boom. It would be one thing if it weren't a huge surplus, but it is.

That being said, price to performance isn't going to be great on the 4000 series regardless of whether they sell at a loss or not. And layoffs could happen either way. After all, we're headed into a recession, and things are only going to get worse. I have a feeling there are hard times ahead for the PC component industry, and these high-priced enthusiast components aren't going to sell all that well. I think Intel, AMD and Nvidia have failed to read the room, much in the way they failed to see the economic crash coming. They're going for excess when people are starting to look at tightening their belts.
Regardless, I think the flood of used GPUs ready to hit the market will have an impact on new prices to an extent, and I think people who are willing to buy used should heavily consider waiting a few weeks to see how this plays out.
@@harryshuman9637 Did you not watch until the end of the video? There most certainly will be more price cuts from both Nvidia and AMD. If you have information to the contrary, show it... but your opinion that they would rather hold onto inventory until it is worthless than sell at a loss is unfounded.
Price cuts are also a lie. The 3060 Ti is still going for £100 over MSRP. In what world, and with what third-world education, do you have to call £100 over MSRP a "price cut"? They don't even need to cut prices to get rid of their entire stock; literally sell the GPUs at MSRP and they will sell out instantly. The greed is unimaginable. I would say the 40 series will save us, but we already know the 40 series cards are going to be insanely overpriced for no reason lol. Going to laugh my ass off when the 4060 drops with the same performance as a 3060 Ti but costs £50 more or something, classic Nvidia.
I have 2 PCs that I game on. One has a 6600 and the other a 6700XT. The difference in performance is massive. However once I’ve set up each game to max out performance I cannot see the difference in visual appearance whilst playing the games. Like many people I feel the NEED to max out settings and upgrade so I can set them even higher. But if you can play at your refresh rate and resolution, you don’t need to upgrade all the time. Don’t fall for the marketing. The gameplay and enjoyment will be the SAME and you can spend the money on a holiday or something else to chill out .
That's true, but some dumb people (me) got into 4K gaming on large screens, so yeah, performance is important. I do use my monitor as a TV as well though, and I'm patient with upgrades, 5 years usually. So I need that performance.
Facts! I used to have a GTX 1060 6GB; 4 months ago I upgraded to an RTX 3060 and it's been amazing! Some games at ultra settings are not always 60+ FPS, but like @iceburglettuce said, you don't have to put everything on ultra; you can customize the settings and get 75+ FPS without too much image quality loss (I'm talking without DLSS on, btw). The 3060 can max out over 60 FPS in almost every game at 1080p (I play 1080p 144Hz). My next upgrade, hopefully before this year ends, is a new CPU, probably an i7-12700. When the new CPUs come out the prices should drop, which should let me upgrade to a very capable CPU for streaming. My current i7-9700 (non-K) is not bad, but I want more FPS in some games and a smoother streaming experience.
The 7600X, 7700X and other parts releasing at the end of Sep are in full production, and have been for at least a couple of weeks, so if parts are being recognized by software, those are production samples being tested.
I do find the 4080 and 4090 numbers impressive, but I think Nvidia is a bit heavy-handed with chopping up the 'lesser' cards. The 3080, 3080 12GB and Ti are going to start looking pretty juicy when the prices drop further, and to me 3090 vs 4070 is a no-brainer if the 3090 gets to 800 or below while we're still waiting for even the 4080 to come out.
Finally a reasonable comment, and not one saying it's obsolete because it has the word Ampere next to it so it must suck. The 4070 will only have 10 gigs of RAM too if rumors are correct, and the 4090 is just 35% faster, not 50%.
Yep, perf/watt simply doesn't add up, especially if you take into account that they are also on a lower, more efficient node. I guess you should never believe a so-called "leaker" that you've never heard of before.
Err yeah... my GTX 1070 racks up the electricity bill when I use it, and it's 150 watts... a 275-watt card would be pushing the limit for my liking. Interestingly, the GTX 1080 Ti is 250 watts, has more VRAM, and has higher memory bandwidth at 484 GB/s.
I would settle for a 4nm 1080 Ti card. Yes, no tensor cores, I don't care. Just a 4nm 1080 Ti with 11GB. It should be about 50 watts, and the UK price should be around £200.
The 4060s won't be out for another 6 months at the earliest, which means these are ES (engineering samples). Nvidia is still sitting on 3000 stock that it hasn't sent to AIB partners, as the decoded talk from Jensen said a few days ago. You MIGHT see two Nvidia products this year, the 4090 and 4080. Mostly you'll be looking at discounted 3000 series and the 4090 whenever it launches, but the 4090 should now be in its finished state.

If it comes out in Oct, the die has to be rolling out of TSMC in August. It then has to go to manufacturers for GPU production, and it needs about 5 weeks to get into N. America after the product leaves a factory. Yes, it takes that long. It's not air freight. Container ships take about a month to cross the Pacific, then the product has to clear customs and get into supply chains, so about 6 weeks total. If a GPU came out of the factory today, the earliest you'd see it at a retailer in N. America is 6 weeks from now, which would get you to around Oct 8th.

When did the 4090 have to finish testing if it launches in Oct? June. Then TSMC would produce some finished dies, those would get tested in a finished product, and mass production would start in August. So that production sheet shown from some Asian manufacturer (we don't know who it is) is for retail sales if the launch is Oct. If the launch is Nov, it could have been a batch for testing finished dies.
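The 6-week timeline above is easy to sanity-check. A back-of-envelope sketch, where the start date (late August 2022, i.e. the comment's "today") and the split of the 6 weeks are assumptions, not real logistics data:

```python
# Sanity check of the shipping-timeline math: ~1 month Pacific crossing
# plus ~2 weeks for customs clearance and domestic supply chains.
from datetime import date, timedelta

factory_exit = date(2022, 8, 27)          # assumed "today" when the comment was written
pacific_crossing = timedelta(weeks=4)     # container ship across the Pacific
customs_and_supply = timedelta(weeks=2)   # customs + retail supply chain

earliest_retail = factory_exit + pacific_crossing + customs_and_supply
print(earliest_retail)  # 2022-10-08, matching the "around Oct 8th" estimate
```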
Good thing I did not buy any graphics card during the price spikes. And the prices are still a bit too high. If I were to buy a graphics card, I would just buy one that has a lower TDP and provides the performance needed to play at 1080p high settings. I do not really care about the benchmarks too much, I just wanna play games. Just enjoy what you buy and make careful financial decisions. :) Note: This is coming from a person who has an RX 460, so whatever graphics card I buy will be a huge performance increase over what I have currently.
The 3060 Ti draws 200W. The 4060 Ti is rumored to draw 270W. The 6700 XT (because the 6600 XT is not the performance class for it to go against lol) is 230W. The 7600 XT is on 6nm, not 7, so *slightly* better efficiency from the node alone, but not much; it's rumored to draw only a bit more, so it could be 180-250W (250 being the worst-case scenario).
I will not be upgrading my 3090. I am lucky that I was able to get mine before all the prices went nuts, and I am going to keep it for a few more years.
@@laszlozsurka8991 The 3090 will hold up more than well. Games will be limited by console hardware; only PC-exclusive titles will make real use of the hardware. A game that launches for Xbox Series S will suffer severe quality drops when adapted to PC and PS5, and will see fewer improvements as a next-gen title because of the limitations the XSS imposes with its weak GPU. I've been using a 1080 Ti since launch and I estimate it will hold up well for another 3-4 years at high/ultra settings at 1080p or 1440p without new tech like RT, and the 3090 is on a different level than the 1080 Ti. If Sony and Microsoft release "pro consoles" in the next couple of years, I assume their performance will be pushed close to the 3090 tier, perhaps a bit higher. That won't mean the GPU will suddenly struggle with gaming. The RTX 3090 is likely to last a decade at the rate things are going. Even if a new generation of consoles drops, it will take 2-3 years to abandon the previous generation. The RTX 3090 will only become irrelevant when games target a new generation of consoles that pushes the performance gap so high that the 3090 becomes the "minimum" to play.
I see no meaningful price drops on GPUs in the EU. So they say they have full stock and must move it? Wake me up when they actually go 20-30% below original MSRP.
3080 12GB cards are dropping under 900€ this week (there's a bundle from MSI: keyboard + mouse + Ventus 3080 12GB for 899). I would buy one as soon as it drops below 800€ imho.
The responsible thing to do about rumors from questionable sources at best, or downright false information at worst, is not to cover them imo. If the "leaks" come from someone with a history of being right, they may be worth noting, but if the info comes from a random dude on Twitter it's best to leave it alone. You are still amplifying the rumor no matter how many disclaimers you put in between, and that's actually how false info spreads irl.
Yep, there are already people in the comment section taking it as 100% true. A lot of things simply don't add up in this so-called "leak", which should instantly send those red flags flying. It's usually a good "BS indicator" whenever things don't add up, and in this case it's almost certainly a steaming pile of BS (a nobody "leaker", very early for a leak on those GPUs, an odd number of SMs listed, perf/watt doesn't add up, test results rounded up, and lower perf than expected for the 4060). Of course some gullible fools and fanboys will always believe and soak up anything that's said without questioning it, if it fits neatly into their mindset. They'll then pass on that BS misinformation to other gullible idiots.
@@laszlozsurka8991 I know it sounds silly, but my guess is Bitcoin will rise again, and the chip crisis could start at any time because of China and Taiwan.
Or, you know... Time Spy doesn't test RT performance and AI, which have been the main focus of improvement for Nvidia since the 2000 series launch. Raster performance is already absurdly high.
Retro topic, but just so people know why the 3060 comes with 12GB: it's because of the 192-bit bus it uses, which can accommodate either 6GB or 12GB, and 6GB was considered too low.
So 6GB was considered too low for a 1080p card, but 8GB was considered "fine" for a much faster card like the 3070 Ti, which is supposed to run 1440p ultra and even some 4K? Weird decisions.
@@NoThisIsntMyChannel Well, 8GB isn't a ton, but it's considered the minimum acceptable these days. 6GB is too off-putting considering even the 2060 got a 12GB variant. BuildZoid explained this thing about the bus width in a video, and to me it makes perfect sense; it's the only thing that makes sense imo.
I think they nerfed the cards on purpose to keep prices up for the 30 series; since they're still stuck with so many of those, they want people to still prefer a 30 series over the 4060.
A suggestion on selecting text since I see you had some problems with it. Double click will select the word under your cursor and triple click will select the line of text the word is part of. You might find that easier than trying to drag select in a chart like that.
These leaks IMO look more realistic than the ones from kopite7kimi: cards going up roughly 1 tier instead of 2, i.e. the 4060 landing around the 3070 rather than the 4070 landing around the 3090.
Hopefully this changes; we are still months away from the release of those GPUs, but it is a bit odd that there is that much of a performance gap between the 4060 and 4060 Ti.
The 3060 Ti is also way closer to the 3070 than to the 3060. Basically it depends on the chip used: usually 60 Ti models are cut-down 80 or 70 chips, while 60 models are a tier-lower chip. The 1060 6GB was way faster than the 1060 3GB because it was a cut-down 70 model too. It then makes sense that the 4060 Ti is a cut-down 70 model and is closer to 70 performance than to the 60 card. It's usually been like that: 60 Ti or 70 = last gen's 80 model performance. 3060 Ti = 2080, 2070 = 1080 (+DLSS), 1060 6GB = 980, 1070 = 980 Ti, 970 ($329) = 780 Ti, and 960 = 780. So it makes sense that the 4060 Ti = 3080 at almost half the original price of the 3080, so an MSRP of $399. Always pay attention to the 60 Ti models, if they are released on release day.
Yeah, if you look at it that way it makes sense. I knew the 2060 KO, for example, had a cut-down 2080 chip, but I didn't know this was a common practice.
@@InnuendoXP I think what's affecting people's views could be how the 4070 is rumored to match the 3090's performance, effectively punching 2 classes above its weight vs last-gen cards, while the 4060 can't even match the previous-gen card 1 tier above it (aka the 3070). So the feeling of _budget_ really hits harder here. When we look at last gen, the 3070 was well matched with the 2080 in various performance tasks; had there been a 2090, the 3070 would not have reached anywhere near that level. Yet the 4070 is leaked to do just that, so the gap down to the 4060, which effectively should be the 1-tier-below card, feels even more significant. Either way, time will tell. I'm sure the current claims are imprecise anyway.
@@Real_MisterSir The gen-on-gen increase we're seeing here in relative performance terms isn't extraordinary if your view covers the past 20 years instead of the past 5. The two crypto booms, Radeon's half decade of struggling to catch up, and Turing's pathetic price-to-performance non-increase have heavily distorted people's expectations, conditioning them into buying less for more. Just look at the current consoles: they're 2 years old, but a GPU as powerful as the graphics part of their APUs alone still costs more than 3/4 the price of the console. In 2005 the Xbox 360 destroyed the PC GPUs of its day, yet within 2 years midrange PC GPUs were wiping the floor with it, and midrange meant $180-300 in those days. Tier-on-tier gains since Pascal have been pretty pitiful really. A 5-year-old high-end card shouldn't be equivalent to a modern-day midrange card; it should be absolutely decimated by it.
4060 at 230 watts is Nvidia getting absurd at this point. That heat doesn't magically disappear in your room, and the extra power does not magically get paid for free. It pretty much guarantees I will not upgrade this generation. It will be awkward if in order to get a sub-200w card worth upgrading to I have to buy something like an RTX 6050 in five years.
The 60 series normally sits between the 70 and 80 series from the previous gen. The 2060 was an outlier because it was as powerful in rasterised output as a 1080. The 4060ti is supposed to be stronger than a 3080. Just because the 3060 was the weakest upgrade in generations doesn't mean we should either expect or accept that as a new normal.
Something still bugs me about the 4080 and 4090 performance; it just looks like too much. Knowing how nasty Nvidia can be, there's no way they would allow that much of a performance jump within one generation. The 4060 and 4060 Ti numbers, to be honest, look more accurate than the other leaked numbers. It's way too easy for them to shave down a die and make it not perform as well, just to make sure they have something lined up for the 5000 series. And FYI, they were doing this to get 3080 chips out of 3090 chips (mind you, 3090 chips that did not meet the criteria but were still too fast to be a 3080).
The RTX 4060 Ti matching an RTX 3080 would be INSANE. I think I won't pull the trigger this gen, but if the RTX 5060 Ti or its AMD equivalent comes with a similar performance uptick, it'll be time for me to go all in with my hard-earned cash. The pinnacle of price to performance.
I bought an RX 6600 XT in the spring when prices were first starting to drop, so I'm wondering what the best-value upgrade will be for me in the future. RX 6750 XTs are currently less expensive than what I paid for my RX 6600 XT lol.
People need to realize that as later generations of GPUs get more performant, the minimum power requirement always goes up... So that 4060 is actually a more optimal 3090, since it's above a 3080? Power requirements didn't drop much.

We are seeing less and less return on price/perf because shrinking nodes from 7nm to 5nm isn't much: 28nm to 14nm is a 14-unit jump, 14nm to 7nm is a 7-unit jump, but 7nm to 5nm is only a 2-unit jump. So 4nm, 3nm, or even 1nm won't increase efficiency that much, and there's added risk with features that small; interference and such can happen more easily.

So how do we get more performance without the power struggle? We'll probably have to reinvent CPUs/GPUs entirely beyond this point, with things like stacked chips becoming mandatory over monolithic designs. But power usage won't really drop unless we use some sort of 'infinity fabric' type approach: for instance, 4 tiny GPU chips working side by side at a 10-30% usage level, which requires much less voltage and heat, versus a single chip that at 100% usage requires much higher voltage and heat dissipation to stay stable.

All chips work on a spectrum of efficiency. At 1% to 10% usage the voltage/performance levels are almost the same; it's when you push a chip past 50% usage that voltage scaling and performance scaling rise dramatically (say sample A hits 1600mV at 50% while sample B hits 1750mV at 50% for the same perf). So what happens if AMD takes the multi-core CPU design approach and applies it to the GPU? Say a 64-core multi-die GPU? Actually, GPU dies already function like that in a way: they have stream processor counts and so on, which let them process graphics in parallel.

So all we would be doing is multiplying the number of dies and their inner stream processors, and finding a way to sync them all to work together. At a basic level: why not have 4 GPU "cores", each small, each representing a 1080p resolution of work space, and combine the four together to make a 4K screen? It's like having 4 borderless 1080p monitors stitched together. If you play on a 1080p monitor, only 1 core needs to be active (100% usage), or perhaps all four cores split the work evenly at 25% usage each. Power usage would differ vastly, since that one core at 100% usage creates more heat than 4 cores at 25%.
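The "one big die at 100% vs four small dies at 25%" argument above can be sketched with the textbook dynamic-power approximation (P ≈ C·V²·f). Everything below is a toy model with made-up voltage numbers, not real GPU data, just to illustrate why splitting the load can win:

```python
# Toy model only: assumed linear voltage/frequency-vs-load curve.
def dynamic_power(utilization, base_voltage=0.8, max_boost=0.5, capacitance=1.0):
    """Dynamic power ~ C * V^2 * f; voltage and clock rise with load."""
    v = base_voltage + max_boost * utilization  # assumed voltage curve
    f = utilization                             # clock tracks load in this toy model
    return capacitance * v * v * f

one_big_die = dynamic_power(1.0)       # single die doing all the work
four_small = 4 * dynamic_power(0.25)   # same total work split across 4 dies

print(f"single die @100%: {one_big_die:.3f}")   # 1.690
print(f"four dies @25% : {four_small:.3f}")     # 0.856
```

The quadratic voltage term is why the split wins in this sketch; real chips also pay static/leakage power per die and inter-die communication overhead, which this toy model ignores.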
As far as I know: Ada SMs can only be disabled in blocks of 4, while Ampere SMs could only be disabled in blocks of 2. We have already received incorrect information from kopite as well, like the 126 SMs for the 4090, so I would not automatically assume it is all fake. But I think:
1) The 4060 is 32 SMs.
2) The TDP values are the maximum values of those test boards. I would expect release-version TDP to be around 70-80% of that, so probably 200W.
3) We don't know if the TSE values are for the max TDP or for something in between. Also, the voltage curve is not optimized at that point. Taking other leaked TSE values into account, I am leaning more towards the 6K being for release TDP values.
The peak power, or transient spikes, are the real challenge here. If the 4000 series does spike at 300% of nominal, you'll need a PSU upgrade to at least 1000W for any of these.
Guys, there is no leak. It's only fanfiction at this point. I can't take any of this news seriously until there is something direct from AMD or Nvidia. Not trying to disrespect the channel owner, since it's not his fault for doing his job as a journalist and reporting on this; that's not my intent. It's just that there are so many flip-flops about these "leaks" that I'm getting allergic to GPU speculation.
The fuck, the 60-class cards should be under 200W. I remember when Maxwell came out and the 980 was about 180W. I understand the vast increase in performance since then, but the power draw is a piss take.
110s 3080 10GB and even a 3080 12GB some day; the miners were melting the plastic layers, almost rendering those poor GPUs useless before getting a kill.
The 1070 and 1070 Ti have an odd number of SMs. What is weird, however, is the top SKU of that size die being cut down so much. That alone would make it almost certainly an engineering sample testing things, rather than the final design.
We can pretty much guess what the rates will be... just extrapolate from how the 2000 and 3000 series' lower cards compared to their top cards, and THERE YOU GO, we have the approximate numbers for the lower cards of the 4000 series. Can someone just do that and put a video up?
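The extrapolation suggested above is easy to sketch. All ratios and scores below are made-up placeholders, not real benchmark data; the point is only the method:

```python
# Average each tier's score relative to its generation's flagship over past
# gens, then apply that ratio to a (hypothetical) leaked 4090 score.
past_gens = {
    "x080": [0.83, 0.80],  # placeholder tier/flagship ratios for 2 past gens
    "x070": [0.65, 0.62],
    "x060": [0.45, 0.47],
}
flagship_4090_score = 10000  # placeholder leaked flagship score

estimates = {
    tier: flagship_4090_score * sum(ratios) / len(ratios)
    for tier, ratios in past_gens.items()
}
for tier, score in estimates.items():
    print(f"estimated 40-series {tier}: {score:.0f}")
```

The obvious catch, as other comments point out, is that tier ratios haven't been stable gen to gen, which is exactly why the leaked 4060 number is controversial.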
The RTX 3080 dropped a lot in price in Sweden. Do you think I can save a lot of money by waiting for an RTX 4060 Ti instead, or should I just go for a 3080? The 3080 is 777 dollars for an EVGA GeForce RTX 3080 10GB FTW3 ULTRA LHR.
I know this GPU isn't much of an architectural miracle, but the trend of GPU scaling does not make me excited. They constantly increase the power draw. Ignoring the 1kW PSU required for that, just a VRM capable of supplying that much current without blowing up will cost a lot. The PCB the card is built on needs a lot more copper to dissipate heat quickly. The cooler for xx60 cards is massive compared to older generations. Those cards will never cost below $250 with such a strategy. And the raw heat generation alone is horrific.
Question: if Nvidia's next-gen GPUs are way better partly due to much higher power consumption, then how will they fare once placed in laptops next year, where they cannot draw that much power? Is it safe to guess they will only slightly outperform the 3000 series laptop GPUs?
Moore's law is dead suggested that Nvidia bought a lot of Ampere stock from retailers and AIBs to be used in something else... perhaps that's why the 4060 is basically a 3060 ti??
So... I have a 1060 6GB now, should I wait for the RTX 4000 series to come out so I will be able to get a RTX 3000 at a cheaper price or should I buy now?
It depends on how much you value saving a few hundred dollars at best. Also, if you're willing to buy used you could save a lot of money as well. The difference from a 1060 to a 3000 card is going to be huge. I wouldn't get greedy tbh but I also have a solid job so a few hundred dollars doesn't stress me like it used to when I was making 10 dollars an hour lol.
Currently, the most important thing about these cards is their price. So far, it appears to me that the 4070 is the sweet spot. The rumored price is $599. If money is not an issue, those purchasing an nVidia card should buy a card with more than 12GB of VRAM. If money is an issue, they should not pay more than $600 for any card without more than 12GB of VRAM.
Maybe price/performance-wise, but let's be real: $600 is way more than most people will pay. There's a reason why the xx50 Ti and xx60 series are always the most popular cards by quite a significant margin. The only exception would be the 970, but that card was about half the price.
@@immanuelkant7552 Agreed. I generally go for a $180 graphics card myself. The most modern game I play is Wizardry 8. I do, however, plan on using Poser in the future, and the 4070 would appear to give me the most bang for the buck.
Looking into this comment section, I am always amazed at how seriously some people take marketing terms like 3nm. It tells you a lot about how little consumers really understand.
I was thinking the same... For me personally, upgrading to the 4000 series is at least 4 years away. I'm unlikely to see any gain in gaming performance compared to a 3080 or better.
I think they're gonna raise the price of this midrange GPU by $50 just so people buy the remaining stock of old GPUs. If it really outperforms the 3070, it would be great for budget 1440p gamers.
I think people are coping too hard that the 4060 numbers are going to change. Friendly reminder that the 3060 is basically identical in performance to the 2060 Super, which mirrors these charts showing the 4060 and 3060 Ti being almost identical. If anything is fake here, it's the 4060 Ti performance. I expect it to outclass the 3070, but beating the 3070 Ti by miles, or being that close to a 3080? That's practically impossible, because the 3080 will never go to 600 dollars and you can't price the 4060 Ti above a 3080.

This really shows why they are dumping stock hard and trying to force board partners to actually sell GPUs at MSRP, instead of the 3060 Ti still being 150 dollars over MSRP as it is now. At this rate we are likely to see inflated MSRPs for both the 30 and 40 series to "fix" the pricing; for example, I expect the 3070 Ti to go from 600 to 700 dollars MSRP, with the 4060 Ti's true performance falling just below it so they can price it at about 600 dollars.

This is why I hate even looking at "leak" videos: it's just fake at the end of the day, and as you can see from the comments, it turns people delusional lol. It might honestly be worth upgrading to a 30 series card just before the 40 series and the price increases are announced. As much as I prefer upgrading every two generations, in this market I can already see even the 4060 series being priced higher than what I could get a 3070 Ti for right now, as companies try to dump them.
@Exter_pc on Instagram Interesting, I didn't know there were bots advertising scam GPUs for sale. Maybe I don't watch enough tech videos, or maybe the tech channels I watch have better comment filters set up to avoid these bots lol. Still interesting though, first GPU bot I have seen on YouTube.
I think this would be good, exactly right. Anyone who wants a new full-HD card from the new series can have one. Broadening the spectrum is very good: it makes the cards much easier to sort through for newbies to graphics cards, and it's much smarter when it's done that way.
To be honest the 4060 seems legit. The 3060 is barely faster than the 2060 super. I could see NVIDIA doing something similar this generation since they still want to move 30 series cards.
I just want to know if it makes sense to get a PC with a 6800 XT and an i5-12600KF instead of waiting for the new-gen stuff, taking higher prices, higher power usage, higher power costs (me living in Germany) and inflation into consideration.
My advice would be: buy everything except the GPU. Once the next generation drops, buy a 4060/Ti card or the AMD equivalent. Power draw on GPUs is getting insane. The current RTX 3060 and Ti cards are already small beasts that could last a very long time, so the 4000 series will surely be an even better deal (that's assuming you would otherwise buy a current GPU at around MSRP or a little under, which imo is a bad move).

RTX 4070 = RTX 3090 Ti or so, so you should weigh the GPU's value against that. A 3090 Ti would also very likely consume more power than the 4070; the only reason to prefer a 3090 Ti would be the VRAM. The 6800 XT consumes up to 300W. Buying a high-end card right now is a very bad move; you will be losing value on what you invested when next gen drops.

If you really want a build before next gen drops, either buy a CPU with an iGPU just to make some use out of it (I bought a 5600G back in Feb; the iGPU can run a fair number of games at low quality settings), or buy a CPU with no iGPU and a cheap card on the used market as a temporary solution. A GTX 1070 or so should be enough to get most of your games running at a fair performance.
@@CHT1992 Thanks for the advice, mate. I really don't mind waiting; my priority is getting good value (price/performance ratio). I think if I wait I would decide between a 7600 XT or 7700, since I don't really need Nvidia, and AMD might have a bit better value for gaming plus more VRAM. Are you sure though that it's not also worth waiting on the CPU?
Hi, thanks for the informative video. Will Nvidia launch a 4060 Founders Edition this time? The 3060 currently sells for almost the same price as the 3060 Ti Founders Edition directly from Nvidia. It's laughable how expensive 3060s are because there is no Founders Edition.
The leak makes no sense. They're saying the 4060 Ti is gonna use a heavily cut-down AD104, and then the 4060 will use a cut-down AD106? Why wouldn't the 4060 get the fully enabled AD106 die with such a performance gap between the regular 60 and 60 Ti? Unless they really plan on milking the stack and having a regular 4060, 4060 Super, 4060 Ti... 🙄 Edit: The 7700X and 7600X are looking pretty decent, although I like what Intel is doing with the E-cores on their newer chips. I need a CPU upgrade soon; it's gonna be a tough call which way to go.
Best thing to do is not to buy any of that crap.
This
So factoring in how much prices increased, I could reasonably have a 1660 at $180, with the Supers at $210 maybe, but yeah, prices now are crazy.
Inflation.
Imagine buying a 250 dollar 1660 Super instead of an RX 6600.
I know these are just rumors, but the fact that the RTX 4060 was specified with a power draw of 240 watts (and should probably have a better architecture) and is worse than an RTX 3070 with a power draw of 220 watts is just hilarious.
Why? Leaks are putting the performance level of the 4060 a complete tier above the 3070 as in 3080 levels of performance and you cannot see why the power draw increases 20 watts over a 3070?
Considering you are moving up a tier in performance at a cost of only 20 more watts, that seems pretty reasonable to me.
@@billwiley7216 It's the 4060 Ti that's overperforming, not the 4060.
@@billwiley7216 It better be, if it's using more power.
Yes, it's better, but at the cost of efficiency. I sure hope a 2000-watt PSU won't be a thing a decade from now...
Remember that TSE puts an extreme load on the GPU that normally doesn't happen, so in gaming and "normal" use cases it could be 200-220 watts (somewhere around that number... I don't care, I need a faster GPU). But if these power consumptions are true: 4060 = 240 / 4060 Ti = 260-280 / 4070 = ~320-350 / 4070 Ti = 350-400 / 4080 = 400-450 / 4080 Ti and 4090 I don't even want to know.
"Current cuts have not moved inventory as much as they'd like."
Of course not... Nvidia, your crap was WAY overpriced. Gotta keep droppin' those prices. Not to mention, the merge is just around the corner. People are being smart to see what impact that all has on the market. As for these 4060/Ti "leaks"... I have my doubts.
They won't do it. It makes no sense to sell warehouses of GPUs for less than what it cost to make them. Even if they manage to sell everything, they will oversaturate the market to the point where nobody will buy the next-gen cards. Why would you buy a 3000 card just to replace it with a 4000 card 6 months later? Not in this economy.
They will just cut 4000 series production and make you pay extra for the next gen. Layoffs will most likely follow to cut costs.
@@harryshuman9637 That wasn't what I was implying. What I'm implying is that they're still trying to get those high profit margins even though the people willing to spend that are gone. The cuts thus far are a joke when you can get a 6700 XT for 3060 money or a 6600 XT for the same price as a damn 3050, or when they're trying to sell the 3080 Ti for what its original MSRP should have been. I don't think further cuts will place them at a loss. Just less money than they'd ideally like to be making.
Besides, let's say they're deciding between losing money on the product or continuing to sit on it. It would be even dumber to sit on the product. If you have it at cost and people still aren't buying, it's unreasonable to think that if they just keep the price where it's at, buyers will show up as time goes on. That product only becomes worth less. It makes no sense for them to sit on a bunch of excess inventory; they'd be better off selling at a loss. It's been known to happen, and they write that loss off on their taxes, and it's so much less of an issue right now considering how much bloody money they made with the mining boom. It would be one thing if it weren't a huge surplus, but it is, and it makes no sense to sit on that inventory.
That being said, price to performance isn't going to be great on the 4000 series regardless of whether they sell them at a loss or not. However, layoffs could happen either way. After all, we're headed into a recession, and things are only going to get worse. I have a feeling there are hard times ahead for the PC component industry, and these high-priced enthusiast components aren't going to sell all that well. I think Intel, AMD and Nvidia have failed to read the room, much in the way they failed to see the economic crash coming. They're trying for excess when people are starting to look at tightening their belts.
Regardless, I think the flood of used GPUs ready to hit the market will have an impact on new prices to an extent, and I think people who are willing to buy used should heavily consider waiting a few weeks to see how this plays out.
@@TheGameBench That's fine, but I'm still telling you they will not cut the prices.
@@harryshuman9637 Did you not watch until the end of the video? There most certainly will be more price cuts from both Nvidia and AMD. If you have information to the contrary, show it... but your opinion that they would rather hold onto inventory until it is worthless than sell at a loss is unfounded.
Price cuts are also a lie. The 3060 Ti is still going for £100 over MSRP. In what world, with what third-world education, do you call £100 over MSRP a "price cut"? They don't even need to cut the prices to get rid of their entire stock; literally make the GPU cost MSRP and you will sell out instantly. The greed is unimaginable. I would say the 40 series will save us, but we already know 40 series cards are going to be insanely overpriced for no reason lol. Going to laugh my ass off when the 4060 drops with the same performance as a 3060 Ti but costs £50 more or something. Classic Nvidia.
I have 2 PCs that I game on. One has a 6600 and the other a 6700XT. The difference in performance is massive. However once I’ve set up each game to max out performance I cannot see the difference in visual appearance whilst playing the games.
Like many people I feel the NEED to max out settings and upgrade so I can set them even higher.
But if you can play at your refresh rate and resolution, you don’t need to upgrade all the time. Don’t fall for the marketing. The gameplay and enjoyment will be the SAME and you can spend the money on a holiday or something else to chill out .
That's true, but some dumb people (me) got into 4K gaming on large screens. So yeah, performance is important. I do use my monitor as a TV as well though, and I am patient with upgrades, 5 years usually. So I need that performance.
Facts! I used to have a GTX 1060 6GB; 4 months ago I upgraded to an RTX 3060 and it's been amazing! Some games at ultra settings aren't always 60FPS+, but like @iceburglettuce said, you don't have to put everything at ultra; you can customize the settings and get 75FPS+ without too much image quality loss (I'm talking without DLSS on, btw). The 3060 can max out almost every game at over 60FPS at 1080p (I play 1080p 144Hz). My next upgrade, hopefully before this year ends, is a new CPU, probably an i7-12700. When the new CPUs come out, the prices should drop, which should allow me to upgrade to a very capable CPU for streaming (I currently have an i7-9700 (non-K), which is not bad), but I want more FPS in some games and a smoother streaming experience.
Fax
A friend of mine has a beast of a PC with a 12900K and a 3080. But he plays on a 10-year-old 1080p 60Hz monitor... It frustrates me to no end lmao.
@@Adama.1 60Hz lol. Could have saved hundreds
Wishing you a speedy recovery. Thanks for the content.
The 7600X, 7700X and other parts releasing at the end of Sep are in full production, and have been for at least a couple of weeks, so if parts are being recognized by software, those are production samples being tested.
I do find the 4080 and 4090 numbers impressive, but I think Nvidia is a bit heavy-handed with chopping up the 'lesser' cards. The 3080, 12GB & Ti are going to start looking pretty juicy when the prices drop further, and to me 3090 vs 4070 is a no-brainer if the 3090 gets to 800 or below while we're still waiting for even the 4080 to come out.
@@pmf822 as someone who is looking to buy a gpu upon the new releases, that breaks my heart.
Finally a reasonable comment, and not one saying it's obsolete because it has the word Ampere next to it so it must suck. The 4070 will only have 10 gigs of RAM too if rumors are correct, and the 4090 is just 35% faster, not 50%.
The 4060 can't be there @240W...
The 3060 Ti takes 200W.
That would mean 83% of the perf/watt of the previous gen.
Yep, the perf/watt simply doesn't add up, especially if you also take into account that they are on a lower, more efficient node. I guess you should never believe a so-called "leaker" that you've never heard of before.
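The perf/watt objection above can be sanity-checked with quick arithmetic. A sketch, assuming (per the rumor discussed in the thread, not confirmed specs) that the 4060 roughly matches a 3060 Ti's score while drawing 240W vs 200W; the score value is purely illustrative:

```python
# Rough perf/watt sanity check for the rumored 4060 numbers.
# Assumption from the leak: ~3060 Ti performance, but 240W vs 200W.
def perf_per_watt(score, watts):
    return score / watts

score = 11_000  # illustrative TSE-style score, not a real benchmark result

ratio = perf_per_watt(score, 240) / perf_per_watt(score, 200)
print(f"relative perf/watt vs prev gen: {ratio:.0%}")  # -> 83%
```

At equal performance the score cancels out, so the ratio is just 200/240 ≈ 83%, which is where the "83% perf/watt" figure in the comment comes from.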
Err, yeah... my GTX 1070 racks up the electricity bill when I use it, and it's 150 watts... a 275-watt card would be pushing the limit for my liking. Interestingly, the GTX 1080 Ti is 250 watts, has more VRAM, and has higher memory bandwidth at 484GB/s.
I would settle for a 4nm 1080 Ti card. Yes, no tensor cores, I don't care. Just a 4nm 1080 Ti with 11GB. And it should be about 50 watts. And the UK price should be around £200.
The 3060 is at the 1080 Ti performance level -> undervolt and you can bring it down to ~110W.
Don't get sick! Get well soon!
The fact that the 4000 cards all have TSE scores rounded to the nearest 1k tells me they're still just estimates.
You mean the similar bump from the 2060 Super to the 3060 Ti? The 3060 12GB and the 2060 Super are pretty much on par with each other.
The 4060s won't be out for another 6 months at the earliest which means these are ES.
Nvidia is still sitting on 3000 stock that they haven't sent to AIB partners, as the decoded remarks from Jensen indicated a few days ago. You MIGHT see two Nvidia products this year, the 4090 and 4080.
Mostly you'll be looking at discounted 3000 series and the 4090 whenever it launches, but the 4090 should now be in its finished state. If it comes out in Oct, the die has to be rolling out of TSMC in August. It then has to go to manufacturers for GPU production, and it needs about 5 weeks to get into N. America after the product leaves a factory. Yes, it takes that long. It's not air freight. Container ships take about a month to cross the Pacific, then has to enter customs then get into supply chains. So 6 weeks there total. So, if a GPU came out of the factory today, the earliest you'd see it in N. America at a retailer is 6 weeks from now. That would get you to around Oct 8th.
When did the 4090 have to be finished testing if it launches in Oct? June. Then TSMC would produce some finished die, it would get tested in a finished product, and then in August it would go to mass production.
So, that production sheet shown from some Asian manufacturer who we don't know who it is, that's for retail sales if the launch is Oct. If the launch is Nov that could have been a batch for testing finished die
I can't imagine anything but the 4090 being released until they sell the 30 series stock.
Good thing I did not buy any graphics card during the price spikes. And the prices are still a bit too high. If I were to buy a graphics card, I would just buy one that has less TDP and provides the necessary performance that would allow me to play at 1080p High settings. I do not really care about the benchmarks too much, I just wanna play games. Just enjoy what you buy and make careful financial decisions. :)
Note: This is coming from a person who has an RX 460, so whatever graphics card I buy will be a huge performance increase over what I have currently.
Wishing you a fast recovery!
4060 Ti, new arch, 4nm, and it only matches RDNA2 on 7nm? That is horrible. It should be able to do the same perf at 50% less power.
The 3060 Ti draws 200W.
The 4060 Ti is rumored to draw 270W.
The 6700 XT (because the 6600 XT is not the performance class for it to go against lol)
is 230W.
The 7600 XT is 6nm, not 7, so *slightly* better efficiency from the node alone, but not much; it's rumored to be only a bit more.
So it could draw 180-250W (250 being the worst-case scenario).
Exactly my thought. Lovelace efficiency looks very underwhelming. :(
@@philRacoindie I was talking about the 6800 XT.
Both Nvidia and AMD have terrible efficiency.
If you are a REAL gamer that wants state of the art efficiency, then Intel ARC is the best of the best.
@@ok-70707 you are the clown..
You completely missed the joke
I'm not sure, but the difference between the 4060 and 4060 Ti looks a little odd compared to other cards and their Ti variants.
Why? The 3060 Ti is around 1,800 points better than the 3060, while the 4060 Ti is around 2,000 points faster than the 4060. I don't see where the problem is here.
@@ElonMusk-Krypto it's a 1777 to 2600 difference
@@ElonMusk-Krypto Plus, look not only at the 3060 but also at the others.
I will not be upgrading my 3090. I am lucky that I was able to get mine before all the prices went nuts, and I am going to keep it for a few more years.
You don't even have to. The 3090 will hold well for the next 4-6 years or so.
@@laszlozsurka8991 The 3090 will hold more than well. Games will be limited by console hardware; only PC-exclusive titles will make real use of the hardware. A game that launches for the Xbox Series S will suffer severe quality drops when adapted to PC and PS5; it will have fewer improvements as a next-gen title due to the limitations the XSS imposes by having a weak GPU.
I've been using a 1080 Ti since launch and I estimate it will "hold well" for another 3-4 years at high/ultra settings at 1080p or 1440p with no new tech like RT, and the 3090 is on a different level than the 1080 Ti.
If Sony and Microsoft release "pro consoles" in the next couple of years, I assume their performance will be pushed close to the 3090 performance tier, perhaps a bit higher. This won't mean the GPU will suddenly struggle with gaming.
The RTX 3090 is likely to last a decade at the rate things are going. Even if a new generation of consoles drops, it will take 2-3 years to abandon the previous generation. The RTX 3090 is only going to become irrelevant when games focus on a new generation of consoles that pushes the performance bar so high that the 3090 becomes the "minimum" to play.
I see no meaningful price drops on GPUs in the EU. So they say they have full stocks and must move them out? Wake me up when they actually go 20-30% below original MSRP.
3080 12GBs are dropping under 900€ this week (there's a bundle from MSI: keyboard + mouse + Ventus 3080 12GB for 899). I would buy one as soon as it drops below 800€, imho.
As a Swede, I must say it is so annoying to see American prices without tax, when every card here costs MSRP + 400-900 USD.
The responsible thing to do about rumors from questionable sources at best, and downright false information at worst, is not to cover them, imo. If the "leaks" come from someone with a history of being right, it may be worth noting, but if the info comes from a random dude on Twitter, it's best to leave it alone. You are still amplifying the rumor no matter how many disclaimers you say in between, and that is actually how false info spreads irl.
Yep, there are already people taking it as 100% true in the comment section. A lot of things simply don't add up in the so-called "leak", which should instantly set those false flags flying. It's usually a good "BS indicator" whenever things don't add up, and in this case it's almost certainly a steaming pile of BS (a nobody "leaker", very early for a leak on those GPUs, an odd number of SMs listed, perf/watt that doesn't add up, test results rounded up, and lower perf than expected for the 4060).
Of course some gullible fools and fanboys will always believe and soak up anything that's said without questioning it, if it fits neatly into their mindset. They'll then, of course, pass on that BS misinformation to other gullible idiots.
The problem is that even "legit" leakers are often very, very wrong.
I really want an RTX 3080 for 500 dollars; will that happen sometime soon?
I don't believe 500 dollars, but 700 dollars could happen.
@Mehmet Şerif Well it’s already at 700 dollars, but MSRP two years after release ain’t that appealing😂
@@BenSheriff $700 was the MSRP for the 3080... imagine buying a 2 year old card at MSRP
@@laszlozsurka8991 I know it sounds silly, but my guess is bitcoin will rise again, and the chip crisis could start again at any time because of China and Taiwan.
@@laszlozsurka8991 Nothing wrong with that, because the card is just as powerful as it was before.
Or, you know... Time Spy doesn't test RT performance and AI, which have been the main focus of Nvidia's improvements since the 2000 series launch. Raster performance is already absurdly high.
B-
Retro topic, but just so people know why the 3060 comes with 12GB: it's because of the 192-bit bus it uses, which can accommodate either 6GB or 12GB, and 6GB was considered too low.
So 6GB was considered too low for a 1080p card, but 8GB was considered "fine" for a much faster card like the 3070 Ti, which is supposed to run 1440p ultra / even some 4K? Weird decisions.
@@NoThisIsntMyChannel Strange choice indeed, but the 3070 Ti has GDDR6X VRAM. A better comparison would've been the 3060 Ti/3070.
@@NoThisIsntMyChannel Well, 8GB isn't a ton, but it's considered the minimum acceptable these days. 6GB is too off-putting, considering even the 2060 had 12GB. BuildZoid explained this thing about the bus width in a video, and to me it makes perfect sense; it's the only explanation that makes sense imo.
If these are going to be 250+ watt space heaters, then I'm scared for the next generations.
The 40 series lineup looks more and more disappointing every passing day
What about the 5080 and the 5090 Ti’s ? Any luck doing those benchmarks ?
230 to 270 fucking watts on an xx60 series? Oh god! We need some nuclear fusion power plant to cope with that... 😄
Great video. Hope you feel better soon.
i am also isolating bro. luckly have my PC to watch ur vids. get well soon :D
I saw the 4060s will only have 8GB of ram but perform like a 3080? That's a 1080p card. Not enough RAM.
Awesome video as always, and I'll admit for a second I thought it was my kids running around :D Get well soon, Daniel.
I think they nerfed the cards on purpose to keep prices up for the 30 series, since they're still stuck with so much of that stock and people might still want a 30 series over the 4060.
A suggestion on selecting text, since I see you had some problems with it: double-click will select the word under your cursor, and triple-click will select the line of text the word is part of. You might find that easier than trying to drag-select in a chart like that.
My 1050 ti says hi, lol
1060 3GB here 🤣
The thumbnail score is wrong though; my 6900 XT outperforms the 3090 Ti in Time Spy.
These leaks IMO look more realistic than Kop7mite's: going up roughly 1 tier instead of 2, with the 4060 being around the 3070 instead of the 4070 being around the 3090.
Kop7mite says a lot of shit tbh
Hopefully this changes; we are still months away from the release of those GPUs, but it is a bit odd that there is that much of a performance gap between the 4060 and 4060 Ti.
Eh, there have been big gulfs in the past. E.g. the GTX 560 vs the 560 Ti: huge difference. 1080 vs 1080 Ti: large difference.
The 3060 Ti is also way closer to the 3070 than the 3060.
Basically it depends on the chip used; usually 60 Ti models are a cut-down 80 or 70 chip, while 60 models are a tier-lower chip.
The 1060 6GB was way faster than the 1060 3GB due to it being a cut-down 70 model too.
It then makes sense that the 4060 Ti is a cut-down 70 model too and is closer to 70 performance than the 60 card.
It's usually been like this: 60 Ti or 70 = last-gen 80 model performance.
3060 Ti = 2080, 2070 = 1080 (+DLSS), 1060 6GB = 980, 1070 = 980 Ti, 970 ($329) = 780 Ti, and 960 = 780.
So it makes sense that 4060 Ti = 3080 at almost half the original price of the 3080, so an MSRP of $399.
Always pay attention to 60 Ti models, if they are released on release day.
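The "new 60 Ti / 70 ≈ last-gen 80" pattern listed above can be written out as a quick lookup. The pairings below are the commenter's claims plus the rumor under discussion, not confirmed specs:

```python
# "New midrange ~= last-gen 80-class" pattern from the comment above.
# These pairings are thread claims, not official benchmarks.
prev_gen_equivalent = {
    "3060 Ti":  "2080",
    "2070":     "1080",    # roughly, with DLSS on top
    "1060 6GB": "980",
    "1070":     "980 Ti",
    "970":      "780 Ti",
    "960":      "780",
    "4060 Ti":  "3080",    # the rumor under discussion
}

for new, old in prev_gen_equivalent.items():
    print(f"{new:>8} ~= {old}")
```

If the rumor held, the 4060 Ti row would simply be the next entry in a pattern that has repeated for several generations.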
Yeah, if you look at it that way it makes sense. I knew the 2060 KO, for example, had a cut-down 2080 chip, but I didn't know this was a common practice.
@@InnuendoXP I think what's affecting people's views could be how the 4070 is rumored to match the 3090 in performance, effectively punching 2 classes above its weight vs last-gen cards, while meanwhile the 4060 can't even match the previous-gen card 1 tier above it (aka the 3070). So the feeling of _budget_ really hits harder here.
When we look at last gen, the 3070 was well matched with the 2080 in various performance tasks; had there been a 2090, the 3070 would not have reached anywhere near that level. Yet the 4070 is leaked to do just that, so the gap feels even more significant down at the 4060, which effectively should be the 1-tier-below card.
Either way, time will tell. I'm sure the current claims are imprecise anyway.
@@Real_MisterSir The gen-on-gen increase we're seeing here, in % relative performance terms, isn't extraordinary if your view spans the past 20 years instead of the past 5. The two crypto booms, Radeon's half decade of struggling to catch up, and Turing's pathetic price-to-performance non-increase have heavily distorted people's expectations, conditioning them to buy less for more.
Just look at the current consoles: they're 2 years old, but a GPU as powerful as just the graphics portion of their APUs still costs more than 3/4 the price of the console. In 2005 the Xbox 360 destroyed the PC GPUs of its day, yet within 2 years, midrange PC GPUs were wiping the floor with it, and midrange meant $180-300 in those days.
Tier-on-tier gains since Pascal have been pretty pitiful, really. A 5-year-old high-end card shouldn't be equivalent to a modern midrange card; it should be absolutely decimated by it.
The 4060 being as good as or competitive with the 3080 doesn't make sense. The 4060 being roughly equal to the 3070 is what I'm expecting.
A 4060 at 230 watts is Nvidia getting absurd at this point. That heat doesn't magically disappear from your room, and the extra power doesn't magically get paid for free. It pretty much guarantees I will not upgrade this generation. It will be awkward if, in order to get a sub-200W card worth upgrading to, I have to buy something like an RTX 6050 in five years.
That's why I might opt for AMD, tbh.
@@vipersrt30 Yeah I'm looking into AMD as well.
The 60 series normally sits between the 70 and 80 series from the previous gen. The 2060 was an outlier because it was as powerful in rasterized output as a 1080. The 4060 Ti is supposed to be stronger than a 3080. Just because the 3060 was the weakest upgrade in generations doesn't mean we should either expect or accept that as the new normal.
Rest up, Dan. Interesting to see if the leaks are accurate about the 4060 & 4060 Ti.
Something still bugs me about the 4080 and 4090 performance; it just looks like too much. Knowing how nasty Nvidia can be, there's no way they would allow that much of a performance jump within one generation. The 4060 and 4060 Ti numbers, to be honest, look more accurate than the other leaked numbers. It's way too easy for them to shave down a die and make it not perform as well, just to make sure they have something lined up for the 5000 series. And FYI, they were doing this to get 3080 chips out of 3090 chips (mind you, that is 3090 chips that did not meet the criteria but were still too fast to be a 3080).
The RTX 4060 Ti matching the RTX 3080 would be INSANE. I think I won't pull the trigger this gen, but if the RTX 5060 Ti or its AMD equivalent comes with a similar performance uptick, it's time for me to go all in with my hard-earned cash. The pinnacle of price to performance.
I bought an RX 6600 XT in the spring when prices were first starting to drop, so I'm wondering what the best-value upgrade will be for me in the future. RX 6750 XTs are currently less expensive than what I paid for my RX 6600 XT lol.
The 4060 consumes more power than the 3070 and performs worse? Definitely fake, super fake.
People need to realize that as later generations of GPUs get more performant, the minimum power requirement always goes up...
So that 4060 is actually a more optimal 3090, since it's higher than a 3080? Power requirements didn't drop much.
We are seeing less and less return on price/perf because reducing node sizes from 7nm to 5nm isn't much of a shrink.
28nm to 14nm is a 14 unit jump.
14nm to 7nm is a 7 unit jump.
7nm to 5nm is a 2 unit jump...
So 4nm, 3nm, or even 1nm won't increase efficiency that much, and there's risk involved with features being that small...
Interference and such can happen more easily.
So how do we get more performance without the power struggle? We'll probably have to reinvent CPUs/GPUs entirely beyond this point, with things like stacked chips becoming mandatory over monolithic designs. But power usage won't really drop unless we use some sort of 'infinity fabric'-style approach: for instance, 4 tiny GPU chips working side by side at a 10-30% usage level each, which requires much less voltage and heat, versus a single chip that at 100% usage requires much higher voltage scaling and heat dissipation to stay stable.
All chips work on a spectrum of efficiency... At 1% to 10% usage, the voltage/performance levels barely differ; it's mostly the same. It's when you push those chips past 50% usage that you start to see the voltage and performance scaling rise dramatically (let's say sample A hits 1600mV at 50% while sample B hits 1750mV at 50% for the same perf)...
So what happens if AMD takes the multi-core CPU design approach and applies it to the GPU?
Say a 64-core multi-die GPU? Actually, GPU dies already function like that in a way; they have 'stream processor counts' and all that in there, which lets them process graphics in parallel... So all we would be doing is multiplying the number of 'dies' and the inner stream processors, and we'd need a way to sync them all to work together.
But at a basic level:
Why not have 4 "cores" of GPU dies, small, each one representing a 1080p resolution of work space, and combine the four together to make a 4K screen?
It's like having 4 borderless 1080p monitors stitched together.
If you play on a 1080p monitor, does only 1 core need to be active (100% usage)? Or perhaps all four cores split the work evenly at 25% usage each? Power usage would vastly differ, since that one core at 100% usage will create more heat than 4 cores at 25%.
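The "4 small dies at low load vs 1 die flat out" argument above can be sketched with a toy model. Dynamic power scales roughly with C·V²·f, and voltage has to rise with clock speed, so power grows faster than linearly with frequency. All voltage and frequency numbers below are made up for illustration, not real chip figures:

```python
# Toy model of the multi-die power argument from the comment above.
# Dynamic power ~ capacitance * voltage^2 * frequency; voltage must rise
# with frequency, so one fast die costs more than several slow ones.
# All numbers are illustrative assumptions, not measured silicon data.
def dynamic_power(voltage, freq, capacitance=1.0):
    return capacitance * voltage**2 * freq

# One die pushed hard: a high clock needs a high voltage.
one_die = dynamic_power(voltage=1.10, freq=2.8)

# Four dies sharing the work: each runs slower at a lower voltage,
# with the same combined clock budget (4 * 0.7 = 2.8).
four_dies = 4 * dynamic_power(voltage=0.85, freq=0.7)

print(f"one die flat out: {one_die:.2f} (arbitrary units)")
print(f"four dies split:  {four_dies:.2f}")
```

Because the V² term dominates, the split configuration delivers the same combined frequency budget at noticeably lower modeled power, which is the core of the commenter's point.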
The vram scale for the 40 series is what the 30 series should've had
As far as I know:
Ada SMs can only be disabled in blocks of 4.
Ampere SMs could only be disabled in blocks of 2.
We have already received incorrect information from kopite as well, like the 126 SMs for the 4090, so I would not automatically assume it is all fake, but I think:
.) the 4060 is 32 SMs
.) the TDP values are maximum values for those test boards. I would expect release-version TDP to be around 70-80% of that, so probably 200W.
.) We don't know if the TSE values are for the max TDP or for something in between. Also, the voltage curve is not optimized at that point. Taking other leaked TSE values into account, I am leaning more towards the ~6K scores being at release TDP.
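The SM-granularity argument above can be expressed as a quick filter. Note the blocks-of-4 / blocks-of-2 rule and the full-die SM count used here are the commenter's claims and an illustrative assumption, respectively, not something confirmed by Nvidia documentation:

```python
# Per the comment above: if Ada SMs can only be disabled in blocks of 4,
# plausible Ada SKU SM counts step down from the full die in fours.
def plausible_sm_counts(full_die_sms, block):
    # Every count reachable by disabling whole blocks, largest first.
    return [full_die_sms - block * i for i in range(full_die_sms // block + 1)]

ad106_full = 36  # assumed full AD106 SM count, for illustration only
print(plausible_sm_counts(ad106_full, block=4)[:4])  # -> [36, 32, 28, 24]
```

Under that rule, a 32-SM 4060 (as the commenter guesses) would be one block of 4 disabled from a 36-SM die, while an odd SM count would be a red flag for the leak.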
Hey, hello Daniel. I was wondering if I should get an AMD graphics card or an NVIDIA card, between the RX 6800 XT and the RTX 3050?
The 6800 XT is better.
Thank you.
Lol, does a 3050 Ti even exist? I only know about the 1050 Ti and the 3050, and those GPUs are shit.
@@godofdeath8785 The 3050 Ti only exists in laptops.
What does AD106 mean? Is it a reference to an existing SKU?
The 3080 Ti is currently $900 or so; I'm looking to buy one for $650 or so. Do you think they will drop to that in 3 months or so?
The peak power, or transient spikes, are the real challenge here. If the 4000 series does spike at 300% of nominal, you'll need a PSU upgrade for any of these, to at least 1000W.
Guess I'll have to wait. 4060ti ain't gonna cut it for me.
Honestly not looking forward at all to moving to a computer product that has the same effect as an electric space heater in my house.
Guys, there is no leak. It's only fanfiction.
At this point, I can't take any of this news seriously until there is something direct from AMD or Nvidia.
Not trying to disrespect the channel owner, since it's not his fault; he's doing his job as a journalist by reporting on this. Disrespect is not my intent.
It's just that there have been so many flip-flops with these "leaks" that I am getting allergic to GPU speculation.
Tired of these fake leaks. Tech tubers have been making these types of videos about the 4000 series for over a year. Just wait till they finally launch.
People were believing the pipe dream that was RTX 4060 = RTX 3080.
The fuck, the 60 cards should be under 200W. I remember when Maxwell came out and the 980 was about 180W. I understand the vast increase in performance since then, but the power draw is a piss-take.
3080 10GBs and even 3080 12GBs some day; the miners were melting the plastic layers, almost rendering those poor GPUs useless before getting a kill.
The 1070 and 1070 Ti have an odd number of SMs. However, what is weird is the top SKU of that die size being cut down so much. That alone would make it almost certainly an engineering sample testing things, rather than the final design.
What is the expected price of the RTX 4090 Ti?
We can pretty much guess what the prices will be... just extrapolate from how the 2000 and 3000 series were priced relative to their top cards, and THERE YOU GO, we have approximate numbers for the lower cards of the 4000 series. Can someone just do that and put a video up?
The power draw is definitely wrong if the 4080 will indeed have one of 340W. 180-240W is much more likely, which leaves 260-320W for the 4070 and 4070 Ti.
I hope next gen we get a huge perf/watt increase instead of crazyness
You can't do that anymore since they bios locked em.
Thank You for your best guess. Get well.
What would be best for gaming - AMD GPU and CPU running SAM, or the same AMD GPU and an Intel CPU with slightly higher clock?
At this rate, they're trying to make graphics card powerful enough to power the matrix. 😆
At least they will consume as much power as the matrix.😉
The RTX 3080 dropped a lot in price in Sweden. Do you think I can save a lot of money by waiting for an RTX 4060 Ti instead, or should I just go for a 3080? A 3080 is 777 dollars for the EVGA GeForce RTX 3080 10GB FTW3 ULTRA LHR.
I know that this GPU isn't much of an architectural miracle, but the trend of GPU scaling does not make me excited.
They constantly increase the power draw. Ignoring the 1kW PSU required for that, just a VRM capable of supplying that much current without blowing up will cost a lot. The PCB on which the card is built needs a lot more copper to dissipate heat quickly. The cooler for xx60 cards is massive compared to older generations.
These will never cost below $250 with such a strategy. And the raw heat output alone is horrific.
Question: if Nvidia's next-gen GPUs are way better partly due to much higher power consumption, then how will they fare once placed in laptops next year, where they cannot draw that much power? Is it safe to guess they will only slightly outperform the 3000 series laptop GPUs?
Moore's Law Is Dead suggested that Nvidia bought a lot of Ampere stock back from retailers and AIBs to be used in something else... perhaps that's why the 4060 is basically a 3060 Ti??
So... I have a 1060 6GB now. Should I wait for the RTX 4000 series to come out so I can get an RTX 3000 card at a cheaper price, or should I buy now?
It depends on how much you value saving a few hundred dollars at best. Also, if you're willing to buy used you could save a lot of money as well. The difference from a 1060 to a 3000 card is going to be huge. I wouldn't get greedy tbh but I also have a solid job so a few hundred dollars doesn't stress me like it used to when I was making 10 dollars an hour lol.
Currently, the most important thing about these cards is their price. So far, it appears to me that the 4070 is the sweet spot. The rumored price is $599.
If money is not an issue, those purchasing an Nvidia card should buy one with more than 12GB of VRAM. If money is an issue, they should not pay more than $600 for any card without more than 12GB of VRAM.
Maybe price/performance-wise, but let's be real: $600 is way more than most people will pay. There's a reason the xx50 Ti and xx60 series are always the most popular cards, by quite a significant margin. The only exception would be the 970, but that card was about half the price.
@@immanuelkant7552 Agreed. I generally go for a $180 graphics card myself. The most modern game I play is Wizardry 8. I do, however, plan on using Poser in the future, and the 4070 would appear to give me the most bang for the buck.
Looking through this comment section, I am always amazed at how seriously some people take marketing terms like 3nm. Tells you a lot about how little consumers really understand.
Are people really still thinking about spending money on the 4000 series when the 3000 series we have now is more than enough for games? Really?? 😂😂
I was thinking the same... For me personally, upgrading to the 4000 series is at least 4 years away. Unlikely to see any gain in gaming performance compared to a 3080 or better.
Can the next gen please come out I’m in desperate need of an upgrade
Same, but won't we run into issues with scalping?
@@shinobusensui9395 I’ll probably pay for a bot so I don’t miss out
Looks like AMD is going to get spanked by Intel. They better have a good counterpunch with the X3D version of Zen 4.
😂😂🤣🤣 okay
I think they're gonna raise the price of this mid-range GPU by $50 just so people buy the remaining stock of old GPUs instead. If it really outperforms the 3070, it would be great for budget 1440p gamers.
Don't believe what you hear, and believe only the half of what you see.
I love this guy, just down to earth, no stupidity. And he has a good body, check out that chest, we all know he's ripped.
I think people are over-coping that the 4060 is going to change things. Free reminder that the 3060 is basically identical in performance to the 2060 Super, which mirrors these charts showing the 4060 and 3060 Ti being almost identical. If anything is fake here, it's the 4060 Ti performance. I expect it to outclass the 3070, but beating the 3070 Ti by miles or being that close to a 3080? That's literally impossible, because the 3080 will never drop to 600 dollars and you can't price the 4060 Ti above a 3080.
This really shows why they are dumping hard and trying to force board makers to actually sell the GPUs at MSRP, instead of the current situation where the 3060 Ti is still 150 dollars over MSRP. At this rate we are likely to see inflated MSRPs for both the 30 and 40 series to "fix" the pricing. For example, I expect the 3070 Ti will go from 600 to 700 dollars MSRP, and the 4060 Ti with its true performance will fall just below that so they can price it at about 600 dollars.
This is why I hate even looking at "leak" videos: it's all fake at the end of the day, and as you can see from the comments, it turns people delusional lol. It might honestly be worth upgrading to a 30 series card just before the 40 series and the price increases are announced together. As much as I prefer upgrading every two generations, in this market I can already see even the 4060 series being priced above what I could get a 3070 Ti for right now, as companies try to dump them.
@Exter_pc on Instagram Interesting, I didn't know there were bots advertising scam GPUs for sale. Maybe I don't watch enough tech videos, or maybe it's because the tech channels I watch have better comment filters set to avoid these bots lol.
Still interesting though, first GPU bot I have seen on YouTube.
No way these cards are faster than a 3080
Why not? The 3060 Ti is faster than a 2080.
I think this would be good. Exactly right. Anyone who wants a new full-HD card from the new series? You can have one. Widening the spectrum is a very good thing.
It also makes the lineup easier to sort for graphics card newbies, and it's much smarter when it's done that way.
hey could u talk about the rx 6600m from aliexpress please 🙏🙏 i can only find videos in portuguese
To be honest the 4060 seems legit. The 3060 is barely faster than the 2060 super. I could see NVIDIA doing something similar this generation since they still want to move 30 series cards.
That power draw on the 4060 is ridiculous.
I cannot wait for the 4060 Ti or the 4070. I just hope we get back to normal prices, especially for the mid-to-high-end cards (roughly $250).
It will be $650
@@shinobusensui9395 4070? That would suck.
I just want to know if it makes sense to get a PC with a 6800 XT and an i5 12600KF instead of waiting for the new gen stuff, taking higher prices, higher power usage, higher power costs (me living in Germany) and inflation into consideration.
My advice would be, buy everything except GPU. Once the next generation drops, buy a 4060/ti card, or AMD equivalent. Power draw on GPUs is getting insane.
The current RTX 3060 and Ti cards are already small beasts that could last a very long time, so the 4000 series will surely be an even better deal (assuming you would otherwise buy a current GPU at around MSRP or a little under, which imo is a bad move).
RTX 4070 ≈ RTX 3090 Ti or so, so you should weigh the GPU's value based on that. A 3090 Ti would also very likely consume more power than the 4070. The only reason to prioritize a 3090 Ti would be the VRAM.
The 6800 XT consumes up to 300W. Buying a high-end card right now is a very bad move; you will be losing a lot of the value you invested when next gen drops. If you really want a build before next gen drops, either buy a CPU with an iGPU just to make some use out of it (I bought a 5600G back in February; the iGPU can run a fair amount of games on low settings), or buy a CPU with no iGPU plus a cheap card from the used market as a temporary solution. Something like a GTX 1070 should be enough to run most of your games at a fair performance level.
@@CHT1992 Thanks for the advice, mate. I really don't mind waiting; my priority is getting good value (price/performance ratio). I think if I wait I would decide between a 7600 XT or 7700, since I don't really need Nvidia, and AMD might have a bit better value for gaming plus more VRAM. Are you sure though that it's not also worth waiting on the CPU?
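To put the German power-cost worry in this thread into rough numbers, here's a quick sketch. All figures are assumptions for illustration only: ~0.40 €/kWh as a German household rate, 3 hours of gaming per day, and round wattage numbers rather than measured draws.

```python
# Rough annual running-cost comparison between two GPUs.
# Assumed values (not measured): 0.40 euro/kWh, 3 h gaming/day.

PRICE_PER_KWH = 0.40   # euro per kWh, assumed German rate
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365

def annual_cost(watts: float) -> float:
    """Yearly electricity cost in euros for a given average draw."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * PRICE_PER_KWH

current_gen = annual_cost(300)   # e.g. a 6800 XT class card
next_gen = annual_cost(400)      # e.g. a rumored higher-draw next-gen card

print(f"300 W card: {current_gen:.2f} euro/year")
print(f"400 W card: {next_gen:.2f} euro/year")
print(f"difference: {next_gen - current_gen:.2f} euro/year")
```

Under these assumed numbers the extra 100 W costs on the order of 40-45 € per year, so power draw matters but is small next to the purchase-price difference between generations.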
Hi, thanks for the informative video. Will Nvidia launch a 4060 Founders Edition this time? Because the 3060 currently sells for almost the same price as the 3060 Ti Founders Edition directly from Nvidia. It's laughable how expensive 3060s are because there is no Founders Edition.
The leak makes no sense. They're saying the 4060 Ti is gonna use a heavily cut-down AD104, and then the 4060 will be using a cut-down AD106? Why wouldn't the 4060 get the fully enabled AD106 die with such a performance gap between the regular 60 and 60 Ti? Unless they really plan on milking the stack and having a regular 4060, 4060 Super, 4060 Ti... 🙄
Edit: 7700X & 7600X looking pretty decent, although I like what Intel is doing with the E cores on their newer chips. I need a CPU upgrade soon, gonna be a tough call which way to go.
Why is everyone posting could-be performance numbers? Just wait and see. All these guys are just making clickbait videos.
My 3080 is already barely utilized at 1440p with my aging 9900K; I'm looking at a 6000 series AMD or 13th gen Intel CPU/mobo instead of more GPU power.
A 4060 with only 8 GB makes this card almost instantly EOL. This is a low-end card. The Ti is basically the real 4060.
In the leak the 7700x is running with 5200MT/s ram with 48ns latency which means this is almost the best it can do
I thought Infinity Cache runs at 3 GHz; wouldn't that mean the best is 6000 MT/s RAM?
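The 1:1 reasoning in these two comments can be sketched with some quick DDR5 math. The CL values below are assumed examples, not leaked figures, and note that the 48 ns quoted in the leak is full round-trip memory latency, which is always larger than CAS latency alone.

```python
# DDR transfers data twice per clock, so the real memory clock
# is half the MT/s rating; a 3 GHz clock therefore lines up 1:1
# with DDR5-6000. CL values used here are assumptions.

def memclock_mhz(mts: int) -> float:
    """Real memory clock in MHz for a given MT/s rating."""
    return mts / 2

def cas_latency_ns(cl: int, mts: int) -> float:
    """CAS latency in ns: CL cycles times the cycle time (1000 / clock_MHz)."""
    return cl * 1000 / memclock_mhz(mts)

print(memclock_mhz(6000))          # DDR5-6000 runs a 3000 MHz memory clock
print(cas_latency_ns(30, 6000))    # assumed CL30 @ 6000 MT/s -> 10.0 ns
print(cas_latency_ns(40, 5200))    # assumed CL40 @ 5200 MT/s -> ~15.4 ns
```

So under these assumptions, 6000 MT/s RAM would indeed match a 3 GHz clock 1:1, and faster RAM with proportionally looser timings keeps roughly the same latency in nanoseconds.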
They release these REALLY late, so how the f*ck are they leaking it now?