I would love to see rigid offset 145-180 degree connectors... something that would allow you to connect your cables above and/or below your GPU without having to bend the wires and put stress on the connectors.
@@boominfree7408 Yep, those are or should be available really soon. What I want is an adapter that faces the connector towards the back of the case before you connect the cables. This would ease cable management and put as little stress on the connectors as possible, since you won't need to bend the cables until they are further away from the card.
Nvidia doesn't tell their partners much about the final specs, or even give them drivers to test the cards they design. Asus probably didn't know whether it would be 400W or 600W, or how hot the chip would run, so they had to spec the cooler for the upper limit.
The going theory is that the chip was originally specced out for Samsung's process, which is still lagging behind TSMC. That's what the AIBs are specc'd for.
@@watchm4ker Sounds bogus to me, since those chips have been sitting in warehouses for a couple of months now. We've known for a long time that Nvidia is using a customized TSMC N4. Chips are designed with some specific process(es) in mind, but unless you do a full tape-out and verification with both processes, you can't just switch between them.
@@LordApophis100 Backing you up here. There is absolutely no way whatsoever they would just switch between processes, you just can't do it in a reasonable timeframe, you need to know about it way in advance and you have to deliberately design every component subsystem TO that process. This wouldn't be like picking a different circuit board manufacturer or hell, picking different components on the circuit board. This is more like fundamentally changing the physics of the system.
@@LordApophis100 AIBs have lead times, too. Especially with how complex the cooling systems have gotten, I wouldn't be surprised if designs have to be done in parallel with the chip design.
It's making more and more sense why EVGA bowed out. They couldn't deal with Nvidia's shenanigans. That cooler design was most likely intended for the 600W upper spec range, so it's pretty clear that Asus didn't know the final specs until close to the release date. If Nvidia isn't careful, they will lose another AIB partner.
@@deathstroke7316 LOL I'm sorry Slade, but you'll need to go to AMD if you want something in that power tier. Nvidia isn't interested in the budget market anymore.
A lot of VR titles are still CPU bound at this point. For instance, I can run Blade and Sorcery just fine on my 2070, but as soon as more than like 7 enemies spawn in my dear old 2700 starts showing its age.
@@abnormallynormal8823 I'm not sure implying that things are CPU limited on a 2700, and therefore CPU bottlenecked in general, is a valid point. I assume this is a joke since you yourself call your CPU old?
I've been wondering about why they aren't making 90 degree cables if the bend is such a big concern. I was planning on getting custom cabling done anyway, but I will definitely be making sure to get that fixed when I build.
Murphy's law of planned obsolescence: take $%#$^$ out till it stops working, put the last thing back and move to the next specification. This includes, but is not limited to: wire diameter, plastic thickness, strength, & longevity, corrosion resistance, fire resistance.
@@garystinten9339 works well in the usa. does not work in australia when they can say for example ahh yeeeh a computer part uhm mate that should last 12 years warranty mate
@@EddieOtool graphics cards may soon require us to have household nuclear power plants to prevent us from crashing the power grid if more than one person on a grid has a card.
I will say that I am actually impressed with the level of performance increase that these cards produce. The FPS you can get at 4K is truly amazing out of the box. Hopefully the 4080 16GB and the (should-be 4070) 12GB model have the same kind of boost, because that is where I will be looking.
@@arnoldnym2466 OP thought Jay was referring to the PCIe SLOT version 5.0, but Jay was referring to the PCIe 5.0 POWER CONNECTOR. The power connector on the 4090 is PCIe 5.0 (600 watts).
Makes me excited to see what the 5080 will do in 2 years. I got a 3080 from EVGA queue earlier this year and it's a beast for my 4k 120hz OLED. FSR and DLSS really come in clutch so I'm skipping 4000 but the performance improvements are very impressive!
Just don't expect to use it for anything above 120Hz later, as it doesn't support higher frequencies. It doesn't have DisplayPort 2.0 like AMD and Intel GPUs do.
I walked into a Best Buy earlier this year and bought a 3080 Ti FE. I'll wait for the 5080/5090 as well as Zen 5. Hopefully DDR5 won't be as expensive by then.
@@Broba_Fett yeah I went ahead and got a 5800x3d. The 8700k lasted me 5 years so I expect the 5800x3d to do the same. Plus I don't plan on upgrading my screen anytime soon, if I do it's because this TV crapped out or something so amazing came out I just have to have it.
@@Wonkanator2 the something amazing is coming out quite literally in January, so a few months from now. LG is going to be launching their 27- and 32-inch 1440p 240Hz 0.1ms gaming monitors, which are starting production in a month. So even though they're not 4K, the fidelity increase of OLED is way better than 4K.
Definitely waiting to see what rDNA3 has to offer. I don't plan on building a new rig until spring tho so hopefully by then, prices won't be too insane but it'll definitely be between a top end AMD or Nvidia. I'm partial to Asus so I'll be going that way regardless haha
hmm have Asus ever done an AMD card? I can't recall any, Guess you'll be going with Nvidia anyway if sticking to that brand. Would love for them to do something with AMD though. They usually do hardware design very well.
This will be my second thing to do when I build mine. First is to benchmark Prepar3D v5.2. Not sure why it's never brought up by flight simmers. It's way more mature and has more mod development than MSFS.
I think LTT's review had MSFS data. At 4K it opens up some decent performance, but at 1440p it's limited by CPU performance. In dense, AI-heavy areas (busy airports like LAX), you're still bound by CPU, so you won't really see any performance improvements. We need faster single-core processing speed to get any better performance in MSFS.
I still have a GTX 1080 in 2023 (almost 2024) and it still performs really well for 1440p gaming. Just ordered the 4090 and I'm ready to be blown away.
Jay did make a video exposing that Nvidia's CEO is artificially limiting production to manipulate supply and demand, so it makes sense that they'd sell out.
I'm waiting to see how the 4080 series will look, as well as whatever they deem to call the 4070 and eventually the 4060, and where those will fall into the lineup as far as price/performance goes, as well as compatibility with older systems. I had just gotten a 3060 Ti after they became fully available again from EVGA (and are now considered extinct) and it's working fabulously on a Zen 3 system.
Are you really thanking him for waiting until the cards release to give us the numbers? He is doing what he loves to do, not having to pay a dime, and in the end making money at it. Maybe people don't remember what it was like a couple of years ago, so let me remind you: youtubers used to give us the specs early so we could make an informed decision on our purchase, and now these new youtubers take money to screw us. It's really sad that people like you embolden their actions.
@@xxJudgmentalxx You mean those who want to keep producing content and not get their asses handed to them in court for breaking the contract and embargo?
PHIL, that transition from Jay's gunshot to the numbers was golden! Love your work, man! Those really are just chonky cards. I can see from the charts why EVGA might have also stepped out. Seems like there isn't much more room to tweak the cards, and Nvidia might have hobbled that. Kind of puts some comments Nvidia & EVGA made into perspective in regards to what AIBs really do to cards now.
Looking forward to the "so you want to buy a 4090?" video where you show the cases that support this card, the power supplies that would work with it, the monitors that get the most out of the card along with CPUs that can fully take advantage. The 40 series is not only ridiculously overpriced by itself, but all the other considerations if you want to get the most from this mean the real price of entry to make proper use of this is astronomical.
Not only that, once you actually have one, you're gonna cry once the energy bill comes in, especially with where energy prices are going at the moment (at least in Europe).
@@Doggettxx Probably not. 1 kWh is a bit less than 1 euro. Looking at some YouTuber 4090 benchmarks, average total system consumption in the worst games is around 500W. That means less than 0.5 euro per hour of gaming. If you can spend $4000 on such a system (+ $2000 for a good 4K 120Hz monitor and peripherals?), then the power consumption is still not very relevant in terms of costs. If you use it for work then yes, but as it is much more efficient per rendered frame than a 3080/3090, it means cutting electrical costs, not the opposite. In fact it can also reduce costs in CPU-limited games because of better efficiency at the same frame rate.
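The arithmetic in the comment above can be sketched in a few lines. This is a rough estimate only; the ~500 W total system draw and ~1 EUR/kWh worst-case European price are the commenter's figures, not measurements.

```python
# Rough electricity cost per hour of gaming, using the comment's assumptions:
# ~500 W total system draw and ~1 EUR per kWh (a worst-case European price).

def gaming_cost_per_hour(system_watts: float, eur_per_kwh: float) -> float:
    """Return the electricity cost in EUR for one hour of gaming."""
    kwh_per_hour = system_watts / 1000.0  # watts -> kilowatt-hours per hour
    return kwh_per_hour * eur_per_kwh

cost = gaming_cost_per_hour(500, 1.0)
print(f"{cost:.2f} EUR/hour")  # 0.50 EUR/hour
```

At cheaper rates (say 0.40 EUR/kWh) the same session costs around 0.20 EUR/hour, which supports the point that power cost is minor next to the hardware price.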
@@Doggettxx I imagine that whoever is buying a 4090 and making an ultra-high-end build with it is already kinda rich, at least much more so than an average person, so I don't see how the bills are going to be a problem for those people. 4090 + high-end parts + 4K monitor, or more likely a 4K OLED TV; we are talking like $5-6k minimum. Using a 4090 with anything less than that is a waste, so I don't see an average person buying this unless they stop eating just to save for it, but that's their problem then, along with the bills. Should've known better.
I'm going from a 1080 to a 3090. Same price. Since I play at 1440p, I have no real need for anything else. I'll transition to 4K gaming when they're making cards for 8K 120 XD
To me these results just push home even further what EVGA said about Nvidia and the concern of them basically pushing the AIB partners out of the market. Even more certain on waiting for rDNA3 before I upgrade now.
Well the high-end market as Nvidia won't be producing anything below a 4070. I suppose it depends if AIBs are interested in mid-tier GPUs - which I am sure they are. But they now know they will be missing out on the higher end.
@@Anukinihun Yeah. I am also waiting till next year for a proper ATX 3.0 PSU. Then I will consider building a new PC from the ground up. Until then I am happy with my 9700K and 2080.
The 3090s are like half the price now, so this one should also cost between 8-10K once the 5000 series is released. So everyone will have a chance to get it sooner or later. :)
Now we know why EVGA was so unhappy. The FE board cooling design is more than partners are capable of matching. Like AMD with Ryzen 7000, Nvidia has "overclocked" from the factory as well, leaving little on the table for partners. As der8auer showed, this could have been a 300-350W two-slot card without all the dramatics. The Nvidia FE leaves partners little room.
Seriously on point. EVGA saw this coming and said hell no. What a joke. The AIBs are essentially just selling you fancy cases and pretty lights for your card now.
Clock speeds are just a number if no context is provided. If we are talking performance, the 4090 is 60% better than the 3090 but costs around 60-70% more, so it's not great bang for the buck; the $900 4070 and the 4080 are going to be even less of a value pick. So imo 30 series GPUs are still very much worth it.
@@vinylSummer A 3090 is around $1100-$1200. The 4090 gives you 60-80% more performance at a 50% higher price (the same price as when the 3090 launched), so it is quite worth it.
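The two comments above disagree on value, and it comes down to which street prices you plug in. A quick perf-per-dollar sketch using the thread's own rough numbers (a ~$1150 3090, a $1599 4090 at ~70% faster; illustrative figures, not benchmarks):

```python
# Performance per dollar, normalized to the 3090 as the 1.0 baseline.
# Prices and the ~70% uplift are the rough figures from the thread above.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

rtx3090 = perf_per_dollar(1.0, 1150)   # current street price per the reply
rtx4090 = perf_per_dollar(1.7, 1599)   # midpoint of the quoted 60-80% uplift

print(f"3090: {rtx3090 * 1000:.2f} perf per $1000")
print(f"4090: {rtx4090 * 1000:.2f} perf per $1000")
```

With these inputs the 4090 comes out slightly ahead; with the first comment's launch-era pricing ("60-70% more expensive"), the two cards roughly tie, which is why both takes can be right.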
Hey Jay, please also add power consumption charts to these benchmarks, cuz nowadays it's more relevant than at any time since GPUs were a thing. Or are you gonna talk about it in the OC video?
Jay, thanks for mentioning Vrel is the perf limit for these cards. That was the most important statement in the video for me. I was shopping for the Strix due to the cooler size and 600W power limit, hoping to get my usual extra 6-8% from OC. Knowing Vrel is the limiting factor, I will feel comfortable with a lower-spec model when I see one in stock. When thermals and power aren't the limiting factor, SKUs with smaller coolers and lower-quality power delivery will do just fine. The trick now is to find a 4090 in stock...
This makes EVGA's angry storm-out make more sense. How are you going to design a cooling system that is robust and within profit margin without knowing what the MSRP of the card is going to be...
Now take a look at the uniform 600W TDP coolers of the AIBs and the fact that Nvidia "suddenly" changed the TDP to 450W. Nvidia isn't even pretending anymore.
It'd be nice to get FE cards officially in Australia as a reference point for AIB pricing, but alas, the range goes from A$2900-3800 at two of our major retailers. Ouch, I'm good 🤣
Oh, so there is no way to get the 4080 FE at retail in AUD? Fuck! I'll just buy the 3080 and call it a day; I'll just get the 5000 series or even the RX equivalent cards when they are out.
Which retailers, bro? I drove to Scorptec Sydney from Brisbane just to get the R9 7950X at launch. Within minutes it was out of stock while I was paying with Afterpay🤣. Who has stock of the 4090 in AU so I can trade in my car?
I was also checking in Aus for FE options but haven't seen any. Even on the official site, nothing is listed when you click "Buying Options" under the 4090...
I will never understand why the aftermarket/AIB versions of these cards only use a thin 2-slot bracket, while the somewhat comparatively smaller FE card uses a beefy 3-slot bracket. Make it make sense! Companies need to make cards with a 3-slot bracket as standard nowadays. It would help the GPU sag issues a lot.
Ikr?! Like who do they think they're fooling with those 2-slot brackets? If the card is going to take up the slot, the bracket may as well too for extra rigidity.
Gonna get mine in ~2-3 weeks (~$2800 here in Romania for the OC model). Can't wait to play all the new games like Dead End City & Vampire Survivors & Blazing Chrome 👏👏👏
I'm very interested in undervolting results. I have no interest in the 40 series unless it can be undervolted a healthy amount, even if it's at the cost of some performance, since the jump from the 30 series seems huge anyway. If it really is just a matter of juiced-up power, we'll see undervolting expose that, right?
Already saw a chart where they were able to keep 95% of the performance at 70% power, which makes me happy. I think at max that was 260 watts, but it may have been a tad lower. I think it was shown in one of Paul's recent Sunday tech news videos.
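The quoted undervolt result implies a big efficiency win, which a quick perf-per-watt calculation makes concrete. Assumptions here: 95% of stock performance at 70% of the 450 W stock power limit, i.e. 315 W (the commenter half-remembers ~260 W, so treat all of this as approximate).

```python
# Perf-per-watt gain from the undervolt result quoted above:
# ~95% of stock performance at 70% of the 450 W stock power limit.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

stock = perf_per_watt(1.00, 450)
undervolted = perf_per_watt(0.95, 450 * 0.70)  # 315 W

gain = undervolted / stock - 1
print(f"Undervolted draw: {450 * 0.70:.0f} W")
print(f"Perf-per-watt gain: {gain:.0%}")  # roughly +36%
```

If the chart's figure really was closer to 260 W, the efficiency gain would be even larger, which fits the "juiced-up power" theory in the parent comment.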
At this point, with these small gains from a significant price increase, I'd much rather put more money on a proper airflow chassis than a 3rd party 4090 edition. Overall performance lifts will likely be similar and you have the added bonus of a rig with better longevity.
Man, I've always loved Asus for motherboards and graphics cards anyways, but this card is just sexy. Definitely need to get my hands on this and the Z790 Maximus Hero for my new build. I'm past due for an upgrade. Have a 2070 SUPER, i7-10700k, and Maximus XII Hero I got pre-release back before COVID.
EKWB released their FE block and I believe their STRIX block has just released as of embargo day. Can't wait to see how much smaller these cards get, and what minimum loop would be needed to keep these cards as cool as their giant air coolers. Also want to see how older PCI-E 3.0 systems handle these new cards, if bandwidth is now a factor.
4000 series Nvidia is PCIE 4.0 data interface and 5.0 power delivery. PCIE 3.0 cpu / boards might bottleneck the new cards, but its not because of the gen 5 power spec.
I think to really see the difference between this and the FE, you'd probably have to do a test that shows both boost clocks and fan RPM on a graph together. I imagine the larger heatsink mass should let the Strix edition run with the fans really low more often at the same performance than the FE would, but that's just a guesstimate based on visual observation of the sheer size difference rather than anything else.
This is the best looking non-FE card so far imo. It would fit really well in my Lian Li Dynamic XL case. Price reflects its overall aesthetic and performance, but is not cool enough to be $400 more than FE imo. My 3080 TI FE will suffice for a few years. Exciting times for content creation, because this series is essentially fueling the possibilities of what we will see in the near future. Cheers!
Honestly, I can't say I like the design at all, at least compared to the 30 series Strix design. I owned a 3060 Ti Strix and a 3080 Ti Strix while mining, and now I've got a white 3090 Strix for my main rig. Something about the red and blue and the square design of this card puts me off, idk.
@@00gr1ngo8 The white 3090 OC Strix is the most beautiful card made to date imo. I agree this new 4090 isn't as sleek; also, you can't run them in SLI, unlike the 3090.
The Asus RTX 4090 24GB Strix GPU in Australia is A$3799 and is currently listed as out of stock at most PC sales shops. My 1080 Ti still runs most of the games I play at 60FPS in 4K, so any upgrade I consider will be the 3080 Ti or a Radeon card.
The Asus 4090 may only have a slight edge over the Nvidia 4090 FE out of the box, but my question is how much of an edge does it have at full overclock? Can you do a video on that?
The cheapest RTX 4090 listed in my country is 1951.57 USD, including VAT. I got myself a 5900X with an RX 6800 XT in 2021, and I'm using a 27" 1440p monitor. I don't see any reason to upgrade, mostly cuz I don't need RT and I mostly play retro games.
HEY JAY! I was standing in line at my local Canada Computers this morning waiting for one of the 12 4090s they had in stock, and I actually talked myself out of the purchase. The fact that it has some weirdo power connector and I would have to pony up for a new PSU changed my mind. I have a Corsair 850W Platinum that I bought for the 3080 I got at the last launch. I'm tired of upgrading my entire power system every time a new top-tier card comes out. I'll wait for the 4080, but seriously, what does this thing pull from the wall when you are running something like my setup: a 5950X with 6 SSDs, 32GB RAM, and mostly overclocked everything? I also thought winter is coming and this card would probably let me turn my furnace off for the season... what kind of temps does this thing push into a room? FPS is great, but there's more to a good gaming setup than just the most frames. If gaming costs $6 a day in power vs $20, that's a big deal. If my room turns into the sweatiest, smelliest room in the house, is that a good gaming experience? I am excited, but not for this card anymore. If they had just kept it under 850 watts total system power, sure, I could live with that. I dunno, maybe I am crazy not to have gotten one today. Too much money for too many compromises.
@@MrRafting more like obsessed. I like trying to hit the top of the leaderboards for 3DMark benchmarks. I mostly game on a 170Hz 1440p monitor, so even the 3080 is a little extreme.
Similar experience here. Instead of a platform upgrade in a year I decided to get a new GPU. Mainly since I didn't want to deal with the higher power requirements of the 40 series but also the "new" hardware prices of DDR5 and motherboards. So when CC had the evga 3070Ti on sale I bought it to replace my 5700XT.
SO UPDATE.... I actually couldn't wait... I went and got one. TURNS OUT 850W is plenty for the Gigabyte OC card. It pulls 450W exactly at 99% GPU utilization in Control, and in synthetic benchmarks too. The CPU pulls about 120ish watts (5950X), so 850 is lots if you run an AMD Ryzen 9. Spoiler alert: this card is insane. Sold my 3080 to a kid who desperately wanted an upgrade from his GTX 1660; for $550 he got a nice Asus 3080 that will make his year. I'm a hypocrite, sue me. Oh, and as far as thermals go, it's better than my old 3080 by a long shot. Windforce 3X is truly good. It barely fit in my NZXT H510, but with the Noctua Chromax Black, I have renamed this PC Big Chungus.
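The headroom claim above can be sanity-checked with a quick sum. The GPU and CPU figures come from the comment (450 W measured, ~120 W for the 5950X); the motherboard/RAM/SSD and fan numbers are my guesses, not measurements.

```python
# Back-of-the-envelope PSU headroom check for the build described above
# (450 W GPU + ~120 W 5950X on an 850 W unit). The "board_ram_ssd" and
# "fans_misc" figures are assumed estimates, not measured values.

def psu_headroom(psu_watts, draws):
    """Return remaining wattage after summing estimated component draws."""
    return psu_watts - sum(draws.values())

draws = {
    "gpu": 450,            # measured at 99% utilization in Control
    "cpu": 120,            # 5950X under gaming load
    "board_ram_ssd": 80,   # assumed: motherboard, 32GB RAM, six SSDs
    "fans_misc": 30,       # assumed: fans, AIO pump, USB devices
}

print(f"Headroom: {psu_headroom(850, draws):.0f} W")  # 170 W spare
```

Roughly 20% spare capacity at full load, which matches the commenter's conclusion that 850 W is enough; transient spikes are the usual caveat with this generation of cards.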
I bought a 3080 earlier this year, and it was the first time I really splurged on a card (not through a scalper don't worry). I'm happy with the performance I get and I always intended for this card to last me a long time. Unless I get a VR headset sometime in the near future, I'm not getting a new card for at least 6 years. That's the plan anyway.
That's a great purchase! I got the exact same mindset. I'm going 3090, only because the price fell so fast to normal and i need to fully use my 1440p screen. Just got my new rig in Feb so the other parts are brand new. Replacing my 3060ti, but that one is going into a future extra desktop that I'll use at my mom's house. It should last a while for 1080p i use there. :)
I really think these bigger, heavier cards and the extra price will be useful only for overclocking, whenever that becomes available. I would love to see that video come soon.
I feel like this is another time the AIBs got screwed by Nvidia. Like you said, the AIBs are trying to meet the 600W spec, but what if that's all the info they were given? It's been mentioned before that they sometimes don't even get drivers before the media does, so it could be the case that they just had to prepare for the worst-case scenario. So now they're screwed in two ways: 1. MASSIVE cards that limit their target audience. 2. (Probably) a much higher BOM, and thus a higher sale price that then seems unreasonable compared to the FE. So yeah, I see why EVGA had enough of the nonsense.
Perhaps EVGA saw this coming. I wouldn't be surprised if Nvidia wants to get rid of the AIBs altogether. Honestly, how many units does Asus expect to sell of this card? It is worse than the FE in every way:
- 25% more expensive than the FE
- only a few extra FPS in each benchmark, so probably less than a 5% performance gain
- needs a bigger case than the FE
- same temps as the FE
- comparable noise level to the FE, if you ignore the coil whine on the Asus anyway
- uglier than the FE (personal preference)
Maybe this Asus card allows you to overclock higher than the FE, but enough to justify the extra 400 USD? I have my doubts.
That's what really concerns me with this generation of cards. If the FE cards are actually decent this time around, they are likely going to be the go to card for people just because they'll be cheaper for similar or the same performance. EVGA may have been right to get out of the market.
@@TheSlickmicks I assume the same. This time Nvidia "forgot" to mention the power target is not 600W anymore... so they had to overspend on a cooling solution. What a friendly move, isn't it.
I just got the 4080 version of this card and it's absolutely incredible. The one thing that actually impressed me the most is how quiet it is. Even when doing Stable Diffusion renders it's so, so much quieter than the 3080 Ti I replaced. And it's much faster, and now I can play 4K games at 60+ FPS!!!
I think Jay made a mistake at the end of the video: the 4000 series does not have Gen 5 PCIe. They are 4.0 x16. 5.0 x8 would have freed up more PCIe lanes for NVMe SSDs :(
I just imagined Jay hardcore dancing to the song that starts at 2:50 while waiting for benchmarks. Didn't know he got down like that lolol. Solid video as always. I'm waiting on someone to do benchmarks for the gigabyte 3090 card as that's the manufacturer I'm currently waiting on to drop.
The revelation of the voltage lock is interesting. Extreme overclockers should probably wait until they can unlock it. I'm working with a small case, so these large cards won't work for me. I'm looking at the AIO ones as a possible fix, or, worst case, I'll just have to get a large case and forget about trying to have a portable PC.
@@yasinparti4385 During the height of the graphics card shortage that was a fairly standard price. You couldn't find even older gen cards for reasonable prices. Anyone who had a dead card back then was screwed.
The cheapest 3080 Ti on the market, the Palit RTX Gaming Pro OC, at launch in my country retailed for the equivalent of $2850, back in June of 2021. I also got an RTX 3080 Ti, in June of this year, for the equivalent of $900. At launch, the same model was retailing for $3050. So yeah, those were some crazy times.
Thanks for the video. But it would be very interesting to have a decibel comparison between the FE and the Strix, especially because with the 3090, the FE was the most silent 3090 card and the Strix was the noisiest 3090 card (and the coolest). So I'm VERY interested in this decibel comparison for the 4090 cards.
Why does the volume of the GPU matter? Sure, if it sounds like a plane. But you've got your headphones on 99% of the time your GPU is doing anything that might cause it to make a lot of noise, don't you?
@@jovanmaric6160 No, I don't. And specifically, I didn't ask "Please, let me know if the GPU's volume matters"; I clearly and politely asked for a dB comparison between 4090 cards. At the moment, I'm interested only in that information. Many thanks.
Now just bring the available power usage down by like 150 to 250 watts, bring the price down by about 500-700 bucks, and maybe then it would be a good choice to pick up.
It can actually get pretty insane; the semiconductor industry needs insane amounts of capital to keep churning. You might be able to get an idea of the price of the raw material, but all the research and development among various manufacturers down the chain is pretty hard to account for. It's an Nvidia card, but they still need to get their chips fabbed by TSMC, and TSMC needs money to pay for their hella-expensive stuff from ASML, so they gotta tax too... it's messy lol
At these price points it would be insane not to wait to see what AMD brings to the table. My process will likely be 1) wait for RDNA3 2) select highest performer 3) wait for EK to make a block for that card
It's not insane if you plan to use it in productivity workflows. AMD cards only recently got optimizations for Blender, but if you're using Blender you'll have a better time with OptiX, which is an Nvidia exclusive. I don't care about AMD GPUs because they have yet to catch up with Nvidia in the productivity world.
What I love the most about your 4090 reviews are the guitar riffs. After listening to them, I'm totally oblivious to anything you have to say about the GPUs. All kidding aside, thanks for the 4090 reviews! Awesome sauce!
I personally would never pay more than 800$ on any graphics card however awesome it may be. This is just my personal limit. Much less for pecuniary than for sanity reasons. Impressive cards nonetheless.
As for the power connection to this card, there is a Corsair cable that plugs into the card and into two 8-pin Type-4 PCIe slots on the PSU. Whilst not a right-angle plug, it is far more flexible than the adapter that comes with the card. The cable is the Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU Power Cable, CP-8920284.
Someone needs to do a "which cases the strix 4090 fits in" video.. seriously.
@@dievas_ My radiator makes it so the 3080 barely fits
I will put my 4090 in an aquarium.
P600s darkbase 802 from be quiet phanteks evolv. Just to name a few! I’m swapping from p600s to 802 cause I think it just looks elegant. And I’m going to do a full loop on this bad boy! With a 4090!
Every full/super tower, which is what people who are not on a budget use. (People on a budget can't use a 4090 anyway.)
@@aritramondal6475 Just get a Thermaltake View 71. It has room for the 4090 and 5 radiators in front of it. Seriously, this case is just too damn big.
If that was left in a hardware store, someone would mistake it for an A/C unit...
Oh.... So that's why my new A/C plugs into a PCIe 4 slot...
Remove the heatsink and it's a heater
@@nhojlagap6222 nah it's the heatsink that transfers the heat to the air, turning it into a heater. Otherwise it's just a boiling piece of silicon.
A space heater would be more accurate.
Looks like one of those vertical window ACs but in reality it’s a small heater for your room lol
I read that the AIBs were told from Nvidia, that the cards will be 600W and they need to build for that wattage. Of course at the last min.... Nvidia informed the AIBs that it would only be around 450W. By then, the AIBs had already built for 600W. Thus the reason we are seeing so many cards that are over built!
I'm wondering how much of that was in response to uproar over the meme-TDPs. But really the biggest scandal to me is the pricing. Like, all I have to say negative about nVidia really ultimately boils down to its greed. It wouldn't actually be such a bad product, maybe even some of the literal fire hazards, were it not for the fact they were offering these way worse TDPs and jacking prices too. So that's what we were bitching about, nvidia, that we had to now factor in new PSU costs to our PC building budget.
And now you can see part of why I just spent my money on clothes and stuff instead. Because I had to tally up my cost totals for a new monitor and new GPU and if you tacked onto that a new PSU it's like, nah.
I expect the 4090TI / new TITAN card may require 600W
Is this why EVGA exited the market?
Crazy that they were able to fit a radiator from a '98 Honda Civic into a 4090! Now that's some cooling power!
best comment :)
VTEC just kicked in, yo!
I can only imagine that 4090 reaching 6.5 GHz GPU core pulling 1000W
Excellent
HAahaha! Lost it.
I didn't notice how ridiculously big the 4090 FE is until you brought out the 3080 and it looked TINY when that card used to be huge before.
The PCB is only half the total length. I'm looking for naked 4090s to watercool.
The joke with 40 series will be you build your PC inside your GPU.
@@Shalmaneser1 you can use one of the AIO variants, no?
It's insane
@@Shalmaneser1 umm ....aio 4090 exists right??
I would really love to see VR performance of some sort across generations
Same, that’s where my only limitations these days are anymore.
Oh yea good point, these cards are amazing for VR
8k flight simulator, half life alyx and others
@@-BEASTOR- I kinda just want to see comprehensive videos talking about like what hardware should be focused on in a PC that does VR. I assume GPUs are the most important part but I want to see how much VR improvement is noticed across generations. I also would guess that more VRAM would be incredibly impactful on VR performance to a greater degree than typical gaming
VR is shit and dead
Slight correction to what Jay said, these cards are PCIe Gen 4 and not gen 5, though they take the PCIe Gen 5 power cable. AMD's upcoming cards will use PCIe Gen 5 though.
I feel that a tech reviewer/influencer shouldn't be getting a key spec detail like this incorrect, especially seeing as videos are shot multiple times and edited before being released.
@@Paulrus10 Jay doesn't really do scripted content but speaks off the top of his head, so mistakes like that are more likely.
@@LordApophis100 are you telling me that Jay or the rest of his team don't look over the video when editing before they publish it? If they don't then that doesn't seem very professional to me.
@@Paulrus10 They sometimes do corrections (in text), but they don't do perfectly edited content.
@@Paulrus10 this is youtube not Steven Spielberg blockbuster
I remember thinking how huge the 30 series FE cards looked when they were released. Seeing that 3080 next to the 4090... holy crap!
This. When he pulled the 3080 at 8:40 it feels like a toy compared to the 4090
@@fffabz6013 I can't help but suspect that this is the main reason for the comically oversized coolers. I can feel my monkey brain going; "More BIG is more better!"
1080ti was bigger than a 3080ti
@@mrBurns-si4jt It didn't need to be, though. That was a 200-250 watt card, so it needed a much less powerful cooler.
@@fffabz6013 and then the 4090 looks small compared to the after market cards.
I would love to see rigid offset 145-180 degree connectors... something that would let you connect your cables above and/or below your GPU without having to bend the wires and put stress on the connectors.
You might also consider 3D printing a wire loom to hold said wires, relieving the stress and maybe redirecting them?
---Just my two cents.
Just a custom 90deg ended cable would be great
@@boominfree7408 Yep those are or should be available really soon.
The reason I want an adaptor is that it faces the connector towards the back of the case before you connect the cables. This would ease cable management and put as little stress on the connectors as possible, since you won't need to bend the cables until they are further away.
Nvidia doesn't tell their partners much about the final specs, or even give them drivers to test the cards they design. Asus probably didn't know if it would be 400W or 600W, or how hot the chip would run, so they had to spec the cooler for the upper limit.
The going theory is that the chip was originally specced out for Samsung's process, which is still lagging behind TSMC. That's what the AIBs are specc'd for.
@@watchm4ker Sounds bogus to me since those chips were sitting in warehouses for a couple of months now. We knew for a long time that Nvidia is using the customized TSMC N4. Chips are designed with some specific process(es) in mind, but unless you do a full tape out and verification with both processes you can't just switch between.
@@LordApophis100 Backing you up here. There is absolutely no way whatsoever they would just switch between processes, you just can't do it in a reasonable timeframe, you need to know about it way in advance and you have to deliberately design every component subsystem TO that process. This wouldn't be like picking a different circuit board manufacturer or hell, picking different components on the circuit board. This is more like fundamentally changing the physics of the system.
@@LordApophis100 AIBs have lead times, too. Especially with how complex the cooling systems have gotten, I wouldn't be surprised if designs have to be done in parallel with the chip design.
It's making more and more sense why EVGA bowed out. They couldn't deal with Nvidia's shenanigans. That cooler design was most likely intended for the 600W upper spec range, so it's pretty clear that Asus didn't know the final specs until close to the release date. If Nvidia isn't careful, they will lose another AIB partner.
Nice to hear that both 4090s are quite silent and very well cooled
And so huge they can't even fit in a case
@@vroom3257 Not unless you get creative ;)
@@sirmrmcjack2167 get creative with a 1600$ card? lol u got sum ballz
@@AliYassinToma watercool it
@@RiceCubeTech he meant like cutting the case up or modding the card itself, I guess. That's what I understood.
Yes!~ So glad someone picked up the 90 degree adapter. I still don't know why they don't just ship with one for the price and size.
That thing could probably double as a load-bearing structure for one of those desk PC builds... or a house.
I'm really looking forward to seeing your test results of the 4080 Pro and the 4080 Faux.
Fauxty80
16gb is pretty interesting. Its wattage is very impressive compared to 3090ti.
I want to see review of 4030. I finally want to upgrade my gt1030
@@deathstroke7316 LOL I'm sorry Slade, but you'll need to go to AMD if you want something in that power tier. Nvidia isn't interested in the budget market anymore.
lmao great pfp
hey Jay for these super high end cards would you consider doing VR benchmarks which is somewhere where most decent cards still struggle?
A lot of VR titles are still CPU bound at this point. For instance, I can run Blade and Sorcery just fine on my 2070, but as soon as more than like 7 enemies spawn in my dear old 2700 starts showing its age.
@@abnormallynormal8823 I'm not sure implying that things are CPU limited on a 2700, and therefore CPU bottlenecked in general, is a valid point. I assume this is a joke since you yourself call your CPU old?
@@abnormallynormal8823 I thought VR was the biggest benefit of a higher-end GPU and 4k, yadda yadda? Well, that's a bummer.
@@pandemicneetbux2110 I do VR and 4K exclusively. My 4090 is constantly at 95-98% utilization. My 11th gen CPU usually hovers around 45-65%.
I've been wondering about why they aren't making 90 degree cables if the bend is such a big concern. I was planning on getting custom cabling done anyway, but I will definitely be making sure to get that fixed when I build.
Murphy's law of planned obsolescence: take $%#$^$ out till it stops working, put the last thing back and move to the next specification. This includes, but is not limited to: wire diameter, plastic thickness, strength, & longevity, corrosion resistance, fire resistance.
@@Shalmaneser1 so you leave yourself open to have someone solve the problem and make money off your back lol?
@@garystinten9339 Works well in the USA. Doesn't work in Australia, where they can say, for example, "ahh yeah mate, a computer part, uhm, that should last 12 years warranty mate"
I always use 90 degree connections, it makes a much neater build.
Yeah...and now let's talk about USB cables...
In the future we might need a separate enclosure for the graphics card with its own power supply and graphic port hookup.
when thunderbolt becomes a common connector for ANYTHING with graphics
Plugged onto a separate powerplant, whenever possible.
Laptops already have this so I don’t see why not lol
future ? :v probably needed now xD
@@EddieOtool graphics cards may soon require us to have household nuclear power plants to prevent us from crashing the power grid if more than one person on a grid has a card.
I will say that I am actually impressed with the level of performance increase that these cards produce. The FPS you can get at 4k is truly amazing out of the box. Hopefully the 4080 16GB and the 12GB model (which should be a 4070) have the same kind of boost, because that is where I will be looking.
Jay, the 4090 is actually still PCIe 4.0, not 5.0, so the 3.0 performance probably won't be too bad.
1:49 The 4090 actually uses a PCIe 5.0 power connector. He was not referring to the PCIe slot version.
@@quantumdestroyer6039 Yes he clearly was. Why else would he talk about performance degradation?
Yeah, I thought I heard they were PCIe 4.0; still more than enough, really.
@@quantumdestroyer6039 Then why was he talking about pcie 3.0 bandwidth lmao
@@arnoldnym2466 OP thought Jay was referring to PCIe slot version 5.0, but Jay was referring to the PCIe 5.0 power connector. The power connector on the 4090 is PCIe 5.0 (600 watts).
Makes me excited to see what the 5080 will do in 2 years. I got a 3080 from EVGA queue earlier this year and it's a beast for my 4k 120hz OLED. FSR and DLSS really come in clutch so I'm skipping 4000 but the performance improvements are very impressive!
Just don't expect to use it for anything above 120Hz later, as it doesn't support higher refresh rates. It doesn't have DisplayPort 2.0 like AMD and Intel GPUs do.
I walked into a Best Buy earlier this year and bought a 3080ti FE. I'll wait for the 5080/5090 as well as Zen5. Hopefully DDR5 pricing is not as expensive.
@@Broba_Fett Yeah, I went ahead and got a 5800X3D. The 8700K lasted me 5 years, so I expect the 5800X3D to do the same. Plus I don't plan on upgrading my screen anytime soon; if I do, it's because this TV crapped out or something so amazing came out that I just have to have it.
@@bjrnerikmol7737 yeah exactly, I'd imagine the 5000 series will have displayport 2.0 though
@@Wonkanator2 The something amazing is coming out quite literally in January, so a few months from now. LG is launching their 27 and 32 inch 1440p 240Hz 0.1 ms gaming monitors, which start production in a month. Even though they're not 4K, the fidelity increase of OLED is way better than 4K.
Definitely waiting to see what rDNA3 has to offer. I don't plan on building a new rig until spring tho so hopefully by then, prices won't be too insane but it'll definitely be between a top end AMD or Nvidia. I'm partial to Asus so I'll be going that way regardless haha
If it's for gaming, you don't need these cards unless you splurge 5-10k on a full 4k setup
@@grasthube oh trust me I know exactly what I have to do for my next build lmao 🤣
Hmm, has Asus ever done an AMD card? I can't recall any. Guess you'll be going with Nvidia anyway if sticking to that brand. Would love for them to do something with AMD, though. They usually do hardware design very well.
@@microjet9563 Asus made rDNA2 cards?
@@microjet9563 I literally just bought an ASUS TUF RX 6800XT
Finally I can install my case in my Asus 4090
And people thought ITX cases have no use...I need to go spelunking now to find the power button...
I would have loved to see MSFS 2020 on the benchmarks , that thing is a FPS monster
This will be my second thing to do when I build mine. First is to bench Prepar3D v5.2. Not sure why it's never brought up by flight simmers. It's way more mature and mod-developed than MSFS.
I think LTT's review had MSFS data. At 4k it opens up some decent performance, but at 1440p it's limited by CPU performance. In dense, heavy-AI areas (busy airports like LAX), you're still CPU-bound, so you won't really see any performance improvements. We need faster single-core processing to get better performance in MSFS.
@@final3119 or instead, microsoft should hire some programmers, who can write code that executes parallel, on multi-core CPU
@@ricsip parallelization is not easy in titles like these
I still have a GTX 1080 in 2023 (almost 2024) and it still performs really well for 1440p gaming.
Just ordered the 4090 and I'm ready to be blown away.
I skipped a few generations and I'm down to go all in with a new build. Seems cards are still selling out immediately even at that price.
I just built with a 3080 and it's still pretty top tier. The 4000 series is so big
Jay did make a video exposing that nvidia ceo is artificially limiting production to manipulate supply and demand, so it makes sense that they'd sell out.
@@sandbox242 Shit, I thought that might happen.
It's called scalping
@@buddmcstudd6994 I thought about that, but I do have a 4k monitor. It would be nice to utilize it as well.
I'm waiting to see how the 4080s series will look, as well as whatever they deem to call the 4070 and eventually 4060.
Where those will fall into the lineup, as far as price/performance goes, as well as compatibility with older systems.
I had just gotten a 3060ti after they became fully available again from EVGA (and are now considered extinct) and it's working fabulously on a zen3 system.
I had a 3080 Ti, had issues, and returned it to Amazon; since they don't have any in stock :( I will have to wait for the 4080
4070? The next card Nvidia will release is the $799 4080 8GB.
They will look the same and perform less as has always happened.
@@wolfshanze5980 then the $499 4080 6GB lololo
You're a very memeable creator
Thank you for working so hard to get these vids pumped out!
Are you really thanking him for waiting until the cards release to give us the numbers? He is doing what he loves while not having to pay a dime, and in the end he makes money at it. Maybe people don't remember what it was like a couple years ago, so let me remind you: youtubers used to give us the specs early so we could make an informed decision on our purchase, and now these new youtubers take money to screw us. Really sad people like you embolden their actions.
Theres an embargo
@@xxJudgmentalxx theres an embargo. What do you expect them to do? Lmao
@@peterlegend That never stopped the youtubers before but it stops these greedy ones that don't care a lick about us.
@@xxJudgmentalxx You mean those who want to keep producing content and not get their asses handed to them in court for breaking the contract and embargo?
That gunshot transition was amazing!
PHIL, that transition from Jay's gunshot to the numbers was golden! Love your work, man! Those really are just chonky cards. I can see from the charts why EVGA might have also stepped out. Seems like there isn't much more room to tweak the cards, and Nvidia might have hobbled that. Kind of puts some comments Nvidia & EVGA made into perspective in regards to what AIBs really do to cards now.
Looking forward to the "so you want to buy a 4090?" video where you show the cases that support this card, the power supplies that would work with it, the monitors that get the most out of the card along with CPUs that can fully take advantage. The 40 series is not only ridiculously overpriced by itself, but all the other considerations if you want to get the most from this mean the real price of entry to make proper use of this is astronomical.
Not only that, once you actually have one, you're gonna cry once the energy bill comes in, especially with where energy prices are going at the moment (at least in europe)
@@Doggettxx yep. UK here... The energy cost to run this would be crippling
@@Doggettxx Probably not. 1 kWh is a bit less than 1 euro. Looking at some YouTubers' 4090 benchmarks, average total system consumption in the worst games is around 500W. That means less than 0.5 euro per hour of gaming. If you can spend $4000 on such a system (+ $2000 for a good 4K 120Hz monitor and peripherals?), then the power consumption is still not very relevant in terms of costs.
If you use it for work then yes, but as it is much more efficient per rendered frame than a 3080/3090, it means cutting electricity costs, not the opposite.
In fact it can also reduce costs in CPU-limited games because of better efficiency at the same frame rate.
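The cost arguments in this thread reduce to one simple formula; here is a back-of-the-envelope sketch using the commenters' assumed figures (~500 W draw, ~0.50 EUR/kWh, 2 hours a day), not measured data:

```python
# Rough gaming electricity cost, using the assumed figures from the
# comments above: ~500 W total system draw and ~0.50 EUR per kWh.
def gaming_cost(watts: float, hours: float, price_per_kwh: float) -> float:
    """Energy cost for `hours` of play at an average draw of `watts`."""
    kwh = watts / 1000 * hours
    return kwh * price_per_kwh

# One hour of gaming:
per_hour = gaming_cost(500, 1, 0.50)        # 0.25, i.e. "less than 0.5 euro/hour"
# Two hours a day, every day, for a year:
per_year = gaming_cost(500, 2 * 365, 0.50)  # ~182, same ballpark as "£200 a year"
print(per_hour, round(per_year))
```

Both comments check out against each other: half a euro per hour and roughly 200 a year are the same estimate at different time scales.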
@@Doggettxx Probably cost about £200 a year to game on it with current energy prices, mental.
Assuming you play for 2 hours a day, every day, that is.
@@Doggettxx I imagine that whoever is buying a 4090 and making an ultra high-end build with it is already kind of rich, at least much more than the average person, so I don't see how the bills are going to be a problem for those people. 4090 + high-end parts + 4k monitor or, most likely, a 4k OLED TV; we are talking like $5-6k minimum. Using a 4090 with anything less than that is a waste, so I don't see an average person buying this unless they stop eating just to save for it, but that's their problem then, along with the bills; should've known better.
These cards are insane performance wise. I just bought an EVGA FTW3 3090 for $800. Moving up from a 1080ti, I am quite happy with that!
Dude I legit did the same thing 😂
Even the latest 4090 can't maintain 60fps @4k across all games, so I'm not seeing the "insane performance"
That card will last you a very long time. Grats on that card though :)
it can, in most games. If it doesn’t, use DLSS3?
I'm going from a 1080 to a 3090. Same price. Since I play at 1440p, I have no real need for anything else. I'll transition to 4K gaming when they're making cards for 8K 120 XD
Small correction if it hasn't been mentioned yet, but the RTX 40 series are not PCIe 5.0, they're still PCIe 4.0.
To me these results just push home even further what EVGA said about Nvidia and the concern of them basically pushing the AIB partners out of the market. Even more certain on waiting for rDNA3 before I upgrade now.
🤣
Asus pricing is their own volition tho. Nobody asked for such a stupid card.
Well, the high-end market at least, as Nvidia won't be producing anything below a 4070. I suppose it depends on whether AIBs are interested in mid-tier GPUs, which I am sure they are. But they now know they will be missing out on the higher end.
Gotta say it kinda hurts to see such a great card that normal people will never be able to even touch!!
you def can, just gotta mortgage the house
wait for RDNA 3.
@@Anukinihun Yeah. I am also waiting till next year for proper ATX 3.0 PSU. Then I will consider building new PC from ground up. Until then I am happy with 9700K and 2080
if you are not playing 4k 144hz or 1440p 240hz this card is stupid.......
The 3090s are like half the price now, so this one should also cost between 8-10K once the 5000 series is released. So everyone will have a chance to get it sooner or later. :)
absolutely love the background music, thank you.
Now we know why EVGA was so unhappy. The FE board and cooling design is more than partners are capable of matching. Like AMD with Ryzen 7000, Nvidia has "overclocked" out of the box as well, leaving little on the table for partners. As der8auer showed, this could have been a 300-350W two-slot card without all the dramatics. The Nvidia FE leaves partners little room.
Seriously on point. EVGA saw this coming and said hell no. What a joke. The AIBs are essentially just selling you fancy cases and pretty lights for your card now.
Maybe the "partners" should focus on models with sensible cooling solutions which can fit in existing PC cases?
Nvidia would never give such an uplift in performance unless they were forced to. It makes me think that AMD will be releasing something hefty.
I assume it will trade blows in raw performance, behave like DLSS 2.0 in RT, and cost 200 USD less
they better be! amd has 2 options:
better performance at the same price or same performance at a better price
lmao. AMD is at least 2 gens behind, maybe in 2030 they'll release something on par with a 4090.
@@kasumi8239 Oh sweet summer child....
With clock speeds like that, it makes no sense to buy any 30 series card for more than 500 bucks...
Clock speeds are just a number if no context is provided. If we are talking performance, the 4090 is 60% better than the 3090 and costs around 60-70% more, so it's not great bang for the buck; the $900 4070 and 4080 are going to be even less of a value pick. So imo 30 series GPUs are still very much worth it
@@vinylSummer The 3090 is around $1100-1200. The 4090 gives you 60-80% more performance at a 50% higher price (the same price as when the 3090 launched), so it is quite worth it.
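The two value claims above can be settled with simple perf-per-dollar arithmetic; a sketch using the commenters' assumed numbers (relative performance and street prices, not benchmark data):

```python
# Perf-per-dollar comparison with the commenters' assumed numbers
# (hypothetical figures, not benchmarks): 3090 as the 1.0x baseline
# at ~$1150 street, 4090 at ~1.7x the performance for $1600 MSRP.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

v3090 = perf_per_dollar(1.0, 1150)
v4090 = perf_per_dollar(1.7, 1600)
# A ratio > 1 means the 4090 delivers more performance per dollar.
print(f"4090 vs 3090 value: {v4090 / v3090:.2f}x")
```

With these assumed inputs the 4090 comes out ahead per dollar; with the lower end of the claimed uplift (1.6x) and a higher 3090 street price the gap narrows, which is why both commenters can be "right" depending on the numbers they plug in.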
Hey Jay, please also add power consumption charts to these benchmarks, cuz nowadays it's more relevant than at any time since GPUs were a thing. Or are you gonna talk about it in the OC video?
I just got this GPU from microcenter today at launch day! From gtx 1060 to rtx 4090 baby! 🔥🔥🔥
Good job, I can't get mine due to scalpers...
What an upgrade lol...
Hope u pair it with the highest refresh rate monitor though, or u waste your money!
Jay, thanks for mentioning that Vrel is the perf limit for these cards. That was the most important statement in the video for me. I was shopping for the Strix due to the cooler size and 600W power limit, hoping to get my usual extra 6-8% from OC. Knowing Vrel is the limiting factor, I will feel comfortable with a lower-spec model when I see one in stock. When thermals and power aren't the limiting factors, SKUs with smaller coolers and lower-quality power delivery will do just fine. The trick now is to find a 4090 in stock...
Would love to see an ISO power comparison between a 3090 TI, 6950xt, and 4090 all allowed to run or capped at 500 watts...
This makes EVGA's angry storm-out make more sense.
How are you going to design a cooling system that is robust and within profit margin without knowing what the MSRP of the card is going to be...
Now take a look at the uniform 600W-TDP coolers from the AIBs and the fact that Nvidia "suddenly" changed the TDP to 450W. Nvidia isn't even pretending anymore.
Ordered one of these, in a few days I'll officially enter the enthusiast tier
It'd be nice to get FE cards officially in Australia as a reference point for AIB pricing, but alas the range goes from A$2900-3800 at two of our major retailers
Ouch, I'm good 🤣
Oath. Very frustrating to have the cards get way overcooked in terms of price just because the FE isn't around.
Oh, so there is no way to get the 4080 FE at retail in AUD? Fuck! I'll just buy the 3080 and call it a day; I'll get the 5000 series or even the RX equivalent cards when they are out
Which retailers, bro? I drove from Brisbane to Scorptec Sydney just to get the Ryzen 9 7950X at launch. Within minutes it was out of stock while I was paying with Afterpay 🤣. Who has stock of the 4090 in AU so I can trade in my car?
EVGA said they couldn't compete with Nvidia. They weren't exaggerating.
I was also checking in Aus for FE options but haven't seen any. Even on the official site, nothing is listed when you click "Buying Options" under the 4090...
EVGA was on to something when they decided to cut ties with the green team
I will never understand why the aftermarket/AIB versions of these cards only use a thin 2-slot bracket, while the comparatively smaller FE card uses a beefy 3-slot bracket. Make it make sense! Companies should make 3-slot brackets standard nowadays; it would help a lot with GPU sag.
Ikr?! Like who do they think they're fooling with those 2-slot brackets? If the card is going to take up the slot, the bracket may as well too for extra rigidity.
EVGA left for this reason. The FE is now THE choice for the 4000 series, and Nvidia has tons of chips/cards to sell; no more shortage!!
I like how the 4090 sold out instantly. Exactly what happened last launch.
Nowhere near as bad; I can still get a 4090 online from one retailer, and a few stores have some in stock... after 4 hours on sale.
There is no shortage of rich people willing to buy every single top end card that releases
Let's see the stock levels in 6 weeks 👀
@@B1u35ky or average Joes that swipe the remaining balance on that credit card up and down up and down up and down
Gonna get mine in ~2-3 weeks (~$2800 here in Romania for the OC model). Can't wait to play all the new games like Dead End City & Vampire Survivors & Blazing Chrome 👏👏👏
I managed to get the 4090 Suprim, the water-cooled one. I hope it ends up good. Never had an MSI card before. Can't wait for your review of that one
Pray you don't have to RMA it; MSI is rubbish and will screw you over
I'm very interested in undervolting results. I have no interest in the 40 series unless it can be undervolted a healthy amount, even at the cost of some performance, since the jump from the 30 series seems huge anyway. If it really is a matter of juiced-up power, we'll see undervolting expose that, right?
I already saw a chart where they were able to keep 95% of the performance at 70% power, which makes me happy. I think at max that was 260 watts, but it may have been a tad lower.
I think it was shown in one of Paul's recent Sunday tech news videos.
If the 4090 can get more perf for less power than the 3090 Ti, then it should easily be possible to undervolt, I would guess
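The perf-per-watt gain implied by that chart is easy to estimate; a sketch where the 95%/70% figures come from the comment above (unverified) and the 450 W stock power target is the commonly reported 4090 spec:

```python
# Perf-per-watt estimate for the undervolt/power-limit described above.
# Assumed inputs: 95% of stock performance retained at a 70% power limit,
# against the 4090's reported 450 W stock power target.
STOCK_WATTS = 450
perf_kept = 0.95
power_frac = 0.70

capped_watts = STOCK_WATTS * power_frac   # where the limiter sits: 315 W
efficiency_gain = perf_kept / power_frac  # perf/watt relative to stock

print(f"~{capped_watts:.0f} W cap, ~{efficiency_gain:.2f}x perf/watt vs stock")
```

Note the comment cites ~260 W observed, which is below the 315 W cap; actual draw can sit under the limit, so the real efficiency gain would be even larger than this ~1.36x estimate.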
I got my Strix at Micro Center this morning, and they were also selling ATX 3.0 PSUs. I got the 1650W one, and it really helps with the bending issue.
Baller
Yeah me too, got the 4090 strix but passed on the 3.0 PSU, it was a $500 seasonic.
At this point, with these small gains from a significant price increase, I'd much rather put more money on a proper airflow chassis than a 3rd party 4090 edition. Overall performance lifts will likely be similar and you have the added bonus of a rig with better longevity.
Yeah I regret not getting fe 4090. Don't want to spend over 2k on aib
I predict a lot of pre setup vertical mount motherboards and cases in the future.
All-white NV7 delivering Saturday mated with ASUS ROG Strix GeForce RTX 4090 White OC, baby!! Shitload of D30 120's coming too. Jacked up over here.
Man, I've always loved Asus for motherboards and graphics cards anyways, but this card is just sexy. Definitely need to get my hands on this and the Z790 Maximus Hero for my new build. I'm past due for an upgrade. Have a 2070 SUPER, i7-10700k, and Maximus XII Hero I got pre-release back before COVID.
EKWB released their FE block, and I believe their Strix block just released as of embargo day. Can't wait to see how much smaller these cards get, and what minimum loop would be needed to keep these cards as cool as their giant air coolers. Also want to see how older PCIe 3.0 systems handle these new cards, if bandwidth is now a factor.
I don’t know why he said they were 5.0 cards. They’re not. The 4090 is pcie 4.0.
The 4000 series from Nvidia is PCIe 4.0 for the data interface and PCIe 5.0 for power delivery. PCIe 3.0 CPUs/boards might bottleneck the new cards, but it's not because of the Gen 5 power spec.
Thanks for pointing out the potential PCIe 3.0 impact; it will be useful for everyone on 10th-gen and earlier CPUs
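For context on how much bandwidth is actually at stake between slot generations, the theoretical x16 numbers fall straight out of the spec's raw rates; a quick sketch:

```python
# Theoretical x16 link bandwidth per PCIe generation, from the spec's
# raw rates: 8, 16, and 32 GT/s for Gen 3/4/5, all with 128b/130b encoding.
GT_PER_SEC = {3: 8, 4: 16, 5: 32}

def x16_bandwidth_gb_s(gen: int, lanes: int = 16) -> float:
    # 128 payload bits for every 130 bits on the wire, 8 bits per byte.
    return GT_PER_SEC[gen] * (128 / 130) / 8 * lanes

for gen in (3, 4, 5):
    print(f"PCIe {gen}.0 x16: ~{x16_bandwidth_gb_s(gen):.1f} GB/s")
```

So a 4090 dropped into a Gen 3 slot has roughly 15.8 GB/s instead of 31.5 GB/s; whether that halving matters in practice depends on the game, which is exactly what's worth benchmarking.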
I think to really see the difference between this and the FE, you would probably have to do a test that graphs both boost clocks and fan RPM together; I imagine the larger heatsink mass should let the Strix run its fans lower more often at the same performance than the FE would, but that's just a guesstimate from the sheer size difference rather than anything else.
Agreed, maybe more OC room also. Waiting for more tests.
This is the best looking non-FE card so far imo. It would fit really well in my Lian Li Dynamic XL case. Price reflects its overall aesthetic and performance, but is not cool enough to be $400 more than FE imo. My 3080 TI FE will suffice for a few years. Exciting times for content creation, because this series is essentially fueling the possibilities of what we will see in the near future. Cheers!
Honestly, I can't say I like the design at all, at least compared to the 30 series Strix design. I owned a 3060 Ti Strix and a 3080 Ti Strix while mining, and now I've got a white 3090 Strix in my main rig. Something about the red and blue and the square design of this card puts me off, idk
@@00gr1ngo8 The white 3090 OC Strix is the most beautiful card made to date imo. I agree this new 4090 isn't as sleek; also, you can't run them in SLI, unlike the 3090
Are you sure you don't need the 80% performance gain?
I like the MSI Suprim X the most. It looks amazing in the silverstone case I own. Black/silver
The Asus RTX 4090 24GB Strix GPU in Australia is A$3799 and is currently listed as out of stock at most PC shops. My 1080 Ti still runs most of the games I play at 60FPS in 4K, so any upgrade I consider will be the 3080 Ti or a Radeon card.
the upcoming Radeon ones look to be insane.
The Asus 4090 may only have a slight edge over the Nvidia 4090 FE out of the box, but my question is how much of an edge it has at full overclock. Can you do a video on that?
Voltage locked
Got to wait for the rgb cables to pair with the card for the performance boost. I like to set mine to blue for better cooling.
This is like the best benchmark stats display EVER!! Love the music!
The cheapest RTX 4090 listed in my country is $1951.57 USD including VAT. I got myself a 5900X with an RX 6800 XT in 2021, and I'm using a 27" 1440p monitor. I don't see any reason to upgrade, mostly cuz I don't need RT, and I mostly play retro games
HEY JAY! I was standing in line at my local Canada Computers this morning, waiting for one of the 12 4090s they had in stock, and I actually talked myself out of the purchase. The fact it has some weirdo power connector, and that I would have to pony up for a new PSU, changed my mind. I have a Corsair 850W Platinum that I bought for the 3080 I got at the last launch. I'm tired of upgrading my entire power system every time a new top-tier card comes out. I'll wait for the 4080, but seriously, what does this thing pull from the wall when you are running something like my setup: a 5950X with 6 SSDs, 32GB of RAM, and mostly overclocked everything? I also thought: winter is coming, and this card would probably let me turn my furnace off... what kind of temps does this thing push into a room? FPS is great... but there's more to a good gaming setup than just the most frames. If gaming costs $6 a day in power vs $20, that's a big deal. If my room turns into the sweatiest, smelliest room in the house... is that a good gaming experience? I am excited, but not for this card anymore. If they had just kept it under 850 watts total system power, sure... I could live with that. I dunno, maybe I am crazy not to have gotten one today. Too much money for too many compromises.
The fact that you say you have a 3080 and wanted to buy a 4090 means you're 101% crazy.
@@MrRafting More like obsessed. I like trying to hit the top of the leaderboards for 3DMark benchmarks. I mostly game on a 170Hz 1440p monitor, so even the 3080 is a little extreme.
Similar experience here. Instead of a platform upgrade in a year, I decided to get a new GPU, mainly because I didn't want to deal with the higher power requirements of the 40 series, but also the "new" hardware prices of DDR5 and motherboards. So when CC had the EVGA 3070 Ti on sale, I bought it to replace my 5700 XT.
I have a corsair 850w platinum
Hardware unboxed did their entire review on an 850w PSU
SO, UPDATE... I actually couldn't wait... I went and got one... TURNS OUT 850W is plenty for the Gigabyte OC card. It pulls exactly 450W at 99% GPU utilization in Control... synthetic benchmarks too. The CPU pulls about 120-ish watts (5950X), so 850W is lots if you run an AMD Ryzen 9.
Spoiler alert... this card is insane. Sold my 3080 to a kid who desperately wanted an upgrade from his GTX 1660... for $550 he got a nice Asus 3080 that will make his year.
I'm a hypocrite... sue me.
Oh, and as far as thermals go... it's better than my old 3080 by a long shot. Windforce 3X is truly good.
It barely fit in my NZXT H510, but with the Noctua chromax.black... I have renamed this PC Big Chungus.
I bought a 3080 earlier this year, and it was the first time I really splurged on a card (not through a scalper don't worry). I'm happy with the performance I get and I always intended for this card to last me a long time.
Unless I get a VR headset sometime in the near future, I'm not getting a new card for at least 6 years. That's the plan anyway.
That's a great purchase!
I got the exact same mindset.
I'm going 3090, only because the price fell back to normal so fast and I need to fully use my 1440p screen. Just got my new rig in Feb, so the other parts are brand new.
It's replacing my 3060 Ti, but that one is going into a future extra desktop that I'll use at my mom's house. It should last a while for the 1080p gaming I do there. :)
I really think these bigger and heavier cards, and the extra price, will only be useful for overclocking, whenever that becomes available. I would love to see that video come soon.
makes it quieter as well.
@@jondonnelly3 Idk about that; I think once it starts to heat up, the fans may make some noise.
I wonder how far you could push this card with a water cooler
I'm sure Jay will find out. He is the king of water-cooling
Not far thanks to locked voltage, thanks Nvidia.
@@sidewinder86ify I'm sure there will be ways around it.
I like the song, or rather the two riffs on repeat, while you guys show the graphs. Keep the breakdowns comin
I feel like this is another time the AIBs got screwed by Nvidia.
Like you said, the AIBs were trying to meet the 600W spec, but what if that's all the info they were given? It's been mentioned before that they sometimes don't even get drivers before the media do, so it could be the case that they just had to prepare for the worst-case scenario.
So now they’re screwed in two ways -
1. MASSIVE cards that limit their target audience
2. A (probably) much higher BOM, and thus a higher sale price that then seems unreasonable compared to the FE.
So yeah. I see why EVGA had enough with the nonsense.
Perhaps EVGA saw this coming. I wouldn't be surprised if Nvidia wants to get rid of the AIBs all together.
Honestly, how many units does ASUS expect to sell of this card? It is worse than the FE in every way:
- 25% more expensive than the FE
- only a few extra FPS over the FE in each benchmark, so probably less than a 5% performance gain
- needs a bigger case than the FE
- same temps as the FE
- comparable noise level as the FE? if you ignore the coil whine on the ASUS anyway
- uglier than the FE (personal preference)
Maybe this ASUS card allows you to overclock higher than the FE, but enough to justify the extra 400 USD? I have my doubts.
As someone who has the Asus Strix RTX 3080, this new cooler design is a massive downgrade.
Also a MASSIVE downgrade in terms of the cards aesthetics.
Jay has become my go to for fun tech news and all the specs. Love the production and attention to detail.
Actually, he comes off as too campy at this point.
That's what really concerns me with this generation of cards. If the FE cards are actually decent this time around, they are likely going to be the go to card for people just because they'll be cheaper for similar or the same performance. EVGA may have been right to get out of the market.
Exactly what I was thinking
Looks like EVGA made the right choice, I'll bet other brands will follow suit after this term. So let's hope Nvidia can make a lot more cards.
If NVIDIA loses more partners it will hurt them big time. I think you are right though. I bet we see more partners cut ties after this.
@@TheSlickmicks I assume the same. This time Nvidia "forgot" to mention that the power target is not 600W anymore... so the AIBs had to overspend on a cooling solution. What a friendly move, isn't it?
I just got the 4080 version of this card and it's absolutely incredible. The thing that actually impressed me the most is how quiet it is. Even when doing Stable Diffusion renders, it's so, so much quieter than the 3080 Ti I replaced. And it's much faster, and now I can play 4K games at 60+ FPS!!!
To me, anything above 30fps is just fine; gaming realism is what I'd love to see improved.
Regarding being bottlenecked by older PCIE, LTT also pointed out these cards can also be bottlenecked by their DisplayPort.
I think Jay made a mistake at the end of the video: the 4000 series does not have Gen 5 PCIe. They are 4.0 x16. 5.0 x8 would have freed up more PCIe lanes for NVMe SSDs :(
Linus is a bottleneck for Anthony
I just imagined Jay hardcore dancing to the song that starts at 2:50 while waiting for benchmarks. Didn't know he got down like that lolol. Solid video as always. I'm waiting on someone to do benchmarks for the gigabyte 3090 card as that's the manufacturer I'm currently waiting on to drop.
The revelation of the voltage lock is interesting. Extreme overclockers should probably wait until they can unlock it. I'm working with a small case, so these large cards won't work for me. I'm looking at the AIO ones as a possible fix. Or, worst case, I'll just have to get a large case and forget about trying to have a portable PC.
Might be too big for you, but a Fractal Compact should fit the FE card, just not the AIB ones.
Can't believe I paid 2k for my 3080 Ti Strix when it launched. Damn, that 4090 Strix is looking sweet. Shame we've no games to play lol
I hope you mean 3090Ti, otherwise that’s a HORRIBLE deal...
@@yasinparti4385 During the height of the graphics card shortage that was a fairly standard price. You couldn't find even older gen cards for reasonable prices. Anyone who had a dead card back then was screwed.
ohh boi....
The cheapest 3080 Ti on the market, the Palit RTX Gaming Pro OC, retailed at launch in my country for the equivalent of $2850, back in June of 2021. I also got an RTX 3080 Ti in June of this year for the equivalent of $900. At launch, the same model was retailing for $3050. So yeah, those were some crazy times.
@@yasinparti4385 Yeah bro, it was bad timing indeed. And now I'm back, as I just paid 1650 for this ROG 4090. Going to be a great winter!
I wish you would have linked the 90-degree 16-pin PCIe adapter from CableMod... I can't find it anywhere on their website.
Thanks for the video. But it would be very interesting to have a decibel comparison between the FE and the Strix, especially because with the 3090, the FE was the most silent 3090 card and the Strix was the noisiest (and the coolest). So I'm VERY interested in this decibel comparison for the 4090 cards.
What does the volume of the GPU matter? Sure, if it sounds like a plane. But you've got your headphones on 99% of the time your GPU is doing anything that may cause it to make a lot of noise, don't you?
@@jovanmaric6160 No, I don't. And specifically, I didn't ask "please let me know whether the GPU's volume matters"; I clearly and politely asked for a dB comparison between 4090 cards. At the moment, I'm interested only in this information. Many thanks.
Quoting Jay, he said the Strix was silent but has slight coil whine.
@@adrianocastaldini according to other testers the FE is pretty silent too. Check out other reviewers.
Commenting before they fix the title
Same here
Card is completely overkill
Oooooomgggg
@@TheMaddoxfam uhm, did you read the title correctly?
@@TheMaddoxfam That's not what I am talking about...
The metal music in Jay's video is the best part lol
Regardless of everything else, I really love the strix design. Looks so cool
Now just bring the power usage down by like 150 to 250 watts, bring the price down by about $500-700, and maybe then it would be a good choice to pick up.
I'd love for someone to do a breakdown of the actual cost to produce one of these, so we could really see what value there isn't in it.
It can actually get pretty insane; the semiconductor industry needs huge amounts of capital to keep churning. You might be able to get an idea of the price of the raw materials, but all the research and development among the various manufacturers down the chain might be pretty hard to account for. It's an Nvidia card, but they still need to get their chips fabbed by TSMC, and TSMC needs money to pay for their hella-expensive equipment from ASML, so they gotta tax too... it's messy lol
At these price points it would be insane not to wait to see what AMD brings to the table. My process will likely be 1) wait for RDNA3 2) select highest performer 3) wait for EK to make a block for that card
It's not insane if you plan to use it in productivity workflows. AMD cards only recently got optimizations for Blender, but if you're using Blender you'll have a better time with OptiX, which is an Nvidia exclusive. I don't care about AMD GPUs because they have yet to catch up with Nvidia in the productivity world.
7:17 A.K.A. "Scientific Method".
I think the 3090s are probably the best example ever of needing to water-cool your system; they're a prime candidate for water cooling.
we are talking about the 4090s here, not the 3090s. ;)
What I love the most about your 4090 reviews are the guitar riffs. After listening to them, I'm totally oblivious to anything you have to say about the GPUs. All kidding aside, thanks for the 4090 reviews! Awesome sauce!
I have a feeling ASUS is going to take in a lot of the homeless EVGA users
I personally would never pay more than $800 for any graphics card, however awesome it may be. This is just my personal limit, much less for pecuniary than for sanity reasons. Impressive cards nonetheless.
I'm super excited. I've never had the best card at launch, and I have a 4090 coming 🤩
Make sure to get MSI Afterburner. Better to have slightly more aggressive cooling than stock; my 3090 died because of this.
lol they are all sold out
As awesome as the technology in this card is, it's just absurdly overkill for 99% of users
95 percent of users won't ever be able to buy a xx90-series GPU.
@@Hi-levels yeah that too lmao
As for the power connection to this card, there is a Corsair cable that plugs into the card and into two 8-pin Type 4 PCIe slots on the PSU. While not a right-angle plug, it is far more flexible than the adapter that comes with the card. The cable is the Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU Power Cable, CP-8920284.