Get the news on the NVIDIA RTX 40 launch here: ruclips.net/video/3tZ01ymHZEs/видео.html Learn about EVGA leaving the video card industry here: ruclips.net/video/cV9QES-FUAM/видео.html Grab the BRAND NEW GN 'Amp' Medium Anti-Static Modmat for PC building, in stock and shipping now: store.gamersnexus.net/products/medium-modmat-v2
Calling for GamersNexus to petition to FTC for mandatory full spec list for GPU be in their boxes. This is to help consumers/buyers not to be misled by Nvidia and their Partners regarding the performance of GPUs, like they doing with 4080 16gb and 4080 12gb variants with massive difference in Cuda cores. Hope GN help protect its viewers and the consumers.
DLSS essentially downgrades the resolution while making it look like a higher quality resolution in order to increase frame rates. Much like the 40 series cards... give them a shiny new look, make it seem faster and sell it at a higher price...
@@techpriest4787 watercooling... it isn't expensive at all, it's just marketed as such because it hasn't been the default product ever and thus is sold as a premium. But realistically it's pretty much just as cheap to produce and more efficient both in terms of thermals and size.
I get the feeling that the reason many manufacturers have omitted the RTX 4080 12GB pictures is because they originally rendered them as "RTX 4070" and Nvidia blindsided them.
yea there's an irish youtuber who does tech stuff and he went over this in detail. The 12gb 4080 is without a doubt the 4070 rebranded. IDK why they did it other than to price gouge people who don't know better.
I think I know what really happened to EVGA. They heard the rumors of marketing on these cards from their competition and they were like "I'm too old for maximum dark power obelisk bullshit" and just left the industry. Seriously tho imagine how hard the market is on these manufacturers, you have to design something that will be functional in coolling down what is essentially a furnace on a chip and make it look "special", market it right and sell it with a laughable margin because of the insane competition all the while jensen goes around asking you how you dare wanting partner information on a product you're going to sell and provide support for, when you don't do anything at all and rake in a lot of money for doing nothing. This whole market is insane and it's unlike anything else really.
I think it’s pretty obvious evga was pissed nvidia was letting random Chinese companies cut their legs from under them given all the new cooler manufacturers appearing from thin air. It’s all made in China anyways so we will see if there’s a real difference
I'm impressed with how professional he managed to remain throughout the whole thing. Multiple companies needed to be called out to their face and told to eff off.
@@benjaminoechsli1941 No True Nerdsman, if he wanted it pronounced that way he should have written it that way (blame USA copyright maximalism since Jif (food brand) could sue Jif (file format) even though the risk of confusion is non-existent) though I am sad that this decades long dispute is irrelevant now that webms are better quality for lower file size and hardware accelerated on practically every mobile device.
If the best cardmaker pulls out, you know its a dumpster fire. On paper that should be the one AIB partner Nvidia would bend over backwards to keep. I hope this gen Nvidia gets the 3dfx treatment and take a much needed hit to their ego and come back down to reality.
In fairness, they're nVidia fanboys so And yes, fanboys. Because I was calling it a year ago when I said Lovelace was probably going to be a shitshow, and it's so much worse than I expected. Literally the only people left willing to put up with this shit are either amateur professionals who have to use some obscure nVidia feature for rendering, and the fanboys who'll buy a brick of elephant shit as long as it has the logo. Everyone else has been drifting toward AMD for awhile now. Last time I even saw a truly great nVidia generation it was still called GTX. Every single RTX lineup has been bad and gotten progressively worse, from DLSS 1.0 and "price uplift" to the absolutely insane instability and power issues, nVidia's bad drivers etc. on Ampere with nutso price gouging only overshadowed by their MSRP looking "reasonable" from scalping. I could already tell just based by TDPs that AMD had a threatening lineup coming up. What I didn't expect was for EVGA to be driven away from nVidia's abusive relationship and the AIBs to go nuts also but well, at least it's entertaining.
@@lukemiller511 That's right, that's part of my point. And out of that, some are even worse than others, and it speaks volumes that even after so many soulsucking corpos think nVidia is beyond the pale to work with for soulless profit, there's still a bunch of nVsimps out there who'll keep defending this company.
These cards need to start being sold in the home appliance sections of big box stores. The power and heat management are gaining ground on small cooking appliances.
6:35 That dev literally went back ~10 years and found an ancient Stackoverflow technique to rape mouse wheel events with jQuery and control video playback via parallax scrolling jank. Incredible. Counting my blessings as a front-end developer. We are a special people.
You know, back in the pascal days, i thought that "zotac arctic storm" and "palit super jetstream" were absurd names for graphics cards. Now we have serious "night baron" and "midnight kaleidoscope" cards with bionic shark fans supported by dark obelisks and anti-gravity that give me 905,637 square millimeters of absolute dark power and i dont know if i want to laugh or faint... great vid though!
@@Ernismeister don’t forget the GAMING GAMING DARK OBELISK VENTING COPIUM SUPER GAMER MAX STAY SILENT SPONSORED GAMING GAMING AAAHH STOP IT NVIDIGAMING GAMING GAMING
This was hands down the funniest GN video I've watched. LOL Steve's reactions to these absolutely ludicrous naming schemes is priceless. "Bringing users a new level of brilliance and absolute dark power" made me literally laugh out loud. Keep up the great work, guys.
This was truly a masterpiece, one for the Internet records. The only thing missing is the upcoming Yeston "Waifu" GPU line-up: RTX 4070 "Juggernaut's Juggies of Joy" using a Dual Antimatter Interdimensional support stick. RTX 4080 "The Thicc Thighs of Terror" featuring a truly inspiring Obelisk of Manhood support stick. RTX 4090 " Kahuna's Kryptonite" using the Stick of Enchanted Anti-Gravity to support the massive weight and contain the dark power of the Kahunas from transforming into a supermassive black hole.
@@MustacheMerlin Humiliating? It'd be hilarious. Who wouldn't want to watch people die inside when they ask what video card you have, and you get to say 'oh yeah the JUGGERNAUGHT JUGGIES OF JOY' with the 'MOMMY MOO WATER COOLING BLOCK'
Yugi: "Dark Magician Suprim! Use *Absolute Dark Power* Attack!" Kaiba: "I activate my trap card! *Dark Obelisk* holds your attack!" Yugi: "Big Mistake Kaiba! My Night Baron uses his special effect *Midnight Kaleidoscope* to absorb your trap card for me to use later!" Kaiba: "NO!!"
Someone gotta write an SCP story out of this, where the GPU corrupts the users and slowly turns their room into a portal to hell 😂 The corruption progresses fastest when the card runs at full power. Overclocking results are redacted.
Hey ya'll gonna regret your cynicism after the last AIB leaves and you're stuck with identical-looking cards directly from Nvidia... you'll have to buy GPU sleeves like you do for iPhones 😄
I some ways, GPUs are becoming more important than CPUs, nowadays. I have a good RTX 3000 and an old 3770k, and still it is able to run most recent AAA games @ 80 to 95 fps ^^.
We're officially at the point where the ATX standard now needs to apply to graphics cards and we should screw the graphics card into the chassis and then have the motherboard plug into the graphics cards PCIe slot.
I remember old cases with guide rails on the front to support long ISA cards. It's like fashion: everyone who kept his old trumpet trousers from the seventies was in again a few years ago...
AMD is really going to takeover from next year. I fucking love the company because they don’t rush things just so they can release before the competition like Nvidia and Intel. They also innovate in terms of efficiency and new technology like 3d v Cache. Nvidia is just increasing the power limits, clock speed and heat and there you go a new gen.
@@picolete yea but chances are they will be more efficient and maybe even have better performance with rdna 3.0 because whether or not anyone wants to admit in 6000 series was very close to competing on an outdated base. I mean I have a 6900xt and a 3080ti in seperate rigs and the 6900xt while not outperforming the 3080ti definitely performs more consistently
Remember this happened almost 8 years ago and AMD rocked the scene with Radeon Fury Nano. Hopefully Intel or AMD do that again because at this point Nvidia basically went off the rails requiring a full sized PC for a card that is basically a rebranded 60 or 70 series selling at Titan prices.
Год назад+3
AMD still rocks my scene… never could afford a high-end GPU… and I'll never want one anyway.
I, as a level 20 warlock, am happy my gpu can finally supply me all of the dark power I need for my midnight rituals! If only I didn't have to buy three separate gpus to get the dark obelisk and crystals I need as well. . .
90% of these 'marketing names' sound like they're made by marketing companies who THINK they know what a gamer is, but actually have no fxckin idea. *"Midnight Kaleidoscope, a new level of brilliance and absolute dark power"*
To be fair, we can only guess how many gamers are sane and vocal in the "gamer space", and how many are (mostly young and) dumb and only vocal when they have things to say about someone's mother. Marketing people are likely to have more of those stats (all of the harvested data must land somewhere, right?), so maybe they are catering to the bigger crowd, and we are the delusional ones...
I suspect a significant portion of the gamers are actually accurately reflected in these names and designs. People you can sell lootboxes to, you can apparently sell anything at any price.
I now want to see a Frigidaire GPU, it will be 6 slots long, and plug into 3 pci express ports just for support. It will be a mix of white, grey, and beige color themes and look like an over size box with some fans on it. It's website landing page will only have the specs and a user manual on it. It'll also have a remote that you can use to heat up your house in the winter, as such the card will also have a build in psu and require a separate power cable to be plugged into the "card's" back I.O. As for branding it'll only have a part number and be prices just a bit more then the rest of the gpu's out there.
lol I thought you said frigate at first. Reminded me of mandalore's BFGA2 video describing a 40k ship as "a flying gun-brick the size of a freeway" and I thought, well yeah might as well just introduce the frigate class of nVidia GPUs. Then they can also introduce the corvette, cruiser, battleship. At least it'd be consistent!
It's nice to know that Nvidia are pushing the electricity consumption envelope so hard with the 40 series that we may at least get some important progress in cooling for the nuclear fusion industry. May be more affordable too.
@@soragranda Bruh, Norway is self suficient in electricity and had the lowest electricity prices until the war happened. Now we are extremely high because we sell our electricity cheaper out and sell it expensive in-country. Germany should never have been reliant on Russian gas.
@@friedrichquecksilber770 nVidia's overall stupidity and incapability isn't an accurate reflection of what's going on with that though, they're just the more inefficient GPU makers. R9 Fury and Fermi for example got there before. It's not the first time we needed 3x8pins to power a GPU last gen. AMD's cards last gen still looked largely sane for example, while nVidia clearly was struggling and started building bricks with Ampere. Honestly anyone paying any attention saw this coming all the way back with the RTX 2000 series which direction the company was going, in pushing dumb gimmicks and superficial bs with massive power draw and impaired frametimes/powerspikes/drivers (yes, the RTX 2080ti literally had 5700XT tier driver problems on a $1200 GPU) including emphasizing cooler designs (that was ripped off from Sapphire's AMD Fury/Nitro design) instead of talking about pure performance. It's like nVidia basiclaly decided they couldn't compete anymore and so decided to make everything about running HairWorks better. What I'm curious about now is whether nVidia's software still sucks.
@@pandemicneetbux2110 They make graphics cards a trendy object to justify ever more aberrant prices but as there will always be pigeons Nvidia is rubbing its hands ....
@@RogueStar777 I didn't think it would actually work though because at least in the states, you had kids and all these young poor people ages 18-30 who'd suddenly get a $1200 check mailed to them, so of course you can sell a 3070 for that much to somebody who's really bad with money and can't prioritize for shit. But then after they stopped being money printing machines you'd think anyone would've gotten a clue and stopped buying these shitty ass GTX 3050's and alleged RTX 3060's for over $300. It just sucks because it impacts AMD, which is also a corpo so they jack prices too. At this point I'm realizing maybe I shouldn't even bother upgrading this year, because if AMD has prices anywhere close to nVidia I'm checked out. My 5700XT still works fine, and while I'd love to have a widescreen or better 4k panel, I don't actually need it. Seeing how I'd be buying a new monitor, it's a big chunk of cash for upgrading already. I just assumed people would realize paying that kind of money for graphics cards is utterly insane unless you're literally making money off it, and people would stop paying for it.
Because they're dumping efficiency for power, to be the best raw performance at any cost, particularly the cost of your increased power bill from both powering the monster, and cooling the heat it outputs into your home. At least in winter, you won't need to run your heater nearly as much.
@@j.trades9691 That's actually not very far off, most homes in the US have 15A outlets@120V which is roughly 1800w before your breaker trips and generally the entire room (if not more) will be on the same circuit. With GPUs in the 600w range and transient spikes in the 1000's, CPUs pulling around 200w plus the rest of the system, you'll be pretty close to popping a breaker with full system power and a transient spike. This is of course assuming the power supply doesn't shut itself off and there's nothing else connected to the circuit in the room. Most power chords are rated for 15 amps as well, people are going to have to start educating themselves on home wiring and safety in the future
@@siliconalleyelectronics187 I live in an old house where, before I moved it to another room, my computer was on a circuit with two bedrooms, an electric kettle, and the microwave. That worked as well as you'd expect.
Acer made blower-style 3060 Ti and 3070s last gen for their pre-builts, you can find them second-hand. Really good for undervolting and not affecting CPU thermals. Since I use a dual-system case I really need blower cards, at LEAST the main system in the bottom needs a blower card. Asus also had a short-lived blower-type card but it was too deep, rising way too far above the bracket. Also I just don't buy Asus.
It's ironic how detached the manufacturers are from the customer base. This generation really deserves a hard flop, and we can only hope the next one is better
Tech in general is becoming too damn money hungry as these companies really focusing on net profit & revenues instead of catering to the wants/needs of the customers who keep their companies running. You still have 30 series cards sitting at dumb prices because you refuse to take more losses for another quarter but wanna release & shelf 40 series cards at similar prices to 2 yr cards you can't get rid of while installing this belief in people that DLSS 3 should make them buy it, especially since "DLSS 3 is only for 40 series" cause if it was compatible for 3090Ti them performance charts you showed would be pointless, your 4000 cards would be sitting till 2024 and your balls would shrivel up in that tight black pleather jacket you think you look like Tom Cruise in. I see why EVGA left these fools, at least they have some sense of humanity.
If their marketing materials seem strange and foreign it's probably because their marketing teams are from China or Taiwan. I'm pretty sure EVGA was the only US based aib.
honestly at this point I don't see why we can't eliminate the CPU completely. but a couple of sata ports and USB ports on these cards, bam. they already are basically self-contained minicomputers on their own.
Or they could make a dummy pcie slot on the bottom for ATX boards as another mounting point at this rate. Hell why not put legs on the GPU like a coffee table?
@@laggmonstret What's crazy is a lot of ITX cases use a PCIe riser cable to vertically mount the GPU. So somehow we got to the point that ITX builds are more secure than giant steel ATX cases 😂
Not many people know this, but the Galax serious card was initially called Serious Bill, to hint on the electricity bills people would be paying each month, but nvidia obviously banned that naming...
Pretty disappointed in the latest gen of PC hardware so far, feels like we're going backwards. Everything is getting bigger, hotter and more power hungry.
I love BFGPU concept. Finally we are getting real performance gains. You are stupid if you think a halo flagship discrete graphics card for a desktop computer that plugs into a wall, should be small and "energy efficient". If you don't want a flagship don't buy one.
I tried warning all you guys a year ago what kind of a total shitshow nVidia had become but you didn't listen. You could've prevented this. I even called, hell not just this, but Ampere when I warned you all that Ampere was looking pretty sketchy right before release, and it turned out to be one of the most meme-worthy nVidia generations since Fermi. They even started using the Sapphire Nitro Fury-X passthrough cooler design, which was an actual meme on AMD's old R9 300/Fury series cards for how hot and unstable they were, and suddenly nobody bats an eye when nVidia has to adopt it and make it mandatory. Then when I saw the TDPs I just knew it was over, because Jensen will do everything in his power to make sure they have the ostensibly "best" performing halo card regardless how much the rest of the lineup is total shite. This signalled to me they already knew that AMD was gonna beat them with RDNA3 so all what this is is some truly desperate moves on nVidia's part to maintain their image, because optics is all that really matters to nVidia, not the performance. Meanwhile they even managed to drive EVGA away, after having successfully ensured that all Macs (shitty as they are) and all consoles now have AMD hardware on it, thus ensuring much better stability, support, and optimization for productivity software and games running on AMD. I mean what do they even have left, the Switch and Tesla or something? EVGA clearly bailed at the right time. Sadly, the average nVidia buyer is now looking like a beaten housewife trying to rationalize how the corpo really loves them deep down inside, and shelling out 1080ti levels of cash for a shitty 4070 that's going to draw more power and perform even worse than a 7700XT and still require you replacing your 850/750w PSU most likely, so now you get to add that into the cost of upgrading your monitor and GPU. It absolutely astounds me that even after EVGA themselves finally had enough with nVidia, that there's still honest to God thousands of people out there who insist on buying nVidia cards for no good reason at all but branding and memes.
@@SlavaBagmut Yeah which is why I won't bother with it. $1600 for a 4080 on newegg, what the fuck do they think this is 2020? Yeah nah cunt, that's a HARD pass for me. What's hilarious it's not even like it's any good, they've got bursting into flames 12tard plugs, rather than 8pins. It's like, this is not a small thing to me when you're running 450w+ That does not make it more better. That makes it WORSE which means you should charge LESS for it. When AMD was pulling that shit they had to charge less money because who in tf wants to pay more money for a card that has to draw 100w more power for being equally performing. The psychotic thing is, there's allegedly still all these abject retards buying it, and I'm sorry but at a certain fundamental level I'm going to respect you less for what you're spending $1600 on, if it's crystal meth or a 4080 I don't actually respect what you have to say on other things because of that decision proving you're not cognitively reliable.
Maybe EVGA got out because it saw that the investment commitment to buying an marketing the 4090 cards was going to be a losing proposition and instead of having an unprofitable year or two, it just walked away.
He's right, Nvidia made too many 30 series cards but is refusing to lower prices for AIB partners while they themselves are lowering prices of FE cards, trying to outsource the loss of excess inventory to AIB partners. Now they decide they'll just increase the price of 40 series so people will be forced to buy up 30 series stock first.
If they were losing money with the higher end 30XX gpus, they would've have been absolutely been hemorrhaging money with these, especially considering nobody is going to buy them when there's the surplus of 30XX gpus
I've never actually faced palmed before, but I have this playing in "desktop mode" minimised while I'm at the gym, and I actually facepalmed when I heard that, the delivery was perfect. I didn't even have to see Steve's face.
@@superbusstarodub ikr nobody saw that one coming. Im quite the Nostradamus. Just like when I predicted I would eat spaghettis for diner yesterday... Im never wrong.
@@joelmorningstar3645 Yup. Just when I thought the pricing and manipulating the 30 series market was enough of a shitshow, now we get this. I mean... is Nvidia that determined to drop the gaming market, that they'll do anything and everything wrong in their power? xD
I mean....Thats why my 3080 is just chilling in a Cupboard right now. :D (jk...My new MB or CPU is broken. Dunno what exactly seems to be the Problem...)
I think these cards will follow in step with recent sales trends on GPUs, that is to say nobody wants these. They're too expensive and average consumers budgets have no room for these overpriced, oversized, overspeced cards right now.
I think you're delusional. People are very impulsive and dumb. Similar to idiots paying $100K+ for a new truck that's worth $25K, people will line up to buy these and they will sell out...as usual.
Agreed, I had a budget set aside for these cards. My max was $1200 for a 4090.. and to see these base prices set to release at $1699 its absolutely absurd and I won't be buying one.. I personally know at least 3 other people that have decided not to buy the 4000 series at all because of Nvidias slefish decision, and it's sad to say but I hope Nvidia gets the shit end of the stick on this release.
Historically few people have bought top tier cards, so I think you're right. What will be interesting is what Nvidia chooses to do with the 4070/4060, and how AMDs chiplettes compare
I like how you started with ASUS as they seem to be the only one following their naming model seriously. It's like you didn't want people to confuse them with the other nonsense they are surrounded by.
There aren't that many games that need these cards. The only game i can think of, Cyberpunk 2077, plays well enough on 3080/3090. There is no need to go to a 4090 unless you are a content creator. The 4080s aimed at the gamer won't be that much better than last gen 3090s, with their lower Cuda cores and their slower memory bandwidth. Personally, I would jump on the 3090 GPUs being released from mining, before that market picks up again.
@@chrisliddiard725 it's gonna be a hot minute before it does. All the markets are getting ready to self emulate like monks. Housing, wall street, energy, crypto, major currencies ... all about to take a big old Amber Herd right in the bed of the global economy.
Agreed ! Seriously though, what happened to these manufacturers ?! It's like they are advertising toys for small children. The marketing jargon is ridiculous. I think that EVGA left the GPU scene at the right time. This all feels like a bad comedy.
Can't wait for the Super RTX 4080 FE XL FTW 2SLGBTQIA+ ULTRA Turbo Champion Edition II ALPHA Gold RGB Platinum HyperX Bionic Stealth Shark Carbon OMEGA Ti Remastered Limited Re-Edition!
In fairness, they are. Have you ever tried interacting with "people" on Steam forums? You can easily forget just how F'ing stupid average gamers are until you actually go and interact with the ones not playing your niche city builders or whatever, and realizing these are literally Qanon follower tiers. Plus it's nVidia anyway, so they tend to get a much bigger portion of the bottom end of the bell curve regardless.
It's gonna be wild to see the sales numbers on these, because I don't know anyone who is even considering a 40 series card. The power draw alone is turning most people off. They picked a hell of a year to release a high power draw card considering the energy cost issues going around.
was thinking that as well. I'm already considering selling my i5 11400/RX6600 XT PC in favor of a Mac with Apple Silicon due to the power draw. And don't get me wrong, my PC draws 500W tops, imagine your GPU alone is using that much power...
You are paying almost the same for energy for a house with ac and fridge working all the time only to fed that card, feels like the gaming industry is going backwards strong, I’ll stick with my apple m1 and my current 3080 till gen 5 offers something more power efficient
I considered buying the 40 series because I'm on a 2070 right now, and boy oh boy, I'm good on the electricity bill(because its going to be pretty low compared to everyone else) but not the whole card price nor how big it is, plus I'll need a new PSU.
These names are basically just what happened when the monitor guys got together and made a big bingo card corresponding to each letter in the alphabet and numbers between 0-9. They then, simply constructed these brilliant pieces of art.
Dont think it going to matter to much by the comments ive seen across multiple videos on the 40 series AIB makers are going to be struggling to sell Nvidia cards anyway .. EVGA should jump ship to AMD this gen and make HUGE $$$$
You know some guy is gonna grab the "Dark Power" of the Midnight Kaleidoscope and then combine it with that "Dark Obelisk", and cringe all over some gaming platform.
@@charlesballiet7074 up until now we have no independent performance reviews. However even if they did not achieve any architectural advantages (unlikely) they would still have massively increased the cuda core count.
His seatbelt didn't kill him. It was the lack of the HANS device which is now a nascar staple due to Earnhardts death. It sucks we had to lose best to save the worst
@@petermuller608 the closest I've seen is a preview video which Digital Foundry (they already have a founders 4090 in hand) dropped 2 days ago showing some percentages vs 3090ti rather than raw framerates. I think a lot of the gains referenced DLSS 3.0 but the increases were truly massive. 4090 at 250% FPS vs the 3090ti in CP2077 maxed everything 4K. They claim 8K 60FPS is now possible
Nvidia still sucks, won't buy anything from them, until they release truly open source drivers. Otherwise you'll be stuck with unsupported binary blob garbage on Linux after a while, like on my Thinkpad W530
When these cards drop and you do your reviews, could you do a breakdown of which cards would be the best option if you're planning to watercool them in a custom loop?
I have a hunch, that nvidia forgot to mention to their partners that they cut back the 4090 tdp from 6-700W to 450. that is why everybody is coming with the 4 slot cards with triple huge fans, meanwhile nv can show a slicker founders edition card. This would also support EVGA-s claims about getting every info in the last moment, or even on the reveal event itself.
450W is not actually the limit for the partner cards. MSI has a 'silent' setting at 450W, but also a higher setting. They cannot tell us yet how high it goes due to their NDA.
This makes me think that EVGA ditched when they were told to relabel the 4070 model. It would have been hilarious if all the board partners went with the original plan and released the 4070 model. Or did a bad job on mspaint to edit the number. That would have been a nice foot in the backside to Nvidia.
@@qwerfa think it was Sapphire that used to (long time ago) make boxes with no model numbers on them whatsoever. They just stuck it on with a sticker because they used the same box for all their cards
yeah, the only somewhat reasonably priced model is the 4090, but thats definitely a gpu for workstations and thus not for children. I can't imagine a graphics designer thinking "oh yeah, I need that 90s gamer kids gpu model" xD the designs are about as atrocious as the 4080 pricing.
They are also kind of named like toys too... say the name of any of those cards with the word "Nerf" in front of them and you think its a nerf launcher or a water pistol.
The 5000 series will come with their own separate case that you connect to your main tower. And with the optional liquid cooler that will be required to keep your GPU under 100 degrees, you can purchase the additional optional tower by NVIDIA for $699
5000 series will only include one USB C port, which you will have to buy a $50 dongle to convert it to Displayport and if you take off the cooler and repaste the card yourself, it bricks the bios until you send it back to Apple...err I mean Nvidia for a firmware update.
EVGA: 3090 *FTW3 ULTRA GAMING* with *PowerLink 41s* for maximum power filtering… EVGA: leaves GN: oh look these AIBs calling out their cards with cringe names *Msi SuprimX with an X*, galax with *4090 Serious Gaming* TF lol? Nothing changed, they trynna hate on the AIBs for literally something that has been the same for years, and now that EVGA leaves…
@@Jusdutari naming isn't the issue for me it's the gimmick they're using to sell us a literal brick from anti gravity core pipes to fking support sticks. Nvidia did all of this by raising 50% more power than their competitor to try and squeeze 10-20% more performance out of their GPUs. I'm skipping on this generation sadly unless they release something that won't break my motherboard and require me to get a 1000W GPU.
Ehhh, not really? EVGA wasn't exactly innocent, they had a stupid number of different models with confusing clock and memory speed adjustments, cooling buzzwords and model names, etc. FTW3, ICX, Black, XC, XC3, Ultra, etc. You had to get through all that crap before you could ever tell why any card was better than any other they sold, other than price.
@GamerNexus I saw this idea in another comment chain, but I am going to repost it as a fresh comment because I think it deserves more love. The 4000 series has left the realm where "card" can accurately describe these pieces of hardware! As classic cyberpunk fans recall, netrunners had to use solid blocks of circuitry to hack into things. These blocks were referred to as "cyberdecks," and hackers colloquially referred to as "deckers." We should all start referring to these new GPUs as "cyberdecks." They are big and heavy and expensive enough to warrant it. Also, driving a VR rig is absolutely the beginning of Cyberpunk's 'Netspace.
Well, if we are lucky, the gravitic effect of the Gigabyte cards is so strong, it can pull the entire marketing team in close enough for the sharks to be able to eat them and spare us any similar BS talk in the future.
Maybe for the 5000 series Nvidia should just make a product that performs the same as the 4000 series but uses half the power and takes up half the space. Maybe brand it like Sony and call it the 4000 series slim.
AMD is going to make those. They've been catching up for a couple generations, and this is the perfect time to demonstrate how absurd the 5000 series is by revealing what it should have been.
People made fun of me on reddit for adding a bunch of heatsinks to the back of my gpu, and for trying to build my own heatsink. I told them it was "future proofing". They're not laughing anymore...
@@CarlosZ34NSM ARC is starting to look a whole lot better next to these gigantic, power hungry Nvidia cards. Honestly, even if the prices were reasonable I wouldn't want to stick one of these things in my computer.
@@spiffnoblade7731 Agreed. Arc is competing with the 3060 tops. It needs one more generation to get a full lineup. Hopefully it wont keep falling further behind.
I am interested in upping my "dark power" game. Would I be best going for a Galax card to get the "Dark Obelisk" or would Palit's "absolute dark power" be more powerful and ...absolute? I feel like the "dark obelisk" would require more rituals and sacrificing and I'm already caught for time as it is!
... and then there's the whole issue with anti-gravity. Will the dark powers grant me the power to defeat gravity or will I have to decide between the two?! Should I buy a range of cards to take advantage of all the magic or would I be meddling with powers I cannot possibly comprehend?
Yeah, might get dangerous ;) . Seriously though, what happened to these manufacturers ?! It's like they are advertising toys for small children. The marketing jargon is ridiculous. I think that EVGA left the GPU scene at the right time. This all feels like a bad comedy.
Imagine being the inventors of dark power and anti gravity plates, only to turn around and use them to build a graphics card, instead of world domination.
Plan for utter Domination: 1. Buy a Palit 4090 2. Order customized Support Bracket with "Absolute Dark Power" written on it 3. Put a DMC V Vergil figurine on top of backplate 4. ????? 5. Own every Game feeling motivated
By the time we get to the 5000 series, we’ll need AIO cards or custom water loops for the cards to keep them cooled. I just upgraded from my 1080ti that was a beast of a card from gigabyte to the 3080 strix card and the size differences is actually comical!
Same story here, had 1080 TI Strix for a while , sold it to a friend a year+ ago. Got my 3080 Strix recently, this friend came to me to clean his computer, was overwhelmed by the size and design. 1080 looks waay cheaper now with the materials and build quality.
Respect to Steve to say those names out loud without bursting into laugh. The timeline on the left is also just pure gold. The most hilarious news to upcoming GPUs ever watched so far
That was the best laugh I have had in a solid week! Honestly, these GPU companies are just rediculous these days. 4 slots is crazy enough, but when they cover their card in sparkly dark matter and fall in love with three fan designs supported by dark obelists, I have to wonder if they even know what kind of products they make?
Honestly these cards are getting ridiculous. If there isn't a development in GPU power efficiency and cooling soon then we'd be seeing 6 slot sized GPUs in the not too distant future.
Only if the emptied brained masses are stupid enough to actually buy into the hype machine and purchase these wastes of silicon (which of course they will)
We probably actually need more software and driver optimization, since the currrent and the last generation of gpu hardware has not actually got pushed to their limits at all.
Pretty soon all graphics “cards” are going to require a dedicated heat pump just to keep them under 90C. Going to be expensive running your PC off of 240 volt plugs.
As a former employee of EVGA, we where told everything last. We didn't get any assistance from Nvidia when GPU/copper/silicon Shortages, we where left for dead.
No one blames EVGA for not wanting to lose money on 60% of the line-up. Nvidia needs a good boycott to bring them back into line. Hopefully enough of us managed to get a 30 series to make it happen.
Well, that sucks! After hearing that, it's impressive how well EVGA did, with their products. How long did you work there? Was it a fairly large operation, or would I be surprised, if I actually went there?
Seems we've reached a tipping point where GPUs become faster by becoming larger not more advanced, smaller, efficient. That means everything is more expensive, larger PCs motherboards, PSUs.
Actually these GPUs are more efficient than the last generation as performance per watt will show .... They are also more advanced ..... The Laws of Physics doesn't include miracles .... More power comes at a cost in this case it's the size of a cooling solution needed .....
Had to laugh so much when hearing about the brilliance, dark powers, and anti-gravity technology. Hikes, Albert Einstein was nothing compared to these geniuses.
You are not alone. I've been Nvidia customer over a decade, but this launch crossed the line for good. Greedy, AIB bullying, budget gaming destroyer, Nvidia is really awfull company lets face it. They make excellent GPU's, but we need AMD to change things like they did with CPU market.
Holding my EVGA 1070 SC until I need a new build. I’m interested in either AMD or Intel, but NVIDIA’s attitude towards consumers, partners and media isn’t something I can support. I’ve been feeling this one coming for some time.
@@progenitor_amborella I lift my hat for you sir. I've been thinking about changing to AMD too, i know this is kinda poor idealism because nothing changes, but i can atleast keep my principals and not buy from Nvidia anymore. AMD has made it easy too. Because gaming with their GPU's is pretty much same, they even improved FSR wich should be now very usable with 4k.
i know it would be mind-numbingly painful but i would to love see the AIBs claims on fan improvements tested. Every year they claim ___% more airflow than past gen. i feel like these guys have created 3500% more airflow in the last 10 years somehow
When the EPA slammed the lid on automotive emissions (70's), this caused a massive loss of MPG with marginal air quality. There was an avalanche of articles giving on how to improve MPG, who knew we all were such lousy drivers! It took 10 years for manufacturers to retool. Now we're harvesting 90% of the fuel energy and can't afford the complicated devices. When you added up all the claims of those early 'experts', you could leave with a 1/4 tank and have an overflowing tank at the end of your trip. Such is the power of the press.
Any MiniITX build basically will need to be water-cooled now to reduce the size. Maybe that's why all those companies are now selling a GPU+Pump combo so you can just attach it to a loop and done.
@Mike Where do you put the humongous radiator required to dump the heat with? Or are we all going to be attaching RC aircraft-level ducted fans to 140mm rads and watching the machine take off and fly around the room with all the subtlety, grace and decibel level of a rabid goose on crack, while blasting air out the end that's hot enough to strip paint?
Get the news on the NVIDIA RTX 40 launch here: ruclips.net/video/3tZ01ymHZEs/видео.html
Learn about EVGA leaving the video card industry here: ruclips.net/video/cV9QES-FUAM/видео.html
Grab the BRAND NEW GN 'Amp' Medium Anti-Static Modmat for PC building, in stock and shipping now: store.gamersnexus.net/products/medium-modmat-v2
Where is the gigabyte page link
Calling for GamersNexus to petition to FTC for mandatory full spec list for GPU be in their boxes. This is to help consumers/buyers not to be misled by Nvidia and their Partners regarding the performance of GPUs, like they doing with 4080 16gb and 4080 12gb variants with massive difference in Cuda cores. Hope GN help protect its viewers and the consumers.
Tech Jesus is a comedian 🤣😭
Btw it says that your genius link isn't working
How's it compare to watch?v=0frNP0qzxQc 🤣
DLSS is amazing; it can even upscale prices.
😂
More latency and more money
wait until we find out what the hell these AIB prices gonna be.
DLSS essentially downgrades the resolution while making it look like a higher quality resolution in order to increase frame rates. Much like the 40 series cards... give them a shiny new look, make it seem faster and sell it at a higher price...
@@dr1flush
Yep, money is queued; waiting for Ngreedia's financial service to processed it and recycle it.
Nvidia is actually very considerate of their buyers by providing them with video card, space heater, dumbbell and building brick all in one.
At this price, I would expect all 4 products.
Do you have any better idea of how to cool it?
Don't forget the gravity field generator
@@techpriest4787 watercooling... it isn't expensive at all, it's just marketed as such because it hasn't been the default product ever and thus is sold as a premium. But realistically it's pretty much just as cheap to produce and more efficient both in terms of thermals and size.
@@techpriest4787 I can't tell if this is a legitimate question or just snark
Edit: it's just an Nvidia Simp
I get the feeling that the reason many manufacturers have omitted the RTX 4080 12GB pictures is because they originally rendered them as "RTX 4070" and Nvidia blindsided them.
yea there's an irish youtuber who does tech stuff and he went over this in detail. The 12gb 4080 is without a doubt the 4070 rebranded. IDK why they did it other than to price gouge people who don't know better.
@@DATWagonator 10:25
It make sense
I get the feeling youre speculating for no reason since the 4070 was already announced
@@Mavis847 I get the feeling that you're commenting for no reason. 🤡
I think I know what really happened to EVGA. They heard the rumors of marketing on these cards from their competition and they were like "I'm too old for maximum dark power obelisk bullshit" and just left the industry. Seriously tho imagine how hard the market is on these manufacturers, you have to design something that will be functional in coolling down what is essentially a furnace on a chip and make it look "special", market it right and sell it with a laughable margin because of the insane competition all the while jensen goes around asking you how you dare wanting partner information on a product you're going to sell and provide support for, when you don't do anything at all and rake in a lot of money for doing nothing. This whole market is insane and it's unlike anything else really.
I think it’s pretty obvious evga was pissed nvidia was letting random Chinese companies cut their legs from under them given all the new cooler manufacturers appearing from thin air. It’s all made in China anyways so we will see if there’s a real difference
You can really feel Steve's will to live fading over the course of this video
Is the use of the word "Jif" an attempt to get the internet to kill him? Suicide by nerd? Or by peanut butter?
@@ruhnon331 Considering that the creator of the format pronounces it "jif", I'm pretty sure True Nerds would be on his side.
@@ruhnon331 he knows he's saying it correctly :D
I'm impressed with how professional he managed to remain throughout the whole thing. Multiple companies needed to be called out to their face and told to eff off.
@@benjaminoechsli1941 No True Nerdsman, if he wanted it pronounced that way he should have written it that way (blame USA copyright maximalism since Jif (food brand) could sue Jif (file format) even though the risk of confusion is non-existent) though I am sad that this decades long dispute is irrelevant now that webms are better quality for lower file size and hardware accelerated on practically every mobile device.
EVGA pulled out just in time
We all would regret less if we all "pulled out sooner".
But it was just the tip…..
If the best cardmaker pulls out, you know its a dumpster fire. On paper that should be the one AIB partner Nvidia would bend over backwards to keep. I hope this gen Nvidia gets the 3dfx treatment and take a much needed hit to their ego and come back down to reality.
Giggity
Satisfying the Al Gore Rhythm.
Props to Steve for not completely losing his sanity with these things
not so sure about that
I dunno man, he's saying it's JIF...
18:17 BRO he loses it LOL
It’s the dark power getting to him
I lost it when he got to the "dark obelisk"
LOL, marketing for these brands feels like they're targeting 13 year olds on Xbox Live, circa 2005.
It's absolutely hilarious.
13 year old's with 2 Grand too somehow.
ok
In fairness, they're nVidia fanboys so
And yes, fanboys. Because I was calling it a year ago when I said Lovelace was probably going to be a shitshow, and it's so much worse than I expected. Literally the only people left willing to put up with this shit are either amateur professionals who have to use some obscure nVidia feature for rendering, and the fanboys who'll buy a brick of elephant shit as long as it has the logo. Everyone else has been drifting toward AMD for awhile now. Last time I even saw a truly great nVidia generation it was still called GTX. Every single RTX lineup has been bad and gotten progressively worse, from DLSS 1.0 and "price uplift" to the absolutely insane instability and power issues, nVidia's bad drivers etc. on Ampere with nutso price gouging only overshadowed by their MSRP looking "reasonable" from scalping. I could already tell just based by TDPs that AMD had a threatening lineup coming up. What I didn't expect was for EVGA to be driven away from nVidia's abusive relationship and the AIBs to go nuts also but well, at least it's entertaining.
@@pandemicneetbux2110 They're just companies, neither of them care about you
@@lukemiller511 That's right, that's part of my point. And out of that, some are even worse than others, and it speaks volumes that even after so many soulsucking corpos think nVidia is beyond the pale to work with for soulless profit, there's still a bunch of nVsimps out there who'll keep defending this company.
These cards need to start being sold in the home appliance sections of big box stores. The power and heat management are gaining ground on small cooking appliances.
Space heaters are obsolete.
@@MistyKathrine especially after these 40 series. Some random husband will justify why he put the pc at the living room during winter.
That heat has to go somewhere...Your bedroom just became an Air Fryer.
i already use these as heaters
We need these cards for heating, if there is a gas shortage.
"The sheer size of it's heat sink is affecting gravity itself. As with everything in marketing: Technically true is true enough." Nailed it. 🤣
technically anything with mass affects gravity in its local area.
Would you like a Extra Big-Ass GPU? Now with More Molecules!
Perhaps the apple, that hit Issac Newton, fell because someone dropped an RTX 4090.
Well the card is named GeForce after all...
Technically correct is the best form of correct.
This escalated quickly on so many levels. 4 slots and beyond, Bionic Shark Fans, Dark Obelisk support sticks and 4x8-pin Power adapters. God help us.
Who the living hell even _has_ four eight pins????
The funny thing is, ten years down the road, all this BS will be in an APU
@@micmacha A Motherboard, but to save time they just call it 24 Pin there 😅
@@GamePat96 24 is 3x8
ABSOLUTE DARK POWER
6:35 That dev literally went back ~10 years and found an ancient Stackoverflow technique to rape mouse wheel events with jQuery and control video playback via parallax scrolling jank.
Incredible. Counting my blessings as a front-end developer. We are a special people.
Inno3D isn't just "brutal", they're "201% committed". That's impressive.
Now I'm just wondering why not 347%?
I was kinda meh on the 200% but they really won me over at 201%
now let's see paul allen's commitment
Inno is the the Aldi of the graphics card world
And of course I'm treated with an ad for that Queen's Blade game while watching this video on my phone.
Other AIB’s: “Where are you going?”
EVGA: “I see an iceberg, I’m leaving.”
You know, back in the pascal days, i thought that "zotac arctic storm" and "palit super jetstream" were absurd names for graphics cards. Now we have serious "night baron" and "midnight kaleidoscope" cards with bionic shark fans supported by dark obelisks and anti-gravity that give me 905,637 square millimeters of absolute dark power and i dont know if i want to laugh or faint... great vid though!
Your reviews are truly a jift to the gaming world.
Even goes on to affect wildlife like jiraffes
ok
jamen
This is my reaction when people call it a "jiff". Say the word "gift" then drop the "T".
Jreat comment
NVidia: "We need you to rebrand your RTX 4070s as RTX 4080 12GB"
_EVGA has left the chat._
NVIDIA next gen: "all cards are RTX 5090s now"
@@marcogenovesi8570 RTX 5090 4GB
@@Ernismeister 💀🤣🤣
@@Ernismeister $1299
@@Ernismeister don’t forget the
GAMING GAMING DARK OBELISK VENTING COPIUM SUPER GAMER MAX STAY SILENT SPONSORED GAMING GAMING AAAHH STOP IT NVIDIGAMING GAMING GAMING
This was hands down the funniest GN video I've watched. LOL Steve's reactions to these absolutely ludicrous naming schemes is priceless. "Bringing users a new level of brilliance and absolute dark power" made me literally laugh out loud. Keep up the great work, guys.
They spelt it wrong though. It should have read :
*New level of brilliance, and UNLIMITED POWAAAAH.*
@@DailyCorvid yeah we gonna need that unlimited powah if those power requirements are real
Reminds me of the review of the "Jundam" edition GPU
We are actually living in the most Loonie Toons timeline possible.
“Bionic shark fans” 😂
This was truly a masterpiece, one for the Internet records. The only thing missing is the upcoming Yeston "Waifu" GPU line-up:
RTX 4070 "Juggernaut's Juggies of Joy" using a Dual Antimatter Interdimensional support stick.
RTX 4080 "The Thicc Thighs of Terror" featuring a truly inspiring Obelisk of Manhood support stick.
RTX 4090 " Kahuna's Kryptonite" using the Stick of Enchanted Anti-Gravity to support the massive weight and contain the dark power of the Kahunas from transforming into a supermassive black hole.
Hey, those "juggies of joy" could be used as extra surface area for heat transfer!
lmfao
Hey man, you went too far. Don’t you dare talk bad about anything Waifu!!!!!!!
and somehow it'd be the most sane out of all of them.
@@MustacheMerlin Humiliating? It'd be hilarious.
Who wouldn't want to watch people die inside when they ask what video card you have, and you get to say 'oh yeah the JUGGERNAUGHT JUGGIES OF JOY' with the 'MOMMY MOO WATER COOLING BLOCK'
Yugi: "Dark Magician Suprim! Use *Absolute Dark Power* Attack!"
Kaiba: "I activate my trap card! *Dark Obelisk* holds your attack!"
Yugi: "Big Mistake Kaiba! My Night Baron uses his special effect *Midnight Kaleidoscope* to absorb your trap card for me to use later!"
Kaiba: "NO!!"
Palit's "Absolute dark power" had me in tears for minutes
All of them has gone crazy.....
Someone gotta write an SCP story out of this, where the GPU corrupts the users and slowly turns their room into a portal to hell 😂
The corruption progresses fastest when the card runs at full power. Overclocking results are redacted.
I had to pause the video LMAO
Dont forget Gigabyte's Bionic Shark Fans. lol
With advertising like that, you can't tell me that CERN didn't open a portal
Loved the writing
Hi
👍
Wow
Great video
Great
EVGA looking smarter and smarter every day since the 40 series announcement.
facts
"They weren't crazy, just ahead of the curve" when they got out.
Hey ya'll gonna regret your cynicism after the last AIB leaves and you're stuck with identical-looking cards directly from Nvidia... you'll have to buy GPU sleeves like you do for iPhones 😄
@@XuryFromCanada ha. ha.
I don't understand. Where do they go from here? Is EVGA now going to make Radeons?
Maybe we should start considering having the GPU be the main board and CPU/etc as addon cards.
🤣🤣🤣🤣🤣🤣
Especially in the case of mini-itx systems! 🤣🤣🤣🤣🤣🤣🤣
I some ways, GPUs are becoming more important than CPUs, nowadays. I have a good RTX 3000 and an old 3770k, and still it is able to run most recent AAA games @ 80 to 95 fps ^^.
@@fridaycaliforniaa236 dude that bottleneck is insane
We're officially at the point where the ATX standard now needs to apply to graphics cards and we should screw the graphics card into the chassis and then have the motherboard plug into the graphics cards PCIe slot.
I remember old cases with guide rails on the front to support long ISA cards. It's like fashion: everyone who kept his old trumpet trousers from the seventies was in again a few years ago...
haha nice
The more I hear of the 4000 series the more I'm excited for the AMD cards
AMD is really going to takeover from next year. I fucking love the company because they don’t rush things just so they can release before the competition like Nvidia and Intel. They also innovate in terms of efficiency and new technology like 3d v Cache. Nvidia is just increasing the power limits, clock speed and heat and there you go a new gen.
AMD cards will be expensive too
Sssssh, don't tell anyone. I need the stock of those cards for me.
@@picolete yea but chances are they will be more efficient and maybe even have better performance with rdna 3.0 because whether or not anyone wants to admit in 6000 series was very close to competing on an outdated base. I mean I have a 6900xt and a 3080ti in seperate rigs and the 6900xt while not outperforming the 3080ti definitely performs more consistently
@@picolete AMD will, you nailed it.
Someone has to tell ASUS that in physical products, you can't round down. If a card is 3.1 or 3.2 slot tick, it blocks 4 slots, period.
NO, just find all those 0.8 slot peripherals that don't exist😂
I already pre-order the 0.8 slot ASUS Wi-Fi 6E card. You guys just aren't looking in the right places.
@@manny892007 who needs wifi maaaan
It gives some breathing room for gpu if you have anything on slot 4 or 5 or have a short case .
@@manny892007 uhhm, that's not how it works.
Remember this happened almost 8 years ago and AMD rocked the scene with Radeon Fury Nano.
Hopefully Intel or AMD do that again because at this point Nvidia basically went off the rails requiring a full sized PC for a card that is basically a rebranded 60 or 70 series selling at Titan prices.
AMD still rocks my scene… never could afford a high-end GPU… and I'll never want one anyway.
Can't wait for an RTX 7090 that will finally use all 7 of my PCIE slots
And 7 power supplies.
Nah you´ll need a seperate PC case only for the GPU
It will come with its own power supply too
It'll have its own PC case
@@Zamaric but youve got to dice for it from lootboxe's
I, as a level 20 warlock, am happy my gpu can finally supply me all of the dark power I need for my midnight rituals! If only I didn't have to buy three separate gpus to get the dark obelisk and crystals I need as well. . .
mfw my graphics card summons Cthulhu
Idk about dark power but it's for sure a black hole for your wallet
😆
I laughed out loud when that section came around. Real life can be weirder than fiction
its called dark power because itll suck up so much electricity the power goes out.
90% of these 'marketing names' sound like they're made by marketing companies who THINK they know what a gamer is, but actually have no fxckin idea. *"Midnight Kaleidoscope, a new level of brilliance and absolute dark power"*
To be fair, we can only guess how many gamers are sane and vocal in the "gamer space", and how many are (mostly young and) dumb and only vocal when they have things to say about someone's mother.
Marketing people are likely to have more of those stats (all of the harvested data must land somewhere, right?), so maybe they are catering to the bigger crowd, and we are the delusional ones...
speak for yourself, my computer will crush yours with its absolute dark power once I get my hand on one of these
@@Lishtenbird yeah but even those kids will know that SRS is cringe asf
Their marketing team probably watches too much Twitch lmao. I mean that kinda sums up their audience.
I suspect a significant portion of the gamers are actually accurately reflected in these names and designs. People you can sell lootboxes to, you can apparently sell anything at any price.
I now want to see a Frigidaire GPU, it will be 6 slots long, and plug into 3 pci express ports just for support. It will be a mix of white, grey, and beige color themes and look like an over size box with some fans on it. It's website landing page will only have the specs and a user manual on it. It'll also have a remote that you can use to heat up your house in the winter, as such the card will also have a build in psu and require a separate power cable to be plugged into the "card's" back I.O. As for branding it'll only have a part number and be prices just a bit more then the rest of the gpu's out there.
lol I thought you said frigate at first. Reminded me of mandalore's BFGA2 video describing a 40k ship as "a flying gun-brick the size of a freeway" and I thought, well yeah might as well just introduce the frigate class of nVidia GPUs. Then they can also introduce the corvette, cruiser, battleship. At least it'd be consistent!
All jokes aside... the idea of using a PCIe slot as additional support doesn't sound all that stupid 🤔
It's nice to know that Nvidia are pushing the electricity consumption envelope so hard with the 40 series that we may at least get some important progress in cooling for the nuclear fusion industry. May be more affordable too.
At the same time when most of the western world is experiencing rising power costs
@@dustojnikhummer "most" Just urope (next time don't buy you energy fuel from the always war thinking neighbor XD).
@@soragranda Bruh, Norway is self suficient in electricity and had the lowest electricity prices until the war happened. Now we are extremely high because we sell our electricity cheaper out and sell it expensive in-country. Germany should never have been reliant on Russian gas.
With those gas prices, those might come in handy heating my room.
@@soragranda Oh thank you, like we haven't been telling that Germany for the last 10 fucking years.
i can only imagine what sort of ungodly power could be achieved by combining the DARK OBELISK with the MIDNIGHT KALEIDOSCOPE
😂😂😂
Imagine all the dark power you can get with this combination
Perhaps the darkness that descends would hide the price.
Apparantly the Neptune can "restore the once glorious Atlantis civilization to the fullest", so maybe this is their secret!
This...this is why they did away with SLI.
At this stage they should just build the motherboard into the card itself.
Or change how it fits into the slot.
Don't give them ideas...
@@MistyKathrine The Solution would be so simple: Just use a second slot for stabilisation...
That will probably be the future lmao. For uber lazy builds.
lol for what they are charging for these things I can live with that lol, slap an am4/am5/intel socket on the back and lets just do this lol
it's kind of weird how GPU's are getting bigger, We live in a time where everything gets smaller and more powerful.
Well the GPUs are getting smaller and thats why they get hot and need these crazy big coolers
@@friedrichquecksilber770 nVidia's overall stupidity and incapability isn't an accurate reflection of what's going on with that though, they're just the more inefficient GPU makers. R9 Fury and Fermi for example got there before. It's not the first time we needed 3x8pins to power a GPU last gen. AMD's cards last gen still looked largely sane for example, while nVidia clearly was struggling and started building bricks with Ampere. Honestly anyone paying any attention saw this coming all the way back with the RTX 2000 series which direction the company was going, in pushing dumb gimmicks and superficial bs with massive power draw and impaired frametimes/powerspikes/drivers (yes, the RTX 2080ti literally had 5700XT tier driver problems on a $1200 GPU) including emphasizing cooler designs (that was ripped off from Sapphire's AMD Fury/Nitro design) instead of talking about pure performance. It's like nVidia basiclaly decided they couldn't compete anymore and so decided to make everything about running HairWorks better.
What I'm curious about now is whether nVidia's software still sucks.
@@pandemicneetbux2110 They make graphics cards a trendy object to justify ever more aberrant prices but as there will always be pigeons Nvidia is rubbing its hands ....
@@RogueStar777 I didn't think it would actually work though because at least in the states, you had kids and all these young poor people ages 18-30 who'd suddenly get a $1200 check mailed to them, so of course you can sell a 3070 for that much to somebody who's really bad with money and can't prioritize for shit. But then after they stopped being money printing machines you'd think anyone would've gotten a clue and stopped buying these shitty ass GTX 3050's and alleged RTX 3060's for over $300. It just sucks because it impacts AMD, which is also a corpo so they jack prices too.
At this point I'm realizing maybe I shouldn't even bother upgrading this year, because if AMD has prices anywhere close to nVidia I'm checked out. My 5700XT still works fine, and while I'd love to have a widescreen or better 4k panel, I don't actually need it. Seeing how I'd be buying a new monitor, it's a big chunk of cash for upgrading already. I just assumed people would realize paying that kind of money for graphics cards is utterly insane unless you're literally making money off it, and people would stop paying for it.
Because they're dumping efficiency for power, to be the best raw performance at any cost, particularly the cost of your increased power bill from both powering the monster, and cooling the heat it outputs into your home. At least in winter, you won't need to run your heater nearly as much.
I'm impressed at how little i care about the 4000 series. Well done nvidia.
💀☠
I care about dlss 3.0, but since that's exclusive (fuck off nvidia), I really don't care anymore
Wait for actual performance before judging 🥸 Also, if those video cards are too expensive, wait for lower end models
@@deivytrajan Power Consumption?
@@mowe758 literally nothing outside of cg/cad work or ai is going to use any of this graphical power. This is such a shit show
EVGA made the right call: Their PSU production is going to skyrocket for these. You'll need a separate PSU just to power these behemoths.
Why separate psu? We are far from there for consumer gear.
You'll need a dedicated circuit for your house to power a PC, at this rate.
@@j.trades9691 Actually close to true. I’ve found that if I have two PCs (3090s) going full bore transcoding video my circuit breaker pops.
@@j.trades9691 That's actually not very far off, most homes in the US have 15A outlets@120V which is roughly 1800w before your breaker trips and generally the entire room (if not more) will be on the same circuit. With GPUs in the 600w range and transient spikes in the 1000's, CPUs pulling around 200w plus the rest of the system, you'll be pretty close to popping a breaker with full system power and a transient spike. This is of course assuming the power supply doesn't shut itself off and there's nothing else connected to the circuit in the room. Most power chords are rated for 15 amps as well, people are going to have to start educating themselves on home wiring and safety in the future
@@siliconalleyelectronics187 I live in an old house where, before I moved it to another room, my computer was on a circuit with two bedrooms, an electric kettle, and the microwave. That worked as well as you'd expect.
I really think the GTX 1000 series really was the peak of GPU Design.
1080ti is the greatest card ever made based off pure value and longevity of being relevant. Still getting 60+ fps in modern triple A games.
My 1080 just failed.
Acer made blower-style 3060 Ti and 3070s last gen for their pre-builts, you can find them second-hand. Really good for undervolting and not affecting CPU thermals. Since I use a dual-system case I really need blower cards, at LEAST the main system in the bottom needs a blower card. Asus also had a short-lived blower-type card but it was too deep, rising way too far above the bracket. Also I just don't buy Asus.
@@adiohead im sad for you bro
Agreed. All downhill from there.
The new cards are so big and heavy that under normal operation, two smaller graphic cards are orbiting them in an elliptical orbit.
😂
Thats why they need the anti-gravity plate to prevent that from happening, duhh
It's ironic how detached the manufacturers are from the customer base. This generation really deserves a hard flop, and we can only hope the next one is better
pray for a good 4060 ti?
nVidia just straight up telling ITX PC users to go fuck themselves.
Tech in general is becoming too damn money hungry as these companies really focusing on net profit & revenues instead of catering to the wants/needs of the customers who keep their companies running. You still have 30 series cards sitting at dumb prices because you refuse to take more losses for another quarter but wanna release & shelf 40 series cards at similar prices to 2 yr cards you can't get rid of while installing this belief in people that DLSS 3 should make them buy it, especially since "DLSS 3 is only for 40 series" cause if it was compatible for 3090Ti them performance charts you showed would be pointless, your 4000 cards would be sitting till 2024 and your balls would shrivel up in that tight black pleather jacket you think you look like Tom Cruise in. I see why EVGA left these fools, at least they have some sense of humanity.
I just hope that AMD is able to release a flagship card with better efficiency, because we've officially crossed over into the realm of absurdity.
If their marketing materials seem strange and foreign it's probably because their marketing teams are from China or Taiwan. I'm pretty sure EVGA was the only US based aib.
If we keep going with this size trend we’ll start needing cases that securely mount the GPU first and then hang the motherboard off like a GPU lol
honestly at this point I don't see why we can't eliminate the CPU completely. but a couple of sata ports and USB ports on these cards, bam. they already are basically self-contained minicomputers on their own.
🤣🤣🤣🤣🤣🤣
Or they could make a dummy pcie slot on the bottom for ATX boards as another mounting point at this rate.
Hell why not put legs on the GPU like a coffee table?
Imagine doing a ITX build with one of these behemoths!
@@laggmonstret What's crazy is a lot of ITX cases use a PCIe riser cable to vertically mount the GPU. So somehow we got to the point that ITX builds are more secure than giant steel ATX cases 😂
Not many people know this, but the Galax serious card was initially called Serious Bill, to hint on the electricity bills people would be paying each month, but nvidia obviously banned that naming...
NV can't ban the naming lol. Marketing probably just thought it was smarter to change it and sell more.
History proves that Nvidia can do pretty much what they want and too often get away with it.
The only fightback AIBs have is to pullan Evga move
@@notsyzagts7967 Nvidia has all the leverage in the partnership between their AIBs
Pretty disappointed in the latest gen of PC hardware so far, feels like we're going backwards. Everything is getting bigger, hotter and more power hungry.
I love BFGPU concept. Finally we are getting real performance gains. You are stupid if you think a halo flagship discrete graphics card for a desktop computer that plugs into a wall, should be small and "energy efficient". If you don't want a flagship don't buy one.
I tried warning all you guys a year ago what kind of a total shitshow nVidia had become but you didn't listen. You could've prevented this. I even called, hell not just this, but Ampere when I warned you all that Ampere was looking pretty sketchy right before release, and it turned out to be one of the most meme-worthy nVidia generations since Fermi. They even started using the Sapphire Nitro Fury-X passthrough cooler design, which was an actual meme on AMD's old R9 300/Fury series cards for how hot and unstable they were, and suddenly nobody bats an eye when nVidia has to adopt it and make it mandatory.
Then when I saw the TDPs I just knew it was over, because Jensen will do everything in his power to make sure they have the ostensibly "best" performing halo card regardless how much the rest of the lineup is total shite. This signalled to me they already knew that AMD was gonna beat them with RDNA3 so all what this is is some truly desperate moves on nVidia's part to maintain their image, because optics is all that really matters to nVidia, not the performance. Meanwhile they even managed to drive EVGA away, after having successfully ensured that all Macs (shitty as they are) and all consoles now have AMD hardware on it, thus ensuring much better stability, support, and optimization for productivity software and games running on AMD. I mean what do they even have left, the Switch and Tesla or something? EVGA clearly bailed at the right time.
Sadly, the average nVidia buyer is now looking like a beaten housewife trying to rationalize how the corpo really loves them deep down inside, and shelling out 1080ti levels of cash for a shitty 4070 that's going to draw more power and perform even worse than a 7700XT and still require you replacing your 850/750w PSU most likely, so now you get to add that into the cost of upgrading your monitor and GPU. It absolutely astounds me that even after EVGA themselves finally had enough with nVidia, that there's still honest to God thousands of people out there who insist on buying nVidia cards for no good reason at all but branding and memes.
Card itself aint big, it's just a hunk of cooling
AND most important more pricey
@@SlavaBagmut Yeah which is why I won't bother with it. $1600 for a 4080 on newegg, what the fuck do they think this is 2020? Yeah nah cunt, that's a HARD pass for me. What's hilarious it's not even like it's any good, they've got bursting into flames 12tard plugs, rather than 8pins. It's like, this is not a small thing to me when you're running 450w+ That does not make it more better. That makes it WORSE which means you should charge LESS for it. When AMD was pulling that shit they had to charge less money because who in tf wants to pay more money for a card that has to draw 100w more power for being equally performing.
The psychotic thing is, there's allegedly still all these abject retards buying it, and I'm sorry but at a certain fundamental level I'm going to respect you less for what you're spending $1600 on, if it's crystal meth or a 4080 I don't actually respect what you have to say on other things because of that decision proving you're not cognitively reliable.
Maybe EVGA got out because it saw that the investment commitment to buying an marketing the 4090 cards was going to be a losing proposition and instead of having an unprofitable year or two, it just walked away.
He's right, Nvidia made too many 30 series cards but is refusing to lower prices for AIB partners while they themselves are lowering prices of FE cards, trying to outsource the loss of excess inventory to AIB partners.
Now they decide they'll just increase the price of 40 series so people will be forced to buy up 30 series stock first.
This man is a legend, he can do a professional and informative video with trolling including 😂😂
If they were losing money with the higher end 30XX gpus, they would've have been absolutely been hemorrhaging money with these, especially considering nobody is going to buy them when there's the surplus of 30XX gpus
@@Nighterlev Just like what they did with Microsoft, Intel, Sony, Apple, Linus T., probably Nintendo too.
Nvidia’s sure got a lot of enemies, huh?
@@Nighterlev Good points.
These marketing teams need a raise, just for providing pure comedy value.
I genuinely laughed out loud when he said the support stick was called "The Dark Obelisk"
Missed opportunity to call it an Obelisk of Light.
I've never actually faced palmed before, but I have this playing in "desktop mode" minimised while I'm at the gym, and I actually facepalmed when I heard that, the delivery was perfect. I didn't even have to see Steve's face.
Black shaft
@@johnbuscher I'd go with calling it the 'neuralyzer' if i was on that particular marketing team
@@mikewazowski8368 he's like a father who isn't even angry anymore, just dissapointed
I predict a strong trend in watercooling this generation. Its starting to make sense now with oddball slots multipliers.
either that or a strong trend in people skipping 40-series
wow what a brave prediction
@@superbusstarodub ikr nobody saw that one coming. Im quite the Nostradamus.
Just like when I predicted I would eat spaghettis for diner yesterday... Im never wrong.
Instead of a 'Support Stick', they need to have a GPU Shroud with an integrated "Kick Stand" that rotates out and down at the far end of the card.
That'd be pretty sleek
now thats innovative thinking right there
Yeah then the case manufacturers could just build like a Zip Tie gloved clicky ridge design the stand could run across to lock in place the height
Gpu with Rgb kick stand, great idea!
they just need to make smaller cards... pretty soon you are gonna need half a room just for a pc
I like how versatile the new GPUs are.
In the winter they act as heater and when somebody is trying to break in, you can use that thing as a weapon
Don't forget being able to bend space time when necessary.
@@joelmorningstar3645 Yup. Just when I thought the pricing and manipulating the 30 series market was enough of a shitshow, now we get this. I mean... is Nvidia that determined to drop the gaming market, that they'll do anything and everything wrong in their power? xD
@adsaa host your own burning man at the comfort of your own room.
Forgot to mention that now you need to know structural engineering in order to install the nvidia brick(video card)
I mean....Thats why my 3080 is just chilling in a Cupboard right now. :D (jk...My new MB or CPU is broken. Dunno what exactly seems to be the Problem...)
I think these cards will follow in step with recent sales trends on GPUs, that is to say nobody wants these. They're too expensive and average consumers budgets have no room for these overpriced, oversized, overspeced cards right now.
I think you're delusional. People are very impulsive and dumb. Similar to idiots paying $100K+ for a new truck that's worth $25K, people will line up to buy these and they will sell out...as usual.
Agreed, I had a budget set aside for these cards. My max was $1200 for a 4090.. and to see these base prices set to release at $1699 its absolutely absurd and I won't be buying one.. I personally know at least 3 other people that have decided not to buy the 4000 series at all because of Nvidias slefish decision, and it's sad to say but I hope Nvidia gets the shit end of the stick on this release.
youre delusional if you dont think these will be sold out on the first week
Historically few people have bought top tier cards, so I think you're right.
What will be interesting is what Nvidia chooses to do with the 4070/4060, and how AMDs chiplettes compare
@@KMB_StiX Clearly they're not after your average moderately intelligent standard common-sense grade normal working wage consumer.
I like how you started with ASUS as they seem to be the only one following their naming model seriously. It's like you didn't want people to confuse them with the other nonsense they are surrounded by.
The size and weight of these cards is absolutely monstrous.
Price too!
yeah that 2,7m card at 16:40
They still got nothing on the ladies walking around here.
I will not be surprised if some of these cards would literally fall out at some point in the future due to the weight.
@@konradukasiewicz4834 what the actual fuck lmao
I have never laughed so hard at an earnest piece of tech news coverage. The absolute state of PC gaming is so funny to me.
Bloodborne man is everywhere I go, goddamn
There aren't that many games that need these cards. The only game i can think of, Cyberpunk 2077, plays well enough on 3080/3090. There is no need to go to a 4090 unless you are a content creator. The 4080s aimed at the gamer won't be that much better than last gen 3090s, with their lower Cuda cores and their slower memory bandwidth.
Personally, I would jump on the 3090 GPUs being released from mining, before that market picks up again.
Console players like "yeah ..maybe I'll cost just a bit longer .."
@@chrisliddiard725 it's gonna be a hot minute before it does. All the markets are getting ready to self emulate like monks. Housing, wall street, energy, crypto, major currencies ... all about to take a big old Amber Herd right in the bed of the global economy.
Agreed ! Seriously though, what happened to these manufacturers ?! It's like they are advertising toys for small children. The marketing jargon is ridiculous. I think that EVGA left the GPU scene at the right time. This all feels like a bad comedy.
I love the idea that somehow 'bionic shark fans' made it through various stages of marketing committees at Gigabyte.
Unreleased epidode of The Office
It's gigabyte dude. They're trash and I can tell u first hand they r. 6 month process of an rma later and the product still messed up.
The results of work from home generation.
I guess we know now where Scott, the son of Dr Evil, works.
Can't wait for the Super RTX 4080 FE XL FTW 2SLGBTQIA+ ULTRA Turbo Champion Edition II ALPHA Gold RGB Platinum HyperX Bionic Stealth Shark Carbon OMEGA Ti Remastered Limited Re-Edition!
This must have been so incredibly embarrassing to cover. I am convinced that these board manufacturers are convinced that all gamers are idiots.
In fairness, they are. Have you ever tried interacting with "people" on Steam forums? You can easily forget just how F'ing stupid average gamers are until you actually go and interact with the ones not playing your niche city builders or whatever, and realizing these are literally Qanon follower tiers. Plus it's nVidia anyway, so they tend to get a much bigger portion of the bottom end of the bell curve regardless.
It's gonna be wild to see the sales numbers on these, because I don't know anyone who is even considering a 40 series card. The power draw alone is turning most people off. They picked a hell of a year to release a high power draw card considering the energy cost issues going around.
was thinking that as well. I'm already considering selling my i5 11400/RX6600 XT PC in favor of a Mac with Apple Silicon due to the power draw. And don't get me wrong, my PC draws 500W tops, imagine your GPU alone is using that much power...
Yep. Definitely waiting to see what AMD has coming up.
You are paying almost the same for energy for a house with ac and fridge working all the time only to fed that card, feels like the gaming industry is going backwards strong, I’ll stick with my apple m1 and my current 3080 till gen 5 offers something more power efficient
They will sell out, the consumer is absolutely stupid by now
I considered buying the 40 series because I'm on a 2070 right now, and boy oh boy, I'm good on the electricity bill(because its going to be pretty low compared to everyone else) but not the whole card price nor how big it is, plus I'll need a new PSU.
These names are basically just what happened when the monitor guys got together and made a big bingo card corresponding to each letter in the alphabet and numbers between 0-9. They then, simply constructed these brilliant pieces of art.
These names prove that they’re marketing towards the only people that would buy them: idiots.
Monitor guys when their VG(IDK)27(27inch)AQ(QHD) monitor isn't called VGLE27NPIPS99J1440P
or got pissed in karaoke bar ,that meeting did not happen
Dont think it going to matter to much by the comments ive seen across multiple videos on the 40 series AIB makers are going to be struggling to sell Nvidia cards anyway ..
EVGA should jump ship to AMD this gen and make HUGE $$$$
You know some guy is gonna grab the "Dark Power" of the Midnight Kaleidoscope and then combine it with that "Dark Obelisk", and cringe all over some gaming platform.
I've not been this underwhelmed by a GPU launch in many years.
reminds me of the 20 series launch. insane price hikes for marginal horse power increase
@@charlesballiet7074 up until now we have no independent performance reviews. However even if they did not achieve any architectural advantages (unlikely) they would still have massively increased the cuda core count.
His seatbelt didn't kill him. It was the lack of the HANS device which is now a nascar staple due to Earnhardts death. It sucks we had to lose best to save the worst
@@petermuller608 the closest I've seen is a preview video which Digital Foundry (they already have a founders 4090 in hand) dropped 2 days ago showing some percentages vs 3090ti rather than raw framerates. I think a lot of the gains referenced DLSS 3.0 but the increases were truly massive. 4090 at 250% FPS vs the 3090ti in CP2077 maxed everything 4K. They claim 8K 60FPS is now possible
Nvidia still sucks, won't buy anything from them, until they release truly open source drivers.
Otherwise you'll be stuck with unsupported binary blob garbage on Linux after a while, like on my Thinkpad W530
When these cards drop and you do your reviews, could you do a breakdown of which cards would be the best option if you're planning to watercool them in a custom loop?
I have a hunch, that nvidia forgot to mention to their partners that they cut back the 4090 tdp from 6-700W to 450. that is why everybody is coming with the 4 slot cards with triple huge fans, meanwhile nv can show a slicker founders edition card. This would also support EVGA-s claims about getting every info in the last moment, or even on the reveal event itself.
Yep that seems about right
Or maybe they are already prepared for the 4090ti with 600W++ TDP.
450W is not actually the limit for the partner cards. MSI has a 'silent' setting at 450W, but also a higher setting. They cannot tell us yet how high it goes due to their NDA.
@@aapje it’s 650W based on a alleged AIB employee source
6 Watt TDP? Dang
This makes me think that EVGA ditched when they were told to relabel the 4070 model. It would have been hilarious if all the board partners went with the original plan and released the 4070 model. Or did a bad job on mspaint to edit the number. That would have been a nice foot in the backside to Nvidia.
I just want to see a bunch of stickers over of the last two digits on the boxes.
@@qwerfa that would be the best thing ever.
Lol, or some shitty sticker of an '8' stuck over the '7' in 4070. Would be awesome.
Things become harder if the print is on PCB directly by silkscreen. No amount of stickers can hide that. LMAO
@@qwerfa think it was Sapphire that used to (long time ago) make boxes with no model numbers on them whatsoever. They just stuck it on with a sticker because they used the same box for all their cards
The funny thing is, that they are designed to look like toys, yet the price suggests that they are definitely not 😅
Designs made to appear to children, who will never be able to afford it
(except for like 13 kids, whose parents are multi millionaires)
yeah, the only somewhat reasonably priced model is the 4090, but thats definitely a gpu for workstations and thus not for children. I can't imagine a graphics designer thinking "oh yeah, I need that 90s gamer kids gpu model" xD the designs are about as atrocious as the 4080 pricing.
They are also kind of named like toys too... say the name of any of those cards with the word "Nerf" in front of them and you think its a nerf launcher or a water pistol.
These are the Batman & Robin of GFX cards.
All this pathetic RGB crap is so annoying.
@@SmokeWiseGanja I actually like the way Founders Edition looks like. I don't understand why AIBs can't stick with that...
With these names I feel like I just woke up from one of those weird as fuck dreams where you weren't sure if it was real life or actually a dream...
I burst out laughing at "new level of brilliance and absolute dark power", that's actually incredible.
That's exactly what got me too. The whole build up of everything being nonsense, that was the straw that broke the camel's back.
Imagine that in a glass case with Trident Royal memory. BLING overload.
The 5000 series will come with their own separate case that you connect to your main tower. And with the optional liquid cooler that will be required to keep your GPU under 100 degrees, you can purchase the additional optional tower by NVIDIA for $699
5000 series will only include one USB C port, which you will have to buy a $50 dongle to convert it to Displayport and if you take off the cooler and repaste the card yourself, it bricks the bios until you send it back to Apple...err I mean Nvidia for a firmware update.
5000 series will allow u to install Ur pc into the case and keep it cool 🤣
At this point, they're pretty much almost becoming Apple
Although even Apple doesn't go that far
C'mon, LN2 or go home!
The tower will ofcourse have a 4 inch screen capable of playing state of the art gifs!
Now I truly understand EVGA's "Principle Decision", right after watching this video to the very end.
EVGA: 3090 *FTW3 ULTRA GAMING* with *PowerLink 41s* for maximum power filtering…
EVGA: leaves
GN: oh look these AIBs calling out their cards with cringe names *Msi SuprimX with an X*, galax with *4090 Serious Gaming*
TF lol? Nothing changed, they trynna hate on the AIBs for literally something that has been the same for years, and now that EVGA leaves…
And EVGA itself was heavily segmented. There'd probably be at least 10 more cards if they were in the fray.
@@Jusdutari naming isn't the issue for me it's the gimmick they're using to sell us a literal brick from anti gravity core pipes to fking support sticks. Nvidia did all of this by raising 50% more power than their competitor to try and squeeze 10-20% more performance out of their GPUs. I'm skipping on this generation sadly unless they release something that won't break my motherboard and require me to get a 1000W GPU.
Ehhh, not really? EVGA wasn't exactly innocent, they had a stupid number of different models with confusing clock and memory speed adjustments, cooling buzzwords and model names, etc. FTW3, ICX, Black, XC, XC3, Ultra, etc. You had to get through all that crap before you could ever tell why any card was better than any other they sold, other than price.
@GamerNexus I saw this idea in another comment chain, but I am going to repost it as a fresh comment because I think it deserves more love.
The 4000 series has left the realm where "card" can accurately describe these pieces of hardware! As classic cyberpunk fans recall, netrunners had to use solid blocks of circuitry to hack into things. These blocks were referred to as "cyberdecks," and hackers colloquially referred to as "deckers."
We should all start referring to these new GPUs as "cyberdecks." They are big and heavy and expensive enough to warrant it. Also, driving a VR rig is absolutely the beginning of Cyberpunk's 'Netspace.
Well, if we are lucky, the gravitic effect of the Gigabyte cards is so strong, it can pull the entire marketing team in close enough for the sharks to be able to eat them and spare us any similar BS talk in the future.
And it was this day, that we knew, no one had a pull out game that rivaled EVGA.
Maybe for the 5000 series Nvidia should just make a product that performs the same as the 4000 series but uses half the power and takes up half the space. Maybe brand it like Sony and call it the 4000 series slim.
I'm betting they do that for the next generation, once they've sold out their 3000 stock
Naah by the size trajectory, in Nvidia 5000 series there will be a slot to install your PC in the GPU case.
AMD is going to make those. They've been catching up for a couple generations, and this is the perfect time to demonstrate how absurd the 5000 series is by revealing what it should have been.
Yes, they will be called RTX Super - Slim.
@@SanilSinghTomar Hahaha!
People made fun of me on reddit for adding a bunch of heatsinks to the back of my gpu, and for trying to build my own heatsink. I told them it was "future proofing".
They're not laughing anymore...
We went through shortages, price hikes, scalpers, crypto miners just to get to this point... Damnit this gpu nightmare will never end.....
Lets just hope amd doesn't follow suit.
@@CarlosZ34NSM ARC is starting to look a whole lot better next to these gigantic, power hungry Nvidia cards.
Honestly, even if the prices were reasonable I wouldn't want to stick one of these things in my computer.
@@CarlosZ34NSM arc needs another year or two to get some ground and i wouldnt bet much on amd so it looks like were screwed.
@@rars0n magine caring about power lmfaooo it meeds power to deliver power dumbass.
@@spiffnoblade7731 Agreed. Arc is competing with the 3060 tops. It needs one more generation to get a full lineup. Hopefully it wont keep falling further behind.
Videocards used to just come bundled with game keys. Gotta say bundling absolute dark power is really upping the ante
I am interested in upping my "dark power" game. Would I be best going for a Galax card to get the "Dark Obelisk" or would Palit's "absolute dark power" be more powerful and ...absolute? I feel like the "dark obelisk" would require more rituals and sacrificing and I'm already caught for time as it is!
What about combining Dark Obelisk stick with Dark Power for the greatest effect? Although that would definitely require a sacrifice to afford it...
... and then there's the whole issue with anti-gravity. Will the dark powers grant me the power to defeat gravity or will I have to decide between the two?! Should I buy a range of cards to take advantage of all the magic or would I be meddling with powers I cannot possibly comprehend?
Yeah, might get dangerous ;) . Seriously though, what happened to these manufacturers ?! It's like they are advertising toys for small children. The marketing jargon is ridiculous. I think that EVGA left the GPU scene at the right time. This all feels like a bad comedy.
One step to turning to dark side. "Yes yes use the anger Anakin feel the power of the dark side."
I love that the dark obelisk and the dark power graphics cards are covered with RGB lights.
We are going to go back to the time when computers took up entire rooms
Yup 🤣 devolving at lightning speeds
Just watching Steve's facial expressions as he reads the names is classic.
ASUS thoughtfully manufacturing bricks that can be thrown through the windows of their office
😂
Finally a metaverse reference that is understandable
and they left the coordinates for said office on the brick itself for convenience, such nice guys :D
Next time putin kills someone, it'll be death by rtx 4000
Soon gamers & pc builders they will have one case for the GPU & a second case for other components.
Imagine being the inventors of dark power and anti gravity plates, only to turn around and use them to build a graphics card, instead of world domination.
It's Absolute dark power tho. Like imagine dark power actually becomes Absolute... But your point stands.
Plan for utter Domination:
1. Buy a Palit 4090
2. Order customized Support Bracket with "Absolute Dark Power" written on it
3. Put a DMC V Vergil figurine on top of backplate
4. ?????
5. Own every Game feeling motivated
Imagine not
Not even building a graphics card, just repackaging. And as EVGA said - for low profit margins. kek
By the time we get to the 5000 series, we’ll need AIO cards or custom water loops for the cards to keep them cooled. I just upgraded from my 1080ti that was a beast of a card from gigabyte to the 3080 strix card and the size differences is actually comical!
Asetek can only wet dream that this will happen. Last time a GPU was designed facotry like this it was sued and shutdown from Asetek (Radeon Fury X)
Same story here, had 1080 TI Strix for a while , sold it to a friend a year+ ago.
Got my 3080 Strix recently, this friend came to me to clean his computer, was overwhelmed by the size and design.
1080 looks waay cheaper now with the materials and build quality.
By this time I think water-cooling must be a requirement for next generation Nvidia cards
Thinking the same: see my comment above
No, I do not like nor do I trust water cooling.
You need six times the water generated by Niagara Falls to get these close to workable...
True, but even the ones that are watercooled don't directly cool the VRM apparently.
And nitrogen cooling for the next after that.
Respect to Steve to say those names out loud without bursting into laugh. The timeline on the left is also just pure gold. The most hilarious news to upcoming GPUs ever watched so far
I just got to the "bionic revolving shark fans" he looks like he's struggling haha.
@@Inconsistense “Bionic revolving shark fans” feels like it was made in a random phrase generator
Dunno, i find it hard to take his Jif obsession seriously when he insists on calling out other's stupid naming schemes
I can honestly say I'm super excited to finally buy a 30 series card. 😆
I just wish I bought a 3080 instead of a 3070 Ti, prob was I couldn’t find one in existence
Now I'm super excited to buy my first ever AMD card lol
I've been on my 1080TI and with the 3000 series prices now I'm considering going for a 3080TI! It's about time?
I got a 3080 last month! Upgraded from a 2060. Still expensive af...
You're playing right into Nvidia's hands
They want you to buy a 3xxx series card
That was the best laugh I have had in a solid week! Honestly, these GPU companies are just rediculous these days. 4 slots is crazy enough, but when they cover their card in sparkly dark matter and fall in love with three fan designs supported by dark obelists, I have to wonder if they even know what kind of products they make?
Honestly these cards are getting ridiculous. If there isn't a development in GPU power efficiency and cooling soon then we'd be seeing 6 slot sized GPUs in the not too distant future.
Only if the emptied brained masses are stupid enough to actually buy into the hype machine and purchase these wastes of silicon (which of course they will)
We probably actually need more software and driver optimization, since the currrent and the last generation of gpu hardware has not actually got pushed to their limits at all.
Soon you won’t be sticking graphics cards in your PC, you’ll be sticking PCs in your graphics card lol.
@@TH3C001 eGPU (external gpu) exists by the way..
Pretty soon all graphics “cards” are going to require a dedicated heat pump just to keep them under 90C. Going to be expensive running your PC off of 240 volt plugs.
As a former employee of EVGA, we where told everything last. We didn't get any assistance from Nvidia when GPU/copper/silicon Shortages, we where left for dead.
Feels rough man. Honestly nvidia needs to remove the top brass that are doing this bull crap
No one blames EVGA for not wanting to lose money on 60% of the line-up. Nvidia needs a good boycott to bring them back into line. Hopefully enough of us managed to get a 30 series to make it happen.
Well, that sucks! After hearing that, it's impressive how well EVGA did, with their products. How long did you work there? Was it a fairly large operation, or would I be surprised, if I actually went there?
Mindblowing when I heard that. How are you supposed to make a quality product without reasonable drivers or facts about the product at all.
It's a shame, I would ONLY buy EVGA cards, now it's gonna either be ASUS or AMD. nVidia need their wings clipped.
It really is hard to take any of this seriously at this point.
Calling gifs both jif and gif, trying to get everyone angry. I love it.
We've reached a point where PC Part Picker will now have to add a weight spec to there filter when searching for a card.
It's got me thinking of a water-cooled card for the weight savings
@@gyratingwolpertiger6851 I would think of getting an AMD card instead
@@pcmasterracetechgod5660 Yeah I'm leaning that way myself
Steve chuckling out "What the f***" and just walking off at 18:18 is really the only summary that's appropriate for this release.
I legit cackled for so long at this.....lmaaaaaoo
Seems we've reached a tipping point where GPUs become faster by becoming larger not more advanced, smaller, efficient. That means everything is more expensive, larger PCs motherboards, PSUs.
Tipping point. Was that a pun because of the free walking stick with every purchase.
Actually these GPUs are more efficient than the last generation as performance per watt will show .... They are also more advanced ..... The Laws of Physics doesn't include miracles .... More power comes at a cost in this case it's the size of a cooling solution needed .....
@@longjohn526 why?...... do...... you....... type....... like...... this........?
@@zakkart Gotta... make... sentences... longer... so... yeah...
Good gen of cards to just skip over, for many reasons
Had to laugh so much when hearing about the brilliance, dark powers, and anti-gravity technology. Hikes, Albert Einstein was nothing compared to these geniuses.
I’ve got a pretty strong feeling that if I upgrade this cycle it’ll be to an AMD card…
I already switched to AMD because I game on Linux.
You are not alone. I've been Nvidia customer over a decade, but this launch crossed the line for good. Greedy, AIB bullying, budget gaming destroyer, Nvidia is really awfull company lets face it. They make excellent GPU's, but we need AMD to change things like they did with CPU market.
Holding my EVGA 1070 SC until I need a new build. I’m interested in either AMD or Intel, but NVIDIA’s attitude towards consumers, partners and media isn’t something I can support. I’ve been feeling this one coming for some time.
@@progenitor_amborella I lift my hat for you sir. I've been thinking about changing to AMD too, i know this is kinda poor idealism because nothing changes, but i can atleast keep my principals and not buy from Nvidia anymore.
AMD has made it easy too. Because gaming with their GPU's is pretty much same, they even improved FSR wich should be now very usable with 4k.
i know it would be mind-numbingly painful but i would to love see the AIBs claims on fan improvements tested. Every year they claim ___% more airflow than past gen. i feel like these guys have created 3500% more airflow in the last 10 years somehow
When the EPA slammed the lid on automotive emissions (70's), this caused a massive loss of MPG with marginal air quality. There was an avalanche of articles giving on how to improve MPG, who knew we all were such lousy drivers! It took 10 years for manufacturers to retool. Now we're harvesting 90% of the fuel energy and can't afford the complicated devices. When you added up all the claims of those early 'experts', you could leave with a 1/4 tank and have an overflowing tank at the end of your trip. Such is the power of the press.
We'll have an ef5 tornado in our rooms in a couple of years
That dude who does 3d printed fan testing would 10000% do that if he could get his hands on the cards
@@Shalmaneser1 That's pretty funny but I believe modern automobile engines have a thermal efficiency between 20% and 40%.
RIP to all the planned miniITX builds out there
@Mike Yeah, for the premium and car like maintenance.
Any MiniITX build basically will need to be water-cooled now to reduce the size. Maybe that's why all those companies are now selling a GPU+Pump combo so you can just attach it to a loop and done.
What miniITX builds when you need 1.5kw PSU :P
@@ArchusKanzaki what if the pump fails.. $2000+ paperweight?
@Mike Where do you put the humongous radiator required to dump the heat with? Or are we all going to be attaching RC aircraft-level ducted fans to 140mm rads and watching the machine take off and fly around the room with all the subtlety, grace and decibel level of a rabid goose on crack, while blasting air out the end that's hot enough to strip paint?
By the time the 6XXX cards come out, the flashing lights will blind small animals.
RTX 5090 will come with its own power supply and large case with cooling. You'll just need to plug it in directly to the power outlet.
I bet you thats not far off with Jensen trying to be the second coming of Steve Jobs and only ending up being the great value version.
I don't think it can do that. The pc needs dc current to run
3 phase outlet.
@@lordadz1615 that's why he said it's own power supply.
Sounds like real thing...
I think more people are going with Radeon GPUs in the future. I know this guy will be.
I'm leaning toward used 30-series, but if AMD can keep from being crazy on pricing I might well go that route. We'll soon see.
Same. I've only stuck with Nvidia because I never had any problems with their products. Probably gonna go 30 series or radeon.
If only I didn’t want the AI capabilities 😢
Since they are keeping 30 series around that might not be a bad direction to go.
Thermals and power are issues all GPUs will have to deal with.
Good thing Nvidia is releasing their new space heater just in time for winter!
My 3090 already heats my room up with ease. I can't imagine how much heat these gonna put out 😬
my 1050 Ti rig and my Phenom II server already keep my room warm… my next card is gonna turn my room into a sauna i swear
Just in time for the Australian Summer *_*