One kidney, please! - NVIDIA RTX 4090

  • Published: Jun 8, 2024
  • Get $25 off all pairs of Vessi Footwear with offer code shortcircuit at www.vessi.com/shortcircuit
    The 4090 is here! We can't turn it on or talk about performance at all, but we stole the Zotac 4090 Extreme Airo GPU from Labs so we can unbox it for you and talk about what we should expect from 4090 performance.
    Buy a Zotac Gaming GEFORCE RTX 4090 AMP Extreme AIRO: geni.us/SBp2
    Purchases made through some store links may provide some compensation to Linus Media Group.
    Want us to unbox something? Make a suggestion at lmg.gg/7s34e
    ► SUBSCRIBE ON FLOATPLANE: floatplane.com/ltt
    ► GET MERCH: lttstore.com
    ► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/scsponsors
    ► PODCAST GEAR: lmg.gg/podcastgear
    ► SUPPORT US ON FLOATPLANE: www.floatplane.com/
    FOLLOW US ELSEWHERE
    ---------------------------------------------------
    Twitter: / shrtcrctyt
    Instagram: / shortcircuityt
    TikTok: / linustech
    Facebook: / shortcircuityt
    CHAPTERS
    ---------------------------------------------------
    0:00 Our 4090 FE was stolen
    0:53 Unboxing and basic specs
    2:36 Initial design impressions
    3:12 What we know about performance
    3:51 Sponsor - Vessi
    4:32 DLSS 3, RTX Remix, AV1 Encoding
    8:31 More on design and current 3090 pricing
    10:15 Outro
  • Science

Comments • 4K

  • @ShortCircuit  1 year ago  +1420

    Stay tuned to LTT for the full 4090 review coming soon! Would you consider buying a 4090? What GPU do you have right now?

    • @fake3187  1 year ago  +60

      AMD Radeon 6600... 😷
      Ryzen 5600x

    • @Snikpoh1390  1 year ago  +35

      3070. I would like to upgrade to a 40 series so I can game at 4K with less stress on my GPU.

    • @matthewjalovick  1 year ago  +12

      I have a Razer Blade 14 with a 3070. Bought it in the summer of '21 for a solid PCVR experience to pair with my Quest 2, and I have no regrets :) everything is between 100-120fps in VR.

    • @Azrael8  1 year ago  +32

      Sticking with AMD.

    • @Max-xx6lu  1 year ago  +15

      3070 and 3700x, enough for now

  • @FatUberUddersOfChaos  1 year ago  +7827

    With the size of these things it's starting to feel like installing a PC into your PC.

    • @oiytd5wugho  1 year ago  +141

      Funnily enough, PCIe "PCs" (DPUs, SmartNICs, etc.) are much, much smaller

    • @Renuclous  1 year ago  +294

      Well, for all intents and purposes, a GPU IS a PC inside your PC. It has a processor, "chipset", memory, input and output IO, and power delivery circuitry.

    • @Todd_Manus  1 year ago  +47

      @@Renuclous Nooooooo..... A PC is a PC not only because of hardware... software is part of the equation: Linux, Windows, macOS.

    • @Napert  1 year ago  +102

      @@Todd_Manus firmware

    • @Antifuzz1  1 year ago  +81

      yo dawg, I heard you like PCs

  • @manankakkar4888  1 year ago  +2077

    Just in time to function as a space heater in winter.

    • @aryadhole  1 year ago  +13

      😂

    • @davidlazarus67  1 year ago  +111

      Yes, but will anyone in Europe be able to afford to run it?

    • @chitorunya  1 year ago  +30

      @@davidlazarus67 Good thing not everyone lives in Europe then.

    • @davidlazarus67  1 year ago  +30

      @@chitorunya Yes, but more than double North America. We are a much bigger market than the USA.

    • @Mr.Morden  1 year ago  +27

      Any 110/120V users running an actual space heater in their gaming room will have their breaker trip.

  • @therandomdrone4370  1 year ago  +84

    I'm so glad he brought realism to the durability of power cables. You can tell that when they mentioned the lifespan, most people had never heard that connectors have a rated number of plug/unplug cycles and freaked out over something they had no idea about.

  • @Avengedsevenfoldxfan  1 year ago  +10

    We went from GPU sag to GPU gangsta lean

  • @surg0083  1 year ago  +487

    Only $1,600 before tax. Wow, that's more than a lot of people's whole build

    • @mugendreamz8277  1 year ago  +12

      Could very well get up to $1,800 depending on which manufacturer you get.

    • @valentinvas6454  1 year ago  +28

      It will be around 2200-2300 in my country, which is like three times the average monthly salary 😬.
      I really wonder how many people will buy even the 4080 12 GB, also known as the new 4070. I bet even the true 4070 will be around 800-900 here, which is awful.
      RDNA 3, please save us with more reasonable pricing.

    • @ashkanaref4056  1 year ago  +9

      My house costs less than that

    • @stevenkantrowitz6040  1 year ago  +1

      Yeah, tell me about it. Mine was $1,250 and I still felt like I was overpaying.

    • @legend8202  1 year ago

      Brooooooooooo.... that would be about as much as my whole PC! I have a 2080 Super, but I'm pretty sure that's only a fraction of my total rig's cost... I did get my PC on sale though!

  • @wavelogic8471  1 year ago  +184

    Every time he slams it on the table, I get a sharp pain up my spine. This is no way to treat your kidney!

    • @nickevison391  1 year ago  +17

      Why not? I treat my liver like that every other weekend

    • @DRAWKCABLLA  1 year ago

      Nvidia is a piece of turd, which makes their products turds. Not kidneys. He is slamming a turd on the table, which should make you feel repulsed

    • @stevenjoummaa4640  1 year ago

      @@nickevison391 By slamming it on the table?

    • @nickevison391  1 year ago  +1

      @@stevenjoummaa4640 metaphorically, with alcohol, yes

  • @martins7194  1 year ago  +3

    Size-wise it reminds me of the Sound Blaster AWE32, which fit in an ISA slot.
    It was so long that in our PC case we had it in the bottom slot, resting on a small rubber block to keep it from hitting the bottom of the case.

  • @GameCookerUSRocks  1 year ago  +3

    Benchmarks? Where is the video for them? I also wanted to know about the fan noise people were complaining about. Thanks!!!

  • @thatzaliasguy  1 year ago  +1274

    With how big GPUs are getting, I can't wait to see someone make an SFF PC out of the shell of one.

    • @joshuaadams4945  1 year ago  +3

      Right, I am turning my Star Wars Titan Xp enclosure into an RPi build, but it is pretty small inside.

    • @xbk923  1 year ago  +1

      😂

    • @tripodal69  1 year ago  +9

      At this point you're modding the GPU to put a PC inside it.

    • @usagihakushaku5338  1 year ago  +1

      @@john-paulhunt6943 It gets to the point where the majority of people just don't need a better one. Games are made with the majority of people in mind; medium will just be treated as the ultra setting, and the newest GPUs will play games above that.

    • @ruinenlust_  1 year ago

      what does any of this mean

  • @NikilanRz  1 year ago  +899

    The 40 series feels like connecting a PC to the GPU

    • @mkhanman12345  1 year ago  +11

      connecting the PC to the heatsink

    • @ejkk9513  1 year ago  +3

      That's always been true, as the GPU has always been the top consumer of power. Now more than ever... but that's never changed

    • @_valheim  1 year ago  +1

      @@ejkk9513 it's more in reference to the size

    • @4doorsmorewhors  10 months ago

      Wow, so original and SOOO funny, I can't stop laughing

    • @snoopy2252  5 months ago

      Lol

  • @christos1182  1 year ago  +36

    The sizes of these things are getting to the point where cloud streaming could really become a more appealing option with a bit more advancement of the tech. I saw a video with the 4090 ROG Strix and it requires a full ATX tower due to how large it is. That's insane. The heat output could probably take care of an entire 3-bedroom house in winter!

    • @Lucas_Simoni  1 year ago  +6

      Imagine one of those in the summer without AC / bad air circulation in the room... It reminds me of a guy who got brain damage from a heat stroke caused by his crypto mining hardware.

    • @Adaephonable  1 year ago  +5

      A typical single-room space heater is upwards of 750 watts, some even up to 1500 watts. So no, this card could not provide the same heating as 3 of them.
      400W < 2250-4500W (in case the math confused you)

    • @christos1182  1 year ago  +10

      @@Adaephonable Wow, no need to be condescending! Obviously my comment was hyperbole... yeesh!

    • @domnero1211  1 year ago

      Could someone explain to me what the hell this is? I'm lost. What is this device used for?

    • @Aashishkebab  1 year ago

      @@Lucas_Simoni Chubbyemu fan?

  • @nathantron  1 year ago  +8

    I think these new cards are a prime example of why Peltier TECs should start being implemented: a solid copper plate >> TEC >> a thin-finned heatsink designed to have air forced through as fast as possible, or even water cooling.

    • @Adaephonable  1 year ago  +1

      You are aware of how inefficient Peltier coolers are? Instead of a 400W card you could have a 600W card that does the exact same thing...

    • @nathantron  1 year ago  +1

      @@Adaephonable I'm more concerned with getting the heat off the card and into another, more efficient medium as fast as possible. If we do that, the wind tunnel might be able to do some real work. Hahahahaaa

  • @martinutasi7857  1 year ago  +667

    Just imagine: the 5090 will use 2x PCIe x16 slots and come with an additional 600W PSU in the box 😂

    • @NeonCoding  1 year ago  +124

      In the box? Not a chance.

    • @nielskoolstra  1 year ago  +54

      The next one will ship with a wall power plug at the back, just so it can take a whole breaker for itself. \sarcasm

    • @Ms.Fowlbwahhh  1 year ago  +13

      Nah, it will come with a dolly so you can move the appliances hooked up to your 240V outlets

    • @BrianG61UK  1 year ago  +10

      Only 600W?

    • @Gamer-nc8qp  1 year ago  +4

      @@nielskoolstra that actually makes sense and I'd be fine with it

  • @Neoxon619  1 year ago  +1655

    How does a modern GPU not have DisplayPort 2.0?

  • @paullasky6865  1 year ago  +1

    Fascinating how excited we all get about a massive fan and some RGB lights

  • @pldcanfly  1 year ago  +345

    Can't wait for the day a manufacturer comes up with the idea of using 2 PCIe slots with one card: one for data, and the other for structural integrity.

    • @RadioactiveBlueberry  1 year ago  +18

      By structural integrity you mean to prevent sagging? JayzTwoCents made a video on how to do it.

    • @st0nedpenguin  1 year ago  +1

      ...what do you think the massive heatsink does?

    • @nabawi7  1 year ago  +6

      Yeah, that would be smart. The second PCIe x16 slot is unused by the vast majority of gamers anyway since SLI is a thing of the past, so this would be a good way to make use of it.

    • @nabawi7  1 year ago  +3

      @@st0nedpenguin To transfer heat and make the card super heavy? I'm not sure what you're implying here.

    • @st0nedpenguin  1 year ago

      @@nabawi7 The heatsink provides much of the structural rigidity.

  • @sturdybutter  1 year ago  +1006

    That's insane that a card that expensive doesn't have DP 2.0

    • @nonfungiblemushroom  1 year ago  +145

      That's Nvidia for you; they know people with more money than sense will buy their BS GPUs no matter how much they hobble them...

    • @aberkae  1 year ago  +140

      If the AMD 7000 series gets DP 2.0, then Nvidia will become peasant status! 😜
      Nvidia is long overdue for an ego correction!

    • @verakoo6187  1 year ago  +9

      Wouldn't that be on Zotac in this case? Pretty sure the manufacturer chooses which ports to use.

    • @master_baiter1873  1 year ago  +65

      @@verakoo6187 no.

    • @tyleryoungman50  1 year ago  +49

      @@verakoo6187 no, Nvidia locked it down

  • @theonlyremedy  1 year ago  +5

    That thing looks like a small skateboard, just add the wheels!

  • @edhikurniawan  1 year ago  +1

    I like how Linus seems to know how much a kidney is worth on the underground market.

  • @squirrel9999  1 year ago  +129

    The first ever computer was over 300 kg; looks like we are moving in the wrong direction

    • @RadioactiveBlueberry  1 year ago  +16

      299 kg of that was not for cooling though

    • @moortu  1 year ago  +20

      The chips keep getting smaller.
      It's the cooling that is getting bigger again.

    • @squirrel9999  1 year ago

      @@RadioactiveBlueberry True, but shhhhh

    • @berengerchristy6256  1 year ago

      @@moortu Are these chips actually smaller, though? The problem is that transistor count keeps increasing while die size stays the same or shrinks

    • @itsahandle  1 year ago

      You're just considering weight; what about the sheer computational power 🤔

  • @DrModsQ  1 year ago  +218

    This GPU looks like it needs its own case to be held properly without ripping the PCIe connector off the motherboard

    • @LuLeBe  1 year ago  +9

      Seriously, how do they make sure the card's PCB doesn't just snap? I'm less concerned about the motherboard: it's loaded along its length, not perpendicularly, and the PCIe connector is soldered in with so many pins that it should hold up fine.

    • @justjoe5373  1 year ago  +11

      That's what I said on another guy's video: you need a construction licence to build a pillar for this card

    • @959tolis626  1 year ago  +8

      @@LuLeBe First off, it's "perpendicular to", since you seem to be wondering (just trying to be helpful, not sarcastic; I don't know how my comment comes off).
      Now, if you've ever worked on a PCB, you'd know these things are tough as shit. They're just layers of metal on top of each other, really. I tried to drill through a CPU once (I killed it by mistake and chose to make a keychain out of it instead of making e-waste) and it broke my drill bit. I thought the PCB couldn't be that tough. Wrong. I was worried about the IHS, as I didn't have a proper drill bit on hand for metal, but it chewed through the IHS easily and quickly. Then I thought the PCB would be a piece of cake, aaaaaand it wouldn't drill through and I snapped my bit in half. I had to borrow another bit to finish the job (while keeping the same hole size).

    • @fynkozari9271  1 year ago  +9

      450W, that's more than my entire PC: 65W Ryzen 1700, 175W RTX 2070, 65W monitor (1440p 144Hz LG 32GK650F).

    • @TheAdatto  1 year ago

      Depends. If the stiffness is good it should stay straight, and the weight distribution should be okay

  • @PecanPie745  1 year ago

    I'm waiting for the width of these new high-end GPUs to need two PCIe slots to mount in a case: the first slot is the actual data slot, while the second is for support to avoid GPU sag.

  • @seananderson4107  1 year ago  +588

    I highly support EVGA's move to drop Nvidia.
    And more people need to do that if Nvidia is going to listen and make changes.

    • @TheArsenalgunner28  1 year ago  +64

      It's been over 10 years… and Nvidia still isn't listening. They're like the 'Putin' of the graphics card market.

    • @skullkid456  1 year ago  +2

      lol lmao

    • @edragyz8596  1 year ago  +6

      Intel's Arc isn't quite good enough to drop Nvidia yet, especially since the main game I play is DX11 only...

    • @TheArsenalgunner28  1 year ago  +9

      @@edragyz8596 I mean, I don't even want Arc to drop Nvidia. What would be really interesting is if Intel basically matched older-generation GPU performance and offered it as an alternative to the latest, greatest, and most expensive GPUs.
      I mean, in 3 years the 3070 is still gonna be great for most games, and Intel could simply aim lower and offer entry-level 1080p/1440p cards that don't compete with the best but offer solid performance for a lower price.
      Like, I am not looking forward to my GPU dying in 2-5 years and having to upgrade to a brand-new brick that rips my wallet's contents to pieces. I just want an instant replacement priced in the £250-£429 bracket.
      It's definitely going to be something that develops further as prices continue to go up at the top end

    • @edragyz8596  1 year ago  +2

      @@TheArsenalgunner28 That's all fair, but it seems Intel is the only one who cares about beating Nvidia in the feature game. If Intel doesn't release an xx90 killer at some point, I'll still be stuck buying an xx90, because AMD just keeps MISSING.

  • @FnkDck  1 year ago  +128

    Yo Linus, thanks man. When you smashed the GPU on the table and bent the cables aggressively, I was reminded why I still watch you after all these years.

    • @NahBNah  1 year ago  +4

      Yea… Jesus Christ, but thank god it's only a Zotac

    • @uriNATE14  1 year ago  +1

      Lmao! I was thinking about how his handling of tech, and GPUs specifically, hardly fazes me anymore 😂 it used to make me so uncomfortable lmao!

  • @AakashParihar_  1 year ago  +1

    The thumbnail is so goofy , I love it 😂😂😂

  • @mrdan2898  1 year ago  +3

    Man, that's HUGE! I would think this graphics card needs a special extra-large case.

    • @Gaagaagoogoo677  1 year ago

      You would be correct, my friend! Did an upgrade and the card is so massive it's almost touching my front fans. Mid tower.

  • @novantha1  1 year ago  +325

    Not sure about other people, but I'd be really interested to see a "we made a computer more efficient than a console" revisit, where you take a 4090 and undervolt it to be about equal to a 3090 and see where it ends up power-wise. I'm actually super curious what happens when you don't go completely overkill with this generation's cards

    • @SolarisUK  1 year ago  +33

      You can undervolt a 3080 to draw 100W less power and it only loses around 1 to 2% performance.

    • @gamepadlad  1 year ago  +9

      @@SolarisUK Now this is awesome. Any video on that? Would love to watch something like that

    • @SolarisUK  1 year ago

      @@gamepadlad ruclips.net/video/FqpfYTi43TE/видео.html

    • @cyphercracker  1 year ago  +14

      @@gamepadlad Silicon lottery. Usually you can undervolt so it draws 80-100W less in Afterburner without even touching the core or VRAM.

    • @cyphercracker  1 year ago  +6

      And don't underestimate the power of 2K resolution. I much prefer 2K and higher settings to make the environment more epic 🙂👍

  • @joe-ni3zc  1 year ago  +179

    2:26 yeah dude, I also love the Radeon 5900x, my favorite cpu by far, the arc 12900hk isn't bad too

    • @notvisibleconfusion  1 year ago  +3

      LMFAOOOO

    • @COE-Ender  1 year ago  +19

      I had to replay that part myself just to make sure I heard Linus right

    • @1syncgg  1 year ago  +15

      Ryzen, Radeon, RTX, Ratatouille... a lot of R's.

    • @MrReklez  1 year ago  +5

      Lmfao, I was like, did I miss a new CPU release or something? There was no * to correct it either.

    • @TimSheehan  1 year ago  +3

      I'm just entertained Nvidia is recommending you pair it with an AMD CPU, though the lower-than-Intel power consumption was probably more important than Intel not competing with their GPUs

  • @shanecreamer6889  1 year ago

    The real question for me, trying to sift through the DLSS / RT information, is:
    Which features can be enabled for games that don't natively support them? I have a 3080 doing just fine at 2560x1080 on a 36" ultrawide, but I'm naturally interested in whether the 4090's DLSS can be manually enabled and tuned for games that don't natively support it, to gain faster frame rates and use the whole potential of this expensive GPU.
    Example:
    If I upgrade to a 49" 5160x2160 ultrawide monitor and try to force a 4090 into DLSS for better frame rates, can I manually tune and enable it?
    If there is no method to force supersampling for increased frame rates, would it do any good to buy a 4090 to play Thief 2015, Dishonored 2, Deathloop, Elden Ring, etc., games that don't support it?

  • @mkine  1 year ago

    Hey guys, I have a real question:
    What actually changes when you turn on energy saving mode on your phone?
    I have a Pixel 5, and when I turn on battery saving mode I feel like nothing changes, except that it actually has longer battery life compared to when I have it off. Subjectively, everything works just as fast. Is this black magic?

  • @WDCallahan  1 year ago  +195

    I like the part at 6:08 where he says that being able to change the game could be a game changer.

  • @builtofire1  1 year ago

    The card has mesh at the back to blow out hot air, but the fins are covering it. No problem: 3.5 slots of height is the solution, just throw more heatpipes at it

  • @ProfessorGears_LT  1 year ago  +1

    I like how they release new GPUs when I can't even get my hands on a 1660 in the stores....

  • @Scarbuckks  1 year ago  +477

    I remember when a GPU needing an 8-pin was insane, but 12 is ridiculous

    • @XGalaxy4U  1 year ago  +21

      It has 32 pins for your power supply. I'm not sure what that other dongle is for; Linus says it's for communication with the power supply. Probably not needed. My 800-watt PSU has four 8-pins. It may run it, but not with a hot CPU.

    • @usagihakushaku5338  1 year ago  +17

      @@XGalaxy4U Soon you'll need your own wind turbine to be eco-friendly, or a wind turbine subscription: a few people borrow one and split the bill

    • @BaSiC47  1 year ago  +9

      I remember when 500€ for a GPU was insane

    • @VioletGiraffe  1 year ago  +5

      I remember top-tier GPUs costing $400. That's how much the brand-new 2900XT that I bought with my university scholarship savings cost.

    • @vasilije94  1 year ago  +1

      I mean, I remember when we didn't even need extra PSU connectors for the GPU, or better yet, when we didn't have GPUs to begin with. Times change, technology moves forward.

  • @rodh1404  1 year ago  +238

    They're only charging a kidney for these? Amazing, I thought they'd cost an arm and a leg.

  • @TheLiberator455  1 year ago

    Will there be slimmer and shorter versions? Because I feel that even with a prop it won't be enough to support it without it sagging from the weight

  • @keiro8364  1 year ago

    Could you guys show the machine learning performance, since that is what I am interested in? Possibly using a CNN or GAN and timing the training period?

  • @Jahus  1 year ago  +359

    I love the EVGA model. The RGB is sick, and those fans are… man, they are quiet!

    • @pickingyoe2992  1 year ago  +7

      hah

    • @jondonnelly4831  1 year ago  +49

      Since EVGA won't be putting it into production, it will be very quiet.

    • @zulchemical  1 year ago  +19

      @@jondonnelly4831 insert Captain America meme

    • @2312uri  1 year ago  +9

      I've heard that the power consumption is incredibly low

    • @JJHazy  1 year ago  +5

      It does have some insane technology; in addition to RGB channels it even has an alpha channel!

  • @farhadaa  1 year ago  +35

    Linus, carefully handling the Arc A770.
    Also Linus, desk-slamming the RTX 4090.

    • @ZaHandle  1 year ago

      An aluminium brick is pretty impact resistant

  • @erfannazarian  4 months ago

    I'm between the Zotac 4090 Trinity and the MSI Gaming Trio X. Right now in my country both of them are the same price... which should I choose?

  • @mohamadadeyn2837  1 year ago  +4

    Wow, Zotac does it again with the coolest design ever!

  • @Akkbar21  1 year ago  +53

    The PCI Express connector needs to be redesigned as a cable, and you just mount the module inside your case like a PSU.

  • @StevieThundr  1 year ago  +195

    I hope AMD will be laughing all the way to the bank in November... for our sake

    • @ntdspades5971  1 year ago  +10

      Definitely skipping at least the 40-series cards, because this shit's crazy

    • @512TheWolf512  1 year ago  +9

      And Intel, too, with prices 5 times lower than this absolute spit in the face.

    • @zimashe5446  1 year ago

      🤣

    • @chitorunya  1 year ago  +5

      @@512TheWolf512 and like 7x weaker performance, tbf

    • @qshank2752  1 year ago  +1

      @@ntdspades5971 I reckon at this rate we'll be waiting for the 60-series cards. I find it hard to trust AMD/ATI, as every GPU I've had from them has either failed or had driver issues. Yes, that may have been fixed, but I don't trust them in my work PC after so much lost data from random shutoffs/dead GPUs. I would never pay this much for a GPU either, though, so the waiting game begins.

  • @TurboMalibuV6  1 year ago

    Available at your favorite online scalping outlet stores and auction sites!

  • @spin_kick  1 year ago

    At what size do you start saying that you connected the motherboard to the video card?

  • @Answerx32  1 year ago  +177

    That cooling system is freakin' huge. I bet you can heat your entire house with that card.

    • @timm1583  1 year ago  +18

      Probably not the houses that the people who get these live in

    • @bstaznkid4lyfe392  1 year ago  +3

      This thing consumes 450 watts... the only card that uses that much power..

    • @fynkozari9271  1 year ago  +2

      @@bstaznkid4lyfe392 The 3090 Ti is also 450 watts.

    • @MistyKathrine  1 year ago

      Space heaters are obsolete; just buy a modern computer.

  • @GamePlague  1 year ago  +351

    I think we could use a generation where the focus is on making things smaller and more power efficient, with only minimal performance increases.

    • @scotthadley92  1 year ago  +19

      No we don't

    • @Tarets  1 year ago  +49

      @@scotthadley92 We don't what?

    • @moi3848  1 year ago  +14

      Yes, but no one would buy it. Do you really want to pay more for efficiency when, at current electricity prices, no typical user will ever see the difference on their electricity bill?

    • @moi3848  1 year ago  +4

      @HackerMode If you want a cheaper, more efficient card, just go with a lower-tier card of the new generation (the 4070 is a bit like a more efficient and cheaper 3080-3090). I get the point of stopping the power-hungry cards, but I think with the current market you can already do that if you go with a smaller graphics card.

    • @DarcyFerrier  1 year ago  +3

      I think what you're trying to say is that we should focus on making these cards more efficient overall, instead of just more efficient per watt. Ultimately, when this card is pulling 400+ watts, that's a shit tonne more than the majority of other cards on the market, but the performance gains don't necessarily reflect the power draw. I totally understand!

  • @idwithheld5213  1 year ago  +1

    If it only needs 450W, why does it have four 8-pin connectors at 150W each? Three 8-pins plus the PCIe slot (75W) would already be 525W.

  • @bikesandbrass  1 year ago

    6:20 "could be an absolute game changer"
    I see what you did there........

  • @klasztornik847  1 year ago  +70

    Looking at those coolers, I'm actually expecting the fans to be oriented sideways at some point. If you're already 4 slots wide, you might as well exhaust out of the back of the case.

  • @TheNextBigThingHD  1 year ago  +97

    I've never been less excited for a GPU launch...

    • @eliashabash7591  1 year ago  +5

      Because you can't afford the 4080 and "4070".
      It has nothing to do with this particular card.
      Its pricing is in line with last gen.

    • @OutOfNameIdeas2  1 year ago  +30

      @@eliashabash7591 Mate, that's not the issue. It's just a stupid buy any way you look at it. And Nvidia is literally worse than Apple, so it hurts to support them.

    • @LuLeBe  1 year ago  +7

      @@eliashabash7591 Yeah, the 4090 is in line with 3090 pricing, but the 3000 series promised much bigger improvements over the 2000 series than this does over the 3000. So it made sense that the 3000 series was really expensive, because it was such a big step. That launch was 2 years ago; by now the prices would usually (without miners) sit at like $550-600 for a 3080 (MSRP was 699). Compared to that, the price has really gone up. And you know that actual prices will be even higher than MSRP.
      I remember when a friend got a 580, it cost like $470 new. My 970 was 299. My 2070 was around $420. And the current 4070 costs $900 now???

    • @hypewilkens  1 year ago  +1

      @@LuLeBe It's ludicrous..

    • @Killswitch1411  1 year ago  +3

      @@SpartanArmy117 Should have told Germany that when they relied on the Russians for their energy.

  • @ChestnutandBea  1 year ago

    What is that awesome house light thingy to your right in the video? I want one!

  • @Davethreshold  1 year ago

    So WHEN will everybody be able to test this and show us the results?

  • @BrentJohn
    @BrentJohn Год назад +85

    A heatsink _that size_ is not just ridiculous, but bordering on impractical. Not just in the sense of space, but the weight of the card as well, which is getting into requiring a support bracket to keep it from coming out of the socket.
    It would make a hell of a lot more sense to turn it into a kind of AIO with the radiator/fan attached elsewhere, or redesigning it to fit on the card, if possible.

    • @gamingmarcus
      @gamingmarcus Год назад +12

      Achtually a ton of them will ship with support brackets. Go watch the Gamers Nexus video about AIB 4090 announcements. The marketing they came up with for their GPU support brackets is hilarious.

    • @joshuaadams4945
      @joshuaadams4945 a year ago +5

      Right, when you implement a solution to fix a thermal problem, you introduce a new issue. I want to design a new GPU bra. Make it all lacy and colorful. Or a Swedish-model pump-style novelty GPU riser.

    • @Mart-E12
      @Mart-E12 a year ago +5

      They're making ones with AIO radiators, like the MSI Suprim 4090

    • @BrentJohn
      @BrentJohn a year ago +1

      @@gamingmarcus Yeah, I saw. It's stupid to require a bracket for this.

    • @gamingmarcus
      @gamingmarcus a year ago +5

      @@BrentJohn My next card will be getting some support no matter what.
      I have a 10-series card in my system, around 5 years old, and despite being a relatively small 2-slot card it has developed some noticeable sag over the years because it has no support at the back. So this isn't just an issue with chunky cards.
      I don't know if this would ever create real problems down the line, but there's hardly any effort in putting a 10 cm piece of plastic under your GPU, so I'll just do it.

  • @mndlessdrwer
    @mndlessdrwer a year ago +66

    I'm amazed that Zotac is still around. Amazed and grateful, because they're one of the few companies that still get weird with their product designs. Need an overpowered and very short GPU? Check Zotac, they probably have it. Need a low-profile GPU? Check Zotac.

    • @tibor29
      @tibor29 a year ago +1

      I probably just got unlucky, but a Zotac GTX 260 was the only GPU that ever randomly died on me, and the warranty process was a nightmare. I don't think I even got a replacement in the end. I haven't trusted Zotac since, but to be fair, that happened like 14 years ago.

    • @karroq
      @karroq a year ago +7

      I wasn't grateful when they cranked their MSRPs to scalper levels and were selling $700 3060s, $1200 3070s, and $2500 3090s on their own website during the shortage. It showed how much they cared about their customers.

    • @mndlessdrwer
      @mndlessdrwer a year ago +2

      @@tibor29 Yeah, DOA products and other faulty units happen to every manufacturer. It's a real shame that EVGA got burnt out on the graphics card market, because they legitimately had the best customer support for their stuff. I appreciate Zotac more for offering unusual products than for their overall quality.

    • @AmrrowSGG
      @AmrrowSGG a year ago

      I have a Zotac OC Mini RTX 2070 and it works great; I haven't had any problems with it. Zotac has become a better company than they used to be. Their customer support has improved after a lot of complaints, and the quality of their GPUs has gotten massively better over the years because consumers complained about it. Zotac has clean-looking cards and funky-looking ones, such as the one in the video.

    • @KenS1267
      @KenS1267 a year ago

      Zotac has a history of doing low-profile cards, fanless cards, and all kinds of stuff, but look at reviews before you buy; there has also been some really bad-quality stuff from them from time to time.

  • @Phurios1
    @Phurios1 a year ago

    I will be waiting for a mini-ITX build inside a 3D-printed RTX 4000 series case (or something along those lines).

  • @cristi724
    @cristi724 a year ago

    Do you need to pay extra shipping fees for heavy packages when you order one?

  • @andrewtrumbower
    @andrewtrumbower a year ago +11

    Anybody else getting old-LTT nostalgia with this unboxing? Love the current main-channel stuff, but it's nice to have that feeling of old comfort once in a while with Linus

    •  a year ago +1

      From what I remember, that was the whole intention of creating this channel: to have a place where their old style could continue, since they want deeper dives and pieces that tell a story on the main channel.

  • @Artaxo
    @Artaxo a year ago +12

    I got so confused by the "Radeon 5900X" CPU (2:26), but it was just a mistake. I started questioning everything I knew.

    • @halotravis6910
      @halotravis6910 11 months ago

      Lol, same! I was looking for this.

  • @andreww2098
    @andreww2098 a year ago

    How long before they use more than one PCIe slot to get the transfer rate needed?

  • @TekMoliGy
    @TekMoliGy a year ago

    Does the new power connector mean I need a new PSU? I never built a PC, but I bought one prebuilt with a 1200-watt power supply and my 3090, hoping it would be future-proof

  • @squeeh
    @squeeh a year ago +50

    All hail the 7-slot card. Imagine the cooling fans you could fit and the amount of cooling you could produce...

    • @Dukes3677
      @Dukes3677 a year ago

      whole case cooling xD

    • @itznotmytube
      @itznotmytube a year ago +1

      They could use the "Blowzooka" fans from Delta Electronics that they tested on LTT :D 7400 CFM or so!

  • @MrMegaPussyPlayer
    @MrMegaPussyPlayer a year ago

    10:12 It could then also plug into more than one PCIe slot and not only draw less power over the flawed connector, but also be better secured.

  • @TechDove
    @TechDove a year ago +37

    With the introduction of these 40-series cards, I think I'll get a 20 or 30 series

    • @blackcitadel37
      @blackcitadel37 a year ago +3

      Most people don't even need a 30-series GPU, let alone a 40-series monstrosity like that.

    • @Austin-bc1yh
      @Austin-bc1yh a year ago

      The 2060 Super is highly recommended. It runs most games at 70+ fps at 1440p ultra (I wouldn't ray trace, though), and worst case you drop the resolution to 1080p or reduce the quality of distant shadows, since they're pretty taxing for something you'll never notice unless you go Batman-detective-mode on your monitor over something as trivial as far-away shadows. Also, after doing a little research, I found the 2060 Super is about on par with the 3060: the 3060 handles lighting better, while the 2060 Super runs calculations, physics, and everything else slightly better, and they're around the same price. If you get a 20 series, get the 2060 Super; if you want better than that, go for the 3070 tier or above, otherwise you're wasting money

  • @odenz91
    @odenz91 a year ago

    I WANT TO SEE A TEARDOWN VIDEO. I searched but couldn't find one. Please make a teardown video of the RTX 4090

  • @shermanwellons
    @shermanwellons a year ago

    Going to get a box that houses 2 of these. Need them for Cinema 4D + Redshift.

  • @rubikfan1
    @rubikfan1 a year ago +3

    3:16 Why are they still using air? With this much power, shouldn't we move to water as a baseline?

  • @tjammed7844
    @tjammed7844 a year ago +8

    Pretty soon we are going to be installing motherboards onto the GPU instead of the other way around.

  • @shi-enl.7046
    @shi-enl.7046 a year ago

    Linus: *smacks a $1,600 graphics card on the table*
    My heart: *skips a beat*

  • @issarolifewithyou
    @issarolifewithyou a year ago +1

    Hopefully DisplayPort 2.0 appears on the RTX 4090 Ti. If Intel can do it on their Arc GPUs, surely Nvidia can.

  • @yourvenparianen5390
    @yourvenparianen5390 a year ago +30

    "Zotac recommends a 1000W power supply", ahh, finally, I was looking for a nice space heater

  • @darkminer8675
    @darkminer8675 a year ago +33

    As much as I kinda like the new power connector for the RTX 4000 series, I kinda wish they had placed it on the back side of the card (similar to the RTX 2060 FE and some Quadro cards) for a cleaner look inside the case in terms of cable management

    • @bthatguy1181
      @bthatguy1181 a year ago

      Why can't they just add it to the mobo? It's not like this is a new issue; why not just add compatibility to the motherboard?

    • @HVDynamo
      @HVDynamo a year ago +1

      @@bthatguy1181 Because the PCIe slot can't handle the power needed either.

    • @bthatguy1181
      @bthatguy1181 a year ago

      @@HVDynamo They could come up with a standard connector that fits right after it to handle the watts needed.
      It's not like PCIe hasn't been changed again and again either.

    • @1steelcobra
      @1steelcobra a year ago

      I think the issue there is the extra wiring required to extend the power connection to the board from the end, instead of surface-mounting the connector directly to the board.
      I think GN had complaints about how the 2060 FE's connector made teardown a lot more difficult as well.

    • @88porpoise
      @88porpoise a year ago

      @@bthatguy1181 Then you need to get PSU, MB, graphics card, and likely case designers in line for it.
      And you probably can't be backwards compatible. You can adapt old PCIe power connectors to the new one; it may not be ideal, but it is viable.
      Then you have to consider managing it for vertical mounting, etc.
      And MB makers would need to include traces capable of carrying all that power from where it enters the MB to the PCIe slot.
      All in all, it would be a lot more complicated to implement for a small aesthetic benefit that a huge portion of their customers wouldn't care about.

  • @geeyan8162
    @geeyan8162 a year ago +1

    1:58 Linus: "...this graaaaaa~phic (heavy) card..." 🤣🤣

  • @tobiasvanderkaa3536
    @tobiasvanderkaa3536 a year ago

    I hope there will be more efficient options in a few years

  • @Bandrosonk
    @Bandrosonk a year ago +5

    2:32. The Radeon 5900X is my favorite CPU! Much better than Ryzen!

  • @darthgonk7741
    @darthgonk7741 a year ago +5

    RTX 6090: requires a separate 1000-watt power supply with a 96-pin PCIe cable and a separate case with 20 fans installed (for average performance, of course; full performance requires more extensive measures)

    • @boastfultoast
      @boastfultoast a year ago +1

      No, like, straight up, this stupid stuff is gonna happen if they keep going in this direction; it's so lame for consumers

  • @paulbrand7676
    @paulbrand7676 a year ago

    When he revealed the price, my first thought was: "Well, that's actually not too bad." The last year was rough.

  • @AndreHansen96
    @AndreHansen96 a year ago

    0:36 You almost sounded more Super Mario than Chris Pratt there, Linus. Nice!

    • @AndreHansen96
      @AndreHansen96 a year ago

      @Bolia Fops wtf does that have to do with my comment?

  • @Urboyfromfuture
    @Urboyfromfuture a year ago +33

    Looking at the size of this, I think it's the perfect time for case manufacturers to start experimenting with different mounting positions for GPUs. I know Lian Li has 2.

  • @gregoriodia
    @gregoriodia a year ago +7

    Having a Zotac 3080, this made me happy; nice to see Linus considers that the 3rd-best thing ;-)
    Scored one a week after the 3080 launch at pretty much MSRP. It's been serving me very well since then.

  • @antilogic81
    @antilogic81 a year ago

    I was gonna say the module idea is a very likely outcome. I guess we will see

  • @C0deMasterYT
    @C0deMasterYT a year ago

    When I eventually do a major overhaul on my PC, I'm probably going to get a top-of-the-line GPU if possible. I'll basically be rebuilding my PC all over again when I do that.

    • @Frawt
      @Frawt a year ago

      ok boomer

  • @kyleduddleston4123
    @kyleduddleston4123 a year ago +55

    I laughed too hard when he whipped out the Zotac Gaming box after explaining the cool ideas he had. I needed a good laugh today. 😂

  • @stekelly1980
    @stekelly1980 a year ago +4

    I think a new PC motherboard layout will be needed in the future, and new internal case designs too. GPUs are the dominant thing inside a case now!

  • @jenkem4464
    @jenkem4464 a year ago +1

    Oh man, those thunks on the table sound expensive!

  • @barcsaysz
    @barcsaysz a year ago

    You've got almost 100 members on your team and one still messed up the color grading in this video, especially on your hands :D The info, btw, is much respected! :)

  • @DJdoppIer
    @DJdoppIer a year ago +31

    I don't know why, but I like the design of the Zotac cooler. Still wouldn't even consider getting an RTX 4090 card though (my electric bill is bad enough as-is).

    • @drumyogi9281
      @drumyogi9281 a year ago

      What do you get charged per kWh? Here in California it's on average 21c/kWh, which at 750 W for 4 hours a day is about 63c a day (who plays that much every day, though, and even 63c a day is still not that bad).
      That's hypothetical, though, as you're not going to be consistently drawing that much power over 4 hours of screen time, even while gaming. You'll be drawing much less.
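For anyone who wants to redo that math with their own electricity rate, here is a quick sketch. The specific numbers (750 W draw, 4 hours a day, $0.21/kWh) are just the thread's hypothetical figures, not measured 4090 power draw:

```python
# Rough daily electricity cost of running a GPU at a constant load.
# 750 W, 4 h/day, and $0.21/kWh are the thread's assumed numbers.
def daily_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_day = watts / 1000 * hours_per_day  # energy used per day, in kWh
    return kwh_per_day * price_per_kwh

print(f"${daily_cost(750, 4, 0.21):.2f} per day")  # 3 kWh * $0.21 = $0.63
```

Real-world cost will be lower, since a card rarely sits at its full power limit for an entire session.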

  • @mundzine
    @mundzine a year ago +10

    EVGA's 4090 would have been enormous

  • @Krashulka
    @Krashulka a year ago

    That MSFS clip with and without RT looked exactly the same to me......

  • @kwinzman
    @kwinzman a year ago

    Thanks for mentioning the legacy DisplayPort disaster!

  • @bexgreen1476
    @bexgreen1476 a year ago +5

    Linus may have mentioned the 7-slot-wide card, but how about a card that plugs into 2 PCIe slots?

  • @onlymemes1329
    @onlymemes1329 a year ago +3

    1:12 Linus, what happened to your finger?

  • @michal_king478
    @michal_king478 a year ago

    Fun fact: it can consume way more than 450 W. The specified power figure often covers only the chip itself, while the rest of the board can take another 100 or 200 watts. My 2080 Ti has a TDP of 250 W but will actually draw 340 W max

  • @rawdog7220
    @rawdog7220 a year ago +1

    I'm still on Vega 64. I bought 2 for $500 a couple of years ago; still great in all my games and still good for Blender rendering (hence 2 of them).

  • @johnkeane320
    @johnkeane320 a year ago +26

    I can't wait to build a PC inside my 4090 at the rate these GPUs are increasing in size

  • @user-ki1wq6pt9u
    @user-ki1wq6pt9u a year ago +17

    Man, the Zotac AMP Holo 4090 is insane looking

    • @garienza
      @garienza a year ago

      It's the same card but with RGB bling

    • @PenaEnrique
      @PenaEnrique a year ago

      it looks like a giant soap bar

  • @saleh3521
    @saleh3521 a year ago +1

    So the 3090 Ti and the 4090 both have the 12VHPWR plug, but with the 3090 Ti the adapter was just a 12-pin adapter without the 4 extra pins (the same adapter as the normal 3090, minus one 8-pin), while the 4090 has the 12+4-pin adapter. So I'm wondering: if the 3090 Ti, which has the same plug as the 4090, worked without those 4 pins, shouldn't the 4090 do the same? Or do you NEED those 4 pins? The 3090 Ti and 4090 PCBs are not that different, so in theory, what worked on the 3090 Ti should work on the 4090.
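On the "do you NEED those 4 pins" question: as I understand published summaries of the ATX 3.0 / PCIe CEM 5.0 spec, two of the four sideband pins (SENSE0/SENSE1) tell the card how much power the cable is rated for, so a card that honors them should throttle to 150 W if both are left open. A rough sketch of that encoding (treat the exact table as an assumption from those summaries, not an authoritative spec quote):

```python
# Sketch of how the 12VHPWR sideband "sense" pins advertise cable capacity,
# per published ATX 3.0 / PCIe CEM 5.0 summaries (values are an assumption,
# not a spec quote). Key: (SENSE0, SENSE1); True = pin tied to ground.
SENSE_TO_WATTS = {
    (True,  True):  600,   # both grounded: full 600 W cable
    (False, True):  450,
    (True,  False): 300,
    (False, False): 150,   # both open (or pins absent): minimum 150 W
}

def advertised_limit(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Power limit (watts) the card may draw from this cable."""
    return SENSE_TO_WATTS[(sense0_grounded, sense1_grounded)]

print(advertised_limit(True, True))   # 600
```

Which would explain why the 4090's adapter wires up the extra pins even though the 12 power pins are physically the same as on the 3090 Ti.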

  • @SmiFF_
    @SmiFF_ a year ago +1

    5:34 ...compared to which gen GPU?! Based on what hardware was that claim made?! I never saw that demo run on old-gen tech, so take those words with a grain of salt :/

  • @MRDJBURNS
    @MRDJBURNS a year ago +10

    Will be waiting for the second gen of Arc or a price drop on the 30 series before I upgrade from my 20-series GPU. I like having kidneys!