Is the fastest GPU ALWAYS the best? RTX 4090 Review

  • Published: 22 Aug 2024

Comments • 6K

  • @LinusTechTips
    @LinusTechTips 1 year ago +2342

    CORRECTION: We're working on updating the video, but in the meantime: our numbers for Cyberpunk 2077 were with FidelityFX Upscaling enabled. We specifically *didn't* have this enabled, but stability issues with the bench seem to have messed with the settings. We've re-run all of the numbers for each card:
    *No RT, no DLSS (1%low, 5%low, avg):
    - RTX 4090: 54, 69, 81
    - RTX 3090 Ti: 43, 46, 56
    - RTX 3090: 35, 43, 50
    - RX 6950 XT: 30, 39, 46
    *RT, no DLSS (1%low, 5%low, avg):
    - RTX 4090: 36, 39, 44
    - RTX 3090 Ti: 17, 22, 26
    - RTX 3090: 16, 19, 23
    - RX 6950 XT: 10, 11, 13
    *RT + DLSS (1%low, 5%low, avg):
    - RTX 4090: 94, 97, 108
    - RTX 3090 Ti: 58, 60, 67
    - RTX 3090: 52, 53, 61
    - RX 6950 XT: N/A
    -AY
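
A quick sanity check on the corrected table, as a minimal Python sketch (the average-FPS figures are taken verbatim from the pinned comment above; the uplift percentages are just derived arithmetic):

```python
# Generational uplift of the RTX 4090 over the RTX 3090 Ti, per scenario,
# using the corrected Cyberpunk 2077 average FPS from the pinned comment.
corrected_avg_fps = {
    "No RT, no DLSS": {"RTX 4090": 81, "RTX 3090 Ti": 56},
    "RT, no DLSS":    {"RTX 4090": 44, "RTX 3090 Ti": 26},
    "RT + DLSS":      {"RTX 4090": 108, "RTX 3090 Ti": 67},
}

for scenario, fps in corrected_avg_fps.items():
    uplift = fps["RTX 4090"] / fps["RTX 3090 Ti"] - 1
    print(f"{scenario}: +{uplift:.0%}")
# No RT, no DLSS: +45%; RT, no DLSS: +69%; RT + DLSS: +61%
```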

    • @BayareaXotics
      @BayareaXotics 1 year ago +49

      Heyyy daddy Anthony

    • @zimnium1
      @zimnium1 1 year ago +4

      Hey 5 minutes early to the comment:)
      And hey Anthony👋

    • @ProjectSmithTech
      @ProjectSmithTech 1 year ago +237

      I knew it, cyberpunk was way too high. Thank you for the correction.

    • @Keemochi420
      @Keemochi420 1 year ago +25

      Guess I'm staying with my RTX 3080 then, I'd wait for the 50 series

    • @moochoopr9551
      @moochoopr9551 1 year ago +84

      Needs to be pinned

  • @danielmaurel3195
    @danielmaurel3195 1 year ago +12325

    It's impressive how a $600 GPU is a bargain now... we need serious competition in the GPU market

    • @HerroYuy246
      @HerroYuy246 1 year ago +746

      INTEL

    • @Laesx
      @Laesx 1 year ago +342

      I got a 3080 ti FE for that price and it truly was a bargain

    • @JoeWayne84
      @JoeWayne84 1 year ago +35

      This comment was posted 2 minutes after the video was released

    • @baseplate_462
      @baseplate_462 1 year ago +500

      People who work on a budget, or anyone who isn't in an industry or job that might require the latest and greatest, should not be foaming at the mouth to buy a new GPU. 2060s are available for sub-$300 on Amazon and 3060s regularly at $400. I sit here doing game development, Blender work, playing my favorite games, streaming my favorite content, all on a $270 2060.
      I'm generally baffled at the people who look at GPUs that SHOULD cost $1k+ and go "it's not fair I can't buy that" when 2060s and 3060s exist. It's like looking at your 2018 Ford Focus and going "I can't believe they are selling Lambos at that price, now I'm never gonna be able to get one." AS IF YOU NEED A LAMBO TO GET TO YOUR JOB AT BURGER KING. Your Ford Focus is reliable, and when it breaks down, unlike a Lambo, it isn't the end of the world.

    • @samson_the_great
      @samson_the_great 1 year ago +113

      600 is what a responsible adult should be making in 3 days. If you can't afford a GPU you shouldn't even be on YouTube right now.

  • @Neoxon619
    @Neoxon619 1 year ago +4849

    As powerful as this is, I honestly can’t justify the price. Not to mention the cooling & space requirements for the 4090. And the fact that the 4000 series doesn’t have DisplayPort 2.0 is legitimately baffling.

    • @Zapdos0145
      @Zapdos0145 1 year ago +408

      Wait, are you serious, it DOESN'T have DP 2.0?

    • @2ndcitysaint52
      @2ndcitysaint52 1 year ago +285

      Huh?? It's way easier to justify this price vs the 3090 or 3090 Ti. I'm sorry, but if you are someone who always buys the bleeding-edge GPUs, this is the first time it doesn't feel terrible to upgrade. Yeah, the DP 2.0 thing is shitty, but the uplifts at 4K are pretty nice here.

    • @fortwentiblazeit4177
      @fortwentiblazeit4177 1 year ago +122

      Isn't this video posted like 8 minutes ago? How are you able to finish the whole 16-minute video in 4 minutes :o

    • @Neoxon619
      @Neoxon619 1 year ago +57

      @@Zapdos0145 Nope, it’s still 1.4.

    • @xXRealXx
      @xXRealXx 1 year ago +69

      @@fortwentiblazeit4177 by watching at 2x speed or higher

  • @kabiro2151
    @kabiro2151 1 year ago +2161

    I have never wanted Intel to succeed so bad.

    • @thatdogeguy9108
      @thatdogeguy9108 1 year ago +117

      Or amd

    • @euko6876
      @euko6876 1 year ago +36

      Lmao same. It doesn't feel right 😄

    • @dennismwangangi
      @dennismwangangi 1 year ago +7

      Out of context: I tried displaying on 2-3 screens using an Nvidia GPU, and it brought stability issues that could result in BSODs, but then I bought USB display adapters (Intel-enabled) and they were stable; no issues arose.

    • @euko6876
      @euko6876 1 year ago +11

      @@dennismwangangi Doesn't sound like you ultimately fixed it or figured out what was wrong. Just used an alternative method to use the GPU.. but okay 👍🏽

    • @f2pgpb993
      @f2pgpb993 1 year ago +5

      @@thatdogeguy9108 AMD is on the same pricey path as Nvidia, sadly. They wouldn't be if their old policy from the 4870 era were still on course; I remember at the time they had a policy of making PC video cards cheaper.

  • @Yomismo28
    @Yomismo28 1 year ago +1713

    I remember buying a 1060 for around 300€ and my parents mocking me for spending so much on it. The look on my mother's face when I told her about these GPUs' prices was priceless

    • @Zach014G
      @Zach014G 1 year ago +89

      i bet she took back that statement now LOL

    • @AdrenResi
      @AdrenResi 1 year ago +2

      almost the same story here

    • @vuri3798
      @vuri3798 1 year ago +67

      The first GPU I bought was 11 years ago, a GTX 560 for 200 bucks; BF3 and every other game ran at ultra with over 60fps (120Hz was just starting to be a thing back then)... Now you have to spend double for a xx60 card so you can play current demanding games at medium/high settings. What a time to be alive.

    • @Argedis
      @Argedis 1 year ago +20

      @@vuri3798 I bought my 1660 Super for $220
      The days of the 60-series cards being the best 'budget' card are gone

    • @scaredycat8685
      @scaredycat8685 1 year ago +52

      boomers think 18 year olds can go buy a house and live on their own out of high school still ... lol good luck

  • @yt_clazify
    @yt_clazify 1 year ago +55

    It feels like the 3080 and 3090 came out yesterday

    • @KanecicqSTUDIOS
      @KanecicqSTUDIOS 1 year ago

      I just got a 3060 Ti in May, cuz 3080 prices were still madness at the time

  • @garrettyates647
    @garrettyates647 1 year ago +40

    I think a very valid test would be an "office" or "bedroom" sized room with something like this. We measure ambient and output, but more airflow = moving more hot air = more hot air into the small room. If that room is not near a thermostat, you're gonna have a small sauna in your home office, or a freezer in your living room while trying to keep the office cool.
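
To put a rough number on how much heat a rig like this dumps into such a room, a minimal sketch (the 450 W GPU figure matches the card's rated power; the 150 W rest-of-system figure is an assumption):

```python
# A gaming PC converts essentially all of its electrical draw into heat,
# so wall power is a decent proxy for heat dumped into the room.
def watts_to_btu_per_hour(watts: float) -> float:
    return watts * 3.412  # 1 W of continuous draw = 3.412 BTU/h

system_draw_w = 450 + 150  # ~450 W GPU + ~150 W CPU/rest of system (assumed)
print(f"~{watts_to_btu_per_hour(system_draw_w):.0f} BTU/h")  # ~2047 BTU/h
# For comparison, a small space heater on low (~750 W) puts out ~2560 BTU/h.
```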

  • @thenoblehacker9111
    @thenoblehacker9111 1 year ago +549

    Way too expensive, yet still extremely impressive and powerful

    • @PumpyGT
      @PumpyGT 1 year ago +2

      Yeah it really is

    • @Iskandr314
      @Iskandr314 1 year ago

      @@PumpyGT test is not true...

    • @symphoricquoz3763
      @symphoricquoz3763 1 year ago +26

      @@Iskandr314 Are you able to describe *why* the test isn't "true"? Because that seems like a strong claim.

    • @Butzemann123
      @Butzemann123 1 year ago +25

      @@Iskandr314 So a test for ONE GAME isn't 100% correct. So what?

    • @mr.darknight416
      @mr.darknight416 1 year ago +13

      It's not impressive when the price is almost 2k. It would be impressive if it was at a reasonable price. I mean, before, they too could make more powerful GPUs, but they were limited by price; people wouldn't even consider buying a 1k GPU, let alone 2k. Remember, in the 1080 era $800 was too much. Anyone can make the fastest GPU if there's no price limit and you can sell a GPU for 3k or 5k.

  • @GERMAN2FUCK2DOWN
    @GERMAN2FUCK2DOWN 1 year ago +619

    Being honest, the performance is impressive and it would be a dream to game with that card. But the price tag and power consumption in today's market just make me want to travel back in time

    • @picoplanetdev
      @picoplanetdev 1 year ago +13

      Couldn't agree more. I got a good deal on a 3060 Ti and I don't think you could justify much more than that.

    • @arnox4554
      @arnox4554 1 year ago +25

      And the size... What the fuck?? Even the size of the 3090 vanilla was absolutely absurd.

    • @Darthquackius
      @Darthquackius 1 year ago +2

      I can't even get an ATX 3.0 power supply till December! How are we going to run these things!!

    • @SSoul0
      @SSoul0 1 year ago +6

      @@Darthquackius off your standard psu...

    • @whiteXIchigo
      @whiteXIchigo 1 year ago +7

      The power consumption is not even the problem; what LTT didn't check is how the FPS/power-consumption curve looks. It seems that you can easily save way over 100 W while losing only a few percent in performance, 2-5%

  • @whasian1487
    @whasian1487 1 year ago +158

    Enlarged GPU images in YouTube thumbnails are usually clickbait, but the 4090 may actually need to be downsized for thumbnails.

  • @iachimotdk1056
    @iachimotdk1056 1 year ago +7

    Seriously, this video is still up saying that the 4090 is 60% faster than the 3090 Ti? Come on LMG. This is a very bad look.

    • @datcheesecakeboi6745
      @datcheesecakeboi6745 11 months ago +4

      "We will remove the videos"
      Keeps up all the worse offenders

  • @bunsenn5064
    @bunsenn5064 1 year ago +129

    I remember all the hype that came with the 30 series cards, especially the 3090, getting released. There was none of that with the 40 series. Back when the 20 and 30 series came out, me and my friends would talk about them all the time. No one said anything when the 4090 came out. It was so far out of reach that we just didn’t care.

    • @StopRemindingMeOfThoseDays
      @StopRemindingMeOfThoseDays 1 year ago +9

      Ikr. It's now some kind of fantasy that we nod to and say "cool" and go on with our day knowing we can never get our hands near it

    • @stevieC11Hanworth
      @stevieC11Hanworth 1 year ago +3

      Just wait 5 years and get one

    • @admistyt
      @admistyt 1 year ago +24

      @@stevieC11Hanworth in 5 years something newer will be out

    • @dronred8817
      @dronred8817 1 year ago +1

      You too do not casually carry 1800 EUR in your pocket?
      What a coincidence))).
      I think not....

    • @Larimuss
      @Larimuss 1 year ago

      $2000 USD in Australia. I could spend the money on it but I just can't justify it. No way in hell. It would cost more than my entire system, with 9 fans, a $200 case, water cooling etc. They're just ripping off loyal customers now and I hope they lose sales

  • @thwind
    @thwind 1 year ago +1452

    Surprising differences in some results between reviewers. For example, Jayz2cents had Cyberpunk 4K (ultra preset, RT/DLSS off) averaging only 76fps compared to LTT's 136fps. I wonder what could make such a huge difference?

    • @elio564
      @elio564 1 year ago +233

      I wonder the same thing. Vote this up lads so LTT sees this

    • @soaringspoon
      @soaringspoon 1 year ago +346

      They fucked it up, DLSS was on.

    • @tiestofalljays
      @tiestofalljays 1 year ago +313

      @@soaringspoon LTT Labs getting off to a good start I see.

    • @ETin6666
      @ETin6666 1 year ago +106

      Jayz's is the correct one. HardwareUnboxed got 83fps in 4K high, DLSS off.

    • @nasmeskartz9149
      @nasmeskartz9149 1 year ago +36

      @@rustler08 Don't think so, since they've got charts for DLSS both off and on.

  • @mattb6646
    @mattb6646 1 year ago +850

    They didn't put DisplayPort 2.0 on the 4090 because they don't want these cards being used 5 years from now... so you'll have to buy the next-gen GPU with the 2.0 port. It would be like making a car that lasts for 30 years; they want you to come back and buy another

    • @Sad_King_Billy
      @Sad_King_Billy 1 year ago +143

      Apple set that standard, now Nvidia wants a piece of the pie.

    • @512TheWolf512
      @512TheWolf512 1 year ago +69

      @@Sad_King_Billy not apple. iSheep did.

    • @user-dm8ic8lj5z
      @user-dm8ic8lj5z 1 year ago +14

      AND pcie 5.0

    • @BadTomzi
      @BadTomzi 1 year ago +1

      Holy shit

    • @ShogoKawada123
      @ShogoKawada123 1 year ago +15

      Used in five years with what though? The card has HDMI 2.1a, which can do 4K / 120 with no display stream compression

  • @KavinduLakshan
    @KavinduLakshan 1 year ago +4

    This RTX 4090 review was WRONG WRONG WRONG

  • @lilPOPjim
    @lilPOPjim 1 year ago +25

    It would have been interesting to see power draw at the same FPS, to see how efficient the card is at generating the same media

  • @mathsam7103
    @mathsam7103 1 year ago +825

    For years we've been focused so much on the pinnacle of gaming PCs that NVIDIA is now forcing us to look back to practicality with its insane pricing. Maybe Intel has a point.

    • @MrPaxio
      @MrPaxio 1 year ago +32

      More competition came out, and the competition showed Nvidia that you can charge similar prices for a crappier product, so their prices adjusted to make sense in the market space. Not very surprising; it's what the people wanted, apparently

    • @khalilahd.
      @khalilahd. 1 year ago

      lol true 😅

    • @nathanjokeley4102
      @nathanjokeley4102 1 year ago +23

      the people that buy these kinds of cards are such a tiny market.

    • @chsi5420
      @chsi5420 1 year ago +26

      The real money is in selling budget cards. Which is why Intel hopped into the pool with their arc cards. Nvidia is moving further into an enthusiast/professional PC direction. It's like a Toyota vs a Lamborghini, one works well for most, but the other is most desirable.

    • @info0
      @info0 1 year ago +30

      @@nathanjokeley4102 RTX 4090 cards are aimed at hardcore, high-end gaming enthusiasts who demand the best there is. They don't care about prices.
      I did belong to that group for a long time, but times change, priorities in life change, so I dropped out of the race.

  • @zotac1018
    @zotac1018 1 year ago +638

    Though pricing for these cards is beyond crazy,
    this is the first card where turning on ray tracing finally makes sense (when I buy one at a discount some years later).

    • @esatd34
      @esatd34 1 year ago +25

      around 7 years maybe.

    • @lunatik6168
      @lunatik6168 1 year ago +11

      At least after 5 years of my younger brother mumbling about DLSS and ray tracing superiority, he might even use these features for the 1st time 😆
      (nvm, he said he's going AMD after release 🙃)

    • @lejoshmont2093
      @lejoshmont2093 1 year ago +4

      In another 3 generations it will probably perform decently on a 50-series, which means a couple of generations after that you could expect pretty wide adoption, when those cards are hitting the used market.

    • @gstylez0107
      @gstylez0107 1 year ago +16

      @@esatd34 Seven years? ..You're probably just being facetious, but it definitely won't take that long. In just three years, the 4090 will practically be chopped liver.. The technology moves exponentially fast. Moore's law isn't dead, Nvidia is just full of shit.

    • @squirrelsinjacket1804
      @squirrelsinjacket1804 1 year ago

      @@lejoshmont2093 I'm waiting to upgrade till the 50 series... the next 'big thing' is going to be path tracing in games and maybe by then the cards will be powerful enough to support it better for non-tech demo use.

  • @IVMRGREENXX
    @IVMRGREENXX 1 year ago +5

    0:27 "there are some other problems"... card disintegrates

  • @gabe20244
    @gabe20244 1 year ago +4

    I'm a simple man, I see Anthony, I click video.
    I know it gets shot down all the time, but would love to see a Linux Tech Tips with Anthony if he's willing to do it.

  • @thelaitas
    @thelaitas 1 year ago +2516

    I'm just hoping that AMD won't disappoint us

    • @aaditya4619
      @aaditya4619 1 year ago +199

      Bro, AMD is just a synonym for disappointment

    • @Zapdos0145
      @Zapdos0145 1 year ago +29

      The only way they could disappoint is if they increase prices by a stupid amount, or capacitor problems… the expectations are that low.

    • @raawesome3851
      @raawesome3851 1 year ago +185

      @@aaditya4619 not really. Their CPUs are good, gpus are fine, at least in the high end.

    • @EldenLord.
      @EldenLord. 1 year ago +22

      They won't reach the 4090

    • @aboveaveragebayleaf9216
      @aboveaveragebayleaf9216 1 year ago +27

      How so? They had a pretty competitive lineup of gpus depending on your specific needs/desires.

  • @ironhammer500
    @ironhammer500 1 year ago +178

    I bet they are saving DisplayPort 2.0 for the Ti series just so they can bring it out half a year later at around 50% more cost, justifying the price hike by just adding a DisplayPort and PCIe Gen 5.

    • @gmiblessed
      @gmiblessed 1 year ago +7

      Would more likely be the “Super” refresh next year or the year after. Rehash of the Turing product strategy.

    • @mortiarty7842
      @mortiarty7842 1 year ago

      That's most likely what they're doing

    • @theChramoX
      @theChramoX 1 year ago +1

      DP 2.0 in Ti cards at 25% more, Super at another 25% more plus PCIe Gen 5. Don't @ me.

  • @amusetech
    @amusetech 1 year ago +31

    I love how Anthony is always focused on the holistic experience and doesn't get blown away by a few extraordinary results!
    My favourite host ever!

  • @JackMooney
    @JackMooney 4 months ago +1

    0:26 - I thought that was a piece that fell off the GPU at the perfect time hahahaha

  • @SgtRamen69
    @SgtRamen69 1 year ago +437

    Honestly I find the lack of DisplayPort 2.0 more concerning than the price, since you at least get some insane performance for the money, but not being able to take full advantage of it for high-refresh-rate 4K gaming is pretty stupid lol

    • @Fizz-Pop
      @Fizz-Pop 1 year ago +33

      I wonder if the 4090 Ti will have DisplayPort 2.0 when it goes on sale...🤔

    • @STiStein
      @STiStein 1 year ago +9

      Seriously, hearing that makes me want to not buy one.

    • @prich0382
      @prich0382 1 year ago +5

      The 3000 series was meant to have it; Nvidia is trying to cheap out as much as possible

    • @hotlocalbabes
      @hotlocalbabes 1 year ago +6

      It's got an HDMI 2.1a port exactly for that reason my guy.

    • @bossofthisgym3945
      @bossofthisgym3945 1 year ago +22

      @@hotlocalbabes HDMI 2.1a doesn't support more than 4K 120Hz either, so what are you trying to say "my guy"?
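
To put numbers on the DP 1.4 vs HDMI 2.1 vs DP 2.0 argument in this thread, a rough lower-bound sketch (active-pixel data only; real display timings add blanking overhead, so actual requirements are somewhat higher):

```python
# Raw (pre-blanking) bandwidth of an uncompressed 10-bit RGB signal vs the
# approximate effective payload rate of each link after line encoding.
def raw_gbps(width: int, height: int, hz: int, bits_per_color: int = 10) -> float:
    return width * height * hz * bits_per_color * 3 / 1e9

links_gbps = {
    "DP 1.4 (HBR3)":   25.92,  # 32.4 Gbit/s raw, 8b/10b encoding
    "HDMI 2.1 (FRL)":  42.67,  # 48 Gbit/s raw, 16b/18b encoding
    "DP 2.0 (UHBR20)": 77.37,  # 80 Gbit/s raw, 128b/132b encoding
}
for w, h, hz in [(3840, 2160, 120), (3840, 2160, 144)]:
    need = raw_gbps(w, h, hz)
    fits = [name for name, cap in links_gbps.items() if cap >= need]
    print(f"4K@{hz} 10-bit needs >= {need:.1f} Gbit/s; fits (without DSC): {fits}")
# 4K120 already exceeds DP 1.4 without DSC; only DP 2.0 has real headroom.
```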

  • @Scarlet_Soul
    @Scarlet_Soul 1 year ago +523

    The lack of Display Port 2.0 really is baffling

    • @Clawthorne
      @Clawthorne 1 year ago +62

      They're probably reserving that for their Quadro cards, so that companies who need higher refresh rates or resolutions are forced to pay 5x more.
      It's the Nvidia way!™

    • @AOTanoos22
      @AOTanoos22 1 year ago +17

      @@Clawthorne Nope, the RTX 6000 (Ada) also only has DisplayPort 1.4, and that's a $5000+ GPU btw

    • @spaceduck413
      @spaceduck413 1 year ago +38

      I feel like if RDNA3 is able to drive 144fps at 4k Nvidia are really going to be kicking themselves, to the point where we might even see a revision 2 or something like that.
      They've made it so that AMD doesn't even have to *match* their theoretical performance in order to outperform them in the real world. If your card can't drive more than 120hz due to bandwidth limitations, does it really matter how many "more" frames you get?

    • @Nightengale537
      @Nightengale537 1 year ago +4

      @@spaceduck413 Honestly I couldn't care less about frame counts; I really just like being able to play games at consistent frame rates, which is why I like 30 and 60 fps, no higher, no lower. But honestly GPUs are absurdly priced, which is why I don't even touch the RTX series and stick with my GTX 1060

    • @J-Rizzler
      @J-Rizzler 1 year ago +1

      @@Nightengale537 respect imma buy a 1050 to LP

  • @dicebeatsofficial2525
    @dicebeatsofficial2525 2 months ago +4

    We miss Anthony

  • @WayStedYou
    @WayStedYou 1 year ago +3

    14:03 the ARC cards were waiting in a warehouse for nearly a year and still have DP 2.0

  • @nazaryn
    @nazaryn 1 year ago +1302

    Anthony is such a great host -- clear, concise, covered all the bases, mentioned the case average temperatures, testing conditions, ambient air temperature, etc

    • @xenox8553
      @xenox8553 1 year ago +41

      Almost like they have writers.. or whatever they are called...

    • @bigfoot3322
      @bigfoot3322 1 year ago +25

      Yeah, I'm always intrigued when a cane toad hosts a show.

    • @Wes_Trippy4life
      @Wes_Trippy4life 1 year ago +5

      @@bigfoot3322 😭💀💀💀💀

    • @TinkyTheCat
      @TinkyTheCat 1 year ago +22

      ​@@xenox8553 Anthony is listed as both the episode's host and writer on the end slate, though that doesn't rule out others helping him.
      At any rate, while these videos are collaborative efforts, it's perfectly normal to have a favorite host(s), as they each have their own delivery.

    • @ypsilondaone
      @ypsilondaone 1 year ago +3

      and delivering wrong info

  • @falrexion7709
    @falrexion7709 1 year ago +796

    I am so disappointed we haven't seen a set of cards with a good balance of power use and performance since the 10 series

    • @josdebosduif5
      @josdebosduif5 1 year ago +46

      Check DerBauer's review, he plays around with the power target and that shows some impressive efficiency results!

    • @MHWGamer
      @MHWGamer 1 year ago +59

      Do what the others said: 70% power target => 300 W and only -5% in fps.
      For a 300 W card with this performance, that is bonkers.
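
A minimal sketch of the efficiency math implied by that claim (the 450 W stock figure is the card's rated power; the 70% power target and -5% fps numbers are as reported in this thread, not measured here):

```python
# Relative FPS per watt at stock vs. a 70% power target.
configs = {
    "stock (450 W)":   {"watts": 450, "relative_fps": 1.00},
    "limited (300 W)": {"watts": 300, "relative_fps": 0.95},  # ~-5% fps
}
for name, cfg in configs.items():
    eff = cfg["relative_fps"] / cfg["watts"]
    print(f"{name}: {eff * 1000:.2f} relative FPS per kW")
# The limited config works out to ~42% better performance per watt.
```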

    • @danebeee1
      @danebeee1 1 year ago +7

      @@josdebosduif5 This is exactly what I was gonna say. The power draw on this card is actually insanely good compared to everything else out there, including Nvidia's 3090 and 3090 Ti.

    • @bojinglebells
      @bojinglebells 1 year ago +6

      It's competition. The 20 series might not have been as much of an improvement because Nvidia ballooned the size of their chips in order to fit in all the extra RT/Tensor stuff, but the 30 series returned more to normalcy; however, the actual threat of competition from AMD led them to be as aggressive as possible with performance, with less regard for how much power they are targeting per tier.
      Power/performance has never really gotten worse since the 10 series, you just have to be willing to apply your own limits to the cards. Take a 2080, tune it to draw no more power than a 1080, and it will still be plenty faster. The 3080, even more so.
      What will be really interesting to see is the 12GB "4080", which has a die size smaller than the 1080 (and is barely larger than a 3050/60), but has more transistors than a 3090 Ti. It's just a major shame they're trying to fleece us for it with that absurd $900 MSRP.

      @Lothyde 1 year ago +2

      All of this hardware (both CPUs and GPUs) is already extremely efficient; manufacturers just crank the hardware to its limit, and the power increases exponentially when it's close to the limit.

  • @TwistedEyes12
    @TwistedEyes12 1 year ago +9

    I really appreciate showing off the benchmarks with DLSS OFF. I'm not against using DLSS personally, but my "goal" is always to play without it if possible.

    • @lasarousi
      @lasarousi 1 year ago +3

      Game graphics have been at a standstill for so long that these gimmicks seem more worrying than exciting.
      Anything above 1440p is just unnecessary unless your display is over 70"

    • @hopey1809
      @hopey1809 1 year ago +2

      why tho?

  • @Flyingsquirrel69420
    @Flyingsquirrel69420 1 year ago +3

    Me watching this 4090 video with my integrated graphics cpu

  • @PhobiaSoft
    @PhobiaSoft 1 year ago +1159

    At this rate, I legitimately think my next GPU is going to be made by Intel. What a fascinating turnaround for them.

    • @velqt
      @velqt 1 year ago +40

      Intel gpus are a synonym for hot garbage

    • @nadie9058
      @nadie9058 1 year ago +137

      @@velqt Right now they are, maybe in a couple of generations they can actually compete.

    • @velqt
      @velqt 1 year ago +28

      @@nadie9058 Intel will cry and give up before that happens. They’re already in damage control

    • @oisiaa
      @oisiaa 1 year ago +51

      Yes. Everyone needs to go Intel to send a message about pricing. Let Intel undercut the market.

    • @velqt
      @velqt 1 year ago +6

      @@oisiaa Ah you want to pay for hot garbage so intel prices even higher when they realize what people will pay for trash?

  • @MarkLoganFIB
    @MarkLoganFIB 1 year ago +285

    This is reminding me of when 1080p was really demanding. Finally we're going to get cards running 4K like it's not even 4K

    • @momomimi8957
      @momomimi8957 1 year ago +38

      1080p now doesn't even make the honorable mention section of the benchmarks.

    • @lejoshmont2093
      @lejoshmont2093 1 year ago +7

      I remember people gaming at 4K years ago, although at low frame rates.

    • @MrEpic-97
      @MrEpic-97 1 year ago +6

      I gamed at 4K60 with a 1070... just don't crank everything to ultra. Set it to medium-high and most games ran at 60fps

    • @MrBenenator
      @MrBenenator 1 year ago

      @@momomimi8957 You made my HP ZR24w cry. You monster. /lh /j

    • @MrZodiac011
      @MrZodiac011 1 year ago +2

      Yeah, I moved to 4K recently and have a 3080, but I am gonna need to replace it because of its lousy 10GB, which wasn't an issue when the card launched; since then, despite game graphics not improving at all, games use twice the VRAM. Far Cry 6 wants 11GB with the 4K pack, yet 5 and New Dawn wanted 6GB, and they look the same

  • @eugenejones1994
    @eugenejones1994 1 year ago +1

    I could listen to ANTHONY all day. He has so much knowledge, yet he breaks it down for us.

  • @NickxTM
    @NickxTM 11 months ago +3

    When your pc gets less fps on 1080p medium settings than the 4090 on ultra 4k settings :(

  • @blaze8897
    @blaze8897 1 year ago +655

    I swear, Nvidia just radiates so much utter contempt for their customers, partners, and worldwide power grids that it's actually giving Apple a run for their money.

    • @appixx
      @appixx 1 year ago +42

      Facts, I really hope AMD can pull through this year, because this is not okay

    • @lasthopelost9090
      @lasthopelost9090 1 year ago +1

      How about dealing with crypto before we start complaining about the power grids

    • @IsraelWokoh
      @IsraelWokoh 1 year ago +22

      Nintendo: "Hold my lawsuit."

    • @robburgundy9539
      @robburgundy9539 1 year ago +13

      They are the Apple of pc parts. Nvidia is a lifestyle at this point, they succeeded.

    • @wanderingwobb6300
      @wanderingwobb6300 1 year ago +28

      It's hilarious since Apple hates Nvidia too. They're a match made in hell.

  • @shadowlemon69
    @shadowlemon69 1 year ago +413

    The RTX 4090's raster performance, and its performance with DLSS, is pretty impressive, but for a whopping $1599 I could buy a whole PC, and with the 4090's 450W+ TDP, even that $1600 PC could consume the same power or less

    • @mudgie0205
      @mudgie0205 1 year ago +19

      Well that’s just, like, your opinion man…

    • @vongdong10
      @vongdong10 1 year ago +21

      Yeah but will that whole pc have the same performance?

    • @TropicalCyc
      @TropicalCyc 1 year ago +7

      @@vongdong10 yes :)

    • @surfalcatraz9770
      @surfalcatraz9770 1 year ago +6

      @paradox I bought it. I will legit have a heart attack if AMD has a better card for a lower cost, because the 4090 was hella fucking expensive AND I had to stretch my budget A LOT.

    • @watawatan0w
      @watawatan0w 1 year ago +2

      but what's gonna drive it kid? a 1060?

  • @Jameslawz
    @Jameslawz 1 year ago +16

    With how expensive Nvidia cards are getting, I am seriously considering buying an AMD card.
    I've been looking to replace my 1060 for almost 3 years now, and thanks to crypto miners, COVID (supply chain issues), silicon chip shortages, rising interest rates, inflation, and the scarcity of 30-series cards plus the black market it's created, the price of cards has been driven up almost 200%.
    Nvidia promised that a GPU should always cost the price of a games console ($400-600) but they are still marked up way too high.

    • @Larimuss
      @Larimuss 1 year ago +1

      They're exactly the same. They're just matching Nvidia prices; that's what they do, because there is no real competition with only 2 sellers. Buy the best AMD card and it's not even as good as an RTX 4080, at around the same price.

  • @RektemRectums
    @RektemRectums 1 year ago +14

    I once bought the GTX 970 for about $400 when it came out. I thought that was expensive. That wasn't long ago.
    Now I'm spending almost $1000 for a 3090. Used. This is ridiculous.

    • @elusivemindsstudios
      @elusivemindsstudios 1 year ago +1

      I bought a 3090 Ti for $840 new in an Amazon flash sale

    • @Opie..
      @Opie.. 1 year ago +1

      Purchased an EVGA 980 Ti on its release date for £570. I'm priced out of top-of-the-range GPUs today. I now wait some time before I buy them second-hand.

    • @wizardemrys
      @wizardemrys 1 year ago +1

      I just got an RTX 3060 for about $350 with Cyber Monday deals; the GTX 970 now costs $340. The 3060 is 95% better on UserBenchmark. When did you get yours? But my motherboard could only support at most a 7th-gen Intel CPU, so I did have to overpay to get the i7 7700K while there were far cheaper newer-gen ones for less that matched or outperformed it.

    • @RektemRectums
      @RektemRectums 1 year ago

      @@wizardemrys when did I get mine? My 970? As I stated, I got it when it came out, google search says late 2014 was launch.
      Currently using an ASUS Strix 3090, the best version of a 3090. I haven't overclocked it yet.
      I have an i9 9900K and game at 3440 x 1440p. CPU is maximum 73% under load even in the newest titles like Darktide, where even the GTX 700 series GPUs don't even work anymore. RIP
      I'll upgrade my CPU to a 13900K when the next generation launches and the prices drop. I'll have this i9 9900K for 5 or 6 years by that time.

  • @darklyspaced
    @darklyspaced 1 year ago +325

    I have to be honest, the 40 series really isn't worth it at this moment, especially with the recent reduction in the price of the 30 series gpus.

    • @manjindersinghsaini911
      @manjindersinghsaini911 1 year ago +18

      Not just the 30 series, but the RX 6900 XT is going around for 699!!! For its power consumption, this is a no-brainer

    • @propersod2390
      @propersod2390 1 year ago +9

      @@manjindersinghsaini911 The 6900 XT is a joke bro, imagine buying that 💀💀 The 6950 XT had half the performance of the 3090 in most games. Even worse in productivity. What a joke

    • @researchandbuild1751
      @researchandbuild1751 1 year ago +6

      It's literally 2 times faster for productivity use cases. 2 times! Yes that is worth it. Time = money

    • @vidyamancer7135
      @vidyamancer7135 1 year ago

      @@propersod2390 LOL pulling numbers out of your ass.

    • @harrylesueur
      @harrylesueur 1 year ago +53

      @@propersod2390 not sure what games you've been playing... (*Psst* how much is Nvidia paying you for this, I want in)

  • @catalyst429
    @catalyst429 1 year ago +19

    The DisplayPort 2.0 thing is insane. With a card like that, you want it to be relevant for as long as possible at that price point, and it's starting off a gen behind. What a joke.

    • @Somber.Killer
      @Somber.Killer 1 year ago +10

      It's all on purpose. People who are gonna purchase this card will buy the next 4090ti or 5090 with dp2.0 without hesitation. Giving Nvidia their money hand over fist.

    • @ShogoKawada123
      @ShogoKawada123 1 year ago

      No company has even announced a DP 2.0 display though, AFAIK

    • @ShogoKawada123
      @ShogoKawada123 1 year ago

      @@primegamer321 When? AFAIK there literally isn't a single model available for sale currently, most of the really good displays just have HDMI 2.1 and DP 1.4a

  • @cjpartridge
    @cjpartridge 1 year ago +2

    Unbelievable this only has DisplayPort 1.4. Classic Nvidia move.

  • @ThorDyrden
    @ThorDyrden 1 year ago +13

    Video encoding: for streaming, H.26x and AV1 are important/the future... but for video editing you often have to cope with ProRes RAW.
    (Apple) ProRes is a codec supported by Apple's SoCs, of course, but I have never seen it implemented on GPUs outside Apple, though you find it on a lot of cameras (Sony, Fujifilm,...). Is Apple prohibiting alternative hardware encoders for "their" codec, or am I missing some other reason?

  • @kauahatu
    @kauahatu 1 year ago +248

    Could you guys test whether or not you can heat your room with one of those? Maybe that is the solution to the rising energy prices.

    • @pvpdm
      @pvpdm 1 year ago +71

      If you can afford one of those you can afford the high energy prices

    • @crashtestdummy9985
      @crashtestdummy9985 1 year ago +15

      I mean that thing looks like it could shit out a way bigger power bill than your actual HVAC system honestly.

    • @DiamondTear
      @DiamondTear 1 year ago +4

      Around here we're heating a whole city with a supercomputer...

    • @Bruno_Laion
      @Bruno_Laion 1 year ago +1

      @@pvpdm pretty sure it was a joke

    • @Maxtraxv3
      @Maxtraxv3 1 year ago +1

      Bruh, I can heat my room with an RTX 2070 Super

  • @Chemzborgor
    @Chemzborgor 1 year ago +2

    Bro the 4090 just feels like a FLEX

  • @Synaps4
    @Synaps4 1 year ago +6

    Hey LTT, you've been "working on updating the video" with your pinned corrected data here for TEN MONTHS now. Why can't you fix the video?

  • @JimmyMcThiccus
    @JimmyMcThiccus 1 year ago +215

    It was good having the second-fastest GPU for 3 days.
    But I did get the 3090 for 800 bucks.
    I'm also upgrading to a 5800X3D today, up from the 3600.

    • @themightypizzadevourer6018
      @themightypizzadevourer6018 1 year ago +36

      Hello brother

    • @simonic2063
      @simonic2063 1 year ago +7

      I went with a sim-racing 3080 for $400. Unless I absolutely need the fps, I have a hard time justifying >$500. Things like this definitely keep the console market alive.

    • @idwithheld5213
      @idwithheld5213 1 year ago +4

      I would 100% wait for the 13900k before changing platforms/cpu's.

    • @Krydolph
      @Krydolph 1 year ago +28

      @@idwithheld5213 The 5800X3D is a drop-in replacement for his 3600, no new motherboard or anything else needed, and it is one of the absolute best gaming CPUs. Maybe the 13900K can beat it on paper, but there is no smarter or better upgrade path for him right now, unless money means nothing; and for you giving "advice", of course someone else's money means nothing to you!

    • @jonsnow2555
      @jonsnow2555 1 year ago

      @William B time to buy online from u.s.

  • @Secret_Takodachi
    @Secret_Takodachi 1 year ago +47

    The most impressive GPU I've seen in the last 12 months is the one Valve used in their Steam Deck; the performance that thing achieves is impressive!

    • @CaptainScorpio24
      @CaptainScorpio24 1 year ago

      which one

    • @austinnafziger4159
      @austinnafziger4159 1 year ago +8

      @@CaptainScorpio24 It's an integrated RDNA 2 based GPU with 8 CUs.

    • @CaptainScorpio24
      @CaptainScorpio24 1 year ago

      @@austinnafziger4159 ohhh ok

    • @esprit101
      @esprit101 1 year ago

      Add the 40Hz mode to that. It's no 60, but it feels really good compared to 30Hz. I've played Valheim, Satisfactory and Cyberpunk; all felt really good for such a lightweight device.

    • @propersod2390
      @propersod2390 1 year ago

      4090 probs has about 10x more performance. So impressive 🤨🥱

  • @shughes57
    @shughes57 1 year ago +9

    I'd rather have the Prelude tbh, probably better for the environment too.

    • @benchod3576
      @benchod3576 8 months ago

      An $1800 Prelude will cost you around $3000 after all the repairs it'll need.

  • @Birdman._.
    @Birdman._. 1 year ago +1

    The cheapest 3090 Ti (Zotac) is $849 on Amazon US and the cheapest 6950 XT (XFX Speedster) is $851
    What a time to be alive

  • @Zapdos0145
    @Zapdos0145 1 year ago +165

    Can you believe we would rather have Intel enter this market to bring competition and theoretically help "fix" this problem than see Nvidia continue down this path? It's actually amazing... and I'm here for it. I hope AMD's and Intel's drivers come to play ball.

    • @ArtisChronicles
      @ArtisChronicles 1 year ago +1

      A race with no end in sight

    • @Zapdos0145
      @Zapdos0145 1 year ago +7

      @@RyTrapp0 i meant AMD (themselves) and intels drivers as two separate things, i should’ve clarified.

    • @Mr.Morden
      @Mr.Morden 1 year ago

      That right there is why Nvidia wasn't permitted by regulating agencies to acquire ARM. Jenny demands total ownership and control, anything less is inadequate for him. He is just like Steve Jobs.

  • @Night_Hawk_475
    @Night_Hawk_475 1 year ago +205

    Edit: Anthony has posted in the comments about this issue and confirmed it wasn't DLSS but a different setting that wasn't being applied correctly; new numbers, which are closer to what we see in other games, were shared. Still a big uplift from the prior generation though :)
    I'm curious now how the other 40xx cards which are closer to the current 30xx pricepoints will compare to their past generation's numbers.
    The rest of my comment can be disregarded in light of the corrections coming from Anthony/LTT.
    There was another reviewer who mentioned that cyberpunk seemed to have issues with changing settings and it took several attempts of changing the setting and restarting the game to get it to run with DLSS off on the new Nvidia cards. DLSS being the thing that "creates fake frames", it severely affects framerate measurements if it's not actually turning off in this measurement.
    I'm not saying that has to have happened here, but it wouldn't surprise me. I would love if you could double check it for us :c

    • @SweetJesus16
      @SweetJesus16 1 year ago +21

      Clearly it happened and they probably know, but instead of taking the video down they continue to spread misinformation

    • @kopilovicd
      @kopilovicd 1 year ago +9

      That has to be what happened. The performance in Cyberpunk shown here without DLSS is not consistent with other reviewers.

    • @mrlacksoriginality4877
      @mrlacksoriginality4877 1 year ago +2

      They said they were using older drivers because the new drivers were faulty, at the start of the video. That could be the difference; they also didn't use TAA in this video like DF. They showed DLSS 2.x on and off in this video. With ray tracing and DLSS performance it was higher than with DLSS off and ray tracing off.

    • @Night_Hawk_475
      @Night_Hawk_475 1 year ago +5

      @@mrlacksoriginality4877 The charts said DLSS off on both versions (and he clearly says "without DLSS" @5:35). Only RTX was on in the second graph, not DLSS.
      But if DLSS had secretly been on in both (not even like they meant to leave it on, just because the drivers/cyberpunk have a known issue with leaving it on even though the settings say it's off) it would inflate the framerates drastically like this and invalidate the test results. If they had a separate measure where they claimed DLSS was on it would be helpful, but there were none in this video so I'm not sure what you're referring to when you say you saw a comparison with DLSS on in Cyberpunk from this channel.
      TL;DR: DLSS 3.0 (not 2.0) uses an AI to "guess" what frames should look like, and inserts a "fake" frame between each pair of "real" frames. It does this at the cost of adding measurable latency and a risk of visual artifacting / loss of clarity (as the reviewers noted in this video). I suspect the average player may prefer to play games with DLSS 2.0 instead of 3.0, or with DLSS off altogether. Regardless, if a game (like Cyberpunk) were to accidentally run with DLSS 3.0 enabled, even though the reviewers checked and thought it was off, then that game would be able to roughly double its FPS. And in comparison to the performance increase of other games which weren't lying about whether DLSS was on or not, it would show a much more massive performance jump from generation to generation. Which... is exactly what we see here: if you halve the 40-series performance in Cyberpunk, it's still a big increase, but it's much closer to the performance increase we see in the other games being benchmarked in this video.
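
A toy model of the frame-generation trade-off described in that TL;DR, as a minimal sketch (the 54 fps base rate is a hypothetical number, and the one-render-frame latency cost is a deliberate simplification):

```python
# Frame interpolation presents one generated frame between each pair of
# rendered frames, doubling presented FPS, but it must hold back the newest
# real frame, so input latency still tracks the underlying render rate.
render_fps = 54                       # hypothetical rendered frame rate
presented_fps = render_fps * 2        # with frame generation enabled
held_back_ms = 1000 / render_fps      # ~1 render-frame of extra delay
print(f"presented: {presented_fps} fps, added latency ~ {held_back_ms:.1f} ms")
```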

    • @mrlacksoriginality4877
      @mrlacksoriginality4877 1 year ago +6

      @@Night_Hawk_475 They just posted about the issue. It didn't have DLSS on, but FidelityFX upscaling was on because of some technical issues. At the 6-minute mark you can see DLSS on btw.

  • @kasperdahlin6675
    @kasperdahlin6675 1 year ago +1

    Notice the Minecraft "uoh!" sound when he says cooling at 10:48 😂😂😂😂

  • @danielpersson7483
    @danielpersson7483 1 year ago +5

    The fun part is that less than 0.1% of the world would have any use for a graphics card like that

    • @fluxmechanics
      @fluxmechanics 1 year ago

      Exactly. I do 3D design, so it's nice, but I can't imagine it being needed as creative programs can't utilize the power. Only my GPU renderers are designed for that kind of power.

    • @Klokopf52
      @Klokopf52 1 year ago +1

      And yet the largest group of people i know who buy them are gamer kids with money from their parents...

    • @donmike2810
      @donmike2810 1 year ago

      So explain why it's still constantly sold out??

  • @quanicle101
    @quanicle101 1 year ago +194

    the fact is that you never really NEED the fastest gpu to play new games at the highest settings. you can go a few rungs down (even a few generations) and still do just fine. after a certain point the added performance isn’t enough to justify the price

    • @kiloneie
      @kiloneie 1 year ago +20

      I am sure you are talking about 1080p here, which can run high/very-high settings at 60 fps on bloody any GPU. The moment you go 1440p the requirements jump quite a bit, and when you add a 144Hz monitor they jump even more, WAY more than the resolution increase alone; that is where I am at. My 1070 Ti cannot run anything at 144 FPS; it could barely run Fortnite at very high when I bought it, but sure, that's just one game. In PUBG I cannot get good quality and over 100 FPS, it's a trade-off; same with Warzone, etc.
      Then you have people (I will be one of those one day) who are at 4K, and some aren't even at 4K 60 FPS but at more, and NOTHING but this 4090 can run 4K at above 100 FPS. I am not advocating for such absurd prices or for the 4090, but your statement is wildly incorrect. When someone plays at 1440p for a while, they don't want to go back to 1080p; same with 144Hz vs 60 FPS. And just reaching 1440p 144 FPS is bloody absurd, and that isn't even that new or premium or whatever; 4K and refresh rates in the hundreds are the niche. Yes, a lot of people play at 1080p 60 FPS, but a lot of them have never even tried anything higher.
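
The jump that comment describes is easy to quantify as raw pixel throughput, a minimal sketch (this ignores per-frame CPU cost, RT, and everything else that doesn't scale linearly with pixels):

```python
# Pixels per second for common resolution/refresh targets, relative to 1080p60.
def mpix_per_s(width: int, height: int, fps: int) -> float:
    return width * height * fps / 1e6

modes = {"1080p60": (1920, 1080, 60),
         "1440p144": (2560, 1440, 144),
         "4K120": (3840, 2160, 120)}
base = mpix_per_s(*modes["1080p60"])
for label, mode in modes.items():
    print(f"{label}: {mpix_per_s(*mode) / base:.1f}x the pixels/s of 1080p60")
# 1440p144 is ~4.3x and 4K120 is ~8.0x the raw pixel throughput of 1080p60.
```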

    • @iawindowss4061
      @iawindowss4061 1 year ago +2

      That's how I have been feeling too; I am just not that interested or excited for the next generation the same way I was with the 6 series or 7 series.

    • @mattshaw4016
      @mattshaw4016 1 year ago

      I think it only matters to esports gamers who need max frame rates and enthusiasts who like cranking all the settings to the max. I honestly turn off a lot of settings in every game, which I believe add bloat and a tacky look 🤷. But it definitely would be nice if you had a lot of money 👍

    • @felcas
      @felcas 1 year ago +16

      You don't need a Ferrari to get from point A to point B either. But it is nice to drive a Ferrari 😄

    • @alexisrivera200xable
      @alexisrivera200xable 1 year ago +5

      Correct, on most games the jump from 1440p to 4K is not that noticeable, same between the high and ultra presets of most games. We literally dump electricity for very diminishing returns with most people unable to even tell the difference. (But swear up and down that they can which drives a lot of toxic elitism.) The chase for the ultimate hardware has gotten real mindless at this point with bragging rights the only metric that matters.
      All to play the same single player games that thrive at 60fps and the sweatfest online games like Apex legends and COD Warzone that look the same plus are so overwhelmingly infested with cheaters that an investment in the fastest GPU money can buy at the grossly inflated margins Nvidia asks for is completely unjustifiable.

  • @Standard.Candle
    @Standard.Candle 1 year ago +324

    Thank you Anthony for finally being the voice of reason on the lack of DP 2.0.
    Maybe if this had been discussed earlier and there had been more of a consensus among reviewers, we could have pushed back on this anti-consumer tactic. I'm sure nVidia will be more than willing to sell me a 4090 Ti for $2099 in 6 months that actually supports DP 2.0

    • @Sal3600
      @Sal3600 1 year ago +1

      Keep crying lmaoo

    • @platinumjsi
      @platinumjsi 1 year ago +5

      Surprised he didn't mention DSC, which enables higher refresh rates / resolutions on DP 1.4 without the image-quality degradation of chroma subsampling.

    • @FAQUERETERMAX
      @FAQUERETERMAX 1 year ago +9

      Yeah, I'm skipping this generation. If I buy a bleeding edge graphics card I want a bleeding edge screen

    • @platinumjsi
      @platinumjsi 1 year ago +4

      @Alex Unltd. 160hz 10 bit HDR works fine with DSC

    • @MistyKathrine
      @MistyKathrine 1 year ago +1

      @@platinumjsi Can just use the HDMI 2.1, which is probably what most people getting this card will be using, though I question why there is just 1 HDMI 2.1 port but 3 DP 1.4 ports.

  • @sjmemz7285
    @sjmemz7285 3 months ago +1

    Y'all remember during quarantine when GPUs were overly expensive

  • @Wes_Trippy4life
    @Wes_Trippy4life 1 year ago +1

    Bruh always lookin like he just ate a 1000 mg edible 😂😂😂😂

  • @Reac2
    @Reac2 1 year ago +112

    With prices ever rising, I think it's time that tech YouTube starts to refer to GPUs not by their names but by the most useful thing you can buy for the same price. This Honda Prelude review is pretty good, for example

    • @davidplesnik6428
      @davidplesnik6428 1 year ago

      😆

    • @MongooseTacticool
      @MongooseTacticool 1 year ago

      "This card is X rent/mortgage payments worth."

    • @Najolve
      @Najolve 1 year ago

      I do my personal economics in terms of sandwiches. So to buy a 4090 I just have to starve for a decade.

  • @Kokorogamer1
    @Kokorogamer1 1 year ago +41

    They did the same thing when they didn't support HDMI 2.1 on the 20-series cards, which I missed out on when I bought the LG C9, to the point of thinking I should upgrade to the 30 series. So yeah, I'm not gonna fall for the same mistake again; future-proofing is a must, so I will wait and see what AMD is gonna offer.
    It feels like they did it so they can add it on the next generation to justify upgrading, not to save a buck

  • @jaxobin
    @jaxobin 1 year ago +3

    bro got the 34 series

  • @VVayVVard
    @VVayVVard 1 year ago +7

    7:47 It's interesting to see how the RX 6950 XT has a massive advantage over even the RTX 4090 in some categories. Much of it is presumably a matter of optimization, but still, it goes to show that performance doesn't exist on a simple continuum. And it underscores the importance of choosing hardware optimized for your own use cases.

    • @MonikaPecnik
      @MonikaPecnik 1 year ago

      Yeah, really. How is nobody talking about those results?

  • @TKSubDude
    @TKSubDude 1 year ago +31

    Power and size limits make this monster of a card, with its monster price tag, a "NO-GO". It will remain an enthusiast card with few in use worldwide.

    • @shippy1001
      @shippy1001 1 year ago +3

      Not really, it will fly off the shelves; don't underestimate how much money people are willing to spend on PC hardware worldwide. If this was priced at $2999 it would still sell out, not in all countries, but in the US and EU for sure.
      For business use it's a no-brainer: time is money, and this cuts a lot of time, so the price is not a deciding factor. For streamers and content creators who always want/need the latest and greatest it will also be a no-brainer; basically anyone but people on a budget, and those people should be buying the 30 series according to Nvidia. If you divide FPS by dollars, the 4090 is the cheapest GPU you can get, so a $1199 3090 Ti or a $1599 4090 that has 2.2x the performance? If we start seeing 3090 Tis priced at around $700-800 USD, then yeah, no reason to get a 4090, but that's not the scenario today.

    • @whdgk95
      @whdgk95 1 year ago

      @@shippy1001 It'll do as well as a niche product will, like the Threadripper. That was popular and commercially viable too, but definitely not moving the same volumes as the other Ryzen models. Dedicated servers and workspaces likely won't switch to this from their workstation cards like the Quadro/MI series. Supercomputers that already use the MI series are not switching to this. Many of those partners are contractually bound as well. The RTX 4000 is for enthusiasts, not for businesses. Streamers, like you pointed out, probably have the most draw, but honestly they can just get a cheaper GPU as a HW accelerator for their streaming purposes, a much smarter decision. So sheep will probably buy these, but they probably won't move the same volume as previous gens for sure.

    • @ggwp638BC
      @ggwp638BC 1 year ago +1

      @@shippy1001 It will fly off the shelves because, as Nvidia already said, they are keeping inventory low to create a false scarcity.
      For business... it depends. If you're large enough that you need it, you're also likely large enough to be looking at Quadros. Streamers can justify it somewhat, but most professional streamers already run double setups, to the point this doesn't mean much. Plus the extra image quality won't be translated to the streams anyway. On top of that, at some point you just have to factor in rising energy costs and cooling solutions. It is a good product in the sense that if you have the money and a very specific use case that will benefit from the extra performance, and don't care about energy or heat, it can be beneficial, as long as you're not large enough for business solutions. But at the same time the actual market that checks all these boxes is very, very small. A good product under certain circumstances? Yes. A no-brainer? No, especially because the lack of DisplayPort 2.0 is one massive downgrade, especially for streamers, who are the people who would most benefit from this card. Add to it that it doesn't have support for PCIe 5, which, y'know, is quite important if you want high bandwidth to, I don't know, move around large pictures at high resolutions with minimal compression and very high speeds, and you might have your buyers wondering if they really should invest or wait for next year.
      The 4090 is an enthusiast/halo product. It's something to sell the brand, and Nvidia is making sure to price people out of it in order to become the de facto "premium" brand, like Apple does.

    • @whdgk95
      @whdgk95 1 year ago

      @@kenhew4641 Are you unaware that there are separate enterprise lines of products at Intel, AMD, and NVIDIA? The US Government has already invested in AMD MIs, EPYCs, and Nvidia's A100 GPUs. THOSE are enterprise-market products, not the RTX 4000, LMAO. That's exactly what makes these RTX 4000 cards such a niche product, like the Threadripper.

    • @shippy1001
      @shippy1001 1 year ago

      @@whdgk95 That's a different scenario, and on those Ryzen points yes, you are absolutely correct, but for rendering, VFX, and even streaming, the 4090 is still the much better value proposition. And don't get too caught up with the DP 1.4 thing; it has HDMI 2.1, and even Nvidia knows it's a niche to use PCIe 5.0 and DP 2.0.
      Most professional office environments like indie game devs or VFX artists run dedicated PC desks around a warehouse; they simply buy a new system for the best developers and move the 1-2-year-old system to the newer guys, and the 4-year-old systems are sold/traded.
      A dedicated HW accelerator is too much of a hassle to deal with; a single powerful GPU is much easier to work with and you will get more value out of it.
      Just to be clear, I'm not defending Nvidia practices, just explaining that this product, even priced as it is right now, is still a good deal for people/businesses who have the money.

  • @ChristopherHallett
    @ChristopherHallett 1 year ago +67

    In a few years when the 6090 launches, hopefully competition will have forced prices back down closer to $1000, and I might be able to justify buying one so I can start gaming in 4K.

    • @koalakenbymacy9248
      @koalakenbymacy9248 1 year ago +5

      The 6060 should be able to easily handle 4K. I'll be incredibly surprised if the 6060 doesn't have better performance than the 3090ti

    • @hunchie
      @hunchie 1 year ago +2

      Because of inflation, PC parts are not going to get cheaper.

    • @ignacio6454
      @ignacio6454 1 year ago +6

      If Moore's law is dead, be ready to pay $5000 for the 6090 :)

    • @steve_bisset
      @steve_bisset 1 year ago +1

      To be fair, I see the prices coming down. They can only keep charging so much when AI upscaling starts to do most of the work. The actual hardware is likely to plateau because there's just no need for monster cards with 600w power draw when the majority of the frames are software/AI based.

    • @kevin-sj3wt
      @kevin-sj3wt 1 year ago

      @@ignacio6454 Yeah, or $500, since no one is buying it because electricity is getting more expensive, and more GPU power wasted for nothing won't attract anyone

  • @sakracliche
    @sakracliche 1 year ago +1

    A controversial take - I agree with the 4090 price. Not the 4080 and the other "4080".
    I'd love to have a conversation about it in the comments if you can spell something other than "fuck nvidia". I'd like to hear someone else's opinion as well.
    Nvidia did deliver on the performance uplift. They improved DLSS (which in time will be a key upgrade for slower cards like the 4050 on next-gen titles, especially in CPU-bound games at 1080p), delivered outstanding thermals on the reference card, and all the less visible but still important things like much better Tensor cores, more and better SMs, and double the AV1 encoders and decoders, all this while not taking a single step backwards and raising the price by a meager hundred dollars, which is negligible if you have the budget for this kind of hardware. Especially after taking inflation into account.
    There's pressure on Nvidia and it tries to deliver. They bought more wafers because of the GPU shortage; now the roles are reversed and there's an oversupply, so of course they need to price the newer gen higher to keep the older cards selling until they're sold through. They also can't improve things out of thin air. You just can't expect to get a 20% uplift at the same price every 2 years.
    We got a top-notch card at a high but justified price. Older generations are still available at declining prices, because not all of us are expected to be able to afford it. The cards are built to last, and even the older cards' drivers are still being updated.

  • @trascallion
    @trascallion 1 year ago +45

    Props on the educated and unbiased explanation. You rock

    • @pandemicneetbux2110
      @pandemicneetbux2110 1 year ago +5

      Anthony is just so awesome, like fr every one of us gamers and tech enthusiasts out there knows he's among the GOATs to listen to for unbiased hardware reviews, like him and Steve. I think HWUB can be pretty good too, but I understand why some may say upside-down Steve is too pro-red (although, in all frankness, it's pretty damn easy to see why, considering EVGA dropped their asses and they had the audacity to threaten him for not emphasizing ray tracing enough; it's literally almost 2023 on 3rd-gen RTX and I still don't really give a F about RT much).
      Oh, and btw, it's ok to have strong opinion and still be unbiased. Like I don't think his review on Macshit was biased. We just all know that Apple sucks. I think some people get really confused about our societies (fatburgers in particular) in thinking that just because one vote=1 vote, and just because we allegedly have equal rights and free speech and all that, and that reporters are supposed to be unbiased, they come to a few erroneous conclusions:
      1) that in order to be a journalist, or host any kind of debate, or be unbiased generally, you have to give equal screenspace to the other opinions no matter how outlandish.
      This is horse shit. Just because your schizophrenic friend believes that the ancient Atlanteans used a portal to put a flag on the moon, doesn't mean ANYONE let alone a respectable journalist is in any way obligated to give your utter horse shit pants on head retarded psychotic opinion even a pretense of equal consideration. In point of fact, clinical psychopaths do this shit on purpose by issuing the most outlandish opinion they possibly can (like say, agitating for building death camps for black people) in order to get allegedly "reasonable" people to give their viciously stupid, psychopathic ideas more credit to agree to "meet in the middle" and "only" build a few death camps for *some* people. The very existence of those ideas is why journalists should go even harder in rejecting them. Like trying to act like nVidia pricing "isn't that unreasonable" yes, yes it f'ing is.
      2) that all ideas are created equal. This is equally bullshit, for the aforementioned reasons. It doesn't matter if it's commonly held, it doesn't matter if it's a rumour, and it sure as shit doesn't matter that you also get one vote. The idea in voting and free speech, btw, is that some idea so outlandishly retarded, like saying Dell makes good hardware, is that the rest of people who actually have a clue will speak up freely and refute you, and that the actually reasonable, sane, or informed people would average out the psychotic psychopathic nutjobs that are sincerely pushing for things like starting a civil war, storing your most precious data on only one drive, or giving Apple and nVidia your money right now. Ideas are not created equal, and not all cultures are good ones. I really don't care that it's somebody else's insane, retarded idea, Bacha Bazai boy parties are inferior and disgusting, Dixie Klan culture is inferior to my own, and "alienware" hasn't truly existed since Dell eviscerated that company.
      Obviously, Anthony is not going to agree with all I'm saying, nor should that be implied, but I'm just saying, that I like Anthony for those reasons and find him trustworthy the same way I find Steve trustworthy: because he isn't full of shit and he's willing to say what's wrong, and I don't feel like his arse is up for sale. He's funny and he's a good man, I trust his character. He's willing to just say what's what, and you can trust a person because they *actually* know what they're talking about and are just trying to get you the most accurate information possible. This is why Steve and Anthony are among my favourite journalists. I don't feel like they'll compromise themselves, and we're all sick of the price gouging b.s. of these companies and people in power with tons of money when all the little ol me's just want to get a graphics card to play games on (I realize the irony of saying it for this video, but I came here looking to bitch about power consumption.)
      Speaking of, I was looking at this >$800 USD computer, and it had a 3050 in it. You know, I made fun of nVidia for having such drastically higher power consumption and worse prices for Ampere (this is BEFORE the scalpocalypse hit, so you can imagine my scoffing later). And I was looking at this freaking thing, it had a damn 8pin in it. Seriously, a 50 non-ti card, that has an 8pin power connector. So I'd really like to see what kind of madness the 4050 offers, and how much spin and bullshit I can expect from their "mindshare" aka Heaven's Gate brainwashing in trying to instill in me some kind of brand bias that somehow, this is acceptable. Even for all the shittiness of 6500XT etc., at least AMD managed to fill that niche and have effectual use cases for the RX 6400, though it cost way too damn much (not as laughable as the GTX 1630 but still). So that's why I came here, is to say I'm looking at these 4090 specs and wondering just how insane it's going to be at the lower ends when they can't even make a non-ridiculous high end anymore. I mean what're they gonna do now, install an 8 pin and a 6 pin on a 4050ti? At least the shite cards you could still put in SFF cases and office machines.

    • @Daniel-bl1wf
      @Daniel-bl1wf Год назад +9

      @@pandemicneetbux2110 Holy shit. You responded to a single-sentence comment with this?

  • @Emmo76
    @Emmo76 Год назад +53

    Did I actually make it here on time for an LTT video?

  • @julio_adame
    @julio_adame Год назад +89

    Wow! Good on you guys for pointing out the lack of DisplayPort 2.0 support and what that means for this GPU

    • @frantisekheldak2228
      @frantisekheldak2228 Год назад

      120 FPS is enough for 4K. 120 fps vs 144 is not a big difference; it makes no difference as long as you are not blind. By the way, the RTX 4090 has HDMI 2.1, which supports 4K 144 Hz, so I don't know what you're talking about.

    • @julio_adame
      @julio_adame Год назад +5

      @@frantisekheldak2228 the 4090 performs beyond that. You've missed the point. Good day, sir.

    • @frantisekheldak2228
      @frantisekheldak2228 Год назад

      @@julio_adame HDMI 2.1 supports 4K 144 Hz!

    • @julio_adame
      @julio_adame Год назад +4

      @@frantisekheldak2228 and the 4090 can hit 4K 240 Hz performance, which is what DP 2.0 is necessary for. Do some reading and educate yourself, pls, ty
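
    The disagreement above comes down to link bandwidth, which is easy to sanity-check. A minimal sketch in Python; the effective data rates are the commonly quoted figures for each standard, and blanking overhead is ignored, so real requirements are slightly higher:

        # Does uncompressed 4K 240 Hz fit in each display link?
        def required_gbps(width, height, refresh_hz, bits_per_pixel=30):
            """Raw pixel data rate in Gbit/s (30 bpp = 10-bit RGB)."""
            return width * height * refresh_hz * bits_per_pixel / 1e9

        # Effective data rates after link-layer encoding overhead (Gbit/s):
        links = {
            "DP 1.4a (8b/10b)": 25.92,
            "HDMI 2.1 FRL (16b/18b)": 42.67,
            "DP 2.0 UHBR20 (128b/132b)": 77.37,
        }

        need = required_gbps(3840, 2160, 240)
        print(f"4K 240 Hz 10-bit needs ~{need:.1f} Gbit/s uncompressed")
        for name, capacity in links.items():
            verdict = "fits" if capacity >= need else "needs DSC"
            print(f"  {name}: {capacity:.2f} Gbit/s -> {verdict}")

    Uncompressed 4K 240 Hz at 10-bit works out to roughly 60 Gbit/s, so it only fits over HDMI 2.1 or DP 1.4a with Display Stream Compression, while DP 2.0's UHBR20 mode could carry it natively.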

  • @undef1n3d_
    @undef1n3d_ 4 месяца назад +1

    The accent at 13:05. I’m dead 💀

  • @wizardnotknown
    @wizardnotknown Год назад +1

    I just imagine Anthony shredding on a guitar in the background.

  • @decoder55killer
    @decoder55killer Год назад +94

    I remember when the flagship GPU 5-6 years ago (ASUS ROG 980 Ti) was $650… No one can say this performance isn't amazing, but the price is just ridiculous. While other hardware manufacturers have maintained or even lowered prices for better performance, Nvidia has tripled them.

    • @neoasura
      @neoasura Год назад +7

      I paid $500 for a GeForce 2 Ultra in 2000, which is close to $800 today, and it was outdated in less than a year. Kids today don't realize they had a good run of cheap components over the past decade; those days are over. Chinese workers don't want to be slaves anymore.

    • @big_matt3496
      @big_matt3496 Год назад

      Try getting 16,000 CUDA cores for $1,600 5-6 years ago.

    • @GeneralKenobi69420
      @GeneralKenobi69420 Год назад +1

      Now compare cost per frame instead of cost alone. You'd be surprised by how much the 4090 utterly dunks on the 980 Ti

    • @BlatentlyFakeName
      @BlatentlyFakeName Год назад

      They claim it's "inflation", but Nvidia is making more obscene profits than ever, so it's clearly a con.

    • @zahidshabir4038
      @zahidshabir4038 Год назад +2

      Yes, the price has increased a lot, but price-to-performance also improves every gen, and in most cases performance per watt does too. I hate that prices are so high, BUT you have to consider that back then there was no 90-series card, and COVID wasn't a thing yet, which drastically affected inflation and the cost of everything. Here in the UK, for example, I could easily get 2 litres of milk for £1.50 back in 2016/17; now it's more like £2 minimum for the same amount. Another price I remember from those days is a 2-litre bottle of branded soda (Coca-Cola, Pepsi, Fanta, etc.): always around £1 minimum, except Coca-Cola, which was only that cheap on offer. I know there's a tax on sugary drinks now, but sweeteners aren't taxed like sugar, and back then sugary and sugar-free cost about the same. Now even sugar-free (which I think is supposed to be around 20% cheaper, I don't know the exact number) is around £1.50 for a 2-litre bottle, and Coca-Cola, usually the most expensive mainstream brand, is only that cheap as part of a multi-buy deal (like 2 for £3).
      Inflation has ruined the market, but even so I think Nvidia is pricing these a little too high. Accounting for inflation, an XX80 Ti should be around $900 MSRP maximum nowadays, and the regular XX80 more like $750 max.

  • @thegeekno72
    @thegeekno72 Год назад +287

    What I took away from this video are the impressive stats for the 6950 XT. Not a perfect card, it's got its pros and cons, but the margins are smaller than most think, and knowing it launched 50% cheaper than the 3090 and trades blows in many titles with the 3090 Ti, which costs almost double, is a big W for gamers. It's got me really excited for RDNA3, coming up very, very soon! I've been riding with Nvidia so far, but I've been considering jumping ship these last few years.

    • @davisbradford7438
      @davisbradford7438 Год назад +8

      And in some applications it slays. 7:54.

    • @ABaumstumpf
      @ABaumstumpf Год назад +5

      Well - higher rasterisation performance, lower raytracing performance, better general-purpose productivity, worse when compared against CUDA.

    • @I_Like__bananas
      @I_Like__bananas Год назад +11

      AMD GPUs really do age well thanks to driver updates. You can see that as a good point, or just see AMD cards as unfinished when they come out.
      Vega GPUs, for example, became a lot better with newer drivers.

    • @unfortunatelyswagged6226
      @unfortunatelyswagged6226 Год назад +2

      @@davisbradford7438 this is almost certainly a mistake; there are many errors throughout the video

    • @blackknight4152
      @blackknight4152 Год назад +10

      No DP 2.0 support basically makes this gen an extension of the 30 series. The 4090 should be flagship material, yet they won't offer the latest DisplayPort tech. I'm also considering jumping ship after this gen for sure.

  • @Onlybadtakes2589
    @Onlybadtakes2589 10 месяцев назад +1

    Prelude was actually my first car in high school hahaha

  • @TheOneChriss
    @TheOneChriss Год назад +2

    I just bought a 3090 earlier this year, and now this... The performance upgrade is so huge, and it's the same bloody price as what I paid. I feel like I've wasted my damn money.

  • @k9chilly5
    @k9chilly5 Год назад +133

    OK, these test results are VASTLY different (referring to 4K, ultra, no DLSS, RT Ultra) from what some other reviewers are getting. Not slamming this video at all, just a reminder to watch other reviews. Test benches definitely impact the results, it seems. Wild card, wild money. Will be waiting lolol
    Edit: good grief, this is the most interaction I've ever had with a comment lol. But yeah, an almost 4x increase in that Cyberpunk performance when others are seeing around a 40% increase (which is still bonkers) definitely raises an eyebrow. Gonna be an interesting WAN Show I guess lolol

    • @fjb666
      @fjb666 Год назад +12

      Don't forget random silicon quality differences.

    • @reloadingdontshoot1
      @reloadingdontshoot1 Год назад +6

      Wait for results from Jufes at Frame Chasers. He shows the in-game tests, not just BS graphs anyone could've thrown together

    • @FOGoticus
      @FOGoticus Год назад +13

      Der8auer saw some really impressive numbers even when limiting the GPU to about 60% power draw.

    • @zshadows
      @zshadows Год назад +9

      Yeah, they are claiming about double in Cyberpunk 2077 what other reviewers got. Something is up.

    • @Magnus0891
      @Magnus0891 Год назад +5

      Check out Igor's Lab's test report... he tested the card with an AMD Ryzen 9 7950X (the fastest CPU you can currently get, and the only CPU it should be tested with as of today, since Intel's Raptor Lake 13900K hasn't released yet). The videos from Igor's Lab may be in German, but his tests also come with a translated 12-page English article with all the graphs, benchmarks, etc. on his website. He's probably the most respected tester/reviewer in the industry; even Linus Tech Tips, JayzTwoCents, Gamers Nexus, der8auer and other reviewers constantly refer to his testing and give him credit for his work.
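
    On the power-draw experiment der8auer ran (mentioned above): the board power limit can be capped from the command line if you want to reproduce that kind of test. A minimal sketch, assuming nvidia-smi is on your PATH, you have admin rights, and the driver permits the requested limit; 270 W is roughly 60% of the 4090's 450 W default:

        # Cap an NVIDIA GPU's board power limit via nvidia-smi (needs admin/root).
        import subprocess

        def set_power_limit(watts: int, gpu_index: int = 0) -> None:
            """Ask the driver to enforce a board power limit in watts."""
            subprocess.run(
                ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
                check=True,
            )

        if __name__ == "__main__":
            set_power_limit(270)  # ~60% of the 4090's stock 450 W target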

  • @cookiewriter4001
    @cookiewriter4001 Год назад +96

    If the RDNA3 x800 XT can stay at or below $800 and deliver comparable results, it will be the new high-end king. The 40 series lacks competitive pricing; $700 was right at the edge for most gamers. I had friends who bought the 3080 with money saved up over the summer. I can't see that happening with a $1200 card in the same class just one generation later.

    • @LetrixAR
      @LetrixAR Год назад +8

      If this card weren't a 4090 but a Titan, or a workstation card, almost no one would complain about the price.

    • @tag206
      @tag206 Год назад +1

      Now they can save up $900 and get a 70-class card.

    • @CptVein
      @CptVein Год назад +8

      How is AMD supposed to match this performance at little more than half the price?

    • @GlorifiedGremlin
      @GlorifiedGremlin Год назад +5

      @@LetrixAR But... it's not, though. So why bring it up lol

    • @tag206
      @tag206 Год назад +6

      @@CptVein They don't have to match it; if they can give us even 90% of a 4090 for $800, they pretty much win this gen.

  • @KEMOQUWE
    @KEMOQUWE 4 месяца назад +2

    when i saw the thumbnail: fr the fattest ones💀

  • @em.dot.2
    @em.dot.2 Год назад +2

    It's a bit shady to have the RX 6950 XT at the bottom of every chart, even when it's the second-best GPU on the chart.

  • @kevinantonowvideo
    @kevinantonowvideo Год назад +7

    Watching this on an AGP card!

  • @zsookah3
    @zsookah3 Год назад +10

    Gonna stick with the 3000 series and wait for the 5000 series. The nefarious pricing and irresponsible power consumption are ridiculous.

    • @MrPaxio
      @MrPaxio Год назад +2

      I'm still sticking with my 1070 Ti, as it does great gaming in VR and at 1440p; it might be the 8000 series by the time I switch. It always surprises me that these people buy 3090s and 4090s but still game at 1080p. Like, what do you need that performance for? Rendering hentai?

    • @dragonbloodeye8661
      @dragonbloodeye8661 Год назад

      I have the 3090. Trust me, the 4090 is new technology like the 2000 series was, and the 3090 was the refined version of the 2000 series' technology. That means the 5000 series is going to be a refined version of the 4000 series. I would wait for the 6000 series; that would be the biggest jump in performance.

    • @ImDembe
      @ImDembe Год назад

      @@MrPaxio The 3090 is the best 1080p card on the market: it gets more than twice the fps of your 1070 Ti in most titles, and even its 1% lows are higher than a 1070 Ti's max fps in modern games, meaning stable performance. Plus, buyers don't pair a high-end GPU with a garbage CPU, and that has a lot to do with the stable performance at 1080p (a little less so at 1440p).
      It's a night-and-day difference, but you need the monitor and CPU to support a higher-end GPU too.

    • @ImDembe
      @ImDembe Год назад

      @@SweatyFeetGirl Wouldn't say a lot better; the RTX 3090 and 6950 XT trade blows at 1080p, and it all depends on the title.
      And 1080p is very CPU-dependent too.

    • @ImDembe
      @ImDembe Год назад

      @@SweatyFeetGirl Might be so, but how many RX 69xx/RTX 3090 cards are paired with an i7-7700 or a Ryzen 2600X? For lower-tier GPUs like the RX 6600, ending up in an older system is actually possible.
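
    Since "1% lows" keep coming up in this thread: they're derived from per-frame render times rather than a plain average, which is why they capture stutter that average FPS hides. A minimal sketch of one common way to compute them, assuming a list of frame times in milliseconds (e.g. from a PresentMon or CapFrameX capture; exact methodology varies between reviewers):

        # One common "1% low" definition: average FPS over the slowest 1% of frames.
        def low_fps(frame_times_ms, percent=1.0):
            worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
            n = max(1, round(len(worst) * percent / 100))  # at least one frame
            return 1000.0 / (sum(worst[:n]) / n)

        # Ten made-up frame times (ms): mostly ~7 ms with two stutters.
        times = [6.9, 7.1, 7.0, 25.0, 7.2, 7.0, 6.8, 30.0, 7.1, 7.0]
        print(f"average: {1000 * len(times) / sum(times):.0f} FPS")  # ~90 FPS
        print(f"1% low:  {low_fps(times, 1):.0f} FPS")               # ~33 FPS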

  • @Comeback.king.
    @Comeback.king. Год назад +2

    That thumbnail ain’t it chief

  • @darkerrorcode
    @darkerrorcode Год назад +1

    Seems like I'll be keeping my 3070 until it degrades into its elemental forms

  • @512TheWolf512
    @512TheWolf512 Год назад +7

    looks like a good card to buy in about 7 years time.

    • @zhanucong4614
      @zhanucong4614 Год назад

      No, this GPU is clearly a joke

    • @512TheWolf512
      @512TheWolf512 Год назад

      @@zhanucong4614 After 7 years it should cost less than $400, which would make it not a joke anymore

    • @zhanucong4614
      @zhanucong4614 Год назад

      @@512TheWolf512 By then it would literally be the new GTX 980, and no longer in production

  • @melodiclodgings8
    @melodiclodgings8 Год назад +48

    I feel this GPU would be better suited to animation, film, medical, and many other industries than to consumers

    • @SwordfighterRed
      @SwordfighterRed Год назад +14

      Isn't that more what the Quadro line is for, anyhow? Or do folks even use those?

    • @Jaeeden
      @Jaeeden Год назад +3

      @@SwordfighterRed as far as I can tell, the only PCs with Quadro GPUs are ones that are prebuilt specifically for those industries.

    • @untitled795
      @untitled795 Год назад +1

      An A100 or A6000; this is pathetic compared to those.

    • @froznfire9531
      @froznfire9531 Год назад +1

      @@SwordfighterRed But many people nowadays game and work on the same PC, so a card that's great at both is the way to go.

    • @realomegamodern
      @realomegamodern Год назад

      @@SwordfighterRed they are for servers

  • @knampf9779
    @knampf9779 Год назад +1

    13:50 I laughed when he pointed out all the other stuff you can buy for the same price. Good comparison. Very informative video.

  • @jace_Henderson
    @jace_Henderson Год назад +13

    I was really hoping we’d get something like the 3000 series, which beat the previous gen's performance by a good bit for much less. I was expecting MSRPs at least not too much above the 3000 series', but now we're literally back up to RTX 2000 MSRPs at best.

  • @dx7gaming
    @dx7gaming Год назад +51

    I don't have an explanation for the frame rate differences. If I had to guess though, I would say it might be related to the drivers. Nvidia drivers often contain specific optimizations and profiles for each game. It could explain why some games perform really well, while others perform poorly. Having the best hardware can't fix bad software and bad coding. Often bottlenecks and performance problems are purely software issues (I'm a software developer).

    • @AtticusHimself
      @AtticusHimself Год назад +8

      Oh ok "software developer"
      Was this close to calling bs until I saw that impressive title

    • @user-ko4zp1wm2i
      @user-ko4zp1wm2i Год назад +11

      @@AtticusHimself He's right and it is basic knowledge.

    • @AtticusHimself
      @AtticusHimself Год назад +3

      @@user-ko4zp1wm2i "and it is basic knowledge" that's the entire point of my comment

    • @thedeadlygames9716
      @thedeadlygames9716 Год назад +1

      @@AtticusHimself he is right though. You people seem to focus more on hardware and less on software, which also causes bad frames. Hardware is not to blame.

    • @unkemptwalrus4643
      @unkemptwalrus4643 Год назад

      @@AtticusHimself then you worded it poorly
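
    To make the software-bottleneck point above concrete: if CPU-side work per frame (game logic, draw-call submission, driver overhead) is fixed, frame rate caps out no matter how fast the GPU is. A toy model in Python, with made-up millisecond costs purely for illustration:

        # With CPU and GPU stages pipelined, frame time is roughly the slower stage,
        # so a fixed CPU/driver cost caps FPS regardless of GPU speed.
        cpu_ms = 5.0                        # hypothetical per-frame CPU/driver cost
        for gpu_ms in (10.0, 4.0, 1.0):     # progressively faster GPUs
            frame_ms = max(cpu_ms, gpu_ms)  # the slower stage sets the pace
            print(f"GPU {gpu_ms:4.1f} ms -> {1000 / frame_ms:3.0f} FPS")

    The two faster GPUs both land at 200 FPS: past that point, only software (or a faster CPU) helps, which is consistent with per-game driver optimization mattering so much.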

  • @nathanielelijah5899
    @nathanielelijah5899 Год назад +97

    For the Blender benchmarks, it would be nice if you guys specified whether you used OptiX or not, as it can give a massive improvement over CUDA; it would be good to see both sets of results.
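
    For anyone wanting to run that comparison themselves, Blender exposes the Cycles compute backend through its Python API. A minimal sketch to paste into Blender's Python console (assumes a recent Blender build; OptiX needs an RTX-class GPU):

        # Switch Cycles between CUDA and OptiX so the same scene can be timed on both.
        import bpy

        def set_cycles_backend(backend: str) -> None:
            """backend: 'CUDA' or 'OPTIX'."""
            prefs = bpy.context.preferences.addons["cycles"].preferences
            prefs.compute_device_type = backend
            prefs.get_devices()                        # refresh the device list
            for device in prefs.devices:
                device.use = (device.type == backend)  # enable only matching devices
            bpy.context.scene.cycles.device = "GPU"

        set_cycles_backend("OPTIX")  # render, note the time, then repeat with "CUDA"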

  • @chrisvig123
    @chrisvig123 Год назад +1

    It’s unfortunate there are no 4K monitors over 27” that actually support high refresh rates 😮

    • @EGH666
      @EGH666 Год назад

      that's where TVs come in. 65-inch 120 Hz 4K HDR10 OLED.

  • @akshajande6304
    @akshajande6304 Год назад +3

    RTX 34090 LMAO

  • @TheMrZeeRow
    @TheMrZeeRow Год назад +72

    I had no idea the 6950 XT was actually that competitive with the 3090s; might have to give it some more consideration

    • @GWT1m0
      @GWT1m0 Год назад +47

      It always was. Everyone expects AMD to push prices down, but those same people still choose to buy Nvidia

    • @toddsimone7182
      @toddsimone7182 Год назад +9

      If you don't plan on using much RT then it could be worth it, depending on what pricing does after RDNA3 launch

    • @jesusbarrera6916
      @jesusbarrera6916 Год назад +17

      LTT has always had a weird bias against AMD.... especially when RDNA2 proved to be a great generation of GPUs

    • @Starfoxmack10
      @Starfoxmack10 Год назад +10

      I got a 6900 XT for that very reason. Wasn't fussed about RT and got one new for £700. I undervolted it so the power doesn't go over 250 watts and the core clock stays at 2550 MHz. It has done me proud 👍👍

    • @nombrepredeterminado6463
      @nombrepredeterminado6463 Год назад +1

      @@GWT1m0 there is no contradiction

  • @FourthLast
    @FourthLast Год назад +68

    I did an AVI encode today and the 14 GB video dropped to 2 GB and looked absolutely gorgeous, though it still lost some very minor detail. Still couldn't believe it when I saw the file size. I didn't have a hardware encoder though, so it took like 8-9 hours lol.

    • @retrosapien1
      @retrosapien1 Год назад +1

      what are the exact steps to do so, link a guide or vid or something?

    • @Ubya_
      @Ubya_ Год назад +3

      @@retrosapien1 look for ffmpeg encoding tutorials

    • @MizarcDev
      @MizarcDev Год назад +3

      @@retrosapien1 Handbrake's snapshot build has SVT-AV1 support, which is great if you want a simple-to-use GUI for encoding.

    • @retrosapien1
      @retrosapien1 Год назад +2

      @@Ubya_ thank you

    • @knightsljx
      @knightsljx Год назад +6

      AV1, not AVI
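
    For anyone wanting to try the kind of encode described above: here's a minimal sketch that shells out to ffmpeg's software SVT-AV1 encoder from Python. It assumes an ffmpeg build compiled with libsvtav1; the file names and quality settings are just examples, and, as the original comment found, software encoding is slow:

        # Software AV1 re-encode via ffmpeg's SVT-AV1 encoder.
        import subprocess

        def encode_av1(src: str, dst: str, crf: int = 32, preset: int = 6) -> None:
            """Lower crf = higher quality/bigger file; lower preset = slower/better."""
            subprocess.run(
                [
                    "ffmpeg", "-i", src,
                    "-c:v", "libsvtav1",
                    "-crf", str(crf),
                    "-preset", str(preset),
                    "-c:a", "libopus", "-b:a", "128k",  # re-encode audio to Opus
                    dst,
                ],
                check=True,
            )

        encode_av1("input.mp4", "output.mkv")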

  • @CheddarCheeseBandit
    @CheddarCheeseBandit Год назад +1

    don't worry, 10 years from now, people will be calling this GPU "ancient junk"

  • @IntergalacticViking
    @IntergalacticViking Год назад +4

    Would love to see some machine learning benchmarks in the productivity section!

  • @markloroff
    @markloroff Год назад +21

    dang i remember contemplating whether or not i wanted to spend $400 on my 1070 lol. Feels good now.

    • @dez7852
      @dez7852 Год назад

      I'm right there with you with my 1080 Tis. I would probably shit myself if I got onto a new machine, but for now I still have a beast... to me.
      Also, I would have to build an entirely new machine. Shit. I'm still running an i7-8700K. Haven't even tried to overclock it yet. Might be time to try!

    • @toddsimone7182
      @toddsimone7182 Год назад

      You're making us feel old. Right around that time I dropped almost $800 on a 1080 Ti.

    • @skolex3121
      @skolex3121 Год назад +1

      i7 950 reporting in with a GTX 770 that I bought for 350€

    • @brad8122
      @brad8122 Год назад

      1070s go for 100 bucks used. Congrats on your loss.

    • @markloroff
      @markloroff Год назад

      ​@@brad8122 I bought it new when it was released 6 years ago and the comparable GPU in the 40 series costs almost $1000.

  • @RobertoCresto
    @RobertoCresto Год назад +93

    Honestly, I'm definitely more interested in the 4060. Upgrading from my 1060 will be a big improvement

    • @stevieknight9225
      @stevieknight9225 Год назад +10

      That's always where the bang for the buck is. I think AMD could focus on this area this gen, though

    • @tyrannus00
      @tyrannus00 Год назад +4

      @@stevieknight9225 I've also got a 1060 and am looking for an upgrade, but there is no way I'm buying an AMD GPU. I hope the 4060 isn't priced too stupidly

    • @FeuerToifel
      @FeuerToifel Год назад +10

      As long as Nvidia still has a massive stock of 3000 series cards, there will most likely not be a 4000 series besides the 4090, 4080 and 4070. Err, I mean the 4080 12GB.

    • @endfm
      @endfm Год назад +20

      @@tyrannus00 lol AMD GPUs are fine, you will not miss frames.

    • @tyrannus00
      @tyrannus00 Год назад +7

      @@endfm It's not about the frames, it's about all the other stuff. I would miss things like RTX Voice, GeForce Experience, and having drivers that work well all the time and support everything, etc. The fps are just the tip of the iceberg

  • @r3dwhiteandblue92
    @r3dwhiteandblue92 3 месяца назад +1

    Can anyone explain why the 4090 is so much better at ray tracing than any other GPU on the market? I'm genuinely curious as to why it's such a beast at ray tracing.

    • @lemagreengreen
      @lemagreengreen 9 дней назад

      Well, the 4090 has far more RT cores than the 3090, for example, but looking at the 40 series in general, Nvidia claims each RT core is around 2x as powerful. They also claim the huge L2 cache on the 40 series helps, and that does seem to play out.
      They've got a paper on the Ada architecture you can read; it does seem like a genuine generational leap as far as ray tracing goes.

  • @mattcole4262
    @mattcole4262 Год назад +1

    When Anthony comes out to play, you know it's serious

  • @ahmedrauf8777
    @ahmedrauf8777 Год назад +42

    Pretty sure they made a mistake at 5:28 when they showed the Cyberpunk performance (which imo is also the most significant jump). I watched a few other videos, namely the ones by JayzTwoCents and DigitalFoundry, where they measured the CP2077 RT Ultra performance at around 45-50 fps at 4K, while LTT is measuring it at 97... which is higher than what the other videos show as the non-RT performance.
    edit: Hardware Unboxed, not Digital Foundry

    • @Churchgrimm
      @Churchgrimm Год назад +1

      I'm wondering if the different version of Windows plays into that.

    • @Flukiest
      @Flukiest Год назад +3

      The main difference is that LTT is using a Ryzen 7950X while JayzTwoCents is using a 12900K. LTT is also using 16GB of DDR5 memory.

    • @ahmedrauf8777
      @ahmedrauf8777 Год назад +4

      @@Flukiest Still... an almost 100% difference can't be down to a CPU bottleneck, especially when the CPUs being used are among the best from their respective brands

    • @DavidStruveDesigns
      @DavidStruveDesigns Год назад +3

      @@ahmedrauf8777 It can if you combine a different CPU architecture _and_ a different Windows version _and_ a different DDR generation. You'd be surprised how much of a difference that can make. Hell, just a different Windows version alone can make a huge difference.

    • @MusicPxviMLP
      @MusicPxviMLP Год назад +3

      ​@@DavidStruveDesigns I'll go with Ahmed; it doesn't matter too much whether it's the 7950X or the 12900K