Linus played on a new RTX 5090

  • Published: Jan 23, 2025

Comments • 5K

  • @ds5045 15 days ago +4344

    Can we pay fake money for fake frames?

    • @sulpherbratigh7936 15 days ago +142

      Bitcoin

    • @bankaimaster999 15 days ago +90

      You can just buy the B580 instead; less money going to them will get your point across. So I hope you're not one of those hypocrites with their cards ...

    • @nw932 15 days ago +165

      We already do lol, our money is backed by nothing.

    • @DuBstep115 15 days ago +25

      @@nw932 never was

    • @avengersoul 15 days ago +62

      AI upscaling subscription coming to a GPU near you.... I wouldn't be surprised if they swap to a model like that in the future.

  • @MordecaiSPM 15 days ago +2999

    9:18 Don't be fooled.
    The GPU's raw performance is 28 fps.
    Like, huh? This is so stupid.
    There is no way their raw power only gets 28 fps.
    This is even more proof that modern games are extremely unoptimized and heavily reliant on DLSS.
    EDIT: I've seen some dumb takes here, like that it's beneficial for EVERYONE to have a minimum standard of 60fps instead of 30fps.
    It's also beneficial for everyone that developers optimize their game from the very beginning.
    I know that they are using PT and RT, which is why it only got 28fps. Yes, I understand that, but even a 4090, the highest GPU on the market, ran like garbage in modern AAA games without PT and RT.
    It's crazy how people are just straight up fine with unoptimized games when optimization benefits everyone.

    • @KraszuPolis 15 days ago +442

      It's because it runs full path tracing at 4K, which is ridiculously demanding. But yeah, fake frames suck; that isn't performance.

    • @sanek9500 15 days ago +163

      In the same scenario with full RT Overdrive, a 4090 gives you 17-18 fps of raw performance. Just to be clear.

    • @thepatriot6966 15 days ago +146

      @@KraszuPolis You wouldn't need DLSS/FSR for full RT if the games were made properly. You'd just need rasterisation.

    • @YamGaming 15 days ago +61

      @@sanek9500 The 4090 gives 20 fps, but keep in mind this "jump" in "raw performance" will be even smaller when you're not using RT

    • @TheParentYouNeverHad 15 days ago +7

      Yep, I commented the exact same thing.

  • @Teletha 15 days ago +1581

    Graphics Processing Unit ❌
    Frames Generating Unit✅

    • @GFClocked 15 days ago +87

      Hallucination lsd-trip unit 🤡

    • @shreyasdharashivkar8027 15 days ago +70

      27 frames on native resolution ahh card

    • @sadge0 15 days ago +23

      @@shreyasdharashivkar8027 this is the future you don't understand!!11!!! 😠😠

    • @leszeksykacz5929 15 days ago

      @@shreyasdharashivkar8027 on 4k with path tracing

    • @DWN-024ShadowMan 15 days ago +4

      😂😂😂 @@sadge0

  • @Icureditwithmybrain 14 days ago +300

    Please note that this new technology will not be effective with most games in your Steam library that do not have frame generation or DLSS options available in the in-game settings.

    • @noderunner_ 14 days ago +49

      But most of those games don’t need this technology to run well.

    • @MoneyyyOG 14 days ago +11

      lmao these cards will still kill it performance-wise in the games that don't get support, because the internal specs and VRAM are already just as good as or better than the 4000 series.

    • @palniok 14 days ago +2

      Hasn't Nvidia announced driver-level DLSS enforcement?

    • @CertifiedSunset 14 days ago +2

      @@palniok Yup, from how I understand it, they can force inject DLSS into pretty much any game with the new cards. Not sure how it works, but that's pretty crazy if true.

    • @sparda9060 14 days ago

      @@CertifiedSunset AMD already has that for FSR and frame gen. you can turn it on by force in the driver. Kinda weird Nvidia took this long to put it in.

  • @Shizlgizl 15 days ago +227

    A bunch of tech channels have already commented that what DLSS 4 offers is 3 "fake" frames for every 1 true frame.
    It only works in games that support it, and while it LOOKS smooth, it has noticeable delay.

    • @alexturnbackthearmy1907 15 days ago +54

      It doesn't even look smooth; there is noticeable ghosting and smearing even in these early reviews. Which means it will be even worse in person.

    • @bigturkey1 15 days ago +33

      a lot of the tech channels on youtube are just influencers posing as reviewers

    • @rob_over_9000 15 days ago +16

      @@bigturkey1 Very true. If it ain't GN, I ain't interested.

    • @mroutcast8515 15 days ago +15

      Regular frame gen makes only every second frame fake. DLSS 4 offers 1 true frame for every 4 frames, while the remaining 3 are interpolated frames that DO NOT react to your inputs. 30 fps frame-genned to 120fps will look like 120fps but will play like 30fps. Frame gen from DLSS 3 was only good when you were hitting like 70-90fps natively and wanted to increase motion fluidity, since most people have 144Hz+ monitors.
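
      A minimal sketch of that arithmetic, assuming pure interpolation where input is only sampled on real, rendered frames (numbers illustrative):

      ```python
      # Interpolation multiplies the *displayed* frame rate, but the game
      # still reacts to input only at the rendered rate.
      def framegen(rendered_fps, generated_per_real):
          displayed = rendered_fps * (1 + generated_per_real)
          input_ms = 1000.0 / rendered_fps  # gap between input-reactive frames
          return displayed, input_ms

      for label, n in [("DLSS 3 frame gen (2x)", 1), ("DLSS 4 MFG (4x)", 3)]:
          shown, ms = framegen(30, n)
          print(f"{label}: 30 fps rendered -> {shown:.0f} fps shown, "
                f"input-reactive frame every {ms:.1f} ms")
      ```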

    • @jagildown 15 days ago +2

      And of that 1 real frame, only 1/4 is actually rendered; the remaining 3/4 is DLSS upscaling

  • @eddiehuang1811 15 days ago +1526

    As an owner of a 4080 Super: the extra fake frames feel terrible. It does not feel smooth, and the response to your keyboard and mouse feels really delayed. I tried them a few times and never turned them back on.

    • @TheHighborn 15 days ago +203

      Finally somebody else said it.

    • @GFClocked 15 days ago +299

      Be careful. I just said the same on another video and got 10 fanboys telling me I'm against technology and I'm stupid etcetc. Supposedly if they can't notice then the issue doesn't exist🤡

    • @eddiehuang1811 15 days ago +82

      @@GFClocked i doubt they have a 40 series card lmao

    • @РаЫо 15 days ago +21

      That’s why they showed a reduced latency. Everything will be AI in the future

    • @me-low-key 15 days ago +51

      @@РаЫо It's wild they are going that route; every bit of compute that goes to AI is compute that's not going toward actually generating authentic pixels the way devs want them.

  • @e_Moses 15 days ago +68

    The reason the lower-end cards are cheaper than the last generation is that they cut them down even more compared to the last generation. They are upselling lower-tier cards, so in reality they are more expensive. They are pulling an RTX 4080 12GB situation again, but this time nobody seems to be complaining. And they are doing it for the whole lineup below the 5090.

    • @kthec1298 1 day ago

      ik right, when i saw how they cut the 5080 down to half the 5090 i was like wtf is this. the 5070 is even worse; without dlss those cards are head to head with the 40 series

  • @bart_fox_hero2863 15 days ago +487

    Linus would have been an infomercial star like 30 years ago

  • @beardeddutchman5197 15 days ago +1028

    I'm not paying 2k for double the fake frames. Real frames peaked

    • @smokinace926 15 days ago +22

      True, but tbh human eyes don't get a real benefit over 120 fps. So what's the point in trying to go higher?
      Edit: for those who want to comment, focus on the term "real benefit", not actual limit. It's like watching something in 8K vs 4K: technically better, but the difference becomes marginal.

    • @TechnoMinarchist 15 days ago +88

      @@smokinace926 Well when it can't even go over 28fps at native....

    • @wilkinlow 15 days ago

      Tell me when they hit 60fps @@smokinace926

    • @zenixvampirchik652 15 days ago +37

      @@smokinace926 tbh it's harder and harder to notice, sure, but proper 240 is already noticeably better. For those who chase further and further fluidity it makes sense I guess

    • @Julio-Ces4r 15 days ago +16

      @@smokinace926 the point is, the real frames are only 28, so...

  • @pulithevar8135 15 days ago +852

    5090: “Hey we heard you love AI generated frames so we put more AI in your AI”

    • @novapulser9396 15 days ago +29

      Truly a yo dawg moment

    • @kyo_. 15 days ago +12

      truly an Xzibition of our time

    • @Mr.Atari2600 15 days ago +20

      Let's play Unreal 98!
      RTX 5090: Hey, I can't play that, it doesn't use DLSS!
      "You can't, but I can!" *3DFX Voodoo 3 3000*

    • @paavola2 15 days ago +1

      Why is it on Performance and DLSS? I run a 5K ultrawide with a 4090: no DLSS, quality settings, etc. Runs like a champ. No need for DLSS and all of that crap.

    • @idindunuphenwong 15 days ago +2

      At least now we know Asmon isn't afraid to tell us what kind of FPS he gets even if he is being a liar while getting ripped off for 110 FPS with a 4090

  • @Iesous27 15 days ago +460

    You're basically paying and banking on Nvidia to provide you performance through software updates rather than buying a solid and powerful GPU. No thanks.

    • @darkspartan566 15 days ago +16

      Yes 💯% True!

    • @ronuss 15 days ago +21

      this is it 1000%. it's like a subscription: you want to unlock the new software features we locked behind new hardware that's not even much better? give us 2k...

    • @grantsnyder3250 15 days ago +3

      The software “updates” are only for their next gen cards everytime

    • @joonashannila8751 15 days ago +8

      Ehh its still RTX card. A very powerful graphics card.

    • @Iesous27 15 days ago +10

      @joonashannila8751 and the moment Nvidia boinks an update, your performance goes to the crapper. Or better yet, when they decide to stop supporting the card.

  • @sjneow 14 days ago +8

    If Asmon wants to see for himself how the AI-generated frames look, he can fire up a game and play it with frame gen; he has a 4090, right?

  • @PhaseMyself 15 days ago +170

    Now we wait for game devs to release unoptimized slop running at 15fps and leaving it to MFG to reach their 60fps target.

    • @PerciusLive 15 days ago +29

      60 frames being the target is very disingenuous now because 60 frames at native should be the target. Not 60 frames after all these AI bs

    • @SSHayden 15 days ago +3

      If even that. A lot of this ai shit also depends on the dev implementation. If they implement it bad, it could turn out almost unplayable. And if they can't make optimized games anymore, who says they can do anything else?

    • @wolfsbane9985 15 days ago +2

      @@PerciusLive but that's not what they're doing. New games are so bad now that you literally need to enable the AI software to get above 60fps.
      Edit: After reading your comment again I realized we're saying the exact same thing. My bad.

    • @snacsnac181 15 days ago +1

      dont you dare be so correct

    • @murray821 14 days ago

      @@PerciusLive I always disable every motion optimisation on my TV; they all make the image look weird, with all kinds of artifacts and ghosting

  • @maverickstclare3756 15 days ago +865

    18:10 Zack still doesn't understand that the retail price of goods is not determined by the cost of production. Goods are priced at the optimum point that generates the most aggregate profit.
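
    A toy sketch of that pricing logic, with entirely made-up numbers: the seller searches for the price that maximizes aggregate profit over a demand curve, rather than marking up from cost:

    ```python
    def units_sold(price):
        # Hypothetical demand curve: fewer buyers at higher prices.
        return max(0, 120_000 - 50 * price)

    def profit(price, unit_cost=300):  # unit_cost is a made-up build cost
        return (price - unit_cost) * units_sold(price)

    best = max(range(300, 2401), key=profit)
    print(best, profit(best))  # -> 1350: the optimum sits far above cost
    ```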

    • @joev3783 15 days ago +125

      You're absolutely correct in saying this. It's why ink cartridges have a nightmarish profit margin. They know you'll pay.

    • @xkblxcripple 15 days ago +56

      Sad that Asmon will read comments on videos he reacts to and not his own YT comment section.

    • @damazywlodarczyk 15 days ago +14

      These are two separate things. Of course the price is determined by the cost of production. And they come down from the price after they have developed cheaper productions methods.

    • @damazywlodarczyk 15 days ago

      ​@@joev3783 It's a different business model, ink cartridges works like a subscription. You pay for the printer in ink.

    • @xkblxcripple 15 days ago +18

      @@damazywlodarczyk They are not separate, they are priced above BOM/production and margined at what people will spend for the product. It is tech not a necessity.

  • @SomeOne-hw6jw 15 days ago +393

    $2000 to play a few ultra-unoptimized and buggy games at 300fps (75% AI-generated frames) with a disgusting 60ms input delay

    • @SBlazeable 15 days ago +19

      yep, and I'm going to buy it anyway because of the 32GB of VRAM required to play modded Skyrim, then rent it out when I'm done to make $10/day and hopefully pay off the entire PC in a year's time.
      this is, however, perhaps my last video card purchase; I'll just hold onto it for 20 years unless they go back to generating real frames, because devs are going to get lazy. why work hard when AI can smooth everything out? you'd think there'd be a parallel economy for real-frame GPUs, but idk that the lay person will value it over the fake BS, and us real-frame enjoyers will just get the shaft: if you want a complete game, dlss is required.

    • @therealkylekelly 15 days ago +2

      Let's see you design something better.

    • @Barbarossa97 15 days ago +19

      Woke DEI slop nonetheless

    • @ghoulbuster1 15 days ago +4

      Time to go back to good old pixel games.

    • @wolfsbane9985 15 days ago +1

      @SBlazeable the VRAM increase is literally the only reason I'm interested as well lol. 8GB more is decent. If not for that, it honestly doesn't look all that compelling

  • @Kujamon 15 days ago +8

    Lul, the monitor turning off when he slammed the case shut

  • @dznnah 15 days ago +851

    1000 bucks for 16GB of vram.
    this is the worst timeline

    • @РаЫо 15 days ago +17

      Ai reduces vram usage who cares

    • @vch309 15 days ago

      Fr, plus it's GDDR7, but people love complaining @@РаЫо

    • @dznnah 15 days ago +233

      @@РаЫо HAH. Good one.

    • @Jeffdaam 15 days ago +78

      @@РаЫо and that's the problem the over reliance on AI

    • @YamGaming 15 days ago +31

      @@РаЫо In their tests it only reduced 400mb tho

  • @HecmarJayam 15 days ago +77

    One of the problems with fake frames is that the 'performance' doesn't carry over to VR or other high-demand applications: 3D modeling, video editing, multi-screen, etc.

    • @demonfox51321 15 days ago

      So no reason to get 5090 if those don't work

    • @socks2441 15 days ago +11

      yip. pure rasterization performance is always more important and more useful in general for various tasks. imho the die space for dlss, raytracing and ai is just wasted resources that could have been put toward more powerful gpu's.

    • @Wppsamsung2024 15 days ago +4

      That's a good point

    • @therealkylekelly 15 days ago +1

      @@socks2441 you seem to not understand current engineering capabilities; the reason we have software solutions is because of our hardware limitations. GeForce is geared towards gaming, not productivity. DLSS and ray tracing are both beneficial for gaming.

    • @your_average_cultured_dude 15 days ago +5

      Frame gen can be nauseating even on regular monitors, I can't imagine how vomit inducing it would be in VR.

  • @Chesslover69420 15 days ago +554

    When he says "ai" all I can hear is "horrible TAA" and "blurry in motion" and also "horrific ghosting artifacts", too. Stalker 2 is one great recent example of all of those.

    • @BlazeBluetm35 15 days ago +14

      we'll see with benchmarks if Nvidia really solved that with DLSS 4.0

    • @KeyT3ch 15 days ago

      You haven't seen the DLSS transformer model. It will be available for all RTX cards since the 20 series, and it fixes a lot of the ghosting and blur issues. Go look it up; Digital Foundry already has a video on it

    • @SpecShadow 15 days ago +5

      Remember: GSC sacrificed A-Life for aggressive optimisation in Stalker 2

    • @GFClocked 15 days ago +27

      Hallucinated frames. Soon we'll have full hallucinated games with 6000 series !

    • @XnathOW 15 days ago +2

      @@BlazeBluetm35 Probably not...

  • @MOSMASTERING 14 days ago +2

    Something overlooked is that it isn't just a denser version of the 4000 GPU chip with a bunch more cores added on (a percentage chunk more CUDA cores is always the go-to performance bump for any new generation). It's supposed to be an entirely new architecture, chip layout, new process node, etc.; therefore driver updates over time will likely vastly increase the performance of the new 5000 series too, as the software department begins to take advantage of what the hardware boys built.

    • @skzeeman007 13 days ago

      Hard disagree

    • @MOSMASTERING 13 days ago +1

      @@skzeeman007 about... ?

    • @vyzion980 11 days ago

      none of that matters when it’s fake frames dependent on software updates and games that are compatible with the new system

  • @MiiMavis 15 days ago +116

    I cannot believe that frame generation feels good at all.
    The generated frames do not take any direct input from your mouse and keyboard.
    It would feel like the input lag we had with v-sync at 60Hz, where you also have to wait for the monitor's refresh (it creates small gaps between control input and video output).
    Also, the fps counter can be manipulated by frame generation simply by showing the same generated frame twice. If you have enough different frames in a second, you won't notice it, except for some blur (which the AI conveniently generates anyway) and the fps counter going up.

    • @TheNiteNinja19 15 days ago +17

      I've tried it with just regular frame gen and it feels like Jell-O. It'll be even worse, it'll feel like a monitor with slow pixel response time despite having more frames displayed.

    • @idindunuphenwong 15 days ago +4

      1st time admitting to the fact that new video games run like dog !@#$ compared to older ones?

    • @wolfsbane9985 15 days ago +4

      From my experience, the only game it feels decent on (with a 4090 mind you) is Cyberpunk. Otherwise, every other game I enabled framegen on felt terrible.

    • @lolsalad52 15 days ago

      ​@@wolfsbane9985 every game I've tried frame gen with felt like I was fighting the game to do what I wanted, input delay felt like trying to swim in honey

    • @wolfsbane9985 15 days ago

      @lolsalad52 that's awful. It wasn't that bad with my setup, but in Space Marines 2, for example, enabling it output a higher framerate, but moving around/looking around felt like my framerate was lower than if I had just left it off. The best way I can describe it is stuttery. If good/high fps is the feeling/texture of running your hands through really smooth sand, then turning on framegen felt like running your hands through gravel. I maintain the only game it feels good in is Cyberpunk, but otherwise, it's not a good experience.

  • @danielbrowniel 15 days ago +65

    build a new pc every 5-10 years, buy games late.. you save TONS of money by being patient.

    • @ToeCutter0 15 days ago +5

      Roger that. 👍 I have more games in my Steam Library than I'll probably ever get around to playing. I'd rather have my Steam Library overflowing with games I might never play, than an RTX 4090 in February 2025. Folks were paying $3000 for a 4090!

    • @user-jq3jh7hp2w 15 days ago +2

      Or don't play games like Tarkov that are so poorly optimized they require you to drop $4,000 on a PC for a chance at getting decent performance.

    • @brandonconner8684 15 days ago

      I agree I stay a year or two away from new hardware let others beta test lol

    • @Black-nf3tx 15 days ago

      @@user-jq3jh7hp2w So true plus that game for as cool as it seems to play, just sucks. The best thing about it is the gun builder

    • @spankbuda3769 14 days ago

      Not me! I miss a day of work for every launch. I pay retail every chance I get. I'm severely in debt to the point that I'm living at my friend grandmother's neighbors apartment at the senior living home. I must bath her daily that will haunt any sane man dreams. But it's worth it!

  • @RubenGugis 15 days ago +161

    25% price increase for 25% performance at 25% more electricity used.

    • @Einacht 15 days ago +18

      The question is do you need that 25% increase to play a game? Most people would say no. Band wagoners would say probably yes. 😂

    • @socks2441 15 days ago +5

      best case scenario.

    • @TheOneTrueFett 15 days ago +7

      Im still on my 1080ti lol

    • @Scott6794 15 days ago +3

      @RubenGugis Still using my RX6650XT that cost me $250 two years ago. No need for anything more. I play at 60fps.

    • @DeceiverIX 15 days ago +3

      @@TheOneTrueFett Yooo! That's sick! I'm still on a 2070s, not sure I'll even upgrade to a 50xx LMAO.

  • @Mr.Stephane_L 15 days ago +3

    Just putting it out there: on a 4K monitor (27-inch), you can turn off all the anti-aliasing options because there's no need for them. On a 1440p monitor, however, you might still need them.
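
    For reference, the pixel-density arithmetic behind claims like this, via the standard ppi formula (panel sizes picked to match this thread):

    ```python
    import math

    # ppi = diagonal pixels / diagonal inches; higher density hides jaggies,
    # which is why AA matters less on 27" 4K than on 27" 1440p.
    def ppi(w, h, diag_inches):
        return math.hypot(w, h) / diag_inches

    for label, w, h, d in [('27" 4K', 3840, 2160, 27),
                           ('32" 4K', 3840, 2160, 32),
                           ('27" 1440p', 2560, 1440, 27)]:
        print(f"{label}: {ppi(w, h, d):.0f} ppi")   # ~163, ~138, ~109
    ```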

    • @Hamborger-wd5jg 14 days ago +1

      i still need antialiasing on my 4k 32" monitor. Might not be necessary on a 27" one, but i absolutely need it on my 32" one.

    • @Mr.Stephane_L 8 days ago

      @@Hamborger-wd5jg
      "Yeah, that's right on a 27-inch. I can't say for anything over that, but you seem to point out that on a 32-inch, you might need (jokingly, 8K). Jokes aside, you're right. ;)"

    • @Hamborger-wd5jg 8 days ago

      @@Mr.Stephane_L I came from a 32" 1440p monitor and I didn't feel like downsizing to 27" 1440p, and I felt like getting a 27" 4k monitor would be fruitless, so that's why I went for 4k 32". I saw a $350 black Friday deal on a UPF322 and sniped it. Difference is night and day on the 4k one compared to the 1440p one. My 7900 GRE is able to drive games of similar demand to FH5 around 130 fps once overclocked using basic taa without upscaling. Given of course Forza is somewhat AMD optimized, I can't say if it can run anything more intensive, but I do know it can do this.

  • @Grodstark 15 days ago +264

    Yup, listen to the guy that's been personally invited backstage at Nvidia's HQ, with multiple Nvidia executives listening to every word, because if there is one negative thing, he'll lose Nvidia as a potential sponsor.
    I like LTT, but this felt forced. Almost like at gunpoint.

    • @TheNiteNinja19 15 days ago +21

      He vowed not to get the 4090 but he said he's gonna get the 5090. However the elephant in the room is he didn't get the 4090 because of the price, and the 5090 is even more expensive.

    • @truckywuckyuwu 15 days ago +23

      Hes always expressed distaste for Nvidia. He says they're a huge pain to work with all the time.
      The card is more powerful. And he's got the money to spend. I don't blame him for getting the best card available. But he's probably just as miffed as most of us that this is all dlss AI performance.
      You're still going to get great performance without craptracing on.

    • @Rofo89 15 days ago +1

      @@Grodstark poor linus🤣

    • @randodox8375 15 days ago +8

      Yeah, with a whole room full of Nvidia execs looking at you while you make a video, it must have felt awkward as hell.

    • @Emanance 15 days ago +5

      @@truckywuckyuwu If he truly thought Nvidia was distasteful and a pain to work with he simply wouldn’t work with them.

  • @Ulagalanthanathan 15 days ago +328

    Bro got abducted to the Nvidia Backrooms to make content for them.

    • @slapdisgaem6601 15 days ago +28

      They even got a "casting"couch in there 🤣

    • @Web720 15 days ago +7

      He was shilling HARD.

    • @idindunuphenwong 15 days ago +3

      Bro got a big fat bag to lie about poor performance being responsible for lazy development and 3x the fake frame generation.

    • @AlwaysEast 14 days ago +1

      To make content for us*
      He's basically Jesus.

  • @tachrayonic2982 15 days ago +176

    It feels like the biggest issue with DLSS is not its performance, but that developers are relying on it instead of developing actually good graphics pipelines.
    DLSS does have its negative quirks, such as producing frames that are not responsive to user input, and smearing between frames, but it does produce a lot of "performance" for much less effort.

    • @datbo1jay1 15 days ago +17

      True. Should have been used as a booster, like a floor raiser not just a crutch for devs to get lazy and go “well I guess we can stop trying as hard”

    • @transformerstuff7029 15 days ago +7

      @@datbo1jay1 yea I think you are 100% on the money there man. When it was introduced I was hoping it was intended to keep older GPU's alive/relevant or something

    • @will2023-onCensorshipTub 15 days ago +3

      DLSS is for hiding mistakes and bad programming its expensive too.

    • @StyleshStorm 15 days ago

      ​@@datbo1jay1 That's precisely what it is. Now that these tech guys have a way to make life easier on themselves and become lazy that's what's happening.

    • @SuperLotus 15 days ago

      There's also the issue with games being CPU intensive. I heard that with Dragon's Dogma 2, some people had worse performance when they enabled DLSS because of that. I had really bad performance with No Rest for the Wicked and I'm not too optimistic about DLSS in that game (haven't tried it yet iirc). I wonder if part of the reason it was so poorly optimized is bc they expected DLSS to fix everything.

  •  14 days ago +2

    "I measure time in WoW expansions..." absolute cinema right there, Asmon

  • @Boss_Fight_Index_muki 15 days ago +211

    The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't).
    The 5070 could be sold at $400 and the 5080 at $650.

    • @alexturnbackthearmy1907 15 days ago +12

      More like the opposite. 5080 is not even half of 5090. And other cards show no significant improvement over previous gen as well. So cards for poor, and A SINGLE card that actually delivers performance.

    • @Boss_Fight_Index_muki 15 days ago +11

      @@alexturnbackthearmy1907 You're thinking of the 4060 (eventually 5060), an intentionally horrendous value meant to push buyers to the 4060 Ti (eventually 5060 Ti).
      The 5090 has 102% more cores; the 5080 has a 14% faster clock. The 5080 is half the price for more than half the performance.
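
      A back-of-envelope check of those two numbers, assuming throughput scales roughly with cores × clock (a simplification that ignores bandwidth and power limits):

      ```python
      cores_ratio = 2.02   # "102% more cores" on the 5090 (from above)
      clock_ratio = 1.14   # "14% faster clock" on the 5080

      rel = cores_ratio / clock_ratio            # 5090 throughput vs 5080
      print(f"5090 ~ {rel:.2f}x the 5080")       # ~1.77x for 2x the price
      print(f"5080 perf per dollar: {2 / rel:.2f}x the 5090's")  # ~1.13x
      ```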

    • @ivancouto1028 15 days ago +4

      Cope my guy. 4090 user spotted.

    • @OneTomato 15 days ago +15

      @@ivancouto1028 nice argument, emotionally insulting a person instead of actually trying to argue his point with logic.

    • @Some0ne001 15 days ago +5

      The 5080 isn't horrible: back in the day when I was building PCs in 2005, a top-tier GPU was $500, which with inflation is $851 today, so they aren't too far off. But the 5090 is absolutely horrible value, because it actually is a top-tier GPU and should be close to that $851 if they followed inflation; instead they added another $1,150 on top of inflation, which is very frustrating. It's literally the same with housing too. Do I think the 5080 will actually sell for $999? Absolutely not. I'll just wait for some sales to pop up to grab a 5080 to upgrade my 3080.
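
      The inflation arithmetic roughly checks out; a sketch assuming an average annual US inflation rate of about 2.7% over those 20 years (the exact CPI figure will differ slightly):

      ```python
      price_2005 = 500
      avg_inflation = 0.027  # assumed average annual rate, 2005-2025
      print(round(price_2005 * (1 + avg_inflation) ** 20))  # -> 852
      ```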

  • @saschaberger3212 15 days ago +274

    NVIDIA doing the Apple move increasing pricing even higher. Chip shortage am I right

    • @ivyr336 15 days ago +19

      The 5090 is hardly a consumer card, if you only plan to game then can just pretend it doesn't exist.

    • @Greysprunkiyay 15 days ago +9

      People with financial difficulties shouldn’t be buying halo tech products. For moderately successful middle-aged people, $2K isn’t a big deal.

    • @indeedinteresting2156 15 days ago +18

      @@ivyr336 IKR. Anything that can't run smoothly on a 3000 series card deserves to go down under. Unoptimized bullshit.

    • @sadge0 15 days ago +15

      @@indeedinteresting2156 I'd say that anything that can't run on a 1000/1600 series card on lowest settings without any frame gen and upscaling bullshit deserves to go under

    • @thenonexistinghero 15 days ago +1

      @@ivyr336 It's very much a consumer card. Won't be buying it though.

  • @miketysohn 15 days ago +187

    Bro even the fucking DLSS was on Performance and frame gen on wtf is this .... Why is Raw not the normal stuff nowadays. The input delay must be crazy

    • @jackofsometradesmasterofnone 15 days ago +12

      Because companies - not the Devs - are lazy and only see Dollar signs. There's no time for optimisations so whatever shortcut they can take to make their games faster out the door the better.
      Hence this dlss/fsr crap.

    • @EduardoSantiagoDev 15 days ago +32

      @@jackofsometradesmasterofnone devs are getting lazy af as well, no one is free from criticism at this point.

    • @vgankush 15 days ago

      Devs work on whatever they are told.
      They don't have free will.
      Don't blame them @@EduardoSantiagoDev

    • @swiftrealm 15 days ago +2

      Because NVIDIA is the king of AI hardware. They are going to push that everywhere so game devs don't need to optimize, you can just use their tech and get easy gains. Every GPU company is doing this.

    • @ticktockbam 15 days ago +8

      What are these graphic cards good for then if their native frame rate is just 25 FPS? What would be the point of buying them beside the DLSS/FSR technologies they have? Like, are they really good for other tasks like AI image generation or stuff like that, or nah?

  • @Grimbear13 13 days ago +1

    As someone who upgraded to a 43" 4k monitor its scary how fast it becomes your norm and its beautiful

  • @Mr_Dink 15 days ago +310

    I'm just waiting for real reviews from unpaid testers.

    • @idindunuphenwong 15 days ago +17

      Nobody is unpaid nowadays except everyone willing to say no to getting ripped off for that much. Gamers Nexus... Digital Foundry... EVERYONE is collecting a bag.

    • @JackCarsonite 15 days ago +4

      @@idindunuphenwong gamers nexus still gives data. Do you think that data is false? 😊

    • @heddihx 15 days ago +31

      @@idindunuphenwong Gamers nexus don't sell ad space to Nvidia, AMD or Intel. They also know what they're talking about

    • @ROFLKNIEFGOESSLIEC 15 days ago

      LOL yeah it's funny how LTT don't even try and hide it anymore. They'll release another "nvidia I am disappoint" video this year or next, to try and seem neutral to redditors, and they'll fall for it once again.

    • @Zeryther 15 days ago +5

      linus wasn't paid lol
      he's very vocal about never doing "paid reviews" because that's illegal

  • @Rofo89 15 days ago +262

    I think that from now on GPUs should no longer be called "hardware" but here you are practically paying for a "software" since 3/4 is generated by dlss4

    • @bigturkey1 15 days ago +1

      does that make it feel laggy or can you not even tell its on like current nvidia frame gen

    • @HellPedre 15 days ago +13

      they literally invented the HARDWARE to make DLSS and real-time RT possible... SMH

    • @Rofo89 15 days ago +19

      @@HellPedre so you really think that artificially generated frames are better than physical frames generated by the gpu? 🤣I think that dlss4 should only "help" a gpu to "compensate" with additional frames only when it becomes obsolete... not become the main feature. otherwise here we are no longer talking about GPU but about FGU (frame generating unit)

    • @budgetking2591 15 days ago +13

      @@HellPedre haha you think dlss is hardware? good joke!

    • @Holyblades1982 15 days ago +7

      @@HellPedre that's exactly what they want you to think.

  • @bluelyoko1041 15 days ago +56

    You say most of the average consumers are not going to notice but who's going to afford 5090 is going to be the people are looking for the all that graphics.

    • @JunkYardDawgGohan 15 days ago +6

      You mean the Austin millionaire who doesn't leave his home or bath, forgot people in trailer parks, district 8 and 3rd world countries want to play games too but can't afford to and nor has the infrastructure to handle it?
      If intel doesn't mess up and meets market demand remembering poor people like games too and even more at higher rates, Intel will become the wealthiest company in the world by country miles; it always happens a third company swoops in to fill the entire low-end market of a product or service which is the major majority then later buys the competition just in this case they have the aid of the fed and it's the largest product today.

    • @TwoBs 15 days ago +6

      @@JunkYardDawgGohan Waiting for the NVIDIA consoomer to spawn in and inform you that “PC gaming is a ✨luxury✨ and that if you can’t afford it, its not the hobby for you … go be a poor elsewhere.” You’re not a true power-bottom gaymur like they are if you don’t rush out and slop up the slop.

    • @TruthWaves22 15 days ago +1

      A good amount of people In the military have PC builds. Or at least in my community I did. Everyone has a built PC. I've served in the Marines for 7 years now. So that's just my experience still a good amount of people. And we don't get paid much. Also I'm not for PC builds. A decent laptop and your chilling.

    • @Pufflord 15 days ago +7

      @@TwoBs I ain't a fanboy of Nvidia, but this idea is ridiculous. You don't like it don't buy it. And if you can't buy it within reason, then you shouldn't even think about it.
      Gaming is absolutely a luxury, for every second you spend on a game rather than being productive, you lose money or some gain you could've otherwise had. Now I'm not one of those anti-gaming people. That narrative is just as stupid.
      However, it's pretty wild that there's so many people out there that think 2000 or anything like that is a genuinely impossible number. I guess most people don't know how to invest or save money so it makes sense, but if they learned how to do that one simple thing they could buy whatever they wanted.
      It's financial illiteracy, not being poor or anything like that. Shouldn't be spending money on any PC part if you can barely make ends meet. But if you live comfortably and have excess every now and then, yeah it won't harm you to keep money on the side and save up for something big.
      And for people in crappy low-income areas of the world, and 3rd world countries, they have bigger problems to focus on anyway.

    • @fenixfyre 15 days ago +4

      @@Pufflord You make good points; however, $2k for this frame-generated tech just sounds like a gimmick with very questionable value. The delay you experience with it can in some instances feel worse than low frame rates (speaking from experience, good luck playing any fps with it). So yeah, personal financing is obviously the 1st priority, and it's not that "$2k is an impossible number", but rather that the value incentive at that price is not compelling.

  • @zachreederau2531 7 days ago

    M32, tech professional in Australia here… I own a gaming laptop, we live in a space constrained apartment and had to give up my desktop to reduce clutter / make space.
    A few of my friends have had to do the same when they have kids and give up their office / study.
    A lot of younger people I know get a mid-range gaming laptop they can take to uni but also game on as they can’t afford both.
    The market for gaming laptops is pretty huge.

  • @Katanaaaa 15 days ago +85

    Frame generation is like painting stripes on your car and changing the exhaust, then saying it's faster now just because it looks and sounds faster.

    • @afelias 15 days ago +19

      Nah, frame generation is parking at the finish line and bribing a ref to affirm you ran the race

    • @Basuko_Smoker 15 days ago +2

      Comment section on this video is making me cry hahahaah, good one dude, really puts things into perspective

    • @DragonOfTheMortalKombat 15 days ago

      Lol the copium is through the roof.

    • @ELPapayapa 15 days ago +1

      Add stripes to your Lada to make Adidas car

    • @bigturkey1 15 days ago

      frame gen is like making games like Hogwarts and MSFS run smooth for the first time ever

  • @kender121 15 days ago +64

    the big difference is still , they used different settings on the systems why linus was not allowed to show the settings in cyberpunk of the 5090 system

    • @MelroyvandenBerg 15 days ago +4

      It's all set to low. And the 80% is AI generated using frame generations. And blurry gaming. It's a joke.

    • @FcoEnriquePerez 14 days ago +1

      Yeah, everything there has been manipulated; we definitely need real reviews to know what we are getting.

    • @SubjectE57 14 days ago +3

      The settings are the same; it's just that the 4090 had 1 of every 2 frames generated by AI while the 5090 has 3 of every 4 frames generated by AI. That alone would roughly equate to double the framerate if the base performance of the cards were about equal.
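
      A quick check of that arithmetic, assuming equal rendered frame rates on both cards (the 28 fps base is hypothetical):

      ```python
      rendered = 28              # hypothetical base fps, equal on both cards
      fps_4090 = rendered * 2    # 1 of every 2 displayed frames is rendered
      fps_5090 = rendered * 4    # 1 of every 4 displayed frames is rendered
      print(fps_5090 / fps_4090) # -> 2.0: "double" with no extra raw power
      ```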

  • @Hadeks_Marow 15 days ago +158

    The hottest take anyone can give:
    We should have NEVER left the GTX lineup. RTX and the very IDEA of ray tracing have done nothing but irreversible damage to the entire gaming industry, to the point of ruin.
    Ray tracing was not only a mistake, but flat out the worst direction the industry has taken en masse, due to the negative impact it has had on development (industry-wide) and, by proxy, performance.
    Gaming would be in a MUCH better place right now if ray tracing technology never existed.

    • @AegisK 15 days ago +8

      I wouldn't go as far as it shouldn't exist as much as it shouldn't be on a commercial product. Production workloads still benefit from RTX cards for ray traced rendering.

    • @alexturnbackthearmy1907 15 days ago +12

      Or was released in done state - as in, all games would be fully ray-traced out of the box. Nowadays this is not the case - RT is unoptimized mess that runs on top of another unoptimized mess, and is also completely butchered by needing a proper implementation from dev team which happens...extremely rare.

    • @sebastianrosenheim6196 15 days ago +14

      It's so funny to me that ray tracing can look worse yet needs far more resources than baked lighting. But GUYS, THE RAYS, THE TRACING, SPEND $2000 NOW!

    • @adreanmarantz2103 15 days ago +11

      @@sebastianrosenheim6196 Bought into the hype with a 3070, I'll turn RT on to see how it looks, then back off when I actually go to play.

    • @bigturkey1 15 days ago +1

      you can still buy a 1080ti if you want, they are like $100

  • @MuckMan 11 days ago +1

    Gaming laptops are not usually used for gaming. Most of the time gaming laptops are used by developers for modeling, creating video games, AI work, things of that nature. The reason they buy gaming laptops is the superior GPU performance.

  • @IscariottActual 15 days ago +72

    The blur and smear is driving me out of high end gaming. I've been on the high end of graphics since 3dfx. I cannot 'unsee' the bad

    • @michaelmichaelagnew8503 15 days ago +10

      Agree I see this on Ark ascended and I thought it was my monitor but I have a new one with thats OLED 240hz and still the same. DLSS I don't think is the future but it might kill gaming for allot of us if it keeps going this direction.

    • @gasad01374 14 days ago

      @@michaelmichaelagnew8503 DLSS is only needed because developers CANNOT and WILL NOT optimize their own games, so GPU manufacturers are forced to do shit like this with fake frames because the game devs are incompetent.

    • @caldale4941 14 days ago +2

      Serious question, if you've been playing games since way back (my first card was a voodoo 2), which means you were happy playing extremely low res games for at least a decade how do you rationalize your current "standards"? This sounds to me more like the curse of adaptability, essentially people are so quick to take the latest image/graphical improvements for granted due to the constant progress we've experienced over the last 20 years, and as a result they cannot be happy with anything for too long because they are constantly adjusting/adapting to the latest improvements.
      If you look at it from that perspective our adaptability is a kind of curse, doomed to never be satisfied with anything as long as a new, shinier, fancier version of it comes out. It would seem that the only way to remedy this would be to first be conscious of this reality and try to remember where we came from, what we had. I do this all the time, I bitch about imperfect image quality but then remember the evolution over the years, and then suddenly I'm 14 years old again and blown away, and this allows me to see what we have now in a clearer light, and my bitching turns to thankfulness, and appreciation. And I'll tell you, thinking like this makes everything better, makes me less bitch less, makes me a better person....

    • @DingleBerryschnapps 13 days ago +1

      ​@@caldale4941 Well said, and it needs to be said more. This is my perspective exactly. Which is why I'm still HAPPILY using my 2070 super, even though I can afford a 4090.
      But why would I bother. During actual gameplay the differences are hardly noticeable. I'm still running everything on high / ultra with this card. To help this situation I always stay a couple years behind on games as well.
      Basically what we see these days are a bunch of benchmark chasers that don't understand diminishing returns.

    • @caldale4941 9 days ago +1

      @@DingleBerryschnapps It actually means a lot that you said that, sometimes I feel like everyone is missing something when I see how entitled and demanding they are, with no awareness of what's happening to their minds, completely swept up by the time and devoid of introspection and the ability to shift perspective to something more wholesome and grounded. Not a week goes by when I dont look at a game and go "Wow!", and this goes for all types of games not just the cutting edge ones.
      I refuse to be a slave to my adaptability because I've learned to see how it corrupts and lessens my ability to see what is good and valuable.
      Anyhoo, your comment reminded me that there are some people out there who really do think outside the box and understand that they have internal processes that actually work against them when they go unchecked and arent acknowledged.

  • @TheHighborn 15 days ago +37

    9:25 50 ms input lag, oh my Lord, it's disgusting.
    And that's the problem with fake frames: the engine is still bogged down, and it'll feel like shit.

    • @Basuko_Smoker 15 days ago +6

      I had to recheck becuase I didnt even know that was the input lag...If 5ms is already noticeable but playable, I can't imagine what 50ms does to a person HAHAHAHA

    • @jonesy66691 15 days ago

      Just wait for Deep Learning Input Generation.

    • @your_average_cultured_dude 15 days ago +2

      yeah, 60FPS has 16.6ms of latency; at 50ms everything would feel like 20FPS (1000/fps = latency)
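
      The conversion being used here, spelled out (frame time in ms = 1000 / fps, and vice versa):

      ```python
      # 1000 ms per second divided by fps gives ms per frame, so a given
      # latency maps back to an "equivalent" responsiveness in fps.
      for ms in (16.6, 33.3, 50.0):
          print(f"{ms:.1f} ms frame time -> responds like ~{1000 / ms:.0f} fps")
      ```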

    • @WayStedYou 15 days ago

      @@Basuko_Smoker you arent going to notice 5ms thats monitors latency already unless you are on CRT or OLED

  • @Mr.Atari2600 15 days ago +54

    Does this mean that my 3DFX Voodoo can outperform the RTX 50 series because it doesn't rely on AI & just runs off the 3D Acceleration?

  • @gibbogsi 10 days ago +1

    Is cramming all the components right next to the red-hot GPU a good idea vs spacing them over a full-size PCB?

  • @EvilFriedBacon 15 days ago +43

    @14:10 but casual players aren't buying 5090s. The people who get 5090s will notice the issues for sure.

    • @SandTiger42 15 days ago +4

      100%
      I can't even look at Christmas led's without seeing flickering, it drives me bonkers after just a few seconds. This would have me throw my monitor across the room in under a minute if I was forced to play like that.

    • @cc1drt 15 days ago +2

      yeah that was surprisingly low 1ntelligence from asmon. like gee you think the people that dont care that much about gaming are going to spend $2000 on just their gpu?

    • @NajaToxicus 15 days ago +2

      Not the case, people who buy 5090 are not the people who know a lot of stuff about tech and computers, it's just rich people, if you get 200k a year working in finance, you won't notice shit.

    • @mistrsynistr7644 14 days ago

      ​@NajaToxicus I have a 4090. I don't use dlss because it is a blurry mess. I bought it because it was not only the best card on the market but also the best cost to performance. A lot of the people with 4090s will notice, i assure you. I'd imagine most pro players are on 4090s currently.

    • @DingleBerryschnapps 13 days ago

      ​@@mistrsynistr7644 And? You think those people are all professionals that understand diminishing returns just because they purchased a high-end gpu? If I got a nickel every time a person thought they needed something, or they could notice a difference in graphics, when there was no difference there, I'd be rich. As a photographer I've seen this countless times.
      Sure, it looks better when you're pixel peeping and adjusting the settings, but when you're actually playing, it's meaningless, unless you're looking for 200 plus frame rates, which most people are not.
      Someone purchasing a xx90 doesn't tell me that person knows more.
      It's a fact that people that know the "least" about something, pay the most. They don't understand the nuances of things that would allow them to make an educated decision and save money. They just have money to spend, and listen to Tech influencers, and benchmark chasers in the comments.
      People that understand diminishing returns and are frugal with their money are the smart ones. They understand how fast technology moves now and only buy what's necessary at the moment. I know people that have spent $6000 on gpus in 5 years when I only spend $500 every 5 years, and we play the same games.
      Btw, "best card on the market" is completely subjective. Best for what? I guarantee my 2070 super I'm still using plays all the same games yours does at perfectly acceptable frame rates, on high / ultra settings. 👍

  • @aldoleonardo220 15 days ago +75

    This GPU draws more power than my fridge.

    • @aegd 15 days ago +9

      That's true for practically any GPU genius

    • @swiftrealm 15 days ago +7

      Can your fridge play CP2077?

    • @idindunuphenwong 15 days ago

      @@aegd LOL a GPU does not draw more power than a fridge unless it's a mini fridge that GPU does not draw more power than a fridge you can't plug a fridge into an extension cord like a power supply for a GPU or at least nobody I know is cracked out enough to use extension cords and surge protectors just to plug in a fridge

    • @thecompanioncube4211 15 days ago +1

      You got 550W fridge? For cooling 1 can of coke? (Joking)

    • @pierre-yvesdubreuil9315 15 days ago +1

      @@swiftrealm can your gpu cool your food?

  • @dcdenton6859 15 days ago +71

    There are no "tech bloggers".
    There are only freelance marketers for tech corporations.

    • @afelias 15 days ago +11

      and then there's Gamers Nexus

  • @-BUGZ- 3 days ago +1

    Didn’t the 4070 super compare to the 3090 in performance? This AI and scaling is getting ridiculous. I already have to use some type of scaling when I play with my 2080Ti at 1440p. Some games I can get away at Native and lowering my fps cap but I might end up going for a used 40 series card or something.

    • @-BUGZ- 3 days ago +1

      And 100% bought my first prebuilt in 2019 for $800 after taxes for a 1660Ti and an i5-9400F and upgraded that and eventually built my own with the 2080Ti in late 2020/ early 2021.

  • @bossmanebz 15 days ago +55

    21:00 as someone who bought a gaming laptop i think it's because I wanted to be able to take it to university, play games when I'm on my lunch breaks, and be able to take my laptop to a friend's house and game together with split screen games

    • @alexturnbackthearmy1907 15 days ago +3

      Split screen? At least i hope you are doing it on TV at least, even the regular monitor is way too small for that.

    • @boogiebam7104 15 days ago +5

      ​@@alexturnbackthearmy1907No reason you can't. all you need is a hdmi cable

    • @bossmanebz 15 days ago +5

      @@alexturnbackthearmy1907 it's just to go to a friend's dorm and casually game together if we were bored. There's a lot of uses normies get out of gaming laptops was my point, I couldn't take my pc with me

    • @hotrodhunk7389 15 days ago +2

      Yes I got a gaming laptop for that purpose too.
      Ended up getting a desktop PC and a cheap laptop. Just remote into the gaming desktop with moonlight and sunshine. Extremely low latency. Works out perfectly.
      Laptop gets great battery life because it's only streaming video.
      No sound from the loud desktop as it's in another room.
      And I can stream it to any screen anywhere.

    • @bossmanebz 15 days ago

      @@hotrodhunk7389 the simplest explanation I can say is, this correlates as to why mobile gaming is so big too. It's all about convenience.

  • @Who-z8k 15 days ago +80

    I think you vastly underestimate how many people care about image quality going backwards, and about input lag. Also, on what planet are "casual gamers" getting a 4090 or 5090?
    We also have not hit the ceiling on visuals. Visuals have literally gone backwards.

    • @therealkylekelly 15 days ago +1

      Image quality is subjective. How would you define the quality of an image? I can claim that Super Mario 64 has superior graphics over modern games, does that make my opinion a fact?

    • @maxttk97 15 days ago +25

      @@therealkylekelly so a 480p texture has more detail than a 1080p texture? or even a 4K texture?

    • @therealkylekelly 15 days ago

      @@maxttk97 when people refer to image quality they are usually not referring to the objective value of pixel information but are instead talking about the aesthetics. 2D vs 3D, art styles, etc. Obviously a 480p texture has less detail than 720 or 1080 but a majority of games today are created with at least HD textures so the objective quality of the pixels has not gone down and in fact has gone up with the rise of 1440p and 4k usage which would debunk OP's theory of image quality going down if we are strictly speaking about the pixel fidelity.

    • @SandTiger42 15 days ago +6

      You 100% nailed it. That screen tearing and shimmering and ghosting gave me nightmares. Nvidia fanboys about to hit you with snarky replies though.

    • @ghoulbuster1 15 days ago +4

      Games are becoming a blurry mess, I always turn all of it OFF!
      I'll rather play with less fps and crispy aliasing than turn on fake frames.

  • @Bombay.Badboy 15 days ago +107

    Imagine how bad the optimization is going to be in future releases oh god, they're already stuttery messes on high end current hardware

    • @bigturkey1 15 days ago +3

      i use a 4080 at 4k and never felt like a game was unoptimized

    • @socks2441 15 days ago +11

      @@bigturkey1 maybe because you are using a damn 4080?
      on a 3080ti i have noticed almost every AAA game of the last few years is terribly optimized. indiana jones being the one exception.

    • @bigturkey1 15 days ago +3

      @@socks2441 Indiana Jones at 4K with a 3080 gets 80-90 in the jungle areas and 110-120 indoors.

    • @socks2441 15 days ago +3

      @@bigturkey1 Exactly, it's incredible, especially after so many games that require DLSS and low settings and still can't hold 60 fps.
      But yeah, I was shocked to find my 3080 Ti could max the game out at NATIVE 4K60.
      I wish they had not limited the game to ray-tracing hardware though, because this game is clearly optimized well enough for any decent GTX GPU to play. I don't think the forced ray tracing is doing much in this game.

    • @Primafaveo
      @Primafaveo 15 дней назад

      @@socks2441 It's probably because they didn't spend thousands of hours placing "fake lights" in the game and instead went for the better-looking and easier ray-traced approach.
      That's what ray tracing was made for, just like when Nvidia pushed PhysX into games years ago.

  • @Zodryn
    @Zodryn 14 дней назад +1

    19:50 More context though: the 3080 was $700, then the 4080 jumped to $1200, and now the 5080 is $1000.
    So it's still a massive increase for that tier compared to two gens ago. New gens used to come at a similar price for the same tier but with better performance; around COVID it turned into a huge price increase to get the performance increase. It's good that the price came down, but it could still be better, and the limited VRAM is just sad.
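
    Those price swings as percentages, a quick sketch using the MSRPs quoted above:

        # Generational price changes for the xx80 tier (numbers from the comment).
        print(f"3080 -> 4080: {(1200 - 700) / 700:+.0%}")    # +71%
        print(f"4080 -> 5080: {(1000 - 1200) / 1200:+.0%}")  # -17%
        print(f"3080 -> 5080: {(1000 - 700) / 700:+.0%}")    # +43% over two generations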

  • @ArnnFrost
    @ArnnFrost 15 дней назад +97

    1:07 "the most beautiful real time rendered demo" that ball is blurring... in the demo.

    • @GK_Squid
      @GK_Squid 15 дней назад +11

      Not to mention the gem completely loses its color when it turns "realistic".

    • @EliteInExile
      @EliteInExile 15 дней назад +6

      I don't wanna dig at the devs of Stalker 2. But the performance of that demo looked like Stalker 2 gameplay....

    • @thecompanioncube4211
      @thecompanioncube4211 15 дней назад +4

      All these hyped tech demos, and then when it comes to actual games they all look like poop.

    • @derekjackson8832
      @derekjackson8832 13 дней назад

      You gotta have a 4090 with a 4k screen to accurately watch this video and comment on the picture quality 😂

  • @HeavenOnHigh
    @HeavenOnHigh 15 дней назад +39

    7:40 "you cant change the settings" sounds weird to me, im sure its not because they have to reset it for the next journalist, more like "it only works if you put it like that"
    basically what Triple A devs do today "the game only works, if you play as we want you to play!" if you do it a bit differently, everything brakes

    • @Johnsmithhjoe
      @Johnsmithhjoe 15 дней назад +28

      They don’t want Linus to turn off DLSS. Nvidia is building hype around the 5090 performing twice as well as the 4090, but only with DLSS, which is meaningless.

    • @socks2441
      @socks2441 15 дней назад +9

      They have it set up so the 4090 runs at a crawl while the 5090 has all the new AI software for fake frames. If he could turn that off, he could actually compare the raw performance of the hardware instead of the software. They don't want that.

    • @asengeorgiev5039
      @asengeorgiev5039 15 дней назад +4

      Yep. But the problem for Nvidia will be that a few days after the official launch we will have tech reviews from GN, Hardware Unboxed, etc., and we will see the reality. Making Linus not change things is stupid; they know very well that every major tech reviewer will play with the settings and the truth will come out.

    • @Aqualightnin
      @Aqualightnin 15 дней назад +1

      The intention is that everyone gets the same experience.

    • @asengeorgiev5039
      @asengeorgiev5039 15 дней назад +1

      @@Aqualightnin I see no problem with changing a few things in the game settings just for the show and then setting them back. It's LTT after all, not just some random guy.

  • @osirismpg7393
    @osirismpg7393 15 дней назад +145

    And then it dies after exactly 2 years of warranty

    • @roshunepp
      @roshunepp 15 дней назад +3

      Don't you already get a new card before the warranty runs out?

    • @andrejjelcic3675
      @andrejjelcic3675 15 дней назад +24

      c o n s u m e

    • @croncastle3028
      @croncastle3028 15 дней назад +18

      @@roshunepp Bro, what?

    • @truckywuckyuwu
      @truckywuckyuwu 15 дней назад +3

      This is what bothers me most. Paying $2000 should get you a minimum 4-year warranty. I'd be happy with that.

    • @WorstCommenter2008
      @WorstCommenter2008 15 дней назад +3

      @@roshunepp Bruh, I've been running my 2080 Ti since 2019. I intend to keep going for a few more years, praying that something with actual hardware improvements, and not AI dogslop, comes out.

  • @arthurakopyan8218
    @arthurakopyan8218 14 дней назад +1

    So... I basically skipped Nvidia's 4xxx gen, and it looks like I will be skipping the 5xxx gen as well, waiting until they either perfect this system enough to avoid the latency and ghosting or go back to simply improving native rendering. I cannot ignore the blurring/ghosting; it actually bothers me.

  • @MrRafagigapr
    @MrRafagigapr 15 дней назад +181

    20% more performance, 100% more DLSS 4 performance, lmao. Imagine AI generating 15 out of 16 pixels and calling it performance. Unlike FSR, this will only work with the few titles where Nvidia works with the developers. Nvidia must be mad that the 1080 Ti stayed relevant for too long, so they want performance locked behind "software compatibility", while an old and weak GTX 1050 from a decade ago can run AMD's frame generation technology lol.
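
    That "15 out of 16" figure holds up as rough arithmetic. A back-of-envelope sketch, assuming DLSS Performance mode (half resolution per axis) plus 4x multi frame generation:

        # Fraction of displayed pixels that are actually rendered, not AI-generated.
        native = 3840 * 2160                   # 4K output pixels per frame
        internal = (3840 // 2) * (2160 // 2)   # Performance mode: half resolution per axis
        rendered_frames, total_frames = 1, 4   # 4x MFG: one rendered frame per four displayed

        rendered_fraction = (internal / native) * (rendered_frames / total_frames)
        print(rendered_fraction)       # 0.0625 -> 1/16 of displayed pixels are rendered
        print(1 - rendered_fraction)   # 0.9375 -> 15/16 are upscaled or generated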

    • @Jay-vt1mw
      @Jay-vt1mw 15 дней назад +15

      If it actually works, I don't see any issue.

    • @ja_pocitacove_hry_nehraji69
      @ja_pocitacove_hry_nehraji69 15 дней назад +1

      :D

    • @oceyho
      @oceyho 15 дней назад +5

      Yeah, but since that frame generation is 50-series exclusive, it still produces double the frames. And consider that game devs don't give a shit about optimization anymore, so they just lean on these frame-gen features. So in the end it doesn't matter: still double the frames. BTW, did no one notice the resolution was even at 4K? With the DisplayPort 2.1 upgrades and the new 4K OLED 240 Hz monitors, the pixel quality will be insane.
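
      For what it's worth, the bandwidth math on those monitors does work out (a rough check in Python that ignores blanking and protocol overhead):

          # Uncompressed 4K 240 Hz at 10-bit RGB vs. DisplayPort 2.1 UHBR20 (80 Gbit/s).
          width, height, refresh_hz, bits_per_pixel = 3840, 2160, 240, 30
          gbps_needed = width * height * refresh_hz * bits_per_pixel / 1e9
          print(f"~{gbps_needed:.0f} Gbit/s")   # ~60 Gbit/s, so it fits without compression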

    • @GFClocked
      @GFClocked 15 дней назад +17

      Real frames are rendered; these are more like hallucinated, they're that bad. The GPU just went on an LSD-level artifact trip. But somehow it's amazing and I'm just stupid, because if others can't notice the problem, it must not exist.

    • @gucky4717
      @gucky4717 15 дней назад +1

      @@Jay-vt1mw Same.
      I have used frame generation on a few titles now. Some were broken at release but later fixed, some had input-lag issues, but some also ran perfectly fine at high FPS, as if frame gen were off (it was on).
      What Nvidia needs to do is push it into more games or engines like UE5, OR make it available via drivers so you can use it in all games (even if it might be broken).

  • @mralabbad7
    @mralabbad7 15 дней назад +37

    I feel like the people mainly buying gaming laptops are engineers doing BIM or 3D modeling for construction projects.
    This is the case in our company and at all the contractors we work with.
    If you're working on a construction site powered by generators, it's much more convenient to have a battery than to lose power suddenly and watch all your unsaved work fade away with the flickering lights every now and then.

    • @I-HAVE-A-BOMB
      @I-HAVE-A-BOMB 15 дней назад +3

      Machinists and Heavy Duty Mechanics too

    • @emperorborgpalpatine
      @emperorborgpalpatine 15 дней назад +4

      A laptop is superior in almost every way. If it could also perform as well as a desktop and not burn my balls, it would be the best.

    • @MusicSkeletonKeys
      @MusicSkeletonKeys 14 дней назад

      I have one computer, my gaming laptop: the practicality of a laptop with the capacity to play things like BG3. I don't game enough, or care enough frankly, to maintain a tower/monitor setup, though I can appreciate others' joy in those spaces.

    • @Blob6859
      @Blob6859 14 дней назад +1

      I move often, as my job requires relocating every few years, and my gaming laptop is like a pet dog I can carry anywhere to game. It also helps me with my work sometimes.

  • @EliteInExile
    @EliteInExile 15 дней назад +24

    That tech demo had fewer frames than my modded Minecraft playthrough with shaders...

    • @Darkdiver28
      @Darkdiver28 14 дней назад

      @@EliteInExile Minecraft looks worse.

    • @kiddeath8883
      @kiddeath8883 14 дней назад

      @@Darkdiver28 You know modded Minecraft, especially with the more demanding shaders, brings any PC to its knees. Yeah, Minecraft is more CPU- and RAM-bound than GPU-bound, but you need a powerful-as-hell graphics card to even run the more demanding shaders. Even an RTX 4090 can't deal with the most demanding ones; like ray tracing, it's just too much for modern consumer hardware.

  • @carloscervantes836
    @carloscervantes836 14 дней назад

    1440p to 4K at 27" is an immediate and noticeable difference on my end. I tried a high-refresh 1440p ultrawide Alienware OLED and ended up returning it because of how big the sharpness difference was compared to my 4K ultrawide IPS. On top of that, the motion was still not as clear as I hoped; maybe we need 480 Hz, or MicroLED.

  • @Ukgyt99
    @Ukgyt99 15 дней назад +19

    Built my lad his first PC on Christmas day; I will never forget it.

    • @idindunuphenwong
      @idindunuphenwong 15 дней назад

      Starting him off with fake FPS and paying more for less. May god have mercy on his gamer soul.

  • @xorion123
    @xorion123 15 дней назад +56

    4070 to 5070:
    $549 for a software upgrade, "AI frames".

  • @liquidsnake6879
    @liquidsnake6879 15 дней назад +9

    Wasn't the point of more expensive hardware that we don't NEED DLSS? It's cool that the price is competitive, but all I heard was that Nvidia can't really squeeze much more performance out than they already have with the 40xx series, and from now on the improvements are all going to be DLSS-based.

  • @glitch_city_gamer2846
    @glitch_city_gamer2846 13 дней назад

    The view of the card at the 4:50 mark looks so nice! They've done a beautiful job aesthetically.

  • @asengeorgiev5039
    @asengeorgiev5039 15 дней назад +13

    The problem for Nvidia will be that a few days after the official launch we will have tech reviews from GN, Hardware Unboxed, etc., and we will see the reality. Making Linus not change things is stupid; they know very well that every major tech reviewer will play with the settings and the truth will come out.

    • @sandboy5880
      @sandboy5880 15 дней назад +1

      But the first impression (from Linus) will be seeded in many people's minds. Most people will not bother to double-check; they'll just watch the first, most popular thing and form their opinion based on it. That's what Nvidia is going for.

  • @AnimeMemes
    @AnimeMemes 15 дней назад +80

    If only we had games, worth playing

    • @LordAteag
      @LordAteag 15 дней назад +28

      If only we knew how to use commas.

    • @megaplexXxHD
      @megaplexXxHD 15 дней назад +21

      Just because you can't find a game you enjoy doesn't mean „we“ don't have good games 😂😂🤦🏽‍♂️

    • @Alec_cz
      @Alec_cz 15 дней назад +13

      @@megaplexXxHD OK, Concord player lmao

    • @РаЫо
      @РаЫо 15 дней назад +1

      Concord

    • @smileeface
      @smileeface 15 дней назад +1

      Banana

  • @hasselplayer9
    @hasselplayer9 15 дней назад +31

    These new GPUs should comfortably be able to do native 4K 60 fps with full ray tracing for the amount they're charging, especially the 4080, 4090, 5070, 5080, and 5090.

    • @jackofsometradesmasterofnone
      @jackofsometradesmasterofnone 15 дней назад +17

      I don't even care about ray tracing, I just want stable 4K 60 fps at this point with no gimmicks.

    • @thefirstloser
      @thefirstloser 15 дней назад +6

      @@jackofsometradesmasterofnone I still have my 6900XT. It is "old" but it works and I can play everything I want. I wanted to build a new PC in a month but it looks like the new GPUs from Nvidia and AMD are just "fake" with their AI trash.

    • @wolfsbane9985
      @wolfsbane9985 15 дней назад +3

      They can, IF the game is optimized. But many games simply aren't anymore, and they rely on DLSS and frame gen to get there.

    • @computron1
      @computron1 15 дней назад +1

      @@wolfsbane9985 You must have extensive experience shipping games with full ray tracing at 4k 60fps to make that statement. What have you worked on, and what have you optimized?

    • @wolfsbane9985
      @wolfsbane9985 14 дней назад +2

      @computron1 What are you implying with your question? That my statement is false or I don't know what I'm talking about? I own a 4090 and do a lot of testing myself. It's a hobby of mine. You don't need to be a game developer with years of experience to see how broken and unoptimized AAA release titles have been these past few years. It doesn't take much to see that developers are heavily relying on AI techniques to get games to acceptable framerates, and it may get worse. Take Silent Hill 2 remake for example. Why does that game have worse performance than Cyberpunk 2077 on a 4090? Because it's terribly unoptimized. That's just one example. Do your own research. It's not hard to see where the market is going.

  • @tigoes
    @tigoes 3 дня назад

    Yeah, many gamers / IT engineers buy those gaming laptops so they can get dual use out of them. These laptops are generally hooked up in a desktop setup with monitors and stuff to actually play games.

  • @sirlordofderp
    @sirlordofderp 15 дней назад +32

    0:46 No wonder their employees are retiring.

  • @Phaevryn
    @Phaevryn 15 дней назад +133

    Yes, let's trust Linus's words.

    • @Sin3xtreme
      @Sin3xtreme 15 дней назад +29

      Linus is a shill

    • @HitomiYuna1
      @HitomiYuna1 15 дней назад +15

      He also promoted the Honey scam ads; why does everyone trust his word?

    • @michaelfrock2473
      @michaelfrock2473 15 дней назад +12

      The only person you can really trust is Steve.

    • @sadge0
      @sadge0 15 дней назад +22

      Especially with the NVIDIA staff watching him while he's doing the review. Totally not holding him at "gunpoint" 😂

    • @rem5215
      @rem5215 15 дней назад

      @@HitomiYuna1 You realize none of the youtubers knew Honey was scamming, right?

  • @xXE4GLEyEXx
    @xXE4GLEyEXx 15 дней назад +53

    Frame generation is a cancer on gaming... as is AI and the push for tech built around it :/

    • @noobandfriends2420
      @noobandfriends2420 15 дней назад +3

      Wait till it's used in everything to cut down on streaming data.

    • @idindunuphenwong
      @idindunuphenwong 15 дней назад

      It's tech word slop to hide the fact that the new hardware is just as big a scam as the old new hardware.

    • @addictedtosynthwave
      @addictedtosynthwave 15 дней назад

      Consoles have been using that for 10 years... none of those NPCs can tell though lol.

    • @idindunuphenwong
      @idindunuphenwong 15 дней назад

      @@noobandfriends2420 LOL, clearly you've never thought about how much streaming data bots have already been hogging nowadays.

  • @capt8
    @capt8 4 дня назад

    The main problem here will be system latency. It doesn't help to have 200 or more frames when system latency is 60 ms. Add 24-60 ms of latency to the server and it's like you're playing at over 100 ms. That won't be a pleasant experience for online games, and the ghosting and smearing in fast-paced games will be really noticeable.
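
    The budget is simple addition; a sketch with assumed values taken from the ranges above:

        # Frame-gen FPS counters say nothing about this end-to-end latency budget.
        system_latency_ms = 60   # assumed input-to-photon latency with frame generation on
        server_rtt_ms = 40       # assumed network round trip (the 24-60 ms range above)
        print(f"effective latency: {system_latency_ms + server_rtt_ms} ms")   # ~100 ms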

  • @alexs2195
    @alexs2195 15 дней назад +7

    Those tech demos are at the limit of what the card can handle in a very specific scenario; graphics cards still have a long road of improvement ahead. Looking realistic is not the only limitation: you can also increase the number of things in a scene, how they move, and so on, and in those respects we still have a long way to go. Games and tech demos don't actually look that realistic; they are built to give that impression, but they still operate under really big limitations.

  • @Frankture
    @Frankture 15 дней назад +61

    DLSS is great as an option for those who want better performance... The issue is that nowadays developers just don't care about optimization at all, and DLSS has become a REQUIREMENT instead of an option. DLSS and frame gen have done a lot of damage to the video game industry.

    • @anarex0929
      @anarex0929 15 дней назад

      Yeah, I think we will see a revolt like we saw against SBI and its influence.
      And the industry will be better for it.

    • @PhilosophicalSock
      @PhilosophicalSock 15 дней назад +5

      So what, let's stop progress? It is not Nvidia dealing the damage to games, but big-publisher managers trying to save costs using the newest tech.
      From a pure profit point of view, why bother with optimization when you can just flip a couple of settings switches?
      This is what's fked up.

    • @Frankture
      @Frankture 15 дней назад +10

      @@PhilosophicalSock Who said to stop the technology? I like DLSS and FSR; they're great options for players. My problem is with developers who rely on upscaling software to get "OK" performance instead of optimizing their games...

    • @RobK-rl6sn
      @RobK-rl6sn 15 дней назад +1

      Gamers have destroyed gaming more than any developer ever could. Gamers are responsible for destroying the physical side of gaming: collecting and real ownership are pretty much gone because gamers are just not very smart. A group of millions of people conditioned to rent entertainment.

    • @miguellopez3392
      @miguellopez3392 15 дней назад

      @@RobK-rl6sn Lol no, gaming was destroyed in 2013 when the industry was showing billions in profits to people with BA degrees who think mass-market and subscription models are how you build a business.

  • @Gunnumn
    @Gunnumn 15 дней назад +43

    Wait for benchmarks. Every launch Nvidia has had since the 970 has had an issue on day 1. Plus you're paying more $$$ for artificial frames.

  • @unintellyjellie
    @unintellyjellie 12 дней назад

    I don't get why they can't make DLSS work in different layers. Make it separate out certain render sections: crucial items that should never blur would always be rendered natively, while less important items would use the AI cores. That way you get the performance plus the accuracy where it's needed.
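
    Something like this is already roughly how upscaler integrations treat UI, which is typically drawn at native resolution after the upscale. A toy sketch of the layered idea; upscale_ai is a hypothetical stand-in for the learned upscaler, not a real API:

        import numpy as np

        def upscale_ai(img, factor):
            # Placeholder for a learned upscaler; nearest-neighbour only to get the shapes right.
            return np.kron(img, np.ones((factor, factor, 1), dtype=img.dtype))

        scene_lowres = np.zeros((1080, 1920, 3), dtype=np.uint8)  # scene at 1/4 the pixels
        ui_native = np.zeros((2160, 3840, 4), dtype=np.uint8)     # crisp layer at full res
        scene = upscale_ai(scene_lowres, 2)                       # AI fills in the scene

        alpha = ui_native[..., 3:4] / 255.0                       # alpha-blend UI over scene
        frame = (scene * (1 - alpha) + ui_native[..., :3] * alpha).astype(np.uint8)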

  • @puregameplaysonhard
    @puregameplaysonhard 15 дней назад +38

    I think we brought this on ourselves. Many gamers are so obsessed with frame rates that graphics cards no longer provide "visuals" as such; they are simply focused on generating frames, while games look like their 2015 counterparts but in 4K...

    • @afelias
      @afelias 15 дней назад +24

      That's a lie. We're talking about the 60 fps standard, the lowest possible bar, set more than 10 years ago. Neither I nor anyone else should be gaslit into believing that "gamers have unreasonable standards."
      Games should run on reasonably affordable hardware at 60 fps, minimum, before any frame generation. That has been the standard for more than 10 years, the absolute minimum standard, and it's the industry's fault for failing to meet even that.

    • @bigturkey1
      @bigturkey1 15 дней назад +5

      did you even game in 2015?

    • @sadiqraga3451
      @sadiqraga3451 15 дней назад +2

      @@afelias Games can run at 60 fps, though, if everything else is sacrificed.
      Frankly, I see no issue with the RTX 50 generation. The AI and frame generation are for people who want 4K at 60 fps with ray tracing and path tracing.
      If you play at 1080p or 2K at 60 fps without ray tracing or path tracing, you won't need the AI frame generation.

    • @afelias
      @afelias 15 дней назад +3

      @@sadiqraga3451 Yes, but you shouldn't advertise any sort of improvement if it comes at the cost of the minimum bar.
      If you have to turn off RT, lower the texture quality, or lower the resolution and upscale just to hit 60 FPS, then that is the limit of the performance uplift. If games can't look better at 60 FPS, then there is no reason to spend more, or even the same amount of, money on new hardware. That's the point.
      It's not like it's anyone's fault but NVIDIA's for aggressively pushing RT in the first place. So they can't advertise any kind of uplift if they have to start below the minimum to create that comparison.

    • @puregameplaysonhard
      @puregameplaysonhard 15 дней назад

      @@bigturkey1 I gamed in the 1990s on an Amiga 600 :P

  • @AngeloValdejueza
    @AngeloValdejueza 15 дней назад +10

    NGL, that chain did add charisma. Now, it's gone. RIP Chain Asmon

  • @ct4nk3r
    @ct4nk3r 15 дней назад +33

    8:11
    DLSS Performance
    Screen Space Reflections Quality not on the maximum setting
    This is 1/4 of the original resolution, meaning this "4K" gameplay is really just 1080p upscaled, and 3 out of 4 frames are AI generated. As a simplified calculation you have to divide by ~7 (if you account for the actual cost of the upscaling, a more realistic figure is dividing by 5-6) to get the "real" performance of this card at native 4K.
    So the 110 FPS you see at 8:41 would be about 20-25 FPS
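
    Reconstructing that estimate (a sketch, not a benchmark; the 5-6x effective multiplier is the comment's own assumption for DLSS Performance plus 4x frame generation):

        # Displayed FPS divided by an assumed effective upscaling + frame-gen multiplier.
        displayed_fps = 110
        for multiplier in (5, 6):
            print(f"~{displayed_fps / multiplier:.0f} fps native at {multiplier}x")
        # ~22 and ~18 fps, the same ballpark as the 20-25 fps estimate above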

    • @denxx56
      @denxx56 15 дней назад

      Damn I can't believe DLSS isn't actually real, can't believe we've been lied to

    • @enderwigin7976
      @enderwigin7976 15 дней назад

      Could you do the same calculations for the 5090?

    • @StefanRial-i4f
      @StefanRial-i4f 15 дней назад +5

      @@enderwigin7976 We don't have to. They showed the 5090's raw performance without DLSS: it was 28 FPS, about 27% better than the 4090.
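
      That percentage lines up if the 4090 manages around 22 fps in the same scene, an assumed figure consistent with other comments here:

          print(f"{(28 - 22) / 22:.0%} raw uplift")   # ~27%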

    • @MrChologno
      @MrChologno 15 дней назад +1

      @@StefanRial-i4f That is the raw performance increase that makes the most sense. The 5090 is a 4090 that draws more power and has better AI performance, and that's it.

    • @idindunuphenwong
      @idindunuphenwong 15 дней назад

      LOL, 28 FPS natively. LMAO, no bag is big enough to make me ignore that big of a sign.

  • @jeremygarcia7605
    @jeremygarcia7605 15 дней назад

    You killed me with the chain breaking debuff 🤣👌🏼

  • @mikschultzyevo
    @mikschultzyevo 15 дней назад +38

    Not a fan of the AI shit. As others have said, all it really brings to the table is blur and texture smear.

    • @Grodstark
      @Grodstark 15 дней назад +10

      DLSS fan girls will flame you for this.

    • @bigturkey1
      @bigturkey1 15 дней назад +2

      My 4080 never smears anything.

    • @mikschultzyevo
      @mikschultzyevo 15 дней назад +5

      @@bigturkey1 then you're blind.

    • @bigturkey1
      @bigturkey1 15 дней назад +3

      @@mikschultzyevo What Nvidia card and monitor do you use to game at 4K? I use a 4080 and a 55" 4K OLED. No smearing, no blur.

    • @mikschultzyevo
      @mikschultzyevo 15 дней назад +2

      @@Grodstark and here one is
      I dOnt GeT SmEar wItH mY 4o80

  • @Jeffdaam
    @Jeffdaam 15 дней назад +28

    1440p for a monitor is plenty enough

    • @UkoKoromi
      @UkoKoromi 15 дней назад +3

      nah you need to play at 8k

    • @PerciusLive
      @PerciusLive 15 дней назад +3

      For most. The same way 1080p was the standard for 2020 and 720p was for 2015/2016, 1440p should be the standard for 2025. Benchmarking to 60 fps at ultra at native 1440p should be the optimization target now.

    • @UkoKoromi
      @UkoKoromi 15 дней назад +1

      @@PerciusLive you need to get an 8k tho

    • @Basuko_Smoker
      @Basuko_Smoker 15 дней назад +4

      @@PerciusLive I feel like 1080p has been the standard for waaay longer... by the late 2010s pretty much every single person had a 1080p monitor. Besides that, I couldn't agree more.

    • @Zanderb20
      @Zanderb20 15 дней назад

      Even 1080p is plenty for almost everyone, and it's so much easier to run than 1440p, let alone 2160p. For gaming alone I think 1080p is enough for most people, but for productivity work with multiple windows side by side, 1440p is the sweet spot.

  • @Daniel-kn9xr
    @Daniel-kn9xr 15 дней назад +10

    I don't know how I'm ever going to afford this shit. It's so annoying that PC games are unoptimized shit stains nowadays and you need DLSS and a 5070 Ti to run them well.

    • @therealkylekelly
      @therealkylekelly 15 дней назад +5

      Do you need to run the latest games at 4K 60? No. If you can be satisfied with 1080p, there are many options for you.

  • @ryanwatters2108
    @ryanwatters2108 13 дней назад

    Gaming laptops are also pretty commonly used as work laptops when the work requires graphics-heavy programs.

  • @PrisonMike49
    @PrisonMike49 15 дней назад +9

    People in the military and people that travel for work a lot use gaming laptops

  • @makshm
    @makshm 15 дней назад +13

    Not long ago, playing CS2, I noticed how viscous the controls felt, and opponents often killed me as if they saw me before I saw them. When I turned on the FPS counter, I saw that my frame time was jumping between 18 and 24 milliseconds. There are several factors here, of course; after working through a lot of them, I eventually got down to 11-14 milliseconds, and it radically changed how the game felt. If you think about it, 10 milliseconds is almost nothing, but in reality everything became faster: controls, reaction time. Even my skill went up and it became easier to get kills. What I'm getting at is that they feed us 300-500 fake frames with a delay of over 35 milliseconds, and that is just hell for multiplayer shooters.
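
    For reference, converting those frame times to frame rates is simple arithmetic:

        # Frame time in milliseconds to frames per second.
        for frametime_ms in (24, 18, 14, 11):
            print(f"{frametime_ms} ms/frame = {1000 / frametime_ms:.0f} fps")
        # 24 ms = 42 fps, 18 ms = 56 fps, 14 ms = 71 fps, 11 ms = 91 fps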

    • @Entropy67
      @Entropy67 15 дней назад +3

      It's great for movies pretending to be video games though

    • @mew1534
      @mew1534 15 дней назад +1

      For me, I like it, but my reflexes aren't as good as yours, so I'm not the best at competitive shooters. I am curious how the 5090 will do at 1080p; GPU-wise I just feel we're not there for gaming yet, but monitor-wise we are getting there.

    • @cheezus4772
      @cheezus4772 15 дней назад

      @@mew1534 " I am curious how 5090 will work with 1080p" dude. the 1060 is a 1080p card. if you are buying 5090 wondering "how it will do" in 1080p i am speechless

    • @cloudnine5651
      @cloudnine5651 15 дней назад

      This literally has zero to do with the conversation. CS is an ancient game that can run on a potato.

    • @Hamborger-wd5jg
      @Hamborger-wd5jg 15 дней назад +1

      @@cloudnine5651 CS2 updated the game engine, so it's more demanding than the original Counter-Strike.

  • @stevesavage3683
    @stevesavage3683 15 дней назад +12

    Honestly, I think Nvidia took this approach because their marketing strategies are lacking, and collaborating with YouTubers might help them come across as more genuine.

    • @thescarletpumpernel3305
      @thescarletpumpernel3305 15 дней назад +2

      their marketing is so far behind AMD at this point it's sad.

    • @TheDravic
      @TheDravic 15 дней назад +3

      @@thescarletpumpernel3305 10-15% market share AMD Radeon doesn't really scream "leadership in marketing strategy" like you think it does.

    • @alexturnbackthearmy1907
      @alexturnbackthearmy1907 15 дней назад +3

      @@TheDravic Yeah... at least Nvidia has something to show. Coming empty-handed is worse than not participating at all.

  • @MrToasty8705
    @MrToasty8705 14 дней назад

    A 49” ultrawide monitor is the way to go. It's two 27” screens in one. You can game at the 27” screen size with Streamlabs or whatever, or go super-widescreen full gaming mode, which is sick.

  • @tntTom174
    @tntTom174 15 дней назад +9

    just a side note: I think it should be called Big Black Case.

  • @vennece
    @vennece 15 дней назад +9

    The "Zuckerberg free speech" video wore the chain
    13:30 and now seeing it broken by next video it's like seeing the conclusion side quest story

  • @Griffin519x
    @Griffin519x 15 дней назад +5

    Gaming laptops are pretty cost-effective if you consider the included monitor. They aren't as powerful as a desktop, but they're pretty close. Plus they are portable, so they're nice to take to a friend's house or to play party games on the TV.

    • @zRidZz
      @zRidZz 14 дней назад

      Do they have ethernet ports?

    • @RealOneUp
      @RealOneUp 13 дней назад

      They do ​@@zRidZz

    • @triotudu2702
      @triotudu2702 13 дней назад

      @@zRidZz They do have 5 GHz WiFi.

  • @69memnon69
    @69memnon69 13 дней назад

    The advertising in Linus videos always reminds me of the early 2000s with all the pop-up ads.

  • @pawanrohidas1163
    @pawanrohidas1163 15 дней назад +42

    RTX 5090 giving 20 fps at native 4k with full ray tracing

    • @Primafaveo
      @Primafaveo 15 дней назад +5

      As the video shows, it does 28 fps, and that's with path tracing on.
      My 3080 Ti at 1440p gets 4-5 fps doing that.

    • @mewtilate420
      @mewtilate420 15 дней назад +5

      @@Primafaveo So we waited 2 years for 25% more computing power, a 575 W TDP, and a $2500 price tag. Very nice development, can't wait for the future.

    • @infinite683
      @infinite683 15 дней назад

      @@mewtilate420 25% to 30% is a pretty normal jump in a 2-year time frame though.

    • @idindunuphenwong
      @idindunuphenwong 15 дней назад +4

      @@infinite683 LOL, at these price points it's a rip-off if you don't get a happy ending. Just admit it, we've peaked technologically.

    • @cloudnine5651
      @cloudnine5651 15 дней назад

      @@Primafaveo It also wasn't 4K.

  • @t1m3out
    @t1m3out 15 дней назад +14

    All of these examples of artifacts and jittering are why I DO NOT use RT or FSR4/DLSS. I always turn off motion blur in games... I'm old and I hate the new shit that papers over major flaws...

  • @Severism
    @Severism 15 дней назад +5

    $1000 is unrealistic for one reason above all: all the different manufacturers of each PC component charge what they want, even for very basic bare-bones features, and then there are the periodic gimmick advancements in hardware/software. It'll never really happen without "non-name-brand" levels of cost reduction.

  • @itstusk9495
    @itstusk9495 13 дней назад

    So basically the new GPUs are using AI to upscale/downscale to keep performance up? That sounds like something we already have, 'Dynamic Resolution', but implemented on the GPU by default for every game, I guess? And DLSS gets a boost.

  • @TriPBOOMER
    @TriPBOOMER 15 дней назад +5

    The 5000-series prices, to me: they found the price cap for each class with the 4000 series and used the successful 'Super' card prices to gauge the 5000-series cards. The 5090 they can price at whatever they want... because... what else are you buying at that level? 🤣

  • @E1250-qy9hl
    @E1250-qy9hl 15 дней назад +9

    Jensen is elbow deep up in Linus.

  • @Einacht
    @Einacht 15 дней назад +15

    This 50 series is probably just a substandard byproduct of their ongoing Blackwell AI research.
    The money shifted from their GPUs to their AI enterprise instead.
    It's like petroleum byproducts: you really want the petrol, but why not sell the byproducts too.

  • @DeepFriedOreoOffline
    @DeepFriedOreoOffline 3 дня назад

    I think the telling thing that no one seems to be taking into account about DLSS 4 is the input latency. While it is generating frames in between your current and next frame to give you a smoother image, it is not reacting to your inputs. For many games this will be imperceptible. However, many other games, including most FPS games, games like Rocket League, rhythm games, and action combat games, are going to feel somewhat sluggish to play regardless of the visual smoothness. A lot of people won't even know what the issue is; they will probably blame the game.
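
    A minimal model of why the counter goes up while the feel doesn't (illustrative numbers; real pipelines add buffering and latency-reduction tricks like Reflex):

        # Generated frames raise displayed FPS, but inputs are only sampled per rendered frame.
        rendered_fps = 30       # frames the engine actually simulates from your input
        generation_factor = 4   # 4x multi frame generation

        print(f"displayed: {rendered_fps * generation_factor} fps")   # 120 fps on the counter
        print(f"input sampled every {1000 / rendered_fps:.0f} ms")    # ~33 ms: feels like 30 fps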