It's getting Even Worse... RTX 4060 Ti tested on a PCIe 3.0 System

  • Published: 18 Oct 2024

Comments • 1.9K

  • @sojirou
    @sojirou Год назад +1855

    Thank you for testing the 4010Ti so comprehensively, not a lot of channels would put in this kind of effort for a simple 2d display adapter.

    • @_mrsahem_
      @_mrsahem_ Год назад +167

      GT 4030 when

    • @potatoes5829
      @potatoes5829 Год назад +123

      @@_mrsahem_ nah it'll be the RT4030. gotta sell that raytracing to someone :)

    • @KamalSingh-ny9vw
      @KamalSingh-ny9vw Год назад +60

      NAH bruh it's GTX 4010Ti

    • @SinNumero_no41
      @SinNumero_no41 Год назад +41

      @@potatoes5829 lol rt 4030, what a great idea, ngreedia should hire you :)

    • @C0bblers
      @C0bblers Год назад +34

      @@_mrsahem_ The 75W, PCIe 2.0, 12-bit bus, 16GB model with the oversized cooler will be the one to buy.

  • @DanTheYeen
    @DanTheYeen Год назад +30

    Glad you're pointing this out. One thing I'd love to see next time is a card's PCIe 3 vs 4 performance plotted on the same graph.

    • @FiveFiveZeroTwo
      @FiveFiveZeroTwo Год назад +2

      Indeed, IMO the current graphs / benchmarks don't show the potentially lost performance at all.

  • @Barryhick186
    @Barryhick186 Год назад +1304

    Reviewers recommending the 4060 GPU for older systems have absolutely lost every bit of confidence I had in them.

    • @BikingChap
      @BikingChap Год назад +115

      Quite. I run a 9900K and hadn't seen this referred to elsewhere. Perhaps because everyone runs absolutely top-end hardware, they don't think of checking on older platforms.

    • @m-copyright
      @m-copyright Год назад +102

      @@BikingChap They all use the best CPU at the time in order to exclude bottlenecking. And while that's great because it shows what the card can do without being held back, it also has downsides like the one we see here.
      We still have older CPUs that are way more than capable of handling newer GPUs.

    • @j33k83
      @j33k83 Год назад +60

      If you're into upgrading old hardware, I don't think Nvidia is the right choice.

    • @sketters9400
      @sketters9400 Год назад +30

      @@j33k83 agreed, upgrading old hardware is always better through AMD

    • @BikingChap
      @BikingChap Год назад +6

      @@m-copyright Sure, I can see why they do that, and it's difficult to think of every scenario, but now that someone's called it out it seems obvious. Easy to be wise after someone else has seen it, I guess.

  • @Nelthalin
    @Nelthalin Год назад +108

    It would have been nice if results for the 4060 Ti at x8 4.0 were also in the tables. I guess it makes quite a difference, but it would have been nice to compare it directly.

    • @Alberto_RiscvsCisc
      @Alberto_RiscvsCisc Год назад +3

      Nvidia paid for this test, that's why ...

    • @Igor-Bond
      @Igor-Bond Год назад +1

      Please tell me which video card is better to buy: the RX 7600, which costs 42,599 rubles, or the RTX 4060 Ti, which costs 43,990 rubles?

    • @MsNyara
      @MsNyara Год назад +2

      @@Igor-Bond RX 6700 XT at 29120 rubles is the way to go, beats both cards by a generous margin.

    • @joefowble
      @joefowble Год назад +4

      Agree, but then you're not running 4.0 on the same Intel 10th-gen test bench.

    • @darvengames
      @darvengames Год назад

      @@Igor-Bond 4060 Ti

  • @soulshinobi
    @soulshinobi Год назад +292

    DLSS 2 is a useful tool, but I agree with Hardware Unboxed that DLSS 3 should not be included in benchmarks, because it does not provide the responsive experience that the frame rate suggests. It's just a visual filter.

    • @chubmouse
      @chubmouse Год назад +26

      Yep. And then there are the instances where it adds input latency as a trade-off, which is an objective downgrade to the gameplay experience, and the worse the added latency, the bigger the downgrade.

    • @superpulaski9767
      @superpulaski9767 Год назад +7

      The best use cases I have for DLSS 3 with frame gen are titles it has to be modded into, lol. I use it in my Skyrim mod list and Elden Ring, but using it in Darktide or Cyberpunk makes the experience feel very meh. Using a 4090.

    • @no-barknoonan1335
      @no-barknoonan1335 Год назад +4

      @@chubmouse It always adds more latency, even if only a little. The only game it's worth it in is MS Flight Sim imo.

    • @alderwield9636
      @alderwield9636 Год назад +10

      b-b-but DLSS is every Nvidia fanboy's last resort when they're cornered 😢

    • @TheVanillatech
      @TheVanillatech Год назад +2

      How much is Putin .... I mean, AMD paying you?!?!?!?

  • @wagnonforcolorado
    @wagnonforcolorado Год назад +101

    I think adding the DLSS, XeSS and FSR frame rates is useful information. At the same time, showing latency values while using these options is needed, so a consumer can decide if the latency penalty is worth the increased frame rate. Thank you for covering the PCIe 3.0 information as well. A lot of systems were built on gen 3, and a reviewer pointing out the possible bandwidth limitations helps in deciding which upgrades to make.

    • @Erik_Dz
      @Erik_Dz Год назад +17

      It's important to tell consumers that it's not a real frame rate increase either. They are fake frames; they are not rendered by the game but 'generated' by the GPU. It is frame 'smoothing', not actually improved FPS. This is why they should not replace the actual rendered FPS on graphs with frame generation data. For example, in the Cyberpunk 2077 graph with the added DLSS bar, an actually powerful GPU rendering at 152 fps will not only have lower latency but will look visually better than a 4060 Ti using DLSS. DLSS 3 may be better, but it is still just fake frames generated to make the game appear higher FPS than it actually is. It's not magically improving FPS by flipping a switch (which is what Nvidia wants people to think).

    • @KibitoAkuya
      @KibitoAkuya Год назад +12

      @@Erik_Dz XeSS and FSR are NOT generating frames, that's not how they work. They (basically) render the frames at a lower resolution and feed them to an algorithm that upscales them while maintaining good quality (it's a little bit more complicated than that; just upscaling images alone makes it worse, but Intel uses an AI model to try and improve the quality, and AMD uses a few techniques together to do the same thing, trying to get as close as they can to what a native image would have been like).
      Neither does DLSS 3, *BUT* it has an extra option for frame interpolation, which basically uses AI to "guess" a frame in between two already generated frames. THIS is what worsens latency, because there is always one already-finished real frame held ahead of every frame you're presented (NVIDIA misleadingly markets this as "reducing latency" because frame interpolation mode forces Reflex on to mitigate that latency penalty, but you will indeed experience a penalty if you were already using Reflex beforehand).
      Now, AMD's recently unveiled FSR 3 will also feature interpolation (with their own techniques to do so), but they do admit that there will be latency penalties with frame interpolation (a rough timing sketch of that held-back frame follows below).
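
      To make the held-back-frame point concrete, here is a toy timeline in Python. It only illustrates the scheduling idea, not the real DLSS 3 or FSR 3 pipelines, and the frame times are made up:

real_ms = 20.0                                   # 50 "real" fps, purely illustrative
real_frames = [i * real_ms for i in range(4)]    # when real frames finish rendering

# Without interpolation: each real frame goes to the screen as soon as it is done.
shown_without_fg = real_frames

# With interpolation: real frame N is held back until frame N+1 exists so an
# in-between frame can be shown first, so every real frame reaches the screen
# roughly one real frame-time later than it otherwise would.
shown_with_fg = [t + real_ms for t in real_frames]

print("shown without FG (ms):", shown_without_fg)
print("shown with FG    (ms):", shown_with_fg)
# The displayed frame rate doubles, but input is still sampled on real frames,
# which is why interpolation tends to add latency rather than remove it.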

    • @zodwraith5745
      @zodwraith5745 Год назад

      @@Erik_Dz While that's true with DLSS3FG, it's not true with DLSS2 or 3 minus the FG. (DLSS3 works on 20 and 30 series, it's only the frame gen aspect that nvidia gatekept) DLSS upscaling can often produce a _better_ image than native and HUB already proved that. FG is up in the air and entirely subjective if you like high fps for the visuals or the responsiveness. That depends on the types of games you play. But anyone that blindly turns off DLSS for scaling is an idiot when it _always_ gives you superior latency, and often gives you better than native visuals as long as you're using quality mode. This is where we get into the grey area of games that support DLSS but not FSR. This needs to be a note in benchmarks cause anyone that _needs_ the extra performance is ALWAYS going to enable DLSS. It's not like they're going to say "I can't enable that because it's not fair to AMD users."

    • @zodwraith5745
      @zodwraith5745 Год назад

      @@KibitoAkuya I've got a sinking feeling FSR3 is going to suck. It's taking really long to come out and they don't have dedicated hardware for it. 30 series DOES have the dedicated hardware but doesn't have as big a frame buffer as 40 series. (Nvidia's bullshit excuse) FSR3 would most likely work _better_ on 30 series than RDNA, but of course AMD wants to change direction and limit FSR3 to AMD only. It's already been theorized Nvidia could easily enable FG on 30 series but they badly wanted a new feature they could gatekeep to 40 series. Typical Nvidia.
      The worst part is, imagine anyone that dropped 2 freaking grand on a 3090ti to have Nvidia only 6 months later spit in your face on DLSS3. I'd be fucking pissed. It pisses me off with my 3080ti and I spent far less than MSRP for it.

    • @bosstowndynamics5488
      @bosstowndynamics5488 Год назад +1

      ​​@@KibitoAkuya Is DLSS3 frame generation that basic? Would have thought with the fact that it requires specific support that it could do asynchronous reprojection and just fine tune the result, with reprojection being a much older technique that genuinely reduces latency (very common in VR games on mobile headsets).
      The only real disadvantage for AMD for "work" (you didn't specify so I'm making assumptions about which workload you have) is that it won't support CUDA software, but most client side applications don't use CUDA for hardware acceleration and AMD support in the GPGPU realm is getting better rapidly, so unless you're doing AI you *should* be good (and *might* be good even on AI - be sure to research hardware support for your intended application)

  • @jakov175
    @jakov175 Год назад +161

    Thank you for putting emphasis on the power consumption. Especially important with something like the 4090 because in gaming it rarely ever runs at the max listed TDP.

    • @hop-skip-ouch8798
      @hop-skip-ouch8798 Год назад +4

      Same. Our power consumption bill is calculated in slabs, so the more we consume, the more expensive each unit gets. And it barely gets talked about in most reviews (a toy slab-billing example is sketched below).
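
      A toy illustration of the slab (tiered) billing described above; the slab boundaries and rates below are made up for the example:

SLABS = [(100, 0.10), (200, 0.18), (float("inf"), 0.30)]  # (kWh upper bound, rate per kWh)

def bill(total_kwh: float) -> float:
    """Cost of a month's usage under the made-up tiered rates above."""
    cost, prev = 0.0, 0.0
    for limit, rate in SLABS:
        used = min(total_kwh, limit) - prev
        if used <= 0:
            break
        cost += used * rate
        prev = limit
    return cost

# The same 40 extra kWh from a hungrier GPU costs more once it lands in a higher slab:
print(round(bill(240) - bill(200), 2))   # ~12.0 (billed at the top 0.30 rate)
print(round(bill(140) - bill(100), 2))   # ~7.2  (billed at the middle 0.18 rate)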

    • @marinipersonal
      @marinipersonal Год назад +4

      Exactly. My 4080 uses much less power than my 3080 Suprim. Runs very cool and as I don’t use high frame rates, only ultra on image quality, usually around 200w. Not bad.

    • @raulsaavedra709
      @raulsaavedra709 Год назад +7

      2x. Furmark is good to include to know the peak sustained power a card can pull, e.g. useful for power-supply capacity planning (a rough sizing sketch is below). But that's way too high a load compared to everyday usage, even for the most demanding games or graphics applications. The call for better/more granular power analysis from other reviewers was very much pertinent.
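
      A minimal power-supply sizing sketch for the capacity-planning point above; the component figures are placeholders, not measurements from the video:

# Peak numbers like a Furmark run tell you what the PSU must survive,
# even if typical gaming draw is far lower.
gpu_peak_w = 285    # placeholder sustained GPU peak under a Furmark-style load
cpu_peak_w = 150    # placeholder CPU peak
rest_w     = 75     # board, RAM, drives, fans
headroom   = 1.3    # ~30% margin for transients and ageing

recommended = (gpu_peak_w + cpu_peak_w + rest_w) * headroom
print(f"Recommended PSU: ~{recommended:.0f} W")   # ~663 W -> pick a 650-750 W unit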

    • @alexrusu6417
      @alexrusu6417 Год назад +2

      Copium!

    • @dakoderii4221
      @dakoderii4221 Год назад

      @@hop-skip-ouch8798 You pay more for utilities to stop "climate change" and "racism". The experts would never lie for nefarious reasons. It is impossible that someone with an education could ever do any wrong. It's best to blindly believe them and persecute the heretics who dare defy the gods of science. Now eat ze bugs! You'll own nothing and be happy!

  • @iseeu-fp9po
    @iseeu-fp9po Год назад +15

    The PCIe 3.0 comparison is so important here. I have been wondering about that myself but have not seen any other reviewer mention this.

  • @nothingisawesome
    @nothingisawesome Год назад +77

    You are 100% right about benchmarks with DLSS; including those in the baseline benchmarks is a horrible idea. I know it's more work for reviewers, but I think it's fine to give those figures out, though I would summarize them at the end somehow, i.e. here is the DLSS/FSR number. As it stands now I wouldn't include frame gen. Also consider that the point of benchmarks is having the same settings on all the products tested. Man, that's just annoyingly complicated.

  • @TroubleSeekerTeam
    @TroubleSeekerTeam 11 дней назад +2

    I replaced my GTX 1060 6GB with an RTX 4060 8GB in an old PC with PCIe 3.0; as you can guess, I am happier than ever.

  • @JCBeastie
    @JCBeastie Год назад +205

    Nvidia is really asking us to overlook the disappointing hardware because they have a software trick that makes it look better on paper.

    • @jacobrzeszewski6527
      @jacobrzeszewski6527 Год назад +27

      That's literally been Nvidia's thing since the beginning.

    • @lyonscultivars
      @lyonscultivars Год назад

      Well said.

    • @dagnisnierlins188
      @dagnisnierlins188 Год назад +6

      After today's keynote at Computex it's clear that AI and data centres are the priority for them now.

    • @mr.unknown4589
      @mr.unknown4589 Год назад

      Those nGreedia suckers who show the middle finger to customers are just to be avoided for good.

    • @pandemicneetbux2110
      @pandemicneetbux2110 Год назад +6

      @@jacobrzeszewski6527 Yeah but they did kind of at least deliver before, back when Maxwell came out it was truly one of the most efficient and most performant GPU generations, even though back then nVidia literally got sued for lying about their VRAM on the 970 (and lost, not a good look when a corpo and its fleet of attorneys can still manage to lose and not buy a legal victory). Pascal was also good, it also delivered on performance, and AMD back then just didn't, their main value was in the lower and midrange but even then R9 300 was just not as performant and efficient.
      Also, back then they didn't put their bullshit front and center; TXAA, PCSS and HBAO+ weren't being parroted in every outlet like they were some important thing to have, and nobody ever implied you should buy a GPU just for better hairworks performance, but here we are and that's exactly what they are doing. I don't give two shits about frame generation. It also looks weird and bad upscaling, so it kind of defeats the point to enable raytracing when you run worse fps and it looks worse too thanks to DLSS. And no, idgaf that it's better now than 1.0; 1.0 was literally unplayable garbage and anyone who bought an RTX 2000 series card ended up feeling cheated. The problem is they keep getting worse and worse. It's like they're drunk on their own hubris at this point, truly expecting their loyal customers to be brainless sheep. "The more you spend the more you save", "we're like the new iPhone"... I mean, fucking really?

  • @allyoucouldate9510
    @allyoucouldate9510 Год назад +4

    Congrats for covering the PCIe 3 x8 issue!
    My concern with this is the stuttering that can occur when the system moves data between VRAM and RAM, because you will have about 8 GB/s on x8 lanes and not the roughly 16 GB/s you would have on x16 lanes, so theoretically double the stutter time (rough numbers are sketched below). This issue is amplified by the small 8GB of VRAM; on the 16GB version of the card there are no issues with bandwidth.
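
    A rough back-of-the-envelope for the bandwidth numbers above (theoretical one-way maxima after 128b/130b encoding; real-world throughput is lower):

GT_PER_S = {"3.0": 8, "4.0": 16}       # giga-transfers per second, per lane
ENCODING = 128 / 130                   # 128b/130b line coding for gen 3 and 4

def link_gbps(gen: str, lanes: int) -> float:
    """Approximate one-way PCIe bandwidth in GB/s."""
    return GT_PER_S[gen] * ENCODING / 8 * lanes   # bits per transfer -> bytes

for gen, lanes in [("3.0", 8), ("3.0", 16), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_gbps(gen, lanes):.1f} GB/s")

# PCIe 3.0 x8  ~  7.9 GB/s   <- what an x8 card gets on a gen-3 board
# PCIe 3.0 x16 ~ 15.8 GB/s
# PCIe 4.0 x8  ~ 15.8 GB/s   <- what the 4060 Ti was designed around
# PCIe 4.0 x16 ~ 31.5 GB/s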

  • @1Grainer1
    @1Grainer1 Год назад +236

    No DLSS is the best way of testing: no 4 scenarios to include, no wasting reviewers' time, it gives the "worst case scenario" in the performance department and takes into account games that don't support it.
    Overall a better way of testing, since it gives a rough estimate for the majority of games that are not bugged or badly optimised.
    DLSS/FSR/XeSS can be included in an RT low-performance scenario to show whether it can go from 10 fps to 30 fps and be at least playable.
    Edit: with the last one I meant DLSS 2/FSR 2/XeSS, since those are somewhat supported by games; DLSS 3 is frame-transition smoothing, which could only help in games like "Life is Strange", if anything, since that's the only kind of game where you really don't care about latency (choosing dialogue options would probably still be horrible).

    • @bradhaines3142
      @bradhaines3142 Год назад +7

      The problem is Nvidia pushes reviewers really hard to pitch that so it looks better.

    • @cpt.tombstone
      @cpt.tombstone Год назад +2

      I'd agree if Frame Generation's performance uplift were as linear as the performance uplift of DLSS. With DLSS, you can just calculate the difference in pixel counts between the output resolution and the render resolution and you will be very close to the actual performance uplift you get in game. With Frame Generation it's really not that easy, because the base framerate with FG on and off is not the same: on my 4090, in a GPU-bound scenario, the base framerate can be 25-30% lower with Frame Generation on. FG then doubles the effective framerate, and you end up with a 65-80% boost overall. However, in a CPU-bound scenario, you get a clean 100% uplift. This is very evident in poor-quality PC ports like Hogwarts Legacy and Jedi Survivor, where if you enable RT you cannot get over 60 fps, but Frame Generation doubles the framerate very effectively. Elden Ring is similar: there is a 60 fps lock on the game, and Frame Generation will double it to 120 fps without you messing up the gameplay because of invulnerability frames not lasting as long as intended.
      So I think it could be very informative to test games with frame generation as well (not instead of native, that's disingenuous). A back-of-the-envelope for the pixel-count estimate is sketched below.
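
      A back-of-the-envelope for the pixel-count rule of thumb above; the scale factors are the commonly cited DLSS presets, and "uplift roughly equals pixel ratio" is only an approximation for fully GPU-bound cases:

# Commonly cited per-axis render-scale factors for DLSS upscaling presets.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def pixel_ratio(preset: str) -> float:
    """Output pixels divided by internally rendered pixels."""
    s = PRESETS[preset]
    return 1.0 / (s * s)

for name in PRESETS:
    print(f"{name:17s}: ~{pixel_ratio(name):.2f}x fewer pixels shaded")
# Quality mode shades ~2.25x fewer pixels, so ~2.25x is a rough upper bound on the
# FPS gain when fully GPU-bound; frame generation does not follow this rule.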

    • @salmon_wine
      @salmon_wine Год назад +16

      @@cpt.tombstone frame generation really kind of is cheating though as it only appeals to the eye candy factor of a high frame rate, and a high frame rate is already pretty superfluous to an eye-candy focused game. Thus, the main use of frame generation is to hit 60fps in a title where you are using graphics settings definitely too high for your gpu. Idk, I feel like there's more validity in testing native vs DLSS vs FSR than there is including frame generation in there

    • @joey_f4ke238
      @joey_f4ke238 Год назад

      @@salmon_wine Any game that isn't some competitive multiplayer title will benefit from frame generation; it is a really great tool for maximum visual fidelity while retaining a smooth experience. I have always dropped graphics settings to get at least 60 fps or more in every game I played, and I would have loved such a tool back when I played MH World or other really pretty games and my GPU just wasn't quite up to the task.

    • @cpt.tombstone
      @cpt.tombstone Год назад

      @@salmon_wine I strongly disagree. 60->120 in a game that cannot achieve more than 60 fps is very strong use case for the tech. I've been playing a heavily modded Skyrim with frame generation, I only get 60-70 fps in the wilds at 4K even with a 4090. Turning on Frame Generation, now it's an almost perfect 116 fps experience, and the game is super smooth. Very-very minor latency impact as well, even with Reflex not working in this version of the FG mod.

  • @xingbairong
    @xingbairong Год назад +19

    Excellent video. Glad to see someone paying more attention to power consumption.
    To be honest the card isn't bad; the price, however... Just a few years ago this card would've probably been in the $220-$250 range, and realistically that's where it should be.

    • @marstedt
      @marstedt Год назад +4

      I disagree because the name of this product implies that a performance upgrade should be expected. If it was cheaper AND renamed the 4060 (or 4050Ti) then I think it would be worthy of praise. I think most people would expect a 4060 to be equal or better in performance to a 3060Ti AND consume same/less power. Nvidia has failed / misguided the public on two fronts, price and performance.

    • @vsm1456
      @vsm1456 Год назад +1

      @@marstedt 4060 (non Ti) should perform around 3070, that would be a proper generational improvement as we usually saw in the past

    • @MsNyara
      @MsNyara Год назад

      No, the card is bad; it has a serious imbalance of memory and bus width for its otherwise real power, leading to severe bottlenecks with itself, which is a very bad design.

    • @xingbairong
      @xingbairong Год назад

      @@MsNyara So if the RX 7600 and RTX 4060 Ti were the same price you would tell people to get the RX 7600?

    • @MsNyara
      @MsNyara Год назад

      @@xingbairong The RTX 4060 Ti if they're the same price, or up to $30 more, but that is mostly comparing two terribly designed cards with each other.
      Just buy an RX 6600 for $200, an RTX 3060 12GB for $280 or an RX 6700 XT for $350; the rest does not make sense in the performance segment.

  • @Saieden
    @Saieden Год назад +156

    As an owner/custodian of two B450M Mortar systems, this is extremely relevant to me if I decide to upgrade the 3060ti and 6700xt they house at the moment.

    • @ManuSaraswat
      @ManuSaraswat Год назад +11

      I'm in the same boat. I built my system with a Ryzen 1700 and an Aorus B350 board; when it died Aorus gave me a free B450 upgrade, and now I'm running a 5800X and a 1070. No way I'm buying Nvidia, because their pricing and products make no sense right now; 6700 XT it is.

    • @Saieden
      @Saieden Год назад +6

      @@ManuSaraswat Yeah, I've been milking the first B450 for 4 years now and still running a 3700x in my rig that I want to replace with a secondhand 5800x3d/5900x/5950x/ for another few years when the time comes. I'd completely forgotten that I was still on 3.0 until this popped up lol.

    • @AndrewTSq
      @AndrewTSq Год назад +4

      I have an Asus Prime B450 with a 3600 that I never upgraded the BIOS on, so it's still a PCIe 4 system :)

    • @grants7390
      @grants7390 Год назад

      @@AndrewTSq According to the Asus website that's a PCIe 3 board.

    • @AndrewTSq
      @AndrewTSq Год назад +4

      @@grants7390 Yes, because AMD forced all B450 boards that supported PCIe 4 to remove it with a BIOS update... but I never updated mine, so it's still PCIe 4. Just search "PCIE4 removed from B450" on Google. Not only Asus; Gigabyte had it too: "Gigabyte, who introduced a Firmware BIOS switch in their X470 motherboards allowing you to enable PCIe Gen 4.0 (partially). Once that happened a lot of motherboard manufacturers felt obligated to follow. Meanwhile, AMD has been stating that non-500 series motherboards will not get PCIe Gen 4.0 support."

  • @tee_es_bee
    @tee_es_bee Год назад +40

    DLSS, similarly to RT, is just an extra _optional_ feature that _might_ add value in _some_ games. Thus I would primarily focus on pure rasterization, then include RT, then include DLSS/FSR/whateverIntelsTechIs. Thank you for including the power numbers. As someone who donates his compute when not gaming, power is much more important to me. 🧡💛🧡

    • @beatsntoons
      @beatsntoons Год назад +2

      Agreed on the power metric. It's hugely important to me. I went from a 2070S to a 4070, and the power draw is about the same or a little less, but the performance is better.
      I'm happy with similar performance as well, if the power draw keeps falling. PC parts can consume so much power these days and power generation is expensive.

    • @pliat
      @pliat Год назад

      XeSS

    • @Melsharpe95
      @Melsharpe95 Год назад

      DLSS is such an important tech that it does need to be tested and factored into the purchase.
      Especially seeing as AMD themselves are working on their own DLSS 3.0 variant.

  • @markus.schiefer
    @markus.schiefer Год назад +98

    Frame Generation should not, for the foreseeable future, be part of any comparative chart. Way too many games don't support it, and depending on the person and/or game it has negative side effects like increased latency.
    As for DLSS/FSR 2: while I think it works well enough, the quality depends on resolution and settings. Quality or balanced settings might be fine at 4K for most people, but for many even quality settings might not be acceptable at 1080p. On top of that, some popular games, even new ones sometimes, still don't support it. DLSS 1/FSR 1 shouldn't even be considered.
    At the moment, comparison at native resolution without those features is simply the fairest.

    • @RN1441
      @RN1441 Год назад +15

      Frame generation and upscaling technologies are very vulnerable to abuse. In concept Nvidia can generate any framerate they want at any resolution they want if they are ok with grotesque artifacts and distortion. I really prefer to see the apples to apples raw performance comparisons and go from there.

    • @LuminousSpace
      @LuminousSpace Год назад +1

      DLSS and FSR are really great for mobile devices like the 7940U and Asus's RDNA3 handheld.

    • @Jaml321
      @Jaml321 Год назад +12

      Frame generation should never be the focus when testing a GPU. It is a nice feature to have, but I pay for raw performance, not for fake frames.

    • @Val-bx6gn
      @Val-bx6gn Год назад +5

      The thing that many people tend to forget or even misunderstand about FG is that it isn't even for people with low fps. The lower your fps is the harder it is to interpolate between frames. And it does nothing to fix the input lag which usually is pretty bad at low fps. This feature is for people who already have pretty decent >60 fps and who also have high refresh monitors, it allows them to saturate the pipeline to further visual smoothness. It's a "win-more" feature.

    • @Anankin12
      @Anankin12 Год назад +4

      @@RN1441 Not to mention, DLSS 3 is literally a performance worsener in online competitive gaming.
      Increases latency, introduces false information (uses ML to predict what an object will do; if said object for example changes direction suddenly, DLSS3 will predict wrong and show you wrong information), etc.
      It's ok in slower paced stuff, but at that point... Do you truly need the high fps?
      It can't even be used to reach "playable" on heavy titles, because if native FPS are low enough you'll get untenable latency anyways. At the moment, the only actual use for it is basically VSync in single player titles: you can set it to match your screen refresh rate and that's it.
      Basically if you get 45 FPS in a non latency sensitive game, you can use it to get 120 fps with minimal drawbacks. Below 45 fps, but it depends on one's tolerance for it, it's useless because even if you get 100 fake frames you'll have an actual frame time of like 60 ms or more, which is about equivalent to playing at 15 fps. Literally unplayable.
      It can be a great technology, but in gaming it has very limited use at the moment (and in the future if they don't change something about it).

  • @gruzzob
    @gruzzob Год назад +11

    Hi Roman, a couple of thoughts on this whole thing:
    1. I know you tested the 4060 Ti on a PCIe 4.0 system in your previous video, but it would have been good to see those results included here as well. Even though it was a different system (and probably more powerful, though there are ways to control for that), it could help give an idea of what kind of performance is being lost over the older bus.
    2. Re DLSS 3: I say show it. People do use it, it does give extra framerate (potential quality issues aside), and it allows intergenerational testing.
    3. Keep up the good work, you have a good perspective on things.

  • @truckerallikatuk
    @truckerallikatuk Год назад +75

    The issue with DLSS is that it's a massive quality drop at 1080p; it's only realistically usable at 1440p or higher. Edit: at least it was for DLSS 2.0. The base render resolution when using DLSS didn't have enough resolution steps below 1080p to maintain quality even with the AI upscaling.

    • @Xayc__
      @Xayc__ Год назад +11

      It's the same with DLSS 3, because it uses the same DLSS super resolution. The difference is that DLSS 3 also uses frame generation tech that draws a bunch of fake frames and makes games visually smoother, but it doesn't improve responsiveness (input lag) and usually makes it worse, while a real fps boost improves input lag.

    • @mryellow6918
      @mryellow6918 Год назад +6

      Still looks bad at 1440p

    • @caribbaviator7058
      @caribbaviator7058 Год назад +3

      @@mryellow6918 Agreed, it doesn't look great at 1440p.
      FSR looks even worse at 1440p.

    • @TheFriendlyInvader
      @TheFriendlyInvader Год назад +2

      You really shouldn't notice that much of a difference on the higher quality settings of DLSS; if you do, it's probably not DLSS that's the issue as much as a game-side shader bugging out due to sampling issues. Of course if you push it to the limit and use Ultra Performance you'll start seeing resolution issues, but that's the exception to the norm.

    • @stevensgarage6451
      @stevensgarage6451 Год назад +2

      it looks better than native in cyberpunk

  • @zwerushka1
    @zwerushka1 Год назад +1

    Excellent sound card with a video output)))

  • @iamdarkyoshi
    @iamdarkyoshi Год назад +50

    Love to see constructive feedback between channels. I love GN's content as well, and agree some power tests per game would be a great addition.

    • @-opus
      @-opus Год назад +4

      It is definitely something they are lacking, but I guess most of their viewers are from the US and (according to a lot of YouTube comments) they don't seem to care about power usage, either for the power cost or the cost to the environment.

    • @GrizzAxxemann
      @GrizzAxxemann Год назад +6

      @@-opus A lot of places in the US are on nuclear power. It's about as clean and efficient as you can get.

    • @19alive
      @19alive Год назад +4

      Also, GN needs to change those bar charts and how he presents the numbers; it looks like a mess, with too many numbers on top of each other.

    • @-opus
      @-opus Год назад +2

      @@19alive Agreed, too many comparisons. The only issue is that there are often people questioning why certain models are still excluded.

    • @-opus
      @-opus Год назад +1

      @@GrizzAxxemann I guess that explains a lot; perhaps it is why they like RGB so much, it too glows in the dark. I am surprised no one has made a PC case shaped like a three-eyed fish.

  • @DownwithEA1
    @DownwithEA1 Год назад +1

    Thank you! I've also been very curious about the power consumption comparisons. This was a very helpful perspective.

  • @gyokzoli
    @gyokzoli Год назад +8

    Awesome content as usual! You really deserve a bigger audience.

  • @DGCastell
    @DGCastell Год назад +114

    Right when you mentioned it was running at x8 in the last video, I knew it was gonna be a big problem for people like us who are still on PCIe 3.0.
    This is important because I know quite a lot of people will get this card not knowing about this potential performance loss.

    • @EightPawsProductionsHD
      @EightPawsProductionsHD Год назад +9

      There's no excuse in 2023 for buying new hardware without checking out the numerous online written and video reviews first.

    • @WSS_the_OG
      @WSS_the_OG Год назад +17

      Yes, but! The large L2 makes up for everything, even the lack of RAM. Right? Right?

    • @Soundsaboutright42
      @Soundsaboutright42 Год назад +1

      @@EightPawsProductionsHD There hasn't been that excuse for a very long time. People are just dumb to be honest. I'm shocked some people can even Velcro their shoes in the morning.

    • @jemma2607
      @jemma2607 Год назад +1

      So, what GPU do you recommend for 1080p-1440p with PCIe 3? I've read about the RX 6750 XT, but some say it won't be a good option if I also want to work with the system, so... can you give me any recommendations?

    • @simonb.8868
      @simonb.8868 Год назад +17

      @@EightPawsProductionsHD there is no excuse for them cutting the card down to 8 lanes

  • @Viking8888
    @Viking8888 Год назад +45

    Thx Roman! I appreciate you testing the differences between PCIe 3 and 4 with this card. It really shows how delusional Nvidia has been with the 40-series pricing.

    • @nelsonrobe-from3278
      @nelsonrobe-from3278 Год назад +12

      I needed an upgrade, and after watching the 4060 Ti reviews I immediately went out and bought a 6700 XT, which has almost the same performance, 50% more VRAM, and was $70 cheaper than the 4060 Ti.

    • @ResidentWeevil2077
      @ResidentWeevil2077 Год назад

      Ngreedia wants to go the high end/workstation route, and that's exactly what ended their former competitor 3dfx. Now it seems Ngreedia is bent on following in 3dfx's footsteps and they'll soon get their just rewards for their nonsense. The irony of this situation is highly amusing; it'd be even more amusing if Ngreedia goes under and sells their IP to Intel.

    • @EbonySaints
      @EbonySaints Год назад +2

      ​@@ResidentWeevil2077 Look, I'm rooting for Intel on their GPU front, but do you *really* want the company that kept 4 cores and 8 threads for almost eight years to take over?

    • @username8644
      @username8644 Год назад

      @@EbonySaints They had zero competition, not even remotely close competition. And to be fair to Intel, the worst thing they ever did was not improve their CPUs enough with each generation for a couple of generations. That's significantly better than all the issues AMD has been having with their CPUs these last few generations. Also, AMD single-handedly killed the workstation lineup with Threadripper by pricing it at absolutely ridiculous levels. I mean, first-gen Threadripper was good value, but then it went through the roof to $4k for a CPU. We used to be fuming at Intel for a $1k CPU. Honestly Intel looks way better than AMD now, and at least their CPUs are rock-solid stable; AMD is pretty touchy.
      Edit: also Intel literally made all the high core count chips first before anyone else so your comment is slightly ironic. I'm referring to 6 cores, 8 cores, 10 cores, 12 cores, 14 cores, 16 cores, 18 cores, 20 cores, and 22 cores. Intel all did it first. They did that during your 8 year 4 core 8 thread time period. And yes they were expensive at the time, but AMD is now high core count king and they priced their stuff 4x higher than Intel ever did. So I don't see how Intel is worse.

    • @firecat6666
      @firecat6666 Год назад +1

      @@ResidentWeevil2077 Nvidia seems to be going for the AI market now, judging from some stuff I saw regarding their most recent financial report. Hardware for that area has much greater overlap with high end/workstation hardware than with gaming hardware, so no wonder their gaming products aren't getting much R&D attention. With all the projections for the growth of the AI market, Nvidia's decision seems pretty solid compared to 3dfx's back then.
      Also, it would probably be bad for the gaming GPU market if Nvidia decided to quit, we'd be back to duopoly again, except we'd then be left with the two people that currently offer the lower quality products.

  • @JonathanSias
    @JonathanSias Год назад +2

    Thank you for retesting this on PCIe 3.0 against a 3060 Ti!
    DLSS should not be compared directly against older systems. Testing should be done with identical settings to show the hardware changes. DLSS 3.0 can be added to results after that, but 1. not all games support it, so it's not always relevant, and 2. DLSS 3.0 introduces latency. It's not high-refresh-rate gameplay, it's just visual. If a gamer is sensitive to input lag, they may never choose to use DLSS.

  • @_mrsahem_
    @_mrsahem_ Год назад +61

    Really appreciate the follow-up video. I actually commented on the original video asking about Gen 3 performance. Totally forgot Gen 3 x8 == Gen 4 x4. Regardless, I think the 4060 Ti is terrible value right now. Hopefully AMD can provide some better alternatives.

    • @notrixamoris3318
      @notrixamoris3318 Год назад +10

      7600 is the same...

    • @syncmonism
      @syncmonism Год назад +3

      In the US, you can get a new 6750 XT for 330 USD (including shipping) right now

    • @notrixamoris3318
      @notrixamoris3318 Год назад +1

      @@syncmonism everyone did a price drop...

    • @tilapiadave3234
      @tilapiadave3234 Год назад

      @@notrixamoris3318 Doesn't look too bad at the new price, 260 of those green American pesos.

    • @astra6640
      @astra6640 Год назад +5

      @@notrixamoris3318 At least AMD's pricing is less whack... the 7600 having the same pitfalls isn't as bad when you factor in that it's lower in the product stack and priced accordingly lower as well. Afaict, it's not *meant* to be a direct competitor to the 4060 Ti, which makes the limited parameters a lot more excusable on AMD's side, since you get a low-end, limited product but you pay fittingly less for it. Whereas Nvidia's offering is kind of criminal...

  • @garbuckle3000
    @garbuckle3000 Год назад +3

    Thank you for testing this! This was exactly what I was wondering about when I heard only x8 for the 4060ti. I know sites did testing when the 6600XT came out with only x8 and even more people still had PCIe 3.0 (I was one of them). I admit I saw a slight increase in performance once I upgraded my system to PCIe 4.

    • @pandemicneetbux2110
      @pandemicneetbux2110 Год назад

      Man I'd be so pissed by that. Mainly because the board is literally the ONLY thing in my rig I don't expect ever to touch in the lifetime of the system period, literally even the PSU I would imagine needing to replace before having to get a different board. That's literally the exact reason why I got the pricey motherboard to begin with, because I didn't want to have stupid problems like these and not have any hassle because it's supposed to be a rock solid x570 that shouldn't even need to be touched until I replace my whole rig by 2030 (God willing, barring terrible unforseen consequence knock on wood).

    • @DanishBashir-sz6vs
      @DanishBashir-sz6vs Год назад

      I am buying a PCIe 3 motherboard and a Ryzen 2600. I know it's very, very old, but it's extremely cheap. And, very oddly, I am buying a 3090 or 4070. I know, I know, bottleneck. But I am going to be playing at 4K or even higher in my VR setup. Tell me how much difference PCIe 3 vs PCIe 4 will make at, for example, 8K?

    • @MsNyara
      @MsNyara Год назад

      @@DanishBashir-sz6vs Nothing; x16 4.0 cards don't bottleneck on 3.0 setups. x8 4.0 cards do bottleneck on 3.0 setups, but how much depends on the CPU and the specific card pair; generally weaker cards bottleneck less, and AMD cards bottleneck less because Infinity Cache does some of the lifting and Nvidia has no equivalent.
      I would heavily advise at least buying a Ryzen 5500, as it costs just $85 and doubles the performance, and it can be used on the same motherboard you had your eye on by just asking the seller to update the BIOS if it hasn't been updated yet.

    • @DanishBashir-sz6vs
      @DanishBashir-sz6vs Год назад

      @@MsNyara 5500 was exactly next in line and you made me reconsider again lol. Okay, I will think about processor once again.

  • @---le7cy
    @---le7cy Год назад +12

    It's a 50-class card, named as a 60 Ti and priced as a 70 ....

    • @ignacio6454
      @ignacio6454 7 месяцев назад

      On 8 lanes. It is garbage.

    • @toututu2993
      @toututu2993 4 месяца назад

      With 2 useless features that make the gaming experience worse

  • @earthtaurus5515
    @earthtaurus5515 Год назад

    Thanks for covering this Roman 👍🏽👍🏽, haven't seen many reviewers mention this at all.

  • @davidbrennan5
    @davidbrennan5 Год назад +8

    Native benchmarks only please, no gimmicks turned on. The new games are going to be very demanding, the requirements for Immortals of Aveum (unreal 5 engine) are very high even at lower resolutions. I wish AMD/Nvidia had spent time on improving the actual Native performance rather than add RT and FSR/DLSS . This is a great channel, Thank You for the hard work.

    • @davidbrennan5
      @davidbrennan5 Год назад +2

      @@mikeycrackson That is part of the problem but new games have very busy scenes and a lot of details that use up memory. They recommend a 12700K/5700X CPU and a 6800xt or 3080ti GPU @1440p 60FPS for medium to high settings for Immortals of Aveum. This game has an area where you are fighting on a moving mech and a battle is going on in the background and you have a hud screen etc...

  • @RAZGR1Z
    @RAZGR1Z Год назад +2

    Thank you for making English videos. I'm sure it's really annoying to do and I appreciate you take the time to make them.

  • @tqrules01
    @tqrules01 Год назад +16

    I like your points; there are just a few things I would like to add:
    1: Most gamers really don't care about power unless the gap is huge ("look at all the Core i9 9900K users").
    2: The 8GB frame buffer is waaaaaay too low.......
    3: Nvidia believes they are better and bigger than the consumer and won't budge on price or volume.

    • @WaterZer0
      @WaterZer0 Год назад

      People in Europe will care more about power than the US for reasons made obvious in this video.

    • @korana6308
      @korana6308 Год назад +3

      Good points.
      1 I completely agree. And I seriously don't get that power efficiency thing. Because it depends on the settings that you play your game at.
      2 This 8GB BS is completely ridiculous, since we already had it on a GTX 1070!!! 8 bleeding years ago. Ngreedia releasing it with 8GB is completely ridiculous; to me 8GB should only be the baseline of gaming, which means only a 4050 should have it and the rest should go up: 4050 8GB, 4060 12GB, 4070 16GB, 4080 20GB, 4090 24GB, and anything outside of that is Ngreedia just screwing us.
      3 I think they eventually will, though not as fast as people think. It will take a few months for them to organize a corporate meeting and finally realize, oh gosh darn it, we screwed up with pricing. They will have to lower the price...
      I saw a perfect comment on one of the 4060 videos, so now I repeat it everywhere: this 4060 is a 4050 at a 4070's price.

    • @badass6300
      @badass6300 Год назад

      80%+ of their income comes from enterprise/server/AI and another 10% or so from laptops, they don't really care much to sell volume on the desktop side, they'd rather have premium products.
      I'm more surprised that AMD are even trying, because last year less than 10% of their revenue came from their GPUs.

  • @joannaatkins822
    @joannaatkins822 Год назад +7

    Thank you for your work, I really appreciate your insights.
    In general I would have liked to see a more in depth comparison between pcie 3.0 and 4.0 on the RTX 4060 ti, as such a big deal was made of that in the recent past with the RX 6500 xt. I'm guessing that without DLSS 3.0 the differences would be noticeable if not profound.
    With so many GPU launches, and with so many changes to specifications and platforms people are getting confused with the slew of tepid offerings from nvidia and amd. I've had so called techsperts recommending the Intel ARC cards to pure gamers recently, which I think is misguiding people at best.
    I've had people wondering why the RX 6700 XT is getting slammed and I've had to help them realise that it's a much newer and less useful product that is being slated.
    With all this in mind, so much is changing that short, clear, concise videos like yours are extremely helpful. Massive in depth reviews are not digestible enough for many people, even enthusiasts, so please keep up your efforts. They are greatly appreciated

    • @pandemicneetbux2110
      @pandemicneetbux2110 Год назад +1

      Anyone who calls himself a "techspert" is probably neither. That's pure 100% marketing bullshittery right there. Also again the only sole reason for mentioning DLSS is for making raytracing playable. You either have a new enough graphics card you can probably run your game natively anyway, or your GPU is old and slow enough to actually need it but also old enough not to have DLSS and therefore you're going to be using FSR anyway. That's for the tiny amount of games which even have it mind you, that's completely irrelevant for literally 99% of my games (I have iirc 7 games tops out of a 700+ library on GOG, Steam, and Epic that even have the option to use FSR/DLSS or RT, none of my other games do).
      So DLSS is really so much of a niche use case for such a small proportion of gamers that I literally see it along the lines of mentioning AMD's drivers for Linux, or nVidia's NVENC encoding and CUDA, yeah I"m sure AMD does run great on Linux and I'm sure your media business is doing well with your rendering and encoding, but it's completely irrelevant for the overwhelming majority of gamers just like DLSS is.

  • @thepatriot6966
    @thepatriot6966 Год назад +10

    I didn't even know the 4060 Ti ran on only 8 PCIe lanes. As you said, no other reviewer spoke of this at all, to my knowledge. What is Nvidia thinking? What a disaster of a card.

    • @WaterZer0
      @WaterZer0 Год назад +3

      They are thinking: $$$

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад

      Cost saving. AMD has been doing it longer than nvidia.

  • @lonelymtbrider3369
    @lonelymtbrider3369 Год назад +10

    My thoughts on FSR, DLSS and frame generation: they should always be included as a separate bar, but always together with the raw performance, so we can see both sides of the coin. I am one of those who gladly use DLSS + FG if it is available, because most of the time it's free performance gains with little to no negatives, but sometimes, in some games, DLSS looks like sh** and it's better without.

    • @amunak_
      @amunak_ Год назад

      You also absolutely need it to be able to judge performance for titles that don't have it.

    • @lonelymtbrider3369
      @lonelymtbrider3369 Год назад +1

      @@amunak_ Yes, most definitely. Nvidia thinks DLSS3 should be compared to other cards running "raw" but that's just wrong. Nevertheless, I do want to see what kind of gains DLSS3+fg gives, but that should be an additional test, because raw performance is super important for the sake of comparison.

  • @pf100andahalf
    @pf100andahalf Год назад +58

    Almost every time I tell people that used 3080s are down to $400 now (I got one for $450 back in October 2022 when prices finally dropped), and that they are a great deal, I get lots of pushback saying, "I won't buy used!" Glad to see there are some sane people around here. The 3080 is a beast, and if you buy used it might be the best deal going right now.

    • @lat78610
      @lat78610 Год назад +15

      Dude, you're lucky; in France people are insane and are basically selling used cards at shop prices.

    • @AndrewTSq
      @AndrewTSq Год назад +7

      I would actually get a new 4070 over a used 3080 any day of the week. More VRAM, more speed, and you get a warranty. That is worth the extra $199 to me. Edit: or, if you don't mind AMD, a 6800 XT new is $529.

    • @main_stream_media_is_a_joke
      @main_stream_media_is_a_joke Год назад +2

      I think people are somewhat scared to open and clean what is quite possibly the most expensive part of the PC.
      Just yesterday I got a used 3060 Ti for $230.
      It also has around 4 years of warranty still left... Zotac model.
      I got lucky, as the guy who sold it to me was moving out and had barely used it for just around a year... The card is in great shape and I plan to open and clean it soon.

    • @yasunakaikumi
      @yasunakaikumi Год назад +3

      The problem with used is that if it was used for mining, you might see a higher rate of the fan bearings dying mid-use... If mining wasn't a thing back when it was released I'd recommend it anytime, but since it's only 10GB of VRAM... it's not that much different from having an 8GB card.

    • @seeibe
      @seeibe Год назад +5

      With European energy prices, getting a used 3080 at $400 vs a new 4070 at $600 is still a tough choice. In gaming the 3080 uses a solid 100W more than the 4070 Ti. If you game 2 hours a day for 5 years at 0.45€/kWh (prices will only go up from here), that comes down to 164€ (quick arithmetic check below), which puts the costs of the cards more in line with each other. And that's not accounting for people like me, who also use their PC for work 8 hours a day, so idle/low-usage power consumption matters a lot. In conclusion, don't pay more than 400€ for a used 3080 if you live in Europe lol
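
      A quick check of the arithmetic above, using the comment's own assumptions (100 W extra, 2 h/day, 5 years, 0.45 EUR/kWh):

extra_kw      = 0.100    # extra draw of the used 3080, per the comment above
hours_per_day = 2
years         = 5
eur_per_kwh   = 0.45

extra_kwh = extra_kw * hours_per_day * 365 * years
print(f"{extra_kwh:.0f} kWh extra -> ~{extra_kwh * eur_per_kwh:.0f} EUR over {years} years")
# 365 kWh -> ~164 EUR, matching the figure quoted above.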

  • @robertlawrence9000
    @robertlawrence9000 Год назад +18

    It's good to compare the cards primarily without upscaling or frame generation and then show it when using it for comparison. Good to see as much as possible of what the cards are capable of.

    • @SirDragonClaw
      @SirDragonClaw Год назад

      I disagree; at this point DLSS 2.x and 3.x have gotten good enough that you should almost always use them when available, and when set to quality mode, with or without frame generation, I (and most people) would argue that more times than not you are getting better-than-native or equivalent-to-native rendering quality perceptually, which is what really matters.
      So I say include them as the new default (labeled though) for nvidia cards, and when AMD and Intel get theirs up to a similar quality where it looks as good or better then start including them by default.
      It really reminds me of years ago when the rendering quality differed between cards and brands in the early 2000's. You would adjust them to have a similar visual quality when benchmarking to get the most out of the cards while having a fair comparison.

    • @robertlawrence9000
      @robertlawrence9000 Год назад

      @@SirDragonClaw Did you understand what I said and why I said it? Not all games have upscaling technology, and it's good to still see results without it for comparison, and then with it on.

  • @paulc0102
    @paulc0102 Год назад +34

    I totally get why GPU benchmarks use the fastest CPU and memory to mitigate the influence of the platform on the comparison, but there's also a place for demonstrating performance on older platforms - if reviewers are really trying to allow viewers to make an informed choice, then to me they are currently failing. Credit to you for attention to detail :)

    • @-opus
      @-opus Год назад +2

      But how many different setups do you expect YouTubers to benchmark on?

    • @paulc0102
      @paulc0102 Год назад +3

      @@-opus That is an issue. Relative performance (with a fixed high-end platform) is great to show (generational) improvements (or lack of) etc. But this example is a case in point (and I think Jay's video is an extreme illustration) that not considering other factors can skew the result.
      I would suggest that there are a lot of folks happy with their "older" intel systems and AM4 kit that might want to upgrade their GPU - so testing at least a couple of the most popular configurations wouldn't hurt - particularly where there is a fundamental change between current and last gen e.g. PCIE3 to PCIE4. Most of us upgrade incrementally (I certainly do). Nobody running a 13900KS on a $1,000 motherboard is going to be gaming with an RTX4060 though anyway......

    • @-opus
      @-opus Год назад +1

      @@paulc0102 I agree that it makes sense for significant changes, I just don't see youtubers being keen on doubling their workload and complicating charts even further.

    • @mryellow6918
      @mryellow6918 Год назад

      Well, then just watch a CPU review; a CPU can only give you so many frames.

    • @99mage99
      @99mage99 Год назад

      @@paulc0102 I 100% agree that benchmarks for older systems are super great but even adding one older platform can drastically increase the workload and production time for a video so I get why they don't when they already have such a limited time to benchmark the samples they get. It'd be great if AMD, Intel, Nvidia etc would send the samples out sooner to give reviewers more time to test more configurations.

  • @MahbuburRahman-uc7np
    @MahbuburRahman-uc7np Год назад +2

    Excellent review. Just one suggestion: for the TDP cost-saving calculations I think it's better to measure the total system power draw rather than just the GPU power draw,
    because you cannot run your GPU without the rest of the system. Doing so, the gap will shrink and the result will reflect real-world use better.

  • @totojejedinecnynick
    @totojejedinecnynick Год назад +4

    Roman, just a quick note on consumption calculations - you should consider the additional cooling costs associated with a card. You may need to ramp up case fans a bit to evacuate the extra heat, and you may have to run your air conditioning harder in the summer... Of course it won't make much difference in this case, but with some high-end 600W cards - well, you have to consider it then (rough numbers below). Remember that old datacenters had a rule of thumb that high-density servers could eat ~20-30% of total system power just to run their chassis fans, not to mention datacenter cooling overall.
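
    A back-of-the-envelope for the "you pay for the heat twice" point; the fan overhead and air-conditioner COP below are illustrative assumptions, not measured values:

gpu_delta_w  = 150    # extra heat a hungrier card dumps into the room (example figure)
fan_overhead = 0.20   # extra case/chassis fan power to move that heat out
ac_cop       = 3.0    # watts of heat removed per watt of air-conditioner electricity

effective_w = gpu_delta_w * (1 + fan_overhead) + gpu_delta_w / ac_cop
print(f"~{effective_w:.0f} W effective for {gpu_delta_w} W of extra GPU draw in summer")
# ~230 W: the card's own delta, plus fans, plus the AC working to remove the heat.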

    • @coppercore6287
      @coppercore6287 Год назад +1

      I agree with this, it does add a good deal of complexity to consider, but looking at a worst case during the summer months of having 150-200W or even more heat dumped into your house could be something.

    • @klopferator
      @klopferator Год назад

      This is hard to quantify in Germany because most houses and apartments here don't have AC anyway.

    • @Lishtenbird
      @Lishtenbird Год назад

      To be fair, you could get the reverse effect in winter, not having to run a heater as high.

    • @PainterVierax
      @PainterVierax Год назад +1

      @@Lishtenbird with this electricity price, it could be better to massively invest in heatpumps rather than resistive heaters.

    • @PainterVierax
      @PainterVierax Год назад +1

      Another point is to not forget that older cards also have higher power consumption under light loads than newer ones. That dozen or so watts can be quite a difference when the computer is used for more than 4 hr/day gaming sessions and is sometimes barely ever powered off.

  • @Astfgl
    @Astfgl Год назад +1

    In my opinion, DLSS 2 and FSR 2 are entirely fair to include as a separate benchmarking category. Because they're legit methods to handle rendering and anti-aliasing in a different way, such that expensive techniques like ray tracing and outputting at high resolutions become more viable in practice. They genuinely offer real performance improvements with satisfying image quality.
    DLSS 3 however is where the waters become very muddled. It's not a technique that improves performance, no matter how much Nvidia would like us to believe that it is. It's a smoothing filter. It's like enabling Sharpening in the Nvidia control panel and saying that the resolution has improved. It's just not true. It can make games appear smoother but it does not make them run better or improve their responsiveness at all.
    Nvidia banking on DLSS 3 to make the 4060 Ti look better than it is and to mask the fact that it's really just an expensive 4050, frankly it's disgusting and tech reviewers are doing well not to get carried away by the misleading marketing.

  • @uncrunch398
    @uncrunch398 Год назад +5

    Difference in power consumption could also be a difference in power paid for cooling your room if you're hot a significant portion of the year. I already want to run a heat transfer system using pumped liquid so I can run my hot CPU all day when the ambient is already uncomfortably warm. I just can't afford it yet.

    • @palmariusbjrnstad1682
      @palmariusbjrnstad1682 Год назад +2

      Indeed, this is also big in data centres. You pay for the power consumption twice due to cooling. It can work the other way too, though. In Norway, most of the year, the PC power consumption will be exactly balanced by needing less heating. It's very common to have resistive electric heaters and then the (marginal) cost of PC power usage is zero (though, more efficient heating options like heat pumps are becoming widespread). [Edit: and in the summer the electricity tends to be cheap here, due to large solar production on the continent]

    • @PainterVierax
      @PainterVierax Год назад

      @@palmariusbjrnstad1682 Yeah heatpumps can mitigate the heating cost. Though it doesn't work well under zero Celsius so might be less incentive to invest on it in your latitude than in southern Europe (especially units with reverse heatpump option to replace an existing HVAC)

  • @CorvoPT
    @CorvoPT Год назад

    Nice explanation! Congrats!... cheers from Portugal!

  • @Verpal
    @Verpal Год назад +23

    $400 is a price point that I would expect budget gamers to be paying, and many of them would be on PCIe 3.0.
    NVIDIA truly did gamers dirty this time; good that sales volume seems to be abysmal, people spoke with their wallets.

    • @subaction
      @subaction Год назад +8

      The problem is that a lot of people "speak with their wallet" by buying a higher end nvidia card instead, rewarding their gambit.

    • @toututu2993
      @toututu2993 4 месяца назад

      I bought an RX 6600 for around $220. Such a great GPU, without the two useless features Nvidia has been scamming you with.

  • @maxview99
    @maxview99 Год назад +1

    Excellent video, I agree power consumption is an important metric.

  • @tinfxc
    @tinfxc Год назад +4

    Nice review! But hey, it would be nice if we could also see how it performs on a PCIe 4.0 system, to see whether it's worth jumping to it or better to stay on PCIe 3.0 a bit longer. Thanks!

  • @nukedathlonman
    @nukedathlonman Год назад

    BIG thank you - I'm glad someone looked into this as it is very helpful information to have when people are looking at older systems for upgrades.

  • @KookyBone
    @KookyBone Год назад +3

    Great, I had just been wondering about these two topics: how the 8 lanes might affect performance, and how much the power savings could help. But as a bargain hunter, I keep an eye out for used GPUs here in Germany and often find the RTX 3080 (10GB) for about 410-430€... so that is even cheaper. Really great point, too: in the past Nvidia nearly always matched the used price of their 80 series with the 60 (later Ti) series, but now they don't seem to care about this anymore... and with a market full of used 3080s, that doesn't make sense. Really great video, keep making them.

  • @tenkowal
    @tenkowal Год назад +1

    I think the x8 4.0 lanes topic was extensively tested when the RX 6600 (XT) launched. It turned out that its performance loss on 3.0 vs 4.0 was not that bad, in many games below 2% (not noticeable outside of benchmarks). I expect the 4060 Ti is similar, but there are other things (memory bus width) that are killing the product. Please test strictly 3.0 vs 4.0 performance on the 4060 Ti; I expect the story to be similar to the RX 6600 (XT).
    I agree that the power consumption of this card is impressive. If only the price had also been cut down to the old 60-class standard.

    • @PainterVierax
      @PainterVierax Год назад

      Not sure about that hypothesis. There are other possible explanations, such as the games being more demanding, or the card simply being faster than the 6600 and therefore using more bandwidth to reach higher fps. AMD and Nvidia also differ in how their firmware/drivers allocate VRAM.

  • @jonathanellis6097
    @jonathanellis6097 Год назад +7

    I was surprised by the lack of coverage about the PCIe lanes. Not that long ago AMD got lambasted for doing the same thing, so I was surprised at the lack of testing on an older gen 3 board for the 4060 Ti, as I would imagine a lot of people with an older system would be looking at this card as an easy drop-in replacement for an aging video card.

    • @ABaumstumpf
      @ABaumstumpf Год назад +2

      "Not that long ago AMD got lambasted for doing the same thing"
      They were rightfully called out, as it was only 4(!) lanes.

    • @jonathanellis6097
      @jonathanellis6097 Год назад

      @@ABaumstumpf Yeah, AMD should have been, and were, called out, and it received lots of negative press. The odd thing is Nvidia seems to have been given a pass on the issue of PCIe lanes. Nvidia should have been called out for it as well.

    • @ABaumstumpf
      @ABaumstumpf Год назад

      @@jonathanellis6097 Again - twice the lanes, and as he has shown, even with PCIe 3.0 it's not a real problem - contrary to AMD.

  • @ibangladeshi1161
    @ibangladeshi1161 Год назад

    thanks for this bro, was looking for this, subscribed

  • @daviddesrosiers1946
    @daviddesrosiers1946 Год назад +5

    I've noped out on team red and green. Waiting on my Acer Predator Bifrost A770 OC. I'll take my chances with the guys who aren't trying to sell me an 8gb 128bit bus card for an outrageous price.

  • @Kingramze
    @Kingramze Год назад +1

    Thank you for this video! Most systems 3+ years old have PCIe 3.0 boards, and those are the same customers who would consider a 4060-class card for an upgrade. Knowing the bandwidth is limited makes a big difference. It makes the card even worse value for them, and I'd definitely recommend cards with the full 16 lanes for such a system, just so they get the most out of whatever card they choose. Very few cards are limited by PCIe 3.0 x16 - and certainly only the high-end ones.
    As for power efficiency, I think the reason it's not discussed in general is because the USA market doesn't find it a priority. I personally don't care because electricity is so cheap, it's not a factor. However, I DO care about noise and heat dissipation which are related to efficiency. If a card can use less power and therefore need less of a noisy cooling solution, then I'm very interested.

  • @RylTheValstrax
    @RylTheValstrax Год назад +3

    I agree - DLSS/FSR are not native raster performance, so they should not be treated as native raster performance. It's like the whole thing back in the 90s and 00s where companies were cheating by reducing graphical quality in benchmark programs, except now it's being marketed as a feature you can use in games instead of a cheat baked into the drivers to be used in benchmarks. It is useful, don't get me wrong, but it's not properly comparable or objectively like-for-like.

    • @Melsharpe95
      @Melsharpe95 Год назад

      AMD are working on their own version of DLSS 3.0 just as they did with DLSS 1.0 and 2.0.
      Current cards are all tested with DLSS/XESS/FSR on.
      It's a technology that helps, so why not test it and report on it?

  • @sortofsmarter
    @sortofsmarter Год назад

    Your work and the quality of your information are great; I have never understood why you don't have 3x more subs. Your content is worth it, and then purchasing test components would be less of an impact on the channel's bottom line. Keep it up, and thanks.

  • @master21mark
    @master21mark Год назад +3

    Thanks for the extensive review. It's great that you also factor in the power consumption cost. My thought on the bandwidth limitation - would overclocking the PCIe bus help with this? I know it's possible to cause instability this way, but I'd be interested to see the impact. It could also make a nice overclocking video, as there is a lot less content on that topic.

  • @Arcona
    @Arcona Год назад +1

    Given the title of the video, the charts should have shown the 4060 Ti in both 4.0 and 3.0 slots, instead of comparing it to the 3060 Ti each time.

  • @CogxGaming3D
    @CogxGaming3D 11 месяцев назад +3

    So, can I install an RTX 4060 Ti on a PCIe 3.0 motherboard?

  • @czdragon7473
    @czdragon7473 Год назад +1

    Nvidia's big-brain strategy: make the 4060 Ti so shit that people actually buy the 3060 Ti instead, which isn't that much less expensive but is much cheaper to produce 🤯

  • @Ghozer
    @Ghozer Год назад +6

    I'm glad you did the power savings... more channels should do this, and it's something that's been on my mind recently, especially with the energy crisis atm.....
    "The TRUE cost of gaming" - the cost of the electricity :D

  • @philmccracken2012
    @philmccracken2012 Год назад

    Also, I wanted to add I absolutely love your videos! Thank you for what you do and how you do it!

  • @pandabuttonftw745
    @pandabuttonftw745 Год назад +4

    DLSS 3 isn't the selling point Nvidia thinks it is. Fake frames don't mean squat, since they introduce artifacting (more than "normal" DLSS) and the input latency is not improved at all.

    • @subaction
      @subaction Год назад

      Trillion dollar company, folks.

  • @TimLongson
    @TimLongson Год назад +1

    Nvidia & AMD need to learn what gamers require from the 3 GPU categories: 1. High-end = smooth performance at 4K ultra settings (without needing fake frame generation/emulation) in ALL current games.
    2. Mid-range = smooth performance at 1440p ultra settings (without needing fake frame generation/emulation) in ALL current games.
    3. Low-end = anything that can't at least deliver playable 1080p ultra in ALL current games (without needing fake frame generation/emulation) sits at the very bottom/entry level, because even managing 1080p without managing the same at 1440p is STILL only a low-end GPU.
    Technology moves forward; what WAS cutting edge 10 years ago is now outdated rubbish, so as new games come out, if a GPU can't meet the requirements of its current category, it moves DOWN a category - even current high-end GPUs will become low-end over time.
    By this clear and logical definition, any current GPU with 8GB of VRAM or less is automatically low-end. Artificially high prices and model numbers don't change that, and AI emulation/frame generation is not a true substitute for raw performance, so GPU prices should reflect this.
    You could potentially have a better graphical gaming experience on a current games console than on a low-end GPU, so why would anyone pay as much for JUST a GPU as for an entire games console or portable gaming system?
    Moore's Law states that "speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain." Even if we HALVE that, there should be AT LEAST a 50% raw performance increase for EVERY model PER generation released every 2 years, and if they can't deliver that, they can't claim it's a generational upgrade of the same cards.
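    To put rough numbers on that yardstick, here is a toy Python sketch of the compounding it implies (purely illustrative: Moore's Law strictly describes transistor counts, and the 50% figure is just the assumption used in the comment above, not a measured generational gain):

    ```python
    # Toy calculation: cumulative speed-up after n two-year "generations"
    # under two assumed per-generation gains (100% = doubling, 50% = the
    # halved figure used in the comment above). Illustration only.

    def compound(per_gen_gain: float, generations: int) -> float:
        """Relative performance after the given number of generations."""
        return (1 + per_gen_gain) ** generations

    for gens in range(1, 5):
        doubling = compound(1.0, gens)
        halved = compound(0.5, gens)
        print(f"after {gens} gen(s): x{doubling:.1f} if doubling, x{halved:.2f} at +50% per gen")
    ```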

    • @TimLongson
      @TimLongson Год назад

      PS: To answer your question, I agree with you: NO, never include AI gimmicks (DLSS or otherwise) in benchmarking; frame generation is ONLY for SMOOTHING, so the extra FILLER frames should not be counted as performance FPS because it's NOT extra performance. Likewise, AI upscaling results in substandard graphics, so it's not a like-for-like comparison with other GPUs. Only native performance should be included when benchmarking GPUs, especially as many games aren't designed to use these features.

  • @Biker_Gremling
    @Biker_Gremling Год назад +38

    This was going to be the RTX 4050 Ti, but due to the 4080 cancellation reshuffle, low-end cards were pushed up a tier. Nvidia apparently decided not to stick to the original names and prices, and now we have the worst GPU line-up in history.

    • @lewisgrey6010
      @lewisgrey6010 Год назад +9

      Theoretically the unlaunch of the 4080 12gb has pushed the rest of the lineup down a tier in relation to their original plan, not up. (Current 4070 would’ve been 4070ti, 4060ti would’ve then been 4070 etc).

    • @korana6308
      @korana6308 Год назад +4

      I mean, it could all be fixed with pricing; then it would be a normal line-up.
      4060 Ti (16Gb) fair price $300
      4070 Ti fair price $600
      4080 fair price $800
      4090 fair price $1000

    • @jaymorrow8058
      @jaymorrow8058 Год назад +1

      In my opinion this card would have done fine if it had been named RTX 4060 and sold for $300.
      If it drops that much (100 $/€) in price (I wouldn't hold my breath), Radeon is screwed.

    • @XenonG
      @XenonG Год назад

      No, NVIDIA pushed the naming up a chip tier, and it was all done many months ahead of time with press releases showing the specs, plus leaks. People looked at the "4080" "12GB" specs and it just looked like it was actually supposed to be a tier lower, given the compute core count and memory bus width. Backlash pushed it back down to where it was supposed to be, but the pricing is still too much. You can also tell by looking at the chip codenames; NVIDIA could have named those up a tier too, but the audacity to be honest there is quite admirable. My response is the same as a certain kernel developer's: "FUCK YOU NVIDIA".

    • @gutterg0d
      @gutterg0d Год назад

      It's not worse, just more expensive.

  • @Rachit0904
    @Rachit0904 Год назад +1

    That’s the GDDR6X 3060 Ti. It uses more power and is a whisker faster than the original 3060 Ti.
    I think you should also test at 1440p Ultra, including games that are very VRAM demanding like Last of Us, Doom Eternal, even Fortnite with RT. When running out of VRAM, PCIe bandwidth becomes more important. Hardware Unboxed showed this with the 6600 XT and 6500 XT.

  • @daviddesrosiers1946
    @daviddesrosiers1946 Год назад +6

    This gen of cards should have the market chasing Nvidia's execs down the street with pitchforks.

    • @subaction
      @subaction Год назад

      Not at all. It's good for the stock value to sell overpriced cards that will feel obsolete in one or two years of use. The execs are doing what they are required to do: make the most money for the company. They just happen to have noticed that the best way to do that is to exploit consumer stupidity.

  • @eldibs
    @eldibs Год назад +1

    My opinion is that DLSS/FSR/XeSS should always be off for reviews. I want to know the actual processing power of the GPU in a like-for-like comparison, because not every game people play supports upscaling, and the ones that do don't always support all three.

  • @reviii-s4t
    @reviii-s4t Год назад +3

    Can you compare an undervolted + overclocked 3060 Ti vs a stock 4060 Ti? It would be interesting to see if you can achieve the same power draw and the same performance as the 4060 Ti at no extra cost.

    • @27Zangle
      @27Zangle Год назад +1

      Interesting thought!

  • @melbornefischer4493
    @melbornefischer4493 Год назад

    Thank you for showing the power consumption. :) I watched the other YT reviews and was wondering: if they match last gen's performance, what's the power consumption then? Nobody showed enough, so thank you.

  • @planetrift
    @planetrift Год назад +26

    DLSS 3 adds latency. There are simply no two ways about it. Those extra frames sit outside whatever you use to control the character in game. "Frame smoothing" is a good term. But I tend not to use DLSS, because it is destructive to the game's environment. And if I start looking for artifacts in a game, I'm focusing on the wrong thing. I'd rather lose performance if it means the game will look better. And the prices... Nvidia is living in cryptostonker la-la land.

    • @dracer35
      @dracer35 Год назад +1

      I agree with you 100%

    • @retrowrath9374
      @retrowrath9374 Год назад +2

      Why complain about choice, though? Frame generation adds latency, not DLSS, and it's only really relevant in competitive play; it's just not an issue at 120 Hz and upwards. It doesn't ruin your game experience in any sense of the word. Gamers are being picky based off what tech YouTubers say, because they pick up on it more once it's pointed out. If people aren't happy with DLSS, then go back to native with crappy TAA.

    • @dracer35
      @dracer35 Год назад +4

      @@retrowrath9374 I don't mind having the choice, but in my experience, for the games I play, DLSS 2 can add artifacts and reduce image quality, so I'd rather play at native resolution, either 4K or 1440p. DLSS 3 adds latency and artifacts, so I find it useless. It's perfectly fine if you find it useful, but there are those of us who see it more as a marketing gimmick than a useful feature.

    • @xxovereyexx5019
      @xxovereyexx5019 Год назад +1

      omg these comments, you are not experts, so stfu...

    • @dracer35
      @dracer35 Год назад

      @@retrowrath9374 I Never said anything about TAA. Sounds like you are the only one whining here lol. Let's just agree to disagree. Have a good day sir or mam. 👍

  • @ViseOW
    @ViseOW 9 месяцев назад +2

    Hey, I'm sure this is a dumb question but I can't figure this out myself.
    Would I be able to just purchase a 4060 and use it, or would I need to upgrade my PSU and/or motherboard as well?
    I currently have a 2060 Super with a 500W PSU, running on an ASRock B450M/ac board. I want to get an RTX 4060, but some places I look say that 500W is fine and some say it's not. I also understand that my motherboard is PCIe 3.0.

    • @juan180p
      @juan180p 3 месяца назад

      600W to 750W is what they recommended to me, and yes, you can use the 4060, you just wouldn't get the most out of it.

  • @nickw6068
    @nickw6068 Год назад +9

    On the subject of energy efficiency, it would be good to hear about idle power. I game for several hours a few times a week, but I use the PC for 10-12 hours a day, 4 or 5 days a week.

    • @ryutenmen
      @ryutenmen Год назад

      7W idle, 13W on video playback.
      It's all on Nvidia's page.

  • @ferreiram32
    @ferreiram32 Год назад

    Lucas Peperaio, from Brazil, was the first reviewer to talk about and show PCIe 3.0 performance of the RTX 4060 Ti.

  • @mikekleiner3741
    @mikekleiner3741 Год назад +4

    Whenever I am able to afford a GPU that makes sense, I am not buying Nvidia. They don't care about gamers, just miners and now AI deep learning. 2017 was so long ago for my 1080 Ti, and yet there is still no reason to buy a GPU unless you are rich.

  • @elu5ive
    @elu5ive Год назад +1

    EXTREMELY important video.
    Anything faster than an RX 6600 needs more than x8 PCIe lanes on gen 3.
    ANYONE who still has a PCIe 3.0 motherboard: the 7600 and 4060 Ti are a TERRIBLE option.

  • @peterarnfield7300
    @peterarnfield7300 Год назад +6

    I became aware of the PCIe 4.0 x8 issue when I had to replace a dead GPU on a B450 motherboard a few months ago - not impressed by the cost cutting and the designed-in bottleneck for those with older systems. One expects maybe a new PSU, possibly a CPU, to get the best out of a new GPU, but not a complete platform change.

    • @taylorshin
      @taylorshin Год назад

      I would rather get a used 30-series card that has the full lanes.
      They are even talking about PCIe gen 5 these days. Be prepared to leapfrog a generation.
      Yet, without any new AAA games good enough to play... is there any reason to even spend a dime on a gaming PC?

  • @AthanImmortal
    @AthanImmortal Год назад +1

    I understand why you chose the CPU you did, but I wish there were also a more common PCIe 3.0 CPU like an Intel 8600K or Ryzen 2600X in the charts, as I think those are the people more likely to be eyeing this card, and they won't have the CPU headroom to smooth out the 1% lows.

    • @joefowble
      @joefowble Год назад

      To show the GPU performance as purely as possible, it's generally best to use the most powerful CPU available so there isn't a CPU limitation. The 10-core 10850K is a great choice for that.

  • @an7on413
    @an7on413 Год назад +7

    How do these cards perform with and without SAM/resizable bar? I feel like the low bandwidth of these could make these features essential.

  • @RadarLeon
    @RadarLeon Год назад +1

    Quick question based on this video. Two choices, say both are $500: A) card one is 300 watts and gets 20% more frames than card two; B) card two is 150 watts, cannot be overclocked, and hits 60 fps at max settings. Which do you buy, card A or card B?
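    One way to frame that choice is to fold the power draw into a rough cost of ownership; a small Python sketch below does this with an assumed electricity price, usage pattern and ownership period (pick your own numbers):

    ```python
    # Rough cost-of-ownership comparison for the two hypothetical $500 cards above.
    # Electricity rate, daily hours and years of use are assumptions.

    PRICE = 500.0           # $ purchase price of either card
    RATE = 0.30             # $/kWh (use your local rate)
    HOURS_PER_DAY = 3
    YEARS = 3

    def cost_of_ownership(watts: float) -> float:
        """Purchase price plus electricity over the assumed usage period."""
        kwh = watts / 1000 * HOURS_PER_DAY * 365 * YEARS
        return PRICE + kwh * RATE

    card_a = cost_of_ownership(300)   # ~20% faster
    card_b = cost_of_ownership(150)
    print(f"Card A: ${card_a:.0f}  Card B: ${card_b:.0f}  difference: ${card_a - card_b:.0f}")
    ```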

  • @notwhatitwasbefore
    @notwhatitwasbefore Год назад +4

    THANK YOU! I was wondering if the techtuber crowd had all lost the plot a little. Many pointed out that last gen's 6500 XT, with its x4 PCIe 4.0 interface, would be an issue for people on older systems, who would likely get much lower performance than those on the latest platform, but Nvidia seemed to get a pass on a significantly more expensive product having a similar issue.
    Would have been nice to see the 4060 Ti on a PCIe 3.0 system compared to a 4.0 system, as I feel that's the only way to really see if it makes any real-world difference. Honestly, it's the only testing I actually want to see on this card: if it makes little to no difference in most games, then Nvidia's claims about improvements to cache and memory performance would seemingly be proven, but if it shows a noticeable drop in fps on the older PCIe standard, then that's probably the single most important piece of consumer advice about this product, and no one has done that test as far as I can see.
    Again, there was lots of that testing for the AMD card on day one, and that was a card at half the price of this one that was probably of little interest to most gamers due to being the RX 580 again, but the 4060 Ti is much higher profile and will be looked at as an upgrade path by many people, despite what I would assume is a potentially huge bottleneck for those users. Shame no one picked up on it, but Nvidia always gets a pass from the press; this is just the latest example. Once again, thank you and well done for at least bringing it up.

  • @miksim494
    @miksim494 Месяц назад

    I really appreciate that you put such an emphasis on power consumption, because electricity is expensive nowadays, in Greece at least.

  • @DragonOfTheMortalKombat
    @DragonOfTheMortalKombat Год назад +4

    This is Nvidia's 6500XT.

  • @Xzavn
    @Xzavn Год назад +2

    I agree with you and HWU on the DLSS issue. DLSS 3 will basically only be available in very new titles, while older ones will only offer a DLSS generation that is available to both cards, so including it gives only a very narrow view. Additionally, the 3060 Ti will eventually get access to FSR 3.0, which will vastly diminish any DLSS 3.0 benefit.
    Then there is the fact that frame generation is not bound to DLSS 3: Igor's Lab did a test on a 4090 comparing DLSS 3.0, FSR 2.1 and XeSS each combined with frame generation, and FSR 2.1 was on par with or beating DLSS 3 there. So with FSR 3.0 possibly offering frame generation too, and the 3060 Ti being able to use that, the performance benefit of DLSS 3 would completely vanish, aside from maybe slightly better image quality.
    And then there is the issue of frame generation not delivering any "true" additional fps. If you enable frame generation when your GPU is only rendering low FPS, you might get smoother-looking FPS, but the game will still feel off, as there isn't any more information generated than without frame generation, and your input delay can get pretty high.

    • @Lion117YT
      @Lion117YT Год назад

      According to rumors, FSR 3 will be tied to the driver, so it may only be compatible with Radeon 7000.

  • @shyawkward
    @shyawkward Год назад +4

    The 4060 Ti will age far worse than the 3060 Ti due to its bandwidth limits, both PCIe and memory bus width. A good example is the RX 580, which can still play modern games on respectable settings, even at 1440p, but ONLY if you have the 8GB version.

    • @caribbaviator7058
      @caribbaviator7058 Год назад

      That's been their plan all along: buy the 4060 Ti now and upgrade to the 5060 Ti when it's out.
      The 2060 Super was my last midrange card from Nvidia.

  • @AncientGameplays
    @AncientGameplays Год назад

    Thanks for the tests, but you tested 4 games... and you didn't even test the 4060 Ti on PCIe 3 vs PCIe 4 as well...

  • @pickle-o1151
    @pickle-o1151 Год назад +4

    PCIe 5.0 cards when? I am ready with a motherboard that will never use those sweet 5.0 GPU lanes

    • @IscAst4
      @IscAst4 Год назад +3

      x2 😢

    • @swizzler8053
      @swizzler8053 Год назад

      Maybe next year.

    • @einarcgulbrandsen7177
      @einarcgulbrandsen7177 Год назад

      Currently PCIe 4 is more than enough. It's gonna be a few years before any GPU needs PCIe 5. But it's good for marketing specs...

    • @kimnice
      @kimnice Год назад

      Well, the RTX 4090 shows barely any bottlenecking when limited to PCIe 2.0 x16 bandwidth (an RTX 3080 running at AGP bus speeds from 2003 is still faster than an RTX 3070 with PCIe 4.0 x16). It will take a few generations before high-end cards need 4.0. Maybe next year these cards with insufficient VRAM can utilize PCIe 5.0 - assuming they also cripple the PCIe connector.

  • @rollerr
    @rollerr Год назад +1

    Frame generation has no business being compared directly to the raw fps of competing cards. Digital Foundry recently put frame generation data on the same graphs as raw AMD performance data, which to me is totally misleading.

  • @SovietGrazz
    @SovietGrazz Год назад +9

    You should NEVER include frame gen DLSS mode in benchmarks. Screw that.

    • @subaction
      @subaction Год назад

      They don't pay the reviewers to be honest. They pay them to sell the cards.

  • @alisioardiona727
    @alisioardiona727 Год назад +1

    DLSS shouldn't be the norm in benchmarking because not that many people use it and it's not apples to apples compared to other cards.

  • @mingyi456
    @mingyi456 Год назад +5

    I wanted to see the 4060 ti benchmarked at pcie 4.0 vs pcie 3.0, it was disappointing to see that the only comparison was done with the 3060 ti, both at pcie 3.0. The video title actually seems to imply it will contain a comparison that was shown nowhere in the video.

    • @Pulver
      @Pulver Год назад

      YES

    • @taylorshin
      @taylorshin Год назад

      The video clearly shows no significant performance difference due to the PCIe generation. The comparison on the same 4060 Ti was already made in a previous video, and PCIe 4.0 x4 is technically the same bandwidth as PCIe 3.0 x8.
      But then again, there was also no significant improvement in raw power over the previous-generation 3060 Ti, except in those handful of DLSS 3 capable games.
      And yet pricing went up, while power consumption was reduced, but not by enough to be meaningful in many countries where electricity prices are not too high.
      So what is the point of even talking about the 4060 Ti? Just ignore it, like the Japanese market has, until the price returns to its proper level: a 4050 Ti price.
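      For anyone checking the lane math here, a minimal Python sketch (nominal link rates only, ignoring protocol overhead) of why PCIe 4.0 x4 matches PCIe 3.0 x8, and why an x8 card loses half its bandwidth in a PCIe 3.0 slot:

      ```python
      # Nominal one-way PCIe bandwidth: transfer rate per lane times line-coding
      # efficiency, converted from Gbit/s to GB/s. Real-world throughput is lower.

      GT_PER_S = {2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}                     # GT/s per lane
      ENCODING = {2: 8 / 10, 3: 128 / 130, 4: 128 / 130, 5: 128 / 130}  # coding efficiency

      def bandwidth_gb_s(gen: int, lanes: int) -> float:
          """Nominal one-direction bandwidth in GB/s for a PCIe generation and lane count."""
          return GT_PER_S[gen] * ENCODING[gen] * lanes / 8

      for gen, lanes in [(4, 8), (3, 16), (4, 4), (3, 8)]:
          print(f"PCIe {gen}.0 x{lanes}: ~{bandwidth_gb_s(gen, lanes):.1f} GB/s")

      # PCIe 4.0 x8 and PCIe 3.0 x16 both land at ~15.8 GB/s; an x8 card dropped
      # into a PCIe 3.0 slot is cut to ~7.9 GB/s, the same as PCIe 4.0 x4.
      ```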

  • @dangingerich2559
    @dangingerich2559 Год назад

    US power costs are far more complex than just the kWh rate. In many areas, including my current area, there are additional charges that are also billed per kWh. In my area, the northern Denver suburbs, there is Xcel Energy, which charges an "electrical commodity adjustment," presumably to pay for maintaining the grid in the area. While our rate is officially $0.11/kWh, the "electrical commodity adjustment" more than doubles the bill, making our ACTUAL electrical rate more like $0.26/kWh. That adjustment rate is larger in older parts of the city, where it costs more to maintain the infrastructure. There are also areas, like the southeast suburbs, where that charge is simply not present. I'm in the middle of moving to that side of town now, and the electrical rate is $0.14 or $0.13/kWh, but with no "adjustment" it comes out significantly cheaper than the northern suburbs. (All the while, Xcel Energy is making record profits: $3.17 per share in earnings in 2022.) Plus, in the summer, for residential users, if the household uses more than a certain amount of power, they double the electrical rates, supposedly to discourage high energy use. (More like profit for Xcel without having to do anything to earn it.) This makes it even more unpredictable to estimate what it costs to run a given card over a year.
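    Turning that into a yearly figure is simple enough once the per-kWh riders are folded into an effective rate; here is a minimal Python sketch with made-up numbers (the wattage difference, daily hours and adjustment rate are assumptions, not measurements from the video):

    ```python
    # Rough yearly cost of a GPU power-draw difference, using an "effective" rate
    # that folds per-kWh riders into the nominal price. All figures are examples.

    base_rate = 0.11        # $/kWh, nominal utility rate
    adjustment = 0.15       # $/kWh, per-kWh riders such as a commodity adjustment (assumed)
    effective_rate = base_rate + adjustment   # what actually gets billed per kWh

    watt_difference = 40    # W saved by the more efficient card (assumed)
    hours_per_day = 3       # average daily gaming hours (assumed)

    kwh_per_year = watt_difference / 1000 * hours_per_day * 365
    print(f"~{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * effective_rate:.2f}/year saved")
    ```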

  • @addnan6494
    @addnan6494 Год назад +5

    There are so many Intel CPUs on PCIe 3.0 that don't need to be upgraded. Even going back to the 8700K, it's still a solid CPU, and Nvidia shipping this card with 8 lanes is ridiculous. It would have been a great match for that CPU (assuming it was cheaper).

    • @ABaumstumpf
      @ABaumstumpf Год назад +1

      But as he has shown - the difference is minor in most cases.

    • @addnan6494
      @addnan6494 Год назад

      @@ABaumstumpf The difference is minor because of the bandwidth limitation. The gap between the two cards is wider with more bandwidth.

    • @ABaumstumpf
      @ABaumstumpf Год назад

      @@addnan6494 "the difference is minor because of the bandwidth limitation"
      Watch his video again. He tested with x8 and x4 and the difference is minor - aka the bandwidth is rarely relevant.

  • @NeroKoso
    @NeroKoso Год назад

    7:50 in Finland we also have to add the electricity transfer cost, which makes the electricity bill double. Plus tax.

  • @HexerPsy
    @HexerPsy Год назад +4

    DLSS 2 is a performance trade-off between image quality and frame rate. If you test the image quality to be the same (say Quality mode vs native), then I think it's fair to add DLSS 2.0 to the charts.
    But... DLSS 3 must wait for 2 frames to be rendered, then interpolates a frame in between. It's also limited by the maximum frame rate if users set a frame rate cap for their monitor.
    So DLSS 3 always has a latency penalty, that difference can be felt in some games, and you can't fit that into a chart.
    So now you must check the latency impact... the image quality of the upscale... and the interpolation artefacts... That's a lot to get right and frankly incomparable to native resolution.
    However, I think some story games could benefit from this tech. So it becomes game-dependent whether the tech is applicable or not.
    At that point, you are not testing the GPU.
    Save DLSS 3 charts for when you test DLSS 3 between generations on a few test-case games to see how the tech progresses. Maybe, like RT, it will improve and become the new standard setting for some games.

    • @korana6308
      @korana6308 Год назад

      Good point. Latency is very important. The point of immersion is that your in-game character responds to your actions instantly, which pulls you into the game. But when there's a delay, that immersion breaks. So it shouldn't be just about FPS, but also about input delay in games, and I wish more reviewers checked that.

    • @xxovereyexx5019
      @xxovereyexx5019 Год назад +1

      Dude, latency is only important in competitive or esports games.
      If you only play casual games, the difference is very small to nonexistent.

    • @HexerPsy
      @HexerPsy Год назад

      @@xxovereyexx5019 But consider the tech by itself then.
      If you have a 60 fps display without FreeSync or G-Sync, either you uncap it and get screen tearing all over the place with DLSS 3, or, with a target of 60 fps, DLSS 3 is going to render 30 fps and it'll feel like a 20-24 fps game.
      And you will definitely notice some artefacting on the interpolated frames if it's targeting 60 fps.
      Also, upscaled image quality at 1080p looks poor because of the low internal render resolution. You are better off rendering native for image quality at 1080p, and most games render 1080p just fine!
      Sure, if you have a higher-end 144Hz G-Sync/FreeSync display at 1440p or higher res, DLSS 3 is going to render at 72 fps, and the latency penalty makes it feel like 50-60 fps. You will be fine, indeed.
      But are you really a casual player behind that kind of monitor tech?
      Alternatively, you can drop a quality level on DLSS 2 and sacrifice some image quality for better latency AND more fps.
      And if latency doesn't matter at all - run it native at 30 fps. Would you notice the higher fps in those games? Are the games you play even that demanding? The Switch, consoles and mobile gaming do it too...
      To me, DLSS 3 ends up such a narrowly applicable feature that it's hard to find a use case for it.
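      The "60 fps cap feels like 30" point can be put into numbers with a tiny sketch, under the simplifying assumption that frame generation interleaves one interpolated frame per rendered frame and the display cap limits presented frames (real latency also depends on the game and on Reflex):

      ```python
      # Simplified model of frame generation under a display frame-rate cap:
      # only every other presented frame is actually rendered and input-sampled.

      def summarize(display_cap_fps: float, frame_gen: bool) -> str:
          rendered_fps = display_cap_fps / 2 if frame_gen else display_cap_fps
          input_interval_ms = 1000 / rendered_fps          # how often input reaches a real frame
          # One rendered-frame interval stands in for the interpolation delay.
          extra_delay_ms = input_interval_ms if frame_gen else 0.0
          return (f"cap {display_cap_fps:.0f} fps, frame gen {'on' if frame_gen else 'off'}: "
                  f"{rendered_fps:.0f} rendered fps, input every {input_interval_ms:.1f} ms, "
                  f"~{extra_delay_ms:.0f} ms added delay")

      print(summarize(60, frame_gen=False))
      print(summarize(60, frame_gen=True))   # looks like 60 fps, responds like 30 fps
      ```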

  • @DigitalIP
    @DigitalIP Год назад

    I mentioned a 3.0 system either on this channel or on Jayz before, so YAYYYYY
    Glad someone's testing on it.

  • @XxFusionsk8rxX
    @XxFusionsk8rxX Год назад +10

    You, Jay and Steve are absolutely the best YouTubers covering tech. The 3 of you go so far above and beyond for everyone else.

    • @Elinzar
      @Elinzar Год назад +7

      I'd say Jesus Steve, Unboxed Steve and der8auer. Jay has sold out many times, and LTT for me has been meh these days, except when they talk about their server stuff.

    • @XxFusionsk8rxX
      @XxFusionsk8rxX Год назад +4

      @Elinzar Yeah, LTT has been lacking imo, and I feel that about Jay too, but to me he has still provided so many helpful videos despite the sponsoring and such. But yeah, der8auer and Steve are who I watch for all teardowns, OCs, reviews and experimental stuff. Love these 2 channels; I wouldn't have been able to build my first rig without their videos and explanations.

    • @pandemicneetbux2110
      @pandemicneetbux2110 Год назад +3

      In fairness HWUB was among the first to be covering this VRAM issue, and really hammering nVidia for their lying bullshit and scammery. I feel like Aussies in general aren't as willing to put up with slick bullshit.

  • @Cblan1224
    @Cblan1224 Год назад

    Yo how long until we get the mycro direct die block? There are a few threads of people anxiously checking on this every day, including myself!
    Keep up the great work
    Thanks

  • @kutark
    @kutark Год назад +4

    I'm very sensitive to artifacts and to texture flashing and popping. I have yet to find a game where I enabled DLSS and didn't hate it.

  • @DoNotFitInACivic
    @DoNotFitInACivic Год назад

    Bless your detail-oriented German soul for breaking out the power costs for the cards.
    Too many YT and print reviewers mention the efficiency gains but don't give a real-world figure for the savings.