Frame Generation A Scam? Super Refresh or GeForce 50 Series Next Year? October Q&A [Part 3]

  • Published: Jan 9, 2025

Comments • 831

  • @mukkah
    @mukkah Год назад +19

    13:12 a really fair acknowledgement. If you're cap'd, you're budget capped, no worries. Totally get the recomm, that if you can manage the extra cost, AM5 is the way to go for budget to help carry into the future > which helps to stay budget friendly down the road, too! ^^
    Did an AM4 build Nov '22, happy with it and am OK with upgrade path cappin' at 5800X3D and top end ddr4 ram (don't do much heavy productivity)
    Build within your means, and take these homies' advice wherever it fits your needs!

  • @kaslanyaan
    @kaslanyaan Год назад +347

    Guess we're about to see the 5090 being priced at least $3000

    • @Rezwolf
      @Rezwolf Год назад +46

      probably 4500 on ebay

    • @Petch85
      @Petch85 Год назад +74

      Naaaa, It will be a subscription only product.

    • @Tanzu15
      @Tanzu15 Год назад +9

      Guess almost no one will buy them. If the 5090 costs 3000 that means a 5070 would cost 2k

    • @Rockett.
      @Rockett. Год назад +32

      Stop capping, everyone knows it's gonna be between 1.5k-2k

    • @roanbrand7358
      @roanbrand7358 Год назад +40

      Nah, $1900 msrp with $2300+ real price initially

  • @sublime2craig
    @sublime2craig Год назад +2

    Thanks!

  • @chrislemery8178
    @chrislemery8178 Год назад +42

    The thing with frame gen that seems to shock most people is that the game already has to be performing somewhat decently before frame gen truly shines. Many were expecting a "fix this broken game" medicine, and that just isn't what it is. If the game runs choppy without it, it's gonna run choppy with it enabled.

    • @EarthIsFlat456
      @EarthIsFlat456 Год назад +8

      Most people are simply clueless. FG is icing on the cake, not a fix for bad performance.

    • @jwhi419
      @jwhi419 Год назад +6

      @@EarthIsFlat456 Same thing for upscaling, frankly. If your game is micro-stuttering with 50 ms hitches, increasing your fps and reducing your render time will only make it feel worse, if anything.
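
      To put rough numbers on that point (a minimal sketch with illustrative frame rates, not figures from the video): the same 50 ms hitch becomes a proportionally bigger spike the shorter the average frame time gets.

      ```python
      # How big is a fixed 50 ms hitch relative to the normal frame time at various fps?
      HITCH_MS = 50.0

      for fps in (30, 60, 120):
          frame_time_ms = 1000.0 / fps             # average time per frame
          spike_ratio = HITCH_MS / frame_time_ms   # how many "normal" frames the hitch spans
          print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms/frame, "
                f"a 50 ms hitch = {spike_ratio:.1f}x a normal frame")
      ```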

    • @volkishelf1088
      @volkishelf1088 Год назад +2

      @@jwhi419 And your source for that is? Sounds like utter bullshit; upscaling is something entirely different.

    • @kevinerbs2778
      @kevinerbs2778 Год назад

      @@volkishelf1088 Just think about the number of games released this year that all have stuttering. Stuttering, and even "micro stuttering", is proving to be more of a game-engine coding problem than anything wrong with the GPUs. There have been at least 14 games this year that were so terribly optimized they needed DLSS/FSR/XeSS just to become playable. That is totally unacceptable for these so-called triple-A titles.

  • @dragonmares59110
    @dragonmares59110 Год назад +41

    Not going to upgrade as long as those prices stay so stupidly high... at this point I am giving up on PC and will switch to console if Nvidia, AMD and Intel are not willing to come down in price.

    • @JavierPwns
      @JavierPwns Год назад

      Welcome to the new economy

    • @Idyllicsilver
      @Idyllicsilver Год назад +6

      Already made the switch to console and haven't looked back

    • @Tanzu15
      @Tanzu15 Год назад +3

      At some point the upgrade to the highest-end card of the moment will be worth it. For example, I went from a 2017 launch-day 1080 Ti to an RTX 4090, and the jump in performance was leaps and bounds: the VRAM more than doubled, it uses less power, and it has all the new tech. If you wait to buy a 5090, it will be worth it as long as you spend 1500 or less.

    • @t5kcannon1
      @t5kcannon1 Год назад +6

      PC gaming is very viable if you turn down the settings a bit. I'm still running an MSI 970 GTX, and have enjoyed a lot of games over the years, at a great price from Steam, Epic, GoG, etc. I'm in the market for a system upgrade; but am in no great hurry.

    • @k.23_02
      @k.23_02 Год назад +9

      I'd rather turn the settings all the way down than ever switch to the scam that is modern consoles, where games are more expensive, you have to pay for subscription models just to play multiplayer, and the game selection isn't even that good, considering modern games have hardly been worth the money.

  • @patrickbrault8411
    @patrickbrault8411 Год назад +24

    You guys should do a special lemon episode where you go over the worst hardware and games you encountered in 2023. BTW you deserve the million mark and much more. Keep up the good work.

  • @Chairman_Wang
    @Chairman_Wang Год назад +2

    Unless there is some fire sale I would always buy the lower-cost system now and upgrade later. Never count on reusing your motherboard.
    Picking the price/performance king now and upgrading later beats trying to future-proof in the budget category, imo.

  • @gucky4717
    @gucky4717 Год назад +3

    8:00 My last 3 motherboards/CPUs were always end-of-line parts (2600K, 7700K, 5800X3D).
    The CPU was always around 300€, the board ~150€ and the RAM ~100€. When I sold them 5 years later, I got ~250-300€ for CPU + motherboard + RAM, so about half.
    So I never "upgraded" my CPU while keeping my board. I didn't know back then that my Intel sockets were EOL...
    GPUs are tougher... The last GPU I sold at "normal" prices was my 1080 Ti, for 400€ after 3 years of use, while it had cost ~800€. I'd say GPUs can lose their worth much faster...
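
    A quick sanity check of the resale math above, using the same figures quoted in this comment:

    ```python
    # CPU + board + RAM resale after ~5 years, prices in EUR as quoted above.
    cpu, board, ram = 300, 150, 100
    resale_low, resale_high = 250, 300

    total = cpu + board + ram  # 550
    print(f"Paid ~{total}, sold for {resale_low}-{resale_high} "
          f"({resale_low / total:.0%}-{resale_high / total:.0%} of the purchase price)")
    ```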

    • @gucky4717
      @gucky4717 Год назад +2

      @@ardonsioren But 5 years later you won't upgrade just one gen, but 5. :D
      I don't upgrade every year...

    • @loganmedia1142
      @loganmedia1142 Год назад

      Same here. I think the last time I upgraded a CPU on the same socket was in the 486 and Pentium days.

  • @MinosML
    @MinosML Год назад +4

    Lmao favorite intro so far, not even trying by the third part haha
    Back to you, Steve!

  • @Slider93
    @Slider93 Год назад +88

    Finally somebody with a media presence loudly said "brands don't matter, they are just a name, you have to do the research yourself".

    • @GFClocked
      @GFClocked Год назад +11

      I've been watching for years, they've been very consistent and down to earth with stuff like this.

    • @emmata98
      @emmata98 Год назад +4

      I mean this is obvious and standard for hub or gn, and even Jay

    • @dIggl3r
      @dIggl3r Год назад +4

      But but... I've been told absolutely not to buy AM5 ASUS motherboards... so that person was wrong then? 😄

    • @TTks124
      @TTks124 Год назад

      This should be common sense for anything you're looking to acquire. Sadly, too few people have common sense.

    • @MEGABUMSTENCH
      @MEGABUMSTENCH Год назад +1

      @@dIggl3r did you go momentarily blind before you read "do your own research"?

  • @zodwraith5745
    @zodwraith5745 Год назад +2

    You can find 12600K, 12700K, and Zen 3 chips for stupidly cheap right now. They're the performance value picks at the moment. When it comes to mobos, you can get DDR4 Z690 boards much cheaper than DDR5 ones if you don't want to sacrifice overclocking on a B-series board.
    All of these options, alongside Zen 3, are still very viable gaming platforms until you get into FAR more expensive GPUs. Anyone with a four-figure GPU isn't trying to save $75 on their platform, and anyone building a budget system isn't cross-shopping 4090s and XTXs to pair with it.
    Even with the "future proofing" argument: everyone knows that's always a faulty argument that only makes you overspend, and by the time you NEED to upgrade again there will be new mid-range chips faster than your "future proof" chip, probably for around the same price as the amount you saved to begin with.
    It's much easier to overspend on a CPU that you'll never fully utilize than on a GPU that you'll _constantly_ be running at full tilt while gaming. This is why I don't agree with weighing CPUs ONLY when paired with a 4090/XTX like all the review channels do. It leads people to think they _NEED_ far more CPU than they really do, when 99% of gamers are almost never CPU bound. Their budget will give vastly better real-world performance with as much focused on the GPU as possible. As long as you're not talking about a seriously outdated 4770 or 1600X, of course.

  • @pirotehs
    @pirotehs Год назад +2

    This is how much HUB cares about power consumption when it's Nvidia! 4060 Ti performance at a 75 W TDP? "We don't need it" 5:50. HUB prefers the AMD RX 6700 XT with a 230 W TDP. Don't forget this the next time HUB complains about Intel's CPU power consumption. They WILL.

  • @highlanderknight
    @highlanderknight Год назад +1

    Frame generation is not a scam. If players like it and it helps some with their frame rates, great. If you don't like it, then just totally ignore that feature when figuring out what GPU you might like to purchase. If it is a technology that is available, why leave it out? Just simply don't use it and let the rest of us decide for ourselves.

  • @komorka88
    @komorka88 Год назад +3

    As for component prices: I just swapped my Ryzen 1600 on a B350 board for a 5800X3D on the same board. Huge jump in performance without having to buy a new mobo + RAM + AM5 CPU, which would cost 250% as much as the 5800X3D alone and bring maybe only 15-20% more performance.
    AM5 currently makes sense only for someone who wants to buy a whole new platform because they are on something like LGA1155 or AM3. I will likely skip AM5 entirely thanks to the 5800X3D.
    Thus, I believe AMD will not make the same mistake again, and AM5 will be short-lived and without such great upgrade potential as AM4.
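
    A rough sketch of the value argument being made here; the 300 base price for the 5800X3D is a placeholder, only the "250% of the cost" and "15-20% more performance" ratios come from the comment:

    ```python
    # Drop-in 5800X3D on an existing AM4 board vs. a full AM5 platform swap.
    x3d_cost = 300.0                    # hypothetical 5800X3D price; board and RAM reused
    am5_platform_cost = x3d_cost * 2.5  # "250% as much": CPU + AM5 board + DDR5

    extra_cost = am5_platform_cost - x3d_cost
    for extra_perf in (0.15, 0.20):
        per_point = extra_cost / (extra_perf * 100)
        print(f"AM5 route: +{extra_cost:.0f} for ~{extra_perf:.0%} more performance "
              f"(~{per_point:.0f} per extra percentage point)")
    ```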

    • @loganmedia1142
      @loganmedia1142 Год назад

      Similarly for those who have an LGA1700 CPU. There are going to be faster options that don't require a new board available for quite a while still.

  • @KingGurrenAlex7
    @KingGurrenAlex7 Год назад +1

    Unluckily I had to upgrade from my i5 7600K for work purposes, and the unfortunate thing is that in Mexico, for the price of an AM5 CPU alone, I can get an AM4 mobo and a Ryzen 5000 CPU as long as I don't go past a 5800X. Since I was able to reuse my RAM, it was the way to go for my upgrade, and now I'll hold out for a year or two until I do a proper upgrade. I plan to do another upgrade eventually so that my brother can take over my current machine and have something that will last him another couple of years.

  • @dividead100
    @dividead100 Год назад +2

    AM4 is just much better value; 90% or more will never upgrade. Given that, recommending AM5 to people on a budget is just bad. Also, past experience says the AM5 components you buy today will be cheaper in 3 years, so just do an entire upgrade then and sell whatever you won't be using anymore.

  • @stevenwest1494
    @stevenwest1494 Год назад +1

    I totally disagree with budget builders going AM5: they don't all need 32GB of RAM, as they don't use the programs that need it, and the AM5 chips are expensive. The Zen 3 platform is so mature that a second-hand B450 board could handle a 5950X and 32GB of RAM in 2 years' time!

  • @nio3836
    @nio3836 Год назад +1

    My biggest fear with FG and upscalers is that they shift the focus way too much onto features and away from the actual hardware. The RTX 4060 Ti is the prime example of that.
    The main focus of hardware should always be the hardware itself. Features should only be the cherry on top, nothing more, nothing less, but never the main focus. Yet that's how it's going to look in the future...

    • @erobbin144
      @erobbin144 Год назад

      well no, the entire point of a GPU is to deliver an enjoyable experience, and at the end of the day why would it matter if it comes from software or hardware?

  • @crzyces1693
    @crzyces1693 Год назад +17

    A lot of people are really expecting the 4080 Super/Ti to come out at $1000-$1100 and be 15%+ faster than the 4080. Then, they expect the 4080 to drop to $800 which will in turn move everything else down and I just do not see that being the case. I don't even see the 4070 or 4070Ti being lowered in price, though I do think they will slot a 4070 Super Ti in between the 4070Ti and 4080 *FOR MORE MONEY.* Nvidia is not going to give up these damn prices. As long as home pc tinkerers are third on their sales list they are just going to keep prices crazy high until the AI thing slows down. I could see them (Nvidia) actually buying the cards back from AIB's if they can't sell them and repackaging them as Datacenter or less expensive _"Pro"_ cards. Less expensive compared to the current Pro cards, not less expensive than the gaming ones. Then if they have to eat some of the cost it's easier to write off while they make a larger profit on the ones they actually sell. They assume that even if they have to eat 75% of the stock between the write-offs and the double profit from the pro card sales they'll be right back where they were had they sold them to gamers without them ever having to bring the damn price points down to a level that makes sense. *UNLESS* AMD decides to be really aggressive with pricing. Then they'd try. Without AMD trying they literally do not care at all though.

    • @Greenalex89
      @Greenalex89 Год назад +5

      Well, how much further down do you want AMD to go? The 7800 XT is in the 4060 Ti price range and the difference is huge; the 7900 XTX is slightly better than the 4080 and still 250 bucks cheaper (after the 4080's price drop). Last- and current-gen Nvidia cards are still too expensive, whereas so many AMD cards are called best bang for buck, like the 6600 XT, 6700 XT and 6800. I almost feel sad for people who bought a 3070, 4070 or 4080, such bad value... Proper research could have prevented that.
      I just don't see AMD lowering prices much more, since Nvidia won't, because they don't care. That's why we need Intel to step up and bring some serious competition.

    • @vulpix9210
      @vulpix9210 Год назад +2

      @@Greenalex89 The 7900 XTX in the worst case is around the 4080. If you're being honest and test without any upscaling or frame interpolation tech (DLSS, frame gen), and with RT off as well, it's competitive with the 4090. The pricing of it is fine; it's a GPU that gives 4090-level gaming performance disregarding RT, and not many people and games use RT.

    • @donger58
      @donger58 Год назад

      ​@@Greenalex89did you find a 7800xt for 370-400$?

    • @Greenalex89
      @Greenalex89 Год назад

      @@donger58 Well, the 4060 Ti was $499 not too long ago; obviously prices had to drop, since that's a totally ridiculous price. I'd buy a 7800 XT for that money any day, no clue what your point is exactly.

    • @Greenalex89
      @Greenalex89 Год назад

      @@vulpix9210 Id say it trades blows with the 4080, soooometimes even with the 4090, not more, not less..Decent price for the performance

  • @pointandshootvideo
    @pointandshootvideo Год назад +1

    We don't need more cards or even a 5090. Developers need to optimize for the RTX 3080 and we're good to go. 100,000 FPS isn't the goal; exceptional-looking graphics at a playable frame rate is all that's needed. I don't know why developers are putting out games that can't be played on mainstream cards - that's just sloppy, pure and simple.

  • @UncleLayne
    @UncleLayne Год назад +2

    Steve and Tim, I hope I can have the same drive and motivation that you fellers have when I take over the family farm in the coming years. Yall are doing some good work I think

  • @soupwizard
    @soupwizard Год назад +5

    I've given up on PC gaming. I did buy a 3060 Ti last year to go with my 1440p monitor, just to play some older and less demanding games, but even those I play less and less now, and I don't want to shell out $800+ for a GPU to play the top new games. No Alan Wake 2 or Cities: Skylines 2 for me, I just don't care anymore.

    • @Dr_EviiL
      @Dr_EviiL Год назад

      I'm pretty much in the same boat. Although I game at 1080p (still have i5-8400 CPU). Just can't be arsed for the huge expenditure for not much gain. Xbox series X (for 4k single player) and the odd FPS on pc @1080p 144hz is fine for me, for the foreseeable...
      Maybe we are getting old?

    • @samgragas8467
      @samgragas8467 Год назад

      @@Dr_EviiL Nonsense, you can play Alan Wake 2 using PS5 settings and you will get around 80 FPS on a 3060 Ti.

  • @christopherhadsell9049
    @christopherhadsell9049 Год назад

    You guys have a top-notch channel here! Congratulations on the 1M subscribers! I joined you a number of years ago. I feel your content has really improved. I do, of course, watch GN, too. I always held him to be the 'gold standard.' However, your channel is doing super well! Thank you both, and, I suppose, Balin, too, for all your hard work and tireless dedication.

  • @ffnbbq
    @ffnbbq Год назад +2

    I dunno, the price difference between a 32GB DDR4 3600 Gskill Ripjaws kit vs a 32GB DDR5 6000 CL30 Gskill Z5 Neo for AMD is significant in Australia. Not to mention a half-decent B650m board costs almost as much as the B550 Aorus Master (which I believe you recommended).

    • @sammiller6631
      @sammiller6631 Год назад

      Then don't buy a 32GB DDR5 6000 CL30 Gskill Z5 Neo for AMD, buy a 32GB DDR5 6000 CL30 Gskill FlareX for AMD instead.

    • @ffnbbq
      @ffnbbq Год назад

      @@sammiller6631 I am going by the kits HUB have shown/mentioned as the ones they used in testing and evidently trust to be stable and reliable.

  • @antronk
    @antronk Год назад

    I loved Steve’s energy on this one, kicking off with “just roll the ad already” and the banter at the end praising us viewers 😂😂

  • @srmadden1979
    @srmadden1979 Год назад +2

    A bit off topic, but I was thinking earlier... 3 and a half years after the PS4 was released you could get an RX 580 that would blow it away for $229. 3 and a half years after the PS5's release you probably need to spend $1000+ to surpass it by as much as the RX 580 did the PS4. What an overpriced and underpowered pisstake midrange PC gaming is these days.

  • @EhNothing
    @EhNothing Год назад +1

    "Wait a bit longer to upgrade if you can" - I'm on a 4770K, DDR3, and a 1070 Ti....because I keep waiting and waiting and waiting....

  • @ydfhlx5923
    @ydfhlx5923 Год назад +1

    16:46 - Steve calls RTX 4070 "a lower end GPU". It costs $600 ffs.
    Let's hope he meant "lower than 4090" and isn't detached from reality.

    • @Hardwareunboxed
      @Hardwareunboxed  Год назад +1

      It's a lower end GPU for sure, it only has 12GB of VRAM. Don't be upset with us that you're being charged way too much for what should be a $400 max tier two card.

  • @Gomcio
    @Gomcio Год назад +2

    Frame generation is worth it only if you already have 60+ fps, so you can get closer to your refresh rate (let's say you have a 144 Hz 1440p monitor or a 120 Hz 4K TV); G-Sync helps... so basically it's something for people who like to have 100+ fps.
    If you have 30-40 fps with DLSS, then frame gen won't help... the choppiness is annoying and you don't even get good frame times: you'll be around 60-70 fps but with 28-32 ms of latency, which is 35 fps territory.
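
    To illustrate those numbers: with simple 2x interpolation the fps counter roughly doubles, but responsiveness still tracks the underlying rendered frame rate. A minimal sketch that ignores frame gen's own overhead and any Reflex-style latency mitigation:

    ```python
    # Displayed fps vs. the frame rate your inputs actually respond to with 2x frame gen.
    def with_frame_gen(base_fps: float) -> tuple[float, float]:
        displayed_fps = base_fps * 2              # one generated frame per rendered frame
        input_frame_time_ms = 1000.0 / base_fps   # latency still tied to real frames
        return displayed_fps, input_frame_time_ms

    for base in (35, 60, 100):
        shown, ms = with_frame_gen(base)
        print(f"{base:>3} real fps -> counter shows ~{shown:.0f} fps, "
              f"but input still feels like ~{ms:.0f} ms per frame")
    ```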

  • @stephenlynn1233
    @stephenlynn1233 Год назад

    Good in taking all the credit! Lol
    A great in-depth video as usual, and keep up the good work!

  • @-Brunnen-G
    @-Brunnen-G Год назад

    The question about frame generation being a scam is missing the point. It's not that you can get equivalent graphics at better FPS; it's that you can get something to a playable FPS at decent graphics quality that wouldn't otherwise be acceptable. It's not a 1:1 replacement for normal graphics processing. Yet.

  • @papichuckle
    @papichuckle Год назад +1

    PC gaming needs longer generations given how expensive it is now; I'm not spending 2 grand every 2 or 3 years on a GPU.
    It would be nice to see longer generations and prices coming down, letting people catch up, things becoming more balanced, and hopefully more optimised games coming out of it.
    I doubt it though, as the PC community is all apparently upper middle class and could buy a 4090 once a month if they wanted to.

  • @PcGamingForFun
    @PcGamingForFun Год назад +1

    Cyberpunk with FG on is awesome. No feeling of lag whatsoever and everything is smooth 👌

  • @TheNerd
    @TheNerd Год назад +1

    A 4070, a lower-end GPU?
    Sure, but most "gaming PCs" out there are slower than a PS5 / Xbox, because most people (the average gamer) buy a new system every 5-7 years.

  • @johanlahti84
    @johanlahti84 Год назад +1

    The pricing of PC parts, specifically GPUs, plus the steps developers take to make GPUs only a few years old obsolete at a faster rate... the incentive to build a mid-to-high-end rig right now is non-existent. You never know when the next "mesh shaders can't be run on this GPU" moment happens...

  • @VoldoronGaming
    @VoldoronGaming Год назад

    I have an HP Z400, which uses a proprietary pinout for the ATX connector to the motherboard. I dropped in a 650-watt EVGA PSU with an adapter that converts the standard ATX power connector to the proprietary HP connector on the motherboard, and put in a GTX 980 Ti. The PSU that comes with the machine is 400 watts, and though they did make a 675-watt PSU for the Z400, they are very rare and hard to find. So there are ways to adapt an OEM motherboard to a standard ATX pinout. Anyway, that machine is my Windows XP retro gaming computer.
    Motherboard, CPU and RAM for the AM5 platform run around $450-500, which does include the Ryzen 7600... with a micro-ATX case, NVMe drive, 6650 XT and 600-watt power supply the whole thing would cost less than $950.

  • @MadBlazer89
    @MadBlazer89 Год назад +1

    The problem is that people are defending new AAA titles that run like shit by saying that back in the day GPU upgrades were required more frequently, and that the only reason people got away with it last gen is because the PS4 and Xbox One were so weak... and now we are back to "normal" according to them; tl;dr "JUST UPGRADE UR PC". While I somewhat agree, people are forgetting that if you had bought a 60-class card, and another 60-class card after 2-3 years, the cost would have been almost the same while the performance gain was massive. Nowadays if you have a 3060 Ti, you'll upgrade to what, a 4060 Ti, where the performance improvement is almost nonexistent? Fine, you want to jump up to an 80-class card, which used to go for around 700 bucks; tough luck sucker, you need to pay 1200. So yes, there is a need to upgrade, but there's nothing worthwhile to upgrade to... without shelling out 3 times the money, of course. If I have a $350-400 budget for a GPU upgrade, maybe in the past you could have convinced me to spend $650-700 on a higher-tier card. But in no way can you convince me to go well over a grand... not gonna happen, it just won't.

  • @Scarecrow-zr3zx
    @Scarecrow-zr3zx Год назад +2

    0:07 Part 2 on the graphic

  • @ivotomic1558
    @ivotomic1558 Год назад +1

    The on-screen text says this is part 2... not part 3... the "other Steve" will make a note... you're close to a "problem with Hardware Unboxed" vid coming soon... just a few more on-screen mistakes.

  • @SDGRTX1455
    @SDGRTX1455 Год назад +2

    Frame generation without VRR is just useless unless you have a 60 Hz monitor or TV. 120 Hz without VRR enabled is just a tearfest.

    • @DailyCorvid
      @DailyCorvid Год назад

      That just says the card isn't powerful enough.
      They give all these tools to have excuses.
      Raw power alone can't support 4K rendering. At these prices it's a joke!
      I don't WANT 50 inserted fake frames and fake 4K upscaled from 1440p or 1080p. I can already do all that artificially in software, I don't NEED Nvidia software!
      I NEED CUDA CORES, VRAM AND BANDWIDTH, NVIDIA!

    • @optimalsettings
      @optimalsettings Год назад

      It's so easy: if you like FG, turn it on; otherwise turn it off! For me FG is a godsend when nailed at 4K on my OLED.

  • @JohnSmith-sw2zy
    @JohnSmith-sw2zy Год назад +1

    I really like to turn on FG when my FPS stays under ~60 fps in single-player titles after maxing all settings. It provides a much smoother experience, and the occasional artifacts here and there are a trade-off I am willing to take. Fake or not, I do not care, because it in fact makes my gaming experience better, and that is what counts for me.

    • @awebuser5914
      @awebuser5914 Год назад

      "...after maxing all setttings." Why? It has been categorically proven that "Ultra" and other nonsense settings offer no _relevant_ visible difference and don't add to the game experience in any tangible way. "Ultra" *does not* equal "better"

  • @48some
    @48some Год назад +1

    I love frame generation; I always use it when needed. I play games with an Xbox controller, and seriously it's the best gaming experience I've had.

  • @shahrukhwolfmann6824
    @shahrukhwolfmann6824 Год назад

    24:12 Alright, Tim, here we go! Thumbs up guys, you are AWESOME! Blessings!

  • @GoFeri
    @GoFeri 11 месяцев назад

    The PNY and Gainward cards are two-slot cards - the noise/temp results are not bad, considering.

  • @davidlefranc6240
    @davidlefranc6240 Год назад

    Congrats on the 1mil subscribers.

  • @NiCO-jo2vh
    @NiCO-jo2vh Год назад

    So it was Balin who made the B- Roll of the empty table a year or two ago in that GPU video.
    That was awesome.

  • @vidiveniviciDCLXVI
    @vidiveniviciDCLXVI Год назад +1

    I spent £5,000 on my PC 2 months ago, couldn't be happier, future proofed for the next year.

  • @danielpindell1267
    @danielpindell1267 Год назад +2

    I've tried it in Cyberpunk with full ray tracing and path tracing on, and it's honestly pretty good. Yes, in some games it's not as smooth as in others while using it, but in other games it works pretty well. Sadly, developers are relying too much on the tech instead of optimizing their games. On the other hand, Unreal Engine 5 is way too demanding in its current state, where settings have to be dialed down or we have to use DLSS and frame generation. Might as well get used to it, as this tech isn't going anywhere; even AMD is copying Nvidia and going the same route with FSR.

    • @kevinerbs2778
      @kevinerbs2778 Год назад

      why not just mGPU to get around all that?

  • @Dragon211
    @Dragon211 Год назад +2

    Going to upgrade from a ryzen 2600 to whatever the 2nd generation of AM5 is, can't wait!!!

    • @MrAnimescrazy
      @MrAnimescrazy Год назад +2

      Are you going to upgrade your entire setup?

    • @Dragon211
      @Dragon211 Год назад +1

      @@MrAnimescrazy Yeah, I'll have to, but I'll keep my current RX 5700 XT GPU, PSU, water cooler and case. That should cut down the cost quite a bit.

    • @MrAnimescrazy
      @MrAnimescrazy Год назад +1

      @@Dragon211 OK, and when you upgrade everything else, what are you looking at currently? Or are you waiting for future parts?

  • @CommanderBeefDev
    @CommanderBeefDev Год назад +1

    Speaking of the ASUS fiasco, that's the board I'm running right now. Got it a month ago, and I guess it was already updated; it is rock solid and I would highly recommend it. ASUS B650 TUF Gaming WiFi Plus, cost 220 on Amazon, has heatsinks, extra VRMs, 3 NVMe slots (one of them Gen 5), 2 PCIe x16 slots, and it is a true ATX board. I will never buy a budget board again; hell, I even get diagnostic lights now.

  • @102728
    @102728 Год назад +2

    Zen 3 is where you go when you've got under $1k to spend and don't expect to free up much more budget down the line for a GPU upgrade. Spending nearly half your budget on the CPU/mobo/RAM leaves you with not much left for a GPU and, in the end, less real-world performance than if you went with the AM4 platform and a stronger GPU.

    • @loganmedia1142
      @loganmedia1142 Год назад +1

      Exactly and that system will be good for years. The typical person who is buying a budget system is unlikely to want to upgrade again before AM5 itself is no longer available. It is also likely that any DDR5 RAM bought today will be considered slow in five years time.

    • @102728
      @102728 Год назад

      @@mojojojo6292 1 - you don't have a grand, 2 - you're not in the US and have higher pricing, 3 - your limited budget now is quite likely to remain limited for the foreseeable future, meaning you won't be upgrading your CPU until it becomes unusable. And unusable means not just slower compared to new CPUs, but actually not running the software you want to run. You'll be fine throughout AM5's lifetime with a Zen 3 processor, playing the games you can afford to play; beggars can't be choosers.
      Reasons 2 and 3 are why I set up my wife's nephew with a 5600X + 7700 XT build; for his budget this should give him the performance he needs for years to come to play Fortnite and the like, and maybe explore some different and newer titles as he grows up. Yes, he's locked out of Zen 5/6 upgrades, but it's not likely he's gonna buy any of those anyway since he's like 13 y/o.

  • @denisruskin348
    @denisruskin348 Год назад

    Thanks for going with my question guys!

  • @cstubed
    @cstubed Год назад +1

    BIOS wise no brand is reliable if it is a new platform till it gets a few updates.

  • @charlesandresen-reed1514
    @charlesandresen-reed1514 Год назад

    Definitely agree on optimization not always being as expected and some strange trends. I remember (in a PTSD-like way) how utterly bad Cyberpunk was in performance at launch. I couldn't consistently max out settings with the 3090. But like many people, I started playing again recently after 2.0/Phantom Liberty launching, and the improvement was stunning. My main PC has a 4090 now but I hadn't actually finished building it when I started playing the new release. So instead I was running it on a 2060, with a core i5 processor. On a laptop. And it worked quite well and looked pretty damn good still. But on the other end, I heard plenty of people with performance issues in regards to Hogwarts, and that I never experienced any issue running. Sometimes its maddening trying to predict performance in modern games, as so much changes so fast that benchmark numbers that were good six months ago are next to useless now.

  • @InnuendoXP
    @InnuendoXP Год назад +3

    23:50
    Man, ASUS's fall from grace reminds me of Activision. In the early-mid 00's that brand meant you were in for a good time.

  • @alexdevcamp
    @alexdevcamp Год назад +2

    Not sure how easy it would be to separate and let game engines plug into it, but it'd be cool if NVIDIA allowed certain parts of the scene to be frame-generated, and others not (like the UI or text that can get garbled).
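
    A toy illustration of that idea, purely hypothetical (this is not how DLSS frame generation is actually exposed to engines): if the interpolator were handed a mask of UI pixels, those regions could be copied from a real frame instead of being interpolated.

    ```python
    import numpy as np

    def compose_generated_frame(prev: np.ndarray, nxt: np.ndarray,
                                ui_mask: np.ndarray) -> np.ndarray:
        """Blend two real frames into an in-between frame, but pass masked UI pixels
        through from the newer real frame so HUD text doesn't get garbled.
        prev/nxt: HxWx3 float arrays; ui_mask: HxW bool array (True = UI/text)."""
        interpolated = 0.5 * (prev + nxt)  # stand-in for real motion-vector interpolation
        return np.where(ui_mask[..., None], nxt, interpolated)

    # Tiny usage example: pretend the top row of the frame is HUD text.
    h, w = 4, 6
    prev, nxt = np.random.rand(h, w, 3), np.random.rand(h, w, 3)
    mask = np.zeros((h, w), dtype=bool)
    mask[0, :] = True
    frame = compose_generated_frame(prev, nxt, mask)
    assert np.allclose(frame[0], nxt[0])  # UI row passed through unmodified
    ```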

  • @stevenwex8966
    @stevenwex8966 Год назад +1

    CPU costs are too high. With a second-hand 3000-series Ryzen I could build an AM4 system for less than the asking price of the Ryzen 5 7600 at $350 AUD, re-using the case, tower cooler and PSU, but I have noticed that more 7600s are coming up on eBay. I'm waiting until Windows 12; my i7 4790 rocks.

    • @formuIafun
      @formuIafun Год назад

      I'm glad you still enjoy your i7 4790, but it's absurd to say it rocks. I have a 6700K and I have to say these old i7s are really showing their age

  • @dimitrisbampanaras5297
    @dimitrisbampanaras5297 Год назад +1

    I'd love to see people try to figure out a way to reliably benchmark MMOs - WoW etc.

  • @autoglobus
    @autoglobus Год назад +10

    As an RTX 4000 series user I have yet to find a use for frame generation. For me it just feels strictly worse with it on, even if it boosts my fps from 100 to 150. I don't know how to put it in words, but it feels like my input latency is disconnected from what I see on the screen, and it feels wrong. I still wouldn't call it a huge scam, just the usual misleading marketing that Nvidia always does.

    • @AegisHyperon
      @AegisHyperon Год назад +1

      First they faked the resolution and since no one complained they decided to fake the FPS too

  • @chinnaramgariakash0029
    @chinnaramgariakash0029 Год назад

    Suggestion: which cabinets support the Noctua NH-D15? Any size requirements?

  • @ClarkWasHere1
    @ClarkWasHere1 Год назад +1

    Performance ≠ just more frames. Performance = lower input latency plus less time between frames.

  • @gottahavitvt
    @gottahavitvt Год назад +2

    Hmmm, sounds like you're saying "future proofing" actually is a thing. I have to say, your recent recommendations to abandon last-gen platform builds even when saving $100, while you quibble over 20-40 dollars on a GPU price, are kind of hypocritical. Telling people to buy consoles over low-end hardware?! WTF

    • @Hardwareunboxed
      @Hardwareunboxed  Год назад

      Obviously, it depends on what you're giving up for the saving. Treating every decision as the same would be very odd.

    • @gottahavitvt
      @gottahavitvt Год назад +1

      @Hardwareunboxed I understand, and I love the *it depends* slogan, but you guys really do drive people's decisions, and for recommendations I like your cost-per-frame analysis, but you do tend to abandon that when discussing platforms. Yes, I know that is much harder with CPUs and multiple components than when just discussing GPUs.

    • @Hardwareunboxed
      @Hardwareunboxed  Год назад

      Apples and oranges mate, you can't compare the value of one product line to another using the same metrics if what they offer is completely different.

    • @gottahavitvt
      @gottahavitvt Год назад +1

      @@Hardwareunboxed But you keep claiming you only review for gaming, what other metric matters? Please don't take this as a bash, I like you guys just providing some constructive feedback.

  • @hedimaster1
    @hedimaster1 Год назад +1

    I generally agree, but the "it plays fine for me" while using a 7800X3D is such a stupid argument. That's literally the top-of-the-line, best-case CPU that can brute-force basically anything. I know he mentions that himself, but his argument in the end was still "it worked for me"; plus, the 4070 literally can't be considered a "lower-end GPU".

  • @mdsremac
    @mdsremac Год назад

    What happens when you use an ultrawide monitor to play games at a 16:9 ratio? Does it still render the pixels that are not in use? Or is the in-game performance similar to running it on a 16:9 monitor?

  • @rob90230
    @rob90230 Год назад

    Are laptop CPUs/APUs advancing faster than desktop CPUs/APUs? It seems that the performance-per-watt of a laptop CPU/APU far exceeds that of a desktop CPU/APU. The bottom line is that the wattage of a desktop CPU/APU is quite excessive.

  • @cgerman
    @cgerman Год назад +16

    I am a huge fan of DLSS but not frame generation. Although other people may love it I don't really care about it. But I wouldn't call it a scam, it is a real feature.

    • @owlmostdead9492
      @owlmostdead9492 Год назад +6

      It's a scam in the context of how it was marketed, marketing was like "free 2x fps".

    • @alexatkin
      @alexatkin Год назад +1

      Problem is frame generation feels pretty awful with a mouse, its much less of an issue with a controller.

    • @garyb7193
      @garyb7193 Год назад +2

      If judging on a scale, I'd say frame gen is more a scam than a feature. By increasing latency, it often degrades the experience more than it enhances it.
      Increasingly higher fps is 'felt' more than 'seen'. Frame gen increases the frames seen, with slight visual artifacts and noticeably added latency. It's our typical GPU tech taking a step backwards.

  • @johnpen269
    @johnpen269 Год назад +2

    In what universe is the RTX 4070 a low-end GPU?

  • @sicmic
    @sicmic Год назад

    I love the Q&A's. The more and longer, the better. Keep em coming lads.

  • @eliotcole
    @eliotcole Год назад +1

    When the '70 class is lodged as a poorer GPU, then the GPU market is bad.

  • @noanyobiseniss7462
    @noanyobiseniss7462 Год назад +2

    4070 IS not a LOWER end Gpu.

  • @setadoon
    @setadoon Год назад +2

    Who cares about a 5090 when 90% of people can't afford or refuse to pay for a 4090..

  • @HenryTownsmyth
    @HenryTownsmyth Год назад +1

    We should compare games with a sharpening filter turned on when comparing the results against DLSS and FG.
    Since FG and DLSS apply some sharpening by default, the images automatically look better on YouTube, although too much sharpening looks terrible in practice and I always reduce it by 50% or so.
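
    For reference, "reduce it by 50%" usually just means scaling the unsharp-mask term; a minimal sketch, with a plain box blur standing in for whatever filter a given game actually uses:

    ```python
    import numpy as np

    def sharpen(img: np.ndarray, amount: float) -> np.ndarray:
        """Unsharp mask: img + amount * (img - blurred). amount=1.0 is full strength,
        amount=0.5 is the kind of halved setting mentioned above."""
        p = np.pad(img, 1, mode="edge")  # pad so a crude 3x3 box blur keeps the shape
        blurred = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)) / 9.0
        return np.clip(img + amount * (img - blurred), 0.0, 1.0)

    gradient = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))  # toy grayscale "image"
    half_strength = sharpen(gradient, amount=0.5)
    ```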

  • @Bluelagoonstudios
    @Bluelagoonstudios Год назад

    I'm not the guy who wants the newest of the newest, especially at these prices these days, so I built myself a high-end AM4 rig and put a custom loop on it. Maybe in 2 years I'll upgrade, when prices drop on AM5 stuff. I've done it this way for many years. Do I like all the new stuff? Of course, but there are other things in life that also cost a lot.

  • @Peylix
    @Peylix Год назад

    I personally can understand the ire regarding FG and the 40 series in general. If you came from a 30 series card, you would be better off just staying with it.
    But as someone who doesn't waste money upgrading every generation for marginal steps, FG and the 40 series have been perfect. Coming from a 6-year-old 10 series card to a 40 series card was a game changer for me, and I'll keep my 40 series for a long time as well.
    Having said that, FG shouldn't be used as a crutch. There are exceptions, like AW2 (a game that actually pushes boundaries, unlike *cough* Starfield *cough*). FG and upscaling in general should be tools to help, not things to build around, which is a trend I fear is growing more and more lately. It's ironically what sparked the goofy backlash against AW2 before the title even came out (and which was proven wrong after the fact). But again, AW2 is an exception, not the rule. Jedi Survivor, TLOU, Hogwarts and Starfield, for example, relied on FG/upscaling at their core rather than having them as a tool if needed. That is what I hope does not grow out of control.

  • @ishaanpitcon4572
    @ishaanpitcon4572 Год назад +19

    Frame generation is amazing in controller-oriented games (Alan Wake 2?), as it allows prioritizing fidelity over frames and then smoothing things out with frame generation.

    • @emmata98
      @emmata98 Год назад +8

      The issue is that the generated frames have horrible fidelity.

    • @theaveragecactus
      @theaveragecactus Год назад +3

      ​@patricktho6546 I'm not a fan of fg but this is straight up misinformation, lol

    • @emmata98
      @emmata98 Год назад +1

      @@theaveragecactus why?
      Haven't you watched the videos about it?

  • @QuatraaGaming
    @QuatraaGaming Год назад +1

    I'm beginning to question whether PC gaming is worth it anymore, which is sad, as I've been on PC since 1999.

  • @eugkra33
    @eugkra33 Год назад

    How can they get "a bit more out of that silicon" on AD103 if it already has all the memory controllers enabled? It seems 16GB is the maximum it can do. I kind of doubt they'll cut down 4090 dies just to make a cheaper 4080 Super.

  • @thingsiplay
    @thingsiplay Год назад +28

    If frame generation adds latency in order to generate frames, then it loses its purpose to me. Higher frame rates should lead to less latency, not more. And it's most effective at higher framerates, where I wouldn't need it anyway. It's not a scam, but it is useless technology to me. I would spend the GPU's resources on something else when possible.

    • @d_shi
      @d_shi Год назад +8

      Have you used FSR 3 or DLSS 3? Both? I found the latency increase negligible, or at least enjoyed the visual smoothness enough to ignore having 80-fps-feeling input latency while seeing 120 fps visually. Also, I think going from something like 80 to 120 (like I am in AW2) is a huge benefit, and I don't buy the "isn't needed" argument. I didn't like FSR 3 when I tried it, but I've been insanely impressed with DLSS 3 so far.

    • @charekuu
      @charekuu Год назад +2

      I've used DLSS FG in most games it's been implemented in I think and I haven't noticed any additional latency whatsoever. Even in The Finals which is a fast paced online shooter, I didn't feel any additional latency.

    • @d_shi
      @d_shi Год назад

      @@charekuu this is actually a game I want to try it in, because im interested in how it feels in a multiplayer shooter thats very fast paced. It gets me to a basically locked 240fps so I need to give it a go

    • @thingsiplay
      @thingsiplay Год назад +2

      @@d_shi No, I didn't use it. But what it provides is not something I don't know about. I know what improving the FPS is like in terms of fluidity (my monitor is 120/144 Hz), and I know what impact input latency has. For example, I have input latency optimizations for my emulators, and this is an important topic for me. My point is, I know how these individual factors matter.
      Frame generation adds input latency, and the newly generated frames aren't even full new frames, as they can have artifacts. Just because you need it does not mean I would. This technology is useless for multiplayer games, and I would only consider it for single-player games. Even then I don't see any benefit to using it. The GPU's resources are better used on something that does not increase input latency. And at higher framerates, I don't see the need for such a technology at all. Hence my conclusion of it being useless FOR ME.

    • @d_shi
      @d_shi Год назад

      @@thingsiplay Yeah, I dunno, I thought these things too, and then I used it and now I'm a believer. But everyone is going to be different, same as with the upscaling discourse.

  • @juanford55
    @juanford55 Год назад

    Doesn't a Super sit between a Ti and a non-Ti version? If that's true... I hope this means a decrease in the price of the non-Ti models, but I don't know if that's going to happen.

  • @kdc420421
    @kdc420421 Год назад +2

    Frame gen is literally the most amazing innovation since adaptive sync. It makes ray tracing and path tracing usable NOW instead of in 5 years. And it REALLY smooths out games that are CPU bound. It's not the best for competitive games but in anything single player or cooperative it's AMAZING! Don't knock it till you try it. And if you try it and it "feels bad" your settings are wrong. The settings are a bit complicated to get right but once you get them right there is really no issue.

  • @guitarmansg499
    @guitarmansg499 Год назад

    Much better intro. keep it up! 😀

  • @west5385
    @west5385 Год назад +2

    Man, I love these videos. I always listen to them while I'm working or cleaning the house.

  • @weltuntergang6668
    @weltuntergang6668 Год назад

    Comparing performance with FG against performance without FG is misleading at the very least, since the user experience is very different. If your real performance is in the 30ish fps range, you will have a bad time even when FG pushes the fps counter well above 60; you'll have a better experience by switching DLSS down a step (e.g. quality -> performance) to bump up your real fps a notch, sacrificing some picture clarity/sharpness for it.
    That being said, I still like FG for the added fluidity in motion, but I will still aim to hit 60ish real fps for the input responsiveness. Since I don't play competitively, this is the point where I stop being bothered by the latency and can enjoy the game. When playing with a controller I aim for 40ish fps at least, since it is less precise than a mouse and therefore generally feels a little laggy anyway.

  • @emmata98
    @emmata98 Год назад

    29:00 Methodologies etc. also differ.

  • @NeoCyrus777
    @NeoCyrus777 Год назад +1

    As a 4090 owner, I have to say nVidia's frame generation borders on a scam.

  • @DJFIRESTONE92
    @DJFIRESTONE92 Год назад +4

    To me DLSS is a must-have feature given how well it performs. Frame gen is a nice-to-have in niche cases, like hitting the refresh rate of your monitor if you're close. I find it only really does what I want when I'm already above 60 fps. Honestly, that is probably its biggest problem: DLSS is applicable in almost every case, whereas frame gen is niche.

    • @Spinelli__
      @Spinelli__ Год назад +1

      Yup. It's too bad DLSS is a game setting rather than a driver setting.

    • @sammiller6631
      @sammiller6631 Год назад

      DLSS is a GAME setting not a card setting. It's a niche thing, not a must have.

    • @Spinelli__
      @Spinelli__ Год назад

      @@sammiller6631 That's the problem. DLSS is great and should be a driver setting so we can use it with any game. I don't care much for frame generation but I wish at least standard DLSS (ie. DLSS 2.0) was a driver setting.

    • @erobbin144
      @erobbin144 Год назад

      ​@@Spinelli__DLSS is always at least going to be game-engine specific. That's how the tech works, it's the whole reason it's better than the general upscaling algorithms we've had for decades. I don't know what the point is of saying that it "should be" something different than what it is.

  • @Wineblood
    @Wineblood Год назад +1

    16:47 4070 is a lower end GPU apparently, time to open your wallets if you want real performance and stop complaining about optimisation

  • @ChrisPkmn
    @ChrisPkmn Год назад +1

    If the pattern holds up: the 10 series was a great improvement & got robbed by crypto, the 20 series was mid, the 30 series was great and robbed by crypto, and the 40 series is mid...

  • @SamCableguy
    @SamCableguy Год назад

    I remember when I was in school and DOOM 3 came out around the same time as the GeForce 6800 and the ATI X850 XT…. Man those were the days😢

  • @Dyils
    @Dyils Год назад

    24:50 It's really not. I think if you consider just the tech in isolation, yes. But you have to consider several things - Nvidia are using framegen benchmarks to mislead consumers into thinking they've made bigger generational gains than they actually have. On top of that, they STILL call it DLSS even though framegen has nothing to do with super sampling. So many people will enable it, they'll get massive framerates and be like "WOW, THIS GPU SO GOOD". And they'll have no clue they have framegen on. Framegen is a scam in the way that it's implemented. Not the tech as a whole. The tech as a whole is just gimmicky, but not a scam. But the way NVIDIA approach it, it might as well be a scam.

    • @xviii5780
      @xviii5780 Год назад

      yeah Nvidia marketing is something else. Same for rtx - originally it was just ray tracing, now they just slap it everywhere.

  • @DavidFregoli
    @DavidFregoli Год назад

    I tried Frame Generation multiple times and never kept it ON, when you have enough FPS it's not needed, and when you don't have enough it doesn't help or makes things worse

  • @Xilent1
    @Xilent1 Год назад +2

    People use "scam" and "unoptimized" to freely these days.

  • @jasonhowe1697
    @jasonhowe1697 Год назад +1

    I think the features that enhance graphics fidelity have pushed the 8, 12, 16, 20 and 24 GB cards to their limit; they can barely keep up with the 4K HDR of 12 years ago, let alone the new stuff we expect GPUs to handle. Speeding up RAM isn't fixing the issue, it's prolonging it; we're at the point where we'd need either 64-256GB of RAM or to go the route of additional GPUs with a TB+ of memory.
    The speed of the RAM is fine; it is the lack of GDDR memory that is causing the bottleneck.

    • @DailyCorvid
      @DailyCorvid Год назад +1

      Nvidia's had all the good ideas you can have, lol.
      I mean, whose idea was it to nerf the 4070 with a puny 8GB?
      My RX 5500 (£200 from 5 years ago) has 8GB of RAM and plays the same games!

    • @jasonhowe1697
      @jasonhowe1697 Год назад

      @@DailyCorvid It matters not that you can run a game on an 8GB card; it's the resolution you can play at with that 8GB card, and the same goes as you walk up the tiers, no matter which camp you sit in as a GPU fanboy.

  • @SpookySkeletonGang
    @SpookySkeletonGang Год назад +6

    If the 4070 Super is the same price as the 4070, but with 16GB of VRAM and even just a little better performance? I'll grab it instantly. But if the price goes up, or it's not enough of an improvement, then I guess I'm holding onto my 5700 XT for another year lol.

    • @TotallySlapdash
      @TotallySlapdash Год назад

      Or you could buy a 6950 XT with 16GB of VRAM at a lower price with the same performance while they're still available (unless power is super pricey where you are). I bought one back in March & it's been a great upgrade for me (mainly because I've been around long enough that I don't buy NV's gaslighting, and think £600 should be the limit for a high-end card).

    • @awebuser5914
      @awebuser5914 Год назад

      @@TotallySlapdash The 6950XT is a pathetic power-hog with non-existent RT capabilities, why would you even consider such a useless brick for that kind of money? P.S. To the OP, the 4070 Super will *not* have 16GB, the platform won't allow for it.

  • @XX-ku7dn
    @XX-ku7dn Год назад

    Hopefully they update the power connector on the super series

  • @Oliver-s8k7k
    @Oliver-s8k7k Год назад +2

    I only appreciate FG when I already have 60+ FPS in a game. It's a fluidity enhancer but NOT a true performance enhancer. That's how I think about it.

  • @WinterEFG
    @WinterEFG Год назад

    Looking at doing a GPU upgrade; the Super refresh might end up bringing some better-value offers from Nvidia. What I'm slightly worried about is that GPUs have gotten so much more expensive that there are readily available price gaps between the models, so it may not end up pushing existing models down all that much.

  • @davidcole2337
    @davidcole2337 Год назад

    With the RTX 4080 Super scheduled to be released in 2024, I fully expect it to be at least mid-2025 before RTX 5000 series GPUs are released. They might be announced by the end of next year, but do not expect them to be available till the middle of the year.

  • @leonthepromoreno
    @leonthepromoreno Год назад

    0:11 Is it only me, or does it seem slightly strange that TS still advertises the 12th gen Intel frame while 14th gen CPUs have just been released? Maybe it's time to move on? 😁

  • @scottc6745
    @scottc6745 Год назад

    Solid intro. Well done.