Hogwarts Legacy PC Performance Analysis

  • Published: Jul 25, 2024
  • Benchmarking GPUs in Hogwarts Legacy is useful, but it can also be misleading, because the CPU can easily be the limiting factor in performance, especially with ray tracing enabled and in certain locations in the game. Here I explain the issues and benchmark a wide variety of GPUs at a range of graphics settings and resolutions in the (CPU-)demanding Hogsmeade location. The GPUs tested include the RTX 4090, RTX 3060 Ti, RX 6700 XT, GTX 1060 6GB, and RX 7900 XTX.
    Test system specs:
    CPU: Ryzen 7700X amzn.to/3ODM90l
    Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
    Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
    RAM: 32GB Corsair Vengeance DDR5 6000 CL36 amzn.to/3u563Yx
    SSD: Samsung 980 Pro amzn.to/3BfkKds
    Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
    PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
    Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
    Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
    Mouse: Logitech G305 amzn.to/3gDyfPh
    What equipment do I use to make my videos?
    Camera: Sony a6100 amzn.to/3wmDtR9
    Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
    Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
    PC Capture Card: amzn.to/3jwBjxF
    Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
    Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
    Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
    Greenscreen: Emart Collapsable amzn.to/3AGjQXx
    Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
    RGB Strip Backlight on desk: amzn.to/2ZceAwC
    Sponsor my channel monthly by clicking the "Join" button:
    / @danielowentech
    Donate directly to the channel via PayPal:
    www.paypal.com/donate?hosted_...
    Disclaimer: I may earn money on qualifying purchases through affiliate links above.
    Chapters:
    0:00 Heavily CPU limited especially with ray tracing enabled (RTX 4090, Ryzen 7700X)
    3:44 DLSS 3 Frame Generation can help when CPU limited
    5:43 CPU limit is reduced with ray tracing turned off (Still RTX 4090)
    8:03 3060 Ti Testing
    12:27 RX 6700 XT Testing
    16:20 GTX 1060 Testing (60 fps is possible if your PC can keep up)
    19:09 7900 XTX Testing Ray Tracing OFF
    20:44 7900 XTX Testing Ray Tracing ON
  • Science
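The CPU-limit diagnosis used throughout the video boils down to one observation: if GPU utilization sits well below saturation while the frame rate stays low, the GPU is waiting on something else (CPU, RAM, or the engine). Here is a small illustrative heuristic of my own devising, not a tool from the video; the sample numbers are made up.

```python
def classify_bottleneck(gpu_util_samples, saturation=90.0):
    """Crude GPU- vs CPU-bound classifier from per-second GPU utilization
    percentages (as shown by overlays like MSI Afterburner or `nvidia-smi`)."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "gpu-bound" if avg >= saturation else "cpu/other-bound"

# Hogsmeade-style capture with RT on: the GPU idles around 60%.
print(classify_bottleneck([55, 62, 58, 61, 57]))   # -> cpu/other-bound
# Native 4K elsewhere: the GPU is saturated and is the limiter again.
print(classify_bottleneck([98, 99, 100, 97, 99]))  # -> gpu-bound
```

In practice a single threshold is too blunt (clock throttling and frame caps also lower utilization), but it matches how the video reads the overlay.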

Comments • 1.5K

  • @danielowentech
    @danielowentech  1 year ago +5

    Here's Part 2 where I had time to test some other CPUs: ruclips.net/video/hO94Ksz6bNo/видео.html

    • @shperax
      @shperax 1 year ago +2

      I noticed that big frame drops coincide with a big drop in GPU power usage. The clocks stay the same, but wattage will drop like 30-50%. When wattage returns to normal, so do your FPS. Your frame buffer % also drops at the same time.

  • @Hardwareunboxed
    @Hardwareunboxed 1 year ago +489

    Hey Daniel, can you try the initial RTX 4090 test again, but this time with DLSS completely disabled?

    • @MrIsmaeltaleb85
      @MrIsmaeltaleb85 1 year ago +74

      What are you doin here Steve?

    • @danielowentech
      @danielowentech  1 year ago +225

      Just fired it back up to check. At native 4K (TAA High, DLSS completely off) Ultra with RT Ultra I'm seeing a bit lower average since it is not fully CPU limited at native 4K. Dropping to 1080p render scale (no dlss) shows about the same ~60fps CPU limit I was seeing in this video. I also tried on my 3440x1440 ultrawide and hit the same CPU limit. Are you seeing different results? I don't have another comparable test bench to test this against.

    • @penis2
      @penis2 1 year ago +35

      quickest pin in the west

    • @oktusprime3637
      @oktusprime3637 1 year ago +7

      @@MrIsmaeltaleb85 What are you doing step Steve?

    • @Hardwareunboxed
      @Hardwareunboxed 1 year ago +151

      @@danielowentech I'm actually testing with the 7700X as well. I'm looking into this now. I wouldn't say I have anything different yet. I haven't tested that section of the game enough, it just looks odd. So you've got me interested now ;)

  • @TahaJelani
    @TahaJelani 1 year ago +427

    Thanks so much for still testing the 1060, literally the most useful benchmark for most people including myself.

    • @MrHoojaszczyk
      @MrHoojaszczyk 1 year ago +15

      Speak for yourself. I'm on 4090.

    • @Tripnotik25
      @Tripnotik25 1 year ago +164

      @@MrHoojaszczyk GTX 1060, still most used card on steam despite its age. RTX 4090, you and 100 others, nobody cares.

    • @TahaJelani
      @TahaJelani 1 year ago +150

      @@MrHoojaszczyk What was the point of your comment? To display your insecurity? Good job you support companies price gouging their customers, and are the reason gpus are about to cost 2000$. Hows that boot taste?

    • @lamikal2515
      @lamikal2515 1 year ago +1

      @@TahaJelani He can't tell how it tastes, unless his rectum have a tongue.

    • @gourab-2129
      @gourab-2129 1 year ago +70

      @@MrHoojaszczyk this comment was made from momma's basement

  • @RobAryeeArc
    @RobAryeeArc 1 year ago +32

    Thanks for the nice range of performance analyses. Hope your kid feels better soon.

  • @ShaneH5150
    @ShaneH5150 1 year ago +6

    a pleasure as always, thanks for keeping us all in the loop

  • @danielowentech
    @danielowentech  1 year ago +33

    Hardware Unboxed has looked into the CPU limit I was seeing here, and has confirmed it. Edit: Originally he thought results were better on Intel, but then confirmed there is actually a menu bug that was enabling Frame Generation without it looking like it was enabled! So if you think you are getting double the FPS that I'm showing here on an RTX 4000 series card maybe try enabling DLSS and Frame Gen, and then turning them both back off, to confirm frame gen is actually off. Link to relevant tweets from Hardware Unboxed: twitter.com/HardwareUnboxed/status/1623793684436381698?cxt=HHwWhMDT1Zbw74gtAAAA
    twitter.com/HardwareUnboxed/status/1623619530143723521?cxt=HHwWgsDTiYjXoIgtAAAA

    • @antipathy17
      @antipathy17 1 year ago +7

      CPU, GPU and Ram are all below 50% usage. Game is poorly optimized and not using hardware assets properly. Likely getting bottlenecked with the engine's use of the hardware or communication to hardware or poor High-level-api optimization. 60 fps while using 1080p upscaled on a 4090 is disgusting.
      You basically say its not a gpu limitation because you don't see the usage high but then make an excuse for the cpu even though its acting similar. That's stupid.
      Also you are using AI generated frame generation to get a more reasonable FPS but it still runs like crap considering your hardware to graphic quality. Like usual the PC version of a game is getting released in an incomplete form... go figure.
      DLSS is going to be a handicap of gaming but it's being sold as a golden goose.

    • @facegamefps6725
      @facegamefps6725 1 year ago

      Lol well few already said that including me. No issues here and also better ram needed. AMD proves again it comes with built in stuttering.

    • @JokerTheDank
      @JokerTheDank 1 year ago

      @@antipathy17 DLSS would be a golden goose if devs didn't just say "oh look, we can avoid doing any optimization"

    • @DeadlyRedRing
      @DeadlyRedRing 1 year ago

      His latest tweet shows that frame gen was accidentally enabled on the Intel test system because of a menu bug

    • @zschade27
      @zschade27 1 year ago +1

      I’m running a 3080 with an amd Ryzen 9 5900x 32 g ram. Almost zero stuttering issues and fps averages 120-130 on ultra settings at 1440p.

  • @gscurd75
    @gscurd75 1 year ago +11

    Love that in the game it uses half the GPU but in the menu you finally get close to 100%

  • @surfx4804
    @surfx4804 1 year ago +16

    One thing I find in games is that crowds tend to tank performance. As the game starts managing the movement of many NPCs, it hits the CPU; you can see this kind of thing in Cyberpunk by changing crowd density. You could try dropping shadows back, as they seem to hit the CPU hard even at higher resolutions.
    I think I might get this game just to play around with things :)

    • @saricubra2867
      @saricubra2867 1 year ago +11

      But Cyberpunk 2077 has WAY, WAY more NPCs than this area of Hogwarts Legacy. It screams bad CPU optimization.
      On a 12-core Ryzen 9 5900X, you can push Cyberpunk's crowd density really high and still have a WAY higher than 60fps experience.

    • @wadimek116
      @wadimek116 1 year ago +2

      It's not only here, though. We have games like The Witcher, Red Dead Redemption 2, and Cyberpunk, which all handle much bigger crowds with way better performance. I thought Cyberpunk was a demanding game, but it's child's play compared to how demanding Hogwarts Legacy is.

    • @G_G_G_G_
      @G_G_G_G_ 1 year ago +4

      @@saricubra2867 It's horrible CPU optimization. I don't know why my CPU usage is at 30% with a 12900K while getting 70fps in Hogsmeade with no RT on a 4090.

  • @robertanderson5735
    @robertanderson5735 1 year ago +6

    Thank you. Always appreciate the time and effort you put into making these videos for us!! Keep up the good work bud!👍

  • @jayjay00agnt
    @jayjay00agnt 1 year ago +3

    I have a 4090 and AMD 7900X and besides the PBO 200mhz overclock, I spent a decent amount of time overclocking my RAM and infinity fabric which has really helped a lot. Hardware Unboxed did a segment with how bad AM5 motherboards do with the secondary timings which make a huge difference for zen4. I was able to get my latency and bandwidth numbers almost 25% better for each and in a benchmark like Tomb Raider Shadows at 1080p highest I am able to get average 303 fps, which is an approximate 20% increase vs no PBO and memory overclock, meaning the memory overclock is the main contributor to that. Would be interesting to see how much it helps you here, if at all. Great video as always.

  • @VRGamingTherapy
    @VRGamingTherapy 1 year ago +16

    So basically frame generation was meant for unoptimized games.
    😅😅

    • @bltzcstrnx
      @bltzcstrnx 1 year ago

      Actually it's for CPU heavy games, such as Flight Simulator and Racing Simulator.

    • @Coccanutzz
      @Coccanutzz 1 year ago +1

      I’m using frame generation on my 4070ti and still having issues

  • @Kenny5436
    @Kenny5436 1 year ago +4

    Love the breakdowns with the different cards. Sounds like your kid has what my kids have. Hope they get better soon!

  • @Yahfz
    @Yahfz 1 year ago +13

    I'm getting over twice your performance (110fps+) on my tuned 12900K at the same settings and same area with the 4090. You're just completely CPU/RAM bottlenecked.
    I can record a video and post it on my channel if you want proof.

  • @Oshaoxin
    @Oshaoxin 1 year ago +24

    Something is causing the game to have a persistent fog everywhere. It's not just faded-out textures like DLSS would produce, but actual fog. You can see the game without fog for a split second when you exit an interior like the Three Broomsticks, before the fog pops back in. So it's a filter doing this, because filters switch to create better ambience in different rooms.

    • @alyssarasmussen1723
      @alyssarasmussen1723 1 year ago +4

      i think i saw a mod that got rid of the fog or something like that

    • @RickyRozayyy
      @RickyRozayyy 1 year ago

      Is this only on pc or consoles as well?

    • @Oshaoxin
      @Oshaoxin 1 year ago +7

      Also, I must note this is set in the UK, where persistent fog is a common occurrence.

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe 1 year ago +1

      @@Oshaoxin that's just not true lol. I live here and it's not foggy

    • @Oshaoxin
      @Oshaoxin 1 year ago +2

      @@JudeTheYoutubePoopersubscribe Jokes are always true. Even if they're not

  • @danielmastia87
    @danielmastia87 1 year ago +116

    Would love to know how much of an impact Denuvo is making here.

    • @GentlyUsedFrog
      @GentlyUsedFrog 1 year ago +2

      I've watched quite a few vids about Denuvo and benchmarks with/without Denuvo and on most games it seemed to affect loading times and 0.1% lows the most. Guessing it works the same here, not sure about that.

    • @alexorth5172
      @alexorth5172 1 year ago +39

      @@GentlyUsedFrog It causes stutters though

    • @GentlyUsedFrog
      @GentlyUsedFrog 1 year ago

      @@alexorth5172 Yes, I mentioned that when talking about the 0.1% lows.

    • @ihmpall
      @ihmpall 1 year ago

      Stfu and buy the game

    • @devindykstra
      @devindykstra 1 year ago +8

      In my experience Denuvo doesn't really affect average fps too much, but it does create awful frame-time spikes if not carefully optimized. Removing Denuvo might help a bit, but it won't fix the problem entirely.

  • @cizzymac
    @cizzymac 1 year ago +160

    It's going to be VERY interesting to see how the game performs once Denuvo is cracked. Watch us have another issue with it like RE8.

    • @fParad0x_
      @fParad0x_ 1 year ago

      A friend of mine said that the companies actually pay the crackers for the game not to be cracked for some time. It's usually 1 or 2 years depending on the game.

    • @CaptToilet
      @CaptToilet 1 year ago +8

      RE8 was a stutter issue and not a gpu usage issue. And that wasn't with denuvo it was with capcoms own proprietary DRM coinciding with all the other crap companies think we need.

    • @NGBH
      @NGBH 1 year ago +30

      @@samhhhhh I think Empress (even with all her issues) still cracks Denuvo. She actually stated she'd try to crack Hogwarts as fast as she can, if the news I saw was true.

    • @stavroskoul8782
      @stavroskoul8782 1 year ago +12

      @@samhhhhh a few days ago she said it would be cracked within the next 10 days. So within like 4 more days or so, if she wasnt lying, it will be cracked

    • @captain4318
      @captain4318 1 year ago +7

      @@samhhhhh Only 1 left. It's true, a lot of Denuvo games remain uncracked.. but the last one standing really wants to crack this one, so it will likely happen.

  • @JokerTheDank
    @JokerTheDank 1 year ago +45

    I have a feeling this game was made with direct storage in mind on consoles, and the PC port is either not using it or using it poorly. It looks like it loads and releases assets from the disk constantly, and it just can't load them fast enough

    • @Kizzster
      @Kizzster 1 year ago +6

      I think it's just an open world game issue, Modern OWG titles are too demanding. On console this game doesn't even run native 4K

    • @JokerTheDank
      @JokerTheDank 1 year ago +13

      @@Kizzster On PC hardware comparable to a PS5, this game chugs. It's not performing the same.

    • @znubionek
      @znubionek 1 year ago +6

      its just typical Stutter Engine 4. next with those issues will be jedi: survivor (just like jedi: fallen order had it)

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe 1 year ago

      Crap engine.

  • @ksks6802
    @ksks6802 1 year ago +5

    HDR is bugged in the newest Nvidia drivers. I lost 10 frames having HDR on in Doom Eternal. Dead Space didn't seem to be affected; I didn't see a hit there. Worth mentioning

  • @matc2094
    @matc2094 1 year ago +1

    Great content as always, Dani! Hope your child gets better! Greetings from ARG.

  • @SuperSuchties
    @SuperSuchties 1 year ago

    THANK YOU SO MUCH for making a deep dive.

  • @thecarsonigen
    @thecarsonigen 1 year ago +4

    If they allowed exclusive fullscreen, I am sure that would allow a few more frames vs. windowed fullscreen. I don't understand why they didn't add this option.

  • @alphabrainwave
    @alphabrainwave 1 year ago +3

    Thanks for making a no-nonsense video strictly about the performance of this game. I wonder if this engine isn't fully utilizing GPU because the developers have designed it to be more compatible with the Nintendo Switch and other GPU-limited systems. But I guess the RAM and CPU of the Switch aren't that advanced either, so there's a good chance I have no idea what I'm talking about. Wishing good health and magical gaming memories to your family.

  • @maximus574
    @maximus574 1 year ago +1

    Hope the kid gets better. Great video. I was wondering about this exact thing. Thanks for the info.

  • @RektemRectums
    @RektemRectums 1 year ago

    Thank you for this! Very useful tips. Also your speech has improved significantly and really easy to listen to. Hope your daughter feels better! Thanks. Subscribed.

  • @wpelfeta
    @wpelfeta 1 year ago +4

    Hmm. I guess I'd be interested in seeing the performance of different CPUs.

  • @ImEverlove
    @ImEverlove 1 year ago +20

    Between The Callisto Protocol, the Dead Space remake, and now Hogwarts Legacy, it seems PC ports are suffering, and it's really sad to see. I understand everyone in the comments seems to be running fine, but you're not everyone, and there are people with issues. It's sad that most reviews for Dead Space or Hogwarts don't mention these issues. The easy option is just to turn off RT, right? Well, if your game offers an option, I'm going to judge it as a whole and not pick and choose what I review. There are games that handle RT implementation well. This is not one of them as of right now, and it should be a con, not overlooked with "just turn it off." That's not a solution.

    • @JarvisJarvis
      @JarvisJarvis 1 year ago +4

      Should we add denuvo into equation? Considering both titles has DRM in it.

    • @ryanespinoza7297
      @ryanespinoza7297 1 year ago +2

      Dead Space remake runs fine compared to the other two. I played it on my 4080 at 4k ultra RTX on and on my 2080ti at 4k medium dlss quality and the game had a consistent frame rate in each area.

    • @DelfinGames
      @DelfinGames 1 year ago +1

      simple explanation, we've come full circle. back in the 360/ps3 day's most of the console to pc ports sucked. we're basically going back to those days....good times

    • @ImEverlove
      @ImEverlove 1 year ago +3

      @@ryanespinoza7297 Again, that's great, dude. But you're not everyone, are you? Search around. You'll see people with massive CPU and other frame issues in Dead Space. People with 4090s.

    • @MLWJ1993
      @MLWJ1993 1 year ago +3

      @@JarvisJarvis DENUVO is known to be shit for the consumer. Murders everyones enjoyment of games while offering those that pirate it the better experience 😆

  • @Whiteshade
    @Whiteshade 1 year ago +1

    Hi, did you experience stutter with cutscenes, new environments, or new UI elements?
    I'm getting 100-120 fps with stable frametimes at 4K DLSS Balanced, high settings, ray tracing off, but whenever I see something new or enter a new area I get stutters. It looks like an Unreal Engine shader issue to me; the game does pre-cache shaders, but I don't think it's enough, because compiling them takes such a short time. For example, Star Ocean: The Divine Force has stutters if you don't cache the shaders in the options, but if you choose to cache them, it takes the game 20 minutes. That's why I don't think Hogwarts Legacy is caching all of its shaders, and that's what causes my stutters.
    RTX 3070
    Ryzen 5600X
    32GB 3600MHz RAM
    Samsung 980 Pro SSD
    Seasonic Focus Gold 750W PSU
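The shader-caching hypothesis above lines up with a widely circulated, user-discovered (not official, and unverified for this game) Unreal Engine 4 tweak: forcing the engine's pipeline-state-object cache on via the per-user Engine.ini. The file path is an assumption about where this game keeps its config; back the file up before editing.

```ini
; Assumed location: %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini
[SystemSettings]
; Standard UE4 console variable that enables the shader/PSO pipeline cache.
r.ShaderPipelineCache.Enabled=1
```

Whether this reduces the new-area stutters described here depends on the game having shipped a usable PSO cache; treat it as an experiment, not a fix.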

  • @arsenilazarevich8779
    @arsenilazarevich8779 1 year ago

    Glad I found your channel, Daniel; it helped me decide which GPU to buy :)

  • @SebasXgaming
    @SebasXgaming 1 year ago +15

    I've personally noticed quite a bit of difference by manually switching the DLSS version to 2.5.1. It definitely didn't fix everything, but I am curious how much difference your setups would show, if you're not already using 2.5.1 in this video, of course! Would love to see a comparison video :)

  • @LeateqOfficial
    @LeateqOfficial 1 year ago +12

    I got it for PC at first, but I had really weird performance problems on an RTX 3070 / Ryzen 5900 at medium-high 1440p.
    Then I decided to try it on PS5, and the PS5 runs this game perfectly.
    The only problem is that the dynamic resolution in the 60 fps mode is a bit much sometimes.

    • @bearpuns5910
      @bearpuns5910 1 year ago

      Are you using dlss?

    • @LeateqOfficial
      @LeateqOfficial 1 year ago

      @@bearpuns5910 I had a good frame rate, but it would drop like crazy to 20 fps sometimes, and the game felt like it stutters.
      I've never seen this before, and the PS5 is just a perfect 60 fps with no stuttering or any other problems.
      I'd played for about 8 hours already, so I decided to refund the PC version and keep the PS5 one.

    • @darkbustergt8085
      @darkbustergt8085 1 year ago

      @@bearpuns5910 dlss barely does anything. i’m playing at 1080p with medium high settings and i’m getting anywhere from 40 to 100 fps

    • @runek100
      @runek100 1 year ago

      @@darkbustergt8085 Isn't that because the game is not optimized well? Because DLSS helps a lot in many games.

    • @kenshinhimura9387
      @kenshinhimura9387 1 year ago

      @@bearpuns5910 You should never use DLSS garbage. It's fake frames and makes the game laggy and blurry

  • @MannyMorningstar
    @MannyMorningstar 1 year ago

    Great video!! Did you check the cpu clock speed? That should be an important factor

  • @moldyshishkabob
    @moldyshishkabob 1 year ago

    This is some epic benchmarking!
    And hope the kiddo feels better soon! Don't know what kind of potion is needed for that cold. Not a licensed wizard, myself.

  • @Azalis1701
    @Azalis1701 1 year ago +3

    I wonder if Unreal 5 putting ray tracing on the software side will improve performance in future games, especially for the console versions. In Fortnite it looked like the AMD GPUs were seeing a huge jump in performance with UE5 RT.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      Game developers will probably ditch software-based RT altogether in the future: use hardware-accelerated RT, or no RT at all. It's similar to what happened years ago with DX9, DX10, and DX11. Consoles were still using DX9 by the time DX11 arrived in 2009, and rather than building their game for each API, most developers ended up skipping DX10.

  • @prototype8137
    @prototype8137 1 year ago +5

    I'm using a modest 10850K, DDR4, and a 3080, and I'm getting about the same performance at 2K, but I do see stutters once in a while.
    Surprised a system like yours isn't performing much better.

    • @GokuCOD
      @GokuCOD 1 year ago

      I got similar results on 1440p dlss quality, all on ultra rt off
      My pc has a 12700, 3080 and 32gb ddr4 3600mh, frames never go under 75 in hogsmeade, and the gpu utilization is super low btw, like 60% or 70% most of the time 🙄

  • @blaaaaaaaaaaaaaaa303
    @blaaaaaaaaaaaaaaa303 1 year ago

    hello Daniel, what is the name of the app which you are using for the measuring (on the left side of the screen)? I want to check it out and then test it also on my pc

  • @mr.jordan6626
    @mr.jordan6626 1 year ago

    Hope your kid feels well soon! Great vid

  • @ojonathan
    @ojonathan 1 year ago +140

    I did a full analysis of this game myself, just for fun, using Radeon GPU Profiler (RGP). I was looking for a good comparison video to base my findings on; you hadn't posted yet when I tweeted my findings and wrote them up on Reddit, but this is still very good, since it basically validates some of them. So I'll share what I found here as well.
    When there is no VRAM oversubscription (video memory spilling into system memory), you become CPU limited, like what happens with the 4090. Even on mid-range GPUs, if there's no VRAM oversubscription you become CPU limited. RGP is very good at telling whether you're GPU or CPU limited, but I think it only works with AMD GPUs; I know Nvidia has an equivalent too, but I don't know if it has this feature.
    However, the stutters and performance issues are mostly issues with the game itself. The game may need more than 12GB of VRAM for 1440p on some occasions¹; if you have less than that, it will spill over into system memory and cause massive stutter and frame drops, as well as low GPU usage and even low CPU usage. Because of the way the AMD driver reports GPU usage, you may still see it report 100% GPU usage, but that's because AMD reports based on busy time². If you look at GPU temperature and power draw, you can see those values dropping whenever you experience massive FPS drops. RGP also claims you're GPU limited, but that's not true: the GPU is busy waiting on atomic synchronization, and the real bottleneck is system memory; more on that below.
    The complete explanation, summarized: the game uses a lot of video memory, which eventually gets spilled into system memory. The game then tries to access the spilled memory, which causes the driver to issue memory-synchronization calls to ensure correct synchronization of system memory (that's a complex topic, so I'll not speculate on why it is so slow here; I would have to investigate). Those calls use GPU time but don't really use computational power (in simple terms, the GPU is basically sleeping, although that's not the precise explanation of atomic synchronization). It may seem that the CPU is the bottleneck when system memory actually is, but the VRAM shouldn't spill over to start with. The game is managing memory badly.
    The high video-memory usage mostly comes from things that I think didn't need to be in VRAM to start with (Radeon Memory Visualizer can show what is being placed in VRAM). Increasing resolution increases the VRAM used, so decreasing resolution reduces memory usage, but only to an extent, since there is a baseline determined by the structures placed in VRAM at the start of the game. That baseline is probably constant, with no way to reduce it; and since it is mapped at startup, it will always stay in VRAM, and everything else gets spilled over into RAM.
    ¹: Your video showed that at 1440p it didn't reach the 12GB mark, but based on some reports it may at some point. So 12GB is enough for 1440p; just don't expect to never see any issues — you may, although rarely.
    ²: AMD measures the time the GPU is busy doing something, which also includes waiting on memory synchronization; the GPU cannot tell whether it's waiting on VRAM or system memory, so it reports both as GPU usage. This is much more complex in reality, because memory-synchronization instructions may be waiting on other work-groups running on the GPU; the driver cannot really tell whether it's waiting on actual work or just on data being copied over, since there's no distinction (there may be a way for drivers to do this, but AFAIK, after looking at Radeon's ISA, it may not be practical).
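The oversubscription effect described above can be put in back-of-the-envelope numbers. This is my own toy model, not from the comment; the allocation sizes and the per-GB penalty constant are invented, illustrative values.

```python
def spilled_gb(allocations_gb, vram_gb):
    """GB of allocations that no longer fit in VRAM and spill to system RAM."""
    return max(0.0, sum(allocations_gb) - vram_gb)

def frame_time_ms(base_ms, spill_gb, penalty_ms_per_gb=2.0):
    # The penalty constant is purely illustrative; the real cost depends on
    # PCIe bandwidth and how often the spilled pages are actually touched.
    return base_ms + spill_gb * penalty_ms_per_gb

workload = [9.5, 2.0]  # hypothetical VRAM requests at 1440p, in GB
print(frame_time_ms(12.0, spilled_gb(workload, vram_gb=12.0)))  # 12.0 ms (~83 fps)
print(frame_time_ms(12.0, spilled_gb(workload, vram_gb=8.0)))   # 19.0 ms (~53 fps)
```

The point of the model is the asymmetry: below the VRAM capacity nothing happens, and the moment the working set crosses it, every frame that touches spilled pages pays, which is why the drops look like stutter rather than a uniformly lower average.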

    • @muhammadmesum8529
      @muhammadmesum8529 1 year ago +9

      Who writes so much?

    • @XxLE6ITSWAGxX
      @XxLE6ITSWAGxX 1 year ago +16

      No shittt, so my 3080 FE is already dated? Fuck PC gaming lmaoo.. Sooo many bullshit optimization issues I never experienced on console

    • @davidjavier7688
      @davidjavier7688 1 year ago +25

      @@XxLE6ITSWAGxX Dude chill, if its a game problem it will be patched, if it's a driver problem there will be an update to fix the issue. PC gaming is the best, i have a modest 1060 and it can play any new game on full hd so your 3080 it's not dated and will not be for a long time. A ps5 is close to a rtx 2070 so you are way beyond what consoles can do.

    • @stephenpourciau8155
      @stephenpourciau8155 1 year ago +1

      Any reasons why you would think my 5800x3d + 4090 system is literally producing in most cases double his fps, if not triple on occasion?

    • @XxLE6ITSWAGxX
      @XxLE6ITSWAGxX 1 year ago +3

      @@davidjavier7688 Hahah I’ll take your word man, the unreal engine games killl me thoughh, thanks for the small bit of optimism
      hoping you’re right
      Also 1060 ? that’s pretty badass to hear it’s still kicking strong hope it lasts you a few more years until and if you upgrade

  • @japanesesamurai4945
    @japanesesamurai4945 1 year ago +15

    Hi, do you think the devs can fix this cpu/ram bottleneck? Because if not, then the system requirements are misleading

    • @calamdumr
      @calamdumr 1 year ago

      Exactly what I'm thinking; there's no way it should need that much RAM and still be unstable at just 1080p here, and I use a 4070 Ti.

    • @diddykong7354
      @diddykong7354 1 year ago

      Not really; it's a last-gen game. It's supposed to be about gameplay. Not once did they boast about amazing graphics.

    • @japanesesamurai4945
      @japanesesamurai4945 1 year ago

      @@calamdumr Especially when the PS5 has 16GB of memory allocated between the GPU and system.

    • @bltzcstrnx
      @bltzcstrnx 1 year ago

      @@japanesesamurai4945 The PS5 has guaranteed SSD performance. This means developers can depend on its SSD without having to allocate large caches in RAM. There are no guarantees of high-performance storage on PC, so I guess current-gen developers are relying on RAM to offset storage performance on PC now; hence the high RAM usage in games designed for current-gen consoles.

    • @japanesesamurai4945
      @japanesesamurai4945 1 year ago

      @@bltzcstrnx But this is the first game where we're seeing such a problem only on PC and not on consoles.

  • @phenix5609
    @phenix5609 1 year ago

    Hi, interesting: you have the most powerful setup that currently exists and barely play at 60 fps with RT on. Outdoors it doesn't bother me much, because the effect of RT is almost invisible, but indoors (in Hogwarts particularly) is where RT shows the difference; it's sad we can't even play with it enabled without dropping to really low fps (between 10 and 25 fps in my case).
    I have a 5900X CPU and an RTX 3080, and without RT in Hogsmeade I get the same fps you get, between 80 and 90, at ultra settings with DLSS Quality at native 2K; but with it activated I drop to around 30 or lower.
    Another question: does Frame Generation only work with the 4000 series, or do we need to activate something in Nvidia's settings somewhere to be able to use it with a 3080?

  • @AwsomeALEC72
    @AwsomeALEC72 1 year ago +1

    Is the in Game time of day different between the testing on the 4090 vs the 6700xt? The lighting looks way fainter with the AMD card.
    Edit: it’s FSR. You can see the difference distinctly when he’s testing the 7900xtx

  • @Aurummorituri
    @Aurummorituri 1 year ago +7

    Thank you so much for this. Same problems with a 3090 and 9900k. Huge stutters, awful frames, and super low gpu utilization. The intro gets you and then reality hits. It’s pretty awful.

    • @colinparks619
      @colinparks619 1 year ago +1

      Just got a 3090... Not good to hear this, as I was going to get it on PS5; but as usual, we're getting a bad experience on PC while it's more playable on consoles.

    • @Aurummorituri
      @Aurummorituri 1 year ago

      @@colinparks619 Its pretty awful. Keep RT off and the stutters will be less.

  • @wimpie031
    @wimpie031 1 year ago +6

    Wondering what the increased L3 on the 5800x3d does to the game. Very educational again Daniel, thank you for testing! Hope the little one gets better soon :)

    • @nepnep6894
      @nepnep6894 1 year ago +1

      Not much at all. RAM bandwidth scales a lot more with RT especially with UE4/5. Ideal config is a raptorlake cpu with hynix ddr5 and tuned subtimings.

    • @Wonkanator2
      @Wonkanator2 1 year ago +3

      @@nepnep6894 could just be how powerful the 4090 is but with my 5800x3d and ftw3 ultra 3080 all ultra settings dlss quality I'm always full gpu utilization in this at 3840x1620 ultrawide config

    • @xTurtleOW
      @xTurtleOW 1 year ago

      It runs basically the same as a 5800X; the game needs CPU power, not cache.

    • @TheWretchedWorld
      @TheWretchedWorld 1 year ago

      Have the same config runs like shit

    • @lorsch.
      @lorsch. Год назад

      @@Wonkanator2 with max settings and RT enabled, you'll probably see similar dips below 60 in this area even if you lower resolution / increase dlss to get outside the gpu limit.

  • @savevsdeath
    @savevsdeath 1 year ago

    Thank your little one for letting you take a break and make a video for us.

  • @PRODBYNDRU
    @PRODBYNDRU 1 year ago

    Hey, great video! I have a question for you about my system's performance. I have an RTX 3080 (10GB), a Ryzen 5800X, and 64GB of DDR4 RAM at 3200MHz. I have played around with the settings a bunch and I am unable to achieve what you achieved in Hogsmeade with the exact same settings you used with your 3060 Ti. You were floating around 70-90 FPS; I can only achieve 50-60 FPS with dips into the 30-40s and brief jumps to the 70s. I thought of a CPU bottleneck, but I saw a YouTube video of someone benchmarking the Ryzen 5600X, 5800X, 5900X, and 5950X with modern GPUs as high as the RTX 3090, and they all achieved the same performance. So I assume I am not bottlenecked by my CPU. I am lost as to where my performance loss can be.

  • @asdf51501
    @asdf51501 1 year ago +3

    It'll be interesting when I try this on a 7950X system with an Arc A770. Not expecting greatness there, lol. However, I do have a 7900 XTX on the way, which will go into a 7950X3D system next month when that monster comes out.

    • @killer1275
      @killer1275 1 year ago +3

      I've seen a video from a smaller channel and the A770 performs really well. Would be good to see it tested as well!

  • @darknight991
    @darknight991 1 year ago +3

    Hey Daniel, I'm using a 4090 with a 5800X3D. I got about 20-30% better performance by updating the DLSS version from the one included with the game to the latest version (2.5.1, AFAIK), which also evened out the frame times. Could you try that next?

    • @andreiga76
      @andreiga76 1 year ago

      I can confirm that using DLSS 2.5.1 you get better performance and also fewer graphical issues with Frame Generation enabled; the game ships with a pretty old version, 2.3.11. My computer has a 5950x and a 4090. The game is also allocating a lot of memory: 22GB of system RAM used on my 64GB system, and 16GB of VRAM.

    • @xcom9648
      @xcom9648 1 year ago

      @@andreiga76 How do you change the DLSS version?

    • @naty_53
      @naty_53 1 year ago

      ward
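    [Editor's note] The DLSS update the replies above describe is just a file swap: replace the game's bundled nvngx_dlss.dll with a newer one, keeping a backup. A hedged sketch follows; the paths are placeholders (the real DLL sits in the game's install folder), and the demo stand-in files exist only so the sketch runs anywhere.

    ```python
    # Sketch of the DLL swap described above. game_dir / new_dll are
    # PLACEHOLDERS; point them at the real install folder and a newer
    # nvngx_dlss.dll obtained from a trusted source.
    import shutil
    from pathlib import Path

    game_dir = Path("/tmp/hl_demo/game")               # placeholder install dir
    new_dll = Path("/tmp/hl_demo/nvngx_dlss_new.dll")  # placeholder downloaded DLL

    # --- demo stand-ins so this runs anywhere (remove for a real swap) ---
    game_dir.mkdir(parents=True, exist_ok=True)
    (game_dir / "nvngx_dlss.dll").write_text("2.3.11")  # version the game ships with
    new_dll.write_text("2.5.1")

    # --- the actual swap: back up the shipped DLL, then overwrite it ---
    shipped = game_dir / "nvngx_dlss.dll"
    shutil.copy2(shipped, shipped.parent / (shipped.name + ".bak"))  # restore point
    shutil.copy2(new_dll, shipped)

    print(shipped.read_text())  # 2.5.1
    ```

    Keeping the `.bak` copy means the game can be restored to its shipped DLSS version if a patch or anti-cheat check objects to the replacement.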

  • @OGNetro
    @OGNetro 1 year ago +2

    What is the tool you use to measure frames in the top left? I cannot find the name of it for the love of god. I also want to say I have a 13900K and a Gigabyte 4090, and with a high refresh rate and high frame rates the stutters in this game are way more visible. I am sure they still happen at lower frame rates; the fewer the frames, the less likely you are to even notice a difference. I tried setting the in-game 75 FPS limit and it seemed less stuttery.

    • @D3nsity_
      @D3nsity_ 1 year ago +1

      It's MSI Afterburner. You just have to configure the settings so that you have that information in the OSD; it's under the Monitoring tab.

    • @dddddddddddddddadddddddddddddd
      @dddddddddddddddadddddddddddddd 1 year ago +5

      It's RivaTuner Statistics Server; it gets bundled in with MSI Afterburner too.

  • @CRBarchager
    @CRBarchager 1 year ago +1

    The RAM requested for the entire video was about 20GB, while actual usage was around 14GB. It would be interesting to see if lowering total memory to 16GB would reproduce some of the problems people have mentioned. Even though most of Windows is deprioritized while gaming, could having only 2GB to spare for the rest of the system be the reason people are having problems?

  • @mtlcub
    @mtlcub 1 year ago +4

    For me, I'm on a 4080 (with the same CPU, 7700X).
    I was able to play the game at full 4K ultra, but with RT off... and easily get 70-90 FPS.
    If I want RT on, I need DLSS for sure... and the FPS will still be low...
    But I'm totally fine playing it at full 4K ultra with no RT. And the game is so pretty!

    • @Megatrance9000
      @Megatrance9000 1 year ago

      Same here. I didn't really notice too much of a difference with RT in this game, so I ended up turning it off. On my 5800X3D/4080 I went with 4K Quality DLSS and I'm able to reach the 116 FPS cap pretty frequently, although in some areas I've seen it dip into the 80s.

  • @BlindBison
    @BlindBison 1 year ago +12

    It sucks that RT is so heavy on the CPU and VRAM, tbh. It's strange to me that 8GB cards are now in hot water when it comes to their VRAM budget. Nvidia skimped too hard on the 3000 series, it would seem. IMO the 3070 should've had at least 10GB while the 3080 should've had at least 12.

    • @anorax001
      @anorax001 1 year ago +1

      The ironic thing is the 30 series was lightyears ahead of the 20 series for RT. I still think RT is a work in progress, even with the latest 40-series generation. Plus, for Hogwarts I think they set the minimum system requirements far too low and should have been far more granular with their specification requirements for RT on or off.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +1

      Well, mining happened. Because of mining, Nvidia ended up cancelling their 3070 16GB, 3080 20GB, and 3080 Ti 20GB. Nvidia was probably afraid of the post-mining craze, where cheap cards with lots of VRAM would flood the market. Imagine a 3080 20GB ending up at 300-400 on the used market. With VRAM still limited to 10GB and lower, people doubting those cards will be "encouraged" to take the new 40 series that has more VRAM.

    • @merlin7800
      @merlin7800 1 year ago

      @@anorax001 RT on the 4000 series is great and really usable; I'm playing the game at 160 FPS with the 4090 (but with DLSS 3.0 activated) and the game looks gorgeous.

    • @lorsch.
      @lorsch. 1 year ago +1

      I thought I'd be safe with the 5800X3D without upgrading for the next 5 years, but quickly noticed this. It seems DLSS 3 will soon become a necessity for playing with RT; a shame it's locked to the 40 series.

    • @runek100
      @runek100 1 year ago

      @@merlin7800 Yeah, anything with DLSS looks great; it really is free FPS. Can't imagine what they can do with it in the future.

  • @J0ttaD
    @J0ttaD 1 year ago +2

    Daniel, have you checked if it's a RAM bottleneck? I've noticed 32GB of RAM gets maxed out with the pagefile disabled: physical RAM sits around 50%, but virtual memory maxes out, which is weird. I will never understand that virtual RAM stuff. So I have to enable the pagefile, and some games then write to the SSD. I feel that is part of the issue; you guys should try using 64GB of RAM, or disabling the pagefile if it doesn't crash.

  • @Hollowtriangles
    @Hollowtriangles 1 year ago +2

    Thanks for testing the 3060 Ti; I'm using a 3070, so I was getting a little worried. But it just seems to prove the theory that it's limited by something else, maybe the CPU.

  • @raminsameni1063
    @raminsameni1063 1 year ago +63

    Tell your daughter thanks from us enthusiasts for letting you do this today 😂 and we hope she feels better. I feel lucky to have a 40 series for these demanding titles. I feel like a shill, but Nvidia does give a great experience, ONLY IF you aren't hung up on frames per dollar and just want the best overall "experience". I love AMD and hope that FSR 3.0 is as good as frame gen/DLSS 3.0.

    • @jayjay00agnt
      @jayjay00agnt 1 year ago +2

      Same, lol. I have an AMD CPU, but the performance numbers are what they are, and I'm loyal to performance.

    • @TheDude50447
      @TheDude50447 1 year ago

      Having owned a bunch of GPUs over the past decades, and having tested a multitude more, I can say that there is no visible difference in picture quality between AMD and Nvidia cards today. Then again, I just run games native and don't use FSR or DLSS. I have tested ray tracing, but it's only a slight difference in picture quality, and I wouldn't say it's actually better, just a bit different. If it didn't tank the framerate I'd say I'm neither here nor there about it, but considering that fact I don't see a reason to ever activate it.

  • @t1e6x12
    @t1e6x12 1 year ago +10

    It would've been nice to have the A770 LE included in this test!

    • @fm1798
      @fm1798 1 year ago +5

      No bot uses that shi 😂

    • @t1e6x12
      @t1e6x12 1 year ago +5

      @@fm1798 ? It only makes sense to include it.

    • @bigbob3772
      @bigbob3772 1 year ago

      @@fm1798 BS, it runs this game perfectly. ruclips.net/video/beT2EBXDPpY/видео.html

    • @OUBrent1
      @OUBrent1 1 year ago +7

      @@fm1798 I use one and would definitely like to see how it performs with this game before I spend the $$ on it.

    • @user-ye7lp9lg1c
      @user-ye7lp9lg1c 1 year ago

      @@OUBrent1 Check it yourself and then refund, or wait for a crack.

  • @Hanenb0w
    @Hanenb0w 1 year ago

    Great guide, thank you so much for this.

  • @js100serch
    @js100serch 1 year ago

    I'm playing on a 4K TV. Is there a difference between setting my desktop resolution to 1080p and playing the game like that, or leaving my TV at 4K and using the 1080p rendering resolution setting? I'm playing on a Ryzen 2600X and an RTX 2060 Super.

  • @thedeejlam
    @thedeejlam 1 year ago +3

    This is, in a way, a reflection of how many games go out into the wild these days: not fully baked. We can see evidence of this in the pressure to realize a faster ROI, starting with things like open "early access" in lieu of the longer, staged betas of the past. Full launches of even triple-A games now need months of post-release fixes for optimization at best... and disaster recovery at worst. And as teams have no time before DLC also needs to go out the door, this has become standard fare across gaming, which in turn has normalized it.

  • @fmrc69
    @fmrc69 1 year ago +4

    I have a 5950X with 64GB of DDR4 RAM and an RTX 3070... the game was hitting 10-20 FPS in Hogsmeade and outside. This is a game or driver issue, imo.

    • @Pablud3S
      @Pablud3S 1 year ago

      What the fuck... I have a 3080 12GB and a 5900X and I run ultra at 4K with Quality DLSS, RT off. RT KILLED the game at random points, dropping from 60 to 5 FPS for up to 30 seconds at seemingly random times, but with RT off I've only had that happen once in like 20 hours...

  • @todda6943
    @todda6943 1 year ago

    I have a question, don't know if you can answer it or not. With DLSS on I was getting like 144 frames inside and like 80 outside with 50% GPU usage. I turned off DLSS and got about a 30% jump in FPS at the same settings: 1440p ultra with a 3700X, 32GB of DDR4 at CL14, and an EVGA 3080 Ti Ultra Gaming. With DLSS off, GPU usage went up to 80-85%, but so did my frames. I thought DLSS was supposed to get you more frames; why do I get fewer? Is it because I am CPU-bound with DLSS?

  • @pablonaranjo730
    @pablonaranjo730 1 year ago

    I am having a problem starting the game. Yesterday I could play, but I set my settings to high with a GTX 1060. Today the game doesn't start; it just remains on a black screen. Can I change the settings from outside the game, or is there another option? My CPU is an Intel i7-8700K and I have 16GB of RAM at 3200MHz.

  • @exactom6320
    @exactom6320 1 year ago +7

    Hogsmeade brought my 3080/5900X system to its knees. I hope it gets better with patches/drivers.

    • @SuperYtc1
      @SuperYtc1 1 year ago +1

      Just play 1080p.

    • @exactom6320
      @exactom6320 1 year ago +2

      @@SuperYtc1 no

    • @user-ye7lp9lg1c
      @user-ye7lp9lg1c 1 year ago

      It will get better with a crack

    • @shanegarmon4516
      @shanegarmon4516 1 year ago

      @jkteddy77 Anything more than 1080p and he will go over 10GB of VRAM. Even at 1080p high I've seen it use 9.5GB of VRAM. It's probably not a CPU bottleneck; it's probably a VRAM one. That's a real bummer.

    • @exactom6320
      @exactom6320 1 year ago

      @Shane Garmon Yeah, I think you're right. A friend of mine has a 6800 XT and he has no issues. I'm gonna try messing with settings more when I play tonight.

  • @PeterPauls
    @PeterPauls 1 year ago +11

    Not finished the video yet, but it's interesting that "new" titles which heavily rely on frame generation, like The Witcher 3 next-gen and now Hogwarts Legacy, are CPU bottlenecked. Coincidence? I don't think so. (But I love my theories.)

    • @maxieroo629
      @maxieroo629 1 year ago +1

      Interesting either way, but are you saying that you think devs are being lazy because they don't have to optimize as much, or that Nvidia is making them CPU-bottleneck the game in order to push DLSS 3 frame generation?

    • @xtr.7662
      @xtr.7662 1 year ago +5

      No, it's because of Denuvo.

    • @PeterPauls
      @PeterPauls 1 year ago +1

      @@maxieroo629 Yeah, this is my theory.

    • @PeterPauls
      @PeterPauls 1 year ago

      @@xtr.7662 Could be, but I have my theory; don't take it away from me, please.

    • @alexunkin
      @alexunkin 1 year ago

      It's more a case of old game engines and ray tracing making for a bad combination.

  • @hareshgamer8990
    @hareshgamer8990 1 year ago +1

    I have an RTX 3060 12GB; is there a big difference between the Ti and the normal one?

  • @DanielTralamazza
    @DanielTralamazza 1 year ago

    It could be RAM (both size and bandwidth) limited. Did you try ReBAR on/off?

  • @KaranBenchmarks
    @KaranBenchmarks 1 year ago +6

    With my 14-core i5-13600K it runs absolutely fine without any bottleneck, even with a 3060, but the RAM management is terrible: 20-24GB of RAM for 1080p ultra, lol!!

    • @ZealousPumpkinTV
      @ZealousPumpkinTV 1 year ago

      That's exactly my rig: i5-13600K stock + RTX 3060 12GB Asus Strix, and I have 16GB of RAM at 3,600MHz (2x8GB). I haven't bought the game, but if I ended up doing so it would be to stream it, and I wouldn't mind streaming at 720p60 instead of the usual 936p60 seeing how much RAM it's consuming. How much RAM do you have in total? (I've seen people on Twitch with apparently 16GB of RAM streaming the game at like 1080p60.)

    • @KaranBenchmarks
      @KaranBenchmarks 1 year ago

      @@ZealousPumpkinTV I have 32GB of RAM; you can check the full benchmark on my channel.

  • @TC_Geosci
    @TC_Geosci 1 year ago +3

    My experience: 5700G, 16GB 3200MHz CL16, RX 6600 XT, 1080p 165Hz display. The game was consuming 15+GB of system RAM, reached maximum capacity, and caused a hard system crash. The second time running it, I closed Chrome and other apps and the game was stable, with system RAM at 15GB total usage. But wow, I've never seen a game this hungry for system RAM.

    • @asifhasan4678
      @asifhasan4678 1 year ago

      What about locking FPS to 60 with RivaTuner?

    • @TC_Geosci
      @TC_Geosci 1 year ago

      @@asifhasan4678 Locking FPS had no effect on system RAM usage. I did notice the RAM usage slowly ramped down as the game progressed, plateauing around 13GB.

    • @andreiga76
      @andreiga76 1 year ago +1

      On my 64GB system it is using 22GB; this game is using more RAM than professional applications.

  • @Batlafication
    @Batlafication 1 year ago +1

    I'm running a 5900X, a 3080, and 32GB of RAM, with max FPS set to 60. My CPU and GPU utilization aren't at max at all, but sometimes the GPU jumps to 100% and I get stutters; I can't identify what triggers it.

  • @joker927
    @joker927 1 year ago

    Have you tried limiting the CPU threads to 12 and/or disabling SMT? Could this be a PC port problem, and could mimicking the console CPUs help?

  • @jason19872011
    @jason19872011 1 year ago +6

    There is no reason why this game should be this demanding.

    • @user-ye7lp9lg1c
      @user-ye7lp9lg1c 1 year ago +4

      It looks like Uncharted but runs like Cyberpunk.

    • @kaznodzieja3957
      @kaznodzieja3957 8 months ago

      @@user-ye7lp9lg1c It looks worse than Uncharted.

  • @123TheCloop
    @123TheCloop 1 year ago +5

    I'd still love to know what systems they even test their games on, when hardware like the 7000-series CPUs and 4000-series GPUs had only been out for a few months, hardly enough time to even get hold of them and test on them. And if performance is this bad, I'd really love to see their internal results.

    • @Lukashoffmann94
      @Lukashoffmann94 1 year ago

      Do Microsoft and Sony have developer consoles? If so, I would guess they mainly test on those. The rest would most likely just be high-end PCs with Quadros or other high-memory GPUs that they also use for modeling/rendering.

  • @kitzalecka
    @kitzalecka 1 year ago +1

    How can you switch between all those cards and test them? Do you always plug and unplug them, and uninstall and reinstall all the drivers every time for AMD and Nvidia?

    • @leeakers972
      @leeakers972 1 year ago

      He probably has both Nvidia and AMD drivers installed. It usually doesn't cause problems, because an AMD/Nvidia card isn't trying to look for Nvidia/AMD drivers.

  • @livestronger925
    @livestronger925 1 year ago +11

    Memory leakage is definitely an issue in this game. I saw 20GB of RAM used with the 7900 XTX. That is absurd.

    • @icy1007
      @icy1007 1 year ago +4

      There isn't a memory leak. It is using available resources for asset streaming.

    • @mikeramos91
      @mikeramos91 1 year ago +1

      It hits 20GB when you get to the menu, but then I would say it goes up a little the longer you play, and then back down depending on what area you're in.

    • @ilikecerealprawns
      @ilikecerealprawns 1 year ago

      The RAM usage just in this part alone is between 19-23GB. Isn't the minimum spec for this game 16GB?

    • @Extreme96PL
      @Extreme96PL 1 year ago +4

      @@icy1007 20GB is more than consoles have; consoles have 16GB of shared memory, not separate VRAM and RAM, so consoles should crash with that high a memory usage.

    • @icy1007
      @icy1007 1 year ago

      @@Extreme96PL - And if you have less total RAM, then the game will use less RAM.

  • @bl4d3runn3rX
    @bl4d3runn3rX 1 year ago +4

    Wait a year and the game will run much better.

  • @tomasmartins5009
    @tomasmartins5009 1 year ago

    How can you tell it's a CPU limit when the CPU isn't maxing out its usage? PC channels have always told us that the CPU doesn't matter at higher resolutions, but will it matter in a setting where you are rendering at 1080p with DLSS on a 4K monitor?
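    [Editor's note] A CPU limit can hide behind low overall CPU usage: if the game's render-submission work runs on one or two threads, those threads can be pegged while the average across all cores stays low. An illustrative calculation (not game code; the per-core numbers are made up):

    ```python
    # Why Task Manager's aggregate CPU% can look "idle" while the game is
    # CPU-limited: the aggregate is the average across all cores, so one
    # pegged thread barely moves it.
    def aggregate_cpu_percent(per_core_percent: list[float]) -> float:
        """Average utilization across all cores, as overall CPU% is reported."""
        return sum(per_core_percent) / len(per_core_percent)

    # Hypothetical 8-core CPU: render thread pegs one core, light work elsewhere.
    cores = [100.0, 20.0, 15.0, 10.0, 10.0, 5.0, 5.0, 5.0]
    print(aggregate_cpu_percent(cores))  # 21.25 -> looks idle, yet FPS is capped
    ```

    So the tell for a CPU limit is low GPU utilization plus one hot core (visible in a per-core graph), not high overall CPU usage.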

  • @andyorritt2472
    @andyorritt2472 1 year ago +1

    Which testing software is that, showing the CPU/GPU stats and FPS? You don't list it in the description.

  • @hrod9393
    @hrod9393 1 year ago +3

    It's good to know that there is potential to unlock something in order to utilize more of the GPU. Maybe they'll fix it in a driver update.
    Potentially doubling the performance would be very exciting on the 4090 and future GPUs.

    • @Trophykage
      @Trophykage 1 year ago +3

      Hopefully! Running a 3090 and 12900K and stuck at like 60-70% GPU utilization in Hogsmeade.

    • @SlimedogNumbaSixty9
      @SlimedogNumbaSixty9 1 year ago

      @@Trophykage Time for a CPU and RAM OC.

    • @kenshinhimura9387
      @kenshinhimura9387 1 year ago

      @@SlimedogNumbaSixty9 The game is not even fully using our CPUs, though. Why upgrade when the game doesn't even use a fraction of what we already have?

    • @SlimedogNumbaSixty9
      @SlimedogNumbaSixty9 1 year ago

      @@kenshinhimura9387 The less the GPU is utilized, the more the CPU needs to pick up; this happens often at lower resolutions/settings. Having a better CPU in these situations lets the GPU stretch its legs (or lets you play at higher resolution/settings with a similar frame rate). I have no issues getting 100% GPU utilization, but the 4090 is just too strong; it ends up bottlenecked by the CPU in these scenarios.

  • @SKHYJINX
    @SKHYJINX 1 year ago +7

    When CPU bound, fast and tight RAM matters the most.
    FYI, Nvidia users: Nvidia DLAA at native res is actually faster than native TAA High, and it also looks cleaner on fine hair details in this game.
    All users: the config file can also have RT variables tweaked for higher-res reflections and better RTAO coverage. News sites are beginning to report this.
    RT reflections render at 33% of native res by default; you can set them up to 100%, 66%, 50%, whatever you like.
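    [Editor's note] As a hedged sketch of the config tweak described above: this game is built on UE4, whose scalability cvars can be appended to the user's Engine.ini. The cvar names below are standard UE4 ones and the path is the typical location for this title; verify both against current guides before editing.

    ```ini
    ; Sketch only - standard UE4 cvars, appended to the user Engine.ini
    ; (typically %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini)
    [SystemSettings]
    r.RayTracing.Reflections.ScreenPercentage=66    ; default 33; 100 = full-res, slower
    r.RayTracing.AmbientOcclusion.SamplesPerPixel=2 ; denser RTAO coverage
    ```

    Back up Engine.ini first; a bad value can be undone by deleting the added lines.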

    • @TerraWare
      @TerraWare 1 year ago

      DLAA is nice, although it can introduce ghosting in some cases, like Forza Horizon 5 for example, but it's overall pretty good.

    • @i_cri_evertim
      @i_cri_evertim 1 year ago +1

      This game needs a big patch for graphics optimization.

    • @thecatdaddy1981
      @thecatdaddy1981 1 year ago +1

      I don't think many people here tune their RAM xd

  • @denniskarlsson6173
    @denniskarlsson6173 1 year ago

    Have you tried a comparison between running the game in fullscreen instead of borderless?

  • @klym8_
    @klym8_ 1 year ago +1

    How has AMD not released a driver update for 3 months now? And will they update the driver for this? I have an RX 6600 XT and I'm really hoping for a driver update that at least helps the performance a bit.

  • @Eldesinstalado
    @Eldesinstalado 1 year ago +3

    The issue is Denuvo.

  • @echo42704
    @echo42704 1 year ago +8

    The stutters have been the biggest issue for me. I use an RX 6600, and it's definitely capable of a lot in this game, but it stutters particularly badly in Hogwarts and Hogsmeade. Interested to see what the devs do.

    • @ncredibledark7926
      @ncredibledark7926 1 year ago +2

      Yeah, certain areas are a bit funky. I use a Strix 2080 Super and play at 1080p ultra with the DLSS Quality setting, with a consistent 90-110 FPS in Hogwarts (2 spots where it drops to 40), and 60-70 FPS in Hogsmeade.
      Also, cutscenes sometimes tank the FPS like crazy, dropping to below 20.

    • @iihamed711
      @iihamed711 1 year ago +2

      I have that GPU. Sad to see it struggle in this horribly optimized game.

    • @paulssnfuture2752
      @paulssnfuture2752 1 year ago

      Glad stutters don't bother me, since it's not a competitive game anyway...

    • @user-ye7lp9lg1c
      @user-ye7lp9lg1c 1 year ago

      Wait for a crack

    • @ncredibledark7926
      @ncredibledark7926 1 year ago

      @@user-ye7lp9lg1c Nah. I just did the DLSS update. Fixed the issues.

  • @xzy_n6827
    @xzy_n6827 1 year ago

    Hi, I have the same problem where my game is CPU limited. Can this be fixed with future updates to the game?

  • @paulwohlford6820
    @paulwohlford6820 1 year ago

    Did you change the refresh rate? I have a 1080p monitor with a 6900 XT and a 5800X. I changed the refresh rate to 240Hz since I have a 240Hz monitor. I am still getting between 85 and 189 FPS, even in Hogsmeade. My game performance is as smooth as silk with everything on.

  • @OptimalMayhem
    @OptimalMayhem 1 year ago +12

    I built my wife a 5800X/3060/32gb rig specifically to play this game. She’s barely touched PC gaming and knows nothing about FPS. I’m pretty interested to see what her gameplay experience will be like untainted by the quest for bigger numbers.

    • @echo42704
      @echo42704 1 year ago +1

      Cap her at 30 FPS; she won't even notice. My girlfriend doesn't tend to either.

    • @yellowflash511
      @yellowflash511 1 year ago +1

      Could've gotten a 6700 XT, ma man 🤦

    • @deagle7602
      @deagle7602 1 year ago

      @@yellowflash511 Some people will do anything but buy an AMD GPU. I don't get it.

    • @LegendaryGooseling
      @LegendaryGooseling 1 year ago

      @@deagle7602 Fewer features, worse support.

    • @yellowflash511
      @yellowflash511 1 year ago +2

      @@LegendaryGooseling Same support, way more performance, same price 🤡

  • @necrotic256
    @necrotic256 1 year ago +3

    Runs so much better on Radeon in comparison, wow. And people still yap about "muh Nvidia drivers".

    • @anuzahyder7185
      @anuzahyder7185 1 year ago

      Ray tracing kills Radeon GPUs in this game.

    • @MLWJ1993
      @MLWJ1993 1 year ago

      With low-level APIs, much more is in the developers' hands than in Nvidia's... So before pointing fingers at drivers, one should really wait for a driver update that fixes it (if it is a driver issue); otherwise the game just remains terrible at utilizing even a single thread on your CPU for the rest of eternity (so not a driver issue there)...

    • @sorryyourenotawinner2506
      @sorryyourenotawinner2506 1 year ago

      @@anuzahyder7185 Who cares about RTX?? It's just a gimmick, boy.

    • @anuzahyder7185
      @anuzahyder7185 1 year ago

      @@sorryyourenotawinner2506 OK, you're a 6000-series Radeon user. Sorry, we're not.

  • @jessterman21
    @jessterman21 1 year ago

    Low five to another family-man PC gamer; thanks for the analysis.
    Have you tried RTX Voice, btw? Pretty magical for eliminating background noise.

  • @defaultdefault812
    @defaultdefault812 1 year ago

    Currently running HL on an i5-6500 on a B150 chipset with 16GB of DDR4 3200MHz memory and a 2060 6GB. My target is 60 FPS at high settings at 1080p with DLSS Quality mode on and RT off.
    Currently hitting around 25-35 FPS in most areas of the game after all optimizations (DLSS updated to 2.5.1, CPU priority set to High, V-sync set externally, Control Flow Guard off, latest patch and drivers).
    I am CPU bottlenecked: the GPU rarely hits 50%, but the CPU is flat out at 100% in a lot of areas. Socket 1151 on this board supports up to a 7th-gen Intel processor, something like an i7-7700K. Would I still be CPU bottlenecked? Or should I just upgrade to an i5-12400F and a new chipset + 32GB of memory?
    Many thanks!

  • @darcrequiem
    @darcrequiem 1 year ago +27

    AMD GPUs have a hardware scheduler; Nvidia GPUs don't, and they offload that function to the CPU. That's why AMD GPUs perform better in CPU-limited situations.

  • @dakelpoandgames2593
    @dakelpoandgames2593 1 year ago +3

    I mean, the game isn't even fully out yet either. I'm trusting and hoping they put out some fixes soon, but I'm playing the game at 1440p on a mid-range system just fine. I get 80-120 FPS with DLSS at high settings and medium crowd density with no RT. I also think it is a RAM issue, with whatever the hell is going on under the hood. My buddy with the same system only has 16GB whereas I have 48GB, and he can't run the game without it crashing. So I don't know, dude, lol. We will see when it gets some patches and stuff.

    • @sham5614
      @sham5614 1 year ago +6

      We really need to go back to the standard of releasing finished, polished games. Can't just keep letting them do this, expecting a patch that rarely comes.

    • @sham5614
      @sham5614 1 year ago +3

      Until then I'd stick with games from smaller companies (3x the quality, 1/3 the price).

    • @dakelpoandgames2593
      @dakelpoandgames2593 1 year ago

      @@sham5614 I get that, but the game isn't technically out yet. It's weird, because some lower-end systems are running it fine. There aren't even driver tweaks out for the game yet either; most of the time Nvidia pushes out driver optimizations, and so does AMD, when large titles come out. Those probably won't drop until Friday or Saturday. And it's not like the game is incomplete or unpolished at all; I've been playing it fine, and I'm 8 hours in now with very few stutters.

    • @GRE3NT
      @GRE3NT 1 year ago +1

      @@dakelpoandgames2593 The official release for preorders was 07.02.2023; plenty of people have already finished the game. How is it not out yet?

    • @ryanespinoza7297
      @ryanespinoza7297 1 year ago +1

      People pay extra to get a worse version of the game 3 days early; it's wild.

  • @imlanding
    @imlanding 1 year ago

    What's the name of the program that shows all that FPS info? Cheers.

  • @davidjs97_
    @davidjs97_ 1 year ago

    How are you getting 60+ FPS with an RTX 3060 Ti in Hogsmeade at 1440p? I have an RTX 3080 with an i7-10700K / 32GB of RAM and I'm running at 1440p too, and I'm getting 30 FPS. What's happening with my system? My CPU utilization hovers around a constant 50% and my GPU usage is around 35%, but I am noticing my VRAM usage is pretty much maxed out at 10GB. Could that be the issue? If it is, that would be strange, because you're getting better performance with an RTX 3060 Ti...

  • @TheGreektrojan
    @TheGreektrojan 1 year ago +8

    Definitely not the most optimized game, but I think the 3060 Ti getting nearly 60 FPS at ultra 1440p (maybe with one or two settings turned down) is more than adequate performance. Just making sure people aren't witch hunting for the sake of it.

    • @khaledm.1476
      @khaledm.1476 1 year ago +1

      People are already witch hunting this game for various reasons.

    • @stardomplays1386
      @stardomplays1386 1 year ago +8

      No you aren't. The reason I KNOW you are lying is because you only have 8GB of VRAM. There is NO way you aren't getting performance drops into the single digits, ESPECIALLY during the sorting ceremony. People who bought this game on PC KNOW you are lying.

    • @TheGreektrojan
      @TheGreektrojan 1 year ago

      @@stardomplays1386 I don't have the game, but I watched this video and was expecting way worse numbers given all the performance hubbub. I imagine by the time it goes on sale (when I'll get it) it'll be marginally better with patches.

  • @oktusprime3637
    @oktusprime3637 1 year ago +3

    Glad your daughter gave you permission to post this video.

    • @RaiPalk78
      @RaiPalk78 1 year ago

      You westerners are so cringe. Where I'm from, we hold our parents and elders in such great respect that these pathetic jokes are nowhere to be found.
      Such a shame what you guys have devolved to.

  • @rl_fusion8404
    @rl_fusion8404 1 year ago

    I'm having problems with the game. I have 16GB of RAM, an i5, and a 2070 Super, but the game keeps crashing because I run out of memory, and I have nearly every background process closed. Do you know how I can fix it? And yes, all the graphics are on low and the FPS is set to 30.

  • @Danny.reg.
    @Danny.reg. 1 year ago

    Great video! Can you test the RX 6800 XT?

  • @oktusprime3637
    @oktusprime3637 1 year ago +4

    The 13900K is faster than the 7700X in damn near every game and not by a small margin when CPU-limited.

    • @bronsondixon4747
      @bronsondixon4747 1 year ago

      Only HUB really even has them that close.

    • @SlimedogNumbaSixty9
      @SlimedogNumbaSixty9 1 year ago +1

      @@bronsondixon4747 HUB, Steve specifically, is an AMD shill

    • @Nite-Lite-Gamers
      @Nite-Lite-Gamers 1 year ago

      @@SlimedogNumbaSixty9 Grow up kid

    • @bronsondixon4747
      @bronsondixon4747 1 year ago

      @thecouchtripper That's dumb and arbitrary, but even going by that metric, I'm not sure the 7700X would come out on top.

    • @SlimedogNumbaSixty9
      @SlimedogNumbaSixty9 1 year ago

      @@Nite-Lite-Gamers u good?

  • @riverfilmer5183
    @riverfilmer5183 1 year ago +5

    Here is my experience, and everything I’ve realised about this game regarding performance so far! My system is a 5800x3D CPU and 4080 GPU. 32gb 3600CL16 DDR4 RAM.
    First, I must say… Frame Generation is KING! I NEVER use DLSS as it looks awful in every game at 1440p, but frame generation truly is a revelation! Identical graphical quality, but double the fps! The game is BUTTERY SMOOTH and I’m constantly going back to record clips because of how cinematic and beautiful things look and feel. I can’t feel or see any negative difference from enabling Frame Generation! Also the latency barely changes! The game is perfectly responsive. I will also say, this game was made for controller for SURE! Feels way smoother. In summary for Frame Generation - if you have this option, you MUST use it.
    So, as for numbers: I’m able to play 1440p Max Ultra settings on everything and I get 130-150fps consistently! I’ve seen fps drop to lowest 100-120 rarely in hectic moments. With frame generation off, I’d have to use DLSS (which I’d never do due to reduction in quality) so I’d be forced to reduce graphics settings down to high/medium to keep above 100fps. Also, important - regarding Ray Tracing. Turn off ambient occlusion ray tracing! It makes the game stutter! The shadows and reflections run perfectly. The shadows in this game are fantastic! The ray traced reflections are not amazing, but if you turn them off, you really notice how dull the floor appears. Ray traced reflections give this game a lot of depth in the image, even though the actual resolution and clarity of reflections themselves aren’t impressive (not bad by any means).
    If you’re someone who doesn’t own a 40 series GPU, be prepared to sacrifice your settings to play this game around 100fps or above. I had a 3080 before, and I’d be able to play all ultra on most games no problem. That WON’T be the case here unfortunately.
    Best of luck! The game is great btw... finally a game that, for its retail price, is genuinely a bargain.

    • @TenthMarigold
      @TenthMarigold 1 year ago

      Would this be with RT on? Additionally, how is the latency?

    • @ayyorta
      @ayyorta 1 year ago

      I thought you had to use DLSS to use frame generation, hence the name DLSS 3.0. Is it standalone?

    • @TerraWare
      @TerraWare 1 year ago

      @@ayyorta No, you don't have to use DLSS with frame generation; you can run native or mix the two if you want.

    • @mikeramos91
      @mikeramos91 1 year ago

      Frame generation is only available when you enable upscaling

    • @riverfilmer5183
      @riverfilmer5183 1 year ago

      @@mikeramos91 In settings, you have to turn on Upscaling options, but you can actually disable all upscaling and keep Frame generation on! It’s great!

  • @MarcusJ79
    @MarcusJ79 1 year ago +2

    Another great video! Do you think the RT is actually working correctly? I have it turned on with my 4090 and honestly it doesn't feel like it is working as it should be.

    • @TerraWare
      @TerraWare 1 year ago +1

      The RT shadows look really good and the reflections do too, although they are a bit too noisy. Analistadebits did a pretty good RT comparison video for this game. I'll probably make one too.

    • @Aurummorituri
      @Aurummorituri 1 year ago +1

      You have to restart to see it. It’s also bad console-manageable RT with tons of noise. It’s not good looking. I turned mine off and things smoothed out a little bit.

  • @tclark5481
    @tclark5481 1 year ago

    What happens when you go to a higher resolution? My understanding is that a lower resolution puts more strain on the CPU than the GPU.
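    Editor's aside on the question above: strictly speaking, lowering the resolution doesn't add CPU work; it shrinks the GPU's per-frame work while the CPU's stays fixed, so the CPU becomes the cap. A minimal sketch of that model, using made-up millisecond costs (not measurements from the video):

    ```python
    def fps(cpu_ms: float, gpu_ms: float) -> float:
        """Frame rate when each frame is capped by the slower of CPU and GPU."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    CPU_MS = 10.0  # fixed CPU cost per frame (simulation, draw calls, RT setup)
    UHD_PIXELS = 3840 * 2160

    for label, w, h in [("4K", 3840, 2160), ("1440p", 2560, 1440), ("1080p", 1920, 1080)]:
        # Assume GPU cost scales linearly with pixel count: 16 ms at 4K.
        gpu_ms = 16.0 * (w * h) / UHD_PIXELS
        print(f"{label}: GPU {gpu_ms:.1f} ms, CPU {CPU_MS:.1f} ms -> {fps(CPU_MS, gpu_ms):.0f} fps")
    ```

    In this toy model, dropping from 4K to 1440p or 1080p leaves the frame rate pinned at the CPU's 100 fps ceiling, which matches the video's point that a faster GPU (or lower resolution) stops helping once the CPU is the limiter.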

  • @raul-km6mq
    @raul-km6mq 1 year ago +71

    PC gaming is a joke with this level of lazy development

    • @rusty10785
      @rusty10785 1 year ago +8

      Tell us you didn't watch the video without telling us. 😂

    • @najeebshah.
      @najeebshah. 1 year ago +13

      @@rusty10785 Tell us you didn't watch other tests: a 4090 at 4K native dips below 60.

    • @charizard6969
      @charizard6969 1 year ago

      @Najeeb Shah And if you saw this video you would know why, you idiot 🤣 We don't have a powerful enough CPU yet, dork

    • @jjlw2378
      @jjlw2378 1 year ago +6

      @@najeebshah. You mean a 7700X at 4K dips below 60fps. The 4090 isn't the bottleneck and would provide 120+ all day if only the CPU could keep up.

    • @michaelthomas5433
      @michaelthomas5433 1 year ago +1

      Righteous dude! I will only play this on the Switch.