Hogwarts Legacy PC CPU Performance: How bad is it?

  • Published: 6 Oct 2024

Comments • 944

  • @crawfordbrown75 · 1 year ago · +379

    What's crazy is I run this game at 1440p ultrawide with a 3060ti and a R5 5600, and I'm not even hitting 100% utilisation most of the time. Horrible optimisation.

    • @riverfilmer5183 · 1 year ago · +20

      Are you referring to GPU utilisation? If so, that will be because you are CPU bottlenecked. This game is CPU bound

    • @coreywalker3224 · 1 year ago · +14

      Did you read this comment before posting it 😅

    • @mudzy9820 · 1 year ago · +104

      @@riverfilmer5183 lol, so in other words it's poorly optimised 😂

    • @A.Froster · 1 year ago · +61

      @@riverfilmer5183 Yes, thanks. We realised. The point is the game isn't properly utilising those CPU cores, and there isn't a fix for that until the devs patch it

    • @sstier48 · 1 year ago · +53

      Games shouldn't be CPU bound unless you're running an old CPU with a newer/more powerful GPU, or unless the game isn't optimized properly. I believe that's his point

  • @javiermd5835 · 1 year ago · +119

    Anyone in the room as sick as I am of devs’ sheer laziness to deliver acceptable PC ports?

    • @neruba2173 · 1 year ago · +11

      Not the only one, dude. Gen Z are getting jobs. This is a one-way trip down.

    • @CyberneticArgumentCreator · 1 year ago · +4

      I'm as sick of that as I am most review channels' suite of console port benchmarks which represent games hardly anyone plays relative to the market. The games are terribly optimized and totally ramshackle, no one should care about benchmarks in them especially when they're optimized for AMD hardware in consoles. Show me benchmarks of PC titles please, I know I'm going to be able to run a single player console port at 60+ FPS and don't give a crap about any of their performance metrics.
      /rant

    • @javiermd5835 · 1 year ago · +1

      @@neruba2173 then I can see many people migrating to consoles. A sea of incompetence optimizing games paired with Jensen’s greed to fund his leather jackets is the perfect recipe.

    • @jalen5147 · 1 year ago · +1

      @@javiermd5835 the "performance mode" on ps5 is straight up trash

    • @ironmaidenmetalgod · 1 year ago

      Take a look at the upcoming Wild Hearts and see what a terrible mess AAA PC gaming is.

  • @RafaPolit · 1 year ago · +35

    I want to thank you for this series of videos. The fact that you test older CPUs makes for a more useful and realistic setting for the 90% of us who may have good systems, but not latest-generation or cutting-edge systems. It also makes you understand why having a beast of a GPU is not the only piece of the equation. Thanks again!

  • @nikita_tihiy · 1 year ago · +19

    great CPU comparison! appreciate your efforts. hope developers can address the issue and optimize RT in this game

  • @Verpal · 1 year ago · +51

    Quite unfortunate, turning down textures usually hurts the most and is extremely noticeable. Texture quality is just free image quality as long as your VRAM and bandwidth can support it, so if possible I usually try to push it to the limit.

    • @Kizzster · 1 year ago · +4

      Run DLSS Quality then...

    • @darrenchilds5034 · 1 year ago

      It’s not noticeable

    • @SolarianStrike · 1 year ago · +1

      @@Kizzster Doesn't work, the game with RT will try to use 13.5GB of VRAM even at 1600x900.

    • @Kizzster · 1 year ago · +5

      @@SolarianStrike Not asking him to run RT, it's never worth using in open-world games imo. There aren't enough ray-traced scenes to justify a 40-70 fps decrease.

    • @SolarianStrike · 1 year ago · +1

      @@jimdshea This game actually does stutter / stall if it runs out of VRAM.

  • @Simon-jq4fj · 1 year ago · +23

    I'm running on a 9600K and 3070. What I find odd about this game is how much my performance drops when moving from area to area. If I chill in one section of the castle for a minute I'm usually able to average around 80-90fps, but if I'm jogging around, that average drops to like 40 all the time

    • @mbed0123 · 1 year ago · +1

      Shit coding unfortunately....

    • @LameArab · 1 year ago · +1

      I have the same specs and I can confirm I’m running into the same issues my man, hopefully in the future they come out with a good patch

    • @АртемТерещенко-ц4э · 1 year ago

      Same with 10400

    • @dakbassett · 1 year ago

      The 3070 is running out of VRAM. How is it now after patches and such?

    • @EasternUNO · 5 months ago

      I have a 9900KS and a 4080 and the same issues for me. In the school 80-90 FPS, and when going outside it drops to 37-49

  • @omarmessi7075 · 1 year ago · +3

    One of the best channels, I am so happy that I found it, keep it up man.
    And I hope your daughter gets better

  • @ristopoho824 · 1 year ago · +13

    It's not too long ago that LowSpecGamer moved on from showing how to optimize games to run on lower-end hardware, since there isn't much left to optimize in modern games. Seems we may be needing him again.
    By the way, I have been enjoying his new direction towards low-spec history. Top-notch documentaries, both for learning and entertainment.

  • @SevendashGaming · 1 year ago · +31

    VRAM is looking more and more essential already. As much as I want to go with a 4080 this generation, with these recent game releases it just looks like both 7900 cards are going to be better at current-generation games, with their 24GB and 20GB of VRAM respectively.

    • @SussexIan · 1 year ago · +4

      Yeah, my thoughts as well.

    • @crescentmoon256 · 1 year ago · +5

      You don't need 20 or 24 gigs this generation; consoles have 16 gigs but the usable RAM for games is 10-12 gigs, so a GPU with at least 12 gigs should suffice

    • @SevendashGaming · 1 year ago · +10

      @@crescentmoon256 consoles also don't have the same lane restrictions as a PC with interchangeable parts. I do have a 16GB card right now so I'm easily able to play games like Forspoken without issue. My purchase decision is going to include anticipated game needs for the next 2-3 years as well as AV1 encoding. A 4080 being limited in two years isn't worth the price difference to me, so a 7900 XTX or XT is looking much better for the cost.

    • @ovemalmstrom7428 · 1 year ago · +4

      Think the Arc A770 16GB is also a good budget option for the future with its 16GB VRAM, good bandwidth and AV1 built in. Maybe the new "RX 580 8GB"?

    • @crescentmoon256 · 1 year ago · +2

      @@SevendashGaming You're just speculating; you can't say for sure how upcoming next-gen games will utilise resources yet, because there are not enough titles to see the trend.
      This exact thing happened with the 10 series: the 1060 was considered to have low VRAM with its 6 gigs versus the consoles' 8 gigs, but the 1060 outperforms both PS4/Xbox One because usable memory for games on console is 4-5 gigs.
      We see the same with the 1050 and 1050 Ti: even though they are faster than the PS4/Xbox One in most titles, towards the end those cards looked worse because they only have 4 gigs.
      VRAM is important, but don't consider buying a card just for VRAM; AV1 encoding is supported by all three GPU manufacturers on their latest series of GPUs

  • @riven4121 · 1 year ago · +165

    I do wonder how much of the stuttering is caused by Denuvo being Denuvo though.

    • @lynackhilou4865 · 1 year ago · +48

      Well, as much as I hate Denuvo, I think at this point it can't be considered THE problem here. I think the game is just that badly unoptimized, as a lot of games recently are releasing with weird CPU limits.
      But then again, Denuvo has a history of causing stutters and additional CPU limits, so removing it may help performance in some way, just look at RE Village

    • @Wobbothe3rd · 1 year ago · +6

      No, it's too single-thread limited. I don't like Denuvo, but this is worse than mere stuttering.

    • @Eldesinstalado · 1 year ago · +16

      @@lynackhilou4865 there is going to be a video comparison when Denuvo gets cracked

    • @Nick-nf1kd · 1 year ago · +1

      @@Eldesinstalado usually Denuvo cracks are just bypasses; if the devs don't remove it themselves, it will run the same. The exception I know of is AC: Origins, where the crackers removed Denuvo themselves, and the game actually ran a lot better, especially in CPU-bound scenarios

    • @teddyholiday8038 · 1 year ago · +13

      @@lynackhilou4865 yeah, it ain't Denuvo. I have a 4090 and a 5900X; the game is butter smooth on Ultra with RT off, but with RT on the stuttering is insane. Something is borked with the RT

  • @i11usiveman97 · 1 year ago · +30

    The devs surely must be readying a patch; there's no way they can release a game with such wide appeal as this with graphics requirements so high that most people can't run it decently. It looks really good, but there's something not right here.

    • @GrimAbstract · 1 year ago

      I mean idk man, Elden Ring was pretty bad for a while, still no ultrawide support as well haha

    • @samgoff5289 · 1 year ago

      I don't understand...you think because Harry Potter is popular they should make it have low graphics and be easy to run because a lot of people will want to play it...? That isn't how it works

    • @A.Froster · 1 year ago · +2

      @@samgoff5289 It usually does. Most popular games are made without overkill graphics in mind, so you can still run them decently even on old hardware, but they tend to scale well so you can play them at high framerates/settings with powerful GPUs too. This game has OK graphics and it's actually easy to run at 60 fps with upscaling, but it has terrible CPU performance, which hinders even the best GPUs out there like a 4090.

    • @cyborgchimpy · 1 year ago

      @@GrimAbstract I'm sorry, what? You're comparing the optimization of Elden Ring to this garbage? Elden Ring with literally everything on ran quite smoothly AT LAUNCH. This garbage here (tried again a few days ago) runs like absolute shite. If it was a stable low FPS it would be somewhat OK, but it hitches like crazy even on medium in areas like Hogsmeade or in Hogwarts.

    • @cyborgchimpy · 1 year ago · +1

      @@samgoff5289 no, the game just has shit optimization. I hate the fact that modern games are struggling to run properly, so you're forced to make the game look WORSE than games from 10+ years ago that focused more on innovative ways to make their games look nice, rather than relying on slapped-on new technology that makes the game look cool on high/ultra but absolute shite on lower settings.

  • @Franchisco7 · 1 year ago · +3

    I am running a 5900X, a 3070, and 32GB of 3600 CL16 DDR4 RAM. Using all high settings and a native 3440x1440 rendering resolution I am getting mid-40 averages in Hogsmeade and mid-60s everywhere else. This was very inconsistent at first, but once I forced off Control Flow Guard in Exploit Protection I was seeing very steady averages, though still in the 40-70 framerate range. Still need to test with rendering reduction and upscaling to see if I can improve the framerate without losing the steady average. This fix also seemed to remove most of the massive drops I was seeing before. I still get short ones every once in a while in cutscenes but nothing else. Also, most averages were seeing 99% GPU utilization and 15%-40% CPU utilization.
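The Control Flow Guard override described above can also be applied per-executable rather than system-wide, via the `Set-ProcessMitigation` cmdlet from Windows Defender Exploit Guard. A sketch, from an elevated PowerShell prompt; the executable name here is an assumption, so check the game's actual process name in Task Manager first:

```shell
# Disable Control Flow Guard for the game's executable only
# ("HogwartsLegacy.exe" is assumed - verify the real exe name first).
Set-ProcessMitigation -Name "HogwartsLegacy.exe" -Disable CFG

# Inspect the per-executable overrides to confirm the change:
Get-ProcessMitigation -Name "HogwartsLegacy.exe"
```

Scoping the override to one executable avoids weakening the exploit mitigation for every other process on the system.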

  • @Nighthearted · 1 year ago · +5

    The Denuvo BS is probably at least 50% of what's causing the problem here.

    • @Wobbothe3rd · 1 year ago

      Nah, it's too single-thread limited.

    • @saricubra2867 · 1 year ago

      @@Wobbothe3rd Too single-thread limited?
      My i7-12700K running a Nintendo Switch emulator on just the Intel UHD 770 graphics looks smoother than this AAA game. I actually watch one thread sitting at 100% usage.

  • @vladislavantonov5552 · 1 year ago · +1

    So happy with my 7700X, overclocked and with DDR5 6000 and extremely tight timings (took weeks to tune and test). I get ~120fps in Hogsmeade, all on ultra except ray tracing!

  • @ytmandrake · 1 year ago · +12

    Great benchmarks as always on this channel, but man, what I would have given back in my day for this guy to be my math teacher

  • @aimbreak1461 · 1 year ago · +1

    I'm running this at 1440p with maxed graphics on an i7 13700K, RTX 4080 and 32GB DDR4 RAM and the frame drops are insane. I can't possibly play it flawlessly.
    Yes I know I'm insane, and yes I didn't have any money for months to afford it, but I bought a PC for 3k.

  • @jooyichen · 1 year ago · +6

    Important to keep in mind:
    the Cyberpunk launch is not a lesson learned.
    It is a *blueprint* for how to sell games.

    • @pankhanafis818 · 1 year ago · +1

      lol, this game's launch is nowhere close to Cyberpunk 2077 in terms of a bad launch.

    • @neruba2173 · 1 year ago · +5

      @@pankhanafis818 Cyberpunk was actually 100 times better optimized FOR PC than this game. It was trash on consoles. I played CP all the way to the end at high-ultra settings with a 10700 and an RTX 2070, no problem. I try this game with the same settings and it's all over the place. Not only that, graphically it looks worse at the same settings. This is a CP case, but reversed: it's a trash PC port.

    • @adamjohanssonn · 1 year ago

      @@neruba2173 i had 0 problems with cyberpunk and 0 issues with hogwarts

  • @Kizzster · 1 year ago · +2

    Use DLSS Quality on RTX 3000 or lower. Use DLSS Quality and Frame Generation on RTX 4000. It's that simple to get a bunch more frames without negatively affecting visual quality.

  • @referencestudioproductions6258 · 1 year ago · +7

    I ended up getting a refund after I saw the open-world performance with a 5600X, 2070S and 16GB RAM. I'm not upgrading my memory for this game alone. Terrible performance; I'm just going to get Dead Space instead.

    • @misericorde360 · 1 year ago · +1

      Got the exact same issues with Dead Space: heavy VRAM consumption and stutters (w/ RTX 3080 and 32GB RAM). Runs fine for some periods, but in some bosses and cinematics the frame rate drops heavily, in both games

    • @OperationDecoded · 1 year ago

      I have the same setup as you but with 32GB of RAM, and I have stutter and performance issues as well. So upgrading your RAM won't help either way.

    • @ewitte12 · 1 year ago

      32GB should be the minimum these days for newer games. They will be pushing 12-16GB by themselves; that doesn't leave a lot of room for Windows, especially if you're not rebooting and making sure nothing else is running.

    • @CyberneticArgumentCreator · 1 year ago

      Fair decision on your part but reminder that a 2x8 kit of RAM will run you like $42 USD and will really make the whole system last longer. If it's even one more year of relevance that's 100% worth $42 in amortized cost.

    • @referencestudioproductions6258 · 1 year ago

      @@CyberneticArgumentCreator Yeah, the issue is I want to match my current kit just to avoid any potential instability, and that kit is like $70. It's honestly almost as cheap to just get a faster 32GB kit; I'm just not wanting to put more money into a system I feel is at the end of its life.

  • @aiash80 · 1 year ago · +1

    I hope that tech companies will not use poor optimisation as a way to push people to upgrade

  • @xeno9551 · 1 year ago · +8

    It's the Denuvo anti-tamper as well.

  • @MoneyMakes82 · 1 year ago · +2

    I'm running a 5800X3D, RTX 3060 Ti and 32GB of 3200MHz RAM, upscaled to 4K with high settings, rendering at 65%. I'm getting 50-60 fps most of the time

  • @semtexxl4023 · 1 year ago · +4

    Optimization has left the chat yet again...

  • @rozzbozzgames · 1 year ago · +2

    Horribly optimised (in fact clearly not optimised at all), lazy, lazy dev. TBH the game looks nice, but there are games that look far, far more impressive and run far better. Send a clear message to the dev/publisher: don't buy it until they sort it, or they will keep doing the same thing again and again and again. It's pretty unacceptable in 2023 TBH.

  • @PaoloMix09 · 1 year ago · +5

    My main specs are a 7700X with a 3070, and the game runs hella smooth, but sometimes when I go outside or to another area it does have stutters and drops of 10-15 fps, though as soon as I cast a Revelio it kinda fixes it for a while. Game definitely needs optimization for sure. I played through Callisto Protocol and it was very smooth for a new game. Either way I'm enjoying this game a lot: great mechanics, love the duels, and exploring the castle and everything is so fun.

    • @voltage1235 · 1 year ago

      Me too mate, same specs. I've tried everything.

    • @JustS0meK1dd · 1 year ago

      It definitely seems to be VRAM limitations with the 3070. I have been running on mostly high settings with DLSS quality on my 3070 and it seems to offer a smooth experience. As soon as I go up to ultra textures, I get the massive frame drops you are talking about.

    • @PaoloMix09 · 1 year ago

      @@JustS0meK1dd I'm on mainly high settings; idk if I have DLSS on. Maybe that could fix it, or I'll have to go to medium I guess

    • @voltage1235 · 1 year ago

      @@PaoloMix09 I'm with everything on high, RT on, no shadows, and DLSS Quality / Nvidia DLAA; open GeForce Experience, set sharpening to 50/15, and it looks better

  • @fernandoalarcon525 · 1 year ago

    RTX 3060 owner here. This card is getting me solid performance in this game: at 1440p medium-low settings with ultra textures I get 110 to 130+ fps.

  • @herrosG · 1 year ago · +4

    It's crazy how, while in the castle, I have 140+ fps, and then in the next 30 meters (or like 3 seconds of sprinting) I go to like 110, then 75, then 48, and it goes up again.
    All I want is to run this game at 1080p, looking as graphically good as possible, without any performance issues.
    I hope they patch this asap.
    (i7-9700F, 32GB RAM, 3060 Ti)

    • @nikita11111111 · 1 year ago · +1

      Same here, and that is without ray tracing, with a 3080. Crazy unoptimized mess of a game

    • @seedyrom6529 · 1 year ago · +1

      This game is an absolute mess performance-wise. The stuttering is ridiculous in Hogwarts and Hogsmeade, even on a 4090 and 13900K for me. Can't even brute-force it with high-end hardware.

    • @HeyImBoschea · 1 year ago · +2

      idk man, just set things to medium and enjoy the game. It still looks fantastic on lower settings. I'm running a GTX 970 4GB card with an i5 4570 and STILL pulling 60 frames and having a great time. It's a shame high-end rigs can't pull high frames at ultra settings, but idk, maybe I've got low standards lol

    • @herrosG · 1 year ago · +2

      @@HeyImBoschea I tried tweaking and going for lower settings; no matter how high or low I go, it will always have some moments/places where it drops to like 40fps for a second and then goes back. It's not like I don't want to see "40fps" on my screen, but the second it drops, it feels like the game is moving in slow motion, ruining the experience.

    • @zpotato1346 · 1 year ago · +1

      @@herrosG yes, same feeling here exactly
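The "slow motion" feeling described in this thread is clearer in frame times than in fps: a one-frame dip is a millisecond spike the eye notices even when the average fps looks fine. A quick sketch of the arithmetic:

```python
def frame_time_ms(fps: float) -> float:
    """Convert a framerate into the time budget of a single frame."""
    return 1000.0 / fps

# A dip from 140 fps to 48 fps nearly triples the frame time, so one slow
# frame reads as a visible hitch even though "48 fps average" sounds fine.
steady, dip = frame_time_ms(140), frame_time_ms(48)
print(f"{steady:.1f} ms -> {dip:.1f} ms (spike of {dip - steady:.1f} ms)")
```

This is why frame-time graphs (as shown in overlays like MSI Afterburner) expose traversal stutter that an fps counter smooths away.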

  • @bighersh14 · 1 year ago · +2

    I'm not saying this game was optimized well, because it wasn't, but it's crazy how a lot of people are having issues with this game and I'm not. I'm even running it at 1440p high settings at 60fps easily with my Intel Arc A750, while recording for a video.

    • @bighersh14 · 1 year ago

      @@ryanroberts714 That's weird, because I have an Intel Arc A750 and a 12700K and it works just fine. I guess it's primarily an Nvidia thing, but I know I've seen some people say it's still messed up on AMD GPUs

  • @stephanhart9941 · 1 year ago · +10

    A settings optimization video would be much appreciated if you have time. Especially since Digital Foundry won't do their job, for no apparent reason. Just a strange situation.

    • @adi6293 · 1 year ago · +1

      There is a reason, they are all a bunch of wanks 🤣🤣

    • @Radek494 · 1 year ago · +5

      DF caved in to T Mafia, maybe Eurogamer forced them

    • @adi6293 · 1 year ago

      @@Radek494 Mafia 3?

    • @PoPoBye123 · 1 year ago

      @@adi6293 transgender mafia

    • @teddyholiday8038 · 1 year ago

      @@Radek494 I doubt this is the case. Rich and John are not very liberal, they don’t seem to give a F about all that. Alex usually does the PC analysis anyway

  • @loryt690 · 1 year ago · +1

    I've loved when you tested the 9600K, please test more games with the 9600K

  • @Mako2401 · 1 year ago · +4

    Ray tracing is such a scam, I have no idea what the point of it was with Nvidia. Many games look better with RTX off, and in others it's so badly optimized you are losing up to 80 percent of your frames for it.

    • @pravnav · 1 year ago

      I personally like it, but for most games I just turn it off. I would never really play with it unless it's 75+ fps

    • @ryans3795 · 1 year ago · +1

      This title is one of the games that utilizes ray tracing the most; it really improves the graphics. Unfortunately, the developers didn't put any effort into non-ray-traced reflections. "Either ray tracing or nothing" seemed to be the philosophy when creating this game.

    • @Wobbothe3rd · 1 year ago

      I instantly disregard the opinion of anyone who calls an entire rendering paradigm that has been researched and developed since the 1970s a "scam." Badly optimized RT is no reason to dismiss RT.

  • @ryans3795 · 1 year ago · +2

    One good bit of news though: my 5800X never goes over 60 degrees.

  • @Osprey850 · 1 year ago · +9

    It's hard for me to tell much of a difference in this scene between RT on and off, yet it tanks performance when it's on.

    • @ImAnOcean · 1 year ago

      RT reflections look worse unless you set them to ultra

    • @teddyholiday8038 · 1 year ago · +6

      This game has terrible RT. The RT reflections are fuzzy, soupy messes. The RT shadows barely work, and the RT AO just makes things brighter

    • @josephoverstreet5584 · 1 year ago · +1

      RT even at ultra isn’t worth using lol

    • @gunturbayu6779 · 1 year ago

      Not even worth it, looks worse in some areas

    • @Richard-tj1yh · 1 year ago

      Maybe your monitor can't tell the difference

  • @jadedriviera7402 · 1 year ago

    Wishing your daughter a speedy recovery

  • @billschannel1116 · 1 year ago · +3

    Been into computers for 40+ years. I'll tell you that the most interesting times were when things were coming out that ran slowly. So much more interesting than those long periods where software stagnated, mostly caused by overconcentration on consoles. But anyway, if things getting better excites you, then having stuff that maxes out hardware for good reasons is a really nice time.

  • @ATP-Flo · 1 year ago

    Using an RTX 3090, 5950X and 64GB RAM. Absolutely no problem here. Max settings, full ray tracing and 4K DLSS Quality, approximately 60 FPS with VSYNC on my 4K TV. Perfectly playable and beautiful.

  • @Prashanth12344 · 1 year ago · +1

    It's usually around 90-120fps for me sprinting through Hogsmeade, 7900 XTX and 5800X3D. An X3D CPU and an AMD GPU seem to be the best combination for this game. The 3D V-Cache and the lower CPU overhead on AMD cards help a lot in the CPU-intensive areas.

  • @rocksfire4390 · 1 year ago · +4

    970, 5600 and 32GB RAM (low settings with effects on medium). It's playable (the vast majority of the time it's 60 fps), but some areas of the game tank it hard. That is something they can look into; I'm sure they just have a bit too many effects/objects in certain areas, which is very much something they can remedy over time. If I had even a 2000-series card, though, I doubt I would ever see it dip below 60 fps.
    It's already amazing this 9-year-old GPU barely dips below 60 fps. Granted, FSR 2 is pretty much required, but still, the game is detailed enough that it doesn't really matter to me. I'm sure the game looks amazing at 1440p ultra and even better with ray tracing, but performance isn't free; if you want it you are gonna have to pay for it in some way.

    • @atomicfrijole7542 · 1 year ago

      Honestly the raytracing isn't that noticeable from my experience with the game on the PS5 and in PC vids. 1440p is nice though. 4k is pretty but you'll be happier with a high-refresh rate 1440p monitor when you decide to upgrade. The 970 is a great card - keep an eye out for sales on the 6800xt if you're looking for an affordable upgrade path that will take advantage of your 5600.

    • @rocksfire4390 · 1 year ago · +1

      @@atomicfrijole7542
      Sad to hear that RT isn't that big of a jump. I 100% agree about having/wanting higher refresh rates though. 60 fps feels like a slog at times; I never had a 144Hz monitor but my brother does and it's sooooooooooo smooth. I don't think I would ever use 4K unless it could get higher refresh rates on top. Then again, that would require a beast of a GPU, so it's unlikely I would have to choose anyway xD.
      Was eyeing the 6000 series, but then the 7000 series came out, things happened, and I decided to wait (was going to get a 7900 XTX). So now I'm just waiting for the lower-tier GPUs to release to see what happens to GPU prices. Waiting a few months or another year isn't a big deal; I've already been waiting almost a decade.

    • @atomicfrijole7542 · 1 year ago

      @@rocksfire4390 I don't know if you're in the USA but if you are and are near a Costco, there's sale on LG's 1440p 165hz 32" panels. I love mine, and I think it's come down to $230 on sale thru the 26th.
      It seems to me that raytracing is one of those beta features that is still in development and will be for a couple more generations before they get it right. It's nice to have it in Cyberpunk or Ascent, but it is such a drag on the gpu (even the 3080) that it loses its value.
      The really good news is that 1440p is the new 1080p thanks to everyone pushing 4k, so you are already in a great position for a mid-range gpu to high-end gpu to get amazing framerates at 1440p. I think you'd be really impressed and happy with anything 6800xt or above on the Radeon side and 3080 (maybe even the 3070ti) or above on the Nvidia side.

  • @ChannelSho · 1 year ago · +1

    If you want to confirm whether VRAM is swapping into system RAM, check whether the game's "Shared Video Memory" usage in the Task Manager → Details tab (you'll have to enable the column) is going up.

  • @TheVideomaster138 · 1 year ago · +2

    I am running Hogwarts Legacy fine, no stutters.
    My system specs:
    i5 13600KF
    32GB DDR5 5600
    GTX 1070
    Funnily enough, I don't have any dips below 60 with a capped framerate.
    I use FSR 2.0 Ultra Quality; everything else is at Medium.

    • @alaa341g · 1 year ago · +1

      So basically you have a very good setup with the latest RAM and CPU, but a fucking 1070? wtf xD hahahaha, why didn't you get at least an RTX 2000 GPU

    • @TheVideomaster138 · 1 year ago · +2

      @@alaa341g Because they are still too expensive, and I'm waiting to see what the new generation of GPUs brings. Also, I'm still very satisfied with my GPU. I upgraded everything else since it was an i5 4690K and 16GB of DDR3 before that, and I just got a bottleneck with it in quite a few games.

    • @AngelicRequiemX · 1 year ago

      1070, lol

    • @CyberneticArgumentCreator · 1 year ago

      The 1070 has a lot of horsepower for a 2016 GPU. 8GB VRAM and decently overclockable too. What you're saying checks out with everything the internet is saying - it's just a terribly optimized game that has a terrible render thread.

    • @TheVideomaster138 · 1 year ago

      @@CyberneticArgumentCreator I see people having stutters with high-end cards while I haven't experienced any so far; that's what's weird to me. I'm using the 511.79 drivers and set the shader cache to unlimited; maybe that's what helps, I have no idea.

  • @igorrafael7429 · 1 year ago · +1

    The i5 9600K, even overclocked to 5GHz on all cores, wasn't enough for ray tracing back in 2021 with Cyberpunk. And not just Cyberpunk: Watch Dogs Legion, Far Cry 6 and many other open-world games with ray tracing. I noticed that very early; I had one. Also had the 5600, not enough either, but it was better. Now I'm with the 5800X3D, and it seems to be the bare minimum for 60+ fps with ray tracing in open-world games. It's a fantastic CPU and I'm glad I'm not going below 60 fps in RT games anymore, but clearly it won't keep up in the future when we start to get more demanding games. We always need to upgrade our hardware if we want to maintain some sort of performance standard, because of bad software optimization on PC. That's sad and depressing. I wish I could just leave my hardware alone for a few years before thinking about upgrading, but that's not possible these days.

  • @Burbanana · 1 year ago · +3

    Also a question about your overlay: how did you get all those options, like the graph, to show via Afterburner? I don't even have an average fps indicator. Also, how do you reset the counter? Thanks for the really informative videos man

    • @pessoaanonima6345 · 1 year ago

      You have to change it in the settings. Choose the hardware you want to measure and set it to show on screen. You reset the counter using a key combo, which you can also set up in the settings.

    • @YumiSumire · 1 year ago · +1

      there should be some tutorial videos on YouTube.

  • @rubdulbah3201 · 1 year ago

    I swear I'm getting flashbacks to the first days of Elden Ring on PC again. Thankful for your testing.

  • @hrod9393 · 1 year ago · +4

    So far I keep ray tracing reflections on. The others I can sacrifice. I think they do the shadows pretty well 'artificially'.
    I think you're halfway right about the limitations not being solely the CPU, but coding and memory bandwidth usage etc. The CPU is obviously waiting on something.
    I see up to 96% usage on my 4090 and 10850K in other scenes just fine though.

    • @RicochetForce · 1 year ago

      Yeah, CPU + memory subsystem issues limiting performance is a pretty rare combo in gaming.

    • @teddyholiday8038
      @teddyholiday8038 Год назад

      That's fascinating; my 4090 has never gotten above 62% usage, and I have pretty frequent stutters with RT on

    • @hrod9393
      @hrod9393 Год назад

      @@RicochetForce Something being rare at least means it's possible and has happened.
      It could be they need to set an "end" limit on the ray tracing calculations, especially for the ambient occlusion stuff.
      I'm nearly sure that since it's been brought up by many reviewers, they will patch it.

    • @hrod9393
      @hrod9393 Год назад

      @@teddyholiday8038 I have no stutters at 4K, DLSS Quality or Balanced. I don't have the latest Nvidia drivers either; I hope you figure it out. I can get 223 fps in some scenes with frame generation. The 4090 is slightly overclocked as well.

    • @lanand9397
      @lanand9397 Год назад

      @@hrod9393you’re making shit up

  • @jantube358
    @jantube358 Год назад +1

    I wonder why none of the CPU cores is at 100% load.
    I remember a CPU limit meaning the CPU is at 100% and the GPU at something like 50%.
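
    One reason a single-thread bottleneck can hide from per-core graphs: if the OS migrates the pinned thread between cores every few milliseconds, its load gets smeared across all the core graphs. A toy model of that idea (Python just for illustration; the numbers are hypothetical):

    ```python
    # Toy model: one saturated thread whose time the scheduler spreads
    # evenly across all cores shows up as 100/N percent on each core
    # graph, never as 100% on a single core.
    def per_core_utilization(saturated_threads, cores):
        return saturated_threads * 100.0 / cores

    # A single maxed-out render thread on an 8-core CPU:
    print(per_core_utilization(1, 8))  # each core graph reads ~12.5%
    ```

    So a game can be completely limited by one thread while every core graph in the overlay sits far below 100%.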

  • @dakelpoandgames2593
    @dakelpoandgames2593 Год назад +5

    What's odd is that I'm running a 5600X, a 3070, and 48GB of 3200MHz DDR4 at 1440p ultra, and I'm getting the same fps with only small drops. The game does seem to be CPU limited, and even then, having more RAM seems to help overall. My buddy with the same system and just 16GB of RAM can't even run the game; it just stutters and then crashes. Only time will tell with patches, I guess.

    • @atomicfrijole7542
      @atomicfrijole7542 Год назад +2

      I think it's the game engine, not the hardware. Same kind of experience on PS5.

    • @Real_MisterSir
      @Real_MisterSir Год назад +2

      @@atomicfrijole7542 Definitely Unreal Engine 4 optimization that's lacking, and ray tracing was never truly well implemented in that engine; even UE5 still has its share of issues.

    • @ElendielPlaysEU
      @ElendielPlaysEU Год назад +1

      Do you think it's worth upgrading to 32GB of RAM for more stable performance, since it's a pretty cheap upgrade right now? (Currently running a 2070 Super, 5600X and 16GB of RAM with DLSS on, ray tracing off and high quality settings; averaging between 70-100 fps depending on region, with drops into the 30s sometimes too :( )

    • @atomicfrijole7542
      @atomicfrijole7542 Год назад +1

      @@ElendielPlaysEU Yes it is absolutely worth it, but not exactly for the reasons you're looking at although you're right on with stability. You won't see a huge fps increase from it, but you will have a much smoother environment with your operating system and games. I consider 32gb a minimum to have a smooth computing experience. Sure, you can stick with 16gb but the operating system overhead will go away when you hit 32gb. Plus you're future-proofing your system for not much money and when you are ready to shift away from Windows to Linux, your pc will run like a brand new machine, no joke.

    • @dakelpoandgames2593
      @dakelpoandgames2593 Год назад +1

      @@ElendielPlaysEU Personally, yes. If this game and other future titles are any indication, we are at the end of the days of 8GB being the floor for gaming. I think we should have at least 32GB. It is overkill, but if Hogwarts is any indication of future titles, it means we should have more RAM.

  • @brandonb8705
    @brandonb8705 Год назад +1

    I was actually surprised at how well it ran.

  • @alexschmidt3034
    @alexschmidt3034 Год назад +5

    Would be interesting to also see RAM comparisons, from DDR4-3000 up to DDR5-6000 or even higher.

    • @stefanradunovic8318
      @stefanradunovic8318 Год назад

      This game plays smoothly on my DDR5-8000, so I'm not so sure anymore

  • @MM-xc2zk
    @MM-xc2zk Год назад +1

    4090, 13700K, 32GB RAM, 240Hz 1440p: pretty bad hitching here. Traversing the castle in many areas just feels bad. Refunded the game till performance is fixed. Changing the resolution to 4K and the refresh rate to 120Hz helped some, but not enough to make the game feel good to play.

  • @YourDevil54
    @YourDevil54 Год назад +5

    CPU limitation means there are a lot of scripts running in the background. I made some Lua scripts for a GTA Online mod menu, and I had the same issue with low CPU/GPU utilisation and bad fps when I put some heavy loops in my script that ran each frame (like `for i = 1, 1000 do`). It runs fine if it doesn't run each frame (with a ~80 ms pause between cycles). So the Hogwarts devs have to rethink what they've done in their scripts, I guess.
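
    The commenter's point, heavy script work every frame versus the same work amortized over many frames, can be sketched like this (Python as a stand-in for the Lua logic; the workload numbers are made up):

    ```python
    import time

    def heavy_work(iterations):
        # Stand-in for an expensive loop run by the game's scripting layer.
        total = 0
        for i in range(iterations):
            total += i * i
        return total

    def avg_frame_time(frames, iterations, every_nth):
        # every_nth=1 runs the heavy loop each frame; larger values amortize it.
        start = time.perf_counter()
        for frame in range(frames):
            if frame % every_nth == 0:
                heavy_work(iterations)
            # ...normal per-frame game logic would go here...
        return (time.perf_counter() - start) / frames

    per_frame = avg_frame_time(100, 50_000, every_nth=1)   # heavy loop every frame
    amortized = avg_frame_time(100, 50_000, every_nth=10)  # same loop, 1 frame in 10
    print(f"every frame: {per_frame*1000:.2f} ms, amortized: {amortized*1000:.2f} ms")
    ```

    The average frame time drops by nearly 10x in the amortized case, which is exactly why throttled scripts "run fine" while per-frame ones tank the fps.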

  • @IndyMotoRider
    @IndyMotoRider Год назад

    Getting this game today. i7 6700K, 32GB RAM @ 2132MHz CL15, 2080 Ti. Wish me luck!

  • @Kaori42
    @Kaori42 Год назад +3

    5800X3D, 32GB 3600 C16, 7900 XTX. I'm getting between 100 and 144 fps (144 in Hogwarts, 100 in the town), always GPU bound with the ultra preset, no RT, FSR 2 quality. Without the day one patch, FSR was reducing GPU usage to 50% with frametimes all over the place, but now it's perfect, except for a robotic voice for your character if you change the voice pitch, and white bushes (a mod fixes that, or you can lower materials to low to fix the issue). I also found that my character's face lighting tends to look weird.

  • @adhdengineer
    @adhdengineer Год назад +1

    It really does seem like another poorly optimized PC port of a console game, like they put all their effort into getting it to run on console as best they could, knowing a decent gaming PC could run it even if they left a load of performance on the table for PC gamers.

  • @brzimtrco6142
    @brzimtrco6142 Год назад +8

    Yes, but Nvidia has driver overhead in DX12, so with an AMD GPU there would be less of a CPU bottleneck, like you showed in one of your previous videos

    • @Davids6994
      @Davids6994 Год назад +1

      Frame generation goes BRRRRR

    • @sito3539
      @sito3539 Год назад +7

      @@Davids6994 Input lag goes brr too with it

    • @e0nema0tak1v
      @e0nema0tak1v Год назад +2

      @@sito3539 Oh, that 0.005-second delay when you're playing a single player game. Unplayable......

    • @Real_MisterSir
      @Real_MisterSir Год назад +2

      @@e0nema0tak1v It's wild how with one sentence you proved how little you know about hardware-accelerated processes like frame gen.
      If, for example, you are already gimped by CPU and VRAM performance, then frame generation input lag will have a compounding effect compared to generating from a steady 60 fps native. The higher your base input lag, the disproportionately worse the added frame generation delay will feel.

    • @sito3539
      @sito3539 Год назад

      @@e0nema0tak1v It's more the fact that DLSS and frame interpolation, as well as V-Sync and any form of upscaling, will gimp the frame buffer pipeline, thus defeating the purpose of higher fps in the first place. To fully benefit from high refresh rates and fps you must clean up the pipeline: use native resolution, native fps, no V-Sync, no nothing, and you will already have a better experience. If you're having trouble with fps, just do what we've always been doing: drop resolution, use optimized settings, disable any RT tech, etc., and after all that, cap fps to 2 frames lower than your monitor's Hz to prevent screen tearing. Also, DLSS and FSR don't benefit anybody: low-end users will have their game look horrible with lower resolutions and imprecise pixel upscaling artifacts, and high-end users who want to increase their already good fps are only introducing input lag, and that doesn't help in any way. It's a redundant tech in any situation.

  • @Ryan-1337
    @Ryan-1337 Год назад +1

    Even the Steam Deck GPU has no problem running this game. All of the performance issues I see when playing are when the CPU hits 4 to 5 watts or higher, or 60% or higher usage. The frames tank as soon as either one of those happens. I play docked at 1080p with FSR 2 performance, everything low + medium textures. When handheld, 800p with FSR 2 quality and the same settings. I've even had sub-30 fps when the Steam Deck GPU is at 75-ish % utilization and the CPU is pegged. If they can manage even a bit of CPU optimization, this game will be a Steam Deck gem. It's already almost there.
    Edit: docked, I shoot for visuals and a 30 fps cap. Handheld, I shoot for a smoother 40 fps and less wattage

  • @A.Froster
    @A.Froster Год назад +3

    There's no reason a 7700X should be bottlenecking without ray tracing on. The game just isn't utilizing CPU cores properly

    • @ryans3795
      @ryans3795 Год назад +2

      I think the point of the video was to show that: no matter which CPU you use, none of them are being utilized as much as they should be.

    • @Augmented_Realism
      @Augmented_Realism Год назад

      @@ryans3795 I agree with you, and some people still believe the CPUs themselves are the bottleneck, lol. Even that i5-9600K is still a good CPU when paired with a 2060 or something like that.

  • @scaramouche7759
    @scaramouche7759 Год назад +1

    I feel like there should be a law protecting consumers against this kind of behavior.

  • @alexunkin
    @alexunkin Год назад +3

    The i7/i9 13th gen are quite a bit ahead of AMD's 7000 series in this game. Single-core performance and higher memory speed support help overcome the terrible game engine. The 7000 series X3D may change that.

    • @JimboJamboYT
      @JimboJamboYT Год назад

      140MB of L3 cache will definitely be a factor; you don't need to brute-force clock speeds like Intel when you have more efficient cycles.

  • @Nadz203
    @Nadz203 Год назад

    I've been running: All settings on recommended High, all RT off, 1440 target res with DLSS on (Quality setting), locked to 60fps. These were the original default setting the game first booted up with.
    I haven't had an issue whatsoever. I only saw today that many are having issues, and I'm somewhat surprised. I've had an almost consistent 60 fps for ~20 hours of play time. I've been watching fps fairly consistently since I started playing to see if I could increase some settings here and there. The only time I really saw any drops was for a fraction of a second in big area transitions, but it's barely noticeable.
    Running on a RTX 2070 Super, i7-9700K, 16gb Ram

    • @Nadz203
      @Nadz203 Год назад

      After seeing people having issues I tried RT and yep... it's borked!
      Out of interest I also tried turning off DLSS to try native 1440p rendering, ultra settings, no RT. To my surprise it still ran at a consistent 60 with occasional dips in Hogsmeade and a few places in the castle. The dips at area transitions were much more noticeable though, sometimes lasting a few seconds before restoring to 60.
      Going back to the original recommended settings, as I don't see much difference and I prefer a smooth 60 fps

  • @magiccarpets
    @magiccarpets Год назад +5

    For whatever reason I get massive stuttering when I enable DLSS 3 on a 4090 with a 7950X. It seems to happen in Witcher 3, Cyberpunk, and in this game. Disabling the secondary CCD using Ryzen Master mostly resolves the issue, but I'm just surprised nobody is looking into this.

    • @ravakahr
      @ravakahr Год назад

      Should be fixed in cyberpunk

    • @DelgaDude
      @DelgaDude Год назад +1

      That's the price u pay for using bleeding edge tech not everything has been ironed out yet.

    • @gameplayers_cz9778
      @gameplayers_cz9778 Год назад +1

      Try disabling XMP/EXPO in the BIOS, and try verifying the integrity of the game files in Steam for all games

    • @Simon_Denmark
      @Simon_Denmark Год назад +1

      So you don't just mean the momentary massive stutter after enabling it, and every time after leaving the map/inventory?

    • @seriouserer
      @seriouserer Год назад

      Try enabling Vsync in NVCP if it's not already and disable any frame limiters.

  • @rainsthetic
    @rainsthetic Год назад

    love your content!

  • @CyberneticArgumentCreator
    @CyberneticArgumentCreator Год назад +2

    It seems like the game runs "fine" at 1080p on medium settings on even budget rigs, but change a couple of settings up a notch and it slows to a crawl.
    I hope they invest dev time into rapidly fixing the optimization and memory usage. Would be a shame if such a fun game that so many artists poured years and years of time into was marred by a relatively small amount of mistakes on the optimization end.

  • @DavidAlfredoGuisado
    @DavidAlfredoGuisado Год назад +1

    I watched this test and replicated it on my 5800X3D + 4080; the 5800X3D is even faster than the 7700X shown here (120 vs 95 fps with RT disabled and 90 vs 70 with RT enabled)

  • @jdogi1
    @jdogi1 Год назад +4

    Looking for a recommendation based on what folks are finding. Trying to build up a gaming rig for my nieces to play on. The CPU is settled: a 10900K. Just trying to choose the best GPU from the pile. Any thoughts between the 3060 12GB, 3070 8GB, and 5700 XT 8GB? Any help greatly appreciated!
    EDIT: Also taking into account upcoming possible/expected optimization patches😅

    • @jerrodshack7610
      @jerrodshack7610 Год назад +1

      3070

    • @jdogi1
      @jdogi1 Год назад

      @@jerrodshack7610 Thanks! I wasn't sure if the 8/12GB might skew it towards the 3060.

    • @adamek9750
      @adamek9750 Год назад +3

      A 3060 Ti or 6700 XT is probably a better option than those listed

    • @jdogi1
      @jdogi1 Год назад +1

      @@adamek9750 Yeah, I know, but I can't go buying them a new GPU; it must be from the stockpile. I've convinced them NOT to buy the PS4 version. They aren't full-on tech nerds, so they probably don't even understand why. But now I need to give them an alternative 😅

    • @adamek9750
      @adamek9750 Год назад

      @@jdogi1 3070 for sure, what resolution are they playing?

  • @hisgen01
    @hisgen01 Год назад

    Didn't know a CPU could make that much of a difference, Daniël 🤔. My combo: 5800X3D, 32GB 3200MT/s & 6900 XT (2500MHz). I run the game at 1440p ultra, no RT, FSR set to quality. In the castle I mostly get 165 fps (I set max fps to 165 because I have a 165Hz monitor). GPU usage is often between 90 & 99%. In Hogsmeade and other areas with a more open view, I get around 110 to 120 fps. The game is very smooth; fps drops to 80 fps are rare.

  • @TheVanillatech
    @TheVanillatech Год назад +3

    Terrible optimization because it's developed for the consoles and drop-ported to PC. I don't know what the surprise is. It's been the same for a decade or more with all "AAA" titles: consoles first, then the same shitty version drop-ported to PC. A handful of exceptions, like GTA 5, stayed in development LONGER after the console release to optimize and make a true, ultimate PC version. But that's an anomaly.

    • @stirfrysensei
      @stirfrysensei Год назад +1

      GTA 5 had major performance issues when it came to PC... so did RDR2. Should probably look into that.

    • @blkspade23
      @blkspade23 Год назад

      @@stirfrysensei Yeah those games weren't actually in development longer, they were just timed exclusives for console. That's been the case with damn near every rockstar games release.

    • @TheVanillatech
      @TheVanillatech Год назад

      @@stirfrysensei Like fuck! The PC version was held back a year after the console release in order to give PC gamers a true PC-hardware-level experience, better in every way than the consoles. They didn't drop-port it. They optimized for PC, and GTA 5 is regarded as one of the best cross-platform AAA titles ever made; it's still used in benchmarks today! Get tuned into the real world, bro.
      It was the console versions of GTA 5 that had terrible performance, hence the day one PS4 patch removing half the graphics to maintain that 30 fps framerate!
      RDR2 was terrible because, YOU GUESSED IT, drop-ported code from console to PC, like I already said.
      If you had issues with GTA 5 on launch, it's cos you either had terrible 5-year-old hardware in your rig or you were running it at stupidly high settings. In which case, open the menu and turn the options down. There were dozens of options in GTA 5's graphics settings, cos it was a real PC game developed FOR THE PC.

  • @kathleendelcourt8136
    @kathleendelcourt8136 Год назад +1

    Have they fixed the Ryzen CPU performance bug yet? I've seen benchmarks showing an i3-12xxx being almost twice as fast as a 5800X3D, which is preposterous. Hardware Unboxed also mentioned on their Twitter that Zen 4 CPUs were completely underperforming, with very poor thread distribution and utilization (only 4 cores actually busy, and even those "busy" cores were barely touching 50% utilization each).

  • @ofthenearfuture
    @ofthenearfuture Год назад +4

    Do you think the denuvo DRM is having any impact on cpu performance?

    • @Bruce_Black
      @Bruce_Black Год назад

      Most games that had Denuvo removed gained 15-30 fps and had less stutter,
      so probably yeah

  • @EasternUNO
    @EasternUNO 5 месяцев назад

    Good video, except for those 1000 annoying ads. I own an i9-9900KS paired with an RTX 4080. In Hogsmeade, fps drops to 33-47 at 4K with the ultra preset and ray tracing on. DLSS upscaling doesn't help that much, and I cannot enable frame generation, because if I do, the game always randomly crashes after playing for 5-10 minutes, plus there is noticeable input lag. It seems that the game crashes because of "hardware accelerated GPU scheduling", which can be set in Windows graphics settings, and you need this setting on in order to enable frame generation. So this game is unoptimized garbage on PC. I'm not having issues with other games such as Horizon Zero Dawn; no issues at all with frame generation there.

  • @93836
    @93836 Год назад +3

    I've been enjoying the heck out of this game. I'm running it with a 4090 and a Ryzen 5900X, and my system feels like the bare minimum for this game, which is crazy, haha. But that's one of the reasons I get overpowered hardware: so I can brute-force through any unoptimized games that come out, haha. It's basically insurance that I'll never need to compromise on my 2160p 120fps ray tracing expectations 😆

    • @csguak
      @csguak Год назад +3

      13900K OCed, 4090 FE OCed here, and I'm getting crushed at 4K 😂🤣😂

    • @93836
      @93836 Год назад +2

      You know what, games like this are one of the situations where you can really justify getting a 13900K and overclocking it. Every drop of CPU performance helps here. And frame generation is worth its weight in gold! The people talking trash on frame gen guaranteed don't even own a 40-series card. Pull out every Nvidia trick in the book and you still get a mostly-normal gaming experience, even with unfinished "beta" games like this 😆👊

  • @adrianemporr58
    @adrianemporr58 Год назад +1

    I have an i7 12700K with an RX 6800 XT. I run 1440p ultra, RT off. I observed that every time you open/load the game, the CPU goes to 80-100%. The game just loads every location again, making it heavily CPU demanding, but after a location has loaded in, the CPU drops to 60% and the GPU goes to 100%.
    Conclusion: every time you open the game, it loads shaders/objects FROM SCRATCH, making your CPU bottleneck the GPU for some time. (In my case I went from 100 fps in the Gryffindor common room to 144+ fps in about 5 minutes; then when I left the common room my fps dropped to around 110-120, and after 5-10 minutes in one location it went back to 144+. When I returned to the Gryffindor common room, the location was already loaded in and I had a stable 144+ fps.) This applies to every location.
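
    The pattern described above, slow on the first visit to a location and fast on revisits, is what lazy loading with a cache looks like; a minimal sketch (the names and the 50 ms cost are hypothetical):

    ```python
    import time

    _asset_cache = {}

    def load_location(name):
        # First visit pays the "shader/object" build cost; revisits hit the cache.
        if name not in _asset_cache:
            time.sleep(0.05)  # stand-in for streaming/compiling assets
            _asset_cache[name] = f"assets:{name}"
        return _asset_cache[name]

    t0 = time.perf_counter()
    load_location("gryffindor_common_room")  # cold: builds and caches
    cold = time.perf_counter() - t0

    t0 = time.perf_counter()
    load_location("gryffindor_common_room")  # warm: cache hit
    warm = time.perf_counter() - t0
    print(f"cold: {cold*1000:.1f} ms, warm: {warm*1000:.4f} ms")
    ```

    The cold path is orders of magnitude slower than the warm one, which matches the "fps climbs after a few minutes in each location" behavior.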

  • @aeztral
    @aeztral Год назад +4

    What a joke of an optimization attempt lol. What point is there to having high-end specs if even average looking games don't run well because devs don't optimize for PC.

  • @CaptToilet
    @CaptToilet Год назад +1

    I think this just comes down to how the engine handles CPU cores and threads. I find it hard to believe that even a 13900K or the AMD equivalent isn't strong enough to eliminate these bottlenecks. I remember a game called Vampyr that came out a few years back, and it too was CPU limited in some of the hub areas. A UE4 game, unsurprisingly. My GPU usage would drop to the 40s percentage-wise at times, it was that bad. I strongly believe it's just engine inefficiency.

    • @siegfriedgottz698
      @siegfriedgottz698 Год назад

      Yeah, I remember that game not running well even with a 3070

  • @NeVErseeNMeLikEDis
    @NeVErseeNMeLikEDis Год назад +7

    Game looks horrible

    • @johnboylan3832
      @johnboylan3832 Год назад +1

      Yeah, it doesn't look as good as the min specs would suggest.

    • @DelgaDude
      @DelgaDude Год назад +7

      Last-gen game with a terrible ray tracing implementation. Idk what they did that makes it require top-tier hardware to run well.

    • @aj.5841
      @aj.5841 Год назад +1

      I like the look of the game, but only in native 2K res with high settings on a 3080. I don't know why, but DLSS makes this game look washed out. Thankfully, the game runs well enough with the typical stutters here and there.

    • @OperationDecoded
      @OperationDecoded Год назад +2

      @@DelgaDude It's more like what they didn't, which is optimize the game.

    • @DelgaDude
      @DelgaDude Год назад +2

      @@aj.5841 Yes, but the 3080 is a top-tier modern card; saying it runs "well enough" in a game that looks like it's 5 years old says it all, don't you think?

  • @OrcCorp
    @OrcCorp Год назад

    A quick tip: if you want to test a game like this in actual 1080p, change the resolution of the monitor/tv in Windows before you start the game.
    And as the RivaTuner stats are so small, could you scale them up please, so that people on phones can see them better 👍🏻

  • @Brad-bb7ey
    @Brad-bb7ey Год назад +1

    i5 12600K, 3060, 60Hz 1080p. All settings on ultra. Zero issues; CPU 30°C, GPU 50°C.

  • @maxolina
    @maxolina Год назад +1

    I think that the population setting only works when you enter a new area, as it can't remove NPCs from right in front of you.

  • @KeyholeDweller
    @KeyholeDweller Год назад +2

    The optimization for RT is awful and DLSS 2 doesn't help. Frame generation helps... but what really helped me was giving up on RT. 🤣
    The game looks the same or better in some places, and I get more fps than my monitor's refresh rate at any given point, so that's what I would recommend to players on PC: either frame generation or turning off RT.

  • @TechButler
    @TechButler Год назад

    Currently running it between 75-90 fps on full ultra settings with all ray tracing options on, in 1440p, on a 6700 XT with 16GB of 3600MHz RAM, an NVMe drive and a Ryzen 5 3600. I'm on a 1440p monitor that can run up to 165Hz and I have V-Sync turned off. There's more to this than the GPU. I will admit, though, the Adrenalin software wanted to cook my card again until I realized that I, yet again, had to import my settings, because it can't remember the settings I've already chosen. That's the worst part of the AMD experience they fail to address. Once I imported my settings, the card was running below 80°C. Remember this: if you're having problems running the game, try deleting the Hogwarts Legacy folder in your user folder, in AppData under Local (not Roaming).

  • @Ebilcake
    @Ebilcake Год назад

    Same here, pretty much. GPU sits at around 55%, loads of stutters and framerate drops walking around the castle; the town is just below 60 fps and stutters quite a bit.
    Memory usage isn't particularly bad; it spikes up to 90% but then settles back down to around 60-70%. Video memory is pegged at around 8GB.
    3900X
    3080FE
    X570
    16GB DDR4 3600
    Installed on an M.2

  • @kenhiew2900
    @kenhiew2900 Год назад

    There are reports the 7900 XTX is running this game 30 to 40 percent faster than the 4090 on current AMD drivers. I believe that's without RT.

    • @kenhiew2900
      @kenhiew2900 Год назад

      The reason being that the Nvidia driver adds a lot of overhead on the CPU.

  • @lolitagupta2201
    @lolitagupta2201 Год назад

    Small note: 16GB of RAM is on the low side for this game. You can check some benchmarks on it; I tested and upgraded myself.
    Of course it's not going to solve it in your case, but even I, on a 12600K at 5.3GHz, was having dips in that area until I upgraded

  • @ATY676
    @ATY676 Год назад +1

    Not seen a single issue with my 12900k and 4090.

  • @vextakes
    @vextakes Год назад +1

    Would this mean the game is just poorly optimized? Since only certain things can be hardware accelerated on the GPU, does the engine not utilize that acceleration or something, so it falls back on the CPU?
    But it's also pretty bad just rasterized, so it's just generally poorly optimized, I guess

  • @H4GRlD
    @H4GRlD Год назад +1

    70 fps at 1080p on a high-end PC (even with RT turned off!) screams bad optimization. It's also very weird to see technically nothing being utilized properly. What is this :D
    Very cool video btw, thanks for it.

    • @redknight2120
      @redknight2120 Год назад

      What do you call high end PC? please tell the specs :)

    • @H4GRlD
      @H4GRlD Год назад

      @@redknight2120 I was talking about the second config from this video, 5950X 32GB DDR 4 3600 + RTX 4080. This should clearly run a game like this around ~ 240 fps at 1080p. Red Dead Redemption 2 could be a good comparison performance - graphics wise. I might be wrong :)

  • @Loquedigafaby140208
    @Loquedigafaby140208 Год назад

    5800X + 4070 Ti, ultra settings, RT off and frame generation ON. No problems at 2K resolution; the framerate is always between 120-140 fps, with GPU usage around 60-70% most of the time

  • @mfvicli
    @mfvicli Год назад

    I play this on high settings (1080p), no RT, DLSS quality, and the lowest I go is 50 fps. The game uses a lot of RAM, however. My system utilization will peak at 18/32GB used and my VRAM goes as high as 9GB used. It's certainly playable.

  • @andrej177
    @andrej177 Год назад +6

    Is it me or the textures in this game look kinda bland

    • @flimermithrandir
      @flimermithrandir Год назад

      That really depends on the VRAM, as far as I can tell. The earlier videos showed way better results, I think. It may just be my memory playing tricks, but I really think it looked better.

    • @mememanfresh
      @mememanfresh Год назад

      The humans, yeah

  • @Parky_T
    @Parky_T Год назад

    The issue with the low performance is the DRM. It's always the DRM. While Ray Tracing does have a reasonable CPU cost, it's almost always made 10x worse by the DRM. Denuvo is ruining game performance.

  • @valentinvas6454
    @valentinvas6454 Год назад +1

    Am I the only one wondering how the hell last-gen consoles will run this game?
    If something like the 9600K is brought to its knees in Hogsmeade, the old Jaguar CPUs will probably scream in agony LOL.

  • @finaldx7
    @finaldx7 Год назад +1

    Good points. I have a 13900KS, 32GB of DDR5 and a 4090, and I'm rendering at native 4K with all RT effects on, but also with frame generation (without DLSS) on. I get about 85-100 fps in Hogsmeade with dips into the 70s, and it's responsive enough. I'm sure latency is affected, but it's not very noticeable.
    DLSS in this game, even on quality, is quite trash. Turning it on results in muddy scenes and very poor reflections.

    • @dakelpoandgames2593
      @dakelpoandgames2593 Год назад

      What's odd is that I'm running a 5600X, a 3070, and 48GB of 3200MHz DDR4 at 1440p ultra, and I'm getting the same fps with only small drops. The game does seem to be CPU limited, and even then, having more RAM seems to help overall. My buddy with the same system and just 16GB of RAM can't even run the game; it just stutters and then crashes. Only time will tell with patches, I guess.

    • @KeyholeDweller
      @KeyholeDweller Год назад

      @@dakelpoandgames2593 Are you using RT maxed out? Because without RT he would get 200+ 🤔

    • @KeyholeDweller
      @KeyholeDweller Год назад

      Almost same build and I just turned off RT... it's simply not worth it...

    • @finaldx7
      @finaldx7 Год назад +1

      @@KeyholeDweller I think the RT reflections and shadows are probably give or take, they look just ok. But I think I'd keep RT ambient occlusion turned on if your fps and frame times aren't too low.

    • @KeyholeDweller
      @KeyholeDweller Год назад +1

      @@finaldx7 After a lot of testing I think it's better to turn on RT reflections instead of ambient... reflections actually look prettier. While occlusion does almost nothing might even look worse because of the weird lighting effects.
      So maybe try only RT reflections on, with no DLSS. I'm getting 150-170 with max dips into 105fps. And the game looks better than ever.

  • @Harrave
    @Harrave Год назад

    Can't wait for this game to finally have a proper Day 1.5 patch and finally fix it xD on another note, hope your daughters get well soon!!!

  • @night3094
    @night3094 Год назад +1

    Below 50% GPU usage ☠ Do devs just not bother with optimizing their games anymore and just slap a 4090 in the requirements? This is unacceptable

  • @Jason_Bover9000
    @Jason_Bover9000 Год назад

    Thanks for this video

  • @kendragon2827
    @kendragon2827 Год назад

    13700K, RTX 4090, 3440x1440 UW, ultra settings, ultra RT, no DLSS, motion blur off, film grain off. Averaging about 120, never going below 100; haven't gotten to Hogsmeade yet. Oh, and 32GB DDR5-6000. I can't see why so many people are having issues. Mine runs great.

  • @ROZAR802
    @ROZAR802 Год назад +1

    I have an RTX 2060, Ryzen 9 5900X, and 16GB of RAM, and my game lags terribly on medium settings with no ray tracing. I don't know what the issue is or how to fix it

  • @TenthMarigold
    @TenthMarigold Год назад

    Recently you mentioned getting an Alienware Aw3423 OLED. How does it compare to your C1 or do you plan on making a video in the future about it?

  • @adrastoso9727
    @adrastoso9727 Год назад +2

    After watching this second video, none of this makes sense, because the CPU is not near 100% on any core, so I don't understand how the game is CPU bound. There must be a driver issue with this game; it's brand new, so I'm sure driver updates will come out in the near future.

    • @dinachoueiry6073
      @dinachoueiry6073 Год назад

      No CPU core will show 100 percent. You'll mostly see 60-80 percent while just one thread is effectively used up 100 percent, because that one thread is the one talking to the GPU. If that one thread bottlenecks, they all do, sadly

    • @adrastoso9727
      @adrastoso9727 Год назад

      @@dinachoueiry6073 right, so my point is it’s not a CPU issue but a driver issue with the game.

    • @dinachoueiry6073
      @dinachoueiry6073 Год назад

      @@adrastoso9727 Not a driver issue, but optimization in Unreal Engine 4.
      The way games are made at the moment, there's no room to open up more threads to talk to the GPU. Ray tracing is software from Microsoft and uses Nvidia's software to drive the RT cores on Windows.
      CPU limitation is a thing at the moment due to those factors, which is a bottleneck at the end of the day.
      Because no code in the world will optimize away the problem when the problem is Unreal Engine 4

  • @etchieSketchie
    @etchieSketchie Год назад +1

    I requested a refund for this game via Steam. My RTX 2070 Max-Q with an i7 9750H simply cannot handle this game without a stutter-fest. Every time I tried to play it, it just made me mad. It blows my mind that a game in 2023 looks worse than Red Dead Redemption 2 (2019) and SIMULTANEOUSLY runs significantly worse. I will reconsider this game if patches resolve the performance issues.

  • @gatomon2002
    @gatomon2002 Год назад

    I have an older CPU, an i7 6700, and an RTX 2060 with 32GB of RAM. My CPU usage has been 40-50% and GPU hovers around 60%. I turned off DLSS, set anti-aliasing to DLAA, and changed certain graphics settings to high. That's when I saw the difference: GPU usage went up to 80% and CPU usage dipped into the low 40s consistently. I was still able to get 60-78 fps.