The i5 9600K, even overclocked to 5GHz on all cores, wasn't enough for ray tracing back in 2021 with Cyberpunk. And not just Cyberpunk: Watch Dogs Legion, Far Cry 6 and many other open world games with ray tracing. I noticed that early on; I had one, and also had the 5600, which wasn't enough either, though it was better. Now I'm on the 5800X3D and it seems to be the bare minimum for 60+ fps with ray tracing in open world games. It's a fantastic CPU and I'm glad I'm not going below 60 fps in RT games anymore, but clearly it won't keep up in the future when we start to get more demanding games. We always need to upgrade our hardware if we want to maintain some sort of performance standard because of bad software optimization on PC. That's sad and depressing. I wish I could just leave my hardware alone for a few years before thinking about upgrading, but that's not possible these days.
CPU limitation often means there's a lot of script work running in the background. I made some Lua scripts for a GTA Online mod menu and had the same issue with low CPU/GPU utilisation and bad fps when I put heavy loops in a script that ran every frame (like for i = 1, 1000 do). It runs fine if it doesn't run every frame (say, with an ~80 ms pause between cycles). So the Hogwarts devs have to rethink what they have done in their scripts, I guess
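(As a rough illustration of the pattern described above: the commenter's scripts were Lua, but here is a hypothetical Python stand-in with made-up numbers. Doing the heavy work every frame inflates the frame time, while throttling the same work to roughly every 80 ms keeps the loop fast.)

```python
import time

HEAVY_ITERATIONS = 100_000   # arbitrary stand-in for expensive per-frame script work
THROTTLE_MS = 80             # only run the heavy job every ~80 ms, as the comment suggests

def heavy_job():
    # Burn some CPU, standing in for a big "for i = 1, 1000 do" style loop.
    total = 0
    for i in range(HEAVY_ITERATIONS):
        total += i * i
    return total

def frame_loop(throttled, frames=300):
    """Simulate a game loop and return the average frame time in milliseconds."""
    last_run = 0.0
    start = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        if not throttled or (now - last_run) * 1000.0 >= THROTTLE_MS:
            heavy_job()            # the per-frame script work the comment is talking about
            last_run = now
        # (rendering and the rest of the game logic would happen here)
    return (time.perf_counter() - start) / frames * 1000.0

print(f"heavy work every frame : {frame_loop(throttled=False):.2f} ms/frame")
print(f"heavy work throttled   : {frame_loop(throttled=True):.2f} ms/frame")
```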
I've been running: all settings on the recommended High, all RT off, 1440p target res with DLSS on (Quality setting), locked to 60fps. These were the original default settings the game first booted up with. I haven't had an issue whatsoever. I've only today seen that many are having issues and am somewhat surprised. I have had an almost consistent 60fps for ~20 hours of play time. I've been watching FPS fairly consistently since I started playing to see if I could maybe increase some settings here and there. The only time I really saw any drops was for a fraction of a second in big area transitions, but it's barely noticeable. Running on a RTX 2070 Super, i7-9700K, 16GB RAM
After seeing people having issues I tried RT and yep.. it's borked! Out of interest I also tried turning off DLSS to try native 1440p rendering, ultra settings, no RT. And to my surprise it still ran at a consistent 60 with occasional dips in Hogsmeade and a few places in the castle. The dips at area transitions were much more noticeable though, sometimes lasting a few seconds before restoring to 60. Going back to the original recommended settings, as I don't see much difference and I prefer a smooth 60fps
970, 5600 and 32gb ram (low settings with effects on medium). it's playable (the vast majority of the time it's 60 fps) but some areas of the game tank it hard. that is something they can look into, i'm sure they just have a bit too many effects/objects in certain areas. that is very much something they can remedy over time. if i had even a 2000 series card though...i doubt i would ever see it dip below 60 fps. it's already amazing this 9 year old GPU barely dips below 60 fps. granted FSR 2 is pretty much required, but still... the game is detailed enough that it doesn't really matter to me. i'm sure the game looks amazing in 1440p ultra and even better with ray tracing, but performance isn't free; if you want it you are gonna have to pay for it in some way.
Honestly the raytracing isn't that noticeable from my experience with the game on the PS5 and in PC vids. 1440p is nice though. 4k is pretty but you'll be happier with a high-refresh rate 1440p monitor when you decide to upgrade. The 970 is a great card - keep an eye out for sales on the 6800xt if you're looking for an affordable upgrade path that will take advantage of your 5600.
@@atomicfrijole7542 sad to hear that RT isn't that big of a jump, i 100% agree about having/wanting higher refresh rates though. 60 fps feels like a slog at times; i never had a 144hz monitor but my brother does and it's sooooooooooo smooth. i don't think i would ever use 4k unless it could get higher refresh rates on top. then again that would require a beast of a GPU so it's unlikely i would have to choose anyway xD. was eyeing the 6000 series but then 7000 came out, then things happened and i decided to wait (was going to get a 7900xtx). so now i'm just waiting for the lower tier GPUs to release to see what happens to GPU prices. waiting a few months or another year isn't a big deal, i've already been waiting almost a decade.
@@rocksfire4390 I don't know if you're in the USA but if you are and are near a Costco, there's a sale on LG's 1440p 165hz 32" panels. I love mine, and I think it's come down to $230 on sale thru the 26th. It seems to me that raytracing is one of those beta features that is still in development and will be for a couple more generations before they get it right. It's nice to have it in Cyberpunk or The Ascent, but it is such a drag on the gpu (even the 3080) that it loses its value. The really good news is that 1440p is the new 1080p thanks to everyone pushing 4k, so you are already in a great position for a mid-range to high-end gpu to get amazing framerates at 1440p. I think you'd be really impressed and happy with anything 6800xt or above on the Radeon side and 3080 (maybe even the 3070ti) or above on the Nvidia side.
Currently, running it between 75-90 FPS on full ultra settings with Raytracing options all on in 1440p on a 6700XT and with 16Gb of 3600Mhz RAM, an NVME and a Ryzen 5 3600. I'm on a 1440p monitor that can run up to 165Hz and I have V-Sync turned off. There's more to this than a GPU. I will admit, though, Adrenalin software wanted to cook my card again until I realized that I, yet again, had to import my settings because it can't remember the settings I've already chosen. That's the worst part of the AMD experience they fail to address. Once I imported my settings, the card was running below 80C. Remember this: If you're having problems running the game, try deleting the Hogwarts Legacy folder in your users folder in AppData under Local (not roaming).
Looking for a recco based on what folks are finding. Trying to build up a gaming rig for my nieces to play. The CPU is settled: a 10900K. Just trying to choose the best GPU from the pile. Any thoughts between the 3060 12GB, 3070 8GB, and 5700 XT 8GB? Any help greatly appreciated! EDIT: Also taking into account upcoming possible/expected optimization patches😅
@@adamek9750 Yea, I know, but I can't go buying them a new GPU. Must be from the stockpile. I've convinced them to NOT buy the PS4 version. They aren't full on tech nerds, so probably don't even understand why. But now I need to give them an alternative😅
I get horrible stuttering on a laptop with 16GB of RAM, i5-9300H, GTX 1660 Ti. I'm pretty new to this stuff, but when I looked at the MSI Afterburner stats, it seems that my commit charge (the second number in the RAM stats) is 19000 MB, which is way above the 16GB of RAM I have available. Could this be the cause? I'm not exactly sure what that stat describes. The first number on the RAM line is somewhere around 13000 MB, which should be okay.
I am running Hogwarts Legacy fine, no stutters. My system specs: i5 13600KF, 32GB DDR5 5600, GTX 1070. And funnily enough I don't have any dips below 60 with a capped framerate. I use FSR 2.0 Ultra Quality and everything else is at Medium.
@@alaa341g Because they are still too expensive and I'm waiting to see what the new generation of GPUs brings. Also I'm still very satisfied with my GPU. I upgraded everything else, since it was an i5 4690K and 16GB of DDR3 before that, and I was just getting bottlenecked by it in quite a few games.
The 1070 has a lot of horsepower for a 2016 GPU. 8GB VRAM and decently overclockable too. What you're saying checks out with everything the internet is saying - it's just a terribly optimized game that has a terrible render thread.
@@CyberneticArgumentCreator I see people have stutter with high end cards while I didn't experience any so far, that's what's weird to me. I'm using the 511.79 drivers and put the shader cache to unlimited, maybe that's what helps, I have no idea.
i have a rtx 2060, ryzen 9 5900x, and 16 gb of ram and my game lags terribly on medium settings no ray tracing. I do not know what the issue is or how to fix it
Ray tracing is such a scam, I have no idea what the point of it was with Nvidia. In many games things look better with RT off, and in others it's so badly optimized that you are losing up to 80 percent of your frames for it.
This title is one of the games that utilizes raytracing the most; it really improves the graphics. Unfortunately the developers didn't put any effort into non ray traced reflections. "Either raytracing or nothing" seemed to be the philosophy when creating this game.
I instantly disregard the opinion of anyone who calls an entire rendering paradigm that has been researched and developed since the 1970s a "scam." Badly optimized RT is no reason to dismiss RT.
So far I keep Ray Tracing reflections on. The others I can sacrifice. I think they do the shadows pretty well 'artificially'. I think you're halfway right about the limitations not being solely CPU, but also coding and memory bandwidth usage etc. The CPU is obviously waiting on something. I see up to 96% usage on my 4090 and 10850K in other scenes just fine though.
@@RicochetForce Something rare at least means it is possible and has happened. It could be they need to set an "end" limit to the ray tracing calculations, especially for the ambient occlusion stuff. I'm nearly sure that since it's been brought up by many reviewers, they will patch it.
@@teddyholiday8038 I have no stutters. 4k DLSS Quality or Balanced. I do not have the latest Nvidia drivers either, I hope you figure it out. I can get 223 fps in some scenes with Frame Generation. the 4090 is slightly overclocked as well.
I have an i7 12700K with an RX 6800 XT. I run 1440p Ultra, RT off. I observed that every time you open/load the game the CPU goes to 80-100%. The game just loads every location again, making it heavily CPU demanding, but after the location is loaded in, the CPU goes to 60% and the GPU to 100%. Conclusion: every time you open the game, it loads shaders/objects from scratch, making your CPU bottleneck the GPU for some time. (In my case I went from 100fps in the Gryffindor common room to 144fps+ in like 5 minutes; then when I left the common room my fps went back down to around 110-120, and after 5-10 minutes in one location it went back to 144+ again. When I got back to the Gryffindor common room the location was already loaded in and I had a stable 144fps+.) This applies to every location.
For whatever reason I get massive stuttering when I enable DLSS3 on a 4090 with a 7950x. It seems to happen on Witcher 3, Cyberpunk, and in this game. Disabling the secondary CCD using Ryzen master seems to resolve this issue mostly, but I'm just surprised nobody is looking into this.
Been into computers for 40+ years. I'll tell you that the most interesting times were when things were coming out that ran slow. So much more interesting than those long periods where software stagnated. Mostly it was caused by overconcentration on consoles. But anyways if things getting better excites you then having stuff that maxes out hardware for good reasons is a really nice time.
@@e0nema0tak1v It's wild how you, with one sentence, proved how little you know about hardware accelerated processes like framegen. If you, for example, are already gimped by CPU and VRAM performance, then frame generation input lag will have an exponential effect compared to generating from a steady 60fps native. The higher your base input lag, the exponentially worse the negative effect of frame generation delay will become.
@@e0nema0tak1v It's more the fact that DLSS and frame interpolation, as well as Vsync and any form of upscaling, will gimp the frame buffer pipeline, thus defeating the purpose of higher fps in the first place. To fully benefit from high refresh rates and fps you must clean up the pipeline: use native resolution, native fps, no vsync, no nothing, and you will have a better experience already. If you're having trouble with fps just do what we've always been doing: drop resolution, use optimized settings, disable any RT tech, etc., and after all that, cap fps to 2 frames lower than your monitor's Hz to prevent screen tearing. Also DLSS and FSR don't benefit anybody: low end users will have their game look horrible with lower resolutions and imprecise pixel upscaling artifacts, and high end users who want to increase their already good fps are only introducing input lag, and that doesn't help in any way. It's a redundant tech in any situation.
I'm not familiar with all this computer lingo but I'm hoping someone can comment with what settings I should play at. CPU is i7-9750H 2.60ghz GPU is GTX 1650 and I have 32gb ram. Currently on all the recommended settings and it runs ok just didn't know if it could do more.
It seems like the game runs "fine" at 1080p on medium settings on even budget rigs, but change a couple of settings up a notch and it slows to a crawl. I hope they invest dev time into rapidly fixing the optimization and memory usage. Would be a shame if such a fun game that so many artists poured years and years of time into was marred by a relatively small amount of mistakes on the optimization end.
I think this just comes down to how the engine handles the CPU cores and threads. I just find it hard to believe that even a 13900K or the AMD equivalent aren't strong enough to eliminate these bottlenecks. I remember a game called Vampyr that came out a few years back, and it too was CPU limited in some of the hub areas. A UE4 game, unsurprisingly. My GPU usage would take a hit down into the 40s percentage-wise at times, it was that bad. I strongly believe it's just engine inefficiency.
5800X3D, 32GB 3600C16, 7900XTX. I'm getting between 100 and 144fps (144fps in Hogwarts, 100 in the city), always GPU bound with the ultra preset, no RT, FSR 2 quality. Without the day one patch FSR was reducing GPU usage to 50%, frametime all over the place, but now it's perfect, except for a robotic voice for your character if you change the voice pitch, and white bushes (a mod fixes that, or you can lower materials to low to fix the issue). I also found that my character's face lighting tends to look weird.
i7/i9 13th gen are quite a bit ahead of the AMD 7000 series in this game. Single core performance and higher memory speed support help overcome the terrible game engine. The 7000-series X3D may change that.
Even the Steam Deck GPU has no problem running this game. All of the performance issues I see when playing are when the CPU hits 4 to 5 watts or higher, or 60% or higher usage. The frames tank as soon as either one of those happens. I play docked at 1080p with FSR 2 performance, everything low + medium textures. When handheld, 800p with FSR 2 quality, same settings. I've even had sub 30 fps when the Steam Deck GPU is at 75ish% util and the CPU is pegged. If they can manage to get even a bit of CPU optimization in, this game will be a Steam Deck gem. It's already almost there. Edit: docked I shoot for visuals and a 30 fps cap. Handheld I shoot for a smoother 40fps and less wattage
@@ryans3795 I agree with you and some people still believe that CPUs are bottlenecked lol, even that i5-9600K is still a good CPU when paired with a 2060 or something like that..
I’ve been enjoying the heck out of this game. I’m running it with a 4090 and a Ryzen 5900x. And my system feels like the bare minimum for running this game. Which is crazy, haha. But that one of the reasons I get overpowered hardware - so I can brute force through any unoptimized games that come out. Haha. It’s basically insurance that I’ll never need to compromise on my 2160p 120fps ray tracing expectations 😆
You know what, games like this are one of the situations where you can really justify getting a 13900k and overclocking it. Every drop of CPU performance helps here. And also Frame Generation is worth its weight in gold! The people talking trash on Frame-Gen guarantee don’t even own a 40-Series card. Pull out every Nvidia trick in the book, and you still get a mostly-normal gaming experience, even with unfinished “beta” games like this 😆👊
Using an RTX 3090 and 5950X and 64GB RAM. Absolutely no problem here. Max settings, raytracing full and 4K DLSS Quality, approximately 60 FPS with VSYNC on my 4K TV. Perfectly playable and beautiful.
What a joke of an optimization attempt lol. What point is there to having high-end specs if even average looking games don't run well because devs don't optimize for PC.
Good video except those 1000 annoying ads. I own an i9-9900KS and I have it paired with an RTX 4080. When in Hogsmeade, FPS drops to 33 - 47 at 4K, ultra preset and ray tracing on. DLSS upscaling doesn't help that much, and I cannot enable frame generation, because if I do, the game always randomly crashes after playing for 5 - 10 minutes, plus there is noticeable input lag. It seems that the game crashes because of 'hardware accelerated GPU scheduling', which can be set in the Windows graphics settings. And you need this setting on in order to enable frame generation. So this game is unoptimized garbage on PC. I am not having issues with other games such as Horizon Zero Dawn. No issues at all with frame generation there.
Terrible optimization because it's developed for the consoles and drop ported to PC. I don't know what the surprise is. It's been the same for a decade or more, with all "AAA" titles. Consoles first, then the same shitty version drop ported to PC. A handful of exceptions, like GTA 5, were in development LONGER after the console release to optimize and make a true, ultimate PC version. But that's an anomaly.
@@stirfrysensei Yeah those games weren't actually in development longer, they were just timed exclusives for console. That's been the case with damn near every rockstar games release.
@@stirfrysensei Like fuck! The PC version was held back a year after the console release in order to give PC gamers a true PC hardware level experience, better in every way than the consoles. They didn't drop port it. They optimized for PC, and GTA 5 is regarded as one of the best cross platform AAA titles ever made - still used in benchmarks today! Get tuned into the real world bro. It was the console versions of GTA 5 that had terrible performance, hence the day one PS4 patch removing half the graphics to maintain that 30fps framerate! RDR2 was terrible because - YOU GUESSED IT - drop ported code from console to PC, like I already said. If you had issues with GTA 5 on launch, it's cos you either had terrible 5 year old hardware in your rig or you were running it at stupidly high settings. In which case, open the menu and turn the options down. There were dozens of options in the graphics settings in GTA 5 - cos it was a real PC game developed FOR THE PC.
I'm not saying this game was optimized well, because it wasn't, but it's crazy how a lot of people are having issues with this game and I'm not. I'm even running it at 1440p high settings at 60fps easily with my Intel Arc A750, while recording for a video.
@@ryanroberts714 that's weird because I have an Intel Arc A750 and a 12700K and it works just fine. I guess it's primarily an Nvidia thing, but I know I've seen some people say it's still messed up on AMD GPUs
I like the look of the game, but only in native 2K res with high settings on a 3080. I don't know why, but DLSS makes this game look washed out. Thankfully, the game runs well enough with the typical stutters here and there.
Same here, pretty much. GPU sits at around 55%, loads of stutters and framerate drops walking round the castle; the town is just below 60fps and stutters quite a bit. Memory usage isn't particularly bad, it does spike up to 90% but then settles back down to around 60-70%; video memory is pegged at around 8GB. 3900X, 3080 FE, X570, 16GB DDR4 3600, installed on an M.2
5800x + 4070ti, ULTRA settings, RT off and frame generation ON. No problem at 2k resolution, the framerate is always between 120-140 fps. GPU usage around 60-70% most of time
You have audio dropouts on the 5950X PC (DPC latency issues). Are you using motherboard audio or a dedicated sound card to capture the audio? If you are using a good card to capture the audio, then it might be Shadowplay causing the dropouts; if the dropouts exist anyway, there is something wrong in your system. Also, the audio dropouts reduced significantly once you turned on DLSS 3.0 frame gen
How did you manage to get the GPU close to 100% utilization? I have a 3080 and I am stuck at 70% max, no matter what settings I change, both in game and out. My CPU is a 5900X.
4090 13700k 32 GB Ram 240Hz 1440p, pretty bad hitching here. traversing the castle in many areas just feels bad. refunded the game till performance is fixed. changing the resolution to 4k and refresh rate to 120Hz helped some, but not enough to make the game feel good to play.
I have an older CPU, an i7 6700, and an RTX 2060, 32 GB of RAM. My CPU usage has been 40-50%, GPU hovers around 60%. I had to turn off DLSS, set anti aliasing to DLAA, and change certain graphics settings to high. That's when I saw the difference, with GPU usage up to 80% and CPU usage dipping to the low 40s consistently. Was still able to get 60-78 fps.
Hi Dan, my PC build isn't state of the art by any means, but I have an MSI trio brand 3070 in conjunction with an i7 9700k and 32gb of DDR4 at 3200mhz. I struggle to achieve anything close to what you're getting (on your 3060 demo), even with graphical settings on medium, even at lower rendering scales. I'm wondering if there's something wrong with my build but every other "new" game runs just fine.
Didn't know a CPU could make that much of a difference Daniël 🤔, my combo: 5800X3D, 32GB 3200MT/s & 6900 XT (2500MHz). I run the game at 1440p ultra, no RT, FSR set to quality. In the castle I mostly get 165 fps (I set max fps to 165 because I have a 165 Hz monitor). The GPU usage is often between 90 & 99%. In Hogsmeade and other areas with a more open view, I get around 110 to 120 fps. The game is very smooth; fps drops to 80 fps are rare.
I play this on high settings (1080p), no RT, DLSS quality, and the lowest I go is 50 fps. The game uses a lot of RAM, however. My system utilization will peak at 18/32GB used and my VRAM goes as high as 9GB used. It's certainly playable.
Small note: 16 GB of RAM is on the low side for this game. You can check some benchmarks on it; I tested and upgraded myself. Of course it's not gonna solve it in your case, but even I, on a 12600K at 5.3GHz, was having dips in that area till I upgraded
I watched this test and replicated it on my 5800X3D + 4080; the 5800X3D is even faster than the 7700X shown here (120 vs 95 with RT disabled and 90 vs 70 with RT enabled)
There is this issue I noticed with "Windowed Fullscreen" games when running at a 4K (or 5K in my case) desktop resolution. It has a huge CPU overhead impact. On my Macbook Pro I mostly use Windows to play games, and I figured out that by reducing the desktop resolution down to 1440p (which scales perfectly on my 5K display), performance jumps up a lot. I saw this both on Uncharted and on Hogwarts. Now I'm running 1440p with FSR2 balanced, and +/- medium settings, and I'm very close to 60fps all the time with this 5600M (which would be equivalent to a 1070). Of course, there is what you mention about Nvidia drivers being worse on the CPU ceiling than AMD, but give it a try with a lower desktop resolution. You'll see things change dramatically!
@@anuzahyder7185 I tried again after posting. On my MacBook I played 2 hours last night pretty flawlessly. On the desktop (9900K with 3090) there are some hiccups, but overall my performance is much better with the desktop resolution at 1440p instead of 4K.
A quick tip: if you want to test a game like this in actual 1080p, change the resolution of the monitor/tv in Windows before you start the game. And as the riva stats are so small, could you scale them up please, so that people on phone can see them better 👍🏻
I only have a 20 series card. Have you tried setting the shader cache size to 10GB in the Nvidia control panel? I'm still testing, but it improved my fps..
Hi! I can't turn on frame generation, it's just not available for me. It says it has something to do with the Windows settings and I checked, but it's not that. Any ideas? RTX 3050
What's crazy is I run this game at 1440p ultrawide with a 3060ti and a R5 5600, and I'm not even hitting 100% utilisation most of the time. Horrible optimisation.
Are you referring to gpu utilisation? If so, that will be because you are cpu bottlenecked. This game is cpu bound
Did you read this comment before posting it 😅
@@riverfilmer5183 lol so in other words it's poorly optimised,😂
@@riverfilmer5183 Yes, thanks. We realised. The point is the game isn't properly utilizing those CPU cores, and there isn't a fix for that until the devs patch it
Games shouldn't be CPU bound unless you're running an old CPU with a newer/more powerful GPU, or the game isn't optimized properly. I believe that's his point
Anyone in the room as sick as I am of devs’ sheer laziness to deliver acceptable PC ports?
Not the only one dude. GenZ are getting jobs. This is a one way trip down.
I'm as sick of that as I am of most review channels' suite of console port benchmarks, which represent games hardly anyone plays relative to the market. The games are terribly optimized and totally ramshackle; no one should care about benchmarks in them, especially when they're optimized for AMD hardware in consoles. Show me benchmarks of PC titles please, I know I'm going to be able to run a single player console port at 60+ FPS and don't give a crap about any of their performance metrics.
/rant
@@neruba2173 then I can see many people migrating to consoles. A sea of incompetence optimizing games paired with Jensen’s greed to fund his leather jackets is the perfect recipe.
@@javiermd5835 the "performance mode" on ps5 is straight up trash
Take a look at the upcoming wild hearts and see what a terrible mess AAA pc gaming is.
I want to thank you for this series of videos. The fact that you test older CPUs makes for a more useful and realistic setting for the 90% of us who may have good systems, but not latest-generation or cutting edge ones. It also makes you understand why having a beast of a GPU is not the only piece of the equation. Thanks again!
Quite unfortunate; turning down textures usually hurts the most and is extremely noticeable. Textures are just free image quality as long as your VRAM and bandwidth can support them, so if possible I usually try to push them to the limit.
Run DLSS Quality then...
It’s not noticeable
@@Kizzster Doesn't work, the game with RT will try to use 13.5GB of ram even at 1600x900.
@@SolarianStrike Not asking him to run RT, it's never worth using in open world games imo. There aren't enough scenes where ray tracing shows to justify a 40-70 fps decrease.
@@jimdshea This game actually does stutter / stall if it runs out of VRAM.
great CPU comparison! appreciate your efforts. hope developers can address the issue and optimize RT in this game
It's usually around 90-120fps for me in sprinting through Hogsmeade, 7900XTX and 5800X3D. X3D CPU and AMD GPU seems to be the best combination for this game. The 3D V-cache and lower CPU overhead on AMD cards helps a lot in the CPU intensive areas.
I’m running on a 9600k and 3070, what I find odd about this game is how much my performance drops when moving from area to area. If I chill in one section of the castle for a minute i’m usually able to average around 80-90fps, but if i’m jogging around that average drops to like 40 all the time
Shit coding unfortunately....
I have the same specs and I can confirm I’m running into the same issues my man, hopefully in the future they come out with a good patch
Same with 10400
3070 is running out of vram. How is it now after patches and such?
I have 9900 ks and 4080 and the same issues for me. In the school 80 - 90 FPS and when going outside it drops to 37 - 49
I do wonder how much of the stuttering is caused by denuvo being denuvo though.
Well, as much as I hate Denuvo, I think at this point it can't be considered THE problem here; I think the game is just that badly unoptimized, as a lot of games recently are releasing with weird CPU limits.
But then again, Denuvo has a history of causing stutters and more CPU limits, so removing it may help performance in some way. Just look at RE Village
No, it's too single-thread limited. I don't like Denuvo, but this is worse than mere stuttering.
@@lynackhilou4865 there is gonna be a video comparison when Denuvo gets cracked
@@Eldesinstalado usually Denuvo cracks are just bypasses; if the devs won't remove it themselves, it will run the same. The exception I know of is AC: Origins, where the crackers removed Denuvo themselves, and the game actually ran a lot better, especially in CPU bound scenarios
@@lynackhilou4865 yeah. It ain't Denuvo. I have a 4090 and a 5900x, the game is butter smooth on Ultra with RT off, but with RT on the stuttering is insane. Something is borked with the RT
The devs surely must be readying a patch; there's no way they can release a game with as wide an appeal as this one with hardware demands so high that most people can't run it decently. It looks really good, but there's something not right here.
I mean idk man, Elden Ring was pretty bad for a while, still no ultrawide support as well haha
I don't understand...you think because Harry Potter is popular they should make it have low graphics and be easy to run because a lot of people will want to play it...? That isn't how it works
@@samgoff5289 It usually does. Most popular games are made not with overkill graphics in mind, so you can still run them decently even if you have old hardware, but they tend to scale well so you can play them at high framerates/settings with powerful GPUs too. This game has ok graphics and is actually easy to run at 60 fps with upscaling, but it has terrible CPU performance which hinders even the best GPUs out there, like a 4090.
@@GrimAbstract i'm sorry, what? you're comparing the optimization of Elden Ring to this garbage? Elden Ring with literally everything on ran quite smoothly AT LAUNCH. this garbage here (tried again a few days ago) runs like absolute shite. if it was a stable low FPS it would be somewhat ok, but it hitches like crazy even on medium in areas like Hogsmeade or in Hogwarts.
@@samgoff5289 no, the game just has shit optimization. I hate the fact that modern games are struggling to run properly, so you're forced to make the game look WORSE than games from 10+ years ago that focused more on innovative ways to make their games look nice, rather than relying on slapped-on new technology that makes the game look cool on high/ultra but absolute shite on lower settings.
It's not too long ago that LowSpecGamer moved away from showing how to optimize games to run on lower-end hardware, since there isn't much left to optimize in modern games. Seems we may be needing him again.
By the way, I have been enjoying his new direction into low-spec history. Top notch documentaries, both for learning and entertainment.
I am running a 5900X, 3070, 32GB of 3600 CL16 DDR4 RAM. Using all high settings and native 3440x1440 rendering resolution, I am getting mid-40s averages in Hogsmeade and mid-60s everywhere else. This was very inconsistent at first, but once I forced off Control Flow Guard in the Windows Exploit Protection settings I was seeing very steady averages, though still in the 40-70 framerate range. I still need to test with rendering resolution reduction and upscaling to see if I can improve the framerate without losing the steady average. This fix also seemed to remove most of the massive drops I was seeing before; I will still get short ones every once in a while in cutscenes but nothing else. Also, most of the time I was seeing 99% GPU utilization and 15%-40% CPU utilization.
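(For anyone who wants to try the same Control Flow Guard override without clicking through Windows Security > App & browser control > Exploit protection: a minimal sketch, not the commenter's exact procedure, using the Set-ProcessMitigation PowerShell cmdlet. The executable name is an assumption, so check your install folder, and it needs an elevated shell.)

```python
import subprocess

# Hypothetical sketch: per-program exploit-protection overrides can be applied from
# PowerShell with Set-ProcessMitigation. The exe name below is an assumption.
GAME_EXE = "HogwartsLegacy.exe"

subprocess.run(
    [
        "powershell", "-NoProfile", "-Command",
        f"Set-ProcessMitigation -Name '{GAME_EXE}' -Disable CFG",
    ],
    check=True,  # raises CalledProcessError if the cmdlet fails (e.g. not run as admin)
)
```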
VRAM is looking more and more essential already. As much as I want to go with a 4080 this generation, with these recent game releases it just looks like both 7900 cards are going to be better at current generation games, with their 24GB and 20GB of VRAM respectively.
Yeah, my thoughts as well.
You don't need 20 or 24 gigs this generation; consoles have 16 gigs, but the usable memory for games is 10-12 gigs, so a GPU with at least 12 gigs should suffice
@@crescentmoon256 consoles also don't have the same lane restrictions as a PC with interchangeable parts. I do have a 16GB card right now, so I'm easily able to play games like Forspoken without issue. My purchase decision is going to include anticipating game needs for the next 2-3 yrs as well as AV1 encoding. A 4080 being limited in two years isn't worth the price difference to me, so a 7900 XTX or XT is looking much better for the cost.
I think the Arc A770 16GB is also a good budget option for the future, with its 16GB of VRAM, good bandwidth and AV1 built in. Maybe the new "RX 580 8GB"?
@@SevendashGaming you're just speculating, you can't say for sure how upcoming next gen games will utilise resources yet, cause there are not enough titles to see the trend
this exact thing happened with the 10 series: the 1060 was considered to have low VRAM with its 6 gigs while consoles had 8 gigs, but the 1060 outperforms both the PS4/Xbox One because the usable memory on console for games is 4-5 gigs
we see the same with the 1050 and 1050 Ti; even though they are faster than the PS4/Xbox One in most titles, towards the end those cards looked worse because they only have 4 gigs
VRAM is important, but don't consider buying a card just for VRAM. AV1 encoding is supported by all three GPU manufacturers on their latest series of GPUs
I ended up getting a refund after I saw the performance in the open world with a 5600X, 2070S, 16GB RAM. I'm not upgrading my memory for this game alone. Terrible performance; I'm just going to get Dead Space instead.
Got the exact same issues with Dead Space: heavy VRAM consumption and stutters (w/ RTX 3080 and 32GB RAM). It runs fine for some periods, but in some boss fights and cinematics the frame rate drops heavily, in both games
I have the same setup as you but with 32 GB of RAM and I have stutter and performance issues as well. So upgrading your RAM won't help you either way.
32GB should be the minimum these days for newer games. They will be pushing 12-16GB by themselves, and that doesn't leave a lot of room for Windows, especially if you're not rebooting and making sure nothing else is running.
Fair decision on your part but reminder that a 2x8 kit of RAM will run you like $42 USD and will really make the whole system last longer. If it's even one more year of relevance that's 100% worth $42 in amortized cost.
@@CyberneticArgumentCreator Yeah, the issue is I want to match my current kit just to avoid any potential instability, and that kit is like $70. It's honestly almost as cheap to just get a faster 32GB kit; I'm just not wanting to put more money into a system I feel is at the end of its life.
so happy with my 7700X, overclocked and with DDR5 6000 and extremely tight timings (took weeks to tune and test). I get ~120fps in Hogsmeade, all on ultra except ray tracing!
It's the Denuvo anti-cheat as well.
Would this mean that the game is just poorly optimized? Cos only certain things can be hardware accelerated on the GPU, so does the engine not utilize the acceleration or something? So it falls back on the CPU?
But it is also pretty bad just rasterized, so it’s just generally poorly optimized ig
one of the best channels, i am so happy that i found it, keep it up man
and i hope your daughter gets better
Have they fixed the Ryzen CPU performance bug yet? I've seen benchmarks showing an i3-12xxx being almost twice as fast as a 5800X3D, which is preposterous. Hardware Unboxed also mentioned on their Twitter that Zen 4 CPUs were completely underperforming, with very poor thread distribution and utilization (only 4 cores actually busy, and even those "busy" cores were barely touching 50% utilization each).
Important to keep in mind.
Cyberpunk launch is not a lesson learned.
It is a *blueprint* on how to sell games.
lol this game launch is nowhere close to cyberpunk 2077 in terms of bad launch.
@@pankhanafis818 cyberpunk was actually 100 times better optimized FOR PC than this game. It was trash on consoles. I played CP all the way to the end at high - ultra settings with a 10700 and an RTX 2070, no problem. I try this game with the same settings and it's all over the place. Not only that, graphically it looks worse at the same settings. This is a CP case, but reversed: it's a trash PC port.
@@neruba2173 i had 0 problems with cyberpunk and 0 issues with hogwarts
I'm running this at 1440p with maxed graphics on an i7-13700K, RTX 4080 and 32GB DDR4 RAM, and the frame drops are insane. I can't possibly play it flawlessly.
Yes, I know I'm insane, and yes, I didn't have any money for a month to afford it, but I bought a PC for 3k.
Also a question about your overlay: how did you get all those options like the graph to show via Afterburner? I don't even have an average fps indicator.. Also, how do you reset the counter? Thanks for the really informative videos man
You have to change it in the settings. Choose the hardware you want to measure and set it to show in the on-screen display. You reset the counter using a key combo, which you can also set up in the settings.
there should be some tutorial videos on YouTube.
I'm running a 5800X3D, RTX 3060 Ti and 32GB 3200MHz RAM. It's upscaled to 4K with high settings, rendering at 65%. I'm getting 50-60 fps most of the time
The Denuvo BS is probably at least 50% of what's causing the problem here.
Nah, it's too single-thread limited.
@@Wobbothe3rd Too single-thread limited?
My i7-12700K running a Nintendo Switch emulator with only the Intel UHD 770 graphics looks smoother than this AAA game. I actually watch one thread sitting at 100% usage.
If you want to confirm whether VRAM is spilling into system RAM, check whether the shared GPU memory usage ("Shared GPU memory" column) for the game is going up in Task Manager's Details tab (you'll have to enable the column).
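(The same per-process numbers are exposed through Windows' "GPU Process Memory" performance counters, so you can also watch them from a script instead of Task Manager. A minimal sketch, assuming those counters are available on your Windows build; instance names look like "pid_1234_...", so match the game's PID.)

```python
import subprocess

# Hypothetical sketch: sample the "GPU Process Memory -> Shared Usage" counter once
# with typeperf and print any process currently using shared (system) memory for the GPU.
result = subprocess.run(
    ["typeperf", r"\GPU Process Memory(*)\Shared Usage", "-sc", "1"],
    capture_output=True, text=True, check=True,
)

lines = result.stdout.strip().splitlines()
header = [h.strip('"') for h in lines[0].split('","')]  # counter paths (first entry is the CSV tag)
sample = [v.strip('"') for v in lines[1].split('","')]  # one row: timestamp + values in bytes

for counter, value in zip(header[1:], sample[1:]):       # skip the timestamp column
    try:
        mib = float(value) / 1024 ** 2
    except ValueError:
        continue
    if mib > 0:
        print(f"{counter}: {mib:.0f} MiB shared")
```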
It's crazy how, while in the castle, I have 140+ fps, and then in the next 30 meters (or like 3 seconds of sprinting) I go to like 110, then 75, then 48, and then it goes up again.
All I want is to run this game, 1080p, looking as graphically good as possible, without any performance issues.
I hope they patch this asap.
(i7-9700F, 32GB ram, 3060Ti)
Same here and that is without Raytracing with 3080. Crazy unoptimized mess of a game
This game is an absolute mess performance wise. The stuttering is ridiculous in Hogwarts and Hogsmead, even on a 4090 and 13900k for me. Can't even brute force it with high-end hardware.
idk man, just set things to medium and enjoy the game. It still looks fantastic on lower settings. I'm running a GTX 970 4GB card with an i5 4570 and STILL pulling 60 frames and having a great time. it's a shame high end rigs can't pull high frames at ultra settings, but idk, maybe i got low standards lol
@@HeyImBoschea i tried tweaking and going for lower settings; no matter how high or low i go, it will always have some moments/places where it drops to like 40fps for a second and then goes back. It's not like i don't want to see "40fps" on my screen, but the second it drops, it feels like the game is moving in slow motion, ruining the experience.
@@herrosG yes . same feeling here exactly
Am I the only one wondering how the hell will last gen consoles run this game??
If something like the 9600K is brought to its knees in Hogsmeade the old Jaguar CPUs will probably scream in agony LOL.
My main specs are a 7700X with a 3070 and the game runs hella smooth, but sometimes when I go outside or to another area it does have stutters and drops of 10-15 fps, though as soon as I do a Revelio it kinda fixes it for a while. The game definitely needs optimization for sure; I played through Callisto Protocol and it was very smooth for a new game. Either way I'm enjoying this game a lot: great mechanics, love the duels, and exploring the castle and everything is so fun.
Me too mate, same specs. I've tried everything.
It definitely seems to be VRAM limitations with the 3070. I have been running on mostly high settings with DLSS quality on my 3070 and it seems to offer a smooth experience. As soon as I go up to ultra textures, I get the massive frame drops you are talking about.
@@JustS0meK1dd im on mainly high settings, idk if I have DLSS on. Maybe that could fix it or I’ll have to go to medium I guess
@@PaoloMix09 I'm with everything on high, RT on but no RT shadows, DLSS Quality, Nvidia DLAA; open GeForce Experience sharpening at 50/15 and it looks better
I wonder what even makes the game remotely worth this terrible performance; the more I learn about this game, the more I'm puzzled why gamers were excited beyond "it's a singleplayer game with no microtransactions"
Great benchmarks as always on this channel, but man, what I would have given back in my day for this guy to be my math teacher.
You really shouldn't need to spend $3000+ on a PC to run a new release at over 60fps on high settings at 1080p without ray tracing. Most people aren't going to have a 5950X, let alone a 40-series card 🤣.
It's hard for me to tell much of a difference in this scene between RT on and off, yet it tanks performance when it's on.
rt reflections look worse unless you set them to ultra
This game has terrible RT. The RT reflections are fuzzy, soupy messes. The RT shadows barely work, and the RT AO just makes things brighter.
RT even at ultra isn’t worth using lol
Not even worth it, it looks worse in some areas.
Maybe your monitor can't tell the difference
I hope that tech companies will not use poor optimisation as a way to push people to upgrade.
Optimization has left the chat yet again...
If you have a 4090, use DLSS QUALITY and FRAME GENERATION. It helps a lot, there is no visual difference from 4K native AT ALL, you get around 150 fps on average, and latency is lower than native 4K because Nvidia Reflex is on - 4K native was 35ms, DLSS Quality plus Frame Generation was 27ms. Stop this "fake frames" scaremongering, people, I haven't noticed a single bad frame at all!
What's odd is that I am running a 5600X, a 3070, and 48GB of 3200MHz DDR4 at 1440p ultra, and I am getting the same fps with only small drops. The game does seem to be CPU limited, and even then having more RAM seems to help overall. My buddy with the same system and just 16GB of RAM can't even run the game; it just stutters and then crashes. Only time will tell with patches, I guess.
I think it's the game engine, not the hardware. Same kind of experience on PS5.
@@atomicfrijole7542 Definitely Unreal 4 optimization that is lacking, and Raytracing was never truly well implemented into that engine - even UE5 has its share of issues still.
do u think it's worth upgrading to 32GB of RAM, since it's a pretty cheap upgrade rn, for more stable performance? (currently running a 2070 Super, 5600X and 16GB RAM with DLSS on, ray tracing off and high quality settings) (averaging 70-100 fps depending on region, with drops into the 30s sometimes too :( )
@@ElendielPlaysEU Yes it is absolutely worth it, but not exactly for the reasons you're looking at although you're right on with stability. You won't see a huge fps increase from it, but you will have a much smoother environment with your operating system and games. I consider 32gb a minimum to have a smooth computing experience. Sure, you can stick with 16gb but the operating system overhead will go away when you hit 32gb. Plus you're future-proofing your system for not much money and when you are ready to shift away from Windows to Linux, your pc will run like a brand new machine, no joke.
@@ElendielPlaysEU personally yes. If this game and other future titles are any indication, we are at the end of the days of 8gb as the lowest for gaming. I think we should have at least 32gbs. It is overkill but if hogwarts is any indication for future titles then it means we should have more ram.
We're 3 years into the PS5 era, and UE4 games are still being released. Poor coordination on everyone's part - trying to use engines unoptimized for the new hardware.
A settings optimization video would be much appreciated if you have time, especially since Digital Foundry won't do their job, for no apparent reason. Just a strange situation.
There is a reason , they are all bunch of wanks 🤣🤣
DF caved in to T Mafia, maybe Eurogamer forced them
@@Radek494 Mafia 3?
@@adi6293 transgender mafia
@@Radek494 I doubt this is the case. Rich and John are not very liberal, they don’t seem to give a F about all that. Alex usually does the PC analysis anyway
The i5 9600K, even overclocked to 5GHz on all cores, wasn't enough for ray tracing back in 2021 with Cyberpunk. And not just Cyberpunk - Watch Dogs Legion, Far Cry 6 and many other open-world games with ray tracing. I noticed that very early on; I had one, and also had the 5600 - not enough either, but it was better. Now I'm on the 5800X3D and it seems to be the bare minimum for 60+ fps in open-world ray-traced games. It's a fantastic CPU and I'm glad I'm not going below 60 fps in RT games anymore, but clearly it won't keep up in the future when we start to get more demanding games. We always need to upgrade our hardware if we want to maintain some sort of performance standard because of bad software optimization on PC. That's sad and depressing. I wish I could just leave my hardware alone for a few years before thinking about upgrading, but that's not possible these days.
A CPU limitation usually means there are a lot of scripts running in the background. I made some Lua scripts for a GTA Online mod menu, and I had the same issue with low CPU/GPU utilisation and bad fps when I put heavy loops in my script running every frame (like for i = 1,1000 do). It runs fine if it doesn't run every frame (like with an ~80 ms pause between cycles). So the Hogwarts devs have to rethink what they've done in their scripts, I guess.
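A toy sketch of the pattern this comment describes (purely illustrative, not the game's actual code): the same chunk of script work costs far more when it runs on every frame than when it is throttled to an ~80 ms interval, because the per-frame version eats into every single frame's ~16.7 ms budget.

```python
# Toy illustration: heavy script work on every frame vs. the same work throttled
# to run at most every ~80 ms. Shows why per-frame loops wreck the frame budget.
import time

FRAME_BUDGET_MS = 1000 / 60   # ~16.7 ms per frame at 60 fps
WORK_INTERVAL_MS = 80         # throttled variant runs the heavy loop at most this often

def heavy_work() -> int:
    # stand-in for an expensive script loop, e.g. "for i = 1, 1000 do ..."
    return sum(i * i for i in range(200_000))

def simulate(throttled: bool, frames: int = 300) -> tuple[float, int]:
    last_work_ms = float("-inf")
    costs = []
    for _ in range(frames):
        start = time.perf_counter()
        now_ms = start * 1000
        if not throttled or (now_ms - last_work_ms) >= WORK_INTERVAL_MS:
            heavy_work()
            last_work_ms = now_ms
        costs.append((time.perf_counter() - start) * 1000)
    over_budget = sum(cost > FRAME_BUDGET_MS for cost in costs)
    return sum(costs) / len(costs), over_budget

for label, throttled in (("every frame", False), ("throttled  ", True)):
    avg, over = simulate(throttled)
    print(f"{label}: avg script cost {avg:5.2f} ms/frame, {over} of 300 frames over the 60 fps budget")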
I've been running: all settings on the recommended High, all RT off, 1440p target res with DLSS on (Quality setting), locked to 60fps. These were the original default settings the game first booted up with.
I haven't had an issue whatsoever. I've only today seen that many are having issues and am somewhat surprised. I have had an almost consistent 60fps for ~20 hours of play time. I've been watching the FPS fairly consistently since I started playing to see if I could maybe increase some settings here and there. The only time I really saw any drops was for a fraction of a second in big area transitions, but it's barely noticeable.
Running on an RTX 2070 Super, i7-9700K, 16GB RAM.
After seeing people having issues I tried RT and yep.. it's borked!
Out of interest I also tried turning off DLSS to try native 1440p rendering, ultra settings, no RT. To my surprise it still ran at a consistent 60 with occasional dips in Hogsmeade and a few places in the castle. The dips at area transitions were much more noticeable though, sometimes lasting a few seconds before returning to 60.
Going back to the original recommended settings, as I don't see much difference and I prefer a smooth 60fps.
970, 5600 and 32GB RAM (low settings with effects on medium). It's playable (the vast majority of the time it's 60 fps) but some areas of the game tank it hard. That is something they can look into; I'm sure they just have a few too many effects/objects in certain areas, and that is very much something they can remedy over time. If I had even a 2000-series card though, I doubt I would ever see it dip below 60 fps.
It's already amazing this 9-year-old GPU barely dips below 60 fps. Granted, FSR 2 is pretty much required, but still... the game is detailed enough that it doesn't really matter to me. I'm sure the game looks amazing at 1440p ultra and even better with ray tracing, but performance isn't free; if you want it you are gonna have to pay for it in some way.
Honestly the raytracing isn't that noticeable from my experience with the game on the PS5 and in PC vids. 1440p is nice though. 4k is pretty but you'll be happier with a high-refresh rate 1440p monitor when you decide to upgrade. The 970 is a great card - keep an eye out for sales on the 6800xt if you're looking for an affordable upgrade path that will take advantage of your 5600.
@@atomicfrijole7542 Sad to hear that RT isn't that big of a jump. I 100% agree about having/wanting higher refresh rates though. 60 fps feels like a slog at times; I've never had a 144Hz monitor, but my brother does and it's sooooooooooo smooth. I don't think I would ever use 4K unless I could get higher refresh rates on top, then again that would require a beast of a GPU so it's unlikely I would have to choose anyway xD.
Was eyeing the 6000 series, but then the 7000 series came out, things happened and I decided to wait (was going to get a 7900 XTX). So now I'm just waiting for the lower-tier GPUs to release to see what happens to GPU prices. Waiting a few months or another year isn't a big deal; I've already been waiting almost a decade.
@@rocksfire4390 I don't know if you're in the USA but if you are and are near a Costco, there's sale on LG's 1440p 165hz 32" panels. I love mine, and I think it's come down to $230 on sale thru the 26th.
It seems to me that raytracing is one of those beta features that is still in development and will be for a couple more generations before they get it right. It's nice to have it in Cyberpunk or Ascent, but it is such a drag on the gpu (even the 3080) that it loses its value.
The really good news is that 1440p is the new 1080p thanks to everyone pushing 4k, so you are already in a great position for a mid-range gpu to high-end gpu to get amazing framerates at 1440p. I think you'd be really impressed and happy with anything 6800xt or above on the Radeon side and 3080 (maybe even the 3070ti) or above on the Nvidia side.
Currently running it between 75-90 FPS on full ultra settings with all ray tracing options on at 1440p, on a 6700 XT with 16GB of 3600MHz RAM, an NVMe drive and a Ryzen 5 3600. I'm on a 1440p monitor that can run up to 165Hz and I have V-Sync turned off. There's more to this than the GPU. I will admit, though, Adrenalin wanted to cook my card again until I realized that I, yet again, had to import my settings because it can't remember the settings I've already chosen. That's the worst part of the AMD experience they fail to address. Once I imported my settings, the card was running below 80C. Remember this: if you're having problems running the game, try deleting the Hogwarts Legacy folder in your user's AppData\Local folder (not Roaming).
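If you'd rather not dig through Explorer for that reset, here's a small sketch that renames the local config folder (so the game regenerates it on next launch) instead of deleting it outright. It assumes the folder is "Hogwarts Legacy" under %LOCALAPPDATA%, as the comment above says; verify the path on your machine before relying on it.

```python
# Minimal sketch: back up the game's local config folder so it gets regenerated
# on the next launch. Assumes %LOCALAPPDATA%\Hogwarts Legacy per the comment above.
import os
import time
from pathlib import Path

local_appdata = Path(os.environ["LOCALAPPDATA"])
config_dir = local_appdata / "Hogwarts Legacy"  # assumption: Local, not Roaming

if config_dir.is_dir():
    backup = config_dir.with_name(f"Hogwarts Legacy.bak-{time.strftime('%Y%m%d-%H%M%S')}")
    config_dir.rename(backup)  # safer than deleting; move it back if anything breaks
    print(f"Moved {config_dir} -> {backup}")
else:
    print(f"Nothing to reset at {config_dir}")
```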
Looking for a recommendation based on what folks are finding. Trying to build up a gaming rig for my nieces to play on. The CPU is settled: a 10900K. Just trying to choose the best GPU from the pile. Any thoughts between the 3060 12GB, 3070 8GB, and 5700 XT 8GB? Any help greatly appreciated!
EDIT: Also taking into account upcoming possible/expected optimization patches😅
3070
@@jerrodshack7610 Thanks! I wasn't sure if the 8/12GB might skew it towards the 3060.
3060 Ti or 6700 XT? Probably a better option than those listed.
@@adamek9750 Yea, I know, but I can't go buying them a new GPU. It must be from the stockpile. I've convinced them NOT to buy the PS4 version. They aren't full-on tech nerds, so they probably don't even understand why. But now I need to give them an alternative😅
@@jdogi1 3070 for sure, what resolution are they playing?
I get horrible stuttering on a laptop with 16GB of RAM, an i5-9300H and a GTX 1660 Ti. I'm pretty new to this stuff, but when I looked at the MSI Afterburner stats, it seems my commit charge (the second number in the RAM stats) is 19000 MB, which is way above the 16GB of RAM I have available. Could this be the cause? I'm not exactly sure what that stat describes. The first number on the RAM line is around 13000 MB, which should be okay.
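For context: commit charge is the total memory the OS has promised to all processes, and it is backed by RAM plus the pagefile, so it can legitimately sit above your physical 16GB; it only hurts once actively used pages start getting swapped out. A small Windows-only sketch to read both numbers yourself via the documented GlobalMemoryStatusEx call:

```python
# Windows-only sketch: read physical RAM usage and commit charge via
# GlobalMemoryStatusEx. Commit charge = commit limit - available commit,
# where the limit is roughly RAM + pagefile, so exceeding physical RAM
# on its own isn't an error condition.
import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),   # commit limit (RAM + pagefile)
        ("ullAvailPageFile", ctypes.c_ulonglong),   # commit still available
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
if not ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status)):
    raise ctypes.WinError()

mb = 1024 ** 2
print(f"physical RAM used : {(status.ullTotalPhys - status.ullAvailPhys) / mb:,.0f} MB of {status.ullTotalPhys / mb:,.0f} MB")
print(f"commit charge     : {(status.ullTotalPageFile - status.ullAvailPageFile) / mb:,.0f} MB (limit {status.ullTotalPageFile / mb:,.0f} MB)")
```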
I am running Hogwarts Legacy fine, no stutters.
My system specs:
i5 13600KF
32GB DDR5 5600
GTX 1070
and funnily enough I dont have any dips below 60 with a capped framerate.
I use FSR 2.0 Ultra Quality everything else is at Medium.
So basically you have a very good setup with the latest RAM and CPU, but a fucking 1070? wtf xD hahahaha, why didn't you get at least an RTX 2000-series GPU?
@@alaa341g Because they are still too expensive and I'm waiting to see what the new generations of GPU's bring. Also im still very satisfied with my GPU. I upgraded everything else since it was i5 4690k and 16gb of DDR3 before that, and I just got a bottleneck with it in quite a few games.
1070, lol
The 1070 has a lot of horsepower for a 2016 GPU. 8GB VRAM and decently overclockable too. What you're saying checks out with everything the internet is saying - it's just a terribly optimized game that has a terrible render thread.
@@CyberneticArgumentCreator I see people having stutter with high-end cards while I haven't experienced any so far; that's what's weird to me. I'm using the 511.79 drivers and set the shader cache to unlimited, maybe that's what helps, I have no idea.
i have a rtx 2060, ryzen 9 5900x, and 16 gb of ram and my game lags terribly on medium settings no ray tracing. I do not know what the issue is or how to fix it
Ray tracing is such a scam, I have no idea what the point of it was with nvidia. In many games they look better with rtx off , and in others it's so badly optimized, you are losing up to 80 percent of frames for it.
I personally like it, but for most games I just turn it off. I would never really play with it on unless it's 75+ fps.
This title is one of the games that utilizes raytracing the most, it really improves the graphics. Unfortunately the developers didn't put any effort into non ray traced reflections. It's either raytracing or nothing seemed to be the philosophy when creating this game.
I instantly disregard the opinion of anyone who calls an entire rendering paradigm that has been researched and developed since the 1970s a "scam." Badly optimized RT is no reason to dismiss RT.
I wonder why none of the CPU cores is at 100% load.
I remember hitting CPU limit meaning the CPU being at 100% and the GPU at like 50%.
So far I keep Ray Tracing reflections on. The others I can sacrifice. I think they do the shadows pretty well 'artificially'.
I think you're halfway right about the limitation not being solely the CPU, but also coding and memory bandwidth usage etc. The CPU is obviously waiting on something.
I see up to 96 % usage on my 4090 and 10850k in other scenes just fine though.
Yeah, CPU + memory subsystem issues limiting performance is a pretty rare combo in gaming.
That's fascinating, my 4090 has never gotten above 62% usage and I have pretty frequent stutters with RT on.
@@RicochetForce Something being rare at least means it is possible and has happened.
It could be they need to set an "end" limit for the ray tracing calculations, especially for the ambient occlusion stuff.
I'm nearly sure that since it's been brought up by many reviewers, they will patch it.
@@teddyholiday8038 I have no stutters. 4k DLSS Quality or Balanced. I do not have the latest Nvidia drivers either, I hope you figure it out. I can get 223 fps in some scenes with Frame Generation. the 4090 is slightly overclocked as well.
@@hrod9393 you’re making shit up
I have an i7 12700K with an RX 6800 XT. I run 1440p ultra with RT off. I observed that every time you open/load the game, the CPU goes to 80-100%. The game just loads every location again, making it heavily CPU demanding, but after the location has loaded in, the CPU goes to 60% and the GPU to 100%.
Conclusion: every time you open the game, it loads shaders/objects FROM SCRATCH, making your CPU bottleneck the GPU for some time (in my case I went from 100fps in the Gryffindor common room to 144fps+ in like 5 minutes; then when I left the common room my fps went back to around 110-120, and after 5-10 minutes in one location it went back to 144+ again. When I returned to the Gryffindor common room, the location was already loaded in and I had a stable 144fps+). This applies to every location.
For whatever reason I get massive stuttering when I enable DLSS3 on a 4090 with a 7950x. It seems to happen on Witcher 3, Cyberpunk, and in this game. Disabling the secondary CCD using Ryzen master seems to resolve this issue mostly, but I'm just surprised nobody is looking into this.
Should be fixed in cyberpunk
That's the price u pay for using bleeding-edge tech; not everything has been ironed out yet.
Try disabling XMP/EXPO in bios and try to Verify integrity of game files in steam for all games
So you don't just mean the momentary massive stutter after enabling it, and every time after leaving the map/inventory?
Try enabling Vsync in NVCP if it's not already and disable any frame limiters.
One good bit of news though. My 5800x never goes over 60 degrees.
Do you think the denuvo DRM is having any impact on cpu performance?
most games that had denuvo removed gained 15-30 fps and less stutter
so probably yeah
I feel like there should be a law protecting consumers against this kind of behavior.
Been into computers for 40+ years. I'll tell you that the most interesting times were when things were coming out that ran slow - so much more interesting than those long periods where software stagnated, mostly caused by overconcentration on consoles. Anyway, if things getting better excites you, then having software that maxes out hardware for good reasons is a really nice time.
Recently you mentioned getting an Alienware Aw3423 OLED. How does it compare to your C1 or do you plan on making a video in the future about it?
Yes, but Nvidia has driver overhead in DX12, so with an AMD GPU there would be less of a CPU bottleneck, like you have shown in one of your previous videos.
Frame generation goes BRRRRR
@@Davids6994 input lag goes brr too with it
@@sito3539 Oh, that 0.005 seconds delay when you are playing a single player game. Unplayable......
@@e0nema0tak1v It's wild how you proved with one sentence how little you know about hardware-accelerated processes like frame gen.
If, for example, you are already gimped by CPU and VRAM performance, then frame generation input lag will have an exponential effect compared to generating from a steady 60fps native. The higher your base input lag, the exponentially worse the negative effect of frame generation delay becomes.
@@e0nema0tak1v It's more the fact that DLSS and frame interpolation, as well as V-Sync and any form of upscaling, will gimp the frame buffer pipeline, thus defeating the purpose of higher fps in the first place. To fully benefit from high refresh rates and fps you must clean up the pipeline: use native resolution, native fps, no V-Sync, no nothing, and you will already have a better experience. If you're having trouble with fps, just do what we've always been doing: drop resolution, use optimized settings, disable any RT tech, etc., and after all that, cap fps to 2 frames lower than your monitor's Hz to prevent screen tearing. Also, DLSS and FSR don't benefit anybody: low-end users get a game that looks horrible from lower resolutions and imprecise upscaling artifacts, and high-end users who want to increase their already good fps are only introducing input lag, which doesn't help in any way. It's a redundant tech in any situation.
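A back-of-the-envelope sketch of why the base framerate matters so much in this debate (a simplification, assuming interpolation has to hold back roughly one natively rendered frame before it can be displayed - real figures depend on the game, Reflex, and the driver):

```python
# Back-of-the-envelope sketch (not measured data): if frame interpolation holds back
# roughly one natively rendered frame, the added delay scales with the native frame
# time - which is why generating from a low base framerate feels much worse than
# generating from a solid 60 fps.
def added_delay_ms(native_fps: float, frames_held_back: float = 1.0) -> float:
    native_frame_time = 1000.0 / native_fps
    return frames_held_back * native_frame_time

for fps in (30, 45, 60, 90, 120):
    print(f"native {fps:>3} fps -> roughly {added_delay_ms(fps):4.1f} ms of extra buffering")
```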
I'm not familiar with all this computer lingo but I'm hoping someone can comment with what settings I should play at. CPU is i7-9750H 2.60ghz GPU is GTX 1650 and I have 32gb ram. Currently on all the recommended settings and it runs ok just didn't know if it could do more.
Would be interesting to also see ram comparisons from DDR4 3000mhz to DDR5 6000mhz or even higher.
My DDR5-8000 runs this game smoothly, so I'm not sure anymore.
I read in a comment that in Dx12 games; windowed fullscreen and fullscreen don't differ in performance. Is this true?
It seems like the game runs "fine" at 1080p on medium settings on even budget rigs, but change a couple of settings up a notch and it slows to a crawl.
I hope they invest dev time into rapidly fixing the optimization and memory usage. Would be a shame if such a fun game that so many artists poured years and years of time into was marred by a relatively small amount of mistakes on the optimization end.
I think this just comes down to how the engine handles CPU cores and threads. I just find it hard to believe that even a 13900K or the AMD equivalent isn't strong enough to eliminate these bottlenecks. I remember a game called Vampyr that came out a few years back, and it too was CPU limited in some of the hub areas - a UE4 game, unsurprisingly. My GPU usage would drop to the 40s percentage-wise at times, it was that bad. I strongly believe it's just engine inefficiency.
yea, I remember that game not running well even with a 3070
5800X3D, 32GB 3600C16, 7900 XTX. I'm getting between 100 and 144fps (144fps in Hogwarts, 100 in the town), always GPU bound with the ultra preset, no RT, FSR 2 quality. Without the day-one patch FSR was reducing GPU usage to 50% with frametimes all over the place, but now it's perfect, except for a robotic voice on your character if you change the voice pitch, and white bushes (a mod fixes that, or you can lower materials to low to fix the issue). I also found that my character's face lighting tends to look weird.
I've loved it when you tested the 9600K; please test more games with the 9600K.
I7-I9 13th gen are quite ahead of amd 7000 series in this game. Single core performance and higher memory speed support helps overcome the terrible game engine.. 7000 series X3d may change that.
140mb of L3 cache will definitely be a factor, you dont need to brute force clock speeds like intel when you have more efficient cycles.
Even the steam deck GPU has no problem when running this game. All of the performance issues I see when playing is when the CPU hits 4 to 5 watts or higher or 60% or higher usage. The frames tank as soon as either one of those happen. I play docked at 1080p with FSR 2 performance. Everything low + medium textures. When handheld 800p with FSR 2 quality same settings. I've even had sub 30 fps when the steam deck GPU is at 75ish % util and CPU is pegged. If they can manage to get CPU optimizations even just a bit. This game will be a steam deck gem. It's already almost there.
Edit: docked I shoot for visuals and a 30 fps cap. Handheld I shoot for a smoother 40fps and less wattage
No reason a 7700X should be bottlenecking without ray tracing on. The game just isn't utilizing CPU cores properly.
I think the point of the video was to show that. That no matter the CPU you are using, all of them are not being utilized as much as they should be.
@@ryans3795 I agree with you and some people still believe that CPUs are bottlenecked lol, even that i5-9600K is still a good CPU when paired with a 2060 or something like that..
RTX 3060 owner here. This card is getting me solid performance at 1440p: on medium-low settings with ultra textures I get 110 to 130+ fps.
Wishing your daughter a speedy recovery
I've been enjoying the heck out of this game. I'm running it with a 4090 and a Ryzen 5900X, and my system feels like the bare minimum for running this game, which is crazy, haha. But that's one of the reasons I get overpowered hardware - so I can brute force through any unoptimized games that come out. Haha. It's basically insurance that I'll never need to compromise on my 2160p 120fps ray tracing expectations 😆
13900k OCed, 4090 FE OCed here and I'm getting crushed at 4k 😂🤣😂
You know what, games like this are one of the situations where you can really justify getting a 13900k and overclocking it. Every drop of CPU performance helps here. And also Frame Generation is worth its weight in gold! The people talking trash on Frame-Gen guarantee don’t even own a 40-Series card. Pull out every Nvidia trick in the book, and you still get a mostly-normal gaming experience, even with unfinished “beta” games like this 😆👊
Using an RTX 3090, 5950X and 64GB RAM. Absolutely no problem here. Max settings, full ray tracing and 4K DLSS Quality, approximately 60 FPS with VSYNC on my 4K TV. Perfectly playable and beautiful.
What a joke of an optimization attempt lol. What point is there to having high-end specs if even average looking games don't run well because devs don't optimize for PC.
Good video, except for those 1000 annoying ads. I own an i9-9900KS and have it paired with an RTX 4080. In Hogsmeade, FPS drops to 33-47 at 4K, ultra preset, with ray tracing on. DLSS upscaling doesn't help that much, and I cannot enable frame generation, because if I do the game randomly crashes after playing for 5-10 minutes, plus there is noticeable input lag. It seems the game crashes because of 'hardware accelerated GPU scheduling', which can be set in Windows graphics settings - and you need this setting on in order to enable frame generation. So this game is unoptimized garbage on PC. I am not having issues with other games such as Horizon Zero Dawn; no issues at all with frame generation there.
Terrible optimization because it's developed for the consoles and drop-ported to PC. I don't know what the surprise is. It's been the same for a decade or more with all "AAA" titles: consoles first, then the same shitty version drop-ported to PC. A handful of exceptions, like GTA 5, stayed in development longer after the console release to optimize and make a true, ultimate PC version. But that's an anomaly.
GTA 5 had major PC performance issues when it came to PC..so did RDR2. Should probably look into that.
@@stirfrysensei Yeah those games weren't actually in development longer, they were just timed exclusives for console. That's been the case with damn near every rockstar games release.
@@stirfrysensei Like fuck! The PC version was held back a year after the console release in order to give PC gamers a true PC hardware level experience, better in every way than the consoles. They didn't drop port it. They optimized for PC and GTA 5 is regarded as one of the best cross platform AAA titles ever made - still used in benchmarks today! Get tuned into the real world bro.
It was the console versions of GTA 5 that had terrible performance, hence the day-one PS4 patch removing half the graphics to maintain that 30fps framerate!
RDR2 was terrible because - YOU GUESSED IT - the code was drop-ported from console to PC, like I already said.
If you had issues with GTA 5 on launch, it's cos you either had terrible 5-year-old hardware in your rig or you were running it at stupidly high settings. In which case, open the menu and turn the options down. There were dozens of options in GTA 5's graphics settings - cos it was a real PC game developed FOR THE PC.
I'm not saying this game was optimized well, because it wasn't, but it's crazy how a lot of people are having issues with this game and I'm not - I'm even running it at 1440p high settings at 60fps easily with my Intel Arc A750, while recording for a video.
@@ryanroberts714 That's weird, because I have an Intel Arc A750 and a 12700K and it works just fine. I guess it's primarily an Nvidia thing, but I know I've seen some people say it's still messed up on AMD GPUs.
Game looks horrible
Yeah, it doesn't look as good as the min specs would suggest.
Last gen game with terrible ray tracing implementation. Idk what they did so it requires top tier hardware to run well.
I like the look of the game, but only in native 2K res with high settings on a 3080. I don't know why, but DLSS makes this game look washed out. Thankfully, the game runs well enough with the typical stutters here and there.
@@DelgaDude It's more like what they didn't, which is optimize the game.
@@aj.5841 Yes but 3080 is a top tier modern card saying it runs "well enough" on a game that looks like it's 5 years old says it all don't u think?
Same here, pretty much. GPU sits at around 55%, loads of stutters and framerate drops walking round the castle, the town is just below 60fps and stutters quite a bit.
Memory usage isn't particularly bad; it does spike up to 90% but then settles back down to around 60-70%. Video memory is pegged at around 8GB.
3900X
3080FE
X570
16GB DDR4 3600
Installed on an M.2
I swear I'm getting flashbacks to the first days of Elden Ring on PC again. Thankful for your testing.
New patch, still broken, and my game lags even on the lowest graphics settings. Wish I could get a refund on a key purchase.
5800x + 4070ti, ULTRA settings, RT off and frame generation ON. No problem at 2k resolution, the framerate is always between 120-140 fps. GPU usage around 60-70% most of time
I was actually surprised at how good it ran.
You have audio dropouts on the 5950X PC (DPC latency issues). Are you using motherboard audio or a dedicated sound card to capture the audio? If you are using a good card to capture the audio, then ShadowPlay might be causing the dropouts; if the dropouts happen regardless, something is wrong in your system. Also, the audio dropouts reduced significantly once you turned on DLSS 3 frame generation.
How did you manage making the GPU close to 100% utilization? I have a 3080 and I am stuck at 70% max, no matter what settings I change, both in game and out. My CPU is 5900X.
4k
4090, 13700K, 32GB RAM, 240Hz 1440p - pretty bad hitching here. Traversing the castle in many areas just feels bad. Refunded the game till performance is fixed. Changing the resolution to 4K and the refresh rate to 120Hz helped some, but not enough to make the game feel good to play.
I have an older CPU, an i7-6700, with an RTX 2060 and 32GB of RAM. My CPU usage has been 40-50% and the GPU hovers around 60%. I had to turn off DLSS, switch anti-aliasing to DLAA, and change certain graphics settings to high; that's when I saw the difference, with GPU usage up to 80% and CPU usage dipping to the low 40s consistently. I was still able to get 60-78 fps.
Hi Dan, my PC build isn't state of the art by any means, but I have an MSI trio brand 3070 in conjunction with an i7 9700k and 32gb of DDR4 at 3200mhz. I struggle to achieve anything close to what you're getting (on your 3060 demo), even with graphical settings on medium, even at lower rendering scales. I'm wondering if there's something wrong with my build but every other "new" game runs just fine.
Didn't know a CPU could make that much of a difference, Daniël 🤔. My combo: 5800X3D, 32GB 3200MT/s & 6900 XT (2500MHz). I run the game at 1440p ultra, no RT, FSR set to quality. In the castle I mostly get 165 fps (I set max fps to 165 because I have a 165Hz monitor). GPU usage is often between 90 & 99%. In Hogsmeade and other areas with a more open view, I get around 110 to 120 fps. The game is very smooth; fps drops to 80 fps are rare.
I play this on high settings (1080p), no RT, DLSS quality, and the lowest I go is 50 fps. The game uses a lot of RAM, however. My system utilization will peak at 18/32GB used and my VRAM goes as high as 9GB used. It's certainly playable.
Small note: 16GB of RAM is on the low side for this game. You can check some benchmarks on it; I tested and upgraded myself.
Of course it's not going to solve it in your case, but even I, on a 12600K at 5.3GHz, was having dips in that area till I upgraded.
I watched this test and replicated it on my 5800X3D + 4080, the 5800X3D is even faster than the 7700X shown here (120 vs 95 rt disabled and 90 vs 70 with rt enabled)
There is this issue I noticed with "Windowed Fullscreen" games when running at a 4K (or 5K in my case) desktop resolution: it has a huge CPU overhead impact. On my MacBook Pro I mostly use Windows to play games, and I found that reducing the desktop resolution down to 1440p (which scales perfectly on my 5K display) makes performance jump up a lot. I had this both in Uncharted and in Hogwarts.
Now I'm running 1440p with FSR2 balanced and roughly medium settings, and I'm very close to 60fps all the time with this 5600M (which would be equivalent to a 1070).
Of course, there is what you mention about Nvidia drivers having a worse CPU ceiling than AMD, but give a lower desktop resolution a try. You'll see things change dramatically!
Will try this out today
@@anuzahyder7185 I tried again after posting. On my MacBook I played 2 hours last night pretty flawlessly. On the desktop (9900K with a 3090) there are some hiccups, but overall my performance is much better with the desktop resolution at 1440p instead of 4K.
i5 12600k, 3060, 60hz 1080p. All settings on Ultra. Zero issues, CPU 30 C, GPU 50 C.
what's your RAM? 16 or 32?
A quick tip: if you want to test a game like this in actual 1080p, change the resolution of the monitor/tv in Windows before you start the game.
And as the RivaTuner stats are so small, could you scale them up please, so that people on phones can see them better 👍🏻
I think that the population setting only works when you enter a new area, as it can't remove NPCs from right in front of you.
Does the RT actually make a difference because I can't really see it on the video 😅😅
@Garrus Vakarian It does make a difference in some games but I can't see any in this one
Soooo, can you play this game with only 16gb of ram? cause that's all I have lol.
16:54 3060 beats 3060 TI? VRAM issues... was RT on or off? That is pretty great FPS for Ray Tracing on and Ultra. On a 3060?
That's with no RT. VRAM use goes even higher with it on.
@@danielowentech Thanks that is helpful. My friend has a 3060 and is worried about getting this game. But I passed him this video etc.
I only have a 20-series card. Have you tried setting the shader cache size to 10GB in the Nvidia control panel? I'm still testing, but it improved my fps.
Too hard to tell, I increased it and still run into issues.
The game absolutely hates my 3900x
Hi! I can't turn on frame generation; it's just not available for me. It says it has something to do with the Windows settings, and I checked, but it's not that. Any ideas? RTX 3050
It’s only for 4000 series cards
@@nikita11111111 ok man thank you