Starfield 4090 + 7800X3D vs 13900K | 4K 1440p 1080p
- Published: 30 Jul 2024
- How does Starfield perform with top-of-the-line CPUs, RAM and GPU?
►Join the Game Room: www.game-day.xyz/
Public Starfield CPU Performance Data docs.google.com/spreadsheets/...
PCGH article: www.pcgameshardware.de/Starfi...
System Specs
--------------------------------------------------------
Intel CPU: 13900KF All Cores at 5.8GHz
AMD CPU: 7800X3D
RAM: TEAMGROUP T-Force Delta 32GB 7200MHz @ 8200MHz [Tuned]
GPU: Nvidia RTX 4090
Intel Motherboard: Asus Apex Z790
AMD Motherboard: Gigabyte B650 Aorus Elite AX
CPU Cooler: EK Water Blocks EK-AIO 360 Basic
Power supply: be quiet! Straight Power 11 1200W
OS M.2 NVMe: Crucial P2 1TB M.2
Game M.2 NVMe: WD_BLACK SN770 2TB
Case: Lian Li O11 Dynamic EVO
Monitor: LG OLED C2 42 Inch 120Hz
Settings
--------------------------------------------------------
Resolution: 1280 x 720 (to set this resolution in game you need to set your desktop resolution at 720p)
Ultra settings
Upscaling: Off
Film Grain: Off
Enable VRS: Off
CHAPTERS
---------------------------------------------------
0:00 Intro
0:30 PC Games Hardware source data
1:11 Full Starfield CPU performance list
1:20 Benchmark Methodology and settings
2:20 Plug&Play CPU results
3:29 Tuned CPU Results
4:50 7800X3D RAM Tuning
5:41 13900K DDR5 RAM Tuning & core tuning
7:11 13900K DDR4 vs DDR5 Tuning & core tuning
8:12 13900K DDR4 Scaling
8:34 RAM bandwidth and FPS analysis
9:29 13900K CPU frequency scaling 4GHz to 5.8GHz
10:28 CPUs @ 1080p
10:51 CPUs @ 1440p
11:03 CPUs @ 4K
11:35 Conclusion
11:52 Todd Howard's Conclusion
WOW, just a big WOW. Can't believe you have less than 1k subs; ultra-quality content, straight to the point.
Your HT on vs off and 8 P-cores + E-cores configurations match the last month's worth of testing I've done across a bunch of different games in Windows 10. It schedules far better with HT off on the P-cores, but the E-cores can remain on. Nice video!
Good video. One of the better ones. Your channel will explode one day, keep it up
Well done, Subbed!
Excellent work!
"Why didn't you implement DLSS2&3 for Starfield on PC?" would have been a much better Bloomberg question for Todd Howard, I think XD
Wow, this is such a detailed and helpful video. You've really worked so hard on it🤩🤩 Considering getting the game and pairing a 7800X3D with a 7900 XTX to test performance, but I'm still new to all of this. Do you have any tips for me? Easily the best channel on YouTube 🥳🥳🥳🥳
Thanks for the detailed info! I'm running a 7800X3D with G.Skill 6000 memory using the EXPO1 profile. How do I better tune the memory for this performance boost, or should I just leave it as is?
What are the timings of the EXPO1 profile? What GPU do you have, and what settings do you play at? Is your GPU at 99% utilization? You might not need to do anything if you are already happy with your performance.
@gamedaytoday1 RTX 4090 Strix. I'm running quite a few mods with the DLSS3 FG mod that recently dropped, so I'm actually really happy with the performance; it rarely drops below 100fps. Default EXPO RAM timings, but yes, I'm GPU-capped, my CPU is barely hitting 60% while playing.
The fact that the game scales with more hardware power doesn't mean it's optimised; it just means it scales. For example, say its ambient occlusion implementation takes 5ms to render, but a plug-and-play AO of similar quality would only take 2ms. With more GPU power you keep improving your FPS, but technically the graphics are unoptimised because the game isn't using the best solution available. Obviously we don't know what's really going on without access to the code, so we can't say for sure either way, but to claim Starfield is optimised because it scales is simply flawed thinking.
I was just about to write a similar comment. It's not uncommon to confuse scaling with optimisation, and according to a thread on Reddit it looks like the game is doing some strange things with a DX12 feature called "ExecuteIndirect"; it's very possible this hammers both the CPU and GPU more than needed.
Well said...
Good vid, but scaling isn't the same as optimization.
When I write poor code that runs twice as fast on twice the MHz, it doesn't mean it is optimized. It just scales. The reverse is also a problem in some games that only run on one core: they don't scale with many cores. And other games don't scale with CPU or GPU power at all, looking at those filthy 30fps-locked games especially.
When I write good code and get the same result in a shorter time on the same hardware, that's when we speak of optimization.
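A toy Python sketch of this distinction (purely illustrative, nothing to do with Starfield's actual code): both functions below compute the same sum, and both run faster on a faster CPU, i.e. both "scale" — but only one of them is optimized.

```python
import timeit

def sum_naive(n):
    # O(n): performs n additions. On a CPU twice as fast it runs
    # twice as fast -- it "scales" -- but it still does far more
    # work than necessary.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_optimized(n):
    # O(1): same result via the closed-form formula n*(n+1)/2.
    # Same answer, less work, on the same hardware -- that is
    # what optimization means.
    return n * (n + 1) // 2

n = 1_000_000
assert sum_naive(n) == sum_optimized(n)
t_naive = timeit.timeit(lambda: sum_naive(n), number=3)
t_opt = timeit.timeit(lambda: sum_optimized(n), number=3)
print(f"naive: {t_naive:.4f}s  optimized: {t_opt:.6f}s")
```

Throwing a faster CPU at `sum_naive` improves its FPS-equivalent, but the code is still unoptimized; that's the commenter's point about Starfield.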
Considering the hardware needed to run it properly, it doesn't look good in any shape or form. I kinda get the same FPS in Cyberpunk 2077 with path tracing.
You are forgetting a very important detail. Bethesda games, such as Skyrim, Fallout 4 and now Starfield, have many draw calls so they can be modified later. And a game with many draw calls is automatically a game that is not optimized at the CPU level. CD Projekt games, such as The Witcher 3 (which I know well) and Cyberpunk (I've only played about 12 hours), are much more optimized in terms of draw calls. This doesn't mean they have very few draw calls, but they are guaranteed to have fewer than Skyrim, Fallout 4 or Starfield! And this Starfield test is done precisely in Akila City because it is perhaps one of the places in the game with the most draw calls for the CPU to process. It is also because of these many draw calls that this game does not run as well on AMD CPUs, which have historically been weaker than Intel processors at handling draw calls.
Therefore, the lack of optimization this game has at the CPU level is not the result of negligence on Bethesda's part (Doom Eternal is also a Bethesda title and is an example of a game optimized to have as few draw calls as possible), but rather something done intentionally, to allow mods that are not possible in games like The Witcher 3 or Cyberpunk 2077, not to mention Doom Eternal or the Tomb Raider series.
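The moddability-vs-draw-calls trade-off the comment describes can be sketched in a few lines of Python. This is a hypothetical toy model, not any real engine's API: keeping one draw call per object makes any single object easy to replace (what mods want), while batching by material cuts the number of calls the CPU must issue.

```python
from collections import defaultdict

# Hypothetical scene: each object is a (material, mesh) pair.
# All names here are made up for illustration.
scene = [
    ("rock", "boulder_01"), ("metal", "crate_07"),
    ("rock", "boulder_02"), ("metal", "crate_03"),
    ("rock", "pebble_11"),
]

def draw_unbatched(scene):
    # One draw call per object: CPU-heavy, but any single object
    # can be swapped out independently -- mod-friendly.
    return len(scene)

def draw_batched(scene):
    # One draw call per material: meshes sharing render state are
    # grouped, so far fewer calls -- but replacing one object now
    # means rebuilding its whole batch.
    batches = defaultdict(list)
    for material, mesh in scene:
        batches[material].append(mesh)
    return len(batches)

print(draw_unbatched(scene))  # 5 draw calls
print(draw_batched(scene))    # 2 draw calls
```

With thousands of objects and only dozens of materials, the gap between the two strategies becomes the CPU-side cost the comment is talking about.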
Does the 7800x3d paired with 7900xtx and SAM activated outperform 13900k and 4090? Do you have a way to test if SAM makes a difference over no SAM or if SAM is better than rebar. Since it's an AMD sponsored game you'd think so but I haven't seen much discussion on this.
I would have tested this, but I returned my 7900XTX. My educated guess is that with SAM a tuned 7800X3D might catch up a bit and move higher up the benchmark list, but it will not beat a fully tuned 13900K.
@gamedaytoday1 Ah ok.
Your CPU results in this video are with rebar on?
@andreiclawhammer Yes
Amazing work, this is what I was looking for, you could call it 4k Gaming CPUs ;)
Amazing vid buddy! Really liked the specific explanations for a guy like me who is a beginner in this area :-) Think your setup is really cool as well; what table is that, and the stand for the screen?
Are those the Klipsch r-51pm’s you have on your desk? Really great bookshelf speakers
r-41pm’s
Man, I bought a 12700K about a year and a half ago with DDR4 memory (3600MHz CL16), because DDR5 memory was, at the time, very expensive and still immature (I think 6000MHz was the maximum speed back then). I bought DDR4 knowing that the benchmarks at the time already indicated that future games would depend heavily on the bandwidth and speed of DDR5. That time has come and, even though I knew it would, it's always sad to see that my 12700K with DDR4, even with this game configured for 4K with everything maxed out, could in some situations bottleneck the GPU. Well, I think I'll go over there, in the corner, cry a little and be right back.
Wait! I just saw the conclusion of your video. After all, my 12700K, even with DDR4, seems to be at the very minimum needed so as not to throttle my RTX 4090 (I use Windows 11).
Many congratulations on this analysis. It's very well done.
Why are you not posting what the tuned RAM settings are? Timings and subtimings? "Tuned" means pretty much nothing if you don't disclose the settings behind it... But good video.
Very interesting and in-depth, thanks.
Do you notice any frame dips or stuttering with either CPU? I heard there's more dips/instability with AMD.
Not with this generation, bro.
Can you post your 6400, 7200 & 8000MHz timings for the 7800X3D?
14900K memory tuned, lesGOOOOOOOo!!!! :D Hahah, but seriously, I've been hearing contradictory reports concerning the IMC... so while not overly optimistic, I'm eager to hear your results, provided you'll be testing it of course.
As you have seen from all the mainstream reviews, the 14900K = a rebranded 13900KS. The only value you are getting from the 14900K is that it's a discounted 13900KS, so you would save a couple of bucks; saying anything else is an exaggeration. So if you already have a 13900KS there is no real value in upgrading.
What would be interesting to see is whether the 14900KS, when released, has a better IMC, but I am not super optimistic there either, as it seems to be just a super-binned 14900K.
What would be more interesting to test is how consistently the 14900K/S can reach 8200MHz memory clock speeds, and whether the new motherboards help with memory speeds.
@gamedaytoday1 Agreed! Thank you for a detailed reply.
Would XMP 7200 C34 be worth running with a 7800X3D and 4090?
what resolution and what settings?
@gamedaytoday1 I plan to play on a 4K 144Hz display. Settings are negotiable.
Starfield will take it all? Even 96 cores from a server cpu?
A 96-core server CPU has poor single-core performance and high latencies; it would be complete garbage here.
Funny that I found the opposite. On my 7800X3D I found that 6200MHz CL28-36-29-30 is faster than 6400MHz CL30-37-30-34, both with FCLK 2133MHz.
I also tried 8000MHz, but for some reason it won't POST above 6400MHz with UCLK/2. I have 2x32GB Hynix A-die DDR5 6600MHz CL32 and an MSI X670E Tomahawk. I think it's just a bugged BIOS; even XMP won't POST, I have to set the timings manually.
I don't think you can even come close to 8000MHz on AM5.
For 8000MHz, have you tried updating to the latest BIOS with AGESA 1.0.0.7b?
@gamedaytoday1 Yeah, latest BIOS with 1.0.0.7c. I think it's because it's a 2x32GB dual-rank kit.
@HEAD123456 Could you please post your exact RAM kit? Appreciate it.
I don't really like the PCGH benchmark settings; even with my 4090 I was still GPU-bottlenecked (any tuned 13th-gen CPU will also be GPU-bottlenecked by the 4090 here), which defeats the purpose of a CPU benchmark. In our CPU BENCHMARK spreadsheet we just use the PCGH save combined with the Low settings preset + crowd density maxed out + 720p with 50% resolution scale.
What monitor is that? It looks incredible!
LG C2 42 inch
I'm running a 5800X3D with 3733 tuned B-die and a 4090 with the DLSS3 frame gen mod at DLSS Quality (66%), getting a locked 160fps with an occasional dip to 140fps at DLDSR 1920p on a 1440p monitor with great sharpness and image quality. I cheated my way to the top, but whatever works, right? I haven't run the benchmark with the settings shown in the video and I really have no need to, but if I do I'll edit this post with the results, even though I'm sure it won't be anything spectacular.
Awesome would love to hear what results you are getting
Great video, subscribed.
I'm on a 13900KS Special Edition @ 6.0GHz P-cores, 4.6GHz E-cores, 5.2GHz ring/cache, 32GB (2x16GB) DDR4 @ 4300 CL15 in Gear 1 mode. I can do 4400 Gear 1 CL16 on my DDR4 as well. Strix 4090. I will try this benchmark out and see what's up.
Please share your results!
@gamedaytoday1 With the pre-played benchmark demo it's the same spot every time, which is nice.
So the game is optimized for the top end of CPU and GPU scaling, but that's dumb when probably 80% of your customers don't run very high-end PCs.
A Series S only costs $250.
@ummerfarooq5383 Have you seen the gameplay quality on Xbox? It looks like graphics from a PS1.
Even with tuned RAM and CPU, taking into consideration the hardware used (13900K/4090), it doesn't really say anything to me. It scales, sure. But there are much better-looking games, and honestly Star Citizen runs better than SF for me, and I'm upscaling in Starfield. This is poorly optimized, period. Nothing is going to convince me that it isn't poor optimization, especially after watching this video. If anything, it only further strengthens my viewpoint.
Very good testing, except for the wrong conclusion.
Just because it scales with more powerful hardware does not mean the game is optimized, or has a good optimization to be more correct.
Also we are in 2023, you have zero reasons to stay on Win10, but lots of pros to be on Win11, especially with ADL/RPL CPUs.
Seems like Win10 is still best for gaming, but I will do proper OS testing in the future.
I disagree, Win10 is not "best for gaming" as a general rule as you say... it may be better for some specific few (older?) games, but those are exceptions.
Win11 is actually best for gaming if you have a modern intel CPU (AMD too) - that's a valid statement.
@gamedaytoday1 You're just gimping yourself on Win10, and until you actually use Win11 for a few months across many different games, you will not know and will keep having the wrong idea.
Your performance might be a few percent lower than PCGH's because you probably used different RAM with weaker timings. Also, it's stupid to pair 13xxxK CPUs with DDR5-5600...
It could be weak timings, but my educated guess is the OS, since Win11 has a better scheduler.
i5-13600K + DDR4 3600 CL16 was a huge mistake on my part.
Can be fixed by overclocking and tuning your ram, what ram sticks do you have? If it’s b-die you can get them to 4100mhz+
@gamedaytoday1 It's KF436C16RB1A/16. And I don't know much about memory overclocking, but the default frequency is only 2400MHz and I had to use one of the XMP profiles to get it up to its advertised speed. Is that normal?
@JAnx01 Totally normal. XMP = advertised speed.
I've got 13900k with 4133c15. I couldn't get the PCGH save to work on my GamePass version but running the benchmark myself, I got AVG 107.3fps 1% low 64.7fps with 5.7p/4.5e/5.0ring 8p+16e HT on. Missing out on that sweet DDR5 bandwidth! However, I play at 4k so it only hurts my soul not my gaming. Lol.
13700K + 4090(450W trio) 4000Cl16 all timings tuned. 5.6 Ghz 8C/8T, 4.9 Ring Pcores PCGH test 113.7fps avg, 81.3fps 1% avg lows with capframeX.
It runs way smoother than I expected for a brand-new game. I'm running it at 1440p with a 13900KS with tuned 7600 RAM and stock clock speed on the 4090. I'm not using upscaling, but I wanted to test how CPU-bound I was, so I set FSR all the way down to 50% resolution in steps and the FPS kept going up, so it is mostly GPU-bound for me. If official DLSS came out, I'd probably use it on Quality for more smoothness, but I haven't tried the mod because I don't like crashing.
Bro, the mod doesn't crash the game. I tried it on various systems with various DLSS files (old and new) and it never crashed. I would use DLSS if I were you; it also looks much better 👋
You should try out the DLSS3 FG mod; it gives you a 30-40% bump in performance. However, it's still early days, so it crashes from time to time... still worth keeping an eye on it. I don't think we are ever going to get native DLSS for Starfield...
@oullahimaran1732 I'll try it. I'm currently using an app called Special K to launch the game in order to inject HDR into the game engine, so idk if it will conflict, but we'll see.
The article says DDR5 8200 gives the best performance: 128fps.
@ummerfarooq5383 Fully tuned 7600 is only 2% slower than tuned 8200. Not worth the extra effort; for example, you get 100fps instead of 102.
I don't even play this game but I want to upgrade my computer anyway just to know I have the best optimized computer for this game.
Here we notice an absolute Bethesda stan who will stop at nothing to support their beliefs with whatever salvageable scraps of evidence they can find.