People have a hard time just accepting it's the game, always saying it's the hardware. If multiple people are testing multiple cards on multiple different systems and getting the same results, maybe it's not the hardware lol. Also the idea that a 2-3 year old GPU is a potato and shouldn't be expected to run new games is ridiculous, especially when the 40 series cards struggle.
Dude, exactly. I got an RX 6600 recently. And even though it's considered a "budget" GPU, that thing has 8GB of VRAM and should be able to run games like these without issues. I know it does when the game is actually optimized. I shouldn't feel like I need to get a top-of-the-line card just to achieve 1080p at 60fps. This card and others like it are more than capable of this, especially when the games just don't look that much better than previous titles.
funny, i guessed them almost perfectly: said 1080p FSR low and maybe 1 or 2 things at medium... pathetic that a 3070 can't natively render this shit at 720p low and keep over 60 fps
@@PineyJustice and Starfailed is the most pathetic game with 10-year-old graphics, ever, and it needs NASA's computer to run at native 2K with Medium settings above 60 fps. Bro, this game doesn't even have reflections or any ray tracing setting, lmao. Just cube maps everywhere like in a PS2 game 20 years ago...
I'll let you in on the punchline, though: Plenty of people who care about performance will buy this game. They will play it for at least a decade on future NVidia 6000 and AMD 9000 series GPUs. And Bethesda will cry about it all the way to the bank.
When DLSS and FSR originally got announced I said "Yeah, this is just going to make games release far more unoptimized", because developers will use it as an excuse to not optimize games. And I have been right: so many games have been really performance-heavy as of late, even on good PCs.
The sad thing about that is the technology is actually amazing for being able to play at graphical settings or visual fidelity your card shouldn't normally reach... At least that's what it was designed for, until every game studio adopted the "upscaling = optimisation" lazy mentality. Anyway, this game goes a step further: even with FSR on it's nowhere near acceptable performance. If your game can't even get close to good enough fps with FSR or DLSS on, then you have an incredibly serious problem.
it's a shame that optimization is a skill lost to programming. Did you know they used to fit entire games in less than 100 MB? Heck, the OG Pokémon fit on a cartridge of about 1MB, and that is just insane. These days 50GB is run of the mill and many games breach the 100GB mark. It's getting bad and will likely only get worse.
PC gaming is fine - bad games aren't. You could jump to console but you're still going to be getting awful performance from Starfield because it's made on a dogshit engine that can't cope with modern day games.
@@OG_Daz - True, but then I look back at Skyrim... I played the game twice on Xbox 360, then got bored of it halfway through the 2nd playthrough and moved on. A few years down the line I got a PC that could play games and I discovered the world of mods... I dread to think of how many hundreds of hours I spent playing Skyrim after that.
Why are we so obsessed with these unoptimized, buggy, unfinished, and overrated games? They're not worth an hour of discussion, because a game of the year award is far beyond them.
Cyberpunk on Steam Deck is running 40fps medium settings at 720p with FSR 2.1 quality settings, with drops to 30fps in crowded areas, crowds set to high. I dunno how Starfield runs, but Cyberpunk does okay on Deck.
@@MG-un9bh It is 4K with FSR; I think it is either 1080p or 1440p native resolution and then upscaled. I'm getting 55 - 85 FPS with a 3070 Ti at 1080p native, highest settings and no VRS, DRS, or FSR enabled. Not sure why this YouTuber is getting such terrible performance here honestly... Plus, a 3070 is like $400 or less now anyway, though I paid a fair bit more for my 3070 Ti, and that was also a year ago. 55 - 85 FPS at 1080p native is a way better experience than 30 FPS (with some frame drops too!) at 4K with FSR...
It's Bethesda after all. Fallout 76 was a mess at launch (still is) and I honestly don't expect this game to improve all that much; just hope Arc at least launches the game some day
News flash: When new video games come out they tend to require more current technology. The fact that an old PC, or one with weaker hardware, can't run one of the latest games, with increased complexity and depth (because if it were equivalent to an older one you'd instead be bitching that there was nothing new to the game), is the same as it has been for decades and has nothing to do with optimization. Also, you can't see the code, and you have no experience as a game software developer specializing in optimization to be able to tell anything about the code's optimization even if you could see it. I doubt you have any idea what optimization of any kind in compilation of software actually entails, although some of the developers at BGS and Xbox with decades of experience know, because that's their job. But one thing I know for sure: my three-year-old machine with a three-year-old CPU and a year-and-a-half-old previous-generation GPU runs the game beautifully on 4K Ultra/High. There are zero problems with the frame rate, and it's as smooth as glass everywhere in the game with over 70 hours now. If it didn't, I'd see what equipment I would need to play the game, or play something else. I wouldn't whine about it or blame it on things I don't actually know anything about, because that's not my style. If I have a problem (and YOUR PC is YOUR problem), I'll fix it instead of blaming it on someone else, like developers who just tried to make a good new game.
@@PineyJustice What's your point? Standards have gone up since then. 4K 240Hz monitors didn't exist then, let alone mainstream high-refresh-rate monitors. The target for most console games was 30fps.
I have a 3070 and I think it'd be fair to expect at least 80fps at native 1080p max settings on any game released this year and even the next. This card was supposed to be a pretty decent 1440p card after all and now it can't even handle 1080p. I got a free copy of this game from a friend but I think I'm gonna pass on this one until they optimize it better... if they ever will.
My 3070 LAPTOP can play Cyberpunk 2077 at ultra settings, ultra RT, DLSS balanced, 60fps through all but the city center, where it drops to around 50ish. Also the game is rubbish, and that's coming from the biggest Bethesda stan of all time: I've been playing their games since Morrowind when I was 12-13 yrs old, with at least 250hrs+ in every game since then (excluding online/76 of course). The beginning is the worst 15hrs of any open world game I've ever played. The planets are beyond bland with sterile generic bases randomly generated everywhere, all the fun is hidden behind a way too large perk tree, space combat is either impossible or laughably easy, constant loading screens since you're fast traveling every time you reach a quest marker, no immersion, over-encumbered every time you explore a location, lackluster gun battles that are just about slinging more lead than the other guy, horrible writing, wooden faces. I thought they could fix most of this but they didn't. Don't spend $100; wait for a sale and mods, guys!!!
I mean i literally play cyberpunk with my 3070 at 4k and the settings are just about max with dlss on. Great experience and i still get over 60fps. This is a joke
Luckily on PC there are also the modders, who will release dozens of performance tweaks for the game, so I think we're fine. Soon the official patches and drivers will address these issues, and after that the modders will come with their own tweaks on top of that.
The 3070 is not an old GPU. I have a 3070 TI. This state of gaming pisses me off to no end. Spent way too much money to have piss-poor FPS in a game released 2 years after the release of my card. I hate this. The 30 FPS on 1440p @ LOW settings is what really got me heated.
It's the programming: way too many actors, and way too many pointless actor classes (objects you pick up). These all contribute to bad optimization (when optimizing you literally eliminate objects that are pointless), along with bad texture application and likely too much branching switch-case code (or stacked else-ifs).
I think outlets like Digital Foundry are also partly to blame for this. Their so-called "tech analysis" consists of obsessing over useless gimmick features which barely make the game look better while incurring significant performance cost, and saying good visuals require good hardware when the visuals themselves don't look anywhere close to justifying their hardware requirements.
Thank you for showing this. Way too many fangirls in the comments with "it runs great for meeee" comments dismissing any legitimate complaints about frame rate. I have a 3090 and it runs like this at 4K. Had to DLSS mod it to stay above 60fps at 50% resolution scale. It just runs so poorly.
@@MrDmadness There is so much random shit in every little base you can pick up. Folders, pens, coffee mugs, styrofoam trays, pencils, notebooks, empty food packets, photo frames, etc. These are junk items worth pennies. Sure, you can collect them to spread around your outpost to make it more “homey”, but my god there is too much junk.
@@Aurummorituri and every one of those takes up a bit of space in the program, it renders them as actor classes, which is a huuuuge load on the gpu and cpu for absolutely no benefit to the game or player
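To make the culling argument above concrete, here's a minimal, hypothetical sketch of the kind of pass the comment describes: dropping low-value clutter items (pens, mugs, folders) from the per-frame update/render list when they're far from the player. All names and numbers are illustrative, not Creation Engine internals.

```python
import math

def cull_clutter(objects, player_pos, radius=25.0):
    """Keep only objects close enough to the player to matter this frame."""
    visible = []
    for obj in objects:
        dx = obj["pos"][0] - player_pos[0]
        dy = obj["pos"][1] - player_pos[1]
        if math.hypot(dx, dy) <= radius:
            visible.append(obj)
    return visible

# 1000 junk items spaced 10 units apart along one axis
clutter = [{"name": f"pen_{i}", "pos": (i * 10.0, 0.0)} for i in range(1000)]
near = cull_clutter(clutter, player_pos=(0.0, 0.0), radius=25.0)
print(len(clutter), "->", len(near))  # 1000 -> 3
```

The point is just that a simple distance cull shrinks the working set by orders of magnitude; whether Starfield's engine actually skips such a pass is speculation from the comments, not something anyone outside Bethesda can verify.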
I've noticed that if you put the shadows at medium and the crowd density and reflections at high, but leave everything else on low or off, there really isn't much of a visual difference in most cases between that and ultra. The biggest difference is the performance drop.
I have a feeling a huge patch will arrive soon. I noticed the VRAM buffer is acting strange on NVIDIA GPUs; seems like it's not being fully utilized. Great video, Saude!
Hey, I hate to burst your bubble, but there is absolutely no way they would be able to optimize the engine, because it's clearly being pushed to its limits. They already delayed it for a year and this is the result.
yeaah... we're still waiting for the "Boston low FPS" fix for Fallout 4; only modders can "save" Starfield. I'm still impressed how modders fixed most of Skyrim's bugs, BUT WE DON'T HAVE THE CREATION KIT YET, SO GOOOD LUCK PLAYING AT 30 FPS ON A 3060 hahahaha
@@superpartia3805 yeah, these fuckers can't even implement mods into patches, copying other people's work to finish their fucking game. Worst developers ever, always said this. Shit graphics, 2006 gameplay, more loading screens than an Xbox 360 game, facial animations from the X360 era, forgettable writing, decent performance like 5 years after release and only with community patches (mods). Bethesda are incompetent like nobody else and I really don't understand how people like their games... I mean, maybe they are so nostalgic for the Morrowind gameplay that they want the same after 20 years lmao... but then, a lot of people love junk food, I guess it's the same with games.
Enjoy cyberpunk for the time being and refund starfield until it's patched... And cheaper. You buy GPU that should run everything on desired resolution and then devs just fuck it up. Sad times
@@semick4729 just wait for the Unreal 5 wave to kick in: the most generic-looking engine, which only pushes photorealistic graphics in tech demos at gaming events and was never really meant to be utilized in action, at least in the current year. I really wanted to feel like a kid again with all these new releases, picking up games and booting them up on my rig, but all I get is more and more frame loss with worse and worse graphics and gameplay. Is there really any point to gaming on a medium-spec PC anymore?
Enjoy Cyberpunk instead. You can play it at maximum settings (not including path tracing) at 1440p at the framerates it can run Starfield at 1080p high. Looks much better and is also probably more fun.
I get drops to 40 FPS at 3440x1440 medium/high settings in dense foliage areas and when walking up to lava and resource/gas vents. I have an RTX 3090, i9 12900K, 32GB DDR5 RAM @ 6200 MHz CL32 and a 2TB NVMe M.2 SSD. Seems like there are issues with the actual optimisation of certain graphical settings, or an engine issue, because I usually get 80 FPS and lose 40 FPS straight up near lava and resource vents, and in the high-foliage areas I can drop down to 47 FPS, so I lose 33 FPS. The difference between using FSR or not is extremely small, like a 10 FPS difference. If I were using DLSS Quality I would be getting at least a 20-25 FPS difference, more than double, and it would look better too.
Can't be bothered optimizing your game? Just make it run at 30 FPS on consoles and write down heavy hardware requirements for PC so you can get away with it.
I was waiting for reviews of Starfield to decide whether to invest more time in it or in Baldur's Gate 3. Thank you to all of the premium edition beta testers for helping me dodge this bullet lol.
He deserves the money. He's one of the very, very few people on the internet who hasn't been bought out and actually tells the truth about this game lol. Seriously, it's actually painful watching big tech and review channels and streamers all sit there and lie that this game is perfect and deserves a 9/10 or 10/10 rating, and that the performance is fine and you should just buy a $5k PC if you wanna run something as extreme as 1440p high settings.
I genuinely hope that it's just the drivers that haven't been ready yet, which would explain the horrible Nvidia performance. Hopefully we will get Game Ready drivers on the 6th.
It could be that, and I'm hoping that's the case, but if not then it's pretty crazy to have $1-2k PCs and the game can't run well. On top of that, the Xbox is capped at 30 fps, which I find bizarre.
Seeing how many Nvidia users there are on Steam surveys, I'm not so sure that this is a great move from Bethesda. An anticipated title like this should function well on all modern GPUs, regardless of manufacturer. There are already DLSS mods, but it shouldn't be required to play 1080 resolutions on decent settings, using a mid-tier GPU from last gen. This is nuts.
All I'm saying to people is.. I'm running Cyberpunk with Path Tracing, not just ray tracing.. at 30fps constantly with DLSS set to just balanced, on my 3070.. at 1440p. Starfield is a disgrace.
On my Ryzen 5 3600 / RTX 3060 / 32GB PC I had over 100 fps, 107 to be exact. Everything at the default medium, 1920x1080. Did some running around and saw no significant dip.
the hype train for this one is real. Most people won't perceive the garbage graphics and horrible framerate as "bad optimization"; they'll think the game is so big that's why it runs so poorly. I think the worse it runs, the more people are gonna be defending it 😂
@@jkeebla >most people wont percieve the garbage graphics They'll say "it's a bethesda game." Personally, after Skyrim I told myself any game released from then on would have acceptable graphics, considering Skyrim was fairly acceptable already.
@@jkeebla Yep, I have a casual friend like that, designed him a $2000 7900xtx PC for starfield and some other games. He has some sort of driver issue and was getting 1080p 30fps ultra, he thought it was just the size of starfield and that was normal.
Mine runs well now on a 2070S. The DLSS enabler mod "Starfield FSR2 Bridge" gave me better performance. I also replaced the medium.ini from "Starfield performance optimizations" for a boost. If it's eating your VRAM, you could install the "performance texture pack".
yeah the DLSS mod is legit a MUST for RTX3000/4000 users, it basically let me go from running the game on medium at 60fps to high at 60fps with exact upscaling and all.
These videos are so informative and entertaining at the same time, and it's because of how you present yourself, please don't ever change bro, we love you like this 🙏
If a game can't hit 2K 60fps with a 3070, that's just outright disgusting optimization and there's no defending it whatsoever. "But but but but it's OlD hArDwArE" BS. It's hardware that was created specifically for 2K 60fps benchmarks on software that Starfield's engine quite literally uses. And it's not even that old; it's exactly 3 years old, nearly to the day. It came out 2 months before the XSX and is quite a bit more powerful than that GPU. Loving a game isn't a good excuse for defending its problems. You want it to be better.
It's like when the GTX 1070 came out in 2016: considering it was a 70-tier card, it ran every game at 1080p max settings just fine back then. Yeah, 8GB of VRAM went further in 2016 than nowadays, but it's still really good for 1080p gaming. Games like Stray, God of War, and Doom Eternal, and others that have good optimization and look good, run really well on a 3070 despite the 8GB of VRAM, and I can confirm that. Really a shame that the games industry has come to this kind of situation 🤷♂️
Xbox Series X is native 1440p, FSR-upscaled to full 2160p, and it's a rock-solid locked 30fps. It could easily run 60fps if the devs actually attempted to optimize it, but they never even tried; they took the easy way out, locked it at 30fps, and said that's good enough. Honestly, though, it feels more like 40 fps, as it's a super smooth 30, which I've never felt before. Usually 30 feels atrocious, but not in Starfield, so I'm not sure how they made it feel so good. I'm definitely not nearly as upset now that I've tried it and it doesn't feel terrible like the usual 30.
Some people found there are at least some serious CPU optimization issues, such as the thread priority being changed every frame, which is really bad for performance. Another thing was a compilation issue. It probably wouldn't affect GPU performance much, but there are definitely optimization issues (unless you're on an AMD GPU).
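The "changed every frame" complaint is a classic anti-pattern: paying a fixed per-run cost (like an OS thread-priority call) inside the frame loop instead of once up front. A minimal sketch of why that hurts, with the costly call simulated by a short sleep (all names here are illustrative, not Starfield code):

```python
import time

def set_priority():
    # Stand-in for a costly OS call such as a thread-priority change;
    # the sleep just simulates its fixed overhead.
    time.sleep(0.0005)

def run_frames_bad(n):
    for _ in range(n):
        set_priority()  # overhead paid on every single frame
        pass            # ... actual frame work would go here ...

def run_frames_good(n):
    set_priority()      # overhead paid once, before the loop
    for _ in range(n):
        pass            # ... actual frame work would go here ...

t0 = time.perf_counter(); run_frames_bad(200);  bad = time.perf_counter() - t0
t0 = time.perf_counter(); run_frames_good(200); good = time.perf_counter() - t0
print(bad > good)  # True: the per-frame version pays the overhead 200x
```

At 60fps, even half a millisecond of redundant per-frame work eats 3% of the 16.7ms frame budget, which is why hoisting invariant calls out of the loop is one of the first things profilers flag.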
The video does not show CPU limits for the game, but hard GPU limits. CPU usage is a mere 35% to 50% max. Even if we assume a worse CPU at 70% usage, it's still not CPU limited :/ so whatever they found is of no use for the optimization, because the CPU is not the problem.
Def not CPU problems, unless their engine is bugged and doesn't use the CPU correctly or something. I wouldn't be surprised if that's the case: in Skyrim and Fallout 4 I had to use a mod to fix something about the engine using the CPU incorrectly (they use only 2 cores and forget about the rest), and I saw a massive FPS boost back then. Guess they haven't fixed this yet.
@@masteroak9724 the 5800X3D can draw up to 105 watts, but in this video the average is 65 watts 😂😂😂😂 so there's a CPU issue, and the same happens with the GPU! It doesn't matter if it says 99% usage when it only uses 2/3 of its power draw! It will never achieve the desired result
@@artursobkow2680 bullzoid posted a video about how this game is more memory-bandwidth limited than CPU limited. While the game is not CPU limited on kryzzp's system, the utilization is certainly high for what it's offering.
Hello kryzzp, great video as always 😄. I have a suggestion for you: try the DLSS mod in this game and see how much of a difference it makes compared to FSR2.
My 6800 XT plays about the same 😸😸: won't maintain 60fps outside, with frequent dips to the 50s and occasionally the 40s. So much fun! The CPU never goes over 10% utilization with 16 threads, and the GPU stays at 100% always. Such a fine-tuned experience!
Me too! Gpu at 96% (3070) and cpu just chilling 😂😂😂😂😂 Quite opposite of my cyberpunk experience 😂😂😂😂 cpu 90% gpu chilling Wth are these new devs even doing 😂😂😂😂😂😂
From what I've been reading, Nvidia didn't have enough access to the game to optimize their drivers before launch, so all of these benchmarks are just good for a baseline. In 1-2 months both game and driver updates will make it more playable on Nvidia and Intel Arc (which can't even run it). They likely cared more about the Xbox release than the PC release in terms of optimization.
Letting AMD get their grubby hands on things definitely hurt the optimisation especially as a majority of people that actually care about what's in their PC and know what they're talking about with said situation are not using AMD, they're using Intel and Nvidia for the superior efficiency, fps, optimization and all the other wonderful things.
@adambrown6669 I think Microsoft is the bigger villain here. They bought Bethesda and forced it out a bit early and only optimized for the Xbox (AMD hardware) for early access. This forced them to make it more CPU heavy and only implement AMD optimizations that were required, which is why it runs better on AMD currently. Remember, technically the game is in early access. I think originally, they wanted to release it in November, as Bethesda likes to.
@@adambrown6669 AMD really are not that far behind 😅 Better gaming CPUs, better mid-range GPUs, only needing FSR3 to see improvement. I highly doubt AMD had anything to do with anything other than it being a sponsor/partnership deal... The real villains are Bethesda and Microsoft here, I think...
Honestly, the change in performance from low to high was underwhelming. I'm accustomed to games being markedly different in performance and visuals when that gets changed, but there was almost no change in visuals from what I could see. YT compression is definitely affecting this, but seriously?
Nice video, really gives an insight into how the game might be running. On the 1080p low side you might be at least partially CPU limited though, from what I can see. I tried to estimate performance with my 5800X3D, and from the sites I read it might not be capable of much more than 70fps in some areas. I really hope they improve the optimization. I like to play games at more than 60fps; it just feels smoother and more responsive.
5800X3D + 3070 Ti, at 1080p (no FSR, no DRS, no VRS, ultra settings) I get 55 FPS in the heaviest areas like New Atlantis, where this video takes place, but in most other areas I get 65 - 85 FPS. With no stutter and a VRR display it is perfectly fine. I too prefer 100 FPS+, but this isn't really the kind of game that "needs" that to feel good; it isn't some action game like Doom or CS or something. It is fine. What is unfortunate is that dropping to low settings doesn't help the game all that much with FPS, meaning folks with 3060s and like Ryzen 3600s are getting crushed down to 30 - 40 FPS at 1080p. Sucks for them, because the game is great honestly.

Plenty of drama out there, but I personally am having tons of fun with this game. I like it more than Skyrim/Oblivion for sure, and it so far seems to have more potential than Fallout, but I'm not 100% on that yet; I'll know more as I get deeper into the quests. The game has a TON of mechanics too, with far better menus than Fallout 4 had, like when it comes to crafting and base building and ship building and organizing your crew for settlements (which make sooooo much more sense than the settlements in Fallout 4). The main quest is actually fantastic and the faction quests are even better. Side quests are hit and miss as per usual.

The only thing lacking for me is the space side of things. This game is basically Mass Effect on steroids. If you approach it like that it is really great. If you expect No Man's Sky + Bethesda game, then you'll be very disappointed regarding the NMS elements, but it DEFINITELY is a Bethesda game, just with way, way fewer bugs than normal for Bethesda, and much nicer graphical fidelity. Compared to Fallout 4 or Skyrim it looks light years better, and those games ran like garbage when they launched as well.
just keep in mind that your rig is extremely powerful compared to what the majority uses, so what's great to you is 30 fps low settings to the next guy.@@mraso30
@@mraso30 thx for the info :) I have been playing Elden Ring at 60fps, which was also alright but not what I prefer. Found out I could unlock the framerate; now I get around 120 and it just feels a lot better. I expect the game to run fine then, but it still bothers me that even with such a high-end system I can't get a stable 120fps, and I don't feel like upgrading to a 7800X3D, which can't do 120 either.
I have a 3070, and I'm not even mad; I'm laughing through all your Starfield videos. Don't let the marketing force you into thinking that the hardware you have is not enough to have fun. Missing out on a game or two per year by CHOICE shouldn't make your entertainment life feel horrible.
Nah, devs should stop using upscaling technology as a crutch for bad optimization. This game should be able to run well over 60 fps without any DLSS/FSR at high settings on mid-range cards like the 3070. You are right about the fun part, but people want a good-looking game, not one that looks like someone rubbed vaseline on the screen because they had to turn down the rendering percentage to get playable fps. FPS lower than 60 isn't fun, though that definitely depends on the player. Can't let your fun blind you to the horrendous optimization. It's why devs get away with it.
@@Novistadors Wishful thinking. That boat has sailed and will never return, so to speak. Also, it doesn't look like vaseline is on your screen. Get your eyes checked.
@@Novistadors 3070 was and is a joke of a card with 8gb of vram. This is probably the only game that will fit at 1440p or higher in that tiny vram pool. It's only going downhill from here.
9:35: there's no way you can pause on this still and think to yourself that this is the true next-gen experience. Games from well before this one released look WAY better and run twice as well.
I have locked it at 30 fps on a 4K TV 50% scaling and have blasted all post processing on TV, crystal sharp image and super smooth with LG truemotion tech (90fps interpolated from 30), but my god the latency is 35ms XD. The only way the game is somewhat playable, the combat takes time to get used to but it's not that bad on a controller. Very sad milestone game for PC gaming.
@@Bubbagaming90000 I understand that 4060 is not the best video card, but I don't have money for 3060 ti (we have it more expensive). And I also do not want to buy rtx 3060, because it is weaker. What should I do then?
The difference is that a demanding game has good textures. This game's are awful: some of the furniture looks like it's painted on the wall, and those trees, FFS, talk about lazy.
@@NoodlesTBograt Actually, textures are one of the least taxing elements in games. Although I somewhat agree: some things like the pilot cockpit are incredibly detailed, and others like the leaves look like...
@@NoodlesTBograt That's the reason why the game's VRAM usage is so low: low-quality textures. Otherwise people would be complaining again that their trash RTX 4060 is running out of VRAM.
I've got a 3070, playing at 3440x1440, DLSS mod, low settings and getting 90+ (except for big cities, where I get about 60). Since I don't really care about the graphics being ultra, this has been totally fine for me.
They knew about the performance when getting sales footage. All the influencers who got the game 3 weeks early knew about the performance issues. They also knew about politics being inserted. Nobody said anything.
I have this GPU paired with a 13700K. Where @ZWORMzGaming is testing is about as tough as it gets. I have the DLSS mod installed and with the equivalent of about DLSS Quality and Ultra settings at 1440p, I get over 70 fps in New Atlantis all the time, it looks gorgeous (much better than FSR), and in other outdoor areas I tend to get 80 to 90 fps and in indoor areas, 100 to 140 fps. I love the game and how smooth it is - barely a stutter ever. And I am expecting an NVIDIA driver update to increase performance by 10 to 20% (the Wattage is way under what it should be). It's an intensive game, but the outrage is a bit over the top. Think of all the NPCs walking around in New Atlantis, so many with quests attached to them, the number of persistent variables and objects, the number of if statements required to check so many conditions - it's tough on CPUs compared to the typical on-rails game (I am a developer, including with Unreal Engine, so I know what I'm talking about), and is holding the GPU back here in this section.
3070 Ti here, getting 55 - 85 FPS @ 1080p (no FSR), highest settings, VRS and DRS turned off. New Atlantis gets about 55 - 70 FPS. Sure, it feels like I should be getting like 100 FPS based on visuals alone, but in TLOU Part 1 for example I get 80 - 100 FPS, and whilst that game does look nicer it is also such a simple game in terms of scope compared to Starfield. I have zero issues with the performance I am getting. The only people getting "bad" performance in this game are on Xbox Series X, have a 3060 or lower card and are probably trying to run at 1440p or something stupid for their card, or have a decent card but paired it with terrible components. This game stresses the CPU just as much as the GPU; most games don't do that, so people get away with their mainstream CPUs, but this game needs a good CPU/GPU balance, and a lot of folks neglect that when building a gaming PC.
It seems you do know what you are talking about, but the NPC part just does not make any sense. Look how many NPCs there are in CP2077 on high crowd settings; look at GTA5. Before you start to speak with an NPC, it's just a 3D model animation, nothing more. All the quests and dialogues can be stored in memory, and the CPU absolutely does not need to calculate anything about those things before interaction with the NPC. And even when interaction happens, it's just laughable to think that checking a few numbers, triggers, player stats, whatever, could be hard for a modern CPU that is capable of performing billions of operations per second on each core. I am pretty sure that rendering a single NPC takes way more resources than all the dialogues/interactions/scripts of all NPCs in the area. How do I know? There are much simpler games where you can have complex interactions with a huge number of NPCs, and they run on a potato.
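The "billions of operations per second" claim above is easy to sanity-check with a back-of-envelope timing: even in interpreted Python, a million branchy dialogue-gate-style checks finish in a fraction of a second. The field names and thresholds here are made up for illustration.

```python
import time

# Hypothetical player state, the kind a dialogue gate might inspect
player = {"level": 12, "credits": 3400, "faction_rep": 55}

def npc_conditions_met(p):
    # A handful of branchy checks, like a quest/dialogue condition list
    return p["level"] >= 10 and p["credits"] > 1000 and p["faction_rep"] > 50

checks = 1_000_000
t0 = time.perf_counter()
for _ in range(checks):
    npc_conditions_met(player)
elapsed = time.perf_counter() - t0
print(f"{checks} checks in {elapsed:.3f}s")  # well under a second on typical hardware
```

A compiled engine would run the same checks orders of magnitude faster still, which supports the comment's point that per-NPC condition logic is unlikely to be the bottleneck compared to rendering and animating the NPCs themselves.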
The Starfield DLSS Nexus Mod is totally worth the extra five minutes to install. Using a 3060 Ti and 5600 and getting 55fps+ in cities on 1440p high settings with default resolution scale and latest DLSS version. No overclock and turned off motion blur and film grain. I'm using the GamePass $30 update version of the game. Unfortunately I read that Steam buyers are having some issues with the mod crashing and some controllers not being recognized. It's super simple to install or revert if it doesn't work.
Has it been updated since release? Last time I checked it was not giving the correct visuals of DLSS, only the fps increase. It seemed more like it was using FSR as a base with "adjustments" to call it DLSS, not a proper implementation.
@@kishaloyb.7937 Yeah, it's really weird. FSR/DLSS hardly changes anything in the game. I think it's an overall optimization problem with Creation Engine 2. Hopefully they figure out some software magic and release some updates that make it a bit more performance-friendly. When they said "least buggiest game" I think they meant that in the true sense of bugs (quest/item bugs etc.), not that it would be the best running performance-wise. Still very playable on my 6700 XT/5800X @ 1080p. If I didn't have an fps monitor on, I don't think I would have noticed the fps dips as much.
@@magnuskane246 FSR 2 and DLSS 2 have very similar performance overheads. So, if you were getting 70fps with FSR 2 @ 75% render scale at 1080p. You'll roughly get the same FPS with DLSS 2 at those exact same settings too. That's just how the technology is. It's only with DLSS FG, you'll notice the fps increase further but then you'll have increased input latency too. Though as much as I love the game, I've gotta agree that the engine is very much unoptimized. It's guzzling computing resources but the fps is really not that great. Hope Bethesda fixes it with future patches and modders could optimize it further.
I remember when FO4 came out. At the time 900 series was the best you could have and a 970 wasn't able to play it on high settings. Looks like it's the same today.
Thanks for testing the 3070! I just got one recently from my friend. I was excited to try out Starfield, but seeing how poor the performance is at 1440p, I think I'll just pass on yet another terribly optimized game; maybe in the special edition 6 years from now I'll consider trying it out. Keep up the great work!!!
Can you do some CPU tests to see if the CPU is being utilised efficiently, and whether or not turning off hyperthreading gets better performance? A lot of games this gen are running better without hyperthreading for some reason.
I know it might not be your thing to mess with mods for performance, but you should try the dlss mod, just to see how it stacks up against native and fsr, to see if it really helps or if this game is a lost cause.
Over 20 hours in on Starfield. Been playing this on my 3070 paired with an i7-12700K, Ultra settings with 66 percent resolution scale of 4K. Definitely been very playable. Not a complete 60 fps lock of course lol, but there are plenty of areas, especially inside buildings and caves and stuff, where it's well over 60.
60fps is not good enough anymore, 120 is the new standard. I refunded this overpriced, decade-old-looking game that runs worse than any of my decade-old games and doesn't look much better.
@@laszlozsurka8991 There's already a DLSS 3 mod on Nexus Mods. The frame generation part is paywalled at $5 and exclusive to the 40 series, just like regular frame gen.
What's scarier about this game's performance, I found, is that it began development in 2015 and was in a playable state by 2018, before the 30-series GPUs were even announced, yet you are looking at a 4070 for 1080p medium/high 60 fps at the bare minimum. I don't understand how, in 2023, a triple-A studio putting millions of dollars into a game can go "yeah, 30 fps for modern consoles, that's good enough; $600+ GPUs for 1080p medium settings on PC sounds good, send it out." Like, I won't pretend the game is ugly, but I also won't pretend the game looks better than many games we have seen in the past 4 years. Normally studios would get nailed for such a release in reviews, but for some reason this game is skating by, getting undeserved praise from everyone, with the only defence I can find on the entire internet being "it will get better in the future, give it 6 months."
The fact that Todd Howard, when asked in an interview about the bad optimization, said "just upgrade your hardware" left me dumbfounded. How is he saying people just don't have good enough hardware when I've watched multiple 4070 benchmarks and they couldn't get any higher than 66 fps on High? The game is optimized horribly.
I'd be surprised if it got 60fps at 1440p High settings. It's a Bethesda game after all. Lazy companies avoid optimization by forcing the use of DLSS. I bought the 3070 as a card for 1440p; I played 1 or 2 games and it's become a 1080p card.
I currently have an r5 3600 and a 3070. I finally gave up on trying to get 60fps at 1440 and just locked the game to 40fps in control panel and set the preset to ultra with 80% render scale and a DLSS mod to get away from the FSR problems. Even with all of this it still drops to high 30's at times but if you turn off the frame counter it's more or less fine and you get used to it. Absolute shame the game has launched in this state however as it really is quite a good game.
You ask who it is that decides the program design options. Video game devs get kickbacks from card manufacturers for making stuff that doesn't work on old cards, forcing people to upgrade. The tech industry has been exposed using this predatory practice in all aspects of tech. The cell phone and video game companies were exposed all the way back in the early 2000s. Another thing they like to do is introduce dummy processes in updates to make the component seem like it's running slower.
That's how every single PC game is at release. They take the feedback from all the machines used via Steam, then optimize the game for the cards in use and patch it for performance.
People have a hard time just accepting it's the game, always saying it's the hardware. If multiple people can be testing multiple cards on multiple different systems getting the same results maybe it's not the hardware lol. Also the idea that a 2-3 year old GPU is a potato and shouldn't be expected to run new games is ridiculous, especially when the 40 series cards struggle.
Exactly! I've been doing this for years. Optimization has never been worse...
@@zWORMzGaming Dramaqueen.
Dude, exactly. I got an RX6600 recently. And even though it's considered a "budget" GPU, that thing has 8gb of VRAM and should be able to run games like these without issues. I know it does when the game is actually optimized.
I shouldn't feel like I need to get a top-of-the-line card just to achieve 1080p at 60fps. This card and more like it are more than capable to do this, especially when the games just don't look that much better from previous titles.
Agree. I highly doubt a GPU can go from 4K 60fps to barely 1080p 40 overnight. Makes no sense, no matter how people try to spin it.
@@ChrisGrump DooDoo Head
This "guess the settings" thing is a very smart idea actually. Great channel, both funny and informative. Keep up the great work!!!
Thanks for the feedback! I'll keep doing that :)
Cheers!
funny i guessed em almost perfectly, said 1080p fsr low and maybe 1 or 2 things at medium... pathetic that a 3070 cant natively render this shit at 720p low and keep over 60 fps
@@arencorparencorp2189 Well the 3070 is the most pathetic upper midrange gpu we've seen in the last 15 years.
@@zWORMzGaming yeah we like that
@@PineyJustice And Starfailed is the most pathetic game with 10-year-old graphics ever, and it needs NASA's computer to run at native 2K with Medium settings above 60 fps. Bro, this game doesn't even have reflections or any ray tracing setting, lmao. Just cube maps everywhere, like in a PS2 game 20 years ago...
"if you care about performance don't buy this game" hitting the nail on the head
I'll let you in on the punchline, though: Plenty of people who care about performance will buy this game. They will play it for at least a decade on future NVidia 6000 and AMD 9000 series GPUs. And Bethesda will cry about it all the way to the bank.
a 3070 is NOT an old gpu.
3070 for 1080p medium settings and barely 60fps with drops, unthinkable.
with a 3060 ti, I play 1080p High settings and FSR Off, 55-60FPS nothing under....
@@nightgardheal post a video with those frames on those settings in New Atlantis or fake
@@nightgardheal cpu
@@nightgardheal Dude stop lying. You obviously have either resolution scale at 50% or you've got driver-based scaling enabled.
When DLSS and FSR were originally announced I said, "Yeah, this is just going to make games release far more unoptimized," because developers will use it as an excuse not to optimize games. And I have been right; so many games have been really performance-heavy as of late, even on good PCs.
That's why I stopped playing AAA games. It's sad that AAA game developers don't take optimization seriously, f*ck them...
Optimisation, as a skill, has been a non-priority ever since.
Yep, that's exactly what they have done.
Yeah, everyone says that once it's happened lmao. Plus, where did you close your quotation marks, bruh?
The sad thing about that is the technology is actually amazing for being able to play at graphical settings or visual fidelity your card shouldn't normally reach... At least, that's what it was designed for, until every game studio adopted the "upscaling = optimisation" lazy mentality.
Anyway, this game goes a step further. Even with FSR on, it's nowhere near acceptable performance. If your game can't even get close to good enough fps with FSR or DLSS on, then you have an incredibly serious problem.
If every PC game was optimized like Doom, there would be no market for high-end GPUs as that engine runs amazing.
Doom's engine is an absolute masterpiece.
This! 100% this!!
And this game doesn't even look as good as Doom
It's a shame that optimization is a skill lost to programming. Did you know they used to fit entire games in less than 100 MB? Heck, the OG Pokemon was less than 100KB iirc, and that is just insane. These days 50GB is run of the mill and many games breach the 100GB mark. It's getting bad and will likely only get worse.
It's impressive but doesn't have the load other games do
as much as I love pc gaming, that's it for me. the amount you have to spend on new hardware just for this level of performance is criminal
PC gaming is fine - bad games aren't. You could jump to console but you're still going to be getting awful performance from Starfield because it's made on a dogshit engine that can't cope with modern day games.
@@PeteBaldwin Nailed it.
@@PeteBaldwin Except you will be getting 30fps at 4K on a £400 machine. I know which one is the better value.
@@OG_Daz - True but then I look back at Skyrim... I played the game twice on Xbox 360 and then got bored of it half way through the 2nd playthrough and moved on. A few years down the line I got a PC that could play games and I discovered the world of mods... I dread to think of how many hundreds of hours I spent playing Skyrim after that.
I've sold on all my gpu, now running my i7-10700k with integrated graphics and giving up gaming.
"Can it run cyberpunk?"
- 2020
"Can it run starfield?"
- 2023
Why are we so obsessed with these unoptimized, bug-ridden, unfinished, and overrated games? They're not worth an hour of discussion, because the Game of the Year award is too far out of reach for them.
This game makes cyberpunk look like the best optimized game ever
Cyberpunk on Steam Deck is running 40fps medium settings at 720p with FSR 2.1 quality settings, with drops to 30fps in crowded areas, crowds set to high.
I dunno how Starfield runs, but Cyberpunk does okay on Deck.
He made a vid on Starfield on the Steam Deck and it's not really playable, drops to the 20s in the intensive areas @@SlainByTheWire
@@SlainByTheWire I get a lot better results with my RTX 4070 running a Threadripper processor, and full ray tracing.
Yeah lol .
Imagine living in a world where you have to own a 1-2k graphics card to play console games.
To be fair, the console runs at a locked 30 fps. So the 3070 is still outperforming the console lmfao.
THIS GAME RUNS LIKE SHIT ON CONSOLES TOO. It will sound funny, but a 3070 runs it better. 30fps is a joke.
@@BennyPress but you are playing on Xbox at 4K, not 1080p...
@@MG-un9bh It is 4K with FSR; I think it is either 1080p or 1440p native resolution and then upscaled. I'm getting 55-85 FPS with a 3070 Ti at 1080p native, highest settings and no VRS, DRS or FSR enabled. Not sure why this YouTuber is getting such terrible performance here, honestly... Plus, a 3070 is like $400 or less now anyway, though I paid a fair bit more for my 3070 Ti, and that was also a year ago. 55-85 FPS at 1080p native is a way better experience than 30 FPS (with some frame drops too!) at 4K with FSR...
@@MG-un9bh not native 4k
Its dynamic resolution
U can get 4k high settings 40+ fps with a 3070 too with dlss on
Starfield developers: You think you saw enough games with terrible optimization? let us introduce ourselves.
It's Bethesda after all. Fallout 76 was a mess at launch (still is) and I honestly don't expect this game to improve all that much. Just hope Arc at least launches the game some day.
They need no introduction.
bUt aT LeAsT iT dOesN't StuTtEr
And yet it was a gpu company that started it all....
News flash: When new video games come out they tend to require more current technology. The fact that an old PC, or one with weaker hardware, can't run one of the latest games, with increased complexity and depth (because if it were equivalent to an older one again you'd be bitching that there was nothing new to the game instead), is the same as it has been for decades and has nothing to do with optimization. Also, you can't see the code and have no experience as a game software developer specializing in optimization to be able to tell anything about the code's optimization if you could see it. I doubt you have any idea what optimization of any kind in compilation of software actually entails, although some of the developers at BGS and XBox with decades of experience know because that's their job. But one thing I know for sure: my three-year old machine with a three year old CPU and year and a half old previous-generation GPU runs the game beautifully on 4K Ultra/High. There are zero problems with the frame rate, and it's as smooth as glass everywhere in the game with over 70 hours now. If it didn't, I'd see what I would need for equipment to play the game or play something else. I wouldn't whine about it or blame it on things I don't actually know anything about, because that's not my style. If I have a problem (and YOUR PC is YOUR problem), I'll fix it instead of blaming it on someone else like developers who just tried to make a good new game.
I remember the days when games were playable for most people when it released, not a year later.
What like Crysis?
Oblivion was like 25 fps on top tier hardware at ultra settings at launch. Clearly don't have a very good memory or you're young.
Well, I still can't run Crysis ^^
@@PineyJustice anything else?
@@PineyJustice What's your point? Standards have gone up since then. 4K 240Hz monitors didn't exist then, let alone mainstream high refresh rate monitors. The target for most console games was 30fps.
The funny thing is Starfield's graphics don't even look good for a 2023 game. It looks like Apex Legends to me, but performs 10 times worse.
I have a 3070 and I think it'd be fair to expect at least 80fps at native 1080p max settings on any game released this year and even the next. This card was supposed to be a pretty decent 1440p card after all and now it can't even handle 1080p.
I got a free copy of this game from a friend but I think I'm gonna pass on this one until they optimize it better... if they ever will.
I have a free copy too :) My friend's name is Xatab.
I have one too. My friend's name is dodi.
Have a copy as well. My friend's name is FitGirl.
It's Bethesda... Of course it's gonna be broken for at least the first year or so lol it's pretty much their trademark 😂
I don't see it ever being optimised considering it's using the engine from oblivion
I sure hope there's some patch on the 6th, I've got a 3070. The very same 3070 that runs RDR2 and CP2077 in 1440p nearly maxed out at 60+ fps.
@@Blackfatrat Hope you're right, I've watched some non-spoiler gameplay. If I like the ES(O) series and the Fallout series, how do you think I will like this?
@@Blackfatrat I will be testing this on game pass, tho it seems AMD got the best optimization this time.
My 3070 LAPTOP can play Cyberpunk 2077 at ultra settings, ultra RT, DLSS balanced, 60fps through all but the city center, where it drops to around 50ish.
Also, the game is rubbish, and that's coming from the biggest Bethesda stan of all time. I've been playing their games since Morrowind when I was 12-13 years old, with at least 250hrs+ in every game since then (excluding online/76, of course). The beginning is the worst 15hrs of any open-world game I've ever played. The planets are beyond bland, with sterile generic bases randomly generated everywhere; all the fun is hidden behind a way too large perk tree; space combat is either impossible or laughably easy; constant loading screens since you're fast traveling every time you reach a quest marker; no immersion; over-encumbered every time you explore a location; lackluster gun battles that are just about slinging more lead than the other guy; horrible writing; wooden faces. I thought they could fix most of this, but they didn't. Don't spend 100, wait for a sale and mods, guys!!!
I mean i literally play cyberpunk with my 3070 at 4k and the settings are just about max with dlss on. Great experience and i still get over 60fps. This is a joke
@@Blackfatrat having to play on 1080p low is not fine at all.
Truly one of the bethesda games of all time
Luckily on PC there's also the modders that will release dozens of performance tweaks for the game so i think we're fine, soon the official patches and drivers will address these issues and after that the modders will come with their own tweaks on top of that.
@@fs5866 typical bethesda let the modders do their jobs lol
@@fs5866 and if it stays relevant enough it will get re-released dozens of times with the same bugs since day 1
@@micsss_ unfortunately it has worked out for them in the last 2 decades
Even worse than F4 and 76, what am i reading XDD
The 3070 is not an old GPU. I have a 3070 TI. This state of gaming pisses me off to no end. Spent way too much money to have piss-poor FPS in a game released 2 years after the release of my card. I hate this. The 30 FPS on 1440p @ LOW settings is what really got me heated.
It's the programming... way too many actors, and way too many pointless actor classes (objects you pick up); these all contribute to bad optimization (when optimizing, you literally eliminate objects that are pointless), along with bad texture application, and likely too much branching switch-case code (or stacked else-ifs).
I think outlets like Digital Foundry are also partly to blame for this: their so-called "tech analysis" obsesses over useless gimmick features which barely make the game look better while incurring significant performance cost, and claims good visuals require good hardware when the visuals themselves don't look anywhere close to justifying their hardware requirements.
Thank you for showing this. Way too many fangirls in the comments with "it runs great for meeee" comments dismissing any legitimate complaints about frame rate. I have a 3090 and it runs like this at 4K. Had to DLSS mod it to stay above 60fps at 50% resolution scale. It just runs so poorly.
Pro tip for the programmers... have way, WAY fewer actor classes (pointless objects to pick up).
@@MrDmadness There is so much random shit in every little base you can pick up. Folders, pens, coffee mugs, styrofoam trays, pencils, notebooks, empty food packets, photo frames, etc. These are junk items worth pennies. Sure, you can collect them to spread around your outpost to make it more “homey”, but my god there is too much junk.
@@Aurummorituri and every one of those takes up a bit of space in the program, it renders them as actor classes, which is a huuuuge load on the gpu and cpu for absolutely no benefit to the game or player
@@MrDmadness The only "use" for these junk items is to pick them up and sell them in bulk to level up your commerce skill.
Just turn on dlss or fsr. /s
That's what they want you to do, just upscale. Optimization is out the window.
I've noticed that if you put Shadows at medium and crowd density and reflections at high, but leave everything else on low or off, there really isn't much of a visual difference in most cases between that and ultra. The biggest difference is the performance drop.
maybe do a "Guess the GPU" shorts video series with the kind of games releasing these days
CD PROJEKT RED: "we've released the most unoptimized game of all time"
Bethesda: "hold my inhaler"
I have a feeling a huge patch will arrive soon. I noticed the VRAM Buffer is acting strange on NVIDIA GPU's. Seems like it's not being fully utilized. Great video, Saude!
Hey, I hate to burst your bubble, but there is absolutely no way they would be able to optimize the engine, because it's clearly being pushed to its limits. They already delayed it for a year and this is the result.
Yeaah... we're still waiting for a "Boston low fps" fix for Fallout 4. Only modders can "save" Starfield. I'm still impressed by how modders fixed most of Skyrim's bugs, BUT WE DON'T HAVE THE CREATION KIT YET, SO GOOD LUCK PLAYING AT 30 FPS ON A 3060 hahahaha
copium
@@nicolasgonzalez629 Agreed, Jedi Survivor came out in April and that game still has performance issues, but tbh this game runs worse than that one.
@@superpartia3805 Yeah, these fuckers can't even implement mods into patches, copying other people's work to finish their fucking game. Worst developers ever, always said this. Shit graphics, 2006 gameplay, more loading screens than an Xbox 360 game, facial animations from the X360 era, forgettable writing, decent performance like 5 years after release with patches, and only with community patches (mods). Bethesda are incompetent like nobody else and I really don't understand how people like their games. I mean, maybe they are so nostalgic for the Morrowind gameplay that they want the same after 20 years lmao. But then, a lot of people love junk food; I guess it's the same with games.
Glad it's not just my card then! I got a 3070 just to help run Starfield & Cyberpunk and I thought it was broken! 🤣
Enjoy cyberpunk for the time being and refund starfield until it's patched... And cheaper. You buy GPU that should run everything on desired resolution and then devs just fuck it up. Sad times
@@semick4729 Just wait for the Unreal 5 wave to kick in: the most generic-looking engine, one that only pushes photorealistic graphics in test demos at gaming events and was never really meant to be utilized in action, at least in the current year. I really wanted to feel like a kid again with all these new releases, picking up games and booting them up on my rig, but all I get is more and more frame loss with worse and worse graphics and gameplay. Is there really any point to gaming on a medium-spec PC anymore?
Enjoy Cyberpunk instead. You can play it at maximum settings (not including path tracing) at 1440p at the framerates it can run Starfield at 1080p high.
Looks much better and is also probably more fun.
@@semick4729 You're exactly right, it's not our PCs; blame the developers.
@@ryanspencer6778 Not wrong there. I can run Cyberpunk on my 3090 above 60fps at 4K with RT Medium and DLSS Performance no problem. This game? Pfff.
Have you seen the Hardware Unboxed optimized settings? Using FSR with 100% resolution scale is actually worse than no FSR (~6% worse).
True, but FSR sharpens the image compared to native, so it's worth the 6% loss.
@@ImperialDiecast You can use Nvidia's built-in sharpening and you will have no FPS loss... funny.
Yeah, FSR at native res is just an anti-aliasing method which is more expensive; it's basically AMD's equivalent of DLAA.
Me to Cyberpunk: maybe I treat you too harsh
I get drops to 40 FPS at 3440x1440p medium/high settings in dense foliage areas, when walking up to lava and resources/gas vents.
I have an RTX 3090, i9 12900K, 32GB DDR5 RAM @ 6200MHz CL32, and a 2TB NVMe M.2 SSD.
Seems like there are issues with the actual optimisation of certain graphical settings or an engine issue.
Because I usually get 80 FPS. And lose 40 FPS straight up in lava, walking up to resource vents and in the high foliage areas I can drop down to 47 FPS. So lose 33 FPS.
The difference between using FSR or not is extremely small, like a 10 FPS difference. If I were using DLSS Quality I would be getting at least a 20-25 FPS difference, more than double, and it would look better too.
Masked foliage shaders are typically very expensive and this engine is too old.
Can't be bothered optimizing your game? Just make it run at 30 frames on consoles and write heavy hardware requirements for PC so you can get away with it.
Well said about Doom. Starfield doesn't even look half as good and runs like crap xd
can you test the 4060ti next???
That's too weak, we need to test more powerful GPUs like the gt710
@@SmittyWJManJensentbh we need a g100
@SapphireLeaf Dead card move on
I was waiting for Starfield reviews to see which game I wanted to invest more time in versus Baldur's Gate 3.
Thank you to all of the premium edition beta testers for helping me dodge this bullet lol.
That’s weird. I’m playing on a 3070 with a Ryzen 5 5600X and getting better FPS on Ultra settings. FSR is set to 65% though
Bro be packing money with these Starfield videos 💪
Not more than Godd Howard is currently packing with all the premium edition sales.
@@SmittyWJManJensen say hello to my pirate friend
@@morphtek RUNE is your friend too. that guy is awesome..
He deserves the money. He is one of the very, very few people on the internet who has not been bought out and actually tells the truth about this game lol.
Seriously, it's actually painful watching big tech and review channels and streamers all sit there and lie that this game is perfect and deserves a 9/10 or 10/10 rating, and that the performance is fine, you should just buy a $5k PC if you wanna run something as extreme as 1440p high settings.
StarFAILED 😄
This game performs worse than remnant 2
I don't mind if games become demanding over time, but I'm baffled how they get more demanding and look worse at the same time ! 🤦♂
Yeah, and they also look dull, unattractive, boring, unpolished, and bug-ridden at the same time.
I am getting around 50 FPS at this area on a RTX 2060 6GB at 1080p. This game is hilariously unoptimized
I genuinely hope it's just the drivers that aren't ready yet, which would explain the horrible Nvidia performance. Hopefully we will get Game Ready drivers on the 6th.
I’m praying too buddy, I’ve been dying to play this game
It could be that, and I'm hoping that's the case, but if not, then it's pretty crazy to have 1/2k PCs that can't run the game well. On top of that, the Xbox is capped at 30 fps, which I find bizarre.
Seeing how many Nvidia users there are on Steam surveys, I'm not so sure that this is a great move from Bethesda. An anticipated title like this should function well on all modern GPUs, regardless of manufacturer. There are already DLSS mods, but it shouldn't be required to play 1080 resolutions on decent settings, using a mid-tier GPU from last gen. This is nuts.
All I'm saying to people is.. I'm running Cyberpunk with Path Tracing, not just ray tracing.. at 30fps constantly with DLSS set to just balanced, on my 3070.. at 1440p. Starfield is a disgrace.
On my Ryzen 5 3600 / RTX 3060 / 32 gig PC I had over 100 fps, 107 to be exact. Everything at the default medium, 1920x1080. Did some running around and saw no significant dip.
Thank you for testing this game with my GPU. I was planning to get the game, but now I'm going to decide whether I should wait longer.
Imagine if this game came out in 2022 as planned, the backlash would be insane!
The hype train for this one is real. Most people won't perceive the garbage graphics and horrible framerate as "bad optimization"; they'll think the game is so big that that's why it runs so poorly. I think the worse it runs, the more people are gonna be defending it 😂
@@jkeebla >most people wont percieve the garbage graphics
They'll say "it's a bethesda game."
Personally, after Skyrim I told myself any game released after Skyrim will have acceptable graphics considering Skyrim was fairly acceptable already.
@@jkeebla Yep, I have a casual friend like that, designed him a $2000 7900xtx PC for starfield and some other games. He has some sort of driver issue and was getting 1080p 30fps ultra, he thought it was just the size of starfield and that was normal.
@@a.trance6997 This game has worse graphics than Skyrim.
This game runs terribly even today; it's scary to think what it would have been in 2022.
You should keep the "guess the settings" segments from now on. It's really funny for this game and could be really surprising on optimized titles :)
Mine runs well now on a 2070S. The DLSS enabler mod "Starfield FSR2 Bridge" gave me better performance. I also replaced the medium.ini from "Starfield performance optimizations" for a boost. If it's eating your VRAM, you could install the "performance texture pack".
yeah the DLSS mod is legit a MUST for RTX3000/4000 users, it basically let me go from running the game on medium at 60fps to high at 60fps with exact upscaling and all.
Will DLSS mods work if I play through PC Game Pass?
@@nicosci7009 yeah I’m on game pass with mods
16 times the low detail and cinematic fps, in order to play this masterpiece
This game handles VRAM better than other badly optimized games, but struggles in other aspects.
Low-quality textures, that's why it's low.
Thank you for this video, my friend
These videos are so informative and entertaining at the same time, and it's because of how you present yourself, please don't ever change bro, we love you like this 🙏
If a game can't hit 2k60fps with a 3070..
That's just outright disgusting optimization and there's no defending that whatsoever.
"But but but but it's OlD hArDwArE" BS. It's hardware that was created specifically for 2k60fps benchmarks on software that Starfield's engine quite literally uses. And it's not even that old. It's exactly 3 years old nearly to this day. It came out 2 months before the XSX, and is quite a bit more powerful than that GPU.
Loving a game isnt a good excuse for defending its problems. You want it to be better.
It's like when the gtx 1070 came out in 2016, considering it was a 70 tier card, it ran every game at 1080 max settings just fine back then.
Yeah, 8GB of VRAM was better in 2016 than nowadays, but it's still really good for 1080p gaming. Games like Stray, God of War, and Doom Eternal, which have good optimization and look good, run really well on a 3070 despite the 8GB of VRAM, and I can confirm that.
Really a shame that the games industry has come to this kind of situation 🤷♂️
I think the engine is running on its limits
Xbox Series X is native 1440p, FSR-upscaled to full 2160p, and it's a rock-solid locked 30fps. It could easily run at 60fps if the devs had actually attempted to optimize it, but they never even tried; they took the easy way out, locked it at 30fps and said that's good enough. Honestly it feels more like 40 fps; it's a super smooth 30, which I've never felt before, as usually 30 feels atrocious, but not in Starfield. I'm not sure how they made it feel so good, but I'm definitely not nearly as upset now that I've tried it and it doesn't feel terrible like the usual 30.
Some people found there are at least some serious CPU optimization issues, such as the thread priority being changed every frame, which is really bad for performance.
Another thing was a compilation issue. It probably wouldn't affect GPU performance much, but there are definitely optimization issues (unless you're on an AMD GPU).
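The "thread priority changed every frame" report above, if accurate, is a classic redundant-state-change anti-pattern. Here is a sketch in Python, where `set_thread_priority` is a hypothetical stand-in for a real OS call (such as Win32's SetThreadPriority), and the loop only counts invocations instead of doing real work:

```python
calls = {"count": 0}

# Hypothetical stand-in for an OS thread-priority call; the real
# thing is a syscall with a nonzero fixed cost per invocation.
def set_thread_priority(level):
    calls["count"] += 1

# Reported anti-pattern: re-apply the priority on every frame.
def frame_naive(level):
    set_thread_priority(level)  # redundant after the first frame
    # ... per-frame simulation/render work would go here ...

# Fix: cache the last applied level; only touch the OS on change.
last = {"level": None}
def frame_fixed(level):
    if last["level"] != level:
        set_thread_priority(level)
        last["level"] = level
    # ... per-frame simulation/render work would go here ...

frames = 60 * 60 * 10  # ten minutes of play at 60 fps
for _ in range(frames):
    frame_naive(2)
naive_calls = calls["count"]

calls["count"] = 0
for _ in range(frames):
    frame_fixed(2)
fixed_calls = calls["count"]

print(naive_calls, fixed_calls)  # 36000 vs 1
```

Tens of thousands of needless syscalls per session won't tank a GPU-bound game by itself, but it is exactly the kind of avoidable per-frame overhead the commenters are describing.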
The video does not show CPU limits for the game, but hard GPU limits. CPU usage is a freaking 35% to 50% max. Let's say we had a worse CPU with usage of 70%; it's still not CPU limited :/ So whatever they found is of no use for optimization, because the CPU is not the problem.
Definitely not CPU problems, unless their engine is bugged and doesn't use the CPU correctly or something. Not surprised if that's the case: in Skyrim and Fallout 4 I had to use a mod to fix something about the engine using the CPU incorrectly (they use only 2 cores and forget about the rest), and I saw a massive FPS boost back then. Guess they haven't fixed this yet.
@@masteroak9724 The 5800X3D can draw a mad 105 watts, and in this video the average is 65 watts 😂😂😂😂 There's a CPU issue, and it also happens with the GPU!
It does not matter if it says 99% usage when it only uses 2/3 of the power draw!
It will never achieve the desired result.
@@artursobkow2680 bullzoid posted a video about how this game is more memory-bandwidth limited than CPU limited.
While the game is not CPU limited on kryzzp's system, the utilization is certainly high for what it's offering.
Hello kryzzp , great video as always😄..
I have a suggestion: you could try the DLSS mod in this game and see how much of a difference it makes compared to FSR 2.
My 6800 XT plays about the same 😸😸 Won't maintain 60fps outside, with frequent dips into the 50s and occasionally the 40s. So much fun! The CPU never goes over 10% utilization with 16 threads and the GPU stays at 100% always. Such a fine-tuned experience!
Don't buy the game. Don't give them money. That's the problem
Me too! Gpu at 96% (3070) and cpu just chilling 😂😂😂😂😂
Quite opposite of my cyberpunk experience 😂😂😂😂 cpu 90% gpu chilling
Wth are these new devs even doing 😂😂😂😂😂😂
I'm running a 3070 with 16GB RAM and a Ryzen 3600, installed on an M.2 SSD. I'm not getting close to 60fps, more like 35 to 40.
It's a shame that game developers feel like they've got to raise their prices to $70 even though they can't even optimize their games right.
There is a mod that adds DLSS upscaling, works great on 3070
I second this, the fps isn't amazing, but DLSS 2.5.1 looks sooo much better than FSR2
@@Dokkir - the mod uses DLSS 3.5.0.
@@jowdyboy you can choose the version of dlss, 2.5.1 is the latest one that works on the 3000 series
@@Dokkir - the mod clearly states it's using the latest 3.5.0 version of the DLSS .dll from Nvidia. It's on Nexus Mods. Use your eyes.
From what I've been reading, Nvidia didn't have enough access to the game to optimize their drivers before launch, so all of these benchmarks are only good as a baseline. In 1-2 months both game and driver updates will make it more playable on Nvidia and Intel Arc (which can't even run it). They likely cared more about the Xbox release than the PC release in terms of optimization.
Letting AMD get their grubby hands on things definitely hurt the optimisation, especially since the majority of people who actually care about what's in their PC and know what they're talking about are not using AMD; they're using Intel and Nvidia for the superior efficiency, fps, optimization, and all the other wonderful things.
@adambrown6669 I think Microsoft is the bigger villain here. They bought Bethesda, forced the game out a bit early, and only optimized for the Xbox (AMD hardware) for early access. This forced them to make it more CPU heavy and only implement the AMD optimizations that were required, which is why it runs better on AMD currently. Remember, technically the game is in early access. I think they originally wanted to release it in November, as Bethesda likes to.
@@adambrown6669 AMD really isn't that far behind 😅
Better gaming CPUs, better mid-range GPUs, with more improvement to come with FSR3.
I highly doubt AMD had anything to do with it beyond a sponsor/partnership deal... The real villains here are Bethesda and Microsoft, I think...
Insane how much better the 6800 performs considering what it launched against. The 6800 is in line with the 3080 Ti in this game.
Almost as if it's an amd title... Smh
Bethesda - Fallout 76: "It just works".
Bethesda - Starfield: Well, at least it works now, kinda.
Honestly, the change in performance from low to high was underwhelming. I'm accustomed to games being markedly different in performance and visuals when that gets changed, but there was almost no change in visuals from what I could see. YT compression is definitely affecting this, but seriously?
Nice video, really gives an insight into how the game might run.
On the 1080p low side you might be at least partially CPU limited though, from what I can see.
I tried to estimate performance with my 5800X3D, and from sites I've read it might not be capable of much more than 70fps in some areas.
I really hope they improve the optimization. I like to play games at more than 60fps; it just feels smoother and more responsive.
5800X3D + 3070 Ti, at 1080p (no FSR, no DRS, no VRS, ultra settings) I get 55 FPS in the heaviest areas like New Atlantis, where this video takes place, but in most other areas I get 65 - 85 FPS. With no stutter and a VRR display it is perfectly fine. I too prefer 100 FPS+, but this isn't really the kind of game that "needs" that to feel good, it isn't some action game like Doom or CS or something. It is fine.

What is unfortunate is that dropping to low settings doesn't help the game all that much with FPS, meaning folks with 3060s and like Ryzen 3600s are getting crushed down to 30 - 40 FPS at 1080p. Sucks for them, because the game is great honestly. Plenty of drama out there but I personally am having tons of fun with this game. I like it more than Skyrim/Oblivion for sure. And it so far seems to have more potential than Fallout, but I'm not 100% on that yet, I'll know more as I get deeper into the quests.

The game has a TON of mechanics too, with far better menus than Fallout 4 had, like when it comes to crafting and base building and ship building and organizing your crew for settlements (which make sooooo much more sense than the settlements in Fallout 4). The main quest is actually fantastic and the faction quests are even better. Side quests are hit and miss as per usual. The only thing lacking for me is the space side of things.

This game is basically Mass Effect on steroids. If you approach it like that it is really great. If you expect No Man's Sky + Bethesda game, then you'll be very disappointed regarding the NMS elements, but it DEFINITELY is a Bethesda game, just with way, way fewer bugs than normal for Bethesda. And a much nicer graphical fidelity. Comparing this to Fallout 4 or Skyrim it looks light years better, and those games ran like garbage when they launched as well.
Just keep in mind that your rig is extremely powerful compared to what the majority uses, so what's great to you is 30 fps at low settings to the next guy. @@mraso30
agreed, for me 60 fps on anything but a 3rd person title feels like a slideshow. I need probably around 80 consistently, and preferably around 120.
@@mraso30 Thanks for the info :) I've been playing Elden Ring at 60fps, which was also alright but not what I prefer. I found out I could unlock the framerate, and now I get around 120 and it just feels a lot better. I expect the game to run fine then, but it still bothers me that even with such a high-end system I can't get a stable 120fps, and I don't feel like upgrading to a 7800X3D, which can't do 120 either.
Appreciate you doing these videos!
You're welcome 😁
I have a 3070, and I'm not even mad; I'm laughing through all your Starfield videos. Don't let the marketing force you into thinking that the hardware you have isn't enough to have fun. Missing out on a game or two per year by CHOICE shouldn't make your entertainment life feel horrible.
Nah, devs should stop using upscaling technology as a crutch for bad optimization. This game should be able to run well over 60 fps without any DLSS/FSR at high settings on medium end cards like the 3070.
You are right about the fun part, but people want a good-looking game, not one that looks like someone rubbed vaseline on the screen because they have to turn down the rendering percentage to get playable fps. FPS lower than 60 isn't fun, but that definitely depends on the player.
Can’t let your fun blind you from the horrendous optimization. Its why devs get away with it.
@@Novistadors Wishful thinking. That boat has sailed and will never return, so to speak. Also, it doesn't look like vaseline is on your screen. Get your eyes checked.
@@Novistadors 3070 was and is a joke of a card with 8gb of vram. This is probably the only game that will fit at 1440p or higher in that tiny vram pool. It's only going downhill from here.
@@PineyJustice Yeah, 8GB is awful. Crazy to think NVIDIA is still making 8GB cards in 2023. 💀
9:35, there's no way you can pause on this still and think to yourself that this is the true next-gen experience. Games from well before this one released look WAY better and run twice as well.
I have it locked at 30 fps on a 4K TV at 50% scaling, and have blasted all the post processing on the TV: crystal-sharp image and super smooth with LG TruMotion tech (90fps interpolated from 30), but my god, the latency is 35ms XD. It's the only way the game is somewhat playable; the combat takes time to get used to, but it's not that bad on a controller. A very sad milestone for PC gaming.
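The ~35 ms latency figure above is about what you'd expect: motion interpolation has to hold at least one real frame before it can synthesize in-between frames, so at a 30 fps source the floor is roughly one source frame time. A quick back-of-the-envelope sketch (the one-frame-buffer assumption is mine; real TV pipelines vary and often buffer more):

```python
# Back-of-the-envelope check on interpolation latency. Assumes the TV buffers
# exactly one real frame before interpolating; actual pipelines often buffer more.

def frame_time_ms(fps):
    return 1000.0 / fps

source_fps = 30    # game output
display_fps = 90   # interpolated rate mentioned in the comment

buffer_latency = frame_time_ms(source_fps)  # at least one real frame is held
print(f"one 30fps frame: {buffer_latency:.1f} ms")              # ~33.3 ms
print(f"one 90fps frame: {frame_time_ms(display_fps):.1f} ms")  # ~11.1 ms
```

So roughly 33 ms of buffering alone, before any processing time, which lines up with the ~35 ms reported.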
You are my favorite YouTuber. More videos on the RTX 4060, please? I'm thinking about buying it.
Coming tomorrow:)
4060 not really worth it
@@Bubbagaming90000 It is a good card, but it needs at least 10GB of VRAM.
just don't
@@Bubbagaming90000 I understand the 4060 isn't the best video card, but I don't have money for a 3060 Ti (it's more expensive here). And I don't want to buy an RTX 3060 either, because it's weaker. What should I do then?
It is getting really hard to distinguish being demanding and being unoptimized nowadays
The difference is that a demanding game has good textures. This game's are awful: some of the furniture looks like it's painted onto the wall, and those trees, FFS, talk about lazy.
@@NoodlesTBograt Actually, textures are one of the least taxing elements in games. Although I somewhat agree: some things, like the pilot cockpit, are incredibly detailed, and others, like the leaves, look like...
@@NoodlesTBograt That's the reason the game's VRAM usage is so low: low-quality textures. Otherwise people would be complaining again that their trash RTX 4060 is running out of VRAM.
Nah, it's very easy, it's always the second one hah
I don't think it is lol. The new games don't look great, no improvement in graphics since 2020, but everything runs worse on newer hardware.
I've got a 3070, playing at 3440x1440, DLSS mod, low settings and getting 90+ (except for big cities, where I get about 60). Since I don't really care about the graphics being ultra, this has been totally fine for me.
How do i install that mod?
@@petersvenn1511 There are dozens of YouTube videos about it; I personally used one of those.
They knew about the performance when getting sales footage. All the influencers who got the game 3 weeks early knew about the performance issues. They also knew about politics being inserted. Nobody said anything.
Well I'm not getting this game then, thanks Bethesda.
Thank you for the video, it was helpful !
Love your videos man❤ hope you do more vids on the 3070 in future 😁😁
Gigachad pfp
Gigachad pfp
I have this GPU paired with a 13700K. Where @ZWORMzGaming is testing is about as tough as it gets. I have the DLSS mod installed and, with the equivalent of about DLSS Quality and Ultra settings at 1440p, I get over 70 fps in New Atlantis all the time, it looks gorgeous (much better than FSR), and in other outdoor areas I tend to get 80 to 90 fps and in indoor areas, 100 to 140 fps. I love the game and how smooth it is - barely a stutter ever. And I am expecting an NVIDIA driver update to increase performance by 10 to 20% (the wattage is way under what it should be).

It's an intensive game, but the outrage is a bit over the top. Think of all the NPCs walking around in New Atlantis, so many with quests attached to them, the number of persistent variables and objects, the number of if statements required to check so many conditions - it's tough on CPUs compared to the typical on-rails game (I am a developer, including with Unreal Engine, so I know what I'm talking about), and is holding the GPU back here in this section.
3070 Ti here, getting 55 - 85 FPS @ 1080p (no FSR), highest settings, VRS and DRS turned off. New Atlantis gets about 55 - 70 FPS. Sure, it feels like I should be getting like 100 FPS based on visuals alone, but in TLOU Part 1 for example I get 80 - 100 FPS, and whilst that game does look nicer it is also such a simple game in terms of scope compared to Starfield. I have zero issues with the performance I am getting.

The only people getting "bad" performance in this game are on Xbox Series X, or have a 3060 or lower card and are probably trying to run at 1440p or something stupid for their card, or maybe have a decent card but paired it with terrible components everywhere else. This game stresses the CPU just as much as the GPU; most games don't do that, so people get away with their mainstream CPUs, but this game needs a good CPU/GPU balance, and a lot of folks neglect that when building a gaming PC.
It seems you do know what you are talking about, but the NPC part just does not make any sense. Look how many NPCs are in CP2077 on high crowd settings; look at GTA5. Before you start to speak with an NPC, it's just a 3D model animation, nothing more. All the quests and dialogues can be stored in memory, and the CPU absolutely does not need to calculate anything about them before you interact with the NPC. And even when an interaction happens, it's laughable to think that checking a few numbers, triggers, player stats, whatever there is, could be hard for a modern CPU capable of performing billions of operations per second on each core. I am pretty sure that rendering a single NPC takes way more resources than all the dialogues/interactions/scripts of every NPC in the area. How do I know? There are much simpler games where you can have complex interactions with a huge number of NPCs, and they run on a potato.
The Starfield DLSS Nexus Mod is totally worth the extra five minutes to install. Using a 3060 Ti and 5600 and getting 55fps+ in cities on 1440p high settings with default resolution scale and latest DLSS version. No overclock and turned off motion blur and film grain. I'm using the GamePass $30 update version of the game. Unfortunately I read that Steam buyers are having some issues with the mod crashing and some controllers not being recognized. It's super simple to install or revert if it doesn't work.
Has it been updated since release? Last time I checked it was not giving the correct visuals of DLSS, only the fps increase. It seemed more like it was using FSR as a base with "adjustments" to call it DLSS, not a proper implementation.
FSR 2 and DLSS performance gains are the same in this game. There is barely any difference.
@@kishaloyb.7937 Yeah, it's really weird. FSR/DLSS hardly changes anything in the game. I think it's an overall optimization problem with Creation Engine 2. Hopefully they figure out some software magic and release some updates that make it a bit more performance friendly. When they said "least buggiest game" I think they meant that in the true sense of bugs (quest/item bugs etc.), not that it would run the best performance-wise. Still very playable on my 6700 XT/5800X @ 1080p. If I didn't have an fps monitor on, I don't think I would have noticed the fps dips as much.
@@magnuskane246 FSR 2 and DLSS 2 have very similar performance overheads. So, if you were getting 70fps with FSR 2 @ 75% render scale at 1080p. You'll roughly get the same FPS with DLSS 2 at those exact same settings too. That's just how the technology is. It's only with DLSS FG, you'll notice the fps increase further but then you'll have increased input latency too.
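That tracks: at the same render scale, FSR 2 and DLSS 2 rasterize the same internal resolution, so the GPU does nearly identical shading work and only the upscale pass differs. A quick sketch of the arithmetic (the per-axis scaling behavior is an assumption about how the game's render-scale slider works, and `internal_resolution` is a hypothetical helper):

```python
# Illustrative sketch: internal render resolution for a given render scale,
# assuming the scale applies per axis (an assumption about the game's slider).

def internal_resolution(width, height, render_scale):
    """Internal render resolution for a per-axis render scale (0.0-1.0)."""
    return round(width * render_scale), round(height * render_scale)

# 75% render scale at 1080p, as in the comment above:
print(internal_resolution(1920, 1080, 0.75))   # (1440, 810)
# A "Quality"-style scale (~2/3 per axis) at 1440p:
print(internal_resolution(2560, 1440, 2 / 3))  # (1707, 960)
```

Either upscaler then reconstructs the output resolution from that same internal image, which is why swapping one for the other at a fixed scale mostly changes image quality, not fps.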
Though as much as I love the game, I've gotta agree that the engine is very much unoptimized. It's guzzling computing resources but the fps is really not that great. Hope Bethesda fixes it with future patches and modders could optimize it further.
I remember when FO4 came out. At the time the 900 series was the best you could have, and a 970 wasn't able to play it on high settings. Looks like it's the same today.
Not "the same today". The 3070 is a beast GPU for 1440p. The hardware isn't what's performing badly in this game!
@@loverwoman Exactly, both games run like shit.
"Please kill me." - My RTX2070 when I start Starfield
Thanks for testing the 3070! I just got one recently from my friend, and I was excited to try out Starfield, but seeing how poor the performance is at 1440p, I think I'll just pass on yet another terribly optimized game. Maybe when the special edition comes out 6 years from now I'll consider trying it. Keep up the great work!!!
It’s the drivers bro
@@AzSureno And how did you reach this conclusion 😂
Maybe in 6 years time? What difference would that make? A heavy game will still be a heavy game on the same pc.
@@KratosisGod A lot of people started rolling back.
Yup. This needs to be the majority response, lest we keep getting un-optimised/broken games.
Can you do some CPU tests to see if the CPU is being utilised efficiently, and whether or not turning off hyperthreading gets better performance? A lot of games this gen run better without hyperthreading for some reason.
I know it might not be your thing to mess with mods for performance, but you should try the DLSS mod, just to see how it stacks up against native and FSR, and whether it really helps or this game is a lost cause.
When you have to use mods on a game to even get it to perform decently, it's already a lost cause.
Using it rn. Honestly not worth it: replacing FSR2 with DLSS just makes my fans louder, though with a lot fewer artifacts and less annoying flicker.
Dude. It’s not even using the entire VRAM buffer! This game is honestly a lost cause.
@@Stockinzs Using it right now. Honestly very much worth it. Massive increase in fine detail quality and reduced shimmer and jaggies.
@@EngineerMikey5 Bro, it's a Bethesda game; they've always sucked without mods.
I have Starfield on my laptop with an RTX 2060, and it runs at low settings. I think it looks good and runs smooth.
Thanks for beta testing the game for 6 days before I can play it
Over 20 hours into Starfield. I've been playing on my 3070 paired with an i7-12700K, ultra settings with 66 percent resolution scale of 4K. Definitely very playable. Not a complete 60 fps lock of course lol, but there are plenty of areas, especially inside buildings and caves and stuff, where it's well over 60.
Right now I'm pretty sure the most popular StarField mod will be an optimisation mod😢
60fps is not good enough anymore, 120 is the new standard. I refunded this overpriced game that looks a decade old, runs worse than any of my decade-old games, and doesn't look much better than them.
There is a mod that changes the filtered lighting, you'll enjoy it a lot more without the blue, brown and green tint in areas.
where is the mod to give me another 60 fps?
@@allofyourdreams We need DLSS 3 modded into this game so it'll be playable.
@@laszlozsurka8991 There's already a DLSS3 mod on Nexus Mods. The frame generation part is paywalled for $5 and exclusive to the 40 series, just like regular framegen.
@laszlozsurka8991 it's already modded in
@@laszlozsurka8991 It already has been modded in, and it looks much better than FSR2.
Replaying Skyrim modded despite having a free copy of Starfield.
"40 is the new 60!" - Todd Howard 2023
So I, with a GTX 1070 and a 4K TV, should NEVER touch this game till I upgrade. Thanks man, you saved me 60 bucks.
I was guessing 1080p medium, but LOW? Bro, can we start optimizing our games again like we used to?
What's scarier about this game's performance, I found, is that it began development in 2015 and was in a playable state by 2018, before the 30 series GPUs were even announced, yet you are looking at a 4070 for 1080p medium/high 60 fps at the bare minimum. I don't understand how, in 2023, a triple-A studio putting millions of dollars into a game can go "yeah, 30 fps for modern consoles, that's good enough; $600+ GPUs for 1080p medium settings on PC sounds good, send it out." I won't pretend the game is ugly, but I also won't pretend it looks better than many games we have seen in the past 4 years.
Normally studios would get nailed in reviews for such a release, but for some reason this game is skating by, getting undeserved praise from everyone, with the only defence I can find on the entire internet being "it will get better in the future, give it 6 months."
The fact that Todd Howard, when asked about the bad optimization in an interview, said "just upgrade your hardware"... I was dumbfounded. How is he saying people just don't have good enough hardware when I've watched multiple 4070 benchmarks that couldn't get any higher than 66 fps on high? The game is optimized horribly.
I'd be surprised if it got 60fps at 1440p high settings. It's a Bethesda game after all. Lazy companies avoid optimization by forcing the use of DLSS. I bought the 3070 as a card for 1440p; I played 1 or 2 games and it became a 1080p card.
I currently have an R5 3600 and a 3070. I finally gave up on trying to get 60fps at 1440p and just locked the game to 40fps in the control panel, set the preset to ultra with 80% render scale, and use a DLSS mod to get away from the FSR problems. Even with all of this it still drops to the high 30s at times, but if you turn off the frame counter it's more or less fine and you get used to it. An absolute shame the game launched in this state, however, as it really is quite a good game.
You ask who it is that decides the program design options. Game devs get kickbacks from card manufacturers for making stuff that doesn't work on old cards, forcing people to upgrade. The tech industry has been exposed using this predatory practice in all aspects of tech; the cell phone and video game companies were exposed all the way back in the early 2000s. Another thing they like to do is introduce dummy processes in updates to make a component seem like it's running slower.
That's how every single PC game is at release. They take the feedback from all machines via Steam, then optimize the game for the cards in use and patch it for performance.
And here I am playing with my 3070 on ultra setting and it looks amazing.
What's your CPU, RAM, and Mobo?
@@MyReligionIs2DoGood 11900k, 32gb & asus prime something
@@BernardoPC117 What's your fps, and do you get stuttering?
@@MyReligionIs2DoGood At 1440p between 50-60 except in big cities like the one in the video it can go down to 40s, no stutters that I can notice.
It's fine bros, just use FSR 2 at 70% resolution scale.