A prominent open-source dev has published findings on what's going on with Starfield's performance, and it's pretty darn strange.
According to Hans-Kristian Arntzen, a prominent open-source developer working on vkd3d-proton, a DirectX 12-to-Vulkan translation layer, Starfield is not interacting properly with graphics card drivers.
The problem is severe enough, in fact, that the translation layer had to be updated specifically to handle Starfield as a special case.
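For readers unfamiliar with translation layers: conceptually, a layer like vkd3d-proton implements the D3D12 API on top of Vulkan, recording an equivalent Vulkan command for each D3D12 call the game makes. A minimal sketch of the idea (illustrative only, not vkd3d-proton's actual code):

```cpp
// Conceptual sketch, not vkd3d-proton's actual implementation: the layer
// exposes D3D12-style entry points but records the equivalent Vulkan
// command into the VkCommandBuffer backing each D3D12 command list.
#include <vulkan/vulkan.h>
#include <cstdint>

struct TranslatedCommandList {
    VkCommandBuffer vk_cmd; // Vulkan command buffer backing this D3D12 list
};

// ID3D12GraphicsCommandList::Dispatch maps almost 1:1 onto vkCmdDispatch.
void Dispatch(TranslatedCommandList &list, uint32_t x, uint32_t y, uint32_t z)
{
    vkCmdDispatch(list.vk_cmd, x, y, z);
}
```

Most calls translate about this cheaply; the trouble starts when a game leans hard on a D3D12 feature with no direct Vulkan equivalent, which is exactly the situation described below.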
"I had to fix your shit in my shit because your shit was so fucked that it fucked my shit"
I'll play in a year, after most of the bug and performance issues are fixed, which seems to be my typical response to any major game release these days: just wait a few months at first.
The only issue I see with the game at the moment is that they didn't use the fly/land/dock sequences to mask the loading times. I think that would enhance the experience a lot.
I wonder if this has anything to do with not being able to load my saves. I went to Mars and exited the game after a long session. Came back the next day and got a full system crash trying to load the exit save. Tried the autosaves: same deal. Tried my last normal save: same deal. About one attempt in every 5-6 full system crashes, I can reload one of the saves from just after landing on Mars, but if I try to enter Cydonia it's a full system crash again. It's weird, too: I can still hear the game running in a loop, but I can tell there's no input and the graphics fully fail.

Very frustrating. I finally got back to my main rig to be able to play, and the game has been flat-out unplayable since about the day after it came out. Can't even get a hotfix from Bethesda. Bummer. I'll just have to wait to play it again; I'm not going to restart a new character just to run into the same thing.
I preferred the Little Mermaid, the Ugly Duckling, and of course the Emperor's New Groove, but his commentary on graphics in Starfield is also a compelling work.
Looks like Hans implemented a workaround in vkd3d-proton 2.10, using the open-source AMD Vulkan driver on Linux (RADV).
Device generated commands for compute
With NV_device_generated_commands_compute we can efficiently implement Starfield's use of ExecuteIndirect which hammers multi-dispatch COMPUTE + root parameter changes.
Previously, we would rely on a very slow workaround.
NOTE: This feature is currently only enabled on RADV due to driver issues.
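For context on what "multi-dispatch COMPUTE + root parameter changes" means: D3D12's ExecuteIndirect can replay a GPU-resident buffer of commands where each entry both updates root constants and issues a compute dispatch. A hedged sketch of that pattern (names, counts, and layout are illustrative, not Starfield's actual code):

```cpp
// Sketch of the D3D12 pattern the changelog describes: a command signature
// whose entries each set root constants and then dispatch compute, replayed
// in bulk from a GPU buffer via a single ExecuteIndirect call.
#include <d3d12.h>

ID3D12CommandSignature *CreateMultiDispatchSignature(ID3D12Device *device,
                                                     ID3D12RootSignature *rootSig)
{
    // Each indirect command: 4 root constants, then a compute dispatch.
    D3D12_INDIRECT_ARGUMENT_DESC args[2] = {};
    args[0].Type = D3D12_INDIRECT_ARGUMENT_TYPE_CONSTANT;
    args[0].Constant.RootParameterIndex = 0;      // assumed root-constant slot
    args[0].Constant.DestOffsetIn32BitValues = 0;
    args[0].Constant.Num32BitValuesToSet = 4;
    args[1].Type = D3D12_INDIRECT_ARGUMENT_TYPE_DISPATCH;

    D3D12_COMMAND_SIGNATURE_DESC desc = {};
    desc.ByteStride = 4 * sizeof(UINT) + sizeof(D3D12_DISPATCH_ARGUMENTS);
    desc.NumArgumentDescs = 2;
    desc.pArgumentDescs = args;

    ID3D12CommandSignature *sig = nullptr;
    // The root signature is required because the commands change root parameters.
    device->CreateCommandSignature(&desc, rootSig, IID_PPV_ARGS(&sig));
    return sig;
}

// Later, one call replays up to maxCommands packed {constants, dispatch} pairs:
//   cmdList->ExecuteIndirect(sig, maxCommands, argumentBuffer, 0, nullptr, 0);
```

Vulkan has no native way to change root parameters from an indirect buffer, which is why vkd3d-proton previously fell back to the very slow workaround mentioned above, and why VK_NV_device_generated_commands_compute helps so much on drivers that expose it.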
I don't imagine it will take long for this to make its way into a Proton Experimental release. Folks with AMD graphics who are comfortable with Linux might want to give it a try.
Do we know for sure that the Starfield devs weren't able to figure out the performance problems? I often find that with companies, the larger they are, the more bureaucracy there is, and the more ticket prioritization becomes a huge deal, to the point where you end up having meetings about how to prioritize tickets.
I would be surprised if the devs didn't already know what was wrong; I think it's more likely that management and higher-ups don't care about them fixing it right now.
I'm amazed that Bethesda has one of the premier game developers in its stable in id Software and didn't bother to just use their shit. Instead, they actively chased their staff away.
I was able to install the DLSS mod, which helped some, but there are still performance issues even when using the DF-optimized settings. I assume this will be fixed with driver and game updates, but who knows how long that will take.
I'm inclined to believe this, and this likely isn't even the whole extent of it. I've been playing on a Series X, but decided to check it out on my ROG Ally. On Low, at 720p with FSR2 on, I'd get 25-30fps somewhere like New Atlantis. I downloaded a tweaked .ini for the Ultra preset, and now not only does the game look much better, but the city is up closer to 40fps, with most other areas being 45-60+. Makes me wonder what they thought justified the massive cost of the default settings, given there's no real visual improvement.
Another odd thing: if I'm playing Cyberpunk or something, this thing is in the 90%+ CPU and GPU utilization range, with temps in the 90°C+ range. Starfield? The GPU reads ~99%, the CPU sits around 30%, and the temp stays at or below 70°C, which basically doesn't happen in any other "AAA" game. I could buy Todd's comments if the frame rate were crap because the hardware was genuinely maxed out, but not getting close to full utilization on a handheld with an APU points to something less simple.
I'm hoping the work from Hans finds its way to all platforms (in one way or another), because I'd love to use the Series X, but 30fps with weird HDR on a 120Hz OLED TV actually makes me a little nauseous after playing for a while, which isn't something I commonly have a problem with.
Typical Bugthesda. I'm only wondering how they got this big by releasing buggy products. I can't for the life of me remember a single product they've made that wasn't a buggy mess the community fixed for them, time and time again, without any compensation. Not only did the community not get any compensation, Bethesda even tried to sell their work and pinch some more money.
I'm convinced large video game publishers make deals with graphics card manufacturers to force end users to upgrade; the AMD and Nvidia deals aren't about free access to new technology, they go to whichever company bids the highest for the chance to sell more cards. There has been little progression in graphics fidelity since 2016. We used to take giant leaps; now we take small, insignificant steps.
It's the same trash engine they've used for 20 years. To be perfectly honest, they should put it in the ground and build a new one from scratch instead of pushing their Frankenstein engine along.
This article is kinda pointless. Go look at the linked PR. The author says his optimizations yield a tiny performance gain. Not particularly worth mentioning.