Oh no, I'll have to actually play one of the many great games in my library that run perfectly on my hardware instead of buying another remastered game that runs like hot garbage.
The industry needs to appreciate QA and optimization more than ever. I don't feel like getting the latest GPU for a couple of rushed and overpriced games, the same way I don't feel like getting the newest iPhone every year because of social pressure.
I've skipped a couple already. I'm on a 1080 now. It's showing its age a bit but still generally does well at 3440x1440. I will turn settings down as needed to maintain 60-100fps.
Since that gpu has 24 GB of vram the game might be using more than it really needs, just because it can. The best way to test the importance of vram would be to get two cards of the same tier with different vram amounts (like the A770 8GB and 16GB) and see how that impacts performance.
Looked at the review. The 4070 Ti (12GB) and 3090 Ti (24GB) scale similarly until 4K RT / 4K PT, at which point most 12GB cards stop scaling and drop to a couple of fps. The 6700 XT (12GB) and 7700 XT (12GB) don't seem affected in RT. With PT, only the 7700 XT survives, with a whopping 7 fps. A similar thing happens at 1440p to 8GB cards.
According to the posted picture this should happen at 1440p with >14GB VRAM used. It doesn't. 4k native is unplayable territory for every 12GB card anyway
There are also plenty of totally reasonable settings that require less than 12GB, 1440p maximum settings for example. If you want the best of the best, obviously you have to pay for the best of the best.
(It's still a lot and a minimum of 12GB is already ridiculous. I'm just saying the claim of 16GB being not enough is kinda dishonest)
I find myself saying "but why?" for all these spec requirements on Alan Wake 2. Is it some kind of monstrous leap forward in terms of technical prowess? Because usually outliers like this suggest poor optimization, which is bad.
I mean, I know many people like the series. I agree it doesn't seem like it should be terribly demanding though. Maybe I'm wrong and it really is meant to have the best graphics ever, but I suspect that on release we'll see a lot of "meh" and potentially backlash if these reqs don't translate into something no one has seen before.
At the same time, Armored Core 6 has pretty stunning visuals and runs pretty well even on a 2060. Almost like graphics can be done well with a good art style and optimisation, not just throwing more hardware at the issue.
These requirements are such horseshit. What's the point of making everything look hyperrealistic at 4K if nobody can run the damn game without raiding NASA's control room for hardware?
My 2080 is chugging along perfectly fine. I'm actually happy that I didn't upgrade to the 30 or 40 series when I'd have to pay over a grand to get enough VRAM to make this generation of horribly optimized games run properly anyway.
My 3060 Ti has been serving me very well. I've played games that look unbelievably good with it (Death Stranding, for example), but these recent requirements are crazy. Especially with UE5 games, I can't help but think it's just shitty optimization, because they don't look good enough to justify this.
While this is not a good thing, we have to remember that games will take advantage of more resources than needed if they're available. If keeping more things in memory just in case increases performance even a little bit, there's no reason that they shouldn't do it. Unused memory is wasted memory.
The 4070 is consistently faster than the 7800 XT and even the 7900 XT (in ray tracing) in almost all settings. Only at 4K with ray tracing is it VRAM bottlenecked. And even though the 7800 XT and 7900 XT aren't VRAM bottlenecked there, their performance is shit at those settings anyway (sub 30fps), so that's irrelevant.
I don't see how having 20fps is better than having 5fps. Both are unplayable settings for either card.
Wasn't trying to compare to any specific other cards, this game is going to destroy a lot of them. Just commenting on Nvidia skimping on the VRAM for some very pricey cards.
If you'd read the whole thing, you'd have found that those numbers are overblown. Going by them, the 4070 should tank at 1440p and run out of VRAM, but it doesn't.
Even with PT it's fine
Optimization is supposedly fine because it looks the part. Optimization doesn't mean make everything run on old hardware. It means make it run as well as possible. There's only so much you can do while retaining the fidelity they're going for.
Until they start making 4k displays in a 23", I'm not interested.
I'm so sick of monitors getting larger and larger like this. I sit about arm's length from my monitor, and even with a 27" I'm having to physically swivel my head to look at the left and right of the screen.
If I had an ultrawide that curved around me, with software to split it into 3 distinct areas so that windows only popped up in the center of my field of view, or if games let me keep the HUD within about a 60-degree field of view while still rendering the rest of the game in the periphery, I'd be happy.
But you want to fuck up my UI, make it so I have to physically turn my head to see the HUD elements, AAAAAND fuck my framerates hard? Nah. I'll just take the lower fidelity.
It's been interesting seeing the commotion about the performance requirements for Alan Wake 2, but I'm fine with it, since it's an Epic exclusive and not something I'm planning to buy any time soon, if ever.
Most likely I'll end up playing it years later if it's given away, by which time I'll probably have upgraded my hardware.