I still refuse to believe they're not a fake term used to fluff up tech announcements and make shit sound more powerful than it is, because that's a fucking stupid name that nobody should use.
That’s like saying clock rate and core count are fake terms. Sure, by themselves they might not mean much, but they’re part of a system that directly benefits from them being high.
The issue with the teraflops metric is that it's roughly inversely proportional to the bit-width of the data, meaning that teraflops@8-bit is about 2x teraflops@16-bit. So quoting teraflops without specifying the precision it was measured at is almost useless. Although you could argue that 8-bit is too low for modern games and 64-bit trades away too much performance for the accuracy gain, so you can assume the teraflops figures from a gaming company are based on 16-bit/32-bit performance.
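To illustrate the point, here's a rough sketch of how the same GPU produces different "teraflops" numbers depending on which precision gets quoted. The core count and clock below are made up for the example (not actual PS5 Pro specs), and it assumes dual-rate FP16 hardware, where half-precision ops issue at twice the FP32 rate:

```python
def tflops(shader_cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Peak TFLOPS = cores * clock (GHz) * FLOPs per core per cycle / 1000."""
    return shader_cores * clock_ghz * flops_per_cycle / 1000.0

cores, clock = 3840, 2.2          # hypothetical GPU: 3840 shader cores @ 2.2 GHz
fp32 = tflops(cores, clock, 2)    # 2 FLOPs/cycle via fused multiply-add
fp16 = tflops(cores, clock, 4)    # dual-rate FP16: twice the FLOPs per cycle

print(f"FP32: {fp32:.1f} TFLOPS, FP16: {fp16:.1f} TFLOPS")
# Same silicon, same clock; the FP16 number is exactly double the FP32 one.
```

Same chip, twice the headline number, purely from the precision chosen for the spec sheet.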
There will be a single digit number of games for it and all of them will require subscriptions to play and half of them will be canceled +/- 2 months from launch and then impossible to play because the servers are shut down.
How many PS5 Pros will be sold at retail, taken out of the package, hooked up to a TV, and never play a game that you could play on a normal PS5 or even a PS4?
That's interesting and all, but I still don't see a reason to upgrade my PS5 to a Pro, and frankly it wouldn't even be that interesting for the price as a new player either.
Are there like any games that will really make use of the new hardware? Other than perhaps upgraded framerates and better 4K support. The average console player probably isn't going to care that much, not for the giant price increase over minimal gains.
I feel like all games on this generation will still be limited to the base PS5 anyway, can't imagine hardware matters much until the next generation consoles.
People who don't have a gaming PC but still want to game would be the next target audience in line, since they wouldn't have another machine to play third-party games on anyway, so the exclusive would just be a bonus on top.
But I don't think they're even interested in paying so much extra for features they don't care about. Perhaps a smooth high framerate in casual shooters would be something they'd want, but that can easily be achieved on a base PS5 at 60+ FPS. I don't think they're the ones who care about true 4K, 120 FPS, or slightly better textures.
The only thing I can think of that people are hyped up for is GTA6. I fear that Rockstar might sell out to Sony and deliver a shitty 30FPS locked, low resolution and texture version of the game on older PS5 models on purpose, just to "push the hardware" of the newest model. But then again, they also couldn't even be arsed to unlock framerate for RDR2 on PS5, not even after so many years.
I think you can expect about the same as with the PS4 Pro. Maybe finally this time it will be a smooth actual 4k (ok actually, UHD) gaming experience. But that's kinda what we said last time too, so I don't know.
Developers would still have to optimize their games to get the most out of the hardware, unless we're talking about a game that was already performing suboptimally, where throwing raw power at it hides the surface-level problems so it looks smoother.
I would love to see all this horsepower being used to actually make the games better by design, like pathfinding and NPC behaviour. The last big breakthrough we had was raytracing, which proved that it wasn't photorealism that makes it look better, but accurate lighting and shadows. For the consoles it was using an SSD for almost instant loading times.
But I digress. I'm not upgrading my PS5 either, but I can see the value for power users that play competitively or something.
Oh, the 16 gigs is for devs/games and the 2 gigs is exclusively for the system. Was wondering how they were able to get by with only 2 gigs of RAM and 16 gigs of VRAM originally lmao.
Interesting choice of TFLOPS figure to announce. They could've simply claimed 33, put an asterisk noting it's FP16 performance, and called it a day. Instead they're listing AMD's FP32 spec, which diverges from Nvidia's Ampere/Ada, where shader throughput is the same regardless of precision.
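A quick sketch of why the vendor convention matters. The 16.7 FP32 figure is from this thread; the per-precision rate multipliers are the usual assumptions for RDNA-style dual-rate FP16 versus Ampere/Ada-style shader cores, where FP16 and FP32 throughput match:

```python
FP32_TFLOPS = 16.7  # the FP32 spec being quoted

# RDNA-style dual-rate FP16: half precision issues at twice the FP32 rate,
# so the FP16 "headline" number is double.
rdna_fp16 = FP32_TFLOPS * 2

# Ampere/Ada-style shader cores: FP16 throughput equals FP32,
# so quoting either precision gives the same number.
ampere_fp16 = FP32_TFLOPS * 1

print(f"RDNA FP16: {rdna_fp16:.1f}, Ampere/Ada FP16: {ampere_fp16:.1f}")
```

So "33 TFLOPS" and "16.7 TFLOPS" can describe the same AMD GPU, while on Nvidia's recent architectures the two precisions wouldn't produce a bigger marketing number at all.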