Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won't anybody think of the poor oligarchs?!?
It's a lot cheaper to buy double the RAM than it is to pay someone to optimize your code.
And if you're working with code that requires that level of resource optimization, you'll invariably end up with low-level code libraries that are hard to maintain.
... But fuck the always-on internet connection and DRM for sure.
Reminds me of a funny story I heard Tom Petty tell once. Apparently, he had a buddy with a POS car and a crappy stereo, and Tom insisted that all his records be mixed and mastered not so that they sounded great on the studio's million-dollar equipment, but so that they sounded great in his friend's car.
Most of the abstractions, frameworks, "bloat", etc. are there to make development easier and therefore cheaper, but to run such software you need more and more expensive hardware. In a way, it just pushes some of the development cost onto the consumer.
Reminds me of the UK's Government Digital Service, which wants to digitise government processes but also has a responsibility to keep those services as accessible and streamlined as possible, so that even a homeless person using a £10 phone on a 2G data connection still has an acceptable experience.
An example: here they painstakingly remove jQuery (most modern frameworks are way too big) from the site and shave 32Kb off the page size.
When my dad died suddenly in 2015 and I cleared out his office at his job, I spun down his Win95 machine that he'd been using for essential coding and testing. My father was that programmer—the one who directly spoke to a limited number of clients and stakeholders because he had a tendency to ask people if they were stupid.
When you see what ONE coder was able to do in the 80s, with 64K of RAM, on a 4MHz CPU, and in assembly, it's quite incredible. I miss my Amstrad CPC6128 and all its good games.
I can think of a few game franchises that wouldn't have trashed their reputation if they'd had an internal rule like "if it doesn't play on 50% of the machines in Steam's hardware survey, it's not going out".
I knew someone who refused to upgrade the programmers' workstations precisely because it would have been a big leap in performance compared to what their customers ran the software on.
Needless to say, the program was very fast even on weaker hardware.
The ideal is "plays fine at the lowest graphics settings on old hardware" while also having high graphics settings that look fantastic but require top-of-the-line hardware to play reasonably.
I think that every operating system needs to have a "do what the fuck I told you to" mode, especially when it comes to networking. I've come close to going full Luddite just trying to get smart home devices to connect to a non-internet-connected network (which of course you can only do through a dogshit app), while my phone constantly tries to drop that network since it has no internet.
I get the desire to have everything be as hand-holdy as possible, but it's really frustrating when the hand-holding way doesn't work and there is absolutely zero recourse, and even less ability to tell what went wrong.
Then there's my day job, where I get to deal with crappy industrial software, flaky internet connections, and really annoying things like Hyper-V occupying network ports when it's not even open.
The thing is that developers tend to keep things as simple as possible and even over-optimize; when you find bloatware, it's usually because some manager decided to have it.
But how would you implement that new Microsoft screenshot-surveillance bullshit feature? Just imagine what a giant waste of resources that is. You have something on your screen which is information, most likely already in a form that's easy to process, like text. But it takes a screenshot every few seconds and uses some "AI" to make the already-existing information searchable again from a fucking screenshot??? Maybe I missed something, but that's how I understood the feature.
In 1000 years, this meme/tweet/post will be what my entire generation's existence is known for. No one will remember the politics, the disasters, or the geopolitical events, good or bad; they will remember our entire world and existence as the only time technological advancement was driven by the big tech mafia trying to see how far it can get its dick into your digital footprint.
It's the new cops-vs-robbers or bootleggers-vs-prohibition race. Our tech keeps getting faster to outrun the corporate fucking malware, but the faster we go, the more they stuff in, so the average user ends up paying $6k for a GPU/CPU combo that runs about as efficiently as my school library's computer did running Oregon Trail on MS-DOS in 1995. You are so confined to functions behind massive fucking app buttons that even logging in as a guest user requires you to memorize every CLI ever made.
It's become my defining "I don't want to live in this world anymore" moment.
I make sure my own web game can run smoothly on crappy hardware. It runs well on my gaming laptop downclocked to 400MHz with a 4x slowdown set by Chrome. It also loads in a couple seconds with a typical crappy Internet connection of 200kbps and >10% packet loss. However, it doesn't run smoothly on my Snapdragon 425 phone or my old Core 2 Duo laptop. Is this my game or just browser overhead?
I wrote an email service called Port87, and I did it on a really low end laptop (an Ideapad 3 from 2021) to make sure that it works well, even on a potato.
This is like the definition of a "conservative". Progress shouldn't happen because they're not ready for it. They're comfortable with what they use and upset that other people are moving ahead with new things. New things shouldn't be allowed.
Most games have the ability to downscale so that people like this can still play. We don't stop all progress just because some people aren't comfortable with it. You learn to adjust or catch up.
I played Skyrim on an i3-4005U with integrated graphics and 4GB of RAM when I was in high school. The devs did epic work in the 7th console generation, given the limitations: 512MB of shared memory and a ~250 GFLOPS GPU.