If it ain't broke
4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.
Yeah. Once games are rendering 120fps at a native 6K downscaled to an amazing looking 4K picture, then maybe you could convince me it was time to get an 8K TV.
Honestly most people sit far enough from the TV that 1080p is already good enough.
I find 4K is nice on computer monitors because you can shut off anti-aliasing entirely without leaving jagged edges behind. 1440p isn't quite enough to get there.
Also, there's some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.
I’m set up to THX spec, 10 feet from an 85-inch. I’m right in the middle of 1440p and 4K being optimal, but with my eyes I see little difference between the two.
I’d settle for 4k @ 120 FPS locked.
*monkey's paw curls*
Granted! Everything's just internal render 25% scale and massive amounts of TAA.
He said next-gen not current gen. :/
For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.
TV makers DON'T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.
Divide the resolution by 3, though; current-gen upscale tech can give that much: 4K = upscaled 720p and 8K = upscaled 1440p.
can doesn't mean should.
720p to 4K using DLSS is okay, but you start to see visual tradeoffs strictly for the extra performance.
To me it really shines at 1080p to 4K, where it's basically indistinguishable from native for a still-large performance increase.
Or even 1440p to 4K, where it actually looks better than native with just a moderate performance increase.
For 8K that same setup holds true: go for better-than-native or match native visuals. There is no real need to go below native just to get more performance. At that point the hardware is mismatched.
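To put rough numbers on that, here's a minimal sketch; note the 3x-linear-upscale figure is the earlier comment's claim about current-gen upscalers, not an official DLSS/FSR spec:

```python
# Back-of-the-envelope: internal render resolution for a given linear
# upscale factor. The 3x figure is the comment's claim, not a vendor spec.
def internal_resolution(out_w, out_h, linear_scale):
    """Internal render size when upscaling by `linear_scale` per axis."""
    return out_w // linear_scale, out_h // linear_scale

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    iw, ih = internal_resolution(w, h, 3)
    print(f"{name}: {w}x{h} rendered internally at {iw}x{ih}")
# 4K: 3840x2160 rendered internally at 1280x720  (720p)
# 8K: 7680x4320 rendered internally at 2560x1440 (1440p)
```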
A few years ago, I got a good deal on a 4K projector and set up a 135" screen on the wall. The lamp stopped working and I've put off replacing it. You know what didn't stop working? The 10+ year old Haier 1080p TV with a ding in the screen and the two cinder blocks that currently keep it from sliding across the living room floor.
Why does it slide across the floor? Do you live on a boat?
I wish. It's sitting on the floor and there's a rug, so the cinder blocks are in front of it at the corners. Now my bed is a little more saggy. I might need some new furniture.
The lamp stopped working and I've put off replacing it.
If you still have it, do it. Replacing the lamp on a projector is incredibly easy and only takes like 15 minutes.
If you only order the bulb without the casing, it's also very cheap.
Yep! I bought a model with pretty cheap/easy replacement bulbs. I just need to actually pull the trigger and replace it.
Yep, I feel the same. I love how old stuff seems to last longer and longer while the new stuff breaks just out of the blue.
Has anyone else here never actually bought a TV? I've been given 3 perfectly good TVs that relatives were gonna throw out when they upgraded to smart TVs. I love my dumb, free TVs. They do exactly what I need them to and nothing more. I'm going to be really sad when they kick the bucket.
I've been using the same two TVs since 2008 and I have zero desire to upgrade.
I was given a free, very decent, dumb TV and upgraded it to a smart TV with a $5 Steam Link and ran a Cat 6 cable to it from my router. Best $5 ever. Have no intention of buying a new one. If I ever do, I will try my hardest to make sure it's a dumb one. I know they sell "commercial displays" that are basically a TV with no third-party apps or a way to install them.
Any TV is a dumb TV if you plug a Kodi box in the HDMI and never use the smart trash.
I set up a TV for my mother-in-law. No joke, I had to register with an email before it would let me switch to HDMI.
Yes, people like me buy TVs. I'm the guy who keeps giving away perfectly good TVs to other people because I've bought a new one and don't want to store the old one. I've given away 2 smart TVs so far, though I'm not sure what I'll do with my current one when I inevitably upgrade.
I've bought my TVs because all my relatives are the same as us. My mom finally tossed an old CRT TV a couple of years ago because it started having issues displaying colours correctly.
I used my family's first HDTV from 2008 up until last year, when my family got me a 55" 4k TV for like $250. Not gonna lie, it's pretty nice having so much screen, but I'm never getting rid of the ol' Sanyo.
One of my TVs was given to us by my mother-in-law, but we did buy the other one. Before the 'smart' TV era though.
That's how I feel when people complain about 4k only being 30fps on PS5.
I laugh because my 1080p tv lets the PS5 output at like 800fps.
But the PS5 is still limited to 30 fps even at 1080p.
The 120 fps at 1080p on the PS5 in front of me says your comment is mistaken.
No, the PS5 can output higher FPS at 1080p.
What you might be thinking of is refresh rate, which yeah, even if the PS5 was doing 1080p/60fps, if you for some reason have a 1080p/30hz TV, you won’t be able to see anything above 30fps.
Jokes on you -- I'm still using the last TV I bought in 2005. It has 2 HDMI ports and supports 1080i!
I miss this the most; older TV models would have like over 30 ports to connect anything you wanted. All newer models just have like one HDMI connection, if even.
To add, these older screens last. New stuff just dies after a few years, or gets killed with a firmware upgrade.
PSA: Don't connect your "smart" appliances to the internet, folks.
My Smart TV made itself dumb when its built-in wifi just died one day. No loss.
We had an older Hitachi tv with 4 HDMI plus component plus RCA input and 4 different options for audio input.
New Samsung TV: 2 HDMI, that's it. One is ARC, which is the only audio interface besides TOSLINK, so there's effectively 1 HDMI to use.
But of course all the lovely spyware smart features more than make up for it.
Is that a joke? My old TV has 3 and that's the only reason I can still use it. 2 of them broke over the years.
I've got 4 HDMI 4k 120hz ports on my LG...
Are you telling me there are modern TVs with only 1 HDMI port??
I was curious, so I went and browsed some budget TVs on Walmart’s website. Even the no-name budget ones all had 3 HDMI. Maybe if it’s meant to be a monitor instead of a living room TV, but I just looked at living room style TVs.
I feel like the only way you'd get one with a single HDMI port is with models built specifically for Black Friday (to maximize profit by cutting out features).
Mine's running a Wii on component YPbPr video. Looks mint!
They should rather focus on 4k@60fps before doing shit like 8K which nobody needs.
It’s a chicken/egg problem. We need 8K so we can use bigger TVs, but those bigger TVs need 8K content to be usable.
What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV. Only suckers buy into 8k. Same people who bought those rounded screen smartphones thinking it will be the new thing. Where are those phones now?
I've never even had 4K. All I have is 1080p and that's fine.
Same. I can't tell a huge difference between 1080p and 4k, if I'm being honest.
Hell, I can’t tell ANY difference (though I do need glasses so maybe that’s got to do with it)
haven't tried it but I'm pretty sure you can't tell 8K from 4K anyway
I can tell a difference, but not enough to be worth the cost.
Context matters a lot. On a 27" monitor, it makes a pretty decent difference. On a 50" TV at 10+ ft...meh?
1080 vs 2k is pretty clear to me, but I have a hard time telling the difference between 2k and 4k.
I have a 4k TV, it legitimately is no better than 1080 lmao
There's a very noticeable difference, but it's nothing like the difference between SD and HD. It's pretty, but not that pretty. I prefer the performance (and proper scaling for my computer) of 1080, even on a 55" screen
Could this be a configuration issue? I can't speak from experience, but I'd assume it would be quite a bit better.
Thanks for the info anyway.
P.s. I'm not the person who downvoted you. I don't do that when disagreeing.
4k is the reasonable limit, combined with 120 FPS or so. Beyond that, the returns are extremely diminished and aren't worth truly considering.
8k is twice as big as 4k so it would be twice as good. Thanks for coming to my ted talk
That would sure be something if it was noticeably twice as good, haha.
But it got 4 times the pixels, so 4 times as pixely!
8k is 4 4k tvs, so 4 times as good?
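For what it's worth, the pixel math in this joke chain checks out; a quick sketch using the standard 16:9 resolutions:

```python
res = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
pixels = {name: w * h for name, (w, h) in res.items()}
print(pixels["8K"] / pixels["4K"])     # 4.0 -> twice as wide, twice as tall
print(pixels["4K"] / pixels["1080p"])  # 4.0 -> same 4x step from 1080p to 4K
```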
8k makes sense in the era of VR I guess. But for a screen? Meh
Even that's a big stretch, haha.
480 720 1080 1440 4k is as much as anyone's gonna need, the next highest thing doesn't look that much better
There are legitimately diminishing returns, realistically I would say 1080p would be fine to keep at max, but 4k really is the sweet spot. Eventually, there is a physical limit.
4K I'd agree with, but going from 120 to 240 fps is notable.
Perhaps, I suppose that can get upped.
One of my TVs is 720p. The other is 1080p. The quality is just fine for me. Neither is a 'smart' TV and neither connects to the internet.
I will use them until they can no longer be used.
The last TV I owned was an old CRT that was built in the 70s. I repaired it, and connected the NES and eventually the SNES to it. Haven't had a need for a TV ever since I went to university, joined IT, and gained a steady supply of second hand monitors.
We are at a point where 4k rtx is barely viable if you have a money tree.
Why the fuck would you wanna move to 8k?
I'm contemplating getting 1440p for my setup, as it seems a decent obtainable option.
8k 15fps will be glorious.
lol
It's all about dlss
And getting the newest GPU every year because they lock you out of the most recent DLSS update when you don't upgrade to the newest lineup, right?
You don't have to play only 2023-2024 games. I play GTA V at ultra settings and get 4K@60 FPS. My GPU is a $150 1080 Ti.
If we're comparing the latest tech then I'd like to be playing the most recent gen games. GTA V feels as old as San Andreas, in a few years my phone should be running it fine.
I have a 4K 120 Hz OLED TV. The difference is quite drastic compared to my old 1080p LED. It's certainly sharper, and probably the practical limit. I've also seen 8K, and, meh. I don't even care if it's noticeable; it's just too expensive to be worthwhile. We should just push more frames and lower latency for now, or, the Gods forbid, optimise games properly.
I feel like resolution wasn't much of an issue even at 1080p. It was plenty. Especially at normal viewing distances.
The real advantages are things like HDR and higher framerates including VRR. I can actually see those.
I feel like we're going to have brighter HDR introduced at some point, and we'll be forced to upgrade to 8K in order to see it.
Ehhhh, I think 1080p is definitely serviceable, it's even good enough for most things. However, I think 1440p and 4k are both a pretty noticeable improvement for stuff like gaming. I can't go back to 1080p after using my 3440x1440 monitor.
Depends entirely on the size of the screen.
A normal monitor is fine on 1080p
But once you go over 40", a 4K is really nice
Ok, but will you be able to use it in 2036?
Will the planet survive until 2036?
I've heard recently that there's "cheap OLED" and "expensive OLED." Which one did you go for? I've got a 75" 4k OLED for $400 and it's definitely super dark. I can't even watch some movies during the day if they're too dark. The expensive ones are supposed to be a lot better.
A Sony Bravia, so decently high end probably.
I've got an older Sony bravia A9G and I've seen reviews complaining that it's too dim but I've had no issues. I think some people just have really poorly thought out tv placement, or overly bright rooms. Also just close the curtains if the movie is dark...
If you want to watch tv outside in direct sunlight you'll need to follow this guide to build a custom super bright tv: https://youtu.be/WlFVPnGEb8o
Too expensive, both in terms of price and the massive amount of storage needed for 8K video. I don't really think 8K is ever going to be the dominant format. There's not much point in just increasing resolution for minuscule gains that are almost certainly not noticeable on anything but a massive display. Streaming services are going to balk at 8K content.
The only time I replace electronics anymore is when something breaks or when I'm gifted someone else's hand-me-downs
I have everything on an upgrade list depending on how much we use it and how fast the technology is changing.
Phones: 3 years. Thinking of moving this to 4 or 5 years with the industry's stagnation. Starting to see some companies offering updates for longer times.
Laptops/desktops: 5-6 years.
Wifi/modem/router: 10 years.
3 years for a phone is very low. Maybe change the battery and you can keep it for 3 more years. Though you need to buy phones with custom ROM support.
My notebook is 9 years old. My desktop is 6 years old. I haven't found a reasonable argument to replace them until they stop working. Why 5-6 years?
Me still rocking the 1080p 42 inch I bought off a coworker for $50 10 years ago
I mean, you can get 4K TVs for cheap and fix them (as long as the display is NOT damaged; once that's gone, the TV is nothing but scrap).
Got a 60 inch 4K HDR TV for free off Facebook, the led backlights had just gone out. $20 for a replacement set, 2 hours of my time and a couple cuts on my hand and it's been a fantastic TV since lmao
The performance difference between 1080p and 720p on my computer makes me really question if 4k is worth it. My computer isn't very good because it has an APU and it's actually shocking what will run on it at low res. If I had a GPU that could run 4k I'd just use 1080p and have 120fps all the time.
1440p is the sweet spot. Very affordable these days to hit high FPS at 1440 including the monitors you need to drive it.
1080@120 is definitely low budget tier at this point.
Check out the PC Builder YouTube channel. Guy is great at talking gaming PC builds, prices, performance.
TL;DR: higher resolutions afford greater screen sizes and closer viewing distances.
There's a treadmill effect when it comes to higher resolutions
You don't mind the resolution you're used to. When you upgrade the higher resolution will be nicer but then you'll get used to it again and it doesn't really improve the experience
The reason to upgrade to a higher resolution is because you want bigger screens
If you want a TV for a monitor, for instance, you'll want 4K because you're close enough that you'll be able to SEE the pixels otherwise.
As long as you don't know that there's anything better, you will love 1080p. Once you've seen 2K you don't want to switch back. Especially on bigger screens.
On the TV I like 1080p still. I remember the old CRT TVs with just bad resolution. In comparison 1080 is a dream.
However, if the video is that high in quality, you will like 4K on a big TV even more. But if the movie is only 720p (like most DVDs or streaming services), then 4K is worse than 1080p; you need some upscaling in order to get a clear image.
You don’t mind the resolution you’re used to. When you upgrade the higher resolution will be nicer but then you’ll get used to it again and it doesn’t really improve the experience
This is sort of how I feel about 3D movies and why I never go to them. After about 20 minutes, I mostly stop noticing the 3D.
My son is on his 3rd DualSense controller in about 18 months.
Yesterday I plugged my Xbox 360 controller into my Steam Deck and played Halo 3 like an OG.
If you had told someone 10 years ago that you could play Halo 3 on a handheld running Linux with an OG Xbox 360 controller on Steam, they would call you crazy.
Halo 3 is seventeen years old. Ten years ago, a seventeen-year-old game would be something like Quake 2 or Castlevania: Symphony of the Night, both of which could easily be run on handhelds by that time.
The fuck is your son doing to those poor $80 controllers?
Dreaded stick drift. Quite common.
I had mine maybe 8 months before the left stick started drifting hard. Completely unusable. And Sony wanted me to go through all these hoops AND spend like 20 bucks to ship it to them.
Ended up getting an 8BitDo Ultimate instead, and so far it’s worked great! Has Hall effect joysticks too, so no chance of drifting ever. The major console makers NEED to switch to HE for the next gen.
I'm still with my first dualsense, my dualshocks from PS3 and PS4 still work without any issues. I don't want to know what people do to their controllers.
My Xbox Series S controller got stick drift like 3 months after I got it. My friend's finally succumbed last week, after about a year of owning it. What is it with stick drift on new controllers? Seems like every modern system has the exact same problem
My 46" Sharp Aquos that I paid $2,000 for in 2004 is still chugging along like a champ. It's been used nearly daily.
Cherish it (though maybe not its power requirements?) - based on the big ole chunky bois I’ve seen at the dump 📺 (looked like those rear projector models or something).
Same here. 40” Sharp Aquos quattron not only still working, but working flawlessly. It’s also got way more inputs than any TV that size today, and a stand that swivels that I use all the time. I’m in no hurry to replace it.
Televisions are one of the few things that have gotten cheaper and better these last 20 years. Treat yourself and upgrade.
Except they turned into trash boxes in the last couple of years. Everything is a smart TV with ad potential and functionality that will eventually be unsupported. I’m holding onto my dumb TVs as long as I can.
Yup. Those cheap TVs are being subsidized by advertisements that are built right in. If you don't need the smart functionality, skip connecting it to the Internet. (If you can. Looking at you, Roku TVs!)
look up "commercial displays" or "commercial tvs" when the time comes.
We’ve got a pair of LG C1 OLEDs in the house, and the best thing we did was remove any network access whatsoever. Everything is now handled through Apple TVs (for AirPlay, Handoff etc.), but literally any decent media device or console would be an upgrade on what manufacturers bundle in.
well you can just not connect it to the internet and still have some extra features.
also if it's an android tv, it's probably fine (unless you have one with the new google tv dashboard)
these usually don't come with ads or anything except regular google android tracking, and you can just unpin google play movies or whatnot.
But be careful of the "smart" ones. If you have a "dumb" one that is working fine, keep it. I changed mine last year and I don't like the new "smart" one. IDGAF about Netflix and Amazon Prime buttons or apps. And now I'm stuck with a TV that boots. All I want is to use the HDMI input but the TV has to be "on" all the times because it runs android. So if I unplug the TV, it has to boot an entire operating system before it can show you the HDMI input.
I don't use any "smart" feature and I would very much have preferred to buy a "dumb" TV but "smart" ones are actually cheaper now.
Same for my parents. They use OTA with an antenna and their new smart TV has to boot into the tuner mode instead of just... showing TV. Being boomers they are confused as to why a TV boots into a menu where they have to select TV again to use it.
New TVs may be cheap, but it's because of the "smart" "spying" function, and they are so annoying. I really don't like them.
Yeah the bootup kills me. I got lucky that my current tv doesn't do it. But man the last one I had took forever to turn on. It's stupid.
Can't speak for your TV, but mine takes all of 16 seconds to boot up into the HDMI input from the moment I plug it in, and there's a setting to change the default input when it powers on. I use two HDMI ports so I have it default to the last input, but I have the option to tell it to default to the home screen, a particular HDMI port, the AV ports, or antenna
Not a fan of the remote though. I don't have any of these streaming services, and more importantly I'll be dead and gone before I let this screen connect to the Internet
Why give up a perfectly usable TV?
Why get a new pair of glasses when your prescription increases? My old glasses are still perfectly usable.
I've been in stores which have demonstration 8K TVs.
Very impressive.
I'm still fine with my 720p and 1080p TVs. I've never once felt like I've missed out on something I was watching which I wouldn't have if the resolution was higher and that's really all I care about.
I think the impressiveness likely has more to do with facets other than the resolution. Without putting your face up to the glass, you won't be able to discern a difference; human visual acuity just doesn't get that high at a normal distance of a couple of meters or more.
I'd rather have a 1080p plasma than most cheap 4K LCDs. The demonstrators are likely OLED, which means supremely granular control of both color and brightness, like plasma used to be. Even nice LCDs have granular backlighting, sometimes with something like a 1920x1080 array of backlights, to be close enough to OLED in terms of brightness control.
I have a 4k tv with backlighting that matches the screen. When I take magic mushrooms and watch it I can see god
My takeaway from this comment section is that smart TVs are straight from hell and should be treated as such. It is very important that you get a TV from BEFORE smart TVs were a thing.
Display technology has advanced quite a bit since smart tvs have become ubiquitous, though. So you are sacrificing quality to avoid those headaches.
Personally I just don't give my smart TV an internet connection.
That's what I did too. It has no connection and I don't use any of the smart TV features. Instead I have my own box I'm using. I never felt this stupid.
Nah, you can buy new TVs
Just make sure they can be used without network and then never connect them to the internet.
Bought a new TCL recently, none of the smart features work, but got excellent screen quality with all the new specs.
Mine is that I’m not in the same demographic as most Lemmy users who comment here.
Lol my phone has the best GPU and display in my house, and has raw specs of half the ram and cores of my 2012 desktop 😹
At work all day I remind people that a container with 1 vcpu and 2GB of ram is like running on a ten year old phone, theoretically 🙃
I game at 1440p. The day my 1080ti dies will be a sad day indeed.
I went from a 1070Ti to a €600 4070.
There was no need whatsoever for that. I have noticed almost no difference at 1440p 165Hz except newer games go like 20 fps higher.
I was thinking VR would be hugely different. It wasn't, and now my headset cable is broken so I can't even game in VR.
Honestly don't see the necessity. I've had the same computer monitor for 17 years.
Senseless tech lust
Or maybe most people that sit in front of a monitor all day are working and can benefit from a sharper image and more real estate. I work in tech and end up needing a lot of windows and terminals on the screen at once - upgrading from 1080p to 1440p was a game changer for productivity.
1050p 16:10, 1200p 16:10, or 1200p 4:3?
Just a normal 1080p 16:9. Wish I got a 16:10.
How many Ks is real life resolution and at how many fps does it run?
Whatever the resolution of 'real life', what matters is at what point our little eyes and brains can no longer perceive a difference.
In average scenery, the general consensus is about 60 pixels per degree of vision. If you have something a bit more synthetic, like a white dot in empty space, that sort of small, high-contrast detail would take maybe 200 pixels per degree to ensure the white dot is equally visible on the display versus seen directly. A 75" display 2 meters out at 4K is about 85 pixels per degree. This is comfortable enough for a display.
It's a similar story with 'frames per second'. Move something back and forth really fast and you'll see a blurry smear of the object rather than observing its discrete movement. So if you accurately match the blurring you would naturally see and do a low-persistence backlight/display, you'll get away with probably something like 60 FPS. If you are stuck with discrete representations and are unable to blur or turn off between meaningful frames, you might have to go a bit further up, to like 120 or 144 FPS.
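If you want to sanity-check that 85 pixels-per-degree figure, here's a minimal sketch (assumes a flat 16:9 panel and measures the horizontal field of view):

```python
import math

def pixels_per_degree(diag_inches, horiz_px, distance_m, aspect=(16, 9)):
    """Approximate horizontal pixels per degree for a flat panel."""
    aw, ah = aspect
    width_m = diag_inches * 0.0254 * aw / math.hypot(aw, ah)  # panel width in meters
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return horiz_px / fov_deg

print(round(pixels_per_degree(75, 3840, 2.0)))  # ~85, matching the figure above
```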
It's possible to argue motion blur looks better, but at least in Rocket League, it makes it insanely hard to play
I think it's about 1044 fps, give or take.
I feel like it's kinda infinite, because you can zoom in to the quantum level and then looking at things sorta fails you... But I'm no scientist.
The question isn't how high the resolution of reality is, but how well we can process it. There is an upper limit to visual acuity, but I'd have to calculate what an arc-minute at 6 meters would be and I'm too lazy right now. Regarding fps, some people can notice artefacts up to 800 Hz, but I'd think going with 120 Hz would be OK. Remember, you'll have to generate stereoscopic output.
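For the lazy: one arc-minute (the classic 20/20 acuity limit) at 6 meters is easy to work out; a quick sketch:

```python
import math

def arcminute_subtense_mm(distance_m):
    """Size of a detail that subtends one arc-minute at a given distance."""
    return distance_m * math.tan(math.radians(1 / 60)) * 1000

print(f"{arcminute_subtense_mm(6.0):.2f} mm")  # ~1.75 mm at 6 meters
```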
But I asked how much, not how well. I wanna know about the first question, not the second 🥹
Both are practically infinite, or well, the question doesn't really make sense.
Reality isn't rasterized, so there's no resolution. You just have light waves bouncing off of things and into your eyes. They can hit at all kinds of angles and positions, and your brain will interpret different impact frequency distributions as some color or brightness you see in a certain position.
And you don't have a shutter in your eyes or something else that would isolate individual frames. Light waves just arrive whenever they do and your brain updates its interpreted image continuously.
So, in principle, you can increase the resolution and display rate of a screen to infinity and you'd still perceive it differently (even if it's not noticeable enough to point it out).
The cost just goes up ever more and the returns diminish, so the question rather has to be, whether it's worth your money (and whether you want to sink that much money into entertainment in the first place).
1080P VA panels FTW!
It’s funny that we got to retina displays, which were supposed to be the highest resolution you’d ever need for the form factor, and then manufacturers just kept making higher and higher resolutions anyway because Number Go Up. I saw my first 8K laptop around this time and the only notable difference was that the default font size was unreadable.
That kid was about as cool as kids could be back then. I wonder what he's up to today.
https://knowyourmeme.com/memes/brent-rambo
Work(s/ed?) for Sony as of 2013 and worked on Planetside 2. Pretty cool!
Work(s/ed?) for Sony as of 2013
Funny turn of events considering the meme
4K looks the same as 8K on a 46" TV
Resolution =/= graphics
I enjoy 4K on the monitors I sit only a few inches from all day, but so far I find it hard to justify a whole chain of upgrades for the living room when I think the picture quality already looks great at 10+ feet away or whatever. To be clear, I mean I don't see the need to upgrade the living room from 1080p to 4K, let alone beyond that.
It really depends on the size of the TV. It's like a cinema screen: you want very high resolution even though it's far away, because it's such a large size.
It depends on how much of your FOV the screen covers, since it’s the angular resolution of our eyes that matters.
I think mine is 56 or 58 inches. A lot of people have commented that it's large. It feels like the right size to me /shrug
bro i just want full raytracing-
Think we're still a few gens from that.
This will either require AMD to go hard on ray tracing or for console manufacturers to get their video hardware from Nvidia, which will be far more expensive.
Though after some brief searching, my literal terminology may apply to AMD’s strategy: https://www.pcgamesn.com/amd/rdna-5-ray-tracing
Sony Bravia Z series. Bought in 2010 I think? Still works like a charm!
I can't really imagine being close enough to any screen where I need more than 1080p. I'm sitting across the room, not pressing my face against the glass.
This is me, but TV is from 2009.
CRT for life
I don't play games on my TV, but I have a really old 1080p one with native Plex and YouTube apps and no nonsense. I have seen the ads and other stupid bullshit modern TVs come with; I'm going to be fixing this TV up until my dying breath.
Don't worry. Either the PS9 won't need a screen or we can sue for false advertising. https://youtu.be/iwhPkBHLdqE?feature=shared
I saw a license plate the other day that said "PSX 12" and I was like "they're not even on 6 yet. Is this guy from the future?"
I legit just had an Olevia-branded 37 inch TV I’ve had since 2007 bite the dust finally. 16 years was a hell of a run. It cost me $600 at the time, which works out to roughly $37.50 per year of use. RCA ports went out partially ages ago but the HDMI just kept ticking. It was an LCD and I never had a single pixel die out on me. Played everything from GameCube-Wii-Switch, PS2-4, OG Xbox-360-Xbox One, and ran a Chromecast constantly for the last 3-4 years. Felt like I was putting a dog out to pasture. Loved that bad boy.
I was using an old plasma screen from around 2008/9 for a while until my girlfriend's sister's exboyfriend stole it. Their dad gifted me a 55in 4k tv that wasn't bid on at an auction he was running. That plasma is probably gonna burn down Shithead's place at some point, it was pretty sketchy.
I'd love to see the average iPhone fan (wastes $1,000 every year) meet you.
Credit where it’s due, since the post was about using old devices: iPhones have consistently had some of the longest software support in the industry.
Good point, I feel like OP would have an iPhone due to this.
It's so funny to me; for a few hundred more you can get an Android that unfolds into a tablet lol. If you're going to drop a grand on a phone, why not spend a little more and get something fresh?