And video quality. Watching some historical videos from my childhood, like TV shows on YouTube... the quality is pure potato. Either the archiving is terrible, or we just accepted much worse quality back then.
People always said that Betamax was better quality than VHS. What never gets mentioned is that regular consumer TVs at the time weren't capable of displaying the difference in quality. To the average person they were the same.
VHS was capable of not-bad quality; people just had a lot of bad equipment.
Some TV shows (if they were crazy) were shot on film, so you could re-digitize them now in 4K or 8K and they'd look amazing. But there was also a lot of junk out there.
And as others have mentioned, if you do an awful job of digitizing it, you can take something that looked good and throw all of that quality away. And if the tape wasn't stored in good condition, it can be a struggle to digitize properly in the first place, even when done right.
There's a lot of archival video that is just terrible. Digital video compression issues have damaged a lot of old footage that's gotten shared over the years, especially YouTube's encoders. They will just straight up murder videos to save bandwidth. There's also a lot of stuff that just doesn't look great when it's being upscaled from magnetic media that's 320x240 at best.
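As a toy illustration of why that footage looks so chunky when blown up (not any real video pipeline — the function name and the framebuffer-as-list-of-rows format here are invented for the example): naive nearest-neighbor upscaling just copies each source pixel into a bigger block, so no detail is added, and the blockiness is simply magnified.

```python
def upscale_nearest(frame, factor):
    """Nearest-neighbor upscale: each source pixel becomes a
    factor x factor block of identical pixels. No new detail
    appears -- the low-res blockiness is just enlarged."""
    out = []
    for row in frame:
        # Repeat each pixel `factor` times horizontally...
        wide = [px for px in row for _ in range(factor)]
        # ...then repeat the widened row `factor` times vertically.
        out.extend(list(wide) for _ in range(factor))
    return out

# A 1x2 "frame": one black pixel, one white pixel.
frame = [[(0, 0, 0), (255, 255, 255)]]
big = upscale_nearest(frame, 2)  # now 2 rows of 4 pixels
```

Real upscalers use smarter interpolation, but the fundamental problem is the same: you can't interpolate detail that was never recorded.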
However, there's also a lot of stuff that was bad to begin with and just took advantage of things like scanlines and dithering to make up for poor video quality. Take old games for example. There's a lot of developers who took advantage of CRT TVs to create shading, smoothing, and the illusion of a higher resolution that a console just wasn't capable of. There's a lot of contention in the retro gaming community over whether games looked better with scanlines or if they look better now without them.
Personally, I prefer them without. I like the crisp pixelly edges, but I was also lucky enough to play most of my games on a high quality monitor instead of a TV back then. Then emulators, upscaling, and pixel smoothing became a thing...
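For what it's worth, the scanline look being argued about can be faked in a few lines; this is a toy sketch of the kind of filter emulators offer (the function name and the pixel-rows format are made up for the example, not taken from any actual emulator). On a real CRT, the darkened gaps between scan lines came "for free" and helped blend chunky pixel art.

```python
def apply_scanlines(frame, darken=0.5):
    """frame: list of rows, each row a list of (r, g, b) tuples.
    Returns a copy with every odd row darkened by `darken`,
    mimicking the dark gaps between a CRT's scan lines."""
    out = []
    for y, row in enumerate(frame):
        if y % 2 == 1:
            # Odd rows get scaled toward black.
            out.append([(int(r * darken), int(g * darken), int(b * darken))
                        for r, g, b in row])
        else:
            out.append(list(row))
    return out

# A 2x2 all-white "image": row 0 stays white, row 1 becomes mid-grey.
frame = [[(255, 255, 255)] * 2 for _ in range(2)]
shaded = apply_scanlines(frame)
```

Real CRT shaders also model phosphor masks, bloom, and curvature, which is why a simple every-other-row filter never quite matches the original look.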
This video was exactly what first came to mind when I read "badly understandable dialogues"! It bothers me that as we got better mics, the actors became more unintelligible instead of the other way around, as one would predict.
I hear this all the time, and maybe I just don’t watch THAT many shows/movies, but I haven’t come across anything where the actors sound like they’re mumbling. Do you have a few examples I could look up?
I've used subtitles for most of my adult life, ever since having kids. First it was so I could watch without waking the baby, and then it was so I could follow along over all the noise in the house. And I never went back. So as sound mixing changed and got muddier, I guess I didn't notice, because I was already used to not being able to hear half the dialogue anyway.
But then you also have that very specific window of time when a lot of stuff, especially SFX, was done on video that can't be upscaled. Babylon 5 fans weep.
When I was a kid I used to think black and white meant the TV show or whatever used to be in color, but since it got old it turned black and white. My thought process was that they changed color just like old people's hair turns grey... This was 35 years ago, before the internet.
Yeah, but it's more complicated than that. They colorized a lot of movies after the fact, and the colors were always extremely bright, kinda like when people would color their hair extremely bright. On the contrary, I'm not very bright.
No, that was just for effect. Notice that all the scenes set in Kansas are B&W (even the ones at the end), and all of Oz is in color. It gave the place an extra kind of quality above the B&W pictures audiences were used to. I have heard that people in the cinemas gasped in surprise when the switch happened.
That's such a trip. Only a 6 year difference between the two of you, yet you experienced the dawn of something and they didn't, and it shapes both of your perspectives so much.
Even though it technically applies to transistors, Moore's Law has been a good barometer for the increase in complexity and capabilities of technology in general. And now, because of your comment, I'm kinda thinking that since the applicability of that law seems to be nearing its end, either tech will stagnate in the next decade (possible, but I think unlikely), or we may be due for another leapfrog into a higher level of sophistication (more likely).
I sometimes watch old movies and it gets infuriating how long they talk around the same fact that everyone already agrees on. Yes, he was killed with a knife because it's still stuck in his head, now move on!
The pacing of the oldest movies comes from the theater. Watch a live play and it will seem well paced and 'natural.' Fast cuts from TV ads made people want a faster pace in television shows. Back in the day, 'Miami Vice' was cutting edge because they were the speediest.
Especially in action scenes. I used to watch the 2010 version of Hawaii Five-0, and sometimes a channel showed the old version with the same name; they are so incredibly different in pacing and the amount of violence. I really liked the old one in that regard, much less shooting and blood.
Re-watching Buffy the Vampire Slayer with my kids in new hi-def, and you can clearly and easily spot the stunt doubles, and the SFX look really dated now that you can see them clearly.
It's amazing what old CRTs would let you get away with.
The SFX were limited by the tools (and budget) they had, but there were a lot of aspects of set design and stunt doubling where an SD TV show could get away with more than a movie shot on film. When HDTV started, even news shows were forced to drastically improve the quality of their set pieces and makeup, because small details could now be seen.
Lotta old shows are re-formatted just to have the wider screen, since they would still film at higher res for movies or just because. It's not just an indication of age if something is still only in 4:3; it's an indication of thrift or just a general lack of giving a shit about the future.
I identified them by awkward haircuts and clothing styles. I knew something was off / wrong, but it wasn't until adulthood that I was able to piece it together.
Can always tell when a show is 4:3 aspect. Recently I've noticed some modern TV shows adopting the theater aspect ratios of flat (1.85:1) or scope (2.39:1), which I think is pretty cool. The last episode of Strange New Worlds I watched was in scope; that's some high-end filming.
SNW is really top tier production quality across the board. The camera work, the sound, music, design, everything is goddamned impeccable, and that extends to the post production. So much thought goes into every part of it, and I really have to give Paramount its kudos for enabling that level of attention to detail in all aspects of the franchise right now. If I told a fellow Trekkie in the 90s that we would ever see the day, they would laugh.
Asteroid City switched between aspect ratios, as well as between black & white and color, as it swapped between the TV story and the 'real'/cinema story.
Even for fans of his films, you have to be prepared for the weirdness to be dialled up to 11 in this one. It's the cinema equivalent of "I'm so meta, even this acronym".
Any of his others would be an easier and maybe more satisfying watch. It's a nice enough story of course, with the usual silly and neurotic characters and bizarre beautiful sets - just don't be surprised when people come out of the cinema looking confused.