The static on old CRT TVs with rabbit ears was the cosmic microwave background. No one in the last 25 years has ever seen it.
Well, not really. The cosmic microwave background radiation was a tiny fraction of that noise. What everyone saw was mostly thermal noise generated by the amplifier circuit inside the TV.
CRTs were in use well into the 2000s.
Even before the 2000s they started showing a blue screen instead of static.
That wasn’t just a digital or flat panel thing.
But of course old sets were around for a long time.
My memory of the exact details here is fuzzy, but I think this depended on whether your TV picked up digital signals, analog, or both. I remember around that time we had a TV that would show static on some channels and a blue input screen on others.
It’s definitely an analog over-the-air TV thing.
The way digital works, you would either get a “No signal” indicator (because the circuitry detects that the signal-to-noise ratio is too low) or squarish artifacts (because of the way the compression algorithms for digital video are designed).
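That “no signal” decision can be sketched in a few lines (a hypothetical sketch for illustration only: the function names and the 15 dB threshold are made up; real demodulators use standard-specific limits):

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels."""
    return 10 * math.log10(signal_power / noise_power)

def tuner_output(signal_power, noise_power, threshold_db=15.0):
    """Blank the screen below a threshold instead of rendering the noise.

    This is why a digital set never shows analog-style snow: the tuner
    simply refuses to display anything it can't demodulate.
    """
    if snr_db(signal_power, noise_power) >= threshold_db:
        return "picture"
    return "no signal"

print(tuner_output(100, 1))  # strong signal (20 dB)
print(tuner_output(2, 1))    # buried in noise (~3 dB)
```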
I remember back in the Wii days, when I was young, we had a flat screen that would go to the digital pattern with no input. Every once in a while, though, it would get that loud static no-signal screen, so I think mine had both.
I don’t really have a point here just wanted to share
Yeah, for instance the semi-ubiquitous “small TV with a VHS player built in” that was in a ton of mini-vans and kids’ rooms well into the early 2000s only supported analog cable/antenna signals, so it would give the black and white static when there was no signal.
I’m talking long before digital channels existed. (In the US anyway)
What are they hiding from us?!
Yeah I was still using a CRT as recently as 2012. I think OP means analogue TVs.
Yeah you’re right.
Technically, it’s not about the display technology, but instead about the signal/tuner. More specifically if it’s analog or digital. Some modern TVs still have analog or hybrid tuners for backwards compatibility and regions that still use analog, so they can display static. For instance, in Ukraine we finished the switch to digital TV only a couple of years ago. If your TV had no digital tuner (as was the case for many) you had to buy a DAC box. Retirees/pensioners got them for free, sponsored by the government.
Yeah, my youngest sibling has definitely seen CRTs. My niblings probably haven’t, though.
I thought they were teaching it in all the schools? /s
Do you think CRTs just magically disappeared after the turn of the millennium?
Don’t you still see this when using an OTA ATSC tuner on a newer LCD display? I thought this was a function of the signal generation and not the display technologies.
deleted by creator
It actually was a pretty rapid switch where all the CRTs disappeared
Cheap LED TVs were like 1/5 the cost of analog TVs. The digital switchover really finished them off too.
Really it’s the size/price that did it, though. My buddy paid I think $3k for a maybe 40” Trinitron in 99-2000. It probably weighed 200 lbs. Looked amazing at the time, but it was probably only months before big LEDs came out. Plasma might have been a thing then, but those were like $10k+.
They lied to us. The real Y2K was the CRT rapture.
I think they’re more likely to have been scrapped than other old tech.
They’re bulky, and mine was too heavy to get out in the attic. I still have my ZX Spectrum and Amiga, but the CRT needed for lightgun games is long gone.
Well to be fair at some point most/all CRTs showed a blue screen instead of static. So it’s possible someone born in 2000 never saw the snowy display.
No, I just couldn’t remember exactly when. And as another commenter pointed out, what I should have said was analog TVs.
People born before 2000 think older technology just evaporated the minute the millennium ticked over.
Like when the black and white world suddenly got colorized! My grandpap told me about them old days - when the lawn, the sidewalk and the sky were just different shades of gray.
Grandpa was telling you about 50 shades of grey?
Grandpa knew things. Apparently so did grandma.
deleted by creator
they have to watch HBO shows to compensate
Surely you mean the much worse “Max”.
Logo still shows HBO for that intro though
Only the HBO shows on Max have the static. If you watch the garbage reality shows it isn’t there.
No one in the last 25 years has ever seen it.
I mean you can still find a CRT today and turn it on if you like, they’re less common for sure, but they’re still around if you’re looking for one
Kids born after 2000 aren’t looking for one
Well that’s a lie, I know a guy in his early 20s who’s into retro games and has definitely been to an arcade with CRTs in the past year or so. It’s not a stretch to imagine he’s seen static on one.
I know I am.
I had three different ones growing up. The first 2 were black and white and the last one was color. All found on the side of the road.
I find one every once in a while, on the side of the road as well, but unfortunately some idiot has usually sprayed graffiti on it for some dumbass Instagram post, or I have no room at my place ATM.
Dude I’d kill for the opportunity to get my hands on a half-decent CRT
You’d be surprised, some people born in the 2000s want them for the retro factor now
CRTs are popular with people who have retro games consoles.
They’re surprisingly difficult to acquire though. Big, heavy and either very expensive or free.
By the way, the picture illustrating the post isn’t actually displaying the real thing - the noise in it is too squarish and has no grey tones.
TV static in recent movies and shows that are set in the past almost always instantly pull me out of the narrative because no one seems to be able to get it right and some are just stunningly bad. It’s usually very subtle, so much so that I’m not sure I could even describe what’s wrong. Makes me feel old to notice it.
I think the problem is that CRT displays didn’t have pixels, so the uniform noise that is static was not only uniformly spread in distribution and intensity (i.e. greyscale level) but also had “dots” of all sizes.
Another thing that’s possibly off is the speed at which the noise changes: was it the 25fps refresh rate of a CRT monitor, related to that rate but not necessarily at it, or did the noise itself have more persistent and less persistent parts?
The noise is basically the product of radio waves at all frequencies and various (though all low) intensities, with only the ones that pass the bandpass filter of the TV tuner coming through (and being boosted in intensity by automatic gain control) and being painted onto a phosphor screen (hence no pixels) as the beam drew the screen line by line, 25 times per second. So to get that effect right you’d probably have to simulate it mathematically, starting from random radio noise, and it can’t be going through things with pixels (such as 3D textures) to be shown; it probably requires some kind of procedural shader.
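A minimal sketch of that idea in Python (assuming NumPy; the `bandwidth` parameter and the moving-average filter are crude stand-ins for the real tuner’s bandpass, and the min/max stretch stands in for automatic gain control): band-limit white noise along each scanline, so the “dots” come out at varying horizontal sizes with a full greyscale range instead of uncorrelated per-pixel confetti.

```python
import numpy as np

def analog_static(height=480, width=640, bandwidth=0.3, seed=None):
    """Analog-style TV static as band-limited noise, not per-pixel noise.

    The tuner's bandpass filter limits how fast the signal can change
    along a scanline, so real static has horizontal blobs of varying
    size and a full range of grey levels.
    """
    rng = np.random.default_rng(seed)
    # White noise for every sample along every scanline.
    noise = rng.standard_normal((height, width))
    # Low-pass each scanline with a short moving average to mimic the
    # limited video bandwidth (smaller `bandwidth` -> wider blobs).
    kernel_len = max(1, int(round(1.0 / bandwidth)))
    kernel = np.ones(kernel_len) / kernel_len
    filtered = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, noise
    )
    # Crude automatic gain control: stretch to the full 0..255 grey range.
    lo, hi = filtered.min(), filtered.max()
    return ((filtered - lo) / (hi - lo) * 255).astype(np.uint8)

frame = analog_static(seed=42)
```

A real-time version would do the same thing in a fragment shader, regenerating the noise every field rather than every frame.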
The sky above the port was the color of television, tuned to a dead channel. - William Gibson, Neuromancer
One of the most beautiful opening lines to a novel.
If you remember that it was written in 1984, the color is obviously black and white static. If you don’t think about the year, you might be led to believe it is blue.
Literally 1984
One of the most beautiful opening lines to a novel.
Abundantly clearly not.
This is it:
“It was a dark and stormy night; the rain fell in torrents—except at occasional intervals, when it was checked by a violent gust of wind which swept up the streets (for it is in London that our scene lies), rattling along the housetops, and fiercely agitating the scanty flame of the lamps that struggled against the darkness.”
Torrents >> TV
Uh, no:
“The man in black fled across the desert and the gunslinger followed.”
“Cheryl’s mind turned like the vanes of a wind-powered turbine, chopping her sparrow-like thoughts into bloody pieces that fell onto a growing pile of forgotten memories.”
Look here dude, we still doing “no nut November” or what?! Why must you tempt me?!
The opposite of a Bulwer-Lytton!
It is entirely possible for people born after 2000 to have grown up with CRTs.
It is, but those late model CRTs often had a lot of digital circuitry that displayed a solid color on channels with nothing on them. Unless there was a much older CRT around, they never would have seen it.
Most of the CRTs are going to be older
2001 here literally grew up with CRT static, you have your years a bit off there.
I was about to say, I think we had a CRT till about 2010. My grandma still has one upstairs, so even my youngest cousins grew up with it.
I bought a plasma in 2009 that would show static if I turned it to cable channels without cable plugged in. Plasmas were susceptible to burn in and since I would game a lot I could see health bars etc start to burn in after a while. Whenever that would happen I would turn it to the static screen - making each pixel flip from one end of the spectrum to the other rapidly like that would actually help remove the burn in.
Last time I thought about static I wondered why colour TV didn’t show colour static.
Turns out the colour signal was on very specific frequencies, and if it wasn’t present, it would assume it was a black and white signal and turn off the colour circuit.
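That “color killer” logic boils down to checking for energy near the NTSC color subcarrier; here’s a hedged sketch of the idea (the sample rate, window length, and threshold are illustrative assumptions, not a real TV’s implementation):

```python
import numpy as np

FSC = 3.579545e6       # NTSC color subcarrier frequency, Hz
SAMPLE_RATE = 4 * FSC  # assumed sample rate: 4x the subcarrier

def has_color_burst(samples, threshold=5.0):
    """True if energy near the subcarrier stands well above the spectrum's mean.

    With no burst detected, the set assumes a black-and-white signal and
    disables the color circuit -- which is why static is grey, not rainbow.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    band = spectrum[np.abs(freqs - FSC) < 50e3]  # +/- 50 kHz around FSC
    return bool(band.max() > threshold * spectrum.mean())

rng = np.random.default_rng(0)
t = np.arange(4096) / SAMPLE_RATE
noise_only = rng.standard_normal(4096)                     # dead channel: pure noise
with_burst = noise_only + 5 * np.sin(2 * np.pi * FSC * t)  # color signal present
```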
Tube TVs remained in common service well into the 2010s. The changeover from analog to fully digital TV transmission did not happen until 2009, with many delays in between, and the government ultimately had to give away digital-to-analog converter boxes because so many people still refused to let go of their old CRTs.
Millions of analog TVs are still languishing in basements and attics in perfect working order to this very day, still able to show you the cosmic background, if only anyone would dust them off or plug them in. Or in many retro gaming nerds’ setups. I have one, and it’ll show me static any time I ask. (I used it to make this gif, for instance.)
In fact, with no one transmitting analog television anymore (probably with some very low scale hobbyist exceptions), the cosmic background radiation is all they can show you now if you’re not inputting video from some other device. Or unless you have one of those dopey models that detects a no-signal situation and shows a blue screen instead. Those are lame.
Amateur radio operators are indeed allowed to transmit analog NTSC television in the UHF band. It’s most commonly done on the 70cm (440MHz) band, and a normal everyday ’90s television is all you need to receive the signals. You’d tune to what would have been cable channels 57 through 61. The use cases for this have decreased in recent years; for example, you used to see hams using amateur television to send video signals from RC aircraft or model rockets, but now that’s done with compressed digital video over something like Wi-Fi and doesn’t require a license. Still, it’s legal for hams to do.
I think my mom still uses the last CRT TV that I had. Gave it to her when I bought my first 720p HD TV, as the old CRT was better than her old TV. Later on I also gave her that HD TV but she still has the CRT too.
My family had several TVs that did this until around 2013.
It really isn’t though. It is thermal noise.
Random radio sources, but a small part of the signal is CMB. I wasn’t sure what you even meant by thermal noise but I believe it’s a phenomenon of flatscreens. I found something that said it was “similar to snow on analog TVs” - so apparently there’s a difference.
Funnily, Google AI says, “In the 1940s, people could detect the CMB at home by tuning their TVs to channel 03 and measuring the remaining static after removing other sources. This allowed them to prove the Big Bang before scientists did.” So they had that going for 'em, which is nice.
“Thermal Noise” is a phenomenon where everything makes EM noise, just from thermal energy.
If you were to put such a TV in a Faraday cage with an RF termination, you would see something similar, because noise is inherently part of the circuitry and amplifiers.
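For the curious, the noise in question is Johnson–Nyquist (thermal) noise, and its magnitude is easy to estimate from v_rms = sqrt(4·k·T·B·R). A quick sketch (the 75-ohm input impedance and 6 MHz bandwidth are just typical analog-TV figures, picked for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_vrms(resistance_ohms, bandwidth_hz, temp_kelvin=290.0):
    """RMS Johnson-Nyquist noise voltage across a resistor.

    Every resistor generates this just from the thermal agitation of its
    electrons -- no antenna or external signal required.
    """
    return math.sqrt(4 * K_B * temp_kelvin * bandwidth_hz * resistance_ohms)

# A 75-ohm antenna input over a ~6 MHz analog TV channel at room temperature:
v = thermal_noise_vrms(75, 6e6)
print(f"{v * 1e6:.2f} microvolts rms")  # a few microvolts -- plenty to see once amplified
```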
Could it not be both?