[ Updated June 9, 2019, to correct a math error regarding drop frame timecode. ]
Back in the early days, the 1930’s, when television was being invented, the engineers trying to create pictures out of thin air had a problem.
The method they were using to display a picture required zapping a high-voltage beam of electrons against a curved piece of plate glass, the inside of which was coated with a phosphorescent solution.
When the voltage was high, around 15,000 volts, the phosphors would glow white. When the voltage was a little less, the phosphors would glow gray. And when the voltage dropped too low, the phosphors wouldn’t glow at all.
The goal was to create an image by “painting” this variable high-voltage beam in a series of rows from the top of the image to the bottom, causing phosphors to glow in its wake.
The problem was that these phosphors weren’t very high quality. The chemistry at the time meant that the phosphors would only glow for as long as it took for the beam to get about half-way down the back of the glass plate. Then, they’d fade away.
This meant that, instead of looking at a complete image, these engineers were looking at a rapidly shifting band of gray as the phosphors were refreshed and faded.
What to do?
Improved chemistry was out of the question. These phosphors were the best available. Instead, the engineers came up with a very clever work-around: instead of painting every line from top to bottom in one pass, they would paint every other line, so that as the phosphors in the first “odd field” started to fade, the second “even field” was already being painted and overlaid on top.
Instantly, the engineers had a complete image and interlacing was born.
The next problem was that there was no wire connecting the originating station with the TV receiver. Without a wire, how could the two synchronize so that the TV set would know when to start painting each frame?
Remember, this was thirty years before the invention of the transistor and the digital clock.
What to do?
The engineers looked around for a stable clock that could be used anywhere in the United States and discovered the pulse built into 60-cycle AC current, which was standardized across North America.
By locking the first “odd” video field to one pulse, then locking the second “even” video field to the second pulse, these new TVs could sync the two fields to create a single video frame 30 times each second – and do it anywhere in the US. Pictures were transmitted, but sync came from the plug in the wall.
Instantly, 30 frames per second (fps) video was born.
NOTE: Film used 24 frames per second because it was the slowest frame rate film could run through a projector and still deliver both smooth movement and high-quality audio. In other words, this was a choice based on cost. TV picked 30 frames a second because it was a timing pulse universally available across North America; a choice based upon universality.
It took more than 80 years to finally invent technology that made interlacing and incompatible frame rates disappear. Mostly.
So where did 25 frames per second come from? Read this.
Except, that explains 30 fps and 25 fps; where did 29.97 fps come from?
Well, the Wizard of Oz, actually.
Back in the early days, television was only black-and-white. But, films like the Wizard of Oz, along with every MGM musical of the 1930’s, proved that the American public thought color was pretty darn cool.
So, David Sarnoff, head of both NBC and RCA, asked his engineers what it would take to create color television. They replied: “One and a half years and one and a half million dollars.” Bold, but wrong.
It took more than 15 years and more than 15 million dollars. And, in the process, both the engineers and marketers had problems.
In those days, the 1940’s and early 50’s, it was impossible for consumers to take out a loan to buy a TV set. You could only pay cash for your TV.
The problem was that a black-and-white TV cost the average American two years' salary to purchase. The proposed cost of a color TV looked to require saving for EIGHT years in order to make the payment!
What to do?
The marketers, realizing that no one would be able to afford a color TV if something didn’t change, needed to find a way around cash-only sales of TV sets; because if no one owned a color TV, color television broadcasting would die.
This was the genesis of consumer credit cards and installment loans. Initially offered for dining, the card concept was expanded by banks into general consumer credit in 1958, though it took until 1976 for the Visa name to appear. MasterCard followed a few years later.
As the marketers began to solve how to afford to purchase a TV set, the engineers were struggling to figure out how to transmit a color signal. The easy solution was to create two TV networks: one for black and white programs and one for color. But, as David Sarnoff strongly reminded them, that would mean that no single television could see all the programs available. Anything that limited the potential market was not acceptable.
What to do?
The engineers invented a way to piggy-back a color signal on top of the black-and-white signal.
Instantly, 4:2:2 color was born: the green channel carried the black-and-white signal, with the red and blue channels overlaid on top of the green for those sets that could display color. On Nov. 3, 1956, the Wizard of Oz ushered in the new world of color television on the NBC network.
NOTE: I was a kid when that broadcast first aired. My family watched it, along with about 30 neighbors, packed into the living room of Katie Malvetz, the local furniture dealer who had the only color TV in our city. When black-and-white Kansas turned into the color of Oz, there wasn’t a dry eye in that entire stunned group. It was a life-altering experience for each of us.
The problem was that transmitting all this color information took time: about 0.1% extra, or 3.6 seconds every hour. Where was this extra time to come from? Adding it to the program meant that one-hour shows would actually run one hour, three seconds and 18 frames. Not gonna fly.
What to do?
The engineers, in order for shows to run accurately to time, slightly altered the timing of each frame to compensate for the extra delay caused by transmitting a color signal.
Instantly, 29.97 fps and drop-frame timecode were invented.
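The arithmetic behind that slowdown is easy to check. Here is a minimal sketch in plain Python (the 30000/1001 ratio is the standard NTSC color frame rate):

```python
# NTSC color runs at 30000/1001 fps instead of an even 30 fps.
NOMINAL_FPS = 30.0
ACTUAL_FPS = 30000 / 1001                     # ~29.97002997 fps

# One hour of timecode labels counts 60 * 60 * 30 = 108,000 frames...
labels_per_hour = 60 * 60 * 30

# ...but at the slower real frame rate, playing that many frames takes longer.
real_seconds = labels_per_hour / ACTUAL_FPS   # 3603.6 seconds
drift_seconds = real_seconds - 3600           # 3.6 seconds per hour
drift_frames = drift_seconds * NOMINAL_FPS    # 108 frames = 3 seconds, 18 frames

print(f"{real_seconds:.1f} s, drift {drift_seconds:.1f} s = {drift_frames:.0f} frames")
```

Dropping 108 frame labels per hour is exactly what drop-frame timecode does, as discussed in the comments below.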
Well, perhaps every solution wasn’t quite so “instant” but you get the idea. Decisions made in the distant past to solve problems created technology that, warts and all, we continue to use today.
And now you know where each of these originated.
Or so it was told to me.
8 Responses to So It Was Told To Me…
Wow, Larry! Thanks for the behind the scenes explanation of the genesis of color TV. My first encounter with it was in the early ’60s seeing the NBC peacock followed by “Bonanza”. I was blown away by the blue cavalry officers’ uniforms and, later on, by Perry Como’s colorful sweaters. This was not at home but at a local restaurant. The novelty of color TV was definitely viewed as a draw to attract customers.
Colour theory here in the UK is also mind-numbing. It's locked to the 50 Hz mains, but they have an interesting way of correcting the colour signal to keep it the same as the broadcast output: PAL. All this is controlled by its own crystal-controlled oscillator running at 4.43 MHz. There's not enough time to go into the maths behind it, and too many years have passed since I studied it; this was in the days of mechanical delay lines and passive components.
If you want to know more, go to YouTube; there will be some kind nerd who has posted an in-depth video.
I thought running color results in 108 extra frames each hour, which (at 30 fps) results in an overage of 3.6 seconds per hour, or 3 seconds 18 frames?
Good point. My math was incorrect.
NTSC RS-170 color television did not take any longer to transmit than black and white.
The subcarrier containing color information was set to 3.579545 MHz so it would be an odd multiple of half the line rate (nominally 15,750 Hz) to minimize interference.
The real concern was preventing intermodulation with the standard FM audio subcarrier at 4.5 MHz. By shifting the whole works down by 0.1%, they would avoid interference in transmitters and receivers between the color subcarrier and the audio subcarrier. Therefore, 29.97 fps video.
If there wasn’t a need to preserve compatibility for sound reception with old, black & white sets, 29.97 wouldn’t have happened.
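The frequency relationships described above can be verified with a little arithmetic. A sketch, using the divisor (286) and the odd multiple (455) from the NTSC color standard:

```python
# NTSC color derives everything from the 4.5 MHz sound-carrier spacing,
# which had to stay put for compatibility with black-and-white sets.
SOUND_SPACING_HZ = 4_500_000

# The line rate was set to 1/286 of the sound spacing...
line_rate = SOUND_SPACING_HZ / 286     # ~15,734.27 Hz (down from 15,750)

# ...the color subcarrier to an odd multiple (455) of half the line rate...
subcarrier = line_rate * 455 / 2       # ~3.579545 MHz

# ...and with 525 lines per frame, the frame rate lands at 30 * 1000/1001 fps.
frame_rate = line_rate / 525           # ~29.97003 fps

print(f"{line_rate:.2f} Hz, {subcarrier/1e6:.6f} MHz, {frame_rate:.5f} fps")
```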
Thanks for taking time to write this up. While this kills a good story, I always enjoy learning stuff like this.
Sometimes you have to kill a good story to get to the facts… ☹. There are a few holes in your story.
While interlaced scanning might have ‘fixed’ a problem with CRT phosphors, that wasn’t why it was used. By scanning every other line, the frequency of the horizontal sweeps could be reduced, which also reduced the bandwidth of the signal that had to be transmitted. It was an early version of analog ‘data compression!’ It also had the added benefit of reducing the flicker of the display, as the eye would not perceive the flicker of a 60 frame per second (fps) signal as much as a 30 fps signal.
Before you ask, we don’t notice the flicker of a 24 fps motion picture because we watch movies in a dark room. The human eye is less sensitive to flicker of a dim image in a darkened room. Also, in the days of motion picture film projection, the light image was often interrupted by a two-bladed shutter to increase the apparent frame rate to 48 fps.
Interlacing could save ‘bandwidth’ in other endeavors, too. I could cut my lawn in an interlaced fashion by mowing every other row. The next week, I could cut the rows in between the rows I cut the previous week. The lawn wouldn’t look perfect, but I’d be done in half the time. Same thing with interlaced video. The picture isn’t perfect, but I save on bandwidth by lowering the sweep frequency.
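The bandwidth saving the commenter describes is simple to quantify. A quick sketch using the original black-and-white NTSC numbers:

```python
LINES_PER_FRAME = 525
IMAGES_PER_SECOND = 60  # flicker-free refresh, locked to 60 Hz AC

# Progressive scan: all 525 lines painted 60 times a second.
progressive_sweep = LINES_PER_FRAME * IMAGES_PER_SECOND       # 31,500 Hz

# Interlaced scan: each field paints only half the lines (262.5),
# so the horizontal sweep rate, and thus the bandwidth, is halved.
interlaced_sweep = LINES_PER_FRAME / 2 * IMAGES_PER_SECOND    # 15,750 Hz

print(progressive_sweep, interlaced_sweep)
```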
29.97 Hz frame rate
As others have mentioned, the frame rate was changed to avoid a perceived “beat frequency pattern” between the aural (sound) carrier and the color subcarrier frequencies. The truth is, it wasn’t a problem on every TV set (even back then) and shortly after the introduction of color broadcasting, receiver designs evolved to where the original problem didn’t exist anymore. But, we were stuck with an unusual frame rate.
The horizontal scanning frequency change didn’t seem like a big deal at the time, but it did cause problems of its own. “Hum bars” now rolled through the image, rather than being stationary (and not as noticeable) on the screen. The change has also caused unknown hours of problems (and cost perhaps millions of dollars) for broadcasters and post production engineers. And yet, it still exists!
Cost of a TV in the 1950’s
The numbers you see online are all over the place, but it definitely did not cost two years' salary to buy a black-and-white TV, or eight years for a color TV. The numbers I've seen for 1955 were around $200 for a small B/W TV, and around $1000 for a color TV. The US census bureau says the median salary in 1946 was $2410, so you can do the math. (B/W, appx 5 weeks. Color, appx 22 weeks.) Color definitely was expensive, but it was also brand-new at that time.
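Doing that math directly (a sketch; the $200, $1000, and $2410 figures are the commenter's, not independently verified):

```python
median_annual_salary = 2410            # US median salary, 1946, per the comment
weekly_salary = median_annual_salary / 52

bw_tv_price, color_tv_price = 200, 1000
bw_weeks = bw_tv_price / weekly_salary       # ~4.3 weeks of pay
color_weeks = color_tv_price / weekly_salary  # ~21.6 weeks of pay

print(f"B/W: {bw_weeks:.1f} weeks, Color: {color_weeks:.1f} weeks")
```

That lands close to the rounded week counts above, and nowhere near two or eight years.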
The first national color broadcast was on NBC on January 1, 1954. It was a live broadcast from the Rose Bowl Parade. Your first experience may have been in 1956, and it was common back then for TV dealers to have what we would now call a “watching party.”
It wasn’t anything called 4:2:2. The existing black and white signal carried the brightness information (luminance, or more correctly, “luma”) just as always, which made it compatible with color TV. The two ‘color difference’ signals were carried on a “color subcarrier” that was added to the black and white signal. Those color component signals were lower in resolution than the monochrome signal, but still not 4:2:2. In NTSC it was known as Y-I-Q, and while it is similar to 4:2:2 in sampling, the math is slightly different.
Drop-frame time code
As others have mentioned, color TV isn’t “slower” than black and white. The frame rate was changed, but not the speed of transmission. If you’ve done any work in post production, you must realize that “drop-frame time code” refers to how the frames are counted, not their duration. Because the frame rate is lower, time code numbers needed to be dropped in counting so that the “time code” value more closely matched real time.
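That counting scheme, skipping frame labels ;00 and ;01 at the start of every minute except each tenth minute, can be sketched as a frame-count-to-timecode conversion. This is an illustrative sketch of the standard counting rule, not any particular vendor's implementation:

```python
def to_drop_frame(frame_count: int) -> str:
    """Convert a 29.97 fps frame count to drop-frame timecode.

    Frame labels ;00 and ;01 are skipped at the start of every
    minute, except for minutes divisible by ten.
    """
    FRAMES_PER_10_MIN = 10 * 60 * 30 - 18  # 17,982 real frames per ten minutes
    FRAMES_PER_MIN = 60 * 30 - 2           # 1,798 real frames per dropped minute

    tens, rem = divmod(frame_count, FRAMES_PER_10_MIN)
    dropped = 18 * tens
    if rem > 1:
        # Minutes 1-9 within this ten-minute block each drop 2 labels.
        dropped += 2 * ((rem - 2) // FRAMES_PER_MIN)
    frame_count += dropped  # re-inflate to the label numbering

    ff = frame_count % 30
    ss = frame_count // 30 % 60
    mm = frame_count // 1800 % 60
    hh = frame_count // 108000 % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(to_drop_frame(1800))    # first frame after the 1-minute mark: 00:01:00;02
print(to_drop_frame(107892))  # 3600 * 29.97 frames later: 01:00:00;00
```

Note the semicolon before the frame field, the conventional marker that a timecode is drop-frame. No frames of video are ever discarded; only label numbers are skipped.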
Sorry to ruin some of your stories, but I believe that these are more accurate answers. I apologize for not including citations, but it’s better to have people look it up themselves from official sources.
I always appreciate being corrected with more accurate information. No apology necessary.
Thanks for taking the time to write!