Adding Color
Previous: Vertical Sync and Interlacing
Black & white TV is great and all, but the public demanded color! Adding color to analog TV broadcasts was a tricky proposition: almost nothing besides the actual addition of color could change (and "almost" is doing a bit of heavy lifting here, as you'll see), and any color broadcasts still needed to be perfectly visible on existing black & white TVs (backwards compatibility is a big deal).
Color as a Frequency
It turns out, color can be encoded as a wave at a specific frequency: the hue of the color becomes the wave's phase (its side-to-side positioning relative to a reference wave) and the saturation becomes the wave's amplitude. Coupled with the existing broadcast's brightness information (now referred to as the luma of the signal), this gives the TV everything it needs to display a full color picture.
For an idea of how this encoding works, here is an interactive demo of how the phase (relative to the reference wave, the fainter sine wave) and amplitude of a carrier wave can be turned into a color (here using a constant luma value of 0.5):
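To make that concrete, here is a minimal sketch (in C++, with hypothetical names and normalized units; an illustration only, not Cathode Retro's actual code) of how one sample of such a signal could be generated:

```cpp
#include <cmath>

// One sample of the color signal at time t (in seconds): the luma sets the
// baseline level, and the chroma rides on top of it as a sine wave whose
// phase encodes the hue and whose amplitude encodes the saturation.
// All values here are normalized (0..1) for simplicity.
double CompositeSample(double luma, double hue, double saturation, double t)
{
  constexpr double kColorCarrierHz = 315.0e6 / 88.0; // ~3.58MHz, see below
  constexpr double kPi = 3.14159265358979323846;
  return luma + saturation * std::sin(2.0 * kPi * kColorCarrierHz * t + hue);
}
```

A black & white TV just sees the luma term plus a high-frequency wiggle it can ignore, which is exactly what backwards compatibility required.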
Choosing the encoding frequency was problematic, though. Due to some signal processing shenanigans, this wave's frequency really needed to be an odd multiple of half the scanline rate (525 lines times 30 frames = 15,750 lines per second), and to have a similarly clean relationship with the audio carrier, which sat 4.5MHz above the video signal. Were that not the case, the color carrier would end up causing interference with at least one of those signals, which would distort either the picture or the audio (or both). Additionally, this color carrier frequency wanted to be high enough that it would be minimally visible on existing black & white TVs, and easy to filter out in newer black & white TVs.
Ultimately, it was decided that moving the audio carrier frequency was not viable, because it would break old TVs' ability to pick up audio (they were strongly tuned to that 4.5MHz), so in order to get a color carrier wave to sit nicely against both, they had to, instead, change the frame rate of the video signal slightly: to approximately 29.97 frames per second (30/1.001, to be precise). That's right, if you were wondering why we have these weird 29.97 or 59.94fps video modes, it's the fault of color video.
Because of all of the sync pulses in the video signals, TVs could easily adjust to a different line rate, and thus the line rate changed from 15,750 to exactly 1/286th of the audio carrier frequency (approximately 15,734.27Hz), and the color carrier was chosen to be roughly 3.58MHz (315/88 MHz, to be precise), which sits at exactly 227.5 times the new line rate - precisely the relationship it needed with both.
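If you want to sanity-check that arithmetic, here is a small standalone program (the names and structure are just for this example) that derives each rate from the 4.5MHz audio carrier:

```cpp
#include <cstdio>

int main()
{
  const double audioCarrierHz = 4.5e6;                  // 4.5MHz above the video signal
  const double lineRateHz     = audioCarrierHz / 286.0; // ~15,734.27 lines per second
  const double frameRateHz    = lineRateHz / 525.0;     // ~29.97 frames per second (30/1.001)
  const double colorCarrierHz = 315.0e6 / 88.0;         // ~3.58MHz

  std::printf("line rate:  %.4f Hz\n", lineRateHz);
  std::printf("frame rate: %.6f fps\n", frameRateHz);

  // The color carrier lands at exactly 227.5 times the line rate (an odd
  // multiple of half the line rate), which is what keeps it unobtrusive.
  std::printf("carrier / line rate: %.1f\n", colorCarrierHz / lineRateHz);
  return 0;
}
```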
A New Scanline
There were two more issues that needed solving: how could color TVs distinguish newer color signals from black & white signals, and how could the TV know what phase a scanline's color waveform was measured relative to (i.e. where the 0° baseline phase is)? These were both solved by adding the colorburst to the back porch of a color video signal:
In the back porch now, there is a burst of the color carrier frequency (a 3.58MHz wave), which serves as the reference phase for this scanline's color encoding (its amplitude additionally tells the TV what a maximum-saturation amplitude looks like). The TV synchronizes with this burst and uses that synchronization to decode the actual color signal (the color is encoded using a scheme called quadrature amplitude modulation).
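As a rough illustration of that decoding step (a sketch under simplifying assumptions - uniform sampling over whole carrier cycles - and not how Cathode Retro's shaders actually do it), multiplying the signal by two reference waves 90° apart and averaging recovers the saturation and hue:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct DecodedChroma { double saturation, hue; };

// Demodulate a run of composite samples against a reference locked to the
// colorburst's phase. Averaging over whole carrier cycles cancels out the
// luma and acts as a crude low-pass filter; a real decoder filters properly.
DecodedChroma Demodulate(const std::vector<double>& samples, double carrierHz, double sampleRateHz)
{
  const double kPi = 3.14159265358979323846;
  double sinAcc = 0.0, cosAcc = 0.0;
  for (std::size_t n = 0; n < samples.size(); n++)
  {
    double phase = 2.0 * kPi * carrierHz * double(n) / sampleRateHz;
    sinAcc += samples[n] * std::sin(phase); // in-phase reference
    cosAcc += samples[n] * std::cos(phase); // 90 degrees offset
  }

  // For a chroma wave of the form "saturation * sin(phase + hue)", these
  // averages work out to saturation*cos(hue) and saturation*sin(hue).
  double scale = 2.0 / double(samples.size());
  double c = sinAcc * scale, s = cosAcc * scale;
  return { std::sqrt(c * c + s * s), std::atan2(s, c) };
}
```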
Additionally, color TVs treated broadcasts without a colorburst as black & white signals, and processed them accordingly. Black & white TVs, in turn, could freely ignore the colorburst, as it would just appear as a little bit of signal wobble to them.
This encoding of the luma (brightness) and chroma (the color information) together in one signal is called composite video, which is also what the yellow video jacks on older video products carried.
Never The Same Color
Some people refer to NTSC by some variation of "Never The Same Color" because - especially on older TVs - broadcast differences between channels or poor reception could lead to colors appearing different from channel to channel (which is why TVs had a "tint" knob, so the viewer could correct for any weird color tones due to phase issues).
By the time PAL and SECAM were standardized, color NTSC had been around for a while, and both of those standards contained signal tricks to help reduce or eliminate these phase issues; as a result, they are widely considered to have superior picture quality.
By the mid- to late-1970s, newer TVs typically had fewer issues decoding the phases, and most stations were broadcasting properly, so the color problems largely went away. But the name stuck, because it was funny.
Color Artifacts
Because the chroma and luma data all share one big analog signal, information bled between them: for instance, rapid changes in brightness at roughly the color carrier frequency are indistinguishable from chroma signals.
Systems like the Apple II and PCs with CGA graphics cards attached to composite monitors intentionally took advantage of this to display more colors than the system supposedly supported.
(The above images are based on ones by the authors of the 8088 MPH demo, which uses these artifact colors in masterful ways to get way more colors on-screen at once than CGA was believed capable of.)
The left image above shows the various patterns possible with 1-bit-per-pixel color, and the right image shows the colors that a composite screen displays for those bit patterns when they are output by a CGA graphics card at 640 pixels of horizontal resolution. The Apple II had a similar set of colors, but in a different order due to differences in the relative phases of its scanlines.
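For a feel for the mechanics (a sketch with assumed constants, not any specific machine's implementation): in CGA's 640-pixel mode the pixel rate is four times the color carrier, so a repeating group of four on/off pixels forms a crude wave at the carrier frequency, and demodulating that pattern as if it were chroma produces a color:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
  const double kPi = 3.14159265358979323846;

  // One of the 16 possible repeating 4-pixel on/off patterns.
  const int pattern[4] = {1, 1, 0, 0};

  // Demodulate one full color cycle (four pixels); at this resolution the
  // carrier advances 90 degrees per pixel.
  double sinAcc = 0.0, cosAcc = 0.0, luma = 0.0;
  for (int n = 0; n < 4; n++)
  {
    double phase = 2.0 * kPi * double(n) / 4.0;
    sinAcc += pattern[n] * std::sin(phase);
    cosAcc += pattern[n] * std::cos(phase);
    luma   += pattern[n] / 4.0;
  }

  double saturation = std::sqrt(sinAcc * sinAcc + cosAcc * cosAcc) / 2.0;
  double hueDegrees = std::atan2(cosAcc, sinAcc) * 180.0 / kPi;
  std::printf("luma %.2f, saturation %.2f, hue %.0f degrees\n", luma, saturation, hueDegrees);
  return 0;
}
```

Shifting which pixels are on rotates the phase (and thus the hue), while different on/off counts change the luma and saturation, which is roughly how those bit patterns fan out into distinct artifact colors.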
Cathode Retro was explicitly designed to reproduce this and other related effects when in composite mode.
S-Video
While the above artifact colors were useful to some, in many other places they could be a nuisance, and so in 1987, a new video connector type started making the rounds: S-Video!
S-Video has not one but two channels of signal, and it uses them to carry the luma and chroma separately (the "S" in S-Video literally stands for "Separate"). Basically, it takes all of the chroma wiggles and shunts them off onto their own wire, so one component of the signal looks exactly like a black & white video signal (luma only) and the other contains just the chroma data.
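In code terms the difference is almost trivially simple (hypothetical types, purely for illustration):

```cpp
// A composite signal sums the two parts onto one wire; S-Video keeps them
// on two separate ones.
struct SVideoSample
{
  double luma;   // looks exactly like a black & white video signal
  double chroma; // just the color carrier wiggles
};

double ToComposite(const SVideoSample& s)
{
  // Collapsing down to composite is the easy direction; it's cleanly
  // separating the two again on the receiving end that is hard, and that
  // difficulty is where all the crosstalk artifacts come from.
  return s.luma + s.chroma;
}
```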
This meant that all of those cross-talk artifacts (including artifact colors) were not visible over an S-Video connection, so as long as the thing you were displaying was not relying on them for the full effect (as many older games did), the picture quality was vastly improved as a result.
~Fin~
That is, effectively, how the NTSC video signal works! If you made it this far, hopefully you have a better understanding of what all goes into a color analog TV signal. It's a lot.
Now that you know how these work, it's time to take a look at how Cathode Retro fakes the whole thing.