The evolution of computer display technology
In a previous post, I mentioned that it's hard to connect a Commodore 64 to currently available monitors and TVs. I've also had considerable difficulty hooking up my Amiga 1200 to an LCD monitor and getting a clear picture. Probably more about that in a later post.
So I thought it would be interesting to look at how widely available computers since the late 1970s have sent text and graphics to an external display, and the evolution of those systems.
1980s home computers
Home computers from the 1980s were intended to be used with a TV as their display. This works very well for screen resolutions up to around 320x200, which is what those computers were limited to because of memory and speed constraints. Nearly all of those computers had a text mode, and then often various graphics modes and additional capabilities such as sprites that are useful for games. They could display 8 to 16 different colors, and text was limited to about 40x25 characters.
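A quick back-of-the-envelope sketch shows why resolutions stayed that low. (The mode list below is illustrative, not tied to any specific machine.)

```c
#include <stdio.h>

/* Rough framebuffer sizes for a few illustrative bitmap modes. */
int main(void) {
    struct { const char *desc; int w, h, bpp; } modes[] = {
        { "320x200, 2 colors (1 bpp)",  320, 200, 1 },
        { "320x200, 16 colors (4 bpp)", 320, 200, 4 },
        { "640x400, 16 colors (4 bpp)", 640, 400, 4 },
    };
    for (int i = 0; i < 3; i++) {
        long bytes = (long)modes[i].w * modes[i].h * modes[i].bpp / 8;
        printf("%-28s %6ld bytes\n", modes[i].desc, bytes);
    }
    return 0;
}
```

A 320x200 bitmap with 16 colors already takes 32,000 bytes, half the memory of a 64 KB machine, so anything much bigger simply didn't fit.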
Typically, home computers had a built-in RF modulator, which created an actual broadcast TV signal. That works with all analog TVs. And VCRs did the same thing. It quickly became clear that skipping the RF modulation/demodulation steps provided a better picture, so new TVs came with composite video or SCART connectors to hook up VCRs (and home computers).
A composite video signal is a combination of a black-and-white signal and an additional signal that encodes color information; these are also the signals a VCR records on tape. Home computers generated these signals directly inside their video chips, so this was the best quality video output available. A few computers such as the Commodore 64 provided access to separate luma (black-and-white) and chroma (color) signals, which prevents the two from interfering with each other. That system was later called S-video.
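To give an idea of what the black-and-white (luma) part of the signal is: it's a weighted sum of the three primaries. Here's a minimal sketch using the standard NTSC luma weights; the modulated chroma subcarrier that rides on top of it is left out.

```c
#include <stdio.h>

/* Luma (the black-and-white part of composite video) as a weighted sum
   of the primaries, using the standard NTSC coefficients. The chroma
   subcarrier that encodes the color information is not modeled here. */
double luma(double r, double g, double b) {
    return 0.299 * r + 0.587 * g + 0.114 * b;
}

int main(void) {
    printf("pure red   -> luma %.3f\n", luma(1, 0, 0));  /* 0.299 */
    printf("pure green -> luma %.3f\n", luma(0, 1, 0));  /* 0.587 */
    printf("white      -> luma %.3f\n", luma(1, 1, 1));  /* 1.000 */
    return 0;
}
```

A black-and-white TV simply displays the luma and ignores the chroma, which is why composite remained compatible with older sets.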
Early IBM-compatible PCs
The original IBM PC from 1981 had two display options: a monochrome text-only MDA video card or a color CGA video card that also supported graphics. At 640x200 in graphics mode and 80x25 in text mode, the CGA card supported a higher resolution than home computers did, and the signal wasn't an NTSC signal, but TTL RGBI. That means: four wires for red, green, blue and intensity. Each wire carries a binary on/off signal, allowing for 16 color combinations. However, the timing signals were the same as NTSC, so it was possible to connect a TV to a CGA video card with the right adapter.
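As a sketch of how four on/off lines give 16 colors, here's the usual IRGB numbering with the commonly used color names. (One quirk worth noting: IBM's monitors special-cased the dark yellow combination to display brown.)

```c
#include <stdio.h>

/* Enumerate the 16 RGBI combinations. The color number is the four
   TTL lines read as bits: intensity, red, green, blue. */
int main(void) {
    const char *names[16] = {
        "black", "blue", "green", "cyan",
        "red", "magenta", "brown (dark yellow)", "light gray",
        "dark gray", "light blue", "light green", "light cyan",
        "light red", "light magenta", "yellow", "bright white"
    };
    for (int c = 0; c < 16; c++) {
        printf("I=%d R=%d G=%d B=%d -> %s\n",
               (c >> 3) & 1, (c >> 2) & 1, (c >> 1) & 1, c & 1, names[c]);
    }
    return 0;
}
```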
A limitation of the CGA card is that characters are 8x8 pixels, which doesn't look very smooth. The MDA card addresses this issue by using 9x14 pixel characters. This means the resolution of the display must now be 720x350. A monitor with NTSC timing can in principle display a 720x350 resolution, but the image will flicker horribly because it can only draw a complete image 30 times per second. (An image of no more than about 200 lines can be redrawn 60 times per second and doesn't flicker.) So instead, MDA displays use a faster pixel clock, which in turn allows for a faster horizontal refresh rate, resulting in a vertical refresh rate (how often the complete image is drawn) of 50 Hz. MDA only supports text in one color plus extra brightness. It uses two TTL signals for that.
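Here's where that 50 Hz comes from, assuming the commonly cited MDA timing figures (the totals include the blanking intervals around the visible image):

```c
#include <stdio.h>

/* Derive MDA's refresh rates from commonly cited timing figures:
   a 16.257 MHz pixel clock, 882 pixel clocks per scanline (720 of
   them visible) and 369 scanlines per frame (350 visible). */
int main(void) {
    double pixel_clock     = 16257000.0;  /* Hz */
    double clocks_per_line = 882.0;       /* incl. horizontal blanking */
    double lines_per_frame = 369.0;       /* incl. vertical blanking   */

    double h_freq = pixel_clock / clocks_per_line;
    double v_freq = h_freq / lines_per_frame;

    printf("horizontal: %.2f kHz\n", h_freq / 1000.0);  /* ~18.43 kHz */
    printf("vertical:   %.2f Hz\n",  v_freq);           /* ~50 Hz     */
    return 0;
}
```

Compare that to NTSC's roughly 15.7 kHz horizontal rate: the faster line rate is what buys the extra scanlines per frame.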
Not long after, the Hercules Graphics Card appeared, which supports the same text mode as MDA, but also a 720x348 monochrome graphics mode.
In 1984, the EGA graphics card came out, which could run in modes compatible with both CGA and MDA monitors and supports 64 colors by having two TTL signals each for red, green and blue.
Mac
The original Macintosh 128K from 1984 had a built-in display with a resolution of 512x342 and a refresh rate of 60 Hz, which, like MDA, uses faster timing than a TV signal. Interestingly, there was no text mode, only black and white graphics, and no additional capabilities such as automatically overlaying the mouse pointer over the image. Later Macs increased the resolution and number of colors similar to VGA and its successors.
Amiga
In 1985, Commodore introduced the Amiga. The first generations of Amigas had very flexible graphics modes up to around 320x200 with 2, 4, 8, 16 or 32 colors out of a palette of 4096, or around 640x200 with up to 16 out of 4096 colors (4 bits per color). The timing is compatible with the NTSC or PAL TV standards. So much so, in fact, that with the aid of a genlock device, the Amiga's video chip can sync up with an external video signal so the Amiga's display can be overlaid over parts of it.
However, the original Amiga was intended to be used with a monitor, not a TV. So its video output is in the form of three analog signals for red, green and blue—12 TTL signals to allow for 4096 color combinations would have been impractical. As a CRT's electron beams are driven by three analog signals for the three primary colors, an analog RGB signal from the computer is a very natural fit.
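As a small illustration of where the 4096 comes from: the original Amiga chipset stores each palette entry as a 12-bit word, 4 bits per primary, commonly written in the $0RGB hexadecimal format. A minimal sketch:

```c
#include <stdio.h>

/* Pack a 4-bit-per-channel color into the 12-bit $0RGB word format
   used by the original Amiga chipset's color registers. */
unsigned short amiga_rgb(unsigned r, unsigned g, unsigned b) {
    return (unsigned short)(((r & 0xF) << 8) | ((g & 0xF) << 4) | (b & 0xF));
}

int main(void) {
    printf("white : $%03X\n", amiga_rgb(15, 15, 15));   /* $FFF */
    printf("orange: $%03X\n", amiga_rgb(15, 8, 0));     /* $F80 */
    printf("total : %d combinations\n", 16 * 16 * 16);  /* 4096 */
    return 0;
}
```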
The Amiga 1200's video outputs: analog RGB, composite video, RF
A big limitation of NTSC/PAL compatibility was that resolutions of more than 200 lines (256 for PAL) required interlaced video with its inherent flickering. So later Amigas got a faster pixel clock that let them draw resolutions such as 640x480 at 60 Hz without flicker. The number of colors was also increased to a maximum of 256 out of 16.8 million (8 bits per color).
VGA
In 1987, the new IBM PS/2 PCs came with the Video Graphics Array. VGA has (yes, present tense, VGA is still supported as a lowest common denominator standard in most video cards!) an 80x25 text mode and 640x480 graphics with 16 out of 262144 colors (6 bits per color) or 320x200 with 256 colors. It uses the (in)famous 15-pin VGA connector with analog signals for red, green and blue. After VGA came SVGA, XGA and many others, which increased the number of colors to 256 at higher resolutions, and resolutions kept increasing up to an eventual 1920x1200.
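For a flavor of how those 6 bits per color were set: the VGA DAC is reprogrammed one palette entry at a time through two I/O ports. A DOS-era sketch, assuming a Borland-style outportb() routine; this only makes sense on real hardware or in an emulator, not under a modern OS:

```c
#include <dos.h>  /* Borland/DJGPP-style port I/O */

/* Reprogram one VGA palette entry: write the entry's index to port
   0x3C8, then three 6-bit values (0-63) for red, green and blue to
   port 0x3C9. 64 levels per channel gives 64^3 = 262,144 colors. */
void set_palette_entry(unsigned char index,
                       unsigned char r, unsigned char g, unsigned char b) {
    outportb(0x3C8, index);
    outportb(0x3C9, r);
    outportb(0x3C9, g);
    outportb(0x3C9, b);
}

int main(void) {
    set_palette_entry(0, 0, 0, 32);  /* turn color 0 into a dim blue */
    return 0;
}
```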
Unification
The early color computers used a small number of fixed colors. Later ones worked with an indexed color system: you still only had a relatively small number of different colors on the screen at the same time, but you could redefine those colors by specifying the three primary colors with a precision of 2, 4, 6 or 8 bits.
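A minimal sketch of the indexed color idea, with a made-up 2-bit palette:

```c
#include <stdio.h>

/* Indexed color: the framebuffer stores small palette indices, and a
   color lookup table maps each index to an RGB triple that can be
   redefined at will. The palette values here are made up. */
int main(void) {
    unsigned char palette[4][3] = {   /* 2-bit indices: 4 entries */
        {   0,   0,   0 },            /* 0: black */
        { 255, 255, 255 },            /* 1: white */
        { 255,   0,   0 },            /* 2: red   */
        {   0,   0, 255 },            /* 3: blue  */
    };
    unsigned char framebuffer[6] = { 0, 1, 2, 3, 1, 0 };  /* tiny "image" */

    for (int i = 0; i < 6; i++) {
        unsigned char *c = palette[framebuffer[i]];
        printf("pixel %d -> index %d -> RGB(%d,%d,%d)\n",
               i, framebuffer[i], c[0], c[1], c[2]);
    }
    return 0;
}
```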
After that came high color and true color starting at around 1995. With high color, each pixel is 15 or 16 bits, with 5 bits for red and blue, and 5 or 6 bits for green. True color increases this to (at least) 24 bits per pixel, so 8 bits per primary color. So with a true color system, there are no limitations on how many colors can be on the screen at the same time.
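A sketch of the common 16-bit high color packing (5 bits red, 6 green, 5 blue; green gets the extra bit because the eye is most sensitive to it):

```c
#include <stdio.h>

/* Pack an 8-bit-per-channel color into the 16-bit 5-6-5 high color
   layout by dropping the low bits of each channel. */
unsigned short rgb565(unsigned r, unsigned g, unsigned b) {
    return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void) {
    printf("white: 0x%04X\n", rgb565(255, 255, 255));  /* 0xFFFF */
    printf("red  : 0x%04X\n", rgb565(255, 0, 0));      /* 0xF800 */
    printf("green: 0x%04X\n", rgb565(0, 255, 0));      /* 0x07E0 */
    return 0;
}
```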
Around the same time, MS-DOS gave way to Windows, so it was no longer common for computers to use text mode; instead, they mixed text and graphics in a (high color or true color) graphics mode.
The VGA connector was never replaced by anything new until analog display connections were replaced by digital ones: first DVI, then HDMI and DisplayPort. Within a few years after VGA came out, Macs became compatible with VGA monitors, and the later Amigas could also use a VGA monitor when running in the correct screenmode.
So as of the second half of the 1990s, all common computer types moved away from TV-based display options towards a common display technology. Then, with the advent of HDTV, TV also made the jump to higher resolutions and digital connections... and now it is again possible to use a TV as a computer display.
Funny how things work out sometimes.
Permalink - posted 2020-09-13