So I've been taking another trip down memory lane, actually messing with some stuff I never had as a kid: pinball simulations on the Atari 400/800/XL/XE computers. The most famous pinball games on early personal computers tended to be pretty straight ports of games from the Apple II, and when you see videos of them in emulation or played on modern video hardware, often they look incredibly ugly--monochrome graphics with eye-hurting vertical stripes, like in this video from ILLSeaBass of David's Midnight Magic:

[embedded video]

You may recall this kind of thing showing up a lot on CGA MS-DOS games too.

But then again, the graphics might instead look like this (posted by ILLSeaBass as well):

[embedded video]

Much better--now there are colors, if somewhat peculiar ones. It's the same game. What's going on?

These Atari releases were in a graphics mode that Atari BASIC programmers called "GRAPHICS 8", the highest-resolution mode, which for a full screen gave you 320x192 pixels. But it was a monochrome mode: you could specify the background color, and the brightness of the foreground color relative to that, and that was it.
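Just to put numbers on what that mode costs: the GRAPHICS 8 bitmap is one bit per pixel, so a quick back-of-the-envelope calculation (a little Python sketch of mine, easier to read here than BASIC) gives the screen memory it eats:

# Screen memory for the Atari GRAPHICS 8 bitmap: one bit per pixel.
WIDTH, HEIGHT = 320, 192
BYTES_PER_LINE = WIDTH // 8              # 40 bytes per scanline
BITMAP_BYTES = BYTES_PER_LINE * HEIGHT   # 7680 bytes of bitmap data

print(f"{BYTES_PER_LINE} bytes per scanline, {BITMAP_BYTES} bytes total (about {BITMAP_BYTES / 1024:.1f}K)")

On the real machine the display list (and the optional text window) adds a bit more on top of that.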

How do you get colors in a high-res monochrome mode? Well, home computers like the Atari were usually hooked up to a color TV, which, in North America in the 1970s and 80s, used the NTSC analog broadcast standard. On those TVs you could use a trick called "artifacting": if you made a pattern of alternating vertical white and black stripes, it would appear as a solid field of color. You didn't have a lot of control over the colors; you could get a couple of different ones depending on whether you put the white lines on the odd or even pixel columns. And on Ataris, the colors also varied wildly between different releases of their graphics chipset, so that was a thing you had to live with. But it was a thing you could do. Apple II and TRS-80 CoCo games did this a lot too.
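In terms of what actually lands in screen memory, the stripes are just alternating-bit bytes. Here's a tiny sketch of mine (Python standing in for the real thing; the names are made up) showing the three interesting fill values -- on the Atari the most significant bit of a byte is the leftmost pixel:

# Fill values for a one-bit-per-pixel scanline (MSB = leftmost pixel on the Atari).
EVEN_STRIPES = 0b10101010   # $AA: pixels 0,2,4,6 of each byte lit -> one artifact color
ODD_STRIPES  = 0b01010101   # $55: pixels 1,3,5,7 lit -> the other artifact color
SOLID        = 0b11111111   # $FF: every pixel lit -> no artifact, just solid foreground

# Filling a 40-byte scanline with $AA tints that whole line with the first color.
scanline = bytes([EVEN_STRIPES] * 40)
print(scanline.hex())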

With MS-DOS games, though, it was actually pretty common for a PC to be hooked up to something other than a TV or composite color monitor, and then games that used this effect would look like butt. Often those games let you turn the effect on or off to account for the kind of monitor you were using.

When I was a kid messing around with Atari programming, I knew about this and played with the effect, but I didn't know how it worked. I thought vaguely that it had something to do with the positioning of the colored phosphor dots on the screen, but it has nothing at all to do with that. It's actually all about the way the NTSC standard encodes color information. Color is encoded as an extra modulation layered right on top of the monochrome brightness data: a color subcarrier whose period happens to correspond exactly to two horizontal pixels of this GRAPHICS 8 mode. The amplitude of the subcarrier controls the saturation of the color, and its phase (relative to a short, invisible burst of the reference signal at the start of every scanline, called the "colorburst") controls the hue.
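Here's a little numerical sketch of that decoding step -- my own illustration, not anything from these games. The 3.579545 MHz subcarrier and the two-pixels-per-cycle ratio are the standard NTSC/Atari figures; the function and pattern names are invented for the demo. It mixes a row of monochrome pixels against two reference carriers locked to the colorburst, averages (the low-pass step), and reads the result back as saturation and hue. Stripes on even versus odd columns come out 180 degrees apart in hue, and a solid row comes out with no color at all:

import math

F_SC = 3.579545e6            # NTSC color subcarrier frequency, Hz
PIXEL_RATE = 2 * F_SC        # hi-res pixel clock: exactly two pixels per subcarrier cycle
OVERSAMPLE = 32              # simulation samples per pixel

def artifact_color(pixels, burst_phase=0.0):
    """Quadrature-demodulate a row of 0/1 pixel levels the way an NTSC decoder
    treats chroma: mix with cosine and sine references locked to the colorburst,
    average, and read off saturation (amplitude) and hue (phase).
    burst_phase (radians) shifts the reference relative to the pixel timing."""
    dt = 1.0 / (PIXEL_RATE * OVERSAMPLE)
    i_sum = q_sum = 0.0
    n = 0
    for p, level in enumerate(pixels):
        for s in range(OVERSAMPLE):
            t = (p * OVERSAMPLE + s) * dt
            i_sum += level * math.cos(2 * math.pi * F_SC * t + burst_phase)
            q_sum += level * math.sin(2 * math.pi * F_SC * t + burst_phase)
            n += 1
    i, q = i_sum / n, q_sum / n
    saturation = math.hypot(i, q)
    hue = math.degrees(math.atan2(q, i)) % 360
    return saturation, hue

even_stripes = [1, 0] * 16   # lit pixels in the even columns (the $AA pattern)
odd_stripes  = [0, 1] * 16   # lit pixels in the odd columns  (the $55 pattern)
solid        = [1] * 32      # every pixel lit                (the $FF pattern)

for name, row in (("even stripes", even_stripes), ("odd stripes", odd_stripes), ("solid", solid)):
    sat, hue = artifact_color(row)
    print(f"{name:13s} saturation {sat:.3f}, hue {hue:5.1f} degrees")

The absolute hue angles here don't mean much on their own; where they land on the color wheel depends on the fixed phase offset between the pixel clock and the colorburst, which is exactly the thing that differed between hardware revisions (more on that below).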

This has a number of implications. An old-school analog television doesn't have pixels, unless it's hooked up to something that generates pixels, but it's never going to be able to show fine detail at a very high horizontal frequency without it turning into colorful hash. Seventies guys wearing houndstooth jackets would show up on TV with the pattern turning into psychedelic rainbow colors. Home computers that were hooked up to TV sets were limited practically to 40-column text, because the horizontal detail needed for narrower characters would turn into artifact mush; that meant they were less useful than they could be for serious productivity applications. But it also meant that with a 320-pixel-wide monochrome graphics mode, you could tickle the TV's color-decoding circuitry directly just by alternating dark and light pixels.

The effect didn't work on PAL TVs because they had a different color clock frequency, one that didn't correspond precisely to two pixels, and they also had an error-correction trick (the line-by-line phase alternation that gives PAL its name) that would tend to average this kind of spurious color away. There, you'd apparently get shades of gray from something like this.
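To put rough numbers on the mismatch (these are the commonly quoted clock figures, and the PAL dot clock is an approximation):

# Subcarrier vs. hi-res dot clock on the two TV standards (commonly quoted values).
NTSC_SUBCARRIER = 3.579545e6     # Hz
NTSC_DOT_CLOCK  = 7.159090e6     # Hz, NTSC Atari hi-res pixel rate: exactly 2x the subcarrier

PAL_SUBCARRIER  = 4.433619e6     # Hz
PAL_DOT_CLOCK   = 7.093790e6     # Hz, PAL Atari hi-res pixel rate (approximate)

print("NTSC pixels per subcarrier cycle:", NTSC_DOT_CLOCK / NTSC_SUBCARRIER)   # 2.0
print("PAL pixels per subcarrier cycle: ", PAL_DOT_CLOCK / PAL_SUBCARRIER)     # about 1.6

An alternating-pixel pattern on the PAL machine therefore doesn't sit at the subcarrier frequency at all, so instead of a steady hue the decoder mostly just sees its average brightness.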

You could only really get a couple of extra colors this way, and, as I mentioned, the colors on Atari 8-bits kept changing when Atari changed their graphics hardware slightly. The GTIA chip produced different artifacting colors relative to the earlier CTIA, and the XL/XE computers changed it again. Why? Well, they slightly changed the timing of the signal representing video pixels relative to the colorburst, so the same pattern of pixels would give you a different relative phase and therefore a different hue. Some Atari computer emulators actually simulate the artifacting produced by all these iterations of the hardware, and give you the option to choose.
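Continuing the little demodulator sketch from above: a hardware revision that moves the pixel timing against the colorburst is, as far as the decoder is concerned, just a different reference phase, and the hue rotates right along with it. (The shift amounts below are arbitrary examples, not the actual CTIA/GTIA/XL offsets.)

# Same even-column stripe pattern, decoded against reference carriers at
# different phases -- a stand-in for the changing pixel-to-colorburst timing.
for shift_deg in (0, 45, 90, 180):
    sat, hue = artifact_color(even_stripes, burst_phase=math.radians(shift_deg))
    print(f"pixel-to-burst shift {shift_deg:3d} deg -> hue {hue:5.1f} deg, saturation {sat:.3f}")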
