So I've been taking another trip down memory lane, actually messing with some stuff I never had as a kid: pinball simulations on the Atari 400/800/XL/XE computers. The most famous pinball games on early personal computers tended to be pretty straight ports from the Apple II, and when you see videos of them in emulation or played on modern video hardware, they often look incredibly ugly--monochrome graphics with eye-hurting vertical stripes, like in this video from ILLSeaBass of David's Midnight Magic:
You may recall this kind of thing showing up a lot on CGA MS-DOS games too.
But then again, the graphics might instead look like this (posted by ILLSeaBass as well):
Much better--now there's colors, if somewhat peculiar colors. It's the same game. What's going on?
These Atari releases were in a graphics mode that Atari BASIC programmers called "GRAPHICS 8", the highest-resolution mode, which for a full screen gave you 320x192 pixels. But it was a monochrome mode: you could specify the background color, and the brightness of the foreground color relative to that, and that was it.
How do you get colors in a high-res monochrome mode? Well, home computers like the Atari were usually hooked up to a color TV, which, in North America in the 1970s and 80s, used the NTSC analog broadcast standard. On those TVs you could use a trick called "artifacting": if you made a pattern of alternating vertical white and black stripes, it would appear as a solid field of color. You didn't have a lot of control over the colors; you could get a couple of different ones depending on whether you put the white lines on the odd or the even pixel columns. And on Ataris, the colors also varied wildly between different releases of their graphics chipset, so that was a thing you had to live with. But it was a thing you could do. Apple II and TRS-80 CoCo games did this a lot too.
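If you want to see the effect for yourself, here's a minimal Atari BASIC sketch--typed from memory, so treat it as a sketch rather than tested code--that paints one-pixel stripes down all the even columns. On a real NTSC composite hookup the whole field should read as one solid artifact color; change line 50 to run from 1 to 319 to put the stripes on the odd columns and get the other color.

```
10 GRAPHICS 8+16:REM FULL-SCREEN 320X192, NO TEXT WINDOW
20 SETCOLOR 2,0,0:REM BACKGROUND: BLACK
30 SETCOLOR 1,0,14:REM FOREGROUND LUMINANCE: NEAR WHITE (HUE IS IGNORED IN GR.8)
40 COLOR 1
50 FOR X=0 TO 318 STEP 2:REM LIGHT UP THE EVEN COLUMNS ONLY
60 PLOT X,0:DRAWTO X,191:REM ONE-PIXEL-WIDE VERTICAL STRIPE
70 NEXT X
80 GOTO 80:REM PARK HERE SO THE SCREEN STAYS UP
```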
With MS-DOS games, though, it was actually pretty common for a PC to be hooked up to something other than a TV or composite color monitor, and then games that used this effect would look like butt. Often you could turn it on or off on those games to account for the kind of monitor you were using.
When I was a kid messing around with Atari programming, I knew about this and played with the effect, but I didn't know how it worked. I thought vaguely that it had something to do with the positioning of the colored phosphor dots on the screen, but it has nothing at all to do with that. It's actually all about the way the NTSC standard encodes color information. Color is encoded as an extra modulation layered right on top of the monochrome brightness data: a subcarrier whose period happens to correspond exactly to two horizontal pixels of this GRAPHICS 8 mode. The amplitude of that modulation controls the saturation of the color, and its phase (relative to a reference burst of the subcarrier at the start of every scanline, the "colorburst") controls the hue.
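To put rough math on it--this is just the standard NTSC relationship, nothing Atari-specific, so take the exact form as a sketch--the composite signal along a scanline looks roughly like

$$ V(t) = Y(t) + I(t)\cos(2\pi f_{sc} t) + Q(t)\sin(2\pi f_{sc} t), \qquad f_{sc} \approx 3.579545\ \text{MHz}. $$

The set recovers $I$ and $Q$ by multiplying $V(t)$ against two reference carriers locked to the colorburst and low-pass filtering. Now leave the chroma terms at zero, but let the brightness $Y(t)$ be a square wave flipping between $Y_0 - A$ and $Y_0 + A$ every pixel. Since a GRAPHICS 8 pixel is half a subcarrier period wide, the fundamental of that square wave is

$$ Y(t) \approx Y_0 + \frac{4A}{\pi}\cos(2\pi f_{sc} t + \phi), $$

which the decoder can't tell apart from genuine chroma: the stripe contrast $A$ comes out as saturation and the phase $\phi$ comes out as hue. Moving the bright pixels from the even columns to the odd ones shifts $\phi$ by 180 degrees, which is why you get exactly two, roughly complementary, artifact hues.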
This has a number of implications. An old-school analog television doesn't have pixels (unless it's hooked up to something that generates pixels), but either way it can never show fine horizontal detail near that subcarrier frequency without it turning into colorful hash. Seventies guys wearing houndstooth jackets would show up on TV with the pattern turning into psychedelic rainbow colors. Home computers hooked up to TV sets were limited practically to 40-column text, because the horizontal detail needed for narrower characters would turn into artifact mush; that made them less useful than they could have been for serious productivity applications. But it also meant that with a 320-pixel-wide monochrome graphics mode, you could tickle the TV's color-decoding circuitry directly just by alternating dark and light pixels.
The effect didn't work on PAL TVs, because they used a different color clock frequency that didn't line up precisely with two pixels, and I think PAL's line-by-line phase alternation--which exists to cancel hue errors--also worked against this sort of thing. There, you'd apparently just get shades of gray from a pattern like this.
You could only really get a couple of extra colors this way, and, as I mentioned, the colors on Atari 8-bits kept changing when Atari changed their graphics hardware slightly. The GTIA chip produced different artifacting colors relative to the earlier CTIA, and the XL/XE computers changed it again. Why? Well, they slightly changed the timing of the signal representing video pixels relative to the colorburst, so the same pattern of pixels would give you a different relative phase and therefore a different hue. Some Atari computer emulators actually simulate the artifacting produced by all these iterations of the hardware, and give you the option to choose.
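The arithmetic behind that, assuming the standard NTSC subcarrier: the subcarrier period is

$$ T_{sc} = 1/f_{sc} \approx 279.4\ \text{ns}, $$

and a GRAPHICS 8 pixel is half of that, about 139.7 ns. Shifting the pixel output by $\Delta t$ relative to the colorburst rotates every artifact hue by

$$ \Delta\phi = 360^{\circ} \times f_{sc} \times \Delta t, $$

so a shift of just half a pixel (about 70 ns) drags every hue a quarter of the way around the color wheel, and a full pixel swaps the even-column and odd-column colors outright. I don't know the exact offsets in the CTIA, GTIA, and XL/XE hardware; the point is that timing tweaks too small to notice anywhere else map directly onto big hue changes.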
no subject
Date: 2019-05-29 02:53 pm (UTC)

1. On MS-DOS, you got three foreground colors to begin with in the CGA 320x200 mode, though you didn't have much choice: it was either white, cyan and magenta or yellow, red and green. But by combining these with the artifact colors you could actually work up a nice little palette. So it was considerably more refined there than on the Atari... for the minority of users who were using composite video. IBM-compatibles were SERIOUS BUSINESS machines, users typically wanted 80-column text displays more than anything else, and in the early days many didn't even bother with color. Some didn't even bother with graphics.
2. I've heard that artifacting didn't work on the Commodore 64, except in the earliest hardware revisions, because its color video generation was more sophisticated-- but I don't really know the electronic details. Instead, the 64 gave you legit color support at fairly high resolutions.
3. Atari eventually added full support for a semi-hi-res mode, "GRAPHICS 7.5" or "GRAPHICS E", initially not directly accessible from BASIC, that gave you four colors with Atari's nifty color indirection at 160x192. When Atari came out with the "XE Game System" (their final, far-too-late bid to turn their 8-bit computer platform into a dedicated games console) they apparently remade some existing GRAPHICS 8 games to use this mode instead; Choplifter was one. For most game purposes this was really better than GR.8 with artifacting, since the extra horizontal resolution was tricky to exploit anyway and this gave you full color control.
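(For anyone curious, the classic way to reach that mode from BASIC before the OS supported it directly was to rewrite the display list by hand--roughly like the sketch below, typed from memory and unverified. GRAPHICS 8 builds its display list out of ANTIC mode 15 ($0F) lines, and ANTIC mode 14 ($0E) is the 160x192 four-color mode, so you walk the list and patch the mode bytes:)

```
10 GRAPHICS 8+16
20 DL=PEEK(560)+256*PEEK(561):REM DISPLAY LIST ADDRESS (SDLSTL/SDLSTH SHADOW)
30 I=DL
40 B=PEEK(I)
50 IF B=65 THEN 90:REM $41 JUMP-AND-WAIT = END OF THE DISPLAY LIST
60 IF B=79 THEN POKE I,78:I=I+3:GOTO 40:REM $4F LMS+MODE F -> $4E, SKIP ADDRESS
70 IF B=15 THEN POKE I,14:REM $0F PLAIN MODE F LINE -> $0E MODE E
80 I=I+1:GOTO 40
90 REM NOW 160X192 WITH FOUR COLOR REGISTERS; PARK SO IT STAYS UP
100 GOTO 100
```

(The OS still believes it's in GRAPHICS 8 afterward, so PLOT and DRAWTO keep using 320-wide coordinates, and each adjacent pair of GR.8 pixels becomes the two-bit color code of one fat mode-E pixel--one reason most software just wrote to screen memory directly.)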
no subject
Date: 2019-05-29 06:50 pm (UTC)

Descriptions of hires color graphics on the Apple II seem bafflingly arcane as a result--for understanding, it really helps to know that it all works by exploiting artifacting.
Games ported from the Apple to other platforms often retained the use of artifact color in a hires monochrome display, but would have to sacrifice two of the colors: the Apple II's trick of delaying a byte's seven pixels by half a pixel (via the high bit of each screen byte) doubled its artifact palette from two hues to four, and other machines had no equivalent.
no subject
Date: 2019-05-29 07:10 pm (UTC)