Dec. 20th, 2003
Doom-mongering for the Web
Dec. 20th, 2003 11:13 am
Spamocalypse may be in the cards for the world of weblogs, but it's worth noting that many predictions of Web doom have been wrong in the past.
I was an early adopter of CSS, and consequently I hung out frequently on the Usenet group comp.infosystems.www.authoring.stylesheets a few years ago. The newsgroup seemed populated by people who were knowledgeable and brilliant, but also perpetually disgruntled and prone to lectures about how the Web was going to hell because nobody had listened to their favored way of doing things back in 1992. Around 2000, when the first wave of browsers with half-decent standards compliance started to get some traction, I ventured to express some optimism about the situation, and my sentiments weren't generally shared. Sure, these browsers can finally lay out a nontrivial box-model situation without barfing, but can't you see it's all doomed because the W3C's cascade model can't handle this or this or this? Sure, DOCTYPE sniffing allows browser developers to cover their butts on backward compatibility while putting in proper parsing, but it violates the stated function of DOCTYPEs in SGML and would have bad results in this corner case and this one and this one, so is anathema!
Half a loaf was never good enough, doing nothing would have been preferable to doing things in any imperfect way, and because of the industry's willingness to compromise, we were inevitably progressing toward a Web wholly owned by Microsoft. What's happened instead is interesting: Microsoft indeed has overwhelmingly dominant market share, yet open standards compliance is enough of a selling point that you can actually use a non-Microsoft browser for pretty much all of your needs outside of corporate intranets, and a small fraction of sites designed by dimwits. If any one company has a proprietary lock on a chunk of the Web, it's Macromedia.
Here's another one: Remember the Imminent Death of the Amateur Indie Web? There was a lot of this going around back in the Nineties during the dot-com boom. The story went something like this: "Once upon a time, the World Wide Web was a collection of thrown-together personal sites with kitty pictures on them. But now, with big corporations moving into the space, individuals are going to be out-competed, and soon the Web will just be another big-media organ, with amateurs reduced to the status of cable access television, scorned and ignored." What followed was the tale of the Death of Content, which I found particularly amusing: "Once upon a time, people went to the Web to obtain information, and we said that content was king. But companies founded to make money off Web content haven't. Since, as we said earlier, corporate sites will soon outcompete amateurs, it follows that content is vanishing as a driving force on the Web." I used to wonder what they thought would replace content: Empty templates that hypnotize you into looking at them? Sites full of random characters, like Borges' Library of Babel? More likely they thought it would just all go to e-commerce, like TV channels switching to 24-hour home shopping. Some mysterious force would keep people from wanting to see the kitty pictures any more. The Web was always going to be more like TV in some way or other, and if it wasn't turning out to succeed in the way TV had, it was probably just going to evaporate.
It didn't happen. Every corporation has a Web presence now, and a few of them even have content worth looking at. But the big exciting story of the past few years has been the weblog explosion, which is almost completely amateur material. E-commerce is mostly corporate and content is mostly amateur. The difference from cable access television is that a dedicated amateur with little capital can put up a content site that's as polished as a corporate product, or more so. There are scaling problems when the site gets popular, but nobody said that any individual site had to be as popular as amazon.com, and it turns out that a few of them can even meet expenses by hawking T-shirts.
Megapixels
Dec. 20th, 2003 11:43 pm
In the comments to Reid Stott's interesting article on the ongoing film-versus-digital debate in high-end photography, he mentioned something that I've talked about before but that needs to be said again: when shopping for a digital camera, megapixels aren't everything. People who admire the pictures I take often ask me how many megapixels my camera has, and they're mildly surprised that it's an old 2MP job. It seems as if the megapixel count has become the equivalent of megahertz in the PC industry (or kilobytes of RAM back in the eighties): the premier "mine's bigger than yours" number. It shouldn't be.
Mind you, they're not completely insignificant. Other things being equal, more pixels are usually better. If you're just going to use your pictures on the Web or in e-mail, two megapixels are more than enough; you'll end up scaling or cropping the pictures down anyway. 2MP produces pretty good prints, too, at typical sizes. But the more pixels you have, the more cropping you can do when composing your image after shooting it, and still get decent-looking results. Also, because of the interpolation done to get a full-color image out of the camera's filter mosaic, digicam images displayed onscreen at the original resolution usually look a little soft, so an image scaled down from a higher resolution can actually be somewhat sharper and more detailed.
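If you want to see that downscaling effect for yourself, something like the following Pillow sketch will do it; the filenames and the 800-pixel target width are just placeholders, not anything specific to my setup:

```python
from PIL import Image  # pip install Pillow

# Hypothetical filename; any full-resolution digicam capture will do.
img = Image.open("DSCN1234.JPG")

# Scale down to a Web-friendly width, keeping the aspect ratio.
# A good resampling filter (Lanczos) averages out the softness left by
# the camera's color interpolation, which is why the downscaled image
# often looks crisper per pixel than the original did at 100%.
target_width = 800
target_height = round(img.height * target_width / img.width)
small = img.resize((target_width, target_height), Image.LANCZOS)
small.save("DSCN1234_web.jpg", quality=85)
```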
But in practice, other things are not equal. As Reid mentions, a larger number of pixels crammed into the same physical sensor size can lead to greater sensor noise and make grainier-looking images. (I've heard that this is a problem with the Nikon 3500, the higher-resolution sibling of my 2500.) Bigger images take up more memory on the card. Also, if you've got a finite amount to spend, often you can spend it either on higher sensor resolution or on other things, and the other things might be better. The size and quality of the lens is important; optical zoom is good. The most limiting thing about my little Nikon is not that it's two-megapixel, but that its lens is kind of small and doesn't do that great in low light, which limits what I can do indoors.
The most important thing to realize, though, is that there's an effect of diminishing returns. The pixel count is a kind of area: horizontal resolution times vertical resolution. That means that the number of pixels per centimeter in your image (however you display it) only increases as the square root of the pixel count. Two megapixels are 1.41 times as good as one, but four megapixels are only 1.15 times as good as three. Before long, you need tremendous increases in megapixels to get an equivalent increase in image quality. Meanwhile, the amount of memory you need to hold the images goes up more or less linearly with the pixel count.
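To put rough numbers on that square-root effect, here's a quick back-of-the-envelope sketch in Python; the pixel counts are just illustrative:

```python
from math import sqrt

def linear_gain(new_mp, old_mp):
    """Relative gain in linear resolution (pixels per cm at a fixed
    display size) when going from old_mp to new_mp megapixels."""
    return sqrt(new_mp / old_mp)

# Diminishing returns: each extra megapixel buys less linear resolution,
# while storage grows roughly in proportion to the pixel count.
print(f"1MP -> 2MP: {linear_gain(2, 1):.2f}x the pixels per cm")  # ~1.41x
print(f"3MP -> 4MP: {linear_gain(4, 3):.2f}x the pixels per cm")  # ~1.15x
print(f"2MP -> 8MP: {linear_gain(8, 2):.2f}x the pixels per cm, for ~4x the storage")
```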
It seems to me that, once you've got the money to buy a higher-end camera (which I can't personally justify doing at this point), one of the most important tradeoffs has nothing to do with megapixels, but is lens size versus portability. I'd love to have a camera with the kind of lens that you can use to shoot reliably by dim indoor light, but better lenses are usually bigger, so they tend to make cameras harder to carry around and operate on a whim. Conversely, there are expensive and fancy cameras that easily fit in a shirt pocket, but slowish lenses are typically their weak point.