The Seventeen-Year Cicadas

The last of the seventeen-year cicadas disappeared last week. The few stragglers that could be spotted were heavily battered and drained of energy. The birds were aggressively conducting cicada mopping-up operations. A wing may still occasionally be tracked into the office, but even over in Klingle Park, where I work, they can be neither seen nor heard. Three weeks ago they were so thick that they interfered with the kids’ soccer games and filled the whole neighborhood with their chirp-din.

I was completely taken in by them, as if a little, bug-infatuated boy again. My upwelling of youthful enthusiasm was tempered by more adult-like naturalist imaginings, so I have recorded some observations, for natural philosophic purposes, of course.

When last they came out in 1987, I was an eleven-year-old living in Washington State, but I read all about them in, I think it must have been, National Geographic. As a kid, I was enthralled at the idea that there was this normal phenomenon, annual cicadas — normal, green, buggish — but that once every great while, there were special cicadas that were spectacularly colored, the cause of much stir, more space invader than terrestrial creature. I have kept them in the back of my mind ever since — to the point that, during a visit with family in Missouri, I was totally excited when the cat seemed to be buzzing, and upon closer inspection found in the cat’s jaws one of the annual cicadas protesting its rude handling by the feline field hand. What a fabulous occurrence that my first spring in D.C. would coincide with such a rare event!

The articles in the papers appeared earlier than the bugs themselves, and as some time had elapsed since their publication and I still hadn’t seen any cicadas, I was beginning to worry that I had missed them. The neighbor told me that they were all over the suburbs. I thought that maybe I would have to travel to see them. Then, one afternoon while lounging in the back yard, the cat seemed particularly excited about something that he was rooting at in the ground cover. I went to investigate, as kitty’s excitement usually means trouble, and found that he had caught a seventeen-year cicada. Just as my first sighting of an annual cicada was courtesy of a cat, so now with the seventeen-year cicada.

When the cat found a second and then a third, I began to have a look around and it turned out that they were everywhere. There weren’t so many of the insects themselves, but the empty claws of their molted carapaces were dug into the stems of all the low vegetation that I looked at. As I walked back outside from an errand to the kitchen, I spotted two more clinging to the door frame of the laundry room.

Seventeen-year cicada posed on the chain-link fence in the back yard, Mount Pleasant, Washington, D.C., circa June 2004

Of course, the emergence of the cicadas is not all excitement. Cicadas were born to die and the macabre display of nature’s carelessness for her creatures is a little hard to handle. The whole of D.C. has become like the killing field of some third-world genocidal war. Most insects possess the winning combination of the will to live and lightning reflexes. Cicadas have neither. Their beady red eyes stoically look on as a boot-shaped shadow blocks out the sun. One actually has to make an effort not to kill them. They can be picked up with no trouble. Occasionally a particularly feisty one will issue a chirp of protest, but they are wholly pliant. Their sole survival skill is sheer numbers.

I kept on trying to save them, carefully moving to a tree trunk or planter box each cicada that I came upon in the middle of a sidewalk, but after a while this became an empathy-exhausting exercise: if I am going to make such an effort to save these creatures, they should at least meet me halfway. On more than one occasion, I tried to toss one into the bushes only to have it fly, in its suicidal frenzy, right back smack into the middle of the sidewalk.

I will count for you the myriad ways that cicadas die. I do this, not to disgust, but to truly represent the oppressiveness of the carnage:

  1. Owing to the damage done by onrushing automobile windshields, the sidewalks were littered with their corpses, always attended to by a phalanx of black ant pall bearers. Unlike most bugs, which splatter against the glass, cicadas break apart and scatter as if they were made of balsa wood or very delicate aluminum parts affixed to one another with little bits of solder or eye-glass screws. So the sidewalks were littered with the crunchy segments of abdomens that split nicely at the seam between exoskeleton plates, heads, legs, whole bodies sheared of extremities, thoraxes with a full complement of wings, wings with a black stump of wing muscles, miscellaneous unrecognizable bits of carapace, et cetera.

  2. I found one that had molted in the laundry room, looking rather ashen at his predicament, stuck in the window sill and all. I thought that I was doing him a favor by setting him free. I carried him out to the fence and tried to nudge him off my hand onto a high rail where the cat wouldn’t chew him to pieces. There was already a congregation of them there, delivered from the cat. Rather than crawl off, he flew. About twenty feet out, a bird swept in and thwack!, caught him in mid-flight. I could hear the thump of their collision. I didn’t know birds ate these things, but I guess it makes sense: they are bugs — and juicy feasts of bugs at that. It sucks to watch something you tried to save die.

  3. While walking home, I passed an ivy-covered lawn in which a squirrel was rooting among the vines. When I made a noise to alert him to my proximity he sat up, realized the situation and jumped three feet up a tree trunk. That was enough to put him at a safe distance as the yard was atop a wall at about chest level. He seemed all too proud of himself over the cicada, still squirming in his mouth, that he had dug out of the brush.

  4. Cicadas are like turtles: once on their back, they are screwed. On field sports day at the school, I watched one fly across the school lawn and crash land — the only sort of landing of which they seem capable — in the dust of a track that the students were using to practice for the three-legged race. I considered this one beyond help and watched as she squirmed on her back for what must have been five minutes in this treacherous lane. At that point I reconsidered: she was lucky and deserving of a hand.

  5. Cicadas regularly plop down on the back of one’s shirt or in the middle of a high-traffic walkway after simply falling out of a tree. Maybe they are inattentive climbers and lose their footing. Maybe it’s because they are cold-blooded creatures and when they walk into a shadow, they shut down. Whatever the case, there is a certain tree that overhangs the sidewalk on my route to work that has made for quite a mess. In addition to a grossly disproportionate number of cicadas falling from this tree, it is also dumping tons of black berries. It is a well-traveled stretch of sidewalk, so the trampled cicadas and mushed berries have made a disgusting black roux of the splattered sperm- and egg-filled bodies of the insects, their ruptured and smeared organs, the reproductive parts of plants and a juicy-sweet fruit. The scene is just a little too much nature to handle. It busts open the myth of resplendent nature and foists upon the unwary morning walker the gruesomeness of it all. I think of Camille Paglia every time I tiptoe around it:

    Everything is melting in nature. We think we see objects, but our eyes are slow and partial. Nature is blooming and withering in long puffy respirations, rising and falling in oceanic wave-motion. A mind that opened itself fully to nature without sentimental preconceptions would be glutted by nature’s coarse materialism, its relentless superfluity. An apple tree laden with fruit: how peaceful, how picturesque. But remove the rosy filter of humanism from our gaze and look again. See nature spuming and frothing, its mad spermatic bubbles endlessly spilling out and smashing in that inhuman round of waste, rot, and carnage. From the jammed glassy cells of sea roe to the feathery spores poured out into the air from bursting green pods, nature is a festering hornet’s nest of aggression and overkill. (Sexual Personae: Art and Decadence from Nefertiti to Emily Dickinson [New Haven: Yale University Press, 1990] p. 28)

  6. In a parking lot I watched a cicada desperately trying to get away from a bird, who delivered the coup de grace with a peck, then pulled the vanquished bug’s wings off by beating it against the pavement, and flew off with just the juicy bit in its beak. The birds really had it good for a few weeks.

  7. One was flapping about like mad on the sidewalk such that it could be heard all down the block. I went to right it, but it wouldn’t stop. I realized that it was not merely stuck upside-down, it was spasming, probably from neurological (I hesitate to say “brain”) damage inflicted by a car windshield.

  8. I picked one up off the sidewalk on Porter Street on my way through Rock Creek Park and tried to toss it into the encroaching vines. It had other ideas and instead flew straight into the street. The first passing automobile sent it tumbling down the pavement like a piece of garbage. It got its footing just in time for the second car to splatter it. Again, it sucks to watch something you tried to save die.

  9. Finally, cicadas bite it because people eat them. As the cicadas started appearing, there were a number of stories about them in the papers, all of which mentioned the cicada eaters (e.g. Barr, Cameron W., “Cicada: The Other White Meat,” The Washington Post, 16 April 2004, p. A1). A restaurant here in D.C. was actually going to put them on the menu, fried in butter, white wine and a few sprigs of lemon grass. No one is waxing fantastic about eating the annual cicadas: they’re just green bugs. But, Oh! The seventeen-year cicadas are such a delicacy! It seems that the amount of glee people take in eating a thing is proportional to the amount of destruction accomplished in the thing’s consumption.

On to a few slightly less morbid observations. After some time seeing their molted exoskeletons around, I was lucky enough to come across one in the process of molting one morning on my way to work. I watched for a couple of minutes. I had imagined the molting process to be hours of peril for an insect, but it seemed to be going very quickly. Little contractions moved from the end of the abdomen to the top of the thorax, where the old carapace had split open. It made a little wriggle side to side to get its legs and antennae free. The wings were visible, crumpled, still not unfurled and hardened and, strangely, the body was white, not yet black. Perhaps the preliminary process of splitting the old shell and the post-molt waiting for wings and new carapace to harden are time consuming. I had to move on before the new insect was free. It is clearly a process that is not perfect, as I came across a few cicadas with a still wilted wing and in one case, I found a cicada, still clinging half-way up a tree, that had died part way out of its old shell. Sorry, I said that I was done with the macabre aspects of the cicadas.

Surprisingly late in the season I came across two larval cicadas that had not yet molted. They are an almost entirely different creature. They seem a little fatter but they are much quicker and have a pointy snout, rather than the metallic face-mask of a Mortal Kombat ninja villain. Rather than the outlandish colors of the mature insect, they are all the same shade of creamy brown. I tried for some time to follow these two to see if they would dig their claws into the wooden handrail on which I found them and begin the molting process, but they seem extremely finicky about their selection of molt location. In my back yard, it seemed that they just ran six inches up the first stalk they could find, but I do recall a large number of abandoned carapaces about ten feet up and way out on a limb of a tree in the neighbor’s yard.

Seventeen-year cicada posed on the chain-link fence in the back yard, Mount Pleasant, Washington, D.C., circa June 2004

They are a beautiful little insect. Their colors are so vibrant, and the contrasts between their gunmetal-black bodies, the gold highlights of their exoskeletal plates and gold wings and their red eyes are striking. They look like little kid roaches wearing mom’s gold jewelry. They are fat, meaty little creatures that drag their too fat abdomens around behind them. Their thorax comes to a strong point where the shoulder blade meets the wing. The groove that extends back from there, into which the wings park nicely, strikes this observer as one of those examples of evolution having perfectly molded a thing for its purpose. Their story — of nibbling on tree roots underground for seventeen years, to emerge into the bright world of day for a short few weeks of mating and to die in such staggering numbers — gives them a strange pathos. Strange especially because who would have ever attributed pathos to an insect, but it is there on their stoic, unmoving faces. That they are so unmoved by death almost implies a certain knowledge on their part of its inevitability. But I get a little carried away.

Many people complained about them. I loved them. They will be down there, digging around in the roots. I can’t wait to see them again in seventeen years.


Kinsley on Brooks

I don’t want to seem as if I started a blog solely to rant about David Brooks, but Michael Kinsley’s very clever review (“Suburban Thrall,” The New York Times, 23 May 2004) of Brooks’s new book, On Paradise Drive: How We Live Now (and Always Have) in the Future Tense warrants a few remarks. First, Kinsley points out how easily liberals have been duped by Brooks:

For several years, in the world of political journalism, David Brooks has been every liberal’s favorite conservative. This is not just because he throws us a bone of agreement every now and then. Even the most poisonous propagandist (i.e., Bill O’Reilly) knows that trick. Brooks goes farther. In his writing and on television, he actually seems reasonable. More than that, he seems cuddly. He gives the impression of being open to persuasion. Like the elderly Jewish lady who thinks someone must be Jewish because “he’s so nice,” liberals suspect that a writer as amiable as Brooks must be a liberal at heart. Some conservatives think so too.

There is a prize for being the liberals’ favorite conservative, and Brooks has claimed it: a column in The New York Times.

Lay off, Kinsley. I admit it: I thought that he seemed cuddly too. I was excited by the New York Times column. I am five posts into this thing and already I’m airing opinions this easily lampooned.

The problem that I am having with Brooks is that the humor serves to weaken, or at least confuse, the critical faculty. I don’t know how to read Brooks. Is he a political humorist like P.J. O’Rourke or Al Franken? But I don’t have trouble reading O’Rourke and Franken: they are sure to be clear about when they are interjecting a joke or two and when they are making a serious point. Is he a sociologist who employs a snappy commercial shorthand instead of the dry phrasing of academia? But Brooks seems to want an undue amount of hyperbolic license to make his case, to the point where his exaggerations become simply misleading.

Citing Sasha Issenberg’s fact checking of Brooks (“Boo-Boos in Paradise,” Philadelphia Magazine, April 2004; to be filed right next to Thomas Frank’s essay and my earlier post), Kinsley spends some time on the essential unseriousness of Brooks’s analyses: “Brooks does not let the sociology get in the way of the shtick, and he wields a mean shoehorn when he needs the theory to fit the joke.” This is a more genial version of Frank’s criticisms, which recognized the insidiousness of Brooks’s under-the-radar take on class in America:

The tools being used are the blunt instruments of propaganda, not the precise metrics of sociology. The “two Americas” commentators showed no interest in examining the mysterious inversion of the nation’s politics in any systematic way. Their aim was simply to bolster the stereotypes using whatever tools were at hand …

Even if his chosen style makes a muddle of it, Brooks is correct to point out the deep divides separating Americans. As his structuring metaphors of consumerism call out, much of this has to do with materialist factors. Here is Kinsley’s attempt to make sense:

… our defining — and uniting — characteristics as Americans, according to Brooks, are that we’d rather leave than fight, and we’re always thinking about the future instead of dwelling on the past. That means the enormous gulfs in values, aspirations, understanding of the world and food preferences he outlines so wittily in the first part of “On Paradise Drive” don’t turn Americans against one another … We all prosper in our various cultural cul-de-sacs (or as Brooks puts it, much better: “Everybody can be an aristocrat within his own Olympus”), and we don’t trouble ourselves about what the folks in the next cul-de-sac might be up to.

I bookmark this phenomenon because I will have a lot more to write about it in some future posts on micro-fame and the technological changes that drive and structure it.

Okay, now I’ll lay off Brooks for a while.

What’s Wrong With David Brooks

Fortunately, I’m not the only one who thinks that David Brooks is lost in la-la land. Thomas Frank, whose book, What’s the Matter with Kansas? How Conservatives Won the Heart of America, is currently receiving a good deal of liberal acclaim, writes the following in an excerpt thereof published in Harper’s (“Lie Down for America,” April 2004, p. 37):

David Brooks, who has since made a career out of projecting the liberal stereotype onto the [red and blue map of the 2000 election], took to the pages of The Atlantic to admit on behalf of everyone who lives in a Blue zone that they are all snobs, toffs, wusses, ignoramuses, and utterly out of touch with the authentic life of the people:

We in the coastal metro Blue areas read more books and attend more plays than the people in the Red heartland. We’re more sophisticated and cosmopolitan — just ask us about our alumni trips to China or Provence, or our interest in Buddhism. But don’t ask us, please, what life in Red America is like. We don’t know. We don’t know who Tim LaHaye and Jerry B. Jenkins are … We don’t know what James Dobson says on his radio program, which is listened to by millions. We don’t know about Reba and Travis … Very few of us know what goes on in Branson, Missouri, even though it has seven million visitors a year, or could name even five NASCAR drivers … We don’t know how to shoot or clean a rifle. We can’t tell a military officer’s rank by looking at his insignia. We don’t know what soy beans look like when they’re growing in a field.

One is tempted to dismiss Brooks’s grand generalizations by rattling off the many ways in which they’re wrong: by pointing out that the top three soybean producers — Illinois, Iowa, and Minnesota — were in fact Blue states; or by listing the many military bases located on the coasts; or by noting that when it came time to build a NASCAR track in Kansas, the county that won the honor was one of only two in the state that went for Gore. Average per capita income in that same lonely Blue county, I might as well add, is $16,000, which places it well below Kansas and national averages, and far below what would be required for the putting on of elitist or cosmopolitan airs of any kind.

It’s pretty much a waste of time, however, to catalogue the contradictions* and tautologies** and huge, honking errors*** blowing round in a media flurry like this. The tools being used are the blunt instruments of propaganda, not the precise metrics of sociology. The “two Americas” commentators showed no interest in examining the mysterious inversion of the nation’s politics in any systematic way. Their aim was simply to bolster the stereotypes using whatever tools were at hand: to cast the Democrats as the party of a wealthy, pampered, arrogant elite that lives as far as it can from real Americans; and to represent Republicanism as the faith of the hard working common people of the heartland, an expression of their unpretentious, all American ways, just like country music and NASCAR. At this pursuit they largely succeeded.

* Consider what we might call the snowmobile dilemma. David Brooks insists that one can trace the Red-state/Blue-state divide by determining whether a person does outdoor activities with motors (the good old American way) or without (the pretentious Blue state way): “We [Blue state people] cross country ski; they snowmobile.” And yet in Newsweek’s take on the Blue/Red divide (it appeared in the issue for January 1, 2001), a “town elder” from Red America can be found railing against people who drive snowmobiles precisely because they signal big city contempt for the “small town values” of Bush Country!

** In the selection printed above, David Brooks tosses off a few names from the conservative political world as though they were uncontroversial folk heroes out in the hinterland, akin to country music stars or favorite cartoonists. But the real reason liberals don’t know much about James Dobson or Tim LaHaye is not because they are out of touch with America but because both of these men are ideologues of the right. Those who listen to Dobson’s radio program or buy LaHaye’s novels, suffused as they are with Bircher style conspiracy theory, tend to be people who agree with them, people who voted for Bush in 2000.

*** The central, basic assertion of the Blue state-Red state literature is that the Democrats are the party of the elite while the Republicans are the party of average, unpretentious Americans. Accordingly, David Brooks asserts in his Atlantic essay that “Upscale areas everywhere” voted for Gore in 2000. As a blanket statement about the rich, this is not even close to correct. Bush was in fact the hands down choice of corporate America: According to the Center for Responsive Politics, Bush raised more in donations than Gore in each of ten industrial sectors; the only sector in which Gore came out ahead was “labor.” In fact, Bush raised more money from wealthy contributors than any other candidate in history, a record he then broke in 2003.

Nor is Brooks’s statement valid even within its limited parameters. When he says “upscale areas everywhere” voted for Gore, he gives Chicago’s North Shore as an example of what he means. And yet, when you look up the actual 2000 voting returns for those areas of the North Shore known for being “upscale,” you find that reality looks very different from the stereotype. Lake Forest, the definitive and the richest North Shore burb, chose the Republican, as it almost always does, by a whopping 70 percent. Winnetka and Kenilworth, the other North Shore suburbs known for their upscaliness, went for Bush by 59 percent and 64 percent, respectively.

And there were obviously many other “upscale areas” where Bush prevailed handily: Fairfax County, Virginia (suburban D.C.), Cobb County, Georgia (suburban Atlanta), DuPage County, Illinois (more of suburban Chicago), St. Charles County, Missouri (suburban St. Louis), and Orange County, California (the veritable symbol of upscale suburbia), to name but a few.

David Brooks Through the Looking Glass

When David Brooks first began writing for The New York Times editorial page, I thought that a better selection could not have been made. Brooks is funny, clever and unorthodox1 — exactly the sort of conservative that should be writing for this country’s “newspaper of record.” As his output has begun to pile up, though, I have begun to think that he will need a star chart to locate the current state of debate.

His latest editorial, “Looking Through Keyholes” would be more aptly titled, “Through the Looking-glass.” He argues that D.C. commentators, rather than focus on the critical events in Najaf and Falluja, are a’chatter about the books and testimonies of Richard Clarke, Condoleezza Rice and Bob Woodward — all dealing with events prior to 2004. “This is like pausing during the second day of Gettysburg to debate the wisdom of the Missouri Compromise.” Time spent preparing for hearings and defending the administration against the myriad accusations is time not spent on solving the problems of Iraq. He dismisses criticisms of Bush as mere Washington conceit. “The first duty of proper Washingtonians is to demonstrate that they are smarter than whomever they happen to be talking about. It’s quite easy to fulfill this mission when you are talking about the past.”2

The fact is that for nearly two years now, Washington insiders have been trying — to no avail, but in the worst case, to their peril — to contribute to the debate over how to handle the situation in Iraq.

Regarding troop levels, General Tommy Franks initially planned to go into Iraq with a force comparable to that of the first Gulf War, but, under pressure from Rumsfeld, continually whittled it down. General Shinseki told Congress that a few hundred thousand soldiers would be required in Iraq for up to five years. Retired military personnel voiced concern about troop levels.3 John McCain and even General Abizaid have both called for additional soldiers and for a more international force.4

Pentagon personnel were prohibited from, or reprimanded for, participating in CIA war games that simulated the disorder of the aftermath of invasion.5 In the Pentagon-sponsored war games, those acting the role of the enemy were specifically prohibited from employing tactics similar to those used by Iraqi irregulars during combat. Hence Lt. Gen. William Wallace’s controversial remark, “The enemy we’re fighting is a bit different than the one we war-gamed against.”6

A parade of Iraqi exiles met with administration officials, including Bush, to warn about the dangers of a lapse in order.7 Reports by the Army War College, the Council on Foreign Relations and the James A. Baker III Institute for Foreign Policy also warned about the dangers of a breakdown in civil administration and the disbanding of the Iraqi army.8 French officials warned Rice about an insurgency and ethnic tensions.9 Former Central Command chief Anthony Zinni telephoned a general inside his old command to remind him of planning and simulations for an occupation of Iraq that Zinni had conducted in 1999.10 The State Department’s Future of Iraq project spent nearly a year producing a thirteen-volume report11 and Powell circulated among the National Security Council a fifteen-page memo on the history of U.S. occupations that argued that troop strength and postwar security would be critical factors.12

The administration was still maintaining that oil revenue would pay for Iraq’s reconstruction when Lawrence Lindsey gave an interview to The Wall Street Journal in which he estimated that the cost of the war would be $100 to $200 billion.13 Throughout the FY2004 budget negotiations, Bush stuck to his original budget for Iraq despite close Congressional questioning. Even after the $87 billion had been authorized, the Chiefs of the Army, Marine Corps and Air Force warned that it only covered about eight months of operations.14

The rewards for this diligence have been few. Shinseki was severely dressed down about his estimates, with Wolfowitz calling his estimate “wildly off the mark” and Rumsfeld reiterating the same.15 Zinni has gone from special envoy to Israel and Palestine to persona non grata. Iraqi exile Kanan Makiya, whose book, Republic of Fear, was closely read in administration circles, has gone the same way since voicing his concerns.16 Rumsfeld specifically told Jay Garner to disregard the Future of Iraq report and that the project’s chief, Thomas Warrick, was to be removed from his staff. Garner resisted the disbanding of the Iraqi army, but it went ahead after he was replaced. Lawrence Lindsey was fired for his forthrightness on the costs of the adventure in Iraq.17 Retired army personnel who spoke out on force levels were slandered as “Clinton generals,” “armchair generals” and the like.

Of course, I am skipping over the reams of excellent commentary in the media because, as Bush has said, “I rarely read the [news] stories.”18 One could have a bang-on solution for our problems in Iraq and may as well leave it tri-folded in one’s jacket pocket for all the impact it will have. Even interventionist extraordinaire Max Boot is saying that the administration is in “a political cocoon where they cut themselves off from outside criticism, just dismiss it as being naysayers.”19

But there is something more fundamental here. We live in a democracy and are approaching an election. The president is up for his quadrennial review. President Bush is continually saying, “I look forward to talking to the American people about why I made the decisions I made.”20 Conservative pundits defend Bush’s use of September 11th as a campaign issue saying “Sept. 11, its aftermath and the response…are central to deciding the fitness of George W. Bush to continue in office.”21 What people like Brooks who complain about the criticisms of Bush miss is that this is exactly what a discussion of Bush’s record looks like. All the hullabaloo that Brooks derides is about “deciding the fitness of George W. Bush to continue in office.” Brooks and other defenders of the administration are frustrated that the “discussion” is two sided, not merely the Bush campaign — sole owner of all information and opinion regarding its record — talking at a pliant audience.

I think Bush would agree with me that the presidential campaign ranks at least as high as our problems in Iraq, because after neutralizing (not solving) the Falluja problem, the man supposedly overburdened with Iraq jumped on a campaign bus for a tour of battleground states.

As for Brooks, despite the deployment of French culture as window dressing, he is merely repeating the Republican boilerplate of “You cannot criticize the president. He is a war president.” That sort of intellectual thuggery has about as little place in the “newspaper of record” as Jayson Blair.

Notes

  1. Do you doubt me? Though some of his political writings for The Weekly Standard are perhaps more true to form, I judged him on the cultural work such as Bobos in Paradise: The New Upper Class and How They Got There. New York: Simon & Schuster. 2000; “The Organization Kid.” The Atlantic Monthly. April 2001. http://www.theatlantic.com/issues/2001/04/brooks-p1.htm; or “Patio Man and the Sprawl People.” The Weekly Standard. 12-19 August 2002. http://www.weeklystandard.com/Content/Public/Articles/000/000/001/531wlvng.asp
  2. Brooks, David. “Looking Through Keyholes.” The New York Times. 27 April 2004. http://www.nytimes.com/2004/04/27/opinion/looking-through-keyholes.html.
  3. E.g. Hoar, Joseph P. “Why Aren’t There Enough Troops in Iraq?” The New York Times. 2 April 2003; McCaffrey, Barry. “Gaining Victory in Iraq.” U.S. News & World Reports. 7 April 2003. p. 26; McCaffrey, Barry. “We Need More Troops.” The Wall Street Journal. 29 July 2003.
  4. Schmitt, Eric. “General in Iraq Says More G.I.’s are Not Needed.” The New York Times. 29 August 2003; McCain, John. “Why We Must Win.” The Washington Post. 31 August 2003. p. B7. Schmitt’s article was given a misleading title. Abizaid did say that additional American troops should not be sent, but because he worried about “the public perception both within Iraq and within the Arab world about the percentage of the force being so heavily American.” He did however say that more soldiers from other countries were needed and that the training of native Iraqi forces should be hastened.
  5. Fallows, James. “Blind Into Baghdad.” The Atlantic Monthly. January/February 2004. p. 58. http://www.theatlantic.com/issues/2004/01/fallows.htm.
  6. Kaplan, Fred. “War-Gamed.” Slate. 28 March 2003. http://www.slate.com/id/2080814.
  7. Brinkley, Joel and Eric Schmitt. “Iraqi Leaders Say U.S. Was Warned of Disorder After Hussein, But Little Was Done.” The New York Times. 30 November 2003.
  8. Fallows. p. 68; Elliott, Michael. “So, What Went Wrong?” Time. 6 October 2003. p. 34.
  9. Mann, James. Rise of the Vulcans: The History of Bush’s War Cabinet. New York: Viking. 2004. p. 349; Burrough, Bryan, Evgenia Peretz, David Rose and David Wise. “The Path to War.” Vanity Fair. no. 525. May 2004. pp. 288-289.
  10. Ricks, Thomas E. “For Vietnam Vet Anthony Zinni, Another War on Shaky Territory.” The Washington Post. 23 December 2003.
  11. Fallows. pp. 56-58.
  12. Packer, George. “Letter from Baghdad: War After the War.” The New Yorker. 24 November 2003. pp. 61-62. http://www.newyorker.com/fact/content/?031124fa_fact1.
  13. Davis, Bob and David Rogers. “Bush Economic Aide Says Cost of Iraq War May Top $100 Billion.” The Wall Street Journal. 16 September 2002.
  14. Schmitt, Eric. “Service Chiefs Challenge White House on the Budget.” The New York Times. 11 February 2004.
  15. Schmitt, Eric. “Pentagon Contradicts General on Iraq Occupation Force’s Size.” The New York Times. 28 February 2003.
  16. E.g. Makiya, Kanan. “The Wasteland.” The New Republic. 5 May 2003. pp. 19-21. http://www.tnr.com/doc.mhtml?i=20030505&s=makiya050503; Makiya, Kanan. “Hopes Betrayed.” The Observer. 16 February 2003. http://observer.guardian.co.uk/iraq/story/0,12239,896611,00.html.
  17. An administration insider confirms that this was the specific reason for his firing. Allen, Mike, David Von Drehle and Jonathan Weisman. “Treasury Chief, Key Economic Aide Resign as Jobless Rate Hits 6 Percent.” The Washington Post. 7 December 2002. p. A1.
  18. Bush, George W. Interview with Brit Hume. Fox News Channel. 22 September 2003. http://www.foxnews.com/story/0,2933,98006,00.html.
  19. Boot, Max. Lou Dobbs Tonight. CNN. 30 April 2004. http://www.cnn.com/TRANSCRIPTS/0404/30/ldt.00.html.
  20. E.g. Bush told Tim Russert that he was “looking forward” to a discussion seven times. Bush, George W. and Tim Russert. Meet the Press. NBC News. 8 February 2004. http://msnbc.msn.com/id/4179618/.
  21. Krauthammer, Charles. “Why 9/11 Belongs in the Campaign.” Time. 15 March 2004. p. 100.

Smarties’s First Baby Steps

From: <administrator>
To: taylordw@goodleaf.net
Date: Fri, August 15, 2003 3:30 pm
Subject: yo

Donnarino,

Sorry to have missed you when you were in Seattle. Thought you’d be interested to know that I have been browsing my apache logs. Checking out the attacks from the latest MS worm you know. Anyway, I found that someone hit your Stiglitz smarties article, and the referring link was from a google search. Your journey down the road to fame has begun.

J

httpd-access.log:164.119.68.88 - - [15/Aug/2003:05:51:19 -0700] "GET /smarties/economics/stiglitz/stiglitz.html HTTP/1.1" 200 47128 "http://www.google.com/search?q=%22paul+krugman%22+biography+council+of+economic+advisors+clinton&hl=en&lr=&ie=UTF-8&start=20&sa=N" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"
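For the curious, that entry is Apache’s “combined” log format: client host, identity, user, timestamp, request line, status code, response size, referrer and user agent. A minimal sketch of pulling the fields apart in Python (the regex and function name are mine, and the sample line is an abbreviated copy of the one above):

```python
import re

# Apache "combined" log format: host, identity, user, [timestamp],
# "request", status, size, "referrer", "user agent".
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_combined(line):
    """Return a dict of named fields from one combined-format line, or None."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

line = ('164.119.68.88 - - [15/Aug/2003:05:51:19 -0700] '
        '"GET /smarties/economics/stiglitz/stiglitz.html HTTP/1.1" 200 47128 '
        '"http://www.google.com/search?q=%22paul+krugman%22+biography" '
        '"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"')

entry = parse_combined(line)
print(entry["status"])    # the HTTP status code, as a string
print(entry["referrer"])  # the field that reveals the Google search
```

The referrer field is the interesting one here: it is what lets you see the exact Google query that delivered a visitor to the page.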

Real Programmers Don’t Use PASCAL

Ed Post, “Readers’ Forum: Real Programmers Don’t Use PASCAL,” Datamation, vol. 29, no. 7, July 1983, pp. 263-265

Editor’s note: What I have posted below is the document as it circulated on Usenet, BBSs and the Internet, and as I first encountered it. Like all viral forwards, it contains a certain amount of scholia, interpolated material and additional formatting conventions. I have also posted a PDF image of the letter as it originally appeared in Datamation here (1.18MB). ~ DWT (1 April 2013)

Ed Post
Graphic Software Systems
P.O. Box 673
25117 S.W. Parkway
Wilsonville, OR 97070

Back in the good old days — the “Golden Era” of computers, it was easy to separate the men from the boys (sometimes called “Real Men” and “Quiche Eaters” in the literature). During this period, the Real Men were the ones that understood computer programming, and the Quiche Eaters were the ones that didn’t. A real computer programmer said things like “DO 10 I=1,10” and “ABEND” (they actually talked in capital letters, you understand), and the rest of the world said things like “computers are too complicated for me” and “I can’t relate to computers — they’re so impersonal”. (A previous work [1] points out that Real Men don’t “relate” to anything, and aren’t afraid of being impersonal.)

But, as usual, times change. We are faced today with a world in which little old ladies can get computerized microwave ovens, 12 year old kids can blow Real Men out of the water playing Asteroids and Pac-Man, and anyone can buy and even understand their very own Personal Computer. The Real Programmer is in danger of becoming extinct, of being replaced by high-school students with TRASH-80s!

There is a clear need to point out the differences between the typical high-school junior Pac-Man player and a Real Programmer. Understanding these differences will give these kids something to aspire to — a role model, a Father Figure. It will also help employers of Real Programmers to realize why it would be a mistake to replace the Real Programmers on their staff with 12 year old Pac-Man players (at a considerable salary savings).

LANGUAGES

The easiest way to tell a Real Programmer from the crowd is by the programming language he (or she) uses. Real Programmers use FORTRAN. Quiche Eaters use PASCAL. Niklaus Wirth, the designer of PASCAL, was once asked, “How do you pronounce your name?” He replied “You can either call me by name, pronouncing it ‘Veert’, or call me by value, ‘Worth’.” One can tell immediately from this comment that Niklaus Wirth is a Quiche Eater. The only parameter passing mechanism endorsed by Real Programmers is call-by-value-return, as implemented in the IBM/370 FORTRAN G and H compilers. Real programmers don’t need abstract concepts to get their jobs done: they are perfectly happy with a keypunch, a FORTRAN IV compiler, and a beer.

  • Real Programmers do List Processing in FORTRAN.
  • Real Programmers do String Manipulation in FORTRAN.
  • Real Programmers do Accounting (if they do it at all) in FORTRAN.
  • Real Programmers do Artificial Intelligence programs in FORTRAN.

If you can’t do it in FORTRAN, do it in assembly language. If you can’t do it in assembly language, it isn’t worth doing.

STRUCTURED PROGRAMMING

Computer science academicians have gotten into the “structured programming” rut over the past several years. They claim that programs are more easily understood if the programmer uses some special language constructs and techniques. They don’t all agree on exactly which constructs, of course, and the examples they use to show their particular point of view invariably fit on a single page of some obscure journal or another — clearly not enough of an example to convince anyone. When I got out of school, I thought I was the best programmer in the world. I could write an unbeatable tic-tac-toe program, use five different computer languages, and create 1000 line programs that WORKED. (Really!) Then I got out into the Real World. My first task in the Real World was to read and understand a 200,000 line FORTRAN program, then speed it up by a factor of two. Any Real Programmer will tell you that all the Structured Coding in the world won’t help you solve a problem like that — it takes actual talent. Some quick observations on Real Programmers and Structured Programming:

  • Real Programmers aren’t afraid to use GOTOs.
  • Real Programmers can write five page long DO loops without getting confused.
  • Real Programmers enjoy Arithmetic IF statements because they make the code more interesting.
  • Real Programmers write self-modifying code, especially if it saves them 20 nanoseconds in the middle of a tight loop.
  • Real Programmers don’t need comments: the code is obvious.
  • Since FORTRAN doesn’t have a structured IF, REPEAT ... UNTIL, or CASE statement, Real Programmers don’t have to worry about not using them. Besides, they can be simulated when necessary using assigned GOTOs.

Data structures have also gotten a lot of press lately. Abstract Data Types, Structures, Pointers, Lists, and Strings have become popular in certain circles. Wirth (the above-mentioned Quiche Eater) actually wrote an entire book [2] contending that you could write a program based on data structures, instead of the other way around. As all Real Programmers know, the only useful data structure is the array. Strings, lists, structures, sets — these are all special cases of arrays and can be treated that way just as easily without messing up your programming language with all sorts of complications. The worst thing about fancy data types is that you have to declare them, and Real Programming Languages, as we all know, have implicit typing based on the first letter of the (six character) variable name.
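The implicit-typing jab is accurate, incidentally: in classic FORTRAN an undeclared variable’s type follows from the first letter of its (at most six character) name, I through N meaning INTEGER and anything else REAL. A sketch of the rule, in Python for want of a FORTRAN IV compiler (the function name is mine):

```python
def implicit_fortran_type(name):
    """Classic FORTRAN implicit typing: an undeclared variable whose
    name starts with I-N is INTEGER; any other letter makes it REAL.
    Names were limited to six characters."""
    if not (0 < len(name) <= 6) or not name[0].isalpha():
        raise ValueError("not a FORTRAN variable name: %r" % name)
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

# The canonical loop counter is an integer; the canonical value is real.
print(implicit_fortran_type("I"))       # INTEGER
print(implicit_fortran_type("X"))       # REAL
print(implicit_fortran_type("KOUNT"))   # INTEGER
```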

OPERATING SYSTEMS

What kind of operating system is used by a Real Programmer? CP/M? God forbid — CP/M, after all, is basically a toy operating system. Even little old ladies and grade school students can understand and use CP/M.

Unix is a lot more complicated of course — the typical Unix hacker never can remember what the PRINT command is called this week — but when it gets right down to it, Unix is a glorified video game. People don’t do Serious Work on Unix systems: they send jokes around the world on USENET and write adventure games and research papers.

No, your Real Programmer uses OS/370. A good programmer can find and understand the description of the IJK305I error he just got in his JCL manual. A great programmer can write JCL without referring to the manual at all. A truly outstanding programmer can find bugs buried in a 6 megabyte core dump without using a hex calculator. (I have actually seen this done.)

OS/370 is a truly remarkable operating system. It’s possible to destroy days of work with a single misplaced space, so alertness in the programming staff is encouraged. The best way to approach the system is through a keypunch. Some people claim there is a Time Sharing system that runs on OS/370, but after careful study I have come to the conclusion that they are mistaken.

PROGRAMMING TOOLS

What kind of tools does a Real Programmer use? In theory, a Real Programmer could run his programs by keying them into the front panel of the computer. Back in the days when computers had front panels, this was actually done occasionally. Your typical Real Programmer knew the entire bootstrap loader by memory in hex, and toggled it in whenever it got destroyed by his program. (Back then, memory was memory — it didn’t go away when the power went off. Today, memory either forgets things when you don’t want it to, or remembers things long after they’re better forgotten.) Legend has it that Seymour Cray, inventor of the Cray I supercomputer and most of Control Data’s computers, actually toggled the first operating system for the CDC7600 in on the front panel from memory when it was first powered on. Seymour, needless to say, is a Real Programmer.

One of my favorite Real Programmers was a systems programmer for Texas Instruments. One day, he got a long distance call from a user whose system had crashed in the middle of some important work. Jim was able to repair the damage over the phone, getting the user to toggle in disk I/O instructions at the front panel, repairing system tables in hex, reading register contents back over the phone. The moral of this story: while a Real Programmer usually includes a keypunch and lineprinter in his toolkit, he can get along with just a front panel and a telephone in emergencies.

In some companies, text editing no longer consists of ten engineers standing in line to use an 029 keypunch. In fact, the building I work in doesn’t contain a single keypunch. The Real Programmer in this situation has to do his work with a text editor program. Most systems supply several text editors to select from, and the Real Programmer must be careful to pick one that reflects his personal style. Many people believe that the best text editors in the world were written at Xerox Palo Alto Research Center for use on their Alto and Dorado computers [3]. Unfortunately, no Real Programmer would ever use a computer whose operating system is called SmallTalk, and would certainly not talk to the computer with a mouse.

Some of the concepts in these Xerox editors have been incorporated into editors running on more reasonably named operating systems. EMACS and VI are probably the most well known of this class of editors. The problem with these editors is that Real Programmers consider “what you see is what you get” to be just as bad a concept in text editors as it is in women. No, the Real Programmer wants a “you asked for it, you got it” text editor — complicated, cryptic, powerful, unforgiving, dangerous. TECO, to be precise.

It has been observed that a TECO command sequence more closely resembles transmission line noise than readable text [4]. One of the more entertaining games to play with TECO is to type your name in as a command line and try to guess what it does. Just about any possible typing error while talking with TECO will probably destroy your program, or even worse — introduce subtle and mysterious bugs in a once working subroutine.

For this reason, Real Programmers are reluctant to actually edit a program that is close to working. They find it much easier to just patch the binary object code directly, using a wonderful program called SUPERZAP (or its equivalent on non-IBM machines). This works so well that many working programs on IBM systems bear no relation to the original FORTRAN code. In many cases, the original source code is no longer available. When it comes time to fix a program like this, no manager would even think of sending anything less than a Real Programmer to do the job — no Quiche Eating structured programmer would even know where to start. This is called “job security”.

Some programming tools NOT used by Real Programmers:

  • FORTRAN preprocessors like MORTRAN and RATFOR. The Cuisinarts of programming — great for making Quiche. See comments above on structured programming.
  • Source language debuggers. Real Programmers can read core dumps.
  • Compilers with array bounds checking. They stifle creativity, destroy most of the interesting uses for EQUIVALENCE, and make it impossible to modify the operating system code with negative subscripts. Worst of all, bounds checking is inefficient.
  • Source code maintenance systems. A Real Programmer keeps his code locked up in a card file, because it implies that its owner cannot leave his important programs unguarded [5].

THE REAL PROGRAMMER AT WORK

Where does the typical Real Programmer work? What kind of programs are worthy of the efforts of so talented an individual? You can be sure that no Real Programmer would be caught dead writing accounts-receivable programs in COBOL, or sorting mailing lists for People magazine. A Real Programmer wants tasks of earth-shaking importance (literally!):

  • Real Programmers work for Los Alamos National Laboratory, writing atomic bomb simulations to run on Cray I supercomputers.
  • Real Programmers work for the National Security Agency, decoding Russian transmissions.
  • It was largely due to the efforts of thousands of Real Programmers working for NASA that our boys got to the moon and back before the Russkies.
  • The computers in the Space Shuttle were programmed by Real Programmers.
  • Real Programmers are at work for Boeing designing the operating systems for cruise missiles.

Some of the most awesome Real Programmers of all work at the Jet Propulsion Laboratory in California. Many of them know the entire operating system of the Pioneer and Voyager spacecraft by heart. With a combination of large ground-based FORTRAN programs and small spacecraft-based assembly language programs, they can do incredible feats of navigation and improvisation, such as hitting ten-kilometer wide windows at Saturn after six years in space, and repairing or bypassing damaged sensor platforms, radios, and batteries. Allegedly, one Real Programmer managed to tuck a pattern-matching program into a few hundred bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter.

One plan for the upcoming Galileo spacecraft mission is to use a gravity assist trajectory past Mars on the way to Jupiter. This trajectory passes within 80 +/- 3 kilometers of the surface of Mars. Nobody is going to trust a PASCAL program (or PASCAL programmer) for navigation to these tolerances.

As you can tell, many of the world’s Real Programmers work for the U.S. Government, mainly the Defense Department. This is as it should be. Recently, however, a black cloud has formed on the Real Programmer horizon.

It seems that some highly placed Quiche Eaters at the Defense Department decided that all Defense programs should be written in some grand unified language called “ADA” (registered trademark, DoD). For a while, it seemed that ADA was destined to become a language that went against all the precepts of Real Programming — a language with structure, a language with data types, strong typing, and semicolons. In short, a language designed to cripple the creativity of the typical Real Programmer. Fortunately, the language adopted by DoD has enough interesting features to make it approachable: it’s incredibly complex, includes methods for messing with the operating system and rearranging memory, and Edsger Dijkstra doesn’t like it [6]. (Dijkstra, as I’m sure you know, was the author of “GoTos Considered Harmful” — a landmark work in programming methodology, applauded by Pascal Programmers and Quiche Eaters alike.) Besides, the determined Real Programmer can write FORTRAN programs in any language.

The Real Programmer might compromise his principles and work on something slightly more trivial than the destruction of life as we know it, providing there’s enough money in it. There are several Real Programmers building video games at Atari, for example. (But not playing them. A Real Programmer knows how to beat the machine every time: no challenge in that.) Everyone working at LucasFilm is a Real Programmer. (It would be crazy to turn down the money of 50 million Star Wars fans.) The proportion of Real Programmers in Computer Graphics is somewhat lower than the norm, mostly because nobody has found a use for Computer Graphics yet. On the other hand, all Computer Graphics is done in FORTRAN, so there are a fair number of people doing Graphics in order to avoid having to write COBOL programs.

THE REAL PROGRAMMER AT PLAY

Generally, the Real Programmer plays the same way he works — with computers. He is constantly amazed that his employer actually pays him to do what he would be doing for fun anyway, although he is careful not to express this opinion out loud. Occasionally, the Real Programmer does step out of the office for a breath of fresh air and a beer or two. Some tips on recognizing real programmers away from the computer room:

  • At a party, the Real Programmers are the ones in the corner talking about operating system security and how to get around it.
  • At a football game, the Real Programmer is the one comparing the plays against his simulations printed on 11 by 14 fanfold paper.
  • At the beach, the Real Programmer is the one drawing flowcharts in the sand.
  • A Real Programmer goes to a disco to watch the light show.
  • At a funeral, the Real Programmer is the one saying “Poor George. And he almost had the sort routine working before the coronary.”
  • In a grocery store, the Real Programmer is the one who insists on running the cans past the laser checkout scanner himself, because he never could trust keypunch operators to get it right the first time.

THE REAL PROGRAMMER’S NATURAL HABITAT

What sort of environment does the Real Programmer function best in? This is an important question for the managers of Real Programmers. Considering the amount of money it costs to keep one on the staff, it’s best to put him (or her) in an environment where he can get his work done.

The typical Real Programmer lives in front of a computer terminal. Surrounding this terminal are:

  • Listings of all programs the Real Programmer has ever worked on, piled in roughly chronological order on every flat surface in the office.
  • Some half-dozen or so partly filled cups of cold coffee. Occasionally, there will be cigarette butts floating in the coffee. In some cases, the cups will contain Orange Crush.
  • Unless he is very good, there will be copies of the OS JCL manual and the Principles of Operation open to some particularly interesting pages.
  • Taped to the wall is a line-printer Snoopy calendar for the year 1969.
  • Strewn about the floor are several wrappers for peanut butter filled cheese bars (the type that are made stale at the bakery so they can’t get any worse while waiting in the vending machine).
  • Hiding in the top left-hand drawer of the desk is a stash of double stuff Oreos for special occasions.
  • Underneath the Oreos is a flow-charting template, left there by the previous occupant of the office. (Real Programmers write programs, not documentation. Leave that to the maintenance people.)

The Real Programmer is capable of working 30, 40, even 50 hours at a stretch, under intense pressure. In fact, he prefers it that way. Bad response time doesn’t bother the Real Programmer — it gives him a chance to catch a little sleep between compiles. If there is not enough schedule pressure on the Real Programmer, he tends to make things more challenging by working on some small but interesting part of the problem for the first nine weeks, then finishing the rest in the last week, in two or three 50-hour marathons. This not only impresses his manager, who was despairing of ever getting the project done on time, but creates a convenient excuse for not doing the documentation. In general:

  • No Real Programmer works 9 to 5. (Unless it’s 9 in the evening to 5 in the morning.)
  • Real Programmers don’t wear neckties.
  • Real Programmers don’t wear high heeled shoes.
  • Real Programmers arrive at work in time for lunch. [9]
  • A Real Programmer might or might not know his spouse’s name. He does, however, know the entire ASCII (or EBCDIC) code table.
  • Real Programmers don’t know how to cook. Grocery stores aren’t often open at 3 a.m., so they survive on Twinkies and coffee.

THE FUTURE

What of the future? It is a matter of some concern to Real Programmers that the latest generation of computer programmers are not being brought up with the same outlook on life as their elders. Many of them have never seen a computer with a front panel. Hardly anyone graduating from school these days can do hex arithmetic without a calculator. College graduates these days are soft — protected from the realities of programming by source level debuggers, text editors that count parentheses, and user friendly operating systems. Worst of all, some of these alleged computer scientists manage to get degrees without ever learning FORTRAN! Are we destined to become an industry of Unix hackers and Pascal programmers?

On the contrary. From my experience, I can only report that the future is bright for Real Programmers everywhere. Neither OS/370 nor FORTRAN show any signs of dying out, despite all the efforts of Pascal programmers the world over. Even more subtle tricks, like adding structured coding constructs to FORTRAN have failed. Oh sure, some computer vendors have come out with FORTRAN 77 compilers, but every one of them has a way of converting itself back into a FORTRAN 66 compiler at the drop of an option card — to compile DO loops like God meant them to be.

Even Unix might not be as bad on Real Programmers as it once was. The latest release of Unix has the potential of an operating system worthy of any Real Programmer. It has two different and subtly incompatible user interfaces, an arcane and complicated terminal driver, virtual memory. If you ignore the fact that it’s structured, even C programming can be appreciated by the Real Programmer: after all, there’s no type checking, variable names are seven (ten? eight?) characters long, and the added bonus of the Pointer data type is thrown in. It’s like having the best parts of FORTRAN and assembly language in one place. (Not to mention some of the more creative uses for #define.)

No, the future isn’t all that bad. Why, in the past few years, the popular press has even commented on the bright new crop of computer nerds and hackers ([7] and [8]) leaving places like Stanford and M.I.T. for the Real World. From all evidence, the spirit of Real Programming lives on in these young men and women. As long as there are ill-defined goals, bizarre bugs, and unrealistic schedules, there will be Real Programmers willing to jump in and Solve The Problem, saving the documentation for later. Long live FORTRAN!

ACKNOWLEDGEMENT

I would like to thank Jan E., Dave S., Rich G., Rich E. for their help in characterizing the Real Programmer, Heather B. for the illustration, Kathy E. for putting up with it, and atd!avsdS:mark for the initial inspiration.

REFERENCES

[1] Feirstein, B., Real Men Don’t Eat Quiche, New York, Pocket Books, 1982.
[2] Wirth, N., Algorithms + Data Structures = Programs, Prentice Hall, 1976.
[3] Ilson, R., ‘Recent Research in Text Processing’, IEEE Trans. Prof. Commun., Vol. PC-23, No. 4, December 1980.
[4] Finseth, C., Theory and Practice of Text Editors – or – a Cookbook for an EMACS, B.S. Thesis, MIT/LCS/TM-165, Massachusetts Institute of Technology, May 1980.
[5] Weinberg, G., The Psychology of Computer Programming, New York, Van Nostrand Reinhold, 1971, page 110.
[6] Dijkstra, E., On the GREEN Language Submitted to the DoD, SIGPLAN Notices, Volume 13, Number 10, October 1978.
[7] Rose, Frank, Joy of Hacking, Science 82, Volume 3, Number 9, November 1982, pages 58-66.
[8] The Hacker Papers, Psychology Today, August 1980.

William Gibson’s Idoru and Blogging

I want to add one more thought about blogging before I get started. In my Inaugural Post I asked, “Why join this societal wave of exhibitionism?” and mentioned the relation of technology to surveillance, voyeurism, privacy and exhibitionism. Every time I think about these issues, a character from William Gibson’s 1996 novel Idoru comes to mind.

Before I delve into the main point, I want to say that I think William Gibson is a genius. In his first novel, Neuromancer (1984), the hit that launched the cyberpunk genre, he brought the term cyberspace (which he had coined in his 1982 short story “Burning Chrome”) to a wide audience. In case you passed over those parenthetical dates too quickly, let me point out that he dreamed up cyberspace in the early 1980s: before the Internet had reached the public and before virtual reality existed outside of research labs.

Yes, I am aware that Tron came out in 1982, but Tron is about a man who is sucked into a little, tiny world inside of a computer where the programs are personified (e.g. the villain, “Master Control”) and forced to fight in high-tech gladiatorial games in sexy spandex body suits. This of course will never happen and is merely a technological variant of Fantastic Voyage, The Wizard of Oz or The Lion, the Witch and the Wardrobe. Yes, there are some silly parts of Neuromancer: the space Rastafarians are hardly the heady stuff of Arthur C. Clarke or Isaac Asimov. However, a total immersion interface to a simulated world spread over a network of computers is freaking visionary. Unlike Tron, which set people’s understanding of computers back a decade, Neuromancer is the future.

What is most relevant to blogging is his vision of celebrity and media, which makes up the ideological backdrop of Idoru. The novel is set in a not-too-distant future where mass media has continued to throw its net wider and wider, where, as Andy Warhol said in what must be the most accurate prediction ever made, “everyone will be famous for fifteen minutes.” Murderers are famous, the parents of their victims are famous, college students fake kidnappings to get on television, unaccomplished debutantes are famous for nothing other than ostentation, people become famous when sex tapes “accidentally” find their way onto the Internet, people elbow their way onto television for opportunities to boast about things that previously one wouldn’t even want one’s neighbors to know. Actually, I am talking about the present, but imagine this trend married to the myriad of widely affordable media production and distribution technologies and projected twenty years into the future. With thousands of television channels to fill up, with everyone’s vanity site on the Internet and with no gatekeepers, fame will devolve to the masses. Gibson has one of his characters describe it thus:

“Nobody’s really famous anymore, Laney. Have you noticed that?…I mean really famous. There’s not much fame left, not in the old sense. Not enough to go around…We learned to print money off this stuff,” she said. “Coin of our realm. Now we’ve printed too much; even the audience knows. It shows in the ratings…Except,” she said… “when we decide to destroy one.” (6-7)

Gibson spends the opening chapters of the book describing how derelict protagonist Colin Laney lost his previous job as a “researcher” at a tabloid news show called Slitscan. In this future, as in our present, an increasing proportion of people’s transactions are passively recorded in corporate databases. And also as in our present, some companies exist solely to purchase information, correlate disparate pieces in useful ways and sell the results to those who might put them to some (usually pernicious) use. In the novel, Slitscan had a questionable relationship with one such data-agglomeration corporation, DatAmerica, and Laney’s job was to troll through the data trails left by celebrities looking for the “nodal points” — the confluences of data — that indicated something gossip-worthy for the show to report.

Laney was not, he was careful to point out, a voyeur. He had a peculiar knack with data collection architectures, and a medically documented concentration deficit that he could toggle, under certain conditions, into a state of pathological hyperfocus…he was an intuitive fisher of patterns of information: of the sort of signature a particular individual inadvertently created in the net as he or she went about the mundane yet endlessly multiplex business of life in a digital society. (30-31)

Laney was fired when, while researching the mistress of a celebrity, it became clear to him from her data trail that she intended to commit suicide and he tried unsuccessfully to intervene. Here Laney checks back with his mark after returning from a vacation:

The nodal point was different now, though he had no language to describe the change. He sifted the countless fragments that had clustered around Alison Shires in his absence, feeling for the source of his earlier conviction. He called up the music that she’d accessed while he’d been in Mexico, playing each song in the order of her selection. He found her choices had grown more life-affirming; she’d moved to a new provider, Upful Groupvine, whose relentlessly positive product was the musical equivalent of the Good News Channel.

Cross-indexing her charges against the records of her credit-provider and its credit retailers, he produced a list of everything she’d purchased in the past week. Six-pack, blades, Tokkai carton opener. Did she own a Tokkai carton opener? But then he remembered Kathy’s advice, that this was the part of research most prone to produce serious transference, the point at which the researcher’s intimacy with the subject could lead to loss of perspective. “It’s often easiest for us to identify at the retail level, Laney. We are a shopping species. Find yourself buying a different brand of frozen peas because the subject, watch out.” (66-67)

Before excerpting a passage where Gibson describes the future of gossip journalism, let me remind you that this is Gibson’s view from 1996, when MTV’s The Real World was only in its fourth season, the O.J. Simpson trial was just over, Monica Lewinsky’s blue dress was stain-free and Survivor was still four years off:

Slitscan was descended from “reality” programming and the network tabloids of the late twentieth century, but it resembled them no more than some large, swift, bipedal carnivore resembled its sluggish, shallow-dwelling ancestors. Slitscan was the mature form, supporting fully global franchises. Slitscan’s revenues had paid for entire satellites and built the building he worked in in Burbank.

Slitscan was a show so popular that it had evolved into something akin to the old idea of a network. It was flanked and buffered by spinoffs and peripherals, each designed to shunt the viewer back to the crucial core, the familiar and reliably bloody altar that one of Laney’s Mexican co-workers called Smoking Mirror.

It was impossible to work at Slitscan without a sense of participating in history, or else what Kathy Torrance would argue had replaced history. Slitscan itself, Laney suspected, might be one of those larger nodal points he sometimes found himself trying to imagine, an informational peculiarity opening into some unthinkably deeper structure.

In his quest for lesser nodal points, the sort that Kathy sent him into DatAmerica to locate, Laney had already affected the course of municipal elections, the market in patent gene futures, abortion laws in the state of New Jersey, and the spin on an ecstatic pro-euthanasia movement (or suicide cult, depending) called Cease Upon the Midnight, not to mention the lives and careers of several dozen celebrities of various kinds.

Not always for the worst, either, in terms of what the show’s subjects might have wished for themselves. Kathy’s segment on the Dukes of Nuke ‘Em, exposing the band’s exclusive predilection for Iraqi fetal tissue, had sent their subsequent release instant platinum (and had resulted in show-trials and public hangings in Baghdad, but he supposed life was hard there to begin with). (50-52)

Of course, something like Slitscan — or the Jerry Springer Show, Cops, E! True Hollywood Story, Average Joe or The Fifth Wheel in our time — could not exist were it not for the sadistic voyeurism of the masses. I select this passage as much to satisfy my own snickering elitism as to illustrate the lust for other people’s misery that animates our current and future television viewing audiences:

…Slitscan’s audience…is best visualized as a vicious, lazy, profoundly ignorant, perpetually hungry organism craving the warm god-flesh of the anointed. Personally I like to imagine something the size of a baby hippo, the color of a week-old boiled potato, that lives by itself, in the dark, in a double-wide on the outskirts of Topeka. It’s covered with eyes and it sweats constantly. The sweat runs into those eyes and makes them sting. It has no mouth, Laney, no genitals, and can only express its mute extremes of murderous rage and infantile desire by changing the channels on a universal remote. Or by voting in presidential elections. (35-36)

Of course one can already see aspects of this world coming into being. Corporations are harvesting, agglomerating and correlating information at a frightening and increasing rate — but that is a subject for another post. What I am thinking about here is the voyeuristic and micro-celebrity aspects of our quickening information age. I have a friend who occasionally reads several people’s blogs, some written by people he has never even met. Of one blogger he hasn’t met, he maintains that the man is teetering on the brink of an infidelity with a coworker, a betrayal of his current girlfriend whose imminence the blogger himself has not yet recognized! My friend keeps returning to this blog awaiting the climactic post as if it were a soap opera.

There you have it: micro-celebrity, sadistic voyeurism, a readable data trail from which one might extrapolate future behavior with a minimal amount of theory. Admittedly, my friend is following an intentional data trail rather than a passive one, but how little separates this situation from that of Gibson’s Laney anticipating the suicide is striking.

I don’t absolve myself of any of this. I loved the show Trauma: Real Life in the ER, which is about as sadistic a strain of voyeurism as you’ll find. I did say in the “Inaugural Post” that I consider this a “deeply improprietious endeavor.” I am, however, aware of the context in which I embark upon this effort.