Tchotchkes and Circus

Over the Thanksgiving weekend our Australian member pointed out a striking contrast. A constant topic of conversation among our group, being mid-career professionals from New York and Washington, D.C., is the outrageous price of houses. We are all at that age where we are looking and scheming, but for myself I have completely written off the prospect of ever owning a house in any place where I would like to live, namely the big city.

In the midst of one of these rants, Dean, a man with a considerable lust for gadgets, mind you, pointed out that increasingly the most important things in life — housing, education, healthcare — are astronomically expensive, becoming completely unaffordable to normal middle-class people. Meanwhile all the trivial junk — banana hangers, juicers, fruit dryers, bread makers, cheese straighteners — becomes ever cheaper.

This is just the economic continuation of bread and circus: as the most important things in life recede ever farther from grasp, people are distracted by trivial entertainment and petty satisfactions.

Often enough, this is offered up as adequate consolation in the bargain of trade liberalization. Yes, yes, mid-level skilled jobs may be fleeing the country at an alarming rate, but this is completely offset — so the argument goes — by the stunning decrease in prices. People’s wages may have stagnated, but the goods they seek to purchase have decreased in price, so their real standard of living has improved. The fly in the ointment is that the price of imported goods — cheese straighteners et al. — has decreased while the price of domestically produced goods — healthcare, houses, education — has continued to increase apace. Or perhaps what we are witnessing is the correct valuation of these dear goods: as the return on investment in these life-investments has grown, their value, like blue-chip stocks, has grown accordingly. Whatever the case, what we are witnessing is the reverse of Robert Reich’s thesis from The Work of Nations: rather than investing in our immovable capital, namely our nation’s citizens, we are allowing them to crumble in favor of toothbrushes that match the bathroom curtains.
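The wage-and-price arithmetic here is easy to make concrete. Below is a toy real-income calculation; all of the budget shares and price changes are invented for illustration, but they show the mechanism: if the expensive things in your basket rise faster than the cheap things fall, stagnant wages mean a falling real standard of living no matter how cheap the cheese straighteners get.

```python
# Toy real-income calculation. All numbers are hypothetical,
# chosen only to illustrate the mechanism described above.

def real_income_change(wage_growth, basket):
    """basket: list of (budget_share, price_change) pairs.
    Returns approximate real income growth: nominal wage growth
    minus the share-weighted inflation of the basket."""
    inflation = sum(share * change for share, change in basket)
    return wage_growth - inflation

# Stagnant wages (0% growth). Imported gadgets get 20% cheaper,
# but they are only 20% of the budget; housing, healthcare and
# education (60% of the budget) rise 8%.
basket = [
    (0.20, -0.20),  # imported consumer goods
    (0.60, +0.08),  # housing, healthcare, education
    (0.20, +0.02),  # everything else
]

print(real_income_change(0.0, basket))  # negative: real income falls
```

Despite the dramatic deflation in gadgets, the weighted basket still inflates and real income shrinks by about 1.2% a year under these made-up numbers.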

Owing to I-don’t-know-what — morbidity about the future and infatuation with the shimmer of the present — the calculation by which your average person discounts future prosperity is all out of whack. Contra the Virginia Postrel thesis, life may be ever more stylish and well designed, but it is simultaneously more mean and slim in its life-investment aspects. What we are experiencing is a hollowing out of the human economy. The aesthetics are just the latest in bread and circus. And I’m not talking ivory tower abstractions about what constitutes the good life — some sort of life of mind and real freedom versus crass materialist comfort. As Hans Rosling has amply demonstrated (e.g. “Debunking ‘Third-World’ Myths with the Best Stats You’ve Ever Seen,” TED, Monterey, California, February 2006) and as I’ve learned as a supervisor, basic health is perhaps the most important prerequisite to prosperity. Education is the foundation upon which future wellbeing is built. To the extent that we defer human investment in favor of spending our money on the day-to-day, we undermine our capacity to keep the circus of gadgets going.

Books I Haven’t Read

Since I just bought Pierre Bayard’s How to Talk About Books You Haven’t Read, I guess that I should post on it now, not having read it, rather than later when maybe I will.

I became a book collector fairly early on and all the way up through my early post-college years I could still name the author and title of every book I owned and I had honestly familiarized myself with at least a significant chunk of each one. But then as my income grew while my available free time stayed constant or even diminished slightly, the portion of my book collection with which I have that level of familiarity has shrunk precipitously. At this point I have to confess to being as much a book collector as a book reader. In fact, it occurred to me a few nights ago, after recently having installed three new shelves, that I may have to start budgeting my book acquisitions in shelf-inches rather than dollars (“I’m only allowed three inches this month so it’s either the thousand-page tome or the two 350-page jobs”).

When S. saw me unload this latest acquisition from my bag she was rather amused that I had found just the right book. But with the seed planted, on no less than three occasions throughout the day did I catch myself and stop to point out that I was just that moment talking about some text that I had not in fact read.

And this dovetails well with David Brooks’s column a few weeks ago on “The Outsourced Brain” (The New York Times, 26 October 2007) where he said,

I had thought that the magic of the information age was that it allowed us to know more, but then I realized the magic of the information age is that it allows us to know less.

In a follow-up, Ezra Klein really makes the not reading point (“The External Brain,” 26 October 2007):

But so long as [Google’s] around, I don’t need to really read anything. I just need to catalogue the existence of things I might one day read. I don’t so much study web sites as scan for impressions, for markers, for key words I’ll need if I want to return. I don’t need the knowledge so much as a vague outline of what the knowledge is and how to get back.

Indeed, not reading is the wave of the future.

When I was younger and not yet even a dilettante, still just groping toward my present pissant snobbery, my younger and even more bizarre brother brought us both into contact with the film Metropolitan. The class issues were lost on me at the time, but it was a revelation: people just hanging around talking about ideas and drinking cocktails. What more could a person possibly want?

The snippet of dialogue that then as now stands out to me the most is one of their salon go-rounds:

Audrey Rouget: What Jane Austen novels have you read?

Tom Townsend: None. I don’t read novels. I prefer good literary criticism. That way you get both the novelists’ ideas as well as the critics’ thinking. With fiction I can never forget that none of it really happened, that it’s all just made up by the author.

To this day I probably read twice as many book reviews as I do actual books.

FaceBook

S. has recently become very interested in social networking sites and has dragged me into getting FaceBook and MySpace pages. FaceBook is pretty cool in that it’s like a social networking engine with an API for user application development. Judging by some of the applications, they give developers a lot of access. But the thing that I don’t get is why the people behind FaceBook seem to so lack ambition. First of all, they have yet to completely shed their college-oriented origins, so their network remains entirely too fragmented. But the real oversight is why, with that huge existing user database, they haven’t deployed more core functionality. Right now it seems like FaceBook is just a sort of online business card. Why haven’t they deployed dating, group scheduling and calendaring, blogging, employment, classified advertisements and so on? They could be match.com, meetup.com, livejournal.com, monster.com and craigslist.com all rolled into one. Or, if they won’t build the functionality themselves, why not partner and integrate or build some gateways? There are some features like what I am talking about, but they are rudimentary. Why not put them front and center? Seems like a recipe for obsolescence to me. In this environment it’s innovate or wither.

ABM

In the wake of the U.S.-Russia dustup over placement of an ABM interceptor site there has been a raft of articles on the U.S. missile shield. The October 2007 issue of Arms Control Today devotes the cover and six articles to it. Matthew Yglesias (“Preemption,” 12 October 2007) calls his readers’ attention to a long story in Rolling Stone on the subject (Hitt, Jack, “The Shield,” Issue 1036, 4 October 2007).

I think that Mr. Yglesias is correct to say that the real purpose of ABM is “to facilitate American first strikes.” That the U.S. seeks such a capacity is the conclusion of a RAND report (Buchan, Glen C., et al., Future Roles of U.S. Nuclear Forces: Implications for U.S. Strategy, Santa Monica, California: RAND, 2003, see p. 61) and Keir Lieber and Daryl Press suggest (“The End of MAD: The Nuclear Dimension of U.S. Primacy,” International Security, vol. 30, no. 4, Spring 2006, pp. 7-44, see p. 28) that in such a scheme, mop-up of the small number of missiles surviving a disarming counterforce strike might be a job to which an ABM system of limited capability is adequately suited.

But this isn’t the whole of the story: there are three reasons that the right has been in the past, and remains today, so in favor of an anti-ballistic missile system.

  1. More fundamental than anything else is the American cultural reason for the fervor for ABM on the right. The culturally Scotch-Irish descended, Jacksonian segment of the United States subscribes to a very specific notion of warfare and the law of nations. War is to be fought all out with no restraint. Victory resulting in complete submission of the opponent is the objective. It is retributive in its notion of justice and particularistic rather than universalizing and legalistic in its reasoning. It is a mentality that never made the leap to the counterintuitive reasoning of the nuclear age. Its members have never understood limited war or restraint in warfare. Hence the angst over Vietnam, the use of torture in Iraq and opposition to all forms of arms control.

    The basis of arms control in the 60s and 70s was the gradual acceptance by nuclear strategists of MAD and its institutionalization in the Anti-Ballistic Missile Treaty of 1972. That nuclear powers would intentionally remain vulnerable to attack was the linchpin in stabilizing the nuclear arms race, but this flew in the face of the Jacksonian notion of war. Conspiring with one’s enemies to limit one’s capabilities and limit the uncontainable violence of war didn’t fit their paradigm and ever since they have been raging to tear down the entire structure. The intense interest in deploying an anti-ballistic missile system has had less to do with pragmatic considerations of national security than with the ideological struggle between two strategic paradigms. No policy debate would be so intense and fought out over generations of strategists and politicians were it just a weapons system at stake. The aim of moving to deploy an ABM system so urgently — even before it has been adequately demonstrated to work — is specifically to destroy the existing arms control regime and international system more generally in favor of one more in line with Jacksonian notions.

    This is why opponents of ABM have done so much to pillory Ronald Reagan, the id of 1970s and 80s America, and why the label “star wars,” with its invocation of psycho-cultural tropes, was so effective. The whole debate about ABM has taken place where strategic reasoning leaves off and social-psychology picks up.

  2. Hedging one’s opposition to ABM on technical infeasibility is probably a bad option. First, Americans, with their infinite faith in technology and can-do attitude won’t buy it. Second, at some point a system of at least some rudimentary capability will probably be up and running. A review of the history of nearly every weapon today touted as a miracle system shows that at some point in its development it was widely considered a boondoggle that would never work.

    The Patriot missile is a good example here. During development in the late 1970s there was endless harping that the technical hurdles were insurmountable and that it would never work. The first battery was deployed in 1984 as an anti-aircraft weapon, but it was designed to be a modular system and underwent a number of major and minor upgrades, including the 1988 upgrade that gave it the anti-ballistic missile capability for which it is so well known today. The 1991 Gulf War CNN footage of Patriot missiles rising to destroy incoming Scuds over Israel and Saudi Arabia is among the most memorable imagery of the war. Subsequent studies have indicated that the success rate of the Patriot was significantly lower than initially reported, but additional upgrades throughout the 1990s have further refined the performance of the weapon. In the invasion of Iraq the weapon misidentified and shot down two allied aircraft, but it is hardly the only system to have malfunctioned resulting in friendly-fire deaths. It is presently undergoing an upgrade that is nearly a complete system redesign and will significantly enhance performance in nearly every aspect. The important point is that it managed to overcome its technical hurdles, with significant progress being made post-deployment, and has undergone a number of modifications that have pushed a thirty-year-old system well beyond its initial specifications. A similar story could be told for the Tomahawk cruise missile or the B-2 stealth bomber.

  3. As Senator Lyndon Johnson argued to liberal skeptics who thought the 1957 civil rights bill didn’t go far enough, it was more important that a bill get passed than that it have any particular content. Or as Senator Johnson put it, “Once you break the virginity, it’ll be easier next time.” Senator Edward Kennedy has offered a similar defense of his votes for micro-initiative healthcare programs and No Child Left Behind. If a comprehensive universal healthcare bill is unpassable, then pass it in a million little pieces. Or, it is more important to get Congress to agree in principle on federal education spending; the program can later be reengineered with amendments.

    One of the notable features of the post Newt Gingrich / George W. Bush right is the degree to which they have learned to use the very things they most hate about government to their advantage. One is that a budget line-item never dies. All that was necessary was to fund ABM once, then there would be interest groups, a bureaucracy, a scientific community, a lobby and the fundamental human laziness of just carrying a line-item forward. The program would then live in perpetuity.

    Combine this with number two, that the technological problems can be ironed out in the field with enough money, and the important point is to get systems in place. The pressures of real-world operability plus the bureaucratic juggernaut will force a system into existence. The arguments that Mr. Hitt in the Rolling Stone piece thinks are his strongest are, at this high level, no argument at all. You really have to dig down into the nitty-gritty — which he does not do — before such arguments start to have any impact. In this scenario, a few negative GAO reports are no threat. In fact, they could shame politicians into throwing good money after bad, lest failure show their previous votes in a new light. Indeed, I’ll wager that if the Democrats capture Congress and the White House in 2008, the ABM juggernaut just keeps rolling on unabated.

The real problem with an anti-ballistic missile system is that it is a Maginot Line. This is the case for three reasons.

  1. ICBM counter-measures and ABM system requirements don’t scale at the same rate so would-be attackers can defeat ABM — or at least confound it to the point where a defender could not factor it into their strategic considerations with any reliability — much more easily and affordably than defenders can adapt. And being on the right side of a scalability calculation is how one wins a strategic competition.

    If the technical countermeasures aren’t enough, it’s worth noting that the calculation behind an ABM system is that its OODA loop fits inside an ICBM flight time. As terrifyingly short as ICBM flight times are, they are long enough compared to modern C3I. To defeat ABM, all one has to do is compress ballistic missile flight time to less than the OODA of ABM. The 25 minutes from Asia to the U.S. is a relatively long time, but park a missile submarine loaded with intermediate-range ballistic missiles (IRBMs) a few hundred miles off the coast and now you are talking about flight times of more like five minutes. An IRBM on a short, flattened trajectory never climbs very high, so even if it is detected and reacted to, there just might not be enough air under the warhead for a ground-based interceptor to work its magic.

    Reaction time of ABM could be shortened too, with the first C of C3I — Command — being the lowest hanging fruit. But automate the decision-making component and SkyNet goes live.

    All of these calculations explain why the Chinese are spending so heavily on SSBNs (Lewis, Jeffrey, “Two More Chinese Boomers?,” Arms Control Wonk, 4 October 2007) as well as why the United States continues to turn out attack submarines ($2.7 billion for one Virginia class submarine in the FY 2008 budget) nearly 20 years after the end of the Cold War and without a single navy peer competitor prowling the seas.

  2. Once it becomes clear that ballistic missiles are under threat, states will quickly realize that the future is in cruise missiles.

    Having watched a number of U.S. air power attacks on CNN, Americans think that cruise missiles are an exclusive U.S. technology. While, say, the U.S. Tomahawk cruise missile is an extremely sophisticated weapon, cruise missiles are not beyond the reach of less capable powers. The German V-1 “flying bomb,” first launched against England in 1944, was essentially a cruise missile. The United States deployed its first cruise missile, the problem-prone Snark, in 1961, and early on development of the cruise missile was considerably ahead of that of the ICBM. The Europeans have the Storm Shadow. During the invasion of Iraq, Saddam Hussein attacked Kuwait with a Chinese-made Silkworm cruise missile, first deployed in the early 1980s. Proliferation of cruise missiles is proceeding apace and the technology is not so sophisticated as to be contained by export control regimes. Hell, the flight control system of a Tomahawk runs on an 8086 processor. And it’s not even manufactured by Intel anymore. The design has been licensed to a bunch of low-end Asian chip fabricators.

    Cruise missiles fly low and under radar detection systems, are capable of maneuver and because they don’t follow set, easily calculable trajectories like ICBMs, are not subject to easy intercept. Cruise missiles usually have shorter ranges, so we are potentially back talking about anti-submarine warfare again.

  3. Then, of course, there is the most radical delivery system. If I were a terrorist or rogue state plotting to get a weapon of mass destruction to a U.S. city, I would just FedEx it.

    As has been fairly well observed, modern terrorism, and to an increasing extent modern war in general, is parasitic on the very highways and byways of globalization. There is no killer app here that can solve the problem. This is a more labor-intensive problem, demanding a myriad of heterogeneous and creative operations.
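The flight-time arithmetic behind the OODA argument in the first point can be put in rough numbers. The flight times below are approximate public figures; the ABM decision-cycle length is an outright assumption invented for illustration:

```python
# Rough flight-time versus decision-cycle comparison. Flight times
# are approximate public figures; the ABM decision-cycle length is
# a made-up assumption for illustration only.

ICBM_FLIGHT_MIN = 25    # trans-Pacific ICBM flight time, approximate
OFFSHORE_IRBM_MIN = 5   # IRBM fired a few hundred miles offshore

ABM_OODA_MIN = 8        # assumed detect-decide-launch cycle (invented)

def engageable(flight_minutes, ooda_minutes=ABM_OODA_MIN):
    """An interceptor only matters if the defender's decision cycle
    completes while the warhead is still in flight."""
    return flight_minutes > ooda_minutes

print(engageable(ICBM_FLIGHT_MIN))    # the 25-minute shot can be met
print(engageable(OFFSHORE_IRBM_MIN))  # the 5-minute shot cannot
```

Whatever the true decision-cycle figure is, the attacker only has to get flight time under it, and offshore basing does exactly that.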

It would seem to me that given the scalability issue covered in number one, ABM is a grand-strategic loser. Much more security per dollar could be had through the tried and true means of anti-proliferation, traditional deterrence, counterforce, anti-submarine warfare and the newer, but relatively affordable, area of homeland security.
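The scalability asymmetry from point one can likewise be put in toy numbers. This is a crude cost-exchange sketch with invented round-number prices; only the ratio between the attacker's curve and the defender's curve matters:

```python
# Toy cost-exchange model: decoys are cheap for the attacker,
# interceptors expensive for the defender. All dollar figures
# are invented; only the shape of the comparison matters.

DECOY_COST = 0.1          # $M per decoy (assumption)
INTERCEPTOR_COST = 50.0   # $M per interceptor (assumption)
SHOTS_PER_OBJECT = 2      # shoot-look-shoot doctrine (assumption)

def attack_cost(warheads, decoys_per_warhead):
    """Marginal cost to the attacker of adding decoys."""
    return warheads * decoys_per_warhead * DECOY_COST

def defense_cost(warheads, decoys_per_warhead):
    """Cost to the defender if discrimination fails and every
    credible object must be engaged."""
    objects = warheads * (1 + decoys_per_warhead)
    return objects * SHOTS_PER_OBJECT * INTERCEPTOR_COST

for decoys in (0, 10, 50):
    print(decoys, attack_cost(10, decoys), defense_cost(10, decoys))
```

Under these assumptions, a few million dollars of decoys per missile forces billions in additional interceptors: the defender's costs grow with every object presented, the attacker's only with the cheapest ones.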

Transformation of Media

Dan Savage strikes a forward-looking tone in discussing some organizational changes at The Stranger (“The More Things Change,” SLOG, 19 September 2007):

You’re reading this online, so you’re probably aware that The Stranger isn’t just a newspaper anymore: In addition to our weekly print edition, we’ve got blogs, podcasts, video, tons of expanded web content, and the occasional amateur porn contest. In order to manage the growth of our editorial content — in order to keep putting out Seattle’s only newspaper while at the same time running the best alt-weekly website in the country — we’ve had to change our editorial department’s structure.

Media is changing, as inevitably it will under the pressure of technology. One can be matter-of-fact about it, try to get out ahead of it and shape the coming new world, or endure the slow extinction of a species whose ecological niche is dwindling. “The Stranger isn’t just a newspaper anymore.” What a breath of fresh air. And this from a man who just two years ago wrote as a guest blogger (“Who Am I? Why Am I Here?,” Daily Dish, 8 August 2005),

“Savage Love” readers have been asking me to start a blog of my own for, oh, six or seven years now and I’ve resisted. I’m a Luddite, I confess, one of the ways in which my deeply conservative soul expresses itself. It was only a few years ago that I started accepting email at “Savage Love” …

This reminds me that Matthew Yglesias had a subdued blog-triumphalism mini-kick back in July-August that unlike most blog-triumphalism was really pretty interesting.

“Now With Charts,” The Atlantic.com, 24 July 2007

This is a reminder, I think, of why we should look forward to the day when the op-ed column is a dead format and everyone just blogs. Brooks’ original column would, obviously, have been better if it — like Nyhan’s reply — had come with links to data and charts. What’s more, it’d be good if we could expect Brooks to reply to the sort of criticisms he’s getting from Nyhan, Dean Baker, and others. Maybe he has something fascinating to say on his own behalf. But the way the columnizing world works, there’s almost no chance he’ll address his next column to trying to rebut the critics of this one. But a back-and-forth debate on this subject with links and charts and data would be much more interesting than what we’re going to get instead where liberals decide Brooks is a liar and Brooks remains convinced that liberals are crazy.

“Better Get a New Job,” The Atlantic.com, 19 August 2007

As Kevin Drum says there was no crowding out here where what Marty Lederman or Duncan Black or Andrew or I were doing somehow made it more difficult for newspapers to do investigative reporting. If anything, the reverse is true. The widespread availability of a vast sea of armchair analysis and commentary on the internet will, over time, force large, professionalized news organizations to focus on their core, hard-to-duplicate competencies — and spend less time on the sort of fact-averse punditry Skube’s doing right here.

It was easier to see the harrumphing of the recording industry as what it was: the slothful groan of the vested interest in the face of a new upstart. There was too much crass money lying around for us to not see through all their protestations about art. Journalists and writers have a more subtly wrought tale to spin.

I particularly like Mr. Yglesias’s second point. The blogosphere and the mainstream media are like countries in the economist’s parable of comparative advantage. And like the citizenry of those countries, bloggers and journalists can’t help but see the shifts and specialization from which the advantage arises as threatening. “They took our jobs.”

It’s worth noting that in the theory of comparative advantage both countries benefit from specialization even when one country is superior at all activities in question. Perhaps it won’t matter so much that bloggers are just a bunch of guys in their pajamas and that politicians have learned how to game the press.
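The parable is easy to work out in miniature. Here is a toy Ricardian calculation; the hours-per-unit figures are invented textbook numbers, not a claim about actual newsrooms. The mainstream media is absolutely better at both goods, yet specialization still leaves the pair better off overall:

```python
# A minimal Ricardian sketch of comparative advantage. The
# hours-per-unit figures below are invented for illustration.

# hours of labor needed to produce one unit of each good
costs = {
    "msm":      {"reporting": 1, "commentary": 2},  # better at both
    "bloggers": {"reporting": 4, "commentary": 2},
}

HOURS = 12  # hours of labor each side has available

def autarky(side):
    """No trade: each side splits its hours evenly between goods."""
    return {good: (HOURS / 2) / c for good, c in costs[side].items()}

# Each side specializes where its *relative* cost is lower: the MSM
# gives up 2 reporting units per commentary unit, the bloggers only
# 0.5, so the bloggers take commentary and the MSM takes reporting.
specialized = {
    "reporting": HOURS / costs["msm"]["reporting"],
    "commentary": HOURS / costs["bloggers"]["commentary"],
}

total_autarky = {
    good: autarky("msm")[good] + autarky("bloggers")[good]
    for good in ("reporting", "commentary")
}

print(total_autarky)  # {'reporting': 7.5, 'commentary': 6.0}
print(specialized)    # {'reporting': 12.0, 'commentary': 6.0}
```

With specialization, combined output of commentary holds steady while reporting jumps from 7.5 to 12 units, so at any terms of trade between the two opportunity costs both sides come out ahead, even though one side is superior at everything.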

Real Programmers Don’t Use PASCAL

Ed Post, “Readers’ Forum: Real Programmers Don’t Use PASCAL,” Datamation, vol. 29, no. 7, July 1983, pp. 263-265

Editor’s note: What I have posted below is the document as it circulated on Usenet, BBSs and the Internet, and as I first encountered it. Like all viral forwards, it contains a certain amount of scholia, interpolated material and additional formatting conventions. I have also posted a PDF image of the letter as it originally appeared in Datamation here (1.18MB). ~ DWT (1 April 2013)

Ed Post
Graphic Software Systems
P.O. Box 673
25117 S.W. Parkway
Wilsonville, OR 97070

Back in the good old days — the “Golden Era” of computers, it was easy to separate the men from the boys (sometimes called “Real Men” and “Quiche Eaters” in the literature). During this period, the Real Men were the ones that understood computer programming, and the Quiche Eaters were the ones that didn’t. A real computer programmer said things like “DO 10 I=1,10” and “ABEND” (they actually talked in capital letters, you understand), and the rest of the world said things like “computers are too complicated for me” and “I can’t relate to computers — they’re so impersonal”. (A previous work [1] points out that Real Men don’t “relate” to anything, and aren’t afraid of being impersonal.)

But, as usual, times change. We are faced today with a world in which little old ladies can get computerized microwave ovens, 12 year old kids can blow Real Men out of the water playing Asteroids and Pac-Man, and anyone can buy and even understand their very own Personal Computer. The Real Programmer is in danger of becoming extinct, of being replaced by high-school students with TRASH-80s!

There is a clear need to point out the differences between the typical high-school junior Pac-Man player and a Real Programmer. Understanding these differences will give these kids something to aspire to — a role model, a Father Figure. It will also help employers of Real Programmers to realize why it would be a mistake to replace the Real Programmers on their staff with 12 year old Pac-Man players (at a considerable salary savings).

LANGUAGES

The easiest way to tell a Real Programmer from the crowd is by the programming language he (or she) uses. Real Programmers use FORTRAN. Quiche Eaters use PASCAL. Nicklaus Wirth, the designer of PASCAL, was once asked, “How do you pronounce your name?” He replied “You can either call me by name, pronouncing it ‘Veert’, or call me by value, ‘Worth’.” One can tell immediately from this comment that Nicklaus Wirth is a Quiche Eater. The only parameter passing mechanism endorsed by Real Programmers is call-by-value-return, as implemented in the IBM/370 FORTRAN G and H compilers. Real programmers don’t need abstract concepts to get their jobs done: they are perfectly happy with a keypunch, a FORTRAN IV compiler, and a beer.

  • Real Programmers do List Processing in FORTRAN.
  • Real Programmers do String Manipulation in FORTRAN.
  • Real Programmers do Accounting (if they do it at all) in FORTRAN.
  • Real Programmers do Artificial Intelligence programs in FORTRAN.

If you can’t do it in FORTRAN, do it in assembly language. If you can’t do it in assembly language, it isn’t worth doing.

STRUCTURED PROGRAMMING

Computer science academicians have gotten into the “structured programming” rut over the past several years. They claim that programs are more easily understood if the programmer uses some special language constructs and techniques. They don’t all agree on exactly which constructs, of course, and the examples they use to show their particular point of view invariably fit on a single page of some obscure journal or another — clearly not enough of an example to convince anyone. When I got out of school, I thought I was the best programmer in the world. I could write an unbeatable tic-tac-toe program, use five different computer languages, and create 1000 line programs that WORKED. (Really!) Then I got out into the Real World. My first task in the Real World was to read and understand a 200,000 line FORTRAN program, then speed it up by a factor of two. Any Real Programmer will tell you that all the Structured Coding in the world won’t help you solve a problem like that — it takes actual talent. Some quick observations on Real Programmers and Structured Programming:

  • Real Programmers aren’t afraid to use GOTOs.
  • Real Programmers can write five page long DO loops without getting confused.
  • Real Programmers enjoy Arithmetic IF statements because they make the code more interesting.
  • Real Programmers write self-modifying code, especially if it saves them 20 nanoseconds in the middle of a tight loop.
  • Real Programmers don’t need comments: the code is obvious.
  • Since FORTRAN doesn’t have a structured IF, REPEAT ... UNTIL, or CASE statement, Real Programmers don’t have to worry about not using them. Besides, they can be simulated when necessary using assigned GOTOs.

Data structures have also gotten a lot of press lately. Abstract Data Types, Structures, Pointers, Lists, and Strings have become popular in certain circles. Wirth (the above-mentioned Quiche Eater) actually wrote an entire book [2] contending that you could write a program based on data structures, instead of the other way around. As all Real Programmers know, the only useful data structure is the array. Strings, lists, structures, sets — these are all special cases of arrays and can be treated that way just as easily without messing up your programing language with all sorts of complications. The worst thing about fancy data types is that you have to declare them, and Real Programming Languages, as we all know, have implicit typing based on the first letter of the (six character) variable name.

OPERATING SYSTEMS

What kind of operating system is used by a Real Programmer? CP/M? God forbid — CP/M, after all, is basically a toy operating system. Even little old ladies and grade school students can understand and use CP/M.

Unix is a lot more complicated of course — the typical Unix hacker never can remember what the PRINT command is called this week — but when it gets right down to it, Unix is a glorified video game. People don’t do Serious Work on Unix systems: they send jokes around the world on USENET and write adventure games and research papers.

No, your Real Programmer uses OS/370. A good programmer can find and understand the description of the IJK305I error he just got in his JCL manual. A great programmer can write JCL without referring to the manual at all. A truly outstanding programmer can find bugs buried in a 6 megabyte core dump without using a hex calculator. (I have actually seen this done.)

OS/370 is a truly remarkable operating system. It’s possible to destroy days of work with a single misplaced space, so alertness in the programming staff is encouraged. The best way to approach the system is through a keypunch. Some people claim there is a Time Sharing system that runs on OS/370, but after careful study I have come to the conclusion that they are mistaken.

PROGRAMMING TOOLS

What kind of tools does a Real Programmer use? In theory, a Real Programmer could run his programs by keying them into the front panel of the computer. Back in the days when computers had front panels, this was actually done occasionally. Your typical Real Programmer knew the entire bootstrap loader by memory in hex, and toggled it in whenever it got destroyed by his program. (Back then, memory was memory — it didn’t go away when the power went off. Today, memory either forgets things when you don’t want it to, or remembers things long after they’re better forgotten.) Legend has it that Seymour Cray, inventor of the Cray I supercomputer and most of Control Data’s computers, actually toggled the first operating system for the CDC7600 in on the front panel from memory when it was first powered on. Seymour, needless to say, is a Real Programmer.

One of my favorite Real Programmers was a systems programmer for Texas Instruments. One day, he got a long distance call from a user whose system had crashed in the middle of some important work. Jim was able to repair the damage over the phone, getting the user to toggle in disk I/O instructions at the front panel, repairing system tables in hex, reading register contents back over the phone. The moral of this story: while a Real Programmer usually includes a keypunch and lineprinter in his toolkit, he can get along with just a front panel and a telephone in emergencies.

In some companies, text editing no longer consists of ten engineers standing in line to use an 029 keypunch. In fact, the building I work in doesn’t contain a single keypunch. The Real Programmer in this situation has to do his work with a text editor program. Most systems supply several text editors to select from, and the Real Programmer must be careful to pick one that reflects his personal style. Many people believe that the best text editors in the world were written at Xerox Palo Alto Research Center for use on their Alto and Dorado computers [3]. Unfortunately, no Real Programmer would ever use a computer whose operating system is called SmallTalk, and would certainly not talk to the computer with a mouse.

Some of the concepts in these Xerox editors have been incorporated into editors running on more reasonably named operating systems. EMACS and VI are probably the most well known of this class of editors. The problem with these editors is that Real Programmers consider “what you see is what you get” to be just as bad a concept in text editors as it is in women. No, the Real Programmer wants a “you asked for it, you got it” text editor — complicated, cryptic, powerful, unforgiving, dangerous. TECO, to be precise.

It has been observed that a TECO command sequence more closely resembles transmission line noise than readable text [4]. One of the more entertaining games to play with TECO is to type your name in as a command line and try to guess what it does. Just about any possible typing error while talking with TECO will probably destroy your program, or even worse — introduce subtle and mysterious bugs in a once working subroutine.

For this reason, Real Programmers are reluctant to actually edit a program that is close to working. They find it much easier to just patch the binary object code directly, using a wonderful program called SUPERZAP (or its equivalent on non-IBM machines). This works so well that many working programs on IBM systems bear no relation to the original FORTRAN code. In many cases, the original source code is no longer available. When it comes time to fix a program like this, no manager would even think of sending anything less than a Real Programmer to do the job — no Quiche Eating structured programmer would even know where to start. This is called “job security”.

Some programming tools NOT used by Real Programmers:

  • FORTRAN preprocessors like MORTRAN and RATFOR. The Cuisinarts of programming — great for making Quiche. See comments above on structured programming.
  • Source language debuggers. Real Programmers can read core dumps.
  • Compilers with array bounds checking. They stifle creativity, destroy most of the interesting uses for EQUIVALENCE, and make it impossible to modify the operating system code with negative subscripts. Worst of all, bounds checking is inefficient.
  • Source code maintenance systems. A Real Programmer keeps his code locked up in a card file, because it implies that its owner cannot leave his important programs unguarded [5].

THE REAL PROGRAMMER AT WORK

Where does the typical Real Programmer work? What kind of programs are worthy of the efforts of so talented an individual? You can be sure that no Real Programmer would be caught dead writing accounts-receivable programs in COBOL, or sorting mailing lists for People magazine. A Real Programmer wants tasks of earth-shaking importance (literally!):

  • Real Programmers work for Los Alamos National Laboratory, writing atomic bomb simulations to run on Cray I supercomputers.
  • Real Programmers work for the National Security Agency, decoding Russian transmissions.
  • It was largely due to the efforts of thousands of Real Programmers working for NASA that our boys got to the moon and back before the Russkies.
  • The computers in the Space Shuttle were programmed by Real Programmers.
  • Real Programmers are at work for Boeing designing the operating systems for cruise missiles.

Some of the most awesome Real Programmers of all work at the Jet Propulsion Laboratory in California. Many of them know the entire operating system of the Pioneer and Voyager spacecraft by heart. With a combination of large ground-based FORTRAN programs and small spacecraft-based assembly language programs, they can do incredible feats of navigation and improvisation, such as hitting ten-kilometer-wide windows at Saturn after six years in space, and repairing or bypassing damaged sensor platforms, radios, and batteries. Allegedly, one Real Programmer managed to tuck a pattern-matching program into a few hundred bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter.

One plan for the upcoming Galileo spacecraft mission is to use a gravity assist trajectory past Mars on the way to Jupiter. This trajectory passes within 80 +/- 3 kilometers of the surface of Mars. Nobody is going to trust a PASCAL program (or PASCAL programmer) for navigation to these tolerances.

As you can tell, many of the world’s Real Programmers work for the U.S. Government, mainly the Defense Department. This is as it should be. Recently, however, a black cloud has formed on the Real Programmer horizon.

It seems that some highly placed Quiche Eaters at the Defense Department decided that all Defense programs should be written in some grand unified language called “ADA” (registered trademark, DoD). For a while, it seemed that ADA was destined to become a language that went against all the precepts of Real Programming — a language with structure, a language with data types, strong typing, and semicolons. In short, a language designed to cripple the creativity of the typical Real Programmer. Fortunately, the language adopted by DoD has enough interesting features to make it approachable: it’s incredibly complex, includes methods for messing with the operating system and rearranging memory, and Edsger Dijkstra doesn’t like it [6]. (Dijkstra, as I’m sure you know, was the author of “GoTos Considered Harmful” — a landmark work in programming methodology, applauded by Pascal Programmers and Quiche Eaters alike.) Besides, the determined Real Programmer can write FORTRAN programs in any language.

The Real Programmer might compromise his principles and work on something slightly more trivial than the destruction of life as we know it, providing there’s enough money in it. There are several Real Programmers building video games at Atari, for example. (But not playing them. A Real Programmer knows how to beat the machine every time: no challenge in that.) Everyone working at LucasFilm is a Real Programmer. (It would be crazy to turn down the money of 50 million Star Wars fans.) The proportion of Real Programmers in Computer Graphics is somewhat lower than the norm, mostly because nobody has found a use for Computer Graphics yet. On the other hand, all Computer Graphics is done in FORTRAN, so there are a fair number of people doing Graphics in order to avoid having to write COBOL programs.

THE REAL PROGRAMMER AT PLAY

Generally, the Real Programmer plays the same way he works — with computers. He is constantly amazed that his employer actually pays him to do what he would be doing for fun anyway, although he is careful not to express this opinion out loud. Occasionally, the Real Programmer does step out of the office for a breath of fresh air and a beer or two. Some tips on recognizing Real Programmers away from the computer room:

  • At a party, the Real Programmers are the ones in the corner talking about operating system security and how to get around it.
  • At a football game, the Real Programmer is the one comparing the plays against his simulations printed on 11 by 14 fanfold paper.
  • At the beach, the Real Programmer is the one drawing flowcharts in the sand.
  • A Real Programmer goes to a disco to watch the light show.
  • At a funeral, the Real Programmer is the one saying “Poor George. And he almost had the sort routine working before the coronary.”
  • In a grocery store, the Real Programmer is the one who insists on running the cans past the laser checkout scanner himself, because he never could trust keypunch operators to get it right the first time.

THE REAL PROGRAMMER’S NATURAL HABITAT

What sort of environment does the Real Programmer function best in? This is an important question for the managers of Real Programmers. Considering the amount of money it costs to keep one on the staff, it’s best to put him (or her) in an environment where he can get his work done.

The typical Real Programmer lives in front of a computer terminal. Surrounding this terminal are:

  • Listings of all programs the Real Programmer has ever worked on, piled in roughly chronological order on every flat surface in the office.
  • Some half-dozen or so partly filled cups of cold coffee. Occasionally, there will be cigarette butts floating in the coffee. In some cases, the cups will contain Orange Crush.
  • Unless he is very good, there will be copies of the OS JCL manual and the Principles of Operation open to some particularly interesting pages.
  • Taped to the wall is a line-printer Snoopy calendar for the year 1969.
  • Strewn about the floor are several wrappers for peanut butter filled cheese bars (the type that are made stale at the bakery so they can’t get any worse while waiting in the vending machine).
  • Hiding in the top left-hand drawer of the desk is a stash of double stuff Oreos for special occasions.
  • Underneath the Oreos is a flow-charting template, left there by the previous occupant of the office. (Real Programmers write programs, not documentation. Leave that to the maintenance people.)

The Real Programmer is capable of working 30, 40, even 50 hours at a stretch, under intense pressure. In fact, he prefers it that way. Bad response time doesn’t bother the Real Programmer — it gives him a chance to catch a little sleep between compiles. If there is not enough schedule pressure on the Real Programmer, he tends to make things more challenging by working on some small but interesting part of the problem for the first nine weeks, then finishing the rest in the last week, in two or three 50-hour marathons. This not only impresses his manager, who was despairing of ever getting the project done on time, but creates a convenient excuse for not doing the documentation. In general:

  • No Real Programmer works 9 to 5. (Unless it’s 9 in the evening to 5 in the morning.)
  • Real Programmers don’t wear neckties.
  • Real Programmers don’t wear high heeled shoes.
  • Real Programmers arrive at work in time for lunch. [9]
  • A Real Programmer might or might not know his spouse’s name. He does, however, know the entire ASCII (or EBCDIC) code table.
  • Real Programmers don’t know how to cook. Grocery stores aren’t often open at 3 a.m., so they survive on Twinkies and coffee.

THE FUTURE

What of the future? It is a matter of some concern to Real Programmers that the latest generation of computer programmers is not being brought up with the same outlook on life as its elders. Many of them have never seen a computer with a front panel. Hardly anyone graduating from school these days can do hex arithmetic without a calculator. College graduates these days are soft — protected from the realities of programming by source level debuggers, text editors that count parentheses, and user friendly operating systems. Worst of all, some of these alleged computer scientists manage to get degrees without ever learning FORTRAN! Are we destined to become an industry of Unix hackers and Pascal programmers?

On the contrary. From my experience, I can only report that the future is bright for Real Programmers everywhere. Neither OS/370 nor FORTRAN show any signs of dying out, despite all the efforts of Pascal programmers the world over. Even more subtle tricks, like adding structured coding constructs to FORTRAN, have failed. Oh sure, some computer vendors have come out with FORTRAN 77 compilers, but every one of them has a way of converting itself back into a FORTRAN 66 compiler at the drop of an option card — to compile DO loops like God meant them to be.

Even Unix might not be as bad for Real Programmers as it once was. The latest release of Unix has the potential of an operating system worthy of any Real Programmer. It has two different and subtly incompatible user interfaces, an arcane and complicated terminal driver, virtual memory. If you ignore the fact that it’s structured, even C programming can be appreciated by the Real Programmer: after all, there’s no type checking, variable names are seven (ten? eight?) characters long, and the added bonus of the Pointer data type is thrown in. It’s like having the best parts of FORTRAN and assembly language in one place. (Not to mention some of the more creative uses for #define.)

No, the future isn’t all that bad. Why, in the past few years, the popular press has even commented on the bright new crop of computer nerds and hackers ([7] and [8]) leaving places like Stanford and M.I.T. for the Real World. From all evidence, the spirit of Real Programming lives on in these young men and women. As long as there are ill-defined goals, bizarre bugs, and unrealistic schedules, there will be Real Programmers willing to jump in and Solve The Problem, saving the documentation for later. Long live FORTRAN!

ACKNOWLEDGEMENT

I would like to thank Jan E., Dave S., Rich G., and Rich E. for their help in characterizing the Real Programmer, Heather B. for the illustration, Kathy E. for putting up with it, and atd!avsdS:mark for the initial inspiration.

REFERENCES

[1] Feirstein, B., Real Men Don’t Eat Quiche, New York, Pocket Books, 1982.
[2] Wirth, N., Algorithms + Data Structures = Programs, Prentice Hall, 1976.
[3] Ilson, R., ‘Recent Research in Text Processing’, IEEE Trans. Prof. Commun., Vol. PC-23, No. 4, Dec. 1980.
[4] Finseth, C., Theory and Practice of Text Editors – or – a Cookbook for an EMACS, B.S. Thesis, MIT/LCS/TM-165, Massachusetts Institute of Technology, May 1980.
[5] Weinberg, G., The Psychology of Computer Programming, New York, Van Nostrand Reinhold, 1971, page 110.
[6] Dijkstra, E., On the GREEN Language Submitted to the DoD, SIGPLAN Notices, Volume 13, Number 10, October 1978.
[7] Rose, Frank, Joy of Hacking, Science 82, Volume 3, Number 9, November 1982, pages 58 – 66.
[8] The Hacker Papers, Psychology Today, August 1980.

William Gibson’s Idoru and Blogging

I want to add one more thought about blogging before I get started. In my Inaugural Post I asked, “Why join this societal wave of exhibitionism?” and mentioned the relation of technology to surveillance, voyeurism, privacy and exhibitionism. Every time I think about these issues, a character from William Gibson’s 1996 novel Idoru comes to mind.

Before I delve into the main point, I want to say that I think William Gibson is a genius. In his first novel, Neuromancer (1984), the hit that launched the cyberpunk genre, he came up with the term cyberspace. In case you passed over that parenthetical date too quickly, let me point out that he came up with the idea of cyberspace in 1984: before there was either the World Wide Web or virtual reality.

Yes, I am aware that Tron came out in 1982, but Tron is about a man who is sucked into a little, tiny world inside of a computer where the programs are personified (e.g. the villain, “Master Control”) and forced to fight high-tech gladiatorial games in sexy spandex body suits. This of course will never happen and is merely a technological variant of Fantastic Voyage, The Wizard of Oz or The Lion, The Witch and the Wardrobe. Yes, there are some silly parts of Neuromancer: the space Rastafarians are hardly the heady stuff of Arthur C. Clarke or Isaac Asimov. However, a total-immersion interface to a simulated world spread over a network of computers is freaking visionary. Unlike Tron, which set people’s understanding of computers back a decade, Neuromancer is the future.

What is most relevant to blogging is his vision of celebrity and media that makes up the ideological backdrop of Idoru. The novel is set in a not-too-distant future where mass media has continued to throw its net wider and wider, where, as Andy Warhol said in what must be the most accurate prediction ever made, “everyone will be famous for fifteen minutes.” Murderers are famous, the parents of their victims are famous, college students fake kidnappings to get on television, unaccomplished debutantes are famous for nothing other than ostentation, people become famous when sex tapes “accidentally” find their way onto the Internet, people elbow their way onto television for opportunities to boast about things that previously one wouldn’t even want one’s neighbors to know. Actually, I am talking about the present, but imagine this trend married to the myriad of widely affordable media production and distribution technologies, projected twenty years into the future. With thousands of television channels to fill up and with everyone’s vanity site on the Internet and with no gatekeepers, fame will devolve to the masses. Gibson has one of his characters describe it thus:

“Nobody’s really famous anymore, Laney. Have you noticed that?…I mean really famous. There’s not much fame left, not in the old sense. Not enough to go around…We learned to print money off this stuff,” she said. “Coin of our realm. Now we’ve printed too much; even the audience knows. It shows in the ratings…Except,” she said… “when we decide to destroy one.” (6-7)

Gibson spends the opening chapters of the book describing how derelict protagonist Colin Laney lost his previous job as a “researcher” at a tabloid news show called Slitscan. In this future, like our present, an increasing proportion of people’s transactions are being passively recorded in corporate databases. And also as in our present, some companies exist solely to purchase information, correlate disparate pieces in useful ways, and sell the results to those who might put them to some (usually pernicious) use. In this novel, Slitscan had a questionable relationship with such a data agglomeration corporation called DatAmerica, and Laney’s job was to troll through the data trails left by celebrities looking for the “nodal points” — the confluences of data — that indicated something gossip-worthy for the show to report.

Laney was not, he was careful to point out, a voyeur. He had a peculiar knack with data collection architectures, and a medically documented concentration deficit that he could toggle, under certain conditions, into a state of pathological hyperfocus…he was an intuitive fisher of patterns of information: of the sort of signature a particular individual inadvertently created in the net as he or she went about the mundane yet endlessly multiplex business of life in a digital society. (30-31)

Laney was fired when, while researching the mistress of a celebrity, it became clear to him from her data trail that she intended to commit suicide and he tried unsuccessfully to intervene. Here Laney checks back with his mark after returning from a vacation:

The nodal point was different now, though he had no language to describe the change. He sifted the countless fragments that had clustered around Alison Shires in his absence, feeling for the source of his earlier conviction. He called up the music that she’d accessed while he’d been in Mexico, playing each song in the order of her selection. He found her choices had grown more life-affirming; she’d moved to a new provider, Upful Groupvine, whose relentlessly positive product was the musical equivalent of the Good News Channel.

Cross-indexing her charges against the records of her credit-provider and its credit retailers, he produced a list of everything she’d purchased in the past week. Six-pack, blades, Tokkai carton opener. Did she own a Tokkai carton opener? But then he remembered Kathy’s advice, that this was the part of research most prone to produce serious transference, the point at which the researcher’s intimacy with the subject could lead to loss of perspective. “It’s often easiest for us to identify at the retail level, Laney. We are a shopping species. Find yourself buying a different brand of frozen peas because the subject, watch out.” (66-67)

Before excerpting a passage where Gibson describes the future of gossip journalism, let me remind you that this is Gibson’s view from 1996, when MTV’s The Real World was only in its 4th season, the O.J. Simpson trial was just over, Monica Lewinsky’s blue dress was stain-free and Survivor was still four years off:

Slitscan was descended from “reality” programming and the network tabloids of the late twentieth century, but it resembled them no more than some large, swift, bipedal carnivore resembled its sluggish, shallow-dwelling ancestors. Slitscan was the mature form, supporting fully global franchises. Slitscan’s revenues had paid for entire satellites and built the building he worked in in Burbank.

Slitscan was a show so popular that it had evolved into something akin to the old idea of a network. It was flanked and buffered by spinoffs and peripherals, each designed to shunt the viewer back to the crucial core, the familiar and reliably bloody altar that one of Laney’s Mexican co-workers called Smoking Mirror.

It was impossible to work at Slitscan without a sense of participating in history, or else what Kathy Torrance would argue had replaced history. Slitscan itself, Laney suspected, might be one of those larger nodal points he sometimes found himself trying to imagine, an informational peculiarity opening into some unthinkably deeper structure.

In his quest for lesser nodal points, the sort that Kathy sent him into DatAmerica to locate, Laney had already affected the course of municipal elections, the market in patent gene futures, abortion laws in the state of New Jersey, and the spin on an ecstatic pro-euthanasia movement (or suicide cult, depending) called Cease Upon the Midnight, not to mention the lives and careers of several dozen celebrities of various kinds.

Not always for the worst, either, in terms of what the show’s subjects might have wished for themselves. Kathy’s segment on the Dukes of Nuke ‘Em, exposing the band’s exclusive predilection for Iraqi fetal tissue, had sent their subsequent release instant platinum (and had resulted in show-trials and public hangings in Baghdad, but he supposed life was hard there to begin with). (50-52)

Of course, something like Slitscan — or the Jerry Springer Show, Cops, E! True Hollywood Story, Average Joe or The Fifth Wheel in our time — could not exist were it not for the sadistic voyeurism of the masses. I select this passage as much to satisfy my own snickering elitism as to illustrate the lust for other people’s misery that defines our current and future television viewing audience:

…Slitscan’s audience…is best visualized as a vicious, lazy, profoundly ignorant, perpetually hungry organism craving the warm god-flesh of the anointed. Personally I like to imagine something the size of a baby hippo, the color of a week-old boiled potato, that lives by itself, in the dark, in a double-wide on the outskirts of Topeka. It’s covered with eyes and it sweats constantly. The sweat runs into those eyes and makes them sting. It has no mouth, Laney, no genitals, and can only express its mute extremes of murderous rage and infantile desire by changing the channels on a universal remote. Or by voting in presidential elections. (35-36)

Of course one can already see aspects of this world coming into being. Corporations are harvesting, agglomerating and correlating information at a frightening and increasing rate — but that is for another post. What I am thinking about here is the voyeuristic and micro-celebrity aspects of our quickening information age. I have a friend who occasionally reads several people’s blogs, some written by people he has never even met. Of one blogger that he hasn’t met, he maintains that the man is teetering on the brink of an infidelity with a coworker against his current girlfriend, an infidelity whose imminence the blogger himself does not yet recognize! My friend keeps returning to this blog awaiting the climactic post as if it were a soap opera.

There you have it: micro-celebrity, sadistic voyeurism, a readable data trail from which one might extrapolate future behavior with a minimal amount of theory. Admittedly, my friend is following an intentional data trail rather than a passive one, but the small difference between this situation and that of Gibson’s Laney anticipating the suicide is striking.

I don’t absolve myself of any of this. I loved the show Trauma: Real Life in the ER, which is about as sadistic of a voyeurism as you’ll find. I did say in the “Inaugural Post” that I consider this a “deeply improprietious endeavor.” I am, however, aware of the context in which I embark upon this effort.