Malcolm Gladwell’s Infinite Monkey Theorem

The Infinite Monkey Theorem apparently still holds if you substitute mediocre humans for monkeys. Here is Malcolm Gladwell writing on how to brute force genius (“In the Air,” The New Yorker, 12 May 2008):

In the nineteen-sixties, the sociologist Robert K. Merton wrote a famous essay on scientific discovery in which he raised the question of what the existence of multiples tells us about genius. No one is a partner to more multiples [simultaneous scientific discovery], he pointed out, than a genius, and he came to the conclusion that our romantic notion of the genius must be wrong. A scientific genius is not a person who does what no one else can do; he or she is someone who does what it takes many others to do. The genius is not a unique source of insight; he is merely an efficient source of insight. “Consider the case of Kelvin, by way of illustration,” Merton writes, summarizing work he had done with his Columbia colleague Elinor Barber:

After examining some 400 of his 661 scientific communications and addresses . . . Dr. Elinor Barber and I find him testifying to at least 32 multiple discoveries in which he eventually found that his independent discoveries had also been made by others. These 32 multiples involved an aggregate of 30 other scientists, some, like Stokes, Green, Helmholtz, Cavendish, Clausius, Poincaré, Rayleigh, themselves men of undeniable genius, others, like Hankel, Pfaff, Homer Lane, Varley and Lamé, being men of talent, no doubt, but still not of the highest order. . . . For the hypothesis that each of these discoveries was destined to find expression, even if the genius of Kelvin had not obtained, there is the best of traditional proof: each was in fact made by others. Yet Kelvin’s stature as a genius remains undiminished. For it required a considerable number of others to duplicate these 32 discoveries which Kelvin himself made.

This is, surely, what an invention session is: it is Hankel, Pfaff, Homer Lane, Varley, and Lamé in a room together, and if you have them on your staff you can get a big chunk of Kelvin’s discoveries, without ever needing to have Kelvin — which is fortunate, because, although there are plenty of Homer Lanes, Varleys, and Pfaffs in the world, there are very few Kelvins.

Our tendency is to imagine Newton, Darwin or Einstein as the pinnacle of genius, but they are merely the peak performance of that draft-design kludge we all carry around in our heads. One can easily imagine ranks of genius many levels beyond our showings to date, ranging all the way from the Star Trek character Data to the gods (I use these literary examples merely to demonstrate that we’re capable of imagining higher orders of genius). Each rank of genius, all the way up to the gods, bears the same relation to the rank just below as Kelvin does to the Homer Lanes, Varleys and Pfaffs: not one of qualitative difference, but merely one of efficiency. And that relation obtains not just between adjacent levels, but over the entire span from pinnacle to base as well. It’s the wet-machine corollary of Turing completeness.

This is, of course, why the argument from design for the existence of god fails: one can imagine the universe being created in an instant by a supergenius, but it is equally plausible that it was created by a committee with some time on their hands. And the more time available, or the larger the committee, the less capable any of its members needs to be to produce a given output.

Commercial Pleasures

21 May 2008, David Cook wins American Idol

When I first moved to D.C. I had a roommate who was such a Redskins fan that I almost couldn’t be in the house when a game was on, so loudly did he scream at the television. I didn’t get it. There’s no excuse for getting that emotionally wrapped up in something so alien to one’s own life. Then I discovered Ninja Warrior on G4. S. and I scream and wave our hands at the television with increasing fanaticism as the contestant nears the finish line and the announcer goes ballistic with Japanese excitement.

Last night when they declared David Cook the winner of American Idol I lost it in a way that I never have before over television. From the time of the final six (Carly, Brooke, Castro, Syesha, Archuleta and Cook) it had been apparent to both S. and me that it was going to come down to a contest of the Davids, and both of us had pretty much figured that Archuleta had the teenie-bops with their text-dexterous fingers all lined up and thus was going to win. Last week when Syesha was voted off I thought Archuleta was going to win. Mr. Cook seemed burned out and you could see the disappointment he had with himself after many of his performances. Meanwhile Archuleta had been on the rise. “Stand by Me” couldn’t have been a better choice for him. Then on Tuesday night Archuleta’s rendition of Elton John’s “Don’t Let the Sun Go Down on Me” was perfect. Meanwhile Mr. Cook’s songs were sufficiently mediocre that after his final performance (“The World I Know”) he too was so certain that he had lost that the moment he finished his song, before Randy Jackson had said his first “Yo,” Mr. Cook started to cry. The judges all gave him consolation compliments.

And I had long since made my peace with Mr. Cook coming in second. He has a tendency toward insipid pop rock as it is, and being saddled with the obligations of being the American Idol was only going to hamper him further. They were going to foist a few television commercials and a really marketable, over-produced album on him, when what he needs is to get together with someone edgier, someone who realizes that explosive, dramatic power ballads are Mr. Cook’s forte. Better that Mr. Archuleta end up the American Idol. He’s already a one-man boy band.

And so last night when Ryan Seacrest started in, “And the winner is … David,” I interjected, “Archuleta.” Seacrest paused, having revealed nothing, with two finalists both named David. Again I finished his sentence: “Archuleta.” When Mr. Seacrest finally let out “Cook” I leapt off the sofa. “No fucking way!” I shouted in disbelief. After four more expletives and expressions of disbelief I turned around in a circle and stared at the television. Out of 97.5 million votes, Mr. Cook won 54.75 million to Mr. Archuleta’s 42.75 million, or 56 percent of the vote, a 12 million vote margin of victory. I was sure that David Cook was going to lose, but he won, and in the end it wasn’t even close.
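As an aside, the reported numbers do hang together arithmetically. A throwaway sanity check, taking the press-reported totals at face value:

```python
# Sanity check on the reported American Idol finale tallies.
# Figures are the ones reported in the press at the time (97.5 million
# total votes); treat them as approximate round numbers.
total = 97.5e6
cook = 54.75e6
archuleta = 42.75e6

assert cook + archuleta == total  # the split accounts for every vote

margin = cook - archuleta
share = cook / total

print(f"margin: {margin / 1e6:g} million votes")  # 12 million
print(f"Cook's share: {share:.1%}")               # 56.2%
```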

And Mr. Cook, too, thought he had lost. He seemed resigned to his fate and already congratulatory toward Mr. Archuleta as Mr. Seacrest taunted them with the results. I think it was the shock as much as the adulation and excitement that caused Mr. Cook to become so emotional after the announcement.

This is part justice and part tragedy. Mr. Cook is 25 years old. He got a degree in graphic design, but before settling into the nine-to-fiver he told himself that he was going to give music a few more years to see if he could make it work. One of his friends, a band-mate, had already given up on music and gone to real work. Mr. Cook was nearing the end of his experiment and already had his alternative waiting in the wings. He’s been given a new lease on his dream. On the other hand, his older brother, Adam, is dying of brain cancer. I heard someone, I think it was his mother, say that it’s like heaven and hell: for David to be doing so well while things are going so poorly for Adam. I can’t imagine the survivor’s guilt Mr. Cook must be feeling in front of his brother. It has been an emotional rollercoaster for Mr. Cook and his family.

How did it happen? Always the political blogger, I can’t avoid analyzing election returns. A few episodes ago I saw a sign waving in the audience: “Cougars for Cook.” The average age of an American Idol viewer is significantly up — witness moi. I guess the cougars overruled the teenie-bops. The other factor was his trio of performances early in the season: Lionel Richie’s “Hello,” the Beatles’ “Eleanor Rigby” and Michael Jackson’s “Billie Jean.” I think he never did anything so spectacular as “Billie Jean,” and he tired as the season wound to its climax, but on those three he built a winning reputation.

I took a little walk this afternoon to go buy lunch and some coffee. I found my pace brisk and my thoughts buoyant. In the background of my mind, David Cook had won, and it has lent the slightest uplift to my mood all day long. It’s stupid, I know, to be such a fan-boy. But I can’t help it: I really like David Cook.

The Stakes in 2008

I have more or less figured that right-wing crossover-voting Democratic Congressmen would hamstring an Obama administration, guaranteeing that any of his significant initiatives go nowhere and forcing him into a Clintonian strategy of triangulation, centrism and micro-initiatives. But Kevin Drum intriguingly suggests that Republicans, chastened by the 2008 outcome, could have the opposite effect (“End of an Era?,” Political Animal, The Washington Monthly, 19 May 2008):

They won’t be willing to say this during a presidential campaign, but there are at least half a dozen smart Republican senators who understand this and don’t really want to go down with the ship. So even if Democrats don’t win a filibuster-proof majority in November — as they almost certainly won’t — it’s likely that there will still be enough survival-inspired GOP senators around to give Barack Obama the votes he needs to make a difference. If that’s the case, and if Obama has the courage of his convictions, his first two years could be historic.

Unfortunately for Senator Obama, it’s structural factors such as this that make Senator Clinton such a tenacious foe: this year could promise a shoo-in victory for the opposition party. And as if that weren’t a sweet enough pot, whoever gets the nomination could potentially — again for structural reasons, not because of any personal vision thing — have a historic administration. Wouldn’t you too fight tooth and nail for such an opportunity were you in Hillary Clinton’s position?

Playing Into bin Laden’s Hands

Last week President Bush (remember him?) took his message somewhere that people might listen without creating a media spectacle, the Israeli Knesset, where he made his now infamous, implicit criticism of Barack Obama (“President Bush Addresses Members of the Knesset,” The Knesset, Jerusalem, Israel, 15 May 2008):

Some seem to believe that we should negotiate with the terrorists and radicals, as if some ingenious argument will persuade them they have been wrong all along. We have heard this foolish delusion before. As Nazi tanks crossed into Poland in 1939, an American senator declared: “Lord, if I could only have talked to Hitler, all this might have been avoided.” We have an obligation to call this what it is — the false comfort of appeasement, which has been repeatedly discredited by history.

Yes, yes, appeasement has been discredited, except in all those other instances where its opposite, belligerence or intransigence, has been discredited too (“The Contradictory Lessons of the Twentieth Century,” smarties, 28 August 2004). The fact is that there is no diplomatic panacea, no rule that firm resolve always works; what is required is that ever so subtle virtue, judgment, exactly what this administration has been lacking.

This all reminds me of the passage from Ron Suskind’s The One Percent Doctrine in which he describes the CIA’s assessment of the meaning of Osama bin Laden’s 29 October 2004 statement, made just days before the 2004 presidential election:

Inside of the CIA, of course, the analysis moved on a different track. They had spent years, as had a similar bin Laden unit at FBI, parsing each expressed word of the al Qaeda leader and his deputy, Zawahiri. What they’d learned over nearly a decade is that bin Laden speaks only for strategic reasons — and those reasons are debated with often startling depth inside the organization’s leadership. …

Today’s conclusion: bin Laden’s message was clearly designed to assist the President’s reelection.

At the five o’clock meeting, once various reports on the latest threats were delivered, John McLaughlin opened the issue with the consensus view: “Bin Laden certainly did a nice favor today for the President.” (pp. 335-336)

The fact is that the policies of President Bush and his administration have been an irreplaceable gift to al Qaeda. As Osama bin Laden himself said in the aforementioned statement,

[It is] easy for us to provoke and bait this administration. All that we have to do is to send two mujahedeen to the furthest point east to raise a piece of cloth on which is written al Qaeda, in order to make generals race there to cause America to suffer human, economic and political losses without their achieving anything of note …

Osama bin Laden essentially told the world that he loves George Bush for playing right into al Qaeda’s hands.

Barack Obama and the left more generally have responded to the President’s implicit criticism, but the response has been entirely meta: it’s beyond the bounds of fair politics, the President shouldn’t make such criticisms while abroad, et cetera. The left should deal squarely with this issue. Appeasement — were the charge even true — would be one thing, but George Bush is America’s gift to Osama bin Laden. Senator Obama is not bin Laden’s candidate: George W. Bush is. For seven years now the West has danced to bin Laden’s tune. On 20 January 2009 that ends.

Ouch! 2008 as 1972

Among all the other things they’ve lost, at least The Economist hasn’t lost their edge. In a review of Rick Perlstein’s new book, Nixonland, they have the following to say about the present election season (“The Fuel of Power,” vol. 387, no. 8579, 10 May 2008, pp. 93-94):

It is hard, in the current political season, to read this book without hearing the sound of history rhyming, to paraphrase Mark Twain. George McGovern’s promise of “post-partisanship” galvanised America’s youth. He trumpeted his opposition to the Vietnam war under the slogan of “right from the start”. He went on to suffer one of the biggest defeats in the general election in American history. “Dirty politics confused him,” Hunter S. Thompson sighed. Nixon chose “experience counts” as his campaign slogan in 1960 and boasted that he had spent “a lifetime getting ready”. He made up for his lack of personal charm by an almost deranged relentlessness. But this week’s result suggests that these are only half-rhymes at best: Barack Obama has already met his Richard Nixon and slain her.

The entire media establishment this week is touting the demise of the Clinton campaign, and the whole thing has been rather unseemly for Senator Clinton, but no one says it in quite such a wince-inducing fashion as The Economist.

American Pseudo-Religion; Science and Experience

The title of David Brooks’s op-ed Tuesday, “The Neural Buddhists” (The New York Times, 13 May 2008), sounded cyberpunk and that was enough to entice me to read it. Turns out it’s some comments on the trend in neurological and genetic research toward characterizing the religious tendency and the religious experience. A lot of the editorial is wishful thinking on the part of a religious conservative, but then there’s the musings from which the piece draws its title:

This new wave of research will not seep into the public realm in the form of militant atheism. Instead it will lead to what you might call neural Buddhism.

In their arguments with Christopher Hitchens and Richard Dawkins, the faithful have been defending the existence of God. That was the easy debate. The real challenge is going to come from people who feel the existence of the sacred, but who think that particular religions are just cultural artifacts built on top of universal human traits. It’s going to come from scientists whose beliefs overlap a bit with Buddhism.

I often point out that the fastest growing religion in the U.S. today is not Mormonism or any branch of Christianity, but the poorly conceptualized “spiritual but not religious” (“Teens: Spiritual, But Not Religious,” smarties, 11 January 2005). This isn’t some entirely post-1960s baby-boom or gen-X phenomenon. It is the latest manifestation of a long line of uniquely American religion stretching from the Enlightenment deism of the founding generation to the transcendentalism of the mid-Nineteenth Century to the Progressive era psycho-spirituality of William James. It pulls together an idiosyncratic combination of Christianity, grand historical conspiracy theories à la the Freemasons, various strains of mysticism, yeoman pragmatism, naturalism, popular science, amateur philosophical speculation, do-gooderism, health fads, self-help, popular psychology and positive thinking. It’s all of a piece with American messianism, paranoia, individualism, pragmatism and the melting pot. It’s a little incipient and a little too convenient for the American way of life, having dispensed with the hard truths and the dark side of religion as well as any of the really imposing moral injunctions, but there it is. And Mr. Brooks is right to point out that the best fit for this among the ancient religions is Buddhism.

As for the rest of the article, it’s just the ontological argument for the existence of god without the minor premise. And the refutation is the same today as it was in the Eighteenth Century: you can’t imagine something into existence. A recurrent dream of Pegasus, however deeply felt, is not the existence of Pegasus. Conversely, the Pegasus of the recurrent dream is not what people would mean were they to speak of the existence of Pegasus. The question isn’t whether one has a particular brain experience. People have all manner of experiences, imaginary and not, as well as everything in between — in fact, the vast bulk of human experience probably lies somewhere between the real and the imagined. The question is whether or not a given experience correlates to an existent external state of affairs.

Amidst the natural sciences the question of correlation between a purported experience and a state of affairs external to mind is not something determined in some crass way. “It really happened.” “No it didn’t.” “Yes it did!” There is simply no sense dwelling on a single instance. Scientists discount a sample size of one. If there is too much dispute over a particular instance, they simply drop it in favor of further inquiry. Fleeting and unitary experiences are dismissed in scientific practice in favor of what might be called the intersubjective (see, e.g., intersubjectivity or intersubjective verifiability), the societal nature of scientific knowledge, or a Wittgensteinian denial of a private language in favor of the essentially public nature of our scientific discourses.

For all of Nietzsche’s fretting that the death of god had unchained the Earth from the Sun, religion was every bit as arbitrary and subjective as its adherents today accuse irreligion of being. In the end, the whole of society swings over the abyss on a tether of fundamentally ungrounded beliefs. Science at least has the merit of basing its propositional criteria on egalitarian public discourse. Religion is based on all manner of purportedly private experience — revelations, miracles, conversations with the gods, passions, et cetera — all considered beyond criticism. Some people are chosen, enlightened or who knows what — the plans of the gods are inscrutable — and the rest of us, not so exalted, accept or reject religious belief on the authority of those possessed of such experiences. To those who prefer something more determinate, Jesus reiterates the Deuteronomic injunction, “Do not put the Lord your God to the test” (Matthew 4:7, Deuteronomy 6:16).

This is one of the major divisions between science and religion. Were science to start poking its nose into religious business, the religious person would object that the spiritual is a realm of deeply personal experience, not subject to the critical dissection of all comers. And yet in its public aspect, religion expects practitioners to take the word of people who have had religious experiences. No attempt is made to abstract an experience away from an individual experiencer. Religion believes every obscurantist story that any old quack tells, at least where not condemned by religious authority.

Deeply built into the practice of natural science, even if never properly conceptualized or explicitly taught, is a recognition of the fallibility, or at least the broad diversity in function, of the human mind. The well-observed fact of low brain performance, stretching from simple poor judgment, forgetfulness, error, misperception and dishonesty to careerism, optical illusions and dreams, all the way to delusion, mental disorder, group psychology and mass hysteria, has been incorporated into the background of scientific practice. In this regard a particular theory of mind is a part of the body of scientific practice. And, importantly, it’s not a complicated theory of mind — though one can pursue it to various levels of sophistication — but rather one built upon rather day-to-day observation of human foibles. I think the books of reference here are not any of the ones that Mr. Brooks lists, but David Linden’s The Accidental Mind: How Brain Evolution Has Given Us Love, Memory, Dreams, and God or Gary Marcus’s Kluge: The Haphazard Construction of the Human Mind.

One doesn’t have to search very far in one’s own life to find examples of how the brain, while a miracle of evolution, only works so well. At least a couple of times a week I experience a random, spasmodic jerk of some extremity. My cube neighbor at work, my brother, my high school physics teacher and a former priest all have facial tics, some rather elaborate, of which I am certain they are completely unaware and which, were they to become aware, they would not be able to control. So-called religious phenomena — feelings of destiny, hearing voices, talking to god, heightened emotional states, impulses, a sense of unity, feelings of disembodiment — are of a piece with this. I don’t deny that religious people have the experiences that they claim. Subjective experiences are experiences nonetheless. What I deny is that such experiences have any greater significance.

Or for that matter there is the even more commonplace matter of difference in perspective. In this sense science is a highly stylized political methodology for producing consensus amidst the rocky shoals of vast differences in human experience.

These commonplace observations are the cause for the emphasis on repeatability and independent verification in scientific practice. It’s not enough for one person to have had an experience, or even for a very large number of people to have shared that experience, for it to be established as a scientific fact. The standard for a scientific fact is that it must be something accessible to all; it must be determinately replicable. A scientific community employs a fairly common engineering method for combating error: given that humans are cheap and plentiful, compensate for the very low performance of each individual unit of scientific production by performing each task in redundancy. The inaccuracy of any given unit is cancelled out over the span of the entire system.

This is also the cause for the conservatism in science when it comes to abandonment of a long-standing theory. Nonscientists are fond of pointing out one or two contrary studies or a handful of unexplained mysteries and thinking a major theory overturned. The more efficient explanation is to discount early anomalies as human fallibility. The efficient practice when dealing with a theory propped up by thousands of observations, millions of person-hours of labor and the consilience of logically related theories, and at the same time a small set of recalcitrant data, is to wait and see. That’s not to say that anomalies are dismissed — to borrow from economics, to discount something is to calculate the present value of something that will potentially be of a different value in the future — they are merely tabled pending additional information. But should the accumulation of anomalies reach a critical mass, they will eventually be widely admitted into the corpus of accepted fact. It’s the other side of the redundancy equation.

“We’re in the middle of a scientific revolution. It’s going to have big cultural effects.” True, just not the ones Mr. Brooks is thinking of. I think that what we’re seeing is essentially Antony Flew’s “Theology and Falsification” playing out on a societal scale. Atheists keep raising unanswerable objections to religious belief — and not just in polemics, but ubiquitously in the zeitgeist — and religious people are staging a fighting retreat by continually lowering the bar and circumscribing ever more narrowly the propositional territory that they are defending. Neural Buddhism, spiritual but not religious — people may continue to profess all manner of confusion on the matter — but it all tracks toward an essentially irreligious society.