The Napoleon Dynamite Problem

After casing Ben Goertzel’s blog today, I find myself really chewing on this point (“The Increasing Value of Peculiar Intelligence,” The Multiverse According to Ben, 26 November 2008):

What occurs to me is that in a transparent society, there is massive economic value attached to peculiar intelligence. This is because if everyone can see everything else, the best way to gain advantage is to have something that nobody can understand even if they see it. And it’s quite possible that, even if they know that’s your explicit strategy, others can’t really do anything to thwart it.

Yes, a transparent society could decide to outlaw inscrutability. But this would have terrible consequences, because nearly all radical advances are initially inscrutable. Inscrutability is dangerous. But it’s also, almost by definition, the only path to radical growth.

I argued in a recent blog post [“The Inevitable Increase of Irrationality,” 25 November 2008] that part of the cause of the recent financial crisis is the development of financial instruments so complex that they are inscrutable to nearly everyone — so that even if banks play by the rules and operate transparently, they can still trick shareholders (and journalists) because these people can’t understand what they see!

But it seems that this recent issue with banks is just a preliminary glimmering of what’s to come.

Inscrutability, peculiarity, the idiosyncratic are already creeping in. Mr. Goertzel is right to point to the rise of the quants and mathematical finance as an example. The one that comes to mind for me is the Napoleon Dynamite problem.

NetFlix has announced a million dollar prize for anyone who can improve the precision of its recommendation engine by ten percent. The New York Times Magazine and NPR’s On the Media both did stories about it back in November (Thompson, Clive, “If You Liked This, You’re Sure to Love That,” 23 November 2008; Gladstone, Brooke, “Knowing Me, Knowing You,” 21 November 2008). It turns out that each increment of quality in this sort of singular value decomposition algorithm is geometrically harder to achieve than the last. For most movies it is easy to predict whether someone will like or dislike them, but a small number of odd movies thwart the algorithm. Chief among them is Napoleon Dynamite. For the research group profiled in The New York Times piece, Napoleon Dynamite was responsible for a whopping fifteen percent of all recommendation errors. There is no telling, on the basis of people’s past movie rating history, whether or not they’ll like this movie.
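If you want to see the shape of the problem, here is a minimal sketch in Python with NumPy. Everything in it is invented for illustration (the ratings, the movie names, the bare rank-1 factorization), and the actual Netflix Prize entries are vastly more elaborate, but it shows how a title that splits otherwise similar viewers soaks up most of the prediction error:

    import numpy as np

    # Toy user-by-movie rating matrix. The first three columns are "consensus"
    # movies; the last is a polarizing, Napoleon-Dynamite-like title that
    # splits otherwise similar viewers.
    movies = ["Consensus A", "Consensus B", "Consensus C", "Polarizing film"]
    ratings = np.array([
        [5, 4, 4, 1],
        [4, 5, 4, 5],
        [5, 4, 5, 1],
        [4, 5, 4, 5],
        [5, 5, 4, 2],
        [4, 4, 5, 4],
    ], dtype=float)

    # Keep only the single strongest latent "taste" factor (a rank-1 SVD).
    U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
    approx = s[0] * np.outer(U[:, 0], Vt[0, :])

    # Per-movie root-mean-square error of the low-rank prediction: the
    # polarizing column comes out far worse than the consensus columns.
    rmse = np.sqrt(((ratings - approx) ** 2).mean(axis=0))
    for name, err in zip(movies, rmse):
        print(f"{name:16s} RMSE {err:.2f}")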

But the Napoleon Dynamite problem isn’t a solitary anomaly; it’s the paradigm of a trend. What we have is a Hollywood focused on monster, expensive productions. Increasingly the movies that Hollywood makes are global products, with as much revenue coming from abroad as from the U.S. audience, so Hollywood is careful to strip its movies of any dialogue, humor or situations that are culturally nuanced and might not translate well. The plot and dialogue that we get in big Hollywood movies today consist only of the most broadly recognized and basic cultural tropes. Also, Hollywood has jacked the price of a ticket up to the point where viewers now almost universally make a theatre-versus-rental division: big special-effects movies that they want to see in the theatre, and the dramas, for which screen size isn’t a factor, that they rent. It is a division with a positive feedback loop, in that movie makers are aware of it and now shape their product offerings around it.

For a particularly depressing take on this, give a listen to Malcolm Gladwell’s 2006 New Yorker Festival talk on the use of machines to produce blockbuster scripts. At the same time that institutions like NetFlix are using computers to match customers to movies with increasing efficiency on the consumption side, Hollywood is using computers to make films increasingly easy to pigeonhole and match to demographics on the production side. It’s post-Fordist cultural production perfected. Soon we will be able to take the human out of the equation and the entertainment industry will just garnish our wages.

But there is — as is always the case — a countervailing motion. Just as Hollywood productions become increasingly trite and formulaic, there is the rise of these wildly bizarre and idiosyncratic films like The Zero Effect, Adaptation, Eternal Sunshine of the Spotless Mind, Lost in Translation, The Royal Tenenbaums, I ♥ Huckabees, Burn After Reading and so on. There is this sort of shadow Hollywood with its own set of stars and directors branding the alt-film genre: Wes Anderson, Charlie Kaufman, the Coen brothers, Catherine Keener, John Malkovich, William H. Macy, Frances McDormand. I would be remiss if I didn’t mention Steve Buscemi here.

What we have is a hollowing out of the middle. Along a spectrum, films range from obscurantia to formulaic. In the past, most movies probably fell in some broad middle: accessible, but unique. And most movie watchers probably fell there too. But increasingly movies and the movie-watching audience are being polarized into the genre constellations at one end and the difficult-to-categorize peculiarities at the other. Notice that the ambiguity of suspense has been replaced by the spectacle of gore in horror; that the sort of romantic comedy for which Drew Barrymore was designed and built has completely driven the older adult romantic drama to extinction. Similarly, the sort of accessible, quirky artiness represented by Woody Allen has moved much further down the spectrum of the idiosyncratic. The people who didn’t like Woody Allen are utterly baffled by Wes Anderson.

To generalize: hitherto we have been a normal distribution society. The majority of people fall into the broad middle and are closely related. But increasingly we are on the way toward a bimodal, inverted-bell-curve society, where the preponderance resides at the antipodes and people are separated by wide gulfs. This is true across the cultural spectrum, whether in politics, religion, the professions and so on. In the United States it is almost happening physically, with the coastal regions swelling as the center of the country is abandoned to satellite-guided tractors and migrant labor. Some might call this the condition of postmodernity; some might call it the dissolution of Western Civilization.
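To put a toy number on the metaphor, here is a purely illustrative sketch in Python. The populations and parameters are invented; the only point is that a bell-curve society and a hollowed-out, two-humped one can share the same average while leaving almost nobody in the middle:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # The "normal distribution society": everyone clustered around a common center.
    middle_society = rng.normal(loc=0.0, scale=1.0, size=n)

    # The hollowed-out society: two camps at the antipodes, same overall mean.
    camps = rng.choice([-3.0, 3.0], size=n)
    polarized_society = rng.normal(loc=camps, scale=1.0, size=n)

    for name, population in [("bell curve", middle_society),
                             ("polarized", polarized_society)]:
        middle_share = np.mean(np.abs(population) < 1.0)
        print(f"{name:10s} mean {population.mean():+.2f}  "
              f"share within one unit of center {middle_share:.0%}")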

Everything Malcolm Gladwell

Malcolm Gladwell’s new book, Outliers: The Story of Success, hit the stores today. As a loser, I’m not super-enthused about successful people. That said, I’ve become a big Malcolm Gladwell fan, so here are a few links.

He’s just the sort of person who should have a blog; and he has one. He maintains an archive of his published writings, as does The New Yorker where he is a staff writer. His previous two books are The Tipping Point (2000) and Blink (2005).

He’s presented on some of his ideas at the 2006 and 2007 New Yorker conferences. He’s been on Q&A with Brian Lamb. Apparently Charlie Rose is a big fan, because Mr. Gladwell has been on his show seven times. NPR has done a number of profiles, interviews and reviews, which can be found in their archive. And of course he’s given a TED Talk. It’s on the origin of extra-chunky pasta sauce and the proliferation of variety within product lines. More fundamentally it’s on the death of Platonism in the commercial food industry.

He’s been profiled in Fast Company (Sacks, Danielle, “The Accidental Guru,” January 2005), The New York Times (Donadio, Rachel, “The Gladwell Effect,” 5 February 2006) and now for his latest book New York Magazine (Zengerle, Jason, “Geek Pop Star,” 9 November 2008).

I used one of his articles as the basis for a little snark with “Malcolm Gladwell’s Infinite Monkey Theorem” (27 May 2008).

The funny thing you’ll notice if you follow a few of these links is that the hair isn’t the only big-thinker factor. Despite a considerable speaker’s fee, Mr. Gladwell doesn’t own many suits.

Malcolm Gladwell’s Infinite Monkey Theorem

The Infinite Monkey Theorem apparently still holds if you substitute mediocre humans for monkeys. Here is Malcolm Gladwell writing on how to brute force genius (“In the Air,” The New Yorker, 12 May 2008):

In the nineteen-sixties, the sociologist Robert K. Merton wrote a famous essay on scientific discovery in which he raised the question of what the existence of multiples tells us about genius. No one is a partner to more multiples [simultaneous scientific discovery], he pointed out, than a genius, and he came to the conclusion that our romantic notion of the genius must be wrong. A scientific genius is not a person who does what no one else can do; he or she is someone who does what it takes many others to do. The genius is not a unique source of insight; he is merely an efficient source of insight. “Consider the case of Kelvin, by way of illustration,” Merton writes, summarizing work he had done with his Columbia colleague Elinor Barber:

After examining some 400 of his 661 scientific communications and addresses . . . Dr. Elinor Barber and I find him testifying to at least 32 multiple discoveries in which he eventually found that his independent discoveries had also been made by others. These 32 multiples involved an aggregate of 30 other scientists, some, like Stokes, Green, Helmholtz, Cavendish, Clausius, Poincaré, Rayleigh, themselves men of undeniable genius, others, like Hankel, Pfaff, Homer Lane, Varley and Lamé, being men of talent, no doubt, but still not of the highest order. . . . For the hypothesis that each of these discoveries was destined to find expression, even if the genius of Kelvin had not obtained, there is the best of traditional proof: each was in fact made by others. Yet Kelvin’s stature as a genius remains undiminished. For it required a considerable number of others to duplicate these 32 discoveries which Kelvin himself made.

This is, surely, what an invention session is: it is Hankel, Pfaff, Homer Lane, Varley, and Lamé in a room together, and if you have them on your staff you can get a big chunk of Kelvin’s discoveries, without ever needing to have Kelvin — which is fortunate, because, although there are plenty of Homer Lanes, Varleys, and Pfaffs in the world, there are very few Kelvins.

Our tendency is to imagine Newton, Darwin or Einstein as the pinnacle of genius, but they are merely the peak performance of that draft design kludge we all carry around in our heads. One can easily imagine ranks of genius many levels beyond our showings to date, ranging all the way from the Star Trek character Data to the gods (I use these literary examples merely to demonstrate that we’re capable of imagining higher orders of genius). Each ranking of genius, all the way up to the gods, bears the same relation to the rank just below as Kelvin does to the Homer Lanes, Varleys, and Pfaffs: not one of qualitative difference, but merely one of efficiency. And that relation obtains not just between each level, but over the entire span from pinnacle to base as well. It’s the wet machine corollary of Turing completeness.

This is, of course, why the proof from design for the existence of god fails: one can imagine the universe being created in an instant by a supergenius, but it is equally plausible that it was created by a committee with some time on its hands. And the more time available, or the larger the committee, the less capable any of its members has to be to produce a given output.
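The committee arithmetic here is just the waiting-time calculation behind the monkey theorem. A rough Python sketch, with everything invented except the arithmetic, using the standard approximation that a committee of k members, each succeeding with small probability p per attempt, needs about 1/(kp) rounds:

    # Back-of-the-envelope monkey arithmetic: expected effort for a committee of
    # random typists to reproduce a short target string. Numbers are invented;
    # the point is only that committee size, time and per-member capability
    # trade off against one another.

    def expected_attempts(target: str, alphabet_size: int = 26) -> float:
        """Expected number of random strings of len(target) before one matches."""
        return float(alphabet_size ** len(target))

    def expected_rounds(target: str, committee_size: int,
                        alphabet_size: int = 26) -> float:
        """Approximate rounds needed when each member types one guess per round."""
        return expected_attempts(target, alphabet_size) / committee_size

    target = "eureka"
    for committee in (1, 10, 10_000):
        print(f"{committee:6d} members -> roughly "
              f"{expected_rounds(target, committee):.3g} rounds")

Doubling the committee, or the time, halves what any one member has to accomplish per round, which is the whole of the relation between Kelvin and his duplicators.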