The Zero Effect of Archival Research

Daryl Zero Paper Headache

Now that I’m spending time doing research for my thesis at the Library of Congress Manuscript Division and the National Archives, I’m really wishing that the fictional manual / memoir that serves as Daryl Zero’s voice-over in The Zero Effect were a real book that I could consult:

Now, a few words on looking for things. When you look for something specific your chances of finding it are very bad because of all things in the world, you only want one of them. When you look for anything at all your chances of finding it are very good because of all the things in the world you’re sure to find some of them.

Daryl Zero is for me a guru on par with Yoda, Keisuke Miyagi and Ogami Itto.

Knowledge is a Network

Bollen, Johan, et al., Clickstream Data Yields High-Resolution Maps of Science, Public Library of Science One

Knowledge is a network phenomenon. Only primitive knowledge consists of non-systematized catalogues of facts. System is the highest state of knowledge. Right now the system of our knowledge might be said to be clumpy, with well developed disciplines, but tenuous connections between them. Knowledge is still subject to cluster analysis. The apogee of knowledge will be a system of complete propositional consistency.
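The "clumpy" picture of knowledge — dense disciplines joined by tenuous bridges — is just the graph-theoretic notion of clustering. A minimal sketch of the idea (the fields and links below are invented for illustration, not taken from the Bollen et al. clickstream data): treat disciplines as nodes, strong citation traffic as edges, and find the clumps as connected components.

```python
from collections import deque

# A toy map of "science": nodes are fields, edges are strong citation links.
# These names and links are made up for illustration.
links = {
    "physics":   {"chemistry", "astronomy"},
    "chemistry": {"physics", "biology"},
    "astronomy": {"physics"},
    "biology":   {"chemistry"},
    "law":       {"economics"},
    "economics": {"law"},
}

def clusters(graph):
    """Return the connected components: the 'clumps' of the knowledge network."""
    seen, out = set(), []
    for start in graph:
        if start in seen:
            continue
        clump, queue = set(), deque([start])
        while queue:                      # breadth-first walk of one clump
            node = queue.popleft()
            if node in clump:
                continue
            clump.add(node)
            queue.extend(graph[node] - clump)
        seen |= clump
        out.append(clump)
    return out

print(sorted(len(c) for c in clusters(links)))  # → [2, 4]
```

On this toy map the natural sciences form one clump of four and law–economics a separate clump of two; the "apogee of knowledge" described above would be the limiting case of a single component.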

I present here a selection of discussions of the network nature of knowledge:

Kevin Kelly (“The Fifth and Sixth Discontinuity,” The Technium, 15 June 2009):

We casually talk about the “discovery of America” in 1492, or the “discovery of gorillas” in 1856, or the “discovery of vaccines” in 1796. Yet vaccines, gorillas and America were not unknown before their “discovery.” Native peoples had been living in the Americas for 10,000 years before Columbus arrived and they had explored the continent far better than any European ever could. Certain West African tribes were intimately familiar with the gorilla, and many more primate species yet to be “discovered.” Dairy farmers had long been aware of the protective power of vaccines that related diseases offered, although they did not have a name for it. The same argument can be made about whole libraries worth of knowledge — herbal wisdom, traditional practices, spiritual insights — that are “discovered” by the educated but only after having been long known by native and folk peoples. These supposed “discoveries” seem imperialistic and condescending, and often are.

Yet there is one legitimate way in which we can claim that Columbus discovered America, and the French-American explorer Paul du Chaillu discovered gorillas, and Edward Jenner discovered vaccines. They “discovered” previously locally known knowledge by adding it to the growing pool of structured global knowledge. Nowadays we would call that accumulating structured knowledge science. Until du Chaillu’s adventures in Gabon any knowledge about gorillas was extremely parochial; the local tribes’ vast natural knowledge about these primates was not integrated into all that science knew about all other animals. Information about “gorillas” remained outside of the structured known. In fact, until zoologists got their hands on Paul du Chaillu’s specimens, gorillas were scientifically considered to be a mythical creature similar to Big Foot, seen only by uneducated, gullible natives. Du Chaillu’s “discovery” was actually science’s discovery. The meager anatomical information contained in the killed animals was fitted into the vetted system of zoology. Once their existence was “known,” essential information about the gorilla’s behavior and natural history could be annexed. In the same way, local farmers’ knowledge about how cowpox could inoculate against small pox remained local knowledge and was not connected to the rest of what was known about medicine. The remedy therefore remained isolated. When Jenner “discovered” the effect, he took what was known locally, and linked its effect into medical theory and all the little science knew of infection and germs. He did not so much “discover” vaccines as much as he “linked in” vaccines. Likewise America. Columbus’s encounter put America on the map of the globe, linking it to the rest of the known world, integrating its own inherent body of knowledge into the slowly accumulating, unified body of verified knowledge. Columbus joined two large continents of knowledge into a growing global consilience.

The reason science absorbs local knowledge and not the other way around is because science is a machine we have invented to connect information. It is built to integrate new knowledge with the web of the old. If a new insight is presented with too many “facts” that don’t fit into what is already known, then the new knowledge is rejected until those facts can be explained. A new theory does not need to have every unexpected detail explained (and rarely does) but it must be woven to some satisfaction into the established order. Every strand of conjecture, assumption, observation is subject to scrutiny, testing, skepticism and verification. Piece by piece consilience is built.

Pierre Bayard (How to Talk About Books You Haven’t Read [New York: Bloomsbury, 2007]):

As cultivated people know (and, to their misfortune, uncultivated people do not), culture is above all a matter of orientation. Being cultivated is a matter not of having read any book in particular, but of being able to find your bearings within books as a system, which requires you to know that they form a system and to be able to locate each element in relation to the others. …

Most statements about a book are not about the book itself, despite appearances, but about the larger set of books on which our culture depends at that moment. It is that set, which I shall henceforth refer to as the collective library, that truly matters, since it is our mastery of this collective library that is at stake in all discussions about books. But this mastery is a command of relations, not of any book in isolation …

The idea of overall perspective has implications for more than just situating a book within the collective library; it is equally relevant to the task of situating each passage within a book. (pp. 10-11, 12, 14)

And I might add that it is not only passages within a book, but passages between books. Books, passages, paragraphs, et cetera are all stand-ins for or not-quite-there-yet stabs at the notion of memes. It is the relation of memes that is critical and books or passages therefrom are proxies or meme carriers. A book is a bundle of memes. And those memes bear a certain set of relations to all the other memes bundled in all the other books (or magazines, memos, blog posts, radio broadcasts, conversations, thoughts, or any of the other carriers of memes).

This brings us to the ur-theory of them all, W.V.O. Quine’s thumbnail-sketch epistemology from “Two Dogmas of Empiricism” (The Philosophical Review, vol. 60, 1951, pp. 20-43; reprinted in From a Logical Point of View [Cambridge, Mass.: Harvard University Press, 1953; second, revised, edition 1961]):

The totality of our so-called knowledge or beliefs, from the most casual matters of geography and history to the profoundest laws of atomic physics or even of pure mathematics and logic, is a man-made fabric which impinges on experience only along the edges. Or, to change the figure, total science is like a field of force whose boundary conditions are experience. A conflict with experience at the periphery occasions readjustments in the interior of the field. Truth values have to be redistributed over some of our statements. Reevaluation of some statements entails re-evaluation of others, because of their logical interconnections — the logical laws being in turn simply certain further statements of the system, certain further elements of the field. Having reevaluated one statement we must reevaluate some others, whether they be statements logically connected with the first or whether they be the statements of logical connections themselves. But the total field is so undetermined by its boundary conditions, experience, that there is much latitude of choice as to what statements to reevaluate in the light of any single contrary experience. No particular experiences are linked with any particular statements in the interior of the field, except indirectly through considerations of equilibrium affecting the field as a whole.

If this view is right, it is misleading to speak of the empirical content of an individual statement — especially if it be a statement at all remote from the experiential periphery of the field. Furthermore it becomes folly to seek a boundary between synthetic statements, which hold contingently on experience, and analytic statements which hold come what may. Any statement can be held true come what may, if we make drastic enough adjustments elsewhere in the system. Even a statement very close to the periphery can be held true in the face of recalcitrant experience by pleading hallucination or by amending certain statements of the kind called logical laws. Conversely, by the same token, no statement is immune to revision. Revision even of the logical law of the excluded middle has been proposed as a means of simplifying quantum mechanics; and what difference is there in principle between such a shift and the shift whereby Kepler superseded Ptolemy, or Einstein Newton, or Darwin Aristotle?

For vividness I have been speaking in terms of varying distances from a sensory periphery. Let me try now to clarify this notion without metaphor. Certain statements, though about physical objects and not sense experience, seem peculiarly germane to sense experience — and in a selective way: some statements to some experiences, others to others. Such statements, especially germane to particular experiences, I picture as near the periphery. But in this relation of “germaneness” I envisage nothing more than a loose association reflecting the relative likelihood, in practice, of our choosing one statement rather than another for revision in the event of recalcitrant experience. For example, we can imagine recalcitrant experiences to which we would surely be inclined to accommodate our system by reevaluating just the statement that there are brick houses on Elm Street, together with related statements on the same topic. We can imagine other recalcitrant experiences to which we would be inclined to accommodate our system by reevaluating just the statement that there are no centaurs, along with kindred statements. A recalcitrant experience can, I have already urged, be accommodated by any of various alternative reevaluations in various alternative quarters of the total system; but, in the cases which we are now imagining, our natural tendency to disturb the total system as little as possible would lead us to focus our revisions upon these specific statements concerning brick houses or centaurs. These statements are felt, therefore, to have a sharper empirical reference than highly theoretical statements of physics or logic or ontology. The latter statements may be thought of as relatively centrally located within the total network, meaning merely that little preferential connection with any particular sense data obtrudes itself.

As an empiricist I continue to think of the conceptual scheme of science as a tool, ultimately, for predicting future experience in the light of past experience. Physical objects are conceptually imported into the situation as convenient intermediaries — not by definition in terms of experience, but simply as irreducible posits comparable, epistemologically, to the gods of Homer. Let me interject that for my part I do, qua lay physicist, believe in physical objects and not in Homer’s gods; and I consider it a scientific error to believe otherwise. But in point of epistemological footing the physical objects and the gods differ only in degree and not in kind. Both sorts of entities enter our conception only as cultural posits. The myth of physical objects is epistemologically superior to most in that it has proved more efficacious than other myths as a device for working a manageable structure into the flux of experience.

Image from Bollen, Johan, Herbert Van de Sompel, Aric Hagberg, Luis Bettencourt, Ryan Chute, et al., “Clickstream Data Yields High-Resolution Maps of Science,” Public Library of Science One, vol. 4, no. 3, March 2009, e4803, doi:10.1371/journal.pone.0004803. See article for a larger version.

Information Phenotype

The fractal tree at the corner of 18th and Lamont, Washington, D.C., 29 April 2009

I have previously commented on my love of the movie π (“The Supernovae in Your Coffee Cup,” 2 November 2008). It left me with two enduring images: mixing coffee and cream as an example of turbulence, previously discussed; and the branching of tree limbs as an example of fractal symmetry. I love winter for its exposure of these fabulous phenomena, innocuously right over our heads. I am always a little sad at the arrival of spring and the enshrouding of all these thought-provoking fractals in greenery.

The picture above is of my favorite tree in the neighborhood where I live. The degree to which the pattern (a major arc over two-thirds of the growth length, followed by a sharp break and a lesser arc over the remainder) is repeated from trunk to twig is amazing. Notice the arc of the trunk: unlike many trees, which follow one rule for the trunk and a separate rule for the branches, this tree follows a single rule throughout.

We think of a fractal as a recursive algorithm, a mathematical formula. But there’s no math in that tree. The recipe for that fractal is coded somewhere in the tree’s DNA. But the DNA contains no fractal. The DNA is a sequence of nucleotides that is transcribed into messenger RNA, which is translated into amino acids that assemble into proteins that form the structures of cells. The cells then split and differentiate in response to a complex of internal chemical signals and environmental stimuli to grow in a pattern that is the fractal.
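The "recursive algorithm" reading of the tree can be made concrete. Here is a toy sketch of the branching rule described above — a major arc over the first two-thirds of each growth length, then a sharp break and a lesser arc over the remainder, repeated at every scale. The bend angle, break angle, and 60% scaling factor are invented for illustration; nothing here pretends to be the actual botany.

```python
import math

def grow(x, y, angle, length, depth, segments):
    """Recursively grow branches; the same rule applies at every scale."""
    if depth == 0 or length < 1.0:
        return
    bend = math.radians(20)   # hypothetical curvature over the main arc
    kink = math.radians(35)   # hypothetical sharp break angle
    # Major arc over the first two-thirds of the growth length...
    x1 = x + (2 / 3) * length * math.cos(angle + bend / 2)
    y1 = y + (2 / 3) * length * math.sin(angle + bend / 2)
    # ...then a break and a lesser arc over the remaining third.
    x2 = x1 + (1 / 3) * length * math.cos(angle + bend / 2 + kink)
    y2 = y1 + (1 / 3) * length * math.sin(angle + bend / 2 + kink)
    segments.append(((x, y), (x1, y1), (x2, y2)))
    # Two daughter branches, each obeying the same rule at 60% scale.
    for spread in (-kink, kink):
        grow(x2, y2, angle + spread, 0.6 * length, depth - 1, segments)

segments = []
grow(0.0, 0.0, math.pi / 2, 100.0, 6, segments)  # trunk points straight up
print(len(segments))  # → 63 branch segments, trunk to twig
```

Six levels of recursion yield 63 segments (1 + 2 + 4 + … + 32), each a scaled copy of the trunk's shape — which is exactly the trunk-to-twig self-similarity visible in the photograph.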

One might say that there is a fractal somewhere in that tree, but there are so many transformation rules between nucleotide sequence and fractal growth pattern that it is only in a manner of speaking. I am reminded of Wittgenstein’s discussion of what constitutes following a rule and going against it (Philosophical Investigations, trans. G. E. M. Anscombe [Malden, Mass.: Blackwell, 1953]):

198. “But how can a rule shew me what I have to do at this point? Whatever I do is, on some interpretation, in accord with the rule.” — That is not what we ought to say, but rather: any interpretation still hangs in the air along with what it interprets, and cannot give it any support. Interpretations by themselves do not determine meaning.

“Then can whatever I do be brought into accord with the rule?” — Let me ask this: what has the expression of a rule — say a sign-post — got to do with my actions? What sort of connection is there here? — Well perhaps this one: I have been trained to react to this sign in a particular way, and now I so react to it.

But that is only to give the causal connection; to tell how it has come about that we now go by the sign-post; not what this going-by-the-sign really consists in. On the contrary; I have further indicated that a person goes by a sign-post only in so far as there exists a regular use of sign posts, a custom.

199. Is what we call “obeying a rule” something that it would be possible for only one man to do, and to do only once in his life? — This is of course a note on the grammar of the expression “to obey a rule.”

It is not possible that there should have been only one occasion on which someone obeyed a rule. It is not possible that there should have been only one occasion on which a report was made, an order given or understood; and so on. — To obey a rule, to make a report, to give an order, to play a game of chess, are customs (uses, institutions).

To understand a sentence means to understand a language. To understand a language means to be master of a technique.

200. It is, of course, imaginable that two people belonging to a tribe unacquainted with games should sit at a chess-board and go through the moves of a game of chess; and even with all the appropriate mental accompaniments. And if we were to see it we should say they were playing chess. But now imagine a game of chess translated according to certain rules into a series of actions which we do not ordinarily associate with a game — say into yells and stamping of feet. And now suppose those two people to yell and stamp instead of playing the form of chess that we are used to; and this in such a way that their procedure is translatable by suitable rules into a game of chess. Should we still be inclined to say that they were playing a game? What right would one have to say so?

201. This is our paradox: no course of action could be determined by a rule, because every course of action can be made out to accord with the rule. The answer was: if everything can be made out to accord with the rule, then it can also be made out to conflict with it. And so there would be neither accord nor conflict here.

It can be seen that there is a misunderstanding here from the mere fact that in the course of our argument we gave one interpretation after another; as if each one contented us at least for a moment, until we thought of yet another standing behind it. What this shews is that there is a way of grasping a rule which is not an interpretation, but which is exhibited in what we call “obeying the rule” and “going against it” in actual cases.

Of course Wittgenstein is writing about social phenomena where custom and training are factors, but the undecidability of rules is the point here. Socially dogmatic, we are dismissive of blatant divergence from consensus. Less dogmatic — but not free of dogma — science resorts to the metaphysical-aesthetic notion of Ockham’s razor with which to cut through the myriad of rules that might potentially be made to accord with observed behavior. Is there really a fractal in the tree’s DNA? The fractal pattern of tree growth is but an interpretation of the tree’s DNA — an interpretation that would be different given a differing machinery of RNA transcription, amino acid assembly, protein expression, etc.

The Future of Economics: The Arational and the Irrational

Back in 2000 The Economist ran an article titled “The Future of Economics” (4 March 2000, p. 80). It was largely a gloss on a symposium on the same from the Journal of Economic Perspectives (vol. 14, no. 1, Winter 2000). The authors acknowledged that economics was a faltering field. Setting aside the proposition that economics may simply have run its course and be into its dotage of diminishing returns, the article considers two possibilities for a way forward:

David Colander of Middlebury College, in an article that looks back on the present from an imagined 2050, blames the current discontent on the orthodox general-equilibrium model that underlies most of today’s economic theory. He favors a shift from the current approach, which has been called “loose-fitting positivism” (propose a model consistent with standard assumptions, then test it), to one based on “loose-fitting pragmatism” (forget about canonical principles, just search for patterns in the data).

Such an approach, he says, would be consistent with “the rise of complexity science within the scientific community generally.” Researchers sitting at their computers, subjecting data to a withering barrage of statistical analysis, would still hope to come up with laws of a sort, or regularities at any rate. But these “laws” would be regarded as provisional and ever-shifting: indeed, the claim is that changeless underlying patterns do not exist. Complex systems expand and evolve; even at the most fundamental level, these patterns are temporary. But whether this approach could still be called “economics” is debatable.

The second approach is much easier to reconcile with traditional methods. Its most celebrated exponent is Richard Thaler of the University of Chicago, who has also written a paper for the symposium. Mr. Thaler agrees that the canonical principles of orthodox theory have led economics astray, but he believes these mistakes can be put right. He seeks, in other words, a tighter-fitting positivism. You improve the fit above all, he would argue, by putting a more realistic account of human cognition at the center of the theory.

Orthodox theory famously assumes that people are rational. In reality, they are not. On the other hand, they are not crazy, or crassly incompetent — in other words, their behavior is not random. If economics could try harder to recognize that people try to be rational, but in certain, often predictable, ways fail to be, the positivist approach would have a better foundation. In essence, what Mr. Thaler calls for is a marriage, or at least much closer cohabitation, between economics and psychology.

I have thought of this article frequently since reading it back in 2000 when it was first published. Given the spate of books along these lines, especially the second, I’d have to say that this was one of the more perspicacious articles that I’ve ever read.

The first approach is an example of Petabyte Age type thinking, eight years before Wired put it on the cover. But of course it is an idea that had to incubate in the rarefied world of advanced theoreticians for years before any eruption into the popular consciousness. The main offering in this area would be Steven Levitt and Stephen Dubner’s Freakonomics (2005), though their book is not so much a fully atheoretic inquiry as a putting of large questions to the test of large data sets. More to the topic would be Ian Ayres’s Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart (2007), though the fact that Mr. Ayres used the very methods he describes in his book to arrive upon a title casts a great deal of doubt on the soundness of said methods.

As for the second approach, building a better model of the economic corpuscles, it seems to have advanced far enough that it is now also ready to be packaged up for mass consumption. And of course the psychologists have much more lucrative options in publishing than the mathematicians and computer scientists.

Judging by some of the key phrases in the Economist article (the predictably irrational stuff) I was pretty sure that they had in mind Dan Ariely’s thinking, published as Predictably Irrational (2008), but it turns out that Richard Thaler is, along with Cass Sunstein, the author of Nudge (2008). Rounding out the most omnipresent trio is Tim Harford’s The Logic of Life: The Rational Economics of an Irrational World (2008). Also on the list of offerings along this line would be Ori and Rom Brafman’s Sway: The Irresistible Pull of Irrational Behavior (2008) and Michael Shermer’s The Mind of the Market: Compassionate Apes, Competitive Humans, and Other Tales from Evolutionary Economics (2007).

So that’s the future of economic study: either a discounting of human rationality in favor of the system effect of irrationality, or allowing rationality to drop out in favor of the system effect of the economic thing-in-itself.

Patterned Lawlessness

Back in July Will Wilkinson made a point that I thought was interesting at the time, but that has stuck in my grey matter and is gradually working its way toward becoming a fundamental component of my worldview (“Note About Rational Scofflaws,” The Fly Bottle, 11 July 2008):

I wonder how many drivers exceed the speed limit basically whenever they judge that it won’t cause anybody any problems. I’d guess, approximately, all of them. Also, there are very clear laws about, say, using turn signals, or using turn signals when parallel parking (do you do this?), or not taking a right hand turn on red lights when it is marked, not double parking, even if you’re just going to be one minute while you fetch your latte. And so on. When’s the last time you jaywalked? Lunch? People are more or less rational and tend to respond to incentives, and therefore the roads are a zone of patterned lawlessness. We all know what infractions the cops care about — how much over the speed limit is too much over, etc. — and we tend to respond accordingly. We even tend to internalize and moralize the rules whose expected cost of violation is relatively high. It’s more efficient that way. And thus our huffing indignation is easily riled by those who face different incentives and so flout different rules than the ones we flout without reflection.

This morning on my ride to work I coasted through a stop sign in front of a police cruiser that was approaching from the road to my right. I gave a little embarrassed smile and a little wave. She made a little disapproving face and waved back. It’s anarchy I tell you. Anarchy! I got to work in four minutes.

I have always thought of anarchism as a prescriptivist political program. It had never occurred to me to consider anarchism as a positivist description of what’s actually going on behind normal law-conforming behavior.

People have an imagination of the law as somehow an ultimately hard thing. We hear expressions like “the iron law of …” or we use the same word, “law,” in physics as we do in our social imaginings. By linking the law with morality and construing morality as partaking of the metaphysical, the associations flow back the other direction as well.

And reference to the law would serve as a good explanation in most instances. Why does everyone so assiduously follow the lines painted on the roads, or when they drive over them, do so in such a regular fashion? And thus we might explain the vast middle hump of the bell curve of driving behavior. But then someone swerves over the line into oncoming traffic. To account for all driving behavior — the outliers as well as the vast middle of the curve — another theory with more breadth is required.

I also like the way that this theory strips morality of its metaphysical pretensions: it paints the metaphysics as mere rhetorical device, or sees the inclination to render our ordering prescripts as fundamental as merely a pragmatic shorthand, or as the ideological reification of particularly strong emotions. Really we just react in a pragmatic way to the incentives that we find around us. It should be noted that some of those incentives are natural and some institutional. This is perhaps part of the basis for the distinction, à la Elliot Turiel, between prohibitions of morality and prohibitions of social convention.

Patterned lawlessness is also a description of affairs that comports with the existential account of law-conforming behavior. So entrenched is our notion of the law as somehow inviolable, or so cowed is our thinking by the high wall of consequence erected by the law, that we are prone to see the dictates of the law as things about which there simply is no option but to do as we are told. Existentialism was born in part as a reaction to the horrors of amorality and unreason to which people were pushed at the behest of state bureaucracies in the Twentieth Century, namely the Somme and the Holocaust. Existentialism contains the admonition that at every moment we stand free to do otherwise, even where the law is concerned.

American Pseudo-Religion; Science and Experience

The title of David Brooks’s op-ed Tuesday, “The Neural Buddhists” (The New York Times, 13 May 2008), sounded cyberpunk and that was enough to entice me to read it. Turns out it’s some comments on the trend in neurological and genetic research toward characterizing the religious tendency and the religious experience. A lot of the editorial is wishful thinking on the part of a religious conservative, but then there’s the musings from which the piece draws its title:

This new wave of research will not seep into the public realm in the form of militant atheism. Instead it will lead to what you might call neural Buddhism.

In their arguments with Christopher Hitchens and Richard Dawkins, the faithful have been defending the existence of God. That was the easy debate. The real challenge is going to come from people who feel the existence of the sacred, but who think that particular religions are just cultural artifacts built on top of universal human traits. It’s going to come from scientists whose beliefs overlap a bit with Buddhism.

I often point out that the fastest growing religion in the U.S. today is not Mormonism or any branch of Christianity, but the poorly conceptualized “spiritual but not religious” (“Teens: Spiritual, But Not Religious,” smarties, 11 January 2005). This isn’t some entirely post-1960s baby-boom or gen-X phenomenon. It is the latest manifestation of a long line of uniquely American religion stretching from the Enlightenment deism of the founding generation to the transcendentalism of the late Nineteenth Century to the Progressive era psycho-spirituality of William James. It pulls together an idiosyncratic combination of Christianity, grand historical conspiracy theories à la the Freemasons, various strains of mysticism, yeoman pragmatism, naturalism, popular science, amateur philosophical speculation, do-gooderism, health fads, self-help, popular psychology and positive thinking. It’s all of a piece with American messianism, paranoia, individualism, pragmatism and the melting pot. It’s a little incipient and a little too convenient for the American way of life, having dispensed with the hard truths and the dark side of religion as well as any of the really imposing moral injunctions, but there it is. And Mr. Brooks is right to point out that the best fit for this among the ancient religions is Buddhism.

As for the rest of the article, it’s just the ontological argument for the existence of god without the minor premise. And the refutation is the same today as it was in the Eighteenth Century: you can’t imagine something into existence. A recurrent dream of Pegasus, however deeply felt, is not the existence of Pegasus. Conversely, the Pegasus of the recurrent dream is not what people would mean were they to speak of the existence of Pegasus. The question isn’t whether one has a particular brain experience. People have all manner of experiences, imaginary and not, as well as everything in between — in fact, the vast bulk of human experience probably lies somewhere between the real and the imagined. The question is whether or not a given experience correlates to an existent external state of affairs.

Amidst the natural sciences the question of correlation between a purported experience and a state of affairs external to mind is not something determined in some crass way. “It really happened.” “No it didn’t.” “Yes it did!” There is simply no sense dwelling on a single instance. Scientists discount a sample size of one. If there is too much dispute over a particular instance, simply drop it in favor of further inquiry. Fleeting and unitary experiences are dismissed in scientific practice in favor of what might be called the intersubjective (see e.g. intersubjectivity or intersubjective verifiability), the societal nature of scientific knowledge or a Wittgensteinian denial of a private language in favor of the essentially public nature of our scientific discourses.

For all of Nietzsche’s fretting that the death of god had unchained the Earth from the Sun, religion was every bit as arbitrary and subjective as its adherents today accuse irreligion of being. In the end, the whole of society swings over the abyss on a tether of fundamentally ungrounded beliefs. Science at least has the merit of basing its propositional criteria on egalitarian public discourse. Religion is based on all manner of purportedly private experience — revelations, miracles, conversations with the gods, passions, et cetera — all considered beyond criticism. Some people are chosen, enlightened or who knows what — the plans of the gods are inscrutable — and the rest of us, not so exalted, accept or reject religious belief on the authority of those possessed of such experiences. To those who prefer something more determinate, Jesus reiterates the Deuteronomic injunction, “Do not put the Lord your God to the test” (Matthew 4:7, Deuteronomy 6:16).

This is one of the major divisions between science and religion. Were science to start poking its nose into religious business, the religious person would object that the spiritual is a realm of deeply personal experience, not subject to the critical dissection of all comers. And yet in its public aspect, religious practitioners are expected to take the word of people having had religious experiences. No attempt is made to abstract an experience away from an individual experiencer. Religion believes every obscurantist story that any old quack tells, at least where not condemned by religious authority.

A recognition deeply built into the practice of natural science, even if never properly conceptualized or explicitly taught, is the recognition of the fallibility, or at least the broad diversity in function, of the human mind. The well observed fact of low brain performance, stretching from simple poor judgment, forgetfulness, error, misperception and dishonesty to careerism, optical illusions and dreams, all the way to delusion, mental disorder, group psychology and mass hysteria has been incorporated into the background of scientific practice. In this regard a particular theory of mind is a part of the body of scientific practice. And, importantly, it’s not a complicated theory of mind — though one can pursue it to various levels of sophistication — but rather one built upon rather day-to-day observation of human foibles. I think the books of reference here are not any of the ones that Mr. Brooks lists, but David Linden’s The Accidental Mind: How Brain Evolution Has Given Us Love, Memory, Dreams, and God or Gary Marcus’s Kluge: The Haphazard Construction of the Human Mind.

One doesn’t have to search very far in one’s own life to find examples of how the brain, while a miracle of evolution, only works so well. At least a couple of times a week I experience a random, spasmodic jerk of some extremity. My cube neighbor at work, my brother, my high school physics teacher and a former priest all have facial tics, some rather elaborate, of which I am certain they are completely unaware and which, were they to become aware, they would not be able to control. So-called religious phenomena — feelings of destiny, hearing voices, talking to god, heightened emotional states, impulses, a sense of unity, feelings of disembodiment — are of a piece with this. I don’t deny that religious people have the experiences that they claim. Subjective experiences are experiences nonetheless. What I deny is that such experiences have any greater significance.

Or for that matter there is the even more commonplace matter of difference in perspective. In this sense science is a highly stylized political methodology for producing consensus amidst the rocky shoals of vast differences in human experience.

These commonplace observations are the cause for the emphasis on repeatability and independent verification in scientific practice. It’s not enough for one person to have had an experience, or even for a very large number of people to have shared that experience for it to be established as a scientific fact. The standard for a scientific fact is that it must be something accessible to all; it must be something determinately replicable. A scientific community employs a fairly common engineering method for combating error: given that humans are cheap and plentiful, accommodate for the very low performance of each individual unit of scientific production by performing each task in redundancy. The inaccuracy of any given unit is cancelled out over the span of the entire system.
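The redundancy method described above can be sketched numerically: treat each observer as a noisy instrument and compare a lone reading with the consensus of many independent replications. This is a toy simulation with invented numbers, not a model of any actual experiment:

```python
import random

def measure(rng, true_value=10.0, noise=1.0):
    """One fallible observation: the truth plus an individual error term."""
    return true_value + rng.gauss(0.0, noise)

def consensus(rng, n, true_value=10.0, noise=1.0):
    """Replicate the observation n times and average; individual errors cancel."""
    return sum(measure(rng, true_value, noise) for _ in range(n)) / n

# Typical error of a lone observer vs. a redundant community of 1,000
# observers, averaged over 200 independent trials.
TRIALS = 200
lone = sum(abs(measure(random.Random(s)) - 10.0) for s in range(TRIALS)) / TRIALS
crowd = sum(abs(consensus(random.Random(s), 1000) - 10.0) for s in range(TRIALS)) / TRIALS
```

However unreliable the individual unit, the averaged error of the redundant system shrinks roughly with the square root of the number of replications.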

This is also the cause for the conservatism in science when it comes to abandonment of a long-standing theory. Nonscientists are fond of pointing out one or two contrary studies or a handful of unexplained mysteries and thinking a major theory overturned. The more efficient explanation is to discount early anomalies as human fallibility. The efficient practice when dealing with a theory propped up by thousands of observations, millions of person-hours of labor and the consilience of logically related theories, and at the same time a small set of recalcitrant data, is to wait and see. That’s not to say that anomalies are dismissed — in the economic sense, to discount something is to calculate the present-day value of something that will potentially be of a different value in the future — they are merely tabled pending additional information. But should the accumulation of anomalies reach a critical mass, they will eventually be widely admitted into the corpus of accepted fact. It’s the other side of the redundancy equation.
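Discounting in the economic sense invoked by that aside has a standard formula; a minimal illustration, with figures invented for the example:

```python
def present_value(future_value, rate, years):
    """Discount: what a sum receivable only in the future is worth today."""
    return future_value / (1 + rate) ** years

# $100 ten years out, at a 5% annual rate, is worth about $61.39 today.
pv = present_value(100.0, 0.05, 10)
```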

“We’re in the middle of a scientific revolution. It’s going to have big cultural effects.” True, just not the ones Mr. Brooks is thinking of. I think that what we’re seeing is essentially Antony Flew’s “Theology and Falsification” playing out on a societal scale. Atheists keep on raising unanswerable objections to religious belief — and not just in polemics, but ubiquitously in the zeitgeist — and religious people are staging a fighting retreat by continually lowering the bar and circumscribing ever more narrowly the propositional territory that they are defending. Neural Buddhism, spiritual but not religious — people may continue to profess all manner of confusion on the matter — but it’s all on a track toward an essentially irreligious society.

Formal Cognition

A few weeks ago I went to the most recent installment of the D.C. Future Salon. The presenter was the organizer, Ben Goertzel of Novamente, and the subject was “Artificial Intelligence and the Semantic Web.” One of the dilemmas that Mr. Goertzel teased out with the notion of a semantic web is that the complexity is conserved: either it has to be in the software agent or it has to be in the content. If it is in the agent, then it can be highly centralized — a few geniuses develop some incredibly sophisticated agents. If it is in the content, then the distributed community of content providers all have to adequately mark up their content in a way that simpler agents can process. Mr. Goertzel is hedging his bets: he is interested both in developing more sophisticated agents and in providing a systematic incentive to users to mark up their content.

In the course of discussing how to encourage users to mark up their own content, Mr. Goertzel listed as one of the problems that most users lack the expertise to do so. “What portion of people are adept at formal cognition? One tenth of one percent? Even that?” I had probably heard the phrase, or something like it, before, but for whatever reason, this time it leapt out. I had a collection of haphazard thoughts for which this seemed a fitting rubric and I was excited that this may have been a wheel for which no reinvention was required. When I got home I googled “formal cognition,” figuring there would be a nice write-up of the concept on Wikipedia, but nothing. I fretted: formal cognition could simply mean machine cognition (computers are formal systems). Maybe it was a Goertzel colloquialism and the only idea behind it was a hazy notion that came together in that moment in Mr. Goertzel’s mind.

Anyway, I like the notion and, absent an already systematized body of thought called formal cognition, here is what I wish it were.

While the majority might continue to think with their gut and their gonads, a certain, narrow, technocratic elite is in the process of assembling the complete catalog of formally cognitive concepts. There is the set that consists of all valid, irreducibly simple algorithms operative in the world along with their application rules covering range, structure of behavior, variants, proximate algorithms, compatibilities, transition rules, exceptions, et cetera. I am going to show my reductionist cards and say that given a complete set of such algorithms, all phenomena in the universe can be subsumed under one of these rules, or under a number of them acting in conjunction. In addition to there being a complete physics of the world, there is, underlying that, a complete logic of the world.

This reminds me of Kant’s maxim of practical reason that will is a kind of causality and that the free will is the will that is not determined by alien causes, that is, the will that acts according to its own principle, which is reason (see e.g. Groundwork of the Metaphysic of Morals, Part III). It seems to me that a project of delineating the principles of formal cognition is a liberating act insofar as we are casting out the innumerable unconscious inclinations of that dark netherworld of the intuition (gut and gonad), instilled as they were by millennia of survival pressures — the requirements for precision of which were considerably different from those of a modern technological people — in favor of consciously scrutinized and validated principles of thought.

By way of outstanding example, one might be prone to say that evolution is such a logic. At this point evolution has jumped the bank of its originating field of thought, the life sciences, and begun to infect fields far beyond. It is increasingly recognized today that evolution through natural selection is a logical system, one of the fundamental algorithms of the world, of which the common conception of it as a process of the life sciences is merely one instantiation. Perhaps it was only discovered there first because it is in life phenomena that its operation is most aggressive and obvious. But it is now recognized that any place where replication, deviation and competition are found, an evolutionary dynamic will arise. Some cosmologists even propose a fundamental cosmological role for it, as some sort of multiverse evolution would mitigate the anthropic problem (that the universe is strangely tuned to the emergence of intelligent life).

However, note that evolution is a second-order logic that arises in the presence of replication, deviation and competition. It would seem that evolution admits of further decomposition and that it is replication, deviation and competition that are the fundamental algorithms for our catalog. But even these may be slight variations on still more fundamental algorithms. It strikes me that replication might just be a variant of cycle, related perhaps through something like class inheritance or, more mundanely, through composition (I feel unqualified to comment on identity or subspecies among algorithms because it is probably something that should be determined by the mathematical properties of the algorithms).
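The claim that replication, deviation and competition suffice to produce an evolutionary dynamic in any medium can be demonstrated in a few lines. The following sketch, in the spirit of Dawkins’s well-known “weasel” program (the target string, alphabet and parameters are invented for illustration), evolves random strings toward a target using only those three operations:

```python
import random

TARGET = "FORMAL COGNITION"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(s):
    """Competition's yardstick: how many characters match the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def deviate(s, rng, rate=0.05):
    """Deviation: replication with occasional copying errors."""
    return "".join(rng.choice(ALPHABET) if rng.random() < rate else c
                   for c in s)

def evolve(rng, pop_size=100, generations=300):
    pop = ["".join(rng.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # competition
        survivors = pop[: pop_size // 2]             # the fitter half persists
        pop = survivors + [deviate(rng.choice(survivors), rng)  # replication
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

best = evolve(random.Random(0))
```

Nothing in the loop knows anything about strings as such; swap in any representation that can be copied, varied and ranked and the same dynamic arises, which is the point about evolution being medium-independent.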

System. But have I been wrong to stipulate irreducible simplicity as one of the criteria for inclusion in our catalog? The algorithms in which we are interested are more complex than cycle. They are things like induction, slippery-slope, combinatorial optimization or multiplayer games with incomplete information. We have fundamental algorithms and second-order or composite algorithms and a network of relations between them. Our catalogue of algorithms is structured.

The thing that I think of most here is Stephen Wolfram’s A New Kind of Science (complete text online | Wikipedia) in which he describes a systematic catalog of enumerated algorithms; that is, there is an algorithm that could generate the entire catalog of algorithms, one after the other. These algorithms each generate certain complex patterns and, as Mr. Wolfram suggests, the algorithms stand behind the phenomena of the material world.
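Wolfram’s catalog is concrete enough to sketch: his 256 elementary cellular automata are literally the integers 0 through 255, each number doubling as its rule’s lookup table. A minimal implementation (the width, step count and rendering below are arbitrary choices of mine):

```python
def step(rule, cells):
    """One tick of one of Wolfram's 256 elementary cellular automata.

    The rule number's binary digits ARE the lookup table: bit k holds the
    next state for the neighborhood whose three cells spell k in binary."""
    n = len(cells)
    return [(rule >> (4 * cells[i - 1] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

def run(rule, width=63, steps=30):
    """Grow a pattern from a single live cell; return every row."""
    row = [0] * width
    row[width // 2] = 1
    rows = [row]
    for _ in range(steps):
        row = step(rule, row)
        rows.append(row)
    return rows

# The "catalog" is just the integers 0..255, each one a complete program.
# Rule 30 is Wolfram's showpiece of a trivial rule yielding complexity.
picture = "\n".join("".join("█" if c else " " for c in row) for row in run(30))
```

Printing `picture` shows rule 30’s famously chaotic triangle; substituting, say, 250 or 90 gives orderly or fractal patterns from the same machinery, which is what makes the catalog-enumeration picture vivid.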

An interesting aside lifted from the Wikipedia page: in his model science becomes a matching problem: rather than reverse engineering our theories from observation, once a phenomenon has been adequately characterized, we simply search the catalogue for the rules corresponding to the phenomenon at hand.

It seems to me that this catalog might be organized according to evolutionary principles. By way of example, I often find myself looking at some particularly swampy looking plant — this is Washington, D.C. — with an obviously symmetrical growth pattern — say, radial symmetry followed by bilateral symmetry, namely a star pattern of stems with rows of leaves down each side. Think of a fern. Then I see a more modern plant such as a deciduous tree, whose branch growth pattern seems to follow more of a scale symmetry pattern. The fern-like plants look primitive, whereas the deciduous branch patterns look more complex. And one followed the other on the evolutionary trajectory. The fern pattern was one of the first plant structures to emerge following unstructured algae and very simple filament-structured mosses. The branching patterns of deciduous trees didn’t come along until much later. There are even early trees, like the palm tree, that are simply a fern thrust up into the air. The reason that fern-like plants predate deciduous trees has to do with the arrangement of logical space. A heuristic traversing logical space encounters the algorithm giving rise to the radial symmetry pattern before it does that of scale symmetry. The heuristic would work the same whether it was encoded in DNA or in binary or in any other instantiation you happen to think of.
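The idea that a simple algorithm stands behind each growth pattern is not merely fanciful; Lindenmayer systems (L-systems) are a standard formalism for exactly this, generating fern fronds and branching crowns alike from a few rewrite rules. A toy sketch, with rules invented for illustration rather than taken from botany:

```python
def l_system(axiom, rules, generations):
    """A Lindenmayer system: rewrite every symbol by its rule, repeatedly."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Hypothetical, illustrative rules: the fern-like rule appends leaf pairs
# in rows along a single stem, while the tree-like rule forks every
# branch tip, making the crown self-similar at every scale.
FERN = {"S": "S[L][L]"}    # one stem, rows of leaves: growth is linear
TREE = {"B": "B[+B][-B]"}  # every branch begets branches: exponential

frond = l_system("S", FERN, 4)
crown = l_system("B", TREE, 4)
```

After four generations the fern rule has added leaves linearly (eight of them along one stem) while the branching rule has multiplied tips exponentially (eighty-one): two quite different symmetries from two one-line rules, with the simpler rule naturally lying "earlier" in the space of programs.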

A fantastical symmetry. I’m going to allow myself a completely fantastical aside here — but what are blogs for?

It is slightly problematic to organize the catalogue on evolutionary principles insofar as the algorithms are logical principles and sprang into existence along with space and time. Or perhaps they are somehow more fundamental than the universe itself (see e.g. Leibniz) — it is best to avoid the question of whence logic lest one wander off into all sorts of baseless metaphysical speculation. Whatever the case, biological evolution comes onto the scene relatively late in cosmological time. It would seem that the organizing principle of the catalogue would have to be more fundamental than some latter-day epiphenomenon of organic chemistry.

Perhaps the entire network of logic sprang into existence within the realm of possibility all at once, though the emergence of existent phenomena instantiating each rule may have traversed a specific, stepwise path through the catalogue only later. But there isn’t a straightforward, linear trajectory from the simple all the way up to the pinnacle of complexity, but rather an iterative process whereby one medium of evolution advances the program of the instantiation of the possible as far as that particular medium is capable before its potential is exhausted. But just as the limits of its possibilities are reached, it gives way to a new medium that instantiates a new evolutionary cycle. The new evolutionary cycle doesn’t pick up where the previous medium left off, but starts all the way from zero. Like in Ptolemy’s astronomy, there are epicycles and retrograde motion. But the new medium has greater potential than its progenitor and so will advance further before it too eventually runs up against the limits of its potential. So cosmological evolution was only able to produce phenomena as complex as, say, fluid dynamics. But this gave rise to star systems and planets. The geology of the rocky planets has manifested a larger number of patterns, but most importantly life and the most aggressive manifestation of the catalog of algorithms to date, biological evolution. As has been observed, the most complexly structured three pounds of matter in the known universe is the human brain that everyone carries around in their head.

If life-based evolution has proceeded so rapidly and demonstrated so much potential, it is owing to the suppleness of biology. However, the limits of human potential are already within sight and a new, far more dexterous being, even more hungry to bend matter to logic than biological life ever was, has emerged on the scene: namely the Turing machine, or the computer. This monster of reason is far faster, more fluid and polymorphous, adaptable, durable and precise than us carbon creatures. In a comparable compression of time from cosmos to geology and geology to life, the computer will “climb mount improbable,” outstrip its progenitor and explore further bounds of the catalog of logic. One can even imagine a further iteration of this cycle whereby whatever beings of information we bequeath to the process of reason becoming real repeat the cycle: they too reach their limits but give rise to some even more advanced thing capable of instantiating as yet unimagined corners of the catalogue of potential logics.

But there is a symmetry between each instantiation of evolution whereby the system of algorithms was traversed in the same order and along the same pathways. Perhaps more than the algorithms themselves are universal; perhaps the network whereby they are related is as well. That is to say that perhaps there is an inherent organizing structure within the algorithms themselves, a natural ordering running from simple to complex. Evolution is not the principle by which the catalog is organized, but merely a heuristic algorithm that traverses this network according to that organizing principle. Evolution doesn’t organize the catalog, but its operation illuminates the organization of the catalog. Perhaps that is what makes evolution seem so fundamental: that whatever its particular instantiation, it is like running water that flows across a territory defined by the catalogue. Again and again in each new instantiation evolution re-traverses the catalogue. First it did so in energy and matter, then in DNA, then in steel and silicon, now in information.

Anti-System. This is fantastical because, among other reasons, it is well observed that people who are captivated by ideas are all Platonists at heart. I have assiduously been avoiding referring to the algorithms of a system of formal cognition as forms. It all begs the question of whence logic — which, again, is a terrible question.

Of course the notion of formal cognition doesn’t need to be as systematic as what I have laid out so far. Merely a large, unsystematized collection of logically valid methods along with the relevant observations about the limitations, application rules and behaviors of each one would go a significant way toward more reliable reasoning. Perhaps such a thing doesn’t exist at all — I tend towards a certain nominalism, anti-foundationalism and relativism. But the notion of a complete logical space, or a systematic catalog is perhaps like one of Kant’s transcendental illusions — a complete science or moral perfection — the telos, actually attainable or only fantasized, that lures on a certain human endeavor.

Politics. All of this having been said, I remain of the belief that politics is the queen of the sciences. Formal cognition wouldn’t be automated decision making and it could only ever enter into political dialog as decision support or as rhetoric.

As Kant wrote, “Thoughts without content are empty; intuitions without concepts are blind” (Critique of Pure Reason, A51 / B75). Kant developed an idiosyncratic terminology, and perhaps another way of phrasing this, more suited to my purpose here, would be to say that formal reason absent empirical data is empty, but that empirical data unsystematized by conceptual apparatus is an incoherent mess. A complete system of the world cannot be worked out a priori, and a mere catalogue of all observations about the world would be worse than useless.

Formally cognitive methods must be brought to bear. And against a complex and messy world I do not think that their application will be unproblematic. In passing above, I mentioned the notion of application rules. Each algorithm has attendant rules regarding when it comes into force, for what range of phenomena it is applicable, when it segues to another applicable algorithm, et cetera. Take for instance the notion of the slippery-slope or the snowball. Not all slippery-slopes run all the way to the bottom. Most are punctuated by points of stability along the way, each with its own internal logic as to when some threshold is overcome and the logic of the slippery-slope resumes once more. Or perhaps some slippery-slope may be imagined to run all the way to the bottom — it’s not ruled out by the logic of the situation — but for some empirical reason in fact does not. Once the principles of formal cognition come up against the formidable empirical world, much disputation will ensue.

Then there is the question of different valuation. Two parties entering into a negotiation subscribe to two (or possibly many, many more) different systems of valuation. Even when all parties are in agreement about methods and facts, they place different weights on the various outcomes and bargaining positions on the table. One can imagine formally cognitive methods having a pedagogic effect and causing a convergence of values over time — insofar as values are a peculiar type of conclusion that we draw from experience or social positionality — but the problem of different valuation will not quickly evaporate. One might say that the possibly fundamental algorithm of trade-off operating over different systems of valuation goes a long way toward a definition of politics.

Finally, one could hope that an increased use and awareness of formally cognitive methods might have a normative effect on society, bringing an increased proportion of the citizenry into the fold. But I imagine that a majority of people will always remain fickle and quixotic. Right reasoning can always simply be ignored by free agents — as the last seven years of the administration of George W. Bush, famous devotee of the cult of the gut as he is, have amply demonstrated. As an elitist, I am going to say that the bifurcation between an illuminati and the rabble — as well as the historical swings in power between the two — is probably a permanent fixture of the human condition. In short, there will be no panacea for the mess of human affairs. The problem of politics can never be solved, only negotiated.