The Internet is Still Very, Very New

The Stranger “reviews” twitter and makes the obvious, though necessary point (Constant, Paul, “Paul Constant Reviews Twitter,” The Stranger, 30 June 2009):

So I’m going to say something that might strike you as weird and naive, but it’s true. Listen: The internet is still very, very new.

Most people haven’t even been on the internet for 10 years yet. Ten years! Every technology is lawless frontier after just 10 years.

Television was still radio with scenery 10 years after its inception. People pointed, awestruck, at planes 10 years after Kittyhawk.

We’re just learning what the internet can do, and we’ll learn a lot more once children born today grow up with today’s internet.

For Twitter’s first three years it was easy to lampoon the service as the ultimate medium for whining about first-world problems. But then the Iranian election happened, and overnight it became a tool for unleashing social transformation and an indispensable news medium. The Internet is still new. Many potential services lie as yet unimplemented. Many will at first seem trivial or demeaning of this or that high value (“Is Google making us stupid?”). They will seem so until the moment they transform into something utterly other than their original intention, specification, design.

Good point aside, can we have no more articles about Twitter written entirely in 140-character paragraphs? It was cute at first, but now it is just a gimmick. It was worth doing once for the style of the thing; now it only detracts from the larger point. The 140-character message has its place, and that place is not the short-form essay.

The Noosphere Visualized

FaceBook's corner of the noosphere visualized, by Jack Lindamood

Jack Lindamood, one of the programmers at FaceBook, has produced a demonstration of an incredible tool that displays all activities on FaceBook geolocated on a rendering of the Earth. When he switches to a mode where it draws a line connecting the initiator to the recipient we get a view of something hitherto entirely invisible: a small subset of the noosphere being stitched together in real time.

I love it when he declares this “the Earth as Facebook sees it.” Then he switches to the wireframe view, in which the Earth becomes just a grid for the information patterns that arc over its surface. Just as the geosphere became the platform for the biosphere and the biosphere in turn became a geological force (Chown, Marcus, “Earth’s ‘Mineral Kingdom’ Evolved Hand in Hand with Life,” New Scientist, 19 November 2008), so the Earth is the grid on top of which the noosphere constructs itself, eventually becoming a biological force in its own right.

The Expanse of the Noosphere

Having all those icons stretching out into outer space looks impressive. But both the atmosphere and the crust of the Earth are proportionally thinner than the skin of an apple, and the biosphere is confined to an even narrower band than that. The noosphere’s region of activity is narrower still, with human activity limited to a few score meters above and below the surface of the Earth. The noosphere has a tiny primary zone of hyperactivity, but with tendrils reaching out to more exotic regions. Just as the biosphere is largely confined to the Earth’s surface, yet extremophile species subsisting on exotic metabolisms inhabit the deep fissures of geysers, the bottoms of ocean trenches and the bottoms of mine shafts wherever and at whatever depth we drill them, so the noosphere stretches in tentative wisps far beyond its primary realm. Consider:

  1. Given that we have probes on other planets, in interplanetary space and, in the case of the Voyager probes, approaching the heliopause and hence interstellar space, all beaming signals back to the Earth, the network of input devices of the noosphere is interplanetary and nearly interstellar. A small portion of the information making up the noosphere, and effecting outcomes within it, is cosmic in origin.
  2. The internet has gone interplanetary (Courtland, Rachel, “‘Interplanetary Internet’ Passes First Test,” New Scientist, 19 November 2008). NASA used to manually initiate direct, batch communication with individual space probes, but now has software routers aboard many probes for an always-on, packet-switched interplanetary network, including orbiters and rovers roaming planetary surfaces. The time may soon come when the various surveillance and communications systems in space will have open APIs like those of present-day Google Maps, FaceBook, Twitter and others.
  3. The first radio broadcasts powerful enough to penetrate the ionosphere were made in the late 1930s, so for the last 70 years an expanding radio sphere, a future light cone, has been filling with the broadcasts of our civilization, arranged in concentric spheres in reverse-chronological order (from our perspective): every electromagnetic pulse from an atomic explosion, every television broadcast, every telephone conversation relayed by satellite, every IP packet sent through satellite internet, the complete transcript and telemetry of every space mission and the numerous software patches and whole operating-system upgrades broadcast to various space probes. In that sense the noosphere, or at least the informational exhaust wafting off it, its electromagnetically petrified version, already stretches out to a radius of 70 light years. At that distance it spans nearly 5,000 stars and a similar number of exosolar planetary systems.
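The ~5,000-star figure is easy to sanity-check with a back-of-envelope calculation. The stellar density used below (about 0.004 stars per cubic light year, roughly the commonly cited value for the solar neighborhood) is an assumption of this sketch, not a figure from the post:

```python
import math

# Radius of the expanding radio sphere, in light years.
radius_ly = 70
# Assumed local stellar density: ~0.004 stars per cubic light year
# (about 0.14 stars per cubic parsec).
density = 0.004

# Volume of the sphere, in cubic light years.
volume = (4 / 3) * math.pi * radius_ly ** 3
stars = density * volume

print(f"{volume:,.0f} cubic light years, roughly {stars:,.0f} stars")
```

With these assumptions the sphere works out to a bit under 1.5 million cubic light years and somewhat over 5,000 stars, in line with the figure above.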

Time is out of joint, and an epoch has an extremely long tail in both directions. One begins with imperceptible first movements and, similarly, only finally passes away long after anyone has noticed. We might say that even while the noosphere is still only in the primitive stages of weaving itself together here on Earth, it has already become interplanetary and arguably also interstellar.

(FaceBook video via Frank Episale)

OneMachine Doppelganger

Kevin Kelly has long discussed the OneMachine: the essentially unified, single distributed computer built up from all our networked PCs, PDAs, cell phones, digital cameras and other personal electronics (see e.g. “Tap Into the 12-Million-Teraflop Handheld Megacomputer,” Wired, vol. 16, no. 7, June 2008; “Dimensions of the One Machine,” The Technium, 2 November 2007; “One Huge Computer,” Wired, vol. 6, no. 8, August 1998).

Last week The New York Times ran an article on how, below the surface and running on the very same computers as the productive, life-enhancing OneMachine, there operates a nefarious parallel network, the OneMachine’s dark doppelganger, the BotNet (Markoff, John, “A Robot Network Seeks to Enlist Your Computer,” 20 October 2008, p. B1):

Botnets remain an Internet scourge. Active zombie networks created by a growing criminal underground peaked last month at more than half a million computers, according to shadowserver.org, an organization that tracks botnets. Even though security experts have diminished the botnets to about 300,000 computers, that is still twice the number detected a year ago.

The actual numbers may be far larger; Microsoft investigators, who say they are tracking about 1,000 botnets at any given time, say the largest network still controls several million PCs.

“The mean time to infection is less than five minutes,” said Richie Lai, who is part of Microsoft’s Internet Safety Enforcement Team, a group of about 20 researchers and investigators. The team is tackling a menace that in the last five years has grown from a computer hacker pastime to a dark business that is threatening the commercial viability of the Internet.

I have already written about how when the singularity occurs, it may not be what we expect. My suspicion is that either it will be overtly evil, or merely a recreation of the chaos of biological nature in a more durable, powerful and virulent form (“A Few Heretical Thoughts on the Singularity,” 19 August 2008).

What do phenomena like the BotNet suggest about the singularity? What comes will grow out of what is, and what is will bequeath its characteristics to what comes, at least initially. Between the various military establishments and the criminal underground, we are instilling our machines with hostile, aggressive tendencies. But we are also making numerous, competing systems. Will there be “the singularity” or will it, as in the novels of Vernor Vinge and Charles Stross, come in secretive, uncertain fits and starts? Will there be multiple singularities? Will one system cross the threshold, followed by another, then another? It makes sense to speak of “the singularity” when one is imagining a unified system, but when one is considering a multitude of contending systems, crossing the threshold of the singularity is but one move in a strategic game. Perhaps the machines will be hostile to predecessor biological life, but perhaps they will be so consumed in competition with their fellow AIs as to be merely indifferent to us, as we are to, say, pigeons or squirrels.

And how goes the strategic competition between the OneMachine and the BotNet? We ought to know. What portion of computational capacity, bandwidth, energy consumption and hours of their masters’ time are the two consuming? Qualitatively, how do their capabilities compare? Kevin Kelly has managed to make some calculations for the former, but what of the latter? Of course any such accounting would be subject to the usual problems of surveilling those who do not want to be surveilled.
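Kelly’s headline figure and the Times’s botnet counts do permit a crude first pass at the comparison. In the sketch below, the 300,000 zombie figure comes from the shadowserver.org estimate quoted above and the 12-million-teraflop total from Kelly’s article title; the 10 gigaflops per commodity PC is an assumed round number, not a measured one:

```python
# Crude comparison of BotNet compute capacity to the OneMachine.
onemachine_tflops = 12_000_000   # Kelly's estimate for the whole OneMachine
zombies = 300_000                # active zombie PCs (shadowserver.org figure)
per_pc_tflops = 0.01             # assumed: ~10 gigaflops per commodity PC

botnet_tflops = zombies * per_pc_tflops
share = botnet_tflops / onemachine_tflops

print(f"BotNet: ~{botnet_tflops:,.0f} teraflops, "
      f"about {share:.3%} of the OneMachine")
```

On these assumptions the BotNet commands on the order of a few thousand teraflops, a small fraction of a percent of the whole, though raw flops say nothing about the qualitative match-up.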

Organizations like McAfee, Norton and the International Botnet Taskforce are attempting to build something akin to an immune system for the Internet, but the billion-year persistence of the arms race between host immune systems and infectious agents suggests that dampening catastrophe is probably the best outcome we can hope for. It is an example of co-evolution, where competition between host and agent drives the development of each. Viruses do not kill their hosts by design; they merely seek to hijack the hosts’ reproductive machinery for their own purposes. Killing the host, or at least killing it too quickly, or the epiphenomenon of killing too many hosts too quickly, are all suboptimal in that they diminish the opportunity for continued infection and reproduction. Ebola gets it wrong. HIV gets it really right. But virus behavior as a whole is not intelligent. Occasionally a virus goes super-virulent or hits a particularly vulnerable population, and a massive outbreak occurs that wreaks havoc for host and infectious agent alike. I presume that BotNets will continue to act something like this.
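The virulence trade-off can be shown with a toy epidemic model. The sketch below is a minimal discrete-time simulation in which infected hosts die at a tunable rate; all parameters are invented for illustration, not drawn from any real pathogen:

```python
# Toy epidemic illustrating the virulence trade-off: a pathogen that
# kills hosts quickly destroys its own transmission base.
def total_transmissions(kill_rate, beta=0.3, steps=200, n=10_000):
    """Run a crude susceptible/infected model and return total infections."""
    s, i = n - 1.0, 1.0              # susceptible and infected hosts
    transmitted = 0.0
    for _ in range(steps):
        new = beta * s * i / n       # new infections this step
        s -= new
        i += new - kill_rate * i     # lethal pathogens deplete the infected pool
        transmitted += new
    return transmitted

mild = total_transmissions(kill_rate=0.05)   # slow host death, HIV-like
harsh = total_transmissions(kill_rate=0.5)   # rapid host death, Ebola-like
print(mild > harsh)   # the milder pathogen spreads far further
```

In this toy model the highly lethal strain burns out after a handful of transmissions, while the mild strain eventually infects most of the population, which is the sense in which Ebola “gets it wrong” and HIV “gets it right.”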

And since one third of known biological species are parasites, and the proportion seems to be growing, there would seem to be something fundamental about the strategy of parasitism. We should anticipate its continuance in both genetic and electronic space.