Winter 92 - THE VETERAN NEOPHYTE
Many of the things that are important, many of the phenomena that drive the world, are based on
very simple rules. Huge numbers of independent entities interacting in a simple way at their local
level can exhibit surprisingly complex behavior. The amazing and endlessly fascinating thing is that
the end result is not at all obvious if you look only at the local rules.
Weather, for instance: get a bunch--and I mean lots--of gas molecules and water vapor together, and
weather just happens (I've heard that really big closed buildings, like hangars and roofed stadiums,
experience "weather" inside). As far as the molecules are concerned, there's no such thing as weather;
they just sort of bump around and interact with their neighbors, and the result is wind, or clouds, or rain.
Another good example is evolution (one of my favorite topics): throw a bunch of replicating things
into an environment with limited but necessary (for replication) resources, and evolution just
happens. As far as the replicators are concerned, there's no such thing as evolution; they simply do
their best to replicate, and the result is trees, or dogs, or us.
Chemistry is another example that comes to mind: throw a bunch of atoms together, and chemistry
just happens. Again, as far as the atoms are concerned, there's no such thing as chemistry; they
simply attract and repel each other, sticking together or flying apart, swapping electrons around, and
the result is diamonds, or dynamite, or rust.
The examples go on and on; you can find them almost anywhere you care to look. Scientists call it
"emergent behavior": simple, local rules, repeated ad infinitum (in time, or space, or even some other
dimension), surprisingly often produce behavior that's unexpected, even unpredictable, from just the
rules. One of the things I like so much about computers is that they're superlative tools for exploring it.
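A concrete way to watch local rules produce global surprise is Conway's Game of Life, the classic toy model of emergent behavior (my example here, sketched minimally in Python): each cell obeys a single neighbor-counting rule, yet oscillators and gliders emerge at the level of the whole grid.

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life; `live` is a set of (x, y) cells."""
    # The local rule, part one: count each cell's live neighbors.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Part two: a cell is alive next generation iff it has exactly 3 live
    # neighbors, or has 2 and is already alive. That's the whole rule.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row flip between horizontal and vertical forever.
blinker = {(0, 1), (1, 1), (2, 1)}
after_one = step(blinker)    # now a vertical row
after_two = step(after_one)  # back to the original horizontal row
```

Nothing in the rule mentions blinkers, gliders, or any other structure; they simply emerge.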
There are three things in particular that make computers so good for this task: they can do
arithmetic unsupervised, once they're told what to do; they can do their arithmetic inside a logical
structure; and they can do it really fast. This combination is extremely powerful and, more important,
is unique to computers. Before computers, no one ever saw good pictures of fractals--though a few
mathematicians knew they were there--and the reason is simply that no one had the patience to slog
through the incredibly tedious, repetitive arithmetic needed to generate pictures of them. Computers
allowed mathematicians to write a recipe for the math, and then just wait a little while for the results.
In this sense, computers are a kind of microscope that allows people to see certain things for the very first time.
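The tedious, repetitive arithmetic really is that simple. Here's a sketch (my own minimal version, not any particular renderer) of the inner loop behind a picture of the Mandelbrot set: iterate z → z² + c and count how long z stays small; a real program just repeats this once per pixel.

```python
def escape_time(c, max_iter=50):
    """Iterations before |z| exceeds 2 under z -> z*z + c, or max_iter if it never does."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2:
            return n  # escaped: this point is outside the set
        z = z * z + c
    return max_iter   # never escaped: (approximately) inside the set

# c = 0 stays bounded forever, so it's inside the set;
# c = 2 blows up after just a couple of iterations.
print(escape_time(0j))      # -> 50 (never escapes)
print(escape_time(2 + 0j))  # -> 2
```

Color each pixel by its escape count and the famous pictures appear; the slog that defeated pencil-and-paper mathematicians takes a machine seconds.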
Today there's a huge and burgeoning branch of research, often and aptly termed the "sciences of
complexity," that has only become possible with the aid of computers. Emergent behavior is just one
aspect of this larger field. The study of complexity is suggesting all kinds of brand-new approaches in
long-established fields. Medicine, sociology, psychology, economics, biology, neuroscience,
mathematics, physics--all have been affected. Computers have also given rise to completely new
fields of inquiry: artificial intelligence, artificial life, chaos theory, neural networks, genetic
algorithms, even the study of computation itself. The list of applications and repercussions seems to be endless.
It's amazing to me still, and probably always will be, that doing arithmetic inside a logical structure is
a necessary and sufficient condition to simulate anything that can be described precisely. (Even things
that can't be described precisely can be "precisely approximated," a fact that makes engineers rejoice
but mathematicians gag.) Simply doing arithmetic very fast and automatically produces a blazing,
frothing torrent of diversity, a veritable fire hose of creation.
What's even more fascinating to me is that computers themselves are beginning to exhibit many of
the properties that characterize complex systems, including emergence. All they do, really, is
arithmetic. (Of course, if you want to get down deep, all they do is shove electrons around, but that's
a little too abstract, even for me.) But look at all the things computers are used for today, and think
of all the things they could be used for. Admittedly, this progression and diversification is driven by
humans--it wouldn't happen without us--but the number and variety of computers and software that
exist have arisen without a grand design, without an overall plan. Computing has truly begun to evolve.
Early computer programs directly reflected the computer's capabilities. Most were basically number
crunchers, since at heart the computer is a number cruncher. Computers were, after all, invented to
do long, time-consuming calculations quickly and automatically (it helps a lot during wartime). And
that's still all they do, but the programs have changed dramatically.
Programmers soon began to abstract their programs away from sheer arithmetic--and thus from the
machine--and began to use the arithmetic to simulate other things, both strange and ordinary. Word
processing, computer graphics, spreadsheets, databases: all these arrived on the scene. There was (and
still is) a wild divergence away from simply doing arithmetic. In theory, according to mathematical
proofs, computers can simulate any logical system. There are certainly plenty of logical systems to go
around, and plenty more to invent.
So the progress of computing is a kind of human-driven evolution, with human use being the "fitness
function" (that is, the function that determines how well a particular entity is doing). Humans also
drive the mutation and recombination, since they're the ones inventing and modifying programs. And
that's where programmers come into the picture. If we're dealing with an evolutionary process, and
we want it to continue as fast as possible (we do, don't we?), we should provide the things that drive
evolution most strongly: diversity, large numbers, and strong selection pressure.
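Those three ingredients fit in a few lines of code. Below is a toy evolutionary loop (the bitstring genomes, population size, and mutation rate are my own illustrative choices, nothing canonical): random mutation supplies diversity, a population of 100 supplies the numbers, and keeping only the fitter half each generation supplies the selection pressure.

```python
import random

random.seed(1)  # fixed seed so the toy run is repeatable
GENOME_LEN = 20

def fitness(genome):
    """The 'fitness function': here, simply the count of 1 bits."""
    return sum(genome)

def mutate(genome, rate=0.05):
    """Flip each bit with small probability -- the source of diversity."""
    return [bit ^ (random.random() < rate) for bit in genome]

# Large numbers: a population of 100 random genomes.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(100)]

for generation in range(60):
    # Strong selection pressure: only the fitter half survives...
    population.sort(key=fitness, reverse=True)
    survivors = population[:50]
    # ...and mutated copies of survivors refill the population.
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(50)]

best = max(population, key=fitness)
print(fitness(best))
```

No genome ever "knows" it's evolving; it just gets copied, imperfectly, more often when it scores well. Swap in a fitness function that measures human usefulness and you have a crude model of the software marketplace.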
Selection pressure is amply provided by the marketplace; applications that aren't useful, or are too
expensive or buggy, die quick ignominious deaths. The large numbers that we need are already there,
and getting larger. We can help increase them by moving away from the current tendency toward
huge, multipurpose, feature-crammed applications and trying to get closer to the concept of
independent, single-purpose tools. (Besides, small programs are easier to develop, easier to support,
and easier for people to learn.)
This "granulation" also helps increase diversity, in that it breaks up the different functions of an
application into independent entities, with "lives" of their own. But even more effective at increasing
diversity is thinking of new things. Only by trying new stuff, by constantly exploring the landscape of
possibilities, by endlessly diversifying, do we make progress. Today's applications are only the tiniest
subset of what's possible.
Admittedly, there are very real practical limits: computers are only so fast (so far); developers need to
make a living, so their programs have to sell (excepting, of course, those of you lucky enough to work
in research and academia: you can't use this excuse); and, probably most important, programming
computers well turns out to be really hard! But none of these limits are insurmountable. Computers
are getting faster at an incredible rate, new markets are opening up as the number and diversity of
computer users increase, and programming is getting easier. (Obviously the joy of programming has
very little to do with the mechanics of communicating with the machine: just look at all the assembly
hackers and UNIX folks in the world. Come to think of it, maybe a lot of the fun is figuring out how
to say what you want with a painfully limited vocabulary.)
A characteristic trait of complex systems is their sensitive dependence on initial conditions. Ask any
meteorologist. A tiny whisper of change can cascade into a complete transformation of the system.
The evolution of computing is careening along at a very high speed, with a lot of inertia, and in a lot
of directions; but a gentle shove in just the right place might profoundly affect the outcome. Where's
the right place to push? If I knew, I wouldn't be working for a living. But if we all just start pushing
everywhere we can think of, as often as we can, then we're helping computing reach its next
incarnation, whatever that may be. I can't wait to find out.
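That sensitive dependence on initial conditions is easy to see in miniature. The sketch below uses the logistic map, a standard chaotic toy system (my choice of illustration, not a weather model): two orbits starting one part in a billion apart remain indistinguishable for a few steps, then diverge completely.

```python
def max_divergence(x0, eps=1e-9, r=4.0, steps=60):
    """Iterate x -> r*x*(1-x) from two nearby starts; return how far apart they get."""
    a, b = x0, x0 + eps
    worst = 0.0
    for _ in range(steps):
        a = r * a * (1 - a)
        b = r * b * (1 - b)
        worst = max(worst, abs(a - b))
    return worst

# After a handful of steps the two orbits are still essentially identical...
print(max_divergence(0.3, steps=5))
# ...but given enough steps, the one-in-a-billion whisper becomes a shout.
print(max_divergence(0.3, steps=60))
```

The whole "system" is one multiplication rule, yet long-range prediction is hopeless; that's the meteorologist's predicament in two lines of arithmetic.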
- Artificial Life, edited by Christopher G. Langton (Addison-Wesley, 1989).
- Chaos by James Gleick (Penguin Books, 1987).
- Great Mambo Chicken and the Trans-Human Condition by Ed Regis (Addison-Wesley, 1990).
- The Tenth Good Thing About Barney by Judith Viorst (Atheneum, 1971).
DAVE JOHNSON once spent the better part of a day at the public library researching rock skipping (a.k.a. gerplunking or
dapping). He found two official organizations, one annual event, and a handful of articles in various magazines. Although
he sent very nice letters to the organizations asking for further information, he never heard from them. The currently
recognized world record is 29 skips. Rock skipping is still poorly understood by scientists. *
Dave welcomes feedback on his musings. He can be reached at JOHNSON.DK on AppleLink, email@example.com on the
Internet, or 75300,715 on CompuServe.*