Reading
from: Why the future doesn't need us by Bill Joy
In the mid-'80s, ... Eric Drexler's [book] Engines of Creation, ...
described beautifully how manipulation of matter at the atomic level could
create a utopian future of abundance, where just about everything could be made
cheaply, and almost any imaginable disease or physical problem could be solved
using nanotechnology and artificial intelligences.
A subsequent book, Unbounding the Future: The Nanotechnology Revolution,
which Drexler cowrote, imagines some of the changes that might take place in a
world where we had molecular-level "assemblers." Assemblers could make possible
incredibly low-cost solar power, cures for cancer and the common cold by
augmentation of the human immune system, essentially complete cleanup of the
environment, incredibly inexpensive pocket supercomputers - in fact, any
product would be manufacturable by assemblers at a cost no greater than that of
wood - spaceflight more accessible than transoceanic travel today, and
restoration of extinct species.
(I need to comment here that this isn't merely speculation about the future.
People are actively working on technologies and theoretical designs to
accomplish these goals using molecular-level machines.)
See if you can guess which famous mathematician wrote these lines:
First let us postulate that the computer scientists succeed in developing
intelligent machines that can do all things better than human beings can do
them. In that case presumably all work will be done by vast, highly organized
systems of machines and no human effort will be necessary... the human race
might easily permit itself to drift into a position of such dependence on the
machines that it would have no practical choice but to accept all of the
machines' decisions. As society and the problems that face it become more and
more complex and machines become more and more intelligent, people will let
machines make more of their decisions for them, simply because machine-made
decisions will bring better results than man-made ones. Eventually a stage may
be reached at which the decisions necessary to keep the system running will be
so complex that human beings will be incapable of making them intelligently. At
that stage the machines will be in effective control. People won't be able to
just turn the machines off, because they will be so dependent on them that
turning them off would amount to suicide.
If you haven't recognized him, the author of this frightening passage is
Theodore Kaczynski - the Unabomber. Computer scientists believe that by the
mid-21st century they may be able to build machines that exceed our
intelligence.
I urge you to go to the library or the web and read this article.
Sermon
I love technology. Any of you who see me work with computers or pull out my
Palm IIIxe to capture an item on my to-do list or to make an appointment have
witnessed my affection for electronic devices. If I could afford it, I'd have
all the latest technological gadgets for my computer and our home. From the
time I built my first crystal radio set and assembled my first short wave radio
to my discovery of computers in college, I've been fascinated by technology. I
remember sitting all alone at the computer terminal, upstairs in DuPont Hall at
the University of Delaware, pressing the return key, and seeing the computer
instantaneously reply, "Ready!" She was my perfect partner, my dream playmate.
She was always there and ready to respond to me 24 hours a day. Even today, I
feel a little thrill each time I click the mouse and the computer immediately
does what I ask - without questioning the order.
I'm not alone. The world has fallen madly in love with technology and what it
can do for us. I don't need to tell you how computers are changing every
aspect of our lives. This year computers have unlocked the secret message of
our DNA and transcribed its program - completely. We know every bit of the
code. We don't know what it means - yet - but the translation process has
already begun. The world of biotechnology seems boundless as we continue to
unlock the secrets of how living organisms work. Our exploding knowledge is
translating into the ability to change the DNA of living systems to suit our
own purposes. Have a problem with a pest attacking your crop? Just splice in
a gene to protect it. Want your cow to produce more milk? Just feed it a
special hormone to increase production.
One of the latest innovations referred to by Drexler in the reading is
"nanotechnology." The prefix nano- refers to a billionth of a meter, the scale
of molecules. Computer memory chips now store their ones and zeros in
dimensions of tens of atoms, a dimension limited by the effects of quantum
uncertainty and
cosmic rays. I've done some reading and looked at a few designs for building
molecular machines that could become atomic assembly lines. One solution to
the coming energy crisis could be here. Instead of waiting for the natural
process of growth and decay, oil companies could start with biomass, such as
organic waste, at one end and at the other have a spigot with jet fuel flowing
out! Molecular technology promises to be several orders of magnitude more
energy efficient than conventional methods.
The tremendous pace of innovation today has predisposed us to believe that
advances in technology will save us from any future troubles. Experts and
government officials haven't responsibly planned for the coming oil production
peak because they believe scientists and engineers will come to their rescue.
There is a closely followed futurist and Wall Street stock picker, George
Gilder, who has mixed this faith in progress with his Christian religious
faith. He believes that the technological run-up to the end of the millennium
is just beginning and good times are ahead. He sweeps aside the pessimists
with a deep faith in uncertainty, in what we don't know will happen next: the
unanticipated discovery or innovation arriving just in time to catch us before
we fall.[1]
Even though I love technology, I share Bill Joy's doubts whether advances in
technology will be our salvation. While Gilder is right that we can't see what
lies ahead or how it will shape the future, we can still predict what might
come next from what we do know of today's technology and of human behavior.
And what might come next is getting more and more frightening.
There are two effects of the development of new technology that I find
particularly troubling. The first is the malevolent use of a new invention or
discovery to harm people rather than help them. All I need do to persuade you
of this danger is mention two names: Saddam Hussein, and Usama Bin Laden. Both
would be more than happy to acquire whatever powerful new technology is
discovered to use it against their enemies, particularly us. If gene splicing
could be used to generate a version of the Ebola virus that would kill
Caucasians
selectively, I expect they'd do it. The more powerful the technology, the
greater the danger. And when the scientists are driven by a deluded
understanding of the Islamic idea of jihad, all the more we have to
fear. Not only the Islamic world but terrorists everywhere should be the
source of our concern.
One suggested solution to this danger is to keep the discoveries secret and
protect them from those who might abuse them. Good
luck! Science thrives in an atmosphere where ideas are freely exchanged.
Internet technology is making information harder and harder to control. Think
about how difficult it has been to protect our nuclear secrets. In a world of
greater and greater social mobility, it is harder and harder to separate the
good guys from the bad guys.
This doesn't trouble me as much as the second effect, the law of unintended
consequences. Even if we have world peace and everyone loves everyone else,
there is no way to outlaw human ignorance, mistakes and stupidity. And even if
we have the best minds working together under the most careful controls and
regulations, things happen that nobody would have expected. Technology bites
back[2].
Some of these problems seem perfectly obvious to us after the fact. But at the
time, no one had any idea there might be a problem. In fact the social good
seemed obvious. Think for a moment about preventing forest fires. It seems
obvious we should stop fires because they kill people and destroy property,
plants, and animals. Yet now that we've held fires off for so many years, when
a dry year does come, the fires are more intense and kill the trees rather than
just clearing out the under-story and fertilizing the soil. How about the use
of antibacterial agents in soap, for example? It seemed like we should spread
germ killers
all over the place to protect our families' health. But evolution never sleeps
and rarely can be defeated. Not all the bacteria will be killed by the
germicides, and the resistant strains we accidentally select will be the next
terrors we must face, with no drug left to stop them.
How about something as simple as a better ski boot that protects the ankle?
Doctors soon noticed this new boot was causing many more anterior cruciate
ligament knee injuries. Or think about motorcycle helmets and neck injuries.
Rather than dying, many riders were doomed to being quadriplegics for the rest
of their lives. I could do a whole sermon on introducing African bees in
Brazil to increase honey production. Or the introduction of Starlings and the
Melaleuca tree to our ecosystem with the best of intentions and the most
disastrous of consequences. And new mistakes are happening right now, even as
I speak.
Each innovation that gets introduced follows a cycle: intensification,
followed by disaster, then the addition of precautions and vigilance to
prevent further problems. The end byproduct of each new technology is ever
greater vigilance. As our vigilance becomes more and more
complicated, we will need more and more machines to help. The Unabomber's
prediction becomes even more ominous if we think this through to its natural
conclusion.
I'm personally most tuned into unintended consequences when I write computer
software even though I'm very, very good at it. For non-programmers, think
about trying to write an essay without misspellings and free of any grammatical
errors - on the first draft. As software programs grow in size, the number of
errors grows not linearly but exponentially. In common
sense language, the bigger the software project gets, the more completely
unmanageable it gets, requiring too many people to find all the bugs. Soon,
there may not be enough people in the whole world to find all the software bugs
in all the software being written. And the worst bugs will only manifest when
we least expect them, when our great space probe lands on Mars in an unexpected
way and tilts the antenna the wrong direction.
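For those who like to see the arithmetic, the runaway growth of bugs can be
sketched with a toy calculation (my illustration, not part of the sermon): if
every pair of components in a program can interact, the opportunities for
unforeseen interaction bugs grow far faster than the code itself.

```python
# Toy model (illustrative assumption): treat each distinct pair of
# components as one opportunity for an unforeseen interaction bug.
def interaction_pairs(n_components: int) -> int:
    """Distinct pairs among n components: n * (n - 1) / 2."""
    return n_components * (n_components - 1) // 2

# Components grow 100-fold; pairwise interactions grow ~11,000-fold.
for n in (10, 100, 1000):
    print(n, interaction_pairs(n))
```

Doubling the size of a project roughly quadruples the pairwise interactions to
check, which is one simple way to see why large software projects become, in
the words above, completely unmanageable.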
There just isn't any way we can figure out how to prevent the experience of
"Oops!"
I think of the first use of the atom bomb, over Hiroshima, as the turning
point in our awareness that technology could be dangerous to the future of our
species, even life on this planet. An all-out nuclear war could put so much
debris in the air that everyone and everything would freeze without the sun's
light. Looking back in history now, a little wiser, we can see how each
technological innovation both helped advance civilization and wiped out tribes
that didn't adopt it. There is a wonderful book on this subject, Jared
Diamond's Guns, Germs, and Steel, which shows in great detail how innovations
in technology
have shaped us and the world. Small, almost accidental variations in the
environment have stimulated human inventions that have had profound effects
independent of human intelligence or genetic fitness.
What drives this innovation? Why do we keep turning to technology to save us?
I know why I adopt each new computer innovation that comes my way. It makes my
life easier and more pleasant. It prevents, heals or protects me from danger
and discomfort. Satisfaction of desire and avoidance of pain endlessly spin
the wheel of technological progress. The desire to exist and continue to exist
is central to our humanity. And this isn't just some human urge. We share this
urge to live and successfully pass life on to our offspring with all life
forms.
The Buddha really understood this very well and laid it all out for us over
2500 years ago. All living beings desire existence and resist non-existence.
All living beings experience confusion about how best to continue
to exist and avoid future non-existence for themselves and their offspring. We
are trapped in the endless cycle of birth and death and cannot see beyond
it.
It has always seemed a little ironic to me that one who loves technology as
much as I do would be so enamored with Buddhism. Theravadan Buddhist monks,
the sect I'm most fond of, are about as far from technology as you can get.
They have almost no possessions, beg for food, turn their attention away from
the
world to explore what is already inside them. Their meditation practice as
taught by the Buddha has no use for technology. No devices are needed for
awakening. No special incense or crystals to hold or musical sounds to listen
to or chant. Just sit and be awake to what is.
I doubt if I've ever been happier in my life than sitting in silence at a
meditation retreat with a peaceful mind, feeling love for everybody, full of
compassion for human suffering. In that moment, I didn't need a new DVD player
or a Palm keyboard, my favorite meal or a warm whirlpool bath to make myself
feel complete. For that moment, I experienced living life without conditions
or expectations. You can do it too.
Such experiences are the foundation of my decision to become a minister. I
sometimes feel my heart tugged by technology as I think about being part of the
excitement and creativity of ushering in the next computer revolution. Yet I
see that the pace of technical innovation far outstrips the pace
of social innovation. My wisdom and intelligence can be better used inventing
social responses to the effects of technology rather than accelerating
technological development.
I find it interesting that the simplicity of Buddhism comes to the West right
as technology threatens to finish us off. Perhaps George Gilder is right.
There is some kind of a loving God out there holding the tiller. The Dalai
Lama, my personal vote for the prophetic voice of God, brings particularly
powerful ideas to our civilization that perhaps will moderate our potential
for
world destruction. He encourages us to understand that the happiness we seek
is independent of material progress. What gives life meaning is embodying love
and compassion with a sense of universal responsibility.
What I'm certain of is we cannot survive as a divided world any longer. Most
of our problems are global in scope and will require global answers.
Technology alone cannot solve the problems that are buried in the human brain.
The inner human problem has no technological solution. Humanity must wake up
to the subjective nature of our craving, hatred and delusion and make a
conscious choice to begin living in a new way that cultivates wisdom and
compassion.
Since we cannot prevent the unintended consequences of innovation, at best, we
can use our collective wisdom to choose the appropriate, earth friendly
technology. Let us turn back from our addiction to progress and turn toward
the treasures already piled high in front of us. Sustainability must be our
guiding principle for the 21st Century.
I'm afraid there is no way out of this mess ... but there is a
way into it and beyond it.
Copyright (c) 2000 by Samuel A. Trumbore. All rights reserved.
[1] Check out Gilder Technology at
http://www.gildertech.com/
[2] Tenner, Edward, Why Things Bite Back:
Technology and the Revenge of Unintended Consequences, Knopf, 1996,
ISBN 0-679-42563-2