How do we best manage emerging technologies? (with their increasing ability to be both wondrous and potentially our undoing)

The future will require that we as tool makers assume a new kind and depth of responsibility for the tools we create.

New technologies have always been two-edged. Learning to make fire, while it let us better warm ourselves and cook our food, also risked our immolation. Today, the automobile makes us much more mobile, and also pollutes the environment. The telephone lets us communicate across long distances, and also disrupts our privacy.

But the magnitude of possible harm in the future will be well beyond anything imaginable in times past.  Advances new on the horizon will have the potential to end the human experiment.  Learning to manage increasingly Janus-faced technologies will present some of the future’s most critical challenges.

Innovations with obvious cataclysmic potential, such as the splitting of the atom and the manipulation of the human genome, most readily come to mind.  But many that present the most frightening risks are more subtle in their effects.

For example, I would argue that the information revolution, while it has more potential to benefit us than any other sphere of discovery, at once has the greatest potential to be our undoing. This is worth looking at more closely, as grappling with the gifts and curses of the Age of Information so well illustrates the seemingly unsolvable quandaries that new technologies present.

While information technologies are designed to increase communication, they can just as easily have the opposite effect. They can serve as delivery systems not just for useless information, but for information that is exploitative, indeed directly damaging.

We catch a glimpse of this potential with the endless car crashes, explosions, and splattering blood that pervade modern visual media. Media violence works—keeps us glued to the screen—because it has a direct neurological effect. The mini-injection of adrenaline that comes with each shooting or explosion mimics the biological signals we use to know whether something is important—whether we should stay tuned.

At the least, such content exploits human compassion, making us numb and cynical in the process. But experienced over time, the effect can be even more disturbing.

I’m reminded of an experiment often referred to in psychology classes to illustrate the mechanisms of addiction. A wire is inserted into excitement centers in a rat’s brain, then run to a depressible pedal in its cage. Eventually, quite by accident, the rat steps on the pedal. Over time the rat discovers the connection between pressing the pedal and the jolt of excitement it brings. It presses the pedal with growing frequency. Eventually, the animal neglects other activities, even eating, and dies.

Future advances will have the capacity—using the addiction analogy—to function as increasingly powerful ‘designer drugs.’

So what do we do?  We face parallel bewilderments with attempts to manage any kind of emerging technology.

For example, it is not nearly as simple as we might think to separate effects that can serve us from those that will not. Using media violence again to illustrate, there the task of management would seem straightforward: we oppose, even legislate against, violent content. But things aren’t that simple. A good argument can be made that visual media’s ability to depict violence with graphic inescapability represents one of its greatest strengths—remember the tanks rolling through Tiananmen Square, or, during the Vietnam War, that picture of a young girl, her body scorched by napalm, running naked toward the camera. Again and again, what we wish to catch slips through our grasp.

Perhaps the best approach, then, is to do nothing—just trust the marketplace. But this gets us no closer. Trust is not warranted when the product being sold is diversion and addiction—and has this potential magnitude of effect. Communications technologies will serve more and more as our social nervous system. The choices we make in regard to them will define the future of human meaning—and perhaps even our survival.

But is managing human invention—of any sort—really even possible? Many have argued that the drive to be tool-makers is unstoppable, impervious to self-reflection.  But if that is the case, our future will not be bright.

Perhaps management is not the right word. In the end we can’t “control” invention any more than we can once and for all control love or the creative process of a work of art—and wouldn’t want to. But neither love nor art of any significance is about unbridled impulse. Once again we are only beginning to ask the right questions.
