Human beings are reengineering their environment—and their own genetic makeup—in ways that would have been almost unimaginable just decades ago. Will new technologies save mankind from all its ills, or will we reengineer ourselves out of existence?
In June of 1961, less than two months after Soviet cosmonaut Yuri Gagarin became the first man to orbit Earth in a spacecraft, the popular science-fiction series The Twilight Zone aired a tale of a librarian found "obsolete" in a futuristic court proceeding. Even at the dawn of the space age, mankind had already begun to fear that it might be overtaken by the technology it had created.
Four decades later, millions of people have more computing power on their desktops than Gagarin had in his entire spacecraft—and at a fraction of the cost. Medicine can treat ailments thought fatal only a few years ago. Genetic engineering and robotics have moved out of science-fiction novels and into our daily newspapers.
As technology brings ever-greater capabilities within our reach, many continue to wonder: are we making ourselves obsolete? Will new technologies change humanity forever? Or will human nature, exploiting these technologies, cause us to destroy ourselves? Millennia ago, God anticipated mankind's creativity and the enthusiasm of today's futurists. At the Tower of Babel, He observed that unless He confounded the builders' languages, "nothing that they propose to do will be withheld from them" (Genesis 11:6). Will God need to intervene once again, to save us from ourselves?
Many today, like economist Julian Simon, see advancing technology as an ever-increasing benefit for human society. "Our species is better off in just about every measurable material way," says Simon. "And there is stronger reason than ever to believe that these progressive trends will continue past the year 2000, past the year 2100, and indefinitely" (The State of Humanity, 1995). But others see dangers on the horizon, agreeing with astronomer Clifford Stoll who comments: "I'm saddened by a blind faith that technology will deliver a cornucopia of futuristic goodies without extracting payment in kind" (High-Tech Heretic, 2000, p. xi).
What is the answer? Compared to previous generations, we know at least that the question has changed. As computer industry pioneer Bill Joy observed: "This is the first moment in the history of our planet when any species, by its own voluntary actions, has become a danger to itself—as well as to vast numbers of others" ("Why the Future Doesn't Need Us," Wired Magazine, April 2000).
Will modern technology exact "payment in kind" from those who seek its benefits? Will mankind's best efforts soon prove the scriptural admonition: "There is a way that seems right to a man, but its end is the way of death" (Proverbs 14:12)? How will some of these new technologies affect humanity?
Biotechnology, in the form of genetic engineering, is the process by which a specific gene (the blueprint of a trait) is isolated and removed from one organism, then inserted into the DNA of another organism to replicate that trait. In the last decade, biotechnology has moved from the laboratory to the kitchen. Dubbed "Frankenfood" by critics, genetically altered crops are becoming more and more common. Already, many foods produced through genetic engineering are on sale in supermarkets. More than 55 percent of all soybeans, and nearly half of all corn, produced in the United States are genetically modified to provide insect resistance, increase yield or reduce the need for herbicides. Gene-splicing techniques have also been employed to "improve" tomatoes—and even beer. Even the European Union, which long resisted the sale of genetically modified foods, is giving way, and will lift a three-year-old ban in October 2002.
Yet consumers remain wary. A recent study by the Pew Charitable Trusts showed that 75 percent of Americans consider it "somewhat" or "very" important to know whether their food has been genetically altered. Nearly 60 percent do not want genetically engineered crops introduced into the food supply. Still, many of these same consumers do not realize that more than half the foodstuffs on supermarket shelves already contain genetically modified organisms.
Scientists envision "a future in which we will choose what to eat based on our own genetic makeup. With the benefit of genetic testing, we would know whether we carried genes that predisposed us to illnesses such as cancer, heart disease, Alzheimer's and diabetes. We'd then eat foods—many of them the products of genetic engineering—that would be designed to help prevent or cure those diseases" ("Fighting Diseases," Los Angeles Times, February 5, 2001). One recent example is a strain of corn being developed to serve as a "vaccine" against hepatitis B.
But some warn of unintended consequences. Scientists have noted that as genetic variability in crops is reduced, the consequences of crop disease become far greater. A disease that would kill only a portion of a genetically diverse crop might destroy the whole of a genetically identical crop. A disease that once caused only a local shortage could cause a global famine. Even "healthy" genetically engineered crops may pose problems, if "hardy, gene-altered crops develop into 'superweeds' that are difficult to eradicate" ("Biotech Crops Need Oversight," Los Angeles Times, February 21, 2002).
Remarkably, engineers have already created transistors thousands of times smaller than those found in today's most advanced microprocessors. Scientists working in nanotechnology—an emerging science based on building molecular-scale machines atom by atom—foresee that within two or three decades, microchips millions of times more powerful than today's models will form the backbone of intelligent devices too tiny to be seen by the naked eye.
Researchers at Delft University of Technology have built a transistor from a single molecule one nanometer wide—on the order of 1/100,000 the thickness of a human hair—which can be toggled on and off using a single electron. Researchers project that nanotechnology will soon allow the creation of super-intelligent, microscopic devices. A swarm of micro-devices might, for example, solve the toxic waste problem by disassembling poisonous molecules, such as dioxin, into the innocuous atoms that compose them.
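To put that scale in perspective, here is a minimal sketch of the arithmetic, assuming a typical hair diameter of about 100 micrometers (real hairs vary from roughly 50 to 180 micrometers):

```python
# Scale comparison: a 1-nanometer molecular transistor vs. a human hair.
# Assumption: hair diameter of ~100 micrometers (real hairs vary, ~50-180 um).

TRANSISTOR_WIDTH_NM = 1.0   # single-molecule transistor, about 1 nm wide
HAIR_DIAMETER_UM = 100.0    # assumed typical hair diameter, in micrometers

hair_diameter_nm = HAIR_DIAMETER_UM * 1_000  # 1 micrometer = 1,000 nanometers
ratio = hair_diameter_nm / TRANSISTOR_WIDTH_NM

print(f"Hair diameter: {hair_diameter_nm:,.0f} nm")             # 100,000 nm
print(f"Transistor width is 1/{ratio:,.0f} of a hair's width")  # 1/100,000
```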
Sadly, it may be far easier to create destructive uses for nanotechnology. "Nanotechnology has clear military and terrorist uses, and you need not be suicidal to release a massively destructive nanotechnological device—such devices can be built to be selectively destructive, affecting, for example, only a certain geographical area or a group of people who are genetically distinct. An immediate consequence of the Faustian bargain in obtaining the great power of nanotechnology is that we run a grave risk—the risk that we might destroy the biosphere on which all life depends.… 'Plants' with 'leaves' no more efficient than today's solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough omnivorous 'bacteria' could out-compete real bacteria: they could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop—at least if we make no preparation. We have trouble enough controlling viruses and fruit flies" (Joy, "Why the Future Doesn't Need Us," Wired Magazine, April 2000).
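Joy's "matter of days" warning rests on simple exponential doubling. Here is a minimal sketch of that arithmetic; every number in it is an assumption chosen for illustration (a one-picogram replicator, a 15-minute doubling time, a biosphere on the order of 10^18 grams), not a measured value:

```python
import math

# Grey-goo arithmetic sketch: how exponential self-replication could reach
# biosphere scale "in a matter of days." All quantities below are
# illustrative assumptions, not measured values.

REPLICATOR_MASS_G = 1e-12   # assumed mass of one nanoreplicator: 1 picogram
DOUBLING_TIME_MIN = 15.0    # assumed time for the population to double
BIOSPHERE_MASS_G = 1e18     # assumed order of magnitude of Earth's biomass

doublings = math.log2(BIOSPHERE_MASS_G / REPLICATOR_MASS_G)
total_days = doublings * DOUBLING_TIME_MIN / 60 / 24

print(f"Doublings needed: {doublings:.0f}")    # about 100 doublings
print(f"Elapsed time: {total_days:.1f} days")  # about 1 day
```

Roughly 100 doublings suffice, and at 15 minutes apiece that is about a day. The point is not the particular numbers, which are guesses, but how quickly unchecked doubling outruns any response.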
Nanotechnology will even have medical applications. Eric Drexler, a leading proponent of nanotechnology, has suggested that nanomachines will eventually be injected into cancer victims. The tiny robots would be programmed to recognize and kill malignant cells—much as an antibody can kill a disease-causing virus. Less-sophisticated nanotech medical tools may emerge within a few years, suggests Phil Kuekes, a computer scientist at Hewlett-Packard Labs and an expert in molecular-scale processors. Intelligent nano-scale devices could be injected as "biological sensors in the body, or for diagnostic purposes in the clinic," Kuekes said. The probes could be powered by ambient light or body heat and deliver a constant stream of data about disease organisms or other medical conditions ("Tiny Transistors a Big Leap for Technology," Los Angeles Times, July 6, 2001).
New technologies have changed society's view of medicine, and indeed its view of what it means to be human. Looking back at a book written just over 30 years ago, we can see how quickly and dramatically society's assumptions about life have changed. In his groundbreaking 1970 work Future Shock, futurist Alvin Toffler asked a then-hypothetical question: "Does death occur when the heart stops beating, as we have traditionally believed? Or does it occur when the brain stops functioning?… What are the ethics of committing [one without brain function] to death to obtain a healthy organ needed for transplant to save the life of a person with a better prognosis?" (p. 206).
Today, the debate is no longer between heart-death and brain-death. Organs from brain-dead individuals are routinely harvested for transplants; two decades have passed since a U.S. Presidential Commission endorsed this change in 1981. Now, the debate has shifted to how science might use embryos, and even fetuses, for medical benefit. Last year, President George W. Bush restricted the use of federal funds to support "stem cell" research. To obtain embryonic stem cells, human embryos must be destroyed. Not only does this bring objections from the pro-life community, but even pro-abortion groups are concerned that researchers' growing need for fertilized eggs could someday become an industry exploiting poor women.
Despite federal restrictions and widespread concerns, some are calling "therapeutic cloning" of human embryos "the dawn of a new age in medicine." Embryos that could otherwise grow into babies are being treated as sources of stem cells, from which organs and tissues can be grown as treatments for disease ("What Clones?" Scientific American, February 2002, p. 18).
Controversy erupted in Great Britain last February, when a couple was given permission to create a baby so that its cells could be harvested as treatment for its brother's fatal blood disorder. Some hail this development as a blessing for those with serious illnesses, while others fear that babies, like embryos, will become just another commodity. Asked one opponent: "Should we allow a child to be manufactured in order to serve the medical needs of an older brother?" Since then, a House of Lords committee has ruled that, in the words of committee chairman Richard Harries, "no avenue of research should be blocked off" to British researchers exploring embryo cloning. In March, Canada took a position more restrictive than Britain's, while still allowing embryos left over from fertility clinics to be used for stem cell research.
Other scientists are looking to computer technology, rather than biotechnology, to manufacture or augment human life. Author Ray Kurzweil foresees a massive increase in computing power in the decades ahead, allowing human beings to "reengineer" themselves. He believes that we can expect, by the year 2020, "to achieve human brain capacity in a $1,000 device… your personal computer will be able to simulate the brain power of a small village by 2030, the entire population of the United States by 2048, and a trillion human brains by 2060. If we estimate the human Earth population at 10 billion persons, one penny's worth of computing circa 2099 will have a billion times greater computing capacity than all humans on earth" (The Age of Spiritual Machines, 1999, p. 105).
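Kurzweil's figures follow from a simple exponential extrapolation. As a minimal sketch, assume (as his numbers imply) that $1,000 buys one human-brain-equivalent of computing in 2020 and that price-performance doubles roughly every year thereafter; both figures are illustrative assumptions, not established facts:

```python
# Kurzweil-style extrapolation of computing price-performance.
# Assumptions (illustrative only):
#   - $1,000 buys one human-brain-equivalent of computing in 2020
#   - price-performance doubles every year thereafter

BASE_YEAR = 2020
DOUBLING_TIME_YEARS = 1.0   # assumed doubling time

def brains_per_1000_dollars(year: int) -> float:
    """Human-brain-equivalents that $1,000 buys in a given year."""
    doublings = (year - BASE_YEAR) / DOUBLING_TIME_YEARS
    return 2.0 ** doublings

for year in (2020, 2030, 2048, 2060):
    print(f"{year}: ~{brains_per_1000_dollars(year):.3g} brain-equivalents per $1,000")
```

Under those assumptions, $1,000 buys about a thousand brain-equivalents by 2030 (a "small village"), a few hundred million by 2048, and about a trillion by 2060, which reproduces the figures Kurzweil cites.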
Kurzweil predicts that within the next century, human beings will "download" themselves into artificially constructed bodies that will provide computing power far beyond the capacity of the human brain. Those who are shocked at this prospect might consider his observation that "in terms of transforming our bodies, we are already further along in this process than we are in advancing our minds. We have titanium devices to replace our jaws, skulls, and hips. We have artificial skin of various kinds. We have artificial heart valves. We have synthetic vessels to replace arteries and veins, along with expandable stents to provide structural support for weak natural vessels. We have artificial arms, legs, feet and spinal implants. We have all kinds of joints: jaws, hips, knees, shoulders, elbows, wrists, fingers, and toes. We have implants to control our bladders.… we have long had implants for teeth and breasts" (ibid., p. 135).
Mankind has already chosen to replace natural body parts with man-made ones. To Kurzweil, what remains is only a matter of degree. He and many others expect that human beings 100 years from now will not look, think or act as they do today. Yet where Kurzweil sees unlimited possibilities, many see danger and—even if we survive—the loss of ourselves as a species.
Thirty years ago, when most of these trends were future prospects confined to the laboratory, Toffler asked: "Might we not unleash horrors for which man is totally unprepared? In the opinion of many of the world's leading scientists the clock is ticking for a 'biological Hiroshima'" (Toffler, p. 199).
But simply because horrors might occur, should we expect that they will? Consider Toffler's assessment: "In short, it is safe to say that, unless specific counter-measures are taken, if something can be done, someone, somewhere will do it. The nature of what can and will be done exceeds anything that man is as yet psychologically or morally prepared to live with" (ibid., p. 205).
Should we be afraid? Left to itself, mankind cannot reliably predict—much less guarantee—its own future. Yet thousands of years ago, the prophet Daniel accurately foresaw that in the "time of the end" mankind's "knowledge shall increase" (Daniel 12:4). In Daniel and elsewhere, Scripture explains what God's people must do to make themselves ready in the end times, as they see more and more prophecies fulfilled (see "Prepare for Christ's Coming!" on page 4 of this issue).
Has human nature changed since Scripture was written? Jesus Christ described conditions before His return. "But as the days of Noah were, so also will the coming of the Son of Man be. For as in the days before the flood, they were eating and drinking, marrying and giving in marriage, until the day that Noah entered the ark, and did not know until the flood came and took them all away, so also will the coming of the Son of Man be" (Matthew 24:37–39). Just as in the days of Noah, human beings will be absorbed in pursuits of the flesh, unaware of the momentous events about to occur.
Mankind, by itself, may be heading toward destruction. But true Christians today are not by themselves—they have the wonderful gift of the Holy Spirit to guide them (Romans 8:14), and they have God's promise of protection in the troubling times ahead (John 17:11). Moreover, just when humanity is about to destroy itself, Jesus Christ will return to usher in His millennial rule over the nations (Matthew 24:21–22, 29–31).
At His return, true Christians will experience a change far more dramatic than even the most exuberant advocates of genetic engineering or robotics can imagine. Scripture explains that "we shall all be changed—in a moment, in the twinkling of an eye, at the last trumpet. For the trumpet will sound, and the dead will be raised incorruptible, and we shall be changed" (1 Corinthians 15:51–52).
True Christians need not fear becoming obsolete. When "death is swallowed up in victory" (1 Corinthians 15:54) they will share in the amazing transformation that God has planned for mankind, as He prepares human beings to serve Him in tomorrow's world.