
By Mark Ridinger

For the world’s more than 2 billion Christians, Easter Sunday commemorates Jesus’ resurrection and ascension into heaven, paving the path for mankind to achieve everlasting life. It is perhaps no coincidence, then, that the producers of the movie Transcendence chose Easter weekend for the release of their new film. In it, Johnny Depp plays a brilliant, dying computer scientist who, in a quixotic effort, uploads his mind into a supercomputer, attempting to achieve a version of what is commonly called the Singularity. Transcendence, but not of the divine variety. From there on, things get pretty interesting for humankind.

The concept of the Singularity, or artificial intelligence so great that it surpasses human intelligence and understanding, has been discussed for decades; it was first raised by the mathematician John von Neumann in the 1950s. Since then, two diametrically opposed views have emerged: the heaven and the hell scenarios. One of the biggest cheerleaders for the heaven case is inventor and futurist Ray Kurzweil (now working for Google). The heaven scenario postulates that the Singularity will bring unfathomable gifts to humankind, not only in the form of cures for disease, alleviation of hunger, and limitless energy, but immortality as well, as we will ultimately be able to ditch our fragile, mortal biological host for a durable and everlasting silicon model. But before these wonders are bestowed on us, Kurzweil also predicts the rise of a popular anti-technology movement, which he has labeled the New Luddites, as they would be the descendants of the movement that protested increasing machine automation in 19th-century industrial-age England.

But it is hard to call Bill Joy, cofounder of Sun Microsystems, a Luddite. Yet he is one of the main proponents of the hell scenario, which argues, in short, that the acute, exponential technological explosion that is part of the Singularity would be a real existential threat to humanity, as it would give unfathomable power to potentially anyone. He thinks it is entirely possible that this all leads to the extinction of the human race—or worse—within 25 years.

It is true that technology always produces dislocations and disruptions. The power loom was the target of the Luddites, but the “horseless carriage”, the airplane, and the Internet, to name but a few, drew similar resistance. And for the most part, people adapted. It’s dangerous to say it’s “different this time”; almost always that proves to be wrong. But is the exponential change in technology, whether it leads to the Singularity or not, finally poised to overwhelm the glacial pace of evolutionary change and the creatures that arose from it, namely humans? Do we have the wisdom and sagacity to handle such a “transcension”, and even if we do, do we really want to leave behind our humanity as we have always known it? And are those opposed to pursuing this, now or in the future, merely technophobic New Luddites?

On March 27, an elderly but healthy woman known to the public only as Anne from Sussex availed herself of medically assisted suicide after traveling from her home in Britain to Switzerland. Although 89, she is hard to dismiss as some Luddite. She was a former Royal Navy engineer, and described her life as “full, with so many adventures and tremendous independence.” Yet she lamented that technology had so taken the “humanity out of human interactions” that we were becoming “robots” who sat in front of screens, and that it was now just too hard to adapt to the “modern world.” She had grown weary of “swimming against the current.”

“If you can’t join them”, she said, “get off.”

Hopefully, Anne from Sussex will turn out to be a rare, unfortunate, and sad victim of the existential ennui that rapid technological change can produce, and not a harbinger of things to come. But we might have to wait until the history books are written (if they are written) describing the aftermath of the Singularity, should it occur, to find out whether she was an outlier or a human canary in the coal mine. There is more to take away from this than just a case study of severe “technophobia”: namely, what is mankind’s role in shaping our own destiny? If given the tools to direct our evolution, to merge with an AI in the Singularity, for example, will we do it? Should we do it? And will it even be possible to opt out (short of suicide) if one doesn’t want to “evolve”?

It seems unlikely that we will be able to resist the seduction of achieving the Singularity, if and when it arrives. After all, we are told in Genesis 1 that God said to man: Be fruitful and multiply; fill the earth and subdue it. Is the Singularity the ultimate extension of that biblical passage, and is this “neogenesis” what we have been preordained to achieve?

Christians look to Easter as the promise of everlasting life given to us by Jesus dying for humanity’s sins, and with it, a chance to transcend the chains of our material bodies and take a seat next to God in heaven. It remains to be seen whether mankind’s quest to create and direct our own transcendence, and by so doing to become God-like, will end in a heaven or a hell on earth.