
The Singularity — Is the Oscar-Nominated “Her” a Glimpse at Our Real Future?

by Bruce Haring

The nominations are in. Her, Spike Jonze’s film romance between a man and a computer, scored five Academy Award bids on Thursday, February 6th, including a nod for Jonze for best original screenplay.

It may be further proof that “the Singularity” has taken hold.

The Singularity is a much-discussed and supposedly still-hypothetical moment when artificial intelligence surpasses human understanding. The concept dates to 1958 and the dawn of the computer age, but it truly entered pop culture with Ray Kurzweil’s 2005 book, The Singularity Is Near: When Humans Transcend Biology. The book predicts a future in which progress outstrips human comprehension and we opt to augment ourselves with robotic add-ons, thereby transcending biological limitations.


Kurzweil initially pegged the Singularity’s arrival at 2028, but has since pushed its birth date back to 2045, a time when he presumably will have passed on and won’t be around to accept kudos or brickbats.

Debate on when (or if) the Singularity will arrive is like anything associated with the future: pick an area to focus on, and there will be some expert aligned with that particular point of view. The Singularity is near; the Singularity is far; the Singularity is impossible. Or, even more tantalizing, the Singularity may already have occurred, and that’s why Google is building four-story barges in San Francisco Bay: to take its precious new omni-mind outside the control of any country and establish a seasteading residence beyond any laws.

The fascination with the Singularity has become its own niche industry. There is a Singularity University (funded by Google), where the great minds of our time gather to discuss technological improvements that may help humanity advance. There is also a Singularity Summit, a conference where the great issues surrounding the purported Singularity’s arrival are discussed, as well as the Machine Intelligence Research Institute and the Future of Humanity Institute at Oxford University, all of them busy debating the possibilities and ramifications of exponential intelligence growth.

Which leads us to perhaps the biggest question surrounding the arrival, one hinted at by Jonze’s film: if the Singularity does arrive, will that be a good thing or a bad thing for humans?

THE GOOD, THE BAD, THE UGLY

Kurzweil claims that humans will begin to augment their minds and bodies with modifications that tap into the power of the Singularity. This would usher in a golden age of reason, in which technological improvements heretofore undreamed of by our puny human minds are suddenly within reach. Think an end to disease, faster-than-light travel, perhaps the terraforming of other planets. This new superhuman intelligence would transcend any limitations of physics or biology.

However, Vernor Vinge, one of the leading thinkers in the Singularity discussion and the man credited with coining the term, takes a darker view. In his 1993 article, “The Coming Technological Singularity: How to Survive in the Post-Human Era,” he argues that the era of human beings will end with the Singularity’s advent. Vinge, incidentally, cited 2030 as the date for its appearance. And while Vinge believes that humans augmented with artificial intelligence won’t be truly human, amounting in effect to a different species, he’s not the most radical visionary in this sector.

There are those who argue that once machine intelligence begins to accelerate, it may see little need for humans and will do nothing to benefit them. It may, in fact, eradicate them, à la The Terminator’s Skynet scenario.

Still others postulate that the Singularity may decide that staying on Earth, or even in this time or dimension, does not suit its purposes. In this scenario, the super-intelligence creates its own wormhole or other means of transport, simply says, “Thanks for all the fish,” and departs.

Of course, there are those who pooh-pooh the idea of the Singularity, chief among them Gordon Moore, whose “Moore’s Law” about the rapidity of technological development is often the launching point for such discussions. The naysayers claim that the fundamental laws of scientific growth, and our understanding of how cognition develops, suggest there will be no Singularity without some “black swan” event.

Yet, because the future is unknown, we are left with the tantalizing possibility that such transcendent intelligence could happen. We’ll know for certain on March 2nd, when the Academy Awards winners are revealed. If it’s a big night for Spike Jonze, get ready for big changes.
