Singularitarianism

Singularitarianism is a movement defined by the belief that a technological singularity — the creation of superintelligence — will likely happen in the medium future, and that deliberate action ought to be taken to ensure that the singularity benefits humans. [1]

Singularitarians are distinguished from other futurists who speculate on a technological singularity by their belief that the singularity is not only possible, but desirable if guided prudently. Accordingly, they may sometimes dedicate their lives to acting in ways they believe will contribute to its safe realization. [2]

Time magazine describes the worldview of Singularitarians by saying that “even though it sounds like science fiction, it isn’t, no more than a weather forecast is science fiction. It’s not a fringe idea; it’s a serious hypothesis about the future of life on Earth. There’s an intellectual gag reflex that kicks in any time you try to swallow an idea that involves super-intelligent immortal cyborgs, but … while the Singularity appears to be, on the face of it, preposterous, it’s an idea that rewards sober, careful evaluation.” [1]

Alternative definitions

Inventor and futurist Ray Kurzweil, author of the 2005 book The Singularity Is Near: When Humans Transcend Biology, defines a Singularitarian as someone “who understands the Singularity and who has reflected on its implications for his or her own life”; he estimates the Singularity will occur around 2045. [2]

History

Singularitarianism coalesced into a coherent ideology in 2000 when artificial intelligence (AI) researcher Eliezer Yudkowsky wrote The Singularitarian Principles, [2] [3] in which he stated that a “Singularitarian” believes that the singularity is a secular, non-mystical event which is possible and beneficial to the world, and which is worked towards by its adherents. [3]

In June 2000 Yudkowsky, with the support of Internet entrepreneurs Brian Atkins and Sabine Atkins, founded the Machine Intelligence Research Institute (MIRI) to work towards the creation of self-improving Friendly AI. MIRI’s writings argue that an AI with the ability to improve upon its own design (Seed AI) would quickly lead to superintelligence. These Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.

Many people believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although the exact numbers are hard to quantify, Singularitarianism is a small movement, which includes transhumanist philosopher Nick Bostrom. Inventor and futurist Ray Kurzweil, who predicts that the Singularity will occur circa 2045, greatly contributed to popularizing Singularitarianism with his 2005 book The Singularity Is Near: When Humans Transcend Biology. [2]

What, then, is the Singularity? It’s a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian nor dystopian, this epoch will transform the concepts that we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one’s view of life in general and one’s own particular life. I regard someone who understands the Singularity and who has reflected on its implications for his or her own life as a “singularitarian.” [2]

With the support of NASA, Google, and a broad range of technology forecasters and technocapitalists, the Singularity University opened in June 2009 at the NASA Research Park in Silicon Valley with the goal of preparing the next generation of leaders to address the challenges of accelerating change.

In July 2009, many prominent Singularitarians participated in a conference organized by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss the potential impact of robots and computers, and the hypothetical possibility that they could become self-sufficient and able to make their own decisions. They discussed the possibility and the extent to which computers and robots might be able to acquire any level of autonomy, and to what degree they could use such abilities to pose a threat or hazard (i.e., cybernetic revolt). They noted that some machines have acquired various forms of semi-autonomy, including the ability to find power sources on their own. They warned that some computer viruses can evade elimination and have achieved “cockroach intelligence”. They asserted that self-awareness as depicted in science fiction is probably unlikely, but that there are other potential hazards and pitfalls. [4] Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous functions. [5] The President of the AAAI has commissioned a study to look at this issue. [6]

Reception

Science journalist John Horgan has likened singularitarianism to a religion:

Let’s face it. The singularity is a religious rather than a scientific vision. The science fiction writer Ken MacLeod has dubbed it “the rapture for nerds,” an allusion to the end-time, when Jesus whisks the faithful to heaven and leaves us sinners behind. Such yearning for transcendence, whether spiritual or technological, is all too understandable. Both as individuals and as a species, we face deadly serious problems, including terrorism, nuclear proliferation, overpopulation, poverty, famine, environmental degradation, climate change, resource depletion, and AIDS. Engineers and scientists should be helping us face the world’s problems and find solutions to them, rather than indulging in escapist, pseudoscientific fantasies like the singularity. [7]

Kurzweil rejects this categorization, stating that his predictions about the singularity are driven by data showing that increases in computational technology have been exponential in the past. [8]

See also

Strong AI

Eschatology

Existential risk from artificial general intelligence

Friendly AI

Intelligence explosion

Post-scarcity

Technological singularity

Technological utopianism

Outline of transhumanism

References

[1] “2045: The Year Man Becomes Immortal”, Time magazine, February 2011