
What is the Technological Singularity?

By Michael Anissimov
Updated Feb 18, 2024
Our promise to you
WiseGEEK is dedicated to creating trustworthy, high-quality content that always prioritizes transparency, integrity, and inclusivity above all else. We ensure that our content creation and review process includes rigorous fact-checking, evidence-based sourcing, and continual updates to maintain accuracy and reliability.


The Technological Singularity, or simply "Singularity," is a multi-faceted concept in futurism with several overlapping and sometimes conflicting definitions. The most prominent definition of the Singularity was given by Vernor Vinge in his essay "The Coming Technological Singularity": the point at which superhuman intelligence is created technologically. These superhuman intelligences could then apply their brainpower and expertise to the task of creating additional, more powerful superhuman intelligences, resulting in a snowball effect with consequences beyond our present ability to imagine.
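The snowball effect described above can be sketched as a toy iteration (my own illustrative model, not anything from Vinge's essay): if each generation of intelligence designs its successor, and the size of the improvement scales with the designer's own capability, capability compounds from one generation to the next.

```python
# Toy model (purely illustrative): each generation of intelligence
# designs the next, and the design gain scales with the designer's
# own capability -- the "snowball effect" of recursive self-improvement.
def self_improvement_trajectory(initial=1.0, gain=0.5, generations=6):
    capability = initial
    trajectory = [capability]
    for _ in range(generations):
        capability += gain * capability  # smarter designers make bigger jumps
        trajectory.append(capability)
    return trajectory

print(self_improvement_trajectory())
```

With the (arbitrary) parameters above, capability multiplies by 1.5 each generation; the point is only the compounding shape of the curve, not the numbers themselves.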

The term "Technological Singularity" was coined by analogy to the singularity at the center of a black hole, where the forces of nature become so intense and unpredictable that our ability to calculate the behavior of matter in those circumstances drops to zero. Often mentioned alongside superhuman intelligence in Singularity dialogues is the notion of accelerating technological change. Some have argued that as the slope of technological progress increases, it will approach a vertical asymptote, visually similar to a mathematical singularity.

However, this notion of the singularity is not the one Vinge intended, which refers to the emergence of superhuman intelligence along with superhuman thinking speeds (including smartness: the ability to understand and create concepts, turn data into theories, make analogies, and be creative). Though superhuman intelligences creating additional superhuman intelligences would indeed accelerate technological progress, progress would not become infinite in the sense that a mathematical singularity would suggest.
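The distinction drawn above can be made concrete with a short numerical sketch (my own illustration, not from the article's author): ordinary exponential growth accelerates forever yet stays finite at every time t, whereas the hyperbolic growth law dy/dt = y² reaches a genuine mathematical singularity at the finite time t = 1/y₀.

```python
import math

# Illustrative contrast: exponential growth is always finite, while
# the hyperbolic law dy/dt = y**2 diverges at a finite time t = 1/y0.
def exponential(y0, t):
    return y0 * math.exp(t)

def hyperbolic(y0, t):
    # Closed-form solution of dy/dt = y**2: y(t) = y0 / (1 - y0*t),
    # which blows up as t approaches 1/y0.
    assert t < 1.0 / y0, "t is past the finite-time singularity"
    return y0 / (1.0 - y0 * t)

y0 = 1.0
for t in (0.0, 0.5, 0.9, 0.99):
    print(f"t={t}: exponential={exponential(y0, t):.2f}  hyperbolic={hyperbolic(y0, t):.2f}")
```

At t = 0.99 the exponential curve has merely reached about 2.7 while the hyperbolic curve has reached 100 on its way to infinity at t = 1; accelerating progress of the first kind, however dramatic, never produces a true mathematical singularity.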

Because superhuman intelligences would, by definition, be smarter than any human, our predictions of what they could accomplish with a given amount of time, matter, or energy are unlikely to be accurate. A superhuman intelligence might be able to fashion a functioning supercomputer out of cheap and readily available components, or develop full-fledged nanotechnology with nothing but an atomic force microscope. Because its ability to design and manufacture technology would rapidly surpass the best efforts of human engineers, a superhuman intelligence could very well be the last invention humanity ever needs to make. Due to their superhuman genius and the technologies they could rapidly develop, the actions of intelligences emerging from a Technological Singularity could result in either the extinction or the liberation of our entire species, depending on the attitudes of the most powerful superhuman intelligences toward human beings.

Oxford philosopher Nick Bostrom, director of the Future of Humanity Institute at Oxford and co-founder of the World Transhumanist Association, argues that the way superhuman intelligences treat humans will depend on their initial motivations at the moment of their creation. A kind superhuman intelligence would, in wanting to preserve its kindness, beget kind (or kinder) versions of itself as the self-improvement spiral continued. The result could be a paradise in which superhuman intelligences solve the world's problems and offer consensual intelligence enhancement to human beings. On the other hand, a malicious or indifferent superhuman intelligence would be likely to produce more of the same, resulting in our accidental or deliberate destruction. For these reasons, the Technological Singularity might be the single most important milestone our species will ever confront.

Several paths to superhuman intelligence have been proposed by Singularity analysts and advocates. The first is IA, or Intelligence Amplification, taking an existing human and transforming her into a nonhuman being through neurosurgery, brain-computer interfacing, or perhaps even brain-brain interfacing. The other is AI, or Artificial Intelligence, the creation of a dynamic cognitive system surpassing humans in its ability to form theories and manipulate reality. When either of these technologies will reach the threshold level of sophistication necessary to produce superhuman intelligence is uncertain, but a variety of experts, including Bostrom, cite dates within the 2010-2030 range as likely.

Because the Singularity may be nearer than many would assume, and because the initial motivations of the first superhuman intelligence may determine the fate of our human species, some philosopher-activists ("Singularitarians") view the Singularity not only as a topic for speculation and discussion, but as a practical engineering goal that meaningful progress can be made towards in the present day. Thus, in 2000 the Singularity Institute for Artificial Intelligence was founded by Eliezer Yudkowsky to work exclusively towards this goal.

By Michael Anissimov

Michael is a longtime WiseGEEK contributor who specializes in topics relating to paleontology, physics, biology, astronomy, chemistry, and futurism. In addition to being an avid blogger, Michael is particularly passionate about stem cell research, regenerative medicine, and life extension therapies. He has also worked for the Methuselah Foundation, the Singularity Institute for Artificial Intelligence, and the Lifeboat Foundation.

Discussion Comments

By anon140018 — On Jan 06, 2011

So basically the technological singularity would be the inability to predict whether or not this superhuman intelligence would work alongside humans, take measures to control us because it thinks it knows better or become a real-life Skynet? This makes sense.

If AI evolves to the point where it no longer needs to be programmed by humans to tell it what to do, there's no knowing what will happen after that.

By anon24123 — On Jan 07, 2009

I wonder if the end times will come because an intelligence like that could eventually reach god-like levels and challenge it for control. That'd be rad, haha.
