What does The Singularity mean? Can The Singularity be avoided?
Scientists believe that within the next 15 years, humanity will have the technological means to create superhuman intelligence. Some voices say that, shortly after, the human era will be over. Is this kind of progress avoidable? And if it is not, can events be guided so that we may survive? These questions have been investigated scientifically, and some possible answers (and some further dangers) were presented by Vernor Vinge in one of his articles.
To begin, we should define the concept of “the Singularity”. The acceleration of technological progress has been the central feature of the last century, and its likely culmination is the imminent creation, through technology, of entities with greater-than-human intelligence. Vinge lists several means by which science may achieve this breakthrough:
1. Computers may be developed that are “awake” and superhumanly intelligent. (To date, there has been much controversy as to whether we can create human equivalence in a machine. But if the answer is “yes, we can”, then there is little doubt that more intelligent beings can be constructed shortly thereafter.)
2. Large computer networks (and their associated users) may “wake up” as a superhumanly intelligent entity.
3. Computer/human interfaces may become so intimate that users may reasonably be considered superhumanly intelligent.
4. Biological science may provide means to improve natural human intellect.
The first three possibilities depend mainly on improvements in computer hardware. This is why scientists believe that the creation of something with greater-than-human intelligence will happen before 2030.
Of course, this development will not be without consequences: it is believed that when greater-than-human intelligence drives progress, that progress will be much more rapid, since the progress itself would involve the creation of still more intelligent entities. Developments that were once thought to require “a million years” could well happen within the next century.
Vernor Vinge says that “it’s fair to call this event a singularity. It is a point where our old models must be discarded and a new reality rules. As we move closer to this point, it will loom vaster and vaster over human affairs till the notion becomes a commonplace. Yet when it finally happens it may still be a great surprise and a greater unknown.”
One question that must be answered is how the Singularity will spread across the human world view. For a while yet, the general critics of machine sapience will have good press. After all, until we have hardware as powerful as a human brain, it is probably foolish to think we will be able to create human-equivalent (or greater) intelligence.
A symptom of progress toward the Singularity could be that ideas themselves spread ever faster, and even the most radical quickly become commonplace.
Another question scientists try to answer is: can the Singularity be avoided? The answer seems to be that it may not happen at all, if it is simply impossible. But if the technological Singularity can happen, it will. Even if all the governments of the world were to understand the “threat” and be scared to death of it, progress toward the goal would continue.
There have been fictional stories of laws passed forbidding the construction of “a machine in the likeness of the human mind”. Eric Drexler has provided spectacular insights into how far technical improvement may go. He agrees that superhuman intelligences will be available in the near future and that such entities pose a threat to the human status quo. But Drexler maintains that we can confine such transhuman devices so that their results can be examined and used safely.
Another approach to confinement is to build rules into the mind of the created superhuman entity. But if the Singularity could be neither prevented nor limited, just how bad could the post-human era be? Bad enough, it seems.
Specialists say that the Singularity is an inevitable consequence of humans’ natural competitiveness and the possibilities inherent in technology. And yet humans are the initiators: we have the freedom to establish initial conditions and to make things happen in ways that are less hostile than others.