What Is the Technological Singularity?

The technological singularity, or simply "the singularity," is a multifaceted concept in futurism with several overlapping and sometimes contradictory definitions. The best-known and most influential definition of the singularity was given by Vernor Vinge in his essay "The Coming Technological Singularity." It refers to the point at which superhuman intelligence is created technologically. These superhuman intelligences could then apply their brainpower and expertise to the task of creating additional, or more powerful, superhuman intelligences, producing a snowball effect with consequences beyond our present ability to imagine. Often mentioned alongside the idea of superhuman intelligence in singularity discourse is the concept of accelerating technological change. Some have argued that as the pace of technological progress increases, it will culminate in an asymptote, visually similar to a mathematical singularity. In this sense, the term refers to the advent of superhuman intelligence together with superhuman speeds of thinking (including quickness, the ability to understand and create concepts, turn data into theories, make analogies, be creative, and so on).
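As an illustrative aside (not drawn from Vinge's essay), the mathematical analogy can be made concrete. Exponential growth, however fast, stays finite at every moment in time; hyperbolic growth, by contrast, has a vertical asymptote and diverges at a finite "singularity time":

```latex
% Exponential growth: x(t) = x_0 e^{kt} is finite for every finite t.
% Hyperbolic growth: a vertical asymptote at a finite time t_s
% (symbols C, t_s are illustrative, not from the source text):
x(t) = \frac{C}{t_s - t}, \qquad \lim_{t \to t_s^{-}} x(t) = \infty
```

It is this second shape, a curve that blows up at a finite point rather than merely climbing steeply, that the "accelerating change" framing of the singularity invokes.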

Because a superhuman intelligence would, by definition, be smarter than any human, our predictions about what it could do are unlikely to be accurate. A superhuman intelligence might be able to build a functioning supercomputer from cheap, readily available components, or develop full-fledged nanotechnology with nothing but an atomic force microscope. Because a superhuman intelligence's ability to design and manufacture technology would quickly surpass the best efforts of human engineers, a superhuman intelligence could very well be the last invention humanity ever needs to make. Owing to their superhuman genius and the technologies they could rapidly develop, the actions of intelligences emerging from a technological singularity could lead either to the extinction or to the liberation of our entire species, depending on the attitudes of the most powerful superhuman intelligences toward human beings.

Oxford philosopher Nick Bostrom, director of the Oxford Future of Humanity Institute and the World Transhumanist Organization, argues that the way superhuman intelligences treat humans will depend on their initial motivations. A kind superhuman intelligence would tend to beget kind (and smarter) versions of itself as the spiral of self-improvement continued. The result could be a paradise in which superhuman intelligences solve the world's problems and offer consensual enhancement to human beings. On the other hand, a malicious or indifferent superhuman intelligence would be likely to produce more of the same, resulting in our accidental or deliberate destruction. For these reasons, the technological singularity may be the single most important milestone our species will ever confront.

Several paths to superhuman intelligence have been proposed by singularity analysts and advocates. The first is IA, or Intelligence Amplification: taking an existing human and transforming him or her into a more-than-human being through neurosurgery, brain-computer interfacing, or perhaps even brain-brain interfacing. The second is AI, or Artificial Intelligence: creating a dynamic cognitive system that surpasses humans in its ability to form theories and manipulate reality. When either of these technologies will reach the level of sophistication necessary to produce superhuman intelligence is uncertain, but many experts, including Bostrom, cite dates in the range of 2010-2030 as likely.

Because the singularity may be nearer than many would assume, and because the initial motivations of the first superhuman intelligence may determine the fate of our species, some philosophers ("Singularitarians") view the singularity not merely as a topic for speculation and discussion, but as a practical engineering goal toward which meaningful progress can be made in the present day. In 2000, the Singularity Institute for Artificial Intelligence was founded by Eliezer Yudkowsky to work exclusively toward this goal.
