Prevention of AI risks for sustainable development of society and post-humanity researches

Bibliographic details
Authors: Zinchenko, V., Boichenko, M., Popovych, M., Polishchuk, O.
Format: Conference proceedings
Language: English
Subjects:
Online access: Full text
Description
Abstract: Robots are becoming more and more human-like, and people are learning to interact with them, constantly improving them not only externally but also internally. Social networks, where fiction mostly circulates as news from ostensibly reliable sources, have begun to flag controversial content. However, in a situation of probabilistic truth and doublethink, it is easy to replace actual reality with an illusory one: to mark real stories as "fiction" and vice versa. Evolution (both natural-biological and human: organized, social, technological, intellectual, etc.) will necessarily lead to the expansion of thinking, endowed with the rights and status of a new species. A review of research on this issue, which is now coming to the forefront not only in robotics and AI but also in evolutionary genetics, psychology, philosophy, pedagogy, psychiatry, microbiology, anthropology, neurology, and other sciences, shows that the post-human approach is becoming increasingly influential; according to it, if machines (androids, cyborgs, etc.) acquire the ability to feel and empathize, they will no longer be machines. The sustainable development of society depends critically on a proper assessment of the prospects of the cyber-transformation of humans, or the humanization of robots, as a result of the risks of AI embodiment in humans and robots.
ISSN: 0094-243X, 1551-7616
DOI: 10.1063/5.0161321