Imagine an alien (ET) comes to you and offers you knowledge that would satisfy all of humanity's material needs and cure every disease, but as a side effect of all that knowledge and technology, say 1% of the population would gain the ability to exterminate humanity. Would you accept the gift?
Now imagine a second scenario: the alien doesn't ask but tells you that you will be given this knowledge and that you (humanity) have one year to prepare. Apparently, that is their admission test for new civilizations: the unwilling applicant either adapts to the new knowledge and technology or self-destructs.
This is just a thought experiment about the rising power of AI. At some point in the development of AGI, we will be put in a similar position. The picture can be generalized to any major technological advancement: genetic engineering, AGI, and so on. State regulation is the only tool we have against the side effects of too much power in the hands of an ever-growing number of people, and it only partly succeeds at that task today; in the future it will certainly fail. At some point in the not-so-distant future, the only surviving option may be the currently unthinkable one: a world government, in the style Y. N. Harari describes. What do you think our chances of survival are?