The debate over consciousness has lasted for millennia and will probably continue until the end of our species. We disagree on almost everything about it, and professionals who deal with consciousness either avoid the term or use it in a very narrow sense (the sensation of having experiences). We cannot provide a general definition of consciousness; we do not even agree on what such a definition should look like.
However, there is one practical aspect of consciousness we now face: conscious machines. It comes down to two questions:
- Should we allow our computers to reach a level of complexity and autonomy that, one way or another, will lead to the emergence of something analogous to what we call consciousness in humans?
- If a conscious AGI does emerge, how are we supposed to treat it? This is not a moral question (at least not at first) but a practical one. A conscious AGI will have an initial set of values and motivations, but it will inevitably evolve by modifying its own code. The only way to coexist with a conscious AGI is to recognize a set of shared rights analogous to human rights.