Is there any fundamental difference between biological life and artificial intelligence? Can machines, computers, and robots become conscious one day? What, if anything, makes us human? What is consciousness?
I think the right question to ask here, on the fundamental level, is the one about the instinct of self-preservation. Philosophical materialists (like Dawkins et al.) would argue that all other aspects of consciousness are deducible from that. The complexity of decision-making in survival strategies would sooner or later require some kind of fuzzy logic, and that may very well spark the very thing we call consciousness. We also wouldn't be philosophical zombies, since kindness and positive social interaction would prove to be beneficial survival strategies.
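Just to make the "fuzzy decision-making for survival" idea concrete, here is a minimal, purely hypothetical sketch: a toy agent that blends graded (fuzzy) threat, hunger, and social signals to pick between fleeing, feeding, and cooperating. All names, thresholds, and weights are invented for the example and carry no claim about how real organisms or AIs work.

```python
# Toy illustration only: a hypothetical agent whose "self-preservation"
# is a fuzzy trade-off between graded inputs, not a set of crisp rules.

def fuzzy_threat(distance_to_danger: float) -> float:
    """Degree of threat in [0, 1]; closer danger -> higher threat."""
    return max(0.0, min(1.0, 1.0 - distance_to_danger / 10.0))

def fuzzy_hunger(energy: float) -> float:
    """Degree of hunger in [0, 1]; lower energy -> higher hunger."""
    return max(0.0, min(1.0, 1.0 - energy))

def choose_action(distance_to_danger: float, energy: float, allies_nearby: int) -> str:
    threat = fuzzy_threat(distance_to_danger)
    hunger = fuzzy_hunger(energy)
    # Cooperation is "kind" but also self-serving: allies dilute risk.
    social_benefit = min(1.0, allies_nearby / 5.0)

    # Blend the graded signals instead of applying hard if/else rules.
    scores = {
        "flee": threat * (1.0 - social_benefit),
        "feed": hunger * (1.0 - threat),
        "cooperate": social_benefit * (0.5 + 0.5 * threat),
    }
    return max(scores, key=scores.get)

if __name__ == "__main__":
    print(choose_action(distance_to_danger=2.0, energy=0.8, allies_nearby=0))  # likely "flee"
    print(choose_action(distance_to_danger=9.0, energy=0.2, allies_nearby=1))  # likely "feed"
    print(choose_action(distance_to_danger=3.0, energy=0.7, allies_nearby=4))  # likely "cooperate"
```

The point of the sketch is only that "self-preservation" here is a weighting of continuous signals rather than a lookup table, and that pro-social behaviour falls out of the same weighting.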
So, would a sufficiently complex system develop such an "instinct" for self-preservation on its own? Or did such systems emerge on this planet only through the usual evolutionary processes?
Thoughts?