Watching Mamoru Oshii's Ghost in the Shell was very interesting; the film raises some great questions and insights. What I wish to focus on, though, is the meaning and implications of "the ghost."
The basic premise of the movie follows the assumption of the strong AI thesis, namely that all that is required for intelligence and sentience is a sufficiently fast and complex program, and consciousness then emerges. This cuts two ways. First, it means that we humans are nothing special as far as consciousness and intelligence go; we are just one small point on the timeline of evolution. Second, if consciousness really does arise simply out of complexity and speed, then who we think we are is nothing more than a ghost relative to our own machine. When the machine, i.e. the brain, breaks down, we cease to exist.
This is the materialistic premise behind the strong AI thesis: we are nothing more than a running program, and who we think we are, or might be, is nothing more than the sum of that program and its particular memories and experiences.
The movie seems to imply that there is something special about being human. But if the strong AI thesis is correct, then it makes no difference whether we are human, cyborg, android, or computer. All that matters is that consciousness "emerges" in whichever of those four "species" you choose.
The funny thing about the strong AI thesis is that consciousness somehow, and we do not know how, emerges from nonconsciousness. This is very much akin to the evolutionary argument that life emerges from the inanimate, lifeless matter of the universe. It is also just how the movie portrays Project 2501: after enough time and experience, having gained enough complexity and speed, she/he/it spontaneously (miraculously!?) becomes sentient, and we have life.
The question I have started asking myself this semester, since we are focusing on AI in almost everything we read or watch, is what the status of intelligence truly is. That is, is intelligence a necessary and/or sufficient condition for life? I would have to say that it is neither. It is not necessary, since there are instances of plant life and microbes that we would say are alive but lack intelligence. As for whether it is sufficient, when the course started I assumed it was, but now I am not so sure.
The movie stresses that for something to be a life form it needs to evolve and needs to be able to die, which are unusual standards for defining life. Let us leave the biological side of the issue aside for the moment and assume, for the sake of argument, the strong AI hypothesis as portrayed in the movie. I find it hard to imagine that the computer I am typing on now could someday, simply by becoming faster and more complex, actually become alive. It would be much easier to see my computer as intelligent without being alive.
It seems to me now that intelligence does not necessarily make something alive.
Are animals intelligent? If the answer is yes, then they too, like us, must have a soul, or a ghost in the machine. But what if the answer is that they are not intelligent and have no soul or ghost? Then Descartes would be right: they really do not feel pain, anger, or joy; they are just a system of stimulus and response, a purely causal system.
Another issue that the movie does not really chomp down on is "what is intelligence?" We seem to take for granted that whatever counts as intelligence can and must be something we recognize as, if not identical to, at least strongly similar to, our own "intelligence."
Perhaps fetuses or young children do not have their ghost until their brain (i.e. their computer) is sufficiently developed. But we already know that having a brain does not necessarily mean having a "mind," "ghost," or "soul" until that brain gets complex enough, for many creatures have brains yet we do not credit them with either intelligence or a ghost.
How are we then to take Oshii's ideas represented in this movie?
If you happen to be a materialist, an evolutionist, then perhaps you welcome the idea that life is nothing but increasing complexity. But if you happen to hold a theistic point of view, it would be a blow to any importance human values, morals, and ethics might have.
Complexity and design do not happen by accident. Like creates like; like comes from like. Perhaps we can someday achieve the goal of creating life, but I do not think it will happen according to the strong AI hypothesis. Intelligence is neither a necessary nor a sufficient condition for life, but if we ever do figure out how to create life, and not just mimic some of its instantiations, we may then become as gods ourselves. We may then discover what the other tree in the Garden of Eden was really offering us as an option.
Would that be a good thing?