Wednesday, May 2, 2007

The human computer

OK, we all know how I feel about being able to extend life. Other than the fact that it would be completely boring to live that long... well, that's the only real drawback, I guess.

The idea of the Qusp seems to be that the human body (such as it is in the book) has been turned into something of a computer, more akin to an Apple IIe than to what we use now. My understanding is that these bodies cannot function without a Qusp, and the "user" can determine what kind of body he/she/it wants, much as a computer shopper can determine what kind of hardware is necessary to serve his or her purposes.

Ghost in the Shell, anyone?

Bodies seem to be nothing more than computers that can run a program that acts as an intelligent being. Take the Qusp out (or transmit yourself) and the body ceases functioning. Done with that planet? Recycle the body so another can be made. Nothing more than a useless box.

Here's a better question, and I may have missed it in the reading: what if the Qusp were destroyed while it was occupied? What happens to that person? Sure, a local death may cause the loss of a few hours of "life," but what happens when the main essence of the person is destroyed? Could viruses affect it? What about an EMP (assuming there is a way to create such a thing)? I would be more concerned about these things than about living forever.

1 comment:

Josh said...

This is an excellent analysis, and it reaffirms my own personal reaction to the "immortality" presented in the novel. I would further add my own concerns, which have their roots in the comment you (Mike) made. It seems to me that, while the bulk of our discussion in this class was about the definition of humanity, the very idea of a cyborg human is an oxymoron, and maybe even a paradox. This book seems to show that adding technology to a person makes him a machine and removes his humanity. Consider the question of humanity in regard to cyborgs: how could a cyborg be considered human when adding technology to a human (or rather, using technology as a way to sustain human life and intelligence) makes him more like a computer? Throughout the class I have been emphasizing the unique characteristics of the human brain and body--even adding the question of the soul's existence--and I have stated my belief that this could never be reproduced by man. I was referring to the idea of machines being developed to imitate humans, but this book opens it up to humanity being replaced by computers. So, if a computer cannot properly copy human abilities and intelligence, then using a computer to sustain the life of a man does not represent true immortality for humanity, because humanity becomes extinct when computers take over for the brain.

It is also interesting to consider how much this demeans humanity and (as I already stated in class) the value of life. The reason it demeans us is that it claims that the features that make us human--the living organs and, most importantly, the brain--are insignificant and that a computer can replace us. What does this imply? Our lives are meaningless if the body we were born with can be discarded for a program that we develop. Is this not a form of technology extinguishing the human race? The reality is that the idea of the Qusp not only diminishes us by assuming that our bodies are insignificant, but it also implies that the technology we create (with the complex human brain) is better than us! How insulting it would be to say that the computer I am typing this response on might be the ancestor of that which will replace me (or, at least, sustain me).

The book itself doesn't seem to suggest that humanity is defined by memory, and here is the reason why: the book describes memory in computer terms. You "save your progress" and back up your thoughts. These are features of computers, not of the brain. Memories in the brain can be repressed or forgotten, or might fade over the years. They are recalled through thinking, not accessing (the difference being that it might take an effort to remember something, whereas with a computer you can just pull up the saved data). In fact, memory is one example of something the computer can do better than humans. However, that statement is not entirely true: it is harder for the brain to recall data than for a computer to retrieve saved information, but the complexity of the brain allows for the flaws: the forgotten memories, the repressed thoughts, the recognition of an object that you can't place. In other words, flaws define humanity.

So this idea seems very unlikely, and I agree with the point of view you present. The question is, will humans ever make a computer that doesn't crash? And why would someone want immortality by giving up their humanity for the sake of technology? There are so many objections to this future--many of which I presented in class--but one of the main problems is that this is a false way to live forever. As I stated before, I would rather die and live for eternity (as my faith declares) than live forever as a software program. So I guess this clarifies, a little, my other point as well (in the post "The Body - Trash or Treasure?"), but it also builds on your idea of the human computer. I would never want to be reduced to a piece of hardware!

(Wow! This response is longer than the original post!)