Sunday, January 27, 2008

I Wish I Were A Digital I

Susan Barnes includes Minsky's statement that "biological human brain cells could be replaced with computer chips" in her examination of mind versus body (pg 235). Minsky continues by asking, "Would that new machine be the same as you?" Teri posed a similar concern in her response.

I would argue it is. Assuming the chips are made without flaws, and enough knowledge has been gathered about the brain to design them properly, there's no difference between a real person and a person (or cyborg) with one of these computerized brains. In essence, our brains already act like computers, though far more advanced than any current supercomputer. Given enough time, and if Moore's Law holds, there's no reason we won't be able to program a computer with a neural network identical in capability to the human brain.
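The Moore's Law argument above can be put on a back-of-the-envelope footing. The figures below are illustrative assumptions, not measurements: a 2008-era high-end chip holds on the order of 10^9 transistors, the human brain is commonly estimated at roughly 10^14 synapses, and transistor counts double about every two years.

```python
# Back-of-the-envelope sketch: how long until chips reach brain scale,
# assuming Moore's Law (transistor count doubles every ~2 years)?
# Assumed figures (illustrative, not measured):
#   - a 2008-era high-end chip: ~1e9 transistors
#   - the human brain: ~1e14 synapses (a commonly cited rough estimate)

def years_to_reach(start, target, doubling_period_years=2):
    """Count the years of doubling needed for `start` to reach `target`."""
    years = 0
    count = start
    while count < target:
        count *= 2
        years += doubling_period_years
    return years

print(years_to_reach(1e9, 1e14))  # 34 years under these assumptions
```

Of course this says nothing about whether raw transistor count is the right measure of a brain; it only shows that, if the doubling trend continued, the gap in scale would close within a few decades.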

To address Teri's concern, the example isn't arguing that a digital copy of your self from yesterday would match today's self. Rather, think of it like this (and assume sufficient technology exists): suppose one day an evil genius snatches you up and, through the wonders of (post)modern technology, exactly copies your self to a little computer chip and replaces your brain with it. He also erases any memory of his devious little experiment. When you wake up the next morning, would you, or for that matter anyone, be able to tell your self had been digitized? (Ignore the giant scar.) I would argue no.

In Barnes's writing, I get a sense of fear of this future. She seems hesitant to believe that a digitalized version of one's self is equal to the original, biological source. She argues for the necessity of a "real" body to develop one's sense of self.

However, a "real" body is no longer necessary. In a biological setting, the body exists only to provide stimulus for the "mind." In a digital age, a physical body can be replaced by digital bodies created in cyberspace. A brain in a vat would have just as much self as any other "real" person, maybe even more, given that any stimulus is possible.

Now, combine these ideas. I would argue that at some point in the future we will have digital lifeforms, either purely in cyberspace or in robotic form, indistinguishable from "real organic life." They will think, act, and live exactly as we do. They will meet or exceed our own intelligence. There's absolutely nothing stopping this future, nor should there be. At some point the human species will die off, but through our digital creations, our legacy may continue.

Digitally.

5 comments:

Teri Stolarz said...

What's the point of a completely digital existence if they have no true feelings or appreciation for what they are? Maybe they will surpass human intelligence, skill, efficiency, and mortality, but without an appreciation or understanding of livelihood, I cannot feel as though it's a worthwhile existence. They will still just be a computer program. So, what's the point?

Ted Baker said...

That's a good point. I would say that when this technology does come around, digital beings could have true feelings and appreciation for what they are. I mean, we have these abilities only because little electrons are shooting through our brains. All we'd need to do is replicate that same idea (granted, it will take many years), and conceptually an organic brain would think no differently than a digital brain.

The point, of course, is that digital things can be reproduced at little cost. Imagine backing up every experience you've ever had, with the ability to view or even "relive" them whenever you wanted. I think that would be pretty cool.

Of course this opens up the potential for abuse, so we as a society will need to weigh the pros and cons when such technologies actually take shape.

FYI, if you haven't seen it already, check out the movie Blade Runner. In it, robots identical to humans are created and, through experience, learn human emotions. The protagonist, played by Harrison Ford, is a police officer whose job is to "retire" (i.e., kill) such replicants once they start showing human emotion. It explores the issue Barnes brings up in a very cool setting.

Teri Stolarz said...

Thanks! I'll definitely try to check it out. Maybe it will give me a little more "food for thought."

Brian McNamara said...

I am all for a digital life, Ted. Just wait until they figure out how to build quantum computers, which might be able to attack hard search problems such as the traveling salesman problem. (A quantum computer can assign a value of 1, 0, or both to a piece of information, working in the realm of quantum mechanics.)
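The "1, 0, or both" idea can be made concrete with a toy simulation. This is a minimal sketch, not a real quantum computing library: a qubit is modeled as a pair of amplitudes over the basis states |0> and |1>, and a Hadamard gate turns a definite |0> into an equal superposition of both.

```python
import math

# Toy sketch of a single qubit (illustrative, not a real quantum library):
# the state is a 2-entry amplitude vector over the basis states |0> and |1>.

def hadamard(state):
    """Apply a Hadamard gate: sends a definite |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities: the squared magnitude of each amplitude."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)           # definitely |0>
qubit = hadamard(qubit)      # now "both at once"
print(probabilities(qubit))  # ~ (0.5, 0.5): equal chance of measuring 0 or 1
```

Whether that superposition translates into efficient solutions for NP-hard problems like the traveling salesman is, to be fair, still an open question; known quantum algorithms give speedups for some problems, not a blanket shortcut for all hard ones.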

And in response to Teri's comment: isn't real life just as pointless as living in a computer program? Or to put it another way, wouldn't the only difference between cyber and real life be one of perception?

If it were advanced enough, you would be able to "feel" things or "move" them within the digital environment, and it would be just as real as your current perception. While standing, if you don't look down at your feet, you don't really know they're there. It's similar to the philosopher Berkeley, who believed that something exists only as long as it is perceived.

Lance Strate said...

The concept of artificial life has been around for a while, as an alternative to artificial intelligence. AI is a top-down approach; AL is bottom-up, incorporating the idea of evolution--artificial life forms in cyberspace coded to change and evolve in response to their environment.

Interesting point about feelings, but I'd have to ask whether a virus or an amoeba has feelings. Does an insect? I think the problem that comes up is not so much whether computer programs can be considered alive, but that the possibility makes us think about and question what we mean by the term "life," and how we define it. If a virus is alive, would an artificially constructed molecular chain that functions just like a virus be alive?

I do feel that there is some essence that cannot be transferred from a brain to a computer, however. There may be a way to clone a mind technologically as well as biologically, but the original mind is not actually transferred. Consider that if you could create one copy, you could create more than one, and all of them would be you in one sense, and not you in another.