@nerdfiles,
It's a good question, and I think it's illustrated in some robot movies, like the one with Haley Joel Osment (A.I.). One of the most common critiques of that movie is that it's just hard to get wrapped up in a character who you know is a robot. And it was true -- it was a schmaltzy movie that tugged at emotional strings, and yet if the main character had been crushed by a truck, I'm not sure I'd have cared much.
Think about that example, or think about C-3PO's various encounters with disintegration, etc. In A.I. you have a cute kid actor who you know is a real live human outside the movie, playing the character of an innocent, persecuted little robot boy. In other words, the movie is telling a story that could be told about a human -- but we know the character is a robot, and that little bit of information de-invests us emotionally.
That makes me think that our moral sense of humanity and human obligation does not hinge on the brain or the intelligence -- it hinges on a sense of shared identity. This extends, in a sort of dilutional way, to animals (dilutional in that it grows weaker the less we anthropomorphize). This is why parents passionately love their kids even when those kids have severe intellectual disabilities and could never achieve the kind of "mind" a convincing A.I. being would have. This is why we find the Nazi extermination of mentally handicapped children to be a crime against humanity. Humanity consists in the "thing" of us too, not just the "being" of us.