@jeeprs,
jeeprs;77887 wrote:Give it pain. Work out a system so when you whack it, it cries. Better still, make it sensitive to insults - say 'the X5U was much more subtle than you' and it cries. Unbeatable.
That's cracking me up.
More seriously though, the question whether or not a robot could have the ability to feel pain deserves some attention.
First we might have to differentiate between feeling and perception.
A perception is something much more primitive.
The unicellular protist called Euglena does not have a single neuron and thus no ability to feel anything we would consider equivalent to pain.
However it has a primitive organ of perception, something we could consider a precursor of a sense organ.
A particular spot undergoes chemical reactions depending on sunlight, and this reaction automatically triggers a particular pattern of movement in its flagellum, which always makes it head for the light. It does not feel any better when it heads for the light; it is a kind of mechanical process from which this little organism profits, because the energy from the light really has a positive impact on its metabolism.
Obviously, since it doesn't have any feelings, it was evolutionary selection that caused this automatism to appear.
As a human, I like the sun, but I don't just perceive it; seeing the sun is something that makes me feel well (especially in cold old Germany).
Could there be something like a robot that also likes the sun?
I mean that doesn't just perceive it, but really likes it?
If I think of the radio in my car, it shows a kind of autonomous behaviour: when the reception of the radio station gets weaker, it searches for the next better frequency and connects to it.
Of course this is also a kind of perception. The perception is not registered by a consciousness, of course, but neither is Euglena's.
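The radio's behaviour can be sketched as a bare perception-action loop. This is a hypothetical toy model, not how any real tuner firmware works; the threshold, the band limits, and the `signal_strength` stand-in are all made up for illustration. The point is that nothing in the loop evaluates anything as good or bad; like Euglena's eyespot, it just reacts.

```python
# Hypothetical sketch of the radio's perception-action loop.
# 'signal_strength' stands in for real tuner hardware; here we pretend
# there is one strong station at 101.1 MHz.

SEEK_THRESHOLD = 0.3  # arbitrary quality level below which the radio retunes

def signal_strength(freq):
    # toy model: signal falls off linearly with distance from 101.1 MHz
    return max(0.0, 1.0 - abs(freq - 101.1))

def seek(current_freq, band=(87.5, 108.0), step=0.1):
    """Scan the FM band and return the frequency with the strongest signal."""
    best = current_freq
    f = band[0]
    while f <= band[1]:
        if signal_strength(f) > signal_strength(best):
            best = f
        f += step
    return round(best, 1)

freq = 95.0
if signal_strength(freq) < SEEK_THRESHOLD:  # the 'perception'
    freq = seek(freq)                       # the mechanical 'action'
print(freq)  # the radio has retuned to the strongest station
```

There is perception (the measured signal) and action (retuning), but no inner state that registers either one.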
If anyone finds Euglena's perception too primitive to be considered such, we have plenty of stages between Euglena and humans. A slug, for example, is still very primitive, but it definitely has perceptions.
It has evaluating mechanisms telling it whether an input is good or bad (cutting its skin, for example, is bad).
A machine might also have an evaluation system like primitive organisms do.
For example, a solar-driven machine could autonomously register that it needs more sun and, according to this perception, move to a different position.
The question is: will the machine feel bad when it doesn't get enough sun and runs low on power?
If we give the machine an urge to stay charged by any means, will it feel bad when its battery is running low?
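Such an evaluation system plus 'urge' can be sketched in a few lines. Everything here is hypothetical (the class, the thresholds, the light model are invented for illustration); what it shows is that a machine can score its own state as good or bad and act on that score, while nothing in the code gives any reason to think it *feels* bad.

```python
# Hypothetical sketch: a solar machine with a primitive evaluation system.
# It evaluates its own state ('good'/'bad') and acts on the verdict --
# evaluation without any emotion.

LOW_BATTERY = 0.2  # arbitrary threshold

class SolarMachine:
    def __init__(self, battery=1.0, position=0):
        self.battery = battery
        self.position = position

    def sunlight_at(self, position):
        # stand-in for a light sensor; in this toy world,
        # higher positions get more sun
        return min(1.0, 0.1 * position)

    def evaluate(self):
        """Score the current state: 'bad' when charge is low and the spot is dark."""
        if self.battery < LOW_BATTERY and self.sunlight_at(self.position) < 0.5:
            return "bad"
        return "good"

    def act(self):
        # the 'urge': whenever the state is evaluated as bad, move toward the sun
        if self.evaluate() == "bad":
            self.position += 1

m = SolarMachine(battery=0.1, position=0)
while m.evaluate() == "bad":
    m.act()
print(m.position)  # the machine has moved until its spot gets enough sun
```

The machine ends up in the sun for the same reason Euglena does: a rule fires, not a feeling.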
My guess is, it won't have more emotions than Euglena in the first place.
I don't think we can locate any emotions at this stage.
However, what if the needs become more numerous and the different states the machine can reach get more complex?
Let's say it has perceptions as complex as a slug's.
A slug doesn't have emotions, but it shows clear pain symptoms.
At the moment there is not much evidence that machines will develop in a direction that makes them as sensitive as slugs, because there is no need for them to be that sensitive.
But times change, and machines are no longer as purely mechanical as we are used to.
Remember: there is a (relatively) new and increasingly important concept applied to machines that is based on the logical structure of neural networks.
Such systems leave strictly binary logic behind: instead of only 'on' and 'off' positions, their units work with graded, weighted activations.
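The contrast between binary switching and graded response can be shown with a single artificial neuron. This is a minimal textbook sketch, not any particular library's API: a weighted sum squashed through a sigmoid, so the output is a continuous value between 0 and 1 rather than an on/off state.

```python
# A minimal artificial neuron (illustrative sketch, not a real library API).
# Its activation is graded -- any value strictly between 0 and 1 --
# rather than a binary on/off.
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs, squashed through a sigmoid."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

activation = neuron([0.5, 0.8], [1.2, -0.4], 0.1)
print(activation)  # a graded value strictly between 0 and 1
```

It is this graded, distributed style of processing, scaled up to millions of units, that makes the emergent behaviour discussed below hard to predict from the parts.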
Research on swarm intelligence shows the highly emergent potential of information-processing systems.
If you have any background on emergent phenomena, you will agree that we have to expect surprising effects in artificial intelligence.
When I say surprising, I am not talking about being 'surprised at how soon they become humanlike'; I am talking about phenomena we simply don't expect.
The next generation(s) of AI may have unexpected emergent properties.
Even if a machine does not have a consciousness like 'I feel bad', it might develop autonomous reactions: evaluating a condition and making decisions of its own to change that condition.
This will still be completely different from any human feeling, because it takes an organically evolved body to have organic emotions.
We must not forget that emotions are physical in the first place.
For this reason a machine will never have the SAME feeling as we do when we think, for example, 'this stinks'.
But in fact it might have a perception that triggers an urge.
From a logical perspective, though, this will not be so different.
I guess the philosophical branch to read more about this would be functionalism.