
Do you suppose that technology has a limit?

 
 
Terry
 
Reply Tue 5 Feb, 2008 03:34 pm
Noddy, interesting article about lying robots. Do you think that they could feel guilt or remorse for killing their fellow robots, or were they simply executing a program?

For a machine to exercise free will, it would need to be able to do more than assign values to possible consequences of decisions and "choose" the option which yielded the highest number. If it could make a non-optimal choice because it "wanted" to (and I don't mean it simply makes a random selection), then I would agree that it had free will. What could a machine "want"?
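The kind of machine choice described above can be made concrete. Here is a minimal sketch of a decider that assigns values to the possible consequences of its options and "chooses" whichever yields the highest number; the option names and values are invented purely for illustration:

```python
# Hypothetical options and the values a machine might assign to the
# expected consequences of each (all names and numbers invented).
options = {
    "recharge": 0.9,
    "keep_working": 0.6,
    "shut_down": 0.1,
}

def optimizing_choice(valued_options):
    """Pick the option whose assigned value is highest."""
    return max(valued_options, key=valued_options.get)

print(optimizing_choice(options))  # always the top-valued option: "recharge"
```

On the criterion above, free will would require the machine to be able to return something other than the maximum for a reason of its own, not merely by random selection.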
JLNobody
 
Reply Tue 5 Feb, 2008 05:25 pm
Terry, I take it you feel that the existence of free will itself is not problematic, only its existence within machines(?).

BTW, many of us anthropomorphize our animal pets and machines--I know someone who has named his car. In addition, the AI people have virtually unlimited aspirations for thinking-"feeling" machines, and we tend to robotize ourselves. I'm thinking of our military and our factory and office workers. Remember Charlie Chaplin's spoof on the robotization of factory workers (can't remember the title of his classic film)?
Terry
 
Reply Tue 5 Feb, 2008 09:54 pm
JLN, I am inclined to think that free will is possible for human beings, within the limits of biology and social conditioning. Way back in college I took a course in fluidics and recall that a very tiny control force can have a huge effect on flow. Perhaps the conscious energy pattern arising from the sustained 40 hertz firing of the neurons in a neuronal network could effect a slight change in that pattern that would initiate other loops that stimulate nerves that fire up muscles. I am currently reading "Exploring Consciousness" by Rita Carter and may have a better explanation when I finish it.

Any task that has become routine may be seen as robotic. I am not conscious of telling my fingers to move when I type, scratch an itch, or drive (actually I drive on autopilot a lot when I am thinking about other things). It does seem as though there is an attempt to robotize society by standardizing education, jobs and desires. Maybe people feel safer when everything is predictable.
JLNobody
 
Reply Tue 5 Feb, 2008 10:36 pm
Terry, interesting response, but I have come to see the old issue of free will vs. determinism as a false one: it is purely hypothetical, metaphysical, and unresolvable. As the pragmatists might put it, it doesn't matter for our actual lives whether we are totally free or totally determined; it would feel the same either way.
Chumly
 
Reply Tue 5 Feb, 2008 11:13 pm
Terry wrote:
Chumly, I looked at your "Sex with a robot" thread. I cannot imagine how anyone could "love" something they knew was a machine. Some people anthropomorphize machines that they KNOW cannot feel emotions or act deliberately ("loving" your car), and I suppose that someone who did that could project emotions onto a life-like robot. It's probably a guy thing: we women want emotional involvement from the ones we love. But perhaps we all delude ourselves into believing that our love is reciprocated.
Excerpt from "Question" by The Moody Blues:

And when you stop and think about it
You won't believe it's true
That all the love you've been giving
Has all been meant for you
Dedshaw
 
Reply Thu 14 Feb, 2008 07:27 pm
First of all, I have to say that I don't believe any machine or robot of any kind can feel any sort of emotion or pain. They can respond to a human's emotions: I've seen on the Science Channel where robots will read the facial expressions of people and respond the way they were programmed to. It basically reads the face like a fingerprint, I guess, is the easiest way I could put it: it picks certain points on the face and observes where those points move, etc. The scientists were also talking to it in different tones of voice, and the robot responded to the tones but not the actual dialogue, because it read the different sound waves but obviously couldn't understand the words. So I think that a machine can only do things that a human programs it to do, and can't "spread its wings" and do things on its own. Unless there were a virus or some kind of glitch in the system, it couldn't think for itself. Machines are just 1s, 0s, and complex programming, nothing else. Without us they couldn't function. If computers go haywire and act out, someone has to be pulling the strings.

As far as any robot destroying us goes: if I were to believe it was possible, I think we would kill ourselves before that happened. Humans can be very emotional and can throw logic right out the window. Religion is a major problem; that's why there are so many wars. We are pretty much hardwired to destroy ourselves, I think, because out of any group there's always one with a psychological disorder who just thinks differently from the rest, doesn't conform, and acts out against the people he or she grew up with.
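The face-reading described in the post above, picking certain points on the face and seeing where they move, can be sketched roughly as nearest-template matching. The landmark coordinates and expression labels here are invented for illustration, not taken from any real system:

```python
import math

# Hypothetical facial-landmark "templates": (x, y) positions of a few
# tracked points (two eyebrow tips, one mouth point) per expression.
# All coordinates and labels are invented for illustration.
TEMPLATES = {
    "neutral": [(0.30, 0.40), (0.70, 0.40), (0.50, 0.75)],
    "smile":   [(0.30, 0.40), (0.70, 0.40), (0.50, 0.70)],
    "frown":   [(0.28, 0.36), (0.72, 0.36), (0.50, 0.80)],
}

def classify(points):
    """Return the expression whose template lies nearest the observed points."""
    def total_distance(label):
        return sum(math.dist(p, t) for p, t in zip(points, TEMPLATES[label]))
    return min(TEMPLATES, key=total_distance)

# Observed landmarks with the mouth point slightly raised.
observed = [(0.30, 0.40), (0.70, 0.40), (0.50, 0.71)]
print(classify(observed))  # nearest template: "smile"
```

This is exactly the kind of behavior the post describes: the machine responds according to a fixed comparison rule, with no understanding of what the expression means.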
jasonrest
 
Reply Thu 14 Feb, 2008 07:35 pm
Dedshaw wrote:
First of all, I have to say that I don't believe any machine or robot of any kind can feel any sort of emotion or pain. They can respond to a human's emotions: I've seen on the Science Channel where robots will read the facial expressions of people and respond the way they were programmed to. It basically reads the face like a fingerprint, I guess, is the easiest way I could put it: it picks certain points on the face and observes where those points move, etc. The scientists were also talking to it in different tones of voice, and the robot responded to the tones but not the actual dialogue, because it read the different sound waves but obviously couldn't understand the words. So I think that a machine can only do things that a human programs it to do, and can't "spread its wings" and do things on its own. Unless there were a virus or some kind of glitch in the system, it couldn't think for itself. Machines are just 1s, 0s, and complex programming, nothing else. Without us they couldn't function. If computers go haywire and act out, someone has to be pulling the strings.

As far as any robot destroying us goes: if I were to believe it was possible, I think we would kill ourselves before that happened. Humans can be very emotional and can throw logic right out the window. Religion is a major problem; that's why there are so many wars. We are pretty much hardwired to destroy ourselves, I think, because out of any group there's always one with a psychological disorder who just thinks differently from the rest, doesn't conform, and acts out against the people he or she grew up with.


Can a human not program a computer to think for itself?
JLNobody
 
Reply Thu 14 Feb, 2008 08:00 pm
Much, if not ultimately all, of human feeling, thinking, and acting is driven by biological forces. That is not so with thinking machines; they are "motivated" only by software instructions. At most they simulate human action. They only behave AS IF they were driven by human motivations and drives. To me, this "as-if" qualification may make a real difference.
jasonrest
 
Reply Thu 14 Feb, 2008 09:13 pm
JLNobody wrote:
Much, if not ultimately all, human feeling, thinking and acting are driven by biological forces. That is not so with thinking machines; they are "motivated" only by software instructions. At most they simulate human action. They only behave AS IF they were driven by human motivations and drives. To me, this "as-if" qualification may make a real difference.


Indeed.
I guess it all depends on one's definition of what it means to "think".
If you believe the process requires a biological foundation, which it has in humans, then it is much more unlikely that robots will ever "think." Even so, I think it is possible.

If I remember correctly, the previous post did not specify the biological connection. In that case, computers and robots have been "thinking" for a while now.
Chumly
 
Reply Fri 15 Feb, 2008 07:39 am
Quote:
In computer programming, duck typing is a style of dynamic typing in which an object's current set of methods and properties determines the valid semantics, rather than its inheritance from a particular class. The name of the concept refers to the duck test, attributed to James Whitcomb Riley, which may be phrased as follows:

If it walks like a duck and quacks like a duck, I would call it a duck.
In duck typing one is concerned with just those aspects of an object that are used, rather than with the type of the object itself.
http://en.wikipedia.org/wiki/Duck_typing
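A minimal Python illustration of the quoted definition; the class and method names are invented for the example. The function below never checks what type its argument is, only that it can quack:

```python
class Duck:
    def quack(self):
        return "Quack!"

class Person:
    def quack(self):
        return "I'm quacking like a duck!"

def make_it_quack(thing):
    # No isinstance() check: any object with a quack() method is accepted,
    # regardless of its class or inheritance.
    return thing.quack()

print(make_it_quack(Duck()))    # Quack!
print(make_it_quack(Person()))  # I'm quacking like a duck!
```

As in the duck test, `make_it_quack` is concerned only with the one aspect of the object it uses, not with what the object "really" is.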
testy
 
Reply Thu 13 Mar, 2008 01:11 pm
Technology only has a limit if god made a certain set of rules about the universe; then those rules can only be explored so far. If god changes the rules endlessly, then there is no limit.
jasonrest
 
Reply Thu 13 Mar, 2008 01:40 pm
testy wrote:
technology only has a limit if god made a certain set of rules about the universe,


Rules? Such as what? Gravity?
Man has defied most earthly limitations.
testy
 
Reply Thu 13 Mar, 2008 01:42 pm
jasonrest wrote:
testy wrote:
technology only has a limit if god made a certain set of rules about the universe,


Rules? such as what? Gravity.
Man has defied most earthly limitations.


By rules I mean what technology is: technology is limited by rules. You cannot break rules because they are rules, so a rule is a limit. Gravity is a rule; if you "break" it, you just haven't yet found "god's" rule, the final rule that you can't break, and that will be the limit of technology.
JLNobody
 
Reply Thu 13 Mar, 2008 05:17 pm
Jasonrest, it also seems to me that "thinking" has a quality that is profoundly "biological": it frequently has properties of open-endedness, ambivalence, and contradiction. Could this ever be so for the "thinking" of machines?
And perhaps most important for the comparison of humans and artificial intelligence is the pervasive role of unconscious processes (often expressed as feeling) paralleling and motivating conscious thought.

Another point: the "rules" or "laws" of Nature should not be taken to be commandments produced by some divine legislator. The "Laws" are no more than observed regularities which "describe" (rather than "explain") HOW Nature works/WHAT it does, not WHY it does as it does.
jasonrest
 
Reply Thu 13 Mar, 2008 06:57 pm
JLNobody wrote:
Jasonrest, it also seems to me that "thinking" has a quality that is profoundly "biological": it frequently has properties of open-endedness, ambivalence, and contradiction. Could this ever be so for the "thinking" of machines?
And perhaps most important for the comparison of humans and artificial intelligence is the pervasive role of unconscious processes (often expressed as feeling) paralleling and motivating conscious thought.

Another point: the "rules" or "laws" of Nature should not be taken to be commandments produced by some divine legislator. The "Laws" are no more than observed regularities which "describe" (rather than "explain") HOW Nature works/WHAT it does, not WHY it does as it does.


I totally agree; I did not inject this talk of rules, though.

Copyright © 2024 MadLab, LLC :: Terms of Service :: Privacy Policy