
What makes us want, and why, and why not computers?

 
 
Reply Thu 10 Jul, 2008 05:26 pm
Why do we want? Is it because we have values? What gives us the ability to want, or to distinguish values?

In relation to a computer, how do we know that a computer doesn't 'want' anything? Obviously it doesn't, but does this imply that 'wanting' relies on consciousness?

If we never had the self-awareness or instinct for survival, would we want anything? Is all virtue derived from instincts in our DNA that indirectly lead us to 'want'?

Note: 'want' = prefer, desire, etc.

I know that I crave a food perhaps because I remember what it tastes like and instinctively decided I like it; I react to the desire for food.

A computer does not 'prefer' or 'rather accept'; a comparison is either true or false, based on positive and negative. Binary: 00100100 = a value

0 (2^0)
0 (2^1)
1 (2^2)
0 (2^3)
0 (2^4)
1 (2^5)
0 (2^6)
0 (2^7) Therefore 00100100 is a value of 4 + 32 = 36.
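Here is a minimal sketch of that place-value arithmetic in Python (the variable names are just illustrative):

[code]
# Convert a bit string to its value by summing powers of two,
# counting positions from the rightmost bit as 2^0.
bits = "00100100"
value = sum(2 ** i for i, b in enumerate(reversed(bits)) if b == "1")
print(value)  # 36, matching the hand calculation above
[/code]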

If the value matches then true, if not then false, right? If two values were tried as inputs, say 30 and 14, there is no sense of how close each of those values is to the value wanted (36). An analog perspective would be able to perceive that 30 is closer to 36 than 14 is, so 30 would be what the computer would 'want' more (except that's not the case). Perhaps, because there is a need for relative comparison, the only possible way to do that is to process 'potential'. What I mean is a knowledge of the future: knowing what potential the values have in the future.

In a digital sense, a potential value is only comparable to the present, meaning the single instance, the single value, against which the input is tested.
In an analog sense, a potential value always has value relative to the exact value needed (36 in this example). In order for this situation to work, the potential value needs the potential of relativity, not just the potential of true and false.
Relativity requires memory, yes, but also the ability to project memory into the future. True and false would only require memory to store the code and act upon it.
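To make the contrast concrete, here is a minimal sketch in Python, assuming 'digital' means exact match and 'analog' means graded closeness (the function names are invented for illustration):

[code]
# Exact-match comparison: every non-matching input is equally "false".
def digital_want(target, candidate):
    return candidate == target

# Graded comparison: candidates are ranked by distance to the target,
# so 30 is preferred over 14 when the target is 36.
def analog_want(target, candidates):
    return min(candidates, key=lambda c: abs(target - c))

print(digital_want(36, 30), digital_want(36, 14))  # False False
print(analog_want(36, [30, 14]))                   # 30
[/code]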

Any insights to help me figure this out?

 
Zetetic11235
 
Reply Thu 10 Jul, 2008 06:09 pm
@Holiday20310401,
Perhaps consider the food a pointer to the data stored in regard to the food and what you have experienced of it. Consider a data bank containing all experienced things, and add the parameter of needing food, signalled either by receptors in the stomach or by the pleasure receptors in the brain. Say that need and pleasure are true, and unpleasant options are false, when both are present. Then enter in the health value and consequences of consumption. It is all maximization based on sense data, to the best organizational ability of the mind. When two people with the same need crave different things, it is due to incomplete information on both parts in regard to maximal benefit. If one craves a food which is unhealthy but stimulates the pleasure center of the brain, the known disadvantages of ingestion may or may not outweigh the advantages, or in this case the compulsion.

Also, you must remember that although neurons are functionally binary, that is, firing or not firing, they are multidirectional and work in n-ary space. A neuron can fire in any or multiple directions to multiple destinations. Human memory affects the path of maximal benefit and makes it less clear. The human mind is no less than one hundred times more complex than the most sophisticated supercomputer, due to the way neurons organize and network.
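As a minimal sketch of that maximization idea in Python (the foods, weights, and scores here are made up for illustration):

[code]
# Each option carries a pleasure score and a health score (made-up numbers).
foods = {
    "salad":     {"pleasure": 0.4, "health": 0.9},
    "chocolate": {"pleasure": 0.9, "health": 0.2},
}

# "Craving" as maximization: weigh pleasure against health consequences.
# Incomplete information would show up here as wrong or missing scores.
def crave(options, pleasure_weight=0.6, health_weight=0.4):
    def benefit(name):
        f = options[name]
        return pleasure_weight * f["pleasure"] + health_weight * f["health"]
    return max(options, key=benefit)

print(crave(foods))  # 'chocolate' with these weights
[/code]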
 
paulhanke
 
Reply Thu 10 Jul, 2008 06:50 pm
@Holiday20310401,
... hmmmmmmmmmm ... my Windows PC hacks up hairballs all the time - does that mean that it doesn't "want" me touching its keyboard in a certain way? ... or am I anthropomorphizing?

Speaking of anthropomorphizing, does a bacterium that has turned to swim up-gradient toward a cellulose source "want" cellulose?

Speaking of bacteria, does a computer simulation of a bacterium that has turned to swim up-gradient toward a computer simulation of a cellulose source "want" cellulose?

Circling back around to the third part of your question, is a computer simulation of consciousness itself conscious?

Getting back to the first part of your question, if we knew enough to construct a computer simulation of consciousness, would we by definition know what makes us want and why?

And at the end of this tortuous circle, have I answered your question? No - but that's not for lack of "wanting" to! :bigsmile:
Holiday20310401
 
Reply Sat 12 Jul, 2008 10:43 am
@paulhanke,
What would make a computer able to want? See, I believe that an analog setup gives us the ability to want from a self-aware perspective.
paulhanke
 
Reply Sat 12 Jul, 2008 07:00 pm
@Holiday20310401,
Holiday20310401 wrote:
What would make a computer able to want? See, I believe that an analog setup gives us the ability to want from a self-aware perspective.


... so then an analogue computer (Analog computer - Wikipedia, the free encyclopedia) has more chance of being self-aware than a digital computer? ... I dunno - maybe I'm short-sighted, but I just have a hard time attributing something as alive as "self awareness" to something as material as the computing substrate ... I'm more of an "organization" and "process" thinker, myself ... to date, things engineered by humans are too organizationally and processually simple - too linear - to ever be considered alive, let alone self-aware ... that isn't to say that all things created by humans are that way, too - after all, isn't human culture a human creation?
mashiaj
 
Reply Tue 22 Jul, 2008 07:15 pm
@paulhanke,
If computers evolved from natural selection, maybe.
 
YoungButWise
 
Reply Tue 22 Jul, 2008 08:32 pm
@Holiday20310401,
We human beings want and choose things because of one thing: greed, one of the seven deadly sins. Every human is born to want things, to desire. Computers are not capable because they do not have a brain or animal instinct. Therefore, because of our animal instinct we have greed, and therefore we want and choose rather than decide.
Holiday20310401
 
Reply Tue 22 Jul, 2008 11:34 pm
@YoungButWise,
But could you not hardwire a computer to simulate the emotions of greed? I think what we are getting at is intentions, and how a computer has no connection between intentions and self-benefit.

When cognating the intentions, there is no input about what matters to its self, which in turn relies on a self-awareness, which relies upon emotions, which can cause greed. The only input is the binary; the only information taken into account is the information being processed in the instant of processing. The information being processed does not link to any other information in a deterministic way, defined through the being itself. That sort of thing is programmed.
dominant monad
 
Reply Wed 23 Jul, 2008 03:46 am
@Holiday20310401,
Nice posts... I think you're hitting on it, Holiday: intentions require a sense of self, and computers lack that sense of self. It's my personal belief that a human's sense of self is also just an illusion, made up of complex neurological processes, giving the illusion of 'self' vs. 'other'. I believe that babies are not born like this (can you remember wanting something at the age of 2 weeks?), and that as we grow up, we develop a sense of self over and above our base reactionary needs.

Anyway, Holiday, have you looked into machine-state functionalism? There are lots and lots of arguments that the mind is or isn't like a computer. The biggest problem for machine-state functionalism (and functionalism in general) is called 'liberalism' (as the word is normally used in Phil of Mind). That is, if you say that the mind is like a machine, then you also say that machines have a mind. Does your toaster also have a mind? If not (like we already said above), where do you draw the line? The other big problem is that of 'qualia'. Does a computer feel pain? Can that subjective, qualitative feeling of being alive translate to a deterministic machine-mind?
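For what it's worth, here is a minimal sketch of the 'machine table' idea behind machine-state functionalism, in Python with made-up states and inputs:

[code]
# A toy "machine table": (state, input) -> (output, next state).
# Machine-state functionalism says mental states are like these states:
# defined entirely by their causal role, not by what implements them.
table = {
    ("content", "hunger signal"): ("seek food",    "wanting"),
    ("wanting", "no food"):       ("keep seeking", "wanting"),
    ("wanting", "food found"):    ("eat",          "content"),
}

state = "content"
for stimulus in ["hunger signal", "no food", "food found"]:
    output, state = table[(state, stimulus)]
    print(stimulus, "->", output, "| now", state)
[/code]

The 'liberalism' worry is exactly that a table like this could be realized by a toaster, a thermostat, or anything else with the right state transitions.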

Your original question, 'What makes us want, and why, and why not computers?', is the central problem of the Phil of Mind. What makes us different from the other forms of complex mechanics that we've seen to date? Where is that spark of life? Of consciousness? How does that happen? I don't think any of us can answer that, but it's good to argue about it.

Links you might find useful:
MindPapers: 4.3c. Machine Functionalism
mashiaj
 
Reply Wed 23 Jul, 2008 03:46 pm
@dominant monad,
I think that computers and robots will never be like living beings, because living beings are made in a manner that is beyond thought. Computers work in binary, 0 and 1; our brains work inexactly, like 1.674573553745367354362434. Emotions are inexact; thoughts are exact.
No0ne
 
Reply Wed 23 Jul, 2008 07:02 pm
@mashiaj,
mashiaj wrote:
I think that computers and robots will never be like living beings, because living beings are made in a manner that is beyond thought. Computers work in binary, 0 and 1; our brains work inexactly, like 1.674573553745367354362434. Emotions are inexact; thoughts are exact.


They won't be "like" living beings; they would "be".

"Composed of thought"

Are we not?
 
Doobah47
 
Reply Sat 26 Jul, 2008 05:41 am
@paulhanke,
paulhanke wrote:
Speaking of anthropomorphizing, does a bacterium that has turned to swim up-gradient toward a cellulose source "want" cellulose?


Is the notion 'want' simply an embellishment of the sense of purpose tiny things tend to permanently have? Have animals begun to evolve and reject this consistent revolution of purpose that carries the universe? The sloth or the unemployed TV addict - where is the purpose? Are we merely conglomerations of atoms having lots of fun with specified purpose, even capable of igniting metaphysical ideas as a purpose?

I would say to the first query that purpose is desirable, even necessary, yet for humans in society a purpose is often a great pleasure; although our purpose is essentially to exist, humans have managed to invent secondary purposes, like delusions of success among peers; however the lack of secondary purpose often defeats the ego into believing that the primary purpose is not worth the protein it was made from. Could one say that desire for hierarchical movement as a purpose is common among all things, even the very small things? So when something is conditioned, its position is subject to change; it could transform the impression it makes upon other things. Do atoms have ambitions? Or perceptive abilities? Surely they do; logically speaking they should be conscious, so perception is not out of the question.

Randomization of language might confuse the purpose, but deciphering the puzzle would be a purpose in itself. Sounds like a riddle. A fiddle! No it's horrible. Or incorrigible. Is that the end of the doodle?

Purpose it is! Everything has a purpose; people would appear to have many very confusing purposes - evolution has blessed people with ideals disguised by confusion - and the smaller entities could also be utterly confused. Perhaps the bacterium is in a state of fusion, compelled by its sense of purpose to fulfill the criteria arranged for it - of course there's the reward when it finds the cellulose. So is reward the reason for purpose, or is reward a phenomenon that entities have succumbed or evolved to desire? In other words, have the rewards always been at the pinnacle of a purpose, the entities becoming accustomed and addicted to the rewards, yet before the evolution there was indifference between the entity and its reward?
paulhanke
 
Reply Sat 26 Jul, 2008 08:32 am
@Doobah47,
Doobah47 wrote:
I would say to the first query that purpose is desirable, even necessary, yet for humans in society a purpose is often a great pleasure; although our purpose is essentially to exist, humans have managed to invent secondary purposes, like delusions of success among peers; however the lack of secondary purpose often defeats the ego into believing that the primary purpose is not worth the protein it was made from.


... I read in a book once that the contemporary denigration of teleology is misplaced - that Aristotle was right in the first place ... there is a Final Cause - life is its own purpose ...

Doobah47 wrote:
Do atoms have ambitions? Or perceptive abilities? Surely they do; logically speaking they should be conscious, so perception is not out of the question.


... interesting question ... is life matter or process? ... if only the latter, can non-life have ambitions?
dominant monad
 
Reply Sat 26 Jul, 2008 09:46 am
@paulhanke,
I don't think something can have ambitions unless it's sentient and can represent those ambitions to itself in an abstract way. Otherwise it's simply reaction, or carrying out what has been built into it by evolution.
 
boagie
 
Reply Sat 26 Jul, 2008 09:58 am
@mashiaj,
As for computers not wanting, is it not the lack of innate need? Anything that wants has a biological innate need; even desires serve this master of innate need.


"I don't think something can have ambitions unless it's sentient and can represent those ambitions to itself in an abstract way. Otherwise it's simply reaction, or carrying out what has been built into it by evolution."

Dominant Monad,

Why would the formation of abstract concepts constitute the ability to take action? These abstractions are still caused, and their formation is reaction; the following behaviour, whatever that might be, would still be reaction.
dominant monad
 
Reply Sat 26 Jul, 2008 10:19 am
@boagie,
Quote:
Why would the formation of abstract concepts constitute the ability to take action? These abstractions are still caused, and their formation is reaction; the following behaviour, whatever that might be, would still be reaction.


I didn't say specifically that; I was referring to "ambition" as someone above used it. "The ability to take action" is different from "ambition". As I understand it, ambition implies a desire to improve one's 'self', and you can't have ambition unless you first recognise yourself as your 'self'. Simply taking action can be reactionary, and can occur without a sense of self.

But if your point is about determinism, and that abstract thoughts themselves are caused and therefore count simply as reactions, then I agree, as I think determinism is nearly inescapable if physicalism is true, although that's not what I meant in the sentence above.
Holiday20310401
 
Reply Sat 26 Jul, 2008 12:04 pm
@dominant monad,
And then what causes ambition is what causes the self. Still back to square one. I'm still convinced that analog is the way neurons "link", as that gives the complexity for consciousness to arise.

Also, it has to do with processing, which perhaps relies on the analog. A computer will process information, 0s and 1s, as they flow through, but nothing else is processed outside of that. And 0s and 1s are only the input; at the instant of an input there can't be an output generated at the same time, I think.

As humans we process what we perceive, and perhaps we develop a sense of self, ambition, and consciousness through the gathering of memory, creating relative instances that can be compared to the perception being processed. When we look at a knife we give it potential, not by categorizing its properties like a computer; rather, we relate it to the uses that have benefited us in memory, thus defining the knife. It's like relative instances, with the ability to relate back to memory already processed, and to the experience.

A computer experiences only the information coming in to be processed, unable to distinguish between relative experiences. It is programmed with which experience has what potential. It has no emotion to give "introspectral substantiative potentials". lol.
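A minimal sketch of those 'relative instances' in Python; the remembered objects, feature values, and benefits are invented for illustration:

[code]
# Memory as relative instances: each remembered thing has feature values
# (made-up numbers for sharpness and size) and a remembered benefit.
memory = {
    "knife": {"features": (0.9, 0.1), "benefit": 0.8},
    "axe":   {"features": (0.8, 0.7), "benefit": 0.5},
}

# Define a new perception by its distance to processed memories,
# rather than by an exact categorical match.
def relate(features):
    def distance(name):
        stored = memory[name]["features"]
        return sum((a - b) ** 2 for a, b in zip(stored, features))
    nearest = min(memory, key=distance)
    return nearest, memory[nearest]["benefit"]

# A new sharp, smallish object inherits the knife's remembered potential.
print(relate((0.85, 0.2)))  # ('knife', 0.8)
[/code]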
 
Binyamin Tsadik
 
Reply Thu 16 Oct, 2008 04:44 pm
@Holiday20310401,
Computers don't want because they don't have any needs.
Holiday20310401
 
Reply Thu 16 Oct, 2008 06:32 pm
@Binyamin Tsadik,
That doesn't make sense to me. Computers need to be able to function, just as we do. If computers could recognize each other's needs, then they would understand the need to function themselves. So knowledge of the needs of the self is intrinsic to the experiences one has with others who are the same. We cannot simply derive it from ourselves, I guess, and then expect wants or desires to arise like that.

This is why I think the mind works analogically. Tricky!!!
 
paulhanke
 
Reply Thu 16 Oct, 2008 06:37 pm
@Binyamin Tsadik,
... is it so much that they don't have needs, or is it that they have needs but their software is completely unaware of those needs? ... there is a thrust area in the information technology industry to develop robust computer systems that not only address typical functions that directly support human activity ... they are also aware of themselves and their environments, and have control over themselves to the extent that they can be self-regulating, self-healing, self-etc. ... the buzzphrase for such systems is "autonomic computing", obviously inspired by the human autonomic nervous system ... given that such computers are aware of their needs and are capable of taking steps to address those needs, can they in any primitive sense of the word "want"?
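A minimal sketch of that kind of self-regulating loop in Python; the sensors, thresholds, and actions are invented for illustration, not taken from any real autonomic-computing system:

[code]
# An "autonomic" loop: the system watches its own state and acts on needs.
state = {"temperature": 80, "free_memory": 120}

# Needs the software is aware of: (sensor, unmet-test, corrective action).
needs = [
    ("temperature", lambda v: v > 70,  "throttle down to cool off"),
    ("free_memory", lambda v: v < 100, "flush caches"),
]

def self_regulate(state):
    for sensor, is_unmet, action in needs:
        if is_unmet(state[sensor]):
            print(f"need detected ({sensor}={state[sensor]}): {action}")

self_regulate(state)  # reports the temperature need; memory is fine
[/code]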
 
