Argument against empiricist approach towards mind

 
 
Reply Wed 25 Feb, 2009 08:32 pm
The AI researcher John McCarthy said (seriously) that his thermostat has three 'beliefs': 'too cold', 'too hot', and 'just right'. Many in AI assert that a sufficiently complex computer with appropriate software would have a mind not unlike our own. The claim they seem to be making is that the more complex the calculating device, the more conscious it is. Furthermore, the definition of what can be considered a calculating device is, for most strong AI believers, very broad. I hope I'm not missing the point here, because these premises give rise to some ridiculous conclusions. I'll use them here:


  1. The more complex a calculating device, the more conscious it is.
  2. A thermostat can be considered a calculating device whose input is the room temperature and whose output is a signal that turns the heating unit on or off.
  3. A thermostat is not a very complex calculating device.

Therefore, a thermostat is only slightly conscious.
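To make the premises concrete, here is a minimal sketch of the thermostat as a 'calculating device' with McCarthy's three 'beliefs'. The setpoint and tolerance values are my own illustrative assumptions:

```python
# McCarthy's thermostat as a trivial calculating device.
# The setpoint and tolerance are made-up illustrative values.
def thermostat_belief(temp_c, setpoint=20.0, tolerance=1.0):
    """Map a room temperature to one of the three 'beliefs'."""
    if temp_c < setpoint - tolerance:
        return "too cold"   # heater should turn on
    if temp_c > setpoint + tolerance:
        return "too hot"    # heater should turn off
    return "just right"     # leave the heater as it is
```

Whatever else one says about it, the entire 'mental life' of this device fits in a handful of lines, which is rather the point.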

So far so good. We haven't arrived at anything ludicrous yet. Let's try with the human brain.


  1. The more complex a calculating device, the more conscious it is.
  2. A human brain can be considered a calculating device whose inputs and outputs are both nervous impulses ... a little more complex in this case, since you can consider the human brain to be a bit like a huge sequential circuit, in that its own state is part of its inputs. (If you want a more AI-ish definition, the human brain is a calculating device whose inputs are the senses and whose outputs are plans for rational action. In either case, it can be agreed that the human brain is a kind of calculating device.)
  3. The human brain is a relatively complex calculating device.

Therefore, a human brain is vastly more conscious than even the best of our computers.
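Premise 2's sequential-circuit view, where the device's own state is part of its input, can be sketched like this (the transition rule is a made-up toy, not a model of any actual brain):

```python
# A toy sequential circuit: the output depends on the current inputs AND
# the device's own state, and the next state is fed back on each tick.
def step(state, inputs):
    """One clock tick: (state, inputs) -> (next_state, outputs)."""
    next_state = (state + sum(inputs)) % 16  # toy 4-bit state register
    outputs = [next_state % 2]               # toy output: low bit of state
    return next_state, outputs

state = 0
for inputs in ([1, 2], [3, 4], [5, 6]):
    state, outputs = step(state, inputs)
```

The same inputs can produce different outputs depending on the stored state, which is the sense in which the brain's own state is 'part of its inputs'.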

OK, nothing ridiculous yet. One more time:


  1. The more complex a calculating device, the more conscious it is.
  2. The Earth can be considered a calculating device whose inputs are its current state and external variables like sunlight, meteoroids, etc. and whose output is its next state.
  3. The Earth is relatively complex, even compared to the brain. For a useful benchmark, there are no individual computers or cluster installations that can fully and accurately simulate it. Even the NEC Earth Simulator is only scratching the surface ... literally. The computational power necessary to provide a detailed simulation of the Earth would be massive.

Therefore, the Earth itself is many, many times more conscious than any human.

You might object that it's ridiculous for me to consider the Earth a calculating device, but as long as we're considering thermostats to have thoughts it isn't a stretch. With the kind of definitions advanced by strong AI believers (hope I'm not misrepresenting anyone), any physical object can be considered a 'calculating device' of sorts if you assign the right inputs and outputs. Calculating devices can be powered with gears, electromechanical switches, solid-state devices, flowing water, light (in fiber optic cables), presumably quantum particles, hell, even DNA, why not rocks, lava and water too?
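The 'any physical object is a calculating device' move can be sketched directly: pick a state, assign inputs, pick a transition rule, and you're done. The 'rock' below and its rule are, of course, entirely made up:

```python
# Under a loose enough definition, any state-transition system counts as
# a 'calculating device': assign it a state, inputs, and a transition rule.
class CalculatingDevice:
    def __init__(self, transition, state):
        self.transition = transition  # (state, inputs) -> next_state
        self.state = state

    def step(self, inputs):
        self.state = self.transition(self.state, inputs)
        return self.state

# A 'rock' that warms by one degree in sunlight and cools by one otherwise:
rock = CalculatingDevice(lambda temp, sunlit: temp + (1 if sunlit else -1), 10)
rock.step(True)  # the rock 'computes' its new temperature
```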

Everything is conscious to some extent, if you accept this all-inclusive degrees-of-consciousness approach. The whole Universe and all its constituent parts ...

I was always under the impression that strong AI believers, Singularitarians, etc. typically prided themselves on not believing in new-age claptrap, but, come on, Mother Gaia?!
Type: Discussion • Score: 1 • Views: 1,954 • Replies: 31

 
Aedes
 
  1  
Reply Wed 25 Feb, 2009 08:53 pm
@odenskrigare,
Does Buffalo have greater consciousness than Ithaca?

Or does it merely have a greater number of conscious units?
Kielicious
 
  1  
Reply Thu 26 Feb, 2009 12:18 am
@Aedes,
Even though I think the AI researchers certainly have their work cut out for them, and I wonder if their efforts will ever come to fruition, I'm still optimistic. However, it's hard to say exactly what the minimal requirements for consciousness are. Obviously it's not a black-and-white issue, especially as of now. But then again, when talking about life itself, it's hard to say when a system has gone from the chemical stage (so to speak) to the biological stage. Once again, it's shades of gray, not black and white, and I think that is the keystone.

Chalmers says roughly the same thing about consciousness that you said about McCarthy above: he too thinks that a thermostat could be somewhat conscious, on a very minute scale, and that consciousness is fundamental to reality. But I find it all very hard to believe. I'm not saying he or they are wrong, but I tend to think that the mind arises from the brain, not from my liver or big toe or thermostat. Then again, when I think about whether some computers are conscious... it depends... this is when it starts getting tricky. :perplexed:
odenskrigare
 
  1  
Reply Thu 26 Feb, 2009 08:31 am
@Kielicious,
Aedes wrote:
Does Buffalo have greater consciousness than Ithaca?

Or does it merely have a greater number of conscious units?


Consider Buffalo a cluster of humans. It has greater consciousness because of the higher population.

lol

Kielicious wrote:
Even though I think the AI researchers certainly have their work cut out for them, and I wonder if their efforts will ever come to fruition, I'm still optimistic. However, it's hard to say exactly what the minimal requirements for consciousness are. Obviously it's not a black-and-white issue, especially as of now. But then again, when talking about life itself, it's hard to say when a system has gone from the chemical stage (so to speak) to the biological stage. Once again, it's shades of gray, not black and white, and I think that is the keystone.


If we want to reduce consciousness to strictly physical science, it's going to have to be black and white, isn't it?

I considered going into AI research, and specifically neural network research, before economics happened to me. But many of them are busy chasing some unattainable grail, and I don't want any part in that.

Kielicious wrote:
Chalmers says roughly the same thing about consciousness that you said about McCarthy above: he too thinks that a thermostat could be somewhat conscious, on a very minute scale, and that consciousness is fundamental to reality. But I find it all very hard to believe. I'm not saying he or they are wrong, but I tend to think that the mind arises from the brain, not from my liver or big toe or thermostat. Then again, when I think about whether some computers are conscious... it depends... this is when it starts getting tricky. :perplexed:


There are many possible answers. The right one is one no one has probably even considered before. I am saying, though, that this one is wrong.
boagie
 
  1  
Reply Thu 26 Feb, 2009 11:53 am
@odenskrigare,
Hi All!Smile

Is not consciousness reaction to, evaluation of, and the formation of intent, whether intent to react or intent to avoid? In AI the missing element seems to be consciousness of internal need, or rather having internal need; indeed, that is what is missing in the machine/computer. Internal need is what motivation is: internal need or internal want, an internal realization. Is not the response of the thermostat a response to chemistry, a change in the state of the mercury triggering function, a two-bit process which can go either way, on or off, depending on a state changed by temperature? And is not the ability to perceive an outward state, which thus produces an inner need/want, the seed of reactive consciousness?
odenskrigare
 
  1  
Reply Thu 26 Feb, 2009 02:15 pm
@boagie,
Utility (which can be used to explain wants/needs) can be expressed mathematically. What's to stop a computer from using a utility function?

Here I'm playing the Devil's advocate btw
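A utility function in this sense is just a number attached to each (state, action) pair, and nothing stops a program from maximizing one. A minimal devil's-advocate sketch, where the states, actions, and utility values are all invented:

```python
# An agent that 'wants' things only in the sense that it maximizes a
# numeric utility function. All values here are invented for illustration.
def choose_action(state, actions, utility):
    """Pick the action with the highest utility in the current state."""
    return max(actions, key=lambda a: utility(state, a))

def utility(state, action):
    table = {("hungry", "eat"): 10, ("hungry", "rest"): 2,
             ("sated", "eat"): 1, ("sated", "rest"): 8}
    return table[(state, action)]

choice = choose_action("hungry", ["eat", "rest"], utility)  # -> "eat"
```

Whether maximizing such a number amounts to actually wanting anything is, of course, exactly what's in dispute.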
Kielicious
 
  1  
Reply Thu 26 Feb, 2009 05:41 pm
@odenskrigare,
odenskrigare wrote:
If we want to reduce consciousness to strictly physical science, it's going to have to be black and white, isn't it?

I considered going into AI research, and specifically neural network research, before economics happened to me. But many of them are busy chasing some unattainable grail, and I don't want any part in that.


Well, it's quite hard to say if it is a black-and-white issue. We have no idea what the minimal requirements are for C to happen, and when we look at brain structures it's not the size of the brain that determines how conscious a given being is but rather the complexity. Whale brains are massive compared to human brains, but they aren't nearly as complex as ours. Similarly, when we go down the animal scale of what is or isn't conscious we find problems... we seem to think dogs and cats are conscious, but what about jellyfish? They don't even have a brain or a CNS... It's hardly a black-and-white issue when dealing with C, especially in invertebrates. Maybe someday we can quantify C and put an end to this problem, but for now it's quite perplexing.

odenskrigare wrote:
There are many possible answers. The right one is one no one has probably even considered before. I am saying, though, that this one is wrong.


And you are probably right...
0 Replies
 
odenskrigare
 
  1  
Reply Thu 26 Feb, 2009 05:54 pm
@odenskrigare,
lol, that reminds me of:

Are Whales Smarter Than We Are?: Scientific American Blog

I don't think there will ever be an equation to define 'C'. I'm not into tilting at windmills.
0 Replies
 
KaseiJin
 
  1  
Reply Thu 26 Feb, 2009 08:11 pm
@odenskrigare,
An interesting proposition, and a fun discussion on a very career-delimiting (for those in neuroscience) topic: the magic C. While I, myself, am not decided on some of the finer points, I have come to see a possible need for re-Englishing the terminology used by those in the field.

It has been pointed out by others that it's a bit of a disappointment that we usually have only a negative term for the activity of brain/ganglion that is active at levels below what we usually think of when we hear, or use, the word consciousness--viz. unconscious.① That could lead one to think, 'well, brain② being unconscious sounds kind of like dead brain . . . ,' but that shouldn't be the case, one could respond, because if the neurons and certain glial cells are alive, they are going to be conscious. [note: this is not consciousness, simply conscious]

Therefore the 're-Englishing' that I would propose (and I would hope that some others in the field will eventually accept this type of thinking, at least) would be describing the term 'conscious' as the activity of, mostly, neuronal cells--which would more accurately imply that conscious is a continuum of activity levels which eventually reach a level we would call 'consciousness.'

I tend to take a pragmatic stance, and so wouldn't wander off into areas which have little explanatory leverage in day-to-day, year-to-year life. While I can see the position that some in AI would thus be presenting, as you have cleverly shown, odenskrigare, I wouldn't see any need for it. In other words, we need not think of a mass of rock itself, nor a scoop of dirt itself, nor a waterwheel itself, nor a thermostat itself, as a living entity, and thus need not consider those individual entities as conscious in nature. I wonder if that might not have some bearing on the position you are humorously elucidating? I mean, the distinction between a living entity and a non-living entity in a practical playing field?




① It is good to keep in mind that consciousness is the noun form derived from conscious.

② Again, I am using this in collective non-count form, so, for example, V5 of the visual cortex in the occipital lobe is brain just as much as Broca's Area or the amygdala is.
0 Replies
 
boagie
 
  1  
Reply Thu 26 Feb, 2009 10:22 pm
@odenskrigare,
odenskrigare:)

Utility does not quite make it, in the sense that I am thinking in, for in the biological system the essence of conscious response is in the absence of what is needed to continue in being, or to satisfy/slake the longing, the need, similar in some sense to addiction. Utility speaks of ends; need provides the means.
odenskrigare
 
  1  
Reply Thu 26 Feb, 2009 11:23 pm
@boagie,
KaseiJin wrote:
I tend to take a pragmatic stance, and so wouldn't wander off into areas which have little explanatory leverage in day-to-day, year-to-year life. While I can see the position that some in AI would thus be presenting, as you have cleverly shown, odenskrigare, I wouldn't see any need for it. In other words, we need not think of a mass of rock itself, nor a scoop of dirt itself, nor a waterwheel itself, nor a thermostat itself, as a living entity, and thus need not consider those individual entities as conscious in nature. I wonder if that might not have some bearing on the position you are humorously elucidating? I mean, the distinction between a living entity and a non-living entity in a practical playing field?


If that's the case, and we want to assume the premise I knocked down in the first post, then we need to figure out what physically distinguishes a 'living' entity from a 'non-living' entity wrt having an intellect.

boagie wrote:
odenskrigare:)

Utility does not quite make it, in the sense that I am thinking in, for in the biological system the essence of conscious response is in the absence of what is needed to continue in being, or to satisfy/slake the longing, the need, similar in some sense to addiction. Utility speaks of ends; need provides the means.


Sorry, I don't follow you.
boagie
 
  1  
Reply Thu 26 Feb, 2009 11:34 pm
@odenskrigare,
odenskrigare wrote:
If that's the case, and we want to assume the premise I knocked down in the first post, then we need to figure out what physically distinguishes a 'living' entity from a 'non-living' entity wrt having an intellect. boagie, Sorry, I don't follow you.


odenskigrare,Smile

Well, in the above, as to what distinguishes the living entity from the non-living entity, I would say need.
odenskrigare
 
  1  
Reply Fri 27 Feb, 2009 12:22 am
@boagie,
Define 'need', please.
boagie
 
  1  
Reply Fri 27 Feb, 2009 01:08 am
@odenskrigare,
odenskrigare wrote:
Define 'need', please.


odenskrigare,Smile

The absence of something essential to your being, as in breathing or eating. If one is denied that which is essential, does one not become inanimate? Need is the basis of motivation for all living things.
0 Replies
 
odenskrigare
 
  1  
Reply Fri 27 Feb, 2009 01:12 am
@odenskrigare,
What if a machine has oxygen based metabolism and eats food?
boagie
 
  1  
Reply Fri 27 Feb, 2009 01:17 am
@odenskrigare,
odenskrigare wrote:
What if a machine has oxygen based metabolism and eats food?


odenskrigare,Smile

The question would not be whether a machine has oxygen-based metabolism and eats food; the question would be whether those things are necessary to its continued existence, and whether the machine, in its need, is able to yearn and desire for that which is needed.
0 Replies
 
odenskrigare
 
  1  
Reply Fri 27 Feb, 2009 01:28 am
@odenskrigare,
My answers are:


  1. Yes. (Assume it breaks down if these needs are not met in a timely fashion.)
  2. According to strong AI wankers, yes.
boagie
 
  1  
Reply Fri 27 Feb, 2009 01:37 am
@odenskrigare,
odenskrigare wrote:
My answers are:


  1. Yes. (Assume it breaks down if these needs are not met in a timely fashion.)
  2. According to strong AI wankers, yes.

odenskrigare,Smile

Are you stating that, presently, AI can yearn and/or desire for what is needed, not just be programmed to request it? And does it realize that its needs lie outside itself?
0 Replies
 
odenskrigare
 
  1  
Reply Fri 27 Feb, 2009 01:42 am
@odenskrigare,
I am a believer in weak AI and only in weak AI, but it is certainly possible, even now, for an artificially intelligent agent to provide its own instructions through learning, etc. AIs are not only static rule sets, you know.
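'Providing its own instructions through learning' can be as simple as nudging action preferences toward rewarded outcomes. The sketch below is a hypothetical toy update rule, not a description of any particular AI system; the states, actions, and rewards are invented:

```python
# A toy learning rule: preferences drift toward observed rewards, so the
# agent's 'instructions' come from its experience, not only its programmer.
def learn(policy, history, step_size=0.5):
    """Update (state, action) preferences from (state, action, reward) data."""
    for state, action, reward in history:
        old = policy.get((state, action), 0.0)
        policy[(state, action)] = old + step_size * (reward - old)
    return policy

policy = learn({}, [("low_fuel", "refuel", 1.0),
                    ("low_fuel", "wander", -1.0)])
```

After one pass, the agent prefers refueling when low on fuel, a 'rule' nobody wrote down explicitly.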
boagie
 
  1  
Reply Fri 27 Feb, 2009 01:45 am
@odenskrigare,
odenskrigare wrote:
I am a believer in weak AI and only in weak AI, but it is certainly possible, even now, for an artificially intelligent agent to provide its own instructions through learning, etc. AIs are not only static rule sets, you know.


odenskrigare,Smile

Amazing. I really do not know much on the topic, but it sounds like it is much more advanced than I had previously believed. Thanks for the insight, odenskrigare!! I doubt, however, that the ability to yearn and desire could ever be more than an instruction in AI. For the organism, it is felt in the essence of its being, in the fiber, so to speak.
0 Replies
 
 
