Argument against the empiricist approach to the mind

 
 
Khethil
 
Reply Fri 27 Feb, 2009 03:52 am
@odenskrigare,
I can't speak to this AI researcher or John McCarthy, but just from reading the posts here....

odenskrigare wrote:
... It seems the claim they're making is that, the more complex the calculating device, the more conscious it is...


... this is the problem. It's a fallacy. Through learning various languages I've constructed many programs with astounding complexity that didn't do a thing. Many 'machines' are complex, but degree of complexity doesn't directly correlate with degree of consciousness. Does anyone believe this? You said, "... it seems the more complex...". Does this mean it was put forth as an actual premise, or is it just how you took it?

Removing this, the entire argument falls apart.

Thanks
odenskrigare
 
Reply Fri 27 Feb, 2009 03:56 am
@Khethil,
Khethil wrote:
I can't speak to this AI researcher or John McCarthy, but just from reading the posts here....



... this is the problem. It's a fallacy. Through learning various languages I've constructed many programs with astounding complexity that didn't do a thing.


So?

Consciousness is now based on how useful something is, in the opinion of one individual? I certainly wasn't informed.

Khethil wrote:
Many 'machines' are complex, but degree of complexity doesn't directly correlate to degree of consciousness.


And why not?
 
Khethil
 
Reply Fri 27 Feb, 2009 04:30 am
@Khethil,
odenskrigare wrote:
... It seems the claim they're making is that, the more complex the calculating device, the more conscious it is...
Khethil wrote:
... this is the problem. It's a fallacy.
odenskrigare wrote:
So?


What do you mean, "So?" - it's the entire basis for your argument, which, if you missed it, I happen to agree with. You might want to go back and read my reply again.

odenskrigare wrote:
Consciousness is now based on how useful something is, in the opinion of one individual? I certainly wasn't informed.

I didn't say anything about equating usefulness to... well, anything. Where'd you get this from, another thread?

Khethil wrote:
Many 'machines' are complex, but degree of complexity doesn't directly correlate to degree of consciousness.
odenskrigare wrote:
And why not


Well, this is an interesting question. I've never been a big believer in setting out to prove a negative. Given that, the best way to answer your question would be to ask you to provide some proof that it does. But I try to be a good sport, so let's try this approach:

In no definition or context of which I'm aware does complexity directly correlate with consciousness (here's a definition). Or, I could give examples of complex assemblages that exhibit no measure of consciousness: my house, your tennis shoe, that generator, this light switch. I suppose in a cosmic sense one might posit these things could actually be conscious; but not only is that counterintuitive (and something for which no evidence exists), it's absurd.

But if I may offer a word of advice: go back and read my reply again - this time a little more slowly.

Thanks
odenskrigare
 
Reply Fri 27 Feb, 2009 04:51 am
@Khethil,
Khethil wrote:
What do you mean, "So?" - it's the entire basis for your argument; which If you missed it, I happen to agree with. You might wanna go back and read my reply again.


My "So?" was directed at the fact that you had written complex programs that didn't do anything which were, presumably, not conscious in your opinion.

Khethil wrote:

I didn't say anything about equating usefulness to... well, anything. Where'd you get this from, another thread?


Khethil wrote:
Through learning various languages I've constructed many programs with astounding complexity that didn't do a thing.


Quoted.

Khethil wrote:

Well, this is an interesting question. I've never been a big believer in setting out to prove a negative. Given that, the best way to answer your question would be to ask you to provide some proof that it does. But I try to be a good sport, so let's try this approach: In no definition or context of which I'm aware does complexity directly correlate with consciousness (here's a definition). Or, I could give examples of complex assemblages that exhibit no measure of consciousness: my house, your tennis shoe, that generator, this light switch. I suppose in a cosmic sense one might posit these things could actually be conscious; but not only is that counterintuitive (and something for which no evidence exists), it's absurd.

I'm using the broad, twisted definition of consciousness which strong AI wankers like McCarthy must adopt.

This is the specific definition I am trying to knock down.

Please read about McCarthy's comments on the thermostat:

mccarthy thermostat - Google Search

According to this definition, a thermostat or indeed a planet has "the quality or state of being aware especially of something within oneself" and "the state or fact of being conscious of an external object, state, or fact".

Furthermore, a central tenet of this definition, which is vital to the strong AI/Singularitarian wanker crowd, is that, as hardware and software increase in complexity, so too will their grade of consciousness.
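McCarthy's thermostat example is easy to make concrete. Here is a minimal sketch (my own illustration, not McCarthy's code) of a thermostat as a controller whose entire "belief state" reduces to one comparison against a setpoint - which is exactly why ascribing it awareness under the broad definition looks so strained:

```python
# Minimal thermostat sketch (illustrative only; setpoint/tolerance values
# are arbitrary). The device's whole "belief state" is a single comparison.

def thermostat(temperature, setpoint=20.0, tolerance=0.5):
    """Return the thermostat's 'belief' about the room."""
    if temperature < setpoint - tolerance:
        return "too cold"   # 'believes' the room is too cold -> heat on
    if temperature > setpoint + tolerance:
        return "too hot"    # 'believes' the room is too hot -> heat off
    return "about right"    # within tolerance -> no action

print(thermostat(17.0))  # too cold
print(thermostat(23.0))  # too hot
```

On McCarthy's reading, those three return values are the thermostat's three possible "beliefs"; the strong-AI move is then to claim that stacking up enough such comparisons eventually yields a mind.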
Khethil
 
Reply Fri 27 Feb, 2009 05:03 am
@odenskrigare,
Yeah, I see what you're saying well enough. But I fear we may be on different planets (communicatively). So good luck with this.

Thanks
 
KaseiJin
 
Reply Fri 27 Feb, 2009 08:24 am
@odenskrigare,
odenskrigare;50922 wrote:
If that's the case, and we want to assume the premise I knocked down in the first post, then we need to figure out what physically distinguishes a 'living' entity from a 'non-living' entity wrt having an intellect.


Yes, I would go along with that 'we need to figure out...' bit. I would not be able to offer any clear-cut suggestions, though, unfortunately. I am not convinced that we could ascribe it to 'need' (if we were to look at the 'precursor event'), but perhaps something to do with activity level--I mean, a flake of granite would surely be less active than a microbe, I would think. (But that line of thought would be going off topic, so . . . )
 
Zetherin
 
Reply Fri 27 Feb, 2009 04:08 pm
@odenskrigare,
It appears to me we're only aware of the consciousnesses that our sensory perception can make distinct. That is, we wouldn't classify the Earth as conscious, as we have no way to visualize a single entity - a distinct constituent. Just as we wouldn't say a group of people is more conscious than an individual; each individual has its own consciousness, and they don't appear to be shared. If they were all shared, then "consciousness" would lose its value as a categorization of a single entity -- we could just say everything is one big ball of consciousness (which tells us practically nothing).

Khethil was on the right track: Complexity does not necessarily equal intelligence or consciousness, as far as I can see. I wouldn't say a thermostat is conscious, as it has no awareness -- it simply completes a function. If we said a thermostat was aware, then we'd have to extrapolate this out to nearly everything, from a computer program to a rock, and this seems absolutely absurd.

Chalmers even argues:
"Consciousness is a set of emergent, higher-level properties that arise from, but are ontologically autonomous of, the physical properties of the brains of organisms"

Though he doesn't completely oppose panpsychism, I don't see how a chip off a rock can be sentient. The molecules contained within that chip do not have this awareness we share. It appears that consciousness comes after the fact, almost as a 'whole is greater than the sum of its parts' concept. It emerges in the brain due to a multitude of processes, many of which we don't yet fully grasp; somewhere between an amoeba and a human, consciousness emerges, but we don't know quite where to draw the line yet. This information will most likely be found within A.I. research.

Also, if we say the human mind is simply a chain of complex thermostats strung together (which is essentially the claim here), then the *self* concept is pretty much thrown out the window. Each of our actions is predetermined; we're simply carrying out orders from a laundry list of influences -- just as a complex A.I. would. In this case, we have absolutely no power over our actions, and consciousness is just a big bag of tricks. We have the illusion we're in control, and yet every thought, feeling, and action can be quantified and mathematically predicted.

I'm not convinced of this. There's something behind our emotions, behind our actions, that I don't feel can be completely reduced. I believe in qualia, and I believe we have some control. If the reductionist argument prevails, however, I'd be a sad camper. It would be depressing to even live.
odenskrigare
 
Reply Fri 27 Feb, 2009 05:03 pm
@Zetherin,
Zetherin wrote:
Khethil was on the right track: Complexity does not necessarily equal intelligence or consciousness


That's the assumption I'm making for the sake of argument (literally)

I'm assuming the loony position to discredit it, not agree with it

That being said, I would like to know how you'd "correlate" the two ... you need hard numbers to find a correlation coefficient and a regression line
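To illustrate that point: the sketch below (my own example; the "consciousness scores" are invented placeholders, precisely the numbers nobody actually has) computes a Pearson correlation coefficient by hand and shows that, without a real measure of consciousness, the coefficient isn't even defined.

```python
# Pearson correlation by hand. The point: a correlation coefficient only
# exists once BOTH variables are quantified.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

complexity = [1, 10, 100, 1000]   # e.g. component count (made-up numbers)
consciousness = [0, 0, 0, 0]      # no agreed-upon measure exists

# With no way to assign the y-values, the variance of a constant list is
# zero and the formula divides by zero -- the coefficient is undefined.
try:
    print(pearson_r(complexity, consciousness))
except ZeroDivisionError:
    print("correlation undefined without real measurements")
```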

Zetherin wrote:
as far as I can see. I wouldn't say a thermostat is conscious, as it has no awareness -- it simply completes a function. If we said a thermostat was aware, then we'd have to extrapolate this out to nearly everything, from a computer program to a rock, and this seems absolutely absurd.


For the sake of argument, and strictly for the sake of argument, can we say that a human is a function with lots and lots of parameters?

Zetherin wrote:

Chalmers even argues:
"Consciousness is a set of emergent, higher-level properties that arise from, but are ontologically autonomous of, the physical properties of the brains of organisms"


I don't know who he is.

Zetherin wrote:
Though he doesn't completely oppose panpsychism, I don't see how a chip off a rock can be sentient. Those molecules contained within that chip do not have this awareness we share. It appears that consciousness comes after the fact, almost as a sum is greater than the parts concept. It emerges in the brain due to a multitude of processes, many of which we don't have full grasp of yet; somewhere between an amoeba and a human consciousness emerges, but we don't know quite where to draw the line yet. This information will most likely be found within the A.I research.


I keep abreast of developments in AI because I once wanted to enter research in this same field ... I doubt it personally. Neuroscience, maybe, but even that will leave a lot in darkness. People in both fields are full of hubris, btw, don't listen to everything they say.

Zetherin wrote:
Also, if we say the human mind is simply a chain of complex thermostats strung together (which is essentially the claim here), then the *self* concept is pretty much thrown out the window. Each of our actions of predetermined, we're simply carrying out orders from a laundry list of influences -- just as a complex A.I would.


Although it doesn't necessarily exclude qualia, like you say you believe below, the bundle theory proposed by Hume is compelling and has medical support. (See what happens when the corpus callosum is severed.)

I'm with you on qualia but I don't believe there is a separate 'self'.

Zetherin wrote:
I believe in qualia, and I believe we have some control. If the reductionist argument prevails, however, I'd be a sad camper. It would be depressive to even live.


I wouldn't worry about it.
 
bemoosed
 
Reply Sun 29 Mar, 2009 09:09 pm
@Khethil,
Some nice posts here. At the risk of redundancy and worse, I'll add my first thoughts, nothing rigorous...

When I'm in a dreamless sleep, I don't seem to be very conscious. Somewhat, since I can be woken up. But sleeping or waking, I imagine the structural complexity of my nervous system is about the same.

Maybe the patterns of dynamic neural activity are less complex when I'm sleeping dreamlessly. In such a state I seem to be less conscious than, say, a shark on the hunt, but I wouldn't be surprised if I've still got more going on when I'm unconscious than at least some conscious animals. Depends on how I might quantify "complexity of patterns", and "conscious" of course...

What I'm getting to is that it seems to me that "conscious" as I usually mean the word might require some complexity, but that a conscious system also has to be (1) designed for it and then (2) actively engaging some key element(s) of that design (no ID meant!). Looking at it from some kind of materialist perspective, anyway.
 
Harby phil
 
Reply Tue 31 Mar, 2009 11:04 am
@odenskrigare,
I'm a wanker, apparently.

Well, not really. I'm not all too familiar with the specific goings-on of the current field of AI, but I do believe the complexity/reducibility/information theories are being harshly misrepresented. To begin with, the Earth obviously isn't conscious, and neither is a thermostat for that matter. They do, however, input and output information - just like everything in existence, in fact.

The mistake you made, odens, is understating the complexity of the sentient lifeform and overstating the complexity of a planet (or whatever). Consciousness is not a mere system; it is a whole constellation of 'em.

Truth is, we don't yet know enough about consciousness, but what we do know IS reducible and can be replicated in AI (in time). I'm not saying that the unknown part adheres to the same principles, but these guys are at least using what we do know to try and find out. Their zealousness may not be justified, but your skepticism is definitely unproductive.
 
Zetherin
 
Reply Tue 31 Mar, 2009 12:21 pm
@odenskrigare,
Quote:
The mistake you made odens is understating the complexity of the sentient lifeform and overstating the complexity of a planet (or w/e). Consciousness is not a mere system, it is a whole constellation of 'em.
Actually, I don't find that a mistake at all - it's called critical thinking. Comparing our consciousness to that of a less-sentient object could yield us knowledge.

What are you basing "Consciousness is not a mere system" on? Is it just a *feeling*? "System" does not mean only a single function is performed; my computer "system", for instance, completes several different functions.

What are the qualities that make it (sentient being) not a "mere system" - how do you know a thermostat isn't conscious? What exactly are the distinctions?

If we were to design an A.I. that was able to gain and respond to empirical knowledge (one is already in development, actually), what would you think? Could it be a sentient being -- could it be conscious?

We need specifics, this is what we're seeking (or I am, at least). And I think the skepticism here is healthy, as the understanding of consciousness, as you note, isn't very conclusive.
Harby phil
 
Reply Tue 31 Mar, 2009 09:26 pm
@Zetherin,
Zetherin wrote:
Actually, I don't find that a mistake at all - it's called critically thinking. Comparing our consciousness to a less-sentient object could allow us knowledge.

What are you basing "Consciousness is not a mere system" on? Is it just a *feeling*? "System" does not mean only a single function is performed; my computer "system", for instance, completes several different functions.
Then compare it to other animals; as someone stated already, invertebrates can be a great source of information on the matter. What I tried to say is that although something the size of a planet might be working with more information than humans, the latter's processing of the information is more complex. The complexity of the process by which the analysis of the information is done is the (most probable; hey, it still might be a soul or whatnot) key to consciousness, not the amount of information.

As for the system constellation, I meant that consciousness is not a single system: it would still function, and still be called consciousness, if you removed parts commonly held responsible for related functions (blind people, although lacking a critical information input, are still quite conscious), down to some unknown degree. This independence of the systems that constitute what we call consciousness would, by the same logic, make it capable of arising by addition of systems, i.e. reducibility.

This was meant to support the possibility inorganic objects are "somewhat" conscious or have the potential to be, contrary to what you have asserted, and I apologise for being too lazy to explain.

Quote:
What are the qualities that make it (sentient being) not a "mere system" - how do you know a thermostat isn't conscious? What exactly are the distinctions?

If we were to design an A.I which was able to gain and respond to empirical knowledge (one's already in design actually), what would you think? Could it be a sentient being -- could it be conscious?

We need specifics, this is what we're seeking (or I am, at least). And I think the skepticism here is healthy, as the understanding of consciousness, as you note, isn't very conclusive.
Yes, I believe it could be conscious; that is why I said the research is productive. The criticism isn't healthy because it isn't constructive criticism: it takes, as the OP said himself, the loony approach to counter an overzealous supporter. If he were instead to take it seriously, presenting a more realistic example or at least presenting the not-so-hard AI views on the same matter (whether or not he personally supports them), I would have taken him seriously.
 
 
