Mills75 wrote:The lobes of the cerebral cortex would be the necessary parts. Consciousness, which basically refers to the knowledge of one's own existence (as it's being used here), doesn't seem to exist until sometime after childbirth (well after the cerebral cortex is developed and active--it seems we actually have to learn that we exist).
An interesting idea; that we have to learn that we exist implies there is a fair chance that we do not, but I'll come back to that. Self-awareness wasn't what I meant by consciousness, which is partly why I used the term 'experience' earlier. You are right that our senses are, for the most part, what make us able to experience, but the crux for me is what it is that is doing the experiencing - the 'experiencer'.
For example, the ability to experience pain may not require self-awareness as a prerequisite. I would say that causing pain to anything that can experience pain is wrong, whatever other attributes the entity may have or lack.
This discussion seems to be leading us towards a more general discussion on the nature of consciousness, which seems fine to me; let's say the main road is blocked, so we need to take a diversion before we can get back on track!
Mills75 wrote:Now, as to whether or not a brain of any size implies a degree of consciousness--probably not, but it's not so much size that's important as is the presence of the proper features. Consciousness seems to depend on the cerebral cortex. Some animals (insects, most fish--I think, and some lizards) don't have a cerebral cortex at all. Mammals do, but not as complex as the human cortex (though there is a lot of evidence that the other primates are capable of self-awareness, too). You can sort of think of the brain as a computer and consciousness as a program. Unless the computer meets the system requirements of the program, it's not going to be able to run that program. Our brains don't meet the system requirements of consciousness until a certain stage of development.
An interesting analogy. It reminds me of Daniel Dennett's idea that the brain is like a computer with a parallel processor, and consciousness is software that makes it seem like it's a serial processor.
From here, it seems we have three interesting lines of enquiry:
(Q1) Why do we experience? (Does the brain create consciousness, or does it 'tap into' it?)
(Q2) Does 'I' - the singular ego - exist? (If not, then while I might maintain that it is wrong to cause harm to an entity like an embryo, it might not be any more wrong to kill it than to not have a child in the first place.)
(Q3) Is death morally relevant? (If the singular 'I' does not exist, then is our illusory sense of self morally relevant? If not, would that mean that, while it is still wrong to cause pain, there would be no moral difference between an adult human being killed and the same human never having been conceived?)
Hate to do this, but I've run out of time and have to go. I'll put some more flesh on these questions when I get a chance, but in the meantime, what are your initial thoughts?