real life wrote:
Brandon9000 wrote:
real life wrote:
Brandon9000 wrote:
real life wrote:
Brandon9000 wrote:
The negative mutations tend to die out; the occasional ones that confer some advantage tend, statistically, to spread through the gene pool. Over immense lengths of time, this process produces greater and greater functionality.
One of the most obvious problems with this idea is that mutations which are supposed to confer some advantage in the final product often convey NO advantage initially, since they account for only a small part of a complex structure (an eye, an ear, etc.). Worse, they may convey an initial DISadvantage, because they reduce the benefit of a formerly beneficial structure while not yet realizing the supposed benefit of an eventual (thousands of years later?) development. Example: the jawbone-to-ear story.
So why are these mutations spread throughout the gene pool when they convey no advantage and may cause disadvantage? Just luck?
It's not a problem in the slightest. Mutations which confer no advantage do not usually come to dominate the gene pool. Probably virtually never. Traits which do confer an advantage do, although the advantage may be small.
That's exactly the point.
When a mutation shows up which does not yet confer an advantage (for instance, one of the many mutations/genetic changes which would be necessary for a complex structure such as an eye), how can it be said that this useless mutation hangs around for generation after generation until another, and another, and another mutation/genetic change takes place (luckily occurring each time in the same line of descent within this organism's population), assembling all the pieces of this complex structure in such a way (crude and unrefined as yet, but at least a beginning) that it starts to convey at least SOME benefit to the organism?
You're totally misrepresenting what we say. We say that an eye cannot have evolved unless, at every step of the process, there was more advantage than at the previous step.
That is my point. The first, second, third, and subsequent mutations don't necessarily convey an advantage at every step. A step could even be a disadvantage, as in the jawbone-to-ear story: the jawbone keeps receding until it becomes a bone in the middle ear. The shrinking jawbone surely must have become a liability at some point, making it much more difficult for many, many generations of the creature to feed themselves adequately.
Brandon9000 wrote: Maybe it began with a patch of slightly light sensitive skin, so that on a good day the creature could tell the difference between high and low illumination.
Yeah, maybe, but maybe not.
Is evolution at this point reduced to simple guessing?
When the first light-sensitive skin cell supposedly appeared, was there an optic nerve to carry light-generated stimuli? Was there an area of the brain that could interpret them? If not, what advantage did it confer?
Since we're guessing, let's ask again: could it be a possible disadvantage? Would uninterpreted additional data just show up as brain noise, producing a confusing effect rather than a benefit? (What would happen if the human ear were fine-tuned to other frequencies and could suddenly hear x-rays and gamma rays coming from the sun?)
We're guessing, but what we're not guessing about is that in order for a trait to appear, there must be a pathway to it such that at every step there is more advantage than at the previous step. You are hardly in a position to criticize potential weaknesses in a deduction, since you then turn around and accept an ancient text as the authority on the nature of the world, with nothing more than at most a few arguments of plausibility.
If there were a patch of light-sensitive skin, perhaps it could function initially along the normal pathways for tactile sensations, and specialized neural handling evolved only later.