Today’s DpSU from Mr Thomas of the ICR is on the subject of the 100,000-year-old South African paint lab, and is called Ancient Paint Workshop Challenges Human Evolutionary Story. Now, whatever makes him think that?
His primary objection begins like so:
According to standard evolutionary dogma, mankind did not start evolving advanced cognitive abilities until about 70,000 years ago. Christopher Henshilwood, lead author of the study that appeared in Science, told Nature News that these artifacts “push back by 20,000 or 30,000 years” the imagined advent of “complex cognition,” or the time when humans finally had the brainpower to prepare and use paint.
You’ll find the Nature News article linked above the image, and I advise you to start there (and at the place the image itself links to) if you managed to miss the coverage of the find, back when it was still news.
Considering that ‘cognitive abilities’ is a pretty subjective notion, I for one have no problem with evidence that pushes the date of its evolution back a few tens of thousands of years. After all, it is an inherent property of ‘earliest known X’ records that they almost always move further back in time.
Brian Thomas continues with a questionably relevant and probably out-of-date quote from the person who found Lucy, and then goes on to give his explanation as to why our dates for the evolution of cognition might all be wrong: as you should know by now, the creationists dispute any and every dating mechanism science has on its hands.
But before I get to that, I must present to you two paragraphs which, amazingly, show that Mr Thomas actually understands the real limitations of 14C dating:
The study authors performed three different dating techniques on the remains. Curiously, however, they did not carbon-date any artifacts. Some of the tools included cow, seal, and dog bones, and the abalone shells must still contain protein. All of this material should have datable carbon. Why were they not carbon-dated?
The answer probably has to do with the fact that carbon dating is unreliable for artifacts older than 60,000 or so carbon-years. If any carbon-14 was detected in these remains, then the carbon age would be no more than 60,000 years, and probably far fewer—thousands of years younger than these scientists’ target date.
He, of course, implies in the last sentence that the scientists were looking for a specific date, and so couldn’t use the 14C method as that would give them a lower age. He was doing so well up to that point. The point is that, contrary to what the creationists would like you to believe, measured 14C does not fall to zero but bottoms out at a small background level, which would indeed yield an apparent age far more recent than 100,000 years. This is why, whenever the creationists carbon-date million-year-old things, they always come up with a date that is generally less than an order of magnitude greater than their age of the universe, and several orders of magnitude less than the real age of the object.
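The arithmetic behind this is straightforward. A minimal sketch, using the conventional radiocarbon-age formula (Libby mean life of 8,033 years); the background levels shown are illustrative values, not figures from the Science paper:

```python
import math

# Conventional radiocarbon ages use the Libby half-life (5,568 yr),
# giving a mean life of 8,033 years.
LIBBY_MEAN_LIFE = 8033

def apparent_age(fraction_modern):
    """Conventional radiocarbon age (years) for a given fraction of modern carbon."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Even a sample with no original 14C left will register the measurement
# background. Illustrative backgrounds of 0.1%-0.5% modern carbon:
for background in (0.001, 0.002, 0.005):
    print(f"{background:.1%} modern carbon -> apparent age "
          f"{apparent_age(background):,.0f} years")
```

Whatever the true age of the sample, a background of a fraction of a percent of modern carbon pins the apparent age in the ~40,000–55,000-year range, which is exactly why dating something known to be 100,000 years old this way would be pointless.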
Anyway, he goes on:
In their Science report, the archaeologists showed dates on a photograph of several vertical feet of cave floor layers. However, one dating method showed an age of 100,000 years for a layer that was dated at only 75,000 years by another method. And there were other inconsistencies, raising suspicion over the reliability of the age assignments.
I can’t see the original Science paper beyond the abstract, so I don’t know what he means here.
And there is more. They labeled the very top eight-or-so-inch-thick layer “hiatus,” instead of assigning it an age like the other layers. Then they labeled the next lower layer at about 68,000 years. Why would nine feet of cave sediments accumulate over 32,000 years, then almost no sediment accumulate for 68,000 years?
I was under the impression that you labelled gaps, not layers, as hiatuses, but whatever. Depositional gaps of hundreds of millions of years are known from the geological record, so 68,000 years is hardly impossible. The sediment is in a cave, and changing patterns of water channels could easily produce this kind of effect.
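It’s also worth checking the figures he quotes. A back-of-the-envelope calculation (using the nine feet and 32,000 years from his own paragraph) shows how slow the average accumulation rate was to begin with:

```python
# Back-of-the-envelope check on the figures quoted above:
# nine feet of cave sediment deposited over 32,000 years.
FEET_TO_MM = 304.8

thickness_mm = 9 * FEET_TO_MM        # about 2,743 mm
duration_yr = 32_000
rate_mm_per_yr = thickness_mm / duration_yr

print(f"Average accumulation: {rate_mm_per_yr:.3f} mm/yr")
```

At less than a tenth of a millimetre per year on average, deposition was already barely a trickle; a blocked or re-routed water channel dropping that trickle to effectively zero for a long stretch is not a remarkable ask.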
He changes tack:
The idea that man evolved his cognitive abilities has no scientific support, either. Cognitive thoughts are not traits encoded by particular genes, but immaterial traits. Thoughts interact with the material world through the sophisticated architecture of the brain, which requires thousands of precisely interacting genes. Tinkering with brains or brain-developing genes brings disaster, not improvement. Thus, brain biology shows that modern human cognition must have been purposeful and present at the beginning.
This idea of his is based on the concept of dualism – the claim that the mind is genuinely distinct from the brain. Ironically, it is dualism that has ‘no scientific support.’ We can see that many other animals have (lesser) cognitive abilities, and determine what the primary differences between our brains and theirs are. We can also study people with brain damage, and find genuine differences in, well, everything about them. You don’t seem to be able to do the same thing when it comes to providing evidence in favour of dualism. Cognitive thoughts may well not be “traits encoded by particular genes,” but they do indeed seem to be emergent properties of the brain that those genes build.
Nothing remains but to point out that his conclusion is therefore flawed, and to echo the little image disclaimer below this particular article:
Image credit: Copyright © 2011 AAAS. Adapted for use in accordance with federal copyright (fair use doctrine) law. Usage by [Eye on the] ICR does not imply endorsement of copyright holders.
(NB: the image that they use is not the same one as I do).