"I hypothesize that this innovation may have been facilitated by an early attempted domestication of dogs, as indicated by a group of genetically and morphologically distinct large canids which first appear in archaeological sites at about 32 ka B.P. (uncal)."

"I hypothesize that this innovation may have been facilitated by an early attempted domestication of dogs, as indicated by a group of genetically and morphologically distinct large canids which first appear in archaeological sites at about 32 ka B.P. (uncal)."

Last year, population geneticist David Reich of Harvard Medical School in Boston, Massachusetts, and his colleagues compared the genome of a 45,000-year-old human from Siberia with genomes of modern humans and came up with the lower mutation rate [2]. Yet just before the Leipzig meeting, which Reich co-organized with Kay Prüfer of the Max Planck Institute for Evolutionary Anthropology, his team published a preprint article [3] that calculated an intermediate mutation rate by looking at differences between paired stretches of chromosomes in modern individuals (which, like two separate individuals’ DNA, must ultimately trace back to a common ancestor). Reich is at a loss to explain the discrepancy. “The fact that the clock is so uncertain is very problematic for us,” he says. “It means that the dates we get out of genetics are really quite embarrassingly bad and uncertain.”

(emphasis mine)

Now, a lot of careful work has been done calibrating the human "molecular clock," but it's unclear whether similarly careful work has been done on dogs to determine, for example, whether mutation rates have changed over time in various dog and wolf populations. Work in this area is still evolving, and dates estimated from genetic evidence alone should invite skepticism. Such estimates need to be weighed against archaeological, anthropological, and other lines of evidence as well.
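To see concretely why the uncertain clock matters, here is a minimal sketch (my own illustration, not from the article, with made-up numbers) of the strict-clock arithmetic: two lineages that split T years ago accumulate roughly d = 2μT substitutions per site, so the inferred date T = d / (2μ) scales inversely with the assumed per-site, per-year mutation rate μ.

```python
# Illustrative sketch: how an inferred divergence date depends on the
# assumed mutation rate, under a simple strict molecular clock.
# All numbers below are hypothetical, chosen only to show the scaling.

def divergence_time(d, mu):
    """Years since two lineages diverged, given observed per-site
    divergence d and per-site per-year mutation rate mu.
    Strict-clock assumption: substitutions accumulate at rate mu
    along each of the two branches, so d = 2 * mu * T."""
    return d / (2 * mu)

# Hypothetical divergence between two canid sequences:
d = 0.0001  # substitutions per site (illustrative value)

# Halving the assumed mutation rate doubles the inferred date:
fast_clock = divergence_time(d, 1.0e-9)  # 50,000 years
slow_clock = divergence_time(d, 0.5e-9)  # 100,000 years
print(fast_clock, slow_clock)
```

The same observed genetic difference yields a date of 50,000 or 100,000 years depending on which clock rate you plug in, which is exactly the kind of two-fold spread the disputed human mutation-rate estimates imply.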