Is the reason for a higher-quality time base to reduce jitter, rather than to improve overall frequency accuracy? The worse the clock jitter, the more trace-to-trace dispersion - maybe not important at 1 MHz, but at 300 MHz a cycle is 3.3 ns, so 33 ps of jitter would cause a 1% trace-to-trace shift.
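(Just to show where the 1% figure comes from, here's a quick sanity check of the arithmetic; the 300 MHz and 33 ps numbers are only the illustrative values from the question, not figures for any particular scope.)

```python
# Rough sanity check: how much of a cycle does 33 ps of jitter represent at 300 MHz?
signal_freq_hz = 300e6            # 300 MHz signal (example value)
period_s = 1.0 / signal_freq_hz   # one cycle ~ 3.33 ns
jitter_s = 33e-12                 # 33 ps of time-base jitter (assumed figure)

shift_fraction = jitter_s / period_s
print(f"Cycle: {period_s * 1e9:.2f} ns, trace-to-trace shift: {shift_fraction:.1%} of a cycle")
# -> Cycle: 3.33 ns, trace-to-trace shift: 1.0% of a cycle
```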

Or is there some magic software inside the DSPs that averages out, or otherwise compensates for, time-base jitter?