Calibrating The MSP430 Digitally Controlled Oscillator

The MSP430 is a popular microcontroller, and on board is a neat little clock source, a digitally controlled oscillator, or DCO. This oscillator can be used for everything from setting baud rates for a UART to clocking a VGA output.

While the DCO is precise – once you set it, it’ll keep ticking away at the same rate – it’s not accurate. Without a bit of code, it’s difficult to set the DCO to the rate you want, and the code to set that rate will be different between different chips.

When [Mike] tried to set up a UART between an MSP430 and a Bluetooth module, he ran into a problem. Setting the MSP to the correct baud rate was difficult. Luckily, there’s a way around that.

There’s an easy way to set the DCO on the MSP programmatically: set up two timers – one clocked by the DCO that interrupts every 512 cycles, and another clocked by a 32.768kHz crystal that interrupts every 32768 cycles. The second timer fires exactly once per second, and by counting how many times the first timer fires in that second and multiplying by 512, the real speed of the DCO can be deduced.

After playing around with this technique and testing the same code on two different chips, [Mike] found there can be a difference of almost 1MHz between the DCOs from chip to chip. That’s something that would have been helpful to know when he was playing around with VGA on the ‘430. Back then he just used a crystal.

While this technique will work and is useful for a variety of purposes, as other posters mentioned, TI also provides pre-loaded, factory-calibrated trim values on each device for common frequencies such as 1MHz and 8MHz. If you need something different from that for a peripheral, the device has post-scalers to divide the frequency fed to the peripherals.

In fact, to be honest, I have *never* used the direct DCO settings themselves at all. I’ve always used the CALBC1/CALDCO settings. I guess there might be some exotic reason to have to use them, but nothing common comes to mind. For anything where you need a specific (non-calibrated) frequency, the DCO isn’t accurate enough anyway.
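On the value-line G2xx parts, applying the factory trim is just two register writes. A minimal sketch, assuming a part that ships with the 1MHz constants (on real hardware the registers and constants come from msp430.h and info flash segment A; the stub values here are placeholders so the snippet compiles off-target):

```c
#ifdef __MSP430__
#include <msp430.h>  /* real registers and factory calibration constants */
#else
/* Host-side stubs so this sketch compiles off-target; placeholder values */
static unsigned char BCSCTL1, DCOCTL;
#define CALBC1_1MHZ 0x86
#define CALDCO_1MHZ 0xB7
#endif

/* Load the factory-calibrated 1MHz trim: range-select bits first, then
 * the DCO tap and modulation bits.  Parts that ship with them also have
 * equivalent constants for 8, 12, and 16MHz. */
static void dco_set_1mhz(void)
{
    BCSCTL1 = CALBC1_1MHZ;  /* RSELx range select          */
    DCOCTL  = CALDCO_1MHZ;  /* DCOx tap + MODx modulation  */
}
```

The calibration data lives in info flash, which is why a careless mass erase can wipe it.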

Fwiw, the later-series MSP430s such as the F5xxx and some FRAM versions have a fancy FLL that will auto-trim the DCO based on the 32.768kHz crystal. That’s a nice feature … I once found that the maximum speed you could run the MSP430F5172 (in a tight loop blinking an LED with busy-wait delays) was 56MHz.

I would like to note that DCO trimming typically involves modulating between a pair of clock speeds to hit the overall clock rate. This often works OK over longer periods, but over short periods it results in timing jitter, which can be problematic in some applications. For example, the F5529LP board has a 4MHz resonator for the USB timing even though it also has a 32kHz xtal.

I’ve recently had a lot of hands-on time with the MSP430F5529 board. The 4MHz is fed through a pre-scaler, then PLL’ed to 48MHz. The 32kHz xtal is completely nonessential and isn’t used for the USB stuff, and the DCO isn’t even used, due to timing reasons I believe. You can’t even use the factory-programmed USB BSL mode without a 4/8/12/24MHz crystal installed.

If you already have a UART with a steady enough datastream, you don’t even need a crystal for calibration. You just get your timing from the signal edges on the UART. This is especially easy if you have control over the other side, since you can just send frames containing known patterns.

Heck, the USCI on the MSP430s has automatic baud rate detection anyway, so you don’t even have to do anything. And of course, as has been mentioned, every MSP430 has at least one calibrated frequency stored in info flash. And TI includes calibration software anyway, in case you screw up and delete it.

The perfectionist in me says there’s no reason the result should be off by up to 511 cycles when you have hardware that can latch exact timer counts on an interrupt, but I’m sure the calibration derived from this is more than good enough to keep things in line.