Machine learning lets micro decode your handwriting

This rig will take the letters you write on the touchpad using a stylus and turn them into digital characters. The system is very fast and achieves near-perfect recognition. This is all thanks to machine learning: a neural network was trained on a large set of handwriting samples.

The ATmega644 that powers the system just doesn’t have the speed and horsepower necessary to reliably recognize handwriting on its own. But provide it with a dataset to compare against and you’re in business. [Justin] and [Stephen] designed a neural network algorithm that took a large volume of character handwriting samples and boiled them down into a set of correlations that can be referenced when encountering a new entry. This set is about 88 kilobytes, too much to store in the microcontroller’s internal memory, but easy to reference from an external flash memory chip.

There are plenty of gritty details in the write-up linked above, but you may want to start with the video overview found after the break.
