The 0.1 mm on a meter stick comes from a fair guess. You need to combine that guess with multiple measurements to get a statistical basis, then compute an average. Your result should state an estimate of the magnitude of your error, perhaps the standard deviation of your data.

As far as significant figures go, if the stick is marked down to millimeters, the finest reading you can record is 0.0001 m: you are allowed to estimate one decimal place beyond what the markings give you (0.001 m), provided you state an error of the same magnitude.
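The averaging-plus-standard-deviation procedure above can be sketched in a few lines. The readings here are made-up illustrative values, not data from this thread:

```python
import statistics

# Hypothetical repeated readings of the same object with a meter stick, in cm.
# Each reading is ticks read exactly plus one estimated tenth-of-a-mm digit.
readings_cm = [32.43, 32.41, 32.44, 32.42, 32.43]

mean_cm = statistics.mean(readings_cm)
std_cm = statistics.stdev(readings_cm)  # sample standard deviation as the error estimate

# Report the average with an error of the same order as the last estimated digit.
print(f"{mean_cm:.2f} cm +/- {std_cm:.2f} cm")
```

With readings that scatter by a few hundredths of a centimeter, the standard deviation comes out around 0.01 cm, consistent with the claim that the last digit is only an estimate.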

Can one read a meter stick down to a tenth of a millimeter? For example, could I reasonably measure an object as 32.43 cm using a meter stick?


It's not clear to me that there's a rigorous justification for the standard error quote when using a meter stick, but it's also not clear that one is needed. If it's an accurate meter stick and you can resolve the markings with your eyes, then your measurement is certainly good to the millimeter level. Beyond that, determining the error is more of a psychological problem than anything else.

Lets say your measuring a pencil
You can measure the number of millimeters exactly without any error, right? Just count up the ticks. This value is exact, lets say its 0.104m, or 10.4cm. After that, you are permitted to estimate up to one decimal place more, for example, if its really close to the 4mm mark, you can guess it to be about 10.41cm, or 10.42cm, something like that. if its clsoe to the middle, 10.45. The point is you can put a reasonable estimate on this value. It is not reasonable however to say that a pencil is 10.42434395098cm just by looking off a ruler, because you can't obtain such a high accuracy. The most you can say is what you can observe as exact, plus one more decimal place.