Wednesday, July 22, 2015

John Cook makes an interesting point
regarding a rule you may have had drummed into your head when you
first learned fractions:

[It] serves some purpose in the
early years, but somewhere along the way students need to learn
reducing fractions is not only unnecessary, but can be bad for
communication. For example, if the fraction 45/365 comes up in the
discussion of something that happened 45 days in a year, the fraction
45/365 is clearer than 9/73. The fraction 45/365 is not simpler in a
number theoretic sense, but it is psychologically
simpler since it's obvious where the denominator came from. In this
context, writing 9/73 is not a simplification but
an obfuscation.

Simplifying fractions
sometimes makes things clearer, but not always. It depends on context,
and context is something students don't understand at first. So it
makes sense to be pedantic at some stage, but then students need to
learn that clear communication trumps pedantic
conventions. [emphasis in original]
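Cook's example checks out: 45 and 365 share the factor 5 (45 = 9 × 5, 365 = 73 × 5), so 45/365 reduces to 9/73. The reduction even happens automatically in software. A minimal sketch in Python (the variable names are mine):

    # Python's standard-library Fraction type always reduces to
    # lowest terms, so the "meaningful" denominator 365 is lost.
    from fractions import Fraction

    f = Fraction(45, 365)
    print(f)                      # 9/73
    print(f == Fraction(9, 73))   # True: numerically identical

    # To keep the psychologically simpler form, the numerator and
    # denominator have to be carried separately:
    days, year = 45, 365
    print(f"{days}/{year} of the year")  # 45/365 of the year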

This touches
on something I have noticed as a parent of an increasingly inquisitive
toddler: The need to focus on one lesson frequently requires setting
aside a wider context so the matter at hand can be held in mind. It is
perhaps harsh to call reducing fractions pedantic, but there is a
serious issue here. Teaching such rules as if they must always be
followed, or as if they arose from a vacuum, discourages subsequent
questioning and integration with other knowledge. A full explanation is likely
impractical at the time, but perhaps teachers should more often say
something like, "We will be doing things this way because it makes
these lessons easier to learn."

2 comments:

Anonymous said...
Decimal format (0.123) is even better, since it lets fractions be compared to each other. Unless the denominator is small (2/3) or standardized (90°, 45/365 days), you probably shouldn't use fractions. Who can do mental math with numbers like 29/93?

Maybe, maybe not, since decimal notation combines reduction with conversion to a denominator of 10, 100, 1000, etc. Even a non-standard denominator like 93 can be easier to read in a shared context of things that regularly occur 93 times. (Maybe a company finds it convenient to package 93 of something for one reason or another...)

It all depends on context (as above, and in terms of what normal humans can process) and on the mathematical background and ability of the audience. Time and again I see people who don't really understand what "percent" means, yet percent is arguably a better format than a decimal, since percentages are whole numbers (or easily rounded to them) and easier to read. I think Cook's main point is that we have to keep context in mind when using numbers, and that doing things the same way every time can get in the way.
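To make the thread's point concrete, here is a rough sketch (Python; the sample fractions come from the discussion above) of putting fractions on a common decimal and percent scale for comparison:

    # Converting to decimals or percentages puts fractions on a common
    # scale, at the cost of discarding the original denominators.
    for num, den in [(45, 365), (29, 93), (2, 3)]:
        value = num / den
        print(f"{num}/{den} = {value:.3f} = {value:.0%}")
    # 45/365 = 0.123 = 12%
    # 29/93 = 0.312 = 31%
    # 2/3 = 0.667 = 67%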