[techspeak] A unit of memory or data equal to
the amount used to represent one character; on modern architectures
this is usually 8 bits, but may be 9 on 36-bit machines. Some
older architectures used `byte' for quantities of 6 or 7 bits, and
the PDP-10 supported `bytes' that were actually bitfields of
1 to 36 bits! These usages are now obsolete, and even 9-bit bytes
have become rare in the general trend toward power-of-2 word sizes.
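
To make the PDP-10 sense concrete, here is a minimal C sketch
(not actual PDP-10 code; the function name ldb merely echoes the
PDP-10's LDB instruction, and the 36-bit word is assumed to live
in the low bits of a uint64_t). It extracts variable-width
`bytes' from a word, using the classic PDP-10 packing of five
7-bit ASCII characters per 36-bit word:

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Extract a `size'-bit byte whose least significant bit sits
     * `pos' bits from the right end of a 36-bit word, in the spirit
     * of the PDP-10's LDB instruction and its byte pointers. */
    static uint64_t ldb(uint64_t word, unsigned pos, unsigned size)
    {
        assert(size >= 1 && size <= 36 && pos + size <= 36);
        return (word >> pos) & ((1ULL << size) - 1);
    }

    int main(void)
    {
        /* Pack five 7-bit ASCII characters into one 36-bit word,
         * left-justified, leaving bit 0 unused. */
        const char *s = "HELLO";
        uint64_t word = 0;
        for (int i = 0; i < 5; i++)
            word |= (uint64_t)(s[i] & 0x7F) << (29 - 7 * i);

        for (int i = 0; i < 5; i++)   /* pull the 7-bit bytes back out */
            putchar((int)ldb(word, 29 - 7 * i, 7));
        putchar('\n');                /* prints: HELLO */
        return 0;
    }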

Historical note: The term was coined by Werner Buchholz in 1956
during the early design phase for the IBM Stretch computer;
originally it was described as 1 to 6 bits (typical I/O equipment
of the period used 6-bit chunks of information). The move to an
8-bit byte happened in late 1956, and this size was later adopted
and promulgated as a standard by the System/360. The spelling
arose by mutating the word `bite' so that it would not be
accidentally misspelled as `bit'. See also nybble.

Traditionally, the term `kilobyte' meant 2^10 (1024) bytes, a
`megabyte' was 1024^2 bytes, and so on. To avoid confusion (!)
with the standard SI decimal prefixes, the IEC published a set of
distinct binary prefixes (kibi-, mebi-, gibi-, etc.) in January
1999.
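
The discrepancy grows with each step up the prefix ladder; a
minimal C illustration of the megabyte case (the variable names
here are mine, not part of any standard):

    #include <stdio.h>

    int main(void)
    {
        unsigned long long MiB = 1ULL << 20;   /* binary megabyte: 1024^2 */
        unsigned long long MB  = 1000000ULL;   /* SI megabyte: 10^6       */

        printf("2^20 = %llu\n", MiB);          /* 1048576 */
        printf("10^6 = %llu\n", MB);           /* 1000000 */
        printf("difference: %llu bytes (%.1f%%)\n",
               MiB - MB, 100.0 * (MiB - MB) / MB);  /* 48576, about 4.9% */
        return 0;
    }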