> > You may want to also consider the *21*-bit option now.
>
> Why? If we extend APIs and data structures from 16-bit integer types,
> then we will very naturally use 32-bit integer types. I see no
> benefit whatsoever in artificially restricting
> character/glyph identifiers to 21 bits,