A couple of weeks ago, Rust gained suffix inference for integer literals.
Up until this change, to write an integer literal that had any type
other than `int`, you had to add a suffix to it. This was especially
annoying with uint literals, which had to be written with a `u`
suffix; it was a type error to write `let x: uint = 3;` because the `3`
lacked a suffix. This is no longer the case.
Now, if you write an integer literal without a suffix (3, -500, etc.),
the compiler will try to infer its type based on type annotations and
function signatures in the surrounding program context. For example,
in the following program the type of `x` is inferred to be u16 because
it is passed to a function that takes a u16 argument:
let x = 3;
fn identity_u16(n: u16) -> u16 { n }
identity_u16(x);
It works in patterns, too:
let account_balance = 45u64;
alt account_balance {
    0 { log(error, "not enough") }
    1 to 99 { log(error, "enough") }
    _ { log(error, "whoa") }
}
But if the program gives conflicting information about what the type
of the unsuffixed literal should be (that is, if the typing context
overconstrains the type), you'll get an error message, as in the
following:
let x = 3;
fn identity_u8(n: u8) -> u8 { n }
fn identity_u16(n: u16) -> u16 { n }
identity_u8(x); // after this, `x` is assumed to have type `u8`
identity_u16(x); // raises a type error (expected `u16` but found `u8`)
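To avoid the conflict, each call can be given its own literal, so that each one is constrained by only a single call site. A sketch of that fix, reusing the two identity functions from the example above:

```rust
fn identity_u8(n: u8) -> u8 { n }
fn identity_u16(n: u16) -> u16 { n }

fn main() {
    // Each unsuffixed 3 is constrained by exactly one call,
    // so each is inferred independently: u8 here...
    identity_u8(3);
    // ...and u16 here, with no conflict between them.
    identity_u16(3);
}
```

An explicit suffix (`3u8`, `3u16`) on each literal would work just as well.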
In the absence of any type annotations (that is, if the typing context
underconstrains the type), Rust will assume that an unsuffixed integer
literal has type int. This is a fairly arbitrary choice, but it
helps with backwards compatibility, since in the past, unsuffixed
integer literals were always of type int.
let n = 50;
log(error, n); // n is an int
Of course, if you don't want the types of your integer literals to be
inferred, it's fine to keep writing them with suffixes.
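For reference, a few suffixed forms (the variable names here are just illustrative):

```rust
fn main() {
    let balance = 45u64;  // explicitly u64; no inference needed
    let byte = 255u8;     // explicitly u8
    let offset = -500i32; // explicitly i32
}
```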
Enjoy!
Lindsey