Two more recent LTU threads which may be of interest are here and here.

In some statically-typed languages such as SQL, Eiffel, and Java (excluding primitives like int, bool and double), there is an implicit minimal type with one element (null in Java, Void in Eiffel, NULL in SQL) which is a valid subtype of pretty much everything. Strictly speaking, it isn't a true "bottom" (the bottom type is empty, after all, and the set containing the null reference is not empty), but it is a minimal type.
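A small Scala sketch of this point (Scala inherits Java's behavior here, and even names the type: Null sits below all reference types but not below value types):

```scala
// Null is a subtype of every reference type, so its single value `null`
// inhabits all of them -- but not value types like Int, which is why it
// is only a "minimal" type for references, not a true bottom.
object NullIsMinimal {
  def nullAsString: String = null   // compiles: Null <: String
  def nullAsList: List[Int] = null  // compiles: Null <: List[Int]
  // def nullAsInt: Int = null      // would NOT compile: Int is a value type
}
```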

Many of the objections to nulls arise from this property: it makes null values essentially unavoidable.

The bottom type is an artifact of the Curry-Howard isomorphism between proofs and types, and it is by no means a universal feature of type theories, since it serves as the type of programs that other theories would reject as not being well-formed. It isn't of any practical use in programs, because a type with no values and no methods isn't a valid subtype of anything.

A minimal type with a single value and no methods is of some use in a real programming language: instances of the type require zero storage, so you can do things like create a graph with data on the edges but not in the nodes while, in theory, incurring no overhead.
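The graph example can be sketched in Scala with Unit, its one-value type (the names here are illustrative, not from any particular library):

```scala
// A graph parameterized over node and edge payload types. Putting the
// payload on the edges and instantiating the node payload with Unit
// gives nodes that carry only the zero-information value ().
case class Node[A](id: Int, data: A)
case class Edge[E](from: Int, to: Int, data: E)
case class Graph[A, E](nodes: List[Node[A]], edges: List[Edge[E]])

object UnitNodes {
  // Edges carry a weight; nodes carry nothing but ().
  val g: Graph[Unit, Double] = Graph(
    nodes = List(Node(0, ()), Node(1, ())),
    edges = List(Edge(0, 1, 2.5))
  )
}
```

(On the JVM the () field is not literally free, but the point stands: the type system lets the payload slot exist without demanding any information in it.)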

The bottom type does have some practical use. It's the type of constructs like Java's throw, which diverge by throwing an exception. Java treats throw specially: any code after a throw statement is flagged as unreachable. If Java had an explicit bottom type, it could treat user-written code the same way, so that code after "logMessageAndThrow(blah)" would also be recognized as unreachable.
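Scala does exactly this with its Nothing type; a sketch (logMessageAndThrow is the hypothetical helper from the comment above):

```scala
object ThrowDemo {
  // A throw expression has type Nothing, Scala's bottom type, so a
  // helper can declare that it never returns normally:
  def logMessageAndThrow(msg: String): Nothing = {
    System.err.println(msg)
    throw new RuntimeException(msg)
  }

  // Because Nothing is a subtype of Int, the call typechecks in a
  // position expecting an Int; the compiler knows the else branch
  // never produces a value.
  def safeDiv(a: Int, b: Int): Int =
    if (b != 0) a / b else logMessageAndThrow("division by zero")
}
```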

A bottom type can also be used with covariant type parameters. In Scala, a single constant, Nil, marks the end of every list. Nil's type is List[Nothing], where "Nothing" is Scala's name for bottom. Concretely, head on a List[T] returns a T, so head of a List[Nothing] must return a Nothing -- and in fact it throws an exception.
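Both halves of that claim can be checked directly:

```scala
object NilDemo {
  // List is covariant in its element type, and Nothing is a subtype of
  // every type, so the single value Nil: List[Nothing] serves as the
  // empty list of ANY element type.
  val noInts: List[Int] = Nil
  val noStrings: List[String] = Nil

  // head: List[A] => A, so Nil.head would have to produce a value of
  // type Nothing -- impossible, and indeed it throws.
  def headOfNilThrows: Boolean =
    try { Nil.head; false }
    catch { case _: NoSuchElementException => true }
}
```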

All that said, languages like ML and Haskell just encode the bottom type as "forall a. a", and that works just fine, too. That doesn't mean the bottom type is useless; it just means that a language may have a reasonable way of indicating it without an explicit type.
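The "forall a. a" encoding can be mimicked in Scala with a polymorphic method (this is essentially how the standard library's ??? behaves, which returns Nothing and hence any A):

```scala
object ForallDemo {
  // A value claiming to inhabit every type a can only be "produced"
  // by diverging -- here, by throwing.
  def bottom[A]: A = throw new NotImplementedError("no value of type A")

  // The call typechecks at any type, but evaluating it always fails.
  def usableAtInt: Boolean =
    try { val x: Int = bottom[Int]; false }
    catch { case _: NotImplementedError => true }
}
```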

The use of ⊥ to represent things like nontermination or failure has, for some reason, always kind of bothered me. This is a wholly philosophical observation, but if ⊥ is the zero of your type system, use of it (as the return type of a function) to signify failure of various sorts is like division by that zero--all bets are off.

In purely theoretical contexts, where nontermination, "getting stuck", or other causes of failure are essentially elided from consideration, this is a reasonable position to take, I suppose--it is a convenient means to abstract away details that aren't important to the problem at hand.

For production programming languages, error handling does become an important concern. It's frequently useful to distinguish functions which may throw exceptions (or otherwise fail) from those which do not; and the use of ⊥ to indicate failure undermines this--the algebraic sum of some T and ⊥ is just T.
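The algebraic point can be made concrete with Either, Scala's sum type: Either[Nothing, T] has exactly the values of T, so a failure channel typed as bottom adds no information.

```scala
object SumDemo {
  // Either[E, T] is the sum of E and T. With E = Nothing, the Left
  // branch is uninhabited, so Either[Nothing, T] is isomorphic to T:
  // "T + bottom = T".
  def toEither[T](t: T): Either[Nothing, T] = Right(t)

  def fromEither[T](e: Either[Nothing, T]): T = e match {
    case Right(t) => t
    case Left(n)  => n  // n: Nothing, so this case can never actually run
  }
}
```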

At any rate, if a language includes a type (whether named or not) containing more than zero values, it's probably inappropriate to refer to it as a bottom type, as ⊥ is empty. By maintaining the empty invariant, otherwise impossible promises (such as the notion that the universal subtype must somehow implement every possible method) become vacuously satisfiable. :)

In SQL below 6NF, or in relational programming with outer joins, some sort of representation is required for an unknown domain value in a relation with join dependencies. NULL is the default choice.

This is unfortunate because it loses information: if some sort of 'unique unknown variable' were introduced instead, one could determine when two NULLs happen to be equal, and it would be far better suited to updatable views.