Suppose we have a set $f$ of ordered pairs (so not a triple $(X,Y,f)$ but just the $f$) and suppose that $f$ has the appropriate property such that we can view $f$ as a function. Formally, we wish to add a function $f'$ to our language. My understanding is that first-order logic demands that functions be total, thus $f'$ needs to be defined for all possible inputs. My question is, does setting $f'(x)=x$ for all $x$ outside the domain of $f$ lead to any contradictions or undesirable technicalities? And if not, is this the standard approach?
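For concreteness, here is a small sketch of the totalization I have in mind (Python, with a dict standing in for the set of ordered pairs; the names are mine, not standard):

```python
# The partial function f, given as a set of ordered pairs (here: a dict).
f = {1: 10, 2: 20}

def f_prime(x):
    """Proposed total extension: f'(x) = f(x) on dom(f), identity elsewhere."""
    return f.get(x, x)
```

So `f_prime(1)` returns `10`, while `f_prime(5)` falls back to the identity and returns `5`.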

The best solution, I think, is to add another constant $c$ to the language and require $f(x)=c$ if and only if $x$ is not in the original domain of $f$.
–
Asaf Karagila Sep 21 '12 at 11:57
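A sketch of this trick, with a hypothetical sentinel object standing in for the fresh constant $c$ (illustrative only):

```python
# A fresh constant c: a sentinel distinct from every value f can take.
c = object()

f = {1: 10, 2: 20}  # the original partial function as ordered pairs

def f_total(x):
    """Total extension with f_total(x) = c iff x is outside dom(f)."""
    return f.get(x, c)

# The original domain is now definable: x is in dom(f) iff f_total(x) != c.
```

The advantage over the identity trick is that membership in the original domain remains expressible in the extended language, as $f(x) \neq c$.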

The best solution is, of course, to work with partial functions. See my answer.
–
Andrej Bauer Sep 21 '12 at 12:49

There is also a context mismatch in the question. By convention, every function must be total when interpreting a first-order language. However, in that context a function is a symbol of the language. The interpretation of that symbol is a set of ordered pairs but it is best not to confuse the symbol with its interpretation when working in this kind of context. Outside that context, a partial function is a perfectly legitimate object.
–
François G. Dorais♦ Sep 21 '12 at 13:42

1 Answer

First-order logic "demands" no such thing. It is an accident of the current set-theoretic formalization of mathematics that partial functions may be applied outside their domains. This is something that mathematicians never do in practice. We even teach kids in school that division by zero is "forbidden".

Prescribing the value of a partial function outside its domain in an ad hoc fashion is a very bad idea because it makes statements "accidentally" true, or false as the case may be. Since one purpose of logic is to help mathematicians with their work (and not to trick them), we should avoid situations in which senseless things behave as if they were meaningful. An aside: for the same reason, type theory is a better formalization for the working mathematician (as opposed to a set theorist or a meta-mathematician) than set theory, because in type theory it makes no sense to say "the empty set is an element of the number $\sqrt{2}$".
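To illustrate the "accidentally true" phenomenon, here is a sketch under an assumed ad hoc convention (setting $x/0 = 0$, a choice made purely for demonstration):

```python
def div(a, b):
    """Ad hoc totalization of division: returns 0 when b == 0."""
    return a / b if b != 0 else 0

# The senseless statement "7/0 = 0" now comes out true by fiat,
# and nothing in the formalism flags it as meaningless:
print(div(7, 0) == 0)   # accidentally true
print(div(10, 2))       # ordinary division is unchanged
```

The convention does not cause a contradiction, but it lets meaningless expressions participate in true-looking statements, which is exactly the trap described above.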

To answer your question: there are, of course, serious logical treatments of partial functions (Google "logic of partial functions"). One possibility is simply to admit that basic function symbols may denote partial functions, and work from there. This leads to interesting questions about the meaning of some statements, but it is in any case much closer to actual mathematical practice. Mathematicians are always careful to establish that an object is well defined before they go on using it, and this is essentially what needs to be done in a logic which admits partiality.
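One way to mirror "establish well-definedness before use" in code is an explicit optional result (a sketch only, not a rendering of any particular logic of partial functions):

```python
from typing import Optional

def partial_sqrt(x: float) -> Optional[float]:
    """Partial function: defined only for x >= 0; None signals 'undefined'."""
    return x ** 0.5 if x >= 0 else None

y = partial_sqrt(2.0)
if y is not None:  # establish well-definedness before using y
    print(f"sqrt(2) is approximately {y:.4f}")
```

The type `Optional[float]` forces the caller to discharge the definedness obligation before the value can be used, much as a logic of partial functions attaches such obligations to terms.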

Classical treatments of first-order logic insist on having only total functions. This is a simplification for the benefit of logicians. Another such simplification is the insistence on having only a single sort, and a third is not having any way to introduce new symbols and define their meanings. There are meta-theorems which, broadly speaking, guarantee that these simplifications do not diminish the generality of formal logic. The working mathematician who is not interested in proving theorems about logic but wants to work in logic should know that the agenda of the sneaky logicians is not the same as his own.