To start, why are we even looking at three-valued logic? We want to get to fuzzy reasoning, and three-valued logic really doesn't do that job. But it's a useful starting point, because most of us have only worked with standard logics where statements are either true or false. In fuzzy logic, we don't have that, and that can lead to a lot of complexity. So it helps to look at something simple that shows us how a truth value other than "true" or "false" can actually make sense in logic.

The idea of three-valued logic is, quite simply, to add an alternative to true or false: neither. We usually abbreviate the truth values to T, F, and N. What does N mean? It depends on exactly which three-valued logic you choose. And already, we can start to see where some of the complexity is going to come from.

In traditional propositional or predicate logic, we can take statements and assign them a truth value: either they're true, or they're false. Once we know whether they're true or false, we can use that knowledge to infer whether other statements are true or false. In general, in classical logic, the inference rules seem obvious. If you know that \(A\) is true, then you know \(A \lor B\) is true. It doesn't matter what \(B\) is; if \(B\) is false, then \(A \lor B\) is true; if \(B\) is true, then \(A \lor B\) is true. No matter what, if you know that \(A\) is true, then \(A \lor B\) must be true.

In a three-valued logic, that isn't necessarily the case. It all depends on what \(N\) means. For example, you can define a three-valued logic where \(N\) means "I don't know". In that case, you'd have inference rules like \(\Gamma \vdash A=\mathbf{T} \Rightarrow \Gamma \vdash (A \lor B)=\mathbf{T}\), meaning that if I know that \(A\) is true, then I can infer that \(A \lor B\) is also true. If \(N\) means "I don't know", then \(A \lor N\) is still true.

But \(N\) doesn't have to mean "I don't know". It could mean "maybe". It could mean "neither" (in the sense of vagueness; Ann isn't really tall, but she isn't really not tall either). Or it could mean "nonsense", as in an invalid statement that can't be assigned a truth value. If it means "nonsense", then \((A \lor \mathbf{N}) = \mathbf{N}\) - that is, \(A\) or meaningless nonsense is meaningless nonsense.

So there are many three-valued logics, depending on exactly what you want \(N\) to mean, and how you want it to behave. Let's look at one particular three-valued logic, just to see how it works. We'll start with Kleene's strong three-valued logic, generally written \(K^S_3\). \(K^S_3\) is a three-valued logic where \(N\) means "unknown" or "uncertain". The basic logical operators in \(K^S_3\) behave according to the truth table below:
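In \(K^S_3\), \(\land\) is the minimum and \(\lor\) the maximum of its arguments under the ordering \(F < N < T\), and \(\lnot\) swaps \(T\) and \(F\) while leaving \(N\) fixed:

A  B | ¬A | A ∧ B | A ∨ B
T  T |  F |   T   |   T
T  N |  F |   N   |   T
T  F |  F |   F   |   T
N  T |  N |   N   |   T
N  N |  N |   N   |   N
N  F |  N |   F   |   N
F  T |  T |   F   |   T
F  N |  T |   F   |   N
F  F |  T |   F   |   F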

So N behaves pretty much as you'd expect unknown to behave. If you know something is true, then true or unknown is true, because the unknown could be true or it could be false - and in either case, it makes sense for the logical or to come out true.

Setting things up this way has some interesting effects. In \(K^S_3\), there are no tautologies. (A tautology is a statement that is always true.) Take a look at one of the classic ones: \(A \lor \lnot A\). If \(A\) is true, then it's true. If \(A\) is false, then it's true. But if \(A\) is \(N\), then it's \(N\). Similarly, there are no contradictions: \(A \land \lnot A\) isn't false when \(A\) is \(N\).
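The failure of \(A \lor \lnot A\) can be checked mechanically. Here's a minimal sketch in Python (the string truth values and helper names are my own, not anything standard):

```python
# Strong Kleene (K3) connectives over the truth values "T", "N", "F".
# Conjunction is the minimum and disjunction the maximum under F < N < T;
# negation swaps T and F and leaves N fixed.

RANK = {"F": 0, "N": 1, "T": 2}

def k_not(a):
    return {"T": "F", "N": "N", "F": "T"}[a]

def k_and(a, b):
    return min(a, b, key=RANK.get)

def k_or(a, b):
    return max(a, b, key=RANK.get)

# The classical tautology A v ~A is no longer always T:
for a in ("T", "N", "F"):
    print(a, k_or(a, k_not(a)))   # prints: T T / N N / F T
```

The same loop with `k_and(a, k_not(a))` shows that \(A \land \lnot A\) is \(N\) rather than \(F\) when \(A\) is \(N\), so there are no contradictions either.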

So just by adding this simple \(N\) value, some of the most basic things we've come to expect from logical reasoning, some of our most basic logical intuitions, no longer work. We're going to have to get used to that as we get fuzzier.

So there are some unexpected things in our three-valued logic. But some things continue to work as we expect. One of the nice properties of \(K^S_3\) is that its connectives are normal - which just means that if we limit ourselves to true and false, they'll behave the same way as they do in two-valued logic.

But normality doesn't give us as much as it might seem. We can take some pretty innocuous looking proofs from two-valued logic, and they'll fall apart in \(K^S_3\), even though they're built on normal connectives.

Consider one simple argument from propositional logic. Suppose that we know \(\lnot (A \leftrightarrow B)\). In two-valued propositional logic, from that, we can infer that for any proposition \(P\), either \(P \leftrightarrow A\) or \(P \leftrightarrow B\). That's not true in \(K^S_3\): suppose that \(A\) is \(N\), and \(B\) is \(T\). \(P\) could then be \(F\).

This is, obviously, just a quick taste, but \(K^S_3\) is pretty simple: you can figure out most of its properties just from this little bit here. There are several other ways of defining three-valued logics, though. In the next post, we'll take a quick look at two others: \(L_3\), which is similar to \(K^S_3\) but modifies the definition of implication so that \(N \leftrightarrow N\) is true; and \(B_3\), which is built on the idea that \(N\) means something like undefined. \(B_3\) also builds up some more interesting structure around the three-valued logic.

In K_3, as we can see from Mark's table, A=N. So, by the symmetric property of equality, we have N=A. Since ~N=N, by the transitive property of equality we also have ~N=A. So, if A=N is true, then A=~N is also true. Likewise, if A=N is false, A=~N is also false. So, if multivalued logic *needs* a two-valued logic foundation, then contradictions (given that these qualify as contradictions) come out derivable.

For a statement like (A=N)=N we don't know what happens to it. If we did, then it would theoretically end up as true or false, and we'd have two-valued logic instead of three-valued logic here.

By the truth table, if A has the truth value "I don't know"... N, then it also has the truth value of the negation of "I don't know"... ~N. So, if we insist on assigning a truth value of true or false to A, then since it has truth value N (A=N is a definition, not an equation), it also has a truth value of T or F. So, we have one contradiction there, and others like the one I derived above come out derivable.

Technically, a statement like (A = N) is not a statement you can make in the logical framework. One side is a proposition, the other is a truth value. Saying G: A = N is a statement that the truth value of A is N, but you can only make that statement in a metalogical framework.

Hmm, curious. I do wonder at the applicability of this particular logic system, because it seems to me that if we want to deal with a system that includes "true", "false", and "I don't know", we would be doing so in the understanding that deep down, on some level, every well-formed statement is either true or false; it's just that we may not always know which it is. If we take this understanding, then any time we have an N as the value of a variable in a statement, the result should be T if the statement is true whether we choose T or F for the variable, F if the statement is false whether we choose T or F for the variable, and N only if the value of the statement depends on whether the N is really T or F.

For example, if I take the statement for A to be, "Bob is taller than Jim," then I may have no idea whether or not this is true (and thus assign "N" to it), but I do know that A and not A is false.

I guess I'm not clear, then, as to exactly what sort of real-world system this would apply to. It seems that this system must be implying something more than "I don't know," perhaps something a bit closer to "This is unknowable." Even then I'm not entirely certain, as A and not A for A=N would, in some cases, be true (e.g. if "unknowable" were to mean "some definite value that is neither true nor false"). An example being, "Bob is tall", "Bob is not tall," and "Bob is neither short nor tall".

I do wonder at the applicability of this particular logic system, because it seems to me that if we want to deal with a system that includes “true”, “false”, and “I don’t know”, we would be doing so in the understanding that deep down, on some level, every well-formed statement is either true or false, it’s just that we may not always know which it is.

Not really. Suppose that "T" means "can be proven true in logic system S", "F" means "can be proven false in logic system S" and "N" means "cannot be proven true or false in logic system S", which can happen whenever S is incomplete. There is no deep notion of "truth" here, because "truth" is relative to the proof system.

Elsewhere, Mark has ranted against the addition of an 'infinity' or 'NaN' to the natural numbers (another NULL). The argument is basically that it breaks the laws of algebra. The same applies to adding N to the Boolean 'numbers' (false and true). Is this any less serious in any particular way?

Well, I can see how this sort of simple three-valued logic system would be good here, but at the same time it would seem to provide an \(N\) more often than is necessary. After all, it is definitely the case that a loop either can or cannot be parallelized. We may not necessarily know which is the case, but the truth is definitely one or the other.

So if you ever had a situation in the code that consists of \(A \wedge \neg A\), then this will falsely give \(N\) whenever \(A = N\).

Yes, I understand that this system is used in some computer codes, and it seems to me that it would be rather difficult to modify the system to take care of situations such as the above in an efficient manner. But in terms of applying the logic to, say, logical arguments, it seems that it needs more to really be applicable to the way we use the classification, "I don't know."

It's very, very bad, and Codd really shouldn't have introduced the NULL: the relational model had been doing fine for 10 years before he decided that NULL was a good addition to deal with "unknown information".

You see, when you add NULL, you end up with three-valued logic: now "x = y" may be TRUE, or FALSE, or UNKNOWN... or is it? You see, NULL is added to all domains, including Booleans. Which means that we have four-valued logic, with TRUE, FALSE, UNKNOWN, NULL... and there is no "natural" four-valued logic to pick, and many identities of boolean algebra don't hold — which means that, for instance, the possibility to optimize the WHERE part of a SQL query is limited.
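The three-valued behavior of SQL's NULL is easy to see from Python's built-in sqlite3 module (just a quick illustration; other SQL engines behave the same way here):

```python
# SQL comparisons involving NULL yield UNKNOWN, which sqlite3
# hands back to Python as None rather than True or False.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# NULL = NULL is not TRUE; it is UNKNOWN (None).
print(cur.execute("SELECT NULL = NULL").fetchone())          # (None,)

# A WHERE clause keeps only rows where the condition is TRUE,
# so UNKNOWN acts like FALSE and no row survives.
print(cur.execute("SELECT 1 WHERE NULL = NULL").fetchall())  # []
```

This is exactly why SQL needs the special `IS NULL` predicate: an ordinary `= NULL` comparison can never come out TRUE.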

The more natural way of dealing with unknown data in the relational model would be this: instead of relation A with NULL-able attribute x, you have two relations: Ax_p, with the same heading as A except that x is now non-NULL-able, and Ax_n, with the same heading as A but with attribute x removed.

Of course, that means that to get sub-relation of A where all attributes are non-NULL, you have to JOIN an awful lot of relations together. The main problem here is not storage/physical representation problem: no one has ever mandated that a relation must be stored as exactly one SQL table/B-tree/whatever; the problem is that you need tools for working with sets of relations. Well, the relational model, after all, is just the first-order logical framework, so why not extend it to the second order?

Not that I knew what to call it at the time, but I once used this exact system of three-valued logic in a boolean circuit generation program. (It is the most obvious one, after all.) This way it could handle loops in the circuit diagram without the final outcome depending on the initial state. I'm not sure whether it worked all that well, since it was too slow to get to examples big enough that loops might actually start to make the circuits smaller.

Kind of like Jason above, I'm not completely sold on the claim that there are no tautologies in Kleene's. The example you give is that in this three-value system, A v A' is not always true, as shown by the truth table entry when A is "N" (this makes both A and A' "N"). But it seems to me that in this particular case, where a certain state is either T or F (we just don't know which), the relationship between A and A' is still maintained - they are still opposites. This would make the A v A' tautology still hold true.

You can potentially still define other connectives than the basic primitives of the logic. For instance, using the basic primitives here, it might be possible to define logical equivalence in a way such that A≡B is true whenever A and B have the same truth value. However, even if other operators can get defined which yield tautologies, for the *basic* connectives of the logic (that might not be the usual term, but it helps indicate the idea at work), there don't exist any tautologies. The basic connectives above consist of ^, v, and ~ as Mark's truth table defined them. The truth value of any statement in K_3 (given that we know the truth values of its variables) is, in principle, derivable from these basic connectives as described in the truth table.

Just look at the middle row of the table of values. If A and B are both N, any combination of them produces N. No matter what you do with the connectives, you can't get T. So, no tautologies using the standard connectives.
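The middle-row argument amounts to a three-line check (sketched here in Python with ad-hoc helper names): every standard connective maps N inputs to N, so by induction any formula built from ~, ^, and v evaluates to N when all its variables are N, and thus can't be a tautology.

```python
# The strong Kleene connectives, with F < N < T ordering.
RANK = {"F": 0, "N": 1, "T": 2}
k_not = lambda a: {"T": "F", "N": "N", "F": "T"}[a]
k_and = lambda a, b: min(a, b, key=RANK.get)
k_or = lambda a, b: max(a, b, key=RANK.get)

# Each connective fixes N, so any formula over them fixes N too.
assert k_not("N") == "N"
assert k_and("N", "N") == "N"
assert k_or("N", "N") == "N"
print("N is a fixed point of ~, ^, v: no tautologies are possible")
```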

Now, you can define new primitive connectives, so it would be possible to get tautologies based on them.

I'm intrigued by this notion, but I think it would result in a four-value logic system instead of a three-value system. You have T, F, N, and M. N and M behave exactly the same as in \(K^S_3\), except that \(N = \neg M\) and vice versa. Then \((N \vee M) = T\) and \((N \wedge M) = F\). Probably this system has already been proposed and studied, but I'm not familiar with the literature on the subject.

I know of at least one practical application of this kind of three-valued logic. I once worked on a compiler that would evaluate array index expressions to figure out if loops could be parallelized or not. I computed ranges for the variables in the index expressions (restricted linear combinations of integer variables), and in various places the algorithm used true|false|unknown values, including logical combinations of the values.

This reminds me of the std_logic type in VHDL. It has many more values, but the basic idea is similar.
There you can have 1 and 0 (equivalent to true and false), high and low (weak true and false), high impedance, unknown, uninitialized and don't care. Then there are tables that define how they combine under logical operations. A difference is that you can also assign two values to a wire and they will get resolved (e.g. low with high impedance -> low, but low with 1 -> 1).

I really don't like the fact that "A and ~A" is unknown when N is unknown. I always think of an unknown as "we don't know yet" (fan of Hilbert...), and as soon as we do know, "A and ~A" will immediately become false.

It's not an either-or between logical systems: there don't exist sufficient reasons to decide between them. This isn't a sporting competition where we have to declare a winner, and it is quite possible that we don't know yet, and might never know.

(Also, testing if latex works in comments. I don't know if it does. If the above appears as a logical not, then latex works in comments by surrounding the expression with double-dollar-signs. If not, well, sorry.)

For any readers particularly interested in this topic, I suggest they look for a copy of "Deviant Logic" by Susan Haack, Cambridge University Press, 1974. It is my "bible" on the area and provides both a mathematical and a philosophical viewpoint on non-classical logics.

That's *ONE* potential reference. But if that's your "bible" on the area, it's like having ONLY the book of Mark for the New Testament, or equating the entire Torah with the book of Deuteronomy. Make no mistake: if one wants to evaluate the possibilities for logics where n>2, it is all too easy to oversimplify the matter and miss possibilities that an author such as Haack doesn't cover. For instance, it *is* possible to consistently use *all* the basic laws of classical logic in a full-blown infinite-valued logic context. James J. Buckley and Esfandiar Eslami describe this in their introductory book on fuzzy sets and fuzzy logic, in a chapter on "mixed fuzzy logic".

No, I didn't say that. I didn't mean that either. Joe referred to Haack's book as his "bible" on the area. Unless I've misinterpreted him, he means that Haack's book ultimately ends up more important than other books on fuzzy logic. Haack's book is by no means comprehensive or encyclopedic; it comes from one author. How can one rationally hold a single book in such high esteem and still treat the subject fairly?

It would be one thing had Joe referred to a book with several authors, with several different points of view and/or interpretations from "the same" author, such as the Torah or the Christians' New Testament. But he didn't. It wouldn't make sense... and never really did, if anyone tried it... to refer solely to _Mathematical Principles of Natural Philosophy_ for any readers particularly interested in the topic of physics. Nor would it make sense to refer *solely* to the _Critique of Pure Reason_ for any readers interested in philosophy. Nor would one rationally suggest taking Shakespeare's literary critics as their "bible" on Shakespeare (this analogy is more apt... think about it).

Suggesting Haack on such may help someone. By all means read what she has to say. But, her book certainly deals with many more topics than just fuzzy logic, and it is after all criticism... it's NOT the actual produced work. Criticism only deepens understanding if one has actually viewed the work in question itself. On top of this, there does exist a more current work: http://www.amazon.com/Deviant-Logic-Fuzzy-Beyond-Formalism/dp/0226311341

Did I miss where you defined logical implication and logical equivalence for Kleene's strong three-valued logic? Or did you assume that one could derive them from ~A v B, or ~(A ^ ~B), or that A≡B is equivalent to 'A implies B and B implies A'?

As I take it you know, there exist many different forms of implication in multi-valued logic. All one needs to do to get one is to find a formula in classical logic equivalent to A implies B (e.g. with a logic with truth values in [0, 1] we just need a function f such that f(0, 0)=1, f(0, 1)=1, f(1, 0)=0, f(1, 1)=1), and then apply that formula in a multi-valued logic context. Do you know with absolute certainty that these formulas all come out equivalent to ~A v B or ~(A ^ ~B) (or that these two equal each other) *when extended* to Kleene's 3-valued logic? Does someone have a proof of that? It would seem like sheer coincidence to me, since the only real requirement is that f(T, T)=T, f(T, F)=F, f(F, T)=T, f(F, F)=T... even for a particular 3-valued logic like the one with the truth tables above.

Lukasiewicz used 3-valued logics in proving the independence of his 3 axioms for propositional calculus. One can also treat 3-valued (and potentially n-valued) logics axiomatically, once one has correct axioms. To do this, one just needs to find a logical "tautology" in classical logic and then check thoroughly that it also qualifies as a "tautology" in the proposed three-valued logic. More simply: if an expression in classical logic such as
{[(p->q)^(q->r)]->(p->r)}, where -> denotes the material conditional and ^ denotes logical "and", always yields 1 for all values of p, q, and r, we check whether in the proposed 3-valued logic it also yields a truth value of 1 for all values of p, q, and r. To make this easier to check, one might want to write the truth table as a relational matrix (or "table", if you insist on reserving the term "matrix" for linear algebra), treating negation as a unary operation and all the other connectives as binary operations. Unfortunately, I don't know how to write such a matrix here, if that is even possible here. Then look across all rows to try to find a rule for the relational matrix (and prove that such a rule holds by cases). One can look down all columns for more rules, and then look at the diagonals to see if any more rules hold. Once you've expressed those rules, you can use them in checking an expression, so you don't have to write out a truth table for all 3^3 = 27 possibilities.

Then, once one has correct axioms, one can prove further expressions correct using logical rules of inference. As an example, in Lukasiewicz's 3-valued logic, with C denoting the material conditional and N denoting negation: C(0, 0)=1, C(0, .5)=1, C(0, 1)=1,
C(.5, 0)=.5, C(.5, .5)=1, C(.5, 1)=1, C(1, 0)=0, C(1, .5)=.5, C(1, 1)=1; N(0)=1, N(.5)=.5, N(1)=0.
CCpqCCqrCpr and
CpCNpq are both correct axioms, since they always yield a truth value of 1.
CCNppp, however, is not.
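The exhaustive check described above is easy to brute-force. A small Python sketch, using the standard Łukasiewicz truth functions C(x, y) = min(1, 1 - x + y) and N(x) = 1 - x over the values {0, 0.5, 1}:

```python
# Brute-force tautology checking in Lukasiewicz's 3-valued logic.
from itertools import product

VALS = (0, 0.5, 1)
C = lambda x, y: min(1, 1 - x + y)  # material conditional
N = lambda x: 1 - x                 # negation

# CCpqCCqrCpr, i.e. (p->q) -> ((q->r) -> (p->r)): all 27 cases yield 1.
assert all(C(C(p, q), C(C(q, r), C(p, r))) == 1
           for p, q, r in product(VALS, repeat=3))

# CpCNpq, i.e. p -> (~p -> q): all 9 cases yield 1.
assert all(C(p, C(N(p), q)) == 1 for p, q in product(VALS, repeat=2))

# CCNppp, i.e. (~p -> p) -> p, fails at p = 0.5, where it yields 0.5.
assert C(C(N(0.5), 0.5), 0.5) == 0.5
print("axiom checks pass")
```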