A terminal value is an ultimate goal, an end-in-itself. In an AI with a utility or reward function, the terminal value is the maximization of that function. (In some of Eliezer Yudkowsky's earlier writings, the non-standard term "supergoal" is used.)

Terminal values vs. instrumental values

Terminal values stand in contrast to instrumental values, which are means-to-an-end, mere tools in achieving terminal values. For example, if a given university student does not enjoy studying but is doing so merely as a professional qualification, we can describe his terminal value as getting a job; getting good grades is an instrument to that end.

Some values may be called "terminal" merely in relation to an instrumental goal, yet themselves serve instrumentally towards a higher goal. In the previous example, the student may want the job to gain social status and money; if he could get prestige and money without working, he would, and in that case the job is instrumental to these other values. However, in discussions of future AI, the term "terminal value" is generally reserved for the top of the goal hierarchy: the true ultimate goals of a system, those which do not serve any higher value.
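The goal hierarchy above can be sketched in code. In this illustrative Python toy (all names and numbers are hypothetical, not taken from any real agent design), the terminal value is a single utility function over world states; instrumental options like grades or a job have no worth of their own and are ranked purely by the utility of the state they are expected to produce:

```python
def utility(world_state):
    # Terminal value: the agent ultimately cares only about this number.
    # (Illustrative weights: status is assumed worth twice as much as money.)
    return world_state.get("money", 0) + 2 * world_state.get("status", 0)

def expected_utility(world_state, action_effects):
    # Instrumental value of an option = utility of the state it leads to.
    new_state = dict(world_state)
    for key, delta in action_effects.items():
        new_state[key] = new_state.get(key, 0) + delta
    return utility(new_state)

def choose_action(world_state, actions):
    # Pick whichever option best serves the terminal value.
    return max(actions, key=lambda name: expected_utility(world_state, actions[name]))

state = {"money": 0, "status": 0}
actions = {
    "study_for_grades": {"money": 1},                    # grades -> job -> money
    "get_job_directly": {"money": 2},                    # job without the degree
    "gain_prestige_without_work": {"money": 2, "status": 3},
}
print(choose_action(state, actions))  # -> gain_prestige_without_work
```

As in the student example, if an option delivers the same money and more status without the job, the agent takes it: the job was only ever instrumental, and the utility function alone is terminal.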