Expected value is a concept from statistics that is useful for deciding how beneficial or harmful an action will be. To calculate an expected value, you need to understand each possible outcome in a situation and the probability of that outcome occurring. The steps below will guide you through a couple of example problems to teach you the concept of an expected value.

Steps

Method 1

Calculating a Basic Expected Value Problem

1

Familiarize yourself with the problem. Before thinking about all the possible outcomes and probabilities involved, make sure to understand the problem. For example, consider a die-rolling game that costs $10 per play. A 6-sided die is rolled once, and your cash winnings depend on the number rolled. Rolling a 6 wins you $30; rolling a 5 wins you $20; rolling any other number results in no payout.

2

List all the possible results. It helps to make a list of all the possible results in the given situation.[1] In the example above, there are 6 possible results. They are: (1) roll a 1 and lose $10, (2) roll a 2 and lose $10, (3) roll a 3 and lose $10, (4) roll a 4 and lose $10, (5) roll a 5 and win $10, and (6) roll a 6 and win $20.

Note that each outcome is $10 lower than described above, because you have to pay $10 to play the game no matter what result you roll.

3

Determine the probability of each outcome. In this case, the probability of each of the 6 outcomes is the same. When rolling a 6-sided die, the chance of any particular number being rolled is 1 in 6. To make this easier to write and calculate, we'll turn the fraction (1/6) into a decimal by plugging it into a calculator: 0.167. Write this probability next to each outcome, especially if you are solving a problem with a different probability for each outcome.

If you put 1/6 into a calculator, you might actually get something like 0.166667. We're rounding the decimal to 0.167 to make things easier to calculate. It's close to the actual number, so we'll still get a pretty accurate result.

If you want a very accurate result and you have a calculator with a parenthesis button, write (1/6) instead of 0.167 when you calculate the formulas below.

4

Write down the value of each result. Multiply the dollar amount of a result by the chance of that result happening to find out how many dollars that result contributes to the expected value. For instance, the result of rolling a 1 is -$10 and the chance of rolling a 1 is 0.167. The value of rolling a 1 is therefore (-10) * (0.167).

You don't need to calculate these results out yet if you have a calculator that can handle multiple operations at a time. You'll get a more accurate result if you plug in the whole equation, later on.

5

Add the value of each result together to get the expected value of the event. Continuing with the example above, the expected value of the dice game is: (-10 * .167) + (-10 * .167) + (-10 * .167) + (-10 * .167) + (10 * .167) + (20 * .167), or - $1.67. Therefore, when playing the dice game, you should expect to lose $1.67 per game played.
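The calculation above can be checked in a few lines of Python. Using the `fractions` module keeps the 1/6 probability exact, so there is no rounding error at all. (The variable names here are just illustrative.)

```python
from fractions import Fraction

# Payout for each die face, net of the $10 entry fee:
# faces 1-4 lose $10, face 5 nets +$10, face 6 nets +$20.
net_payouts = {1: -10, 2: -10, 3: -10, 4: -10, 5: 10, 6: 20}

# Each face has probability 1/6; a Fraction avoids rounding 1/6 to 0.167.
p = Fraction(1, 6)

expected_value = sum(p * payout for payout in net_payouts.values())
print(expected_value)         # exact value as a fraction
print(float(expected_value))  # the same value as a decimal
```

The exact answer is -5/3, or about -$1.67 per game, matching the rounded result above.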

6

Understand the implications of the expected value calculation. In the example above, it was determined that the expected winnings of the game were - $1.67 per roll. This is an impossible outcome for one game; you can only either lose $10, win $10, or win $20. However, the expected value is useful as a long-term average figure. If you play this dice game over and over, you will lose somewhere near $1.67 per game on average. Another way to think of expected value is assigning the game a particular cost (or benefit) of playing; you should only decide to play the game if the fun of playing is worth paying $1.67 each time.

The more times the situation is repeated, the more accurately the expected value will mirror the actual average outcome. For example, you might play the game five times in a row and lose every time, resulting in an average loss of $10. However, if you were to play the game 1,000 times or more, your average result would almost always be close to the expected value of -$1.67 per game. This principle is called the "law of large numbers."

Method 2

Calculating an Expected Number of Coin Flips for a Specific Result

1

Use this method to work out the average number of coin flips needed before a specific pattern comes up. For example, you can use it to find the expected number of coins you'd need to flip until you get two heads in a row. This problem is a little more complicated than a basic expected value problem, so read the section above this one if you're not yet comfortable with expected values.

2

Let x stand for the answer you're trying to find. Let's say you're trying to figure out how many coins you need to flip on average to get two heads in a row. We'll set up an equation to help us find this answer. The answer we're looking for, the average number of coins you have to flip, will be called x. We'll build up this equation as we go along. Right now we just have:

x = ___

3

Think about what happens if the first flip is a tails. When you try this, half the time, your first coin flip will end up tails. If this happens, you've "wasted" one flip, while your chance of flipping two heads in a row hasn't changed at all. Just like before this coin flip, you'd expect to flip an average number of additional coins before you get two heads in a row. In other words, you'd expect to flip x more coins, plus the 1 you already flipped.[2] In equation form, "half the time, you'll need to flip x coins plus 1" lets us extend our equation like this:

x = (0.5)(x+1) + ___

We'll be filling in the blank as we continue to think about other situations.

You can use fractions instead of decimals if it's easier for you. 0.5 is the same as 1/2.

4

Consider what will happen if the first flip is a heads. There's a 0.5 (or 1/2) chance that you get a heads on your first flip. This seems closer to our goal of flipping two heads in a row, but how much closer? The easiest way to figure this out is to think about what our options are for our second flip:

If our second flip is a tails, we're back to the beginning, with two flips wasted.

If our second flip is also a heads, we're done!

5

Learn how to calculate the probability of two events both happening. We know that a coin flip has a 0.5 chance of getting a heads, but what's the chance of two coin flips both coming up heads? To find the probability of two events both happening, multiply the probability of each one together. In this case, it's 0.5 x 0.5 = 0.25. This is also the chance of flipping a heads, then a tails, since they each have a 0.5 chance of happening: 0.5 x 0.5 = 0.25.
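The multiplication rule is easy to sanity-check with a quick simulation: flip two fair coins many times and count how often both come up heads. The fraction should land close to 0.5 x 0.5 = 0.25. (The seed and function name are just choices for this sketch.)

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def both_heads(n_trials):
    """Estimate the chance that two fair coin flips both come up heads."""
    hits = sum(
        random.random() < 0.5 and random.random() < 0.5
        for _ in range(n_trials)
    )
    return hits / n_trials

print(both_heads(100_000))  # should be close to 0.25
```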

6

Add the result for "heads, then tails" to the equation. Now that we know the probability of this happening, we can work on extending our equation. There's a 0.25 (or 1/4) chance that we'll waste two flips without getting anywhere. Using the same logic we used earlier, when we thought about our first coin being a tails, we'll still need x more coin flips on average to get our result, plus the 2 we already flipped. In equation form, this is (0.25)(x+2), which we can add to our equation:

x = (0.5)(x+1) + (0.25)(x+2) + ___

7

Add the result for "heads, heads" to the equation. If you flip heads, heads, with your first two coins, then you're done. You've got the result in exactly 2 flips. As we worked out before, there's a 0.25 chance of this happening, so the equation for this is (0.25)(2). Our equation is now complete:

x = (0.5)(x+1) + (0.25)(x+2) + (0.25)(2)

If you're not sure whether you've thought of every possible situation, there's an easy way to check whether the equation is complete. The first number in each "piece" of the equation represents the chance of that event happening. These will always add up to 1. Here, 0.5 + 0.25 + 0.25 = 1, so we know we've covered every situation.

8

Simplify the equation. Let's make the equation simpler by multiplying it out. Remember, if you see parentheses set up like this (0.5)(x+1), you multiply the 0.5 by each term in the second set of parentheses to get 0.5x + (0.5)(1), or 0.5x + 0.5. Let's do that for each term of the equation, then combine terms to make it as simple as possible:

x = 0.5x + (0.5)(1) + 0.25x + (0.25)(2) + (0.25)(2)

x = 0.5x + 0.5 + 0.25x + 0.5 + 0.5

x = 0.75x + 1.5

9

Solve for x. Just like any equation, when you're trying to find out what x equals, you need to get x on one side of the equation. Remember, x means "the average number of coins you have to flip to get two heads in a row." Once we find out what x equals, we have our answer.

x = 0.75x + 1.5

x - 0.75x = 0.75x + 1.5 - 0.75x

0.25x = 1.5

(0.25x)/(0.25) = (1.5)/(0.25)

x = 6

On average, you should expect to flip a coin six times before you get two heads in a row.
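You can check this answer empirically with a short simulation: repeatedly flip a fair coin until two heads appear in a row, record how many flips it took, and average over many trials. The average should land near 6. (The seed and function name are just choices for this sketch.)

```python
import random

random.seed(2)  # fixed seed so the run is reproducible

def flips_until_two_heads():
    """Flip a fair coin until two heads in a row; return the flip count."""
    flips = 0
    streak = 0
    while streak < 2:
        flips += 1
        if random.random() < 0.5:  # heads
            streak += 1
        else:                      # tails resets the streak
            streak = 0
    return flips

n_trials = 100_000
average = sum(flips_until_two_heads() for _ in range(n_trials)) / n_trials
print(average)  # settles near the expected value of 6
```

You can also verify the algebra directly: plugging x = 6 into the original equation gives (0.5)(7) + (0.25)(8) + (0.25)(2) = 3.5 + 2 + 0.5 = 6, as required.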

Method 3

Understanding the Concepts

1

Understand what an expected value is. The expected value is not necessarily the result you're most likely to get. After all, sometimes an expected value will be an impossible result, for instance, the expected value might be +$5 for a game with only a $10 prize. What the expected value calculates is how much value you should put on that event. If the game has an expected value of +$5, you should play it if you think the time and effort spent is worth getting paid $5. If a different game has an expected value of -$20, you should only play if you think the fun of playing is worth $20.

2

Understand the concept of independent events. In everyday life, many of us think we're having a lucky day if a few good things happen to us, and we might expect the day to stay lucky and more good things to happen. Alternatively, we might think that we've had enough bad luck today and more bad things won't happen for a while. From a mathematical standpoint, events don't usually work like this. If you flip an ordinary coin, there will always be exactly a 1/2 chance of getting a heads, and a 1/2 chance of getting a tails. It doesn't matter if the last twenty flips were all heads, all tails, or a mix: the next coin flip still has exactly these probabilities. The coin flip is "independent" from the other coin flips, unaffected by them.

The belief that you have a lucky or unlucky run of coin flips (or other independent, random events), or that you've "used up your bad luck" and will now have good results, is called the gambler's fallacy. This is named after people's tendency to make risky or foolish gambling decisions if they feel they are on a "lucky streak" or if they feel their "luck is about to turn."

3

Understand the law of large numbers. You might think the expected value is unhelpful, since it rarely seems to tell you the result you actually end up with. If you calculate that the expected value of a roulette game is -$1, then play three games, often you'll end up with -$10, or +$60, or some other result. The "law of large numbers" helps explain why expected value is more useful than you might think: the more games you play, the closer to the expected value (the average result) you'll get. When you're looking at large numbers of events, the total result is likely to be close to the expected value.