2 Answers

If you add a truly random character at a truly random position of a word (both chosen uniformly), you add roughly "entropy of position" + "entropy of character" to the entropy of the word. (Not exactly; it's a bit less.) The entropy of the character depends on the number of possible characters: 64 possible characters would be $\log_2(64) = 6$ bits of entropy. The entropy of the position comes from the number of possible positions where the character can be placed, which is the length of the word plus 1. A word of 3 characters gives 4 positions: "foo" has 4 possibilities, "?foo", "f?oo", "fo?o" and "foo?".

Example: You have the word "hello". Now you want to insert one character out of $26+26+10 = 62$ possibilities (lowercase and uppercase letters plus digits) into the word. Possible outcomes: "h9ello", "helalo", "heXllo", ...
Entropy of "hello": Let say 0, because it is a constant value in this example. Entropy of character: $log2(62) = 5.95$ bits. Entropy of position: $log2(4+1) = 2.32$ bits. Whole entropy: 8.27 bits.
Problem: That's not entirely true, because you lose a little entropy to a specific effect. Look at the example "hello": you could insert an "e" directly after the existing "e", but also directly before it. These two insertions cannot be distinguished; both produce "heello", so that's one possibility less than we calculated. The same happens at every letter of the word. Words like "hhhhh" are even worse, because whenever the random character happens to be an "h" you can't tell where in the word it was inserted.
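
A small sketch (not from the original answer; it assumes the 62-character set from the example and uniform choices) comparing the naive "character + position" estimate with the Shannon entropy of the actual outcome, which is slightly lower because of the collisions described above:

```python
from collections import Counter
from math import log2
import string

CHARSET = string.ascii_lowercase + string.ascii_uppercase + string.digits  # 62 characters

def insertion_entropy(word):
    # Every (character, position) pair is equally likely: 62 * (len(word) + 1) outcomes.
    outcomes = Counter(word[:i] + c + word[i:]
                       for c in CHARSET
                       for i in range(len(word) + 1))
    total = len(CHARSET) * (len(word) + 1)
    naive = log2(total)                  # "entropy of character" + "entropy of position"
    actual = -sum(n / total * log2(n / total) for n in outcomes.values())
    return naive, actual

print(insertion_entropy("hello"))   # ~(8.54, 8.51): slightly less than the naive estimate
print(insertion_entropy("hhhhh"))   # same naive value, but an even lower actual entropy
```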

It is also worth pointing out that this entropy estimation assumes that your random choice of a character to insert is uniformly random, i.e. you have no preference for any particular letter or uppercase/lowercase/digit distinction. Any deviation, however unintentional, will lower the actual entropy further. For example, if you find yourself choosing "X" in one out of eight cases, that's only 3 instead of 5.95 bits.
– pyramids Jul 14 '14 at 8:27
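
As a quick illustration of that comment (a sketch with an assumed bias, not taken from the comment itself): the Shannon entropy of a character choice where "X" is picked in one out of eight cases and the other 61 characters uniformly share the rest:

```python
from math import log2

# Hypothetical bias: "X" is chosen with probability 1/8, the remaining
# 61 characters uniformly share the other 7/8.
p_x = 1 / 8
p_rest = (1 - p_x) / 61

biased = -(p_x * log2(p_x) + 61 * p_rest * log2(p_rest))
uniform = log2(62)

print(f"uniform: {uniform:.2f} bits, biased: {biased:.2f} bits")  # ~5.95 vs ~5.73
```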

By that statement, does that mean adding a random character to a random position in a Diceware word adds 10 bits to each word?

No. The ten bit estimate is for adding a random symbol from the 36-item table to a random position in the passphrase. The entropy in the character choice is about five bits and the entropy in the choice of position is another almost five. The exact number depends on passphrase length, but six-word passphrases are ~30 characters long on average.

If you add a symbol at random to each word, you lose more than half the "position" entropy, and get ~7 bits per added symbol.
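
A rough sketch of both estimates (the ~30-character passphrase length and ~4-letter average word length are assumptions taken from the text above):

```python
from math import log2

symbol_bits = log2(36)          # random symbol from the 36-item table: ~5.17 bits

# One symbol at a random position anywhere in a ~30-character passphrase
# (31 possible insertion points):
print(symbol_bits + log2(31))   # ~10.1 bits

# One symbol at a random position inside each word (~4-letter word -> 5 positions):
print(symbol_bits + log2(5))    # ~7.5 bits per added symbol (shorter words give closer to 7)
```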

(The entropy loss from duplicates that Nova mentions in the other answer is minimal, because most diceware words are primarily letters, while the added characters are digits or symbols.)

What would be the entropy per Diceware word if a random printable ASCII character is used instead?

The entropy in a printable ASCII character is about 6.5 bits, so you could only gain 1.5 bits if you did that (so about 11.5 bits in total). Here there would be a loss from duplicates, as well as a more serious loss from being able to accidentally get another Diceware word (e.g. woo + d -> wood). Further, it would be more difficult to generate when using dice.
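
A sketch of that comparison, assuming 95 printable ASCII characters and the same ~30-character passphrase as above:

```python
from math import log2

ascii_bits = log2(95)            # 95 printable ASCII characters: ~6.57 bits
table_bits = log2(36)            # the 36-item symbol table: ~5.17 bits

print(ascii_bits - table_bits)   # ~1.4 bits gained by the larger character set
print(ascii_bits + log2(31))     # ~11.5 bits in total, anywhere in a ~30-char passphrase
```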

What would be the entropy per Diceware word if a random Diceware word is used as the random symbol in a random position per Diceware word?

Well, the entropy in a Diceware word is 12.9 bits, so you'd gain ~18 bits from adding one randomly anywhere in the passphrase, or ~15 bits per word from adding one to each word. However, that could again form another Diceware passphrase (e.g. yam + aha -> yamaha), so the actual entropy added would be slightly less.
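
A sketch of those two numbers, assuming the standard 7776-word Diceware list and the same position estimates as above:

```python
from math import log2

word_bits = log2(6 ** 5)         # one Diceware word: 7776 possibilities, ~12.9 bits

# Inserted at a random character position anywhere in a ~30-character passphrase:
print(word_bits + log2(31))      # ~17.9 bits gained

# Inserted at a random position inside each word (~4-letter word -> 5 positions):
print(word_bits + log2(5))       # ~15.2 bits per word
```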

Note: This problem also exists when creating a normal Diceware passphrase, which is why they recommend always using a space as a separator between words.