This duality can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past but cannot control it; we may control the future but have no knowledge of it.

My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'

Scientific American (1971), volume 225, page 180; explaining why he named his uncertainty function "entropy".

Omni: Will robots be complex enough to be friends of people?

Shannon: I think so. I myself could very easily imagine that happening. I see no limit to the capabilities of machines. As microchips get smaller and faster, I can see them getting better than we are. I can visualize a time in the future when we will be to robots as dogs are to humans.
[...]

Omni: Do you find it depressing that chess computers are getting so strong?

Shannon: I am not depressed by it. I am rooting for the machines! I have always been on the machines' side. Ha-ha!

Omni Magazine (1987). Sometimes quoted as: "I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines."