In analysis today, we found the limits of several sequences, all of which relied on the fact that lim (1+1/n)^n = e (yes, that number e). Are you surprised by this result? I am!

The way my brain works, I see this limit as a competition between the n in the parentheses and the n in the exponent. Which of the two wins the race to infinity? If the parenthetical n wins, you'd expect the limit of the whole expression to be 1, because the stuff in the parentheses goes to 1 and 1^n is 1. On the other hand, if the exponential n wins, the expression should diverge to infinity (something bigger than 1, even if it's just barely bigger, raised to ever larger powers goes to infinity).

Actually, this is what I see when I plug big values of n into the limit expression on my TI-89. Plug in n=1,000,000,000 and you get the answer 1. Now keep the n in the exponent the same and plug n=1,000,000 into the parentheses only: you get a little infinity symbol. (BTW, does anyone know how to tell the calculator to take the limit as n goes to infinity? I've only figured out how to take limits at finite values.)

So intuition failed me. Somehow the parenthetical n and the exponential n score a tie in their race to infinity, compromise, and the result is the number e! It's not clear why this should be. Indeed, plug smaller values into your calculator (like n=1000) and you get something approximating what we know e to be (2.7169 in this case).
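A quick numerical sketch of this race (in Python rather than on the TI-89; the helper name `compound` is mine) shows both the convergence and the round-off artifact that makes the calculator report 1 for huge n:

```python
import math

def compound(n):
    """Evaluate (1 + 1/n)**n in ordinary double precision."""
    return (1 + 1 / n) ** n

for k in (1, 3, 6, 9):
    n = 10 ** k
    print(f"n = 10^{k:<2d}  (1 + 1/n)^n = {compound(n):.10f}")

print("limit value e =", math.e)

# Once 1/n drops below machine epsilon, 1 + 1/n rounds to exactly 1,
# so the whole expression collapses to 1 -- the calculator's "answer 1"
# is a precision artifact, not the true limit.
print("n = 10^17 ->", compound(10 ** 17))  # prints 1.0
```

The "1" the TI-89 returns for n = 1,000,000,000 is presumably the same effect at the calculator's (lower) working precision, not evidence that the parenthetical n won the race.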

From the binomial theorem you can show that this expression is always increasing and always less than 3 (or 2.8, or 2.72, or 2.719, or 2.7183, or... you get the picture). Intuition (which I've learned to keep a close eye on!) leads to the conclusion that as n gets bigger and bigger, the expression must get closer and closer to some number less than 3 (but greater than, at least, 1). There's actually a result in analysis, the monotone convergence theorem, that says bounded sequences (i.e. the terms never get bigger than 3 and never get smaller than 1) which are also increasing (i.e. each term is bigger than the previous one) always have a limit.
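Here's a spot check of that bounded-and-increasing claim over the first thousand terms. It's only a sketch, not a proof (the binomial-theorem argument is what establishes it for all n), but it's reassuring:

```python
# The sequence a_n = (1 + 1/n)^n for n = 1 .. 1000.
terms = [(1 + 1 / n) ** n for n in range(1, 1001)]

# Increasing: each term strictly exceeds the one before it.
assert all(a < b for a, b in zip(terms, terms[1:]))

# Bounded: every term sits strictly between 1 and 3.
assert all(1 < t < 3 for t in terms)

print("first term:", terms[0])    # (1 + 1/1)^1 = 2.0
print("last term :", terms[-1])   # about 2.7169, creeping toward e
```

Bounded above and monotonically increasing, so by the theorem the limit exists; the remaining question is what the limit is.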

Ok, but what is this number the sequence gets closer and closer to? Most people just say that e is defined by this limit, but some sites mention a proof, by Euler, that this value is the same as the famous Taylor expansion (i.e. e = 1 + 1 + 1/2 + 1/6 + 1/24 + ..., the sum of the reciprocals of the factorials). I can't seem to find the actual proof, though.
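The series itself is easy to check numerically, though (a sketch, not Euler's proof; `e_by_series` is my own name): partial sums of the reciprocal factorials home in on the same value the limit does.

```python
import math

def e_by_series(n_terms):
    """Partial sum 1/0! + 1/1! + ... + 1/(n_terms - 1)!"""
    total, term = 0.0, 1.0
    for k in range(1, n_terms + 1):
        total += term
        term /= k        # turns 1/(k-1)! into 1/k!
    return total

print(e_by_series(5))    # 1 + 1 + 1/2 + 1/6 + 1/24 = 2.708333...
print(e_by_series(20))   # agrees with math.e to about 15 digits
print(math.e)
```

Twenty terms already exhaust double precision, because the factorials grow so fast; compare that with the limit expression, where n = 1000 still only gives 2.7169.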

Take a look at e's first 100,000 digits (calculated in Mathematica with N[E, 100000]). In this string of digits, the digit with the longest run of repeats is 0, with 6 consecutive appearances. 1234 is the longest string of consecutive ascending digits starting with 1; it shows up 10 times, and you can find two instances of 98765. Here's the first million digits... and the first 5 million digits.
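If you don't have Mathematica handy, here's a sketch using Python's standard decimal module: it generates e's digits from the same factorial series and scans for the longest run of a repeated digit. The function names are mine, and I haven't rerun the 100,000-digit search myself, but cranking n up should let you reproduce the run-of-six-zeros claim.

```python
from decimal import Decimal, getcontext

def e_digits(n):
    """Approximate e to n significant digits by summing 1/k! with guard digits."""
    getcontext().prec = n + 10
    total, term, k = Decimal(1), Decimal(1), 1
    threshold = Decimal(10) ** -(n + 5)
    while term > threshold:          # stop once terms can't affect n digits
        term /= k
        total += term
        k += 1
    getcontext().prec = n
    return +total                    # unary plus rounds to n digits

def longest_run(s):
    """Return (character, length) of the longest block of one repeated character."""
    best_ch, best_len, cur = s[0], 1, 1
    for a, b in zip(s, s[1:]):
        cur = cur + 1 if a == b else 1
        if cur > best_len:
            best_ch, best_len = b, cur
    return best_ch, best_len

d = str(e_digits(1000))
print("e starts:", d[:20])
print("longest run in the first 1000 digits:", longest_run(d.replace(".", "")))
```

This is far slower than Mathematica's arbitrary-precision machinery, but it's entirely standard library and plenty fast for a few thousand digits.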

Well, among the search results was this article by Fabio Rojas, a frequent guest blogger at Marginal Revolution. He points to an article in which Mr. Landsburg argues that, economically speaking, authors of computer viruses cause significantly more damage than a murderer does. Thus, that we execute murderers implies that we should execute such hackers.

Of course this is ridiculous, but why should it be? Prof. Rojas makes the point that people must be "hard wired to care about concentrated damages rather than diffuse damages," a psychological argument to explain away 'irrational' behavior.

This reminded me of an article today by Prof. DeLong about price gouging. Economically, price gouging in times of spiked demand makes sense because it gives suppliers an incentive to prepare for such shortages.

"Ridiculous!" crys the non-economists. As Prof. DeLong points out:

Now out in the real world it is fair to say that nobody who has not been brainwashed by the graduate economics core would look upon such a pattern of prices with equanimity--not in electricity, not in airline tickets, not in railroad freight traffic, not in port charges, not in cargo shipping, not in vaccines, not in plywood in the week after a hurricane.

Prof. DeLong goes on to say:

...people seem to value a stable environment, or a smoothness of prices, or something even when the result is substantial allocative inefficiencies when the shortage comes and we then ration haphazardly by luck rather than efficiently by price. People strike long-term contracts that are clearly inefficient--that specify price but leave quantities to be delivered at the discretion of the purchaser.

...another psychological argument to dispense with an otherwise sound economic argument. It seems to me that a psychological preference (are these preferences?) for diffuse costs (i.e. smooth prices across space) must be related to the psychological preference for stable prices (i.e. smooth prices over time).

The article is about the earned income tax credit and the minimum wage. Prof. DeLong argues for a balanced approach of income redistribution (a little with the EITC and a little with the minimum wage).

Why do I bring all this up? Well, this amazing mix of technologies (blogs, RSS feeds, Google, and Outlook) came together seamlessly to augment a conversation my Dad and I had the other day about the EITC (I just forwarded him, via email, the DeLong piece).