``the first positive integer that cannot be specified in less than a billion words''

instead? Everything has a rather different flavor. Let's see why. The first problem we've got here is: what does it mean to specify a number using words in English? This is very vague. So instead let's use a computer. Pick a standard general-purpose computer, in other words, pick a universal Turing machine (UTM). Now the way you specify a number is with a computer program. When you run this computer program on your UTM, it prints out this number and halts. So a program is said to specify a number, a positive integer, if, when you start the program running on your standard UTM, after a finite amount of time it prints out one and only one great big positive integer, says ``I'm finished,'' and halts.
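As a rough sketch of the idea (the function name and the particular number are illustrative, not from the source), here is a "program that specifies a number" in the sense above: it prints exactly one positive integer and halts, and its own length in bits bounds how concisely that number can be described.

```python
# Hypothetical illustration of "a program specifies a number":
# run it, it prints one and only one positive integer, then halts.

def specify() -> int:
    # A very large integer with a very short description.
    return 10**100

if __name__ == "__main__":
    n = specify()
    print(n)  # the one and only output; then the program halts
```

The length of this short program is an upper bound on the descriptive (Kolmogorov) complexity of 10^100, even though the number itself has 101 digits.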

Bernoulli's Principle: They appeal to searches with “links” in the optimization space and smoothness constraints that enable “hill-climbing” optimization [32]. Prior knowledge about the smoothness of a search landscape, required for gradient-based hill-climbing, is not only common but is also vital to the success of some search optimizations.
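To make the smoothness point concrete, here is a minimal sketch (the quadratic landscape and step size are my own illustrative choices, not from the source): gradient-based hill-climbing works here precisely because neighboring points carry information about where the peak is.

```python
# Sketch: hill-climbing on a SMOOTH landscape. Each small step ("link")
# yields gradient information that points toward the single peak.

def fitness(x: float) -> float:
    return -(x - 3.0) ** 2  # smooth, one peak at x = 3

x, step = 0.0, 0.01
for _ in range(10000):
    # central finite-difference estimate of the gradient
    grad = (fitness(x + 1e-6) - fitness(x - 1e-6)) / 2e-6
    x += step * grad  # move uphill

print(round(x, 3))  # converges to the optimum near 3.0
```

The climb succeeds only because the prior assumption of smoothness holds; the next excerpt describes landscapes where it does not.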

Such procedures, however, are of little use when searching for a sequence of, say, 7 letters from a 26-letter alphabet to form a word that will pass a spell checker, or when choosing a sequence of commands from 26 available commands to generate a logic operation such as XNOR [46]. The ability of a search procedure to work better than average on a class of problems is not prohibited by COI.

http://schneider.ncifcrf.gov/paper/ev/dembski/specified.complexity.html
He shows that pure random chance cannot create information, and he shows that a simple smooth function (such as y = x^2) cannot gain information. (Information can be lost by a function that cannot be mapped back uniquely: y = sin(x).) He concludes that there must be a designer to obtain CSI. However, natural selection has a branching (a mapping from one to many: replication) followed by pruning (a mapping of the many back down to a few: selection).
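The invertibility point can be illustrated with a small sketch (my own toy inputs, not from the source): a one-to-one function keeps distinct inputs distinguishable, while a many-to-one function such as y = x^2 collapses them, and the collapsed distinctions cannot be recovered.

```python
# One-to-one vs many-to-one mappings over a few sample inputs.
inputs = [-2, -1, 0, 1, 2]

one_to_one = {x: 2 * x for x in inputs}   # invertible: y = 2x
many_to_one = {x: x * x for x in inputs}  # not invertible: y = x^2

# y = 2x keeps all 5 inputs distinguishable; y = x^2 merges
# -1 with +1 and -2 with +2, losing the sign information.
print(len(set(one_to_one.values())))   # 5 distinct outputs
print(len(set(many_to_one.values())))  # 3 distinct outputs
```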

http://schneider.ncifcrf.gov/pitfalls.html
Treating uncertainty (H) and entropy (S) as identical, or treating them as completely unrelated. The former philosophy is clearly incorrect because uncertainty has units of bits per symbol while entropy has units of joules per kelvin. The latter philosophy is overcome by noting that the two can be related if one can correlate the probabilities of the microstates of the system under consideration with the probabilities of the symbols.
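A quick sketch of the units point (the conversion below is the standard Boltzmann relation, not something specific to the source): when symbol probabilities can be identified with microstate probabilities, an uncertainty H in bits converts to a thermodynamic entropy via the factor k_B ln 2.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def bits_to_joules_per_kelvin(h_bits: float) -> float:
    # S = k_B * ln(2) * H, valid when the symbol probabilities
    # are identified with the microstate probabilities.
    return K_B * math.log(2) * h_bits

# One bit of uncertainty corresponds to roughly 9.57e-24 J/K.
print(bits_to_joules_per_kelvin(1.0))
```

The tiny size of this factor is why everyday information quantities correspond to negligible thermodynamic entropies, which is exactly why the two are so easily treated as unrelated.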

Claim of identical: William Dembski, in the book No Free Lunch, stated that the two forms are mathematically identical (page 131).

-----------------------------------------------
http://schneider.ncifcrf.gov/paper/ev/dembski/specified.complexity.html
The random number generator used in ev is a deterministic function, yet the ev program clearly shows an increase in the information (as defined by Shannon) in the binding sites. (In other words, all the complex discussion and mathematics that Dr. Popescru puts out is a smoke screen that covers the simple situation at hand.) There is also the point from thermodynamics that information can be gained in a system (i.e., the entropy can go down) so long as there is at least a minimum compensating increase in the entropy outside the system.
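The replication-plus-selection mechanism can be sketched in miniature (this is a toy model with a made-up target pattern and parameters, not the actual ev program): a fully deterministic seeded "random" generator drives branching (one to many) and pruning (many to few), and the population's match to the target still grows.

```python
import random

rng = random.Random(42)             # deterministic pseudo-random source
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical binding-site pattern

def score(genome):
    # Bits of the genome that match the target pattern.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # Replication with one deterministic-pseudo-random bit flip.
    i = rng.randrange(len(genome))
    return genome[:i] + [1 - genome[i]] + genome[i + 1:]

population = [[rng.randint(0, 1) for _ in TARGET] for _ in range(16)]
initial_best = max(score(g) for g in population)

for generation in range(50):
    population.sort(key=score, reverse=True)
    survivors = population[:8]                               # pruning: many -> few
    population = survivors + [mutate(s) for s in survivors]  # branching: few -> many

final_best = max(score(g) for g in population)
# The best match never decreases (survivors are kept), and it
# typically climbs to the full 8 bits within a few generations.
print(initial_best, final_best)
```

No information is smuggled in through randomness here; the gain comes from the replicate-and-select structure, which is the simple situation the excerpt says the mathematics obscures.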