You might profit from reading Bill Dembski's article "The Chance of the Gaps" (the name is a takeoff on the oft-used "God of the Gaps"). Here is an interesting portion.
------------------
2. Universal Probability Bounds
In the observable universe, probabilistic resources come in very limited supplies. Within the known physical universe there are estimated to be around 10^80 elementary particles. Moreover, the properties of matter are such that transitions from one physical state to another cannot occur at a rate faster than 10^45 times per second. This frequency corresponds to the Planck time, which constitutes the smallest physically meaningful unit of time.7 Finally, the universe itself is about a billion times younger than 10^25 seconds (assuming the universe is between ten and twenty billion years old). If we now assume that any specification of an event within the known physical universe requires at least one elementary particle to specify it and cannot be generated any faster than the Planck time, then these cosmological constraints imply that the total number of specified events throughout cosmic history cannot exceed
10^80 x 10^45 x 10^25 = 10^150.
It follows that any specified event of probability less than 1 in 10^150 will remain improbable even after all conceivable probabilistic resources from the observable universe have been factored in. A probability of 1 in 10^150 is therefore a universal probability bound.8 A universal probability bound is impervious to all available probabilistic resources that may be brought against it. Indeed, all the probabilistic resources in the known physical world cannot conspire to render remotely probable an event whose probability is less than this universal probability bound.
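To make the arithmetic concrete, here is a quick Python sketch (my own illustration, not part of Dembski's article) that reproduces the 10^150 figure and shows why an event of probability below 1 in 10^150 is not expected to occur even once, however the universe's probabilistic resources are spent:

```python
from fractions import Fraction

# The three cosmological inputs from the passage above.
particles = 10**80               # elementary particles in the observable universe
transitions_per_second = 10**45  # fastest rate of state transitions (Planck time)
seconds = 10**25                 # generous upper bound on the age of the universe

max_specified_events = particles * transitions_per_second * seconds
assert max_specified_events == 10**150

# An event of probability p, given at most N opportunities, is expected to occur
# about p * N times.  For any p below 1 in 10^150 that expectation stays below 1.
p = Fraction(1, 10**151)         # a hypothetical probability just under the bound
print(p * max_specified_events < 1)   # True: not expected to occur even once
```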
The universal probability bound of 1 in 10^150 is the most conservative in the literature. The French mathematician Emile Borel proposed 1 in 10^50 as a universal probability bound below which chance could definitively be precluded (i.e., any specified event as improbable as this could never be attributed to chance).9 Cryptographers assess the security of cryptosystems against a brute force attack that employs as many probabilistic resources as are available in the universe to break a cryptosystem by chance. In its report on the role of cryptography in securing the information society, the National Research Council set 1 in 10^94 as its universal probability bound for ensuring the security of cryptosystems against chance-based attacks.10 Such levels of improbability are easily attained by real physical systems. It follows that if such systems are also specified and if specified complexity is a reliable empirical marker of intelligence, then these systems are designed.
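For a rough sense of scale, these probability bounds can be restated as key lengths; the conversion below is my own back-of-the-envelope illustration, not something taken from the article or the NRC report:

```python
import math

# A uniformly random n-bit key is guessed by chance with probability 2^-n,
# so a bound of 1 in 10^k corresponds to roughly k * log2(10) bits of key.
for name, k in [("Borel", 50), ("NRC cryptography report", 94), ("Dembski", 150)]:
    bits = k * math.log2(10)
    print(f"1 in 10^{k:<3} ({name}): about {bits:.0f} bits")
# Prints roughly 166, 312, and 498 bits respectively.
```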
Implicit in a universal probability bound such as 10^-150 is that the universe is too small a place to generate specified complexity by sheer exhaustion of possibilities. Stuart Kauffman develops this theme at length in his book Investigations.11 In one of his examples (and there are many like it throughout the book), he considers the number of possible proteins of length 200 (i.e., 20^200 or approximately 10^260) and the maximum number of pairwise collisions of particles throughout the history of the universe (he estimates 10^193 total collisions supposing the reaction rate for collisions can be measured in femtoseconds). Kauffman concludes: “The known universe has not had time since the big bang to create all possible proteins of length 200 [even] once.”12 To emphasize this point, he notes: “It would take at least 10 to the 67th times the current lifetime of the universe for the universe to manage to make all possible proteins of length 200 at least once.”13
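Kauffman's figures are easy to check; the following sketch (mine, working in powers of ten and taking his 10^193 collision estimate at face value) recovers both the 10^260 and the 10^67 numbers quoted above:

```python
import math

amino_acids = 20
protein_length = 200

# Number of possible proteins of length 200, as a power of ten.
log10_proteins = protein_length * math.log10(amino_acids)   # about 260.2

# Kauffman's estimate of pairwise particle collisions over cosmic history.
log10_collisions = 193

# How many reruns of the universe's history would be needed to sample
# every length-200 protein at least once.
log10_shortfall = log10_proteins - log10_collisions          # about 67.2

print(round(log10_proteins), round(log10_shortfall))         # 260 67
```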
Kauffman even has a name for numbers that are so big that they are beyond the reach of operations performable by and within the universe—he refers to them as transfinite. For instance, in discussing a small discrete dynamical system whose dynamics are nonetheless so complicated that they cannot be computed, he writes: “There is a sense in which the computations are transfinite—not infinite, but so vastly large that they cannot be carried out by any computational system in the universe.”14 Kauffman justifies such proscriptive claims in exactly the same terms that I justified the universal probability bound a moment ago. Thus as justification he looks to the Planck time, the Planck length, the radius of the universe, the number of particles in the universe, and the rate at which particles can change states.15 Kauffman’s idea of transfinite numbers is insightful, but the actual term is infelicitous because it already has currency within mathematics, where transfinite numbers are by definition infinite (in fact, the transfinite numbers of transfinite arithmetic can assume any infinite cardinality whatsoever).16 I therefore propose to call such numbers hyperfinite numbers.17
Kauffman often writes about the universe being unable to exhaust some set of possibilities. Yet at other times he puts an adjective in front of the word universe, claiming it is the known universe that is unable to exhaust some set of possibilities.18 Is there a difference between the universe (no adjective in front) and the known or observable universe (adjective in front)? To be sure, there is no empirical difference. Our best scientific observations tell us that the world surrounding us appears quite limited. Indeed, the size, duration, and composition of the known universe are such that 10^150 is a hyperfinite number. For instance, if the universe were a giant computer, it could perform no more than this number of operations (quantum computation, by exploiting superposition of quantum states, enriches the operations performable by an ordinary computer but cannot change their number); if the universe were devoted entirely to generating specifications, this number would set an upper bound; if cryptographers confine themselves to brute-force methods on ordinary computers to test cryptographic keys, the number of keys they can test will always be less than this number.
But what if the universe is in fact much bigger than the known universe? What if the known universe is but an infinitesimal speck within the actual universe? Alternatively, what if the known universe is but one of many possible universes, each of which is as real as the known universe but causally inaccessible to it? If so, are not the probabilistic resources needed to eliminate chance vastly increased, and is not the validity of 10^–150 as a universal probability bound thrown into question? This line of reasoning has gained widespread currency among scientists and philosophers in recent years. In this paper I will argue that this line of reasoning is fatally flawed. Indeed, I will argue that it is illegitimate to rescue chance by invoking probabilistic resources from outside the known universe. To do so artificially inflates one’s probabilistic resources.
3. The Inflationary Fallacy
Only probabilistic resources from the known universe may legitimately be employed in testing chance hypotheses. In particular, probabilistic resources imported from outside the known universe are incapable of overturning the universal probability bound of 10^–150. My basic argument to support this claim is quite simple, though I need to tailor it to some of the specific proposals now current for inflating probabilistic resources. The basic argument is this: It is never enough to postulate probabilistic resources merely to prop up an otherwise failing chance hypothesis. Rather, one needs independent evidence that there really are enough probabilistic resources to render chance plausible.
Consider, for instance, a state lottery. Suppose we know nothing about the number of lottery tickets sold and are informed simply that the lottery had a winner. Suppose further that the probability of any lottery ticket producing a winner is extremely low.
What can we conclude? Does it follow that many lottery tickets were sold? Hardly.
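The dependence on the number of tickets is easy to see with a toy calculation (the per-ticket probability and ticket counts below are hypothetical, chosen only for illustration):

```python
# With per-ticket win probability p and n independently drawn tickets, the
# probability that the lottery produces at least one winner is 1 - (1 - p)^n.
p = 1e-8   # hypothetical per-ticket probability of winning

for n in (10**3, 10**6, 10**9):
    at_least_one_winner = 1 - (1 - p) ** n
    print(f"{n:>13,} tickets sold -> P(some winner) ~ {at_least_one_winner:.3f}")

# A winner is nearly certain if enough tickets were sold and very surprising if
# few were; being told only that "the lottery had a winner" does not, by itself,
# settle how many tickets were sold.
```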