The clue is in the name: the Second Law of Thermodynamics is about heat, not information. The SLoT does not apply generally to information theory. There is no equivalent, in thermodynamics, of Shannon's Noisy Channel Coding Theorem: thermodynamics can be considered a sub-class of information theory, but the laws of thermodynamics do not propagate upwards to all applications of information theory.
Then why do you keep conflating them and assume that laws of thermodynamics apply to information?
You have that completely reversed: I've been telling you for weeks that you are equivocating over the term entropy, yet you keep bringing it up.
.. because it is heated above its melting point, so the average kinetic energy of the particles approaches or exceeds the bond energies.
.. because the temperature difference between it and its surroundings, along with gravity and the expansion of gases as they are heated, sets up a convection current which carries away hot air.
Nope. That Law is not about a process that can get things done; rather, it places restrictions on changes in state variables.
The Second Law is inappropriate for describing the causes of events, so I wouldn't use that term for the situations you set up.
So do you disagree with Prof. Lambert when he says, "the following are all examples of the second law: hot pans cool; water spontaneously flows down Niagara Falls; the air in our tires will blow out to the atmosphere if the tire walls are punctured; when gasoline is mixed with air in a car's cylinders, it explodes if a spark is introduced; a speeding car that hits a brick wall doesn't passively stop. There is a loud crash as the car's metal is bent and plastic and glass broken and the bricks (slightly warmed) fly all over the area. Cream put in coffee doesn't stay by itself but instead spreads throughout the coffee."?
Yorzhik said:
I can predict, with 100% accuracy, that GCT will not provide us with a better word to use, for two reasons. The first is what I already pointed out: CDists never like things to be clear. The second is that there is no better term.
You equivocate endlessly, so are you ready to answer MY questions, so as to avoid appearing to muddy the water yet again?
- Do you agree with Shannon's Noisy Channel Coding Theorem (Shannon's Theorem) that communication over a noisy channel can be managed with arbitrarily small data losses, unlike thermodynamic systems, in which entropy must rise?
- Are you aware that most mutations don't increase the information entropy, since information entropy simply measures how much information is needed to encode a message, not its meaning or usefulness?
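The point in the second question can be made concrete with a short sketch (my own illustration, not from either poster, using hypothetical toy sequences): Shannon entropy depends only on symbol frequencies, so a "meaningful" sequence and a scrambled copy of it have exactly the same entropy.

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Empirical Shannon entropy in bits per symbol."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

meaningful = "GATTACA"  # stands in for a 'functional' sequence (toy example)
scrambled = "AAACGTT"   # same letters, shuffled into nonsense

# Identical symbol frequencies give (numerically) identical entropy:
# the measure tracks coding cost, not meaning or usefulness.
print(abs(shannon_entropy(meaningful) - shannon_entropy(scrambled)) < 1e-9)
```

A substitution mutation can shift the symbol frequencies and so nudge the entropy either way, but nothing in the number tracks whether the sequence still "means" anything.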
Wow... I didn't realize you would confirm my predictions so well. You just ignored what I said completely!
Anyway, to answer your questions: Shannon says that noise enters at the transmission phase of communication. There is a lot of communication that takes place in the cell, most importantly from one generation to the next.
If it were true that mutations don't increase information entropy, then any noise entering a message would not increase information entropy. This is what prompted Weaver to say:
It is generally true that when there is noise, the received signal exhibits greater information--or better, the received signal is selected out of a more varied set than is the transmitted signal. This is a situation which beautifully illustrates the semantic trap into which one can fall if he does not remember that information is used here with a special meaning that measures freedom of choice and hence uncertainty as to what choice has been made. It is therefore possible for the word information to have either good or bad connotations. Uncertainty which arises by virtue of freedom of choice on the part of the sender is desirable uncertainty. Uncertainty which arises because of errors or because of the influence of noise is undesirable uncertainty.
It is thus clear where the joker is in saying that the received signal has more information. Some of this information is spurious and undesirable and has been introduced via the noise. To get the useful information in the received signal we must subtract out this spurious portion.
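Weaver's "subtract out this spurious portion" has an exact form in Shannon's framework: the useful information per symbol is the mutual information I(X;Y) = H(Y) - H(Y|X), where H(Y|X) is the uncertainty contributed by the noise. A minimal sketch, assuming a binary symmetric channel with a crossover probability p that I've chosen arbitrarily for illustration:

```python
from math import log2

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Binary symmetric channel with crossover probability p and a uniform
# input: the received signal has full entropy H(Y) = 1 bit, but
# H(Y|X) = H(p) of that is noise -- Weaver's "spurious" portion.
p = 0.1
H_Y = 1.0            # uniform input through a BSC stays uniform
H_Y_given_X = h2(p)  # uncertainty introduced by the noise
useful = H_Y - H_Y_given_X  # mutual information = useful bits per symbol

print(round(useful, 3))  # → 0.531
```

So the received signal does carry more entropy than the source, and the "joker" is that the excess H(Y|X) must be subtracted before calling any of it information about the message.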
And now for the questions you didn't answer: are you saying that mutations don't happen, or that, if they do happen, they don't enter at the transmission phase? Which is it?