Yep. That's what it's called when someone abandons reasoning and goes for ridicule in an argument. Your disparaging reference is an attempt to dodge the direction our discussion was taking. Better yet, just show us a geneticist who says there's such a thing as "devolution." Show us someone who actually understands genetics who says so. You see, perhaps, that a diversion doesn't get you out of the fix you're in.
As you learned, things do indeed devolve. Perhaps when your rage subsides over your ignorance being exposed, you'll be able to admit the facts. :up:
We showed you plenty of evidence. No point in denial. You'll just have to come to terms with the facts.
Shannon's channel capacity theorem only applies to living organisms and their products, such as communications channels and molecular machines that make choices from several possibilities. Information theory is therefore a theory about biology, and Shannon was a biologist.
As you learned, this does nothing to change the theory. You can't call Shannon information meaningful.
Turns out, his theory is actually inapplicable to what evolutionists want it to do. His ideas have applications far beyond communications, but we don't expect Darwinists to appreciate the details when they get the basics so wrong.
Yes. As you learned, Shannon's theory is all about being able to eliminate noise in the messages we receive. It's how NASA communicates over huge distances with low-powered transmitters, and it's how the Internet works. Two things that most definitely did not arise from random mutations and natural selection (to go along with everything else).
It's all about reducing noise (a toy sketch follows below).
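To see what "reducing noise" looks like in practice, here's a minimal sketch (purely illustrative, not any real deep-space or Internet code) of the crude pre-Shannon approach: a 3-fold repetition code over a binary symmetric channel that flips each bit with probability p.

```python
# Toy illustration: trading redundancy for reliability with a 3-fold
# repetition code over a binary symmetric channel (each bit flips with
# probability p). Not anyone's actual flight code.
import random

def send_through_bsc(bits, p):
    """Flip each bit independently with probability p (channel noise)."""
    return [b ^ (random.random() < p) for b in bits]

def encode_repetition(bits, n=3):
    """Repeat every bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(bits, n=3):
    """Majority vote over each group of n received bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

random.seed(0)
p = 0.1                     # 10% of raw bits arrive corrupted
message = [random.randint(0, 1) for _ in range(10_000)]

raw = send_through_bsc(message, p)
coded = decode_repetition(send_through_bsc(encode_repetition(message), p))

print("uncoded error rate:", sum(a != b for a, b in zip(message, raw)) / len(message))
print("coded error rate:  ", sum(a != b for a, b in zip(message, coded)) / len(message))
# Uncoded is about 0.10; majority voting drops it to about 3p^2 - 2p^3, or 0.028.
```

Majority voting cuts the error rate from p to roughly 3p^2, but it also cuts the transmission rate to a third, and driving the error rate toward zero this way drives the rate toward zero too. The passage quoted next explains why Shannon showed that trade-off isn't forced on us.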
In the earlier days of long-distance communication this was indeed what people thought: when you're dealing with a noisy channel, you have no choice but to trade your error rate for redundancy. In 1948, however, the mathematician Claude Shannon proved them wrong. In his ground-breaking paper A Mathematical Theory of Communication, Shannon showed that given any error rate, no matter how small, it's possible to find a code that achieves that tiny error rate while transmitting messages at a rate limited only by the channel's capacity. This result is known as Shannon's noisy-channel coding theorem. It shows that near-error-free transmission doesn't have to come at the cost of near-zero efficiency.
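To put a number on "the channel's capacity": for the binary symmetric channel in the sketch above, the standard formula is C = 1 - H2(p) bits per channel use, where H2 is the binary entropy function. A small sketch:

```python
# Capacity of a binary symmetric channel with crossover probability p:
# C = 1 - H2(p), where H2 is the binary entropy function. Shannon's
# theorem says any rate below C is achievable with arbitrarily small
# error probability.
from math import log2

def binary_entropy(p):
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.01, 0.1, 0.25):
    print(f"p = {p}: capacity = {1 - binary_entropy(p):.3f} bits per channel use")
```

So even a channel that corrupts 10% of its bits can, per Shannon, carry about 0.531 error-free message bits per transmitted bit, given a clever enough code, far better than naive repetition.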
What Shannon showed is that once the entropy of the source is below a critical value, namely the channel's capacity, a code with an arbitrarily small rate of decoding error can always be found. That is, he showed how the receiver can eliminate the noise and recover the message. And it is the meaning of that message that Shannon said nothing about, and that you want nothing to do with.
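And here is "the receiver eliminating the noise" in miniature: a sketch of the classic Hamming(7,4) code, which spends three parity bits per four data bits and lets the receiver locate and correct any single flipped bit. Note what it recovers: the message bits, not their meaning.

```python
# Minimal Hamming(7,4) codec: corrects any single flipped bit.
# Bit positions are 1..7; positions 1, 2, 4 hold parity, the rest data.
def hamming74_encode(d):                 # d = [d1, d2, d3, d4]
    c = [0] * 8                          # index 0 unused, positions 1..7
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]            # parity over positions 1,3,5,7
    c[2] = c[3] ^ c[6] ^ c[7]            # parity over positions 2,3,6,7
    c[4] = c[5] ^ c[6] ^ c[7]            # parity over positions 4,5,6,7
    return c[1:]

def hamming74_decode(r):
    c = [0] + list(r)
    # The syndrome spells out the position of the corrupted bit (0 = no error).
    syndrome = (c[1] ^ c[3] ^ c[5] ^ c[7]) \
             + (c[2] ^ c[3] ^ c[6] ^ c[7]) * 2 \
             + (c[4] ^ c[5] ^ c[6] ^ c[7]) * 4
    if syndrome:
        c[syndrome] ^= 1                 # flip the bad bit back
    return [c[3], c[5], c[6], c[7]]

data = [1, 0, 1, 1]
sent = hamming74_encode(data)
received = sent.copy()
received[4] ^= 1                         # the channel flips one bit
assert hamming74_decode(received) == data
print("corrupted:", received, "-> recovered:", hamming74_decode(received))
```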