So is this how it works: you just ignore my post and then ask when I'm going to give an answer? I suppose burying your head in the sand hurts you more than me, so here is the point from the post you ignored: to discuss semantic information, you first have to describe how 'meaning' is to be measured, and as far as I can recall you have never done that. So your comment is meaningless. The Shannon measure of information generally increases with the addition of noise entropy, as your reference to Weaver showed and as I said a long time ago. That it doesn't match your needs is unfortunate, but there is a reason why most YECs avoid defining which sort of information measure they are using.
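The claim that the Shannon measure generally rises when noise entropy is added can be checked with a toy sketch. The biased source and the 20% flip rate below are invented purely for illustration; the point is that noise pushes a low-entropy signal toward the uniform (maximum-entropy) distribution:

```python
import random
from collections import Counter
from math import log2

def entropy(seq):
    """Empirical Shannon entropy in bits per symbol."""
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in Counter(seq).values())

random.seed(0)
# A biased (low-entropy) binary source: mostly zeros.
source = [0] * 900 + [1] * 100
# Pass it through a noisy channel that flips each bit with probability 0.2.
received = [b ^ (random.random() < 0.2) for b in source]

h_src, h_rcv = entropy(source), entropy(received)
print(f"source entropy   = {h_src:.3f} bits/symbol")
print(f"received entropy = {h_rcv:.3f} bits/symbol")  # higher: the noise adds entropy
```

The received signal's raw entropy goes up, which is exactly Weaver's point about the "joker": more entropy, but the extra is noise, not message.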
The semantic meaning of changes in DNA can be defined in terms of whether they increase or decrease the chances of success for the particular species, organism, or gene. In that case, mutations that increase survival chances have more value to the species/organism/gene, implying that the information in that genome has improved for the environment in which it finds itself.
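That proposed measure can be sketched as a toy model. Everything here is invented for illustration: the target sequence and the match-counting fitness function stand in for whatever real survival effect a mutation has; only the sign of the fitness change matters to the definition:

```python
# Toy sketch of the proposed measure: the "semantic value" of a mutation
# is its effect on a fitness score, not its bit count.
# TARGET and fitness() are hypothetical, purely for illustration.

TARGET = "ATGGCG"  # hypothetical optimal sequence for some environment

def fitness(genome: str) -> float:
    """Invented fitness: fraction of positions matching the target."""
    return sum(a == b for a, b in zip(genome, TARGET)) / len(TARGET)

def semantic_value(before: str, after: str) -> float:
    """Positive if the mutation improves survival odds, negative if it hurts."""
    return fitness(after) - fitness(before)

genome = "ATGGCA"
print(semantic_value(genome, "ATGGCG"))  # beneficial change
print(semantic_value(genome, "TTGGCA"))  # deleterious change
```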
The Shannon information content of DNA is only of interest in a technical sense, since the amount of Shannon information doesn't indicate anything important about an organism (the genome sizes of successful species vary much more than you'd expect).
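To see why total Shannon content says so little, note that it simply scales with sequence length. The two "genomes" below are made up, with identical base composition but a thousandfold difference in length, and the measure just tracks length:

```python
from collections import Counter
from math import log2

def shannon_bits(seq: str) -> float:
    """Total Shannon information: length times empirical bits per symbol."""
    n = len(seq)
    h = -sum(c / n * log2(c / n) for c in Counter(seq).values())
    return n * h

# Same uniform ACGT composition (2 bits/base), very different lengths.
small = "ACGT" * 250      # 1,000 bases
large = "ACGT" * 250_000  # 1,000,000 bases
print(shannon_bits(small))  # 2000.0 bits
print(shannon_bits(large))  # 2000000.0 bits
```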
Are you ever going to get to your point, clearly and precisely? I'd guess not, since you've picked up Stripe's deceitful tactic of pretending to have answered difficult questions in a previous, yet unidentified, post even when it is not true.
Since semantics cannot be measured, all encoded messages are perfect according to Shannon.
I suppose I'll just have to keep reminding you since your reading comprehension might improve with repetition.
Whether it is semantic information or not, the information can only degrade with noise. Which, if you work hard to misunderstand, means semantics doesn't matter with respect to the amount of information.
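For what it's worth, the sense in which Shannon's framework does say noise degrades a signal is mutual information: the capacity of a binary symmetric channel, C = 1 − H(p), falls as the flip probability p rises, even though the received signal's raw entropy does not fall. A minimal sketch:

```python
from math import log2

def h2(p: float) -> float:
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Capacity of a binary symmetric channel: C = 1 - H(p).
# Recoverable information drops to zero as noise approaches a coin flip.
for p in (0.0, 0.1, 0.25, 0.5):
    print(f"flip prob {p:>4}: capacity = {1 - h2(p):.3f} bits/use")
```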
And as a test to see if this is true, let's see you restate my argument in your own words. I'm betting you won't even try.
I won that bet the first time.
But the highlighted portion has to be supported first. So far, it is just your bare declaration that, according to Shannon, noise doesn't degrade the signal in cell messages. Supply some evidence that noise in cell messages normally adds information "that happens to work" if you want to show me wrong. Nothing else will do.

gcthomas said:
It is clear, at least, that Yorz recognises that mutations cause an increase in Shannon information, especially when combined with gene duplications, insertions and chromosome duplications. Since evolution only requires variation in genotype that causes differential survival rates in the phenotype, there is nowhere else to go here. Mutations cause variations, evolution selects those that happen to work. End of story.
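The quoted claim, that duplication plus mutation increases Shannon information, can be illustrated with a crude sketch. Compressed size is used here as a rough proxy for information content (an assumption, not a rigorous measure), and the gene and mutation rate are invented: an exact duplicate compresses away almost entirely, while a mutated copy does not.

```python
import random
import zlib

random.seed(1)
bases = "ACGT"
gene = "".join(random.choice(bases) for _ in range(2000))

def mutate(seq: str, rate: float) -> str:
    """Replace each base with a random one (possibly the same) at the given rate."""
    return "".join(random.choice(bases) if random.random() < rate else b
                   for b in seq)

plain_dup = gene + gene                # exact gene duplication
diverged = gene + mutate(gene, 0.05)   # duplication followed by mutations

# Compressed size as a crude proxy for information content:
# the diverged copy is harder to compress, i.e. carries more information.
print(len(zlib.compress(plain_dup.encode(), 9)))
print(len(zlib.compress(diverged.encode(), 9)))
```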
I wonder what he'll claim now, seeing that he has proven himself wrong with his own linked sources?
And while you're at it, you'll be showing us how it doesn't apply to all digital communications.
And while you're at it, when Weaver said, "It is thus clear where the joker is in saying that the received signal has more information," who was he saying the joker is?