The quantity and quality of information
In information theory, the concept of “information” itself can be confusing and even seem self-contradictory. Of course, nature itself is not contradictory; it is the words we use. Shannon entropy is a purely mathematical quantity: it takes no account of the meaning, structure, or organization of information. From that point of view, one can say that chaos corresponds to more information, because Shannon entropy is really a measure of the uncertainty or randomness in a system, and a uniform, maximally “chaotic” source maximizes it (see the sketch below). On the other hand, as…
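To make the quantitative side of this distinction concrete, here is a minimal sketch (illustrative code, not from the original post) computing Shannon entropy, H(X) = −Σₓ p(x) log₂ p(x), for two toy distributions; the function name and the example distributions are my own.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 8 symbols: maximal uncertainty ("chaos").
uniform = [1 / 8] * 8
print(shannon_entropy(uniform))   # 3.0 bits

# Highly skewed distribution: one symbol dominates, little uncertainty.
skewed = [0.93] + [0.01] * 7
print(shannon_entropy(skewed))    # ~0.56 bits
```

The uniform source carries three bits of information per symbol, the skewed one barely half a bit; yet in neither case does the number say anything about whether the symbols are meaningful. That is exactly the gap between quantity and quality of information.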