To see how Shannon’s choice embodies a self-reflexive moment, it will be necessary to understand more precisely how informational entropy is like and unlike thermodynamic entropy. Shannon defined information as a function of the probability distribution of the message elements.17 Information in Shannon’s sense does not exist in the same way as the dimensions of this book exist. A book can be measured as twelve inches long, even if there are no other books in the world. But the probability that a book has that dimension is meaningful only if there are other books with which it can be compared. If all books are twelve inches long, the probability that a given book has that dimension is 1, indicating complete certainty about the result. If half of the books are twelve inches, the probability is 1/2; if none are, it is 0. Similarly, information cannot be calculated for a message in isolation. It has meaning only with respect to an ensemble of possible messages.

Shannon’s equation for information calculated it in such a way as to have it depend both on how probable an element is and on how improbable it is. Having information depend on the probability of message elements makes sense from an engineer’s point of view. Efficient coding reserves the shortest code for the most likely elements (for example, the letter e in English), leaving longer codes for the unlikely ones (for instance, x and z). Improbable elements will occupy the most room in the transmission channel because they carry the longest codes. Thus for a channel of given capacity, fewer improbable elements can be sent in a unit of time than probable ones. This explains why an engineer would think it desirable to have a direct correlation between probability and information.
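The engineering logic described here can be sketched numerically. In Shannon’s framework, an element of probability p carries −log₂(p) bits of information, which is also roughly the codeword length an efficient code assigns it. The following is a minimal illustration, using approximate English letter frequencies chosen for the example rather than taken from Shannon’s paper:

```python
import math

# Approximate relative frequencies of three English letters
# (illustrative values, not Shannon's own figures).
frequencies = {"e": 0.127, "x": 0.0015, "z": 0.0007}

for letter, p in frequencies.items():
    # Shannon's measure: an element of probability p carries
    # -log2(p) bits; an efficient code gives it a codeword of
    # roughly that many bits, so rare letters get long codes.
    bits = -math.log2(p)
    print(f"{letter}: p = {p}, about {bits:.1f} bits")
```

A common letter such as e comes out at about 3 bits, while the rare z needs more than 10, matching the passage’s point that improbable elements occupy the most room in the channel.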