In information theory, entropy is a measure of uncertainty. The more information available, the smaller the uncertainty and hence the smaller the entropy; the less information available, the greater the uncertainty and the greater the entropy. By this property, computing the entropy of an event lets us judge its degree of randomness and disorder. Entropy can also measure the dispersion of an indicator: the more dispersed an indicator's values, the greater that indicator's influence on the comprehensive evaluation. Therefore, based on the degree of variation of each indicator, information entropy can be used to compute each indicator's weight, providing a basis for a comprehensive multi-indicator evaluation.
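The entropy-based weighting described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name is invented, and it assumes the indicator matrix has already been normalized to non-negative values on a common scale. For each indicator, the proportions of each sample are computed, the information entropy is taken with the conventional normalizing constant k = 1/ln(n), and the weights are proportional to 1 − entropy, so more dispersed (lower-entropy) indicators receive larger weights.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method (illustrative sketch).

    X: (n samples x m indicators) matrix of non-negative values,
    assumed already normalized to a common scale.
    Returns an array of m weights summing to 1.
    """
    X = np.asarray(X, dtype=float)
    n, m = X.shape
    # Proportion of sample i under indicator j
    P = X / X.sum(axis=0)
    # Information entropy of each indicator; 0 * log(0) is treated as 0
    logP = np.where(P > 0, np.log(P, where=(P > 0)), 0.0)
    k = 1.0 / np.log(n)
    e = -k * (P * logP).sum(axis=0)
    # Degree of divergence: smaller entropy -> larger weight
    d = 1.0 - e
    return d / d.sum()
```

For example, an indicator that takes the same value for every sample has maximum entropy (e = 1) and therefore receives zero weight, while an indicator whose values vary across samples receives a positive weight.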