In information theory, entropy is a measure of uncertainty: the more information a signal carries, the lower the uncertainty and the lower the entropy; the less information it carries, the higher the uncertainty and the higher the entropy. By this property, the entropy value can be used to judge the randomness and disorder of an event, and also to measure the degree of dispersion of an indicator: the greater an indicator's dispersion, the more information it carries and the greater its impact on the comprehensive evaluation. Information entropy can therefore be used to derive a weight for each indicator from its degree of variation, which provides a basis for the comprehensive evaluation of multiple indicators.
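As an illustration, this reasoning corresponds to the standard entropy weight method; the notation below ($n$ samples, $m$ indicators, $x_{ij}$ the value of indicator $j$ for sample $i$) is introduced here for clarity and is not taken from the source:

\[
p_{ij} = \frac{x_{ij}}{\sum_{i=1}^{n} x_{ij}}, \qquad
e_j = -\frac{1}{\ln n}\sum_{i=1}^{n} p_{ij}\ln p_{ij}, \qquad
w_j = \frac{1 - e_j}{\sum_{j=1}^{m}\left(1 - e_j\right)},
\]

with the convention $p_{ij}\ln p_{ij} = 0$ when $p_{ij} = 0$. The factor $1/\ln n$ scales the entropy $e_j$ into $[0,1]$, so an indicator whose values vary widely across samples has a small $e_j$ and receives a large weight $w_j$, matching the intuition stated above.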