Measurement accuracy is the most important specification of a humidity sensor: each additional percentage point of accuracy moves the sensor up a step, or even a whole grade. Achieving different accuracy levels requires very different manufacturing costs, and prices differ accordingly, so users should choose a sensor matched to their application rather than blindly pursuing the highest specification.

If the sensor will operate at varying temperatures, temperature drift must also be considered. Relative humidity is a function of temperature, and temperature strongly affects the relative humidity in a given space: a temperature change of 0.1 °C produces a humidity change (error) of about 0.5 %RH. If it is difficult to hold the temperature constant in use, it is inappropriate to demand very high humidity-measurement accuracy.

In most cases, where there is no precise temperature control or the measured space is unsealed, an accuracy of ±5 %RH is sufficient. For local spaces that require precise constant-temperature, constant-humidity control, or for applications that must continuously track and record humidity changes, choose a sensor with accuracy of ±3 %RH or better. Requirements tighter than ±2 %RH are difficult to meet even for the standard humidity generators used to calibrate sensors, let alone for the sensors themselves; relative-humidity instruments struggle to achieve 2 %RH accuracy even at 20–25 °C. Note that the characteristics given in product data sheets are usually measured at room temperature (20 °C ± 10 °C) in clean gas.
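The temperature-drift reasoning above can be sketched as a simple error budget. This is a minimal illustration, assuming the 0.5 %RH per 0.1 °C coefficient stated in the text and a worst-case (additive) combination of errors; the function names are hypothetical, not part of any sensor API.

```python
# Coefficient from the text: 0.5 %RH of error per 0.1 °C of temperature change.
# This is an assumption carried over from the article, not a universal constant.
RH_ERROR_PER_DEG_C = 0.5 / 0.1  # %RH per °C


def humidity_error_from_temp(delta_t_c: float) -> float:
    """Estimated %RH error caused by a temperature change of delta_t_c degrees C."""
    return RH_ERROR_PER_DEG_C * abs(delta_t_c)


def total_error(sensor_accuracy_rh: float, delta_t_c: float) -> float:
    """Worst-case combined error: rated sensor accuracy plus temperature-induced drift."""
    return sensor_accuracy_rh + humidity_error_from_temp(delta_t_c)


# Example: a +/-3 %RH sensor in a space whose temperature wanders by 0.5 degrees C
# sees a worst-case error of 3.0 + 5.0 * 0.5 = 5.5 %RH.
print(total_error(3.0, 0.5))
```

The example shows why a high-accuracy sensor is wasted without temperature control: even modest temperature fluctuation can dominate the sensor's own rated accuracy.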