The Thermal Time Constant is a measure of the time required for the thermistor to respond to a change in the ambient temperature. The technical definition of Thermal Time Constant is, "The time required for a thermistor to change 63.2% of the total difference between its initial and final body temperature when subjected to a step function change in temperature, under zero power conditions".
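The 63.2% figure comes from treating the thermistor body as a first-order thermal system: after a step change, its temperature approaches the new value exponentially, and after one time constant the fraction of the total change completed is 1 − 1/e ≈ 0.632. A minimal sketch of that model (the temperatures and time constant below are illustrative values, not data from the text):

```python
import math

def fraction_complete(t, tau):
    """Fraction of the total temperature step completed after time t,
    for a first-order response: 1 - exp(-t / tau)."""
    return 1.0 - math.exp(-t / tau)

def body_temperature(t, t_init, t_final, tau):
    """First-order model of body temperature during a step change:
    T(t) = T_final + (T_init - T_final) * exp(-t / tau)."""
    return t_final + (t_init - t_final) * math.exp(-t / tau)

# At t = tau, 63.2% of the step is complete, regardless of the step size.
print(round(fraction_complete(1.0, 1.0), 3))  # → 0.632

# Example: cooling from 75 degC to 25 degC ambient with tau = 10 s;
# after one time constant the body has reached 75 - 0.632 * 50 ≈ 43.4 degC.
print(round(body_temperature(10.0, 75.0, 25.0, 10.0), 1))  # → 43.4
```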
The thermal time constant is affected by the medium in which the test is performed. For example, the thermal time constant will be shorter in moving air than in still air and shorter in moving water than in still water.
The most common method for measuring the thermal time constant of a thermistor is to place the device in still air at room temperature. Adequate power is then applied to raise the thermistor's body temperature well above that of the ambient. The power is maintained until thermal stabilization at the elevated temperature is achieved. Then, the power is removed from the thermistor and, simultaneously, a timer is started. The resistance of the thermistor is continuously monitored, and when it indicates that the thermistor body has cooled through 63.2% of the difference between the elevated temperature and the ambient temperature, the timer is stopped. The time indicated represents one time constant and is usually expressed in seconds.
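The self-heating procedure above can be sketched as follows. This is a hypothetical simulation, not a real instrument interface: the elevated temperature, ambient temperature, and "true" time constant are assumed values, and `simulated_reading` stands in for the temperature inferred from the monitored resistance. The measurement simply runs the timer until the body has cooled through 63.2% of the temperature difference:

```python
import math

T_ELEVATED = 75.0   # degC, assumed stabilized self-heated body temperature
T_AMBIENT = 25.0    # degC, assumed still-air room temperature
TRUE_TAU = 12.0     # seconds, the (unknown) time constant being measured

def simulated_reading(t):
    """Stand-in for the body temperature inferred from the monitored
    resistance, t seconds after power is removed (first-order cooling)."""
    return T_AMBIENT + (T_ELEVATED - T_AMBIENT) * math.exp(-t / TRUE_TAU)

def measure_time_constant(sample_period=0.01):
    """Stop the 'timer' once the body has cooled through 63.2% of the
    difference between the elevated and ambient temperatures."""
    threshold = T_ELEVATED - 0.632 * (T_ELEVATED - T_AMBIENT)  # 43.4 degC here
    t = 0.0
    while simulated_reading(t) > threshold:
        t += sample_period
    return t  # one thermal time constant, in seconds

print(round(measure_time_constant(), 1))  # ≈ 12.0 s for this simulated probe
```

The recovered value matches the simulated time constant because 63.2% is exactly one time constant of a first-order response; a real bench measurement would read the resistance through a bridge or DMM and convert it to temperature via the thermistor's R-T curve.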
Although this is the most common method for measuring the thermal time constant of thermistors and thermistor probe assemblies, it is not always the best method for every application. For example, if a thermistor probe assembly is designed for temperature control of a fluid, it is usually better to measure the thermal time constant using a step change in fluid temperature rather than using the self-heating method in still air.