Abstract:
The hot cathode ionization gauge, a fundamental instrument for ultra-high vacuum measurement, suffers from limited measurement accuracy, partly because of temperature effects induced by its high-temperature cathode. These effects arise from the interplay of multiple scales and mechanisms, which makes them difficult to investigate. The hot filament raises the temperature of the gas molecules surrounding it, and the pressure in the cathode region depends on that gas temperature. Because the gas temperature in the cathode region is difficult to measure directly, the hot cathode ionization gauge was simulated in the Molflow software under various operating conditions within the vacuum chamber, yielding the pressure and gas temperature distributions in the different spatial regions. After the measured pressure was corrected with the thermal transpiration equilibrium equation, the results agreed more closely with the true values. Experiments showed that as the wall temperature increased, the gas temperature rose and the pressure in the test region increased, while the pressure reading of the ionization gauge decreased. The influence of thermal transpiration was evident, and the variation of the gauge reading was largely consistent with the simulation results. Using the wall temperature as a characteristic parameter to adjust the gauge sensitivity brought the pressure results closer to the actual values, providing a practical method for correcting the indicated pressure and thereby improving the measurement accuracy of the hot cathode ionization gauge.
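For context, the thermal transpiration correction mentioned above presumably takes the classical free-molecular (Knudsen) form relating the pressures and gas temperatures of two connected regions; the labels used here (gauge/cathode region p_g, T_g; chamber p_c, T_c) are illustrative and not necessarily the paper's own notation:

\[
\frac{p_{g}}{p_{c}} = \sqrt{\frac{T_{g}}{T_{c}}}
\qquad\Longrightarrow\qquad
p_{c} = p_{g}\,\sqrt{\frac{T_{c}}{T_{g}}}
\]

Under this relation, a hotter gauge region (T_g > T_c) reads a higher pressure than the chamber actually holds, so the measured value p_g is scaled down by the square-root temperature ratio to estimate the chamber pressure p_c.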