The influence of radiation on commonly used temperature sensors for measuring indoor air temperatures can be significant, especially at the low air velocities typical of indoor environments. Conceptually, a physical sensor does not read the true air temperature; it only reads its own temperature, and, being a solid body, it exchanges energy with the surrounding surfaces (walls, windows, etc.) through radiation. In the present study, the influence of radiation on indoor air temperature measurements was investigated experimentally, and the resulting errors were quantified in simple terms. Measures to reduce this impact on some common temperature sensors were also explored. A special test rig was built to simulate typical indoor airflow and radiation environments. It is suggested that the radiation impact on a temperature sensor be quantified by a radiation sensitivity factor, defined as RSF = hrad/hconv, where hrad and hconv are the heat transfer coefficients for radiation and convection, respectively. As this definition implies, the radiation sensitivity depends on the size, geometry, and emissivity of the temperature sensor. The radiation sensitivity factor, thus being unique to each type of sensor, was measured for some common types of thermistors and thermocouples. It is demonstrated that radiation errors may be reduced by 60–80% on thermistors by reducing their emissivity through gold sputtering, and on thermocouples by stripping the insulation at the outermost part of their sensor leads.
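The link between RSF and the measurement error can be sketched from a steady-state energy balance on the sensor; this is an illustrative derivation under common simplifying assumptions (linearized radiation exchange, uniform sensor temperature), with the symbols for air temperature, mean radiant temperature, and sensor reading introduced here for clarity and not taken from the abstract:

```latex
% Steady-state balance of convective and radiative exchange on the sensor,
% with T_a the air temperature, T_r the mean radiant temperature of the
% surrounding surfaces, and T_s the temperature indicated by the sensor:
h_{\mathrm{conv}}\,(T_a - T_s) + h_{\mathrm{rad}}\,(T_r - T_s) = 0
% Solving for the radiation-induced error T_s - T_a and inserting
% RSF = h_rad / h_conv:
T_s - T_a
  = \frac{h_{\mathrm{rad}}}{h_{\mathrm{conv}} + h_{\mathrm{rad}}}\,(T_r - T_a)
  = \frac{\mathrm{RSF}}{1 + \mathrm{RSF}}\,(T_r - T_a)
```

For small RSF this reduces to an error of approximately RSF × (T_r − T_a), which is why reducing either the sensor emissivity (lowering hrad) or its size (raising hconv at a given air velocity) reduces the radiation error.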