As discussed by Instrumentation Tools in their article “What is RTD Sensitivity?”, Resistance Temperature Detectors (RTDs) are temperature-sensitive resistors. The sensitivity of an RTD indicates the extent to which its resistance changes with temperature variations.

Some discussions of RTD sensitivity use the alpha value of the element to denote sensitivity, but this can be misleading because alpha alone does not directly show the amount of resistance change per degree. The alpha value, defined as α = [(R100°C – R0°C)/R0°C]/100°C, expresses the average fractional change in resistance per degree Celsius over the 0–100°C range. For instance, with an alpha value of 0.00385/°C, the resistance increases by 0.385% for each degree Celsius, or 38.5% over the full 0–100°C span.
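As a rough illustration, the alpha calculation can be written out in a few lines of Python. This is only a sketch; the 0°C and 100°C resistance values below are assumed to be the nominal figures for a standard Pt100 element.

```python
# Minimal sketch: computing an RTD's alpha value from two resistance readings.
# The resistances below are assumed nominal values for a standard Pt100 element.
R_0C = 100.0    # resistance at 0 °C, in ohms (assumed)
R_100C = 138.5  # resistance at 100 °C, in ohms (assumed)

# alpha = [(R_100C - R_0C) / R_0C] / 100 °C
alpha = (R_100C - R_0C) / R_0C / 100.0
print(f"alpha = {alpha:.5f} per °C")  # alpha = 0.00385 per °C
```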

Thus, the sensitivity of an RTD can be stated as 0.385%/°C, or, in its native units, as Ω/°C. Sensitivity is calculated by multiplying the RTD's resistance at the reference temperature (R0) by the Temperature Coefficient of Resistance (TCR), α. For a platinum RTD with R0 = 100Ω and α = 0.00385 Ω/Ω/°C, the sensitivity coefficient is 0.385 Ω/°C. In other words, starting from 100Ω at 0°C, the sensor's resistance increases by about 0.385Ω for every degree Celsius of temperature rise.
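The sensitivity calculation, and the linear resistance estimate it implies, can be sketched the same way. The functions below are illustrative only and use a straight-line approximation; a real platinum element follows the slightly non-linear Callendar-Van Dusen relationship.

```python
# Minimal sketch: sensitivity = R0 * alpha, plus a linear resistance estimate.
# A straight-line model is only an approximation near the reference temperature.
def sensitivity(r0_ohms: float, alpha: float) -> float:
    """Sensitivity coefficient in ohms per °C."""
    return r0_ohms * alpha

def resistance_estimate(r0_ohms: float, alpha: float, temp_c: float) -> float:
    """Linear estimate of resistance at temp_c (°C), referenced to 0 °C."""
    return r0_ohms + sensitivity(r0_ohms, alpha) * temp_c

s = sensitivity(100.0, 0.00385)                   # 0.385 ohm/°C for a Pt100
r_25 = resistance_estimate(100.0, 0.00385, 25.0)  # about 109.625 ohms at 25 °C
print(f"Pt100 sensitivity: {s:.3f} ohm/°C, resistance at 25 °C = {r_25:.3f} ohm")
```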

Higher alpha values indicate greater sensitivity. For example, a Nickel RTD with α = 0.00672 Ω/Ω/°C and R0 = 120Ω has a sensitivity of 0.8064 Ω/°C, making it more than twice as sensitive as a 100Ω Platinum RTD with α = 0.00385 Ω/Ω/°C. Additionally, RTDs with higher nominal resistance (R0) values are more sensitive. For instance, a 500Ω Pt RTD (Pt500) is five times more sensitive than a 100Ω Pt RTD (Pt100), and a 1000Ω Pt RTD (Pt1000) is ten times more sensitive.
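Putting the same arithmetic side by side makes the comparison concrete. The sketch below uses the nominal R0 and alpha figures quoted above; the ratios are relative to a Pt100.

```python
# Minimal sketch: comparing sensitivity coefficients (R0 * alpha) for the
# RTD types discussed above, using the nominal figures quoted in the text.
rtds = {
    "Ni120":  (120.0, 0.00672),
    "Pt100":  (100.0, 0.00385),
    "Pt500":  (500.0, 0.00385),
    "Pt1000": (1000.0, 0.00385),
}
pt100_sensitivity = 100.0 * 0.00385
for name, (r0, alpha) in rtds.items():
    s = r0 * alpha
    print(f"{name}: {s:.4f} ohm/°C ({s / pt100_sensitivity:.1f}x a Pt100)")
# Ni120:  0.8064 ohm/°C (2.1x a Pt100)
# Pt500:  1.9250 ohm/°C (5.0x a Pt100)
# Pt1000: 3.8500 ohm/°C (10.0x a Pt100)
```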

Click here to learn more about Blaze Technical Services’ products.

Article with all rights reserved, courtesy of instrumentationtools.com