Improved Calibration Method for Resistive Sensors Using Direct Interface Circuits
Published in: IEEE Transactions on Instrumentation and Measurement, 2020-08, Vol. 69 (8), pp. 5693-5701
Main authors: , , , ,
Format: Article
Language: English
Abstract: Traditional calibration methods for measuring resistance values using a direct interface circuit, comprising a capacitor and some calibration resistors, offer a straightforward, economical option. However, such methods show greater inaccuracies as resistance values decrease. This article presents a modification of the method that ensures more accurate measurements and extends the range of resistance values that can be measured. Unlike the linear equation used in the traditional method, the proposed method obtains the power functions relating resistance value to measured discharge time. Since these circuits need calibration resistors, criteria are established to find their values. To reduce arithmetic complexity, two approximations to the exact calculation of the resistance are tested. With this approach, the relative errors for low resistances are reduced and the measuring range is extended. Three measurement setups, based on a microcontroller and a field-programmable gate array (FPGA), were used to validate the results, showing that the method is valid independently of the device. With the FPGA configured so that the maximum current that can be sunk per output pin is 24 mA, the error in the estimation of a 9.9 Ω resistance is ten times lower than that obtained using the classic two-point calibration method.
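To make the calibration idea concrete, the following Python sketch contrasts a classic two-point linear calibration with a power-function fit of resistance versus measured discharge time. It is only an illustration of the general approach described in the abstract, not the authors' published equations; the calibration resistor values, discharge times, and function names are hypothetical.

```python
import numpy as np

# Hypothetical calibration data: known calibration resistor values (ohms) and
# the discharge times (timer ticks) the microcontroller/FPGA measured for them.
# These numbers are invented for illustration only.
R_cal = np.array([10.0, 1_000.0, 100_000.0])   # calibration resistors
T_cal = np.array([1.3e2, 9.8e3, 9.6e5])        # measured discharge times

def linear_two_point(T, R1, T1, R2, T2):
    """Classic two-point calibration: assume R is a linear function of T."""
    slope = (R2 - R1) / (T2 - T1)
    return R1 + slope * (T - T1)

def fit_power(R, T):
    """Fit R = k * T**m by least squares in log-log space."""
    m, log_k = np.polyfit(np.log(T), np.log(R), 1)
    return np.exp(log_k), m

def power_estimate(T, k, m):
    """Estimate R by evaluating the fitted power function."""
    return k * T**m

# Example usage with a hypothetical measured discharge time near the low end
# of the range, where the linear two-point model is least accurate.
k, m = fit_power(R_cal, T_cal)
T_meas = 1.4e2
print("two-point:", linear_two_point(T_meas, R_cal[0], T_cal[0], R_cal[-1], T_cal[-1]))
print("power fit:", power_estimate(T_meas, k, m))
```

On a real device, the discharge time additionally includes offsets from the output-pin resistance and the time-to-digital conversion, which is one reason a simple linear model degrades for small resistances; the sketch above omits those effects.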
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2019.2958583