Instrumentation Calibration Reduction Abstracts from Publications (Co)Authored by Dr. Holbert

Instrument Calibration Reduction Using Signal Validation

K. E. Holbert, H. M. Hashemian

Calibration reduction affords an excellent opportunity to exploit the capabilities of signal validation. In-plant testing is underway at an operating pressurized water reactor using a signal-validation-based calibration reduction system. This work has shown that on-line monitoring of instrument channels in nuclear power plants can help segregate the channels that are drifting from those that are not. Monitoring thus limits the calibration effort to only those channels requiring recalibration, as opposed to the current practice of calibrating all channels. Because instrument recalibration is generally required each fuel cycle, and because many instruments are normally inaccessible, calibration reduction offers both decreased manpower requirements (during an already busy fuel outage) and reduced personnel exposure.

Transactions of the American Nuclear Society, Vol. 69, pp. 372-373, 1993.


Instrumentation Calibration Reduction

Keith E. Holbert

Abstract

Calibration reduction is an application of signal validation. Signal validation (instrument fault detection) is a determination as to whether a process indicator is providing a reliable reading. Calibration reduction does not eliminate the need to perform instrument calibration, but lessens the effort involved. The obvious benefit of calibration reduction is lower manpower requirements due to the reduced workload. Since recalibration for a nuclear power plant is generally required based on the fuel cycle length (12-18 months), and since many instruments are normally inaccessible, this recalibration effort generally comes during an already busy refueling outage, when manpower is at a premium. Calibration reduction can also lower the amount of time spent in radiation areas, thereby reducing personnel exposure.

The calibration reduction methodology involves the use of an inter-signal consistency checking technique. Comparisons are made continuously, and a tally is kept of the agreement (or disagreement) between individual signal readings. The error between the signals is computed and compared to the calibration accuracy of the instrument. The calibration reduction itself rests on the assumption that the calibration curves of signals in agreement are nearly identical. Effort is then directed at verifying the accuracy of just one of the consistent signals. Once the accuracy of a single signal has been verified, the other consistent signals are also declared to be within calibration. Two signals will never read identically, but as long as they agree to within the specified tolerance, a justifiable conclusion can be drawn as to their calibration accuracy. Other signals which are inconsistent (i.e., outside the tolerance band) are deemed out of calibration. Thus, by testing a single instrument, the calibration accuracy of the other redundant signals may be determined, thereby lowering the overall workload (but not eliminating it).
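
To make the consistency-checking step concrete, the following minimal Python sketch (not from the original paper; the channel names, readings, and tolerance value are hypothetical) compares redundant channel readings pairwise, tallies agreements and disagreements against a calibration tolerance, and flags channels that disagree with the majority as candidates for recalibration:

    import itertools

    # Hypothetical calibration tolerance (engineering units); in practice
    # this would be the instrument's specified calibration accuracy.
    TOLERANCE = 0.5

    def tally_consistency(readings, tolerance=TOLERANCE):
        """Pairwise comparison of redundant readings: count, per channel,
        how many other channels it agrees or disagrees with."""
        agree = {ch: 0 for ch in readings}
        disagree = {ch: 0 for ch in readings}
        for a, b in itertools.combinations(readings, 2):
            if abs(readings[a] - readings[b]) <= tolerance:
                agree[a] += 1
                agree[b] += 1
            else:
                disagree[a] += 1
                disagree[b] += 1
        return agree, disagree

    # Four redundant channels; "PT-3" has drifted beyond the tolerance.
    readings = {"PT-1": 100.1, "PT-2": 99.9, "PT-3": 101.8, "PT-4": 100.0}
    agree, disagree = tally_consistency(readings)
    consistent = [ch for ch in readings if agree[ch] > disagree[ch]]
    suspect = [ch for ch in readings if agree[ch] <= disagree[ch]]
    print("Consistent (verify one, declare the rest in calibration):", consistent)
    print("Suspect (deemed out of calibration):", suspect)

In an on-line system the tally would accumulate over many samples rather than a single snapshot, so that a momentary disturbance does not condemn a channel.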

Calibration reduction is performed in a continuous on-line mode (in real time), so results (or relative indications) on the accuracy of the instrumentation are readily available. For instance, if an instrument's calibration appears to be drifting over time, preventive maintenance on the sensor may be performed before complete failure. This predictive maintenance aspect serves as a maintenance scheduling aid (including the ability to pre-order components) and, in the severest case, offers the possibility of preventing a plant trip (which, depending on its nature, could require the filing of an LER [Licensee Event Report]).
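
As a rough illustration of the drift-trending idea (again a sketch, not the published method; the deviation samples and drift limit below are hypothetical), a least-squares slope of a channel's deviation from its redundant average can flag a slowly drifting sensor for preventive maintenance:

    def drift_slope(times, deviations):
        """Least-squares slope of a channel's deviation from the redundant
        average; a sustained nonzero slope suggests calibration drift."""
        n = len(times)
        t_mean = sum(times) / n
        d_mean = sum(deviations) / n
        num = sum((t - t_mean) * (d - d_mean) for t, d in zip(times, deviations))
        den = sum((t - t_mean) ** 2 for t in times)
        return num / den

    # Hypothetical daily deviations (engineering units) of one channel
    # from the average of its redundant counterparts over ten days.
    times = list(range(10))
    deviations = [0.02, 0.05, 0.09, 0.12, 0.18, 0.21, 0.26, 0.30, 0.33, 0.38]
    DRIFT_LIMIT = 0.01  # hypothetical drift-rate threshold, units per day
    rate = drift_slope(times, deviations)
    if rate > DRIFT_LIMIT:
        print(f"Drift rate {rate:.3f} units/day exceeds limit: schedule maintenance")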

Electric Power Research for the 90's, Proceedings of the First Annual Industrial Partnership Program Conference, January 29, 1991.

