Most chemical production processes are large in scale, long in process flow, continuous, and automated. To carry out process operations and production control effectively, various instruments are needed to measure the values of the variables in the production process. Although the instruments and measuring methods differ, the mechanism of measurement is the same: the measured variable is compared against a standard quantity of the same kind, and measuring instruments are the technical means of making this comparison. For the instruments used in production plants, accurate measurement is always desired. In practice, however, because of subjective and objective factors such as the performance of the instrument itself, the installation and operating environment, the measurement method, and operator negligence, some deviation exists between the measurement result and the true value of the measured variable. This deviation is called the measurement error.
The Classification of Measurement Error
Errors can be classified in several ways. According to their statistical behavior, errors are divided into systematic errors, accidental (random) errors, and gross errors. According to the conditions under which the instrument is used, there are basic errors and supplementary errors. According to their relationship with time, there are static errors and dynamic errors; according to their relationship with the measured variable, there are constant-value errors and accumulated errors. The absolute error, relative error, and reference error commonly quoted for measuring instruments are classified by the way the error value is expressed.
1. Absolute error
Absolute error is the difference between the instrument's indicated value and the true value of the measured variable. Since the true value is never known exactly, the same variable is usually measured with a standard instrument of higher accuracy, and that result is taken as the true value. An absolute error has both a unit and a sign, but it cannot fully reflect the accuracy of an instrument; it describes the accuracy only at a single point. The largest absolute error among the measured points is taken as the absolute error of the instrument. The value with sign opposite to the absolute error is called the correction value.
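The relationship between the absolute error and the correction value described above can be sketched as follows; the function names and the pressure readings are illustrative, not from the text.

```python
def absolute_error(indicated, true_value):
    """Absolute error = instrument reading - true (standard) value."""
    return indicated - true_value

def correction(indicated, true_value):
    """The correction value has the opposite sign of the absolute error."""
    return -absolute_error(indicated, true_value)

# Illustrative reading of 100.3 kPa against a standard reading of 100.0 kPa:
delta = absolute_error(100.3, 100.0)   # about +0.3 kPa
c = correction(100.3, 100.0)           # about -0.3 kPa
```

Adding the correction value to the instrument reading recovers the value taken as true.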
2. Relative error
Relative error is the ratio of the absolute error to the value of the measured variable. It objectively reflects the accuracy of a measurement result and is usually expressed as a percentage.
3. Reference error
The accuracy of a measuring instrument depends not only on the absolute and relative errors but also on the instrument's measuring range. Industrial instruments therefore usually use the reference error to express accuracy: the ratio of the absolute error to the upper limit of the measuring range or to the span.
The reference error is also called the fiducial relative error. It is dimensionless, carries a sign, and accurately reflects the accuracy of the instrument.
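The distinction between relative error and reference error can be made concrete with a small sketch; the instrument span of 0–1000 kPa and the absolute error of 5 kPa are assumed values for illustration only.

```python
def relative_error(abs_err, measured_value):
    """Relative error: fraction of the measured value itself."""
    return abs_err / measured_value

def reference_error(abs_err, span):
    """Reference error: fraction of the instrument span."""
    return abs_err / span

abs_err = 5.0                            # kPa, assumed absolute error
rel = relative_error(abs_err, 500.0)     # 0.01 -> 1% of the reading
ref = reference_error(abs_err, 1000.0)   # 0.005 -> 0.5% of the span
```

The same absolute error gives a different relative error at every point on the scale, but a single reference error for the instrument, which is why the latter is used to state instrument accuracy.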
Because the reference error is tied to the instrument's range, when measuring a variable with instruments of the same accuracy class, the zero point of the instrument is often migrated to compress the range, which reduces the absolute error at the measured point and improves the measurement accuracy.
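The effect of compressing the range can be sketched numerically: for a fixed accuracy class, the allowable absolute error scales with the span. The 1.0-class instrument and the temperature spans below are assumed for illustration.

```python
def max_abs_error(accuracy_class_percent, span):
    """Allowable absolute error = accuracy class (%) x span."""
    return accuracy_class_percent / 100.0 * span

# The same 1.0-class instrument, measuring a variable near 80 degC:
wide   = max_abs_error(1.0, 100.0)  # span 0-100 degC  -> 1.0 degC allowed
narrow = max_abs_error(1.0, 50.0)   # span 50-100 degC -> 0.5 degC allowed
```

Migrating the zero from 0 degC to 50 degC halves the span and therefore halves the allowable absolute error at the operating point.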
4. Indication error verification of instruments
To keep all kinds of instruments operating accurately and reliably over the long term, instrument maintenance personnel should periodically verify the indication error of running instruments and perform a full performance check on each instrument after shutdown, to confirm that it meets its technical performance indicators. The indication error verification of display instruments is described here.
Although display instruments are of many kinds and differ in structure, two verification methods are in common use.
1) Signal comparison method. This is the most common verification method. An adjustable signal generator applies the same signal to the instrument under verification and to a standard instrument, and the indication of the instrument under verification is compared with that of the standard instrument to find the error at each point. For example, a manual pressure pump can apply the same pressure to the pressure gauge under verification and to a standard pressure gauge.
2) Direct verification method. In this method, a standard instrument applies a signal directly to the instrument under verification. The signal value set on the standard instrument is compared with the nominal value of each verification point on the instrument under verification to determine the error at that point; for example, a standard resistance box can be used to verify a resistance indicator or an automatic balancing bridge.
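Either method produces, for each verification point, an indication error computed in the same way. A minimal sketch of that bookkeeping, with invented readings on an assumed 0–100 span:

```python
# Standard-instrument values and the readings of the instrument under test
# at five verification points (all numbers invented for illustration).
standard = [0.0, 25.0, 50.0, 75.0, 100.0]
checked  = [0.2, 25.3, 49.8, 75.4, 99.9]

# Indication error at each point: checked reading minus standard value.
errors = [c - s for c, s in zip(checked, standard)]

# The worst-case error, converted to a reference error over the span,
# is what gets compared against the instrument's accuracy class.
max_abs = max(abs(e) for e in errors)
span = 100.0
ref_err_percent = max_abs / span * 100
```

Here the largest indication error is about 0.4, i.e. a reference error of about 0.4% of span, so this instrument would pass a 0.5-class check but fail a 0.2-class check.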
The Instrument Quality Indicators
In engineering, the following quality indicators are usually used to measure the quality of the instrument.
1. Allowable error and basic error
According to the application requirements of an instrument, a maximum error permitted under normal conditions is specified; this maximum permitted error is called the allowable error. It is usually expressed as the maximum reference error.
The basic error of an instrument is the maximum error it exhibits under normal working conditions when it leaves the factory. For most instruments, the basic error is also the allowable error.
2. Accuracy and accuracy rating
Under normal conditions of use, the closeness of an instrument's measurement results to the true value is called its accuracy. The smaller the reference error, the higher the accuracy; and since the reference error is related to the instrument's range, when instruments of the same accuracy class are used, the range is often compressed to reduce the measurement error.
In industrial measurement, to make the quality of an instrument easy to state, accuracy is usually expressed as an accuracy class. The accuracy class is the maximum reference error with the percent sign removed, and it is one of the important indicators of instrument quality. Industrial instruments in China are generally divided into seven classes: 0.1, 0.2, 0.5, 1.0, 1.5, 2.5, and 5.0, marked on the instrument scale or nameplate.
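The relationship between the maximum reference error and the accuracy class described above can be sketched as a lookup; the class list follows the seven Chinese industrial classes in the text, while the function name and sample error are illustrative.

```python
# The seven accuracy classes given in the text, in percent of span.
CLASSES = [0.1, 0.2, 0.5, 1.0, 1.5, 2.5, 5.0]

def accuracy_class(max_reference_error_percent):
    """Return the smallest class whose limit covers the given error."""
    for cls in CLASSES:
        if max_reference_error_percent <= cls:
            return cls
    return None  # worse than the coarsest class

# An instrument whose maximum reference error is 0.8% is a 1.0-class
# instrument (0.8% exceeds the 0.5 limit but is within the 1.0 limit):
cls = accuracy_class(0.8)
```

Note the rounding direction: the class must be at least as large as the measured maximum reference error, never smaller.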
The exactness of an instrument is customarily referred to as its accuracy, and the accuracy grade is customarily referred to as the accuracy class. Measurement error is inevitable, but it can be minimized in three ways:
1. Using more precise instruments;
2. Averaging repeated measurements;
3. Improving the measurement method.
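Of the three approaches above, the second lends itself to a one-line sketch: averaging repeated readings suppresses the random (accidental) component of the error. The readings below are invented for illustration.

```python
# Repeated readings of the same quantity (invented values); random errors
# of opposite sign tend to cancel in the mean.
readings = [50.1, 49.8, 50.3, 49.9, 50.0, 50.2]
mean = sum(readings) / len(readings)  # best estimate of the true value
```

Averaging helps only with random error; a systematic error (for example, an uncorrected zero offset) appears identically in every reading and survives the average unchanged.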