**1. Introduction**

Energy Efficiency – The Innovative Ways for Smart Energy, the Future Towards Modern Utilities


Steam is an important secondary energy medium in iron and steel enterprises, accounting for nearly 10% of total energy consumption. If, while an enterprise is running, the produced steam can meet all demands and no steam is bled off, the overall energy efficiency is effectively improved. To this end, complex steam pipe networks and steam production scheduling systems have been set up. Obviously, steam scheduling depends on real-time data measured from the steam pipe networks. Accurately measuring the pressure, temperature and flow rate is essential to secure safety and economic efficiency. It is also necessary to accumulate the amounts of steam production and consumption in order to calculate the energy cost of each working procedure.

With the help of an Energy Management System (EMS), all these data are collected from the distributed instruments. In practice, the measurements of pressure and temperature are usually accurate enough for the application, except when the sensors or transducers fail. However, the measurements of mass flow rate are not so accurate, because of the complex nature of the steam itself, the lack of high-precision measuring instruments, the impact of interference, failures in the information transmission network, and other reasons. The reliability of the mass flow rate measurements is therefore poor.

When the measured steam mass flow values deviate from the actual values to a certain extent, the automatic control system may deviate substantially from the process requirements; even worse, steam bleeding or accidents could happen [1]. Therefore, it is not satisfactory to decide or adjust the production process according to the data from the flow rate meters alone [2]. In energy management, the accumulated differences between production and consumption make it difficult to calculate the energy costs, analyze the segments of irrational use of steam, and find the weaknesses in the management links. Therefore, improving the reliability of the steam flow rate measurement data is essential for normal production and energy conservation in iron and steel enterprises.

© 2012 Xianxi et al., licensee InTech. This is an open access chapter distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Data Processing Approaches for the Measurements of Steam Pipe Networks in Iron and Steel Enterprises

The objective of the work is depicted in Figure 1. The real-time data (mainly the mass flow rate variables) are processed by three approaches. By fault data detection and reconstruction, faulty data are picked out and the real values are reconstructed or estimated. By gross error detection, data with gross errors are discovered and re-estimated. By data reconciliation, the random errors are reduced and the quality of the data is further improved.

**Figure 1.** The objective of the work

For fault data detection and reconstruction, the reasons for the low accuracy of steam flow rate data are introduced. Statistical process control theory [3] is applied to determine univariate and multivariate control limits for monitoring abnormal data online. An approach to calculate the real mass flow rates through the thermal and hydraulic mathematical models of the steam pipe networks is proposed.

For the section on gross error detection, the definition of the problem is introduced, and two basic gross error detection approaches, the Measurement Test (MT) method and the Method of Pseudonodes (MP), are demonstrated.

For the section on data reconciliation, the constrained least-squares problem stated in the section on gross error detection is discussed in detail, including the assumptions for its application, the constraint equations, and the selection of the weighted parameter matrix.

The presented data processing approaches can be programmed to detect abnormal data and improve the precision of the mass flow rate data.

At present, the mass flow rates are mostly deduced from the volume flow rates and the density. However, the changes of temperature and pressure during transmission cause the density of the steam to deviate from the original design value [3], so the measurement errors can be very large [1]. Moreover, some superheated steam may turn into a vapor-liquid two-phase medium, which makes the precision even worse.

2. The Complexity of Steam Characteristics

As the ambient temperature changes, the total amount of condensate water formed during transmission varies, which creates a difference between the amount of steam produced and the amount consumed. In addition, steam pipe leakage adds to the difference. Therefore, the accumulated readings are always doubtful.

3. The Occurrence of Wearing or Damage to the Key Components

When an orifice differential-pressure flow meter has been in use for a long time, the size of its aperture will differ from the original size because of adhering foreign bodies or erosion by the continuous high-temperature steam flow. Since the parameters cannot be adjusted in time and the instrument is hard to calibrate, the measurement errors will accumulate.

4. The External Interference or the Failure of the Data Transmission Channel

Disturbances acting on the instruments, or failures in the data transmission channels, introduce significant errors into the data derived in the control center.

The abnormal data from certain sensors (including temperature, pressure and flow rate sensors) may be characterized by rapid fluctuations of large magnitude (induced by poor contacts in the instruments), by holding a constant value without any variation (induced by a failure of the data sampling system), or by lying outside the normal variable range. The first two cases are not discussed here because they are easy to discover. For the last case, statistical process control approaches are applied to determine control limits for data monitoring. When the values of the controlled variables (or functions of them) exceed the limits, abnormal data are indicated, unless the process itself is actually in an abnormal state. Statistical process control includes two types: univariate and multivariate control.

**2.2. Determining control limits for fault data detection**

*2.2.1. Determining the control limits for a single variable*

Statistical process control (SPC) and control charts were first proposed by Shewhart for quality prediction. Traditional SPC mainly treats single variables. When a value lies outside the normal range, the system outputs an alarm signal to notify the operators to check whether the state is abnormal. In this work, if the process state is actually operating normally, such values can be judged as fault data. Reasonably determined control limits reduce the probability of false alarms, and much research has focused on this problem [4-6]. In this work, the empirical distribution function combined with the "3σ" principle [3] is applied to determine the control limits of the different variables.
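The univariate control-limit idea of Section 2.2.1 can be illustrated with a minimal sketch. The chapter combines the empirical distribution function with the "3σ" principle; the sketch below uses only plain mean ± 3σ limits estimated from a historical window of normal operation, and all flow-rate readings are hypothetical:

```python
import statistics

def control_limits(history, k=3.0):
    """Shewhart-style control limits estimated from historical data.

    Returns (lower, upper) = mean -/+ k * standard deviation.
    """
    mu = statistics.fmean(history)
    sigma = statistics.pstdev(history)
    return mu - k * sigma, mu + k * sigma

def flag_faults(values, lower, upper):
    """Indices of measurements falling outside the control limits."""
    return [i for i, v in enumerate(values) if v < lower or v > upper]

# Hypothetical historical flow-rate readings (t/h) from normal operation.
history = [40.1, 39.8, 40.3, 40.0, 39.9, 40.2, 40.1, 39.7, 40.4, 40.0]
low, high = control_limits(history)

# New readings: the spike at index 2 lies outside the limits.
print(flag_faults([40.0, 39.9, 55.0, 40.2], low, high))  # -> [2]
```

In practice the limits would be re-estimated periodically, and an alarm outside the limits only indicates suspect data when the process itself is known to be operating normally.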
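The Measurement Test (MT) mentioned above can be sketched, under assumed linear node-balance constraints, as a serial test that flags the measurement with the largest normalized adjustment. The four-meter network, the readings, and the 1.96 threshold (a 5% significance level) are illustrative assumptions, not values from the chapter:

```python
import numpy as np

def measurement_test(y, A, sigma2, z_crit=1.96):
    """Serial Measurement Test: flag the single most suspect measurement.

    y      : measured flow rates
    A      : linear node-balance constraints (A @ x = 0 for the true flows x)
    sigma2 : measurement variances (diagonal covariance assumed)
    Returns the index of the measurement whose normalized adjustment is
    largest, if it exceeds z_crit; otherwise None.
    """
    S = np.diag(sigma2)
    M = np.linalg.inv(A @ S @ A.T)
    a = S @ A.T @ M @ (A @ y)      # least-squares adjustments to y
    V = S @ A.T @ M @ A @ S        # covariance of the adjustments
    z = a / np.sqrt(np.diag(V))
    i = int(np.argmax(np.abs(z)))
    return i if abs(z[i]) > z_crit else None

# Hypothetical network: x1 = x2 + x3 at one node, x3 = x4 at a second.
A = np.array([[1.0, -1.0, -1.0, 0.0],
              [0.0, 0.0, 1.0, -1.0]])
y = np.array([100.0, 60.0, 40.0, 15.0])    # meter 4 carries a gross error
print(measurement_test(y, A, np.ones(4)))  # -> 3 (meter 4 is most suspect)
```

In serial use, the flagged measurement would be removed or reconstructed and the test repeated until no adjustment exceeds the threshold.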
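The constrained least-squares reconciliation referred to above admits a closed-form solution when the balance constraints are linear. A minimal sketch, with a hypothetical three-meter splitting node and an assumed diagonal weighting (covariance) matrix:

```python
import numpy as np

def reconcile(y, A, sigma2):
    """Weighted least-squares reconciliation subject to A @ x = 0.

    Solves: min (x - y)' S^-1 (x - y)  s.t.  A x = 0,
    where S = diag(sigma2) holds the measurement variances.
    Closed form: x_hat = y - S A' (A S A')^-1 A y.
    """
    S = np.diag(sigma2)
    correction = S @ A.T @ np.linalg.solve(A @ S @ A.T, A @ y)
    return y - correction

# Hypothetical node: stream 1 splits into streams 2 and 3 (x1 = x2 + x3).
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([100.0, 60.0, 45.0])   # raw readings violate the balance by 5 t/h
sigma2 = np.ones(3)                 # equal confidence in all three meters

x_hat = reconcile(y, A, sigma2)
print(x_hat)          # adjusted flows
print(A @ x_hat)      # balance residual ~ 0
```

Meters with smaller variances receive smaller corrections, which is how the choice of the weighting matrix shapes the reconciled values.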
