**2.3. Numerical methods**

Steady, turbulent, thermal flows were considered in the present work, and a commercial software package, Fluent, was used for all simulations. The governing Favre-averaged conservation equations of mass, momentum, and enthalpy are not reproduced here, but can be readily found in [1, 2].
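The equations are standard; for orientation only, a representative compact form of the steady Favre-averaged set is sketched below, written with the Boussinesq eddy-viscosity and gradient-diffusion closures discussed next already substituted, and not necessarily in the exact notation of [1, 2]:

```latex
% Representative steady Favre-averaged equations (sketch, not the exact form of [1, 2]):
% mass, momentum (Boussinesq eddy-viscosity closure), and mean enthalpy
% (gradient-diffusion closure with turbulent Prandtl number Pr_t).
\begin{align*}
  &\frac{\partial (\bar{\rho}\,\tilde{u}_j)}{\partial x_j} = 0, \\
  &\frac{\partial (\bar{\rho}\,\tilde{u}_i\,\tilde{u}_j)}{\partial x_j}
   = -\frac{\partial \bar{p}}{\partial x_i}
   + \frac{\partial}{\partial x_j}\!\left[(\mu + \mu_t)
     \left(\frac{\partial \tilde{u}_i}{\partial x_j}
         + \frac{\partial \tilde{u}_j}{\partial x_i}
         - \frac{2}{3}\,\delta_{ij}\,\frac{\partial \tilde{u}_k}{\partial x_k}\right)\right]
   + \bar{\rho}\,g_i, \\
  &\frac{\partial (\bar{\rho}\,\tilde{u}_j\,\tilde{h})}{\partial x_j}
   = \frac{\partial}{\partial x_j}\!\left[\left(\frac{\mu}{\mathrm{Pr}}
     + \frac{\mu_t}{\mathrm{Pr}_t}\right)
     \frac{\partial \tilde{h}}{\partial x_j}\right].
\end{align*}
```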

To close these partial differential equations, the realizable k-ε turbulence model was applied to model turbulent momentum transfer. A benchmark study of turbulence models indicated that this model was superior to four other popular two-equation models and provided results similar to those of the Reynolds stress model, a second-moment closure [3]. The Reynolds analogy [4] was used to account for turbulent enthalpy transfer, with a turbulent Prandtl number of 0.7, appropriate for this type of pipe flow [5, 6]. Gravitational acceleration of 9.8 m/s² was applied in the direction consistent with the heat exchanger mounting orientation.
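As an illustration of this closure, the turbulent (eddy) thermal conductivity follows from the turbulent viscosity and the turbulent Prandtl number. The short sketch below is for orientation only; the function name and sample values are assumptions, not the solver's internal implementation:

```python
def turbulent_thermal_conductivity(mu_t, cp, pr_t=0.7):
    """Gradient-diffusion closure for the turbulent enthalpy flux.

    Under the Reynolds analogy, the eddy thermal conductivity k_t is tied
    to the eddy viscosity mu_t through the turbulent Prandtl number:
        k_t = cp * mu_t / Pr_t
    """
    return cp * mu_t / pr_t

# Illustrative values only: mu_t in Pa*s, cp of methane near 300 K in J/(kg*K)
k_t = turbulent_thermal_conductivity(mu_t=0.01, cp=2230.0)
print(f"eddy thermal conductivity ~ {k_t:.0f} W/(m*K)")
```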

For the thermal properties of methane, polynomials derived from the NIST JANAF tables [7] were used to calculate the specific heat as a function of temperature, and polynomials fitted to NIST data [8] were used to determine the molecular viscosity and thermal conductivity as functions of temperature.
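A minimal sketch of how such property polynomials can be generated and then evaluated is given below; the tabulated (T, cp) pairs are rough illustrative values for methane, not the NIST data actually used for the fits in this work:

```python
import numpy as np

# Rough illustrative (T, cp) pairs for methane over the range of interest;
# the actual fits were derived from the NIST JANAF tables / NIST data.
T_data  = np.array([300.0, 500.0, 700.0, 900.0, 1100.0])      # K
cp_data = np.array([2230.0, 2890.0, 3490.0, 3960.0, 4330.0])  # J/(kg*K)

# Least-squares cubic fit: cp(T) = a3*T^3 + a2*T^2 + a1*T + a0
coeffs = np.polyfit(T_data, cp_data, deg=3)
cp_of_T = np.poly1d(coeffs)

print(f"cp(600 K) ~ {cp_of_T(600.0):.0f} J/(kg*K)")  # evaluate the fitted polynomial
```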

A segregated solver with a second-order accurate scheme was chosen to resolve the flow fields. At convergence, the imbalance in mass flow rate between the inlet and exit was less than 0.34% for the original design and 0.007% for the modified configuration, while the energy imbalance was 0.38% for the former and 0.007% for the latter. Because of the unsteady nature of the thermal flow field in the original design, its convergence could not reach the level achieved for the modified case. Sixteen cores of a 64-bit Linux cluster, with 4 GB of RAM per core, were used to perform all simulations.
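The quoted imbalances are simply relative differences between the inlet and exit fluxes reported by the solver; a minimal sketch of that bookkeeping is shown below (the function name and the exit value are illustrative, not taken from the actual solutions):

```python
def relative_imbalance(flux_in, flux_out):
    """Relative imbalance between inlet and exit fluxes, in percent."""
    return abs(flux_in - flux_out) / abs(flux_in) * 100.0

# The inlet mass flow rate of 0.237 kg/s is from Table 1; the exit value below
# is a made-up number chosen only to illustrate the arithmetic (~0.34%).
print(f"mass imbalance = {relative_imbalance(0.237, 0.2362):.2f}%")
```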

**Figure 3.** Meshes: (a) mesh for one section of the heat exchanger; (b) mesh at the section across heating elements; and (c) mesh across one heating-element keeper.

| Inlet boundary condition | Value |
| --- | --- |
| Mass flow rate | 0.237 kg/s |
| Temperature | 295.3 K |
| Absolute pressure | 30.4 bar |

**Table 1.** The inlet boundary conditions.

**3. Results and discussion**

**3.1. Results of the original design**

**Figure 4** shows the temperature contours along the longitudinal symmetric plane of the heat exchanger. Significant variation of temperature inside the heater is observed: a high-temperature region exists in the upper portion of the vessel, while low-temperature regions occur in the lower portion. The temperature profiles along the top and bottom walls are displayed in **Figure 5**. The top wall temperature, reaching ~1700 K, is considerably higher than the allowable service temperature of the SA-106 Gr. B steel pipe [9]. Certainly, the heavy-duty vessel could not survive such high-temperature, high-pressure operating conditions. Another important feature in **Figure 4** is the unsteady nature of the thermal flow field: large eddies or fluid pockets randomly occur in the regions along and above the central axis of the vessel.

**Figure 4.** Temperature contours at the longitudinal symmetric plane.

Why is the vessel wall temperature so high when the natural gas is intended to be heated by only ~200 K? And why is the thermal flow field so unsteady? These questions can be answered by analyzing the flow features and characteristics inside the heat exchanger vessel.
