2.2 Definitions

• The second law: An element of irreversible heat transferred, δQ, is the product of the temperature T and the increment of its conjugate variable S (i.e., δQ = TdS).

• The third law: As T → 0, S → constant, and S = kB ln Ω, where Ω is the number of microstates.

The entropy S is defined in the second law of thermodynamics, and its fundamental property is described in the third law, linking the macroscopic element of irreversible heat transferred (i.e., δQ) and the microstates of the system. Suppose you have N objects (e.g., people) and need to position them in a straight line consisting of the same number of seats. The first and second objects have N and N − 1 choices, respectively; similarly, the third one has N − 2 choices, the fourth one has N − 3 choices, and so on. The total number of ways of this experiment is as follows:

$$N \cdot (N-1) \cdot (N-2) \cdot (N-3) \cdots 2 \cdot 1 = N! \tag{2}$$

Example 1: In a car, there are four seats, including the driver's. Three guests will occupy the three remaining seats. How many different configurations are available? There are three people, A, B, and C, and three seats, S1, S2, and S3. If A chooses a seat first, then A has three choices; then B and C have, in sequence, two and one choices, respectively. The total number of possible configurations is therefore 3 × 2 × 1 = 3! = 6.

Next, the N objects are divided into two groups, where group 1 and group 2 contain N1 and N2 objects, respectively, with N1 + N2 = N. Then, the total number of possible ways to place the N objects into the two groups is

$$\frac{N!}{N\_1! N\_2!}$$

which is equal to the number of combinations of N objects taking N1 objects at a time,

$$C\_N^{N\_1} = \frac{N!}{N\_1! \left( N - N\_1 \right)!} \tag{3}$$

For example, consider the following equation of a binomial expansion:

$$(x + y)^3 = 1 \cdot x^3 + 3x^2 y + 3xy^2 + 1 \cdot y^3 = \sum\_{n=0}^{3} a\_n x^n y^{3-n} \tag{4}$$

where a0 = a3 = 1 and a1 = a2 = 3. For a general power N, the corresponding expansion is

$$(x\_1 + x\_2)^N = \sum\_{N\_1=0}^{N} \sum\_{N\_2=0}^{N} \frac{N!}{N\_1! N\_2!} \, x\_1^{N\_1} x\_2^{N\_2} \tag{5}$$

where N1 + N2 = N, or equivalently

$$(x\_1 + x\_2)^N = \sum\_{k=0}^{N} C\_N^k \, x\_1^k \, x\_2^{N-k} \tag{6}$$

where

$$C\_N^k = \frac{N!}{k! \left( N - k \right)!} = C\_N^{N-k}$$

If we add x3 with the constraint condition N1 + N2 + N3 = N, then

$$(x\_1 + x\_2 + x\_3)^N = \sum\_{N\_1=0}^{N} \sum\_{N\_2=0}^{N} \sum\_{N\_3=0}^{N} \frac{N!}{N\_1! N\_2! N\_3!} \, x\_1^{N\_1} x\_2^{N\_2} x\_3^{N\_3} \tag{7}$$

Non-Equilibrium Particle Dynamics
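The counting results above can be checked numerically. The following sketch (illustrative only; the values N = 10, N1 = 4 and x1 = 2, x2 = 3 are arbitrary choices, not from the text) verifies the factorial count of Eq. (2), the combination formula of Eq. (3), and the binomial identity of Eq. (6) using Python's standard library:

```python
from math import comb, factorial

# Example 1: three guests choosing three seats -> 3! orderings
assert factorial(3) == 6

# Splitting N = 10 objects into groups of N1 = 4 and N2 = 6:
# N!/(N1! N2!) equals the binomial coefficient C(N, N1), Eq. (3)
N, N1 = 10, 4
ways = factorial(N) // (factorial(N1) * factorial(N - N1))
assert ways == comb(N, N1)  # 210

# Symmetry of Eq. (6)'s coefficients: C_N^k = C_N^(N-k)
assert comb(N, N1) == comb(N, N - N1)

# Binomial theorem, Eq. (6): (x1 + x2)^N = sum_k C(N,k) x1^k x2^(N-k)
x1, x2 = 2.0, 3.0
lhs = (x1 + x2) ** N
rhs = sum(comb(N, k) * x1**k * x2 ** (N - k) for k in range(N + 1))
assert abs(lhs - rhs) < 1e-6
```

The integer division `//` keeps the factorial ratio exact, avoiding the floating-point error that `/` would introduce for large N.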

### 2.2.1 Boltzmann's entropy

A thermodynamic system is assumed to have a number of small micro-systems. Say that there are N micro-systems and m (≤ N) thermodynamic states. This situation is similar to N (= 10) balls in m (= 3) containers. The numbers of balls in containers 1, 2, and 3 are N1, N2, and N3, respectively. Then the total number of different configurations of the micro-systems in the m micro-states is defined as

$$\Omega\_N = \frac{N!}{\prod\_{k=1}^m N\_k!} \tag{11}$$

Boltzmann proposed a representation of entropy of the entire ensemble as

$$\mathbf{S}\_B = k\_B \ln \mathbf{\Omega}\_N \tag{12}$$
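Eqs. (11) and (12) can be evaluated directly for the 10-balls-in-3-containers illustration above. The split N1, N2, N3 = 5, 3, 2 below is an arbitrary example configuration, chosen only for this sketch:

```python
from math import factorial, log

def multiplicity(counts):
    """Omega_N = N! / prod(N_k!) -- number of configurations, Eq. (11)."""
    omega = factorial(sum(counts))
    for n in counts:
        omega //= factorial(n)
    return omega

# 10 balls in 3 containers, e.g., N1, N2, N3 = 5, 3, 2
omega = multiplicity([5, 3, 2])  # 10!/(5! 3! 2!) = 2520
S_over_kB = log(omega)           # Boltzmann entropy in units of k_B, Eq. (12)
```

Note that the multiplicity counts configurations, not states: a single macrostate (5, 3, 2) corresponds to 2520 microstates.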

### 2.2.2 Gibbs entropy

The Gibbs entropy can be written using Ω, as

$$\frac{\mathcal{S}}{k\_B} = \ln \Omega\_N = \ln \frac{N!}{\prod\_{k=1}^m N\_k!} = \ln N! - \sum\_{k=1}^m \ln N\_k!$$

and using Stirling's formula as

$$
\ln N! \simeq N \ln \left( N/e \right)
$$

for large N (≫ 1), to derive

$$\frac{\mathcal{S}}{k\_B} \simeq N \ln \left( N/e \right) - \sum\_{k=1}^{m} N\_k \ln \left( N\_k/e \right) = -N \sum\_{k=1}^{m} \left( \frac{N\_k}{N} \right) \cdot \ln \left( \frac{N\_k}{N} \right)$$

Finally, we have

$$\mathcal{S} = -k\_B N \sum\_{k=1}^{m} p\_k \ln p\_k$$

where pk = Nk/N is the probability of finding the system in thermodynamic state k. Gibbs introduced a form of entropy as

$$s\_G = -k\_B \sum\_{k=1}^{m} p\_k \ln p\_k$$

which is equal to the system entropy per object or particle, denoted as

$$s\_G = \frac{\mathcal{S}}{N} = -k\_B \sum\_{k=1}^{m} p\_k \ln p\_k$$
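A small numerical sketch (with assumed occupation numbers, not from the chapter) shows how the Stirling-based Gibbs form approaches the exact Boltzmann count ln Ω_N as N grows:

```python
from math import factorial, log

def exact_entropy(counts):
    """S/k_B = ln N! - sum(ln N_k!), computed exactly."""
    s = log(factorial(sum(counts)))
    for n in counts:
        s -= log(factorial(n))
    return s

def gibbs_entropy(counts):
    """Stirling-based form: S/k_B ~ -N sum(p_k ln p_k), p_k = N_k/N."""
    N = sum(counts)
    return -N * sum(n / N * log(n / N) for n in counts if n > 0)

# With N = 1000 split as (500, 300, 200), the two agree closely;
# the relative error shrinks further as N increases.
exact = exact_entropy([500, 300, 200])
approx = gibbs_entropy([500, 300, 200])
rel_err = abs(approx - exact) / exact  # under 1% at N = 1000
```

Python's `math.log` accepts arbitrarily large integers, so the exact factorial ratio can be evaluated without overflow.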

### 2.2.3 Shannon's entropy

In information theory, Shannon's entropy is defined as [2]

$$\mathcal{S}\_{\rm Sh} = -\sum\_{i} p\_i \log\_b p\_i \tag{13}$$
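Eq. (13) is straightforward to evaluate for simple distributions. The helper below is a minimal sketch (`shannon_entropy` is a hypothetical name, not from the text); choosing b = 2 gives the entropy in bits:

```python
from math import log

def shannon_entropy(probs, b=2):
    """S_Sh = -sum(p_i log_b p_i), Eq. (13); b = 2 yields bits."""
    return -sum(p * log(p, b) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a fair six-sided dice
# carries log2(6) ~ 2.585 bits per roll.
H_coin = shannon_entropy([0.5, 0.5])   # 1.0
H_dice = shannon_entropy([1 / 6] * 6)  # ~ 2.585
```

The `p > 0` guard implements the usual convention 0 · log 0 = 0, so distributions with impossible outcomes are handled safely.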

As the digital representation of integers is binary, the base b is often set to two. Note that Shannon's entropy is identical to the Gibbs entropy if Boltzmann's constant kB is discarded and the natural logarithm ln = log_e is replaced by log_2. Entropy only takes into account the probability of observing a specific event, so the information it encapsulates is information about the underlying probability distribution, not the meaning of the events themselves. Example 3 deals with tossing a coin or a dice and shows how the entropy S increases with the number of available outcomes.

Example 3: Let's consider two conventional examples, i.e., a coin and a dice. Their Gibbs entropy values (i.e., entropy per object) are

$$\frac{s\_{\text{coin}}}{k\_B} = -\sum\_{k=1}^{2} p\_k \ln p\_k = -\sum\_{k=1}^{2} \left( \frac{1}{2} \cdot \ln \frac{1}{2} \right) = \ln 2 = 0.6931$$

$$\frac{s\_{\text{dice}}}{k\_B} = -\sum\_{k=1}^{6} \frac{1}{6} \ln \frac{1}{6} = \ln 6 = 1.7918$$

The system entropies of the coin and the dice are

$$S\_{\text{coin}}/k\_B = 2 \times 0.6931 = 1.386$$

$$S\_{\text{dice}}/k\_B = 6 \times 1.7918 = 10.751$$

and their ratio is

$$\frac{S\_{\text{dice}}}{S\_{\text{coin}}} = \frac{6 \times \ln 6}{2 \times \ln 2} = 3 \cdot \frac{\ln 6}{\ln 2} = 3 \times 2.5850 = 7.755 \neq 3$$

where three indicates the ratio of the number of available cases of a dice (6) to that of a coin (2). The entropy ratio, 7.755, is higher than the ratio of available states, 3.

Fundamentals of Irreversible Thermodynamics for Coupled Transport
DOI: http://dx.doi.org/10.5772/intechopen.86607

3. Diffusion: an irreversible phenomenon

Diffusion refers to a net flow of matter from a region of high concentration to a region of low concentration; it is an entropy-increasing process, from a more ordered to a less ordered state of molecular locations. For example, when a lump of sugar is added to a cup of coffee for a sweeter taste, the solid cube of sugar dissolves, and the molecules spread out until they are evenly distributed. This change from a localized to a more even distribution is a spontaneous and, more importantly, irreversible process. In other words, diffusion occurs by itself without external driving forces, and once diffusion occurs, it is not possible for the molecular distribution to return to its original undiffused state. If diffusion did not occur spontaneously, there would be no natural mixing, and one would have a bitter coffee taste and a sweet sugar taste in an unmixed liquid phase. In general, diffusion is closely related to mixing and demixing (separation) within a plethora of engineering applications. Why does diffusion occur, and how do we understand this spontaneous phenomenon? A key is the rate of entropy change from one static equilibrium to another. Before discussing diffusion as an irreversible phenomenon, however, the following section presents several pictures to create a better understanding of the diffusion phenomenon as one of the irreversible thermodynamic processes.

3.1 Mutual diffusion

Diffusion is often driven by the concentration gradient, referred to as ∇c, typically at a finite volume, temperature, and pressure. As temperature increases, molecules gain kinetic energy and diffuse more actively, positioning themselves evenly within the volume. A general driving force for isothermal diffusion is the gradient of the chemical potential, ∇μ, between regions of higher and lower concentrations.

As shown in Figure 1, diffusion of solute molecules after removing the mid-wall is spontaneous. Initially, two equal-sized rectangular chambers A and B are separated by an impermeable wall between them. The thickness of the mid-wall is negligible in comparison to the box length, and each of the chambers A and B contains the same amount of water. Chamber A contains seawater of salt concentration 35,000 ppm, and chamber B contains fresh water of zero salt concentration. If the separating wall is removed slowly enough not to disturb the stationary solvent medium but fast enough to initialize a sharp concentration boundary between the

Figure 1.
Diffusion in a rectangular container consisting of two equal-sized chambers A and B (a) before and (b) after the mid-wall is removed.
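The spontaneous spreading of solute from a sharp concentration boundary can be illustrated with a toy model: a 1-D random walk of particles initially confined to one half of the box. This is a hypothetical sketch (the chapter presents no code); the cell count, particle number, and step count are arbitrary choices.

```python
import random

# Minimal 1-D random-walk sketch of mutual diffusion: solute particles
# start in the left half of a box (chamber A) and hop randomly once the
# mid-wall is removed.
random.seed(0)
n_cells, n_particles, n_steps = 20, 200, 2000
positions = [random.randrange(n_cells // 2) for _ in range(n_particles)]

for _ in range(n_steps):
    for i in range(n_particles):
        step = random.choice((-1, 1))
        # reflecting walls at the ends of the box
        positions[i] = min(max(positions[i] + step, 0), n_cells - 1)

# After many hops, the sharp boundary relaxes toward a uniform profile:
# roughly half the particles end up in each half of the box.
frac_left = sum(x < n_cells // 2 for x in positions) / n_particles
```

The walk is irreversible in practice: although each hop is reversible, the probability of all particles spontaneously returning to the left half is vanishingly small, which mirrors the entropy argument of the text.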
