Preface

Entropy is an important physical quantity that reflects the degree of disorder of a system, and it is mainly expressed in three forms: Clausius entropy, Boltzmann entropy, and Shannon entropy. Clausius entropy originated from the second law of thermodynamics, which describes the flow of heat from a high-temperature region to a low-temperature region. For a reversible process, the entropy change is defined as:

$$\mathrm{d}S = \frac{\mathrm{d}Q}{T}$$

where $\mathrm{d}Q$ is the heat exchanged reversibly and $T$ is the absolute temperature.
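As a one-step illustration, let a quantity of heat $Q$ flow from a reservoir at $T_h$ to a reservoir at $T_c < T_h$. The total entropy change is

$$\Delta S = \frac{Q}{T_c} - \frac{Q}{T_h} > 0,$$

so the flow from hot to cold increases the total entropy and is permitted by the second law, while the reverse flow would decrease it.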

Boltzmann entropy is based on statistical thermodynamics and takes the form:

$$\Delta S_{\mathrm{conf}} = k \ln W$$

where $k$ is the Boltzmann constant ($k = 1.38 \times 10^{-23}\ \mathrm{J/K}$) and $W$ is the thermodynamic probability, i.e., the total number of microstates corresponding to a given macrostate.
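For an ideal random mixture, this expression connects Boltzmann entropy to the mixing-entropy formula introduced below (a standard derivation, sketched here for completeness): with $N$ lattice sites occupied by components of atomic fractions $X_i$, the number of microstates is the multinomial coefficient, and Stirling's approximation gives

$$S = k \ln \frac{N!}{\prod_i (N X_i)!} \approx -Nk \sum_i X_i \ln X_i,$$

which, per mole of sites ($N_\mathrm{A} k = R$), reproduces the equation used for entropic materials.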

Shannon entropy is expressed as:

$$\Delta S_{\mathrm{inf}} = -K \sum_i P_i \ln P_i$$

where $P_i$ is the probability of occurrence of the $i$-th message from the information source and $K$ is a positive constant. Shannon entropy measures information: the greater the entropy, the greater the randomness and the less definite the information.
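To make the definition concrete, here is a minimal Python sketch (the function name is illustrative, and $K = 1$ with natural logarithms is assumed) that evaluates the Shannon entropy of a discrete distribution; a uniform distribution maximizes it, consistent with greater entropy meaning greater randomness.

```python
import math

def shannon_entropy(probs, K=1.0):
    """Shannon entropy -K * sum(P_i * ln P_i) of a discrete distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with P_i = 0 contribute nothing (the limit of p*ln(p) as p -> 0 is 0).
    return -K * sum(p * math.log(p) for p in probs if p > 0)

# A certain outcome carries zero entropy; a uniform distribution maximizes it.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386
```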

For entropic materials, the content $X_i$ of each component replaces $P_i$ in the Shannon entropy equation:

$$\Delta S = -R \sum_i X_i \ln X_i$$

where $R$ is the gas constant ($R = 8.314\ \mathrm{J/(mol\cdot K)}$) and $X_i$ is the content of the $i$-th component, expressed as an atomic fraction.

For a random solid solution, the probability that the $i$-th component occupies a given lattice site is proportional to its content; in this sense, the configurational entropy of high-entropy materials is directly analogous to information entropy.
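As a worked example of the mixing-entropy equation (a minimal sketch; the equiatomic composition and function name are illustrative), the snippet below evaluates $\Delta S = -R \sum_i X_i \ln X_i$ for an equiatomic five-component alloy and returns $R \ln 5 \approx 13.38\ \mathrm{J/(mol\cdot K)}$, about $1.61R$, above the $1.5R$ level commonly used to delimit high-entropy alloys.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(fractions):
    """Ideal configurational mixing entropy -R * sum(X_i * ln X_i)."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "atomic fractions must sum to 1"
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Equiatomic five-component alloy: X_i = 1/5 for each component.
dS = mixing_entropy([0.2] * 5)
print(f"{dS:.2f} J/(mol*K)  (= {dS / R:.2f} R)")  # 13.38 J/(mol*K)  (= 1.61 R)
```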

This book summarizes recent developments in high-entropy materials, including their properties, processing, modeling, and applications.

## **Yong Zhang**


North Minzu University and State Key Laboratory for Advanced Metals and Materials, University of Science and Technology Beijing, Beijing, China

## **Section 1: Fundamentals**

## **Chapter 1**
