**5. The view that 'Information is Physical' is the Foundation of Quantum Information Theory**

In a truly remarkable 1948 paper [1], Claude E. Shannon laid down the foundations of the subject. In this paper Shannon states that the main concern of Information Theory is as follows:

*"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point."* [1]

Quantum Information Theory is much richer and more complex than its classical counterpart, and it is inherently interdisciplinary in nature: it touches on multiple fields and brings physicists, computer scientists, and mathematicians together around common goals.

It is far from complete but has already found application areas well beyond the processing and transmission of information. In particular, it provides a new perspective from which to investigate, characterize, and classify the complex behavior of large quantum systems, ranging from materials to chemical compounds, high-energy physics, and even the holographic principle.

Nevertheless, even if Quantum Information Theory reinforces the notion that 'Information is Physical' on the basis of quantum physics, the notion itself is also relevant within classical physics.

#### **5.1 Shannon's definition of quantity of information**

Shannon defined the 'quantity of information' produced by a source, for example the quantity of information in a message, by a formula, H = −Σᵢ pᵢ log₂ pᵢ, similar in form to the equation that defines entropy in thermodynamics. In classical thermodynamics, entropy is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system: it predicts that certain processes are irreversible or impossible, even though they do not violate the conservation of energy.

Shannon introduced, as his most basic term, the notion of **informational entropy**: the number of binary digits required to encode a message. This might appear today to be a simple, even obvious, way to define how much information a message contains. In 1948, however, at the very origin of the information age, the digitization of information of any sort was considered a revolutionary step. Shannon's 1948 paper may have been the first published use of the word "bit," short for binary digit.
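As an illustration, informational entropy can be computed directly from the symbol probabilities of a source. A minimal sketch in Python (the helper name `shannon_entropy` is ours, introduced here for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss:
print(shannon_entropy([0.5, 0.5]))            # 1.0
# A biased coin carries less, because its outcomes are more predictable:
print(round(shannon_entropy([0.9, 0.1]), 3))  # 0.469
```

The second value shows the sense in which entropy measures "surprise": the more predictable the source, the fewer binary digits are needed per symbol.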

Shannon's paper contained two theorems. The first of these is the *source coding theorem*, which gives a formula for how much a source emitting random signals can be compressed, while still permitting the original signals to be recovered with high probability.
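The compression bound of the source coding theorem can be observed empirically. The sketch below (our construction, using Python's general-purpose `zlib` compressor, which is not an optimal source code) compresses a long string drawn from a biased binary source and compares the achieved rate to the entropy bound; the compressor lands above the bound but far below the raw byte encoding:

```python
import math
import random
import zlib

random.seed(0)
n = 100_000
p = 0.1  # probability of emitting the symbol '1'

# Draw n symbols from a memoryless binary source, one byte per symbol.
data = bytes(ord('1') if random.random() < p else ord('0') for _ in range(n))

# Entropy bound: about 0.469 bits per symbol for p = 0.1.
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

compressed = zlib.compress(data, level=9)
bits_per_symbol = 8 * len(compressed) / n
print(f"entropy bound: {H:.3f} bits/symbol")
print(f"zlib achieves: {bits_per_symbol:.3f} bits/symbol (raw encoding: 8)")
```

No lossless code can do better than H bits per symbol on average, which is exactly what the theorem asserts.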

The second theorem, the *channel coding theorem*, states that n uses of a noisy channel N can, with high probability, communicate nC − o(n) bits reliably, where C is the capacity of the channel.

Thus, a new approach emerges from treating information as a quantum concept and asking what new insights can be gained by encoding this information in individual quantum systems.

#### **5.2 Generalizations and Laws in quantum information science**

While we often treat **information** in abstract terms (especially in the context of computer science), it is more correct to think of **information** as being represented as different physical states and obeying the **laws** of physics.

However, what does it mean to say that information obeys the laws of physics? In Quantum Information Theory, this amounts to claiming that both the transmission and processing of information are governed by **quantum laws defined in terms of "qubits"** (and not by classical "bits"). Since qubits behave quantum mechanically, in terms of quantum probability amplitudes, we can also capitalize on them to explain the two most important phenomena of quantum information science, viz. "superposition" and "entanglement."
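Both phenomena can be sketched with nothing more than lists of complex amplitudes. In the toy simulation below (pure Python, function names ours), a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles it with a second qubit, producing a Bell state whose only possible measurement outcomes are 00 and 11:

```python
import math

# A qubit state is a unit vector of amplitudes; outcome probabilities are |amp|^2.
S = 1 / math.sqrt(2)

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a, b]."""
    a, b = state
    return [S * (a + b), S * (a - b)]

def cnot(state):
    """Apply CNOT to a 2-qubit state [c00, c01, c10, c11]; qubit 0 is the control."""
    c00, c01, c10, c11 = state
    return [c00, c01, c11, c10]

def kron(q0, q1):
    """Tensor product of two single-qubit states -> one 2-qubit state."""
    return [a * b for a in q0 for b in q1]

# Superposition: H|0> = (|0> + |1>) / sqrt(2), each outcome with probability 1/2.
plus = hadamard([1, 0])

# Entanglement: CNOT applied to (H|0>) (x) |0> gives the Bell state (|00> + |11>) / sqrt(2).
bell = cnot(kron(plus, [1, 0]))
print([round(abs(c) ** 2, 3) for c in bell])  # [0.5, 0.0, 0.0, 0.5]
```

The output shows the signature of entanglement: the outcomes 01 and 10 have probability zero, so measuring either qubit instantly fixes the other, even though each qubit alone looks perfectly random.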
