## **1. Introduction**

### **1.1 History of computing**

Evolution in one area of science and technology often leads to the discovery of another. In less than a century, research and development in computing technology have massively transformed science, technology, and society. The first practical computers of the early 20th century could not carry out mathematical computations on their own: practical devices need a solid physical implementation of theoretical concepts. Nowadays, computers solve problems instantly and accurately, provided the input is relevant and the instructions given are well formed.

It all started during World War II, when Alan Turing conceived a true general-purpose computer with a stored-program model, known as the 'Universal Turing Machine'. This design was refined by von Neumann and is now the dominant architecture for almost every computer. Computers and their physical components kept improving over time in performance and capability, and gradually the computer industry grew larger than the military programs that initiated it. Humanity's growing understanding of and control over nature and physical systems has given us the electronic devices we use today [1].

### **2. A new kind of computing**

Today's computers are smaller, cheaper, faster, more efficient, and more powerful than early computers, which were huge, costly, and power-hungry. This became possible through improvements in architecture, hardware components, and the software running on them. The electronic circuits used in computers keep shrinking. Transistors are small semiconductor devices used to amplify and switch electrical signals. They are fabricated on a piece of silicon, and a circuit is made by connecting many transistors together on a single silicon surface. The shapes of all the circuits in an IC are printed onto the silicon layers at the same time, so the process takes the same amount of time even when the number of transistors in the circuit increases. The production cost of an IC is therefore determined by the size of the silicon die, not by the number of transistors. This reduced the price of products, which increased the manufacture and sale of ICs, and with them sales and profits. Design progressed from connecting individual transistors, to collections of transistors (logic gates), and finally to collections of logic gates connected within a single integrated circuit (IC). Nowadays, a single IC can even integrate a small computer onto it.

Gordon Moore, co-founder of Intel, observed in 1965 that the number of transistors on a silicon chip had doubled roughly every year since the invention of the integrated circuit, while costs fell by about half. This observation is known as Moore's Law. Moore's Law matters because it means that computers and their computing power keep getting smaller, cheaper, and faster over time. The law is slowing down now, however, and consequently the improvement in classical computers is no longer what it used to be [2].

#### *Introduction to Quantum Computing DOI: http://dx.doi.org/10.5772/intechopen.94103*

This leads to the idea of the smallest possible computer, obtained by shrinking circuit features down toward the size of an atom. At that scale, however, circuits can no longer act as reliable switches, because electrons can disappear from one side of a barrier and reappear on the other side, i.e. they can effectively exist in more than one place at the same time. This quantum-mechanical phenomenon is called "quantum tunneling", and it implies that the circuits of classical computers reach their limit at feature sizes of around 5–7 nanometers. Representation and processing in classical computers are described by the laws of classical physics, which give a purely deterministic account of the Universe. But classical physics fails to predict all observable phenomena occurring in nature, and this failure led to the discovery of quantum mechanics, the biggest changeover in physics. Thus there is a need for a new kind of computing, different from current classical computing, that stores its state in quantum physical information rather than in a classical circuit. Since quantum phenomena place new constraints on computer design, they change the basic building blocks of a computer: not only must new types of hardware be created, but also new designs, software, and layers of abstraction that let designers create and exploit these systems even as their complexity scales over time. The design of the hardware components has to be governed by quantum properties [3].

**Quantum Computing** is a new kind of computing based on quantum mechanics, which describes a physical world that is probabilistic and unpredictable in nature. Because quantum mechanics is a more general model of physics than classical mechanics, it gives rise to a more general model of computing, quantum computing, with the potential to solve problems that classical computing cannot solve efficiently. To store and manipulate information, it uses its own quantum bits, called '*qubits*', unlike classical computers, which use the binary bits 0 and 1 individually. Computers built on this kind of computing are known as '*quantum computers*'. At these scales, circuits built from transistors, logic gates, and integrated circuits are not possible; instead, subatomic particles such as atoms, electrons, photons, and ions serve as the bits, with information encoded in their spins and states. Because these particles can be superposed, they can represent more combinations of values at once, enabling a form of parallelism that uses memory efficiently and is correspondingly more powerful. Quantum computing is the only known model of computing that could violate the extended Church-Turing thesis, which is why quantum computers may perform some computations exponentially faster than classical computers.

#### **3. Need for quantum computers**

Quantum computers can solve any computational problem that a classical computer can. By the Church-Turing thesis, the converse is also true: classical computers can, in principle, solve any problem a quantum computer can. This means quantum computers provide no extra benefit over classical computers in terms of computability. However, there are complex problems that today's conventional computers cannot solve in a practical amount of time, because they demand too much computational power. Quantum computers can solve some of these problems with exponentially lower time complexity; the demonstration of such an advantage is known as "**Quantum Supremacy**" [4].

In 1994, Peter Shor showed that quantum computers could solve some of these problems considerably more efficiently: he developed a quantum algorithm for factoring large numbers quickly. Because quantum calculations operate on the probabilities of a system's states before they are actually measured, they have the potential to process exponentially large quantities of data. Shor's result also implies that a practical quantum computer could break widely used cryptographic codes, putting the security of encrypted data and communication at risk and potentially exposing private and protected information. The advantages of quantum computers, however, are considered to significantly outweigh this flaw. Hence, they are still needed, and research continues toward a brighter future.
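Everything in Shor's algorithm except the period-finding step is classical number theory. The sketch below (in Python, for illustration) shows that classical post-processing; the brute-force period search stands in for the quantum subroutine, which is where the exponential speed-up actually comes from:

```python
from math import gcd

def factor_from_period(N, a):
    """Factor N given a base a coprime to N, via the order of a mod N.
    Shor's quantum algorithm finds this period exponentially faster;
    the loop below is a slow classical stand-in for that step."""
    # Find the smallest r > 0 with a^r = 1 (mod N).
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2 != 0:
        return None              # odd period: retry with another base a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None              # trivial square root: retry with another base
    # Non-trivial square root of 1 mod N yields the factors.
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_from_period(15, 7))  # period of 7 mod 15 is 4 → factors (3, 5)
```

The choice N = 15, a = 7 is the standard textbook instance; the period of 7 modulo 15 is 4, and the two gcd computations recover the factors 3 and 5.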

#### **4. Fundamentals of quantum computing**

When designing conventional computers, engineers assumed that transistor performance, especially as transistors shrink, would be degraded by noise if any quantum phenomenon took place, so they tried to avoid quantum phenomena in their circuits entirely. The quantum computer takes the opposite approach: instead of using classical bits, it works on quantum phenomena directly. It uses quantum bits, which are analogous to classical bits in having two states, 0 and 1, except that quantum properties allow a qubit to hold both values simultaneously, leading to the concept of superposed bits.

#### **5. Where did the concept of bits come from?**

Transistors are the fundamental building blocks of an IC and are connected by wires within a circuit. They conduct electrical signals between devices, and communication between transistors within an IC takes place through these signals. The behavior of the signals is analog in nature: their values are real numbers that vary smoothly between 0 and 1. Electrical signals also interact with the environment, resulting in noise, so a small shift from 0 to 0.1 caused by temperature or vibrations from the environment can drastically change the system's behavior. Two types of noise are present. The first arises from sudden energy fluctuations within the object itself, such as thermal fluctuations at any temperature above absolute zero; this type of noise is fundamental in nature. The second type is the consequence of signal interactions; it is systematic in nature and could in principle be corrected or designed away, but it is often left uncorrected at the hardware layer [5].

To overcome this noise in analog circuits, ICs are built from transistors arranged to operate on digital signals (binary bits) instead of analog signals. These circuits are called 'logic gates'. They interpret electrical signals carrying real-number values as a binary digit, or 'bit', of either 0 (low voltage) or 1 (high voltage). Registers are another type of circuit, which store one or more input bits for further processing. Gates remove noise from a signal by limiting the set of values the signal can hold. Building ICs from logic gates rather than from individual transistors simplifies design: it produces powerful circuits that are insensitive to design and fabrication issues, and it gives designers an abstraction that lets them focus on gate functions (Boolean functions, defined by the rules of Boolean algebra) rather than on circuit-level issues. Automated design tools can then map the required logic onto gates. A standard library containing a set of tested logic gates is integrated into the silicon chip design using the chosen manufacturing technology. Using digital logic and standard libraries, negligible error rates can be achieved, which makes the design robust. In addition, data in memory is encoded with some redundant bits using an error-correcting code; the code is checked at regular intervals to detect errors. This also helps with other aspects of design, such as testing and debugging.
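The idea of adding redundant bits so that errors can be detected and corrected can be illustrated with the simplest possible scheme, a triple-repetition code with majority voting. This is only a toy sketch in Python; real memories use far more efficient codes:

```python
def encode(bit):
    # Triple-repetition code: store each logical bit three times.
    return [bit] * 3

def decode(codeword):
    # Majority vote: any single flipped copy is outvoted by the other two.
    return 1 if sum(codeword) >= 2 else 0

word = encode(1)
word[0] ^= 1           # noise flips one of the stored copies
print(decode(word))    # majority vote still recovers the original bit: 1
```

A single bit flip in any of the three copies is corrected; the price is storing three physical bits per logical bit, which is why practical codes trade redundancy for efficiency.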

**Quantum Bit, or Qubit**, is the fundamental unit of quantum information. Subatomic particles such as atoms, electrons, and photons serve as the computer's memory, while their control mechanisms work as the computer's processor. A qubit can take the value 0, 1, or both simultaneously, and for certain problems a machine built from qubits could outperform today's strongest supercomputers by enormous factors. Producing and managing qubits is a tremendous engineering challenge. Qubits have both a digital and an analog nature, which is what gives quantum computers their computational power: their analog nature means that quantum gates have no inherent noise limit, while their digital nature provides a means to recover from this serious weakness. The logic gates and abstractions created for classical computing therefore cannot be carried over directly to quantum computing. Quantum computing may borrow ideas from classical computing, but it needs its own methods for tolerating processing variations and noise of any type, and its own strategies for debugging errors and handling defects in design.

A qubit has two quantum states, analogous to the classical binary states. The qubit can be in either state, or in a superposition of both states simultaneously. These quantum states are written in a standard representation known as Dirac notation [6].

In this notation, the state label is placed between the symbols | and ⟩, so the two basis states are written |0⟩ and |1⟩. Both states can participate with analog amplitudes, provided the probabilities of the outcomes sum to 1. Any quantum bit wave function can thus be expressed as a linear combination of the two states, each with its own complex coefficient: |w⟩ = x |0⟩ + y |1⟩, where x and y are the coefficients of the two states. The probability of a measurement outcome is given by the squared magnitude of its coefficient: |x|<sup>2</sup> is the probability of finding the qubit in state 0, and |y|<sup>2</sup> is the probability of finding it in state 1. Summed together, these probabilities must equal 1, or 100%: |x|<sup>2</sup> + |y|<sup>2</sup> = 1.
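This normalization rule can be checked numerically. The following minimal NumPy sketch (a classical simulation, for illustration only) uses the arbitrarily chosen example amplitudes x = 0.6 and y = 0.8i:

```python
import numpy as np

# |w> = x|0> + y|1> with hypothetical example amplitudes x = 0.6, y = 0.8i.
x, y = 0.6, 0.8j
w = np.array([x, y])          # the qubit's state vector

p0 = abs(w[0]) ** 2           # probability of measuring outcome 0
p1 = abs(w[1]) ** 2           # probability of measuring outcome 1
print(p0, p1)                 # ~0.36 and ~0.64: the probabilities sum to 1
```

Any pair of complex amplitudes works, as long as |x|² + |y|² = 1; amplitudes that do not satisfy this do not describe a valid qubit state.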

#### **6. Properties of quantum computing**

In quantum physics, a quantum object does not exist in an entirely determined state: it looks like a particle but behaves like a wave when it is not being observed. This dual nature of particles leads to interesting physical phenomena. The state of any quantum object is expressed as a sum of possible participating states, its wavefunction. Such states are coherent, because the participating states interfere with one another either constructively or destructively. Observing a quantum object, by letting it interact with some larger physical system, extracts information from it; such an observation is called a quantum measurement. Measurement can also destroy information by disrupting the quantum state. These are some of the properties of quantum objects; in quantum computing, the quantum objects in question are the qubits. The evolution of any quantum system is governed by Schrödinger's equation, which describes how the system's wavefunction changes under its energy environment. This environment is the system's Hamiltonian, a mathematical description of the energies arising from all the forces felt by all components of the system. To control a quantum system, one must control this environment, by isolating the system from the forces of the universe that cannot easily be controlled and by applying energy only within the isolated region. A system can never be completely isolated; however, exchanges of energy and information can be minimized. Interaction with the outside environment leads to loss of coherence, a process called "decoherence" [7].
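Schrödinger evolution can be made concrete for a single qubit. The NumPy sketch below (a classical simulation, for illustration) uses the Pauli-X matrix as a toy Hamiltonian, chosen arbitrarily because its matrix exponential has a simple closed form:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])           # Pauli-X, used as a toy Hamiltonian

def evolve(state, t):
    """Apply U(t) = exp(-iXt) = cos(t) I - i sin(t) X, the solution of
    Schrodinger's equation for Hamiltonian X (valid because X @ X = I)."""
    U = np.cos(t) * I2 - 1j * np.sin(t) * X
    return U @ state

psi0 = np.array([1, 0], dtype=complex)   # qubit starts in |0>
psi = evolve(psi0, np.pi / 2)            # evolve for a quarter period
print(np.abs(psi) ** 2)                  # ~[0, 1]: the qubit has rotated to |1>
```

The evolution is unitary, so the probabilities always sum to 1; decoherence, by contrast, is precisely the kind of non-unitary disturbance this isolated model leaves out.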

The properties are the conceptual rules and mathematical manifestations that describe the behavior of the particles. Quantum computers use three fundamental properties of quantum mechanics to store, represent, and perform operations on data in such a way that they can compute exponentially faster than any classical computer. The three properties are as follows [8]:

#### • Superposition

Superposition is a fundamental principle of quantum mechanics. It states that any two quantum states can be added together (superposed), and the result is another valid quantum state. Conversely, every quantum state can be represented as a sum of two or more other distinct states.

Superposition in quantum computing refers to the ability of a quantum particle, or qubit, to exist in multiple states at the same time. It enables high-speed parallel processing in a way that has no classical equivalent, since classical bits are bound by binary constraints. A quantum computer holds information that exists in two states simultaneously; qubits are brought into superposition by manipulating them, for example with lasers, so that a qubit stores 0 and 1 at the same time. In classical computing, 2 bits can take 4 possible combined values, but only one of those values can be held at any instant. With 2 qubits in a quantum computer, all 4 combined values are possible at once. This seems unthinkable, because unlike gravity it cannot be demonstrated simply by watching an apple fall: the laws of classical physics fail here, since superposition exists only in the realm of quantum particles.

For example, when solving a puzzle such as a maze, a quantum particle can take multiple paths at the same time using superposition, much as a parallel computer explores paths concurrently. Thanks to this property, a qubit can navigate the maze in exponentially less time than a classical bit.
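The 2-qubit counting argument above can be checked numerically. The NumPy sketch below (a classical simulation, for illustration) puts two qubits into equal superposition with Hadamard gates and shows that all four basis states carry probability at once:

```python
import numpy as np

# Hadamard gate: puts a single qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
q0 = np.array([1, 0])              # a qubit prepared in |0>

plus = H @ q0                      # (|0> + |1>)/sqrt(2)
two_qubits = np.kron(plus, plus)   # joint state of two superposed qubits

# All four basis states |00>, |01>, |10>, |11> hold amplitude simultaneously,
# each with measurement probability 1/4.
print(np.abs(two_qubits) ** 2)     # ~[0.25 0.25 0.25 0.25]
```

With n qubits the state vector has 2ⁿ amplitudes, which is why simulating quantum systems classically becomes intractable so quickly.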

#### • Entanglement

Entanglement in quantum mechanics is a physical phenomenon in which two or more quantum objects are intrinsically linked, so that measuring one determines the possible measurement outcomes of the other. In other words, particles in a pair or group that interact or share spatial proximity become correlated in such a way that the quantum state of each particle cannot be characterized independently of the states of the other particles in the group, even when they are separated by a large distance.

Entanglement is one of the most important properties for quantum computing. It refers to the strong correlation that can exist between two quantum particles or qubits. Entangled qubits are linked in a seemingly instantaneous connection even when separated by vast distances, as if located at opposite ends of the Universe; they are entangled, or defined, with reference to each other, and the state of one particle influences the state of the other. This creates strong correlations between qubits, and once entangled, they stay connected no matter how far apart they are. In a classical computer, doubling the number of bits doubles the computational power; but thanks to entanglement, adding extra qubits to a quantum computer can increase its computational power exponentially. A quantum computer exploits this property in a sort of quantum daisy chain.

Examples of entanglement can be seen in nature: electrons separated from each other at some distance inside an electron cloud can be massively entangled with one another. If one electron is in a superposition of spin-up and spin-down, each with probability ½, the same holds for the other electron.
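A standard illustration of entanglement is the Bell state (|00⟩ + |11⟩)/√2. The NumPy sketch below (a classical simulation, for illustration) prepares it with a Hadamard gate followed by a CNOT, and shows that only the perfectly correlated outcomes 00 and 11 are possible:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                 # flips the second qubit
                 [0, 1, 0, 0],                 # when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

zero = np.array([1, 0])
state = np.kron(zero, zero)                    # |00>
state = np.kron(H, np.eye(2)) @ state          # (|00> + |10>)/sqrt(2)
state = CNOT @ state                           # (|00> + |11>)/sqrt(2)

# Only |00> and |11> carry probability: measuring one qubit fixes the other.
print(np.abs(state) ** 2)                      # ~[0.5 0 0 0.5]
```

Notice that this state cannot be written as a product of two single-qubit states, which is exactly what it means for the qubits to be entangled.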

#### • Interference

Interference in quantum computing is analogous to wave interference in classical physics. Wave interference happens when two waves interact in the same medium: when the waves are aligned in the same direction, their amplitudes add, which is known as constructive interference; when they are opposed, their amplitudes cancel, which is known as destructive interference. The resulting wave can be bigger or smaller than the original waves, depending on the type of interference. Since all subatomic particles, like light, have a dual particle-wave nature, a quantum particle can experience interference with itself: if a particle passes through both slits of Young's double-slit experiment simultaneously, due to superposition, its paths can interfere with each other. Interference allows us to intentionally bias the content of a qubit toward a desired state. However, unwanted interference can also cause a quantum computer to merge its separate computations into one, making it more error-prone [9].
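Destructive interference shows up in the smallest possible example: applying a Hadamard gate twice returns a qubit to |0⟩, because the two amplitude contributions to |1⟩ cancel exactly. A NumPy sketch (a classical simulation, for illustration):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])

plus = H @ zero        # (|0> + |1>)/sqrt(2): both "paths" are open
back = H @ plus        # the two paths recombine, since H @ H = identity

# The |1> contributions arrive with opposite signs and cancel (destructive
# interference); the |0> contributions reinforce (constructive interference).
print(np.round(np.abs(back) ** 2, 10))         # [1. 0.]
```

Quantum algorithms are engineered around exactly this effect: amplitudes of wrong answers are arranged to cancel, while amplitudes of the right answer reinforce.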

#### **7. The topography of quantum technology**

Quantum phenomena are not limited to quantum computing: they apply to other technologies as well, including quantum information science, quantum communication, and quantum metrology. Progress in these technologies is mutually dependent; together they can control and transform entire quantum systems, and they share the same physical theory, common hardware, and related methods [10].

*Quantum Information Science* studies methods of encoding information in quantum systems, including the statistics of quantum mechanics and its limitations. It provides the core for all the other applications: quantum computing, communication, networking, sensing, and metrology.

*Quantum communication and networking* concentrates on the exchange of information by encoding it into a quantum system, enabling communication between quantum computers. Quantum cryptography is the subset of quantum communication in which quantum properties are used to design secure communication systems.

*Quantum sensing and metrology* is the study and development of quantum systems whose extreme sensitivity to environmental disturbances can be exploited to measure important physical properties (e.g. electric and magnetic fields, temperature) more accurately than classical systems. Quantum sensors are based on qubits and are implemented using experimental quantum systems.

*Quantum computing*, the central focus of this chapter, exploits the quantum mechanical properties of superposition, entanglement, and interference to perform computations. In general, a quantum computer is a physical system comprising a collection of qubits that must be isolated from the environment, so that their quantum state stays coherent while the computation is performed. The qubits are organized and manipulated to carry out an algorithm, so that measuring the final state yields the desired result with high probability.


**Difference between classical computers and quantum computers** [11].
