1. Background

Shannon [1] introduced the concept of entropy in communication theory and founded the subject of information theory. Entropy is an important property of stochastic systems and is widely used in various fields.

Further, the second law of thermodynamics, which states that the entropy of a system cannot decrease spontaneously, describes the tendency of systems to become more disordered over time. Information theory has thus found wide applications in statistics, information processing and computing, rather than being concerned with communication systems only.


If we consider entropy as equivalent to uncertainty, a great deal of insight can be obtained. Zadeh [2] introduced a generalised theory of vagueness or ambiguity.

Uncertainty plays a significant role in our perceptions about the external world. Any discipline that can assist us in understanding it, measuring it, regulating it, maximizing or minimizing it and ultimately controlling it to the extent possible, should certainly be considered an important contribution to our scientific understanding of complex phenomena.

Uncertainty is not a single monolithic concept. It can appear in several guises. It can arise in what we normally consider a probabilistic phenomenon. On the other hand, it can also appear in a deterministic phenomenon where we know that the outcome is not a chance event, but we are fuzzy about the possibility of the specific outcome. This type of uncertainty arising out of fuzziness is the subject of investigation of the relatively new discipline of fuzzy set theory.

We shall first take up the case of probabilistic uncertainty. Probabilistic uncertainty is related to the uncertainty connected with the probability of outcomes.

Consider a set of events E = (E1, E2, …, En) with a probability distribution P = (p1, p2, …, pn), pi ≥ 0, $\sum\_{i=1}^{n} p\_i = 1$.

Then the Shannon [1] entropy associated with P is given by,

$$H(P) = -\sum\_{i=1}^{n} p\_i \log p\_i \tag{1}$$

The base of logarithm is taken as 2. Also it is assumed that

$$0 \log 0 = 0.$$

Shannon [1] obtained (Eq. (1)) on the basis of the following postulates:

1. H(P) should be a continuous, permutationally symmetric function of p1, p2, …, pn; that is, the uncertainty changes only by a slight amount if the pi's change slightly, and it remains unchanged if the pi's are exchanged among themselves.

2. H(p1, p2, …, pn, 0) = H(p1, p2, …, pn); that is, the uncertainty should not change when an impossible outcome is added to the scheme.

3. H(P) should be minimum when P is any one of the n degenerate distributions Δ1 = (1, 0, …, 0), Δ2 = (0, 1, …, 0), …, Δn = (0, 0, …, 1), and the minimum value should be zero because in all these cases there is no uncertainty about the outcome.

4. H(P) should be maximum when p1 = p2 = … = pn = 1/n because in this case the uncertainty is maximum.

5. H(P*Q) = H(P) + H(Q); that is, the uncertainty of two independent probability distributions is the sum of the uncertainties of the two probability distributions.

6. H(p1, p2, …, pn) = H(p1 + p2, p3, …, pn) + (p1 + p2) H(p1/(p1 + p2), p2/(p1 + p2)).


The measure in (Eq. (1)) not only measures uncertainty but also the equality of p1, p2, …, pn, since it attains its maximum value when p1, p2, …, pn are all equal and its minimum value when the pi's are most unequal. In fact, the pi's can be regarded as proportions rather than probabilities.
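To make these properties concrete, here is a minimal Python sketch (the function name `shannon_entropy` is ours, not from the chapter) that evaluates Eq. (1) with base-2 logarithms and the convention 0 log 0 = 0, confirming postulates 3 and 4 numerically:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) of Eq. (1): base-2 logs, with 0*log(0) taken as 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Postulate 4: maximum at the uniform distribution, H = log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# Postulate 3: zero at a degenerate distribution -- no uncertainty at all.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
```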

After Shannon's [1] entropy, various other measures of entropy have been proposed.

Entropy of order α was defined by Renyi [3] as:


$$H\_{\alpha}(P) = \frac{1}{1 - \alpha} \log \left( \sum\_{i=1}^{n} p\_i^{\alpha} \Big/ \sum\_{i=1}^{n} p\_i \right), \quad \alpha \neq 1, \alpha > 0 \tag{2}$$
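As α → 1, Eq. (2) reduces to Shannon's measure in Eq. (1), which is why Renyi's entropy is regarded as a generalisation. A small numerical check of this limit (the function name `renyi_entropy` is ours), assuming a complete distribution so that the denominator sums to 1:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, Eq. (2), with base-2 logs."""
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(-sum(pi * math.log2(pi) for pi in p))  # Shannon entropy: 1.5 bits
for alpha in (0.9, 0.99, 0.999):
    print(renyi_entropy(p, alpha))           # tends to 1.5 as alpha -> 1
```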

Entropy of order α and type β was defined by Kapur as:

$$H\_{\alpha,\beta}(P) = \frac{1}{1 - \alpha} \log \left( \sum\_{i=1}^{n} p\_i^{\alpha + \beta - 1} \Big/ \sum\_{i=1}^{n} p\_i^{\beta} \right), \quad \alpha \neq 1, \alpha > 0, \beta > 0, \alpha + \beta - 1 > 0 \tag{3}$$
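For orientation (our remark, not the chapter's): setting the type β = 1 in Eq. (3) collapses it to Renyi's measure in Eq. (2), since α + β − 1 = α and the denominator becomes the plain sum of the pi's:

$$H\_{\alpha,1}(P) = \frac{1}{1 - \alpha} \log \left( \sum\_{i=1}^{n} p\_i^{\alpha} \Big/ \sum\_{i=1}^{n} p\_i \right) = H\_{\alpha}(P).$$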

Havrda and Charvát [4] gave the first nonadditive measure of entropy, and it is used in the modified form as

$$H^{\alpha}(P) = \frac{1}{1 - \alpha} \left( \sum\_{i=1}^{n} p\_i^{\alpha} - 1 \right), \quad \alpha \neq 1, \alpha > 0. \tag{4}$$
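To see what "nonadditive" means here, the sketch below (names are ours) shows that Eq. (4) violates Shannon's postulate 5: for independent distributions, the entropy of the joint distribution is not the sum of the individual entropies.

```python
def hc_entropy(p, alpha):
    """Havrda-Charvat entropy, Eq. (4) in its modified form."""
    return (sum(pi ** alpha for pi in p) - 1) / (1 - alpha)

P, Q = [0.5, 0.5], [0.7, 0.3]
PQ = [pi * qj for pi in P for qj in Q]  # joint distribution of independent P and Q
print(hc_entropy(PQ, 2.0))                      # 0.71
print(hc_entropy(P, 2.0) + hc_entropy(Q, 2.0))  # 0.92 -> additivity fails
```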

Behara and Chawla [5] defined the nonadditive τ entropy as

$$H\_{\tau}(P) = \frac{1 - \left( \sum\_{i=1}^{n} p\_i^{\frac{1}{\tau}} \right)^{\tau}}{1 - 2^{\tau - 1}}, \quad \tau \neq 1, \tau > 0. \tag{5}$$
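As a quick sanity check of Eq. (5) (our worked example): for the two-point uniform distribution P = (1/2, 1/2) the measure is normalised to 1 for every admissible τ, because

$$H\_{\tau}(P) = \frac{1 - \left( 2 \cdot (1/2)^{1/\tau} \right)^{\tau}}{1 - 2^{\tau - 1}} = \frac{1 - 2^{\tau - 1}}{1 - 2^{\tau - 1}} = 1.$$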

Kapur [6] gave the following nonadditive measures of entropy:

$$H\_a(P) = -\sum\_{i=1}^{n} p\_i \log p\_i + \frac{1}{a} \sum\_{i=1}^{n} \left[ (1 + ap\_i) \log (1 + ap\_i) - ap\_i \right], \quad a > 0 \tag{6}$$

$$H\_b(P) = -\sum\_{i=1}^{n} p\_i \log p\_i + \frac{1}{b} \sum\_{i=1}^{n} \left[ (1 + bp\_i) \log (1 + bp\_i) - (1 + b) \log (1 + b)\, p\_i \right], \quad b > 0 \tag{7}$$

$$H\_c(P) = -\sum\_{i=1}^{n} p\_i \log p\_i + \frac{1}{c^2} \sum\_{i=1}^{n} \left[ (1 + cp\_i) \log (1 + cp\_i) - cp\_i \right], \quad c > 0 \tag{8}$$

$$H\_k(P) = -\sum\_{i=1}^{n} p\_i \log p\_i + \frac{1}{k^2} \sum\_{i=1}^{n} \left[ (1 + kp\_i) \log (1 + kp\_i) - (1 + k) \log (1 + k)\, p\_i \right], \quad k > 0 \tag{9}$$
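A direct transcription of Eqs. (6)–(9) into Python may help the reader experiment with the parameters; the function and variable names are ours, and base-2 logarithms are assumed as stated earlier. Each measure is Shannon's entropy plus a parameter-dependent correction term.

```python
import math

def shannon(p):
    # Shannon term common to Eqs. (6)-(9); 0*log(0) taken as 0.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def H_a(p, a):  # Eq. (6)
    return shannon(p) + sum((1 + a * pi) * math.log2(1 + a * pi) - a * pi
                            for pi in p) / a

def H_b(p, b):  # Eq. (7)
    return shannon(p) + sum((1 + b * pi) * math.log2(1 + b * pi)
                            - (1 + b) * math.log2(1 + b) * pi for pi in p) / b

def H_c(p, c):  # Eq. (8)
    return shannon(p) + sum((1 + c * pi) * math.log2(1 + c * pi) - c * pi
                            for pi in p) / c ** 2

def H_k(p, k):  # Eq. (9)
    return shannon(p) + sum((1 + k * pi) * math.log2(1 + k * pi)
                            - (1 + k) * math.log2(1 + k) * pi for pi in p) / k ** 2

p = [0.5, 0.3, 0.2]
for H in (H_a, H_b, H_c, H_k):
    print(H.__name__, H(p, 1.0))
```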
