Gaetano Licata

*Università degli Studi di Palermo, Italy* 

#### **1. Introduction**

This is an introductory study of what Fuzzy Logic is, of the difference between Fuzzy Logic and the other many-valued calculi, and of the possible relationship between Fuzzy Logic and the sciences of complexity. Fuzzy Logic is nowadays a very popular logical methodology. Its different kinds of applications in cybernetics and software programming, and its growing use in medicine, seem to make Fuzzy Logic, according to some, the "new" logic of science and technology. In his enthusiastic panegyric of Fuzzy Logic, Kosko (1993) argues that, thirty years after the birth of this calculus, it is time to declare the new era of Fuzzy Logic and to forget the old era of classical logic. I think that this point of view is too simplistic. However, it is true that Fuzzy Logic and many-valued logics are connected with a new ontology. Quantum physics and the biology of complexity push research in the direction of a new and more complex concept of logical formalization. Ontological vagueness must be connected to logical vagueness: the undetermined development of some natural phenomena must be treated with many-valued logic. The importance and usefulness of many-valued logic in science will be shown by two examples: i) unforeseeability in biology (Prigogine's theory); ii) the birth of quantum mechanics and the employment of probabilistic logic in the interpretation of the wave-function. Kosko affirms that Fuzzy Logic is the solution to the fact that science proposes a linear image of a nonlinear world, and he cites Heisenberg's indeterminacy principle. Moreover, Kosko proposes a fuzzy alternative to the probabilistic interpretation of Schrödinger's equation. I think that Fuzzy Logic is an important device for facing the new ontology of complexity, but this does not entail that Fuzzy Logic is the solution to all scientific problems. It is useful to understand why Fuzzy Logic created this illusion and what its real place in scientific methodology can be.

One of the most important arguments employed by Kosko to underline the superiority of Fuzzy Logic (against classical and probabilistic logic) is its similarity with natural language (henceforth NL) and natural thinking. The relaxation of the pretences of logical truth, in the classical sense, seems to give Fuzzy Logic the character of natural thinking, with its imprecision and approximation, but also with its richness. In my work I will argue that Fuzzy Logic is a very useful device for treating natural phenomena and quantities in a logical way, and that Fuzzy Logic is a very versatile many-valued logic because the number of truth values can vary from few to infinite, at the will of the user. Nevertheless, I will also argue that Fuzzy Logic is not the key to the formalization of NL, and that the phenomena of vagueness and the relaxations of classical logical truth, which Fuzzy Logic can treat, are only one aspect of NL. Natural Language/Thinking has a lot of aspects that a logical calculus cannot have. Being the source of Fuzzy Logic (and of all possible logical calculi), NL gives Fuzzy Logic some of its power. Our task here is to study the nature of this power. From this point of view, it is clear that the enthusiastic judgement on Fuzzy Logic given by Kosko should be reconsidered.

As the third point of my work, I will focus on the position of Fuzzy Logic in the context of the many-valued logics. Fuzzy Logic has two important characteristics which make it very versatile and very useful in technological employments: 1) the user can choose the number of truth-values, from few to infinite; 2) the process of fuzzification and defuzzification requires a double kind of truth values: the "hedges" and the percent values. In the process of fuzzification the quantities to be employed in the calculus must be considered as "scalar". This can be done by creating a table of correspondence between the intensity of a phenomenon and the percent values. The application of the fuzzy calculus to clinical diagnosis, with the fuzzification of biological parameters (the signs, the symptoms and the laboratory tests), is a good example of the fuzzification of quantitative phenomena (cf. Licata, 2010). The correspondence with percent values is aimed at establishing the hedges which mark the passage from one fuzzy set to another. The hedges correspond to the critical values of the quantities, those which drive the decisions of the system's user (human or artificial). Thus Fuzzy Logic has the aspect of a complex polyvalent calculus which is clearly aimed at applicative and, in particular, "engineering" solutions. From a theoretical point of view, the debate has shown that Fuzzy Logic is a good device for treating linguistic vagueness, as distinct from uncertainty (Benzi, 1997). In the technical sense, "uncertainty" is the effect of incomplete information with respect to a subjective prediction. Uncertainty is usually treated with probabilistic logic, but some scholars (those who follow Kosko, 1993) think that uncertainty and randomness are aspects derived from vagueness. Fuzzy Logic, according to them, is the best way to give a mathematical account of uncertain reasoning. 
With regard to this problem, I think that it is not strange to find similarities and new connections between methods which are devised to treat similar aspects of knowledge; this happens also because distinctions like uncertainty/vagueness or subjective/objective are often linguistic distinctions. Thus the study of Fuzzy Logic, in the context of Natural Language, is a good method for saying what Fuzzy Logic is in general and what its best employments are.
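The fuzzification and defuzzification process described above can be sketched in a few lines of code. The following is only an illustrative sketch, not a reference implementation: the scale (0–100), the fuzzy-set names and the breakpoints playing the role of hedges are invented for the example, and triangular membership functions are assumed.

```python
# Minimal sketch of fuzzification/defuzzification (illustrative only).
# The scale, the set names and the breakpoints ("hedges") are invented.

def tri(x, a, b, c):
    """Triangular membership degree of x in the fuzzy set (a, b, c);
    a == b or b == c gives a one-sided (shoulder) set."""
    if x < a or x > c:
        return 0.0
    if a <= x <= b:
        return 1.0 if a == b else (x - a) / (b - a)
    return 1.0 if b == c else (c - x) / (c - b)

# Table of correspondence between the intensity of the phenomenon
# (a scalar on a 0-100 scale) and the fuzzy sets.
SETS = {
    "low":    (0, 0, 50),
    "medium": (20, 50, 80),
    "high":   (50, 100, 100),
}

def fuzzify(x, sets=SETS):
    """Map a scalar intensity to a degree in [0, 1] for every set."""
    return {name: tri(x, *abc) for name, abc in sets.items()}

def defuzzify(degrees, sets=SETS, lo=0.0, hi=100.0, steps=200):
    """Centroid defuzzification: clip each set at its activation
    degree, superpose the clipped sets and take the center of mass."""
    num = den = 0.0
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        mu = max(min(degrees[n], tri(x, *sets[n])) for n in sets)
        num += x * mu
        den += mu
    return num / den if den else (lo + hi) / 2

degrees = fuzzify(65)       # {"low": 0.0, "medium": 0.5, "high": 0.3}
crisp = defuzzify(degrees)  # a single scalar back on the 0-100 scale
```

A real fuzzy controller would apply a rule base between these two boundary steps; the sketch shows only the passage from a crisp quantity to degrees of membership and back.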

#### **2. Importance and usefulness of Fuzzy Logic in sciences: A classification of sciences on the basis of complexity**

More and more, the science of today faces uncertainty, vagueness and phenomena which traditional methodologies cannot study adequately. With respect to classical ontology, an ontology of complexity has in recent years become the object of the sciences. The birth of many-valued logic is marked by the idea that classical bivalent logic has been the soul of the old scientific ontology, while the new calculi with many truth-values should be the basis of the complex ontology. In this sense, the father of classical logic and of classical ontology, Aristotle, is considered an outdated thinker (Kosko, 1993). What is the place of Fuzzy Logic in the context of the many-valued calculi? Is it true that the Aristotelian bivalence is an outdated logic? Fuzzy Logic is a very useful device for treating natural phenomena and quantities in a logical way, and it is a very versatile many-valued system because the number of truth values can vary from few to infinite, at the will of the user. Moreover, the study of probabilistic logic, and of its applications, is very important for understanding Fuzzy Logic. Indeed many theorists of probability (following de Finetti, 1989) think that Fuzzy Logic is useless, because all the problems treated by Fuzzy Logic can be solved by the calculus of probability. On the other hand, Kosko thinks that probabilistic logic was born because of the fuzzy nature of things, and that probabilistic logic can be reduced to Fuzzy Logic.

In this chapter I will try to study the relationship between the sciences and logical systems. In particular I will underline how the birth of an *ontology* of complexity is in correspondence with a *logic* of complexity. The starting point of the analysis, and the element which binds logical systems and sciences, can be the nomologic-deductive model of scientific explanation. I will employ this famous model of explanation (and of prediction) as a structure in which it is possible to change the different logical systems, on the basis of the different scientific ontologies considered. With the examples of the biology of complexity and of quantum mechanics, I will show that scientific indeterminism is the natural field of many-valued logics. It is exactly in the discussion of these problems that we will find the place of many-valued logics in the context of formalized languages, and the place of Fuzzy Logic in the context of many-valued logics. The nomologic-deductive explanation of a phenomenon, proposed by Popper (1935), was applied by Hempel (1942) to history. An event *E* is explained, from a causal point of view, if it is possible to deduce it logically from two kinds of premises: a) the general scientific laws (*L1*, *L2*, …, *Ln*), which regard the development of the event *E* as belonging to a class, and b) the initial conditions (*C1*, *C2*, …, *Cn*), which are specific aspects of *E* in connection with the general laws. The scheme of Hempel is shown in Fig. 1.

Fig. 1. The nomologic-deductive model of scientific explanation.

Popper proposes the following example. We have a causal explanation of "the breaking of a thread which is loaded with a weight" (*Explanandum*) if we have two kinds of premises: the proposition "A thread breaks when it is loaded with a weight that is heavier than the weight which defines the resistance of the thread", which has the form of a general law (*L1*); and propositions which are specific to the singular event, like "The weight which defines the resistance of this thread is x" (*C1*) and "The thread has been loaded with a weight of 2x" (*C2*). Clearly this example regards a deterministic event. It can be studied with the methodologies of macroscopic physics (Galilean and Newtonian physics). On the basis of explanations similar to the scheme of Fig. 1, it is possible to make predictions which are, in principle, absolutely true. For example, it is absolutely true that, if the thread of the previous example is loaded with a weight of 0.5x, the thread will not break; or it is absolutely true that, on the basis of Kepler's laws, in 365 days and 6 hours the earth will be in the same position, with respect to the sun, as today. Buzzoni (2008) proposes a probabilistic version of the nomologic-deductive model of Fig. 1. If the scientific laws which regard the event *E* are statistical and not deterministic, the scheme can be modified as in Fig. 2.

Fig. 2. The probabilistic version of the nomologic-deductive model.


We can explain that a person P was infected by measles (*Explanandum*) by affirming, as a general probabilistic law, that "The probability of being infected by measles is high when a person is exposed to contagion" (*L1*), and, as an initial condition, that "The person P has been exposed to contagion" (*C1*). In this case the event *E* is not an (absolutely true) logical consequence of the premises: the *explanans* confers on the *explanandum* only a certain degree of probability. Any prediction based upon this kind of explanation would be probable and not true.

These two examples of scientific explanation show the difference between the natural sciences (macroscopic physics, classical chemistry) and the human sciences (history, economics, politics, psychology). An intermediate case of complexity between natural sciences and human sciences could be biology and similar sciences<sup>1</sup>: the complexity of biological phenomena cannot be well represented by classical logic. In the prediction of a macro-physical phenomenon it is possible to employ a two-valued logic, because knowing the physical laws and the initial conditions makes it possible to make predictions with absolute certainty. On the other hand, in the prediction of a biological phenomenon it is impossible to have absolute certainty. At the degree of complexity of the biological sciences arises the need to employ probabilistic logic. The high number of variables, the complexity of the processes and the high number of possible ways give rise to the problem of unforeseeability.
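The difference between the two explanation schemes can be made concrete in code. The sketch below is only an illustration of the logical shapes involved, with invented numbers: the deterministic law yields a two-valued conclusion, while the probabilistic law confers on the conclusion only a degree of probability.

```python
# Sketch of the two versions of the nomologic-deductive scheme.
# All numeric values are invented for the illustration.

def thread_breaks(load, resistance):
    """Deterministic law L1: a thread breaks exactly when it is
    loaded with a weight heavier than its resistance. Together with
    the initial conditions, the conclusion is simply true or false."""
    return load > resistance

def measles_probability(exposed, p_given_exposure=0.9):
    """Probabilistic law L1: exposure to contagion makes infection
    highly probable. The premises give the explanandum only a
    degree of probability, never certainty."""
    return p_given_exposure if exposed else 0.0

# Deterministic prediction: absolutely true in principle.
resistance = 1.0                           # C1: resistance of this thread is x
assert thread_breaks(2.0, resistance)      # C2: loaded with 2x -> it breaks
assert not thread_breaks(0.5, resistance)  # loaded with 0.5x -> it holds

# Probabilistic prediction: probable, not true.
assert measles_probability(exposed=True) == 0.9
```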

*Prigogine's theory and complexity in biology*. Prigogine introduced in 1967 the concept of "dissipative structure" to describe some special chemical systems. Later the concept of dissipative structure was employed to describe living organisms. In Prigogine's theory (1967, 1971, 1980), dissipative structures are in a condition of stability when they are far from thermic equilibrium; in this condition they are also able to evolve. When the flux of energy and matter which goes through them grows, they can go through new phases of instability and transform into new, more complex structures. Dissipative structures receive matter from outside; the instabilities and the jumps to new forms of organization are the result of fluctuations which are amplified by positive feedback loops (reinforcement feedback loops). The critical points of instability, the jumps, are also called "bifurcation points", because there the system can choose between different ways of evolution, outside the normal way. In an artificial machine the structure and the components are fixed and immutable; in a living organism, on the other hand, structure and components change continuously. A continuous flux of matter goes through living organisms: every cell decomposes and synthesizes chemical structures while eliminating the waste. In living organisms there are always development and evolution. The cell can be considered a very complex kind of dissipative structure, while a very simple example of a dissipative structure is the drain whirlpool. The forces which find balance in a whirlpool are mechanical, and the most important is gravity, while the forces which operate in a cell are chemical, and the description of these forces is enormously more complex than the description of a whirlpool. The energetic processes of the cell are the catalytic cycles, which act as feedback loops. 
The catalytic cycles (Eigen, 1971) are very important in metabolic processes: they can act as auto-balancing feedback loops and as reinforcement feedback loops. The auto-balancing feedback loops maintain the stability of the organism's metabolism; the reinforcement feedback loops can push the organism farther and farther from equilibrium, until the point of instability: this is the "bifurcation" point. The bifurcation points can give rise to new forms of order: they are the keys of the growth and evolution of living organisms, because they are radical and unexpected transformations in which the system's behavior takes a new direction. Figure 3 shows how a system changes the trajectory of its behavior: the arrow *f*1 represents the development of the system before the bifurcation point B. The arrow *f*2 represents a new, unexpected direction in the behavior of the system after B. The broken arrow represents the hypothetical direction of the behavior of the system if B had not happened.

<sup>1</sup> The case of medicine is particular, because in some aspects it can be considered a biological science (e.g. from the point of view of the development of a pathological phenomenon), while in other aspects (e.g. in psychiatry, or for the influence of consciousness in the development of some diseases) it can be considered a human science.

Fig. 3. The bifurcation point in the evolution of a dissipative structure.

According to Prigogine, at the point of instability the system can take different ways (*f*2', *f*2'', …). The way that will actually be taken (*f*2) depends on the history of the system and on the random fluctuations of the system's environment which happen at the moment of instability. In the case of living organisms this is very interesting, because a living organism is always a registration of its past, and this past always influences its development. Moreover, the life of organisms always depends on their environment. At the point of bifurcation the dissipative structure is very sensitive to the minimal fluctuations of the environment, because the point of bifurcation is a "crisis" of the system, in which the rules of the system can be modified. The smallest random events can orientate the choice of the future behavior of the system. Given that all living organisms live in environments which continuously fluctuate, and given that it is impossible to know which fluctuations will happen at the moment of maximum instability, it is impossible to foresee the future way of the system. Thus Prigogine theorizes the unforeseeability in principle of the system at the point of bifurcation.
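This sensitivity at the bifurcation point can be illustrated numerically. The sketch below does not use Prigogine's own equations: it integrates the standard pitchfork system x' = r*x - x**3, a textbook toy model of bifurcation, to show that past the instability the branch actually reached depends only on the sign of an arbitrarily small fluctuation.

```python
# Toy illustration of a bifurcation (NOT Prigogine's actual model):
# the pitchfork system x' = r*x - x**3 has a single stable state for
# r < 0 and two symmetric stable states (+1 and -1 for r = 1); which
# of the two branches is reached depends only on the sign of a tiny
# fluctuation present at the moment of instability.

def settle(x0, r, dt=0.01, steps=20000):
    """Integrate x' = r*x - x**3 with the Euler method and return
    the state the system settles into."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x ** 3)
    return x

# Before the bifurcation (r < 0) every small fluctuation dies out: f1.
assert abs(settle(1e-6, r=-1.0)) < 1e-8

# Past the bifurcation (r > 0) a fluctuation of +/-1e-6, far too small
# to measure, decides between the branches f2' (+1) and f2'' (-1).
assert abs(settle(+1e-6, r=1.0) - 1.0) < 1e-3
assert abs(settle(-1e-6, r=1.0) + 1.0) < 1e-3
```

Since the sign of the fluctuation is unknowable in advance, the final branch is unforeseeable in principle even though the dynamics is deterministic, which is exactly the point made in the passage above.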

Many-valued logics and probabilistic logic are clearly more useful than classical logic in the explanation, the description and the prediction of a biological phenomenon. In the human sciences complexity reaches a higher degree than in biology, because the human sciences study phenomena in which a multiplicity of biological subjects who have a mind operate. The superior complexity of this degree is given by the unforeseeability of groups. If the development of phenomena which regard a single subject is unforeseeable because of the complexity of biological processes and because of the influence that the consciousness of the subject has on those processes, then the phenomena which regard groups of biological subjects, and groups of minds, entail an unforeseeability of a superior degree. Sciences like economics, politics and history (the so-called human sciences) study phenomena in which the behavior of the group is the result of the behavior of the single biological subjects, and of the influences between subjects. On the basis of these distinctions, we can consider three kinds of phenomena: A) macro-physical and chemical phenomena; B) biological phenomena; C) group phenomena. To each kind of phenomenon a different kind of logical system can correspond, with a different degree of complexity. It is clear that the number of truth-values, and the possibility of varying the number of truth-values, mark the different degrees of complexity. The degrees of logical complexity will be: I) true-false; II) true-false + a fixed number of truth values (simple polyvalence); III) true-false + a variable number of truth values.

Fuzzy Logic, Knowledge and Natural Language 9

The photon, the quantum of light, is connected to the electromagnetic wave with energy *E*

fundamental hypothesis of undulatory mechanics, proposed by de Broglie in 1924, is that each particle is connected to a wave described by these equations. The law of propagation of these waves of "matter" was found by Schrödinger, who in analogy with classical mechanics formulated the right wave equation. The energy of a classic particle is a function of position and of velocity, thus Born affirmed that the *probability* to find a particle of mass *m*

���

Where the field of forces around the particle derives from the potential *V*(*x*) and ħ is a variant of the Plank's constant h, ħ = h/2π. The hypothesis of de Broglie, and the Schrödinger's equation, mark the birth of quantum mechanics. Undulatory mechanics permits to calculate the energetic levels of atoms and the spectral terms. In this way, the old theory of quanta based on classical principles, unable to interpret the spectrum of black body, the photoelectric effect and the atomic spectra was surpassed. Born gave to the wavefunction the following probabilistic interpretation, an interpretation which was refused by

*ψ\*(r1,t)ψ(r1,t)dv* (2)

mechanics of matrices, in which a dynamic variable is represented by a matrix *Q*. The

Where H is the matrix obtained from the classic Hamiltonian function, through the substitution of classic dynamic variables with the correspondent Heisenberg's matrices. The second member QH HQ is the commutator, and it is commonly written [Q,H]. Quantum mechanics derives from the acknowledgement of the equivalence between the undulatory mechanics and the mechanics of matrices. In quantum mechanics the status of a physical system is represented by a vector of Hilbert's space. Dynamic variables are represented by linear operators in the Hilbert's space. The evolution of a physical system can be described in two ways. In the first way, proposed by Schrödinger, the operators are fixed, while the

represents the vector of the status and H is the operator of energy. In the second

way, proposed by Heisenberg, the vector of status is fixed while the operators evolve in time

ψ

E = hνand

> ψ(x, t

��� � �(�)� (1)

. In the same years Heisenberg theorized the

dt = QH − HQ (3)

�� � = �� (4)

ψ(r, t

r<sup>1</sup>, is p = h/ λ

), the probability to find a

) which satisfies

. The

and quantity of motion *p*, which are given by the equations

the Schrödinger's equation

particle in the volume

ψ

where

Where

ψ

following the equation

in the position *x* at the instant *t* was expressed by the wave function

�ħ ��

Schrödinger. In a one-particle-system, with a wave function

dv

\* is the conjugated complex of

vector of status evolves in time following the equation

equation of motion of mechanics of matrices is

�� = − <sup>ħ</sup>� 2�

, centered around the point

�ħ dQ

�ħ �

truth values (a multiple system which contains subsystems with different numbers of truthvalues: complex polyvalence). In this way we find a place for Fuzzy Logic in the context of many-valued systems. With respect to simple many-valued logical systems with a fix number of truth-values, Fuzzy Logic give the possibility to vary the number of truth-values and to build multiple logical systems for the treatment of different aspects of the phenomenon. Thus we can build the following table:


| Levels of complexity | Science | Foreseeability of phenomenon | Kind of explanation | Logical system employed |
|---|---|---|---|---|
| 1st level of complexity | Classical macroscopic physics, classical chemistry | Foreseeability; always true or false | Nomologic-deductive explanation | Classical bivalent logic |
| 2nd level of complexity | Biological sciences (biology, histology, genetics) | Unforeseeability of single subjects | Probabilistic nomologic-deductive explanation | Polyvalent logics (probabilistic logic) |
| 3rd level of complexity | Human sciences (history, sociology, psychology, economics) | Unforeseeability of groups | Different many-valued logics (Fuzzy Logic) in the nomologic-deductive explanation | Different many-valued logics (Fuzzy Logic) in a single phenomenon |

Table 1. Increasing complexity of sciences and the adequate logical systems for their methodologies.

I think that this table may attract some criticism; nonetheless, it is possible that it tells something right. For instance, molecular biology is a border science between the first and the second level of complexity, because chemical reactions can be treated as phenomena of the first level with a bivalent approach, while biological phenomena must be treated as phenomena of the second level with a polyvalent approach. Medicine is also a border science, because it contains (biological) phenomena of the second level, but it can be considered a science of the third level because of the influence that consciousness has on the development of pathological phenomena.
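The three degrees of logical complexity listed above can be made concrete with a small sketch. The code below is illustrative, not drawn from the text: it uses the Łukasiewicz connectives, which behave classically on {0, 1}, restrict to any fixed finite set of truth-values, and extend to the whole interval [0, 1].

```python
# Lukasiewicz connectives on truth degrees in [0, 1].
def l_not(a):
    return 1.0 - a                 # negation

def l_and(a, b):
    return max(0.0, a + b - 1.0)   # strong conjunction

def l_or(a, b):
    return min(1.0, a + b)         # strong disjunction

LEVEL_I   = {0.0, 1.0}             # I)  bivalence
LEVEL_II  = {0.0, 0.5, 1.0}        # II) fixed polyvalence (3-valued)
# III) complex polyvalence: any value in [0, 1], granularity chosen freely.

# Each finite level is closed under the connectives, so the same
# operations serve every degree of complexity:
for values in (LEVEL_I, LEVEL_II):
    for a in values:
        for b in values:
            assert l_and(a, b) in values and l_or(a, b) in values
            assert l_not(a) in values

print(l_and(0.75, 0.75))   # 0.5: a degree classical logic cannot express
```

The point of the sketch is structural: moving from level I to level III changes only the admissible truth-value set, not the connectives themselves.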

In the twentieth century quantum physics demonstrated that simple phenomena are not the basis of complex phenomena, and that the deeper you go into the microcosm to find the "elements" of reality, the more complexity you find. The theorists of quantum mechanics treated the complexity of quantum interaction by employing many-valued logics. The passage from classical particles to waves corresponds to the passage from two-valued logic to many-valued logics. In recalling the birth of quantum physics, it is very interesting that the employment of probabilistic logic marks this birth, and that many-valued systems and Fuzzy Logic are connected to the quantum paradigm.

*Quantum physics and many-valued logics*. Formulated at the beginning of the twentieth century, undulatory mechanics theorizes that the behavior of the smallest constituents of matter can be described through waves of frequency (ν) and wavelength (λ). In the hypothesis of undulatory mechanics, variables of corpuscular kind are connected to variables of undulatory kind.
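The corpuscular/undulatory correspondence can be checked with a quick computation. This Python sketch uses illustrative values (a green-light photon, an electron at 1% of light speed); only the relations E = hν and λ = h/p come from the text.

```python
# Numerical check of the corpuscular/undulatory correspondence.
H = 6.62607015e-34        # Planck's constant, J*s (exact in SI since 2019)
M_E = 9.1093837015e-31    # electron mass, kg
C = 299792458.0           # speed of light, m/s

# A photon of green light (nu ~ 5.6e14 Hz):
nu = 5.6e14
E_photon = H * nu                  # corpuscular energy from the wave frequency

# An electron moving at 1% of light speed:
v = 0.01 * C
p = M_E * v                        # momentum, kg*m/s
wavelength = H / p                 # undulatory wavelength from the momentum

print(f"photon energy:       {E_photon:.3e} J")
print(f"electron wavelength: {wavelength:.3e} m")   # atomic scale, ~2.4e-10 m
```

The electron's wavelength comes out at the scale of atomic spacings, which is why matter waves show up in crystal diffraction rather than in everyday mechanics.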


The photon, the quantum of light, is connected to the electromagnetic wave with energy *E* and momentum *p*, which are given by the equations E = hν and p = h/λ. The fundamental hypothesis of undulatory mechanics, proposed by de Broglie in 1924, is that each particle is connected to a wave described by these equations. The law of propagation of these waves of "matter" was found by Schrödinger, who, in analogy with classical mechanics, formulated the correct wave equation. The energy of a classical particle is a function of position and velocity; thus Born affirmed that the *probability* of finding a particle of mass *m* at position *x* at instant *t* is expressed by the wave function ψ(x, t), which satisfies Schrödinger's equation

$$i\hbar\frac{\partial\psi}{\partial t} = -\frac{\hbar^2}{2m}\frac{\partial^2\psi}{\partial x^2} + V(x)\psi\tag{1}$$

where the field of forces around the particle derives from the potential *V*(*x*), and ħ is a variant of Planck's constant h, ħ = h/2π. The hypothesis of de Broglie and Schrödinger's equation mark the birth of quantum mechanics. Undulatory mechanics makes it possible to calculate the energy levels of atoms and the spectral terms. In this way the old theory of quanta, based on classical principles and unable to interpret the black-body spectrum, the photoelectric effect and the atomic spectra, was surpassed. Born gave the wave function the following probabilistic interpretation, an interpretation which was refused by Schrödinger. In a one-particle system with wave function ψ(r, t), the probability of finding the particle in a volume dv centered around the point r<sub>1</sub> is

$$
\psi^*(r_1,t)\,\psi(r_1,t)\,dv\tag{2}
$$

where ψ* is the complex conjugate of ψ. In the same years Heisenberg theorized the mechanics of matrices, in which a dynamic variable is represented by a matrix *Q*. The equation of motion of the mechanics of matrices is

$$i\hbar \frac{d\mathbf{Q}}{dt} = \mathbf{Q}\mathbf{H} - \mathbf{H}\mathbf{Q} \tag{3}$$

where H is the matrix obtained from the classical Hamiltonian function through the substitution of the classical dynamic variables with the corresponding Heisenberg matrices. The second member, QH − HQ, is the commutator, commonly written [Q,H]. Quantum mechanics derives from the acknowledgement of the equivalence between undulatory mechanics and the mechanics of matrices. In quantum mechanics the state of a physical system is represented by a vector of a Hilbert space, and dynamic variables are represented by linear operators on the Hilbert space. The evolution of a physical system can be described in two ways. In the first way, proposed by Schrödinger, the operators are fixed, while the state vector evolves in time following the equation

$$i\hbar\frac{\partial}{\partial t}\psi = H\psi\tag{4}$$

where ψ represents the state vector and H is the energy operator. In the second way, proposed by Heisenberg, the state vector is fixed while the operators evolve in time following the equation

$$i\hbar \mathbf{Q}' = [\mathbf{Q},\mathbf{H}] \tag{5}$$

where Q' is the derivative of the operator Q with respect to time. Besides these two ways, it is possible to give an intermediate representation of the evolution of a physical system, in which both the state vector and the operators evolve in time. It is a postulate of quantum mechanics that the operators of position and of impulse of a particle satisfy the commutation relation

$$[\mathbf{Q},\mathbf{P}] = i\hbar\tag{6}$$

Thus the position and the impulse cannot be measured simultaneously with arbitrary precision. This is the indeterminacy principle enunciated by Heisenberg. The measure of a dynamic variable Q gives a determined and exact result q only when the vector ψ, the state vector of the system, satisfies the equation Qψ = qψ. In this case ψ is an eigenstate of Q, corresponding to the eigenvalue q. If the system is not in an eigenstate of the dynamic variable measured, it is impossible to predict the result of the measure; it is only possible to assign the probability of obtaining a determinate value q. Because of this statistical character of quantum mechanics, some physicists, like Einstein, believed that the theory is not complete. Kosko (1993: 67) writes that ψ represents the matter wave in an infinitesimally small volume dV; Born interpreted the square of the absolute value of the wave, |ψ|², as a measure of probability. Thus the infinitesimal quantity |ψ|² dV measures the *probability* that a particle of matter is in the infinitesimally small region dV. This entails that all the infinitesimal particles are random points. On the other hand, fuzzy thinking considers |ψ|² dV as the measure of *how much* the particle is in the region dV. According to this point of view, the particles are to some extent in all the regions of space: hence the particles are deterministic clouds. By saying that quantum particles, in fuzzy thinking, are "deterministic" clouds, Kosko means that the measure of the quantity of matter in the volume dV can be determined precisely; as we will see, fuzzy thinking is always connected to precision. However, the adjective "deterministic" is too closely tied to the old scientific paradigm, thus I prefer to say that, in this anti-probabilistic interpretation, quantum particles are *fuzzy* clouds.

Thus it is clear that, in Table 1, the first level of complexity does not contain the sciences which study the most fundamental elements of reality, the "atoms" of ancient science. The classification of Table 1 regards only the methodology and the kind of phenomenon considered by a science, not the *ontological level* (more or less fundamental) of the phenomena it studies. In quantum physics, which studies subatomic particles, we find the unforeseeability and the uncertainty that, at the level of macroscopic chemistry and macroscopic physics, are replaced by foreseeability and bivalence. Thus quantum mechanics could be a science of the second or third level of complexity in Table 1. In the classical physics of Galileo and Newton, phenomena were reduced to the properties of material and rigid bodies. From 1925, quantum mechanics showed that, at the subatomic level, material bodies dissolve into undulatory schemas of probability. Subatomic particles are not understood as rigid but very small entities; they are instead relationships or interconnections between processes which can be distinguished only theoretically. The schemas of probability (or fuzziness) are not probabilities of material objects (electrons, quarks, neutrinos, etc.), but probabilities of interconnections between events. Capra (1996: 41) writes that "when we move our attention from macroscopic objects to the atoms and to subatomic particles, Nature does not show isolated bricks, but it appears as a complex weft of relations between parts of a unified whole". The father of the idea of vague (fuzzy) sets, M. Black, was an expert in quantum mechanics. And the studies of Łukasiewicz on many-valued logic, from 1920, were soon connected with the development of quantum physics. In 1936 Birkhoff and von Neumann wrote a famous essay on the logic of quantum mechanics. The fact that Schrödinger's wave function *ψ* was interpreted in a probabilistic way, and the fact that the father of the idea of vague/fuzzy sets was an expert in quantum mechanics, are very important for two reasons: they show us that i) a many-valued logical system is much more adequate to quantum physics than classical logic, and that ii) a scientific theory becomes much clearer if we find an adequate logical system to explain the phenomena it considers. Thus it is clear why it is useful to build and to improve Table 1.

Mathematics has been, especially in the last four centuries, the language of science; if logic is another useful point of view from which to understand natural phenomena, then Fuzzy Logic is a very good instrument for building explanations, even if it is not the solution to as many problems as Kosko believes.

#### **3. Fuzzy Logic is not the key to the formalization of Natural Language**

Fuzzy Logic is not the key to a complete formalization of NL. The phenomena of vagueness and the relaxations of classical logical truth are only one aspect of NL, and it is this aspect that Fuzzy Logic is able to treat. In the essay *Vagueness: An Exercise in Logical Analysis* (1937), the work in which the idea of vague sets was proposed for the first time, Black distinguished three kinds of imprecision in NL: generality, ambiguity and vagueness. Generality is the power of a word to refer to many things which can be very different from each other. Ambiguity is the possibility that a linguistic expression has many different meanings. Vagueness is the absence of precise boundaries in the reference of many adjectives and common nouns of human language, e.g. "table", "house", "tall", "rich", "strong", "young", etc. More precisely, vagueness is an approximate relation between a common noun or a *quantitative* adjective<sup>2</sup> and the objects of the world which can be referred to by this noun or predicated of this adjective. Fuzzy Logic has been developed to treat linguistic vagueness in a formal way. The successes of Fuzzy Logic in the field of engineering (in the automatic and self-regulating processes of cybernetics) and the birth of fuzzy set theory from the study of linguistic vagueness (cf. Black 1937, Zadeh 1965) strengthened the idea that Fuzzy Logic can solve the problems that bivalent logic leaves unsolved in artificial intelligence (henceforth AI). Kosko (1993) proposes the idea that an artificial system will be a good imitation of a natural system, like a brain, only when the artificial system is able to learn, to gain experience and to change itself without the intervention of a human programmer. I think that this is correct, but I believe that putting Fuzzy Logic into a dynamic system is not enough to solve the problems of AI. Kosko (cf. 1993: 185-190) instead hypothesizes that the employment of Fuzzy Logic is the key to giving *common sense* to a system. I think that this is not correct. Common sense is the result of so many experiences and so many complex processes of our knowledge, in social interaction, that it is not enough to substitute bivalent logic with Fuzzy Logic to obtain a system which operates on the basis of common sense. Moreover, it is important to remember that classical logic is in any case the soul of logic, and Zadeh does not think that there is a great difference between classical and Fuzzy Logic.

<sup>2</sup> With the expression "quantitative adjective" I mean an adjective which refers to qualities which have variable intensities, i.e. qualities which can be predicated of the subject to a greater or lesser degree.
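Black's notion of vagueness, the graded applicability of predicates like "tall", is exactly what a fuzzy membership function formalizes. The sketch below is a minimal illustration; the thresholds 160 cm and 190 cm are arbitrary choices, not values from the literature.

```python
def tall(height_cm, lo=160.0, hi=190.0):
    """Piecewise-linear membership for the vague predicate 'tall':
    0 below lo, 1 above hi, and a graded degree in between, so there
    is no precise boundary at which 'tall' suddenly becomes true."""
    if height_cm <= lo:
        return 0.0
    if height_cm >= hi:
        return 1.0
    return (height_cm - lo) / (hi - lo)

for h in (150, 170, 180, 195):
    print(h, tall(h))
```

Only the borderline cases receive intermediate degrees; the clear cases remain fully bivalent, which is why Zadeh can regard fuzzy logic as a generalization of, rather than a break with, classical logic.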
