**3. Selection of the models**

The proposed probabilistic methods for cognitive solving of problems 1 and 2 are based on selected probabilistic models that have been implemented effectively in wide application areas. The main principle in selecting the models is that useful knowledge should be the result of their application under conditions of various uncertainties. Knowledge is understood as the form of existence and ordering of the results of human cognitive activity. In the applications to solve problems 1 and 2, useful knowledge (received as a result of probabilistic modeling in time) is output information of admissible quality, or a cognitive conclusion, that allows solving a specific applied problem. As the result of selection, the author's models to estimate the probabilistic measures of the quality of used information and the probabilities of "success" and risks of "failure" for a "black box" and for complex structures are proposed for AIS. The models are widely tested and approved in practice [15–19, 22–37].

#### **3.1 Selection for "black box"**

The selected models for every system element, presented as a "black box," allow estimating the probabilities of "success" and/or "failure" during a given prognostic period. A probabilistic space (*Ω*, *B*, *P*) for estimating system operation processes is traditional [15–21], where *Ω* is a limited space of elementary events; *B* is a class of all subspaces of *Ω* satisfying the properties of a *σ*-algebra; and *P* is a probability measure on the space of elementary events *Ω*. Such a space (*Ω*, *B*, *P*) is built and proposed for use in [22–37].

Without considering uncertainty specificities, in the general case, the intellectual operation of an AIS component aims to provide reliable and timely production of complete, valid and/or, if needed, confidential information; see **Figure 5**. The gathered information is used according to its proper specificity. The proposed models [18, 19] allow estimating the intellectual operation processes at the level of used information quality, which is important for every AIS (information may be used by technical devices, "smart" elements, robotics, users, etc.).

*Probabilistic Methods for Cognitive Solving of Some Problems in Artificial Intelligence Systems DOI: http://dx.doi.org/10.5772/intechopen.89168*

**Figure 5.** *Quality of used information (abstraction).*

• "OR" the 1st compound part of the route, …, "OR" the last compound part of the route is overcome successfully by the robot at a given time (for problem 2).

Each component after system decomposition is presented as a "black box." For each "black box," various probabilistic models can be applied for calculations and for building the required probabilistic distribution function (PDF) of time between the next deviations from an established norm. A norm is connected with the definitions of "success" and "failure"; it may be connected with a precondition to "failure" (to prevent "failure"—see Example 2). The focus on describing processes allows using only time characteristics (mean time or frequency of events) and the dimensionless or cost characteristics peculiar for various applications.

The calculated probabilities of "success" and/or "failure" (risk of "failure"), compared with real events during the prediction periods, represent the knowledge of admissibility borders for probabilities of "success" and acceptability borders for risks of "failure." The process of cognitive solving of problems 1 and 2 means not only the formation and use of this knowledge for the interested system, but also the estimation of the quality of monitored and used information (including the definition of input for continuous modeling).

*Probability, Combinatorics and Control*


The proposed analytical models ("The model of functions performance by a complex system in conditions of unreliability of its components," "The models complex of calls processing for the different dispatcher technologies," "The model of entering into system current data concerning new objects of application domain," "The model of information gathering," "The model of information analysis," "The models complex of dangerous influences on a protected system," and "The models complex of an authorized access to system resources") allow estimating the probability of "success" and the risks of losing the quality of intellectual operation during a given prognostic period, considering consequences; see **Table 1**. Required limits on the probability measures are recommended as produced knowledge for the best AIS practice (based on dozens of practical estimations for various application areas).
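As an illustration only, the recommended lower limits from **Table 1** can be encoded as a simple compliance check. The dictionary keys and the `check_measures` helper below are hypothetical names introduced for this sketch; only the limit values come from the table, and the routine itself is not part of the cited models [18, 19].

```python
# Illustrative sketch: recommended lower limits from Table 1,
# keyed by hypothetical measure names (not from the cited models).
REQUIRED_LIMITS = {
    "reliable_functions_performance": 0.99,    # during given time
    "well_timed_processing": 0.95,             # within required term
    "information_actuality": 0.9,              # at the moment of use
    "errors_absence_after_checking": 0.97,
    "correct_analysis_results": 0.95,
    "protection_against_unauthorized_access": 0.999,
}

def check_measures(estimated: dict) -> dict:
    """Return, per measure, whether the estimated probability
    reaches the recommended lower limit; missing estimates fail."""
    return {
        name: estimated.get(name, 0.0) >= limit
        for name, limit in REQUIRED_LIMITS.items()
    }
```

Such a check would flag, for example, an estimated information actuality of 0.95 as admissible (limit 0.9) while reporting any measure without an estimate as not yet confirmed.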

The next probabilistic model is devoted to estimating the probability of "success" and the risk of "failure" at a high meta-level. It is based on studying the general AIS technology of periodical diagnostics of system integrity. Some general technologies were researched for "The models complex of dangerous influences on a protected system"; see **Table 1**. Here, the general case for AIS is presented.

For a system element that allows predicting the risks of losing its integrity during a given prognostic period, the following general AIS technology of providing system integrity is studied.

The technology is based on the periodical diagnostics of system integrity (without continuous monitoring between diagnostics). Diagnostics are carried out to detect the occurrence of danger sources penetrating into a system from threats, or the consequences of negative influences (for example, these may be destabilizing factors at a dangerous enterprise). Lost system integrity can be detected only as a result of diagnostics, after which system recovery is started. A dangerous influence on the system acts step by step: at first, a danger source occurs in the system, and then, after its activation, a loss of integrity may follow; see **Figure 6**. The occurrence time is a random value distributed by the PDF of time between neighboring occurrences of danger, Ωoccur(t) = *P*(τoccurrence ≤ t) = 1 − exp(−t/Toccur), where Toccur is the mean time and σ = 1/Toccur is the frequency. The activation time is also a random value distributed by the PDF of the activation time of an occurred danger, Ωactiv(t) = *P*(τactivation ≤ t) = 1 − exp(−t/Tactiv), where Tactiv is the mean time. System integrity cannot be lost before an occurred danger source is activated. A threat is considered to be realized only after a danger source has activated and influenced the system.
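The step-by-step occurrence-then-activation mechanism can be sketched with a short Monte Carlo simulation: without diagnostics, integrity is lost within Tgiven only if the occurrence delay plus the activation delay fits into the period. This simulation and the parameter values below are illustrative assumptions of this sketch, not part of the cited models.

```python
import random

def estimate_integrity_risk(t_given, t_occur, t_activ,
                            trials=100_000, seed=1):
    """Monte Carlo estimate of the risk that integrity is lost within
    t_given when no diagnostics are performed: a danger source occurs
    after an exponential delay with mean t_occur and activates after a
    further exponential delay with mean t_activ."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(trials):
        occurrence = rng.expovariate(1.0 / t_occur)
        activation = rng.expovariate(1.0 / t_activ)
        if occurrence + activation <= t_given:
            losses += 1
    return losses / trials

# Arbitrary illustrative values: mean occurrence time 100 h,
# mean activation time 10 h, prognostic period 50 h.
risk = estimate_integrity_risk(t_given=50, t_occur=100, t_activ=10)
```

For exponential delays the estimate converges to the convolution of the two distributions evaluated at t_given, which is exactly the risk addressed by the analytical model below.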

It is supposed that the used diagnostics tools allow providing system integrity recovery after revealing the occurrence of danger sources or the consequences of influences. Thus, the probability (*P*) of providing system integrity within the given prognostic period Tgiven (i.e., the probability of "success") may be estimated as a result of the use of the next probabilistic model. The risk to lose integrity (R) is the addition to 1 of the probability of providing system integrity: R = 1 − *P*.


**Figure 6.**

*Some random events for technology: left—correct operation to provide system integrity; right—a loss of integrity during prognostic period Tgiven.*

The following variants are possible:

variant 1—the given prognostic period Tgiven is less than the established period between neighboring diagnostics (Tgiven < Tbetw + Tdiag);

variant 2—the given prognostic period Tgiven is more than or equal to the established period between neighboring diagnostics (Tgiven ≥ Tbetw + Tdiag).

Here, Tbetw is the time between the end of diagnostics and the beginning of the next diagnostics, and Tdiag is the diagnostics time.

For the given period Tgiven, the next statements are proposed for use; see details in [18, 19, 35–37].

Under the condition of independence of the considered characteristics, the probability of providing system integrity (the probability of "success") is equal to:

1. for variant 1

$$P\_{(1)}\left(\mathbf{T}\_{\text{given}}\right) = \mathbf{1} - \boldsymbol{\Omega}\_{\text{occur}} \* \boldsymbol{\Omega}\_{\text{activ}}\left(\mathbf{T}\_{\text{given}}\right),\tag{1}$$

2. for variant 2

measure (a)

$$\begin{array}{l} \mathbf{P}\_{(2)}\left(\mathbf{T}\_{\text{given}}\right) = \mathbf{N}\left(\left(\mathbf{T}\_{\text{betw}} + \mathbf{T}\_{\text{diag}}\right) / \mathbf{T}\_{\text{given}}\right) \mathbf{P}\_{(1)}\left(\mathbf{T}\_{\text{betw}} + \mathbf{T}\_{\text{diag}}\right) \\ \qquad + \left(\mathbf{T}\_{\text{rmn}} / \mathbf{T}\_{\text{given}}\right) \mathbf{P}\_{(1)}(\mathbf{T}\_{\text{rmn}}), \end{array} \tag{2}$$

where N = [Tgiven/(Tbetw + Tdiag)] is the integer part and Trmn = Tgiven − N (Tbetw + Tdiag);

measure (b)

$$P\_{(2)}\left(\mathbf{T}\_{\text{given}}\right) = P\_{(1)}\,^N\left(\mathbf{T}\_{\text{betw}} + \mathbf{T}\_{\text{diag}}\right)P\_{(1)}\left(\mathbf{T}\_{\text{rmn}}\right) \tag{3}$$

The probability of success within the given prognostic period *P*(1)(Tgiven) is defined by (1).
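Formulas (1)–(3) can be sketched numerically, assuming the exponential PDFs Ωoccur and Ωactiv given earlier. The function and variable names below are this sketch's own, not those of the cited software tools: formula (1) is evaluated as the convolution of the occurrence density with the activation CDF, and formulas (2) and (3) reuse it per diagnostic period.

```python
import math

def p1(t_given, t_occur, t_activ, steps=10_000):
    """Formula (1): P(1)(Tgiven) = 1 - (Omega_occur * Omega_activ)(Tgiven).
    The convolution integral_0^T omega_occur(s) * Omega_activ(T - s) ds
    is computed with the midpoint rule; omega_occur is the exponential
    density with mean t_occur."""
    if t_given <= 0:
        return 1.0
    h = t_given / steps
    conv = 0.0
    for i in range(steps):
        s = (i + 0.5) * h
        density = math.exp(-s / t_occur) / t_occur
        cdf_activ = 1.0 - math.exp(-(t_given - s) / t_activ)
        conv += density * cdf_activ * h
    return 1.0 - conv

def p2(t_given, t_betw, t_diag, t_occur, t_activ, measure="a"):
    """Formulas (2) (measure 'a') and (3) (measure 'b') for variant 2,
    i.e., Tgiven >= Tbetw + Tdiag."""
    period = t_betw + t_diag
    n = int(t_given // period)            # N = integer part
    t_rmn = t_given - n * period          # remainder Trmn
    p1_period = p1(period, t_occur, t_activ)
    p1_rmn = p1(t_rmn, t_occur, t_activ)
    if measure == "a":                    # formula (2): time-weighted mixture
        return (n * period / t_given) * p1_period + (t_rmn / t_given) * p1_rmn
    return p1_period ** n * p1_rmn        # formula (3): independent periods
```

With the illustrative values Toccur = 100, Tactiv = 10, Tbetw = 20, Tdiag = 1, and Tgiven = 50, both measures exceed p1(50, 100, 10), reflecting that periodical diagnostics with recovery can only improve the chance of preserved integrity over the same period.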

A modification of this model allows using different values of diagnostics and recovery time [35–37]; for formulas (1)–(3), the recovery time is equal to the diagnostics time.

All these models, supported by various versions of software tools registered by Rospatent, may be applied and improved for solving quality and safety problems connected with an intellectual system presented as a "black box" [18, 19, 38–44].

**Table 1.**
*The proposed analytical models to estimate AIS operation quality [18, 19].*

| Model title | Threats to AIS operation quality | Evaluated measure (required limits as produced knowledge for the best practice) |
|---|---|---|
| The model of functions performance by a complex system in conditions of unreliability of its components | Information is not produced as a result of system unreliability | Probability of providing reliable functions performance during given time (no less than 0.99). Mean time between failures. System availability (no less than 0.9995) |
| The models complex of calls processing for the different dispatcher technologies | Delayed information producing (i.e., not in real time) | Probability of well-timed processing during the required term (no less than 0.95). Mean response time. Relative portion of all well-timed processed calls. Relative portion of well-timed processed calls of those types for which the customer requirements are met (no less than 95%) |
| The model of entering into system current data concerning new objects of application domain | Producing of incomplete information | Probability that system contains information about states of all real objects and coincides (no less than 0.9) |
| The model of information gathering. The model of information analysis | Information validity deterioration caused by: non-actual input information; errors missed or made during information verification; incorrectness of processing | Probability of information actuality at the moment of its use (no less than 0.9). Probability of errors absence after checking (no less than 0.97). Fraction of errors in information after checking. Probability of correct analysis results obtaining (no less than 0.95) |
| The models complex of dangerous influences on a protected system. The models complex of an authorized access to system resources | Violation of secure system operation, including: random faults of staff and users; dangerous influences (revealing of software and technical defects, virus influences, violators' influences, terrorist attacks in information environment, psychological influence, etc.); unauthorized access | Probability of faultless (correct) operation during given time (no less than 0.95). Mean time between errors. Probability of system protection against unauthorized access (no less than 0.99) |
| The models complex of an authorized access to system resources | Violation of information confidentiality | Probability of system protection against unauthorized access during objective period (no less than 0.999) |

Summaries for the last model are as follows:

• The input for modeling includes: the frequency of occurrences of potential threats (or the mean time between the moments of occurrences of potential threats, which equals 1/frequency); the mean activation time of threats; the mean recovery time; the time between the end of diagnostics and the beginning of the next diagnostics; the diagnostics time; and the given prognostic period.

• The calculated results of modeling include: the probability of providing system integrity within the given prognostic period (i.e., the probability of "success") and the risk to lose integrity (i.e., the probability of "failure") as the addition to 1 of the probability of "success."

All these ideas of analytical modeling of operation processes are supported by the software tools [18, 19, 21, 23, 38–44].

For example, an integrated complex system, combined from intellectual structures for modeling an interested system including AIS (**Figure 7**), can be analyzed by formulas (1)–(5) and the probabilistic models described above, allowing to form the PDF by (4) and (5). The correct operation of this complex system during the given period means that during this period both the first and the second subsystems (left and right) should operate correctly according to their destinations, i.e., the integrity of the complex system is provided if "AND" the integrity of the first system (left) "AND" the integrity of the second system (right) are provided.

**Figure 7.**
*An integrated complex system of two serial subsystems (abstraction).*

What about new knowledge from using the proposed methods and models for cognitive solving of problems 1 and 2 of the chapter? A use of these methods and models at different stages of the AIS life cycle (concept, development, utilization, and support stages) allows producing cognitive answers to the following questions:

• What about different risks to lose integrity in operation?

• What about mean time between losses of integrity for the system and each compound subsystem and element (MTBLI as an analog of MTBF)?

• What about the justified norms for values of monitored parameters?

• What requirements should be specified to MTBLI and to repair time for different possible scenarios of operation?

• Which information operation processes should be duplicated and how?
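The serial "AND" composition for the complex system of **Figure 7** can be sketched numerically. The sketch assumes independence of the subsystems and operates on single probabilities for a common period; the cited models combine full PDFs of time between losses of integrity via (4) and (5), which are not reproduced here, and `serial_integrity` is a name introduced for this illustration only.

```python
from functools import reduce

def serial_integrity(probabilities):
    """Integrity probability of a serial ("AND") complex system:
    every subsystem must keep integrity over the same period, so
    under the independence assumption the system probability is the
    product of the subsystem probabilities.  Simplified sketch; the
    cited models compose PDFs, not single numbers."""
    return reduce(lambda acc, p: acc * p, probabilities, 1.0)

# Two serial subsystems (left and right), as in Figure 7:
p_left, p_right = 0.98, 0.95
p_system = serial_integrity([p_left, p_right])
risk = 1.0 - p_system   # risk of "failure" for the complex system
```

The product form makes the cognitive conclusion visible: the serial system is never more reliable than its weakest subsystem, which motivates the questions above about duplication of information operation processes.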
