*Probability, Combinatorics and Control*

*Probabilistic Methods for Cognitive Solving of Some Problems in Artificial Intelligence Systems. DOI: http://dx.doi.org/10.5772/intechopen.89168*

• What processing devices and technologies should be used to achieve the necessary level of system integrity (quality, safety, etc.)?

• What is the system tolerance to changing data flows?

• What data flows and functional tasks may be the main causes of "bottlenecks"?

• What data gathering technologies and engineering solutions can guarantee the completeness and actuality of the used information?

• What information verification and validation control should be used?

• What qualification requirements should be set for the users of the AIS (from the points of view of AIS effectiveness and efficiency)?

• How dangerous are the scenarios of environmental influences, and what protective technologies will provide the required security?

• How will the use of integrity diagnostics and security monitoring worsen the time-probabilistic characteristics of the system?

• What protection system effectiveness is required to prevent unauthorized access?

• What are the information security risks? etc.

Rational answers to these questions allow improving and accumulating knowledge concerning the AIS. The proposed methods and models provide the following approach for cognitive solving of problems 1 and 2.

**4. Problem 1: planning the possibilities of function performance on the basis of monitored information about events and conditions**

It is supposed that the terms "success" and, accordingly, "failure" are defined in terms of the admissible conditions under which the system of interest operates for its purpose.

Note. For example, for each parameter of equipment, the ranges of possible values of its condition may be estimated as "Working range inside the norm" or "Out of the working range, but inside the norm" ("success") and "Abnormality" ("failure"), interpreted similarly to the light signals "green," "yellow," and "red." Under this definition, a "failure" of equipment operation characterizes a threat of losing system norm integrity after a dangerous influence (on the logical level, the range "Abnormality" may be interpreted analytically as a failure, a fault, a loss of quality or safety, etc.). But the definition may be different: for example, a "failure" may be defined as an incident or accident. Under that definition, a short stay in the range "Abnormality" is not a "failure," because the incident or accident may not happen.

Four steps are proposed for cognitive solving of problem 1, i.e., planning the possibilities of function performance on the basis of monitored information about events and conditions; see **Figure 8**.

**Figure 8.**
*Steps for cognitive solving of problem 1.*

**Step 1**. The complete set of variants of actions is defined and, for each variant, a set of components. Each use case may be characterized by an expected gain in comparable conventional units. If the objective value of the gain cannot be defined,

an expert value of the expected level of "success" for each variant may be established, for example, on a dimensionless scale from 0 to 100 (0 means "no gain," i.e., "failure"; 100 means "the maximal gain," i.e., complete "success"). After learning by the knowledge base, a self-improving AIS uses the input and the corresponding results of probabilistic modeling in the form of the solution of a previously encountered specific problem 1.
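Step 1 can be sketched in code. The following is a minimal illustration, not the chapter's method: the variant names, components, and numbers are invented, and the fallback from an objective gain to the 0–100 expert scale follows the rule stated above.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Variant:
    """One variant of actions; names and numbers are illustrative."""
    name: str
    components: List[str]                 # set of components realizing the variant
    gain: Optional[float] = None          # objective gain in conventional units, if known
    expert_success: Optional[int] = None  # expert level of "success", 0..100

    def expected_gain(self) -> float:
        # Prefer the objective gain; otherwise fall back to the
        # dimensionless expert scale (0 = "failure", 100 = complete "success").
        if self.gain is not None:
            return self.gain
        if self.expert_success is not None:
            return float(self.expert_success)
        raise ValueError(f"no gain estimate for variant {self.name!r}")

# The complete set of variants, each with its set of components.
variants = [
    Variant("V1", ["sensor A", "link 1"], gain=42.0),
    Variant("V2", ["sensor B", "link 2"], expert_success=70),
]
for v in variants:
    print(v.name, v.expected_gain())
```

A self-improving AIS would additionally store each solved instance (input plus modeling results) in the K-base for reuse; that bookkeeping is omitted here.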

A knowledge base (K-base) is defined as a database that contains inference rules and information about human experience and expertise in a domain (ISO/IEC 2382-1:1993).
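To make this definition concrete, here is a toy K-base in the spirit of ISO/IEC 2382-1: stored facts about a monitored parameter plus inference rules that map its value onto the "green"/"yellow"/"red" ranges from the Note above. The parameter name and the thresholds are invented for illustration only.

```python
# Toy K-base: facts (monitored conditions) plus inference rules.
# All thresholds are hypothetical.
KBASE = {
    "facts": {"temperature": 78.0},
    "rules": [
        # (predicate over value, inferred signal, interpretation)
        (lambda t: t <= 60.0, "green", "working range inside of norm"),
        (lambda t: 60.0 < t <= 85.0, "yellow", "out of working range, inside norm"),
        (lambda t: t > 85.0, "red", "abnormality (treated as failure)"),
    ],
}

def infer(kbase, parameter):
    """Apply the first matching rule to the stored fact for the parameter."""
    value = kbase["facts"][parameter]
    for predicate, signal, meaning in kbase["rules"]:
        if predicate(value):
            return signal, meaning
    raise LookupError("no rule matched")

print(infer(KBASE, "temperature"))  # 78.0 falls into the "yellow" range
```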

**Step 2**. The measures and optimization criteria are chosen. The following may be accepted as criteria:


**Step 3**. The accumulated knowledge is used to refine the input for modeling. The quality of the used information is estimated by the models above, considering the limitations from **Table 1**. Using the model, the probabilistic measures are calculated for each variant for the given prognostic period (see the proposed models above and Step 1). From the set of possible variants, the optimal one is chosen according to the criterion of Step 2.
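The selection in Step 3 amounts to filtering the variants by the chosen limitations and then optimizing the Step 2 criterion over the feasible set. A minimal sketch, with invented variant names, probabilities, gains, and admissible level:

```python
from typing import Dict, Optional

# Hypothetical candidates: modeled probability of "success" over the
# prognostic period and expected gain in conventional units.
candidates = {
    "V1": {"p_success": 0.97, "gain": 40.0},
    "V2": {"p_success": 0.88, "gain": 55.0},
    "V3": {"p_success": 0.93, "gain": 50.0},
}
P_ADM = 0.90  # admissible probability level (an invented limitation)

def choose_optimal(cands: Dict[str, dict], p_adm: float) -> Optional[str]:
    """Keep variants meeting the limitation, then maximize the criterion (gain)."""
    feasible = {k: v for k, v in cands.items() if v["p_success"] >= p_adm}
    if not feasible:
        return None  # no optimal variant; criteria or limitations need revision
    return max(feasible, key=lambda k: feasible[k]["gain"])

print(choose_optimal(candidates, P_ADM))  # V2 is infeasible; V3 wins on gain
```

Returning `None` mirrors the situation discussed in the closing Note: a variant satisfying all conditions may not exist.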

Note. For example, the following general formal statements of the problems of system optimization are proposed:

1. On the stages of system concept, development, production, and support: system parameters and software, technical, and management measures (Q) are the most rational for the given period if on them the minimum of expenses (*Z*dev.) for the creation of the system is reached

$$Z_{\text{dev.}}\left(\mathbf{Q}_{\text{rational}}\right) = \min_{\mathbf{Q}} Z_{\text{dev.}}\left(\mathbf{Q}\right),\tag{6}$$

at the limitations on the probability of an admissible level of quality *P*quality(Q) ≥ *P*adm. and on the expenses for operation *C*oper.(Q) ≤ *C*adm., and under other development, operation, or maintenance conditions;

2. On the utilization stage: system parameters and software, technical, and management measures (Q) are the most rational for the given period of operation if on them the maximum of the probability of correct system operation is reached

$$P_{\text{quality}}\left(\mathbf{Q}_{\text{rational}}\right) = \max_{\mathbf{Q}} P_{\text{quality}}\left(\mathbf{Q}\right),\tag{7}$$

at the limitations on the probability of an admissible level of quality *P*quality(Q) ≥ *P*adm. and on the expenses for operation *C*oper.(Q) ≤ *C*adm., and under other operation or maintenance conditions.

For the limitation on *P*quality(Q), the K-base is used; for example, see **Table 1**. For the calculation of the probabilistic measures for the given prognostic period, the proposed models are used.

These statements (6) and (7) may be transformed into problems of expenses or risk minimization under different limitations. A combination of these formal statements is possible within the system's life cycle.

**Step 4**. A plan for the optimal variant of actions (defined in Step 3) is formed. To support the efficiency and/or effectiveness of the functions, the achievable gain calculated at Step 3 is recorded. New knowledge is improved, accumulated, and systematized in the K-base by comparing it with reality (for example, by a specific method considering the AIS capabilities for self-improving).

Note. A solution that meets all the conditions may not exist. In this case, there is no optimal variant of planning the possibilities of function performance on the basis of monitored information. Additional systems analysis or an adjustment of the criteria or limitations is required (see, for example, ISO/IEC/IEEE 15288).

**Figure 9.**
*Steps for cognitive solving of problem 2.*

**Step 1**. … modeling. To do this, the robot can use data from various sources (for example, from air drones, intelligent buoys on the water, sensors under the water, etc.). If necessary, possible damages are taken into account. For example, each use case may be characterized by an expected damage in comparable conventional units. If the objective value of a damage cannot be defined, an expert value of the expected level of "failure" for each variant may be established, for example, on a dimensionless scale from 0 to 100 (0 means "no damage," i.e., "success"; 100 means "the maximal damage"). After learning by the K-base, a self-improving AIS also uses the input and the corresponding results of probabilistic modeling in the form of the solution of a previously encountered specific problem 2. The index i of the first part of the selected route is set to the initial value i = 1.

**Step 2**. The accumulated knowledge is used to refine the input for prognostic modeling. The quality of the used information is estimated by the models above, considering the limitations from **Table 1**. Using the probabilistic model, the probability of failure is calculated for each variant. From the set of remaining route variants, the optimal one is chosen (for it, the minimum probability of failure is achieved).

**Step 3**. The robot overcomes the i-th part of the selected route. If the part cannot be overcome successfully according to probabilistic modeling and/or actual data, a comeback to the initial point of the part is performed. If no alternative route exists there, a comeback to the initial point of the previous part is performed. The input for modeling every part of the possible route is updated for each of the variants. New knowledge is improved, accumulated, and systematized in the K-base by comparing it with reality (using a specific method considering the AIS capabilities for self-improving).

**Step 4**. If, after overcoming the i-th part, the robot has arrived at the intended point of the route (i.e., the last part of the route is overcome and the goal is achieved), then the solution of task 2 for optimizing the route is complete. If the robot has not yet arrived at the intended point (i.e., the last part of the route is not overcome), then i is increased by 1 and the process returns to Step 2.
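The route-selection loop of Steps 1–4 can be sketched as follows. This is a simplified illustration under stated assumptions: the route names and per-part failure probabilities are invented, the parts are treated as independent, and the K-base refinement of the inputs at each step is omitted; a seeded random generator stands in for actual traversal outcomes.

```python
import random

# Invented probabilities of failure for each part of two route variants.
routes = {
    "coastal": [0.05, 0.10, 0.02],
    "open_sea": [0.01, 0.30, 0.01],
}

def probability_of_failure(parts):
    """Route fails if at least one part fails (parts assumed independent)."""
    p_ok = 1.0
    for p_fail in parts:
        p_ok *= 1.0 - p_fail
    return 1.0 - p_ok

def traverse(routes, rng):
    """Pick the route with minimal failure probability, then overcome its
    parts one by one, coming back to the start of a part on failure."""
    name = min(routes, key=lambda r: probability_of_failure(routes[r]))
    log, i = ["selected " + name], 0
    while i < len(routes[name]):
        if rng.random() < routes[name][i]:        # part not overcome
            log.append(f"part {i + 1} failed, retry")
            continue                              # comeback to the part's start
        log.append(f"part {i + 1} overcome")
        i += 1                                    # proceed to the next part
    log.append("goal achieved")
    return log

print(traverse(routes, random.Random(0)))
```

Here the "coastal" route is chosen because its overall failure probability (about 0.16) is lower than that of "open_sea" (about 0.31); a fuller implementation would also re-plan among the remaining route variants after each failed part, as Step 3 describes.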
