#### **3.1 Culture of safety**

Institutional Core Values: At an institutional level, the quality oversight team must create a cultural shift by ensuring that each department addresses safety, process improvement, professional outcome assessment, and patient satisfaction. This cultural shift should focus on personal responsibility and on behaviors consistent with institutional core values [1, 32, 78].

#### **3.2 Decision making**

All healthcare providers who attend to patients in the medical system are highly motivated, highly trained individuals whose professional goal is to support others in their most vulnerable moments. It is therefore of the utmost importance to approach medical errors from a systems perspective, understanding that human decision-making is anchored in an evolving medical system where, ideally, patient safety should depend on error anticipation and prevention [47]. High-reliability organizations (HROs), e.g., the nuclear power industry, commercial aviation, and aeronautics, base their safety on organized algorithms. Humans are placed in an environment where decision-making is anticipated and the appropriate tools exist to minimize risk (e.g., checklists).

Little attention has been given to a type of medical error that is difficult to measure and rarely reported: diagnostic errors. This group of errors has been recognized as a significant patient safety threat involving intra- and inter-professional teamwork [79, 80]. In 2015, the National Academies of Sciences, Engineering, and Medicine (NASEM) released a landmark report addressing this concern: *Improving Diagnosis in Health Care* [81]. Decision-making and the subsequent actions taken for diagnostic and treatment purposes occur in an ecosystem that involves structure, processes, policies, and an accepted culture within an institution. It is imperative to understand diagnostic reasoning and critical thinking in order to improve diagnostic performance and reduce error.

Daniel Kahneman won a Nobel Prize in 2002 for the systematic identification and characterization of human decision behaviors that had not previously been described. His book, *Thinking, Fast and Slow*, describes two cognitive systems of thinking and decision-making: System 1 or Type 1 (automatic, emotional, stereotypic, unconscious) and System 2 or Type 2 (slow, effortful, infrequent, logical, calculating, conscious) [82]. He theorized that System 1 uses cognitive "shortcuts" (heuristics) to reduce the cognitive cost of decision-making. Here, cognitive biases can preclude a Bayesian approach to medical decisions. Cognitive psychologists have explored medical reasoning, the use of these mental systems, and cognitive self-monitoring strategies (metacognition, debiasing) that allow for a mental pause to recognize and shift between these processes [83, 84]. Understanding the cognitive processes that influence medical decision behavior will help reduce cognitive errors. Education in cognitive science and critical thinking, combined with metacognitive skill training delivered through dedicated curricula [85], can provide practitioners with the tools to understand and recognize cognitive biases, particularly in high-paced, high-risk specialties such as Anesthesiology, Critical Care, and Emergency Medicine [86].
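To make the contrast between intuitive and Bayesian reasoning concrete, the minimal sketch below illustrates base-rate neglect, a classic System 1 shortcut. The function name and test characteristics are hypothetical examples chosen for illustration, not values from the cited literature:

```python
# Illustrative sketch: Bayesian updating corrects base-rate neglect.
# System 1 intuition tends to equate "positive test" with "disease",
# ignoring the pretest probability (prevalence).

def post_test_probability(pretest: float, sensitivity: float, specificity: float) -> float:
    """Probability of disease given a positive test, by Bayes' theorem."""
    true_positives = sensitivity * pretest
    false_positives = (1.0 - specificity) * (1.0 - pretest)
    return true_positives / (true_positives + false_positives)

# A test with 90% sensitivity and 90% specificity, applied to a
# condition with 1% prevalence, yields a post-test probability of
# only ~8.3% -- far below the ~90% that unaided intuition suggests.
print(f"{post_test_probability(0.01, 0.90, 0.90):.3f}")  # 0.083
```

The deliberate, calculating step this function represents is exactly the kind of System 2 pause that metacognitive training aims to cultivate.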
