**Meet the editor**

Dr Mehmet Savsar is Professor of Industrial Engineering and Management Systems Engineering at Kuwait University. He received his B.Sc. degree from Karadeniz Technical University, Turkey, in 1975, and his M.Sc. and Ph.D. degrees in Industrial Engineering and Operations Research from the Pennsylvania State University, USA, in 1978 and 1982 respectively. He worked as a researcher at Pennsylvania State University during 1980-1982, as a faculty member at Anadolu University, Turkey, during 1982-1984, and at King Saud University, Saudi Arabia, during 1984-1997. He has been with Kuwait University since 1997, and served as chairman of the Industrial and Management Systems Engineering Department at Kuwait University during 2006-2010. His research interests include modeling of production systems; quality, reliability and maintenance management; facility layout; flexible manufacturing; and scheduling. He has over 130 publications in international journals and conference proceedings, and serves on the editorial boards of several international journals and conferences.

Contents

**Preface IX**

Chapter 1 **Five Essential Skills for 21st Century Quality Professionals in Health and Human Service Organisations 1**
Cathy Balding

Chapter 2 **The Development and Changes of Quality Control in Japan 19**
Kozo Koura

Chapter 3 **ISO-GUM and Supplements are Utilized for QA of BCA Data 25**
Yasuo Iwaki

Chapter 4 **The Use of Quality Function Deployment in the Implementation of the Quality Management System 55**
Elena Condrea, Anca Cristina Stanciu and Kamer Ainur Aivaz

Chapter 5 **Quality Assurance in Education 75**
Geoffrey Doherty

Chapter 6 **Challenges for Quality Management in Higher Education – Investigating Institutional Leadership, Culture and Performance 103**
P. Trivellas, P. Ipsilantis, I. Papadopoulos and D. Kantas

Chapter 7 **Implementing Quality Management Systems in Higher Education Institutions 129**
Maria J. Rosa, Cláudia S. Sarrico and Alberto Amaral

Chapter 8 **Using a Class Questionnaire for Quality Improvement of Engineering Ethics Instruction During Higher Education 147**
Yuji Okita


Chapter 9 **Towards Learning-Focused Quality Assurance in Chinese Higher Education 161**
Yuan Li and Houyi Zhu

Chapter 10 **Quality Assurance in Chile's Municipal Schools: Facing the Challenge of Assuring and Improving Quality in Low Performing Schools 183**
Luis Ahumada, Carmen Montecinos and Alvaro González

Chapter 11 **Integrated Higher Education Management: Summary of Management Approaches 193**
Juha Kettunen

Chapter 12 **Quality Assurance in the Career of Nursing 209**
Cecilia Latrach, Naldy Febré and Ingrid Demandes

Chapter 13 **Quality Assurance of Medicines in Practice 219**
Beverley Glass and Alison Haywood

Chapter 14 **Patterns of Medical Errors: A Challenge for Quality Assurance in the Greek Health System 245**
Athanassios Vozikis and Marina Riga

Chapter 15 **Critical Success Factors for Quality Assurance in Healthcare Organizations 267**
Víctor Reyes-Alcázar, Antonio Torres-Olivera, Diego Núñez-García and Antonio Almuedo-Paz

Chapter 16 **The ACSA Accreditation Model: Self-Assessment as a Quality Improvement Tool 289**
Antonio Almuedo-Paz, Diego Núñez-García, Víctor Reyes-Alcázar and Antonio Torres-Olivera

Chapter 17 **Quality Improvement Through Visualization of Software and Systems 315**
Peter Liggesmeyer, Henning Barthel, Achim Ebert, Jens Heidrich, Patric Keller, Yi Yang and Axel Wickenkamp

Chapter 18 **Automatic Maintenance Routes Based on the Quality Assurance Information 335**
Vesa Hasu and Heikki Koivo

Chapter 19 **Implementation of CVR / IT Methodology on Outsourced Applied Research to Internship Environment, Case, Information Technology Directorate of Bina Nusantara Foundation 353**
Renan Prasta Jenie

Chapter 20 **Improving Quality Assurance in Automation Systems Development Projects 379**
Dietmar Winkler and Stefan Biffl

Chapter 21 **Optimization of Optical Inspections Using Spectral Analysis 399**
K. Ohliger, C. Heinze and C. Kröhnert


## Preface

Quality is one of the most important factors when selecting products or services. Consequently, understanding and improving quality has become the main issue for business strategy in competitive markets. The need for quality-related studies and research has increased in parallel with advances in technology and product complexity. Quality engineering and management tools have evolved over the years, from the principles of "Scientific Management" through quality control, quality assurance, total quality, six sigma, ISO certification and continuous improvement. In order to facilitate and achieve continuous quality improvement, the development of new tools and techniques is continually required.

With the initiation of "Scientific Management" principles by F. W. Taylor in 1875, productivity became a focus in dealing with complex systems. Later, systematic inspection and testing of products were started by AT&T in 1907. After the introduction of control chart concepts by W. A. Shewhart in 1924 and of acceptance sampling methodology by H. F. Dodge and H. G. Romig in 1928 at Bell Labs, statistical quality control tools became widely used in industry. After 1950, total quality control concepts were introduced by several pioneers, including A. V. Feigenbaum. In addition to the development of several new quality control tools and techniques, design of experiments became widely used for quality assurance and quality improvement. In 1989, Motorola initiated six sigma concepts to assure high quality for complex electronic products and related systems. After 1990, ISO 9000 quality certification programs were introduced and became widespread in many organizations. The American Society for Quality Control became the American Society for Quality to put emphasis on quality improvement.

Quality terminologies are varied and often used interchangeably. In particular, quality assurance and quality control are both used to represent the activities of a quality department, which develops planning processes and procedures to make sure that the products manufactured or the services delivered by an organization will always be of good quality. However, there is a difference between the two: quality assurance is process oriented and includes preventive activities, while quality control is product oriented and includes detection activities, which focus on finding defects after the product is manufactured. Thus, testing a product belongs to the quality control domain, not to quality assurance. Quality assurance makes sure that the right things are done in the right way. It is important to ensure that products are produced, or services provided, in good quality before they are tested at the final stage of production; once that stage is reached, there is no way to recover the costs already incurred due to bad quality. Quality assurance is therefore an area that needs to be studied and investigated in more detail with respect to various production processes and service activities. It is widely applied in areas such as industrial manufacturing, healthcare, medicine, software, education, transportation, research, government, and other service industries.

The purpose of this book is to present new concepts, state-of-the-art techniques, and advances in quality-related research. Novel ideas and current developments in the field of quality assurance and related topics are presented in different chapters, which are organized according to application areas. Initial chapters present basic ideas and historical perspectives on quality, while subsequent chapters present quality assurance applications in education, healthcare, medicine, software development, the service industry, and other technical areas. This book is a valuable contribution to the literature on quality assurance and quality management. Its primary target audience includes students, researchers, quality engineers, production and process managers, and professionals interested in quality assurance and related areas.

> **Prof. Mehmet Savsar**  Kuwait University, College of Engineering & Petroleum, Industrial Engineering Department, Safat, Kuwait

## **Five Essential Skills for 21st Century Quality Professionals in Health and Human Service Organisations**

Cathy Balding *Qualityworks P/L and La Trobe University, Australia* 

#### **1. Introduction**

Society's demand for quality in all spheres has never been higher. In health and human services industries in particular, consumers and funding bodies demand both technical excellence and outstanding customer service. Industries such as health, aged care and community services are struggling to meet these challenges, as the numbers of consumers grow, technology adds a new layer of complexity that solves some problems and creates others, and staff are expected to provide excellent customer service as well as technically effective services. The role of the quality improvement professional in these organizations is expanding in line with these growing expectations and has never been more important. Traditional quality systems focused on compliance and monitoring are no longer sufficient to create an excellent consumer experience, and quality managers need to add to their skills base to effectively support their organizations in this rapidly evolving environment. This chapter proposes five essential skills for quality professionals in the new millennium that build on, and go beyond, those associated with traditional monitoring and improvement, and that are essential for taking organizations beyond compliance to transformation of the consumer experience. The five essential skills for 21st century quality managers discussed in this chapter are:


The content is derived from the literature and from the author's 20 years' experience working as a quality manager, and with quality managers, in health and aged care.

#### **2. Support robust quality governance**

Transforming the consumer experience cannot be achieved without effective governance for quality. We now need quality governance and systems that address the impact we have on our consumers – not just the outcomes we achieve. People across the organisation, from the boardroom to the customer interface, need to be clear on their individual responsibility for the quality of the services they provide and supported to enact it. Quality managers must be able to work with governing bodies and executives to design and develop systems that support staff to fulfil their responsibilities. This section discusses the governance systems required to enable and empower personnel across the organisation to enact their role in creating high quality services every day.

#### **2.1 Understanding and implementing quality governance**

The concept of quality governance is a relatively recent phenomenon. When the author started working as a quality manager in the 1980s, we thought that if we were accredited, doing some auditing and clinical review and engaging staff in quality projects then we were doing well. We knew that leadership was important, but we didn't know how important it was or indeed how best to lead. It took various studies and inquiries into suboptimal care and adverse events in healthcare to demonstrate that safe and high-quality care in a complex environment requires more than good staff trying hard. Clinical governance largely emerged from the findings of public inquiries into poor care, which found that the majority of these organisations were not the victims of deliberately negligent practitioners. What they lacked were systems: for including consumers in their care, for supporting staff to provide quality care, for clarifying accountabilities and for measurement and improvement. Nor did they exhibit consumer and safety-oriented cultures, with 'blame and shame' the common response to adverse events and a passive response to data indicating suboptimal results (Hindle et al., 2006).

Of course, quality care can't be achieved without good staff doing their best. But to create great care consistently, healthcare staff also need sturdy organisational supports behind them. Staff are 'front of house' – out there working with the customers. Governance is 'back of house' – the behind-the-scenes systems that support staff and enable them to provide a great consumer experience. To make the components of great care happen for every consumer, every day you'll need to ask:

 What do we currently have in place that supports great care as we've defined it?
 What do we need to enhance/change to achieve our quality goals?
 What new processes/supports do we need that we don't currently have?

Providing safe, quality care and guarding against organisational weaknesses that allow poor care requires commitment and accountability to be embedded in the organisational structures and culture, but also requires a targeted plan. Setting goals and targets for the quality of care your organisation wants to deliver, and implementing strategies to achieve them, is part of the governance of any health or aged care organisation. The emergence of clinical governance over the past decade has been healthcare's approach to providing this accountability, planning and support. In aged and primary care, this can be reframed using more appropriate terms such as 'quality governance' or 'care governance'. The key components of governance can be organised into four generic cornerstones:

 strategic leadership, planning and culture
 effective and accountable workforce
 consumer participation
 quality and risk systems.


The importance of a quality governance system cannot be overstated; it provides the foundation for the myriad pieces of a quality system and gives people a role in that system, which in turn makes the implementation of the various governance systems easier.

#### **2.1.1 Clarifying accountabilities for creating safe, quality care**

The concept of governance arose from the need to ensure greater and clearer accountability for the quality and safety of care experienced by the consumer. This is still a work in progress in healthcare. There are many health service organisations in which individuals are not aware of the clear, specific, personal responsibility they have for the quality of care and services they provide. This makes it difficult for staff to carry out their responsibilities, and even harder to create a consistently safe, quality experience for consumers. Governance is where the governing body, executives and managers play their critical role in creating safe, quality care. The executive must translate the strategic quality goals into operational plans and strategies to facilitate their implementation as part of organisational business. Those on the frontline of care create the consumer experience, but the organisational supports for this must come from the top, as staff require leadership, policy, systems and an investment of time and resources to implement the strategies. And, of course, the quality manager provides technical support across the organisation to enable staff to fulfil their responsibilities. An example of generic governance roles for quality care is described in Table 1.

#### **2.1.2 Developing dynamic quality committees**

Another aspect of accountability is the way in which committees support the quality system. Driving the achievement of the quality plan through line management will generally occur in partnership with working groups or committees, particularly where implementation requires cooperation across staff groups or services. When committees are action focused they are invaluable in tracking and driving progress with the quality goals. When committees are just information recipients, staff will have difficulty understanding their purpose – and may try to avoid them. Quality managers need to be alert to directionless committees – and get them on track before they erode the credibility of the quality system. Committees should take an active role in quality goal monitoring and action at the local department/service level (where they might take responsibility for driving one component of a goal) right through to board committee level (which monitors progress with achieving the quality goals). Committees that have an explicit responsibility for achieving a quality goal are more likely to be proactive decision makers and less likely to be passive recipients of information.

To be useful, committees need a clear purpose and something that they are responsible for so they can make decisions and take action. Giving a quality committee responsibility for driving and monitoring a quality goal, objective, strategy or governance support will add some life and energy to proceedings. A clear purpose also helps determine a committee's agenda and membership. Quality committee agendas can be structured according to the quality goals and their objectives and components, which makes it easier to see how data monitoring and improvement activities link to the achievement of great care. All reporting should help a committee determine if progress is being made towards implementing governance cornerstones or achieving the relevant quality goals. Committee membership is always tricky to get right. Members can be invited on the basis of who has to be on this committee – there will always be political and relationship imperatives in a complex system – and who you need on the committee to fulfil its purpose. Some members may need to be there because they are decision makers and have formal power. Depending on the committee's role, you may also want people with informal power – the influencers. If the committee is responsible for addressing improvement in a particular area of the organisation, you will need some who have a deep understanding of the relevant systems, relationships and mental maps. Everyone on a quality-related committee should understand its purpose and exactly what each of their roles is – be it sharing their knowledge, experience or influence – and be invited to contribute to discussions and decisions on that basis.

**Quality Governance Responsibilities**

 Set strategic direction and the line in the sand for the quality of care and services to be achieved
 Lead a just, proactive culture
 Make the achievement of great care a priority
 Ensure management provides the necessary system supports and staff development to provide great care for each consumer, and monitors progress towards achieving the strategic quality goals
 Set strategic goals for great care and operationalise them through effective governance, resources, data, plans, systems, support, tools, policy and people development
 Monitor and drive progress towards the strategic quality goals
 Develop a thinking organisation and a just culture, wherein staff are supported to take a proactive approach to achieving safe, quality care and services
 Make the right thing easy for staff to do
 Support and enable all staff to create great care

Table 1. Examples of governance roles in creating quality care (Australian Commission on Safety and Quality in Healthcare [ACSQHC], 2010; Victorian Quality Council [VQC], 2003)

#### **2.2 Work effectively in complex systems**

Organizations providing human services are complex systems. They have a large number of inputs and processes, and are continually exposed to outside pressures and influences. It is imperative that quality managers working in these environments understand how these systems work. This section explains what complex systems are, how they work and, most importantly, why these things are important for quality managers, because of the way they directly impact on the pursuit of high quality services in an organisation. Working in a complex system, but treating it as if it is a simple or complicated system, makes it difficult to achieve consistently high quality services. Change and improvement in complex systems require a particular approach, tailored to the unique characteristics of the complex environment.

Complex systems operate according to distinctive and often counter-intuitive rules. It is important that quality managers understand these rules and, in particular, their implications for creating change and improving safety and quality. Traditional, production line approaches to quality are only half the story in a complex environment such as a health or aged care service.

#### **2.2.1 An overview of some key complex systems characteristics**

All complex systems have a goal, which may be as simple as survival, or maintaining the current situation. Be prepared for push back from the system if you interfere with it achieving its goal. Systems enjoy their status quo and strive to maintain it. If you change one part of the system, this will result in resistance from the other parts of the system it is linked to, because it means they will have to change as well. The more parts of the system there are and the more possible connections between them, the harder it is to change and the easier it is to create chaos (Meadows, 2008). So whenever you take action within a complex system, there will be side effects. These may be positive or negative, depending on your perspective. In our health services, we usually expect that effect will follow cause. This is production line thinking. We recognise these as false conclusions when we can't then replicate the same result in another part of the organisation. The result may have been due to the natural variation inherent in every system. Or it may have been due to your intervention – but this intervention won't work the same way in another part of the system. Generally speaking, real change in complex systems requires a lot of different parts of the system to be working towards the same change.

A complex system acts like a web of elastic bands, so that when you pull one piece out of position it will stay there only for as long as you exert force on it. When you let go, you may be surprised and annoyed that it springs back to where it was before. In addition, a complex system may or may not be stable. Stable complex systems that have not been subject to a lot of change become more resistant to change as time goes on. All of us have experienced this in organisations, where one service or department has somehow escaped the force of change experienced by other parts of the organisation. When their turn comes, they find change very difficult. In an unstable system, however, pressure to make changes can cause the system to burst like a balloon. If the system is under a lot of pressure routinely, this may only take a small trigger, just as a small crack in a dam can lead to its collapse because of the constant pressure of water behind it. So if you put an unstable system under enough pressure for long enough, it can suddenly disintegrate.

Make the achievement of great care a priority and take a proactive

Operationalise the strategic quality goals by translating them into

Understand the key organisational safety and quality issues and

Monitor and drive progress by implementing the drivers of great

Develop staff and systems to create quality care and services for

 Make evaluation and improvement a routine part of care Develop, implement and evaluate initiatives to contribute to the

 Create a great experience for each consumer through positive behaviours and attitudes and a proactive approach

contribute to discussions and decisions on that basis.

**Organisational** 

**Governing Body** Accountable for the quality of care, services and consumer experience

**Chief Executive and Executives**  Accountable for and lead great care and services

**Directors and Managers**  Responsible for the quality of care in each service

**Clinicians and** 

Responsible for quality of care at point of care

**Staff** 

**level** 

Organizations providing human services are complex systems. They have a large number of inputs and processes, and are continually exposed to outside pressures and influences. Quality managers working in these environments must understand how these systems work if they are to be successful. This section explains what complex systems are, how they work and, most importantly, why this matters for quality managers: these characteristics directly shape the pursuit of high quality services in an organisation. Working in a complex system while treating it as if it were a simple or complicated system makes it difficult to achieve consistently high quality services. Change and improvement in complex systems require a particular approach, tailored to the unique characteristics of the complex environment.

#### **2.2.1 An overview of some key complex systems characteristics**

Complex systems operate according to distinctive and often counter-intuitive rules. It is important that quality managers understand these rules and, in particular, their implications for creating change and improving safety and quality. Traditional, production line approaches to quality are only half the story in a complex environment such as a health or aged care service.

All complex systems have a goal, which may be as simple as survival, or maintaining the current situation. Be prepared for push back from the system if you interfere with it achieving its goal. Systems enjoy their status quo and strive to maintain it. If you change one part of the system, this will result in resistance from the other parts of the system it is linked to because it means they will have to change as well. The more parts of the system there are and the more possible connections between them, the harder it is to change and the easier it is to create chaos (Meadows, 2008). So whenever you take action within a complex system, there will be side effects. These may be positive or negative, depending on your perspective. In our health services, we usually expect that effect will follow cause. This is production line thinking. When an improvement appears after an intervention, we are tempted to conclude that the intervention caused it. We recognise these as false conclusions when we can't then replicate the same result in another part of the organisation. The result may have been due to the natural variation inherent in every system. Or it may have been due to your intervention – but this intervention won't work the same way in another part of the system. Generally speaking, real change in complex systems requires a lot of different parts of the system to be working towards the same change.
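The role of natural variation is easy to underestimate. As a rough illustration only – the incident-count model, the means and the one-incident threshold below are invented for this sketch, not drawn from the chapter – the following Python snippet simulates a process that never changes, then counts how often a simple before-and-after comparison would still look like an improvement:

```python
import random

random.seed(42)

def apparent_change(n_months=6, mean=10.0, sd=3.0):
    """Difference in mean monthly incident counts between an 'after' and a
    'before' period drawn from the SAME unchanged process, so any difference
    is pure common-cause (random) variation."""
    before = [random.gauss(mean, sd) for _ in range(n_months)]
    after = [random.gauss(mean, sd) for _ in range(n_months)]
    return sum(after) / n_months - sum(before) / n_months

# How often would an unchanged process appear to "improve" by at least one
# incident per month, purely by chance?
trials = [apparent_change() for _ in range(10_000)]
fraction = sum(1 for d in trials if d <= -1.0) / len(trials)
print(f"chance 'improvement' of at least 1 incident/month: {fraction:.0%}")
```

Under these assumed numbers, roughly a quarter of before-and-after comparisons show an apparent improvement even though nothing changed – which is why a result that cannot be replicated elsewhere should be treated with caution.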

A complex system acts like a web of elastic bands so that when you pull one piece out of position it will stay there only for as long as you exert force on it. When you let go, you may be surprised and annoyed that it springs back to where it was before. In addition, a complex system may or may not be stable. Stable complex systems that have not been subject to a lot of change become more resistant to change as time goes on. All of us have experienced this in organisations, where one service or department has somehow escaped the force of change experienced by other parts of the organisation. When their turn comes, they find change very difficult. In an unstable system, however, pressure to make changes can cause the system to burst like a balloon. If the system is under a lot of pressure routinely, this may only take a small trigger, just as a small crack in a dam can lead to its collapse because of the constant pressure of water behind it. So if you put an unstable system under enough pressure for long enough, it can suddenly disintegrate.


Despite these characteristics, complex systems work because people make them work. But to do this, processes in the system are often changed as the system evolves, and then the relationships between the processes have to change to keep the system working. The relationship between different parts of the system determines how the system overall works, so each process change, however minor, can affect the behaviour of the whole. This is an important point! All processes in a system are interdependent and they all interact. The key to change is not to just focus on one process in isolation, but to look at how it relates to the other processes in the system. Systems can also become self-organising and can generate their own hierarchies of power and influence. These hierarchies may not be the same as those seen on your organisational chart. Each person, wherever they sit in the system, has the power to affect the way the system behaves. Relationships within each subsystem are denser and stronger than relationships between subsystems. For example, there are likely to be more interdependencies and networks up and down a silo in a health service than across and between silos. Interaction within the silos occurs mainly between members of the same professional group: nurses interacting with nurses, and doctors interacting with doctors. These tribes give the people within them an important sense of belonging but it can be hard to break down the walls and build bridges between them (Braithwaite, 2010).

Complex systems do not necessarily operate according to the policies of the organisation. On the contrary, complex systems can be exceedingly policy resistant. This resistance particularly arises when an introduced change threatens the goal of the system or when policies are implemented that are not based on the reality and unwritten rules of those having to implement them. We've all experienced policies developed on the run, or even painstakingly over a long period, that have only been partially adhered to by those they were designed for. If there is too great a mismatch between the policy requirements and the way that things really get done or the goals of the system, the policy will generally fail. At worst, people will disregard it; at best, they will work around it to meet their goals of getting their work done in the most effective, efficient and easiest way – a way that has probably been crafted over time and is protected by and embedded in the way the system operates and the unwritten beliefs of those who work within it. The way in which policy is implemented can also influence the degree to which it is enacted as intended. Poor implementation opens up a policy to all sorts of change and interpretation by those using it. This may drive policy enactment to drift away from the original intention.

The importance of quality professionals being able to adjust to and deal with these characteristics cannot be overestimated. It can mean the difference between creating consistently safe, quality services and implementing monitoring and improvement with few gains. The implications of these complex systems characteristics are discussed throughout the remainder of this chapter.

#### **2.3 Develop a balance of rule based and proactive approaches to quality**


Human services have traditionally relied on rules to enforce standards and ways of working. But, as we can see from the characteristics of complex systems, more than traditional approaches are required to create consistently safe and high quality health and human services. Of course some rules and standardization are important, but too many rules can do as much damage as too few. Staff work around rules that are not a good fit for their environment, and all systems and procedures gradually erode in complex systems, where they are open to a myriad of influences and changing circumstances. What is required is a balance of rules, systems and thinking, proactive staff.

Improving reliability through systems that force and guide safe decisions, provide backups, remind staff of preferred behaviour and catch fallible humans when they make a mistake is a key aspect of creating safety. In fact, the use of such systems is in its infancy in healthcare – compared to other high-risk industries – and there would probably be significant benefit in fast-tracking the implementation of proven safety systems. Rule-based decision making, such as the use of protocols and checklists, is also extremely useful in many situations; for example, by inexperienced practitioners who are learning standard procedures for frequent high-risk situations. Standard procedures can be useful for experts as well – particularly if they find themselves in a situation that they do not often experience (Flin et al, 2008). Not all aspects of standardisation and reliability are foolproof, however, and there is danger in thinking that they are a set-and-forget solution to safety. There are many reasons for this in a complex system. Remember the 'policy resistant' aspect of complex systems? Complex systems – and the people working within them – do not always respond well to overly restrictive rules, and they may react in unexpected ways. Creating a standardized approach, unless based on a forcing function, does not guarantee that it will be followed. And forcing functions, while useful in creating safety, can give rise to complacency and a lack of staff alertness. So standardisation is *one* answer to improving safety and quality, but not the *only* answer.

Why is this? We often find that there is such a strong emphasis on procedures, checklists and protocols that organisations attempt to write one for every eventuality. But it is almost impossible for a procedure to be written for every situation in a complex system, and unlikely that staff will refer to all procedures if there are too many of them (Amalberti et al, 2006). Reliability in high reliability organisations is accomplished by standardisation and simplification of as many processes as possible. But your health service is a dynamic organism with a high level of variability, production pressure, professional autonomy and rapid creation of new knowledge. Not everything can be fixed and standardised, so when trying to reduce variability and improve reliability, it is better to focus on the variation that is creating real problems, rather than variation more broadly. All safety policies have a natural lifespan as the context around them is constantly changing. The challenge of creating and maintaining safety within this context requires a mix of standardisation and proactive, flexible, thinking solutions.
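One practical way to 'focus on the variation that is creating real problems' is a simple control-chart rule that separates routine variation from signals worth investigating. The sketch below uses a crude Shewhart-style 3-sigma rule on invented monthly counts; real control charts would derive their limits from moving ranges or a count model such as a c-chart:

```python
import statistics

# Hypothetical monthly medication-error counts for one ward (data invented
# for this illustration).
counts = [12, 9, 11, 10, 13, 8, 11, 12, 9, 24, 10, 11]

mean = statistics.mean(counts)
sd = statistics.pstdev(counts)
upper = mean + 3 * sd          # crude 3-sigma control limits
lower = max(mean - 3 * sd, 0)  # counts cannot fall below zero

# Only points outside the limits suggest a special cause worth investigating;
# everything inside is the system's routine common-cause variation.
signals = [(month, c) for month, c in enumerate(counts, start=1)
           if c > upper or c < lower]
print(signals)  # → [(10, 24)]
```

Only month 10 is flagged; reacting to every wiggle in the other eleven months would be chasing common-cause variation, the statistical counterpart of writing a new rule for every adverse event.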

Over-reliance on rule-based decision making is another flaw in mechanistic approaches to safety and quality in health services. It may cause a degree of skill decay: if an unexpected and unfamiliar situation arises and no rule exists, will the person making the decisions be able to formulate an effective course of action? (Flin et al, 2008). Protocols too may reduce or discourage the ability of people to be proactive, practise situational awareness and identify deviations from normal situations – in short, to think for themselves (Dekker, 2005). Bad decisions can also occur in rule-based situations if the wrong rule or protocol is selected. It is human nature to prefer a familiar rule, whether or not it is the right one to match the situation in which the decision maker finds themselves. A mechanistic rule-based approach to safety is based on the premise that safety is the result of people following procedures, but staff work around rules and procedures that do not meet their needs for efficiency and streamlining. Developing checklists and protocols in response to risks may provide a sense of action having been taken, but can send the message that reliable, safe care requires nothing more than insisting upon routine standardised procedures. Nothing threatens safety like the belief that the problem is solved (Bosk et al, 2009).

#### **2.3.1 Moving beyond standardisation to create safety and quality**

When developing safety policies and protocols, it is better to give staff fewer rules that can be reliably followed around the clock than to write 'perfect' protocols based on ideal conditions that require workarounds to fit the situation at 11pm on a Saturday night. Try to resist the pressure to develop a new rule in response to every adverse event or root cause analysis finding because you'll end up with a mix of 'should follow' and 'must follow' rules that will muddy the safety waters. 'Should follow' rules that have little credibility or apparent consequence are unlikely to be followed in a messy, high-risk, high-stress environment, so why bother? Erosion of compliance with 'should follow' rules can, in turn, negatively influence compliance with the more important 'must follow' rules. When people are violating a protocol, find out why! It may be for a good reason and may give you an insight into what's going on in practice – and what's required to improve. Use observation and discussion to work out what's really happening. And when introducing a new protocol to reduce a risk, do the troubleshooting around whether or not it's likely to be followed, before people's lives depend on it. Quality managers who understand and can explain the value of not constraining the system any more than necessary, and who encourage challenging a new protocol with 'why won't it work?' and 'how are people likely to work around it?' are more likely to effect positive change in their organisation's approach to safety and quality than those obsessed with rules and compliance.

Another strategy for creating safety and quality in complex organizations is to develop the resilience of the staff. Resilience engineering is a concept derived from human factors engineering – the discipline that studies the interface between machines and systems and human beings, and improves design so that humans can operate safely and effectively. From a human factors perspective, resilience refers to the ability, within complex and high-risk organisations, to understand how failure is avoided and how to design for success. It describes how people learn and adapt to create safety in settings that are fraught with gaps, hazards, tradeoffs and multiple goals. Resilience can be described as a property of both individuals and teams within their workplace (Jeffcott et al, 2009). It fits well with James Reason's observation that his 'Swiss Cheese Model of Accident Causation' (Reason, 2008) requires another slice of cheese – cheddar, not Swiss – at the end of the line. This slice represents humans as the final barrier and defence against unsafe situations turning into harm, when all other systems fail. Practising resilience requires organizations to investigate how individuals, teams and organisations monitor, adapt and act effectively to cope with system failures in high-risk situations, and to apply and develop these lessons.

In the end, rules don't create safety – people do. Quality care and services are created by systems and standardisation, and also by proactive staff working in partnership with consumers to create the organisation's vision for great care. Building resilience is a component of this approach that combines elements of creating safety, human factors, high performing teams, job satisfaction and empowerment in a way that may assist with winning the hearts and minds of the staff at point of care. These are the staff we ultimately depend on to create and deliver the safety and quality of care we want our consumers to experience every day.

We cannot expect to eliminate human error and systems failure, but we can develop organisations that are more resistant to their adverse effects. Achieving this balance within a high-risk and ever changing environment is a critical challenge for healthcare managers and staff. But this approach reflects more realistically the environment within which we work every day. An environment that cultivates both systems and people not only supports the creation of a safer environment, but improved quality of care and services more broadly.

#### **2.4 Develop strategic quality plans**

Health services have traditionally measured inputs and outputs, and to a lesser extent outcomes, as valid and reliable outcome data can be difficult to obtain. They have been less concerned with measuring and addressing their impact on the consumer experience. We often see quality systems focused on compliance and small scale improvement, resulting in task focused programs with little purpose or direction. Like a jigsaw puzzle without the picture, there are many pieces, but no one is quite sure how to put it together. Yet engaging staff in playing their part in quality requires an inspiring vision of the service quality the organisation is committed to provide for each consumer, and a clear pathway to get there. Creating consistently high quality consumer experiences in complex organisations requires a strategic approach. Quality professionals must be able to work with their executives and managers to create a blueprint wherein goals, strategies, leadership and governance converge on a specific target: great technical care and customer service. Strategic quality planning and implementation within complex healthcare environments is a key skill for quality managers in the 21st century.

So, what is goal-based quality planning – and why do we need it? Staff involved in health and aged care quality systems are often frustrated because they don't understand why they are being asked to collect data, develop new processes or go to meetings. Simply, they can't see how these efforts fit into the bigger picture. All they see are tasks that interfere with their capacity to do 'real' work. A goal based quality plan is the blueprint for how the quality system components work together to achieve a quality consumer experience. A clear, strategically focused quality plan can help quality professionals to clarify and fulfil their role and support managers and staff to better understand their part in achieving quality care. It also demonstrates that participation in the quality system is about a lot more than achieving accreditation, as the focus of the quality system becomes the impact of monitoring and improvement activities on consumers, rather than fulfilling accreditation requirements. And this is of much more interest to clinicians and staff.

There are three key aspects to a quality system in health and aged care:

- Maintenance – minimise risk, maintain processes and standards of care, detect problems, monitor compliance
- Improvement – identify and drive operational improvements in processes designed to solve problems and improve consumer experiences and outcomes
- Transformation – develop and pursue a strategic view of consistently 'great' care for every consumer (Balding, 2011).

Most quality systems address maintenance and improvement, but too few use their quality and governance structures and processes to pursue transformation. So how does goal-based

safety and quality than those obsessed with rules and compliance.

**2.3.1 Moving beyond standardisation to create safety and quality** 

Health services have traditionally measured inputs and outputs, and to a lesser extent outcomes, as valid and reliable outcome data can be difficult to obtain. They have been less concerned with measuring and addressing their impact on the consumer experience. We often see quality systems focused on compliance and small scale improvement, resulting in task focused programs with little purpose or direction. Like a jigsaw puzzle without the picture, there are many pieces, but no one is quite sure how to put it together. Yet engaging staff in playing their part in quality requires an inspiring vision of the service quality the organisation is committed to provide for each consumer, and a clear pathway to get there. Creating consistently high quality consumer experiences in complex organisations requires a strategic approach. Quality professionals must be able to work with their executives and managers to create a blueprint wherein goals, strategies, leadership and governance converge on a specific target: great technical care and customer service. Strategic quality planning and implementation within complex healthcare environments is a key skill for quality managers in the 21st century.

So, what is goal-based quality planning – and why do we need it? Staff involved in health and aged care quality systems are often frustrated because they don't understand why they are being asked to collect data, develop new processes or go to meetings. Put simply, they can't see how these efforts fit into the bigger picture. All they see are tasks that interfere with their capacity to do 'real' work. A goal-based quality plan is the blueprint for how the quality system components work together to achieve a quality consumer experience. A clear, strategically focused quality plan can help quality professionals to clarify and fulfil their role, and support managers and staff to better understand their part in achieving quality care. It also demonstrates that participation in the quality system is about a lot more than achieving accreditation, as the focus of the quality system becomes the impact of monitoring and improvement activities on consumers, rather than fulfilling accreditation requirements. And this is of much more interest to clinicians and staff.

There are three key aspects to a quality system in health and aged care: maintaining quality, improving quality and transforming quality.

Most quality systems address maintenance and improvement, but too few use their quality and governance structures and processes to pursue transformation. So how does goal-based quality planning address this? More importantly, how does it address this in a complex environment? Your quality plan and system are only as good as the extent to which they impact on the care the consumer receives – supporting it to be good today, and driving it to be great over the long term. Helping managers and staff understand this, and their role in it, is a key responsibility of the quality manager. And it's not just the managers and staff who need to understand it; a quality manager will often have to explain it to the organisation's executive and governing body. When it comes to quality, governing bodies need something tangible to govern and leaders need something concrete to lead.

The strategic approach to quality planning and creating great care in this chapter is based on the characteristics of successful strategic planning processes used in healthcare and other industries, and is a good fit with complex systems characteristics. They include:

 the use of vision statements that inspire and stretch the organisation
 the development of revolutionary goals to achieve the vision
 a horizontal approach to the planning process where input and participation are equalised across the organisation
 using learning, information and rewards to increase the strategic view of the entire organisation
 encouragement and the cultivation of strategic thinking and culture change at all levels of the organisation
 having strategic decision making driven down to all levels of the organisation so that achieving the strategic direction becomes part of everyone's job (Zuckerman, 2005).

Organisations using this dynamic approach develop their quality plan as the platform for achieving the organisational strategic vision for quality. The strategic planning process is managed centrally or corporately, and the leaders, managers and staff who are closest to the consumer are the key implementers. A dynamic quality plan is both a map and a vehicle for reaching a destination. That means that a strategic approach to maintaining, improving and transforming great care and services requires you to know the where (where are we now and where do we want to go?), the why and what (why are we doing this and what do we want to achieve?) and the how (how will we get there?).

#### **2.4.1 Setting goals is key to success**

One of the most valuable skills a quality manager can offer an organisation is the development of clear and measurable goals. Do you really know what your organisation is trying to achieve? What do you want to be known for in terms of the quality of care and services you provide? Where do you stand in terms of the key quality and safety issues in your industry?

The research points to the need for a shared purpose if real change is to be made. Engaging people's hearts and minds in a common purpose requires us to paint a rich, specific picture of what they will gain if they participate and what the end result will look like. This is a staple of effective strategic planning. But it is still rare to see health services with a specific vision for the quality of care and services they wish to provide for their consumers. The pressures of short-term budget cycles and political and corporate demands do not lend themselves to a comprehensive, longer-term approach. However, stretch goals can have a transformational effect on an organisation. A strategic approach should be designed to take your organisation somewhere better than it is now, and that requires a quality plan based on the vision of care that your organisation wants to move towards. It must also be based on current reality: achievable enough that people can believe it can happen, and enough of an improvement that it is worth pursuing. If you want people to lay the quality bricks, you have to engage them in developing a rich picture of what the finished house will look like.

It is important to define quality care from both the consumer and provider perspectives. One without the other is only half the story. It is not an easy undertaking to pull the threads of your organisation together to achieve a common vision for the quality of care your organisation wants to provide. And it is likely to be nearly impossible unless it is clearly defined, ruthlessly prioritised and pursued with laser-like focus. It also needs to fit with existing system goals. To achieve all of this, plans should not contain too many ingredients, and should focus on achieving the essentials of great care for every consumer, every time. This means that these essentials must be defined. Engaging people across the organisation, including consumers and the governing body, is a good way to ensure this picture of quality care is both aspirational and achievable. Frontline staff and 'frequent flyer' consumers are central to this process. No one understands the difference between great and unacceptable care like those engaged in the care and service delivery transaction. The conversation around developing the vision might go something like this:

 How would we like to describe our care and services?
 How would we like each of our consumers to experience our care and services in three years' time?
 How would we like our consumers to feel about our services and describe their experience with us?
 What would we like the media to be saying about us – or not saying?

Consumers, staff, executives and the governing body can – and should – contribute to these conversations. But it is not always easy to take the next step and turn this rich picture of quality care into concrete, strategic goals. This is where many organisations falter. Without goals, your quality plan may look like a long to-do list with no specific purpose. The vision for the care you want to provide must be rich, and also translated into concrete goals to describe the way things could be. Goals must be attractive and describe real, desirable, achievable changes, as seen in Table 2.

Our strategic goals for the care and services each of our consumers will experience by the end of 20XX are:

 Care and services are designed and delivered to minimise the risk of harm (safe).
 Care is based around the consumer as an individual, and is designed to achieve optimal outcomes (effective and appropriate).
 Care and services are designed and delivered to create the best possible experience for each individual (person-centred).
 Consumers are provided with, and experience, care and services in a logical, clear and streamlined flow (continuous, accessible, efficient).

Table 2. Examples of strategic goals for an organisation's quality of care (Balding, 2011).

People are attracted to ideas they feel they are involved in generating. Involving the staff affected in developing the goals for change can help create both buy-in, and the goal clarity that people need before deciding if and how they will participate. Goal clarity appears to be another problem area in creating change. If you aim at nothing in particular – or something ambiguous – that's probably what you'll hit. And yet it is not uncommon to see changes and improvements implemented with only a vague idea of what they will achieve and no clear objectives against which to measure success. The goals for your change must be SMART: specific, measurable, achievable, realistic and time-bound. Goals are about turning your vision into something achievable. Goals are not tasks; goals describe the desired future achievement. A SMART goal will encompass: How well? By when? How will we know? These are then broken down into objectives and the key tasks or stepping stones that have to be traversed, depending on where you're starting from, to achieve the final goal.

#### **2.4.2 Select priorities carefully**

A traditional problem with quality plans is that they are over-ambitious. But it's far better to do fewer things and get them right. That's why any good plan has short-, medium- and long-term goals. Developing an annual Quality Action Plan, derived from the strategic quality plan, is a good way to keep the strategic quality plan current and dynamic. The annual plan contains the priorities to be achieved over the coming 12 months. It ensures the strategic quality plan can evolve with changing external and internal circumstances, while maintaining the overall direction towards achieving the quality goals over the longer term.

So what should be done in the first year of the plan? The selection of your first-year objectives will be based on the activities that:

 have the greatest impact in creating a positive experience for each consumer
 maximise safety
 minimise and eliminate the things that shouldn't happen
 address components of great care that are currently suboptimal – or non-existent
 solve significant problems and manage key risks
 meet legislative, policy and accreditation requirements
 get something going that will take a long time to achieve
 cover a lot of the quality plan's intent, using the 80:20 principle.

The 'first among equals' priority for consumers is safety, and this requires robust processes across all services to reduce risk in key areas. Priorities may also be selected based on safety and indicator data, consumer and staff feedback, and identified problems in specific areas. Policy, funding issues and key risks must also be addressed as priorities – that's a reality. If compliance and safety issues are at the head of your quality priorities queue, try to also include some aspirational objectives for improving the consumer experience from other dimensions of quality on the Year One list, or you may lose the momentum and energy created by the planning process. Internally, you will already have many activities in place that will help you achieve your goals. You could start by conducting a gap analysis to ascertain where current quality activities are or are not addressing or supporting the key priorities. Other organisations can also supply ideas for achieving your quality goals. Above all, don't get caught up in the detail of planning to the extent that you lose sight of your purpose. Keep the care you want every consumer to experience at the centre of your activities.

#### **2.5 Create impact and improve outcomes through sustained systems change**

Once high quality care and services are achieved, they must be embedded in everyday work. This is one of the most challenging aspects of a quality system, particularly in complex, dynamic organisations, and effective change skills are pivotal to the quality role. Quality managers often underestimate the difficulties of achieving sustained change in this environment, resulting in re-work and waste as changes that don't 'take' are re-implemented. Lasting change to effect improvement requires both systems and people change.

#### **2.5.1 Understand the current system before you try to change it**

In a complex system you need to understand what drives current processes before you can achieve a sustained impact and improvement in outcomes. Observe the humans in their natural systems environment. This may be the most important of all the 'change basics' steps – and one of the least practised. With the goal of determining organisational fit and readiness for change, you can look for systems factors such as:

 who and what drives the current system
 the key relationships between processes and people
 the perception of the need for change
 the degree to which the system participants perceive the change as beneficial
 the degree of fit between the goals of the system and the goals of the change
 personal attitudes towards change generally, and past experiences with change in the organisation
 the timing and context of the change. What else is changing or happening in this system?
 the social and values anchors that are important to the change targets and that maintain the status quo. Which of these are non-negotiable?
 aspects of the current situation that the change targets don't like. Can these be eliminated or improved as part of the change?
 driving and restraining forces for change, and the degree to which it looks like the drivers outweigh the restraints (NHS 2002, 2004).

This should help you build an informative picture of the current situation. What has to change to achieve your vision? Work policies and practices? Physical surrounds? Emotional ties? Cultural norms? Understanding and working with the current culture is critical to success – even if that culture is the very thing you want to change. Use your mud map of the current situation to assess, identify and build on what currently works. 'Appreciative inquiry' is a process of identifying something that works consistently well within a system and finding out how this happens (NHS, 2002). Have you ever performed a root cause analysis on something that works, to find out why it works well? This makes a nice change from looking at things that don't work well, which is a more common approach in healthcare. Tools such as process mapping, direct observation and conversations with the various players are useful here to tease out the positive characteristics of the current system that will help anchor the changed system. Not only will this help inform your pre-planning, but you will be laying a foundation for buy-in.

Five Essential Skills for 21st Century

local environment.

to be involved

the next cycle?'

but don't follow through.

situation

**2.5.3 Test and implement the changes** 

Quality Professionals in Health and Human Service Organisations 15

fully enable people to take ownership of the task or a change. We often see one or two of these employed in healthcare change but it is unusual to see an individual or team supplied with all four (Balding, 2009). Empowerment does not mean abandonment. Giving people permission to do something differently is not helpful if they are unable to do it. That permission just sets them up to fail. Setting the context for change means preparing the players, understanding what they know and don't know, working with them, watching their performance, giving them feedback and creating an ongoing dialogue with them (Meadows, 2008). It may be more effort at the front end of a change to work with staff to ensure they have all four components, but it will save you a lot of time and trouble at the back end of the change if they are able to embrace, own and run with the change in their

Rapid cycle piloting of change using the Plan Do Study Act (PDSA) cycle is a useful approach to change in a complex system. PDSA fits the changeable and adaptable nature of complex systems and enables you to test ideas under a variety of circumstances (Reason, 2008). It's also a good way to pick up on the feedback and side effects of your change. This model also includes the possibility that the change being tested will not be successful, but because these tests are done on a small scale the risk of failure can be kept to a level that's manageable. PDSA also helps achieve quick wins, even if small, that are integral to gaining stakeholder acceptance of change. Success on a small scale builds confidence, which allows

larger risks and changes. Pilot projects work best under the following circumstances:

The easiest change with the most leverage for the biggest impact is made

change in complex systems (Haines, 1998; Reason, 2008).

Pilots are limited to small samples and short cycles of change with the people who want

They use solutions that have worked for others, but are adapted to fit the local

 An action learning process is used to frequently review progress and the change leader stops to ask: 'how did we go?', 'what did we learn?', 'what were the unintended consequences and side effects?' and 'how should we do it differently in

Participants are not afraid to stop a test change that's clearly not working. This is part of

Staff involved in the pilot will be watching, judging and weighing up whether or not to hitch their wagon to the new way. It is imperative that your process has credibility. When you pilot a change, use a simple but rigorous project management approach and do exactly what you have promised. If you want to change people's beliefs about how things should be done, you must change what they see. A memo or an email about doing something differently will not make it happen. If you want people to believe that changing their behaviour will result in a certain positive outcome, that outcome must occur. If you commit the leadership group to behaving in a different way, they must behave in that way. This is where many change initiatives break down: we make the plan and say what will happen,

#### **2.5.2 Develop your strategies for change – And impact**

Your strategies for change will be based on your mud map of the current situation, particularly the anchors keeping the current situation in place, and represent the flight plan for how to get to your goals from where you are. Where possible, learn from others who have introduced the same or similar changes, whilst adapting their strategies to your own environment. There is no guarantee that strategies that have been successful elsewhere will work as well in your organisation due to the many layers of interactions that make your system unique. Change, transformation and improvement cannot be delivered through the adoption of an imported recipe or formula without adapting it to the current environment. If you introduce a new procedure, software system, data collection or form on a Monday morning without investing in preparing and equipping the people who will use the innovation, it is unlikely to be automatically adopted. The process may have changed, but the people haven't – they are the same as they were on Friday afternoon. Process change is not the same as people change. Process change is transactional and concrete. People change is transitional and involves a psychological process to come to terms with a new situation and change behaviour to enable the new situation to occur. Unless this transition is well managed, change will not work and things can get stuck. Even with obviously positive changes, there are transitions that begin with having to let go of something and there will be push back because your change adds to the staff 'to do' list and new behaviours take longer, both of which result in lost time. At worst, staff are losing something they are strongly wedded to and may actively resist or get stuck in a neutral zone where they are aware of the change but not actively engaged – a sort of change no man's land (Bridges, 1997).

It is important to remember that all staff feel that they are doing their best for each patient. Change for improvement should always be presented as something that helps good practitioners achieve even more. They may maintain that their only desired benefit of change is improved patient outcomes and these, of course, are likely to take some time to become apparent after the initial change. So what are some of the short-term benefits of change you can use to get people's attention? This is where you have to talk about impact as well as outcome. Impact is what we are trying to achieve by change, for both consumers and for staff. It's not only about trying to improve the results of care. It's about consumers feeling the impact of your change through a different, more positive experience. Does the change mean that staff are more active listeners – so consumers feel heard? Is it that the change can form part of an action research project and that you can assist staff to write it up for a journal or a conference paper? Will it help both consumers and staff feel more informed and in control of what's going on? Can a process be made more efficient and simpler as part of the change? Can you save them time and money? (Frankel et al, 2011).

Within this framework, as far as possible, give staff as much freedom as possible to devise their own ways of achieving the goals, based on their intimate knowledge of their own systems. But empowering people to create change is not just saying 'make it so' and then being disappointed when they don't achieve the desired result. Empowering people to change in complex systems is not straightforward. But there are some common actions that have been shown to be essential in assisting people to take ownership of a task or change: direction, knowledge, resources and support – the DKRS model of empowerment (Balding, 2011). For the DKRS model to succeed, each of these four components must be present to fully enable people to take ownership of the task or a change. We often see one or two of these employed in healthcare change but it is unusual to see an individual or team supplied with all four (Balding, 2009). Empowerment does not mean abandonment. Giving people permission to do something differently is not helpful if they are unable to do it. That permission just sets them up to fail. Setting the context for change means preparing the players, understanding what they know and don't know, working with them, watching their performance, giving them feedback and creating an ongoing dialogue with them (Meadows, 2008). It may be more effort at the front end of a change to work with staff to ensure they have all four components, but it will save you a lot of time and trouble at the back end of the change if they are able to embrace, own and run with the change in their local environment.
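The DKRS test – that direction, knowledge, resources and support must *all* be present before a team can be expected to own a change – can be expressed as a small checklist routine. This is an illustrative sketch only; the function and field names are hypothetical and not from Balding's model itself.

```python
# Illustrative check for the DKRS empowerment model (Balding, 2011):
# a team is only set up to succeed when ALL four components are present.
# All identifiers here are hypothetical, invented for this sketch.

DKRS = ("direction", "knowledge", "resources", "support")

def empowerment_gaps(team):
    """Return the DKRS components a team is still missing."""
    return [c for c in DKRS if not team.get(c, False)]

def ready_to_own_change(team):
    """Permission without all four components just sets people up to fail."""
    return not empowerment_gaps(team)

# Example: a team with direction, knowledge and support but no resources
team = {"direction": True, "knowledge": True, "resources": False, "support": True}
print(empowerment_gaps(team))      # → ['resources']
print(ready_to_own_change(team))   # → False
```

The point of the sketch is the `and`-of-all-four logic: supplying one or two components, as often happens in healthcare change, still leaves the readiness check failing.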

#### **2.5.3 Test and implement the changes**

Rapid cycle piloting of change using the Plan Do Study Act (PDSA) cycle is a useful approach to change in a complex system. PDSA fits the changeable and adaptable nature of complex systems and enables you to test ideas under a variety of circumstances (Reason, 2008). It's also a good way to pick up on the feedback and side effects of your change. This model also includes the possibility that the change being tested will not be successful, but because these tests are done on a small scale the risk of failure can be kept to a level that's manageable. PDSA also helps achieve quick wins, even if small, that are integral to gaining stakeholder acceptance of change. Success on a small scale builds confidence, which allows larger risks and changes. Pilot projects work best under the following circumstances:

- The easiest change with the most leverage for the biggest impact is made first.
- Pilots are limited to small samples and short cycles of change with the people who want to be involved.
- They use solutions that have worked for others, but are adapted to fit the local environment.
- An action learning process is used to frequently review progress and the change leader stops to ask: 'how did we go?', 'what did we learn?', 'what were the unintended consequences and side effects?' and 'how should we do it differently in future?'
- Participants are not afraid to stop a test change that's clearly not working. This is part of change in complex systems (Haines, 1998; Reason, 2008).

Staff involved in the pilot will be watching, judging and weighing up whether or not to hitch their wagon to the new way. It is imperative that your process has credibility. When you pilot a change, use a simple but rigorous project management approach and do exactly what you have promised. If you want to change people's beliefs about how things should be done, you must change what they see. A memo or an email about doing something differently will not make it happen. If you want people to believe that changing their behaviour will result in a certain positive outcome, that outcome must occur. If you commit the leadership group to behaving in a different way, they must behave in that way. This is where many change initiatives break down: we make the plan and say what will happen, but don't follow through.
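The PDSA cycle described above is essentially a short feedback loop: plan a small test, run it, study the results (including side effects), and act on what you learn – adopting, adapting or abandoning the change. A minimal sketch of that loop, with entirely hypothetical function names and decisions, might look like this:

```python
# Minimal sketch of rapid-cycle piloting with PDSA (Plan-Do-Study-Act).
# The plan/do/study callables and the decision strings are illustrative
# assumptions, not part of the chapter's method.

def pdsa_cycle(plan, do, study, max_cycles=3):
    """Run up to max_cycles small-scale tests; stop early on adopt/abandon."""
    history = []
    for cycle in range(1, max_cycles + 1):
        test = plan(history)          # Plan: design a small-scale test
        results = do(test)            # Do: run it with willing participants
        decision = study(results)     # Study: outcomes AND side effects
        history.append((test, results, decision))
        if decision in ("adopt", "abandon"):  # Act: don't persist with a failing change
            return decision, history
    return "adapt", history

# Example: a test that succeeds with no side effects is adopted after one cycle
decision, history = pdsa_cycle(
    plan=lambda h: {"sample_size": 5 * (len(h) + 1)},
    do=lambda test: {"success_rate": 0.9, "side_effects": []},
    study=lambda r: "adopt" if r["success_rate"] > 0.8 and not r["side_effects"] else "adapt",
)
print(decision)  # → adopt
```

Note how the loop structure captures the points in the bullet list: small samples, short cycles, explicit review after every cycle, and permission to stop a test that is clearly not working.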


Early wins are required to show that change is possible and can have positive outcomes. Action sends a strong message, more than any memo ever could. Don't be surprised by unexpected or negative outcomes, and don't expect a linear cause followed by effect chain with your change. Look for the unintended negative side effect of your change. For example, if you have streamlined the new consumer registration process, does this leave clients feeling that they have been hurried and not heard? Don't ignore or downplay these negative side effects – they are not failure, but the way of the world in complex systems.

#### **2.5.4 Reinforce, embed and spread the change**

Creating buy-in is one thing. 'Stay in' is something else altogether. Systems need a constant supply of new energy to survive and, until your new change starts to create its own energy, it requires yours! Sustainability is a process, not an ending (NHS, 2002). Many managers want to get everything up and running on auto pilot as soon as possible, but this is the antithesis of what actually sustains change.

In complex systems, sustainability and spread are dynamic processes that need focus and attention. So, define sustainability. What do you mean by it? What do you want to still be happening in one/three/six months from now? People need to be reminded of the goal and the vision, and the way in which these are achieved requires monitoring and course correction in a shifting complex environment. Involve people in developing solutions to overcome the unexpected problems that arise, ensure they are equipped for their role in the change and reinforce where their contribution to the change makes things better for patients. Use the sceptics to help you identify the problems and the roadblocks and show you value their input. Arguing with them will not change their mind and you may lose valuable information (Haines, 1998).

If you've done a good job of your change process by giving the participants a positive experience, ensuring the change is an improvement for patients and staff and finding those quick wins, the initiative should have its ownership and should just about spread itself. This is the 'tipping point' concept, which provides a useful summary of spread (Gladwell, 2002). The 'law of the few' and the 'stickiness factor' are tipping point concepts, which provide us with direction on how to go about reaching the point where the change takes on a life of its own. The law of the few means that a few influential, popular people can effectively spread a message, so use the people who have influence – the 'players' in your complex system – and also the people who just get around and talk a lot. Stickiness means that a message has impact: you can't get it out of your head, it sticks in your memory. Are your messages 'sticky' or dull and forgettable? (Gladwell, 2002). Are they presented in the language of the people – or in complex bureaucratese?

Once you've got the change right, embed it in job descriptions, policies and procedures, competencies and performance reviews. Reinforce it. Remove the old way – if you don't, people will cling to it because it's familiar, and it will make the new way seem like an extra, rather than a replacement. Keep the change on meeting agendas as a specific review item for at least six to twelve months, depending on the size of the change. Appoint a 'keeper' of the change – someone influential whose job it is to keep an eye on the new way of doing things and the people involved, and to identify regression and unintended side effects. Ensure it continues to be linked to broader organisational initiatives.

#### **3. Conclusion**


As the pressure on our health and aged care services grows, so too do the demands on the quality professional. Continuing to increase the efficiency and quality of healthcare will require new knowledge and savvier ways of working. To meet these challenges, quality professionals will need to expand their role beyond traditional compliance, measurement and improvement skills and tasks. They will be required to understand their workplaces as complex systems and be experts in supporting their complex organisations to create high quality care. To do this they will support and lead their organisations to develop robust governance, to create safety through a mix of effective systems and resilient people and to achieve sustainable change that positively impacts the consumer experience as well as improving outcomes. These are the new skills for 21st century quality managers.

#### **4. Acknowledgment**

With grateful thanks to my family for indulging my preoccupation with all things quality.

#### **5. References**


ACSQHC. (2010). *Australian Safety and Quality Framework for Healthcare: Putting the Framework into Action: Getting Started*, Australian Commission on Safety and Quality in Healthcare, Australia

Amalberti, R.; Vincent, C.; Auroy, Y.; & de Saint Maurice, G. (2006). Violations and Migrations in Health Care: A Framework for Understanding and Management, *Quality and Safety in Health Care*, December 15 (Suppl 1): i66–i71

Balding, C. (2005). Strengthening Clinical Governance through Cultivating the Line Management Role, *Australian Health Review*, vol. 29, no. 3

Balding, C. (2011). *The Strategic Quality Manager*, Arcade Custom, Australia

Bosk, CL.; Dixon-Woods, M.; Goeshel, CA.; & Pronovost, PJ. (2009). The Art of Medicine: Reality Check for Checklists, *The Lancet*, vol. 374, August 8

Braithwaite, J. (2010). Between-group Behaviour in Health Care: Gaps, Edges, Boundaries, Disconnections, Weak Ties, Spaces and Holes. A Systematic Review, *BMC Health Services Research*, vol. 10, no. 330, *http://www.biomedcentral.com/1472-6963/10/330* (Accessed February 2011)

Bridges, W. (1997). *Managing Transitions*, Addison Wesley Publishing Company, USA

Dekker, S. (2005). *Ten Questions About Human Error*, Lawrence Earlbaum Associates Inc, USA

Flin, R.; O'Connor, P.; & Crichton, M. (2008). *Safety at the Sharp End – A Guide to Non-Technical Skills*, Ashgate Publishing, UK

Frankel, A.; Leonard, M.; Simmonds, T.; Haraden, C.; & Vega, K. (eds) (2009). *The Essential Guide for Patient Safety Officers*, Joint Commission on Accreditation of Healthcare Organisations and Institute for Healthcare Improvement, USA

Gladwell, M. (2002). *The Tipping Point*, Backbay Books, USA

Haines, SG. (1998). *Systems Thinking and Learning*, HRD Press, USA

Hindle, D.; Braithwaite, J.; Travaglia, J.; & Idema, R. (2006). *Patient Safety: A Comparative Analysis of 18 Enquiries in 6 Countries*, Centre for Clinical Governance Research, UNSW, Australia

Jeffcott, SA.; Ibrahim, JE.; & Cameron, PA. (2009). Resilience in Healthcare and Clinical Handover, *Quality and Safety in Healthcare*, vol. 18, pp. 256–260

Meadows, DH. (2008). *Thinking in Systems – A Primer*, Sustainability Institute, USA

NHS. (2002). *The Improvement Leaders Guide to Managing the Human Dimensions of Change: Working with Individuals*, NHS Modernisation Agency, UK

NHS. (2004). *Engaging Individual Staff in Service Improvement*, NHS Modernisation Agency, UK

Reason, J. (2008). *The Human Contribution*, Ashgate Publishing Company, UK

Victorian Quality Council. (2003). *Better Quality, Better Healthcare: A Safety and Quality Framework for Victorian Healthcare*, Department of Human Services, Victorian Government, Australia

Zuckerman, A. (2005). *Healthcare Strategic Planning*, Health Administration Press, USA

## **The Development and Changes of Quality Control in Japan**

Kozo Koura

*Kozo Koura & Associates, Japan* 

### **1. Introduction**


#### **1.1 "Investigation and research era of quality control" (1946-49)**

In 1945, after the Second World War, Japan was nearly on the verge of starvation: over 80% of its industrial facilities had been destroyed, and industrial production had dropped to a little over 10% of the prewar level. Japan was left with no natural resources, nothing but "industrious people and their brains"1).

The Japanese Standards Association (henceforth JSA) was established under the Industrial Standardization Law in 1945, and the Union of Japanese Scientists and Engineers (henceforth JUSE) was organized in 1946; the two bodies have been the parents of quality control promotion in Japan ever since. In 1948, the Chairman of JUSE, Ichiro Ishikawa (also the first chairman of Keidanren, the Federation of Economic Organizations), planned an overseas technical investigation team, composed of members of learning and experience, for the revival of Japanese industry, with a grant for investigation and research costs approved by the Economic Stabilization Board. As a result of that investigation, it was proposed to introduce "Quality Control (henceforth QC)" into Japan. JUSE carried out a QC seminar (the Quality Control Basic Course) in 1949, and the QC Research Group (QCRG), organized from the lecturers of this seminar, later exerted itself for the development of quality control in Japan.

In "the statement of the first publication" in the first issue of *Quality Control* (now *Quality Management*) in March 1950, Ichiro Ishikawa set out the vision of QC in Japan: "For the reconstruction of a peaceful country, a cultural country, and a democratic country, the products of our country must be made positively competitive in the world market, heading dignifiedly toward 'the figure which should exist' for the future of our country's economy and industry (sentence abbreviated)"2).

### **2. "SQC Era" (1950-54) and "Era of systematic management reinforcement of QC" (1955-59)**

Dr. W.E. Deming was invited and the eight days course of QC was held in 1950. The Deming Prize founded a prize to commemorated Dr. W.E. Deming's contribution and friendship in a lasting way and to promote the continued development of Quality Control in

The Development and Changes of Quality Control in Japan 21

In 70/71 years, "QC Circle Koryo (General Principles of QC Circle)", and "How To Operate QC Circle Activities" were published. QC Circle Activities were further had organized nine branches offices in the whole country. Moreover, it began to spread through out the world, and China and Korean version in 76 year, and English version of QC Circle Koryo (word of henceforth each country) were published in the 80-year. The International Convention on

The Deming Prize awarded company came out of the construction industry for the first time

The Quality Assurance was compiled into one book as a "Quality Assurance Guide Book", and the concept of quality was also expanded to deal with reliability, PLP (product-liability prevention), environmental management, QC in office work and sales, and conservation of resources and energy problem. Moreover, advanced technologies development, such as "New Seven Management Tools", "QFD (Quality Function Deployment)", and multi-

So called the "Quality Revolution" to which Dr. Juran was mentioned 3) was attained, and made-in-Japan product was begun to export to a world market, and Japanese TQC had been

Reverse export of TQC of Japan started. Diversification and an advancement of a customer demand had been progressed, "Attractive Quality" was advocated in 84, sensitivity quality is also dealt with. Policy Management and Daily Management had been being substantial and clarifying role of management, the management-strategy problem had also been taken, and Group-Wide QC also progressed. Moreover, TQC was also implemented from manufacturing industry to service industry, the QC of Software develops, and the "Social

Then, the internationalization of the Deming Application Prize was determined in 1984, and Florida Electric-Power and Light Incorporated Company, U.S.A became the first recipient company as an overseas company in 1989. ICQC'87 TOKYO was held in 1987 and "Ten

The technology transfer of quality prize started in the same year, and the Malcolm Baldrige National Quality Award in U.S.A. was founded by referring to Deming Prize in 1987, and

While the introduction of the ISO 9000 Quality System / 14000 Environmental Management System to company in our country and the integration with TQM could be considered, it was announced "Declaration TQM" by QCS 1997, and "the Comprehensive Quality Management in the 21st Century" was advocated, and "Stakeholder Relationship Management" with stockholder, customer, employee, society, and environment, etc. came to be considered. Moreover, the systematization of TQM is also proposed and the total quality of a wide sense was defined. On the other side, while "Strategic Policy Management" was

**6. "Era of internationalization of TQC and TQM reconstruction" (1990-99)** 

QC Circle was held in 1978 (ICQCC1978-Tokyo).

variable analysis using the computer was handled in QC.

**5. "Era of leap and development of TQC" (1980-89)** 

Quality" lecture by QC magazine had been taken in 86.

Items of the Specific Features of TQC of Japan" was announced.

also continued to the European Quality Award founded in 1991.

also come to be accepted abroad.

in 1979.

Japan in 1951, and it became the Driving Force of QC Development of subsequent Japan. Spread and application of statistical quality control (SQC) were prosperous in this time, and it was called SQC Era. Also, the management and improvement by control chart, process control table, or process analysis were advanced by the "Deming Cycle", and it was also called as "Process-Control Priority-Focus Era".

Dr. J.M. Juran was invited in 1954 and his seminar was opened, under Dr. Juran's concept of "QC is a part of business management", and it went into subsequent "the Era of Systematic Management by reinforcement of QC", and progressed "Establishment of the Concept and the Technologies of Management". A Deming Cycle was generalized as a PDCA (Plan-Do-Check-Act) Cycle, and systematization and institutionalization of QC, and the QC Activity System were advocated.

### **3. "Total quality control (henceforth TQC) era" (1960-69)**

 By the 1960, "Quality Control magazine" proposed the special issue for "QC Implementation with All Member" by one year's project and the TQC of Dr. A.V. Feigenbaum was also taken into and spreaded, and which were later developed with a next Japanese QC (Japanese TQC). That is, triggered from Dr. Juran's "General management" lecture, the concept of "Management Item" was introduced and further developed into the "Management Item Table by position and rank", "Flag System", and "Policy Management (Management by Policy)".

Further, efforts expanded into substantial strengthening of "Initial Production-Flow Line Quality Control" at the new-product development and sales stages and of the "Vendor-Vendee Relations' 10 QC Principles", and eventually a "Quality Assurance System" was established. These views developed further into the improvement of management efficiency, i.e., "Cross-Functional Management", through management systems for quality (Q: Quality), cost and profit (C: Cost/Profit), and delivery time and production volume (D: Delivery/Volume). "Policy Management" and "Cross-Functional Management" were thus established as fundamentals of the business management system, providing horizontal management to complement ordinary vertical management in the organization.

On the other hand, the QC Circle was born in 1962, building the base of "management with respect for humanity", and the "seven QC tools" and the "QC story for problem solving" were developed as tools for its activities.

The 1st Quality Control Symposium (henceforth QCS) was held in 1965 on the subject of "Introduction, Promotion and Rooting of QC". The QCS served as a forum for discussing and discovering the right approaches to QC through collaboration between industry and academia. The "Six Items of the Specific Features of TQC in Japan" were deliberated there and announced at the International Conference on Quality Control in 1969 (ICQC'69 TOKYO).

### **4. "Era of establishment of TQC" (1970-79)**

The Japanese Society for Quality Control (JSQC) was established and the Japanese Quality Control Medal was founded in 1970.


In 1970-71, the "QC Circle Koryo (General Principles of the QC Circle)" and "How to Operate QC Circle Activities" were published. QC Circle activities were further organized into nine branch offices across the country. They also began to spread throughout the world: Chinese and Korean versions of the QC Circle Koryo were published in 1976, and an English version in 1980. The International Convention on QC Circles was held in 1978 (ICQCC1978-Tokyo).

In 1979, a Deming Prize winner came from the construction industry for the first time.

Quality assurance practice was compiled into a single "Quality Assurance Guide Book", and the concept of quality was expanded to cover reliability, product-liability prevention (PLP), environmental management, QC in office work and sales, and the conservation of resources and energy. Moreover, advanced techniques such as the "New Seven Management Tools", "QFD (Quality Function Deployment)", and computer-based multivariate analysis were taken up in QC.

The so-called "Quality Revolution" that Dr. Juran mentioned 3) was thus attained: made-in-Japan products began to be exported to the world market, and Japanese TQC also came to be accepted abroad.

#### **5. "Era of leap and development of TQC" (1980-89)**

Reverse export of Japanese TQC started. As customer demands diversified and advanced, "Attractive Quality" was advocated in 1984, and sensory quality also came to be dealt with. Policy Management and Daily Management became more substantial, clarifying the roles of management; management-strategy problems were also taken up, and Group-Wide QC progressed. Moreover, TQC spread from the manufacturing industry to the service industry, QC of software developed, and the "Social Quality" lectures appeared in the QC magazine in 1986.

Then, internationalization of the Deming Application Prize was decided in 1984, and Florida Power & Light Company of the U.S.A. became the first overseas recipient in 1989. ICQC'87 TOKYO was held in 1987, where the "Ten Items of the Specific Features of TQC of Japan" were announced.

Technology transfer of quality prizes started in the same year: the Malcolm Baldrige National Quality Award was founded in the U.S.A. in 1987 with reference to the Deming Prize, followed by the European Quality Award, founded in 1991.

#### **6. "Era of internationalization of TQC and TQM reconstruction" (1990-99)**

While Japanese companies were introducing the ISO 9000 quality system and the ISO 14000 environmental management system and considering their integration with TQM, the "TQM Declaration" was announced at QCS 1997, "Comprehensive Quality Management in the 21st Century" was advocated, and "Stakeholder Relationship Management", covering stockholders, customers, employees, society, the environment, and so on, came to be considered. Moreover, the systematization of TQM was proposed, and total quality in the broad sense was defined. On the other side, while "Strategic Policy Management" was advocated, "Global Quality Management (GLQM)" came to be taken into account in connection with the internationalization of TQM, and international competitiveness and TQM (a Japan-U.S.A. comparison) were also studied.

As for the Japanese economy, stagnation deepened with the collapse of the bubble. The consensus on quality management eventually slipped from people's minds, causing serious quality problems and ethical business problems. These raised apprehensions of deterioration in the quality of business management, of products and services, and of the social quality structured into Japan's industrial base, and fears that Japanese industry would lose its competitiveness in the international market. In December 1999, the "Hakone Declaration" was adopted at the QCS in Hakone as "the Design of Establishment of the Japan Organization for Quality Innovation".

"Deming Prize Application Guide" was also revised in the same year.

### **7. "Era of TQM innovation" (2000- )**

"The TQM Encouragement Prize Guide" as preceding step of the continuous promotion to Deming Prize was founded in 2000. Moreover, based on the "Hakone declaration" in 1999, "the Japan Organization for Quality Innovation (JOQI)" was founded on May 23 2001 by the cooperation with JSQC and participation of five quality related organizations, and started for the activity to the reconstruction of a Japanese Management model.

And, while learning from the way the U.S.A. revived, the Japanese evolution of TQM started.

The basic concept and the examination standard (mark system) of Deming Prize Application Guide were also revised in 2002.

A "Quality Control" magazine also renewed its name under "Quality Management" magazine, and began to study for the integration of new management with TQM, such as Top's Viewpoint and Corporate Governance, Value Management, Balanced Score Card and Customer Value Management as a new current of quality business management, began to be studied.

In November 2002, "the Asia Quality Network" was formed as a QC research and promotion organization of ten Asian nations, and its activity "to raise quality as the factory of the world" was started.

Moreover, ICQCC2003-Tokyo was held on October 17-20, 2003, with 1400 attendees from 22 countries, including 13 Asian countries; ICQCC (International Conference on QC Circles) 2011 Yokohama, held on September 11-14, 2011, was attended by 1108 participants from 14 nations, with 177 presentations.

JOQI reported on its activities on May 28, 2004, having completed "The Objective and Role of Establishment of a New Quality Management (System) Model in the 21st Century". The contents of the report, presented under its "Fundamental Idea", are as follows:

1. New Commodity Development
2. Business Process Innovation
3. Value Creation
4. Self-assessment of management system
5. Development Program of Management Director of Technology
6. Development Program of Quality Professional Specialist
7. Human Resources Development at Workshop, Evolutional QC Circle (e-QCC)
8. Structuring of Healthcare Management System based on ISO 9000


ICQ'05-Tokyo (International Conference on Quality '05, Tokyo) was held on September 13-16, 2005, and was attended by 1066 participants from 51 nations, with 165 presentations.

#### **8. "Toward new age"**


In order to recover from the dishonor of made-in-Japan products being "poor and cheap", Japan has walked the journey of QC toward realizing the vision of the first Keidanren chairman, Ichiro Ishikawa. It has become able to export products that are "good and reasonably priced" to the world, and has come to be called the economic power it is today.

In "Upcoming Century of Quality, 21 Century" which Dr. Juran is said 3), there should announce new vision, and, probably, it should carry out and cut after this. Fortunately it is stated that the 8th chairman of Keidanren, Shyoichiro Toyoda's lecture dissemination as "To be country, Japan which is a Country be trusted and respected by the World", for a future of Japan 4). We wish to move forward with this as the 21st century vision of "Quality Establishment-of-a-Country Japan" in 2003, 5).

#### **9. References**

Kenichi Koyanagi (1963): Quality Emphasis in Japan's Postwar Trade, p.4, JUSE Publishing Co.

Ichiro Ishikawa (1950): Statement of First Publication, Quality Control, Vol.1, No.1.

J.M. Juran (1994): Upcoming Century of Quality (1), ENGINEERS, September 1994, pp.1-9; the Keynote Address of the ASQ 48th AQC, May 24, 1994.

Shyoichiro Toyoda (1998): Quality Month Special Event, Lecture Gist; refer to Shyoichiro Toyoda (1996): Creation of "Attractive Japan", Toyo Keizai Sinbunnsya, p.17.

Quality Establishment-of-a-Country Japan, Quality Management, Vol.54, No.3, pp.36-38, JUSE Publisher, 2003.


**3**

**ISO-GUM and Supplements are Utilized for QA of BCA Data**

Yasuo Iwaki
*Chaos Applied Research Office, Japan*


#### **1. Introduction**


The International Organization for Standardization's Guide to the Expression of Uncertainty in Measurement (ISO-GUM) [1] was published by ISO in 1993, and corrected in 1995, as guidance for turning a measurement result into an assured statement of performance. ISO added new thinking to the conventional method. For the GUM approach to apply, two main assumptions must hold. One is that uncertainty, rather than error, is used as the expression suited to assurance when evaluating ambiguous measurement data. The other is that Exploratory Data Analysis (EDA), rather than classical statistical analysis, is used for ambiguous data. The GUM is now useful for Blood Chemical Analysis (BCA) as well, to upgrade the reliability of analysis results with assured performance.

ISO has additionally published many supplements in order to make full use of the ISO-GUM. In particular, they have come to cover Markov Chain Monte Carlo (MCMC) methods in Bayesian inference and multivariate analysis (MA), which is useful for multiple regression by the multi-nonlinear least-squares method. One of them provides a production procedure for obtaining an assurance process. The ISO-GUM is progressing step by step [2]. The result is also required for improvement of measurement accuracy and Quality Assurance (QA).
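GUM Supplement 1 treats the propagation of measurement uncertainty by Monte Carlo simulation of the input distributions, rather than by the linearized law of propagation. A minimal sketch of that idea in Python; the measurement model (a Beer-Lambert-style concentration formula) and the input values and uncertainties are invented for illustration:

```python
import random
import statistics

def monte_carlo_uncertainty(model, inputs, n=50_000, seed=1):
    """Propagate input uncertainties through `model` by Monte Carlo.

    `inputs` is a list of (best_estimate, standard_uncertainty) pairs;
    each input is sampled from a normal distribution, as GUM Supplement 1
    suggests when only an estimate and a standard uncertainty are known.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        xs = [rng.gauss(mu, u) for mu, u in inputs]
        samples.append(model(*xs))
    est = statistics.fmean(samples)
    u_out = statistics.stdev(samples)  # standard uncertainty of the output
    samples.sort()
    # 95 % coverage interval taken directly from the empirical distribution
    lo, hi = samples[int(0.025 * n)], samples[int(0.975 * n)]
    return est, u_out, (lo, hi)

# Hypothetical model: concentration = absorbance / (slope * path_length)
mean, u, interval = monte_carlo_uncertainty(
    lambda a, s, l: a / (s * l),
    [(0.52, 0.005), (0.93, 0.01), (1.00, 0.002)],
)
```

Unlike the linearized GUM formula, the coverage interval here needs no normality assumption on the output: it is read off the sorted Monte Carlo samples.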

#### **2. Background of ISO-GUM**

Planning the measurement system is very important for Measurement Systems Analysis (MSA) [3]. MSA is a specially designed experiment that seeks to identify the uncertainty components in the measurand; it is used to evaluate quantitative analysis in the medical test system, that is, the entire process of obtaining data. The inspector has to understand the measurement process well in order to ensure the Data Quality Objective (DQO) of the obtained data, and must strive for good quality analysis and assessment. Many mistakes in the total testing process are called "laboratory errors" [4], although they may be due to poor communication or poor design, which lie beyond the laboratory's control. The uncertainty element in the total measurement system is recognized as an error of poor measurement management. In such management, all processes need to be managed, from the stage preceding laboratory analysis through to the processing afterwards. The fault and risk elements of clinical healthcare have to be removed by means of the uncertainty data. In order to improve the supply of data from the laboratory, QA of the offered data is required.

The role of global and regional metrological organizations is also to be discussed in order to obtain mutual confidence between test laboratories. The main targets of the above activities may be summarized by the Mutual Recognition Agreement (MRA) between participating economies.

The ISO-GUM was published to build an international consensus based on this concept: analysis values are expressed in combination with the uncertainty of their measurement to indicate their reliability. In the fields of measurement science and clinical chemical analysis, there exist worldwide requirements for reliable and competitive evidence to confirm the measurement process and measurement results at many stages. The goal is improvement in reliability, as in Good Laboratory Practice (GLP).

The purpose is the construction of an MSA that permits diagnosis without hesitation even though some inspection data are ambiguous: selecting the right point of healthcare for a consistently suitable diagnosis, and eliminating the waste of health resources. For this reason, the work aimed to start by improving accuracy in the field of BCA as a trial of MSA, and to assure the quality of results completely through a laboratory implementation guide. Good-practice guidance research has continued since then to support the development of Quality Assurance (QA) and Quality Control (QC) in Quality Engineering (QE) with high-order accuracy. The ISO-GUM legislated accuracy assurance for the handling of measurement data [5]; ISO 15193:2009 defines this for in vitro diagnostic medical devices.

ISO/IEC Guide 98-1:2009 provides a brief introduction to the GUM in order to indicate the relevance of that fundamental guide and to promote its use. It also outlines documents related to the GUM that are intended to extend its application to broader categories and fields of practical problems, and it considers various concepts used in measurement science [6], encompassing both scientific and engineering ways of thinking. In particular, it covers the need to characterize the quality of a measurement through appropriate statements of measurement uncertainty. ISO simultaneously edited the International Vocabulary of Metrology (VIM).

#### **2.1 QA and QC**

Measurement data conditions are roughly divided into within-laboratory and between-laboratory for QC. Under within-laboratory measuring conditions, the uncertainty due to within-day variations is estimated from repeatedly measured values of the test sample within the same laboratory. As a research target in clinical examination, the accuracy of in-house Internal Quality Control (IQC) must always be kept above two sigma. Under between-laboratory measuring conditions, the uncertainty due to between-laboratory variation is estimated from simultaneously measured values of the test sample obtained at more than one laboratory, compared against the other laboratories' variation data; the accuracy of External Quality Control (EQC) secures more than three sigma, which is the international level. In both cases, the individual components are combined to obtain the standard uncertainty due to the measurement conditions. Uncertainty is a lack of certainty: a state of limited knowledge in which it is impossible to describe exactly the existing state or future outcome, and more than one outcome is possible. The necessity of accuracy beyond three sigma has been carried into IT medical systems for worldwide medical healthcare. In particular, an External Quality Assurance Scheme (EQAS) is important, since it can also serve as a common view of medical cognitive diagnosis technology. If the final reported BCA value becomes an article of commercial transaction, it is required to follow QA under the ISO standard. Quality Engineering (QE) is also a means to prevent misdiagnosis effectively. Furthermore, Statistical Quality Control (SQC) of an individual patient's data is also important for preventing clinical misdiagnosis. The result of a statistical analysis serves not only to reveal the condition of a disease, but is also utilized for exact judgments and decisions to support a point-of-care program. To accomplish this requirement, some international regulations and guides have been edited as the result of joint work among several international organizations for data quality [7][8].
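The within-laboratory (IQC) and between-laboratory (EQC) components described above are each standard uncertainties, and they combine in quadrature into one combined standard uncertainty. A minimal sketch in Python; the replicate glucose values and laboratory means are invented for illustration:

```python
import statistics

# Hypothetical replicate results (mg/dL) of one test sample in one laboratory (IQC)
within_lab = [98.2, 99.1, 97.8, 98.6, 98.9, 98.4]
# Hypothetical means of the same test sample reported by four laboratories (EQC)
between_lab = [98.5, 99.4, 97.6, 98.8]

u_within = statistics.stdev(within_lab)        # within-laboratory component
u_between = statistics.stdev(between_lab)      # between-laboratory component
u_combined = (u_within**2 + u_between**2) ** 0.5  # combined in quadrature

mean = statistics.fmean(within_lab)
U = 2 * u_combined  # expanded uncertainty, coverage factor k = 2 (~95 %)
print(f"{mean:.1f} +/- {U:.1f} mg/dL (k=2)")
```

Reporting the result as "value +/- expanded uncertainty (k=2)" is the GUM-style statement of reliability that the chapter describes.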

QA is also doing its best in the field of clinical examination, so as to respond effectively to patients and donors. The ISO-GUM is edited in series as ISO/IEC Guides 98-1 to 98-5. QA of clinical tests by the ISO-GUM was put to use in 2006.

#### **2.2 ISO and QE**

26 Quality Assurance and Management

data. In order to improve supply service of data from laboratory, it require to QA of the data

The role of global and regional metrological organizations is also to be discussed to get a mutual confidence between these test laboratories. The main targets of above activities may be summarized by the Mutual Recognition Agreement (MRA) between participating

ISO-GUM published for an international consensus based on this concept, it is emerging that analysis values are expressed in combination with uncertainty of their measurement to indicate their reliability. In the field of measurement science and clinical chemical analysis, there is exists world wide requirements for the reliable and competitive evidence to confirm the measurement process and measurement results in many stages. A goal of object is

The purpose is to construct a Measurement Systems Analysis (MSA) that supports diagnosis without hesitation, since some inspection data are ambiguous. It selects the point of healthcare at which a suitable diagnosis can always be made, and it eliminates the waste of health resources. For this reason, the work started with improving accuracy in the field of BCA as a trial of MSA, and with assuring the quality of results completely by a laboratory implementation guide. The goal of good-practice guidance research has been pursued to support the development of Quality Assurance (QA) and Quality Control (QC) in Quality Engineering (QE) with high-order accuracy. ISO-GUM legislates accuracy assurance for the handling of measurement data [5]. It is an improvement in reliability in the sense of Good Laboratory Practice (GLP) and of ISO 15193:2009, which is defined for in vitro diagnostic medical devices.

ISO/IEC Guide 98-1:2009 provides a brief introduction to the GUM in order to indicate the relevance of that fundamental guide and to promote its use. It also outlines documents related to the GUM that are intended to extend its application to broader categories and fields of practical problems, and it considers various concepts used in measurement science [6], which include both scientific and engineering thinking. In particular, it covers the need to characterize the quality of a measurement through appropriate statements of measurement uncertainty. ISO simultaneously edited the International Vocabulary of Metrology (VIM).

Measurement data conditions are roughly divided into within-laboratory and between-laboratory for QC. Under the within-laboratory condition, the uncertainty due to within-day variation is estimated from repeatedly measured values of the test sample within the same laboratory; as a research target in clinical examination, the accuracy of in-house Internal Quality Control (IQC) must always be kept above the two-sigma level. Under the between-laboratory condition, the uncertainty due to between-laboratory variation is estimated from simultaneously measured values of the test sample obtained at more than one laboratory, by comparing data between laboratories; the accuracy of External Quality Control (EQC) must secure more than a three-sigma level, which is the international level. In both cases, the individual components are combined to obtain the standard uncertainty due to the measurement conditions. Uncertainty is the lack of certainty: a state of limited knowledge in which it is impossible to exactly describe the existing state or a future outcome.

**2.1 QA and QC** 


#### **2.2.1 Measurement error**

Conventional statistical technology investigated the error of random data through subjective statistical work based on the law of large numbers; out of this came the distinction between random error and systematic error. Systematic error can be removed as bias, since it can be expressed as a fixed numerical value. Random error is difficult to remove, so results could not be read out clearly. Further, errors of the first kind (false positives) and of the second kind (false negatives) occur among the errors, and these have been a cause of misdiagnosis. To prevent misdiagnosis, all of the error factors must be corrected. Work on improving these errors has also been studied through fault states over many years in Quality Engineering (QE). The central limit theorem has become important recently. Many of these technologies are useful in common to QE and to ISO-GUM [8].
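The bias/random-error decomposition above can be illustrated with a short, self-contained Python sketch; the reference value, bias and spread are hypothetical numbers chosen for illustration, not taken from the chapter:

```python
import random
import statistics

random.seed(1)

TRUE_VALUE = 100.0   # hypothetical reference concentration
BIAS = 2.5           # systematic error: a fixed offset
SIGMA = 0.8          # random error: spread of repeated readings

# Simulate 200 repeated measurements: true value + systematic + random error.
readings = [TRUE_VALUE + BIAS + random.gauss(0.0, SIGMA) for _ in range(200)]

# Systematic error is estimated as the mean deviation from the reference
# and can then be removed as a fixed correction (bias removal).
estimated_bias = statistics.mean(readings) - TRUE_VALUE
corrected = [x - estimated_bias for x in readings]

# Random error cannot be removed, only characterized by its spread.
residual_sd = statistics.stdev(corrected)
```

After the correction the corrected mean coincides with the reference, while the residual spread remains, which is exactly why random error must be treated statistically.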

#### **2.2.2 QE**

QE assumes how often a fault state is generated; analysis starts from a hierarchical, gradient-based estimation of fault factors through Failure Mode and Effects Analysis (FMEA). QE determines the root cause of the fault element through Root Cause Analysis (RCA) and Fault Tree Analysis (FTA). These were developed based on tracer technology. In a Proficiency Test (PT), the decision procedure is followed step by step in order to discern the importance level of the fault factors.

FTA is a versatile method for dealing with probabilistic risk, reliability and availability. Although FTA was developed in the 1960s for hardware systems, it is an adaptable logic-based technique that has been applied to combined hardware and software systems. This research led to applying it to the results of BCA. The relation between QE and ISO-GUM is shown in Table 1.



#### **2.2.3 DL/AMD**

Presented here are procedures based on modern Bayesian statistics which are used to calculate characteristic limits, i.e. the decision threshold, detection limit and confidence limit, in BCA. Also indicated are key elements of these statistics which can be used for measurement of the Decision Level / Amount Minimum Detectable (DL/AMD), the DL being applied to the activity result. The AMD is a detection criterion that is insensitive to sample-specific variables such as chemical yield and detector efficiency. An example of instrumental enzyme activation analysis provides an illustration of the issues discussed.

DL/AMD was able to profit from this operation. QE was developed for the QC of industry at the six-sigma level. However, such data have been processed by regression analysis, turning the subjective frequency probability of a statistical value into a normalized distribution; regression analysis applies the least-squares method. The occurrence probability of the statistical value of a fault element, which is accompanied by natural variance, becomes an abnormal (non-normal) distribution in many cases.

As a result of this research, the Decided Level / Minimum Detectable Concentration (DL/MDC) forms a different taxonomy of uncertainties and decisions, one that includes a broader sense of uncertainty and of how it should be approached from an ethics perspective. Vagueness and ambiguity are sometimes described as "second-order uncertainty": there is uncertainty even about the definitions of the uncertain states or outcomes. The difference is that this uncertainty concerns human definitions and concepts, not an objective fact of nature; it can be avoided, whereas first-order uncertainty cannot.
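As a hedged illustration of a decision threshold and a minimum detectable amount, the following sketch uses the classical blank-based construction on a hypothetical data set; the chapter's own procedure is Bayesian, so this is only a simplified stand-in:

```python
import statistics

# Hypothetical blank (background) measurements of an enzyme activity assay.
blanks = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.95, 1.05, 1.1, 0.9]

mean_blank = statistics.mean(blanks)
sd_blank = statistics.stdev(blanks)

# Decision level (DL): a result above it is judged "detected" at roughly
# 95 % one-sided confidence (k = 1.645, normality assumed).
k = 1.645
decision_level = mean_blank + k * sd_blank

# Minimum detectable amount (AMD): the smallest true value that will be
# detected with the same confidence -- here approximated as twice the excess.
minimum_detectable = mean_blank + 2 * k * sd_blank

def detected(result: float) -> bool:
    """Classify a single measured result against the decision level."""
    return result > decision_level
```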

#### **2.2.4 Probability Density Function (PDF)**

28 Quality Assurance and Management

| Term | ISO-GUM | QE |
|---|---|---|
| Thinking | Scientific thinking | Technical thinking |
| Estimation | Uncertainty | Random error |
| Expression | Illustration | Formula |
| Result | Confidence interval | Tolerance |
| Critical | Reference | Standard |
| Unit | SD | % |
| Focus | Normal and abnormal distribution | Normal distribution only |
| Test | Nonparametric & nonlinear | Parametric & linear |
| Process | Key comparison and law of propagation of uncertainty | FMEA, FRACAS, FTA and RCA |
| Algorithm | MCMC modeling | Quadratic equation |

Table 1. Comparison of ISO-GUM and QE


A PDF expresses the distribution of uncertainty (see Fig. 1). Assignment of a PDF to a quantity under analysis uses the Principle of Maximum Entropy (PME). Two great traditions exist: one is probability theory, which considers the likely trueness of the population mean; the second is confidence-interval analysis of what we know to be the total error, where total error is systematic error plus random error. Probability bounds analysis gives the same answer as confidence-interval analysis when only range information is available. Care is needed because a gap arises between the total error and a confidence interval in many cases, for example with an abnormal distribution, and this matters when evaluating uncertainty for QA. A normal distribution is convenient for estimating the width of variation; it is the fundamental statistical quantity used in analysis of variance (ANOVA). It also gives the same answers as Monte Carlo analysis when information is abundant. If the distribution is abnormal, a trueness value becomes difficult to set up, and the reference value defined by ISO-GUM is then calculated for QA [10].

Fig. 1. Expression of error and confidence interval

#### **2.2.5 ANOVA**

In the measurement model, the input quantities are measured data of random variables of interest with many components, x = (x1, x2, …, xi); a functional relationship between the measurement result Y and the input quantities f(xi) can then be quoted, as in (1).

$$Y = f(x_1, x_2, \dots, x_i) \tag{1}$$

Here Y stands for the output quantity, which is the measurand in the VIM, whereas the xi are the multiple input quantities. This is a model with one output, which is the model adopted in the current ISO-GUM. A normal distribution is convenient for estimating the width of variation of the fundamental statistical quantities, namely the Root Sum Square (RSS), the arithmetical average (mean), the Standard Deviation (SD), the Coefficient of Variance (CV), etc. [18]
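A minimal Python sketch of these fundamental quantities, computed on hypothetical repeated readings x_i:

```python
import math
import statistics

# Hypothetical repeated readings x_i feeding the model Y = f(x_1, ..., x_i).
x = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0]

mean = statistics.mean(x)                 # arithmetical average
sd = statistics.stdev(x)                  # Standard Deviation (sample)
cv = 100.0 * sd / mean                    # Coefficient of Variance, in %
rss = math.sqrt(sum(v * v for v in x))    # Root Sum Square
```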


One of the most important indicators of random error is time. In ANOVA, the QA of measurement data must consider error theory and be ecologically effective, so the null hypothesis is basically set up in a kinetic state and a time-series function is quoted. This rule shows that the dependence consists of a trueness function f(x,t) together with error factors g(x,t), as in (2).

$$\frac{dx}{dt} = f(x,t) + g(x,t) \tag{2}$$

These are assumed to be independent functions. When error factors exist in plurality, Burgers' formula is adapted to multi-variable analysis (MA) with one or more g(x,t). ISO-GUM evaluates by a reference value even when the trueness value is unknown, and the error serves as uncertainty.

Bayes' theorem grows from the simple principle that two random variables t and x satisfy the following dependence, as in (3); the vertical bar | indicates a conditional distribution.

$$P(x,t) = P(x \mid t)\,P(t) = P(t \mid x)\,P(x) \tag{3}$$

$$P(x \mid t) = \frac{P(t \mid x)\,P(x)}{P(t)}$$
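The identities in (3) can be checked numerically on a hypothetical two-state diagnostic example (the probabilities below are invented for illustration only):

```python
# x: test result abnormal?  t: patient diseased?  (hypothetical values)
p_t = 0.1                 # prior P(t): disease prevalence
p_x_given_t = 0.9         # sensitivity P(x|t)
p_x_given_not_t = 0.05    # false-positive rate P(x|not t)

# Total probability of an abnormal result.
p_x = p_x_given_t * p_t + p_x_given_not_t * (1 - p_t)

# Bayes' rule: posterior probability of disease given an abnormal result.
p_t_given_x = p_x_given_t * p_t / p_x

# Both factorizations give the same joint probability P(x, t).
joint_a = p_x_given_t * p_t
joint_b = p_t_given_x * p_x
```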

#### **2.2.6 Key comparison and reference value**

If the trueness value is unknown, ISO-GUM switches from the trueness value to a reference value; this is gated into the posterior database that is set up by EDA as the frequency hypothesis, and it is checked with a Key Comparison (KC) method. KC compares the null hypothesis with the frequency hypothesis.

Fig. 2. Posterior distribution and prior distribution

The null hypothesis makes a first estimate of uncertainty from the prior distribution together with the experimental data. The frequency hypothesis forms the posterior distribution on the database, which is the standard posterior distribution. The reference value is estimated at the position of maximum likelihood in Fig. 2 [15].
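A sketch of this prior-to-posterior update, assuming a conjugate normal model with known measurement variance; the numbers are hypothetical and the conjugate choice is an assumption, not the chapter's exact procedure:

```python
# Normal prior for the reference value combined with new laboratory data;
# the posterior mean then plays the role of the reference value.
prior_mean, prior_var = 10.0, 0.5 ** 2      # prior distribution
data = [10.4, 10.2, 10.6, 10.3]             # experiment data
data_var = 0.3 ** 2                          # assumed known measurement variance

n = len(data)
sample_mean = sum(data) / n

# Conjugate normal-normal update: a precision-weighted average of the
# prior mean and the sample mean.
post_var = 1.0 / (1.0 / prior_var + n / data_var)
post_mean = post_var * (prior_mean / prior_var + n * sample_mean / data_var)
```

The posterior mean falls between the prior mean and the data mean, and its variance shrinks, mirroring the likelihood position indicated in Fig. 2.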

#### **2.2.7 EDA**


Exploratory Data Analysis (EDA) in ISO-GUM calculates step by step until assurance is achieved, according to the Law of the Propagation of Uncertainty (LPU). It corresponds to RCA and FTA [9] in QE. The work progresses toward the goal of full implementation of assurance. ANOVA uses the Law of the Propagation of Laboratory Error (LPE).
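The LPU step can be sketched as first-order propagation with numerical partial derivatives; the model f below is a hypothetical signal/slope ratio chosen for illustration:

```python
import math

def f(x1: float, x2: float) -> float:
    # Hypothetical measurement model: concentration = signal / slope.
    return x1 / x2

def propagate(x1, u1, x2, u2, h=1e-6):
    """First-order LPU: u(y)^2 = sum_i (df/dx_i)^2 u(x_i)^2,
    with the partial derivatives taken by central differences."""
    d1 = (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h)  # df/dx1
    d2 = (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h)  # df/dx2
    return math.sqrt((d1 * u1) ** 2 + (d2 * u2) ** 2)

u_y = propagate(20.0, 0.2, 2.0, 0.02)
```

For this model the analytic result is sqrt((u1/x2)^2 + (x1*u2/x2^2)^2), which the numerical derivatives reproduce.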

#### **2.2.8 MCMC**

Markov Chain Monte Carlo (MCMC) [4] determines a PDF numerically: probabilities are assigned to a set of possible states or outcomes of an abnormal distribution. This also includes the application of the PDF to continuous variables.

MCMC is proposed for the calculation of uncertainty and can be considered the primary method, other than the statistical method, for EDA in ISO-GUM; its results are illustrated so as to be easy to understand. MCMC follows from the derivation of the Markov formula, and the Monte Carlo method is based on the central limit theorem; it includes Gibbs sampling and the Metropolis-Hastings (M-H) algorithm.
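A minimal Metropolis-Hastings sampler, targeting a skewed (exponential) density as a stand-in for the "abnormal" distributions of the text; the step size and seed are arbitrary illustrative choices:

```python
import math
import random

random.seed(0)

def target(x: float) -> float:
    # Unnormalized Exp(1) density: a simple skewed (non-normal) target.
    return math.exp(-x) if x > 0 else 0.0

def metropolis_hastings(n: int, step: float = 1.0):
    x = 1.0
    samples = []
    for _ in range(n):
        proposal = x + random.uniform(-step, step)   # symmetric proposal
        # Accept with probability min(1, target(proposal) / target(x)).
        if random.random() < target(proposal) / target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(20000)
mean_est = sum(samples) / len(samples)    # Exp(1) has mean 1
```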

In probability and statistics, the t-distribution or Student's t-distribution is a probability distribution that arises in the problem of estimating the mean of a normally distributed population when the sample size is small. It is the basis of the popular Student's t-test for the statistical significance of the difference between two sample means, and for the difference between two population means. The Student's t-distribution is a special case of the generalized hyperbolic distribution. Student's distribution arises when the population standard deviation is unknown and has to be estimated from the data, as it is in nearly all practical statistical work. Problems treating the standard deviation as if it were known are of two kinds:

1. Those in which the sample size is so large that one may treat a data-based estimate of the variance as if it were certain.
2. Those that illustrate mathematical reasoning, in which the problem of estimating the SD is temporarily ignored, because that is not the point that the author or instructor is then explaining.
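A small-sample t-interval can be computed as follows; the critical value t_{0.975, 9} = 2.262 is hard-coded from standard tables (the Python standard library has no t-quantile function), and the data are hypothetical:

```python
import math
import statistics

# Hypothetical small sample: n = 10, so n - 1 = 9 degrees of freedom.
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
n = len(data)
mean = statistics.mean(data)
sd = statistics.stdev(data)          # SD estimated from the data itself

t_crit = 2.262                       # t_{0.975, 9}, 95 % two-sided
half_width = t_crit * sd / math.sqrt(n)
ci = (mean - half_width, mean + half_width)
```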


An example of MCMC, recommended by NIST, is shown in Fig. 3. Three kinds of distribution are built from the measured data of an abnormal distribution. The first is the rectangular (uniform) probability distribution, which makes it possible to set the six values of the primary confidence interval easily; no value is preferred over any other, and the probability of any face landing on top after throwing one die is 1/6, as derived by the Markov formula. The second is the t-distribution, which is important in both theory and practice; confidence intervals are derived from Student's t-distribution with n-1 degrees of freedom, the parameter being called the number of degrees of freedom, and this corresponds to a 95% confidence interval. The third is the normalized distribution, converted from the abnormal distribution by regression analysis. The three distributions are analyzed as Fast Fourier Transform (FFT) series, and a combined Fourier synthesis is carried out, making a new confidence interval per ISO 14253-1 (see 3.5) in a Gaussian distribution. In the next step, the Inverse Fast Fourier Transform (IFFT) forms the convolution of the Gaussian distribution F(u), and F(u) is made into the Cumulative Distribution Frequency (CDF), which is alternatively referred to in the literature as the


distribution function. The CDF is compared in a comparator, so the fault point on the curve can be found, and the cause generating the fault state must then be explored further. U is carried out as a final standard uncertainty Us, which in this case is the affinity (binding ratio P*Q/Po, in %). Fig. 3 has a spare input port for special-form distributions, e.g. the triangular distribution or the U-shaped distribution. A result equivalent to a fuzzy membership function, instead of the FFT, is also useful.
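The convolution step can be illustrated directly with a plain double sum (an FFT merely accelerates the same computation). Convolving the rectangular distribution of one die with itself gives the triangular distribution of the sum of two dice, from which the CDF follows by a running sum:

```python
def convolve(p, q):
    """Discrete convolution of two probability mass functions."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

uniform = [1.0 / 6.0] * 6                 # one die: P(face) = 1/6
triangular = convolve(uniform, uniform)   # sum of two dice: triangular shape

# Cumulative Distribution Frequency (CDF) by running sum.
cdf = []
total = 0.0
for p in triangular:
    total += p
    cdf.append(total)
```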

Fig. 3. MCMC of EDA (2002), per the NIST rule

#### **2.3 Uncertainty**

#### **2.3.1 Uncertainty theory**

Uncertainty is defined in the VIM as "a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand."

The uncertainty data can usefully be treated in two stages, according to how they are estimated.


1. The first stage utilizes the Bayesian inference of key comparison according to the LPU.
2. The second stage groups the measurement results into two categories, Type A and Type B, according to the determined distribution condition. An important distinction between the two types of statistics lies in a quite different approach to the concept of probability.

The conventional concept of probability in statistics is associated with the relative frequency of random events. Such statistics fail in the case of systematic effects, of non-linear measurement models, and of uncertainty values measured close to detection limits. Gibbs sampling is used for the collection of data by double samples.

Uncertainty associated with measuring operations within the same laboratory is estimated by applying the nested multi-variance analysis method to experimental data, with between-day variation, within-day variation and sample vial as the relevant factors. Specifically, a calibration line is generated at the time of every experiment during the days of the experiment period. The measured values thus obtained are examined for outliers; if an outlier is found, its cause is identified and the aberrant value is removed. If a problematic finding is obtained in the measurement, a new measurement is performed after investigation. Two-stage nested analysis of variance is then applied to estimate the individual variations.
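The between-day/within-day separation can be sketched with a one-way (by day) analysis of variance on hypothetical data; the chapter's two-stage nested design adds the vial factor, which is omitted here for brevity:

```python
import statistics

# Hypothetical within-laboratory data: 3 readings per day over 4 days.
days = [
    [10.1, 10.2, 10.0],
    [10.4, 10.5, 10.3],
    [ 9.9, 10.0, 10.1],
    [10.2, 10.3, 10.2],
]
n = len(days[0])      # readings per day (balanced design)

day_means = [statistics.mean(d) for d in days]

# Mean squares: within-day (pooled daily variance) and between-day.
ms_within = statistics.mean([statistics.variance(d) for d in days])
ms_between = n * statistics.variance(day_means)

# Variance components: within-day directly, between-day by moment matching.
var_within = ms_within
var_between = max(0.0, (ms_between - ms_within) / n)
```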

A state of uncertainty in which some possible outcomes have an undesired effect or significant loss is a risk. Quantitative uses of the terms uncertainty and risk are fairly consistent across fields such as probability theory, actuarial science and information theory; some fields also create new terms without substantially changing the definitions of uncertainty or risk.

#### **2.3.2 Procedure of uncertainty evaluation**

Evaluation of measurement uncertainty by the ISO-GUM recommendations can be summarized in the following steps:

1. Estimation of the standard uncertainties of the main sources: definition of the quantity being measured and of the sensitivity coefficients.
2. Calibration of the components of uncertainty for each main source.
3. Calibration of the effective degree of freedom of the combined standard uncertainty.
4. Calibration of the expanded uncertainty. However, the ISO-GUM approach exhibits some limitations:
5. Model linearization: the principle of error propagation applied to obtain the standard uncertainty truncates the Taylor series expansion at the first-order terms. This is a linear approximation that in some cases may need terms of higher order.
6. Assumption of normality of the measurand: in common practice, the distribution of the result is taken as normal, and consequently the expanded uncertainty Ue is calculated as the product of the coverage factor k and the combined uncertainty Uc.
7. Record the data evaluated for uncertainty in an open document.


#### **2.3.3 Standard uncertainty**

The operation of assurance by ISO-GUM is performed in the order of the standard uncertainty U; uncertainty is expressed as a standard deviation. The standard uncertainty Us is obtained by the MCMC process in Fig. 3. Us is defined in ISO/BIPM Guide 98 as the "uncertainty of the result x of a measurement expressed as a standard deviation".

#### **2.3.4 Combined standard uncertainty**

The combined standard uncertainty Uc combines the many standard uncertainties Us of the fault elements. Uc is defined in ISO/BIPM Guide 98 as the "standard uncertainty of the result y of a measurement when the result is obtained from the values of a number of other quantities".



When two or more fault elements exist, the multivariate analysis of Supplement 3 of ISO-GUM is used. Each element is sampled as an uncertainty element (s1, s2, …, sn).

$$U_c = \sqrt{U_{s_1}^2 + U_{s_2}^2 + \dots + U_{s_n}^2} \tag{4}$$

#### **2.3.5 Expanded uncertainty**

The expanded uncertainty Ue, used for assurance performance, is calculated by multiplying the combined uncertainty by the numerical factor k, as in (5); k is called the coverage factor [3]. The assurance value is authorized by the final expanded uncertainty Ue [11][13].

$$U_e = k \, U_c \tag{5}$$

#### **2.3.6 Coverage factor**

The coverage factor [3] is computed by the Welch-Satterthwaite formula (6) using the Effective Free Degree (EFD), denoted Veff. The degree of freedom is important as the number of samples in the maximum-likelihood method; it is refined by the Akaike Information Criterion (AIC). k is defined in ISO/BIPM Guide 98 as the "quantity defining an interval about the result of a measurement that may be expected to encompass a large fraction of the distribution of values that could reasonably be attributed to the measurand."

If the distribution of the measurand is approximated by a Student's distribution, k is taken as the tabulated Student's t-value for the given significance level. In the general case, the analytical evaluation of the EFD is still an unsolved problem; in Type B it generally contributes with infinite degrees of freedom.

$$\nu_{\mathrm{eff}} = \frac{u_c^4(y)}{\displaystyle\sum_{i=1}^{N} \frac{c_i^4\, u^4(x_i)}{\nu_i}} \tag{6}$$

An example, the use of a value of *k* other than 2 is taking k, equal to a t-factor obtained from the *t* -distribution when Uc has low degrees of freedom in order to meet the dictated requirement of providing a value that defines an interval having a level of confidence close to 95 percent.
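Equation (6) can be sketched directly; the sensitivity-weighted components below are hypothetical, and the Type B component enters with infinite degrees of freedom as noted above:

```python
def v_eff(uc_y, contributions):
    """Welch-Satterthwaite effective degrees of freedom, Eq. (6).

    `contributions` is a list of (c_i * u(x_i), v_i) pairs: each
    sensitivity-weighted standard uncertainty together with its degrees
    of freedom (use float('inf') for Type B components).
    """
    denom = sum((ciui ** 4) / vi for ciui, vi in contributions)
    return uc_y ** 4 / denom

# Two Type A components (10 and 5 dof) and one Type B component (infinite dof):
contribs = [(0.3, 10), (0.4, 5), (0.2, float("inf"))]
uc = (0.3 ** 2 + 0.4 ** 2 + 0.2 ** 2) ** 0.5
print(round(v_eff(uc, contribs), 1))  # 14.2
# A t-table then gives k ~ 2.14 for about 14 effective dof at 95 % coverage.
```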

#### **3. ISO-GUM and assurance**

The ISO-GUM was legislated to enforce QA based on Quality Engineering (QE).

The standardization of measurements is a high priority in laboratory medicine, its purpose being to achieve closer comparability of results obtained by routine measurement procedures. The GUM has been increasingly adopted as a de facto standard procedure by calibration laboratories (see ISO 17025). The GUM has been formally adopted as a US national standard in the form of ANSI/NCSL Z540-2-1997, and ISO edited ISO 15189 for medical use.

34 Quality Assurance and Management

#### **3.1 To apply the ISO-GUM**

To apply the ISO-GUM approach, two main assumptions must hold (12).
#### **3.2 Type A uncertainty estimation**

A Type A evaluation of standard uncertainty may be based on any valid statistical method for treating data. Examples are calculating the standard deviation of the mean of a series of independent observations, and using the method of least squares to fit a curve to data in order to estimate the parameters of the curve and their standard deviations.

Type A is defined by the ISO guide as "uncertainties evaluated by the statistical analysis of a series of observations". Type A assumes only the normal probability distribution of measurement data, and a useful parametric test method is ANOVA, so-called classical statistics. The normal distribution, also called the Gaussian or bell-shaped curve, is ubiquitous in nature and in statistics because of the central limit theorem: every variable element can be modeled as the sum of many small variations, which are approximately normally distributed. It is the only stable distribution having all of its moments finite.
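A minimal Type A sketch, computing the experimental standard deviation of the mean for a series of hypothetical repeated observations:

```python
import statistics

def type_a_uncertainty(observations):
    """Type A standard uncertainty: experimental standard deviation of the mean."""
    s = statistics.stdev(observations)      # sample standard deviation, n-1 dof
    return s / len(observations) ** 0.5

readings = [10.1, 9.9, 10.0, 10.2, 9.8]     # hypothetical repeated observations
print(round(type_a_uncertainty(readings), 4))  # 0.0707
```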

#### **3.3 Type B uncertainty estimation**

Type B is a higher-grade evaluation of uncertainty than Type A, covering abnormal (non-normal) distributions. In order to obtain assurance it must employ methods other than the statistical one, based on non-parametric test methods; MCMC was proposed as such a method for Type B. Type B is defined in the ISO/BIPM guide as "uncertainties evaluated by means other than the statistical analysis of a series of observations."

The assignment of a PDF to a quantity under analysis uses the Principle of Maximum Entropy (PME). The PME tells us that the assigned PDF may be rectangular, a t-distribution, or a normal distribution (see Fig.3). A Type B evaluation of standard uncertainty is usually based on scientific judgment using all available information. A Type A evaluation of uncertainty based on limited data is not necessarily more reliable than a soundly based Type B evaluation.
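For the rectangular and triangular PDFs mentioned above, the GUM's standard conversions to a standard uncertainty are u = a/sqrt(3) and u = a/sqrt(6) for half-width a. A small sketch (the certificate limit is hypothetical):

```python
import math

def type_b_rectangular(half_width):
    """Type B standard uncertainty for a rectangular PDF of half-width a: u = a / sqrt(3)."""
    return half_width / math.sqrt(3)

def type_b_triangular(half_width):
    """Type B standard uncertainty for a triangular PDF of half-width a: u = a / sqrt(6)."""
    return half_width / math.sqrt(6)

# E.g. a calibration certificate quoting limits of +/-0.5 units with no further detail:
print(round(type_b_rectangular(0.5), 4))  # 0.2887
```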

#### **3.4 Gibbs sampling by Bayesian theory**

Gibbs sampling is an algorithm to generate a sequence of samples from the joint probability distribution of two or more random variables. The purpose of such a sequence is to approximate the joint probability distribution, or to compute an integral (such as an expected value). Gibbs sampling is a special case of the M-H algorithm, and thus an example of a practical MCMC algorithm. The algorithm is named after the physicist J. W. Gibbs, in reference to an analogy between the sampling algorithm and statistical physics. The algorithm was devised by the brothers Stuart and Donald Geman, some eight decades after the passing of Gibbs.

Gibbs sampling is applicable when the joint distribution is not known explicitly but the conditional distribution of each variable is known. The Gibbs sampling algorithm generates an instance from the distribution of each variable in turn, conditional on the current values of the other variables. It can be shown that the sequence of samples constitutes a Markov chain, and the stationary distribution of that Markov chain is just the sought-after joint distribution. Gibbs sampling is particularly well adapted to sampling the posterior distribution of a Bayesian network, since Bayesian networks are typically specified as a collection of conditional distributions. The point of Gibbs sampling is that, given a multivariate distribution, it is simpler to sample from a conditional distribution than to marginalize by integrating over a joint distribution.

It is now the definitive supplement document of the ISO-GUM on evaluation.

#### **3.5 Assurance procedure**

Quality Assurance (QA) of a measurement result is substantiated by new proposals almost every year; QA is doing its best in the field of clinical examination as well, to be able to respond to patients and donors effectively. The assurance performance of the ISO-GUM comes from setting up the confidence interval [4]. Therefore, the value acquired by the regulations of the ISO-GUM is considered an assurance performance. The ISO-GUM is edited as a series, 98-1 to 98-5, as a guide of ISO/IEC. It has been published by the Joint Committee for Guides in Metrology (JCGM); the document numbers are JCGM 100-107. ISO 14253 [10] was also published as an ISO standard.

ISO 14253 (see Fig.4) contains decision rules which require the tolerance zone to be reduced by the measurement uncertainty. To prove conformance to a specification, the measurement data are reduced by the measurement uncertainty; to prove nonconformance, they are expanded by it. This is akin to the legal phrase "prove beyond a reasonable doubt".

A specification has two clear limit lines. Uncertainty makes the question of conformance more complex. In a drawing specification it is usually clear what the tolerance limits are; there may be a maximum acceptable value at the upper limit, while the lower limit is a reject line.

The method of setting up a confidence interval (zone) is defined by ISO 14253-1 in the same way as the conformance zone, and the measurement uncertainty is increased.[14]

However, the ISO-GUM approach exhibits some limitations, like:

1. Model linearization: the principle of error propagation applied to obtain the standard uncertainty truncates the Taylor series expansion at first-order terms. This is a linear approximation that in some cases may require higher-order terms.
2. Assumption of normality of the measurand (z): in common practice the distribution of the result is taken as normal, and consequently the expanded uncertainty U(e) is calculated as the product of the coverage factor k and the combined uncertainty U(c). Thus k=2 is the most commonly declared value, corresponding to a level of significance of approximately 95% in the 2-sigma zone.
3. Calculation of the effective degrees of freedom: if the distribution of z is approximated by a Student's t-distribution, the coverage factor k is taken as the tabulated Student's t-value for the given significance level, with the effective degrees of freedom calculated by the Welch-Satterthwaite equation (6). In the general case the analytical evaluation of the effective degrees of freedom is still an unsolved problem; Type B uncertainties generally contribute with infinite degrees of freedom. Hence the ISO-GUM 95 approach exhibits some limitations.
4. The modern concept of evaluating measurement-result uncertainty is based on the model function Y=f(x1,x2,…,xi), as Eq.(1). Model linearization, the principle of error propagation applied to obtain the standard uncertainty, truncates the Taylor series expansion at first-order terms, a linear approximation that in some cases may need higher-order terms.
5. Eq.(1), based on a first-order Taylor series expansion, is an approximation. Uc is obtained as a geometric mean of the Type A and Type B results. This single-output model is the one adopted in the current ISO-GUM. Knowledge of the input quantities, which must be complete, comes from their PDFs. While the PDF has good theoretical foundations, the process of measurement modeling does not yet have them; there are no clues about it in the ISO-GUM. It depends on experimental data and prior knowledge.
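The Gibbs scheme of §3.4, drawing each variable in turn from its full conditional, can be sketched on a toy target. Everything here is illustrative: the target is a standard bivariate normal with correlation rho, whose conditionals are known analytically as x | y ~ N(rho*y, 1 - rho^2):

```python
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=1):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is known, so we alternate draws from
    p(x | y) and p(y | x); the chain's stationary distribution is the
    sought-after joint distribution.
    """
    rng = random.Random(seed)
    sd = (1 - rho * rho) ** 0.5
    x = y = 0.0
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)   # draw y from p(y | x)
        if i >= burn_in:
            samples.append((x, y))
    return samples

pts = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
mean_x = sum(p[0] for p in pts) / len(pts)     # should be close to 0
corr = sum(p[0] * p[1] for p in pts) / len(pts)  # E[xy] ~ rho for standard normals
print(round(mean_x, 2), round(corr, 2))
```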

Fig. 4. Assurance interval by ISO-14253-1
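The ISO 14253-1 decision rule illustrated in Fig. 4 can be sketched as follows: the conformance zone is the tolerance zone reduced by the expanded uncertainty U on each side, and outside the tolerance zone enlarged by U lies proven non-conformance. The tolerance limits and U below are hypothetical:

```python
def conformance_decision(measured, lower_tol, upper_tol, expanded_u):
    """Decision rule in the spirit of ISO 14253-1 (sketch).

    Between the conformance and non-conformance zones lies an
    uncertainty zone where neither conformance nor non-conformance
    can be proved.
    """
    if lower_tol + expanded_u <= measured <= upper_tol - expanded_u:
        return "conformance proved"
    if measured < lower_tol - expanded_u or measured > upper_tol + expanded_u:
        return "non-conformance proved"
    return "uncertainty zone: neither proved"

print(conformance_decision(10.0, 9.0, 11.0, 0.4))  # conformance proved
print(conformance_decision(10.9, 9.0, 11.0, 0.4))  # uncertainty zone: neither proved
```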

If the measurand with its expanded uncertainty falls within the conformance zone (confidence interval or confidence zone), its value is authorized as the final report value, as assurance performance.

#### **4. Supplement of ISO-GUM**

#### **4.1 List of supplement**

The list of supplements to the ISO-GUM is shown below (the mark \* denotes items still planned).

As an introduction to the GUM-related documents, three supplements have already been published:

1. Supplement 1: Numerical methods for the propagation of distributions using an MCMC method [3]. Document JCGM 101.
2. Supplement 2: Extension to any number of output quantities, useful for multi-output models. JCGM 102.
3. Supplement 3: Modeling, useful for multivariable analysis. JCGM 103.

An introduction to the GUM lists four further supplement documents under planning for the ISO-GUM (3):

4. \*Supplement 4: An introduction to the GUM and related documents. Published 2009. JCGM 104.
5. \*Supplement 5: Concepts and basic principles of measurement uncertainty evaluation.
6. \*Supplement 6: The role of measurement uncertainty in deciding conformance to specified requirements.
7. \*Supplement 7: Application of the least squares method.

The Data Quality Assurance Object (DQAO) development process consists of the following seven steps, per Supplement 4:

**Step 1.** State the Problem. **Step 2.** Identify the Decision. **Step 3.** Identify the Inputs to the Decision. **Step 4.** Define the Study Boundaries. **Step 5.** Develop a Decision. **Step 6.** Specify Acceptable Limits on Decision Errors. **Step 7.** Optimize the Design for Obtaining Data.

#### **4.2 MA in supplement 3**

Practical use of multivariable analysis (MA) is required for BCA, since two or more uncertainty elements are inherent. MA is defined in Supplement 3. As for the algorithm, the theoretical formula (3) is useful. Supplement 7 is considering using the least squares method as a base, with the Burgers equation and the Crank-Nicolson method for MA. Propagation of uncertainty over several variables can be simplified considerably if the model is a simple product of secondary variables.

QA of a measurement result is substantiated by new proposals almost every year.

The practical use of MA is required for BCA with two or more uncertainty elements. It is expected that the analysis results can uncover new uncertainty factors.

#### **5. Measurement principle**

#### **5.1 Radio-Immuno Assay (RIA)**

The experiment uses test reagents of RIA, which is a kind of BCA. RIA is a scientific method used to test antigens (for example, hormone levels in blood) without the need for a bioassay. It involves mixing known quantities of radioactive antigen with antibody to that antigen, then adding unlabeled (cold) antigen and measuring the amount of labeled antigen displaced. Initially, the radioactive antigen is bound to the antibody. When cold antigen is added, the two compete for antibody binding sites, and the bound antigens are separated from the unbound ones. The radioactive isotope used is the gamma emitter I-125. The method is both extremely sensitive and specific, but it requires special precautions because radioactive substances are used, needs sophisticated apparatus, and is expensive. In this research, the main data are from the Elastase-1 reagent, which tests a sort of pancreas hormone.[22]

Immunoassays are a form of macromolecular binding reaction; no covalent chemical bonding is involved. Antibodies interact with their antigen by weak hydrogen bonding and van der Waals forces. Antigen-antibody reactions depend on complementary matching shapes assumed by the antibody variable regions of the immunoglobulin. Almost all polyclonal antibodies used in immunoassay reactions are of the immunoglobulin G class. The N-terminal 110 amino acid residues of both the heavy and light chains of the immunoglobulin molecule are variable in sequence and interact to form the antigen binding site. This variability gives rise to a vast array of different antibodies binding to different molecules. The attributes of an ideal immunoassay are as follows:

1. The immunochemical reaction behavior should be identical and uniform for both the inference preparation and the analysis in the homogeneous sample.
2. The immunochemical reaction of the antibody reagent is uniform from batch to batch.
3. The immunochemical method is well standardized, to ensure that the size of the measurement signal is caused only by the antigen-antibody product.
4. For macromolecules the results are declared in arbitrary units, i.e. International Units (IU); conversion to mol/L (SI) units is constant and depends on many factors.

The immuno-reaction of an antigen and a catalyst (an antibody activity) leads to the measurement theory of RIA through reaction kinetics. Reaction kinetics is expressed as the chemical equivalent amount of compound per unit time t. The chemical reaction is shown in Fig 5. The process of chemical change is divided into three phases: a signs phase, a growth phase and a stagnation phase. The signs phase shows the resistance characteristic of the reaction in its preparation step. The growth phase shows the reaction capability in the stage where the reaction grows rapidly. The stagnation phase, with its asymptote, is the stage where the chemical reaction finishes and reaches chemical equilibrium. An uncertainty element can be expressed as an abnormal state of reaction time or reaction capability.

The strength of binding is determined by the equilibrium binding constant for antigen-antibody complex formation. The binding follows basic thermodynamic principles, with a reversible reaction between two molecules. This relationship is described by the chemical reaction model (7), which combines a chemical kinetics reaction R1 of immunity with a reversible reaction R2 in a polynomial expression.

The key aspects of the antigen-antibody interaction are:

1. The reaction is reversible and favors complex formation under physiological conditions.
2. Binding depends on hydrogen bonds, van der Waals forces, ionic bonds, and hydrophobic interactions.
3. Binding is very specific and requires the correct 3-D structure of the antigen.
4. The amount of complex formed depends on the concentrations of antibody and antigen. Both antigen and antibody (if large enough) have multiple sites for binding to occur; therefore extensive cross-linking can occur when both are present in solution. When antibody and antigen reach equivalence, large immune complexes form which can precipitate out of solution.

$$P+P^*+Q\ \underset{R2}{\overset{R1}{\Longleftrightarrow}}\ P^*Q+PQ+PI+P^*I \rightarrow P^*Q \tag{7}$$

Po is invariably a usual constant of nature in the "law of mass action"; in this case the measured quantity must be P\*Q/Po, the effective binding ratio. Affinity is the same as the kinetic reaction rate: affinity increases until the deactivation of saturation on which the reaction is based in the reaction process over time. The kinetic chemical reaction accelerated by biomaterials is modeled as a quadratic differential polynomial in reaction time t. The second-order differential equation, according to the three phases, is (8):

$$A\frac{d^2(P^*Q)}{dt^2}+B\frac{d(P^*Q)}{dt}+C(P^*Q) \tag{8}$$

The extent of reaction is shown in Fig.5. The second-order derivative in the first term gives the rate of acceleration, which includes a special resistance phase; it corresponds to the start of the reaction curve in Fig.5. The first-order derivative in the next term gives the reaction velocity. The last term gives the amount of chemical compound after chemical equilibrium is reached, as stoichiometry. The first term plays an important role for the security of the reaction and for outlier analysis.

Fig. 5. Elastase-1 chemical reaction

The kinetic reaction ratio is estimated as follows:

$$d[P^*Q]/dt = R1[P^*][Q] - R2[P^*Q] \tag{9}$$

$$k1[P^*][Q] = (R1-R2)[P^*Q] \tag{10}$$

In the final reaction, the stagnation phase, chemical equilibrium is reached. Then d[P\*Q]/dt = 0, and (9) changes to (10). The affinity state is shown on the reaction curve over time.
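Equation (9) can be integrated numerically to reproduce the growth and stagnation phases described above. The rate constants and initial concentrations below are hypothetical, and free antigen and antibody are simply depleted as the complex forms:

```python
def simulate_binding(R1, R2, P0, Q0, dt=0.001, t_end=10.0):
    """Euler integration of the binding kinetics d[P*Q]/dt = R1[P*][Q] - R2[P*Q], Eq. (9).

    The trajectory rises through the growth phase and flattens in the
    stagnation phase as chemical equilibrium (d[P*Q]/dt -> 0) is approached.
    """
    pq = 0.0
    steps = int(t_end / dt)
    for _ in range(steps):
        p_free = P0 - pq          # free labeled antigen remaining
        q_free = Q0 - pq          # free antibody remaining
        pq += dt * (R1 * p_free * q_free - R2 * pq)
    return pq

# Hypothetical rate constants and initial concentrations:
pq_eq = simulate_binding(R1=1.0, R2=0.1, P0=1.0, Q0=1.5)
print(round(pq_eq, 3))  # close to the analytic equilibrium root 0.864
```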

#### **5.2 Calibration curve**

In this research the purpose is to improve the accuracy of the calibration curve used for quantitative analysis and to bring it to a level which can be assured. The accuracy of the calibration curve is important as the intermediate accuracy of the whole measurement system. The measurement data were collected by equalization of duplicate samples in order to achieve high-order accuracy. The Data Quality Object (DQO) of the accuracy improvement targeted the calibration curve for chemical quantitative analysis.

Even in the case of measurements requiring multipoint linear (or non-linear) calibration with five or more different concentration reagents of homogeneous reference material, the uncertainty of routine test values can be quantified using basically the same procedure as "Estimating the uncertainty of routine test values", except that the uncertainty of the reference material is calculated as a combined standard uncertainty using the mean value obtained by averaging the standard uncertainties (see Fig.6).

The reference material used should be an actual sample whose properties are similar to the patient specimen to be assayed. Even in multi-point linear calibration with three or more different concentrations of reference material, the uncertainty of routine test values can be quantified using basically the same procedures. The test reagent is called a calibrator.

All the product calculation curves have their quality verified by the Michaelis-Menten formula.

The key aspect is the antigen-antibody interaction.


#### **6. Experiment method**

#### **6.1 Progress of work**

Immunochemical analysis can involve nonlinear analysis, which is applicable to estimating the uncertainty of the assigned values of calibrators and QA control samples.

Fig. 6. Making a probability distribution of the calibration curve batch, for QC [16].
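The regression step of Fig.6, turning calibrator measurements into a standard curve and reading unknowns off it, can be sketched as follows. The log-linear model, doses and affinities here are hypothetical stand-ins; a real RIA standard curve is usually fitted with a more flexible model such as a four-parameter logistic:

```python
import math

def fit_log_linear(doses, responses):
    """Least-squares fit of response = a + b*log10(dose): a simplified
    stand-in for the regression step that turns calibrator measurements
    into a calibration (standard) curve."""
    xs = [math.log10(d) for d in doses]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(responses) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, responses)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def dose_from_response(a, b, response):
    """Invert the fitted curve to read an unknown sample's dose."""
    return 10 ** ((response - a) / b)

# Hypothetical calibrator doses (IU) and measured affinities (bound fraction):
doses = [50, 150, 500, 1500, 5000]
affinity = [0.82, 0.65, 0.45, 0.28, 0.10]
a, b = fit_log_linear(doses, affinity)
print(round(dose_from_response(a, b, 0.5), 1))  # dose estimate for an unknown with affinity 0.5
```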


The purpose of such a sequence is to approximate the joint probability distribution, or to compute an integral (such as an expected value). The estimation of a possible discrepancy takes into account both random and systematic error in the measurement process. The distinction to keep in mind is between errors that can be corrected and those that cannot be corrected, at least theoretically.

Gibbs sampling is particularly well suited to sampling the posterior distribution of a Bayesian network, since such networks are typically specified as a collection of conditional distributions. The point of Gibbs sampling is that, given a multivariate distribution, it is simpler to sample from a conditional distribution than to marginalize by integrating over the joint distribution. The useful algorithms are soft-computing methods, fundamental statistical methods, MCMC and MA; the soft-computing methods are fuzzy functions and chaos functions.
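As a minimal, self-contained sketch of the Gibbs idea (using a textbook toy case, not the chapter's assay data), consider a standard bivariate normal with correlation `rho`: each full conditional is a univariate normal, so each Gibbs step is a single easy draw, exactly the point made above.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional x|y and y|x is N(rho * other, 1 - rho**2)."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    x, y = 0.0, 0.0
    xs, ys = [], []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)   # draw x given current y
        y = rng.gauss(rho * x, sd)   # draw y given new x
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = gibbs_bivariate_normal(rho=0.5, n_samples=20000)
mean_x = sum(xs) / len(xs)                            # should be near 0
corr = sum(a * b for a, b in zip(xs, ys)) / len(xs)   # should be near rho
```

The chain's empirical mean and correlation converge to the target distribution's values without ever evaluating the joint density.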

#### **6.2 Data quality objective (DQO)**

This experiment mainly used test reagent kits for Elastase-1, a pancreatic enzyme test reagent. The test method is RIA [17]; RIA is a kind of BCA whose detectable marker is a label of the radioactive material I-125. The measurement method is detection of the gamma rays that I-125 emits.

The calibration curve is made from the test reagent at six sorts of concentration (dose), arranged harmoniously in a stair-step shape. A calibration curve is created by applying regression analysis to the frequency density of the measurement results. The reagent concentrations comprise six doses: 0, 50, 150, 500, 1500 and 5000. Dose is expressed in international catalytic units (IU). The test reagent is called a calibrator. The total sample size is 320 sets of calibrators, divided into three groups (lots): one group of 120 sets and two groups of 100 sets. One group is of the minimum size recommended for ANOVA; three groups were made here in order to investigate whether differences exist between groups. Fig. 6 shows the procedure that creates the distribution of statistics from the population of calibration curves to which regression analysis was applied: the frequency density distribution is created from the measurement data and analyzed.

In an immunoassay, the total amount of antigen in the standard solution is known for each dose. Plotting affinity (binding ratio) against total antigen produces the standard curve. The value of affinity is determined by the radioactivity remaining bound divided by the total amount of radioactivity added at the beginning; the radioactivity is proportional to the concentration of the labeled antigen. Using this standard curve and the affinity of an unknown sample, the antigen concentration can be calculated. The dose-response curve shown is a typical RIA standard curve, used as the calibration curve.
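The inversion step, reading an unknown's dose off the standard curve from its measured affinity, can be sketched as below. The six affinity values are the per-dose mean affinities reported later in Table 2; the piecewise-linear interpolation is an assumption for illustration (the chapter fits a nonlinear regression curve instead).

```python
def concentration_from_affinity(affinity, curve):
    """Invert an RIA standard curve by linear interpolation.
    `curve` is a list of (dose, affinity) points, affinity falling
    as dose rises, as in a typical RIA standard curve."""
    pts = sorted(curve, key=lambda p: p[1])          # ascending affinity
    for (d_hi, a_lo), (d_lo, a_hi) in zip(pts, pts[1:]):
        if a_lo <= affinity <= a_hi:
            t = (affinity - a_lo) / (a_hi - a_lo)
            return d_hi + t * (d_lo - d_hi)
    raise ValueError("affinity outside the calibrated range")

# Mean affinity (%) per dose, taken from Table 2 of this chapter
curve = [(0, 68.4), (50, 61.8), (150, 52.2),
         (500, 34.7), (1500, 20.8), (5000, 10.5)]
print(concentration_from_affinity(61.8, curve))  # -> 50.0
```

An unknown whose affinity falls between two calibrator points gets a dose between the corresponding calibrator doses.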


Fig. 6. Making a probability distribution from a batch of calibration curves, for QC [16].

#### **6.3 Additional reagent samples**

In this experiment, the Elastase-1 sample data were verified by adding two other homeopathic test reagents of the same RIA usage and test checking. Furthermore, in order to investigate an information criterion number and a likelihood test, multivariable analysis in the time series was used.

Two sorts of homeopathic reagents were chosen: Thyroxin and Testosterone.

The Testosterone reagent is a sex-hormone reagent whose concentrations comprise six doses: 25, 50, 100, 250, 500 and 1000. The Thyroxin reagent is a thyroid-hormone reagent whose concentrations comprise five doses: 0, 3.0, 6.0, 12 and 24.

#### **7. Experiment result**

#### **7.1 PDF data of Elastase-1 reagents**

Measured data are shown in Fig. 7 and Fig. 8. Both figures quote a Pareto-style graph that piles the quasi-normal distribution curve (solid curve B) on the measured PDF graph (bar graph A). Fig. 7 shows the 0 dose reagent data, which have the largest affinity among the six sorts of calibrators of the reagent; Fig. 8 shows the 5000 dose reagent data, which have the lowest affinity among the six dose sorts. The bar data serve as the basic data for the analysis that follows, shown as observation data A in Fig. 7 and Fig. 8.

The quasi-normal distribution is created by regression analysis from the measured frequency distribution, shown as B in Fig. 7 and Fig. 8. The ordinate shows the frequency density count of the total 320 sample sets. The abscissa shows affinity (%) divided into 20 steps between the maximum and minimum affinity.

The PDF graph varies randomly over the range of the sample groups. The random variation of the PDF is completely characterized in both figures.

Both figures were chosen as representative examples from the six dose data. The PDF data of all six concentrations showed abnormal distributions, and no two of the six PDF forms were the same. For abnormal distributions, neither skewness nor kurtosis can be computed.

Fig. 7. PDF of 0 dose data of Elastase-1

Fig. 8. PDF of 5000 dose data of Elastase-1

#### **7.2 The superposition graph of six PDF data**

44 Quality Assurance and Management


Fig. 9 shows a superposition graph of the six measured PDF forms over the variation of affinity (%) between the maximum and minimum positions. The PDFs do not all have the same form.

Fig. 10 shows a superposition graph of the abnormal distribution curves of the six sorts obtained by regression analysis; the distributions do not share the same peak position.

The ordinate is the frequency count of the total 320 sample sets. The abscissa shows affinity (% = P·Q/P0). The abnormal PDF distributions require creating the CDF, which is then followed up with MCMC.

Fig. 9. A superposition graph of the six measured PDFs.

Fig. 10. A superposition graph of the six abnormal PDFs.


#### **7.3 Fundamental statistics quantities**

Table 2 summarizes the fundamental statistics quantities (AIC, mean, RSS, Medi, peak, Max, Min, F-test and t-test) and the EDA data, together with the operation results by chaos theory and fuzzy logic. Max is the maximum value at the upper limit of the confidence interval and Min is the minimum value at the lower limit of the confidence interval; Max and Min use the primary confidence interval. The fuzzy logic uses a 20-step membership function, and the chaos theory uses a nonlinear difference equation.

Many columns of Table 2 show the same value. Six rows (Fuzzy, AIC, mean, RSS, Medi and peak) are shown in bold. The focus among these equal values is the mean: the mean can be used as the central value of the data in this experiment, and it can be set as a reference value by Type A evaluation for internal quality control (IQC). The assurance interval by Type A was calculated at the 95% level with coverage factor k = 2. This is the 2-sigma routine test level, and it can assure the 95% level given by the 0.025 and 0.975 fractiles [7]. If a measured value lies outside the limit lines, its dispersion is too large, so the affinity is regarded as a low value.
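The Type A evaluation described above (mean as reference value, k = 2 assurance interval at the 95% level) can be sketched as follows; the repeated measurement values are hypothetical, not the chapter's data.

```python
import math
import statistics

def type_a_interval(values, k=2.0):
    """ISO-GUM Type A evaluation: the mean is the reference value, the
    standard uncertainty is the SD of the mean, and the assurance
    interval uses coverage factor k (k = 2 for about the 95% level)."""
    mean = statistics.mean(values)
    u = statistics.stdev(values) / math.sqrt(len(values))  # u_A
    return mean, mean - k * u, mean + k * u

# Hypothetical repeated affinity measurements for one calibrator
vals = [68.1, 68.5, 68.3, 68.6, 68.0, 68.3, 68.4, 68.2]
mean, lo, hi = type_a_interval(vals)
```

Only measured values falling inside `[lo, hi]` would be treated as assured under this rule.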


Cipher code of results:

Chaos: chaos theory, nonlinear difference equation
Fuzzy: 20-step membership function
AIC: likelihood function (Akaike Information Criterion)
RSS: Root Sum Square
Medi: central value (median)
Peak: peak point
Max: maximum value of the distribution
Min: minimum value of the distribution
F-test: F-distribution
t-test: t-distribution
EFD: Effective Freedom Degree

| Dose | 0 | 50 | 150 | 500 | 1500 | 5000 |
|---|---|---|---|---|---|---|
| Chaos | 71.84 | 65.94 | 54.93 | 36.77 | 22.52 | 12.17 |
| Fuzzy | **68.5** | **61.8** | **52.1** | **34.8** | **20.7** | **10.4** |
| AIC | **68.46** | **62.42** | **53.0** | **34.97** | **21.34** | **10.59** |
| mean | **68.4** | **61.8** | **52.2** | **34.7** | **20.8** | **10.5** |
| RSS | **68.3** | **61.9** | **52.1** | **34.7** | **20.7** | **10.4** |
| Medi | **68.3** | **61.7** | **53.1** | **34.8** | **20.6** | **10.4** |
| peak | **68.3** | **62.2** | **52.1** | **34.9** | **21.5** | **10.6** |
| Max | 78.2 | 71.8 | 60.0 | 43.1 | 28.9 | 17.1 |
| Min | 46.7 | 41.1 | 34.4 | 21.7 | 12.4 | 6.9 |
| F-test | 2.17 | 2.16 | 2.17 | 2.17 | 2.17 | 2.17 |
| t-test | 2.03 | 2.02 | 2.03 | 2.03 | 2.03 | 2.02 |
| EFD | 0.18 | 0.17 | 0.19 | 0.17 | 0.17 | 0.18 |

Table 2. Fundamental statistics quantities by Type A

#### **7.4 CDF data**

The CDF data are shown in Fig. 11 and Fig. 12. These figures quote a Pareto-style graph that piles the CDF distribution curve (solid curve B) on the PDF (bar graph A). Fig. 11 shows the data of the 0 dose reagent, which has the largest affinity of the six sorts of reagent, and Fig. 12 shows the data of the 5000 dose reagent, which has the lowest affinity of the six sorts.

This research analyzed the abnormal portions seen on the CDF curves in Figs. 11 and 12. To acquire still higher measurement accuracy under Type B of ISO-GUM, it is required to analyze the fault elements by FTA and EDA; abnormal distributions must be processed by Type B. The data of Fig. 7 and Fig. 8 served as useful basic data in this research. In both Figs. 11 and 12 the reagent kit is divided into three groups (lots), which shows the influence used to verify differences between reagent kits. The ordinate is the frequency count of the total 320 sample sets. The abscissa shows affinity (% = P·Q/P0), with the variation of affinity divided into 20 ranks between maximum and minimum.
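The step that turns the 20-rank PDF bar data into the CDF curve can be sketched as a cumulative sum of normalized frequencies; the counts below are hypothetical, shaped only to mimic a skewed distribution.

```python
def cdf_from_frequencies(freqs):
    """Empirical CDF from per-rank frequency counts: a running sum of
    counts divided by the total, one value per affinity rank."""
    total = sum(freqs)
    cdf, running = [], 0
    for f in freqs:
        running += f
        cdf.append(running / total)
    return cdf

# Hypothetical counts over 20 affinity ranks (a skewed shape)
freqs = [1, 2, 4, 9, 15, 22, 30, 38, 41, 39,
         33, 26, 19, 14, 10, 7, 5, 3, 1, 1]
cdf = cdf_from_frequencies(freqs)
```

Bends or flat steps in the resulting sigmoid are the "abnormal portions" that the FTA/EDA analysis then investigates.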

Fig. 11. Elastase-1, 0 dose, 3 lots graph.

Fig. 12. Elastase-1, 5000 dose, 3 lots graph.

| Dose | 0 | 50 | 150 | 500 | 1500 | 5000 |
|---|---|---|---|---|---|---|
| SD Central | 62.4 | 56.4 | 47.2 | 32.4 | 20.6 | 12.0 |
| 0.95UA | 8.64 | 8.43 | 7.03 | 5.86 | 4.53 | 2.8 |
| CIA | 53.8–71.0 | 48.0–64.9 | 40.2–54.2 | 26.5–38.3 | 16.1–25.2 | 9.2–14.8 |
| ND Central | 68.3 | 62.2 | 52.1 | 34.9 | 21.5 | 10.6 |
| 0.95UB | 7.26 | 6.54 | 6.11 | 4.51 | 3.11 | 1.84 |
| CIB | 61.0–75.6 | 55.7–68.7 | 46.0–58.21 | 30.3–39.4 | 18.4–24.6 | 8.76–12.4 |
| Us UC | 11.2 | 10.67 | 9.22 | 7.32 | 5.40 | 3.29 |
| AI CIC | **62.4–73.9** | **56.8–67.7** | **46.1–56.7** | **31.3–38.6** | **18.8–24.2** | **9.0–12.2** |

Cipher code of results:

SD: Standard Deviation
ND: Normalized Distribution
Us: Standard Uncertainty
AI: Assurance Interval
Central: central value
0.95UA: 0.95 × 2 uncertainty, Type A
0.95UB: 0.95 × 2 uncertainty, Type B
CIA: Confidence Interval, Type A
CIB: Confidence Interval, Type B
UC: Combined Uncertainty
CIC: Confidence Interval for assurance

Table 4. Accuracy intervals and related data

The CDF curves show a sigmoid form, and bending is seen in some portions. The bends in a curve suggest that abnormalities existed in the chemical reaction.

All of the CDF results (including those of the other four doses) showed similarly abnormal distributions, no two of the same form. The abnormal PDF distributions call for MA, that is, nonparametric tests and nonlinear analysis.

#### **7.5 Confidence zone (interval)**

The QA protocol was developed to estimate the uncertainty of measurement of a chemical analysis by utilizing in-house validation studies. The approach was to generate an estimate of the uncertainty across the analytical concentration range [21].

Table 3 shows the calculation results of the Welch-Satterthwaite formula, Eq. (3), for the coverage-factor simulation. The values in Table 3 are converted into effective degrees of freedom (EFD) from the Welch-Satterthwaite factor by a standard statistical table. The experimental result requires setting the coverage factor k to 2.2 by the standard statistical table, whereas coverage factor 2.0 is generally used as the standard; in this case it is necessary to make the confidence interval that can be assured narrower than the general value.
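The Welch-Satterthwaite computation referred to above can be sketched as follows. The uncertainty components and their degrees of freedom here are hypothetical, chosen only to show the shape of the formula.

```python
def welch_satterthwaite(components):
    """Effective degrees of freedom for a combined standard uncertainty:
    nu_eff = u_c**4 / sum(u_i**4 / nu_i),
    where u_c**2 = sum(u_i**2). `components` is a list of (u_i, nu_i)."""
    uc2 = sum(u * u for u, _ in components)            # u_c squared
    return uc2 * uc2 / sum(u ** 4 / nu for u, nu in components)

# Hypothetical budget: two Type A components and one Type B component
nu_eff = welch_satterthwaite([(8.6, 99.0), (7.3, 99.0), (3.0, 50.0)])
```

The resulting `nu_eff` is then looked up in a t-table to pick the coverage factor, which is how a value such as k = 2.2 arises instead of the default k = 2.0.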

Table 4 shows the confidence intervals and related data for QA, with the result expressed numerically. The Assurance Interval (AI) data show the final assurance values. The reject zones are also shown in the t-distribution and the chi-square distribution in the MCMC of Fig. 3.

The assurance interval (zone) determined on the basis of Type B of ISO-GUM must, in this case, be set narrower than the interval obtained from the ANOVA value for external quality control (EQC), which requires 99.7%, i.e., more than 3 sigma. Only a measured value that lies inside the assurance interval counts as assured performance.

A measurement value is assured only when its mean value lies in the confidence zone (see Fig. 2) inside the assurance interval (AI in Table 4), which is what authorizes it.

In a Type A evaluation of uncertainty, repeated measurement indications are regarded as drawn independently from a normalized frequency distribution; accordingly, the uncertainty evaluation method of Supplement 1 is applied after assigning a scaled and shifted t-distribution to the corresponding input quantities.

The next stage of the research investigated the cause of the abnormal parts seen in the CDF curves; the method was performed with FTA and RCA.


| Sample size | 0 | 50 | 150 | 500 | 1500 | 5000 |
|---|---|---|---|---|---|---|
| 102 | 17.18 | 16.83 | 16.28 | 16.62 | 16.64 | 18.22 |
| 100 | 18.22 | 16.87 | 16.28 | 16.45 | 16.88 | 16.84 |
| 110 | 18.27 | 18.27 | 18.68 | 16.07 | 18.19 | 18.51 |
| 29 | 7.31 | 7.44 | 7.3 | 7.31 | 7.27 | 7.29 |
| 341 | 60.98 | 59.39 | 64.24 | 56.45 | 58.97 | 34.1 |

Table 3. Calculation results by the Welch-Satterthwaite formula

#### **7.6 MA data**

#### **7.6.1 Data of elastase-1**

Uncertainty associated with the measuring operation includes calibration dispersion, within-day and between-day dispersion, within-laboratory and between-laboratory dispersion, and the like, including factors due to reagent preparation and instrument variation. In the co-data inherent in a measurement result, two or more uncertainty factors are analyzed by MA. The time-dependent uncertainty analysis in this experiment concerns day-to-day variance.
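Combining such independent dispersion components into a single standard uncertainty can be sketched as a root sum of squares, per the law of propagation of uncertainty with unit sensitivity coefficients; the component values below are hypothetical.

```python
import math

def combined_uncertainty(components):
    """Combine independent uncertainty components (calibration,
    within-day, between-day, between-laboratory, ...) by
    root sum of squares: u_c = sqrt(sum(u_i**2))."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical component budget for one dose level
u_c = combined_uncertainty([4.3, 3.1, 2.2, 1.5])
```

Adding a newly discovered factor, such as the storage-day effect discussed below, simply appends one more term to the budget and enlarges `u_c`.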

In former research, multiple regression was used for MA based on the "Law of propagation of uncertainty" in EDA, and a new uncertainty factor was found. Fig. 13 and Fig. 14 contain important data in which the new variable element is the influence of storage days: the reaction capability of a test reagent falls as the storage days accumulate.

Fig. 13 shows the change in reaction capability over the storage days of the 0 dose reagent. The Elastase-1 data deteriorate, sloping down to the right in proportion to the increase from day to day, as the reaction capability of the test reagent declines with storage days.


Fig. 14 shows the change in reaction capability over the storage-period days of the 5000 dose reagent. The data show a degradation of reaction capability that reaches the detection limit, meaning an unstable state of large uncertainty.

In both graphs, the ordinate is the affinity (%) and the abscissa shows the storage days.

Fig. 13. Change under storage days of 0 dose.

Fig. 14. Change under storage days of 5000 dose

Fig. 15 shows the change of the SD value over storage days, by EDA. In Fig. 15 the SD value falls with the concentration of the reagent, and a change matching a 28-day biorhythm periodicity appears to exist. The SD value falls along with reaction capability, sliding with concentration dose. The data of Fig. 15 are very important for improving accuracy with respect to fault elements. The abscissa shows the storage days at seven-day intervals; the change was periodic around the 28-day biorhythm interval. In addition, in the domain of low binding capacity, the instability of accuracy is all the same.


Fig. 15. Storage days variance by type B of EDA

Fig. 16. Storage days variance by type A of ANOVA

Fig. 16 shows the change of the average value over storage days based on the ANOVA data; only a flat, rightward-declining change is shown, and no periodic change is seen.

#### **7.6.2 MA data of additional reagents**

In this experiment, since the data of 7.6.1 seemed important, I verified whether the same result would be obtained with two homeopathic test reagents of the same RIA usage. Furthermore, in order to investigate an information-criterion sample number and a likelihood test, multivariable analysis in the time series was used. The results are shown in Fig. 17 and Fig. 18.

Fig. 17 and Fig. 18 show graphs that lay the change of SD and of the number of samples on top of the storage days. Fig. 17 shows the Thyroxin data and Fig. 18 shows the Testosterone data.

In both figure, although a periodic change is looked at by change of SD for under every storage days of six sorts dose as Standard Deviation (SD) in order to explore the root cause of fault elements, it will become unstable data few samples. The both figures showed the same characteristic results. Thereby, the check of the reproducibility of an uncertainty factor was completed. About the number of samples, it is the information criterion by maximum


likelihood theory that needs to be investigated; this research is still under experiment. In Fig. 17 and Fig. 18, the region with five or fewer samples is an unstable area. The ordinate shows the SD and the number of samples; the abscissa shows the storage days at seven-day intervals.
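The grouping just described — SD and sample count per storage-day bin, with five or fewer samples treated as unstable — can be sketched as below; the data values and the helper name `sd_by_storage_day` are hypothetical, not taken from the chapter's experiment:

```python
from collections import defaultdict
from statistics import stdev

# Hypothetical measurements: (storage_day, value) pairs, binned at 7-day intervals.
measurements = [
    (0, 101.2), (0, 99.8), (0, 100.5), (0, 100.1), (0, 99.5), (0, 100.9),
    (7, 98.7), (7, 102.3), (7, 100.2), (7, 101.1), (7, 99.0), (7, 100.4),
    (14, 97.5), (14, 103.8),  # only two samples: expect an unstable estimate
]

def sd_by_storage_day(data, min_samples=6):
    """Group values by storage day, report SD, and flag groups with too few samples."""
    groups = defaultdict(list)
    for day, value in data:
        groups[day].append(value)
    report = {}
    for day in sorted(groups):
        vals = groups[day]
        report[day] = {
            "n": len(vals),
            "sd": stdev(vals) if len(vals) > 1 else None,
            "unstable": len(vals) < min_samples,  # five or fewer samples -> unreliable SD
        }
    return report

report = sd_by_storage_day(measurements)
for day, r in report.items():
    print(day, r["n"], round(r["sd"], 2), "unstable" if r["unstable"] else "ok")
```

Plotting `sd` and `n` against the storage-day bins reproduces the kind of overlay shown in Fig. 17 and Fig. 18.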

Fig. 17. Thyroxine data.

Fig. 18. Testosterone data.

#### **7.6.3 Interaction and allosteric effect in data**

The purpose of this section is to outline methods for assessing uncertainties related to material inhomogeneity, which can be a factor in uncertainty analysis (see section 6.2).

One more uncertainty factor was found. The interaction effect and the allosteric effect are generated by the source of biorhythm. Binding of antibodies to antigens is reversible and very specific. The induction of an immune response depends on the size of the antigen: small molecular-weight compounds (<2000 Daltons), such as drugs, are unable to induce antibody formation.

The interaction was investigated as a phenomenon that deviates from the fundamental reaction principle. An example calibration curve in which an interaction is generated is shown in Fig. 19, and the interaction and allosteric effect were considered from the experimental results in Fig. 19. The ordinate is the affinity (%) of the reagent; the abscissa is the dose of the reagent, from 0 to 5000.

Fig. 19. Example of an interaction between two calibration curves.

An allosteric effect that inhibits the reaction process also exists. Both effects behave, as in bioscience, according to the logistic function.
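To illustrate the logistic behaviour and the interaction between calibration curves described above, here is a minimal sketch assuming a four-parameter-logistic-style curve (a common model for immunoassay calibration, assumed here; the parameter values are hypothetical). The gap between the two curves varies with dose, i.e. they do not run parallel:

```python
def logistic_affinity(dose, top=100.0, bottom=0.0, ec50=500.0, slope=1.0):
    """Logistic calibration curve: affinity (%) as a function of dose.
    Decreasing form typical of competitive binding: high dose, low affinity."""
    return bottom + (top - bottom) / (1.0 + (dose / ec50) ** slope)

# Two hypothetical calibration curves differing only in their midpoint (EC50).
doses = [0, 50, 250, 500, 1000, 2500, 5000]   # abscissa: dose from 0 to 5000
curve_a = [logistic_affinity(d, ec50=400.0) for d in doses]
curve_b = [logistic_affinity(d, ec50=800.0) for d in doses]

# Parallel curves would keep a constant vertical gap; a dose-dependent gap
# signals an interaction between the two calibration curves, as in Fig. 19.
gaps = [b - a for a, b in zip(curve_a, curve_b)]
print([round(g, 1) for g in gaps])
```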

#### **8. Conclusion**

52 Quality Assurance and Management


A QA system for clinical test data based on ISO-GUM was put into use in 2006. It was included in the IT system plan for medical health care in 2010. This research is being continued in order to further strengthen the reliability of QA. Under ISO-GUM, two or more buried multivariable uncertainty factors are being pursued. The validity of ISO-GUM is increasing through the issue of many additional supplements.
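The core ISO-GUM rule referred to here — combining several uncertainty factors into one standard uncertainty and an expanded uncertainty — can be sketched as follows; the sensitivity coefficients and input uncertainties are hypothetical:

```python
import math

def combined_standard_uncertainty(contributions):
    """GUM law of propagation for uncorrelated inputs:
    u_c(y) = sqrt(sum_i (c_i * u_i)^2), with c_i the sensitivity coefficients."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# Hypothetical budget: two uncertainty sources feeding one measurand.
u_c = combined_standard_uncertainty([(1.0, 0.12), (0.5, 0.20)])
expanded = 2.0 * u_c  # coverage factor k = 2, roughly a 95 % interval
print(round(u_c, 4), round(expanded, 4))
```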

This research started with the purpose of preventing clinical misdiagnosis caused by ambiguous data in medical care. Therefore, the work of obtaining stable data of higher accuracy has been continued. An improvement strategy was found by uniting QE and ISO-GUM. The main purpose, the assurance of measurement data, was attained by the new technology. The clinical data obtained are expected to allow exact prediction of each patient's pathological change. After the improvement of ambiguity was obtained, the results came to be utilized for the worldwide base of EQAS medical care. The results of the research have been satisfactory.

Uncertainty of measurement, traceability and numerical significance are separate but closely related concepts that affect both the format of and the information conveyed by a quantitative test result. In addition, the use of SI units provides a consistent basis for clinical laboratory reporting. The katal, an SI unit which evaluates the reaction kinetics of chemicals, will be used in the near future.

The experimental results show that exact diagnostic decisions are enabled by taking Bayesian inference into QE and the ISO standard. A Medical Laboratory Quality System (MLQS) is essential for a laboratory to deliver the correct result for patient and donor through Good Laboratory Practice (GLP).

QA has grown to be equivalent to QC, which can obtain the same result whenever and wherever it is applied.

**4**

**The Use of Quality Function Deployment in the Implementation of the Quality Management System**


Elena Condrea, Anca Cristina Stanciu and Kamer Ainur Aivaz
*"Ovidius" University of Constanta, Romania*

#### **1. Introduction**

Nowadays a strong accent falls on quality, this period of time being considered one of the "quality years".

Firms are forced to reduce losses from sales more and more and also to extend their dispatch markets while gaining new customers. The increasingly intense competition has brought to the fore the idea that quality is not something added, not something good to own, but a condition of survival.

Being a complex item, quality cannot be directly and easily measured and expressed, except for some technical characteristics and the majority of the economic ones.

Quality also depends on suitable materials, the equipment's performance, the technology, the accuracy of technical control, the employees' qualification, etc. On the market, at a certain moment, several products can be found which accomplish the same functions and yet have quite different performances and prices. To each set of performances corresponds a certain set of advantages, which can be quantified; but for each user the advantage curve is different, depending on the destination of the products.

Things look quite different if we attempt a quantitative expression of the quality of a research project, an innovation initiative or a business proposal. We cannot talk about a quality "standard", or in any case not about a "universal" one. That is why there is a need for a very general method, with universal applicability, permitting the use of specific instruments each time.

As well, the conditions in which product quality is determined are extremely complex, due to both objective and subjective elements.

Starting from here, B. Boehm, J.A. McCall, P. Richard and G. Walters structured a number of principles to allow a quantitative and objective measurement, in conformity with the following scheme:

Subject quality → Characteristics (externals) → Elements (internals) → Measurements (quantitative, of the internal elements)

The main idea at the foundation of their studies is that the measure expressing quality must result from an amount of numerous measurements, each having in view a certain characteristic. In conformity with the scheme, each product presents several characteristics to be appreciated. Each characteristic depends on a number of internal elements of the subject to be measured, elements which can be quantitatively expressed (the values of the elements are reflected in the numerical values of the characteristics, and these, in their turn, lead to a quantitative indicator of the quality).
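The roll-up this paragraph describes — element measurements feeding characteristic values, which feed one quantitative quality indicator — can be sketched as a weighted aggregation; the characteristics, weights and normalised values below are hypothetical:

```python
# Hypothetical scheme: subject quality <- characteristics <- internal elements
# <- quantitative measurements. Element measurements roll up into characteristic
# scores, which roll up into a single quality indicator.
characteristics = {
    # characteristic: {element: (measured value normalised to 0..1, weight)}
    "reliability": {"mtbf": (0.8, 0.7), "failure_rate": (0.9, 0.3)},
    "usability":   {"learning_time": (0.6, 0.5), "error_rate": (0.7, 0.5)},
}
char_weights = {"reliability": 0.6, "usability": 0.4}

def characteristic_score(elements):
    """Weighted average of the element measurements for one characteristic."""
    total_w = sum(w for _, w in elements.values())
    return sum(v * w for v, w in elements.values()) / total_w

def quality_indicator(chars, weights):
    """Weighted average of characteristic scores -> one quantitative indicator."""
    total_w = sum(weights.values())
    return sum(characteristic_score(chars[c]) * weights[c] for c in chars) / total_w

print(round(quality_indicator(characteristics, char_weights), 3))
```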

Meanwhile, with the reconsideration of the quality notion, methods also appeared that allow finding solutions to better satisfy a certain buyer segment. One such method is known in the literature as *QFD* (*Quality Function Deployment* – extension of the quality function) or under the more familiar name of *House of Quality*, a name given by the shape of one of the diagrams used, which looks like a house drawn by a child.

#### **2. Quality function deployment. Theoretical aspects**

Named also the "voice of the customer", QFD is a systematic method to develop products/services based on the expectations and desires of the customers, the position of these products and services on the market, and their efficiency.

The basic principle of the method is following the customer's requirements at each step of their trajectory.

Specific to this method is the fact that all product and service development and renewal activities are perceived from the customer's perspective.

QFD is a team method, applied by a team of 6-8 people who must be drawn from all the firm's departments.

QFD represents in fact a planning process, made to help the design, production and marketing of some products and services taking into account the customer's opinion.

The method appeared in 1966, initiated by the Japanese Yoji Akao, and was in fact used for the first time in 1972 by Mitsubishi; starting with the 80s the method gained large applicability both in the USA and in Europe (1988), in order to design the development of different products, processes or projects.

Yoji Akao defines QFD as *a method which transforms the consumers' requirements into quality characteristics and designs the quality of the finished product/service through the systematic development of the relationships between demands and characteristics, starting with the product's/service's functions, followed by its characteristics, its components' characteristics, and ending with the stages and characteristics of the processes from which it results.*

The objectives of the method could be structured as follows:


	- Transformation of the customer's "voice" into technical requirements and quality plans;
	- The customer's needs are rendered more accurately into the specifications of the product/process design;
	- Increase of the quality level of the final products;
	- Structuring of the design process;
	- Reduction of the design cycles;
	- Experience and information are structured into a concise format, easy to assimilate.

The advantages of using QFD are:

	- Shorter design and development cycles;
	- Lower costs and higher productivity;
	- Documentary orientation;
	- Team involvement.
The method allows the elaboration of a project concerning the clients' requirements. First of all, an inquiry must take place to establish which functions of the product the customers expect and how important each is. Then the characteristics are settled and correlated with the functions. In the meantime, comparisons with other firms' performances are made, and the characteristics are analyzed in their relationships, in order to observe which ones are correlated and which ones are opposed.

Then, similarly, from the characteristics the methodology continues with technically measurable performances and then with materials technology.

#### **2.1 Short history of QFD development and application**

The QFD method was developed in Japan at the end of the 60s by professors Shigeru Mizuno and Yoji Akao. At that moment, statistical quality control, introduced after the Second World War, already had roots in Japan. New quality methods were being introduced, with the contribution of quality control to business management, a process known afterwards as TQC or TQM.

Professors Mizuno and Akao intended to develop a method to guarantee quality, a way to make a product suitable for clients before its release on the market. Until that moment, quality control methods were focused on settling difficulties that arose during production or after.

The first important large-scale application was presented in 1966 by Kiyotaka Oshiumi from Bridgestone Tire in Japan. He used a "fishbone" diagram to discover the customers' expectations (outputs) and those characteristics and factors of the process (causes) which influenced the respective result.

This method was first used for the growth of the performance of the Naval Shipyards belonging to Mitsubishi. Initially, it was used to improve the quality of the company's products; in time, however, they realized that, using the same analysis technique, the method could be used to improve the quality of every activity within an economic unit which produces goods or carries out services.

In time, certain famous Japanese companies confirmed the method's efficiency, Toyota being among them: it introduced the method in 1977 and, in a 7-year application period, lowered the fabrication costs of an automobile by 40%, while significantly raising quality and shortening the fabrication cycle.

In 1986, Ford and Xerox, in the United States, adopted the method.

QFD is a *mot-à-mot* translation of the Japanese words "*hinshitsu kino tenkai*", but it was first translated as the evolution of the quality function, a name suggested by Dr. L.T. Fan in 1978.

At the first workshop (seminar) about QFD in the USA, the sponsor Masaaki Imai felt that "*evolution*" did not reflect the sense of "*change*"; consequently, "*hinshitsu tenkai*" would be better translated as "*quality development*". In this manner the name QFD (*Development of the Quality Function*) appeared.

At the foundation of the QFD method lies the *House of Quality*, a set of matrices used to link the voice of the customer with the technical needs of a product, the control plans of the process and the production operations.

In the scheme (fig. 1) we can observe the structure of the *House of Quality* and the explanation of each component:


Fig. 1. The structure of the House of Quality

#### **2.2 QFD methodology**


The central point of the diagram looks like a table with two entries. On the rows there are the customers' requirements, and on the columns the correspondence between the customer's expectations and the quality characteristics of the respective product or service is underlined. This matrix or table is named *the matrix of relations*.

QFD is also called the "House of Quality" because the solutions to the discussed problem are found in a series of matrices arranged as a house consisting of a foundation, first floor, attic and roof, as you can see in fig. 2.

Fig. 2. QFD example

In the application of the method some steps are to be followed:

1. Determination of the customers' requirements and of their importance to them.
2. Identification of the quality characteristics by the work team. The degree to which the requirements are covered by the characteristics is evidenced by a score system (graph).
3. Determination of the quality characteristics which are to be performed for the new product and evaluation of the difficulty of obtaining them. Concomitantly, the preferred sense of variation for these values is also established (increase, decrease, indifference):
	- The evaluation of the interaction and correlation between characteristics is emphasised in the correlation matrix (the superior zone of the diagram, which forms the roof of the house);
	- Comparing the quality characteristics of the products/services with those of the competitors.
4. The compared analysis of the planned product/service with the product/service of the competitors from two points of view: a. from the clients' point of view; b. from the technical point of view.
	- Filling out the "Global impact" and "Organizing difficulty" rows.

QFD does not limit itself only to technological problems; it can also be applied to aspects regarding reliability and costs, allowing for the definition of priority action directions (opportunities of a technological or other nature, risk analysis, etc.).

It can be said that an important advantage of the QFD method is given by the *possibility of identifying the clients' latent demands* (demands which have not manifested themselves yet).

#### Thus:

*The first floor* is built out of "the client's voice" and here we find:

	- The correlation matrix of the problem's demands with the required characteristics;
	- The matrix which represents the firm's position in comparison with the main competitors.

*The house's foundation* consists of:

	- The objectives and technical measures needed in order to solve the analyzed problem.

*The house's attic* consists of the characteristics and/or methods needed to solve the problem's demands.

*The house's roof* represents the correlation matrix of the technical characteristics.

In addition to these building elements of the *House of Quality*, the analysis also requires that the correlations between the elements forming the *House of Quality* be established.

Generally, there are three types of correlations:

	- strong (score 9);
	- average (score 3);
	- weak (score 1).

It goes without saying that there is the possibility of not having a correlation between some elements of a matrix.

Each correlation category will also be given a certain score. A codified representation of each correlation category is recommended, for the purpose of having a more suggestive matrix representation.
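The codified 9/3/1 scores just described feed the calculation of the importance of the technical characteristics. A minimal sketch, with hypothetical client demands, technical characteristics and correlation scores: each characteristic's priority is the sum, over all demands, of demand importance times correlation score.

```python
# Hypothetical House of Quality fragment: rows are client demands (with
# importance points), columns are technical characteristics; cell values use
# the codified correlation scores strong = 9, average = 3, weak = 1, none = 0.
demands = {"easy to use": 5, "durable": 3, "cheap": 2}
tech_chars = ["interface design", "material grade", "unit cost"]
relations = {
    "easy to use": {"interface design": 9, "material grade": 0, "unit cost": 1},
    "durable":     {"interface design": 1, "material grade": 9, "unit cost": 3},
    "cheap":       {"interface design": 0, "material grade": 3, "unit cost": 9},
}

def technical_priorities(demands, tech_chars, relations):
    """Importance of each technical characteristic = sum over demands of
    (demand importance x correlation score); higher means higher priority."""
    return {
        t: sum(demands[d] * relations[d][t] for d in demands)
        for t in tech_chars
    }

prios = technical_priorities(demands, tech_chars, relations)
# Ranking the characteristics gives the hierarchy used to focus design effort.
ranking = sorted(prios, key=prios.get, reverse=True)
print(prios, ranking)
```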

**The steps** of developing a QFD analysis are as follows:

1. Identifying the problem's demands or the client's wishes (the client's voice);
2. Establishing their importance (giving points, percentages or ranking the demands);
3. The perception and judging of the competition (FOR WHAT), which presumes:
	- Gathering information about the market;
	- Commercial information;
	- Comparing with the main competitors, focusing on each of the clients' demands (WHAT/FOR WHAT);
4. Establishing the technical characteristics and/or methods that could compete to solve the analyzed problem;
5. Completion of the correlations in the WHAT/HOW and HOW/HOW matrices;
6. Comparing each technical characteristic with the competition (HOW MUCH/HOW), calculating the importance of the technical characteristics and ranking them;
7. Exploitation, binocular vision, which means analyzing the WHAT/HOW matrix's coherence;
8. Carrying out the quality improvement process, which presumes:
	- The process of improving the product's/service's quality;
	- Extending the analysis.

The QFD analysis can be pictured as a branching graph, meaning that from each of the resulting problems' diagrams, new QFD analyses can be made.

Together with the QFD method, other methods, techniques and instruments taken from quality theory and quality management can be put into practice for quality control and assurance.
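To make the scoring logic concrete, here is a minimal sketch — not from the chapter itself; the demands, characteristics and the common 9-3-1 weighting are illustrative assumptions — of how demand importance and correlation scores combine into priorities for the technical characteristics:

```python
# Illustrative WHAT/HOW scoring (hypothetical data): each client demand has an
# importance weight, and each demand/characteristic pair has a correlation
# level (strong = 9, average = 3, weak = 1, none = 0).

CORRELATION_SCORES = {"strong": 9, "average": 3, "weak": 1, None: 0}

def technical_priorities(importance, relations):
    """Sum importance * correlation score for every technical characteristic."""
    priorities = {}
    for demand, weight in importance.items():
        for characteristic, level in relations.get(demand, {}).items():
            score = CORRELATION_SCORES[level]
            priorities[characteristic] = priorities.get(characteristic, 0) + weight * score
    return priorities

# Hypothetical example: two client demands, three technical characteristics.
importance = {"pleasant taste": 5, "long shelf life": 3}
relations = {
    "pleasant taste": {"fat content": "strong", "acidity": "average"},
    "long shelf life": {"packaging tightness": "strong", "acidity": "weak"},
}
print(technical_priorities(importance, relations))
# fat content: 5*9 = 45; acidity: 5*3 + 3*1 = 18; packaging tightness: 3*9 = 27
```

The 9-3-1 weighting for strong/average/weak is one common convention; any consistent scoring system can be substituted.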


60 Quality Assurance and Management



QFD does not limit itself to technological problems; it can also be applied to aspects regarding reliability and costs, allowing for the definition of priority directions of action (opportunities of a technological or of a different nature, risk analysis etc.).

It can be said that an important advantage of the QFD method is the *possibility of identifying the clients' latent demands* (demands which have not manifested themselves yet).

In the application of the method, the following steps are taken:

1. Identification of the clients' demands (requirements) and of their importance.
2. Identification of the quality characteristics by the work team. The degree to which the requirements are covered by the characteristics is evidenced by a scoring system.
3. Determination of the quality characteristics to be achieved for the new product and evaluation of the degree of difficulty in obtaining them. At the same time, the preferred direction of variation for these values (increase or decrease) is established. The evaluation of the interactions and correlations between characteristics is emphasised in the correlation matrix (the upper zone of the diagram, which forms the roof of the house).
4. The comparative analysis of the planned product/service with the products/services of the competitors:
	- a. From the clients' point of view;
	- b. From the technical point of view.

   Here, the quality characteristics of the products/services are compared with those of the competitors.
5. Establishment of the final values of the quality characteristics for the new product.


The building of a *House of Quality* requires 6 basic steps:

#### **1. Identification of the customer needs**

The voice of the customer remains at the base of the QFD process. Essential approaches for gathering information from the clients include:

	- Official polls;
	- Focus groups;
	- Direct contacts with clients;
	- Claims analysis;
	- Online monitoring.

#### **2. Identification of technical needs**

Technical needs are characteristics that describe the customer needs in the designer's language. They must be measurable, because the result is controlled (checked) and compared with the target objectives. The roof of the house shows the relationships between the technical needs, converted into a series of symbols: a typical scheme uses one symbol for very strong relationships and another for weak relationships. For example, two technical requirements of a superior service are the capacity and the staffing and equipment of a clinic. The relationship between them is strong, because in order to increase the capacity, more staff and equipment are needed.

#### **3. The link between the customer needs and the technical needs**

The customer needs are written in the left column and the technical ones at the top. Inside the matrix, symbols indicate the type of relation, in a way similar to those used on the roof of the House of Quality. The purpose of this matrix is to show whether the final technical needs cover the customer needs. This kind of evaluation is usually based on the experts' experience, on the customers' reactions or on controlled experiments.

The lack of a solid link between a customer need and the technical ones shows that the need is not covered and that the final product will hardly satisfy it.

Conversely, if a technical need does not affect any customer need, it could be useless, or the designers might have overlooked an important customer need.
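The two failure modes above — an uncovered customer need and a dangling technical need — can be detected mechanically from the matrix. A minimal sketch, with invented needs and links:

```python
# Coverage check for a House of Quality relationship matrix (hypothetical data):
# a customer need with no strong link to any technical need is uncovered, and
# a technical need linked to no customer need may be useless.

def coverage_gaps(needs, technical, matrix):
    """matrix maps (need, technical_need) -> 'strong' | 'average' | 'weak'."""
    uncovered = [n for n in needs
                 if not any(matrix.get((n, t)) == "strong" for t in technical)]
    unused = [t for t in technical
              if not any((n, t) in matrix for n in needs)]
    return uncovered, unused

needs = ["freshness", "easy opening"]
technical = ["acidity control", "cap torque", "label print"]
matrix = {("freshness", "acidity control"): "strong",
          ("easy opening", "cap torque"): "weak"}
uncovered, unused = coverage_gaps(needs, technical, matrix)
print(uncovered)  # ['easy opening'] - only a weak link, so not solidly covered
print(unused)     # ['label print'] - linked to no customer need
```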

#### **4. Addition of the competitive evaluation and of the sale key points**

In this stage, the importance of every customer need is evaluated, and the competitors' products and services covering these needs are researched. These evaluations are very important, because they reflect the customers' expectations. The competitive evaluation underlines the strengths and weaknesses of the competition, and through it designers can discover ways to improve the product. If the QFD analysis and the strategic vision of the company show that needs important to the customer (such as family activities) are not satisfied by the competitors' products, the company can obtain advantages by focusing on these aspects. The respective needs become sale key points and lie at the foundation of the marketing strategies.

#### **5. Evaluation of technical needs of the competitive products and services and establishment of targets**

This stage is usually carried out on the basis of gathered information or product testing. These evaluations are compared with the competitive evaluation of the customer's needs, in order to determine the disparities between the customer's needs and the technical ones.

If a competitor's product proves to satisfy a customer need while the evaluation of the technical needs shows otherwise, then either the measurement was wrong, or there is a difference of image (positive for the competitor, or negative for the company's product) which affects the consumers' perception. For example, customers may say they give great importance to family activities while the competitive evaluation shows that these aspects are not accomplished. Establishing a target regarding such a need will satisfy the consumers and offer an advantage against the competitors' products.
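A small sketch of this disparity check — the products, 1-5 scores and the one-point tolerance below are invented for illustration:

```python
# Hypothetical disparity check: compare how customers rate each need (1-5)
# with the measured technical evaluation (1-5) and flag image gaps.

def image_gaps(customer_eval, technical_eval, tolerance=1):
    """Flag needs where customer perception and technical measurement disagree."""
    gaps = {}
    for need in customer_eval:
        diff = customer_eval[need] - technical_eval[need]
        if abs(diff) > tolerance:
            gaps[need] = "positive image" if diff > 0 else "negative image"
    return gaps

customer_eval = {"taste": 5, "packaging": 2}   # what customers say
technical_eval = {"taste": 3, "packaging": 2}  # what measurements show
print(image_gaps(customer_eval, technical_eval))  # {'taste': 'positive image'}
```

A "positive image" gap means customers rate the product better than the measurements justify; a "negative image" gap is the reverse, and both call for a closer look at the measurement or at perception.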

#### **6. Selection of the technical needs to be modified in the process**

In this stage, the technical requirements which have a strong link with the customer needs, and which are therefore considered sale key points, are identified; through them, the customer's voice is taken into account in the rest of the process. Features not considered critical do not need greater attention. For example, the key factors in a fitness centre are the program, the equipment, the fee and the access to the Internet.

The six stages are just the beginning of the QFD process. Three further houses of quality are used to deploy the customer needs through the components' characteristics, the process plan and the quality control plan.

The second house is very similar to the first, but it refers to subsystems and components.

In it, the technical needs of the first house of quality are described in detail (Fig. 3).

Fig. 3. The four quality houses

Legend: 1. Client's needs; 2. Technical needs; 3. Components' characteristics; 4. Process operations; 5. Quality control plan.




At this moment, the target values, the function and the aspect are settled. For example, the program of a fitness centre could be divided into a program for children, a program for families etc., each with its specific needs and, therefore, each with its own house of quality.

In the field of production, the majority of QFD activities are represented by the first two houses, which are deployed by the product development and engineering functions. The next stages involve the process planners and the line operators.

In the third house, the process plan makes the link between the characteristics of the components and the key operations. That makes possible the passage from the plan to the application.

In some cases, simpler houses of quality are used, which exclude the competitive analysis. For example, in national health organizations, competition is not of interest.
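The cascade from one house to the next — the "hows" of one house, with their computed priorities, becoming the weighted "whats" of the following one — can be sketched as below (the names, weights and 9-3-1 scoring are hypothetical, not from the chapter):

```python
# Hypothetical sketch of the quality-house cascade: each deployment step is
# the same weighted-matrix product, chained from house to house.

SCORES = {"strong": 9, "average": 3, "weak": 1}

def deploy(whats, matrix):
    """whats: {need: weight}; matrix: {(need, how): level} -> {how: priority}."""
    out = {}
    for (need, how), level in matrix.items():
        out[how] = out.get(how, 0) + whats.get(need, 0) * SCORES[level]
    return out

# House 1: customer needs -> technical needs
house1 = deploy({"taste": 5}, {("taste", "acidity"): "strong"})
# House 2: technical needs -> components' characteristics
house2 = deploy(house1, {("acidity", "culture dosage"): "average"})
print(house2)  # {'culture dosage': 135}
```

The same `deploy` call would be repeated for the process plan and the quality control plan, so customer priorities propagate numerically through all four houses.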

#### **3. Optimizing the activity of S.C. ELDA MEC S.R.L. through the application of the Quality Function Deployment method**

QFD was applied within ELDA MEC SRL Constanta in order to establish the product which corresponds optimally to the requirements of the dairy product consumers.

#### **3.1 Presentation of S.C. ELDA MEC S.R.L. Constanta**

S.C. ELDA MEC S.R.L. is a limited liability company with completely private capital, established in 1996; its field of activity is the *production and commerce of dairy products*.

The company's headquarters is on Dumbrava Rosie Str. no 5, Constanta.

Work points: the company's headquarters and, respectively, Topraisar, Constanta County, Romania (starting from July 2007).

In order to carry out this activity, the company owns fabrication licenses for the following products:

	- Milk for consumption;
	- Acidophil products;
	- Fresh cow cheese and creams;
	- Sour cream for consumption and whipped cream;
	- Hard paste cheeses.

The products made and commercialized by Elda Mec are presented in Table 1.

Currently, the unit is in conformity with European sanitary-veterinary norms regarding the production on a national level (L 41) and is carrying out the program for preparing the prime materials' suppliers for export quality.

Presently, the company has 15 employees, who ensure the good functioning of the current production capacity.


| Nr. crt. | Product name | Package | Quantity |
|---|---|---|---|
| 1 | Fat yoghurt | PET bottle | 900 g |
| 2 | Fat yoghurt | PET bottle | 2 kg |
| 3 | Sana | PET bottle | 900 g |
| 4 | Yoghurt cream | PET bottle | 900 g |
| 5 | Diet yoghurt | PET bottle | 900 g |
| 6 | Fresh cow cheese | Plastic casserole | 500 g |
| 7 | Fresh cow cheese | Plastic bucket | 5 kg |
| 8 | Făgăraș cheese | Plastic casserole | 250 g |
| 9 | Sour cream 25% fat | Plastic casserole | 450 g |
| 10 | Sour cream 20% fat | Plastic casserole | 450 g |
| 11 | Sour cream 20% fat | Plastic bucket | 5 kg |
| 12 | Cow telemea | Plastic box | 15 kg |
| 13 | Delicatesa Elda | Plastic bucket | 8 kg |

Table 1. The products made and commercialized by Elda Mec SRL Constanta, Romania

#### **3.1.1 The analysis of the marketing environment**

- The consumption market of these products is in continuous expansion, owing to the health-giving qualities of dairy products. An important factor in this growth is *The Alliance for Educational Milk Advertisement*, a nation-wide program for informing people about the benefits of consuming milk and industrially-processed dairy products rather than unprocessed ones, and about the increase in hygienic-sanitary quality and nutritional value achieved through industrial processing.
- This type of product is available to any consumer, milk being a food rich in calcium and phosphorus, which contributes decisively to the growth and upkeep of the bone system, to the good functioning of the muscles and to the transmission of nervous impulses. Moreover, milk contains the B-vitamin complex (B1, B2, B6, B12), which plays an important role in the prevention of fatigue and nervous states.
- Given that, on the Constanta market, there are not many companies covering the demand for fresh dairy products, S.C. ELDA MEC S.R.L. has the opportunity to establish itself among the competitors, because it offers higher-quality products to the consumers.
- For distribution, S.C. ELDA MEC S.R.L. uses its own transportation, equipped with storage installations for optimal product storage.
- S.C. ELDA MEC S.R.L. distributes its products especially through the retail network, together with wholesale, having sealed contracts with the main local trade chains and with some large public alimentation chains and restaurants with a commercial vocation along the whole Romanian seashore. Moreover, the company intends to build its own trade network in time.


- The production of milk and dairy products has risen by 95% in 2000-2009, while the annual average consumption of milk and dairy products has grown by 19% in the same period.

*The consumers' demand regarding the market's offer:*

- It is important that, in recent years, consumers have been orienting themselves more and more towards high-quality products, which offer consumption safety and preserve the natural characteristics of milk, such as the yoghurts and cheeses obtained through pasteurizing technological processes followed by the implanting of carefully selected bacteria and molds. The products obtained by the company respect the consumers' norms regarding quality and taste; furthermore, after the implementation of the current project, the product diversity and quality are to be improved.
- In conformity with the Nielsen press statements and those of the Romanian Milk Industry Patronal Association, the yoghurt and sour cream segment is the only one which has grown by 3-5% even in the crisis periods.
- In conformity with Nielsen, dairy product buyers mostly choose hypermarkets (over 40%) when buying these products, followed by supermarkets (20%), discounters (15%) and local shops (10%).
- The yoghurt segment remains, for the Romanian consumer, an item in the daily basket, present in the day-to-day consumption of each Romanian household. Regarding the demand for milk and dairy products, we can say that, under a certain price level, it is inflexible; however, it is sensitive to the growth of consumers' incomes, proof being the consumption evolution of recent years.



#### **3.1.2 SWOT Analysis S.C. ELDA MEC S.R.L. Constanta**

**Strengths**

- The implementation of the HACCP Plan for most of the manufactured products;
- High quality products;
- A varied product range, with an orientation towards traditional products;
- An intermediate price position, between the economy and middle ranges;
- Safe and secure own distribution;
- Location (an area with high economic and touristic potential).

**Weaknesses**

- The lack of a performant, complete analysis laboratory;
- Fluctuating personnel;
- A low number of distributors;
- Poor product advertisement.

**Opportunities**

- The opening of new commercial chains and public alimentation units, especially restaurants with a commercial vocation;
- Multiple possibilities for assortment diversification;
- Accessing nonrefundable funds for development.

**Threats**

- The EU norms imposed on Romania in January 2007;
- The fluctuation of the company's employees, especially after joining the EU;
- Direct competition.

#### **3.1.3 Activities proposed for completion within the firm after the SWOT Analysis**

- Widening the product range, so as to fulfil the demands of as many consumers as possible;
- Intensification of the advertisement actions regarding the products made by *ELDA*, and the Elda Mec company in general;
- Expansion of the market coverage; selling the products to as many commercial chains as possible (national and international);
- Fabrication of new products based on traditional Romanian recipes;
- Increasing the distributors' number and that of the distribution channels;
- Commercialization of some products (Yoghurt, Sana) under other distributors' brands;
- Accessing nonrefundable funds so that the whole production-selling process responds to the EU norms' requirements;
- Increasing the investments towards establishing a performant analysis laboratory.

#### **3.1.4 S.C. ELDA MEC S.R.L. Constanta objectives**

- To increase production capacity, while at the same time reaching and maintaining a high quality level;
- To certify the quality assurance system in conformity with ISO 9001 and the HACCP Plan (Hazard Analysis and Critical Control Points) in the new production unit;
- To satisfy the demands (needs) of the clients through the assurance of products which are as diverse as possible and of a high quality;
- To implement the ISO 14000 environmental standards;
- To inform and educate the consumers so that they can differentiate natural products, respectively the dietetic ones, from the other products in the same range;
- To accentuate, within the advertisement campaigns, the quality difference and the therapeutic qualities of the natural dietetic dairy products in comparison with the other dairy products available on the market;
- To increase the market share in the Constanta area – the company currently has a share of over 40% in the city of Constanta, its products being found in numerous supermarkets, including Selgros and Mega Image;
- To increase sales by at least 10% yearly in order to, within 3 years, make the ELDA brand known through its quality not only on the Dobrogea market, but on the whole Romanian one and even in some European Union markets.

#### **3.2 Applying the QFD method within ELDA MEC S.R.L. Constanta**

The application of the QFD method at the firm Elda Mec S.R.L. Constanta, Romania aims to establish the optimal ratio between the functions and the quality characteristics of the products, so that they correspond optimally to the customers' requirements.

The analysis focuses on the following three products:

	- Fat yoghurt;
	- Low fat yoghurt;
	- Cream of yoghurt.




The quality characteristics of the mentioned products are found in the product papers below:

#### *Fat Yoghurt* product paper

Made from: cow milk;

Types: fat;

Organoleptic and physico-chemical characteristics:

| Characteristics | Fat |
|---|---|
| Aspect and consistency | Fine consistency, without whey separation or gas bubbles |
| Color | White, milk color or with a slightly yellow tint |
| Smell and taste | Yoghurt-specific, pleasant, slightly bitter, without foreign taste or smell (sour, moldy) |
| Fat, %, minimum | 2.8 |
| Dry substance, %, minimum | 9.0 |
| Acidity, dgr. T | 75 – 140 |
| Proteic substances, %, minimum | 3.2 |
| Delivery temperature, dgr. C, maximum | 8 |
| Whey, %, maximum | 5 |

Quality control, marking, storage, transport:

Lot establishment: lot = max 5000 kg, yoghurt of the same type, package, presented at the same time at verifying.

Packaging: 900g PET bottle, 500g PET bottle, 2 kg PET bottle and 5 kg bucket.

Marking: marking or labeling with: factory brand, product name, type or fat content, net weight, expiration date, fabrication standard.

Storage: in clean refrigeration spaces, at 2-8 dgr. Celsius.

Transport: clean, dry covered vehicles, at 8-12 dgr. Celsius.

Consumption indications: it can be consumed by all consumer categories, which do not have allergies or medical contraindications regarding the product's components.

*Low fat yoghurt* product paper

Made from: cow milk;

Types: light;

Organoleptic and physico-chemical characteristics:

| Characteristics | Light |
|---|---|
| Aspect and consistency | Fine consistency, without whey separation or gas bubbles |
| Color | White, milk color or with a slightly yellow tint |
| Smell and taste | Yoghurt-specific, pleasant, slightly bitter, without foreign taste or smell (sour, moldy) |
| Fat, %, minimum | 0.1 |
| Dry substance, %, minimum | 8.5 |
| Acidity, dgr. T | 75 – 140 |
| Delivery temperature, dgr. C, maximum | 8 |
| Whey, %, maximum | 5 |

Quality control, marking, storage, transport:

Lot establishment: lot = max 5000 kg, yoghurt of the same type, package, presented at the same time at verifying.

Packaging: 900g PET bottle.



Marking: marking or labeling with: factory brand, product name, type or fat content, net weight, expiration date, fabrication standard.

Storage: in clean refrigeration spaces, at 2-8 dgr. Celsius.

Transport: clean, dry covered vehicles, at 8-12 dgr. Celsius.

Consumption indications: it can be consumed by all consumer categories, which do not have allergies or medical contraindications regarding the product's components.

*Cream of yoghurt* product paper

Made from: cow milk;
Types: fat;
Organoleptic and physic-chemical characteristics:


Quality control, marking, storage, transport:

Lot establishment: lot = max 5000 kg, yoghurt of the same type and package, presented at the same time for verification.

Packaging: 900g PET bottle.

Marking: marking or labeling with: factory brand, product name, type or fat content, net weight, expiration date, fabrication standard.

Storage: clean refrigeration spaces, at 2-8 dgr. Celsius.

Transport: clean, dry covered vehicles, at 8-12 dgr. Celsius.

Consumption indications: it can be consumed by all consumer categories, which do not have allergies or medical contraindications regarding the product's components.
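As an illustration, the physic-chemical limits listed in the product papers above can be expressed as a simple conformity check. The sketch below uses the "Fat yoghurt" limits from the product paper (fat ≥ 2.8 %, dry substance ≥ 9.0 %, acidity 75–140 dgr. T, proteic substances ≥ 3.2 %, delivery temperature ≤ 8 dgr. C, whey ≤ 5 %); the function name and the sample values are hypothetical, not part of the chapter.

```python
# Conformity check of a yoghurt sample against the "Fat yoghurt" product-paper
# limits. The limits come from the product paper; everything else is illustrative.

FAT_YOGHURT_SPEC = {
    "fat_pct":       ("min", 2.8),          # Fat, %, minimum
    "dry_substance": ("min", 9.0),          # Dry substance, %, minimum
    "acidity_T":     ("range", (75, 140)),  # Acidity, dgr. T
    "protein_pct":   ("min", 3.2),          # Proteic substances, %, minimum
    "delivery_temp": ("max", 8),            # Delivery temperature, dgr. C, maximum
    "whey_pct":      ("max", 5),            # Whey, %, maximum
}

def check_sample(sample: dict) -> list:
    """Return the list of characteristics that fail the specification."""
    failures = []
    for name, (kind, limit) in FAT_YOGHURT_SPEC.items():
        value = sample[name]
        if kind == "min" and value < limit:
            failures.append(name)
        elif kind == "max" and value > limit:
            failures.append(name)
        elif kind == "range" and not (limit[0] <= value <= limit[1]):
            failures.append(name)
    return failures

sample = {"fat_pct": 2.9, "dry_substance": 9.1, "acidity_T": 120,
          "protein_pct": 3.3, "delivery_temp": 6, "whey_pct": 4}
print(check_sample(sample))  # → [] (an empty list means the sample conforms)
```

A lot inspection would run such a check on every sample drawn from the (maximum 5000 kg) lot presented for verification.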

The QFD method was selected as an instrument for planning and developing the quality functions in conformity with the quality characteristics expected by the customers, and to permit the achievement of this project.

To develop the methodology, a multidisciplinary team was formed, with members drawn from the production-quality, acquisition and marketing/commercial departments.

The objective of this team is to discover which quality characteristics of the products from the "yoghurt" category respond best to the customers' expectations, in order to be sustained and developed for the benefit of the clients.

In a first stage, an inquiry was carried out in order to establish the functions of the three products from the "yoghurt" family expected by the customers, together with their importance, based on a questionnaire containing the following questions:

Using the questionnaire, the degree of importance granted by the customers to each requirement was identified, with points from *1* – *low importance* to *5* – *very important*, as follows:

Table 2. Hierarchical values of the customers' appreciations

Taking into account the opinions of the consumers of the firm's products, the following correlation between the functions and the characteristics of the products was observed. The degree to which the requirements are covered by the characteristics was evidenced with the relation matrix, with the following significance:

**#** - High
**\*** - Very low cover (possible)
**o** - Low

Sensorial needs satisfaction: Appearance and firmness **#**; Smell and taste **#**; Presence of foreign items **#**; Tightness **#**
Nutritional contribution: Content of fat % **#**; Content of proteins % **\***
Comfort and commodity in use: Packaging system **o**
Safety in consumption: Temperature at distribution **o**
Information: Trade mark **o**; Labelling **o**

Fig. 4. The requirements cover matrix

Following the definition and measurement of the consumers' needs, the QFD team established that the respondents grant major importance to the "fat content", "appearance and firmness", "tightness", "presence of foreign items" and "smell and taste". This proves the necessity to sustain and further develop the product "Fat yoghurt", which is, in any case, the product best positioned on the market. The study also demonstrated the necessity to pay greater attention to the trade mark and the labelling system.

Further on, the degree of correlation between the selected characteristics was evidenced in the correlation matrix. It demonstrated to the QFD team that most of the characteristics sustain each other, but that it is necessary to improve the characteristics referring to commodity and comfort in use, respectively the shape of the package, the temperature at distribution, the trade mark, the labelling and the possibility to correlate the price/quantity ratio.

Taking into account the results of the consumer investigation, the QFD team proposed a new package with improved characteristics, which consequently assured a significant growth of sales for the analysed products, especially for the article "fat yoghurt".

**4. Conclusions**

Firms in all domains face difficulties caused by the increasingly rapid modification of the customers' requirements and expectations, which also vary significantly between the different market shares.

Technical progress, the ever greater complexity of production and the ever greater pressure of innovation represent only a few of the growing requirements that firms face.

QFD must contribute to the effective and efficient transformation of the customers' requirements into the capabilities specific to a firm.
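The way a QFD team combines the customers' importance scores (1–5) with the relation-matrix symbols can be sketched as below. The 9-3-1 numeric weighting for the #/o/* symbols is a common QFD convention that the chapter does not state, and the requirement data shown are purely illustrative, not the study's figures.

```python
# Illustrative QFD scoring: customer importance (1-5) times relation weight.
# The 9-3-1 mapping for the #/o/* symbols is an assumed QFD convention.

SYMBOL_WEIGHT = {"#": 9, "o": 3, "*": 1}  # high / low / very low cover

# requirement -> (importance 1-5, {characteristic: relation symbol});
# the numbers and pairings below are hypothetical examples.
requirements = {
    "appearance and firmness": (5, {"packaging system": "o", "fat content": "#"}),
    "smell and taste":         (5, {"fat content": "#"}),
    "labelling clarity":       (3, {"trade mark": "o", "labelling": "o"}),
}

def characteristic_scores(reqs: dict) -> dict:
    """Sum importance * relation weight for every product characteristic."""
    scores = {}
    for importance, relations in reqs.values():
        for characteristic, symbol in relations.items():
            scores[characteristic] = (scores.get(characteristic, 0)
                                      + importance * SYMBOL_WEIGHT[symbol])
    return scores

# Rank characteristics by their total score, highest first.
for name, score in sorted(characteristic_scores(requirements).items(),
                          key=lambda kv: -kv[1]):
    print(f"{name}: {score}")
```

Characteristics that accumulate the highest totals (here, "fat content") are the ones the team would sustain and develop first, mirroring the conclusion drawn from the requirements cover matrix.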
