**Part 1**

**Quality, General Definitions**

**1**

## **IA-Quality - General Concepts and Definitions**

## Ahmed Badr Eldin

*Sigma Pharmaceutical Corp., Egypt* 

## **1. Introduction**

**The Meanings of "Quality."** Of the many meanings of the word "quality," two are of critical importance to managing for quality:

- Quality means those features of products which meet customer needs and thereby provide customer satisfaction. The purpose of such higher quality is to provide greater customer satisfaction and, one hopes, to increase income. However, providing more and/or better quality features usually requires an investment and hence usually involves increases in costs. Higher quality in this sense usually "costs more."
- Quality means freedom from deficiencies: freedom from errors that require doing work over again or that result in field failures and customer claims. Higher quality in this sense usually "costs less."

**Satisfaction and Dissatisfaction Are Not Opposites**. Customer *satisfaction* comes from those features which induce customers to buy the product. *Dissatisfaction* has its origin in deficiencies and is why customers complain. Some products give little or no dissatisfaction; they do what the producer said they would do. Yet they are not salable because some competing product has features that provide greater customer satisfaction. The early automated telephone exchanges employed electromagnetic analog switching methods. Recently, there was a shift to digital switching methods, owing to their superior product features. As a result, analog switching systems, even if absolutely free from product deficiencies, were no longer salable.

Thus quality admits several definitions, such as:

- doing right things right;
- providing a product which is "fit for the purpose";
- providing an acceptable product at an acceptable cost;
- a standard which can be accepted by both the supplier and the customer;
- the totality of features or characteristics of a product that bear on its ability to satisfy a given need ("fitness for use");
- customer satisfaction and loyalty.
**Big Q And Little Q.** Definitions of words do not remain static. Sometimes they undergo extensive change. Such a change emerged during the 1980s. It originated in the growing quality crisis and is called the concept of "Big Q."


## **2. Quality: the financial effects**

4 Modern Approaches To Quality Control

Table 1 shows how the quality "umbrella" has been broadening dramatically. In turn, this broadening has changed the meanings of some key words. Adoption of Big Q grew during the 1980s, and the trend is probably irreversible. Those most willing to accept the concept of Big Q have been the quality managers and the upper managers. Those most reluctant have been managers in the technological areas and in certain staff functions.

Table 1. Contrast, Big Q and Little Q.

**The Effect on Income.** Income may consist of sales of an industrial company, taxes collected by a government body, appropriations received by a government agency, tuitions received by a school, and donations received by a charity. Whatever the source, the amount of the income relates in varying degrees to the features of the product produced by the recipient. In many markets, products with superior features are able to secure superior income, whether through higher share of market or through premium prices. Products that are not competitive in features often must be sold at below-market prices.

Product deficiencies also can have an effect on income. The customer who encounters a deficiency may take action of a cost-related nature: file a complaint, return the product, make a claim, or file a lawsuit. The customer also may elect instead (or in addition) to stop buying from the guilty producer, as well as to publicize the deficiency and its source. Such actions by multiple customers can do serious damage to a producer's income.

**The Effect on Costs.** The cost of poor quality consists of all costs that would disappear if there were no deficiencies—no errors, no rework, no field failures, and so on. This cost of poor quality is shockingly high. In the early 1980s, it was estimated that within the U.S. manufacturing industries, about a third of the work done consisted of redoing what had already been done. Since then, estimates from a sample of service industries suggest that a similar situation prevails in service industries generally.
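To make the definition concrete, here is a small illustrative tally in Python; the cost categories and figures are hypothetical, not taken from the text:

```python
# Hypothetical illustration of the cost of poor quality: the total of all
# costs that would disappear if there were no deficiencies.
deficiency_costs = {
    "rework": 120_000,            # redoing work already done
    "scrap": 45_000,              # output discarded as unusable
    "field_failures": 80_000,     # warranty claims and post-sale service
    "complaint_handling": 15_000, # processing returns and claims
}

cost_of_poor_quality = sum(deficiency_costs.values())
print(cost_of_poor_quality)  # total cost that better quality would eliminate
```

Dividing such a total by revenue or total cost of work done gives the kind of ratio behind the estimate quoted above.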

Deficiencies that occur prior to sale obviously add to producers' costs. Deficiencies that occur after sale add to customers' costs as well as to producers' costs. In addition, they reduce producers' repeat sales.

## **3. How to manage for quality: the Juran trilogy**

To attain quality, it is well to begin by establishing the "vision" for the organization, along with policies and goals. Conversion of goals into results (making quality happen) is then done through managerial processes—sequences of activities that produce the intended results. Managing for quality makes extensive use of three such managerial processes:

1. Quality planning
2. Quality control
3. Quality improvement
These processes are now known as the "**Juran trilogy**." A summary of the three processes is given in Table 2.

### Fig. 1. The Juran trilogy.



Table 2. The three universal processes of managing for quality. [*Adapted from Juran, J.M. (1989). The Quality Trilogy: A Universal Approach to Managing for Quality. Juran Institute, Inc., Wilton, CT.*]

**Inspection and Inspectors.** The concepts of inspection and inspectors are of ancient origin. Wall and jewelry paintings in Egyptian tombs show the inspections used during stone construction projects. The measuring instruments included the square, level, and plumb bob for alignment control. Surface flatness of stones was checked by "boning rods" and by threads stretched across the faces of the stone blocks.

**Safety and Health of the Citizens.** Early forms of protection of safety and health were after-the-fact measures. The Code of Hammurabi (c. 2000 B.C.) prescribed the death penalty for any builder of a house that later collapsed and killed the owner. In medieval times, the same fate awaited the baker who inadvertently had mixed rat poison with the flour.

**The Industrial Revolution.** The Industrial Revolution began in Europe during the mid-eighteenth century. Its origin was the simultaneous development of power-driven machinery and sources of mechanical power. It gave birth to factories that soon outperformed the artisans and small shops and made them largely obsolete.

**The Twentieth Century and Quality.** The twentieth century witnessed the emergence of some massive new forces that required responsive action. These forces included an explosive growth in science and technology, threats to human safety and health and to the environment, the rise of the consumerism movement, and intensified international competition in quality.

**An Explosive Growth in Science and Technology.** This growth made possible an outpouring of numerous benefits to human societies: longer life spans, superior communication and transport, reduced household drudgery, new forms of education and entertainment, and so on. Huge new industries emerged to translate the new technology into these benefits. Nations that accepted industrialization found it possible to improve their economies and the well-being of their citizenry.


The new technologies required complex designs and precise execution. The empirical methods of earlier centuries were unable to provide appropriate product and process designs, so process yields were low and field failures were high. Companies tried to deal with low yields by adding inspections to separate the good from the bad. They tried to deal with field failures through warranties and customer service. These solutions were costly, and they did not reduce customer dissatisfaction. The need was to prevent defects and field failures from happening in the first place.

**Threats to Human Safety and Health and to the Environment.** With benefits from technology came uninvited guests. To accept the benefits required changes in lifestyle, which, in turn, made quality of life dependent on continuity of service. However, many products were failure-prone, resulting in many service interruptions. Most of these were minor, but some were serious and even frightening—threats to human safety and health, as well as to the environment.

Thus the critical need became quality.

**Expansion of Government Regulation of Quality.** Government regulation of quality is of ancient origin. At the outset, it focused mainly on human safety and was conducted "after the fact"—laws provided for punishing those whose poor quality caused death or injury. Over the centuries, there emerged a trend to regulation "before the fact"—to become preventive in nature. This trend was intensified during the twentieth century. In the field of human health, laws were enacted to ensure the quality of food, pharmaceuticals, and medical devices. Licensing of practitioners was expanded. Other laws were enacted relating to product safety, highway safety, occupational safety, consumer protection, and so on.

Growth of government regulation was a response to twentieth-century forces as well as a force in its own right. The rise of technology placed complex and dangerous products in the hands of amateurs—the public. Government regulation then demanded product designs that avoided these dangers.

To the companies, this intervention then became a force to be reckoned with.

#### **4. The rise of the consumerism movement**

#### **4.1 How to think about quality**

Consumers lacked expertise in technology. Their senses were unable to judge which of the competing products to buy, and the claims of competing companies often were contradictory. When products failed in service, consumers were frustrated by vague warranties and poor service.

"The system" seemed unable to provide recourse when things failed. Individual consumers were unable to fight the system, but collectively they were numerous and hence potentially powerful, both economically and politically. During the twentieth century, a "consumerism" movement emerged to make this potential a reality and to help consumers deal more effectively with these problems. This same movement also was successful in stimulating new government legislation for consumer protection.

**Intensified International Competition in Quality.** Cities and countries have competed for centuries. The oldest form of such competition was probably in military weaponry. This competition then intensified during the twentieth century under the pressures of two world wars. It led to the development of new and terrible weapons of mass destruction. A further stimulus to competition came from the rise of multinational companies. Large companies had found that foreign trade barriers were obstacles to export of their products. To get around these barriers, many set up foreign subsidiaries that then became their bases for competing in foreign markets, including competition in quality. The most spectacular twentieth-century demonstration of the power of competition in quality came from the Japanese. Following World War II, Japanese companies discovered that the West was unwilling to buy their products—Japan had acquired a reputation for making and exporting shoddy goods. The inability to sell became an alarm signal and a stimulus for launching the Japanese quality revolution during the 1950s. Within a few decades, that revolution propelled Japan into a position of world leadership in quality. This quality leadership in turn enabled Japan to become an economic superpower. It was a phenomenon without precedent in industrial history.

## **5. Quality to center stage**

The cumulative effect of these massive forces has been to "move quality to center stage." Such a massive move logically should have stimulated a corresponding response—a revolution in managing for quality. However, it was difficult for companies to recognize the need for such a revolution—they lacked the necessary alarm signals. Technological measures of quality did exist on the shop floors, but managerial measures of quality did not exist in the boardrooms. Thus, except for Japan, the needed quality revolution did not start until very late in the twentieth century. To make this revolution effective throughout the world, economies will require many decades—the entire twenty-first century. Thus, while the twentieth century has been the "century of productivity," the twenty-first century will be known as the "century of quality." The failure of the West to respond promptly to the need for a revolution in quality led to a widespread crisis. The 1980s then witnessed quality initiatives being taken by large numbers of companies.

Most of these initiatives fell far short of their goals. However, a few were stunningly successful and produced the lessons learned and role models that will serve as guides for the West in the decades ahead.

**Lessons Learned.** Companies that were successful in their quality initiatives made use of numerous strategies. Analysis shows that despite differences among the companies, there was much commonality—a lengthy list of strategies was common to most of the successful companies. These common strategies included:

- *Customer focus:* Providing customer satisfaction became the chief operating goal.
- *Quality has top priority:* This was written into corporate policies.
- *Strategic quality planning:* The business plan was opened up to include planning for quality.

## **6. IB-quality control - general concept**

**Quality Control Defined.** "Quality control" is a universal managerial process for conducting operations so as to provide stability—to prevent adverse change and to "maintain the status quo."

To maintain stability, the quality control process evaluates actual performance, compares actual performance to goals, and takes action on the difference.
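As an illustrative sketch (not from the original text), this evaluate-compare-act sequence can be written as a generic feedback loop in Python; the names `sensor`, `goal`, and `actuate` are hypothetical placeholders for a measurement, a quality goal, and a corrective action:

```python
# Sketch of the quality control feedback loop: evaluate actual performance,
# compare it to the goal, and take action on the difference.

def control_step(sensor, goal, actuate, tolerance):
    """One pass around the feedback loop; returns (actual, difference)."""
    actual = sensor()                # evaluate actual performance
    difference = actual - goal      # compare actual performance to the goal
    if abs(difference) > tolerance: # take action on the difference
        actuate(difference)
    return actual, difference

# Usage: a process that drifts upward is repeatedly pulled back to its goal.
state = {"value": 10.0}
for _ in range(5):
    state["value"] += 0.5  # uncontrolled drift between control checks
    control_step(
        sensor=lambda: state["value"],
        goal=10.0,
        actuate=lambda diff: state.update(value=state["value"] - diff),
        tolerance=0.2,
    )
print(state["value"])  # held at the goal despite the drift
```

The same loop applies whether the sensor is a gauge on a production line or a monthly report read by a manager; only the elements change, not the structure.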

Quality control is one of the three basic managerial processes through which quality can be managed. The others are quality planning and quality improvement. The Juran trilogy diagram (Figure 2) shows the interrelation of these processes. Figure 2 is used also to describe the relationships between quality planning, quality improvement, and quality control and the fundamental managerial processes in total quality management. What is important for this section is to concentrate on the two "zones of control." In Figure 2 we can easily see that although the process is in control in the middle of the chart, we are running the process at an unacceptable level of waste. What is necessary here is not more control but improvement—actions to change the level of performance. After the improvements have been made, a new level of performance has been achieved. Now it is important to establish new controls at this level to prevent the performance level from deteriorating to the previous level or even worse. This is indicated by the second zone of control.

The term "control of quality" emerged early in the twentieth century (Radford 1917, 1922). The concept was to broaden the approach to achieving quality, from the then-prevailing after-the-fact inspection, to what we now call "defect prevention." For a few decades, the word "control" had a broad meaning which included the concept of quality planning. Then came events which narrowed the meaning of "quality control." The "statistical quality control" movement gave the impression that quality control consisted of using statistical methods. The "reliability" movement claimed that quality control applied only to quality at the time of test but not during service life. In the United States, the term "quality control" now often has the narrow meaning defined previously.
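The statistical quality control movement formalized "in control" with control limits. As an illustrative sketch with invented data (Shewhart-style three-sigma limits, not taken from the text), a process point is flagged when it falls outside the limits computed from historical in-control data:

```python
# Illustrative Shewhart-style control limits: a point outside
# [mean - 3*sigma, mean + 3*sigma] suggests the process is out of control.
import statistics

def control_limits(baseline):
    """Center line and three-sigma limits from historical in-control data."""
    center = statistics.mean(baseline)
    sigma = statistics.pstdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(points, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(points) if not lcl <= x <= ucl]

baseline = [9.9, 10.1, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]  # invented data
lcl, center, ucl = control_limits(baseline)
flagged = out_of_control([10.0, 10.1, 11.5, 9.9], lcl, ucl)
print(flagged)  # the deliberate outlier 11.5 is flagged
```

Holding a process inside these limits is control; shifting the center line itself to a better level is improvement, which is the distinction the two "zones of control" illustrate.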

The term "total quality management" (TQM) is now used as the all-embracing term.

Fig. 2. The Juran trilogy diagram. (*Juran Institute, Inc., Wilton, CT.*)

In Europe, the term "quality control" is also acquiring a narrower meaning. Recently, the European umbrella quality organization changed its name from European Organization for Quality Control to European Organization for Quality. In Japan, the term "quality control" retains a broad meaning.


Their "total quality control" is roughly equivalent to our term "total quality management." In 1997 the Union of Japanese Scientists and Engineers (JUSE) adopted the term total quality management (TQM) to replace total quality control (TQC), to align more closely with the terminology used in the rest of the world.

The quality control process is one of the steps in the overall quality planning sequence. Figure 3 shows the input-output features of this step. In Figure 3 the input is operating process features developed to produce the product features required to meet customer needs. The output consists of a system of product and process controls which can provide stability to the operating process.

Fig. 3. The input-output diagram for the quality control process.

## **7. The relation to quality assurance**

Quality control and quality assurance have much in common. Each evaluates performance. Each compares performance to goals. Each acts on the difference.

However, they also differ from each other. Quality control has as its primary purpose to maintain control. Performance is evaluated during operations, and performance is compared to goals during operations. The resulting information is received and used by the operating forces. Quality assurance's main purpose is to verify that control is being maintained. Performance is evaluated after operations, and the resulting information is provided both to the operating forces and to others who have a need to know. Others may include plant, functional, or senior management; corporate staffs; regulatory bodies; customers; and the general public.

**The Feedback Loop**. Quality control takes place by use of the feedback loop. A generic form of the feedback loop is shown in Figure 4. The progression of steps in Figure 4 is as follows:

1. A sensor is "plugged in" to evaluate the actual quality of the control subject—the product or process feature in question. The performance of a process may be determined directly by evaluation of the process feature, or indirectly by evaluation of the product feature—the product "tells" on the process.
2. The sensor reports the performance to an umpire.
3. The umpire also receives information on what the quality goal or standard is.
4. The umpire compares actual performance to standard. If the difference is too great, the umpire energizes an actuator.
5. The actuator stimulates the process (whether human or technological) to change the performance so as to bring quality into line with the quality goal.

Fig. 4. The generic feedback loop. (*Making Quality Happen, Juran Institute, Inc., senior executive workshop, p. F-3, Wilton, CT.*)

#### **8. The elements of the feedback loop**

The feedback loop is a universal. It is fundamental to any problem in quality control. It applies to all types of operations, whether in service industries or manufacturing industries, whether for profit or not. It applies to all levels in the hierarchy, from the chief executive officer to the work force, inclusive. However, there is wide variation in the nature of the elements of the feedback loop. In Figure 5 a simple flowchart is shown describing the quality control process with the universal feedback loop embedded.

**The Process**. In all of the preceding discussion we have assumed a process. This may be human or technological or both. It is the means for producing the product features, each of which is a control subject. All work is done by a process, which consists of an input, labor, technology, procedures, energy, materials, and output. For a more complete discussion of process.

Fig. 5. The quality control process. (*"Quality Control", Leadership for the Quality Century, Juran Institute, Inc., senior executive workshop, p. 2, Wilton, CT.*)
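As an illustration only (not from the source), the sensor-umpire-actuator progression of the generic feedback loop can be sketched in code. The role names follow the text; the process model, quality goal, tolerance, and correction gain are all invented for the sketch.

```python
# Hypothetical sketch of the generic feedback loop (steps 1-5 above).
# Sensor, umpire, and actuator roles follow the text; all numbers are invented.

def sense(process_output):
    """Steps 1-2: the sensor evaluates the control subject and reports it."""
    return process_output

def judge(measured, goal, tolerance):
    """Steps 3-4: the umpire compares actual performance to the standard.
    Returns the difference if it is too great, else None (no action needed)."""
    difference = measured - goal
    return difference if abs(difference) > tolerance else None

def actuate(setting, difference, gain=0.5):
    """Step 5: the actuator changes the process to bring quality into line."""
    return setting - gain * difference

def run_loop(setting, goal=10.0, tolerance=0.1, max_cycles=50):
    """Cycle the loop until performance is within tolerance of the goal."""
    for _ in range(max_cycles):
        difference = judge(sense(setting), goal, tolerance)
        if difference is None:
            break
        setting = actuate(setting, difference)
    return setting

# A process starting far from the goal is pulled back within tolerance:
final = run_loop(14.0)  # ends within 0.1 of the goal of 10.0
```

The point of the sketch is only the division of labor: the sensor measures, the umpire decides whether the deviation warrants action, and the actuator changes the process.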


## **9. Deming chain reaction**

The so-called Deming Chain Reaction was actually borrowed from a model that Walter Shewhart developed. He probably borrowed the idea from another thinker. Basically, the idea was for management to move away from thinking about quality as a desirable outcome and toward thinking about quality as a competitive strategy. Competitive strategy as a concept has been around for centuries. A person selling an item similar to that sold by another can compete on price, by selling it for less money. Perhaps the seller may try to compete by adding extras, gift-wrapping, for example. Technical companies compete by being technology leaders and staying on the cutting edge of new developments. There is no end to the methods of competing, but some methods are more effective in the long run than others. It is no accident that Deming's first published book on the subject was entitled "Quality, Productivity and Competitive Position". In the book, he sets forth the reasons why an emphasis on quality leads to productivity improvement and why that is a very effective competitive strategy in the long run. Phil Crosby, in his early-1980s book "Quality is Free", pointed out that improving quality lowers cost; but Deming had shown this to the Japanese 30 years earlier, and Deming pointed out the benefits of developing a competitive strategy based on quality. One of the problems in talking about quality is that many people have pre-conceived notions of what quality is. For some it is meeting specifications. Joseph Juran defines it as 'meeting customer requirements'.

Fig. 6.

Zero Defects was Crosby's nostrum, but it is really just another way of saying that quality is meeting specifications. Deming's ideas are much broader than that and are, perhaps, best captured by the phrase 'continual improvement'. This term connotes the ongoing nature of the strategy. According to Deming, quality is not a state to be achieved in manufacturing but rather an ongoing company-wide effort at continual improvement, what Bill Conway called "the process – the way everyone thinks, talks, works and acts every day." After all the nonsense is stripped away, the fact is that Japanese automakers (Toyota, Honda and Nissan) make better cars than American automakers (GM, Ford and Chrysler) and have done so for years. Buyers are not idiots. They understand value, and realize that better quality at the same or lower cost is excellent value. End of story. But the implications of the story are not confined to the auto industry. Cameras, computers, appliances, power tools, earthmoving equipment, and more have fallen from America's basic manufacturing industries, leaving a legacy of plant closings, job losses and dwindling revenues and profits.

## **10. Quality control: what is new?**

Recent decades have witnessed a growing trend to improve the effectiveness of quality control by formal adoption of modern concepts, methodologies, and tools. These have included:

- Systematic planning for quality control, with extensive participation by the operating personnel
- Formal application of the feedback loop, and establishment of clear responsibility for the associated decisions and actions
- Delegation of decisions to the work force through self-control and self-inspection
- Wide application of statistical process control and the associated training of the operating personnel
- A structured information network to provide a factual basis for decision making
- A systematic process for corrective action in the event of sporadic adverse change
- Formal company manuals for quality control, with periodic audits to ensure up-to-dateness and conformance

## **11. Quality control in pharmaceutical industries**

## **11.1 Responsibilities of quality control unit**

a. There shall be a quality control unit that shall have the responsibility and authority to approve or reject all components, drug product containers, closures, in-process materials, packaging material, labeling, and drug products, and the authority to review production records to assure that no errors have occurred or, if errors have occurred, that they have been fully investigated. The quality control unit shall be responsible for approving or rejecting drug products manufactured, processed, packed, or held under contract by another company.

b. Adequate laboratory facilities for the testing and approval (or rejection) of components, drug product containers, closures, packaging materials, in-process materials, and drug products shall be available to the quality control unit.

c. The quality control unit shall have the responsibility for approving or rejecting all procedures or specifications impacting on the identity, strength, quality, and purity of the drug product.

d. The responsibilities and procedures applicable to the quality control unit shall be in writing; such written procedures shall be followed.


14 Modern Approaches To Quality Control

#### **12. References**

Bunkley, Nick, "Joseph Juran, Pioneer in Quality Control, Dies", New York Times, 37, March 3 (2008).

Dawes, Edgar W., and Siff, Walter, ASQC Annual Quality Congress Transactions, pp. 810–816 (1993).

Juran, Joseph M., Juran's Quality Handbook, McGraw-Hill, p. 79 (2004).

Koura, Kozo, Societas Qualitas, Japanese Union of Scientists and Engineers, Tokyo, pp. 180–186 (1991).

Parasuraman, A., Zeithami, Valarie A., and Berry, Leonard L., Journal of Marketing, Fall, pp. 41–50 (1985).

Phillips-Donaldson, Debbie, American Society for Quality 37 (5): 25–39 (2000).

**2**

## **Evaluating Quality Control Decisions: A Simulation Approach**

Mohamed K. Omar1 and Sharmeeni Murugan2

*1Nottingham University Business School, Malaysia*

*2Faculty of Engineering & Technology, Multimedia University, Malaysia*

#### **1. Introduction**

Quality has become one of the core factors for almost all manufacturing and service companies that aim to achieve customer satisfaction. Therefore, improving quality is considered one of the efforts that companies regard as a must to attain customer loyalty in today's complex global competitive environment. Studies have concluded that any serious endeavour to improve quality will increase the cost of the product or service. Obviously, improving quality has its own costs. As a result, measuring cost of quality is very important, as it provides valuable insights into the different cost of quality components; thus, favourable returns on investment may be achieved. For this reason, the quality cost concept was introduced and implemented in many manufacturing and service companies. The first model of cost of quality was introduced by Feigenbaum (1956) and is known as the P-A-F model, which consists of prevention, appraisal and failure costs. Feigenbaum (1991) categorized the model into two major areas: costs of control (costs of conformance) and costs of failure of control (costs of non-conformance), a categorization which has since been used by numerous research studies (for example, Sumanth and Arora (1992), Burgess (1996), Purgslove and Dale (1995), Gupta and Campbell (1995), Chang et al. (1996), Sorqvist (1997)).

Most of the reported literature does not provide a single universal definition for cost of quality (Dale and Plunkett (1999)). However, cost of quality is usually best understood as the sum of the costs of conformance and the costs of non-conformance, a view first introduced by Crosby (1979). Here, cost of conformance refers to the costs associated with achieving specific quality standards for a product or service. On the other hand, cost of non-conformance refers to the cost of failure to deliver the required standard of quality for a product or service. From the voluminous literature, one may categorize the cost of quality models into five generic models: the P-A-F model, the Crosby model, opportunity or intangible cost models, process cost models and ABC (activity-based costing) models. Traditional cost accounting approaches typically used to measure cost of quality have been reported in the literature to have serious limitations when dealing with the components of intangible costs (see Son (1991), Chiadamrong (2003) and Sharma (2007)). As manufacturers continue to improve their factories, they discover that existing cost measurement systems should be updated; no matter how sophisticated and reliable these economic evaluation measures may be, such problems remain if unreliable cost information is used as input for the economic evaluation (Chiadamrong (2003)).


| Generic model | Cost/activity categories | Examples of publications describing, analyzing or developing the model |
|---|---|---|
| P-A-F models | Prevention + appraisal + failure | Feigenbaum (1956), Purgslove and Dale (1995), Merino (1988), Chang *et al.* (1996), Sorqvist (1997), Plunkett and Dale (1988), Tatikonda and Tatikonda (1996), Bottorff (1997), Gupta and Campbell (1995), Burgess (1996), Dawes (1989), Sumanth and Arora (1992), Morse (1983), Weheba and Elshennawy (2004), etc. |
| Crosby's model | Conformance + non-conformance | Suminsky (1994) and Denton and Kowalski (1988) |
| Opportunity or intangible cost models | Prevention + appraisal + failure + opportunity | Sandoval-Chavez and Beruvides (1998) and Modarres and Ansari (1987) |
| | Conformance + non-conformance + opportunity | Carr (1992) and Malchi and McGurk (2001) |
| | Tangibles + intangibles | Juran *et al.* (1975) |
| | P-A-F (failure cost includes opportunity cost) | Heagy (1991) and Chiadamrong (2003) |
| Process cost models | Conformance + non-conformance | Ross (1977), Marsh (1989), Goulden and Rawlins (1995), Crossfield and Dale (1990) |
| ABC models | Value added + non-value added | Cooper (1988), Cooper and Kaplan (1988), Tsai (1998), Jorgenson and Enkerlin (1992), Dawes and Siff (1993) and Hester (1993) |

Table 1. Generic cost models and cost categories, adapted from Schiffauerova and Thomson (2006).

It is true to state that cost of quality modelling provides a more accurate approach to determining the cost involved in any quality control activity. However, the challenge does not end here. The cost of quality model must be used to determine the cost of the improvement activities associated with quality control strategies introduced to meet customer expectations. In this way a realistic cost of quality estimate can be determined, which allows managers to show the economic benefit, or otherwise, of a specific quality control strategy. In other words, a quality control improvement strategy can only be justified if the increase in profitability is sufficient to cover the costs involved in the implementation. Once the cost of quality model is developed, simulation can be used to determine the impact of any quality control strategy that a company wishes to investigate. Among the strategies that manufacturing companies may consider investigating is the allowable defect rate in some process or operation. In this case, simulation can be used to calculate the impact of the defect rate of that operation on the overall profitability and productivity of the manufacturing system. Moreover, simulation can be used to study the system before and after some quality control improvement policy. Once the study is completed, a true picture of the cost of that policy can be determined, as well as the impact of that policy on the overall defect rate.
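To make the simulation idea concrete, here is a minimal Monte Carlo sketch of how an operation's defect rate can be tied to profitability. This is not the authors' model: the defect rate, unit economics, rework cost, and volumes are all invented for illustration.

```python
import random

# Minimal Monte Carlo sketch: impact of one operation's defect rate on profit.
# All parameters (price, unit cost, rework cost, volume) are hypothetical.

def simulate_profit(defect_rate, units=10_000, price=20.0,
                    unit_cost=12.0, rework_cost=6.0, seed=42):
    """Each unit is defective with probability defect_rate; defective units
    are reworked at extra cost and then sold. Returns total profit."""
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    defects = sum(1 for _ in range(units) if rng.random() < defect_rate)
    revenue = units * price
    cost = units * unit_cost + defects * rework_cost
    return revenue - cost

# Compare profitability before and after a quality control improvement
# that lowers the operation's defect rate from 8% to 2%:
before = simulate_profit(defect_rate=0.08)
after = simulate_profit(defect_rate=0.02)
gain = after - before  # the profit gain the improvement would have to exceed
```

In the spirit of the paragraph above, the improvement is justified only if `gain` exceeds the cost of implementing it; a fuller model would also feed the cost of quality components into the same simulation.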

This chapter is organized in the following manner: the literature review is presented in section 2, followed by the cost of quality model in section 3. The problem and solution methodology are presented in section 4. Experiment design and model verification, and results and discussions, are presented in sections 5 and 6 respectively. Finally, conclusions are presented in section 7.

## **2. Literature review**

In its simplest definition of cost of quality, the American Society for Quality Control (ASQC (1971)) and BS6143 Part 2 (1990) define cost of quality as the costs incurred in ensuring quality, together with the loss incurred when quality is not achieved. Feigenbaum (1956, 1961) introduced the so-called PAF model, in which cost of quality is classified into four components: prevention, appraisal, and failure (internal and external) costs. The Plunkett and Dale (1987) survey states that the literature suggests most researchers use the PAF model for measuring cost of quality. However, Schiffauerova and Thomson (2006) reported that the PAF concept is not the only one, since other models have been developed, discussed and used in the literature, as detailed in Table 1. It is worth noting that Table 1 was originally developed by Schiffauerova and Thomson (2006) and updated by the authors.
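As a sketch of the PAF classification just described, the four components can be rolled up into a single cost of quality figure. The category names follow the model; every numeric figure below is an invented example, not data from the chapter.

```python
# Illustrative P-A-F (prevention-appraisal-failure) cost-of-quality roll-up.
# Category names follow Feigenbaum's model; all figures are invented.

def cost_of_quality(prevention, appraisal, internal_failure, external_failure):
    """Total CoQ = costs of conformance + costs of non-conformance."""
    conformance = prevention + appraisal                    # costs of control
    non_conformance = internal_failure + external_failure   # costs of failure of control
    return conformance + non_conformance

# Hypothetical monthly figures for a small plant (currency units):
coq = cost_of_quality(prevention=12_000, appraisal=8_000,
                      internal_failure=25_000, external_failure=15_000)
share_of_sales = coq / 400_000  # 60,000 / 400,000 = 15% of sales
```

Expressing the total as a share of sales is what allows the comparisons with the percentages reported in the literature below.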

The importance of cost of quality has been reported in many research works. Moyer and Gilmore (1979) reported that cost of quality could reach as high as 38% of sales, and Albright and Roth (1992) estimated that cost of quality may represent 30% of all manufacturing costs in the United States. Moreover, Harry and Schroeder (2000) asserted that most companies would find that cost of quality, if properly evaluated, falls somewhere between 15 and 25% of total sales, rather than the 3-7% that is often assumed. In another study, Giakatis et al. (2001) report that cost of quality represents a considerable portion of a company's total costs. More recently, Kent (2005) estimated the overall cost of quality faced by companies at between 5 and 15% of turnover. The benefits of implementing a cost of quality system in a profitable organization have been reported extensively in the cost of quality literature; for example, Prickett and Rapley (2001) highlighted four common benefits that any organization is bound to gain from implementing a cost of quality system: (1) it will be


case study company. Although there are voluminous literatures written on cost of quality, very few were written on tracing the invisible elements of cost of quality or on the methods of measuring those elements.

The literature review presented so far indicates that estimating cost of quality is not a simple and straightforward issue. Moreover, the literature shows that there are several methods that can be used to determine cost of quality, and that simulation stands as a favourite approach for investigating whether quality improvements are justified in terms of profitability for a specific cost of implementation. Obviously, a cost of quality model must first be developed, and simulation then follows to investigate the impact of some quality strategies and/or improvements.

## **3. The cost of quality model**

In this section of the chapter we intend to provide a brief description of the model developed by Son and Lie (1991) and the way it was modified to be suitable for our research idea. First, the notations of the model are presented:

Θ  average number of assignable causes per hour
d1  mean time spent to identify an assignable cause
d2  mean time spent to correct an assignable cause
to  time of inspection of the acceptance sampling
cas  cost of investigating acceptance sampling
Γ  probability of process in control state 2
β  probability of process in control state 2
τ  time taken for an assignable cause to occur
t  sampling interval
T1  in-control period
T2  time period until the assignable cause is detected for the first time
T3  time required to investigate the true alarm
T4  time to correct the assignable cause

Son and Lie (1991) considered a small manufacturing system which consists of a machining area (sampling inspection) and a final inspection area (100% inspection), as indicated in Figure 1. However, they assume that 100% inspection is not always possible. They explain that a complete check of a component part may require removing the part from the fixture, and the removal makes it difficult to realign the part to its original position. Therefore, sampling inspection is assumed for component parts (in-process inspection), and 100% inspection for finished parts (final inspection). Throughout the machining process, the product quality feature is monitored by an x-bar chart, which consists of a center line, a lower limit, and an upper limit. A sample of the chosen size is taken and inspected at a specific time interval. During the machining process, when the sample mean falls outside the control limits of the x-bar chart, an investigation is made during an average time period to check whether the alarm is true or false. The false alarm occurs if the process is in

**3.1 Description of the model** 

n sample size per sampling

g time required to sample a product

Notations

able to focus upon areas of poor performance that need improvements, (2) it will have the opportunity to monitor the progress of ongoing improvement activities, (3) it will have an opportunity to plan for quality improvement and (4) it will be able to communicate better within the organization for improving the overall of the quality control.

Moreover, Schiffauerova and Thomson (2006), in their extensive literature review on cost of quality, indicate that companies that use cost of quality programs have been quite successful in reducing cost of quality and improving quality for the customer.

Table 1. Generic cost models and cost categories, adapted from Schiffauerova and Thomson (2006).

Although the impact of implementing cost of quality systems on increasing the profit of any organization is obvious, Yang (2008) reported that the literature on cost of quality systems implementation indicates that most companies do not know the true cost of their own quality. Despite the fact that 82% of companies in the United States are involved in quality programs, only 33% actually compute the cost of quality (Harry and Schroeder, 2000), and in north-east England 66% of organizations do not make use of quality costing, as reported by Prickett and Rapley (2001). Some studies have highlighted reasons for the lack of implementation of cost of quality systems in practice. Harry and Schroeder (2000) state that many significant quality-related costs cannot be captured by most types of accounting systems. Chen and Yang (2002) related the difficulty of measuring cost of quality to the fact that there is a lack of adequate methods for determining the financial consequences of poor quality. Moreover, Chiadamrong (2003) concluded that there is a widespread belief that quality cost cannot be measured in practical terms because traditional cost accounting systems have not been adapted to quantify the value of quality. The need for quantifying cost of quality, as stated by Yang (2008), has been reported in the literature by many researchers (Feigenbaum (1956); Juran (1952, 1989); Krishnan et al. (2000); Giakatis et al. (2001); Prickett and Rapley (2001); Chen and Yang (2002)). However, there is evidence in the literature that clearly indicates that quantifying cost of quality has been neglected by most organizations (Harry and Schroeder (2000); Omachonu et al. (2004)).

Quality performance is not something that can be readily altered in practice, so one cannot experiment with the actual system. Most operations systems are interconnected and subject to both variability and complexity, and it is practically impossible to predict the performance of systems subject to variability, interconnectedness and complexity by analytical means alone. Simulation has been used to overcome this problem, to investigate the financial effect of quality loss, and to examine the effect of different quality strategies on financial performance. Simulation provides a flexible technique for modelling the extensive array of issues that arise relating to quality and manufacturing, and this flexibility permits models of greater complexity than analytical techniques allow. Burgess (1996) constructed a simulation model based on system dynamics in which the traditional P-A-F (prevention-appraisal-failure) elements were incorporated; the model facilitated a precise examination of the major relationships between conformance quality and costs at the organizational level. Gardner et al. (1995) examined quality improvement in a manufacturing system using a simulation approach; their modelling was more complicated in that it allowed defective parts to move along the assembly operations, so that the impact on profitability and productivity could be examined.

Tannock (1995, 1997) emphasized the significance of process capability in the selection of a quality control strategy and revealed the economic advantages of control charting where special or assignable causes exist. Clark and Tannock (1999) investigated the use of a simulation model to estimate the quality costs associated with different manufacturing system setups and quality control strategies; the approach was validated against actual costs at a case study company. Although there is a voluminous literature on cost of quality, very little has been written on tracing the invisible elements of cost of quality or on methods for measuring those elements.

The literature review presented so far indicates that estimating cost of quality is not a simple and straightforward issue. Moreover, the literature shows that there are several methods that can be used to determine cost of quality, and simulation stands out as a favoured approach for investigating whether quality improvements are justified in terms of profitability for a specific cost of implementation. Obviously, a cost of quality model must first be developed; simulation then follows to investigate the impact of particular quality strategies and/or improvements.

## **3. The cost of quality model**

In this section of the chapter, we provide a brief description of the model developed by Son and Lie (1991) and of the way it was modified to suit our research idea. First, the notation of the model is presented.

Notations
| Symbol | Meaning |
|---|---|
| Θ | average number of assignable causes per hour |
| τ | time taken for an assignable cause to occur |
| t | sampling interval |
| n | sample size per sampling |
| g | time required to sample a product |
| d1 | mean time spent to identify an assignable cause |
| d2 | mean time spent to correct an assignable cause |
| T1 | in-control period |
| T2 | time period until the assignable cause is detected for the first time |
| T3 | time required to investigate the true alarm |
| T4 | time to correct the assignable cause |
| to | time to inspect the acceptance sample |
| Cas | cost of investigating acceptance sampling |
| Γ | probability of process in control state 2 |
| β | probability of process in control state 2 |



## **3.1 Description of the model**

Son and Lie (1991) considered a small manufacturing system consisting of a machining area (sampling inspection) and a final inspection area (100% inspection), as indicated in Figure 1. They assume, however, that 100% inspection is not always possible: a complete check of a component part may require removing the part from the fixture, and the removal makes it difficult to realign the part to its original position.

Fig. 1. Simplified manufacturing system.

Therefore, they assume sampling inspection for component parts (in-process inspection) and 100% inspection for finished parts (final inspection). Throughout the machining process, the product quality characteristic is monitored by an x-bar chart consisting of a center line, a lower limit and an upper limit. A sample of a given size is taken and inspected at a specific time interval. During the machining process, when the sample mean falls outside the control limits of the x-bar chart, an investigation is made, over an average time period, to check whether the alarm is true or false. A false alarm occurs if the process is in control; a true alarm, on the other hand, occurs if an assignable cause of a specific magnitude drives the process out of control. In their modelling, they first calculate the cycle time, and then the quality cost (prevention and failure) per cycle is determined. In the next section, the equations for all the cost components are stated; readers interested in the detailed steps of deriving these equations should refer to the original article.
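The alarm logic just described (false alarms while the process is in control, true alarms after a shift) can be sketched numerically. The following stand-alone Python sketch is not the authors' code, and all parameter values in it are hypothetical; it only illustrates how the two alarm rates behave under the usual x-bar limits mu ± q·sigma/√n.

```python
import random
import statistics

def alarm_rate(mu=0.80, sigma=0.007, n=5, q=3.0,
               shift=0.0, trials=10_000, seed=1):
    """Fraction of samples whose mean falls outside the x-bar limits
    mu +/- q*sigma/sqrt(n). With shift=0 the process is in control, so
    any alarm is a false alarm; with a nonzero shift (in multiples of
    sigma) an alarm is a true alarm."""
    rng = random.Random(seed)
    half_width = q * sigma / n ** 0.5
    process_mean = mu + shift * sigma
    alarms = 0
    for _ in range(trials):
        xbar = statistics.fmean(rng.gauss(process_mean, sigma)
                                for _ in range(n))
        if abs(xbar - mu) > half_width:
            alarms += 1
    return alarms / trials

in_control = alarm_rate(shift=0.0)  # near the 3-sigma false-alarm rate
shifted = alarm_rate(shift=5.0)     # a large shift is almost always caught
```

With these made-up settings, the in-control alarm rate stays near the familiar 0.27% of a 3-sigma chart, while a 5-sigma shift is flagged on essentially every sample.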

#### **3.1.1 Cycle time calculations**

The assignable cause occurs according to a Poisson process with mean rate Θ. The period during which the process is in control is represented by T1, which follows an exponential distribution with mean 1/Θ.

The expected length of a cycle is:

$$E(T_c) = E(T_1) + E(T_2) + E(T_3) + E(T_4) \tag{1}$$

This can be obtained using equation 2:

$$E(T_c) = 1/\Theta + \big(t + t\Gamma/(1-\beta) - \tau\big) + (gn + d_1) + d_2 \tag{2}$$
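Equation 2 can be sanity-checked by Monte Carlo. In the sketch below, all parameter values are hypothetical (not taken from the chapter), T1 is drawn from the exponential distribution with mean 1/Θ, and τ is approximated by a uniform draw over the sampling interval (so its mean is t/2); the remaining stages are treated as deterministic for simplicity.

```python
import random

# Hypothetical illustration values (not taken from the chapter):
theta = 0.001                 # assignable causes per hour
t, gamma, beta = 5.0, 0.05, 0.10
g, n, d1, d2 = 0.05, 5, 0.5, 2.0

def analytic_cycle_length(tau):
    """Equation 2 evaluated for a given tau."""
    return 1/theta + (t + t*gamma/(1 - beta) - tau) + (g*n + d1) + d2

def simulated_cycle_length(trials=100_000, seed=7):
    """Monte Carlo version: T1 ~ Exp(theta); tau ~ U(0, t) models the
    offset of the assignable cause within a sampling interval; the
    remaining stages are deterministic."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        t1 = rng.expovariate(theta)
        tau = rng.uniform(0.0, t)
        total += t1 + (t + t*gamma/(1 - beta) - tau) + (g*n + d1) + d2
    return total / trials
```

Under these assumptions the simulated average converges on the closed-form value with tau set to its mean t/2.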

#### **3.1.2 Prevention cost per cycle**

There are three types of prevention cost that occur at the machining area during a cycle of time. The first is associated with inspection work and is denoted (Cp1); the second is the cost of investigating false alarms (Cp2); and the third is associated with adjusting for the assignable cause (Cp3). The expected prevention cost per cycle (CP) is then the sum of these three components, as in equation 3.

$$E(C_P) = E(C_{p1}) + E(C_{p2}) + E(C_{p3}) \tag{3}$$

#### **3.1.3 Failure cost per cycle**

The failure cost has three components: the cost of rework per cycle (Cf1), the scrap cost per cycle (Cf2) and the external failure cost during a cycle of time (Cf3). The expected failure cost per cycle (CF) is then the sum of these three components, as in equation 4.

$$E(C_F) = E(C_{f1}) + E(C_{f2}) + E(C_{f3}) \tag{4}$$

#### **3.1.4 Acceptance sample cost per cycle**


Costs associated with sampling inspection prior to entering the manufacturing system are not considered in the model developed by Son and Lie (1991). Our idea rests on the fact that companies often receive a shipment of material from a supplier and need to ascertain the quality of that shipment. If it is impractical to inspect every item in the shipment, then an (*n*, *c*) sampling plan is used. In this plan, *n* items are chosen (without replacement) from a batch of shipped material. If *c* or fewer of the sampled items are defective, the batch is accepted; otherwise, the batch is rejected. Therefore, a cost is incurred in inspecting the sample before it enters the manufacturing system. The expected number of samples entering the manufacturing system is *n*, whereas *to* represents the time taken to inspect the sample. Let Cas be the cost of investigating acceptance sampling during a cycle of time. Then the expected cost associated with this activity is given by equation 5.

$$E(C_A) = C_{as}\, n\, t_o \tag{5}$$

Hence, the total cost of quality can be represented by summing up all the cost components (1) through (5) and is presented by equation 6.

$$E(C_Q) = E(C_P) + E(C_F) + E(C_A) \tag{6}$$

where E(CP) represents the prevention cost, E(CF) the failure costs, and E(CA) the acceptance sampling costs.
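Equations 5 and 6 combine as follows. The numeric figures below are hypothetical and serve only to illustrate the arithmetic, not any result from the chapter.

```python
def acceptance_cost(c_as, n, t_o):
    """Equation 5: E(CA) = Cas * n * to."""
    return c_as * n * t_o

def total_cost_of_quality(e_cp, e_cf, c_as, n, t_o):
    """Equation 6: E(CQ) = E(CP) + E(CF) + E(CA)."""
    return e_cp + e_cf + acceptance_cost(c_as, n, t_o)

# Assumed prevention/failure costs of 120 and 340, an inspection cost
# rate of 2.5 per hour, n = 90 sampled items and to = 0.05 hours:
cq = total_cost_of_quality(120.0, 340.0, 2.5, 90, 0.05)
# cq = 120 + 340 + 2.5 * 90 * 0.05 = 471.25
```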

#### **4. The problem and solution methodology**

The improvement made to the cost of quality model developed by Son and Lie (1991), described in section 3.1.4, adds acceptance sampling plans. Almost all quality control managers in manufacturing firms develop and implement such plans; as a result, an element of appraisal cost is incurred prior to the commencement of production activities. Moreover, unlike the work reported by Son and Lie (1991), which considered a single-stage manufacturing system, this research work considers a two-stage manufacturing system.

Therefore, the problem considered by this chapter can be described as follows: a manufacturing system consists of two stages; incoming raw materials inspection is carried out according to some quality plan, and as a result an element of appraisal cost is incurred prior to the commencement of the production activity. Once orders are realized by the company, raw materials are brought onto the shop floor, and at this stage preventive and failure costs are incurred. In order to investigate the cost of quality for this manufacturing system, we have adopted the four strategies for inspection and removal of defectives across a range of detection rates reported by Gardner et al. (1995). The strategies are summarized in the following manner:

1. Inspection and removal of defectives based on acceptance sampling prior to assembly points,
2. Inspection and removal of defectives at completion of the finished product only,
3. Inspection and removal of defectives prior to assembly points, and
4. Inspection and removal of defectives following every operation.


The motivation for creating the above strategies (quality control decisions) and then using a simulation tool is to investigate the impact of these strategies on cost of quality. One may conclude that the result of implementing any of these decisions is obvious. For example, the strategy that calls for carrying out inspection at the final stage of production, followed by removal of defectives, will minimize inspection costs; however, an increase of defectives at the final stage of manufacturing is inevitable. Moreover, inspection and removal of defectives prior to the assembly line will increase the prevention cost component and indirectly reduce the failure cost component. However, without developing the cost of quality model and running the simulation, the magnitude of the cost of quality could not be determined and the impact would not be known. It is no secret that many firms sacrifice parts of their quality control steps under pressure to reduce cycle time.

Once the cost of quality model and its three components are developed, a simulation model using the @Risk spreadsheet simulation software is built for the two-stage process. The popularity of spreadsheet technology among practitioners justifies the choice of the @Risk simulation software.

## **5. Experiment design**

Inspection time was considered the main factor in calculating cost of quality. This matters because, in real-life manufacturing systems, production line managers consider cycle time an important factor, and quite often quality control activities, or parts of them, are sacrificed in order to attain a desired cycle time.

In order to simplify the experiment, it was decided to apply the same fraction defective rate to every operation in the manufacturing system for a given trial. The fraction defective rates are divided into five categories: 0.001, 0.005, 0.010, 0.050 and 0.100. Every trial runs for 10000 simulation iterations, as indicated in Table 2.


| Inspection and defective removal strategy | 0.001 | 0.005 | 0.01 | 0.05 | 0.1 |
|---|---|---|---|---|---|
| Acceptance sampling | 10000 runs | 10000 runs | 10000 runs | 10000 runs | 10000 runs |
| Prior to assembly | 10000 runs | 10000 runs | 10000 runs | 10000 runs | 10000 runs |
| Completion of finished product | 10000 runs | 10000 runs | 10000 runs | 10000 runs | 10000 runs |
| Following every operation | 10000 runs | 10000 runs | 10000 runs | 10000 runs | 10000 runs |

Table 2. Experiment design for simulation run (columns give the fraction defective per operation).
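The experiment in Table 2 amounts to a nested loop over the four strategies and five defective rates, with 10000 iterations per cell. The scaffolding might look as follows; `run_trial` is a hypothetical stand-in for the actual cost of quality model, which the chapter implements in a spreadsheet.

```python
import itertools
import random

STRATEGIES = ("acceptance sampling", "prior to assembly",
              "completion of finished product", "following every operation")
DEFECT_RATES = (0.001, 0.005, 0.010, 0.050, 0.100)

def run_trial(strategy, p_defect, rng, batch=1000):
    """Hypothetical stand-in for one iteration of the cost model: a real
    implementation would branch on `strategy` and return a cost; here we
    simply count defectives in a batch."""
    return sum(rng.random() < p_defect for _ in range(batch))

def experiment(iterations=10_000, seed=42):
    """Average the trial outcome over all strategy/rate cells of Table 2."""
    rng = random.Random(seed)
    results = {}
    for strategy, rate in itertools.product(STRATEGIES, DEFECT_RATES):
        total = sum(run_trial(strategy, rate, rng) for _ in range(iterations))
        results[(strategy, rate)] = total / iterations
    return results  # twenty cells, one per strategy/rate combination
```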

Therefore, there will be twenty combinations of strategies and defective rates, and every combination will run for 10000 iterations. It is worth noting that during the design of the acceptance sampling procedure, it was assumed that the company receives a batch of 10,000 items which require assembly activities using processes one and two, respectively. Moreover, it is assumed that the quality engineers have developed a quality plan described as an (n, c) plan. In an (n, c) plan, n items are chosen (without replacement) from the batch of shipped material, where c is the maximum number of defective items that the sample may contain. If the number of defective items in the sample is less than or equal to c, then the batch is accepted; otherwise the batch is rejected. For example, in process 1, a sample of 90 items will be inspected, and the entire 1000 items are accepted if 0, 1, 2, 3, 4, or 5 defective items are found in the sample; otherwise the batch is rejected.

On the other hand, for process 2, a sample of 40 items will be inspected, and the entire batch will be accepted if 0, 1, 2 or 3 defective items are found in the sample; otherwise the batch will be rejected. The idea here is to set the Acceptable Quality Level (AQL) equal to 1% and the Lot Tolerance Percentage Defective (LTPD) equal to 5%. Once the AQL and the LTPD are set, a simulation run is carried out to determine the cost of quality associated with the quality control strategies (decisions) detailed in section 4 of this chapter. It is worth noting, as indicated earlier, that the process under consideration consists of two processes. In order to avoid computational complexity, it was decided to conduct simulation runs for each management strategy on each process separately and then combine the two processes to examine the overall findings. The numerical values which were used in the simulation model are portrayed in Tables 3 and 4.
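The behaviour of the two (n, c) plans at the stated AQL and LTPD can be checked with a binomial model of the sampling, a reasonable approximation when the sample is small relative to the batch. This is an illustrative check rather than part of the chapter's simulation.

```python
from math import comb

def accept_prob(n, c, p):
    """P(accept) for an (n, c) plan: probability of at most c defectives
    in a sample of n when the fraction defective is p (binomial model)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

for n, c in [(90, 5), (40, 3)]:           # plans for processes 1 and 2
    pa_at_aql = accept_prob(n, c, 0.01)   # good lots (AQL = 1%)
    pa_at_ltpd = accept_prob(n, c, 0.05)  # poor lots (LTPD = 5%)
    # A sensible plan accepts AQL-quality lots with high probability and
    # LTPD-quality lots with much lower probability.
```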


Table 3. Numeric values for processes 1 and 2.

Strategy 1. Inspection and removal of defectives based on acceptance sampling prior to assembly points.

This quality control strategy calls for the removal of defectives at the sampling stage and no quality control to be conducted after that stage. The total cost of quality associated with this strategy would be determined based on the following equation:

$$\text{TCOQ}_1 = \text{E(C}_\text{A})$$

Evaluating Quality Control Decisions: A Simulation Approach 25

| | Son and Lie (1991) results | The simulation results |
|---|---|---|
| **Total length of a cycle, TC (hours)** | | |
| E[T1] | 100.00 | 100.00 |
| E[T2] | 4.28 | 4.28 |
| E[T3] | 0.55 | 0.55 |
| E[T4] | 2.00 | 2.00 |
| E[TC] | 106.83 | 106.83 |
| **Prevention cost per cycle, CP ($)** | | |
| E[CP1] | 312.83 | 312.83 |
| E[CP2] | 2.63 | 2.63 |
| E[CP3] | 100.00 | 100.00 |
| E[CP] | 415.46 | 415.46 |
| **Failure cost per cycle, CF ($)** | | |
| E[CF1] | 3292.96 | 3292.96 |
| E[CF2] | 214.21 | 214.21 |
| E[CF3] | 32.29 | 32.29 |
| E[CF] | 3539.46 | 3539.46 |

Table 5. Comparison results.



Strategy 2. Inspection and removal of defectives at completion of finished product only. This strategy calls for inspection of items upon completion of the two processes, which means there is no quality control while the products are being made. Hence, the cost of quality incurred under this strategy is mainly failure costs. However, it is assumed here that samples have to undergo visual inspection before the items enter the manufacturing system. Therefore, the total cost of quality associated with this strategy is:

$$\text{TCOQ}_2 = \text{E(C}_\text{A}) + \text{E(C}_\text{F}) = \text{E(C}_\text{A}) + \text{E(C}_\text{f1}) + \text{E(C}_\text{f2}) + \text{E(C}_\text{f3})$$

Strategy 3. Inspection and removal of defectives prior to assembly points.

This strategy requires that items be inspected and defectives removed prior to assembly. Since this inspection is done at the end of the process, the cost of quality includes the prevention cost incurred during inspection of the material. However, defective parts could still be produced, since there is no quality control during and at the end of the assembly; hence there will be some failure costs, which are required to convert defective items into good items. Moreover, inspection costs before the items enter the manufacturing system have to be considered. Therefore, the total cost of quality associated with this strategy is:

$$\text{TCOQ}_3 = \text{E(C}_\text{A}) + \text{E(C}_\text{p1}) + \text{E(C}_\text{F}) = \text{E(C}_\text{A}) + \text{E(C}_\text{p1}) + \text{E(C}_\text{f1}) + \text{E(C}_\text{f2}) + \text{E(C}_\text{f3})$$

Strategy 4. Inspection and removal of defectives following every operation.

This strategy calls for inspection and removal of defectives following every operation. Hence, inspection takes place at the end of every operation to detect defectives and perform the necessary corrections (rework). As a result, there will be an element of failure costs. Moreover, since the system includes inspection before items enter the manufacturing system, an additional inspection cost will be incurred. The total cost of quality associated with this strategy is:

$$\text{TCOQ}_4 = \text{E(C}_\text{A}) + \text{E(C}_\text{p1}) + \text{E(C}_\text{p2}) + \text{E(C}_\text{F}) = \text{E(C}_\text{A}) + \text{E(C}_\text{p1}) + \text{E(C}_\text{p2}) + \text{E(C}_\text{f1}) + \text{E(C}_\text{f2}) + \text{E(C}_\text{f3})$$

The mathematical model describing each strategy is developed in an MS Excel spreadsheet for each process separately and then for the combination of the two stages. The simulation software @Risk is embedded and a simulation of 10,000 iterations is carried out. In this way, the impact of each strategy on the cost of quality can be investigated and determined.
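The same Monte Carlo logic can be sketched outside a spreadsheet. The toy model below is not the chapter's @Risk model: the batch size, defect probability, and all unit costs are assumed values chosen only to illustrate how the strategy comparison works, with strategies roughly corresponding to "inspect after every operation", "inspect at completion only", and "no inspection":

```python
import random

random.seed(7)

# Illustrative assumptions (not the chapter's Table 3/4 data): a 1,000-item
# batch flows through two operations, each of which turns an item defective
# with probability P_DEFECT.
BATCH, P_DEFECT = 1000, 0.01
C_INSPECT = 0.2          # appraisal cost per item inspected
C_REWORK_NOW = 10.0      # rework cost when a defect is caught right away
C_REWORK_LATE = 50.0     # rework cost when a defect survives a later operation
C_ESCAPE = 200.0         # external failure cost per defect shipped

def tcoq(strategy: str, iterations: int = 10_000) -> float:
    """Monte Carlo estimate of the expected total cost of quality per batch."""
    total = 0.0
    for _ in range(iterations):
        cost, old, new = 0.0, 0, 0
        for _op in range(2):
            old += new                       # defects carried from earlier ops
            new = sum(random.random() < P_DEFECT for _ in range(BATCH))
            if strategy == "every_operation":
                # inspect now: carried defects (none here) and fresh ones reworked
                cost += BATCH * C_INSPECT + old * C_REWORK_LATE + new * C_REWORK_NOW
                old = new = 0
        if strategy == "end_only":
            # one inspection at completion; older defects are costlier to rework
            cost += BATCH * C_INSPECT + old * C_REWORK_LATE + new * C_REWORK_NOW
            old = new = 0
        cost += (old + new) * C_ESCAPE       # undetected defects reach customers
        total += cost
    return total / iterations

for s in ("every_operation", "end_only", "no_inspection"):
    print(f"{s:>15}: {tcoq(s, 500):.1f}")
```

With these assumed costs, failure costs dominate appraisal costs, so inspecting after every operation comes out cheapest, which mirrors the qualitative ranking the chapter reports for strategy 4.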

## **6. Model verification, results and discussions**

#### **6.1 Model verification**

24 Modern Approaches To Quality Control

machining area per hour N 40 20

bad e1 0.01 0.01

good e2 0.005 0.005 Rate of restoring a defect to a good part w 0.95 0.95 Fixed sampling cost per sampling interval a1 25 30 Variable sampling cost per unit product a2 8 5 Cost of investigating a false alarm Cfs 80 80 Cost of correcting an assignable cause Ccr 150 150

misclassification Cg 100 100 Cost of reworking a defective part Cb 200 200

cannot be restored Cs 75 75

defective part Ca 100 100

Strategy 2. Inspection and removal of defectives at completion of finished product only. This strategy calls for inspection of items among completion of the two processes. Therefore, it means that there will be no quality control during the process of making the products. Hence, the cost of quality that will be incurred in this strategy is meanly failure costs. However, it is assumed here that samples have to undergo visual inspection before the items enter into the manufacturing system. Therefore, the total costs of quality associates on

This strategy requires that items must be inspected and defectives are removed prior to assembly. In this case, since the inspection is done at the end of the process, the cost of quality is simply the prevention cost which is incurred during inspection of the material. However, parts with defects could be produced since there is no quality control during and at the end of the assembly. Therefore, there will be some failure costs which is required to convert defective items into good items. Moreover, inspection costs before the items enter the manufacturing system have to be considered. Therefore, the total costs of quality

TCOQ2 = E(CA) + E(Cp1) + E(CF)= E(CA) + E(Cp1) + E(Cf1) + E(Cf2) + E(Cf3)

 TCOQ1 =E(CA) + E (CF) = E(CA) + E(Cf1) + E(Cf2) + E(Cf3) Strategy 3. Inspection and removal of defectives prior to assembly points.

**Process 1 Process 2** 

**Symbol Value Value** 

**Manufacturing system** 

Number of products produced at the

Error rate of misclassifying a good part into

Error rate of misclassifying a bad part as

Cost of reworking a good part because of

Cost of scrapping a defective part that

this strategy are;

associates on this strategy are;

Cost of dissatisfying a customer by selling a

Table 4. Numeric values for processes 1 and 2.

A problem of model verification arises when using a simulation approach for decision evaluation. The literature refers to the model verification procedure developed by Naylor and Finger (1967), which ensures that the developed model is free from any logical error. In order to accomplish this task, the formulation and parameters provided by Son and Lie (1991) were used to compute the prevention and failure costs reported by those authors. A simulation run was carried out and the results of the simulation model were compared with the results reported by Son and Lie (1991). The results are presented in Table 5.



Table 5 indicates that the results obtained from the simulation model exactly match the results reported by Son and Lie (1991), which shows that the simulation model is an accurate representation of the model reported by Son and Lie (1991). Hence, it is concluded that the model is ready for evaluating the quality control decisions.
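A minimal internal consistency check on Table 5: each reported total should equal the sum of its components (E[T4] is taken as 2.00, inferred from the reported total cycle length):

```python
# Component values from Table 5 (Son and Lie, 1991, and the simulation agree).
t = {"E[T1]": 100.00, "E[T2]": 4.28, "E[T3]": 0.55, "E[T4]": 2.00}
cp = {"E[CP1]": 312.83, "E[CP2]": 2.63, "E[CP3]": 100.00}
cf = {"E[CF1]": 3292.96, "E[CF2]": 214.21, "E[CF3]": 32.29}

for name, parts, reported in [("E[TC]", t, 106.83),
                              ("E[CP]", cp, 415.46),
                              ("E[CF]", cf, 3539.46)]:
    total = round(sum(parts.values()), 2)
    print(f"{name}: sum of components = {total}, reported = {reported}")
    assert total == reported
```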

Cost of quality (×1000 MU; MU = monetary unit); t = time interval (hr), g = time to inspect a product (hr).

| Defective rate per operation (%) | t=20, g=10 | t=17, g=9 | t=15, g=8 | t=10, g=7 | t=8, g=6 |
|---|---|---|---|---|---|
| 0.001 | 3558.95 | 3398.49 | 3239.98 | 3075.65 | 2917.15 |
| 0.005 | 1989.88 | 1829.28 | 1670.68 | 1506.19 | 1347.65 |
| 0.01 | 1794.33 | 1633.55 | 1474.85 | 1310.16 | 1151.55 |
| 0.05 | 1642.42 | 1480.26 | 1320.76 | 1154.48 | 995.41 |
| 0.10 | 1628.56 | 1464.93 | 1304.54 | 1136.43 | 976.81 |

Table 6. Cost of quality at different values of defect rate, time interval and inspection time.

#### **6.2 Results and discussions**

#### **6.2.1 Acceptance sampling prior to assembly points**

As indicated earlier, the quality control engineers developed a quality control plan.

For this inspection, as mentioned in section 4, the quality control engineers have developed a quality control plan based on the concept of producer's risk, which is related to the acceptable quality level (AQL), and consumer's risk, which is related to the lot tolerance percentage defective (LTPD).

The summary statistics resulting from the simulation runs show that the mean is equal to 0.9994 and the maximum is equal to 1. Since 99.94% of the samples are within 1% defective, the sample is judged good and the items are considered acceptable. The remaining 0.06% of the samples contributes to the producer's risk; that is, the plan carries a risk of losing 0.06% of good samples. It is worth noting that the skewness has a large negative value (-40.79), indicating that the distribution of the sample is skewed strongly to the left.

Fig. 2. Distribution of acceptance sampling for AQL.

Fig. 3. Distribution of acceptance sampling for LTPD.


Figure 2 was constructed to display the summary statistics, namely the mean value for the acceptable quality level (AQL) over the percentiles. From the chart it can be seen that the mean of the sample is equal to 0.9994, which is very close to the value of 1 and indicates that the producer's risk is equal to 0.0006.

As for the risk associated with the lot tolerance percentage defective (LTPD), the summary statistics indicate that the mean has a value of 0.8622, meaning that 86.22% of samples at this defect level would be accepted as good parts. Such a situation indicates that the consumer's risk of the sampling plan is about 86.22%. Figure 3 summarizes these statistics; the distribution, skewed towards the right, represents the lot tolerance percentage defective with a mean of 86.22%.

## **6.2.2 Inspection upon completion of assembly**

In this case, inspection and removal take place upon completion of the assembly activities. The simulation runs yielded the cost of quality associated with this strategy at each level of defect rate per operation, as presented in Table 6.



Table 6 reveals that the total cost of quality decreases as we move towards the right side of the table. The reason is that the more time is allocated for inspection work, the more defects will be found. This can be seen, for example, by considering the total cost at a 0.001 defect rate with an interval time of 20 hours and an inspection time of 10 hours; that example reveals that the cost has decreased by 27%. Moreover, Table 6 indicates that the maximum total cost of quality resulting from implementing this strategy occurs at a defect rate of 0.001 with an interval time of 20 hours and an inspection time of 10 hours. On the other hand, the minimum cost of quality occurs at a defect rate of 0.10 with an 8-hour inspection interval. The behaviour of the total cost of quality at various interval times for each defect rate is presented in Figure 4.

Figure 4 reveals a large gap between the curve for the 0.001 defect rate and the curve for the 0.005 defect rate. The drastic drop is due to the increase in defects, which makes the time taken to detect a defect shorter and thus the time to correct the assignable cause shorter; hence the total cost of quality is reduced compared with the lower defect rate. Moreover, the difference in total cost of quality between the 0.05 and 0.10 defect rates is small, so the curves for these defect rates overlap each other, with the 0.10 curve having the smallest total cost of quality.

Fig. 4. Cost of quality behaviour at various interval times for each defect rate.

Cost of quality (×1000 MU); t = time interval (hr), g = time to inspect a product (hr).

| Defective rate per operation (%) | t=20, g=10 | t=17, g=9 | t=15, g=8 | t=10, g=7 | t=8, g=6 |
|---|---|---|---|---|---|
| 0.001 | 3062.25 | 2924.25 | 2787.93 | 2646.61 | 2510.30 |
| 0.005 | 1712.85 | 1574.73 | 1438.34 | 1296.88 | 1160.53 |
| 0.01 | 1544.67 | 1406.40 | 1269.92 | 1128.29 | 991.88 |
| 0.05 | 1414.03 | 1274.57 | 1137.40 | 994.40 | 857.61 |
| 0.10 | 1402.11 | 1261.39 | 1123.45 | 978.88 | 841.60 |

Table 8. Cost of quality at different values of defect rate, time interval and inspection time.

## **6.2.3 Inspection prior to assembly operations**

This quality control strategy calls for inspection to be carried out at the commencement of the assembly process; the cost of quality at various defective rates and inspection interval times is presented in Table 7.



Table 7 reveals that the maximum total cost of quality occurs at a defect rate of 0.001, a time interval of 20 hours and an inspection time of 10 hours. On the other hand, the minimum total cost of quality occurs at a defective rate of 0.10, a time interval of 8 hours and an inspection time of 6 hours. Notice that as the inspection time becomes shorter, the total cost of quality is reduced. The relationship between total cost of quality, time interval and defective rate is presented in Figure 5.

Cost of quality (×1000 MU); t = time interval (hr), g = time to inspect a product (hr).

| Defective rate per operation (%) | t=20, g=10 | t=17, g=9 | t=15, g=8 | t=10, g=7 | t=8, g=6 |
|---|---|---|---|---|---|
| 0.001 | 3073.71 | 2937.75 | 2803.23 | 2669.58 | 2539.01 |
| 0.005 | 1715.11 | 1577.40 | 1441.37 | 1301.44 | 1166.24 |
| 0.01 | 1545.79 | 1407.72 | 1271.42 | 1130.55 | 994.72 |
| 0.05 | 1414.24 | 1274.82 | 1137.69 | 994.84 | 858.15 |
| 0.10 | 1402.23 | 1261.52 | 1123.60 | 979.09 | 841.87 |

Table 7. Cost of quality at different values of defect rate, time interval and inspection time.

Fig. 5. Cost of quality behaviour at various interval times for each defect rate.

## **6.2.4 Inspection following every assembly operations**

This strategy calls for inspection works to be carried out at the end of the process for every operation. The cost of quality at various defective rates and inspection interval times is presented in Table 8.



Table 8 reveals that the maximum total cost of quality occurs at a defective rate of 0.001, an interval time of 20 hours and an inspection time of 10 hours. On the other hand, the minimum total cost of quality occurs at a defective rate of 0.10, a time interval of 8 hours and an inspection time of 6 hours. The relationship between total cost of quality, time interval and defective rate is presented in Figure 6.
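The maximum and minimum claims can be verified mechanically from the Table 8 values:

```python
# Table 8 values (cost of quality x1000 MU); rows are defective rates,
# columns are (t, g) schedules from (20, 10) down to (8, 6).
rates = [0.001, 0.005, 0.01, 0.05, 0.10]
schedules = [(20, 10), (17, 9), (15, 8), (10, 7), (8, 6)]
cost = [
    [3062.25, 2924.25, 2787.93, 2646.61, 2510.30],
    [1712.85, 1574.73, 1438.34, 1296.88, 1160.53],
    [1544.67, 1406.40, 1269.92, 1128.29, 991.88],
    [1414.03, 1274.57, 1137.40, 994.40, 857.61],
    [1402.11, 1261.39, 1123.45, 978.88, 841.60],
]

# Pair every cell with its defect rate and (t, g) schedule, then scan.
flat = [(cost[i][j], rates[i], schedules[j])
        for i in range(len(rates)) for j in range(len(schedules))]
print("max:", max(flat))   # expected at rate 0.001 with t=20, g=10
print("min:", min(flat))   # expected at rate 0.10 with t=8, g=6
```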

| Inspection and defective removal strategies S<sub>i,j</sub> | 0.001 | 0.005 | 0.01 | 0.05 | 0.1 |
|---|---|---|---|---|---|
| S1,1 | 4,845.07 | 2,137.34 | 1,651.57 | 1,224.13 | 1,023.91 |
| S1,2 | 4,854.02 | 2,141.85 | 1,655.16 | 1,226.89 | 1,026.21 |
| S1,3 | 4,368.78 | 1,889.97 | 1,451.73 | 1,067.25 | 891.27 |
| S1,4 | 4,357.32 | 1,887.30 | 1,450.23 | 1,066.82 | 891.01 |
| S2,1 | 4,848.23 | 2,138.03 | 1,651.92 | 1,224.21 | 1,023.93 |
| S2,2 | 4,857.18 | 2,142.53 | 1,655.51 | 1,226.97 | 1,026.23 |
| S2,3 | 4,371.95 | 1,890.65 | 1,452.08 | 1,067.32 | 891.30 |
| S2,4 | 4,360.48 | 1,887.98 | 1,450.58 | 1,066.89 | 891.03 |
| S3,1 | 4,622.03 | 2,090.06 | 1,630.57 | 1,217.32 | 1,023.83 |
| S3,2 | 4,630.98 | 2,094.56 | 1,634.16 | 1,220.08 | 1,026.13 |
| S3,3 | 4,145.74 | 1,842.68 | 1,430.73 | 1,060.44 | 891.20 |
| S3,4 | 4,134.28 | 1,840.00 | 1,429.23 | 1,060.01 | 890.93 |
| S4,1 | 4,472.97 | 2,052.62 | 1,605.53 | 1,209.65 | 1,016.16 |
| S4,2 | 4,481.93 | 2,057.12 | 1,609.12 | 1,212.41 | 1,018.47 |
| S4,3 | 3,996.69 | 1,805.25 | 1,405.69 | 1,052.76 | 883.53 |
| S4,4 | 3,985.23 | 1,802.57 | 1,404.19 | 1,052.33 | 883.26 |

Columns give the fraction detection rate per operation.

Table 9. Total cost of quality for the assembly system.

Fig. 6. Cost of quality behaviour at various interval times for each defect rate.

Figure 6 reveals that as the defective rate per operation increases, the total cost of quality decreases. This is because as the defective rate increases, the time taken to detect a defect becomes shorter; hence the time taken to bring the process from the out-of-control to the in-control state is shorter, and the impact on the cost of quality is smaller.

To give the reader a clear picture, Figure 7 presents a graphical view of the relationship between total cost of quality and detection rate per operation.

Fig. 7. Cost of quality for the four strategies.

Figure 7 indicates that the total cost of quality is at its maximum value when the detection rate is zero, and then decreases steadily as the window of inspection increases. Moreover, inspection following every assembly operation has the lowest total cost of quality. This is because as the appraisal cost increases, the failure costs are simultaneously reduced, since a lower defect rate occurs and hence fewer items are sent for rework.

#### **6.2.5 Investigation of strategies combination**

Simulation modelling provides a window of opportunity to investigate important quality control decisions and their impact on the cost of quality. Quality control management may therefore wish to consider a combination of strategies, one for each process, rather than a single strategy for the whole assembly process. In this case, one has to determine the possible quality strategy combinations that could be considered for the simulation run. Let R be the possible strategy combination outcome; then we would have:

$$\mathcal{R} = \sum \mathbb{S}\_{i,j} \tag{7}$$

Where


i = the strategy used in process 1, i = 1, 2, 3, 4, and j = the strategy used in process 2, j = 1, 2, 3, 4



Table 9 shows all possible quality strategy combinations with their respective costs at each selected fraction detection rate per operation. As can be seen from Table 9, the combination of {2, 2} strategies causes the system to incur the largest cost of quality. This is the case when management decides to use the quality strategy in which quality control inspection work is carried out only at the end of the production runs. On the other hand, Table 9 shows that adopting the combination of {4, 4} strategies causes the system to incur the least cost of quality.
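As a quick check of these two claims, the 0.001-rate column of Table 9 can be scanned programmatically:

```python
# Total cost of quality for each strategy combination S[i, j] (Table 9),
# taken here at the 0.001 fraction detection rate only.
s = {(1, 1): 4845.07, (1, 2): 4854.02, (1, 3): 4368.78, (1, 4): 4357.32,
     (2, 1): 4848.23, (2, 2): 4857.18, (2, 3): 4371.95, (2, 4): 4360.48,
     (3, 1): 4622.03, (3, 2): 4630.98, (3, 3): 4145.74, (3, 4): 4134.28,
     (4, 1): 4472.97, (4, 2): 4481.93, (4, 3): 3996.69, (4, 4): 3985.23}

worst = max(s, key=s.get)   # combination with the largest cost of quality
best = min(s, key=s.get)    # combination with the least cost of quality
print("costliest combination:", worst, s[worst])
print("cheapest combination:", best, s[best])
```

The scan confirms {2, 2} as the costliest and {4, 4} as the cheapest combination at this detection rate.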

## **References**

Crossfield, R. T. and Dale, B. G. (1990). Mapping quality assurance systems: a methodology, *Quality and Reliability Engineering International*, 6(3), 167-178.

Dale, B. G. and Plunkett, J. J. (1999). Quality Costing, 3rd ed., Gower Press, Aldershot.

Dawes, E. W. (1989). Quality costs - new concepts and methods, in Campanella, J. (Ed.), *Quality Costs: Ideas and Applications*, Vol. 2 (pp. 440), ASQC Quality Press, Milwaukee, WI.

Dawes, E. W. and Siff, W. (1993). Using quality costs for continuous improvement, *ASQC Annual Quality Congress Transactions*, 444-449.

Denton, D. K. and Kowalski, T. P. (1988). Measuring nonconforming costs reduced manufacturer's cost of quality in product by $200 000, *Industrial Engineering*, 20, 36-43.

Feigenbaum, A. V. (1956). Total quality control, *Harvard Business Review*, 34(6), 93-101.

Feigenbaum, A. V. (1961). Total Quality Control, McGraw-Hill Inc., New York, USA.

Gardner, L. L., Grant, M. E. and Rolston, L. J. (1995). Using simulation to assess costs of quality, *Proceedings of the Winter Simulation Conference*.

Giakatis, G., Enkawa, T. and Washitani, K. (2001). Hidden quality costs and the distinction between quality cost and quality loss, *Total Quality Management*, 12(2), 179-190.

Goulden, C. and Rawlins, L. (1995). A hybrid model for process quality costing, *International Journal of Quality & Reliability Management*, 12(8), 32-47.

Gupta, M. and Campbell, V. S. (1995). The cost of quality, *Production and Inventory Management Journal*, 36(3), 43-49.

Harry, M. J. and Schroeder, R. (2000). Six Sigma: The Breakthrough Management Strategy Revolutionizing the World's Top Corporations, New York: Doubleday, Random House.

Heagy, C. D. (1991). Determining optimal quality costs by considering costs of lost sales, *Journal of Cost Management for the Manufacturing Industry*, Fall, 67-71.

Hester, W. F. (1993). True quality cost with activity based costing, *ASQC Annual Quality Congress Transactions*, 446-453.

Jorgenson, D. M. and Enkerlin, M. E. (1992). Managing quality costs with the help of activity based costing, *Journal of Electronics Manufacturing*, 2, 153-160.

Juran, J. M. (1952). Quality Control Handbook, 1st ed., New York: McGraw-Hill.

Juran, J. M. (1989). Juran on Leadership for Quality, New York: Free Press.

Juran, J. M., Gryna, F. M. and Bingham, R. (1975). Quality Control Handbook, 3rd ed., New York: McGraw-Hill.

Kent, R. (2005). Manufacturing strategy for window fabricators 14 - the cost of quality, Tanagram Technology, available at: www.tanagram.co.uk.

Krishnan, S. K., Agus, A. and Husain, N. (2000). Cost of quality: The hidden costs, *Total Quality Management*, 11(4-6), 844-848.

Malchi, G. and McGurk, H. (2001). Increasing value through the measurement of the cost of quality (CoQ) - A practical approach, *Pharmaceutical Engineering*, 21(3), 92-95.

Marsh, J. (1989). Process modeling for quality improvement, *Proceedings of the Second International Conference on Total Quality Management*, IFS Publications, Bedford, 111-121.

Merino, D. N. (1988). Economics of quality: Choosing among prevention alternatives, *International Journal of Quality & Reliability Management*, 5(7), 13-23.

Modarres, B. and Ansari, A. (1987). Two new dimensions in the cost of quality, *International Journal of Quality & Reliability Management*, 4(4), 9-20.

strategies calls for quality control works to be carried after each operation and as a result, completely eliminating the failure costs. There are many important management implications illustrated in all the examples presented in this chapter. First, quality costs are very large when quality activities time window is sacrificed to reduce the total cycle time. Secondly, failure costs are very large and managers should completely avoid these costs since there is no trade off that exists with these costs. Another important issue is the fact that using simulation to measure and understand cost of quality has provide managers with opportunity to rank their process in terms of cost of quality and the cost consequences that resulted from adopting a specific set of strategies .
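The trade-off between the {2, 2} and {4, 4} strategies can be sketched numerically. The following is a minimal illustration with entirely hypothetical cost parameters and defect rates (it is not the chapter's simulation model): end-of-line inspection lets defects absorb downstream processing before detection and surface as failure cost, while inspecting after every operation doubles the appraisal cost but removes the failure cost.

```python
# Illustrative sketch (hypothetical numbers): cost-of-quality totals for two
# inspection-strategy combinations in a two-stage line. Prevention cost is
# omitted for brevity; CoQ here = appraisal cost + failure cost.

def cost_of_quality(appraisal_per_unit, failure_per_defect,
                    defects_caught_late, inspections_per_unit, units):
    """Total cost of quality for one strategy combination."""
    appraisal = appraisal_per_unit * inspections_per_unit * units
    failure = failure_per_defect * defects_caught_late
    return appraisal + failure

units = 1000
defect_rate = 0.05   # hypothetical 5% defect rate per stage

# {2, 2}: a single end-of-line inspection; defects from both stages are
# caught late, so each one carries a high failure cost.
coq_22 = cost_of_quality(appraisal_per_unit=1.0, failure_per_defect=40.0,
                         defects_caught_late=units * defect_rate * 2,
                         inspections_per_unit=1, units=units)

# {4, 4}: inspection after each of the two operations; defects are removed
# at the source, eliminating failure cost at the price of doubled appraisal.
coq_44 = cost_of_quality(appraisal_per_unit=1.0, failure_per_defect=40.0,
                         defects_caught_late=0,
                         inspections_per_unit=2, units=units)

print(coq_22, coq_44)
```

With these assumed parameters the {4, 4} combination is cheaper overall, mirroring the qualitative ranking reported for Table 9; with a smaller failure cost per defect the ranking could reverse, which is exactly why the chapter evaluates the combinations by simulation rather than by assumption.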

### **7. Conclusions**

In this chapter, an analytical model reported in the literature for cost-of-quality computation was considered and modified to include an important component of the cost of quality. The model was then used to develop a simulation model for a two-stage manufacturing system. Quality control strategies common in the manufacturing community were applied in the simulation experiments to investigate their impact on the cost of quality. The results indicate that some of these practiced strategies, when combined with the detection periods, significantly increase the cost of quality. Furthermore, the results indicate that simulation can be used to understand and measure the cost of quality. An interesting avenue for further research is to use simulation to investigate the cost of quality in a real industrial application; the authors are currently considering such an idea.



**3** 


## **GLP: Good Laboratory Practice**

Isin Akyar

*Acibadem University Faculty of Medicine, Department of Medical Microbiology, Turkey* 

## **1. Introduction**

34 Modern Approaches To Quality Control

In the early 1970s, the FDA (United States Food and Drug Administration) became aware of cases of poor laboratory practice throughout the United States. The FDA decided to inspect over 40 toxicology laboratories in depth, and revealed a great deal of dishonest activity and many poor laboratory practices. Examples of the poor practices found were equipment that had not been calibrated to standard, and therefore gave wrong measurements; incorrect or inaccurate accounts of the actual laboratory study; and incompetent test systems. Although the term "good laboratory practice" may already have been used informally for some time in many laboratories around the world, GLP originated in the United States, and it has had a powerful effect worldwide.

## **2. History of Good Laboratory Practice (GLP)**

GLP is an official regulation that was created by the FDA in 1978. The OECD (Organisation for Economic Co-operation and Development) Principles of Good Laboratory Practice were first drawn up by an Expert Group on GLP set up in 1978 under the Special Programme on the Control of Chemicals. The GLP regulations published by the US Food and Drug Administration in 1976, which are accepted as international standards for non-clinical laboratory studies, supplied the basis for the work of the Expert Group, which was led by the United States and consisted of experts from the following countries and organisations: Australia, Austria, Belgium, Canada, Denmark, France, the Federal Republic of Germany, Greece, Italy, Japan, the Netherlands, New Zealand, Norway, Sweden, Switzerland, the United Kingdom, the United States, the Commission of the European Communities, the World Health Organisation and the International Organisation for Standardisation. Eventually, following the United States, other countries began introducing GLP regulations of their own (Lori et al., 2009).

2.1 These Principles of GLP were officially recommended for use in member countries by the OECD Council in 1981. They were adopted as an integral part of the Council Decision on Mutual Acceptance of Data in the Assessment of Chemicals, which states that "data generated in the testing of chemicals in an OECD member country in accordance with OECD Test Guidelines and OECD Principles of Good Laboratory Practice shall be accepted in other member countries for the purposes of assessment and other uses relating to the protection of man and the environment".

2.1.1 The work of the OECD associated with chemical safety is carried out in the Environmental Health and Safety Division, which publishes free-of-charge documents in six different series: Testing and Assessment; Principles on Good Laboratory Practice and Compliance Monitoring; Pesticides; Risk Management; Chemical Accidents; and Harmonization of Regulatory Oversight in Biotechnology.

2.1.2 Although there are many national guidelines setting out Good Laboratory Practice, the guideline most universally accepted is the regulation of GLP through the Principles of Good Laboratory Practice of the Organisation for Economic Co-operation and Development (OECD), since these have been discussed by an international panel of experts and agreed at an international level. They also form the basis for the OECD Council Decision/Recommendation on the Mutual Acceptance of Data in the Assessment of Chemicals, which has to be regarded as one of the cornerstone agreements amongst the OECD member states with regard to trade in chemicals and the removal of non-tariff barriers to trade. Besides the use of the OECD Guidelines for the Testing of Chemicals, the member states reaffirmed the application of the GLP Principles and the establishment of coordinated national GLP compliance monitoring programmes as necessary parts of the mutual acceptability of data. The working group of experts who had created the OECD Principles of Good Laboratory Practice also proceeded to prepare and publish guidance for the monitoring authorities on the introduction of the procedures essential for monitoring industry's compliance with these Principles, as well as guidance on the actual conduct of the necessary control activities, such as laboratory inspections and study audits (OECD, 1998).

2.1.3 Thus, the Principles of Good Laboratory Practice (GLP) have been developed to promote the quality and validity of test data used for determining the safety of chemicals and chemical products. The Principles are intended to be followed by test facilities carrying out studies to be submitted to national authorities for the purposes of assessment of chemicals and other uses relating to the protection of man and the environment. Good laboratory practice might be used to detect collusion, but it can also serve to protect the researcher from unfounded allegations. In this manner, the application of the basic rules of GLP can benefit even an institution or laboratory.

#### **2.1.4 Definition of GLP**

Quality is the capability to produce the same product, meeting the same specifications, systematically time after time. GLP was introduced to protect the integrity and quality of laboratory data used to support a product application. The definition of the term "Good Laboratory Practice" itself, which identifies GLP as "a quality system concerned with the organisational process and the conditions under which non-clinical health and environmental safety studies are planned, performed, monitored, recorded, archived and reported", can be considered an example of a brief and accurate definition. GLP describes good practices for non-clinical laboratory studies that support research or marketing approvals for FDA-regulated products (Seiler, 2005).

#### **2.1.5 Purpose of GLP**

Everyone makes mistakes; that is why GLP is needed. GLP principles are a good idea even if you are not required to follow the standards. There are some simple rules: say what you do (with written standard operating procedures), do what you say (follow the procedures), and be able to prove it (with good record keeping) (Jean Cobb, 2007).

2.1.6 The purpose of the principles of good laboratory practice (GLP) is to support the development of quality and validity of test data used for determining the safety of chemicals and chemical products (Clasby, 2005).

Hence GLP aims to decrease the occurrence of mistakes or mix-ups through extensive and specific labelling requirements. The recorded information makes it possible to demonstrate that the correct item was applied, in the stated amounts, to the pertinent test systems.

2.1.7 GLP experience is important to employers in some cases. An employer may find it useful if you have practical experience of working on a study according to the GLP principles.

Good planning is the greater half of success. With a clear purpose in mind and a well thought-out, defined testing procedure, it is achievable to obtain an evaluable outcome from a study. GLP places a high degree of reliance upon creating and following a pre-defined study plan.

## **2.2 The principles of good laboratory practice**

Good Laboratory Practice is based on four principles: the Management; the Quality Assurance; the Study Director; and the National Compliance Monitoring Authority. All of them serve important functions in the coordinated performance and monitoring of safety studies, and it should be kept in mind that all of them are required for GLP to achieve quality data.

2.2.1 Although GLP differs from other quality systems in aspects that are important not only for the traceability of data but especially for the full reconstructability of the study, there are certain commonalities between GLP and other quality systems, such as accreditation schemes (Seiler, 2005).

2.2.2 The aim of this chapter is to give detailed information about GLP, covering the test facility organisation and personnel; the quality assurance programme; the facilities for the test system, archive and waste disposal; apparatus, materials and reagents; physical, chemical and biological test systems; receipt, handling, sampling, storage and characterisation of the test and reference items; standard operating procedures; performance of the study; reporting of study results; and storage and retention of records and materials.

2.2.3 The concerns of the chapter may be summarized as follows:

1. Test facility management
2. Quality assurance programme
3. Meeting the requirements of the test facility
4. Equipment
5. Receipt, handling, sampling and storage
6. Standard operating procedures
7. Performance of the study
8. Reporting of study results
9. Storage and retention of records and materials

### **3. Test facility management**

Test facility means the persons, premises and operational units that are necessary for conducting the non-clinical health and environmental safety study.

3.1 The term "test facility" may include several "test sites", at one or more geographical locations, where phases or components of a single overall study are conducted. It does not only include buildings, rooms and other premises; it also includes the people who work there and are responsible for performing these studies (Seiler, 2005). For multi-site studies the test facility comprises the site at which the Study Director is located and all

individual test sites, which individually or collectively can be considered to be test facilities. The test facility should be of appropriate size, construction and location to meet the requirements of the study, and should be designed so that valid results can be obtained with confidence. The different test sites within a test facility may include research laboratories where test/reference item characterisation (determination of identity, purity/strength, stability and other related activities) is conducted; one or more agricultural or other indoor or outdoor sites where the test or control item is applied to the test system; and, in some cases, a processing facility where collected commodities are treated to prepare other items, and where collected specimens are analysed for chemical or biological residues or are otherwise evaluated (OECD, 1998).

3.1.6 Test facility management should guarantee that these Principles of GLP are applied in its test facility. The general requirements for GLP consist of appropriately qualified personnel; adequate resources; appropriate procedures for sanitation, health precautions, clothing, test protocol development, test methods, data analysis and report development; an appropriately qualified study director; and a quality assurance function. Test site management should be aware that the test facility management may be liable to inspection by the national GLP compliance monitoring authority of the country in which the test site is located.

3.1.7 "The Study Director has overall responsibility for the technical conduct of the study, as well as for the interpretation, analysis, documentation, and reporting of results, and represents the single point of study control" (OECD, 1998).

3.1.8 The GLP Principles are designed to avoid factors that would endanger the reconstructability of a study by giving the sole and final responsibility for the GLP-compliant conduct of a study to one single person, the Study Director. For each non-clinical laboratory study, a scientist or other professional of appropriate education, training, and experience should be identified as the Study Director.

3.1.9 The Study Director has to be aware of all circumstances that might affect the quality and integrity of a study. There should be communication between the Study Director and other personnel, including all scientists involved in the conduct of the study, so that he is kept abreast of developments and is able to act, as appropriate, on unforeseen events. All information has to be passed to the Study Director, and he should make, or at least acknowledge, all decisions. In special circumstances where the Study Director cannot exercise immediate control, the responsibilities of a Study Director may be extended to other individuals, such as specialised scientists (Seiler, 2005).

3.1.10 When the Study Director cannot exercise immediate supervision, study procedures at each test site may be controlled by a member of the staff called the Principal Investigator: an individual responsible for the conduct of certain defined phases of the study, acting on behalf of the Study Director. The Study Director retains the final responsibility for the overall quality and integrity of the study and cannot share this responsibility with any other individual involved in it. Although the Principal Investigator takes responsibility for the defined, delegated part of the study, he is not responsible for the study plan, and he cannot approve any amendments to it. General management must have a firm understanding and working agreement with the test site management as to how and by whom the Quality Assurance Programme (QAP) will be carried out (OECD, 1998).

3.1.11 Approved original and revised Standard Operating Procedures should be used in the studies. There should be a Quality Assurance Programme with assigned personnel, and for each study an individual with the proper qualifications, training, and experience should be designated by management as the Study Director before the study is initiated. Personnel should clearly understand the functions they are to carry out, and training should be provided where needed. Standard Operating Procedures should be established and followed, and should be appropriate and technically valid.

3.1.12 The GLP Compliance Statement signed by the Study Director in the final study report is the declaration that gives the regulatory authority the guarantee of an appropriately performed, valid study, whose results and conclusions can be trusted to reflect the real data obtained in the study (Seiler, 2005).

individual test sites, which individually or collectively can be considered to be test facilities. The test facility should be of appropriate size, construction and location to meet the requirements of the study, and it should be designed so that the validity of the results it produces can be relied upon. The test sites within a test facility can include: research laboratories where test/reference item characterisation (determination of identity, purity/strength, stability, and other related activities) is conducted; one or more agricultural or other indoor or outdoor sites where the test or control item is applied to the test system; and, in some cases, a processing facility where collected commodities are treated to prepare other items, or where collected specimens are analysed for chemical or biological residues or are otherwise evaluated (OECD, 1998).

3.1.1 The properties of biological test systems are generally more complex and mutable than those of physical/chemical test systems. Biological test systems therefore need very careful characterisation in order to guarantee the quality and integrity of the data derived from them. The outcome of a study may be influenced by the state and condition of the test system at the time of the study, which is of special importance with regard to reconstructability. The GLP Principles, in stating the requirements for the accommodation and siting of these systems, for their maintenance and utilisation, and for the associated documentation, aim at supplying the essential basis for confidence in the results obtained from biological test systems.

A test item should only be used in studies if it can safely be regarded as pure, unspoiled and not decomposed. Any change in the properties of the test item may lead to spurious and erroneous results, and to wrong interpretations of the effects the test item is supposed to have produced. Stability testing leads to the definition of a time interval within which the test item will remain in this state; as a result, "expiry" or "re-analysis" dates have to be stated on the label of the test item container. With this requirement, GLP aims to reduce the possibility that an item will be used in a study which no longer corresponds to the item that had been intended for testing. The aim of any safety testing is to analyse possible effects of the test item on the test system. Therefore, the effects observed in any test system should be traceable to the application of the item which was the designated subject of the study.

3.1.2 In order to establish this, even retrospectively, after the conduct of the respective safety test, the documentation on the test item has to fulfil a number of requirements:

3.1.3 There must be documented proof that the item that had been intended to be tested indeed reached the sensitive parts of the test system, confirming that the observed effects were really caused by the test item, and that the application of this item to man or the environment would therefore not be expected to result in any effects other than those which can be inferred from the ones observed in the test systems utilised. "Tidiness" is a crucial point with regard to the general requirements on the test facility. When the laboratory bench is filled up with clean and dirty instruments and glassware, some of which are in use and some not, it is not easy to locate all the materials needed for a specific activity.

3.1.4 Tidiness therefore serves both to inspire trust in the quality of the work performed and to facilitate the performance of the daily activities according to the quality standards. Tidiness also makes it easier to survive a compliance monitoring inspection: even under stress, the technician can find the folder with the SOPs at once instead of having to hunt for it like buried treasure.

3.1.5 A test facility needs a Management, a Study Director, a Quality Assurance Unit, study personnel and a person responsible for the archives (Seiler, 2005).


3.1.6 Test facility management should guarantee that these Principles of GLP are complied with in its test facility. The general requirements for GLP comprise: appropriately qualified personnel; adequate resources; appropriate procedures for sanitation, health precautions, clothing, test protocol development, test methods, data analysis and report development; an appropriately qualified Study Director; and a quality assurance function. Test site management should be aware of the fact that the test facility may be liable to inspection by the national GLP compliance monitoring authority of the country in which the test site is located.

3.1.7 "The Study Director has overall responsibility for the technical conduct of the study, as well as for the interpretation, analysis, documentation, and reporting of results, and represents the single point of study control" (OECD, 1998).

3.1.8 The GLP Principles are designed to avoid factors that would endanger the reconstructability of a study by assigning sole and final responsibility for the GLP-compliant conduct of a study to one single person, the Study Director. For each non-clinical laboratory study, a scientist or other professional of appropriate education, training, and experience should be identified as the Study Director.

3.1.9 The Study Director has to be aware of all circumstances that might affect the quality and integrity of a study. There should be communication between the Study Director and other personnel, including all scientists involved in study conduct, so that he is kept abreast of developments in the study and can act, as considered appropriate, on unforeseen developments. All information has to be passed to the Study Director, and he should make, or at least acknowledge, all decisions. In special circumstances where the Study Director cannot exercise immediate control, some of his responsibilities may be extended to other individuals such as specialised scientists (Seiler, 2005).

3.1.10 When the Study Director cannot exercise immediate supervision, study procedures at each test site may be controlled by a member of the staff called the Principal Investigator: an individual responsible for the conduct of certain defined phases of the study, acting on behalf of the Study Director. The Study Director retains final responsibility for the overall quality and integrity of the study and cannot share this responsibility with any other individual involved in the study. Nonetheless, the Principal Investigator takes responsibility for the defined, delegated part of the study; he is not responsible for the study plan and cannot approve any amendments to it. General management must have a clear understanding and working agreement with the test site management as to how and by whom the Quality Assurance Programme (QAP) will be carried out (OECD, 1998).

3.1.11 Approved original and revised Standard Operating Procedures should be used in the studies, and there should be a Quality Assurance Programme with assigned personnel. For each study, an individual with the proper qualifications, training, and experience should be designated by management as the Study Director before the study is initiated. Personnel should clearly understand the functions that they are to carry out, and training should be provided where needed. Standard Operating Procedures should be established and followed, and they should be appropriate and technically valid.
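The requirement that only approved, current SOP revisions reach the studies can be pictured as a lookup that hides drafts and superseded copies. The registry layout and the `current_sop` helper below are purely illustrative assumptions:

```python
# Hypothetical SOP registry sketch: studies must only ever receive the
# latest *approved* revision of an SOP; drafts stay invisible to users.
sops = [
    {"id": "SOP-007", "rev": 1, "approved": True},
    {"id": "SOP-007", "rev": 2, "approved": True},
    {"id": "SOP-007", "rev": 3, "approved": False},  # draft, not yet approved
]

def current_sop(registry, sop_id):
    """Return the highest approved revision of the requested SOP."""
    approved = [s for s in registry if s["id"] == sop_id and s["approved"]]
    if not approved:
        raise LookupError(f"no approved revision of {sop_id}")
    return max(approved, key=lambda s: s["rev"])
```

With the sample data above, `current_sop(sops, "SOP-007")` yields revision 2: the unapproved revision 3 is never handed out, mirroring the rule that only approved originals and revisions may be used.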

3.1.12 The GLP Compliance Statement signed by the Study Director in the final study report is the declaration that gives the regulatory authority the guarantee of an appropriately performed, valid study, whose results and conclusions can be trusted to reflect the real data obtained in the study (Seiler, 2005).

GLP: Good Laboratory Practice 41


## **4. Quality assurance programme**

Quality control comprises the processes, procedures and authority used to accept or reject all components, drug product containers, closures, in-process materials, packaging materials, labelling and drug products, together with the authority to review production records to assure that no errors have occurred or, if errors have occurred, that they have been fully investigated. The quality and reliability of test data depend on the state and condition of the test system used in their production.

This means the control of a number of technical features and specifications which are needed to ensure the integrity of the system and the quality of the data generated. In a study for compliance with GLP, the most important aspects may be characterised as "suitability", "capacity" and "integrity" (OECD, 1998).

4.1 "Trust is good, control is better", says an old proverb. The quality which is supposed to be achieved under GLP is not a quality which can be controlled by simple numerical or other means; what is required is control over the intrinsic quality of a test facility and its studies. Only through this independence can a reliable assurance of the studies' inherent quality be achieved (Seiler, 2005).

4.1.1 The test facility should have a documented Quality Assurance Programme to guarantee that the studies performed comply with these Principles of Good Laboratory Practice. The Quality Assurance Programme should be carried out by an individual or individuals designated by, and directly responsible to, management. These staff should be familiar with the test procedures and should not be involved in the conduct of the study being assured (OECD, 1998). When parts or "phases" of a study are delegated, it must be made clear, through the terms of appointment of the Contributing Scientist or the Principal Investigator, what the exact area of responsibility of the designated individual is and what exactly is to be done at those test sites where such "phases" are conducted (Seiler, 2005).

4.1.2 There should be a full, frank flow of information among the Study Director, as the person responsible for the overall conduct of the study, the responsible test site management, the responsible Principal Investigator(s), the Study Director's management, and the latter's Quality Assurance Programme. In the same way, effective communication from the Study Director and/or Principal Investigators to the quality assurance personnel is essential for the notification of critical activities. Because of the complex nature of field studies, and the fact that the exact time of certain activities will depend upon local weather or other conditions, flexible quality assurance procedures may be required. The geographical spread of test sites may mean that quality assurance personnel will also need to manage language differences in order to communicate with local study personnel, the Study Director, Principal Investigators and test site management. Independently of the test sites, the written reports of quality assurance personnel must reach both management and the Study Director, and receipt of those reports by management and the Study Director should be documented in the raw data.

4.1.3 The Quality Assurance personnel should be responsible for maintaining copies of all approved study plans and Standard Operating Procedures in use in the test facility; for having access to an up-to-date copy of the master schedule; for verifying that the study plan contains the information required for compliance with these Principles of Good Laboratory Practice; and for conducting inspections to determine whether all studies are conducted in accordance with these Principles. Inspections should also determine that study plans and Standard Operating Procedures have been made available to study personnel and are being followed. The study plan allows Quality Assurance: to monitor compliance of the


study plan with GLP; to assess the clarity and consistency of the study plan; to identify the critical phases of the study; and to plan a monitoring programme in relation to the study (OECD, 1998).
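QA's verification that a study plan contains the required information is, at bottom, a completeness check against a fixed list. A minimal sketch follows; the field names in `REQUIRED_FIELDS` are illustrative assumptions, not a normative GLP enumeration:

```python
# Hypothetical study-plan completeness check: QA flags any required
# field that is absent or empty before the study starts.
REQUIRED_FIELDS = {"title", "study_director", "test_item", "test_system",
                   "proposed_dates", "methods"}

def missing_fields(study_plan):
    """Return the set of required fields absent or empty in the plan."""
    return {f for f in REQUIRED_FIELDS
            if f not in study_plan or not study_plan[f]}

plan = {"title": "28-day study", "study_director": "A. N. Other",
        "test_item": "X-123", "test_system": "rat", "methods": "per SOP"}
# 'proposed_dates' is not filled in, so QA would flag exactly that field.
```

A check of this shape supports the monitoring tasks listed above: clarity and consistency still need human review, but missing information can be caught mechanically.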

4.1.4 Inspections should determine whether study plans and Standard Operating Procedures have been made available to study personnel and are being followed. Audits of final reports should confirm that the methods, procedures, and observations are accurately and completely described, and that the reported results accurately and completely reflect the raw data of the studies.

4.1.5 Inspection of facilities and experimental activities is one of the tools of Quality Assurance for ascertaining and guaranteeing continued adherence to the rules of GLP in a test facility and within the studies performed. Since it is recognised that randomly conducted inspections are sufficient to ensure compliance, the GLP Principles do not call for fixed, continuous supervision. These inspections should cover those parts of a study that have particular importance for the validity of the data and the conclusions to be drawn therefrom, or where deviations from the rules of GLP would most severely affect the integrity of the study. Quality Assurance thus has to find a balance in its inspectional activities, evaluating the study type and its "critical phases", in order to achieve a well-supported view of GLP compliance at the test facility and within the studies conducted. It is clear that any deviations from the rules of GLP observed in these inspections should be corrected.

The audit of the final report serves to ascertain the quality and integrity of the specific study, with its detailed assessment of GLP compliance throughout the study and its concomitant review of all relevant information, records and data. It is the responsibility of management to provide policies, guidelines, or procedural descriptions to ensure that this statement reflects Quality Assurance's acceptance of the Study Director's GLP compliance statement. The Quality Assurance statement has two functions: it demonstrates that Quality Assurance has adequately monitored the conduct and progress of the study, from the first check of the study plan for GLP conformity to the audit of the final report, giving a "second opinion" on the completeness of the reporting and the adequacy of raw data coverage; and it provides the study with a seal of approval by attesting to its GLP-compliant conduct. Thus, the Quality Assurance statement has particular importance for the assessment of the study's integrity and validity. The Quality Assurance statement should show that the study report accurately reflects the study's raw data.

4.1.6 Before signing the Quality Assurance statement, Quality Assurance should ensure that all issues raised in the Quality Assurance audit, i.e. in the audit report to the Study Director and to management, have been addressed through appropriate changes to the final report, that all agreed actions have been completed, and that no additional changes have been made to the report which would require a further report audit. Through management policy it should be made clear that the Quality Assurance statement will only be completed if the Study Director's claim to GLP compliance can be supported (Seiler, 2005).
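The pre-signature conditions above (every audit finding resolved, no post-audit changes to the report) can be sketched as a simple gate. Comparing report versions by hash and the shape of the `findings` records are assumptions of this illustration, not part of the GLP Principles:

```python
import hashlib

def may_sign_qa_statement(findings, audited_report, final_report):
    """QA may sign only if all audit findings are closed and the final
    report is byte-identical to the version that was audited."""
    all_closed = all(f["status"] == "closed" for f in findings)
    unchanged = (hashlib.sha256(audited_report).hexdigest()
                 == hashlib.sha256(final_report).hexdigest())
    return all_closed and unchanged
```

Any late edit to the report changes its hash, forcing a further report audit before the statement can be completed, which is exactly the policy described above.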

Laboratories use various supplied materials in studies conducted in compliance with the GLP Principles. Suppliers have attempted to produce products which help users satisfy their obligations as set out in the GLP Principles.

4.1.7 Accreditation can be especially useful to suppliers. Accreditation schemes often monitor members' implementation of national and international standards; thus, a supplier's or manufacturer's accreditation certificate may signify to the customer the satisfactory implementation of a standard, in addition to other aspects of accreditation.

GLP: Good Laboratory Practice 43

have a sufficient and suitable number of rooms or areas to assure the isolation of test systems and the isolation of individual projects, involving substances or organisms known to be or suspected of being biohazardous. There should be storage rooms or areas as needed for supplies and equipment and should provide adequate protection against infestation, contamination, and/or deterioration. Facilities for handling test and reference items should be planned. To prevent contamination or mix-ups, there should be separate rooms or areas for receipt and storage of the test and reference items, and mixing of the test items with a

Fig. 1. There should be separate working areas in the laboratory.

maintained & calibrated (Clasby, 2005).

5.1 Handling and disposal of wastes should be carried out in such a way as not to risk the integrity of studies. This includes provision for appropriate collection, storage and disposal facilities, and decontamination and transportation procedures. This policy is to assure that reagents used are specified in the standard operating procedure. Purchasing and testing should be handled by a quality assurance program. Reagents and solutions should be labeled, deteriorated or outdated reagents and solutions should not be used. The opening date should be recorded. They should be stored under ambient temperature and the expiration date should be considered(Lori et al , 2009).The equipments should be appropriately designed, adequate throughput capacity, appropriately located and routinely

5.1.1 In order to guarantee the quality of the data, appropriate conditions should be established and maintained for the storage, housing, handling and care of biological test systems. At the experimental starting date of a study, test systems should be free of any disease or condition that might interfere with the purpose or conduct of the study. If necessary to maintain the integrity of the study, test systems that become diseased or

vehicle(Figure 1).

4.1.8 It is recommended that suppliers seek membership, where feasible and/or appropriate, in national accreditation schemes. Although accreditation is a useful complementary tool to support compliance with the GLP Principles, it is not an acceptable alternative to GLP compliance nor will it lead to international recognition in the context of meeting the requirements for the mutual acceptance of data as set out in the OECD Council Acts. (OECD, 1998).

As an example ISO 17025 and GLP comparison can be considered (Table 1).


| ISO 17025 | GLP |
|---|---|
| ISO members | OECD members |
| The same standard for all ISO | Different regulations in different countries |
| Designed for repetitive studies | Designed for single studies |
| Description of quality system in Quality Manual | Description of quality system in SOPs |
| General statements for responsibilities of personnel | Very specific responsibilities of personnel |
| No study plans required (standardized methods should be used) | Study plan required for each study |
| Written operating procedures without specific format | SOPs with detailed requirements for format and content |
| Analysis methods must be verified through inter-laboratory tests (proficiency testing) | Validation through inter-laboratory tests not required |
| Storage of test samples and data until client accepts results | Storage of test samples according to local regulatory requirements |
| No specific requirements for storage of records and reports | Specific requirements for storage, retention and archiving |
| Documented complaints procedures | In case of problems, only course of law |

(Fox, 2011)

Table 1. ISO 17025 and GLP comparison.

## **5. Meeting the requirements of the test facility**

The GLP principles do not address the question of the specific requirements for the location of an archive, except that it should be "of suitable size, construction and location to meet requirements". There is therefore complete freedom for every test facility to define the location of its archives and to designate the proper locations for each type of material to be stored (Seiler, 2005). Before they can be considered GLP compliant, facilities need to conform to a number of general rules. The facilities should be designed to suit the studies that are to be performed within them. Some comfort for the employees comes, of course, with the requirement of study quality: the people working in a facility should have sufficient room to move around in order to perform the duties which the study calls for, and to perform them in a manner compatible with the quality, integrity and validity of the study. This is acknowledged in the general requirement that a test facility should be of appropriate size, construction and location, both to meet the requirements of the study and to minimise disturbance that would interfere with the validity of the study. The test facility should

have a sufficient and suitable number of rooms or areas to assure the isolation of test systems and the isolation of individual projects involving substances or organisms known or suspected to be biohazardous. There should be storage rooms or areas as needed for supplies and equipment; these should provide adequate protection against infestation, contamination and/or deterioration. Facilities for handling test and reference items should be planned: to prevent contamination or mix-ups, there should be separate rooms or areas for the receipt and storage of the test and reference items, and for the mixing of the test items with a vehicle (Figure 1).

Fig. 1. There should be separate working areas in the laboratory.

5.1 Handling and disposal of wastes should be carried out in such a way as not to put the integrity of studies at risk. This includes provision for appropriate collection, storage and disposal facilities, and for decontamination and transportation procedures. This policy is to assure that the reagents used are those specified in the standard operating procedures, and that purchasing and testing are handled by a quality assurance programme. Reagents and solutions should be labelled; deteriorated or outdated reagents and solutions should not be used. The opening date should be recorded, reagents should be stored at ambient temperature where appropriate, and the expiration date should be observed (Lori et al., 2009). Equipment should be appropriately designed, have adequate throughput capacity, be appropriately located, and be routinely maintained and calibrated (Clasby, 2005).
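The reagent rules above can be sketched as a simple pre-use check. This is a minimal illustration, not a prescribed GLP data model; the `Reagent` record and its field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Reagent:
    """Minimal reagent label record; field names are illustrative only."""
    identity: str
    opened_on: date          # the opening date must be recorded
    expires_on: date
    deteriorated: bool = False


def usable(reagent: Reagent, today: date) -> bool:
    """Deteriorated or outdated reagents and solutions must not be used."""
    if reagent.deteriorated:
        return False
    if today > reagent.expires_on:
        return False
    return True


# Example: an expired buffer is rejected even if otherwise intact.
buffer = Reagent("phosphate buffer pH 7.4", date(2024, 1, 10), date(2024, 6, 1))
```

Such a check would typically run whenever a reagent is taken from storage, so that an out-of-date label blocks use before the study is affected.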

5.1.1 In order to guarantee the quality of the data, appropriate conditions should be established and maintained for the storage, housing, handling and care of biological test systems. At the experimental starting date of a study, test systems should be free of any disease or condition that might interfere with the purpose or conduct of the study. If necessary to maintain the integrity of the study, test systems that become diseased or injured during the course of a study should be isolated and treated. Any diagnosis and treatment of any disease before or during a study should be recorded. Records of source, date of arrival, and arrival condition of test systems should be maintained. Biological test systems should be acclimatised to the test environment for an adequate period before the first administration/application of the test or reference item. All information needed to properly identify the test systems should appear on their housing or containers. Individual test systems that are to be removed from their housing or containers during the conduct of the study should bear appropriate identification wherever possible. During use, housing or containers for test systems should be cleaned and sanitised at appropriate intervals. Any material that comes into contact with the test system should be free of contaminants at levels that would interfere with the study. Bedding for animals should be changed as required by sound husbandry practice. Use of pest control agents should be documented. Test systems used in field studies should be located so as to avoid interference in the study from spray drift and from past usage of pesticides (OECD, 1998).

5.1.2 The important principles can be summarised as follows:

There should be a unique identification for the study and all of its parts. All original observations in a study should be clearly and legibly recorded at once. The recording should be permanent, and corrections should be made so as not to obscure the original entry; for every correction the respective reason has to be provided. All records should be in the form of bound notebooks or continuously numbered sheets. All entries, and corrections to them, should be dated and initialled. Records related to the test system itself should be gathered and preserved. Specimens should be clearly identified so as to allow full traceability. At the end of a study, all raw data should be assembled, catalogued and archived. Archiving should provide for the secure storage of all raw data, samples and specimens, together with any other documents such as the study plan and study report (Seiler, 2005).

#### **6. Equipment**

6.1 Equipment, including validated computerised systems, used for the generation, storage and recovery of data, and for controlling environmental factors relevant to the study, should be suitably located and of appropriate design and adequate capacity. Equipment records should include: the name of the equipment and its manufacturer, the model or type for identification, the serial number, the date the equipment was received in the laboratory, and a copy of the manufacturer's operating instructions. Equipment used in a study should be periodically inspected, cleaned, maintained and calibrated according to Standard Operating Procedures, and records of these activities should be maintained. Calibration should be traceable to national or international standards of measurement. Instrumentation validation is a process necessary for any analytical laboratory: data produced by "faulty" instruments may give the appearance of valid data. The frequency of calibration, re-validation and testing depends on the instrument and the extent of its use in the laboratory. Chemicals, reagents and solutions should be labelled to indicate identity, expiry date and specific storage instructions, and information concerning source, preparation date and stability should be available. The expiry date may be extended on the basis of documented evaluation or analysis. If a mistake is made, the original data should not be obscured; instead, a single strikeout should be drawn, a reason code added, and the correction dated. Whenever an instrument's performance is outside the "control limits", reports must be discontinued (Cobb, 2007). Equipment and materials used in a study should not interfere adversely with the test systems (OECD, 1998).
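The equipment record fields listed above, together with a due-date check, can be sketched as follows. The record structure and the fixed-interval rule are illustrative assumptions; as noted later in 6.1.4, GLP itself prescribes no specific interval.

```python
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class EquipmentRecord:
    """Fields follow the GLP list above; the class itself is illustrative."""
    name: str
    manufacturer: str
    model: str
    serial_number: str
    received_on: date
    calibrations: list[date]   # dates of documented calibration events


def calibration_due(rec: EquipmentRecord, today: date, interval_days: int) -> bool:
    """True if calibration is overdue; the interval is equipment-specific."""
    if not rec.calibrations:
        return True   # never calibrated: due by definition
    return today - max(rec.calibrations) > timedelta(days=interval_days)


balance = EquipmentRecord("analytical balance", "ACME", "AB-204", "SN-1234",
                          date(2023, 1, 5), [date(2024, 1, 10)])
```

A maintenance log built this way also answers the 6.1.3 questions below (who, when, with what equipment) from one record.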

6.1.1 Equipment used for the generation of physical/chemical data should be suitably located and of proper design and adequate capacity, and the integrity of the physical/chemical test systems should be ensured. Appropriate conditions should be established and maintained for the storage, housing, handling and care of biological test systems, in order to ensure the quality of the data. Standardization, calibration and verification are terms of particular importance for laboratory equipment, and the difference between them should be well understood and applied by the laboratory personnel. Verification is the external check of equipment accuracy, for example checking balance accuracy against standard weights in the laboratory; no adjustment is made.

6.1.2 In calibration, equipment is adjusted on the basis of a comparison with certified or known reference materials; for example, a balance is adjusted by a trained professional after comparison with certified weights. Standardization is carried out by comparison with similar equipment, such as using two thermometers of similar design to compare readings.

6.1.3 While monitoring the study, laboratory staff should always keep the following questions in mind: Was the equipment functioning properly? Who performed the work, on what date, and with what specific parameters? Was there a problem? How was the problem fixed? Were there any problems with the reagents and solutions?

Fig. 2. Laboratory equipment should routinely be maintained and calibrated.

6.1.4 The GLP Principles do not suggest or require any specific time intervals for such activities. Cleaning and maintenance intervals may differ from one type of equipment to another, and may also depend on the frequency of use or the workload imposed on the respective equipment. The question of the correct frequency of such activities should therefore be considered a scientific one, calling for the expert judgement of the responsible scientists.


6.1.5 Generally the manufacturer's manuals provide useful indications for cleaning and maintenance intervals. The same applies to calibration frequencies: in some cases calibration is routinely performed before each measurement, while in other cases the respective frequencies may be set more arbitrarily. The key point in choosing maintenance and calibration frequencies is the necessary assurance of data validity, and in some cases it will be necessary to ensure the traceability of the calibrations performed to "national or international standards of measurement". The results of a study can be relied on only as far as the study itself has been appropriately conducted, and the suitability of apparatus, materials and reagents is one of the key points in this judgement.

Computerised systems have taken over an ever increasing part of the tasks in various areas of our daily lives. They are used during the planning, conduct and reporting of studies for a variety of purposes, including the direct or indirect capture of data from automated instruments; the recording, processing, reporting, general management and storage of data; and controlling and steering functions in numerous kinds of equipment. For these different activities, computerised systems can be of varying complexity, from a simple microchip-controlled instrument up to a complex laboratory information management system (LIMS) with multiple functions. Whatever the scale of computer involvement, the GLP Principles have to be followed. The correct application of the GLP Principles to computerised systems may, however, pose some problems, which stem at least in part from the very origins of GLP. All computerised systems used for the generation, measurement or assessment of data intended for regulatory submission should be developed, validated, operated and maintained in ways which are compliant with the GLP Principles. Appropriate controls for security and system integrity must also be adequately addressed during the whole life cycle of any computerised system (Seiler, 2005).

6.1.6 All equipment used in a GLP context has to satisfy the specified requirements of its users. For computerised systems, the evidence of suitability is provided by the validation procedure. This has to start with an exact definition of the user requirements, which subsequently have to be translated into proof of adequate operation of the system in its actual environment. This prospective validation provides assurance that the computerised system will perform the tasks it is designed to execute in a correct, reproducible and reconstructable way.

6.1.7 Computerised systems associated with the conduct of studies bound for regulatory submission should be of appropriate design, adequate capacity and suitable for their intended purposes. There should be appropriate procedures to control and maintain these systems, and the systems should be developed, validated and operated in a way which is in compliance with the GLP Principles. The demonstration that a computerised system is suitable for its intended purpose is of fundamental importance and is referred to as computer validation. The validation process provides a high degree of assurance that a computerised system meets its pre-determined specifications. Validation should be undertaken by means of a formal validation plan and performed prior to operational use. (OECD, 1998).


6.1.8 Whether a system has been fully and prospectively validated or has just been retrospectively evaluated and qualified, its validation status needs continued maintenance to ensure the continued validity of the data. This is accomplished through formal procedures that require any changes to the system to be fully documented. Data integrity will, however, depend not only on the validation status of the system but also, to a very important extent, on the security measures developed for the utilisation of the system. Through the requirement of documented security procedures for the protection of hardware, software and data from corruption, unauthorised modification, or loss, GLP intends to provide for continuous data integrity. In general terms, security issues can be divided into measures of physical security, i.e. measures that can be instituted at the facility and apparatus level, and logical security, i.e. those that are related to software security at the access level (Seiler, 2005).
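The change-documentation requirement can be sketched as a guard on a validated system's change log: an undocumented change is rejected rather than silently applied. This is a hypothetical model of the principle, not a feature of any particular LIMS.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass(frozen=True)
class Change:
    description: str
    authorised_by: str
    applied_on: date


@dataclass
class ValidatedSystem:
    """Illustrative: every change must be fully documented before acceptance."""
    name: str
    changes: list[Change] = field(default_factory=list)

    def apply_change(self, description: str, authorised_by: str,
                     applied_on: date) -> None:
        if not description or not authorised_by:
            raise ValueError("changes to a validated system must be fully documented")
        self.changes.append(Change(description, authorised_by, applied_on))


lims = ValidatedSystem("LIMS")
lims.apply_change("upgrade report module to v2.1", "QA head", date(2024, 5, 6))
```

In practice such a log is what lets an inspector reconstruct the validation status of the system over its whole life cycle.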

6.1.9 The physical location of computer hardware, peripheral components, communications equipment and electronic storage media should be considered. Extremes of temperature and humidity, dust, electromagnetic interference and proximity to high-voltage cables should be avoided unless the equipment is specifically designed to operate under such conditions. Consideration must also be given to the electrical supply for computer equipment and, where appropriate, to back-up or uninterruptible supplies for computerised systems whose sudden failure would affect the results of a study. Adequate facilities should be provided for the secure retention of electronic storage media (OECD, 1998).

6.1.10 For various reasons, every test facility may contain computerised systems which have not been formally validated. If their use in a GLP environment is still required, clear proof of their suitability can only be obtained through an evaluation of their past and actual performance. To ensure reconstructability and transparency, this proof has to be planned and documented, resulting in a final conclusion on the past, present and future suitability of the respective system. In this way GLP aims at providing evidence of the correct functioning of the computerised system and at estimating the extent of its GLP compliance.

#### **7. Receipt, handling, sampling and storage**

Sample tracking procedures vary among laboratories. Receipt, handling, sampling and storage should be organised appropriately. Records including test item and reference item characterisation, date of receipt, expiry date, and quantities received and used in studies should be maintained. Handling, sampling and storage procedures should be identified so that homogeneity and stability are assured to the degree possible and contamination or mix-up is precluded (Seiler, 2005). These procedures should maintain the unmistakable connection between a set of analytical data and the samples from which they were obtained; the original source of samples must be recorded and unmistakably connected with the set of analytical data (Cobb, 2007). Storage container(s) should carry identification information, the expiry date, and specific storage instructions.
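The "unmistakable connection" between a sample and its analytical data amounts to a shared identifier that can be traced in both directions. A minimal sketch, with hypothetical record types and example values:

```python
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class Sample:
    sample_id: str     # identification carried on the storage container
    source: str        # the original source must be recorded
    received_on: date
    expires_on: date


@dataclass(frozen=True)
class AnalyticalResult:
    sample_id: str     # the unmistakable link back to the sample
    measurement: str


def trace(results: list[AnalyticalResult], samples: list[Sample],
          sample_id: str) -> tuple[Sample, list[AnalyticalResult]]:
    """Reconnect a set of analytical data to the sample it came from."""
    sample = next(s for s in samples if s.sample_id == sample_id)
    return sample, [r for r in results if r.sample_id == sample_id]


samples = [Sample("S-07", "field plot 3", date(2024, 4, 2), date(2025, 4, 2))]
results = [AnalyticalResult("S-07", "residue 0.12 mg/kg")]
sample, linked = trace(results, samples, "S-07")
```

Because the identifier travels with both the container and the data set, either record can be used to reconstruct the other during an audit.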

7.1 Receipt and storage areas for specimens must be separate from storage areas for pesticide formulations and other test or reference items. Areas used for specimen and sample preparation, instrumentation, calibration of sprays, reference standard preparation, and for washing glassware should be adequately isolated from each other and from other functions of the laboratory which might introduce contamination. Storage areas for test and reference items at all test sites should be environmentally monitored, if required, to assure conformance with established stability limits for these materials. Test and reference items should not be placed in the same storage containers with collected test system specimens and other materials of low concentrations which are being stored for shipment to the analytical laboratory or to off-site archives. There should be adequate storage and disposal facilities available for pesticide and related wastes such that there is no potential for cross-contamination of test systems, of test or reference items, or of collected specimens (OECD, 1998). Storage container(s) should carry identification information, expiry date, and specific storage instructions. Each test and reference item should be properly identified. For each study, the identity, including batch number, purity, composition, concentrations, or other characteristics to appropriately define each batch of the test or reference items, should be known. In cases where the test item is supplied by the sponsor, there should be a mechanism, developed in co-operation between the sponsor and the test facility, to verify the identity of the test item subject to the study. The stability of test and reference items under storage and test conditions should be known for all studies. If the test item is administered or applied in a vehicle, the homogeneity, concentration and stability of the test item in that vehicle should be determined. A sample for analytical purposes from each batch of test item should be retained for all studies except short-term studies. A well thought-out concept of logistics is needed for receiving, storing, handling and disposing of test items, together with provisions for the adequate documentation of all procedures connected with test item handling. One aspect in this area of test item logistics is the physical location of these activities, and the GLP Principles underline the importance of identifying adequate facilities for them.

GLP: Good Laboratory Practice 49
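The container labelling and expiry rules above can be made concrete with a short sketch. The type and function names here are illustrative assumptions, not part of the OECD text:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ContainerLabel:
    """Identification a storage container should carry, per the text above."""
    item_id: str
    batch: str
    expiry: date
    storage_instructions: str   # e.g. "2-8 C, protect from light"

def usable(label: ContainerLabel, today: date) -> bool:
    """A container past its expiry date must not be used in a study."""
    return today <= label.expiry
```

A receiving SOP could call `usable()` on every container before it is released to a study, quarantining anything that fails the check.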

Fig. 3. Laboratory records of receipt, handling and storing should be carefully maintained.

7.1.1 While receipt and storage involve mainly the handling of closed containers, the opening of such a container exposes the test item to the facility environment and consequently leads to the possibility of contamination of either the test item or the environment. Moreover, the greater the number of different test items handled, the greater the danger of a mix-up. Therefore, work in the special area where test items are prepared for application has to be carefully organised. For weighing of the test item and its mixing with the vehicle, it should be made compulsory that only one test item be present in that area at any one time. Special attention has to be given to such areas where test, control and reference items are prepared for in vitro studies.

7.1.2 In such studies, the term "contamination" does not only mean "contamination by traces of other items" but also contamination by microorganisms, etc., hence necessitating areas where the preparation of these items for application in the study can be performed under aseptic conditions. For the same reason, GLP mandates that test item storage locations should be separate from the rooms or areas containing test systems, in order to prevent excessive exposure of the systems to test items other than the intended one.

7.1.3 Of course, the storage facilities should provide adequate conditions to preserve the identity, purity and stability of the test items. It may therefore be necessary to provide storage areas at different temperature levels, for storage at room temperature or in refrigerators and deep freezers; protection from light, humidity or oxygen may also be necessary in special cases. There are security aspects to consider as well: a suitable limitation on access to the test items is advisable. It is very important that a good, accurate accounting system be in place, which can be used to reconstruct the course of test item utilisation (Seiler, 2005).
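The accounting system called for in 7.1.3 amounts to an append-only dispensing log from which utilisation can be reconstructed. The sketch below is a minimal illustration under that assumption; none of the names come from a GLP standard:

```python
from datetime import datetime

class AccountabilityLog:
    """Append-only log from which test item utilisation can be reconstructed."""
    def __init__(self, item_id: str, received_g: float):
        self.item_id = item_id
        self.received_g = received_g
        self._entries = []   # (timestamp, study, amount_g)

    def dispense(self, study: str, amount_g: float, when: datetime):
        if amount_g > self.balance():
            # the ledger must always reconcile with the amount on hand
            raise ValueError("cannot dispense more than the remaining balance")
        self._entries.append((when, study, amount_g))

    def balance(self) -> float:
        """Amount remaining = amount received minus all dispensed amounts."""
        return self.received_g - sum(a for _, _, a in self._entries)

    def reconstruct(self):
        """Chronological course of utilisation, as GLP reconstruction requires."""
        return sorted(self._entries)
```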

According to EPA (Environmental Protection Agency) GLP regulations, "raw data" means any laboratory worksheets, records, memoranda, notes, or exact copies thereof, that are the result of original observations and activities of a study and are necessary for the reconstruction and evaluation of the report of that study. Logbooks for recording temperatures or equipment use, repair, and maintenance; field or laboratory notebooks; forms for field or laboratory observations; training reports; computer printouts; and recorded data from automated instruments are examples of raw data. It is difficult, and not necessary, for anyone to remember all these details; that is one of the functions of the Standard Operating Procedures (SOPs).

## **8. Standard Operating Procedures (SOP)**

8.1 At the FDA it is said: "If it is not documented, it did not happen!"; otherwise, it is a rumor. SOPs cannot guarantee "good science" or good documentation, replace common sense, or prevent all mistakes (Cobb, 2007). SOPs are written procedures for a laboratory's program. They are approved protocols indicating test objectives and methods. Standard Operating Procedures are intended to ensure the quality and integrity of the data generated by the test facility. Revisions to Standard Operating Procedures should be approved by test facility management (OECD, 1998).

8.1.1 They define how to carry out protocol-specified activities. SOPs are most often written as a chronological listing of action steps, and they explain how the procedures are supposed to work: routine inspection, cleaning, maintenance, testing and calibration; actions to be taken in response to equipment failure; analytical methods; definition of raw data; and record keeping, reporting, storage, mixing, and retrieval of data. Standard Operating Procedures should be written and approved by test facility management and are intended to ensure the quality and integrity of the data generated by that test facility. Revisions to Standard Operating Procedures should be approved by test facility management. Each separate test facility unit or area should have immediately available current Standard Operating Procedures relevant to the activities being performed therein. Published textbooks, analytical methods, articles and manuals may be used as supplements to these Standard Operating Procedures. Deviations from Standard Operating Procedures related to the study should be documented and should be acknowledged by the Study Director and the Principal Investigator(s). SOPs are written, approved procedures that describe routine activities specific to daily operations at each facility. SOPs should allow appropriately qualified personnel to perform a procedure once trained.
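The requirement that every revision be approved by test facility management, and that each unit work to the current version, can be sketched as a small SOP registry. All names here are illustrative assumptions for the example:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class SOPVersion:
    number: int
    text: str
    approved_by: str      # must be test facility management
    approved_on: date

class SOP:
    """An SOP whose revisions take effect only after management approval."""
    def __init__(self, sop_id: str, first: SOPVersion):
        self.sop_id = sop_id
        self._versions = [first]

    def revise(self, text: str, approved_by: str, approved_on: date):
        if not approved_by:
            # unapproved revisions are rejected outright
            raise ValueError("revisions must be approved by test facility management")
        self._versions.append(SOPVersion(self._versions[-1].number + 1,
                                         text, approved_by, approved_on))

    @property
    def current(self) -> SOPVersion:
        # each unit works to the currently approved version
        return self._versions[-1]

    def history(self):
        return list(self._versions)  # superseded versions are retained
```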

8.1.2 The details given under each heading are to be considered illustrative examples. The following should be reviewed: room preparation and environmental room conditions for the test system; procedures for receipt, transfer, proper placement, characterisation, identification and care of the test system; test system preparation; observations and examinations before, during and at the conclusion of the study; handling of test system individuals found moribund or dead during the study; collection, identification and handling of specimens; and siting and placement of test systems in test enclosures. The operation of Quality Assurance personnel in planning, scheduling, performing, documenting and reporting inspections should also be examined. Personnel should perform the same tasks using the same procedures. SOPs should accurately reflect how routine tasks are performed and should be written by each facility based on its specific field and/or laboratory operations. Laboratory management must be sure that the SOPs used in the laboratory are useful in daily operations and scientifically sound, and they should always be updated as necessary; rewrites should be part of the routine process. When writing SOP guidelines, some precautions must be taken, such as avoiding restrictive language such as "vortex for exactly 1 minute" in favour of clear instructions such as "vortex until homogenized" if that satisfies the purpose. Unnecessary steps, such as "consult the manual", should not be added unless personnel are required to follow them (Cobb, 2007). Study personnel should have easy access to the study plan and to the Standard Operating Procedures applicable to their involvement in the study. It is their responsibility to comply with the instructions given in these documents. Study personnel should also exercise health precautions to minimise risk to themselves and to ensure the integrity of the study.
Standard Operating Procedures (SOPs) are intended to describe procedures that are routinely employed in the performance of test facility operations. Indeed, they are defined as "documented procedures which describe how to perform tests or activities normally not specified in detail in study plans or test guidelines." The definition moreover implies that SOPs should describe all steps in the performance of an activity in such a detailed way that somebody not familiar with this activity might be able to perform it correctly and without having recourse to outside help (Seiler, 2005).

8.1.3 It is suggested that test site personnel should follow test site SOPs. When they are required to follow other procedures specified by the Study Director, for example SOPs provided by the test facility management, this necessity should be identified in the study plan (OECD, 1998).

## **9. Performance of the study**

Performance of the study should be monitored carefully. All the standards provided by GLP should be followed from the beginning of the study to the end, through the final report. For each study, a written plan should exist prior to the initiation of the study (Seiler, 2005). The study plan should contain the following information: identification of the study, the test item and reference item; information concerning the sponsor and the test facility; dates; test methods; issues (where applicable); and records (OECD, 1998).


9.1 The study plan should be approved by dated signature of the Study Director and verified for GLP compliance. Deviations from the study plan should be described, explained, acknowledged and dated in a timely fashion by the Study Director and/or Principal Investigator(s) and maintained with the study raw data.

9.1.1 The study plan should contain the identification of the study and of the test and reference items: a descriptive title; a statement which reveals the nature and purpose of the study; identification of the test item by code or name; and the reference item to be used. Information concerning the sponsor and the test facility should be declared, comprising the name and address of the sponsor, of any test facilities and test sites involved, of the Study Director and of the Principal Investigator(s), together with the phase(s) of the study delegated by the Study Director to the responsibility of the Principal Investigator(s). The plan should carry the date of its approval by signature of the Study Director and, if required by national regulation or legislation in the country where the study is being performed, by signature of the test facility management and the sponsor. It should also state the proposed experimental starting and completion dates; the reference to the OECD Test Guideline or other test guideline or method to be used; the justification for selection of the test system; and the characterisation of the test system, such as the species, strain, substrain, source of supply, number, body weight range, sex, age and other pertinent information. It should further contain the method of administration and the reason for its choice; the dose levels and/or concentration(s), frequency, and duration of administration/application; and detailed information on the experimental design, including a description of the chronological procedure of the study, all methods, materials and conditions, the type and frequency of analysis, measurements, observations and examinations to be performed, and the statistical methods to be used. Specimens from the study should be identified to confirm their origin; such identification should enable traceability, as appropriate for the specimen and study. The study should be conducted in accordance with the study plan.
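The required study plan elements listed above lend themselves to a mechanical completeness check before the Study Director signs. The field names below are illustrative abbreviations of the elements in the text, not an official schema:

```python
# Required study plan elements, condensed from the list above (names illustrative).
REQUIRED_STUDY_PLAN_FIELDS = [
    "title", "purpose", "test_item", "reference_item",
    "sponsor", "test_facility", "study_director",
    "proposed_start", "proposed_completion",
    "test_methods", "test_system_justification", "records",
]

def missing_fields(study_plan: dict) -> list:
    """Return the required elements absent or empty in a draft study plan."""
    return [f for f in REQUIRED_STUDY_PLAN_FIELDS if not study_plan.get(f)]
```

A draft plan would only be routed for approval once `missing_fields()` returns an empty list.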
All data generated during the conduct of the study should be recorded directly, promptly, accurately, and legibly by the individual entering the data. These entries should be signed or initialled and dated. Any change to the raw data should be made so as not to obscure the previous entry, should indicate the reason for the change, and should be dated and signed or initialled by the individual making the change.

9.1.2 Computerised system design should always provide for the retention of full audit trails to show all changes to the data without obscuring the original data. It should be possible to associate all changes to data with the persons having made those changes, and reasons for changes should be given.
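The audit trail behaviour described in 9.1.2, where corrections never obscure the original entry and a reason is mandatory, can be sketched as an append-only record. This is a minimal illustration; the class and method names are assumptions for the example:

```python
from datetime import datetime, timezone

class RawDataRecord:
    """A raw-data value whose corrections never obscure the original entry."""
    def __init__(self, value, entered_by: str):
        now = datetime.now(timezone.utc)
        # the trail is append-only: (when, who, value, reason)
        self._trail = [(now, entered_by, value, "original entry")]

    def correct(self, new_value, by: str, reason: str):
        if not reason:
            # the system refuses changes without a concomitant reason
            raise ValueError("a reason for the change must be given")
        self._trail.append((datetime.now(timezone.utc), by, new_value, reason))

    @property
    def current(self):
        return self._trail[-1][2]

    @property
    def original(self):
        # the first entry is never overwritten or deleted
        return self._trail[0][2]

    def audit_trail(self):
        """Full trail: who changed what, when, and why."""
        return list(self._trail)
```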

## **10. Reporting of study results**

All studies generate raw data that are the original data gathered during the conduct of a procedure. They are essential for the reconstruction of studies and contribute to the traceability of the events of a study. Raw data are the results of the experiment upon which the conclusions of the study will be based. Some of the raw data may be used directly, and some of them will be treated statistically. The results and their interpretations provided by the scientist in the study report must be a true and accurate reflection of the raw data.
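For simple summary statistics, the requirement that reported results be a true reflection of the raw data can be checked mechanically. The function below is an illustrative sketch; its name and tolerance are not drawn from any GLP text:

```python
def report_reflects_raw_data(raw: list, reported_mean: float, tol: float = 1e-9) -> bool:
    """Check that a reported mean actually reflects the raw data it summarises."""
    mean = sum(raw) / len(raw)
    return abs(mean - reported_mean) <= tol
```

A QA inspection script could run such checks over every summary table in a draft report before the Quality Assurance statement is issued.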

10.1 A final report should be prepared for each study. The study report, like all the other scientific aspects of the study, is the responsibility of the Study Director. He/she must ensure that it describes the study accurately. Reports of Principal Investigators or scientists involved in the study should be signed and dated by them. The final report should be signed and dated

GLP: Good Laboratory Practice 53

Facilities Maintain adequate space/separation of chemicals from office areas

dates, labeling Records and Reports Timely reporting, storage of raw data and reports

11.1 When samples of test and reference items and specimens are disposed of before the expiry of the necessitated retention period for any reason, this should be justified and documented. Material preserved in the archives should be indexed so as to facilitate storage and retrieval in a tidy way. Safe storage should be provided for all of the samples, test

11.1.1 Only personnel authorised by management should have access to the archives.

11.1.2 Documentation should not be accepted only written documents but also the material generally related to the test facility. Quality Assurance is obliged to retain the respective records in a special archive. Therefore, management is responsible for providing archive facilities for the safe storage and recovery of study plans, raw data, final reports, samples of test items and specimens. Storage should be safe , therefore the design of, and environmental conditions in, these facilities should protect the archived material from illtimed deterioration. Although it may be enough to archive paper raw data, study plans and final reports to support the necessary space under dry conditions, protected from fire, water and corrosive gases, more stringent conditions will be essential for the storage of tissue specimens from toxicology studies. Samples of the test and reference items should to be stored under the original conditions which were applied during the testing phase. Reconstruction of a study could only be possible if all documents, records and materials from this study can be made available in an unadulterated and unspoiled condition. Traceability in GLP means that there has to be a perfect nonstop line of evidence, chaining together the test item with the effects displayed by the test systems. GLP aims to minimise mistakes or mix-ups through extensive and specific labelling requirements. Documented information can be provided evidencing the application of the correct item in the stated

11.1.3 The storage of records must enable their safekeeping for long periods of time without loss or deterioration. In order to encourage safe storage of data, restricted access is used to archive facilities and record the documents logged in and out to a limited number of staff.

11.1.4 During the conduct of multi-site studies, the temporary storage of materials should be carefully made. Such storage facilities should be safe enough and protect the integrity of their contents. When test site storage facilities are not adequate to satisfy GLP requirements, records and materials should be transferred to a GLP compliant archive. Test site management should ensure that adequate records are available to demonstrate test site

maintenance; check freezers

Chemical and sample inventory, track expiration

**GLP Regulations (Rules) Documentation (Tools)** 

Test, Control and Reference

Table 2. Documentation for GLP rules.

amounts to the relevant test system.

involvement in the study. OECD, 1998).

(Seiler, 2005).

Substances

(Cobb., 2007).

Organization and Personnel Training records, CVs, GLP training

Facility Operation Standard operating procedures

Equipment Calibration, logbooks of use, repair, and

materials and the reports produced. Figure 4 shows the storage of test material

Movement of material in and out of the archives should be recorded appropriately.

by the Study Director to indicate acceptance of responsibility for the validity of the data.If necessary, corrections and additions to a final report should be in the form of amendments. Amendments should clearly specify the reason for the corrections or additions and should be signed and dated by the Study Director. The Study Director is responsible for the scientific interpretation included in the study report and is also responsible for declaring to what extent the study was conducted in compliance with the GLP Principles. The GLP Principles list the essential elements to be included in a final study report.

10.1.1 The final report should include the following information: a descriptive title; identification of the test item by code or name; and characterisation of the test item, including purity, stability and homogeneity. Information concerning the sponsor and the test facility should include: the name and address of the sponsor; any test facilities and test sites involved; the Study Director; the Principal Investigator(s) and the phase(s) of the study delegated; any scientists having contributed reports to the final report; and the experimental starting and completion dates.

10.1.2 A Quality Assurance Programme statement should list the types of inspections made and their dates, including the phase(s) inspected, and the dates on which inspection results were reported to management and to the Study Director and Principal Investigator(s). This statement also serves to confirm that the final report reflects the raw data. The report should contain a description of the materials and test methods; a summary of results; all information and data required by the study plan; a presentation of the results, including calculations and determinations of statistical significance; and an evaluation and discussion of the results and, where appropriate, conclusions. It should also state the location(s) where the study plan, samples of test and reference items, specimens, raw data and the final report are to be stored.
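The list of required report elements above lends itself to a simple completeness check before sign-off. The sketch below is purely illustrative: the field names are our own shorthand for the elements of sections 10.1.1-10.1.2, not official OECD terminology.

```python
# Illustrative completeness check for a GLP final report draft.
# Field names are hypothetical shorthand, not OECD terminology.

REQUIRED_ELEMENTS = [
    "descriptive_title",
    "test_item_identification",       # code or name, purity, stability, homogeneity
    "sponsor_name_and_address",
    "test_facilities_and_sites",
    "study_director",
    "experimental_start_and_completion_dates",
    "qa_programme_statement",         # inspections, phases, reporting dates
    "materials_and_methods",
    "summary_of_results",
    "evaluation_and_conclusions",
    "storage_locations",              # study plan, specimens, raw data, final report
]

def missing_elements(report: dict) -> list:
    """Return the required elements that are absent or empty in a draft report."""
    return [key for key in REQUIRED_ELEMENTS if not report.get(key)]

draft = {"descriptive_title": "28-day oral toxicity study",
         "study_director": "J. Doe"}
gaps = missing_elements(draft)
# The Study Director would resolve every listed gap before signing the report.
```

A check of this kind does not replace the Study Director's review; it only flags elements that are missing outright.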

10.1.3 A computerised system to be used in a GLP area should provide both the dating and timing of the original entry and the retention of a full audit trail. Identification of the person making an entry may be achieved either by personal passwords recognised by the computer or by digital signatures. Furthermore, the system should not accept any change to data without the concomitant entry of a reason or justification. In manual recording, the entries made on a sheet of paper can be dated and signed to bear witness to the validity of the data and to accept responsibility.
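The audit-trail behaviour described here (timestamped original entries, attributable identity, no change accepted without a recorded reason) can be sketched as a minimal append-only log. The class and field names below are our own illustration under those assumptions, not a prescribed implementation.

```python
from datetime import datetime, timezone

class AuditedRecord:
    """Minimal sketch of GLP-style audited data entry: every value change is
    timestamped, attributed to a user, and requires a stated reason."""

    def __init__(self, field: str, value, user: str):
        self.field = field
        self.value = value
        # The original entry is dated and timed, as GLP requires.
        self.trail = [(datetime.now(timezone.utc), user, None, value, "original entry")]

    def amend(self, new_value, user: str, reason: str):
        # The system must not accept a change without a justification.
        if not reason or not reason.strip():
            raise ValueError("change rejected: a reason must be recorded")
        old = self.value
        self.value = new_value
        self.trail.append((datetime.now(timezone.utc), user, old, new_value, reason))

rec = AuditedRecord("body_weight_g", 212.4, user="analyst1")
rec.amend(214.2, user="analyst1", reason="transcription error corrected")
# rec.trail now holds the full history, so the final value remains reconstructable.
```

Because entries are only ever appended, the way a result was obtained can be reconstructed from the trail, which is the point of section 10.1.4 below.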

10.1.4 GLP therefore seeks to ensure that data safety and integrity remain the same for electronically recorded data as for manually recorded data, irrespective of how they were recorded, and that reconstruction of the way in which the final results and conclusions were obtained remains fully possible (Seiler, 2005). The Study Director must sign and date the final report to indicate acceptance of responsibility for the validity of all the data (OECD, 1998).

## **11. Storage and retention of records and materials**

Storage and retention of records and materials should be arranged appropriately. The following should be retained in the archives for the period specified by the appropriate authorities: the study plan, raw data, samples of test and reference items, specimens, and the final report of each study; records of all inspections performed by the Quality Assurance Programme, as well as master schedules; records of the qualifications, training, experience and job descriptions of personnel; records and reports of the maintenance and calibration of apparatus; and validation documentation for computerised systems. In the absence of a required retention period, the final disposition of any study materials should be documented. The documents necessary for GLP regulations are given in Table 2.



52 Modern Approaches To Quality Control


| GLP Regulations (Rules) | Documentation (Tools) |
|---|---|
| Organization and Personnel | Training records, CVs, GLP training |
| Facility Operation | Standard operating procedures |
| Equipment | Calibration; logbooks of use, repair, and maintenance; check freezers |
| Test, Control and Reference Substances | Chemical and sample inventory; track expiration |

Table 2. Documentation for GLP rules (Cobb, 2007).

11.1 When samples of test and reference items and specimens are disposed of before the expiry of the required retention period for any reason, this should be justified and documented. Material preserved in the archives should be indexed so as to facilitate orderly storage and retrieval. Safe storage should be provided for all of the samples, test materials and the reports produced. Figure 4 shows the storage of test material in an organized order.

11.1.1 Only personnel authorised by management should have access to the archives. Movement of material in and out of the archives should be recorded appropriately.
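The two requirements of 11.1.1 (access limited to personnel authorised by management, and every movement of material recorded) can likewise be sketched in a few lines. The names below are illustrative assumptions, not part of the GLP text.

```python
from datetime import datetime, timezone

class Archive:
    """Sketch of section 11.1.1: only authorised personnel may access the
    archive, and every movement of material in or out is recorded."""

    def __init__(self, authorised: set):
        self.authorised = set(authorised)   # staff authorised by management
        self.holdings = set()
        self.log = []                       # (timestamp, user, action, item)

    def _record(self, user: str, action: str, item: str):
        if user not in self.authorised:
            raise PermissionError(f"{user} is not authorised to access the archive")
        self.log.append((datetime.now(timezone.utc), user, action, item))

    def check_in(self, user: str, item: str):
        self._record(user, "in", item)
        self.holdings.add(item)

    def check_out(self, user: str, item: str):
        if item not in self.holdings:
            raise KeyError(f"{item} is not held in the archive")
        self._record(user, "out", item)
        self.holdings.remove(item)

archive = Archive(authorised={"archivist"})
archive.check_in("archivist", "study-123 raw data")
archive.check_out("archivist", "study-123 raw data")
# archive.log now documents both movements; any unauthorised user is refused.
```

Note that the movement log is append-only for the same reason as the audit trail in 10.1.3: the history of the archive must remain reconstructable.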

11.1.2 Documentation should be taken to include not only written documents but also material generally related to the test facility. Quality Assurance is obliged to retain the respective records in a special archive. Management is therefore responsible for providing archive facilities for the safe storage and retrieval of study plans, raw data, final reports, samples of test items and specimens. Storage should be safe; the design of, and environmental conditions in, these facilities should therefore protect the archived material from untimely deterioration. Although it may be sufficient to archive paper raw data, study plans and final reports in rooms providing the necessary space under dry conditions, protected from fire, water and corrosive gases, more stringent conditions will be essential for the storage of tissue specimens from toxicology studies. Samples of the test and reference items should be stored under the original conditions which were applied during the testing phase. Reconstruction of a study is only possible if all documents, records and materials from that study can be made available in an unadulterated and unspoiled condition. Traceability in GLP means that there has to be a perfect, unbroken line of evidence chaining together the test item with the effects displayed by the test systems. GLP aims to minimise mistakes or mix-ups through extensive and specific labelling requirements. Documented information should be available evidencing the application of the correct item in the stated amounts to the relevant test system.

11.1.3 The storage of records must enable their safekeeping for long periods of time without loss or deterioration. To ensure safe storage of data, access to archive facilities is restricted to a limited number of staff, and documents logged in and out are recorded (Seiler, 2005).

11.1.4 During the conduct of multi-site studies, temporary storage of materials should be arranged carefully. Such storage facilities should be secure and protect the integrity of their contents. When test site storage facilities are not adequate to satisfy GLP requirements, records and materials should be transferred to a GLP-compliant archive. Test site management should ensure that adequate records are available to demonstrate test site involvement in the study (OECD, 1998).

GLP: Good Laboratory Practice 55

Fig. 4. Storage of the test material in an organized order.

## **12. Summary**

GLP regulations are summarized in Table 3.

| GLP Requirement | Summary |
|---|---|
| GLP | Describes good practices for non-clinical lab studies that support research or marketing approvals for FDA-regulated products |
| GLP General Requirements | Appropriately qualified personnel; quality assurance function; adequate resources; appropriate procedures |
| GLP Facilities Requirements | Suitable size, construction, segregation; appropriately designed; adequate thru-put capacity; appropriately located |
| Equipment Requirements | Routinely maintained & calibrated |
| Standard Operating Procedures | Appropriate procedures for: animal room prep; animal care; receipt, ID, storage, handling, mixing & sampling of test & control articles; test system observations; lab tests; handling of moribund or dead animals; necropsy or postmortem exams of animals; collection & ID of specimens; histopathology; data handling, storage & retrieval; equipment maintenance & calibration; transfer, proper placement & ID of animals |
| Reagents & Solutions | Adequate labeling |
| Test & Control Articles | Adequate characterization; proper receipt, storage, distribution; when mixed with a carrier, adequate methods to confirm application |
| Study Implementation | Written, approved protocol indicating test objectives & methods; study conducted in accordance with protocol; study monitoring to confirm protocol compliance; appropriate labeling of specimens by test system, study, nature & collection date; records of gross findings from postmortems available to pathologist for specimen histopathology; standard data capture/recording; final report of results |
| Records retention (shortest of) | Study records & data methodically archived to facilitate expedient retrieval; records transferable with written FDA notification |
| Facility Disqualification | Grounds for disqualification: failure to comply with regulations & modifying facility operations |

Table 3. GLP regulations (Clasby, 2005).

12.1 "Good laboratory practice" can be considered as "essentially tidiness, cleanliness, hygiene and common sense" (CWIS, 2000).

12.1.1 Combining quality management with the GLP rules is the path that laboratories will increasingly choose in the coming years. This will be the leading route to evidence-based laboratory results and a more trustworthy approach.

**Part 2** 

**Evaluating Analytical Data** 

## **13. References**

Clasby, Ginger (2005). Good Laboratory Practice CFR 21 Part 58. A Review for OCRA US RAC Study Group, September 2005. Available at: http://www.google.com.tr/search?hl=tr&source=hp&q=A+Review+for+OCRA+US+RAC+Study+Group+September+2005+++&rlz=1W1ADFA\_tr&aq=f&aqi=&aql=&oq

Cobb, Jean (2007). GLP: Good Laboratory Practice for Field and Research. ALS 52 04. Available at: http://www.docstoc.com/docs/18191459/Good-Laboratory-Practices


https://springerlink3.metapress.com/content/mr20ux0343141g4k/resource-secured/?target=fulltext.pdf&sid=sbx4al45ojtfu3vvjzteu045&sh=www.springerlink.com


OECD (1998). OECD Principles of Good Laboratory Practice (as revised in 1997). ENV/MC/CHEM(98)17. Available at: http://www.oecd.org/officialdocuments/displaydocumentpdf/?cote=env/mc/ch em(98)17&doclanguage=en

Seiler, Jürg P. (2005). Good Laboratory Practice: The Why and the How. ISBN 3-540-25348-3, Springer-Verlag, Berlin Heidelberg. Printed in the European Union.
