#### **4.2 Differences and similarities in IRHS integration tools**

306 Health Management – Different Approaches and Solutions

Responsiveness and equity are the dimensions that are less monitored, and also those that register a high number of differences in the way Regions calculate some indicators, for instance the large (or null) use of dichotomous (yes/no) indicators.

Regarding responsiveness, common indicators are those related to waiting times. Besides this type of indicator, the other monitored topic is patient satisfaction. Nevertheless, although many Regions declare that they monitor patient satisfaction, the methods adopted are quite different from one another: some Regions, such as Lombardy and Bolzano, run sample surveys; others use the civic audit; finally, others, such as Basilicata, simply check that surveys have been executed by Health Authorities, without having information about the results (see table 4).

Concerning equity, the Commission on Social Determinants of Health (CSDH) of the WHO asserted that the systematic and continuous measurement of equity indicators is a fundamental step in order to close the gap of inequities (CSDH, 2008).

Only some Regions declare that they have monitored equity. Most indicators related to equity require surveys, so many Regions have seldom measured this type of indicator. The only two Regions that are able to systematically measure equity in access for some services (i.e. hospital discharges) are Piedmont and Tuscany (see table 5).

| Regions | Equity dimension |
|---|---|
| **Basilicata** | *None at the moment.* |
| **Bolzano** | *We are still studying systematic indicators on equity. Nowadays we focus on immigrants.* |
| **Apulia** | *We pay attention to frailty classes. We reorganized the exemptions on the basis of those classes.* |
| **Sardinia** | *We don't have equity indicators. At the moment we look at frailty classes such as mental health, elderly or drug addicted.* |
| **Veneto** | *Although equity is one of the key issues of our regional strategic plan, we don't have indicators that control this aspect in a systematic way.* |
| **Liguria** | *We are planning to control this aspect.* |
| **Lombardy** | Equity is pursued using indicators focused on frail people or the use of specific indicators related to the treatment of particular chronic conditions. |
| **Piedmont** | *Many ad hoc surveys have been run on various topics. Inequalities are studied by the epidemiologic observatory, which has developed very high competences on these issues. For years the Piedmont Region has recorded the education degree in the hospitalization data, so that we could control whether there are differences among social classes for inpatients.* |
| **Tuscany** | *We have indicators coming from surveys related to the educational degree, and systematic indicators related to the access of educational classes for inpatient services.* |
| **Trento** | There is an ad hoc survey conducted by the specialized centre of Trento regarding all services. This study looked at indicators concerning access per gender, age, education and so on. |
| **Umbria** | *We don't have systematic indicators on equity. Administrative data don't have reliable information on education or income. Many surveys have been conducted by the university centre on this topic. Some of them are really important.* |

Table 5. Regional responses on equity dimension

Responses about the integration between the PMS and the rewarding system can be classified into three groups (as reported in figure 1).

In the first group there are Regions that have coped with central pressure on deficit control: they suspended the CEOs' rewarding system or linked it to normative fulfilments (Case A).

In the second group (Case B) there are Regions (Basilicata and Sardinia) which show full integration between the rewarding system and the performance measurement system. These Regions have recently implemented performance measurement systems and, in order to enforce them, decided to link the rewarding system strictly to the PMS. To this extent the rewarding system introduces an innovative way of measuring performance.

The last group of Regions (Case C) is characterized by a partial integration of the rewarding and performance measurement systems. These Regions decided to make a selection of the measures to be rewarded, adding other types of measures to those of the PMS.

Fig. 1. Integration between performance measurement systems and rewarding system.

In general the PMS covers many more topics than the rewarding system, as represented in Cases A and C. These two groups collect the majority of the Regions that participated in the study. The only case where the rewarding system almost overlaps with the adopted PMS is Case B. It seems that when Regions seek to implement a new, reliable control system they use the rewarding system as a driver of change.

#### **4.3 Differences and similarities in the regional attitude towards the use of benchmarking**

Benchmarking is seen by all Regions, with the exception of Apulia, as an interesting opportunity to improve their performance.


Moreover, uncertainty about the future, due to the economic crisis, the Italian fiscal federalism reform and the European parliament spectrum, requires the health sector and policy makers to share information about performance and successful strategies, as affirmed by Lombardy (see table 6).

Although there is enthusiasm about benchmarking across Regions, this technique is not commonly applied within regional boundaries as a governance tool.

Particularly interesting are the cases of Tuscany and Lombardy, which both use benchmarking as a learning tool among health authorities. While the former applies benchmarking to all indicators in a fully transparent way (Nuti et al., forthcoming a), the latter uses it especially for outcome indicators, keeping clear the label of the health authorities.

Even though most Regions declare that they are willing to compare their performance with others (see table 6), they show some reservations about how benchmarking should be done.

Some Regions declared that benchmarking should be done by the National Government after the selection of indicators has been shared; some say that the comparison should be run by an external benchmarking agency; others prefer to keep regional supervision on how the comparison is run; finally, some ask only for a comparison on methodology.

| Regions | Responses on the openness to benchmarking |
|---|---|
| BASILICATA | *We are in favour of a general evaluation of health services. A minimum set of shared performance indicators can activate useful benchmarking processes.* |
| BOLZANO | *We are the first ones who want to start a benchmarking mechanism as a learning tool.* |
| CAMPANIA | -- |
| FRIULI VENEZIA GIULIA | *We are definitely open to benchmarking. It is a must to enhance regional accountability. It is possible to identify a National set of indicators to be monitored at a Regional level. Sharing indicators and criteria is essential in order to guarantee a real comparison among Regions, overcoming the risk of self-referential assessment.* |
| LIGURIA | *We are starting to participate in a regional network that could enable learning processes thanks to benchmarking outside our regional boundaries.* |
| LOMBARDY | *No wind is good for whom that does not know the rhumb line. It's a strategic problem: benchmarking can be a crucial help in defining the rhumb line, above all in the European context.* |
| MARCHE | -- |
| PIEDMONT | *We are in favour of a benchmarking within the Regions because we believe that we would be at a good level of performance and we would have the same problems as other Regions, but we ask for a regional network that smoothly runs the comparison.* |
| TRENTO | *National guidelines could be defined in order both to compare regional health systems and to support Regions in developing effective tools using the same methodological issues.* |
| APULIA | *[...] Although we get data benchmarking, at this stage we prefer adopting a soft approach: in our opinion the measurement process has to be a supportive management tool. The assessment linked to performance benchmarking across health authorities could lead to disadvantages, above all in terms of relationships.* |
| SICILY | *Benchmarking enabled us to identify and face the unacceptable gaps between Sicily and other Regions.* |
| TUSCANY | *Data benchmarking across health authorities can enable Regions to overcome a self-referential attitude and it can enhance learning and assessment processes in order to highlight best practices.* |
| UMBRIA | *It is important to be able to compare measures at National level. It is more useful doing benchmarking with similar units outside one's own Region than relying on regional averages, as in the case of Perugia teaching hospital, which is the sole regional teaching hospital.* |
| VENETO | *We are in favour of benchmarking at the National level. Results should be read by everyone. Indicators should be shared. Regions should create a linkage between National and Regional performance evaluation systems. A performance evaluation system at a National level may activate useful benchmarking processes across regional health services and may help improve local performance evaluation systems.* |
| SARDINIA | -- |

Table 6. Regional responses on the openness to benchmarking

Figure 2 summarizes the regional positions, pointing out the different visions that range from a regional system (where there is maximum autonomy on measuring performance and no benchmarking across Regions) to a national system (where everything is decided and done by the National Government).

Fig. 2. Different visions on benchmarking.

Regions that are less willing to compare their performance are those that traditionally have had more autonomy (such as Trento) or those that have gone through a period of drastic cuts (such as Apulia). Regions more willing to enable benchmarking processes and to go beyond regional boundaries are those that already measure their health services.

These responses seem to be particularly influenced by contextual factors (described in table 1), such as the size of the Region and the environmental pressure. Indeed, small Regions such as Umbria feel, more than others, the necessity to look outside regional boundaries in order to gain the advantages of benchmarking (see the Umbria, Trento and Bolzano quotations in table 6).

**5. Discussions and conclusions**

Italian regional devolution on health care has led each Region to develop its own PMS. Although national reforms have pushed the adoption of managerial tools, the study points out that still few Regions have developed a PMS capable of measuring all the topical dimensions of the OECD framework (Efficiency, Responsiveness, Equity and Health improvement/outcome). In particular, the dimensions less controlled are responsiveness and equity. Another weakness of the Italian regional PMSs is that policy makers and regional managers often use a plethora of tools in order to control the performance of health services and health system organizations. This highlights that in most cases the regional policy level lacks strategic tools capable of summing up the overall performance in an easy, integrated and systematic way.
