**2. Methodology**

#### **2.1 Database**

The possible sources of information for scientometric research include multi-disciplinary databases such as Thomson Reuters' Web of Science and Elsevier's Scopus, resources such as Google Scholar, and specialised services such as Medline. These sources record research results in the form of scientific papers published in international journals, together with their subsequent citation by the rest of the scientific community.

Scopus, the Elsevier database created in 2004, lists over 18 000 journals from over 5 000 publishers<sup>1</sup>. When it first appeared, it was analysed by many authors and compared with other resources in a whole stream of papers (Fingerman, 2005; LaGuardia, 2005). It was chosen for the present study because of its broad subject-area and linguistic coverage, on the understanding that world-wide scientific production is more fully represented in Scopus than in other databases (Sciverse Scopus, 2011). In addition, as a resource suitable for research conducted after 1996, it is particularly apt for a subject area such as pharmacology (Gorraiz and Schloegl, 2008).

Scopus' strong points as a source of information are reinforced by an open-access, on-line tool known as SCImago Journal and Country Rank (SJR, 2007). As its name implies, this system of scientific information, drawing on Scopus contents from 1996 to 2010, ranks journals and countries using data intended for world-wide scientific assessment. The tool provides open access to both data and indicators by region or country, with international coverage. It proved to be particularly useful for the aims pursued in the present study.

#### **2.2 Indicators**

Two sets of bibliometric indicators were used in this study: one to determine the quantitative characteristics of scientific output and the other to analyse its quality, i.e., the qualitative characteristics of citations and journals (Rehn, 2007). The indicators included in each group are described below.

This study calculated the number of scientific papers published by the units analysed (world, region, country or industry) over the time span defined. All of the various possible types of papers (such as articles, reviews and notes to the editor) were included in the *output* indicator.

<sup>1</sup> Available from http://www.info.sciverse.com/scopus/scopus-in-detail/facts/ (accessed 20/08/2011).

The purpose of this chapter is to analyse international research in "pharmacology, toxicology and pharmaceutics" (hereafter pharmacology) on the basis of the scientific papers listed in the Scopus multidisciplinary database. This primary objective is reached by answering the following questions (in the section on results): What weight does the subject area "pharmacology, toxicology and pharmaceutics" carry in world-wide science? What is the percentage contribution made by the various regions of the world to this subject area? Can certain regions be identified as leaders on that basis, as in other scientific contexts? Are emerging countries present in the field? Do the most productive countries also publish the largest number of journals? What features characterise the scientific output of companies that publish pharmacological papers?


When papers were co-authored by researchers from institutions in different countries, a complete count (full counting) approach was adopted, i.e., each paper was credited in full to every country involved. The growth rate, when provided, indicates the rise or decline in world-wide output in 2009 with respect to the baseline year, 1996.
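As a minimal sketch, the full-counting rule and the growth-rate calculation described above might look like this in Python (all paper data below are hypothetical):

```python
from collections import Counter

# Hypothetical papers, each tagged with the countries of its author affiliations.
papers = [
    {"id": 1, "countries": {"US", "DE"}},
    {"id": 2, "countries": {"US"}},
    {"id": 3, "countries": {"IN", "US", "BR"}},
]

# Full counting: a co-authored paper is credited once, in full,
# to every country involved.
output = Counter()
for paper in papers:
    for country in paper["countries"]:
        output[country] += 1


def growth_rate(baseline: int, final: int) -> float:
    """Percentage rise or decline in output relative to the baseline year."""
    return 100.0 * (final - baseline) / baseline
```

With these data the US is credited with all three papers, while Germany, India and Brazil receive one full credit each; `growth_rate(100, 150)` yields a 50 % rise.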

A number of indicators were used to obtain an approximate view of the quality of world scientific output in the field of pharmacology. The number of *citations* received refers to the total number of times papers published by the unit analysed were cited during the period studied. This indicator provides an overview of the scientific impact of the articles published by the unit in question. The number of *citations per paper* was calculated as the mean number of citations received by all the papers published by the unit analysed in the period studied.
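The two citation indicators above reduce to a sum and a mean; a minimal illustration with hypothetical citation counts:

```python
# Hypothetical citation counts for the papers published by one unit
# (e.g. a country) during the period studied.
citations = [12, 0, 3, 7, 0, 25]

total_citations = sum(citations)                        # the *citations* indicator
citations_per_paper = total_citations / len(citations)  # the *citations per paper* indicator
```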

The *domestic citations* were separated from the total to determine the proportion of the output that was used as a reference in the same geographic area (region or country) and consequently, by simple subtraction, the proportion involving knowledge transfer to other areas. The results are shown as the percentage of the citations used for research conducted in the same geographic area. The *normalised citation* indicator is the relative number of times papers produced by a specific unit were cited, compared to the world-wide mean for papers of the same type, age and subject area.
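The domestic-citation share and the normalised citation indicator can be illustrated with hypothetical figures:

```python
# Hypothetical figures for one geographic unit.
total_cites = 200
domestic_cites = 140  # citations from papers produced in the same region/country

pct_domestic = 100.0 * domestic_cites / total_cites  # share cited at home
pct_knowledge_transfer = 100.0 - pct_domestic        # share transferred elsewhere

# Normalised citation: the unit's mean citation rate divided by the world mean
# for papers of the same type, age and subject area (> 1 means above world average).
unit_mean_citations = 4.2
world_mean_citations = 3.0
normalised_citation = unit_mean_citations / world_mean_citations
```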

While citations denote the subsequent use of papers once published, the *references* list the literature cited in papers published by a journal at any given time. The number of *references per paper* was found by dividing the total number of references by the number of papers published by the unit.

A country's *H-index*, in turn, specifies the number of papers (h) produced in that country that have received at least h citations each. It relates a country's scientific productivity (output) to its scientific impact (citations). The *international collaboration* indicator is the percentage of papers with author affiliations in more than one country; it measures institutions' international networking capacity. In this chapter, *% output in Q1* is the percentage of a unit's scientific papers published in what are classified as the most influential journals in the respective category, i.e., the periodicals in the first quartile (Q1), the upper 25 %, based on their SJR value.
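The H-index and international-collaboration definitions above can be sketched directly (data in the usage examples are hypothetical):

```python
def h_index(citation_counts):
    """Largest h such that the unit has h papers with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h


def international_collaboration(paper_countries):
    """Percentage of papers with author affiliations in more than one country."""
    intl = sum(1 for countries in paper_countries if len(countries) > 1)
    return 100.0 * intl / len(paper_countries)
```

For example, a unit with papers cited 10, 8, 5, 4 and 3 times has an H-index of 4 (four papers with at least four citations each, but not five with five).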

Another qualitative indicator used, homonymous with the aforementioned scientific information system (SCImago Journal and Country Rank), was the *SCImago Journal Rank (SJR)*, used as an alternative to the traditional impact factor (I.F.). This indicator, which measures the visibility of the journals in the Scopus® database, is established by the SCImago<sup>2</sup> research team on the basis of the well-known Google PageRank™ algorithm. It differs from the I.F. in two ways: citations are computed over three rather than two years, and citations are weighted, with citations in more visible or prominent journals carrying greater weight than citations in lower-ranking journals (González-Pereira et al., 2009).
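The prestige-weighting idea behind SJR can be illustrated with a generic PageRank-style iteration over a small journal citation matrix. This is an illustrative sketch only, not the actual SJR formula, and the matrix values are hypothetical:

```python
def prestige_scores(C, damping=0.85, iterations=100):
    """Iteratively compute prestige scores from a citation matrix.

    C[i][j] = number of citations from journal i to journal j.
    A citation transfers prestige in proportion to the citing
    journal's own score, so citations from prominent journals
    weigh more than citations from lower-ranking ones.
    """
    n = len(C)
    scores = [1.0 / n] * n
    for _ in range(iterations):
        new = []
        for j in range(n):
            inflow = sum(
                scores[i] * C[i][j] / sum(C[i])
                for i in range(n)
                if sum(C[i]) > 0
            )
            new.append((1 - damping) / n + damping * inflow)
        scores = new
    return scores


# Hypothetical 3-journal citation matrix.
C = [
    [0, 10, 2],  # journal A cites B heavily
    [3, 0, 1],
    [5, 4, 0],
]
scores = prestige_scores(C)
```

Because each journal redistributes its own score across the journals it cites, a journal's final score depends not just on how often it is cited but on who cites it — the weighting principle that distinguishes SJR from the raw impact factor.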
