*3.1.3 The webometrics ranking weighing model*

The Webometrics ranking system [58] evaluates and ranks the universities of the world twice a year (January/February and June/July) using its own methodology. The methodology includes several phases and applies several systems so that the data necessary for ranking and analyses can be collected and updated in time.

According to [51], there are three key aspects that need to be measured in the academic web space:


Bibliometrics has traditionally ignored the frequency with which a journal appears in various locations or data sources and has focused on the impact of a journal, i.e., the relation between the number of citations and the number of articles published in the journal. A similar approach was proposed for the Webometrics ranking.

Webometrics ranking monitors a certain group of parameters (criteria) (**Table 1**), but only the size and visibility of a web host are included in the final data used for ranking.

### **Table 1.**

*Criteria and weights used in the calculation of the WR indicator [51].*

The ranking model defines that the relation between these two parameters (size and visibility) is taken in a 1:1 ratio. In order to take the diversity of academic activities and services into consideration, the "size" component is divided into three parts, so that one can measure raw data about the quantity of websites, the number of rich files, and the number of articles and publications collected by the Google Scholar system.

According to the work [51], the criteria and weights used for the calculation of WR indicators at that time were obtained from only a few sources, mainly web search engines. Some of those search engines are no longer used to obtain data, while some new ones have appeared and some of the old ones have improved their algorithms for indexing and retrieving results from the web.

Pursuant to the proposed model, ranking (web ranking) is calculated with the following equation (Eq. (1)):

$$\text{WR} = 2 \times \text{Rank}(S) + 1 \times \text{Rank}(R) + 1 \times \text{Rank}(Sc) + 4 \times \text{Rank}(V)\tag{1}$$

The ratio combining the weights ascribed to each of the elements is (2 + 1 + 1):4, i.e., 1:1, which was the initial intention. In order to avoid problems related to size, search engine bias, and other factors, the results collected in this way, initially expressed as absolute values, are log-normalized and transformed into ordinal numbers and then combined with Eq. (1) for WR [59].
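The log-normalization and rank combination described above can be sketched as follows. This is a minimal illustrative sketch only: the university names and raw indicator counts are hypothetical, and the exact normalization details used by Cybermetrics Lab are not published here.

```python
import math

# Hypothetical raw indicator counts for three universities (illustrative only):
# S = web pages, R = rich files, Sc = Google Scholar items, V = visibility links.
raw = {
    "U1": {"S": 120000, "R": 3500, "Sc": 900,  "V": 45000},
    "U2": {"S": 80000,  "R": 5000, "Sc": 400,  "V": 60000},
    "U3": {"S": 20000,  "R": 800,  "Sc": 1500, "V": 15000},
}

# Weights from Eq. (1): (2 + 1 + 1):4, i.e., size vs. visibility in a 1:1 ratio.
WEIGHTS = {"S": 2, "R": 1, "Sc": 1, "V": 4}

def ranks_for(indicator):
    """Log-normalize the raw values, then convert them to ordinal ranks (1 = best)."""
    logged = {u: math.log(1 + vals[indicator]) for u, vals in raw.items()}
    ordered = sorted(logged, key=logged.get, reverse=True)
    return {u: position + 1 for position, u in enumerate(ordered)}

def wr_scores():
    """Combine the ordinal ranks with Eq. (1); a smaller WR means a better ranking."""
    per_indicator = {ind: ranks_for(ind) for ind in WEIGHTS}
    return {u: sum(w * per_indicator[ind][u] for ind, w in WEIGHTS.items())
            for u in raw}
```

Because log-normalization is monotone, it does not change the ordering of a single indicator; its purpose is to dampen the huge absolute differences between institutions before the values are turned into ordinal ranks.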

Over the years of the system's application, Cybermetrics Lab has adjusted the calculation indicators according to analyses of the data available in the preceding years. The data shown in the tables on www.webometrics.info are basically ranks (smaller number, better ranking) whose purpose is to show individual performances, but one has to bear in mind that those values are not the ones applied in the ranking calculations [60].

Due to technical problems in previous versions of the ranking system, Cybermetrics Lab changed some of the ranking weights (presence and excellence from **Table 2**) in the latest version, so the current methodology is the one shown in **Table 2** (January edition, 2019.1.0.).


*Advantages and Disadvantages of the Webometrics Ranking System*
*DOI: http://dx.doi.org/10.5772/intechopen.87207*

*3.1.4 Several relevant facts about webometrics ranking of universities*

Results of the ranking of universities [58] have been published twice a year since 2004 (data are collected during the first week of January and July and prepared for publication at the end of those months), covering more than 28,000 institutions of higher education all over the world together with their analyses.

The data are collected between January 1 and January 20, depending on the current edition of the ranking publication. Data are sampled for each of the variables at least twice during that period, and the greatest value is taken as the final value for analysis in order to avoid possible errors in data collection. The inconsistency of web search engines is very high, so the obtained results may vary, and there is little chance of replicating them if the search is repeated several days later. Google is strongly geographically biased; for that reason, data are collected through the *google.com* mirror related to the domain, with English used for the interface and Madrid (Spain) as the location.

The final ranking data are published at the end of January or July, usually not before the 28th day of the month. It is very important to mention that Cybermetrics Lab follows its general rule not to discuss any presented result or provide the unprocessed data from which a specific ranking was produced [58].

**4. Webometrics ranking system: advantages and disadvantages**

Like other ranking systems, the Webometrics ranking system has a range of advantages and disadvantages. Unlike other systems for ranking scientists and universities, Webometrics can be called a "global" ranking system. Why global? Most ranking systems, such as the Shanghai list, include only several hundred or a few thousand of the best universities, while Webometrics includes most of the universities of the world, currently about 28,000 scientific institutions from all over the world [60]. The list also enables ranking of scientific institutions, institutes, and individual members of a university, which can encourage a competitive spirit among them. Why is this important? An extremely small number of the world's universities satisfy the Shanghai list criteria. However, this does not mean that there are no other universities of good quality besides those ranked, or no good scientists working at them all over the world. It is easy to conclude that the universities from the Shanghai list and similar lists mainly come from countries with well-developed economies, well-ordered educational systems, developed democracies, and a high degree of university autonomy. The higher education systems of developed economies follow the needs of the labor market and technological progress, and the quality of educational institutions is maintained institutionally through strict accreditation criteria prescribed by authorized bodies and ministries in every state. In developing countries and poorly developed economies, there are great problems with the objective assessment and ranking of the quality of higher education institutions due to:

• Poor or no application of international criteria [61, 62]

• Involvement of politics in institutions of higher education [61, 62]

• Devaluation of diplomas and criteria through institutions which do not fulfill requirements and criteria prescribed by law [61, 62]

• An extremely low percentage of scientific production of relevant publications indexed in the leading databases

*Scientometrics Recent Advances*

**Table 2.**

| Indicators | Description | Source | Weight |
| --- | --- | --- | --- |
| Presence | Size (number of web pages) of the main web domain of the institution. It includes all the subdomains sharing the same (central or main) web domain and all the file types, including rich files like PDF documents | *Google* | 5% |
| Visibility (or impact) | Number of external networks (subnets) originating backlinks to the institution's webpages. After normalization, the average value between the two sources is selected | *Ahrefs, Majestic* | 50% |
| Transparency (or openness) | Number of citations from top authors according to the source | *Google Scholar Citations* | 10% |
| Excellence (or scholar) | Number of papers among the top 10% most cited in 26 disciplines. Data for the 5-year period (2012–2016) | *SCImago* | 35% |

*Webometrics university ranking methodology (January edition, 2019.1.0.) [60].*