**3. Problems associated with collecting and interpreting altmetrics**

Gaming is also a problem beyond the sources assessed by altmetric service providers. In a study by Meier and Tunger, it became apparent that it is possible to considerably influence the metric specially developed by the ResearchGate platform, the RG score [16]. The RG score is intended to measure the "scientific reputation" of ResearchGate users. It is influenced by the impact of a user's own scientific publications but also by their social activities on the platform (see https://www.researchgate.net/RGScore/FAQ). Meier and Tunger found that, within a relatively short time, it is possible to achieve an RG score that is higher than the RG scores of half of all RG users solely by gaming, without any scientific publications.

In another study, conducted for the European Commission, Kim Holmberg found that altmetrics are not yet practically applied in the EU for the purposes of scientific evaluation. In his view, such practice on a wide scale would be premature as long as what altmetrics actually measure remains unclear [17].

A semantic analysis of contributions in social media is lacking for the most part, which is a major issue that makes the evaluation of altmetric counts so difficult. References are mostly counted based on identifiers such as the DOI; however, which references should be evaluated as positive and which as negative cannot be determined, which means that a "performance paradox" develops [18]. This paradox also exists in a similar form in classical bibliometrics and must be considered an inherent problem of the quantitative metrics in use [19].

Furthermore, the coverage of scientific publications is relatively low, and the distribution varies heavily both across disciplines and across platforms. Haustein et al. found that 21.5% of all scientific publications indexed in Web of Science in 2012 were mentioned in at least one tweet, while fewer than 5% of these publications were mentioned in any other social medium [20]. By comparison, 67% of the publications were cited at least once in Web of Science. A feasibility study conducted by the BMBF shows strong variation in coverage on altmetric.com between the scientific disciplines: publications from the field of medicine are represented considerably more often than, for example, publications from the engineering sciences [7]. Differences in coverage appear to benefit the humanities in particular. While these are scarcely covered in established databases such as Web of Science, their coverage is considerably better in the field of altmetrics, according to a study conducted by Hammarfelt: over 61% of the investigated publications in this field have at least one reader on Mendeley, and more than 20% have already been discussed on Twitter [21].

In general, the data basis underlying altmetrics is often problematic: the reproduction of data is almost impossible because data providers change, modify their data stock, or disappear completely [4]. For example, platforms such as Weibo or LinkedIn, which used to be included in the sources covered by altmetric.com, are no longer analyzed since these data providers no longer grant access. Quality control, such as a validity check of accounts or the clean-up of duplicates, rarely occurs on social media platforms, which complicates the aggregation and filtering of data for altmetrics service providers [22].
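To illustrate the kind of quality control mentioned above, the following is a minimal sketch of DOI normalization and deduplication of mention records. The record layout, normalization rules, and sample data are invented for illustration; they do not represent any provider's actual pipeline.

```python
# Hypothetical sketch: minimal quality control for altmetric mention records.
# Mentions are assumed to arrive as (source, doi) pairs.

def normalize_doi(raw: str) -> str:
    """Lower-case a DOI and strip common URL/label prefixes."""
    doi = raw.strip().lower()
    for prefix in ("https://doi.org/", "http://doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi

def deduplicate(mentions):
    """Keep one record per (source, normalized DOI) pair."""
    seen = set()
    unique = []
    for source, raw_doi in mentions:
        key = (source, normalize_doi(raw_doi))
        if key not in seen:
            seen.add(key)
            unique.append(key)
    return unique

mentions = [
    ("twitter", "10.5772/INTECHOPEN.76874"),
    ("twitter", "https://doi.org/10.5772/intechopen.76874"),  # same mention, differently written DOI
    ("news",    "doi:10.5772/intechopen.76874"),
]
print(deduplicate(mentions))
# Two unique records remain: one Twitter mention, one news mention.
```

Without even this basic normalization, the same publication mentioned under differently written DOIs would be counted multiple times, inflating the aggregated counts.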

**4. Requirements of altmetrics**


To date, the European Commission ascribes high significance to altmetrics, particularly against the backdrop of open science. This is also reflected in the establishment of the associated expert group. These efforts have so far led to a compilation of twelve recommendations within the open science context. At the supranational political level of the European Union, the importance of guidelines for the conscientious application of metrics is emphasized. In the following, these guidelines are interwoven with the demands of the Leiden Manifesto for research metrics.

The Leiden Manifesto emphasizes the aspect of complementarity as a central principle and basis of any evaluation practice. According to it, quantitative metrics and existing qualitative practices should complement each other in an advantageous manner. Peer review and expert assessment—this is the ambition—could be reinforced by the appropriate use of quantitative metrics, and further aspects beyond the traditional science system could be illuminated: "quantitative evaluation should support qualitative, expert assessment" [14].

To what extent this circumstance will change over time cannot currently be predicted; it depends more on the social discourse on science and the opening of the science system than on further methodological developments.

Altmetrics: State of the Art and a Look into the Future. http://dx.doi.org/10.5772/intechopen.76874
**5.2. Public relations, visibility, and advertising of activities**

A part of communication on science and its visibility in the public sphere is reflected by altmetrics. In any case, it should be noted that social media activity, measured by the frequency of contributions and the number of people involved, shows a rising trend. Thus, it is becoming increasingly important to use social media platforms in order to proactively draw attention to research, that is, to advertise it.

As an example in this context, institutional efforts, such as those undertaken by universities or the European Commission, can be observed, which strategically position their own publications and activities. Against the backdrop of the explorative state of these efforts, altmetrics could serve as feedback, for example, to test various approaches aimed at new target groups in society. With regard to research policy, activities with strong social relevance, and their visibility, could represent an interesting field of application complementing current evaluation approaches for analyzing media feedback. Initial network analyses are already delivering promising results, and their application to research policy issues could be examined. For specific issues associated with the propagation of communication, attention could be focused, for example, on the identification of relevant multipliers—for example, science journalists and representatives from politics, industry, and interest groups—in the dissemination of information. Identifying such mechanisms and transmission channels in pilot studies would be a promising research priority in this respect, in addition to media feedback already addressed through established investigation designs.
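As a purely illustrative sketch of such a network analysis, identifying candidate multipliers can start from something as simple as out-degree centrality on a diffusion network. The edge list and account names below are invented; real studies would of course use richer centrality measures and real sharing data.

```python
from collections import Counter

# Hypothetical retweet/share edges: (spreader, receiver). Invented data.
edges = [
    ("science_journalist", "reader1"),
    ("science_journalist", "reader2"),
    ("science_journalist", "policy_maker"),
    ("policy_maker", "reader3"),
    ("reader1", "reader2"),
]

# Out-degree: how many accounts each user passed information on to.
out_degree = Counter(spreader for spreader, _ in edges)

# Accounts with the highest out-degree are candidate multipliers.
print(out_degree.most_common(1))  # [('science_journalist', 3)]
```

In this toy network, the account that spreads information to the most others surfaces immediately; in a pilot study, such a ranking would only be a starting point for closer qualitative inspection of who those accounts actually are.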

Publishers already use the Altmetric score mentioned in Section 3 as feedback on articles, albeit in a strongly aggregated and simplified form. Similar efforts are also apparent at universities and research institutions, which are testing the implementation of the Altmetric Donut both with and without the score, although the added value of these efforts has yet to be clarified. As part of a pilot measure, the OECD is currently investigating to what extent the Altmetric Explorer and the implementation of the Altmetric score are suited to determining the social reach of policy documents.

Science institutions can also use altmetrics within the scope of science marketing: it is conceivable that altmetrics could be used to focus attention on those publications of an institution that are widely discussed, shared, tweeted, or used in news pieces. This would permit the interface between science and society to be better addressed.
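A marketing workflow of this kind could boil down to ranking an institution's publications by their total social media attention. The following sketch uses invented DOIs and mention counts purely to illustrate the selection step:

```python
# Illustrative sketch for the science-marketing use case: rank an
# institution's publications by total social media mentions.
# All DOIs and figures below are invented.
publications = {
    "10.1000/paper-a": {"tweets": 120, "news": 4, "blogs": 2},
    "10.1000/paper-b": {"tweets": 3,   "news": 0, "blogs": 1},
    "10.1000/paper-c": {"tweets": 45,  "news": 9, "blogs": 0},
}

def total_mentions(counts: dict) -> int:
    """Sum the mention counts across all sources."""
    return sum(counts.values())

# Publications sorted by overall attention, most discussed first.
ranked = sorted(publications,
                key=lambda doi: total_mentions(publications[doi]),
                reverse=True)
print(ranked)  # ['10.1000/paper-a', '10.1000/paper-c', '10.1000/paper-b']
```

The most discussed publications would then be the natural candidates for an institution's press or social media channels.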

Whether there is any benefit from altmetrics in economics or politics, beyond science, has not yet been verified. From our viewpoint, there would be benefits if more economically or politically relevant sources were covered by the altmetrics databases. In this case, it would be possible to observe or measure the contribution of science to the economy or to policy. With bibliometric instruments, such as publication or citation analyses, it is not possible to measure this contribution, since the economic and political worlds do not publish articles in scientific outlets. With



Another aspect is the openness and transparency of all steps in the analysis process: "keep data collection and analytical processes open, transparent and simple" [14]; that is, analyses should be verifiable and the indicators should not be unnecessarily complicated. At the same time, this does not mean that simple but meaningless indicators (e.g., pure absolute counts) should be used instead.

This recommendation is particularly important against the backdrop of the Altmetric attention score, since this composite indicator always combines data from many different sources. Their individual significance is unknown, so the score value can contribute only rudimentary information on the visibility of a publication in social media and therefore cannot be used for evaluation. At this point, attention should also be drawn to the inappropriate use of the journal impact factor, which occurs particularly frequently in medical science: its incorrect use as a citation indicator for individual articles instead of as a simple journal indicator shows that it is immensely difficult to eliminate a metric once it has been established. Metrics in the scientific context must be reliable, reproducible, and significant.
