Open access peer-reviewed chapter

Altmetrics: State of the Art and a Look into the Future

Written By

Dirk Tunger, Marcel Clermont and Andreas Meier

Submitted: 07 December 2017 Reviewed: 29 March 2018 Published: 05 November 2018

DOI: 10.5772/intechopen.76874

From the Edited Volume

Scientometrics

Edited by Mari Jibu and Yoshiyuki Osabe

Abstract

The development of alternative indicators (altmetrics) can be traced back to a discussion a few years ago whose central question was: does the focus on classical bibliometric indicators still adequately reflect the scientific and social significance of scientific work in the Internet age? In the course of this discussion, the term “altmetrics” was introduced as a collective term for all those indicators that capture previously unnoticed information from the Internet, especially from social media. Altmetrics shed light on the reception of scientific publications on news websites as well as in scientific blogs, policy papers, and other web-based content. This chapter deals with the current state of the art of altmetrics, focusing on the present discussion about their informative value. Furthermore, we investigate to what extent altmetrics can be used in scientific evaluations. We conclude with an outlook on the potential prospects for success of altmetrics in different fields of application.

Keywords

  • altmetrics
  • bibliometrics
  • informative value
  • scientific evaluation
  • social media

1. Introduction

As in many areas of private life and business, more and more processes, results, and discussions in science are shifting to the digital sphere. For example, scientific output is shared and discussed in established social media such as Twitter and Facebook. In addition, platforms created specifically for scientists, such as Academia.edu, ResearchGate, or Mendeley [1, 2], are growing in number. The “Science 2.0” [3] era is progressing, which simultaneously increases the demand for indicators capable of measuring web-based impact. Relying solely on citation counts from classical bibliometrics appears outdated, since they reflect only a limited picture of the impact of scientific publications [4].

To date, web-based impact in social media has been measured mainly by the number of downloads or clicks, or by indicators created by the platform operators themselves, such as the ResearchGate (RG) score [5]. These web-based metrics are grouped under the umbrella term “alternative metrics,” or “altmetrics” [6]. Collecting and analyzing altmetrics is gaining relevance, and not only in science: political decision makers, too, attach corresponding importance to the issue. The German Federal Ministry of Education and Research (BMBF), for example, has commissioned the first study evaluating the possibilities and limitations of using altmetrics for impact measurement [7]. Furthermore, the BMBF has initiated a funding line for quantitative science research in which the further investigation of altmetrics plays a central role.

The present chapter gives an overview of the current state of scientometric research on altmetrics. We show example metrics and discuss what conclusions can be drawn from them. It will become apparent that altmetrics do not yet meet the expectation of measuring scientific impact: the data are too heterogeneous, their interpretation has not been sufficiently clarified, and an indicator system with meaningful and reliable benchmarks does not yet exist. Furthermore, we investigate what strategies scientific institutions can pursue in using altmetrics and provide information on their prospects for success.

2. Scientific discussion of altmetrics

2.1. Basic scientific context of altmetrics

The introduction of alternative indicators for quantifying scientific output and the associated resonance on the Internet can be traced back to a discussion initiated by Priem et al. in 2010 [6]. They questioned whether focusing on the classical bibliometric indicators adequately reflects the scientific and social significance of research in the era of the Internet. In the course of this discussion, the expression “altmetrics” was coined as a collective term for alternative metrics that draw on web-based information about scientific publications. Altmetrics can therefore be regarded as a complement to classical bibliometric indicators, providing new information that was previously unavailable, predominantly from the social media sector. This new information makes it possible to examine the reception of scientific publications, for example, on news sites, in science blogs, policy papers, and other web-based sources.

The altmetrics community can now look back on almost 7 years of research. On the one hand, the “visibility and presence of altmetrics are quite impressive” [8]: they are used as marketing tools by many scientific publishers, more than 300 publications on the subject have appeared, and there are even conferences dedicated solely to altmetrics. On the other hand, there is no uniform definition and therefore no consensus on what exactly altmetrics measure and what conclusions can be drawn from the results [8, 9, 10]. The only definitional consensus is that the indicators in question are intended to measure the attention paid to scientific output where bibliometrics reaches its limits, that is, on the Internet [6]. Any further and more detailed differentiation of such metrics is lacking.

2.2. Tension between altmetrics and bibliometrics

Because the underlying communities are the same, there is a certain tension between altmetrics and bibliometrics. Both (sub-)disciplines are intended to fulfill the same purpose, namely to generate a picture of scientific impact, but they draw on different influencing factors. Almost reflexively, the two fields are often set in relation to each other, compared, or framed as an either/or choice.

In contrast, within the community itself, there is a general consensus that the two disciplines complement rather than exclude each other [11]. Altmetrics are not intended to replace the peer review process or bibliometrics; rather, they should be viewed as a second opinion [10] and a “new perspective on communication by and about science in social media” [7]. A report by the expert group on altmetrics on behalf of the European Commission likewise argues that classical bibliometrics and alternative metrics “offer complementary approaches to evaluation” [12]. The expert group furthermore sees potential for including a wider audience beyond the closed science system and for collecting information considerably faster than with conventional metrics. Moreover, the approach is not limited to conventional scientific publication formats but offers the prospect of making outputs such as software and data sets accessible (e.g., as part of research data management).

The big difference between bibliometrics and altmetrics lies in the fact that scientific publications are the traditional and indispensable main output of science. Bibliometrics thus measures something at the center of the scientific reward system. The communication of science to society, that is, what altmetrics measure, is not yet part of that reward system. Creating incentives and expanding the reward system at this point would likely lead to increased use of social media by science and would thus also strengthen altmetrics.

2.3. Use of altmetrics in science evaluations

With regard to the practical application of altmetrics in research policy, science evaluations, and management, the scientific community is mostly skeptical. Bornmann and Haunschild [13] stress the problematic nature of the matter, arguing that altmetrics should first conform to the Leiden Manifesto for research metrics [14] before being applied on a greater scale. The central difficulties associated with altmetrics are that there are currently no standardized indicators, that altmetric data are for the most part not accessible in a transparent and open manner, and that the numbers can be manipulated through “gaming,” that is, the targeted manipulation of data for the purpose of achieving better altmetric values. Such gaming activities are negative side effects of orienting evaluation practice along user statistics [9]. Despite the difficulty of unambiguously distinguishing gaming from marketing, altmetrics service providers are trying to minimize such effects. For example, altmetric.com manually removes obvious manipulations of altmetric scores or limits them by means of spammer lists [15].

Gaming is also a problem beyond the sources assessed by altmetrics service providers. A study by Meier and Tunger showed that it is possible to considerably influence the metric specially developed by the ResearchGate platform, the RG score [16]. The RG score is intended to measure the “scientific reputation” of ResearchGate users. It is influenced by the impact of a user’s own scientific publications but also by their social activities on the platform (see https://www.researchgate.net/RGScore/FAQ). Meier and Tunger found that, within a relatively short time and solely by gaming, a user without any scientific publications can achieve an RG score higher than those of half of all RG users.

In another study for the European Commission, Kim Holmberg found that altmetrics are not yet being applied in practice in the EU for the purposes of scientific evaluation. In his view, applying them on a wide scale would be premature as long as it remains unclear what altmetrics actually measure [17].

3. Problems associated with collecting and interpreting altmetrics

For the most part, a semantic analysis of contributions in social media is lacking, which is a major reason why the evaluation of altmetric counts is so difficult. Mentions are mostly counted based on identifiers such as the DOI; whether a given mention should be evaluated as positive or negative cannot be determined, so that a “performance paradox” develops [18]. This paradox also exists in a similar form in classical bibliometrics and must be considered an inherent problem of the quantitative metrics in use [19].
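
To make the counting logic concrete, the following minimal Python sketch illustrates purely identifier-based mention counting; the records and field names are invented for illustration and do not reflect any provider’s actual data model:

    from collections import Counter

    # Hypothetical mention records, as an aggregator might collect them:
    # each mention carries the DOI it references and the platform it came from.
    mentions = [
        {"doi": "10.1000/xyz123", "platform": "twitter"},
        {"doi": "10.1000/xyz123", "platform": "news"},
        {"doi": "10.1000/abc456", "platform": "blog"},
    ]

    # Identifier-based counting: every mention of a DOI counts the same,
    # regardless of whether the underlying post is praise or criticism,
    # which is exactly the "performance paradox" described above.
    counts = Counter(m["doi"] for m in mentions)
    print(counts.most_common())  # [('10.1000/xyz123', 2), ('10.1000/abc456', 1)]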

Furthermore, the coverage of scientific publications is relatively low, and its distribution varies heavily both across disciplines and across platforms. Haustein et al. found that 21.5% of all 2012 publications indexed in Web of Science were mentioned in at least one tweet, while the proportion mentioned in any other social medium was below 5% [20]. By comparison, 67% of these publications were cited at least once in Web of Science. A feasibility study commissioned by the BMBF shows strong variation in coverage on altmetric.com between the scientific disciplines: publications from the field of medicine are represented considerably more often than, for example, publications from the engineering sciences [7]. Differences in coverage appear to benefit the humanities in particular. While the humanities are scarcely covered in established databases such as Web of Science, their coverage in altmetrics sources is considerably better, according to a study by Hammarfelt: over 61% of the investigated publications in this field have at least one reader on Mendeley, and more than 20% have already been discussed on Twitter [21].
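
The coverage figures cited here are simple proportions: the share of a publication set with at least one mention on a given platform. The following Python sketch shows the calculation with invented example records, not the datasets of the cited studies:

    # Illustrative publication records with per-platform mention counts.
    publications = [
        {"id": "p1", "twitter": 3, "mendeley": 1},
        {"id": "p2", "twitter": 0, "mendeley": 5},
        {"id": "p3", "twitter": 0, "mendeley": 0},
    ]

    def coverage(pubs, platform):
        """Share of publications with at least one mention on the platform."""
        covered = sum(1 for p in pubs if p.get(platform, 0) > 0)
        return covered / len(pubs)

    print(f"Twitter coverage:  {coverage(publications, 'twitter'):.1%}")   # 33.3%
    print(f"Mendeley coverage: {coverage(publications, 'mendeley'):.1%}")  # 66.7%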

In general, the data basis underlying altmetrics is often problematic: reproducing results is almost impossible because data providers change, modify their data stock, or disappear completely [4]. For example, platforms such as Weibo or LinkedIn, which used to be included in the sources covered by altmetric.com, are no longer analyzed since these data providers no longer grant access. Quality control, such as validity checks of accounts or the clean-up of duplicates, rarely occurs on social media platforms, which complicates the aggregation and filtering of data for altmetrics service providers [22].

Furthermore, Fraumann et al. ascertained that duplicates can be found in several types of sources on altmetric.com, which casts doubt on the credibility of the attention score [23]. This attention score is currently used by many scientific publishers and institutions as a marketing tool in the form of the “Altmetric Donut” (see Figure 1). The Altmetric Donut is implemented on the websites of the journals Nature and Science, among others, and in the repositories of the universities of Cambridge and Zürich. The attention score is computed by an algorithm that adds up the attention, weighted differently by source, that a scientific output receives across diverse sources; a schematic sketch of such a weighted sum follows after Figure 1. This trend is regarded skeptically in science, where the Altmetric Donut is viewed as a successful gimmick that is meaningless for scholarship [9]. In general, simply adding up counts in a single metric is “impossible and undesirable” [12]. Thus, benchmarks such as the attention score do not represent the impact of scientific performance but are suited solely to filtering out those articles that have sparked interest in social media [25].

Figure 1. Example of the representation of the Altmetric Donut and its composition.
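
The structure of such a composite indicator, a weighted sum of counts per source type, can be sketched in a few lines of Python. The weights below are purely illustrative; Altmetric’s actual weighting scheme is proprietary and more involved, and is not reproduced here:

    # Purely illustrative source weights, NOT Altmetric's real weighting;
    # the point is only the structure of the indicator.
    WEIGHTS = {"news": 8.0, "blog": 5.0, "policy": 3.0, "twitter": 1.0}

    def composite_attention(counts):
        """Weighted sum of mention counts across source types."""
        return sum(WEIGHTS.get(source, 0.0) * n for source, n in counts.items())

    # Two very different reception patterns yield the same score,
    # illustrating why such a number cannot be read as a measure of impact.
    print(composite_attention({"news": 1, "twitter": 2}))  # 10.0
    print(composite_attention({"blog": 2}))                # 10.0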

4. Requirements of altmetrics

The European Commission currently ascribes high significance to altmetrics, particularly against the backdrop of open science. This is also reflected in the establishment of the associated expert group, whose efforts have so far led to a compilation of twelve recommendations within the open science context. At the supranational level of the European Union, the importance of guidelines for the conscientious application of metrics is emphasized. In the following, these guidelines are interwoven with the demands of the Leiden Manifesto for research metrics.

The Leiden Manifesto emphasizes complementarity as a central principle and basis of any evaluation practice. Quantitative metrics should complement existing qualitative practices in an advantageous manner: peer review and expert assessment, so the ambition goes, could be reinforced by the appropriate use of quantitative metrics, illuminating further aspects beyond the traditional science system: “quantitative evaluation should support qualitative, expert assessment” [14].

Another aspect is the openness and transparency of all steps in the analysis process: “keep data collection and analytical processes open, transparent and simple” [14]; that is, analyses should be verifiable, and the indicators should not be unnecessarily complicated. At the same time, this does not mean that simplistic indicators with no significance (e.g., pure absolute numbers) should be used instead.

This recommendation is particularly important against the backdrop of the Altmetric attention score, since this composite indicator combines data from many different sources. The individual significance of these sources is unknown, so the score can contribute only rudimentary information on the visibility of a publication in social media and therefore cannot be used for evaluation. At this point, attention should also be drawn to the inappropriate use of the journal impact factor, which occurs particularly frequently in medical science: its incorrect use as a citation indicator for individual articles instead of as a simple journal indicator shows how immensely difficult it is to eliminate a metric once it has become established. Metrics in the scientific context must be reliable, reproducible, and significant.

5. Future potential of altmetrics in various fields of application

To what extent altmetrics will establish themselves in research policy depends fundamentally on empirical experience from practical application, in the sense of a learning experimental system. Potential fields of application are therefore briefly outlined in the following paragraphs.

5.1. Science evaluation, performance assessment, and measurement of social impact

Given the explorative development stage of altmetrics described above, they must be used carefully in the performance assessment of institutions and individual scientists, for example, within the scope of scientific evaluations. In particular, there is a lack of studies investigating how valid and reliable science evaluation based on altmetrics is. The scientific discourse must reach a deeper understanding of the heterogeneity and the significance of the data. In addition, useful indicators must be developed, and benchmarking studies have to be conducted. According to current opinion, altmetrics will in the near future serve as a complementary component rather than an independent indicator for the assessment of scientific performance.

In addition, some research topics are more in the focus of society than others without necessarily having a larger social impact. In this context, attention should be drawn to news values theory, which describes the factors that make some topics reasonably sure to be reported in mass media while others are unlikely to become objects of journalistic reporting [24]. Against this backdrop, altmetrics can be viewed as an incomplete indicator of social visibility. To what extent this will change over time cannot currently be predicted and depends more on the social discourse on science and the opening of the science system than on further methodological developments.

5.2. Public relations, visibility, and advertising of activities

Altmetrics represent one part of the communication on science and its visibility in the public sphere. In any case, it should be noted that social media activity, measured by the frequency of contributions and the number of people involved, shows a rising trend. Thus, it is becoming increasingly important to use social media platforms to proactively draw attention to research, that is, to advertise it.

Institutional efforts, such as those undertaken by universities or the European Commission to strategically position their own publications and activities, can serve as an example in this context. Given the explorative state of these efforts, altmetrics could serve as feedback, for example, to test various approaches aimed at new target groups in society. With regard to research policy, activities with a strong social relevance and their visibility in particular could represent an interesting field of application, complementing current evaluation approaches for analyzing media feedback. Initial network analyses are already delivering promising results, and their application to research policy issues could be examined. For specific questions of communication propagation, attention could be focused, for example, on identifying relevant multipliers, such as science journalists and representatives from politics, industry, and interest groups, in the dissemination of information; a sketch of such an analysis follows below. Identifying such mechanisms and transmission channels in pilot studies would be a promising research priority, in addition to the medial feedback already addressed through established investigation designs.
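
One way to operationalize the identification of multipliers is a centrality analysis on the network of accounts passing on scientific content. The following minimal Python sketch uses the networkx library on an invented mention graph; the account names and edges are hypothetical:

    import networkx as nx

    # Hypothetical directed graph: an edge A -> B means account A
    # passed on (retweeted, linked) content originating from account B.
    G = nx.DiGraph()
    G.add_edges_from([
        ("reader1", "science_journalist"),
        ("reader2", "science_journalist"),
        ("reader3", "science_journalist"),
        ("science_journalist", "research_institute"),
        ("policy_analyst", "research_institute"),
    ])

    # Accounts whose content is passed on most often act as multipliers;
    # in-degree centrality is a simple first proxy for that role.
    centrality = nx.in_degree_centrality(G)
    for account, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
        print(f"{account}: {score:.2f}")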

Publishers already use the attention score mentioned in Section 3 as feedback on articles, albeit in a strongly aggregated and simplified form. Similar efforts are apparent at universities and research institutions, which are testing the implementation of the Altmetric Donut both with and without the score, although the added value of these efforts has yet to be clarified. As part of a pilot measure, the OECD is currently investigating to what extent the Altmetric Explorer and the implementation of the attention score are suited to determining the social reach of policy documents.

Science institutions can also use altmetrics within the scope of science marketing: it is conceivable that altmetrics could be used to focus attention on those publications by an institution that are widely discussed, shared, tweeted, or used in news pieces. This would permit the interface between science and society to be better addressed.

Whether altmetrics offer any benefit in economics or politics beyond science has not yet been verified. From our viewpoint, there would be benefits if more economically and politically relevant sources were covered by the altmetrics databases. In that case, it would be possible to observe and measure the contribution of science to the economy and to policy. With bibliometric instruments such as publication or citation analyses, this contribution cannot be measured, since the economic and political worlds do not publish articles in scientific outlets. With altmetrics, one could look, for example, at mentions of scientific publications in documents that influence politics, or at discussions on the application of scientific research in companies. In general, it would be easier to identify the impact of scientific contributions on individual groups if contributions on social media platforms could be associated with particular fields of application; the sketch below illustrates this idea.
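
Under the assumption that mentions were tagged with a source category (policy, news, industry), the contribution of science to individual spheres could be approximated by simple grouping. A minimal Python sketch with invented records:

    from collections import defaultdict

    # Hypothetical mention records tagged with a source category.
    mentions = [
        {"doi": "10.1000/xyz123", "category": "policy"},
        {"doi": "10.1000/xyz123", "category": "news"},
        {"doi": "10.1000/abc456", "category": "policy"},
        {"doi": "10.1000/abc456", "category": "industry_blog"},
    ]

    # Group publications by the sphere in which they were taken up, as a
    # rough proxy for the channels through which science reaches society.
    by_category = defaultdict(set)
    for m in mentions:
        by_category[m["category"]].add(m["doi"])

    for category, dois in sorted(by_category.items()):
        print(f"{category}: {len(dois)} publication(s)")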

5.3. Reporting reputation

For scientists, the visibility of their publications is essential. The reputation that results when others use their scientific output in the form of ideas, statements, calculations, and findings is an essential part of the science system. Only the use of the generated output creates sustainable value for an individual scientist, be it in other scientific publications or in web-based communication, social media, or news pieces. Bibliometrics and altmetrics help scientists document the visibility of their work. Accordingly, the majority of the almost 700 scientists who participated in a survey on the RG platform stated that a high RG score is important to them.

Altmetrics permit scientists to record, manage, and document their own visibility to a greater extent than was previously possible. Particularly for early-career scientists, this offers a great opportunity to increase attention and reputation independently of the traditional publication system. In the longer term, altmetrics could assume the function of documenting the mediation of science to society and making it more transparent.

5.4. Support from libraries

Academic libraries are usually where contacts can be found within a scientific institution for issues related to publication data and bibliometric processes and indicators. Librarians clean data, compile publication profiles, and collect data within the scope of evaluations. They are thus specialists in handling data, particularly data related to publications, user statistics, and stock management.

This is where altmetrics represent a connecting element, as they illuminate the use of publications in social media. It is thus plausible for libraries to be directly involved whenever the issue of altmetrics is addressed at an institution. This makes sense because librarians are in contact with many areas of a scientific institution and offer advice on using information products. Roemer and Borchardt identified this central role of libraries and summarize: “[…] librarians serve as natural leaders when it comes to altmetrics […]” [26]. They argue that this is due to the resources and data expertise of libraries as well as their central position as contact partners for various target groups [27, 28].

6. Conclusion

In conclusion, altmetrics are currently still at an explorative stage and have far to go before they can make a regular contribution to the quantitative science indicators of bibliometrics [29]. We have shown that there are still problems with the indicators and the associated benchmarks. For this reason, the use of altmetrics in the context of science evaluations is not yet feasible. At the same time, however, this insight could serve as an incentive to enhance application maturity and to create the political boundary conditions for advancing further developments. Thanks to initial applications of altmetrics in the academic context, important experience is being gained. The scientific debate of the past few years has led altmetrics to achieve the validity and application maturity required for initial applications. For more thorough applications, however, they must be developed further; in particular, indicators have to go beyond the level of individual publications and should aggregate data at various levels. Additionally, the problems of altmetric indicators have to be addressed, especially regarding coverage, representativeness, gaming, and validity.

Interviews conducted by the bibliometrics team at Forschungszentrum Jülich with experts in the fields of bibliometrics and altmetrics confirm the above-mentioned findings [7]. The experts gave statements on the meaningfulness and application maturity of altmetrics, locating the significance of altmetric indicators in a low to medium range only. The initial euphoria in the field, which focused on far-reaching potentials up to the measurement of social impact and the performance evaluation of science, seems to have subsided.

There was consensus among the experts that altmetrics are not an alternative to bibliometrics but a new perspective on communication from and about science in social media: perception and “popularity” are in the foreground. Scientific quality or excellence, however, is only marginally represented by altmetrics, since it correlates only partially with perception. In principle, this distinguishes altmetrics from bibliometrics, which rests on an inherently peer review-based approach to the evaluation of science.

In contrast to their meaningfulness, the experts’ assessments of the application maturity of altmetrics differ more strongly. This is partly because expectations diverge: should these metrics be a purely quantitative indicator, or do they provide the starting point for qualitative analyses? Furthermore, the areas of application are very broad and also include marketing activities, which have so far been of secondary importance for research policy. Against this background, there is nevertheless unanimity that altmetrics currently cannot be interpreted as a standalone quantitative indicator. In particular, it was unanimously emphasized that altmetrics do not rest on the kind of sound scientific database that is a prerequisite for the assessment of scientific work.

Views on what role policymakers should play and how altmetrics can be used for research policy diverge. In most of the interviews, however, the experts stated that politicians should play an active role in shaping the implementation of altmetrics. Politicians could create a superordinate and binding framework for the application of altmetrics, for instance, by anchoring requirements and formulating research questions.

In the long term, the increasing involvement of science in social media platforms will have a positive effect on the application of altmetrics. In addition, data providers are designing their sources more systematically and increasingly semantically. Current developments appear promising and point toward an expansion of the source selection for English-language policy documents and news articles [15]. This would mean that, in addition to the relevant news target groups, two complementary transmission channels from science into politics and industry could be covered.

References

  1. Ortega JL. Relationship between altmetric and bibliometric indicators across academic social sites: The case of CSIC’s members. Journal of Informetrics. 2015;9:39-49. DOI: 10.1016/j.joi.2014.11.004
  2. Thelwall M, Kousha K. Academia.edu: Social network or academic network? Journal of the Association for Information Science and Technology. 2014;65:721-731. DOI: 10.1002/asi.23038
  3. Shneiderman B. Human-computer interaction redefines science [Internet]. 2008. Available from: www.sciencedaily.com/releases/2008/03/080306170924.htm [Accessed: October 23, 2017]
  4. Haustein S, Peters I, Sugimoto CR, Thelwall M, Larivière V. Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature. Journal of the Association for Information Science and Technology. 2014;65:656-669. DOI: 10.1002/asi.23101
  5. Kraker P, Lex E. A critical look at the ResearchGate score as a measure of scientific reputation. In: Quantifying and Analysing Scholarly Communication on the Web (ASCW ‘15); 2015
  6. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: A Manifesto [Internet]. 2010. Available from: http://altmetrics.org/manifesto [Accessed: October 23, 2017]
  7. Tunger D, Meier A, Hartmann D. Machbarkeitsstudie Altmetrics [Internet]. 2017. Available from: http://hdl.handle.net/2128/16419 [Accessed: March 08, 2018]
  8. Haustein S. Vier Tage für fünf Jahre Altmetrics: Bericht über die Konferenz 2AM und den Workshop altmetrics15. b.i.t. Online. 2016;19:110-112
  9. Franzen M. Digitale Resonanz: Neue Bewertungskulturen fordern die Wissenschaft heraus. WZB Mitteilungen. 2017;155:30-33
  10. Butler JS, Kaye ID, Sebastian AS, Wagner SC, Morrissey PB, Schroeder GD, Kepler CK, Vaccaro AR. The evolution of current research impact metrics: From bibliometrics to altmetrics? Clinical Spine Surgery. 2017;30:226-228. DOI: 10.1097/BSD.0000000000000531
  11. Wouters P, Thelwall M, Kousha K, Waltman L, de Rijcke S, Rushforth A, Franssen T. The metric tide: Literature review (Supplementary report I to the independent review of the role of metrics in research assessment and management) [Internet]. Available from: http://www.dcscience.net/2015_metrictideS1.pdf [Accessed: March 08, 2018]
  12. Wilsdon JR, Bar-Ilan J, Frodeman R, Lex E, Peters I, Wouters P. Next-generation metrics: Responsible metrics and evaluation for open science [Internet]. 2017. Available from: http://eprints.whiterose.ac.uk/113919 [Accessed: March 08, 2018]
  13. Bornmann L, Haunschild R. To what extent does the Leiden Manifesto also apply to altmetrics? A discussion of the manifesto against the background of research into altmetrics. Online Information Review. 2016;40:529-543. DOI: 10.1108/OIR-09-2015-0314
  14. Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I. Bibliometrics: The Leiden Manifesto for research metrics. Nature. 2015;520:429-431. DOI: 10.1038/520429a
  15. Altmetric.com. Personal interview on August 14-15, 2017
  16. Meier A, Tunger D. Investigating the transparency and influenceability of altmetrics using the example of the RG score and the ResearchGate platform. Working paper
  17. Holmberg KJ. Altmetrics for Information Professionals: Past, Present and Future. Amsterdam: Chandos Publishing; 2015
  18. Meyer MW, Gupta V. The performance paradox. Research in Organizational Behavior. 1994;16:309-369
  19. Holbrook J, Barr K, Brown KW. We need negative metrics too. Nature. 2013;497:439. DOI: 10.1038/497439a
  20. Haustein S, Costas R, Larivière V. Characterizing social media metrics of scholarly papers: The effect of document properties and collaboration patterns. PLoS One. 2015;10:e0120495. DOI: 10.1371/journal.pone.0120495
  21. Hammarfelt B. Using altmetrics for assessing research impact in the humanities. Scientometrics. 2014;101:1419-1430. DOI: 10.1007/s11192-014-1261-3
  22. Thelwall M. A brief history of altmetrics. Research Trends. 2014;37:3-4
  23. Fraumann G, Zahedi Z, Costas R. What do we know about Altmetric.com sources? A study of the top 200 blogs and news sites mentioning scholarly output [Internet]. Available from: http://hdl.handle.net/1887/48266 [Accessed: March 08, 2018]
  24. Galtung J, Ruge MH. The structure of foreign news. Journal of Peace Research. 1965;2:64-90. DOI: 10.1177/002234336500200104
  25. Warren HR, Raison N, Dasgupta P. The rise of altmetrics. Journal of the American Medical Association. 2017;317:131-132. DOI: 10.1001/jama.2016.18346
  26. Roemer RC, Borchardt R. Altmetrics and the role of librarians. Library Technology Reports. 2015;51:31-38
  27. Gimpl K. Evaluation von ausgewählten Altmetrics-Diensten für den Einsatz an wissenschaftlichen Bibliotheken [Internet]. Available from: https://publiscologne.th-koeln.de/frontdoor/index/index/docId/1034 [Accessed: March 08, 2018]
  28. Forschungszentrum Jülich. Altmetrics: Metrics for information on the dissemination of scientific publications [Internet]. Available from: http://www.fz-juelich.de/zb/EN/altmetrics [Accessed: March 28, 2018]
  29. Haustein S. Grand challenges in altmetrics: Heterogeneity, data quality and dependencies. Scientometrics. 2016;108:413-423. DOI: 10.1007/s11192-016-1910-9
