What are altmetrics? The term altmetrics – short for alternative metrics – first appeared in “Altmetrics: A manifesto” (Priem, Taraborelli, Groth, & Neylon, 2010). The manifesto did not provide a definition but a vision: “altmetrics expand our view of what impact looks like, but also of what’s making impact… they’re great for measuring impact in this diverse scholarly ecosystem”.
Over the years there have been several attempts to develop that vision into a definition and to provide a clear understanding of what the term actually captures. Here are some examples:
We see a variety of definitions: some are technical, others are broader but insufficiently detailed. It is therefore important to clarify what the Journal considers altmetrics. Our definition aims to be holistic on the one hand and detailed on the other.
Altmetrics was coined as an umbrella term covering a large range of new online metrics related to scientific activities. They are expected to complement the more traditional science indicators (e.g., publication and citation counts) and assessment methods with new tools, and to provide previously unexplored facets of “impact”. More importantly, they unveil how research is communicated and received by the wider public on social media platforms, in grey literature sources, and on the Web in general. This allows for a broader analysis of the interactions between these diverse audiences and a large range of scholarly objects and research entities through (among others) viewing, reading, disseminating, discussing, commenting on, (dis)liking, and sharing scholarly-related information. As such, altmetrics go beyond the mere counting of events, because some of their forms (e.g., blogs, news, Facebook posts, comments, tweets) contain text and multimedia and reach ample audiences, opening the possibility to study the interactions between these audiences and the scientific world. Thus, altmetrics track online attention and usage not just of journal articles, but of anything considered part of the research process and all its related actors, including all sorts of scholarly-related outputs (e.g., books, book chapters, scientific press releases, reports, patents, data, software and video) as well as all types of scholarly entities (e.g., researchers, research organizations, funders, journals, publishers, and research topics).
Altmetrics are diverse, dynamic, multi-faceted, and fast. Fang and Costas (2018) showed that tweets and Reddit posts about research outputs accumulate half of their counts in only about 15 days. They are therefore often viewed as “early signals of impact” (Wilsdon et al., 2017).
It should be noted that early interest is sometimes not reflected in traditional measures of citation impact collected years later. As an example, consider the two top-ranked publications in the Altmetric Top 100 list for 2014 (https://www.altmetric.com/top100/2014/). The first, “Experimental evidence of massive-scale emotional contagion through social networks”, had an Altmetric attention score of 5,044 at the end of 2014; the second, “Variation in melanism and female preference in proximate but ecologically distinct environments”, had an attention score of 4,083. By mid-July 2018, the first article had received 612 citations in Scopus, 458 in WoS and 1,371 in Google Scholar, and its Altmetric attention score had risen to 6,868. The second article was cited only 4 times in both WoS and Scopus and 7 times in Google Scholar – in this case, the early attention did not predict future citation impact. This is also reflected in its current Altmetric attention score, which decreased from 4,083 to 1,887.
Despite their many advantages, altmetrics have some downsides that need attention. Here are some:
The most important challenges, in my view, are to understand the meaning of altmetrics and what they measure (validity), why some get involved, and why others refrain from participating. Initial steps towards interpreting altmetrics were taken by Haustein, Bowman and Costas (2016).
That said, research on altmetrics is growing constantly, as demonstrated by a recent comprehensive literature review of the topic by Sugimoto, Work, Larivière and Haustein (2017).
The need for a dedicated journal for altmetrics can also be demonstrated by the sheer growth in research articles in this area. Combining all the references in the above literature review with new articles that have ‘altmetrics’ as a topic in their title, abstract or keywords, we found 978 publications on altmetrics and related topics indexed in Web of Science and Scopus.
Figure 1 demonstrates the overall growth in publications on the topic. It is worth noting that as of July 2018, the number of publications is already more than half of that for all of 2017. This in itself supports our assumption that there is a need for a specialized publication outlet for research on altmetrics in the form of the Journal of Altmetrics.
The five most prolific publication sources on altmetrics are: Scientometrics (88), JASIST (38), PLoS ONE (33), Profesional de la Información (29) and the Journal of Informetrics (25).
In order to analyze research on altmetrics, we considered only publications that were directly relevant (publications providing only background information were excluded) and that had DOIs. This dataset comprises 693 publications. We retrieved citations from the Web of Science (WoS) and Scopus, reader counts from Mendeley, and other altmetrics from the two major aggregators, Altmetric.com and PlumX.
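For readers who wish to replicate this kind of data collection, the sketch below shows one way to retrieve per-DOI indicators from Altmetric.com’s free public REST API. This is only an illustration, not the exact pipeline used for our dataset: the helper names are our own, and the response field names (e.g., `cited_by_tweeters_count`) reflect the v1 API as we understand it and should be checked against the current API documentation.

```python
import json
import urllib.error
import urllib.request

# Altmetric.com's free, rate-limited per-DOI endpoint (v1 API).
ALTMETRIC_API = "https://api.altmetric.com/v1/doi/{doi}"

def fetch_altmetric_record(doi):
    """Fetch the public Altmetric.com record for a DOI; None if untracked."""
    try:
        with urllib.request.urlopen(ALTMETRIC_API.format(doi=doi)) as resp:
            return json.load(resp)
    except urllib.error.HTTPError:
        return None  # a 404 means the DOI is not tracked by Altmetric.com

def summarize(record):
    """Reduce a raw record to the indicator families discussed in this editorial."""
    return {
        "attention_score": record.get("score", 0),
        "tweets": record.get("cited_by_tweeters_count", 0),
        "blogs": record.get("cited_by_feeds_count", 0),
        "facebook": record.get("cited_by_fbwalls_count", 0),
        "wikipedia": record.get("cited_by_wikipedia_count", 0),
    }

# Offline illustration with a trimmed record of the shape the API returns:
sample = {"score": 12.5, "cited_by_tweeters_count": 20, "cited_by_feeds_count": 2}
print(summarize(sample))
```

Aggregator-specific counts (e.g., PlumX’s summed Facebook likes, shares and comments) would require the corresponding provider’s own API and credentials.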
As expected, the highest coverage among the altmetric indicators is by Mendeley (97.3%) (Zahedi, Costas, & Wouters, 2014). Its coverage is higher than that of the citation databases (WoS 75.0%, Scopus 91.8%). Mendeley is followed by Twitter (Altmetric 70.0%, PlumX 65.1%) (Thelwall, Haustein, Larivière, & Sugimoto, 2013), blogs (Altmetric 32.6%, PlumX 16.0%), Facebook (Altmetric 20.1%, PlumX 24.8%) and Wikipedia (Altmetric 6.6%, PlumX 8.4%). Altmetric.com and PlumX track other altmetric indicators as well, but these had negligible coverage, except for the usage indicators collected by PlumX. The altmetric coverage of publications on “altmetrics” is considerably higher than that reported for a large dataset (more than 700,000 items) by Costas, Zahedi and Wouters (2015), indicating that altmetrics are a “hot” topic.
Table 1 displays the yearly average counts per publication for several indicators. Note that Altmetric counts the number of Facebook posts, while PlumX reports the sum of likes, comments and shares, so the two are not comparable. Both Mendeley reader counts and tweets are much higher than citations.
| Year of publication | Average Scopus citations | Average WoS citations | Average Mendeley readers | Average tweets (Altmetric) | Average tweets (PlumX) | Average blog mentions (Altmetric) | Average blog mentions (PlumX) | Average FB posts (Altmetric) | Average FB likes, shares, comments (PlumX) | Average Wikipedia mentions (Altmetric) | Average Wikipedia mentions (PlumX) |
|---|---|---|---|---|---|---|---|---|---|---|---|
Lastly, Table 2 displays the “mosts” by the different data sources. It is interesting to note that the only item that appeared more than once was the article “Online collaboration: Scientists and the social network” (Van Noorden, 2014). This article presents the results of a large survey on how researchers use social media platforms.
| Data source | First author | Title | Source | Pub. year |
|---|---|---|---|---|
| Scopus | Tenopir | Data sharing by scientists: Practices and perceptions | PLoS ONE | 2011 |
| WoS | Eysenbach | Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact | JMIR | 2011 |
| Mendeley | Hall | The Kardashian index: A measure of discrepant social media profile for scientists | Genome Biology | 2014 |
| Twitter A & P, Blog A | Van Noorden | Online collaboration: Scientists and the social network | Nature | 2014 |
| Blog P | Faulkes | The vacuum shouts back: Postpublication peer review on social media | Neuron | 2014 |
| Facebook A | Van Noorden | Online collaboration: Scientists and the social network | Nature | 2014 |
| Facebook P | Mewburn | Why do academics blog? An analysis of audiences, purposes and challenges | Studies in Higher Education | 2013 |
| Wikipedia A | 8 publications, each mentioned twice, including Van Noorden | | | |
| Wikipedia P | Piwowar | Altmetrics: Value all research products | Nature | 2013 |
In this editorial, I described what altmetrics means to the Journal’s Editorial Board and showed the growing interest in the topic, which justifies the launch of a new specialty journal, the Journal of Altmetrics. We hope that the Journal of Altmetrics will soon be among the top journals publishing research on altmetrics and will join established bibliometrics and research assessment journals such as Scientometrics and the Journal of Informetrics.
The author has no competing interests to declare.
Aharony, N., Bar-Ilan, J., Julien, H., Benyamin-Kahana, M., & Cooper, T. (no date). Acceptance of altmetrics by LIS scholars: An exploratory study. Journal of Librarianship and Information Science. DOI: https://doi.org/10.1177/0961000617742461
Altmetric.com. (no date). What are altmetrics? Retrieved from: https://www.altmetric.com/about-altmetrics/what-are-altmetrics/.
Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. DOI: https://doi.org/10.1016/j.joi.2014.09.005
Costas, R., Zahedi, Z., & Wouters, P. (2015). Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. Journal of the Association for Information Science and Technology, 66(10), 2003–2019. DOI: https://doi.org/10.1002/asi.23309
Fang, Z., & Costas, R. (2018). Studying the posts accumulation patterns of various Altmetric.com data sources. To be presented at the Altmetrics18 Workshop, London, UK.
Haustein, S., Bowman, T. D., & Costas, R. (2016). Interpreting ‘altmetrics’: Viewing acts on social media through the lens of citation and social theories. In: C. R. Sugimoto (Ed.), Theories of Informetrics and Scholarly Communication (pp. 307–405). Berlin: De Gruyter. DOI: https://doi.org/10.1515/9783110308464-022
NISO. (2016). Outputs of the NISO Alternative Assessment Metrics Project. NISO RP-25-2016. Retrieved from: https://groups.niso.org/apps/group_public/download.php/17091/NISO%20RP-25-2016%20Outputs%20of%20the%20NISO%20Alternative%20Assessment%20Project.pdf.
Plum Analytics. (2018). PlumX metrics. Retrieved from: https://plumanalytics.com/learn/about-metrics/.
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from: http://altmetrics.org/manifesto.
Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2017). Scholarly use of social media and altmetrics: A review of the literature. Journal of the Association for Information Science and Technology, 68(9), 2037–2062. DOI: https://doi.org/10.1002/asi.23833
Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PLoS ONE, 8(5), e64841. DOI: https://doi.org/10.1371/journal.pone.0064841
Van Noorden, R. (2014). Online collaboration: Scientists and the social network. Nature News, 512(7513), 126. DOI: https://doi.org/10.1038/512126a
Wilsdon, J., Bar-Ilan, J., Frodeman, R., Lex, E., Peters, I., & Wouters, P. F. (2017). Next-generation metrics: Responsible metrics and evaluation for Open Science. Report of the European Commission Expert Group on Altmetrics. Retrieved from: https://openaccess.leidenuniv.nl/bitstream/handle/1887/58254/report.pdf?sequence=1.
Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics, 101(2), 1491–1513. DOI: https://doi.org/10.1007/s11192-014-1264-0