Identifying, describing, and analyzing the main advantages and disadvantages of social media metrics for research assessment constitutes one of the main activities within the field of altmetrics (Haustein 2016).
While Wouters and Costas (2012) highlight the main advantages of altmetrics through four dimensions (broadness, diversity, speed, openness), Priem (2014) points out a series of disadvantages, including the lack of theory, ease of gaming, and possible biases. Bornmann (2014) expands the taxonomy of limitations to data quality (bias, target, multiple versions, different meanings, measurement standards, mention standards, cross-field and time normalization, and replication), missing evidence, and manipulation. Likewise, Haustein (2014) suggests the representativeness of altmetrics—which might be included within the ‘data quality’ category—as one of the main limitations of altmetrics.
The representativeness issue can be treated from different perspectives, namely the population of active users using social platforms (number of users), the order of magnitude of data collected by the direct sources of metrics (amount of data generated), the actual coverage of data obtained by altmetrics data providers (amount of data identified), and the coverage of documents indexed by altmetrics providers (percentage of published documents that are both mentioned and identified). This last issue represents the objective of this study.
Haustein et al. (2014a) performed one of the first studies aimed at determining the coverage of documents with alternative metrics through Altmetric.com, obtaining a coverage of 45.2% for a sample of 84,374 documents deposited in Arxiv.org. Later, Robinson-García et al. (2014) analyzed a corpus of 2,792,706 articles (published between 2011 and 2013, with a digital object identifier (DOI), and indexed in the Web of Science), estimating the coverage at 19%, although with important differences according to the source: Twitter was the main source (87.1% of covered articles), followed by Mendeley (64.8%) and Facebook (19.9%). However, because Altmetric.com only checks Mendeley when a document has at least one other altmetric mention, the lower values for Mendeley relative to Twitter may be partly an artifact.
Costas et al. (2015) performed another study through Altmetric.com (a set of 718,315 documents with DOI). The results reflected that “only 7% of all papers in the WoS (without any time restriction and with DOI) had some altmetric score as covered by Altmetric.com”, detecting however a growth in recent documents (as of 2010).
Scientific literature had already warned of the strong concentration of metrics in Mendeley and especially Twitter (Priem et al. 2012). Document coverage studies on these two platforms offer percentages that depend on the samples analyzed. For example, Zahedi et al. (2014) analyzed a random sample of 19,722 publications (published between 2005 and 2011, with DOI and indexed in Web of Science) through ImpactStory, finding that 62.6% of the documents had at least one reader in Mendeley and only 1.6% had a Tweet mention.
Discipline-dependent coverage has also been evidenced in the literature. Haustein et al. (2014a) show important differences in the percentage of documents with some mention on social networks (from 30.4% in ‘Nuclear Theory’ to 85.2% in ‘Quantitative Biology’), while Costas et al. (2015) obtained much lower percentages (from 5.4% in ‘Mathematics and Computer Science’ to 22.8% in ‘Biomedical and Health Science’). Another large sample (1,431,576 documents), circumscribed to biomedicine, shows that 9.4% of the documents had at least one tweet (Haustein et al. 2014b) and 66% had a reader in Mendeley (Haustein et al. 2014c). The differences across areas of knowledge are equally significant: Mohammadi and Thelwall (2014) determined that Mendeley covered only 28% of humanities articles (indexed in Web of Science and published in 2008), while that percentage increased to 58% for the social sciences.
Altmetrics coverage has also been shown to depend on geographical area and country, where different degrees of penetration and use of social networks, alongside different cultures of diffusing academic activity, shape the available altmetric data. Alperin (2015) shows that altmetrics coverage in Brazil and Chile is far superior to that of the rest of Latin America, based on a sample of 389,795 articles published in journals included in the SciELO (Scientific Electronic Library Online) platform (http://www.scielo.org), of which 44.6% (173,733 documents) received at least one mention.
The literature has also analyzed different document aggregations, such as journals and institutions. Regarding journals, significant differences in coverage have been detected according to the area and the source analyzed, with large multidisciplinary journals (e.g., Science, Nature, PNAS, PLoS One) standing out. Priem et al. (2012) estimated that 80% of the articles published in PLoS One were indexed in Mendeley, although this percentage dropped to 12% when coverage was measured on Twitter; Barthel et al. (2015) subsequently noted that the Twitter percentage had grown to 53%. Li et al. (2012) found high coverage for articles published in 2007 in Nature (93.8%) and Science (92.8%).
With regard to universities, Torres-Salinas et al. (2018) focused on the Spanish university system, collecting information on the activity of 66 universities in Altmetric.com through a corpus of 125,824 documents indexed in Web of Science between 2014 and 2016. The results indicate a total coverage of 42% of the articles, although with substantial differences between universities, from 11% coverage at the Universidad Pontificia de Salamanca to 71% at the Universitat Pompeu Fabra.
As observed, the literature provides variable coverage figures. However, these studies are limited to their selected samples, which are restricted to documents published in specific disciplines, time periods, or geographic areas, or indexed in selective bibliographic databases. No global study has yet addressed the coverage of academic documents that have achieved some kind of impact on social network platforms, or its evolution over time. This issue is of great importance for understanding the degree of penetration of altmetrics in the academic sphere, as well as for contextualizing the order of magnitude of the altmetric data gathered, especially in macro-level and meso-level analyses.
The appearance of Dimensions (https://dimensions.ai) in January 2018 (Schonfeld 2018) may constitute an opportunity to carry out global coverage studies of altmetrics. Dimensions is a bibliographic database launched by Digital Science (https://www.digital-science.com) covering publications as well as grants, patents, clinical trials, and policy documents, along with citations received and altmetric attention data via Altmetric.com (Bode et al.; Hook et al. 2018). As of August 30, 2018, Dimensions includes 96,725,143 documents, whereas Scopus includes 72,372,800 documents and the Web of Science Core Collection (considering the Emerging Sources, Proceedings, and Book Citation Indexes) includes a total of 69,842,611 documents.
The coverage of Dimensions, together with its quality control, its available API, as well as its connection to Altmetric.com (Orduna-Malea & Delgado López-Cózar 2018a), place this database a priori as an optimal bibliometric tool for the purpose of analyzing the global coverage of altmetrics.
In addition, this database allows researchers to easily analyze the coverage of documents with altmetric mentions not only at a general level (total documents indexed), but also at the unit level (documents according to institutions, countries, cities, journals, field, etc.). However, its validity and accuracy to this end is still to be tested.
The work by Thelwall (2018) constitutes the first bibliometric analysis of the product, emphasizing that its coverage is already similar to that of Scopus. The study also concludes that Dimensions and Scopus strongly correlate in terms of citation counts, thereby opening the door to its use as a bibliometric analysis tool. However, Orduna-Malea and Delgado López-Cózar (2018b) later detected a series of inconsistencies in the thematic classification used, also discussed by Bornmann (2018) and Herzog and Lunn (2018). Nonetheless, its potential as a next-generation research and discovery platform for better and more efficient access to academic material has already been tested (Chen 2018).
In relation to the worldwide altmetrics coverage of scientific literature, the following research questions are addressed:
RQ1. What is the total coverage of academic documents with altmetric mentions at present, and how has this coverage evolved over time?
RQ2. Which places in the world (countries, cities) show a higher percentage of documents with altmetric mentions, and how has this coverage evolved over time?
RQ3. Which institutions show a higher percentage of documents with altmetric mentions, and how has this coverage evolved over time?
RQ4. Which journals show a higher percentage of documents with altmetric mentions, and how has this coverage evolved over time?
RQ5. Which research categories show a higher percentage of documents with altmetric mentions, and how has this coverage evolved over time?
RQ6. Which funding bodies show a higher percentage of documents with altmetric mentions, and how has this coverage evolved over time?
RQ7. Is Dimensions an accurate bibliographic tool to carry out an analysis of the worldwide penetration of altmetrics, according to different units of analysis?
The Pro version of the Dimensions database directly provides structured information about the set of documents matching each specific query, including the number of publications, number of citations, citations per publication, the mean Relative Citation Ratio (RCR), the mean Field Citation Ratio (FCR), the percentage of articles cited, and the percentage of publications with an Altmetric Attention Score of one or higher, hereinafter referred to as the AAS percentage.
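Although Dimensions reports this figure directly, the AAS percentage itself is straightforward to reproduce. The following sketch (with hypothetical records and field names, not the Dimensions API) illustrates the definition used throughout this study:

```python
# Minimal sketch, assuming hypothetical publication records; the "AAS
# percentage" is the share of documents whose Altmetric Attention Score
# (AAS) is one or higher.
publications = [
    {"doi": "10.1000/a", "aas": 0},
    {"doi": "10.1000/b", "aas": 12},
    {"doi": "10.1000/c", "aas": 1},
    {"doi": "10.1000/d", "aas": 0},
]

def aas_percentage(pubs):
    """Percentage of publications with an Altmetric Attention Score >= 1."""
    with_aas = sum(1 for p in pubs if p["aas"] >= 1)
    return 100 * with_aas / len(pubs)

print(aas_percentage(publications))  # 50.0
```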
In order to address RQ1, we defined a global query (97,531,400 documents; time interval: 1665 to 2018). This query was subsequently filtered according to the open access level (all, publisher, and repository). Finally, all publications between 2000 and 2017 (52,048,103 documents) were considered as the study sample.
In regard to RQ2, RQ3, RQ4, RQ5, and RQ6, we selected the top 50 journals, countries, cities, institutions, and funding bodies according to the total number of publications indexed in the database. In the case of disciplines, all of them (152) were gathered in order to minimize potential biases in the results.
Then, for each entity (journals, countries, cities, institutions, funding bodies, and research categories), the annual number of publications from 2000 to 2017 was additionally gathered. Finally, for each entity and year of publication, the corresponding percentage of publications cited and the AAS percentage were calculated.
In the case of documents with multiple authors, binary counting was used to quantify the locations (countries and cities) and institutions. That is, each location and institution was counted once per publication.
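A minimal sketch of this binary counting, using invented records and field names, shows how each entity is credited at most once per publication regardless of how many co-authors share an affiliation:

```python
# Hedged sketch of binary (whole) counting: deduplicate affiliations within
# each publication, then count each institution/country once per publication.
# Records and field names are hypothetical.
from collections import Counter

publications = [
    {"affiliations": [("Harvard Univ", "US"), ("Harvard Univ", "US"),
                      ("Univ of Tokyo", "JP")]},
    {"affiliations": [("Univ of Tokyo", "JP")]},
]

inst_counts, country_counts = Counter(), Counter()
for pub in publications:
    # Sets remove within-publication duplicates before counting.
    inst_counts.update({inst for inst, _ in pub["affiliations"]})
    country_counts.update({country for _, country in pub["affiliations"]})

print(inst_counts["Harvard Univ"])  # 1, not 2
print(country_counts["JP"])         # 2
```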
Lastly, with regard to RQ7, we applied various procedures to test the reliability of the database.
All data were extracted manually and then statistically analyzed with XLStat. The first sample was gathered in July 2018 and the second sample in October 2018. Data were analyzed in November 2018. The raw results per entity are available in supplementary file 1.
Considering the total coverage of Dimensions as of October 2018 (97,531,400 publications), the percentage of documents (all typologies) with an Altmetric Attention Score amounts to 9.4% (Table 1). If we consider only open access documents (19.9% of publications in the database), the percentage increases to 18.9%. This suggests greater visibility of documents when they are available in some open access form (i.e., an open access altmetric advantage).
| Document set | Publications | AAS (%) |
|---|---|---|
| Open Access – All | 19,388,638 | 18.9 |
| Open Access – Publisher | 15,180,909 | 18.6 |
| Open Access – Repository | 4,207,729 | 20.0 |
The percentage of publications with an AAS remains stable at around 8% from 2000 to 2010, then experiences a notable increase from 2012 (14.8%) to 2017 (22.5%) (Figure 1). This growth can be directly related to the launch of Altmetric.com, the company that provides Dimensions with altmetric data, as well as to an increase in the use of academic social network platforms by researchers, already anticipated by several Nature surveys (Van Noorden 2014; Harseim & Goodey 2017).
If we disaggregate the results according to each of the units of analysis considered (top 50 countries, cities, institutions, journals, disciplines, and funding bodies), we can see that the total AAS percentage (considering the complete coverage of the database) varies depending on the type of entity analyzed (Table 2). While it is fairly homogeneous for countries and cities (standard deviations of 4.9 and 5.8, respectively), it is more dispersed for funding bodies and, especially, for journals. This effect can be visualized in the box plots drawn for each entity (Figure 2).
The global peak reached in recent years (see Figure 1), together with the growth in raw academic publication output, may distort to some extent the overall percentage of documents with altmetrics, as well as the specific values obtained at the unit level. Table 3 (publications) and Table 4 (Altmetric Attention Score percentage) show, for each of the units of analysis considered (top 50 countries, cities, institutions, journals, disciplines, and funding bodies), the Spearman correlation between the results obtained in 2017 and those obtained both at the beginning of the analyzed period (2000) and for the total values (from 1665).
| Entity | Total vs 2017 | 2000 vs 2017 |
|---|---|---|
As we can observe, the results (both for publications and AAS percentage) for 2017 strongly correlate with the total values for all units except journals. Conversely, the correlation between 2017 and 2000 is weaker. A reasonable explanation is the striking concentration of publications and altmetric activity in 2017.
The coverage of journals (and articles published by each journal) constitutes the fundamental piece on which the coverage of any bibliographic database is sustained and, therefore, will determine the coverage of documents with altmetrics (both total and broken down by aggregated entities) that Dimensions will show.
PLoS One is the journal with the highest number of published articles with an AAS (approximately 131,508 documents), followed by the Proceedings of the National Academy of Sciences (PNAS) (approximately 74,312 documents). Table 5 contains the journals with the highest and lowest total AAS percentages, along with the AAS percentage in 2017, the statistical range (AAS percentage in 2017 minus AAS percentage in 2000), the standard deviation (SD) from 2000 to 2017, the total number of publications with an AAS (Palt), and the ranking position of the journal according to the total number of articles indexed in the database.
| Journal | Rank | Palt | AAS (%) total | AAS (%) 2017 | Range | SD |
|---|---|---|---|---|---|---|
| J. of Biological Chemistry | 15 | 47,654 | 26.7 | 77.1 | 31.8 | 14.1 |
| J. of Organic Chemistry | 40 | 20,855 | 23.6 | 25.2 | –3.9 | 4.0 |
| Applied Mechanics & Materials | 30 | 109 | 0.1 | 0.1 | 0.1 | 0.1 |
| Choice Reviews Online | 11 | 0 | 0 | N/A | –0.1 | 0.1 |
Among the 50 journals with the most indexed publications, nine obtain a total AAS percentage value less than one, where ChemInform (0 articles with an altmetric attention score) especially stands out for being the journal with the most articles indexed in Dimensions (791,868), though it ceased its publication in 2017.
Apart from the obvious differences among disciplines (see Research Categories section), multidisciplinary journals show higher performance (PLoS One and PNAS), especially in recent years. Notably, Nature (97.8%) and Science (95.7%) are the journals with the highest AAS percentage in 2017 (Figure 3).
Harvard University stands as the institution not only with the highest number of publications with an AAS, but also with the highest total AAS percentage (Figure 4). We can also observe a predominance of North American universities (7 of the 10 institutions with the highest total AAS percentage are from the USA). University College London (UK) should also be noted: among the top 50 institutions with the highest productivity in Dimensions, it achieved the highest AAS percentage in 2017 (70.5% of its published documents have an AAS of one or higher). By contrast, Japanese institutions achieve low AAS percentages despite their high productivity, especially Osaka University (14th in total productivity, with an AAS percentage of 13.1) and the University of Tokyo (1st in total productivity, with an AAS percentage of 14.7) (Table 6).
| Institution | Rank | Publications | AAS (%) total | AAS (%) 2017 | Range | SD |
|---|---|---|---|---|---|---|
| Univ. California, San Francisco | 28 | 50,979 | 35.7 | 64.2 | 41.0 | 16.1 |
| Johns Hopkins Univ. | 10 | 66,803 | 33.6 | 62.3 | 40.0 | 15.3 |
| Univ. College London | 6 | 68,265 | 33.1 | 70.5 | 52.7 | 18.7 |
| Univ. of Melbourne | 39 | 37,683 | 31.4 | 57.5 | 44.0 | 16.8 |
| Russian Academy of Sciences | 37 | 9,523 | 7.5 | 13.0 | 8.1 | 4.3 |
The top 10 universities according to total AAS percentage are included in Figure 4, so that the evolution of their AAS percentage over time (2000 to 2017) can be observed. We can notice a pattern similar to that previously observed for global coverage (see Figure 1), with a notable increase in the AAS percentage in 2012 that marks a growing trend until 2017.
Given the total productivity, it is not surprising to confirm that the United States (2,999,786 documents) and the United Kingdom (907,637 documents) are the countries with the most publications with AAS. However, Australia stands out as the country with the highest total AAS percentage (24.6), followed by Denmark (24.4) (Table 7), considering only the top 50 countries according to the total productivity.
| Country | Rank | Publications | AAS (%) total | AAS (%) 2017 | Range | SD |
|---|---|---|---|---|---|---|
A world map (both for publications with AAS and for total AAS percentage) is offered in Figure 5. Saint Kitts and Nevis (50.6) and Guinea-Bissau (46.3) show the highest total AAS percentages in the world due to a statistical artefact (scarce productivity). For this reason, only countries with a minimum threshold in productivity should be considered when analyzing the AAS percentage.
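One simple way to apply such a minimum-productivity threshold, sketched here with invented figures, is to filter countries by publication count before ranking them by AAS percentage, so that micro-states with a handful of well-mentioned papers do not dominate the ranking:

```python
# Hedged sketch with illustrative (not actual) figures: exclude
# low-productivity countries before ranking by AAS percentage.
countries = {
    "Saint Kitts and Nevis": {"pubs": 85, "aas_pct": 50.6},
    "Guinea-Bissau": {"pubs": 120, "aas_pct": 46.3},
    "Australia": {"pubs": 700_000, "aas_pct": 24.6},
    "Denmark": {"pubs": 250_000, "aas_pct": 24.4},
}

MIN_PUBS = 10_000  # hypothetical minimum-productivity threshold

ranked = sorted(
    ((name, d["aas_pct"]) for name, d in countries.items()
     if d["pubs"] >= MIN_PUBS),
    key=lambda x: x[1],
    reverse=True,
)
print(ranked[0][0])  # Australia
```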
The data show China (3rd in total productivity), Japan (5th), India (9th), and Russia (14th) achieving low AAS percentages (13.3, 11.1, 11.6, and 6.3, respectively). This low social media impact, not reflected in the number of citations received (Table 8), might be associated with the lower visibility of non-English content on Twitter, the main carrier of altmetric mentions, as well as with lower Twitter usage in these countries.
The 50 cities in the world with the highest number of publications can be visualized in Figure 6. Cambridge (Massachusetts, US) achieves the highest AAS percentage in the sample, though there are doubts about whether it should be counted as part of Boston (3rd position). As with institutions, cities in the United States take the first positions (Table 9), whereas a lack of visibility is detected in Japan (Tokyo is the most productive city in the sample, yet it shows an AAS percentage of 12.9), Russia (Moscow holds the lowest AAS percentage, 6.9), and China (Beijing is 3rd in total productivity but has a total AAS percentage of 14.6). The presence of other cities (such as Seattle in the United States) can be related to their elevated publication output in highly-cited research disciplines (clinical sciences, public health, and biochemistry).
| City | Rank | Publications | AAS (%) total | AAS (%) 2017 | Range | SD |
|---|---|---|---|---|---|---|
Genetics (31.6) and public health and health services (31.0) constitute the research categories with the highest total AAS percentages in the sample. At the other extreme, we find fields related to mathematics (applied mathematics, 4.8; pure mathematics, 4.6; numerical and computational mathematics, 4.4) in the lowest positions (Table 10). In addition, low values are identified for engineering fields (materials engineering, 8.9; electrical and electronic engineering, 7.8; communication technologies, 7.3; interdisciplinary engineering, 6.5), computer sciences (computer software, 7.5; artificial intelligence and image processing, 8.8), and even combined fields (computation theory and mathematics, 6.8).
| Category | Rank | Publications | AAS (%) total | AAS (%) 2017 | Range | SD |
|---|---|---|---|---|---|---|
| Public Health and Health Services | 3 | 831,773 | 31.0 | 52.9 | 35.5 | 14.0 |
| Numerical and Computational Mathematics | 43 | 13,319 | 4.4 | 7.3 | 3.0 | 1.2 |
As with the other entities analyzed, an increase in the number of publications with altmetrics occurred in 2012. The evolution of the five research fields with the highest and lowest AAS percentages in 2017, together with their number of publications, is offered in Figure 7. As we can observe, the AAS percentage of these disciplines lies between 5 and 25 in the period before 2012, while by 2017 the differences among fields are evident, from communication technologies (4.9) to genetics (62.6).
The AAS percentage for each of the 22 fields into which Dimensions integrates research categories is available in Table 11. The results not only reinforce previous findings (high values for medicine- and biology-related disciplines; low results for mathematics, engineering, and computer sciences), but also provide an overall picture of all disciplines, locating the humanities and social sciences in the global framework, with unexpectedly high values for education (comprising 4 research categories) and studies in human society (covering 9 research categories).
| Field | Categories | Publications | Cited (%) avg | AAS (%) avg |
|---|---|---|---|---|
| Medical and Health Sciences | 18 | 15,539,667 | 80.4 | 21.5 |
| Studies in Human Society | 9 | 1,150,598 | 62.7 | 20.1 |
| Agriculture and Veterinary Sciences | 8 | 330,732 | 77.8 | 17.2 |
| Psychology and Cognitive Sciences | 3 | 1,851,047 | 65.8 | 14.9 |
| History and Archaeology | 3 | 685,797 | 51.4 | 14.1 |
| Language, Communication and Culture | 6 | 595,284 | 52.6 | 11.9 |
| Built Environment and Design | 5 | 40,991 | 51.2 | 11.4 |
| Studies in Creative Arts and Writing | 6 | 49,216 | 41.9 | 9.8 |
| Philosophy and Religious Studies | 5 | 269,957 | 46.6 | 9.2 |
| Commerce, Management, Tourism and Services | 7 | 526,412 | 53.5 | 9.1 |
| Information and Computing Sciences | 7 | 3,759,470 | 64.4 | 8.2 |
| Law and Legal Studies | 2 | 261,334 | 51.9 | 8.1 |
Regarding the funding bodies, the Medical Research Council (UK) holds the highest total AAS percentage (58.3) in the sample (Table 12), although The Wellcome Trust (UK) also stands out for achieving the highest percentage in 2015 (85.8), 2016 (87.2), and 2017 (87.7).
| Performance | Funder | Rank | Publications | AAS (%) total | AAS (%) 2017 | Range | SD |
|---|---|---|---|---|---|---|---|
| HIGH | Medical Research Council | 21 | 88,375 | 58.3 | 83.9 | 48.7 | 20.1 |
| | European Research Council | 34 | 55,482 | 56.3 | 64.7 | 32.0 | 18.3 |
| | Canadian Institutes of Health Research | 30 | 57,183 | 51.2 | 78.0 | 42.1 | 18.5 |
| | Directorate for Biological Sciences (USA) | 36 | 47,706 | 48.7 | 80.6 | 56.0 | 21.5 |
| LOW | National Natural Science Foundation of China | 1 | 247,376 | 17.5 | 24.5 | 16.4 | 6.3 |
| | Ministry of Science and Technology (Taiwan) | 20 | 28,445 | 17.5 | 31.5 | 19.9 | 6.9 |
| | China Postdoctoral Science Foundation | 50 | 11,921 | 16.9 | 21.4 | 18.6 | 5.9 |
| | Directorate for Computer & Information Science & Engineering (USA) | 28 | 19,203 | 16.3 | 22.8 | 14.0 | 3.4 |
| | Ministry of Education of the People’s Republic of China | 13 | 35,006 | 15.4 | 23.5 | 16.5 | 6.0 |
The National Natural Science Foundation of China is the funding body in the sample with the highest number of associated publications. However, in line with the data obtained in previous sections, it achieves a low AAS percentage (17.5). The same occurs with other Chinese funding bodies (Ministry of Science and Technology of the People’s Republic of China, 19.7; China Postdoctoral Science Foundation, 17.5; Ministry of Education of the People’s Republic of China, 15.4).
In the case of Europe, we can observe a difference between the European Research Council (second highest total AAS percentage) and the European Commission (26th position). Their evolution (from 2000 to 2017) is available in Figure 8.
Although Dimensions offers statistics for any query, including a null query (i.e., the whole database), the AAS percentage (the percentage of documents returned by a query that have an Altmetric Attention Score of one or higher) for the aggregated entities analyzed depends, firstly, on the coverage of documents and, secondly, on the information extracted from each document.
In this sense, the in-house version of Dimensions as of July 2018 covers a total of 40,711,747 publications published between 2000 and 2017. Of these, only 55.1% (22,433,285) have an associated research category field, and only 53.8% (21,884,709) have an affiliation field.
Journal coverage in Dimensions presents some inconsistencies. First, the inclusion of the SSRN Electronic Journal (7th position in total number of publications), which is currently not a peer-reviewed journal, is debatable. Secondly, some journals (Advance Materials Research, Applied Mechanics and Materials, Inpharma Weekly, Scientific Reports, and Medicine & Science in Sports & Medicine) show unusual annual article indexing rates in some years (from 2000 to 2017) that might bias the results.
In order to delve into this issue, we compared the number of articles indexed by these journals per year both in Dimensions and the Web of Science. Out of the 50 journals in the sample—those with more total publications indexed in Dimensions—8 are not indexed in the Web of Science.
When a journal changes its name from Name 1 to Name 2, Dimensions merges the articles of both titles under the bibliographic record corresponding to Name 2. For this reason, in order to compare the output offered by Dimensions with that of the Web of Science, we need to locate all previous journal names in WoS and merge their production before comparing the total volume of publications indexed in the two databases. This issue was detected in the following journals of the analyzed sample: Biochimica et Biophysica Acta, Angewandte Chemie, The Lancet, Physical Review A (General Physics), Physical Review B (Solid State), British Medical Journal, and Analytical and Bioanalytical Chemistry.
The Spearman correlation for all articles indexed by each journal is statistically significant but unexpectedly low (0.54; p < 0.0001), and it increases when only the 2000–2017 period is considered (0.87; p < 0.0001). A scatter plot comparing the ranking position of journals according to the total number of articles indexed in Dimensions with the ranking position these journals occupy in the Web of Science confirms that the two databases offer different coverage of academic output (Figure 9). As we can observe, the number of articles appears somewhat inflated in Dimensions, with three clear outliers (Notes and Lectures, Scientific American, and Journal of Geophysical Research).
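In practice one would use a library routine such as scipy.stats.spearmanr for this comparison; the sketch below, with invented article counts (not the study's data), implements the tie-free form of the coefficient to show what is being computed:

```python
# Hedged sketch of the rank comparison: Spearman's rho between the number of
# articles each journal has in two databases. With no tied values, rho
# reduces to 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)).
def spearman_rho(xs, ys):
    def ranks(vals):
        # Rank 1 = largest value; assumes all values are distinct (no ties).
        order = sorted(range(len(vals)), key=lambda i: vals[i], reverse=True)
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Invented per-journal article counts for illustration only.
dimensions_counts = [791_868, 131_508, 74_312, 47_654, 20_855]
wos_counts = [600_000, 130_000, 60_000, 65_000, 21_000]
print(round(spearman_rho(dimensions_counts, wos_counts), 2))  # 0.9
```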
The case of the Journal of Geophysical Research highlights an indexing problem. This journal was gradually divided into several sections (each with a distinctive ISSN). In the Web of Science, there are 17,071 articles under the general “Journal of Geophysical Research” name (discontinued in 1985) and 98,472 publications if we consider all the remaining articles in the current journal sections. In Dimensions, we can also find a record for the general journal, as well as one for each of the seven sections. However, the assignment of articles to the general journal shows inconsistencies (see Figure 10), which inflates the number of articles indexed under the old journal.
Finally, the retroactive growth of Dimensions was analyzed. As we can observe in Table 13, the database grows significantly from July to October for the same publication years. This effect is more pronounced in some years (especially 2007), probably due to the indexing of new journals. However, the variation in the AAS percentage is low and only slightly meaningful in 2017 (a global decrease of 0.7 points).
| Year | Tot pub (Jul 2018) | Alt pub (Jul 2018) | AAS % (Jul 2018) | Tot pub (Oct 2018) | Alt pub (Oct 2018) | AAS % (Oct 2018) | Pub var. | Pub var. (%) | AAS % var. |
|---|---|---|---|---|---|---|---|---|---|
The effect of the retroactive growth per type of entity is low (average variation of countries: 0.24; cities: 0.36; institutions: 0.24; journals: 0.26; disciplines: 0.27). However, some outliers are found. The maximum variation for each entity is the following:
The results show certain limitations of the Dimensions data that can jeopardize the main purpose of this study: discovering the coverage of documents with altmetric mentions (measured through the number of documents with an Altmetric Attention Score of one or higher).
The percentage of publications without an affiliation field is high (46.2%). This parameter matters because information about institutions, cities, and countries is extracted precisely from the affiliation field. Moreover, some inconsistencies (inflated document counts per journal, unusual annual indexing rates, errors in assigning articles to the right journal) may change the ranking of the most productive journals in the database. Likewise, the number of publications without an assigned research category is also high (44.9%). Furthermore, publication categorization, performed at the article level instead of the journal level, has been shown in the literature to exhibit some inconsistencies (Orduna-Malea and Delgado Lopez-Cozar 2018; Bornmann 2018). Finally, retroactive growth causes some minor variations in the Altmetric Attention Score percentage, which may affect the results for specific entities depending on the data collection time.
Nonetheless, despite some particular exceptions, the results offered are plausible and reflect well-known general patterns (see Results section). Moreover, because a specific period of time (2000 to 2017) and specific entities (top 50 entities per type) are considered, the error rate is minimized, allowing us to concentrate on the years in which altmetric activity is higher (2012 onwards). In this sense, the research questions established in this work can be answered in general terms, though the answers should be treated with caution.
Beyond Dimensions itself—a database that is continuously growing and improving its functionalities—other external variables may bias the results obtained.
Firstly, the percentage of documents with altmetric mentions is gathered via one specific data provider (Altmetric.com), whose results may differ from those obtained by other data providers, such as PlumX (Zahedi & Costas 2018). Also, Altmetric.com only tracks mentions driven by DOIs, potentially disregarding mentions of publications without DOIs, an aspect already discussed in the literature (Weller et al. 2011; Mahrt et al. 2012). Moreover, not all publications have a DOI. Gorráiz et al. (2016) estimate that 10% of articles in the Web of Science (2005 to 2014) in the sciences and social sciences lack DOIs, and the percentage of articles with a DOI is much lower for the humanities (exceeding 50% only since 2013). For this reason, all scores of altmetric mentions via Altmetric.com can be considered an underestimation of the real value.
Secondly, publication coverage in Dimensions is wider than in the Web of Science and Scopus. At the time of writing, Dimensions includes 10,180,612 book chapters (with an AAS percentage of 1) and 375,080 books (AAS percentage of 8.9). This coverage necessarily affects all comparisons with previously published altmetric coverage figures. For example, while Torres-Salinas et al. (2018) identified the Universitat Pompeu Fabra as the Spanish public university with the highest percentage of documents (from 2014 to 2016) in Altmetric.com (71%), the Altmetric Attention Score percentage in Dimensions for the same period is 62.3%. Despite the different methods used—the percentage of documents included in Altmetric.com does not necessarily coincide with the percentage of documents with an Altmetric Attention Score of one or higher—the results would be expected to be closer. The wider coverage of Dimensions therefore offers a new perspective on the coverage of documents with altmetrics.
Thirdly, there are external variables that have been shown to bias the altmetric reception of publications (Sugimoto et al. 2017). Non-biomedical disciplines (Haustein et al. 2014b; Holmberg & Thelwall 2014; Ortega 2018; Zahedi et al. 2014), disciplinary journals (Zahedi et al. 2014), Latin American countries (Alperin 2015), and, in general, older publications (Zahedi et al. 2014) statistically obtain fewer social media mentions than biomedical disciplines, multidisciplinary journals, English-speaking countries, and recent publications. All these previous findings are in line with the results obtained via Dimensions, which reinforces the reliability of the database.
The total number of publications with an Altmetric Attention Score (AAS) of one or above is low (9,167,952 documents; 9.4% of total coverage) and highly concentrated in recent years (2012 onwards), especially 2017, which alone contains 10.6% of all documents with an AAS. The percentage of documents with an AAS is higher (18.9%) when only open access documents are considered (an open access altmetric advantage).
Multidisciplinary journals (Nature, Science, PNAS, PLoS One) and medical journals (Journal of the American Medical Association, British Medical Journal, New England Journal of Medicine, The Lancet) are the sources with the highest AAS percentages (especially in 2017). Consequently, the most visible institutions are those active in these fields, notably English-speaking universities (Harvard, UCLA, Johns Hopkins, University College London, or the University of Melbourne). By contrast, Japanese, Chinese, and Russian institutions, despite their high annual publication outputs, obtain lower AAS percentages. Both language and the use of alternative social media platforms (especially in China, where Twitter is blocked), which are not yet sufficiently covered by Altmetric.com, may explain this issue.
This circumstance carries over into the analysis of cities and countries, diminishing the presence of Japan, China, Russia, or India. Australia, however, has the highest AAS percentage among the top 50 most productive countries in the world, followed by Denmark and the Netherlands.
Disciplines are likewise directly influenced by journal coverage in Dimensions. Research categories such as genetics, immunology, microbiology, or medical microbiology have held higher AAS percentages in recent years. Conversely, fields related to mathematics, engineering, and, unexpectedly, computer science achieve lower AAS percentages.
Finally, funding bodies reflect the predominance of medicine-, health-, and biology-related disciplines (Medical Research Council, Canadian Institutes of Health Research, Directorate for Biological Sciences). Chinese funding bodies, following the previous pattern, hold the lowest AAS percentages.
The analysis has also revealed some inconsistencies in data quality. AAS percentages may consequently vary from those offered in the database, although we estimate that this issue does not substantially modify the general patterns found; the database nonetheless still needs some improvements. The nature of Dimensions (wide coverage and structured metadata that can be exported and re-used) makes this bibliographic database and research framework an essential tool for monitoring the coverage and evolution of the impact (mentions) of scientific literature on social media platforms. Its volume of data and growth rate confirm Dimensions as a new player in the ecosystem of research information.
The additional files for this article can be found as follows:
Appendix A. Altmetric Attention Score (%) for Top 50 Countries with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s1
Appendix B. Countries: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s2
Appendix C. Altmetric Attention Score (%) for Top 50 Cities with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s3
Appendix D. Cities: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s4
Appendix E. Altmetric Attention Score (%) for Top 50 Universities with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s5
Appendix F. Universities: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s6
Appendix G. Altmetric Attention Score (%) for Top 50 Journals with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s7
Appendix H. Journals: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s8
Appendix I. Altmetric Attention Score (%) for Top 50 Research Categories with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s9
Appendix J. Categories: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s10
Appendix K. Research Categories matched with General Fields: Publications, Cited (%), FCR, RCR, AAS (%). DOI: https://doi.org/10.29024/joa.13.s11
Appendix L. Altmetric Attention Score (%) for Top 50 Funders with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s12
Appendix M. Funders: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s13
The authors would like to thank Dr. Stefanie Haustein and Dr. Rodrigo Costas for their feedback and help during this work, which was partly carried out in Ottawa (Canada).
The authors have no competing interests to declare.
Alperin, J. P. (2015). Geographic variation in social media metrics: An analysis of Latin American journal articles. Aslib Journal of Information Management, 67(3), 289–304. DOI: https://doi.org/10.1108/AJIM-12-2014-0176
Barthel, S., Tönnies, S., Köhncke, B., Siehndel, P., & Balke, W. T. (2015). What does twitter measure? Influence of diverse user groups in altmetrics. In Proceedings of the 15th ACM/IEEE-CS Joint Conference on Digital Libraries, Knoxville, USA. DOI: https://doi.org/10.1145/2756406.2756913
Bode, C., Herzog, C., Hook, D., & McGrath, R. (2018). A guide to the dimensions data approach. A collaborative approach to creating a modern infrastructure for data describing research: where we are and where we want to take it. Retrieved from https://www.digital-science.com/resources/portfolio-reports/a-guide-to-the-dimensions-data-approach.
Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. DOI: https://doi.org/10.1016/j.joi.2014.09.005
Bornmann, L. (2018). Field classification of publications in Dimensions: A first case study testing its reliability and validity. Scientometrics, 117(1), 637–640. DOI: https://doi.org/10.1007/s11192-018-2855-y
Chen, C. (2018). Cascading Citation Expansion. Retrieved from: https://arxiv.org/ftp/arxiv/papers/1806/1806.00089.pdf.
Costas, R., Zahedi, Z., & Wouters, P. (2015). Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. Journal of the Association for Information Science and Technology, 66(10), 2003–2019. DOI: https://doi.org/10.1002/asi.23309
Gorraiz, J., Melero-Fuentes, D., Gumpenberger, C., & Valderrama-Zurián, J. C. (2016). Availability of digital object identifiers (DOIs) in Web of Science and Scopus. Journal of Informetrics, 10(1), 98–109. DOI: https://doi.org/10.1016/j.joi.2015.11.008
Harseim, T., & Goodey, G. (2017). How do researchers use social media and scholarly collaboration networks (SCNs)? Nature Blog. Retrieved from http://blogs.nature.com/ofschemesandmemes/2017/06/15/how-do-researchers-use-social-media-and-scholarly-collaboration-networks-scns.
Haustein, S. (2014). Readership metrics. In B. Cronin & C. R. Sugimoto (Eds.), Beyond Bibliometrics: Harnessing multidimensional indicators of scholarly impact (pp. 327–344). Cambridge (MA), USA: MIT Press.
Haustein, S. (2016). Grand challenges in altmetrics: Heterogeneity, data quality and dependencies. Scientometrics, 108(1), 413–423. DOI: https://doi.org/10.1007/s11192-016-1910-9
Haustein, S., Bowman, T. D., Macaluso, B., Sugimoto, C. R., & Larivière, V. (2014a). Measuring Twitter activity of arXiv e-prints and published papers. Paper presented at Altmetrics14: Expanding Impacts and Metrics, Workshop at Web Science Conference, Bloomington (Indiana), USA. DOI: https://doi.org/10.6084/m9.figshare.1041514
Haustein, S., Larivière, V., Thelwall, M., Amyot, D., & Peters, I. (2014b). Tweets vs. Mendeley readers: How do these two social media metrics differ? IT-Information Technology, 56(5), 207–215. DOI: https://doi.org/10.1515/itit-2014-1048
Haustein, S., Peters, I., Sugimoto, C. R., Thelwall, M., & Larivière, V. (2014c). Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature. Journal of the Association for Information Science and Technology, 65(4), 656–669. DOI: https://doi.org/10.1002/asi.23101
Herzog, C., & Lunn, B. K. (2018). Response to the letter ‘Field classification of publications in Dimensions: A first case study testing its reliability and validity’. Scientometrics, 117(1), 641–645. DOI: https://doi.org/10.1007/s11192-018-2854-z
Holmberg, K., & Thelwall, M. (2014). Disciplinary differences in Twitter scholarly communication. Scientometrics, 101(2), 1027–1042. DOI: https://doi.org/10.1007/s11192-014-1229-3
Hook, D., Porter, S., & Herzog, C. (2018). Dimensions: Building context for search and evaluation. Frontiers in Research Metrics and Analytics, 3. DOI: https://doi.org/10.3389/frma.2018.00023
Li, X., Thelwall, M., & Giustini, D. (2012). Validating online reference managers for scholarly impact measurement. Scientometrics, 91(2), 461–471. DOI: https://doi.org/10.1007/s11192-011-0580-x
Mahrt, M., Weller, K., & Peters, I. (2012). Twitter in scholarly communication. In K. Weller, A. Bruns, J. Burgess, M. Mahrt, & C. Puschmann (Eds.). Twitter and Society (pp. 399–410). New York, USA: Peter Lang.
Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows. Journal of the American Society for Information Science and Technology, 65(8), 1627–1638. DOI: https://doi.org/10.1002/asi.23071
Orduña-Malea, E., & Delgado-López-Cózar, E. (2018a). ¡Viva la competencia! Nuevas dimensiones para la búsqueda y evaluación de la información científica. Anuario Think EPI, 12. DOI: https://doi.org/10.3145/thinkepi.2018.45
Orduña-Malea, E., & Delgado-López-Cózar, E. (2018b). Dimensions: Re-discovering the ecosystem of scientific information. El Profesional de la Información, 27(2), 420–431. DOI: https://doi.org/10.3145/epi.2018.mar.21
Ortega, J. L. (2018). Disciplinary differences of the impact of altmetric. FEMS Microbiology Letters, 365(7). DOI: https://doi.org/10.1093/femsle/fny049
Priem, J., Piwowar, H., & Hemminger, B. M. (2012). Altmetrics in the wild: Using social media to explore scholarly impact. Retrieved from http://arxiv.org/html/1203.4745.
Robinson-García, N., Torres-Salinas, D., Zahedi, Z., & Costas, R. (2014). New data, new possibilities: Exploring the insides of Altmetric.com. El Profesional de la Información, 23(4), 359–366. DOI: https://doi.org/10.3145/epi.2014.jul.03
Schonfeld, R. C. (2018). A new citation database launches today: Digital Science’s Dimensions. Retrieved from https://scholarlykitchen.sspnet.org/2018/01/15/new-citation-database-dimensions.
Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2017). Scholarly use of social media and altmetrics: A review of the literature. Journal of the Association for Information Science and Technology, 68(9), 2037–2062. DOI: https://doi.org/10.1002/asi.23833
Thelwall, M. (2018). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430–435. DOI: https://doi.org/10.1016/j.joi.2018.03.006
Torres-Salinas, D., Castillo-Valdivieso, P. A., Pérez-Luque, A., & Romero-Frías, E. (2018). Altmétricas a nivel institucional: Visibilidad en la Web de la producción científica de las universidades españolas a partir de Altmetric.com. El Profesional de la Información, 27(3), 483–492. DOI: https://doi.org/10.3145/epi.2018.may.03
Van Noorden, R. (2014). Online collaboration: Scientists and the social network. Nature News, 512(7513), 126. DOI: https://doi.org/10.1038/512126a
Weller, K., Dröge, E., & Puschmann, C. (2011). Citation analysis in Twitter: Approaches for defining and measuring information flows within Tweets during scientific conferences. In M. Rowe, M. Stankovic, A.-S. Dadzie, & M. Hardey (Eds.), Making Sense of Microposts (MSM2011) (pp. 1–12). Heraklion: CEUR Workshop Proceedings.
Zahedi, Z., & Costas, R. (2018). General discussion of data quality challenges in social media metrics: Extensive comparison of four major altmetric data aggregators. PloS One, 13(5). DOI: https://doi.org/10.1371/journal.pone.0197326
Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics, 101(2), 1491–1513. DOI: https://doi.org/10.1007/s11192-014-1264-0