
Research

Demography of Altmetrics under the Light of Dimensions: Locations, Institutions, Journals, Disciplines and Funding Bodies in the Global Research Framework

Authors:

Enrique Orduna-Malea,

Universitat Politècnica de València, ES

Emilio Delgado López-Cózar

Universidad de Granada, ES

Abstract

The interconnection between the Dimensions database and Altmetric.com provides an opportunity to carry out a worldwide analysis of the altmetric coverage of scientific literature, analyzing the percentage of documents with altmetric mentions not only in general (all indexed documents), but also filtered according to different units of analysis. To do so, the Dimensions Pro version of the database was used directly to retrieve 97,531,400 documents, which were subsequently filtered to obtain the top journals, countries, cities, institutions, research fields and funding bodies according to the total number of publications indexed in the database. For each entity and year of publication (from 2000 to 2017), the corresponding percentage of publications cited and the percentage of publications with an Altmetric Attention Score (% mentioned) were calculated. The main results indicate that the total number of publications with an Altmetric Attention Score (AAS) of one or higher is low (9.4% of the total coverage), highly concentrated in recent years, and higher for open access documents (18.9%), showing an open access altmetric advantage. In addition, English-speaking universities stand out, which increases the presence of specific cities from Anglo-Saxon countries and diminishes the presence of Japan, China, Russia and India, despite their high productivity. Multidisciplinary and medicine-related journals also stand out, which in turn shapes the research disciplines with a higher AAS (% mentioned): Genetics, Immunology, Microbiology or Medical Microbiology. However, since the analysis revealed some inconsistencies in data quality, the results must be interpreted with caution.

How to Cite: Orduna-Malea, E. and Delgado López-Cózar, E., 2019. Demography of Altmetrics under the Light of Dimensions: Locations, Institutions, Journals, Disciplines and Funding Bodies in the Global Research Framework. Journal of Altmetrics, 2(1), p.3. DOI: http://doi.org/10.29024/joa.13
Submitted on 11 Feb 2019; Accepted on 15 May 2019; Published on 18 Jun 2019

1. Introduction

Identifying, describing, and analyzing the main advantages and disadvantages of social media metrics for research assessment has been one of the main activities within the field of altmetrics (Haustein 2016).

While Wouters and Costas (2012) highlight the main advantages of altmetrics through four dimensions (broadness, diversity, speed, openness), Priem (2014) points out a series of disadvantages, including the lack of theory, ease of gaming, and possible biases. Bornmann (2014) expands the taxonomy of limitations to data quality (bias, target, multiple versions, different meanings, measurement standards, mention standards, cross-field and time normalization, and replication), missing evidence, and manipulation. Likewise, Haustein (2014) identifies representativeness—which might fall within the ‘data quality’ category—as one of the main limitations of altmetrics.

The representativeness issue can be approached from different perspectives, namely the population of active users on social platforms (number of users), the order of magnitude of the data collected by the direct sources of metrics (amount of data generated), the actual coverage of data obtained by altmetrics data providers (amount of data identified), and the coverage of documents indexed by altmetrics providers (percentage of published documents that are both mentioned and identified). This last issue is the objective of this study.

Haustein et al. (2014a) performed one of the first studies aimed at determining the coverage of documents with alternative metrics through Altmetric.com, obtaining a coverage of 45.2% for a sample of 84,374 documents deposited in Arxiv.org. Later, Robinson-García et al. (2014) analyzed a corpus of 2,792,706 articles (published between 2011 and 2013, with a digital object identifier (DOI), and indexed in the Web of Science), finding a coverage of 19%, although with important differences according to the source, the main source being Twitter (87.1% of articles), followed by Mendeley (64.8%) and Facebook (19.9%). However, Altmetric.com only checks Mendeley when a document already has at least one other altmetric mention, which may explain the lower values for Mendeley with respect to Twitter.

Costas et al. (2015) performed another study through Altmetric.com (a set of 718,315 documents with DOI). The results showed that “only 7% of all papers in the WoS (without any time restriction and with DOI) had some altmetric score as covered by Altmetric.com”, although growth was detected for recent documents (from 2010 onwards).

Scientific literature had already warned of the strong concentration of metrics in Mendeley and especially Twitter (Priem et al. 2012). Document coverage studies on these two platforms offer percentages that depend on the samples analyzed. For example, Zahedi et al. (2014) analyzed a random sample of 19,722 publications (published between 2005 and 2011, with DOI and indexed in Web of Science) through ImpactStory, finding that 62.6% of the documents had at least one reader in Mendeley and only 1.6% had a Tweet mention.

A dependence on discipline has also been evidenced in the literature. Haustein et al. (2014a) show important differences in the percentage of documents with some mention on social networks (from 30.4% in ‘Nuclear Theory’ to 85.2% in ‘Quantitative Biology’), while Costas et al. (2015) obtained much lower percentages (from 5.4% in ‘Mathematics and Computer Science’ to 22.8% for ‘Biomedical and Health Science’). Another large sample (1,431,576 documents), circumscribed to the area of biomedicine, shows that 9.4% of the documents had at least one tweet (Haustein et al. 2014b) and 66% had a reader in Mendeley (Haustein et al. 2014c). The differences according to areas of knowledge are equally significant: Mohammadi and Thelwall (2014) determined that Mendeley covered only 28% of articles in the humanities (indexed in Web of Science and published in 2008), while that percentage increased to 58% in the case of the social sciences.

Altmetrics coverage has also been shown to depend on geographical area and country, where different degrees of penetration and use of social networks, alongside different cultures of diffusion of academic activity, shape the available altmetric data. Alperin (2015) shows that the coverage of altmetrics in Brazil and Chile is far superior to that of the rest of the countries in Latin America, according to a sample of 389,795 articles published in journals included in the SciELO (Scientific Electronic Library Online) platform (http://www.scielo.org), of which 44.6% (173,733 documents) received at least one mention.

The literature has also analyzed different document aggregations, such as journals and institutions. Regarding journals, significant differences in coverage have been detected according to the area and the source analyzed, with large multidisciplinary journals (e.g., Science, Nature, PNAS, PLoS One) standing out. Priem et al. (2012) estimated that 80% of the articles published in PLoS One were indexed in Mendeley, although this percentage dropped to 12% when coverage was measured on Twitter. Barthel et al. (2015) subsequently noted that this percentage had grown (to 53%). Li et al. (2012) found high coverage for articles published in 2007 in Nature (93.8%) and Science (92.8%).

With regard to universities, Torres-Salinas et al. (2018) focused on the Spanish university system by collecting information on the activity of 66 universities in Altmetric.com through a corpus of 125,824 documents indexed in the Web of Science between 2014 and 2016. The results indicate a total coverage of 42% of the articles, although there are large differences between universities, from 11% coverage at the Universidad Pontificia de Salamanca to 71% at the Universitat Pompeu Fabra.

As observed, the literature provides variable coverage figures. However, these studies are limited to the selected samples, which are restricted to documents published in specific disciplines, time periods, geographic areas, or selective bibliographic databases. A global study of the coverage of academic documents that have achieved some kind of impact on social network platforms, as well as of its evolution over time, is lacking. Such a study is important for establishing the degree of penetration of altmetrics in academia, as well as for contextualizing the order of magnitude of the altmetric data gathered, especially in macro-level and meso-level analyses.

The appearance of Dimensions (https://dimensions.ai) in January 2018 (Schonfeld 2018) constitutes an opportunity to carry out global coverage studies of altmetrics. Dimensions is a bibliographic database launched by Digital Science (https://www.digital-science.com) covering publications as well as grants, patents, clinical trials, and policy documents, together with citations received and altmetric attention data via Altmetric.com (Bode et al. 2018; Hook et al. 2018). As of August 30, 2018, Dimensions included 96,725,143 documents, whereas Scopus included 72,372,800 documents and the Web of Science Core Collection (considering the Emerging Sources, Proceedings, and Book Citation Indexes) included a total of 69,842,611 documents.

The coverage of Dimensions, together with its quality control, its available API, and its connection to Altmetric.com (Orduna-Malea & Delgado López-Cózar 2018a), makes this database, a priori, an optimal bibliometric tool for analyzing the global coverage of altmetrics.

In addition, this database allows researchers to easily analyze the coverage of documents with altmetric mentions not only at a general level (total documents indexed), but also at the unit level (documents according to institutions, countries, cities, journals, fields, etc.). However, its validity and accuracy for this purpose are still to be tested.

The work by Thelwall (2018) constitutes the first bibliometric analysis of the product, emphasizing that the coverage of this database is already similar to that of Scopus. Similarly, the study concludes that Dimensions and Scopus strongly correlate in terms of citation counts, thereby opening the door to its use as a bibliometric analysis tool. However, Orduna-Malea and Delgado López-Cózar (2018b) later detected a series of inconsistencies in the thematic classification used, also discussed by Bornmann (2018) and Herzog and Lunn (2018). Nonetheless, its potential use as a next-generation research and discovery platform for better and more efficient access to academic material has already been tested (Chen 2018).

2. Research questions

In relation to the worldwide altmetrics coverage of scientific literature, the following research questions are addressed:

RQ1. What is the total coverage of academic documents with altmetric mentions at present? How has this coverage evolved over time?

RQ2. What places in the world (countries, cities) show a higher percentage of documents with altmetric mentions? How has this coverage evolved over time?

RQ3. What institutions show a higher percentage of documents with altmetric mentions? How has this coverage evolved over time?

RQ4. What journals show a higher percentage of documents with altmetric mentions? How has this coverage evolved over time?

RQ5. What research categories show a higher percentage of documents with altmetric mentions? How has this coverage evolved over time?

RQ6. What funding bodies show a higher percentage of documents with altmetric mentions? How has this coverage evolved over time?

RQ7. Is Dimensions an accurate bibliographic tool to carry out an analysis of the worldwide penetration of altmetrics, according to different units of analysis?

3. Method

The Dimensions Pro version of the database directly provides structured information about the set of documents matching each specific query, including the number of publications, the number of citations, citations per publication, the mean Relative Citation Ratio (RCR), the mean Field Citation Ratio (FCR), the percentage of articles cited, and the percentage of publications with an Altmetric Attention Score of one or higher, hereinafter referred to as the AAS percentage.

In order to address RQ1, we defined a global query (97,531,400 documents; time interval: 1665 to 2018). This query was subsequently filtered according to the open access level (all, publisher, and repository). Finally, all publications between 2000 and 2017 (52,048,103 documents) were considered as the study sample.

In regard to RQ2, RQ3, RQ4, RQ5, and RQ6, we selected the top 50 journals, countries, cities, institutions, and funding bodies according to the total number of publications indexed in the database. In the case of disciplines, all of them (152) were gathered in order to minimize potential biases in the results.

Then, for each entity (journals, countries, cities, institutions, funding bodies, and research categories), the annual number of publications from 2000 to 2017 was additionally gathered. Finally, for each entity and year of publication, the corresponding percentage of publications cited and the AAS percentage were calculated.
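
These two indicators are simple shares. As an illustration (not the authors' actual pipeline, which relies on the figures reported directly by Dimensions), the following Python sketch derives them per entity and year from hypothetical document-level records; the field names times_cited and aas are assumptions.

```python
import pandas as pd

# Hypothetical document-level records for two entities (field names are assumptions).
docs = pd.DataFrame([
    {"entity": "Journal A", "year": 2016, "times_cited": 3, "aas": 0},
    {"entity": "Journal A", "year": 2016, "times_cited": 0, "aas": 5},
    {"entity": "Journal A", "year": 2017, "times_cited": 1, "aas": 2},
    {"entity": "Journal B", "year": 2017, "times_cited": 0, "aas": 0},
])

indicators = (
    docs.assign(cited=docs["times_cited"] > 0,       # cited at least once
                mentioned=docs["aas"] >= 1)          # AAS of one or higher
        .groupby(["entity", "year"])
        .agg(publications=("aas", "size"),
             cited_pct=("cited", "mean"),
             aas_pct=("mentioned", "mean"))
)
indicators[["cited_pct", "aas_pct"]] *= 100          # express shares as percentages
print(indicators.round(1))
```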

In the case of documents with multiple authors, binary counting was used to quantify the locations (countries and cities) and institutions. That is, each location and institution was counted once per publication.
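
A minimal sketch of binary counting, assuming a hypothetical list of per-publication affiliations: each country (or institution) is counted at most once per publication, regardless of how many co-authors share it.

```python
from collections import Counter

# Hypothetical author affiliations (country level) for two publications.
publications = [
    {"id": "pub1", "countries": ["Spain", "Spain", "Netherlands"]},
    {"id": "pub2", "countries": ["Spain", "Japan"]},
]

country_counts = Counter()
for pub in publications:
    country_counts.update(set(pub["countries"]))  # set() enforces one count per publication

print(country_counts)  # Spain: 2, Netherlands: 1, Japan: 1
```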

Lastly, with regard to RQ7, we took into account various procedures to test the reliability of the database:

  • Data availability was analyzed by checking the number of records without the institutional or discipline fields. To do this, an SQL query was performed against the in-house version of Dimensions available at the Centre for Science and Technology Studies (CWTS) as of July 2018 (a minimal sketch of this check is shown after this list).
  • Data volatility was tested by performing a retroactive analysis of Dimensions. All queries were run twice (July and October 2018) and the results directly compared (the top 25 entities per entity type were used for this purpose).
  • Data indexing was checked by comparing the number of articles indexed in Dimensions for the set of top 50 journals with the number of articles indexed for the same journals (when available) in the Web of Science.
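
The following sketch illustrates the data availability check mentioned in the first point. It assumes a hypothetical tabular export of document records with research_category and affiliation columns; it is not the actual SQL query run on the CWTS in-house version.

```python
import pandas as pd

# Hypothetical export of document records; None marks a missing field.
records = pd.DataFrame([
    {"doc_id": 1, "research_category": "Genetics",   "affiliation": "Harvard University"},
    {"doc_id": 2, "research_category": None,         "affiliation": "Universitat Politècnica de València"},
    {"doc_id": 3, "research_category": "Immunology", "affiliation": None},
])

for field in ("research_category", "affiliation"):
    share = records[field].notna().mean() * 100   # share of records carrying the field
    print(f"{field}: present in {share:.1f}% of records")
```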

All data were extracted manually and then statistically analyzed with XLStat. The first sample was gathered in July 2018 and the second sample in October 2018. Data were analyzed in November 2018. The raw results per entity are available in supplementary file 1.

4. Results

Global coverage

Considering the total coverage of Dimensions as of October 2018 (97,531,400 publications), the percentage of documents (all typologies) with an Altmetric Attention Score amounts to 9.4% (Table 1). If we consider only open access documents (19.9% of publications in the database), the percentage increases to 18.9%. This points to a greater visibility of documents that are available in some open access form (i.e., an open access altmetric advantage).

Table 1

Total Altmetric Attention Score (AAS) percentage. Data according to the collection considered (ALL documents or open access documents).

Collection Publications AAS (%)

ALL 97,531,400 9.4
Open Access – All 19,388,638 18.9
Open Access – Publisher 15,180,909 18.6
Open Access – Repository 4,207,729 20

The percentage of publications with an AAS remains stable at around 8% from 2000 to 2010 and experiences a notable increase from 2012 (14.8%) to 2017 (22.5%) (Figure 1). This growth can be directly related to the launch of Altmetric.com, the company that provides Dimensions with altmetric data, as well as to an increase in the usage of academic social network platforms by researchers, as foreshadowed by several Nature surveys (Van Noorden 2014; Harseim & Goodey 2017).

Figure 1 

Evolution of the percentage of documents with Altmetric Attention Score. Source: Dimensions and Altmetric.com.

If we disaggregate the results according to each of the units of analysis considered (top 50 countries, cities, institutions, journals, disciplines, and funding bodies), we can see that the total AAS percentage (considering the complete coverage in the database) varies depending on the type of entity analyzed (Table 2). While it is more homogeneous for countries and cities (standard deviations of 4.9 and 5.8, respectively), it is more dispersed for funding bodies and, especially, for journals. This effect can be visualized in the box plots produced for each entity (Figure 2).
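
A minimal sketch of how the descriptive statistics in Table 2 and the box plots in Figure 2 can be produced, using randomly generated AAS percentages as stand-ins for the real top-50 values (the entity types shown and the value ranges are illustrative only).

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Illustrative AAS percentages for the top 50 entities of three entity types.
aas_by_entity = {
    "Countries": rng.uniform(5, 25, 50),
    "Journals": rng.uniform(0, 66, 50),
    "Funders": rng.uniform(15, 58, 50),
}

# Descriptive statistics analogous to Table 2.
for name, values in aas_by_entity.items():
    print(f"{name}: min={values.min():.1f} max={values.max():.1f} "
          f"median={np.median(values):.1f} mean={values.mean():.1f} sd={values.std(ddof=1):.1f}")

# Box plots analogous to Figure 2.
plt.boxplot(list(aas_by_entity.values()))
plt.xticks(range(1, len(aas_by_entity) + 1), list(aas_by_entity))
plt.ylabel("AAS (%)")
plt.show()
```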

Table 2

Total Altmetric Attention Score. Descriptive data according to the entity analyzed (Top 50 entities according to the total number of publications per unit of analysis).

Entity Min Max Median Mean Standard Deviation

Countries 5.2 24.6 16.8 16.7 4.9
Cities 6.9 31.6 22.8 22.1 5.8
Institutions 7.5 37.3 25.6 24.2 6.7
Journals 0.0 65.9 8.9 12.6 14.5
Disciplines 4.4 31.6 15.3 15.7 7.5
Funders 15.4 58.3 33.9 33.8 10.9

Figure 2 

Box plots of the Altmetric Attention Score percentage distribution according to the entity analyzed (top 50 entities according to the total number of publications per unit of analysis).

The global peak reached in recent years (see Figure 1), together with the growth in raw academic publication output, may distort to some extent the overall percentage of documents with altmetrics, as well as the specific values obtained at the unit level. Table 3 (publications) and Table 4 (Altmetric Attention Score percentage) show, for each of the units of analysis considered (top 50 countries, cities, institutions, journals, disciplines, and funding bodies), the Spearman correlation between the results obtained in 2017 and those obtained both at the beginning of the analyzed period (2000) and for the total values (from 1665 onwards).
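
A minimal sketch of the comparison behind Tables 3 and 4: a Spearman correlation between two vectors holding one value per top-50 entity (here, synthetic AAS percentages standing in for the 2000 and 2017 figures).

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
aas_2000 = rng.uniform(2, 15, 50)                 # synthetic values for the top 50 entities in 2000
aas_2017 = aas_2000 * 2.5 + rng.normal(0, 4, 50)  # synthetic, correlated values for 2017

rho, p_value = spearmanr(aas_2000, aas_2017)
print(f"Spearman R = {rho:.2f}, p-value = {p_value:.4f}")
```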

Table 3

Publication output. Correlation between years according to entity analyzed (Top 50 entities according to the total number of publications per unit of analysis).

Entity Total vs 2017 2000 vs 2017

R p-value R p-value

Countries **0.95 <0.0001 **0.87 <0.0001
Cities **0.80 <0.0001 **0.48 0.0006
Institutions **0.62 <0.0001 **0.37 0.0089
Journals –0.01 0.9383 0.28 0.0771
Disciplines **0.93 <0.0001 **0.85 <0.0001
Funders **0.68 <0.0001 0.10 0.4854

** Values are different from 0 with a significance level α = 0.01.

Table 4

Altmetric Attention Score percentage. Correlation between years according to entity analyzed (Top 50 entities according to the total number of publications per unit of analysis).

Entity Total vs 2017 2000 vs 2017

R p-value R p-value

Countries **0.90 <0.0001 **0.83 <0.0001
Cities **0.94 <0.0001 **0.87 <0.0001
Institutions **0.94 <0.0001 **0.82 <0.0001
Journals **0.49 0.002 **0.52 0.001
Disciplines **0.95 <0.0001 **0.71 <0.0001
Funders **0.94 <0.0001 **0.83 <0.0001

** Values are different from 0 with a significance level α = 0.01.

As we can observe, the 2017 results (both for publications and for the AAS percentage) strongly correlate with the total values for all units except journals. Conversely, the correlation between 2000 and 2017 is weaker. A reasonable explanation for this is the striking concentration of publications and altmetric mentions in the most recent years.

Journals

The coverage of journals (and of the articles published by each journal) constitutes the foundation on which the coverage of any bibliographic database rests and, therefore, determines the coverage of documents with altmetrics (both in total and broken down by aggregated entities) that Dimensions shows.

PLoS One is the journal with the highest number of articles with an Altmetric Attention Score (approximately 131,508 documents), followed by the Proceedings of the National Academy of Sciences (PNAS) (approximately 74,312 documents). Table 5 contains the journals with the highest and lowest total AAS percentages, the AAS percentage in 2017, the statistical range (AAS percentage in 2017 minus AAS percentage in 2000), the standard deviation (SD) from 2000 to 2017, the total number of publications with an Altmetric Attention Score (PAlt), and the ranking position of the journal according to the total number of articles indexed in the database.

Table 5

Journals with the highest and lowest Altmetric Attention Score percentage. Data: Top 50 journals according to total number of publications indexed.

PERFORMANCE Journals Publication Altmetric Attention Score (% mentioned)

Rank PAlt TOT 2017 Range SD

HIGH Scientific Reports 47 52,782 65.9 67.6 67.6 20.8
PLoS One 9 131,508 63.6 74.9 74.9 14.4
PNAS 20 74,312 53.7 93.6 37.8 14.7
J. of Biological Chemistry 15 47,654 26.7 77.1 31.8 14.1
J. of Organic Chemistry 40 20,855 23.6 25.2 –3.9 4.0

LOW Reactions Weekly 17 159 0.1 0 –0.1 0.3
Applied Mechanics & Materials 30 109 0.1 0.1 0.1 0.1
Inpharma Weekly 41 87 0.1 N/A –0.2 0.1
ChemInform 1 0 0 N/A 0.0 0.1
Choice Reviews Online 11 0 0 N/A –0.1 0.1

Among the 50 journals with the most indexed publications, nine obtain a total AAS percentage below one. ChemInform (0 articles with an Altmetric Attention Score) especially stands out for being the journal with the most articles indexed in Dimensions (791,868), though it ceased publication in 2017.

Apart from the obvious differences among disciplines (see the Research Categories section), multidisciplinary journals show higher performance (PLoS One and PNAS), especially in recent years. Indeed, Nature (97.8) and Science (95.7) are the journals with the highest AAS percentage in 2017 (Figure 3).

Figure 3 

Journals. Altmetric Attention Score percentage for Nature and Science (from 2010 to 2017).

Institutions

Harvard University stands as the institution not only with the highest number of publications with an AAS, but also with the highest total AAS percentage (Figure 4). We can also observe a predominance of North American universities (7 of the 10 institutions with the highest total AAS percentage are from the USA). University College London (UK) should also be pointed out, as this institution achieved the highest AAS percentage in 2017 (70.5% of its published documents have an AAS of one or higher) among the top 50 institutions with the highest productivity in Dimensions. On the contrary, Japanese institutions achieve low AAS percentages despite their high productivity, especially Osaka University (14th position in total productivity, with an AAS percentage of 13.1) and the University of Tokyo (1st position in total productivity, with an AAS percentage of 14.7) (Table 6).

Figure 4 

Evolution of the Altmetric Attention Score percentage (2010 to 2017) for institutions. Source: Dimensions and Altmetric.com.

Table 6

Institutions with the highest and lowest Altmetric Attention Score percentage. Data: Top 50 institutions according to total number of publications indexed.

PERFORMANCE Institutions Publication Altmetric Attention Score (%)

R PAlt TOT 2017 Range SD

HIGH Harvard Univ. 3 90164 37.3 66.5 40.3 15.8
Univ. California, San Francisco 28 50979 35.7 64.2 41.0 16.1
Johns Hopkins Univ. 10 66803 33.6 62.3 40.0 15.3
Univ College London 6 68265 33.1 70.5 52.7 18.7
Univ. of Melbourne 39 37683 31.4 57.5 44.0 16.8

LOW Osaka Univ. 14 23203 13.1 33.1 23.0 8.4
Nagoya Univ. 43 15020 12.7 33 25.0 8.9
Kyushu Univ. 44 14594 12.5 31.3 22.8 8.6
Tohoku Univ. 19 18070 11 29.6 22.2 7.8
Russian Academy of Sciences 37 9523 7.5 13 8.1 4.3

The top 10 universities according to the total AAS percentage are included in Figure 4 so that we can observe the evolution of their AAS percentage over time (2000 to 2017). We can notice a pattern similar to that previously observed for the global coverage (see Figure 1), with a notable increase in the AAS percentage in 2012 that marks a growing trend until 2017.

Geographies

Given the total productivity, it is not surprising to confirm that the United States (2,999,786 documents) and the United Kingdom (907,637 documents) are the countries with the most publications with AAS. However, Australia stands out as the country with the highest total AAS percentage (24.6), followed by Denmark (24.4) (Table 7), considering only the top 50 countries according to the total productivity.

Table 7

Countries with the highest and lowest Altmetric Attention Score percentage. Data: Top 50 countries according to total number of publications indexed.

PERFORMANCE Countries Publication Altmetric Attention Score (%)

R PAlt TOT 2017 Range SD

HIGH Australia 10 355021 24.6 47 34.3 13.1
Denmark 22 112481 24.4 49.5 35.7 13.1
Netherlands 12 271443 24.1 51 37.6 13.7
Ireland 35 50944 23.9 47.7 37.0 13.1
New Zealand 34 56860 23.2 46 32.8 12.7

LOW Indonesia 47 8935 9.7 7.4 –3.1 1.6
Romania 42 13544 9.2 16 12.2 4.0
Tunisia 50 7696 9.2 12.9 10.1 3.4
Russia 14 61548 6.3 12.1 8.8 3.4
Ukraine 40 9245 5.2 10.4 7.8 3.4

A world map (both for publications with AAS and for total AAS percentage) is offered in Figure 5. Saint Kitts and Nevis (50.6) and Guinea-Bissau (46.3) show the highest total AAS percentages in the world due to a statistical artefact (scarce productivity). For this reason, only countries with a minimum threshold in productivity should be considered when analyzing the AAS percentage.

Figure 5 

World map of the total Altmetric Attention Score percentage per country (up) and total publications with Altmetric Attention Score percentage per country (down). Source: Dimensions and Altmetric.com.

The data show China (3rd position in total productivity), Japan (5th position), India (9th position), and Russia (14th position) achieving low AAS percentages (13.3, 11.1, 11.6, and 6.3, respectively). This low impact on social media metrics, which is not reflected in the number of citations received (Table 8), might be associated with the lower visibility of non-English content on Twitter, the main carrier of altmetric mentions, as well as with the usage of Twitter in these countries.

Table 8

Comparison between the percentage of documents cited and the percentage of documents with an Altmetric Attention Score percentage in China, Japan, India, and Russia (2011 to 2015).

Country 2011 2012 2013 2014 2015

Cited (%) AAS (%) Cited (%) AAS (%) Cited (%) AAS (%) Cited (%) AAS (%) Cited (%) AAS (%)

China 64.8 6.3 69.7 11.3 69.1 11.9 71.4 15.1 77.5 20.6
Japan 68.1 10.1 68.3 16.7 67.2 17.8 65.8 20.5 63.4 24.3
India 80.2 10.5 78.8 14.7 73.2 14.3 69.6 15.2 68.2 17
Russia 70.6 5.4 68.0 9.4 66.3 9.3 64.4 10.5 60.8 11.1

The 50 cities in the world with the highest number of publications can be visualized in Figure 6. Cambridge (Massachusetts, US) achieves the highest AAS percentage in the sample, though there are doubts about whether it should be counted as part of Boston (3rd position). As with institutions, cities in the United States take the first positions (Table 9), whereas a lack of visibility is detected in Japan (Tokyo is the most productive city in the sample, yet it shows an AAS percentage of 12.9), Russia (Moscow holds the lowest AAS percentage, 6.9), and China (Beijing is in 3rd position in total productivity, but has a total AAS percentage of 14.6). The presence of other cities (such as Seattle in the United States) can be related to their high publication output in highly cited research disciplines (clinical sciences, public health, and biochemistry).

Figure 6 

World map of the top 50 cities according to number of publications. Circle size: AAS percentage. Source: Dimensions and Altmetric.com.

Table 9

Cities with the highest and lowest Altmetric Attention Score percentage. Data: Top 50 cities according to total number of publications indexed.

PERFORMANCE Cities Publication Altmetric Attention Score (%)

R PAlt TOT 2017 Range SD

HIGH Cambridge 10 150812 31.6 59.1 35.8 13.3
Bethesda 38 82012 30.3 63.9 38.5 15.4
Boston 6 171306 30.2 55.9 35.6 13.8
Baltimore 20 109593 30.2 57.6 38.0 14.7
Seattle 28 91707 29.8 58.5 38.7 14.9

LOW Tsukuba 46 31415 13.6 33 24.3 8.4
Wuhan 45 31308 13.5 22.7 18.5 6.5
Osaka 22 45733 13 31.4 21.3 7.8
Tokyo 1 172511 12.9 29.3 20.1 7.6
Moscow 8 36106 6.9 13.6 9.8 3.9

Research categories

Genetics (31.6) and public health and health services (31) constitute the research categories with the highest total AAS percentages in the sample. On the contrary, we find fields related to mathematics (applied mathematics, 4.8; pure mathematics, 4.6; numerical and computational mathematics, 4.4) in the lowest positions (Table 10). In addition, low values are identified for engineering fields (material engineering, 8.9; electrical and electronic engineering, 7.8; communication technologies, 7.3; interdisciplinary engineering, 6.5) and computer sciences (computer software, 7.5; artificial intelligence and image processing, 8.8), as well as for combined fields (computation theory and mathematics, 6.8).

Table 10

Research categories with the highest and lowest Altmetric Attention Score percentage. Data: Top 50 categories according to total number of publications indexed.

PERFORMANCE Categories Publication Altmetric Attention Score (%)

R PAlt TOT 2017 Range SD

HIGH Genetics 8 551623 31.6 62.6 41.1 16.0
Public Health and Health Services 3 831773 31 52.9 35.5 14.0
Microbiology 40 94362 28.6 53.9 32.6 12.5
Immunology 14 236281 27.3 59.5 37.4 14.6
Medical Microbiology 21 166919 27.1 57.8 37.0 14.0

LOW Interdisciplinary Engineering 15 51609 6.5 12.3 7.0 2.6
Applied Mathematics 30 20991 4.8 7.5 3.5 1.6
Pure Mathematics 18 31839 4.6 9.8 7.3 3.1
Civil Engineering 34 17207 4.5 7.2 2.5 1.2
Numerical and Computational Mathematics 43 13319 4.4 7.3 3.0 1.2

As with the other entities analyzed, an increase in the number of publications with altmetrics occurred in 2012. The evolution of the five research fields with the highest and lowest AAS percentages in 2017, along with their number of publications, is shown in Figure 7. As we can observe, the AAS percentage of these disciplines lies between 5 and 25 in the period before 2012, whereas by 2017 the differences among fields had become evident, ranging from communication technologies (4.9) to genetics (62.6).

Figure 7 

Evolution of the Altmetric Attention Score percentage (2010 to 2017) for research categories. Source: Dimensions and Altmetric.com.

The AAS percentage for each of the 22 fields into which Dimensions groups research categories is available in Table 11. The results obtained not only reinforce previous findings (high values for medicine- and biology-related disciplines; low results for mathematics, engineering, and computer sciences), but also provide an overall picture of all disciplines, locating the humanities and social sciences in the global framework, with unexpectedly high values for education (comprising 4 research categories) and studies in human society (covering 9 research categories).

Table 11

Fields and Research Categories: average percentage of documents cited and mentioned in social media.

FIELD Number of Categories Publications Cited (%) avg AAS (%) avg

Biological Sciences 9 6357474 83.8 27.1
Education 4 592948 70 25.2
Medical and Health Sciences 18 15539667 80.4 21.5
Environmental Sciences 4 731728 77.7 20.9
Studies in Human Society 9 1150598 62.7 20.1
Chemical Sciences 8 4574758 82.5 18.1
Earth Sciences 6 1146074 80.9 17.6
Agriculture and Veterinary Sciences 8 330732 77.8 17.2
Psychology and Cognitive Sciences 3 1851047 65.8 14.9
Technology 8 900900 76.2 14.6
History and Archaeology 3 685797 51.4 14.1
Economics 3 910597 64.3 13.1
Language, Communication and Culture 6 595284 52.6 11.9
Physical Sciences 7 2317087 73.4 11.7
Built Environment and Design 5 40991 51.2 11.4
Studies in Creative Arts and Writing 6 49216 41.9 9.8
Engineering 16 5599269 68.4 9.2
Philosophy and Religious Studies 5 269957 46.6 9.2
Commerce, Management, Tourism and Services 7 526412 53.5 9.1
Information and Computing Sciences 7 3759470 64.4 8.2
Law and Legal Studies 2 261334 51.9 8.1
Mathematical Sciences 5 2005160 73.8 6.4

Note: Each document can be assigned to more than one research category.

Note: The field average of Documents Cited and Documents with Altmetric Attention Score is calculated through the Cited (%) and AAS (%) of each research category within the field. Therefore, results should be considered as approximate indicative numbers.

Funders

Regarding the funding bodies, the Medical Research Council (UK) holds the highest total AAS percentage (58.3) in the sample (Table 12), while the Wellcome Trust (UK) also stands out for achieving the highest percentage in 2015 (85.8), 2016 (87.2), and 2017 (87.7).

Table 12

Funding bodies with the highest and lowest Altmetric Attention Score percentage. Data: Top 50 funding bodies according to total number of publications indexed.

PERFORMANCE Funders Publication Altmetric Attention Score (%)

R PAlt TOT 2017 Range SD

HIGH Medical Research Council 21 88375 58.3 83.9 48.7 20.1
European Research Council 34 55482 56.3 64.7 32.0 18.3
Canadian Institutes of Health Research 30 57183 51.2 78 42.1 18.5
Wellcome Trust 24 70999 50.9 87.7 51.6 21.4
Directorate for Biological Sciences (USA) 36 47706 48.7 80.6 56.0 21.5

LOW National Natural Science Foundation of China 1 247376 17.5 24.5 16.4 6.3
Ministry of Science and Technology (Taiwan) 20 28445 17.5 31.5 19.9 6.9
China Postdoctoral Science Foundation 50 11921 16.9 21.4 18.6 5.9
Directorate for Computer & Information Science & Engineering (USA) 28 19203 16.3 22.8 14.0 3.4
Ministry of Education of the People’s Republic of China 13 35006 15.4 23.5 16.5 6.0

The National Natural Science Foundation of China is the funding body in the sample with the highest number of associated publications. However, and in accordance with the data obtained in previous sections, it achieves a low AAS percentage (17.5). The same occurs with other Chinese funding bodies (Ministry of Science and Technology of the People’s Republic of China, 19.7; China Postdoctoral Science Foundation, 17.5; Ministry of Education of the People’s Republic of China, 15.4).

In the case of Europe, we can observe a difference between the European Research Council (second highest total AAS percentage) and the European Commission (26th position). Their evolution (from 2000 to 2017) is available in Figure 8.

Figure 8 

Funding bodies. Altmetric Attention Score percentage of European Research Council and the European Commission (2010 to 2017).

Dimensions data

a) Data availability

Although Dimensions offers statistics for any query, including a null query (e.g., the whole database), the AAS percentage (the percentage of documents returned by a query that have an Altmetric Attention Score of one or higher) for the aggregated entities analyzed depends first on the coverage of documents and, second, on the information extracted from each document.

In this sense, the in-house version of Dimensions as of July 2018 covers a total of 40,711,747 publications published between 2000 and 2017. Of these, only 55.1% (22,433,285) have an associated research category field, and an affiliation field appears for 53.8% (21,884,709).

b) Data indexing

Journal coverage in Dimensions presents some inconsistencies. First, the inclusion of SSRN Electronic Journal (currently not a peer-reviewed journal) is debatable, as it ranks 7th in total number of publications. Secondly, some journals (Advanced Materials Research, Applied Mechanics and Materials, Inpharma Weekly, Scientific Reports, and Medicine & Science in Sports & Exercise) show unusual annual article indexing rates in some years (from 2000 to 2017) that might bias the results.

In order to delve into this issue, we compared the number of articles indexed by these journals per year in both Dimensions and the Web of Science. Of the 50 journals in the sample—those with the most total publications indexed in Dimensions—eight are not indexed in the Web of Science.

When a journal changes its name from Name 1 to Name 2, Dimensions merges the articles of both titles under the bibliographic record corresponding to Name 2. For this reason, in order to compare the output in Dimensions with that offered by the Web of Science, we needed to locate all previous journal names in WoS and merge their output before comparing the total volume of publications indexed in both databases. This issue was detected in the following journals of the analyzed sample: Biochimica et Biophysica Acta, Angewandte Chemie, The Lancet, Physical Review A-general physics, Physical Review B-solid state, British Medical Journal, and Analytical and Bioanalytical Chemistry.

The Spearman correlation between the total number of articles indexed for each journal in the two databases is statistically significant but unexpectedly low (0.54; p-value: 0.000), and increases when only the 2000 to 2017 period is considered (0.87; p-value: <0.0001). A scatter plot comparing the ranking position of journals according to the total number of articles indexed in Dimensions with the ranking position these journals occupy in the Web of Science confirms that the two databases offer different coverage of the academic output (Figure 9). As we can observe, the number of articles appears to be somewhat inflated in Dimensions, with three clear outliers (Notes and Lectures, Scientific American, and Journal of Geophysical Research).
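
A minimal sketch of this journal-level comparison, using invented article counts for five hypothetical journals rather than the real Dimensions and Web of Science figures: the same correlation call applies, and ranking positions in each database feed a scatter plot like Figure 9.

```python
from scipy.stats import spearmanr, rankdata

# Invented total article counts for five hypothetical journals in each database.
dimensions_counts = [120000, 95000, 64000, 30000, 8000]
wos_counts        = [ 40000, 90000, 60000, 28000, 9000]

rho, p_value = spearmanr(dimensions_counts, wos_counts)
print(f"Spearman R = {rho:.2f}, p-value = {p_value:.4f}")

# Ranking positions (1 = most articles) in each database, as compared in Figure 9.
dim_rank = rankdata([-c for c in dimensions_counts], method="ordinal")
wos_rank = rankdata([-c for c in wos_counts], method="ordinal")
print(list(zip(dim_rank, wos_rank)))
```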

Figure 9 

Scatter plot. Ranking position of journals with more publications indexed in Dimensions and the corresponding ranking position in the Web of Science.

The case of the Journal of Geophysical Research highlights an indexing problem. This journal was gradually divided into several sections (each with a distinctive ISSN). In the Web of Science, there are 17,071 articles under the general “Journal of Geophysical Research” name (which stopped in 1985) and 98,472 publications if we consider all the remaining articles in the current journal sections. In Dimensions, we can also find a record for the general journal, as well as one for each of the seven sections. However, the assignment of articles to the general journal shows inconsistencies (see Figure 10), which inflates the number of articles indexed under the old title.

Figure 10 

Erroneous journal assignment in Dimensions.

c) Data volatility

Finally, an analysis of the retroactive growth of Dimensions has been carried out. As we can observe in Table 13, the database grew significantly from July to October for the same publication years. This effect is more pronounced in some years (especially 2007), probably due to the indexing of new journals. However, the AAS percentage variation is low and only slightly meaningful in 2017 (a global decrease of 0.7 points).

Table 13

Retroactive growth (July–October) of the Dimensions database according to the number of publications and AAS percentage per year.

YEAR July 2018 October 2018 VARIATION

TOT PUB ALT PUB AAS (%) TOT PUB ALT PUB AAS (%) PUB PUB (%) AAS (%)

2000 1690841 123431 7.3 1722054 125710 7.3 31213 1.85 0
2001 1677778 129189 7.7 1714087 131985 7.7 36309 2.16 0
2002 1750564 138295 7.9 1783923 140930 7.9 33359 1.91 0
2003 1879545 148484 7.9 1904880 152390 8 25335 1.35 0.1
2004 2089850 165098 7.9 2100383 165930 7.9 10533 0.50 0
2005 2197035 175763 8 2204030 178526 8.1 6995 0.32 0.1
2006 2384324 185977 7.8 2392543 189011 7.9 8219 0.34 0.1
2007 2569800 195305 7.6 2690652 196418 7.3 120852 4.70 –0.3
2008 2634499 202856 7.7 2648589 203941 7.7 14090 0.53 0
2009 2821544 214437 7.6 2830264 217930 7.7 8720 0.31 0.1
2010 2934715 225973 7.7 2943985 229631 7.8 9270 0.32 0.1
2011 3344453 317723 9.5 3357385 318952 9.5 12932 0.39 0
2012 3444045 509719 14.8 3459753 508584 14.7 15708 0.46 –0.1
2013 3703468 596258 16.1 3721231 599118 16.1 17763 0.48 0
2014 3870263 700518 18.1 3890949 704262 18.1 20686 0.53 0
2015 4013907 834893 20.8 4039335 836142 20.7 25428 0.63 –0.1
2016 4145209 907801 21.9 4191948 905461 21.6 46739 1.13 –0.3
2017 4384178 986440 22.5 4452112 970560 21.8 67934 1.55 –0.7

The effect of the retroactive growth per entity type is low (average AAS percentage variation for countries: 0.24; cities: 0.36; institutions: 0.24; journals: 0.26; disciplines: 0.27). However, some outliers are found. The maximum variation for each entity type is the following (a minimal sketch of this comparison is shown after the list):

  • Countries: maximum variation of 11.3, detected for the USA in 2017 (43.9 in July; 55.2 in October).
  • Cities: maximum variation of 2.5, detected for Ann Arbor in 2017 (49.1 in July; 51.6 in October). Moreover, erratic variation throughout the whole period is detected.
  • Disciplines: maximum variation of 2, detected for neurosciences in 2016 (59.4 in July; 56.4 in October).
  • Journals: maximum variation of 9.2, detected for New England Journal of Medicine in 2011 (58.5 in July; 67.7 in October). Moreover, erratic variation from 2011 to 2017 is detected.
  • Universities: maximum variation of 2, detected for University College London in 2017 (68.5 in July; 70.5 in October).
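
A minimal sketch of the snapshot comparison behind this list, restricted to the USA and Ann Arbor figures reported above; additional entities and years would be handled identically.

```python
# July and October 2018 AAS percentages per (entity, year), taken from the list above.
july    = {("USA", 2017): 43.9, ("Ann Arbor", 2017): 49.1}
october = {("USA", 2017): 55.2, ("Ann Arbor", 2017): 51.6}

max_variation = {}
for (entity, year), july_value in july.items():
    diff = abs(october[(entity, year)] - july_value)   # absolute change between snapshots
    best = max_variation.get(entity)
    if best is None or diff > best[0]:
        max_variation[entity] = (diff, year)

for entity, (diff, year) in max_variation.items():
    print(f"{entity}: maximum AAS percentage variation of {diff:.1f} points ({year})")
```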

5. Discussion

The results show certain limitations of the Dimensions data that can jeopardize the main purpose of determining the coverage of documents with altmetric mentions (measured through the number of documents with an Altmetric Attention Score of one or higher).

The percentage of publications without an affiliation field is high (46.2%). This parameter is important because information about institutions, cities, and countries is extracted precisely from the affiliation field. Moreover, some inconsistencies (inflated document counts per journal, unusual annual indexing rates, and errors in assigning articles to the right journal) may change the ranking of the most productive journals in the database. Likewise, the number of publications without an assigned research category is also high (44.9%). Furthermore, publication categorization, performed at the article level instead of the journal level, has been shown in the literature to exhibit some inconsistencies (Orduna-Malea and Delgado López-Cózar 2018b; Bornmann 2018). Finally, retroactive growth causes some minor variations in the Altmetric Attention Score percentage, which may affect the results for specific entities depending on the data collection time.

Nonetheless, despite some particular exceptions, the results offered are plausible and reflect some well-known general patterns (see the Results section). Moreover, because a specific period of time (2000 to 2017) and specific entities (top 50 entities per type) are considered, the error rate is minimized and we were able to concentrate specifically on the years in which altmetric activity is higher (2012 onwards). In this sense, the research questions established in this work can be answered in general terms, though the answers must be treated with caution.

Apart from Dimensions itself—a database that is continuously growing and improving its functionalities—other external variables may bias the results obtained.

Firstly, the percentage of documents with altmetric mentions is gathered via one specific data provider (Altmetric.com), whose results may differ from those obtained by other data providers, such as PlumX (Zahedi & Costas 2018). Also, Altmetric.com only tracks mentions driven by DOI. This method potentially disregards mentions of publications without DOIs, an aspect already discussed in the literature (Weller et al. 2011; Mahrt et al. 2012). Moreover, not all publications have a DOI. Gorráiz et al. (2016) estimate that 10% of articles in the Web of Science (2005 to 2014) in the sciences and social sciences do not have DOIs, and that DOI coverage is much lower for the humanities (exceeding 50% only since 2013). For this reason, all figures on altmetric mentions obtained via Altmetric.com can be considered an underestimation of the real values.

Secondly, publication coverage in Dimensions is wider than in the Web of Science and Scopus. At the time of writing this study, Dimensions included 10,180,612 book chapters (with an AAS percentage of 1) and 375,080 books (AAS percentage of 8.9). This coverage definitely affects all comparisons with previously reported altmetric coverage. For example, while Torres-Salinas et al. (2018) identified Universitat Pompeu Fabra as the Spanish public university with the highest percentage of documents (from 2014 to 2016) in Altmetric.com (71%), the Altmetric Attention Score percentage in Dimensions for the same period is 62.3%. Despite the different methods used—the percentage of documents included in Altmetric.com does not necessarily coincide with the percentage of documents with an Altmetric Attention Score of one or higher—the results would be expected to be closer. Therefore, the wider coverage of Dimensions offers a new perspective on the coverage of documents with altmetrics.

Thirdly, there are external variables that have been shown to bias the reception of altmetrics by publications (Sugimoto et al. 2017). Non-biomedical disciplines (Haustein et al. 2014b; Holmberg & Thelwall 2014; Ortega 2018; Zahedi et al. 2014), disciplinary journals (Zahedi et al. 2014), Latin American countries (Alperin 2015), and, in general, older publications (Zahedi et al. 2014) obtain statistically fewer social media mentions than biomedical disciplines, multidisciplinary journals, English-speaking countries, and recent publications. All these previous conclusions are in line with the results obtained via Dimensions, which reinforces its reliability.

6. Conclusions

RQ1. Total coverage of publications with altmetrics

The total number of publications with an Altmetric Attention Score (AAS) of one or higher is low (9,167,952 documents; 9.4% of the total coverage) and highly concentrated in recent years (from 2012 onwards), especially 2017, which contains 10.6% of all the documents with an AAS. The percentage of documents with an AAS is higher (18.9%) when only open access documents are considered (an open access altmetric advantage).

RQ2 to RQ6. Journals, places (countries, cities), institutions, categories, and funding bodies

Multidisciplinary (Nature, Science, PNAS, PLoS One) and medical (Journal of the American Medical Association, British Medical Journal, New England Journal of Medicine, The Lancet) journals represent the sources with the highest AAS percentages (especially in 2017). Therefore, the most visible institutions are those active in these fields, highlighting English-speaking universities (Harvard, UCLA, Johns Hopkins, University College London, or the University of Melbourne). Conversely, Japanese, Chinese, and Russian institutions, despite having high annual publication outputs, obtain lower AAS percentages. Both language and the use of alternative social media platforms (especially in China, where Twitter is blocked), which are not yet sufficiently covered by Altmetric.com, may explain this issue.

This circumstance is subsequently inherited in the analysis of cities and countries, diminishing the presence of Japan, China, Russia, and India. By contrast, Australia has the highest AAS percentage among the top 50 most productive countries in the world, followed by Denmark and the Netherlands.

Disciplines are also directly influenced by journal coverage in Dimensions. Research categories such as genetics, immunology, microbiology, or medical microbiology have held higher AAS percentages in recent years. Conversely, fields related to mathematics, engineering, and, unexpectedly, computer science achieve lower AAS percentages.

Finally, funding bodies reflect the predominance of medicine-, health-, and biology-related disciplines (Medical Research Council, Canadian Institutes of Health Research, Directorate for Biological Sciences). On the other hand, Chinese funding bodies, following the previous pattern, hold the lowest AAS percentages.

RQ7. Dimensions

The analysis has brought out some inconsistencies in the quality of the data. In this sense, AAS percentages can vary from those offered in the database, although we estimate that this issue will not substantially modify the general patterns found. However, the database still needs some improvements. The nature of Dimensions (wide coverage and structured metadata that can be exported and re-used) makes this bibliographic database and research framework an essential tool to monitor the coverage and evolution of the impact (mentions) of scientific literature on social media platforms. The volume of data and its growth rate confirm Dimensions as a new player in the ecosystem of research information.

Additional Files

The additional files for this article can be found as follows:

Appendix A

Altmetric Attention Score (%) for Top 50 Countries with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s1

Appendix B

Countries: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s2

Appendix C

Altmetric Attention Score (%) for Top 50 Cities with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s3

Appendix D

Cities: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s4

Appendix E

Altmetric Attention Score (%) for Top 50 Universities with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s5

Appendix F

Universities: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s6

Appendix G

Altmetric Attention Score (%) for Top 50 Journals with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s7

Appendix H

Journals: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s8

Appendix I

Altmetric Attention Score (%) for Top 50 Research Categories with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s9

Appendix J

Categories: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s10

Appendix K

Research Categories matched with General Fields: Publications, Cited (%), FCR, RCR, AAS (%). DOI: https://doi.org/10.29024/joa.13.s11

Appendix L

Altmetric Attention Score (%) for Top 50 Funders with higher number of publications in Dimensions (2000 to 2017). DOI: https://doi.org/10.29024/joa.13.s12

Appendix M

Funders: Publication, Cited (%) and Altmetric Attention Score (%). DOI: https://doi.org/10.29024/joa.13.s13

Acknowledgements

The authors would like to thank Dr. Stefanie Haustein and Dr. Rodrigo Costas for their feedback and help during the development of this work, which was partly carried out in Ottawa (Canada).

Competing Interests

The authors have no competing interests to declare.

References

  1. Alperin, J. P. (2015). Geographic variation in social media metrics: An analysis of Latin American journal articles. Aslib Journal of Information Management, 67(3), 289–304. DOI: https://doi.org/10.1108/AJIM-12-2014-0176 

  2. Barthel, S., Tönnies, S., Köhncke, B., Siehndel, P., & Balke, W. T. (2015). What does Twitter measure? Influence of diverse user groups in altmetrics. In Proceedings of the 15th ACM/IEEE-CS Joint Conference on Digital Libraries, Knoxville, USA. DOI: https://doi.org/10.1145/2756406.2756913 

  3. Bode, C., Herzog, C., Hook, D., & McGrath, R. (2018). A guide to the dimensions data approach. A collaborative approach to creating a modern infrastructure for data describing research: where we are and where we want to take it. Retrieved from https://www.digital-science.com/resources/portfolio-reports/a-guide-to-the-dimensions-data-approach. 

  4. Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. DOI: https://doi.org/10.1016/j.joi.2014.09.005 

  5. Bornmann, L. (2018). Field classification of publications in Dimensions: A first case study testing its reliability and validity. Scientometrics, 117(1), 637–640. DOI: https://doi.org/10.1007/s11192-018-2855-y 

  6. Chen, C. (2018). Cascading Citation Expansion. Retrieved from: https://arxiv.org/ftp/arxiv/papers/1806/1806.00089.pdf. 

  7. Costas, R., Zahedi, Z., & Wouters, P. (2015). Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. Journal of the Association for Information Science and Technology, 66(10), 2003–2019. DOI: https://doi.org/10.1002/asi.23309 

  8. Gorraiz, J., Melero-Fuentes, D., Gumpenberger, C., & Valderrama-Zurián, J. C. (2016). Availability of digital object identifiers (DOIs) in Web of Science and Scopus. Journal of Informetrics, 10(1), 98–109. DOI: https://doi.org/10.1016/j.joi.2015.11.008 

  9. Harseim, T., & Goodey, G. (2017). How do researchers use social media and scholarly collaboration networks (SCNs). Nature Blog, 15. Retrieved from http://blogs.nature.com/ofschemesandmemes/2017/06/15/how-do-researchers-use-social-media-and-scholarly-collaboration-networks-scns. 

  10. Haustein, S. (2014). Readership Metrics. In B. Cronin, C. R. Sugimoto (Eds.). Beyond bibliometrics: Harnessing multi-dimensional indicators of performance (pp. 327–344). Cambridge (MA), USA: MIT Press. 

  11. Haustein, S. (2016). Grand challenges in altmetrics: Heterogeneity, data quality and dependencies. Scientometrics, 108(1), 413–423. DOI: https://doi.org/10.1007/s11192-016-1910-9 

  12. Haustein, S., Bowman, T. D., Macaluso, B., Sugimoto, C. R., & Larivière, V. (2014a). Measuring Twitter activity of arXiv e-prints and published papers. Paper presented at Altmetrics14: Expanding Impacts and Metrics, Workshop at Web Science Conference, Bloomington (Indiana), USA. DOI: https://doi.org/10.6084/m9.figshare.1041514 

  13. Haustein, S., Larivière, V., Thelwall, M., Amyot, D., & Peters, I. (2014b). Tweets vs. Mendeley readers: How do these two social media metrics differ?. IT-Information Technology, 56(5), 207–215. DOI: https://doi.org/10.1515/itit-2014-1048 

  14. Haustein, S., Peters, I., Sugimoto, C. R., Thelwall, M., & Larivière, V. (2014c). Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature. Journal of the Association for Information Science and Technology, 65(4), 656–669. DOI: https://doi.org/10.1002/asi.23101 

  15. Herzog, C., & Lunn, B. K. (2018). Response to the letter ‘Field classification of publications in Dimensions: A first case study testing its reliability and validity’. Scientometrics, 117(1), 641–645. DOI: https://doi.org/10.1007/s11192-018-2854-z 

  16. Holmberg, K., & Thelwall, M. (2014). Disciplinary differences in Twitter scholarly communication. Scientometrics, 101(2), 1027–1042. DOI: https://doi.org/10.1007/s11192-014-1229-3 

  17. Hook, D., Porter, S., & Herzog, C. (2018). Dimensions: Building context for search and evaluation. Frontiers in Research Metrics and Analytics, 3. DOI: https://doi.org/10.3389/frma.2018.00023 

  18. Li, X., Thelwall, M., & Giustini, D. (2012). Validating online reference managers for scholarly impact measurement. Scientometrics, 91(2), 461–471. DOI: https://doi.org/10.1007/s11192-011-0580-x 

  19. Mahrt, M., Weller, K., & Peters, I. (2012). Twitter in scholarly communication. In K. Weller, A. Bruns, J. Burgess, M. Mahrt, & C. Puschmann (Eds.). Twitter and Society (pp. 399–410). New York, USA: Peter Lang. 

  20. Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows. Journal of the American Society for Information Science and Technology, 65(8), 1627–1638. DOI: https://doi.org/10.1002/asi.23071 

  21. Orduña-Malea, E., & Delgado-López-Cózar, E. (2018a). ¡Viva la competencia! Nuevas dimensiones para la búsqueda y evaluación de la información científica. Anuario Think EPI, 12. DOI: https://doi.org/10.3145/thinkepi.2018.45 

  22. Orduña-Malea, E., & Delgado-López-Cózar, E. (2018b). Dimensions: Re-discovering the ecosystem of scientific information. El Profesional de la Información, 27(2), 420–431. DOI: https://doi.org/10.3145/epi.2018.mar.21 

  23. Ortega, J. L. (2018). Disciplinary differences of the impact of altmetric. FEMS microbiology letters, 365(7). DOI: https://doi.org/10.1093/femsle/fny049 

  24. Priem, J. (2014). Altmetrics. In B. Cronin, C. R. Sugimoto (Eds.), Beyond bibliometrics: Harnessing multi-dimensional indicators of performance. Cambridge (MA), USA: MIT Press. 

  25. Priem, J., Piwowar, H., & Hemminger, B. M. (2012). Altmetrics in the wild: Using social media to explore scholarly impact. Retrieved from http://arxiv.org/html/1203.4745. 

  26. Robinson-García, N., Torres-Salinas, D., Zahedi, Z., & Costas, R. (2014). New data, new possibilities: Exploring the insides of Altmetric.com. El Profesional de la Información, 23(4), 359–366. DOI: https://doi.org/10.3145/epi.2014.jul.03 

  27. Schonfeld, R. C. (2018). A new citation database launches today: Digital Science’s Dimensions. Retrieved from https://scholarlykitchen.sspnet.org/2018/01/15/new-citation-database-dimensions. 

  28. Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2017). Scholarly use of social media and altmetrics: A review of the literature. Journal of the Association for Information Science and technology, 68(9), 2037–2062. DOI: https://doi.org/10.1002/asi.23833 

  29. Thelwall, M. (2018). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430–435. DOI: https://doi.org/10.1016/j.joi.2018.03.006 

  30. Torres-Salinas, D., Castillo-Valdivieso, P. A., Pérez-Luque, A., & Romero-Frías, E. (2018). Altmétricas a nivel institucional: Visibilidad en la Web de la producción científica de las universidades españolas a partir de Altmetric.com. El Profesional de la Información, 27(3), 483–492. DOI: https://doi.org/10.3145/epi.2018.may.03 

  31. Van Noorden, R. (2014). Online collaboration: Scientists and the social network. Nature news, 512(7513), 126. DOI: https://doi.org/10.1038/512126a 

  32. Weller, K., Dröge, E., & Puschmann, C. (2011). Citation analysis in Twitter: Approaches for defining and measuring information flows within Tweets during scientific conferences. In M. Rowe, M. Stankovic, A.-S. Dadzie, & M. Hardey (Eds.), Making Sense of Microposts (MSM2011) (pp. 1–12). Heraklion: CEUR Workshop Proceedings. 

  33. Wouters, P., & Costas, R. (2012). Users, narcissism and control: tracking the impact of scholarly publications in the 21st century. Utrecht, Netherlands: SURFfoundation. 

  34. Zahedi, Z., & Costas, R. (2018). General discussion of data quality challenges in social media metrics: Extensive comparison of four major altmetric data aggregators. PloS One, 13(5). DOI: https://doi.org/10.1371/journal.pone.0197326 

  35. Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics, 101(2), 1491–1513. DOI: https://doi.org/10.1007/s11192-014-1264-0 
