
Research

Measure of Scientific Impact: How Altmetrics Can Innovate the Approach in a Multidimensional Model

Authors:

Valeria Scotti,

Center for Scientific Documentation, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT

Annalisa De Silvestri,

Clinical Epidemiology and Biostatistics Unit, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT

Luigia Scudeller,

ESCMID Medical Guidelines Scientific Director, Clinical Epidemiology and Biostatistics, Scientific Direction, IRCCS Ca’ Granda Ospedale Maggiore Policlinico Foundation, IT

Chiara Rebuffi,

Center for Scientific Documentation, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT

Funda Topuz,

Center for Scientific Documentation, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT

Moreno Curti

Center for Scientific Documentation, Scientific Direction, Fondazione I.R.C.C.S Policlinico San Matteo, Pavia, IT

Abstract

With the development of the Web 2.0 social media era and the rapid increase of open access journals, novel alternative metrics, known as altmetrics, have emerged. Altmetrics are not substitutes for traditional bibliometrics; rather, they are complementary measures of the impact and influence of research on society (e.g., patients). The goal of this study was to analyze our institution’s scientific production and determine how altmetrics could help us evaluate it, characterize research themes, and identify the research with the greatest impact. Our analysis shows that some research themes had unexpectedly good altmetric scores compared with traditional citations, and we confirmed a high correlation between altmetric scores and standard bibliometric indexes at the institutional level. Our study shows that altmetrics are mature enough to represent an interesting complement to citation counts and the impact factor.
How to Cite: Scotti, V., De Silvestri, A., Scudeller, L., Rebuffi, C., Topuz, F. and Curti, M., 2020. Measure of Scientific Impact: How Altmetrics Can Innovate the Approach in a Multidimensional Model. Journal of Altmetrics, 3(1), p.1. DOI: http://doi.org/10.29024/joa.23
Submitted on 13 Dec 2019 · Accepted on 21 Feb 2020 · Published on 18 Mar 2020

Background

The problem of measuring the scientific and social impact of research publications is of great interest to scientists and scholars, as the first source in the research waste chain (Macleod et al. 2014) is the limited relevance of many research questions to patients. Taking patients’ opinions into account when selecting research priorities should improve research and reduce waste. Waste results when the needs of the users of research evidence are ignored. If researchers do not meet the needs of the users of research, evidence will have less effect on clinical and public health practice than it should. The principal users of clinical and epidemiological research are clinicians and the patients who look to them for help. Both are often frustrated by mismatches between the uncertainties that they wish to see addressed in research and the questions that researchers choose to investigate (Liberati 2011).

Alternative metrics claim to measure research impact outside the academic community. As a consequence, the concept of research impact has evolved rapidly in recent years, from a scenario in which only the impact on the scientific community was evaluated to one in which the impact on general society is also evaluated: altmetrics take their place alongside well-known terms such as the H-index and the impact factor (IF). This opens a new scenario for the evaluation of science, in which the interaction between scholarly work and social networks, and more widely society, can be explored (Bornmann and Haunschild 2018).

Librarians are interested in these themes, and their knowledge about this issue is rapidly growing (Gómez-Sánchez et al. 2019); their role is changing as they become more and more involved in research design.

Objectives

We seek to answer the following questions. How can we help our researchers with these new data: through courses, training, help in compiling a CV, or something new? How can we use these data for the institution, compared with traditional methods? Which clinical units obtain the most citations and the highest altmetric scores? Which lines of research are most attractive (for example, for funds or grants)? What has been our hospital’s citation trend over the years?

Methods/Description

In this monocentric study, we collected our hospital’s scientific production from 2011 onwards, for a total of 3,176 articles.

With FileMaker 11 software (https://www.filemaker.com/), we created a database that collects the citations and altmetrics of all research articles produced by our researchers at Fondazione IRCCS Policlinico San Matteo. FileMaker is a cross-platform relational database application: data (field values) can be entered manually or imported from another application. Field values in a FileMaker file can be text, numbers, dates, times, timestamps, pictures, sounds, movies, enclosed files, calculated values, and summary values. Reports can be created to group or summarize data.
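Although the study used FileMaker rather than code, a minimal Python sketch of the record structure described above (and shown in Figure 1) may help clarify what is stored for each article; all field names here are illustrative assumptions, not the actual FileMaker schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ArticleRecord:
    """One row of the institutional database (illustrative field names only)."""
    title: str
    year: int
    department: str
    research_line: str                        # MoH-defined line of research used for funding
    doi: Optional[str] = None
    pmid: Optional[str] = None
    wos_accession: Optional[str] = None       # Web of Science identifier
    scopus_eid: Optional[str] = None          # Scopus EID
    journal_impact_factor: Optional[float] = None
    wos_citations: int = 0
    scopus_citations: int = 0
    altmetric_score: Optional[float] = None   # retrieved from Altmetric.com
```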

We retrieved citations for each article through the Web of Science and Scopus databases. Through the PMID and the DOI of each publication, we obtained each one’s score on Altmetric.com (Figure 1). When the update was launched, the system connected to Web of Science, Scopus and Altmetric.com (Figure 2). The data can then be broken down by year, department, or unit.
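As an illustration only, and not the FileMaker routine used in the study, the following Python sketch shows how an altmetric score could be looked up by DOI or PMID, assuming the public Altmetric Details Page API v1 endpoints (https://api.altmetric.com/v1/doi/… and …/v1/pmid/…). Citation counts from Web of Science and Scopus require licensed API access and are not shown.

```python
from typing import Optional
import requests

ALTMETRIC_BASE = "https://api.altmetric.com/v1"  # public Details Page API (assumed endpoint)

def altmetric_score(doi: Optional[str] = None, pmid: Optional[str] = None) -> Optional[float]:
    """Return the Altmetric Attention Score for a DOI or PMID, or None if not tracked."""
    if doi:
        url = f"{ALTMETRIC_BASE}/doi/{doi}"
    elif pmid:
        url = f"{ALTMETRIC_BASE}/pmid/{pmid}"
    else:
        raise ValueError("Provide a DOI or a PMID")
    resp = requests.get(url, timeout=10)
    if resp.status_code == 404:   # article not tracked by Altmetric.com
        return None
    resp.raise_for_status()
    return resp.json().get("score")

# Hypothetical usage: score of this very paper
print(altmetric_score(doi="10.29024/joa.23"))
```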

Figure 1 

Institutional central database created with FileMaker 11 software. It shows the article data, the journal IF, the DOI, the research line, the WoS accession number and the Scopus EID.

Figure 2 

Tools of the study (WoS, Altmetric.com, Scopus). The system connects to the various databases.

We assessed the correlation between altmetrics, citation counts and traditional indexes using Spearman’s rank correlation coefficient. The magnitude of the effect size for correlation coefficients was evaluated as described by Cohen: small for coefficients on the order of 0.1, medium for those on the order of 0.3, and large for those on the order of 0.5 (Cohen 1988). In our study, we considered a correlation coefficient greater than 0.3 meaningful, in line with many correlation coefficients reported in the literature (Hemphill 2003). Papers were then grouped by department or research theme, and both WoS citation counts and altmetric scores were summed.
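For readers who want to reproduce this kind of analysis, the following is a minimal Python sketch (not the software actually used in the study), assuming a hypothetical export `papers.csv` with columns `altmetric_score`, `wos_citations` and `research_theme`:

```python
import pandas as pd
from scipy.stats import spearmanr

papers = pd.read_csv("papers.csv")  # hypothetical export of the institutional database

# Spearman's rank correlation between altmetric scores and WoS citation counts
rho, p_value = spearmanr(papers["altmetric_score"], papers["wos_citations"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")  # Cohen: ~0.1 small, ~0.3 medium, ~0.5 large

# Group papers by research theme (or department) and sum both indicators
by_theme = papers.groupby("research_theme")[["wos_citations", "altmetric_score"]].sum()
print(by_theme.sort_values("altmetric_score", ascending=False))
```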

We also analyzed trends over time of both altmetrics and traditional indexes using the trend test across ordered groups (Cuzick 1985).
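The trend test across ordered groups is available in standard statistical packages (for example, nptrend in Stata). As a rough illustration of the idea only, the following sketch computes a Wilcoxon-type trend statistic over publication years in the spirit of Cuzick (1985), without the tie correction used by full implementations:

```python
import math
from scipy.stats import norm, rankdata

def cuzick_trend(values_by_year):
    """Wilcoxon-type test for trend across ordered groups (Cuzick 1985), ignoring tie corrections.

    values_by_year maps an ordered group label (e.g., publication year) to the
    altmetric scores (or citation counts) of that year's papers.
    """
    years = sorted(values_by_year)
    score_of = {year: i + 1 for i, year in enumerate(years)}   # group scores l_i = 1..k
    all_values, group_scores = [], []
    for year in years:
        for v in values_by_year[year]:
            all_values.append(v)
            group_scores.append(score_of[year])
    n = len(all_values)
    ranks = rankdata(all_values)                               # mid-ranks over the pooled sample
    t = sum(l * r for l, r in zip(group_scores, ranks))        # T = sum_i l_i * (rank sum of group i)
    sum_ln = sum(group_scores)                                 # sum_i l_i * n_i
    sum_l2n = sum(l * l for l in group_scores)                 # sum_i l_i^2 * n_i
    e_t = (n + 1) / 2 * sum_ln
    var_t = (n + 1) / 12 * (n * sum_l2n - sum_ln ** 2)
    z = (t - e_t) / math.sqrt(var_t)
    return z, 2 * norm.sf(abs(z))                              # two-sided p-value

# Hypothetical usage: altmetric scores per publication year
z, p = cuzick_trend({2011: [0, 1, 3], 2012: [2, 4, 1], 2013: [5, 3, 8]})
print(f"z = {z:.2f}, p for trend = {p:.3g}")
```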

Results

a) Every year our institution promotes a well-attended course on the use of social media for researchers. Some of our researchers are interested in including altmetric scores in their CVs.

b) A good correlation between the Altmetric.com score and traditional metrics (WoS citations) is observed (Figure 3), both for the whole period and for each year considered separately (rho ranging from 0.33 in 2012 to 0.45 in 2013).

Figure 3 

Correlation between Web of Science and Altmetric.com citation score.

Some research themes (defined using the lines of research designated by the Italian Ministry of Health for funding purposes) had an unexpectedly good altmetric score compared to traditional citations, such as chronic immunological diseases; this could be a sign of particular interest from patients and patient organizations. In contrast, bone marrow transplantation and related diseases have a higher citation index than altmetric score, suggesting these topics are of greater interest to the clinical and research community. Despite a good correlation (rho = 0.40) in the department-level analysis, some discrepancies emerged (for example, papers from the pain therapy unit score higher on altmetrics than on WoS citations).

A high percentage of papers have their own altmetric score, and the total altmetric score increases every year (p for trend < 0.001).

Discussion

In our study, we confirmed the correlation of altmetric scores with standard bibliometric indexes at the institutional level, already shown in a previous work based on 2013 data (Scotti et al. 2016).

As expected, comparing altmetric scores with more traditional indexes can discriminate between research themes that may have a greater impact on the general lay public than on the research community. This is important for identifying research areas that need more consideration if we want to reduce the research waste that results when the needs of the end-users of research are ignored. It is thus becoming increasingly evident that alternative metrics may play a crucial role in helping society, as well as patient communities, to retrieve reliable information on research needs. In summary, designing research not only on the basis of systematic reviews of the available evidence (evidence-based research) but also on the papers and themes most discussed by the public could result in less wasteful research, especially in the medical field.

Researchers, together with knowledgeable scientific journalists, could contribute to spreading relevant scientific results for the scientific education of the public. In fundraising activities, they may also highlight to their institutions the value of the most successful research programs (meaning those with a higher social impact, i.e., a higher altmetric score), because altmetrics measure impact in real time. Showing how research is relevant to the general public is useful, especially for institutions and foundations funded by public money, such as the one considered in this study.

Altmetrics could contribute to the ‘creation of value’ and give a more complete perspective on the important question of the democratization of evaluation because, unlike citation metrics, altmetrics track impact outside the academy. Indeed, the wider use of quantitative indicators and the emergence of altmetrics can be seen as part of the transition to a more accountable and transparent research system. As the San Francisco Declaration on Research Assessment (DORA) states, ‘The outputs from scientific research are many and varied […] It is thus imperative that scientific output is measured accurately and evaluated wisely.’

Limitations and strengths of the study

The data in our study come from a single institution, resulting in a smaller sample size than in other studies. However, the in-depth analysis of the various research themes and departments of a single institution reduces the heterogeneity inherent in data coming from different institutions, and allows for the analysis of a real-life situation and a pragmatic measure of the impact this new metric should have in addition to the traditional ones.

Conclusions and further works

Altmetrics are confirmed as an interesting complement to citations. We would like to further explore the possibility of combining altmetrics with traditional indicators in a more multidimensional model, one that could also include the individual components of the altmetric score, in order to assess the impact of scientific works over a given period of time and to evaluate the reliability of such a complex model.

Competing Interests

The authors have no competing interests to declare.

References

  1. Bornmann, L., & Haunschild, R. (2018). Alternative article-level metrics: The use of alternative metrics in research evaluation. EMBO Reports, 19(12), e47260. DOI: https://doi.org/10.15252/embr.201847260 

  2. Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates. 

  3. Cuzick, J. (1985). A Wilcoxon-type test for trend. Statistics in Medicine, 14(4), 445–6. DOI: https://doi.org/10.1002/sim.4780140409 

  4. FileMaker 11 software. https://www.filemaker.com/ 

  5. Gómez-Sánchez, A., Scotti, V., De Silvestri, A., Ardita, G., Halling, T., & Chaleplioglou, A. (2019). Metric competencies for biomedical librarians: Results of a survey developed by the EAHIL Evaluation and Metrics group. Journal of the European Association for Health Information and Libraries, 15(3), 17–2. DOI: https://doi.org/10.32384/jeahil15332 

  6. Hemphill, J. F. (2003). Interpreting the magnitudes of correlation coefficients. American Psychologist, 58, 78–79. DOI: https://doi.org/10.1037/0003-066X.58.1.78 

  7. Liberati, A. (2011). Need to realign patient-oriented and commercial and academic research. Lancet, 378, 1777–1778. DOI: https://doi.org/10.1016/S0140-6736(11)61772-8 

  8. Macleod, M. R., Michie, S., Roberts, I., Dirnagl, U., Chalmers, I., Ioannidis, J. P., Al-Shahi Salman, R., Chan, A. W., & Glasziou, P. (2014). Biomedical research: Increasing value, reducing waste. Lancet, 383(9912), 101–104. DOI: https://doi.org/10.1016/S0140-6736(13)62329-6 

  9. San Francisco Declaration on Research Assessment (DORA). https://sfdora.org/read/ 

  10. Scotti, V., De Silvestri, A., Scudeller, L., Abele, P., Topuz, F., & Curti, M. (2016). Novel bibliometric scores for evaluating research quality and output: a correlation study with established indexes. The International Journal of Biological Markers, 31(4), e451–e455. DOI: https://doi.org/10.5301/jbm.5000217 
