Researchers in some countries need to demonstrate the societal impact of their research (Wilsdon et al. 2015), despite this being impossible to measure directly. Social media mention counts (also known as “altmetrics”) have been proposed as a solution to this problem (Priem et al. 2010), on the basis that evidence of research being mentioned may serve as evidence that it has been noticed, or as a proxy for evidence that it has been found useful. The original definition of altmetrics has been expanded to encompass similar indicators derived from the wider web, such as online news, syllabus or policy document mentions. Evidence of societal impact is especially needed in the arts and humanities because these often claim to address the cultural needs of wide sections of the public (McCarthy et al. 2001; Nussbaum 2010). In addition, humanities research is often characterised by the need to explain the purpose of the research in detail, requiring greater engagement by the author with potential readers (Fry & Talja 2007; Whitley 2000). The social sciences underpin many numerically large professions as well as generating public interest for insights into the human condition, and may expect to have a large non-academic audience (Lynd 2015). It therefore seems reasonable to expect that some specializations in these areas might generate substantial interest online. This article investigates the social sciences, arts and humanities together as relatively neglected areas even though they encompass widely varying research topics and practices (e.g. from Visual Arts & Performing Arts to Public Administration).
Although there is no detailed analysis of altmetrics for the social sciences, arts and humanities, one previous study analysed half a million Web of Science (WoS) articles and reviews with Digital Object Identifiers (DOIs) published between July and December 2011, using altmetric data collected via the Altmetric.com API on or before November 2014 (Costas, Zahedi, & Wouters 2015). It mapped the prevalence of various types of altmetrics in 250 WoS subject categories, and reported mean and median values by broad area for the social reference sharing site Mendeley. It analysed the proportion of fields within three broad areas (Science, Social Science, Arts & Humanities) that had average altmetrics above or below the full data set median (Table IV of Costas, Zahedi, & Wouters 2015), finding almost all Arts and Humanities subjects (81%–96%) to have below-median altmetric scores for Mendeley, Twitter, Facebook, Google Plus, blogs and news sources, whereas most Social Sciences subjects (63%–88%) had above-median scores. It also found Twitter to be most prevalent for medicine, psychology and the social sciences. This study used relatively old data, however, gives relatively coarse-grained information (other than the mappings), and does not analyse these subjects in detail. A comparison of Mendeley reader counts in the social sciences and humanities found reader counts to be twice as high in the social sciences (Mohammadi & Thelwall 2014). Books are important in some social sciences, arts and humanities fields, but they seem to have lower altmetric scores than articles (Torres-Salinas, Robinson-Garcia, & Gorraiz 2017). A study of articles with PubMed IDs in multiple fields found that the proportion of articles that had been tweeted was relatively high in professional fields (17%) and psychology (15%) compared to other social sciences (9%) and the humanities (7%) (Haustein et al. 2014), but more recent studies have reported much higher proportions tweeted.
An analysis of Swedish humanities journal articles from 2012 found that 21% had been tweeted, compared to 3% mentioned on Facebook and 2% blogged (Hammarfelt 2014). Alternative indicators have also been developed for humanities books, such as library holding counts (Torres-Salinas & Moed 2009; White et al. 2009; White & Zuccala 2018), Goodreads reviews (Zuccala et al. 2015) and syllabus mentions (Kousha & Thelwall 2016).
Some content analyses of altmetrics have given insights into why articles are cited on the social web. Most tweets citing academic papers quote only the title or a short summary, and few express an opinion (Thelwall et al. 2013). Tweets from a set of researchers were often about research, although with disciplinary differences in the extent of tweeting (Holmberg & Thelwall 2014; see also: Haustein et al. 2014; Priem & Costello 2010). An investigation into citations in health-related science blogs found that these were a valuable means of translating academic research for the public, even sometimes including practical health advice (Shema, Bar-Ilan, & Thelwall 2015). A survey of people that had tweeted a link to academic research found that a substantial minority were non-academics, providing useful evidence that tweets may reflect public interest in scholarship to a moderate extent (Mohammadi et al. 2018).
Given the lack of current information about the prevalence of altmetrics across fields in the social sciences, arts and humanities, the current paper investigates this using data provided by the main current altmetric gatherer, Altmetric.com. Mendeley reader counts are excluded because they have been extensively researched and shown to be a citation-like indicator (Mohammadi et al. 2016; Thelwall 2017c). The following research questions drive the investigation, which focuses on the ten Altmetric.com indicators shared via their Applications Programming Interface (API) (blogs, news, tweets, Reddit, Facebook, Pinterest, Wikipedia, reviews, questions, Google Plus):
The articles were taken from Scopus rather than WoS because of its greater coverage of non-English sources (Falagas et al. 2008; Mongeon, & Paul-Hus 2016), which is important for the Arts and Humanities because these often publish in local languages. Thus, a Web of Science dataset would be likely to overrepresent articles from English-speaking nations more than Scopus. Google Scholar has greater coverage (Halevi, Moed, & Bar-Ilan 2017; Harzing & Alakangas 2016) but lacks a classification scheme or automated access for fields. Both Microsoft Academic (Harzing & Alakangas 2017; Thelwall 2017a) and Dimensions (Thelwall 2018) also probably have greater coverage but both lack reliable fine-grained classification schemes.
All documents of type journal article that were published in 2013 and classified in a Scopus Social Sciences (22 fields) or Arts and Humanities (13 fields) narrow field were collected 15–17 July 2018 from the Scopus API. The year 2013 was chosen to give enough time (5 years) for articles to accrue mature citation counts in addition to altmetrics. Arts and humanities citations can be slow to accrue, so a relatively long citation window is appropriate here. Altmetrics for each article in this set with a DOI were collected 5–7 August 2018 from the free Altmetric.com API using a DOI query. Articles were analysed according to the Scopus classification of the journal in which they were published. This is not an ideal approach because articles within multidisciplinary journals can be allocated to inappropriate categories. Alternative classification methods, such as those based on keywords, article/title text, citations and references (Ruiz-Castillo & Waltman 2015) might largely solve this problem but would give less transparent and less practically useful conclusions for the current article, since there is no agreed procedure for algorithmic article-level classification.
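The altmetric collection step can be sketched as follows. This is an illustrative query of the free Altmetric.com v1 DOI endpoint; the endpoint route is documented by Altmetric.com, but the response keys in INDICATOR_KEYS should be treated as assumptions to verify against the current API documentation.

```python
import json
import urllib.error
import urllib.request

# Free Altmetric.com API route used for the DOI queries described above.
ALTMETRIC_URL = "https://api.altmetric.com/v1/doi/{doi}"

# Assumed mapping from the indicators analysed to API response keys;
# check these against the current Altmetric.com API documentation.
INDICATOR_KEYS = {
    "tweeters": "cited_by_tweeters_count",
    "facebook": "cited_by_fbwalls_count",
    "blogs": "cited_by_feeds_count",
    "news": "cited_by_msm_count",
    "reddit": "cited_by_rdts_count",
    "wikipedia": "cited_by_wikipedia_count",
    "google_plus": "cited_by_gplus_count",
}


def extract_counts(record: dict) -> dict:
    """Map an Altmetric.com JSON record to indicator counts (0 if absent)."""
    return {name: int(record.get(key, 0)) for name, key in INDICATOR_KEYS.items()}


def fetch_counts(doi: str) -> dict:
    """Fetch the altmetric counts for one DOI; a 404 means no record exists."""
    try:
        with urllib.request.urlopen(ALTMETRIC_URL.format(doi=doi)) as response:
            return extract_counts(json.load(response))
    except urllib.error.HTTPError as err:
        if err.code == 404:  # article has no recorded mentions at all
            return {name: 0 for name in INDICATOR_KEYS}
        raise
```

Treating a 404 as an all-zero record matters for the analysis above, since most articles have no altmetric record and would otherwise be silently dropped.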
The geometric mean and the proportion cited were calculated for citation counts and for each of the ten Altmetric.com indicators (Blogs, News, Tweets, Reddit, Facebook, Pinterest, Wikipedia, Reviews, Questions, G+) separately for each field. Geometric means are more appropriate than arithmetic means for skewed data (Fairclough & Thelwall 2015) and are more fine-grained than medians. This is important when there are many zeros, as for all the indicators in all fields (all fields had a median of zero). Wilson score intervals (Wilson 1927) were used for the proportion cited calculations because these are reasonable estimates for 95% confidence intervals (Agresti & Coull 1998). Although the data is not a sample from a population in the conventional statistical sense, it can be thought of as a sample of the papers that could have been written under similar circumstances (i.e. the apparent population: Berk, Western, & Weiss 1995; Bollen 1995; Leahey 2005), and the confidence intervals cover the underlying likelihood of papers from each field attracting altmetrics.
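The two statistics can be sketched in Python. The geometric mean below uses the ln(1 + x) transform commonly applied to zero-inflated count data in this literature (an assumption about the exact variant used; Fairclough & Thelwall 2015 discuss the choice), and wilson_interval is the standard Wilson score formula.

```python
import math


def geometric_mean(counts):
    """Geometric mean for zero-inflated counts via the ln(1 + x) transform:
    exp(mean(ln(1 + x))) - 1. Assumed variant; plain geometric means are
    undefined when any count is zero."""
    logs = [math.log(1 + c) for c in counts]
    return math.exp(sum(logs) / len(logs)) - 1


def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion (Wilson 1927)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half
```

Unlike the normal approximation, the Wilson interval stays inside [0, 1] and behaves sensibly when the proportion cited is near zero, as it is for most altmetrics here.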
To investigate fields with relatively high proportions of non-zero altmetric scores (RQ2), narrow fields with the highest proportions were investigated for each altmetric with outliers. This investigation took the form of tracing the source of the mentions or citations to find patterns in their creation.
For the third research question, the proportions of articles with non-zero scores on each altmetric were correlated across fields. Pearson correlations were used instead of Spearman because the proportion data was not highly skewed.
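This field-level correlation step can be sketched as follows, with hypothetical per-field proportions in place of the real data:

```python
import math


def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of proportions."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Hypothetical proportions of articles per field with a non-zero score on
# two altmetrics (illustrative only, not the paper's data).
tweeted = [0.41, 0.30, 0.15, 0.06, 0.25]
in_news = [0.05, 0.04, 0.01, 0.00, 0.03]
r = pearson(tweeted, in_news)
```

Because the unit of analysis is the field (one proportion per field per altmetric), each correlation is computed over as many data points as there are fields.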
Although citation counts for journal articles are low in the arts and humanities, in all fields citations were more common than the most common altmetric, tweeters (Table 1). The rest of the results focus on the proportion of articles with a non-zero altmetric score. This is more informative than average scores because most articles have a score of 0 on all altmetrics.
| Field | Articles | Citation geometric mean | Tweeter geometric mean |
|---|---|---|---|
| Arts & Humanities (misc) | 6973 | 5.74 | 0.67 |
| Language & Linguistics | 5666 | 2.35 | 0.27 |
| Archeology (arts & humanities) | 2896 | 4.54 | 0.33 |
| History & Philosophy of Science | 2961 | 3.31 | 0.67 |
| Literature & Literary Theory | 4043 | 0.59 | 0.12 |
| Visual Arts & Performing Arts | 3135 | 0.92 | 0.21 |
| Social Sciences (misc) | 5458 | 4.69 | 0.73 |
| Geography, Plan & Development | 6698 | 4.53 | 0.38 |
| Health (social science) | 6744 | 3.99 | 0.74 |
| Human Factors & Ergonomics | 1784 | 6.29 | 0.53 |
| Library & Information Sciences | 5287 | 3.33 | 0.38 |
| Linguistics & Language | 5736 | 2.23 | 0.24 |
| Sociology & Political Science | 7132 | 3.65 | 0.72 |
| Life-span & Life Course Stud | 1332 | 6.40 | 0.57 |
| Political Science & International Relations | 5995 | 2.88 | 0.64 |
The main results are displayed for two indicators per graph, ignoring the three indicators (Questions, Reviews and highlights, Pinners) with fewer than one citation per thousand articles. The Twitter altmetric counts the number of different Twitter users that have tweeted a link to an article for which a DOI could be extracted, using data from the Twitter firehose. Although there are variations and exceptions, on average, 35% of articles in Social Sciences fields are tweeted, compared with 15% of articles in Arts and Humanities fields (Figure 1). The most tweeted field is unsurprisingly health-related (e.g. see the Health circles in Figure 1B of Haustein et al. 2014; see also: Costas, Zahedi, & Wouters 2015). The least tweeted, Classics, is not a natural fit with modern social media in the sense that it deals with communication in an age that pre-dated the web. No field has an anomalously high rate of tweeting, so no follow-up investigations were conducted for Twitter; there have also been many previous content analyses of Twitter. Overall, there is a substantial difference in the prevalence of tweeting between fields, varying from 6% to 41% of articles, a factor of seven.
The Facebook altmetric covers links from public Facebook wall posts, excluding all private content within the site. Google Plus content also originates only from public posts. Since Twitter is public by default and Facebook is private by default, it is not clear whether articles are more prevalent in Twitter than Facebook, despite the lower scores for the latter. The proportion of articles with non-zero scores for Facebook and Google Plus are low, with only three fields having over 10% of their articles posted about publicly on Facebook (Figure 2). There is a substantial difference between fields in the prevalence of Facebook posts, varying from 2% to 11% of articles, a factor of five. For Google Plus, the range is from 0% to 1.7%. The three most posted fields were investigated for insights into why articles were discussed on Facebook. The most important conclusions are highlighted in bold.
Ignoring the general category, the field with the highest proportion of articles with Google Plus citations was investigated for causes.
News counts record links from Altmetric.com’s list of science-related mainstream media online sources that can be tracked to DOIs. It may not cover social science, arts and humanities press content as thoroughly, given that such content may appear in general sections of newspapers and websites. Similarly, the blog citations are from Altmetric.com’s list of science-related blogs, which may tend to omit arts and humanities blogs as well as occasional coverage of scholarship in other blogs (e.g. literary blogs). The prevalence of News varies from 0% to 5% of articles, and for Blogs from 0.6% to 7%, a factor of 12. Ignoring the miscellaneous categories, the fields attracting the most mainstream media news attention (Figure 3) were investigated to find the causes or origins of the attention.
The fields attracting the most blogging were investigated for possible causes of the interest.
Wikipedia citations are derived directly from Wikipedia references and Reddit citations are extracted by the Reddit API. Both are dependent on DOIs in references or the cited article pages. The prevalence of Wikipedia references varies by a factor of 8 from 0.5% to 3.8% of articles, and for Reddit it varies from 0% to 0.7%. The field attracting the most Wikipedia citations (Figure 4) was investigated for coverage reasons.
Ignoring the miscellaneous categories, the field attracting the most Reddit posts (Figure 4) was investigated for coverage reasons.
For RQ3, there is a moderately strong tendency for fields that have higher proportions of articles cited by one altmetric source to also have higher proportions cited by the others (Table 2). The correlations are all statistically significant and vary between 0.375 (Twitter/Wikipedia) and 0.891 (Twitter/News). Ignoring Wikipedia, the correlations are all strong, with a minimum of 0.558 (Twitter/Reddit). Thus, with the partial exception of Wikipedia and individual outliers, fields tend to attract proportionate interest in all six social media sources (Twitter, Facebook, Google Plus, Blogs, News, Reddit).
The altmetric sources could be split conceptually into three groups: news related (News, Blogs, Reddit); social network sites (Twitter, Facebook, Gplus); and knowledge record (Wikipedia). The first two groups are not reflected in the correlations since the strongest correlation is between them (Twitter and News) rather than within either of the two groups. Thus, the results suggest that the level of social network interest and news discussion does not vary much between narrow fields: they tend to have proportionate amounts of both. This is surprising given that tweets are short, informational (Thelwall et al. 2013), and require the original article to be read, whereas news and blog posts are in their nature longer, involve discussion, and often translate research for a non-specialist audience (Shema et al. 2015).
A factor analysis of the data for Table 2 (n = 35 fields) with Promax non-orthogonal rotation (because there is no reason to believe that different dimensions of online attention/impact would be orthogonal) found only one significant factor (one eigenvalue of 5.5 and the rest under 1.0; the proportion data had low skewness and kurtosis and passed Bartlett and KMO tests, so was suitable for factor analysis). Thus, the altmetrics do not split into natural groups by field. In the factor analysis, Wikipedia did not form its own factor because of its strong correlation with Facebook.
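The single-factor decision rests on the standard Kaiser criterion (retain factors whose eigenvalue exceeds 1). A minimal sketch of that criterion, using a hypothetical correlation matrix rather than the paper's Table 2 data:

```python
import numpy as np

# Hypothetical 3x3 correlation matrix for three altmetric proportions
# (illustrative only, not the paper's data).
R = np.array([
    [1.0, 0.8, 0.7],
    [0.8, 1.0, 0.6],
    [0.7, 0.6, 1.0],
])

eigenvalues = np.linalg.eigvalsh(R)          # ascending order for symmetric R
n_factors = int((eigenvalues > 1.0).sum())   # Kaiser criterion: eigenvalue > 1
```

A full replication would go on to extract loadings and apply the Promax rotation; only the factor-count step is illustrated here.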
An important limitation of this study is that the Altmetric.com scores are likely to be substantial underestimates of public interest in journal articles in all fields. This is because Altmetric.com cannot scan private web pages, such as most of Facebook, and presumably does not have the resources to crawl the entire web for mentions. Moreover, it can only associate a citation with an article when it can be tracked to the cited article’s DOI, losing informal mentions and even formal citations that lack DOIs. Its data may also contain errors or miss citations for technical reasons (Ortega 2018), and automated tweets may also have been collected because they are difficult to detect (Haustein et al. 2016). Another limitation is the implicit assumption that all Scopus categories cover similarly scholarly journals, but this is unlikely to be true. For example, the Health (social science) category contains journals publishing articles aimed at a more general or professional audience (e.g. ‘Identifying urinary incontinence in the home setting: Part 2: Treatment and related care of incontinence’ in Home Healthcare Nurse). Language and international differences in the uptake of the social web sites covered by Altmetric.com are also important factors that affect the results. For example, articles with a national focus outside of the countries that extensively use the social web sites covered by Altmetric.com may be discussed elsewhere on the social web.
Since monographs are central to the humanities, and books and other outputs/activities can also be central to individual social sciences, arts and humanities fields, the results should not be used as evidence of a lack of public online discussion about the fields analysed. These discussions may instead centre on other outputs or activities of the scholars concerned.
The proportions tweeted in each field are higher than found in previous research for the humanities (Hammarfelt 2014; Haustein et al. 2014), presumably due to increases over time in the amount of academic tweeting. The results echo, from a different perspective, the greater amount of altmetric data for the social sciences than for the arts and humanities (Costas, Zahedi, & Wouters 2015; Haustein, Larivière, et al. 2014).
The analyses of fields with high scores on each individual altmetric identified six cases where a field had generated an above average level of non-academic interest: Archaeology and Gender Studies on Facebook; Sociology and Political Science, Life Span and Life Course Studies, and Demography in News sources; Gender Studies on Wikipedia. These give reassurance that social science and humanities subjects can attract online attention for their subject matter, but the evidence is fragile due to the low proportions involved. Given the apparent widespread interest in social science, arts and humanities topics, it seems strange that evidence of online discussions of articles in these areas is rare. It is likely that there is much more extensive discussion around relevant news stories, magazine articles and popular books for topics in these areas, but this does not seem to translate into interest in journal articles. To give a concrete example, the Life Span & Life Course Studies article, ‘The role of peer rejection in the link between reactive aggression and academic performance’ in Child & Youth Care Forum from 2013 had an Altmetric.com score of 0 and a web search for its title discovered no examples of the article being mentioned other than in academic reference lists (e.g. from 28 Google Scholar citations), publication announcements and CVs. Its finding, ‘high levels of reactive, not proactive, aggression were uniquely associated with low levels of academic performance, and peer rejection accounted for this association’ (Fite et al. 2013), seems to be of immediate practical significance within education despite the lack of public online non-academic mentions. Its full text is online (in ResearchGate) and the publisher website has a clear summary. It might be discussed in private areas of the social web and in subscription magazines for social workers and education professionals. 
Nevertheless, it is hard to understand why it has not generated any recordable online commentary. More generally, there seems to be a large gap between the provision of potentially useful research and its wider uptake, at least as recorded in the social web.
This article assessed the prevalence of altmetrics for journal articles in social science, arts and humanities fields, comparing the results at the level of whole fields rather than individual articles. Most articles in all fields had a score of zero for all altmetrics. This extends and updates prior research with Altmetric.com data (Table IV of Costas et al. 2015) by giving finer grained results. Excluding the announcement-based Twitter, at most 12% of articles in any field attracted a non-zero altmetric score. Thus, the Altmetric.com data suggests that, in all fields, most social science, arts and humanities papers are not discussed online. Altmetric.com data cannot be expected to be comprehensive and ignores, for example, mentions in private Facebook pages, so this conclusion is tentative. From a policy perspective, and particularly in the current era of increasing expenditure on open science, it is worrying that so much research is ignored outside academia despite being presumably high quality, expensive to produce and frequently useful. It is therefore important to assess the extent to which research that is not discussed online is influential offline, but this may be impossible to track systematically. Despite all the altmetrics investigated being zero for most articles in all fields, there are substantial differences between fields in the extent to which articles have non-zero scores on the seven altmetrics examined in more detail. Thus, altmetrics are much more useful for some fields than others. Overall, they are twice as prevalent in the social sciences as in the arts and humanities.
Positive correlations between the seven main altmetrics and Scopus citation counts suggest that, with the partial exception of Wikipedia (weak correlation of 0.230), more cited fields are also likely to attract attention from all altmetrics (i.e. a higher proportion of articles with Scopus citations indicates a higher proportion of articles with a positive altmetric score). Individual exceptions include the relatively high number of public Facebook posts for Cultural Studies and the relatively low attention for Transportation. Due to these exceptions, the relative prevalence of altmetrics must be assessed separately for individual fields. Similarly, and again with the partial exception of Wikipedia, there was a strong tendency for fields in which a higher proportion of articles attracted attention in one altmetric to also receive more attention from all the others. Together with the anomalies found in some of the investigations of outliers, this suggests that the extent to which social science, arts and humanities fields attract social web and news attention is not site-specific. Finally, since all altmetrics examined give a score of zero for most articles in all fields, they are not prevalent enough to be used to routinely compare individual articles for formal or informal evaluations. Nevertheless, they can still be used to compare groups of articles by assessing the proportion with a non-zero score in each group (Thelwall 2017b), and to identify individual high impact articles. This may be useful for self-evaluations and to support more formal evaluations (e.g. to compare research groups) when the risk of systematic, bot or accidental manipulation (Haustein et al. 2016; Wouters & Costas 2012) is low.
Thank you to Altmetric.com for making the data used in this study available free for research.
The author has no competing interests to declare.
Berk, R. A., Western, B., & Weiss, R. E. (1995). Statistical inference for apparent populations. Sociological Methodology, 421–458. DOI: https://doi.org/10.2307/271073
Bollen, K. A. (1995). Apparent and nonapparent significance tests. Sociological Methodology, 25, 459–468. DOI: https://doi.org/10.2307/271074
Costas, R., Zahedi, Z., & Wouters, P. (2015). The thematic orientation of publications mentioned on social media: Large-scale disciplinary comparison of social media metrics with citations. Aslib Journal of Information Management, 67(3), 260–288. DOI: https://doi.org/10.1108/AJIM-12-2014-0173
Fairclough, R., & Thelwall, M. (2015). More precise methods for national research citation impact comparisons. Journal of Informetrics, 9(4), 895–906. DOI: https://doi.org/10.1016/j.joi.2015.09.005
Falagas, M. E., Pitsouni, E. I., Malietzis, G. A., & Pappas, G. (2008). Comparison of PubMed, Scopus, web of science, and Google scholar: Strengths and weaknesses. The FASEB Journal, 22(2), 338–342. DOI: https://doi.org/10.1096/fj.07-9492LSF
Fite, P. J., Hendrickson, M., Rubens, S. L., Gabrielli, J., & Evans, S. (2013). The role of peer rejection in the link between reactive aggression and academic performance. Child & Youth Care Forum, 42(3), 193–205. DOI: https://doi.org/10.1007/s10566-013-9199-9
Fry, J., & Talja, S. (2007). The intellectual and social organization of academic fields and the shaping of digital resources. Journal of information Science, 33(2), 115–133. DOI: https://doi.org/10.1177/0165551506068153
Halevi, G., Moed, H., & Bar-Ilan, J. (2017). Suitability of Google Scholar as a source of scientific information and as a source of data for scientific evaluation—Review of the literature. Journal of Informetrics, 11(3), 823–834. DOI: https://doi.org/10.1016/j.joi.2017.06.005
Hammarfelt, B. (2014). Using altmetrics for assessing research impact in the humanities. Scientometrics, 101(2), 1419–1430. DOI: https://doi.org/10.1007/s11192-014-1261-3
Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787–804. DOI: https://doi.org/10.1007/s11192-015-1798-9
Harzing, A. W., & Alakangas, S. (2017). Microsoft Academic is one year old: The Phoenix is ready to leave the nest. Scientometrics, 112(3), 1887–1894. DOI: https://doi.org/10.1007/s11192-017-2454-3
Haustein, S., Bowman, T. D., Holmberg, K., Peters, I., & Larivière, V. (2014). Astrophysicists on Twitter: An in-depth analysis of tweeting and scientific publication behavior. Aslib Journal of Information Management, 66(3), 279–296. DOI: https://doi.org/10.1108/AJIM-09-2013-0081
Haustein, S., Bowman, T. D., Holmberg, K., Tsou, A., Sugimoto, C. R., & Larivière, V. (2016). Tweets as impact indicators: Examining the implications of automated ‘bot’ accounts on Twitter. Journal of the Association for Information Science and Technology, 67(1), 232–238. DOI: https://doi.org/10.1002/asi.23456
Haustein, S., Larivière, V., Thelwall, M., Amyot, D., & Peters, I. (2014). Tweets vs. Mendeley readers: How do these two social media metrics differ? IT-Information Technology, 56(5), 207–215. DOI: https://doi.org/10.1515/itit-2014-1048
Holmberg, K., & Thelwall, M. (2014). Disciplinary differences in Twitter scholarly communication. Scientometrics, 101(2), 1027–1042. DOI: https://doi.org/10.1007/s11192-014-1229-3
Kousha, K., & Thelwall, M. (2016). An automatic method for assessing the teaching impact of books from online academic syllabi. Journal of the Association for Information Science and Technology, 67(12), 2993–3007. DOI: https://doi.org/10.1002/asi.23542
Leahey, E. (2005). Alphas and asterisks: The development of statistical significance testing standards in sociology. Social Forces, 84(1), 1–24. DOI: https://doi.org/10.1353/sof.2005.0108
Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows. Journal of the Association for Information Science and Technology, 65(8), 1627–1638. DOI: https://doi.org/10.1002/asi.23071
Mohammadi, E., Thelwall, M., & Kousha, K. (2016). Can Mendeley bookmarks reflect readership? A survey of user motivations. Journal of the Association for Information Science and Technology, 67(5), 1198–1209. DOI: https://doi.org/10.1002/asi.23477
Mohammadi, E., Thelwall, M., Kwasny, M., & Holmes, K. (2018). Academic information on Twitter: A user survey. PLOS ONE, 13(5), e0197265. DOI: https://doi.org/10.1371/journal.pone.0197265
Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213–228. DOI: https://doi.org/10.1007/s11192-015-1765-5
Ortega, J. L. (2018). Reliability and accuracy of altmetric providers: A comparison among Altmetric, PlumX and Crossref Event Data. Scientometrics. DOI: https://doi.org/10.1007/s11192-018-2838-z
Priem, J., & Costello, K. L. (2010). How and why scholars cite on Twitter. Proceedings of the American Society for Information Science and Technology, 47(1), 1–4. DOI: https://doi.org/10.1002/meet.14504701201
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. http://altmetrics.org/manifesto/.
Ruiz-Castillo, J., & Waltman, L. (2015). Field-normalized citation impact indicators using algorithmically constructed classification systems of science. Journal of Informetrics, 9(1), 102–117. DOI: https://doi.org/10.1016/j.joi.2014.11.010
Shema, H., Bar-Ilan, J., & Thelwall, M. (2015). How is research blogged? A content analysis approach. Journal of the Association for Information Science and Technology, 66(6), 1136–1149. DOI: https://doi.org/10.1002/asi.23239
Thelwall, M. (2017a). Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals. Journal of Informetrics, 11(4), 1201–1212. DOI: https://doi.org/10.1016/j.joi.2017.10.006
Thelwall, M. (2017b). Three practical field normalised alternative indicator formulae for research evaluation. Journal of Informetrics, 11(1), 128–151. DOI: https://doi.org/10.1016/j.joi.2016.12.002
Thelwall, M. (2017c). Are Mendeley reader counts useful impact indicators in all fields? Scientometrics, 113(3), 1721–1731. DOI: https://doi.org/10.1007/s11192-017-2557-x
Thelwall, M. (2018). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430–435. DOI: https://doi.org/10.1016/j.joi.2018.03.006
Thelwall, M., Tsou, A., Weingart, S., Holmberg, K., & Haustein, S. (2013). Tweeting links to academic articles. Cybermetrics: International Journal of Scientometrics, Informetrics and Bibliometrics, 17, 1–8.
Torres-Salinas, D., & Moed, H. F. (2009). Library catalog analysis as a tool in studies of social sciences and humanities: An exploratory study of published book titles in economics. Journal of Informetrics, 3(1), 9–26. DOI: https://doi.org/10.1016/j.joi.2008.10.002
Torres-Salinas, D., Robinson-Garcia, N., & Gorraiz, J. (2017). Filling the citation gap: Measuring the multidimensional impact of the academic book at institutional level with PlumX. Scientometrics, 113(3), 1371–1384. DOI: https://doi.org/10.1007/s11192-017-2539-z
White, H. D., Boell, S. K., Yu, H., Davis, M., Wilson, C. S., & Cole, F. T. (2009). Libcitations: A measure for comparative assessment of book publications in the humanities and social sciences. Journal of the American Society for Information Science and Technology, 60(6), 1083–1096. DOI: https://doi.org/10.1002/asi.21045
White, H. D., & Zuccala, A. (2018). Libcitations, WorldCat, cultural impact, and fame. Journal of the Association for Information Science and Technology. DOI: https://doi.org/10.1002/asi.24064
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Tinkler, J., et al. (2015). The Metric Tide. London, UK: HEFCE. http://www.hefce.ac.uk/pubs/rereports/year/2015/metrictide/.
Wilson, E. B. (1927). Probable inference, the law of succession, and statistical inference. Journal of the American Statistical Association, 22(158), 209–212. DOI: https://doi.org/10.1080/01621459.1927.10502953
Zuccala, A. A., Verleysen, F. T., Cornacchia, R., & Engels, T. C. (2015). Altmetrics for the humanities: Comparing Goodreads reader ratings with citations to history books. Aslib Journal of Information Management, 67(3), 320–336. DOI: https://doi.org/10.1108/AJIM-11-2014-0152