Altmetric Prevalence in the Social Sciences, Arts and Humanities: Where are the Online Discussions?


Mike Thelwall

University of Wolverhampton, GB
Professor of data science and head of the Statistical Cybermetrics Research Group at the University of Wolverhampton. Member of the UK Forum for Responsible Research Metrics.


The social sciences, arts and humanities all address issues of general interest that may generate broad societal impacts and public discussion. Although prior research suggests that this potential is not captured by altmetrics, it is not known whether this is true for all fields. In response, this article compares 35 social sciences, arts and humanities fields for 10 scores (blogs, news, Twitter, Reddit, Facebook, Pinterest, Wikipedia, reviews, questions, Google Plus) for articles published in 2013. Excluding Twitter (maximum 41%), no field had more than 12% of its articles registering a non-zero score on any altmetric five years after publication. In some cases, fields with relatively high levels of attention were due to self-publicity or the activities of individuals rather than public discussion. There were substantial differences between fields, with Classics and Literature & Literary Theory being almost ignored and Archeology generating a relatively high level of attention on Facebook. Although journal articles are not central to many social sciences, arts and humanities fields, the apparently universally low levels of discussion about them online are surprising given their potential audience.

How to Cite: Thelwall, M., 2018. Altmetric Prevalence in the Social Sciences, Arts and Humanities: Where are the Online Discussions?. Journal of Altmetrics, 1(1), p.4. DOI:
Submitted on 27 Aug 2018; Accepted on 01 Oct 2018; Published on 28 Nov 2018


Researchers in some countries need to demonstrate the societal impact of their research (Wilsdon et al. 2015), despite this being impossible to measure directly. Social media mention counts (also known as “altmetrics”) have been proposed as a solution to this problem (Priem et al. 2010), on the basis that evidence of research being mentioned may serve as evidence that it has been noticed, or as a proxy for evidence that it has been found to be useful. The original definition of altmetrics has been expanded to encompass similar indicators derived from the wider web, such as online news, syllabus or policy document mentions. Evidence of societal impact is especially needed in the arts and humanities because these often claim to address the cultural needs of wide sections of the public (McCarthy et al. 2001; Nussbaum 2010). In addition, humanities research is often characterised by the need to explain the purpose of the research in detail, requiring greater engagement by the author with potential readers (Fry & Talja 2007; Whitley 2000). The social sciences underpin many numerically large professions, generate public interest for their insights into the human condition, and may expect to have a large non-academic audience (Lynd 2015). It therefore seems reasonable to expect that some specialisms in these areas might generate substantial interest online. This article investigates the social sciences, arts and humanities together as relatively neglected areas, even though they encompass widely varying research topics and practices (e.g. from Visual Arts & Performing Arts to Public Administration).

Although there is no detailed analysis of altmetrics for the social sciences, arts and humanities, one previous study has analysed half a million Web of Science (WoS) articles and reviews with Digital Object Identifiers (DOIs) published between July and December 2011. It used altmetric data collected via the Altmetric.com API on or before November 2014 (Costas, Zahedi, & Wouters 2015). It mapped the prevalence of various types of altmetrics in 250 WoS subject categories, and reported mean and median values by broad area for the social reference sharing site Mendeley. It analysed the proportion of fields within three broad areas (Science, Social Science, Arts & Humanities) that had average altmetrics above or below the full data set median (Table IV of Costas, Zahedi, & Wouters 2015), finding almost all Arts and Humanities subjects (81%–96%) to have below-average altmetric scores for Mendeley, Twitter, Facebook, Google Plus, blogs and news sources, whereas most Social Sciences subjects (63%–88%) had above-average scores. It also found Twitter to be most prevalent for medicine, psychology and the social sciences. This study used relatively old data, however; it also gives relatively coarse-grained information (other than the mappings) and does not analyse these subjects in detail. A comparison of Mendeley reader counts in the social sciences and humanities found reader counts to be twice as high in the social sciences (Mohammadi & Thelwall 2014). Books are important in some social sciences, arts and humanities fields, but these seem to have lower altmetric scores than articles (Torres-Salinas, Robinson-Garcia, & Gorraiz 2017). A study of articles with PubMed IDs in multiple fields found that the proportion of articles that had been tweeted was relatively high in professional fields (17%) and psychology (15%) compared to other social sciences (9%) and humanities (7%) (Haustein et al. 2014), but more recent studies have reported much higher proportions tweeted.
An analysis of Swedish humanities journal articles from 2012 found that 21% had been tweeted, compared to 3% mentioned in Facebook and 2% blogged (Hammarfelt 2014). Alternative indicators have also been developed for humanities books, such as library holding counts (Torres-Salinas & Moed 2009; White et al. 2009; White & Zuccala 2018), Goodreads reviews (Zuccala et al. 2015) and syllabus mentions (Kousha & Thelwall 2016).

Some content analyses of altmetrics have given insights into why articles can be cited in the social web. Most tweets citing academic papers just tweet a title or short summary and few tweet an opinion (Thelwall et al. 2013). Tweets from a set of researchers were often about research, although with disciplinary differences in the extent of tweeting (Holmberg & Thelwall 2014; see also: Haustein et al. 2014; Priem & Costello 2010). An investigation into citations in health-related science blogs found that these were a valuable means of translating academic research for the public, even sometimes including practical health advice (Shema, Bar-Ilan, & Thelwall 2015). A survey of people that had tweeted a link to academic research found that a substantial minority were non-academics, providing useful evidence that tweets may reflect public interest in scholarship to a moderate extent (Mohammadi et al. 2018).

Given the lack of current information about the prevalence of altmetrics across fields in the social sciences, arts and humanities, the current paper investigates this using data provided by the main current altmetric gatherer, Altmetric.com. Mendeley reader counts are excluded because they have been extensively researched and shown to be a citation-like indicator (Mohammadi et al. 2016; Thelwall 2017c). The following research questions drive the investigation, which focuses on the ten indicators shared via their Application Programming Interface (API) (blogs, news, tweets, Reddit, Facebook, Pinterest, Wikipedia, reviews, questions, Google Plus):

  • RQ1: Are there substantial differences between Social Sciences, Arts and Humanities narrow fields in the prevalence of altmetrics?
  • RQ2: Do any Social Sciences, Arts and Humanities narrow fields attract high proportions of non-zero scores for any altmetric?
  • RQ3: Do individual Social Sciences, Arts and Humanities narrow fields that attract high proportions of non-zero scores for one altmetric also tend to attract high proportions of non-zero scores for other altmetrics?


The articles were taken from Scopus rather than WoS because of its greater coverage of non-English sources (Falagas et al. 2008; Mongeon, & Paul-Hus 2016), which is important for the Arts and Humanities because these often publish in local languages. Thus, a Web of Science dataset would be likely to overrepresent articles from English-speaking nations more than Scopus. Google Scholar has greater coverage (Halevi, Moed, & Bar-Ilan 2017; Harzing & Alakangas 2016) but lacks a classification scheme or automated access for fields. Both Microsoft Academic (Harzing & Alakangas 2017; Thelwall 2017a) and Dimensions (Thelwall 2018) also probably have greater coverage but both lack reliable fine-grained classification schemes.

All documents of type journal article that were published in 2013 and classified in a Scopus Social Sciences (22 fields) or Arts and Humanities (13 fields) narrow field were collected 15–17 July 2018 from the Scopus API. The year 2013 was chosen to give enough time (5 years) for articles to accrue mature citation counts in addition to altmetrics. Arts and humanities citations can be slow to accrue and so a relatively long citation window is appropriate here. Altmetrics for each article in this set with a DOI were collected 5–7 August 2018 from the free Altmetric.com API using a DOI query. Articles were analysed according to the Scopus classification of the journal in which they were published. This is not an ideal approach because articles within multidisciplinary journals can be allocated to inappropriate categories. Alternative classification methods, such as those based on keywords, article/title text, citations and references (Ruiz-Castillo & Waltman 2015) might largely solve this problem but would give less transparent and less practically useful conclusions for the current article, since there is no agreed procedure for algorithmic article-level classification.

The geometric mean and proportion cited were calculated for citation counts and each of the ten indicators (Blogs, News, Tweets, Reddit, Facebook, Pinterest, Wikipedia, Reviews, Questions, G+) separately for each field. Geometric means are more appropriate than arithmetic means for skewed data (Fairclough & Thelwall 2015) and more fine-grained than medians. This is important when there are many zeros, as for all the indicators in all fields (all fields had a median of zero). Wilson score intervals (Wilson 1927) were used for the proportion cited calculations because these are reasonable estimates for 95% confidence intervals (Agresti & Coull 1998). Although the data is not a sample from a population in the conventional statistical sense, it can be thought of as a sample of the papers that could have been written under similar circumstances (i.e. the apparent population: Berk, Western, & Weiss 1995; Bollen 1995; Leahey 2005), and the confidence intervals cover the underlying likelihood of papers from each field attracting altmetrics.
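As a minimal sketch, the two statistics above can be computed as follows. The ln(x + 1) offset used here for the geometric mean is an assumption (a standard way to accommodate zero counts, not spelled out in the text); the Wilson interval follows the usual formula.

```python
import math

def geometric_mean(counts):
    # Offset geometric mean for count data containing zeros:
    # exp(mean(ln(x + 1))) - 1 (assumed offset convention).
    return math.exp(sum(math.log(x + 1) for x in counts) / len(counts)) - 1

def wilson_interval(successes, n, z=1.96):
    # 95% Wilson score interval (Wilson 1927) for a proportion successes/n.
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(0.0, centre - half), min(1.0, centre + half)
```

For example, a field in which 750 of 7132 articles had a non-zero score would be summarised by `wilson_interval(750, 7132)`, giving a 95% interval around the 10.5% proportion.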

To investigate fields with relatively high proportions of non-zero altmetric scores (RQ2), narrow fields with the highest proportions were investigated for each altmetric with outliers. This investigation took the form of tracing the source of the mentions or citations to find patterns in their creation.

For the third research question, the proportions of articles with non-zero scores on each altmetric for each field were correlated. Pearson correlations were used instead of Spearman because the proportion data was not highly skewed.
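The field-level correlation step can be sketched directly from the definition of Pearson's r (an illustrative helper over two equal-length vectors of per-field proportions; the data values are not reproduced here):

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation between two equal-length sequences,
    # e.g. the 35 per-field proportions for two altmetric sources.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```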


Although citation counts for journal articles are low in the arts and humanities, they were more common in all fields than the commonest altmetric, tweeters (Table 1). The rest of the results focus on the proportion of articles with a non-zero altmetric score. This is more informative than average scores because most articles have a score of 0 on all altmetrics.

Table 1

Sample sizes and geometric mean Scopus citations and tweeters for each narrow field analysed. The 13 fields at the top of the table are Arts & Humanities, and the table order mimics the Scopus categorisation scheme order.

Field Articles Scopus citations Tweeters

Arts & Humanities (misc) 6973 5.74 0.67
History 5761 1.33 0.28
Language & Linguistics 5666 2.35 0.27
Archeology (arts & humanities) 2896 4.54 0.33
Classics 538 0.75 0.05
Conservation 569 2.31 0.16
History & Philosophy of Science 2961 3.31 0.67
Literature & Literary Theory 4043 0.59 0.12
Museology 233 2.18 0.28
Music 1363 1.35 0.19
Philosophy 5349 1.62 0.21
Religious Studies 4289 0.91 0.17
Visual Arts & Performing Arts 3135 0.92 0.21
Social Sciences (misc) 5458 4.69 0.73
Archeology 2626 4.48 0.36
Development 5281 4.65 0.43
Education 6434 3.23 0.44
Geography, Planning & Development 6698 4.53 0.38
Health (social science) 6744 3.99 0.74
Human Factors & Ergonomics 1784 6.29 0.53
Law 5155 2.86 0.47
Library & Information Sciences 5287 3.33 0.38
Linguistics & Language 5736 2.23 0.24
Safety Research 1643 4.06 0.45
Sociology & Political Science 7132 3.65 0.72
Transportation 2675 8.69 0.17
Anthropology 4910 3.52 0.65
Communication 5256 3.27 0.65
Cultural Studies 5791 1.52 0.39
Demography 1770 4.84 0.47
Gender Studies 2557 3.11 0.62
Life-span & Life Course Studies 1332 6.40 0.57
Political Science & International Relations 5995 2.88 0.64
Public Administration 2894 4.20 0.62
Urban Studies 2570 5.06 0.41

The main results are displayed for two indicators per graph, ignoring the three indicators (Questions, Reviews and highlights, Pinners) with fewer than one mention per thousand articles. The Twitter altmetric counts the number of different Twitter users that have tweeted a link to an article from which a DOI could be extracted, using data from the Twitter firehose. Although there are variations and exceptions, on average, 35% of articles in Social Sciences fields are tweeted, compared to 15% of articles in Arts and Humanities fields (Figure 1). The most tweeted field is unsurprisingly health-related (e.g. see the Health circles in Figure 1B of Haustein et al. 2014; see also: Costas, Zahedi, & Wouters 2015). The least tweeted, Classics, is not a natural fit with modern social media in the sense that it deals with communication in an age that pre-dated the web. No field has an anomalously high rate of tweeting. No investigations were conducted for Twitter since there have been many previous content analyses of Twitter and no field has unusually many tweeted articles. Overall, there is a substantial difference in the prevalence of tweeting between fields, varying from 6% to 41% of articles, a factor of seven.

Figure 1 

The proportion of Scopus articles from 2013 with DOIs that have at least one Tweet or Scopus citation for each Social Science, Arts & Humanities narrow field. Fields are ordered by average citation count. Error bars show 95% confidence intervals from Wilson score intervals.

The Facebook altmetric covers links from public Facebook wall posts, excluding all private content within the site. Google Plus content also originates only from public posts. Since Twitter is public by default and Facebook is private by default, it is not clear whether articles are more prevalent on Twitter than on Facebook, despite the lower scores for the latter. The proportions of articles with non-zero scores for Facebook and Google Plus are low, with only three fields having over 10% of their articles posted about publicly on Facebook (Figure 2). There is a substantial difference between fields in the prevalence of Facebook posts, varying from 2% to 11% of articles, a factor of five. For Google Plus, the range is from 0% to 1.7%. The three most posted fields were investigated for insights into why articles were discussed on Facebook.

  • For Sociology and Political Science, 750 out of 7132 articles were posted on Facebook. The journal Critical Sociology in this category had its own Facebook page and posted about 41 out of 43 of its articles. The same was true for East European Politics and Societies (posting 31 out of 34 of its articles). Removing these two journals would reduce the percentage of posted articles from 10.5% to 9.6%.
  • For Archeology, 274 out of 2626 articles were posted on Facebook. No journal had close to comprehensive Facebook coverage. The most influential was Journal of Archaeological Science, with 98 of its 416 articles on Facebook (23.6%), which is more than double the field average (10.4%). This journal seemed to be popular on Facebook for its articles having a general interest value for the public. For example, ‘Cacao consumption during the 8th century at Alkali Ridge, Southeastern Utah’ was posted by an educational research centre Facebook page because it had been mentioned in Science magazine as the earliest evidence of chocolate in North America. The less promising, ‘Statistical means for identifying hunter-gatherer residential features in a lithic landscape’ was posted by a religious organisation’s Facebook page as part of a hunter-gatherer event. Similarly, ‘A new approach to tracking connections between the Indus Valley and Mesopotamia: Initial results of strontium isotope analyses from Harappa and Ur’ was posted by a man in India, presumably for its local relevance. Thus, some Archeology seems to generate substantial enough non-academic interest for it to be broadcast independently on Facebook.
  • For Gender Studies, 256 out of 2557 articles were posted on Facebook. No journals had close to complete coverage. The most influential journal was Psychology of Women Quarterly, with 21 out of 35 articles posted to Facebook (60%), although mostly from the journal Facebook page. As for Archeology, this journal’s articles seemed to have general interest on Facebook. For example, ‘Gender and emotion: What we think we know, what we need to know, and why it matters’ was discussed on Facebook.
Figure 2 

The proportion of Scopus articles from 2013 with DOIs that have at least one G+ or Facebook wall post for each Social Science, Arts & Humanities narrow field. Fields are ordered by average citation count. Error bars show 95% confidence intervals based on Wilson score intervals.

Ignoring the general category, the field with the highest proportion of articles with Google Plus citations was investigated for causes.

  • For History and Philosophy of Science, 46 out of 2961 articles were cited on public Google Plus pages. No journal systematically posted its articles in this site. Posts were typically simply announcements that an article had been published, accompanied by a title and abstract or other information. The articles posted mainly related to biomedical science (‘Parents’ and professionals’ perceptions of family-centered care for children with autism spectrum disorder across service sectors’ from the journal Social Science and Medicine and eight articles from Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences). Thus, the relatively high coverage for the Arts & Humanities narrow field History and Philosophy of Science seems to be due to its partial coverage of biomedical and other out of field research.

News counts record links from Altmetric.com’s list of science-related mainstream media online sources that can be tracked to DOIs. It may not cover social science, arts and humanities press content as well, given that this may appear in general sections of newspapers and websites. Similarly, the blog citations are from Altmetric.com’s list of science-related blogs, which may tend to omit arts and humanities blogs as well as occasional coverage of scholarship in other blogs (e.g. literary blogs). The prevalence of News varies from 0% to 5% of articles, and for Blogs from 0.6% to 7%, a factor of about 12. Ignoring the miscellaneous categories, the fields attracting the most mainstream media news attention (Figure 3) were investigated to find the causes or origins of the attention.

  • For Sociology and Political Science, 329 out of 7132 articles were referenced by news sources (4.6%). Some politics journals attracted media attention. Journal of Politics had news stories written about 15 out of 50 articles (30%). In this journal, ‘Counterframing effects’ was written up in Journalist’s Resource, an online intermediary news site that aims to ‘curate scholarship relevant to media practitioners, bloggers, educators, students and general readers’. The article ‘Legislative organization and the second face of power: Evidence from U.S. state legislatures’ was discussed by an explanatory journalism news site. The article ‘The Return of Old-Fashioned Racism to White Americans’ Partisan Preferences in the Early Obama Era’ was discussed by the Washington Post. Thus, the practical political implications of Journal of Politics articles have attracted the attention of analytic mainstream media sources.
  • For Life Span and Life Course Studies, 58 out of 1332 articles were referenced by news sources (4.4%). Child Development Perspectives had nine out of 45 articles written up (20%). ‘The Biological Residue of Childhood Poverty’ was picked up by a news site for its story, ‘Why are Americans no longer the tallest people in the world? One theory: inequality’. ‘Infants and toddlers in foster care’ was discussed by the liberal news site HuffPost. ‘Pollution and infant health’ was discussed in a Hungarian news site. The health and social implications of Child Development Perspectives articles have again attracted the attention of analytic mainstream media sources.
  • For Demography, 76 out of 1770 articles were referenced by news sources (4.3%). The biggest contributor, Population and Development Review had 11 out of 53 articles discussed (21%). The most discussed article, ‘The Characteristics Approach to the Measurement of Population Aging’, was picked up by 11 news sources (including HuffPost) for mentioning an alternative way to measure age that readers might apply to themselves. ‘Surplus Chinese men: Demographic determinants of the sex ratio at marriageable ages in China’ was discussed by Scientific American. ‘Demographic Metabolism: A Predictive Theory of Socioeconomic Change’ was cited for its theory about population demographic changes in an Eco-Business article focusing on a different article about climate change. Thus, Demography seems to contribute information of general and topical interest for the mass media.
Figure 3 

The proportion of Scopus articles from 2013 with DOIs that have at least one news or blog citation for each Social Science, Arts & Humanities narrow field. Fields are ordered by average citation count. Error bars show 95% confidence intervals based on Wilson score intervals.

The fields attracting the most blogging were investigated for possible causes of the interest.

  • For History and Philosophy of Science, 203 out of 2961 articles were blogged (6.9%). History of the Human Sciences had eight out of 21 articles blogged (38%), none of which also attracted news stories. ‘Historians in the Archive: Changing Historiographical Practices in the Nineteenth Century’ and six other articles from this journal were blogged about by Campo Historiográfico, the blog of a history professor. In contrast, ‘Oikonomia in the age of empires’ was referenced in a 4,000-word academic essay in the Disorder Of Things group blog for ‘the critical inquiry of global politics’. Public Opinion Quarterly had seven out of 25 articles blogged (28%). ‘Coding voter turnout responses in the current population survey’ had been blogged by the Washington Post. ‘Extreme voices: Interest groups and the misrepresentation of issue publics’ had been blogged by Journalist’s Resource. ‘The polls’ trends: Americans’ changing views on crime and punishment’ and ‘Does Biology Justify Ideology? The Politics of Genetic Attribution’ were discussed by the same academic blog. ‘Commemoration matters’ was discussed by a different academic blog. Social Studies of Science had 11 out of 41 articles blogged (27%). ‘A history of deep brain stimulation’ was discussed in an academic blog. ‘Designing a market-like entity: Economics in the politics of market formation’ was announced without comment in a commercial blog. ‘The management of non-evidence in guideline development’ was discussed in a BMJ (medical journal) blog. Another article was apparently randomly selected as an example of an academic article in a Scientific American blog about peer review and another was an author self-citation. Thus, this category attracted academic blogging as well as individual prolific bloggers.
  • For Demography, 114 out of 1770 articles were blogged (6.4%). All 26 International Journal of Refugee Studies articles were announced without discussion by the Refugee Archives academic blog. Thus, systematic content-free blogging from a single source is the cause here.

Wikipedia citations are derived directly from Wikipedia references and Reddit citations are extracted by the Reddit API. Both are dependent on DOIs in references or the cited article pages. The prevalence of Wikipedia references varies by a factor of 8 from 0.5% to 3.8% of articles, and for Reddit it varies from 0% to 0.7%. The field attracting the most Wikipedia citations (Figure 4) was investigated for coverage reasons.

  • For Gender Studies, 96 out of 2557 articles were cited by Wikipedia (3.8%). Gender and Language had 11 out of 14 articles cited. ‘Domestic violence and public participation in the media: The case of citizen journalism’ was cited in a page about citizen journalism. ‘Normal straight gays: Lexical collocations and ideologies of masculinity in personal ads of Serbian gay teenagers’ was cited in a page about LGBT rights in Serbia. ‘The construction of ‘tough’ masculinity: Negotiation, alignment and rejection’ was cited in the Masculinity page. All were added by the same Wikipedia editor, who has a feminism focus. Thus, an individual Wikipedian has contributed to effective representation of Gender and Language research in Wikipedia. Sex Roles had 11 out of 102 articles in Wikipedia (11%), referenced by different Wikipedians, and so seems to provide generally well-known research that is useful for topics covered by Wikipedia.
Figure 4 

The proportion of Scopus articles from 2013 with DOIs that have at least one Wikipedia or Reddit citation for each Social Science, Arts & Humanities narrow field. Fields are ordered by average citation count. Error bars show 95% confidence intervals based on Wilson score intervals.

Ignoring the miscellaneous categories, the field attracting the most Reddit posts (Figure 4) was investigated for coverage reasons.

  • For History and Philosophy of Science, 18 out of 2961 articles were posted about on Reddit (0.6%). Six were from Pragmatics and Cognition, out of 24 (25%), none of which had been discussed in any social media site. They had all been announced without discussion by a single user to the r/cognitivelinguistics subreddit.

Relationships between altmetrics

For RQ3, there is a moderately strong tendency for fields that have higher proportions of articles cited by one altmetric source to also have higher proportions cited by the others (Table 2). The correlations are all statistically significant and vary between 0.375 (Twitter/Wikipedia) and 0.891 (Twitter/News). Ignoring Wikipedia, the correlations are all strong, with a minimum of 0.558 (Twitter/Reddit). Thus, with the partial exception of Wikipedia and individual outliers, fields tend to attract proportionate interest in all six social media sources (Twitter, Facebook, Gplus, Blogs, News, Reddit).

Table 2

Pearson correlations between the proportions cited for the seven main altmetrics and Scopus (n = 35 fields). All correlations are statistically significant with p < 0.05 except for the correlations between Scopus and Reddit/Wikipedia.

Correlation  Twitter  Facebook  Gplus  Blogs  News   Reddit  Wikipedia

Scopus       0.689    0.503     0.538  0.595  0.708  0.332   0.230
Twitter      1        0.762     0.649  0.727  0.891  0.558   0.375
Facebook     –        1         0.647  0.813  0.796  0.726   0.724
Gplus        –        –         1      0.732  0.659  0.748   0.418
Blogs        –        –         –      1      0.790  0.680   0.591
News         –        –         –      –      1      0.674   0.479
Reddit       –        –         –      –      –      1       0.550

The altmetric sources could be split conceptually into three groups: news related (News, Blogs, Reddit); social network sites (Twitter, Facebook, Gplus); and knowledge record (Wikipedia). The first two groups are not reflected in the correlations since the strongest correlation is between them (Twitter and News) rather than within either of the two groups. Thus, the results suggest that the level of social network interest and news discussion does not vary much between narrow fields: they tend to have proportionate amounts of both. This is surprising given that tweets are short, informational (Thelwall et al. 2013), and require the original article to be read, whereas news and blog posts are in their nature longer, involve discussion, and often translate research for a non-specialist audience (Shema et al. 2015).

A factor analysis of the data for Table 2 (n = 35 fields) with Promax non-orthogonal rotation (because there is no reason to believe that different dimensions of online attention/impact would be orthogonal) found only one significant factor (one eigenvalue of 5.5 and the rest under 1.0; the proportion data had low skewness and kurtosis and passed Bartlett and KMO tests, so was suitable for factor analysis). Thus, the altmetrics do not split into natural groups by field. In the factor analysis, Wikipedia did not form its own factor because of its strong correlation with Facebook.
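The single-factor finding can be illustrated via the dominant eigenvalue of the altmetric correlation matrix. The sketch below applies power iteration to the symmetrised seven-variable matrix from Table 2; it covers only the eigenvalue step, not the Promax-rotated factor analysis of the full field-level data, so the value it recovers will not exactly match the reported 5.5.

```python
import math

def dominant_eigenvalue(matrix, iterations=200):
    # Power iteration: repeatedly apply the matrix to a vector and
    # renormalise; for a symmetric matrix with a dominant positive
    # eigenvalue (such as a correlation matrix of positively
    # correlated variables) the Rayleigh quotient converges to it.
    n = len(matrix)
    v = [1.0] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    av = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(v[i] * av[i] for i in range(n))  # Rayleigh quotient v^T A v

# Symmetrised correlation matrix of the seven altmetrics from Table 2
# (order: Twitter, Facebook, Gplus, Blogs, News, Reddit, Wikipedia).
r = [
    [1.000, 0.762, 0.649, 0.727, 0.891, 0.558, 0.375],
    [0.762, 1.000, 0.647, 0.813, 0.796, 0.726, 0.724],
    [0.649, 0.647, 1.000, 0.732, 0.659, 0.748, 0.418],
    [0.727, 0.813, 0.732, 1.000, 0.790, 0.680, 0.591],
    [0.891, 0.796, 0.659, 0.790, 1.000, 0.674, 0.479],
    [0.558, 0.726, 0.748, 0.680, 0.674, 1.000, 0.550],
    [0.375, 0.724, 0.418, 0.591, 0.479, 0.550, 1.000],
]
```

One eigenvalue of around 5 out of a total of 7 (the trace) confirms that a single dimension captures most of the shared variation among the altmetrics.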


An important limitation of this study is that the scores are likely to be substantial underestimates of public interest in journal articles in all fields. This is because Altmetric.com cannot scan private web pages, such as most of Facebook, and presumably does not have the resources to crawl the entire web for mentions. Moreover, it can only associate a citation with an article when it can be tracked to the cited article’s DOI, losing informal mentions and even formal citations that lack DOIs. Its data may also contain errors or miss citations for technical reasons (Ortega 2018), and automated tweets may also have been collected because they are difficult to detect (Haustein et al. 2016). Another limitation is the implicit assumption that all Scopus categories cover similarly scholarly journals, but this is unlikely to be true. For example, the Health (social science) category contains journals publishing articles aimed at a more general or professional audience (e.g. ‘Identifying urinary incontinence in the home setting: Part 2: Treatment and related care of incontinence’ in Home Healthcare Nurse). Language and international differences in the uptake of the social web sites covered by Altmetric.com are also important factors that affect the results. For example, articles with a national focus outside of the countries that extensively use these sites may be discussed elsewhere on the social web.

Since monographs are central to the humanities, and books and other outputs/activities can also be central to individual social sciences, arts and humanities fields, the results should not be used as evidence of a lack of public online discussion about the fields analysed. These discussions may instead centre on other outputs or activities of the scholars concerned.

The proportions tweeted in each field are higher than found in previous research for the humanities (Hammarfelt 2014; Haustein et al. 2014), presumably due to increases over time in the amount of academic tweeting. The results echo, from a different perspective, the greater amount of altmetric data for the social sciences than for the arts and humanities (Costas, Zahedi, & Wouters 2015; Haustein, Larivière et al. 2014).

The analyses of fields with high scores on each individual altmetric identified six cases where a field had generated an above average level of non-academic interest: Archeology and Gender Studies on Facebook; Sociology and Political Science, Life Span and Life Course Studies, and Demography in News sources; Gender Studies on Wikipedia. These give reassurance that social science and humanities subjects can attract online attention for their subject matter, but the evidence is fragile due to the low proportions involved. Given the apparent widespread interest in social science, arts and humanities topics, it seems strange that evidence of online discussions of articles in these areas is rare. It is likely that there is much more extensive discussion around relevant news stories, magazine articles and popular books for topics in these areas, but this does not seem to translate into interest in journal articles. To give a concrete example, the Life Span & Life Course Studies article, ‘The role of peer rejection in the link between reactive aggression and academic performance’ in Child & Youth Care Forum from 2013 had an Altmetric.com score of 0, and a web search for its title discovered no examples of the article being mentioned other than in academic reference lists (e.g. from 28 Google Scholar citations), publication announcements and CVs. Its finding, ‘high levels of reactive, not proactive, aggression were uniquely associated with low levels of academic performance, and peer rejection accounted for this association’ (Fite et al. 2013), seems to be of immediate practical significance within education despite the lack of public online non-academic mentions. Its full text is online (in ResearchGate) and the publisher website has a clear summary. It might be discussed in private areas of the social web and in subscription magazines for social workers and education professionals. Nevertheless, it is hard to understand why it has not generated any recordable online commentary.
More generally, there seems to be a large gap between the provision of potentially useful research and its wider uptake, at least as recorded in the social web.


Conclusions

This article assessed the prevalence of altmetrics for journal articles in social science, arts and humanities fields, comparing the results at the level of whole fields rather than individual articles. Most articles in all fields had a score of zero for all altmetrics. This extends and updates prior research with Altmetric.com data (Table IV of Costas et al. 2015) by giving finer-grained results. Excluding the announcement-based Twitter, at most 12% of articles in any field attracted a non-zero altmetric score. Thus, the data suggests that in every field, most social science, arts and humanities papers are not discussed online. Altmetric.com data cannot be expected to be comprehensive, ignoring, for example, mentions on private Facebook pages, so this conclusion is tentative. From a policy perspective, and particularly in the current era of increasing expenditure on open science, it is worrying that so much research is ignored outside academia despite being presumably high quality, expensive to produce and frequently useful. It is therefore important to assess the extent to which research that is not discussed online is influential offline, although this may be impossible to track systematically. Despite all the altmetrics investigated being zero for most articles in all fields, there are substantial differences between fields in the extent to which articles have non-zero scores on the seven altmetrics examined in more detail. Thus, altmetrics are much more useful for some fields than others. Overall, they are twice as prevalent in the social sciences as in the arts and humanities.

Positive correlations between the seven main altmetrics and Scopus citation counts suggest that, with the partial exception of Wikipedia (weak correlation of 0.230), more cited fields are also likely to attract more attention on all altmetrics (i.e. a higher proportion of articles with Scopus citations indicates a higher proportion of articles with a positive altmetric score). Individual exceptions include the relatively high number of public Facebook posts for Cultural Studies and the relatively low attention for Transportation. Because of these exceptions, the relative prevalence of altmetrics must be assessed separately for individual fields. Similarly, and again with the partial exception of Wikipedia, there was a strong tendency for fields in which a higher proportion of articles attracted attention on one altmetric to also receive more attention on all the others. Together with the anomalies found in some of the investigations of outliers, this suggests that the extent to which social science, arts and humanities fields attract social web and news attention is not site-specific. Finally, since all the altmetrics examined give a score of zero for most articles in all fields, they are not prevalent enough to be used to routinely compare individual articles for formal or informal evaluations. Nevertheless, they can still be used to compare groups of articles by assessing the proportion with a non-zero score in each group (Thelwall 2017b), and to identify individual high-impact articles. This may be useful for self-evaluations and to support more formal evaluations (e.g. to compare research groups) when the risk of systematic, bot or accidental manipulation (Haustein et al. 2016; Wouters & Costas 2012) is low.
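The group comparisons described above rest on a simple statistic: the proportion of a field's articles with a non-zero altmetric score. A minimal sketch of how such a proportion and its uncertainty might be computed is below; the function names and the illustrative numbers are invented, and the use of a Wilson score interval is an assumption suggested by the article's citations of Wilson (1927) and Agresti & Coull (1998), not a confirmed detail of the author's method.

```python
import math

def proportion_nonzero(scores):
    """Count and fraction of articles in a field with a non-zero altmetric score."""
    k = sum(1 for s in scores if s > 0)
    n = len(scores)
    return k, n, k / n

def wilson_interval(k, n, z=1.959964):
    """95% Wilson score interval for a binomial proportion (Wilson 1927)."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(0.0, centre - half), min(1.0, centre + half)

# Hypothetical field: 120 of 1000 articles (12%, the maximum non-Twitter
# prevalence reported) have a non-zero score on some altmetric.
k, n, p = proportion_nonzero([1] * 120 + [0] * 880)
low, high = wilson_interval(k, n)
```

Two fields can then be compared by checking whether their intervals overlap, which is more informative than comparing raw means of such heavily zero-inflated counts.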


Acknowledgements

Thank you to Altmetric.com for making the data used in this study available free for research.

Competing Interests

The author has no competing interests to declare.


References

  1. Agresti, A., & Coull, B. A. (1998). Approximate is better than ‘exact’ for interval estimation of binomial proportions. The American Statistician, 52(2), 119–126. 

  2. Berk, R. A., Western, B., & Weiss, R. E. (1995). Statistical inference for apparent populations. Sociological Methodology, 421–458. DOI: 

  3. Bollen, K. A. (1995). Apparent and nonapparent significance tests. Sociological Methodology, 25, 459–468. DOI: 

  4. Costas, R., Zahedi, Z., & Wouters, P. (2015). The thematic orientation of publications mentioned on social media: Large-scale disciplinary comparison of social media metrics with citations. Aslib Journal of Information Management, 67(3), 260–288. DOI: 

  5. Fairclough, R., & Thelwall, M. (2015). More precise methods for national research citation impact comparisons. Journal of Informetrics, 9(4), 895–906. DOI: 

  6. Falagas, M. E., Pitsouni, E. I., Malietzis, G. A., & Pappas, G. (2008). Comparison of PubMed, Scopus, Web of Science, and Google Scholar: Strengths and weaknesses. The FASEB Journal, 22(2), 338–342. DOI: 

  7. Fite, P. J., Hendrickson, M., Rubens, S. L., Gabrielli, J., & Evans, S. (2013). The role of peer rejection in the link between reactive aggression and academic performance. Child & Youth Care Forum, 42(3), 193–205. DOI: 

  8. Fry, J., & Talja, S. (2007). The intellectual and social organization of academic fields and the shaping of digital resources. Journal of Information Science, 33(2), 115–133. DOI: 

  9. Halevi, G., Moed, H., & Bar-Ilan, J. (2017). Suitability of Google Scholar as a source of scientific information and as a source of data for scientific evaluation—Review of the literature. Journal of Informetrics, 11(3), 823–834. DOI: 

  10. Hammarfelt, B. (2014). Using altmetrics for assessing research impact in the humanities. Scientometrics, 101(2), 1419–1430. DOI: 

  11. Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787–804. DOI: 

  12. Harzing, A. W., & Alakangas, S. (2017). Microsoft Academic is one year old: The Phoenix is ready to leave the nest. Scientometrics, 112(3), 1887–1894. DOI: 

  13. Haustein, S., Bowman, T. D., Holmberg, K., Peters, I., & Larivière, V. (2014). Astrophysicists on Twitter: An in-depth analysis of tweeting and scientific publication behavior. Aslib Journal of Information Management, 66(3), 279–296. DOI: 

  14. Haustein, S., Bowman, T. D., Holmberg, K., Tsou, A., Sugimoto, C. R., & Larivière, V. (2016). Tweets as impact indicators: Examining the implications of automated ‘bot’ accounts on Twitter. Journal of the Association for Information Science and Technology, 67(1), 232–238. DOI: 

  15. Haustein, S., Larivière, V., Thelwall, M., Amyot, D., & Peters, I. (2014). Tweets vs. Mendeley readers: How do these two social media metrics differ? IT-Information Technology, 56(5), 207–215. DOI: 

  16. Holmberg, K., & Thelwall, M. (2014). Disciplinary differences in Twitter scholarly communication. Scientometrics, 101(2), 1027–1042. DOI: 

  17. Kousha, K., & Thelwall, M. (2016). An automatic method for assessing the teaching impact of books from online academic syllabi. Journal of the Association for Information Science and Technology, 67(12), 2993–3007. DOI: 

  18. Leahey, E. (2005). Alphas and asterisks: The development of statistical significance testing standards in sociology. Social Forces, 84(1), 1–24. DOI: 

  19. Lynd, R. S. (2015). Knowledge for what: The place of social science in American culture. Princeton, NJ: Princeton University Press. 

  20. McCarthy, K. F., Ondaatje, E. H., Zakaras, L., & Brooks, A. (2001). Gifts of the muse: Reframing the debate about the benefits of the arts. New York, NY: Rand Corporation. 

  21. Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows. Journal of the Association for Information Science and Technology, 65(8), 1627–1638. DOI: 

  22. Mohammadi, E., Thelwall, M., & Kousha, K. (2016). Can Mendeley bookmarks reflect readership? A survey of user motivations. Journal of the Association for Information Science and Technology, 67(5), 1198–1209. DOI: 

  23. Mohammadi, E., Thelwall, M., Kwasny, M., & Holmes, K. (2018). Academic information on Twitter: A user survey. PLOS ONE, 13(5), e0197265. DOI: 

  24. Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213–228. DOI: 

  25. Nussbaum, M. C. (2010). Not for profit: Why democracy needs the humanities. Princeton, NJ: Princeton University Press. 

  26. Ortega, J. L. (2018). Reliability and accuracy of altmetric providers: A comparison among Altmetric, PlumX and Crossref Event Data. DOI: 

  27. Priem, J., & Costello, K. L. (2010). How and why scholars cite on Twitter. Proceedings of the American Society for Information Science and Technology, 47(1), 1–4. DOI: 

  28. Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. 

  29. Ruiz-Castillo, J., & Waltman, L. (2015). Field-normalized citation impact indicators using algorithmically constructed classification systems of science. Journal of Informetrics, 9(1), 102–117. DOI: 

  30. Shema, H., Bar-Ilan, J., & Thelwall, M. (2015). How is research blogged? A content analysis approach. Journal of the Association for Information Science and Technology, 66(6), 1136–1149. DOI: 

  31. Thelwall, M. (2017a). Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals. Journal of Informetrics, 11(4), 1201–1212. DOI: 

  32. Thelwall, M. (2017b). Three practical field normalised alternative indicator formulae for research evaluation. Journal of Informetrics, 11(1), 128–151. DOI: 

  33. Thelwall, M. (2017c). Are Mendeley reader counts useful impact indicators in all fields? Scientometrics, 113(3), 1721–1731. DOI: 

  34. Thelwall, M. (2018). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430–435. DOI: 

  35. Thelwall, M., Tsou, A., Weingart, S., Holmberg, K., & Haustein, S. (2013). Tweeting links to academic articles. Cybermetrics: International Journal of Scientometrics, Informetrics and Bibliometrics, 17, 1–8. 

  36. Torres-Salinas, D., & Moed, H. F. (2009). Library catalog analysis as a tool in studies of social sciences and humanities: An exploratory study of published book titles in economics. Journal of Informetrics, 3(1), 9–26. DOI: 

  37. Torres-Salinas, D., Robinson-Garcia, N., & Gorraiz, J. (2017). Filling the citation gap: Measuring the multidimensional impact of the academic book at institutional level with PlumX. Scientometrics, 113(3), 1371–1384. DOI: 

  38. White, H. D., Boell, S. K., Yu, H., Davis, M., Wilson, C. S., & Cole, F. T. (2009). Libcitations: A measure for comparative assessment of book publications in the humanities and social sciences. Journal of the American Society for Information Science and Technology, 60(6), 1083–1096. DOI: 

  39. White, H. D., & Zuccala, A. (2018). Libcitations, WorldCat, cultural impact, and fame. Journal of the Association for Information Science and Technology. DOI: 

  40. Whitley, R. (2000). The intellectual and social organization of the sciences. Oxford, UK: Oxford University Press. 

  41. Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Tinkler, J., et al. (2015). The Metric Tide. London, UK: HEFCE. 

  42. Wilson, E. B. (1927). Probable inference, the law of succession, and statistical inference. Journal of the American Statistical Association, 22(158), 209–212. DOI: 

  43. Wouters, P., & Costas, R. (2012). Users, narcissism and control: Tracking the impact of scholarly publications in the 21st century, 847–857. Utrecht: SURFfoundation. 

  44. Zuccala, A. A., Verleysen, F. T., Cornacchia, R., & Engels, T. C. (2015). Altmetrics for the humanities: Comparing Goodreads reader ratings with citations to history books. Aslib Journal of Information Management, 67(3), 320–336. DOI: 
