Metrics and social media as components of scholarly research: new developments for Online Information Review

Gary E Gorman (Asia-New Zealand Informatics Associates Ltd, Trentham, New Zealand)

Online Information Review

ISSN: 1468-4527

Article publication date: 9 February 2015

Citation

Gorman, G.E. (2015), "Metrics and social media as components of scholarly research: new developments for Online Information Review", Online Information Review, Vol. 39 No. 1. https://doi.org/10.1108/OIR-12-2014-0297

Publisher

Emerald Group Publishing Limited


Metrics and social media as components of scholarly research: new developments for Online Information Review

Article Type: Editorial. From: Online Information Review, Volume 39, Issue 1

Some of the bibliometric data contributing to this (Excellence in Research for Australia, or ERA) ranking suffer statistical issues associated with skewed distributions. Other data are standardized year-by-year, placing undue emphasis on the most recent publications which may not yet have reliable citation patterns (Vanclay and Bornmann, 2012).

This simple statement highlights one of the major unresolved disputes within academia – how to deal with the use and abuse of metrics in the evaluation/ranking/“quality assurance” game. On the one hand, few of us would doubt that metrics do have a role in research evaluation or that an individual metric can offer insights into the assessment of research outputs. On the other hand, most are somewhat cynical about the value of a single metric being used on its own to evaluate anything because of the likelihood of misinterpretation and misunderstanding.

We have come to recognize that research productivity and evaluation are complex activities and should therefore involve several approaches, metrics being one. But how are the approaches to be determined, and what is the actual role of metrics in the emerging mix of methods? What sources should be drawn upon when determining the metrics? If we are being more comprehensive, where should we be looking for evaluation criteria? These and many other questions have made the blogosphere and Twitter-land run hot with debate and disagreement – just look at the blogs and tweets responding to the Higher Education Funding Council for England’s call for comment on its proposal to use metrics in research assessment (Higher Education Funding Council for England (HEFCE), 2014) if you need confirmation of how heated the discussion has become. When debate becomes so heated and emotive, I tend to think that we need some very focused studies to determine whether there are correlations between various metrics and purported research quality, but who will fund this research, and who will do it?

And if you happen to think that we can corral the debate and contain it, then you’ve been living in cloud-cuckoo land, because social media have come roaring in to offer new and rapidly evolving perspectives on metrics. Blogs and Twitter in particular are now seen, especially by younger researchers (and those who publish their work), as a natural way to broadcast research results and publications to interested communities, thereby alerting these communities to newly available work and inviting debate on the research findings. As Hitchcock (2014) sees it, “Twitter and blogs, and embarrassingly enthusiastic drunken conversations at parties, are not add-ons to academic research, but a simple reflection of the passion that underpins it”.

Given this exploding interest in the use of social media as part of one’s research activity, the research community has begun to exploit altmetrics with considerable enthusiasm. PLOS (2014) defines altmetrics as:

[…] the study and use of non-traditional scholarly impact measures that are based on activity in web-based environments. As scholarship increasingly moves online, these metrics track associated interactions and activity to generate fine-grained data, allowing researchers and policy makers to create a higher resolution picture of the reach and impact of academic research.

Since social media cover a broad spectrum of activities and sources of data, altmetrics is equally broad in its data gathering, ranging from HTML views and PDF downloads to blog discussions, Twitter references and social bookmarking traffic. All of these web-based activities are seen as casting new and informative light on research impact beyond the traditional metrics.
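By way of illustration only, the following is a minimal Python sketch (not tied to any real altmetrics provider or API) of how such per-article indicators might be recorded and reported side by side rather than collapsed into a single score; the field names and counts are invented for this example, and only the DOI (this editorial’s own) is real.

    # Hypothetical per-article altmetric record: the indicator names mirror the
    # kinds of web-based activity mentioned above and are purely illustrative.
    from dataclasses import dataclass, asdict


    @dataclass
    class AltmetricRecord:
        doi: str
        html_views: int = 0
        pdf_downloads: int = 0
        blog_mentions: int = 0
        tweets: int = 0
        bookmarks: int = 0

        def as_report(self) -> dict:
            # Report each indicator separately; collapsing them into one number
            # would reintroduce the single-metric problem discussed earlier.
            return asdict(self)


    if __name__ == "__main__":
        record = AltmetricRecord(
            doi="10.1108/OIR-12-2014-0297",  # this editorial's DOI, used as an example
            html_views=1250,                 # all counts below are invented
            pdf_downloads=310,
            blog_mentions=4,
            tweets=27,
            bookmarks=12,
        )
        for indicator, value in record.as_report().items():
            print(f"{indicator}: {value}")

Kept separate in this way, the indicators can be weighed and interpreted individually, which is the kind of fine-grained picture the PLOS definition describes.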

Cognisant of the significance of social media in the researcher’s promotional and collegial responsibilities, and of the expansion of metrics to encompass social media activity, we at Online Information Review have extended our editorial objectives to include altmetrics and related metrics for measuring research impact, as well as social media, social networks and social media analytics. We encourage submissions in all of these areas within the broader online information environment.

Further, we have introduced a new position and some new features into the journal in an attempt to assist readers in keeping abreast of relevant developments in metrics and social media. In the first instance we have created the position of Associate Editor – Social Media, with the first holder of this editorial position being Dr Rebecca Reynolds of Rutgers University (New Jersey). Dr Reynolds’ brief is to ensure that our readers are regularly informed of OIR issues, papers and developments through the relevant social media.

In addition we are introducing two new regular viewpoints into the OIR mix. Dr David Stuart (Centre for e-Research, King’s College London) will be writing on “Taming Metrics”. These twice-per-volume viewpoint papers will cover important research and technological developments in metrics for research evaluation. With research institutions increasingly interested in the potential of bibliometrics and altmetrics to both evaluate and demonstrate research impact, and new metrics and tools being developed to capture an increasingly diverse range of content, Dr Stuart’s viewpoint pieces will help to distinguish the substance from the snake oil.

The second set of viewpoints (also two per volume) will be presented by Dr Katrin Weller (GESIS – Leibniz Institute for the Social Sciences, Cologne), offering a “Spotlight on Social Media Research”. These viewpoint papers will focus on current developments in research related to online social networks and social media user communities. Her discussions will cover interdisciplinary approaches to such research, noteworthy research findings, and the methods, methodologies and technical challenges of social media analytics and Big Data. The purpose of Dr Weller’s viewpoint papers is to help researchers keep abreast of new developments, as well as to highlight new trends and question research findings.

Finally, after an absence of more than two years, we are re-introducing Special Issues into the OIR line-up. The two forthcoming Special Issues will address matters that are relevant to metrics and social media: “Open Access: Redrawing the Landscape of Scholarly Communication” and “Social Media Analytics”. Descriptions and calls for papers may be found at www.emeraldgrouppublishing.com/products/journals/journals.htm?id=oir

G.E. Gorman

References

Higher Education Funding Council for England (2014), #HEFCEmetrics, available at: https://storify.com/Impactstory/hefcemetrics (accessed 10 December 2014).

Hitchcock, T. (2014), “Twitter and blogs are not add-ons to academic research, but a simple reflection of the passion that underpins it”, The Impact Blog, available at: http://blogs.lse.ac.uk/impactofsocialsciences/2014/07/28/twitter-and-blogs-academic-public-sphere/ (accessed 10 December 2014).

PLOS (2014), “PLOS collections: article collections”, available at: www.ploscollections.org/article/browseIssue.action?issue=info:doi/10.1371/issue.pcol.v02.i19#tocGrp_0 (accessed 10 December 2014).

Vanclay, J.K. and Bornmann, L. (2012), “Metrics to evaluate research performance in academic institutions: a critique of ERA 2010 as applied in forestry and the indirect H2 index as a possible alternative”, Scientometrics, Vol. 91 No. 3, pp. 751-771.
