Characterizing peer-judged answer quality on academic Q&A sites: A cross-disciplinary case study on ResearchGate

Lei Li (Department of Information Management, Nanjing University of Science and Technology, Nanjing, China)
Daqing He (School of Information Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, USA)
Chengzhi Zhang (Department of Information Management, Nanjing University of Science and Technology, Nanjing, China)
Li Geng (New York City College of Technology, City University of New York, New York, USA)
Ke Zhang (School of Information Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, USA)

Aslib Journal of Information Management

ISSN: 2050-3806

Article publication date: 30 July 2018

Issue publication date: 30 July 2018

Abstract

Purpose

Academic social question and answer (Q&A) sites are now utilised by millions of scholars and researchers for seeking and sharing discipline-specific information. However, little is known about the factors that affect peers' votes on the quality of an answer, nor about how discipline might influence these factors. This paper aims to address this gap.

Design/methodology/approach

Using 1,021 answers collected from three disciplines (library and information services, history of art, and astrophysics) on ResearchGate, statistical analysis was performed to identify the characteristics of high-quality academic answers, and comparisons were made across the three disciplines. In particular, two major categories of characteristics, those of the answer provider and those of the answer content, were extracted and examined.

Findings

The results reveal that high-quality answers on academic social Q&A sites tend to possess two characteristics: first, they are provided by scholars with higher academic reputations (e.g. more followers); and second, they provide objective information (e.g. longer answers with fewer subjective opinions). However, the impact of these factors varies across disciplines; for example, objectivity is valued more in astrophysics than in the other disciplines.

Originality/value

The study is envisioned to help academic Q&A sites select and recommend high-quality answers across different disciplines, especially in cold-start scenarios where an answer has not yet received enough judgements from peers.

Acknowledgements

The authors gratefully acknowledge Wei Jeng for providing helpful suggestions. This work is supported by the Major Projects of the National Social Science Fund (No. 16ZAD224), the Fujian Provincial Key Laboratory of Information Processing and Intelligent Control (Minjiang University) (No. MJUKF201704) and the Qing Lan Project.

Citation

Li, L., He, D., Zhang, C., Geng, L. and Zhang, K. (2018), "Characterizing peer-judged answer quality on academic Q&A sites: A cross-disciplinary case study on ResearchGate", Aslib Journal of Information Management, Vol. 70 No. 3, pp. 269-287. https://doi.org/10.1108/AJIM-11-2017-0246

Publisher

Emerald Publishing Limited

Copyright © 2018, Emerald Publishing Limited