DOI: 10.1145/2740908.2742129 · Research Article · The Web Conference Proceedings

Answer Quality Characteristics and Prediction on an Academic Q&A Site: A Case Study on ResearchGate

Published: 18 May 2015

Abstract

Despite various studies examining and predicting answer quality on generic social Q&A sites such as Yahoo! Answers, little is known about why scholars who follow discussion threads on academic Q&A sites vote certain answers as high quality. Using 1,021 answers obtained from the Q&A section of the academic social network site ResearchGate (RG), we first explored whether various web-captured and human-coded features are critical factors influencing peer-judged answer quality. Using the identified critical features, we then constructed three classification models to predict the peer-judged rating. Our results yield four main findings. First, responders' authority, shorter response time, and greater answer length are critical features that associate positively with peer-judged answer quality. Second, answers containing social elements are likely to harm peer-judged answer quality. Third, an optimized SVM algorithm substantially outperforms the other models in accuracy. Finally, prediction based on web-captured features performed better than prediction based on human-coded features. We hope these insights into ResearchGate's answer quality can inform the further design of academic Q&A sites.
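The prediction setup the abstract describes can be sketched in code. This is a minimal illustration, not the authors' implementation: the feature names (responder authority, response time, answer length) follow the abstract, but the synthetic data, the RBF kernel, and the hyperparameter grid are assumptions standing in for whatever "optimized SVM" configuration the paper actually used.

```python
# Sketch: predicting a binary "high-quality" label for answers from
# web-captured features, with an SVM tuned by grid search.
# All data here is synthetic and illustrative.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
# Hypothetical features (standardized units): per the abstract,
# authority and answer length help quality; long response time hurts.
authority = rng.normal(0, 1, n)
resp_time = rng.normal(0, 1, n)
length = rng.normal(0, 1, n)
X = np.column_stack([authority, resp_time, length])
# Synthetic label consistent with the abstract's reported directions.
y = (authority - resp_time + length + rng.normal(0, 0.5, n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# "Optimized SVM": cross-validated search over C and gamma.
model = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    {"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.1]},
    cv=5,
)
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

On data like this, where the label is a noisy linear function of the features, the tuned SVM separates the classes well; the paper's actual accuracy figures would of course depend on its real feature set and labels.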





        Published In

        WWW '15 Companion: Proceedings of the 24th International Conference on World Wide Web
        May 2015
        1602 pages
        ISBN:9781450334730
        DOI:10.1145/2740908

        Sponsors

        • IW3C2: International World Wide Web Conference Committee

        Publisher

        Association for Computing Machinery

        New York, NY, United States



        Author Tags

        1. academic q&a site
        2. academic social networking
        3. peer rating
        4. researchgate
        5. social q&a
        6. user judgment


        Funding Sources

        • Major Projects of National Social Science Fund
        • National Social Science Fund Project
        • National Natural Science Foundation of China
        • Project of the Education Ministry's Humanities and Social Science


        Acceptance Rates

Overall acceptance rate: 1,899 of 8,196 submissions (23%)
