DOI: 10.1145/2766462.2767786

Search Engine Evaluation based on Search Engine Switching Prediction

Published: 09 August 2015

Abstract

In this paper, we present a novel application of a search engine switching prediction model to online evaluation. We propose pSwitch, a new metric for A/B testing that allows us to evaluate the quality of a search engine in different aspects, such as the quality of its user interface and the quality of its ranking function. pSwitch is a session-level metric: it relies on the predicted probability that a search session contains a switch to another search engine, and it reflects the degree to which the session failed. We demonstrate the effectiveness and validity of pSwitch in A/B-testing experiments with real users of the Yandex search engine. We compare our metric with the recently proposed SpU (sessions per user) metric and with widely used query-level A/B metrics, Abandonment Rate and Time to First Click, which serve as our baselines. We observe that pSwitch is more sensitive than these baseline metrics, and that pSwitch and SpU are more consistent with ground truth than Abandonment Rate and Time to First Click.
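The abstract describes pSwitch as a session-level metric built on a predicted switch probability and compared between A/B experiment arms. A minimal, hypothetical sketch of how such a metric could be aggregated and tested for significance is shown below; it is not the authors' implementation. The names `predict_switch_proba`, `p_switch`, and `bootstrap_diff_ci` are illustrative, and the one-feature toy predictor stands in for a trained switch-prediction model.

```python
import random
from statistics import mean

def p_switch(sessions, predict_switch_proba):
    """pSwitch for one experiment arm: the mean predicted probability
    that a session contains a switch to another engine (lower is better)."""
    return mean(predict_switch_proba(s) for s in sessions)

def bootstrap_diff_ci(control, treatment, metric, n_boot=1000, seed=0):
    """95% bootstrap confidence interval for metric(treatment) - metric(control).
    If the interval excludes 0, the A/B difference is read as significant."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        c = rng.choices(control, k=len(control))     # resample with replacement
        t = rng.choices(treatment, k=len(treatment))
        diffs.append(metric(t) - metric(c))
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot) - 1]

# Hypothetical stand-in for a trained switch-prediction model: it keys on a
# single behavioral feature (whether the session was abandoned).
def predict_switch_proba(session):
    return 0.8 if session["abandoned"] else 0.1

control = [{"abandoned": i < 60} for i in range(100)]    # 60% abandoned sessions
treatment = [{"abandoned": i < 40} for i in range(100)]  # 40% abandoned sessions

arm_metric = lambda sessions: p_switch(sessions, predict_switch_proba)
low, high = bootstrap_diff_ci(control, treatment, arm_metric)
```

With the seeded toy data above, the confidence interval should lie entirely below zero, so the treatment's lower pSwitch would be read as a significant improvement under this toy setup.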




    Published In

    SIGIR '15: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval
    August 2015
    1198 pages
    ISBN:9781450336215
    DOI:10.1145/2766462


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. online evaluation
    2. search engine switching

    Qualifiers

    • Short-paper

    Conference

    SIGIR '15

    Acceptance Rates

    SIGIR '15 paper acceptance rate: 70 of 351 submissions (20%)
    Overall acceptance rate: 792 of 3,983 submissions (20%)

    Article Metrics

    • Downloads (last 12 months): 12
    • Downloads (last 6 weeks): 1
    Reflects downloads up to 01 Jan 2025

    Cited By

    • (2022) Proposing a New Combined Indicator for Measuring Search Engine Performance and Evaluating Google, Yahoo, DuckDuckGo, and Bing Search Engines based on Combined Indicator. Journal of Librarianship and Information Science 56(1), 178-197. DOI: 10.1177/09610006221138579. Online publication date: 8 Dec 2022.
    • (2019) Effective Online Evaluation for Web Search. Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, 1399-1400. DOI: 10.1145/3331184.3331378. Online publication date: 18 Jul 2019.
    • (2018) Consistent Transformation of Ratio Metrics for Efficient Online Controlled Experiments. Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining, 55-63. DOI: 10.1145/3159652.3159699. Online publication date: 2 Feb 2018.
    • (2017) Using the Delay in a Treatment Effect to Improve Sensitivity and Preserve Directionality of Engagement Metrics in A/B Experiments. Proceedings of the 26th International Conference on World Wide Web, 1301-1310. DOI: 10.1145/3038912.3052664. Online publication date: 3 Apr 2017.
    • (2017) Periodicity in User Engagement with a Search Engine and Its Application to Online Controlled Experiments. ACM Transactions on the Web 11(2), 1-35. DOI: 10.1145/2856822. Online publication date: 14 Apr 2017.
    • (2016) Online Evaluation for Information Retrieval. Foundations and Trends in Information Retrieval 10(1), 1-117. DOI: 10.1561/1500000051. Online publication date: 1 Jun 2016.
