DOI: 10.1145/2983323.2983816
Research Article · Public Access

Multiple Queries as Bandit Arms

Published: 24 October 2016

Abstract

Existing retrieval systems rely on a single active query to pull documents from the index. Relevance feedback may be used to iteratively refine the query, but only one query is active at a time. If the user's information need has multiple aspects, the query must represent the union of these aspects. We consider a new paradigm of retrieval where multiple queries are kept "active" simultaneously. In the presence of rate limits, the active queries take turns accessing the index to retrieve another "page" of results. Turns are assigned by a multi-armed bandit based on user feedback. This allows the system to explore which queries return more relevant results and to exploit the best ones. In empirical tests, query pools outperform solo, combined queries. Significant improvement is observed both when the subtopic queries are known in advance and when the queries are generated in a user-interactive process.
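The turn-taking scheme the abstract describes can be illustrated with a standard UCB1 bandit over a pool of queries. This is a minimal sketch, not the authors' implementation: `fetch_page` and `judge` are hypothetical stand-ins for the rate-limited index access and the user's relevance feedback, and the reward is simply whatever `judge` returns for each retrieved page.

```python
import math

def ucb1_select(counts, rewards, t):
    """Return the index of the query whose UCB1 score is highest.

    counts[i]  -- turns query i has taken so far
    rewards[i] -- total reward earned by query i (e.g. relevant results seen)
    t          -- total turns taken so far (t >= 1)
    """
    for i, n in enumerate(counts):
        if n == 0:                      # give every query one turn first
            return i
    return max(
        range(len(counts)),
        key=lambda i: rewards[i] / counts[i]
        + math.sqrt(2.0 * math.log(t) / counts[i]),
    )

def run_query_pool(queries, fetch_page, judge, turns):
    """Keep a pool of queries "active" and let UCB1 assign turns.

    fetch_page(query, page) -- hypothetical retrieval call: one page of results
    judge(results)          -- hypothetical feedback: reward for that page
    """
    k = len(queries)
    counts, rewards, next_page = [0] * k, [0.0] * k, [0] * k
    retrieved = []
    for t in range(1, turns + 1):
        i = ucb1_select(counts, rewards, t)      # explore/exploit choice
        results = fetch_page(queries[i], next_page[i])
        next_page[i] += 1                        # each arm pages deeper on its turn
        counts[i] += 1
        rewards[i] += judge(results)             # user feedback becomes the reward
        retrieved.extend(results)
    return retrieved, counts
```

With a reward of one per relevant result, the bandit shifts most turns to the query whose pages keep paying off, while still occasionally revisiting the weaker queries to check whether their deeper pages improve.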


Published In

CIKM '16: Proceedings of the 25th ACM International Conference on Information and Knowledge Management
October 2016
2566 pages
ISBN: 9781450340731
DOI: 10.1145/2983323

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  • multi-armed bandits
  • query pooling

Conference

CIKM '16: ACM Conference on Information and Knowledge Management
October 24–28, 2016
Indianapolis, Indiana, USA

Acceptance Rates

CIKM '16 Paper Acceptance Rate 160 of 701 submissions, 23%;
Overall Acceptance Rate 1,861 of 8,427 submissions, 22%
