DOI: 10.1145/564376.564394

Improving realism of topic tracking evaluation

Published: 11 August 2002

Abstract

Topic tracking and information filtering are models of interactive tasks, but they are generally evaluated in ways that do not reflect likely usage. The models either force frequent judgments or disallow them entirely, assume the user is always available to make a judgment, and make no allowance for user fatigue. In this study we extend the topic tracking evaluation framework to incorporate these more realistic conditions. We demonstrate that tracking can be done in a realistic interactive setting with minimal impact on tracking cost and a substantial reduction in required interaction.
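The "tracking cost" the abstract refers to is the detection-cost measure standardly used in TDT evaluations, which weighs misses against false alarms. As a rough sketch only (the parameter values below are the customary TDT settings, not figures taken from this paper), the cost and its normalized form can be computed as:

```python
def tracking_cost(p_miss, p_fa,
                  c_miss=1.0, c_fa=0.1, p_target=0.02):
    """Standard TDT detection cost and its normalized form.

    c_miss, c_fa, and p_target are the customary TDT parameter
    values; p_miss and p_fa are the system's miss and false-alarm
    rates on the tracking task.
    """
    cost = c_miss * p_miss * p_target + c_fa * p_fa * (1.0 - p_target)
    # Normalize by the cost of the better trivial system
    # (saying "yes" to everything vs. "no" to everything).
    norm = cost / min(c_miss * p_target, c_fa * (1.0 - p_target))
    return cost, norm

# Example: a tracker missing 10% of on-topic stories with a 1%
# false-alarm rate.
cost, norm = tracking_cost(p_miss=0.10, p_fa=0.01)
print(f"C_det = {cost:.5f}, normalized = {norm:.3f}")
# → C_det = 0.00298, normalized = 0.149
```

A normalized cost of 0 is perfect and 1 matches the better of the two trivial systems, which is why small absolute cost changes can still matter after normalization.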



Published In

SIGIR '02: Proceedings of the 25th annual international ACM SIGIR conference on Research and development in information retrieval
August 2002
478 pages
ISBN:1581135610
DOI:10.1145/564376

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. filtering
  2. interactive tracking
  3. topic detection and tracking

Qualifiers

  • Article

Conference

SIGIR '02

Acceptance Rates

SIGIR '02 paper acceptance rate: 44 of 219 submissions (20%)
Overall acceptance rate: 792 of 3,983 submissions (20%)
