Modeling term dependencies with quantum language models for IR
Title | Modeling term dependencies with quantum language models for IR |
Publication type | Conference Paper |
Year of publication | 2013 |
Authors | Sordoni, A., J.-Y. Nie, and Y. Bengio |
Conference name | Proceedings of the 36th international ACM SIGIR conference on Research and Development in Information Retrieval |
Abstract | Traditional information retrieval (IR) models use the bag-of-words representation and assume that some form of independence holds between terms. Representing term dependencies, and defining a scoring function able to integrate such additional evidence, is theoretically and practically challenging. Recently, Quantum Theory (QT) has been proposed as a possible, more general framework for IR. However, only a limited number of investigations have been made, and the potential of QT has not been fully explored and tested. We develop a new, generalized Language Modeling approach for IR by adopting the probabilistic framework of QT. In particular, quantum probability can account for both single and compound terms at once, without artificially extending the term space as in previous studies. This naturally avoids the weight-normalization problem, which arises in current practice when mixing scores from matching compound terms with scores from matching single terms. Our model is the first practical application of quantum probability to show significant improvements over a robust bag-of-words baseline, and it also achieves better performance than a stronger non-bag-of-words baseline. |
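To illustrate the core idea the abstract describes, the following is a minimal sketch of quantum-probability scoring over single and compound term events: single terms are rank-1 projectors onto basis vectors, a compound term (dependency) is a projector onto a superposition of its constituents' basis vectors, a document is a density matrix mixing these events, and a query event is scored by the Born rule tr(ρP). The vocabulary, weights, and function names here are purely illustrative assumptions; the paper's actual model estimates document density matrices by an iterative maximum-likelihood procedure and ranks by a quantum divergence between query and document states, which this sketch omits.

```python
import numpy as np

def projector(vec):
    # Rank-1 projector |v><v| for a (normalized) vector v.
    v = vec / np.linalg.norm(vec)
    return np.outer(v, v)

def term_projector(index, dim):
    # Projection event for a single term: its one-hot basis vector e_i.
    e = np.zeros(dim)
    e[index] = 1.0
    return projector(e)

def compound_projector(indices, dim, weights=None):
    # Projection event for a compound term (term dependency): a
    # superposition of the basis vectors of its constituent terms,
    # requiring no extra dimension in the term space.
    w = np.ones(len(indices)) if weights is None else np.asarray(weights, float)
    v = np.zeros(dim)
    for wi, idx in zip(w, indices):
        v[idx] = np.sqrt(wi)
    return projector(v)

def density_matrix(projectors, mix_weights):
    # Mixture rho = sum_k p_k P_k: unit trace, positive semidefinite.
    p = np.asarray(mix_weights, dtype=float)
    p = p / p.sum()
    return sum(pk * Pk for pk, Pk in zip(p, projectors))

def quantum_probability(rho, P):
    # Born rule: probability of event P under state rho is tr(rho P).
    return float(np.trace(rho @ P))

# Toy example (hypothetical vocabulary of size 3).
dim = 3
QUANTUM, LANGUAGE, MODEL = 0, 1, 2

# Document state mixing single-term events with one compound event
# for the dependency "language model"; weights are illustrative.
events = [
    term_projector(QUANTUM, dim),
    term_projector(LANGUAGE, dim),
    term_projector(MODEL, dim),
    compound_projector([LANGUAGE, MODEL], dim),
]
rho_doc = density_matrix(events, mix_weights=[2, 3, 3, 2])

# Single and compound query events are scored by the same rule,
# so no ad hoc weight normalization between the two is needed.
print(quantum_probability(rho_doc, term_projector(QUANTUM, dim)))
print(quantum_probability(rho_doc, compound_projector([LANGUAGE, MODEL], dim)))
```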