Abstract
Users may strive to formulate an adequate textual query for their information need. Search engines assist the users by presenting query suggestions. To preserve the original search intent, suggestions should be context-aware and account for the previous queries issued by the user. Achieving context awareness is challenging due to data sparsity. We present a probabilistic suggestion model that is able to account for sequences of previous queries of arbitrary lengths. Our novel hierarchical recurrent encoder-decoder architecture allows the model to be sensitive to the order of queries in the context while avoiding data sparsity. Additionally, our model can suggest for rare, or long-tail, queries. The produced suggestions are synthetic and are sampled one word at a time, using computationally cheap decoding techniques. This is in contrast to current synthetic suggestion models relying upon machine learning pipelines and hand-engineered feature sets. Results show that it outperforms existing context-aware approaches in a next query prediction setting. In addition to query suggestion, our model is general enough to be used in a variety of other applications.
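The abstract describes a hierarchical recurrent encoder-decoder (HRED): a query-level recurrent network summarises each query in the session, a session-level recurrent network consumes those summaries in order, and a decoder samples the suggested next query one word at a time. Below is a minimal sketch of that idea, assuming PyTorch; the `HRED` class, its dimensions, and the greedy decoding loop are illustrative choices for exposition, not the authors' exact implementation.

```python
# Minimal HRED sketch for next-query suggestion (illustrative, not the paper's exact model).
import torch
import torch.nn as nn


class HRED(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, query_dim=128, session_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Query-level encoder: summarises the words of a single query.
        self.query_enc = nn.GRU(emb_dim, query_dim, batch_first=True)
        # Session-level encoder: reads the query summaries in order, which is
        # what makes the suggestion sensitive to the sequence of previous queries.
        self.session_enc = nn.GRU(query_dim, session_dim, batch_first=True)
        # Decoder: generates the suggestion one word at a time, conditioned on
        # the session summary via its initial hidden state.
        self.decoder = nn.GRU(emb_dim, session_dim, batch_first=True)
        self.out = nn.Linear(session_dim, vocab_size)

    def encode_session(self, queries):
        # queries: list of LongTensors, each of shape (1, num_words).
        summaries = []
        for q in queries:
            _, h = self.query_enc(self.embed(q))      # h: (1, 1, query_dim)
            summaries.append(h.transpose(0, 1))       # -> (1, 1, query_dim)
        session_in = torch.cat(summaries, dim=1)      # (1, num_queries, query_dim)
        _, session_h = self.session_enc(session_in)   # (1, 1, session_dim)
        return session_h

    def generate(self, queries, bos_id, eos_id, max_len=10):
        # Greedy word-by-word decoding: a computationally cheap way to produce
        # a synthetic suggestion instead of reranking queries mined from logs.
        hidden = self.encode_session(queries)
        word = torch.tensor([[bos_id]])
        suggestion = []
        for _ in range(max_len):
            out, hidden = self.decoder(self.embed(word), hidden)
            word = self.out(out[:, -1]).argmax(dim=-1, keepdim=True)
            if word.item() == eos_id:
                break
            suggestion.append(word.item())
        return suggestion


# Toy usage: ids 0 and 1 stand in for <bos> and <eos>; the context is two previous queries.
model = HRED(vocab_size=1000)
context = [torch.tensor([[5, 42, 7]]), torch.tensor([[42, 99]])]
print(model.generate(context, bos_id=0, eos_id=1))
```

With untrained weights the output is arbitrary; the point of the sketch is the three-level structure (word, query, session) and the cheap left-to-right sampling the abstract refers to.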
Original language | English
---|---
Title of host publication | CIKM '15 Proceedings of the 24th ACM International on Conference on Information and Knowledge Management
Number of pages | 10
Publisher | Association for Computing Machinery
Publication date | 2015
Pages | 553-562
ISBN (Electronic) | 978-1-4503-3794-6
DOIs | 
Publication status | Published - 2015
Event | CIKM 2015: ACM International Conference on Information and Knowledge Management - Melbourne, Australia. Duration: 19 Oct 2015 → 23 Oct 2015
Conference

Conference | CIKM 2015
---|---
Country/Territory | Australia
City | Melbourne
Period | 19/10/2015 → 23/10/2015