
CARL: Aggregated Search with Context-Aware Module Embedding Learning

Aggregated search aims to construct search result pages (SERPs) from blue-links and heterogeneous modules (such as news, images, and videos). Existing studies have largely ignored the correlations between blue-links and heterogeneous modules when selecting the heterogeneous modules to be presented. We observe that the top-ranked blue-links, which we refer to as the \emph{context}, can provide important information about query intent and help identify the relevant heterogeneous modules. For example, informative terms like "streamed" and "recorded" in the context imply that a video module may better satisfy the query. To model and utilize the context information for aggregated search, we propose a model with context attention and representation learning (CARL). Our model applies a recurrent neural network with an attention mechanism to encode the context, and incorporates the encoded context information into module embeddings. The context-aware module embeddings together with the ranking policy are jointly optimized under the Markov decision process (MDP) formulation. To achieve more effective joint learning, we further propose an optimization function with a self-supervision loss that provides auxiliary supervision signals. Experimental results on two public datasets demonstrate the superiority of CARL over multiple baseline approaches, and confirm the effectiveness of the proposed optimization function in boosting the joint learning process.
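A minimal PyTorch sketch of the idea described in the abstract: encode the top-ranked blue-links (the context) with a recurrent encoder plus attention, then fuse the attended context vector into per-module embeddings. This is an illustrative assumption rather than the paper's released implementation; all class, layer, and parameter names (ContextAwareModuleEmbedding, doc_dim, fuse, etc.) are hypothetical.

```python
import torch
import torch.nn as nn

class ContextAwareModuleEmbedding(nn.Module):
    """Sketch: GRU + attention over top-k blue-link features, fused into
    module embeddings. Names and dimensions are illustrative only."""

    def __init__(self, doc_dim, hidden_dim, num_modules, module_dim):
        super().__init__()
        self.encoder = nn.GRU(doc_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)                 # scores each context position
        self.base_module_emb = nn.Embedding(num_modules, module_dim)
        self.fuse = nn.Linear(hidden_dim + module_dim, module_dim)

    def forward(self, context_docs, module_ids):
        # context_docs: (batch, k, doc_dim) features of the top-k blue-links
        # module_ids:   (batch,) ids of candidate heterogeneous modules
        states, _ = self.encoder(context_docs)               # (batch, k, hidden_dim)
        weights = torch.softmax(self.attn(states), dim=1)    # attention over context positions
        context_vec = (weights * states).sum(dim=1)          # (batch, hidden_dim)
        base = self.base_module_emb(module_ids)              # (batch, module_dim)
        # Context-aware module embedding: condition the module on the encoded context
        return torch.tanh(self.fuse(torch.cat([context_vec, base], dim=-1)))
```

In CARL, embeddings of this kind are trained jointly with the ranking policy under the MDP formulation, with the self-supervision loss supplying auxiliary signals; the sketch above only covers the context-aware embedding component.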
