Recursive Attentive Methods with Reused Item Representations for Sequential Recommendation

by Bo Peng, et al.
The Ohio State University

Sequential recommendation aims to recommend the next item of users' interest based on their historical interactions. Recently, the self-attention mechanism has been adapted for sequential recommendation and has demonstrated state-of-the-art performance. However, in this manuscript, we show that the self-attention-based sequential recommendation methods could suffer from the localization-deficit issue. As a consequence, in these methods, over the first few blocks, the item representations may quickly diverge from their original representations, and thus impair the learning in the following blocks. To mitigate this issue, in this manuscript, we develop a recursive attentive method with reused item representations (RAM) for sequential recommendation. We compare RAM with five state-of-the-art baseline methods on six public benchmark datasets. Our experimental results demonstrate that RAM significantly outperforms the baseline methods on benchmark datasets, with an improvement of as much as 11.3%. Our analysis also shows that RAM could enable deeper and wider models for better performance. Our run-time performance comparison signifies that RAM could also be more efficient on benchmark datasets.
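The abstract's core idea, recursively applying an attention block while anchoring each step to the original item representations, can be sketched roughly as follows. This is an illustrative NumPy toy under assumed details (a single shared attention block, a causal mask, a residual connection back to the input embeddings); it is not the authors' actual RAM implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(q, k, v):
    # scaled dot-product attention over the item sequence
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    # causal mask: each position attends only to itself and earlier items
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -1e9
    return softmax(scores) @ v

def recursive_attention(item_emb, num_steps=3):
    """Apply one shared attention block recursively. Keys/values are
    always the ORIGINAL item embeddings, so hidden states cannot drift
    arbitrarily far from the input (the 'reused item representations'
    idea, sketched; not the exact RAM architecture)."""
    h = item_emb
    for _ in range(num_steps):
        # queries come from the current hidden state; the residual
        # connection also reuses the original embeddings
        h = self_attention(h, item_emb, item_emb) + item_emb
    return h

rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 8))   # 5 items in the sequence, embedding dim 8
out = recursive_attention(emb)
print(out.shape)                # → (5, 8)
```

Because the block is shared across recursion steps, depth is increased without adding parameters, which is consistent with the abstract's claim that RAM could enable deeper models at modest cost.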


Sequential Recommendation with Relation-Aware Kernelized Self-Attention

Recent studies identified that sequential recommendation is improved by ...

Attention-based Fusion for Outfit Recommendation

This paper describes an attention-based fusion method for outfit recomme...

Multi-modality Meets Re-learning: Mitigating Negative Transfer in Sequential Recommendation

Learning effective recommendation models from sparse user interactions r...

C3SASR: Cheap Causal Convolutions for Self-Attentive Sequential Recommendation

Sequential Recommendation is a prominent topic in current research, whic...

Non-invasive Self-attention for Side Information Fusion in Sequential Recommendation

Sequential recommender systems aim to model users' evolving interests fr...

Tensor-based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations

Self-attentive transformer models have recently been shown to solve the ...

AdaMCT: Adaptive Mixture of CNN-Transformer for Sequential Recommendation

Sequential recommendation (SR) aims to model users' dynamic preferences ...