An Improved Sequential Recommendation Algorithm based on Short-Sequence Enhancement and Temporal Self-Attention Mechanism

Complexity 2022:1-15 (2022)

Abstract

Sequential recommendation algorithms predict a user's next action by modeling the user's interaction sequence with items. However, most sequential recommendation models consider only the absolute positions of items in the sequence, ignore the time intervals between items, and therefore cannot effectively capture changes in user preference. In addition, existing models perform poorly on sparse data sets, which leads to poor predictions for short sequences. To address these problems, this paper proposes an improved sequential recommendation algorithm based on short-sequence enhancement and a temporal self-attention mechanism. First, a backward prediction model is trained to predict the items that precede a user sequence. This reverse prediction model is then used to generate a batch of pseudo-historical items before the first item of each short sequence, thereby enhancing the short sequence. Finally, both the absolute position information and the time interval information of the user sequence are modeled, and a time-aware self-attention model is adopted to predict the user's next action and generate a recommendation list. Experiments on two public data sets show that the proposed method performs well on both dense and sparse data sets and outperforms state-of-the-art baselines.
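
The abstract describes two components: a reverse-order model that prepends pseudo-historical items to short sequences, and a time-aware self-attention layer that injects absolute positions and inter-item time intervals into the attention computation. The sketch below is illustrative only and is not the authors' implementation; the model dimensions, the `reverse_model` callable, and the `max_interval` clipping threshold are assumptions, and the time-interval term follows the general pattern of relative, time-aware self-attention (TiSASRec-style models).

```python
# Minimal sketch of the two ideas in the abstract, under the assumptions above.

import torch
import torch.nn as nn
import torch.nn.functional as F


def augment_short_sequence(seq, reverse_model, target_len):
    # Prepend pseudo-historical items predicted by a model trained on reversed
    # sequences until the sequence reaches target_len. `reverse_model` is a
    # placeholder for the paper's backward prediction model: it maps a reversed
    # prefix to the most likely preceding item id.
    seq = list(seq)
    while len(seq) < target_len:
        pseudo_prev = reverse_model(list(reversed(seq)))
        seq.insert(0, pseudo_prev)
    return seq


class TimeAwareSelfAttention(nn.Module):
    def __init__(self, num_items, d_model=64, max_len=50, max_interval=256):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, d_model)                 # absolute positions
        self.interval_emb = nn.Embedding(max_interval + 1, d_model)   # pairwise time gaps
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.max_interval = max_interval
        self.scale = d_model ** 0.5

    def forward(self, items, timestamps):
        # items:      (batch, seq_len) item ids, 0 = padding
        # timestamps: (batch, seq_len) interaction times (e.g. in days)
        B, L = items.shape
        pos = torch.arange(L, device=items.device)
        x = self.item_emb(items) + self.pos_emb(pos)                  # (B, L, d)

        # Pairwise time intervals |t_i - t_j|, clipped to a maximum value.
        gaps = (timestamps.unsqueeze(2) - timestamps.unsqueeze(1)).abs()
        gaps = gaps.clamp(max=self.max_interval).long()               # (B, L, L)
        rel = self.interval_emb(gaps)                                 # (B, L, L, d)

        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Attention logits: content term plus a time-interval term.
        logits = torch.matmul(q, k.transpose(1, 2))                   # (B, L, L)
        logits = logits + torch.einsum('bld,blmd->blm', q, rel)
        logits = logits / self.scale

        # Causal mask: position i may only attend to positions <= i.
        causal = torch.tril(torch.ones(L, L, device=items.device)).bool()
        logits = logits.masked_fill(~causal, float('-inf'))

        attn = F.softmax(logits, dim=-1)
        out = torch.matmul(attn, v)                                   # (B, L, d)
        # Score every item for the next-step prediction at the last position.
        return out[:, -1] @ self.item_emb.weight.T                    # (B, num_items + 1)
```

In this reading, short sequences are first padded on the left with pseudo-historical items and then fed, together with their timestamps, to the time-aware attention layer that produces next-item scores.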



Similar books and articles

Research on Context-Awareness Mobile SNS Recommendation Algorithm. Zhijun Zhang & Hong Liu - 2015 - Pattern Recognition and Artificial Intelligence 28.


