Sequence recommendation using multi-level self-attention network with gated spiking neural P systems

Title: Sequence recommendation using multi-level self-attention network with gated spiking neural P systems
Publication Type: Journal Paper
Year of Publication: 2023
Authors: Bai, X., Huang, Y., Peng, H., Wang, J., Yang, Q., Orellana-Martín, D., Ramírez-de-Arellano, A., & Pérez-Jiménez, M. J.
Journal Title: Information Sciences
Publisher: Elsevier
Place Published: Amsterdam (The Netherlands)
Pages: 119916
Abstract

Sequence recommendation aims to predict a user's next potentially interesting items and behaviors. It considers not only the user's independent interaction behaviors but also the user's historical behavior sequence. However, sequence recommendation still faces challenges: existing models fall short in capturing long-term dependencies and in fully exploiting contextual information. To address these challenges, we propose a four-channel model based on a multi-level self-attention network with gated spiking neural P (GSNP) systems, termed the SR-MAG model. The four channels are divided into two groups, each composed of an attention channel and a GSNP attention channel. The two groups process long-term and short-term sequences respectively to obtain long-term or short-term attention-channel features. These features are then passed through a self-attention network to effectively extract user context information. The proposed SR-MAG model is evaluated on three real-world datasets and compared with 10 baseline methods. Experimental results demonstrate the effectiveness of the proposed SR-MAG model in sequence recommendation tasks.
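The abstract describes attention channels applied over a user's long- and short-term behavior sequences. As context, a minimal NumPy sketch of the generic scaled dot-product self-attention these channels build on (this is the standard mechanism, not the paper's SR-MAG implementation; all names and dimensions below are illustrative) might look like:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one behavior sequence.

    X: (seq_len, d_model) item embeddings of a user's interaction sequence.
    Wq, Wk, Wv: (d_model, d_k) projection matrices (illustrative parameters).
    Returns (seq_len, d_k) context-aware sequence features.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # pairwise attention logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the sequence
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                  # 5 interactions, embedding dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 4): one context-aware feature vector per interaction
```

Each output row mixes information from every position in the sequence, which is how such channels capture dependencies beyond a user's most recent interaction.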

Keywords: Gated spiking neural P systems, Nonlinear spiking neural P systems, Self-attention mechanism, Sequence recommendation
URL: https://www.sciencedirect.com/science/article/pii/S0020025523015013
ISSN Number: 0020-0255
DOI: https://doi.org/10.1016/j.ins.2023.119916