<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="6.x">Drupal-Biblio</source-app><ref-type>13</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Xinzhu Bai</style></author><author><style face="normal" font="default" size="100%">Yanping Huang</style></author><author><style face="normal" font="default" size="100%">Hong Peng</style></author><author><style face="normal" font="default" size="100%">Jun Wang</style></author><author><style face="normal" font="default" size="100%">Qian Yang</style></author><author><style face="normal" font="default" size="100%">David Orellana-Martín</style></author><author><style face="normal" font="default" size="100%">Antonio Ramírez-de-Arellano</style></author><author><style face="normal" font="default" size="100%">Mario J. Pérez-Jiménez</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Sequence recommendation using multi-level self-attention network with gated spiking neural P systems</style></title><secondary-title><style face="normal" font="default" size="100%">Information Sciences</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Gated spiking neural P systems</style></keyword><keyword><style  face="normal" font="default" size="100%">Nonlinear spiking neural P systems</style></keyword><keyword><style  face="normal" font="default" size="100%">Self-attention mechanism</style></keyword><keyword><style  face="normal" font="default" size="100%">Sequence recommendation</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2023</style></year></dates><urls><web-urls><url><style face="normal" font="default" size="100%">https://www.sciencedirect.com/science/article/pii/S0020025523015013</style></url></web-urls></urls><publisher><style face="normal" font="default" 
size="100%">Elsevier</style></publisher><pub-location><style face="normal" font="default" size="100%">Amsterdam (The Netherlands)</style></pub-location><pages><style face="normal" font="default" size="100%">119916</style></pages><abstract><style face="normal" font="default" size="100%">Sequence recommendation predicts the items and behaviors a user is likely to be interested in next. It considers not only the user's individual interaction behavior but also the user's historical behavior sequence. However, sequence recommendation still faces challenges: existing models fall short in capturing long-term dependencies and in fully exploiting contextual information. To address these challenges, we propose a four-channel model based on a multi-level self-attention network with gated spiking neural P (GSNP) systems, termed the SR-MAG model. The four channels are divided into two groups, each composed of an attention channel and a GSNP attention channel. The two groups process long-term and short-term sequences, respectively, to obtain long-term and short-term attention channel features. These features are then passed through a self-attention network to effectively extract user context information. The proposed SR-MAG model is evaluated on three real-world datasets and compared with 10 baseline methods. Experimental results demonstrate its effectiveness in sequence recommendation tasks.</style></abstract></record></records></xml>