[1] HAWKES A G. Spectra of some self-exciting and mutually exciting point processes[J]. Biometrika, 1971, 58(1): 83-90.
[2] RIZOIU M A, LEE Y, MISHRA S, et al. A tutorial on Hawkes processes for events in social media[J]. arXiv:1708.06401, 2017.
[3] LECUN Y, BENGIO Y, HINTON G. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
[4] OMI T, AIHARA K. Fully neural network based model for general temporal point processes[C]//Advances in Neural Information Processing Systems, 2019.
[5] LI S, XIAO S, ZHU S, et al. Learning temporal point processes via reinforcement learning[C]//Advances in Neural Information Processing Systems, 2018.
[6] MA C, KANG P, LIU X. Hierarchical gating networks for sequential recommendation[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019: 825-833.
[7] TANJIM M M, et al. DynamicRec: a dynamic convolutional network for next item recommendation[C]//Proceedings of the 29th ACM International Conference on Information and Knowledge Management, 2020.
[8] DU N, DAI H, TRIVEDI R, et al. Recurrent marked temporal point processes: embedding event history to vector[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016: 1555-1564.
[9] ZHANG Q, LIPANI A, KIRNAP O, et al. Self-attentive Hawkes process[C]//International Conference on Machine Learning, 2020: 11183-11193.
[10] REN R, LIU Z, LI Y, et al. Sequential recommendation with self-attentive multi-adversarial network[C]//Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, 2020: 89-98.
[11] GUO L, LI Q J, LIU F A, et al. Shared-account cross-domain sequential recommendation with self-attention network[J]. Journal of Computer Research and Development, 2021, 58(11): 2524-2537.
[12] ZHANG Z L, ZHANG Y. Improved FA optimizing LSTM time series prediction model[J]. Computer Engineering and Applications, 2022, 58(11): 125-132.
[13] PASCANU R, MIKOLOV T, BENGIO Y. On the difficulty of training recurrent neural networks[C]//International Conference on Machine Learning, 2013: 1310-1318.
[14] XIAO S, YAN J, YANG X, et al. Modeling the intensity function of point process via recurrent neural networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2017.
[15] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[16] ZUO S, JIANG H, LI Z, et al. Transformer Hawkes process[C]//International Conference on Machine Learning, 2020: 11692-11702.
[17] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems, 2017.
[18] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv:1810.04805, 2018.
[19] MEI H, EISNER J M. The neural Hawkes process: a neurally self-modulating multivariate point process[C]//Advances in Neural Information Processing Systems, 2017.
[20] MEI H, WAN T, EISNER J. Noise-contrastive estimation for multivariate point processes[C]//Advances in Neural Information Processing Systems, 2020: 5204-5214.
[21] BOYD A, BAMLER R, MANDT S, et al. User-dependent neural sequence models for continuous-time event data[C]//Advances in Neural Information Processing Systems, 2020: 21488-21499.
[22] GUPTA V, BEDATHUR S, BHATTACHARYA S, et al. Learning temporal point processes with intermittent observations[C]//International Conference on Artificial Intelligence and Statistics, 2021: 3790-3798.
[23] LE Q, MIKOLOV T. Distributed representations of sentences and documents[C]//International Conference on Machine Learning, 2014: 1188-1196.
[24] WANG S, HU L, CAO L, et al. Attention-based transactional context embedding for next-item recommendation[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2018.
[25] DAUPHIN Y N, FAN A, AULI M, et al. Language modeling with gated convolutional networks[C]//International Conference on Machine Learning, 2017: 933-941.
[26] YANG D, ZHANG D, ZHENG V W, et al. Modeling user activity preference by leveraging user spatial temporal characteristics in LBSNs[J]. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2014, 45(1): 129-142.
[27] CHO E, MYERS S A, LESKOVEC J. Friendship and mobility: user movement in location-based social networks[C]//Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2011: 1082-1090.
[28] LESKOVEC J, BACKSTROM L, KLEINBERG J. Meme-tracking and the dynamics of the news cycle[C]//Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2009: 497-506.