[1] ENGUEHARD J, BUSBRIDGE D, BOZSON A, et al. Neural temporal point processes for modelling electronic health records[C]//Proceedings of the Machine Learning for Health, 2020: 85-113.
[2] BACRY E, MASTROMATTEO I, MUZY J F. Hawkes processes in finance[J]. Market Microstructure and Liquidity, 2015, 1(1): 1550005.
[3] OGATA Y. On Lewis’ simulation method for point processes[J]. IEEE Transactions on Information Theory, 1981, 27(1): 23-31.
[4] KOBAYASHI R, LAMBIOTTE R. TiDeH: time-dependent Hawkes process for predicting retweet dynamics[C]//Proceedings of the International AAAI Conference on Web and Social Media, 2016: 191-200.
[5] HAWKES A G. Spectra of some self-exciting and mutually exciting point processes[J]. Biometrika, 1971, 58(1): 83-90.
[6] BESSY-ROLAND Y, BOUMEZOUED A, HILLAIRET C. Multivariate Hawkes process for cyber insurance[J]. Annals of Actuarial Science, 2021, 15(1): 14-39.
[7] DU N, DAI H, TRIVEDI R, et al. Recurrent marked temporal point processes: embedding event history to vector[C]//Proceedings of the 22nd International Conference on Knowledge Discovery and Data Mining, 2016: 1555-1564.
[8] MEI H, EISNER J M. The neural Hawkes process: a neurally self-modulating multivariate point process[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017: 6757-6767.
[9] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[10] CHUNG J, GULCEHRE C, CHO K H, et al. Empirical evaluation of gated recurrent neural networks on sequence modeling[J]. arXiv:1412.3555, 2014.
[11] PASCANU R, MIKOLOV T, BENGIO Y. On the difficulty of training recurrent neural networks[C]//Proceedings of the International Conference on Machine Learning, 2013: 1310-1318.
[12] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[J]. arXiv:1706.03762, 2017.
[13] CHEN W, XING X, XU X, et al. SpeechFormer: a hierarchical efficient framework incorporating the characteristics of speech[J]. arXiv:2203.03812, 2022.
[14] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: Transformers for image recognition at scale[J]. arXiv:2010.11929, 2020.
[15] CARION N, MASSA F, SYNNAEVE G, et al. End-to-end object detection with transformers[C]//Proceedings of the 16th European Conference on Computer Vision, 2020: 213-229.
[16] ZHENG S, LU J, ZHAO H, et al. Rethinking semantic segmentation from a sequence-to-sequence perspective with transformers[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021: 6881-6890.
[17] ZHANG Q, LIPANI A, KIRNAP O, et al. Self-attentive Hawkes process[C]//Proceedings of the International Conference on Machine Learning, 2020: 11183-11193.
[18] ZUO S, JIANG H, LI Z, et al. Transformer Hawkes process[C]//Proceedings of the International Conference on Machine Learning, 2020: 11692-11702.
[19] AHMED K, KESKAR N S, SOCHER R. Weighted transformer network for machine translation[J]. arXiv:1711.02132, 2017.
[20] OGATA Y. Seismicity analysis through point-process modeling: a review[J]. Pure and Applied Geophysics, 1999, 155: 471-507.
[21] KWON J, ZHENG Y, JUN M. Flexible spatio-temporal Hawkes process models for earthquake occurrences[J]. Spatial Statistics, 2023, 54: 100728.
[22] KIRCHNER M. An estimation procedure for the Hawkes process[J]. Quantitative Finance, 2017, 17(4): 571-595.
[23] LI X, GENEST C, JALBERT J. A self-exciting marked point process model for drought analysis[J]. Environmetrics, 2021, 32(8): 2697.
[24] LEE K. Multi-kernel property in high-frequency price dynamics under Hawkes model[J]. arXiv:2302.11822, 2023.
[25] XIAO S, YAN J, YANG X, et al. Modeling the intensity function of point process via recurrent neural networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2017: 1597-1603.
[26] ZHANG L N, LIU J W, SONG Z Y, et al. Universal transformer Hawkes process[C]//Proceedings of the International Joint Conference on Neural Networks, 2021: 1-7.
[27] ZHANG L N, LIU J W, SONG Z Y, et al. Temporal attention augmented transformer Hawkes process[J]. arXiv:2112.14472, 2021.
[28] LI S, JIN X, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019: 5243-5253.
[29] ROBERT C P, CASELLA G. Monte Carlo statistical methods[M]. New York: Springer, 1999.
[30] HILDEBRAND F B. Introduction to numerical analysis[M]. New York: Courier Corporation, 1987.
[31] LESKOVEC J, KREVL A. SNAP Datasets: Stanford large network dataset collection[DB/OL]. (2014-06-09)[2023-08-01]. https://snap.stanford.edu/data/.
[32] JOHNSON A E W, POLLARD T J, SHEN L, et al. MIMIC-III, a freely accessible critical care database[J]. Scientific Data, 2016, 3(1): 1-36.
[33] ZHAO Q, ERDOGDU M A, HE H Y, et al. SEISMIC: a self-exciting point process model for predicting tweet popularity[C]//Proceedings of the 21st International Conference on Knowledge Discovery and Data Mining, 2015: 1513-1522.