Automatic Text Summarization Technology Based on ALBERT-UniLM Model
SUN Baoshan, TAN Hao
1. School of Computer Science and Technology, Tiangong University, Tianjin 300387, China
2. Tianjin Key Laboratory of Autonomous Intelligence Technology and Systems, Tiangong University, Tianjin 300387, China
SUN Baoshan, TAN Hao. Automatic Text Summarization Technology Based on ALBERT-UniLM Model[J]. Computer Engineering and Applications, 2022, 58(15): 184-190.
[1] RUSH A M,CHOPRA S,WESTON J.A neural attention model for abstractive sentence summarization[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing,2015:379-389.
[2] NALLAPATI R,ZHOU B,DOS SANTOS C,et al.Abstractive text summarization using sequence-to-sequence RNNs and beyond[C]//Proceedings of The 20th SIGNLL Conference on Computational Natural Language Learning,2016:280-290.
[3] SEE A,LIU P J,MANNING C D.Get to the point:summarization with pointer-generator networks[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics(Volume 1:Long Papers),2017:1073-1083.
[4] TAN J,WAN X,XIAO J.Abstractive document summarization with a graph-based attentional neural model[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics(Volume 1:Long Papers),2017:1171-1181.
[5] LIN J,SUN X,MA S,et al.Global encoding for abstractive summarization[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics(Volume 2:Short Papers),2018:163-169.
[6] REN P,CHEN Z,REN Z,et al.Leveraging contextual sentence relations for extractive summarization using a neural attention model[C]//Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval,2017:95-104.
[7] YU L,ZHANG W,WANG J,et al.SeqGAN:sequence generative adversarial nets with policy gradient[C]//Proceedings of the 31st AAAI Conference on Artificial Intelligence,2017.
[8] DEVLIN J,CHANG M W,LEE K,et al.BERT:pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies(Volume 1:Long and Short Papers),2019:4171-4186.
[9] LIU G,GUO J.Bidirectional LSTM with attention mechanism and convolutional layer for text classification[J].Neurocomputing,2019,337:325-338.
[10] LAN Z,CHEN M,GOODMAN S,et al.ALBERT:a lite BERT for self-supervised learning of language representations[C]//International Conference on Learning Representations,2020.
[11] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems,2017:6000-6010.
[12] ZHOU Q,YANG N,WEI F,et al.Neural document summarization by jointly learning to score and select sentences[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics(Volume 1:Long Papers),2018:654-663.
[13] WANG H,WANG X,XIONG W,et al.Self-supervised learning for contextualized extractive summarization[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics,2019:2221-2227.
[14] LIU Y.Fine-tune BERT for extractive summarization[J].arXiv:1903.10318,2019.
[15] WANG D,LIU P,ZHENG Y,et al.Heterogeneous graph neural networks for extractive document summarization[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics,2020:6209-6219.
[16] CHI P H,CHUNG P H,WU T H,et al.Audio ALBERT:a lite BERT for self-supervised learning of audio representation[C]//2021 IEEE Spoken Language Technology Workshop(SLT),2021:344-350.
[17] DONG L,YANG N,WANG W,et al.Unified language model pre-training for natural language understanding and generation[J].arXiv:1905.03197,2019.
[18] GEHRMANN S,DENG Y,RUSH A M.Bottom-up abstractive summarization[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing,2018:4098-4109.
[19] PETERS M,NEUMANN M,IYYER M,et al.Deep contextualized word representations[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies(Volume 1:Long Papers),2018:2227-2237.
[20] TAM Y C.Cluster-based beam search for pointer-generator chatbot grounded by knowledge[J].Computer Speech & Language,2020,64:101094.
[21] KAHNG M,ANDREWS P Y,KALRO A,et al.ActiVis:visual exploration of industry-scale deep neural network models[J].IEEE Transactions on Visualization and Computer Graphics,2017,24(1):88-97.
[22] LI W,XIAO X,LYU Y,et al.Improving neural abstractive document summarization with explicit information selection modeling[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing,2018:1787-1796.
[23] REDDY S,CHEN D,MANNING C D.CoQA:a conversational question answering challenge[J].Transactions of the Association for Computational Linguistics,2019,7:249-266.
[24] LI X,CHEN S,HU X,et al.Understanding the disharmony between dropout and batch normalization by variance shift[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition,2019:2682-2690.
[25] LIN C Y.ROUGE:a package for automatic evaluation of summaries[C]//Proceedings of the Workshop on Text Summarization Branches Out,2004:74-81.