Microblog Sentiment Analysis Based on BERT and Hierarchical Attention
ZHAO Hong, FU Zhaoyang, ZHAO Fan
1.School of Computer and Communication, Lanzhou University of Technology, Lanzhou 730050, China
2.Gansu Institute of Science and Technology Information, Lanzhou 730000, China
ZHAO Hong, FU Zhaoyang, ZHAO Fan. Microblog Sentiment Analysis Based on BERT and Hierarchical Attention[J]. Computer Engineering and Applications, 2022, 58(5): 156-162.
[1] YUAN J,SHI J,CHE J,et al.Modeling and simulation analysis of public opinion polarization in a dynamic network environment[J].Concurrency and Computation:Practice and Experience,2020,32(19):e5771.
[2] CHEN Xingshu,CHANG Tianyou,WANG Haizhou,et al.Spatial and temporal analysis on public opinion evolution of epidemic situation about novel coronavirus pneumonia based on micro-blog data[J].Journal of Sichuan University(Natural Science Edition),2020,57(2):409-416.
[3] PANG B,LEE L.A sentimental education:sentiment analysis using subjectivity summarization based on minimum cuts[C]//Proc of the ACL 2004.Morristown:ACL,2004:271-278.
[4] ALHARBI N M,ALGHAMDI N S,ALKHAMMASH E H,et al.Evaluation of sentiment analysis via word embedding and RNN variants for amazon online reviews[J].Mathematical Problems in Engineering,2021:532-543.
[5] LEE G T,KIM C O,SONG M.Semisupervised sentiment analysis method for online text reviews[J].Journal of Information Science,2021,47(3):387-403.
[6] YU Tongrui,JIN Ran,HAN Xiaozhen,et al.Review of pre-training models for natural language processing[J].Computer Engineering and Applications,2020,56(23):12-22.
[7] KIM Y.Convolutional neural networks for sentence classification[C]//Proceedings of 2014 Conference on Empirical Methods in Natural Language Processing,2014:1746-1751.
[8] LIU Longfei,YANG Liang,ZHANG Shaowu,et al.Convolutional neural networks for Chinese micro-blog sentiment analysis[J].Journal of Chinese Information Processing,2015,29(6):159-165.
[9] MIKOLOV T,SUTSKEVER I,CHEN K,et al.Distributed representations of words and phrases and their compositionality[C]//Advances in Neural Information Processing Systems,2013:3111-3119.
[10] SAK H,SENIOR A W,BEAUFAYS F.Long short-term memory recurrent neural network architectures for large scale acoustic modeling[C]//Proceedings of the 15th Annual Conference of the International Speech Communication Association.Singapore:ISCA,2014:338-342.
[11] FANG Jiongkun,CHEN Pinghua,LIAO Wenxiong.Text classification model based on GloVe and GRU[J].Computer Engineering and Applications,2020,56(20):98-103.
[12] TIAN Zhu.Research on sentiment analysis based on deep feature representation[D].Jinan:Shandong University,2017.
[13] TANG D,QIN B,LIU T.Document modeling with gated recurrent neural network for sentiment classification[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing,2015:1422-1432.
[14] BAHDANAU D,CHO K,BENGIO Y.Neural machine translation by jointly learning to align and translate[C]//Proc of the 3rd International Conference on Learning Representations,2015:1-15.
[15] LUONG M T,PHAM H,MANNING C D.Effective approaches to attention-based neural machine translation[C]//Proc of Conference on Empirical Methods in Natural Language Processing,2015:1412-1421.
[16] YANG Z,YANG D,DYER C,et al.Hierarchical attention networks for document classification[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies,2016:1480-1489.
[17] DEVLIN J,CHANG M W,LEE K,et al.BERT:pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.Stroudsburg,PA:Association for Computational Linguistics,2019:4171-4186.
[18] SUTSKEVER I,VINYALS O,LE Q V.Sequence to sequence learning with neural networks[C]//Advances in Neural Information Processing Systems,2014:3104-3112.
[19] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Advances in Neural Information Processing Systems,2017:5998-6008.
[20] DEY R,SALEM F M.Gate-variants of gated recurrent unit(GRU) neural networks[C]//2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS),2017:1597-1600.