
Sentiment Classification Method Based on BiGRU and Aspect Attention Module
Abstract: Aspect-level sentiment analysis is widely used in text information mining, but the commonly used LSTM (Long Short-Term Memory) recurrent neural network cannot fully learn contextual information when handling aspect-level sentiment classification, and it suffers from high computational complexity and long training times. To address these problems, this paper proposes a sentiment classification method that combines a bidirectional gated recurrent network (BiGRU) with an aspect attention module. The BiGRU has fewer parameters and trains faster, and it effectively extracts deep textual features; combining the attention operation with aspect information fully captures the information relevant to a specific aspect. Experimental results on the SemEval (Semantic Evaluation) datasets show that, compared with existing aspect-level sentiment analysis methods, the proposed method improves processing speed and sentiment classification performance.
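The aspect attention step described in the abstract can be illustrated with a minimal sketch: each BiGRU hidden state is scored against an aspect embedding, the scores are normalized with softmax, and the weighted sum of hidden states forms the aspect-specific context vector. The function names, the dot-product scoring, and the toy numbers below are illustrative assumptions, not the paper's exact formulation (the paper may use a learned bilinear score or concatenation).

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def aspect_attention(hidden_states, aspect_vec):
    """Score each BiGRU hidden state against the aspect embedding
    (dot product here, as a simplifying assumption), normalize with
    softmax, and return the attention weights plus the weighted
    context vector used for classification."""
    scores = [sum(h_d * a_d for h_d, a_d in zip(h, aspect_vec))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    context = [sum(w * h[d] for w, h in zip(weights, hidden_states))
               for d in range(dim)]
    return weights, context

# Toy example: 3 time steps, hidden dimension 2 (hypothetical values).
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
aspect = [1.0, 0.0]
w, c = aspect_attention(H, aspect)
```

In a full model, `hidden_states` would come from a bidirectional GRU over the word embeddings and `context` would feed a softmax classifier over the sentiment polarities; here the attention weights favor the time steps whose hidden states align with the aspect vector.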
Authors: SONG Huanmin; ZHANG Yunhua (School of Information Science and Technology, Zhejiang Sci-Tech University, Hangzhou 310018, China)
Source: Intelligent Computer and Applications, 2020, No. 11, pp. 83-87 (5 pages)
Keywords: aspect-based sentiment analysis; natural language processing; text sentiment classification; attention mechanism; bidirectional gated recurrent unit (BiGRU)
About the authors: SONG Huanmin (b. 1995), male, master's student; research interests: intelligent information processing. ZHANG Yunhua (b. 1965), male, Ph.D., professor; research interests: software engineering and intelligent information processing.

