Sentiment Analysis of Text Based on BERT-BiLSTM-BiGRU-CNN Model

DOI: 10.12677/MOS.2024.131005, pp. 42-49

Keywords: Sentiment Analysis, Self-Attention, BiLSTM, BiGRU


Abstract:

To address the problem that current sentiment classification models cannot sufficiently extract salient features from short texts, we propose BERT-BiLSTM-BiGRU-CNN, a sentiment analysis model that combines multiple features with an attention mechanism. First, the BERT pre-trained language model is used for text representation. Next, a bidirectional long short-term memory network (BiLSTM), a bidirectional gated recurrent unit (BiGRU), and a one-dimensional convolutional neural network (CNN) are integrated to extract text features, and a self-attention mechanism is added to better capture context. Finally, the proposed model is trained and validated on the Amazon review dataset. Experimental results show that the proposed model outperforms existing models in accuracy, recall, and F1 score, performing particularly well in binary sentiment classification.
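
As a rough illustration of the architecture described in the abstract, the PyTorch sketch below feeds contextual BERT embeddings into parallel BiLSTM, BiGRU, and 1D-CNN branches, concatenates their outputs, applies self-attention, and pools the result for binary classification. This is a minimal sketch under stated assumptions: the checkpoint name (bert-base-uncased), hidden sizes, number of attention heads, mean pooling, and the parallel-then-concatenate fusion are illustrative choices not specified in the abstract.

```python
# Hypothetical sketch of a BERT-BiLSTM-BiGRU-CNN sentiment classifier.
# Layer sizes and the fusion strategy are assumptions for illustration only.
import torch
import torch.nn as nn
from transformers import BertModel

class BertBiLstmBiGruCnn(nn.Module):
    def __init__(self, hidden=128, cnn_channels=256, num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")  # text representation
        dim = self.bert.config.hidden_size                          # 768 for bert-base
        self.bilstm = nn.LSTM(dim, hidden, batch_first=True, bidirectional=True)
        self.bigru = nn.GRU(dim, hidden, batch_first=True, bidirectional=True)
        self.cnn = nn.Conv1d(dim, cnn_channels, kernel_size=3, padding=1)
        fused = 2 * hidden + 2 * hidden + cnn_channels              # concatenated branch width
        # self-attention over the fused token features to model context
        self.attn = nn.MultiheadAttention(fused, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(fused, num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, 768) contextual token embeddings from BERT
        x = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.bilstm(x)                                # (batch, seq, 2*hidden)
        gru_out, _ = self.bigru(x)                                  # (batch, seq, 2*hidden)
        cnn_out = torch.relu(self.cnn(x.transpose(1, 2))).transpose(1, 2)  # (batch, seq, channels)
        feats = torch.cat([lstm_out, gru_out, cnn_out], dim=-1)     # multi-feature fusion
        attn_out, _ = self.attn(feats, feats, feats,
                                key_padding_mask=~attention_mask.bool())
        # masked mean pooling over valid tokens
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (attn_out * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        return self.classifier(pooled)                              # binary sentiment logits
```

Running the three branches in parallel and concatenating them is one plausible reading of "integrating" BiLSTM, BiGRU, and CNN; a stacked arrangement would also be consistent with the abstract, so treat the fusion step as a design assumption rather than the paper's exact method.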

