%0 Journal Article %T Sentiment Analysis of Text Based on BERT-BiLSTM-BiGRU-CNN Model %A 朱昆 %A 刘姜 %A 倪枫 %A 朱佳怡 %J Modeling and Simulation %P 42-49 %@ 2324-870X %D 2024 %I Hans Publishing %R 10.12677/MOS.2024.131005 %X
To overcome the shortcoming that current sentiment classification models cannot sufficiently extract salient features from short texts, a new sentiment analysis model, BERT-BiLSTM-BiGRU-CNN, which combines multiple features with an attention mechanism, is proposed. Firstly, we use the BERT pre-trained language model for text representation. Next, we integrate bidirectional long short-term memory network (BiLSTM), bidirectional gated recurrent unit (BiGRU), and one-dimensional convolutional neural network (CNN) models for text feature extraction, and incorporate a self-attention mechanism for a better understanding of context. Finally, we train and validate the proposed model on the Amazon review dataset. Experimental results demonstrate that our model outperforms existing models in terms of accuracy, recall, and F1 score, exhibiting superior performance in binary sentiment classification analysis. %K Sentiment Analysis %K Self-Attention %K BiLSTM %K BiGRU %U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=78828