Although Convolutional Neural Networks (CNN) can extract local features and Long Short-Term Memory networks
(LSTM) can extract global features, both achieving good classification results, CNN struggles to capture the global contextual
information of a text, and LSTM tends to overlook the hidden feature information between words. Therefore, this paper proposes
to use the CNN_BiLSTM_Attention parallel model for text sentiment classification. First, CNN is used to extract local features,
while BiLSTM is used to extract global features carrying contextual semantic information, and the features extracted by the two
are then concatenated for feature fusion. This allows the model to capture both local phrase-level features and contextual structure
information, while an attention mechanism assigns different weights according to the importance of feature words, thereby improving
the model's classification performance. Compared with single models such as CNN or LSTM, the proposed CNN_BiLSTM_Attention
parallel model improves both the comprehensive evaluation metric F1 score and the accuracy. The experimental results
show that the proposed model achieves better results on text sentiment classification tasks than other neural networks,
and has better practical value.
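
The parallel structure described above can be summarized in a short sketch. The PyTorch code below is only illustrative: the layer sizes, kernel width, and the simple linear token-scoring attention are assumptions for exposition, not the configuration reported in the paper.

```python
# Minimal sketch of a parallel CNN + BiLSTM model with attention over the
# fused token features; all hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CNNBiLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_filters=100,
                 kernel_size=3, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # CNN branch: captures local, phrase-level n-gram features.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # BiLSTM branch: captures contextual information in both directions.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        fused_dim = num_filters + 2 * hidden_dim
        # Attention: scores each token position of the fused representation.
        self.attn = nn.Linear(fused_dim, 1)
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed_dim)
        local = F.relu(self.conv(x.transpose(1, 2)))   # (batch, filters, seq_len)
        local = local.transpose(1, 2)                  # (batch, seq_len, filters)
        global_, _ = self.bilstm(x)                    # (batch, seq_len, 2*hidden)
        fused = torch.cat([local, global_], dim=-1)    # feature fusion by concatenation
        scores = self.attn(fused).squeeze(-1)          # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)        # importance weight per word
        context = (weights.unsqueeze(-1) * fused).sum(dim=1)  # weighted sum
        return self.classifier(context)                # sentiment logits


# Example: classify a batch of two dummy token sequences.
model = CNNBiLSTMAttention(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 40)))
print(logits.shape)  # torch.Size([2, 2])
```

The key design choice in this sketch is that the two branches run in parallel on the same embedded sequence and are fused by concatenation per token, so the attention layer weighs words using both local and contextual evidence before classification.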