A Multi-Channel Text Sentiment Analysis Model Integrating Pre-training Mechanism

Resource type
Journal Article
Authors/contributors
Liang, S., Jin, J., Du, W., & Qu, S.
Title
A Multi-Channel Text Sentiment Analysis Model Integrating Pre-training Mechanism
Abstract
The number of tourist attraction reviews, travel notes, and other texts has grown exponentially in the Internet age. Effectively mining users' latent opinions and emotions about tourist attractions, and thereby helping to provide users with better recommendation services, is of great practical significance. This paper proposes a multi-channel neural network model, Pre-BiLSTM, combined with a pre-training mechanism. The model uses a combination of coarse- and fine-granularity strategies to extract features from texts such as reviews and travel notes, improving the performance of text sentiment analysis. First, we construct three channels and use the improved BERT and the skip-gram method with negative sampling to vectorize the word-level and vocabulary-level text, respectively, so as to obtain richer textual information. Second, we use BERT's pre-training mechanism to generate deep bidirectional language representations. Third, the vectors of the three channels are fed into the BiLSTM network in parallel to extract global and local features. Finally, the model fuses the text features of the three channels and classifies them with a softmax classifier. Numerical experiments demonstrate that Pre-BiLSTM outperforms the baselines by 6.27%, 12.83%, and 18.12% on average in terms of accuracy, precision, and F1-score, respectively.
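The final steps of the pipeline described in the abstract (per-channel feature vectors → fusion → softmax classification) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the embedding and BiLSTM stages are abstracted away, and all dimensions, weights, and function names are assumptions introduced for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def fuse_and_classify(channel_feats, W, b):
    """Concatenate per-channel feature vectors (stand-ins for BiLSTM
    outputs) and apply a linear layer followed by softmax, mirroring
    the fusion-then-classify step described in the abstract."""
    fused = np.concatenate(channel_feats, axis=-1)
    return softmax(fused @ W + b)

# Three channels, each yielding a 4-dim feature vector for one text
# (sizes are illustrative only).
feats = [rng.standard_normal(4) for _ in range(3)]
W = rng.standard_normal((12, 2))  # 12 fused dims -> 2 sentiment classes
b = np.zeros(2)
probs = fuse_and_classify(feats, W, b)  # class probabilities, sums to 1
```

In a full model, each 4-dim placeholder would instead be the pooled output of a BiLSTM run over one channel's embeddings (improved-BERT or skip-gram vectors), and `W`, `b` would be trained jointly with the rest of the network.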
Publication
Information Technology and Control
Volume
52
Issue
2
Pages
263-275
Date
2023-07-15
Language
en
ISSN
2335-884X
Accessed
2024-01-13, 6:20 AM
Library Catalog
itc.ktu.lt
Rights
Copyright (c) 2023 Information Technology and Control
Citation
Liang, S., Jin, J., Du, W., & Qu, S. (2023). A Multi-Channel Text Sentiment Analysis Model Integrating Pre-training Mechanism. Information Technology and Control, 52(2), 263–275. https://doi.org/10.5755/j01.itc.52.2.31803