  • The number of tourist-attraction reviews, travel notes, and other texts has grown exponentially in the Internet age. Effectively mining users' latent opinions and emotions about tourist attractions, so as to provide users with better recommendation services, is of great practical significance. This paper proposes a multi-channel neural network model, Pre-BiLSTM, which combines BiLSTM with a pre-training mechanism. The model uses a combination of coarse- and fine-granularity strategies to extract features from texts such as reviews and travel notes, improving the performance of text sentiment analysis. First, we construct three channels and use an improved BERT and skip-gram with negative sampling to vectorize the text at the word level and the vocabulary level, respectively, obtaining richer textual information. Second, we use BERT's pre-training mechanism to generate deep bidirectional language representations. Third, the vectors of the three channels are fed into BiLSTM networks in parallel to extract global and local features. Finally, the model fuses the text features of the three channels and classifies them with a softmax classifier. Numerical experiments demonstrate that Pre-BiLSTM outperforms the baselines by 6.27%, 12.83%, and 18.12% on average in terms of accuracy, precision, and F1-score, respectively.
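The final fusion-and-classification step of the abstract above can be sketched as follows. This is a minimal plain-Python illustration, not the authors' implementation: the fusion strategy (simple concatenation of the three channel vectors), the dimensions, and all weights are assumptions for illustration only.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_and_classify(channels, weights, bias):
    # channels: three per-channel feature vectors (e.g. BiLSTM outputs).
    # Fusion here is plain concatenation -- an assumed stand-in for the
    # paper's fusion step, which the abstract does not specify in detail.
    fused = [x for ch in channels for x in ch]
    # One linear layer followed by softmax gives class probabilities.
    logits = [sum(w * x for w, x in zip(row, fused)) + b
              for row, b in zip(weights, bias)]
    return softmax(logits)

# Hypothetical example: three 2-dim channel vectors, two sentiment classes.
probs = fuse_and_classify(
    channels=[[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]],
    weights=[[0.1, 0.2, 0.3, 0.1, 0.2, 0.3],
             [0.3, 0.2, 0.1, 0.3, 0.2, 0.1]],
    bias=[0.0, 0.0],
)
```

The returned list sums to 1 and can be read directly as class probabilities for the fused representation.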

  • Text classification is an important topic in natural language processing. With the development of social networks, many question-and-answer pairs about healthcare and medicine flood social platforms, and mining and classifying medical text to provide targeted medical services for patients is of great social value. Existing text-classification algorithms can handle text with simple semantics, but Chinese medical text in particular has a complex structure and contains a large amount of medical nomenclature and professional terminology that is difficult for patients to understand. We propose a Chinese medical text classification model that combines ZEN, a BERT-based Chinese text encoder enhanced by N-gram representations, with a capsule network: ZEN represents the features and the capsule network extracts them. We also design an N-gram medical dictionary to enhance medical text representation and feature extraction. The experimental results show that the precision, recall, and F1-score of our model improve by 10.25%, 11.13%, and 12.29%, respectively, on average over the baseline models, which demonstrates its better performance.
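The N-gram medical dictionary mentioned above can be illustrated with a small sketch: extract n-grams from a tokenized sentence and look them up in a dictionary of known medical terms. This is a hypothetical, simplified illustration in plain Python; the dictionary contents, the tokenization, and the `max_n` range are assumptions, not the authors' design.

```python
def ngrams(tokens, n):
    # All contiguous n-grams of a token sequence, as tuples.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def match_medical_ngrams(tokens, dictionary, max_n=4):
    # dictionary: a set of multi-token medical terms, e.g. {("blood", "pressure")}.
    # Returns every dictionary term found in the token sequence, shortest first.
    hits = []
    for n in range(2, max_n + 1):
        for gram in ngrams(tokens, n):
            if gram in dictionary:
                hits.append(gram)
    return hits

# Hypothetical example with an English stand-in for a medical term.
hits = match_medical_ngrams(
    ["high", "blood", "pressure", "reading"],
    {("blood", "pressure")},
)
```

Matched terms like these could then be used to mark term boundaries or enrich the input representation before encoding, which is the role the abstract assigns to the N-gram dictionary.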

Last update from database: 5/2/24, 12:02 AM (UTC)