A cross-language short text classification model based on BERT and multilayer collaborative convolutional neural network (MCNN)
Qiong Hu

Molecular & Cellular Biomechanics, Journal year: 2024, Issue: 21(3), pp. 739 - 739

Published: Nov. 25, 2024

This study focuses on cross-lingual short text classification tasks and aims to combine the advantages of BERT and a Multi-layer Collaborative Convolutional Neural Network (MCNN) to build an efficient classification model. BERT provides rich semantic information for short texts through its powerful language understanding and bidirectional context modeling ability, while MCNN effectively extracts local and global features from the text through its multi-layer convolution structure and collaborative working mechanism. In this study, the BERT output is used as the input to MCNN to further mine deep features of the text and thereby realize high-precision classification of short texts. The experimental results show that the proposed model achieves a significant performance improvement on the dataset, providing a new and effective solution for cross-lingual short text classification tasks.
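
The abstract describes a pipeline in which BERT's contextual token representations are fed into a multi-layer collaborative CNN for classification. The paper's exact MCNN design is not given here, so the following is only a minimal sketch of that general pattern, assuming a multilingual BERT encoder and several parallel convolution branches over the BERT output; the class name BertMCNNClassifier, the kernel sizes, and the filter count are illustrative assumptions, not the authors' configuration.

    # Minimal sketch (assumed architecture): BERT token embeddings feeding a
    # multi-branch CNN text classifier. The paper's actual MCNN / collaborative
    # mechanism may differ; this only illustrates "BERT output as CNN input".
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class BertMCNNClassifier(nn.Module):
        def __init__(self, pretrained="bert-base-multilingual-cased",
                     num_classes=2, kernel_sizes=(2, 3, 4), num_filters=128):
            super().__init__()
            self.bert = AutoModel.from_pretrained(pretrained)
            hidden = self.bert.config.hidden_size
            # Parallel convolution branches capture n-gram (local) features
            # at several granularities over the contextual token sequence.
            self.convs = nn.ModuleList(
                nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes
            )
            self.dropout = nn.Dropout(0.1)
            self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

        def forward(self, input_ids, attention_mask):
            # BERT contextual token embeddings: (batch, seq_len, hidden)
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            x = out.last_hidden_state.transpose(1, 2)  # (batch, hidden, seq_len)
            # Each branch: convolve, activate, max-pool over time, then concatenate.
            pooled = [torch.relu(conv(x)).max(dim=-1).values for conv in self.convs]
            features = torch.cat(pooled, dim=-1)
            return self.fc(self.dropout(features))

    # Usage example with a multilingual tokenizer (hypothetical inputs)
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = BertMCNNClassifier(num_classes=3)
    batch = tokenizer(["short text one", "texte court"], padding=True, return_tensors="pt")
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([2, 3])

Concatenating the pooled branch outputs before the linear classifier is one common way to combine features from multiple convolution paths; the paper's "collaborative" fusion may be implemented differently.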

Language: English

Cited by: 0