A BERT–CNN–BIGRU HYBRID MODEL BASED ON INTEGRATION OF CONTEXTUAL AND LOCAL SEMANTIC FEATURES IN TEXT CLASSIFICATION


Publisher: Modern American Journals

Abstract

In this paper, a hybrid neural architecture integrating contextual and local semantic features is proposed to improve accuracy and robustness in text classification. The proposed model combines BERT-based contextual vector representations (embeddings), local semantic features extracted by a Convolutional Neural Network (CNN), and long-range sequence dependencies learned by a Bidirectional Gated Recurrent Unit (BiGRU). Semantic features at different levels are combined into a single representation through a feature fusion mechanism, and the final classification result is produced by a softmax activation function. Experimental results show that the proposed BERT–CNN–BiGRU model achieves higher accuracy and F1 scores than traditional word-vector-based models. This approach can be effectively applied to tasks such as sentiment analysis, topic classification, and automatic information analysis.
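The fusion architecture described above can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: BERT is assumed to be applied upstream, so the input `x` stands for precomputed BERT token embeddings, and all layer sizes (`n_filters`, `gru_hidden`, etc.) are hypothetical choices.

```python
import torch
import torch.nn as nn

class BertCnnBiGru(nn.Module):
    """Sketch of the BERT–CNN–BiGRU fusion described in the abstract.

    BERT itself is not loaded here; `x` is assumed to be precomputed
    BERT token embeddings of shape (batch, seq_len, hidden).
    """

    def __init__(self, hidden=768, n_filters=128, kernel=3,
                 gru_hidden=128, n_classes=2):
        super().__init__()
        self.conv = nn.Conv1d(hidden, n_filters, kernel, padding=kernel // 2)
        self.gru = nn.GRU(hidden, gru_hidden,
                          batch_first=True, bidirectional=True)
        self.fc = nn.Linear(n_filters + 2 * gru_hidden, n_classes)

    def forward(self, x):
        # Local semantic features: convolution over the token axis,
        # followed by max-pooling over time.
        local = torch.relu(self.conv(x.transpose(1, 2))).max(dim=2).values
        # Global sequence features: final forward/backward BiGRU states.
        _, h = self.gru(x)                      # h: (2, batch, gru_hidden)
        glob = torch.cat([h[0], h[1]], dim=1)
        # Feature fusion: concatenate both views, classify with softmax.
        fused = torch.cat([local, glob], dim=1)
        return torch.softmax(self.fc(fused), dim=1)

model = BertCnnBiGru()
probs = model(torch.randn(4, 16, 768))  # 4 mock "BERT-embedded" sequences
print(probs.shape)  # torch.Size([4, 2])
```

Concatenation is used here as the fusion mechanism for simplicity; the paper's fusion step may weight or project the feature streams differently.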
