Paper
1 June 2023
Bi-directional LSTM-attention Chinese text classification model based on subsampling optimization
Jingze Lv, Wuzhe Huang
Proceedings Volume 12718, International Conference on Cyber Security, Artificial Intelligence, and Digital Economy (CSAIDE 2023); 1271812 (2023) https://doi.org/10.1117/12.2681571
Event: International Conference on Cyber Security, Artificial Intelligence, and Digital Economy (CSAIDE 2023), 2023, Nanjing, China
Abstract
This paper proposes a text classification model based on a bi-directional LSTM network with an attention mechanism and subsampling at the word-vector stage. First, the Skip-gram model in Word2Vec is used for feature extraction; the bi-directional LSTM network, combined with subsampling, then extracts and classifies the key semantic information in the text. An attention mechanism is integrated to optimize the learned weights, strengthen feature transfer between layers, and focus attention on high-weight words. Experimental results on open datasets of different categories show that, compared with a single network, the proposed model improves the recognition and extraction of highly featured content and can effectively improve classification accuracy.
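The pipeline described above can be sketched as follows. This is a minimal illustration, assuming gensim for the Skip-gram word vectors (with frequent-word subsampling via the `sample` parameter) and PyTorch for the BiLSTM-attention classifier; the hyperparameters, the additive attention formulation, and the toy corpus are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of the Skip-gram + BiLSTM-attention pipeline (assumed implementation).
import numpy as np
import torch
import torch.nn as nn
from gensim.models import Word2Vec

# 1. Skip-gram (sg=1) word vectors with subsampling of frequent words (sample).
corpus = [["这", "是", "一条", "示例", "文本"], ["文本", "分类", "模型"]]  # toy tokenized corpus
w2v = Word2Vec(corpus, vector_size=128, sg=1, sample=1e-3, window=5, min_count=1)

class BiLSTMAttention(nn.Module):
    """Bi-directional LSTM encoder with attention pooling over time steps."""
    def __init__(self, embed_dim=128, hidden_dim=64, num_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # scores each time step
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                           # x: (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)                         # (batch, seq_len, 2*hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)  # attention weights over positions
        context = (weights * h).sum(dim=1)          # weighted sum of hidden states
        return self.fc(context)                     # class logits

# 2. Usage: embed a tokenized sentence with the Skip-gram vectors, then classify.
sent = ["文本", "分类", "模型"]
emb = torch.from_numpy(np.stack([w2v.wv[w] for w in sent])).unsqueeze(0)  # (1, seq_len, 128)
logits = BiLSTMAttention()(emb)
```

The attention layer here weights the BiLSTM hidden states so that high-weight words dominate the pooled representation passed to the classifier, which is the role the abstract attributes to the attention mechanism.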
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jingze Lv and Wuzhe Huang "Bi-directional LSTM-attention Chinese text classification model based on subsampling optimization", Proc. SPIE 12718, International Conference on Cyber Security, Artificial Intelligence, and Digital Economy (CSAIDE 2023), 1271812 (1 June 2023); https://doi.org/10.1117/12.2681571
KEYWORDS
Data modeling, Education and training, Classification systems, Feature extraction, Mathematical optimization, Statistical modeling, Semantics