Wenjie Cao 1,2, Ya-Zhou Shi 1, Huahai Qiu 1 and Bengong Zhang 1*

1 Research Center of Nonlinear Science, School of Mathematical and Physical Sciences, Wuhan Textile University, Wuhan, China
2 School of Computer Science and Artificial Intelligence, Wuhan Textile University, Wuhan, China

Abstract: Recurrent neural networks (RNNs) are widely used in time-series prediction and classification. However, they suffer from problems such as insufficient memory ability and difficulty in gradient back-propagation. To solve these problems, this paper proposes a new algorithm, called SS-RNN, which directly uses multiple historical states to predict the information at the current time. This enhances long-term memory ability and, along the time direction, improves the correlation between states at different moments. To incorporate the historical information, we design two different processing methods for the SS-RNN, treating the history in continuous and discontinuous ways, respectively. For each method, there are two ways of adding historical information: 1) direct addition, and 2) weighting followed by a function mapping into the activation function. This provides six pathways for fully and deeply exploring the effect and influence of historical information on RNNs. By comparing the average accuracy on real datasets against long short-term memory (LSTM), Bi-LSTM, gated recurrent units (GRUs), and MCNN, and by computing the main indexes (Accuracy, Precision, Recall, and F1-score), it can be observed that our method improves the average accuracy, optimizes the structure of the recurrent neural network, and effectively alleviates the exploding- and vanishing-gradient problems.

Introduction: Data classification is one of the most important tasks in many applications, such as text categorization, tone recognition, image classification, microarray gene expression, and protein structure prediction (Choi et al., 2017; Johnson and Zhang, 2017; Malhotra et al., 2017; Aggarwal et al., 2018; Fang et al., 2018; Mikołajczyk and Grochowski, 2018; Kerkeni et al., 2019; Saritas and Yasar, 2019; Yildirim et al., 2019; Chandrasekar et al., 2020).
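The core idea described above, feeding several historical hidden states directly into the current state update rather than only the immediately preceding one, can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's actual formulation: the function name `ss_rnn_step`, the weight matrices, the history window size, and the choice of tanh are all hypothetical, and only the "direct addition with weighting" pathway is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, window = 4, 8, 3  # hypothetical input size, hidden size, history window

# Small random weights (illustrative initialization only)
Wx = rng.normal(0, 0.1, (d_h, d_in))    # input-to-hidden weights
Wh = rng.normal(0, 0.1, (d_h, d_h))     # standard recurrent weights
Wskip = rng.normal(0, 0.1, (d_h, d_h))  # weights applied to historical states
b = np.zeros(d_h)

def ss_rnn_step(x_t, h_hist):
    """One recurrent step that also sums a weighted window of past states."""
    h_prev = h_hist[-1]
    # Weighted direct addition of the last few hidden states before the activation
    skip = sum(Wskip @ h for h in h_hist[-window:])
    return np.tanh(Wx @ x_t + Wh @ h_prev + skip + b)

# Unroll over a short random input sequence
h_hist = [np.zeros(d_h)]
for t in range(10):
    h_hist.append(ss_rnn_step(rng.normal(size=d_in), h_hist))

print(h_hist[-1].shape)
```

Because each update sees a window of earlier states, gradient paths to those states are shorter than in a plain RNN, which is the intuition behind the claimed relief of vanishing gradients; the paper's continuous vs. discontinuous variants differ in which historical moments are selected.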