Open Access Repository

Improving recurrent neural networks with predictive propagation for sequence labelling

Tran, SN ORCID: 0000-0002-5912-293X, Zhang, Q, Nguyen, A, Vu, X-S and Ngo, S 2018, 'Improving recurrent neural networks with predictive propagation for sequence labelling', in L Cheng, ACS Leung and S Ozawa (eds.), Proceedings of the 25th International Conference on Neural Information Processing (ICONIP 2018), Lecture Notes in Computer Science, vol. 11301, Springer, Cham, Switzerland, pp. 452-462, doi: 10.1007/978-3-030-04167-0_41.

Full text not available from this repository.

Abstract

Recurrent neural networks (RNNs) are a useful tool for sequence labelling tasks in natural language processing. Although in practice RNNs suffer from the vanishing/exploding gradient problem, their compactness still offers efficiency and makes them less prone to overfitting. In this paper we show that by propagating the predictions of previous labels we can improve the performance of RNNs while keeping the number of parameters unchanged and adding only one more step at inference. As a result, the models remain more compact and efficient than models with complex memory gates. In our experiments, we evaluate the idea on optical character recognition and chunking, achieving promising results. © 2018, Springer Nature Switzerland AG.
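The core idea described in the abstract, feeding the previous step's predicted label back into the recurrence, can be sketched as follows. This is a minimal illustrative Elman-style RNN in NumPy, not the paper's exact formulation: the feedback here is injected through the transposed output weights so that no extra parameters are introduced, which is one plausible way to realise the "unchanged parameter count" property the abstract claims. The class and method names are hypothetical.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class LabelFeedbackRNN:
    """Elman-style RNN that propagates the previous step's predicted
    label distribution into the next hidden state.

    Reusing the transposed output weights Wy for the feedback keeps the
    parameter count identical to a plain RNN. This is an illustrative
    assumption; the paper's mechanism may differ in detail."""

    def __init__(self, n_in, n_hidden, n_labels, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.Wx = rng.normal(0, s, (n_hidden, n_in))   # input weights
        self.Wh = rng.normal(0, s, (n_hidden, n_hidden))  # recurrent weights
        self.Wy = rng.normal(0, s, (n_labels, n_hidden))  # output weights
        self.bh = np.zeros(n_hidden)
        self.by = np.zeros(n_labels)
        self.n_labels = n_labels

    def predict(self, xs, feedback=True):
        """Greedy left-to-right labelling of a sequence of input vectors."""
        h = np.zeros(self.Wh.shape[0])
        y_prev = np.zeros(self.n_labels)
        labels = []
        for x in xs:
            pre = self.Wx @ x + self.Wh @ h + self.bh
            if feedback:
                # Inject the previous label prediction through Wy^T:
                # one extra matrix-vector product, no new parameters.
                pre = pre + self.Wy.T @ y_prev
            h = np.tanh(pre)
            y_prev = softmax(self.Wy @ h + self.by)
            labels.append(int(y_prev.argmax()))
        return labels

# Usage: label a toy sequence of 6 three-dimensional inputs with 4 classes.
rnn = LabelFeedbackRNN(n_in=3, n_hidden=5, n_labels=4)
xs = [np.ones(3) for _ in range(6)]
print(rnn.predict(xs))
```

Because the feedback reuses existing weights, inference costs only one additional matrix-vector product per step, consistent with the abstract's claim of "adding only one more step for inference".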

Item Type: Conference Publication
Authors/Creators:Tran, SN and Zhang, Q and Nguyen, A and Vu, X-S and Ngo, S
Keywords: natural language processing, recurrent neural networks, sequence labelling
Journal or Publication Title: Proceedings of the 25th International Conference on Neural Information Processing (ICONIP 2018), Lecture Notes in Computer Science, vol. 11301
Publisher: Springer
DOI / ID Number: 10.1007/978-3-030-04167-0_41
Copyright Information: Copyright 2018 Springer

