Conference papers

Forward Learning Convolutional Neural Network

Abstract : A conventional convolutional neural network (CNN) is trained by back-propagation (BP) from the output layer to the input layer through the entire network. In this paper, we propose a novel training approach in which a CNN is trained in a forward manner, unit by unit. For example, we separate a CNN with three convolutional layers into three units; each unit contains one convolutional layer and is trained in sequence. Experiments show that training can be restricted to local units and processed one by one from input to output. In most cases, our feed-forward approach performs as well as or better than the traditional approach; in the worst case, its accuracy falls short of the traditional approach by less than 5%. Our training approach also benefits from transfer learning by setting different targets for the middle units. Since full-network back-propagation is unnecessary, BP learning becomes more efficient, and the least-squares method can be applied to speed up learning. Our approach offers a new perspective on training methods for convolutional neural networks.
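The unit-by-unit procedure described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: random dense feature layers stand in for the convolutional units, each unit's target fit uses a local least-squares readout (echoing the abstract's mention of least squares), and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 8 features, 3 classes (one-hot targets).
X = rng.normal(size=(200, 8))
y = rng.integers(0, 3, size=200)
T = np.eye(3)[y]

def train_unit(h, T, width):
    """Train one unit locally: a random feature layer plus a
    least-squares readout fit toward the target T. No gradients
    flow back into earlier units (stand-in for one conv unit)."""
    W = rng.normal(scale=0.5, size=(h.shape[1], width))
    z = np.tanh(h @ W)  # this unit's forward features
    # Local ridge-regularized least-squares fit of the readout.
    R = np.linalg.solve(z.T @ z + 1e-3 * np.eye(width), z.T @ T)
    return W, R, z

# Train three units one by one, from input toward output;
# each frozen unit's output becomes the next unit's input.
h = X
units = []
for width in (16, 16, 16):
    W, R, z = train_unit(h, T, width)
    units.append((W, R))
    h = z

# Predict with the last unit's local readout.
pred = np.argmax(h @ units[-1][1], axis=1)
acc = (pred == y).mean()
print(f"train accuracy: {acc:.2f}")
```

In the paper's setting each unit would be a convolutional layer trained on its own (possibly distinct) target, which is what enables the transfer-learning benefit mentioned in the abstract; the sequential, freeze-then-advance loop above is the structural point.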

Contributor : Hal Ifip
Submitted on : Tuesday, July 30, 2019 - 5:02:02 PM
Last modification on : Tuesday, July 30, 2019 - 5:12:17 PM




Distributed under a Creative Commons Attribution 4.0 International License



Hong Hu, Xin Hong, Dan Hou, Zhongzhi Shi. Forward Learning Convolutional Neural Network. 10th International Conference on Intelligent Information Processing (IIP), Oct 2018, Nanning, China. pp.51-61, ⟨10.1007/978-3-030-00828-4_6⟩. ⟨hal-02197796⟩


