An Improved Retraining Scheme for Convolutional Neural Network
Keywords:
Convolutional Neural Network, winner-takes-all approach, multilayer perceptron, neural network

Abstract
A feed-forward artificial neural network model, or multilayer perceptron (MLP), learns input samples adaptively and solves non-linear problems for data that are noisy and imprecise. A variant of the MLP, known as the Convolutional Neural Network (CNN), has additional features such as weight sharing, local receptive fields, and subsampling, which make the CNN superior in handling challenging pattern-recognition tasks. Although the CNN has improved upon the performance of the MLP, the complexity of its structure makes retraining inefficient whenever new categories (i.e., new neurons under a winner-takes-all approach) are added at the classifier stage: the complete network must be retrained whenever new categories are introduced, which incurs additional cost and training time. In this paper, we propose a retraining scheme that overcomes this problem. The proposed scheme generalizes the feature-extraction layers, so the retraining process involves only the last two layers instead of the whole network. The design was evaluated on the AT&T and JAFFE databases. The results show that training an additional category is more than 70 times faster than retraining the whole network architecture.
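The core idea, retraining only the last two layers while keeping the feature-extraction layers frozen, can be sketched in a few lines of Python. The layer names and parameter counts below are illustrative assumptions, not figures from the paper:

```python
# Hypothetical sketch of the proposed retraining scheme: the convolutional
# feature-extraction layers are frozen after an initial generalization pass,
# and only the last two layers are updated when a new category is added.
# Layer names and parameter counts are made up for illustration.

layers = [
    ("conv1",   156, False),  # frozen feature extractor
    ("conv2",   616, False),  # frozen
    ("conv3",  1516, False),  # frozen
    ("fc1",    4800, True),   # retrained
    ("output",  210, True),   # retrained (new category adds neurons here)
]

def total_params(net):
    """Parameters touched when retraining the whole network."""
    return sum(n for _, n, _ in net)

def trainable_params(net):
    """Parameters touched under the proposed last-two-layer scheme."""
    return sum(n for _, n, trainable in net if trainable)

full = total_params(layers)
partial = trainable_params(layers)
print(f"full retraining updates {full} params, "
      f"partial retraining updates {partial} ({partial / full:.0%})")
```

In practice the speed-up reported in the abstract comes not just from the smaller parameter count but from skipping the expensive convolutional forward/backward passes during the added-category training.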
License
TRANSFER OF COPYRIGHT AGREEMENT
The manuscript is herewith submitted for publication in the Journal of Telecommunication, Electronic and Computer Engineering (JTEC). It has not been published before, and it is not under consideration for publication in any other journal. It contains no material that is scandalous, obscene, libelous, or otherwise contrary to law. When the manuscript is accepted for publication, I, as the author, hereby agree to transfer to JTEC all rights, including those pertaining to electronic forms and transmissions, under existing copyright laws, except for the following, which the author(s) specifically retain(s):
- All proprietary rights other than copyright, such as patent rights
- The right to make further copies of all or part of the published article for my use in classroom teaching;
- The right to reuse all or part of this manuscript in a compilation of my own works or in a textbook of which I am the author; and
- The right to make copies of the published work for internal distribution within the institution that employs me
I agree that copies made under these circumstances will continue to carry the copyright notice that appears in the original published work. I agree to inform my co-authors, if any, of the above terms. I certify that I have obtained written permission for the use of text, tables, and/or illustrations from any copyrighted source(s), and I agree to supply such written permission(s) to JTEC upon request.