Optimisation of Neural Network with Simultaneous Feature Selection and Network Pruning using Evolutionary Algorithm


  • WK Wong Curtin University Sarawak, Miri, Sarawak.
  • Chekima Ali Universiti Malaysia Sabah, Kota Kinabalu.
  • Wong Kii Ing Curtin University Sarawak, Miri, Sarawak.
  • Law Kah Haw Curtin University Sarawak, Miri, Sarawak.
  • Vincent Lee Curtin University Sarawak, Miri, Sarawak.


Neuroevolution, Feature Selection, Network Pruning, Evolutionary Algorithm


Most advances in the evolutionary-algorithm optimisation of neural networks concern recurrent neural networks, using the NEAT optimisation method. For feedforward networks, most optimisation addresses only weight and bias selection, generally known as conventional neuroevolution. In this research work, simultaneous feature reduction, network pruning and weight/bias selection is presented, using a fitness function designed to penalise the selection of large feature sets. The fitness function also rewards the reduction of features and of neurons in the hidden layer. The results are demonstrated on two datasets: the cancer dataset and the thyroid dataset. Backpropagation (gradient-descent) weight/bias optimisation performed slightly better at classifying the two datasets, with a lower misclassification rate and error. However, the simultaneous feature/neuron switching with a Genetic Algorithm reduced both the features and the hidden neurons: the number of features was reduced from 21 to 4 (thyroid dataset) and from 9 to 3 (cancer dataset), with only one hidden neuron in the processing layer of both network structures. This paper presents the chromosome representation and the fitness function design.
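The chromosome layout and penalised fitness described above can be sketched as follows. This is a minimal illustration under assumed values: the penalty coefficients `alpha` and `beta`, the maximum of five hidden neurons, and the single-output network shape are hypothetical, not the authors' actual configuration; only the overall scheme (binary feature switches, binary hidden-neuron switches, real-valued weights/biases in one chromosome, with a fitness term penalising active features and neurons) follows the abstract.

```python
import random

N_FEATURES = 9  # e.g. the cancer dataset has 9 input features
N_HIDDEN = 5    # assumed maximum hidden neurons before pruning


def make_chromosome():
    """Chromosome = [feature switches | hidden-neuron switches | weights/biases]."""
    feature_switches = [random.randint(0, 1) for _ in range(N_FEATURES)]
    neuron_switches = [random.randint(0, 1) for _ in range(N_HIDDEN)]
    # Weights for input->hidden plus hidden->output (single output assumed).
    n_weights = N_FEATURES * N_HIDDEN + N_HIDDEN
    weights = [random.uniform(-1.0, 1.0) for _ in range(n_weights)]
    return feature_switches + neuron_switches + weights


def fitness(chromosome, classification_error, alpha=0.01, beta=0.01):
    """Lower is better: classification error plus penalties for every
    active (switched-on) feature and hidden neuron. alpha and beta are
    illustrative penalty coefficients."""
    feature_switches = chromosome[:N_FEATURES]
    neuron_switches = chromosome[N_FEATURES:N_FEATURES + N_HIDDEN]
    return (classification_error
            + alpha * sum(feature_switches)
            + beta * sum(neuron_switches))
```

In a GA loop, selection would minimise this fitness, so individuals that classify equally well with fewer active features and neurons dominate, which is what drives the pruning reported in the results.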


Stuart Russell; Peter Norvig, Artificial Intelligence: A Modern Approach, p. 578

Rumelhart, David E.; Hinton, Geoffrey; Williams, Ronald J., 1986, 'Learning representations by back-propagating errors', Nature 323(6088): 533-536

Hochreiter, S.; Bengio, Y.; Frasconi, P.; Schmidhuber, J., 2001, 'Gradient flow in recurrent nets: the difficulty of learning long-term dependencies', in A Field Guide to Dynamical Recurrent Neural Networks, IEEE Press

Hochreiter, S.; Schmidhuber, Jurgen, 1997, 'Long Short-Term Memory', Neural Computation 9(8): 1735-1780

Xin Yao; Yong Liu, 1997, 'A new evolutionary system for evolving artificial neural networks', IEEE Transactions on Neural Networks 8(3)

Kenneth O. Stanley; Bobby D. Bryant; Risto Miikkulainen, 2005, 'Evolving neural network agents in the NERO video game', IEEE Transactions on Evolutionary Computation 9(6): 653-668

Reeder, J.; Miguez, R.; Sparks, J.; Georgiopoulos, M.; Anagnostopoulos, G., 2008, 'Interactively evolved modular neural networks for game agent control', IEEE Symposium on Computational Intelligence and Games: 167-174

Kenneth O. Stanley; Risto Miikkulainen, 2002, 'Evolving neural networks through augmenting topologies', Evolutionary Computation 10(2): 99-127






How to Cite

Wong, W., Ali, C., Kii Ing, W., Kah Haw, L., & Lee, V. (2016). Optimisation of Neural Network with Simultaneous Feature Selection and Network Pruning using Evolutionary Algorithm. Journal of Telecommunication, Electronic and Computer Engineering (JTEC), 8(12), 83–86. Retrieved from https://jtec.utem.edu.my/jtec/article/view/1440