Note: during crossover the synaptic weights are carried over from the parent NN, while during mutation-addition a random weight is assigned to the newly created synapse. The synaptic weights are then fine-tuned using error back-propagation during the memetic step.
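As a rough illustration of this weight-handling scheme, the sketch below uses a hypothetical genome represented as a dict mapping (source, target) synapse pairs to weights; this representation, the helper names, and the weight-initialisation range are assumptions for the example, not the author's actual implementation.

```python
import random

def crossover(parent_a, parent_b):
    """Child inherits each synapse, and its weight, from one of the parents."""
    child = {}
    for synapse in set(parent_a) | set(parent_b):
        if synapse in parent_a and synapse in parent_b:
            # Shared synapse: the weight is carried over from a randomly chosen parent.
            child[synapse] = random.choice([parent_a[synapse], parent_b[synapse]])
        else:
            # Synapse present in only one parent: carry that parent's weight over.
            child[synapse] = parent_a.get(synapse, parent_b.get(synapse))
    return child

def mutate_add_synapse(genome, candidate_synapses):
    """Mutation-addition: a randomly chosen new synapse gets a random initial weight."""
    new_synapse = random.choice(candidate_synapses)
    genome[new_synapse] = random.uniform(-1.0, 1.0)  # illustrative init range
    return genome

def memetic_step(genome, train_fn, epochs=10):
    """Memetic step: fine-tune the inherited/random weights with back-propagation.

    `train_fn` is assumed to run a few epochs of gradient descent on the
    network encoded by the genome and return the updated genome.
    """
    return train_fn(genome, epochs)
```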
Optimally Wired Neural Networks
Finally, it is time to use the method described above to design Optimally Wired neural networks. To do so, we formulate a bi-objective optimization problem minimizing:
a) Error (the average training error over the training set) and
b) Complexity (the number of synaptic connections),
so that we look not only for the best-fit model but also for the simplest model able to achieve that fit (a minimal evaluation sketch follows below).
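The sketch below shows one way such a bi-objective evaluation might be written, assuming the same dict-based genome as above, a placeholder `predict` function, and an absolute-error loss; these are illustrative choices, not details prescribed by the method itself.

```python
def evaluate(genome, training_set, predict):
    """Return the two objectives to be minimised jointly.

    `predict(genome, x)` is assumed to evaluate the network encoded by the genome.
    """
    # Objective a): average training error over the training set
    # (absolute error is used here purely for illustration).
    error = sum(abs(predict(genome, x) - y) for x, y in training_set) / len(training_set)
    # Objective b): complexity, measured as the number of synaptic connections.
    complexity = len(genome)
    return error, complexity
```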
The resulting Pareto front, showing the optimal trade-offs between the two objectives, is presented in figure 4. Two solutions are highlighted on the same plot:
Solution A: the solution with the smallest error, and
Solution B: a solution with an error similar to A but with much lower complexity, requiring only a third of A's synaptic connections.
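For readers who want to pick out solutions like A and B from their own runs, the sketch below filters non-dominated (error, complexity) pairs and then selects the lowest-error solution and the simplest one within an error tolerance; the dominance test and the 5% tolerance are illustrative assumptions, not values taken from the experiment.

```python
def pareto_front(solutions):
    """Keep only solutions not dominated in both error and complexity.

    Each solution is assumed to be a dict with 'error' and 'complexity' keys.
    """
    front = []
    for s in solutions:
        dominated = any(
            o["error"] <= s["error"] and o["complexity"] <= s["complexity"]
            and (o["error"] < s["error"] or o["complexity"] < s["complexity"])
            for o in solutions
        )
        if not dominated:
            front.append(s)
    return front

def pick_a_and_b(front, tolerance=0.05):
    """Pick the smallest-error solution (A) and the simplest one with similar error (B)."""
    a = min(front, key=lambda s: s["error"])
    near = [s for s in front if s["error"] <= a["error"] * (1 + tolerance)]
    b = min(near, key=lambda s: s["complexity"])
    return a, b
```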