C. Christenson and K. Kaikhah (USA)
Key Words: incremental evolution, neural networks, training, backwards compatible
Abstract: Supervised learning has long been used to train artificial neural networks to perform classification tasks. However, the standard fully connected layered design is often inadequate for such tasks. We demonstrate that evolution can be used to design an artificial neural network that learns faster and more accurately. By evolving artificial neural networks within a dynamic environment, the network is forced to rely on learning. This strategy, combined with incremental evolution, produces an artificial neural network that outperforms the standard fully connected layered design. The resulting network can learn to perform an entire domain of tasks, including those of reduced complexity. Evolution alone can be used to create a network that performs a single task; real-world environments, however, are dynamic and therefore demand the ability to adapt to change.
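The incremental strategy the abstract describes can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the authors' actual setup: the network size, the evolutionary operators, and the choice of OR as the simpler stage and XOR as the harder stage are all hypothetical. The key idea shown is that the population evolved on the easier task seeds the evolution on the harder, related task.

```python
import math
import random

random.seed(0)

# Hypothetical tiny network: 2 inputs, 3 tanh hidden units, 1 sigmoid output.
N_IN, N_HID, N_OUT = 2, 3, 1
GENOME_LEN = N_HID * (N_IN + 1) + N_OUT * (N_HID + 1)  # weights + biases

def sigmoid(s):
    s = max(-60.0, min(60.0, s))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-s))

def forward(g, x):
    """Run the network whose weights are the flat genome g on input x."""
    i, hid = 0, []
    for _ in range(N_HID):
        s = 0.0
        for k in range(N_IN):
            s += g[i] * x[k]; i += 1
        s += g[i]; i += 1          # hidden bias
        hid.append(math.tanh(s))
    out = []
    for _ in range(N_OUT):
        s = 0.0
        for k in range(N_HID):
            s += g[i] * hid[k]; i += 1
        s += g[i]; i += 1          # output bias
        out.append(sigmoid(s))
    return out

def mse(g, data):
    return sum((forward(g, x)[0] - y) ** 2 for x, y in data) / len(data)

def evolve(data, population, generations=150, sigma=0.4):
    """Elitist (mu + lambda)-style loop: keep the best quarter, refill
    the rest with Gaussian-mutated copies of the survivors."""
    for _ in range(generations):
        population.sort(key=lambda g: mse(g, data))
        elite = population[: len(population) // 4]
        children = []
        while len(elite) + len(children) < len(population):
            parent = random.choice(elite)
            children.append([w + random.gauss(0, sigma) for w in parent])
        population = elite + children
    population.sort(key=lambda g: mse(g, data))
    return population

OR_TASK  = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
XOR_TASK = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

pop = [[random.gauss(0, 1) for _ in range(GENOME_LEN)] for _ in range(40)]
pop = evolve(OR_TASK, pop)           # stage 1: simpler task
baseline = mse(pop[0], XOR_TASK)     # stage-1 champion measured on the harder task
pop = evolve(XOR_TASK, pop)          # stage 2: seeded with stage-1 population
final = mse(pop[0], XOR_TASK)
```

Because the elite is carried over unchanged each generation, the best fitness never worsens, so `final` is guaranteed to be at most `baseline`; whether incremental seeding beats evolving on XOR from scratch is exactly the kind of empirical question the paper addresses.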