Envisioning the Benefits of Back-Drive in Evolutionary Algorithms
2020 IEEE Congress on Evolutionary Computation (CEC): 1-8 (2020)
Abstract
Memory volatility is one of the most frequent characteristics of traditional model-driven evolutionary algorithms. It commonly stems from the limitations of the models used to guide these algorithms, which are generally very efficient at sampling but tend to struggle when they must represent large amounts of data. Neural networks are a type of model that thrives on vast amounts of data and whose performance is not particularly degraded by high dimensionality. Several successful neural generative models that could serve as the driving model of an evolutionary process are available in the literature. Whereas the behavior of these generative models in evolutionary algorithms has already been widely tested, other neural models, namely those intended for supervised learning, have received far less attention from the research community. In this paper, we take a step in this direction, exploring the capabilities and particularities of back-drive, a method that enables a neural model intended for regression to be used as a solution sampling model. In this context, through an extensive study of the most influential aspects of the algorithm, we study the conditions that favor the performance of the back-drive algorithm when used as the sole guiding factor in an evolutionary approach.
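The abstract does not spell out the back-drive procedure itself. The Python sketch below illustrates only the general idea commonly associated with back-driving a regression network, under stated assumptions: a network trained to predict fitness from candidate solutions is held fixed, and new candidates are produced by gradient-optimizing the network's inputs toward a desired fitness value. The toy objective, the MLP architecture, the target-fitness choice, and all names are illustrative assumptions, not the paper's exact method.

```python
# Illustrative sketch (not the paper's exact method): "back-driving" a trained
# regression network to sample new candidate solutions.
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 10  # assumed problem dimensionality

def fitness(x):
    # Toy minimization objective (sphere function); stands in for the real problem.
    return (x ** 2).sum(dim=-1)

# 1. Generate and evaluate an initial population.
population = torch.rand(100, dim) * 2 - 1
scores = fitness(population)

# 2. Fit a regression model: solution -> predicted fitness.
model = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(300):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(population).squeeze(-1), scores)
    loss.backward()
    opt.step()

# 3. Back-drive: freeze the model and optimize the *inputs* toward a target
#    fitness slightly better (lower) than the best score observed so far.
for p in model.parameters():
    p.requires_grad_(False)
target = scores.min() * 0.5                                   # assumed target value
x = population[scores.argsort()[:20]].clone().requires_grad_(True)  # seed from best individuals
input_opt = torch.optim.Adam([x], lr=5e-2)
for _ in range(200):
    input_opt.zero_grad()
    pred = model(x).squeeze(-1)
    loss = ((pred - target) ** 2).mean()
    loss.backward()
    input_opt.step()

# 4. The back-driven inputs become the new candidate solutions.
candidates = x.detach()
print("best true fitness among candidates:", fitness(candidates).min().item())
```

In an evolutionary loop, one would presumably iterate these steps, evaluating the back-driven candidates and merging them into the population before retraining the model; the paper's study concerns how such a model behaves as the sole guiding factor of that loop.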