Meta-learning to initialize genetic algorithm population for hyperparameter tuning of deep neural networks (2023-04-11)

Tatis Posada, David; Arzuaga, Emmanuel; College of Engineering; Sierra Gil, Heidy; Rodríguez Martínez, Manuel; Rodríguez Solís, Rafael; Department of Electrical and Computer Engineering; Acuña Guzmán, Salvador F.

Artificial intelligence models such as Deep Neural Networks (DNNs) are designed with the data, the available computational resources, and the problem to be solved (e.g., prediction or classification) in mind. Before training a model, it is necessary to set a series of hyperparameters: parameters of a Deep Neural Network that are not modified during the training process. Hyperparameters have a large impact on model results, so they must be selected carefully. Hyperparameters of a DNN include the learning rate, number of epochs, batch size, activation function, number of hidden layers, units per layer, dropout rate for regularization, and more. Given such a large search space, strategies and algorithms are needed that find better combinations of hyperparameters while reducing the number of models that must be trained. The objective of this thesis is to design, implement, and study an algorithm for hyperparameter tuning of Deep Neural Networks based on a meta-algorithm that learns from previous experiments (a meta-dataset) to initialize the population of a genetic algorithm. The genetic algorithm then iterates to find the combination of hyperparameters that improves the performance of a model in a cost-effective way. To this end, different Deep Neural Network architectures are implemented and applied to active research problems (such as materials-property prediction and image segmentation) to create a meta-dataset, which is used to train a Machine Learning model that predicts how good a combination of hyperparameters will be for a given model, task, and dataset.
In addition, a genetic algorithm for hyperparameter tuning is implemented, and its performance is compared with and without the meta-learner initializing its population.
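The meta-learning step the abstract describes could be sketched roughly as follows. Everything here is an illustrative assumption, not the thesis's implementation: the feature set, the made-up accuracy values, and the choice of a k-nearest-neighbour regressor stand in for a real meta-dataset built from actual DNN training runs and whatever Machine Learning model the thesis trains on it.

```python
import math

# Hypothetical meta-dataset of past experiments: each feature vector is
# (learning_rate, batch_size, hidden_layers, dropout) and the target is the
# validation accuracy that combination achieved. Values are made up; the
# thesis builds its meta-dataset by actually training DNNs on tasks such as
# materials-property prediction and image segmentation.
META_X = [
    (1e-3, 32, 2, 0.2),
    (1e-2, 64, 3, 0.5),
    (1e-4, 16, 1, 0.0),
    (1e-3, 128, 4, 0.2),
]
META_Y = [0.91, 0.78, 0.83, 0.88]

def predict_score(hp, k=2):
    """k-nearest-neighbour meta-learner: predict a hyperparameter
    combination's score as the mean score of its k closest past experiments.
    (A real implementation would scale the features first, since batch size
    dominates the raw Euclidean distance here.)"""
    dists = sorted((math.dist(hp, x), y) for x, y in zip(META_X, META_Y))
    return sum(y for _, y in dists[:k]) / k

print(predict_score((1e-3, 48, 2, 0.2)))
```

The prediction is simply an average of observed scores, so it always stays inside the range of accuracies seen in the meta-dataset; its only job is to rank unseen hyperparameter combinations cheaply, without training a network.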
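The genetic-algorithm loop with meta-learned initialization might look like this minimal sketch. The search space, the operators (uniform crossover, truncation selection with elitism), and the toy fitness function are assumptions chosen so the example runs end to end; in the thesis, fitness would be the validation performance of a trained DNN and the meta-learner a model trained on the meta-dataset.

```python
import random

# Hypothetical hyperparameter search space; names and values are illustrative.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64, 128],
    "hidden_layers": [1, 2, 3, 4],
    "dropout": [0.0, 0.2, 0.5],
}

def random_individual(rng):
    """Sample one hyperparameter combination uniformly at random."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def meta_initialized_population(meta_learner, size, n_candidates, rng):
    """Score many random candidates with the meta-learner and keep the
    predicted-best `size` of them as the initial GA population."""
    pool = [random_individual(rng) for _ in range(n_candidates)]
    pool.sort(key=meta_learner, reverse=True)
    return pool[:size]

def crossover(a, b, rng):
    """Uniform crossover: each hyperparameter comes from either parent."""
    return {k: (a[k] if rng.random() < 0.5 else b[k]) for k in SEARCH_SPACE}

def mutate(ind, rng, rate=0.2):
    """Resample each hyperparameter with probability `rate`."""
    return {k: (rng.choice(SEARCH_SPACE[k]) if rng.random() < rate else v)
            for k, v in ind.items()}

def genetic_search(fitness, meta_learner, generations=10, pop_size=8, seed=0):
    rng = random.Random(seed)
    pop = meta_initialized_population(meta_learner, pop_size, 10 * pop_size, rng)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [
            mutate(crossover(rng.choice(parents), rng.choice(parents), rng), rng)
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children  # elitism: best parents survive unchanged
    return max(pop, key=fitness)

# Toy stand-in used for both fitness and meta-learner, purely so the sketch
# is self-contained; it rewards learning_rate near 1e-3 and dropout near 0.2.
def toy_fitness(hp):
    return -abs(hp["learning_rate"] - 1e-3) - abs(hp["dropout"] - 0.2)

if __name__ == "__main__":
    best = genetic_search(toy_fitness, meta_learner=toy_fitness)
    print(best)
```

The contrast experiment the abstract mentions corresponds to replacing `meta_initialized_population` with plain random sampling and comparing how quickly each variant reaches a good fitness, since every fitness evaluation stands for a full (expensive) DNN training run.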