- Research article, June 2021
Evaluating medical aesthetics treatments through evolved age-estimation models
- Risto Miikkulainen,
- Elliot Meyerson,
- Xin Qiu,
- Ujjayant Sinha,
- Raghav Kumar,
- Karen Hofmann,
- Yiyang Matt Yan,
- Michael Ye,
- Jingyuan Yang,
- Damon Caiazza,
- Stephanie Manson Brown
GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference, pages 1009–1017. https://doi.org/10.1145/3449639.3459378

  Estimating a person's age from a facial image is a challenging problem with clinical applications. Several medical aesthetics treatments have been developed that alter the skin texture and other facial features, with the goal of potentially improving ...
- Research article, June 2021
Using novelty search to explicitly create diversity in ensembles of classifiers
GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference, pages 849–857. https://doi.org/10.1145/3449639.3459308

  The diversity between individual learners in an ensemble is known to influence its performance. However, there is no standard agreement on how diversity should be defined, and thus how to exploit it to construct a high-performing classifier. We propose ...
- Research article, June 2021
Regularized evolutionary population-based training
GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference, pages 323–331. https://doi.org/10.1145/3449639.3459292

  Metalearning of deep neural network (DNN) architectures and hyperparameters has become an increasingly important area of research. At the same time, network regularization has been recognized as a crucial dimension to effective training of DNNs. However, ...
- Research article, June 2021
Optimizing loss functions through multi-variate taylor polynomial parameterization
GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference, pages 305–313. https://doi.org/10.1145/3449639.3459277

  Metalearning of deep neural network (DNN) architectures and hyperparameters has become an increasingly important area of research. Loss functions are a type of metaknowledge that is crucial to effective training of DNNs; however, their potential role in ...