DOI: 10.1145/3583131.3590395

Pareto Local Search is Competitive with Evolutionary Algorithms for Multi-Objective Neural Architecture Search

Published: 12 July 2023

Abstract

Neural architecture search (NAS) automatically searches for promising deep neural network structures within a given architecture space. Depending on the number of criteria considered, NAS can be formulated as a single-objective optimization problem (SONAS) or a multi-objective optimization problem (MONAS). Evolutionary algorithms (EAs) are common approaches for NAS due to their effectiveness in solving challenging combinatorial problems. Recent studies analyzing SONAS landscapes, however, have indicated that although NAS problems are multi-modal, local search algorithms with simple perturbation operators can escape local optima and reach global optima without much difficulty. Such investigations for MONAS remain under-explored. In this paper, we employ local optima networks (LONs) to visualize the structure of MONAS landscapes under a simple local search procedure. Based on these analyses, we then design LOMONAS, a dedicated Pareto local search algorithm for MONAS. Experimental results on four NAS benchmarks (MacroNAS, NAS-Bench-101, NAS-Bench-201, and NAS-Bench-ASR) show that LOMONAS outperforms two widely used multi-objective EAs (MOEAs), NSGA-II and MOEA/D. These findings indicate that Pareto local search algorithms are competitive with MOEAs in solving MONAS problems.
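
To make the notion of Pareto local search concrete, the sketch below outlines a generic Pareto local search loop over a discrete architecture encoding: it maintains an archive of non-dominated architectures and explores one-operation perturbations of unexplored archive members. This is an illustrative sketch only, not the LOMONAS algorithm described in the paper; the tuple encoding, the neighbors perturbation operator, and the evaluate callback (e.g., returning validation error and model size from a tabular NAS benchmark) are hypothetical placeholders.

import random

def dominates(f, g):
    # True if objective vector f Pareto-dominates g (all objectives minimized).
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

def neighbors(arch, num_choices):
    # One-operation perturbation: change a single position of the encoding.
    for i in range(len(arch)):
        for v in range(num_choices):
            if v != arch[i]:
                yield arch[:i] + (v,) + arch[i + 1:]

def pareto_local_search(start, evaluate, num_choices, max_evals=1000):
    # Archive of non-dominated architectures found so far (encoding -> objectives).
    archive = {start: evaluate(start)}
    unexplored = [start]          # archive members whose neighborhood is unvisited
    evals = 1
    while unexplored and evals < max_evals:
        current = unexplored.pop(random.randrange(len(unexplored)))
        for cand in neighbors(current, num_choices):
            if evals >= max_evals:
                break
            if cand in archive:
                continue
            f = evaluate(cand)    # hypothetical callback, e.g. (validation error, #params)
            evals += 1
            if any(dominates(g, f) for g in archive.values()):
                continue          # dominated by an archived architecture: discard
            # Remove archive members that the newcomer dominates.
            for a in [a for a, g in archive.items() if dominates(f, g)]:
                del archive[a]
                if a in unexplored:
                    unexplored.remove(a)
            archive[cand] = f
            unexplored.append(cand)
    return archive                # approximation of the reachable Pareto set

With a toy evaluate, for instance one that looks up (validation error, number of parameters) in a tabular benchmark, pareto_local_search((0,) * length, evaluate, num_choices) returns the non-dominated architectures reachable from the starting point under the one-operation neighborhood.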

Supplementary Material

PDF File (p348-phan-suppl.pdf)
Supplemental material.

References

[1]
Noor Awad, Neeratyoy Mallik, and Frank Hutter. 2020. Differential Evolution for Neural Architecture Search. In Proceedings of the 1st workshop on neural architecture search(@ICLR'20).
[2]
Peter A. N. Bosman and Dirk Thierens. 2003. The balance between proximity and diversity in multiobjective evolutionary algorithms. IEEE Trans. Evol. Comput. 7, 2 (2003), 174--188.
[3]
Jürgen Branke, Kalyanmoy Deb, Henning Dierolf, and Matthias Osswald. 2004. Finding Knees in Multi-objective Optimization. In Parallel Problem Solving from Nature - PPSN 2004 (Lecture Notes in Computer Science, Vol. 3242). Springer, Birmingham, UK, 722--731.
[4]
Kalyanmoy Deb, Samir Agrawal, Amrit Pratap, and T. Meyarivan. 2002. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6, 2 (2002), 182--197.
[5]
Xuanyi Dong and Yi Yang. 2020. NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search. In International Conference on Learning Representations, ICLR 2020. OpenReview.net, Addis Ababa, Ethiopia.
[6]
Jonathan E. Fieldsend and Khulood AlYahya. 2019. Visualising the landscape of multi-objective problems using local optima networks. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, GECCO 2019. ACM, Prague, Czech Republic, 1421--1429.
[7]
Aric A. Hagberg, Daniel A. Schult, and Pieter J. Swart. 2008. Exploring Network Structure, Dynamics, and Function using NetworkX. In Proceedings of the 7th Python in Science Conference. Pasadena, CA, USA, 11--15.
[8]
Arnaud Liefooghe, Bilel Derbel, Sébastien Vérel, Manuel López-Ibáñez, Hernán E. Aguirre, and Kiyoshi Tanaka. 2018. On Pareto Local Optimal Solutions Networks. In Parallel Problem Solving from Nature - PPSN XV - 15th International Conference, Coimbra, Portugal, September 8--12, 2018, Proceedings, Part II (Lecture Notes in Computer Science, Vol. 11102). Springer, 232--244.
[9]
Guangyuan Liu, Yangyang Li, Licheng Jiao, Yanqiao Chen, and Ronghua Shang. 2021. Multiobjective evolutionary algorithm assisted stacked autoencoder for PolSAR image classification. Swarm Evol. Comput. 60 (2021), 100794.
[10]
Zhichao Lu, Ran Cheng, Yaochu Jin, Kay Chen Tan, and Kalyanmoy Deb. 2022. Neural Architecture Search as Multiobjective Optimization Benchmarks: Problem Formulation and Performance Assessment. IEEE Transactions on Evolutionary Computation (2022), 1--1.
[11]
Zhichao Lu, Kalyanmoy Deb, Erik D. Goodman, Wolfgang Banzhaf, and Vishnu Naresh Boddeti. 2020. NSGANetV2: Evolutionary Multi-objective Surrogate-Assisted Neural Architecture Search. In Computer Vision - ECCV 2020 (Lecture Notes in Computer Science, Vol. 12346). Springer, Glasgow, UK, 35--51.
[12]
Zhichao Lu, Gautam Sreekumar, Erik D. Goodman, Wolfgang Banzhaf, Kalyanmoy Deb, and Vishnu Naresh Boddeti. 2021. Neural Architecture Transfer. IEEE Trans. Pattern Anal. Mach. Intell. 43, 9 (2021), 2971--2989.
[13]
Zhichao Lu, Ian Whalen, Vishnu Boddeti, Yashesh D. Dhebar, Kalyanmoy Deb, Erik D. Goodman, and Wolfgang Banzhaf. 2019. NSGA-Net: neural architecture search using multi-objective genetic algorithm. In Proceedings of the Genetic and Evolutionary Computation Conference, GECCO 2019. ACM, Prague, Czech Republic, 419--427.
[14]
Abhinav Mehrotra, Alberto Gil C. P. Ramos, Sourav Bhattacharya, Lukasz Dudziak, Ravichander Vipperla, Thomas Chau, Mohamed S. Abdelfattah, Samin Ishtiaq, and Nicholas Donald Lane. 2021. NAS-Bench-ASR: Reproducible Neural Architecture Search for Speech Recognition. In International Conference on Learning Representations, ICLR 2021. OpenReview.net, Virtual Event, Austria.
[15]
Gabriela Ochoa, Marco Tomassini, Sébastien Vérel, and Christian Darabos. 2008. A study of NK landscapes' basins and local optima networks. In Genetic and Evolutionary Computation Conference, GECCO 2008. ACM, Atlanta, GA, USA, 555--562.
[16]
Gabriela Ochoa and Nadarajen Veerapen. 2022. Neural Architecture Search: A Visual Analysis. In Parallel Problem Solving from Nature - PPSN 2022 (Lecture Notes in Computer Science, Vol. 13398). Springer, Dortmund, Germany, 603--615.
[17]
Tom Den Ottelander, Arkadiy Dushatskiy, Marco Virgolin, and Peter A. N. Bosman. 2021. Local Search is a Remarkably Strong Baseline for Neural Architecture Search. In Evolutionary Multi-Criterion Optimization EMO 2021 (Lecture Notes in Computer Science, Vol. 12654). Springer, Shenzhen, China, 465--479.
[18]
Luís Paquete, Marco Chiarandini, and Thomas Stützle. 2004. Pareto Local Optimum Sets in the Biobjective Traveling Salesman Problem: An Experimental Study. In Metaheuristics for Multiobjective Optimisation. Lecture Notes in Economics and Mathematical Systems, Vol. 535. Springer, 177--199.
[19]
Quan Minh Phan and Ngoc Hoang Luong. 2021. Enhancing Multi-objective Evolutionary Neural Architecture Search with Surrogate Models and Potential Point-Guided Local Searches. In International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems, IEA/AIE 2021, Vol. 12798. Springer, Kuala Lumpur, Malaysia, 460--472.
[20]
Quan Minh Phan and Ngoc Hoang Luong. 2022. Enhancing multi-objective evolutionary neural architecture search with training-free Pareto local search. Applied Intelligence (2022), 1--19.
[21]
Isak Potgieter, Christopher W. Cleghorn, and Anna S. Bosman. 2022. A Local Optima Network Analysis of the Feedforward Neural Architecture Space. In International Joint Conference on Neural Networks, IJCNN 2022. IEEE, Padua, Italy, 1--8.
[22]
Nuno M. Rodrigues, Katherine M. Malan, Gabriela Ochoa, Leonardo Vanneschi, and Sara Silva. 2022. Fitness landscape analysis of convolutional neural network architectures for image classification. Inf. Sci. 609 (2022), 711--726.
[23]
Marco Tomassini, Sébastien Vérel, and Gabriela Ochoa. 2008. Complex-network analysis of combinatorial spaces: The NK landscape case. Phys. Rev. E 78, 6 (2008), 066114.
[24]
Sébastien Verel, Gabriela Ochoa, and Marco Tomassini. 2011. Local Optima Networks of NK Landscapes With Neutrality. IEEE Transactions on Evolutionary Computation 15, 6 (2011), 783--797.
[25]
An Vo, Tan Ngoc Pham, Van Bich Nguyen, and Ngoc Hoang Luong. 2022. Training-Free Multi-Objective and Many-Objective Evolutionary Neural Architecture Search with Synaptic Flow. In The 11th International Symposium on Information and Communication Technology, SoICT 2022, Hanoi, Vietnam, December 1--3, 2022. ACM, 1--8.
[26]
Bin Wang, Yanan Sun, Bing Xue, and Mengjie Zhang. 2019. Evolving deep neural networks by multi-objective particle swarm optimization for image classification. In Proceedings of the Genetic and Evolutionary Computation Conference, GECCO 2019, Prague, Czech Republic, July 13--17, 2019. ACM, 490--498.
[27]
Bin Wang, Bing Xue, and Mengjie Zhang. 2020. Particle Swarm optimisation for Evolving Deep Neural Networks for Image Classification by Evolving and Stacking Transferable Blocks. In IEEE Congress on Evolutionary Computation, CEC 2020, Glasgow, United Kingdom, July 19--24, 2020. IEEE, 1--8.
[28]
Yu-Wei Wen, Sheng-Hsuan Peng, and Chuan-Kang Ting. 2021. Two-Stage Evolutionary Neural Architecture Search for Transfer Learning. IEEE Trans. Evol. Comput. 25, 5 (2021), 928--940.
[29]
Colin White, Sam Nolen, and Yash Savani. 2021. Exploring the loss landscape in neural architecture search. In Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, UAI 2021 (Proceedings of Machine Learning Research, Vol. 161). AUAI Press, Virtual Event, 654--664.
[30]
Lingxi Xie and Alan L. Yuille. 2017. Genetic CNN. In IEEE International Conference on Computer Vision, ICCV 2017. IEEE Computer Society, Venice, Italy, 1388--1397.
[31]
Chris Ying, Aaron Klein, Eric Christiansen, Esteban Real, Kevin Murphy, and Frank Hutter. 2019. NAS-Bench-101: Towards Reproducible Neural Architecture Search. In International Conference on Machine Learning, ICML 2019 (Proceedings of Machine Learning Research, Vol. 97). PMLR, 7105--7114.
[32]
Weiqin Ying, Kaijie Zheng, Yu Wu, Junhui Li, and Xin Xu. 2020. Neural Architecture Search Using Multi-objective Evolutionary Algorithm Based on Decomposition. In Artificial Intelligence Algorithms and Applications: 11th International Symposium, ISICA 2019. Springer, Guangzhou, China, 143--154.
[33]
Yao Zhou, Gary G. Yen, and Zhang Yi. 2021. A Knee-Guided Evolutionary Algorithm for Compressing Deep Neural Networks. IEEE Trans. Cybern. 51, 3 (2021), 1626--1638.
[34]
Barret Zoph and Quoc V. Le. 2017. Neural Architecture Search with Reinforcement Learning. In International Conference on Learning Representations, ICLR 2017. OpenReview.net, Toulon, France.
[35]
Barret Zoph, Vijay Vasudevan, Jonathon Shlens, and Quoc V. Le. 2018. Learning Transferable Architectures for Scalable Image Recognition. In 2018 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2018, Salt Lake City, UT, USA, June 18--22, 2018. Computer Vision Foundation / IEEE Computer Society, 8697--8710.


        Published In

        GECCO '23: Proceedings of the Genetic and Evolutionary Computation Conference
        July 2023
        1667 pages
ISBN: 9798400701191
DOI: 10.1145/3583131

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 12 July 2023

        Author Tags

        1. neural architecture search
        2. multi-objective optimization
        3. evolutionary algorithms
        4. local search
        5. local optima networks

        Qualifiers

        • Research-article

        Funding Sources

        • Vingroup Innovation Foundation (VINIF)

        Conference

        GECCO '23

        Acceptance Rates

        Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
