DOI: 10.1145/3533050.3533053
Research article
Open access

A Hybrid Multi-Objective Teaching Learning-Based Optimization Using Reference Points and R2 Indicator

Published: 24 June 2022

Abstract

Hybrid multi-objective evolutionary algorithms have recently become a hot topic in the field of metaheuristics. Introducing new algorithms that inherit operators and structures from existing algorithms can improve performance. Here, we propose a hybrid multi-objective algorithm that combines the operators of the genetic algorithm (GA) and teaching learning-based optimization (TLBO) with the reference point-based structure of NSGA-III and the R2 indicator method. The new algorithm (R2-HMTLBO) improves diversity through the NSGA-III component and convergence through the R2-based TLBO component. An elite archive is also introduced to further enhance performance. The proposed algorithm is evaluated on 19 benchmark test problems and compared to four state-of-the-art algorithms using the IGD metric. The results show that R2-HMTLBO significantly outperforms MOEA/D, MOMBI-II, and MOEA/IGD-NS on 16/19, 14/19, and 13/19 test problems, respectively. Furthermore, R2-HMTLBO obtains considerably better results than all other algorithms on 4 test problems, although it does not outperform NSGA-III on a number of tests.
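For reference, the two quality measures mentioned in the abstract can be computed as follows. The sketch below is illustrative only and is not the authors' implementation; the function names, the toy front, and the weight vectors are assumptions made for the example. It shows the inverted generational distance (IGD), used for the comparisons reported above, and a unary R2 indicator based on the weighted Tchebycheff utility.

```python
import numpy as np

def igd(reference_front, approximation):
    """Inverted Generational Distance: mean Euclidean distance from each
    reference-front point to its nearest point in the approximation set."""
    ref = np.asarray(reference_front, dtype=float)   # shape (|Z|, m)
    app = np.asarray(approximation, dtype=float)     # shape (|A|, m)
    # pairwise distances between every reference point and every solution
    d = np.linalg.norm(ref[:, None, :] - app[None, :, :], axis=2)
    return d.min(axis=1).mean()

def r2_indicator(approximation, weights, ideal):
    """Unary R2 indicator with the weighted Tchebycheff utility
    (lower is better for minimization problems)."""
    app = np.asarray(approximation, dtype=float)     # (|A|, m)
    w = np.asarray(weights, dtype=float)             # (|W|, m)
    z = np.asarray(ideal, dtype=float)               # (m,) ideal point
    # Tchebycheff value of every solution under every weight vector
    tcheby = np.max(w[:, None, :] * np.abs(app[None, :, :] - z), axis=2)
    # best (smallest) value per weight vector, averaged over all weights
    return tcheby.min(axis=1).mean()

if __name__ == "__main__":
    # toy 2-objective example (made-up data, not from the paper)
    front = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])   # sampled "true" front
    approx = np.array([[0.1, 0.9], [0.6, 0.6], [0.9, 0.1]])  # obtained solutions
    weights = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
    print("IGD =", igd(front, approx))
    print("R2  =", r2_indicator(approx, weights, ideal=np.zeros(2)))
```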

References

[1] Dimo Brockhoff, Tobias Wagner, and Heike Trautmann. 2012. On the properties of the R2 indicator. In Proceedings of the 14th Annual Conference on Genetic and Evolutionary Computation. 465–472.
[2] Guangming Dai, Chong Zhou, Maocai Wang, and Xiangping Li. 2018. Indicator and reference points co-guided evolutionary algorithm for many-objective optimization problems. Knowledge-Based Systems 140 (2018), 50–63.
[3] Indraneel Das and John E. Dennis. 1998. Normal-boundary intersection: A new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM Journal on Optimization 8, 3 (1998), 631–657.
[4] Kalyanmoy Deb and Himanshu Jain. 2013. An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: Solving problems with box constraints. IEEE Transactions on Evolutionary Computation 18, 4 (2013), 577–601.
[5] Kalyanmoy Deb, Amrit Pratap, Sameer Agarwal, and TAMT Meyarivan. 2002. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6, 2 (2002), 182–197.
[6] Kalyanmoy Deb, Lothar Thiele, Marco Laumanns, and Eckart Zitzler. 2005. Scalable test problems for evolutionary multiobjective optimization. In Evolutionary Multiobjective Optimization. Springer, 105–145.
[7] Joaquín Derrac, Salvador García, Daniel Molina, and Francisco Herrera. 2011. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation 1, 1 (2011), 3–18.
[8] Mark Fleischer. 2003. The measure of Pareto optima: Applications to multi-objective metaheuristics. In International Conference on Evolutionary Multi-Criterion Optimization. Springer, 519–533.
[9] David Hadka. 2015. Platypus: Multiobjective Optimization in Python.
[10] Michael Pilegaard Hansen and Andrzej Jaszkiewicz. 1994. Evaluating the quality of approximations to the non-dominated set. Citeseer.
[11] D. P. Hardin and E. B. Saff. 2005. Minimal Riesz energy point configurations for rectifiable d-dimensional manifolds. Advances in Mathematics 193, 1 (2005), 174–204.
[12] Raquel Hernández Gómez and Carlos A. Coello Coello. 2015. Improved metaheuristic based on the R2 indicator for many-objective optimization. In Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation. 679–686.
[13] John Henry Holland. 1992. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. MIT Press.
[14] Hisao Ishibuchi, Yuji Sakane, Noritaka Tsukamoto, and Yusuke Nojima. 2009. Adaptation of scalarizing functions in MOEA/D: An adaptive scalarizing function-based multiobjective evolutionary algorithm. In International Conference on Evolutionary Multi-Criterion Optimization. Springer, 438–452.
[15] Hisao Ishibuchi, Yuji Sakane, Noritaka Tsukamoto, and Yusuke Nojima. 2010. Simultaneous use of different scalarizing functions in MOEA/D. In Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation. 519–526.
[16] James Kennedy and Russell Eberhart. 1995. Particle swarm optimization. In Proceedings of ICNN'95 – International Conference on Neural Networks, Vol. 4. IEEE, 1942–1948.
[17] Ahmed Korashy, Salah Kamel, Francisco Jurado, and Abdel-Raheem Youssef. 2019. Hybrid whale optimization algorithm and grey wolf optimizer algorithm for optimal coordination of direction overcurrent relays. Electric Power Components and Systems 47, 6-7 (2019), 644–658.
[18] Man-Fai Leung, Carlos Artemio Coello Coello, Chi-Chung Cheung, Sin-Chun Ng, and Andrew Kwok-Fai Lui. 2020. A hybrid leader selection strategy for many-objective particle swarm optimization. IEEE Access 8 (2020), 189527–189545.
[19] Ke Li, Kalyanmoy Deb, Qingfu Zhang, and Sam Kwong. 2014. An evolutionary many-objective optimization algorithm based on dominance and decomposition. IEEE Transactions on Evolutionary Computation 19, 5 (2014), 694–716.
[20] R. Venkata Rao, Vimal J. Savsani, and D. P. Vakharia. 2011. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Computer-Aided Design 43, 3 (2011), 303–315.
[21] Oliver Schutze, Xavier Esquivel, Adriana Lara, and Carlos A. Coello Coello. 2012. Using the averaged Hausdorff distance as a performance measure in evolutionary multiobjective optimization. IEEE Transactions on Evolutionary Computation 16, 4 (2012), 504–522.
[22] Rainer Storn and Kenneth Price. 1997. Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization 11, 4 (1997), 341–359.
[23] Jiaze Sun, Jiahui Deng, and Yang Li. 2020. Indicator & crowding distance-based evolutionary algorithm for combined heat and power economic emission dispatch. Applied Soft Computing 90 (2020), 106158.
[24] Ye Tian, Ran Cheng, Xingyi Zhang, Fan Cheng, and Yaochu Jin. 2017. An indicator-based multiobjective evolutionary algorithm with reference point adaptation for better versatility. IEEE Transactions on Evolutionary Computation 22, 4 (2017), 609–622.
[25] Ye Tian, Ran Cheng, Xingyi Zhang, and Yaochu Jin. 2017. PlatEMO: A MATLAB platform for evolutionary multi-objective optimization [educational forum]. IEEE Computational Intelligence Magazine 12, 4 (2017), 73–87.
[26] Ye Tian, Xingyi Zhang, Ran Cheng, and Yaochu Jin. 2016. A multi-objective evolutionary algorithm based on an enhanced inverted generational distance metric. In 2016 IEEE Congress on Evolutionary Computation (CEC). IEEE, 5222–5229.
[27] Nguyen Huy Truong and Dinh-Nam Dao. 2020. New hybrid between NSGA-III with multi-objective particle swarm optimization to multi-objective robust optimization design for powertrain mount system of electric vehicles. Advances in Mechanical Engineering 12, 2 (2020), 1687814020904253.
[28] Lyndon While, Lucas Bradstreet, and Luigi Barone. 2011. A fast way of calculating exact hypervolumes. IEEE Transactions on Evolutionary Computation 16, 1 (2011), 86–95.
[29] Nianyin Zeng, Dandan Song, Han Li, Yancheng You, Yurong Liu, and Fuad E. Alsaadi. 2021. A competitive mechanism integrated multi-objective whale optimization algorithm with differential evolution. Neurocomputing 432 (2021), 170–182.
[30] Huifeng Zhang, Jianzhong Zhou, Yongchuan Zhang, Youlin Lu, and Yongqiang Wang. 2013. Culture belief based multi-objective hybrid differential evolutionary algorithm in short term hydrothermal scheduling. Energy Conversion and Management 65 (2013), 173–184.
[31] Qingfu Zhang and Hui Li. 2007. MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation 11, 6 (2007), 712–731.
[32] Qingfu Zhang, Aimin Zhou, Shizheng Zhao, Ponnuthurai Nagaratnam Suganthan, Wudong Liu, and Santosh Tiwari. 2008. Multiobjective optimization test instances for the CEC 2009 special session and competition. Technical Report 264, University of Essex, Colchester, UK and Nanyang Technological University, Singapore (2008), 1–30.
[33] Xin Zhou, Xuewu Wang, and Xingsheng Gu. 2021. A decomposition-based multiobjective evolutionary algorithm with weight vector adaptation. Swarm and Evolutionary Computation 61 (2021), 100825.
[34] Eckart Zitzler, Kalyanmoy Deb, and Lothar Thiele. 2000. Comparison of multiobjective evolutionary algorithms: Empirical results. Evolutionary Computation 8, 2 (2000), 173–195.
[35] Eckart Zitzler and Simon Künzli. 2004. Indicator-based selection in multiobjective search. In International Conference on Parallel Problem Solving from Nature. Springer, 832–842.

        Published In

        ISMSI '22: Proceedings of the 2022 6th International Conference on Intelligent Systems, Metaheuristics & Swarm Intelligence
        April 2022
        117 pages
ISBN: 9781450396288
DOI: 10.1145/3533050
        This work is licensed under a Creative Commons Attribution International 4.0 License.

        Publisher

Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. NSGA-III
        2. R2 indicator
        3. multi-objective evolutionary algorithm (MOEA)
        4. optimization algorithm
        5. reference point-based method
        6. teaching learning-based optimization (TLBO)

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        ISMSI 2022

