
Improving NSGA-III algorithms with information feedback models for large-scale many-objective optimization

Published: 01 June 2020

Abstract

Recently, more and more multi/many-objective algorithms have been proposed. However, most evolutionary algorithms focus only on small-scale multi/many-objective optimization problems, and few researchers pay attention to large-scale optimization problems. NSGA-III, for example, performs well on many-objective optimization problems but poorly on large-scale ones. To address this, this paper introduces information feedback models to improve the ability of NSGA-III to solve large-scale optimization problems. In these models, the historical information of individuals from previous generations is reused when updating the current generation. According to the way individuals are selected, the information feedback models fall into two categories: selecting individuals in a fixed way (called M-F) and selecting individuals randomly (called M-R). Within each category, the models further differ in the number of individuals selected. Based on the number and mode of individual selection, six models are defined in this paper. Embedding each of these models into NSGA-III yields six improved NSGA-III algorithms, collectively referred to as IFM-NSGAIII. These six algorithms are compared with the original NSGA-III on nine benchmark problems to identify the best information feedback model and the best IFM-NSGAIII algorithm. The two best IFM-NSGAIII algorithms are then compared with four state-of-the-art algorithms on nine test functions. Experiments show that the proposed algorithms are highly competitive on the test problems.

Highlights

Information feedback models are introduced to improve NSGA-III on large-scale problems.
Individuals’ historical information is reused via the information feedback models.
Historical individuals are selected either in a fixed way or randomly.
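The core mechanism described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function name `ifm_update`, the `mode`/`weight` parameters, and the linear combination of decision vectors are all assumptions made here for illustration. It only shows the two selection modes (fixed index for M-F, random choice for M-R) and how a historical individual's information feeds into the new one.

```python
import random

def ifm_update(offspring, prev_generation, mode="fixed", i=0, weight=0.5):
    """Combine an offspring produced by the base algorithm (e.g. NSGA-III)
    with a historical individual from the previous generation.

    NOTE: the weighted linear combination below is an illustrative
    assumption, not the paper's exact update rule.
    """
    if mode == "fixed":            # M-F: pick the historical individual at a fixed index
        hist = prev_generation[i]
    else:                          # M-R: pick a historical individual at random
        hist = random.choice(prev_generation)
    # Blend the new and historical decision vectors component-wise
    return [weight * o + (1 - weight) * h for o, h in zip(offspring, hist)]

# Example: feed back generation-t information into a 3-variable offspring
prev = [[0.2, 0.4, 0.6], [0.8, 0.1, 0.3]]
child = ifm_update([1.0, 1.0, 1.0], prev, mode="fixed", i=0)
print([round(v, 2) for v in child])  # [0.6, 0.7, 0.8]
```

Varying `mode` between `"fixed"` and `"random"`, and how many historical individuals are blended, is what distinguishes the six model variants embedded into NSGA-III.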



          Published In

Future Generation Computer Systems, Volume 107, Issue C
June 2020, 1155 pages

          Publisher

          Elsevier Science Publishers B. V.

          Netherlands


          Author Tags

1. Information feedback models
          2. NSGA-III
          3. IFM-NSGAIII
          4. Many-objective
          5. Large-scale optimization

          Qualifiers

          • Research-article
