The three tables generated by MCMC further underscore the dominance of the optimized algorithm's returns over those of the standard NSGA-II. Table I (NSGA-II) reports the results of the standard NSGA-II algorithm; Table II (Selection Optimized) shows the improvements obtained through selection optimization; and Table III (Fully Optimized) presents the fully optimized results, in which the optimized algorithm's returns dominate those of the standard NSGA-II. For each year in the upcoming decade, the optimized solution presents a higher likelihood of yielding superior returns.
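The year-by-year comparison above can be sketched with a simple Monte Carlo simulation. The snippet below is a minimal illustration, not the paper's actual procedure: the return distributions (`MU_STD`, `SIGMA_STD`, `MU_OPT`, `SIGMA_OPT`) are hypothetical placeholders for the statistics of the Pareto-front portfolios, and a plain i.i.d. normal model stands in for the MCMC forecast.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical annual return distributions (mean, std) for each strategy;
# in the study these would come from the MCMC forecast of each portfolio.
MU_STD, SIGMA_STD = 0.06, 0.15   # standard NSGA-II portfolio (assumed)
MU_OPT, SIGMA_OPT = 0.08, 0.15   # fully optimized portfolio (assumed)

N_PATHS, N_YEARS = 10_000, 10

def simulate_cumulative_returns(mu, sigma):
    """Sample annual returns and compound them over the horizon."""
    annual = rng.normal(mu, sigma, size=(N_PATHS, N_YEARS))
    return np.cumprod(1.0 + annual, axis=1)  # shape (N_PATHS, N_YEARS)

std_paths = simulate_cumulative_returns(MU_STD, SIGMA_STD)
opt_paths = simulate_cumulative_returns(MU_OPT, SIGMA_OPT)

# Fraction of simulated paths, per year, in which the optimized
# portfolio's cumulative return is ahead of the standard one.
p_superior = (opt_paths > std_paths).mean(axis=0)
for year, p in enumerate(p_superior, start=1):
    print(f"year {year:2d}: P(optimized > standard) = {p:.3f}")
```

With a higher assumed mean return, the estimated probability exceeds 0.5 in every year, mirroring the table-level comparison described above.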
V. CONCLUSION
Over the course of this study, we first revisited the foundational principles of portfolio optimization and delved into the nuances of Multi-objective Evolutionary Algorithms, specifically NSGA-II. Our primary goal was to improve the performance of the original NSGA-II algorithm by applying a set of strategic modifications. Our proposed enhancements aimed to help individuals explore the solution space more intelligently and effectively. Through extensive testing and comparison, particularly using MCMC forecasting over a decade, we observed clear improvements in the modified NSGA-II's efficiency.
Additionally, with the growing interest in the application
of deep neural networks (DNN) to optimization tasks,
contrasting our model with DNN-based optimization
strategies could provide fresh insights and pave the way for
hybrid solutions.
In summary, while many advancements like NSGA-III
have exhibited superior capabilities, the potential of the
original NSGA-II remains vast. By tweaking its rules and functions, we have illustrated the algorithm's potential for navigating the intricacies of modern portfolio optimization.