A Generic Form of Evolutionary Algorithms and Manifold Drift Concept

Chidchanok Lursinsap

Abstract

Most optimization problems in various fields are in the NP class, which implies that the time required to find the optimum solution is non-polynomial. Although high-speed computer architectures and parallel computing have been developed with practical success, some of these problems involve tight data dependencies that prevent a parallel architecture, and parallel processing in general, from being deployed to solve them. Evolutionary algorithms, which are based on guessing solutions, have been developed to find an acceptable solution in a short time. However, the processing time needed for such guessing to reach an acceptable solution is unpredictable and uncontrollable. In this paper, we compare the guessing processes of several popular algorithms in order to define a generic structure for the searching and solution-finding processes. This structure will help in developing new evolutionary algorithms. Furthermore, a new concept of manifold drift, which avoids the guessing process in order to speed up the solution search, is also discussed.
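The generic guess-evaluate-select structure shared by the evolutionary algorithms the abstract refers to can be sketched as follows. This is an illustrative skeleton only, not the paper's formalism: the function name, parameter choices (population size, mutation rate, bounds), and the sphere test function are all assumptions made for the example.

```python
import random

def generic_evolutionary_search(fitness, dim, pop_size=30, generations=100,
                                mutation_rate=0.1, bounds=(-5.0, 5.0)):
    """Minimal generic EA loop: guess candidates, evaluate, select, vary.

    `fitness` is minimized; all names and defaults here are illustrative.
    """
    lo, hi = bounds
    # Guessing step: a random initial population of candidate solutions.
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        # Selection: keep the better half of the population.
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # Crossover: uniform mix of two parent solutions.
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            # Mutation: a small random perturbation -- another "guess".
            child = [min(hi, max(lo, x + random.gauss(0.0, 0.1)))
                     if random.random() < mutation_rate else x
                     for x in child]
            children.append(child)
        pop = parents + children
        best = min(best, min(pop, key=fitness), key=fitness)
    return best

# Example: minimize the sphere function (optimum at the origin).
random.seed(0)  # fixed seed so repeated runs behave identically
sol = generic_evolutionary_search(lambda v: sum(x * x for x in v), dim=3)
```

Note that the loop never guarantees when an acceptable solution is reached; it only iterates guesses, which is exactly the unpredictability of processing time discussed above.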

Article Details

How to Cite
Lursinsap, C. (2019). A Generic Form of Evolutionary Algorithms and Manifold Drift Concept. INTERNATIONAL SCIENTIFIC JOURNAL OF ENGINEERING AND TECHNOLOGY (ISJET), 2(1), 1-10. Retrieved from https://ph02.tci-thaijo.org/index.php/isjet/article/view/175897
Section
Academic Article
