Quantum-inspired evolutionary algorithms (QEAs) have been designed by integrating quantum mechanical principles into the framework of evolutionary algorithms. They have been successfully employed as a computational technique for solving difficult optimization problems, and it is well known that QEAs provide a better balance between exploration and exploitation than conventional evolutionary algorithms. The population in QEA is evolved by variation operators, which move each Q-bit towards an attractor. A modification that improves the performance of QEA by changing the selection of attractors, namely versatile QEA, has been proposed. The improvement attained by versatile QEA over QEA indicates the impact of population structure on the performance of QEA and motivates further investigation into employing a fine-grained model. The QEA with a fine-grained population model (FQEA) is similar to QEA, except that every individual is located at a unique position on a two-dimensional toroidal grid and has four neighbors, amongst which it selects its attractor (a minimal code sketch of this neighborhood scheme is given after the introduction below). Further, FQEA does not use migration, which is employed by QEA. This paper empirically investigates the effect of the three different population structures on the performance of QEA by solving well-known discrete benchmark optimization problems.

1. Introduction

Evolutionary algorithms (EAs) represent a class of computational techniques which draw inspiration from nature [1] and are loosely based on the Darwinian principle of "survival of the fittest" [2–4]. EAs have been successfully applied to a wide variety of difficult real-life optimization problems, that is, problems for which no efficient deterministic algorithms are yet known and where near-optimal solutions are acceptable (as EAs do not guarantee finding optimal solutions). Moreover, EAs are not limited by requirements for domain-specific information, as is the case with traditional calculus-based optimization techniques [5]. EAs typically maintain a population of candidate solutions that compete for survival from one generation to the next; new solutions are generated by variation operators such as crossover, mutation, and the rotation gate, and the population gradually evolves to contain optimal or near-optimal solutions. EAs are popular due to their simplicity and ease of implementation. However, they suffer from convergence issues such as stagnation, slow convergence, and premature convergence [6]. Efforts have been made by researchers to overcome the convergence issues by
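To make the fine-grained attractor selection described in the abstract concrete, the following Python sketch shows a single generation of such a model under simplified assumptions: a OneMax placeholder objective, an illustrative grid size and rotation-angle step, and hypothetical helper names (observe, select_attractor, rotate_towards) that are not taken from the referenced implementations. It is a minimal illustration of the neighborhood scheme, not the algorithm used in the experiments reported later.

```python
import math
import random

# Illustrative Q-bit individual: each gene is an angle theta, so that
# alpha = cos(theta), beta = sin(theta), and P(bit = 1) = sin(theta)^2.
GRID_W, GRID_H = 4, 4          # toroidal grid dimensions (illustrative)
N_GENES = 16                   # problem size (illustrative)
DELTA = 0.01 * math.pi         # rotation-angle step (illustrative value)

def observe(thetas):
    """Collapse a Q-bit string into a classical bit string."""
    return [1 if random.random() < math.sin(t) ** 2 else 0 for t in thetas]

def fitness(bits):
    """Placeholder objective (OneMax): replace with the actual benchmark,
    e.g., a knapsack evaluator."""
    return sum(bits)

def neighbors(x, y):
    """Four neighbors of cell (x, y) on a 2D toroidal grid."""
    return [((x - 1) % GRID_W, y), ((x + 1) % GRID_W, y),
            (x, (y - 1) % GRID_H), (x, (y + 1) % GRID_H)]

def select_attractor(best, x, y):
    """Pick the best stored solution amongst the four neighbors;
    this local best acts as the attractor for cell (x, y)."""
    return max((best[nx][ny] for nx, ny in neighbors(x, y)), key=fitness)

def rotate_towards(thetas, attractor_bits):
    """Move each Q-bit a small step so that the probability of observing
    the attractor's bit value increases (simplified rotation gate)."""
    return [t + DELTA if b == 1 else t - DELTA
            for t, b in zip(thetas, attractor_bits)]

# One illustrative generation of the fine-grained model (no migration).
population = [[[math.pi / 4] * N_GENES for _ in range(GRID_H)]
              for _ in range(GRID_W)]                       # uniform superposition
best = [[observe(population[x][y]) for y in range(GRID_H)]
        for x in range(GRID_W)]                             # best solution per cell

for x in range(GRID_W):
    for y in range(GRID_H):
        attractor = select_attractor(best, x, y)
        population[x][y] = rotate_towards(population[x][y], attractor)
        candidate = observe(population[x][y])
        if fitness(candidate) > fitness(best[x][y]):
            best[x][y] = candidate
```

Because the modulo wrap-around gives every cell exactly four neighbors and the neighborhoods overlap, good solutions diffuse gradually across the grid rather than being broadcast to the whole population at once, which is one way such fine-grained models are understood to influence the exploration/exploitation balance.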
References
[1]
X.-S. Yang, S. Koziel, and L. Leifsson, “Computational optimization, modelling and simulation: recent trends and challenges,” in Proceedings of the 13th Annual International Conference on Computational Science (ICCS '13), vol. 18, pp. 855–860, June 2013.
[2]
E. Alba and M. Tomassini, “Parallelism and evolutionary algorithms,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 5, pp. 443–462, 2002.
[3]
Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer, London, UK, 3rd edition, 1996.
[4]
Z. Michalewicz and D. B. Fogel, How To Solve It: Modern Heuristics, Springer, Berlin, Germany, 2nd edition, 2004.
[5]
D. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1989.
[6]
A. Mani and C. Patvardhan, “Analysis of photoluminescence measurement data from interdiffused quantum wells by real coded quantum inspired evolutionary algorithm,” in Proceedings of the 13th International Workshop on Advanced Computing and Analysis Techniques in Physics Research, Proceedings of Science, Jaipur, India, February 2010.
[7]
K.-H. Han and J.-H. Kim, “Quantum-inspired evolutionary algorithm for a class of combinatorial optimization,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 6, pp. 580–593, 2002.
[8]
M. D. Platel, S. Schliebs, and N. Kasabov, “A versatile quantum-inspired evolutionary algorithm,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '07), pp. 423–430, Singapore, September 2007.
[9]
L. Kliemann, O. Kliemann, C. Patvardhan, V. Sauerland, and A. Srivastav, “A new QEA computing near-optimal low-discrepancy colorings in the hypergraph of arithmetic progressions,” in Proceedings of the 12th International Symposium on Experimental Algorithms, vol. 7933 of Lecture Notes in Computer Science, pp. 67–78, June 2013.
[10]
D. H. Kim, Y. H. Yoo, S. J. Ryu, W. Y. Go, and J. H. Kim, “Automatic color detection for MiroSOT using quantum-inspired evolutionary algorithm,” in Intelligent Robotics Systems: Inspiring the NEXT, vol. 376 of Communications in Computer and Information Science, pp. 11–20, 2013.
[11]
A. C. Kumari, K. Srinivas, and M. P. Gupta, “Software requirements selection using quantum-inspired elitist multi-objective evolutionary algorithm,” in Proceedings of the 1st International Conference on Advances in Engineering, Science and Management (ICAESM '12), pp. 782–787, March 2012.
[12]
H. Lei and K. Qin, “Quantum-inspired evolutionary algorithm for analog test point selection,” Analog Integrated Circuits and Signal Processing, vol. 75, no. 3, pp. 491–498, 2013.
[13]
Y. Li, S. Feng, X. Zhang, and L. Jiao, “SAR image segmentation based on quantum-inspired multiobjective evolutionary clustering algorithm,” Information Processing Letters, vol. 114, no. 6, pp. 287–293, 2014.
[14]
A. Mani and C. Patvardhan, “A hybrid quantum evolutionary algorithm for solving engineering optimization problems,” International Journal of Hybrid Intelligent Systems, vol. 7, no. 3, pp. 225–235, 2010.
[15]
A. Mani and C. Patvardhan, “An improved model of ceramic grinding process and its optimization by adaptive quantum inspired evolutionary algorithm,” International Journal of Simulation: Systems, Science and Technology, vol. 11, no. 3, pp. 76–85, 2012.
[16]
C. Patvardhan, A. Narain, and A. Srivastav, “Enhanced quantum evolutionary algorithms for difficult knapsack problems,” in Proceedings of the International Conference on Pattern Recognition and Machine Intelligence, vol. 4815 of Lecture Notes in Computer Science, Springer, Berlin, Germany, 2007.
[17]
G. Zhang, “Quantum-inspired evolutionary algorithms: a survey and empirical study,” Journal of Heuristics, vol. 17, pp. 303–351, 2011.
[18]
E. Alba and B. Dorronsoro, “The exploration/exploitation tradeoff in dynamic cellular genetic algorithms,” IEEE Transactions on Evolutionary Computation, vol. 9, no. 2, pp. 126–142, 2005.
[19]
K.-H. Han and J.-H. Kim, “Quantum-inspired evolutionary algorithms with a new termination criterion, Hε gate, and two-phase scheme,” IEEE Transactions on Evolutionary Computation, vol. 8, no. 2, pp. 156–169, 2004.
[20]
J. Kennedy and R. Mendes, “Population structure and particle swarm performance,” in Proceedings of the Congress on Evolutionary Computation (CEC '02), vol. 2, pp. 1671–1676, May 2002.
[21]
E. Alba and J. M. Troya, “Improving flexibility and efficiency by adding parallelism to genetic algorithms,” Statistics and Computing, vol. 12, no. 2, pp. 91–114, 2002.
[22]
P. Spiessens and B. Manderick, “A massively parallel genetic algorithm,” in Proceedings of the 4th International Conference on Genetic Algorithms, R. Belew and L. Booker, Eds., pp. 279–286, 1991.
[23]
K. W. C. Ku, M. W. Mak, and W. C. Siu, “Adding learning to cellular genetic algorithms for training recurrent neural networks,” IEEE Transactions on Neural Networks, vol. 10, no. 2, pp. 239–252, 1999.
[24]
G. Folino, C. Pizzuti, and G. Spezzano, “Parallel hybrid method for SAT that couples genetic algorithms and local search,” IEEE Transactions on Evolutionary Computation, vol. 5, no. 4, pp. 323–334, 2001.
[25]
D. A. Sofge, “Prospective algorithms for quantum evolutionary computation,” in Proceedings of the 2nd Quantum Interaction Symposium (QI ’08), College Publications, Oxford, UK, 2008, http://arxiv.org/ftp/arxiv/papers/0804/0804.1133.pdf.
[26]
M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press, Cambridge, UK, 2006.
[27]
A. Narayanan and M. Moore, “Quantum-inspired genetic algorithms,” in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), pp. 61–66, May 1996.
[28]
S. Droste, T. Jansen, and I. Wegener, “A natural and simple function which is hard for all evolutionary algorithms,” in Proceedings of the 26th Annual Conference of the IEEE Industrial Electronics Society (IECON '00), pp. 2704–2709, Nagoya, Japan, October 2000.
[29]
K. de Jong, M. Potter, and W. Spears, “Using problem generators to explore the effects of epistasis,” in Proceedings of the 7th International Conference on Genetic Algorithms, T. Bäck, Ed., pp. 338–345, 1997.
[30]
D. Pisinger, “Where are the hard knapsack problems?” Computers & Operations Research, vol. 32, no. 9, pp. 2271–2284, 2005.
[31]
Y. Feng, G.-G. Wang, Q. Feng, and X.-J. Zhao, “An effective hybrid cuckoo search algorithm with improved shuffled frog leaping algorithm for 0-1 Knapsack problems,” Computational Intelligence and Neuroscience, vol. 2014, Article ID 857254, 17 pages, 2014.
[32]
S. Das and P. N. Suganthan, “Differential evolution: a survey of the state-of-the-art,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.
[33]
R. Mallipeddi, P. N. Suganthan, Q. K. Pan, and M. F. Tasgetiren, “Differential evolution algorithm with ensemble of parameters and mutation strategies,” Applied Soft Computing Journal, vol. 11, no. 2, pp. 1679–1696, 2011.
[34]
A. Gherboudj, A. Layeb, and S. Chikhi, “Solving 0-1 knapsack problems by a discrete binary version of cuckoo search algorithm,” International Journal of Bio-Inspired Computation, vol. 4, no. 4, pp. 229–236, 2012.
[35]
N. Mani, G. Srivastava, A. K. Sinha, and A. Mani, “An evaluation of cellular population model for improving quantum-inspired evolutionary algorithm,” in Proceedings of the 14th International Conference on Genetic and Evolutionary Computation (GECCO '12), pp. 1437–1438, Philadelphia, Pa, USA, July 2012.