Abstract: The knapsack problem is a classic NP-complete problem, and the unbounded knapsack problem is more complex and harder to solve than the general knapsack problem. In this paper we apply Quantum Genetic Algorithms (QGAs) to the unbounded knapsack problem. First, the problem is encoded in the QGA framework and the corresponding gene representation and fitness function are defined. Then, the algorithm searches for the combination of items that satisfies the capacity constraint with the largest total benefit, until the best solution is found. Preliminary experiments indicate that our method produces good results.
Key words: Quantum Genetic Algorithms; Knapsack Problem; Optimization Problems
doi:10.3969/j.issn.1673-0194.2009.15.018
CLC number: TP14    Document code: A    Article ID: 1673-0194(2009)15-0057-04
1 Introduction
The general knapsack problem is a well-known problem in the field of operations research and optimization [1]. It concerns the choice of items to be placed in a mountaineer's knapsack: each item benefits the climber, but the knapsack's capacity is limited. The unbounded knapsack problem [2] places no upper bound on the number of copies of each kind of item: the quantity of each item is unlimited and there are no further restrictions on the choice. Its formulation differs from the general knapsack problem only in the condition on x_i, which may be any non-negative integer:
Maximize f(x) = ∑_{i=1}^{n} p_i x_i
subject to ∑_{i=1}^{n} w_i x_i ≤ C,
x_i ≥ 0, x_i ∈ ℤ, i = 1, …, n.
We have n kinds of items, x_1 through x_n. Each item i has a profit p_i and a weight w_i. The maximum weight that the knapsack can carry is C, and an unlimited number of copies of each item is available.
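As a concrete illustration, the following C++ sketch (our own minimal example, not the paper's implementation; the item data are made up and are not the instance of Section 4) shows one way to represent the problem data and to evaluate the profit and weight of a candidate solution x:

#include <cstdio>
#include <vector>

// One kind of item: profit p_i and weight w_i.
struct Item {
    double profit;
    double weight;
};

// Total profit f(x) = sum_i x_i * p_i of a solution x (copies per item kind).
double totalProfit(const std::vector<Item>& items, const std::vector<int>& x) {
    double p = 0.0;
    for (std::size_t i = 0; i < items.size(); ++i) p += x[i] * items[i].profit;
    return p;
}

// Total weight sum_i x_i * w_i, to be compared against the capacity C.
double totalWeight(const std::vector<Item>& items, const std::vector<int>& x) {
    double w = 0.0;
    for (std::size_t i = 0; i < items.size(); ++i) w += x[i] * items[i].weight;
    return w;
}

int main() {
    // Illustrative data only.
    std::vector<Item> items = {{60, 10}, {100, 20}, {120, 30}};
    double C = 50;
    std::vector<int> x = {2, 1, 0};  // two copies of item 1, one copy of item 2
    std::printf("profit=%.0f weight=%.0f feasible=%d\n",
                totalProfit(items, x), totalWeight(items, x),
                totalWeight(items, x) <= C);
    return 0;
}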
Genetic algorithms (GAs), proposed by Holland in 1975, have been applied in many different areas. In 1989, Goldberg [3] developed the simple genetic algorithm (SGA), which provides an important basis for computer implementations and analysis of GAs. To improve the efficiency of GAs, several hybrid GAs have been proposed. For example, GAs have been combined with fuzzy theory [4], which uses fuzzy inference for decision making and the GA to search for the best solution; with simulated annealing (SA) [5], which strengthens the local search ability of the GA; and with neural networks [6], which speeds up problem solving.
In 2002, Han and Kim proposed quantum genetic algorithms (QGAs) [7-8], which combine GAs with quantum computing. They solved the knapsack problem using a QGA in which individuals are represented by quantum bits (qubits) and the three operators of the classical GA are replaced by a rotation gate that drives the evolution. Their results show that the QGA is superior to traditional GAs in population diversity and parallelism, increases the speed of the algorithm, and reduces premature convergence. In this paper we use a QGA to solve the unbounded knapsack problem.
The remainder of the paper is organized as follows. Section 2 gives a system overview. Section 3 introduces the quantum genetic algorithm settings. Section 4 presents the experimental results. Conclusions are given in Section 5.
2 The system overview
The smallest unit of information stored in a two-state quantum computer is called a quantum bit, or qubit. A qubit may be in a superposition of the "1" state and the "0" state, which can be represented as
|ψ⟩ = α|0⟩ + β|1⟩,
where α and β are complex numbers satisfying
|α|² + |β|² = 1.
The representations of QGAs encoding are defined below.
Definition 1: A qubit is defined by the pair [α; β] with |α|² + |β|² = 1, where |α|² gives the probability that the qubit will be found in the "0" state and |β|² gives the probability that the qubit will be found in the "1" state.
Definition 2: A qubit individual, a string of m qubits, is defined as
q_j^t = [ α_{j1}^t  α_{j2}^t  …  α_{jm}^t ; β_{j1}^t  β_{j2}^t  …  β_{jm}^t ],  j = 1, 2, …, N,
where the first row holds the α amplitudes and the second the β amplitudes, m is the length of the quantum chromosome, and N is the number of quantum chromosomes in the population.
Definition 3: The population at generation t, a set of N quantum chromosomes, is defined as Q(t) = {q_1^t, q_2^t, …, q_N^t}, where N is the size of the population.
Fig.1 The workflow of the QGAs
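To make Definitions 1-3 concrete, the following C++ sketch (our own illustration, not the paper's code) represents a qubit by its amplitude pair (α, β), a quantum chromosome by a vector of m qubits, and the population Q(t) by N such chromosomes. Following the usual QGA convention, all amplitudes are initialized to 1/√2 so that every basis state is observed with equal probability; this initialization is an assumption, since the paper does not state it explicitly.

#include <cmath>
#include <cstdio>
#include <vector>

// Definition 1: a qubit (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
struct Qubit {
    double alpha;
    double beta;
};

// Definition 2: a quantum chromosome q_j^t is a string of m qubits.
using Chromosome = std::vector<Qubit>;

// Definition 3: the population Q(t) is a set of N quantum chromosomes.
using Population = std::vector<Chromosome>;

// Initialize Q(0): every amplitude set to 1/sqrt(2) (assumed convention),
// so each qubit collapses to "0" or "1" with probability 1/2.
Population initPopulation(std::size_t N, std::size_t m) {
    const double a = 1.0 / std::sqrt(2.0);
    return Population(N, Chromosome(m, Qubit{a, a}));
}

int main() {
    Population Q = initPopulation(50, 20);  // e.g. N = 50, m = 20
    std::printf("N=%zu m=%zu alpha=%.3f\n", Q.size(), Q[0].size(), Q[0][0].alpha);
    return 0;
}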
In a QGA a chromosome is represented with quantum bits, so a single chromosome can carry the information of multiple states and represent several states at the same time. This paper applies the QGA proposed by Han and Kim in 2002 to a practical application as a case study. Figure 1 shows the workflow of the algorithm.
3 Quantum Genetic Algorithms Setting
This system is based on the steps of the QGA proposed by Han and Kim. Processing a QGA involves several steps, which we group into four main parts: gene coding, P(t) repair, fitness function calculation, and Q(t) probability update. They are explained as follows:
(1) Gene coding
For each qubit, draw a number uniformly at random from [0, 1] and compare it with |β|², the probability that the qubit is observed as "1". If the random number is smaller than |β|², the qubit is more likely to be "1", so the corresponding binary gene x_{jk}^t is set to 1; otherwise it is set to 0. This is repeated until all items have been processed, and the result is a binary string.
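A minimal sketch of this observation (measurement) step in C++, assuming a chromosome is given as its (α, β) amplitude pairs; the random engine and the fixed seed are our own choices for illustration:

#include <random>
#include <string>
#include <vector>

// One qubit's amplitudes; |alpha|^2 + |beta|^2 = 1.
struct Qubit { double alpha, beta; };

// Observe a quantum chromosome: each qubit collapses to '1' with
// probability |beta|^2, producing one binary string of P(t).
std::string observe(const std::vector<Qubit>& q, std::mt19937& rng) {
    std::uniform_real_distribution<double> u(0.0, 1.0);
    std::string bits;
    bits.reserve(q.size());
    for (const Qubit& qb : q) {
        bits.push_back(u(rng) < qb.beta * qb.beta ? '1' : '0');
    }
    return bits;
}

int main() {
    std::mt19937 rng(42);                                  // fixed seed, illustrative
    std::vector<Qubit> q(20, Qubit{0.7071, 0.7071});       // m = 20 qubits at 1/sqrt(2)
    std::string bits = observe(q, rng);                    // one binary string of P(t)
    return 0;
}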
(2) P(t) repair
P(t) repair revises infeasible solutions so that the knapsack respects the allowed total weight. If the contents of the knapsack exceed the weight limit, items are removed one by one, in order, until the limit is satisfied; if the knapsack is under the weight limit, additional items are added, in order, as long as the restriction still holds. The restriction is ∑_{i=1}^{n} w_i x_i ≤ C.
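The paper does not specify the order in which items are dropped or added during repair; the sketch below (our assumption) processes items by decreasing profit-to-weight ratio, removing copies while the solution is overweight and then greedily adding copies while capacity remains. The fitness of the repaired solution is then simply its total profit, as computed in the next part.

#include <algorithm>
#include <numeric>
#include <vector>

struct Item { double profit, weight; };

// Repair a candidate solution x (copies of each item kind) so that
// sum_i x_i*w_i <= C. Item order is assumed: best profit/weight ratio first.
void repair(const std::vector<Item>& items, std::vector<int>& x, double C) {
    std::vector<std::size_t> order(items.size());
    std::iota(order.begin(), order.end(), 0);
    std::sort(order.begin(), order.end(), [&](std::size_t a, std::size_t b) {
        return items[a].profit / items[a].weight >
               items[b].profit / items[b].weight;
    });

    double w = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) w += x[i] * items[i].weight;

    // Overweight: drop copies, worst ratio first, until feasible.
    for (auto it = order.rbegin(); it != order.rend() && w > C; ++it)
        while (x[*it] > 0 && w > C) { --x[*it]; w -= items[*it].weight; }

    // Underweight: add copies, best ratio first, while they still fit.
    for (std::size_t i : order)
        while (w + items[i].weight <= C) { ++x[i]; w += items[i].weight; }
}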
(3) Fitness function calculation
After P(t) repair, the fitness of each solution is calculated as the total profit of the items chosen for the knapsack, f(x) = ∑_{i=1}^{n} x_i p_i. Substituting the repaired solution into this formula gives the profit of the combination.
(4) Q(t) update
The rotation angle Δθ_i is set with a lookup table such as Table 1, where f(x) is the fitness of the current chromosome in P(t) and f(b) is the fitness of the best chromosome found so far, stored in B(t). The cases are classified as follows, and Figure 2 depicts the polar plot of the rotation gate for Q-bit individuals.
Fig.2 Polar plot of the rotation gate for Q-bit individuals
Table 1 The setting of the rotation angle Δθ_i
In Table 1, x_i is the i-th bit of a solution in P(t), b_i is the i-th bit of the best solution in B(t), f(x) is the profit of the solution in P(t), and f(b) is the profit of the best solution in B(t). The cases are classified as follows. If x_i = 0, b_i = 1 and f(x) < f(b), then when the Q-bit lies in the first or the third quadrant of Figure 2, Δθ_i is set to a positive value to increase the probability of the state "1"; otherwise, when it lies in the second or the fourth quadrant, the value is set to -Δθ_i.
For all other cases, Δθ_i is set to 0, i.e. θ_i = 0 for i = 1, 2, 4, 6, 7, 8. In general, the rotation angles θ_3 and θ_5 are set between 0.01π and 0.09π.
Q(t) is then updated by applying the rotation gate to every qubit of every chromosome:
[α_{jk}^{t+1}; β_{jk}^{t+1}] = U(Δθ_i) [α_{jk}^t; β_{jk}^t],  with  U(Δθ_i) = [cos Δθ_i  -sin Δθ_i; sin Δθ_i  cos Δθ_i],
which yields the new chromosome q_j^{t+1} from q_j^t.
The size of the rotation angle affects the convergence speed. If the value is set too large, the algorithm converges too early; if it is set too small, convergence becomes slow.
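The sketch below (our own illustration, not the paper's implementation) shows the selection of Δθ_i for the single case spelled out above, with all other rows of Table 1 returning 0, and the application of the rotation gate to one qubit; the magnitude 0.03π matches the setting used in Section 4.

#include <cmath>

struct Qubit { double alpha, beta; };

const double kPi = 3.14159265358979323846;

// Rotation angle for one qubit, following the case described in the text:
// only x_i = 0, b_i = 1, f(x) < f(b) gets a non-zero angle (the other rows
// of Table 1 are 0). The sign depends on the quadrant of (alpha, beta).
double deltaTheta(int xi, int bi, bool fxLessThanFb, const Qubit& q,
                  double magnitude = 0.03 * kPi) {
    if (xi == 0 && bi == 1 && fxLessThanFb) {
        // First/third quadrant (alpha*beta > 0): rotate by +magnitude to
        // increase the probability of state "1"; otherwise use -magnitude.
        return (q.alpha * q.beta > 0.0) ? magnitude : -magnitude;
    }
    return 0.0;
}

// Apply the rotation gate U(dtheta) to one qubit of q_j^t.
void rotate(Qubit& q, double dtheta) {
    double a = std::cos(dtheta) * q.alpha - std::sin(dtheta) * q.beta;
    double b = std::sin(dtheta) * q.alpha + std::cos(dtheta) * q.beta;
    q.alpha = a;
    q.beta  = b;
}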
4 Experimental results and analysis
4.1 Experimental arrangements
The system was implemented in C++ on a PC with an Intel Pentium(R) 2.2 GHz processor. We use the twenty items listed in Table 2, the weight limit is set to 200 kg, the number of generations is 200, the population size is 50, and the rotation angle is 0.03π. The QGA finds the best solution at generation 65. The best solution is one copy of item 4, one copy of item 9, and seven copies of item 16, with a total value of 10122, which is optimal.
4.2 Experimental results and discussion
This experiment solves the unbounded knapsack problem with quantum genetic algorithms. In the first part of the experiment we focus on static parameter settings and observe the results. The QGA varies the number of generations in its search for optimal solutions. With the other conditions the same as the previous settings, we set the number of generations to 50, 100, and 200, and executed thirty runs for each setting. The results show that a larger number of generations gives a larger search space, and we compute the average profits. The generation at which the best solution is found is positively correlated with the total number of generations.
Next, we adjust the population size to analyze the optimal solutions, setting it to 20, 50, and 100. The results show that larger populations give better profits. When the population size is 100, the average generation at which the best solution is found is lower than for the other settings, because the search space covered is large enough for the optimal solutions to be found in the early generations.
Then we consider the relation between the rotation angle and the optimal solutions, setting the rotation angle to 0.01π, 0.05π, and 0.09π with the other settings unchanged. The results show that with a rotation angle of 0.01π, both the average profit and the count of best solutions are the highest. At the same time, the average generation at which the best solution is found is larger than for the other angles: a smaller rotation angle slows convergence, so the search narrows more gradually and the optimal solutions are obtained in later generations.
In the second part of the experiment, we add a greedy method to the QGA to enhance performance. In the greedy method, items with a larger benefit-to-weight ratio are better candidates. We evaluate the benefits and weights of the input data, compute the ratios, and place items with larger ratios further to the right, where they get a higher probability in the selection mechanism. We use the essence of the greedy method by always choosing the best chromosome before each run. Using the greedy method alone, the best solution found is only 10098, not the optimum 10122. We therefore combine the greedy method with the QGA; the results are shown in Figure 3. Only the rotation angle and the use of the greedy method are varied. As Figure 3(a) shows, the QGA with the greedy method obtains better profits than the plain QGA. The number of generations needed to reach the best solution is also smaller with the greedy method, as shown in Figure 3(b), because the greedy method makes the algorithm converge earlier. Moreover, the QGA with the greedy method outperforms the plain QGA in the count of best solutions, as shown in Figure 3(c). The experiment therefore demonstrates that adding the greedy method to the QGA can raise the algorithm's performance.
Fig.3 The comparison of QGAs and QGAs with Greedy method
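The sketch below (our own illustration under the assumptions above) shows a greedy construction by benefit-to-weight ratio for the unbounded knapsack: items are sorted by ratio and copies of the best-ratio items are packed first. Such a greedy solution can then be used to seed or bias the QGA as described above; note that on the paper's instance the greedy value alone (10098) is not optimal.

#include <algorithm>
#include <numeric>
#include <vector>

struct Item { double profit, weight; };

// Greedy construction for the unbounded knapsack: pack as many copies as
// possible of the item with the best profit/weight ratio, then the next, ...
// Returns the number of copies of each item kind.
std::vector<int> greedyFill(const std::vector<Item>& items, double C) {
    std::vector<std::size_t> order(items.size());
    std::iota(order.begin(), order.end(), 0);
    std::sort(order.begin(), order.end(), [&](std::size_t a, std::size_t b) {
        return items[a].profit / items[a].weight >
               items[b].profit / items[b].weight;
    });

    std::vector<int> x(items.size(), 0);
    double w = 0.0;
    for (std::size_t i : order) {
        while (w + items[i].weight <= C) { ++x[i]; w += items[i].weight; }
    }
    return x;
}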
5 Conclusion and future work
In this paper, we solved the unbounded knapsack problem using a QGA that represents individuals with quantum bits and replaces the three operators of the traditional GA with a rotation gate to drive the evolution. The unbounded knapsack problem is more complex than the general knapsack problem. Using the QGA overcomes the slow convergence of traditional GAs, and adding a greedy method to the QGA further enhances performance. Our approach finds the optimal solution with a multiple-selection strategy over a wide search space. In future work we will look for the best relationship between the number of generations and the rotation angle in order to improve the efficiency of the method.
References
[1] K L Li, G M Dai and Q H Li. A Genetic Algorithm for the Unbounded Knapsack Problem[C]. IEEE Conference on Machine Learning and Cybernetics, Vol.3, 2003:1586-1590.
[2] Wikipedia. Knapsack problem[EB/OL]. http://en.wikipedia.org/wiki/Knapsack_problem.
[3] D E Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning[M]. Addison-Wesley, 1989.
[4] B Chakraborty. Genetic Algorithm with Fuzzy Fitness Function for Feature Selection[C]. Proc. of the IEEE Int. Sym. on Industrial Electronics, Iwate, Japan, 2002: 315-319.
[5] L Wang and D Z Zheng. A Modified Genetic Algorithm for Job Shop Scheduling[J]. International Journal of Advanced Manufacturing Technology, 2002, 20(1): 72-76.
[6] N Chaiyaratana and A M S Zalzala. Hybridization of Neural Networks and Genetic Algorithms for Time-Optimal Control[C]. Proceedings of the Congress on Evolutionary Computation, Vol.1, 1999: 389-396.
[7] K H Han and J H Kim. Quantum-Inspired Evolutionary Algorithm for a Class of Combinatorial Optimization[J]. IEEE Transactions on Evolutionary Computation, 2002, 6(6): 580-593.
[8] K H Han and J H Kim. Genetic Quantum Algorithm and its Application to Combinatorial Optimization Problems[C]. IEEE Conference on Evolutionary Computation, 2000: 1354-1360.