A New Cuckoo Search

In this paper, we formulate a New Cuckoo Search (NCS) for solving optimization problems. The algorithm is based on the obligate brood parasitic behavior of some cuckoo species, in combination with the Lévy flight behavior of some birds and fruit flies, and it further incorporates particle swarm optimization (PSO), an evolutionary computation technique. We test NCS on a set of continuous benchmark functions and compare the optimization results, validating the proposed NCS against large-size test functions and comparing its performance with those of PSO and the original Cuckoo Search. Finally, we discuss the implications of the results and suggestions for further research.


Introduction
In real-world production and management there are many complicated optimization problems, and scientists have continually proposed new intelligent algorithms to solve them. For example, PSO was inspired by the swarm intelligence of fish schools and bird flocks [1] [2]. Such nature-inspired metaheuristic algorithms have been applied to a wide range of optimization problems, including NP-hard problems such as the travelling salesman problem (TSP) and scheduling problems [3] [4]. Inspired by the interesting breeding behavior of certain cuckoo species, namely brood parasitism, Yang and Deb [5] formulated the Cuckoo Search (CS) algorithm. In [6], Yang and Deb review the fundamental ideas of CS, its latest developments, and its applications; they analyze the algorithm, gain insight into its search mechanisms, and explain why it is efficient.
In this paper, we formulate a New Cuckoo Search (NCS) with a different evolution mode for solving function optimization problems. The NCS is based on the obligate brood parasitic behavior of some cuckoo species in combination with the Lévy flight behavior of some birds and fruit flies; moreover, it integrates PSO, an evolutionary computation technique that has gained particular prominence for its ease of operation and its ability to arrive quickly at an optimal or near-optimal solution. The algorithm combines the advantages of CS and PSO, and avoids the tendency to get stuck in a near-optimal solution, especially for middle- or large-size optimization problems. This work differs from existing ones in at least three aspects: (i) it proposes the iterative formula of NCS, which combines the iterative equations of CS and PSO; (ii) it finds the best combinations of NCS parameters for optimization problems of different sizes; (iii) by strictly analyzing the performance of NCS, we validate it against test functions and compare its performance with those of PSO and CS under different random-coefficient generators.
Finally, we discuss the implications of the results and suggestions for further research.

The model of new CS algorithm
Yang [7] provides an overview of CS and the firefly algorithm, as well as their latest developments and applications. In this paper, we study different random-distribution numbers and their influence on the algorithm, and we present a new CS with an evolutionary pattern. The New Cuckoo Search (NCS) presented here combines CS and PSO: it is based on the obligate brood parasitic behavior of some cuckoo species together with the Lévy flight behavior of some birds and fruit flies. In the evolution process, the nests/particles/solutions of the next generation share the historical best of each nest and the global best. NCS improves upon the PSO and CS variants to increase solution accuracy without significantly sacrificing search speed; the details are given below.
Suppose that the search space is D-dimensional and that m nests/particles/solutions form the colony. The ith nest/particle is a D-dimensional vector X_i = (x_{i1}, x_{i2}, ..., x_{iD}), i = 1, 2, ..., m, meaning that the ith nest/particle located at X_i in the search space is a potential solution. Its fitness is computed by substituting it into a designated objective function. The historical best of the ith nest/particle/solution is denoted P_i = (p_{i1}, p_{i2}, ..., p_{iD}), called IBest, and the global best P_g = (p_1, p_2, ..., p_D), called GBest. At the same time, some new nests/particles/solutions are generated by a Lévy walk around the best solution obtained so far, which speeds up the local search.
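The bookkeeping just described (m nests as D-dimensional vectors, fitness from a designated objective, IBest per nest and a single GBest) can be sketched as follows. This is an illustrative sketch, not the paper's code; the sphere objective and all function names are assumptions for the example.

```python
import numpy as np

def sphere(x):
    """Designated objective function (sphere, a common benchmark stand-in)."""
    return float(np.sum(x ** 2))

def init_population(m, D, lo, hi, f, rng):
    # m nests/particles/solutions, each a D-dimensional vector in [lo, hi]^D.
    X = rng.uniform(lo, hi, size=(m, D))
    fit = np.array([f(x) for x in X])      # fitness of each nest
    P_i = X.copy()                          # IBest: historical best per nest
    P_i_fit = fit.copy()
    g = int(np.argmin(fit))                 # GBest: lowest-fitness nest overall
    P_g, P_g_fit = X[g].copy(), fit[g]
    return X, fit, P_i, P_i_fit, P_g, P_g_fit

rng = np.random.default_rng(0)
X, fit, P_i, P_i_fit, P_g, P_g_fit = init_population(20, 5, -5.0, 5.0, sphere, rng)
```

At initialization IBest coincides with the population itself; both bests are refreshed each generation as the update rule produces new candidates.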
After these two best values are found, the nests/particles/solutions of NCS are updated with Lévy flight according to formula (1).
In (1), the Lévy term is the same as in the original CS, and its step size follows a random-walk Lévy distribution. Part of the distant random positions lie far from the current optimum, which helps ensure that the system does not fall into a local optimum. The weight index δ ∈ [0, 1] is chosen according to the optimization problem at hand; it reflects the relative importance of the generation-t Lévy flight, the individual historical best P_i^(t), and the global best P_g^(t). In addition, during the NCS evolution process, the global best P_g^(t) may be forcibly preserved in K% of the population, K ∈ (0, 100). Other parameters, such as α and R_1, R_2 ∈ (0, 1), are the same as in [2].
The search is a repeated process, and the stopping criteria are that the maximum iteration number is reached or the minimum error condition is satisfied; the stopping condition depends on the problem being optimized. In the NCS evolution process, the nests/particles/solutions are updated mainly through three parts: the Lévy walk; the distance between the individual historical best P_i^(t) and the current nest/particle/solution; and the distance between the global best P_g^(t) and the current nest/particle/solution. There are significant differences among NCS, CS, and PSO. First, their iterative equations are not the same: NCS integrates the advantages of the CS and PSO algorithms, sharing the best information among nests/particles/solutions, and it uses a form of elitism and/or selection similar to that used in harmony search (HS). Second, the randomization is more efficient because the step length is heavy-tailed, so arbitrarily large steps are possible. Third, the parameter δ can be tuned, making it easy to find the most efficient parameters for a wide class of optimization problems. In addition, NCS can be extended to a meta-population algorithm.
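Since formula (1) is not reproduced in this extract, the sketch below assumes a plausible form of the update based on the three parts just listed: a heavy-tailed Lévy walk (via Mantegna's algorithm, standard in CS implementations) blended with PSO-style attraction toward IBest and GBest, weighted by δ and scaled by α, with random coefficients R1, R2. All names and the exact blending are illustrative assumptions, not the paper's equation.

```python
import math
import numpy as np

def levy_step(D, rng, beta=1.5):
    # Mantegna's algorithm: heavy-tailed Levy-stable step of dimension D.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, D)
    v = rng.normal(0.0, 1.0, D)
    return u / np.abs(v) ** (1 / beta)

def ncs_update(X, P_i, P_g, delta=0.5, alpha=0.01, rng=None):
    # One assumed NCS generation: Levy walk around GBest plus PSO-style
    # attraction to IBest (P_i) and GBest (P_g), weighted by delta.
    rng = rng or np.random.default_rng()
    X_new = np.empty_like(X)
    for i in range(len(X)):
        R1, R2 = rng.random(), rng.random()
        levy = alpha * levy_step(X.shape[1], rng) * (X[i] - P_g)
        X_new[i] = (X[i] + delta * levy
                    + (1 - delta) * (R1 * (P_i[i] - X[i]) + R2 * (P_g - X[i])))
    return X_new

rng = np.random.default_rng(1)
X = rng.uniform(-5.0, 5.0, (10, 4))
X2 = ncs_update(X, P_i=X.copy(), P_g=X[0].copy(), delta=0.5, rng=rng)
```

The heavy-tailed step occasionally produces very large moves, which is exactly the property the text credits for escaping local optima.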

Test functions
To proof-test the effectiveness of NCS on optimization problems, 14 representative benchmark functions with different dimensions, described in Table 1 (where f_10 is given by formula (2)), are used to compare NCS with CS and PSO. The NCS procedure can be summarized as follows (reconstructed from the fragments of the original listing):
1. Generate the initial population of nests/particles/solutions randomly, and set the global best P_g^(t) to the lowest-fitness nest/particle/solution in the whole population; generate the initial individual historical bests P_i^(t) randomly; set t := 0.
2. While t < EndGen and the stopping criterion is not met:
   a. t := t + 1; generate the next nests/particles/solutions by equation (1).
   b. Evaluate the nests/particles/solutions: compute each fitness F_i in the population, find the new global best by comparison, and update P_g^(t).
   c. Keep the best nest/particle/solution; rank the solutions and find the current best.
   d. Choose a nest/particle/solution j in the population randomly; if F_i > F_j, replace j by the new nest/particle/solution.
   e. Abandon a fraction p_α of the worse nests/particles/solutions and build new ones (X_randn is generated randomly).
   f. Keep the best nest/particle/solution; rank the solutions and find the current best.
   g. Randomly select K% of the nests/particles/solutions and forcibly replace them with the highest-quality nest/particle/solution P_g.
Remark 1: In Table 2, Fun and Dim denote the function and its dimension, respectively, and Best is the function's optimum value. PS and EG indicate the algorithm's population size and terminating generation number. The better solutions and corresponding parameters found by the NCS algorithm are shown in bold; the best minimum, average, and standard deviation are shown in italics and underlined, respectively.
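Table 1 itself is not reproduced in this extract. As an illustration, two multimodal benchmarks commonly found in 14-function suites of this kind are sketched below; these are assumed examples, not necessarily the paper's f_1, ..., f_14.

```python
import numpy as np

def rastrigin(x):
    # Highly multimodal; global minimum 0 at x = 0.
    x = np.asarray(x, dtype=float)
    return float(10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)))

def ackley(x):
    # Nearly flat outer region with many local minima; global minimum 0 at x = 0.
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
                 + 20.0 + np.e)
```

Functions like these are useful precisely because gradient-free methods such as CS, PSO, and NCS must navigate many local minima to reach the known global optimum at the origin.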

Experimental results and comparison on large-size test functions
From Table 2 one can observe that, in general, δ ∈ [0.5, 0.8] gives the highest performance, since these values yield smaller minima and arithmetic means than the solutions obtained with other settings; δ ≈ 0.5 in particular has better search efficiency. In NCS, different optimization problems call for different values of δ, but on the whole δ ≈ 0.5 works best. CS almost diverges on functions f_4 and f_12, whereas NCS shows excellent performance on all the test functions: the minima found by NCS are often 1/1000, or even 1/10000, of those found by CS and PSO. Its search ability is especially strong for large-size problems and for functions whose arguments range over large intervals. From the simulation results we conclude that NCS is clearly better than PSO and CS for continuous non-linear function optimization. The NCS search performance is strong and yields comparatively better minima, means, and standard deviations, but it computes extra terms per update and therefore costs more hardware resources; each run takes slightly more time, which is negligible. The convergence curves of the most effective runs, and the distributions of the optima found over 32 runs, comparing NCS with PSO and CS on the 14 instances, are shown in Figures 1-4. They show that the convergence rate of NCS is clearly faster than that of PSO and CS on every benchmark function, and that NCS is especially more efficacious than PSO for middle- and large-size optimization problems. Accordingly, we can state that NCS is more effective than PSO and CS.
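The min/mean/standard-deviation statistics over 32 independent runs reported in Table 2 can be collected with a small harness like the one below. The random-search stand-in optimizer is a hypothetical placeholder for this sketch; in the paper's experiments that slot is filled by NCS, CS, or PSO.

```python
import numpy as np

def run_stats(optimize, n_runs=32, seed0=0):
    # Run the optimizer n_runs times with independent seeds and report the
    # minimum, mean, and standard deviation of the best values found.
    bests = np.array([optimize(np.random.default_rng(seed0 + r))
                      for r in range(n_runs)])
    return bests.min(), bests.mean(), bests.std()

def random_search(rng, iters=200, D=5):
    # Toy stand-in optimizer: random search on the sphere function.
    best = float("inf")
    for _ in range(iters):
        x = rng.uniform(-5.0, 5.0, D)
        best = min(best, float(np.sum(x ** 2)))
    return best

mn, mean, sd = run_stats(random_search)
```

Fixing a distinct seed per run keeps the 32 runs independent yet reproducible, so the same table entries can be regenerated exactly.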

Conclusions and perspectives
To address the shortcomings of the CS and PSO algorithms, especially when solving middle- or large-size problems, we proposed NCS. Using 14 representative instances with different dimensions and comparing with PSO and CS, the performance of NCS shows that it is efficacious for solving optimization problems.
The proposed NCS algorithm can therefore be considered an effective mechanism. There are a number of research directions that can be regarded as useful extensions of this work. Although the algorithm was tested on 14 representative instances, a more comprehensive computational study should be conducted to evaluate it. Future work may include experiments with different parameters to assess the performance of NCS, as well as finding the best parameters and usage scenarios for applications such as the TSP and scheduling.

Acknowledgements
This work is supported by the Academic Discipline Project of Shanghai Dianji University (Number: 16YSXK04) and the Shanghai Natural Science Foundation (Number: 14ZR1417300).