
An Exploration of Two Evolutionary Computing for Function Optimization

LUAN Shao-jun, GAO Xue-dong
China Management Informationization (中國管理信息化), 2009, Issue 15 (published 2009-04-29)

Abstract: In this paper, we implement two ECs: a particle swarm optimizer (PSO) and a Society of Hill-climbers (SoHC). Each EC is implemented to evolve a solution whose fitness exceeds a threshold for the F7 function. We use the same parameter sets to compare the performance of the two ECs in terms of success rate, average number of function evaluations, and average best fitness.

Key words: Evolutionary Computing; PSO Full-model; Function Optimization

doi:10.3969/j.issn.1673-0194.2009.15.021

CLC number: TP3    Article character: A    Article ID: 1673-0194(2009)15-0068-04

1 Introduction

In this paper, we implement two ECs: a particle swarm optimizer (PSO) and a Society of Hill-climbers (SoHC). Each EC is implemented to evolve a solution whose fitness exceeds a threshold for the F7 function.

Hill-climbers are local search techniques, so it is important to define the neighborhood of the local search. A Society of Hill-climbers (SoHC), as proposed by Sebag and Schoenauer (1997), can be seen as a (μ+λ)-EC combined with a repoussoir, an adaptive external memory structure used to 'push' the search away from known suboptimal solutions[1].

The aim of this paper is to compare the performance of the two types of algorithms in terms of success rate, average number of function evaluations, and average best fitness.

Section 2 describes the problem to be solved with the two types of evolutionary computation. Section 3 introduces the algorithms and their implementation. Section 4 describes the experimental parameter setup. Section 5 presents and analyzes the experimental results.

2 Problem Statement

In this paper, a particle swarm optimizer (PSO) Full-model and a Society of Hill-climbers (SoHC) are implemented to evolve a solution whose fitness exceeds a threshold (23.30923) for the F7 function:

F7(x, y) = (x² + y²)^0.25 × [sin²(50(x² + y²)^0.1) + 1.0]

The problem is allowed a maximum of 4 000 function evaluations, and x and y take values in the range [-100, 100].

Clearly, this objective function is complex, and the optimum cannot be found simply by gradient descent or analytical methods.
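For reference, the objective can be written as a short Python function (our own sketch; the paper does not list code):

```python
import math

def f7(x, y):
    """F7 objective (maximized): (x^2+y^2)^0.25 * [sin^2(50*(x^2+y^2)^0.1) + 1.0]."""
    r2 = x * x + y * y
    return r2 ** 0.25 * (math.sin(50.0 * r2 ** 0.1) ** 2 + 1.0)
```

The sin² term oscillates rapidly in the radius, which is why the landscape is highly multimodal and unsuited to gradient-based search.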

We apply the PSO Full-model and the SoHC to solve this problem.

3 Evolutionary Computation (ECs) Implementation

3.1 The PSO Full-model (using the off-the-shelf parameter setting)

We set φ1>0 and φ2>0 to get Full Model swam type for using Particle Swarm Optimization for F7 function optimization here. And we use the off-the-shelf parameter setting as the base algorithm.

The PSO Full-model algorithm is implemented with the following steps:

Step 1 Initialization: each individual has x and y values, created randomly within the range [-100, 100]. Each x has a velocity Vx and each y has a velocity Vy; like the positions, the velocities are initialized to random values within [-100, 100]. The μ generated individuals are evaluated with the fitness function F7.

The personal best P of each individual is initialized to that individual's initial position.

The global neighborhood best Pg is obtained by selecting the best individual in the population according to fitness.

Step 2 Swarm Moving: at each step, the μ individuals first update their velocities and then move. The formulas are:

When Xmin

Vi=K(Vi+φ1×rand()×(Pi-Xi)+φ2×rand()×(Pg-Xi),

Otherwise

Vi=0.

Xi+1=Xi+Vi,

If Xi>Xmax, set X value as Xmax;

IfXi

Here K=2/|2-φ-φ2-4×φ|; φ = 4.1;

φ1+φ2=φ .

The individuals are then evaluated, and the P and Pg values are updated. Pg is updated asynchronously: after each individual moves and is evaluated, it is immediately used to update Pg.

Step 2 repeats until the best fitness exceeds the threshold (23.30923) or the total number of evaluations reaches 4 000.

The whole process runs 100 times for each parameter set.
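The steps above can be sketched as follows (a minimal illustration; the function name, seed handling, and loop structure are our own, while the gated velocity rule, asynchronous Pg update, and stopping criteria follow the description above):

```python
import math
import random

def f7(x, y):
    r2 = x * x + y * y
    return r2 ** 0.25 * (math.sin(50.0 * r2 ** 0.1) ** 2 + 1.0)

def pso_full_model(mu=12, phi1=2.05, phi2=2.05, xmin=-100.0, xmax=100.0,
                   threshold=23.30923, max_evals=4000, seed=0):
    rng = random.Random(seed)
    phi = phi1 + phi2
    k = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

    # Step 1: random positions and velocities in [xmin, xmax], personal/global bests.
    pos = [[rng.uniform(xmin, xmax), rng.uniform(xmin, xmax)] for _ in range(mu)]
    vel = [[rng.uniform(xmin, xmax), rng.uniform(xmin, xmax)] for _ in range(mu)]
    pbest = [p[:] for p in pos]
    pfit = [f7(p[0], p[1]) for p in pos]
    evals = mu
    g = max(range(mu), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]

    # Step 2: move the swarm until the threshold is reached or evaluations run out.
    while gfit < threshold and evals < max_evals:
        for i in range(mu):
            for d in range(2):
                if xmin < pos[i][d] < xmax:     # X-gated velocity rule (Table 1 variant)
                    vel[i][d] = k * (vel[i][d]
                                     + phi1 * rng.random() * (pbest[i][d] - pos[i][d])
                                     + phi2 * rng.random() * (gbest[d] - pos[i][d]))
                else:
                    vel[i][d] = 0.0
                pos[i][d] = min(max(pos[i][d] + vel[i][d], xmin), xmax)
            fit = f7(pos[i][0], pos[i][1])
            evals += 1
            if fit > pfit[i]:
                pfit[i], pbest[i] = fit, pos[i][:]
            if fit > gfit:                      # asynchronous update of Pg
                gfit, gbest = fit, pos[i][:]
            if gfit >= threshold or evals >= max_evals:
                break
    return gfit, evals

best, used = pso_full_model()
```

Dropping the `if xmin < pos[i][d] < xmax` gate (i.e. always applying the velocity update) gives the "not using X to determine V" variant reported in Table 2.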

3.2 A Society of Hill-climbers

A Society of Hill-climbers is composed of a (μ+λ)-EC and a repoussoir, an adaptive external memory structure used to 'push' the search away from known suboptimal solutions. The algorithm has six parameters in total: μ, λ, M, R, α, T.

μ represents the number of Hill-climbers.

λ represents the number of offspring created.

M represents the number of bits to be mutated to create an offspring.

R represents the percentage of the worst-fit CSs that will be used to create AvgWorst in order to update the repeller.

α represents the decay rate for the components of R.

T is the tournament size for selecting a variable (gene) of the parent to mutate.

Among these parameters, the only sensitive one for a society of hill-climbers is M.

The basic steps of this algorithm are:

Step 1 Initialization: the population size is μ. The first generation is produced randomly. Values are represented in binary code, and all values lie in the range [-100, 100]. Each component of the repoussoir is initialized to 0.5.
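The paper does not spell out the binary encoding; one common choice, assumed here, maps an n-bit string linearly onto [-100, 100]:

```python
def decode(bits, lo=-100.0, hi=100.0):
    """Map a bit string such as '0110011010' linearly onto [lo, hi]."""
    n = len(bits)
    return lo + int(bits, 2) / (2 ** n - 1) * (hi - lo)

# '0' * n decodes to lo, '1' * n decodes to hi.
```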

Step 2 Create offspring: creating offspring is more involved. A total of λ children are created, so every parent creates λ/μ children. A tournament is used to select each of the M bits of the parent to mutate. The tournament method is as follows:

The tournament function has three parameters: the parent, the repoussoir, and T. Randomly select T bit positions i1, i2, …, iT from the parent's binary code, and return the position ik for which |parent(ik) - Repoussoir(ik)| is minimal. The selected bit is then flipped from 1 to 0 or from 0 to 1. This is repeated for M bits.
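The tournament mutation can be sketched as follows (hypothetical helpers of our own, assuming the chromosome and the repoussoir are lists of equal length):

```python
import random

def tournament_bit(parent, repoussoir, t, rng=random):
    """Pick T random bit positions and return the one whose bit is closest
    to the corresponding repoussoir component."""
    candidates = [rng.randrange(len(parent)) for _ in range(t)]
    return min(candidates, key=lambda i: abs(parent[i] - repoussoir[i]))

def mutate(parent, repoussoir, m, t, rng=random):
    """Create one child by flipping M tournament-selected bits of the parent."""
    child = parent[:]
    for _ in range(m):
        i = tournament_bit(child, repoussoir, t, rng)
        child[i] = 1 - child[i]          # flip the selected bit
    return child
```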

Step 3 Parent Selection: there are now μ + λ individuals in total. The best μ of them are selected as the parents of the next generation.

Step 4 Update Repoussoir: the hill-climbers are provided with some memory of past generations, the repeller, for the next generation, so the repoussoir must be updated whenever a new generation is produced. The R% worst individuals are selected from the μ + λ individuals, and their bitwise average (AvgWorst) is computed; this is an array storing the average of each bit as a component. The repoussoir is then updated with Repeller = (1 - α) × Repeller + α × AvgWorst.
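The repoussoir update can be sketched as follows (function and variable names are ours; lower fitness is worse, since F7 is maximized):

```python
def update_repoussoir(repoussoir, population, fitnesses, r=0.5, alpha=0.1):
    """Blend the bitwise average of the worst R fraction into the repoussoir."""
    order = sorted(range(len(population)), key=lambda i: fitnesses[i])
    worst = [population[i] for i in order[:max(1, int(r * len(population)))]]
    n_bits = len(repoussoir)
    # AvgWorst: per-bit average over the worst individuals.
    avg_worst = [sum(ind[b] for ind in worst) / len(worst) for b in range(n_bits)]
    # Repeller = (1 - alpha) * Repeller + alpha * AvgWorst
    return [(1 - alpha) * repoussoir[b] + alpha * avg_worst[b] for b in range(n_bits)]
```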

The results are shown in Table 3.

4 Experimental Setup

To compare the two EC algorithms, we use the same parameter values, and finally identify the parameter set that gives each EC algorithm its best result.

(1) μ (Population size):{2,12,30,50,100}.

(2) λ (Offspring size ):{4, 24,60,100,200}.

(3) M (the number of bits to be mutated to create an offspring): {1,2, 5}.

(4) R (the percentage of the worst fit CSs that will be used to create the AvgWorst in order to update the repeller.):{0.5}.

(5) α (the decay rate for the components of R): {0.01, 0.1, 1.0}.

(6) T (the tournament size for selecting a variable (gene) of the parent to mutate):{2,4,6}

The maximum number of function evaluations for both algorithms is 4 000. For each parameter set, the algorithm runs 100 times, and we compute the success rate, average number of function evaluations, standard deviation of function evaluations, average best fitness found, and standard deviation of the best fitness found.
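The per-parameter-set statistics can be computed as in this sketch (our own helper; the paper does not state whether A.F.E averages over all runs or only successful ones, so here it averages over all runs):

```python
import statistics

def summarize(runs, threshold=23.30923):
    """runs: list of (best_fitness, n_evals) pairs, one per independent run."""
    successes = [e for b, e in runs if b >= threshold]
    bests = [b for b, _ in runs]
    return {
        "success_rate": 100.0 * len(successes) / len(runs),   # S.R (%)
        "avg_evals": statistics.mean(e for _, e in runs),      # A.F.E
        "std_evals": statistics.stdev(e for _, e in runs),     # S.F.E
        "avg_best": statistics.mean(bests),                    # A.B.F
        "std_best": statistics.stdev(bests),                   # S.B.F
    }
```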

5 Results

In the results tables, the parameters are denoted as follows:

μ: population size

φ1: cognition component coefficient

φ2: social component coefficient

λ : offspring size

M : the number of mutated bits

R : the percentage of worst-fit CSs used to update the repeller

α : the decay rate for the components of R

T : the tournament size for selecting a variable of the parent to mutate

S.R : Success Rate (%)

A.F.E : Average number of function evaluations

S.F.E : Standard deviation of function evaluations

A.B.F : Average best fitness found

S.B.F : Standard deviation of the best fitness found

#: means don’t care

5.1 The PSO Full-model (using the off-the-shelf parameter setting)

We use the off-the-shelf parameter setting as our base algorithm. It uses a global neighborhood and updates the Pg value asynchronously. To test for better results, we choose some particle numbers other than 30, and φ1 and φ2 values other than 2.8 and 1.3, in the experimental setting:

μ = {2, 12, 30, 50, 100}

φ= 4.1

φ1 = {1.00, 1.30, 2.05, 2.80}

φ2 = φ - φ1

Table 1 lists the results of using the X value to determine the V value, as in the off-the-shelf setting:

When Xmin < Xi < Xmax:

Vi = K × (Vi + φ1 × rand() × (Pi - Xi) + φ2 × rand() × (Pg - Xi));

otherwise, Vi = 0.

Table 2 lists the results of NOT using the X value to determine the V value:

Vi = K × (Vi + φ1 × rand() × (Pi - Xi) + φ2 × rand() × (Pg - Xi)), regardless of whether Xmin < Xi < Xmax.

The results show that, for F7 function optimization, not using the X value to determine the V value gives better results, as seen in Table 2.

From Table 1, when using the X value to determine the V value, the best success rate is 77%, obtained with the off-the-shelf parameter values.

Regarding the particle number in Table 1, a particle number of 50 or 30 yields better success rates than 12 or 100; a particle number of 2 gives the worst results.

Also from Table 1, when using the X value to determine the V value, φ1 = 2.8 generally gives better results than the other φ1 values we tried.

From Table 2, when not using the X value to determine the V value, the best success rate is 100% with an average of 442.1 function evaluations, obtained with μ = 12, φ1 = 2.05, and φ2 = 2.05.

Regarding the particle number in Table 2, the results worsen as the particle number changes from 12 to 30, 50, 100, and finally 2. As in Table 1, a particle number of 2 gives the worst results.

Also from Table 2, as in Table 1, φ1 = 2.8 generally gives better results than the other values tried, except that with a particle number of 12, φ1 = 2.05 performs somewhat better than 2.8.

5.2 A Society of Hill-climbers

A Society of Hill-climbers (SoHC) can be seen as a (μ+λ)-EC combined with a repoussoir, an adaptive external memory structure used to 'push' the search away from known suboptimal solutions. We set six parameters to examine the performance of the algorithm: <μ, λ, M, R, α, T>. Their meanings were introduced in Section 3.2.

The six parameter values we set are:

μ = {2,12,30,50,100};

λ= {4, 24,60,100,200};

M = {1, 2, 5};

R = {0.5};

α = {0.01,0.1,1.0};

T = {2, 4, 6};

Table 1 PSO Full-model (V value determined by X value)

Table 2 PSO Full-model (V value not determined by X value)

A larger population size increases the diversity of a generation. We set the population size μ = {2, 12, 30, 50, 100}. In Table 3, when μ equals 50 or 100, the success rate reaches as high as 58%.

Parameter M plays an important role (see Table 3). M = 1 or 2 performs better than M = 5. Comparing only M = 1 and M = 2: for population sizes of 2, 12, and 30, M = 2 performs slightly better than M = 1; for population sizes above 30, Table 3 shows that M = 1 performs slightly better than M = 2.

Parameter α does not have a large effect on the success rate in this algorithm. We believe that with a larger number of bits it would have an effect.

Parameter T is the tournament size used when selecting bits for mutation. A larger T gives a better chance that two offspring created from one parent differ from each other, but in this algorithm T has no large effect on the success rate.

Over all parameter values, the success rate reaches its highest, 59%, when the population size μ is 30, λ is 60, M is 1, R is 0.5, α is 1, and T is 2. Parameter M plays a more important role than the population size μ in finding a solution.

The tests above use a chromosome length of 10 bits. After trying other values, we found that with 5 bits the results are much better: almost all parameter sets reach a 100% success rate. Table 4 shows these results.

6 Discussions and summary

Table 5 compares the performance of the two ECs. The PSO Full-model is much more effective in terms of success rate, while the two ECs achieve the same average best fitness. Hence PSO performs better on this problem.

Table 3 SoHC results with 10-bit chromosomes

Table 5 Comparison of the two ECs

Type of EC | μ | φ1 | φ2 | λ | M | R | α | T | S.R | A.F.E | S.F.E | A.B.F | S.B.F
PSO Full-model | 12 | 2.05 | 2.05 | # | # | # | # | # | 100 | 442.1 | 132.65 | 23.309 2 | 0.000 002 7
SoHC | 30 | # | # | 60 | 1 | 0.5 | 1 | 2 | 59 | 2 628 | 1 348.1 | 23.309 2 | 0.000 098

Reference

[1] M. Sebag, M. Schoenauer. A Society of Hill-climbers[C]// Proceedings of the Fourth IEEE International Conference on Evolutionary Computation, 1997: 319-324.

