Improved particle swarm optimization algorithm with probability convergence

 智能系统学报 2017, Vol. 12, Issue (4): 511-518. DOI: 10.11992/tis.201610004

### Cite this article

QIAN Weiyi, LI Ming. Improved particle swarm optimization algorithm with probability convergence[J]. CAAI Transactions on Intelligent Systems, 2017, 12(4): 511-518. DOI: 10.11992/tis.201610004.

Improved particle swarm optimization algorithm with probability convergence
QIAN Weiyi, LI Ming
College of Mathematics and Physics, Bohai University, Jinzhou 121013, China
Abstract: The particle swarm optimization (PSO) algorithm is a stochastic optimization algorithm that is not guaranteed to converge to a global optimal solution with probability 1. In this paper, we present a new PSO algorithm that converges in probability. It introduces two mutation operators, one with exploration ability and one with exploitation ability, which are applied to a particle's previous best position with a certain probability. The resulting algorithm converges to the optimum solution with probability 1. We applied the proposed algorithm to 13 typical test functions and compared its performance with that of other PSO algorithms. The numerical results show that the proposed algorithm improves solution precision and convergence speed.
Key words: particle swarm optimization    stochastic optimization algorithm    mutation operator    probability convergence    global optimization    evolutionary computation    heuristic algorithm    Gaussian distribution

1 Standard PSO algorithm

 $v_{id}^{k + 1} = \omega v_{id}^k + {c_1}{r_1}\left( {p_{id}^k - x_{id}^k} \right) + {c_2}{r_2}\left( {p_{gd}^k - x_{id}^k} \right)$ (1)
 $x_{id}^{k + 1} = x_{id}^k + v_{id}^{k + 1}$ (2)

 ${\omega _k} = \left( {{\omega _{{\rm{start}}}} - {\omega _{{\rm{end}}}}} \right)\left( {\frac{{{K_{\max }} - k}}{{{K_{\max }}}}} \right) + {\omega _{{\rm{end}}}}$ (3)
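As a reading aid, equations (1)-(3) can be sketched in Python. The population layout, the acceleration coefficients c1 = c2 = 2, the inertia-weight range 0.9 to 0.4, and the `rng` argument are illustrative assumptions, not prescriptions from the paper:

```python
import numpy as np

def inertia_weight(k, k_max, w_start=0.9, w_end=0.4):
    """Linearly decreasing inertia weight, Eq. (3)."""
    return (w_start - w_end) * (k_max - k) / k_max + w_end

def pso_step(x, v, p_best, p_g, k, k_max, c1=2.0, c2=2.0, rng=None):
    """One velocity/position update, Eqs. (1)-(2).

    x, v, p_best: (N_p, D) arrays of positions, velocities, and
    personal bests; p_g: (D,) global-best position."""
    rng = rng or np.random.default_rng()
    w = inertia_weight(k, k_max)
    r1 = rng.random(x.shape)  # fresh uniform numbers per particle and dimension
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (p_g - x)
    return x + v_new, v_new
```

With all attractors at the current position, only the inertia term survives, which makes the weight schedule easy to verify in isolation.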

2 Improved PSO algorithm

 $\bar P_{id}^k = P_{id}^k \cdot {\rho _1}N\left( {0,{\sigma ^2}} \right)$ (4)
 $\bar P_{id}^k = P_{id}^k + {\rho _2}N\left( {0,{\sigma ^2}} \right)$ (5)

 $\sigma = {\sigma _{\max }} - \left( {{\sigma _{\max }} - {\sigma _{\min }}} \right)\frac{k}{{{K_{\max }}}}$ (6)
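The two mutation operators of Eqs. (4)-(6) can be sketched as follows; the defaults for ρ1, ρ2, σmax, and σmin, and the reading of Eq. (4) as the exploratory (multiplicative) operator and Eq. (5) as the exploitative (additive) one, are assumptions for illustration:

```python
import numpy as np

def sigma_schedule(k, k_max, sigma_max=1.0, sigma_min=0.1):
    """Linearly decreasing mutation strength sigma, Eq. (6)."""
    return sigma_max - (sigma_max - sigma_min) * k / k_max

def mutate_pbest(p_best, k, k_max, rho1=1.0, rho2=1.0,
                 multiplicative=True, rng=None):
    """Gaussian mutation of a particle's previous best position.

    Eq. (4) rescales p_best by rho1 * N(0, sigma^2);
    Eq. (5) adds rho2 * N(0, sigma^2) to it."""
    rng = rng or np.random.default_rng()
    sigma = sigma_schedule(k, k_max)
    noise = rng.normal(0.0, sigma, size=np.shape(p_best))
    if multiplicative:
        return p_best * rho1 * noise   # Eq. (4)
    return p_best + rho2 * noise       # Eq. (5)
```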

 ${f_{{\rm{ave}}}} = \frac{1}{{{N_p}}}\sum\limits_{j = 1}^{{N_p}} {f\left( {\mathit{\boldsymbol{P}}_j^k} \right)} ,$ (7)
 $\mu = \left\{ \begin{array}{l} \frac{{\alpha \left( {{f_{{\rm{ave}}}} - f\left( {\mathit{\boldsymbol{P}}_i^k} \right)} \right)}}{{{f_{{\rm{ave}}}} - f\left( {\mathit{\boldsymbol{P}}_g^k} \right)}},\;\;\;\;f\left( {\mathit{\boldsymbol{P}}_i^k} \right) \le {f_{{\rm{ave}}}}\\ \frac{{\alpha \left( {f\left( {P_i^k} \right) - {f_{{\rm{ave}}}}} \right)}}{{{f_{\max }} - {f_{{\rm{ave}}}}}},\;\;\;\;\;f\left( {\mathit{\boldsymbol{P}}_i^k} \right) > {f_{{\rm{ave}}}} \end{array} \right.$ (8)

 $\alpha = 1 - 0.8\frac{k}{{{K_{\max }}}}$ (9)
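Eqs. (7)-(9) combine into a small helper returning the mutation probability μ for particle i. The minimization convention (smaller fitness is better, so f(P_g) is the smallest value) is an assumed reading consistent with the branch structure of Eq. (8):

```python
import numpy as np

def mutation_probability(f_i, f_vals, f_g, k, k_max):
    """Adaptive mutation probability mu, Eqs. (7)-(9), assuming
    minimization so that f_g <= f_ave <= f_max.

    f_i: fitness of particle i's previous best;
    f_vals: fitness of all particles' previous bests;
    f_g: global-best fitness."""
    alpha = 1.0 - 0.8 * k / k_max        # Eq. (9): decays from 1 to 0.2
    f_ave = float(np.mean(f_vals))       # Eq. (7)
    f_max = float(np.max(f_vals))
    if f_i <= f_ave:                     # Eq. (8), better-than-average branch
        return alpha * (f_ave - f_i) / (f_ave - f_g)
    return alpha * (f_i - f_ave) / (f_max - f_ave)
```

Particles at either extreme (the global best, or the worst) get the largest μ, while average particles get μ near 0.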

3 Convergence analysis

 ${\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \in {\mathit{\boldsymbol{B}}_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right) \ge \rho$

 ${\rm{pr}}\left( {\chi \left( {\mathit{\boldsymbol{\bar P}}_g^{k + 1}} \right) = 1} \right) = \alpha ,{\rm{pr}}\left( {\chi \left( {\mathit{\boldsymbol{\bar P}}_g^{k + 1}} \right) = 0} \right) = 1 - \alpha$

 $\begin{array}{*{20}{c}} {{\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right) \ge {\rm{pr}}\left( {\chi \left( {\mathit{\boldsymbol{\bar P}}_g^{k + 1}} \right) = 1} \right) \cdot }\\ {{\rm{pr}}\left( {\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right)\left| {\chi \left( {\mathit{\boldsymbol{\bar P}}_g^{k + 1}} \right) = 1} \right.} \right) = }\\ {\alpha \cdot {\rm{pr}}\left( {\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right)\left| {\chi \left( {\mathit{\boldsymbol{\bar P}}_g^{k + 1}} \right) = 1} \right.} \right) = }\\ {\alpha \cdot {\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{k + 1}\left( {{\rm{new}}} \right) \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right)} \end{array}$

 ${\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right) \ge \alpha \cdot {\rm{pr}}\left( {\mathit{\boldsymbol{\bar P}}_g^{k + 1} \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right)$

 $g\left( {{x_1},{x_2}, \cdots ,{x_D}} \right) = {\left( {\frac{1}{{\sqrt {2{\rm{\pi }}} \sigma {\rho _2}}}} \right)^D}\exp \left( { - \sum\limits_{d = 1}^D {\frac{{{{\left( {{x_d} - p_{gd}^{k + 1}} \right)}^2}}}{{2{\sigma ^2}\rho _2^2}}} } \right)$

 $\begin{array}{c} \mathrm{pr}\left( \bar{\boldsymbol{P}}_g^{k+1} \in \boldsymbol{B}_\varepsilon \,\middle|\, \boldsymbol{\varPhi}_k = \boldsymbol{\varPhi} \right) = \\ \displaystyle\int_{\boldsymbol{B}_\varepsilon} g\left( x_1, x_2, \cdots, x_D \right) \mathrm{d}x_1 \mathrm{d}x_2 \cdots \mathrm{d}x_D \ge \\ \left( \dfrac{1}{\sqrt{2\pi}\,\sigma_{\max}\rho_2} \right)^D \exp\left( -D\dfrac{2m^2}{\sigma_{\min}^2 \rho_2^2} \right) \mu\left( \boldsymbol{B}_\varepsilon \right) \end{array}$

Let $\rho = \left( \dfrac{1}{\sqrt{2\pi}\,\sigma_{\max}\rho_2} \right)^D \exp\left( -D\dfrac{2m^2}{\sigma_{\min}^2 \rho_2^2} \right) \mu\left( \boldsymbol{B}_\varepsilon \right)$; then

 ${\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \in {B_\varepsilon }\left| {{\mathit{\boldsymbol{ \boldsymbol{\varPhi} }}_k} = \mathit{\boldsymbol{ \boldsymbol{\varPhi} }}} \right.} \right) \ge \rho > 0。$

 $\mathop {\lim }\limits_{k \to \infty } {\rm{pr}}\left\{ {p_g^k \in {\mathit{\boldsymbol{B}}_\varepsilon }} \right\} = 1$

 $\begin{array}{*{20}{c}} {{\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{k + 1} \notin {\mathit{\boldsymbol{B}}_\varepsilon }} \right) = {\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^1 \notin {\mathit{\boldsymbol{B}}_\varepsilon }} \right) \cdot }\\ {\prod\limits_{t = 1}^k {{\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{t + 1} \notin {\mathit{\boldsymbol{B}}_\varepsilon }\left| {\mathit{\boldsymbol{P}}_g^t \notin {\mathit{\boldsymbol{B}}_\varepsilon }} \right.} \right)} \le }\\ {\prod\limits_{t = 1}^k {{\rm{pr}}\left( {\mathit{\boldsymbol{P}}_g^{t + 1} \notin {\mathit{\boldsymbol{B}}_\varepsilon }\left| {\mathit{\boldsymbol{P}}_g^t \notin {\mathit{\boldsymbol{B}}_\varepsilon }} \right.} \right)} } \end{array}$

 $\Delta = \left\{ \boldsymbol{\varPhi} \,\middle|\, \boldsymbol{P}_g \notin \boldsymbol{B}_\varepsilon \right\} \subset \boldsymbol{\varPhi}_{N_p}$

For any $\boldsymbol{\varPhi} \in \boldsymbol{\varPhi}_{N_p}$, Lemma 3 gives

 $\begin{array}{c} \mathrm{pr}\left( \boldsymbol{P}_g^{t+1} \notin \boldsymbol{B}_\varepsilon \,\middle|\, \boldsymbol{P}_g^{t} \notin \boldsymbol{B}_\varepsilon \right) \le \\ \mathrm{pr}\left( \boldsymbol{P}_g^{t+1} \notin \boldsymbol{B}_\varepsilon \,\middle|\, \boldsymbol{\varPhi}_t = \boldsymbol{\varPhi} \right) \le 1 - \rho \end{array}$
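Combining this per-step bound with the earlier product decomposition makes the final limit explicit:

```latex
\mathrm{pr}\left( \boldsymbol{P}_g^{k+1} \notin \boldsymbol{B}_\varepsilon \right)
  \le \prod_{t=1}^{k} \mathrm{pr}\left( \boldsymbol{P}_g^{t+1} \notin \boldsymbol{B}_\varepsilon
      \,\middle|\, \boldsymbol{P}_g^{t} \notin \boldsymbol{B}_\varepsilon \right)
  \le (1-\rho)^{k} \xrightarrow[k \to \infty]{} 0
```

Since $\rho > 0$, the tail probability vanishes geometrically, and the limit below follows.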

 $\mathop {\lim }\limits_{k \to \infty } {\rm{pr}}\left\{ {p_g^k \in {\mathit{\boldsymbol{B}}_\varepsilon }} \right\} = 1$
4 Numerical experiments

 Fig. 1 Convergence curve of function F1
 Fig. 2 Convergence curve of function F4
 Fig. 3 Convergence curve of function F6

 Fig. 4 Convergence curve of function F8
 Fig. 5 Convergence curve of function F10
 Fig. 6 Convergence curve of function F12
5 Conclusion
