Author | Title | Year | Journal/Proceedings | Reftype | DOI/URL |
---|---|---|---|---|---|
Angeline, P. | Using selection to improve particle swarm optimization | 1998 | Evolutionary Computation Proceedings, 1998. IEEE World Congress on Computational Intelligence., The 1998 IEEE International Conference on | inproceedings | DOI |
Abstract: This paper describes an evolutionary optimization algorithm that is a hybrid based on the particle swarm algorithm but with the addition of a standard selection mechanism from evolutionary computation. A comparison is performed between the hybrid swarm and the ordinary particle swarm that shows selection to provide an advantage for some (but not all) complex functions. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Angeline1998, author = {Angeline, P.J.}, title = {Using selection to improve particle swarm optimization}, booktitle = {Evolutionary Computation Proceedings, 1998. IEEE World Congress on Computational Intelligence., The 1998 IEEE International Conference on}, year = {1998}, pages = {84-89}, doi = {http://dx.doi.org/10.1109/ICEC.1998.699327} } |
|||||
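The following Python sketch illustrates the kind of selection step the abstract describes. The dict-based particle layout and the rule of replacing the worse half of the swarm with copies of the better half are illustrative assumptions, not the paper's exact tournament procedure; personal bests are left untouched so the swarm's memory survives the replacement.

```python
def apply_selection(swarm, fitness, minimize=True):
    """Overwrite the worse half of the swarm with copies of the better half.

    swarm   -- list of dicts, each holding a position 'x' and a velocity 'v'
    fitness -- fitness value of each particle at its current position
    Personal bests are deliberately not copied, preserving swarm memory.
    """
    order = sorted(range(len(swarm)), key=lambda i: fitness[i],
                   reverse=not minimize)      # best particles first
    half = len(swarm) // 2
    for b, w in zip(order[:half], order[half:]):
        swarm[w]["x"] = list(swarm[b]["x"])   # copy position
        swarm[w]["v"] = list(swarm[b]["v"])   # copy velocity
    return swarm
```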
van den Bergh, F. & Engelbrecht, A. | A study of particle swarm optimization particle trajectories | 2006 | Information Sciences | article | DOI URL |
Abstract: Particle swarm optimization (PSO) has been shown to be an efficient, robust and simple optimization algorithm. Most of the PSO studies are empirical, with only a few theoretical analyses that concentrate on understanding particle trajectories. These theoretical studies concentrate mainly on simplified PSO systems. This paper overviews current theoretical studies, and extends them to investigate particle trajectories for general swarms, including the influence of the inertia term. The paper also provides a formal proof that each particle converges to a stable point. An empirical analysis of multi-dimensional stochastic particles is also presented. Experimental results are provided to support the conclusions drawn from the theoretical findings. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{vandenBergh2006, author = {F. van den Bergh and A.P. Engelbrecht}, title = {A study of particle swarm optimization particle trajectories}, journal = {Information Sciences}, year = {2006}, volume = {176}, number = {8}, pages = {937 - 971}, url = {http://www.sciencedirect.com/science/article/pii/S0020025505000630}, doi = {http://dx.doi.org/10.1016/j.ins.2005.02.003} } |
|||||
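A small Python illustration of the stable point such trajectory analyses identify: in expectation, each particle is drawn to a weighted average of its personal best and the global best. The formula below is the standard result for the deterministic PSO model; the constants are customary constricted-PSO values, not taken from this paper.

```python
def expected_attractor(pbest, gbest, c1=1.49618, c2=1.49618):
    """Point a particle converges to under the deterministic PSO model.

    x* = (c1 * pbest + c2 * gbest) / (c1 + c2), per dimension; with
    c1 == c2 this is simply the midpoint of the two attractors.
    """
    return [(c1 * p + c2 * g) / (c1 + c2) for p, g in zip(pbest, gbest)]

print(expected_attractor([1.0, 2.0], [3.0, 0.0]))  # -> [2.0, 1.0]
```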
Blum, C. | Ant colony optimization: Introduction and recent trends | 2005 | Physics of Life Reviews | article | DOI URL |
Abstract: Ant colony optimization is a technique for optimization that was introduced in the early 1990s. The inspiring source of ant colony optimization is the foraging behavior of real ant colonies. This behavior is exploited in artificial ant colonies for the search of approximate solutions to discrete optimization problems, to continuous optimization problems, and to important problems in telecommunications, such as routing and load balancing. First, we deal with the biological inspiration of ant colony optimization algorithms. We show how this biological inspiration can be transferred into an algorithm for discrete optimization. Then, we outline ant colony optimization in more general terms in the context of discrete optimization, and present some of the best-performing ant colony optimization variants available today. After summarizing some important theoretical results, we demonstrate how ant colony optimization can be applied to continuous optimization problems. Finally, we provide examples of an interesting recent research direction: the hybridization with more classical techniques from artificial intelligence and operations research. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Blum2005, author = {Christian Blum}, title = {Ant colony optimization: Introduction and recent trends}, journal = {Physics of Life Reviews}, year = {2005}, volume = {2}, number = {4}, pages = {353 - 373}, url = {http://www.sciencedirect.com/science/article/pii/S1571064505000333}, doi = {http://dx.doi.org/10.1016/j.plrev.2005.10.001} } |
|||||
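As a rough illustration of the mechanism the abstract describes, the sketch below applies a generic Ant System-style pheromone update: evaporate all trails, then let each ant deposit pheromone inversely proportional to its tour cost. The rule and parameter names are textbook assumptions, not this survey's notation, and the best-performing variants the paper discusses refine this scheme further.

```python
def update_pheromone(tau, solutions, rho=0.5, Q=1.0):
    """Classic Ant System pheromone update: evaporation plus deposit.

    tau       -- dict mapping a directed edge (i, j) to its pheromone level
    solutions -- list of (tour, cost) pairs built by the ants this iteration
    """
    for edge in tau:
        tau[edge] *= (1.0 - rho)                  # evaporation
    for tour, cost in solutions:
        deposit = Q / cost                        # shorter tours deposit more
        for edge in zip(tour, tour[1:] + tour[:1]):
            tau[edge] = tau.get(edge, 0.0) + deposit
    return tau
```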
Bratton, D. & Blackwell, T. | A simplified recombinant PSO | 2008 | J. Artif. Evol. App. | article | DOI URL |
BibTeX:
@article{Bratton2008, author = {Bratton, Dan and Blackwell, Tim}, title = {A simplified recombinant PSO}, journal = {J. Artif. Evol. App.}, publisher = {Hindawi Publishing Corp.}, year = {2008}, volume = {2008}, pages = {14:1--14:10}, url = {http://dx.doi.org/10.1155/2008/654184}, doi = {http://dx.doi.org/10.1155/2008/654184} } |
|||||
Brits, R., Engelbrecht, A. & van den Bergh, F. | Locating multiple optima using particle swarm optimization | 2007 | Applied Mathematics and Computation | article | DOI URL |
Abstract: Many scientific and engineering applications require optimization methods to find more than one solution to multi-modal optimization problems. This paper presents a new particle swarm optimization (PSO) technique to locate and refine multiple solutions to such problems. The technique, NichePSO, extends the inherent unimodal nature of the standard PSO approach by growing multiple swarms from an initial particle population. Each subswarm represents a different solution or niche, optimized individually. The outcome of the NichePSO algorithm is a set of particle swarms, each representing a unique solution. Experimental results are provided to show that NichePSO can successfully locate all optima on a small set of test functions. These results are compared with another PSO niching algorithm, lbest PSO, and two genetic algorithm niching approaches. The influence of control parameters is investigated, including the relationship between the swarm size and the number of solutions (niches). An initial scalability study is also done. | |||||
BibTeX:
@article{Brits2007, author = {R. Brits and A.P. Engelbrecht and F. van den Bergh}, title = {Locating multiple optima using particle swarm optimization}, journal = {Applied Mathematics and Computation}, year = {2007}, volume = {189}, number = {2}, pages = {1859 - 1883}, url = {http://www.sciencedirect.com/science/article/pii/S0096300306017826}, doi = {http://dx.doi.org/10.1016/j.amc.2006.12.066} } |
|||||
Carlisle, A. & Dozier, G. | An Off-The-Shelf PSO | 2001 | PSO Workshop | inproceedings | |
Abstract: What attributes and settings of the Particle Swarm Optimizer constants result in a good, off-the-shelf, PSO implementation? There are many parameters, both explicit and implicit, associated with the Particle Swarm Optimizer that may affect its performance. There are the social and cognitive learning rates and magnitudes, the population size, the neighborhood size (including global neighborhoods), synchronous or asynchronous updates, and various additional controls, such as inertia and constriction factors. For any given problem, the values and choices for some of these parameters may have significant impact on the efficiency and reliability of the PSO, and yet varying other parameters may have little or no effect. What set of values, then, constitutes a good, general purpose PSO? While some of these factors have been investigated in the literature, others have not. In this paper we use existing literature and a selection of benchmark problems to determine a set of starting values suitable for an "off-the-shelf" PSO. | |||||
BibTeX:
@inproceedings{Carlisle2001, author = {Carlisle, A. and Dozier, G.}, title = {An Off-The-Shelf PSO}, booktitle = {PSO Workshop}, year = {2001} } |
|||||
Chongpeng, H., Yuling, Z., Dingguo, J. & Baoguo, X. | On Some Non-linear Decreasing Inertia Weight Strategies in Particle Swarm Optimization | 2007 | Control Conference, 2007. CCC 2007. Chinese | inproceedings | DOI |
Abstract: Inspired by an analysis of the principles of PSO, some non-linear strategies for decreasing the inertia weight (DIW) are proposed in this paper, based on the existing linear DIW (LDIW). A power function is then designed to unify them. Four benchmark functions are used to evaluate these strategies' effect on PSO performance and to select the best one. The experimental results show that for most continuous optimization problems, the best one gains an advantage over the linear strategy and the others. It maintains more swarm diversity in the early stages, so it can escape from local minima more easily, and it also speeds up the convergence of particles in the later stages, improving the performance of PSO. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Chongpeng2007, author = {Huang Chongpeng and Zhang Yuling and Jiang Dingguo and Xu Baoguo}, title = {On Some Non-linear Decreasing Inertia Weight Strategies in Particle Swarm Optimization}, booktitle = {Control Conference, 2007. CCC 2007. Chinese}, year = {2007}, pages = {750-753}, doi = {http://dx.doi.org/10.1109/CHICC.2006.4347175} } |
|||||
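A minimal sketch of the power-function family the abstract hints at, assuming the common form w(t) = w_end + (w_start - w_end) * (1 - t/T)^k; the exponent k = 1.2 and the 0.9/0.4 bounds are placeholder assumptions, not the values selected in the paper.

```python
def nonlinear_diw(t, t_max, w_start=0.9, w_end=0.4, k=1.2):
    """Power-function decreasing inertia weight.

    k == 1 recovers the classic linear schedule; k > 1 keeps w high longer
    (more early exploration), while k < 1 drops it quickly.
    """
    return w_end + (w_start - w_end) * (1.0 - t / t_max) ** k
```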
Clerc, M. | Discrete particle swarm optimization illustrated by the traveling salesman problem | 2004 | New optimization techniques in engineering | article | |
BibTeX:
@article{clerc2004discrete, author = {Clerc, Maurice}, title = {Discrete particle swarm optimization illustrated by the traveling salesman problem}, journal = {New optimization techniques in engineering}, publisher = {Springer Heidelberg, Germany}, year = {2004}, volume = {141}, pages = {219--239} } |
|||||
Clerc, M. & Kennedy, J. | The particle swarm - explosion, stability, and convergence in a multidimensional complex space | 2002 | Evolutionary Computation, IEEE Transactions on | article | DOI |
Abstract: The particle swarm is an algorithm for finding optimal regions of complex search spaces through the interaction of individuals in a population of particles. This paper analyzes a particle's trajectory as it moves in discrete time (the algebraic view), then progresses to the view of it in continuous time (the analytical view). A five-dimensional depiction is developed, which describes the system completely. These analyses lead to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies. Some results of the particle swarm optimizer, implementing modifications derived from the analysis, suggest methods for altering the original algorithm in ways that eliminate problems and increase the ability of the particle swarm to find optima of some well-studied test functions. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Clerc2002, author = {Clerc, M. and Kennedy, J.}, title = {The particle swarm - explosion, stability, and convergence in a multidimensional complex space}, journal = {Evolutionary Computation, IEEE Transactions on}, year = {2002}, volume = {6}, number = {1}, pages = {58-73}, doi = {http://dx.doi.org/10.1109/4235.985692} } |
|||||
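The constriction coefficient derived in this paper has a closed form, chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)| for phi = c1 + c2 > 4, and the customary phi = 4.1 yields the widely used chi of about 0.7298. A quick check in Python:

```python
import math

def constriction(phi=4.1):
    """Clerc-Kennedy constriction coefficient (valid for phi > 4)."""
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

print(round(constriction(), 4))  # -> 0.7298
```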
Dorigo, M. | Optimization, Learning and Natural Algorithms (in Italian) | 1992 | School: Dipartimento di Elettronica, Politecnico di Milano | phdthesis | |
BibTeX:
@phdthesis{Dor1992thesis, author = {Marco Dorigo}, title = {Optimization, Learning and Natural Algorithms (in Italian)}, school = {Dipartimento di Elettronica, Politecnico di Milano}, year = {1992} } |
|||||
Dorigo, M., Birattari, M. & Stutzle, T. | Ant colony optimization | 2006 | Computational Intelligence Magazine, IEEE | article | DOI |
Abstract: Swarm intelligence is a relatively new approach to problem solving that takes inspiration from the social behaviors of insects and of other animals. In particular, ants have inspired a number of methods and techniques among which the most studied and the most successful is the general purpose optimization technique known as ant colony optimization. Ant colony optimization (ACO) takes inspiration from the foraging behavior of some ant species. These ants deposit pheromone on the ground in order to mark some favorable path that should be followed by other members of the colony. Ant colony optimization exploits a similar mechanism for solving optimization problems. From the early nineties, when the first ant colony optimization algorithm was proposed, ACO attracted the attention of increasing numbers of researchers and many successful applications are now available. Moreover, a substantial corpus of theoretical results is becoming available that provides useful guidelines to researchers and practitioners in further applications of ACO. The goal of this article is to introduce ant colony optimization and to survey its most notable applications | |||||
BibTeX:
@article{Dorigo2006, author = {Dorigo, M. and Birattari, M. and Stutzle, T.}, title = {Ant colony optimization}, journal = {Computational Intelligence Magazine, IEEE}, year = {2006}, volume = {1}, number = {4}, pages = {28-39}, doi = {http://dx.doi.org/10.1109/MCI.2006.329691} } |
|||||
Dorigo, M. & Socha, K. | An Introduction to Ant Colony Optimization | 2006 | IRIDIA Technical Report Series | article | |
BibTeX:
@article{dorigo2006introduction, author = {Dorigo, Marco and Socha, Krzysztof}, title = {An Introduction to Ant Colony Optimization}, journal = {IRIDIA Technical Report Series}, year = {2006} } |
|||||
Dorigo, M., Maniezzo, V. & Colorni, A. | Positive feedback as a search strategy | 1991 | | article | |
BibTeX:
@article{dorigo1991positive, author = {Dorigo, Marco and Maniezzo, Vittorio and Colorni, Alberto}, title = {Positive feedback as a search strategy}, year = {1991} } |
|||||
Eberhart, R. & Kennedy, J. | A new optimizer using particle swarm theory | 1995 | Micro Machine and Human Science, 1995. MHS '95., Proceedings of the Sixth International Symposium on | inproceedings | DOI |
Abstract: The optimization of nonlinear functions using particle swarm methodology is described. Implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm. Benchmark testing of both paradigms is described, and applications, including neural network training and robot task learning, are proposed. Relationships between particle swarm optimization and both artificial life and evolutionary computation are reviewed | |||||
BibTeX:
@inproceedings{Eberhart1995, author = {Eberhart, R. and Kennedy, J.}, title = {A new optimizer using particle swarm theory}, booktitle = {Micro Machine and Human Science, 1995. MHS '95., Proceedings of the Sixth International Symposium on}, year = {1995}, pages = {39-43}, doi = {http://dx.doi.org/10.1109/MHS.1995.494215} } |
|||||
Eberhart, R. & Shi, Y. | Comparing inertia weights and constriction factors in particle swarm optimization | 2000 | Evolutionary Computation, 2000. Proceedings of the 2000 Congress on | inproceedings | DOI |
Abstract: The performance of particle swarm optimization using an inertia weight is compared with performance using a constriction factor. Five benchmark functions are used for the comparison. It is concluded that the best approach is to use the constriction factor while limiting the maximum velocity Vmax to the dynamic range of the variable Xmax on each dimension. This approach provides performance on the benchmark functions superior to any other published results known by the authors. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Eberhart2000, author = {Eberhart, R.C. and Shi, Y.}, title = {Comparing inertia weights and constriction factors in particle swarm optimization}, booktitle = {Evolutionary Computation, 2000. Proceedings of the 2000 Congress on}, year = {2000}, volume = {1}, pages = {84-88 vol.1}, doi = {http://dx.doi.org/10.1109/CEC.2000.870279} } |
|||||
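The paper's recommendation, clamping velocity to the dynamic range Xmax of each variable while using the constriction factor, is easy to state in code. A minimal sketch (the list-based representation is an assumption):

```python
def clamp_velocity(v, x_max):
    """Limit each velocity component d to [-x_max[d], +x_max[d]].

    Eberhart & Shi (2000) report that constriction works best when Vmax
    is tied to the search-space bounds rather than tuned separately.
    """
    return [max(-xm, min(xm, vd)) for vd, xm in zip(v, x_max)]
```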
Evers, G. I. | An automatic regrouping mechanism to deal with stagnation in particle swarm optimization | 2009 | School: University of Texas-Pan American | phdthesis | |
BibTeX:
@phdthesis{evers2009automatic, author = {Evers, George I}, title = {An automatic regrouping mechanism to deal with stagnation in particle swarm optimization}, school = {University of Texas-Pan American}, year = {2009} } |
|||||
Goss, S., Aron, S., Deneubourg, J. & Pasteels, J. | Self-organized shortcuts in the Argentine ant | 1989 | Naturwissenschaften | article | URL |
BibTeX:
@article{Goss1989, author = {Goss, S. and Aron, S. and Deneubourg, J.L. and Pasteels, J.M.}, title = {Self-organized shortcuts in the Argentine ant}, journal = {Naturwissenschaften}, year = {1989}, volume = {76}, number = {12}, pages = {579-581}, note = {cited By (since 1996) 257}, url = {http://www.scopus.com/inward/record.url?eid=2-s2.0-0024827650&partnerID=40&md5=343e32f88f6b5dc7f2bd8c3af138b341} } |
|||||
Kennedy, J. & Eberhart, R. | A discrete binary version of the particle swarm algorithm | 1997 | Systems, Man, and Cybernetics, 1997. Computational Cybernetics and Simulation., 1997 IEEE International Conference on | inproceedings | DOI |
Abstract: The particle swarm algorithm adjusts the trajectories of a population of "particles" through a problem space on the basis of information about each particle's previous best performance and the best previous performance of its neighbors. Previous versions of the particle swarm have operated in continuous space, where trajectories are defined as changes in position on some number of dimensions. The paper reports a reworking of the algorithm to operate on discrete binary variables. In the binary version, trajectories are changes in the probability that a coordinate will take on a zero or one value. Examples, applications, and issues are discussed. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Kennedy1997, author = {Kennedy, J. and Eberhart, R.C.}, title = {A discrete binary version of the particle swarm algorithm}, booktitle = {Systems, Man, and Cybernetics, 1997. Computational Cybernetics and Simulation., 1997 IEEE International Conference on}, year = {1997}, volume = {5}, pages = {4104-4108 vol.5}, doi = {http://dx.doi.org/10.1109/ICSMC.1997.637339} } |
|||||
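In the binary version described above, the velocity is squashed through a sigmoid and read as the probability that a bit takes the value 1. A minimal sketch of that position update; the Vmax of 6 is a value often quoted for binary PSO and is kept here as an assumption.

```python
import math, random

def binary_position(v, v_max=6.0):
    """Binary PSO position update: bit d is 1 with probability sigmoid(v[d])."""
    bits = []
    for vd in v:
        vd = max(-v_max, min(v_max, vd))  # clamp to avoid extreme probabilities
        bits.append(1 if random.random() < 1.0 / (1.0 + math.exp(-vd)) else 0)
    return bits
```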
Kennedy, J. & Eberhart, R. | Particle swarm optimization | 1995 | Neural Networks, 1995. Proceedings., IEEE International Conference on | inproceedings | DOI |
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Kennedy1995, author = {Kennedy, J. and Eberhart, R.}, title = {Particle swarm optimization}, booktitle = {Neural Networks, 1995. Proceedings., IEEE International Conference on}, year = {1995}, volume = {4}, pages = {1942-1948 vol.4}, doi = {http://dx.doi.org/10.1109/ICNN.1995.488968} } |
|||||
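For reference, one update step of the 1995 algorithm in Python. Note there is no inertia weight or constriction yet (those came later, with Shi & Eberhart 1998 and Clerc & Kennedy 2002); c1 = c2 = 2.0 follows the early literature.

```python
import random

def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0):
    """One velocity/position update of the original particle swarm."""
    for d in range(len(x)):
        v[d] += (c1 * random.random() * (pbest[d] - x[d]) +
                 c2 * random.random() * (gbest[d] - x[d]))
        x[d] += v[d]
    return x, v
```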
Kennedy, J. & Eberhart, R. C. | Swarm intelligence | 2001 | | book | |
BibTeX:
@book{Kennedy2001, author = {Kennedy, James and Eberhart, Russell C.}, title = {Swarm intelligence}, publisher = {Morgan Kaufmann Publishers Inc.}, year = {2001} } |
|||||
Kennedy, J. & Mendes, R. | Population structure and particle swarm performance | 2002 | Evolutionary Computation, 2002. CEC '02. Proceedings of the 2002 Congress on | inproceedings | DOI |
Abstract: The effects of various population topologies on the particle swarm algorithm were systematically investigated. Random graphs were generated to specifications, and their performance on several criteria was compared. What makes a good population structure? We discovered that previous assumptions may not have been correct. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Kennedy2002, author = {Kennedy, J. and Mendes, R.}, title = {Population structure and particle swarm performance}, booktitle = {Evolutionary Computation, 2002. CEC '02. Proceedings of the 2002 Congress on}, year = {2002}, volume = {2}, pages = {1671-1676}, doi = {http://dx.doi.org/10.1109/CEC.2002.1004493} } |
|||||
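One of the structures examined in such topology studies is the ring (lbest) neighbourhood, where each particle consults only its k neighbours on either side. A minimal sketch (minimisation assumed); the paper itself compares many randomly generated graphs, which this does not reproduce.

```python
def ring_best(fitness, i, k=1):
    """Index of the fittest particle in particle i's ring neighbourhood."""
    n = len(fitness)
    neighbours = [(i + off) % n for off in range(-k, k + 1)]
    return min(neighbours, key=lambda j: fitness[j])
```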
Krink, T. & Løvbjerg, M. | The LifeCycle model: Combining Particle Swarm Optimisation, Genetic Algorithms and HillClimbers | 2002 | Proceedings of the 7th International Conference on Parallel Problem Solving from Nature | inproceedings | |
BibTeX:
@inproceedings{Krink02thelifecycle, author = {Thiemo Krink and Morten Løvbjerg}, title = {The LifeCycle model: Combining Particle Swarm Optimisation, Genetic Algorithms and HillClimbers}, booktitle = {Proceedings of the 7th International Conference on Parallel Problem Solving from Nature}, year = {2002}, pages = {621--630} } |
|||||
Li, X. | Particle swarm optimization: an introduction and its recent developments | 2006 | School of Computer Science and IT(2006), RMIT University | conference | |
BibTeX:
@conference{Li2006, author = {Li, Xiaodong}, title = {Particle swarm optimization: an introduction and its recent developments}, booktitle = {School of Computer Science and IT(2006), RMIT University}, year = {2006} } |
|||||
Li, X. & Engelbrecht, A. P. | Particle swarm optimization: an introduction and its recent developments | 2007 | Proceedings of the 2007 GECCO conference companion on Genetic and evolutionary computation | conference | DOI URL |
BibTeX:
@conference{Li2007, author = {Li, Xiaodong and Engelbrecht, Andries P.}, title = {Particle swarm optimization: an introduction and its recent developments}, booktitle = {Proceedings of the 2007 GECCO conference companion on Genetic and evolutionary computation}, publisher = {ACM}, year = {2007}, pages = {3391--3414}, url = {http://doi.acm.org/10.1145/1274000.1274118}, doi = {http://dx.doi.org/10.1145/1274000.1274118} } |
|||||
Liang, J. J. & Suganthan, P. | Dynamic multi-swarm particle swarm optimizer | 2005 | Swarm Intelligence Symposium, 2005. SIS 2005. Proceedings 2005 IEEE | inproceedings | DOI |
Abstract: In this paper, a novel dynamic multi-swarm particle swarm optimizer (PSO) is introduced. Different from the existing multi-swarm PSOs and the local version of PSO, the swarms are dynamic and the swarms' size is small. The whole population is divided into many small swarms; these swarms are regrouped frequently by using various regrouping schedules, and information is exchanged among the swarms. Experiments are conducted on a set of shifted rotated benchmark functions and results show its better performance when compared with some recent PSO variants. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Liang2005, author = {Liang, J. J. and Suganthan, P.N.}, title = {Dynamic multi-swarm particle swarm optimizer}, booktitle = {Swarm Intelligence Symposium, 2005. SIS 2005. Proceedings 2005 IEEE}, year = {2005}, pages = {124-129}, doi = {http://dx.doi.org/10.1109/SIS.2005.1501611} } |
|||||
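The regrouping idea is simple to sketch: periodically shuffle all particle indices and cut them into small swarms, so information migrates between groups. The purely random repartition below is an illustrative assumption; the paper studies several regrouping schedules.

```python
import random

def regroup(n_particles, swarm_size):
    """Randomly repartition particle indices into small swarms."""
    idx = list(range(n_particles))
    random.shuffle(idx)
    return [idx[i:i + swarm_size] for i in range(0, n_particles, swarm_size)]

print(regroup(12, 3))  # e.g. four random swarms of three particles each
```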
Liang, J. J. & Suganthan, P. | Dynamic multi-swarm particle swarm optimizer with local search | 2005 | Evolutionary Computation, 2005. The 2005 IEEE Congress on | inproceedings | DOI |
Abstract: In this paper, the performance of a modified dynamic multi-swarm particle swarm optimizer (DMS-PSO) on the set of benchmark functions provided by CEC2005 is reported. Different from the existing multi-swarm PSOs and local versions of PSO, the swarms are dynamic and the swarms' size is small. The whole population is divided into many small swarms; these swarms are regrouped frequently by using various regrouping schedules, and information is exchanged among the swarms. The quasi-Newton method is incorporated to improve the local search ability. | |||||
BibTeX:
@inproceedings{Liang2005-2, author = {Liang, J. J. and Suganthan, P.N.}, title = {Dynamic multi-swarm particle swarm optimizer with local search}, booktitle = {Evolutionary Computation, 2005. The 2005 IEEE Congress on}, year = {2005}, volume = {1}, pages = {522-528 Vol.1}, doi = {http://dx.doi.org/10.1109/CEC.2005.1554727} } |
|||||
Lovbjerg, M. & Krink, T. | Extending particle swarm optimisers with self-organized criticality | 2002 | Evolutionary Computation, 2002. CEC '02. Proceedings of the 2002 Congress on | inproceedings | DOI |
Abstract: Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, reaching faster convergence and better solutions. | |||||
BibTeX:
@inproceedings{Lovbjerg2002, author = {Lovbjerg, M. and Krink, T.}, title = {Extending particle swarm optimisers with self-organized criticality}, booktitle = {Evolutionary Computation, 2002. CEC '02. Proceedings of the 2002 Congress on}, year = {2002}, volume = {2}, pages = {1588-1593}, doi = {http://dx.doi.org/10.1109/CEC.2002.1004479} } |
|||||
Nickabadi, A., Ebadzadeh, M. M. & Safabakhsh, R. | A novel particle swarm optimization algorithm with adaptive inertia weight | 2011 | Applied Soft Computing | article | DOI URL |
Abstract: Particle swarm optimization (PSO) is a stochastic population-based algorithm motivated by intelligent collective behavior of some animals. The most important advantages of the PSO are that PSO is easy to implement and there are few parameters to adjust. The inertia weight (w) is one of PSO's parameters originally proposed by Shi and Eberhart to bring about a balance between the exploration and exploitation characteristics of PSO. Since the introduction of this parameter, there have been a number of proposals of different strategies for determining the value of inertia weight during the course of a run. This paper presents the first comprehensive review of the various inertia weight strategies reported in the related literature. These approaches are classified and discussed in three main groups: constant, time-varying and adaptive inertia weights. A new adaptive inertia weight approach is also proposed which uses the success rate of the swarm as its feedback parameter to ascertain the particles’ situation in the search space. The empirical studies on fifteen static test problems, a dynamic function and a real world engineering problem show that the proposed particle swarm optimization model is quite effective in adapting the value of w in the dynamic and static environments. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Nickabadi2011, author = {Ahmad Nickabadi and Mohammad Mehdi Ebadzadeh and Reza Safabakhsh}, title = {A novel particle swarm optimization algorithm with adaptive inertia weight}, journal = {Applied Soft Computing}, year = {2011}, volume = {11}, number = {4}, pages = {3658 - 3670}, url = {http://www.sciencedirect.com/science/article/pii/S156849461100055X}, doi = {http://dx.doi.org/10.1016/j.asoc.2011.01.037} } |
|||||
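A sketch of the feedback rule the abstract outlines: the fraction of particles that improved their personal best in the last iteration is mapped onto an inertia-weight range. The linear mapping and the [0, 1] bounds are assumptions for illustration; consult the paper for the exact scheme.

```python
def adaptive_inertia(success_count, swarm_size, w_min=0.0, w_max=1.0):
    """Success-rate-based inertia weight: stagnation shrinks momentum."""
    success_rate = success_count / float(swarm_size)
    return w_min + (w_max - w_min) * success_rate
```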
Niknam, T. & Amiri, B. | An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis | 2010 | Applied Soft Computing | article | DOI URL |
Abstract: Clustering is a popular data analysis and data mining technique. A popular technique for clustering is based on k-means such that the data is partitioned into K clusters. However, the k-means algorithm highly depends on the initial state and converges to local optimum solution. This paper presents a new hybrid evolutionary algorithm to solve nonlinear partitional clustering problem. The proposed hybrid evolutionary algorithm is the combination of FAPSO (fuzzy adaptive particle swarm optimization), ACO (ant colony optimization) and k-means algorithms, called FAPSO-ACO–K, which can find better cluster partition. The performance of the proposed algorithm is evaluated through several benchmark data sets. The simulation results show that the performance of the proposed algorithm is better than other algorithms such as PSO, ACO, simulated annealing (SA), combination of PSO and SA (PSO–SA), combination of ACO and SA (ACO–SA), combination of PSO and ACO (PSO–ACO), genetic algorithm (GA), Tabu search (TS), honey bee mating optimization (HBMO) and k-means for partitional clustering problem. | |||||
BibTeX:
@article{Niknam2010, author = {Taher Niknam and Babak Amiri}, title = {An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis}, journal = {Applied Soft Computing}, year = {2010}, volume = {10}, number = {1}, pages = {183 - 197}, url = {http://www.sciencedirect.com/science/article/pii/S1568494609000854}, doi = {http://dx.doi.org/10.1016/j.asoc.2009.07.001} } |
|||||
Omran, M. G., Engelbrecht, A. P. & Salman, A. | Bare bones differential evolution | 2009 | European Journal of Operational Research | article | DOI URL |
Abstract: The barebones differential evolution (BBDE) is a new, almost parameter-free optimization algorithm that is a hybrid of the barebones particle swarm optimizer and differential evolution. Differential evolution is used to mutate, for each particle, the attractor associated with that particle, defined as a weighted average of its personal and neighborhood best positions. The performance of the proposed approach is investigated and compared with differential evolution, a Von Neumann particle swarm optimizer and a barebones particle swarm optimizer. The experiments conducted show that the BBDE provides excellent results with the added advantage of little to no parameter tuning. Moreover, the performance of the barebones differential evolution using the ring and Von Neumann neighborhood topologies is investigated. Finally, the application of the BBDE to the real-world problem of unsupervised image classification is investigated. Experimental results show that the proposed approach performs very well compared to other state-of-the-art clustering algorithms in all measured criteria. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Omran2009, author = {Mahamed G.H. Omran and Andries P. Engelbrecht and Ayed Salman}, title = {Bare bones differential evolution}, journal = {European Journal of Operational Research}, year = {2009}, volume = {196}, number = {1}, pages = {128 - 139}, url = {http://www.sciencedirect.com/science/article/pii/S0377221708002440}, doi = {http://dx.doi.org/10.1016/j.ejor.2008.02.035} } |
|||||
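For context, the "bare bones" sampling that gives BBDE its PSO half: each coordinate is drawn from a Gaussian centred on the midpoint of the personal and neighbourhood bests, with their separation as standard deviation. BBDE then applies a DE mutation to the resulting attractor, which this sketch does not show.

```python
import random

def barebones_sample(pbest, gbest):
    """Kennedy-style bare bones position sampling, one value per dimension."""
    return [random.gauss((p + g) / 2.0, abs(p - g))
            for p, g in zip(pbest, gbest)]
```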
Ozcan, E. & Mohan, C. | Particle swarm optimization: surfing the waves | 1999 | Evolutionary Computation, 1999. CEC 99. Proceedings of the 1999 Congress on | inproceedings | DOI |
Abstract: A new optimization method has been proposed by J. Kennedy and R.C. Eberhart (1997; 1995), called Particle Swarm Optimization (PSO). This approach combines social psychology principles and evolutionary computation. It has been applied successfully to nonlinear function optimization and neural network training. Preliminary formal analyses showed that a particle in a simple one-dimensional PSO system follows a path defined by a sinusoidal wave, randomly deciding on both its amplitude and frequency (Y. Shi and R. Eberhart, 1998). The paper takes the next step, generalizing to obtain closed form equations for trajectories of particles in a multi-dimensional search space | |||||
BibTeX:
@inproceedings{Ozcan1999, author = {Ozcan, E. and Mohan, C.K.}, title = {Particle swarm optimization: surfing the waves}, booktitle = {Evolutionary Computation, 1999. CEC 99. Proceedings of the 1999 Congress on}, year = {1999}, volume = {3}, pages = {-1944 Vol. 3}, doi = {http://dx.doi.org/10.1109/CEC.1999.785510} } |
|||||
Ozcan, E. & Mohan, C. K. | Analysis of a simple particle swarm optimization system | 1998 | Intelligent Engineering Systems Through Artificial Neural Networks | article | |
BibTeX:
@article{ozcan1998analysis, author = {Ozcan, Ender and Mohan, Chilukuri K}, title = {Analysis of a simple particle swarm optimization system}, journal = {Intelligent Engineering Systems Through Artificial Neural Networks}, year = {1998}, volume = {8}, pages = {253--258} } |
|||||
Parrott, D. & Li, X. | Locating and tracking multiple dynamic optima by a particle swarm model using speciation | 2006 | Evolutionary Computation, IEEE Transactions on | article | DOI |
Abstract: This paper proposes an improved particle swarm optimizer using the notion of species to determine its neighborhood best values for solving multimodal optimization problems and for tracking multiple optima in a dynamic environment. In the proposed species-based particle swarm optimization (SPSO), the swarm population is divided into species subpopulations based on their similarity. Each species is grouped around a dominating particle called the species seed. At each iteration step, species seeds are identified from the entire population, and then adopted as neighborhood bests for these individual species groups separately. Species are formed adaptively at each step based on the feedback obtained from the multimodal fitness landscape. Over successive iterations, species are able to simultaneously optimize toward multiple optima, regardless of whether they are global or local optima. Our experiments on using the SPSO to locate multiple optima in a static environment and a dynamic SPSO (DSPSO) to track multiple changing optima in a dynamic environment have demonstrated that SPSO is very effective in dealing with multimodal optimization functions in both environments. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Parrott2006, author = {Parrott, D. and Li, Xiaodong}, title = {Locating and tracking multiple dynamic optima by a particle swarm model using speciation}, journal = {Evolutionary Computation, IEEE Transactions on}, year = {2006}, volume = {10}, number = {4}, pages = {440-458}, doi = {http://dx.doi.org/10.1109/TEVC.2005.859468} } |
|||||
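A sketch of the seed-identification step the abstract describes: particles are scanned from best to worst, and each either joins an existing species (if within a radius of its seed) or founds a new one. Minimisation and a Euclidean radius are assumptions here.

```python
import math

def species_seeds(positions, fitness, radius):
    """Return indices of species seeds, best particles first."""
    order = sorted(range(len(positions)), key=lambda i: fitness[i])
    seeds = []
    for i in order:
        if all(math.dist(positions[i], positions[s]) > radius for s in seeds):
            seeds.append(i)
    return seeds
```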
Parsopoulos, K. & Vrahatis, M. | On the computation of all global minimizers through particle swarm optimization | 2004 | Evolutionary Computation, IEEE Transactions on | article | DOI |
Abstract: This paper presents approaches for effectively computing all global minimizers of an objective function. The approaches include transformations of the objective function through the recently proposed deflection and stretching techniques, as well as a repulsion source at each detected minimizer. The aforementioned techniques are incorporated in the context of the particle swarm optimization (PSO) method, resulting in an efficient algorithm which has the ability to avoid previously detected solutions and, thus, detect all global minimizers of a function. Experimental results on benchmark problems originating from the fields of global optimization, dynamical systems, and game theory, are reported, and conclusions are derived. | |||||
BibTeX:
@article{Parsopoulos2004, author = {Parsopoulos, K.E. and Vrahatis, M.N.}, title = {On the computation of all global minimizers through particle swarm optimization}, journal = {Evolutionary Computation, IEEE Transactions on}, year = {2004}, volume = {8}, number = {3}, pages = {211-224}, doi = {http://dx.doi.org/10.1109/TEVC.2004.826076} } |
|||||
Pedersen, M. & Chipperfield, A. | Simplifying Particle Swarm Optimization | 2010 | Applied Soft Computing | article | DOI URL |
Abstract: The general purpose optimization method known as Particle Swarm Optimization (PSO) has received much attention in past years, with many attempts to find the variant that performs best on a wide variety of optimization problems. The focus of past research has been with making the PSO method more complex, as this is frequently believed to increase its adaptability to other optimization problems. This study takes the opposite approach and simplifies the PSO method. To compare the efficacy of the original PSO and the simplified variant here, an easy technique is presented for efficiently tuning their behavioural parameters. The technique works by employing an overlaid meta-optimizer, which is capable of simultaneously tuning parameters with regard to multiple optimization problems, whereas previous approaches to meta-optimization have tuned behavioural parameters to work well on just a single optimization problem. It is then found that not only the PSO method and its simplified variant have comparable performance for optimizing a number of Artificial Neural Network problems, but also the simplified variant appears to offer a small improvement in some cases. | |||||
BibTeX:
@article{Pedersen2010, author = {M.E.H. Pedersen and A.J. Chipperfield}, title = {Simplifying Particle Swarm Optimization}, journal = {Applied Soft Computing}, year = {2010}, volume = {10}, number = {2}, pages = {618 - 628}, url = {http://www.sciencedirect.com/science/article/pii/S1568494609001549}, doi = {http://dx.doi.org/10.1016/j.asoc.2009.08.029} } |
|||||
Pedersen, M. E. H. | Tuning & simplifying heuristical optimization | 2010 | School: University of Southampton | phdthesis | |
BibTeX:
@phdthesis{pedersen2010tuning, author = {Pedersen, Magnus Erik Hvass}, title = {Tuning & simplifying heuristical optimization}, school = {University of Southampton}, year = {2010} } |
|||||
Peram, T., Veeramachaneni, K. & Mohan, C. | Fitness-distance-ratio based particle swarm optimization | 2003 | Swarm Intelligence Symposium, 2003. SIS '03. Proceedings of the 2003 IEEE | inproceedings | DOI |
BibTeX:
@inproceedings{Peram2003, author = {Peram, T. and Veeramachaneni, K. and Mohan, C.K.}, title = {Fitness-distance-ratio based particle swarm optimization}, booktitle = {Swarm Intelligence Symposium, 2003. SIS '03. Proceedings of the 2003 IEEE}, year = {2003}, pages = {174-181}, doi = {http://dx.doi.org/10.1109/SIS.2003.1202264} } |
|||||
Qin, Z., Yu, F., Shi, Z. & Wang, Y. | Adaptive inertia weight particle swarm optimization | 2006 | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | article | URL |
Abstract: Adaptive inertia weight is proposed to rationally balance the global exploration and local exploitation abilities for particle swarm optimization. The resulting algorithm is called adaptive inertia weight particle swarm optimization algorithm (AIW-PSO) where a simple and effective measure, individual search ability (ISA), is defined to indicate whether each particle lacks global exploration or local exploitation abilities in each dimension. A transform function is employed to dynamically calculate the values of inertia weight according to ISA. In each iteration during the run, every particle can choose appropriate inertia weight along every dimension of search space according to its own situation. By this fine strategy of dynamically adjusting inertia weight, the performance of PSO algorithm could be improved. In order to demonstrate the effectiveness of AIW-PSO, comprehensive experiments were conducted on three well-known benchmark functions with 10, 20, and 30 dimensions. AIW-PSO was compared with linearly decreasing inertia weight PSO, fuzzy adaptive inertia weight PSO and random number inertia weight PSO. Experimental results show that AIW-PSO achieves good performance and outperforms other algorithms. | |||||
BibTeX:
@article{Qin2006, author = {Qin, Z. and Yu, F. and Shi, Z. and Wang, Y.}, title = {Adaptive inertia weight particle swarm optimization}, journal = {Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)}, year = {2006}, volume = {4029 LNAI}, pages = {450-459}, note = {cited By (since 1996) 11}, url = {http://www.scopus.com/inward/record.url?eid=2-s2.0-33746239398&partnerID=40&md5=735e5e95ea1f009867a929942dfdba3e} } |
|||||
Rada-Vilela, J., Zhang, M. & Seah, W. | A performance study on synchronous and asynchronous updates in particle swarm optimization | 2011 | Proceedings of the 13th annual conference on Genetic and evolutionary computation | inproceedings | DOI URL |
BibTeX:
@inproceedings{Rada-Vilela2011, author = {Rada-Vilela, Juan and Zhang, Mengjie and Seah, Winston}, title = {A performance study on synchronous and asynchronous updates in particle swarm optimization}, booktitle = {Proceedings of the 13th annual conference on Genetic and evolutionary computation}, publisher = {ACM}, year = {2011}, pages = {21--28}, url = {http://doi.acm.org/10.1145/2001576.2001581}, doi = {http://dx.doi.org/10.1145/2001576.2001581} } |
|||||
Ratnaweera, A., Halgamuge, S. & Watson, H. | Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients | 2004 | Evolutionary Computation, IEEE Transactions on | article | DOI |
Abstract: This paper introduces a novel parameter automation strategy for the particle swarm algorithm and two further extensions to improve its performance after a predefined number of generations. Initially, to efficiently control the local search and convergence to the global optimum solution, time-varying acceleration coefficients (TVAC) are introduced in addition to the time-varying inertia weight factor in particle swarm optimization (PSO). From the basis of TVAC, two new strategies are discussed to improve the performance of the PSO. First, the concept of "mutation" is introduced to the particle swarm optimization along with TVAC (MPSO-TVAC), by adding a small perturbation to a randomly selected modulus of the velocity vector of a random particle by predefined probability. Second, we introduce a novel particle swarm concept "self-organizing hierarchical particle swarm optimizer with TVAC (HPSO-TVAC)". Under this method, only the "social" part and the "cognitive" part of the particle swarm strategy are considered to estimate the new velocity of each particle, and particles are reinitialized whenever they stagnate in the search space. In addition, to overcome the difficulties of selecting an appropriate mutation step size for different problems, a time-varying mutation step size was introduced. Further, for most of the benchmarks, the performance of the MPSO-TVAC method is found to be insensitive to the mutation probability. On the other hand, the effect of reinitialization velocity on the performance of the HPSO-TVAC method is also observed. Time-varying reinitialization step size is found to be an efficient parameter optimization strategy for the HPSO-TVAC method. The HPSO-TVAC strategy outperformed all the methods considered in this investigation for most of the functions. Furthermore, it has also been observed that both the MPSO and HPSO strategies perform poorly when the acceleration coefficients are fixed at two. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Ratnaweera2004, author = {Ratnaweera, A. and Halgamuge, S. and Watson, H.C.}, title = {Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients}, journal = {Evolutionary Computation, IEEE Transactions on}, year = {2004}, volume = {8}, number = {3}, pages = {240-255}, doi = {http://dx.doi.org/10.1109/TEVC.2004.826071} } |
|||||
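A sketch of the TVAC schedule: the cognitive coefficient decays while the social one grows, shifting the swarm from individual exploration toward social convergence. The 2.5-to-0.5 ranges are the values commonly quoted for this study, kept here as an assumption.

```python
def tvac(t, t_max, c1_i=2.5, c1_f=0.5, c2_i=0.5, c2_f=2.5):
    """Time-varying acceleration coefficients at iteration t of t_max."""
    frac = t / float(t_max)
    c1 = c1_i + (c1_f - c1_i) * frac   # cognitive: decreases over the run
    c2 = c2_i + (c2_f - c2_i) * frac   # social: increases over the run
    return c1, c2
```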
Roy, R., Dehuri, S. & Cho, S. B. | A Novel Particle Swarm Optimization Algorithm for Multi-Objective Combinatorial Optimization Problem | 2011 | International Journal of Applied Metaheuristic Computing (IJAMC) | article | |
BibTeX:
@article{roy2011novel, author = {Roy, Rahul and Dehuri, Satchidananda and Cho, Sung Bae}, title = {A Novel Particle Swarm Optimization Algorithm for Multi-Objective Combinatorial Optimization Problem}, journal = {International Journal of Applied Metaheuristic Computing (IJAMC)}, publisher = {IGI Global}, year = {2011}, volume = {2}, number = {4}, pages = {41--57} } |
|||||
Sedighizadeh, D. & Masehian, E. | Particle swarm optimization methods, taxonomy and applications | 2009 | International Journal of Computer Theory and Engineering | article | |
BibTeX:
@article{sedighizadeh2009particle, author = {Sedighizadeh, Davoud and Masehian, Ellips}, title = {Particle swarm optimization methods, taxonomy and applications}, journal = {International Journal of Computer Theory and Engineering}, year = {2009}, volume = {1}, number = {5}, pages = {1793--8201} } |
|||||
Shelokar, P., Siarry, P., Jayaraman, V. & Kulkarni, B. | Particle swarm and ant colony algorithms hybridized for improved continuous optimization | 2007 | Applied Mathematics and Computation | article | |
BibTeX:
@article{shelokar2007particle, author = {Shelokar, PS and Siarry, Patrick and Jayaraman, VK and Kulkarni, BD}, title = {Particle swarm and ant colony algorithms hybridized for improved continuous optimization}, journal = {Applied Mathematics and Computation}, publisher = {Elsevier}, year = {2007}, volume = {188}, number = {1}, pages = {129--142} } |
|||||
Shi, Y. & Eberhart, R. | A modified particle swarm optimizer | 1998 | Evolutionary Computation Proceedings, 1998. IEEE World Congress on Computational Intelligence., The 1998 IEEE International Conference on | inproceedings | DOI |
Abstract: Evolutionary computation techniques, genetic algorithms, evolutionary strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions, is manipulated according to the rule of survival of the fittest through "genetic" operations, such as mutation, crossover and reproduction. A best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm through simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, these individuals are "evolved" by cooperation and competition among the individuals themselves through generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@inproceedings{Shi1998, author = {Yuhui Shi and Eberhart, R.}, title = {A modified particle swarm optimizer}, booktitle = {Evolutionary Computation Proceedings, 1998. IEEE World Congress on Computational Intelligence., The 1998 IEEE International Conference on}, year = {1998}, pages = {69-73}, doi = {http://dx.doi.org/10.1109/ICEC.1998.699146} } |
|||||
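The modification introduced here, scaling the previous velocity by an inertia weight w, is the basis of most later PSO variants. A minimal sketch; the linearly decreasing 0.9-to-0.4 schedule is the range commonly associated with the authors' follow-up experiments, not a value fixed by this paper.

```python
import random

def linear_w(t, t_max, w_start=0.9, w_end=0.4):
    """Linearly decreasing inertia weight over a run of t_max iterations."""
    return w_start - (w_start - w_end) * t / float(t_max)

def inertia_step(x, v, pbest, gbest, w, c1=2.0, c2=2.0):
    """One inertia-weighted velocity/position update."""
    for d in range(len(x)):
        v[d] = (w * v[d] +
                c1 * random.random() * (pbest[d] - x[d]) +
                c2 * random.random() * (gbest[d] - x[d]))
        x[d] += v[d]
    return x, v
```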
Suresh, K., Ghosh, S., Kundu, D., Sen, A., Das, S. & Abraham, A. | Inertia-adaptive particle swarm optimizer for improved global search | 2008 | Proceedings - 8th International Conference on Intelligent Systems Design and Applications, ISDA 2008 | conference | URL |
Abstract: This paper describes a method for improving the final accuracy and the convergence speed of Particle Swarm Optimization (PSO) by adapting its inertia factor in the velocity updating equation and also by adding a new coefficient to the position updating equation. These modifications do not impose any serious requirements on the basic algorithm in terms of the number of Function Evaluations (FEs). The new algorithm has been shown to be statistically significantly better than four recent variants of PSO on an eight-function test-suite for the following performance metrics: quality of the final solution, time to find the solution, frequency of hitting the optima, and scalability. | |||||
BibTeX:
@conference{Suresh2008, author = {Suresh, K. and Ghosh, S. and Kundu, D. and Sen, A. and Das, S. and Abraham, A.}, title = {Inertia-adaptive particle swarm optimizer for improved global search}, booktitle = {Proceedings - 8th International Conference on Intelligent Systems Design and Applications, ISDA 2008}, year = {2008}, volume = {2}, pages = {253-258}, note = {cited By (since 1996) 7}, url = {http://www.scopus.com/inward/record.url?eid=2-s2.0-67449124383&partnerID=40&md5=8310e199951ac9d0931f7160d1c2c53f} } |
|||||
Trelea, I. C. | The particle swarm optimization algorithm: convergence analysis and parameter selection | 2003 | Information Processing Letters | article | DOI URL |
Abstract: The particle swarm optimization algorithm is analyzed using standard results from the dynamic system theory. Graphical parameter selection guidelines are derived. The exploration–exploitation tradeoff is discussed and illustrated. Examples of performance on benchmark functions superior to previously published results are given. | |||||
BibTeX:
@article{Trelea2003, author = {Ioan Cristian Trelea}, title = {The particle swarm optimization algorithm: convergence analysis and parameter selection}, journal = {Information Processing Letters}, year = {2003}, volume = {85}, number = {6}, pages = {317 - 325}, url = {http://www.sciencedirect.com/science/article/pii/S0020019002004477}, doi = {http://dx.doi.org/10.1016/S0020-0190(02)00447-7} } |
|||||
Van Den Bergh, F. | An analysis of particle swarm optimizers | 2002 | School: University of Pretoria | phdthesis | |
BibTeX:
@phdthesis{VanDenBergh:2002, author = {Van Den Bergh, Frans}, title = {An analysis of particle swarm optimizers}, school = {University of Pretoria}, year = {2002}, note = {AAI0804353} } |
|||||
Xie, X., Zhang, W. & Yang, Z. | Dissipative particle swarm optimization | 2002 | Evolutionary Computation, 2002. CEC '02. Proceedings of the 2002 Congress on | inproceedings | DOI |
Abstract: A dissipative particle swarm optimization is developed according to the self-organization of dissipative structure. The negative entropy is introduced to construct an open dissipative system that is far from equilibrium, so as to drive the irreversible evolution process toward better fitness. Tests on two multimodal functions indicate that it improves the performance effectively. | |||||
BibTeX:
@inproceedings{Xiao-Feng2002, author = {Xiao-Feng Xie and Wen-Jun Zhang and Zhi-Lian Yang}, title = {Dissipative particle swarm optimization}, booktitle = {Evolutionary Computation, 2002. CEC '02. Proceedings of the 2002 Congress on}, year = {2002}, volume = {2}, pages = {1456-1461}, doi = {http://dx.doi.org/10.1109/CEC.2002.1004457} } |
|||||
Xinchao, Z. | A perturbed particle swarm algorithm for numerical optimization | 2010 | Applied Soft Computing Journal | article | URL |
Abstract: The canonical particle swarm optimization (PSO) has its own disadvantages, such as a high speed of convergence which often implies a rapid loss of diversity during the optimization process, inevitably leading to undesirable premature convergence. To overcome this disadvantage of PSO, a perturbed particle swarm algorithm (pPSA) is presented, based on a new particle updating strategy built upon the concept of a perturbed global best, to deal with premature convergence and diversity maintenance within the swarm. A linear model and a random model, together with the initial max-min model, are provided to understand and analyze the uncertainty of the perturbed particle updating strategy. pPSA is validated using 12 standard test functions. The preliminary results indicate that pPSA performs much better than PSO, both in quality of solutions and in robustness, and is comparable with GCPSO. The experiments confirm that the perturbed particle updating strategy is an encouraging strategy for stochastic heuristic algorithms and that the max-min model is a promising model on the concept of possibility measure. (An illustrative sketch follows this entry.) | |||||
BibTeX:
@article{Xinchao2010, author = {Xinchao, Z.}, title = {A perturbed particle swarm algorithm for numerical optimization}, journal = {Applied Soft Computing Journal}, year = {2010}, volume = {10}, number = {1}, pages = {119-124}, note = {cited By (since 1996) 47}, url = {http://www.scopus.com/inward/record.url?eid=2-s2.0-70350103089&partnerID=40&md5=df4fe5c85b3a3ae2191348ce78f84b31} } |
|||||
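A sketch of the perturbed-gbest idea: particles are attracted to a point sampled around the global best rather than to the crisp gbest itself, which maintains diversity and fights premature convergence. The fixed Gaussian magnitude is an assumption; the paper also analyzes linear and random models for it.

```python
import random

def perturbed_gbest(gbest, sigma=0.01):
    """Sample a perturbed attractor around the global best position."""
    return [g + random.gauss(0.0, sigma) for g in gbest]
```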
Yang, X. | Nature-Inspired Metaheuristic Algorithms: Second Edition | 2011 | | book | URL |
BibTeX:
@book{yang2011nature, author = {Yang, X.S.}, title = {Nature-Inspired Metaheuristic Algorithms: Second Edition}, publisher = {Luniver Press}, year = {2011}, url = {http://books.google.es/books?id=iVBETlh4ogC} } |
|||||
Yang, X., Deb, S. & Fong, S. | Accelerated particle swarm optimization and support vector machine for business optimization and applications | 2011 | Networked Digital Technologies | article | |
BibTeX:
@article{yang2011accelerated, author = {Yang, Xin-She and Deb, Suash and Fong, Simon}, title = {Accelerated particle swarm optimization and support vector machine for business optimization and applications}, journal = {Networked Digital Technologies}, publisher = {Springer}, year = {2011}, pages = {53--66} } |
|||||
Yasuda, K., Ide, A. & Iwasaki, N. | Adaptive particle swarm optimization | 2003 | Systems, Man and Cybernetics, 2003. IEEE International Conference on | inproceedings | DOI |
Abstract: The particle swarm optimization (PSO) method is one of the most powerful methods for solving unconstrained and constrained global optimization problems. Little is, however, known about how the PSO method works or finds a globally optimal solution when applied to global optimization problems. This paper deals with the analysis of the dynamics of PSO in order to obtain an understanding of how it searches for a globally optimal solution, and a strategy for how to tune its parameters. A generalized reduced model of PSO is proposed in order to analyze the dynamics of PSO, and the stability analysis is carried out on the basis of both eigenvalue analysis and some numerical simulations on a typical global optimization problem. | |||||
BibTeX:
@inproceedings{Yasuda2003, author = {Yasuda, K. and Ide, A. and Iwasaki, N.}, title = {Adaptive particle swarm optimization}, booktitle = {Systems, Man and Cybernetics, 2003. IEEE International Conference on}, year = {2003}, volume = {2}, pages = {1554-1559 vol.2}, doi = {http://dx.doi.org/10.1109/ICSMC.2003.1244633} } |
|||||
Zhan, Z., Zhang, J., Li, Y. & Chung, H. | Adaptive Particle Swarm Optimization | 2009 | Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on | article | DOI |
Abstract: An adaptive particle swarm optimization (APSO) that features better search efficiency than classical particle swarm optimization (PSO) is presented. More importantly, it can perform a global search over the entire search space with faster convergence speed. The APSO consists of two main steps. First, by evaluating the population distribution and particle fitness, a real-time evolutionary state estimation procedure is performed to identify one of the following four defined evolutionary states, including exploration, exploitation, convergence, and jumping out in each generation. It enables the automatic control of inertia weight, acceleration coefficients, and other algorithmic parameters at run time to improve the search efficiency and convergence speed. Then, an elitist learning strategy is performed when the evolutionary state is classified as convergence state. The strategy will act on the globally best particle to jump out of the likely local optima. The APSO has comprehensively been evaluated on 12 unimodal and multimodal benchmark functions. The effects of parameter adaptation and elitist learning will be studied. Results show that APSO substantially enhances the performance of the PSO paradigm in terms of convergence speed, global optimality, solution accuracy, and algorithm reliability. As APSO introduces two new parameters to the PSO paradigm only, it does not introduce an additional design or implementation complexity. | |||||
BibTeX:
@article{Zhan2009, author = {Zhi-Hui Zhan and Jun Zhang and Yun Li and Chung, H.S.-H.}, title = {Adaptive Particle Swarm Optimization}, journal = {Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on}, year = {2009}, volume = {39}, number = {6}, pages = {1362-1381}, doi = {http://dx.doi.org/10.1109/TSMCB.2009.2015956} } |
|||||
Zhang, W. & Xie, X. | DEPSO: hybrid particle swarm with differential evolution operator | 2003 | Systems, Man and Cybernetics, 2003. IEEE International Conference on | inproceedings | DOI |
Abstract: A hybrid particle swarm with a differential evolution operator, termed DEPSO, is proposed; it provides bell-shaped mutations with consensus on the population diversity along with the evolution, while keeping the self-organized particle swarm dynamics. It is then applied to a set of benchmark functions, and the experimental results illustrate its efficiency. | |||||
BibTeX:
@inproceedings{Zhang2003, author = {Wen-Jun Zhang and Xiao-Feng Xie}, title = {DEPSO: hybrid particle swarm with differential evolution operator}, booktitle = {Systems, Man and Cybernetics, 2003. IEEE International Conference on}, year = {2003}, volume = {4}, pages = {3816-3821 vol.4}, doi = {http://dx.doi.org/10.1109/ICSMC.2003.1244483} } |
|||||
Zheng, Y., Ma, L., Zhang, L. & Qian, J. | On the convergence analysis and parameter selection in particle swarm optimization | 2003 | Machine Learning and Cybernetics, 2003 International Conference on | inproceedings | DOI |
Abstract: A PSO with increasing inertia weight, distinct from the widely used PSO with decreasing inertia weight, is proposed in this paper. Rather than drawing conclusions from empirical study alone or from rules of thumb, this algorithm is derived from particle trajectory study and convergence analysis. Finally, four standard test functions are used to confirm its validity. From the experiments, it is clear that a PSO with increasing inertia weight outperforms the one with decreasing inertia weight, in both convergence speed and solution precision, with no additional computing load. | |||||
BibTeX:
@inproceedings{Zheng2003, author = {Yong-ling Zheng and Long-Hua Ma and Li-yan Zhang and Ji-xin Qian}, title = {On the convergence analysis and parameter selection in particle swarm optimization}, booktitle = {Machine Learning and Cybernetics, 2003 International Conference on}, year = {2003}, volume = {3}, pages = {1802-1807 Vol.3}, doi = {http://dx.doi.org/10.1109/ICMLC.2003.1259789} } |
|||||
| Ant colony optimization (Scholarpedia) | | | other | URL |
BibTeX:
@other{ACO_scholarpedia:Online, url = {http://www.scholarpedia.org/article/Ant_colony_optimization} } |
|||||
| Particle swarm programs (particleswarm.info) | | | other | URL |
BibTeX:
@other{ParticleSawarm:Online, url = {http://www.particleswarm.info/Programs.html} } |
|||||
| Particle swarm optimization (Scholarpedia) | | | other | URL |
BibTeX:
@other{PSO_scholarpedia:Online, url = {http://www.scholarpedia.org/article/Particle_swarm_optimization} } |
|||||
| Swarm intelligence (swarmintelligence.org) | | | other | URL |
BibTeX:
@other{SwarmIntelligence:Online, url = {http://www.swarmintelligence.org/} } |
Created with JabRef.