09-05-2011, 12:05 PM
ABSTRACT
The selection operator is one of the most important aspects of the GA process. There are several methods of selection, among them tournament selection, ranking selection, and proportional selection. Proportional selection itself can be implemented in many ways; the most popular are Roulette Wheel Selection (RWS), Stochastic Remainder Roulette Wheel Selection (SRRWS), and Stochastic Universal Sampling (SUS). In this paper a modified RWS method is proposed to increase the gain of resources, reliability, and diversity, and to decrease the uncertainty in the selection process.
Keywords: Elitism, Genetic Algorithms, Selection, Optimization, Robust.
1. INTRODUCTION
John Holland proposed the first Genetic Algorithm (GA) in 1975 [1], and GAs were popularized by the publication of David Goldberg's book in 1989 [2]. Since that time, GAs have been used in a wide range of applications where optimization is needed. GAs derive from the evolutionary process and are based on evolutionary operators such as selection, mutation, and crossover, using fitness-based survival to arrive at the best solutions for specified problems. A modified RWS algorithm (named rank-based roulette wheel selection, RRWS) is presented in this paper; it is faster and more robust than RWS. This paper is organized as follows: Section 2 is an introduction to GA, Section 3 presents the design of GA with rank-based roulette, Section 4 illustrates the experimental work, Section 5 discusses the results, and Section 6 gives the conclusions.
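The difference between classic RWS and a rank-based roulette wheel can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear rank weighting (rank 1 for the worst individual, N for the best) is an assumption made for the example.

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Classic RWS: selection probability is proportional to raw fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point round-off

def rank_based_roulette_select(population, fitnesses):
    """Rank-based variant: the wheel is spun over ranks 1..N instead of raw
    fitness, which damps the dominance of a single super-fit individual."""
    ranked = sorted(zip(population, fitnesses), key=lambda pf: pf[1])
    n = len(ranked)
    total = n * (n + 1) // 2          # sum of ranks 1 + 2 + ... + n
    pick = random.uniform(0, total)
    running = 0.0
    for rank, (individual, _) in enumerate(ranked, start=1):
        running += rank
        if running >= pick:
            return individual
    return ranked[-1][0]
```

Because ranks are bounded by the population size, one very fit individual cannot monopolize the wheel under the rank-based scheme, which helps preserve the diversity the abstract refers to.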
2. AN INTRODUCTION TO GA
Genetic Algorithms are among several types of optimization methods that use a stochastic approach to randomly search for good solutions to a specified problem; others include Simulated Annealing [3], Tabu search [4], and numerous variations. These stochastic approaches use various analogies to natural systems to build from promising solutions, ensuring greater efficiency than completely random search. The advantage of using these approaches over traditional optimization methods, such as nonlinear programming, is that they can solve any type of problem without explicit specification of problem characteristics (e.g. derivatives of the objective function). This property is particularly important for complex applications, where the optimization problem often involves integer decision variables or potential solutions that must be evaluated with complex existing simulation models, for which derivative calculations would be difficult or impossible. Stochastic approaches also perform a broad global search, unlike traditional nonlinear optimization approaches such as nonlinear programming, which can converge to local minima in multimodal optimization problems. These benefits have resulted in widespread use and acceptance of these approaches among researchers. Their disadvantages are that they are not guaranteed to find the globally optimal solution and that they can be substantially slower than traditional optimization methods for problems those methods can solve. They are therefore recommended only for problems that cannot be effectively solved using traditional optimization approaches, such as those with numerous integer decision variables, non-convexities, or other irregularities.
Download full report
http://www.jatit.org/volumes/research-papers/Vol4No4/3vol4no4.pdf