35.3 Single Objective Evolutionary Computation (EC) Techniques

35.3.1 Single Objective Genetic Optimization Search Algorithm (SOGA)
A comparison between conventional optimization techniques and evolutionary algorithms (such as GA and PSO) is presented in Table 35.1 [1].

TABLE 35.1 Comparison between conventional optimization procedures and evolutionary algorithms

Property         | Traditional                                    | Evolutionary
Search space     | Trajectory by a single point                   | Population of potential solutions
Motivation       | Mathematical properties (gradient, Hessian)    | Natural selection and social adaptation
Applicability    | Applicable to a specific problem domain        | Domain independent, applicable to a variety of problems
Point transition | Deterministic                                  | Probabilistic
Prerequisites    | Auxiliary knowledge such as gradient vectors   | An objective function to be optimized
Initial guess    | Provided by user                               | Automatically generated by the algorithm
Flow of control  | Mostly serial                                  | Mostly parallel
CPU time         | Low                                            | High
Convergence      | Local optimum, dependent on the initial guess  | Global optimum more probable
Advantages       | Convergence proof                              | Global search, parallel, speed
Drawbacks        | Locality                                       | No general formal convergence proof, computational cost
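To make the table's point transition and prerequisites rows concrete, the short Python sketch below contrasts one deterministic gradient step, which needs the analytic gradient as auxiliary knowledge, with one probabilistic population update, which needs only objective values. The quadratic function, step size, population size, and perturbation scale are illustrative assumptions, not values from the text.

import random

def f(x):
    return x * x     # objective function

def grad(x):
    return 2.0 * x   # auxiliary knowledge required by the traditional method

# Traditional: a single point moved deterministically along the gradient;
# every run produces the same next point.
x = 3.0
x = x - 0.1 * grad(x)

# Evolutionary: a population of points perturbed at random and filtered by
# survival of the fittest; only evaluations of f are needed.
parents = [random.uniform(-5.0, 5.0) for _ in range(20)]
offspring = parents + [p + random.gauss(0.0, 0.5) for p in parents]
population = sorted(offspring, key=f)[:20]   # keep the 20 fittest points

print("gradient point:", x, "best population member:", population[0])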
GAs are an evolutionary optimization approach and an alternative to traditional optimization methods. GA is most appropriate for complex nonlinear models where locating the global optimum is a difficult task. It may be possible to use GA techniques to consider problems which may not be modeled as accurately using other approaches; therefore, GA appears to be a potentially useful approach. GA is particularly applicable to problems which are large, nonlinear, and possibly discrete in nature, features that traditionally add to the degree of complexity of the solution. Because of the probabilistic development of the solution, GA does not guarantee optimality even when it may be reached; however, the solutions it finds are likely to be close to the global optimum. This probabilistic nature of the search is also the reason GAs are not trapped by local optima.

The GA procedure is based on the Darwinian principle of survival of the fittest. An initial population is created containing a predefined number of individuals (or solutions), each represented by a genetic string incorporating the variable information. Each individual has an associated fitness measure, typically representing an objective value. The concept that the fittest (or best) individuals in a population will produce fitter offspring is then implemented in order to reproduce the next population. Selected individuals are chosen for reproduction (or crossover) at each generation, with an appropriate mutation factor to randomly modify the genes of an individual, in order to develop the new population. The result is another set of individuals based on the original subjects, leading to subsequent populations with better (min. or max.) individual fitness. The algorithm thus identifies the individuals with the optimizing fitness values, while those with lower fitness are naturally discarded from the population [2].

Figure 35.1 shows the general flow chart of the GA, based on an iterative search for the minimum total error.

[FIGURE 35.1 General flow chart of the GA: Start → generation of initial population → ... → mutation → ...]

The steps of the GA are as follows (a minimal code sketch is given after the list):

1. [Start] Generate a random population of n chromosomes (suitable solutions for the problem).
2. [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.
3. [New population] Create a new population by repeating the following steps until the new population is complete:
   (a) [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance of being selected).
   (b) [Crossover] With a crossover probability, cross over the parents to form new offspring (children). If no crossover is performed, the offspring are exact copies of the parents.
   (c) [Mutation] With a mutation probability, mutate the new offspring at each locus (position in the chromosome).
   (d) [Accepting] Place the new offspring in the new population.
4. [Replace] Use the newly generated population for a further run of the algorithm.
5. [Test] If the end condition is satisfied, stop and return the best solution in the current population.
6. [Loop] Go to step 2.
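As a concrete illustration of steps 1 through 6, the following minimal Python sketch evolves a population toward the minimum of a simple quadratic error function. The population size, gene bounds, crossover and mutation probabilities, tournament selection, and the sphere objective are illustrative assumptions, not values prescribed by the text.

import random

GENES = 4            # chromosome length (number of decision variables)
POP_SIZE = 30        # population size n
P_CROSSOVER = 0.9    # crossover probability
P_MUTATION = 0.05    # per-locus mutation probability
GENERATIONS = 100
LOW, HIGH = -5.0, 5.0

def objective(x):
    # Total error to be minimized (a sphere function as a stand-in).
    return sum(v * v for v in x)

def fitness(x):
    # Higher is better: invert the error so the GA maximizes fitness.
    return 1.0 / (1.0 + objective(x))

def random_chromosome():
    return [random.uniform(LOW, HIGH) for _ in range(GENES)]

def select(population):
    # [Selection] Tournament of two: the better fitness, the bigger chance.
    a, b = random.sample(population, 2)
    return a if fitness(a) > fitness(b) else b

def crossover(p1, p2):
    # [Crossover] One-point crossover with probability P_CROSSOVER.
    if random.random() < P_CROSSOVER:
        cut = random.randint(1, GENES - 1)
        return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
    return p1[:], p2[:]  # no crossover: offspring copy the parents

def mutate(x):
    # [Mutation] Resample each locus with probability P_MUTATION.
    return [random.uniform(LOW, HIGH) if random.random() < P_MUTATION else g
            for g in x]

# [Start] random initial population
population = [random_chromosome() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    # [New population] selection, crossover, mutation, accepting
    new_population = []
    while len(new_population) < POP_SIZE:
        child1, child2 = crossover(select(population), select(population))
        new_population += [mutate(child1), mutate(child2)]
    # [Replace] use the new population for the next run
    population = new_population[:POP_SIZE]

# [Test] report the best solution in the current population
best = max(population, key=fitness)
print("best chromosome:", best, "error:", objective(best))

For brevity the sketch uses a fixed generation count as its end condition and omits elitism; a practical implementation would usually carry the best individual over from one generation to the next.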
GAs can be applied to many scientific and engineering problems, provided that solutions of a given problem can be encoded as chromosomes and the relative performance (fitness) of solutions can be compared. An effective GA representation and a meaningful fitness evaluation are the keys to success in GA applications. The appeal of GAs comes from their simplicity and elegance as robust search algorithms, as well as from their power to discover good solutions rapidly for difficult high-dimensional problems. The main advantage of GA is that models which cannot be developed using other solution methods without some form of approximation can be considered in an un-approximated form. The size of the model, i.e., the number of probabilistic variables, has a significant effect on the speed of solution; therefore, model specification can be crucial. Unlike other solution methods, integer variables are easier to accommodate in GA than continuous variables. This is due to the resulting restricted search space.
Further, variable bound values can be applied to achieve similar results. GAs can be used for problem solving and for modeling when the search space is large, complex, or poorly understood; when domain knowledge is scarce or expert knowledge is difficult to encode to narrow the search space; when no mathematical analysis is available; and when traditional search methods fail.
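As an example of the point above about encoding solutions as chromosomes and evaluating a meaningful fitness, the sketch below represents a candidate solution for a hypothetical motor-drive tuning task as two bounded PI controller gains and scores it by the accumulated speed error of a toy first-order plant. The plant model, gain bounds, and simulation settings are invented for illustration and are not taken from the chapter.

import random

BOUNDS = [(0.0, 10.0), (0.0, 10.0)]   # (low, high) for the genes Kp and Ki

def random_chromosome():
    # Bounded genes restrict the search space, as noted above.
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def speed_error(kp, ki, steps=500, dt=0.01, target=1.0):
    # Euler-integrate a toy first-order plant ds/dt = -s + u under PI
    # control and accumulate the absolute speed error.
    speed, integral, total = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = target - speed
        integral += err * dt
        u = kp * err + ki * integral   # PI control law
        speed += dt * (-speed + u)     # toy plant dynamics
        total += abs(err) * dt
    return total

def fitness(chromosome):
    kp, ki = chromosome
    return 1.0 / (1.0 + speed_error(kp, ki))   # smaller error, higher fitness

candidate = random_chromosome()
print("chromosome:", candidate, "fitness:", fitness(candidate))

A GA such as the sketch earlier in this section would then evolve such chromosomes toward higher fitness, that is, toward gains that minimize the total speed error.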