=== Adaptive GAs ===
Genetic algorithms with adaptive parameters (adaptive genetic algorithms, AGAs) are another significant and promising variant of genetic algorithms. The probabilities of crossover (''pc'') and mutation (''pm'') largely determine the degree of solution accuracy and the convergence speed that genetic algorithms can obtain. The convergence of genetic algorithms has also been analyzed mathematically.<ref>{{Cite journal |last1=Stannat |first1=W. |title=On the convergence of genetic algorithms – a variational approach |journal=Probab. Theory Relat. Fields |volume=129 |pages=113–132 |year=2004 |doi=10.1007/s00440-003-0330-y |s2cid=121086772 |doi-access=free }}</ref><ref>{{Cite journal |last1=Sharapov |first1=R.R. |last2=Lapshin |first2=A.V. |title=Convergence of genetic algorithms |journal=Pattern Recognit. Image Anal. |volume=16 |issue=3 |pages=392–397 |year=2006 |doi=10.1134/S1054661806030084 |s2cid=22890010 }}</ref> Instead of using fixed values of ''pc'' and ''pm'', AGAs utilize the population information in each generation and adaptively adjust ''pc'' and ''pm'' in order to maintain population diversity as well as to sustain the convergence capacity. In the original AGA,<ref>{{Cite journal |last1=Srinivas |first1=M. |last2=Patnaik |first2=L. |title=Adaptive probabilities of crossover and mutation in genetic algorithms |journal=IEEE Transactions on Systems, Man, and Cybernetics |volume=24 |issue=4 |pages=656–667 |year=1994 |doi=10.1109/21.286385 |url=http://eprints.iisc.ac.in/6971/2/adaptive.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://eprints.iisc.ac.in/6971/2/adaptive.pdf |archive-date=2022-10-09 |url-status=live }}</ref> the adjustment of ''pc'' and ''pm'' depends on the fitness values of the solutions. Several further AGA variants exist. The successive zooming method is an early example of improving convergence.<ref>{{Cite journal |last1=Kwon |first1=Y.D. |last2=Kwon |first2=S.B. |last3=Jin |first3=S.B. |last4=Kim |first4=J.Y. |title=Convergence enhanced genetic algorithm with successive zooming method for solving continuous optimization problems |journal=Computers & Structures |volume=81 |issue=17 |pages=1715–1725 |year=2003 |doi=10.1016/S0045-7949(03)00183-4}}</ref> In ''CAGA'' (clustering-based adaptive genetic algorithm),<ref>{{cite journal |last1=Zhang |first1=J. |last2=Chung |first2=H. |last3=Lo |first3=W. L. |title=Clustering-Based Adaptive Crossover and Mutation Probabilities for Genetic Algorithms |journal=IEEE Transactions on Evolutionary Computation |volume=11 |issue=3 |pages=326–335 |year=2007 |doi=10.1109/TEVC.2006.880727 |s2cid=2625150 }}</ref> clustering analysis is used to judge the optimization state of the population, and the adjustment of ''pc'' and ''pm'' depends on that state. More recent approaches use more abstract variables to decide ''pc'' and ''pm''. Examples are dominance and co-dominance principles<ref>{{Cite journal |last1=Pavai |first1=G. |last2=Geetha |first2=T.V. |title=New crossover operators using dominance and co-dominance principles for faster convergence of genetic algorithms |journal=Soft Comput |volume=23 |issue=11 |pages=3661–3686 |year=2019 |doi=10.1007/s00500-018-3016-1 |s2cid=254028984 }}</ref> and LIGA (levelized interpolative genetic algorithm), which combines a flexible GA with modified A* search to tackle search-space anisotropy.<ref>{{Cite journal |last1=Li |first1=J.C.F. |last2=Zimmerle |first2=D. |last3=Young |first3=P. |title=Flexible networked rural electrification using levelized interpolative genetic algorithm |journal=Energy & AI |volume=10 |pages=100186 |year=2022 |doi=10.1016/j.egyai.2022.100186 |s2cid=250972466 |doi-access=free |bibcode=2022EneAI..1000186L }}</ref>

It can be quite effective to combine a GA with other optimization methods. A GA tends to be quite good at finding generally good global solutions, but quite inefficient at finding the last few mutations needed to reach the absolute optimum.
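The fitness-dependent adjustment of ''pc'' and ''pm'' described above can be sketched as follows. This is a minimal illustration in the spirit of Srinivas and Patnaik's scheme; the function name and the default constants ''k1''–''k4'' are illustrative choices, not values prescribed by the cited paper.

```python
# Sketch of fitness-based adaptive crossover/mutation probabilities
# (maximization). The constants k1..k4 are illustrative defaults.

def adaptive_probabilities(f, f_max, f_avg, k1=1.0, k2=0.5, k3=1.0, k4=0.5):
    """Return (pc, pm) for an individual with fitness f.

    Above-average individuals receive reduced pc and pm, so good
    solutions are preserved; below-average individuals receive the
    full rates, so poor solutions are disrupted and replaced.
    """
    if f_max == f_avg:                  # population has converged
        return k3, k4
    if f >= f_avg:
        scale = (f_max - f) / (f_max - f_avg)
        return k1 * scale, k2 * scale
    return k3, k4

# The best individual is left untouched; a below-average one is not.
print(adaptive_probabilities(10.0, f_max=10.0, f_avg=5.0))  # (0.0, 0.0)
print(adaptive_probabilities(3.0, f_max=10.0, f_avg=5.0))   # (1.0, 0.5)
```

Because the rates scale with the gap between an individual's fitness and the population maximum, diversity is maintained in the bulk of the population while the current best solutions are protected from disruption.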
Other techniques (such as [[Hill climbing|simple hill climbing]]) are quite efficient at finding the absolute optimum in a limited region. Alternating GA and hill climbing can improve the efficiency of the GA{{Citation needed|date=July 2016}} while overcoming the lack of robustness of hill climbing.

This means that the rules of genetic variation may have a different meaning in the natural case. For instance – provided that steps are stored in consecutive order – crossing over may sum a number of steps from maternal DNA and a number of steps from paternal DNA, and so on. This is like adding vectors that are more likely to follow a ridge in the phenotypic landscape. Thus, the efficiency of the process may be increased by many orders of magnitude. Moreover, the [[Chromosomal inversion|inversion operator]] has the opportunity to place steps in consecutive order, or any other suitable order, in favour of survival or efficiency.<ref>See for instance [http://www.thisurlisfalse.com/evolution-in-a-nutshell/ Evolution-in-a-nutshell] {{Webarchive|url=https://web.archive.org/web/20160415193505/http://www.thisurlisfalse.com/evolution-in-a-nutshell/ |date=15 April 2016 }} or the example in the [[travelling salesman problem]], in particular the use of an [[edge recombination operator]].</ref>

A variation, where the population as a whole is evolved rather than its individual members, is known as gene pool recombination.

A number of variations have been developed to attempt to improve the performance of GAs on problems with a high degree of fitness epistasis, i.e. where the fitness of a solution consists of interacting subsets of its variables. Such algorithms aim to learn (before exploiting) these beneficial phenotypic interactions. As such, they are aligned with the Building Block Hypothesis in adaptively reducing disruptive recombination. Prominent examples of this approach include the mGA,<ref>{{cite journal |last1=Goldberg |first1=D. E. |last2=Korb |first2=B. |last3=Deb |first3=K. |title=Messy Genetic Algorithms: Motivation, Analysis, and First Results |journal=Complex Systems |volume=3 |issue=5 |pages=493–530 |year=1989 |url=http://www.complex-systems.com/issues/03-5.html }}</ref> GEMGA<ref>[https://www.osti.gov/servlets/purl/524858 Gene expression: The missing link in evolutionary computation]</ref> and LLGA.<ref>{{cite thesis |last=Harik |first=G. |date=1997 |title=Learning linkage to efficiently solve problems of bounded difficulty using genetic algorithms |type=PhD |publisher=Dept. Computer Science, University of Michigan, Ann Arbor |url=http://portal.acm.org/citation.cfm?id=269517 }}</ref>
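The alternation of a GA with hill climbing discussed above can be sketched on a toy problem: maximizing the number of 1-bits in a 32-bit string. This is an illustrative hybrid, not any particular published algorithm; all function names and parameter values are assumptions chosen for the example.

```python
# Sketch of a GA alternated with greedy hill climbing (a basic memetic
# scheme). The GA provides global exploration; hill climbing efficiently
# finishes the search in the region around each individual.

import random

N_BITS = 32

def fitness(bits):
    return sum(bits)                    # count of 1-bits; optimum is N_BITS

def hill_climb(bits, steps=64):
    """Single-bit-flip hill climbing: accept a flip if it does not hurt."""
    best = bits[:]
    for _ in range(steps):
        trial = best[:]
        trial[random.randrange(N_BITS)] ^= 1
        if fitness(trial) >= fitness(best):
            best = trial
    return best

def ga_step(population, pc=0.9, pm=0.02):
    """One GA generation: tournament selection, crossover, mutation."""
    def select():
        a, b = random.sample(population, 2)
        return max(a, b, key=fitness)[:]
    next_pop = []
    while len(next_pop) < len(population):
        p1, p2 = select(), select()
        if random.random() < pc:        # one-point crossover
            cut = random.randrange(1, N_BITS)
            p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
        for child in (p1, p2):
            mutated = [b ^ 1 if random.random() < pm else b for b in child]
            next_pop.append(mutated)
    return next_pop[:len(population)]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(20)]
for _ in range(15):
    pop = ga_step(pop)                      # global exploration
    pop = [hill_climb(ind) for ind in pop]  # local refinement
print(max(fitness(ind) for ind in pop))
```

On this separable toy landscape the hybrid reaches the optimum almost immediately; the division of labour matters more on rugged landscapes, where the GA alone stalls near the optimum and hill climbing alone gets trapped in local optima.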