{{Short description|Optimization technique in mathematics}}

'''Random optimization (RO)''' is a family of numerical [[Optimization (mathematics)|optimization]] methods [[Derivative-free optimization|that do not require the gradient]] of the optimization problem, and RO can hence be used on functions that are not [[Continuous function|continuous]] or [[Differentiable function|differentiable]]. Such optimization methods are also known as direct-search, derivative-free, or black-box methods.

The name random optimization is attributed to Matyas,<ref name=matyas65random/> who made an early presentation of RO along with basic mathematical analysis. RO works by iteratively moving to better positions in the search-space, which are sampled using e.g. a [[normal distribution]] surrounding the current position.

== Algorithm ==
{{See also|Simulated annealing}}

Let <math>f: \mathbb{R}^n \to \mathbb{R}</math> be the fitness or cost function to be minimized, and let <math>x \in \mathbb{R}^n</math> designate a position or candidate solution in the search-space. The basic RO algorithm can then be described as:

* Initialize '''x''' with a random position in the search-space.
* Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
** Sample a new position '''y''' by adding a [[normal distribution|normally distributed]] random vector to the current position '''x'''.
** If ''f''('''y''') < ''f''('''x'''), move to the new position by setting '''x''' = '''y'''.
* Now '''x''' holds the best-found position.

This algorithm corresponds to a (1+1) [[evolution strategy]] with constant step-size.
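A minimal Python sketch of this (1+1) scheme follows. The quadratic test function <code>sphere</code>, the step size <code>sigma</code>, and the iteration budget are illustrative assumptions and not part of Matyas' original formulation.

<syntaxhighlight lang="python">
import numpy as np

def random_optimize(f, x0, sigma=0.1, iterations=10_000, rng=None):
    """Basic (1+1) random optimization: sample a normally distributed
    step around the current position x and keep it only if it improves f."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iterations):
        # Candidate y = x + N(0, sigma^2 I): a normally distributed
        # random vector added to the current position.
        y = x + sigma * rng.standard_normal(x.shape)
        fy = f(y)
        if fy < fx:  # move only on strict improvement
            x, fx = y, fy
    return x, fx

# Illustrative use on the sphere function f(x) = sum(x_i^2),
# whose minimum value 0 is attained at the origin.
if __name__ == "__main__":
    sphere = lambda x: float(np.dot(x, x))
    best_x, best_f = random_optimize(sphere, x0=np.full(5, 3.0))
    print(best_x, best_f)
</syntaxhighlight>

Because the step size is held constant throughout the run, this sketch mirrors the (1+1) evolution strategy correspondence noted above; adaptive variants instead adjust the step size as the search progresses.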
== Convergence and variants ==

Matyas showed that the basic form of RO converges to the optimum of a simple [[unimodal function]] by using a [[Limit (mathematics)|limit-proof]], which shows that convergence to the optimum is certain to occur if a potentially infinite number of iterations are performed; that is, for any <math>\epsilon > 0</math>, the probability that the best-found position lies within distance <math>\epsilon</math> of the optimum tends to one as the number of iterations tends to infinity. However, this proof is not useful in practice because only a finite number of iterations can be executed. In fact, such a theoretical limit-proof also shows that purely random sampling of the search-space will inevitably yield a sample arbitrarily close to the optimum.

Mathematical analyses were also conducted by Baba<ref name=baba81convergence/> and Solis and Wets<ref name=solis81random/> to establish that convergence to a region surrounding the optimum is inevitable under some mild conditions for RO variants using other [[probability distribution]]s for the sampling. An estimate of the number of iterations required to approach the optimum was derived by Dorea.<ref name=dorea83expected/> These analyses were criticized through empirical experiments by Sarma,<ref name=sarma90convergence/> who used the optimizer variants of Baba and Dorea on two real-world problems, showing that the optimum is approached very slowly and, moreover, that the methods were actually unable to locate a solution of adequate fitness unless the process was started sufficiently close to the optimum to begin with.

== See also ==

* [[Random search]] is a closely related family of optimization methods which sample from a [[hypersphere]] instead of a normal distribution.
* [[Luus–Jaakola]] is a closely related optimization method using a [[Uniform distribution (continuous)|uniform distribution]] in its sampling and a simple formula for exponentially decreasing the sampling range.
* [[Pattern search (optimization)|Pattern search]] takes steps along the axes of the search-space using exponentially decreasing step sizes.
* [[Stochastic optimization]]

== References ==
{{reflist|refs=
<ref name=matyas65random>{{cite journal |last=Matyas |first=J. |title=Random optimization |journal=Automation and Remote Control |year=1965 |volume=26 |number=2 |pages=246–253 |url=http://www.mathnet.ru/eng/at11288 }}</ref>
<ref name=baba81convergence>{{cite journal |last=Baba |first=N. |title=Convergence of a random optimization method for constrained optimization problems |journal=Journal of Optimization Theory and Applications |year=1981 |volume=33 |number=4 |pages=451–461 |doi=10.1007/bf00935752 }}</ref>
<ref name=solis81random>{{cite journal |last1=Solis |first1=Francisco J. |last2=Wets |first2=Roger J.-B. |authorlink2=Roger J-B Wets |title=Minimization by random search techniques |journal=[[Mathematics of Operations Research]] |year=1981 |volume=6 |number=1 |pages=19–30 |doi=10.1287/moor.6.1.19}}</ref>
<ref name=dorea83expected>{{cite journal |last1=Dorea |first1=C.C.Y. |title=Expected number of steps of a random optimization method |journal=Journal of Optimization Theory and Applications |year=1983 |volume=39 |number=3 |pages=165–171 |doi=10.1007/bf00934526 }}</ref>
<ref name=sarma90convergence>{{cite journal |last1=Sarma |first1=M.S. |title=On the convergence of the Baba and Dorea random optimization methods |journal=Journal of Optimization Theory and Applications |year=1990 |volume=66 |number=2 |pages=337–343 |doi=10.1007/bf00939542 }}</ref>
}}

{{Major subfields of optimization}}

{{DEFAULTSORT:Random Optimization}}
[[Category:Optimization algorithms and methods]]