=== Search and optimization ===

AI can solve many problems by intelligently searching through many possible solutions.<ref>[[Search algorithm]]s: {{Harvtxt|Russell|Norvig|2021|loc=chpts. 3–5}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=113–163}}, {{Harvtxt|Luger|Stubblefield|2004|pp=79–164, 193–219}}, {{Harvtxt|Nilsson|1998|loc=chpts. 7–12}}</ref> There are two very different kinds of search used in AI: [[state space search]] and [[Local search (optimization)|local search]].

==== State space search ====

[[State space search]] searches through a tree of possible states to try to find a goal state.<ref>[[State space search]]: {{Harvtxt|Russell|Norvig|2021|loc=chpt. 3}}</ref> For example, [[Automated planning and scheduling|planning]] algorithms search through trees of goals and subgoals, attempting to find a path to a target goal, a process called [[means-ends analysis]].{{Sfnp|Russell|Norvig|2021|loc=sect. 11.2}}

[[Brute force search|Simple exhaustive searches]]<ref>[[Uninformed search]]es ([[breadth first search]], [[depth-first search]] and general [[state space search]]): {{Harvtxt|Russell|Norvig|2021|loc=sect. 3.4}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=113–132}}, {{Harvtxt|Luger|Stubblefield|2004|pp=79–121}}, {{Harvtxt|Nilsson|1998|loc=chpt. 8}}</ref> are rarely sufficient for most real-world problems: the [[Search algorithm|search space]] (the number of places to search) quickly grows to [[Astronomically large|astronomical numbers]]. The result is a search that is [[Computation time|too slow]] or never completes.<ref name="Intractability and efficiency and the combinatorial explosion"/> "[[Heuristics]]" or "rules of thumb" can help prioritize choices that are more likely to reach a goal.<ref>[[Heuristic]] or informed searches (e.g., greedy [[Best-first search|best first]] and [[A* search algorithm|A*]]): {{Harvtxt|Russell|Norvig|2021|loc=sect. 3.5}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=132–147}}, {{Harvtxt|Poole|Mackworth|2017|loc=sect. 3.6}}, {{Harvtxt|Luger|Stubblefield|2004|pp=133–150}}</ref>

[[Adversarial search]] is used for [[game AI|game-playing]] programs, such as chess or Go. It searches through a [[Game tree|tree]] of possible moves and countermoves, looking for a winning position.<ref>[[Adversarial search]]: {{Harvtxt|Russell|Norvig|2021|loc=chpt. 5}}</ref>

==== Local search ====

[[File:Gradient descent.gif|class=skin-invert-image|thumb|Illustration of [[gradient descent]] for 3 different starting points; two parameters (represented by the plane coordinates) are adjusted in order to minimize the [[loss function]] (the height)]]

[[Local search (optimization)|Local search]] uses [[mathematical optimization]] to find a solution to a problem. It begins with some form of guess and refines it incrementally.<ref>[[Local search (optimization)|Local]] or "[[optimization]]" search: {{Harvtxt|Russell|Norvig|2021|loc=chpt. 4}}</ref>

[[Gradient descent]] is a type of local search that optimizes a set of numerical parameters by incrementally adjusting them to minimize a [[loss function]]. Variants of gradient descent are commonly used to train [[Artificial neural network|neural networks]]<ref>{{Cite web |last=Singh Chauhan |first=Nagesh |date=December 18, 2020 |title=Optimization Algorithms in Neural Networks |url=https://www.kdnuggets.com/optimization-algorithms-in-neural-networks |access-date=2024-01-13 |website=KDnuggets}}</ref> via the [[backpropagation]] algorithm.
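The following minimal Python sketch illustrates the idea of gradient descent on a simple two-parameter quadratic loss. It is written purely for illustration and is not drawn from any particular system; the loss function, learning rate, step count, and starting point are all illustrative assumptions.

<syntaxhighlight lang="python">
# Minimal illustration of gradient descent on a bowl-shaped loss.
# The loss, gradient, learning rate and starting point are illustrative
# assumptions, not taken from any particular library or system.

def loss(x, y):
    # Quadratic loss with its minimum at (3, -2).
    return (x - 3) ** 2 + (y + 2) ** 2

def gradient(x, y):
    # Partial derivatives of the loss with respect to each parameter.
    return 2 * (x - 3), 2 * (y + 2)

def gradient_descent(x, y, learning_rate=0.1, steps=100):
    # Repeatedly nudge each parameter a small step against its gradient,
    # so that the loss decreases at every iteration.
    for _ in range(steps):
        dx, dy = gradient(x, y)
        x -= learning_rate * dx
        y -= learning_rate * dy
    return x, y

print(gradient_descent(0.0, 0.0))  # converges towards (3, -2)
</syntaxhighlight>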
Another type of local search is [[evolutionary computation]], which aims to iteratively improve a set of candidate solutions by "mutating" and "recombining" them, [[Artificial selection|selecting]] only the fittest to survive each generation.<ref>[[Evolutionary computation]]: {{Harvtxt|Russell|Norvig|2021|loc=sect. 4.1.2}}</ref> Distributed search processes can coordinate via [[swarm intelligence]] algorithms. Two popular swarm algorithms used in search are [[particle swarm optimization]] (inspired by bird [[flocking]]) and [[ant colony optimization]] (inspired by [[ant trail]]s).{{Sfnp|Merkle|Middendorf|2013}}
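A minimal sketch of the evolutionary approach described above, again purely illustrative: the fitness function, population size, mutation scale, and number of generations are assumptions chosen only to demonstrate mutation, recombination, and selection of the fittest.

<syntaxhighlight lang="python">
import random

# Minimal sketch of evolutionary computation: candidate solutions are
# recombined and mutated, and only the fittest survive each generation.
# The fitness function and all numeric settings are illustrative only.

def fitness(x):
    # Toy objective: maximised at x = 5.
    return -(x - 5) ** 2

def evolve(population_size=20, generations=50, mutation_scale=0.5):
    # Start from a random population of candidate solutions.
    population = [random.uniform(-10, 10) for _ in range(population_size)]
    for _ in range(generations):
        # Selection: keep the fittest half of the population.
        population.sort(key=fitness, reverse=True)
        survivors = population[: population_size // 2]
        # Recombination and mutation: each child averages two parents,
        # then receives a small random perturbation.
        children = []
        while len(survivors) + len(children) < population_size:
            a, b = random.sample(survivors, 2)
            children.append((a + b) / 2 + random.gauss(0, mutation_scale))
        population = survivors + children
    return max(population, key=fitness)

print(evolve())  # converges towards 5
</syntaxhighlight>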