=== Interior point ===
In contrast to the simplex algorithm, which finds an optimal solution by traversing the edges between vertices on a polyhedral set, interior-point methods move through the interior of the feasible region.

==== Ellipsoid algorithm, following Khachiyan ====
This is the first [[worst-case complexity|worst-case]] [[polynomial-time]] algorithm ever found for linear programming. To solve a problem which has ''n'' variables and can be encoded in ''L'' input bits, this algorithm runs in <math> O(n^6 L) </math> time.<ref name = "khachiyan79" /> [[Leonid Khachiyan]] solved this long-standing complexity issue in 1979 with the introduction of the [[ellipsoid method]]. The convergence analysis has (real-number) predecessors, notably the [[iterative method]]s developed by [[Naum Z. Shor]] and the [[approximation algorithm]]s by Arkadi Nemirovski and D. Yudin.

==== Projective algorithm of Karmarkar ====
{{main|Karmarkar's algorithm}}
Khachiyan's algorithm was of landmark importance for establishing the polynomial-time solvability of linear programs. The algorithm was not a computational breakthrough, as the simplex method is more efficient for all but specially constructed families of linear programs. However, Khachiyan's algorithm inspired new lines of research in linear programming. In 1984, [[Narendra Karmarkar|N. Karmarkar]] proposed a<!-- n interior-point --> [[projective method]] for linear programming. Karmarkar's algorithm<ref name = "karmarkar84" /> improved on Khachiyan's<ref name = "khachiyan79" /> worst-case polynomial bound (giving <math>O(n^{3.5}L)</math>).
Karmarkar claimed that his algorithm was much faster in practical LP than the simplex method, a claim that created great interest in interior-point methods.<ref name="Strang">{{cite journal|last=Strang|first=Gilbert|author-link=Gilbert Strang|title=Karmarkar's algorithm and its place in applied mathematics|journal=[[The Mathematical Intelligencer]]|date=1 June 1987|issn=0343-6993|pages=4–10|volume=9|doi=10.1007/BF03025891|mr=883185|issue=2|s2cid=123541868}}</ref> Since Karmarkar's discovery, many interior-point methods have been proposed and analyzed.

==== Vaidya's 87 algorithm ====
In 1987, Vaidya proposed an algorithm that runs in <math> O(n^3) </math> time.<ref>{{cite conference|title= An algorithm for linear programming which requires <math>{O} (((m+ n) n^2+(m+ n)^{1.5} n) L)</math> arithmetic operations | conference = 28th Annual IEEE Symposium on Foundations of Computer Science | series = FOCS |last1=Vaidya|first1=Pravin M. |year=1987 }}</ref>

==== Vaidya's 89 algorithm ====
In 1989, Vaidya developed an algorithm that runs in <math>O(n^{2.5})</math> time.<ref>{{cite conference|chapter= Speeding-up linear programming using fast matrix multiplication | conference = 30th Annual Symposium on Foundations of Computer Science| series = FOCS |last1=Vaidya|first1=Pravin M. | title = 30th Annual Symposium on Foundations of Computer Science|year=1989| pages = 332–337| doi = 10.1109/SFCS.1989.63499 | isbn = 0-8186-1982-1}}</ref> Formally speaking, the algorithm takes <math>O( (n+d)^{1.5} n L)</math> arithmetic operations in the worst case, where <math>d</math> is the number of constraints, <math> n </math> is the number of variables, and <math>L</math> is the number of bits.
==== Input sparsity time algorithms ====
In 2015, Lee and Sidford showed that linear programming can be solved in <math>\tilde O((nnz(A) + d^2)\sqrt{d}L)</math> time,<ref>{{cite conference|title= Efficient inverse maintenance and faster algorithms for linear programming | conference = FOCS '15 Foundations of Computer Science |last1=Lee|first1=Yin-Tat|last2=Sidford|first2=Aaron |year=2015| arxiv = 1503.01752 }}</ref> where <math>\tilde O</math> denotes the [[soft O notation]] and <math>nnz(A)</math> represents the number of non-zero elements; the bound remains <math>O(n^{2.5}L)</math> in the worst case.

==== Current matrix multiplication time algorithm ====
In 2019, Cohen, Lee and Song improved the running time to <math>\tilde O( ( n^{\omega} + n^{2.5-\alpha/2} + n^{2+1/6} ) L)</math>, where <math> \omega </math> is the exponent of [[matrix multiplication]] and <math> \alpha </math> is the dual exponent of [[matrix multiplication]].<ref>{{cite conference|title= Solving Linear Programs in the Current Matrix Multiplication Time | conference = 51st Annual ACM Symposium on the Theory of Computing |last1=Cohen|first1=Michael B.|last2=Lee|first2=Yin-Tat|last3=Song|first3=Zhao |year=2018| arxiv = 1810.07896 | series = STOC'19 }}</ref> <math> \alpha </math> is (roughly) defined to be the largest number such that one can multiply an <math> n \times n </math> matrix by an <math> n \times n^\alpha </math> matrix in <math> O(n^2) </math> time. In a follow-up work, Lee, Song and Zhang reproduced the same result via a different method.<ref>{{cite conference|title= Solving Empirical Risk Minimization in the Current Matrix Multiplication Time | conference = Conference on Learning Theory |last1=Lee|first1=Yin-Tat|last2=Song|first2=Zhao |last3=Zhang|first3=Qiuyi|year=2019| arxiv = 1905.04447 | series = COLT'19 }}</ref> Both algorithms run in <math>\tilde O( n^{2+1/6} L ) </math> time when <math> \omega = 2 </math> and <math> \alpha = 1 </math>.
A result due to Jiang, Song, Weinstein and Zhang improved the bound from <math> \tilde O ( n^{2+1/6} L) </math> to <math> \tilde O ( n^{2+1/18} L) </math>.<ref>{{cite conference|title= Faster Dynamic Matrix Inverse for Faster LPs |last1=Jiang|first1=Shunhua|last2=Song|first2=Zhao |last3=Weinstein|first3=Omri|last4=Zhang|first4=Hengjie|year=2020| arxiv = 2004.07470 }}</ref>
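The methods surveyed above differ in their complexity bounds, but they share the basic interior-point idea: follow a path through the strict interior of the feasible region rather than along its edges. The following is a deliberately simplified log-barrier sketch of that idea, not Karmarkar's projective method or any of the algorithms cited above; the function name <code>barrier_lp</code> and all parameter choices are invented for this illustration.

```python
import numpy as np

def barrier_lp(c, A, b, x0, t=1.0, mu=10.0, tol=1e-8):
    """Log-barrier sketch for: minimize c @ x subject to A @ x <= b.

    x0 must be strictly feasible (A @ x0 < b). Illustrative only.
    """
    x = np.asarray(x0, dtype=float)
    m = len(b)

    def f(x, t):
        s = b - A @ x                  # constraint slacks
        if np.any(s <= 0):
            return np.inf              # outside the interior
        return t * (c @ x) - np.sum(np.log(s))

    while m / t > tol:                 # m/t bounds the duality gap on the central path
        for _ in range(100):           # damped Newton on the barrier objective
            s = b - A @ x
            grad = t * c + A.T @ (1.0 / s)
            hess = (A.T * (1.0 / s**2)) @ A      # A^T diag(1/s^2) A
            dx = np.linalg.solve(hess, -grad)
            if -(grad @ dx) / 2 < 1e-10:         # squared Newton decrement: centered
                break
            step, fx = 1.0, f(x, t)
            # Backtrack: stay strictly feasible and achieve sufficient decrease.
            while step > 1e-12 and f(x + step * dx, t) > fx + 1e-4 * step * (grad @ dx):
                step *= 0.5
            x = x + step * dx
        t *= mu                        # tighten the barrier; the iterates trace the central path
    return x

# Tiny example: maximize x + y on the unit square, written as a minimization.
c = np.array([-1.0, -1.0])
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 1.0, 0.0, 0.0])
x_opt = barrier_lp(c, A, b, np.array([0.5, 0.5]))   # approaches the vertex (1, 1)
```

Unlike the simplex method, every iterate here is strictly inside the polytope; the solver only approaches the optimal vertex in the limit as the barrier parameter <math>t</math> grows.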