feasible region of possible values for those variables. In the two-variable case this region is in the shape of a convex
simple polygon.
== Basis exchange algorithms ==

=== Simplex algorithm of Dantzig ===
The simplex algorithm, developed by George Dantzig in 1947, solves LP problems by constructing a feasible solution at a vertex of the polytope and then walking along a path on the edges of the polytope to vertices with non-decreasing values of the objective function until an optimum is reached. In many practical problems, "stalling" occurs: many pivots are made with no increase in the objective function. In rare practical problems, the usual versions of the simplex algorithm may actually "cycle". In practice, the simplex algorithm is quite efficient, solving "random" problems in roughly a cubic number of steps, which is similar to its behavior on practical problems. However, the simplex algorithm has poor worst-case behavior: Klee and Minty constructed a family of linear programming problems for which the simplex method takes a number of steps exponential in the problem size. In fact, for some time it was not known whether the linear programming problem was solvable in polynomial time, i.e. of complexity class P.
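The vertex-to-vertex walk above can be sketched as a dense tableau method. The following is a minimal illustration, not a production solver: it assumes the LP is given as maximize c·x subject to Ax ≤ b, x ≥ 0 with b ≥ 0 (so the all-slack basis is an initial feasible vertex), and it uses a Bland-style smallest-index rule for the entering variable, one of the standard safeguards against the cycling mentioned above.

```python
def simplex(c, A, b):
    """Maximize c.x subject to A x <= b, x >= 0, assuming b >= 0
    so the all-slack basis is feasible. Dense tableau sketch."""
    m, n = len(A), len(c)
    # Tableau [A | I | b] plus the objective row [-c | 0 | 0].
    T = [row[:] + [1.0 if i == j else 0.0 for j in range(m)] + [b[i]]
         for i, row in enumerate(A)]
    T.append([-cj for cj in c] + [0.0] * m + [0.0])
    basis = list(range(n, n + m))          # slack variables start basic
    while True:
        # Entering variable: smallest index with negative reduced cost
        # (Bland-style rule, used to guard against cycling).
        enter = next((j for j in range(n + m) if T[-1][j] < -1e-9), None)
        if enter is None:                  # no improving direction: optimal
            break
        # Ratio test for the leaving row (ties broken by smallest row index).
        ratios = [(T[i][-1] / T[i][enter], i)
                  for i in range(m) if T[i][enter] > 1e-9]
        if not ratios:
            raise ValueError("unbounded LP")
        _, leave = min(ratios)
        basis[leave] = enter
        piv = T[leave][enter]
        T[leave] = [v / piv for v in T[leave]]
        for i in range(m + 1):             # eliminate the pivot column
            if i != leave and abs(T[i][enter]) > 1e-12:
                f = T[i][enter]
                T[i] = [v - f * w for v, w in zip(T[i], T[leave])]
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return x, T[-1][-1]                    # optimizer and optimal value
```

For instance, maximizing 3x₁ + 5x₂ subject to x₁ ≤ 4, 2x₂ ≤ 12, 3x₁ + 2x₂ ≤ 18 yields x = (2, 6) with objective value 36 after three pivots.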
=== Criss-cross algorithm ===
Like the simplex algorithm of Dantzig, the criss-cross algorithm is a basis-exchange algorithm that pivots between bases. However, the criss-cross algorithm need not maintain feasibility: it can pivot from a feasible basis to an infeasible basis. The criss-cross algorithm does not have polynomial time-complexity for linear programming. Both algorithms visit all 2^D corners of a (perturbed) cube in dimension D, the Klee–Minty cube, in the worst case.
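The Klee–Minty construction is easy to generate explicitly. The sketch below builds one common formulation of the cube (formulations vary across the literature); for this variant, the simplex method with Dantzig's classic largest-coefficient pivot rule visits all 2^D vertices:

```python
def klee_minty(D):
    """Build a common formulation of the Klee-Minty LP in D variables:
        maximize   sum_j 2^(D-j) * x_j                 (j = 1..D)
        subject to 2 * sum_{j<i} 2^(i-j) x_j + x_i <= 5^i,  x >= 0.
    The feasible region is a perturbed D-cube; with Dantzig's classic
    largest-coefficient pivot rule, simplex visits all 2^D vertices.
    Returns (c, A, b) for maximize c.x s.t. A x <= b, x >= 0."""
    c = [2 ** (D - j) for j in range(1, D + 1)]
    A = [[2 ** (i - j + 1) if j < i else (1 if j == i else 0)
          for j in range(1, D + 1)]
         for i in range(1, D + 1)]
    b = [5 ** i for i in range(1, D + 1)]
    return c, A, b
```

For D = 3 this gives constraints x₁ ≤ 5, 4x₁ + x₂ ≤ 25, 8x₁ + 4x₂ + x₃ ≤ 125; the optimum is the vertex (0, 0, 5³), reached only after an exponentially long vertex walk under the bad pivot rule.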
== Interior point ==
In contrast to the simplex algorithm, which finds an optimal solution by traversing the edges between vertices on a polyhedral set, interior-point methods move through the interior of the feasible region.
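The core idea common to path-following interior-point methods can be illustrated with a logarithmic barrier: replace the constraints Ax ≤ b with a penalty −Σ log(bᵢ − aᵢ·x) that blows up at the boundary, minimize t·c·x plus the barrier by Newton's method, and repeatedly increase t so the iterates trace a path through the interior toward the optimum. The following is an illustrative sketch under those assumptions, not a production solver (real implementations use primal-dual formulations and careful step control):

```python
def solve(M, v):
    """Solve M y = v by Gaussian elimination with partial pivoting."""
    n = len(v)
    M = [row[:] + [v[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for j in range(k, n + 1):
                M[r][j] -= f * M[k][j]
    y = [0.0] * n
    for k in range(n - 1, -1, -1):
        y[k] = (M[k][n] - sum(M[k][j] * y[j]
                              for j in range(k + 1, n))) / M[k][k]
    return y

def barrier_lp(c, A, b, x, t=1.0, mu=10.0, stages=9, newton_iters=50):
    """Minimize c.x subject to A x <= b from a strictly feasible start x,
    via a log-barrier path-following sketch: at each stage take damped
    Newton steps on  t*c.x - sum_i log(b_i - a_i.x),  then increase t."""
    m, n = len(A), len(c)
    for _ in range(stages):
        for _ in range(newton_iters):
            # Slacks, gradient and Hessian of the barrier objective.
            s = [b[i] - sum(A[i][j] * x[j] for j in range(n))
                 for i in range(m)]
            g = [t * c[j] + sum(A[i][j] / s[i] for i in range(m))
                 for j in range(n)]
            H = [[sum(A[i][j] * A[i][k] / s[i] ** 2 for i in range(m))
                  for k in range(n)] for j in range(n)]
            dx = [-d for d in solve(H, g)]   # Newton direction
            # Damp the step so the iterate stays strictly interior.
            alpha = 1.0
            for i in range(m):
                Adx = sum(A[i][j] * dx[j] for j in range(n))
                if Adx > 0:
                    alpha = min(alpha, 0.99 * s[i] / Adx)
            x = [x[j] + alpha * dx[j] for j in range(n)]
        t *= mu
    return x
```

On a toy box problem such as maximizing x₁ + x₂ over [0, 1]², the iterates approach the corner (1, 1) from the interior rather than walking along edges, which is exactly the contrast with simplex described above.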
=== Ellipsoid algorithm, following Khachiyan ===
This is the first worst-case polynomial-time algorithm ever found for linear programming. To solve a problem which has n variables and can be encoded in L input bits, this algorithm runs in O(n^6 L) time.

=== Projective algorithm of Karmarkar ===
In 1984, N. Karmarkar proposed a projective interior-point method for linear programming, running in O(n^{3.5} L) time. Since Karmarkar's discovery, many interior-point methods have been proposed and analyzed.
=== Vaidya's 87 algorithm ===
In 1987, Vaidya proposed an algorithm that runs in O(n^3) time.
=== Vaidya's 89 algorithm ===
In 1989, Vaidya developed an algorithm that runs in O(n^{2.5}) time using fast matrix multiplication algorithms. Formally speaking, the algorithm takes O((n+d)^{1.5} n L) arithmetic operations in the worst case, where d is the number of constraints, n is the number of variables, and L is the number of bits.
=== Input sparsity time algorithms ===
In 2015, Lee and Sidford showed that linear programming can be solved in \tilde O((nnz(A) + d^2)\sqrt{d}L) time, where \tilde O denotes soft O notation and nnz(A) is the number of non-zero elements of A; this remains O(n^{2.5}L) in the worst case.
=== Current matrix multiplication time algorithm ===
In 2019, Cohen, Lee and Song improved the running time to \tilde O((n^{\omega} + n^{2.5-\alpha/2} + n^{2+1/6}) L), where \omega is the exponent of matrix multiplication and \alpha is the dual exponent of matrix multiplication. \alpha is (roughly) defined to be the largest number such that one can multiply an n \times n matrix by an n \times n^\alpha matrix in O(n^2) time. In a follow-up work, Lee, Song and Zhang reproduced the same result via a different method. Both algorithms run in \tilde O(n^{2+1/6} L) time when \omega = 2 and \alpha = 1. A result due to Jiang, Song, Weinstein and Zhang improved \tilde O(n^{2+1/6} L) to \tilde O(n^{2+1/18} L).
== Comparison of interior-point methods and simplex algorithms ==
The current opinion is that the efficiencies of good implementations of simplex-based methods and interior point methods are similar for routine applications of linear programming. However, for specific types of LP problems, one type of solver may be better than the other (sometimes much better), and the structures of the solutions generated by interior point methods and simplex-based methods are significantly different, with the support set of active variables typically being smaller for the latter.

== Open problems and recent work ==