The Held–Karp algorithm has exponential time complexity \Theta(2^n n^2), significantly better than the superexponential performance \Theta(n!) of a brute-force algorithm. Held–Karp, however, requires \Theta(n 2^n) space to hold all computed values of the function g(S, e), while brute force needs only \Theta(n^2) space to store the graph itself.
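To make the comparison concrete, the following short Python snippet (an illustration, not part of the algorithm) tabulates the two operation counts 2^n n^2 and n! for a few problem sizes; the factorial overtakes the exponential count before n = 10 and the gap widens rapidly thereafter.

```python
from math import factorial

# Compare the Held-Karp operation count, Theta(2^n * n^2), with the
# brute-force count, Theta(n!), for a few problem sizes n.
for n in (5, 10, 15, 20):
    held_karp_ops = 2**n * n**2
    brute_force_ops = factorial(n)
    print(f"n={n:2d}  2^n*n^2={held_karp_ops:>12,}  n!={brute_force_ops:>22,}")
```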
===Time===
Computing one value of g(S, e) for a k-element subset S of \{2, \ldots, n\} requires finding the shortest of k possible paths, each found by adding a known value of g and an edge length from the original graph; that is, it requires time proportional to k. There are \binom{n-1}{k} k-element subsets of \{2, \ldots, n\}, and each subset leaves n-k-1 possible values of e. Computing all values of g(S, e) with |S| = k thus requires time k (n-k-1) \binom{n-1}{k} = (n-1) (n-2) \binom{n-3}{k-1}, for a total time across all subset sizes of (n-1)(n-2) \sum_{k=1}^{n-2} \binom{n-3}{k-1} = (n-1) (n-2) 2^{n-3} = \Theta(n^2 2^n). The second stage of the algorithm, finding a complete cycle from n-1 candidates, takes \Theta(n) time and does not affect asymptotic performance. For
undirected graphs, the algorithm can be stopped after the k = \lfloor\frac{n-1}{2}\rfloor step, finding the minimum of g(S, e) + g(S^c\setminus\{1,e\}, e) over all \{(S,e): |S|= \lfloor\frac{n-1}{2}\rfloor, e \in S^c\setminus\{1\}\}, where S^c is the complement set of S. This is analogous to a bidirectional search starting at vertex 1 and meeting at the midpoint e. However, this is only a constant-factor improvement and does not affect asymptotic performance.
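The first stage described above can be sketched as follows. This is a minimal Python illustration (not the article's pseudocode): vertex 0 plays the role of vertex 1 in the text, g is built up layer by layer over subset sizes k, and the final step closes the cycle from the n-1 candidate paths.

```python
from itertools import combinations

def held_karp(dist):
    """Shortest Hamiltonian cycle length for a complete distance matrix.

    dist[i][j] is the length of edge (i, j).  g[(S, e)] is the length of
    the shortest path that starts at vertex 0, visits every vertex of the
    frozenset S, and ends at vertex e (with e not in S and e != 0) --
    the function g(S, e) of the text, with vertex 0 standing in for 1.
    """
    n = len(dist)
    g = {}
    # Base case: S empty, the path is the single edge 0 -> e.
    for e in range(1, n):
        g[(frozenset(), e)] = dist[0][e]
    # Build up by subset size k, as in the time analysis above.
    for k in range(1, n - 1):
        for S in combinations(range(1, n), k):
            S = frozenset(S)
            for e in range(1, n):
                if e in S:
                    continue
                # Shortest of k possible paths: try every last vertex m in S.
                g[(S, e)] = min(g[(S - {m}, m)] + dist[m][e] for m in S)
    # Second stage: close the cycle over the n-1 candidates.
    full = frozenset(range(1, n))
    return min(g[(full - {e}, e)] + dist[e][0] for e in range(1, n))
```

For clarity this sketch keys g by frozensets; an implementation aiming for the stated bounds would encode S as a bitmask, as the space analysis below assumes.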
===Space===
Storing all values of g(S, e) for subsets of size k requires keeping (n-k-1) \binom{n-1}{k} = (n-1) \binom{n-2}{k} values. A complete table of values of g(S, e) thus requires space \sum_{k=0}^{n-2} (n-1) \binom{n-2}{k} = (n-1) 2^{n-2} = \Theta(n 2^n). This assumes that n is small enough that S can be stored as a bitmask in a constant number of machine words, rather than as an explicit k-tuple. If only the length of the shortest cycle is needed, not the cycle itself, then
space complexity can be improved somewhat by noting that calculating g(S, e) for an S of size k requires only values of g for subsets of size k-1. Keeping only the (n-1) \left[ \binom{n-2}{k-1} + \binom{n-2}{k} \right] values of g(S, e) where S has size either k-1 or k reduces the algorithm's maximum space requirement, attained when k = \left\lfloor \frac{n}{2} \right\rfloor, to \Theta(\sqrt{n}\, 2^n).

==Pseudocode==
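As a preliminary, the length-only, two-layer scheme described in the space analysis above can be sketched in Python (a minimal illustration under the same conventions as before, with vertex 0 standing in for vertex 1): only the g-values for subsets of sizes k-1 and k are held at any time.

```python
from itertools import combinations

def held_karp_length(dist):
    """Shortest cycle length only, keeping two layers of g(S, e) at a time."""
    n = len(dist)
    # prev holds g(S, e) for all S of size k-1; start with k-1 = 0.
    prev = {(frozenset(), e): dist[0][e] for e in range(1, n)}
    for k in range(1, n - 1):
        cur = {}
        for S in combinations(range(1, n), k):
            S = frozenset(S)
            for e in range(1, n):
                if e in S:
                    continue
                cur[(S, e)] = min(prev[(S - {m}, m)] + dist[m][e] for m in S)
        prev = cur  # the size-(k-1) layer is no longer needed and is discarded
    # prev now holds g(S, e) for all S of size n-2; close the cycle.
    full = frozenset(range(1, n))
    return min(prev[(full - {e}, e)] + dist[e][0] for e in range(1, n))
```

Because each layer is discarded as soon as the next is complete, the peak memory use is governed by the largest pair of adjacent layers, attained at k = \lfloor n/2 \rfloor as noted above.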