
Gauss–Legendre method

In numerical analysis and scientific computing, the Gauss–Legendre methods are a family of numerical methods for ordinary differential equations. Gauss–Legendre methods are implicit Runge–Kutta methods. More specifically, they are collocation methods based on the points of Gauss–Legendre quadrature. The Gauss–Legendre method based on s points has order 2s.

Intuition
Gauss–Legendre Runge–Kutta (GLRK) methods solve an ordinary differential equation \dot{x} = f(x) with initial condition x(0) = x_0. The distinguishing feature of GLRK is the estimation of the increment

: x(h) - x_0 = \int_0^h f( x(t) ) \, dt

with Gaussian quadrature:

: x(h) = x(0) + \frac{h}{2} \sum_{i=1}^\ell w_i k_i + O(h^{2\ell}),

where k_i = f( x( h c_i ) ) are the sampled velocities, w_i are the quadrature weights, c_i = \tfrac{1}{2}(1 + r_i) are the abscissas, and r_i are the roots P_\ell(r_i) = 0 of the Legendre polynomial of degree \ell.

A further approximation is needed, since k_i is still impossible to evaluate exactly. To maintain a truncation error of order O(h^{2\ell}), the k_i only need to be accurate to order O(h^{2\ell-1}). The implicit Runge–Kutta definition

: k_i = f{\left(x_0 + h \sum_j a_{ij} k_j \right)}

accomplishes this. It is an implicit constraint that must be solved by a root-finding algorithm such as Newton's method. The values of the Runge–Kutta parameters a_{ij} can be determined from a Taylor series expansion in h.
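The quadrature data above (roots r_i, weights w_i, abscissas c_i) can be checked numerically; a minimal sketch using NumPy's Gauss–Legendre routine, here for \ell = 2 (the order-4 method):

```python
import numpy as np

# Roots r_i of the degree-ell Legendre polynomial and quadrature weights w_i.
ell = 2
r, w = np.polynomial.legendre.leggauss(ell)

# Abscissas mapped from [-1, 1] to [0, 1]: c_i = (1 + r_i) / 2
c = (1.0 + r) / 2.0

print(r)  # roots: -1/sqrt(3), +1/sqrt(3)
print(c)  # abscissas: 1/2 - sqrt(3)/6, 1/2 + sqrt(3)/6
print(w)  # weights: both equal to 1
```

These abscissas c = 1/2 ∓ √3/6 and equal weights reappear in the Butcher tableau of the order-4 method used in the practical example below.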
Practical example
The Gauss–Legendre methods are implicit, so in general they cannot be applied exactly. Instead one makes an educated guess of k_i, and then uses Newton's method to converge arbitrarily close to the true solution. Below is a Matlab function which implements the Gauss–Legendre method of order four.

% starting point
x = [ 10.5440; 4.1124; 35.8233];
dt = 0.01;
N = 10000;
x_series = [x];
for i = 1:N
    x = gauss_step(x, @lorenz_dynamics, dt, 1e-7, 1, 100);
    x_series = [x_series x];
end

plot3( x_series(1,:), x_series(2,:), x_series(3,:) );
set(gca, 'xtick', [], 'ytick', [], 'ztick', []);
title('Lorenz Attractor');
return;

function [td, j] = lorenz_dynamics(state)
    % return a time derivative and a Jacobian of that time derivative
    x = state(1); y = state(2); z = state(3);
    sigma = 10;
    beta = 8/3;
    rho = 28;
    td = [sigma*(y-x); x*(rho-z)-y; x*y-beta*z];
    j = [-sigma, sigma, 0;
         rho-z, -1, -x;
         y, x, -beta];
end

function x_next = gauss_step( x, dynamics, dt, threshold, damping, max_iterations )
    [d,~] = size(x);
    sq3 = sqrt(3);
    if damping > 1 || damping <= 0
        error('damping should be between 0 and 1.');
    end

    % Butcher tableau of the order-4 Gauss-Legendre method
    a11 = 1/4;          a12 = 1/4 - sq3/6;
    a21 = 1/4 + sq3/6;  a22 = 1/4;

    % initial guess: both stage velocities set to f(x)
    k1 = dynamics(x);
    k2 = k1;

    % residual of the implicit stage equations
    residual = @(k1, k2) [k1 - dynamics(x + dt*(a11*k1 + a12*k2));
                          k2 - dynamics(x + dt*(a21*k1 + a22*k2))];

    r = residual(k1, k2);
    iteration = 0;
    while norm(r) > threshold && iteration < max_iterations
        iteration = iteration + 1;
        [~, j1] = dynamics(x + dt*(a11*k1 + a12*k2));
        [~, j2] = dynamics(x + dt*(a21*k1 + a22*k2));
        % Jacobian of the residual with respect to [k1; k2]
        J = [eye(d) - dt*a11*j1, -dt*a12*j1;
             -dt*a21*j2, eye(d) - dt*a22*j2];
        dk = damping * linsolve(J, r);
        k1 = k1 - dk(1:d);
        k2 = k2 - dk(d+1:2*d);
        r = residual(k1, k2);
    end
    if norm(r) > threshold
        error('Newton did not converge by %d iterations.', max_iterations);
    end
    x_next = x + dt/2 * (k1 + k2);
end

This algorithm is surprisingly cheap. The error in k_i can fall below 10^{-12} in as few as two Newton steps. The only extra work compared to explicit Runge–Kutta methods is the computation of the Jacobian.
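For readers working outside Matlab, the same order-4 step with a Newton iteration can be sketched in Python; this is a minimal illustrative translation (names such as gauss4_step are ours, not from the listing above), verified by checking fourth-order convergence on \dot{x} = -x:

```python
import numpy as np

def gauss4_step(f, jac, x, h, tol=1e-12, max_iter=50):
    """One step of the 2-stage (order-4) Gauss-Legendre method.

    f(x) returns dx/dt, jac(x) its Jacobian; x is a 1-D numpy array.
    The implicit stage equations are solved by an undamped Newton iteration.
    """
    d = x.size
    s3 = np.sqrt(3.0)
    A = np.array([[0.25, 0.25 - s3 / 6.0],
                  [0.25 + s3 / 6.0, 0.25]])   # Butcher tableau
    k = np.tile(f(x), 2)                      # initial guess: both stages f(x)
    I = np.eye(d)
    for _ in range(max_iter):
        x1 = x + h * (A[0, 0] * k[:d] + A[0, 1] * k[d:])
        x2 = x + h * (A[1, 0] * k[:d] + A[1, 1] * k[d:])
        r = np.concatenate([k[:d] - f(x1), k[d:] - f(x2)])
        if np.linalg.norm(r) < tol:
            break
        j1, j2 = jac(x1), jac(x2)
        # Jacobian of the residual with respect to the stacked stages
        J = np.block([[I - h * A[0, 0] * j1, -h * A[0, 1] * j1],
                      [-h * A[1, 0] * j2, I - h * A[1, 1] * j2]])
        k = k - np.linalg.solve(J, r)
    return x + h / 2.0 * (k[:d] + k[d:])

# Convergence check on dx/dt = -x, x(0) = 1, integrated to t = 1:
# halving h should cut the error by roughly 2^4 = 16 for an order-4 method.
f = lambda x: -x
jac = lambda x: -np.eye(1)
errs = []
for h in (0.1, 0.05):
    x = np.array([1.0])
    for _ in range(round(1.0 / h)):
        x = gauss4_step(f, jac, x, h)
    errs.append(abs(x[0] - np.exp(-1.0)))
print(errs[0] / errs[1])  # error ratio near 16, confirming order 4
```

On this linear test problem Newton converges in a single iteration, consistent with the observation above that the iteration is cheap.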
Time-symmetric variants
At the cost of adding an additional implicit relation, these methods can be adapted to have time-reversal symmetry. In these methods, the averaged position (x_f + x_i)/2 is used in computing k_i instead of just the initial position x_i as in standard Runge–Kutta methods. The method of order 2 is just the implicit midpoint method.

: k_1 = f\left(\frac{x_f + x_i}{2}\right)
: x_f = x_i + h k_1

The method of order 4 with 2 stages is as follows.

: k_1 = f\left( \frac{x_f + x_i}{2} - \frac{\sqrt{3}}{6} h k_2 \right)
: k_2 = f\left( \frac{x_f + x_i}{2} + \frac{\sqrt{3}}{6} h k_1 \right)
: x_f = x_i + \frac{h}{2}(k_1 + k_2)

The method of order 6 with 3 stages is as follows.

: k_1 = f\left( \frac{x_f + x_i}{2} - \frac{\sqrt{15}}{15} h k_2 - \frac{\sqrt{15}}{30} h k_3 \right)
: k_2 = f\left( \frac{x_f + x_i}{2} + \frac{\sqrt{15}}{24} h k_1 - \frac{\sqrt{15}}{24} h k_3 \right)
: k_3 = f\left( \frac{x_f + x_i}{2} + \frac{\sqrt{15}}{30} h k_1 + \frac{\sqrt{15}}{15} h k_2 \right)
: x_f = x_i + \frac{h}{18}( 5 k_1 + 8 k_2 + 5 k_3 )
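The time-reversal symmetry of the order-2 method (the implicit midpoint rule) can be demonstrated numerically: stepping forward by h and then by -h returns the initial state to round-off. A minimal Python sketch, using plain fixed-point iteration instead of Newton for simplicity:

```python
import numpy as np

def midpoint_step(f, x, h, iters=100):
    """Implicit midpoint rule: x_f = x_i + h * f((x_i + x_f) / 2),
    solved by fixed-point iteration (adequate for small h)."""
    x_f = x + h * f(x)  # explicit Euler as the initial guess
    for _ in range(iters):
        x_f = x + h * f(0.5 * (x + x_f))
    return x_f

# Simple harmonic oscillator: d/dt (q, p) = (p, -q)
f = lambda s: np.array([s[1], -s[0]])

x0 = np.array([1.0, 0.0])
x1 = midpoint_step(f, x0, 0.1)    # forward step
x2 = midpoint_step(f, x1, -0.1)   # backward step
print(np.linalg.norm(x2 - x0))    # near round-off: the method is reversible
```

The same forward/backward test applied to an explicit method such as forward Euler leaves a residual of order h^2 rather than round-off, which is what distinguishes the time-symmetric variants.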