Optimization techniques based on
static program analysis of the
source code consider code performance improvements without actually executing the program; no
dynamic program analysis is performed. For example, inferring or placing formal constraints on the number of iterations a
loop is likely to execute is useful when deciding whether to
unroll it, but such facts typically depend on complex runtime factors that are difficult to establish conclusively. Static analysis therefore usually works with incomplete information and can only approximate the eventual runtime conditions. The first high-level compiler, introduced as the Fortran Automatic Coding System in 1957, broke the code into blocks and devised a table of the frequency with which each block is executed via a simulated execution of the code in a
Monte Carlo fashion in which the outcome of conditional transfers (as via IF-type statements) is determined by a
random number generator suitably weighted by whatever FREQUENCY statements were provided by the programmer. Rather than relying on programmer-supplied frequency information, profile-guided optimization uses the results of profiling test runs of the
instrumented program to optimize the final
generated code.
Just-in-time compilation can make use of
runtime information to dynamically recompile parts of the executed code and generate more efficient native code. If the dynamic profile changes during execution, the runtime can deoptimize the previously generated native code and generate new code optimized with the information from the new profile.

==Adoption==