In several scientific fields, "complexity" has a precise meaning:

• In computational complexity theory, the amount of resources required to execute an algorithm is studied. The most popular types of computational complexity are the time complexity of a problem, equal to the number of steps that it takes to solve an instance of the problem as a function of the size of the input (usually measured in bits), using the most efficient algorithm, and the space complexity of a problem, equal to the amount of memory (e.g., cells of the tape) that the most efficient algorithm uses to solve an instance, again as a function of the input size. This allows computational problems to be classified by complexity class (such as P, NP, etc.). An axiomatic approach to computational complexity was developed by Manuel Blum. It allows one to deduce many properties of concrete computational complexity measures, such as time complexity or space complexity, from properties of axiomatically defined measures.
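As a concrete illustration (not part of the formal theory; the function names are illustrative only), the following Python sketch contrasts two algorithms for the same problem, membership in a sorted list, whose time complexities differ: linear search takes O(n) steps in the worst case, while binary search takes O(log n).

```python
from bisect import bisect_left

def linear_search(sorted_items, target):
    """O(n) time: may inspect every element once."""
    for i, item in enumerate(sorted_items):
        if item == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n) time: halves the remaining search interval at each step."""
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = list(range(0, 1_000_000, 2))  # sorted input, n = 500,000
print(linear_search(data, 999_998))  # up to n comparisons in the worst case
print(binary_search(data, 999_998))  # about log2(n) ~ 19 comparisons
```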
• In algorithmic information theory, the Kolmogorov complexity (also called descriptive complexity, algorithmic complexity or algorithmic entropy) of a string is the length of the shortest binary program that outputs that string. Minimum message length is a practical application of this approach. Different kinds of Kolmogorov complexity are studied: the uniform complexity, prefix complexity, monotone complexity, time-bounded Kolmogorov complexity, and space-bounded Kolmogorov complexity. An axiomatic approach to Kolmogorov complexity based on Blum axioms (Blum 1967) was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov. The axiomatic approach encompasses other approaches to Kolmogorov complexity, making it possible to treat the different kinds of Kolmogorov complexity as particular cases of an axiomatically defined generalized Kolmogorov complexity. Instead of proving similar theorems, such as the basic invariance theorem, for each particular measure, all such results can be deduced from one corresponding theorem proved in the axiomatic setting. This is a general advantage of the axiomatic approach in mathematics. The axiomatic approach to Kolmogorov complexity was further developed in the book (Burgin 2005) and applied to software metrics (Burgin and Debnath, 2003; Debnath and Burgin, 2003).
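Kolmogorov complexity itself is uncomputable, so code can only bound it. A common practical stand-in (an assumption of this sketch, not part of the axiomatic theory above) is the length of a lossless compression of the string, which upper-bounds the shortest description relative to the chosen compressor:

```python
import os
import zlib

def compressed_length(s: bytes) -> int:
    # K(s) is uncomputable; the length of a zlib encoding of s is a
    # computable upper-bound proxy for it (relative to this compressor).
    return len(zlib.compress(s, level=9))

regular = b"01" * 500       # highly patterned 1000-byte string
random_ = os.urandom(1000)  # incompressible with high probability

print(compressed_length(regular))  # small: the pattern has a short description
print(compressed_length(random_))  # near 1000: no shorter description is found
```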
• In information theory, information fluctuation complexity is the fluctuation of information about information entropy. It is derivable from fluctuations in the predominance of order and chaos in a dynamic system and has been used as a measure of complexity in many diverse fields.
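In its usual formulation, this quantity is the standard deviation of the state information (surprisal) about the entropy, which is its mean. The sketch below assumes the system's state probabilities are already known:

```python
import math

def fluctuation_complexity(probs):
    # Surprisal of each state: I_i = -log2(p_i).
    info = [-math.log2(p) for p in probs]
    # Entropy H is the probability-weighted mean of the surprisals.
    entropy = sum(p * i for p, i in zip(probs, info))
    # Fluctuation: standard deviation of I about H.
    return math.sqrt(sum(p * (i - entropy) ** 2 for p, i in zip(probs, info)))

print(fluctuation_complexity([0.25, 0.25, 0.25, 0.25]))  # 0.0: uniform, no fluctuation
print(fluctuation_complexity([0.7, 0.1, 0.1, 0.1]))      # > 0: order mixed with surprise
```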
• In information processing, complexity is a measure of the total number of properties transmitted by an object and detected by an observer. Such a collection of properties is often referred to as a state.
• In physical systems, complexity is a measure of the probability of the state vector of the system. This should not be confused with entropy; it is a distinct mathematical measure, one in which two distinct states are never conflated and considered equal, as is done for the notion of entropy in statistical mechanics.

• In dynamical systems, statistical complexity measures the size of the minimum program able to statistically reproduce the patterns (configurations) contained in the data set (sequence).
• In mathematics, Krohn–Rhodes complexity is an important topic in the study of finite semigroups and automata.
• In network theory, complexity is the product of richness in the connections between components of a system, defined by a very unequal distribution of certain measures (some elements being highly connected while others have very few connections; see complex network).
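To make "very unequal distribution" concrete, one simple proxy (an illustration, not a canonical definition) is the variance of the degree distribution: a hub-and-spoke graph concentrates connections in one node, while a ring spreads them evenly.

```python
from statistics import pvariance

def degrees(edges, nodes):
    # Count how many edges touch each node.
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

nodes = range(6)
star = [(0, v) for v in range(1, 6)]         # hub-and-spoke: one highly connected node
ring = [(v, (v + 1) % 6) for v in range(6)]  # cycle: all degrees equal

print(pvariance(degrees(star, nodes).values()))  # large: heterogeneous degrees
print(pvariance(degrees(ring, nodes).values()))  # 0.0: homogeneous degrees
```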
• In software engineering, programming complexity is a measure of the interactions of the various elements of the software. This differs from the computational complexity described above in that it is a measure of the design of the software. Halstead complexity measures, cyclomatic complexity, time complexity, and parameterized complexity are closely linked concepts.
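As a rough sketch of one of these measures: for structured code, McCabe's cyclomatic complexity equals the number of branch points plus one (the full definition, M = E - N + 2P, counts edges, nodes, and connected components of the control-flow graph). The simplified counter below, with an illustrative function name, recognizes only a few common Python branching constructs:

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    # Simplified McCabe count: one linear path plus one per branching construct.
    branches = (ast.If, ast.IfExp, ast.For, ast.While, ast.ExceptHandler)
    return 1 + sum(isinstance(n, branches) for n in ast.walk(ast.parse(source)))

code = """
def classify(x):
    if x < 0:
        return "negative"
    for d in range(2, x):
        if x % d == 0:
            return "composite"
    return "no small factor"
"""
print(cyclomatic_complexity(code))  # 4: one path plus three branch points
```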
• In model theory, U-rank is a measure of the complexity of a complete type in the context of stable theories.

• In bioinformatics, linguistic sequence complexity is a measure of the vocabulary richness of a genetic text in gene sequences.
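One common formulation (Trifonov's, assumed here rather than taken from the text above) multiplies, over word lengths k, the fraction of possible k-mers that actually occur in the sequence:

```python
def linguistic_complexity(seq: str, max_k: int = 3, alphabet: int = 4) -> float:
    # Vocabulary usage: for each word length k, the fraction of possible
    # k-mers that occur in seq; the complexity is the product over k.
    lc = 1.0
    for k in range(1, max_k + 1):
        observed = len({seq[i:i + k] for i in range(len(seq) - k + 1)})
        possible = min(alphabet ** k, len(seq) - k + 1)
        lc *= observed / possible
    return lc

print(linguistic_complexity("ACGTACGTACGT"))  # repetitive text: low vocabulary usage
print(linguistic_complexity("ACGGTTACCAGT"))  # varied text: closer to 1.0
```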
• In statistical learning theory, the Vapnik–Chervonenkis dimension is a measure of the size (capacity, complexity, expressive power, richness, or flexibility) of a class of sets.
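For intuition (an illustrative sketch, not the general definition), consider the class of one-sided thresholds on the real line, h_t(x) = 1 if x ≥ t. The check below confirms that this class shatters any single point but cannot realize every labeling of two points, so its VC dimension is 1:

```python
from itertools import product

def thresholds_shatter(points) -> bool:
    # The class { x -> 1 if x >= t else 0 } shatters `points` iff every
    # 0/1 labeling of the points is realized by some threshold t.
    cuts = sorted(points) + [max(points) + 1.0]  # one threshold per labeling regime
    for labels in product([0, 1], repeat=len(points)):
        if not any(all((x >= t) == bool(lab) for x, lab in zip(points, labels))
                   for t in cuts):
            return False
    return True

print(thresholds_shatter([3.0]))       # True: both labelings of one point occur
print(thresholds_shatter([1.0, 2.0]))  # False: the labeling (1, 0) is unrealizable
```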
• In computational learning theory, Rademacher complexity is a measure of richness of a class of sets with respect to a probability distribution.
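The empirical version of this quantity measures how well a class can correlate with random ±1 noise on a fixed sample. The Monte Carlo sketch below (illustrative names; each function is represented by its vector of values on the sample) estimates it for two toy classes:

```python
import random
from itertools import product

def empirical_rademacher(function_values, trials=10_000, seed=0):
    # function_values: one vector [f(x_1), ..., f(x_n)] per function f.
    # Monte Carlo estimate of E_sigma[ sup_f (1/n) * sum_i sigma_i * f(x_i) ].
    rng = random.Random(seed)
    n = len(function_values[0])
    total = 0.0
    for _ in range(trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * v for s, v in zip(sigma, f)) / n
                     for f in function_values)
    return total / trials

single = [[1, 1, 1, 1]]                                    # one constant function
all_signs = [list(f) for f in product((-1, 1), repeat=4)]  # every sign pattern

print(empirical_rademacher(single))     # ~0.0: one function cannot fit noise
print(empirical_rademacher(all_signs))  # 1.0: the class matches any noise exactly
```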
• In sociology, social complexity is a conceptual framework used in the analysis of society.
• In combinatorial game theory, measures of game complexity involve understanding game positions, possible outcomes, and the computation required for various game scenarios.
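Two standard such measures admit easy back-of-the-envelope bounds; the sketch below computes them for tic-tac-toe (both are over-counts, since unreachable positions and games that end early are not excluded):

```python
import math

state_space_upper_bound = 3 ** 9           # each of 9 cells is X, O, or empty
game_tree_upper_bound = math.factorial(9)  # at most 9! complete move sequences

print(state_space_upper_bound)  # 19683
print(game_tree_upper_bound)    # 362880
```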
• In the analysis of binary structures, the Abstract Complexity Definition (ACD) formalizes complexity as the complexity of a binary structure, expressed by the formula C = N² / n, where N is the number of detectable regularities (contrasts) and n is the number of basic elements (0 and 1). These regularities are identified with contrasts, that is, tensions arising from interactions of common and differentiating features, and with structural information (as opposed to Shannon's telecommunication information). The formula incorporates two factors: the ratio N/n, representing information compression/density, and the number of regularities N, introduced to account for the difficulty of compressing longer structures. The definition applies both analogically and directly wherever a system can be expressed in binary form, e.g., in music. It differs from classical structural definitions of complexity, which focus on the number of elements and relations without specifying the nature of those relations: in ACD, relations are explicitly defined as tensions (contrasts) resulting from interactions of common and differentiating features of system elements, with a defined dynamics. This makes the definition constructive and operational, allowing complexity to be calculated formally. The intuitive criterion behind it is that complexity is the number of distinguishable elements and the number of connections between them; in terms of contrast, differentiating features correspond to distinguishable elements, and common features correspond to connections between those elements. The more such features exist and the stronger their interactions, the greater the contrast, and thus the higher the complexity.
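Since the text above does not specify how the N regularities are detected, the sketch below simply takes N as given and evaluates the stated formula; the function name and example values are hypothetical:

```python
def acd_complexity(num_regularities: int, num_elements: int) -> float:
    # C = N^2 / n, equivalently (N / n) * N: compression density scaled by
    # the number of regularities. Detecting the regularities is left to
    # the analyst; here N is supplied directly.
    return num_regularities ** 2 / num_elements

# Hypothetical example: a 16-bit structure with 6 detected contrasts.
print(acd_complexity(6, 16))  # 2.25
```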
Other fields introduce less precisely defined notions of complexity:

• A complex adaptive system has some or all of the following attributes:
  • The number of parts (and types of parts) in the system and the number of relations between the parts is non-trivial; however, there is no general rule to separate "trivial" from "non-trivial".
  • The system has memory or includes feedback.
  • The system can adapt itself according to its history or feedback.
  • The relations between the system and its environment are non-trivial or non-linear.
  • The system can be influenced by, or can adapt itself to, its environment.
  • The system is highly sensitive to initial conditions.
• Peak complexity is the concept that human societies address problems by adding social and economic complexity, but that this process is subject to diminishing marginal returns.

== Study ==