===Early developments===
The first programmable computers were invented during the 1940s, and with them, the first programming languages. The earliest computers were programmed in
first-generation programming languages (1GLs),
machine language (simple instructions that could be directly executed by the processor). This code was very difficult to debug and was not
portable between different computer systems. In order to improve the ease of programming,
assembly languages (or
second-generation programming languages—2GLs) were invented, diverging from the machine language to make programs easier to understand for humans, although they did not increase portability. Initially, hardware resources were scarce and expensive, while
human resources were cheaper. Therefore, cumbersome languages that were time-consuming to use but stayed closer to the hardware for higher efficiency were favored. The introduction of
high-level programming languages (
third-generation programming languages—3GLs) revolutionized programming. These languages
abstracted away the details of the hardware, instead being designed to express
algorithms that could be understood more easily by humans. For example, arithmetic expressions could now be written in symbolic notation and later translated into machine code that the hardware could execute. In 1957,
Fortran (FORmula TRANslation) was invented. Often considered the first
compiled high-level programming language, Fortran has remained in use into the twenty-first century.
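For illustration, a short sketch in C++ (a later high-level language, standing in here for early Fortran) shows a formula written in symbolic notation; the compiler, not the programmer, translates it into the processor's arithmetic instructions:

<syntaxhighlight lang="cpp">
#include <cmath>
#include <cstdio>

int main() {
    double a = 1.0, b = -3.0, c = 2.0;
    // The programmer writes the formula symbolically; the compiler
    // translates it into the machine's arithmetic instructions.
    double root = (-b + std::sqrt(b * b - 4.0 * a * c)) / (2.0 * a);
    std::printf("root = %f\n", root);  // prints: root = 2.000000
    return 0;
}
</syntaxhighlight>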
===1960s and 1970s===
IBM released the 704 mainframe—the first hardware to support floating-point arithmetic—in 1954. Fortran was designed for this machine. Around 1960, the first
mainframes—general purpose computers—were developed, although they could only be operated by professionals and the cost was extreme. The data and instructions were input by
punch cards, meaning that no input could be added while the program was running. The languages developed at this time were therefore designed for minimal interaction. After the invention of the
microprocessor, computers in the 1970s became dramatically cheaper. New computers also allowed more user interaction, which was supported by newer programming languages.
Lisp, implemented in 1958, was the first
functional programming language. Unlike Fortran, it supported
recursion and
conditional expressions, and it also introduced
dynamic memory management on a
heap and automatic
garbage collection. For the following decades, Lisp dominated artificial intelligence applications.
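A minimal sketch of recursion combined with a conditional expression, written in C++ rather than Lisp itself (the equivalent Lisp definition appears in the comment):

<syntaxhighlight lang="cpp">
#include <cstdio>

// Recursive factorial using a conditional expression (?:), in the
// spirit of Lisp's if-expression. The Lisp equivalent would be:
//   (defun fact (n) (if (< n 2) 1 (* n (fact (- n 1)))))
long fact(long n) {
    return n < 2 ? 1 : n * fact(n - 1);
}

int main() {
    std::printf("%ld\n", fact(5));  // prints 120
    return 0;
}
</syntaxhighlight>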
In 1978, another functional language, ML, introduced inferred types and polymorphic parameters.
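A rough analogue in C++ (not ML itself): the template parameter is deduced at each call, loosely resembling ML's inferred, polymorphic types:

<syntaxhighlight lang="cpp">
#include <cstdio>
#include <string>

// A function polymorphic in T; the compiler deduces T at each call,
// loosely analogous to ML inferring the type 'a -> 'a for fn x => x.
template <typename T>
T identity(T x) { return x; }

int main() {
    int n = identity(42);                         // T deduced as int
    std::string s = identity(std::string("ml"));  // T deduced as std::string
    std::printf("%d %s\n", n, s.c_str());         // prints: 42 ml
    return 0;
}
</syntaxhighlight>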
After ALGOL (ALGOrithmic Language) was released in 1958 and 1960, it became the standard in computing literature for describing
algorithms. Although its commercial success was limited, most popular imperative languages—including
C,
Pascal,
Ada,
C++,
Java, and
C#—are directly or indirectly descended from ALGOL 60. Among its innovations adopted by later programming languages were greater portability and the first use of a
context-free grammar, in Backus–Naur form (BNF), to define the language's syntax.
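For illustration, a miniature grammar in BNF-style notation (hypothetical, not ALGOL 60's actual grammar) and a C++ recursive-descent evaluator in which each nonterminal becomes one function:

<syntaxhighlight lang="cpp">
#include <cstdio>

// A miniature expression grammar in BNF-style notation:
//   <expr>  ::= <term> | <expr> "+" <term>
//   <term>  ::= <digit> | "(" <expr> ")"
//   <digit> ::= "0" | "1" | ... | "9"
const char* p;   // cursor into the input string

int expr();      // forward declaration

int term() {
    if (*p == '(') {       // <term> ::= "(" <expr> ")"
        ++p;               // consume '('
        int v = expr();
        ++p;               // consume ')'
        return v;
    }
    return *p++ - '0';     // <term> ::= <digit>
}

int expr() {               // left recursion rewritten as a loop
    int v = term();
    while (*p == '+') {
        ++p;               // consume '+'
        v += term();
    }
    return v;
}

int main() {
    p = "(1+2)+3";
    std::printf("%d\n", expr());  // prints 6
    return 0;
}
</syntaxhighlight>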
Simula, the first language to support
object-oriented programming (including
subtypes,
dynamic dispatch, and
inheritance), also descends from ALGOL and achieved commercial success. C, another ALGOL descendant, has sustained popularity into the twenty-first century. C allows access to lower-level machine operations than other contemporary languages. Its power and efficiency, generated in part with flexible
pointer operations, come at the cost of making it more difficult to write correct code.
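A brief sketch (C-style C++) of how pointer flexibility yields both efficiency and risk:

<syntaxhighlight lang="cpp">
#include <cstdio>

int main() {
    int data[4] = {10, 20, 30, 40};
    int* p = data;

    // Pointer arithmetic addresses memory directly, with no bounds
    // checks; this is efficient but entirely the programmer's duty.
    int sum = 0;
    for (int i = 0; i < 4; ++i)
        sum += *(p + i);
    std::printf("sum = %d\n", sum);  // prints: sum = 100

    // The same flexibility invites errors: *(p + 4) would read past
    // the end of the array, which is undefined behavior that the
    // compiler is not required to diagnose.
    return 0;
}
</syntaxhighlight>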
Prolog, designed in 1972, was the first
logic programming language, allowing communication with a computer in
formal logic notation. With logic programming, the programmer specifies a desired result and allows the
interpreter to decide how to achieve it.
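As a loose analogy only (C++ has no built-in logic programming, and real Prolog uses unification and backtracking), the following sketch separates the stated goal from the search that achieves it:

<syntaxhighlight lang="cpp">
#include <cstdio>
#include <functional>

// The caller states WHAT is wanted (a goal predicate); the solver
// decides HOW to find it. Exhaustive search stands in for Prolog's
// resolution here; it loops forever if the goal is unsatisfiable.
int solve(const std::function<bool(int)>& goal) {
    for (int x = 0; ; ++x)
        if (goal(x)) return x;
}

int main() {
    // "Find x such that x * x = 144", stated rather than computed
    // step by step by the caller.
    int x = solve([](int v) { return v * v == 144; });
    std::printf("x = %d\n", x);  // prints: x = 12
    return 0;
}
</syntaxhighlight>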
===1980s to 2000s===
During the 1980s, the invention of the
personal computer transformed the roles for which programming languages were used. New languages introduced in the 1980s included C++, a
superset of C whose compilers can also compile C programs, with added support for
classes and
inheritance.
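A minimal C++ sketch of these object-oriented concepts (the Shape and Square classes are hypothetical): inheritance creates a subtype, and calls through the base type are dispatched dynamically:

<syntaxhighlight lang="cpp">
#include <cstdio>

struct Shape {
    virtual double area() const { return 0.0; }  // dynamically dispatched
    virtual ~Shape() {}
};

// Inheritance: Square derives from Shape and is a subtype of it.
struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

int main() {
    Square sq(3.0);
    Shape* s = &sq;  // subtyping: a Square may stand in for a Shape
    // Dynamic dispatch: the call resolves at run time to Square::area.
    std::printf("%f\n", s->area());  // prints 9.000000
    return 0;
}
</syntaxhighlight>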
Ada and other new languages introduced support for concurrency. The Japanese government invested heavily in so-called fifth-generation languages, which added support for concurrency to logic programming constructs, but these languages were outperformed by other concurrency-supporting languages.
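As a modern illustration (C++ library threads rather than Ada's built-in tasks), concurrency support lets one program run several activities at once:

<syntaxhighlight lang="cpp">
#include <cstdio>
#include <thread>

// Each worker runs in its own thread; output order is nondeterministic.
void worker(int id) {
    std::printf("worker %d running\n", id);
}

int main() {
    std::thread a(worker, 1);  // both threads run concurrently
    std::thread b(worker, 2);
    a.join();                  // wait for each to finish
    b.join();
    return 0;
}
</syntaxhighlight>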
Due to the rapid growth of the Internet and the
World Wide Web in the 1990s, new programming languages were introduced to support
web pages and
networking.
Java, based on C++ and designed for greater portability across systems and improved security, enjoyed large-scale success because these features are essential for many Internet applications. Another development was that of
dynamically typed scripting languages—
Python,
JavaScript,
PHP, and
Ruby—designed to quickly produce small programs that coordinate existing
applications. Due to their integration with
HTML, they have also been used for building web pages hosted on
servers.
===2000s to present===
During the 2000s, there was a slowdown in the development of new programming languages that achieved widespread popularity. One innovation was
service-oriented programming, designed to exploit
distributed computing systems whose components are connected by a network. Services are similar to objects in object-oriented programming but run in a separate process.
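A conceptual C++ sketch of that similarity (the Greeter interface and stub are hypothetical, and the network machinery is elided): an ordinary object serves the call in-process, while a service stub would forward it to another process:

<syntaxhighlight lang="cpp">
#include <cstdio>
#include <string>

// Like an object, a service is used through an interface.
struct Greeter {
    virtual std::string greet(const std::string& name) = 0;
    virtual ~Greeter() {}
};

// An ordinary object: the call runs in the caller's own process.
struct LocalGreeter : Greeter {
    std::string greet(const std::string& name) override {
        return "hello, " + name;
    }
};

// A service stub: a real one would serialize the call and send it to
// another process over the network (only simulated here).
struct GreeterServiceStub : Greeter {
    std::string greet(const std::string& name) override {
        // send_request("greet", name); wait for the reply ...
        return "hello, " + name + " (via remote service)";
    }
};

int main() {
    LocalGreeter local;
    GreeterServiceStub remote;
    std::printf("%s\n", local.greet("world").c_str());
    std::printf("%s\n", remote.greet("world").c_str());
    return 0;
}
</syntaxhighlight>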
C# and
F# cross-pollinated ideas between imperative and functional programming. After 2010, several new languages—
Rust,
Go,
Swift,
Zig, and
Carbon—competed for the performance-critical software for which C had historically been used. Most of these new languages use static typing, while a few, such as Julia, use dynamic typing.
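A small C++ example of the difference: with static typing, the compiler checks every call against the declared signature before the program runs:

<syntaxhighlight lang="cpp">
#include <cstdio>
#include <string>

// The declared signature fixes the argument and result types.
int twice(int x) { return 2 * x; }

int main() {
    std::printf("%d\n", twice(21));  // prints 42
    // twice(std::string("oops"));   // rejected at compile time
    // A dynamically typed language such as Julia would instead
    // detect the mismatch at run time, if at all.
    return 0;
}
</syntaxhighlight>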
Some of the new languages, such as Scratch and LabVIEW, are classified as visual programming languages, and some, such as Ballerina, mix textual and visual programming. This trend also led to projects, such as Google's Blockly, that help developers create new visual languages. Many game engines, including Unreal and Unity, support visual scripting.

==Definition==