The digital revolution converted technology from analog to digital formats, making it possible to create copies identical to the original. In digital communications, for example, repeating hardware could amplify and regenerate the
digital signal, passing it on with no loss of information (a principle sketched below). Of equal importance to the revolution was the ability to easily move digital information between media and to access or distribute it remotely. One turning point of the revolution was the change from analog to digitally recorded music. During the 1980s, the digital format of optical compact discs gradually replaced
analog formats, such as
vinyl records and
cassette tapes, as the popular medium of choice.
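The regeneration principle mentioned above can be illustrated in a few lines of code. The sketch below is not from the source; the noise level, threshold, and function names are illustrative assumptions. As long as noise never pushes a sample across the decision threshold, a repeater recovers the transmitted bits exactly and can retransmit a clean copy:

```python
import random

def transmit(bits, noise=0.3):
    """Map bits to voltage levels (0 -> 0.0, 1 -> 1.0) and add bounded noise."""
    return [b + random.uniform(-noise, noise) for b in bits]

def regenerate(samples, threshold=0.5):
    """A repeater's decision circuit: threshold samples back to clean bits."""
    return [1 if s > threshold else 0 for s in samples]

original = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = transmit(original)      # degraded waveform after a noisy link
restored = regenerate(noisy)    # repeater output
assert restored == original     # a bit-for-bit identical copy
```

An analog repeater, by contrast, amplifies accumulated noise along with the signal, so each hop degrades the copy further.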
== Previous inventions ==
Humans have manufactured tools for counting and calculating since ancient times, such as the
abacus,
astrolabe,
equatorium, and mechanical timekeeping devices. More complicated devices started appearing in the 1600s, including the
slide rule and
mechanical calculators. By the early 1800s, the
Industrial Revolution had produced mass-market calculators like the
arithmometer and the enabling technology of the
punch card.
Charles Babbage proposed a mechanical general-purpose computer called the
Analytical Engine, but it was never successfully built; largely forgotten by the 20th century, it was unknown to most of the inventors of modern computers. The
Second Industrial Revolution, in the last quarter of the 19th century, developed useful electrical circuits and the
telegraph. In the 1880s,
Herman Hollerith developed electromechanical tabulating and calculating devices using punch cards and
unit record equipment, which became widespread in business and government. Meanwhile, various
analog computers used electrical, mechanical, or hydraulic means to model problems and calculate answers. These included an 1872
tide-predicting machine,
differential analysers,
perpetual calendar machines, the
Deltar for water management in the Netherlands,
network analyzers for electrical systems, and various machines for aiming military guns and bombs. The construction of problem-specific analog computers continued in the late 1940s and beyond, with
FERMIAC for neutron transport,
Project Cyclone for various military applications, and the
Phillips Machine for economic modeling. Building on the complexity of the
Z1 and
Z2, German inventor
Konrad Zuse used electromechanical systems to complete the
Z3 in 1941; it was the world's first working programmable, fully automatic digital computer. Also during
World War II, Allied engineers constructed electromechanical
bombes to break the German
Enigma machine encoding. The base-10 electromechanical
Harvard Mark I was completed in 1944; it drew inspiration from, and to some degree improved upon, Charles Babbage's designs.
== 1947–1969: Origins ==
The creation in Philadelphia of ENIAC, the "first all-purpose digital computer", in 1946 has been cited as the beginning of the Information Age. In 1947, the first working
transistor, the
germanium-based
point-contact transistor, was invented by
John Bardeen and
Walter Houser Brattain while working under
William Shockley at
Bell Labs. This led the way to more advanced
digital computers. From the late 1940s, universities, the military, and businesses developed computer systems to digitally replicate and automate previously manually performed mathematical calculations, with the
LEO being the first computer used for routine commercial business applications.
Digital communication became economical for widespread adoption after the invention of the personal computer in the 1970s.
Claude Shannon, a
Bell Labs mathematician, is generally credited with laying the foundations of
digitalization in his pioneering 1948 article,
A Mathematical Theory of Communication. In 1948, Bardeen and Brattain patented the insulated-gate field-effect transistor (IGFET) with an inversion layer; their concept forms the basis of CMOS and DRAM technology today. In 1957, at Bell Labs, Frosch and Derick were able to manufacture planar silicon dioxide transistors; later, a team at Bell Labs demonstrated a working
MOSFET. The first integrated circuit milestone was achieved by
Jack Kilby in 1958. Other important technological developments included the invention of the monolithic
integrated circuit chip by
Robert Noyce at
Fairchild Semiconductor in 1959, made possible by the
planar process developed by
Jean Hoerni. In 1963,
complementary MOS (CMOS) was developed by
Chih-Tang Sah and
Frank Wanlass at Fairchild Semiconductor. The
self-aligned gate transistor, which further facilitated mass production, was invented in 1966 by Robert Bower at
Hughes Aircraft and independently by Robert Kerwin,
Donald Klein, and John Sarace at Bell Labs. In 1962, AT&T deployed the
T-carrier for long-haul
pulse-code modulation (PCM) digital voice transmission. The T1 format carried 24 pulse-code-modulated, time-division-multiplexed speech signals, each encoded as a 64 kbit/s stream; an additional 8 kbit/s of framing information facilitated synchronization and demultiplexing at the receiver, for a total line rate of 1.544 Mbit/s. Over the subsequent decades, the digitization of voice became the norm for all but the last mile, where analog remained the norm into the late 1990s.
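The arithmetic behind these figures can be made concrete with a short sketch. This is illustrative rather than from the source; it assumes the standard T1 parameters of an 8 kHz frame rate and one framing bit per frame:

```python
# Illustrative T1 rate arithmetic: 24 voice channels of 64 kbit/s PCM,
# plus one framing bit per frame, yield the 1.544 Mbit/s T1 line rate.
CHANNELS = 24        # time-division multiplexed voice channels
BITS_PER_SAMPLE = 8  # PCM bits per voice sample
FRAME_RATE = 8_000   # frames (and samples per channel) per second

channel_rate = BITS_PER_SAMPLE * FRAME_RATE   # 64,000 bit/s per channel
payload_rate = CHANNELS * channel_rate        # 1,536,000 bit/s of voice
framing_rate = 1 * FRAME_RATE                 # 8,000 bit/s of framing
line_rate = payload_rate + framing_rate       # 1,544,000 bit/s total

bits_per_frame = CHANNELS * BITS_PER_SAMPLE + 1   # 193 bits per T1 frame
assert line_rate == bits_per_frame * FRAME_RATE == 1_544_000
```

Following the development of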
MOS integrated circuit chips in the early 1960s, MOS chips reached higher
transistor density and lower manufacturing costs than
bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by
Moore's law, leading to
large-scale integration (LSI), with hundreds of transistors on a single MOS chip, by the late 1960s.
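As a rough illustration (not from the source; the starting count of 500 transistors on a 1968 LSI chip is an assumed figure), Moore's law can be expressed as a simple doubling formula:

```python
def moores_law(count_start, year_start, year, doubling_years=2):
    """Projected transistor count, assuming a doubling every two years."""
    return count_start * 2 ** ((year - year_start) / doubling_years)

# An assumed ~500 transistors on an LSI chip in 1968 projects to roughly
# 1,400 by 1971; the Intel 4004, released in 1971, actually contained
# about 2,300 transistors.
print(round(moores_law(500, 1968, 1971)))   # -> 1414
```

The application of MOS LSI chips to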
computing was the basis for the first
microprocessors, as engineers began recognizing that a complete
computer processor could be contained on a single MOS LSI chip. In 1968, Fairchild engineer
Federico Faggin improved MOS technology with his development of the
silicon-gate MOS chip, which he later used to develop the
Intel 4004, the first single-chip microprocessor. It was released by
Intel in 1971 and laid the foundations for the
microcomputer revolution that began in the 1970s. MOS technology also led to the development of semiconductor
image sensors suitable for
digital cameras. The first such image sensor was the
charge-coupled device, developed by
Willard S. Boyle and
George E. Smith at Bell Labs in 1969, based on
MOS capacitor technology. In the 1970s, the home computer, time-sharing computers, the video game console, and the first coin-operated video games were introduced, and the golden age of arcade video games began with Space Invaders. As digital technology proliferated and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the
data entry clerk. Culled from the ranks of secretaries and typists of earlier decades, data entry clerks converted analog data (customer records, invoices, etc.) into digital data. In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, businesses, and industry.
Automated teller machines,
industrial robots,
CGI in film and television,
electronic music,
bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. Millions of people purchased home computers, making household names of early personal computer manufacturers such as
Apple, Commodore, and Tandy. To this day, the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units (by some accounts) between 1982 and 1994. In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3% (middle and upper middle class households were the most likely to own one, at 22.9%). By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one. By the late 1980s, many businesses were dependent on computers and digital technology. Motorola created the first mobile phone, the
Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the
first 2G network opened in Finland to accommodate the unexpected demand for cell phones that had become apparent in the late 1980s.
Compute! magazine predicted that
CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs. The first true
digital camera was created in 1988, and the first digital cameras were marketed in December 1989 in Japan and in 1990 in the United States. By the early 2000s, digital cameras had eclipsed traditional film in popularity.
Digital ink and paint were also invented in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in 1989's
The Little Mermaid and for all of its animated films between 1990's
The Rescuers Down Under and 2004's
Home on the Range.
== 1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0 ==
Tim Berners-Lee invented the
World Wide Web in 1989. The "Web 1.0 era" ended in 2005, coinciding with the development of more advanced technologies at the beginning of the 21st century. The first public digital
HDTV broadcast was of the
1990 FIFA World Cup that June; the broadcast was shown in 10 theaters in Spain and Italy. However, HDTV did not become a broadcast standard outside Japan until the mid-2000s. The
World Wide Web, which had been available only to government and universities, became publicly accessible in 1991. In 1993,
Marc Andreessen and
Eric Bina introduced
Mosaic, the first web browser capable of displaying inline images and the basis for later browsers such as
Netscape Navigator and
Internet Explorer.
Stanford Federal Credit Union was the first
financial institution to offer online internet banking services to all of its members in October 1994. In 1996,
OP Financial Group, also a
cooperative bank, became the second online bank in the world and the first in Europe. The Internet expanded rapidly worldwide, and by 1996, it was part of
mass culture, with many businesses listing websites in their ads. By 1999, almost every country had a connection, and nearly half of
Americans and people in several other countries used the internet on a regular basis. However, throughout the 1990s, "getting online" entailed complicated configuration, and
dial-up was the only connection type affordable by individual users; the present-day mass
internet culture was not yet possible. By 2000, 65% of U.S. households with children owned a computer, up from nearly 30% in 1989.
Cell phones became as ubiquitous as computers by the early 2000s, with movie theaters beginning to show ads telling people to silence their phones. They also became
much more advanced than phones of the 1990s, most of which only handled calls or, at most, simple games. Text messaging became widely used worldwide in the late 1990s, though it did not become commonplace in the United States until the early 2000s. The digital revolution became truly global at this time as well – after revolutionizing society in the
developed world in the 1990s, the digital revolution spread to the masses in the
developing world in the 2000s. By 2000, the majority of U.S. households had at least one personal computer, and a majority had
internet access by the following year. In 2002, a majority of U.S. survey respondents reported having a mobile phone.
== 2005–present: Web 2.0, social media, smartphones, digital TV ==
In late 2005, the number of people with internet access reached 1 billion, and 3 billion people worldwide used cell phones by the end of the decade.
High-definition television became the standard television broadcasting format in many countries by the end of the decade. In September and December 2006, respectively,
Luxembourg and the
Netherlands became the first countries to completely
transition from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having
broadband internet at home. According to estimates from
Nielsen Media Research, approximately 45.7 million U.S. households in 2006 (about 40 percent of the 114.4 million total) owned a dedicated
home video game console, and by 2015, 51 percent of U.S. households owned a dedicated home video game console according to an
Entertainment Software Association annual industry
report. By 2012, over 2 billion people used the Internet, twice the number using it in 2007.
Cloud computing had entered the mainstream by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a
smartphone. By 2016, half of the world's population was connected, and as of 2020, that number had risen to 67%.

== Rise in digital technology and commercialization of computers ==