When
videotapes were first developed in the 1950s, the only way to edit was to physically cut the tape with a razor blade and splice segments together. While the footage excised in this process was not technically destroyed, continuity was lost and the footage was generally discarded. In 1963, with the introduction of the
Ampex Editec, videotape could be edited electronically with a process known as
linear video editing by selectively copying the original footage to another tape called a
master. The original recordings are not destroyed or altered in this process. However, since the final product is a copy of the original, there is a generation loss of quality.
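The generation loss in linear editing can be sketched numerically: each analog tape-to-tape copy adds an independent noise contribution, so noise power accumulates while the signal stays fixed. The figures below are purely illustrative, not measurements of any real tape format:

```python
import math

def generation_snr_db(signal_power: float, noise_per_copy: float, generations: int) -> float:
    """Model analog generation loss: each copy adds an independent noise
    contribution, so noise power grows with the number of generations
    while the signal power stays fixed."""
    total_noise = noise_per_copy * generations
    return 10 * math.log10(signal_power / total_noise)

# Illustrative (made-up) numbers: signal power 1.0, each copy adding
# 0.001 units of noise power.
for gen in (1, 2, 3, 4):
    print(f"generation {gen}: {generation_snr_db(1.0, 0.001, gen):.1f} dB SNR")
```

Each additional generation lowers the signal-to-noise ratio, which is why editors worked from the lowest-generation copy they could.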
First non-linear editor

The first truly non-linear editor, the
CMX 600, was introduced in 1971 by
CMX Systems, a joint venture between
CBS and
Memorex. It recorded and played back black-and-white analog video recorded in "
skip-field" mode on modified
disk pack drives the size of washing machines that could store a half-hour's worth of video and audio for editing. These disk packs were commonly used to store data digitally on mainframe computers of the time. The 600 had a console with two monitors built in. The right monitor, which played the preview video, was used by the editor to make cuts and edit decisions using a
light pen. The editor selected from options superimposed as text over the preview video. The left monitor was used to display the edited video. A DEC
PDP-11 computer served as a controller for the whole system. Because the video edited on the 600 was in low-resolution black and white, the 600 was suitable only for offline editing.
The 1980s

Non-linear editing systems were built in the 1980s using computers coordinating multiple
LaserDiscs or banks of VCRs. One example of these tape and disc-based systems was Lucasfilm's
EditDroid, which used several LaserDiscs of the same raw footage to simulate random-access editing. Demonstrated at NAB in 1984, EditDroid was the first system to introduce modern non-linear editing concepts such as timeline editing and clip bins. The LA-based post house Laser Edit also had an in-house system using recordable random-access LaserDiscs. The most popular non-linear system in the 1980s was
Ediflex, which used a bank of
U-matic and
VHS VCRs for offline editing. Ediflex was introduced in 1983 on the Universal series "
Still the Beaver". By 1985 it was used on over 80% of filmed network programs and Cinedco was awarded the
Technical Emmy for "Design and Implementation of Non-Linear Editing for Filmed Programs." In 1984, the
Montage Picture Processor was demonstrated at NAB; it helped popularize the term non-linear editing over other terminology common at the time, including
real-time editing,
random-access or
RA editing,
virtual editing,
electronic film editing, and so on. Non-linear editing with computers as it is known today was first introduced by
Editing Machines Corp. in 1989 with the EMC2 editor, a PC-based non-linear off-line editing system that utilized magneto-optical disks for storage and playback of video, using half-screen-resolution video at 15 frames per second. A couple of weeks later that same year,
Avid introduced the Avid/1, the first in the line of their
Media Composer systems. It was based on the
Apple Macintosh computer platform (
Macintosh II systems were used) with special hardware and software developed and installed by Avid. The video quality of the Avid/1 (and later
Media Composer systems from the late 1980s) was somewhat low (about VHS quality), due to the use of a very early version of a
Motion JPEG (M-JPEG) codec. It was sufficient, however, to provide a versatile system for offline editing.
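To see why early M-JPEG systems settled for roughly VHS-quality offline pictures, it helps to estimate the bit rate of an M-JPEG stream, in which every frame is an independently compressed JPEG with no inter-frame prediction. The frame size, frame rate, and 20:1 compression ratio below are illustrative assumptions, not the actual settings of any Avid product:

```python
def mjpeg_data_rate_mbps(width: int, height: int, fps: int,
                         bits_per_pixel: int = 24,
                         compression_ratio: float = 20.0) -> float:
    """Estimate the bit rate of a Motion JPEG stream, where every frame
    is an independently compressed JPEG (no inter-frame prediction)."""
    raw_bits_per_frame = width * height * bits_per_pixel
    compressed_bits_per_frame = raw_bits_per_frame / compression_ratio
    return compressed_bits_per_frame * fps / 1e6

# A half-resolution NTSC-like frame at 30 fps with 20:1 JPEG
# compression (all figures illustrative):
print(f"{mjpeg_data_rate_mbps(320, 240, 30):.1f} Mbit/s")
```

Even at reduced resolution and heavy compression, sustaining a stream of a few megabits per second was demanding for early-1990s hard disks, which is one reason quality was kept low for offline work.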
Lost in Yonkers (1993) was the first film edited with Avid Media Composer, and the first long-form documentary so edited was the HBO program
Earth and the American Dream, which won a National Primetime Emmy Award for Editing in 1993. The NewTek
Video Toaster Flyer for the
Amiga included non-linear editing capabilities in addition to processing live video signals. The Flyer used
hard drives to store video clips and audio, and supported complex scripted playback. The Flyer provided simultaneous dual-channel playback, which let the Toaster's
video switcher perform transitions and other effects on video clips without additional
rendering. The Flyer portion of the Video Toaster/Flyer combination was a complete computer of its own, having its own
microprocessor and
embedded software. Its hardware included three embedded
SCSI controllers. Two of these SCSI buses were used to store video data, and the third to store audio. The Flyer used a proprietary
wavelet compression algorithm known as VTASC, which was well regarded at the time for offering better visual quality than comparable non-linear editing systems using
motion JPEG. Until 1993, the Avid Media Composer was most often used for editing commercials or other short-form, high-value projects. This was primarily because the purchase cost of the system was very high, especially in comparison to the offline tape-based systems that were then in general use. Hard disk storage was also expensive enough to be a limiting factor on the quality of footage that most editors could work with or the amount of material that could be held digitized at any one time. Up until 1992, Apple Macintosh computers could access only 50
gigabytes of storage at once. This limitation was overcome by a digital video R&D team at the
Disney Channel led by
Rick Eye. By February 1993, this team had integrated a long-form system that let Avid Media Composer, running on the Apple Macintosh, access over seven
terabytes of digital video data. With instant access to the shot footage of an entire
movie, long-form non-linear editing was now possible. The system made its debut at the
NAB conference in 1993 in the booths of the three primary sub-system manufacturers, Avid,
Silicon Graphics and
Sony. Within a year, thousands of these systems had replaced
35mm film editing equipment in major motion picture studios and TV stations worldwide. Although M-JPEG became the standard codec for NLE during the early 1990s, it had drawbacks. Its high computational requirements ruled out software implementations, imposing the extra cost and complexity of hardware compression/playback cards. More importantly, the traditional tape
workflow had involved editing from videotape, often in a rented facility. When the editor left the edit suite, they could securely take their tapes with them. But the M-JPEG data rate was too high for systems like Avid/1 on the Apple Macintosh and
Lightworks on PC to store the video on removable storage. The content needed to be stored on fixed hard disks instead. The secure tape paradigm of keeping your content with you was not possible with these fixed disks. Editing machines were often rented from facilities houses on a per-hour basis, and some productions chose to delete their material after each edit session, and then ingest it again the next day to guarantee the security of their content. In addition, each NLE system had storage limited by its fixed disk capacity. These issues were addressed by a small UK company,
Eidos Interactive. Eidos chose the new
ARM-based computers from the UK and implemented an editing system, launched in Europe in 1990 at the
International Broadcasting Convention. Because it implemented its own compression software designed specifically for non-linear editing, the Eidos system had no requirement for JPEG hardware and was cheap to produce. The software could decode multiple video and audio streams at once for real-time effects at no extra cost. But most significantly, for the first time, it supported unlimited cheap removable storage. The Eidos Edit 1, Edit 2, and later Optima systems let the editor use
any Eidos system, rather than being tied down to a particular one, and still keep their data secure. The Optima software editing system was closely tied to
Acorn hardware, so when Acorn ceased development of a successor to its
Risc PC and wound down its traditional operations in the late 1990s, this presented difficulties that Eidos's founder, Stephen Streater, attempted to overcome through an ultimately unsuccessful bid for various Acorn hardware and software technologies. Due to a corporate decision to focus on the games industry, Eidos discontinued the Optima system, and Streater left the company in 1999. In the early 1990s, a small American company called Data Translation took what it knew about coding and decoding pictures for the US military and large corporate clients and spent $12 million developing a desktop editor based on its proprietary compression algorithms and off-the-shelf parts. Their aim was to democratize the desktop and take some of Avid's market. In August 1993,
Media 100 entered the market, providing would-be editors with a low-cost, high-quality platform. Around the same period, other competitors provided non-linear systems that required special hardware—typically cards added to the computer system. Fast Video Machine was a PC-based system that first came out as an offline system, and later became more
capable of online editing. The
Imix Video Cube was also a contender for media production companies; it had a control surface with faders to allow mixing and shuttle control. Data Translation's Media 100 came with three different JPEG codecs for different types of graphics and many resolutions.
DOS-based
D/Vision Pro was released by TouchVision Systems, Inc. in the mid-1990s and worked with the
Action Media II board. These other companies caused tremendous downward market pressure on Avid. Avid was forced to continually offer lower-priced systems to compete with the Media 100 and other systems. Inspired by the success of Media 100, members of the
Premiere development team left Adobe to start a project called "Keygrip" for Macromedia. Difficulty raising support and money for development led the team to take their non-linear editor to the
NAB Show. After various companies made offers, Keygrip was purchased by Apple as Steve Jobs wanted a product to compete with Adobe Premiere in the desktop video market. At around the same time, Avid—now with Windows versions of its editing software—was considering abandoning the Macintosh platform. Apple released
Final Cut Pro in 1999, and despite not being taken seriously at first by professionals, it has evolved into a serious competitor to entry-level Avid systems.
DV

Another leap came in the late 1990s with the launch of
DV-based video formats for consumer and professional use. With DV came
IEEE 1394 (FireWire/iLink), a simple and inexpensive way of getting video into and out of computers. Users no longer had to convert video from
analog to digital—it was recorded as digital to start with—and FireWire offered a straightforward way to transfer video data without additional hardware. With this innovation, editing became a more realistic proposition for software running on standard computers. It enabled desktop editing, producing high-quality results at a fraction of the cost of earlier systems.
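The storage arithmetic helps show why DV was such a practical fit for desktop editing. DV video runs at a fixed 25 Mbit/s; the modest overhead figure assumed below for audio and stream packaging is approximate, but it brings the total close to the commonly quoted figure of roughly 13 GB per hour:

```python
def dv_storage_gb_per_hour(video_mbps: float = 25.0,
                           overhead_mbps: float = 3.8) -> float:
    """Estimate how much disk space one hour of captured DV occupies.
    DV video is a fixed 25 Mbit/s; the overhead figure for audio and
    stream packaging is an approximation, not a spec value."""
    total_bits = (video_mbps + overhead_mbps) * 1e6 * 3600
    return total_bits / 8 / 1e9  # bits -> bytes -> gigabytes

print(f"{dv_storage_gb_per_hour():.1f} GB per hour")
```

A predictable, fixed data rate in this range was low enough for late-1990s hard disks to sustain, which made software-only DV editing feasible on ordinary desktop machines.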
HD

In the early 2000s, the introduction of highly compressed HD formats such as
HDV continued this trend, making it possible to edit HD material on a standard computer running a software-only editing system.
Avid is an industry standard used for major feature films, television programs, and commercials. Final Cut Pro received a
Technology & Engineering Emmy Award in 2002. Since 2000, many personal computers have included basic non-linear video editing software free of charge. Examples include Apple
iMovie for the Macintosh platform, various open-source programs like
Kdenlive,
Cinelerra-GG Infinity and
PiTiVi for the Linux platform, and
Windows Movie Maker for the Windows platform. This phenomenon has brought low-cost non-linear editing to consumers.
The cloud

Because video editing involves such large volumes of data, how close the stored footage must be to the NLE system editing it depends partly on the capacity of the data connection between the two. The increasing availability of broadband Internet, combined with the use of lower-resolution copies of original material, makes it possible not just to review and edit material remotely but also to open up access to the same content to far more people at the same time. In 2004, the first
cloud-based video editor, known as
Blackbird and based on technology invented by
Stephen Streater, was demonstrated at
IBC and recognized by the
RTS the following year. Since that time, a number of other cloud-based editors have become available, including systems from
Avid,
WeVideo and
Grabyo. Despite their reliance on a network connection, the need to ingest material before editing can take place, and the use of lower-resolution
video proxies, their adoption has grown. Their popularity has been driven largely by efficiencies arising from opportunities for greater collaboration and the potential for cost savings derived from using a shared platform, hiring rather than buying infrastructure, and the use of conventional IT equipment over hardware specifically designed for video editing.
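The trade-off behind proxy editing can be sketched as a simple capacity check: remote playback is only smooth if the combined bitrate of the streams fits within the link. The bitrates and link speed below are illustrative assumptions, not figures from any particular cloud editor:

```python
def fits_connection(bitrate_mbps: float, link_mbps: float, streams: int = 1) -> bool:
    """A remote or cloud editor can only play material smoothly if the
    combined bitrate of all streams fits within the link capacity."""
    return bitrate_mbps * streams <= link_mbps

# Illustrative numbers: a 100 Mbit/s camera original vs. 2 Mbit/s
# proxies over a 20 Mbit/s broadband link.
print(fits_connection(100, 20))    # original footage: does not fit
print(fits_connection(2, 20, 4))   # four proxy streams: fits
```

This is why cloud editors cut against low-bitrate proxies and only conform against the full-resolution originals at the end.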
4K

4K video in NLE was fairly new, but it was being used in the creation of many movies throughout the world, owing to the increased use of advanced 4K cameras such as the
Red Camera. Examples of software for this task include
Avid Media Composer, Apple's
Final Cut Pro X,
Sony Vegas,
Adobe Premiere,
DaVinci Resolve,
Edius, and
Cinelerra-GG Infinity for Linux.
8K

8K video was relatively new. 8K video editing requires advanced hardware and software capable of handling the standard.
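A back-of-the-envelope calculation shows why 8K is demanding: even before any codec is applied, an uncompressed 8K stream generates data at a rate that strains ordinary storage and buses. The 30 fps and 3 bytes per pixel below are one representative choice, not a requirement of the format:

```python
def uncompressed_gbytes_per_second(width: int, height: int, fps: int,
                                   bytes_per_pixel: int = 3) -> float:
    """Raw data rate of an uncompressed video stream, before any codec
    is applied; useful for sizing storage and bus bandwidth."""
    return width * height * bytes_per_pixel * fps / 1e9

# 8K UHD (7680x4320) at 30 fps, 8-bit RGB (3 bytes per pixel):
print(f"{uncompressed_gbytes_per_second(7680, 4320, 30):.1f} GB/s")
```

At roughly three gigabytes per second uncompressed, even heavily compressed 8K streams leave little headroom on conventional hardware, which is why 8K NLE work leans on fast NVMe storage, hardware decoding, and proxies.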
Image editing

For imaging software, early works such as
HSC Software's Live Picture brought non-destructive editing to the professional market and current efforts such as
GEGL provide an implementation being used in open-source image editing software.