=== Instrument control ===
MIDI was invented so that electronic or digital musical instruments could communicate with each other, and so that one instrument can control another. For example, a MIDI-compatible sequencer can trigger beats produced by a drum sound module. Analog synthesizers that have no digital component and were built prior to MIDI's development can be retrofitted with kits that convert MIDI messages into analog control voltages. MIDI also provides remote control over instrument parameters such as volume and effects. Synthesizers and samplers contain various tools for shaping an electronic or digital sound.
Filters adjust
timbre, and envelopes automate the way a sound evolves over time after a note is triggered. The frequency of a filter and the attack of an envelope (the time it takes for a sound to reach its maximum level) are examples of synthesizer parameters that can be controlled remotely through MIDI. Effects devices have different parameters, such as delay feedback or reverb time. When a MIDI continuous controller number (CCN) is assigned to one of these parameters, the device responds to any messages it receives that are identified by that number. Controls such as knobs, switches, and pedals can be used to send these messages. A set of adjusted parameters can be saved to a device's internal memory as a
patch, and these patches can be remotely selected by MIDI program changes.
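At the wire level, both of these remote-control operations are short byte sequences. The sketch below (with hypothetical helper names) builds raw Control Change and Program Change messages as defined by the MIDI 1.0 channel voice message format; the use of CC 74 for filter cutoff follows a common convention, though devices are free to map controllers differently.

```python
def control_change(channel, controller, value):
    """Build a 3-byte MIDI Control Change message.

    The status byte 0xB0 carries the message type in its high nibble
    and the channel (0-15) in its low nibble; the controller number
    and value are each 7-bit (0-127).
    """
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, controller, value])

def program_change(channel, program):
    """Build a 2-byte MIDI Program Change message (patch selection)."""
    assert 0 <= channel <= 15 and 0 <= program <= 127
    return bytes([0xC0 | channel, program])

# CC 74 is conventionally assigned to filter cutoff (brightness).
print(control_change(0, 74, 100).hex())  # b04a64

# Recall patch number 5 on channel 1 (channels are zero-based on the wire).
print(program_change(0, 5).hex())        # c005
```

Sending either byte string through a MIDI interface is all a knob, pedal, or sequencer does when it adjusts a parameter or recalls a patch remotely.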
=== Composition ===
MIDI events can be sequenced with
computer software, or in specialized hardware
music workstations. Many
digital audio workstations (DAWs) are specifically designed to work with MIDI as an integral component. MIDI
piano rolls have been developed in many DAWs so that the recorded MIDI messages can be easily modified. These tools allow composers to audition and edit their work much more quickly and efficiently than older solutions, such as
multitrack recording. Compositions can be programmed for MIDI that are impossible for human performers to play. Because a MIDI performance is a sequence of commands that create sound, MIDI recordings can be manipulated in ways that audio recordings cannot. It is possible to change the key, instrumentation or tempo of a MIDI arrangement, or even edit individual notes. The ability to compose ideas and quickly hear them played back enables composers to experiment.
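Because the recording is a sequence of commands rather than audio, edits such as changing key are simple arithmetic on the message bytes. The sketch below (hypothetical helper name, assuming raw channel voice messages) transposes every note-on and note-off in a message list while leaving other messages untouched.

```python
def transpose(events, semitones):
    """Transpose note-on/note-off events in a list of raw MIDI messages.

    Each event is a bytes object. Status nibbles 0x80 (note-off) and
    0x90 (note-on) carry the note number in their second byte; all
    other message types pass through unchanged.
    """
    out = []
    for ev in events:
        status = ev[0] & 0xF0
        if status in (0x80, 0x90):
            # Shift the note number, clamped to the valid 0-127 range.
            note = max(0, min(127, ev[1] + semitones))
            out.append(bytes([ev[0], note, ev[2]]))
        else:
            out.append(ev)
    return out

# Move a middle C (note 60) up a perfect fifth (7 semitones).
events = [bytes([0x90, 60, 100]), bytes([0x80, 60, 0])]
print([e.hex() for e in transpose(events, 7)])  # ['904364', '804300']
```

An equivalent edit on an audio recording would require pitch-shifting the whole signal; here only two bytes change.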
Algorithmic composition programs provide computer-generated performances that can be used as song ideas or accompaniment. After Roland sold MPU sound chips to other sound card manufacturers, the widespread adoption of MIDI led to the development of computer-based MIDI software. Retro Innovations also makes a MIDI interface cartridge for
Tandy Color Computer and
Dragon computers. Chiptune musicians also use retro gaming consoles to compose, produce and perform music using MIDI interfaces. Custom interfaces are available for the
Family Computer/
Nintendo Entertainment System,
Game Boy,
Game Boy Advance and
Sega Mega Drive/
Sega Genesis.
=== Computer files ===
A MIDI file is not an audio recording. Rather, it is a set of instructions (for example, for pitch or tempo) and can use a thousand times less disk space than the equivalent recorded audio. Due to their tiny file size, fan-made MIDI arrangements became an attractive way to share music online before the advent of
broadband internet access and multi-gigabyte hard drives. The major drawback to this is the wide variation in quality of users' audio cards, and in the actual audio contained as samples or synthesized sound in the card, which the MIDI data only refers to symbolically. Even a sound card that contains high-quality sampled sounds can have inconsistent quality from one sampled instrument to another. A MIDI file's playback quality therefore depends entirely on the quality of the sound-producing device. The compact size of these files led to their widespread use in computers, mobile phone
ringtones, webpage authoring and musical greeting cards. These files are intended for universal use and include such information as note values, timing and track names. Lyrics may be included as
metadata, and can be displayed by
karaoke machines. Standard MIDI Files (SMFs) are created as an export format of software sequencers or hardware workstations. They organize MIDI messages into one or more parallel
tracks and time-stamp the events so that they can be played back in sequence. A
header contains the arrangement's track count and tempo, and indicates which of three SMF formats the file uses. A type 0 file contains the entire performance merged onto a single track, while type 1 files may contain any number of tracks that are performed synchronously. Type 2 files are rarely used; they store multiple arrangements, each with its own track, to be played in sequence.
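The header described above is a fixed 14-byte chunk at the start of every SMF. The sketch below (hypothetical function name, following the published SMF chunk layout) decodes it: the ASCII tag "MThd", a 4-byte length that is always 6, then three big-endian 16-bit fields for format type, track count, and time division.

```python
import struct

def parse_smf_header(data):
    """Parse the 14-byte MThd chunk at the start of a Standard MIDI File.

    Layout: b'MThd', a 4-byte big-endian length (always 6), then three
    big-endian 16-bit fields: format (0, 1 or 2), number of tracks,
    and time division (e.g. ticks per quarter note).
    """
    if data[:4] != b"MThd":
        raise ValueError("not a Standard MIDI File")
    length, fmt, ntracks, division = struct.unpack(">IHHH", data[4:14])
    if length != 6:
        raise ValueError("unexpected MThd chunk length")
    return {"format": fmt, "tracks": ntracks, "division": division}

# A minimal type-1 header: two tracks, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 1, 2, 480)
print(parse_smf_header(header))
# {'format': 1, 'tracks': 2, 'division': 480}
```

The track data itself follows as one `MTrk` chunk per track, each a stream of delta-time-stamped events.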
=== RMID files ===
Microsoft Windows bundles SMFs together with
Downloadable Sounds (DLS) in a
Resource Interchange File Format (RIFF) wrapper, as
RMID files with a .rmi extension. RIFF-RMID has been
deprecated in favor of
Extensible Music Files (
XMF).
=== Software ===
The main advantage of the personal computer in a MIDI system is that it can serve a number of different purposes, depending on the software that is loaded. Sequencers may take alternate forms, such as drum pattern editors that let users create beats by clicking on pattern grids. Notation programs include
Finale,
Encore,
Sibelius,
MuseScore and
Dorico.
SmartScore software can produce MIDI files from
scanned sheet music.
=== Editors and librarians ===
Patch editors allow users to program their equipment through a computer interface. These became essential with the appearance of complex synthesizers such as the Yamaha FS1R, which contained several thousand programmable parameters but had an interface that consisted of fifteen tiny buttons, four knobs and a small LCD. Digital instruments typically discourage users from experimentation, due to their lack of the feedback and direct control that switches and knobs provide. Some editors are designed for a specific instrument or effects device, while other,
universal editors support a variety of equipment, and ideally can control the parameters of every device in a setup through the use of System Exclusive messages. Universal editor/librarians, which combine patch editing with patch storage and organization, were once common, and included Opcode Systems' Galaxy,
eMagic's SoundDiver, and MOTU's Unisyn. Although these older programs have been largely abandoned with the trend toward computer-based synthesis using virtual instruments, several editor/librarians remain available, including Coffeeshopped Patch Base, Sound Quest's Midi Quest, and several editors from Sound Tower.
Native Instruments' Kore was an effort to bring the editor/librarian concept into the age of software instruments, but was abandoned in 2011.
=== Auto-accompaniment programs ===
Programs that can dynamically generate accompaniment tracks are called
auto-accompaniment programs. These create a full-band arrangement in a style that the user selects and send the result to a MIDI sound-generating device for playback. The generated tracks can be used as educational or practice tools, as accompaniment for live performances, or as a songwriting aid.

=== Synthesis and sampling ===
Synthesizers implemented in software are subject to timing issues that are not necessarily present with hardware instruments, whose dedicated operating systems are not subject to interruption from background tasks as desktop
operating systems are. These timing issues can cause synchronization problems, and clicks and pops when sample playback is interrupted. Software synthesizers also may exhibit additional
latency in their sound generation. The roots of software synthesis go back as far as the 1950s, when
Max Mathews of
Bell Labs wrote the
MUSIC-N programming language, which was capable of non-real-time sound generation. Reality, by Dave Smith's
Seer Systems, was an early synthesizer that ran directly on a host computer's CPU. Reality achieved low latency through tight driver integration, and therefore could run only on
Creative Labs soundcards. Syntauri Corporation's Alpha Syntauri was another early software-based synthesizer. It ran on the Apple IIe computer and used a combination of software and the computer's hardware to produce additive synthesis. Some systems use dedicated hardware to reduce the load on the host CPU, as with
Symbolic Sound Corporation's Kyma System, which can power an entire recording studio's worth of instruments,
effect units, and
mixers. The ability to construct full MIDI arrangements entirely in computer software allows a composer to render a finalized result directly as an audio file. Early computer sound cards generated MIDI playback with FM synthesis, which was often described as "primitive". Wavetable daughterboards that were later available provided audio samples that could be used in place of the FM sound. These were expensive, but often used the sounds from respected MIDI instruments such as the
E-mu Proteus.
=== Other applications ===
Despite its association with music devices, MIDI can control any electronic or digital device that can read and process a MIDI command. MIDI has been adopted as a control protocol in a number of non-musical applications.
MIDI Show Control uses MIDI commands to direct stage lighting systems and to trigger cued events in theatrical productions.
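MIDI Show Control commands travel as universal real-time System Exclusive messages. The sketch below (hypothetical helper name, following the published MSC message layout: F0 7F, device ID, 02, command format, command, data, F7) builds a lighting "GO" message that fires a numbered cue, with the cue number encoded as ASCII digits.

```python
def msc_go(device_id, cue):
    """Build a MIDI Show Control 'GO' System Exclusive message.

    Layout: F0 7F <device_id> 02 <command_format> <command> <data> F7.
    Command format 0x01 addresses lighting equipment, and command
    0x01 is GO; the cue number is sent as ASCII text.
    """
    assert 0 <= device_id <= 0x7F
    return (bytes([0xF0, 0x7F, device_id, 0x02, 0x01, 0x01])
            + cue.encode("ascii")
            + bytes([0xF7]))

# Tell the lighting controller with device ID 1 to run cue 5.1.
print(msc_go(1, "5.1").hex())  # f07f01020101352e31f7
```

A lighting desk or show controller listening for its device ID executes the cue when this byte string arrives, which is how a single MIDI sequence can drive sound and stage lighting in step.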
VJs and
turntablists use it to cue clips and to synchronize equipment, and recording systems use it for synchronization and
automation. Wayne Lytle, the founder of
Animusic, developed a system he dubbed MIDIMotion, which he used to produce the
Animusic series of computer-animated music video albums. Animusic later designed its own animation software specifically for MIDIMotion called Animotion.
Apple Motion allows for a similar control of animation parameters through MIDI. The 1987
first-person shooter game
MIDI Maze and the 1990
Atari ST puzzle video game Oxyd use MIDI to network computers together.

== Devices ==