# MIDI usage and applications

Many extensions of the original MIDI 1.0 specification have been jointly standardized by the MIDI Manufacturers Association (MMA) in the US and the Association of Musical Electronics Industry (AMEI) in Japan. General MIDI (GM), an MMA standard, defines a standard instrument program number map. Its successor, General MIDI Level 2 (GM2), developed by companies in Japan's AMEI, extended the instrument palette, specified more message responses, and defined new messages; GM2 later became the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications.

In addition to the original 31.25 kbaud current-loop, 5-pin DIN transport, transmission of MIDI streams over USB, IEEE 1394 (FireWire), and Ethernet is now common. MIDI is also used as a control protocol in applications other than music, including show control and theatre lighting.

## Extensions of the MIDI standard

Many extensions of the original MIDI 1.0 specification have been jointly standardized by the MIDI Manufacturers Association (MMA) in the US and the Association of Musical Electronics Industry (AMEI) in Japan. Only a few of them are described here; for more comprehensive information, see the MMA web site.

### General MIDI

The General MIDI (GM) and General MIDI 2 (GM2) standards define a MIDI instrument's response to the receipt of a defined set of MIDI messages. As such, they allow a given, conformant MIDI stream to be played on any conformant instrument. Although dependent on the basic MIDI 1.0 specification, the GM and GM2 specifications are each separate from it. Consequently, it is not generally safe to assume that any given MIDI message stream or MIDI file is intended to drive GM-compliant or GM2-compliant MIDI instruments.

At heart, these specifications resolve certain ambiguities in the MIDI message protocol. In MIDI, instruments (one per channel) are selected by number (0-127), using the Program Change message. However, the basic MIDI 1.0 specification did not specify what instrument sound (piano, tuba, etc.) corresponds to each number. This was intentional, as MIDI originated as a professional music protocol, and in that context it is typical for a performer to assemble a custom palette of instruments appropriate for their particular repertoire, rather than taking a least-common-denominator approach.

Eventually, interest developed in adapting MIDI as a consumer content format and for computer multimedia applications. In this context, for MIDI file content to be portable, the instrument program numbers used must call up the same instrument sound on every player. General MIDI (GM) was the MMA's attempt to resolve this problem by standardising an instrument program number map, so that, for example, Program Change 1 always results in a piano sound on all GM-compliant players. GM also specified the response to certain other MIDI messages more strictly than the MIDI 1.0 specification. The GM spec is maintained and published by the MMA.

From a musical perspective, GM has a mixed reputation, mainly because of small or large audible differences in corresponding instrument sounds across player implementations, the limited size of the instrument palette (128 instruments), its lowest-common-denominator character, and the inability to add customised instruments to suit the needs of the particular piece. Yet the GM instrument set is still included in most MIDI instruments, and from a standardisation perspective GM has proven durable.

General MIDI 1 was introduced in 1991.

### General MIDI 2

Later, companies in Japan's AMEI developed General MIDI Level 2 (GM2), incorporating aspects of the Yamaha XG and Roland GS formats, extending the instrument palette, specifying more message responses in detail, and defining new messages for custom tuning scales and more. The GM2 specs are maintained and published by the MMA and AMEI.

General MIDI 2 was introduced in 1999 and last amended in February 2007.

### SP-MIDI

Later still, GM2 became the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications where different players may have different numbers of musical voices. SP-MIDI is a component of the 3GPP mobile phone terminal multimedia architecture, starting from release 5.

GM, GM2, and SP-MIDI are also the basis for selecting player-provided instruments in several of the MMA/AMEI XMF file formats (XMF Type 0, Type 1, and Mobile XMF), which allow extending the instrument palette with custom instruments in the Downloadable Sound (DLS) formats, addressing another major GM shortcoming.

### Alternate Hardware Transports

In addition to the original 31.25 kbaud current-loop, 5-pin DIN transport, transmission of MIDI streams over USB, IEEE 1394 (FireWire), and Ethernet is now common (see below).

### MIDI over Ethernet

Compared to USB or FireWire, the Ethernet implementation of MIDI provides network routing capabilities, which are extremely useful in studio or stage environments (USB and FireWire are restricted to connections between one computer and some devices and do not provide any routing capabilities). Moreover, Ethernet can supply the high-bandwidth channel that earlier alternatives to MIDI (such as ZIPI) were intended to provide.

After initial competition between different protocols (IEEE-P1639, MIDI-LAN, IETF RTP-MIDI), the IETF's RTP-MIDI specification for transport of MIDI streams over Ethernet and the Internet is now spreading quickly, as more and more manufacturers (Apple, CME, Kiss-Box, etc.) integrate RTP-MIDI into their products. Mac OS X, Windows, and Linux drivers are also available to make RTP-MIDI devices appear as standard MIDI devices within these operating systems.

IEEE-P1639 is now a dead project. The other proprietary MIDI/IP protocols are slowly disappearing one after the other, since most of them require expensive licensing to implement (while RTP-MIDI is completely open) or bring no real advantage (apart from speed) over the original MIDI protocol.

### RTP-MIDI Transport Protocol

The RTP-MIDI protocol was officially published by the IETF in December 2006 as RFC 4695. RTP-MIDI relies on the well-known RTP (Real-time Transport Protocol) layer (most often running over UDP, but also compatible with TCP), widely used for real-time audio and video streaming over networks.

The RTP layer is extremely easy to implement and requires very little processing power, yet it already provides very useful information to the receiver (network latency, packet loss, reordered packets, etc.). RTP-MIDI defines a specific payload type that allows the receiver to identify MIDI streams.
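The simplicity of the RTP layer can be illustrated with a sketch of its fixed 12-byte header (RFC 3550), after which a MIDI payload would follow. The function name is illustrative, and payload type 97 is just an example from RTP's dynamic range, not a value assigned by the RTP-MIDI specification.

```python
import struct

def rtp_header(payload_type, seq, timestamp, ssrc):
    """Pack a minimal 12-byte RTP header (RFC 3550).

    First byte 0x80: version 2, no padding, no extension, no CSRCs.
    Second byte: marker bit clear plus the 7-bit payload type that
    lets the receiver identify the stream (here, a MIDI stream).
    The sequence number and timestamp let the receiver detect loss
    and reordering; the SSRC identifies the sending source.
    """
    return struct.pack("!BBHII", 0x80, payload_type & 0x7F,
                       seq & 0xFFFF, timestamp, ssrc)

# The MIDI command section would follow these 12 bytes in the datagram.
header = rtp_header(payload_type=97, seq=1, timestamp=0, ssrc=0x5EED1234)
```

Because the header is fixed-size and fields are plain big-endian integers, even a small microcontroller can parse it.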

RTP-MIDI does not alter the MIDI messages in any way (all messages defined in the MIDI specification are transported transparently over the network), but it adds specific functionality, such as timestamping and sysex fragmentation. RTP-MIDI also adds a powerful mechanism, called journalling, that allows the receiver not only to detect the loss of MIDI messages in the network but also to recover the lost information.

The first part of the RTP-MIDI specification is mandatory to implement and describes how MIDI messages are encapsulated within the RTP telegram. It also describes how the journalling system works. Note that use of the journalling system itself is not mandatory (journalling is of limited use for LAN applications, but it is very important for WAN applications).

The second part of the RTP-MIDI specification describes the session control mechanisms that allow multiple stations to synchronize across the network in order to exchange RTP-MIDI telegrams. This part is informational only; RTP-MIDI implementations are not required to use the described mechanisms.

RTP-MIDI is included in Apple's Mac OS X as standard MIDI ports: the RTP-MIDI ports appear in Macintosh applications like any other USB or FireWire port, so any MIDI application running on Mac OS X can use RTP-MIDI transparently. However, Apple's developers considered the session control protocol described in the IETF's specification far too complex and created their own session control protocol. Since this session protocol uses a different UDP port from the main RTP-MIDI stream port, the two protocols do not interfere, and the RTP-MIDI implementation in Mac OS X fully complies with the IETF specification.

Apple's implementation has been used as a reference by other MIDI manufacturers. A Windows XP RTP-MIDI driver has also been released by the Dutch company Kiss-Box, and a Linux implementation is under development by the Grame association. It is therefore quite probable that Apple's implementation will become the de facto standard (and could even become the MMA reference implementation).

[1] IETF RTP-MIDI specification http://www.rfc-editor.org/rfc/rfc4695.txt
[3] Grame's website http://www.grame.fr

### Alternate Tunings

Instruments that receive MIDI generally use the conventional 12-pitch-per-octave equal temperament tuning system. Unfortunately, this tuning system makes many types of music inaccessible, because the music depends on a different intonation system. To address this issue in a standardized manner, the MMA ratified the MIDI Tuning Standard (MTS) in 1992. This standard allows MIDI instruments that support MTS to be tuned in any way desired, through the use of a MIDI Non-Real Time System Exclusive message.

MTS uses three bytes, which can be thought of as a three-digit number in base 128, to specify a pitch in logarithmic form. The following formula gives the pitch value those bytes encode for a given frequency $f$ in hertz:

$p = 69 + 12 \log_2\left(\frac{f}{440}\right)$

For a note in A440 equal temperament, this formula delivers the standard MIDI note number. Any other frequencies fill the space evenly.
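The mapping from the formula to the three data bytes can be sketched as follows. The function name is hypothetical; the encoding shown (one semitone byte plus a 14-bit fraction split across two 7-bit bytes, each step being 1/16384 of a semitone) follows the MTS convention.

```python
import math

def freq_to_mts_bytes(f_hz):
    """Encode a frequency in Hz as the three MTS data bytes.

    p = 69 + 12*log2(f/440) gives the pitch as a fractional MIDI
    note number; the integer part becomes the first byte and the
    fraction is split across two 7-bit bytes, each step being
    1/16384 of a semitone.
    """
    p = 69 + 12 * math.log2(f_hz / 440.0)
    semitone = int(p)
    frac = round((p - semitone) * 16384)   # 14-bit fractional part
    if frac == 16384:                      # rounding reached the next semitone
        semitone, frac = semitone + 1, 0
    return semitone, (frac >> 7) & 0x7F, frac & 0x7F
```

For A440 the fractional part is zero, so the result is simply the standard MIDI note number 69 followed by two zero bytes.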

While support for MTS is not particularly widespread in commercial hardware instruments, it is nonetheless supported by some instruments and software, for example the free software programs TiMidity and Scala (program), as well as other microtuners.

## Other applications of MIDI

MIDI is also routinely used as a control protocol in applications other than music, including show control and theatre lighting.

Such non-musical applications of MIDI are possible because any device built with a standard MIDI Out connector should in theory be able to control any other device with a MIDI In port, just as long as the developers of both devices have the same understanding about the semantic meaning of all the MIDI messages the sending device emits. This agreement can come either because both follow the published MIDI specifications, or else in the case of any non-standard functionality, because the message meanings are agreed upon by the two manufacturers.

## MIDI controllers as hardware and software

Note: The term MIDI controller is used in two different ways. (1) In one sense, a MIDI controller is a hardware or software entity able to transmit MIDI messages via a MIDI Out connector to other devices with MIDI In connectors. (2) In the other (more technical) sense, a MIDI controller is any parameter in a device with a MIDI In connector that can be set with the MIDI Control Change message. For example, a synthesizer may use controller number 18 for a low-pass filter's frequency; to open and close that filter with a physical slider, a user would assign the slider to transmit controller number 18. Then, all changes in the slider position will be transmitted as MIDI Control Change messages with the controller number field set to 18; when the synthesizer receives the messages, the filter frequency will change accordingly.
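The slider example above can be made concrete as a raw message. The helper name is illustrative, but the byte layout (status byte 0xB0 plus the channel, then the controller number and value) is the standard Control Change format.

```python
def control_change(channel, controller, value):
    """Build a 3-byte MIDI Control Change message.

    Status byte: 0xB0 ORed with the 4-bit channel (0-15);
    two data bytes: controller number and value, each 0-127.
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("argument out of range")
    return bytes([0xB0 | channel, controller, value])

# Slider at mid-travel driving the (synth-specific) filter controller 18:
msg = control_change(channel=0, controller=18, value=64)
```

Each slider movement would emit one such message; the receiving synthesizer maps controller 18 to its filter frequency.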

The following are classes of MIDI controller (using definition 1 above):

• The human interface component of a traditional instrument redesigned as a MIDI input device. The most common type of device in this class is the keyboard controller. Such a device provides a musical keyboard and perhaps other actuators (pitch bend and modulation wheels, for example) but produces no sound on its own. It is intended only to drive other MIDI devices. Percussion controllers such as the Roland Octapad fall into this class, as do guitar-like controllers such as the SynthAxe and a variety of wind controllers.
• Electronic musical instruments, including synthesizers, samplers, drum machines, and electronic drums, which are used to perform music in real time and are inherently able to transmit a MIDI data stream of the performance.
• Pitch-to-MIDI converters including guitar/synthesizers analyze a pitch and convert it into a MIDI signal. There are several devices which do this for the human voice and for monophonic instruments such as flutes, for example.
• Traditional instruments such as drums, pianos, and accordions which are outfitted with sensors and a computer which accepts input from the sensors and transmits real-time performance information as MIDI data.
• Sequencers, which store and retrieve MIDI data and send the data to MIDI enabled instruments in order to reproduce a performance.
• The MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) is an industry standard ratified by the MIDI Manufacturers Association in 1991 which allows all types of media control devices to talk with each other and with computers to perform show control functions in live and canned entertainment applications. Just like musical MIDI (above), MSC does not transmit the actual show media — it simply transmits digital data providing information such as the type, timing and numbering of technical cues called during a multimedia or live theatre performance.
• MIDI Machine Control (MMC) devices such as recording equipment, which transmit messages to aid in the synchronization of MIDI-enabled devices. For example, a recorder may have a feature to index a recording by measure and beat. The sequencer that it controls would stay synchronized with it as the recorder's transport controls are pushed and corresponding MIDI messages transmitted.

## MIDI controllers in the data stream

Note: As described in the previous section, the term MIDI controller is used in two different ways: (1) a hardware or software entity able to transmit MIDI messages to other devices, and (2) any parameter in a receiving device that can be set with the MIDI Control Change message.

This section uses the second definition of "MIDI controller".

Performance modifier controls such as modulation wheels, pitch bend wheels, sustain pedals, pitch sliders, buttons, knobs, faders, switches, and ribbon controllers can alter an instrument's state of operation, and thus can be used to modify sounds or other parameters of a music performance. Because MIDI includes messages for representing such controller events, they can be sent in real time over MIDI connections. MIDI makes approximately 120 virtual controller numbers (addresses) available for this purpose, i.e. for connecting the actual buttons, knobs, wheels, and sliders with their intended actions within the receiving device. The value range of the Control Change message is 128 steps (0 to 127), but the first 32 controller numbers (including, for example, Volume) are each allocated an additional 7 bits of "Least Significant Byte" precision, for a total of 14 bits or a range of 0-16383 (although many manufacturers do not implement this increased resolution).
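The coarse/fine pairing can be sketched as follows; the function name is hypothetical, but the convention is standard: controllers 0-31 carry the most significant 7 bits, and the matching LSB controller is numbered 32 higher.

```python
def coarse_fine_cc(channel, controller, value14):
    """Split a 14-bit value into an MSB/LSB Control Change pair.

    Controllers 0-31 carry the most significant 7 bits; the matching
    LSB controller is numbered 32 higher (e.g. Volume: 7 and 39).
    """
    if not (0 <= controller <= 31 and 0 <= value14 <= 16383):
        raise ValueError("argument out of range")
    status = 0xB0 | (channel & 0x0F)
    msb, lsb = (value14 >> 7) & 0x7F, value14 & 0x7F
    return (bytes([status, controller, msb]),
            bytes([status, controller + 32, lsb]))

# Full-scale Volume (controller 7) sent as a 14-bit coarse/fine pair:
msb_msg, lsb_msg = coarse_fine_cc(0, 7, 16383)
```

Receivers that ignore the LSB controller still get a usable 7-bit value from the MSB message alone, which is why the scheme degrades gracefully on equipment that does not implement the extra resolution.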

Some controller functions are special. Pitch bend, for example, has a dedicated MIDI data range of 16,384 steps; this higher resolution makes it possible to produce the illusion of a continuously sliding pitch, as in a violin's portamento, rather than a series of zippered steps, as when a guitarist slides fingers up the frets of the guitar's neck. At the MIDI message stream level, pitch bend and key pressure use dedicated messages (Pitch Bend Change, Polyphonic Key Pressure, or Channel Pressure) instead of the ordinary Control Change message. There is a trade-off, however: the pitch wheel and/or key pressure functions of a MIDI keyboard can, depending on the performance, generate large amounts of data, which can in turn slow data throughput on the MIDI connection. This can be remedied by using a sequencer to "thin" pitch bend (or any other continuous controller) data down to a limited number of messages per second, or down to only messages that change the controller value by at least a certain amount.
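Both points can be sketched briefly. The Pitch Bend Change message carries its 14-bit value with the low 7 bits first (8192 meaning no bend), and a minimal thinning filter drops values that moved less than a chosen threshold since the last value kept; the function names and the threshold are illustrative.

```python
def pitch_bend(channel, bend):
    """Pitch Bend Change: status 0xE0 | channel, then the 14-bit
    value sent low 7 bits first; 8192 means no bend."""
    return bytes([0xE0 | (channel & 0x0F), bend & 0x7F, (bend >> 7) & 0x7F])

def thin(values, min_delta=128):
    """Keep only bend values that moved at least min_delta steps
    since the last value kept (a crude sequencer-style thinner)."""
    kept, last = [], None
    for v in values:
        if last is None or abs(v - last) >= min_delta:
            kept.append(v)
            last = v
    return kept
```

A wheel sweep that would produce hundreds of messages can thus be reduced to a handful while remaining audibly smooth on most instruments.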

The original MIDI spec included approximately 120 virtual controller numbers for real-time modification of live instruments or their audio. MIDI Show Control (MSC) and MIDI Machine Control (MMC) are two separate extensions of the original MIDI spec that expand the protocol well beyond its original intent.