Back in January, the MIDI Manufacturers Association (MMA) held its annual meeting. There they approved a new standard for interoperability between MIDI devices that will pave the way to a better future. As my contribution to “May is MIDI Month,” I’d like to share some highlights of the recent changes. But first, here’s some background on the current MIDI standard.
The Origins of MIDI
The MIDI Standard was first published in August of 1983. MIDI stands for “Musical Instrument Digital Interface.” It was created to allow any digital musical instrument to interact with any other using a single, inexpensive cable. We take this for granted today, but at the time it was nothing short of miraculous. Imagine creating something like “The Well-Tempered Synthesizer” without MIDI! The standard describes the configuration of the connection cable and its electrical requirements, and defines a serial communications protocol for sending and receiving musical data.
The protocol has enabled a wide array of applications, including allowing devices to:
- Trigger note playing with a deep array of control parameters
- Control recording equipment (MIDI Machine Control) or theatrical performance systems (MIDI Show Control)
- Synchronize with other devices (MIDI Time Code or MIDI beat clock)
In addition, it has brought about a specification for a set of sounds that guarantees a MIDI performance will sound reasonably good, no matter what device is used as the sound module (General MIDI).
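To make the note-triggering item above concrete, here is a minimal sketch (in Python, not part of the MIDI specification itself) of how a MIDI 1.0 Note On message is encoded: a status byte combining the message type with a 4-bit channel number, followed by two 7-bit data bytes.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Encode a MIDI 1.0 Note On message.

    channel:  0-15 (shown to users as channels 1-16)
    note:     0-127 (60 = middle C)
    velocity: 0-127 (0 is conventionally treated as Note Off)
    """
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    status = 0x90 | channel          # 0x9n = Note On for channel n
    return bytes([status, note, velocity])

# Middle C on channel 1 at velocity 100:
print(note_on(0, 60, 100).hex())     # -> "903c64"
```

Note that every data byte keeps its high bit clear, which is exactly why MIDI 1.0 values are limited to 7 bits per byte.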
Having been created in 1983, MIDI’s design could not have anticipated all the developments of the last 35 years. The emergence of the Digital Audio Workstation, with its high-powered musical sequencing tools, has given composers an unparalleled ability to control the finest details of a performance. But more control demands more precision and the ability to send a much larger amount of data between devices.
MIDI was built around sending 7-bit (or, in some cases, 14-bit) control messages (today’s computers routinely process 64-bit values) at a rate of 31,250 bits per second (the average home in the US has access to internet that is over 2,000 times faster). It is clear that an improved specification is needed if MIDI is to operate at the level we expect from today’s common devices.
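Those two numbers compound each other. A quick sketch (mine, not from the article) shows how a 14-bit controller value is squeezed through the 7-bit protocol as a paired MSB/LSB Control Change, and how long that pair occupies the 31,250 bps wire:

```python
def cc_14bit(channel: int, controller: int, value: int) -> bytes:
    """Send a 14-bit controller value as an MSB/LSB Control Change pair.

    MIDI 1.0 pairs controllers 0-31 (MSB) with 32-63 (LSB), so e.g.
    CC 7 (channel volume) uses CC 39 for its low 7 bits.
    """
    assert 0 <= controller <= 31 and 0 <= value <= 16383
    status = 0xB0 | channel                       # 0xBn = Control Change
    msb, lsb = (value >> 7) & 0x7F, value & 0x7F  # split into two 7-bit bytes
    return bytes([status, controller, msb,
                  status, controller + 32, lsb])

# Each byte on a MIDI 1.0 wire takes 10 bit-times (start + 8 data + stop),
# so this single 6-byte pair ties up the cable for almost 2 milliseconds:
msg = cc_14bit(0, 7, 12345)
print(len(msg) * 10 * 1000 / 31250)   # -> 1.92 (milliseconds)
```

Two milliseconds for one high-resolution gesture sample, on one controller, on one channel, is exactly the kind of ceiling a next-generation protocol needs to lift.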
But before that can happen, work has to be done to ensure that older devices can still be used with newer ones, and that is where the recently adopted MIDI-CI standard comes in. MIDI-CI stands for “MIDI Capability Inquiry.” The new standard gives MIDI device manufacturers three areas of new functionality:
Protocol Negotiation

This aspect of the specification lets future devices inquire about a connected device’s capabilities, then use a next-generation MIDI protocol if one is available, or fall back to MIDI 1.0 if not. When that next-generation protocol is approved, capabilities like higher control resolution, more channels, improved performance, and greater multi-dimensional expressiveness will become possible. The MIDI-CI specification will allow those new devices to interoperate with any other MIDI device as best they can. That next-generation protocol is still in the near future, but MIDI-CI also supports some new features that manufacturers can adopt today…
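The fallback logic just described can be sketched as follows. This is only an outline of the flow, with invented class and method names standing in for the actual SysEx-based discovery handshake; it is not a real MIDI library API.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical stand-ins for the MIDI-CI discovery exchange; the names
# here are illustrative, not from the specification.

@dataclass
class InquiryReply:
    supported_protocols: List[str]

@dataclass
class Device:
    # reply=None models a pre-MIDI-CI device that simply ignores
    # the capability inquiry message.
    reply: Optional[InquiryReply] = None

    def send_capability_inquiry(self) -> Optional[InquiryReply]:
        return self.reply

def negotiate_protocol(device: Device) -> str:
    """Use the best protocol the device reports; fall back to MIDI 1.0."""
    reply = device.send_capability_inquiry()
    if reply is not None and "next-gen" in reply.supported_protocols:
        return "next-gen"
    return "MIDI 1.0"   # older devices keep working unchanged

print(negotiate_protocol(Device()))                                   # -> MIDI 1.0
print(negotiate_protocol(Device(InquiryReply(["next-gen", "1.0"]))))  # -> next-gen
```

The key design point is the silent fallback: a device that never answers the inquiry is treated exactly like a 1983-era instrument, which is how backward compatibility is preserved.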
Profile Configuration

Profile Configuration defines a set of MIDI messages for a particular kind of instrument, essentially standardizing how classes of instruments are controlled between devices. If you have a spot-on Hammond B3 tone module and a generic MIDI controller that supports MIDI-CI, then once you connect the devices and load the organ profile, your mod wheel can automatically take on the role of spinning up the Leslie simulator (assuming the mod wheel is the most appropriate control). The days of manually mapping controllers for common instruments will soon be over.
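In spirit, a profile is a table mapping instrument roles to the most appropriate physical controls. The sketch below is purely illustrative (the profile contents and names are invented, not taken from any published profile), but it shows the kind of automatic binding the spec enables:

```python
# Invented example profile: instrument role -> preferred physical control.
ORGAN_PROFILE = {
    "rotary_speaker_speed": "mod_wheel",
    "drawbar_1": "slider_1",
}

def apply_profile(profile: dict, available_controls: set) -> dict:
    """Bind each role in the profile to a control the surface actually has."""
    return {role: control
            for role, control in profile.items()
            if control in available_controls}

# A controller with a mod wheel but no sliders gets only the bindings
# it can actually honor:
mapping = apply_profile(ORGAN_PROFILE, {"mod_wheel", "pitch_bend"})
print(mapping)   # -> {'rotary_speaker_speed': 'mod_wheel'}
```

The real mechanism negotiates all of this over MIDI-CI messages rather than a Python dictionary, but the outcome is the same: the mapping happens without the user touching a MIDI-learn screen.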
Property Exchange

Property Exchange allows a device to ask another device about a number of its properties, such as its patch list, the names of its available controllers, and their resolutions. With MIDI-CI it will now be possible for Finale to ask your MIDI keyboard for its list of patches and populate a patch selector dialog directly from the device, without any manual configuration. If you create a new patch on your MIDI keyboard, Finale could pick that up automatically too, and add it to your choices.
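MIDI-CI carries property data as JSON over System Exclusive messages. The round trip below is a hedged sketch of that idea: the resource name, the responder function, and the patch data are all illustrative assumptions, not a real device API.

```python
import json

def fake_device_respond(resource: str) -> str:
    """Stand-in for a keyboard's Property Exchange responder (hypothetical)."""
    data = {"PatchList": [{"bank": 0, "program": 0,  "name": "Grand Piano"},
                          {"bank": 0, "program": 16, "name": "Drawbar Organ"}]}
    return json.dumps(data.get(resource, []))

def get_patch_names(resource: str = "PatchList") -> list:
    """What a notation app might do to populate its patch selector."""
    reply = json.loads(fake_device_respond(resource))
    return [patch["name"] for patch in reply]

print(get_patch_names())   # -> ['Grand Piano', 'Drawbar Organ']
```

Swap the fake responder for the real SysEx transport and this is the shape of the feature: the application asks for a named resource and gets structured data back, no configuration files required.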
It is going to be a while before software and hardware manufacturers fully support these new capabilities, and it will take some pushing to get the best of what MIDI-CI provides widely adopted. But now that you know what the new capabilities are, you are well equipped to ask for what you want…
Photo of Yutaka Hasegawa, Chairman of AMEI (the Japanese MIDI organization), speaking at the 2018 MMA meeting is used by permission of the MMA.
Matthew Logan is the tech lead for Finale and has been developing multimedia software since the early ’90s. He has developed software for science museum exhibits, games (including scoring and recording the audio), notation software, digital audio workstations, software synthesizers, guitar processors, web-based video production, and web-based mastering.
Matthew lives near Boulder, Colorado with his wife Julie and daughter Miriam. Matthew enjoys outdoor recreation, mixing sound at his church, playing guitar, composing, and learning about virtual reality programming with his daughter.