MIDI stands for Musical Instrument Digital Interface and is one of the few true standards we have in music.
In short it is a way to digitize instrumental data. Using MIDI we can capture, for example, the way a key on a keyboard is hit and transfer all of that to another instrument, which can then fully reproduce the way the original instrument was played.
It should come as no surprise that Max for Live (‘M4L’) is fully capable of working with MIDI. However, because we’re working within Ableton Live there are some limitations to keep in mind. These limits seem to confuse quite a few people, so I felt a blog post was in order…
MIDI routing in Live
Live is very specific when it comes to signal routing. For example: if you have an audio track, then the devices sitting on that track will only ever encounter audio data. If you have a MIDI track, on the other hand, then the type of data being processed depends heavily on the location of the device:
Here you see the device chain of a MIDI track; you can tell because the first signal meter shows round bulbs, which indicates that this track's input consists of MIDI data. The first device, Scale, is a MIDI effect: MIDI comes in and MIDI goes out again. As you'd expect, the meter behind the device therefore looks the same as the one in front of it.
Then we have an instrument, Analog. Instruments process MIDI data and turn it into audio, and indeed: behind the Analog device we see a regular level meter, which indicates that audio data is being processed.
Finally there is the Limiter device, which is an audio effect. As such, audio comes in and audio goes out again.
Live is very strict about this policy; it routes audio and MIDI data separately and the two cannot be interchanged. So you won't find MIDI effects or MIDI data on the device chain of an audio track. You can find audio effects on the device chain of a MIDI track, but only behind an instrument.
The reason I'm explaining all this is that there seems to be quite some confusion about it among Max programmers who have found their way into Live. When programming in Max you deal with MIDI channels directly, and so you explicitly tell Max which MIDI channels you want to access. With Live, on the other hand, you need to rely on the way Live routes the different signals.
So if you want to make sure you're receiving MIDI data from a specific MIDI device, you'll have to pull up Live's mixer section (as seen on the right) and manually select which device Live should monitor for incoming MIDI data.
This is also why you cannot use different MIDI channels in a Max for Live device, even though you can access them individually in a regular Max patch. In a Max for Live patch the MIDI channel will always be 1.
But now I'm getting a little ahead of myself; let's start by looking at how MIDI is handled in Max for Live:
MIDI in Max for Live
A default Max MIDI effect consists of two objects: midiin, which is where the MIDI data enters the patch, and midiout which, as the name says, is where the processed data leaves the patch.
So far, so good. But before we go on: what exactly is this MIDI data anyway?
MIDI is a protocol used to store the way you play a musical instrument, in such a way that other MIDI-capable devices can process and reproduce the original performance.
To this end several aspects of your playing are stored. For example, with an electronic (MIDI) keyboard the note you're playing is stored, and if the keyboard is velocity sensitive it can also store the speed with which you struck the key, and so on. An example:
When I hit C3 on my keyboard it produces three values: 144, 60 and 77. When I release the key we get 128, 60 and 64. So what do these numbers tell us? Well, not much on their own, because this is raw data.
This is the patch I made to produce the data you see above. It also shows how you can extract ('parse') the raw MIDI data into something more useful; for this we use the midiparse object.
As you can see, what this object basically does is pick up the raw MIDI data and divide it into several parts. Keep in mind, however, that not all of these parts are being used in Max for Live.
For example, even though the midiparse object can check which MIDI channel is being used, this value will always be 1 because MIDI objects in Live can only use a single channel.
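To make the raw numbers from the earlier example less mysterious, here is a minimal sketch (in Python, not Max) of the kind of splitting midiparse performs. The status byte carries both the message type (high nibble) and the MIDI channel (low nibble); the function name and output format are my own invention.

```python
def parse_midi(status, data1, data2):
    """Split a 3-byte MIDI message into (type, channel, data1, data2)."""
    msg_type = status & 0xF0        # high nibble: 0x90 = note on, 0x80 = note off
    channel = (status & 0x0F) + 1   # low nibble: channel, counted 1-16 as in Max
    names = {0x90: "note on", 0x80: "note off", 0xB0: "control change"}
    return names.get(msg_type, "other"), channel, data1, data2

print(parse_midi(144, 60, 77))  # ('note on', 1, 60, 77)
print(parse_midi(128, 60, 64))  # ('note off', 1, 60, 64)
```

This also explains why 144 and 128 looked so cryptic: 144 is 0x90 (note on, channel 1) and 128 is 0x80 (note off, channel 1) in hexadecimal.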
Note messages & control messages
Another important detail is the 'control' part above. MIDI data roughly consists of two different kinds of information: note data and control data. Note data is what I briefly showed you above: when you press a key on a keyboard it sends out these values. This is what we call note data, or "note messages".
But some keyboards also have controls such as faders and knobs. What happens when you use those? In that case so-called control data ("control messages") is sent out. This can be an important distinction in Max, but less so in Max for Live. Remember: Live processes all the MIDI data, and control messages usually do not reach the MIDI tracks.
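The two kinds of messages can be told apart by the status byte alone. A small sketch, again in Python rather than Max, using hypothetical helper names and the standard status values (0x90/0x80 for notes, 0xB0 for control changes):

```python
def is_note(status):
    # note on (0x90) or note off (0x80) in the high nibble
    return status & 0xF0 in (0x80, 0x90)

def is_control(status):
    # control change (CC) messages start with 0xB0
    return status & 0xF0 == 0xB0

print(is_note(144), is_control(144))   # True False  (key press)
print(is_note(176), is_control(176))   # False True  (knob or fader)
```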
Here you see the same data as above (I pressed C3 on my keyboard), but this time I also filtered the incoming data using the midiparse object. As you can see, things are now a whole lot easier to understand.
First we get a channel message. This is always 1, so we can safely ignore it, but it's still good to keep in mind that even though the value doesn't change, it is still being sent.
Next we get a sequence of two numbers. Such a sequence is called a list in Max (for Live). And when I release the key the whole thing seems to repeat itself, apart from the last number: instead of a list containing "60 77" we now get a list that says "60 0".
Parsing the MIDI data
This is actually a whole lot easier than it may look. When I press the C3 key we get "60 77". This is simply a list showing us the note being played (60) and the force with which I hit it (or put differently: the velocity). That is what the 77 stands for.
When I release the key we get the value 60 again; this tells us that something is happening with the C3 key. The next number is now 0, which tells us that the key has been released.
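The convention is easy to capture in a few lines. Here is a sketch of it in Python (the function name and wording are my own, not anything from Max): the same pitch arriving with velocity 0 means the key was let go.

```python
def describe(pitch, velocity):
    """Turn a parsed note/velocity pair into a human-readable description."""
    if velocity == 0:
        return f"note {pitch} released"
    return f"note {pitch} played with velocity {velocity}"

print(describe(60, 77))  # note 60 played with velocity 77
print(describe(60, 0))   # note 60 released
```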
It really is that simple:
In this patch I used the unpack object to split the list of note and velocity into two parts. The first part is sent to the kslider object so that it shows which key I pressed on my keyboard.
The second part is checked for its value. If this value is 0, meaning a key has been released, then the if object sends out a bang. This bang is then delayed, after which it triggers the sending of a -1 value to the kslider.
The delay is due to some Max (for Live) logic: data is processed from right to left. So if we didn't delay the bang message, the kslider would first receive the -1 value, followed by the value of the note being played.
But that wouldn't give us the desired behaviour; the idea is to make the kslider show a note when it's played, and clear it once the note has been released. That is done by sending -1 to its top left inlet (note: even though the reference page says you can use "clear" or "flush", this doesn't seem to work).
And that is basically all there is to it. It's simply a matter of extracting the right values and applying some logic to them.
This is a good example of why I came to enjoy Max (for Live) as much as I do: not only does it allow for some massive tweaking, it also lets you easily set up test (and learning) cases like the one shown above, which may very well help you discover totally new and cool ways of doing things.