As you all probably know, the APC40 by Akai Professional is the de-facto MIDI controller for Ableton Live. It's used in a dozen official demonstration videos and is, of course, fully supported by Ableton Live as a control surface.
And for those of you who are not familiar with this beast: the APC40 is a MIDI controller featuring an 8 by 5 grid (hence the '40') which can be used to access and control the clips in Live's session view. It can also control individual tracks, the mixer section, as well as any device sitting on a track's device strip.
Max for Live ('M4L') should also be well known by now: the visual programming language developed by Cycling '74, which found its way into the Ableton Live environment through what I consider to be a monster cooperation between Ableton and Cycling '74. M4L does an excellent job of controlling certain aspects of Live, and this whole thing is pretty well documented too. But when it comes to accessing control surfaces things change quite drastically; now we're entering a very vague area which has hardly any documentation at all. I took my time, but now I'm going to have my first shot at using M4L to control the control surface and try to get a grip on the theory behind it…
The control_surfaces root object
Before I continue, let it be very clear that I'm not using official documentation or giving the 'official description', simply because, from what I can tell, there isn't any. What you have here are my findings and discoveries, which I try to explain by theorizing about them. That's all it is: my first batch of theories as to the why and how, which of course are backed up by working examples.
Anyone can use my LOM.Navigator, point to a control or component and then try accessing some properties or functions. Or even easier: grab an existing M4L device such as the CS Step sequencer and open it up to see what it does. This is something many people have done; I’ve seen too many programmers claiming to have ‘mastered’ the control surfaces while all they did was use a bunch of existing patch components and/or existing M4L abstractions (‘tool patches’ included with M4L, such as ‘M4L.api.GetControlPathByName’, ‘M4L.api.GetComponentPathByName’ and ‘M4L.api.SelectControlSurface’). But that’s not how I work.
Having that out of the way…
The Live API has 4 so-called root objects. These are starting points within the Live Object Model which provide access to the several elements that make up the environment as a whole. For example, the live_set root object embodies your active Live set and thus contains elements such as tracks, scenes, clip slots, etc. One thing to keep in mind is that all those components are related to each other: a track contains clip slots, which in turn can contain clips.
This approach is used throughout the entire Live API, except… You guessed it: the control surfaces, as the diagram above clearly shows.
The control_surfaces root object basically consists of a maximum of 6 root classes ('sub-root objects'), which is where things start to differ from the usual. Normally a root object is tied to a single (root) class: the live_app object is tied to the Application class, whereas the live_set object is tied to the Song class. Put differently: the moment you point a [live.path] object at one of these root objects, it will actually point to the associated class.
The control_surfaces root on the other hand requires a specific selection of which device to control. This also becomes quite clear when using the LOM.Navigator; the first thing you need to do after you’ve selected the control_surfaces root object is to select one of the available control surfaces.
But what exactly are we selecting? Simple:
In the Ableton Live preference screen (MIDI Sync tab) you can configure up to 6 different control surfaces. When you click on the first menu you get a huge list of devices which are natively supported by Ableton Live.
What you are actually selecting are Live's so-called "Remote Scripts": compiled Python scripts which provide the actual functionality for the many MIDI controllers out there. Don't forget: a MIDI controller is basically a piece of equipment which sends out specific signals based on which keys you pressed or which knobs you turned. The software needs to interpret this and act accordingly.
So basically, when you access control surfaces in Max for Live you're not so much accessing the hardware per se; you're accessing and using whatever functionality the associated Python script(s) provide. I say scripts because in most cases there's more than one.
This also means that you can easily experiment with this stuff even if you don't own an APC40 or Launchpad or such. Simply select the control surface as shown above and access the control_surfaces root object; you will gain full access to all the available functions and properties, though without the hardware attached you of course won't see any actual results.
Components and Controls
A control surface consists of 2 groups of objects: components, which belong to the ControlSurfaceComponent class, and controls, which belong to the ControlElement class. How these two classes differ isn't officially documented, but my theory is that the components section gives you access to the actual sections which make up the MIDI controller in order to control them, whereas the controls section, as the name suggests, allows you to control the function of a specific control or component.
There are 2 things which seem to prove this theory… First, the working examples, which I'm going to show in a moment (links for downloading my demo Live pack are at the bottom of this article), and second, the way the components section is built up:
If you look at the chart at the top of this article you'll notice how it mentions specific names such as Transport, Session, ClipSlot, etc., which appear to me as components which together make up the control surface. So far, so good.
However, not all control surfaces are the same. Where my APC40 has a dedicated transport section (the play, stop and rec buttons), a Launchpad doesn't have this option. And where both the APC40 and Launchpad have a grid to control the clips in the Session view, my MPD24 obviously doesn't have that either (yet it is recognized as a control surface).
I think that’s what you’re seeing in the screenshot directly above. There are several components which don’t have an associated class simply because they do not exist on this particular control surface. I came to that conclusion by looking at the components section for the Launchpad:
If you look closely you’ll notice that the Launchpad has a class which isn’t available on the APC40: Main_Modes. That makes sense because the Launchpad has several modes which it uses to control different parts of Live. Next you can see that both the APC40 and the Launchpad use the Session_Control class.
But here's the deal: the APC40 uses SceneComponent (associated with the Selected_Scene class) while the Launchpad doesn't have such an association (hence its association is 'empty').
Different controllers, different components…
Changing the APC40 functionality
Now for the controlling part. If you check the components of the APC40 you'll eventually come across ShiftableTransportComponent, which is associated with the Transport class. Some of its functions are "is_enabled", "set_enabled", "set_play_button", "set_stop_button" and "set_metronome_button".
Some of these functions are still a mystery to me. For example, I always get errors ("Invalid syntax") when I try to use the "set_stop_button" function. But I have achieved some interesting results with the "enable" functions:
The APC40 is my first control surface, so it has number 0. Component 70 is the one I mentioned earlier: the Transport class.
So what this patch basically does is point a [live.object] at the transport section of the APC40 and then allow you to either enable or disable it. As soon as you disable the toggle, the 'Play', 'Stop' and 'Rec' buttons on the APC40 no longer control Live's main transport. But, and this is where it gets interesting: if you look at the MIDI in/out indicators (in the upper right corner of Live) you'll see that MIDI data is still being sent to Live when you press these buttons.
So what you're basically doing is telling the Remote Script which drives the APC40 to stop responding to these particular buttons. And that presents the option to change the functionality of these buttons. To do that we need the second part of the control_surfaces section: the controls.
Although little of this is documented, it has become clear to me that functions such as "set_enabled", and the function to check on this ("is_enabled"), are available on most of the components. So you can disable more sections of the APC40 if you want to.
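For those who want to see the shape of the messages involved: the Python sketch below is purely illustrative (the helper is my own, and component number 70 is simply where the Transport class lives on my setup). It builds the two messages an M4L patch would send, first to [live.path] and then, once the id comes back, to [live.object]:

```python
# Sketch of the messages involved in disabling a component section.
# The path format and the "set_enabled" function are the ones discussed
# above; in a patch these are Max messages, sent to [live.path] and to a
# [live.object] that has received the component's id.

def enable_messages(surface_index, component_index, enabled):
    """Return the (path message, call message) pair for a component."""
    path = "path control_surfaces %d components %d" % (surface_index, component_index)
    call = "call set_enabled %d" % (1 if enabled else 0)
    return path, call

# Disable the APC40 transport (component 70 on my setup):
print(enable_messages(0, 70, False))
# ('path control_surfaces 0 components 70', 'call set_enabled 0')
```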
Using APC40 controls
Now, when looking at the controls section you’ll eventually come across elements such as these:
Here I marked the Shift_Button, which should be easy to recognize, but I'm actually interested in the Play_Button, which is a ButtonElement. As above, the various controls share a lot of functions, even if some of them aren't applicable to a specific control.
For example, the turn_on and turn_off functions have no effect on the play_button control. But they behave completely differently when used on, for example, the Scene_0_Launch_Button. More on that later…
Now, just like the Transport class above, the Play_Button control has a lot of functions which make no sense to me right now; for example "needs_takeover", "connect_to" and "use_default_message". So I started to focus on the properties instead: "set_mapping_callback" and "set_forwarding_callback", which are of the type "function".
So my theory right now is that this controls part allows you to "control" what this button should do by assigning it to a specific function. After all, as we established earlier: the Remote Script is what makes the controller actually 'do' something. When I press 'play' on the APC40, the associated control script actually starts the main transport.
And that got me thinking: in order to do that, it would need to be able to detect whether a button was pressed or not. Could it be possible to check for that yourself?
My LOM.Navigator doesn't show any properties which allow you to get the value of a control. But what if the listings simply didn't include this because it was implied?
So I tried adding a routine which should detect if the play button on the APC40 was pressed, simply by making a rough guess; and what do you know?
This is what it looked like when I pressed the play button and kept it pressed. The moment it's pressed a value of '1' is sent out, and the moment I release it the value resets to 0.
Even though it's nowhere documented that a 'value' property exists, not even in the property lists given by the different classes, you can still use it.
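If you want to react to presses only, and ignore releases, you need a tiny bit of edge detection on that observed value. A quick Python sketch of the idea (illustrative only; in a patch you'd do this with something like [change] and [sel 1]):

```python
# Illustrative sketch: the observed 'value' of a button goes to 1 on press
# and back to 0 on release. To react to presses only (and not releases),
# count the rising edges (0 -> 1 transitions) in the stream of values.

def count_presses(values):
    """Count rising edges in a stream of observed button values."""
    presses = 0
    previous = 0
    for v in values:
        if v == 1 and previous == 0:
            presses += 1
        previous = v
    return presses

# press, hold, hold, release, press, release:
print(count_presses([1, 1, 1, 0, 1, 0]))  # 2
```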
So let's do something cooler…
Controlling the APC40 matrix
This is all trial & error, mind you…
When I went over the whole controls list using my LOM.Navigator I spotted a class "Button_Matrix", and I was pretty sure what this would be. A closer look showed me that it had functions such as "height", "width" and "reset", which made sense to me. And some which didn't. Either way, I got curious whether this class also had a value property which I could observe. And it did:
This is what it looked like after I fired up the 'Dummy 2' clip. Like, whoah!
Pressing several buttons taught me that 'Button_Matrix' only involves the 8 by 5 grid, so really the matrix which controls the Session view clips. But what about this obscure code? The first digit is easy: it's 1 when a button is pressed and 0 when it's released again.
When pressing the very first button on the matrix, the one in the upper left corner, it presents the value "x 0 0 1" (x is the state; see above). When pressing the button below it we get "x 0 1 1". When pressing the button to its right we get "x 1 0 1".
So the first digit indicates the button state, the second digit the X-axis and the third the Y-axis. With that logic the button in the upper right corner of the matrix should be "x 7 0 1", and guess what? 😉
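Putting the findings so far together, decoding the Button_Matrix value looks like this. Mind you, this is a Python sketch of my own interpretation, not an official format description, and the meaning of the last digit remains unknown:

```python
# Sketch of decoding the Button_Matrix 'value' as observed above:
# four numbers, where the first is the button state (1 = pressed,
# 0 = released), the second the X-axis (column) and the third the
# Y-axis (row). What the last digit means is still a mystery.

def decode_matrix_value(value):
    """Turn an observed [state, x, y, ?] list into something readable."""
    state, x, y, unknown = value
    return {
        "pressed": state == 1,
        "column": x,   # 0..7 on the APC40's 8-wide grid
        "row": y,      # 0..4 on the APC40's 5-high grid
        "unknown": unknown,
    }

# Upper-left button pressed:
print(decode_matrix_value([1, 0, 0, 1]))
# {'pressed': True, 'column': 0, 'row': 0, 'unknown': 1}
```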
But what is that odd 1 at the end?
Then it hit me… If you keep the shift key pressed your button matrix shows you an overview which you can use to quickly move to another part of the session view. Could it be…
Unfortunately not. This digit is still a mystery to me.
Even so… Now that I'd found a way to monitor the button matrix, all that was left to do was find a way to turn it off. And by that I don't mean turning off the LEDs; that can be achieved with the 'reset' function from the Button_Matrix control class.
For this I had to check out the Components section again, and what do you know… I spotted the 'SessionZoomingComponent' of class "Session_Overview". Could it be…? It obviously had the "is_enabled" and "set_enabled" functions, so a quick test in the LOM.Navigator and wham… my matrix was disabled.
(Using the Navigator: select the 'SessionZoomingComponent' from the list of components, select the "set_enabled" function, click on the function output section, enter 0 in the input dialog, press the 'call function' button, done.)
So now I'd found out how to disable sections of the APC40, how to observe which button was pressed, and…
“A shadowy flight into the dangerous world of a man, who does not exist…”
Remember those "turn_on" and "turn_off" control functions I mentioned earlier? When applied to the start or stop button they obviously don't do much, but what about the matrix, which actually has LEDs? Surely we should be able to control those? Ayups!
So far the good news. The bad news is that each individual button has its own class and needs to be controlled individually, instead of simply telling a matrix of buttons which ones to turn on and off.
Even so; we can control the matrix, as follows:
I can almost hear some people thinking “What the heck is happening here?!“. Don’t worry; it looks harder than it actually is.
The [live.thisdevice] object indicates whether the patch is loaded and initialized. If that is the case it sends out a bang, which points both [live.object] objects on the right to the component and control sections which control the matrix of the APC40 (as mentioned above, I discovered the path using my LOM.Navigator).
Next I make sure that the counter object is set to 20, and then simply bang it 8 times using the [uzi] object. The first thing this section does is load the [prepend] object with a numerical value. So in the first run it loads the value '20' into [prepend]. Then it points the [live.path] object to "path control_surfaces 0 controls 20". The 0 points to my APC40 and 'controls 20' points to the first button (upper left corner) in the APC40 matrix. Its ID is then sent out of the [live.path] object into the [prepend] which was set to 20. So after [prepend] we get: "20 id x" (where x is the ID of the first button).
The [route] object then makes sure that this id is sent to the first [live.object]. This object now points to the first button in the APC40 matrix.
The cycle continues until all 8 [live.object] objects point to a button.
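In case the counter/[uzi]/[prepend]/[route] dance sounds abstract: it boils down to generating these 8 path messages, one per button. A Python sketch of just that (the control numbers 20 through 27 are simply what they happen to be on my setup, as found with the LOM.Navigator):

```python
# The counter/[uzi]/[prepend]/[route] construction described above boils
# down to generating a path message for each of the 8 buttons in the top
# row of the matrix (controls 20 through 27 on my APC40 setup). Each id
# that [live.path] returns is then tagged with its control number so
# [route] can distribute it to the right [live.object].

def top_row_paths(surface_index=0, first_control=20, count=8):
    """Build the [live.path] messages for one row of matrix buttons."""
    paths = []
    for n in range(first_control, first_control + count):
        paths.append("path control_surfaces %d controls %d" % (surface_index, n))
    return paths

for p in top_row_paths():
    print(p)
# prints "path control_surfaces 0 controls 20" up to "... controls 27"
```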
Next: if you hit the toggle it will first turn off the APC40 matrix; you can no longer start clips. Then it turns off the indicators so the display is blank.
And then it starts my "running_light" sub-patcher. This patcher basically sends "call turn_on" through one outlet and "call turn_off" through the outlet before it, thus generating a 'running light'.
As you can see, I used the same approach I used to point 8 different [live.object] objects to the buttons in the APC40 matrix: by using [route] and a 'prepend' built with a message object, I can simply let the [route] object sort it all out.
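The logic inside "running_light" is trivial once written out. Here's a Python sketch of the step function (illustrative only; in the patch this is just a counter feeding a [route]):

```python
# Sketch of what the "running_light" sub-patcher does: at every step it
# sends "call turn_off" to the previous button and "call turn_on" to the
# current one, wrapping around after the last button in the row.

def running_light_step(step, count=8):
    """Return the (turn_off, turn_on) messages for a given step."""
    on = step % count
    off = (on - 1) % count
    return ("call turn_off", off), ("call turn_on", on)

print(running_light_step(0))  # (('call turn_off', 7), ('call turn_on', 0))
print(running_light_step(3))  # (('call turn_off', 2), ('call turn_on', 3))
```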
It is a bit tedious, but as you can see it's relatively easy to control the APC40 matrix. The only problem is that you need quite a lot of [live.object] objects; I was hoping you could somehow send the matrix values indicating which buttons to turn on or off.
Even so, there you have it. It's not perfect, but it's what I've been up to this last weekend; I hope some of you can enjoy it.
As promised, I have created a Live pack (I don't recommend installing it into your library; it will only pollute stuff) which contains all the M4L patches I mentioned above:
APC40-control, which can enable and disable the APC40 transport buttons (the play, stop and rec buttons).
APC40-observe, which basically prints out a corresponding value when you press a button in the 8 by 5 matrix.
Knight-Rider, which uses the upper row of the matrix as a running light; it's comparable to the running light you'll see if you turn the APC40 on and leave it alone for a while.
And you'll get my LOM.Navigator sitting on the master track.
Keep in mind: these are rough patches, which means I built them with my setup in mind; you need to have your APC40 set up as the first control surface in Live ('control_surfaces 0'). So if you have your APC set up differently, you'll need to change the patches. It shouldn't be too hard though, since I kept my patches very simple; change the [message] object which sets up the path and you'll be fine.
You can grab my Ableton LivePack here.
If you have any questions, please dump 'em below!