Nine Rules for Scheduling Events

How would Slipmat’s @ scheduler work when applying various Python methodologies?

To celebrate the release of my friend Sarah MacLean’s new book, Nine Rules to Break When Romancing a Rake, here are nine rules for scheduling events. Each of the following examples produces the exact same result:

1. Individual Events

@0 foo()
@1 foo()
@2 foo()
@3 foo()

2. List

@[0, 1, 2, 3] foo()

3. Identifier

t = [0, 1, 2, 3]
@t foo()

4. List Comprehension

@[i for i in range(0, 4)] foo()

5. Loop

for i in range(0, 4):
    @i foo()    

6. Function

def bar():
    @0 foo()
    @1 foo()
    @2 foo()
    @3 foo()
    
bar()

7. Nested

foo()
@1:
    foo()
    @1:
        foo()
        @1:
            foo()

8. Item Iteration

items = (0, 1, 2, 3)
@[i for i in items] foo()

9. Map

def bar(x): return x
@(map(bar, range(0, 4))) foo()
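
All nine forms could plausibly reduce to a single scheduling primitive that accepts either one beat or any iterable of beats. Here's a minimal Python sketch of that idea; schedule() and the event queue are stand-ins of my own invention, not part of Slipmat:

import heapq
import itertools

event_queue = []
counter = itertools.count()  # tie-breaker so equal beats never compare functions

def schedule(when, fn):
    """Schedule fn at a single beat, or at every beat in an iterable."""
    try:
        beats = iter(when)    # list, range, map, generator, ...
    except TypeError:
        beats = iter([when])  # a lone number
    for beat in beats:
        heapq.heappush(event_queue, (beat, next(counter), fn))

def foo():
    print("foo")

schedule(0, foo)             # 1. @0 foo()
schedule([0, 1, 2, 3], foo)  # 2. @[0, 1, 2, 3] foo()
schedule(range(0, 4), foo)   # 4. @[i for i in range(0, 4)] foo()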

And a special message for Sarah, “Ninjas.”

Slipmat Prototype & Processing

I experimented with combining the first Slipmat prototype (built with the Csound API) with Processing. Getting the two systems working together was fairly straightforward. Here’s the code that generated the video.

The following method from class Disc does two things when a disc collides with a border: it reverses the disc’s direction and triggers a Slipmat event:

public void moveDisc() {
    x += xVelocity * SCALE_VELOCITY;
    y += yVelocity * SCALE_VELOCITY;

    if (x <= DISC_RADIUS || x >= width - 1 - DISC_RADIUS) {
        xVelocity *= -1;
        playNote();
    }

    if (y <= DISC_RADIUS || y >= height - 1 - DISC_RADIUS) {
        yVelocity *= -1;
        playNote();
    }
}

The method playNote() also belongs to class Disc, and is where the actual triggering of the Slipmat event occurs. The instances sp1 and sp2 are part of the Slipmat system; “sp” is short for “sine percussion.” Notice the crossfade: as a disc moves from left to right, the amplitude sent to sp1 decreases while the amplitude sent to sp2 increases:

public void playNote() {
    sp1.playNote(0.0, 0.35 * (1 - x / width), frequency);
    sp2.playNote(0.0, 0.35 * x / width, frequency);
}

Live Coding and Capturing a Performance

It’s the latest fad that’s sweeping computer music. And I would love for Slipmat to have this ability in its arsenal of tools. Without having to sacrifice non-realtime rendering for computationally expensive processes, of course.

The following conceptual live coding prototype shows what a simple session would look like if it were modeled on the Python interpreter:

$ slipmat --capture_performance my_session.txt
>>> from LiveCodeSeq import seq
>>> from MyBassLibrary import rad_rezzy
>>> from random import random
>>> p[0] = [int(random() * 12) for i in range(0, 16)]
>>> p[1] = [int(random() * 12) for i in range(0, 16)]
>>> p[0]
[5, 9, 11, 8, 7, 8, 5, 1, 10, 7, 4, 4, 6, 4, 4, 2]
>>> p[1]
[6, 6, 5, 3, 5, 7, 8, 4, 0, 0, 8, 7, 9, 7, 2, 4]
>>> r = rad_rezzy()
>>> s = seq(instr=r, pattern=p[0], base_pch=6.00, resolution=1/16, tempo=133)
>>> s.start()
>>> s.change_pattern(pattern=p[1], onbeat=0)
>>> @60 s.stop(onbeat=0)

I have a gut feeling that there are some changes that should be made, but as a starting point, this isn’t a terrible one.

Being able to capture a live coding performance would be fantastic. Not sure how workable it would be, but perhaps such a feature would produce a file that could be played back later:

$ cat my_session.txt
@0             global.seed(7319991298)
@4.04977535403 from LiveCodeSeq import seq
@8.43528123231 from MyBassLibrary import rad_rezzy
@10.9562488312 from random import random
@15.6027957075 p[0] = [int(random() * 12) for i in range(0, 16)]
@20.7757632586 p[1] = [int(random() * 12) for i in range(0, 16)]
@26.2462371683 p[0]
@29.3961696828 p[1]
@34.0424988199 r = rad_rezzy()
@40.3211374075 s = seq(instr=r, pattern=p[0], base_pch=6.00, resolution=1/16, 
                   tempo=133)
@45.5491938514 s.start()
@47.8991166715 s.change_pattern(pattern=p[1], onbeat=0)
@52.6267958091 @60 s.stop(onbeat=0)

The @ times are the moments at which return was originally pressed for each event. Looks like I’ll be spending some time with ChucK soon.
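
Capturing could be as simple as wrapping the read-eval loop and timestamping each line as return is pressed. A rough Python sketch, where evaluate() is a placeholder for handing the line to the interpreter:

import time

def capture_session(path, evaluate):
    """Log each entered line, prefixed with the offset at which return was pressed."""
    start = time.time()
    with open(path, "w") as log:
        while True:
            try:
                line = input(">>> ")
            except EOFError:
                break
            log.write("@%s %s\n" % (time.time() - start, line))
            evaluate(line)  # hand the line to the interpreter/scheduler

Playback would then amount to scheduling each logged line at its recorded @ time.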

Early Java Slipmat Prototype

Slipmat has been a pet project of mine for a few years now. The first working prototype was realized in Java. The code for this is up at SourceForge.

Originally, the idea was for Slipmat to be a Java interface layer that sits on top of Csound; hence the slipmat metaphor. At one point it became much too unwieldy and I abandoned it, until I later picked up Python and decided to give it another go. Python made things much easier all the way around, but I eventually abandoned that too, for reasons I’ll go into later.

Here is an excerpt from one of the original Java prototype examples:

public ControllerExample() {
    /* Create empty SynthRack */
    SynthRack synthRack = new SynthRack(false);
    
    /* Create modules */
    LinearSlide linearSlider = new LinearSlide();
    SynthBass bass = new SynthBass();
    Output output = new Output();

    /* Add modules to the SynthRack */
    synthRack.addModule(linearSlider);
    synthRack.addModule(bass);
    synthRack.addModule(output);

    /* Patch the SynthBass to the Output */
    output.setInput(bass.getOutput());

    /* Create a sequence */
    linearSlider.slide(1.0, 0.05, 2.0, bass.getOscillator1Detune());
    linearSlider.slide(1.125, 0.05, 1.5, bass.getOscillator1Detune());
    bass.playNote(0.0, 8.0, 0.5, 100);
    bass.playNote(2.5, 6.0, 0.25, 200);

    /* Set duration and press play */
    synthRack.setDuration((double) SCORE_DURATION);
    synthRack.startCsound();
}

The full code can be seen here. Once I get things sorted out at SourceForge, I’ll post the Python package there as well.

Music and Visuals

If slipmat at its core is a general purpose programming language augmented with advanced timing and scheduling capabilities, then slipmat would be well suited for both music and visuals.

This isn’t really too much of a stretch. Audio unit generators won’t be built into the core language, but instead will be stored as separate modules and imported as needed; the same would be true for visual modules. Musical GUI elements such as sliders are already visual in nature. There are existing audio/visual systems out in the wild, such as Max/MSP/Jitter and Impromptu. Various functions and objects designed for music could also be applied to visuals; the same envelope that controls frequency can be used to control the position of a circle:

t = 10                          # time = 10 beats
my_circle = Circle(0, 200, 15)  # x position, y position, radius
env = line(0, t, 1)             # start value, duration, end value
sine(1.0, 440 * env)            # amp, frequency
my_circle.xpos(env * 400)       # x position over time

The example is a bit crude and omits many practical things, but the idea of syncing visuals with audio is there.

Beyond music and visuals, anything that is time-based would theoretically work in slipmat, provided a module is written for it.

Roll Your Own Score Syntax

A music language that comes with a fully-loaded set of string capabilities, including regular expressions, would allow users to develop their own methods for notating music. For example, here is a horizontal representation for rhythm guitar:

@0  strum('C',  '/... /... /./. /...')
@4  strum('Dm', '/... /... //// /./.')
@8  strum('G7', '/... /... ///. ../.')
@12 strum('C',  '/... /... /... ///.')

The slash triggers an event and advances time by a 16th note. The dot just advances time by a 16th note. A space does nothing.

These four lines of code represent 25 events. Not only does this shorten the score and save keystrokes, but it is far more readable than if each individual event were written out explicitly. In fact, I bet there are many guitar players out there who could play this as written, provided they were given a brief explanation of the notation.
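
Expanding the notation into @ schedules is straightforward. Here’s a sketch in plain Python; parse_pattern() is a hypothetical helper, and swapping '/' for 'x' makes it handle the drum triggers below as well:

def parse_pattern(pattern, trigger_char='/', step=0.25):
    """Convert horizontal notation into a list of beat offsets.

    The trigger character fires an event and advances time by one
    step (a 16th note, 0.25 beats); a dot just advances time; a
    space does nothing."""
    beats = []
    t = 0.0
    for c in pattern:
        if c == ' ':
            continue
        if c == trigger_char:
            beats.append(t)
        t += step
    return beats

print(parse_pattern('/... /... /./. /...'))
# [0.0, 1.0, 2.0, 2.5, 3.0]
print(parse_pattern('x.x. x.x. x.x. x.x.', trigger_char='x'))
# [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]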

What about a system for triggering drums? Here’s the all-too-familiar 4-beat rock pattern with 8th-note hats:

trigger(hat(),   'x.x. x.x. x.x. x.x.')
trigger(snare(), '.... x... .... x...')
trigger(kick(),  'x... .... x... ....')

Once again, the music is notated horizontally. Instead of 12 lines of code, there are only 3. Let’s take a look at the vertical equivalent:

@0   hat()
@0   kick()
@0.5 hat()
@1   hat()
@1   snare()
@1.5 hat()
@2   hat()
@2   kick()
@2.5 hat()
@3   hat()
@3   snare()
@3.5 hat()
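
Using the parse_pattern() and schedule() sketches from earlier, a hypothetical trigger() could be as small as this, assuming instruments are callables that make sound when invoked:

def trigger(instr, pattern):
    """Fire instr at each 'x' in the horizontal pattern."""
    for beat in parse_pattern(pattern, trigger_char='x'):
        schedule(beat, instr)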

In theory, people could write slipmat modules that mimic other text-based notation systems, such as the Csound score language.

import CsoundScore as CS
from MyLibrary import bassFM
from MyLibrary import pad

# For mapping slipmat instruments to a Csound-styled score
instrs = {"i1": bassFM(), "i2": pad()}

# A Csound-styled score
score = '''
i 1 0 2 0.5 7.00
i 1 + . .   6.05

i 2 0 8 0.333 8.09 0.77 1000
'''

CS.playScore(instrs, score)  # Play Csound-styled score

That would translate to:

@0 bassFM(2, 0.5, 7.00)
@2 bassFM(2, 0.5, 6.05)
@0 pad(8, 0.333, 8.09, 0.77, 1000)
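
Here is a rough sketch of what a playScore() like that might do under the hood, handling the Csound shorthands '+' (start where the previous note for that instrument ended) and '.' (carry the previous value of that pfield). None of this is real CsoundScore code, just an illustration:

def play_score(instrs, score):
    """Expand a Csound-styled score into @-style events (printed here)."""
    prev = {}  # last resolved pfields, per instrument
    for line in score.strip().splitlines():
        fields = line.split()
        if not fields or fields[0] != 'i':
            continue  # skip blank lines
        name = 'i' + fields[1]
        last = prev.get(name, [])
        resolved = []
        for i, f in enumerate(fields[2:]):
            if f == '+':    # start at the end of the previous note
                resolved.append(last[0] + last[1])
            elif f == '.':  # carry this pfield from the previous note
                resolved.append(last[i])
            else:
                resolved.append(float(f))
        prev[name] = resolved
        start, args = resolved[0], resolved[1:]
        # A real system would schedule instrs[name]; print the event instead.
        print("@%g %s(%s)" % (start, name, ", ".join("%g" % a for a in args)))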

Much of this can be done with some of the existing music languages out in the wild. The guitar strum and drum trigger examples can be accomplished in pure Csound code (see dseq — A Drum Machine Micro-Language). Even more so, Csound combined with the Python API can do some truly amazing things along these lines. Though it probably wouldn’t be nearly as fluid as a language designed from the ground up with this in mind.

If you haven’t done so yet, take a moment to ponder the possibilities. And be sure to think about the things others will dream up that can be simply imported into your own compositions/synthesizers/sequencers/etc…

When Does That Happen?

Look at the following code and try to answer these two questions: When is foo() scheduled to play? When is bar() scheduled to play? Be sure to read about the @ scheduler if you haven’t already done so.

@0 x = 0
@1 x = 3

@(5 + x) foo()
@5:
    @x bar()

The answers: foo() is scheduled to play at beat 5, while bar() plays at beat 8.

Why is this?

For foo(), the expression for the scheduler @(5 + x) is evaluated at beat 0. At beat 0, x equals 0. This is a single schedule.

For bar(), there are two separate schedules, one nested inside the other: @5 and then @x. The @x schedule isn’t evaluated until the @5 schedule is triggered; you might say that the @x schedule is a task of the @5 schedule. At the time @x is evaluated, x equals 3.
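
The distinction could be modeled with eager versus deferred evaluation. In this Python sketch, schedule() and run() are stand-ins for the @ scheduler, and a lambda plays the part of the nested @ block:

import heapq
import itertools

queue, counter = [], itertools.count()

def schedule(beat, fn):
    heapq.heappush(queue, (beat, next(counter), fn))

def run():
    while queue:
        beat, _, fn = heapq.heappop(queue)
        fn(beat)

state = {'x': 0}

def foo(now): print("foo at beat", now)
def bar(now): print("bar at beat", now)

schedule(0, lambda now: state.update(x=0))  # @0 x = 0
schedule(1, lambda now: state.update(x=3))  # @1 x = 3

# @(5 + x): evaluated eagerly, right now, while x == 0
schedule(5 + state['x'], foo)

# @5: @x bar() -- the inner schedule is deferred until beat 5,
# by which time x == 3, so bar() lands at beat 5 + 3 = 8
schedule(5, lambda now: schedule(now + state['x'], bar))

run()  # foo at beat 5, bar at beat 8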

Of course, this is all theoretical at the moment.

Patching — An Early Mockup

Being able to patch units together is a fundamental principle of a modular environment. Though I’m far from figuring out what the syntax should look like in my faux music language, I have been writing some mockup code just to get a sense of it.

Just a warning: the following example is ignorant of i/k/a-rates, along with pass-by-reference vs. pass-by-value:

import FX
import Master
import Mixer
from Envelope import line
from TestLibrary.Instruments import SineTone

# Create instances of objects and patch together
st = SineTone()                             # Simple sine instrument
reverb = FX.Reverb(input=st.out, time=3.1)  # Reverb unit
mix = Mixer.pan(0, st.out, reverb.out)      # Dry/wet: value, sig 1, sig 2
output = Master.DAC(mix)                    # Main output

# Score
@0 turnon(reverb, mix, output)  # Turn on selected instances

@0 st.play(20, 1, 440)        # Play for 20 seconds, amp = 1, frequency = 440
@0 st.amp *= line(0, 10, 1)   # Amplitude rise
@10 st.amp *= line(1, 10, 0)  # Amplitude fall
@20 mix.pan = line(1, 20, 0)  # Dry to wet over 20 seconds

@10 reverb.time += line(0, 5, 8.1)  # Increase reverb time starting at 10
@20:
    @reverb.time turnoff(reverb, mix, output)  # Turnoff selected instances

The import section loads classes from existing instrument/unit generator libraries.

In the orchestra, instances are created from the imported classes and patched together into a simple instrument graph. There’s a simple sine instrument, which is plugged into a reverb unit. Next, there is the pan mixer, which has the dry sine instrument plugged into one side and the wet reverb signal plugged into the other; it is initially set to 100% dry. The pan mixer is then patched into the output, which sends the audio to the DAC.
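
For a sense of what patching might mean under the hood, here’s a toy Python sketch of the same signal graph, with a gain unit standing in for the reverb; none of these helpers come from the actual mockup:

import math

SR = 44100  # sample rate

def sine(freq, dur):
    """Render a sine tone as a list of samples (a unit's 'out')."""
    return [math.sin(2 * math.pi * freq * i / SR)
            for i in range(int(dur * SR))]

def gain(sig, amount):
    """Stand-in for the reverb unit: just scales its input."""
    return [amount * s for s in sig]

def pan(value, a, b):
    """Dry/wet crossfade: 0.0 is all a (dry), 1.0 is all b (wet)."""
    return [(1 - value) * x + value * y for x, y in zip(a, b)]

# Patch: sine -> 'reverb' stand-in -> pan mixer -> output
dry = sine(440, 0.01)
wet = gain(dry, 0.5)
mix = pan(0.0, dry, wet)  # initially 100% dry, as in the mockup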

The first line in the score turns on three instruments: reverb, mix and output. There will be times when the duration is unknown. The ability to start and stop machines is a must.

The sine instrument starts with a duration of 20, an amplitude of 1 and a frequency of 440. The amplitude of the sine is modulated by two envelopes, creating a rise/fall shape. Line envelopes also modulate the dry/wet mixer and reverb time.

At the end, the turnoff function shuts down reverb, mix and output.

Bonus round: Why do you suppose I wrote,

@20:
    @reverb.time turnoff(reverb, mix, output)  # Turnoff selected instances

instead of this?

@(20 + reverb.time) turnoff(reverb, mix, output)  # Turnoff selected instances