Drag and drop Csound in Ableton Live from Enrico de Trizio on Vimeo.
Dr. Richard Boulanger’s Bohlen-Pierce Csound instruments played in Ableton Live with Enrico de Trizio’s Csound drag and drop Max Instrument. (via Piscoff’s twitter stream)
I want to begin discussing the implications of yesterday’s Python-Csound mockup code (which I’ll refer to as slipmat for the time being), starting with imports:
import Wavetable
from Gen import sine
from Pitch import cpspch
All of Csound’s 1400+ opcodes are available at all times. Great for convenience, perhaps not so great for organization. In contrast, the Python language starts out with only the basics, a clean slate. To extend functionality, users import modules. This is a cleaner approach than having it all hang out. There are some other advantages, too.
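For comparison, here is how Python’s own import forms behave. This is a minimal sketch using the standard math module rather than the hypothetical slipmat modules, but the principle is the same: nothing is visible until you ask for it.

```python
import math              # whole module: every name stays behind the "math" prefix
from math import sin     # pull exactly one name into the local namespace

angle = math.pi / 2
value = sin(angle)       # sin is now local; cos, tan, etc. were never imported
```

This is the "clean slate" in action: the namespace contains only what the composer explicitly pulled in, which makes it obvious where every unit generator came from.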
First, let’s look at a hypothetical import block. Let’s say you were to design a “computer network music” ensemble inspired by The Hub. Some communication modules you might include:
import Jack
import MIDI
import Network
import OSC
A computer network music ensemble sounds like it might be a complex piece of software. Complex enough that doing all your work in one file would be tedious. So you decide to start a new file, my_network.slip, where you store your own custom opcode/unit generator function definitions. In your main file, you write this to import it:
import my_network
Not only can you use my_network for this project, but that code can be reused in any number of future projects. Code reusability is a beautiful thing. In fact, this would apply to any properly written slipmat document. For example, a composition would double as a library of synthesizers that you could plug into your own work:
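Here is a sketch of what that reuse could look like, with Python standing in for the hypothetical slipmat syntax. The ring_mod function and its name are my own illustration, not part of the mockup:

```python
# my_network.py (sketch): a reusable "opcode" definition.

def ring_mod(carrier, modulator):
    """Multiply two sample sequences -- a classic ring modulator."""
    return [c * m for c, m in zip(carrier, modulator)]

# In the main file, the equivalent of "import my_network" would then give:
#     out = my_network.ring_mod(sig_a, sig_b)
out = ring_mod([0.5, 1.0, -1.0], [1.0, 0.5, 0.5])
```

Once the definition lives in its own module, every future project gets it for the price of one import line.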
import Trapped  # Trapped in Convert by Dr. Richard Boulanger
...
signal = Trapped.blue(22.13, 4, 0, 9.01, 600, 0.5, 20, 6, 0.66)
See trapped.csd.
The piece is due this Friday, and of course, that means my computer had to die yesterday.
*shakes fist at deadline gremlins*
So I lost 5 hours of composing time. The good news is that I have a backup computer. It runs a bit slower, but I can move forward.
I appended another fragment to my outline. This section is inspired by two sources. The first is Beethoven’s Moonlight Sonata. I’ve had this piece on my mind ever since playing the “Lost in Nightmares” expansion mission for Resident Evil 5. The second is Khachaturian’s Gayane Ballet Suite, though its influence is perhaps subtle.
Have you ever seen Connections? I’m about to make a few.
The Gayane Ballet Suite was used in the film 2001: A Space Odyssey. The author of 2001, Arthur C. Clarke, “was the first to propose geostationary communications satellites.” John Pierce, of the Bohlen-Pierce scale, “arrived at the (same) idea independently and may have been the first to discuss unmanned communications satellites.” (source)
Also in 2001, we hear HAL sing the tune Daisy Bell as he is dying. The film’s director, Stanley Kubrick, had visited Pierce at Bell Labs to get a sense of what a telephone booth in space would look like. Pierce used the opportunity to show him the computer music program, which included a synthetic vocal arrangement of Daisy Bell by Max Mathews, the father of computer music. Here’s a video of Mathews telling the story; Dr. Richard Boulanger is also in it.
Stanley Kubrick passed away on March 7th of 1999. The first day of the Bohlen-Pierce conference falls on March 7th. I originally heard of Kubrick’s passing around the corner and down the street from where Monday night’s event takes place. I was at a gathering at the apartment of Elaine Walker, who has been composing Bohlen-Pierce music for years and is presenting both music and a lecture at the Symposium.
Download: fragments_11.csd
Whenever I start a piece, especially one based in old-school computer music, I usually find myself working in a compositional clean room. What I mean by this is that I usually have to spend many hours writing synthesis code that produces very sterile-sounding textures and blips, lacking both depth and soul, before moving on. Sometimes I never actually make it beyond this point. To be completely honest, I kinda dig the sound of it. However, I made a promise to myself right before the new year that I’m going to force myself to get out of my comfort zone. This means taking off the clean suit, getting outside and playing in the mud, where “mud” is obviously alluding to the organic. (note: there is some dripping sarcasm in that last line) Though I’m making a joke, mostly intended for myself, there is some truth to this.
As for Fragments, though I’m still in that clean room, I’m starting to make the transition. Today’s example is that first step, and a very small one at that. I’m spending more time on the overall sound of the piece, its space and structure, as opposed to just creating a process and letting it run for x amount of minutes.
One thing of particular interest is that I’m applying Bohlen-Pierce ratios and proportions to various elements of the piece: note durations, envelopes, next start times, FM modulation indexes and ratios, and so on. I can’t claim this idea as mine; it came from friend and mentor Dr. Richard Boulanger, who suggested it to me in an email.
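One way to derive those proportions is from the equal-tempered Bohlen-Pierce scale, which divides the 3:1 “tritave” into 13 equal steps. The sketch below is my own illustration of the general idea, not the actual code from the piece:

```python
# Equal-tempered Bohlen-Pierce: 13 equal steps dividing the 3:1 tritave.
STEPS = 13
TRITAVE = 3.0

def bp_ratio(step):
    """Ratio for a given BP scale step: 3 ** (step / 13)."""
    return TRITAVE ** (step / STEPS)

# The same ratios that generate pitches can scale non-pitch parameters,
# e.g. stretching a base duration by BP proportions:
base_dur = 1.0
durations = [base_dur * bp_ratio(n) for n in (0, 3, 6)]
```

Feeding the same ratio set into durations, envelope segments, and FM indexes ties those parameters to the tuning system itself, rather than leaving them as arbitrary constants.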
Fragments 3: fragments_3.csd