CsMultitouch – Multitouch User Interfaces for Csound

This is a demonstration of the software I wrote for my MSc in Music Technology at DkIT under the supervision of Rory Walsh. The software is used to create custom multitouch user interfaces for controlling Csound instruments. It allows users to define GUI elements, such as sliders and buttons, in their Csound file. The .csd file is parsed by CsMultitouch to retrieve this information, which is then used to build the multitouch interface with the PyMT framework.
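The parsing step boils down to scanning the .csd text for widget declarations. Here’s a minimal Python sketch of the idea; the comment-based widget syntax below is invented for illustration, and the real CsMultitouch format may differ:

```python
import re

# Hypothetical widget syntax for illustration -- the actual CsMultitouch
# format may differ. Here GUI elements are declared in Csound comments like:
#   ;slider channel="amp" min="0" max="1"
WIDGET_RE = re.compile(r';\s*(slider|button)\s+(.*)')
ATTR_RE = re.compile(r'(\w+)="([^"]*)"')

def parse_widgets(csd_text):
    """Return (widget_type, attributes) pairs found in a .csd string."""
    widgets = []
    for line in csd_text.splitlines():
        m = WIDGET_RE.match(line.strip())
        if m:
            widgets.append((m.group(1), dict(ATTR_RE.findall(m.group(2)))))
    return widgets

csd = '''<CsoundSynthesizer>
;slider channel="amp" min="0" max="1"
;button channel="gate"
</CsoundSynthesizer>'''
```

Each parsed pair would then map to a PyMT widget, with the channel attribute naming the Csound control it drives.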

I want this.

Additive Synth Legacy Code

Let’s set the wayback machine to January 2001. This is around the time I took my first steps into designing an additive synthesizer. I’m not sure when user-defined opcodes were introduced, though there is a good chance they didn’t exist yet. And if they did, I had no idea of their existence. Same goes for the event family of opcodes. In my legacy code, each overtone, along with its supporting envelopes and transfer functions, was written out explicitly. I was a Perl junkie at the time, so I wrote scripts that would generate the instruments for me.
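For a taste of what those scripts did, here is a Python stand-in that writes out one explicit oscillator per overtone. The generated layout is illustrative, not the original 2001 code:

```python
def gen_additive_instr(instr_num, partials):
    """Generate a Csound instrument with one explicit oscillator per
    overtone, in the spirit of the old Perl-generated instruments."""
    lines = [f'instr {instr_num}']
    terms = []
    for i, (ratio, amp) in enumerate(partials, start=1):
        # p4 = amplitude, p5 = frequency, table 1 = sine
        lines.append(f'a{i} oscili p4 * {amp}, p5 * {ratio}, 1')
        terms.append(f'a{i}')
    lines.append('out ' + ' + '.join(terms))
    lines.append('endin')
    return '\n'.join(lines)

saw3 = [(1, 1), (2, 0.5), (3, 0.333)]
orc = gen_additive_instr(1, saw3)
```

Scaling the partial count up is then a one-character change to the script rather than an editing session in the orchestra.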

The example I’m posting today is the legacy code from 2001. I did not change the code, except for converting tabs to spaces and placing the orc/sco pair into a csd.

Download: add_synth_legacy.csd

In my new additive synth, I’m employing a recursive user-defined opcode technique, which I first read about in Steven Yi’s Csound Journal articles Control Flow Part I and Part II.

If you look at Part II, Steven actually demos an additive synth that is eerily similar to the core design of the one I’m in the process of making. Which means I either independently came up with a similar design or, more likely, I’m suffering from a bout of cryptomnesia. Either way, if you haven’t studied these two articles, then perhaps it’s time you make a weekend project out of it; they are pure gold.
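The gist of the recursive technique carries over to any language: each call renders one overtone, then invokes itself for the next. Here is a minimal Python sketch of that control flow (a stand-in for the recursive UDO, not code from the articles):

```python
import math

def partial_sum(t, freq, n, max_n):
    """One sample of a band-limited saw, computed by recursion over
    partials: render overtone n, then recurse for overtone n + 1."""
    if n > max_n:
        return 0.0
    amp = 1.0 / n  # 1/n amplitude weighting for a saw
    return amp * math.sin(2 * math.pi * n * freq * t) + partial_sum(t, freq, n + 1, max_n)

sample = partial_sum(0.001, 440.0, 1, 16)
```

In Csound the recursion depth becomes a p-field or constant, so the number of partials is a single parameter instead of a wall of duplicated code.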

Additive Synth Upgrade

Before, the synth was hardwired with a band-limited saw wave. I’ve removed this restriction by allowing users to pass in a table filled with information about the harmonic spectrum. For each overtone, the table stores three pieces of information, in this order: frequency ratio, amplitude and phase. For example, the following two lists represent the data for a 3-partial band-limited saw wave and square wave:

1, 1, 0, 2, 0.5, 0, 3, 0.333, 0 ; saw
1, 1, 0, 3, 0.333, 0, 5, 0.2, 0 ; square

Typing all of this is tedious, so I went ahead and created a handful of GEN Instruments that build and fill these tables automatically for the saw, square and triangle. I’ll do one for the buzz wave and one that mimics GEN10 down the road. Though my examples are harmonic in nature, the design also allows for inharmonic overtones.
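In Python terms, what those GEN Instruments compute looks something like this, rounding amplitudes to three places to match the lists above:

```python
def saw_partials(n):
    """Flattened (ratio, amplitude, phase) triplets for a band-limited saw."""
    data = []
    for k in range(1, n + 1):
        data += [k, round(1.0 / k, 3), 0]
    return data

def square_partials(n):
    """Same layout, odd harmonics only, for a band-limited square."""
    data = []
    for i in range(n):
        k = 2 * i + 1
        data += [k, round(1.0 / k, 3), 0]
    return data
```

`saw_partials(3)` and `square_partials(3)` reproduce the two lists above; inharmonic spectra just mean supplying non-integer ratios instead.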

I made one other upgrade, which technically isn’t an upgrade to the synth, and is probably more of a proof of concept. In the previous example, I used a table filled with noise for the EQ of the acoustic body. This works really well, though it lacks the resonant peaks and dips one would expect from a real instrument. I created a GEN Instrument, GEN_Multiply, that takes two tables, multiplies their contents (with interpolation if necessary), and produces a new table. So now I can create a table with envelope-like shapes to simulate peaks and dips, and multiply that with a noise table, producing a resonant noise table.
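Conceptually, GEN_Multiply is just an interpolated pointwise multiply. Here’s a rough Python model of the operation (a sketch of the concept, not the GEN Instrument’s actual code):

```python
def table_multiply(a, b, size):
    """Multiply two tables pointwise into a new table of the given size,
    linearly interpolating each input table along the way."""
    def lookup(tbl, pos):  # pos in [0, 1], linear interpolation
        x = pos * (len(tbl) - 1)
        i = int(x)
        j = min(i + 1, len(tbl) - 1)
        frac = x - i
        return tbl[i] * (1 - frac) + tbl[j] * frac
    return [lookup(a, i / (size - 1)) * lookup(b, i / (size - 1))
            for i in range(size)]
```

An envelope-shaped table multiplied by a noise table then yields the resonant noise table described above.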

Download: add_synth_upgrade.csd

u-p-g-r-a-y-e-d-d

Bowed String Additive Synth

About 12 years ago, someone told me that additive synthesis would never be practical. My initial reaction was, “we’ll see about that.” This person’s background was in modular subtractive synths, and they were quite knowledgeable. From that perspective, I could see their point. Who wants to take the time to create complex envelopes for each individual harmonic? My mind has often wandered back to this incident.

Today, additive synths are becoming more common thanks to faster computers. Many of their UIs help programmers manage the sheer complexity additive brings to the table. And there are many useful and valid approaches, each with its own strengths and weaknesses.

Download: bowed_string.csd

My approach has been floating around my head on and off for about a decade. The first iteration was completed around 2002, then shelved until recently. Today’s csd is a continuation of the second iteration of the design, which I had originally intended to use with Fragments (see here), but ran out of time. For the next couple of weeks, I plan on taking this instrument to its illogical conclusion (I have no idea what it’ll be like when it’s done). When I am finished, I’ll write an in-depth article for The Csound Journal on the final design.

The premise of my approach is to use f-tables as a shortcut for specifying and controlling additive synth data. In today’s example, the audio generator produces a 32-partial band-limited sawtooth wave. However, before the sine waves are generated with oscil, the synth data is run through two transfer functions, stored as f-tables. One transfer function changes the amplitudes of the harmonics, emulating the EQ of a virtual acoustic body. The other bends the frequencies, causing frequency distortions. Frequencies continue to be processed by the transfer functions even as they are modulated, which I believe is key to convincing acoustic viability.
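In rough Python terms, the per-partial processing might look like this; the normalized-frequency indexing and table shapes are assumptions for illustration, not the csd’s actual mapping:

```python
def table_lookup(tbl, x):
    """Linearly interpolated lookup, x normalized to [0, 1]."""
    pos = min(max(x, 0.0), 1.0) * (len(tbl) - 1)
    i = int(pos)
    j = min(i + 1, len(tbl) - 1)
    frac = pos - i
    return tbl[i] * (1 - frac) + tbl[j] * frac

def shape_partial(ratio, amp, amp_tf, freq_tf, fund, nyquist=22050.0):
    """Run one partial through both transfer functions before synthesis:
    amp_tf acts as the virtual acoustic body's EQ, freq_tf bends the
    frequency. Indexing by normalized frequency is an assumption."""
    f = ratio * fund
    x = f / nyquist
    return amp * table_lookup(amp_tf, x), f * table_lookup(freq_tf, x)
```

Because the lookup happens every time the frequency changes, modulated frequencies keep passing through the same transfer functions rather than being shaped once at note start.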

The reason this sounds similar to a bowed string instrument is that the amplitude transfer function is filled with the right amount of bipolar noise. The truth is, I had no intention of creating a string-like sound. I was just toying with the synth and thought I’d try something drastic, like using a table filled with noise. After the discovery, I spent considerable time tweaking the values, trying to get it to sound a little more expressive.
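For reference, filling such a table might look like this in Python; the depth value here is a guess, since “the right amount” was found by ear:

```python
import random

def bipolar_noise_table(size, depth=0.5, seed=1):
    """Table values jittered around 1.0 by up to +/- depth.
    The depth of 0.5 is an illustrative guess, not the csd's value."""
    rng = random.Random(seed)  # seeded, so the table is repeatable
    return [1.0 + rng.uniform(-depth, depth) for _ in range(size)]
```

Seeding the generator matters here: the noise pattern is effectively part of the instrument’s identity, so it should be the same on every render.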

I should warn you, there are some clear cases of aliasing occurring in today’s example. I think I know what’s causing it, but I’ll have to go back and run some tests to be certain. In the meantime, I hope you enjoy.

Fragments


Csound: fragments.csd

The Bohlen-Pierce Symposium, Boston, March 7th-9th, 2010
bohlen-pierce-conference.org

Program Notes

Fragments is the final product from a series of short etudes and generative instrument experiments conducted in order to gain an understanding of the Bohlen-Pierce scale. This direction emulates the way in which a hacker approaches the challenge of dissecting a piece of software or electronic device. The piece is composed, programmed and generated with the Csound computer music language. The evolution of Fragments is documented at The Csound Blog: http://csound.noisepages.com/.

Bio

Jacob Joaquin started tinkering with music on a Commodore 64 while in elementary school. From 1994 to 1996 he ran the Digital Dissonance BBS, an online Fresno community where musicians traded original tracker-based electronic compositions. He received his BA in Music Synthesis from Berklee College of Music in 1999. During his time at Berklee he received his first C programming lesson from Max Mathews and was the first recipient of Berklee’s Max Mathews’ award. Jacob completed his Masters Degree in Composition New Media and Integrated Media at California Institute of the Arts in 2002. He has studied composition with Dr. Richard Boulanger, Mark Trayle and Morton Subotnick. Jacob actively blogs about computer instrument design at The Csound Blog. He currently resides in Fresno, California.

Fragments of a Bohlen-Pierce Composition (Pt 13)

If something terrible happens between now and tomorrow’s deadline, such as my secondary computer spontaneously combusting, I’m happy to say that it won’t get me down, as I have finally completed a version I’m more than satisfied with.

That said, there are still many little things I want to fix up. I’ll tidy it up this evening. Listen to it in the early morning. Send it off to Dr. B. And then go see Alice in Wonderland as my personal reward. (I actually have no interest in seeing it, but they’ll be previewing the new TRON Legacy trailer.)

Download: fragments_13.csd

Fragments of a Bohlen-Pierce Composition (Pt 11)

The piece is due this Friday, and of course, that means my computer had to die yesterday.

*shakes fist at deadline gremlins*

So I lost 5 hours of composing time. The good news is that I have a backup computer. It runs a bit slower, but I can move forward.

I appended another fragment to my outline. This section is inspired by two sources. The first is Beethoven’s Moonlight Sonata. I’ve had this piece on my mind ever since playing the “Lost in Nightmares” expansion mission for Resident Evil 5. The second is Khachaturian’s Gayane Ballet Suite, though its influence is perhaps subtler.

Have you ever seen Connections? I’m about to make a few.

The Gayane Ballet Suite was used in the film 2001: A Space Odyssey. The author, Arthur C. Clarke, “was the first to propose geostationary communications satellites.” John Pierce, of the Bohlen-Pierce scale, “arrived at the (same) idea independently and may have been the first to discuss unmanned communications satellites.” (source)

Also in 2001, we hear HAL sing the tune Daisy Bell as he is dying. The film’s director, Stanley Kubrick, was visiting with Pierce at Bell Labs to get a sense of what a telephone booth in space would look like. Pierce used this opportunity to show him the computer music program, which included a synthetic vocal arrangement of Daisy Bell by Max Mathews, the father of computer music. Here’s a video of Mathews telling the story; Dr. Richard Boulanger is also in it.

Stanley Kubrick passed away on March 7th of 1999. The Bohlen-Pierce conference begins on March 7th. I first heard of Kubrick’s passing around the corner and down the street from where Monday night’s event takes place. I was at a gathering at the apartment of Elaine Walker, who has been a composer of Bohlen-Pierce music for years, and who is presenting both music and a lecture at the Symposium.

Download: fragments_11.csd

Fragments of a Bohlen-Pierce Composition (Pt 10)

I wrote an event capture utility that writes events generated by a MIDI keyboard to a text file.

Download: fragments_10.csd
Download: fragments_10_playback.csd

How to use it: Start fragments_10.csd and play your MIDI keyboard. When you are done playing, press control-c, or whatever key combination stops the Csound process. This will have generated a file called fragments_10_captured.csd. Now run fragments_10_playback.csd.

It doesn’t work perfectly, as the amplitude envelope in the playback file is off. However, this utility fits my needs since I’ll be re-orchestrating the output for different instruments. Since I have a deadline, I’ll squash this bug later when I revisit this topic.
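For the curious, here is the general shape of such a capture utility in Python; the score-line format is illustrative, not what fragments_10.csd actually writes:

```python
class EventCapture:
    """Collect note events and write them out as Csound score lines.
    The i-statement layout here is an assumption for illustration."""
    def __init__(self):
        self.events = []

    def note(self, start, dur, velocity, key):
        # instrument 1, start time, duration, amplitude, MIDI key
        self.events.append(f'i 1 {start:.3f} {dur:.3f} {velocity} {key}')

    def save(self, path):
        with open(path, 'w') as f:
            f.write('\n'.join(self.events) + '\n')

cap = EventCapture()
cap.note(0.0, 0.5, 90, 60)
cap.note(0.5, 1.0, 80, 64)
cap.save('captured_events.txt')
```

Since the output is just a list of i-statements, re-orchestrating means editing the instrument number column rather than re-recording the performance.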