Update: Recent Projects

September 19, 2025

Don't let my silence for the past three years fool you. I've been up to stuff alright. In fact, my activities took a surprising turn a year and a half ago. I'll tell you all about it.

Background

While I haven't posted on this site in over two years, I've still been keeping busy on the coding front. In 2022 I worked on my Buffer project that allowed for real-time layering, looping, and effecting of audio inputs. The program itself existed within a text-based interface and was controlled entirely at the computer keyboard.

[IMAGE OF AYA]

In 2023 I took the Buffer concept further with my Aya Music System program. The major step forward with Aya was that it featured a graphical interface built with PyQt. I won't go into more detail about Buffer and Aya, but you can hear some of their outputs on the Output page [LINK].

Visual Turn

In February of 2024 I had to present on a new software platform called Leganto that my workplace was implementing. I was tired of dry presentations, so I wanted to have some fun with this one. For the moment in the presentation where I finally announced Leganto, I created a ridiculous animated sequence of GIFs. I built the presentation in Google Slides and spent way too many hours on the animation sequence.

My favorite aspect of the sequence was that I performed it live. Each change in the animation was triggered by a mouse click. It was all set to music, so I had to rehearse it many times to get the timing correct.

[LEGANTO ANIMATION]

Above is a very poorly recorded performance of this sequence. I had too much fun making it. In fact, when I presented this, one of my colleagues raised his hand and asked, "Do you get paid to do this?"

Right away I began thinking, "Surely I should be able to create a program to perform and improvise these kinds of sequences in real-time." So that's what I set out to do.

Anim

I started exploring Python and PyQt for building GUIs. This developed into my Anim program. At this stage of my visual work I exclusively used animated GIFs which I was sourcing from Giphy and GIF Movies [LINK].

PyQt has robust multimedia support, but I still had to build lots of custom widgets and functionality to get the behavior I wanted. With Anim I could control various parameters in real-time such as speed, zoom, rotation, transitions, and masking shapes. Behind it all was a degree of randomization that determined which specific GIFs would be selected throughout the performance.
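
Anim's source isn't shown here, but the randomized-selection idea can be sketched in plain Python. All of the names below are hypothetical illustrations, not Anim's actual API; the only point is picking the next GIF at random while avoiding recent repeats:

```python
import random

def pick_next_gif(pool, history, avoid_last_n=3):
    """Pick a random GIF path, avoiding the most recently shown ones.

    `pool`, `history`, and `avoid_last_n` are made-up names for this
    sketch -- Anim's real selection logic may differ.
    """
    recent = set(history[-avoid_last_n:])
    # Fall back to the full pool if everything is "recent".
    candidates = [g for g in pool if g not in recent] or pool
    choice = random.choice(candidates)
    history.append(choice)
    return choice

pool = ["dance.gif", "static.gif", "spiral.gif", "burst.gif"]
history = []
for _ in range(6):
    pick_next_gif(pool, history)
```

Because the last few picks are excluded, the sequence never shows the same GIF twice in a row, which keeps a live performance feeling varied.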

[VIDEO EXAMPLES]

While I was happy with how Anim was progressing, I eventually grew frustrated with the limitations of the looped animated GIF material. I wanted the ability to play with longer, non-looping video clips.

Datamoshing

As I was exploring what kind of processing to do with longer video clips I became curious about a particular kind of video glitch I'd seen occasionally. It's the kind of glitch where one video shot freezes and the motion of another shot takes it over. You can sometimes see this when skipping ahead in a video file.

After some searching, I found that this kind of glitch, when used for artistic purposes, is called datamoshing. Two popular examples of it are A$AP Mob's "Lamborghini High" and Chairlift's "Evident Utensil" music videos. Those are pretty awesome, but the video that really stunned me was Takeshi Murata's Monster Movie from 2005.

Datamoshing is a way of interfering with the underlying data of a video file in order to exploit the way the file is decompressed during playback. In order to grasp what's really going on, I needed to study popular video codecs like MPEG-2 and H.264 as they are implemented in FFmpeg.
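
The core mechanism can be shown with a toy model rather than real codec code. Keyframes (I-frames) store a full picture while predicted frames (P-frames) store only a delta from the previous decoded frame; if you delete an I-frame, the deltas that followed it get applied to stale pixels, which is exactly the frozen-shot-takes-on-new-motion effect. This is only a conceptual sketch, not how FFmpeg actually decodes:

```python
def decode(stream):
    """Decode a toy stream of ('I', pixels) and ('P', delta) frames.

    Pixels are just lists of ints. A P-frame adds its delta to the
    previously decoded frame, mimicking motion-compensated prediction.
    """
    out, current = [], None
    for kind, data in stream:
        if kind == "I":
            current = list(data)                               # full refresh
        else:
            current = [p + d for p, d in zip(current, data)]   # predicted
        out.append(list(current))
    return out

clean = [("I", [10, 10]), ("P", [1, 1]), ("I", [50, 50]), ("P", [2, 2])]
# "Datamosh": drop the second I-frame so its P-frame lands on stale pixels.
moshed = [f for f in clean if f != ("I", [50, 50])]

decode(clean)   # last frame: [52, 52]
decode(moshed)  # last frame: [13, 13] -- motion applied to the wrong shot
```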



I was drawn to the work of Mark Fell this past fall when I was exploring rhythmic pattern generation and FM synthesis. In particular, I was blown away by Fell’s collaboration with Gábor Lázár from 2015 called The Neurobiology of Moral Decision Making.

I wanted to know how Fell and Lázár made these rhythms and sounds. A Google search landed me on this thread in the lines forum where I learned that Fell completed a PhD thesis in 2013 called Works in Sound and Pattern Synthesis. It turns out this thesis provides detailed discussions of Fell’s rhythmic pattern and synthesis algorithms. I decided to implement Fell’s ideas in Csound just like I did with James Tenney’s ideas.

Multistability

So far I’ve only focused on the ideas Fell explores on his album Multistability from 2010. Specifically, I’ve implemented some of his rhythmic pattern algorithms, which I’ll cover in this post. In my next post I’ll cover his FM synthesis algorithms.

Fell’s basic approach to rhythm on Multistability is to avoid clearly defined tempos or meters. Instead of setting a tempo and generating rhythmic patterns based on equal subdivisions of the beat, he essentially changes the tempo from one beat to the next. In other words, he defines the time in milliseconds between each beat for a sequence of beats and then cycles through that sequence. This might sound confusing, so let me illustrate the idea by showing you the “rhythmic pattern generator” I coded in Csound and Cabbage.
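
To make the idea concrete before getting to the instrument itself, here is a minimal Python sketch (my own illustration, not Fell's or my Csound code) of cycling through a sequence of per-beat durations rather than subdividing a fixed tempo:

```python
import itertools

def beat_times(durations_ms, n_beats):
    """Yield absolute onset times (ms) for n_beats, cycling through a
    sequence of per-beat durations instead of using a fixed tempo."""
    t = 0
    cycle = itertools.cycle(durations_ms)
    for _ in range(n_beats):
        yield t
        t += next(cycle)

# Each beat has its own length, so there is no single underlying tempo.
onsets = list(beat_times([120, 250, 90, 400], 6))
# → [0, 120, 370, 460, 860, 980]
```

Because the duration sequence wraps around, the fifth beat lands 120 ms after the fourth, just as the first pair did, giving the pattern a lopsided but repeating feel.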

Rhythmic Pattern Generator

[Screenshot of the Fell rhythmic pattern generator built in Cabbage]

Let’s first look at the top box called “Rhythm”. The six columns in the middle each represent a beat (or multiple beats depending on the Repetition value) in the rhythmic sequence. The rhythmic sequence is cycled through over and over. Here’s what the different fields mean.

Now let’s look at the three boxes below the Rhythm box. These are the individual drum parts. Here’s what the widgets do.

The ideas here are a combination of Mark Fell’s and mine. Fell wrote about the duration, multiplier, repetition, and amplitude sequence ideas in his thesis. I added more interactivity in the Cabbage interface and the ability to layer more drum parts. That’s the fun of studying other people’s ideas. You immediately start taking them in directions that are interesting to you.
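
One way duration, multiplier, repetition, and amplitude sequence might combine is sketched below. This is a deliberately minimal model of my own; the field names are guesses for illustration, not the actual Cabbage widget names or Fell's exact algorithm:

```python
def expand_pattern(columns):
    """Expand a list of column dicts into (onset_ms, amplitude) events.

    Each column has a base duration, a multiplier applied to it, a
    repetition count, and an amplitude sequence that is stepped through
    across the repeats.
    """
    events, t = [], 0
    for col in columns:
        step = col["duration"] * col["multiplier"]
        for r in range(col["repetition"]):
            amp = col["amps"][r % len(col["amps"])]  # cycle amplitudes
            events.append((t, amp))
            t += step
    return events

pattern = [
    {"duration": 100, "multiplier": 1, "repetition": 3, "amps": [1.0, 0.5]},
    {"duration": 100, "multiplier": 2, "repetition": 2, "amps": [0.8]},
]
expand_pattern(pattern)
# → [(0, 1.0), (100, 0.5), (200, 1.0), (300, 0.8), (500, 0.8)]
```

Even in this stripped-down form you can see how a small change to one column's multiplier or repetition count shifts every event after it, which is where the surprising interactions come from.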

Example Recordings

Here are some improvisations I recorded with this instrument. In each case I started with a pre-made rhythmic pattern and quickly began altering the pattern. Note that these are not the most compelling pieces of music on their own. To bring these pieces to life I’ll need to add synthesizer accompaniment, which I’ll cover in the next post.

"hallen_fell_rhythm_2020_4_1_a.mp3"
"hallen_fell_rhythm_2020_4_1_e.mp3"
"hallen_fell_rhythm_2020_4_1_c.mp3"
"hallen_fell_rhythm_2020_4_1_b.mp3"

Playing the Instrument

This instrument is an example of how a few simple elements (i.e. duration, multiplier, repetition, and amplitude sequence) can interact to create complex and surprising patterns. Playing it is very different from playing a traditional drum machine where each drum part is independent and fully controllable. Instead, in this instrument the drum parts are interconnected. It takes some getting used to.

You can make metrical rhythms that sound pretty traditional with this instrument, but it’s an uphill battle. This instrument is better for creating awkward, knotty rhythms, and that was Fell’s intention. Your best bet is to embrace the awkwardness and let the rhythms get weird.

Csound Code

The Csound code for this instrument is pretty similar to the code for the drum machine I made. The code reads the various widget values into arrays, cycles through the active rows and columns, and triggers the drum samples which are played with the loscil opcode.

The main difference is that this instrument changes the value of the metro opcode every time it moves to the next rhythm column. In other words, it continuously cycles through different tempos, whereas the traditional drum machine maintains a constant tempo as it cycles through the drum patterns.
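
Since Csound's metro opcode takes a frequency in Hz, switching tempo per column amounts to recomputing that frequency at each step. A quick sketch of the conversion (my own illustration, not the instrument's actual code):

```python
def column_tempos(durations_ms):
    """Convert each column's beat duration (ms) into the metro
    frequency (Hz) the sequencer would switch to at that column."""
    return [1000.0 / d for d in durations_ms]

column_tempos([250, 500, 125])  # → [4.0, 2.0, 8.0]
```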
