Arnold+Sukroso Double Feature

This week, we’re finally starting our one-man-band double feature. Robin Sukroso is a musician whom I met some years ago at Hans & Gloria Festival, and we’ve played a couple of shows together. He plays a Breedlove Black Magic acoustic guitar which he enhanced himself with force-sensing resistor (FSR) trigger pads. Using these, he can trigger drum samples while playing guitar. Additionally, each string generates MIDI data to play bass sounds from the computer. The whole concept is called the “AcPad” guitar and will eventually go into production. There is an article on SonicState about an earlier version of the guitar here: ACPAD Guitar Combines Acoustic and MIDI.

We have recorded some songs that will be released this week, bundled with remixes by Powel, Jozak Sander and Calm Chor, as the 7″ vinyl + download Synchrotron E.P. (beeah–music BEH020). I played a hybrid drum kit consisting of trigger pads for electronic sounds together with an acoustic snare, hi-hat and cymbals. As always, all my electronic sounds (drums, bass, lead synth, delay) are generated inside the Nord Modular G2. Be sure to check out the video of the first single “Role For Gold”:


Arnold+Sukroso – Role For Gold on YouTube

Arnold+Sukroso will be on a double feature tour, starting this Friday (09.05.2014) with the release party at Ritterstraße 11, Berlin-Kreuzberg. We are playing both our electronic solo sets and an Arnold+Sukroso feature. Hope to see you there!

A Portable Digital Mixing Desk with Max/MSP

Bus Mixer Signal Flow

I promised to write about my new setup once it was in a working state. My goal was to replace the analog 19″ mixer rack (see my post on Loopdeck Embedded Linux) with a digital mixer that is much smaller and offers total recall. The problem with all the products I could find on the market was either their size or their lack of auxes and busses. I don’t have that many channels in my setup, but I need to route all these microphone and synthesizer signals to many different destinations at the same time: sum compressor, front-of-house subgroups, looper, reverb and delay effects, and my in-ear monitoring. So here is my DIY solution: a portable digital mixing desk that utilizes a FireWire audio interface and a small MIDI fader controller.

Why is that something special? Can’t we just do that with some busses in Ableton Live?

  • First, I am very particular about latency. Of course, for synth sounds I wouldn’t notice 6–12 ms of delay, but the mic signals from drums and vocals that go directly into my monitoring would feel rather strange if they were fed through software and therefore were not in real time.
  • Second, independence and stability of the system are very important. This mixer should stay in its state when I load a new song or fine-tune some settings during soundcheck. Also, if a software process crashes, there should still be sound, not silence – or worse, noise. Moreover, at least the mics and hardware synths should remain audible even when the computer is off.
  • And third, but not least: the interface to this mixer should be very simple. Most of the settings I need are hard-wired, and I only control volumes, sends and bus assignments during a live show. Still, feedback is very important – LEDs and motor faders help me keep an overview of the current state without even needing a screen connected.

The system I built is based on a Mac Mini with a connected MOTU 828mk3 audio interface, all built into a 5U rack. This FireWire device has many input and output options and includes CueMix FX, which can handle audio with no latency in standalone mode, even with built-in EQs and compressors. This solution could of course also be adapted to make use of the similar functions in the TotalMix FX software of an RME Fireface interface. CueMix FX is controlled via Open Sound Control (OSC) from a big Max patch that I wrote over the last months. The graphical window of this patch is my only interface to the computer, and it is displayed on an 8″ Faytech touch screen.
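If you want to experiment with this approach outside of Max, the core of it is simply sending OSC messages to whatever is listening for mixer commands. Here is a minimal sketch in Java using the JavaOSC library (com.illposed.osc); the host, port and the /mix/... address are placeholders for illustration only, not the actual CueMix FX OSC namespace, which you should take from MOTU’s documentation.

```java
import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortOut;

import java.net.InetAddress;

// Minimal sketch: send a fader value as an OSC message.
// Host, port and address pattern are hypothetical -- replace them with
// the values your OSC target (e.g. CueMix FX) actually expects.
public class OscFaderExample {

    public static void main(String[] args) throws Exception {
        InetAddress host = InetAddress.getByName("127.0.0.1"); // example host
        int port = 9000;                                        // example port

        OSCPortOut sender = new OSCPortOut(host, port);

        // Hypothetical address: set the gain of channel 1 on bus 1 to -6 dB.
        OSCMessage msg = new OSCMessage("/mix/bus/1/channel/1/gain");
        msg.addArgument(-6.0f);

        sender.send(msg);
        sender.close();
    }
}
```

The same pattern works for sends and bus assignments; in my setup the Max patch simply translates fader and button moves from the MIDI controller into messages like this.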

Putting things together

Screenshot – Max/MSP Bus Mixer

Screen and Faders

In the screenshot above, you can see the channels from left to right: Drum Mic, Wet Reverb return, Vocal Mic, some channels from the Nord Modular G2, E-Piano VSTs, SooperLooper, Lexicon FX return. I included some modules that connect other software: I use Fluidsynth, Lounge Lizard EP4 and Pianoteq 4 as permanently active soft synths. A patch management module handles recall of aux send assignments and synth patches and includes a clock and set timer :-) The loop module is an OSC bridge to SooperLooper, and there is also a light controller, which interfaces with an Arduino for some RGB LED light effects. More on that later, after the tour.
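For those curious what the loop module sends: SooperLooper exposes an OSC interface (UDP port 9951 by default) where commands such as “record” or “overdub” are addressed to a loop via /sl/<loop>/hit. A minimal sketch of such a bridge message, again in Java with the JavaOSC library; host, loop index and command are example values, so check the SooperLooper OSC documentation for the details.

```java
import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortOut;

import java.net.InetAddress;

// Minimal sketch of an OSC bridge to SooperLooper: toggle recording on loop 0.
// SooperLooper listens on UDP port 9951 by default (example host shown here).
public class LooperBridgeExample {

    public static void main(String[] args) throws Exception {
        OSCPortOut looper = new OSCPortOut(InetAddress.getByName("127.0.0.1"), 9951);

        // "/sl/<loop>/hit" with a command name triggers that command on the loop.
        OSCMessage record = new OSCMessage("/sl/0/hit");
        record.addArgument("record");

        looper.send(record);
        looper.close();
    }
}
```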

(Update) Because many of you are interested, here is an example Max patch that connects to CueMix FX using OSC: Download CueMixFX_Example.maxpat. You will need to install the osctools external by Remy Muller as well.

Fly To Mars

Today’s concert at Magnet Club will be the last one with the usual setup. There are a lot of new things coming up that will radically change the running system. The first will be the shift of my whole audio processing to a digital mixer and an all-integrating Max patch. I’m also going back to the roots, revising my Wahan drum kit substantially. Last but not least, the live visualization of my music will expand with the possibilities of Max. If you’re a Berlin-based artist and want to contribute, let me know.

At the end of this month, a new remix single is coming up, followed by a short tour through Germany. [edit: you can get it now on iTunes and Amazon.] We shot a video last year; this is Filmarche 5773, directed by Anton Hempel, camera by Michael Clemens, produced by Barbara “Bob” Voss, and edited by Anton Hempel.


Sebastian Arnold – Fly To Mars on YouTube.

There will be more information on the individual parts of my setup as soon as I have them running. See you all soon!

Finite State Machine Sequencer

For the last few years, I have been working with a drum trigger interaction system based on an acoustic drum set and the Nord Modular G2 synthesizer. The basic idea is to control multiple step sequencers by drum hits. This way, the musician can interact with preprogrammed note patterns in his own musical feel and timing. People keep asking me how they could reproduce this approach on their own computers, and beyond that, the G2 system is somewhat limiting for such a task, especially when the compositions get more complex.

Some weeks ago I discovered the call for works for the CTM.13 MusikMakers Hacklab and proposed my idea for an interactive graph-based step sequencer. The proposal was accepted, and I spent the week at CTM Festival building the prototype as a Java external for Max/MSP. The Hacklab was a very inspirational gathering of music, art and visual developers collaborating on their projects and sharing ideas and concepts. For example, Imogen Heap was working with her team on their exciting musical gloves project. I set up an electronic drum kit and my stage LED light modules as an interface testing environment. In many conversations with other artists in the lab, I developed a basic concept for the interaction between the drum kit and my sequencer prototype.

Basically, the graphical sequencer consists of a finite state machine that can be played by a musician. It is a Max object that accepts any signal (e.g. MIDI) as input and sends predefined signals to its outputs. In the graphical representation, the nodes represent musical events or any other signal that Max can handle, e.g. MIDI notes, chords, OSC messages or visualization commands. The edges connecting the nodes define the rules for the transition from one node to another over time. Multiple outgoing edges are interpreted as alternative choices, making it possible to express a musical composition in the form of a Markov chain.

Screenshot of the graphical sequencer in Max

I added two more things to that concept to make it possible to interact with this graphical composition: an emitter is a start node that listens for a specific event, e.g. a drum pad. When activated, it emits a token into the graph, which transitions from node to node whenever it receives the “step” signal, e.g. from the bass drum trigger. You can now play this graphical composition with a simple drum interface.
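To make the token/step idea concrete, here is a small, self-contained Java sketch of the same mechanism, written independently of Max and of the actual external: nodes hold an event (e.g. a note), each node lists its possible successors, an emitter places a token on a start node, and every “step” moves the token to one of the successors chosen at random, Markov-chain style. All class and method names are my own, purely for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Minimal sketch of the graph sequencer idea: nodes carry a musical event,
// edges define possible successors, an emitted token walks the graph on "step".
public class GraphSequencerSketch {

    static class Node {
        final String event;                     // e.g. a MIDI note or an OSC command
        final List<Node> successors = new ArrayList<>();

        Node(String event) { this.event = event; }

        Node connect(Node next) {               // add an outgoing edge
            successors.add(next);
            return next;
        }
    }

    static class Token {
        private Node current;
        private final Random random = new Random();

        Token(Node emitter) {                   // the emitter is the start node
            this.current = emitter;
            fire();                             // illustration choice: play the start event on emit
        }

        // One "step" (e.g. a bass drum hit): move to a randomly chosen successor.
        // Multiple outgoing edges act as alternative choices, like a Markov chain.
        boolean step() {
            if (current.successors.isEmpty()) {
                return false;                   // end of this branch of the composition
            }
            current = current.successors.get(random.nextInt(current.successors.size()));
            fire();
            return true;
        }

        private void fire() {
            // In the real system this would send a MIDI note, chord or OSC message.
            System.out.println("play: " + current.event);
        }
    }

    public static void main(String[] args) {
        // A tiny composition: C -> E -> (G or A) -> back to C ...
        Node c = new Node("C3");
        Node e = new Node("E3");
        Node g = new Node("G3");
        Node a = new Node("A3");
        c.connect(e);
        e.connect(g);
        e.connect(a);                           // alternative choice
        g.connect(c);
        a.connect(c);

        Token token = new Token(c);             // a drum pad hit would emit the token
        for (int i = 0; i < 8; i++) {           // eight "step" signals, e.g. bass drum hits
            token.step();
        }
    }
}
```

In the actual external, the input that activates an emitter and the input that sends the “step” signal are simply Max messages, so any trigger source can drive the graph.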

So far, that’s it for the prototype. I can’t wait to try hacking some more complex compositions in the sequencer and to include it in my live setup. But of course, there’s lots of work to do until then, and we have developed additional ideas for adding conditions to the graph, in order to express larger musical parts in a simple graphical way. There will be a free download of the basic Max external and/or a standalone Java application later, but this may take some time.