I’ve played some wonderful gigs lately, from people’s living rooms where the entire audience crammed inside seemed to have post-doctoral degrees in music, to theatres where easily pleased university ‘freshers’ queued for their three free drinks before listening patiently.
Overall things have been good.
I find myself now moving back behind the drum kit for another short while as there are several upcoming gigs with both The Lambing Season & Sons of Caliber as well as weekly workshops with Queen’s University Belfast’s improv ensemble QUBe.
While there are several things that need improving in my live show (mainly unreliable numberpad controllers and cumbersome visuals), my main focus for the next while is on collaborations with other musicians around Belfast.
I am constantly blown away by the talent this small city contains and will find Belfast hard to leave.
Over the last week I have heard songs about the circus sung in Yiddish by Ben Maier, audiovisual explorations of hydrogen atoms by Michael Dzjaparidze, 8 minute epic slow burners about hurricanes by Captain Cameron, and the heartfelt musings of the exceptional talent that is Katharine Phillipa.
^ an example of some of the visuals I’ve been working on
I’ve been inspired lately by some videos I’ve watched of Making The Noise & Altitude Sickness explaining their live visual setups. Both of them use the popular visual programming environment Processing, which can accept both MIDI and Open Sound Control (OSC) messages to control visuals. So I decided to have a go at making my own.
The picture above contains screenshots of some of the visuals I’ve made. I wanted a visualisation that directly corresponded to button presses on the Launchpad, so I simply made a grid of squares that light up when they receive the right MIDI note-on messages. Some buttons also trigger colour changes or create fades, which can change the whole look of the visualisation when a new sample is triggered.
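The note-to-square mapping behind that grid can be sketched in plain Python. This is a rough model, not the actual Processing sketch, and it assumes the original Novation Launchpad’s X–Y layout (where each pad sends note number 16 × row + column); other controllers and modes use different mappings.

```python
# Minimal model of the grid visual: pads light up on note-on, dim on note-off.
# Assumes the original Launchpad's X-Y layout (note = 16*row + col);
# check this against your own controller's MIDI implementation.

GRID_SIZE = 8

def note_to_cell(note):
    """Convert a Launchpad MIDI note number to (row, col), or None if off-grid."""
    row, col = divmod(note, 16)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        return (row, col)
    return None  # e.g. the round scene-launch buttons in the right-hand column

class GridVisual:
    def __init__(self):
        # True means the square at [row][col] is currently lit
        self.lit = [[False] * GRID_SIZE for _ in range(GRID_SIZE)]

    def note_on(self, note):
        cell = note_to_cell(note)
        if cell is not None:
            row, col = cell
            self.lit[row][col] = True

    def note_off(self, note):
        cell = note_to_cell(note)
        if cell is not None:
            row, col = cell
            self.lit[row][col] = False
```

In the real sketch the draw loop would then simply paint each lit cell as a filled rectangle every frame.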
I’ve also been using the LiveGrabber Max for Live devices to send volume data from Ableton Live to Processing via OSC. In the bottom-left screenshot, bass frequencies determine the x-axis position of a circle and line, mid frequencies control the y-axis position, and high frequencies control the colours of the circles and background.
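That band-to-parameter mapping can be modelled in a few lines of Python. This is a sketch under assumptions: the screen size is arbitrary, and I’m assuming each band arrives as a normalised 0.0–1.0 level (the actual range depends on how the LiveGrabber devices are configured).

```python
# Hypothetical mapping from three band levels (assumed 0.0-1.0, as they might
# arrive over OSC) to drawing parameters: bass -> x, mids -> y, highs -> hue.

WIDTH, HEIGHT = 1280, 720  # arbitrary canvas size for illustration

def band_levels_to_params(bass, mid, high):
    """Clamp each level into [0, 1] and map to screen position and hue."""
    def clamp(v):
        return max(0.0, min(1.0, v))
    x = clamp(bass) * WIDTH    # bass drives the horizontal position
    y = clamp(mid) * HEIGHT    # mids drive the vertical position
    hue = clamp(high) * 360.0  # highs pick the hue for circles and background
    return x, y, hue
```

Each incoming OSC message would update one of the three levels, and the draw loop would redraw the circle and line at the resulting position and hue.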
In the bottom-right screenshot I edited the incredible Schizzo sketch, which draws random cityscapes in real time, so that certain button presses determine the starting point at which the first building is drawn, or wipe the sketch clean to start again.
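The control logic added to that sketch can be modelled roughly like this. The note numbers below are hypothetical placeholders, and this is not the Schizzo code itself; only the behaviour (one press sets the start point, another wipes the canvas) comes from the description above.

```python
# Rough model of the control logic added to the cityscape sketch.
# WIPE_NOTE and START_NOTES are placeholder values; the real mapping depends
# on which Launchpad buttons were assigned in the edited sketch.

WIDTH = 1280                  # arbitrary canvas width for illustration
WIPE_NOTE = 120               # placeholder button: clear and restart
START_NOTES = list(range(8))  # placeholder row: choose the start position

class CityscapeControl:
    def __init__(self):
        self.start_x = 0      # x position where the first building is drawn
        self.wiped = False    # flag the draw loop checks to repaint the background

    def handle_note_on(self, note):
        if note == WIPE_NOTE:
            self.wiped = True  # draw loop wipes the sketch clean
        elif note in START_NOTES:
            # spread the eight start points evenly across the screen
            self.start_x = note * (WIDTH // len(START_NOTES))
            self.wiped = True  # begin a fresh cityscape from start_x
```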
The sketches I made at the top are purely another way of visualising which buttons I’m pressing during a song (useful when not all of the audience can see my Launchpad), whereas the bottom two examples are more about creating a piece of art on the fly through music. My thinking is that when I start the piece there will be a blank canvas on the screen, and by the time I’ve finished the song it will be a mini work of art unique to the performance that was just played.
^ computers kick ass