Katie, Don’t Go is currently running on the 3rd floor of the MAC, in Belfast.
It is free to visit but you should book a slot online here.
It lasts roughly 15 minutes per person and can be enjoyed alone or in pairs.
There might not be as many buttons in my life as I would like these days, but that's only because it's filled up with other very exciting projects.
Currently I am collaborating with Barcelona-based finger-ninja Pauk on a track for the upcoming Monome Community Remix Project, to be released in March. Our track is in its final stages and shaping up nicely.
I also recently had a collaboration with Bristolite Rhodes-molester Jack Drewry, with plans to perhaps make a short release in the coming months.
I'm currently in rehearsals with Rachel Austin, with some gigs lined up and a tour of France (and possibly further afield) on the cards.
Sons Of Caliber are also on the verge of releasing their debut EP, and there are some exciting prospects opening up for us. Plentiful gigs approaching, so click the link if you're in Dublin/Belfast.
For my final portfolio at Queen's University I am making a large-scale interactive sound-art installation involving lots of motors, lights, and a lot of cabling. More details on that to come.
I will be developing a similar art installation with Lisa Keogh (I'm also going to be composing some music for her short film, which is still in the oven), involving storytelling in a new, interactive and (hopefully) immersive way.
Busy yet exciting times. Back to work.
I've played some wonderful gigs lately: from people's living rooms, where the unique audience crammed inside all seemed to have post-doctorate degrees in music, to theatres, where easily pleased university 'freshers' queued for their 3 free drinks before listening patiently.
Overall things have been good.
I find myself now moving back behind the drum kit for another short while as there are several upcoming gigs with both The Lambing Season & Sons of Caliber as well as weekly workshops with Queen’s University Belfast’s improv ensemble QUBe.
While there are several things that need to be improved in my live show (mainly unreliable numberpad controllers and cumbersome visuals), my main focus for the next while is on collaborations with other musicians around Belfast.
I am constantly blown away by the talent this small city contains and will find Belfast hard to leave.
Over the last week I have heard songs about the circus sung in Yiddish by Ben Maier, audiovisual explorations of hydrogen atoms by Michael Dzjaparidze, 8-minute epic slow-burners about hurricanes by Captain Cameron, and the heartfelt musings of the exceptional talent that is Katharine Phillipa.
So when it comes to buttons and controllers, things can get expensive. My Novation Launchpad cost around €100. A monome can be five times that.
While I'd love to have another Launchpad or monome to add to my live gear, I couldn't justify spending so much money on something I already own. So instead I've bought this:
It might not be sexy or give lighted feedback, but this £6 USB numeric keypad makes the perfect MIDI controller. I was thinking I'd have to make a simple Max patch that converted QWERTY commands into MIDI messages, but of course in Ableton, MIDI clips can be triggered by QWERTY presses.
So this cheap little keypad is now my glitch box. Dummy clips in Ableton are key-mapped to automate effects on the master bus, including beat repeats and grain delays. It works perfectly. Even the buttons feel sexy.
Add this to the $15 USB joystick as well, and I can happily bash away at them without fear of needing expensive replacements.
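For the curious, the QWERTY-to-MIDI conversion I almost built in Max boils down to a tiny lookup table. Here's a rough Python sketch of the idea (the key-to-note mapping is my own invented example, not anything Ableton actually uses):

```python
# Rough sketch: turn keypad presses into raw MIDI note-on messages.
# The key-to-note mapping below is invented purely for illustration.
NOTE_MAP = {
    "0": 36, "1": 37, "2": 38, "3": 39, "4": 40,
    "5": 41, "6": 42, "7": 43, "8": 44, "9": 45,
}

def keypad_to_note_on(key, velocity=100, channel=0):
    """Build the 3-byte MIDI note-on message for a keypad press."""
    if key not in NOTE_MAP:
        return None  # ignore keys we haven't mapped
    status = 0x90 | channel  # 0x90 = note-on, low nibble = channel
    return bytes([status, NOTE_MAP[key], velocity])
```

In practice Ableton's built-in key mapping makes all of this unnecessary, which is exactly why the numpad works out of the box.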
Since moving over to a Mac laptop, one of the things I've been looking forward to trying is the 'Autonome' app for Launchpad by demian tools.
During my live set I usually stopped between songs to load up a new song in Ableton Live. This meant slow, awkward silences between songs.
The Autonome app lets you choose which MIDI channel your button presses are transmitted on. Before I was stuck with only one channel, but now there are 16, meaning I can switch between up to 16 different songs on the fly.
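The reason there are exactly 16 channels is baked into MIDI itself: the channel number lives in the low four bits of the status byte. A quick illustration in plain Python (my own sketch, nothing from Autonome):

```python
def note_on(channel, note, velocity=100):
    """Raw MIDI note-on: the low 4 bits of the status byte carry the channel."""
    assert 0 <= channel <= 15, "MIDI only has 16 channels (0-15)"
    return bytes([0x90 | channel, note, velocity])

# The same button press on different channels only changes the status byte:
# channel 0 -> 0x90, channel 1 -> 0x91, ..., channel 15 -> 0x9F
```

So a single Launchpad button can stand for 16 different clips, depending on which channel Autonome is currently sending on.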
To do this I have to combine all of my live sets into one megaset. This is taking forever but it’ll all be worth it.
Autonome is by far the best Launchpad app out there. I tried in the past to program something similar myself and failed miserably. Here's a video of it in action (not by me; from www.demianlab.com):
You can download it from here.
As soon as I finished my blog post about Ableton & Lion compatibility, they announced the release of Live 8.2.5, featuring full Lion multicore-processing goodness as well as improved MIDI sync.
I've also been trying very hard to program visuals in Apple's Quartz Composer, but it's just too frustrating and clumsy, so I'm going to stick with Processing. I still have to close one window and open another between every song to get different visuals, however, which is annoying, and I can't for the life of me get my head around 'Mother', a mini-program aimed at VJ-ing in Processing that promises quick and easy switching between sketches. My other alternative is to code something like the Mother environment myself. This would involve putting each of my visual sketches into an individual class, placing all of those classes in one Processing sketch, and adding a switching mechanism via key presses, for example.
But that's harrrrrd to code. And all to save an audience from seeing my operating system's desktop for a split second… not worth it!
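For what it's worth, the switching mechanism wouldn't be much more than a dictionary of sketch objects keyed by key press. A minimal Python sketch of the pattern (the Processing/Java version would have the same shape, with each class doing real drawing in its draw() method; the class and key names here are invented):

```python
class GridSketch:
    def draw(self):
        return "grid frame"      # stand-in for actual drawing code

class CirclesSketch:
    def draw(self):
        return "circles frame"   # stand-in for actual drawing code

class VisualSwitcher:
    """Hold one sketch per key and delegate draw() to whichever is active."""
    def __init__(self, sketches, start_key):
        self.sketches = sketches
        self.current = sketches[start_key]

    def key_pressed(self, key):
        if key in self.sketches:  # unknown keys are simply ignored
            self.current = self.sketches[key]

    def draw(self):
        return self.current.draw()

switcher = VisualSwitcher({"1": GridSketch(), "2": CirclesSketch()}, "1")
```

The main sketch's draw loop would just call switcher.draw() every frame, and its keyPressed handler would call switcher.key_pressed().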
^ a sketch I’ve adapted that receives volume data from Live to determine x-y positions of circles and changes colour of circles depending on button presses. Based on Caroline Kassimo-Zahnd’s ‘Simplicity 4’
^ an example of some of the visuals I’ve been working on
I've been inspired lately by some videos I've watched of Making The Noise & Altitude Sickness explaining their live visual setups. Both of them use the popular visual programming package Processing, which can accept both MIDI and Open Sound Control (OSC) messages to control their visuals. So I decided to give making my own visuals a go.
The picture above contains screenshots of some of the visuals I've made. I wanted a visualisation that directly corresponded to button presses on the Launchpad, so I simply made a grid of squares that light up when they receive the right MIDI note-on messages. Some buttons also trigger colour changes or create fades, which can change the whole look of the visualisation when a new sample is triggered.
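Mapping a note-on to a square is straightforward because of how the Launchpad numbers its grid: in its default X-Y mode, as I understand it, each pad sends note 16 × row + column (columns 0-7). A small sketch of the lookup, assuming that layout:

```python
def note_to_cell(note):
    """Map a Launchpad note number to a (row, column) grid cell.
    Assumes the default X-Y layout: note = 16 * row + column."""
    row, col = divmod(note, 16)
    if row > 7 or col > 7:
        return None  # columns 8-15 are the round side buttons / unused
    return (row, col)
```

Each incoming note-on then just lights the square at that (row, column), and note-offs dim it again.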
I’ve also been using the LiveGrabber Max4Live devices to send volume data from Ableton Live to Processing via OSC. In the bottom-left screenshot, bass frequencies determine the x-axis position of a circle and line, and mid frequencies control the y-axis position of a circle and line. High frequencies control the colours of the circles and background.
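Under the hood, an OSC message like the ones LiveGrabber sends is just an address string, a type tag, and big-endian values, each padded out to four bytes. Here's a hand-rolled encoder, purely to show what goes over the wire (the '/volume' address is my own example, not LiveGrabber's actual naming):

```python
import struct

def osc_pad(b):
    """Null-terminate and pad a byte string to a multiple of 4 (the OSC rule)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address, value):
    """Encode a single-float OSC message: address, ',f' type tag, big-endian float32."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)
```

In practice a library handles this on both ends; Processing-side, oscP5 is the usual choice for receiving these messages.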
In the bottom-right screenshot I edited the incredible Schizzo sketch, which draws random cityscapes in real time, so that certain button presses can determine the starting point at which the first building is drawn, as well as wiping the sketch clean and starting again.
The sketches I made at the top are purely another way of visualising which buttons I'm pressing during a song (useful in situations where the audience may not all be able to see my Launchpad), whereas the bottom two examples are more about creating a piece of art on the fly via music. My thinking behind this is that when I start the piece there will be a blank canvas on the screen, but when I've finished the song it will be a mini work of art, unique to the song that was just played.
^ computers kick ass
I've been on a binge lately of writing lots of simple, performable pieces, all of them based on two elements: Rhodes keys and drum samples from old breakbeat/funk/Motown songs. The Rhodes piano is, I believe, one of the most beautiful, expressive electronic instruments ever made. I don't own one, but the MK II is faithfully reproduced in the free Reason ReFill, which you can download here.
Usually when I sit down and write a song for buttons I construct it the same way one would go about constructing a pop song. I'd start with a bass line which lays out the chords, then I'd map out some drum samples and jam along to the bass to see if it works. Then I'd layer up a guitar riff/chords or piano/synth, and finally I'd sprinkle some sparkly synth lines on top.
With this type of songwriting, the main interest to the listener, or 'hook' of the song, would be the riff over the chord progression. Lately I've been writing songs where the crux of the piece doesn't revolve around riffs or chord progressions, but rather around individual sonic events (a nice term I learnt from Cavan Fyans). These sonic events are short, once-off phrases placed throughout the song, such as a heavily processed vocal that's reversed and sped up double-time, or a Rhodes melody that's fed through amplifiers and then bit-crushed.
Instead of a listener nodding their head along to the groove I want them to listen and think “oooh I like that sound, ooo that sound too that was nice”.
On a side note I met the wonderful Lisa Hannigan last night. I helped her fashion a makeshift capo from a teaspoon and a hairband. Genius idea!
For good measure, here's another picture of a cat sitting on a Rhodes.
I've just set up a Bandcamp page where you can download catchANYTHING & itsPITCH for free (just like you can on SoundCloud).