Basic Max Patching

A few people have mentioned over the last week or so that they are getting into composing and making noise in the Max environment, so here is some material I put together a while ago while running a first-year undergrad course on the subject. Some of it is a little clunky by today’s standards, but it should be enough to get most people off the ground.

Best

Ed

Max book


Crowdfunding my way to New York to play a concert.

 


https://www.crowdfunder.co.uk/trying-2-fund-flights-2-play-a-concert-in-new-york/

I have been asked to go and perform some music at the New York City Electroacoustic Music Festival in July. In terms of pure fun, career progression, networking and exposure it looks like it could be astounding! To give this a bit of context, I live in Wales (UK).

Earlier this year I was commissioned by Bangor Music Festival to compose and perform a piece of electroacoustic music / sonic art. This concert went really well and the piece is now available on iTunes, Spotify etc and is beginning to get noticed. The work was originally composed in 8 channel surround sound and is designed to be diffused in performance on even larger 3D sound arrays.

Above is a stereo mix down of the piece and here is a bit of writing about it.

I have now been asked to give the work a repeat performance on the other side of the Atlantic. My performance would consist of diffusing the piece at the Abrons Arts Centre, New York – 466 Grand St, Lower East Side – using their 16-channel surround-sound auditorium. This will take place as part of a week-long series of concerts which also includes practice and rehearsal time in the venue. It will not only further develop my creative and performance skills, but will give me the opportunity to learn from, perform to and engage with world experts in the field, while sharing a tiny bit of UK culture with the world at large.

The festival is happy to provide accommodation etc., but I would need to cover my travel and subsistence.

At the moment I can’t afford to get there. I have applied for various arts/music/culture grants but don’t know if I will be successful, so I am attempting to crowdfund my way to (and from) New York as well, with a little help from friends, fans and family, in case the formal funding doesn’t come through 😉 The crowdfunding link is here: https://www.crowdfunder.co.uk/trying-2-fund-flights-2-play-a-concert-in-new-york/ – if you have a few pounds to spare it would be hugely appreciated!

It would mean a massive amount, both on a personal (especially if you read the blog entry about the piece) and professional level if I could get there.

Many many thanks

Ed

New music in the pipeline…

Artefacts.
Recently I was given a mixtape on cassette, the first ‘new’ cassette to come into my possession in well over a decade. The sentiment was amazing and it’s a wonderful thing, but the sound quality was less than great, being a generational copy compiled from several other tapes and recordings off the radio. Oddly, the fluctuations in pitch, the EQ colouration and the stop-start edits did not seem to matter: in the intervening years they have taken on a charm, linked to a time, to objects and to a set of actions which have been technologically superseded, and that now makes them a creative choice.

The wow and flutter on the tape have become like the adze marks on the beams of rustic cottages, once rough but now a mark of the time, the technology and the interaction between human intent and facilitation.

Artefacts is a piece of music made by pushing digital and analogue audio equipment into the extremes of its range while feeding it a “silent” or null signal. After repeating this process a number of times the quirks of the system gradually come into focus, with the glitches and biases of the equipment amplified by repetition, sometimes with surprising results. These experiments were recorded and then used as material to create this piece.

Out now on iTunes (and Spotify, and SoundCloud etc. if you are as thrifty as I am!) :D

I’m terribly excited to announce that my most recent piece of sound/music composition is available to listen to (or buy) in various places over the web. If you would like to take a listen at your leisure, please do so by googling Edward Wright Space to Think in your generic search engine, or alternatively click on the album cover or Apple link below to go direct to iTunes and buy a copy to keep for ever 🙂 If you want to know more about the making of the piece, and for that matter what it is about, please click here to read more rambling from me.

Best

Ed

Space to Think revised


Space To Think

In about September I was commissioned by Bangor Music Festival to compose a piece of electroacoustic music for their February 2018 event, along with a series of education workshops. I really wanted to do this and it was looking like it was going to be an amazing autumn and early spring of creating cool stuff and having fun; then the floor almost literally gave way.

Following a period of ill health my Dad took his own life in mid October, and unsurprisingly this hit me really hard. It is not so much the sadness which is debilitating but the feelings of numbness, rage and lethargy that suck the capacity for creativity away. My Dad and I got on really well; he was a role model and someone who had a massive influence on me throughout my life, and when something so seemingly at odds with everything you have ever known happens, all the basic assumptions that you make in life come into question. I would even look at my feet when walking down stairs, not through shock or physical instability but because I no longer trusted the assumption that I knew where my feet and the steps were. It was certainly no mindset in which to take creative decisions; they are so vague, so intangible and so impossible to verify that the simplest starting impetus threw up paralyzing indecision.


It was at this point that I sadly informed Guto, the festival director, that I couldn’t fulfill the commission. I have never had to do this before and it left me feeling awful, but also slightly relieved. There followed a period of calm; I got back to doing some work and I managed to get off the antidepressants (citalopram) which had been prescribed to help me sleep, level me out and stop me catching things out of the corner of my eye. In late December I got a phone call from Guto offering to take some of the festival work back, but once again asking if I would like to compose ‘something’ for the finishing concert.


I find it really hard to sit down and just make noises or compose. Some people start from an initial sound or a feeling; I tend to find some sort of idea or framework to hang something on and can then go from there. I thought about this for about 24 hours: it was an incredibly kind offer which Guto had made, and my head was clearing. I went for a run in the hills – it happened to be early, as I wanted to make the summit of a mountain near to us to catch the winter solstice sunrise – and on the way up the ideas just struck me.

The theme of the event this year was space, and I am happy to say that the work shared a stage with the Birmingham Ensemble for Electroacoustic Research (BEER). BEER had worked in collaboration with the Art@CMS project at CERN in Switzerland, using real-time sonification of data streams from the Large Hadron Collider, the world’s largest and most complex particle accelerator. This is something it would be foolish to compete against; that, and the fact that I literally have Scott Wilson (of BEER)’s book on coding in SuperCollider sat on my desk. Thus I chose to take a different tack and, rather than approach it from an analytical and scientific angle, I went for something closer to home.

Space To Think – Ed Wright 2018 (8 channel surround sound audio)

A lot of what is in the popular imagination about space and space travel is precisely that, imagination. From the Barrons’ Forbidden Planet through to the electronic squelch of radio communication, a lot of what we think of as space-related is a very human construct. What fascinates me is how much of what we believe sounds as if it comes from outer space or under the sea (or for that matter any environment out of our direct experience) is actually a result of dubbing and sound design in the media. As a culture we have bought into the idea of rockets rumbling as they go past, even though there may be almost nothing in the void of space to transmit the sound, and the glockenspiel twinkle of stars is almost as real as the piano-wire scrape of the Tardis. This provides a fantastic palette of subverted fantasy with which to create and explore a rich and varied sound world. Apart from the use of two pieces of NASA archive – launch and countdown – the rest of the sounds used I have recorded and shaped myself.

Great delight was taken in recreating a few iconic ‘other worldly’ sound objects and effects along the way: 50 pence pieces were rubbed down piano strings to provide the basis for a Tardis noise before looping and adding spring reverb, humming strip lights were close-mic’ed to create light-sabres, and some generative coding brought about drone progressions similar to the opening of Star Trek. These and many other sounds were used as the raw materials of the piece and then developed as the different timbres interact and evolve. The result is an acousmatic work utilizing a wide variety of sounds, from analogue synthesis through to simple dislocation and out to generative algorithms, creating an apparently extra-terrestrial environment in which our earthbound ears and minds can roam.


Many thanks to Guto Puw and the Bangor Music Festival for their kindness, understanding and faith.

In memory of Brian Wright

Xmas-o-lophone


The Xmas-o-lophone

I’m going to set up a Christmas ‘xylophone’ between the trees. This will consist of a series of hollow (tuned) pipes hung lengthways between the trees, suspended on rope or cord. The overall effect will look similar to a rope ladder slung sideways.

The idea is that rather than just playing a scale it will play Jingle Bells. So the first few notes will all be at the same pitch, just as in the tune:


Jingle bells, jingle bells

Jin- (It is not until you get to note 8 that it rises in pitch!)


So each rung is a note of the tune: as you run along blowing or tapping them they play the melody. They will also be spaced so that the rhythm works properly, e.g. the third note (a minim) has twice as big a gap to the next ‘rung’ as the previous (crotchet) note does.

It would be good to set it up in a horseshoe (probably more realistically a V) shape between the trees so that it can be run around in a loop.

The trick is working it out so that it is dead easy to play.

Step | Pitch | Rhythm | Degree of scale | Relative pipe length | Pipe length (cm) | Cut from pipe batch | Relative distance to next note | Distance to next note (cm)
1 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
2 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
3 | e | Minim | 3 | 0.789889 | 50.0 | 3 | 4 | 100
4 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
5 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
6 | e | Minim | 3 | 0.789889 | 50.0 | 3 | 4 | 100
7 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
8 | g | Crotchet | 5 | 0.666667 | 42.2 | 4 | 2 | 50
9 | c | Dotted Crotchet | 1 | 1 | 63.3 | 4 | 3 | 75
10 | d | Quaver | 2 | 0.888889 | 56.3 | 2 | 1 | 25
11 | e | Semibreve | 3 | 0.789889 | 50.0 | 3 | 8 | 200
12 | f | Crotchet | 4 | 0.750188 | 47.5 | 1 | 2 | 50
13 | f | Crotchet | 4 | 0.750188 | 47.5 | 1 | 2 | 50
14 | f | Dotted Crotchet | 4 | 0.750188 | 47.5 | 1 | 3 | 75
15 | f | Quaver | 4 | 0.750188 | 47.5 | 1 | 1 | 25
16 | f | Crotchet | 4 | 0.750188 | 47.5 | 1 | 2 | 50
17 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
18 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
19 | e | Quaver | 3 | 0.789889 | 50.0 | 3 | 1 | 25
20 | e | Quaver | 3 | 0.789889 | 50.0 | 3 | 1 | 25
21 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
22 | d | Crotchet | 2 | 0.888889 | 56.3 | 2 | 2 | 50
23 | d | Crotchet | 2 | 0.888889 | 56.3 | 2 | 2 | 50
24 | e | Crotchet | 3 | 0.789889 | 50.0 | 3 | 2 | 50
25 | d | Minim | 2 | 0.888889 | 56.3 | 2 | 4 | 100
26 | g | Minim | 5 | 0.666667 | 42.2 | 4 | 4 | 100
Totals: 1310.2 cm of pipe (cut from 6 × 3 m lengths); 1600 cm of run
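For anyone wanting to re-plan the layout for a different tune or spacing, here is a minimal sketch of the spreadsheet logic behind the table. It assumes nothing beyond what the table shows: every pipe is the longest (C) pipe of 63.3 cm scaled by its relative length, and the gap to the next rung is 25 cm per quaver of the rhythm. The relative-length values are simply copied from the table rather than derived from any tuning formula.

#include <cstdio>

// One entry per note of the tune: pitch name, relative pipe length (copied
// from the table above) and duration in quavers (crotchet = 2, minim = 4...).
struct Note { char pitch; double relLength; int quavers; };

const Note tune[] = {
    {'e',0.789889,2},{'e',0.789889,2},{'e',0.789889,4},
    {'e',0.789889,2},{'e',0.789889,2},{'e',0.789889,4},
    {'e',0.789889,2},{'g',0.666667,2},{'c',1.000000,3},{'d',0.888889,1},
    {'e',0.789889,8},
    {'f',0.750188,2},{'f',0.750188,2},{'f',0.750188,3},{'f',0.750188,1},{'f',0.750188,2},
    {'e',0.789889,2},{'e',0.789889,2},{'e',0.789889,1},{'e',0.789889,1},
    {'e',0.789889,2},{'d',0.888889,2},{'d',0.888889,2},{'e',0.789889,2},
    {'d',0.888889,4},{'g',0.666667,4}
};

int main() {
    const double cPipeCm     = 63.3; // longest pipe (scale degree 1)
    const double cmPerQuaver = 25.0; // horizontal rope allowed per quaver
    double totalPipe = 0.0, totalRun = 0.0;

    for (const Note &n : tune) {
        double pipeCm = cPipeCm * n.relLength;   // length of this rung
        double gapCm  = cmPerQuaver * n.quavers; // distance to the next rung
        std::printf("%c: pipe %.1f cm, gap to next rung %.0f cm\n",
                    n.pitch, pipeCm, gapCm);
        totalPipe += pipeCm;
        totalRun  += gapCm;
    }
    // Should land on roughly 1310 cm of pipe and 1600 cm of run, as in the table.
    std::printf("Total pipe: %.1f cm, total run: %.0f cm\n", totalPipe, totalRun);
    return 0;
}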

 

Caddis, an update…

Caddis – Notated work for solo instrument


Caddis flies live near ponds and streams. As larvae, they live underwater and make wearable tubes from local materials, such as twigs, sand, stones, or snail shells. The items they select are bound with silk and the larva hooks itself inside with the end of its abdomen. The tubes serve various purposes – stones can be added to increase traction in fast-moving streams; irregular twigs make the tube (and its inhabitant) difficult for a trout to swallow. This may be considered more of an engineering process than a creative one, but they are nonetheless candidates for the stable of “natural assemblage artists”.

Caddis draws on this phenomenon, creating a work from a bank of two dozen notated samples, reworking, organizing and blending them into a new abode for the performer to inhabit.

In order to make this happen I have been playing around with a number of melodies and then algorithmically “composing” with them. I have been taking 24-event rows of pitches and rhythms, deriving ‘sets’ from these and applying transformations like time-stretching and transposition to them. Obviously writing these out longhand and applying the various transformations could be incredibly laborious, and I would probably create a few minutes of material and then get bored. So I have been building and refining the framework on computer and then auditioning it at the click of a button to see what works best.

To fit in with the spirit of recycling and self-sufficiency of the project I’ve been working in SuperCollider – not the easiest of environments, but it is free, open source and incredibly powerful. It’s a way of working which I have known about for a long time but have never really taken to; it’s nice to finally have the excuse…

If you want to have a go you can click the icon to the left to download it. If you would like to see my wobbly code and make up some of your own, please go for it! If you want to go hacking about, the simplest place to start is changing numbers in the pitch or duration rows; the number of runs also defines how long a piece it will create. It’s still early days but it’s good fun. The real skill will come with tweaking the parameters and making all of the other elements such as volume, tempo and timbre work well, but that is something for another day 🙂

/*CADDIS pitch duration builder*/

(
var runs = 3;//  enter the number of runs and then evaluate

s.boot;

/*musical parameters*/

//Pitch rows (midi numbers)

var p1 = [64,62,64,67,66,62,59,62,57,55,57,62,59,62,66,69,67,66,67,59,62,64,59,62];
var p2 = [66,63,62,62,66,67,63,62,60,58,57,58,55,57,58,60,66,67,63,70,64,62,68,67];
var p3 = [59,63,67,62,58,65,62,60,59,61,64,67,59,66,65,64,69,56,61,69,59,64,62,68];
var p4 = [64,68,68,68,64,67,64,68,68,64,68,68,64,68,64,64,64,67,67,68,64,64,67,64];
var p5 = [57,61,59,63,68,57,61,59,60,59,57,56,57,61,59,63,68,66,65,64,60,59,57,56];
var p6 = [58,68,62,58,64,65,62,61,58,63,58,65,61,68,60,62,61,58,67,56,56,64,65,65];
var p7 = [60,65,62,55,67,62,59,60,62,60,65,67,69,62,59,60,62,65,64,57,62,64,59,57];
var p8 = [60,56,67,68,63,62,66,70,59,59,58,63,62,65,57,65,58,62,65,58,61,68,64,60];
var p9 = [59,63,68,62,58,65,62,60,59,62,65,68,59,66,65,65,69,56,62,69,59,65,62,68];
var p10 = [71,70,69,68,69,68,67,66,65,62,61,60,71,70,69,68,66,65,64,63,62,61,60,59];
var p11 = [62,59,62,62,60,59,59,62,62,62,60,62,60,60,62,60,60,62,59,62,60,60,60,62];
var p12 = [54,62,61,61,57,67,61,60,60,59,58,57,55,59,57,59,64,60,67,59,65,56,65,63];

var pitch = [{p1},{p2},{p3},{p4},{p5},{p6},{p7},{p8},{p9},{p10},{p11},{p12}];// bank of pitch rows

//Duration rows (beats)

var d1 = [0.25,0.25,1.5,0.25,0.25,0.25,0.5,0.5,0.25,0.25,0.25,1.5,0.25,0.25,1.5,0.25,0.25,0.5,0.5,0.5,0.25,0.5,0.25,0.25];
var d2 = [0.25,0.125,0.25,0.375,0.5,0.25,0.5,0.25,0.25,0.25,0.5,0.5,0.25,0.25,1.25,0.5,0.25,0.25,0.5,0.25,0.25,0.25,0.5,0.25];
var d3 = [0.5,0.25,0.625,0.125,0.25,0.25,0.25,0.25,0.25,0.25,0.25,0.25,0.25,0.25,0.25,0.125,0.375,0.25,0.25,0.25,0.25,0.25,0.5,1.5];
var d4 = [0.75,0.75,0.5,0.25,0.25,0.25,0.75,0.25,0.25,0.75,0.25,0.25,0.25,0.5,0.375,0.125,0.25,0.25,0.25,0.5,0.25,0.125,0.25,0.125];
var d5 = [0.125,0.25,0.125,0.25,0.5,0.25,0.25,0.25,0.125,0.375,0.5,0.25,0.25,0.25,0.75,0.25,0.25,0.75,0.25,0.25,0.25,0.5,0.75,0.75];
var d6 = [0.5,0.5,0.5,0.25,0.25,0.5,0.25,0.25,0.375,0.125,0.25,0.25,0.5,0.25,0.25,0.125,0.125,0.125,0.125,0.25,0.25,0.5,0.5,1];
var d7 = [1,0.5,0.5,0.5,0.25,0.25,0.5,0.25,0.25,0.375,0.125,0.25,0.25,0.5,0.25,0.25,0.125,0.125,0.125,0.125,0.25,0.25,0.5,0.5];
var d8 = [0.125,0.5,0.25,0.25,0.125,0.25,0.5,0.75,0.5,0.125,0.125,0.375,0.125,0.875,0.375,0.25,0.125,0.125,0.5,0.75,0.5,0.5,0.25,0.25];
var d9 = [0.5,0.5,0.25,0.25,0.125,0.125,0.125,0.125,0.125,0.125,0.125,0.125,0.125,0.125,0.125,0.125,1,0.125,0.125,0.25,0.25,0.25,0.5,0.5];
var d10 = [0.5,0.5,1,1,0.5,1.5,0.5,0.25,0.25,0.25,0.25,0.5,0.5,0.5,0.5,0.5,0.5,1.5,0.5,0.25,0.25,0.25,0.25,1.5];
var d11 = [0.75,0.5,0.75,0.75,0.5,0.75,0.25,0.125,0.25,0.125,0.25,0.25,0.25,0.5,0.5,0.5,0.125,0.25,0.25,0.25,0.125,0.25,0.25,0.75];
var d12 = [0.125,0.125,0.125,0.125,1,0.25,0.25,1,0.5,1,0.75,0.25,0.5,0.5,0.5,1,1,0.75,1,0.125,0.125,0.25,0.5,0.75];
var duration = [{d1},{d2},{d3},{d4},{d5},{d6},{d7},{d8},{d9},{d10},{d11},{d12}];// bank of duration rows

var transp = [Pn(0,20),Pn(1,25),Pn(2,10),Pn(3,15),Pn(4,15),Pn(5,20),Pn(6,10),Pn(7,20),Pn(8,10),Pn(9,10),Pn(10,5),Pn(11,15)]; //Transposition row

var durMulti = [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1, //Duration multiplier row

1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,
2,2,2,2,2,2,2,2,
2.25,2.25,2.25,2.25,2.25,2.25,
2.5,2.5,2.5,2.5,
3,3,3,
3.5,3.5,
4];

/*control data*/

var listLength = [1,2,3,4,5,6,7,8,9,10,11]; //list of choices for set length

var rowList = [0,1,2,3,4,5,6,7,8,9,10,11];//bank to choose set starting points

/*functioning code*/

(
{
runs.do {

var length = listLength.choose; // sets length to reuse

var startingP = rowList.choose; // sets Pitch starting point to reuse

var startingD = rowList.choose; // sets Duration starting point to reuse

var durMult = durMulti.choose;//duration multiplier

var durSet=duration.choose.value; //decide duration row

var range=startingD+(length-1); //find duration string length

var short=durSet.copyRange(startingD,range); //trim duration string to length

var looptime=(short.sum)*durMult; //duration time for loop

Pbind(
\midinote, Pser((pitch.choose.value), length, startingP),
\ctranspose
, transp.choose,
\dur
, Pser(durSet*durMult, length, startingD)
).trace.play;
looptime.wait;
};
}.fork;
)
)

It’s A Second Life Jim, But Not As We Know It

A few weeks ago I was asked to help move a piano which was being sent to the tip. Now, before anyone gets overexcited, this instrument had been out in the rain, avoided being scrapped once before and survived toddlers, papier mâché and builders: it was well past saving.

As part of the process of dragging the recalcitrant castors (which to the best of my knowledge have never rotated on at least one corner) across the tarmac and down a steep hill which still required pushing to assist gravity, I ended up adopting the piano’s hammers.

The action of an upright piano is relatively easy to take out: it is simply a matter of undoing a couple of wingnuts, whilst avoiding the glee club of arachnids, and carefully lifting the entire unit out. My prize for this modest undertaking was a massive set of beautifully balanced percussion beaters. These could be deployed to excite almost any object, provided it could be struck by the hammers in a roughly upright position – and, depending on the size of the sounding thing and the dexterity of the user, potentially 88 somethings at any one time.


The problem was, I had no idea creatively what to do with it.

This was simply one of those opportunities which, if not taken at the time, may not happen again. So in order to move things along I have tried to consider how I would make it work, so that when the chance to create something artistic with it arises the technical side is already in place. Experience has also shown that ideas often come to light when tinkering with things, but also that ideas left unresolved tend to drift into the background, in a world where sedimentation outweighs erosion by a significant margin.

Sitting cross-legged on the floor of the attic, my basic knowledge of pianoforte anatomy was soon vindicated when I found that by prodding the whippen (!) at the bottom of the action with my finger, the hammer bounced obligingly forward. Several minutes later, after the novelty had started to wane fractionally, it occurred to me that this could obviously be mechanised in some way.

People have been experimenting with automating pianos for a long time. Sitting as I was, it brought to mind the player-pianos much vaunted in films with cowboys in big hats, or for that matter, Muppet satires of such. The problem with this, though, is that they work very much like a music box or punch-card loom, in that the musical material is essentially fixed until one physical record of the notes is removed and another inserted in its place.

So this was an option to try to avoid if possible. The idea of being able to use the action as an instrument, which after all is what it was originally part of, still held a lot of appeal, and somehow a piano-roll set-up felt like it would reduce it to some sort of playback device. No matter how expressively you nuanced the predetermined material, it would never be something that, as a person, you could play in any reactive sense.

Starting very simply I tried fiddling around with a servo motor, using it as a robotic analogue of my finger, changing how far it moved and how quickly by way of a simple controller chip, and becoming terribly overexcited when I found that it had enough force to easily achieve the task of banging something with a (piano) hammer. Emboldened by this small success I then set myself the task of scaling up the project. After all, it did not really matter how nice the sound was that the one servo hammer was actuating; after a while it would get perniciously dull.

Seeking out more servo motors I wired them in, duplicating what I had done for the first one, because it had worked. In order to test this new set up I needed something that would be relatively simple to do but was sufficiently detailed to show up any errors or inaccuracies. As a simple starting ‘riff’ I tried the melody from Piano Phase by Steve Reich.

I set up the chips to run, playing the loop through the servos, with one chip doing the locked-in piano part and the other doing the part of the instrument which accelerates. The process behind the 18-minute video could be boiled down to a few lines of code: defining notes for the servos, linking them to pins, saying how many times to loop and arranging the order and length of notes in the loop.

#include <Servo.h>
Servo E;
Servo Fs;
Servo B;
Servo Cs;
Servo D;

void setup() {
E.attach(9);
Fs.attach(10);
B.attach(11);
Cs.attach(12);
D.attach(13);
}

void loop() {
for (int x = 0; x < 4; x++) {
atempo();
}
for (int x = 0; x < 4; x++) {
accel();
}
}
void note(Servo pitch, int duration) {
pitch.write(90);
delay(duration);
pitch.write(0);
}
void atempo() {
note(E, 147);
note(Fs, 147);
note(B, 147);
note(Cs, 147);
note(D, 147);
note(Fs, 147);
note(E, 147);
note(Cs, 147);
note(B, 147);
note(Fs, 147);
note(D, 147);
note(Cs, 147);
}
void accel() {
note(E, 144);
note(Fs, 144);                                          //etc……
}

Running the two separate ‘pianos’ on two separate chips created something of a problem as, gradually over time, the chips could, without any sense of irony, fall out of synchronisation – let alone the difficulties of starting them off in time initially. So it was with deep regret that I realised that the simplest answer would be to combine the two time streams on a single processor.

There may be people out there who have figured out how to run two separate asynchronous loops on a single Arduino chip: sadly I am not one of them. However, having sat down with a simple spreadsheet, I figured out that if each semiquaver lasted 147 ms I could make the ones in the phase-shifting part 3 ms shorter, which means the music shifts by one semiquaver over the four accelerated bars and locks back into metrical sync. Sadly the chip I have only works reliably in whole numbers of milliseconds, so the ratios are relatively critical, but it did mean that with a little bit of Excel spreadsheet automation it was possible to manufacture the entire piece (phase shifts and all) pretty simply. You can follow the link below to the full code if you are really interested!
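To spell the arithmetic out (my own back-of-envelope check, not something lifted from the linked code): the Piano Phase pattern is twelve semiquavers long, so four accelerated bars contain 48 notes, and playing each of them 3 ms short gains 48 × 3 = 144 ms – within 3 ms of one full 147 ms semiquaver – which is why the quicker voice comes out of those bars almost exactly one note ahead.

// Back-of-envelope check of the phase-shift timing (illustration only,
// not part of the sketch above or of the code linked below).
const int SEMIQUAVER_MS  = 147;  // note length in the steady voice
const int SHORTENED_MS   = 144;  // note length in the phase-shifting voice
const int NOTES_PER_BAR  = 12;   // the Piano Phase pattern is 12 semiquavers
const int SHIFT_BARS     = 4;    // bars played at the quicker rate
const int TIME_GAINED_MS = SHIFT_BARS * NOTES_PER_BAR * (SEMIQUAVER_MS - SHORTENED_MS);
// TIME_GAINED_MS = 48 * 3 = 144 ms, i.e. one semiquaver to within 3 ms,
// so after four quicker bars the moving part has slipped by one note and re-locks.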

https://github.com/edwardwright/virtual440/blob/master/Piano%20Phase

I am still left at the stage of the piano roll, but we are moving forwards. Time will tell as to how I can make the actual material grow and evolve and what sounds and objects will be brought into life through it, but it is shaping up to be great fun 🙂

A Short Walk in Cae’n Y Coed


A few days ago a friend of mine got in contact about doing ‘some sound stuff’ at an event she was organising in the woods above Betws-Y-Coed in North Wales. Over the course of several conversations it transpired that it was to be themed around the senses and the woodland space.

After a bit of thinking I came up with the idea of doing a sound-walk through the woods which I could guide people on, and having an interactive digital version (note the funding words!) in a tent at the start of the walk so that the folk who were not able to walk or were not around at the start of the tour could still get the idea and have fun.

So we would have a form of bifurcated experience: the walk, which I had yet to work out, and/or the ability to synthesize it. This I planned to make with a set of faders people could use to turn up or down my (as yet to be recorded) pristine 3D audio field recordings of ‘a stream’, for instance, or on another fader ‘birds in the canopy’ etc…

Now, it must be said at this juncture that I had never organized a sound-walk, but I felt reasonably confident, and anyway, how hard could it be? It turns out that it was not all that difficult to do in itself – after all, the topography is there – but what was really, really complicated was to disentangle the concepts and preconceptions I brought into the woods with me.

So, two weeks before the day, I first took the drive down the A470 and A5 (two roads loved and hated by tourists and locals in almost equal measure) with a view to mapping out a route and doing some recording. I planned to pick out interesting way-points to punctuate the walk, make something of them, and get a good recording of each to link it thematically into the digital version. If I could work out a set of paths through the woods that could string these together I would be home and dry. I should have guessed it is hubris to try to curate nature.

Unsurprisingly, waterfalls, squeaky gates, caves and obligingly vocal jackdaws do not fall into an evenly spaced set of locations, no matter how much I contrived to organise a route. In addition to this, it was a real issue getting ‘clean’ recordings of the various phenomena. The birdsong reverberated around the open space of the car park, but the moment I tried to record it I heard a group of walkers with children approaching. No problem, I thought, I’ll give it five minutes and they will have passed by. Just then another car pulled in. Not to worry, it’s a Sunday, I’ll be patient, I thought. Twenty-five minutes later I gave up, as I still had not got more than eight feet from the car.

So, feeling slightly chastened about the idea of recording I decided that it may be better to come back on a weekday… possibly at night.


I decided to map out the route and record what I could along the way. I found a lot of wonderful ‘sound objects’: birds, shale, creaking trees, humming pylons, a small waterfall, an even smaller but echo-y cave and a wonderfully squeaky gate, but these all began to feel slightly superficial. Rather than a trail between various interesting things, it emerged that the route itself formed the arc and structure of what I was hearing. I had just spent 1 hour and 49 minutes (according to my stupidly over-the-top phone app) walking a loop of just under two miles. It formed roughly an oval, starting at the bottom of a hill, going up the escarpment, doubling back along the contours of the rise and around and down to the start. This had inadvertently formed a structural sweep that almost any composer would be proud of.

There is an ambient, evolving ecology to the woodland soundscape where elements move in and out of perception and focus. Some are fixed in the landscape, like rivers or roads (the latter of which make very little noise by themselves); others are not so locked in place. As you move through the space, the recurrence and subsidence of these different elements roll through the ears in a way which is very similar to the repetition and development of themes and tunes in opera or film music. I found that as I moved through the landscape the underlying sounds became more and more noticeable; as I stopped looking for an instant experience from AN identifiable object it became possible to comprehend more of the unfolding immensity of the everyday sonic environment.

There is a saying amongst sound engineers that ‘your ears are always on’, generally in the context of avoiding hearing loss; however, while you can’t mechanically disengage your ears, the perception of their input is a far more movable feast. I found as I walked that the sections where apparently not much happened were in fact the most interesting. Rather than going ‘ooh, a squeaky gate’, the emergent detail of the soundscape through which I was moving was held in greater contrast. Just by turning your head you can perceive the world in a subtly different way. Birdsong hits you as rays of light puncture through the trees, and the rustling leaves close in and muffle as the smell of the damp earth comes up to meet your nostrils.

After a time even the road noise became simply part of the experience. As the sound of the cars became brighter it meant I was descending back towards my own car, closing the loop on my own experience and recapitulating the start of the walk.

So I sit here on the tailgate of the car, trying to organize my thoughts, plan a route and figure out how to convey some of this experience to the people who may (or may not) turn up in two weeks’ time. I’m slightly concerned that if I turn on the engine and the radio I may lose some of the insight I have gained. Given that it seems to be more to do with noticing detail than with excluding material which is arbitrarily valuable, I hope the detail stays.

16/10/16
Cae’n Y Coed

Caddis – new work in progress

Caddis – Notated work for solo instrument


Caddis flies live near ponds and streams. As larvae, they live underwater and make wearable tubes from local materials, such as twigs, sand, stones, or snail shells. The items they select are bound with silk and the larva hooks itself inside with the end of its abdomen. The tubes serve various purposes – stones can be added to increase traction in fast-moving streams; irregular twigs make the tube (and its inhabitant) difficult for a trout to swallow. This may be considered more of an engineering process than a creative one, but they are nonetheless candidates for the stable of “natural assemblage artists”.

Caddis draws on this phenomenon, creating a work from a bank of two dozen notated samples, reworking, organising and blending them into a new abode for the performer to inhabit.

Sample manipulation series (first few rows)

Pitch set | Pitch set start point | Chunk end point | Transposition steps up | Transposition 8ve up | Rhythm set | Rhythm set start point | Duration multiplier | Reuse row?
Ranges: 1-12 | 1-12 | Start + 1-12 | 0-11 | 0-1 | 1-12 | 1-12 | 0-2 | 0-row num
10 | 7 | 15 | 3 | 1 | 11 | 10 | 1 | 0
10 | 7 | 15 | 3 | 1 | 11 | 10 | 1 | 1
4 | 8 | 19 | 7 | 0 | 8 | 11 | 2 | 0
5 | 7 | 19 | 1 | 1 | 3 | 9 | 0 | 0
3 | 6 | 10 | 11 | 0 | 1 | 11 | 1 | 0

Etc…
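As a toy illustration of how rows like these can be generated (a sketch of my own for this post, not the code used for the piece itself), the following rolls a single random manipulation row using the ranges given in the second line of the table:

#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(std::random_device{}());
    auto pick = [&rng](int lo, int hi) {          // uniform integer in [lo, hi]
        return std::uniform_int_distribution<int>(lo, hi)(rng);
    };

    int pitchSet      = pick(1, 12);              // which of the twelve pitch rows
    int pitchStart    = pick(1, 12);              // where in that row to start reading
    int chunkEnd      = pitchStart + pick(1, 12); // end point = start + 1-12 events
    int transpSteps   = pick(0, 11);              // transposition in semitone steps
    int transpOctave  = pick(0, 1);               // optional extra octave up
    int rhythmSet     = pick(1, 12);              // which of the twelve rhythm rows
    int rhythmStart   = pick(1, 12);              // where in that row to start
    int durMultiplier = pick(0, 2);               // duration (time-stretch) multiplier
    // The "Reuse row?" column (0 up to the current row number) is left out here.

    std::printf("%2d %2d %2d %2d %2d %2d %2d %2d\n",
                pitchSet, pitchStart, chunkEnd, transpSteps,
                transpOctave, rhythmSet, rhythmStart, durMultiplier);
    return 0;
}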

Samples to follow when I’ve worked them out! 🙂

Getting excited now 😀