The Exchanging Notes project is working across four years with groups of young people at risk of exclusion or low attainment in music, building partnerships between schools and specialist music organisations.
In collaboration with Belvue School, musicians from Drake Music have been working with students who face disabling barriers to music, supporting KS3/4 classes for young people with SEND and moderate or severe learning difficulties.
I’ve joined the project as an associate musician to follow on from the work undertaken by Ben Sellers and Stephen Lee (click here for a flavour of what went on last year), and to start a new collaboration with music teacher Chris Chambers.
In addition to our regular classroom support I’ve been coming in to work with students on a new musical instrument in research and development sessions. More about that in a future post!
We’ve also been crossing over with Gary Day’s peripatetic music lessons in some great ensemble days.
Over the course of the year we have been working on accessible classroom techniques, drawing upon approaches from less formal environments, and looking for new ways to engage students with music. Like many current music teachers, Chris has identified use of technology as an area that he would like to develop.
Technology already plays a more general supporting role in the school. Music classes often start with discussion sessions where students choose tracks from YouTube, breaking them down into their constituent elements to work out arrangements.
Tablets are well integrated, and often used to record in class for assessment feedback. However, prior to Exchanging Notes, choice in the music room has generally been limited to acoustic instruments, electric guitars, and keyboards.
We’ve gradually introduced a wide range of digital instruments to our sessions – from familiar iPads to more specialised assistive music technology (or AMT) that I’ll list below.
As the year has progressed we have been able to spend a few weeks at a time focussing on a single technique or piece of equipment as a group – playing and listening, identifying preferences, and responding to feedback.
Now, as we reach the end of the summer term, the music technology sits alongside the more “traditional” instruments as an option in all our sessions…and with growing knowledge and confidence, Chris is looking at ways it can be used to bring aspects of “play” to other non-music classes.
The Plus Sides of Using Tech
A few of the advantages we’ve encountered have been:
- Responding to student requests and keeping material relevant to the group’s interests, by exploring electronic dance music production and a theme of remixing.
- Finding avenues for differentiation in the classroom: often removing barriers presented by instruments requiring fine motor skills, using loops, preset scales, and “smart chords” to enable a focus on other aspects of performance.
- Building vocal confidence by using effects.
- Simplifying interfaces, integrating colours, lights, and alternative communication methods such as symbols already used in the school.
- Exploring roles associated with musicianship such as sound engineering, recording, and setting up microphones.
Working as a Group
Integration in larger group settings has been key.
In our initial conversations Chris expressed concerns about music technology sessions where students might be absorbed in a computer or tablet; highly focussed on the task, but disengaged from others in the room.
In response to this, we’ve been using a range of physical interfaces, introducing multi-sensory elements, and exploring deliberate gestures that can involve the whole body.
For example, rather than trying to draw attention to a small keyboard or drum pad, we’ll often spread triggers for notes across a table or a wall.
As well as improving accessibility on a physical level, we’ve found that this kind of setup allows movements that are easier to model and copy in a group setting, and helps share tasks between two or more players.
Our classroom work is increasingly crossing over with the R&D strand of the project. This has been an exciting opportunity to make some bespoke tools to engage students in music activities, drawing upon Chris’s invaluable experience as a teacher specialising in SEN/D and a constant awareness of our groups’ dynamics, access requirements, and interests.
It’s been incredibly satisfying to have conversations about making a giant keyboard, or custom iPad apps to pass around the room, and know that we can pull something together from our toolbox on the day of a session without spending excessive amounts of time or money.
Much of this has been made possible by the expansion and increased affordability of mobile computing and open-source hardware in the last few years.
It’s also refreshing that switches and iPads don’t carry an impression of being “special” or “easy options” – they’re generally accepted in classes as another set of instruments with different possibilities.
In many respects the biggest challenge has been refining these ideas and making sure they are practical and accessible – finding approaches that can be set up quickly in a fast turnaround between classes, and avoiding disrupting the flow of the sessions.
Perhaps most importantly, anything we make together needs to be accessible to teachers and pupils in the long run, beyond the project.
While we’ve had a lot of fun trying out new technology, the most rewarding outcome can be finding a new angle on our existing resources through this process.
Here are a few of our favourite tools, compiled from conversations throughout the year:
Soundbeam and switches
The Soundbeam is an assistive music technology staple, featuring a proximity sensor that’s great for connecting sound and movement. The unit also comes equipped with a set of sturdy foot-switches. In our case these are typically mounted on the wall using Velcro tape!
While Soundbeams can be played independently with a tone generator, they’re very easy to hook up to a computer or iPad via USB or a cheap MIDI interface.
Then as well as playing notes and beats, the switches can be assigned to hand over more general control of music software to students – for example, starting and stopping recording a performance, or choosing backing loops.
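The idea behind handing over control is simple: each switch sends a fixed MIDI note, and the software maps that note to an action. A minimal sketch of that mapping logic – with made-up note numbers and action names, not the actual Soundbeam configuration – might look like this:

```python
# Sketch: routing MIDI note numbers sent by switches to software
# actions. The note numbers and action names here are hypothetical,
# purely to illustrate the dispatch idea.

SWITCH_ACTIONS = {
    36: "toggle_recording",   # wall-mounted switch by the desk
    38: "next_backing_loop",  # switch passed around the group
    40: "play_chord",         # table-mounted switch
}

def handle_switch(note_number, pressed):
    """Return the action for a switch press, or None on release
    or for an unmapped switch."""
    if not pressed:  # ignore note-off messages (switch released)
        return None
    return SWITCH_ACTIONS.get(note_number)
```

In practice the same mapping is set up graphically inside the music software rather than in code, but the principle is identical: one switch, one job.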
Ableton Live
Ableton Live typically forms the foundation for our technology-based sessions.
Using Live’s “session view”, it’s very quick to colour code notes or phrases and link them to MIDI controllers.
At the start of the day we tend to set up a few switches with chords, notes and loops in such a way that they can quickly be swapped around for the next session.
The software’s built-in MIDI effects make it easy to shape the output of the school’s existing keyboards. For example, we can “turn off” notes outside the key we’re playing in, or automate chords in order to concentrate on dynamics and rhythm.
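The “turn off notes outside the key” trick is essentially a pitch-class filter. As a minimal sketch (the pitch-class maths is standard; the function names are our own, not Ableton’s), filtering incoming MIDI notes to C major looks like this:

```python
# Sketch of a "scale filter" MIDI effect: notes outside the chosen
# key are silently dropped, so any key on the keyboard sounds "right".

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of C major

def in_key(note, scale=C_MAJOR):
    """True if a MIDI note number belongs to the scale."""
    return note % 12 in scale

def filter_to_key(notes, scale=C_MAJOR):
    """Keep only the notes that fall inside the key."""
    return [n for n in notes if in_key(n, scale)]
```

Ableton’s Scale device does this (and remapping, rather than just dropping) in real time; the point is that a student can then concentrate on rhythm and dynamics without worrying about wrong notes.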
Bare Conductive Touch Board
The Bare Conductive Touch Board has been a popular element of our R&D sessions, and is quickly becoming one of our most effective tools in regular music classes.
With a Touch Board, it’s possible to place conductive paint or tinfoil underneath printed material, connected with crocodile clips. The board senses touch or proximity around these objects, acting like a MIDI keyboard or sample player.
We have found that this is a good way of linking pictures to notes and sound effects, as students at Belvue are accustomed to following and learning lyrics with symbols and Makaton. Students can also design their own interactive lyric sheets or musical instruments during classes!
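Under the hood the board simply reports which of its electrodes are currently being touched, and a note fires when an electrode changes from untouched to touched. The real board does the sensing in firmware; this sketch only models the mapping logic, with note numbers we’ve chosen for illustration:

```python
# Sketch: turning rounds of electrode readings (lists of booleans)
# into note events. A note triggers only on a new touch, not while
# a finger stays on the electrode. Note numbers are illustrative.

ELECTRODE_NOTES = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale

def touch_events(previous, current):
    """Compare two rounds of electrode readings and return the
    MIDI notes to trigger for newly touched electrodes."""
    return [
        ELECTRODE_NOTES[i]
        for i, (was, now) in enumerate(zip(previous, current))
        if now and not was
    ]
```

Swapping which symbol or picture sits over each electrode is then just a matter of moving paper around – the mapping from electrode to sound stays the same.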
Mira
One of our most useful apps in these sessions has been Mira, which works with Cycling ’74’s Max/MSP programming environment. Mira allows us to create custom iPad setups very quickly. The app itself doesn’t generate any sound – everything is handled by a desktop computer through a WiFi connection.
With this app we can make interfaces with a simple dial or fader, and add graphics and symbols.
To start composition sessions we often pass around an iPad to pick sounds and set tempo, or combine it with a wireless mic to experiment with vocal effects. If the layout doesn’t work for everyone, then controls can be moved around and resized from the computer while the app is in use.
Other invaluable apps in the classroom have included:
- GarageBand – this affordable and flexible app from Apple offers smart chords that can be tapped or “strummed” to make arpeggios and basslines. Great to plug straight into a guitar amp!
- ThumbJam – good for soloing, this app enables playing pre-determined scales and percussion sounds with a single finger.
- MadPad – a set of simple sample pads combined with video recording – a great way to make a quick personalised drum machine.
- Keezy Drummer – a simple step sequencer for creating rhythms.
- Loopy – a quick way to make loops from a microphone, also easy to synchronise wirelessly and control with Soundbeam switches.
This post only scratches the surface of the resources we’ve been exploring… but the magic has been in playing together.
Interactive symbols have enabled some students to progress from singing to leading others in playing notes and sound effects at key moments.
Students who were otherwise disengaged in music sessions have found inspiration in beat creation and DJing with loops and filters. And through our “remixing” theme we’ve explored mixed instrumental/electronic versions of songs by One Direction, Snow Patrol, the Weeknd, and Drake… all with unique flavours guided by the group.
In some cases the technology has helped towards accessing more traditional instruments like keyboards and drums. But there’s also been a great sense of some young musicians “finding their instrument” in the form of iPads and switches, interacting with other players to make unique musical decisions and take performances in new directions.
It’s been a fantastic first year working on the project, and it’s exciting to consider what we can achieve next having laid down these foundations.