Audience: Feminists, the LGBT community and the conceptual art audience, along with fans of the electronica genre and DIY-instrument enthusiasts.
Venues: Local indie bars such as Moon Club, Gwdihw and Buffalo Bar. This project could also be shown at art galleries such as G39 and Chapter Arts, and, thinking bigger, it could also be submitted to independent film festivals.
Promotion: Create a promotional trailer for each track, make small promo GIFs and build an online presence, such as a Facebook page, a YouTube channel and an Instagram account.
Performance #1 – At The Globe, Cardiff
For my first performance, I simply performed live vocals on top of a premixed backing track.
Performance #2 – In Class, At The Atrium
For my second performance, I used a vocal pedal with numerous effects, such as vocoder, pitch shift and reverb, on top of the backing track.
Performance #3 – At Gwdihw
For my third performance, at Gwdihw, I took things a little further. I kept the elements of my previous performances (a vocal pedal with effects and a backing track), but I also incorporated a DIY instrument I had built from a modified Guitar Hero controller and a Korg synthesizer. Both were linked to Ableton Live, letting me manipulate stems within the track in real time.
Initial Idea & Plan: For my first idea, I wanted a dual-screen setup that would allow me to manipulate one video live while the unaltered video was projected behind me. I also wanted a MIDI controller linked to Ableton Live on a laptop so I could alter the tracks live.
Secondary Idea & Plan: For my second idea I wanted to keep it simple and do what I typically do live: a straightforward setup consisting of a MIDI controller attached to a laptop triggering minimal sounds, the laptop playing the backing tracks, and a vocal pedal. I felt confident in this setup.
Third & Final Idea & Development Plan: For my third and final setup, I decided to expand on my second plan by using a DIY controller linked to Ableton Live, triggering sounds through speakers placed within the audience. The laptop will be connected to the projector and the PA sound system, playing the backing tracks and the visuals. I will also be using the vocal pedal and performing live vocals.
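A minimal sketch of how readings from a DIY controller could be translated into standard MIDI messages that software like Ableton Live can receive. This is an illustration only, not the actual rig described above: the function names are hypothetical, and the message-building follows the standard MIDI 1.0 byte layout (status byte plus two 7-bit data bytes).

```python
# Hypothetical sketch: turning DIY-controller readings into raw MIDI bytes.
# A real setup would send these via a MIDI library or serial-to-MIDI bridge;
# here we only build the messages themselves.

def knob_to_cc(channel, controller, value):
    """Build a 3-byte MIDI Control Change message.
    channel: 0-15, controller: 0-127, value: 0-127."""
    status = 0xB0 | (channel & 0x0F)  # 0xB0 = Control Change status nibble
    return bytes([status, controller & 0x7F, value & 0x7F])

def button_to_note(channel, note, pressed, velocity=100):
    """Note On when a button is pressed, Note Off when released."""
    if pressed:
        return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

def adc_to_cc_value(raw):
    """Scale a 10-bit sensor reading (0-1023) to the MIDI CC range (0-127)."""
    return (raw * 127) // 1023
```

In Ableton Live, messages like these could then be MIDI-mapped to clip triggers or effect parameters, which is how a homemade controller ends up manipulating stems live.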
My thoughts on – Joseph Paradiso's 1998 analysis of Electronic Music Interfaces
Reading Joseph Paradiso's analysis, and knowing it was written in 1998, you can tell he predicted, or at least had a sense of, where music and especially electronic interfaces were heading. I found myself agreeing with his point that electronic music, in contrast to classic acoustic instruments, has no such legacy. He also noted that the field has only existed for under a century, giving electronic instruments far less time to mature, and I feel that is still true even now.
He also discussed the rapid, constant change in electronics, including the shift to digital, and predicted that computers would play a major part in producing modern music with no additional hardware, which is undeniably true by today's standards. Finally, he foresaw the rise of performance away from conventional electronic interfaces: "In the not-too-distant future, perhaps we can envision quality musical performances being given on the multiple sensor systems and active objects in our smart rooms, where, for instance, spilling the active coffee cup in Cambridge can truly bring the house down in Rio." What we've covered in class alone shows this is now possible and easier than ever.
In conclusion, I agreed with Joseph Paradiso's analysis and feel he had a clear understanding of where music was heading; his predictions for the not-too-distant future are now truer than ever. I think the possibilities in music these days are almost endless, from producing to performing, and it is harder than ever to foresee what is in store for music further in the future. But it is undeniable that classic acoustic instruments will always play a big part in music's legacy.
MaxMSP Research Links
A couple of further research links covering the methods we’ve used so far:
A remote-controlled visualization of the sound generated in Max/MSP: Sound Sync in MaxMSP
This project combines a complex feedback system and a self-organizing system in an interactive way to generate an expansive, spacious sound. It visualizes the phase differences and transforms them into dramatic graphics, aiming to control the unpredictability and diversity with endless, evolving feedback effects. People's motion affects the sound and therefore changes the final visual image as well as the feedback effects: Audio-Visual Feedback System on Max/MSP
A great project in which someone used a Nintendo Wii controller to wirelessly modify their guitar effects: Wii Guitar Max/Msp